High-Performance Computing Data Center | Energy Systems Integration
Facility | NREL
The Energy Systems Integration Facility's High-Performance Computing Data Center is home to Peregrine, the largest high-performance computing system in the world exclusively dedicated to advancing renewable energy and energy efficiency technologies.
Berkeley Lab - Materials Sciences Division
Center for Computational Study of Excited-State Phenomena in Energy Materials; Center for X-ray Optics; MSD facilities; ion and materials physics, scattering, and instrumentation science centers.
Energy 101: Energy Efficient Data Centers
None
2018-04-16
Data centers provide mission-critical computing functions vital to the daily operation of top U.S. economic, scientific, and technological organizations. These data centers consume large amounts of energy to run and maintain their computer systems, servers, and associated high-performance components; up to 3% of all U.S. electricity powers data centers. And as more information comes online, data centers will consume even more energy. Data centers can become more energy efficient by incorporating features like power-saving "stand-by" modes, energy monitoring software, and efficient cooling systems instead of energy-intensive air conditioners. These and other efficiency improvements to data centers can produce significant energy savings, reduce the load on the electric grid, and help protect the nation by increasing the reliability of critical computer operations.
High-Performance Computing Data Center Warm-Water Liquid Cooling |
Computational Science | NREL
NREL's High-Performance Computing Data Center (HPC Data Center) uses warm-water liquid cooling. Liquid cooling technologies offer a more energy-efficient solution than conventional air cooling and also allow for effective reuse of waste heat.
The effective use of virtualization for selection of data centers in a cloud computing environment
NASA Astrophysics Data System (ADS)
Kumar, B. Santhosh; Parthiban, Latha
2018-04-01
Data centers are facilities that house networks of remote servers to store, access, and process data. Cloud computing is a technology in which users worldwide submit tasks and service providers direct the requests to the data centers responsible for executing them. The servers in the data centers employ virtualization so that multiple tasks can be executed simultaneously. In this paper, we propose an algorithm for data center selection based on the energy of the virtual machines created on each server. The virtualization energy of each server is calculated, and the total energy of the data center is obtained by summing the individual server energies. Submitted tasks are routed to the data center with the least energy consumption, which minimizes the operational expenses of the service provider.
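The selection rule in this abstract reduces to an argmin over summed per-server energies. The sketch below is an illustration of that rule only; the names, units, and example figures are ours, not the paper's:

```python
# Minimal sketch of the selection rule described above: each data center's
# energy is the sum of its servers' virtualization energies, and tasks are
# routed to the data center with the least total energy.

def total_energy(server_energies):
    """Total data center energy as the sum of per-server VM energies."""
    return sum(server_energies)

def select_data_center(data_centers):
    """Pick the data center with the lowest total energy consumption.

    data_centers: dict mapping a center name to a list of per-server
    virtualization energies (e.g., in watt-hours).
    """
    return min(data_centers, key=lambda name: total_energy(data_centers[name]))

if __name__ == "__main__":
    centers = {
        "dc-east": [120.0, 95.5, 143.2],   # hypothetical per-server energies
        "dc-west": [88.0, 102.3],
        "dc-north": [60.1, 75.0, 80.9],
    }
    print(select_data_center(centers))  # -> "dc-west" (lowest sum: 190.3)
```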
High Performance Computing Meets Energy Efficiency - Continuum Magazine |
NREL
Simulation of wind turbines by Patrick J. Moriarty and Matthew J. Churchfield, NREL. The new High Performance Computing Data Center at the National Renewable Energy Laboratory (NREL) hosts high-speed, high-volume data...
Implementing the Data Center Energy Productivity Metric in a High Performance Computing Data Center
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sego, Landon H.; Marquez, Andres; Rawson, Andrew
2013-06-30
As data centers proliferate in size and number, the improvement of their energy efficiency and productivity has become an economic and environmental imperative. Making these improvements requires metrics that are robust, interpretable, and practical. We discuss the properties of a number of the proposed metrics of energy efficiency and productivity. In particular, we focus on the Data Center Energy Productivity (DCeP) metric, which is the ratio of useful work produced by the data center to the energy consumed performing that work. We describe our approach for using DCeP as the principal outcome of a designed experiment using a highly instrumented, high-performance computing data center. We found that DCeP was successful in clearly distinguishing different operational states in the data center, thereby validating its utility as a metric for identifying configurations of hardware and software that would improve energy productivity. We also discuss some of the challenges and benefits associated with implementing the DCeP metric, and we examine the efficacy of the metric in making comparisons within a data center and between data centers.
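As a quick illustration of the metric as defined here (the paper's "useful work" is workload-specific and determined experimentally; the values below are placeholders):

```python
# DCeP as defined above: useful work produced divided by the energy
# consumed producing it. Units of "useful work" depend on the workload.

def dcep(useful_work, energy_consumed_kwh):
    """Data Center Energy Productivity: useful work per unit energy."""
    if energy_consumed_kwh <= 0:
        raise ValueError("energy consumed must be positive")
    return useful_work / energy_consumed_kwh

# Comparing two hypothetical operational states of the same data center:
print(dcep(useful_work=1.2e6, energy_consumed_kwh=4.0e4))  # 30.0 units/kWh
print(dcep(useful_work=1.2e6, energy_consumed_kwh=5.0e4))  # 24.0 units/kWh
```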
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hules, John
This 1998 annual report from the National Energy Research Scientific Computing Center (NERSC) presents the year in review of the following categories: Computational Science; Computer Science and Applied Mathematics; and Systems and Services. Also presented are science highlights in the following categories: Basic Energy Sciences; Biological and Environmental Research; Fusion Energy Sciences; High Energy and Nuclear Physics; and Advanced Scientific Computing Research and Other Projects.
Diabat Interpolation for Polymorph Free-Energy Differences.
Kamat, Kartik; Peters, Baron
2017-02-02
Existing methods to compute free-energy differences between polymorphs use harmonic approximations, advanced non-Boltzmann bias sampling techniques, and/or multistage free-energy perturbations. This work demonstrates how Bennett's diabat interpolation method ( J. Comput. Phys. 1976, 22, 245 ) can be combined with energy gaps from lattice-switch Monte Carlo techniques ( Phys. Rev. E 2000, 61, 906 ) to swiftly estimate polymorph free-energy differences. The new method requires only two unbiased molecular dynamics simulations, one for each polymorph. To illustrate the new method, we compute the free-energy difference between face-centered cubic and body-centered cubic polymorphs for a Gaussian core solid. We discuss the justification for parabolic models of the free-energy diabats and similarities to methods that have been used in studies of electron transfer.
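For orientation, the parabolic-diabat assumption mentioned above implies the familiar two-simulation linear-response estimate. This LaTeX sketch states it under the assumption of equal-curvature diabats; the paper's Bennett interpolation is more general:

```latex
% Sketch of the parabolic-diabat (linear-response) limit underlying the
% interpolation: with the lattice-switch energy gap sampled in unbiased
% simulations of each polymorph, equal-curvature parabolas give
\begin{align}
  \Delta E(\mathbf{x}) &= E_{2}(\mathbf{x}) - E_{1}(\mathbf{x}),\\
  \Delta F &\approx \tfrac{1}{2}\bigl(\langle \Delta E \rangle_{1}
             + \langle \Delta E \rangle_{2}\bigr),
\end{align}
% where the averages run over the unbiased simulation of each polymorph.
% Bennett's diabat interpolation generalizes this beyond equal curvatures.
```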
Facilities | Integrated Energy Solutions | NREL
strategies needed to optimize our entire energy system. Photo: the high-performance computer at NREL. High-Performance Computing Data Center: high-performance computing facilities at NREL provide high-speed...
The Energy Efficiency Potential of Cloud-Based Software: A U.S. Case Study
DOE Office of Scientific and Technical Information (OSTI.GOV)
Masanet, Eric; Shehabi, Arman; Liang, Jiaqi
The energy use of data centers is a topic that has received much attention, given that data centers currently account for 1-2% of global electricity use. Cloud computing holds great potential to reduce data center energy demand moving forward, due to both large reductions in total servers through consolidation and large increases in facility efficiencies compared to traditional local data centers. However, analyzing the net energy implications of shifts to the cloud can be very difficult, because data center services can affect many different components of society's economic and energy systems.
High-Performance Computing Data Center Efficiency Dashboard | Computational
Energy recovery water (ERW) loop; heat exchanger for energy recovery; thermosyphon (heat exchanger between the ERW loop and the cooling tower loop); evaporative cooling towers. Learn more about our energy-efficient facility.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bhattacharya, Papri; Prokopchuk, Demyan E.; Mock, Michael T.
2017-03-01
This review examines the synthesis and acid reactivity of transition metal dinitrogen complexes bearing diphosphine ligands containing pendant amine groups in the second coordination sphere. This manuscript is a review of the work performed in the Center for Molecular Electrocatalysis. This work was supported as part of the Center for Molecular Electrocatalysis, an Energy Frontier Research Center funded by the U.S. Department of Energy (U.S. DOE), Office of Science, Office of Basic Energy Sciences. EPR studies on Fe were performed using EMSL, a national scientific user facility sponsored by the DOE's Office of Biological and Environmental Research and located at PNNL. Computational resources were provided by the National Energy Research Scientific Computing Center (NERSC) at Lawrence Berkeley National Laboratory. Pacific Northwest National Laboratory is operated by Battelle for the U.S. DOE.
Argonne Research Library | Argonne National Laboratory
Links: Publications; Researchers; Postdocs; Exascale Computing; Institute for Molecular Engineering at Argonne (IME); Joint Center for Energy Storage Research (JCESR); Midwest Center for Structural Genomics (MCSG).
NASA Astrophysics Data System (ADS)
Wei, Wang; Chongchao, Pan; Yikai, Liang; Gang, Li
2017-11-01
With the rapid development of information technology, data centers are growing quickly in scale, and the energy consumption of computer rooms is rising rapidly with them; air-conditioning cooling accounts for a large proportion of that consumption. Applying new technology to reduce computer room energy use has therefore become an important topic in energy-saving research. This paper studies Internet of Things technology and designs a green computer room environmental monitoring system. Using wireless sensor network technology, the system collects real-time environmental data and presents it in a three-dimensional visualization, including views of computer room assets, temperature clouds, humidity clouds, and per-cabinet microenvironments. Based on the microenvironment conditions, the air volume, temperature, and humidity parameters of the air conditioning can be adjusted for individual equipment cabinets, realizing precise air-conditioning refrigeration. This reduces air-conditioning energy consumption and, as a result, greatly reduces the overall energy consumption of the green computer room. We deployed the system in the computer center of Weihai; after a year of testing and operation, it achieved a good energy-saving effect, fully verifying the effectiveness of this project for computer room energy conservation.
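A minimal control-loop sketch of the per-cabinet adjustment the paper describes; the thresholds, data structures, and sensor values are illustrative assumptions, not the authors' implementation:

```python
# Per-cabinet sensor readings drive per-cabinet cooling setpoints instead
# of cooling the whole room uniformly, in the spirit of the system above.

from dataclasses import dataclass

@dataclass
class CabinetReading:
    cabinet_id: str
    temperature_c: float
    humidity_pct: float

def adjust_cooling(reading, target_temp_c=25.0, deadband_c=1.5):
    """Return a per-cabinet airflow adjustment in percent of current flow."""
    error = reading.temperature_c - target_temp_c
    if error > deadband_c:
        return +10.0   # hot spot: raise airflow to this cabinet
    if error < -deadband_c:
        return -10.0   # overcooled: save energy by reducing airflow
    return 0.0         # within deadband: leave unchanged

readings = [
    CabinetReading("rack-01", 28.2, 45.0),  # hypothetical sensor data
    CabinetReading("rack-02", 24.8, 50.0),
    CabinetReading("rack-03", 21.9, 55.0),
]
for r in readings:
    print(r.cabinet_id, adjust_cooling(r))  # +10.0, 0.0, -10.0
```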
DOE Office of Scientific and Technical Information (OSTI.GOV)
Darrow, Ken; Hedman, Bruce
Data centers represent a rapidly growing and very energy intensive activity in commercial, educational, and government facilities. In the last five years the growth of this sector was the electric power equivalent to seven new coal-fired power plants. Data centers consume 1.5% of the total power in the U.S. Growth over the next five to ten years is expected to require a similar increase in power generation. This energy consumption is concentrated in buildings that are 10-40 times more energy intensive than a typical office building. The sheer size of the market, the concentrated energy consumption per facility, and the tendency of facilities to cluster in 'high-tech' centers all contribute to a potential power infrastructure crisis for the industry. Meeting the energy needs of data centers is a moving target. Computing power is advancing rapidly, which reduces the energy requirements for data centers. A lot of work is going into improving the computing power of servers and other processing equipment. However, this increase in computing power is increasing the power densities of this equipment. While fewer pieces of equipment may be needed to meet a given data processing load, the energy density of a facility designed to house this higher efficiency equipment will be as high as or higher than it is today. In other words, while the data center of the future may have the IT power of ten data centers of today, it is also going to have higher power requirements and higher power densities. This report analyzes the opportunities for CHP technologies to assist primary power in making the data center more cost-effective and energy efficient. Broader application of CHP will lower the demand for electricity from central stations and reduce the pressure on electric transmission and distribution infrastructure. This report is organized into the following sections: (1) Data Center Market Segmentation--the description of the overall size of the market, the size and types of facilities involved, and the geographic distribution. (2) Data Center Energy Use Trends--a discussion of energy use and expected energy growth and the typical energy consumption and uses in data centers. (3) CHP Applicability--potential configurations, CHP case studies, applicable equipment, heat recovery opportunities (cooling), cost and performance benchmarks, and power reliability benefits. (4) CHP Drivers and Hurdles--evaluation of user benefits, social benefits, market structural issues and attitudes toward CHP, and regulatory hurdles. (5) CHP Paths to Market--discussion of technical needs, education, and strategic partnerships needed to promote CHP in the IT community.
Cotes-Ruiz, Iván Tomás; Prado, Rocío P.; García-Galán, Sebastián; Muñoz-Expósito, José Enrique; Ruiz-Reyes, Nicolás
2017-01-01
Nowadays, the growing computational capabilities of Cloud systems rely on the reduction of the consumed power of their data centers to make them sustainable and economically profitable. The efficient management of computing resources is at the heart of any energy-aware data center, and of special relevance is the adaptation of its performance to workload. Intensive computing applications in diverse areas of science generate complex workloads called workflows, whose successful management in terms of energy saving is still in its early stages. WorkflowSim is currently one of the most advanced simulators for research on workflow processing, offering advanced features such as task clustering and failure policies. In this work, an expected power-aware extension of WorkflowSim is presented. This new tool integrates a power model based on a computing-plus-communication design to allow the optimization of new management strategies in energy saving considering computing, reconfiguration, and network costs as well as quality of service, and it incorporates the preeminent strategy for on-host energy saving: Dynamic Voltage Frequency Scaling (DVFS). The simulator is designed to be consistent in different real scenarios and to include a wide repertoire of DVFS governors. Results showing the validity of the simulator in terms of resource utilization, frequency and voltage scaling, power, energy, and time saving are presented. Also, results achieved by the intra-host DVFS strategy with different governors are compared to those of the data center using a recent and successful DVFS-based inter-host scheduling strategy as a mechanism overlapping the intra-host DVFS technique. PMID:28085932
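For context, the DVFS strategy named above trades frequency and voltage against runtime. A small sketch of the standard dynamic-power argument follows; the constants are illustrative, not WorkflowSim parameters:

```python
# Dynamic CMOS power scales roughly as P = a*C*V^2*f, so lowering frequency
# (and the voltage it permits) cuts power faster than it extends runtime.

def dynamic_power(capacitance_f, voltage_v, frequency_hz, activity=1.0):
    """Approximate dynamic power of a CMOS processor, in watts."""
    return activity * capacitance_f * voltage_v**2 * frequency_hz

def task_energy(cycles, capacitance_f, voltage_v, frequency_hz):
    """Energy to run a fixed-cycle task at a given DVFS operating point."""
    runtime_s = cycles / frequency_hz
    return dynamic_power(capacitance_f, voltage_v, frequency_hz) * runtime_s

# Halving f and dropping V from 1.2 V to 0.9 V: the task takes 2x longer
# but uses ~44% less energy (energy scales with V^2 for a fixed cycle count).
print(task_energy(1e12, 1e-9, 1.2, 2.0e9))  # ~1440 J
print(task_energy(1e12, 1e-9, 0.9, 1.0e9))  # ~810 J
```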
Data Center Energy Efficiency Measurement Assessment Kit Guide and Specification
DOE Office of Scientific and Technical Information (OSTI.GOV)
None
2012-10-26
A portable and temporary wireless mesh assessment kit can be used to speed up and reduce the costs of a data center energy use assessment and to overcome the issues associated with shutdowns. The assessment kit comprises temperature, relative humidity, and pressure sensors. Also included are power meters that can be installed on computer room air conditioners (CRACs) without intrusive interruption of data center operations. The assessment kit produces the data required for a detailed energy assessment of the data center.
Join the Center for Applied Scientific Computing
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gamblin, Todd; Bremer, Timo; Van Essen, Brian
The Center for Applied Scientific Computing serves as Livermore Lab’s window to the broader computer science, computational physics, applied mathematics, and data science research communities. In collaboration with academic, industrial, and other government laboratory partners, we conduct world-class scientific research and development on problems critical to national security. CASC applies the power of high-performance computing and the efficiency of modern computational methods to the realms of stockpile stewardship, cyber and energy security, and knowledge discovery for intelligence applications.
ERIC Educational Resources Information Center
Cottrell, William B.; And Others
The Nuclear Safety Information Center (NSIC) is a highly sophisticated scientific information center operated at Oak Ridge National Laboratory (ORNL) for the U.S. Atomic Energy Commission. Its information file, which consists of both data and bibliographic information, is computer stored and numerous programs have been developed to facilitate the…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chiang, Patrick
2014-01-31
The research goal of this CAREER proposal is to develop energy-efficient, VLSI interconnect circuits and systems that will facilitate future massively-parallel, high-performance computing. Extreme-scale computing will exhibit massive parallelism on multiple vertical levels, from thousands of computational units on a single processor to thousands of processors in a single data center. Unfortunately, the energy required to communicate between these units at every level (on chip, off-chip, off-rack) will be the critical limitation to energy efficiency. Therefore, the PI's career goal is to become a leading researcher in the design of energy-efficient VLSI interconnect for future computing systems.
Study of the Issues of Computational Aerothermodynamics Using a Riemann Solver
2008-07-01
storage is from the translational energy resulting from the translational motion of the center of mass of the molecule. A molecule also has a rotational energy storage mode, since molecules can rotate about their center of mass. The third energy storage mode of molecules arises from the vibration of the atoms within the molecule. The total internal energy is the sum of the four energy modes mentioned above, namely the translational, rotational, vibrational, and electronic energies. For monoatomic...
High performance computing for advanced modeling and simulation of materials
NASA Astrophysics Data System (ADS)
Wang, Jue; Gao, Fei; Vazquez-Poletti, Jose Luis; Li, Jianjiang
2017-02-01
The First International Workshop on High Performance Computing for Advanced Modeling and Simulation of Materials (HPCMS2015) was held in Austin, Texas, USA, Nov. 18, 2015. HPCMS 2015 was organized by Computer Network Information Center (Chinese Academy of Sciences), University of Michigan, Universidad Complutense de Madrid, University of Science and Technology Beijing, Pittsburgh Supercomputing Center, China Institute of Atomic Energy, and Ames Laboratory.
View southeast of computer controlled energy monitoring system. System replaced ...
View southeast of computer controlled energy monitoring system. System replaced strip chart recorders and other instruments under the direct observation of the load dispatcher. - Thirtieth Street Station, Load Dispatch Center, Thirtieth & Market Streets, Railroad Station, Amtrak (formerly Pennsylvania Railroad Station), Philadelphia, Philadelphia County, PA
Computational Science News | Computational Science | NREL
...-Cooled High-Performance Computing Technology at the ESIF (February 28, 2018). NREL Launches New Website for High-Performance Computing System Users: the National Renewable Energy Laboratory (NREL) Computational Science Center has launched a revamped website for users of the lab's high-performance computing (HPC) systems.
Impact of office productivity cloud computing on energy consumption and greenhouse gas emissions.
Williams, Daniel R; Tang, Yinshan
2013-05-07
Cloud computing is usually regarded as being energy efficient and thus emitting less greenhouse gases (GHG) than traditional forms of computing. When the energy consumption of Microsoft's cloud computing Office 365 (O365) and traditional Office 2010 (O2010) software suites were tested and modeled, some cloud services were found to consume more energy than the traditional form. The model developed in this research took into consideration the energy consumption at the three main stages of data transmission: data center, network, and end user device. Comparable products from each suite were selected, and activities were defined for each product to represent a different computing type. Microsoft provided highly confidential data for the data center stage, while the networking and user device stages were measured directly. A new measurement and software apportionment approach was defined and utilized, allowing the power consumption of cloud services to be directly measured for the user device stage. Results indicated that cloud computing is more energy efficient for Excel and Outlook, which consumed less energy and emitted less GHG than their standalone counterparts. The power consumption of the cloud-based Outlook (8%) and Excel (17%) was lower than their traditional counterparts. However, the power consumption of the cloud version of Word was 17% higher than its traditional equivalent. A third mixed access method was also measured for Word, which emitted 5% more GHG than the traditional version. It is evident that cloud computing may not provide a unified way forward to reduce energy consumption and GHG. Direct conversion from the standalone package into the cloud provision platform can now consider energy and GHG emissions at the software development and cloud service design stage using the methods described in this research.
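The three-stage accounting described above can be summarized in a few lines. This sketch uses placeholder figures (the paper's data center measurements were confidential) purely to show the bookkeeping:

```python
# Total energy per user activity is the sum of data center, network, and
# end-user-device shares, per the apportionment approach described above.

def activity_energy_wh(data_center_wh, network_wh, device_wh):
    """Total energy attributed to one user activity across all three stages."""
    return data_center_wh + network_wh + device_wh

# Hypothetical comparison for one editing session of the same document:
cloud = activity_energy_wh(data_center_wh=0.8, network_wh=1.5, device_wh=9.0)
local = activity_energy_wh(data_center_wh=0.0, network_wh=0.1, device_wh=10.5)
print(cloud, local)  # 11.3 vs 10.6: cloud is not automatically cheaper
```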
PNNL streamlines energy-guzzling computers
DOE Office of Scientific and Technical Information (OSTI.GOV)
Beckman, Mary T.; Marquez, Andres
In a room the size of a garage, two rows of six-foot-tall racks holding supercomputer hard drives sit back-to-back. Thin tubes and wires snake off the hard drives, slithering into the corners. Stepping between the rows, a rush of heat whips around you -- the air from fans blowing off processing heat. But walk farther in, between the next racks of hard drives, and the temperature drops noticeably. These drives are being cooled by a non-conducting liquid that runs right over the hardworking processors. The liquid carries the heat away in tubes, saving the air a few degrees. This is the Energy Smart Data Center at Pacific Northwest National Laboratory. The bigger, faster, and meatier supercomputers get, the more energy they consume. PNNL's Andres Marquez has developed this test bed to learn how to train the behemoths in energy efficiency. The work will help supercomputers perform better as well. Processors have to keep cool or suffer from "thermal throttling," says Marquez. "That's the performance threshold where the computer is too hot to run well. That threshold is an industry secret." The center at EMSL, DOE's national scientific user facility at PNNL, harbors several ways of experimenting with energy usage. For example, the room's air conditioning is isolated from the rest of EMSL -- pipes running beneath the floor carry temperature-controlled water through heat exchangers to cooling towers outside. "We can test whether it's more energy efficient to cool directly on the processing chips or out in the water tower," says Marquez. The hard drives feed energy and temperature data to a network server running specially designed software that controls and monitors the data center. To test the center's limits, the team runs the processors flat out -- not only on carefully controlled test programs in the Energy Smart computers, but also on real world software from other EMSL research, such as regional weather forecasting models. Marquez's group is also developing "power aware computing", where the computer programs themselves perform calculations more energy efficiently. Maybe once computers get smart about energy, they'll have tips for their users.
Parallel Computing: Some Activities in High Energy Physics
NASA Astrophysics Data System (ADS)
Willers, Ian
This paper examines some activities in High Energy Physics that utilise parallel computing. The topic includes all computing from the proposed SIMD front end detectors, the farming applications, high-powered RISC processors and the large machines in the computer centers. We start by looking at the motivation behind using parallelism for general purpose computing. The developments around farming are then described from its simplest form to the more complex system in Fermilab. Finally, there is a list of some developments that are happening close to the experiments.
The Future is Hera! Analyzing Astronomical Data Over the Internet
NASA Technical Reports Server (NTRS)
Valencic, L. A.; Chai, P.; Pence, W.; Shafer, R.; Snowden, S.
2008-01-01
Hera is the data processing facility provided by the High Energy Astrophysics Science Archive Research Center (HEASARC) at the NASA Goddard Space Flight Center for analyzing astronomical data. Hera provides all the pre-installed software packages, local disk space, and computing resources needed to do general processing of FITS format data files residing on the user's local computer, and to do research using the publicly available data from the High Energy Astrophysics Division. Qualified students, educators, and researchers may freely use the Hera services over the internet for research and educational purposes.
Energy Consumption Management of Virtual Cloud Computing Platform
NASA Astrophysics Data System (ADS)
Li, Lin
2017-11-01
To conduct research on energy consumption management for virtual cloud computing platforms, the energy consumption behavior of virtual machines and of the cloud computing platform itself must be understood in depth; only then can the problems facing energy consumption management be solved. The key problem points to data centers with high energy consumption, so new scientific techniques are greatly needed. Virtualization technology and cloud computing have become powerful tools in daily life, work, and production because of their strengths and many advantages, and both are developing rapidly with very high resource utilization rates; their presence is thus essential in the constantly developing information age. This paper summarizes, explains, and further analyzes the energy consumption management questions of the virtual cloud computing platform, giving readers a clearer understanding of the topic and its applications to many aspects of daily life and work.
High-Performance Computing Data Center Power Usage Effectiveness |
Power Usage Effectiveness: When the Energy Systems Integration Facility (ESIF) was conceived, NREL set an aggressive target for power usage effectiveness. One load category is heating, ventilation, and air conditioning (HVAC), which captures fan walls and fan coils that support the data center.
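For reference, the metric named in this entry has a standard definition, stated here as general background rather than NREL's exact accounting:

```latex
% Power usage effectiveness: total facility energy over IT equipment energy.
\begin{equation}
  \mathrm{PUE} = \frac{E_{\text{total facility}}}{E_{\text{IT equipment}}} \;\ge\; 1 .
\end{equation}
% HVAC, fans, and power-distribution losses are what push the ratio above 1;
% a PUE of exactly 1 would mean every joule goes to computing.
```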
Steve Hammond, Center Director, Steven.Hammond@nrel.gov | 303-275-4121. Steve Hammond is director of the Computational Science Center at the National Renewable Energy Laboratory; his role includes leading NREL's efforts in energy-efficient data centers. Prior to NREL, Steve managed the...
Argonne's Magellan Cloud Computing Research Project
Beckman, Pete
2017-12-11
Pete Beckman, head of Argonne's Leadership Computing Facility (ALCF), discusses the Department of Energy's new $32-million Magellan project, which is designed to test how cloud computing can be used for scientific research. More information: http://www.anl.gov/Media_Center/News/2009/news091014a.html
An Efficient Virtual Machine Consolidation Scheme for Multimedia Cloud Computing.
Han, Guangjie; Que, Wenhui; Jia, Gangyong; Shu, Lei
2016-02-18
Cloud computing has innovated the IT industry in recent years, as it can deliver subscription-based services to users in the pay-as-you-go model. Meanwhile, multimedia cloud computing is emerging based on cloud computing to provide a variety of media services on the Internet. However, with the growing popularity of multimedia cloud computing, its large energy consumption not only contributes to greenhouse gas emissions but also raises cloud users' costs. Therefore, multimedia cloud providers should try to minimize energy consumption as much as possible while satisfying consumers' resource requirements and guaranteeing quality of service (QoS). In this paper, we propose a remaining utilization-aware (RUA) algorithm for virtual machine (VM) placement, and a power-aware algorithm (PA) to find proper hosts to shut down for energy saving. These two algorithms have been combined and applied to cloud data centers to complete the process of VM consolidation. Simulation results show that there exists a trade-off between the cloud data center's energy consumption and service-level agreement (SLA) violations. Besides, the RUA algorithm is able to deal with variable workload to prevent hosts from overloading after VM placement and to reduce SLA violations dramatically.
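A best-fit style sketch of remaining-utilization-aware placement in the spirit of RUA; the paper's exact scoring rule may differ, and all names and capacities here are illustrative:

```python
# Place each VM on the host that would be left with the least spare
# capacity, so lightly used hosts drain and can be powered off.

def place_vm(vm_demand, hosts):
    """hosts: dict name -> free capacity (e.g., normalized CPU share).
    Returns the chosen host name, or None if the VM fits nowhere."""
    candidates = {h: free - vm_demand for h, free in hosts.items()
                  if free >= vm_demand}
    if not candidates:
        return None  # would trigger waking a standby host in a real system
    best = min(candidates, key=candidates.get)
    hosts[best] -= vm_demand  # commit the placement
    return best

hosts = {"host-a": 0.30, "host-b": 0.55, "host-c": 0.10}  # hypothetical
for demand in (0.25, 0.25, 0.10):
    print(place_vm(demand, hosts))  # host-a, host-b, host-c
```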
US Department of Energy High School Student Supercomputing Honors Program: A follow-up assessment
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1987-01-01
The US DOE High School Student Supercomputing Honors Program was designed to recognize high school students with superior skills in mathematics and computer science and to provide them with formal training and experience with advanced computer equipment. This document reports on the participants who attended the first such program, which was held at the National Magnetic Fusion Energy Computer Center at the Lawrence Livermore National Laboratory (LLNL) during August 1985.
High-Performance Computing Data Center Waste Heat Reuse | Computational
With heat exchangers, heat energy in the energy recovery water (ERW) loop becomes available to heat the facility's process hot water (PHW) loop. Once heated, the PHW loop supplies: an active heating loop in the courtyard of the ESIF's main entrance; a district heating loop, if additional heat is needed...
Photo of a person viewing a 3D visualization of a wind turbine. The NREL Computational Science Center tackles challenges in fields ranging from condensed matter physics and nonlinear dynamics to computational fluid dynamics. NREL is also home to the most energy-efficient data center in the world, featuring Peregrine, the...
Transport properties of two-dimensional metal-phthalocyanine junctions: An ab initio study
NASA Astrophysics Data System (ADS)
Liu, Shuang-Long; Wang, Yun-Peng; Li, Xiang-Guo; Cheng, Hai-Ping
We study two-dimensional (2D) electronic/spintronic junctions made of metal-organic frameworks via first-principles simulation. The system consists of two Mn-phthalocyanine leads and a Ni-phthalocyanine center. A 2D Mn-phthalocyanine sheet is a ferromagnetic half metal, and a 2D Ni-phthalocyanine sheet is a nonmagnetic semiconductor. Our results show that this system has a large tunneling magnetoresistance. The transmission coefficient at the Fermi energy decays exponentially with the length of the central region, which is not surprising. However, the transmission of the junction can be tuned by gate voltage by up to two orders of magnitude. The origin of the change lies in the mode matching between the lead and center electronic states. Moreover, the threshold gate voltage varies with the length of the center region, which provides a way of engineering the transport properties. Finally, we combine non-equilibrium Green's function and Boltzmann transport equation methods to compute the conductance of the junction. This work was supported by the US Department of Energy (DOE), Office of Basic Energy Sciences (BES), under Contract No. DE-FG02-02ER45995. Computations were done using the utilities of NERSC and University of Florida Research Computing.
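The conductance quoted in such NEGF studies is conventionally the Landauer expression evaluated at the Fermi energy, stated here as standard background rather than the authors' exact workflow:

```latex
% Landauer conductance from the NEGF transmission function:
\begin{equation}
  G \;=\; \frac{e^{2}}{h} \sum_{\sigma} T_{\sigma}(E_{F}),
\end{equation}
% where T_sigma(E_F) is the spin-resolved transmission at the Fermi energy;
% gating shifts the lead-center mode matching and hence T, which is how the
% two-orders-of-magnitude tunability reported above arises.
```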
Generalized shading analysis for paraboloidal collector fields
NASA Technical Reports Server (NTRS)
Osborn, D. B.
1980-01-01
This paper presents the development and results of a generalized shading analysis for a field of point-focus parabolic dish concentrators. Shading of one concentrator by another, with attendant loss of energy, is a function of the position of the sun and the relative locations of the concentrators within the field. A method is presented for determining the annualized energy loss which includes a trade-off of system life-cycle energy as a function of concentrator spacing and field geometric layout. System energy output is computed on an annualized basis, employing 15-minute-increment environmental data tapes for the year 1976 at Barstow, California. For a land cost of $5000 per acre, lowest system energy cost occurs at about a 25 percent packing fraction (concentrator area/land area) for a typical 1-MWe dish-Stirling solar thermal power plant. Basic equations are given for computing the shading and concomitant energy loss as a function of concentrator center-to-center spacing, field layout, and site location.
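A small worked example of the quoted packing fraction (concentrator aperture area over land area), assuming a uniform square grid; the paper's field layouts are more general:

```python
import math

def packing_fraction(dish_diameter_m, grid_spacing_m):
    """Aperture area of one dish divided by the land area it occupies."""
    aperture = math.pi * dish_diameter_m**2 / 4.0
    return aperture / grid_spacing_m**2

# For a hypothetical 11 m dish, a ~25% packing fraction needs roughly
# 19.5 m center-to-center spacing:
print(packing_fraction(11.0, 19.5))  # ~0.25
```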
Chmela, Jiří; Greisch, Jean-François; Harding, Michael E; Klopper, Wim; Kappes, Manfred M; Schooss, Detlef
2018-03-08
The gas-phase laser-induced photoluminescence of cationic mononuclear gadolinium and lutetium complexes involving two 9-oxophenalen-1-one ligands is reported. Performing measurements at a temperature of 83 K enables us to resolve vibronic transitions. Via comparison to Franck-Condon computations, the main vibrational contributions to the ligand-centered phosphorescence are determined to involve rocking, wagging, and stretching of the 9-oxophenalen-1-one-lanthanoid coordination in the low-energy range, intraligand bending, and stretching in the medium- to high-energy range, rocking of the carbonyl and methine groups, and C-H stretching beyond. Whereas Franck-Condon calculations based on density-functional harmonic frequency computations reproduce the main features of the vibrationally resolved emission spectra, the absolute transition energies as determined by density functional theory are off by several thousand wavenumbers. This discrepancy is found to remain at higher computational levels. The relative energy of the Gd(III) and Lu(III) emission bands is only reproduced at the coupled-cluster singles and doubles level and beyond.
DOT National Transportation Integrated Search
1998-08-01
The United States Department of Transportation, John A. Volpe National Transportation Systems Center (Volpe Center), Acoustics Facility, in support of the Federal Aviation Administration's Office of Environment and Energy (AEE), has recently co...
HNM, heliport noise model : version 2.2 user's guide
DOT National Transportation Integrated Search
1994-02-01
The John A. Volpe National Transportation Systems Center (Volpe Center), in support of the Federal Aviation Administration, Office of Environment and Energy, has developed Version 2.2 of the Heliport Noise Model (HNM). The HNM is a computer program...
First-principles data-driven discovery of transition metal oxides for artificial photosynthesis
NASA Astrophysics Data System (ADS)
Yan, Qimin
We develop a first-principles data-driven approach for rapid identification of transition metal oxide (TMO) light absorbers and photocatalysts for artificial photosynthesis using the Materials Project. Initially focusing on Cr-, V-, and Mn-based ternary TMOs in the database, we design a broadly applicable multiple-layer screening workflow automating density functional theory (DFT) and hybrid functional calculations of bulk and surface electronic and magnetic structures. We further assess the electrochemical stability of TMOs in aqueous environments from computed Pourbaix diagrams. Several promising earth-abundant low-band-gap TMO compounds with desirable band edge energies and electrochemical stability are identified by our computational efforts and then synergistically evaluated using high-throughput synthesis and photoelectrochemical screening techniques by our experimental collaborators at Caltech. Our joint theory-experiment effort has successfully identified new earth-abundant copper and manganese vanadate complex oxides that meet highly demanding requirements for photoanodes, substantially expanding the known space of such materials. By integrating theory and experiment, we validate our approach and develop important new insights into structure-property relationships for TMOs for oxygen evolution photocatalysts, paving the way for use of first-principles data-driven techniques in future applications. This work is supported by the Materials Project Predictive Modeling Center and the Joint Center for Artificial Photosynthesis through the U.S. Department of Energy, Office of Basic Energy Sciences, Materials Sciences and Engineering Division, under Contract No. DE-AC02-05CH11231. Computational resources were also provided by the Department of Energy through the National Energy Research Scientific Computing Center.
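The multiple-layer screening workflow reads naturally as a chain of property filters. This toy sketch shows the shape of such a pipeline; the thresholds, property names, and candidate records are illustrative placeholders, not the study's criteria or data:

```python
# Successive filters on computed properties narrow a candidate pool.

def screen(candidates, gap_range=(1.2, 2.5), max_pbx_ev=0.05):
    """Keep oxides whose computed band gap falls in the target window and
    whose Pourbaix decomposition energy (energy above the aqueous stability
    hull; lower means more stable) is within tolerance."""
    return [c for c in candidates
            if gap_range[0] <= c["gap_ev"] <= gap_range[1]
            and c["pbx_dE_ev"] <= max_pbx_ev]

pool = [  # hypothetical screening records
    {"formula": "CuV2O6", "gap_ev": 1.9, "pbx_dE_ev": 0.02},
    {"formula": "MnV2O6", "gap_ev": 1.8, "pbx_dE_ev": 0.20},
    {"formula": "Cr2O3",  "gap_ev": 3.4, "pbx_dE_ev": 0.01},
]
print([c["formula"] for c in screen(pool)])  # ['CuV2O6']
```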
Staff | Computational Science | NREL
The Computational Science Center develops and leads laboratory-wide efforts in high-performance computing and energy-efficient data centers. Staff include: Jim Albin, IT Professional IV-High Performance Computing, Jim.Albin@nrel.gov, 303-275-4069; Shreyas Ananthan, Senior Scientist, High-Performance Algorithms and Modeling, Shreyas.Ananthan@nrel.gov, 303-275-4807; Kurt Bendl, IT Professional IV-High...
Ammonia Oxidation by Abstraction of Three Hydrogen Atoms from a Mo–NH3 Complex
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bhattacharya, Papri; Heiden, Zachariah M.; Wiedner, Eric S.
We report ammonia oxidation by homolytic cleavage of all three H atoms from a Mo-15NH3 complex using the 2,4,6-tri-tert-butylphenoxyl radical to afford a Mo-alkylimido (Mo=15NR) complex (R = 2,4,6-tri-t-butylcyclohexa-2,5-dien-1-one). Reductive cleavage of Mo=15NR generates a terminal Mo≡N nitride, and a [Mo-15NH]+ complex is formed by protonation. Computational analysis describes the energetic profile for the stepwise removal of three H atoms from the Mo-15NH3 complex and the formation of Mo=15NR. Acknowledgment. This work was supported as part of the Center for Molecular Electrocatalysis, an Energy Frontier Research Center funded by the U.S. Department of Energy (U.S. DOE), Office of Science, Office of Basic Energy Sciences. EPR and mass spectrometry experiments were performed using EMSL, a national scientific user facility sponsored by the DOE's Office of Biological and Environmental Research and located at PNNL. The authors thank Dr. Eric D. Walter and Dr. Rosalie Chu for assistance in performing EPR and mass spectroscopy analysis, respectively. Computational resources were provided by the National Energy Research Scientific Computing Center (NERSC) at Lawrence Berkeley National Laboratory. Pacific Northwest National Laboratory is operated by Battelle for the U.S. DOE.
Lawrenz, Morgan; Baron, Riccardo; Wang, Yi; McCammon, J Andrew
2012-01-01
The Independent-Trajectory Thermodynamic Integration (IT-TI) approach for free energy calculation with distributed computing is described. IT-TI utilizes diverse conformational sampling obtained from multiple, independent simulations to obtain more reliable free energy estimates compared to single TI predictions. The latter may significantly under- or over-estimate the binding free energy due to finite sampling. We exemplify the advantages of the IT-TI approach using two distinct cases of protein-ligand binding. In both cases, IT-TI yields distributions of absolute binding free energy estimates that are remarkably centered on the target experimental values. Alternative protocols for the practical and general application of IT-TI calculations are investigated. We highlight a protocol that maximizes predictive power and computational efficiency.
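The IT-TI aggregation step amounts to pooling independent estimates; a minimal sketch, with hypothetical run values:

```python
# Run several independent TI calculations and report the distribution of
# free energy estimates rather than trusting a single run.

import statistics

def it_ti_estimate(dg_estimates):
    """Mean and standard error over independent TI free energy estimates."""
    mean = statistics.fmean(dg_estimates)
    sem = statistics.stdev(dg_estimates) / len(dg_estimates) ** 0.5
    return mean, sem

runs = [-9.8, -10.6, -10.1, -9.5, -10.4]  # kcal/mol, hypothetical runs
mean, sem = it_ti_estimate(runs)
print(f"dG_bind = {mean:.2f} +/- {sem:.2f} kcal/mol")  # -10.08 +/- 0.20
```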
Solar Schools: Rock LaFleche Development Center; Troy Elementary Schools.
ERIC Educational Resources Information Center
School Business Affairs, 1979
1979-01-01
A solar heating and hot water system is part of the basic design at a center for severely handicapped children in California. A computer will be recording temperature and energy usage at the largest solar installation in a Michigan school. (Author/MLF)
Effect of Graphene with Nanopores on Metal Clusters
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhou, Hu; Chen, Xianlang; Wang, Lei
Porous graphene, which is a novel type of defective graphene, shows excellent potential as a support material for metal clusters. In this work, the stability and electronic structures of metal clusters (Pd, Ir, Rh) supported on pristine graphene and graphene with different sizes of nanopore were investigated by first-principle density functional theory (DFT) calculations. Thereafter, CO adsorption and oxidation reaction on the Pd-graphene system were chosen to evaluate its catalytic performance. Graphene with nanopore can strongly stabilize the metal clusters and cause a substantial downshift of the d-band center of the metal clusters, thus decreasing CO adsorption. All binding energies, d-band centers, and adsorption energies show a linear change with the size of the nanopore: a bigger size of nanopore corresponds to a stronger metal clusters bond to the graphene, lower downshift of the d-band center, and weaker CO adsorption. By using a suitable size nanopore, supported Pd clusters on the graphene will have similar CO and O2 adsorption ability, thus leading to superior CO tolerance. The DFT calculated reaction energy barriers show that graphene with nanopore is a superior catalyst for CO oxidation reaction. These properties can play an important role in instructing graphene-supported metal catalyst preparation to prevent the diffusion or agglomeration of metal clusters and enhance catalytic performance. This work was supported by National Basic Research Program of China (973Program) (2013CB733501), the National Natural Science Foundation of China (NSFC-21176221, 21136001, 21101137, 21306169, and 91334013). D. Mei acknowledges the support from the US Department of Energy, Office of Science, Office of Basic Energy Sciences, Division of Chemical Sciences, Geosciences & Biosciences. Pacific Northwest National Laboratory (PNNL) is a multiprogram national laboratory operated for DOE by Battelle. Computing time was granted by the grand challenge of computational catalysis of the William R. Wiley Environmental Molecular Sciences Laboratory (EMSL) and by the National Energy Research Scientific Computing Center (NERSC).
DOE Office of Scientific and Technical Information (OSTI.GOV)
Raugei, Simone; DuBois, Daniel L.; Rousseau, Roger J.
Rational design of molecular catalysts requires a systematic approach to designing ligands with specific functionality and precisely tailored electronic and steric properties. It then becomes possible to devise computer protocols to predict accurately the required properties and ultimately to design catalysts by computer. In this account we first review how thermodynamic properties such as oxidation-reduction potentials (E0), acidities (pKa), and hydride donor abilities (ΔGH-) form the basis for a systematic design of molecular catalysts for reactions that are critical for a secure energy future (hydrogen evolution and oxidation, oxygen and nitrogen reduction, and carbon dioxide reduction). We highlight how density functional theory allows us to determine and predict these properties within "chemical" accuracy (~ 0.06 eV for redox potentials, ~ 1 pKa unit for pKa values, and ~ 1.5 kcal/mol for hydricities). These quantities determine free energy maps and profiles associated with catalytic cycles, i.e. the relative energies of intermediates, and help us distinguish between desirable and high-energy pathways and mechanisms. Good catalysts have flat profiles that avoid high activation barriers due to low and high energy intermediates. We illustrate how the criterion of a flat energy profile lends itself to the prediction of design points by computer for optimum catalysts. This research was carried out in the Center for Molecular Electro-catalysis, an Energy Frontier Research Center funded by the U.S. Department of Energy (DOE), Office of Science, Office of Basic Energy Sciences. Pacific Northwest National Laboratory (PNNL) is operated for the DOE by Battelle.
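The thermodynamic properties listed above are tied to free energies by standard electrochemical identities, which make the quoted accuracy targets mutually consistent (stated as textbook relations, not PNNL-specific formulas):

```latex
% Textbook identities linking the quoted properties to free energies:
\begin{align}
  \Delta G^{0}_{\text{redox}} &= -nFE^{0}, \\
  \Delta G^{0}_{\text{deprot}} &= RT\ln(10)\,\mathrm{p}K_{\mathrm{a}}
    \approx 1.37\,\mathrm{p}K_{\mathrm{a}}~\text{kcal/mol at 298 K}.
\end{align}
% Thus ~0.06 eV in E0 (~1.4 kcal/mol), ~1 pKa unit (~1.4 kcal/mol), and
% ~1.5 kcal/mol in hydricity are accuracy targets of comparable size.
```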
TOSCA calculations and measurements for the SLAC SLC damping ring dipole magnet
NASA Astrophysics Data System (ADS)
Early, R. A.; Cobb, J. K.
1985-04-01
The SLAC damping ring dipole magnet was originally designed with removable nose pieces at the ends. Recently, a set of magnetic measurements was taken of the vertical component of induction along the center of the magnet for four different pole-end configurations and several current settings. The three dimensional computer code TOSCA, which is currently installed on the National Magnetic Fusion Energy Computer Center's Cray X-MP, was used to compute field values for the four configurations at current settings near saturation. Comparisons were made for magnetic induction as well as effective magnetic lengths for the different configurations.
ERIC Educational Resources Information Center
Clearing: Nature and Learning in the Pacific Northwest, 1985
1985-01-01
Presents an activity in which students create a computer program capable of recording and projecting paper use at school. Includes instructional strategies and background information such as requirements for pounds of paper/tree, energy needs, water consumption, and paper value at the recycling center. A sample program is included. (DH)
BigData and computing challenges in high energy and nuclear physics
NASA Astrophysics Data System (ADS)
Klimentov, A.; Grigorieva, M.; Kiryanov, A.; Zarochentsev, A.
2017-06-01
In this contribution we discuss the various aspects of the computing resource needs of experiments in High Energy and Nuclear Physics, in particular at the Large Hadron Collider. These needs will evolve when moving from the LHC to the HL-LHC ten years from now, when the already exascale levels of data we are processing could increase by a further order of magnitude. The distributed computing environment has been a great success, and the inclusion of new super-computing facilities, cloud computing, and volunteer computing in the future is a big challenge, which we are successfully mastering with a considerable contribution from many super-computing centres around the world and from academic and commercial cloud providers. We also discuss R&D computing projects started recently in the National Research Center "Kurchatov Institute".
High-Performance Computing Data Center Cooling System Energy Efficiency |
Cooling approaches involve a cooling distribution unit (CDU) (2), which interfaces with the facility cooling loop and the energy recovery water (ERW) loop (5), a closed-loop system. There are three heat rejection options for this IT load. When possible, heat energy from the energy recovery loop is transferred...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zunger, Alex
"Inverse Design: Playing 'Jeopardy' in Materials Science" was submitted by the Center for Inverse Design (CID) to the "Life at the Frontiers of Energy Research" video contest at the 2011 Science for Our Nation's Energy Future: Energy Frontier Research Centers (EFRCs) Summit and Forum. Twenty-six EFRCs created short videos to highlight their mission and their work. CID, an EFRC directed by Bill Tumas at the National Renewable Energy Laboratory is a partnership of scientists from six institutions: NREL (lead), Northwestern University, University of Colorado, Colorado School of Mines, Stanford University, and Oregon State University. The Office of Basic Energy Sciencesmore » in the U.S. Department of Energy's Office of Science established the 46 Energy Frontier Research Centers (EFRCs) in 2009. These collaboratively-organized centers conduct fundamental research focused on 'grand challenges' and use-inspired 'basic research needs' recently identified in major strategic planning efforts by the scientific community. The overall purpose is to accelerate scientific progress toward meeting the nation's critical energy challenges. The mission of the Center for Inverse Design is 'to replace trial-and-error methods used in the development of materials for solar energy conversion with an inverse design approach powered by theory and computation.' Research topics are: solar photovoltaic, photonic, metamaterial, defects, spin dynamics, matter by design, novel materials synthesis, and defect tolerant materials.« less
DCDM1: Lessons Learned from the World's Most Energy Efficient Data Center
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sickinger, David E; Van Geet, Otto D; Carter, Thomas
This presentation discusses the holistic approach to design the world's most energy-efficient data center, which is located at the U.S. Department of Energy National Renewable Energy Laboratory (NREL). This high-performance computing (HPC) data center has achieved a trailing twelve-month average power usage effectiveness (PUE) of 1.04 and features a chiller-less design, component-level warm-water liquid cooling, and waste heat capture and reuse. We provide details of the demonstrated PUE and energy reuse effectiveness (ERE) and lessons learned during four years of production operation. Recent efforts to dramatically reduce the water footprint will also be discussed. Johnson Controls partnered with NREL and Sandia National Laboratories to deploy a thermosyphon cooler (TSC) as a test bed at NREL's HPC data center that resulted in a 50% reduction in water usage during the first year of operation. The Thermosyphon Cooler Hybrid System (TCHS) integrates the control of a dry heat rejection device with an open cooling tower.
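For reference, the two figures quoted above follow the Green Grid definitions, stated here as background; NREL's exact metering boundaries may differ:

```latex
% PUE counts all facility energy against IT energy; ERE additionally
% credits heat that is captured and reused.
\begin{align}
  \mathrm{PUE} &= \frac{E_{\text{total facility}}}{E_{\text{IT}}}, &
  \mathrm{ERE} &= \frac{E_{\text{total facility}} - E_{\text{reused}}}{E_{\text{IT}}}.
\end{align}
% A trailing-twelve-month PUE of 1.04 means only ~4% of facility energy is
% overhead, and captured waste heat can push ERE below 1.
```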
Computer Assisted Virtual Environment - CAVE
DOE Office of Scientific and Technical Information (OSTI.GOV)
Erickson, Phillip; Podgorney, Robert; Weingartner,
2018-05-30
Research at the Center for Advanced Energy Studies is taking on another dimension with a 3-D device known as a Computer Assisted Virtual Environment. The CAVE uses projection to display high-end computer graphics on three walls and the floor. By wearing 3-D glasses to create depth perception and holding a wand to move and rotate images, users can delve into data.
NASA Astrophysics Data System (ADS)
Tsuda, Kunikazu; Tano, Shunichi; Ichino, Junko
Lowering power consumption has become a worldwide concern, and it is a growing issue for computer systems, as reflected by the rise of software-as-a-service and cloud computing since 2000 and the accompanying rapid increase in the number of data centers that house and manage the computers. Power consumption at data centers accounts for a large share of total IT power usage and is still rapidly increasing. This research focuses on air conditioning, which accounts for the largest portion of electric power consumption in data centers, and proposes a technique to lower power consumption by using naturally cool air and snow to control temperature and humidity. We verify the effectiveness of this approach by experiment. Furthermore, we examine the extent to which energy reduction is possible when a data center is located in Hokkaido.
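The free-cooling argument can be framed as a simple availability estimate: count the hours when outdoor air is cold enough to displace mechanical chilling. A sketch with an illustrative temperature series (not the paper's Hokkaido data):

```python
# Estimate the fraction of hours when outside air alone could cool the room.

def free_cooling_fraction(hourly_temps_c, threshold_c=15.0):
    """Fraction of hours when outdoor air is at or below the threshold."""
    usable = sum(1 for t in hourly_temps_c if t <= threshold_c)
    return usable / len(hourly_temps_c)

# Hypothetical week of hourly temperatures swinging around 12 C:
temps = [12.0 + 8.0 * ((h % 24) - 12) / 12 for h in range(24 * 7)]
print(f"{free_cooling_fraction(temps):.0%}")  # ~71% of hours
```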
Nonperturbative methods in HZE ion transport
NASA Technical Reports Server (NTRS)
Wilson, John W.; Badavi, Francis F.; Costen, Robert C.; Shinn, Judy L.
1993-01-01
A nonperturbative analytic solution of the high charge and energy (HZE) Green's function is used to implement a computer code for laboratory ion beam transport. The code is established to operate on the Langley Research Center nuclear fragmentation model used in engineering applications. Computational procedures are established to generate linear energy transfer (LET) distributions for a specified ion beam and target for comparison with experimental measurements. The code is highly efficient and compares well with the perturbation approximations.
Center for Building Science: Annual report, FY 1986
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cairns, E.J.; Rosenfeld, A.H.
1987-05-01
The Center for Building Science consists of four programs in the Applied Science Division: energy analysis, buildings energy systems, windows and lighting, and indoor environment. It was established to provide an umbrella so that groups in different programs but with similar interests could combine to perform joint research, develop new research areas, share resources, and produce joint publications. As detailed below, potential savings for U.S. society from energy-efficient buildings are enormous. But these savings can only be realized through an expanding federal R&D program that develops expertise in this new area. The Center for Building Science develops efficient new building components, computer models, data and information systems, and trains needed building scientists. 135 refs., 72 figs., 18 tabs.
Localized overlap algorithm for unexpanded dispersion energies
NASA Astrophysics Data System (ADS)
Rob, Fazle; Misquitta, Alston J.; Podeszwa, Rafał; Szalewicz, Krzysztof
2014-03-01
A first-principles-based, linearly scaling algorithm has been developed for calculating dispersion energies from frequency-dependent density susceptibility (FDDS) functions while accounting for charge-overlap effects. The transition densities in FDDSs are fitted by a set of auxiliary atom-centered functions. The terms in the dispersion energy expression involving products of such functions are computed using either the unexpanded (exact) formula or inexpensive asymptotic expansions, depending on the location of these functions relative to the dimer configuration. This approach leads to significant savings of computational resources. In particular, for a dimer consisting of two elongated monomers with 81 atoms each in a head-to-head configuration, the most favorable case for our algorithm, a 43-fold speedup has been achieved while the approximate dispersion energy differs by less than 1% from that computed using the standard unexpanded approach. In contrast, the dispersion energy computed from the distributed asymptotic expansion differs by tens of percent in the van der Waals minimum region. A further increase in the size of each monomer would add only a small additional cost, since all the additional terms would be computed from the asymptotic expansion.
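The distance-based switching behind the reported speedup can be sketched as follows. This is a generic illustration, not the authors' implementation: the cutoff value, the function names, and the use of a Tang-Toennies-damped -C6/R^6 term as a stand-in for the exact overlap-corrected integral are all assumptions, and the quadratic loop ignores the neighbor-list bookkeeping that makes the real algorithm linear-scaling.

```python
import math

def damped_c6(r, c6, beta=1.0):
    # Toy stand-in for the expensive unexpanded (overlap-corrected) term:
    # a Tang-Toennies-damped -C6/R^6 that stays finite at short range.
    s = sum((beta * r) ** k / math.factorial(k) for k in range(7))
    return -(1.0 - math.exp(-beta * r) * s) * c6 / r ** 6

def pair_dispersion(a, b, c6, cutoff=8.0):
    # Exact (expensive) treatment only for nearby function pairs;
    # cheap asymptotic -C6/R^6 for well-separated pairs.
    r = math.dist(a, b)
    return damped_c6(r, c6) if r <= cutoff else -c6 / r ** 6

def total_dispersion(centers, c6_table, cutoff=8.0):
    e = 0.0
    for i in range(len(centers)):
        for j in range(i + 1, len(centers)):
            e += pair_dispersion(centers[i], centers[j],
                                 c6_table[i][j], cutoff)
    return e

# Two toy fitting-function centers 10 bohr apart:
print(total_dispersion([(0, 0, 0), (0, 0, 10.0)], [[0, 50.0], [50.0, 0]]))
```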
Press Releases | Argonne National Laboratory
Highlights: John Carlisle, director of Chain Reaction Innovations, on working across the nation to grow startups; Argonne announces the second cohort of Chain Reaction Innovations (April 2018).
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sheppy, M.; Lobato, C.; Van Geet, O.
2011-12-01
This publication details the design, implementation strategies, and continuous performance monitoring of NREL's Research Support Facility data center. Data centers are energy-intensive spaces that facilitate the transmission, receipt, processing, and storage of digital data. These spaces require redundancies in power and storage, as well as infrastructure, to cool computing equipment and manage the resulting waste heat (Tschudi, Xu, Sartor, and Stein, 2003). Data center spaces can consume more than 100 times the energy of standard office spaces (VanGeet 2011). The U.S. Environmental Protection Agency (EPA) reported that data centers used 61 billion kilowatt-hours (kWh) in 2006, which was 1.5% of the total electricity consumption in the U.S. (U.S. EPA, 2007). Worldwide, data centers now consume more energy annually than Sweden (New York Times, 2009). Given their high energy consumption and conventional operation practices, there is potential for huge energy savings in data centers. The National Renewable Energy Laboratory (NREL) is world renowned for its commitment to green building construction. In June 2010, the laboratory finished construction of a 220,000-square-foot (ft²), LEED Platinum, Research Support Facility (RSF), which included a 1,900-ft² data center. The RSF will expand to 360,000 ft² with the opening of an additional wing in December 2011. The project's request for proposals (RFP) set a whole-building demand-side energy use requirement of a nominal 35 kBtu/ft² per year. On-site renewable energy generation will offset the annual energy consumption. To support the RSF's energy goals, NREL's new data center was designed to minimize its energy footprint without compromising service quality. Several implementation challenges emerged during the design, construction, and first 11 months of operation of the RSF data center. This document highlights these challenges and describes in detail how NREL successfully overcame them. The IT settings and strategies outlined in this document have been used to significantly reduce data center energy requirements in the RSF; they can also be applied in existing buildings and retrofits.
Integrating Grid Services into the Cray XT4 Environment
DOE Office of Scientific and Technical Information (OSTI.GOV)
NERSC; Cholia, Shreyas; Lin, Hwa-Chun Wendy
2009-05-01
The 38,640-core Cray XT4 "Franklin" system at the National Energy Research Scientific Computing Center (NERSC) is a massively parallel resource available to Department of Energy researchers that also provides on-demand grid computing to the Open Science Grid. The integration of grid services on Franklin presented various challenges, including fundamental differences between the interactive and compute nodes, a stripped-down compute-node operating system without dynamic library support, a shared-root environment, and idiosyncratic application launching. In our work, we describe how we resolved these challenges on a running, general-purpose production system to provide on-demand compute, storage, accounting, and monitoring services through generic grid interfaces that mask the underlying system-specific details for the end user.
Electronic Structure Theory | Materials Science | NREL
NREL's electronic structure theory research helps design and discover materials for energy applications, including detailed studies of their physical properties using high-performance computing. NREL leads the U.S. Department of Energy's Center for Next Generation of Materials by Design, which incorporates metastability and synthesizability.
ASCR Cybersecurity for Scientific Computing Integrity - Research Pathways and Ideas Workshop
DOE Office of Scientific and Technical Information (OSTI.GOV)
Peisert, Sean; Potok, Thomas E.; Jones, Todd
At the request of the U.S. Department of Energy's (DOE) Office of Science (SC) Advanced Scientific Computing Research (ASCR) program office, a workshop was held June 2-3, 2015, in Gaithersburg, MD, to identify potential long-term (10- to 20+ year) cybersecurity fundamental basic research and development challenges, strategies, and a roadmap for future high performance computing (HPC), networks, data centers, and extreme-scale scientific user facilities. This workshop was a follow-on to the workshop held January 7-9, 2015, in Rockville, MD, that examined higher-level ideas about scientific computing integrity specific to the mission of the DOE Office of Science. Issues included research computation and simulation that takes place on ASCR computing facilities and networks, as well as network-connected scientific instruments, such as those run by various DOE Office of Science programs. Workshop participants included researchers and operational staff from DOE national laboratories, as well as academic researchers and industry experts. Participants were selected based on the submission of abstracts relating to the topics discussed in the previous workshop report [1] and also from other ASCR reports, including "Abstract Machine Models and Proxy Architectures for Exascale Computing" [27], the DOE "Preliminary Conceptual Design for an Exascale Computing Initiative" [28], and the January 2015 machine learning workshop [29]. The workshop was also attended by several observers from DOE and other government agencies. The workshop was divided into three topic areas: (1) Trustworthy Supercomputing, (2) Extreme-Scale Data, Knowledge, and Analytics for Understanding and Improving Cybersecurity, and (3) Trust within High-end Networking and Data Centers. Participants were divided into three corresponding teams based on the category of their abstracts. The workshop began with a series of talks from the program manager and workshop chair, followed by the leaders for each of the three topics and a representative of each of the four major DOE Office of Science Advanced Scientific Computing Research Facilities: the Argonne Leadership Computing Facility (ALCF), the Energy Sciences Network (ESnet), the National Energy Research Scientific Computing Center (NERSC), and the Oak Ridge Leadership Computing Facility (OLCF). The rest of the workshop consisted of topical breakout discussions and focused writing periods that produced much of this report.
NASA Astrophysics Data System (ADS)
Lutz, Jesse J.; Duan, Xiaofeng F.; Burggraf, Larry W.
2018-03-01
Valence excitation spectra are computed for deep-center silicon-vacancy defects in 3C, 4H, and 6H silicon carbide (SiC), and comparisons are made with literature photoluminescence measurements. Optimizations of nuclear geometries surrounding the defect centers are performed within a Gaussian basis-set framework using many-body perturbation theory or density functional theory (DFT) methods, with computational expenses minimized by a QM/MM technique called SIMOMM. Vertical excitation energies are subsequently obtained by applying excitation-energy, electron-attached, and ionized equation-of-motion coupled-cluster (EOMCC) methods, where appropriate, as well as time-dependent (TD) DFT, to small models including only a few atoms adjacent to the defect center. We consider the relative quality of various EOMCC and TD-DFT methods for (i) energy-ordering potential ground states differing incrementally in charge and multiplicity, (ii) accurately reproducing experimentally measured photoluminescence peaks, and (iii) energy-ordering defects of different types occurring within a given polytype. The extensibility of this approach to transition-metal defects is also tested by applying it to silicon-substituted chromium defects in SiC and comparing with measurements. It is demonstrated that, when used in conjunction with SIMOMM-optimized geometries, EOMCC-based methods can provide a reliable prediction of the ground-state charge and multiplicity, while also giving a quantitative description of the photoluminescence spectra, accurate to within 0.1 eV of measurement for all cases considered.
Multicore: Fallout from a Computing Evolution
Yelick, Kathy [Director, NERSC
2017-12-09
July 22, 2008 Berkeley Lab lecture: Parallel computing used to be reserved for big science and engineering projects, but in two years that's all changed. Even laptops and hand-helds use parallel processors. Unfortunately, the software hasn't kept pace. Kathy Yelick, Director of the National Energy Research Scientific Computing Center at Berkeley Lab, describes the resulting chaos and the computing community's efforts to develop exciting applications that take advantage of tens or hundreds of processors on a single chip.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Beckman, P.; Martin, D.; Drugan, C.
2010-11-23
This year the Argonne Leadership Computing Facility (ALCF) delivered nearly 900 million core-hours of science. The research conducted at this leadership-class facility touched our lives in both minute and massive ways - whether it was studying the catalytic properties of gold nanoparticles, predicting protein structures, or unearthing the secrets of exploding stars. The authors remained true to their vision of acting as the foremost computational center for extending science frontiers by solving pressing problems for our nation. Success in this endeavor was due mainly to the Department of Energy's (DOE) INCITE (Innovative and Novel Computational Impact on Theory and Experiment) program. The program awards significant amounts of computing time to computationally intensive, unclassified research projects that can make high-impact scientific advances. This year, DOE allocated 400 million hours of time to 28 research projects at the ALCF. Scientists from around the world conducted the research, representing such esteemed institutions as the Princeton Plasma Physics Laboratory, the National Institute of Standards and Technology, and the European Center for Research and Advanced Training in Scientific Computation. Argonne also provided Director's Discretionary allocations for research challenges, addressing such issues as reducing aerodynamic noise, critical for next-generation 'green' energy systems. Intrepid - the ALCF's 557-teraflops IBM Blue Gene/P supercomputer - enabled astounding scientific solutions and discoveries. Intrepid went into full production five months ahead of schedule. As a result, the ALCF nearly doubled the days of production computing available to the DOE Office of Science, INCITE awardees, and Argonne projects. One of the fastest supercomputers in the world for open science, the energy-efficient system uses about one-third as much electricity as a machine of comparable size built with more conventional parts. In October 2009, President Barack Obama recognized the excellence of the entire Blue Gene series by awarding it the National Medal of Technology and Innovation. Other noteworthy achievements included the ALCF's collaboration with the National Energy Research Scientific Computing Center (NERSC) to examine cloud computing as a potential new computing paradigm for scientists. Named Magellan, the DOE-funded initiative will explore which science application programming models work well within the cloud, as well as evaluate the challenges that come with this new paradigm. The ALCF obtained approval for its next-generation machine, a 10-petaflops system to be delivered in 2012. This system will allow ever more pressing problems to be resolved even more expeditiously through breakthrough science in the years to come.
NREL's Building-Integrated Supercomputer Provides Heating and Efficient Computing (Fact Sheet)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
2014-09-01
NREL's Energy Systems Integration Facility (ESIF) is meant to investigate new ways to integrate energy sources so they work together efficiently, and one of the key tools for that investigation, a new supercomputer, is itself a prime example of energy systems integration. NREL teamed with Hewlett-Packard (HP) and Intel to develop the innovative warm-water, liquid-cooled Peregrine supercomputer, which not only operates efficiently but also serves as the primary source of building heat for ESIF offices and laboratories. This innovative high-performance computer (HPC) can perform more than a quadrillion calculations per second as part of the world's most energy-efficient HPC data center.
Todo, A S; Hiromoto, G; Turner, J E; Hamm, R N; Wright, H A
1982-12-01
Previous calculations of the initial energies of electrons produced in water irradiated by photons are extended to 1 GeV by including pair and triplet production. Calculations were performed with the Monte Carlo computer code PHOEL-3, which replaces the earlier code, PHOEL-2. Tables of initial electron energies are presented for single interactions of monoenergetic photons at a number of energies from 10 keV to 1 GeV. These tables can be used to compute kerma in water irradiated by photons with arbitrary energy spectra up to 1 GeV. In addition, separate tables of Compton- and pair-electron spectra are given over this energy range. The code PHOEL-3 is available from the Radiation Shielding Information Center, Oak Ridge National Laboratory, Oak Ridge, TN 37830.
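As a reminder of how such tables are used (this is the standard dosimetry relation, not a formula quoted from the paper), the kerma for a photon fluence spectrum follows from the mean energy transferred to charged particles per interaction:

```latex
% Kerma in water for a photon fluence spectrum \Phi(E) (standard relation;
% \mu_{\mathrm{tr}}/\rho is the mass energy-transfer coefficient):
K \;=\; \int \Phi(E)\,\frac{\mu_{\mathrm{tr}}(E)}{\rho}\,E\,\mathrm{d}E .
```

The tabulated initial electron energies per interaction supply precisely the mean energy-transfer factor in this integral.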
Biomedical Computing Technology Information Center: introduction and report of early progress
DOE Office of Scientific and Technical Information (OSTI.GOV)
Maskewitz, B.F.; Henne, R.L.; McClain, W.J.
1976-01-01
In July 1975, the Biomedical Computing Technology Information Center (BCTIC) was established by the Division of Biomedical and Environmental Research of the U.S. Energy Research and Development Administration (ERDA) at the Oak Ridge National Laboratory. BCTIC collects, organizes, evaluates, and disseminates information on computing technology pertinent to biomedicine, providing needed routes of communication between installations and serving as a clearinghouse for the exchange of biomedical computing software, data, and interface designs. This paper presents BCTIC's functions and early progress to the MUMPS Users' Group in order to stimulate further discussion and cooperation between the two organizations. (BCTIC services are available to its sponsors and their contractors and to any individual or group willing to participate in mutual exchange.) 1 figure.
Multicore: Fallout From a Computing Evolution (LBNL Summer Lecture Series)
Yelick, Kathy [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States). National Energy Research Scientific Computing Center (NERSC)
2018-05-07
Summer Lecture Series 2008: Parallel computing used to be reserved for big science and engineering projects, but in two years that's all changed. Even laptops and hand-helds use parallel processors. Unfortunately, the software hasn't kept pace. Kathy Yelick, Director of the National Energy Research Scientific Computing Center at Berkeley Lab, describes the resulting chaos and the computing community's efforts to develop exciting applications that take advantage of tens or hundreds of processors on a single chip.
Mechanical Computing Redux: Limitations at the Nanoscale
NASA Astrophysics Data System (ADS)
Liu, Tsu-Jae King
2014-03-01
Technology solutions for overcoming the energy-efficiency limits of nanoscale complementary metal oxide semiconductor (CMOS) technology ultimately will be needed in order to address the growing issue of integrated-circuit chip power density. Off-state leakage current sets a fundamental lower limit on energy per operation for any voltage-level-based digital logic implemented with transistors (CMOS and beyond), which leads to practical limits on device density (i.e., cost) and operating frequency (i.e., system performance). Mechanical switches have zero off-state leakage and hence can overcome this fundamental limit. Contact adhesive force sets a lower limit on the switching energy of a mechanical switch, however, and also directly impacts its performance. This paper reviews recent progress toward the development of nano-electro-mechanical relay technology and discusses remaining challenges for realizing the promise of mechanical switches for ultra-low-power computing. Supported by the Center for Energy Efficient Electronics Science (NSF Award 0939514).
Evaluation of Rankine cycle air conditioning system hardware by computer simulation
NASA Technical Reports Server (NTRS)
Healey, H. M.; Clark, D.
1978-01-01
A computer program for simulating the performance of a variety of solar-powered Rankine cycle air conditioning system (RCACS) components has been developed. The computer program models actual equipment by developing performance maps from manufacturers' data and is capable of simulating off-design operation of the RCACS components. The program, designed to be a subroutine of the Marshall Space Flight Center (MSFC) Solar Energy System Analysis Computer Program 'SOLRAD', is a complete package suitable for use by an occasional computer user in developing performance maps of heating, ventilation, and air conditioning components.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Driscoll, Frederick R.
The University of Washington (UW) - Northwest National Marine Renewable Energy Center (UW-NNMREC) and the National Renewable Energy Laboratory (NREL) will collaborate to advance research and development (R&D) of Marine Hydrokinetic (MHK) renewable energy technology, specifically renewable energy captured from ocean tidal currents. UW-NNMREC is endeavoring to establish infrastructure, capabilities, and tools to support in-water testing of marine energy technology. NREL is leveraging its experience and capabilities in field testing of wind systems to develop protocols and instrumentation to advance field testing of MHK systems. Under this work, UW-NNMREC and NREL will work together to develop a common instrumentation system and testing methodologies, standards, and protocols. UW-NNMREC is also establishing simulation capabilities for MHK turbines and turbine arrays. NREL has extensive experience in wind turbine array modeling and is developing several computer-based numerical simulation capabilities for MHK systems. Under this CRADA, UW-NNMREC and NREL will work together to augment single-device and array modeling codes. As part of this effort, UW-NNMREC will also work with NREL to run simulations on NREL's high-performance computing system.
NASA Astrophysics Data System (ADS)
Liu, Jingfa; Song, Beibei; Liu, Zhaoxia; Huang, Weibo; Sun, Yuanyuan; Liu, Wenjie
2013-11-01
Protein structure prediction (PSP) is a classical NP-hard problem in computational biology. The energy-landscape paving (ELP) method is a heuristic global optimization algorithm that has been successfully applied to many optimization problems with complex energy landscapes in continuous space. By introducing a new update mechanism for the histogram function in ELP, and by incorporating greedy initial-conformation generation and a neighborhood search strategy based on pull moves, an improved energy-landscape paving (ELP+) method is obtained. Twelve general benchmark instances are first tested on both two-dimensional and three-dimensional (3D) face-centered-cubic (fcc) hydrophobic-hydrophilic (HP) lattice models. The lowest energies found by ELP+ are as good as or better than those of other methods in the literature for all instances. Then, five sets of larger-scale instances, denoted S, R, F90, F180, and CASP target instances, are tested on the 3D fcc HP lattice model. The proposed algorithm finds lower energies than the five other methods in the literature; not unexpectedly, this is particularly pronounced for the longer sequences considered. Computational results show that ELP+ is an effective method for PSP on the fcc HP lattice model.
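The paving idea at the heart of ELP can be sketched in a few lines: a visit histogram over energy levels is added to the Boltzmann weight so that already-explored levels become progressively less attractive. The loop below is a generic ELP skeleton under assumed callbacks (`energy`, `propose_move`), not the authors' ELP+ code; their new histogram-update rule, greedy initialization, and pull-move neighborhood are not reproduced here.

```python
import math
import random
from collections import defaultdict

def elp_minimize(init_conf, energy, propose_move, steps=100000, temp=1.0):
    """Generic energy-landscape paving sketch.

    `energy` maps a conformation to a (discretized) energy; `propose_move`
    returns a neighboring conformation (e.g., via pull moves on an HP
    lattice). The histogram penalizes revisited energy levels, flattening
    wells so the walk escapes local minima. Callback names are assumptions.
    """
    hist = defaultdict(int)
    conf, e = init_conf, energy(init_conf)
    best_conf, best_e = conf, e
    for _ in range(steps):
        cand = propose_move(conf)
        e_new = energy(cand)
        # ELP weight: energy plus a penalty counting visits to that level.
        cost_old = e + hist[e]
        cost_new = e_new + hist[e_new]
        if cost_new <= cost_old or \
           random.random() < math.exp((cost_old - cost_new) / temp):
            conf, e = cand, e_new
        hist[e] += 1  # pave the currently occupied energy level
        if e < best_e:
            best_conf, best_e = conf, e
    return best_conf, best_e
```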
DePaolo, Donald J. (Director, Center for Nanoscale Control of Geologic CO2); NCGC Staff
2017-12-09
'Carbon in Underland' was submitted by the Center for Nanoscale Control of Geologic CO₂ (NCGC) to the 'Life at the Frontiers of Energy Research' video contest at the 2011 Science for Our Nation's Energy Future: Energy Frontier Research Centers (EFRCs) Summit and Forum. Twenty-six EFRCs created short videos to highlight their mission and their work. This video was selected as one of five winners by a distinguished panel of judges for its 'entertaining animation and engaging explanations of carbon sequestration'. NCGC, an EFRC directed by Donald J. DePaolo at Lawrence Berkeley National Laboratory, is a partnership of scientists from seven institutions: LBNL (lead), Massachusetts Institute of Technology, Lawrence Livermore National Laboratory, Oak Ridge National Laboratory, University of California, Davis, Ohio State University, and Washington University in St. Louis. The Office of Basic Energy Sciences in the U.S. Department of Energy's Office of Science established the 46 Energy Frontier Research Centers (EFRCs) in 2009. These collaboratively organized centers conduct fundamental research focused on 'grand challenges' and use-inspired 'basic research needs' recently identified in major strategic planning efforts by the scientific community. The overall purpose is to accelerate scientific progress toward meeting the nation's critical energy challenges. The mission of the Center for Nanoscale Control of Geologic CO₂ is 'to use new investigative tools, combined with experiments and computer simulations, to build a fundamental understanding of molecular-to-pore-scale processes in fluid-rock systems, and to demonstrate the ability to control critical aspects of flow, transport, and mineralization in porous rock media as applied to geologic sequestration of CO₂'. Research topics are: bio-inspired, CO₂ (store), greenhouse gas, and interfacial characterization.
High Efficiency Photonic Switch for Data Centers
DOE Office of Scientific and Technical Information (OSTI.GOV)
LaComb, Lloyd J.; Bablumyan, Arkady; Ordyan, Armen
2016-12-06
The worldwide demand for instant access to information is driving internet growth rates above 50% annually. This rapid growth is straining the resources and architectures of existing data centers, metro networks, and high performance computing centers. If the current business-as-usual model continues, data centers alone will require 400 TWh of electricity by 2020. In order to meet the challenges of faster and more cost-effective data centers, metro networks, and supercomputing facilities, we have demonstrated a new type of optical switch that supports transmission speeds up to 1 Tb/s and requires significantly less energy per bit than
Energy Systems Integration Partnerships: NREL + Sandia + Johnson Controls
DOE Office of Scientific and Technical Information (OSTI.GOV)
NREL and Sandia National Laboratories partnered with Johnson Controls to deploy the company's BlueStream Hybrid Cooling System at ESIF's high-performance computing data center to reduce the water consumption associated with evaporative cooling towers.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Munro, J.K. Jr.
1980-05-01
The advent of large, fast computers has opened the way to modeling more complex physical processes and to handling very large quantities of experimental data. The amount of information that can be processed in a short period of time is so great that graphical displays assume greater importance as a means of presenting it. Information from dynamical processes can be displayed conveniently by use of animated graphics. This guide presents the basic techniques for generating black-and-white animated graphics, with consideration of aesthetic, mechanical, and computational problems. The guide is intended for use by someone who wants to make movies on the National Magnetic Fusion Energy Computing Center (NMFECC) CDC-7600. Problems encountered by a geographically remote user are given particular attention. Detailed information is given that will allow a remote user to do some file checking and diagnosis before giving graphics files to the system for processing into film, in order to spot problems without having to wait for film to be delivered. Source listings of some useful software are given in appendices, along with descriptions of how to use it. 3 figures, 5 tables.
Optimal Operation of Data Centers in Future Smart Grid
NASA Astrophysics Data System (ADS)
Ghamkhari, Seyed Mahdi
The emergence of cloud computing has established a growing trend towards building massive, energy-hungry, and geographically distributed data centers. Due to their enormous energy consumption, data centers are expected to have a major impact on the electric grid by significantly increasing the load at the locations where they are built. However, data centers also provide opportunities to help the grid with respect to robustness and load balancing. For instance, as data centers are major and yet flexible electric loads, they can be proper candidates to offer ancillary services, such as voluntary load reduction, to the smart grid. Also, data centers may better stabilize the price of energy in the electricity markets, and at the same time reduce their electricity cost, by exploiting the diversity in the price of electricity in the day-ahead and real-time electricity markets. In this thesis, such potentials are investigated within an analytical profit-maximization framework by developing new mathematical models based on queuing theory. The proposed models capture the trade-off between quality-of-service and power consumption in data centers. They are not only accurate, but also possess convexity characteristics that facilitate joint optimization of data centers' service rates, demand levels, and demand bids to different electricity markets. The analysis is further expanded to develop a unified, comprehensive energy portfolio optimization for data centers in the future smart grid. Specifically, it is shown how utilizing one energy option may affect the selection of other energy options that are available to a data center. For example, we show that the use of on-site storage and the deployment of geographical workload distribution can particularly help data centers in utilizing high-risk energy options such as renewable generation. The analytical approach in this thesis takes into account service-level agreements, risk-management constraints, and also the statistical characteristics of the Internet workload and the electricity prices. Using empirical data, the performance of our proposed profit-maximization models for data centers is evaluated, and the capability of data centers to benefit from participation in a variety of demand response programs is assessed.
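The quality-of-service versus power trade-off described in the thesis can be illustrated with a toy M/M/1-style profit function. Every functional form and constant below (linear power model, per-request revenue, SLA delay penalty) is an illustrative assumption, not the thesis's actual model.

```python
def profit(mu, lam=900.0, price_per_req=1e-4, sla_delay=0.05,
           penalty=5e-4, idle_kw=50.0, kw_per_mu=0.1, elec_price=0.08):
    """Toy per-second profit for a data center choosing service rate `mu`.

    Revenue from served requests, minus an SLA penalty on mean M/M/1 delay
    above the agreed target, minus the energy cost of a linear power model.
    All parameters are illustrative assumptions.
    """
    if mu <= lam:
        return float("-inf")            # unstable queue: unbounded delay
    delay = 1.0 / (mu - lam)            # M/M/1 mean response time (s)
    revenue = lam * price_per_req
    sla_cost = lam * penalty * max(0.0, delay - sla_delay)
    power_kw = idle_kw + kw_per_mu * mu
    energy_cost = power_kw * elec_price / 3600.0   # $/s at $/kWh price
    return revenue - sla_cost - energy_cost

# Simple scan over candidate service rates (requests per second):
best = max(range(901, 2000), key=profit)
print(best, profit(best))
```

The scan lands near the service rate where the queuing delay just meets the SLA target, which is the qualitative trade-off the thesis's convex models are built to optimize jointly with demand bids.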
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sartori, E.; Roussin, R.W.
This paper presents a brief review of computer codes concerned with checking, plotting, processing, and using covariances of neutron cross-section data. It concentrates on those available from the computer code information centers of the United States and the OECD Nuclear Energy Agency. Emphasis is also placed on codes that use covariances for specific applications such as uncertainty analysis, data adjustment, and data consistency analysis. Recent evaluations contain neutron cross-section covariance information for all isotopes of major importance for technological applications of nuclear energy. It is therefore important that the available software tools needed for taking advantage of this information are widely known, as they permit the determination of better safety margins and allow the optimization of more economical designs of nuclear energy systems.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Weisbin, C.R.
1987-03-01
This document reviews research accomplishments achieved by the staff of the Center for Engineering Systems Advanced Research (CESAR) during the fiscal years 1984 through 1987. The manuscript also describes future CESAR objectives for the 1988-1991 planning horizon, and beyond. As much as possible, the basic research goals are derived from perceived Department of Energy (DOE) needs for increased safety, productivity, and competitiveness in the United States energy producing and consuming facilities. Research areas covered include the HERMIES-II Robot, autonomous robot navigation, hypercube computers, machine vision, and manipulators.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Radtke, M.A.
This paper chronicles the activity at Wisconsin Public Service Corporation (WPSC) that resulted in the complete migration of a traditional, late-1970s-vintage Energy Management System (EMS). The new environment includes networked microcomputers, minicomputers, and the corporate mainframe, and provides on-line access to employees outside the energy control center and to some WPSC customers. In the late 1980s, WPSC was forecasting an EMS computer upgrade or replacement to address both capacity and technology needs. Reasoning that access to diverse computing resources would best position the company to accommodate the uncertain needs of the energy industry in the 1990s, WPSC chose to investigate an in-place migration to a network of computers able to support heterogeneous hardware and operating systems. The system was developed in a modular fashion, with individual modules being deployed as soon as they were completed. The functional and technical specification was continuously enhanced as operating experience was gained from each operational module. With the migration off the original EMS computers complete, the networked system, called DEMAXX (Distributed Energy Management Architecture with eXtensive eXpandability), has exceeded expectations in the areas of cost, performance, flexibility, and reliability.
Oklahoma Center for High Energy Physics (OCHEP)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nandi, S; Strauss, M J; Snow, J
2012-02-29
The DOE EPSCoR implementation grant, with support from the State of Oklahoma and from three universities - Oklahoma State University, the University of Oklahoma, and Langston University - resulted in the establishment of the Oklahoma Center for High Energy Physics (OCHEP) in 2004. OCHEP continues to flourish as a vibrant hub for research in experimental and theoretical particle physics and an educational center in the State of Oklahoma. All goals of the original proposal were successfully accomplished. These include the foundation of a new experimental particle physics group at OSU, the establishment of a Tier 2 computing facility for Large Hadron Collider (LHC) and Tevatron data analysis at OU, and the organization of a vital particle physics research center in Oklahoma based on the resources of the three universities. OSU has hired two tenure-track faculty members with initial support from the grant funds; both positions are now supported through the OSU budget. This new HEP experimental group at OSU has established itself as a full member of the Fermilab D0 Collaboration and the LHC ATLAS Experiment and has secured external funds from the DOE and the NSF. These funds currently support 2 graduate students, 1 postdoctoral fellow, and 1 part-time engineer. The grant initiated the creation of a Tier 2 computing facility at OU as part of the Southwest Tier 2 facility, and a permanent Research Scientist was hired at OU to maintain and run the facility; permanent support for this position has now been provided through the OU university budget. OCHEP represents a successful model of cooperation among several universities, establishing a critical mass of manpower, computing, and hardware resources. This led to increasing Oklahoma's impact in all areas of HEP: theory, experiment, and computation. The Center personnel are involved in cutting-edge research in experimental, theoretical, and computational aspects of high energy physics, with research areas ranging from the search for new phenomena at the Fermilab Tevatron and the CERN Large Hadron Collider to theoretical modeling, computer simulation, detector development and testing, and physics analysis. OCHEP faculty members participating in the D0 collaboration at the Fermilab Tevatron and the ATLAS collaboration at the CERN LHC have made a major impact on the Standard Model (SM) Higgs boson search, top quark studies, B physics studies, and measurements of Quantum Chromodynamics (QCD) phenomena. The OCHEP Grid computing facility consists of a large computer cluster that plays a major role in data analysis and Monte Carlo production for both the D0 and ATLAS experiments. Theoretical efforts are devoted to new ideas in Higgs boson physics, extra dimensions, neutrino masses and oscillations, Grand Unified Theories, supersymmetric models, dark matter, and nonperturbative quantum field theory. Theory members are making major contributions to the understanding of phenomena being explored at the Tevatron and the LHC. They have proposed new models for Higgs bosons and have suggested new signals for extra dimensions and for the search for supersymmetric particles. During the seven-year period when OCHEP was partially funded through the DOE EPSCoR implementation grant, OCHEP members published over 500 refereed journal articles and made over 200 invited presentations at major conferences.
The Center is also involved in education and outreach activities, offering summer research programs for high school teachers and college students and organizing summer workshops for high school teachers, sometimes in coordination with the QuarkNet programs at OSU and OU. Details of the Center can be found at http://ochep.phy.okstate.edu.
The hydrodynamics of off-center explosions. [of supernovae
NASA Technical Reports Server (NTRS)
Fryxell, B. A.
1979-01-01
The behavior of off-center supernova explosions is investigated using a two-dimensional hydrodynamic code. An important application of these calculations is the possible formation of high-velocity pulsars. The dependence of the final velocity of the collapsed remnant on the location and energy of the explosion is computed. The largest remnant velocities result from explosions located at a mass fraction of 0.5. An explosion energy 50% greater than the binding energy of the star ejects 0.51 solar masses, producing a 1.4 solar mass remnant with a velocity of 400 km/s. However, this energy must be generated in a very small region of the star in order to create the required asymmetry in the explosion. Because of this, a specific energy of about 10²⁰ erg/g is needed. Nuclear reactions can produce no more than about 5 × 10¹⁷ erg/g, and it is unclear how the energy produced in gravitational collapse models can be sufficiently localized. Unless a supernova mechanism can be found which can produce enough energy in a small region of the star, off-center explosions do not provide a satisfactory explanation for high-velocity pulsars.
Integration of the Chinese HPC Grid in ATLAS Distributed Computing
NASA Astrophysics Data System (ADS)
Filipčič, A.;
2017-10-01
Fifteen Chinese high-performance computing sites, many of them on the TOP500 list of the most powerful supercomputers, are integrated into a common infrastructure providing coherent access to users through a RESTful interface called SCEAPI. These resources have been integrated into the ATLAS Grid production system using a bridge between ATLAS and SCEAPI that translates the authorization and job-submission protocols between the two environments. The ARC Computing Element (ARC-CE) forms the bridge, using an extended batch-system interface to allow job submission to SCEAPI. The ARC-CE was set up at the Institute of High Energy Physics, Beijing, in order to be as close as possible to the SCEAPI front-end interface at the Computing Network Information Center, also in Beijing. This paper describes the technical details of the integration between ARC-CE and SCEAPI and presents results so far with two supercomputer centers, Tianhe-IA and ERA. These two centers have been the pilots for ATLAS Monte Carlo simulation in SCEAPI and have been providing CPU power since fall 2015.
The Nuclear Energy Advanced Modeling and Simulation Enabling Computational Technologies FY09 Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Diachin, L F; Garaizar, F X; Henson, V E
2009-10-12
In this document we report on the status of the Nuclear Energy Advanced Modeling and Simulation (NEAMS) Enabling Computational Technologies (ECT) effort. In particular, we provide the context for ECT in the broader NEAMS program and describe the three pillars of the ECT effort, namely, (1) tools and libraries, (2) software quality assurance, and (3) computational facility (computers, storage, etc.) needs. We report on our FY09 deliverables to determine the needs of the integrated performance and safety codes (IPSCs) in these three areas and lay out the general plan for software quality assurance to meet the requirements of DOE and the DOE Advanced Fuel Cycle Initiative (AFCI). We conclude with a brief description of our interactions with the Idaho National Laboratory computer center to determine what is needed to expand its role as a NEAMS user facility.
User's guide for FRMOD, a zero dimensional FRM burn code
DOE Office of Scientific and Technical Information (OSTI.GOV)
Driemeryer, D.; Miley, G.H.
1979-10-15
The zero-dimensional FRM plasma burn code FRMOD is written in FORTRAN and is currently available on the Control Data Corporation (CDC) 7600 computer at the Magnetic Fusion Energy Computer Center (MFECC), sponsored by the US Department of Energy, in Livermore, CA. This guide assumes that the user is familiar with the system architecture and some of the utility programs available on the MFE-7600 machine, since online documentation is available for system routines through the DOCUMENT utility. Users may therefore refer to it for answers to system-related questions.
Analysis and Representation of Miscellaneous Electric Loads in NEMS
2017-01-01
Miscellaneous Electric Loads (MELs) comprise a growing portion of delivered energy consumption in residential and commercial buildings. Miscellaneous end uses—including televisions, personal computers, security systems, data center servers, and many other devices—have continued to penetrate into building-related market segments. Part of this proliferation of devices and equipment can be attributed to increased service demand for entertainment, computing, and convenience appliances.
LARGE BUILDING HVAC SIMULATION
The report discusses the monitoring and collection of data relating to indoor pressures and radon concentrations under several test conditions in a large school building in Bartow, Florida. The Florida Solar Energy Center (FSEC) used integrated computational software, FSEC 3.0...
Coulomb Impurity Problem of Graphene in Strong Coupling Regime in Magnetic Fields.
Kim, S C; Yang, S-R Eric
2015-10-01
We investigate the Coulomb impurity problem of graphene in the strong-coupling limit in the presence of magnetic fields. When the strength of the Coulomb potential is sufficiently strong, the electron of the lowest-energy bound state of the n = 0 Landau level may fall to the center of the potential. To prevent this spurious effect, the Coulomb potential must be regularized. The scaling function for the inverse probability density of this state at the center of the impurity potential is computed in the strong-coupling regime. The dependence of the computed scaling function on the regularization parameter changes significantly as the strong-coupling regime is approached.
Computing Protein-Protein Association Affinity with Hybrid Steered Molecular Dynamics.
Rodriguez, Roberto A; Yu, Lili; Chen, Liao Y
2015-09-08
Computing protein-protein association affinities is one of the fundamental challenges in computational biophysics/biochemistry. The overwhelming amount of statistics in the phase space of very high dimensions cannot be sufficiently sampled even with today's high-performance computing power. In this article, we extend a potential of mean force (PMF)-based approach, the hybrid steered molecular dynamics (hSMD) approach we developed for ligand-protein binding, to protein-protein association problems. For a protein complex consisting of two protomers, P1 and P2, we choose m (≥3) segments of P1 whose m centers of mass are to be steered in a chosen direction and n (≥3) segments of P2 whose n centers of mass are to be steered in the opposite direction. The coordinates of these m + n centers constitute a phase space of 3(m + n) dimensions (3(m + n)D). All other degrees of freedom of the proteins, ligands, solvents, and solutes are freely subject to the stochastic dynamics of the all-atom model system. Conducting SMD along a line in this phase space, we obtain the 3(m + n)D PMF difference between two chosen states: one single state in the associated state ensemble and one single state in the dissociated state ensemble. This PMF difference is the first of four contributors to the protein-protein association energy. The second contributor is the 3(m + n - 1)D partial partition in the associated state accounting for the rotations and fluctuations of the (m + n - 1) centers while fixing one of the m + n centers of the P1-P2 complex. The two other contributors are the 3(m - 1)D partial partition of P1 and the 3(n - 1)D partial partition of P2, accounting for the rotations and fluctuations of their m - 1 or n - 1 centers while fixing one of the m/n centers of P1/P2 in the dissociated state. Each of these three partial partitions can be factored exactly into a 6D partial partition multiplied by a remaining factor accounting for the small fluctuations while fixing three of the centers of P1, P2, or the P1-P2 complex, respectively. These small fluctuations can be well approximated as Gaussian, and every 6D partition can be reduced exactly to three problems of 1D sampling, counting the rotations and fluctuations around one of the centers as being fixed. We implement this hSMD approach for the Ras-RalGDS complex, choosing three centers on RalGDS and three on Ras (m = n = 3). At a computing cost of about 71.6 wall-clock hours using 400 computing cores in parallel, we obtained an association energy of -9.2 ± 1.9 kcal/mol on the basis of CHARMM 36 parameters, which agrees well with the experimental data, -8.4 ± 0.2 kcal/mol.
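For readers unfamiliar with turning steered-MD work measurements into free-energy differences, one standard estimator (though not necessarily the one used in this paper's hSMD protocol) is the Jarzynski equality, ΔF = -kT ln⟨exp(-W/kT)⟩. A minimal sketch, assuming work values in kcal/mol:

```python
import math

def jarzynski_delta_f(work_kcal_mol, temp_k=300.0):
    """Jarzynski estimate of a free-energy difference from SMD work values.

    Standard estimator, shown for illustration only; the paper's hSMD
    protocol may use a different (e.g., bidirectional) estimator.
    """
    kT = 0.0019872041 * temp_k  # Boltzmann constant in kcal/(mol K)
    n = len(work_kcal_mol)
    boltz_avg = sum(math.exp(-w / kT) for w in work_kcal_mol) / n
    return -kT * math.log(boltz_avg)

# Example with hypothetical pulling-work samples (kcal/mol):
print(jarzynski_delta_f([12.1, 10.8, 14.0, 11.5, 13.2]))
```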
Design and deployment of an elastic network test-bed in IHEP data center based on SDN
NASA Astrophysics Data System (ADS)
Zeng, Shan; Qi, Fazhi; Chen, Gang
2017-10-01
High energy physics experiments produce huge amounts of raw data, but because network resources are shared, there is no guarantee of available bandwidth for each experiment, which may cause link congestion problems. On the other side, with the development of cloud computing technologies, IHEP has established a cloud platform based on OpenStack that ensures the flexibility of computing and storage resources, and more and more computing applications have been deployed on virtual machines established by OpenStack. However, under the traditional network architecture, network capacity cannot be provisioned elastically, which becomes a bottleneck restricting the flexible application of cloud computing. In order to solve the above problems, we propose an elastic cloud data center network architecture based on SDN, and we design a high-performance controller cluster based on OpenDaylight. Finally, we present our current test results.
LTSS compendium: an introduction to the CDC 7600 and the Livermore Timesharing System
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fong, K. W.
1977-08-15
This report is an introduction to the CDC 7600 computer and to the Livermore Timesharing System (LTSS) used by the National Magnetic Fusion Energy Computer Center (NMFECC) and the Lawrence Livermore Laboratory Computer Center (LLLCC, or Octopus network) on their 7600s. This report is based on a document originally written specifically about the system as implemented at NMFECC, but it has been broadened to point out differences in the implementation at LLLCC; it also contains information about LLLCC not relevant to NMFECC. This report is written for computational physicists who want to prepare large production codes to run under LTSS on the 7600s. The generalized discussion of the operating system focuses on creating and executing controllees. This document and its companion, UCID-17557, CDC 7600 LTSS Programming Stratagems, provide a basis for understanding more specialized documents about individual parts of the system.
Haidar, Azzam; Jagode, Heike; Vaccaro, Phil; ...
2018-03-22
The emergence of power efficiency as a primary constraint in processor and system design poses new challenges concerning power and energy awareness for numerical libraries and scientific applications. Power consumption also plays a major role in the design of data centers, which may house petascale- or exascale-level computing systems. At these extreme scales, understanding and improving the energy efficiency of numerical libraries and their related applications becomes a crucial part of the successful implementation and operation of the computing system. In this paper, we study and investigate the practice of controlling a compute system's power usage, and we explore how different power caps affect the performance of numerical algorithms with different computational intensities. Further, we determine the impact, in terms of performance and energy usage, that these caps have on a system running scientific applications. This analysis enables us to characterize the types of algorithms that benefit most from these power management schemes. Our experiments are performed using a set of representative kernels and several popular scientific benchmarks. Lastly, we quantify a number of power and performance measurements and draw observations and conclusions that can be viewed as a roadmap to achieving energy efficiency in the design and execution of scientific algorithms.
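On Linux systems with Intel RAPL exposed through the powercap sysfs interface, a cap sweep like the one studied here can be scripted directly. A minimal sketch, assuming root privileges, a package-0 RAPL zone at the usual sysfs path, and a placeholder benchmark binary (`./dgemm_bench` is hypothetical); energy-counter wraparound is ignored for brevity:

```python
import subprocess
import time

RAPL = "/sys/class/powercap/intel-rapl:0"   # CPU package 0 (Linux powercap)

def read_energy_uj():
    # Cumulative package energy counter in microjoules.
    with open(f"{RAPL}/energy_uj") as f:
        return int(f.read())

def set_power_cap_w(watts):
    # Requires root; constraint_0 is the long-term package power limit.
    with open(f"{RAPL}/constraint_0_power_limit_uw", "w") as f:
        f.write(str(int(watts * 1e6)))

def run_under_cap(cmd, watts):
    set_power_cap_w(watts)
    e0, t0 = read_energy_uj(), time.monotonic()
    subprocess.run(cmd, check=True)
    e1, t1 = read_energy_uj(), time.monotonic()
    return t1 - t0, (e1 - e0) / 1e6   # seconds, joules

# Sweep caps for a benchmark kernel (binary name is a placeholder):
for cap in (150, 120, 90, 60):
    secs, joules = run_under_cap(["./dgemm_bench"], cap)
    print(f"{cap} W cap: {secs:.1f} s, {joules:.0f} J, {joules/secs:.0f} W avg")
```

Compute-bound kernels typically slow down roughly in proportion to the cap, while bandwidth-bound kernels can tolerate lower caps with little performance loss, which is the kind of characterization the paper pursues.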
Tight-binding calculation studies of vacancy and adatom defects in graphene
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhang, Wei; Lu, Wen-Cai; Zhang, Hong-Xing
2016-02-19
Computational studies of complex defects in graphene usually need to deal with a larger number of atoms than current first-principles methods can handle. We show that a recently developed three-center tight-binding potential for carbon is very efficient for large-scale atomistic simulations and can accurately describe the structures and energies of various defects in graphene. Using the three-center tight-binding potential, we have systematically studied the stable structures and formation energies of vacancy and embedded-atom defects of various sizes, up to 4 vacancies and 4 embedded atoms, in graphene. Our calculations reveal low-energy defect structures and provide a more comprehensive understanding of the structures and stability of defects in graphene.
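For reference, defect formation energies of this kind are typically defined relative to the pristine sheet's per-atom energy (the generic definition below is standard practice, not a formula quoted from the paper). For a defective supercell containing N_d carbon atoms, with an N-atom pristine supercell supplying the carbon chemical potential:

```latex
% Generic defect formation energy for graphene (standard definition):
E_f \;=\; E_{\mathrm{defect}}(N_d) \;-\; N_d\,\mu_{\mathrm{C}},
\qquad
\mu_{\mathrm{C}} \;=\; \frac{E_{\mathrm{pristine}}(N)}{N},
```

so that the same expression covers both vacancies (N_d < N) and embedded extra atoms (N_d > N).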
Ubiquitous Green Computing Techniques for High Demand Applications in Smart Environments
Zapater, Marina; Sanchez, Cesar; Ayala, Jose L.; Moya, Jose M.; Risco-Martín, José L.
2012-01-01
Ubiquitous sensor network deployments, such as the ones found in Smart city and Ambient intelligence applications, impose constantly increasing computational demands in order to process data and offer services to users. The nature of these applications implies the usage of data centers. Research has paid much attention to the energy consumption of the sensor nodes in WSN infrastructures. However, supercomputing facilities are the ones presenting a higher economic and environmental impact due to their very high power consumption; this problem, however, has been disregarded in the field of smart environment services. This paper proposes an energy-minimization workload assignment technique, based on heterogeneity- and application-awareness, that redistributes low-demand computational tasks from high-performance facilities to idle nodes with low and medium resources in the WSN infrastructure. These allocation policies, though non-optimal, reduce the energy consumed by the whole infrastructure and the total execution time. PMID:23112621
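A minimal sketch of the assignment idea: greedily place each low-demand task on the idle WSN node that adds the least energy, falling back to the high-performance facility otherwise. The node names, the linear power model, and the greedy policy are illustrative assumptions, not the paper's formulation.

```python
def assign_tasks(tasks, nodes):
    """Greedy energy-minimizing assignment sketch.

    `tasks` is a list of (cpu_demand, seconds); each node dict carries its
    free capacity and a simple linear power model. Tasks no node can fit
    stay on the high-performance facility. All models are illustrative.
    """
    plan, leftover = {}, []
    for tid, (demand, secs) in enumerate(tasks):
        candidates = [n for n in nodes if n["free_cpu"] >= demand]
        if not candidates:
            leftover.append(tid)          # falls back to the HPC facility
            continue
        # Pick the node with the smallest marginal energy for this task.
        best = min(candidates,
                   key=lambda n: n["watts_per_cpu"] * demand * secs)
        best["free_cpu"] -= demand
        plan[tid] = best["name"]
    return plan, leftover

nodes = [{"name": "wsn-gw-1", "free_cpu": 2.0, "watts_per_cpu": 4.0},
         {"name": "wsn-gw-2", "free_cpu": 1.0, "watts_per_cpu": 6.5}]
tasks = [(0.5, 120), (1.5, 60), (2.5, 30)]
print(assign_tasks(tasks, nodes))
```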
The role of broken symmetry in solvation of a spherical cavity in classical and quantum water models
DOE Office of Scientific and Technical Information (OSTI.GOV)
Remsing, Richard C.; Baer, Marcel D.; Schenter, Gregory K.
2014-08-21
Insertion of a hard-sphere cavity in liquid water breaks translational symmetry and generates an electrostatic potential difference between the region near the cavity and the bulk. Here, we clarify the physical interpretation of this potential and its calculation. We also show that the electrostatic potential in the center of small, medium, and large cavities depends very sensitively on the form of the assumed molecular interactions for different classical simple point-charge models and quantum mechanical DFT-based interaction potentials, as reflected in their description of donor and acceptor hydrogen bonds near the cavity. These differences can significantly affect the magnitude of the scalar electrostatic potential. We argue that the results of these studies will have direct consequences for our understanding of the thermodynamics of ion solvation through the cavity charging process. JDW and RCR are supported by the National Science Foundation (Grants CHE0848574 and CHE1300993). CJM and GKS are supported by the U.S. Department of Energy's (DOE) Office of Basic Energy Sciences, Division of Chemical Sciences, Geosciences and Biosciences. Pacific Northwest National Laboratory (PNNL) is operated for the Department of Energy by Battelle. MDB is grateful for the support of the Linus Pauling Distinguished Postdoctoral Fellowship Program at PNNL. We acknowledge illuminating discussions and sharing of ideas and preprints with Dr. Shawn M. Kathmann and Prof. Tom Beck. The DFT simulations used resources of the National Energy Research Scientific Computing Center, which is supported by the Office of Science of the U.S. Department of Energy under Contract No. DE-AC02-05CH11231. Additional computing resources were generously allocated by PNNL's Institutional Computing program.
NASA Astrophysics Data System (ADS)
Altomare, Albino; Cesario, Eugenio; Mastroianni, Carlo
2016-10-01
The opportunity of using Cloud resources on a pay-as-you-go basis and the availability of powerful data centers and high bandwidth connections are speeding up the success and popularity of Cloud systems, which is making on-demand computing a common practice for enterprises and scientific communities. The reasons for this success include natural business distribution, the need for high availability and disaster tolerance, the sheer size of their computational infrastructure, and/or the desire to provide uniform access times to the infrastructure from widely distributed client sites. Nevertheless, the expansion of large data centers is resulting in a huge rise of electrical power consumed by hardware facilities and cooling systems. The geographical distribution of data centers is becoming an opportunity: the variability of electricity prices, environmental conditions and client requests, both from site to site and with time, makes it possible to intelligently and dynamically (re)distribute the computational workload and achieve business goals as diverse as: the reduction of costs, energy consumption and carbon emissions; the satisfaction of performance constraints; the adherence to Service Level Agreements established with users; etc. This paper proposes an approach that helps to achieve the business goals established by the data center administrators. The workload distribution is driven by a fitness function, evaluated for each data center, which weighs some key parameters related to business objectives, among them the price of electricity, the carbon emission rate, and the balance of load among the data centers. For example, the energy costs can be reduced by using a "follow the moon" approach, e.g., by migrating the workload to data centers where the price of electricity is lower at that time. Our approach uses data about historical usage of the data centers and data about environmental conditions to predict, with the help of regressive models, the values of the parameters of the fitness function, and then to appropriately tune the weights assigned to the parameters in accordance with the business goals. Preliminary experimental results, presented in this paper, show encouraging benefits.
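As a rough illustration of such a weighted fitness function, here is a minimal sketch; the weights, field names, and normalization are assumptions, not the parameterization used in the paper:

```python
# Illustrative fitness function: weighted mix of electricity price, carbon
# emission rate, and current load, each normalized to [0, 1] so lower is better.

def fitness(dc, weights):
    """dc: dict with predicted 'price', 'carbon_rate', 'load'. Higher = better target."""
    penalty = (weights["price"] * dc["price"]
               + weights["carbon"] * dc["carbon_rate"]
               + weights["balance"] * dc["load"])
    return 1.0 - penalty

def pick_data_center(data_centers, weights):
    # "Follow the moon": the predicted-cheapest, greenest, least-loaded site wins.
    return max(data_centers, key=lambda dc: fitness(dc, weights))

sites = [
    {"name": "us-east", "price": 0.8, "carbon_rate": 0.5, "load": 0.6},
    {"name": "eu-north", "price": 0.3, "carbon_rate": 0.2, "load": 0.4},
]
w = {"price": 0.5, "carbon": 0.3, "balance": 0.2}
print(pick_data_center(sites, w)["name"])  # eu-north
```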
DOE Office of Scientific and Technical Information (OSTI.GOV)
Morgan, O.B. Jr.; Berry, L.A.; Sheffield, J.
This annual report on fusion energy discusses the progress on work in the following main topics: toroidal confinement experiments; atomic physics and plasma diagnostics development; plasma theory and computing; plasma-materials interactions; plasma technology; superconducting magnet development; fusion engineering design center; materials research and development; and neutron transport. (LSP)
Kinetic energy budgets in areas of convection
NASA Technical Reports Server (NTRS)
Fuelberg, H. E.
1979-01-01
Synoptic scale budgets of kinetic energy are computed using 3 and 6 h data from three of NASA's Atmospheric Variability Experiments (AVEs). Numerous areas of intense convection occurred during the three experiments. Large kinetic energy variability, with periods as short as 6 h, is observed in budgets computed over each entire experiment area and over limited volumes that barely enclose the convection and move with it. Kinetic energy generation and transport processes in the smaller volumes are often a maximum when the enclosed storms are near peak intensity, but the nature of the various energy processes differs between storm cases and seems closely related to the synoptic conditions. A commonly observed energy budget for peak storm intensity indicates that generation of kinetic energy by cross-contour flow is the major energy source while dissipation to subgrid scales is the major sink. Synoptic scale vertical motion transports kinetic energy from lower to upper levels of the atmosphere while low-level horizontal flux convergence and upper-level horizontal divergence also occur. Spatial fields of the energy budget terms show that the storm environment is a major center of energy activity for the entire area.
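The budget terms named here (cross-contour generation, horizontal and vertical transport, subgrid dissipation) correspond to the standard synoptic-scale kinetic energy budget; the schematic form below is supplied for orientation and is not quoted from the paper:

```latex
% Schematic synoptic-scale kinetic energy budget in pressure coordinates:
\frac{\partial K}{\partial t}
  = \underbrace{-\,\mathbf{V}\cdot\nabla\Phi}_{\text{cross-contour generation}}
    \; - \; \underbrace{\nabla\cdot(K\mathbf{V})}_{\text{horizontal flux divergence}}
    \; - \; \underbrace{\frac{\partial (K\omega)}{\partial p}}_{\text{vertical transport}}
    \; - \; \underbrace{D}_{\text{dissipation to subgrid scales}}
```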
Nano Goes Magnetic to Attract Big Business
NASA Technical Reports Server (NTRS)
2006-01-01
Glenn Research Center has combined state-of-the-art electrical designs with complex, computer-aided analyses to develop some of today's most advanced power systems, in space and on Earth. The center's Power and On-Board Propulsion Technology Division is the brain behind many of these power systems. For space, this division builds technologies that help power the International Space Station, the Hubble Space Telescope, and Earth-orbiting satellites. For Earth, it has woven advanced aerospace power concepts into commercial energy applications that include solar and nuclear power generation, battery and fuel cell energy storage, communications and telecommunications satellites, cryocoolers, hybrid and electric vehicles, and heating and air-conditioning systems.
Facilitating NASA Earth Science Data Processing Using Nebula Cloud Computing
NASA Technical Reports Server (NTRS)
Pham, Long; Chen, Aijun; Kempler, Steven; Lynnes, Christopher; Theobald, Michael; Asghar, Esfandiari; Campino, Jane; Vollmer, Bruce
2011-01-01
Cloud Computing has been implemented in several commercial arenas. The NASA Nebula Cloud Computing platform is an Infrastructure as a Service (IaaS) built in 2008 at NASA Ames Research Center and in 2010 at GSFC. Nebula is an open source Cloud platform intended to: a) make NASA realize significant cost savings through efficient resource utilization, reduced energy consumption, and reduced labor costs; b) provide an easier way for NASA scientists and researchers to efficiently explore and share large and complex data sets; and c) allow customers to provision, manage, and decommission computing capabilities on an as-needed basis.
Energy Innovation Hubs: A Home for Scientific Collaboration
Chu, Steven
2017-12-11
Secretary Chu will host a live, streaming Q&A session with the directors of the Energy Innovation Hubs on Tuesday, March 6, at 2:15 p.m. EST. The directors will be available for questions regarding their teams' work and the future of American energy. Ask your questions in the comments below, or submit them on Facebook, Twitter (@energy), or send an e-mail to newmedia@hq.doe.gov, prior to or during the live event. Dr. Hank Foley is the director of the Greater Philadelphia Innovation Cluster for Energy-Efficient Buildings, which is pioneering new data-intensive techniques for designing and operating energy efficient buildings, including advanced computer modeling. Dr. Douglas Kothe is the director of the Consortium for Advanced Simulation of Light Water Reactors, which uses powerful supercomputers to create "virtual" reactors that will help improve the safety and performance of both existing and new nuclear reactors. Dr. Nathan Lewis is the director of the Joint Center for Artificial Photosynthesis, which focuses on how to produce fuels from sunlight, water, and carbon dioxide. The Energy Innovation Hubs are major integrated research centers, with researchers from many different institutions and technical backgrounds. Each Hub is focused on a specific high-priority goal, rapidly accelerating scientific discoveries and shortening the path from laboratory innovation to technological development and commercial deployment of critical energy technologies.
Public Dialogue on Science in Sweden.
ERIC Educational Resources Information Center
Dyring, Annagreta
1988-01-01
Explains how Sweden has proceeded to popularize science. Addresses topics dealing with policy, the energy debate, booklets with large circulation, computers and society, contacts between schools and research, building up small science centers, mass media, literary quality, children's responsibility, and some of the challenges. (RT)
High-Performance Computing Unlocks Innovation at NREL - Video Text Version
Across scales, data visualizations and large-scale modeling provide insights and test new ideas. NREL and Hewlett-Packard won an R&D 100 Award recognizing the most energy-efficient data center in the world.
Identification of cost effective energy conservation measures
NASA Technical Reports Server (NTRS)
Bierenbaum, H. S.; Boggs, W. H.
1978-01-01
In addition to a successful program of readily implemented conservation actions for reducing building energy consumption at Kennedy Space Center, recent detailed analyses have identified further substantial savings for buildings representative of technical facilities designed when energy costs were low. The techniques employed for determination of these energy savings consisted of facility configuration analysis, power and lighting measurements, detailed computer simulations and simulation verifications. Use of these methods resulted in identification of projected energy savings as large as $330,000 a year (approximately a two-year break-even period) in a single building. Application of these techniques to other commercial buildings is discussed.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Labios, Liezel A.; Heiden, Zachariah M.; Mock, Michael T.
2015-05-04
The synthesis of a series of PEtPNRR' (PEtPNRR' = Et₂PCH₂CH₂P(CH₂NRR')₂, R = H, R' = Ph or 2,4-difluorophenyl; R = R' = Ph or iPr) diphosphine ligands containing mono- and disubstituted pendant amine groups, and the preparation of their corresponding molybdenum bis(dinitrogen) complexes trans-Mo(N₂)₂(PMePh₂)₂(PEtPNRR'), is described. In situ IR and multinuclear NMR spectroscopic studies monitoring the stepwise addition of triflic acid (HOTf) to trans-Mo(N₂)₂(PMePh₂)₂(PEtPNRR') complexes in THF at -40 °C show that the electronic and steric properties of the R and R' groups of the pendant amines influence whether the complexes are protonated at Mo, a pendant amine, a coordinated N₂ ligand, or a combination of these sites. For example, complexes containing mono-aryl substituted pendant amines are protonated at Mo and pendant amine to generate mono- and dicationic Mo–H species. Protonation of the complex containing less basic diphenyl-substituted pendant amines exclusively generates a monocationic hydrazido (Mo(NNH₂)) product, indicating preferential protonation of an N₂ ligand. Addition of HOTf to the complex featuring more basic diisopropyl amines primarily produces a monocationic product protonated at a pendant amine site, as well as a trace amount of dicationic Mo(NNH₂) product that contains protonated pendant amines. In addition, trans-Mo(N₂)₂(PMePh₂)₂(depe) (depe = Et₂PCH₂CH₂PEt₂) without a pendant amine was synthesized and treated with HOTf, generating a monocationic Mo(NNH₂) product. Protonolysis experiments conducted on select complexes in the series afforded trace amounts of NH₄⁺. Computational analysis of the series of trans-Mo(N₂)₂(PMePh₂)₂(PEtPNRR') complexes provides further insight into the proton affinity values of the metal center, N₂ ligand, and pendant amine sites to rationalize the differing reactivity profiles. This research was supported as part of the Center for Molecular Electrocatalysis, an Energy Frontier Research Center funded by the U.S. Department of Energy, Office of Science, Office of Basic Energy Sciences. Computational resources were provided by the National Energy Research Scientific Computing Center (NERSC) at Lawrence Berkeley National Laboratory. Pacific Northwest National Laboratory is operated by Battelle for the U.S. Department of Energy.
An Efficient Virtual Machine Consolidation Scheme for Multimedia Cloud Computing
Han, Guangjie; Que, Wenhui; Jia, Gangyong; Shu, Lei
2016-01-01
Cloud computing has innovated the IT industry in recent years, as it can deliver subscription-based services to users in the pay-as-you-go model. Meanwhile, multimedia cloud computing is emerging based on cloud computing to provide a variety of media services on the Internet. However, with the growing popularity of multimedia cloud computing, its large energy consumption can not only contribute to greenhouse gas emissions, but also result in rising costs for cloud users. Therefore, multimedia cloud providers should try to minimize energy consumption as much as possible while satisfying consumers' resource requirements and guaranteeing quality of service (QoS). In this paper, we have proposed a remaining utilization-aware (RUA) algorithm for virtual machine (VM) placement, and a power-aware (PA) algorithm is proposed to find proper hosts to shut down for energy saving. These two algorithms have been combined and applied to cloud data centers for completing the process of VM consolidation. Simulation results have shown that there exists a trade-off between the cloud data center's energy consumption and service-level agreement (SLA) violations. Besides, the RUA algorithm is able to deal with variable workload to prevent hosts from overloading after VM placement and to reduce SLA violations dramatically. PMID:26901201
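A sketch of a remaining-utilization-aware placement in the spirit of RUA: prefer the host whose utilization after placement stays farthest below an overload ceiling, so variable workloads have headroom. The threshold and field names are illustrative assumptions, not the paper's exact formulation:

```python
OVERLOAD = 0.9  # assumed utilization ceiling used to keep SLA violations down

def place_vm(vm_demand, hosts):
    """hosts: list of dicts with 'name', 'capacity', 'used'. Returns host name."""
    best, best_headroom = None, -1.0
    for h in hosts:
        util_after = (h["used"] + vm_demand) / h["capacity"]
        headroom = OVERLOAD - util_after        # spare room below the ceiling
        if headroom >= 0 and headroom > best_headroom:
            best, best_headroom = h, headroom
    if best is None:
        raise RuntimeError("no host can take the VM without risking overload")
    best["used"] += vm_demand
    return best["name"]

hosts = [{"name": "h1", "capacity": 10.0, "used": 5.0},
         {"name": "h2", "capacity": 10.0, "used": 2.0}]
print(place_vm(3.0, hosts))  # h2: more remaining utilization after placement
```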
NASA Technical Reports Server (NTRS)
Deiwert, G. S.; Yoshikawa, K. K.
1975-01-01
A semiclassical model proposed by Pearson and Hansen (1974) for computing collision-induced transition probabilities in diatomic molecules is tested by the direct-simulation Monte Carlo method. Specifically, this model is described by point centers of repulsion for collision dynamics, and the resulting classical trajectories are used in conjunction with the Schroedinger equation for a rigid-rotator harmonic oscillator to compute the rotational energy transition probabilities necessary to evaluate the rotation-translation exchange phenomena. It is assumed that a single, average energy spacing exists between the initial state and possible final states for a given collision.
Theoretical Comparison Between Candidates for Dark Matter
NASA Astrophysics Data System (ADS)
McKeough, James; Hira, Ajit; Valdez, Alexandra
2017-01-01
Since the generally accepted view among astrophysicists is that the matter component of the universe is mostly dark matter, the search for dark matter particles continues unabated. The Large Underground Xenon (LUX) improvements, aided by advanced computer simulations at the U.S. Department of Energy's Lawrence Berkeley National Laboratory's (Berkeley Lab) National Energy Research Scientific Computing Center (NERSC) and Brown University's Center for Computation and Visualization (CCV), can potentially eliminate some particle models of dark matter. Generally, the proposed candidates can be put in three categories: baryonic dark matter, hot dark matter, and cold dark matter. The Lightest Supersymmetric Particle (LSP) of supersymmetric models is a dark matter candidate, and is classified as a Weakly Interacting Massive Particle (WIMP). Similar to the cosmic microwave background radiation left over from the Big Bang, there is a background of low-energy neutrinos in our Universe. According to some researchers, these may be the explanation for the dark matter. One advantage of the Neutrino Model is that neutrinos are known to exist. Dark matter made from neutrinos is termed "hot dark matter". We formulate a novel empirical function for the average density profile of cosmic voids, identified via the watershed technique in ΛCDM N-body simulations. This function adequately treats both void size and redshift, and describes the scale radius and the central density of voids. We started with a five-parameter model. Our research is mainly on the LSP and Neutrino models.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fong, K. W.
1977-08-15
This report deals with some techniques in applied programming using the Livermore Timesharing System (LTSS) on the CDC 7600 computers at the National Magnetic Fusion Energy Computer Center (NMFECC) and the Lawrence Livermore Laboratory Computer Center (LLLCC or Octopus network). This report is based on a document originally written specifically about the system as it is implemented at NMFECC but has been revised to accommodate differences between LLLCC and NMFECC implementations. Topics include: maintaining programs, debugging, recovering from system crashes, and using the central processing unit, memory, and input/output devices efficiently and economically. Routines that aid in these procedures are mentioned. The companion report, UCID-17556, An LTSS Compendium, discusses the hardware and operating system and should be read before reading this report.
Deployment and Operational Experiences with CernVM-FS at the GridKa Tier-1 Center
NASA Astrophysics Data System (ADS)
Alef, Manfred; Jäger, Axel; Petzold, Andreas; Verstege, Bernhard
2012-12-01
In 2012 the GridKa Tier-1 computing center hosts 130 kHS06 of computing resources as well as 14 PB of disk and 17 PB of tape space. These resources are shared between the four LHC VOs and a number of national and international VOs from high energy physics and other sciences. CernVM-FS has been deployed at GridKa to supplement the existing NFS-based system to access VO software on the worker nodes. It provides a solution tailored to the requirements of the LHC VOs. We will focus on the first operational experiences and the monitoring of CernVM-FS on the worker nodes and the squid caches.
Internal controls over computer-processed financial data at Boeing Petroleum Services
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1992-02-14
The Strategic Petroleum Reserve (SPR) is responsible for purchasing and storing crude oil to mitigate the potential adverse impact of any future disruptions in crude oil imports. Boeing Petroleum Services, Inc. (BPS) operates the SPR under a US Department of Energy (DOE) management and operating contract. BPS receives support for various information systems and other information processing needs from a mainframe computer center. The objective of the audit was to determine if the internal controls implemented by BPS for computer systems were adequate to assure processing reliability.
The Petascale Data Storage Institute
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gibson, Garth; Long, Darrell; Honeyman, Peter
2013-07-01
Petascale computing infrastructures for scientific discovery make petascale demands on information storage capacity, performance, concurrency, reliability, availability, and manageability. The Petascale Data Storage Institute focuses on the data storage problems found in petascale scientific computing environments, with special attention to community issues such as interoperability, community buy-in, and shared tools. The Petascale Data Storage Institute is a collaboration between researchers at Carnegie Mellon University, the National Energy Research Scientific Computing Center, Pacific Northwest National Laboratory, Oak Ridge National Laboratory, Sandia National Laboratories, Los Alamos National Laboratory, the University of Michigan, and the University of California at Santa Cruz.
DOE Office of Scientific and Technical Information (OSTI.GOV)
National Energy Research Supercomputing Center; He, Yun; Kramer, William T.C.
2008-05-07
The newest workhorse of the National Energy Research Scientific Computing Center is a Cray XT4 with 9,736 dual-core nodes. This paper summarizes Franklin user experiences from the friendly early-user period to the production period. Selected successful user stories along with top issues affecting user experiences are presented.
Ab initio molecular simulations with numeric atom-centered orbitals
NASA Astrophysics Data System (ADS)
Blum, Volker; Gehrke, Ralf; Hanke, Felix; Havu, Paula; Havu, Ville; Ren, Xinguo; Reuter, Karsten; Scheffler, Matthias
2009-11-01
We describe a complete set of algorithms for ab initio molecular simulations based on numerically tabulated atom-centered orbitals (NAOs) to capture a wide range of molecular and materials properties from quantum-mechanical first principles. The full algorithmic framework described here is embodied in the Fritz Haber Institute "ab initio molecular simulations" (FHI-aims) computer program package. Its comprehensive description should be relevant to any other first-principles implementation based on NAOs. The focus here is on density-functional theory (DFT) in the local and semilocal (generalized gradient) approximations, but an extension to hybrid functionals, Hartree-Fock theory, and MP2/GW electron self-energies for total energies and excited states is possible within the same underlying algorithms. An all-electron/full-potential treatment that is both computationally efficient and accurate is achieved for periodic and cluster geometries on equal footing, including relaxation and ab initio molecular dynamics. We demonstrate the construction of transferable, hierarchical basis sets, allowing the calculation to range from qualitative tight-binding like accuracy to meV-level total energy convergence with the basis set. Since all basis functions are strictly localized, the otherwise computationally dominant grid-based operations scale as O(N) with system size N. Together with a scalar-relativistic treatment, the basis sets provide access to all elements from light to heavy. Both low-communication parallelization of all real-space grid based algorithms and a ScaLapack-based, customized handling of the linear algebra for all matrix operations are possible, guaranteeing efficient scaling (CPU time and memory) up to massively parallel computer systems with thousands of CPUs.
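For orientation, the generic functional form of such a numeric atom-centered orbital, as described above (a numerically tabulated radial function times a spherical harmonic), can be written as follows; the notation is standard rather than quoted from the paper:

```latex
% Numeric atom-centered orbital: u_i(r) is tabulated on a radial grid,
% Y_lm is a spherical harmonic; strict localization comes from u_i(r)
% going to zero beyond a confinement radius.
\varphi_{i}(\mathbf{r}) \;=\; \frac{u_{i}(r)}{r}\; Y_{\ell m}(\theta, \phi)
```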
DOE Office of Scientific and Technical Information (OSTI.GOV)
Miller, Jeff
"Carbon in Underland" was submitted by the Center for Nanoscale Controls on Geologic CO2 (NCGC) to the "Life at the Frontiers of Energy Research" video contest at the 2011 Science for Our Nation's Energy Future: Energy Frontier Research Centers (EFRCs) Summit and Forum. Twenty-six EFRCs created short videos to highlight their mission and their work. This video was selected as one of five winners by a distinguished panel of judges for its "entertaining animation and engaging explanations of carbon sequestration". NCGC, an EFRC directed by Donald J. DePaolo at Lawrence Berkeley National Laboratory is a partnership of scientists from sevenmore » institutions: LBNL (lead) Massachusetts Institute of Technology, Lawrence Livermore National Laboratory, Oak Ridge National Laboratory, University of California, Davis, Ohio State University, and Washington University in St. Louis. The Office of Basic Energy Sciences in the U.S. Department of Energy's Office of Science established the 46 Energy Frontier Research Centers (EFRCs) in 2009. These collaboratively-organized centers conduct fundamental research focused on 'grand challenges' and use-inspired 'basic research needs' recently identified in major strategic planning efforts by the scientific community. The overall purpose is to accelerate scientific progress toward meeting the nation's critical energy challenges. The mission of the Center for Nanoscale Control of Geologic CO2 is 'to use new investigative tools, combined with experiments and computer simulations, to build a fundamental understanding of molecular-to-pore-scale processes in fluid-rock systems, and to demonstrate the ability to control critical aspects of flow, transport, and mineralization in porous rock media as applied to geologic sequestration of CO2. Research topics are: bio-inspired, CO2 (store), greenhouse gas, and interfacial characterization.« less
CSP: A Multifaceted Hybrid Architecture for Space Computing
NASA Technical Reports Server (NTRS)
Rudolph, Dylan; Wilson, Christopher; Stewart, Jacob; Gauvin, Patrick; George, Alan; Lam, Herman; Crum, Gary Alex; Wirthlin, Mike; Wilson, Alex; Stoddard, Aaron
2014-01-01
Research on the CHREC Space Processor (CSP) takes a multifaceted hybrid approach to embedded space computing. Working closely with the NASA Goddard SpaceCube team, researchers at the National Science Foundation (NSF) Center for High-Performance Reconfigurable Computing (CHREC) at the University of Florida and Brigham Young University are developing hybrid space computers that feature an innovative combination of three technologies: commercial-off-the-shelf (COTS) devices, radiation-hardened (RadHard) devices, and fault-tolerant computing. Modern COTS processors provide the utmost in performance and energy-efficiency but are susceptible to ionizing radiation in space, whereas RadHard processors are virtually immune to this radiation but are more expensive, larger, less energy-efficient, and generations behind in speed and functionality. By featuring COTS devices to perform the critical data processing, supported by simpler RadHard devices that monitor and manage the COTS devices, and augmented with novel uses of fault-tolerant hardware, software, information, and networking within and between COTS devices, the resulting system can maximize performance and reliability while minimizing energy consumption and cost. NASA Goddard has adopted the CSP concept and technology with plans underway to feature flight-ready CSP boards on two upcoming space missions.
Energy-Aware Computation Offloading of IoT Sensors in Cloudlet-Based Mobile Edge Computing.
Ma, Xiao; Lin, Chuang; Zhang, Han; Liu, Jianwei
2018-06-15
Mobile edge computing is proposed as a promising computing paradigm to relieve the excessive burden of data centers and mobile networks, which is induced by the rapid growth of Internet of Things (IoT). This work introduces the cloud-assisted multi-cloudlet framework to provision scalable services in cloudlet-based mobile edge computing. Due to the constrained computation resources of cloudlets and limited communication resources of wireless access points (APs), IoT sensors with identical computation offloading decisions interact with each other. To optimize the processing delay and energy consumption of computation tasks, theoretic analysis of the computation offloading decision problem of IoT sensors is presented in this paper. In more detail, the computation offloading decision problem of IoT sensors is formulated as a computation offloading game and the condition of Nash equilibrium is derived by introducing the tool of a potential game. By exploiting the finite improvement property of the game, the Computation Offloading Decision (COD) algorithm is designed to provide decentralized computation offloading strategies for IoT sensors. Simulation results demonstrate that the COD algorithm can significantly reduce the system cost compared with the random-selection algorithm and the cloud-first algorithm. Furthermore, the COD algorithm can scale well with increasing IoT sensors.
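To make the finite-improvement idea concrete, here is a toy best-response loop in the spirit of the COD algorithm; the cost model (a congestion term that grows with the number of offloading sensors) is an assumption for illustration, not the paper's formulation:

```python
# Toy finite-improvement dynamics: sensors take turns switching between local
# execution (0) and offloading (1) while any switch lowers their own cost.

def cost(i, decisions, local_cost, offload_base):
    if decisions[i] == 0:                      # compute locally
        return local_cost[i]
    n_offloading = sum(decisions)              # contention on the shared AP/cloudlet
    return offload_base[i] * n_offloading      # delay grows with congestion

def cod(n, local_cost, offload_base):
    decisions = [0] * n
    improved = True
    while improved:                            # terminates: the toy game has the
        improved = False                       # finite improvement property
        for i in range(n):
            current = cost(i, decisions, local_cost, offload_base)
            decisions[i] ^= 1                  # try the other strategy
            if cost(i, decisions, local_cost, offload_base) < current:
                improved = True                # keep the improvement
            else:
                decisions[i] ^= 1              # revert
    return decisions                           # a Nash equilibrium of the toy game

print(cod(4, [10, 3, 8, 6], [2, 2, 2, 2]))    # e.g. [1, 0, 1, 0]
```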
Solar energy for a community recreation center
DOE Office of Scientific and Technical Information (OSTI.GOV)
Libman, D.E.
1980-01-01
A 58,000 ft² recreation center in Shenandoah, Georgia is described. Rooftop solar collectors and reflectors serve as a basis for the active solar heating and cooling systems. The recreation center clearly demonstrates the technical feasibility of solar application in a recreation setting; economically, however, results are shown to be mixed. Although effective in the heating mode, solar cooling is considered questionable in terms of a reasonable payoff period. A computer model predicts a payoff period of 11 years based on 1977 energy prices. The design and construction cost of the solar heating and cooling system ($726,000) was 90% financed by ERDA. A hockey-size ice rink and a gymnasium plus locker rooms and meeting rooms comprised the major part of the floor space. Problems encountered and operation of the facility are described. (MJJ)
Energy Systems Integration Facility (ESIF): Golden, CO - Energy Integration
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sheppy, Michael; VanGeet, Otto; Pless, Shanti
2015-03-01
At NREL's Energy Systems Integration Facility (ESIF) in Golden, Colo., scientists and engineers work to overcome challenges related to how the nation generates, delivers and uses energy by modernizing the interplay between energy sources, infrastructure, and data. Test facilities include a megawatt-scale AC electric grid, photovoltaic simulators and a load bank. Additionally, a high performance computing data center (HPCDC) is dedicated to advancing renewable energy and energy efficient technologies. A key design strategy is to use waste heat from the HPCDC to heat parts of the building. The ESIF boasts an annual EUI of 168.3 kBtu/ft². This article describes the building's procurement, design and first year of performance.
Color-Center Production and Formation in Electron-Irradiated Magnesium Aluminate Spinel and Ceria
Costantini, Jean-Marc; Lelong, Gerald; Guillaumet, Maxime; ...
2016-06-20
Single crystals of magnesium aluminate spinel (MgAl2O4) with (100) or (110) orientations and cerium dioxide or ceria (CeO2) were irradiated by 1.0-MeV and 2.5-MeV electrons in a high fluence range. Point-defect production was studied by off-line UV-visible optical spectroscopy after irradiation. For spinel, regardless of both crystal orientation and electron energy, two characteristic broad bands centered at photon energies of 5.4 eV and 4.9 eV were assigned to F and F+ centers (neutral and singly-ionized oxygen vacancies), respectively, on the basis of available literature data. No clear differences in color-center formation were observed for the two crystal orientations. Using calculations of displacement cross sections by elastic collisions, these results are consistent with a very large threshold displacement energy (200 eV) for oxygen atoms at room temperature. A third very broad band centered at 3.7 eV might be attributed either to an oxygen hole center (V-type center) or an F2 dimer center (oxygen di-vacancy). The onset of recovery of these color centers took place at 200°C with almost full bleaching at 600°C. Activation energies (~0.3-0.4 eV) for defect recovery were deduced from the isochronal annealing data by using a first-order kinetics analysis. For ceria, a sub-band-gap absorption feature peaked at ~3.1 eV was recorded for 2.5-MeV electron irradiation only. Assuming a ballistic process, we suggest that the latter defect might result from cerium atom displacement on the basis of computed cross sections.
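The first-order kinetics analysis mentioned for the isochronal annealing data conventionally takes the following form (a standard textbook expression supplied here for context, not quoted from the paper; ν is an attempt frequency and t_a the anneal time at each temperature step):

```latex
% First-order recovery of a defect population N during isochronal annealing:
\frac{dN}{dt} = -\nu\, e^{-E_a/k_B T}\, N
\quad\Longrightarrow\quad
\frac{N(T)}{N_0} = \exp\!\left[-\nu\, t_a\, e^{-E_a/k_B T}\right]
```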
Computer Technology for Industry
NASA Technical Reports Server (NTRS)
1979-01-01
In this age of the computer, more and more business firms are automating their operations for increased efficiency in a great variety of jobs, from simple accounting to managing inventories, from precise machining to analyzing complex structures. In the interest of national productivity, NASA is providing assistance both to longtime computer users and newcomers to automated operations. Through a special technology utilization service, NASA saves industry time and money by making available already developed computer programs which have secondary utility. A computer program is essentially a set of instructions which tells the computer how to produce desired information or effect by drawing upon its stored input. Developing a new program from scratch can be costly and time-consuming. Very often, however, a program developed for one purpose can readily be adapted to a totally different application. To help industry take advantage of existing computer technology, NASA operates the Computer Software Management and Information Center (COSMIC®), located at the University of Georgia. COSMIC maintains a large library of computer programs developed for NASA, the Department of Defense, the Department of Energy and other technology-generating agencies of the government. The Center gets a continual flow of software packages, screens them for adaptability to private sector usage, stores them and informs potential customers of their availability.
Jeharsae, Rohani; Sangthong, Rassamee; Chongsuvivatwong, Virasakdi
2011-09-01
This survey examined nutritional intake and the effects of armed conflict on energy-protein inadequacy among children aged one to less than five years. Fifty health centers were randomly selected. Three children were randomly selected from each 12-month age interval in each health center. Four hundred seventy-eight children and their primary caregivers were recruited. Food intake was collected from a single 24-hour food recall and was computed as a percentage of the Thai Dietary Reference Intake (DRI). Violent event rates were classified by quartiles. Dietary intake stratified by age group was examined. Logistic regression was used to examine the association between armed conflict and inadequacy of food intake. The average DRI was above 100% for both energy and protein intake. Snacks contributed one-fourth of energy intake. Inadequacy of energy and protein intake was 27% and 7%, respectively. There was no association between armed conflict and inadequacy of energy and protein consumption.
NASA Astrophysics Data System (ADS)
Hayashi, Shigehiko; Uchida, Yoshihiro; Hasegawa, Taisuke; Higashi, Masahiro; Kosugi, Takahiro; Kamiya, Motoshi
2017-05-01
Many remarkable molecular functions of proteins use their characteristic global and slow conformational dynamics through coupling of local chemical states in reaction centers with global conformational changes of proteins. To theoretically examine the functional processes of proteins in atomic detail, a methodology of quantum mechanical/molecular mechanical (QM/MM) free-energy geometry optimization is introduced. In the methodology, a geometry optimization of a local reaction center is performed with a quantum mechanical calculation on a free-energy surface constructed with conformational samples of the surrounding protein environment obtained by a molecular dynamics simulation with a molecular mechanics force field. Geometry optimizations on extensive free-energy surfaces by a QM/MM reweighting free-energy self-consistent field method designed to be variationally consistent and computationally efficient have enabled examinations of the multiscale molecular coupling of local chemical states with global protein conformational changes in functional processes and analysis and design of protein mutants with novel functional properties.
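Schematically, the quantity being optimized in such a QM/MM free-energy geometry optimization is the free energy of the QM reaction-center geometry Q averaged over MM conformations X sampled by molecular dynamics; the form below is the standard formulation, with notation assumed rather than copied from the paper:

```latex
% Free-energy surface over QM coordinates Q, with MM coordinates X averaged out:
A(\mathbf{Q}) = -k_B T \,\ln \int e^{-\beta E_{\mathrm{QM/MM}}(\mathbf{Q},\mathbf{X})}\, d\mathbf{X},
\qquad
\frac{\partial A}{\partial \mathbf{Q}}
 = \left\langle \frac{\partial E_{\mathrm{QM/MM}}}{\partial \mathbf{Q}} \right\rangle_{\mathbf{X}}
```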
DOE Office of Scientific and Technical Information (OSTI.GOV)
Carter, Thomas; Liu, Zan; Sickinger, David
The Thermosyphon Cooler Hybrid System (TCHS) integrates the control of a dry heat rejection device, the thermosyphon cooler (TSC), with an open cooling tower. A combination of equipment and controls, this new heat rejection system embraces the "smart use of water," using evaporative cooling when it is most advantageous and then saving water and modulating toward increased dry sensible cooling as system operations and ambient weather conditions permit. Innovative fan control strategies ensure the most economical balance between water savings and parasitic fan energy. The unique low-pressure-drop design of the TSC allows water to be cooled directly by the TSC evaporator without risk of bursting tubes in subfreezing ambient conditions. Johnson Controls partnered with the National Renewable Energy Laboratory (NREL) and Sandia National Laboratories to deploy the TSC as a test bed at NREL's high-performance computing (HPC) data center in the first half of 2016. Located in NREL's Energy Systems Integration Facility (ESIF), this HPC data center has achieved an annualized average power usage effectiveness rating of 1.06 or better since 2012. Warm-water liquid cooling is used to capture heat generated by computer systems direct to water; that waste heat is either reused as the primary heat source in the ESIF building or rejected using evaporative cooling. This data center is the single largest source of water and power demand on the NREL campus, using about 7,600 m³ (2.0 million gal) of water during the past year with an hourly average IT load of nearly 1 MW (3.4 million Btu/h), so dramatically reducing water use while continuing efficient data center operations is of significant interest. Because Sandia's climate is similar to NREL's, this new heat rejection system being deployed at NREL has gained interest at Sandia. Sandia's data centers utilize an hourly average of 8.5 MW (29 million Btu/h) and are also one of the largest consumers of water on Sandia's site. In addition to describing the installation of the TSC and its integration into the ESIF, this paper focuses on the full heat rejection system simulation program used for hourly analysis of the energy and water consumption of the complete system under varying operating scenarios. A follow-up paper will detail the test results. The evaluation of the TSC's performance at NREL will also determine a path forward at Sandia for possible deployment in a large-scale system, not only for data center use but possibly site wide.
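A minimal sketch of what such an hourly wet/dry dispatch simulation might look like; the switchover rule, fan-power fractions, and water-use coefficient are placeholder assumptions, not NREL's or Johnson Controls' model:

```python
# Hourly dispatch between dry sensible cooling (thermosyphon) and evaporative
# cooling (tower), tallying annual water use and fan energy.

def simulate_year(hourly_wetbulb_c, it_load_kw, dry_limit_c=15.0):
    water_m3 = 0.0
    fan_kwh = 0.0
    for t_wb in hourly_wetbulb_c:
        if t_wb <= dry_limit_c:
            # Cool sensibly with the TSC: no water, but higher fan power (assumed 3%).
            fan_kwh += 0.03 * it_load_kw
        else:
            # Evaporative tower: ~1.5 L per kWh of heat rejected (assumed coefficient).
            water_m3 += 0.0015 * it_load_kw
            fan_kwh += 0.01 * it_load_kw
    return water_m3, fan_kwh

# Toy year: half the hours cool enough for dry operation, 1 MW IT load.
water, fans = simulate_year([10.0] * 4380 + [20.0] * 4380, it_load_kw=1000.0)
print(round(water), "m3 of water,", round(fans), "kWh of fan energy")
```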
Application of electrochemical energy storage in solar thermal electric generation systems
NASA Technical Reports Server (NTRS)
Das, R.; Krauthamer, S.; Frank, H.
1982-01-01
This paper assesses the status, cost, and performance of existing electrochemical energy storage systems, and projects the cost, performance, and availability of advanced storage systems for application in terrestrial solar thermal electric generation. A 10 MWe solar plant with five hours of storage is considered and the cost of delivered energy is computed for sixteen different storage systems. The results indicate that the five most attractive electrochemical storage systems use the following battery types: zinc-bromine (Exxon), iron-chromium redox (NASA/Lewis Research Center, LeRC), sodium-sulfur (Ford), sodium-sulfur (Dow), and zinc-chlorine (Energy Development Associates, EDA).
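As a back-of-envelope illustration of a delivered-energy cost comparison of this kind, here is a minimal sketch; the capital recovery factor, round-trip efficiency, and dollar figures are invented placeholders, not the paper's data:

```python
# Cost of delivered energy: annualized capital plus O&M, divided by the
# energy actually delivered after storage losses.

def cost_of_delivered_energy(capital, crf, om_per_year, kwh_per_year, efficiency):
    annual_cost = capital * crf + om_per_year
    return annual_cost / (kwh_per_year * efficiency)   # $/kWh delivered

# Example: 10 MWe plant with 5 h of storage (50 MWh/day), cycled 300 days/yr.
kwh_yr = 50_000 * 300
print(cost_of_delivered_energy(capital=6e6, crf=0.1, om_per_year=2e5,
                               kwh_per_year=kwh_yr, efficiency=0.75))
```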
Application-oriented offloading in heterogeneous networks for mobile cloud computing
NASA Astrophysics Data System (ADS)
Tseng, Fan-Hsun; Cho, Hsin-Hung; Chang, Kai-Di; Li, Jheng-Cong; Shih, Timothy K.
2018-04-01
Internet applications have nowadays become so complicated that a mobile device needs more computing resources for shorter execution times, but it is restricted by its limited battery capacity. Mobile cloud computing (MCC) has emerged to tackle the finite resource problem of mobile devices. MCC offloads the tasks and jobs of mobile devices to cloud and fog environments by using an offloading scheme. It is vital to MCC which tasks should be offloaded and how to offload them efficiently. In this paper, we formulate the offloading problem between mobile device and cloud data center and propose two application-oriented algorithms for minimum execution time, i.e. the Minimum Offloading Time for Mobile device (MOTM) algorithm and the Minimum Execution Time for Cloud data center (METC) algorithm. The MOTM algorithm minimizes offloading time by selecting appropriate offloading links based on application categories. The METC algorithm minimizes execution time in the cloud data center by selecting virtual and physical machines matching the resource requirements of each application. Simulation results show that the proposed mechanism not only minimizes total execution time for mobile devices but also decreases their energy consumption.
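An illustrative link/machine selection in the spirit of MOTM and METC: pick the offloading link minimizing transfer time, then the machine minimizing execution time for the task. All field names and figures are assumptions for illustration:

```python
# MOTM-like step: cheapest link by transfer time for a task of given size.
def motm(task_bits, links):
    # links: list of dicts with 'name' and 'bandwidth_bps'
    return min(links, key=lambda l: task_bits / l["bandwidth_bps"])

# METC-like step: fastest machine by execution time for a task of given work.
def metc(task_cycles, machines):
    # machines: list of dicts with 'name' and 'cycles_per_sec'
    return min(machines, key=lambda m: task_cycles / m["cycles_per_sec"])

link = motm(8e6, [{"name": "wifi", "bandwidth_bps": 5e7},
                  {"name": "lte", "bandwidth_bps": 2e7}])
vm = metc(1e10, [{"name": "vm-small", "cycles_per_sec": 2e9},
                 {"name": "vm-large", "cycles_per_sec": 8e9}])
print(link["name"], vm["name"])  # wifi vm-large
```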
Tropical Ocean and Global Atmosphere (TOGA) heat exchange project: A summary report
NASA Technical Reports Server (NTRS)
Liu, W. T.; Niiler, P. P.
1985-01-01
A pilot data center to compute ocean-atmosphere heat exchange over the tropical ocean is proposed at the Jet Propulsion Laboratory (JPL) in response to the scientific needs of the Tropical Ocean and Global Atmosphere (TOGA) Program. Optimal methods will be used to estimate sea surface temperature (SST), surface wind speed, and humidity from spaceborne observations. A monthly summary of these parameters will be used to compute ocean-atmosphere latent heat exchanges. Monthly fields of surface heat flux over tropical oceans will be constructed using estimations of latent heat exchanges and short wave radiation from satellite data. Verification of all satellite data sets with in situ measurements at a few locations will be provided. The data center will be an experimental active archive where the quality and quantity of data required for TOGA flux computation are managed. The center is essential to facilitate the construction of composite data sets from global measurements taken from different sensors on various satellites. It will provide efficient utilization and easy access to the large volume of satellite data available for studies of ocean-atmosphere energy exchanges.
2015-11-01
provided by a stand-alone desktop or handheld computing device. This introduces into the discussion a large number of mobile, tactical command...control, communications, and computer (C4) systems across the Services. A couple of examples are mobile command posts mounted on the back of an M1152... infrastructure (DCPI). This term encompasses on-site backup generators, switchgear, uninterruptible power supplies (UPS), and power distribution units
Computation of Temperature-Dependent Legendre Moments of a Double-Differential Elastic Cross Section
DOE Office of Scientific and Technical Information (OSTI.GOV)
Arbanas, Goran; Dunn, Michael E; Larson, Nancy M
2011-01-01
A general expression for the temperature-dependent Legendre moments of a double-differential elastic scattering cross section was derived by Ouisloumen and Sanchez [Nucl. Sci. Eng. 107, 189-200 (1991)]. Attempts to compute this expression are hindered by the three-fold nested integral, limiting practical applications to just the zeroth Legendre moment of isotropic scattering. It is shown that the two innermost integrals can be evaluated analytically to all orders of Legendre moments, and for anisotropic scattering, by a recursive application of the integration-by-parts method. For this method to work, the anisotropic angular distribution in the center of mass is expressed as an expansion in Legendre polynomials. The first several Legendre moments of elastic scattering of neutrons on U-238 are computed at T = 1000 K at incoming energy 6.5 eV for isotropic scattering in the center-of-mass frame. Legendre moments of the anisotropic angular distribution given via Blatt-Biedenharn coefficients are computed at ~1 keV. The results are in agreement with those computed by the Monte Carlo method.
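For orientation, the generic definition of the Legendre moments in question is given below; this is the schematic textbook form, whereas the full temperature-dependent expression of Ouisloumen and Sanchez additionally carries the three-fold nested integral mentioned above:

```latex
% l-th Legendre moment of the double-differential elastic scattering kernel,
% where P_l is the Legendre polynomial and mu the scattering cosine:
\sigma_{\ell}(E \to E', T) \;=\; 2\pi \int_{-1}^{1}
   \sigma(E \to E', \mu, T)\, P_{\ell}(\mu)\, d\mu
```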
How are the energy waves blocked on the way from hot to cold?
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bai, Xianming; He, Lingfeng; Khafizov, Marat
Representing the Center for Materials Science of Nuclear Fuel (CMSNF), this document is one of the entries in the Ten Hundred and One Word Challenge. As part of the challenge, the 46 Energy Frontier Research Centers were invited to represent their science in images, cartoons, photos, words and original paintings, but any descriptions or words could only use the 1000 most commonly used words in the English language, with the addition of one word important to each of the EFRCs and the mission of DOE energy. The mission of CMSNF is to develop an experimentally validated multi-scale computational capability for the predictive understanding of the impact of microstructure on thermal transport in nuclear fuel under irradiation, with ultimate application to UO2 as a model system.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shorgin, Sergey Ya.; Pechinkin, Alexander V.; Samouylov, Konstantin E.
Cloud computing is a promising technology to manage and improve utilization of computing center resources to deliver various computing and IT services. For the purpose of energy saving there is no need to unnecessarily operate many servers under light loads, and they are switched off. On the other hand, some servers should be switched on in heavy load cases to prevent very long delays. Thus, waiting times and system operating cost can be maintained at an acceptable level by dynamically adding or removing servers. One more fact that should be taken into account is significant server setup costs and activation times. For better energy efficiency, a cloud computing system should not react to instantaneous increases or decreases of load. That is the main motivation for using queuing systems with hysteresis for cloud computing system modelling. In the paper, we provide a model of a cloud computing system in terms of a multiple-server threshold-based infinite-capacity queuing system with hysteresis and noninstantaneous server activation. For the proposed model, we develop a method for computing steady-state probabilities that allows a number of performance measures to be estimated.
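A minimal sketch of the hysteresis behavior the model captures: servers are added above a high queue threshold and removed below a lower one, so brief load spikes inside the band trigger no costly setups. Thresholds and limits are illustrative assumptions:

```python
# Threshold-based hysteresis controller for the number of active servers.

def adjust_servers(queue_len, active, high=50, low=10, s_min=1, s_max=20):
    if queue_len > high and active < s_max:
        return active + 1          # switch one more server on (after setup time)
    if queue_len < low and active > s_min:
        return active - 1          # switch one off to save energy
    return active                  # inside the hysteresis band: do nothing

active = 4
for q in [60, 55, 30, 8, 5]:      # queue lengths observed over time
    active = adjust_servers(q, active)
    print(q, "->", active)         # 60->5, 55->6, 30->6, 8->5, 5->4
```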
Hera: High Energy Astronomical Data Analysis via the Internet
NASA Astrophysics Data System (ADS)
Valencic, Lynne A.; Chai, P.; Pence, W.; Snowden, S.
2011-09-01
The HEASARC at NASA Goddard Space Flight Center has developed Hera, a data processing facility for analyzing high energy astronomical data over the internet. Hera provides all the software packages, disk space, and computing resources needed to do general processing of and advanced research on publicly available data from High Energy Astrophysics missions. The data and data products are kept on a server at GSFC and can be downloaded to a user's local machine. This service is provided for free to students, educators, and researchers for educational and research purposes.
Multi-Dimensional Optimization for Cloud Based Multi-Tier Applications
ERIC Educational Resources Information Center
Jung, Gueyoung
2010-01-01
Emerging trends toward cloud computing and virtualization have been opening new avenues to meet enormous demands of space, resource utilization, and energy efficiency in modern data centers. By being allowed to host many multi-tier applications in consolidated environments, cloud infrastructure providers enable resources to be shared among these…
National Energy Research Scientific Computing Center
Modeling the Flow of Rarefied Gases at NASA
NASA Technical Reports Server (NTRS)
Forrest E. Lumpkin, III
2012-01-01
At modest temperatures, the thermal energy of atmospheric diatomic gases such as nitrogen is primarily distributed between only the translational and rotational energy modes. Furthermore, these energy modes are fully excited, such that the specific heat at constant volume is well approximated by the simple expression C(sub v) = 5/2 R. As a result, classical mechanics provides a suitable approximation, at such temperatures, of the true quantum mechanical behavior of the inter-molecular collisions of such molecules. Using classical mechanics, the transfer of energy between rotational and translational energy modes is studied. The approach of Lordi and Mates is adopted to compute the trajectories and time-dependent rotational orientations and energies during the collision of two non-polar diatomic molecules. A Monte-Carlo analysis is performed, collecting data from the results of many such simulations in order to estimate the rotational relaxation time. A Graphical Processing Unit (GPU) is employed to improve the performance of the Monte-Carlo analysis. A comparison of the performance of the GPU implementation to an implementation on a traditional computer architecture is made. Effects of the assumed inter-molecular potential on the relaxation time are studied. The seminar will also present highlights of computational analyses performed at NASA Johnson Space Center of heat transfer in rarefied gases.
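For context, rotational relaxation times of this kind are conventionally defined through the standard Jeans-type relaxation equation below; this is a textbook form supplied as background, not necessarily the exact fitting form used in the seminar:

```latex
% Relaxation of the mean rotational energy toward its equilibrium value,
% and the rotational collision number expressed against the collision time:
\frac{d\,\overline{E}_{\mathrm{rot}}}{dt}
  = \frac{\overline{E}_{\mathrm{rot}}^{\,\mathrm{eq}} - \overline{E}_{\mathrm{rot}}}{\tau_{\mathrm{rot}}},
\qquad
Z_{\mathrm{rot}} = \frac{\tau_{\mathrm{rot}}}{\tau_{\mathrm{coll}}}
```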
DOE/ NREL Build One of the World's Most Energy Efficient Office Spaces
Radocy, Rachel; Livingston, Brian; von Luhrte, Rich
2018-05-18
Technology, from sophisticated computer modeling to advanced windows that actually open, will help the newest building at the U.S. Department of Energy's (DOE) National Renewable Energy Laboratory (NREL) be one of the world's most energy efficient offices. Scheduled to open this summer, the 222,000-square-foot Research Support Facility (RSF) will house more than 800 staff and an energy efficient information technology data center. Because 19 percent of the country's energy is used by commercial buildings, DOE plans to make this facility a showcase for energy efficiency. DOE hopes the design of the RSF will be replicated by the building industry and help reduce the nation's energy consumption by changing the way commercial buildings are designed and built.
A Look Inside Argonne's Center for Nanoscale Materials
Divan, Ralu; Rosenthal, Dan; Rose, Volker; Wai Hla
2018-05-23
At a very small, or "nano" scale, materials behave differently. The study of nanomaterials is much more than miniaturization - scientists are discovering how changes in size change a material's properties. From sunscreen to computer memory, the applications of nanoscale materials research are all around us. Researchers at Argonne's Center for Nanoscale Materials are creating new materials, methods and technologies to address some of the world's greatest challenges in energy security, lightweight but durable materials, high-efficiency lighting, information storage, environmental stewardship and advanced medical devices.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Franz, James A.; O'Hagan, Molly J.; Ho, Ming-Hsun
2013-12-09
The [Ni(PR2NR'2)2]2+ catalysts (where PR2NR'2 is 1,5-R'-3,7-R-1,5-diaza-3,7-diphosphacyclooctane) are some of the fastest reported for hydrogen production and oxidation; however, chair/boat isomerization and the presence of a fifth solvent ligand have the potential to slow catalysis by incorrectly positioning the pendant amines or blocking the addition of hydrogen. Here, we report the structural dynamics of a series of [Ni(PR2NR'2)2]n+ complexes, characterized by NMR spectroscopy and theoretical modeling. A fast exchange process was observed for the [Ni(CH3CN)(PR2NR'2)2]2+ complexes which depends on the ligand. This exchange process was identified to occur through a three-step mechanism including dissociation of the acetonitrile, boat/chair isomerization of each of the four rings defined by the phosphine ligands (including nitrogen inversion), and reassociation of acetonitrile on the opposite side of the complex. The rate of the chair/boat inversion can be influenced by varying the substituent on the nitrogen atom, but the rate of the overall exchange process is at least an order of magnitude faster than the catalytic rate in acetonitrile, demonstrating that the structural dynamics of the [Ni(PR2NR'2)2]2+ complexes do not hinder catalysis. This material is based upon work supported as part of the Center for Molecular Electrocatalysis, an Energy Frontier Research Center funded by the US Department of Energy, Office of Science, Office of Basic Energy Sciences under FWP56073. Research by J.A.F., M.O., M-H.H., M.L.H., D.L.D., A.M.A., S.R., and R.M.B. was carried out in the Center for Molecular Electrocatalysis, an Energy Frontier Research Center funded by the U.S. Department of Energy, Office of Science. W.J.S. and S.L. were funded by the DOE Office of Science Early Career Research Program through the Office of Basic Energy Sciences. T.L. was supported by the US Department of Energy, Office of Basic Energy Sciences, Division of Chemical Sciences, Geosciences & Biosciences. Pacific Northwest National Laboratory (PNNL) is a multiprogram national laboratory operated for DOE by Battelle. Computational resources were provided at the W. R. Wiley Environmental Molecular Sciences Laboratory (EMSL), a national scientific user facility sponsored by the Department of Energy's Office of Biological and Environmental Research located at Pacific Northwest National Laboratory; the National Energy Research Scientific Computing Center (NERSC) at Lawrence Berkeley National Laboratory; and the Jaguar supercomputer at Oak Ridge National Laboratory (INCITE 2008-2011 award supported by the Office of Science of the U.S. DOE under Contract No. DE-AC0500OR22725).
NASA Astrophysics Data System (ADS)
Maire, Pierre-Henri; Abgrall, Rémi; Breil, Jérôme; Loubère, Raphaël; Rebourcet, Bernard
2013-02-01
In this paper, we describe a cell-centered Lagrangian scheme devoted to the numerical simulation of solid dynamics on two-dimensional unstructured grids in planar geometry. This numerical method utilizes the classical elastic-perfectly plastic material model initially proposed by Wilkins [M.L. Wilkins, Calculation of elastic-plastic flow, Meth. Comput. Phys. (1964)]. In this model, the Cauchy stress tensor is decomposed into the sum of its deviatoric part and the thermodynamic pressure which is defined by means of an equation of state. Regarding the deviatoric stress, its time evolution is governed by a classical constitutive law for isotropic material. The plasticity model employs the von Mises yield criterion and is implemented by means of the radial return algorithm. The numerical scheme relies on a finite volume cell-centered method wherein numerical fluxes are expressed in terms of sub-cell force. The generic form of the sub-cell force is obtained by requiring the scheme to satisfy a semi-discrete dissipation inequality. Sub-cell force and nodal velocity to move the grid are computed consistently with cell volume variation by means of a node-centered solver, which results from total energy conservation. The nominally second-order extension is achieved by developing a two-dimensional extension in the Lagrangian framework of the Generalized Riemann Problem methodology, introduced by Ben-Artzi and Falcovitz [M. Ben-Artzi, J. Falcovitz, Generalized Riemann Problems in Computational Fluid Dynamics, Cambridge Monogr. Appl. Comput. Math. (2003)]. Finally, the robustness and the accuracy of the numerical scheme are assessed through the computation of several test cases.
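The radial return step named in the abstract has a compact textbook form, sketched below for the von Mises, elastic-perfectly plastic case; the notation and NumPy phrasing are assumptions, not the authors' implementation:

```python
# Radial return: if the trial deviatoric stress lies outside the von Mises
# yield surface, scale it radially back onto the surface.
import numpy as np

def radial_return(s_trial, sigma_y):
    """s_trial: 3x3 trial deviatoric stress tensor; sigma_y: yield stress.
    Returns the deviatoric stress after the plastic correction."""
    norm = np.sqrt(np.tensordot(s_trial, s_trial))   # Frobenius norm ||s|| = sqrt(s:s)
    s_limit = np.sqrt(2.0 / 3.0) * sigma_y           # von Mises radius in deviatoric space
    if norm <= s_limit:
        return s_trial                               # elastic: accept the trial state
    return (s_limit / norm) * s_trial                # plastic: project radially back

s = np.diag([300.0, -150.0, -150.0])                 # sample trial deviator (MPa)
print(radial_return(s, sigma_y=250.0))
```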
Yahoo! Compute Coop (YCC). A Next-Generation Passive Cooling Design for Data Centers
DOE Office of Scientific and Technical Information (OSTI.GOV)
Robison, AD; Page, Christina; Lytle, Bob
The purpose of the Yahoo! Compute Coop (YCC) project is to research, design, build, and implement a greenfield "efficient data factory" and to specifically demonstrate that the YCC concept is feasible for large facilities housing tens of thousands of heat-producing computing servers. The project scope for the Yahoo! Compute Coop technology includes: analyzing and implementing ways to drastically decrease energy consumption and waste output; analyzing the laws of thermodynamics and exploiting naturally occurring environmental effects in order to maximize "free cooling" for large data center facilities ("free cooling" is the direct usage of outside air to cool the servers, versus traditional "mechanical cooling" supplied by chillers or other DX units); redesigning and simplifying building materials and methods; and shortening and simplifying build-to-operate schedules while at the same time reducing initial build and operating costs. Selected for its favorable climate, the greenfield project site is located in Lockport, NY. Construction on the 9.0 MW critical-load data center facility began in May 2009, with the fully operational facility deployed in September 2010. The relatively low initial build cost, compatibility with current server and network models, and the efficient use of power and water are all key features that make it a highly compatible and globally implementable design innovation for the data center industry. Yahoo! Compute Coop technology is designed to achieve 99.98% uptime availability. This integrated building design allows for free cooling 99% of the year via the building's unique shape and orientation, as well as the servers' physical configuration.
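A free-cooling figure like the 99% quoted above can be sanity-checked against any hourly climate record. Here is a hedged sketch; the intake-temperature threshold and the synthetic temperature trace are placeholders, not YCC design values.

```python
import math

def free_cooling_fraction(hourly_temps_c, max_intake_c=26.0):
    """Fraction of hours in which outside air alone could cool the servers.

    hourly_temps_c : iterable of hourly dry-bulb temperatures (deg C)
    max_intake_c   : highest acceptable server intake temperature
                     (26 C is an illustrative placeholder, not the YCC spec)
    """
    temps = list(hourly_temps_c)
    ok = sum(1 for t in temps if t <= max_intake_c)
    return ok / len(temps)

# Toy example: a crude sinusoidal year for a cool-climate site.
year = [10.0 + 12.0 * math.sin(2 * math.pi * h / 8760.0) +
        5.0 * math.sin(2 * math.pi * h / 24.0) for h in range(8760)]
print(f"free-cooling fraction: {free_cooling_fraction(year):.1%}")
```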
Computer program for flat sector thrust bearing performance
NASA Technical Reports Server (NTRS)
Presler, A. F.; Etsion, I.
1977-01-01
A versatile computer program is presented which achieves a rapid numerical solution of the Reynolds equation for a flat sector thrust pad bearing with either compressible or liquid lubricants. Program input includes a range of values of the geometric and operating parameters of the sector bearing. Performance characteristics are obtained from the calculated bearing pressure distribution: the load capacity, center-of-pressure coordinates, frictional energy dissipation, and flow rates of liquid lubricant across the bearing edges. Two sample problems are described.
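The heart of such a program is a discretized Reynolds equation. The sketch below solves the incompressible (liquid-lubricant) case on a rectangular slider pad by Gauss-Seidel iteration, a deliberate simplification of the sector geometry and solution method in the report; the film profile, dimensions, viscosity, and grid are all illustrative.

```python
import numpy as np

def solve_reynolds(nx=41, ny=41, L=0.1, B=0.1, h1=5e-5, h2=2.5e-5,
                   mu=0.05, U=10.0, iters=2000):
    """Gauss-Seidel solution of d/dx(h^3 dp/dx) + d/dy(h^3 dp/dy) = 6*mu*U*dh/dx
    on a rectangular pad with a linear film h(x): h1 -> h2, p = 0 on the edges.
    Returns the pressure field and the integrated load capacity."""
    dx, dy = L / (nx - 1), B / (ny - 1)
    x = np.linspace(0.0, L, nx)
    h = h1 + (h2 - h1) * x / L            # film thickness varies in x only
    dhdx = (h2 - h1) / L                  # converging film: dh/dx < 0
    p = np.zeros((nx, ny))                # boundary pressure is ambient (0)
    for _ in range(iters):
        for i in range(1, nx - 1):
            he = ((h[i] + h[i + 1]) / 2) ** 3   # h^3 at east/west cell faces
            hw = ((h[i] + h[i - 1]) / 2) ** 3
            hc = h[i] ** 3
            for j in range(1, ny - 1):
                num = (he * p[i + 1, j] + hw * p[i - 1, j]) / dx**2 \
                    + hc * (p[i, j + 1] + p[i, j - 1]) / dy**2 \
                    - 6.0 * mu * U * dhdx
                p[i, j] = num / ((he + hw) / dx**2 + 2.0 * hc / dy**2)
    load = p.sum() * dx * dy              # rectangle-rule load integral
    return p, load

p, W = solve_reynolds()
print(f"load capacity ~ {W:.0f} N")
```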
The Future is Hera: Analyzing Astronomical Data Over the Internet
NASA Astrophysics Data System (ADS)
Valencic, Lynne A.; Snowden, S.; Chai, P.; Shafer, R.
2009-01-01
Hera is the new data processing facility provided by the HEASARC at the NASA Goddard Space Flight Center for analyzing astronomical data. Hera provides all the preinstalled software packages, local disk space, and computing resources needed to do general processing of FITS format data files residing on the user's local computer, and to do advanced research using the publicly available data from High Energy Astrophysics missions. Qualified students, educators, and researchers may freely use the Hera services over the internet for research and educational purposes.
Scientific Computing Strategic Plan for the Idaho National Laboratory
DOE Office of Scientific and Technical Information (OSTI.GOV)
Whiting, Eric Todd
Scientific computing is a critical foundation of modern science. Without innovations in the field of computational science, the essential missions of the Department of Energy (DOE) would go unrealized. Taking a leadership role in such innovations is Idaho National Laboratory's (INL's) challenge and charge, and is central to INL's ongoing success. Computing is an essential part of INL's future. DOE science and technology missions rely firmly on computing capabilities in various forms. Modeling and simulation, fueled by innovations in computational science and validated through experiment, are a critical foundation of science and engineering. Big data analytics from an increasing number of widely varied sources is opening new windows of insight and discovery. Computing is a critical tool in education, science, engineering, and experiments. Advanced computing capabilities in the form of people, tools, computers, and facilities will position INL competitively to deliver results and solutions on important national science and engineering challenges. A computing strategy must include much more than simply computers. The foundational enabling component of computing at many DOE national laboratories is the combination of a showcase data center facility coupled with a very capable supercomputer. In addition, network connectivity, disk storage systems, and visualization hardware are critical and generally tightly coupled to the computer system and co-located in the same facility. The existence of these resources in a single data center facility opens the doors to many opportunities that would not otherwise be possible.
Catalytic N 2 Reduction to Silylamines and Thermodynamics of N 2 Binding at Square Planar Fe
DOE Office of Scientific and Technical Information (OSTI.GOV)
Prokopchuk, Demyan E.; Wiedner, Eric S.; Walter, Eric D.
The geometric constraints imposed by a tetradentate P4N2 ligand play an essential role in stabilizing square planar Fe complexes with changes in metal oxidation state. A combination of high-pressure electrochemistry and variable-temperature UV-vis spectroscopy was used to obtain these thermodynamic measurements, while X-ray crystallography, 57Fe Mössbauer spectroscopy, and EPR spectroscopy were used to fully characterize these new compounds. Analysis of Fe0, FeI, and FeII complexes reveals that the free energy of N2 binding across three oxidation states spans more than 37 kcal mol−1. The square pyramidal Fe0(N2)(P4N2) complex catalyzes the conversion of N2 to N(SiR3)3 (R = Me, Et) at room temperature, representing the highest turnover number (TON) of any Fe-based N2 silylation catalyst to date (up to 65 equiv N(SiMe3)3 per Fe center). Elevated N2 pressures (>1 atm) have a dramatic effect on catalysis, increasing N2 solubility and the thermodynamic N2 binding affinity at Fe0(N2)(P4N2). Acknowledgment: This research was supported as part of the Center for Molecular Electrocatalysis, an Energy Frontier Research Center funded by the U.S. Department of Energy (DOE), Office of Science, Office of Basic Energy Sciences. EPR experiments were performed using EMSL, a national scientific user facility sponsored by the DOE's Office of Biological and Environmental Research and located at Pacific Northwest National Laboratory (PNNL). PNNL is operated by Battelle for the U.S. DOE. Computational resources were provided by the National Energy Research Scientific Computing Center (NERSC) at Lawrence Berkeley National Laboratory. The authors thank Prof. Yisong Alex Guo at Carnegie Mellon University for recording Mössbauer data for some complexes; Emma Wellington and Kaye Kuphal for their assistance with the collection of Mössbauer data at Colgate University; Dr. Katarzyna Grubel for X-ray assistance; and Dr. Rosalie Chu for mass spectrometry assistance. The authors also thank Dr. Aaron Appel and Dr. Alex Kendall for helpful discussions.
Electrochemistry for Energy Conversion
NASA Astrophysics Data System (ADS)
O'Hayre, Ryan
2010-10-01
Imagine a laptop computer that runs for 30 hours on a single charge. Imagine a world where you plug your house into your car and power lines are a distant memory. These dreams motivate today's fuel cell research. While some dreams (like powering your home with your fuel cell car) may be distant, others (like a 30-hour fuel cell laptop) may be closer than you think. If you are curious about fuel cells, how they work, and when you might start seeing them in your daily life, this talk is for you. Learn about the state of the art in fuel cells, and where the technology is likely to be headed in the next 20 years. You'll also be treated to several "behind-the-scenes" glimpses of cutting-edge research projects under development in the Renewable Energy Materials Center at the Colorado School of Mines, projects like an "ionic transistor" that works with protons instead of electrons, and a special ceramic membrane material that enables the "uphill" diffusion of steam. Associate Professor Ryan O'Hayre's laboratory at the Colorado School of Mines develops new materials and devices to enable alternative energy technologies, including fuel cells and solar cells. Prof. O'Hayre and his students collaborate with the Colorado Fuel Cell Center, the Colorado Center for Advanced Ceramics, the Renewable Energy Materials Science and Engineering Center, and the National Renewable Energy Laboratory. In collaboration with Ann Deml, Jianhua Tong, Svitlana Pylypenko, Archana Subramaniyan, Michael Sanders, Jason Fish, and Annette Bunge, Colorado School of Mines.
The Magellan Final Report on Cloud Computing
DOE Office of Scientific and Technical Information (OSTI.GOV)
,; Coghlan, Susan; Yelick, Katherine
The goal of Magellan, a project funded through the U.S. Department of Energy (DOE) Office of Advanced Scientific Computing Research (ASCR), was to investigate the potential role of cloud computing in addressing the computing needs of the DOE Office of Science (SC), particularly the needs of mid-range computing and future data-intensive computing workloads. A set of research questions was formed to probe various aspects of cloud computing, from performance and usability to cost. To address these questions, a distributed testbed infrastructure was deployed at the Argonne Leadership Computing Facility (ALCF) and the National Energy Research Scientific Computing Center (NERSC). The testbed was designed to be flexible and capable enough to explore a variety of computing models and hardware design points in order to understand the impact on various scientific applications. During the project, the testbed also served as a valuable resource to application scientists. Applications from a diverse set of projects, such as MG-RAST (a metagenomics analysis server), the Joint Genome Institute, the STAR experiment at the Relativistic Heavy Ion Collider, and the Laser Interferometer Gravitational Wave Observatory (LIGO), were used by the Magellan project for benchmarking within the cloud, and the project teams were also able to accomplish important production science utilizing the Magellan cloud resources.
Extending Moore's Law via Computationally Error Tolerant Computing.
Deng, Bobin; Srikanth, Sriseshan; Hein, Eric R.; ...
2018-03-01
Dennard scaling has ended. Lowering the supply voltage (Vdd) to sub-volt levels causes intermittent losses in signal integrity, rendering further downward scaling unacceptable as a means to lower the power required by a processor core. However, it is possible to correct the occasional errors caused by lower Vdd in an efficient manner and effectively lower power. By deploying the right amount and kind of redundancy, we can strike a balance between the overhead incurred in achieving reliability and the energy savings realized by permitting lower Vdd. One promising approach is the Redundant Residue Number System (RRNS) representation. Unlike other error-correcting codes, RRNS has the important property of being closed under addition, subtraction, and multiplication, thus enabling computational error correction at a fraction of the overhead of conventional approaches. We use the RRNS scheme to design a Computationally-Redundant, Energy-Efficient core, including the microarchitecture, Instruction Set Architecture (ISA), and RRNS-centered algorithms. Simulation results show that this RRNS system can reduce the energy-delay product by about 3× for multiplication-intensive workloads and by about 2× in general, compared to a non-error-correcting binary core.
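The closure property that distinguishes RRNS from other error-correcting codes is easy to demonstrate. The toy Python sketch below uses three information moduli and two redundant residues; the moduli are illustrative, and a production RRNS core would use much larger, carefully chosen moduli and hardware-friendly decoding rather than this drop-one search.

```python
from math import prod

MODULI = (3, 5, 7, 11, 13)        # last two moduli are redundant residues
K = 3                              # number of information moduli
LEGIT = prod(MODULI[:K])           # legitimate dynamic range: [0, 105)

def encode(x):
    return tuple(x % m for m in MODULI)

def crt(residues, moduli):
    """Chinese-remainder reconstruction of x from its residues."""
    M = prod(moduli)
    return sum(r * (M // m) * pow(M // m, -1, m)
               for r, m in zip(residues, moduli)) % M

def decode(residues):
    """Correct a single faulty residue: if the full reconstruction falls
    outside the legitimate range, drop each residue in turn and keep the
    reconstruction that lands back inside it."""
    x = crt(residues, MODULI)
    if x < LEGIT:
        return x
    for drop in range(len(MODULI)):
        keep = [i for i in range(len(MODULI)) if i != drop]
        x = crt([residues[i] for i in keep], [MODULI[i] for i in keep])
        if x < LEGIT:
            return x
    raise ValueError("uncorrectable residue vector")

# Addition happens independently on each residue channel (closure under +).
a, b = encode(17), encode(23)
s = tuple((ra + rb) % m for ra, rb, m in zip(a, b, MODULI))
assert decode(s) == 40
faulty = list(s)
faulty[2] = (faulty[2] + 4) % 7    # inject an error in one channel
print(decode(tuple(faulty)))        # still recovers 40
```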
NASA Technical Reports Server (NTRS)
Raju, I. S.
1986-01-01
Q3DG is a computer program developed to perform a quasi-three-dimensional stress analysis of composite laminates that may contain delaminations. The laminates may be subjected to mechanical, thermal, and hygroscopic loads. The program uses the finite element method and models the laminates with eight-noded parabolic isoparametric elements. The program computes the strain-energy-release components in all three modes, and the total strain-energy release, for delamination growth. A rectangular mesh and data file generator, DATGEN, is included. The DATGEN program can be executed interactively and is user-friendly. The documentation includes sections dealing with the Q3D analysis theory and the derivation of element stiffness matrices and consistent load vectors for the parabolic element. Several sample problems, with the input for Q3DG and output from the program, are included. The capabilities of the DATGEN program are illustrated with examples of interactive sessions. A microfiche of all the examples is included. The Q3DG and DATGEN programs have been implemented on CYBER 170 class computers. Q3DG and DATGEN were developed at the Langley Research Center during the early eighties and documented in 1984-1985.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sigrin, Benjamin O
High customer acquisition costs remain a persistent challenge in the U.S. residential solar industry. Effective customer acquisition in the residential solar market is increasingly achieved with the help of data analysis and machine learning, whether that means more targeted advertising, understanding customer motivations, or responding to competitors. New research by the National Renewable Energy Laboratory, Sandia National Laboratories, Vanderbilt University, University of Pennsylvania, and the California Center for Sustainable Energy and funded through the U.S. Department of Energy's Solar Energy Evolution and Diffusion (SEEDS) program demonstrates novel computational methods that can help drive down costs in the residential solar industry.
Overall, the implementation of a computer-controlled hydrogen generation system and subsequent conversion of small engine equipment for hydrogen use has been surprisingly straightforward from an engineering and technology standpoint. More testing is required to get a better gr...
Leadership Team | Water Power | NREL
Learn more about the expertise and technical skills of the water power research team and staff at NREL. Daniel Laird, Center Director, has led wind energy and water power research efforts in structural analysis and simulation and computational methods.
Army Maneuver Center of Excellence
2012-10-18
Strategic partnership agreements throughout DoD and with DARPA, JIEDDO, DHS, FAA, DoE, NSA, NASA, SMDC, and others benefit the Army materiel enterprise. Research thrusts include neuroscience, network sciences, hierarchical computing, extreme energy science, autonomous systems technology, and emerging meso-scale sciences, aimed at improvements in overall Soldier-system performance through operational neuroscience and advanced simulation and training technologies.
NASA Astrophysics Data System (ADS)
Salavati-Fard, Taha; Jenness, Glen; Caratzoulas, Stavros; Doren, Douglas
Using computational methods, the catalytic effects of oxide surfaces on the Diels-Alder reaction between biomass-derived furan and methyl acrylate are investigated. The cycloadduct can subsequently be dehydrated to produce methyl benzoate, an important step toward benzoic acid production. Different systems, such as clean, partially hydroxylated, and fully hydroxylated ZrO2, are considered, and both the Langmuir and Eley-Rideal mechanisms are studied. Our calculations show that the oxide surfaces catalyze the reaction significantly through the interaction of metal sites with the electron-poor reactant. The calculations are interpreted by making use of the total and projected electronic density of states and the band structure of the catalyst. This material is based on work supported as part of the Catalysis Center for Energy Innovation, an Energy Frontier Research Center funded by the U.S. Department of Energy, Office of Science, Office of Basic Energy Sciences under Award Number DE-SC0001004.
NASA Technical Reports Server (NTRS)
Ketchum, Eleanor A. (Inventor)
2000-01-01
A computer-implemented method and apparatus for determining the position of a vehicle to within 100 km, autonomously, from magnetic field measurements and attitude data, without a priori knowledge of position. An inverted dipole solution, giving two possible position solutions for each measurement of magnetic field data, is deterministically calculated by a program-controlled processor solving the inverted first-order spherical harmonic representation of the geomagnetic field for two unit position vectors 180 degrees apart and the vehicle's distance from the center of the Earth. Correction schemes such as successive substitutions and a Newton-Raphson method are applied to each dipole solution. The two position solutions for each measurement are saved separately. Velocity vectors for the position solutions are calculated so that a total energy difference for each of the two resultant position paths is computed. The position path with the smaller absolute total energy difference is chosen as the true position path of the vehicle.
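The distance step of such a scheme can be sketched compactly. For a pure dipole, the field magnitude at magnetic latitude λ is |B| = B0 (RE/r)^3 sqrt(1 + 3 sin²λ), so Newton-Raphson iteration on r converges quickly; the toy below mirrors that correction step, while the real method works with the full first-order spherical harmonic model and both candidate positions. Constants and tolerances are illustrative.

```python
import math

B0 = 3.12e-5      # approximate mean equatorial surface field, tesla
RE = 6.371e6      # Earth radius, m

def dipole_B(r, lat):
    """Dipole field magnitude at geocentric distance r, magnetic latitude lat."""
    return B0 * (RE / r) ** 3 * math.sqrt(1.0 + 3.0 * math.sin(lat) ** 2)

def radius_from_B(B_meas, lat, r0=7.0e6, tol=1e-6):
    """Newton-Raphson for r such that dipole_B(r, lat) = B_meas."""
    r = r0
    for _ in range(50):
        f = dipole_B(r, lat) - B_meas
        dfdr = -3.0 * dipole_B(r, lat) / r      # d|B|/dr for a dipole
        step = f / dfdr
        r -= step
        if abs(step) < tol * r:
            return r
    raise RuntimeError("no convergence")

# Round-trip check at 500 km altitude, 30 deg magnetic latitude.
r_true = RE + 5.0e5
B = dipole_B(r_true, math.radians(30.0))
print(radius_from_B(B, math.radians(30.0)) - r_true)  # ~0
```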
Shrestha, Kushal; Jakubikova, Elena
2015-08-20
Light-harvesting antennas are protein-pigment complexes that play a crucial role in natural photosynthesis. The antenna complexes absorb light and transfer energy to photosynthetic reaction centers where charge separation occurs. This work focuses on computational studies of the electronic structure of the pigment networks of light-harvesting complex I (LH1), LH1 with the reaction center (RC-LH1), and light-harvesting complex II (LH2) found in purple bacteria. As the pigment networks of LH1, RC-LH1, and LH2 contain thousands of atoms, conventional density functional theory (DFT) and ab initio calculations of these systems are not computationally feasible. Therefore, we utilize DFT in conjunction with the energy-based fragmentation with molecular orbitals method and a semiempirical approach employing the extended Hückel model Hamiltonian to determine the electronic properties of these pigment assemblies. Our calculations provide a deeper understanding of the electronic structure of natural light-harvesting complexes, especially their pigment networks, which could assist in rational design of artificial photosynthetic devices.
Ultra-low-energy analog straintronics using multiferroic composites
NASA Astrophysics Data System (ADS)
Roy, Kuntal
2014-03-01
Multiferroic devices, i.e., a magnetostrictive nanomagnet strain-coupled with a piezoelectric layer, are promising as binary switches for ultra-low-energy digital computing in the beyond-Moore's-law era [Roy, K. Appl. Phys. Lett. 103, 173110 (2013); Roy, K. et al. Appl. Phys. Lett. 99, 063108 (2011); Phys. Rev. B 83, 224412 (2011); Scientific Reports (Nature Publishing Group) 3, 3038 (2013); J. Appl. Phys. 112, 023914 (2012)]. We show here that such multiferroic devices, apart from performing digital computation, can also be utilized for analog computing purposes, e.g., voltage amplification and filtering. The analog computing capability arises because the magnetization's mean orientation shifts gradually even though the nanomagnet's potential minima change abruptly. Using tunneling magnetoresistance (TMR) measurement, a continuous output voltage can be produced as the input voltage is varied. The stochastic Landau-Lifshitz-Gilbert (LLG) equation in the presence of room-temperature (300 K) thermal fluctuations is solved to demonstrate the analog computing capability of such multiferroic devices. This work was supported in part by FAME, one of six centers of STARnet, a Semiconductor Research Corporation program sponsored by MARCO and DARPA.
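The stochastic LLG equation referred to here is straightforward to integrate for a single macrospin. Below is a hedged Python sketch using Heun's method in dimensionless units (gamma = 1); the effective field, damping, and noise amplitude are illustrative and not calibrated to the 300 K device in the paper.

```python
import numpy as np

def llg_heun(m0, h_eff, alpha=0.1, dt=1e-3, steps=5000,
             noise=0.05, rng=np.random.default_rng(1)):
    """Heun integration of the stochastic Landau-Lifshitz-Gilbert equation
    for one macrospin. h_eff(m) returns the deterministic effective field;
    `noise` sets the thermal-field amplitude (illustrative only)."""
    def rhs(m, h):
        pre = -1.0 / (1.0 + alpha ** 2)
        mxh = np.cross(m, h)
        return pre * (mxh + alpha * np.cross(m, mxh))

    m = np.asarray(m0, float)
    traj = [m.copy()]
    for _ in range(steps):
        h_th = noise / np.sqrt(dt) * rng.standard_normal(3)
        k1 = rhs(m, h_eff(m) + h_th)
        m_pred = m + dt * k1
        m_pred /= np.linalg.norm(m_pred)
        k2 = rhs(m_pred, h_eff(m_pred) + h_th)   # same noise sample per step
        m = m + 0.5 * dt * (k1 + k2)
        m /= np.linalg.norm(m)                   # keep |m| = 1
        traj.append(m.copy())
    return np.array(traj)

# Uniaxial easy axis along z plus a transverse (stress-induced) field along x.
traj = llg_heun(m0=[0.0, 0.1, 0.99],
                h_eff=lambda m: np.array([0.3, 0.0, m[2]]))
print(traj[-1])   # mean orientation rotated gradually toward the x field
```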
A Novel Cost Based Model for Energy Consumption in Cloud Computing
Horri, A.; Dastghaibyfard, Gh.
2015-01-01
Cloud data centers consume enormous amounts of electrical energy. To support green cloud computing, providers also need to minimize cloud infrastructure energy consumption while maintaining quality of service (QoS). In this study, an energy consumption model is proposed for the time-shared policy in the virtualization layer of cloud environments. The cost and energy usage of the time-shared policy were modeled in the CloudSim simulator based upon results obtained from a real system, and the proposed model was then evaluated under different scenarios. In the proposed model, cache interference costs were considered; these costs were based upon the size of the data. The model was implemented in the CloudSim simulator, and the related simulation results indicate that the energy consumption may be considerable and can vary with parameters such as the quantum parameter, data size, and the number of VMs on a host. Measured results validate the model and demonstrate that there is a tradeoff between energy consumption and QoS in the cloud environment. PMID:25705716
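The ingredients of such a model (a linear utilization-to-power map plus a data-size-dependent cache-interference cost) can be sketched compactly. A minimal Python illustration follows; p_idle, p_max, and the per-MB penalty are assumed placeholder coefficients, not values fitted in the paper.

```python
def host_energy(utilization_trace, dt, p_idle=100.0, p_max=250.0,
                cache_penalty_j_per_mb=0.02, data_mb_per_step=None):
    """Energy (joules) of one host over a utilization trace.

    utilization_trace : CPU utilization in [0, 1] at each time step
    dt                : step length in seconds
    p_idle, p_max     : linear power-model endpoints in watts (illustrative)
    cache_penalty_j_per_mb : extra energy per MB of working set touched when
                             time-shared VMs evict each other's cache lines
                             (illustrative coefficient, not from the paper)
    """
    energy = 0.0
    for i, u in enumerate(utilization_trace):
        energy += (p_idle + (p_max - p_idle) * u) * dt   # linear power model
        if data_mb_per_step:
            energy += cache_penalty_j_per_mb * data_mb_per_step[i]
    return energy

# One hour at 50% utilization, with and without cache interference.
base = host_energy([0.5] * 3600, dt=1.0)
shared = host_energy([0.5] * 3600, dt=1.0, data_mb_per_step=[40.0] * 3600)
print(f"interference overhead: {shared - base:.0f} J")
```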
NASA Technical Reports Server (NTRS)
Venkatapathy, Ethiraj; Gulhan, Ali; Aftosmis, Michael; Brock, Joseph; Mathias, Donovan; Need, Dominic; Rodriguez, David; Seltner, Patrick; Stern, Eric; Wiles, Sebastian
2017-01-01
An airburst from a large asteroid during atmospheric entry can cause significant ground damage, which depends on the energy and altitude of the airburst. Breakup of asteroids into fragments and the lateral spread of those fragments have been observed. Modeling the underlying physics of fragmented bodies interacting at hypersonic speeds, and the resulting spread of fragments, is needed for a true predictive capability. Current models predict airburst damage using heuristic arguments and assumptions, such as pancaking, point-source explosive energy release at a predetermined altitude, or an assumed fragmentation spread rate. A multi-year collaboration between the German Aerospace Center (DLR) and NASA has been established to develop validated computational tools to address this challenge.
NASA Astrophysics Data System (ADS)
Sarkar, Supratik; Sarkar, Samrat; Bose, Chayanika
2018-07-01
We present a general formulation of the ground-state binding energy of a shallow hydrogenic impurity in a spherical quantum dot with parabolic confinement, considering the effects of polarization and self-energy. The variational approach within the effective mass approximation is employed. The binding energy of an on-center impurity is computed for a GaAs/AlxGa1-xAs quantum dot as a function of the dot size, with the dot barrier as a parameter. The influences of polarization and self-energy are also treated separately. Results indicate that the binding energy increases due to the presence of polarization charge, while it decreases due to the self-energy of the carrier. An overall enhancement in impurity binding energy, especially for small dots, is noted.
Collisional Penrose process with spinning particles
NASA Astrophysics Data System (ADS)
Mukherjee, Sajal
2018-03-01
In this article, we investigate the collisional Penrose process (CPP) using spinning particles in a Kerr spacetime. Recent studies have shown that the collision between two spinning particles can produce significantly high energy in the center-of-mass frame. Here, we explicitly compute the energy extraction and efficiency as measured by an observer at infinity. We consider that both the colliding particles and the escaping particles may carry spin. It is shown that the energy extraction is larger than in the non-spinning case, and that the window for the escaping particles to reach infinity is wider than for geodesics.
Incident Energy Focused Design and Validation for the Floating Potential Probe
NASA Technical Reports Server (NTRS)
Fincannon, James
2002-01-01
Utilizing the spacecraft shadowing and incident energy analysis capabilities of the SPACE (System Power Analysis for Capability Evaluation) computer code from the NASA Glenn Research Center Power and Propulsion Office, this paper documents the analyses for various International Space Station (ISS) Floating Potential Probe (FPP) preliminary design options. These options include various solar panel orientations and configurations as well as deployment locations on the ISS. The incident energy for the final selected option is characterized. A good correlation between the predicted data and on-orbit operational telemetry is demonstrated. Minor deviations are postulated to be induced by degradation or sensor drift.
Vossenaar, M; Jaramillo, P M; Soto-Méndez, M-J; Panday, B; Hamelinck, V; Bermúdez, O I; Doak, C M; Mathias, P; Solomons, N W
2012-12-01
Adequate nutrition is critical to child development, and institutions such as day-care centers could potentially complement children's diets to achieve optimal daily intakes. The aim of the study was to describe the full-day diet of children, examining and contrasting the relative contribution of home-derived versus institutional energy and nutrient sources. The present comparison should be considered in the domain of a case-study format. The diets of 33 children, 3-6 y old, attending low-income day-care centers serving either three meals or a single meal were examined. The home diet was assessed by means of 3 non-consecutive 24-hr recalls. Estimated energy and nutrient intakes at the centers and at home were assessed and related to Recommended Nutrient Intakes (RNI). Nutrient densities, critical densities, and main sources of nutrients were computed. We observed that in children attending the day-care center serving three meals, home foods contributed less than half the daily energy (47.7%) and between 29.9% and 53.5% of daily nutrients. In children receiving only lunch outside the home, the energy contribution from the home was 83.9%, and 304 kcal lower than for children receiving three meals. Furthermore, between 59.0% and 94.8% of daily nutrients were provided at home. Daily energy, nutrient intakes, and nutrient densities were well above the nutrient requirements for this age group, and particularly high for vitamin A. The overall dietary variety was superior in the situation of greater contribution of home fare, but overall the nutrient density and adequacy of the aggregate intakes did not differ in any important manner.
Outer Membrane Protein Folding and Topology from a Computational Transfer Free Energy Scale.
Lin, Meishan; Gessmann, Dennis; Naveed, Hammad; Liang, Jie
2016-03-02
Knowledge of the transfer free energy of amino acids from aqueous solution to a lipid bilayer is essential for understanding membrane protein folding and for predicting membrane protein structure. Here we report a computational approach that can calculate the folding free energy of the transmembrane region of outer membrane β-barrel proteins (OMPs) by combining an empirical energy function with a reduced discrete state space model. We quantitatively analyzed the transfer free energies of 20 amino acid residues at the center of the lipid bilayer of OmpLA. Our results are in excellent agreement with the experimentally derived hydrophobicity scales. We further exhaustively calculated the transfer free energies of 20 amino acids at all positions in the TM region of OmpLA. We found that the asymmetry of the Gram-negative bacterial outer membrane, as well as the TM residues of an OMP, determines its functional fold in vivo. Our results suggest that the folding process of an OMP is driven by the lipid-facing residues in its hydrophobic core, and that its NC-IN topology is determined by the differential stabilities of OMPs in the asymmetrical outer membrane. The folding free energy is further reduced by lipid A and assisted by general depth-dependent cooperativities that exist between polar and ionizable residues. Moreover, the context-dependency of transfer free energies at specific positions in OmpLA predicts regions important for protein function as well as structural anomalies. Our computational approach is fast, efficient, and applicable to any OMP.
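The additive form of such an empirical energy function is easy to illustrate. Below is a toy Python sketch that sums depth-weighted transfer energies over lipid-facing residues; the scale values and the linear depth weight are rough placeholders, not the paper's fitted parameters.

```python
# Rough per-residue transfer energies (kcal/mol), placeholders only.
SCALE = {"L": -1.8, "I": -1.6, "V": -1.2, "F": -1.5, "A": -0.5,
         "G": 0.0, "S": 0.4, "T": 0.3, "N": 1.0, "Q": 1.1,
         "D": 2.0, "E": 2.1, "K": 2.5, "R": 2.6, "Y": -0.4, "W": -1.0}

def depth_weight(z, half_thickness=15.0):
    """1 at the bilayer center (z = 0), tapering to 0 at the head groups."""
    return max(0.0, 1.0 - abs(z) / half_thickness)

def folding_free_energy(lipid_facing):
    """Sum weighted transfer energies over (residue, depth-in-angstrom) pairs."""
    return sum(SCALE.get(aa, 0.0) * depth_weight(z) for aa, z in lipid_facing)

# Toy strand: alternating lipid-facing residues at 4 A spacing around z = 0.
strand = [("L", -6.0), ("V", -2.0), ("F", 2.0), ("I", 6.0), ("S", 10.0)]
print(f"dG ~ {folding_free_energy(strand):.2f} kcal/mol")
```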
DOE Office of Scientific and Technical Information (OSTI.GOV)
Robinson, Alastair; Regnier, Cindy; Settlemyre, Kevin
Massachusetts Institute of Technology (MIT) partnered with the U.S. Department of Energy (DOE) to develop and implement solutions to retrofit existing buildings to reduce energy consumption by at least 30% as part of DOE's Commercial Building Partnerships (CBP) Program. Lawrence Berkeley National Laboratory (LBNL) provided technical expertise in support of this DOE program. MIT is one of the U.S.'s foremost higher education institutions, occupying a campus that is nearly 100 years old, with a building floor area totaling more than 12 million square feet. The CBP project focused on improving the energy performance of two campus buildings, the Ray and Maria Stata Center (RMSC) and the Building W91 (BW91) data center. A key goal of the project was to identify energy-saving measures that could be applied to other buildings both within MIT's portfolio and at other higher education institutions. The CBP retrofits at MIT are projected to reduce energy consumption by approximately 48%, including a reduction of around 72% in RMSC lighting energy and a reduction of approximately 55% in RMSC server room HVAC energy. The energy efficiency measure (EEM) package proposed for the BW91 data center is expected to reduce heating, ventilation, and air-conditioning (HVAC) energy use by 30% to 50%, depending on the final air intake temperature that is established for the server racks. The RMSC, an iconic building designed by Frank Gehry, houses the Computer Science and Artificial Intelligence Laboratory, the Laboratory for Information and Decision Systems, and the Department of Linguistics and Philosophy.
Online production validation in a HEP environment
NASA Astrophysics Data System (ADS)
Harenberg, T.; Kuhl, T.; Lang, N.; Mättig, P.; Sandhoff, M.; Schwanenberger, C.; Volkmer, F.
2017-03-01
In high energy physics (HEP) event simulations, petabytes of data are processed and stored, requiring millions of CPU-years. This enormous demand for computing resources is handled by centers distributed worldwide, which form part of the LHC computing grid. Consuming resources on this scale demands efficient production of simulations and early detection of potential errors. In this article we present a new monitoring framework for grid environments, which polls a measure of data quality during job execution. This online monitoring facilitates the early detection of configuration errors (especially in simulation parameters) and may thus contribute to significant savings in computing resources.
How Data Becomes Physics: Inside the RACF
Ernst, Michael; Rind, Ofer; Rajagopalan, Srini; Lauret, Jerome; Pinkenburg, Chris
2018-06-22
The RHIC & ATLAS Computing Facility (RACF) at the U.S. Department of Energy's (DOE) Brookhaven National Laboratory sits at the center of a global computing network. It connects more than 2,500 researchers around the world with the data generated by millions of particle collisions taking place each second at Brookhaven Lab's Relativistic Heavy Ion Collider (RHIC, a DOE Office of Science User Facility for nuclear physics research) and the ATLAS experiment at the Large Hadron Collider in Europe. Watch this video to learn how the people and computing resources of the RACF serve these scientists to turn petabytes of raw data into physics discoveries.
NASA Technical Reports Server (NTRS)
1987-01-01
Philip Morris research center scientists use a computer program called CECTRP (Chemical Equilibrium Composition and Transport Properties) to gain insight into the behavior of atoms as they progress along a reaction pathway. Use of the program lets scientists accurately predict the behavior of a given molecule or group of molecules. Computer-generated data must be checked by laboratory experiment, but the use of CECTRP saves the researchers hundreds of hours of laboratory time, since experiments need run only to validate the computer's predictions. Philip Morris estimates that had CECTRP not been available, at least two man-years would have been required to develop a program to perform similar free energy calculations.
Barmak, Katayun; Liu, Jiaxing; Harlan, Liam; Xiao, Penghao; Duncan, Juliana; Henkelman, Graeme
2017-10-21
The enthalpy and activation energy for the transformation of the metastable form of tungsten, β-W, which has the topologically close-packed A15 structure (space group Pm-3n), to equilibrium α-W, which is body-centered cubic (A2, space group Im-3m), was measured using differential scanning calorimetry. The β-W films were 1 μm thick and were prepared by sputter deposition in argon with a small amount of nitrogen. The transformation enthalpy was measured as -8.3 ± 0.4 kJ/mol (-86 ± 4 meV/atom) and the transformation activation energy as 2.2 ± 0.1 eV. The measured enthalpy was found to agree well with the difference in energies of α and β tungsten computed using density functional theory, which gave a value of -82 meV/atom for the transformation enthalpy. A calculated concerted transformation mechanism with a barrier of 0.4 eV/atom, in which all the atoms in an A15 unit cell transform into A2, was found to be inconsistent with the experimentally measured activation energy for any critical nucleus larger than two A2 unit cells. Larger calculations of eight A15 unit cells spontaneously relax to a mechanism in which part of the supercell first transforms from A15 to A2, creating a phase boundary, before the remaining A15 transforms into the A2 phase. Both calculations indicate that a nucleation and growth mechanism is favored over a concerted transformation. More consistent with the experimental activation energy was that of a calculated local transformation mechanism at the A15-A2 phase boundary, computed as 1.7 eV using molecular dynamics simulations. This calculated phase transformation mechanism involves collective rearrangements of W atoms in the disordered interface separating the A15 and A2 phases.
Modeling the Proton Radiation Belt With Van Allen Probes Relativistic Electron-Proton Telescope Data
NASA Technical Reports Server (NTRS)
Kanekal, S. G.; Li, X.; Baker, D. N.; Selesnick, R. S.; Hoxie, V. C.
2018-01-01
An empirical model of the proton radiation belt is constructed from data taken during 2013-2017 by the Relativistic Electron-Proton Telescopes on the Van Allen Probes satellites. The model intensity is a function of time, kinetic energy in the range 18-600 megaelectronvolts, equatorial pitch angle, and L shell of proton guiding centers. Data are selected, on the basis of energy deposits in each of the nine silicon detectors, to reduce background caused by hard proton energy spectra at low L. Instrument response functions are computed by Monte Carlo integration, using simulated proton paths through a simplified structural model, to account for energy loss in shielding material for protons outside the nominal field of view. Overlap of energy channels, their wide angular response, and changing satellite orientation require the model dependencies on all three independent variables be determined simultaneously. This is done by least squares minimization with a customized steepest descent algorithm. Model uncertainty accounts for statistical data error and systematic error in the simulated instrument response. A proton energy spectrum is also computed from data taken during the 8 January 2014 solar event, to illustrate methods for the simpler case of an isotropic and homogeneous model distribution. Radiation belt and solar proton results are compared to intensities computed with a simplified, on-axis response that can provide a good approximation under limited circumstances.
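The fitting step lends itself to a compact illustration. Below is a toy Python sketch of least squares by steepest descent with a numerical gradient, using a stand-in power-law spectrum and a fabricated Gaussian response matrix in place of the Monte Carlo instrument response; every parameter is illustrative, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
E = np.linspace(20.0, 600.0, 60)          # proton energy grid, MeV

def model_flux(theta):
    """Toy power-law spectrum j(E) = A * E**(-gamma), a stand-in for the
    paper's model in time, energy, pitch angle, and L shell."""
    logA, gamma = theta
    return np.exp(logA) * E ** (-gamma)

# Fabricated response matrix: four channels, each integrating a band of
# energies (the real responses come from Monte Carlo integration).
R = np.array([[np.exp(-0.5 * ((e - c) / 40.0) ** 2) for e in E]
              for c in (50.0, 120.0, 250.0, 450.0)])

def chi2(theta, counts, sigma):
    return np.sum(((counts - R @ model_flux(theta)) / sigma) ** 2)

def fit(counts, sigma, theta0, step=1e-3, iters=4000, h=1e-5):
    """Steepest descent: fixed step along the normalized downhill direction,
    with a central-difference numerical gradient."""
    theta = np.array(theta0, float)
    for _ in range(iters):
        g = np.array([(chi2(theta + h * np.eye(2)[k], counts, sigma) -
                       chi2(theta - h * np.eye(2)[k], counts, sigma)) / (2 * h)
                      for k in range(2)])
        theta -= step * g / (np.linalg.norm(g) + 1e-12)
    return theta

true = np.array([10.0, 2.0])               # log-amplitude, spectral index
counts = rng.poisson(R @ model_flux(true)).astype(float)
print(fit(counts, np.sqrt(counts + 1.0), theta0=[9.0, 1.5]))  # ~ [10, 2]
```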
None
2017-12-09
Learn what it will take to create tomorrow's net-zero energy home as scientists reveal the secrets of cool roofs, smart windows, and computer-driven energy control systems. The net-zero energy home: Scientists are working to make tomorrow's homes more than just energy efficient -- they want them to be zero energy. Iain Walker, a scientist in the Lab's Energy Performance of Buildings Group, will discuss what it takes to develop net-zero energy houses that generate as much energy as they use through highly aggressive energy efficiency and on-site renewable energy generation. Talking back to the grid: Imagine programming your house to use less energy if the electricity grid is full or prices are high. Mary Ann Piette, deputy director of Berkeley Lab's building technology department and director of the Lab's Demand Response Research Center, will discuss how new technologies are enabling buildings to listen to the grid and automatically change their thermostat settings or lighting loads, among other demands, in response to fluctuating electricity prices. The networked (and energy efficient) house: In the future, your home's lights, climate control devices, computers, windows, and appliances could be controlled via a sophisticated digital network. If it's plugged in, it'll be connected. Bruce Nordman, an energy scientist in Berkeley Lab's Energy End-Use Forecasting group, will discuss how he and other scientists are working to ensure these networks help homeowners save energy.
Planning and management of cloud computing networks
NASA Astrophysics Data System (ADS)
Larumbe, Federico
The evolution of the Internet has a great impact on a large part of the population. People use it to communicate, query information, receive news, work, and for entertainment. Its extraordinary usefulness as a communication medium made the number of applications and technological resources explode. However, that network expansion comes at the cost of substantial power consumption. If the power consumption of telecommunication networks and data centers were counted as that of a country, it would rank 5th in the world. Furthermore, the number of servers in the world is expected to grow by a factor of 10 between 2013 and 2020. This context motivates us to study techniques and methods to allocate cloud computing resources in an optimal way with respect to cost, quality of service (QoS), power consumption, and environmental impact. The results we obtained from our test cases show that besides minimizing capital expenditures (CAPEX) and operational expenditures (OPEX), the response time can be reduced by up to 6 times, power consumption by 30%, and CO2 emissions by a factor of 60. Cloud computing provides dynamic access to IT resources as a service. In this paradigm, programs are executed on servers connected to the Internet that users access from their computers and mobile devices. The first advantage of this architecture is reduced application deployment time and improved interoperability, because a new user only needs a web browser and does not need to install software on local computers with specific operating systems. Second, applications and information are available from everywhere and from any device with Internet access. Also, servers and IT resources can be dynamically allocated depending on the number of users and the workload, a feature called elasticity. This thesis studies the resource management of cloud computing networks and is divided into three main stages. We start by analyzing the planning of cloud computing networks to get a comprehensive vision. The first question to be solved is where to locate the data centers. We found that the location of each data center has a big impact on cost, QoS, power consumption, and greenhouse gas emissions. An optimization problem with a multi-criteria objective function is proposed to decide jointly the optimal location of data centers and software components, link capacities, and information routing. Once the network planning has been analyzed, the problem of dynamic resource provisioning in real time is addressed. In this context, virtualization is a key technique in cloud computing, because each server can be shared by multiple Virtual Machines (VMs) and the total power consumption can be reduced. In the same line of location problems, we propose a Green Cloud Broker that optimizes VM placement across multiple data centers. In fact, when multiple data centers are considered, response time can be reduced by placing VMs close to users, cost can be minimized, power consumption can be optimized by using energy-efficient data centers, and CO2 emissions can be decreased by choosing data centers supplied with renewable energy sources. The third stage of the analysis is the short-term management of a cloud data center. In particular, a method is proposed to assign VMs to servers by considering the communication traffic among VMs. Cloud data centers receive new applications over time, and these applications need on-demand resource provisioning.
Each application is composed of multiple types of VMs that interact among themselves. A program called the scheduler must place each new VM on a server, and that placement impacts QoS and power consumption. Our method places VMs that communicate among themselves on servers that are close to each other in the network topology, thus reducing communication delay and increasing the throughput available among VMs. Furthermore, the power consumption of each type of server is considered, and the most efficient servers are chosen to host the VMs. The number of VMs of each application can be dynamically changed to match the workload, and servers not needed in a particular period can be suspended to save energy. The methodology developed is based on Mixed Integer Programming (MIP) models to formalize the problems, solved with state-of-the-art optimization solvers. Heuristics are then developed to solve cases with more than 1,000 potential data center locations for the planning problem, 1,000 nodes for the cloud broker, and 128,000 servers for the VM placement problem. Solutions with very small optimality gaps, between 0% and 1.95%, are obtained, with execution times on the order of minutes for the planning problem and under a second for real-time cases. We consider that this thesis on resource provisioning of cloud computing networks makes important contributions to this research area, and that innovative commercial applications based on the proposed methods have a promising future.
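The planning stage described above reads naturally as a capacitated facility-location MIP. Here is a small hedged sketch using the PuLP modeling library; the site names, demands, capacities, and cost coefficients are invented for illustration, and the thesis's actual multi-criteria objective also weighs QoS, power, and CO2 terms.

```python
from pulp import LpProblem, LpMinimize, LpVariable, lpSum, LpBinary, value

sites = ["quebec", "oregon", "texas"]
users = {"east": 40, "west": 25, "south": 20}       # demand, in VM units
open_cost = {"quebec": 90.0, "oregon": 110.0, "texas": 80.0}
capacity = {"quebec": 60, "oregon": 50, "texas": 45}
# Per-VM assignment cost blending energy price and latency (illustrative).
assign_cost = {("quebec", "east"): 1.0, ("quebec", "west"): 3.0, ("quebec", "south"): 2.5,
               ("oregon", "east"): 3.0, ("oregon", "west"): 1.0, ("oregon", "south"): 2.0,
               ("texas", "east"): 2.0, ("texas", "west"): 2.0, ("texas", "south"): 1.0}

prob = LpProblem("dc_location", LpMinimize)
y = {s: LpVariable(f"open_{s}", cat=LpBinary) for s in sites}
x = {(s, u): LpVariable(f"x_{s}_{u}", lowBound=0) for s in sites for u in users}

# Objective: fixed site-opening costs plus demand-weighted assignment costs.
prob += lpSum(open_cost[s] * y[s] for s in sites) + \
        lpSum(assign_cost[s, u] * x[s, u] for s in sites for u in users)
for u, d in users.items():                  # every region fully served
    prob += lpSum(x[s, u] for s in sites) == d
for s in sites:                             # capacity counts only if site open
    prob += lpSum(x[s, u] for u in users) <= capacity[s] * y[s]

prob.solve()
print({s: value(y[s]) for s in sites})      # which sites to open
```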
Sah, Chitranjan; Yadav, Ajit Kumar; Venkataramani, Sugumar
2018-06-21
Computational studies have been carried out on five-membered heterocycles with a single heteroatom and their isomeric dehydro radicals: borole 1a-1c, cyclopentadiene 2a-2c, pyrrole 3a-3c, furan 4b-4c, phosphole 5a-5c, and thiophene 6b-6c. Geometrical aspects were examined through ground-state electronic structures, and stability was assessed using bond dissociation energies (BDEs) and radical stabilization energies (RSEs). Spin densities, electrostatic potentials (ESPs), and natural bond orbital (NBO) analysis unveiled the extent of spin delocalization. The estimated nucleus-independent chemical shift (NICS) values revealed differences in the aromaticity characteristics of the radicals. In particular, the heteroatom-centered radicals exhibit odd-electron π-delocalized systems with a quasi-antiaromatic character. Various factors, such as the relative position of the radical center with respect to the heteroatom, resonance, ring strain, and orbital interactions, influence the stability, which follows the order heteroatom-centered > β-centered > α-centered radicals. Among these factors, we confirmed a competition between delocalization and ring strain, whose interplay decides the overall stability order.
Analysis of Application Power and Schedule Composition in a High Performance Computing Environment
DOE Office of Scientific and Technical Information (OSTI.GOV)
Elmore, Ryan; Gruchalla, Kenny; Phillips, Caleb
As the capacity of high performance computing (HPC) systems continues to grow, small changes in energy management have the potential to produce significant energy savings. In this paper, we employ an extensive informatics system for aggregating and analyzing real-time performance and power use data to evaluate energy footprints of jobs running in an HPC data center. We look at the effects of algorithmic choices for a given job on the resulting energy footprints, analyze application-specific power consumption, and summarize average power use in the aggregate. All of these views reveal meaningful power variance between classes of applications as well as chosen methods for a given job. Using these data, we discuss energy-aware cost-saving strategies based on reordering the HPC job schedule. Using historical job and power data, we present a hypothetical job schedule reordering that: (1) reduces the facility's peak power draw and (2) manages power in conjunction with a large-scale photovoltaic array. Lastly, we leverage this data to understand the practical limits on predicting key power use metrics at the time of submission.
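The reordering idea is easy to prototype. Below is a hedged Python sketch that greedily starts each job at the time slot minimizing the resulting facility peak, given per-job duration and mean power; the job data, slot granularity, and longest-job-first ordering are illustrative choices, not the paper's method.

```python
def reorder_for_peak(jobs, horizon):
    """Greedy schedule: jobs = list of (duration_slots, mean_power_kw),
    processed in decreasing order of total energy; each job starts at the
    slot that minimizes the resulting peak. Returns (schedule, profile)."""
    profile = [0.0] * horizon
    schedule = []
    for dur, power in sorted(jobs, key=lambda j: j[0] * j[1], reverse=True):
        best_start, best_peak = 0, float("inf")
        for start in range(horizon - dur + 1):
            # Peak draw if this job ran in [start, start + dur).
            peak = max(profile[t] + power for t in range(start, start + dur))
            if peak < best_peak:
                best_start, best_peak = start, peak
        for t in range(best_start, best_start + dur):
            profile[t] += power
        schedule.append((best_start, dur, power))
    return schedule, profile

# Toy job mix: (duration in slots, mean power in kW).
jobs = [(4, 120.0), (3, 200.0), (2, 80.0), (6, 60.0), (1, 300.0)]
sched, prof = reorder_for_peak(jobs, horizon=12)
print("peak draw:", max(prof), "kW")
```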
First-principles Studies of Ferroelectricity in BiMnO3 Thin Films
NASA Astrophysics Data System (ADS)
Wang, Yun-Peng; Cheng, Hai-Ping
The ferroelectricity of BiMnO3 thin films is a long-standing problem. We employed first-principles density functional theory with inclusion of local Hubbard Coulomb (U) and exchange (J) terms. The parameters U and J are optimized to reproduce the atomic structure and the energy gap of bulk C2/c BiMnO3. With these optimal U and J parameters, the calculated ferromagnetic Curie temperature and lattice dynamics properties agree with experiments. We then studied the ferroelectricity of few-layer BiMnO3 thin films on SrTiO3(001) substrates. Our calculations identified ferroelectricity in monolayer, bilayer, and trilayer BiMnO3 thin films. We find that the energy barrier for 90° rotation of the electric polarization is about 3-4 times larger than that of conventional ferroelectric materials. This work was supported by the US Department of Energy (DOE), Office of Basic Energy Sciences (BES), under Contract No. DE-FG02-02ER45995. Computations were done using the utilities of the National Energy Research Scientific Computing Center (NERSC).
Documentary of MFENET, a national computer network
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shuttleworth, B.O.
1977-06-01
The national Magnetic Fusion Energy Computer Network (MFENET) is a newly operational star network of geographically separated heterogeneous hosts and a communications subnetwork of PDP-11 processors. Host processors interfaced to the subnetwork currently include a CDC 7600 at the Central Computer Center (CCC) and several DECsystem-10's at User Service Centers (USCs). The network was funded by a U.S. government agency (ERDA) to provide, in an economical manner, the needed computational resources to magnetic confinement fusion researchers. Phase I operation of MFENET distributed the processing power of the CDC 7600 among the USCs through the provision of file transport between any two hosts and remote job entry to the 7600. Extending the capabilities of Phase I, MFENET Phase II provided interactive terminal access to the CDC 7600 from the USCs. A file management system is maintained at the CCC for all network users. The history and development of MFENET are discussed, with emphasis on the protocols used to link the host computers and the USC software. Comparisons are made of MFENET versus ARPANET (Advanced Research Projects Agency Computer Network) and DECNET (Digital Distributed Network Architecture). DECNET and MFENET host-to-host, host-to-CCP, and link protocols are discussed in detail. The USC-CCP interface is described briefly. 43 figures, 2 tables.
NASA Technical Reports Server (NTRS)
1981-01-01
Turbonetics Energy, Inc.'s steam turbines are used as power generating systems in the oil and gas, chemical, pharmaceuticals, metals and mining, and pulp and paper industries. The Turbonetics line benefited from use of NASA research data on radial inflow steam turbines, from company contact with personnel of Lewis Research Center, and from use of Lewis-developed computer programs to determine the performance characteristics of turbines.
Relativistic Collisions of Highly-Charged Ions
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ionescu, Dorin; Belkacem, Ali
1998-11-19
The physics of elementary atomic processes in relativistic collisions between highly-charged ions and atoms or other ions is briefly discussed, and some recent theoretical and experimental results in this field are summarized. They include excitation, capture, ionization, and electron-positron pair creation. The numerical solution of the two-center Dirac equation in momentum space is shown to be a powerful nonperturbative method for describing atomic processes in relativistic collisions involving heavy and highly-charged ions. By propagating negative-energy wave packets in time, the evolution of the QED vacuum around heavy ions in relativistic motion is investigated. Recent results obtained from numerical calculations using massively parallel processing on the Cray T3E supercomputer of the National Energy Research Scientific Computing Center (NERSC) at Berkeley National Laboratory are presented.
Dynamic provisioning of local and remote compute resources with OpenStack
NASA Astrophysics Data System (ADS)
Giffels, M.; Hauth, T.; Polgart, F.; Quast, G.
2015-12-01
Modern high-energy physics experiments rely on the extensive usage of computing resources, both for the reconstruction of measured events and for Monte Carlo simulation. The Institut für Experimentelle Kernphysik (EKP) at KIT is participating in both the CMS and Belle experiments with computing and storage resources. In the upcoming years, these requirements are expected to increase due to the growing amount of recorded data and the rising complexity of the simulated events. It is therefore essential to increase the available computing capabilities by tapping into all resource pools. At the EKP institute, powerful desktop machines are available to users. Due to the multi-core nature of modern CPUs, vast amounts of CPU time are not utilized by common desktop usage patterns. Other important providers of compute capabilities are classical HPC data centers at universities or national research centers. Due to the shared nature of these installations, the standardized software stack required by HEP applications cannot be installed. A viable way to overcome this constraint and offer a standardized software environment in a transparent manner is the usage of virtualization technologies. The OpenStack project has become a widely adopted solution to virtualize hardware and offer additional services like storage and virtual machine management. This contribution reports on the incorporation of the institute's desktop machines into a private OpenStack Cloud. The additional compute resources provisioned via the virtual machines have been used for Monte Carlo simulation and data analysis. Furthermore, a concept to integrate shared, remote HPC centers into regular HEP job workflows is presented. In this approach, local and remote resources are merged to form a uniform, virtual compute cluster with a single point of entry for the user. Evaluations of the performance and stability of this setup, along with operational experiences, are discussed.
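For orientation, provisioning a virtual worker of the kind described above can be done with the OpenStack SDK. The sketch below is illustrative only: the cloud profile, image, and flavor names are placeholders, not values from the paper.

```python
# A minimal sketch of provisioning a compute VM with the OpenStack SDK.
# The cloud profile, image, and flavor names are hypothetical placeholders.
import openstack

# Credentials are read from clouds.yaml or OS_* environment variables.
conn = openstack.connect(cloud="ekp-private-cloud")   # hypothetical cloud name

image = conn.compute.find_image("hep-worker-node")    # hypothetical image
flavor = conn.compute.find_flavor("m1.large")

server = conn.compute.create_server(
    name="mc-simulation-worker",
    image_id=image.id,
    flavor_id=flavor.id,
)
# Block until the VM is ACTIVE and ready to join the batch pool.
server = conn.compute.wait_for_server(server)
print(server.status)
```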
DOE Office of Scientific and Technical Information (OSTI.GOV)
Srinath Vadlamani; Scott Kruger; Travis Austin
Extended magnetohydrodynamic (MHD) codes are used to model the large, slow-growing instabilities that are projected to limit the performance of the International Thermonuclear Experimental Reactor (ITER). The multiscale nature of the extended MHD equations requires an implicit approach. The current linear solvers needed for the implicit algorithm scale poorly because the resultant matrices are so ill-conditioned. A new solver is needed, especially one that scales to the petascale. The most successful scalable parallel solvers to date are multigrid solvers. Applying multigrid techniques to a set of equations whose fundamental modes are dispersive waves is a promising solution to CEMM problems. For Phase 1, we implemented multigrid preconditioners from the HYPRE project of the Center for Applied Scientific Computing at LLNL, via PETSc of the DOE SciDAC TOPS project, for the real matrix systems of the extended MHD code NIMROD, one of the primary modeling codes of the OFES-funded Center for Extended Magnetohydrodynamic Modeling (CEMM) SciDAC. We successfully implemented the multigrid solvers on a fusion test problem that allows for real matrix systems, and in the process learned about the details of NIMROD data structures and the difficulties of inverting NIMROD operators. The further success of this project will allow for efficient usage of future petascale computers at the National Leadership Facilities: Oak Ridge National Laboratory, Argonne National Laboratory, and the National Energy Research Scientific Computing Center. The project will be a collaborative effort between computational plasma physicists and applied mathematicians at Tech-X Corporation, applied mathematicians at Front Range Scientific Computations, Inc. (who are collaborators on the HYPRE project), and other computational plasma physicists involved with the CEMM project.
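As a point of reference for the HYPRE-via-PETSc approach, a generic petsc4py sketch follows. It is not the NIMROD implementation; the 1D Laplacian is a stand-in system, and it assumes a PETSc build configured with HYPRE support.

```python
# Generic sketch: attach a HYPRE BoomerAMG multigrid preconditioner to a
# Krylov solve through PETSc. The 1D Laplacian is a toy stand-in system.
from petsc4py import PETSc

n = 100
A = PETSc.Mat().createAIJ([n, n], nnz=3)
for i in range(n):
    A.setValue(i, i, 2.0)
    if i > 0:
        A.setValue(i, i - 1, -1.0)
    if i < n - 1:
        A.setValue(i, i + 1, -1.0)
A.assemble()

b = A.createVecLeft()
b.set(1.0)
x = A.createVecRight()

ksp = PETSc.KSP().create()
ksp.setOperators(A)
ksp.setType("gmres")
pc = ksp.getPC()
pc.setType("hypre")            # requires PETSc built --with-hypre
PETSc.Options().setValue("pc_hypre_type", "boomeramg")
ksp.setFromOptions()
ksp.solve(b, x)
print("iterations:", ksp.getIterationNumber())
```

The same configuration is often driven entirely from the command line with options such as `-ksp_type gmres -pc_type hypre -pc_hypre_type boomeramg`.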
A review on economic emission dispatch problems using quantum computational intelligence
NASA Astrophysics Data System (ADS)
Mahdi, Fahad Parvez; Vasant, Pandian; Kallimani, Vish; Abdullah-Al-Wadud, M.
2016-11-01
Economic emission dispatch (EED) problems are among the most crucial problems in power systems. Growing energy demand, limited natural resources, and global warming have moved this topic to the center of discussion and research. This paper reviews the use of Quantum Computational Intelligence (QCI) in solving economic emission dispatch problems. QCI techniques such as the Quantum Genetic Algorithm (QGA) and the Quantum Particle Swarm Optimization (QPSO) algorithm are discussed. The review is intended to encourage researchers to apply QCI-based algorithms to obtain better optimal results for EED problems.
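To make the QPSO family of methods concrete, here is a minimal sketch applying the standard QPSO update, X = p ± β|mbest − X| ln(1/u), to a toy three-generator dispatch problem. The fuel-cost coefficients, demand, and penalty weight are invented for illustration and are not from the review.

```python
# Minimal QPSO sketch on a toy quadratic fuel-cost dispatch problem.
# Coefficients, demand, and the quadratic penalty are illustrative only.
import numpy as np

rng = np.random.default_rng(0)
a = np.array([0.010, 0.012, 0.008])   # $/MW^2
b = np.array([2.0, 1.8, 2.2])         # $/MW
c = np.array([100.0, 120.0, 90.0])    # $
demand = 300.0                        # MW

def cost(P):
    fuel = np.sum(a * P**2 + b * P + c)
    penalty = 1e3 * (np.sum(P) - demand) ** 2   # soft power-balance constraint
    return fuel + penalty

n_particles, dim, iters = 30, 3, 200
X = rng.uniform(50.0, 150.0, (n_particles, dim))
pbest = X.copy()
pbest_f = np.array([cost(p) for p in pbest])
gbest = pbest[np.argmin(pbest_f)]

for t in range(iters):
    beta = 1.0 - 0.5 * t / iters            # contraction-expansion coefficient
    mbest = pbest.mean(axis=0)              # mean of the personal bests
    phi = rng.random((n_particles, dim))
    p = phi * pbest + (1 - phi) * gbest     # per-particle local attractor
    u = rng.uniform(1e-12, 1.0, (n_particles, dim))
    sign = np.where(rng.random((n_particles, dim)) < 0.5, -1.0, 1.0)
    X = p + sign * beta * np.abs(mbest - X) * np.log(1.0 / u)
    f = np.array([cost(x) for x in X])
    improved = f < pbest_f
    pbest[improved], pbest_f[improved] = X[improved], f[improved]
    gbest = pbest[np.argmin(pbest_f)]

print("dispatch (MW):", np.round(gbest, 2), " cost ($):", round(cost(gbest), 2))
```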
NASA Astrophysics Data System (ADS)
Bender, Jason D.
Understanding hypersonic aerodynamics is important for the design of next-generation aerospace vehicles for space exploration, national security, and other applications. Ground-level experimental studies of hypersonic flows are difficult and expensive; thus, computational science plays a crucial role in this field. Computational fluid dynamics (CFD) simulations of extremely high-speed flows require models of chemical and thermal nonequilibrium processes, such as dissociation of diatomic molecules and vibrational energy relaxation. Current models are outdated and inadequate for advanced applications. We describe a multiscale computational study of gas-phase thermochemical processes in hypersonic flows, starting at the atomic scale and building systematically up to the continuum scale. The project was part of a larger effort centered on collaborations between aerospace scientists and computational chemists. We discuss the construction of potential energy surfaces for the N4, N2O2, and O4 systems, focusing especially on the multi-dimensional fitting problem. A new local fitting method named L-IMLS-G2 is presented and compared with a global fitting method. Then, we describe the theory of the quasiclassical trajectory (QCT) approach for modeling molecular collisions. We explain how we implemented the approach in a new parallel code for high-performance computing platforms. Results from billions of QCT simulations of high-energy N2 + N2, N2 + N, and N2 + O2 collisions are reported and analyzed. Reaction rate constants are calculated and sets of reactive trajectories are characterized at both thermal equilibrium and nonequilibrium conditions. The data shed light on fundamental mechanisms of dissociation and exchange reactions -- and their coupling to internal energy transfer processes -- in thermal environments typical of hypersonic flows. We discuss how the outcomes of this investigation and other related studies lay a rigorous foundation for new macroscopic models for hypersonic CFD. This research was supported by the Department of Energy Computational Science Graduate Fellowship and by the Air Force Office of Scientific Research Multidisciplinary University Research Initiative.
First-principles study of the binding energy between nanostructures and its scaling with system size
NASA Astrophysics Data System (ADS)
Tao, Jianmin; Jiao, Yang; Mo, Yuxiang; Yang, Zeng-Hui; Zhu, Jian-Xin; Hyldgaard, Per; Perdew, John P.
2018-04-01
The equilibrium van der Waals binding energy is an important factor in the design of materials and devices. However, it presents great computational challenges for materials built up from nanostructures. Here we investigate the binding-energy scaling behavior from first-principles calculations. We show that the equilibrium binding energy per atom between identical nanostructures can scale up or down with nanostructure size, but can be parametrized for large N with an analytical formula (in meV/atom), Eb/N = a + b/N + c/N² + d/N³, where N is the number of atoms in a nanostructure and a, b, c, and d are fitting parameters that depend on the properties of the nanostructure. The formula is consistent with a finite large-size limit of the binding energy per atom. We find that two competing factors determine the binding energy: nonadditivity of the van der Waals coefficients and the center-to-center distance between nanostructures. To decode the details, the nonadditivity of the static multipole polarizability is investigated with an accurate spherical-shell model. We find that the higher-order multipole polarizability displays ultrastrong intrinsic nonadditivity, whether or not the dipole polarizability is additive.
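For readers wishing to apply the quoted parametrization, a minimal fitting sketch with SciPy follows. The sample (N, Eb/N) values are invented for illustration and are not the paper's results.

```python
# Sketch: fit the large-N form Eb/N = a + b/N + c/N^2 + d/N^3 to
# binding-energy data. The sample data points are made up for illustration.
import numpy as np
from scipy.optimize import curve_fit

def eb_per_atom(N, a, b, c, d):
    return a + b / N + c / N**2 + d / N**3

N = np.array([20, 40, 60, 80, 120, 180, 260], dtype=float)
eb = np.array([-9.8, -12.4, -13.3, -13.8, -14.2, -14.5, -14.7])  # meV/atom

params, _ = curve_fit(eb_per_atom, N, eb)
a, b, c, d = params
# 'a' is the finite large-size limit of the binding energy per atom.
print(f"a = {a:.2f} meV/atom (large-N limit); b = {b:.1f}, c = {c:.0f}, d = {d:.0f}")
```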
NASA Astrophysics Data System (ADS)
Feskov, Serguei V.; Ivanov, Anatoly I.
2018-03-01
An approach to the construction of diabatic free energy surfaces (FESs) for ultrafast electron transfer (ET) in a supramolecule with an arbitrary number of electron localization centers (redox sites) is developed, supposing that the reorganization energies for the charge transfers and shifts between all these centers are known. The dimensionality of the coordinate space required to describe multistage ET in this supramolecular system is shown to be N − 1, where N is the number of molecular centers involved in the reaction. The proposed algorithm of FES construction employs metric properties of the coordinate space, namely the relation between the solvent reorganization energy and the distance between two FES minima. In this space, the ET reaction coordinate z_nn′ associated with electron transfer between the nth and n′th centers is calculated through projection onto the direction connecting the FES minima. The energy-gap reaction coordinates z_nn′ corresponding to different ET processes are in general not orthogonal, so that ET between two molecular centers can create a nonequilibrium distribution not only along its own reaction coordinate but along other reaction coordinates too. This results in an influence of the preceding ET steps on the kinetics of the ensuing ET, which matters when the ensuing reaction is ultrafast and proceeds in parallel with relaxation along the ET reaction coordinates. Efficient algorithms for numerical simulation of multistage ET within the stochastic point-transition model are developed. The algorithms are based on the Brownian simulation technique with a recrossing-event detection procedure. The main advantages of the numerical method are (i) its computational complexity is linear in the number of electronic states involved and (ii) calculations can be naturally parallelized up to the level of individual trajectories. The efficiency of the proposed approach is demonstrated for a model supramolecular system involving four redox centers.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Celik, I.; Chattree, M.
1988-07-01
An assessment of the theoretical and numerical aspects of the computer code PCGC-2 is made, and the results of applying this code to the Morgantown Energy Technology Center (METC) advanced gasification facility entrained-flow reactor, "the gasifier," are presented. PCGC-2 is a code suitable for simulating pulverized coal combustion or gasification under axisymmetric (two-dimensional) flow conditions. The governing equations for the gas and particulate phases have been reviewed. The numerical procedure and the related programming difficulties have been elucidated. A single-particle model similar to the one used in PCGC-2 has been developed, programmed, and applied to some simple situations in order to gain insight into the physics of coal-particle heat-up, devolatilization, and char oxidation. PCGC-2 was applied to the METC entrained-flow gasifier to study numerically the flash pyrolysis of coal and the gasification of coal with steam or carbon dioxide. The results from the simulations are compared with measurements. The gas and particle residence times, particle temperature, and mass component history were also calculated and the results analyzed. The results provide useful information for understanding the fundamentals of coal gasification and for assessing experimental results obtained with the reactor considered. 69 refs., 35 figs., 23 tabs.
Catalog of databases and reports
DOE Office of Scientific and Technical Information (OSTI.GOV)
Burtis, M.D.
1997-04-01
This catalog provides information about the many reports and materials made available by the US Department of Energy's (DOE's) Global Change Research Program (GCRP) and the Carbon Dioxide Information Analysis Center (CDIAC). The catalog is divided into nine sections plus the author and title indexes: Section A--US Department of Energy Global Change Research Program Research Plans and Summaries; Section B--US Department of Energy Global Change Research Program Technical Reports; Section C--US Department of Energy Atmospheric Radiation Measurement (ARM) Program Reports; Section D--Other US Department of Energy Reports; Section E--CDIAC Reports; Section F--CDIAC Numeric Data and Computer Model Distribution; Section G--Other Databases Distributed by CDIAC; Section H--US Department of Agriculture Reports on Response of Vegetation to Carbon Dioxide; and Section I--Other Publications.
NVU dynamics. I. Geodesic motion on the constant-potential-energy hypersurface.
Ingebrigtsen, Trond S; Toxvaerd, Søren; Heilmann, Ole J; Schrøder, Thomas B; Dyre, Jeppe C
2011-09-14
An algorithm is derived for computer simulation of geodesics on the constant-potential-energy hypersurface of a system of N classical particles. First, a basic time-reversible geodesic algorithm is derived by discretizing the geodesic stationarity condition and implementing the constant-potential-energy constraint via standard Lagrangian multipliers. The basic NVU algorithm is tested by single-precision computer simulations of the Lennard-Jones liquid. Excellent numerical stability is obtained if the force cutoff is smoothed and the two initial configurations have identical potential energy within machine precision. Nevertheless, just as for NVE algorithms, stabilizers are needed for very long runs in order to compensate for the accumulation of numerical errors that eventually lead to "entropic drift" of the potential energy towards higher values. A modification of the basic NVU algorithm is introduced that ensures potential-energy and step-length conservation; center-of-mass drift is also eliminated. Analytical arguments confirmed by simulations demonstrate that the modified NVU algorithm is absolutely stable. Finally, we present simulations showing that the NVU algorithm and the standard leap-frog NVE algorithm have identical radial distribution functions for the Lennard-Jones liquid. © 2011 American Institute of Physics.
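A toy sketch of such a step is given below. It assumes the time-reversible update R_{i+1} = 2R_i − R_{i−1} + λF_i, with λ chosen so that U(R_{i+1}) = U(R_{i−1}) to leading order, which is the form of the basic NVU step; the 2D harmonic potential and the step parameters are invented for illustration and are not the paper's Lennard-Jones system.

```python
# Schematic constant-potential-energy (NVU-style) geodesic stepping on a toy
# 2D potential. Assumed update: R_{i+1} = 2 R_i - R_{i-1} + lam * F_i, where
# lam enforces U(R_{i+1}) = U(R_{i-1}) to leading order in the step length.
import numpy as np

def U(r):                       # toy anisotropic harmonic potential
    return 0.5 * (r[0]**2 + 4.0 * r[1]**2)

def F(r):                       # force = -grad U
    return -np.array([r[0], 4.0 * r[1]])

# Two initial configurations with (numerically) identical potential energy,
# both on the U = 0.5 contour, as the stability criterion above requires.
r_prev = np.array([1.0, 0.0])
r_now = np.array([np.cos(0.02), np.sin(0.02) / 2.0])

for step in range(1000):
    f = F(r_now)
    d = r_now - r_prev
    lam = -2.0 * np.dot(f, d) / np.dot(f, f)   # keeps U constant to O(step^2)
    r_prev, r_now = r_now, 2.0 * r_now - r_prev + lam * f

print("relative energy drift:", abs(U(r_now) - 0.5) / 0.5)
```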
Energy and daylighting: A correlation between quality of light and energy consciousness
DOE Office of Scientific and Technical Information (OSTI.GOV)
Krug, N.
1997-12-31
Energy and Daylighting, an advanced-topics graduate/professional elective, has been established to help students develop a deeper understanding of architectural daylighting, energy-conserving design, and materials/construction methods through direct application. After a brief survey of the principles and applications of current and developing attitudes and techniques in energy conservation and natural lighting strategies (building upon previous courses), an extensive exercise follows which allows students the opportunity for direct application. Both computer modeling/analysis and physical modeling (light-box simulation with photographic documentation) are employed to focus attention on the interrelationships between natural lighting and passive energy-conserving design, all within the context of establishing environmental (interior) quality and (exterior) design direction. As a result, students broaden their understanding of natural light and energy conservation as design tools; the importance of environmental responsibility for both built and natural environments; and the use of computer analysis as a design tool. This presentation centers on the activities and results obtained from explorations in Energy and Daylighting. Discussion will highlight the course objectives, the methodology involved in the studies, specific requirements and means of evaluation, a slide show of before-and-after results, and a retrospective look at the course's value, as well as future directions and implications.
Dehydration of 1-octadecanol over H-BEA: A combined experimental and computational study
DOE Office of Scientific and Technical Information (OSTI.GOV)
Song, Wenji; Liu, Yuanshuai; Barath, Eszter
Liquid-phase dehydration of 1-octadecanol, which is formed as an intermediate during the hydrodeoxygenation of microalgae oil, has been explored in a combined experimental and computational study. The alkyl chain of the C18 alcohol interacts with acid sites during diffusion inside the zeolite pores, resulting in inefficient utilization of the Brønsted acid sites for samples with high acid-site concentrations. The parallel intra- and intermolecular dehydration pathways have different activation energies and pass through alternative reaction intermediates. Formation of surface-bound alkoxide species is the rate-limiting step during intramolecular dehydration, whereas intermolecular dehydration proceeds via a bulky dimer intermediate. Octadecene is the primary dehydration product over H-BEA at 533 K. Despite the main contribution of Brønsted acid sites to both dehydration pathways, Lewis acid sites are also active in the formation of dioctadecyl ether. The intramolecular dehydration to octadecene and the cleavage of the intermediately formed ether, however, require strong BAS. L. Wang, D. Mei and J. A. Lercher acknowledge the partial support of the US Department of Energy, Office of Science, Office of Basic Energy Sciences, Division of Chemical Sciences, Geosciences & Biosciences. Pacific Northwest National Laboratory (PNNL) is a multiprogram national laboratory operated for DOE by Battelle. Computing time was granted by the grand challenge of computational catalysis of the William R. Wiley Environmental Molecular Sciences Laboratory (EMSL) and by the National Energy Research Scientific Computing Center (NERSC). EMSL is a national scientific user facility located at Pacific Northwest National Laboratory and sponsored by DOE’s Office of Biological and Environmental Research.
Network Computing Infrastructure to Share Tools and Data in Global Nuclear Energy Partnership
NASA Astrophysics Data System (ADS)
Kim, Guehee; Suzuki, Yoshio; Teshima, Naoya
CCSE/JAEA (Center for Computational Science and e-Systems/Japan Atomic Energy Agency) integrated a prototype network computing infrastructure for sharing tools and data to support the U.S.-Japan collaboration in GNEP (Global Nuclear Energy Partnership). We focused on three technical issues in applying our information-processing infrastructure: accessibility, security, and usability. In designing the prototype system, we integrated and improved both network and Web technologies. For accessibility, we adopted SSL-VPN (Secure Sockets Layer-Virtual Private Network) technology for access across firewalls. For security, we developed an authentication gateway based on the PKI (Public Key Infrastructure) mechanism, set fine-grained access-control policies for shared tools and data, and used shared-key encryption to protect tools and data against leakage to third parties. For usability, we chose Web browsers as the user interface and developed a Web application providing functions to support the sharing of tools and data. Using the WebDAV (Web-based Distributed Authoring and Versioning) function, users can manipulate shared tools and data through a Windows-like folder environment. We implemented the prototype system on AEGIS (Atomic Energy Grid Infrastructure), the Grid infrastructure for atomic energy research developed by CCSE/JAEA. The prototype system was applied in a trial during the first period of GNEP.
Energy harvesting from human walking to power biomedical devices using oscillating generation.
Montoya, Jose A; Mariscal, Dulce M; Romero, Edwar
2016-08-01
This work summarizes the limits of energy generation from walking using a pendulum-based generation system. Self-winding wristwatches have successfully exploited this energy-input technique for decades. Pendulum-based planar devices use rotation to produce energy in inertial generators. The oscillations of body motion during locomotion thus present an opportunity to extract kinetic energy with planar generators. The sinusoidal motion of the body's center of gravity in the sagittal and frontal planes, and the swinging of the limbs, are well suited to oscillating devices. Portable biomedical devices can extract energy from everyday walking to extend battery life or decrease battery size. Computer simulations suggest energy availability of 0.05-1.2 mJ at the chest, 0.5-2.5 mJ at the hip, and 0.5-41 mJ at the elbow during walking.
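A rough sketch of how such an estimate can be produced follows: integrate a damped pendulum driven by body sway and account the energy dissipated in the generator damper as harvested. The mass, arm length, damping, and excitation values are illustrative assumptions, not the paper's model.

```python
# Rough sketch: energy harvested by a pendulum generator driven by horizontal
# base acceleration a(t) = A*sin(w*t). All parameter values are assumptions.
import numpy as np
from scipy.integrate import solve_ivp

m, L, g = 0.005, 0.02, 9.81          # 5 g proof mass on a 2 cm arm
c = 1e-6                             # generator (electrical) damping, N*m*s
A, w = 2.0, 2 * np.pi * 1.0          # 2 m/s^2 sway at ~1 Hz gait cadence

def rhs(t, y):
    th, om = y
    a = A * np.sin(w * t)            # horizontal base acceleration
    # torque balance: I*th'' = -m*g*L*sin(th) - c*om - m*L*a*cos(th)
    dom = (-m * g * L * np.sin(th) - c * om - m * L * a * np.cos(th)) / (m * L**2)
    return [om, dom]

T = 10.0                             # ten seconds of walking
sol = solve_ivp(rhs, (0, T), [0.1, 0.0], max_step=1e-3)
om = sol.y[1]
# Harvested energy = energy dissipated in the damper: E = integral c*om^2 dt
E = np.trapz(c * om**2, sol.t)
print(f"harvested energy over {T:.0f} s: {E * 1e3:.3f} mJ")
```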
Thermal radiation view factor: Methods, accuracy and computer-aided procedures
NASA Technical Reports Server (NTRS)
Kadaba, P. V.
1982-01-01
Computer-aided thermal analysis programs that predict whether orbiting equipment will stay within a predetermined acceptable temperature range, in various attitudes with respect to the Sun and the Earth, are examined. The complexity of the surface geometries suggests the use of numerical schemes for determining the view factors. Basic definitions and the standard methods that form the basis for various digital computer and numerical methods are presented. The physical models and mathematical methods on which a number of available programs are built are summarized. The strengths and weaknesses of the methods employed, the accuracy of the calculations, and the time required for computations are evaluated. Situations where accuracy is important for energy calculations are identified, and methods to save computational time are proposed. A guide to the best use of the programs available at several centers, and future choices for the efficient use of digital computers, are included in the recommendations.
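One standard numerical scheme for view factors is direct Monte Carlo integration of the defining double integral, F12 = (1/A1) ∬ cosθ1 cosθ2 / (π r²) dA1 dA2. The sketch below is illustrative only (it is not one of the surveyed programs) and uses coaxial parallel unit squares, a configuration with a known analytic answer of roughly 0.1998 at unit separation, as a check.

```python
# Sketch: Monte Carlo estimate of the view factor between two coaxial,
# parallel unit squares separated by distance h. Illustrative only.
import numpy as np

rng = np.random.default_rng(1)

def viewfactor_parallel_squares(h, n=1_000_000):
    # Sample point pairs uniformly on the two unit squares.
    p1 = rng.random((n, 2))
    p2 = rng.random((n, 2))
    dx, dy = p1[:, 0] - p2[:, 0], p1[:, 1] - p2[:, 1]
    r2 = dx**2 + dy**2 + h**2
    # Both surface normals are along z, so cos(th1) = cos(th2) = h / r.
    integrand = (h**2 / r2) / (np.pi * r2)
    A2 = 1.0
    # F12 = (A1*A2/A1) * <integrand> with A1 = A2 = 1.
    return A2 * integrand.mean()

print("F12 at h = 1:", viewfactor_parallel_squares(1.0))  # ~0.1998 analytically
```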
UTDallas Offline Computing System for B Physics with the Babar Experiment at SLAC
NASA Astrophysics Data System (ADS)
Benninger, Tracy L.
1998-10-01
The University of Texas at Dallas High Energy Physics group is building a high-performance, large-storage computing system for B physics research with the BaBar experiment (the "B factory") at the Stanford Linear Accelerator Center. The goal of this system is to analyze one terabyte of complex Event Store data from BaBar in one to two days. The foundation of the computing system is a Sun E6000 Enterprise multiprocessor system, augmented with a Sun StorEdge L1800 tape library, a Sun workstation for processing batch jobs, staging disks, and interface cards. The design considerations, current status, projects underway, and possible upgrade paths are discussed.
Nagano, Akinori; Komura, Taku; Fukashiro, Senshi
2007-01-01
Background: The purpose of this study was to investigate the coordination strategy of maximal-effort horizontal jumping in comparison with vertical jumping, using the methodology of computer simulation. Methods: A skeletal model that has nine rigid body segments and twenty degrees of freedom was developed. Thirty-two Hill-type lower limb muscles were attached to the model. The excitation-contraction dynamics of the contractile element, the tissues around the joints to limit the joint range of motion, as well as the foot-ground interaction were implemented. Simulations were initiated from an identical standing posture for both motions. The optimal pattern of the activation input signal was found through numerical optimization. For the horizontal jumping, the goal was to maximize the horizontal distance traveled by the body's center of mass. For the vertical jumping, the goal was to maximize the height reached by the body's center of mass. Results: It was found that the hip joint was utilized more vigorously in the horizontal jumping than in the vertical jumping. The muscles that have a function of joint flexion, such as the m. iliopsoas, m. rectus femoris and m. tibialis anterior, were activated to a greater level during the countermovement in the horizontal jumping, with an effect of moving the body's center of mass in the forward direction. Muscular work was transferred to the mechanical energy of the body's center of mass more effectively in the horizontal jump, which resulted in a greater energy gain of the body's center of mass throughout the motion. Conclusion: These differences in the optimal coordination strategy seem to be caused by the requirement that the body's center of mass be located above the feet in a vertical jump, whereas this requirement is not so strict in a horizontal jump. PMID:17543118
Python in the NERSC Exascale Science Applications Program for Data
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ronaghi, Zahra; Thomas, Rollin; Deslippe, Jack
We describe a new effort at the National Energy Research Scientific Computing Center (NERSC) in performance analysis and optimization of scientific Python applications targeting the Intel Xeon Phi (Knights Landing, KNL) many-core architecture. The Python-centered work outlined here is part of a larger effort called the NERSC Exascale Science Applications Program (NESAP) for Data. NESAP for Data focuses on applications that process and analyze high-volume, high-velocity data sets from experimental/observational science (EOS) facilities supported by the US Department of Energy Office of Science. We present three case-study applications from NESAP for Data that use Python. These codes vary in terms of “Python purity” from applications developed in pure Python to ones that use Python mainly as a convenience layer for scientists without expertise in lower-level programming languages like C, C++ or Fortran. The science case, requirements, constraints, algorithms, and initial performance optimizations for each code are discussed. Our goal with this paper is to contribute to the larger conversation around the role of Python in high-performance computing today and tomorrow, highlighting areas for future work and emerging best practices.
Large Scale Computing and Storage Requirements for High Energy Physics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gerber, Richard A.; Wasserman, Harvey
2010-11-24
The National Energy Research Scientific Computing Center (NERSC) is the leading scientific computing facility for the Department of Energy's Office of Science, providing high-performance computing (HPC) resources to more than 3,000 researchers working on about 400 projects. NERSC provides large-scale computing resources and, crucially, the support and expertise needed for scientists to make effective use of them. In November 2009, NERSC, DOE's Office of Advanced Scientific Computing Research (ASCR), and DOE's Office of High Energy Physics (HEP) held a workshop to characterize the HPC resources needed at NERSC to support HEP research through the next three to five years. The effort is part of NERSC's legacy of anticipating users' needs and deploying resources to meet those demands. The workshop revealed several key points, in addition to achieving its goal of collecting and characterizing computing requirements. The chief findings: (1) Science teams need access to a significant increase in computational resources to meet their research goals; (2) Research teams need to be able to read, write, transfer, store online, archive, analyze, and share huge volumes of data; (3) Science teams need guidance and support to implement their codes on future architectures; and (4) Projects need predictable, rapid turnaround of their computational jobs to meet mission-critical time constraints. This report expands upon these key points and includes others. It also presents a number of case studies as representative of the research conducted within HEP. Workshop participants were asked to codify their requirements in this case study format, summarizing their science goals, methods of solution, current and three-to-five year computing requirements, and software and support needs. Participants were also asked to describe their strategy for computing in the highly parallel, multi-core environment that is expected to dominate HPC architectures over the next few years. The report includes a section that describes efforts already underway or planned at NERSC that address requirements collected at the workshop. NERSC has many initiatives in progress that address key workshop findings and are aligned with NERSC's strategic plans.
Preliminary analyses of space radiation protection for lunar base surface systems
NASA Technical Reports Server (NTRS)
Nealy, John E.; Wilson, John W.; Townsend, Lawrence W.
1989-01-01
Radiation shielding analyses are performed for candidate lunar base habitation modules. The study primarily addresses potential hazards due to contributions from galactic cosmic rays. The NASA Langley Research Center high-energy nucleon and heavy-ion transport codes are used to compute the propagation of radiation through conventional and regolith shield materials. Computed values of linear energy transfer are converted to biological dose equivalent using quality factors established by the International Commission on Radiological Protection. Fluxes of heavy charged particles and corresponding dosimetric quantities are computed for a series of thicknesses in various shield media and are used as an input database for algorithms pertaining to specific shielded geometries. Dosimetric results are presented as isodose contour maps of shielded-configuration interiors. The dose predictions indicate that shielding requirements are substantial, and an abbreviated uncertainty analysis shows that better definition of the space radiation environment, as well as improvement in nuclear interaction cross-section data, can greatly increase the accuracy of shield requirement predictions.
Galerkin method for unsplit 3-D Dirac equation using atomically/kinetically balanced B-spline basis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fillion-Gourdeau, F.; Lorin, E.
2016-02-15
A Galerkin method is developed to solve the time-dependent Dirac equation in prolate spheroidal coordinates for an electron in a two-center molecular system. The initial state is evaluated from a variational principle using a kinetically/atomically balanced basis, which allows for an efficient and accurate determination of the Dirac spectrum and eigenfunctions. B-spline basis functions are used to obtain high accuracy. This numerical method is used to compute the energy spectrum of the two-center problem and then the evolution of eigenstate wavefunctions in an external electromagnetic field.
Tunable mode and line selection by injection in a TEA CO2 laser
NASA Technical Reports Server (NTRS)
Menzies, R. T.; Flamant, P. H.; Kavaya, M. J.; Kuiper, E. N.
1984-01-01
Tunable mode selection by injection in pulsed CO2 lasers is examined, and both analytical and numerical models are used to compute the required injection power for a variety of experimental cases. These are treated in two categories: mode selection at a desired frequency displacement from the center frequency of a transition line in a dispersive cavity and mode (and line) selection at the center frequency of a selected transition line in a nondispersive cavity. The results point out the potential flexibility of pulsed injection in providing wavelength tunable high-energy single-frequency pulses.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tucker, T.C.
1980-06-01
The implementation of a version of the Rutherford Laboratory's magnetostatic computer code GFUN3D on the CDC 7600 at the National Magnetic Fusion Energy Computer Center is reported. A new iteration technique is included that greatly increases the probability of convergence and reduces computation time by about 30% for calculations with nonlinear, ferromagnetic materials. The use of GFUN3D on the NMFE network is discussed, and suggestions for future work are presented. Appendix A consists of revisions to the GFUN3D User Guide (published by Rutherford Laboratory) that are necessary to use this version. Appendix B contains input and output for some sample calculations. Appendix C is a detailed discussion of the old and new iteration techniques.
Simple vertex correction improves G W band energies of bulk and two-dimensional crystals
NASA Astrophysics Data System (ADS)
Schmidt, Per S.; Patrick, Christopher E.; Thygesen, Kristian S.
2017-11-01
The GW self-energy method has long been recognized as the gold standard for quasiparticle (QP) calculations of solids, in spite of the fact that the neglect of vertex corrections and the use of a density-functional theory starting point lack rigorous justification. In this work we remedy this situation by including a simple vertex correction that is consistent with a local-density approximation starting point. We analyze the effect of the self-energy by splitting it into short-range and long-range terms, which are shown to govern, respectively, the center and size of the band gap. The vertex mainly improves the short-range correlations and therefore has a small effect on the band gap, while it shifts the band gap center up in energy by around 0.5 eV, in good agreement with experiments. Our analysis also explains how the relative importance of short- and long-range interactions in structures of different dimensionality is reflected in their QP energies. Inclusion of the vertex comes at practically no extra computational cost and even improves the basis-set convergence compared to GW. Taken together, the method provides an efficient and rigorous improvement over the GW approximation.
Lin, Yu-Hsiu; Hu, Yu-Chen
2018-04-27
The emergence of smart Internet of Things (IoT) devices has highly favored the realization of smart homes in the downstream sector of a smart grid. The underlying objective of Demand Response (DR) schemes is to actively engage customers in modifying their energy consumption on domestic appliances in response to pricing signals. Domestic appliance scheduling is widely accepted as an effective mechanism to manage domestic energy consumption intelligently. For residential customers implementing DR, maintaining a balance between energy consumption cost and users' comfort satisfaction is a challenge. Hence, in this paper, a constrained Particle Swarm Optimization (PSO)-based residential consumer-centric load-scheduling method is proposed. The method can further be featured with edge computing. In contrast with cloud computing, edge computing optimizes cloud computing technologies by driving computing capability to the IoT edge of the Internet; it addresses bandwidth-intensive content and latency-sensitive applications among sensors and central data centers through data analytics at or near the source of the data. A previously proposed non-intrusive load-monitoring technique is utilized for automatic determination of the physical characteristics of power-intensive home appliances from users' life patterns. The swarm intelligence, constrained PSO, is used to minimize the energy consumption cost while considering users' comfort satisfaction for DR implementation. The residential consumer-centric load-scheduling method proposed in this paper is evaluated under real-time pricing with inclining block rates and is demonstrated in a case study. The experimentation reported in this paper shows that the proposed method can re-shape the loads of home appliances in response to DR signals. Moreover, a reduction in peak power consumption of 13.97% is achieved.
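A worked sketch of the billing model named above (real-time pricing with inclining block rates) may help: consumption above a per-interval block threshold is charged at a higher rate, so re-shaping load to stay within the block lowers cost. The threshold and rates below are invented for illustration.

```python
# Sketch: interval cost under real-time pricing (RTP) with an inclining
# block rate. The block threshold and surcharge factor are assumptions.
from typing import List

def interval_cost(load_kw: float, rtp_price: float,
                  block_kw: float = 2.0, surcharge_factor: float = 1.5) -> float:
    """Cost of one scheduling interval: load above block_kw pays a surcharge."""
    base = min(load_kw, block_kw) * rtp_price
    excess = max(load_kw - block_kw, 0.0) * rtp_price * surcharge_factor
    return base + excess

def schedule_cost(loads_kw: List[float], prices: List[float]) -> float:
    return sum(interval_cost(l, p) for l, p in zip(loads_kw, prices))

# Two candidate appliance schedules with the same total energy (8 kWh):
prices = [0.10, 0.12, 0.30, 0.08]     # $/kWh over four 1-hour intervals
peaky = [4.0, 0.0, 4.0, 0.0]          # concentrated load crosses the block
flat = [2.0, 2.0, 2.0, 2.0]           # re-shaped load stays within the block

print("peaky:", schedule_cost(peaky, prices))   # 2.0: surcharges + peak price
print("flat: ", schedule_cost(flat, prices))    # 1.2: no surcharges
```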
U.S. Department of Energy's Bioenergy Research Centers: An Overview of the Science
DOE Office of Scientific and Technical Information (OSTI.GOV)
None
2009-07-01
Alternative fuels from renewable cellulosic biomass--plant stalks, trunks, stems, and leaves--are expected to significantly reduce U.S. dependence on imported oil while enhancing national energy security and decreasing the environmental impacts of energy use. Ethanol and other advanced biofuels from cellulosic biomass are renewable alternatives that could increase domestic production of transportation fuels, revitalize rural economies, and reduce carbon dioxide and pollutant emissions. According to U.S. Secretary of Energy Steven Chu, 'Developing the next generation of biofuels is key to our effort to end our dependence on foreign oil and address the climate crisis while creating millions of new jobs that can't be outsourced'. In the United States, the Energy Independence and Security Act (EISA) of 2007 is an important driver for the sustainable development of renewable biofuels. As part of EISA, the Renewable Fuel Standard mandates that 36 billion gallons of biofuels be produced annually by 2022, of which 16 billion gallons are expected to come from cellulosic feedstocks. Although cellulosic ethanol production has been demonstrated at the pilot level, developing a cost-effective, commercial-scale cellulosic biofuel industry will require transformational science to significantly streamline current production processes. Woodchips, grasses, cornstalks, and other cellulosic biomass are widely abundant but more difficult to break down into sugars than corn grain--the primary source of U.S. ethanol fuel production today. Biological research is key to accelerating the deconstruction of cellulosic biomass into sugars that can be converted to biofuels. The Department of Energy (DOE) Office of Science has played a major role in inspiring, supporting, and guiding the biotechnology revolution over the past 25 years. The DOE Genomic Science Program is advancing a new generation of research focused on achieving whole-systems understanding for biology. This program is bringing together scientists in diverse fields to understand the complex biology underlying solutions to DOE missions in energy production, environmental remediation, and climate change science. New interdisciplinary research communities are emerging, as are knowledgebases and scientific and computational resources critical to advancing large-scale, genome-based biology. To focus the most advanced biotechnology-based resources on the biological challenges of biofuel production, DOE established three Bioenergy Research Centers (BRCs) in September 2007. Each center is pursuing the basic research underlying a range of high-risk, high-return biological solutions for bioenergy applications. Advances resulting from the BRCs will provide the knowledge needed to develop new biobased products, methods, and tools that the emerging biofuel industry can use. The scientific rationale for these centers and for other fundamental genomic research critical to the biofuel industry was established at a DOE workshop involving members of the research community. The DOE BRCs have developed automated, high-throughput analysis pipelines that will accelerate scientific discovery for biology-based biofuel research. The three centers, which were selected through a scientific peer-review process, are based in geographically diverse locations--the Southeast, the Midwest, and the West Coast--with partners across the nation.
DOE's Oak Ridge National Laboratory leads the BioEnergy Science Center (BESC) in Tennessee; the University of Wisconsin-Madison leads the Great Lakes Bioenergy Research Center (GLBRC); and DOE's Lawrence Berkeley National Laboratory leads the DOE Joint BioEnergy Institute (JBEI) in California. Each center represents a multidisciplinary partnership with expertise spanning the physical and biological sciences, including genomics, microbial and plant biology, analytical chemistry, computational biology and bioinformatics, and engineering. Institutional partners include DOE national laboratories, universities, private companies, and nonprofit organizations.
Using Left Overs to Make Energy
DOE Office of Scientific and Technical Information (OSTI.GOV)
Steuterman, Sally; Czarnecki, Alicia; Hurley, Paul
Representing the Materials Science of Actinides (MSA) center, this document is one of the entries in the Ten Hundred and One Word Challenge. As part of the challenge, the 46 Energy Frontier Research Centers were invited to represent their science in images, cartoons, photos, words, and original paintings, but any descriptions or words could use only the 1000 most commonly used words in the English language, with the addition of one word important to each of the EFRCs and the mission of DOE energy. The mission of MSA is to conduct transformative research in the actinide sciences with full integration of experimental and computational approaches, and an emphasis on research questions that are important to the energy future of the nation.
The Center for Multiscale Plasma Dynamics, Final Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gombosi, Tamas I.
The University of Michigan participated in the joint UCLA/Maryland fusion science center focused on plasma physics problems for which the traditional separation of the dynamics into microscale and macroscale processes breaks down. These processes involve large-scale flows and magnetic fields tightly coupled to the small-scale, kinetic dynamics of turbulence, particle acceleration, and energy cascade. The interaction between these vastly disparate scales controls the evolution of the system. The enormous range of temporal and spatial scales associated with these problems renders direct simulation intractable, even in computations that use the largest existing parallel computers. Our efforts focused on two main problems: the development of Hall MHD solvers on solution-adaptive grids and the development of solution-adaptive grids using generalized coordinates, so that the proper geometry of inertial confinement can be taken into account and efficient refinement strategies can be obtained.
Energy Conservation and Conversion in NIMROD Computations of Magnetic Reconnection
NASA Astrophysics Data System (ADS)
Maddox, J. A.; Sovinec, C. R.
2017-10-01
Previous work modeling magnetic relaxation during non-inductive start-up at the Pegasus spherical tokamak indicates an order-of-magnitude gap between the measured experimental temperature and the simulated temperature in NIMROD. Potential causes of this temperature gap include insufficient transport modeling, too low a modeled injector power input, and numerical loss of energy, since energy is not algorithmically conserved in NIMROD simulations. Because understanding numerical loss of energy is fundamental to addressing the other potential causes, simple 2D nonlinear MHD simulations are used to explore numerical energy-conservation discrepancies in NIMROD. Evolution of these configurations induces magnetic reconnection, which transfers magnetic energy to heat and kinetic energy. The kinetic energy is eventually damped, so magnetic energy loss must correspond to an increase in internal energy. Results in the 2D geometries indicate that numerical energy loss during reconnection depends on the temporal resolution of the dynamics. Work supported by the U.S. Department of Energy through a subcontract from the Plasma Science and Innovation Center.
The Ames Power Monitoring System
NASA Technical Reports Server (NTRS)
Osetinsky, Leonid; Wang, David
2003-01-01
The Ames Power Monitoring System (APMS) is a centralized system of power meters, computer hardware, and special-purpose software that collects and stores electrical power data from various facilities at Ames Research Center (ARC). This system is needed because of the large and varying nature of the overall ARC power demand, which has been observed to range from 20 to 200 MW. Large portions of peak demand can be attributed to only three wind tunnels (60, 180, and 100 MW, respectively). The APMS helps ARC avoid or minimize costly demand charges by enabling wind-tunnel operators, test engineers, and the power manager to monitor total demand for the center in real time. These persons receive the information they need to manage and schedule energy-intensive research in advance and to adjust loads in real time to ensure that the overall maximum allowable demand is not exceeded. The APMS includes a server computer running the Windows NT operating system and can, in principle, include an unlimited number of power meters and client computers. As configured at the time of reporting, the APMS includes more than 40 power meters monitoring all the major research facilities, plus 15 Windows-based client personal computers that display real-time and historical data to users via graphical user interfaces (GUIs). The power meters and client computers communicate with the server using Transmission Control Protocol/Internet Protocol (TCP/IP) on Ethernet networks, variously through dedicated fiber-optic cables or through the pre-existing ARC local-area network (ARCLAN). The APMS has enabled ARC to achieve significant savings ($1.2 million in 2001) in the cost of power and electric energy by helping personnel to maintain total demand below monthly allowable levels, to manage the overall power factor to avoid low-power-factor penalties, and to use historical system data to identify opportunities for additional energy savings. The APMS also provides power engineers and electricians with the information they need to plan modifications in advance and perform day-to-day maintenance of the ARC electric-power distribution system.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lottes, S.A.; Bojanowski, C.; Shen, J.
2012-04-09
The computational fluid dynamics (CFD) and computational structural mechanics (CSM) focus areas at Argonne's Transportation Research and Analysis Computing Center (TRACC) initiated a project to support and complement the experimental programs at the Turner-Fairbank Highway Research Center (TFHRC) with high-performance-computing-based analysis capabilities in August 2010. The project was established with a new interagency agreement between the Department of Energy and the Department of Transportation to provide collaborative research, development, and benchmarking of advanced three-dimensional computational mechanics analysis methods to the aerodynamics and hydraulics laboratories at TFHRC for a period of five years, beginning in October 2010. The analysis methods employ well-benchmarked and supported commercial computational mechanics software. Computational mechanics encompasses the areas of Computational Fluid Dynamics (CFD), Computational Wind Engineering (CWE), Computational Structural Mechanics (CSM), and Computational Multiphysics Mechanics (CMM) applied in Fluid-Structure Interaction (FSI) problems. The major areas of focus of the project are wind and water effects on bridges - superstructure, deck, cables, and substructure (including soil), primarily during storms and flood events - and the risks that these loads pose to structural failure. For flood events at bridges, another major focus of the work is assessment of the risk to bridges caused by scour of stream and riverbed material away from the foundations of a bridge. Other areas of current research include modeling of flow through culverts to improve designs allowing for fish passage, modeling of salt spray transport into bridge girders to address the suitability of using weathering steel in bridges, and CFD analysis of the operation of the wind tunnel in the TFHRC wind engineering laboratory. This quarterly report documents technical progress on the project tasks for the period of October through December 2011.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lottes, S.A.; Bojanowski, C.; Shen, J.
2012-06-28
The computational fluid dynamics (CFD) and computational structural mechanics (CSM) focus areas at Argonne's Transportation Research and Analysis Computing Center (TRACC) initiated a project to support and complement the experimental programs at the Turner-Fairbank Highway Research Center (TFHRC) with high-performance-computing-based analysis capabilities in August 2010. The project was established with a new interagency agreement between the Department of Energy and the Department of Transportation to provide collaborative research, development, and benchmarking of advanced three-dimensional computational mechanics analysis methods to the aerodynamics and hydraulics laboratories at TFHRC for a period of five years, beginning in October 2010. The analysis methods employ well-benchmarked and supported commercial computational mechanics software. Computational mechanics encompasses the areas of Computational Fluid Dynamics (CFD), Computational Wind Engineering (CWE), Computational Structural Mechanics (CSM), and Computational Multiphysics Mechanics (CMM) applied in Fluid-Structure Interaction (FSI) problems. The major areas of focus of the project are wind and water effects on bridges - superstructure, deck, cables, and substructure (including soil), primarily during storms and flood events - and the risks that these loads pose to structural failure. For flood events at bridges, another major focus of the work is assessment of the risk to bridges caused by scour of stream and riverbed material away from the foundations of a bridge. Other areas of current research include modeling of flow through culverts to improve designs allowing for fish passage, modeling of salt spray transport into bridge girders to address the suitability of using weathering steel in bridges, and CFD analysis of the operation of the wind tunnel in the TFHRC wind engineering laboratory. This quarterly report documents technical progress on the project tasks for the period of January through March 2012.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lottes, S.A.; Kulak, R.F.; Bojanowski, C.
2011-08-26
The computational fluid dynamics (CFD) and computational structural mechanics (CSM) focus areas at Argonne's Transportation Research and Analysis Computing Center (TRACC) initiated a project to support and complement the experimental programs at the Turner-Fairbank Highway Research Center (TFHRC) with high-performance-computing-based analysis capabilities in August 2010. The project was established with a new interagency agreement between the Department of Energy and the Department of Transportation to provide collaborative research, development, and benchmarking of advanced three-dimensional computational mechanics analysis methods to the aerodynamics and hydraulics laboratories at TFHRC for a period of five years, beginning in October 2010. The analysis methods employ well-benchmarked and supported commercial computational mechanics software. Computational mechanics encompasses the areas of Computational Fluid Dynamics (CFD), Computational Wind Engineering (CWE), Computational Structural Mechanics (CSM), and Computational Multiphysics Mechanics (CMM) applied in Fluid-Structure Interaction (FSI) problems. The major areas of focus of the project are wind and water loads on bridges - superstructure, deck, cables, and substructure (including soil), primarily during storms and flood events - and the risks that these loads pose to structural failure. For flood events at bridges, another major focus of the work is assessment of the risk to bridges caused by scour of stream and riverbed material away from the foundations of a bridge. Other areas of current research include modeling of flow through culverts to assess them for fish passage, modeling of salt spray transport into bridge girders to address the suitability of using weathering steel in bridges, vehicle stability under high wind loading, and the use of electromagnetic shock absorbers to improve vehicle stability under high wind conditions. This quarterly report documents technical progress on the project tasks for the period of April through June 2011.
Prediction and characterization of application power use in a high-performance computing environment
Bugbee, Bruce; Phillips, Caleb; Egan, Hilary; ...
2017-02-27
Power use in data centers and high-performance computing (HPC) facilities has grown in tandem with increases in the size and number of these facilities. Substantial innovation is needed to enable meaningful reduction in energy footprints in leadership-class HPC systems. In this paper, we focus on characterizing and investigating application-level power usage. We demonstrate potential methods for predicting power usage based on a priori and in situ characteristics. Lastly, we highlight a potential use case of this method through a simulated power-aware scheduler using historical jobs from a real scientific HPC system.
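A hedged sketch of the kind of application-level power prediction discussed above follows: regress average per-node power on a priori job features. The features and data here are synthetic stand-ins, and the random-forest model is one reasonable choice, not the paper's exact method.

```python
# Sketch: predict per-node power from a priori job features using a
# random-forest regressor. The features and "truth" model are synthetic.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import mean_absolute_error
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)
n_jobs = 2000
# a priori features: node count, requested walltime (h), application class id
nodes = rng.integers(1, 256, n_jobs)
walltime = rng.uniform(0.5, 48.0, n_jobs)
app = rng.integers(0, 5, n_jobs)
# synthetic ground truth: per-node power driven mostly by application class
per_node_w = 180 + 40 * app + 0.05 * walltime + rng.normal(0, 15, n_jobs)

X = np.column_stack([nodes, walltime, app])
X_tr, X_te, y_tr, y_te = train_test_split(X, per_node_w, random_state=0)

model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X_tr, y_tr)
pred = model.predict(X_te)
print(f"MAE: {mean_absolute_error(y_te, pred):.1f} W per node")
```

A predictor of this form could feed a power-aware scheduler of the kind simulated in the paper, which needs per-job power estimates before jobs run.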
DOE Office of Scientific and Technical Information (OSTI.GOV)
Weiss, Charles J.; Egbert, Jonathan D.; Chen, Shentan
2014-04-28
Treatment of trans-[W(N2)2(dppe)(PEtNMePEt)] (dppe = Ph2PCH2CH2PPh2; PEtNMePEt = Et2PCH2N(Me)CH2PEt2) with three equivalents of tetrafluoroboric acid (HBF4∙Et2O) at -78 °C generated the seven-coordinate tungsten hydride trans-[W(N2)2(H)(dppe)(PEtNMePEt)][BF4]. Depending on the temperature of the reaction, protonation of a pendant amine is also observed, affording trans-[W(N2)2(H)(dppe)(PEtNMe(H)PEt)][BF4]2, with formation of the hydrazido complex, [W(NNH2)(dppe)(PEtNMe(H)PEt)][BF4]2, as a minor product. Similar product mixtures were obtained using triflic acid (HOTf). Upon acid addition to the carbonyl analogue, cis-[W(CO)2(dppe)(PEtNMePEt)], the seven-coordinate carbonyl-hydride complex trans-[W(CO)2(H)(dppe)(PEtN(H)MePEt)][OTf]2 was generated. The mixed diphosphine complex without the pendant amine in the ligand backbone, trans-[W(N2)2(dppe)(depp)] (depp = Et2P(CH2)3PEt2), was synthesized and treated with HBF4∙Et2O, selectively generating a hydrazido complex, [W(NNH2)(F)(dppe)(depp)][BF4]. Computational analysis was used to probe the proton affinities of three protonation sites in these complexes: the metal, the pendant amine, and the N2 ligand. Room-temperature reactions with 100 equivalents of HOTf produced NH4+ from reduction of the N2 ligand (the electrons come from W). The addition of 100 equivalents of HOTf to trans-[W(N2)2(dppe)(PEtNMePEt)] afforded 0.88 ± 0.02 equivalents of NH4+, while 0.36 ± 0.02 equivalents of NH4+ were formed upon treatment of trans-[W(N2)2(dppe)(depp)], the complex without the pendant amine. This work was supported as part of the Center for Molecular Electrocatalysis, an Energy Frontier Research Center funded by the U.S. Department of Energy Office of Science, Office of Basic Energy Sciences. Computational resources were provided by the National Energy Research Scientific Computing Center (NERSC) at Lawrence Berkeley National Laboratory. Pacific Northwest National Laboratory is operated by Battelle for DOE.
High Pressure Water Stripping Using Multi-Orifice Nozzles
NASA Technical Reports Server (NTRS)
Hoppe, David
1999-01-01
The use of multi-orifice rotary nozzles greatly increases the speed and stripping effectiveness of high pressure water blasting systems, but also greatly increases the complexity of selecting and optimizing the operating parameters. The rotational speed of the nozzle must be coupled with its transverse velocity as it passes across the surface of the substrate being stripped. The radial and angular positions of each orifice must be included in the analysis of the nozzle configuration. Orifices at the outer edge of the nozzle head move at a faster rate than orifices located near the center. The energy transmitted to the surface from the impact force of the water stream from an outer orifice is therefore spread over a larger area than energy from an inner orifice. Utilizing a larger diameter orifice in the outer radial positions increases the total energy transmitted from the outer orifice to compensate for the wider distribution of energy. The total flow rate from the combination of all orifices must be monitored and kept below the pump capacity while choosing the orifice to insert in each position. The energy distribution from the orifice pattern is further complicated because the rotary paths of all the orifices in the nozzle head pass through the center section. All orifices contribute to the stripping in the center of the path, while only the outermost orifice contributes to the stripping at the edge of the nozzle. Additional orifices contribute to the stripping from the outer edge toward the center section. With all these parameters to configure, and each parameter change affecting the others, a computer model was developed to track and coordinate them. The computer simulation graphically indicates the cumulative effect of each parameter selected. The result of proper parameter choices is a well designed, highly efficient stripping system; a poorly chosen set of parameters will cause the nozzle to strip aggressively in some areas while leaving the coating untouched in adjacent sections. The high pressure water stripping system can be set to extremely aggressive conditions, allowing stripping of hard-to-remove adhesives, paint systems, and even cladding and chromate conversion coatings. The energy force can also be reduced to strip coatings from thin aluminum substrates without causing any damage or deterioration to the substrate's surface. High pressure water stripping of aerospace components has thus proven to be an efficient and cost effective method for cleaning and removing coatings.
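The geometry described above can be made concrete with a small calculation. The sketch below is not the article's model: it assumes point orifices whose flow scales with orifice area and uses the dwell-time density of uniform circular motion projected onto the traverse axis to estimate the relative energy deposited across the swept path. The radii and diameters are invented.

```python
import numpy as np

# Hypothetical orifice layout (not from the article): radial positions (mm)
# and diameters (mm) of the orifices on the rotary head.
radii = np.array([10.0, 25.0, 40.0])
diams = np.array([0.8, 1.0, 1.2])
flows = diams**2  # flow, and delivered energy, scale with orifice area

# Lateral offsets from the traverse path (mm); keep away from each orbit's
# rim, where the idealized point-orifice dwell density diverges.
x = np.linspace(-39.0, 39.0, 781)
energy = np.zeros_like(x)
for r, q in zip(radii, flows):
    inside = np.abs(x) < r
    # A point moving uniformly on a circle of radius r, projected onto the
    # traverse axis, has dwell density 1/(pi*sqrt(r^2 - x^2)) at offset x.
    energy[inside] += q / (np.pi * np.sqrt(r**2 - x[inside]**2))

mid = np.argmin(np.abs(x))         # path center: all orifices contribute
off = np.argmin(np.abs(x - 30.0))  # outer band: only the outermost orifice
print(f"relative energy at center vs 30 mm: {energy[mid]:.4f} vs {energy[off]:.4f}")
```

The output reproduces the qualitative point in the abstract: every orifice sweeps through the center of the path, while only the outermost orifice reaches the outer band.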
Index to NASA Tech Briefs, 1974
NASA Technical Reports Server (NTRS)
1975-01-01
The following information was given for 1974: (1) abstracts of reports dealing with new technology derived from the research and development activities of NASA or the U.S. Atomic Energy Commission, arranged by subjects: electronics/electrical, electronics/electrical systems, physical sciences, materials/chemistry, life sciences, mechanics, machines, equipment and tools, fabrication technology, and computer programs, (2) indexes for the above documents: subject, personal author, originating center.
ERIC Educational Resources Information Center
Yarker, Morgan Brown
2013-01-01
Research suggests that scientific models and modeling should be topics covered in K-12 classrooms as part of a comprehensive science curriculum. It is especially important when talking about topics in weather and climate, where computer and forecast models are the center of attention. There are several approaches to model-based inquiry, but it can…
Development of a thermal and structural analysis procedure for cooled radial turbines
NASA Technical Reports Server (NTRS)
Kumar, Ganesh N.; Deanna, Russell G.
1988-01-01
A procedure for computing the rotor temperature and stress distributions in a cooled radial turbine is considered. Existing codes for modeling the external mainstream flow and the internal cooling flow are used to compute boundary conditions for the heat transfer and stress analyses. An inviscid, quasi three-dimensional code computes the external free stream velocity. The external velocity is then used in a boundary layer analysis to compute the external heat transfer coefficients. Coolant temperatures are computed by a viscous one-dimensional internal flow code that solves the momentum and energy equations. These boundary conditions are input to a three-dimensional heat conduction code for calculation of rotor temperatures. The rotor stress distribution may then be determined for the given thermal, pressure, and centrifugal loading. The procedure is applied to a cooled radial turbine which will be tested at the NASA Lewis Research Center. Representative results from this case are included.
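The hand-off between the flow analyses and the conduction code amounts to supplying convective boundary conditions on the external and internal surfaces. A generic form is sketched below (not necessarily the report's exact formulation):

\[
q''_{\mathrm{ext}} = h_{g}\,\bigl(T_{aw} - T_{w}\bigr),
\qquad
q''_{\mathrm{int}} = h_{c}\,\bigl(T_{w} - T_{c}\bigr),
\]

where h_g and the adiabatic wall temperature T_aw come from the external boundary-layer analysis, h_c and the coolant temperature T_c come from the internal cooling-flow code, and T_w is the local wall temperature the conduction code solves for.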
Role of the ATLAS Grid Information System (AGIS) in Distributed Data Analysis and Simulation
NASA Astrophysics Data System (ADS)
Anisenkov, A. V.
2018-03-01
In modern high-energy physics experiments, particular attention is paid to the global integration of information and computing resources into a unified system for efficient storage and processing of experimental data. Annually, the ATLAS experiment at the Large Hadron Collider at the European Organization for Nuclear Research (CERN) produces tens of petabytes of raw data from the recording electronics and several petabytes of data from the simulation system. For processing and storage of such super-large volumes of data, the computing model of the ATLAS experiment is based on a heterogeneous, geographically distributed computing environment, which includes the worldwide LHC computing grid (WLCG) infrastructure and is able to meet the requirements of the experiment for processing huge data sets and to provide a high degree of accessibility (hundreds of petabytes). The paper considers the ATLAS grid information system (AGIS) used by the ATLAS collaboration to describe the topology and resources of the computing infrastructure, to configure and connect the high-level software systems of computer centers, and to describe and store all possible parameters, control, configuration, and other auxiliary information required for the effective operation of the ATLAS distributed computing applications and services. The role of the AGIS system in unifying the description of the computing resources provided by grid sites, supercomputer centers, and cloud computing into a consistent information model for the ATLAS experiment is outlined. This approach has allowed the collaboration to extend the computing capabilities of the WLCG project and to integrate supercomputers and cloud computing platforms into the software components of the production and distributed analysis workload management system (PanDA, ATLAS).
Validation of a Node-Centered Wall Function Model for the Unstructured Flow Code FUN3D
NASA Technical Reports Server (NTRS)
Carlson, Jan-Renee; Vatsa, Veer N.; White, Jeffery
2015-01-01
In this paper, the implementation of two wall function models in the Reynolds-averaged Navier-Stokes (RANS) computational fluid dynamics (CFD) code FUN3D is described. FUN3D is a node-centered method for solving the three-dimensional Navier-Stokes equations on unstructured computational grids. The first wall function model, based on the work of Knopp et al., is used in conjunction with the one-equation turbulence model of Spalart-Allmaras. The second wall function model, also based on the work of Knopp, is used in conjunction with the two-equation k-ω turbulence model of Menter. The wall function models compute the wall momentum and energy flux, which are used to weakly enforce the wall velocity and pressure flux boundary conditions in the mean flow momentum and energy equations. These wall conditions are implemented in an implicit form where the contribution of the wall function model to the Jacobian is also included. The boundary conditions of the turbulence transport equations are enforced explicitly (strongly) on all solid boundaries. The use of the wall function models is demonstrated on four test cases: a flat plate boundary layer, a subsonic diffuser, a 2D airfoil, and a 3D semi-span wing. Where possible, different near-wall viscous spacing tactics are examined. Iterative residual convergence was obtained in most cases. Solution results are compared with theoretical and experimental data for several variations of grid spacing. In general, very good comparisons with data were achieved.
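Knopp-type wall functions blend the viscous sublayer with the logarithmic layer; as a textbook sketch (not FUN3D's exact blending), the wall momentum flux is recovered from the velocity at the first grid point via the log law:

\[
u^{+} \equiv \frac{u}{u_{\tau}} = \frac{1}{\kappa}\,\ln y^{+} + B,
\qquad
y^{+} = \frac{\rho\,u_{\tau}\,y}{\mu},
\qquad
\tau_{w} = \rho\,u_{\tau}^{2},
\]

with κ ≈ 0.41 and B ≈ 5.0. Solving implicitly for the friction velocity u_τ, given the solved velocity u at wall distance y, yields the wall shear stress τ_w that weakly enforces the momentum boundary condition.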
Review of optical wireless communications for data centers
NASA Astrophysics Data System (ADS)
Arnon, Shlomi
2017-10-01
A data center (DC) is a facility, either physical or virtual, for running applications and for searching, storage, management, and dissemination of information, known as cloud computing, and it consumes a huge amount of energy. A DC includes thousands of servers, communication and storage equipment, and a support system including air conditioning, security, monitoring equipment, and electricity regulator units. Data center operators face the challenge of meeting exponentially increasing demands for network bandwidth without unreasonable increases in operation and infrastructure cost. Meeting these demands with only a moderate increase in operation and infrastructure cost requires a technology revolution. One way to overcome the shortcomings of traditional static (wired) data center architectures is to use a hybrid network based on fiber and optical wireless communication (OWC), or free-space optics (FSO). The OWC link could be deployed on top of the existing cable/fiber network layer, so that live migration could be done easily and dynamically. In that case the network topology is flexible and adapts quickly to changes in traffic, heat distribution, power consumption, and characteristics of the applications. In addition, OWC could provide an easy way to maintain and scale up data centers. As a result, total cost of ownership could be reduced and the return on investment increased. In this talk we review the main OWC technologies applicable to data centers, indicate how energy could be saved using OWC multichannel communication, and discuss the issue of OWC pointing accuracy for the data center scenario.
A Machine Learning Framework to Forecast Wave Conditions
NASA Astrophysics Data System (ADS)
Zhang, Y.; James, S. C.; O'Donncha, F.
2017-12-01
Recently, significant effort has been undertaken to quantify and extract wave energy because it is renewable, environmentally friendly, abundant, and often close to population centers. However, a major challenge is the ability to accurately and quickly predict energy production, especially across a 48-hour cycle. Accurate forecasting of wave conditions is a challenging undertaking that typically involves solving the spectral action-balance equation on a discretized grid with high spatial resolution. The nature of the computations typically demands high-performance computing infrastructure. Using a case-study site at Monterey Bay, California, a machine learning framework was trained to replicate numerically simulated wave conditions at a fraction of the typical computational cost. Specifically, the physics-based Simulating WAves Nearshore (SWAN) model, driven by measured wave conditions, nowcast ocean currents, and wind data, was used to generate training data for machine learning algorithms. The model was run between April 1st, 2013 and May 31st, 2017, generating forecasts at three-hour intervals and yielding 11,078 distinct model outputs. SWAN-generated fields of 3,104 wave heights and a characteristic period could be replicated through simple matrix multiplications using the mapping matrices from machine learning algorithms. In fact, wave-height RMSEs from the machine learning algorithms (9 cm) were less than those for the SWAN model-verification exercise where those simulations were compared to buoy wave data within the model domain (>40 cm). The validated machine learning approach, which acts as an accurate surrogate for the SWAN model, can now be used to perform real-time forecasts of wave conditions for the next 48 hours using available forecasted boundary wave conditions, ocean currents, and winds. This solution has obvious applications to wave-energy generation, as accurate wave conditions can be forecasted with over a three-order-of-magnitude reduction in computational expense. The low computational cost (and by association low computer-power requirement) means that the machine learning algorithms could be installed on a wave-energy converter as a form of "edge computing" where a device could forecast its own 48-hour energy production.
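The surrogate described here is, at inference time, a single matrix multiplication. A minimal sketch of that structure, with random stand-ins for the SWAN training data and invented dimensions apart from the 3,104-point field mentioned above:

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in training set (invented): each row pairs forcing conditions
# (boundary wave spectra, currents, winds) with a SWAN-computed field of
# wave heights at n_grid points.
n_runs, n_inputs, n_grid = 500, 12, 3104
X = rng.normal(size=(n_runs, n_inputs))                    # forcing inputs
W_true = rng.normal(size=(n_inputs, n_grid))
Y = X @ W_true + 0.01 * rng.normal(size=(n_runs, n_grid))  # proxy for SWAN output

# Fit the mapping matrix by least squares; a forecast is then one matrix
# multiplication, as the abstract describes.
W, *_ = np.linalg.lstsq(X, Y, rcond=None)
forecast = rng.normal(size=(1, n_inputs)) @ W
print(forecast.shape)  # (1, 3104) surrogate wave heights
```

The three-order-of-magnitude speedup follows from replacing a full spectral solve with this one dense matrix-vector product per forecast.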
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gerber, Richard; Hack, James; Riley, Katherine
The mission of the U.S. Department of Energy Office of Science (DOE SC) is the delivery of scientific discoveries and major scientific tools to transform our understanding of nature and to advance the energy, economic, and national security missions of the United States. Achieving these goals in today's world requires investments not only in the traditional scientific endeavors of theory and experiment, but also in computational science and the facilities that support large-scale simulation and data analysis. The Advanced Scientific Computing Research (ASCR) program addresses these challenges in the Office of Science. ASCR's mission is to discover, develop, and deploy computational and networking capabilities to analyze, model, simulate, and predict complex phenomena important to DOE. ASCR supports research in computational science, three high-performance computing (HPC) facilities — the National Energy Research Scientific Computing Center (NERSC) at Lawrence Berkeley National Laboratory and Leadership Computing Facilities at Argonne (ALCF) and Oak Ridge (OLCF) National Laboratories — and the Energy Sciences Network (ESnet) at Berkeley Lab. ASCR is guided by science needs as it develops research programs, computers, and networks at the leading edge of technologies. As we approach the era of exascale computing, technology changes are creating challenges for science programs in SC for those who need to use high performance computing and data systems effectively. Numerous significant modifications to today's tools and techniques will be needed to realize the full potential of emerging computing systems and other novel computing architectures. To assess these needs and challenges, ASCR held a series of Exascale Requirements Reviews in 2015–2017, one with each of the six SC program offices, and a subsequent Crosscut Review that sought to integrate the findings from each. Participants at the reviews were drawn from the communities of leading domain scientists, experts in computer science and applied mathematics, ASCR facility staff, and DOE program managers in ASCR and the respective program offices. The purpose of these reviews was to identify mission-critical scientific problems within the DOE Office of Science (including experimental facilities) and determine the requirements for the exascale ecosystem that would be needed to address those challenges. The exascale ecosystem includes exascale computing systems, high-end data capabilities, efficient software at scale, libraries, tools, and other capabilities. This effort will contribute to the development of a strategic roadmap for ASCR compute and data facility investments and will help the ASCR Facility Division establish partnerships with Office of Science stakeholders. It will also inform the Office of Science research needs and agenda. The results of the six reviews have been published in reports available on the web at http://exascaleage.org/. This report presents a summary of the individual reports and of common and crosscutting findings, and it identifies opportunities for productive collaborations among the DOE SC program offices.
Towards prediction of correlated material properties using quantum Monte Carlo methods
NASA Astrophysics Data System (ADS)
Wagner, Lucas
Correlated electron systems offer a richness of physics far beyond noninteracting systems. If we would like to pursue the dream of designer correlated materials, or, more modestly, to explain in detail the properties and effective physics of known materials, then accurate simulation methods are required. Using modern computational resources, quantum Monte Carlo (QMC) techniques offer a way to directly simulate electron correlations. I will show some recent results on a few extremely challenging materials, including the metal-insulator transition of VO2, the ground state of the doped cuprates, and the pressure dependence of magnetic properties in FeSe. By using a relatively simple implementation of QMC, at least some properties of these materials can be described truly from first principles, without any adjustable parameters. Using the QMC platform, we have developed a way of systematically deriving effective lattice models from the simulation. This procedure is particularly attractive for correlated electron systems because the QMC methods treat the one-body and many-body components of the wave function and Hamiltonian on completely equal footing. I will show some examples of using this downfolding technique and the high accuracy of QMC to connect our intuitive ideas about interacting electron systems with high-fidelity simulations. The work in this presentation was supported in part by NSF DMR 1206242, the U.S. Department of Energy, Office of Science, Office of Advanced Scientific Computing Research, Scientific Discovery through Advanced Computing (SciDAC) program under Award Number FG02-12ER46875, and the Center for Emergent Superconductivity, Department of Energy Frontier Research Center under Grant No. DEAC0298CH1088. Computing resources were provided by a Blue Waters Illinois grant and INCITE PhotSuper and SuperMatSim allocations.
An Advanced Framework for Improving Situational Awareness in Electric Power Grid Operation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chen, Yousu; Huang, Zhenyu; Zhou, Ning
With the deployment of new smart grid technologies and the penetration of renewable energy in power systems, significant uncertainty and variability are being introduced into power grid operation. Traditionally, the Energy Management System (EMS) operates the power grid in a deterministic mode, and thus will not be sufficient for the future control center in a stochastic environment with faster dynamics. One of the main challenges is to improve situational awareness. This paper reviews the current status of power grid operation and presents a vision of improving wide-area situational awareness for a future control center. An advanced framework, consisting of parallel state estimation, state prediction, parallel contingency selection, parallel contingency analysis, and advanced visual analytics, is proposed to provide the capabilities needed for better decision support by utilizing high performance computing (HPC) techniques and advanced visual analytic techniques. Research results are presented to support the proposed vision and framework.
Gradient gravitational search: An efficient metaheuristic algorithm for global optimization.
Dash, Tirtharaj; Sahu, Prabhat K
2015-05-30
The adaptation of novel techniques developed in the field of computational chemistry to problems involving large and flexible molecules is taking center stage with regard to algorithmic efficiency, computational cost, and accuracy. In this article, the gradient-based gravitational search (GGS) algorithm, which uses analytical gradients for fast minimization to the nearest local minimum, is reported. Its efficiency as a metaheuristic approach has also been compared with Gradient Tabu Search and with the Gravitational Search, Cuckoo Search, and Back Tracking Search algorithms for global optimization. Moreover, the GGS approach has been applied to computational chemistry problems, finding the minimum potential energy of two-dimensional and three-dimensional off-lattice protein models. The simulation results reveal the relative stability and physical accuracy of the protein models with efficient computational cost. © 2015 Wiley Periodicals, Inc.
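A minimal sketch of the GGS idea under stated assumptions: a gravitational-search population update followed by a short analytical-gradient descent move toward the nearest local minimum. The test function, constants, and schedule are illustrative, not the authors':

```python
import numpy as np

rng = np.random.default_rng(1)

def f(x):  # Rosenbrock test function, standing in for a molecular energy surface
    return (1 - x[0])**2 + 100 * (x[1] - x[0]**2)**2

def grad(x):  # analytical gradient, the ingredient GGS adds to the search
    return np.array([-2 * (1 - x[0]) - 400 * x[0] * (x[1] - x[0]**2),
                     200 * (x[1] - x[0]**2)])

n_agents, dim = 20, 2
X = rng.uniform(-2.0, 2.0, (n_agents, dim))
V = np.zeros((n_agents, dim))

for it in range(200):
    fit = np.array([f(x) for x in X])
    M = fit.max() - fit + 1e-12
    M /= M.sum()                   # heavier masses for better agents
    G_t = np.exp(-5.0 * it / 200)  # decaying gravitational constant
    for i in range(n_agents):
        F = np.zeros(dim)
        for j in range(n_agents):
            if i != j:
                d = np.linalg.norm(X[j] - X[i]) + 1e-12
                F += rng.random() * G_t * M[j] * (X[j] - X[i]) / d
        V[i] = rng.random() * V[i] + F
    X = np.clip(X + V, -2.5, 2.5)  # gravitational move, kept inside the box
    # Gradient refinement: short steepest-descent steps toward local minima
    X = np.clip(X - 1e-3 * np.array([grad(x) for x in X]), -2.5, 2.5)

print(X[np.argmin([f(x) for x in X])])  # best agent; the global minimum is (1, 1)
```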
Enabling opportunistic resources for CMS Computing Operations
Hufnagel, Dirk
2015-12-23
With the increased pressure on computing brought by the higher energy and luminosity from the LHC in Run 2, CMS Computing Operations expects to require the ability to utilize "opportunistic" resources, that is, resources not owned by, or a priori configured for, CMS, to meet peak demands. In addition to our dedicated resources, we look to add computing resources from non-CMS grids, cloud resources, and national supercomputing centers. CMS uses the HTCondor/glideinWMS job submission infrastructure for all its batch processing, so such resources will need to be transparently integrated into its glideinWMS pool. Bosco and parrot wrappers are used to enable access and bring the CMS environment into these non-CMS resources. Finally, we describe our strategy to supplement our native capabilities with opportunistic resources and our experience so far using them.
Energy Materials Center at Cornell: Final Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Abruña, Héctor; Mutolo, Paul F
2015-01-02
The mission of the Energy Materials Center at Cornell (emc2) was to achieve a detailed understanding, via a combination of synthesis of new materials and experimental and computational approaches, of how the nature, structure, and dynamics of nanostructured interfaces affect energy conversion and storage, with emphasis on fuel cells, batteries, and supercapacitors. Our research on these systems was organized around a full-system strategy for: the development and improved performance of materials for both electrodes, at which storage or conversion occurs; understanding their internal interfaces, such as SEI layers in batteries and electrocatalyst supports in fuel cells, and methods for structuring them to enable high mass transport as well as high ionic and electronic conductivity; development of ion-conducting electrolytes for batteries and fuel cells (separately) and other separator components, as needed; and development of methods for the characterization of these systems under operating conditions (operando methods). Generally, our work took industry and DOE report findings on current materials as a point of departure and focused on novel material sets for improved performance. In addition, some of our work focused on studying existing materials, for example observing battery solvent degradation, fuel cell catalyst coarsening, or monitoring lithium dendrite growth, employing in operando methods developed within the center.
Nonlinear functional for solvation in Density Functional Theory
NASA Astrophysics Data System (ADS)
Gunceler, Deniz; Sundararaman, Ravishankar; Schwarz, Kathleen; Letchworth-Weaver, Kendra; Arias, T. A.
2013-03-01
Density functional calculations of molecules and surfaces in a liquid can accelerate the development of many technologies, ranging from solar energy harvesting to lithium batteries. Such studies require the development of robust functionals describing the liquid. Polarizable continuum models (PCMs) have been applied to some solvated systems, but they do not sufficiently capture solvation effects to describe highly polar systems like surfaces of ionic solids. In this work, we present a nonlinear fluid functional within the framework of Joint Density Functional Theory. The fluid is treated not as a linear dielectric, but as a distribution of dipoles that responds to the solute, which we describe starting from the exact free energy functional for point dipoles. We also show that PCMs can be recovered as the linear limit of our functional. Our description is of similar computational cost to PCMs, and captures complex solvation effects like dielectric saturation without requiring new fit parameters. For polar and nonpolar molecules, it achieves millihartree-level agreement with experimental solvation energies. Furthermore, our functional now makes it possible to investigate chemistry on the surfaces of lithium battery materials, which PCMs predict to be unstable. Supported as part of the Energy Materials Center at Cornell, an Energy Frontier Research Center funded by the U.S. Department of Energy, Office of Science, Office of Basic Energy Sciences under Award Number DE-SC0001086.
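The point-dipole starting point admits a closed form. For non-interacting point dipoles of moment p at number density N in a field E, the orientational free energy per unit volume and the resulting polarization are the standard Langevin results sketched below; the paper's functional builds on this, so take it as background rather than the authors' final form:

\[
A(E) = -\,N\,k_{B}T\,\ln\!\frac{\sinh x}{x},
\qquad
P(E) = N\,p\,L(x),
\qquad
L(x) = \coth x - \frac{1}{x},
\quad
x = \frac{pE}{k_{B}T}.
\]

For small fields, L(x) ≈ x/3 gives the linear response P ≈ Np²E/(3k_BT), the PCM-like linear-dielectric limit mentioned in the abstract; at large fields L → 1 and the polarization saturates, which is the dielectric saturation a linear model cannot capture.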
Proceedings of the 5. joint Russian-American computational mathematics conference
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
1997-12-31
These proceedings contain a record of the talks presented and papers submitted by participants. The conference participants represented three institutions from the United States, Sandia National Laboratories (SNL), Los Alamos National Laboratory (LANL), Lawrence Livermore National Laboratory (LLNL), and two from Russia, Russian Federal Nuclear Center--All Russian Research Institute of Experimental Physics (RFNC-VNIIEF/Arzamas-16), and Russian Federal Nuclear Center--All Russian Research Institute of Technical Physics (RFNC-VNIITF/Chelyabinsk-70). The presentations and papers cover a wide range of applications from radiation transport to materials. Selected papers have been indexed separately for inclusion in the Energy Science and Technology Database.
Collision for Li++He System. I. Potential Curves and Non-Adiabatic Coupling Matrix Elements
NASA Astrophysics Data System (ADS)
Yoshida, Junichi; O-Ohata, Kiyosi
1984-02-01
The potential curves and the non-adiabatic coupling matrix elements for the Li+ + He collision system were computed. The SCF molecular orbitals were constructed with CGTO atomic bases centered on each nucleus and on the center of mass of the two nuclei. The SCF and CI calculations were done at various internuclear distances in the range of 0.1-25.0 a.u. The potential energies and the wavefunctions were calculated to good approximation over the whole range of internuclear distances. The non-adiabatic coupling matrix elements were calculated with a tentative method in which the electron translation factors (ETFs) are approximately taken into account.
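For background, the radial coupling elements referred to here have the generic textbook form (the paper's ETF-corrected expression modifies this):

\[
F_{ij}(R) = \Bigl\langle \psi_{i}(\mathbf{r};R) \Bigm| \frac{\partial}{\partial R} \Bigm| \psi_{j}(\mathbf{r};R) \Bigr\rangle,
\]

which couples adiabatic electronic states i and j through the nuclear motion; ETF corrections are introduced so that the matrix elements remain well behaved as R becomes large.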
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ghosh, Soumya; Soudackov, Alexander V.; Hammes-Schiffer, Sharon
Electron transfer and proton-coupled electron transfer (PCET) reactions at electrochemical interfaces play an essential role in a broad range of energy conversion processes. The reorganization energy, which is a measure of the free energy change associated with solute and solvent rearrangements, is a key quantity for calculating rate constants for these reactions. We present a computational method for including the effects of the double layer and the ionic environment of the diffuse layer in calculations of electrochemical solvent reorganization energies. This approach incorporates an accurate electronic charge distribution of the solute within a molecular-shaped cavity in conjunction with a dielectric continuum treatment of the solvent, ions, and electrode using the integral equations formalism polarizable continuum model. The molecule-solvent boundary is treated explicitly, but the effects of the electrode-double layer and double layer-diffuse layer boundaries, as well as the effects of the ionic strength of the solvent, are included through an external Green's function. The calculated total reorganization energies agree well with experimentally measured values for a series of electrochemical systems, and the effects of including both the double layer and the ionic environment are found to be very small. This general approach was also extended to electrochemical PCET and produced total reorganization energies in close agreement with experimental values for two experimentally studied PCET systems. This research was supported as part of the Center for Molecular Electrocatalysis, an Energy Frontier Research Center, funded by the U.S. Department of Energy, Office of Science, Office of Basic Energy Sciences.
Suzuki, Kimichi; Morokuma, Keiji; Maeda, Satoshi
2017-10-05
We propose a multistructural microiteration (MSM) method for geometry optimization and reaction path calculation in large systems. MSM is a simple extension of the geometrical microiteration technique. In conventional microiteration, the structure of the non-reaction-center (surrounding) part is optimized, with the atoms of the reaction-center part fixed, before each displacement of the reaction-center atoms. In MSM, the surrounding part is instead described as the weighted sum of multiple surrounding structures that are optimized independently. Geometric displacements of the reaction-center atoms are then performed in the mean field generated by the weighted sum of the surrounding parts. MSM was combined with the QM/MM-ONIOM method and applied to chemical reactions in aqueous solution or in an enzyme. In all three cases, MSM gave lower reaction energy profiles than the QM/MM-ONIOM-microiteration method over the entire reaction paths, with comparable computational costs. © 2017 Wiley Periodicals, Inc.
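A toy sketch of the mean-field idea: displacements of the reaction-center coordinate are driven by the weighted sum of gradients from several independently optimized surrounding structures. The energy model, weights, and structures below are invented for illustration and are not the QM/MM-ONIOM implementation.

```python
import numpy as np

# Toy model: quadratic coupling of one reaction-center coordinate x to a
# frozen surrounding structure s (invented for illustration).
def gradient(x, s):
    return x - s.mean()

# Several independently optimized surrounding structures with assumed weights
surroundings = [np.array([0.9, 1.1]), np.array([1.4, 1.6]), np.array([0.4, 0.6])]
weights = np.array([0.5, 0.3, 0.2])  # normalized

# Mean-field displacement of the reaction center: steepest descent on the
# weighted sum of the surrounding contributions (the MSM idea in miniature).
x = 3.0
for _ in range(100):
    g = sum(w * gradient(x, s) for w, s in zip(weights, surroundings))
    x -= 0.1 * g
print(x)  # converges to the weighted mean of the surrounding minima (1.05)
```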
Theoretical and material studies of thin-film electroluminescent devices
NASA Technical Reports Server (NTRS)
Summers, C. J.
1989-01-01
Thin-film electroluminescent (TFEL) devices are studied as a possible means of achieving a high-resolution, lightweight, compact video display panel for computer terminals or television screens. The performance of TFEL devices depends upon the probability of an electron impact exciting a luminescent center, which in turn depends upon the density of centers present in the semiconductor layer, the probability of an electron reaching the impact excitation threshold energy, and the collision cross section itself. The efficiency of such a device is presently very poor. It can best be improved by increasing the number of hot electrons capable of impact-exciting a center. Hot electron distributions and a method for increasing the efficiency and brightness of TFEL devices (with the additional advantage of low-voltage direct current operation) are investigated.
NASA Technical Reports Server (NTRS)
Poole, L. R.
1976-01-01
An initial attempt was made to verify the Langley Research Center and Virginia Institute of Marine Science mid-Atlantic continental-shelf wave refraction model. The model was used to simulate refraction occurring during a continental-shelf remote sensing experiment conducted on August 17, 1973. Simulated wave spectra compared favorably, in a qualitative sense, with the experimental spectra. However, it was observed that most of the wave energy resided at frequencies higher than those for which refraction and shoaling effects were predicted. In addition, variations among the experimental spectra were so small that they were not considered statistically significant. In order to verify the refraction model, simulation must be performed in conjunction with a set of significantly varying spectra in which a considerable portion of the total energy resides at frequencies for which refraction and shoaling effects are likely.
The experimental nuclear reaction data (EXFOR): Extended computer database and Web retrieval system
NASA Astrophysics Data System (ADS)
Zerkin, V. V.; Pritychenko, B.
2018-04-01
The EXchange FORmat (EXFOR) experimental nuclear reaction database and the associated Web interface provide access to the wealth of low- and intermediate-energy nuclear reaction physics data. This resource is based on numerical data sets and bibliographical information of ∼22,000 experiments since the beginning of nuclear science. The principles of the computer database organization, its extended contents and Web applications development are described. New capabilities for the data sets uploads, renormalization, covariance matrix, and inverse reaction calculations are presented. The EXFOR database, updated monthly, provides an essential support for nuclear data evaluation, application development, and research activities. It is publicly available at the websites of the International Atomic Energy Agency Nuclear Data Section, http://www-nds.iaea.org/exfor, the U.S. National Nuclear Data Center, http://www.nndc.bnl.gov/exfor, and the mirror sites in China, India and Russian Federation.
Computation of the dipole moments of proteins.
Antosiewicz, J
1995-10-01
A simple and computationally feasible procedure for the calculation of net charges and dipole moments of proteins at arbitrary pH and salt conditions is described. The method is intended to provide data that may be compared to the results of transient electric dichroism experiments on protein solutions. The procedure consists of three major steps: (i) calculation of self energies and interaction energies for ionizable groups in the protein by using the finite-difference Poisson-Boltzmann method, (ii) determination of the position of the center of diffusion (to which the calculated dipole moment refers) and the extinction coefficient tensor for the protein, and (iii) generation of the equilibrium distribution of protonation states of the protein by a Monte Carlo procedure, from which mean and root-mean-square dipole moments and optical anisotropies are calculated. The procedure is applied to 12 proteins. It is shown that it gives hydrodynamic and electrical parameters for proteins in good agreement with experimental data.
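Step (iii) is a standard Metropolis sampler over discrete protonation states. The sketch below substitutes invented site energies for the Poisson-Boltzmann inputs of step (i) and averages the net charge; mean and root-mean-square dipole moments follow the same averaging once site positions relative to the center of diffusion are included.

```python
import numpy as np

rng = np.random.default_rng(2)
kT = 0.593  # kcal/mol near 298 K

# Invented stand-ins for step (i): site self energies and pairwise
# interaction energies that would come from Poisson-Boltzmann calculations.
n_sites = 5
self_energy = rng.normal(0.0, 2.0, n_sites)
W = rng.normal(0.0, 0.5, (n_sites, n_sites))
W = 0.5 * (W + W.T)
np.fill_diagonal(W, 0.0)
charge_on = np.array([1.0, -1.0, 1.0, -1.0, 1.0])  # toy charge when protonated

def energy(s):  # s[i] = 1 if site i is protonated, else 0
    return self_energy @ s + 0.5 * s @ W @ s

# Step (iii): Metropolis Monte Carlo over protonation states
s = rng.integers(0, 2, n_sites).astype(float)
net_charges = []
for _ in range(20000):
    i = rng.integers(n_sites)
    s_trial = s.copy()
    s_trial[i] = 1.0 - s_trial[i]
    dE = energy(s_trial) - energy(s)
    if dE < 0 or rng.random() < np.exp(-dE / kT):
        s = s_trial
    net_charges.append(charge_on @ s)

print("mean net charge:", np.mean(net_charges))
```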
NASA Astrophysics Data System (ADS)
Lee, Myeong H.; Dunietz, Barry D.; Geva, Eitan
2014-03-01
We present a methodology for obtaining the photo-induced electron transfer rate constant in organic photovoltaic (OPV) materials within the framework of Fermi's golden rule, using inputs obtained from first-principles electronic structure calculations. Within this approach, the nuclear vibrational modes are treated quantum-mechanically and a short-time approximation is avoided, in contrast to the classical Marcus theory, where these modes are treated classically within the high-temperature and short-time limits. We demonstrate our methodology on the boron subphthalocyanine chloride/C60 OPV system to determine the rate constants of the electron transfer and electron recombination processes upon photo-excitation. We consider two representative donor/acceptor interface configurations to investigate the effect of interface configuration on the charge transfer characteristics of OPV materials. In addition, we determine the time scale of the excited-state populations by employing a master equation after obtaining the rate constants for all accessible electronic transitions. This work is pursued as part of the Center for Solar and Thermal Energy Conversion, an Energy Frontier Research Center funded by the US Department of Energy Office of Science, Office of Basic Energy Sciences under Award No. DE-SC0000957.
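For reference, the golden-rule rate and the classical high-temperature, short-time (Marcus) limit that the abstract contrasts are conventionally written as:

\[
k_{\mathrm{FGR}} = \frac{2\pi}{\hbar}\,|V|^{2}\,\mathrm{FCWD},
\qquad
k_{\mathrm{Marcus}} = \frac{2\pi}{\hbar}\,\frac{|V|^{2}}{\sqrt{4\pi\lambda k_{B}T}}
\exp\!\left[-\frac{(\Delta G^{\circ}+\lambda)^{2}}{4\lambda k_{B}T}\right],
\]

where V is the donor-acceptor electronic coupling, FCWD is the Franck-Condon-weighted density of states evaluated with quantized vibrational modes, λ is the reorganization energy, and ΔG° is the driving force.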
NASA Astrophysics Data System (ADS)
Uysal, Ahmet; Zhou, Hua; Lee, Sang Soo; Fenter, Paul; Feng, Guang; Li, Song; Cummings, Peter; Fulvio, Pasquale; Dai, Sheng; McDonough, Jake; Gogotsi, Yury
2014-03-01
Electrical double layer capacitors (EDLCs) with room temperature ionic liquid (RTIL) electrolytes and carbon electrodes are promising candidates for energy storage devices with high power density and long cycle life. We studied the potential- and time-dependent changes in the electric double layer (EDL) structure of an imidazolium-based RTIL electrolyte at an epitaxial graphene (EG) surface. We used in situ x-ray reflectivity (XR) to determine the EDL structure at static potentials, during cyclic voltammetry (CV), and in potential step measurements. The static potential structures were also investigated with fully atomistic molecular dynamics (MD) simulations. Combined XR and MD results show that the EDL structure has alternating anion/cation layers within the first nanometer of the interface. The dynamical response of the EDL to potential steps has a slow component (>10 s), and the RTIL structure shows hysteresis during CV scans. We propose a conceptual model that connects the nanoscale interfacial structure to the macroscopic measurements. This material is based upon work supported as part of the Fluid Interface Reactions, Structures and Transport (FIRST) Center, an Energy Frontier Research Center funded by the U.S. Department of Energy (DOE), Office of Science (SC), Office of Basic Energy Sciences.
Today's Leaders for a Sustainable Tomorrow
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wood, Bryan
2013-02-27
Today's Leaders for a Sustainable Tomorrow is a collaboration of five residential environmental learning centers (Audubon Center of the North Woods, Deep Portage Learning Center, Laurentian Environmental Center, Long Lake Conservation Center, and Wolf Ridge Environmental Learning Center) that together increased energy efficiency, energy conservation, and renewable energy technologies through a number of different means appropriate for each unique center. For energy efficiency upgrades, the centers installed envelope improvements to seal air barriers through better insulation in walls, ceilings, windows, and doors, as well as the installation of more energy-efficient windows, doors, lighting, and air ventilation systems. Through energy sub-meter monitoring, the centers are able to accurately chart the usage of energy at each of their campuses and eliminate unnecessary energy usage. Facilities reduced their dependence on fossil fuel energy sources through the installation of renewable energy technologies including wood gasification, solar domestic hot water, solar photovoltaic, solar air heat, geothermal heating, and wind power. Centers also installed energy education displays on the specific renewable energy technologies used at each center.
A molecular dynamics simulation study of irradiation induced defects in gold nanowire
NASA Astrophysics Data System (ADS)
Liu, Wenqiang; Chen, Piheng; Qiu, Ruizhi; Khan, Maaz; Liu, Jie; Hou, Mingdong; Duan, Jinglai
2017-08-01
Displacement cascades in gold nanowires were studied using molecular dynamics computer simulations. Primary knock-on atoms (PKAs) with different kinetic energies were initiated either at the surface or at the center of the nanowires. We found three kinds of defects induced by the cascade: point defects, stacking faults, and craters at the surface. The starting points of the PKAs influence the number of residual point defects, and this consequently affects the boundary of the anti-radiation window, which was proposed based on calculations of point-defect diffusion to the free surface of the nanowires. Formation of stacking faults spanning the whole cross-section of the gold nanowires was observed when the PKA kinetic energy was higher than 5 keV. Increasing the PKA kinetic energy to more than 10 keV may lead to the formation of a crater at the surface of the nanowire due to microexplosion of hot atoms. At this energy, PKAs started from the center of the nanowire can also result in the creation of a crater, because the length of the cascade region is comparable to the diameter of the nanowire. Both factors, namely the initial positions of the PKAs and the craters induced by higher-energy irradiation, influence the radiation resistance of metal nanowires.
Next Generation Workload Management System For Big Data on Heterogeneous Distributed Computing
NASA Astrophysics Data System (ADS)
Klimentov, A.; Buncic, P.; De, K.; Jha, S.; Maeno, T.; Mount, R.; Nilsson, P.; Oleynik, D.; Panitkin, S.; Petrosyan, A.; Porter, R. J.; Read, K. F.; Vaniachine, A.; Wells, J. C.; Wenaus, T.
2015-05-01
The Large Hadron Collider (LHC), operating at the international CERN Laboratory in Geneva, Switzerland, is leading Big Data driven scientific explorations. Experiments at the LHC explore the fundamental nature of matter and the basic forces that shape our universe, and were recently credited for the discovery of a Higgs boson. ATLAS and ALICE are the largest collaborations ever assembled in the sciences and are at the forefront of research at the LHC. To address an unprecedented multi-petabyte data processing challenge, both experiments rely on a heterogeneous distributed computational infrastructure. The ATLAS experiment uses the PanDA (Production and Data Analysis) Workload Management System (WMS) for managing the workflow for all data processing on hundreds of data centers. Through PanDA, ATLAS physicists see a single computing facility that enables rapid scientific breakthroughs for the experiment, even though the data centers are physically scattered all over the world. The scale is demonstrated by the following numbers: PanDA manages O(10^2) sites, O(10^5) cores, O(10^8) jobs per year, and O(10^3) users, and the ATLAS data volume is O(10^17) bytes. In 2013 we started an ambitious program to expand PanDA to all available computing resources, including opportunistic use of commercial and academic clouds and Leadership Computing Facilities (LCF). The project titled 'Next Generation Workload Management and Analysis System for Big Data' (BigPanDA) is funded by DOE ASCR and HEP. Extending PanDA to clouds and LCF presents new challenges in managing heterogeneity and supporting workflow. The BigPanDA project is underway to set up and tailor PanDA at the Oak Ridge Leadership Computing Facility (OLCF) and at the National Research Center "Kurchatov Institute" together with ALICE distributed computing and ORNL computing professionals. Our approach to the integration of HPC platforms at the OLCF and elsewhere is to reuse, as much as possible, existing components of the PanDA system. We will present our current accomplishments with running the PanDA WMS at OLCF and other supercomputers and demonstrate our ability to use PanDA as a portal independent of the computing facilities infrastructure for High Energy and Nuclear Physics as well as other data-intensive science applications.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sovinec, Carl
The objective of the Plasma Science and Innovation Center (PSI-Center) is to develop and deploy computational models that simulate conditions in smaller, concept-exploration plasma experiments. The PSIC group at the University of Wisconsin-Madison, led by Prof. Carl Sovinec, uses and enhances the Non-Ideal Magnetohydrodynamics with Rotation, Open Discussion (NIMROD) code to simulate macroscopic plasma dynamics in a number of magnetic confinement configurations. These numerical simulations provide information on how magnetic fields and plasma flows evolve over all three spatial dimensions, which supplements the limited access of diagnostics in plasma experiments. The information gained from simulation helps explain how plasma evolves. It is also used to engineer more effective plasma confinement systems, reducing the need to build many experiments to cover the physical parameter space. The ultimate benefit is a more cost-effective approach to the development of fusion energy for peaceful power production. The supplemental funds provided by the American Recovery and Reinvestment Act of 2009 were used to purchase computer components that were assembled into a 48-core system with 256 GB of shared memory. The system was engineered and constructed by the group's system administrator at the time, Anthony Hammond. It was successfully used by then graduate student Dr. John O'Bryan for computing the magnetic relaxation dynamics that occur during experimental tests of non-inductive startup in the Pegasus Toroidal Experiment (pegasus.ep.wisc.edu). Dr. O'Bryan's simulations provided the first detailed explanation of how the driven helical filament of electrical current evolves into a toroidal tokamak-like plasma configuration.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gharibyan, N.
In order to fully characterize the NIF neutron spectrum, the SAND-II-SNL software was requested and received from the Radiation Safety Information Computational Center. The software is designed to determine the neutron energy spectrum through analysis of experimental activation data. However, given that the source code was developed on a Sparcstation 10, it is not compatible with current versions of FORTRAN. Accounts have been established through Lawrence Livermore National Laboratory's High Performance Computing in order to access different compilers for FORTRAN (e.g., pgf77, pgf90). Additionally, several of the subroutines included in the SAND-II-SNL package have required debugging efforts to allow for proper compiling of the code.
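SAND-II-type unfolding rests on the activation equations relating each measured reaction rate to the unknown spectrum; schematically:

\[
A_{i} = \int_{0}^{\infty} \sigma_{i}(E)\,\varphi(E)\,dE, \qquad i = 1,\dots,n,
\]

where A_i is the measured saturated reaction rate per target atom for activation foil i, σ_i(E) is its energy-dependent cross section, and φ(E) is the neutron flux spectrum. The code iteratively adjusts a trial φ(E) until the computed A_i match the measurements.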
An Analysis of Cloud Computing with Amazon Web Services for the Atmospheric Science Data Center
NASA Astrophysics Data System (ADS)
Gleason, J. L.; Little, M. M.
2013-12-01
NASA science and engineering efforts rely heavily on compute and data handling systems. The nature of NASA science data is such that it is not restricted to NASA users; instead, it is widely shared across a globally distributed user community including scientists, educators, policy decision makers, and the public. Therefore NASA science computing is a candidate use case for cloud computing, where compute resources are outsourced to an external vendor. Amazon Web Services (AWS) is a commercial cloud computing service developed to use excess computing capacity at Amazon, and it potentially provides an alternative to costly and potentially underutilized dedicated acquisitions whenever NASA scientists or engineers require additional data processing. AWS desires to provide a simplified avenue for NASA scientists and researchers to share large, complex data sets with external partners and the public. AWS has been extensively used by JPL for a wide range of computing needs and was previously tested on a NASA Agency basis during the Nebula testing program. Its ability to support the needs of the Langley Science Directorate must be evaluated by integrating it with real-world operational needs across NASA, along with the associated maturity that would come from doing so. The strengths and weaknesses of this architecture and its ability to support general science and engineering applications were demonstrated during the previous testing. The Langley Office of the Chief Information Officer, in partnership with the Atmospheric Science Data Center (ASDC), has established a pilot business interface to utilize AWS cloud computing resources on an organization- and project-level, pay-per-use model. This poster discusses an effort to evaluate the feasibility of the pilot business interface from a project-level perspective by specifically using a processing scenario involving the Clouds and the Earth's Radiant Energy System (CERES) project.
NASA Astrophysics Data System (ADS)
Li, Xiang; Ko, Yeon-Jae; Wang, Haopeng; Bowen, Kit H.; Guevara-García, Alfredo; Martínez, Ana
2011-02-01
The copper-nucleoside anions, Cu⁻(cytidine) and Cu⁻(uridine), have been generated in the gas phase and studied by both experimental (anion photoelectron spectroscopy) and theoretical (density functional calculations) methods. The photoelectron spectra of both systems are dominated by single, intense, and relatively narrow peaks. These peaks are centered at 2.63 and 2.71 eV for Cu⁻(cytidine) and Cu⁻(uridine), respectively. According to our calculations, the Cu⁻(cytidine) and Cu⁻(uridine) species with these peak-center [vertical detachment energy (VDE)] values correspond to structures in which copper atomic anions are bound to the sugar portions of their corresponding nucleosides largely through electrostatic interactions; the observed species are anion-molecule complexes. The combination of experiment and theory also reveals the presence of a slightly higher-energy anion-molecule complex isomer in the case of Cu⁻(cytidine). Furthermore, our calculations found that chemically bonded isomers of these species are much more stable than their anion-molecule complex counterparts, but since their calculated VDE values are larger than the photon energy used in these experiments, they were not observed.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Anson, Colin W.; Ghosh, Soumya; Hammes-Schiffer, Sharon
2016-03-30
Macrocyclic metal complexes and p-benzoquinones are commonly used as co-catalytic redox mediators in aerobic oxidation reactions. In an effort to gain insight into the mechanism and energetic efficiency of these reactions, we investigated Co(salophen)-catalyzed aerobic oxidation of p-hydroquinone. Kinetic and spectroscopic data suggest that the catalyst resting state consists of an equilibrium between a Co(II)(salophen) complex, a Co(III)-superoxide adduct, and a hydrogen-bonded adduct between the hydroquinone and the Co(III)-O2 species. The kinetic data, together with density functional theory data, suggest that the turnover-limiting step features proton-coupled electron transfer between a semi-hydroquinone species and a Co(III)-hydroperoxide intermediate. Additional experimental and computational data suggest that a coordinated H2O2 intermediate oxidizes a second equivalent of hydroquinone. This research was supported as part of the Center for Molecular Electrocatalysis, an Energy Frontier Research Center, funded by the U.S. Department of Energy, Office of Science, Office of Basic Energy Sciences. The NSF provided partial support for the EPR instrumentation (NSF CHE-0741901).
Image of the Vela Supernova Remnant Taken by the High Energy Astronomy Observatory (HEAO)-2
NASA Technical Reports Server (NTRS)
1980-01-01
Like the Crab Nebula, the Vela Supernova Remnant has a radio pulsar at its center. In this image taken by the High Energy Astronomy Observatory (HEAO)-2/Einstein Observatory, the pulsar appears as a point source surrounded by weak, diffuse x-ray emission. HEAO-2's computer processing system was able to record and display the total number of x-ray photons (a photon being a tiny bundle of radiant energy used as the fundamental unit of electromagnetic radiation) on a scale along the margin of the picture. The HEAO-2, the first imaging and largest x-ray telescope built to date, was capable of producing actual photographs of x-ray objects. Shortly after launch, the HEAO-2 was nicknamed the Einstein Observatory by its scientific experimenters in honor of the centennial of the birth of Albert Einstein, whose concepts of relativity and gravitation have influenced much of modern astrophysics, particularly x-ray astronomy. The HEAO-2, designed and developed by TRW, Inc. under the project management of the Marshall Space Flight Center, was launched aboard an Atlas/Centaur launch vehicle on November 13, 1978.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cammarota, Ryan C.; Vollmer, Matthew V.; Xie, Jing
Large-scale CO2 hydrogenation could offer a renewable stream of industrially important C1 chemicals while reducing CO2 emissions. Critical to this opportunity is the requirement for inexpensive catalysts based on earth-abundant metals instead of precious metals. We report a nickel-gallium complex featuring a Ni(0)→Ga(III) bond that shows remarkable catalytic activity for hydrogenating CO2 to formate at ambient temperature (3150 turnovers, turnover frequency = 9700 h⁻¹), compared with prior homogeneous Ni-centered catalysts. The Lewis acidic Ga(III) ion plays a pivotal role by stabilizing reactive catalytic intermediates, including a rare anionic d¹⁰ Ni hydride. The structure of this reactive intermediate shows a terminal Ni-H, for which the hydride donor strength rivals those of precious-metal hydrides. Collectively, our experimental and computational results demonstrate that modulating a transition metal center via a direct interaction with a Lewis acidic support can be a powerful strategy for promoting new reactivity paradigms in base-metal catalysis. The work was supported as part of the Inorganometallic Catalysis Design Center, an Energy Frontier Research Center funded by the U.S. Department of Energy (DOE), Office of Science, Basic Energy Sciences under Award DE-SC0012702. R.C.C. and M.V.V. were supported by the DOE Office of Science Graduate Student Research and National Science Foundation Graduate Research Fellowship programs, respectively. J.C.L., S.A.B., and A.M.A. were supported by the U.S. Department of Energy, Office of Science, Office of Basic Energy Sciences, Division of Chemical Sciences, Geosciences & Biosciences. Pacific Northwest National Laboratory is operated by Battelle for the U.S. Department of Energy.
NASA Astrophysics Data System (ADS)
Gasper, Raymond; Ramasubramaniam, Ashwin
Defective graphene has been shown experimentally to be an excellent support for transition-metal electrocatalysts in direct methanol fuel cells. Prior computational modeling has shown that the improved catalytic activity of graphene-supported metal clusters is in part due to increased resistance to catalyst sintering and CO poisoning, but the increased reaction rate for the methanol decomposition reaction (MDR) is not yet fully explained. Using DFT, we investigate the adsorption of MDR intermediates and reaction thermodynamics on defective-graphene-supported Pt₁₃ nanoclusters with realistic, low-symmetry morphologies. We find that the support-induced shifts in Pt₁₃ electronic structure correlate well with a rigid shift in adsorption of MDR intermediates, and that adsorption-energy scaling relationships perform well on the low-symmetry surface. We investigate the reaction kinetics and thermodynamics, including testing the effectiveness of scaling relationships for predicting reaction barriers on the nanoclusters. Using these fundamental data, we perform microkinetic modeling to quantify the effect of the support on the MDR, and to understand how the support influences surface coverages, CO poisoning, and the relationships between reaction pathways. Funded by the U.S. Department of Energy under Award Number DE-SC0010610. Computational resources were provided by the National Energy Research Scientific Computing Center.
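As a rough illustration of the microkinetic-modeling step mentioned above, and not the authors' actual MDR network, the sketch below integrates a generic two-step surface reaction with a site balance to steady state; all species and rate constants are invented for illustration.

    # Generic two-step microkinetic sketch: A* -> B* -> gas-phase product,
    # with a site balance on a single site type. Rate constants are invented.
    from scipy.integrate import solve_ivp

    k_ads, k1, k2 = 10.0, 50.0, 5.0   # 1/s, hypothetical

    def rates(t, theta):
        th_a, th_b = theta
        th_free = 1.0 - th_a - th_b           # site balance
        d_th_a = k_ads * th_free - k1 * th_a  # adsorption vs. first step
        d_th_b = k1 * th_a - k2 * th_b        # A* -> B* -> desorption
        return [d_th_a, d_th_b]

    sol = solve_ivp(rates, (0.0, 10.0), [0.0, 0.0])
    th_a, th_b = sol.y[:, -1]
    print(f"steady-state coverages: theta_A={th_a:.3f}, theta_B={th_b:.3f}")
    print(f"turnover rate ~ {k2 * th_b:.3f} 1/s")

In a real study each elementary step's rate constant would come from the DFT barriers and scaling relationships discussed in the abstract.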
DEEP: Database of Energy Efficiency Performance
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hong, Tianzhen; Piette, Mary; Lee, Sang Hoon
A database of energy efficiency performance (DEEP) is a pre-simulated database enabling quick and accurate assessment of energy retrofits of commercial buildings. DEEP was compiled from the results of about 10 million EnergyPlus simulations. DEEP provides energy savings for screening and evaluation of retrofit measures targeting small and medium-sized office and retail buildings in California. The prototype building models were developed for a comprehensive assessment of building energy performance, based on the DOE commercial reference buildings and the California DEER prototype buildings. The prototype buildings represent seven building types across six construction vintages and 16 California climate zones. DEEP uses these prototypes to evaluate the energy performance of about 100 energy conservation measures covering envelope, lighting, heating, ventilation, air conditioning, plug loads, and domestic hot water. DEEP consists of the energy simulation results for individual retrofit measures as well as packages of measures, to account for interactive effects between multiple measures. The large-scale EnergyPlus simulations are being conducted on the supercomputers at the National Energy Research Scientific Computing Center (NERSC) of Lawrence Berkeley National Laboratory. The pre-simulated database is part of a CEC PIER project to develop a web-based retrofit toolkit for small and medium-sized commercial buildings in California, which provides real-time energy retrofit feedback by querying DEEP for recommended measures, estimated energy savings, and financial payback period based on the user's decision criteria of maximizing energy savings, energy cost savings, carbon reduction, or payback of investment. The pre-simulated database and associated comprehensive measure analysis enhance the ability to assess retrofits that reduce energy use in small and medium buildings, whose owners typically do not have the resources to conduct costly building energy audits.
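As a hedged sketch of how a pre-simulated measure database like DEEP might be queried by such a toolkit: the abstract does not specify DEEP's schema, so the table and column names below are hypothetical.

    # Hypothetical query against a local snapshot of a DEEP-like database.
    # Table and column names are invented; DEEP's real schema is not given here.
    import sqlite3

    conn = sqlite3.connect("deep.db")  # assumed pre-built SQLite snapshot
    rows = conn.execute(
        """
        SELECT measure_id, annual_kwh_savings, payback_years
        FROM measure_results
        WHERE building_type = ? AND climate_zone = ? AND vintage = ?
        ORDER BY annual_kwh_savings DESC
        LIMIT 5
        """,
        ("small_office", "CZ12", "1980s"),
    ).fetchall()
    for measure_id, kwh, payback in rows:
        print(f"{measure_id}: {kwh:.0f} kWh/yr saved, payback {payback:.1f} yr")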
Exascale computing and what it means for shock physics
NASA Astrophysics Data System (ADS)
Germann, Timothy
2015-06-01
The U.S. Department of Energy is preparing to launch an Exascale Computing Initiative, to address the myriad challenges required to deploy and effectively utilize an exascale-class supercomputer (i.e., one capable of performing 10¹⁸ operations per second) in the 2023 timeframe. Since physical (power dissipation) requirements limit clock rates to at most a few GHz, this will necessitate the coordination of on the order of a billion concurrent operations, requiring sophisticated system and application software, and underlying mathematical algorithms, that may differ radically from traditional approaches. Even at the smaller workstation or cluster level of computation, the massive concurrency and heterogeneity within each processor will impact computational scientists. Through the multi-institutional, multi-disciplinary Exascale Co-design Center for Materials in Extreme Environments (ExMatEx), we have initiated an early and deep collaboration between domain (computational materials) scientists, applied mathematicians, computer scientists, and hardware architects, in order to establish the relationships between algorithms, software stacks, and architectures needed to enable exascale-ready materials science application codes within the next decade. In my talk, I will discuss these challenges, and what it will mean for exascale-era electronic structure, molecular dynamics, and engineering-scale simulations of shock-compressed condensed matter. In particular, we anticipate that the emerging hierarchical, heterogeneous architectures can be exploited to achieve higher physical fidelity simulations using adaptive physics refinement. This work is supported by the U.S. Department of Energy, Office of Science, Office of Advanced Scientific Computing Research.
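The concurrency estimate above follows from simple arithmetic: with clock rates capped near a few GHz, an exascale machine must keep roughly

\[
\frac{10^{18}\ \text{operations/s}}{\sim 10^{9}\ \text{cycles/s per processing element}} \approx 10^{9}
\]

operations in flight simultaneously, which is the billion concurrent operations the abstract refers to.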
The Development of University Computing in Sweden 1965-1985
NASA Astrophysics Data System (ADS)
Dahlstrand, Ingemar
In 1965-70 the government agency, Statskontoret, set up five university computing centers, as service bureaux financed by grants earmarked for computer use. The centers were well equipped and staffed and caused a surge in computer use. When the yearly flow of grant money stagnated at 25 million Swedish crowns, the centers had to find external income to survive and acquire time-sharing. But the charging system led to the computers not being fully used. The computer scientists lacked equipment for laboratory use. The centers were decentralized and the earmarking abolished. Eventually they got new tasks like running computers owned by the departments, and serving the university administration.
Modelling excitonic-energy transfer in light-harvesting complexes
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kramer, Tobias; Kreisbeck, Christoph
The theoretical and experimental study of energy transfer in photosynthesis has revealed an interesting transport regime, which lies at the borderline between classical transport dynamics and quantum-mechanical interference effects. Dissipation is caused by the coupling of electronic degrees of freedom to vibrational modes and leads to a directional energy transfer from the antenna complex to the target reaction center. The dissipative driving is robust and does not rely on fine-tuning of specific vibrational modes. For the parameter regime encountered in biological systems, new theoretical tools are required to directly compare theoretical results with experimental spectroscopy data. The calculations require the use of massively parallel graphics processing units (GPUs) for efficient and exact computations.
High Energy Astronomical Data Processing and Analysis via the Internet
NASA Astrophysics Data System (ADS)
Valencic, Lynne A.; Snowden, S.; Pence, W.
2012-01-01
The HEASARC at NASA Goddard Space Flight Center and the US XMM-Newton GOF have developed Hera, a data processing facility for analyzing high-energy astronomical data over the internet. Hera provides all the disk space and computing resources needed to do general processing of, and advanced research on, publicly available data from High Energy Astrophysics missions. The data and data products are kept on a server at GSFC and can be downloaded to a user's local machine. Further, the XMM-GOF has developed scripts to streamline XMM data reduction. These are available through Hera and can also be downloaded to a user's local machine. These are free services provided to students, educators, and researchers for educational and research purposes.
NASA Astrophysics Data System (ADS)
Turinsky, Paul J.; Martin, William R.
2017-04-01
In this special issue of the Journal of Computational Physics, the research and development completed at the time of manuscript submission by the Consortium for Advanced Simulation of Light Water Reactors (CASL) is presented. CASL is the first of several Energy Innovation Hubs that have been created by the Department of Energy. The Hubs are modeled after the strong scientific management characteristics of the Manhattan Project and AT&T Bell Laboratories, and function as integrated research centers that combine basic and applied research with engineering to accelerate scientific discovery that addresses critical energy issues. The lifetime of a Hub is expected to be five or ten years depending upon performance; CASL has been granted a ten-year lifetime.
Casimir interaction between spheres in (D + 1)-dimensional Minkowski spacetime
NASA Astrophysics Data System (ADS)
Teo, L. P.
2014-05-01
We consider the Casimir interaction between two spheres in (D + 1)-dimensional Minkowski spacetime due to the vacuum fluctuations of scalar fields, for combinations of Dirichlet and Neumann boundary conditions. The TGTG formula for the Casimir interaction energy is derived. The computation of the T matrices of the two spheres is straightforward. To compute the two G matrices, known as translation matrices, which relate the hyper-spherical waves in two spherical coordinate frames that differ by a translation, we generalize the operator approach employed in [39]. The result is expressed in terms of an integral over Gegenbauer polynomials. In contrast to the D = 3 case, we do not re-express the integral in terms of 3j-symbols and hyper-spherical waves, which in principle can be done but does not simplify the formula. Using our expression for the Casimir interaction energy, we derive the large-separation and small-separation asymptotic expansions of the Casimir interaction energy. In the large-separation regime, we find that the Casimir interaction energy is of order L^{-2D+3}, L^{-2D+1}, and L^{-2D-1}, respectively, for Dirichlet-Dirichlet, Dirichlet-Neumann, and Neumann-Neumann boundary conditions, where L is the center-to-center distance of the two spheres. In the small-separation regime, we confirm that the leading term of the Casimir interaction agrees with the proximity force approximation, expressed in terms of the distance d between the two spheres. Another main result of this work is the analytic computation of the next-to-leading-order term in the small-separation asymptotic expansion. This term is computed using careful order analysis as well as a perturbation method. When the radius of one of the spheres goes to infinity, we find that the results agree with those we derived for the sphere-plate configuration. When D = 3, we also recover previously known results. We find that when D is large, the ratio of the next-to-leading-order term to the leading-order term is linear in D, indicating a larger correction at higher dimensions. The methodologies employed in this work and the results obtained can be used to study the one-loop effective action of the system of two spherical objects in the universe.
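Schematically, the TGTG representation mentioned above expresses the interaction energy as a trace-log over round trips of scattering (T) and translation (G) operators between the two spheres; the standard scalar-field form, given here only for orientation, is

\[
E_{\mathrm{int}} = \frac{\hbar}{2\pi}\int_{0}^{\infty} d\xi\;
\operatorname{Tr}\,\ln\!\left[\,\mathbb{1} - \mathbb{T}_{1}\,\mathbb{G}_{12}\,\mathbb{T}_{2}\,\mathbb{G}_{21}\right].
\]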
UC Merced Center for Computational Biology Final Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Colvin, Michael; Watanabe, Masakatsu
Final report for the UC Merced Center for Computational Biology. The Center for Computational Biology (CCB) was established to support multidisciplinary scientific research and academic programs in computational biology at the new University of California campus in Merced. In 2003, the growing gap between biology research and education was documented in a report from the National Academy of Sciences, Bio2010: Transforming Undergraduate Education for Future Research Biologists. We believed that a new type of undergraduate and graduate program in the biological sciences, one that emphasized biological concepts and treated biology as an information science, would have a dramatic impact in enabling the transformation of biology. UC Merced, as the newest UC campus and the first new U.S. research university of the 21st century, was ideally suited to adopt an alternate strategy - to create a new Biological Sciences major and graduate group that incorporated the strong computational and mathematical vision articulated in the Bio2010 report. CCB aimed to leverage this strong commitment at UC Merced to develop a new educational program based on the principle of biology as a quantitative, model-driven science. We also expected that the center would enable the dissemination of computational biology course materials to other universities and feeder institutions, and foster research projects that exemplify a mathematical and computation-based approach to the life sciences. As this report describes, the CCB has been successful in achieving these goals, and multidisciplinary computational biology is now an integral part of UC Merced undergraduate, graduate, and research programs in the life sciences. The CCB began in fall 2004 with the aid of an award from the U.S. Department of Energy (DOE), under its Genomes to Life program of support for the development of research and educational infrastructure in the modern biological sciences. This report to DOE describes the research and academic programs made possible by the CCB from its inception until August 2010, the end of the final extension. Although DOE support for the center ended in August 2010, the CCB will continue to exist and support its original objectives. The research and academic programs fostered by the CCB have led to additional extramural funding from other agencies, and we anticipate that CCB will continue to support quantitative and computational biology programs at UC Merced for many years to come. Since its inception in fall 2004, CCB research projects have involved continuous multi-institutional collaboration with Lawrence Livermore National Laboratory (LLNL) and the National Center for Supercomputing Applications at the University of Illinois at Urbana-Champaign, as well as individual collaborators at other sites. CCB-affiliated faculty cover a broad range of computational and mathematical research, including molecular modeling, cell biology, applied math, evolutionary biology, and bioinformatics. The CCB sponsored the first distinguished speaker series at UC Merced, which had an important role in spreading the word about the computational biology emphasis at this new campus. One of CCB's original goals was to help train a new generation of biologists who bridge the gap between the computational and life sciences. To achieve this goal, by summer 2006 a summer undergraduate internship program had been established under CCB to train biological science researchers in highly mathematical and computationally intensive methods.
By the end of summer 2010, 44 undergraduate students had gone through this program. Of those participants, 11 students have been admitted to graduate schools and 10 more are interested in pursuing graduate studies in the sciences. The center is also continuing to facilitate the development and dissemination of undergraduate and graduate course materials based on the latest research in computational biology.
Quantum Simulation of Helium Hydride Cation in a Solid-State Spin Register.
Wang, Ya; Dolde, Florian; Biamonte, Jacob; Babbush, Ryan; Bergholm, Ville; Yang, Sen; Jakobi, Ingmar; Neumann, Philipp; Aspuru-Guzik, Alán; Whitfield, James D; Wrachtrup, Jörg
2015-08-25
Ab initio computation of molecular properties is one of the most promising applications of quantum computing. While this problem is widely believed to be intractable for classical computers, efficient quantum algorithms exist which have the potential to vastly accelerate research throughput in fields ranging from material science to drug discovery. Using a solid-state quantum register realized in a nitrogen-vacancy (NV) defect in diamond, we compute the bond dissociation curve of the minimal basis helium hydride cation, HeH⁺. Moreover, we report an energy uncertainty (given our model basis) of the order of 10⁻¹⁴ hartree, which is 10 orders of magnitude below the desired chemical precision. As NV centers in diamond provide a robust and straightforward platform for quantum information processing, our work provides an important step toward a fully scalable solid-state implementation of a quantum chemistry simulator.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Houston, Johnny L; Geter, Kerry
This report covers the Project's third and final year of implementation, 2007-2008, as designated by Elizabeth City State University (ECSU), in cooperation with the National Association of Mathematicians (NAM), Inc., in an effort to promote research and research training programs in computational science and scientific visualization (CSSV). A major goal of the Project was to attract energetic and productive faculty, graduate students, and upper-division undergraduate students of diverse ethnicities to a program that investigates science and computational science issues of long-term interest to the Department of Energy (DoE) and the nation. The breadth and depth of computational science and scientific visualization, and the magnitude of the resources available, permit a wide variety of research activities. ECSU's Computational Science-Scientific Visualization Center will serve as a conduit for directing users to these enormous resources.
ASR4: A computer code for fitting and processing 4-gage anelastic strain recovery data
DOE Office of Scientific and Technical Information (OSTI.GOV)
Warpinski, N.R.
A computer code for analyzing four-gage Anelastic Strain Recovery (ASR) data has been modified for use on a personal computer. This code fits the viscoelastic model of Warpinski and Teufel to measured ASR data, calculates the stress orientation directly, and computes stress magnitudes if sufficient input data are available. The code also calculates the stress orientation using strain-rosette equations, and it calculates stress magnitudes using Blanton's approach, again assuming sufficient input data are available. The program is written in FORTRAN and compiled with Ryan-McFarland Version 2.4. Graphics use PLOT88 software by Plotworks, Inc., but the graphics software must be obtained by the user because of licensing restrictions; a version without graphics can also be run. This code is available through the National Energy Software Center (NESC), operated by Argonne National Laboratory. 5 refs., 3 figs.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Del Ben, Mauro, E-mail: mauro.delben@chem.uzh.ch; Hutter, Jürg, E-mail: hutter@chem.uzh.ch; VandeVondele, Joost, E-mail: Joost.VandeVondele@mat.ethz.ch
The forces acting on the atoms as well as the stress tensor are crucial ingredients for calculating the structural and dynamical properties of systems in the condensed phase. Here, these derivatives of the total energy are evaluated for the second-order Møller-Plesset perturbation energy (MP2) in the framework of the resolution-of-identity Gaussian and plane waves method, in a way that is fully consistent with how the total energy is computed. This consistency is non-trivial, given the different ways employed to compute Coulomb, exchange, and canonical four-center integrals, and allows, for example, for energy-conserving dynamics in various ensembles. Based on this formalism, a massively parallel algorithm has been developed for finite and extended systems. The designed parallel algorithm displays, with respect to system size, cubic, quartic, and quintic requirements, respectively, for the memory, communication, and computation. All these requirements are reduced with an increasing number of processes, and the measured performance shows excellent parallel scalability and efficiency up to thousands of nodes. Additionally, the computationally more demanding quintic-scaling steps can be accelerated by employing graphics processing units (GPUs), showing, for large systems, a gain of almost a factor of two compared to the standard central-processing-unit-only case. In this way, the evaluation of the derivatives of the RI-MP2 energy can be performed within a few minutes for systems containing hundreds of atoms and thousands of basis functions. With good time to solution, the implementation thus opens the possibility to perform molecular dynamics (MD) simulations in various ensembles (microcanonical and isobaric-isothermal) at the MP2 level of theory. Geometry optimization, full cell relaxation, and energy-conserving MD simulations have been performed for a variety of molecular crystals including NH₃, CO₂, formic acid, and benzene.
Performance Assessment Institute-NV
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lombardo, Joesph
2012-12-31
The National Supercomputing Center for Energy and the Environment intends to purchase a multi-purpose computer cluster in support of the Performance Assessment Institute (PA Institute). The PA Institute will serve as a research consortium located in Las Vegas, Nevada, with a membership that includes national laboratories, universities, industry partners, and domestic and international governments. This center will provide a one-of-a-kind centralized facility for the accumulation of information for use by institutions of higher learning, the U.S. Government, regulatory agencies, and approved users. This initiative will enhance and extend High Performance Computing (HPC) resources in Nevada to support critical national and international needs in "scientific confirmation". The PA Institute will be promoted as the leading modeling, learning, and research center worldwide. The program proposes to utilize the existing supercomputing capabilities and alliances of the University of Nevada, Las Vegas as a base, and to extend these resources and capabilities through a collaborative relationship with its membership. The PA Institute will provide an academic setting for interactive sharing, learning, mentoring, and monitoring of multi-disciplinary performance assessment and performance confirmation information. The role of the PA Institute is to facilitate research, knowledge-increase, and knowledge-sharing among users.
NASA Astrophysics Data System (ADS)
Stockton, Gregory R.
2011-05-01
Over the last 10 years, very large government, military, and commercial computer and data center operators have spent millions of dollars trying to optimally cool data centers, as each rack has begun to consume as much as 10 times more power than just a few years ago. In fact, the maximum amount of computation in a data center is becoming limited by the available power, space, and cooling capacity at some sites. Tens of millions of dollars and megawatts of power are spent annually to keep data centers cool. The cooling and air flows change dynamically, departing from any 3-D computational fluid dynamics (CFD) model predicted during construction, and as time goes by the efficiency and effectiveness of the actual cooling depart even farther from the predicted models. By using 3-D infrared (IR) thermal mapping and other techniques to calibrate and refine the CFD modeling and make appropriate corrections and repairs, the required power for data centers can be dramatically reduced, which reduces costs and also improves reliability.
Numerical simulation of long-duration blast wave evolution in confined facilities
NASA Astrophysics Data System (ADS)
Togashi, F.; Baum, J. D.; Mestreau, E.; Löhner, R.; Sunshine, D.
2010-10-01
The objective of this research effort was to investigate the quasi-steady flow field produced by explosives in confined facilities. In this effort we modeled tests in which a high explosive (HE) cylindrical charge was hung in the center of a room and detonated. The HEs used for the tests were C-4 and AFX 757. While C-4 is just slightly under-oxidized and is typically modeled as an ideal explosive, AFX 757 includes a significant percentage of aluminum particles, so long-time afterburning and energy release must be considered. The Lawrence Livermore National Laboratory (LLNL) thermo-chemical equilibrium algorithm "Cheetah" was used to estimate the remaining burnable detonation products. From these remaining species, the afterburning energy was computed and added to the flow field. Computations of the detonation and afterburn of the two HEs in the confined multi-room facility were performed. The results demonstrate excellent agreement with available experimental data in terms of blast wave time of arrival, peak shock amplitude, reverberation, and total impulse (and hence total energy release, via either the detonation or afterburn processes).
Quantum Critical Point revisited by the Dynamical Mean Field Theory
NASA Astrophysics Data System (ADS)
Xu, Wenhu; Kotliar, Gabriel; Tsvelik, Alexei
Dynamical mean field theory is used to study the quantum critical point (QCP) in the doped Hubbard model on a square lattice. The QCP is characterized by a universal scaling form of the self-energy and a spin density wave instability at an incommensurate wave vector. The scaling form unifies the low-energy kink and the high-energy waterfall feature in the spectral function, while the spin dynamics includes both the critical incommensurate and high-energy antiferromagnetic paramagnons. We use the frequency-dependent four-point correlation function of spin operators to calculate the momentum-dependent correction to the electron self-energy. Our results reveal a substantial difference from calculations based on the Spin-Fermion model, which indicates that the frequency dependence of the quasiparticle-paramagnon vertices is an important factor. The authors are supported by the Center for Computational Design of Functional Strongly Correlated Materials and Theoretical Spectroscopy under DOE Grant DE-FOA-0001276.
Impact of Weak Agostic Interactions in Nickel Electrocatalysts for Hydrogen Oxidation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Klug, Christina M.; O’Hagan, Molly; Bullock, R. Morris
To understand how H2 binding and oxidation are influenced by [Ni(PR2NR'2)2]2+ catalysts with H2 binding energies close to thermoneutral, two [Ni(PPh2NR'2)2]2+ (R' = Me or C14H29) complexes with phenyl substituents on phosphorus and varying alkyl chain lengths on the pendant amine were studied. In the solid state, [Ni(PPh2NMe2)2]2+ exhibits an anagostic interaction between the Ni(II) center and the α-CH3 of the pendant amine, and DFT and variable-temperature 31P NMR experiments suggest that the anagostic interaction persists in solution. The equilibrium constants for H2 addition to these complexes were measured by 31P NMR spectroscopy, affording free energies of H2 addition (ΔG°H2) of -0.8 kcal mol⁻¹ in benzonitrile and -1.6 to -2.3 kcal mol⁻¹ in THF. The anagostic interaction contributes to the low driving force for H2 binding by stabilizing the four-coordinate Ni(II) species prior to binding of H2. The pseudo-first-order rate constants for H2 addition at 1 atm were measured by variable-scan-rate cyclic voltammetry and were found to be similar for both complexes: less than 0.2 s⁻¹ in benzonitrile and 3-6 s⁻¹ in THF. In the presence of exogenous base and H2, turnover frequencies of electrocatalytic H2 oxidation were measured to be less than 0.2 s⁻¹ in benzonitrile and 4-9 s⁻¹ in THF. These complexes are slower electrocatalysts for H2 oxidation than previously studied [Ni(PR2NR'2)2]2+ complexes due to a competition between H2 binding and formation of the anagostic interaction. However, the decrease in catalytic rate is accompanied by a beneficial 130 mV decrease in overpotential. This research was supported as part of the Center for Molecular Electrocatalysis, an Energy Frontier Research Center funded by the U.S. Department of Energy (DOE), Office of Science, Office of Basic Energy Sciences. Computational resources were provided at the National Energy Research Scientific Computing Center (NERSC) at Lawrence Berkeley National Laboratory. Mass spectrometry experiments were performed in the William R. Wiley Environmental Molecular Sciences Laboratory, a DOE national scientific user facility sponsored by DOE's Office of Biological and Environmental Research and located at the Pacific Northwest National Laboratory (PNNL). The authors thank Dr. Rosalie Chu for the mass spectrometry analysis. PNNL is operated by Battelle for DOE.
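The conversion from the measured equilibrium constants to the quoted free energies of H2 addition is the standard relation

\[
\Delta G^{\circ}_{\mathrm{H_2}} = -RT \ln K_{\mathrm{eq}},
\]

so, for example, ΔG° = -1.6 kcal mol⁻¹ at 298 K corresponds to K_eq ≈ 15.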
Energy Innovation Hubs: A Home for Scientific Collaboration
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chu, Steven
Secretary Chu will host a live, streaming Q&A session with the directors of the Energy Innovation Hubs on Tuesday, March 6, at 2:15 p.m. EST. The directors will be available for questions regarding their teams' work and the future of American energy. Ask your questions in the comments below, or submit them on Facebook, Twitter (@energy), or by e-mail to newmedia@hq.doe.gov, prior to or during the live event. Dr. Hank Foley is the director of the Greater Philadelphia Innovation Cluster for Energy-Efficient Buildings, which is pioneering new data-intensive techniques for designing and operating energy-efficient buildings, including advanced computer modeling. Dr. Douglas Kothe is the director of the Consortium for Advanced Simulation of Light Water Reactors, which uses powerful supercomputers to create "virtual" reactors that will help improve the safety and performance of both existing and new nuclear reactors. Dr. Nathan Lewis is the director of the Joint Center for Artificial Photosynthesis, which focuses on how to produce fuels from sunlight, water, and carbon dioxide. The Energy Innovation Hubs are major integrated research centers, with researchers from many different institutions and technical backgrounds. Each Hub is focused on a specific high-priority goal, rapidly accelerating scientific discoveries and shortening the path from laboratory innovation to technological development and commercial deployment of critical energy technologies.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pinski, Peter; Riplinger, Christoph; Valeev, Edward F., E-mail: evaleev@vt.edu; Neese, Frank, E-mail: frank.neese@cec.mpg.de
2015-07-21
In this work, a systematic infrastructure is described that formalizes concepts implicit in previous work and greatly simplifies the computer implementation of reduced-scaling electronic structure methods. The key concept is sparse representation of tensors using chains of sparse maps between two index sets. Sparse map representation can be viewed as a generalization of compressed sparse row, a common representation of a sparse matrix, to tensor data. By combining a few elementary operations on sparse maps (inversion, chaining, intersection, etc.), complex algorithms can be developed, illustrated here by a linear-scaling transformation of three-center Coulomb integrals based on our compact code library that implements sparse maps and operations on them. The sparsity of the three-center integrals arises from the spatial locality of the basis functions and the domain density fitting approximation. A novel feature of our approach is the use of differential overlap integrals, computed in linear-scaling fashion, for screening products of basis functions. Finally, a robust linear-scaling domain-based local pair natural orbital second-order Møller-Plesset (DLPNO-MP2) method is described based on the sparse map infrastructure that depends on only a minimal number of cutoff parameters, which can be systematically tightened to approach 100% of the canonical MP2 correlation energy. With default truncation thresholds, DLPNO-MP2 recovers more than 99.9% of the canonical resolution-of-the-identity MP2 (RI-MP2) energy while still showing a very early crossover with respect to the computational effort. Based on extensive benchmark calculations, relative energies are reproduced with an error of typically <0.2 kcal/mol. The efficiency of the local MP2 (LMP2) method can be drastically improved by carrying out the LMP2 iterations in a basis of pair natural orbitals. While the present work focuses on local electron correlation, it is of much broader applicability to computation with sparse tensors in quantum chemistry and beyond.
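To make the "sparse map" idea concrete, here is a toy Python sketch under the interpretation given in the abstract (a relation between two index sets, stored CSR-style, supporting chaining and intersection); the names and details are illustrative, not the authors' implementation.

    # Toy "sparse map" between index sets, stored CSR-style as index -> targets.
    from collections import defaultdict

    SparseMap = dict[int, set[int]]  # a -> {b}: like CSR row -> column indices

    def chain(l_ab: SparseMap, l_bc: SparseMap) -> SparseMap:
        """Compose a->b with b->c into a->c (the 'chaining' operation)."""
        out = defaultdict(set)
        for a, bs in l_ab.items():
            for b in bs:
                out[a] |= l_bc.get(b, set())
        return dict(out)

    def intersect(m1: SparseMap, m2: SparseMap) -> SparseMap:
        """Keep only targets present in both maps (the 'intersection')."""
        common = {a: m1[a] & m2[a] for a in m1.keys() & m2.keys()}
        return {a: bs for a, bs in common.items() if bs}

    # Toy usage: atoms -> basis functions, basis functions -> aux functions.
    atom_to_bf = {0: {0, 1}, 1: {2}}
    bf_to_aux = {0: {10}, 1: {10, 11}, 2: {12}}
    print(chain(atom_to_bf, bf_to_aux))  # {0: {10, 11}, 1: {12}}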
High-Performance Analysis of Filtered Semantic Graphs
2012-05-06
...observation that explains why SEJITS+KDT performance is so close to CombBLAS performance in practice (as shown later in Section 7) even though its in-core... NEC, Nokia, NVIDIA, Oracle, and Samsung. This research used resources of the National Energy Research Scientific Computing Center.
Wave packet dynamics, time scales and phase diagram in the IBM-Lipkin-Meshkov-Glick model
NASA Astrophysics Data System (ADS)
Castaños, Octavio; de los Santos, Francisco; Yáñez, Rafael; Romera, Elvira
2018-02-01
We derive the phase diagram of a scalar two-level boson model by studying the equilibrium and stability properties of its energy surface. The plane of control parameters is enlarged with respect to previous studies. We then analyze the time evolution of wave packets centered around the ground state at various quantum phase transition boundary lines. In particular, classical and revival times are computed numerically.
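For orientation, the classical and revival time scales computed in such wave-packet studies are conventionally obtained by expanding the energy spectrum E(n) around the wave packet's central quantum number:

\[
T_{\mathrm{cl}} = \frac{2\pi\hbar}{\left|E'(\bar n)\right|}, \qquad
T_{\mathrm{rev}} = \frac{4\pi\hbar}{\left|E''(\bar n)\right|}.
\]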
Solid liquid interfacial free energies of benzene
NASA Astrophysics Data System (ADS)
Azreg-Aïnou, M.
2007-02-01
In this work we determine, for the range of melting temperatures 284.6⩽T⩽306.7 K, corresponding to equilibrium pressures 20.6⩽P⩽102.9 MPa, the benzene solid-liquid interfacial free energy by an approach combining theoretical and experimental physics, mathematics, computer algebra (MATLAB), and some results from molecular dynamics computer simulations. From the theoretical and mathematical points of view, we deal with the elaboration of an analytical expression for the internal energy, derived from a unified solid-liquid-vapor equation of state, and with the extension of an existing statistical model for the entropy drop of the melt near the solid-liquid interface. From an experimental point of view, we use our results obtained in collaboration with colleagues concerning supercooled liquid benzene. Of particular interest for this work is the existing center-of-mass radial distribution function of benzene at 298 K obtained by computer simulation. Crystal-orientation-independent and minimum interfacial free energies are calculated and shown to increase slightly with temperature over the above range. Both the crystal-orientation-independent and minimum free energies agree with existing calculations and with the rare existing experimental data. Taking the extent of supercooling as a constant, as is generally admitted, we determine the limits of supercooling, through which we explore the behavior of the critical nucleus radius, which is shown to decrease with the above temperatures. The radius of, and the number of molecules per, critical nucleus are shown to assume average values of 20.2 Å and 175, with standard deviations of 0.16 Å and 4.5, respectively.
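The connection used above between the interfacial free energy, the supercooling, and the critical nucleus radius is the classical nucleation theory estimate (a schematic statement, not the paper's exact working equations): with γ_sl the solid-liquid interfacial free energy, Δh_m the melting enthalpy per unit volume, T_m the melting temperature, and ΔT the supercooling,

\[
r^{*} = \frac{2\gamma_{sl}}{\Delta g_v}, \qquad
\Delta g_v \approx \frac{\Delta h_m\,\Delta T}{T_m},
\]

which makes explicit how the critical radius tracks the interfacial free energy and the supercooling.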
Predictability Experiments With the Navy Operational Global Atmospheric Prediction System
NASA Astrophysics Data System (ADS)
Reynolds, C. A.; Gelaro, R.; Rosmond, T. E.
2003-12-01
There are several areas of research in numerical weather prediction and atmospheric predictability, such as targeted observations and ensemble perturbation generation, where it is desirable to combine information about the uncertainty of the initial state with information about potential rapid perturbation growth. Singular vectors (SVs) provide a framework to accomplish this task in a mathematically rigorous and computationally feasible manner. In this study, SVs are calculated using the tangent and adjoint models of the Navy Operational Global Atmospheric Prediction System (NOGAPS). The analysis error variance information produced by the NRL Atmospheric Variational Data Assimilation System is used as the initial-time SV norm. These VAR SVs are compared to SVs for which total energy is both the initial and final time norms (TE SVs). The incorporation of analysis error variance information has a significant impact on the structure and location of the SVs. This in turn has a significant impact on targeted observing applications. The utility and implications of such experiments in assessing the analysis error variance estimates will be explored. Computing support has been provided by the Department of Defense High Performance Computing Center at the Naval Oceanographic Office Major Shared Resource Center at Stennis, Mississippi.
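In outline (a schematic statement of the standard SV formulation, with M the tangent-linear propagator): the leading SVs maximize the ratio of a final-time norm, here total energy E, to an initial-time norm. Choosing the initial norm as the inverse analysis error covariance A⁻¹ gives the VAR SVs, while using total energy at both times gives the TE SVs:

\[
\sigma^{2} = \frac{(\mathbf{M}\mathbf{x})^{\mathsf{T}}\,\mathbf{E}\,(\mathbf{M}\mathbf{x})}{\mathbf{x}^{\mathsf{T}}\,\mathbf{A}^{-1}\,\mathbf{x}}
\;\;\Longrightarrow\;\;
\mathbf{M}^{\mathsf{T}}\mathbf{E}\,\mathbf{M}\,\mathbf{x} = \sigma^{2}\,\mathbf{A}^{-1}\mathbf{x}.
\]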
Assessment of the MHD capability in the ATHENA code using data from the ALEX facility
DOE Office of Scientific and Technical Information (OSTI.GOV)
Roth, P.A.
1989-03-01
The ATHENA (Advanced Thermal Hydraulic Energy Network Analyzer) code is a system transient analysis code with multi-loop, multi-fluid capabilities, which is available to the fusion community at the National Magnetic Fusion Energy Computing Center (NMFECC). The work reported here assesses the ATHENA magnetohydrodynamic (MHD) pressure drop model for liquid metals flowing through a strong magnetic field. An ATHENA model was developed for two simple-geometry, adiabatic test sections used in the Argonne Liquid Metal Experiment (ALEX) at Argonne National Laboratory (ANL). The pressure drops calculated by ATHENA agreed well with the experimental results from the ALEX facility.
NASA Technical Reports Server (NTRS)
Krishen, Kumar (Editor); Burnham, Calvin (Editor)
1995-01-01
The papers presented at the 4th International Conference and Exhibition: World Congress on Superconductivity, held at the Marriott Orlando World Center, Orlando, Florida, are contained in this document and encompass the research, technology, applications, funding, political, and social aspects of superconductivity. Specifically, the areas covered include: high-temperature materials; thin films; C-60 based superconductors; persistent magnetic fields and shielding; fabrication methodology; space applications; physical applications; performance characterization; device applications; weak link effects and flux motion; accelerator technology; superconducting energy storage; future research and development directions; medical applications; granular superconductors; wire fabrication technology; computer applications; technical and commercial challenges; and power and energy applications.
Efficient approach to the free energy of crystals via Monte Carlo simulations
NASA Astrophysics Data System (ADS)
Navascués, G.; Velasco, E.
2015-08-01
We present a general approach to compute the absolute free energy of a system of particles with constrained center of mass based on the Monte Carlo thermodynamic coupling integral method. The version of the Frenkel-Ladd approach [J. Chem. Phys. 81, 3188 (1984)], 10.1063/1.448024, which uses a harmonic coupling potential, is recovered. Also, we propose a different choice, based on one-particle square-well coupling potentials, which is much simpler, more accurate, and free from some of the difficulties of the Frenkel-Ladd method. We apply our approach to hard spheres and compare with the standard harmonic method.
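Both variants rest on the standard thermodynamic coupling (Kirkwood) integral: with U_λ interpolating between the coupling reference at λ = 0 (harmonic springs in Frenkel-Ladd, one-particle square wells in the proposed alternative) and the system of interest at λ = 1,

\[
F = F_{0} + \int_{0}^{1} d\lambda\,
\left\langle \frac{\partial U_{\lambda}}{\partial \lambda} \right\rangle_{\lambda}.
\]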
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lottes, S.A.; Kulak, R.F.; Bojanowski, C.
2011-12-09
The computational fluid dynamics (CFD) and computational structural mechanics (CSM) focus areas at Argonne's Transportation Research and Analysis Computing Center (TRACC) initiated a project in August 2010 to support and complement the experimental programs at the Turner-Fairbank Highway Research Center (TFHRC) with high-performance-computing-based analysis capabilities. The project was established with a new interagency agreement between the Department of Energy and the Department of Transportation to provide collaborative research, development, and benchmarking of advanced three-dimensional computational mechanics analysis methods to the aerodynamics and hydraulics laboratories at TFHRC for a period of five years, beginning in October 2010. The analysis methods employ well-benchmarked and supported commercial computational mechanics software. Computational mechanics encompasses the areas of Computational Fluid Dynamics (CFD), Computational Wind Engineering (CWE), Computational Structural Mechanics (CSM), and Computational Multiphysics Mechanics (CMM) applied in Fluid-Structure Interaction (FSI) problems. The major areas of focus of the project are wind and water effects on bridges - superstructure, deck, cables, and substructure (including soil), primarily during storms and flood events - and the risks that these loads pose of structural failure. For flood events at bridges, another major focus of the work is assessment of the risk to bridges caused by scour of stream and riverbed material away from the foundations of a bridge. Other areas of current research include modeling of flow through culverts to assess them for fish passage, modeling of salt spray transport into bridge girders to address the suitability of using weathering steel in bridges, CFD analysis of the operation of the wind tunnel in the TFHRC wind engineering laboratory, vehicle stability under high wind loading, and the use of electromagnetic shock absorbers to improve vehicle stability under high wind conditions. This quarterly report documents technical progress on the project tasks for the period of July through September 2011.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shu, Yinan; Levine, Benjamin G., E-mail: levine@chemistry.msu.edu; Hohenstein, Edward G.
2015-01-14
Multireference quantum chemical methods, such as the complete active space self-consistent field (CASSCF) method, have long been the state of the art for computing regions of potential energy surfaces (PESs) where complex, multiconfigurational wavefunctions are required, such as near conical intersections. Herein, we present a computationally efficient alternative to the widely used CASSCF method based on a complete active space configuration interaction (CASCI) expansion built from the state-averaged natural orbitals of configuration interaction singles calculations (CISNOs). This CISNO-CASCI approach is shown to predict vertical excitation energies of molecules with closed-shell ground states similar to those predicted by state-averaged (SA)-CASSCF in many cases and to provide an excellent reference for a perturbative treatment of dynamic electron correlation. Absolute energies computed at the CISNO-CASCI level are found to be variationally superior, on average, to other CASCI methods. Unlike SA-CASSCF, CISNO-CASCI provides vertical excitation energies which are both size intensive and size consistent, thus suggesting that CISNO-CASCI would be preferable to SA-CASSCF for the study of systems with multiple excitable centers. The fact that SA-CASSCF and some other CASCI methods do not provide a size intensive/consistent description of excited states is attributed to changes in the orbitals that occur upon introduction of non-interacting subsystems. Finally, CISNO-CASCI is found to provide a suitable description of the PES surrounding a biradicaloid conical intersection in ethylene.
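In outline, and as a schematic reading of the abstract rather than the paper's working equations, the CISNO orbitals are natural orbitals of the state-averaged CIS one-particle density matrix:

\[
\bar{\gamma} = \frac{1}{N}\sum_{I=1}^{N}\gamma^{(I)}_{\mathrm{CIS}}, \qquad
\bar{\gamma}\,\phi_{p} = n_{p}\,\phi_{p},
\]

with the orbitals of largest occupation defining the active space for the subsequent CASCI expansion.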
First-Principles Thermodynamics Study of Spinel MgAl2O4 Surface Stability
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cai, Qiuxia; Wang, Jian-guo; Wang, Yong
The surface stability of all possible terminations of three low-index (111, 110, 100) structures of spinel MgAl2O4 has been studied using a first-principles-based thermodynamic approach. The surface Gibbs free energy results indicate that the 100_AlO2 termination is the most stable surface structure under ultra-high vacuum at T = 1100 K, regardless of Al-poor or Al-rich environment. With increasing oxygen pressure, the 111_O2(Al) termination becomes the most stable surface in the Al-rich environment. Oxygen vacancy formation is thermodynamically favorable on the 100_AlO2 and 111_O2(Al) terminations and on the (111) structure with Mg/O-connected terminations. On the basis of the surface Gibbs free energies for both perfect and defective surface terminations, 100_AlO2 and 111_O2(Al) are the most dominant surfaces in the Al-rich environment under atmospheric conditions. This is also consistent with our previously reported experimental observations. This work was supported by a Laboratory Directed Research and Development (LDRD) project of the Pacific Northwest National Laboratory (PNNL). The computing time was granted by the National Energy Research Scientific Computing Center (NERSC). Part of the computing time was also granted by a scientific theme user proposal in the William R. Wiley Environmental Molecular Sciences Laboratory (EMSL), which is a U.S. Department of Energy national scientific user facility located at PNNL in Richland, Washington.
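The surface Gibbs free energies referred to above are computed, in such ab initio thermodynamics schemes, per unit area of a symmetric slab, with the chemical potentials carrying the temperature and pressure dependence; a generic statement of the approach (not the paper's exact expression) is

\[
\gamma(T,p) = \frac{1}{2A}\Big(E_{\mathrm{slab}} - \sum_{i} N_{i}\,\mu_{i}(T,p)\Big),
\]

where A is the area of one slab face, N_i the number of atoms of species i in the slab, and μ_O is tied to the oxygen pressure through the ideal-gas chemical potential.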
NASA Technical Reports Server (NTRS)
Pereira, J. M.; Revilock, D. M.
2004-01-01
Under the Federal Aviation Administration's Airworthiness Assurance Center of Excellence and the Aircraft Catastrophic Failure Prevention Program, National Aeronautics and Space Administration Glenn Research Center collaborated with Arizona State University, Honeywell Engines, Systems and Services, and SRI International to develop improved computational models for designing fabric-based engine containment systems. In the study described in this report, ballistic impact tests were conducted on layered dry fabric rings to provide impact response data for calibrating and verifying the improved numerical models. This report provides data on projectile velocity, impact and residual energy, and fabric deformation for a number of different test conditions.
Final Report: Main Group Element Chemistry in Service of Hydrogen Storage and Activation
DOE Office of Scientific and Technical Information (OSTI.GOV)
David A. Dixon; Anthony J. Arduengo, III
2010-09-30
Replacing combustion of carbon-based fuels with alternative energy sources that have minimal environmental impact is one of the grand scientific and technological challenges of the early 21st century. Not only is it critical to capture energy from new, renewable sources, it is also necessary to store the captured energy efficiently and effectively for use at the point of service when and where it is needed, which may not be collocated with the collection site. There are many potential storage media, but we focus on the storage of energy in chemical bonds. It is more efficient to store energy on a per-weight basis in chemical bonds, because it is hard to pack electrons into small volumes with low weight without the use of chemical bonds. The focus of the project was the development of new chemistries to enable DOE to meet its technical objectives for hydrogen storage using chemical hydrogen storage systems. We provided computational chemistry support in terms of thermodynamics, kinetics, and properties prediction in support of the experimental efforts of the DOE Center of Excellence for Chemical Hydrogen Storage. The goal of the Center is to store energy in chemical bonds involving hydrogen atoms. Once the hydrogen is stored in a set of X-H/Y-H bonds, the hydrogen has to be easily released and the depleted fuel regenerated very efficiently. This differs substantially from our current use of fossil fuel energy sources, where the reactant is converted to energy plus CO2 (coal) or CO2 and H2O (gasoline, natural gas), which are released into the atmosphere. In future energy storage scenarios, the spent fuel will be captured and the energy storage medium regenerated, which places substantial additional constraints on the chemistry. The goal of the computational chemistry work was to reduce the time to design new materials and to develop materials that meet the 2010 and 2015 DOE objectives in terms of weight percent, volume, release time, and regeneration ability. This goal was met in terms of reducing the number of costly experiments and helping to focus the experimental effort on the potentially optimal targets. We used computational chemistry approaches to predict the thermodynamic properties of a wide range of compounds containing boron, nitrogen, hydrogen, and other elements as appropriate, including carbon. These calculations were done in most cases with high-level molecular orbital theory methods that have small error bars, on the order of ±1 to 2 kcal/mol. The results were used to benchmark more approximate methods, such as density functional theory, for larger systems and for database development. We predicted reliable thermodynamics for thousands of compounds for release and regeneration schemes to aid and guide materials design and process design and simulation. These are the first reliable computed values for these compounds, and for many they represent the only available values. Overall, the computational results have provided us with new insights into the chemistry of main group and organic-base chemical hydrogen systems, from the release of hydrogen to the regeneration of spent fuel. A number of experimental accomplishments were also made in this project. The experimental work on hydrogen storage materials centered on activated polarized σ- or π-bonded frameworks that hold the potential for ready dihydrogen activation, uptake, and eventual release.
To this end, a large number of non-traditional valence systems including carbenes, cyanocarbons, and C-B and B-N systems were synthesized and examined. During the course of these studies an important lead arose from the novel valency of a class of stable organic singlet biradical systems. A synthetic strategy toward an "endless" hydrogen storage polymer has been developed based on our cyanocarbon chemistry. A key issue with the synthetic efforts was linking the kinetics of release with the size of the substituents, as it was difficult to develop a low-molecular-weight molecule with the right kinetics. A novel hydrogen activation process has been developed which showed that Lewis acid-base pairs need not be "frustrated" in their reactivity towards activating H2; reaction can occur at temperatures as low as -80 °C. We established that the interaction of H2 with the electrophile is a key step in the activation process.
Comparing Server Energy Use and Efficiency Using Small Sample Sizes
DOE Office of Scientific and Technical Information (OSTI.GOV)
Coles, Henry C.; Qin, Yong; Price, Phillip N.
This report documents a demonstration that compared the energy consumption and efficiency of a limited sample size of server-type IT equipment from different manufacturers by measuring power at the server power supply power cords. The results are specific to the equipment and methods used. However, it is hoped that those responsible for IT equipment selection can use the methods described to choose models that optimize energy use efficiency. The demonstration was conducted in a data center at Lawrence Berkeley National Laboratory in Berkeley, California. It was performed with five servers of similar mechanical and electronic specifications: three from Intel and one each from Dell and Supermicro. Server IT equipment is constructed using commodity components, server manufacturer-designed assemblies, and control systems. Server compute efficiency is constrained by the commodity component specifications and integration requirements. The design freedom outside of the commodity component constraints provides room for the manufacturer to offer a product with competitive efficiency that meets market needs at a compelling price. A goal of the demonstration was to compare and quantify the server efficiency for three different brands. The efficiency is defined as the average compute rate (computations per unit of time) divided by the average energy consumption rate. The research team used an industry-standard benchmark software package to provide a repeatable software load to obtain the compute rate and provide a variety of power consumption levels. Energy use when the servers were in an idle state (not providing computing work) was also measured. At high server compute loads, all brands, using the same key components (processors and memory), had similar results; therefore, from these results, it could not be concluded that one brand is more efficient than the others. The test results show that the power consumption variability caused by the key components as a group is similar to that of all other components as a group. However, some differences were observed. The Supermicro server used 27 percent more power at idle compared to the other brands. The Intel server had a power supply control feature called cold redundancy, and the data suggest that cold redundancy can provide energy savings at low power levels. Test and evaluation methods that might be used by others having limited resources for IT equipment evaluation are explained in the report.
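The efficiency definition above (average compute rate divided by average power) is easy to operationalize. A minimal sketch, assuming hypothetical benchmark samples rather than the report's measurements:

```python
from statistics import mean

def efficiency(ops_per_sec_samples, watt_samples):
    """Computations per joule: (ops/s) / (J/s) = ops/J."""
    return mean(ops_per_sec_samples) / mean(watt_samples)

# hypothetical benchmark samples for two servers at the same load level
server_a = efficiency([1.20e9, 1.18e9, 1.22e9], [410, 405, 415])
server_b = efficiency([1.21e9, 1.19e9, 1.20e9], [520, 515, 525])
print(f"A: {server_a:.2e} ops/J, B: {server_b:.2e} ops/J")
```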
75 FR 9199 - Combined Notice of Filings #1
Federal Register 2010, 2011, 2012, 2013, 2014
2010-03-01
..., LLC, Calpine Power America--CA, LLC, Calpine Power America--OR, LLC,CES Marketing IX, LLC,CES Marketing V, L.P.,CES Marketing X, LLC, PCF2, LLC, Mankato Energy Center, LLC, Riverside Energy Center, LLC... Center, LLC, Pine Bluff Energy, LLC, Pastoria Energy Center, LLC, Morgan Energy Center, LLC, MOBILE...
Alternative Fuels Data Center: Lifecycle Energy Balance
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mei, Donghai
2013-05-20
Molecular adsorption of formate and carboxyl on the stoichiometric CeO2(111) and CeO2(110) surfaces was studied using periodic density functional theory (DFT+U) calculations. Two distinguishable adsorption modes (strong and weak) of formate are identified. The bidentate configuration is more stable than the monodentate adsorption configuration. Both formate and carboxyl bind more strongly at the more open CeO2(110) surface. The calculated vibrational frequencies of the two adsorbed species are consistent with experimental measurements. Finally, the effects of U parameters on the adsorption of formate and carboxyl over both CeO2 surfaces were investigated. We found that the geometrical configurations of the two adsorbed species are not affected by using different U parameters (U = 0, 5, and 7). However, the calculated adsorption energy of carboxyl increases pronouncedly with the U value, while the adsorption energy of formate changes only slightly (<0.2 eV). The Bader charge analysis shows that opposite charge transfer occurs for formate and carboxyl adsorption: the adsorbed formate is negatively charged while the adsorbed carboxyl is positively charged. Interestingly, the amount of transferred charge also increases with the U parameter. This work was supported by the Laboratory Directed Research and Development (LDRD) project of the Pacific Northwest National Laboratory (PNNL) and by a Cooperative Research and Development Agreement (CRADA) with General Motors. The computations were performed using the Molecular Science Computing Facility in the William R. Wiley Environmental Molecular Sciences Laboratory (EMSL), which is a U.S. Department of Energy national scientific user facility located at PNNL in Richland, Washington. Part of the computing time was also granted by the National Energy Research Scientific Computing Center (NERSC).
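For readers unfamiliar with the bookkeeping behind such studies, adsorption energies are differences of DFT total energies. A minimal sketch with placeholder energies (the formula is standard; the numbers are not from the paper):

```python
def adsorption_energy(e_slab_adsorbate, e_slab, e_adsorbate):
    """E_ads = E(slab+adsorbate) - E(slab) - E(adsorbate); more negative = more strongly bound."""
    return e_slab_adsorbate - e_slab - e_adsorbate

# placeholder DFT total energies in eV (not values from the paper)
e_ads_110 = adsorption_energy(-1052.7, -1048.9, -1.3)
e_ads_111 = adsorption_energy(-1050.1, -1047.2, -1.3)
print(f"CeO2(110): {e_ads_110:.2f} eV, CeO2(111): {e_ads_111:.2f} eV")
```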
NASA Astrophysics Data System (ADS)
Kalugina, Yulia N.; Roy, Pierre-Nicholas
2017-12-01
We present a five-dimensional potential energy surface (PES) for the HF@C60 system computed at the DF-LMP2/cc-pVTZ level of theory. We also calculated a five-dimensional dipole moment surface (DMS) based on DFT(PBE0)/cc-pVTZ calculations. The HF and C60 molecules are considered rigid with bond length rHF = 0.9255 Å (gas phase ground rovibrational state geometry). The C60 geometry is of Ih symmetry. The ab initio points were fitted to obtain a PES in terms of bipolar spherical harmonics. The minimum of the PES corresponds to a geometry where the center of mass of HF is located 0.11 Å away from the center of the cage with an interaction energy of -6.929 kcal/mol. The DMS was also represented in terms of bipolar spherical harmonics. The PES was used to calculate the rotation-translation bound states of HF@C60, and good agreement was found relative to the available experimental data [A. Krachmalnicoff et al., Nat. Chem. 8, 953 (2016)] except for the splitting of the first rotational excitation levels. We propose an empirical adjustment to the PES in order to account for the experimentally observed symmetry breaking. The form of that effective PES is additive. We also propose an effective Hamiltonian with an adjusted rotational constant in order to quantitatively reproduce the experimental results including the splitting of the first rotational state. We use our models to compute the molecular volume polarizability of HF confined by C60 and obtain good agreement with experiment.
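The fitting step described above, expanding ab initio points in an angular basis, is at heart a linear least-squares problem. A minimal sketch with a one-dimensional cosine basis standing in for the true bipolar spherical harmonics, and synthetic points in place of the DF-LMP2 energies:

```python
import numpy as np

theta = np.linspace(0.0, np.pi, 50)          # sampled orientation angle
energies = -6.9 + 0.8 * np.cos(theta) ** 2   # placeholder "ab initio" points

# linear least-squares fit to a small cosine-power basis
design = np.column_stack([np.cos(theta) ** k for k in range(4)])
coeffs, *_ = np.linalg.lstsq(design, energies, rcond=None)
print("expansion coefficients:", np.round(coeffs, 3))
```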
Beyond Gaussians: a study of single spot modeling for scanning proton dose calculation
Li, Yupeng; Zhu, Ronald X.; Sahoo, Narayan; Anand, Aman; Zhang, Xiaodong
2013-01-01
Active spot scanning proton therapy is becoming increasingly adopted by proton therapy centers worldwide. Unlike passive-scattering proton therapy, active spot scanning proton therapy, especially intensity-modulated proton therapy, requires proper modeling of each scanning spot to ensure accurate computation of the total dose distribution contributed from a large number of spots. During commissioning of the spot scanning gantry at the Proton Therapy Center in Houston, it was observed that the long-range scattering protons in a medium may have been inadequately modeled for high-energy beams by a commercial treatment planning system, which could lead to incorrect prediction of field-size effects on dose output. In the present study, we developed a pencil-beam algorithm for scanning-proton dose calculation by focusing on properly modeling individual scanning spots. All modeling parameters required by the pencil-beam algorithm can be generated based solely on a few sets of measured data. We demonstrated that low-dose halos in single-spot profiles in the medium could be adequately modeled with the addition of a modified Cauchy-Lorentz distribution function to a double-Gaussian function. The field-size effects were accurately computed at all depths and field sizes for all energies, and good dose accuracy was also achieved for patient dose verification. The implementation of the proposed pencil beam algorithm also enabled us to study the importance of different modeling components and parameters at various beam energies. The results of this study may be helpful in improving dose calculation accuracy and simplifying beam commissioning and treatment planning processes for spot scanning proton therapy. PMID:22297324
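The functional form named above can be written compactly. The sketch below combines a double Gaussian with a Lorentzian-type halo term; the parameter values are illustrative stand-ins, not commissioning data, and the exact "modified" Cauchy-Lorentz form used in the paper may differ:

```python
import math

def spot_profile(r, w1, sigma1, w2, sigma2, w3, gamma):
    """Radial dose at off-axis distance r: two Gaussians plus a long-tail halo term."""
    core = w1 * math.exp(-r**2 / (2 * sigma1**2))
    mid = w2 * math.exp(-r**2 / (2 * sigma2**2))
    halo = w3 / (1 + (r / gamma) ** 2)   # Cauchy-Lorentz-type tail
    return core + mid + halo

for r in (0.0, 2.0, 5.0, 15.0):          # mm, illustrative
    print(r, spot_profile(r, 0.70, 3.0, 0.25, 6.0, 0.05, 10.0))
```

The long-tail term is what captures the low-dose halo, and hence the field-size effect: many distant spots each contribute a small halo dose that a pure Gaussian model misses.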
[Ionization energies and infrared spectra studies of histidine using density functional theory].
Hu, Qiong; Wang, Guo-Ying; Liu, Gang; Ou, Jia-Ming; Wang, Rui-Li
2010-05-01
Histidines provide axial ligands to the primary electron donors in photosynthetic reaction centers (RCs) and play an important role in the protein environments of these donors. In this paper the authors present a systematic study of the ionization energies and vibrational properties of histidine using hybrid density functional theory (DFT). All calculations used the B3LYP method in combination with four basis sets: 6-31G(d), 6-31G(df, p), 6-31+G(d), and 6-311+G(2d, 2p), with the aim of investigating how the basis sets influence the calculated results. To investigate solvent effects and gain a detailed understanding of the marker bands of histidine, the ionization energies of histidine and the vibrational frequencies of histidine, both unlabeled and 13C-, 15N-, and 2H-labeled, were also calculated in the gas phase, CCl4, a protein environment, THF, and aqueous solution, which span a wide range of dielectric constants. Our results showed that: (1) the main geometry parameters of histidine were affected by the basis sets and the media, and the C2-N3 and N3-C4 bonds of the imidazole ring of the histidine side chain display their maximum bond lengths in the gas phase; (2) for a given solvent, the calculated single-point energies and frequencies decreased, while ionization energies increased, with increasing basis-set level and the addition of diffuse functions; (3) with the same computational method, the higher the dielectric constant of the solvent used, the lower the ionization energy and vibrational frequency and the higher the intensity obtained. In addition, the calculated gas-phase ionization energy and marker bands of histidine, as well as the frequency shifts upon 13C and 15N labeling, at the computationally more expensive 6-311+G(2d, 2p) level are in good agreement with the experimental observations available in the literature. All calculations indicated that results obtained with a higher-level basis set including diffuse functions were more accurate and closer to the experimental values. In conclusion, the results provide useful information for further studies of the functional and vibrational properties of chlorophyll-a ligated to histidine residues in photosynthetic reaction centers.
Energy System Integration Facility Secure Data Center | Energy Systems Integration Facility | NREL
The Energy Systems Integration Facility's Secure Data Center provides
Multigrid treatment of implicit continuum diffusion
NASA Astrophysics Data System (ADS)
Francisquez, Manaure; Zhu, Ben; Rogers, Barrett
2017-10-01
Implicit treatment of diffusive terms of various differential orders common in continuum mechanics modeling, such as computational fluid dynamics, is investigated with spectral and multigrid algorithms in non-periodic 2D domains. In doubly periodic time dependent problems these terms can be efficiently and implicitly handled by spectral methods, but in non-periodic systems solved with distributed memory parallel computing and 2D domain decomposition, this efficiency is lost for large numbers of processors. We built and present here a multigrid algorithm for these types of problems which outperforms a spectral solution that employs the highly optimized FFTW library. This multigrid algorithm is not only suitable for high performance computing but may also be able to efficiently treat implicit diffusion of arbitrary order by introducing auxiliary equations of lower order. We test these solvers for fourth and sixth order diffusion with idealized harmonic test functions as well as a turbulent 2D magnetohydrodynamic simulation. It is also shown that an anisotropic operator without cross-terms can improve model accuracy and speed, and we examine the impact that the various diffusion operators have on the energy, the enstrophy, and the qualitative aspect of a simulation. This work was supported by DOE-SC-0010508. This research used resources of the National Energy Research Scientific Computing Center (NERSC).
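The idea of treating high-order implicit diffusion "by introducing auxiliary equations of lower order" can be shown on a toy problem. The sketch below takes one backward-Euler step of fourth-order diffusion in 1D periodic form by introducing v = ∇²u and solving a coupled second-order block system; the direct solve stands in for the multigrid cycle, and everything about the setup (1D, periodic, sizes) is a simplifying assumption:

```python
import numpy as np

n, dx, dt, nu = 64, 1.0 / 64, 1e-4, 1.0

# 1D periodic Laplacian (placeholder for the paper's 2D non-periodic setting)
L = (np.diag(-2.0 * np.ones(n)) + np.diag(np.ones(n - 1), 1)
     + np.diag(np.ones(n - 1), -1))
L[0, -1] = L[-1, 0] = 1.0
L /= dx**2

# Backward Euler for u_t = -nu * L(L u), written as the coupled second-order
# system  u + dt*nu*L v = u_old,  -L u + v = 0  with auxiliary variable v = L u.
I = np.eye(n)
A = np.block([[I, dt * nu * L], [-L, I]])

u = np.sin(2 * np.pi * np.arange(n) * dx)
sol = np.linalg.solve(A, np.concatenate([u, np.zeros(n)]))  # multigrid would replace this
u_new = sol[:n]
print(f"max |u| after one implicit step: {np.abs(u_new).max():.6f}")
```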
Dong, Yu-Shuang; Xu, Gao-Chao; Fu, Xiao-Dong
2014-01-01
The cloud platform provides various services to users. More and more cloud centers provide infrastructure as their main way of operating. To improve the utilization rate of the cloud center and to decrease the operating cost, the cloud center provides services according to the requirements of users by sharding resources through virtualization. Considering both QoS for users and cost savings for cloud computing providers, we try to maximize performance and minimize energy cost as well. In this paper, we propose a distributed parallel genetic algorithm (DPGA) as a placement strategy for virtual machine deployment on the cloud platform. In the first stage, it executes the genetic algorithm in parallel and in a distributed fashion on several selected physical hosts. It then executes the genetic algorithm of the second stage with the solutions obtained from the first stage as the initial population. The solution calculated by the genetic algorithm of the second stage is the optimal one of the proposed approach. The experimental results show that the proposed placement strategy for VM deployment can ensure QoS for users and is more effective and more energy efficient than other placement strategies on the cloud platform. PMID:25097872
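A single-population genetic algorithm conveys the core of the placement strategy; the paper's DPGA additionally distributes such populations across selected hosts and seeds a second stage with first-stage winners. In the sketch below, the host power model, problem sizes, and GA parameters are all placeholders:

```python
import random

HOSTS, VMS = 4, 10
load = [random.uniform(0.5, 2.0) for _ in range(VMS)]  # VM CPU demand

def energy(placement):
    """Idle-plus-linear host power model: each active host costs a base wattage."""
    util = [0.0] * HOSTS
    for vm, host in enumerate(placement):
        util[host] += load[vm]
    return sum(100 + 50 * u for u in util if u > 0)

def evolve(pop_size=40, generations=200):
    pop = [[random.randrange(HOSTS) for _ in range(VMS)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=energy)
        survivors = pop[: pop_size // 2]          # elitist selection
        children = []
        for _ in range(pop_size - len(survivors)):
            a, b = random.sample(survivors, 2)
            cut = random.randrange(VMS)
            child = a[:cut] + b[cut:]             # one-point crossover
            if random.random() < 0.2:             # mutation
                child[random.randrange(VMS)] = random.randrange(HOSTS)
            children.append(child)
        pop = survivors + children
    return min(pop, key=energy)

best = evolve()
print(best, round(energy(best), 1))
```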
The NASA B-757 HIRF Test Series: Flight Test Results
NASA Technical Reports Server (NTRS)
Moeller, Karl J.; Dudley, Kenneth L.
1997-01-01
In 1995, the NASA Langley Research Center conducted a series of aircraft tests aimed at characterizing the electromagnetic environment (EME) in and around a Boeing 757 airliner. Measurements were made of the electromagnetic energy coupled into the aircraft and the signals induced on select structures as the aircraft was flown past known RF transmitters. These measurements were conducted to provide data for the validation of computational techniques for the assessment of electromagnetic effects in commercial transport aircraft. This paper reports on the results of flight tests using RF radiators in the HF, VHF, and UHF ranges and on efforts to use computational and analytical techniques to predict RF field levels inside the airliner at these frequencies.
Computers in aeronautics and space research at the Lewis Research Center
NASA Technical Reports Server (NTRS)
1991-01-01
This brochure presents a general discussion of the role of computers in aerospace research at NASA's Lewis Research Center (LeRC). Four particular areas of computer applications are addressed: computer modeling and simulation, computer assisted engineering, data acquisition and analysis, and computer controlled testing.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Duignan, Timothy T.; Baer, Marcel D.; Schenter, Gregory K.
Determining the solvation free energies of single ions in water is one of the most fundamental problems in physical chemistry and yet many unresolved questions remain. In particular, the ability to decompose the solvation free energy into simple and intuitive contributions will have important implications for coarse grained models of electrolyte solution. Here, we provide rigorous definitions of the various types of single ion solvation free energies based on different simulation protocols. We calculate solvation free energies of charged hard spheres using density functional theory interaction potentials with molecular dynamics simulation (DFT-MD) and isolate the effects of charge and cavitation, comparing to the Born (linear response) model. We show that using uncorrected Ewald summation leads to highly unphysical values for the solvation free energy and that charging free energies for cations are approximately linear as a function of charge but that there is a small non-linearity for small anions. The charge hydration asymmetry (CHA) for hard spheres, determined with quantum mechanics, is much larger than for the analogous real ions. This suggests that real ions, particularly anions, are significantly more complex than simple charged hard spheres, a commonly employed representation. We would like to thank Thomas Beck, Shawn Kathmann, Richard Remsing and John Weeks for helpful discussions. Computing resources were generously allocated by PNNL's Institutional Computing program. This research also used resources of the National Energy Research Scientific Computing Center, a DOE Office of Science User Facility supported by the Office of Science of the U.S. Department of Energy under Contract No. DE-AC02-05CH11231. TTD, GKS, and CJM were supported by the U.S. Department of Energy, Office of Science, Office of Basic Energy Sciences, Division of Chemical Sciences, Geosciences, and Biosciences. MDB was supported by the MS3 (Materials Synthesis and Simulation Across Scales) Initiative, a Laboratory Directed Research and Development Program at Pacific Northwest National Laboratory (PNNL). PNNL is a multi-program national laboratory operated by Battelle for the U.S. Department of Energy.
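The Born (linear response) model referenced above has a closed form worth stating: dG = -(q^2 / 8*pi*eps0*a) * (1 - 1/eps_r) for a charge q on a sphere of radius a in a dielectric eps_r. A minimal sketch, with the ionic radius chosen illustratively:

```python
import math

EPS0 = 8.8541878128e-12   # vacuum permittivity, F/m
Q_E = 1.602176634e-19     # elementary charge, C
N_A = 6.02214076e23       # Avogadro's number

def born_solvation_kj_mol(charge_e, radius_nm, eps_r=78.4):
    """Born estimate of the solvation free energy, in kJ/mol."""
    q, a = charge_e * Q_E, radius_nm * 1e-9
    return -(q**2 / (8 * math.pi * EPS0 * a)) * (1 - 1 / eps_r) * N_A / 1000.0

# a roughly Na+-sized sphere (radius is an illustrative choice, not the paper's)
print(f"{born_solvation_kj_mol(1.0, 0.14):.0f} kJ/mol")
```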
A demand-centered, hybrid life-cycle methodology for city-scale greenhouse gas inventories.
Ramaswami, Anu; Hillman, Tim; Janson, Bruce; Reiner, Mark; Thomas, Gregg
2008-09-01
Greenhouse gas (GHG) accounting for individual cities is confounded by spatial scale and boundary effects that impact the allocation of regional material and energy flows. This paper develops a demand-centered, hybrid life-cycle-based methodology for conducting city-scale GHG inventories that incorporates (1) spatial allocation of surface and airline travel across colocated cities in larger metropolitan regions, and, (2) life-cycle assessment (LCA) to quantify the embodied energy of key urban materials--food, water, fuel, and concrete. The hybrid methodology enables cities to separately report the GHG impact associated with direct end-use of energy by cities (consistent with EPA and IPCC methods), as well as the impact of extra-boundary activities such as air travel and production of key urban materials (consistent with Scope 3 protocols recommended by the World Resources Institute). Application of this hybrid methodology to Denver, Colorado, yielded a more holistic GHG inventory that approaches a GHG footprint computation, with consistency of inclusions across spatial scale as well as convergence of city-scale per capita GHG emissions (approximately 25 mt CO2e/person/year) with state and national data. The method is shown to have significant policy impacts, and also demonstrates the utility of benchmarks in understanding energy use in various city sectors.
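The inventory arithmetic reduces to summing direct and extra-boundary terms and normalizing per capita. A minimal sketch with placeholder figures (not Denver's actual data):

```python
# annual emissions in metric tons CO2e (illustrative placeholders)
direct = {"electricity": 7.5e6, "natural_gas": 3.0e6, "surface_transport": 4.0e6}
extra_boundary = {"air_travel": 1.5e6, "embodied_food_water_fuel_concrete": 2.0e6}
population = 600_000  # hypothetical city population

total = sum(direct.values()) + sum(extra_boundary.values())
print(f"{total / population:.1f} mt CO2e/person/yr")
```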
A Comprehensive Opacities/Atomic Database for the Analysis of Astrophysical Spectra and Modeling
NASA Technical Reports Server (NTRS)
Pradhan, Anil K. (Principal Investigator)
1997-01-01
The main goals of this ADP award have been accomplished. The electronic database TOPBASE, consisting of the large volume of atomic data from the Opacity Project, has been installed and is operative at a NASA site at the Laboratory for High Energy Astrophysics Science Research Center (HEASRC) at the Goddard Space Flight Center. The database will be continually maintained and updated by the PI and collaborators. TOPBASE is publicly accessible from IP: topbase.gsfc.nasa.gov. During the last six months (since the previous progress report), considerable work has been carried out to: (1) put in the new data for low ionization stages of iron (Fe I-V), beginning with Fe II; (2) merge high-energy photoionization cross sections computed by Dr. Hong Lin Zhang (consultant on the Project) with the current Opacity Project data and enter them into TOPbase; and (3) lay out plans for a further extension of TOPbase to include TIPbase, the database of collisional data that complements the radiative data in TOPbase.
NASA Astrophysics Data System (ADS)
Hakim, Ammar; Shi, Eric; Juno, James; Bernard, Tess; Hammett, Greg
2017-10-01
For weakly collisional (or collisionless) plasmas, kinetic effects are required to capture the physics of micro-turbulence. We have implemented solvers for kinetic and gyrokinetic equations in the computational plasma physics framework Gkeyll. We use a version of the discontinuous Galerkin scheme that conserves energy exactly. Plasma sheaths are modeled with novel boundary conditions. Positivity of distribution functions is maintained via a reconstruction method, allowing robust simulations that continue to conserve energy even with positivity limiters. We have performed a large number of benchmarks, verifying the accuracy and robustness of our code. We demonstrate the application of our algorithm to two classes of problems: (a) Vlasov-Maxwell simulations of turbulence in a magnetized plasma, applicable to space plasmas; and (b) gyrokinetic simulations of turbulence in open-field-line geometries, applicable to laboratory plasmas. Supported by the Max-Planck/Princeton Center for Plasma Physics, the SciDAC Center for the Study of Plasma Microturbulence, and DOE Contract DE-AC02-09CH11466.
Advanced computational tools for 3-D seismic analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Barhen, J.; Glover, C.W.; Protopopescu, V.A.
1996-06-01
The global objective of this effort is to develop advanced computational tools for 3-D seismic analysis and test the products using a model dataset developed under the joint aegis of the United States' Society of Exploration Geophysicists (SEG) and the European Association of Exploration Geophysicists (EAEG). The goal is to enhance the value to the oil industry of the SEG/EAEG modeling project, carried out with US Department of Energy (DOE) funding in FY 1993-95. The primary objective of the ORNL Center for Engineering Systems Advanced Research (CESAR) is to spearhead the computational innovations that would enable a revolutionary advance in 3-D seismic analysis. The CESAR effort is carried out in collaboration with world-class domain experts from leading universities, and in close coordination with other national laboratories and oil industry partners.
Hartree-Fock calculation of the differential photoionization cross sections of small Li clusters
DOE Office of Scientific and Technical Information (OSTI.GOV)
Galitskiy, S. A.; Artemyev, A. N.; Jänkälä, K.
2015-01-21
Cross sections and angular distribution parameters for the single-photon ionization of all electron orbitals of Li2-8 are systematically computed over a broad interval of photoelectron kinetic energies for the energetically most stable geometry of each cluster. Calculations of the partial photoelectron continuum waves in clusters are carried out by the single-center method within the Hartree-Fock approximation. We study photoionization cross sections per electron and analyze in some detail general trends in the photoionization of inner and outer shells with respect to the size and geometry of a cluster. The present differential cross sections computed for Li2 are in good agreement with the available theoretical data, whereas those computed for Li3-8 clusters can be considered theoretical predictions.
Finite Element Analysis of an Energy Absorbing Sub-floor Structure
NASA Technical Reports Server (NTRS)
Moore, Scott C.
1995-01-01
As part of the Advanced General Aviation Transportation Experiments program, the National Aeronautics and Space Administration's Langley Research Center is conducting tests to design energy absorbing structures to improve occupant survivability in aircraft crashes. An effort is currently underway to design an Energy Absorbing (EA) sub-floor structure which will reduce occupant loads in an aircraft crash. However, a recent drop test of a fuselage specimen with a proposed EA sub-floor structure demonstrated that the effects of sectioning the fuselage on both the fuselage section's stiffness and the performance of the EA structure were not fully understood. Therefore, attempts are underway to model the proposed sub-floor structure on computers using the DYCAST finite element code to provide a better understanding of the structure's behavior in testing, and in an actual crash.
Scalar-fluid interacting dark energy: Cosmological dynamics beyond the exponential potential
NASA Astrophysics Data System (ADS)
Dutta, Jibitesh; Khyllep, Wompherdeiki; Tamanini, Nicola
2017-01-01
We extend the dynamical systems analysis of scalar-fluid interacting dark energy models performed in C. G. Boehmer et al., Phys. Rev. D 91, 123002 (2015), 10.1103/PhysRevD.91.123002 by considering scalar field potentials beyond the exponential type. The properties and stability of critical points are examined using a combination of linear analysis, computational methods and advanced mathematical techniques, such as center manifold theory. We show that the interesting results obtained with an exponential potential can generally be recovered also for more complicated scalar field potentials. In particular, employing power law and hyperbolic potentials as examples, we find late time accelerated attractors, transitions from dark matter to dark energy domination with specific distinguishing features, and accelerated scaling solutions capable of solving the cosmic coincidence problem.
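The "linear analysis" step mentioned above classifies each critical point by the eigenvalues of the Jacobian there; center manifold theory is needed precisely when some eigenvalue has zero real part. A generic sketch on a placeholder two-dimensional system, not the paper's cosmological equations:

```python
import numpy as np

def jacobian(f, x, eps=1e-6):
    """Forward-difference Jacobian of f at x."""
    n = len(x)
    J = np.zeros((n, n))
    fx = np.asarray(f(x))
    for j in range(n):
        xp = np.array(x, dtype=float)
        xp[j] += eps
        J[:, j] = (np.asarray(f(xp)) - fx) / eps
    return J

# placeholder autonomous system with a critical point at (0.5, 0.5)
f = lambda x: [x[0] * (1 - x[0]) - x[0] * x[1], x[1] * (x[0] - 0.5)]
eigs = np.linalg.eigvals(jacobian(f, [0.5, 0.5]))
print("eigenvalues:", np.round(eigs, 3),
      "- stable" if all(e.real < 0 for e in eigs) else "")
```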
Computational Simulation of High Energy Density Plasmas
2009-10-30
the imploding liner. The PFS depends on a lithium barrier foil slowing the advance of deuterium up the coaxial gun to the corner. There the plasma ... the coaxial gun section, and Figure 4 shows the physical state of the plasma just prior to pinch. Figure 5 shows neutron yield reaching 10^14 in this ... details the channel geometry between the center cylinder and coaxial gas gun. The deuterium injection starts when the pressure of the deuterium gas in
Galactic cosmic ray radiation levels in spacecraft on interplanetary missions
NASA Technical Reports Server (NTRS)
Shinn, J. L.; Nealy, J. E.; Townsend, L. W.; Wilson, J. W.; Wood, J.S.
1994-01-01
Using the Langley Research Center Galactic Cosmic Ray (GCR) transport computer code (HZETRN) and the Computerized Anatomical Man (CAM) model, crew radiation levels inside manned spacecraft on interplanetary missions are estimated. These radiation-level estimates include particle fluxes, LET (Linear Energy Transfer) spectra, absorbed dose, and dose equivalent within various organs of interest in GCR protection studies. Changes in these radiation levels resulting from the use of various different types of shield materials are presented.
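One of the quantities listed, dose equivalent, is obtained by weighting the absorbed dose in each LET bin by a quality factor. A minimal sketch using the ICRP 60 Q(L) relationship and a placeholder spectrum (HZETRN's actual pipeline is far more detailed):

```python
def q_icrp60(let):
    """ICRP 60 quality factor as a function of LET in keV/um."""
    if let < 10:
        return 1.0
    if let <= 100:
        return 0.32 * let - 2.2
    return 300.0 / let**0.5

# (LET [keV/um], absorbed dose in that bin [mGy]) - illustrative only
spectrum = [(0.5, 120.0), (15.0, 30.0), (80.0, 8.0), (200.0, 2.0)]
dose_eq = sum(q_icrp60(let) * dose for let, dose in spectrum)
print(f"dose equivalent: {dose_eq:.0f} mSv")
```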
DOE Office of Scientific and Technical Information (OSTI.GOV)
Anzai, Chihaya; Hasselhuhn, Alexander; Höschele, Maik
We compute the contribution to the total cross section for the inclusive production of a Standard Model Higgs boson induced by two quarks with different flavour in the initial state. Our calculation is exact in the Higgs boson mass and the partonic center-of-mass energy. Here, we describe the reduction to master integrals, the construction of a canonical basis, and the solution of the corresponding differential equations. Our analytic result contains both Harmonic Polylogarithms and iterated integrals with additional letters in the alphabet.
Operational Characteristics of a High Voltage Dense Plasma Focus.
1985-11-01
A high-voltage dense plasma focus powered by a single-stage Marx bank was designed, built, and operated. The maximum bank parameters are: voltage, 120 kV; energy, 20 kJ; short-circuit current, 600 kA. The bank impedance is about 200 milliohms. The plasma focus center electrode diameter is 1.27 cm. The ... about 50 milliohms. The context of this work is established with a review of previous plasma focus theoretical, experimental, and computational work and
Federal Register 2010, 2011, 2012, 2013, 2014
2011-01-18
... Wind Energy Center Project (DOE/EIS-0461), and Proposed Crowned Ridge Wind Energy Center Project (DOE... to prepare environmental impact statements (EISs) for the Hyde County Wind Energy Center Project and the Crowned Ridge Wind Energy Center Project in the Federal Register on November 30, 2010. Both...
Clean Energy Solutions Center Services (Arabic Translation) (Fact Sheet)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
2014-06-01
This is the Arabic translation of the Clean Energy Solutions Center Services fact sheet. The Clean Energy Solutions Center (Solutions Center) helps governments, advisors and analysts create policies and programs that advance the deployment of clean energy technologies. The Solutions Center partners with international organizations to provide online training, expert assistance, and technical resources on clean energy policy.
Data Center Energy Efficiency Standards in India: Preliminary Findings from Global Practices
DOE Office of Scientific and Technical Information (OSTI.GOV)
Raje, Sanyukta; Maan, Hermant; Ganguly, Suprotim
Global data center energy consumption is growing rapidly. In India, information technology industry growth, fossil-fuel generation, and rising energy prices add significant operational costs and carbon emissions from energy-intensive data centers. Adoption of energy-efficient practices can improve the global competitiveness and sustainability of data centers in India. Previous studies have concluded that advancement of energy efficiency standards through policy and regulatory mechanisms is the fastest path to accelerate the adoption of energy-efficient practices in Indian data centers. In this study, we reviewed data center energy efficiency practices in the United States, Europe, and Asia. Using evaluation metrics, we identified an initial set of energy efficiency standards applicable to the Indian context using the existing policy mechanisms. These preliminary findings support next steps to recommend energy efficiency standards and inform policy makers on strategies to adopt energy-efficient technologies and practices in Indian data centers.
Predicting In-Situ X-ray Diffraction for the SrTiO3/Liquid Interface from First Principles
NASA Astrophysics Data System (ADS)
Letchworth-Weaver, Kendra; Gunceler, Deniz; Sundararaman, Ravishankar; Huang, Xin; Brock, Joel; Arias, T. A.
2013-03-01
Recent advances in experimental techniques, such as in-situ x-ray diffraction, allow researchers to probe the solid-liquid interface in electrochemical systems under operating conditions. These advances offer an unprecedented opportunity for theory to predict properties of electrode materials in aqueous environments and inform the design of energy conversion and storage devices. To compare with experiment, these theoretical studies require microscopic details of both the liquid and the electrode surface. Joint Density Functional Theory (JDFT), a computationally efficient alternative to molecular dynamics, couples a classical density-functional, which captures molecular structure of the liquid, to a quantum-mechanical functional for the electrode surface. We present a JDFT exploration of SrTiO3, which can catalyze solar-driven water splitting, in an electrochemical environment. We determine the geometry of the polar SrTiO3 surface and the equilibrium structure of the contacting liquid, as well as the influence of the liquid upon the electronic structure of the surface. We then predict the effect of the fluid environment on x-ray diffraction patterns and compare our predictions to in-situ measurements performed at the Cornell High Energy Synchrotron Source (CHESS). This material is based upon work supported by the Energy Materials Center at Cornell (EMC2), an Energy Frontier Research Center funded by the U.S. Department of Energy.
Energy balance and mass conservation in reduced order models of fluid flows
NASA Astrophysics Data System (ADS)
Mohebujjaman, Muhammad; Rebholz, Leo G.; Xie, Xuping; Iliescu, Traian
2017-10-01
In this paper, we investigate theoretically and computationally the conservation properties of reduced order models (ROMs) for fluid flows. Specifically, we investigate whether the ROMs satisfy the same (or similar) energy balance and mass conservation as those satisfied by the Navier-Stokes equations. All of our theoretical findings are illustrated and tested in numerical simulations of a 2D flow past a circular cylinder at a Reynolds number Re = 100. First, we investigate the ROM energy balance. We show that using the snapshot average for the centering trajectory (which is a popular treatment of nonhomogeneous boundary conditions in ROMs) yields an incorrect energy balance. Then, we propose a new approach, in which we replace the snapshot average with the Stokes extension. Theoretically, the Stokes extension produces an accurate energy balance. Numerically, the Stokes extension yields more accurate results than the standard snapshot average, especially for longer time intervals. Our second contribution centers around ROM mass conservation. We consider ROMs created using two types of finite elements: the standard Taylor-Hood (TH) element, which satisfies the mass conservation weakly, and the Scott-Vogelius (SV) element, which satisfies the mass conservation pointwise. Theoretically, the error estimates for the SV-ROM are sharper than those for the TH-ROM. Numerically, the SV-ROM yields significantly more accurate results, especially for coarser meshes and longer time intervals.
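The "snapshot average for the centering trajectory" criticized above is a one-line operation in a POD workflow. A minimal sketch with random placeholder snapshots, shown only to make the centering step concrete; the paper's Stokes-extension alternative would replace the mean with a Stokes solve:

```python
import numpy as np

snapshots = np.random.rand(500, 40)              # placeholder: 40 snapshots, 500 DOFs
center = snapshots.mean(axis=1, keepdims=True)   # snapshot-average centering
fluctuations = snapshots - center

# POD modes = left singular vectors of the centered snapshot matrix
modes, sing_vals, _ = np.linalg.svd(fluctuations, full_matrices=False)
r = 6                                            # retained modes
basis = modes[:, :r]
coeffs = basis.T @ fluctuations                  # reduced coordinates of snapshots
print(basis.shape, coeffs.shape)
```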
Alloy Design Workbench-Surface Modeling Package Developed
NASA Technical Reports Server (NTRS)
Abel, Phillip B.; Noebe, Ronald D.; Bozzolo, Guillermo H.; Good, Brian S.; Daugherty, Elaine S.
2003-01-01
NASA Glenn Research Center's Computational Materials Group has integrated a graphical user interface with in-house-developed surface modeling capabilities, with the goal of using computationally efficient atomistic simulations to aid the development of advanced aerospace materials, through the modeling of alloy surfaces, surface alloys, and segregation. The software is also ideal for modeling nanomaterials, since surface and interfacial effects can dominate material behavior and properties at this level. Through the combination of an accurate atomistic surface modeling methodology and an efficient computational engine, it is now possible to directly model these types of surface phenomena and metallic nanostructures without a supercomputer. Fulfilling a High Operating Temperature Propulsion Components (HOTPC) project level-I milestone, a graphical user interface was created for a suite of quantum approximate atomistic materials modeling Fortran programs developed at Glenn. The resulting "Alloy Design Workbench-Surface Modeling Package" (ADW-SMP) is the combination of proven quantum approximate Bozzolo-Ferrante-Smith (BFS) algorithms (refs. 1 and 2) with a productivity-enhancing graphical front end. Written in the portable, platform independent Java programming language, the graphical user interface calls on extensively tested Fortran programs running in the background for the detailed computational tasks. Designed to run on desktop computers, the package has been deployed on PC, Mac, and SGI computer systems. The graphical user interface integrates two modes of computational materials exploration. One mode uses Monte Carlo simulations to determine lowest energy equilibrium configurations. The second approach is an interactive "what if" comparison of atomic configuration energies, designed to provide real-time insight into the underlying drivers of alloying processes.
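The first mode described, Monte Carlo search for lowest-energy configurations, typically rests on Metropolis acceptance. A toy sketch with a stand-in pair energy (the package itself uses the BFS method, not this toy functional):

```python
import math, random

def metropolis_step(config, energy_fn, kT):
    """Propose a single-site species change; accept by the Metropolis criterion."""
    i = random.randrange(len(config))
    trial = list(config)
    trial[i] = random.choice("AB")
    dE = energy_fn(trial) - energy_fn(config)
    if dE <= 0 or random.random() < math.exp(-dE / kT):
        return trial
    return config

# toy energy: each unlike neighbor pair costs +0.1 (placeholder, not BFS)
energy = lambda c: 0.1 * sum(a != b for a, b in zip(c, c[1:]))
config = [random.choice("AB") for _ in range(20)]
for _ in range(1000):
    config = metropolis_step(config, energy, kT=0.05)
print("".join(config), energy(config))
```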
Federal Register 2010, 2011, 2012, 2013, 2014
2010-08-05
...- 000; EG10-34-000; EG10-34-000; EG10-35-000; EG10-36-000; EG10-37-000; EG10-38-000] Cedro Hill Wind LLC; Butler Ridge Wind Energy Center, LLC; High Majestic Wind Energy Center, LLC; Wessington Wind Energy Center, LLC; Juniper Canyon Wind Power LLC; Loraine Windpark Project, LLC; White Oak Energy LLC; Meadow...
User-Centered Computer Aided Language Learning
ERIC Educational Resources Information Center
Zaphiris, Panayiotis, Ed.; Zacharia, Giorgos, Ed.
2006-01-01
In the field of computer aided language learning (CALL), there is a need for emphasizing the importance of the user. "User-Centered Computer Aided Language Learning" presents methodologies, strategies, and design approaches for building interfaces for a user-centered CALL environment, creating a deeper understanding of the opportunities and…
CFD Modeling Activities at the NASA Stennis Space Center
NASA Technical Reports Server (NTRS)
Allgood, Daniel
2007-01-01
A viewgraph presentation on NASA Stennis Space Center's Computational Fluid Dynamics (CFD) Modeling activities is shown. The topics include: 1) Overview of NASA Stennis Space Center; 2) Role of Computational Modeling at NASA-SSC; 3) Computational Modeling Tools and Resources; and 4) CFD Modeling Applications.
Protein–protein docking by fast generalized Fourier transforms on 5D rotational manifolds
Padhorny, Dzmitry; Kazennov, Andrey; Zerbe, Brandon S.; Porter, Kathryn A.; Xia, Bing; Mottarella, Scott E.; Kholodov, Yaroslav; Ritchie, David W.; Vajda, Sandor; Kozakov, Dima
2016-01-01
Energy evaluation using fast Fourier transforms (FFTs) enables sampling billions of putative complex structures and hence revolutionized rigid protein–protein docking. However, in current methods, efficient acceleration is achieved only in either the translational or the rotational subspace. Developing an efficient and accurate docking method that expands FFT-based sampling to five rotational coordinates is an extensively studied but still unsolved problem. The algorithm presented here retains the accuracy of earlier methods but yields at least 10-fold speedup. The improvement is due to two innovations. First, the search space is treated as the product manifold SO(3)×(SO(3)∖S1), where SO(3) is the rotation group representing the space of the rotating ligand, and (SO(3)∖S1) is the space spanned by the two Euler angles that define the orientation of the vector from the center of the fixed receptor toward the center of the ligand. This representation enables the use of efficient FFT methods developed for SO(3). Second, we select the centers of highly populated clusters of docked structures, rather than the lowest energy conformations, as predictions of the complex, and hence there is no need for very high accuracy in energy evaluation. Therefore, it is sufficient to use a limited number of spherical basis functions in the Fourier space, which increases the efficiency of sampling while retaining the accuracy of docking results. A major advantage of the method is that, in contrast to classical approaches, increasing the number of correlation function terms is computationally inexpensive, which enables using complex energy functions for scoring. PMID:27412858
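The classical translational version of the FFT trick the paper generalizes is compact enough to show: for a fixed ligand rotation, one FFT product scores all translations at once. The grids below are random placeholders standing in for real receptor and ligand scoring grids:

```python
import numpy as np

receptor = np.random.rand(32, 32, 32)   # placeholder occupancy/energy grid
ligand = np.random.rand(32, 32, 32)

# circular cross-correlation over all translations via the convolution theorem
score = np.fft.ifftn(np.fft.fftn(receptor) * np.conj(np.fft.fftn(ligand))).real
best = np.unravel_index(np.argmax(score), score.shape)
print("best translation (grid units):", best)
```

The paper's contribution is to move this acceleration from the translational grid to five rotational coordinates on SO(3)×(SO(3)∖S1), where the same one-transform-scores-everything economy applies.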
Powers, Chuck
2017-12-11
The Data Center in the Research Support Facility on the campus of the U.S. Department of Energy's National Renewable Energy Laboratory (NREL) is a significant accomplishment in ultra-efficient design. Data centers by nature are very energy intensive. The RSF Data Center was designed to use 80% less energy than NREL's old data center, which had been in use for the previous 30 years. This tour takes you through the data center, highlighting its energy-saving techniques.
U.S. Department of Energy's Bioenergy Research Centers An Overview of the Science
DOE Office of Scientific and Technical Information (OSTI.GOV)
None
2010-07-01
Alternative fuels from renewable cellulosic biomass - plant stalks, trunks, stems, and leaves - are expected to significantly reduce U.S. dependence on imported oil while enhancing national energy security and decreasing the environmental impacts of energy use. Ethanol and other advanced biofuels from cellulosic biomass are renewable alternatives that could increase domestic production of transportation fuels, revitalize rural economies, and reduce carbon dioxide and pollutant emissions. According to U.S. Secretary of Energy Steven Chu, 'Developing the next generation of biofuels is key to our effort to end our dependence on foreign oil and address the climate crisis while creating millions of new jobs that can't be outsourced.' Although cellulosic ethanol production has been demonstrated on a pilot level, developing a cost-effective, commercial-scale cellulosic biofuel industry will require transformational science to significantly streamline current production processes. Woodchips, grasses, cornstalks, and other cellulosic biomass are widely abundant but more difficult to break down into sugars than corn grain - the primary source of U.S. ethanol fuel production today. Biological research is key to accelerating the deconstruction of cellulosic biomass into sugars that can be converted to biofuels. The Department of Energy (DOE) Office of Science has played a major role in inspiring, supporting, and guiding the biotechnology revolution over the past 30 years. The DOE Genomic Science program is advancing a new generation of research focused on achieving whole-systems understanding of biology. This program is bringing together scientists in diverse fields to understand the complex biology underlying solutions to DOE missions in energy production, environmental remediation, and climate change science. For more information on the Genomic Science program, see p. 26. To focus the most advanced biotechnology-based resources on the biological challenges of biofuel production, DOE established three Bioenergy Research Centers (BRCs) in September 2007. Each center is pursuing the basic research underlying a range of high-risk, high-return biological solutions for bioenergy applications. Advances resulting from the BRCs are providing the knowledge needed to develop new biobased products, methods, and tools that the emerging biofuel industry can use (see sidebar, Bridging the Gap from Fundamental Biology to Industrial Innovation for Bioenergy, p. 6). The DOE BRCs have developed automated, high-throughput analysis pipelines that will accelerate scientific discovery for biology-based biofuel research. The three centers, which were selected through a scientific peer-review process, are based in geographically diverse locations - the Southeast, the Midwest, and the West Coast - with partners across the nation (see U.S. map, DOE Bioenergy Research Centers and Partners, on back cover). DOE's Lawrence Berkeley National Laboratory leads the DOE Joint BioEnergy Institute (JBEI) in California; DOE's Oak Ridge National Laboratory leads the BioEnergy Science Center (BESC) in Tennessee; and the University of Wisconsin-Madison leads the Great Lakes Bioenergy Research Center (GLBRC). Each center represents a multidisciplinary partnership with expertise spanning the physical and biological sciences, including genomics, microbial and plant biology, analytical chemistry, computational biology and bioinformatics, and engineering.
Institutional partners include DOE national laboratories, universities, private companies, and nonprofit organizations.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Botta, F.; Mairani, A.; Battistoni, G.
Purpose: The calculation of patient-specific dose distributions can be achieved by Monte Carlo simulations or by analytical methods. In this study, the fluka Monte Carlo code has been considered for use in nuclear medicine dosimetry. Up to now, fluka has mainly been dedicated to other fields, namely high energy physics, radiation protection, and hadrontherapy. When first employing a Monte Carlo code for nuclear medicine dosimetry, its results concerning electron transport at energies typical of nuclear medicine applications need to be verified. This is commonly achieved by calculating a representative parameter and comparing it with reference data. The dose point kernel (DPK), quantifying the energy deposition all around a point isotropic source, is often the parameter of choice. Methods: fluka DPKs have been calculated in both water and compact bone for monoenergetic electrons (10^-3 MeV) and for beta-emitting isotopes commonly used for therapy (89Sr, 90Y, 131I, 153Sm, 177Lu, 186Re, and 188Re). Point isotropic sources have been simulated at the center of a water (bone) sphere, and the deposited energy has been tallied in concentric shells. fluka outcomes have been compared to penelope v.2008 results, calculated in this study as well. Moreover, in the case of monoenergetic electrons in water, a comparison with data from the literature (etran, geant4, mcnpx) has been made. Maximum percentage differences within 0.8·R_CSDA and 0.9·R_CSDA for monoenergetic electrons (R_CSDA being the continuous slowing down approximation range) and within 0.8·X_90 and 0.9·X_90 for isotopes (X_90 being the radius of the sphere in which 90% of the emitted energy is absorbed) have been computed, together with the average percentage difference within 0.9·R_CSDA and 0.9·X_90 for electrons and isotopes, respectively. Results: Concerning monoenergetic electrons, within 0.8·R_CSDA (where 90%-97% of the particle energy is deposited), fluka and penelope agree mostly within 7%, except for 10 and 20 keV electrons (12% in water, 8.3% in bone). The discrepancies between fluka and the other codes are of the same order of magnitude as those observed when comparing the other codes among themselves, and can be attributed to the different simulation algorithms. When considering the beta spectra, discrepancies reduce notably: within 0.9·X_90, fluka and penelope differ by less than 1% in water and less than 2% in bone for all of the isotopes considered here. Complete data for the fluka DPKs are given as Supplementary Material as a tool to perform dosimetry by analytical point-kernel convolution. Conclusions: fluka provides reliable results when transporting electrons in the low energy range, proving to be an adequate tool for nuclear medicine dosimetry.
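The comparison metrics in the Methods section are straightforward to compute once the kernels exist. A minimal sketch of the maximum and average percentage differences inside a fraction of R_CSDA, using placeholder kernels rather than fluka or penelope output:

```python
import numpy as np

r = np.linspace(0.01, 1.0, 100)            # radius in units of R_CSDA
fluka = np.exp(-3 * r) * (1 + 0.05 * r)    # placeholder kernels, not real code output
penelope = np.exp(-3 * r)

mask = r <= 0.8                            # compare within 0.8 * R_CSDA
pct_diff = 100 * np.abs(fluka - penelope)[mask] / penelope[mask]
print(f"max {pct_diff.max():.2f}%, mean {pct_diff.mean():.2f}%")
```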
ISCR Annual Report: Fiscal Year 2004
DOE Office of Scientific and Technical Information (OSTI.GOV)
McGraw, J R
2005-03-03
Large-scale scientific computation and all of the disciplines that support and help to validate it have been placed at the focus of Lawrence Livermore National Laboratory (LLNL) by the Advanced Simulation and Computing (ASC) program of the National Nuclear Security Administration (NNSA) and the Scientific Discovery through Advanced Computing (SciDAC) initiative of the Office of Science of the Department of Energy (DOE). The maturation of computational simulation as a tool of scientific and engineering research is underscored in the November 2004 statement of the Secretary of Energy that "high performance computing is the backbone of the nation's science and technology enterprise". LLNL operates several of the world's most powerful computers--including today's single most powerful--and has undertaken some of the largest and most compute-intensive simulations ever performed. Ultrascale simulation has been identified as one of the highest priorities in DOE's facilities planning for the next two decades. However, computers at architectural extremes are notoriously difficult to use efficiently. Furthermore, each successful terascale simulation only points out the need for much better ways of interacting with the resulting avalanche of data. Advances in scientific computing research have, therefore, never been more vital to LLNL's core missions than at present. Computational science is evolving so rapidly along every one of its research fronts that to remain on the leading edge, LLNL must engage researchers at many academic centers of excellence. In Fiscal Year 2004, the Institute for Scientific Computing Research (ISCR) served as one of LLNL's main bridges to the academic community with a program of collaborative subcontracts, visiting faculty, student internships, workshops, and an active seminar series. The ISCR identifies researchers from the academic community for computer science and computational science collaborations with LLNL and hosts them for short- and long-term visits with the aim of encouraging long-term academic research agendas that address LLNL's research priorities. Through such collaborations, ideas and software flow in both directions, and LLNL cultivates its future workforce. The Institute strives to be LLNL's "eyes and ears" in the computer and information sciences, keeping the Laboratory aware of and connected to important external advances. It also attempts to be the "feet and hands" that carry those advances into the Laboratory and incorporate them into practice. ISCR research participants are integrated into LLNL's Computing and Applied Research (CAR) Department, especially into its Center for Applied Scientific Computing (CASC). In turn, these organizations address computational challenges arising throughout the rest of the Laboratory. Administratively, the ISCR flourishes under LLNL's University Relations Program (URP). Together with the other five institutes of the URP, it navigates a course that allows LLNL to benefit from academic exchanges while preserving national security. While it is difficult to operate an academic-like research enterprise within the context of a national security laboratory, the results declare the challenges well met and worth the continued effort.
Molecular dynamics simulation of fast particle irradiation on the single crystal CeO2
NASA Astrophysics Data System (ADS)
Sasajima, Y.; Ajima, N.; Osada, T.; Ishikawa, N.; Iwase, A.
2013-11-01
We used a molecular dynamics method to simulate structural relaxation caused by the high-energy-ion irradiation of single crystal CeO2. As the initial condition, we assumed high thermal energy was supplied to the individual atoms within a cylindrical region of nanometer-order diameter located in the center of the single crystal. The potential proposed by Inaba et al. was utilized to calculate interactions between atoms [H. Inaba, R. Sagawa, H. Hayashi, K. Kawamura, Solid State Ionics 122 (1999) 95-103]. The supplied thermal energy was first spent to change the crystal structure into an amorphous one within a short period of about 0.3 ps, then it was dissipated in the crystal. We compared the obtained results with those of computer simulations for UO2 and found that CeO2 was more stable than UO2 when supplied with high thermal energy.
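The initial condition described, extra kinetic energy given to atoms inside a nanometer-scale cylinder, can be sketched directly. Everything below (reduced units, lattice, parameter values) is an illustrative assumption, not the paper's setup:

```python
import math, random

def spike_velocity_kicks(positions, axis_xy, radius, e_per_atom, mass):
    """Velocity kicks for atoms within `radius` of a z-aligned cylindrical axis."""
    kicks = {}
    for i, (x, y, z) in enumerate(positions):
        if math.hypot(x - axis_xy[0], y - axis_xy[1]) <= radius:
            speed = math.sqrt(2.0 * e_per_atom / mass)
            # isotropic random direction from a normalized Gaussian vector
            gx, gy, gz = (random.gauss(0, 1) for _ in range(3))
            norm = math.sqrt(gx * gx + gy * gy + gz * gz) or 1.0
            kicks[i] = (speed * gx / norm, speed * gy / norm, speed * gz / norm)
    return kicks

# toy lattice and parameters (reduced units, illustrative only)
atoms = [(ix * 0.5, iy * 0.5, iz * 0.5)
         for ix in range(8) for iy in range(8) for iz in range(8)]
kicks = spike_velocity_kicks(atoms, axis_xy=(2.0, 2.0), radius=1.0,
                             e_per_atom=5.0, mass=1.0)
print(f"{len(kicks)} atoms inside the spike cylinder received kinetic energy")
```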
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hoon Lee, Sang; Hong, Tianzhen; Sawaya, Geof
The paper presents a method and process to establish a database of energy efficiency performance (DEEP) to enable quick and accurate assessment of energy retrofits of commercial buildings. DEEP was compiled from the results of about 35 million EnergyPlus simulations. DEEP provides energy savings for screening and evaluation of retrofit measures targeting small and medium-sized office and retail buildings in California. The prototype building models are developed for a comprehensive assessment of building energy performance based on the DOE commercial reference buildings and the California DEER prototype buildings. The prototype buildings represent seven building types across six vintages of construction and 16 California climate zones. DEEP uses these prototypes to evaluate the energy performance of about 100 energy conservation measures covering envelope, lighting, heating, ventilation, air-conditioning, plug loads, and domestic hot water. DEEP contains the energy simulation results for individual retrofit measures as well as packages of measures, to consider interactive effects between multiple measures. The large-scale EnergyPlus simulations are being conducted on the supercomputers at the National Energy Research Scientific Computing Center of Lawrence Berkeley National Laboratory. The pre-simulation database is part of an ongoing project to develop a web-based retrofit toolkit for small and medium-sized commercial buildings in California, which provides real-time energy retrofit feedback by querying DEEP with recommended measures, estimated energy savings, and financial payback period based on users' decision criteria of maximizing energy savings, energy cost savings, carbon reduction, or payback of investment. The pre-simulated database and the associated comprehensive measure analysis enhance the ability to assess retrofits that reduce energy use in small and medium buildings, whose owners typically do not have the resources to conduct a costly building energy audit. DEEP will be migrated into DEnCity (DOE's Energy City), which integrates large-scale energy data into a multi-purpose, open, and dynamic database leveraging diverse sources of existing simulation data.
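Querying such a pre-simulated database is a simple keyed lookup plus payback arithmetic. A minimal sketch; the key structure, field names, and numbers are hypothetical, not DEEP's actual schema:

```python
# hypothetical pre-simulated records keyed by
# (building type, vintage, climate zone, measure)
DEEP = {
    ("small_office", "1980s", "CZ03", "LED_lighting"):
        {"kwh_saved": 12_500, "cost": 8_000},
    ("small_office", "1980s", "CZ03", "economizer_cooling"):
        {"kwh_saved": 9_200, "cost": 5_500},
}

def payback_years(key, rate_usd_per_kwh=0.18):
    """Simple payback: measure cost / annual energy cost savings."""
    rec = DEEP[key]
    return rec["cost"] / (rec["kwh_saved"] * rate_usd_per_kwh)

key = ("small_office", "1980s", "CZ03", "LED_lighting")
print(f"{payback_years(key):.1f} years")
```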
Mathematics and Computer Science | Argonne National Laboratory
Computer Center Harris 1600 Operator’s Guide.
1982-06-01
Computer Center Harris 1600 Operator's Guide, by David V. Sommer and Sharon E. Good, David W. Taylor Naval Ship Research and Development Center, report CMLD-82-15, June 1982. Approved for public release; distribution unlimited.
Stocks, G. Malcolm (Director, Center for Defect Physics in Structural Materials); CDP Staff
2017-12-09
'Center for Defect Physics - Energy Frontier Research Center' was submitted by the Center for Defect Physics (CDP) to the 'Life at the Frontiers of Energy Research' video contest at the 2011 Science for Our Nation's Energy Future: Energy Frontier Research Centers (EFRCs) Summit and Forum. Twenty-six EFRCs created short videos to highlight their mission and their work. CDP is directed by G. Malcolm Stocks at Oak Ridge National Laboratory, and is a partnership of scientists from nine institutions: Oak Ridge National Laboratory (lead); Ames Laboratory; Brown University; University of California, Berkeley; Carnegie Mellon University; University of Illinois, Urbana-Champaign; Lawrence Livermore National Laboratory; Ohio State University; and University of Tennessee. The Office of Basic Energy Sciences in the U.S. Department of Energy's Office of Science established the 46 Energy Frontier Research Centers (EFRCs) in 2009. These collaboratively-organized centers conduct fundamental research focused on 'grand challenges' and use-inspired 'basic research needs' recently identified in major strategic planning efforts by the scientific community. The overall purpose is to accelerate scientific progress toward meeting the nation's critical energy challenges.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stocks, G. Malcolm; Ice, Gene
"Center for Defect Physics - Energy Frontier Research Center" was submitted by the Center for Defect Physics (CDP) to the "Life at the Frontiers of Energy Research" video contest at the 2011 Science for Our Nation's Energy Future: Energy Frontier Research Centers (EFRCs) Summit and Forum. Twenty-six EFRCs created short videos to highlight their mission and their work. CDP is directed by G. Malcolm Stocks at Oak Ridge National Laboratory, and is a partnership of scientists from eight institutions: Oak Ridge National Laboratory (lead); Ames Laboratory; University of California, Berkeley; Carnegie Mellon University; University of Illinois, Urbana-Champaign; Ohio State University;more » University of Georgia and University of Tennessee. The Office of Basic Energy Sciences in the U.S. Department of Energy's Office of Science established the 46 Energy Frontier Research Centers (EFRCs) in 2009. These collaboratively-organized centers conduct fundamental research focused on 'grand challenges' and use-inspired 'basic research needs' recently identified in major strategic planning efforts by the scientific community. The overall purpose is to accelerate scientific progress toward meeting the nation's critical energy challenges.« less
NASA Astrophysics Data System (ADS)
Plewa, Tomasz; Handy, Timothy; Odrzywolek, Andrzej
2014-03-01
We compute and discuss the process of nucleosynthesis in a series of core-collapse explosion models of a 15 solar mass, blue supergiant progenitor. We obtain nucleosynthetic yields and study the evolution of the chemical element distribution from the moment of core bounce until the young supernova remnant phase. Our models show how energy deposition due to radioactive decay modifies the dynamics and the core ejecta structure on small and intermediate scales. The results are compared against observations of young supernova remnants, including Cas A and the recent data obtained for SN 1987A. The work has been supported by the NSF grant AST-1109113 and DOE grant DE-FG52-09NA29548. This research used resources of the National Energy Research Scientific Computing Center, which is supported by the U.S. DOE under Contract No. DE-AC02-05CH11231.
Simmie, John M
2012-05-10
The enthalpies of formation, entropies, specific heats at constant pressure, enthalpy functions, and all carbon-hydrogen and carbon-methyl bond dissociation energies have been computed using high-level methods for the cyclic ethers (oxolanes) tetrahydrofuran, 2-methyltetrahydrofuran, and 2,5-dimethyltetrahydrofuran. Barrier heights for hydrogen-abstraction reactions by hydrogen atoms and the methyl radical are also computed and shown to correlate with reaction energy change. The results show a pleasing consistency and considerably expand the available data for these important compounds. Abstraction by ȮH is accompanied by formation of both pre- and postreaction weakly bound complexes. The resulting radicals formed after abstraction undergo ring-opening reactions leading to readily recognizable intermediates, while competitive H-elimination reactions result in formation of dihydrofurans. Formation enthalpies of all 2,3- and 2,5-dihydrofurans and associated radicals are also reported. It is probable that the compounds at the center of this study will be relatively clean-burning biofuels, although formation of intermediate aldehydes might be problematic.
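For reference, bond dissociation energies of the kind tabulated in such studies follow from the computed formation enthalpies in the usual way; for a C-H bond,

\[ \mathrm{BDE}(\mathrm{R{-}H}) \;=\; \Delta_f H^\circ(\mathrm{R}^\bullet) + \Delta_f H^\circ(\mathrm{H}^\bullet) - \Delta_f H^\circ(\mathrm{RH}), \]

and analogously for the carbon-methyl bonds with the methyl radical in place of the hydrogen atom.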
Free energy and entropy of a dipolar liquid by computer simulations
NASA Astrophysics Data System (ADS)
Palomar, Ricardo; Sesé, Gemma
2018-02-01
Thermodynamic properties for a system composed of dipolar molecules are computed. Free energy is evaluated by means of the thermodynamic integration technique, and it is also estimated by using a perturbation theory approach, in which every molecule is modeled as a hard sphere within a square well, with an electric dipole at its center. The hard sphere diameter, the range and depth of the well, and the dipole moment have been calculated from properties easily obtained in molecular dynamics simulations. Connection between entropy and dynamical properties is explored in the liquid and supercooled states by using instantaneous normal mode calculations. A model is proposed in order to analyze translation and rotation contributions to entropy separately. Both contributions decrease upon cooling, and a logarithmic correlation between excess entropy associated with translation and the corresponding proportion of imaginary frequency modes is encountered. Rosenfeld scaling law between reduced diffusion and excess entropy is tested, and the origin of its failure at low temperatures is investigated.
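The thermodynamic integration route mentioned above evaluates the free-energy difference along a coupling parameter \(\lambda\) connecting a reference system (\(\lambda = 0\)) to the full dipolar model (\(\lambda = 1\)):

\[ \Delta F \;=\; F(1) - F(0) \;=\; \int_0^1 \left\langle \frac{\partial U(\lambda)}{\partial \lambda} \right\rangle_{\lambda} \, d\lambda , \]

where the ensemble average is taken in simulations run at fixed \(\lambda\).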
Computer Modeling of High-Intensity Cs-Sputter Ion Sources
NASA Astrophysics Data System (ADS)
Brown, T. A.; Roberts, M. L.; Southon, J. R.
The grid-point mesh program NEDLab has been used to model the interior of the high-intensity Cs-sputter source used in routine operations at the Center for Accelerator Mass Spectrometry (CAMS), with the goal of improving negative ion output. NEDLab has several features that are important for realistic modeling of such sources. First, space-charge effects are incorporated in the calculations through an automated ion-trajectories/Poisson-electric-fields successive-iteration process. Second, space charge distributions can be averaged over successive iterations to suppress model instabilities. Third, space charge constraints on ion emission from surfaces can be incorporated under Child's-law-based algorithms. Fourth, the energy of ions emitted from a surface can be randomly chosen from within a thermal energy distribution. And finally, ions can be emitted from a surface at randomized angles. The results of our modeling effort indicate that significant modification of the interior geometry of the source would double Cs+ ion production from our spherical ionizer and produce a significant increase in negative ion output from the source.
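The trajectories/Poisson successive-iteration loop with averaged space charge can be sketched as follows. This is a schematic 1D stand-in for what NEDLab does on a grid-point mesh; the charge-response function and all parameters are invented for illustration.

    import numpy as np

    def solve_poisson_1d(rho, dx, phi0=0.0, phiL=0.0):
        """Solve d2(phi)/dx2 = -rho on a uniform grid with Dirichlet ends."""
        n = len(rho)
        A = (np.diag(-2.0 * np.ones(n)) + np.diag(np.ones(n - 1), 1)
             + np.diag(np.ones(n - 1), -1)) / dx**2
        b = -rho.copy()
        b[0] -= phi0 / dx**2
        b[-1] -= phiL / dx**2
        return np.linalg.solve(A, b)

    def charge_from_trajectories(phi):
        """Stand-in for tracing ions through the field: charge collects
        where the potential is low (purely illustrative response)."""
        w = np.exp(-phi)
        return w / w.sum()

    n, dx = 101, 0.01
    rho = np.full(n, 1.0 / n)              # initial space-charge guess
    for it in range(50):
        phi = solve_poisson_1d(rho, dx)
        rho_new = charge_from_trajectories(phi)
        rho = 0.5 * rho + 0.5 * rho_new    # average iterations to damp instabilities
    print("converged potential minimum:", phi.min())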
DOE Office of Scientific and Technical Information (OSTI.GOV)
Li Xiang; Ko, Yeon-Jae; Wang Haopeng
2011-02-07
The copper-nucleoside anions, Cu⁻(cytidine) and Cu⁻(uridine), have been generated in the gas phase and studied by both experimental (anion photoelectron spectroscopy) and theoretical (density functional calculations) methods. The photoelectron spectra of both systems are dominated by single, intense, and relatively narrow peaks, centered at 2.63 and 2.71 eV for Cu⁻(cytidine) and Cu⁻(uridine), respectively. According to our calculations, the Cu⁻(cytidine) and Cu⁻(uridine) species with these peak-center [vertical detachment energy (VDE)] values correspond to structures in which copper atomic anions are bound to the sugar portions of their corresponding nucleosides largely through electrostatic interactions; the observed species are anion-molecule complexes. The combination of experiment and theory also reveals the presence of a slightly higher-energy anion-molecule complex isomer in the case of Cu⁻(cytidine). Furthermore, our calculations found that chemically bonded isomers of these species are much more stable than their anion-molecule complex counterparts, but since their calculated VDE values are larger than the photon energy used in these experiments, they were not observed.
Structural principles for computational and de novo design of 4Fe-4S metalloproteins
Nanda, Vikas; Senn, Stefan; Pike, Douglas H.; Rodriguez-Granillo, Agustina; Hansen, Will; Khare, Sagar D.; Noy, Dror
2017-01-01
Iron-sulfur centers in metalloproteins can access multiple oxidation states over a broad range of potentials, allowing them to participate in a variety of electron transfer reactions and to serve as catalysts for high-energy redox processes. The nitrogenase FeMoco cluster converts dinitrogen to ammonia in an eight-electron transfer step. The 2(Fe4S4)-containing bacterial ferredoxin is an evolutionarily ancient metalloprotein fold and is thought to be a primordial progenitor of extant oxidoreductases. Controlling chemical transformations mediated by iron-sulfur centers, such as nitrogen fixation and hydrogen production, as well as the electron transfer reactions involved in photosynthesis, is of tremendous importance for sustainable chemistry and energy production initiatives. As such, there is significant interest in the design of iron-sulfur proteins as minimal models to gain fundamental understanding of complex natural systems and as lead molecules for industrial and energy applications. Herein, we discuss salient structural characteristics of natural iron-sulfur proteins and how they guide principles for design. Model structures of past designs are analyzed in the context of these principles, potential directions for enhanced designs are presented, and new areas of iron-sulfur protein design are proposed. PMID:26449207
Next Generation Workload Management System For Big Data on Heterogeneous Distributed Computing
Klimentov, A.; Buncic, P.; De, K.; ...
2015-05-22
The Large Hadron Collider (LHC), operating at the international CERN Laboratory in Geneva, Switzerland, is leading Big Data driven scientific explorations. Experiments at the LHC explore the fundamental nature of matter and the basic forces that shape our universe, and were recently credited for the discovery of a Higgs boson. ATLAS and ALICE are the largest collaborations ever assembled in the sciences and are at the forefront of research at the LHC. To address an unprecedented multi-petabyte data processing challenge, both experiments rely on a heterogeneous distributed computational infrastructure. The ATLAS experiment uses the PanDA (Production and Data Analysis) Workload Management System (WMS) for managing the workflow for all data processing on hundreds of data centers. Through PanDA, ATLAS physicists see a single computing facility that enables rapid scientific breakthroughs for the experiment, even though the data centers are physically scattered all over the world. The scale is demonstrated by the following numbers: PanDA manages O(10²) sites, O(10⁵) cores, O(10⁸) jobs per year, and O(10³) users, and the ATLAS data volume is O(10¹⁷) bytes. In 2013 we started an ambitious program to expand PanDA to all available computing resources, including opportunistic use of commercial and academic clouds and Leadership Computing Facilities (LCF). The project, titled 'Next Generation Workload Management and Analysis System for Big Data' (BigPanDA), is funded by DOE ASCR and HEP. Extending PanDA to clouds and LCFs presents new challenges in managing heterogeneity and supporting workflow. The BigPanDA project is underway to set up and tailor PanDA at the Oak Ridge Leadership Computing Facility (OLCF) and at the National Research Center "Kurchatov Institute" together with ALICE distributed computing and ORNL computing professionals. Our approach to integration of HPC platforms at the OLCF and elsewhere is to reuse, as much as possible, existing components of the PanDA system. Finally, we present our current accomplishments with running the PanDA WMS at OLCF and other supercomputers and demonstrate our ability to use PanDA as a portal independent of the computing facilities infrastructure for High Energy and Nuclear Physics as well as other data-intensive science applications.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sasaki, M.; Ueda, T.; Tanioka, M.
A photoinduced "transient thermoelectric effect" (TTE) has been measured for a p-GaAs crystal using a tunable pulsed laser, over the laser energy range 0.93-1.80 eV, laser intensity 0.2-130 mJ/cm², time range 1 ns-1 ms, and temperature range 4.2-50 K, with special attention to native defects of EL2 centers, whose ground state (EL2⁰) and excited state (EL2^ex) are located, respectively, at 0.76 and 1.80 eV above the top of the valence band (their energy difference σ^ex = 1.04 eV). After laser irradiation at one end of the sample, a TTE voltage is induced within a rise time τ_r (1.0-1.5 μs) due to hole diffusion, followed by exponential decay with multiple decay times τ₁-τ₅ that depend on the laser energy, its intensity, and the temperature. The decay time τ₁ is assigned to photoexcited electron diffusion in the conduction band, and τ₂-τ₅ to electron recombination with photogenerated holes in the valence band via EL2 centers in p-GaAs, for which a rough evaluation of the capture cross section is made. Based on the experimental data, we discuss the photoinduced carrier generation/recombination processes in three laser energy ranges bounded by σ^ex and the band-gap energy E_g (= 1.50 eV): regions I (E < σ^ex), II (σ^ex ≤ E < E_g), and III (E ≥ E_g). For these three energy regions, we have carried out computer simulations of the photoinduced TTE voltage profiles by solving one-dimensional transport equations for photogenerated electrons and holes, in qualitative agreement with the observations. © 1997 American Institute of Physics.
Low-energy phonon dispersion in LaFe4Sb12
NASA Astrophysics Data System (ADS)
Leithe-Jasper, Andreas; Boehm, Martin; Mutka, Hannu; Koza, Michael M.
We studied the vibrational dynamics of a single crystal of LaFe4Sb12 by three-axis inelastic neutron spectroscopy. The dispersion of phonons with wave vectors q along the [xx0] and [xxx] directions is identified in the energy range of eigenmodes with high amplitudes of lanthanum vibrations, i.e., at ℏω < 12 meV. Symmetry-avoided anticrossing dispersion of phonons is established in both monitored directions, and distinct eigenstates at high-symmetry points and at the Brillouin-zone center are discriminated. The experimentally derived phonon dispersion and intensities are compared with and backed up by ab initio lattice dynamics calculations; the results of the computed model match the experimental data well.
Federal Register 2010, 2011, 2012, 2013, 2014
2012-06-11
... injection and since CenterPoint can no longer purchase replacement parts for the existing compressor unit... DEPARTMENT OF ENERGY Federal Energy Regulatory Commission [Docket No. CP12-467-000] CenterPoint... May 22, 2012, CenterPoint Energy Gas Transmission Company, LLC (CenterPoint), 1111 Louisiana Street...
SU-G-IeP3-04: Effective Dose Measurements in Fast Kvp Switch Dual Energy Computed Tomography
DOE Office of Scientific and Technical Information (OSTI.GOV)
Raudabaugh, J; Moore, B; Nguyen, G
2016-06-15
Purpose: The objective of this study was two-fold: (a) to test a new approach to approximating organ dose by using the effective energy of the combined 80 kV/140 kV beam in dual-energy (DE) computed tomography (CT), and (b) to derive the effective dose (ED) in the abdomen-pelvis protocol in DECT. Methods: A commercial dual-energy CT scanner was employed using a fast-kV-switch abdomen/pelvis protocol alternating between 80 kV and 140 kV. MOSFET detectors were used for organ dose measurements. First, an experimental validation of the dose equivalency between MOSFET and ion chamber (as a gold standard) was performed using a CTDI phantom. Second, the ED of DECT scans was measured using MOSFET detectors and an anthropomorphic phantom. For ED calculations, ICRP 103 tissue weighting factors were applied to an abdomen/pelvis scan; ED was also computed using the AAPM dose-length product (DLP) method and compared to the MOSFET value. Results: The effective energy was determined as 42.9 keV under the combined beam from half-value layer (HVL) measurement. ED for the dual-energy scan was calculated as 16.49 ± 0.04 mSv by the MOSFET method and 14.62 mSv by the DLP method. Conclusion: Tissue dose in the center of the CTDI body phantom was 1.71 ± 0.01 cGy (ion chamber) and 1.71 ± 0.06 cGy (MOSFET), respectively; this validated the use of the effective energy method for organ dose estimation. The closeness of the MOSFET ED (16.49 ± 0.04 mSv) to the DLP ED (14.62 mSv) suggests that the DLP method provides a reasonable approximation to the ED.
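The DLP method referenced above estimates effective dose from the scanner-reported dose-length product with a body-region conversion coefficient k:

\[ \mathrm{ED} \;\approx\; k \cdot \mathrm{DLP}, \]

where k for an adult abdomen-pelvis scan is commonly tabulated near 0.015 mSv·mGy⁻¹·cm⁻¹ (a typical literature value, not one stated in the abstract). With the coefficient appropriate to this protocol, the authors obtain 14.62 mSv, versus 16.49 ± 0.04 mSv by direct MOSFET measurement.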
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pluharova, Eva; Baer, Marcel D.; Mundy, Christopher J.
2014-07-03
Understanding specific ion effects on proteins remains a considerable challenge. N-methylacetamide serves as a useful proxy for the protein backbone that can be well characterized both experimentally and theoretically. The spectroscopic signatures in the amide I band reflecting the strength of the interaction of alkali cations and alkali earth dications with the carbonyl group remain difficult to assign and controversial to interpret. Herein, we directly compute the IR shifts corresponding to the binding of either sodium or calcium to aqueous N-methylacetamide using ab initio molecular dynamics simulations. We show that the two cations interact with aqueous N-methylacetamide with different affinities and in different geometries. Since sodium exhibits a weak interaction with the carbonyl group, the resulting amide I band is similar to that of an unperturbed carbonyl group undergoing aqueous solvation. In contrast, the stronger calcium binding results in a clear IR shift with respect to N-methylacetamide in pure water. Support from the Czech Ministry of Education (grant LH12001) is gratefully acknowledged. EP thanks the International Max-Planck Research School for support and the Alternative Sponsored Fellowship program at Pacific Northwest National Laboratory (PNNL). PJ acknowledges the Praemium Academie award from the Academy of Sciences. Calculations of the free energy profiles were made possible through a generous allocation of computer time from the North-German Supercomputing Alliance (HLRN). Calculations of vibrational spectra were performed in part using the computational resources of the National Energy Research Supercomputing Center (NERSC) at Lawrence Berkeley National Laboratory. This work was supported by National Science Foundation grant CHE-0431312. CJM is supported by the U.S. Department of Energy's (DOE) Office of Basic Energy Sciences, Division of Chemical Sciences, Geosciences and Biosciences. PNNL is operated for the Department of Energy by Battelle. MDB is grateful for the support of the Linus Pauling Distinguished Postdoctoral Fellowship Program at PNNL.
Local Aqueous Solvation Structure Around Ca²⁺ During Ca²⁺-Cl⁻ Pair Formation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Baer, Marcel D.; Mundy, Christopher J.
2016-03-03
The molecular details of single-ion solvation around Ca²⁺ and of Ca²⁺-Cl⁻ ion pairing are investigated using ab initio molecular dynamics. The use of empirical dispersion corrections to the BLYP functional is assessed by comparison to experimentally available extended X-ray absorption fine structure (EXAFS) measurements, which probe the first solvation shell in great detail. Besides differences between the quantum and classical descriptions of interaction in both the ion-pairing free energy and the ion-solvation coordination number, important differences were found between dispersion-corrected and uncorrected density functional theory (DFT). Specifically, we show significantly different free-energy landscapes for both the coordination number of Ca²⁺ and its ion pairing with Cl⁻ depending on the DFT simulation protocol. Our findings produce a self-consistent treatment of the short-range solvent response to the ion and the intermediate- to long-range collective electrostatic response of the ion-ion interaction, yielding a detailed picture of ion pairing that is consistent with experiment. MDB is supported by the MS3 (Materials Synthesis and Simulation Across Scales) Initiative at Pacific Northwest National Laboratory. This work was conducted under the Laboratory Directed Research and Development Program at PNNL, a multiprogram national laboratory operated by Battelle for the U.S. Department of Energy. CJM acknowledges support from the US Department of Energy, Office of Science, Office of Basic Energy Sciences, Division of Chemical Sciences, Geosciences & Biosciences. This research used resources of the National Energy Research Scientific Computing Center, a DOE Office of Science User Facility supported by the Office of Science of the U.S. Department of Energy under Contract No. DE-AC02-05CH11231. Additional computing resources were generously allocated by PNNL's Institutional Computing program. The authors thank Prof. Tom Beck for discussions regarding QCT, and Drs. Greg Schenter and Shawn Kathmann for insightful comments.
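The free-energy landscapes compared here are obtained in the standard way from the sampled probability distribution of the order parameter (coordination number n, or the ion-ion distance):

\[ F(n) \;=\; -k_B T \ln P(n), \]

so any protocol dependence in P(n) between dispersion-corrected and uncorrected DFT shows up directly as a shift in the landscape.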
Space technology test facilities at the NASA Ames Research Center
NASA Technical Reports Server (NTRS)
Gross, Anthony R.; Rodrigues, Annette T.
1990-01-01
The major space research and technology test facilities at the NASA Ames Research Center are divided into five categories: General Purpose, Life Support, Computer-Based Simulation, High Energy, and Space Exploration Test Facilities. The paper describes selected facilities within each of the five categories and some of the major programs in which these facilities have been involved. Special attention is given to the 20-G Man-Rated Centrifuge, the Human Research Facility, the Plant Crop Growth Facility, the Numerical Aerodynamic Simulation Facility, the Arc-Jet Complex and Hypersonic Test Facility, the Infrared Detector and Cryogenic Test Facility, and the Mars Wind Tunnel. Each facility is described along with its objectives, test parameter ranges, and major current programs and applications.
77 FR 34941 - Privacy Act of 1974; Notice of a Computer Matching Program
Federal Register 2010, 2011, 2012, 2013, 2014
2012-06-12
...; Notice of a Computer Matching Program AGENCY: Defense Manpower Data Center, DoD. ACTION: Notice of a... computer matching program are the Department of Veterans Affairs (VA) and the Defense Manpower Data Center... identified as DMDC 01, entitled ``Defense Manpower Data Center Data Base,'' last published in the Federal...
Federal Register 2010, 2011, 2012, 2013, 2014
2012-06-13
... the Defense Manpower Data Center, Department of Defense AGENCY: Postal Service TM . ACTION: Notice of Computer Matching Program--United States Postal Service and the Defense Manpower Data Center, Department of... as the recipient agency in a computer matching program with the Defense Manpower Data Center (DMDC...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Metzger, I.; Van Geet, O.
This report summarizes the results from the data center energy efficiency and renewable energy site assessment conducted for the Oregon Army National Guard in Salem, Oregon. A team led by NREL conducted the assessment of the Anderson Readiness Center data centers March 18-20, 2014 as part of ongoing efforts to reduce energy use and incorporate renewable energy technologies where feasible. Although the data centers in this facility account for less than 5% of the total square footage, they are estimated to be responsible for 70% of the annual electricity consumption.
Cluster-collision frequency. I. The long-range intercluster potential
DOE Office of Scientific and Technical Information (OSTI.GOV)
Amadon, A.S.; Marlow, W.H.
1991-05-15
In recent years, gas-borne atomic and molecular clusters have emerged as subjects of basic physical and chemical interest and are gaining recognition for their importance in numerous applications. To calculate the evolution of the mass distribution of these clusters, their thermal collision rates are required. Computing these collision rates requires the long-range interaction energy between clusters, which is the subject of this paper. Utilizing a formulation of the iterated van der Waals interaction over discrete molecules that can be shown to converge, with increasing numbers of atoms, to the Lifshitz-van der Waals interaction for condensed matter, we calculate the interaction energy as a function of center-of-mass separation for identical pairs of clusters of 13, 33, and 55 molecules of carbon tetrachloride in icosahedral and dodecahedral configurations. Two different relative orientations are chosen for each pair of clusters, and the energies are compared with energies calculated from the standard formula for continuum matter derived by summing over pair interactions, with the Hamaker constant calculated according to Lifshitz theory. These calculations give long-range interaction energies that assume typical adhesion-type values at cluster contact, unlike the unbounded results of the Lifshitz-Hamaker model. The relative difference between the discrete molecular energies and the continuum energies vanishes for r* ≈ 2, where r* is the center-of-mass separation distance in units of cluster diameter. For larger separations, the relative difference changes sign, reaching a value of approximately 15%, with the difference diminishing for increasing-sized clusters.
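For comparison, the continuum (Hamaker) energy for two equal spheres of radius a at center-to-center separation r, against which the discrete-molecule sums are benchmarked, takes the classical form

\[ E(r) \;=\; -\frac{A}{6}\left[\frac{2a^2}{r^2-4a^2}+\frac{2a^2}{r^2}+\ln\!\left(\frac{r^2-4a^2}{r^2}\right)\right], \]

with the Hamaker constant A computed from Lifshitz theory; the divergence of this expression as r approaches 2a (contact) is the unbounded behavior that the discrete calculation avoids.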
NREL's Impact Grows Through the Clean Energy Solutions Center and the New Clean Energy Design Studio - Continuum Magazine | NREL
The Clean Energy Solutions Center (Solutions Center) helps governments design and adopt policies and programs that support the deployment of clean energy technologies.
Exact N³LO results for qq′ → H + X
Anzai, Chihaya; Hasselhuhn, Alexander; Höschele, Maik; ...
2015-07-27
We compute the contribution to the total cross section for the inclusive production of a Standard Model Higgs boson induced by two quarks with different flavour in the initial state. Our calculation is exact in the Higgs boson mass and the partonic center-of-mass energy. Here, we describe the reduction to master integrals, the construction of a canonical basis, and the solution of the corresponding differential equations. Our analytic result contains both Harmonic Polylogarithms and iterated integrals with additional letters in the alphabet.
Strongly Cavity-Enhanced Spontaneous Emission from Silicon-Vacancy Centers in Diamond
Zhang, Jingyuan Linda; Sun, Shuo; Burek, Michael J.; ...
2018-01-29
Quantum emitters are an integral component for a broad range of quantum technologies, including quantum communication, quantum repeaters, and linear optical quantum computation. Solid-state color centers are promising candidates for scalable quantum optics due to their long coherence time and small inhomogeneous broadening. However, once excited, color centers often decay through phonon-assisted processes, limiting the efficiency of single-photon generation and photon-mediated entanglement generation. Herein, we demonstrate strong enhancement of the spontaneous emission rate of a single silicon-vacancy center in diamond embedded within a monolithic optical cavity, reaching a regime in which the excited-state lifetime is dominated by spontaneous emission into the cavity mode. We observe a 10-fold lifetime reduction and a 42-fold enhancement in emission intensity when the cavity is tuned into resonance with the optical transition of a single silicon-vacancy center, corresponding to 90% of the excited-state energy decay occurring through spontaneous emission into the cavity mode. We also demonstrate the largest coupling strength (g/2π = 4.9 ± 0.3 GHz) and cooperativity (C = 1.4) to date for color-center-based cavity quantum electrodynamics systems, bringing the system closer to the strong coupling regime.
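The quoted 90% figure follows directly from the lifetime data: if cavity coupling shortens the excited-state lifetime tenfold, the fraction of decay proceeding through the cavity mode is

\[ \beta \;=\; \frac{\Gamma-\Gamma_0}{\Gamma} \;=\; 1-\frac{\tau_{\mathrm{cav}}}{\tau_{\mathrm{free}}} \;=\; 1-\frac{1}{10} \;=\; 0.9 , \]

where Γ₀ (lifetime τ_free) is the decay rate with the cavity detuned and Γ (lifetime τ_cav) the rate on resonance.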
Overview of Particle and Heavy Ion Transport Code System PHITS
NASA Astrophysics Data System (ADS)
Sato, Tatsuhiko; Niita, Koji; Matsuda, Norihiro; Hashimoto, Shintaro; Iwamoto, Yosuke; Furuta, Takuya; Noda, Shusaku; Ogawa, Tatsuhiko; Iwase, Hiroshi; Nakashima, Hiroshi; Fukahori, Tokio; Okumura, Keisuke; Kai, Tetsuya; Chiba, Satoshi; Sihver, Lembit
2014-06-01
A general-purpose Monte Carlo Particle and Heavy Ion Transport code System, PHITS, is being developed through the collaboration of several institutes in Japan and Europe. The Japan Atomic Energy Agency is responsible for managing the entire project. PHITS can deal with the transport of nearly all particles, including neutrons, protons, heavy ions, photons, and electrons, over wide energy ranges using various nuclear reaction models and data libraries. It is written in Fortran and can be executed on almost all computers. All components of PHITS, such as its source, executable, and data-library files, are assembled in one package and then distributed to many countries via the Research Organization for Information Science and Technology, the Data Bank of the Organization for Economic Co-operation and Development's Nuclear Energy Agency, and the Radiation Safety Information Computational Center. More than 1,000 researchers have been registered as PHITS users, and they apply the code to various research and development fields such as nuclear technology, accelerator design, medical physics, and cosmic-ray research. This paper briefly summarizes the physics models implemented in PHITS and introduces some important functions useful for specific applications, such as an event generator mode and beam transport functions.
Hanford meteorological station computer codes: Volume 9, The quality assurance computer codes
DOE Office of Scientific and Technical Information (OSTI.GOV)
Burk, K.W.; Andrews, G.L.
1989-02-01
The Hanford Meteorological Station (HMS) was established in 1944 on the Hanford Site to collect and archive meteorological data and provide weather forecasts and related services for Hanford Site operations. The HMS is located approximately 1/2 mile east of the 200 West Area and is operated by PNL for the US Department of Energy. Meteorological data are collected from various sensors and equipment located on and off the Hanford Site. These data are stored in databases on the Digital Equipment Corporation (DEC) VAX 11/750 at the HMS (hereafter referred to as the HMS computer). Files from those databases are routinely transferred to the Emergency Management System (EMS) computer at the Unified Dose Assessment Center (UDAC). To ensure the quality and integrity of the HMS data, a set of Quality Assurance (QA) computer codes has been written. The codes will be routinely used by the HMS system manager or the database custodian. The QA codes provide detailed output files that will be used in correcting erroneous data. The following sections in this volume describe the implementation and operation of the QA computer codes. The appendices contain detailed descriptions, flow charts, and source code listings of each computer code. 2 refs.
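A QA code of this kind flags records whose values are missing or fall outside physically plausible bounds before the data move onward. A minimal sketch, with field names and limits invented for illustration (the actual HMS QA checks are not specified in the abstract):

    # Hypothetical plausibility limits for hourly surface observations.
    LIMITS = {
        "temperature_c": (-40.0, 50.0),
        "wind_speed_mps": (0.0, 60.0),
        "relative_humidity_pct": (0.0, 100.0),
    }

    def qa_check(record):
        """Return a list of (field, value, reason) tuples for one data record."""
        problems = []
        for field, (lo, hi) in LIMITS.items():
            value = record.get(field)
            if value is None:
                problems.append((field, value, "missing"))
            elif not lo <= value <= hi:
                problems.append((field, value, f"outside [{lo}, {hi}]"))
        return problems

    obs = {"temperature_c": 61.2, "wind_speed_mps": 4.0}
    for field, value, reason in qa_check(obs):
        print(f"QA flag: {field}={value} ({reason})")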
Clean Energy Solutions Center Services (Vietnamese Translation) (Fact Sheet)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
2014-11-01
This is the Vietnamese language translation of the Clean Energy Solutions Center (Solutions Center) fact sheet. The Solutions Center helps governments, advisors and analysts create policies and programs that advance the deployment of clean energy technologies. The Solutions Center partners with international organizations to provide online training, expert assistance, and technical resources on clean energy policy.
Clean Energy Solutions Center Services (Chinese Translation) (Fact Sheet)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
2014-04-01
This is the Chinese language translation of the Clean Energy Solutions Center (Solutions Center) fact sheet. The Solutions Center helps governments, advisors and analysts create policies and programs that advance the deployment of clean energy technologies. The Solutions Center partners with international organizations to provide online training, expert assistance, and technical resources on clean energy policy.
Electron Impact Ionization: A New Parameterization for 100 eV to 1 MeV Electrons
NASA Technical Reports Server (NTRS)
Fang, Xiaohua; Randall, Cora E.; Lummerzheim, Dirk; Solomon, Stanley C.; Mills, Michael J.; Marsh, Daniel; Jackman, Charles H.; Wang, Wenbin; Lu, Gang
2008-01-01
Low-, medium-, and high-energy electrons can penetrate to the thermosphere (90-400 km; 55-240 miles) and mesosphere (50-90 km; 30-55 miles). These precipitating electrons ionize that region of the atmosphere, creating positively charged atoms and molecules and knocking off other negatively charged electrons. The precipitating electrons also create nitrogen-containing compounds along with other constituents. Since electron precipitation amounts change within minutes, a rapid method of computing the ionization and the production of nitrogen-containing compounds is needed for inclusion in computationally demanding global models. A new methodology has been developed that parameterizes a more detailed model computation of the ionizing impact of precipitating electrons over the very large range of 100 eV up to 1,000,000 eV. The new parameterization is more accurate than a previous parameterization scheme when compared with the more detailed model computation. Global models at the National Center for Atmospheric Research will use this new parameterization method in the near future.
Distributed computing testbed for a remote experimental environment
DOE Office of Scientific and Technical Information (OSTI.GOV)
Butner, D.N.; Casper, T.A.; Howard, B.C.
1995-09-18
Collaboration is increasing as physics research becomes concentrated on a few large, expensive facilities, particularly in magnetic fusion energy research, with national and international participation. These facilities are designed for steady-state operation and interactive, real-time experimentation. We are developing tools to provide for the establishment of geographically distant centers for interactive operations; such centers would allow scientists to participate in experiments from their home institutions. A testbed is being developed for a Remote Experimental Environment (REE), a "Collaboratory." The testbed will be used to evaluate the ability of a remotely located group of scientists to conduct research on the DIII-D Tokamak at General Atomics. The REE will serve as a testing environment for advanced control and collaboration concepts applicable to future experiments. Process-to-process communications over high-speed wide area networks provide real-time synchronization and exchange of data among multiple computer networks, while the ability to conduct research is enhanced by adding audio/video communication capabilities. The Open Software Foundation's Distributed Computing Environment is being used to test concepts in distributed control, security, naming, remote procedure calls, and distributed file access using the Distributed File Services. We are exploring the technology and sociology of remote participation in the operation of a large-scale experimental facility.
Data Center Consolidation: A Step towards Infrastructure Clouds
NASA Astrophysics Data System (ADS)
Winter, Markus
Application service providers face enormous challenges and rising costs in managing and operating a growing number of heterogeneous system and computing landscapes. Limitations of traditional computing environments force IT decision-makers to reorganize computing resources within the data center, as continuous growth leads to an inefficient utilization of the underlying hardware infrastructure. This paper discusses a way for infrastructure providers to improve data center operations based on the findings of a case study on resource utilization of very large business applications and presents an outlook beyond server consolidation endeavors, transforming corporate data centers into compute clouds.
Center for Extended Magnetohydrodynamic Modeling Cooperative Agreement
DOE Office of Scientific and Technical Information (OSTI.GOV)
Carl R. Sovinec
The Center for Extended Magnetohydrodynamic Modeling (CEMM) is developing computer simulation models for predicting the behavior of magnetically confined plasmas. Over the first phase of support from the Department of Energy's Scientific Discovery through Advanced Computing (SciDAC) initiative, the focus has been on macroscopic dynamics that alter the confinement properties of magnetic field configurations. The ultimate objective is to provide computational capabilities to predict plasma behavior—not unlike computational weather prediction—to optimize performance and to increase the reliability of magnetic confinement for fusion energy. Numerical modeling aids theoretical research by solving complicated mathematical models of plasma behavior, including strong nonlinear effects and the influence of the geometrical shaping of actual experiments. The numerical modeling itself remains an area of active research, due to challenges associated with simulating multiple temporal and spatial scales. The research summarized in this report spans computational and physical topics associated with state-of-the-art simulation of magnetized plasmas. The tasks performed for this grant are categorized according to whether they are primarily computational, algorithmic, or application-oriented in nature. All involve the development and use of the Non-Ideal Magnetohydrodynamics with Rotation, Open Discussion (NIMROD) code, which is described at http://nimrodteam.org. With respect to computation, we have tested and refined methods for solving the large algebraic systems of equations that result from our numerical approximations of the physical model. Collaboration with the Terascale Optimal PDE Solvers (TOPS) SciDAC center led us to the SuperLU_DIST software library [http://crd.lbl.gov/~xiaoye/SuperLU/] for solving large sparse matrices using direct methods on parallel computers. Switching to this solver library boosted NIMROD's performance by a factor of five in typical large nonlinear simulations, which has been publicized as a success story of SciDAC-fostered collaboration. Furthermore, the SuperLU software does not assume any mathematical symmetry, and its generality provides an important capability for extending the physical model beyond magnetohydrodynamics (MHD). With respect to algorithmic and model development, our most significant accomplishment is a new method for solving plasma models that treat electrons as an independent plasma component. These 'two-fluid' models encompass MHD and add temporal and spatial scales that are beyond the response of the ion species. Implementation and testing of a previously published algorithm did not prove successful for NIMROD, and the new algorithm has since been devised, analyzed, and implemented. Two-fluid modeling, an important objective of the original NIMROD project, is now routine in 2D applications. Algorithmic components for 3D modeling are in place and tested, though further computational work is still needed for efficiency. Other algorithmic work extends the ion-fluid stress tensor to include models for parallel and gyroviscous stresses. In addition, our hot-particle simulation capability received important refinements that permitted completion of a benchmark with the M3D code. A highlight of our applications work is the edge-localized mode (ELM) modeling, which was part of the first-ever computational Performance Target for the DOE Office of Fusion Energy Sciences; see http://www.science.doe.gov/ofes/performancetargets.shtml.
Our efforts allowed MHD simulations to progress late into the nonlinear stage, where energy is conducted to the wall location. They also produced a two-fluid ELM simulation starting from experimental information and demonstrating critical drift effects that are characteristic of two-fluid physics. Another important application is the internal kink mode in a tokamak. Here, the primary purpose of the study has been to benchmark the two main code development lines of CEMM, NIMROD and M3D, on a relevant nonlinear problem. Results from the two codes show repeating nonlinear relaxation events driven by the kink mode over quantitatively comparable timescales. The work has launched a more comprehensive nonlinear benchmarking exercise, where realistic transport effects have an important role.
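As an illustration of the kind of sparse direct solve delegated to SuperLU_DIST (shown here with SciPy's serial SuperLU wrapper, not NIMROD's actual interface), note that the factorization is computed once and the factors are reused across right-hand sides, which is where the payoff in repeated nonlinear iterations comes from:

    import numpy as np
    from scipy.sparse import diags, csc_matrix
    from scipy.sparse.linalg import splu

    # A nonsymmetric sparse test matrix; generality matters here, since
    # SuperLU does not require symmetry (useful beyond MHD).
    n = 1000
    A = csc_matrix(diags([-1.0, 2.5, -1.2], [-1, 0, 1], shape=(n, n)))
    lu = splu(A)                      # LU factorization, done once

    for step in range(3):             # reuse the factors for many solves
        b = np.random.default_rng(step).normal(size=n)
        x = lu.solve(b)
        print(f"step {step}: residual {np.linalg.norm(A @ x - b):.2e}")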
Engage States on Energy Assurance and Energy Security
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kara Colton; John Ratliff; Sue Gander
2008-09-30
The NGA Center's 'Engaging States on Energy Security and Energy Assurance' project has been successful in achieving the stated purposes and objectives, both in the initial proposal and in subsequent revisions to it. Our activities, which involved the NGA Center for Best Practices (the NGA Center) Homeland Security and Technology Division, included conducting tabletop exercises to help federal and state homeland security and energy officials determine roles and actions for various emergency scenarios. This included efforts to educate state officials on developing an energy assurance plan, harmonizing approaches to controlling price volatility, implementing reliability standards, understanding short- and long-term energy outlooks and fuel diversification, and capitalizing on DOE's research and development activities. Regarding our work on energy efficiency and renewable energy, the NGA Center's Environment, Energy and Natural Resources Division hosted three workshops that engaged states on clean energy and alternative transportation fuels, and produced several reports on related topics. In addition, we convened 18 meetings, via conference call, of the Energy Working Group. The NGA Center also hosted a number of workshops and web conferences designed to directly engage states on the deliverables under this Cooperative Agreement. Finally, through its Front and Center newsletter articles and other written products, the NGA Center disseminated promising practices to a wide audience of state policymakers.
Computer Maintenance Operations Center (CMOC), additional computer support equipment ...
Computer Maintenance Operations Center (CMOC), additional computer support equipment - Beale Air Force Base, Perimeter Acquisition Vehicle Entry Phased-Array Warning System, Technical Equipment Building, End of Spencer Paul Road, north of Warren Shingle Road (14th Street), Marysville, Yuba County, CA
Nuclear energy center site survey reactor plant considerations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Harty, H.
The Energy Reorganization Act of 1974 required the Nuclear Regulatory Commission (NRC) to conduct a nuclear energy center site survey (NECSS). Background information for the NECSS report was developed in a series of tasks, which include: socioeconomic impacts; environmental impact (reactor facilities); emergency response capability (reactor facilities); aging of nuclear energy centers; and dry-cooled nuclear energy centers.
Capacity and reliability analyses with applications to power quality
NASA Astrophysics Data System (ADS)
Azam, Mohammad; Tu, Fang; Shlapak, Yuri; Kirubarajan, Thiagalingam; Pattipati, Krishna R.; Karanam, Rajaiah
2001-07-01
The deregulation of energy markets, the ongoing advances in communication networks, the proliferation of intelligent metering and protective power devices, and the standardization of software/hardware interfaces are creating a dramatic shift in the way facilities acquire and utilize information about their power usage. Currently available power management systems gather a vast amount of information in the form of power usage, voltages, currents, and their time-dependent waveforms from a variety of devices (for example, circuit breakers, transformers, energy and power quality meters, protective relays, programmable logic controllers, and motor control centers). What is lacking is an information processing and decision support infrastructure to harness this voluminous information into usable operational and management knowledge: to manage the health of equipment and power quality, minimize downtime and outages, and optimize operations to improve productivity. This paper considers the capacity and reliability analysis of power systems with very high availability requirements (e.g., systems providing energy to data centers and communication networks with desired availability of up to 0.9999999). Real-time capacity and margin analysis helps operators plan for additional loads and schedule repair/replacement activities. The reliability analysis, based on a computationally efficient sum of disjoint products, enables analysts to decide the optimum levels of redundancy, aids operators in prioritizing maintenance options for a given budget, and supports monitoring the system for capacity margin. The resulting analytical and software tool is demonstrated on a sample data center.
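The arithmetic behind a "seven nines" target is unforgiving, as a quick redundancy calculation shows (the per-path availability here is invented for illustration):

    def parallel_availability(avails):
        """Availability of redundant paths: the system fails only if
        every path fails simultaneously (independence assumed)."""
        p_fail = 1.0
        for a in avails:
            p_fail *= (1.0 - a)
        return 1.0 - p_fail

    # One feed at 99.9% gives three nines; redundant feeds compound:
    for n in range(1, 4):
        a = parallel_availability([0.999] * n)
        print(f"{n} feed(s): availability = {a:.9f}")
    # 1 feed:  0.999000000
    # 2 feeds: 0.999999000
    # 3 feeds: 0.999999999  -- past the 0.9999999 target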
Integrated Computational Materials Engineering for Magnesium in Automotive Body Applications
NASA Astrophysics Data System (ADS)
Allison, John E.; Liu, Baicheng; Boyle, Kevin P.; Hector, Lou; McCune, Robert
This paper provides an overview and progress report for an international collaborative project which aims to develop an ICME infrastructure for magnesium for use in automotive body applications. Quantitative processing-microstructure-property relationships are being developed for extruded Mg alloys, sheet-formed Mg alloys, and high-pressure die-cast Mg alloys. These relationships are captured in computational models which are then linked with manufacturing process simulation and used to provide constitutive models for component performance analysis. The long-term goal is to capture this information in efficient computational models and in a web-centered knowledge base. The work is being conducted at leading universities, national labs, and industrial research facilities in the US, China, and Canada. This project is sponsored by the U.S. Department of Energy, the U.S. Automotive Materials Partnership (USAMP), the Chinese Ministry of Science and Technology (MOST), and Natural Resources Canada (NRCan).
DOE Office of Scientific and Technical Information (OSTI.GOV)
Warpinski, N.R.
A computer code for calculating hydraulic fracture height and width in a stressed-layer medium has been modified for easy use on a personal computer. HSTRESS allows for up to 51 layers having different thicknesses, stresses, and fracture toughnesses. The code can calculate fracture height versus pressure or pressure versus fracture height, depending on the design model in which the data will be used. At any pressure/height, a width profile is calculated and an equivalent width factor and flow resistance factor are determined. This program is written in FORTRAN. Graphics use PLOT88 software by Plotworks, Inc., but the graphics software must be obtained by the user because of licensing restrictions. A version without graphics can also be run. This code is available through the National Energy Software Center (NESC), operated by Argonne National Laboratory. 14 refs., 21 figs.
ERIC Educational Resources Information Center
Lin, Che-Li; Liang, Jyh-Chong; Su, Yi-Ching; Tsai, Chin-Chung
2013-01-01
Teacher-centered instruction has been widely adopted in college computer science classrooms and has some benefits in training computer science undergraduates. Meanwhile, student-centered contexts have been advocated to promote computer science education. How computer science learners respond to or prefer the two types of teacher authority,…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gerber, Richard; Allcock, William; Beggio, Chris
2014-10-17
U.S. Department of Energy (DOE) High Performance Computing (HPC) facilities are on the verge of a paradigm shift in the way they deliver systems and services to science and engineering teams. Research projects are producing a wide variety of data at unprecedented scale and level of complexity, with community-specific services that are part of the data collection and analysis workflow. On June 18-19, 2014, representatives from six DOE HPC centers met in Oakland, CA at the DOE High Performance Operational Review (HPCOR) to discuss how they can best provide facilities and services to enable large-scale data-driven scientific discovery at the DOE national laboratories. The report contains findings from that review.
Shaik, Saboor; Talanki, Ashok Babu Puttranga Setty
2016-05-01
Building roofs are responsible for a large share of heat gain in buildings. In the present work, an analysis of the influence of insulation location inside a flat roof exposed directly to the sun's radiation was performed to reduce heat gain in buildings. The unsteady thermal response parameters of the building roof, such as admittance, transmittance, decrement factor, and time lag, have been investigated by solving a one-dimensional diffusion equation under convective periodic boundary conditions. Theoretical results for four types of walls were compared with the experimental results available in the literature. The results reveal that the roof with insulation placed at the outer side and at the center plane of the roof is the most energy efficient from the lower-decrement-factor point of view, and the roof with insulation placed at the center plane and the inner side of the roof is the best from the highest-time-lag point of view among the seven studied configurations. The composite roof with expanded polystyrene insulation located at the outer side and at the center plane of the roof is found to be the best roof from the lowest decrement factor (0.130) point of view, and the composite roof with resin-bonded mineral wool insulation located at the center plane and at the inner side of the roof is found to be the most energy efficient from the highest time lag (9.33 h) point of view among the seven configurations with five different insulation materials studied. The optimum fabric energy storage thicknesses of reinforced cement concrete, expanded polystyrene, foam glass, rock wool, rice husk, resin-bonded mineral wool, and cement plaster were computed. From the results, it is concluded that rock wool has the least optimum fabric energy storage thickness (0.114 m) among the seven studied building roof materials.
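One common definition of the response parameters compared above, in terms of the periodic inner- and outer-surface temperatures:

\[ f \;=\; \frac{T_{si,\max}-T_{si,\min}}{T_{so,\max}-T_{so,\min}}, \qquad \phi \;=\; t\left(T_{si,\max}\right) - t\left(T_{so,\max}\right), \]

so the best roof in this study attenuates the outer-surface temperature swing to 13% (f = 0.130) and delays its peak by 9.33 h.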
Umari, P; Fabris, S
2012-05-07
The quasi-particle energy levels of the Zn-Phthalocyanine (ZnPc) molecule calculated with the GW approximation are shown to depend sensitively on the explicit description of the metal-center semicore states. We find that the calculated GW energy levels are in good agreement with the measured experimental photoemission spectra only when explicitly including the Zn 3s and 3p semicore states in the valence. The main origin of this effect is traced back to the exchange term in the self-energy GW approximation. Based on this finding, we propose a simplified approach for correcting GW calculations of metal phthalocyanine molecules that avoids the time-consuming explicit treatment of the metal semicore states. Our method allows for speeding up the calculations without compromising the accuracy of the computed spectra.
The Practical Obstacles of Data Transfer: Why researchers still love scp
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nam, Hai Ah; Hill, Jason J; Parete-Koon, Suzanne T
The importance of computing facilities is heralded every six months with the announcement of the new Top500 list, showcasing the world's fastest supercomputers. Unfortunately, with great computing capability does not come great long-term data storage capacity, which often means users must move their data to their local site archive, to remote sites where they may be doing future computation or analysis, or back to their home institution, or else face the dreaded data purge that most HPC centers employ to keep utilization of large parallel filesystems low enough to manage performance and capacity. At HPC centers, data transfer is crucial to the scientific workflow and will increase in importance as computing systems grow in size. The Energy Sciences Network (ESnet) recently launched its fifth-generation network, a 100 Gbps high-performance, unclassified national network connecting more than 40 DOE research sites to support scientific research and collaboration. Despite the tenfold increase in bandwidth to DOE research sites, amenable to multiple data transfer streams and high throughput, in practice researchers often under-utilize the network and resort to painfully slow single-stream transfer methods such as scp to avoid the complexity of multiple-stream tools such as GridFTP and bbcp, and contend with frustration from the lack of consistency of available tools between sites. In this study we survey and assess the data transfer methods provided at several DOE-supported computing facilities, including both leadership-computing facilities, connected through ESnet. We present observed transfer rates, suggested optimizations, and discuss the obstacles the tools must overcome to receive widespread adoption over scp.
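The arithmetic behind the single-stream versus multi-stream gap is simple, and a back-of-envelope model makes the abstract's point concrete. In the sketch below, the per-stream cap stands in for the TCP window and latency limit that keeps a tool like scp far below link capacity; all rates and sizes are hypothetical.

```python
# Back-of-envelope model of single-stream vs. multi-stream transfer time on a
# 100 Gbps path; per-stream throughput and dataset size are hypothetical.
def transfer_hours(size_tb, streams, per_stream_gbps, link_gbps=100.0):
    throughput = min(streams * per_stream_gbps, link_gbps)  # aggregate rate (Gbps)
    return (size_tb * 8e3) / (throughput * 3600.0)          # TB -> Gb, then hours

for n in (1, 4, 16):
    print(f"{n:2d} stream(s): {transfer_hours(50, n, 1.2):6.1f} h for 50 TB")
```

With these assumed numbers, a 50 TB archive move drops from roughly four days over one stream to well under a day over sixteen, which is the practical case for tools like GridFTP and bbcp.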
The Russian effort in establishing large atomic and molecular databases
NASA Astrophysics Data System (ADS)
Presnyakov, Leonid P.
1998-07-01
The database activities in Russia have been developed in connection with UV and soft X-ray spectroscopic studies of extraterrestrial and laboratory (magnetically confined and laser-produced) plasmas. Two forms of database production are used: i) sets of computer programs to calculate radiative and collisional data for a general atom or ion, and ii) numeric database systems with the data stored in the computer. The first form is preferable for collisional data. At the Lebedev Physical Institute, an appropriate set of codes has been developed. It includes all electronic processes at collision energies from threshold up to the relativistic limit. The ion-atom (and ion-ion) collisional data are calculated with recently developed methods. The program for calculating level populations and line intensities is used for spectral diagnostics of transparent plasmas. The second form of database production is widely used at the Institute of Physico-Technical Measurements (VNIIFTRI) and the Troitsk Center: the Institute of Spectroscopy and TRINITI. The main results obtained at these centers are reviewed. Plans for future developments jointly with international collaborations are discussed.
Accelerating Science with the NERSC Burst Buffer Early User Program
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bhimji, Wahid; Bard, Debbie; Romanus, Melissa
NVRAM-based Burst Buffers are an important part of the emerging HPC storage landscape. The National Energy Research Scientific Computing Center (NERSC) at Lawrence Berkeley National Laboratory recently installed one of the first Burst Buffer systems as part of its new Cori supercomputer, collaborating with Cray on the development of the DataWarp software. NERSC has a diverse user base comprised of over 6500 users in 700 different projects spanning a wide variety of scientific computing applications. The use-cases of the Burst Buffer at NERSC are therefore also considerable and diverse. We describe here performance measurements and lessons learned from the Burst Buffer Early User Program at NERSC, which selected a number of research projects to gain early access to the Burst Buffer and exercise its capability to enable new scientific advancements. To the best of our knowledge this is the first time a Burst Buffer has been stressed at scale by diverse, real user workloads, and therefore these lessons will be of considerable benefit to shaping the developing use of Burst Buffers at HPC centers.
A One Dimensional, Time Dependent Inlet/Engine Numerical Simulation for Aircraft Propulsion Systems
NASA Technical Reports Server (NTRS)
Garrard, Doug; Davis, Milt, Jr.; Cole, Gary
1999-01-01
The NASA Lewis Research Center (LeRC) and the Arnold Engineering Development Center (AEDC) have developed a closely coupled computer simulation system that provides a one dimensional, high frequency inlet/engine numerical simulation for aircraft propulsion systems. The simulation system, operating under the LeRC-developed Application Portable Parallel Library (APPL), closely coupled a supersonic inlet with a gas turbine engine. The supersonic inlet was modeled using the Large Perturbation Inlet (LAPIN) computer code, and the gas turbine engine was modeled using the Aerodynamic Turbine Engine Code (ATEC). Both LAPIN and ATEC provide a one dimensional, compressible, time dependent flow solution by solving the one dimensional Euler equations for the conservation of mass, momentum, and energy. Source terms are used to model features such as bleed flows, turbomachinery component characteristics, and inlet subsonic spillage while unstarted. High frequency events, such as compressor surge and inlet unstart, can be simulated with a high degree of fidelity. The simulation system was exercised using a supersonic inlet with sixty percent of the supersonic area contraction occurring internally, and a GE J85-13 turbojet engine.
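Both LAPIN and ATEC rest on the same conservation-law core: the one-dimensional compressible Euler equations marched in time. The sketch below is a generic, minimal finite-volume solver (Lax-Friedrichs flux, Sod-type initial data) that illustrates that core; it is not the NASA/AEDC implementation, and the source terms for bleed, turbomachinery, and spillage are omitted.

```python
# Minimal 1D compressible Euler solver with a Lax-Friedrichs interface flux.
# Generic sketch of the conservation-law core shared by codes like LAPIN/ATEC.
import numpy as np

gamma, nx, dx = 1.4, 400, 1.0 / 400

# Conserved variables U = [rho, rho*u, E]; Sod-type initial data, u = 0.
U = np.zeros((3, nx))
U[0] = np.where(np.arange(nx) < nx // 2, 1.0, 0.125)              # density
U[2] = np.where(np.arange(nx) < nx // 2, 1.0, 0.1) / (gamma - 1)  # total energy

def flux(U):
    rho, mom, E = U
    u = mom / rho
    p = (gamma - 1) * (E - 0.5 * rho * u**2)
    return np.array([mom, mom * u + p, (E + p) * u])

t, cfl = 0.0, 0.4
while t < 0.2:
    rho, mom, E = U
    u = mom / rho
    p = (gamma - 1) * (E - 0.5 * rho * u**2)
    c = np.sqrt(gamma * p / rho)                  # local sound speed
    dt = cfl * dx / np.max(np.abs(u) + c)         # CFL-limited time step
    F = flux(U)
    # Lax-Friedrichs interface flux, then conservative update of interior cells.
    Fh = 0.5 * (F[:, :-1] + F[:, 1:]) - 0.5 * dx / dt * (U[:, 1:] - U[:, :-1])
    U[:, 1:-1] -= dt / dx * (Fh[:, 1:] - Fh[:, :-1])
    t += dt

print("final density range:", U[0].min(), U[0].max())
```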
NASA Astrophysics Data System (ADS)
Ethier, Stephane; Lin, Zhihong
2001-10-01
Earlier this year, the National Energy Research Scientific Computing Center (NERSC) took delivery of the second most powerful computer in the world. With its 2,528 processors running at a peak performance of 1.5 GFlops each, this IBM SP machine has a theoretical performance of almost 3.8 TFlops. To efficiently harness such computing power in a single code is not an easy task and requires a good knowledge of the computer's architecture. Here we present the steps that we followed to improve our gyrokinetic micro-turbulence code GTC in order to take advantage of the new 16-way shared-memory nodes of the NERSC IBM SP. Performance results are shown, as well as details of the improved mixed-mode MPI-OpenMP model that we use. The enhancements to the code allowed us to tackle much bigger problem sizes, getting closer to our goal of simulating an ITER-size tokamak with both kinetic ions and electrons. (This work is supported by DOE Contract No. DE-AC02-76CH03073 (PPPL), and in part by the DOE Fusion SciDAC Project.)
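The mixed-mode idea is a two-level decomposition: MPI distributes the domain across nodes, while a shared-memory level (OpenMP in GTC itself, which is Fortran) parallelizes the work inside each node. The Python/mpi4py analogue below only illustrates that structure, with a thread pool standing in for the OpenMP level; the particle counts and toy kernel are made up.

```python
# Sketch of the two-level mixed-mode decomposition: one MPI rank per
# shared-memory node, a thread pool standing in for OpenMP inside each node.
# Illustration only; run with e.g. `mpiexec -n 4 python hybrid.py`.
from concurrent.futures import ThreadPoolExecutor
import numpy as np
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()

N_PARTICLES = 1_000_000
THREADS_PER_NODE = 16                 # mimics the 16-way SP nodes

# MPI level: split the particle population across ranks.
local = np.random.rand(N_PARTICLES // size)

def chunk_work(chunk):
    # "OpenMP" level: each thread handles its slice of particles (toy kernel).
    return np.sum(np.sin(chunk) ** 2)

pieces = np.array_split(local, THREADS_PER_NODE)
with ThreadPoolExecutor(max_workers=THREADS_PER_NODE) as pool:
    local_sum = sum(pool.map(chunk_work, pieces))

total = comm.reduce(local_sum, op=MPI.SUM, root=0)
if rank == 0:
    print("global diagnostic:", total)
```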
Computer Maintenance Operations Center (CMOC), showing duplexed cyber 170-174 computers
Computer Maintenance Operations Center (CMOC), showing duplexed cyber 170-174 computers - Beale Air Force Base, Perimeter Acquisition Vehicle Entry Phased-Array Warning System, Technical Equipment Building, End of Spencer Paul Road, north of Warren Shingle Road (14th Street), Marysville, Yuba County, CA
NASA Center for Computational Sciences: History and Resources
NASA Technical Reports Server (NTRS)
2000-01-01
The NASA Center for Computational Sciences (NCCS) has been a leading capacity computing facility, providing a production environment and support resources to address the challenges facing the Earth and space sciences research community.
Center for Computing Research Summer Research Proceedings 2015.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bradley, Andrew Michael; Parks, Michael L.
2015-12-18
The Center for Computing Research (CCR) at Sandia National Laboratories organizes a student program each summer, in coordination with the Computer Science Research Institute (CSRI) and the Cyber Engineering Research Institute (CERI).
Integration of Panda Workload Management System with supercomputers
NASA Astrophysics Data System (ADS)
De, K.; Jha, S.; Klimentov, A.; Maeno, T.; Mashinistov, R.; Nilsson, P.; Novikov, A.; Oleynik, D.; Panitkin, S.; Poyda, A.; Read, K. F.; Ryabinkin, E.; Teslyuk, A.; Velikhov, V.; Wells, J. C.; Wenaus, T.
2016-09-01
The Large Hadron Collider (LHC), operating at the international CERN Laboratory in Geneva, Switzerland, is leading Big Data driven scientific explorations. Experiments at the LHC explore the fundamental nature of matter and the basic forces that shape our universe, and were recently credited for the discovery of a Higgs boson. ATLAS, one of the largest collaborations ever assembled in the sciences, is at the forefront of research at the LHC. To address an unprecedented multi-petabyte data processing challenge, the ATLAS experiment relies on a heterogeneous distributed computational infrastructure. The ATLAS experiment uses the PanDA (Production and Data Analysis) Workload Management System for managing the workflow for all data processing on over 140 data centers. Through PanDA, ATLAS physicists see a single computing facility that enables rapid scientific breakthroughs for the experiment, even though the data centers are physically scattered all over the world. While PanDA currently uses more than 250,000 cores with a peak performance of 0.3+ petaFLOPS, the next LHC data-taking runs will require more resources than Grid computing can possibly provide. To alleviate these challenges, LHC experiments are engaged in an ambitious program to expand the current computing model to include additional resources such as the opportunistic use of supercomputers. We will describe a project aimed at integration of PanDA WMS with supercomputers in the United States, Europe and Russia (in particular with the Titan supercomputer at the Oak Ridge Leadership Computing Facility (OLCF), the supercomputer at the National Research Center "Kurchatov Institute", IT4 in Ostrava, and others). The current approach utilizes a modified PanDA pilot framework for job submission to the supercomputers' batch queues and local data management, with light-weight MPI wrappers to run single-threaded workloads in parallel on Titan's multi-core worker nodes. This implementation was tested with a variety of Monte Carlo workloads on several supercomputing platforms. We will present our current accomplishments in running PanDA WMS at supercomputers and demonstrate our ability to use PanDA as a portal independent of the computing facility's infrastructure for High Energy and Nuclear Physics, as well as other data-intensive science applications, such as bioinformatics and astro-particle physics.
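A light-weight MPI wrapper of the kind described is conceptually small: every rank of one batch allocation launches an independent single-threaded payload, so a serial workload fills a multi-core worker node. The sketch below illustrates the idea only; it is not PanDA's actual pilot code, and `./payload.sh` and the file names are placeholders.

```python
# Sketch of a light-weight MPI wrapper: each rank runs one single-threaded
# payload so one batch job fills a many-core node. Illustration only.
import subprocess
import sys
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank = comm.Get_rank()

# Each rank processes its own input slice and writes its own log.
result = subprocess.run(
    ["./payload.sh", f"--input=events_{rank:05d}.dat"],   # placeholder command
    stdout=open(f"payload_{rank:05d}.log", "w"),
    stderr=subprocess.STDOUT,
)

# Gather exit codes so rank 0 can report a single status for the batch job.
codes = comm.gather(result.returncode, root=0)
if rank == 0:
    failed = [i for i, c in enumerate(codes) if c != 0]
    print(f"{len(codes) - len(failed)} payloads succeeded, {len(failed)} failed")
    sys.exit(1 if failed else 0)
```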
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pope, Gary A.
"The Center for Frontiers of Subsurface Energy Security (CFSES)" was submitted to the "Life at the Frontiers of Energy Research" video contest at the 2011 Science for Our Nation's Energy Future: Energy Frontier Research Centers (EFRCs) Summit and Forum. Twenty-six EFRCs created short videos to highlight their mission and their work. CFSES is directed by Gary A. Pope at the University of Texas at Austin and partners with Sandia National Laboratories. The Office of Basic Energy Sciences in the U.S. Department of Energy's Office of Science established the 46 Energy Frontier Research Centers (EFRCs) in 2009. These collaboratively-organized centers conductmore » fundamental research focused on 'grand challenges' and use-inspired 'basic research needs' recently identified in major strategic planning efforts by the scientific community. The overall purpose is to accelerate scientific progress toward meeting the nation's critical energy challenges.« less
Pope, Gary A. (Director, Center for Frontiers of Subsurface Energy Security); CFSES Staff
2017-12-09
'The Center for Frontiers of Subsurface Energy Security (CFSES)' was submitted to the 'Life at the Frontiers of Energy Research' video contest at the 2011 Science for Our Nation's Energy Future: Energy Frontier Research Centers (EFRCs) Summit and Forum. Twenty-six EFRCs created short videos to highlight their mission and their work. CFSES is directed by Gary A. Pope at the University of Texas at Austin and partners with Sandia National Laboratories. The Office of Basic Energy Sciences in the U.S. Department of Energy's Office of Science established the 46 Energy Frontier Research Centers (EFRCs) in 2009. These collaboratively-organized centers conduct fundamental research focused on 'grand challenges' and use-inspired 'basic research needs' recently identified in major strategic planning efforts by the scientific community. The overall purpose is to accelerate scientific progress toward meeting the nation's critical energy challenges.
Burns, Peter (Director, Materials Science of Actinides); MSA Staff
2017-12-09
'Energy Frontier Research Center Materials Science of Actinides' was submitted by the EFRC for Materials Science of Actinides (MSA) to the 'Life at the Frontiers of Energy Research' video contest at the 2011 Science for Our Nation's Energy Future: Energy Frontier Research Centers (EFRCs) Summit and Forum. Twenty-six EFRCs created short videos to highlight their mission and their work. MSA is directed by Peter Burns at the University of Notre Dame, and is a partnership of scientists from ten institutions. The Office of Basic Energy Sciences in the U.S. Department of Energy's Office of Science established the 46 Energy Frontier Research Centers (EFRCs) in 2009. These collaboratively-organized centers conduct fundamental research focused on 'grand challenges' and use-inspired 'basic research needs' recently identified in major strategic planning efforts by the scientific community. The overall purpose is to accelerate scientific progress toward meeting the nation's critical energy challenges.
First-principles study of oxygen evolution reaction on Co doped NiFe-layered double hydroxides
NASA Astrophysics Data System (ADS)
Yu, Jie; Perdew, John; Yan, Qimin
The conversion of solar energy to renewable fuels is a grand challenge. One of the crucial steps in this energy conversion process is the discovery of efficient catalysts with lower overpotentials for the oxygen evolution reaction (OER). Layered double hydroxides (LDH) with earth-abundant elements such as Ni and Fe have been found to be promising OER catalysts and shown to be active for water oxidation. Doping is one feasible way to further lower the overpotential of host materials and break the linear scaling law. In this talk we will present our study of the reaction mechanism of OER on pure and Co-doped NiFe-LDH systems in alkaline solution. We study the adsorption energetics of the reaction intermediates and calculate the thermodynamic reaction energies using density functional theory with PBE+U and the newly developed SCAN functionals. It is shown that the NiFe-LDH system with Co dopants has a lower overpotential and higher activity compared with the undoped system. The improvement in activity is related to the presence of Co states in the electronic structure. The work provides clear guidance for further improvement of the OER activity of LDH systems by chemical doping. The work was supported as part of the Center for the Computational Design of Functional Layered Materials, an Energy Frontier Research Center funded by the U.S. Department of Energy, Office of Science.
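The thermodynamic overpotential referred to here is usually extracted from the free energies of the four proton-electron steps of the OER: an ideal catalyst would split the 4.92 eV evenly (1.23 eV per step), so the overpotential is set by the largest step. The sketch below shows that standard bookkeeping; the free energies are placeholders, not the paper's NiFe-LDH values.

```python
# Standard four-step thermodynamic analysis of an OER overpotential.
# The free-energy changes (eV, at U = 0) below are hypothetical placeholders.
steps = {
    "* + H2O -> *OH":    1.60,
    "*OH -> *O":         1.45,
    "*O + H2O -> *OOH":  1.72,
    "*OOH -> * + O2":    0.15,
}
assert abs(sum(steps.values()) - 4.92) < 0.01  # steps must sum to 4 * 1.23 eV

limiting = max(steps, key=steps.get)           # potential-limiting step
eta = steps[limiting] - 1.23                   # thermodynamic overpotential (V)
print(f"limiting step: {limiting}, overpotential = {eta:.2f} V")
```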
A theoretical study of hydrogen complexes of the XH-pi type between propyne and HF, HCL or HCN.
Tavares, Alessandra M; da Silva, Washington L V; Lopes, Kelson C; Ventura, Elizete; Araújo, Regiane C M U; do Monte, Silmar A; da Silva, João Bosco P; Ramos, Mozart N
2006-05-15
The present manuscript reports a systematic investigation of the basis set dependence of some properties of hydrogen-bonded (pi-type) complexes formed by propyne and an HX molecule, where X = F, Cl, and CN. The calculations have been performed at the Hartree-Fock, MP2, and B3LYP levels. Geometries, H-bond energies, and vibrational frequencies have been considered. The more pronounced effects on the structural parameters of the isolated molecules, as a result of complexation, are verified on the C≡C and H-X bond lengths. As compared to the double-zeta (6-31G**) basis set, the triple-zeta (6-311G**) basis set leads to an increase of the C≡C bond distance at all three computational levels. In the case where diffuse functions are added to both hydrogen and 'heavy' atoms, the effect is more pronounced. The propyne-HX structural parameters are quite similar to the corresponding parameters of the acetylene-HX complexes at all levels. The largest difference is obtained for the hydrogen bond distance, RH, with a smaller value for the propyne-HX complex, indicating a stronger bond. Concerning the electronic properties, the results yield the following ordering for the H-bond energies, ΔE: propyne⋯HF > propyne⋯HCl > propyne⋯HCN. It is also important to point out that the inclusion of BSSE and zero-point energy (ZPE) corrections causes significant changes in ΔE. The smallest ZPE effect is obtained for propyne⋯HCN at the HF/6-311++G** level, while the greatest difference is obtained at the MP2/6-31G** level for the propyne⋯HF system. Concerning the IR vibrational frequencies, it was found that larger shifts can be associated with stronger hydrogen bonds. The most pronounced effect on the normal modes of the isolated molecule after complexation is obtained for the H-X stretching frequency, which is shifted downward.
NASA Astrophysics Data System (ADS)
Lee, Myeong H.; Dunietz, Barry D.; Geva, Eitan
2014-03-01
Classical Marcus theory is commonly adopted for solvent-mediated charge transfer (CT) processes to obtain the CT rate constant, but it can become questionable when intramolecular vibrational modes dominate the CT process, as in OPV devices, because Marcus theory treats these modes classically and therefore nuclear tunneling is not accounted for. We present a computational scheme to obtain the electron transfer rate constant beyond classical Marcus theory. Within this approach, the nuclear vibrational modes are treated quantum-mechanically and a short-time approximation is avoided. Ab initio calculations are used to obtain the basic parameters needed for calculating the electron transfer rate constant. We apply our methodology to a phthalocyanine (H2PC)-C60 organic photovoltaic system, where one C60 acceptor and one or two H2PC donors are included to model the donor-acceptor interface configuration. We obtain the electron transfer and recombination rate constants for all accessible charge transfer (CT) states, from which the CT exciton dynamics is determined by employing a master equation. The role of higher-lying excited states in CT exciton dynamics is discussed. This work is pursued as part of the Center for Solar and Thermal Energy Conversion, an Energy Frontier Research Center funded by the US Department of Energy, Office of Science, Office of Basic Energy Sciences under Award No. DE-SC0000957.
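For reference, the classical Marcus expression is the baseline whose breakdown motivates the scheme above. The sketch below evaluates it directly; the electronic coupling, reorganization energy, and driving force are placeholders, not the H2PC-C60 values.

```python
# Classical Marcus rate constant,
#   k_ET = (2*pi/hbar) |H_DA|^2 (4*pi*lambda*kB*T)^(-1/2)
#          * exp(-(dG + lambda)^2 / (4*lambda*kB*T)),
# with all energies in eV. Parameter values are hypothetical placeholders.
import math

HBAR = 6.582e-16      # eV*s
KB = 8.617e-5         # eV/K

def marcus_rate(H_da, lam, dG, T=300.0):
    kbt = KB * T
    pref = (2 * math.pi / HBAR) * H_da**2 / math.sqrt(4 * math.pi * lam * kbt)
    return pref * math.exp(-(dG + lam) ** 2 / (4 * lam * kbt))

print(f"k_ET = {marcus_rate(H_da=0.01, lam=0.3, dG=-0.25):.3e} 1/s")
```

Because the exponential treats all accepting modes classically, this rate underestimates transfer when high-frequency intramolecular modes tunnel, which is exactly the regime the abstract addresses.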
NASA Astrophysics Data System (ADS)
Blum, Volker
This talk describes recent advances of a general, efficient, accurate all-electron electronic structure theory approach based on numeric atom-centered orbitals; emphasis is placed on developments related to materials for energy conversion and their discovery. For total energies and electron band structures, we show that the overall accuracy is on par with the best benchmark-quality codes for materials, but scalable to large system sizes (1,000s of atoms) and amenable to both periodic and non-periodic simulations. A recent localized resolution-of-identity approach for the Coulomb operator enables O(N) hybrid-functional-based descriptions of the electronic structure of non-periodic and periodic systems, shown for supercell sizes up to 1,000 atoms; the same approach yields accurate results for many-body perturbation theory as well. For molecular systems, we also show how many-body perturbation theory for charged and neutral quasiparticle excitation energies can be efficiently yet accurately applied using basis sets of computationally manageable size. Finally, the talk highlights applications to the electronic structure of hybrid organic-inorganic perovskite materials, as well as to graphene-based substrates for possible future transition metal compound based electrocatalyst materials. All methods described here are part of the FHI-aims code. VB gratefully acknowledges contributions by numerous collaborators at Duke University, Fritz Haber Institute Berlin, TU Munich, USTC Hefei, Aalto University, and many others around the globe.
Annealing behavior of the EB-centers and M-center in low-energy electron irradiated n-type 4H-SiC
NASA Astrophysics Data System (ADS)
Beyer, F. C.; Hemmingsson, C.; Pedersen, H.; Henry, A.; Janzén, E.; Isoya, J.; Morishita, N.; Ohshima, T.
2011-05-01
After low-energy electron irradiation of epitaxial n-type 4H-SiC with a dose of 5×10¹⁶ cm⁻², the bistable M-center, previously reported in high-energy proton-implanted 4H-SiC, is detected in the deep level transient spectroscopy (DLTS) spectrum. The annealing behavior of the M-center is confirmed, and an enhanced recombination process is suggested. The annihilation process coincides with the evolution of the bistable EB-centers in the low-temperature range of the DLTS spectrum. The annealing energy of the M-center is similar to the generation energy of the EB-centers; thus a partial transformation of the M-center into the EB-centers is suggested. The EB-centers completely disappeared after annealing at temperatures higher than 700 ∘C without the formation of new defects in the observed DLTS scanning range. The threshold energy for displacing a Si atom in SiC is higher than the applied irradiation energy, and the annihilation temperatures are relatively low; therefore the M-center, EH1 and EH3, as well as the EB-centers, are attributed to defects related to the C atom in SiC, most probably carbon interstitials and their complexes.
NASA Astrophysics Data System (ADS)
Patil, Mandar; Harada, Tomohiro; Nakao, Ken-ichi; Joshi, Pankaj S.; Kimura, Masashi
2016-05-01
The origin of the ultrahigh-energy particles we receive on Earth from outer space, such as EeV cosmic rays and PeV neutrinos, remains an enigma. All mechanisms known to us currently make use of the electromagnetic interaction to accelerate charged particles. In this paper, we propose a mechanism based exclusively on gravity rather than the electromagnetic interaction. We show that it is possible to generate ultrahigh-energy particles starting from particles with moderate energies using the collisional Penrose process in an overspinning Kerr spacetime transcending the Kerr bound by only an infinitesimal amount, i.e., with the Kerr parameter a = M(1 + ε), where we take the limit ε → 0⁺. We consider two massive particles starting from rest at infinity that collide at r = M with divergent center-of-mass energy and produce two massless particles. We show that massless particles produced in the collision can escape to infinity with ultrahigh energies, exploiting the collisional Penrose process with the divergent efficiency η ~ 1/√ε → ∞. Assuming isotropic emission of massless particles in the center-of-mass frame of the colliding particles, we show that half of the particles created in the collisions escape to infinity with divergent energies, while the proportion of particles that reach infinity with finite energy is minuscule. To a distant observer, the ultrahigh-energy particles appear to originate from a bright spot at the angular location ξ ~ 2M/r_obs with respect to the singularity, on the side rotating toward the observer. We compute the spectrum of the high-energy massless particles and show that anisotropy in the emission in the center-of-mass frame leaves a distinct signature on its shape. Since the anisotropy is dictated by the differential cross section of the underlying particle-physics process, observation of the spectrum can constrain the particle-physics model and serve as a unique probe of fundamental physics at ultrahigh energies at which the particles collide. Thus, the existence of a near-extremal overspinning Kerr geometry in the Universe, either as a transient or a permanent configuration, would have deep implications for astrophysics as well as fundamental particle physics.
NASA Astrophysics Data System (ADS)
Jiang, Yingni
2018-03-01
Due to the high energy consumption of communication infrastructure, energy saving in data centers must be enforced. However, the lack of evaluation mechanisms has slowed progress on the energy-saving construction of data centers. In this paper, an energy-saving evaluation index system for data centers was constructed on the basis of clarifying the influence factors. Based on the evaluation index system, the analytic hierarchy process was used to determine the weights of the evaluation indexes. Subsequently, a three-grade fuzzy comprehensive evaluation model was constructed to evaluate the energy-saving performance of data centers.
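The two-stage pipeline the abstract describes (AHP weights, then a fuzzy composite score) is compact enough to sketch end to end. In the sketch below, the pairwise-comparison matrix, the three example indexes, and the membership matrix R are hypothetical stand-ins; only the procedure (principal-eigenvector weights, consistency check, weighted-average composite B = w·R) follows the standard method.

```python
# Sketch: AHP weights from a pairwise-comparison matrix, then a three-grade
# fuzzy comprehensive evaluation. All matrix entries are hypothetical.
import numpy as np

# Pairwise comparisons among three indexes (e.g. cooling, IT load, power chain).
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])

eigvals, eigvecs = np.linalg.eig(A)
w = np.abs(eigvecs[:, np.argmax(eigvals.real)].real)
w /= w.sum()                                   # AHP weight vector

# Consistency ratio: CI = (lambda_max - n)/(n - 1), RI(n=3) = 0.58 (Saaty).
n = A.shape[0]
CI = (eigvals.real.max() - n) / (n - 1)
print(f"weights = {np.round(w, 3)}, CR = {CI / 0.58:.3f}")

# Membership of each index in three grades (good / average / poor).
R = np.array([[0.6, 0.3, 0.1],
              [0.4, 0.4, 0.2],
              [0.2, 0.5, 0.3]])
B = w @ R                                      # weighted-average composite
print("grade memberships:", np.round(B, 3),
      "->", ["good", "average", "poor"][B.argmax()])
```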
78 FR 45513 - Privacy Act of 1974; Computer Matching Program
Federal Register 2010, 2011, 2012, 2013, 2014
2013-07-29
...; Computer Matching Program AGENCY: Defense Manpower Data Center (DMDC), DoD. ACTION: Notice of a Computer... individual's privacy, and would result in additional delay in determining eligibility and, if applicable, the... Defense. NOTICE OF A COMPUTER MATCHING PROGRAM AMONG THE DEFENSE MANPOWER DATA CENTER, THE DEPARTMENT OF...
20. SITE BUILDING 002 SCANNER BUILDING IN COMPUTER ...
20. SITE BUILDING 002 - SCANNER BUILDING - IN COMPUTER ROOM LOOKING AT "CONSOLIDATED MAINTENANCE OPERATIONS CENTER" JOB AREA AND OPERATION WORK CENTER. TASKS INCLUDE RADAR MAINTENANCE, COMPUTER MAINTENANCE, CYBER COMPUTER MAINTENANCE AND RELATED ACTIVITIES. - Cape Cod Air Station, Technical Facility-Scanner Building & Power Plant, Massachusetts Military Reservation, Sandwich, Barnstable County, MA
Computational Science: A Research Methodology for the 21st Century
NASA Astrophysics Data System (ADS)
Orbach, Raymond L.
2004-03-01
Computational simulation - a means of scientific discovery that employs computer systems to simulate a physical system according to laws derived from theory and experiment - has attained peer status with theory and experiment. Important advances in basic science are accomplished by a new "sociology" for ultrascale scientific computing capability (USSCC), a fusion of sustained advances in scientific models, mathematical algorithms, computer architecture, and scientific software engineering. Expansion of current capabilities by factors of 100-1000 opens up new vistas for scientific discovery: long-term climatic variability and change, macroscopic material design from correlated behavior at the nanoscale, design and optimization of magnetic confinement fusion reactors, strong interactions on a computational lattice through quantum chromodynamics, and stellar explosions and element production. The "virtual prototype," made possible by this expansion, can markedly reduce time-to-market for industrial applications such as jet engines and safer, more fuel-efficient, cleaner cars. In order to develop USSCC, the National Energy Research Scientific Computing Center (NERSC) announced the competition "Innovative and Novel Computational Impact on Theory and Experiment" (INCITE), with no requirement for current DOE sponsorship. Fifty-nine proposals for grand-challenge scientific problems were submitted for a small number of awards. The successful grants, and their preliminary progress, will be described.
Green, Peter F. (Director, Center for Solar and Thermal Energy Conversion, University of Michigan); CSTEC Staff
2017-12-09
'Heart of the Solution - Energy Frontiers' was submitted by the Center for Solar and Thermal Energy Conversion (CSTEC) to the 'Life at the Frontiers of Energy Research' video contest at the 2011 Science for Our Nation's Energy Future: Energy Frontier Research Centers (EFRCs) Summit and Forum. Twenty-six EFRCs created short videos to highlight their mission and their work. This video was both the People's Choice Award winner and selected as one of five winners by a distinguished panel of judges for its 'exemplary explanation of the role of an Energy Frontier Research Center'. The Center for Solar and Thermal Energy Conversion is directed by Peter F. Green at the University of Michigan. The Office of Basic Energy Sciences in the U.S. Department of Energy's Office of Science established the 46 Energy Frontier Research Centers (EFRCs) in 2009. These collaboratively-organized centers conduct fundamental research focused on 'grand challenges' and use-inspired 'basic research needs' recently identified in major strategic planning efforts by the scientific community. The overall purpose is to accelerate scientific progress toward meeting the nation's critical energy challenges. The mission of the Center for Solar and Thermal Energy Conversion is 'to study complex material structures on the nanoscale to identify key features for their potential use as materials to convert solar energy and heat to electricity.' Research topics are: solar photovoltaic, photonic, optics, solar thermal, thermoelectric, phonons, thermal conductivity, solar electrodes, defects, ultrafast physics, interfacial characterization, matter by design, novel materials synthesis, charge transport, and self-assembly.
Controller Chips Preserve Microprocessor Function
NASA Technical Reports Server (NTRS)
2012-01-01
Above the Atlantic Ocean, off the coast of Brazil, there is a dip in the Earth's surrounding magnetic field called the South Atlantic Anomaly. Here, space radiation can reach into Earth's upper atmosphere to interfere with the functioning of satellites, aircraft, and even the International Space Station. "The South Atlantic Anomaly is a hot spot of radiation that the space station goes through at a certain point in orbit," Miria Finckenor, a physicist at Marshall Space Flight Center, describes. "If there's going to be a problem with the electronics, 90 percent of that time, it is going to be in that spot." Space radiation can cause physical damage to microchips and can actually change the software commands in computers. When high-energy particles penetrate a satellite or other spacecraft, the electrical components can absorb the energy and temporarily switch off. If the energy is high enough, it can cause the device to enter a hung state, which can only be addressed by restarting the system. When space radiation affects the operational status of microprocessors, the occurrence is called a single event functional interrupt (SEFI). SEFI happens not only to the computers onboard spacecraft in Earth orbit, but to the computers on spacecraft throughout the solar system. "One of the Mars rovers had this problem in the radiation environment and was rebooting itself several times a day. On one occasion, it rebooted 40 times in one day," Finckenor says. "It's hard to obtain any data when you have to constantly reboot and start over."
DOE Office of Scientific and Technical Information (OSTI.GOV)
None
"CABS: Green Energy for our Nation's Future" was submitted by the Center for Advanced Biofuel Systems (CABS) to the "Life at the Frontiers of Energy Research" video contest at the 2011 Science for Our Nation's Energy Future: Energy Frontier Research Centers (EFRCs) Summit and Forum. Twenty-six EFRCs created short videos to highlight their mission and their work. CABS, an EFRC directed by Jan Jaworski at the Donald Danforth Plant Science Center is a partnership of scientists from five institutions: Donald Danforth Plant Science Center (lead), Michigan State University, the University of Nebraska, New Mexico Consortium/LANL, and Washington State University. Themore » Office of Basic Energy Sciences in the U.S. Department of Energy's Office of Science established the 46 Energy Frontier Research Centers (EFRCs) in 2009. These collaboratively-organized centers conduct fundamental research focused on 'grand challenges' and use-inspired 'basic research needs' recently identified in major strategic planning efforts by the scientific community. The overall purpose is to accelerate scientific progress toward meeting the nation's critical energy challenges.« less
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1980-03-01
Appendix II of The Woodlands Metro Center Energy Study near Houston consists of the following: Metro Center Program, Conventional Plan Building Prototypes and Detail Parcel Analysis, Energy Plan Building Prototypes, and Energy Plan Detail Parcel Analysis.
Zhu, Xiaoyang (Director, Understanding Charge Separation and Transfer at Interfaces in Energy Materials); CST Staff
2017-12-09
'EFRC:CST at the University of Texas at Austin - A DOE Energy Frontier Research Center' was submitted by the EFRC for Understanding Charge Separation and Transfer at Interfaces in Energy Materials (EFRC:CST) to the 'Life at the Frontiers of Energy Research' video contest at the 2011 Science for Our Nation's Energy Future: Energy Frontier Research Centers (EFRCs) Summit and Forum. Twenty-six EFRCs created short videos to highlight their mission and their work. EFRC:CST is directed by Xiaoyang Zhu at the University of Texas at Austin in partnership with Sandia National Laboratories. The Office of Basic Energy Sciences in the U.S. Department of Energy's Office of Science established the 46 Energy Frontier Research Centers (EFRCs) in 2009. These collaboratively-organized centers conduct fundamental research focused on 'grand challenges' and use-inspired 'basic research needs' recently identified in major strategic planning efforts by the scientific community. The overall purpose is to accelerate scientific progress toward meeting the nation's critical energy challenges.
Harrigan, Robert L; Yvernault, Benjamin C; Boyd, Brian D; Damon, Stephen M; Gibney, Kyla David; Conrad, Benjamin N; Phillips, Nicholas S; Rogers, Baxter P; Gao, Yurui; Landman, Bennett A
2016-01-01
The Vanderbilt University Institute for Imaging Science (VUIIS) Center for Computational Imaging (CCI) has developed a database built on XNAT housing over a quarter of a million scans. The database provides a framework for (1) rapid prototyping, (2) large-scale batch processing of images, and (3) scalable project management. The system uses the web-based interfaces of XNAT and REDCap to allow for graphical interaction. A Python middleware layer, the Distributed Automation for XNAT (DAX) package, distributes computation across the Vanderbilt Advanced Computing Center for Research and Education high-performance computing center. All software is made available as open source for use in combining Portable Batch System (PBS) grids and XNAT servers. Copyright © 2015 Elsevier Inc. All rights reserved.
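At its core, the kind of batch distribution described here amounts to generating a job script per task and handing it to the scheduler. The sketch below shows a minimal qsub-based version of that pattern; the resource line, the `run_pipeline` command, and the paths are hypothetical, and the real DAX package wraps this in XNAT-aware bookkeeping.

```python
# Minimal sketch of PBS-grid task distribution: write one job script per
# scan-processing task and submit it with qsub. Command and paths are
# placeholders, not the DAX package's actual interface.
import subprocess
from pathlib import Path

PBS_TEMPLATE = """#!/bin/bash
#PBS -N dax_{session}
#PBS -l nodes=1:ppn=1,walltime=04:00:00
run_pipeline --session {session} --out /scratch/{session}  # placeholder command
"""

def submit(session: str) -> str:
    script = Path(f"dax_{session}.pbs")
    script.write_text(PBS_TEMPLATE.format(session=session))
    # qsub prints the new job identifier on stdout.
    out = subprocess.run(["qsub", str(script)], capture_output=True, text=True)
    return out.stdout.strip()

for sess in ["scan_0001", "scan_0002"]:
    print(sess, "->", submit(sess))
```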
DOE Office of Scientific and Technical Information (OSTI.GOV)
Roth, P.A.
1988-10-28
The ATHENA (Advanced Thermal Hydraulic Energy Network Analyzer) code is a system transient analysis code with multi-loop, multi-fluid capabilities, which is available to the fusion community at the National Magnetic Fusion Energy Computing Center (NMFECC). The work reported here assesses the ATHENA magnetohydrodynamic (MHD) pressure drop model for liquid metals flowing through a strong magnetic field. An ATHENA model was developed for two simple-geometry, adiabatic test sections used in the Argonne Liquid Metal Experiment (ALEX) at Argonne National Laboratory (ANL). The pressure drops calculated by ATHENA agreed well with the experimental results from the ALEX facility. 13 refs., 4 figs., 2 tabs.
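For orientation, a common closed-form estimate for this regime is the fully developed, thin-conducting-wall result, dp/dx = σ_f u B² c/(1+c) with wall conductance ratio c = σ_w t_w/(σ_f a). The sketch below is only that back-of-envelope check, not ATHENA's model, and the property values are rough placeholders.

```python
# Back-of-envelope, fully developed thin-conducting-wall MHD pressure gradient:
#   dp/dx = sigma_f * u * B^2 * c / (1 + c),  c = (sigma_w * t_w) / (sigma_f * a)
# Sanity-check estimate only; all property values below are rough placeholders.
def mhd_dpdx(sigma_f, sigma_w, u, B, a, t_w):
    c = (sigma_w * t_w) / (sigma_f * a)   # wall conductance ratio
    return sigma_f * u * B**2 * c / (1 + c)

# Rough liquid-metal numbers: fluid conductivity ~3e6 S/m, 0.1 m/s in a 5 T
# field, 5 cm duct half-width, 2 mm steel wall (~1.4e6 S/m).
dpdx = mhd_dpdx(3e6, 1.4e6, 0.1, 5.0, 0.05, 0.002)
print(f"dp/dx ~ {dpdx / 1e3:.0f} kPa/m")
```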
Distribution Locational Real-Time Pricing Based Smart Building Control and Management
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hao, Jun; Dai, Xiaoxiao; Zhang, Yingchen
This paper proposes a real-virtual parallel computing scheme for smart building operations aimed at augmenting overall social welfare. The University of Denver's campus power grid and the Ritchie fitness center are used to demonstrate the proposed approach. An artificial virtual system is built in parallel to the real physical system to evaluate the overall social cost of the building operation, based on a social-science-based working productivity model, a numerical-experiment-based building energy consumption model, and a power-system-based real-time pricing mechanism. Through interactive feedback exchanged between the real and virtual systems, enlarged social welfare, including monetary cost reduction and energy saving as well as working productivity improvements, can be achieved.
Elastic and transport cross sections for inert gases in a hydrogen plasma
NASA Astrophysics Data System (ADS)
Krstic, Predrag
2005-05-01
Accurate elastic differential and integral scattering and transport cross sections have been computed using a fully quantum-mechanical approach for hydrogen ions (H^+, D^+ and T^+) colliding with neon, krypton, and xenon, in the center-of-mass energy range 0.1 to 200 eV. The momentum transfer and viscosity cross sections have been extended to higher collision energies in the keV range using a classical, three-body scattering method. The results were compared with previously calculated values for argon and helium, as well as with simple analytical models. The cross sections, tabulated and available through the world wide web (www-cfadc.phy.ornl.gov), are of significance in fusion plasma modeling, gaseous electronics, and other plasma applications.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fournier, Joseph A.; Wolke, Conrad T.; Johnson, Mark A.
In this Article, we review the role of gas-phase, size-selected protonated water clusters, H+(H2O)n, in the analysis of the microscopic mechanics responsible for the behavior of the excess proton in bulk water. We extend upon previous studies of the smaller, two-dimensional sheet-like structures to larger (n≥10) assemblies with three-dimensional cage morphologies which better mimic the bulk environment. Indeed, clusters in which a complete second solvation shell forms around a surface-embedded hydronium ion yield vibrational spectra where the signatures of the proton defect display strikingly similar positions and breadth to those observed in dilute acids. We investigate effects of the local structure and intermolecular interactions on the large red shifts observed in the proton vibrational signature upon cluster growth using various theoretical methods. We show that, in addition to sizeable anharmonic couplings, the position of the excess proton vibration can be traced to large increases in the electric field exerted on the embedded hydronium ion upon formation of the first and second solvation shells. MAJ acknowledges support from the U.S. Department of Energy under Grant No. DE-FG02-06ER15800 as well as the facilities and staff of the Yale University Faculty of Arts and Sciences High Performance Computing Center, and by the National Science Foundation under Grant No. CNS 08-21132 that partially funded acquisition of the facilities. SMK and SSX acknowledge support from the US Department of Energy, Office of Science, Office of Basic Energy Sciences, Division of Chemical Sciences, Geosciences and Biosciences. Pacific Northwest National Laboratory (PNNL) is a multiprogram national laboratory operated for DOE by Battelle. This research used resources of the National Energy Research Scientific Computing Center, which is supported by the Office of Science of the U.S. Department of Energy under Contract No. DE-AC02-05CH11231.
Heterogeneous multiscale Monte Carlo simulations for gold nanoparticle radiosensitization.
Martinov, Martin P; Thomson, Rowan M
2017-02-01
To introduce the heterogeneous multiscale (HetMS) model for Monte Carlo simulations of gold nanoparticle dose-enhanced radiation therapy (GNPT), a model characterized by its varying levels of detail on different length scales within a single phantom; to apply the HetMS model in two different scenarios relevant for GNPT and to compare computed results with others published. The HetMS model is implemented using an extended version of the EGSnrc user-code egs_chamber; the extended code is tested and verified via comparisons with recently published data from independent GNP simulations. Two distinct scenarios for the HetMS model are then considered: (a) monoenergetic photon beams (20 keV to 1 MeV) incident on a cylinder (1 cm radius, 3 cm length); (b) isotropic point source (brachytherapy source spectra) at the center of a 2.5 cm radius sphere with gold nanoparticles (GNPs) diffusing outwards from the center. Dose enhancement factors (DEFs) are compared for different source energies, depths in phantom, gold concentrations, GNP sizes, and modeling assumptions, as well as with independently published values. Simulation efficiencies are investigated. The HetMS MC simulations account for the competing effects of photon fluence perturbation (due to gold in the scatter media) coupled with enhanced local energy deposition (due to modeling discrete GNPs within subvolumes). DEFs are most sensitive to these effects for the lower source energies, varying with distance from the source; DEFs below unity (i.e., dose decreases, not enhancements) can occur at energies relevant for brachytherapy. For example, in the cylinder scenario, the 20 keV photon source has a DEF of 3.1 near the phantom's surface, decreasing to less than unity by 0.7 cm depth (for 20 mg/g). Compared to discrete modeling of GNPs throughout the gold-containing (treatment) volume, efficiencies are enhanced by up to a factor of 122 with the HetMS approach. For the spherical phantom, DEFs vary with time for diffusion, radionuclide, and radius; DEFs differ considerably from those computed using a widely applied analytic approach. By combining geometric models of varying complexity on different length scales within a single simulation, the HetMS model can effectively account for both macroscopic and microscopic effects which must both be considered for accurate computation of energy deposition and DEFs for GNPT. Efficiency gains with the HetMS approach enable diverse calculations which would otherwise be prohibitively long. The HetMS model may be extended to diverse scenarios relevant for GNPT, providing further avenues for research and development. © 2016 American Association of Physicists in Medicine.
Catalyzing Gender Equality-Focused Clean Energy Development in West Africa
DOE Office of Scientific and Technical Information (OSTI.GOV)
The Economic Community of West African States (ECOWAS) Regional Center for Renewable Energy and Energy Efficiency (ECREEE) partnered with the Clean Energy Solutions Center (Solutions Center), the African Development Bank and other institutions to develop a Situation Analysis of Energy and Gender Issues in ECOWAS Member States. Through a systematic approach to assess interlinked gender and energy issues in the region, the report puts forth a number of key findings. This brochure highlights ECREEE's partnership with the Solutions Center and key findings from the report.
Phillips, Jordan J; Peralta, Juan E
2013-05-07
We present a method for calculating magnetic coupling parameters from a single spin-configuration via analytic derivatives of the electronic energy with respect to the local spin direction. This method does not introduce new approximations beyond those found in the Heisenberg-Dirac Hamiltonian and a standard Kohn-Sham Density Functional Theory calculation, and in the limit of an ideal Heisenberg system it reproduces the coupling as determined from spin-projected energy-differences. Our method employs a generalized perturbative approach to constrained density functional theory, where exact expressions for the energy to second order in the constraints are obtained by analytic derivatives from coupled-perturbed theory. When the relative angle between magnetization vectors of metal atoms enters as a constraint, this allows us to calculate all the magnetic exchange couplings of a system from derivatives with respect to local spin directions from the high-spin configuration. Because of the favorable computational scaling of our method with respect to the number of spin-centers, as compared to the broken-symmetry energy-differences approach, this opens the possibility for the blackbox exploration of magnetic properties in large polynuclear transition-metal complexes. In this work we outline the motivation, theory, and implementation of this method, and present results for several model systems and transition-metal complexes with a variety of density functional approximations and Hartree-Fock.
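The broken-symmetry energy-differences approach that this derivative method is benchmarked against is itself a one-liner, commonly written in the Yamaguchi spin-projected form. The sketch below evaluates it for a hypothetical two-center S = 1/2 dimer; the energies and ⟨S²⟩ values are placeholders.

```python
# Broken-symmetry (Yamaguchi) estimate of a magnetic exchange coupling:
#   J = (E_BS - E_HS) / (<S^2>_HS - <S^2>_BS)   for H = -2J S_1.S_2.
# Energies (hartree) and <S^2> values below are hypothetical placeholders.
HARTREE_TO_CM1 = 219474.63

def yamaguchi_J(E_hs, E_bs, s2_hs, s2_bs):
    return (E_bs - E_hs) / (s2_hs - s2_bs)

# Hypothetical Cu(II) dimer: two S = 1/2 centers.
J = yamaguchi_J(E_hs=-3456.12345, E_bs=-3456.12412,
                s2_hs=2.005, s2_bs=1.005)
print(f"J = {J * HARTREE_TO_CM1:.1f} cm^-1")   # negative: antiferromagnetic
```

The abstract's point is that this route needs one broken-symmetry calculation per coupling, whereas the local-spin-derivative approach extracts all couplings from the single high-spin configuration.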
A source-controlled data center network model.
Yu, Yang; Liang, Mangui; Wang, Zhe
2017-01-01
The construction of data center networks using SDN technology has become a hot research topic. The SDN architecture innovatively separates the control plane from the data plane, which makes the network more software-oriented and agile. Moreover, it provides virtual multi-tenancy, effective resource scheduling, and centralized control strategies to meet the demands of the cloud computing data center. However, the explosion of network information poses severe challenges for the SDN controller. The flow storage and lookup mechanisms based on TCAM devices have led to restricted scalability, high cost, and high energy consumption. In view of this, a source-controlled data center network (SCDCN) model is proposed herein. The SCDCN model applies a new type of source-routing address, named the vector address (VA), as the packet-switching label. The VA completely defines the communication path, and the data forwarding process can be completed relying solely on the VA. There are four advantages to the SCDCN architecture. 1) The model adopts hierarchical multi-controllers and abstracts the large-scale data center network into small network domains, which removes the restriction imposed by the processing ability of a single controller and reduces the computational complexity. 2) The vector switches (VS) developed for the core network no longer rely on TCAM for table storage and lookup, which significantly cuts down the cost and complexity of the switches. Meanwhile, the problem of scalability can be solved effectively. 3) The SCDCN model simplifies the establishment process for new flows, and there is no need to download flow tables to the VS. The amount of control signaling consumed when establishing new flows can be significantly decreased. 4) We design the VS on the NetFPGA platform. The statistical results show that the hardware resource consumption in a VS is about 27% of that in an OFS.
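The source-routing idea behind the vector address is that the sender encodes the whole path and each switch merely consumes the next hop, with no flow-table lookup. The toy sketch below illustrates that mechanic with a plain Python list as the label; it is not the paper's actual VA encoding or switch design.

```python
# Toy source routing: the VA is an ordered list of output ports; every switch
# pops the head of the list and forwards. Illustration of the idea only.
from dataclasses import dataclass, field

@dataclass
class VectorSwitch:
    name: str
    ports: dict = field(default_factory=dict)   # port index -> next switch

    def forward(self, packet, va):
        if not va:                               # VA exhausted: final hop
            print(f"{self.name}: delivered {packet!r}")
            return
        port, rest = va[0], va[1:]               # consume one VA element
        print(f"{self.name}: out port {port}")
        self.ports[port].forward(packet, rest)

# Three-hop fabric: top-of-rack -> aggregation -> top-of-rack.
s3 = VectorSwitch("tor-B")
s2 = VectorSwitch("agg-1", {2: s3})
s1 = VectorSwitch("tor-A", {1: s2})

s1.forward("hello", va=[1, 2])                   # the path is fixed by the sender
```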
Energy Efficiency Feasibility Study and Resulting Plan for the Bay Mills Indian Community
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kushman, Chris
In 2011 the Inter-Tribal Council of Michigan, Inc. was awarded an Energy Efficiency Development and Deployment in Indian Country grant from the U.S. Department of Energy's Tribal Energy Program. This grant aimed to study select Bay Mills Indian Community community/government buildings to determine what is required to reduce each building's energy consumption by 30%. The Bay Mills Indian Community (BMIC) buildings with the largest expected energy use were selected for this study and included the Bay Mills Ellen Marshall Health Center building, the Bay Mills Indian Community Administration Building, the Bay Mills Community College main campus, the Bay Mills Charter School, and the Waishkey Community Center buildings. These five sites are the largest energy-consuming Community buildings and comprised the study area of this project, titled "Energy Efficiency Feasibility Study and Resulting Plan for the Bay Mills Indian Community". The end objective of this study, the plan, and the Tribe is to reduce energy consumption at the Community's most energy-intensive buildings, which will, in turn, reduce emissions at the source of energy production, reduce energy expenditures, create long-lasting energy-conscious practices, and positively affect the quality of the natural environment. This project's feasibility study and resulting plan are intended to act as a guide to the Community's first step toward planned energy management within its buildings/facilities. It aims to reduce energy consumption by 30% or more within the subject facilities, with an emphasis on energy conservation and efficiency. The energy audits and related power consumption analyses conducted for this study revealed numerous significant energy conservation and efficiency opportunities at all of the subject sites/buildings. In addition, many of the energy conservation measures require no cost and serve to help balance other measures requiring capital investment. Recurring deficiencies relating to heating, cooling, thermostat-setting inefficiencies, powering computers, lighting, items linked to weatherization, and numerous other items were encountered that can be mitigated with the energy conservation measures developed and specified during the course of this project.
Bridging the PSI Knowledge Gap: A Multi-Scale Approach
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wirth, Brian D.
2015-01-08
Plasma-surface interactions (PSI) pose an immense scientific hurdle in magnetic confinement fusion, and our present understanding of PSI in confinement environments is highly inadequate; indeed, a recent Fusion Energy Sciences Advisory Committee report found that 4 of the top 5 fusion knowledge gaps were related to PSI. The time is appropriate to develop a concentrated and synergistic science effort that would expand, exploit, and integrate the wealth of laboratory ion-beam and plasma research, as well as exciting new computational tools, toward the goal of bridging the PSI knowledge gap. This effort would broadly advance plasma and material sciences, while providing critical knowledge toward progress in fusion PSI. This project involves the development of a Science Center focused on a new approach to PSI science; an approach that exploits access both to state-of-the-art PSI experiments and modeling and to confinement devices. The organizing principle is to develop synergistic experimental and modeling tools that treat the truly coupled multi-scale aspect of the PSI issues in confinement devices. This is motivated by the simple observation that while typical lab experiments and models allow independent manipulation of controlling variables, the confinement PSI environment is essentially self-determined, with few outside controls. This means that processes that may be treated independently in laboratory experiments, because they involve vastly different physical and time scales, will now affect one another in the confinement environment. Also, lab experiments cannot simultaneously match all exposure conditions found in confinement devices, typically forcing a linear extrapolation of lab results. At the same time, programmatic limitations prevent confinement experiments alone from answering many key PSI questions. The resolution to this problem is to usefully exploit access to PSI science in lab devices, while retooling our thinking from a linear and de-coupled extrapolation to a multi-scale, coupled approach. The PSI Plasma Center consisted of three equal co-centers: one located at the MIT Plasma Science and Fusion Center, one at the UC San Diego Center for Energy Research, and one at the UC Berkeley Department of Nuclear Engineering, which moved to the University of Tennessee, Knoxville (UTK) with Professor Brian Wirth in July 2010. The Center had three co-directors: Prof. Dennis Whyte led the MIT co-center, the UCSD co-center was led by Dr. Russell Doerner, and Prof. Brian Wirth led the UCB/UTK center. The directors have extensive experience in PSI and material research and have been internationally recognized in the magnetic fusion, materials, and plasma research fields. The co-centers feature keystone PSI experimental and modeling facilities dedicated to PSI science: the DIONISOS/CLASS facility at MIT, the PISCES facility at UCSD, and the state-of-the-art numerical modeling capabilities at UCB/UTK. A collaborative partner in the center is Sandia National Laboratory at Livermore (SNL/CA), which has extensive capabilities with low-energy ion beams and surface diagnostics, as well as supporting plasma facilities, including the Tritium Plasma Experiment, all of which significantly augment the Center. Interpretive, continuum material models are available through SNL/CA, UCSD, and MIT. The participating institutions of MIT, UCSD, UCB/UTK, SNL/CA, and LLNL brought a formidable array of experimental tools and personnel abilities into the PSI Plasma Center.
Our work has focused on modeling activities associated with the plasma-surface interactions involved in the effects of He and H plasma bombardment on tungsten surfaces. This involved computational material modeling of the surface evolution during plasma bombardment using molecular dynamics. The principal outcomes of the research efforts within the combined experimental-modeling PSI center are a knowledge base of the mechanisms of surface degradation and of the influence of the surface on plasma conditions.
Staniszewska, Magdalena; Kupfer, Stephan; Guthmuller, Julien
2018-05-16
Time-dependent density functional theory calculations combined with the Marcus theory of electron transfer (ET) were applied to the molecular photocatalyst [(tbbpy)2Ru(tpphz)PdCl2]2+ in order to elucidate the light-induced relaxation pathways populated upon excitation in the longer-wavelength range of its absorption spectrum. The computational results show that after the initial excitation, metal (Ru) to ligand (tpphz) charge transfer (MLCT) triplet states are energetically accessible, but that an ET toward the catalytic center (PdCl2) from these states is a slow process, with estimated time constants above 1 ns. Instead, the calculations predict that low-lying Pd-centered states are efficiently populated, associated with an energy transfer toward the catalytic center. Thus, it is postulated that these states lead to the dissociation of a Cl⁻ and are consequently responsible for the experimentally observed degradation of the catalytic center. Following dissociation, it is shown that the ET rates from the MLCT states to the charge-separated states are significantly increased (i.e., 10^5-10^6 times larger). This demonstrates that alteration of the catalytic center generates efficient charge separation. © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Burns, Peter; Lenzen, Meehan
"Energy Frontier Research Center Materials Science of Actinides" was submitted by the EFRC for Materials Science of Actinides (MSA) to the "Life at the Frontiers of Energy Research" video contest at the 2011 Science for Our Nation's Energy Future: Energy Frontier Research Centers (EFRCs) Summit and Forum. Twenty-six EFRCs created short videos to highlight their mission and their work. MSA is directed by Peter Burns at the University of Notre Dame, and is a partnership of scientists from ten institutions.The Office of Basic Energy Sciences in the U.S. Department of Energy's Office of Science established the 46 Energy Frontier Researchmore » Centers (EFRCs) in 2009. These collaboratively-organized centers conduct fundamental research focused on 'grand challenges' and use-inspired 'basic research needs' recently identified in major strategic planning efforts by the scientific community. The overall purpose is to accelerate scientific progress toward meeting the nation's critical energy challenges.« less
Federal Register 2010, 2011, 2012, 2013, 2014
2013-05-14
... DEPARTMENT OF ENERGY Federal Energy Regulatory Commission [Docket No. ER13-1348-000] Gainesville Renewable Energy Center, LLC; Supplemental Notice That Initial Market-Based Rate Filing Includes Request for... Gainesville Renewable Energy Center, LLC's application for market-based rate authority, with an accompanying...
75 FR 68607 - CenterPoint Energy-Illinois Gas Transmission Company; Notice of Baseline Filing
Federal Register 2010, 2011, 2012, 2013, 2014
2010-11-08
... DEPARTMENT OF ENERGY Federal Energy Regulatory Commission [Docket No. PR10-80-001] CenterPoint Energy--Illinois Gas Transmission Company; Notice of Baseline Filing November 1, 2010. Take notice that on October 28, 2010, CenterPoint Energy--Illinois Gas Transmission Company submitted a revised...
76 FR 31602 - Combined Notice of Filings #2
Federal Register 2010, 2011, 2012, 2013, 2014
2011-06-01
.... Applicants: Hatch Solar Energy Center I, LLC. Description: Hatch Solar Energy Center I, LLC submits tariff filing per 35.12: Hatch Solar Energy Center I, LLC MBR Application to be effective 5/26/2011. Filed Date... DEPARTMENT OF ENERGY Federal Energy Regulatory Commission Combined Notice of Filings 2 Take notice...
77 FR 62499 - Leaf River Energy Center LLC; Notice of Application
Federal Register 2010, 2011, 2012, 2013, 2014
2012-10-15
... DEPARTMENT OF ENERGY Federal Energy Regulatory Commission [Docket No. CP12-526-000] Leaf River Energy Center LLC; Notice of Application Take notice that on September 24, 2012, Leaf River Energy Center LLC (Leaf River), 53 Riverside Avenue, Westport, Connecticut, 06880, filed an application in Docket No...
NASA Astrophysics Data System (ADS)
King, Jacob; Kruger, Scott
2017-10-01
Flow can impact the stability and nonlinear evolution of a range of instabilities (e.g., RWMs, NTMs, sawteeth, locked modes, PBMs, and high-k turbulence), and thus robust numerical algorithms for simulations with flow are essential. Recent simulations of DIII-D QH-mode [King et al., Phys. Plasmas and Nucl. Fusion 2017] with flow have been restricted to smaller time-step sizes than corresponding computations without flow. These computations use a mixed semi-implicit, implicit leapfrog time discretization as implemented in the NIMROD code [Sovinec et al., JCP 2004]. While prior analysis has shown that this algorithm is unconditionally stable with respect to the effect of large flows on the MHD waves in slab geometry [Sovinec et al., JCP 2010], our present Von Neumann stability analysis shows that a flow-induced numerical instability may arise when ad hoc cylindrical curvature is included. Computations with the NIMROD code in cylindrical geometry with rigid rotation and without free-energy drive from current or pressure gradients qualitatively confirm this analysis. We explore potential methods to circumvent this flow-induced numerical instability, such as using a semi-Lagrangian formulation instead of time-centered implicit advection and/or modifying the semi-implicit operator. This work is supported by the DOE Office of Science (Office of Fusion Energy Sciences).
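To make the style of analysis concrete, a Von Neumann check scans Fourier modes and asks whether any amplification factor exceeds unity. The toy below treats time-centered (Crank-Nicolson) implicit advection of a scalar in 1-D slab geometry, not the NIMROD discretization; the curvature coupling responsible for the instability discussed above lies outside this simplified model.

```python
import numpy as np

def amplification_factor(courant, theta):
    """g(theta) for Crank-Nicolson advection of u_t + v u_x = 0 with a
    centered spatial difference; theta = k * dx is the Fourier mode angle."""
    a = 0.5j * courant * np.sin(theta)
    return (1.0 - a) / (1.0 + a)

thetas = np.linspace(0.0, np.pi, 181)
for courant in (0.5, 2.0, 10.0):
    g_max = np.abs(amplification_factor(courant, thetas)).max()
    # |g| stays at 1 for every Courant number: unconditional stability in slab.
    print(f"Courant {courant:5.1f}: max |g| = {g_max:.6f}")
```

Adding extra coupling terms to the update matrix (as curvature does in the cylindrical case) can push some |g(theta)| above 1, which is the flow-induced numerical instability the abstract describes.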
Pegis, Michael L.; McKeown, Bradley A.; Kumar, Neeraj; ...
2016-10-28
Improvement of electrocatalysts for the oxygen reduction reaction (ORR) is critical for the advancement of fuel cell technologies. Herein, we report a series of eleven soluble iron porphyrin ORR electrocatalysts that possess turnover frequencies (TOFs) from 3 s^-1 to an unprecedented 2.2 x 10^6 s^-1. These TOFs correlate with the ORR overpotential, which can be changed by modulating the ancillary ligand, by varying the reaction conditions, or by changing the catalyst's protonation state. This is the first such correlation for homogeneous ORR electrocatalysis, and it demonstrates that the remarkably fast TOFs are a consequence of the high overpotential. Computational studies indicate that the correlation is analogous to the volcano plot analysis developed for heterogeneous ORR materials. This unique parallel between homo- and heterogeneous ORR electrocatalysts allows a fundamental understanding of intrinsic barriers associated with the ORR, which can aid the design of new catalytic systems that operate at low overpotential. This research was supported as part of the Center for Molecular Electrocatalysis, an Energy Frontier Research Center funded by the U.S. Department of Energy (DOE), Office of Science, Office of Basic Energy Sciences. Additional data are given in the Electronic Supporting Information.
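The reported TOF-overpotential correlation is a log-linear scaling, which a simple fit makes visible. The numbers below are placeholders spanning the quoted TOF range, not the paper's dataset.

```python
import numpy as np

overpotential = np.array([0.20, 0.35, 0.50, 0.65, 0.80])   # V (hypothetical)
tof = np.array([3.0, 4.2e2, 9.0e3, 1.1e5, 2.2e6])          # 1/s (hypothetical)

slope, intercept = np.polyfit(overpotential, np.log10(tof), 1)
print(f"log10(TOF) ~ {slope:.1f} * eta + {intercept:.1f}")
# A large positive slope quantifies the trade-off the abstract describes:
# the fastest catalysts buy their speed with the highest overpotential.
```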
Optimal Time-Resource Allocation for Energy-Efficient Physical Activity Detection
Thatte, Gautam; Li, Ming; Lee, Sangwon; Emken, B. Adar; Annavaram, Murali; Narayanan, Shrikanth; Spruijt-Metz, Donna; Mitra, Urbashi
2011-01-01
The optimal allocation of samples for physical activity detection in a wireless body area network for health monitoring is considered. The number of biometric samples collected at the mobile device fusion center, from both device-internal and external Bluetooth heterogeneous sensors, is optimized to minimize the transmission power for a fixed number of samples and to meet a performance requirement defined using the probability of misclassification between multiple hypotheses. A filter-based feature selection method determines an optimal feature set for classification, and a correlated Gaussian model is considered. Using experimental data from overweight adolescent subjects, it is found that allocating a greater proportion of samples to sensors which better discriminate between certain activity levels can result in either a lower probability of error or energy savings of 18% to 22%, in comparison to equal allocation of samples. The current activity of the subjects and the performance requirements do not significantly affect the optimal allocation, but employing personalized models results in improved energy efficiency. As the number of samples is an integer, an exhaustive search to determine the optimal allocation is typical, but computationally expensive. To this end, an alternative, continuous-valued vector optimization is derived which yields approximately optimal allocations and can be implemented on the mobile fusion center due to its significantly lower complexity. PMID:21796237
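The continuous-relaxation idea can be sketched directly: treat the per-sensor sample counts as real numbers, optimize, then round. The objective and constraint below are a hypothetical stand-in for the paper's power/misclassification formulation, with made-up per-sample costs and discriminability weights.

```python
import numpy as np
from scipy.optimize import minimize

e = np.array([1.0, 2.5, 1.8])    # per-sample transmission cost per sensor (assumed)
d = np.array([4.0, 1.0, 2.0])    # discriminability weight per sensor (assumed)
N, eps = 60.0, 0.4               # sample budget and error-proxy bound (assumed)

def energy(n):
    """Total transmission cost of a (real-valued) allocation vector n."""
    return float(e @ n)

constraints = [
    {"type": "eq",   "fun": lambda n: n.sum() - N},          # fixed sample budget
    {"type": "ineq", "fun": lambda n: eps - (d / n).sum()},  # misclassification proxy
]
res = minimize(energy, x0=np.full(3, N / 3),
               bounds=[(1.0, N)] * 3, constraints=constraints)
print("continuous optimum:", np.round(res.x, 1),
      "-> integer allocation:", np.round(res.x).astype(int))
```

Rounding the continuous solution back to integers is what makes this cheap enough to run on the mobile fusion center, at the cost of being only approximately optimal.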
NASA Astrophysics Data System (ADS)
Ryzhikov, Vladimir D.; Burachas, S. F.; Volkov, V. G.; Danshin, Evgeniy A.; Lisetskaya, Elena K.; Piven, L. A.; Svishch, Vladimir M.; Chernikov, Vyacheslav V.; Filimonov, A. E.
1997-02-01
After the Chernobyl catastrophe, one problem of current concern is the detection of 'hot' particles formed from plutonium alloys with carbon, nitrogen, silicon, etc. For this purpose, instruments are needed that can detect not only alpha particles and low-energy gamma radiation, but also neutrons and high-energy gamma quanta from (alpha, n-gamma) reactions. At present, a different type of detector is used for each kind of radiation. A general drawback of all these instruments is their narrow dynamic range of dose rates and energies, and especially the impossibility of registering a neutron flux in the presence of a large gamma-ray background, which makes each of them applicable only under certain specific conditions. For detection of 'hot' particles, oxide and semiconductor scintillators were used which contain elements with large capture cross sections for thermal neutrons. In this paper we try to determine the possibilities and limitations of solid-state neutron detectors based on CdS(Te), ZnSe(Te), CdWO4 (CWO), and Gd2SiO5 (GSO) scintillators developed and produced by the Science and Technology Center for Radiation Instruments of the Institute for Single Crystals. The instruments developed by the Center are based preferably on a very promising 'scintillator-photodiode-preamplifier' system matched with modern computer data-processing techniques.
Evolution of the Virtualized HPC Infrastructure of Novosibirsk Scientific Center
NASA Astrophysics Data System (ADS)
Adakin, A.; Anisenkov, A.; Belov, S.; Chubarov, D.; Kalyuzhny, V.; Kaplin, V.; Korol, A.; Kuchin, N.; Lomakin, S.; Nikultsev, V.; Skovpen, K.; Sukharev, A.; Zaytsev, A.
2012-12-01
Novosibirsk Scientific Center (NSC), also known worldwide as Akademgorodok, is one of the largest Russian scientific centers, hosting Novosibirsk State University (NSU) and more than 35 research organizations of the Siberian Branch of the Russian Academy of Sciences, including Budker Institute of Nuclear Physics (BINP), Institute of Computational Technologies, and Institute of Computational Mathematics and Mathematical Geophysics (ICM&MG). Since each institute has specific requirements on the architecture of the computing farms involved in its research field, several computing facilities are currently hosted by NSC institutes, each optimized for a particular set of tasks; the largest are the NSU Supercomputer Center, the Siberian Supercomputer Center (ICM&MG), and the Grid Computing Facility of BINP. A dedicated optical network with an initial bandwidth of 10 Gb/s connecting these three facilities was built in order to make it possible to share the computing resources among the research communities, thus increasing the efficiency of operating the existing computing facilities and offering a common platform for building the computing infrastructure for future scientific projects. Unification of the computing infrastructure is achieved by extensive use of virtualization technology based on the XEN and KVM platforms. This contribution gives a thorough review of the present status and future development prospects for the NSC virtualized computing infrastructure and the experience gained while using it for running production data analysis jobs related to HEP experiments being carried out at BINP, especially the KEDR detector experiment at the VEPP-4M electron-positron collider.
DOE Office of Scientific and Technical Information (OSTI.GOV)
De, K; Jha, S; Klimentov, A
2016-01-01
The Large Hadron Collider (LHC), operating at the international CERN Laboratory in Geneva, Switzerland, is leading Big Data driven scientific explorations. Experiments at the LHC explore the fundamental nature of matter and the basic forces that shape our universe, and were recently credited for the discovery of a Higgs boson. ATLAS, one of the largest collaborations ever assembled in the sciences, is at the forefront of research at the LHC. To address an unprecedented multi-petabyte data processing challenge, the ATLAS experiment relies on a heterogeneous distributed computational infrastructure. The ATLAS experiment uses the PanDA (Production and Data Analysis) Workload Management System for managing the workflow for all data processing on over 150 data centers. Through PanDA, ATLAS physicists see a single computing facility that enables rapid scientific breakthroughs for the experiment, even though the data centers are physically scattered all over the world. While PanDA currently uses more than 250,000 cores with a peak performance of 0.3 petaFLOPS, LHC data-taking runs require more resources than Grid computing can possibly provide. To alleviate these challenges, LHC experiments are engaged in an ambitious program to expand the current computing model to include additional resources such as the opportunistic use of supercomputers. We will describe a project aimed at the integration of PanDA WMS with supercomputers in the United States, Europe, and Russia (in particular with the Titan supercomputer at the Oak Ridge Leadership Computing Facility (OLCF), the MIRA supercomputer at the Argonne Leadership Computing Facility (ALCF), the supercomputer at the National Research Center Kurchatov Institute, IT4 in Ostrava, and others). The current approach utilizes a modified PanDA pilot framework for job submission to the supercomputers' batch queues and local data management, with lightweight MPI wrappers to run single-threaded workloads in parallel on the LCFs' multi-core worker nodes. This implementation was tested with a variety of Monte Carlo workloads on several supercomputing platforms for the ALICE and ATLAS experiments and has been in full production for the ATLAS experiment since September 2015. We will present our current accomplishments with running PanDA WMS at supercomputers and demonstrate our ability to use PanDA as a portal independent of the computing facilities infrastructure for High Energy and Nuclear Physics as well as other data-intensive science applications, such as bioinformatics and astroparticle physics.
An investigation of the effects of touchpad location within a notebook computer.
Kelaher, D; Nay, T; Lawrence, B; Lamar, S; Sommerich, C M
2001-02-01
This study evaluated effects of the location of a notebook computer's integrated touchpad, complementing previous work in the area of desktop mouse location effects. Most often, integrated touchpads are located in the computer's wrist rest, centered on the keyboard. This study characterized effects of this bottom center location and four alternatives (top center, top right, right side, and bottom right) upon upper extremity posture, discomfort, preference, and performance. Touchpad location was found to significantly impact each of those measures. The top center location was particularly poor, in that it elicited more ulnar deviation, more shoulder flexion, more discomfort, and perceptions of performance impedance. In general, the bottom center, bottom right, and right side locations fared better, though subjects' wrists were more extended in the bottom locations. Suggestions for notebook computer design are provided.
An efficient HZETRN (a galactic cosmic ray transport code)
NASA Technical Reports Server (NTRS)
Shinn, Judy L.; Wilson, John W.
1992-01-01
An accurate and efficient engineering code for analyzing the shielding requirements against high-energy galactic heavy ions is needed. HZETRN is a deterministic code developed at Langley Research Center that is constantly under improvement in both physics and numerical computation and is targeted for such use. One problem area connected with the space-marching technique used in this code is the propagation of the local truncation error. By improving the numerical algorithms for interpolation, integration, and the grid distribution formula, the efficiency of the code is increased by a factor of eight as the number of energy grid points is reduced. A numerical accuracy of better than 2 percent for a shield thickness of 150 g/cm^2 is found when a 45-point energy grid is used. The propagating step size, which is related to the perturbation theory, is also reevaluated.
Determination of crash test pulses and their application to aircraft seat analysis
NASA Technical Reports Server (NTRS)
Alfaro-Bou, E.; Williams, M. S.; Fasanella, E. L.
1981-01-01
Deceleration time histories (crash pulses) from a series of twelve light aircraft crash tests conducted at NASA Langley Research Center (LaRC) were analyzed to provide data for seat and airframe design for crashworthiness. Two vertical drop tests at 12.8 m/s (42 ft/s) and 36 G peak deceleration (simulating one of the vertical light aircraft crash pulses) were made using an energy-absorbing light aircraft seat prototype. Vertical pelvis accelerations measured in a 50th-percentile dummy in the energy-absorbing seat were found to be 45% lower than those obtained from the same dummy in a typical light aircraft seat. A hybrid mathematical seat-occupant model was developed using the DYCAST nonlinear finite element computer code and was used to analyze a vertical drop test of the energy-absorbing seat. Seat and occupant accelerations predicted by the DYCAST model compared quite favorably with experimental values.
Computed potential energy surfaces for chemical reactions
NASA Technical Reports Server (NTRS)
Walch, Stephen P.
1990-01-01
The objective was to obtain accurate potential energy surfaces (PESs) for a number of reactions which are important in the H/N/O combustion process. The interest in this is centered around the design of the scramjet engine for the National Aerospace Plane (NASP), which was envisioned as an air-breathing hydrogen-burning vehicle capable of reaching velocities as large as Mach 25. Preliminary studies indicated that the supersonic flow in the combustor region of the scramjet engine required accurate reaction rate data for reactions in the H/N/O system, some of which were not readily available from experiment. The most important class of combustion reactions from the standpoint of the NASP project are radical recombination reactions, since these reactions result in most of the heat release in the combustion process. Theoretical characterizations of the potential energy surfaces for these reactions are presented and discussed.
DOE Office of Scientific and Technical Information (OSTI.GOV)
McCallen, R; Salari, K; Ortega, J
2003-05-01
A Working Group Meeting on Heavy Vehicle Aerodynamic Drag was held at Lawrence Livermore National Laboratory on May 29-30, 2003. The purpose of the meeting was to present and discuss suggested guidance and direction for the design of drag reduction devices determined from experimental and computational studies. Representatives from the Department of Energy (DOE)/Office of Energy Efficiency and Renewable Energy/Office of FreedomCAR & Vehicle Technologies, Lawrence Livermore National Laboratory (LLNL), Sandia National Laboratories (SNL), NASA Ames Research Center (NASA), University of Southern California (USC), California Institute of Technology (Caltech), Georgia Tech Research Institute (GTRI), Argonne National Laboratory (ANL), Clarkson University, and PACCAR participated in the meeting. This report contains the technical presentations (viewgraphs) delivered at the Meeting, briefly summarizes the comments and conclusions, provides some highlighted items, and outlines the future action items.
FY 72 Computer Utilization at the Transportation Systems Center
DOT National Transportation Integrated Search
1972-08-01
The Transportation Systems Center currently employs a medley of on-site and off-site computer systems to obtain the computational support it requires. Examination of the monthly User Accountability Reports for FY72 indicated that during the fiscal ye...
Clean Energy Solutions Center Services (Fact Sheet) (Spanish-language version)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
2014-05-01
This is the Spanish translation of the Clean Energy Solutions Center Services fact sheet. The Clean Energy Solutions Center (Solutions Center) helps governments, advisors and analysts create policies and programs that advance the deployment of clean energy technologies. The Solutions Center partners with international organizations to provide online training, expert assistance, and technical resources on clean energy policy.
Federal Register 2010, 2011, 2012, 2013, 2014
2013-09-05
... Environmental Impact Statement for the Proposed RES Americas Moapa Solar Energy Center, Clark County, Nevada... environmental impact statement (DEIS) for the proposed RES Americas Moapa Solar Energy Center on the Moapa River... Progress and on the following Web site: www.MoapaSolarEnergyCenterEIS.com . In order to be fully considered...
Bowers, John (Director, Center for Energy Efficient Materials); CEEM Staff
2017-12-09
'Undergraduate Research at the Center for Energy Efficient Materials (CEEM)' was submitted by CEEM to the 'Life at the Frontiers of Energy Research' video contest at the 2011 Science for Our Nation's Energy Future: Energy Frontier Research Centers (EFRCs) Summit and Forum. Twenty-six EFRCs created short videos to highlight their mission and their work. CEEM, an EFRC directed by John Bowers at the University of California, Santa Barbara is a partnership of scientists from four institutions: UC, Santa Barbara (lead), UC, Santa Cruz, Los Alamos National Laboratory, and National Renewable Energy Laboratory. The Office of Basic Energy Sciences in the U.S. Department of Energy's Office of Science established the 46 Energy Frontier Research Centers (EFRCs) in 2009. These collaboratively-organized centers conduct fundamental research focused on 'grand challenges' and use-inspired 'basic research needs' recently identified in major strategic planning efforts by the scientific community. The overall purpose is to accelerate scientific progress toward meeting the nation's critical energy challenges. The mission of the Center for Energy Efficient Materials is 'to discover and develop materials that control the interactions between light, electricity, and heat at the nanoscale for improved solar energy conversion, solid-state lighting, and conversion of heat into electricity.' Research topics are: solar photovoltaic, photonic, solid state lighting, optics, thermoelectric, bio-inspired, electrical energy storage, batteries, battery electrodes, novel materials synthesis, and scalable processing.
Kevin Regimbal oversees NREL's High Performance Computing (HPC) Systems & Operations, engineering, and operations. Kevin is interested in data center design and computing as well as data center integration and optimization. Professional Experience: HPC oversight: program manager, project manager, center
Abruna, Hector D. (Director, Energy Materials Center at Cornell); emc2 Staff
2017-12-09
'Electricity: the Energy of Tomorrow' was submitted by the Energy Materials Center at Cornell (emc2) to the 'Life at the Frontiers of Energy Research' video contest at the 2011 Science for Our Nation's Energy Future: Energy Frontier Research Centers (EFRCs) Summit and Forum. Twenty-six EFRCs created short videos to highlight their mission and their work. emc2, an EFRC directed by Hector D. Abruna at Cornell University (lead) is a partnership between Cornell and Lawrence Berkeley National Laboratory. The Office of Basic Energy Sciences in the U.S. Department of Energy's Office of Science established the 46 Energy Frontier Research Centers (EFRCs) in 2009. These collaboratively-organized centers conduct fundamental research focused on 'grand challenges' and use-inspired 'basic research needs' recently identified in major strategic planning efforts by the scientific community. The overall purpose is to accelerate scientific progress toward meeting the nation's critical energy challenges.
Energy Frontier Research With ATLAS: Final Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Butler, John; Black, Kevin; Ahlen, Steve
2016-06-14
The Boston University (BU) group is playing key roles across the ATLAS experiment: in detector operations, the online trigger, the upgrade, computing, and physics analysis. Our team has been critical to the maintenance and operations of the muon system since its installation. During Run 1 we led the muon trigger group, and that responsibility continues into Run 2. BU maintains and operates the ATLAS Northeast Tier 2 computing center. We are actively engaged in the analysis of ATLAS data from Run 1 and Run 2. Physics analyses we have contributed to include Standard Model measurements (W and Z cross sections, ttbar differential cross sections, WWW* production), evidence for the Higgs decaying to tau+tau-, and searches for new phenomena (technicolor, Z' and W', vector-like quarks, dark matter).
DSMC Computations for Regions of Shock/Shock and Shock/Boundary Layer Interaction
NASA Technical Reports Server (NTRS)
Moss, James N.
2001-01-01
This paper presents the results of a numerical study of hypersonic interacting flows at flow conditions that include those for which experiments have been conducted in the Calspan-University of Buffalo Research Center (CUBRC) Large Energy National Shock (LENS) tunnel and the ONERA R5Ch low-density wind tunnel. The computations are made with the direct simulation Monte Carlo (DSMC) method of Bird. The focus is on Mach 9.3 to 11.4 flows about flared axisymmetric configurations, both hollow cylinder flares and double cones. The results presented highlight the sensitivity of the calculations to grid resolution, provide results concerning the conditions for incipient separation, and provide information concerning the flow structure and surface results for the extent of separation, heating, pressure, and skin friction.
Data Reprocessing on Worldwide Distributed Systems
NASA Astrophysics Data System (ADS)
Wicke, Daniel
The DØ experiment faces many challenges in terms of enabling access to large datasets for physicists on four continents. The strategy for solving these problems on worldwide distributed computing clusters is presented. Since the beginning of Run II of the Tevatron (March 2001), all Monte Carlo simulations for the experiment have been produced at remote systems. For data analysis, a system of regional analysis centers (RACs) was established which supplies the associated institutes with data. This structure, which is similar to the tiered structure foreseen for the LHC, was used in Fall 2003 to reprocess all DØ data with a much improved version of the reconstruction software. This makes DØ the first running experiment to have implemented and operated all important computing tasks of a high energy physics experiment on systems distributed worldwide.
Opportunities and choice in a new vector era
NASA Astrophysics Data System (ADS)
Nowak, A.
2014-06-01
This work discusses the significant changes in computing landscape related to the progression of Moore's Law, and the implications on scientific computing. Particular attention is devoted to the High Energy Physics domain (HEP), which has always made good use of threading, but levels of parallelism closer to the hardware were often left underutilized. Findings of the CERN openlab Platform Competence Center are reported in the context of expanding "performance dimensions", and especially the resurgence of vectors. These suggest that data oriented designs are feasible in HEP and have considerable potential for performance improvements on multiple levels, but will rarely trump algorithmic enhancements. Finally, an analysis of upcoming hardware and software technologies identifies heterogeneity as a major challenge for software, which will require more emphasis on scalable, efficient design.
VizieR Online Data Catalog: X-Ray source properties for NGC 2207/IC 2163 (Mineo+, 2014)
NASA Astrophysics Data System (ADS)
Mineo, S.; Rappaport, S.; Levine, A.; Pooley, D.; Steinhorn, B.; Homan, J.
2017-08-01
We analyzed four Chandra ACIS-S observations of the galaxy pair NGC 2207/IC 2163. The data reduction was done following the standard CIAO threads (CIAO version 4.6, CALDB version 4.5.9) for soft (0.5-2 keV), hard (2-8 keV), and broad (0.5-8.0 keV) energy bands. All Chandra data sets were reprocessed using chandra_repro, a script that automates the recommended data-processing steps presented in the CIAO analysis threads. Using the script fluximage, we computed a monochromatic exposure map for the mean photon energy of each band: 1.25 keV, 5.0 keV, and 4.25 keV for the soft, hard, and broad band, respectively. fluximage outputs both the instrument map for the center of each energy band using the tool mkinstmap and the exposure maps in sky coordinates for each energy band using mkexpmap. (5 data files).
NASA Astrophysics Data System (ADS)
Liao, Bi-Tao; Mei, Yang; Chen, Bo-Wei; Zheng, Wen-Chen
2017-07-01
The optical bands and EPR (or spin-Hamiltonian) parameters (g factors g∥ and g⊥ and zero-field splitting D) for Mn4+ ions at the trigonal octahedral Ti4+ site of MgTiO3 crystal are uniformly computed by means of the complete diagonalization (of energy matrix) method based on the two-spin-orbit-parameter model, in which, besides the effects of the spin-orbit parameter of the central dn ion on the spectral data (as in classical crystal field theory), those of the ligands are also included. The computed eight optical and EPR spectral data, obtained with four suitable adjustable parameters (note: unlike those in the previous work within the cubic symmetry approximation, where the Racah parameters used violate the nephelauxetic effect, the present Racah parameters obey the effect and hence are suitable), are rationally consistent with the experimental values. In particular, the calculated ground-state splitting 2D, the first excited-state splitting ΔE(2E), and the g anisotropy Δg (= g∥ - g⊥), which depend strongly on the angular distortion of d3 centers, are in excellent agreement with the observed values, suggesting that the angular distortions caused by the impurity-induced local lattice relaxation obtained from the above calculation for the trigonal Mn4+ impurity center in MgTiO3:Mn4+ crystal are acceptable.
DOE Office of Scientific and Technical Information (OSTI.GOV)
None
The BioEnergy Science Center, led by Oak Ridge National Laboratory, has been making advances in biofuels for over a decade. These achievements in plant genomics, microbial engineering, biochemistry, and plant physiology will carry over into the Center for Bioenergy Innovation, a new Department of Energy bioenergy research center.
Nielsen, Jens E.; Gunner, M. R.; Bertrand García-Moreno, E.
2012-01-01
The pKa Cooperative http://www.pkacoop.org was organized to advance development of accurate and useful computational methods for structure-based calculation of pKa values and electrostatic energy in proteins. The Cooperative brings together laboratories with expertise and interest in theoretical, computational and experimental studies of protein electrostatics. To improve structure-based energy calculations it is necessary to better understand the physical character and molecular determinants of electrostatic effects. The Cooperative thus intends to foment experimental research into fundamental aspects of proteins that depend on electrostatic interactions. It will maintain a depository for experimental data useful for critical assessment of methods for structure-based electrostatics calculations. To help guide the development of computational methods the Cooperative will organize blind prediction exercises. As a first step, computational laboratories were invited to reproduce an unpublished set of experimental pKa values of acidic and basic residues introduced in the interior of staphylococcal nuclease by site-directed mutagenesis. The pKa values of these groups are unique and challenging to simulate owing to the large magnitude of their shifts relative to normal pKa values in water. Many computational methods were tested in this 1st Blind Prediction Challenge and critical assessment exercise. A workshop was organized in the Telluride Science Research Center to assess objectively the performance of many computational methods tested on this one extensive dataset. This volume of PROTEINS: Structure, Function, and Bioinformatics introduces the pKa Cooperative, presents reports submitted by participants in the blind prediction challenge, and highlights some of the problems in structure-based calculations identified during this exercise. PMID:22002877
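A minimal sketch of the kind of critical assessment applied to such blind predictions: compare each method's predicted pKa values against the experimental set and report summary errors. The numbers below are placeholders, not values from the staphylococcal nuclease dataset.

```python
import numpy as np

experimental = np.array([9.4, 5.2, 7.1, 10.3])   # hypothetical measured pKa values
predicted    = np.array([8.1, 6.0, 7.9, 12.0])   # hypothetical method output

errors = predicted - experimental
rmse = np.sqrt(np.mean(errors ** 2))
print(f"RMSE = {rmse:.2f} pKa units; max |error| = {np.abs(errors).max():.2f}")
```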
The role of dedicated data computing centers in the age of cloud computing
NASA Astrophysics Data System (ADS)
Caramarcu, Costin; Hollowell, Christopher; Strecker-Kellogg, William; Wong, Antonio; Zaytsev, Alexandr
2017-10-01
Brookhaven National Laboratory (BNL) anticipates significant growth in scientific programs with large computing and data storage needs in the near future and has recently reorganized support for scientific computing to meet these needs. A key component is the enhanced role of the RHIC-ATLAS Computing Facility (RACF) in support of high-throughput and high-performance computing (HTC and HPC) at BNL. This presentation discusses the evolving role of the RACF at BNL, in light of its growing portfolio of responsibilities and its increasing integration with cloud (academic and for-profit) computing activities. We also discuss BNL’s plan to build a new computing center to support the new responsibilities of the RACF and present a summary of the cost benefit analysis done, including the types of computing activities that benefit most from a local data center vs. cloud computing. This analysis is partly based on an updated cost comparison of Amazon EC2 computing services and the RACF, which was originally conducted in 2012.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Green, Peter F.
"Heart of the Solution- Energy Frontiers" was submitted by the Center for Solar and Thermal Energy Conversion (CSTEC) to the "Life at the Frontiers of Energy Research" video contest at the 2011 Science for Our Nation's Energy Future: Energy Frontier Research Centers (EFRCs) Summit and Forum. Twenty-six EFRCs created short videos to highlight their mission and their work. This video was both the People's Choice Award winner and selected as one of five winners by a distinguished panel of judges for its "exemplary explanation of the role of an Energy Frontier Research Center". The Center for Solar and Thermal Energymore » Conversion is directed by Peter F. Green at the University of Michigan. The Office of Basic Energy Sciences in the U.S. Department of Energy's Office of Science established the 46 Energy Frontier Research Centers (EFRCs) in 2009. These collaboratively-organized centers conduct fundamental research focused on 'grand challenges' and use-inspired 'basic research needs' recently identified in major strategic planning efforts by the scientific community. The overall purpose is to accelerate scientific progress toward meeting the nation's critical energy challenges. The mission of the Center for Solar and Thermal Energy Conversion is 'to study complex material structures on the nanoscale to identify key features for their potential use as materials to convert solar energy and heat to electricity.' Research topics are: solar photovoltaic, photonic, optics, solar thermal, thermoelectric, phonons, thermal conductivity, solar electrodes, defects, ultrafast physics, interfacial characterization, matter by design, novel materials synthesis, charge transport, and self-assembly.« less
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhu, Xiaoyang
"EFRC: CST at the University of Texas at Austin- A DOE Energy Frontier Research Center" was submitted by the EFRC for Understanding Charge Separation and Transfer at Interfaces in Energy Materials (EFRC:CST) to the "Life at the Frontiers of Energy Research" video contest at the 2011 Science for Our Nation's Energy Future: Energy Frontier Research Centers (EFRCs) Summit and Forum. Twenty-six EFRCs created short videos to highlight their mission and their work. EFRC: CST is directed by Xiaoyang Zhu at the University of Texas at Austin in partnership with Sandia National Laboratories. The Office of Basic Energy Sciences in themore » U.S. Department of Energy's Office of Science established the 46 Energy Frontier Research Centers (EFRCs) in 2009. These collaboratively-organized centers conduct fundamental research focused on 'grand challenges' and use-inspired 'basic research needs' recently identified in major strategic planning efforts by the scientific community. The overall purpose is to accelerate scientific progress toward meeting the nation's critical energy challenges.« less
Federal Register 2010, 2011, 2012, 2013, 2014
2011-06-03
... DEPARTMENT OF ENERGY Federal Energy Regulatory Commission [Docket No. ER11-3635-000] Hatch Solar Energy Center 1, LLC; Supplemental Notice That Initial Market-Based Rate Filing Includes Request for... Hatch Solar Energy Center 1, LLC's application for market-based rate authority, with an accompanying...
76 FR 13171 - Leaf River Energy Center LLC; Notice of Application
Federal Register 2010, 2011, 2012, 2013, 2014
2011-03-10
... DEPARTMENT OF ENERGY Federal Energy Regulatory Commission [Docket No. CP11-107-000] Leaf River Energy Center LLC; Notice of Application On February 25, 2011, Leaf River Energy Center LLC (Leaf River... Docket No. CP08-8-000 to authorize Leaf River to relocate and construct two of its certificated and not...
DSMC simulations of shock interactions about sharp double cones
NASA Astrophysics Data System (ADS)
Moss, James N.
2001-08-01
This paper presents the results of a numerical study of shock interactions resulting from Mach 10 flow about sharp double cones. Computations are made by using the direct simulation Monte Carlo (DSMC) method of Bird. The sensitivity and characteristics of the interactions are examined by varying flow conditions, model size, and configuration. The range of conditions investigated includes those for which experiments have been or will be performed in the ONERA R5Ch low-density wind tunnel and the Calspan-University of Buffalo Research Center (CUBRC) Large Energy National Shock (LENS) tunnel.
Computer simulation of formation and decomposition of Au13 nanoparticles
NASA Astrophysics Data System (ADS)
Stishenko, P.; Svalova, A.
2017-08-01
To study the Ostwald ripening of Au13 nanoparticles, a two-scale model is constructed: an analytical approximation of the average nanoparticle energy as a function of nanoparticle size and structural motif, and a Monte Carlo model of a 1000-particle ensemble. Simulation results show different behavior for particles of different structural motifs. A change in the distribution of atomic coordination numbers during the Ostwald ripening process was observed. Nanoparticles of equal size and shape with the face-centered cubic structure appeared to be the most stable at the largest sizes.
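A toy version of such an ensemble Monte Carlo model is sketched below: single atoms hop between particles with Metropolis acceptance, and a generic bulk-plus-surface energy E(n) stands in for the paper's fitted size- and motif-dependent energy. All parameters are hypothetical.

```python
import math
import random

E_BULK, E_SURF, KT = -3.0, 2.0, 0.3      # eV, eV, eV (assumed values)

def energy(n):
    """Generic particle energy: bulk term plus n^(2/3) surface term."""
    return E_BULK * n + E_SURF * n ** (2.0 / 3.0)

random.seed(1)
sizes = [13] * 1000                       # start from 1000 Au13-like particles

for _ in range(500_000):
    i, j = random.randrange(len(sizes)), random.randrange(len(sizes))
    if i == j or sizes[i] < 2:
        continue
    # Propose moving one atom from particle i to particle j.
    d_e = (energy(sizes[i] - 1) + energy(sizes[j] + 1)
           - energy(sizes[i]) - energy(sizes[j]))
    if d_e <= 0 or random.random() < math.exp(-d_e / KT):
        sizes[i] -= 1
        sizes[j] += 1

# Ripening signature: a few particles grow at the expense of the rest.
print("largest particle:", max(sizes), "| particles above 13 atoms:",
      sum(s > 13 for s in sizes))
```

Because the surface term n^(2/3) is concave, moving an atom from a small particle to a large one lowers the total energy, which is exactly the driving force of Ostwald ripening.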
Radiation-reaction force on a small charged body to second order
NASA Astrophysics Data System (ADS)
Moxon, Jordan; Flanagan, Éanna
2018-05-01
In classical electrodynamics, an accelerating charged body emits radiation and experiences a corresponding radiation-reaction force, or self-force. We extend to higher order in the total charge a previous rigorous derivation of the electromagnetic self-force in flat spacetime by Gralla, Harte, and Wald. The method introduced by Gralla, Harte, and Wald computes the self-force from the Maxwell field equations and conservation of stress-energy in a limit where the charge, size, and mass of the body go to zero, and it does not require regularization of a singular self-field. For our higher-order computation, an adjustment of the definition of the mass of the body is necessary to avoid including self-energy from the electromagnetic field sourced by the body in the distant past. We derive the evolution equations for the mass, spin, and center-of-mass position of the body through second order. We derive, for the first time, the second-order acceleration dependence of the evolution of the spin (self-torque), as well as a mixing between the extended body effects and the acceleration-dependent effects on the overall body motion.
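For orientation, the first-order result that such small-body expansions recover is the familiar Abraham-Lorentz-Dirac self-force; the second-order corrections derived in the paper build on it. The form below assumes Gaussian units, c = 1, and metric signature (-,+,+,+), which may differ from the paper's conventions.

```latex
\begin{equation}
  f^{\mu}_{\mathrm{self}}
    = \frac{2q^{2}}{3}
      \left( \frac{\mathrm{d}a^{\mu}}{\mathrm{d}\tau}
             - a^{\nu} a_{\nu}\, u^{\mu} \right)
\end{equation}
% u^mu: four-velocity, a^mu: four-acceleration, tau: proper time.
```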
Large Eddy Simulation of a Supercritical Turbulent Mixing Layer
NASA Astrophysics Data System (ADS)
Sheikhi, Reza; Hadi, Fatemeh; Safari, Mehdi
2017-11-01
Supercritical turbulent flows are relevant to a wide range of applications such as supercritical power cycles, gas turbine combustors, rocket propulsion and internal combustion engines. Large eddy simulation (LES) analysis of such flows involves solving mass, momentum, energy and scalar transport equations with inclusion of generalized diffusion fluxes. These equations are combined with a real gas equation of state and the corresponding thermodynamic mixture variables. Subgrid scale models are needed for not only the conventional convective terms but also the additional high pressure effects arising due to the nonlinearity associated with generalized diffusion fluxes and real gas equation of state. In this study, LES is carried out to study the high pressure turbulent mixing of methane with carbon dioxide in a temporally developing mixing layer under supercritical condition. LES results are assessed by comparing with data obtained from direct numerical simulation (DNS) of the same layer. LES predictions agree favorably with DNS data and represent several key supercritical turbulent flow features such as high density gradient regions. Supported by DOE Grant SC0017097; computational support is provided by DOE National Energy Research Scientific Computing Center.
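As an illustration of the real-gas thermodynamics such solvers must evaluate, the sketch below computes pressure from the Peng-Robinson cubic equation of state for pure methane. Peng-Robinson is shown as one common choice; the abstract does not specify which real-gas EOS the study uses, and the constants are standard critical-point data.

```python
import math

R = 8.314462618                           # gas constant, J/(mol*K)
TC, PC, OMEGA = 190.56, 4.5992e6, 0.011   # methane Tc (K), Pc (Pa), acentric factor

def peng_robinson_pressure(temp, v_molar):
    """Pressure (Pa) from temperature (K) and molar volume (m^3/mol)."""
    a = 0.45724 * R ** 2 * TC ** 2 / PC
    b = 0.07780 * R * TC / PC
    kappa = 0.37464 + 1.54226 * OMEGA - 0.26992 * OMEGA ** 2
    alpha = (1.0 + kappa * (1.0 - math.sqrt(temp / TC))) ** 2
    return (R * temp / (v_molar - b)
            - a * alpha / (v_molar * (v_molar + b) + b * (v_molar - b)))

# Near-critical methane: the real-gas pressure deviates strongly from ideal.
T, v = 200.0, 1.2e-4
print(f"PR: {peng_robinson_pressure(T, v) / 1e6:.2f} MPa "
      f"vs ideal gas: {R * T / v / 1e6:.2f} MPa")
```

The large gap between the two printed pressures near the critical point is why LES of supercritical mixing cannot simply use the ideal-gas law.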
Experimental and Computational Interrogation of Fast SCR Mechanism and Active Sites on H-Form SSZ-13
DOE Office of Scientific and Technical Information (OSTI.GOV)
Li, Sichi; Zheng, Yang; Gao, Feng
Experiment and density functional theory (DFT) models are combined to develop a unified, quantitative model of the mechanism and kinetics of fast selective catalytic reduction (SCR) of NO/NO2 mixtures over H-SSZ-13 zeolite. Rates, rate orders, and apparent activation energies collected under differential conditions reveal two distinct kinetic regimes. First-principles thermodynamics simulations are used to determine the relative coverages of free Brønsted sites, chemisorbed NH4+, and physisorbed NH3 as a function of reaction conditions. First-principles metadynamics calculations show that all three sites can contribute to the rate-limiting N-N bond-forming step in fast SCR. The results are used to parameterize a kinetic model that encompasses the full range of reaction conditions and recovers observed rate orders and apparent activation energies. Observed kinetic regimes are related to changes in the most abundant surface intermediates. Financial support was provided by the National Science Foundation GOALI program under award number 1258690-CBET. We thank the Center for Research Computing at Notre Dame.
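In its simplest form, the coverage calculation described above amounts to Boltzmann-weighting the free energies of the competing site states. The sketch below uses placeholder free energies, not the paper's DFT values.

```python
import numpy as np

KB = 8.617333262e-5                     # Boltzmann constant, eV/K

def coverages(delta_g, temp):
    """Boltzmann populations of competing site states at temperature temp (K)."""
    w = np.exp(-np.asarray(delta_g) / (KB * temp))
    return w / w.sum()

# Placeholder adsorption free energies (eV) for the three site types named above.
sites = ["free Bronsted site", "chemisorbed NH4+", "physisorbed NH3"]
for T in (400.0, 600.0):
    for name, c in zip(sites, coverages([0.0, -0.45, -0.15], T)):
        print(f"{T:5.0f} K  {name:20s} {c:.3f}")
```

With these assumed values, the chemisorbed NH4+ state dominates at low temperature and loses ground as the temperature rises, which is the kind of coverage shift that can switch the system between kinetic regimes.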
Adamczyk, Peter Gabriel; Roland, Michelle; Hahn, Michael E
2017-08-01
Many studies have reported the effects of different foot prostheses on gait, but most results cannot be generalized because the prostheses' properties are seldom reported. We varied hindfoot and forefoot stiffness in an experimental foot prosthesis, in increments of 15 N/mm, and tested the parametric effects of these variations on treadmill walking in unilateral transtibial amputees, at speeds from 0.7 to 1.5 m/s. We computed outcomes such as prosthesis energy return, center of mass (COM) mechanics, ground reaction forces, and joint mechanics, and computed their sensitivity to component stiffness. A stiffer hindfoot led to reduced prosthesis energy return, increased ground reaction force (GRF) loading rate, and greater stance-phase knee flexion and knee extensor moment. A stiffer forefoot resulted in reduced prosthetic-side ankle push-off and COM push-off work, and increased knee extension and knee flexor moment in late stance. The sensitivity parameters obtained from these tests may be useful in clinical prescription and further research into compensatory mechanisms of joint function. Copyright © 2017 Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Abdullah, Warith; Reddy, Remata
From October 22 to 30, 2012, Hurricane Sandy was a huge storm of many abnormalities, causing an estimated $50 billion in damage. In tropical storm development, a system's energy derives from warm sea surface temperatures (SSTs) and tropical cyclone heat potential (TCHP). Advances in Earth Observing (EO) technology, remote sensing, and proxy remote sensing have allowed for accurate measurements of SST and TCHP information. In this study, we investigated the rapid intensification of Sandy through EO applications for precipitable water vapor (PWAT), SSTs, and TCHP during the period of October 27. These data were obtained from NASA and NOAA satellites and the NOAA National Data Buoy Center (NDBC). The sensible heat (Qs) fluxes were computed to determine the available energy resulting from the ocean-atmosphere interface. Buoy 41010, 120 NM east of Cape Canaveral, measured at 0850 UTC an atmospheric temperature of 22.3 °C and an SST of 27 °C, an air-sea difference of 4.7 °C. The sensible heat equation yielded fluxes of 43.7 W/m^2 at a central pressure of 982.0 mb. Sandy formed as a late-season storm, and near-surface air temperatures averaged > 21 °C according to the NOAA/ESRL NCEP/NCAR reanalysis at 1000 mb; GOES-13 (EAST) geostationary water vapor imagery shows an approaching cold front during October 27. Sandy encountered massive dry air intrusion into the S, SE, and E quadrants of the storm while traveling up the U.S. East Coast but experienced no weakening. The cool, dry air intrusion was considered for the PWAT investigation using the closest sounding station, Charleston, SC (station 72208), during 0900-2100 UTC on October 27. Measured PWAT totaled 42.97 mm, indicating a large potential energy supply to the storm. The Gulf Stream was observed using the NASA Short-term Prediction Research and Transition Center (SPoRT) MODIS SST analysis. The results show Gulf Stream waters 5 °C warmer than the surrounding cooler water, with > 25 °C water extending approximately 400 NM east of Chesapeake Bay and eddies > 26 °C. Results from the sensible heat computations for the atmospheric interface suggest that the unusual warmth associated with the Gulf Stream current provided Sandy with enough kinetic energy to intensify at high latitude. The study further suggests that energy gained from Caribbean TCHP and Gulf Stream SSTs was largely retained by Sandy upon losing tropical-cyclone characteristics and merging with a strong cold front and the polar jet stream. Storms of Sandy's magnitude, drawing on an unusual source of energy from the Gulf Stream, may indicate a building trend for tropical cyclone development and intensity in the North Atlantic, particularly as Gulf of Mexico (GOM) waters continue to warm on seasonal averages.
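The sensible heat flux referenced above is conventionally estimated with a bulk aerodynamic formula, Qs = rho_air * cp * C_h * U * (SST - T_air). In the sketch below, only the two temperatures come from the buoy 41010 report quoted in the abstract; the transfer coefficient and wind speed are assumed values chosen to roughly reproduce the reported 43.7 W/m^2.

```python
RHO_AIR = 1.2     # air density, kg/m^3 (assumed)
CP_AIR = 1004.0   # specific heat of air at constant pressure, J/(kg*K)
C_H = 1.2e-3      # bulk sensible-heat transfer coefficient (assumed)
U = 6.4           # near-surface wind speed, m/s (assumed)

def sensible_heat_flux(sst_c, t_air_c):
    """Ocean-to-atmosphere sensible heat flux, W/m^2, via the bulk formula."""
    return RHO_AIR * CP_AIR * C_H * U * (sst_c - t_air_c)

# Buoy 41010 values from the abstract: SST 27.0 C, air temperature 22.3 C.
print(f"Qs = {sensible_heat_flux(27.0, 22.3):.1f} W/m^2")   # ~43.5 W/m^2
```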
CAROLINA CENTER FOR COMPUTATIONAL TOXICOLOGY
The Center will advance the field of computational toxicology through the development of new methods and tools, as well as through collaborative efforts. In each Project, new computer-based models will be developed and published that represent the state-of-the-art. The tools p...
Lombardi, Andrea; Pirani, Fernando; Laganà, Antonio; Bartolomei, Massimiliano
2016-06-15
In this work, we exploit a new formulation of the potential energy and of the related computational procedures, which embodies the coupling between the intra- and intermolecular components, to characterize possible propensities of the collision dynamics in energy transfer processes of interest for the simulation and control of phenomena occurring in a variety of equilibrium and nonequilibrium environments. The investigation reported in the paper focuses on the prototype CO2-N2 system, whose intramolecular component of the interaction is modeled in terms of a many-body expansion while the intermolecular component is modeled in terms of a recently developed 'bonds-as-interacting-molecular-centers' approach. The main advantage of this formulation of the potential energy surface is that it is (a) truly full dimensional (i.e., all the variations of the coordinates associated with the molecular vibrations and rotations and their effects on the geometrical and electronic structure of the monomers are explicitly taken into account, without freezing any bonds or angles), (b) more flexible than other usual formulations of the interaction, and (c) well suited for fitting procedures that better adhere to accurate ab initio data and are sensitive to experimental-arrangement-dependent information. Specific attention has been given to the fact that a variation of vibrational and rotational energy has a higher (both qualitative and quantitative) impact on the energy transfer when a more accurate formulation of the intermolecular interaction (with respect to that obtained when using rigid monomers) is adopted. This makes the potential energy surface better suited for the kinetic modeling of gaseous mixtures in plasma, combustion, and atmospheric chemistry computational applications. © 2016 Wiley Periodicals, Inc.
Lazaridis, Themis; Leveritt, John M; PeBenito, Leo
2014-09-01
The energetic cost of burying charged groups in the hydrophobic core of lipid bilayers has been controversial, with simulations giving higher estimates than certain experiments. Implicit membrane approaches are usually deemed too simplistic for this problem. Here we challenge this view. The free energy of transfer of amino acid side chains from water to the membrane center predicted by IMM1 is reasonably close to all-atom free energy calculations. The shape of the free energy profile, however, for the charged side chains needs to be modified to reflect the all-atom simulation findings (IMM1-LF). Membrane thinning is treated by combining simulations at different membrane widths with an estimate of membrane deformation free energy from elasticity theory. This approach is first tested on the voltage sensor and the isolated S4 helix of potassium channels. The voltage sensor is stably inserted in a transmembrane orientation for both the original and the modified model. The transmembrane orientation of the isolated S4 helix is unstable in the original model, but a stable local minimum in IMM1-LF, slightly higher in energy than the interfacial orientation. Peptide translocation is addressed by mapping the effective energy of the peptide as a function of vertical position and tilt angle, which allows identification of minimum energy pathways and transition states. The barriers computed for the S4 helix and other experimentally studied peptides are low enough for an observable rate. Thus, computational results and experimental studies on the membrane burial of peptide charged groups appear to be consistent. This article is part of a Special Issue entitled: Interfacially Active Peptides and Proteins. Guest Editors: William C. Wimley and Kalina Hristova. Copyright © 2014 Elsevier B.V. All rights reserved.
Experimental Results from the Thermal Energy Storage-1 (TES-1) Flight Experiment
NASA Technical Reports Server (NTRS)
Wald, Lawrence W.; Tolbert, Carol; Jacqmin, David
1995-01-01
The Thermal Energy Storage-1 (TES-1) is a flight experiment that flew on the Space Shuttle Columbia (STS-62) in March 1994 as part of the OAST-2 mission. TES-1 is the first experiment in a four-experiment suite designed to provide data for understanding the long-duration microgravity behavior of thermal energy storage fluoride salts that undergo repeated melting and freezing. Such data have never been obtained before and have direct application to the development of space-based solar dynamic (SD) power systems. These power systems will store solar energy in a thermal energy storage salt such as lithium fluoride or calcium fluoride. The stored energy is extracted during the shade portion of the orbit. This enables the solar dynamic power system to provide constant electrical power over the entire orbit. Analytical computer codes have been developed for predicting the performance of a space-based solar dynamic power system. Experimental verification of the analytical predictions is needed prior to using the analytical results for future space power design applications. The four TES flight experiments will be used to obtain the needed experimental data. This paper focuses on the flight results from the first experiment, TES-1, in comparison to the predicted results from the Thermal Energy Storage Simulation (TESSIM) analytical computer code. The TES-1 conceptual development, hardware design, final development, and system verification testing were accomplished at the NASA Lewis Research Center (LeRC). TES-1 was developed under the In-Space Technology Experiment Program (IN-STEP), which sponsors NASA, industry, and university flight experiments designed to enable and enhance space flight technology. The IN-STEP Program is sponsored by the Office of Space Access and Technology (OSAT).
Are we living near the center of a local void?
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cusin, Giulia; Pitrou, Cyril; Uzan, Jean-Philippe, E-mail: giulia.cusin@unige.ch, E-mail: pitrou@iap.fr, E-mail: uzan@iap.fr
The properties of the cosmic microwave background (CMB) temperature and polarisation anisotropies measured by a static, off-centered observer located in a local spherically symmetric void are described. In particular, in this paper we compute, together with the standard 2-point angular correlation functions, the off-diagonal correlators, which are no longer vanishing by symmetry. While the energy shift induced by the off-centered position of the observer can be suppressed by a proper choice of the observer velocity, a lensing-like effect on the CMB emission point remains. This latter effect is genuinely geometrical (e.g., non-degenerate with a boost) and is reflected in the structure of the off-diagonal correlators. At lowest order in this effect, the temperature and polarisation correlation matrices have non-vanishing diagonal elements, as usual, and all the off-diagonal terms are excited. This particular signature of a local void model allows one, in principle, to disentangle geometrical effects from local kinematical ones in CMB observations.
NASA Astrophysics Data System (ADS)
Klyashtorny, V. G.; Fufina, T. Yu.; Vasilieva, L. G.; Shuvalov, V. A.; Gabdulkhakov, A. G.
2014-07-01
Pigment-protein interactions are responsible for the high efficiency of light-energy transfer and conversion in photosynthesis. The reaction center (RC) from the purple bacterium Rhodobacter sphaeroides is the most convenient model for studying the mechanisms of the primary processes of photosynthesis. Site-directed mutagenesis can be used to study the effect of the protein environment of electron-transfer cofactors on the optical properties, stability, pigment composition, and functional activity of the RC. A preliminary analysis of the RC was performed by computer simulation of the amino acid substitutions L(M196)H + H(M202)L at the pigment-protein interface and by estimating the stability of the three-dimensional structure of the mutant RC by the molecular dynamics method. The doubly mutated reaction center was overexpressed, purified, and crystallized. The three-dimensional structure of this mutant was determined by X-ray crystallography and compared with the molecular dynamics model.
NASA Astrophysics Data System (ADS)
Osei, Richard
There are many problems associated with operating a data center, including data security, system performance, increasing infrastructure complexity, increasing storage utilization, keeping up with data growth, and increasing energy costs. Energy cost differs by location and at most locations fluctuates over time. The rising cost of energy makes it harder for data centers to function properly and provide a good quality of service. With reduced energy cost, data centers will have longer-lasting servers and equipment, higher availability of resources, better quality of service, a greener environment, and reduced service and software costs for consumers. Some of the ways that data centers have tried to reduce energy costs include dynamically switching servers on and off based on the number of users and predefined conditions, the use of environmental monitoring sensors, and the use of dynamic voltage and frequency scaling (DVFS), which enables processors to run at different combinations of frequencies and voltages. This thesis presents another method by which energy cost at data centers could be reduced: the use of Ant Colony Optimization (ACO) on a Quadratic Assignment Problem (QAP) to assign user requests to servers in geo-distributed data centers. In this work, front portals, which handle users' requests, act as ants that search for cost-effective ways to assign each request to a server in heterogeneous geo-distributed data centers. The simulation results indicate that the ACO for Optimal Server Activation and Task Placement algorithm reduces energy cost for both small and large numbers of user requests in a geo-distributed data center, and its performance improves as the input data grows. In a simulation with 3 geo-distributed data centers and user resource requests ranging from 25,000 to 25,000,000, the ACO algorithm reduced energy cost by an average of $0.70 per second. The ACO for Optimal Server Activation and Task Placement algorithm has proven to work as an alternative or improvement in reducing energy cost in geo-distributed data centers.
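The thesis's exact algorithm is not given in this abstract. As an illustrative sketch of the general idea only, with ants standing in for the front portals, probabilistically building request-to-center assignments and reinforcing low-energy ones, and with all cost numbers and parameters invented:

```python
import random

# Illustrative energy cost ($/s) of serving one unit of request type r
# at data center c; these values are invented, not from the thesis.
COST = [[0.9, 1.4, 1.1],
        [1.3, 0.8, 1.2],
        [1.0, 1.2, 0.7],
        [1.1, 0.9, 1.3]]          # 4 request types x 3 centers
N_REQ, N_DC = len(COST), len(COST[0])
ALPHA, RHO, N_ANTS, N_ITER = 1.0, 0.1, 20, 200

def aco_assign():
    """Each ant builds a full request-type -> data-center assignment,
    biased by pheromone trails and by the inverse of the energy cost."""
    pher = [[1.0] * N_DC for _ in range(N_REQ)]
    best, best_cost = None, float("inf")
    for _ in range(N_ITER):
        for _ in range(N_ANTS):
            assign = []
            for r in range(N_REQ):
                w = [pher[r][c] ** ALPHA / COST[r][c] for c in range(N_DC)]
                assign.append(random.choices(range(N_DC), weights=w)[0])
            cost = sum(COST[r][c] for r, c in enumerate(assign))
            if cost < best_cost:
                best, best_cost = assign, cost
        # Evaporate all trails, then reinforce the best-known assignment.
        for r in range(N_REQ):
            for c in range(N_DC):
                pher[r][c] *= (1 - RHO)
        for r, c in enumerate(best):
            pher[r][c] += 1.0 / best_cost
    return best, best_cost

assignment, cost = aco_assign()
print("assignment:", assignment, "energy cost ($/s):", round(cost, 2))
```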
Computational Study of Droplet Trains Impacting a Smooth Solid Surface
NASA Astrophysics Data System (ADS)
Markt, David, Jr.; Pathak, Ashish; Raessi, Mehdi; Lee, Seong-Young; Zhao, Emma
2017-11-01
The study of droplet impingement is vital to understanding the fluid dynamics of fuel injection in modern internal combustion engines. One widely accepted model was proposed by Yarin and Weiss (JFM, 1995), developed from experiments on single trains of ethanol droplets impacting a substrate. The model predicts the onset of splashing and the mass ejected upon splashing. In this study, using an in-house 3D multiphase flow solver, the experiments of Yarin and Weiss were computationally simulated. The experimentally observed splashing threshold was captured by the simulations, validating the solver's ability to accurately simulate splashing dynamics. We then performed simulations of cases with multiple droplet trains, which have high relevance to dense fuel sprays, where droplets impact within the spreading diameters of neighboring droplets, leading to changes in splashing dynamics due to interactions of the spreading films. For both single- and multi-train simulations, the amount of splashed mass was calculated as a function of time, allowing a quantitative comparison between the two cases. Furthermore, using a passive scalar, the amount of splashed mass per impinging droplet was also calculated. This work is supported by the Department of Energy, Office of Energy Efficiency and Renewable Energy (EERE) and the Department of Defense, Tank and Automotive Research, Development, and Engineering Center (TARDEC), under Award Number DE-EE0007292.
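Yarin and Weiss's train-impact criterion is commonly written in terms of a dimensionless impact velocity built from the liquid properties and the impact frequency, with splashing above a roughly constant threshold. A small calculator under that reading follows; the ethanol-like property values and the threshold constant of about 17 are assumptions for illustration, not taken from this abstract:

```python
def dimensionless_impact_velocity(u, rho, sigma, nu, f):
    """Yarin-Weiss scaled impact velocity for a droplet train:
    u* = u * (rho/sigma)**(1/4) * nu**(-1/8) * f**(-3/8)  (dimensionless)."""
    return u * (rho / sigma) ** 0.25 * nu ** -0.125 * f ** -0.375

# Illustrative ethanol-like properties (assumed values).
rho   = 789.0     # density, kg/m^3
sigma = 0.022     # surface tension, N/m
nu    = 1.5e-6    # kinematic viscosity, m^2/s
f     = 10e3      # droplet impact frequency, Hz

U_STAR_SPLASH = 17.0   # approximate threshold; treat as an assumption
u_star = dimensionless_impact_velocity(10.0, rho, sigma, nu, f)
print(f"u* = {u_star:.1f} ->", "splash" if u_star > U_STAR_SPLASH else "deposit")
```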
Double Super-Exchange in Silicon Quantum Dots Connected by Short-Bridged Networks
NASA Astrophysics Data System (ADS)
Li, Huashan; Wu, Zhigang; Lusk, Mark
2013-03-01
Silicon quantum dots (QDs) with diameters in the range of 1-2 nm are attractive for photovoltaic applications. They absorb photons more readily, transport excitons with greater efficiency, and show greater promise in multiple-exciton generation and hot-carrier collection paradigms. However, their high excitonic binding energy makes it difficult to dissociate excitons into separate charge carriers. One possible remedy is to create dot assemblies in which a second material forms a Type-II heterojunction with the dot so that exciton dissociation occurs locally. This talk will focus on such a Type-II heterojunction paradigm in which QDs are connected via covalently bonded, short-bridge molecules. For such interpenetrating networks of dots and molecules, our first-principles computational investigation shows that it is possible to rapidly and efficiently separate electrons to QDs and holes to bridge units. The bridge network serves as an efficient mediator of electron superexchange between QDs, while the dots themselves play the complementary role of efficient hole superexchange mediators. Dissociation, photoluminescence, and carrier transport rates will be presented for bridge networks of silicon QDs that exhibit such double superexchange. This material is based upon work supported by the Renewable Energy Materials Research Science and Engineering Center (REMRSEC) under Grant No. DMR-0820518 and the Golden Energy Computing Organization (GECO).
Predicting lattice thermal conductivity with help from ab initio methods
NASA Astrophysics Data System (ADS)
Broido, David
2015-03-01
The lattice thermal conductivity is a fundamental transport parameter that determines the utility of a material for specific thermal management applications. Materials with low thermal conductivity find applicability in thermoelectric cooling and energy harvesting. High thermal conductivity materials are urgently needed to help address the ever-growing heat dissipation problem in microelectronic devices. Predictive computational approaches can provide critical guidance in the search for and development of new materials for such applications. Ab initio methods for calculating lattice thermal conductivity have demonstrated predictive capability, but while they are becoming increasingly efficient, they are still computationally expensive, particularly for complex crystals with large unit cells. In this talk, I will review our work on first-principles phonon transport for which the intrinsic lattice thermal conductivity is limited only by phonon-phonon scattering arising from anharmonicity. I will examine the use of the phase space for anharmonic phonon scattering and the Grüneisen parameters as measures of the thermal conductivities of a range of materials and compare these to the widely used guidelines stemming from the theory of Leibfried and Schlömann. This research was supported primarily by the NSF under Grant CBET-1402949, and by S3TEC, an Energy Frontier Research Center funded by the US DOE, Office of Basic Energy Sciences, under Award No. DE-SC0001299.
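The Leibfried-Schlömann-type guideline mentioned above ties the lattice thermal conductivity to the Debye temperature and the Grüneisen parameter; a common practical form is Slack's expression. A hedged sketch of that estimate, where the prefactor and the silicon-like inputs are approximate textbook values rather than numbers from the talk:

```python
def slack_kappa(m_avg, delta, theta, gamma, n, t=300.0):
    """Leibfried-Schlomann-type (Slack) estimate of lattice thermal
    conductivity in W/(m K). m_avg: mean atomic mass (amu); delta: cube
    root of the volume per atom (Angstrom); theta: Debye temperature (K);
    gamma: Gruneisen parameter; n: atoms per primitive cell.
    The prefactor A ~ 3.1e-6 is the commonly quoted approximation."""
    A = 3.1e-6
    return A * m_avg * delta * theta**3 / (gamma**2 * n ** (2.0 / 3.0) * t)

# Rough silicon-like inputs (illustrative): ~130 W/(m K) at 300 K.
print(f"kappa ~ {slack_kappa(28.09, 2.7, 640, 1.0, 2):.0f} W/(m K)")
```

The key qualitative reading matches the talk's guidelines: high theta and low gamma (weak anharmonicity) push kappa up, while large unit cells (large n) push it down.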
Design and synthesis of a crystalline LiPON electrolyte
NASA Astrophysics Data System (ADS)
Holzwarth, N. A. W.; Senevirathne, Keerthi; Day, Cynthia S.; Lachgar, Abdessadek; Gross, Michael D.
2013-03-01
In the course of a computational study of the broad class of lithium phosphorus oxy-nitride materials of interest for solid electrolyte applications, Du and Holzwarth [2] recently predicted a stable crystalline material with the stoichiometry Li2PO2N. The present paper reports the experimental preparation of the material using high-temperature solid-state synthesis and reports the results of experimental and calculational characterization studies. The so-named SD-Li2PO2N crystal structure has the orthorhombic space group Cmc21 with lattice constants a=9.0692(4) Å, b=5.3999(2) Å, and c=4.6856(2) Å. The structure is similar, but not identical, to the predicted structure, characterized by parallel arrangements of anionic phosphorus oxy-nitride chains having planar P-N-P-N backbones. Nitrogen 2p π states contribute to the strong bonding and to the chemical and thermal stability of the material in air up to 600 °C and in vacuum up to 1050 °C. The measured Arrhenius activation energy for ionic conductivity is 0.6 eV, which is comparable to computed vacancy migration energies in the presence of a significant population of Li+ ion vacancies. Supported by NSF grant DMR-1105485 and by a grant from the Wake Forest University Center for Energy, Environment, and Sustainability.
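The quoted 0.6 eV activation energy implies a steep temperature dependence of the ionic conductivity. A minimal sketch assuming the simple Arrhenius form σ = σ₀ exp(−Ea/kBT) (some analyses place σT rather than σ on the left-hand side):

```python
import math

K_B = 8.617333e-5  # Boltzmann constant, eV/K

def arrhenius_ratio(e_a, t1, t2):
    """Ratio sigma(t2)/sigma(t1) for Arrhenius-activated conductivity
    sigma = sigma0 * exp(-Ea / (kB * T)); sigma0 cancels in the ratio."""
    return math.exp(e_a / (K_B * t1) - e_a / (K_B * t2))

# With the reported Ea = 0.6 eV, warming from 300 K to 400 K gives
# roughly a 300-fold increase in conductivity.
print(f"conductivity gain: {arrhenius_ratio(0.6, 300.0, 400.0):.0f}x")
```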
A Theoretical Investigation of the Input Characteristics of a Rectangular Cavity-Backed Slot Antenna
NASA Technical Reports Server (NTRS)
Cockrell, C. R.
1975-01-01
Equations representing the magnetic and electric stored energies are derived for an infinite section of rectangular waveguide and for a rectangular cavity. These representations, referred to as physically observable, are obtained by considering the difference in the volume integrals appearing in the complex Poynting theorem. It is shown that the physically observable stored energies are determined by the field components that vanish in a reference plane outside the aperture. These physically observable representations are used to compute the input admittance of a rectangular cavity-backed slot antenna in which a single propagating wave is assumed to exist in the cavity. The slot is excited by a voltage source connected across its center; a sinusoidal distribution is assumed in the slot. Input-admittance calculations are compared with measured data. In addition, input-admittance curves as a function of electrical slot length are presented for several cavity sizes. For the rectangular cavity-backed slot antenna, the quality factor and relative bandwidth were computed independently by using these energy relationships. It is shown that the asymptotic relationship usually assumed to exist between the quality factor and the reciprocal of the relative bandwidth is equally valid for the rectangular cavity-backed slot antenna.
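The quality factor computed from stored energy follows the standard definition Q = ωW/P, and the asymptotic relation tested in the report ties Q to the reciprocal of the relative bandwidth. A quick numerical sketch with invented values (not data from the report):

```python
import math

def quality_factor(omega, stored_energy, power_loss):
    """Standard definition Q = omega * W / P: angular frequency times
    time-average stored energy over average power lost (radiated plus
    dissipated)."""
    return omega * stored_energy / power_loss

# Illustrative numbers only (assumed, not from the report).
omega = 2 * math.pi * 3e9            # 3 GHz operating frequency
q = quality_factor(omega, stored_energy=1.0e-12, power_loss=0.5e-3)
# Asymptotic relation tested in the report: relative bandwidth ~ 1/Q.
print(f"Q = {q:.1f}, relative bandwidth ~ {100 / q:.1f}%")
```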
Adaptation of a Control Center Development Environment for Industrial Process Control
NASA Technical Reports Server (NTRS)
Killough, Ronnie L.; Malik, James M.
1994-01-01
In the control center, raw telemetry data is received for storage, display, and analysis. This raw data must be combined and manipulated in various ways by mathematical computations to facilitate analysis, provide diversified fault detection mechanisms, and enhance display readability. A development tool called the Graphical Computation Builder (GCB) has been implemented which provides flight controllers with the capability to implement computations for use in the control center. The GCB provides a language that contains both general programming constructs and language elements specifically tailored for the control center environment. The GCB concept allows staff who are not skilled in computer programming to author and maintain computer programs. The GCB user is isolated from the details of external subsystem interfaces and has access to high-level functions such as matrix operators, trigonometric functions, and unit conversion macros. The GCB provides a high level of feedback during computation development that improves upon the often cryptic errors produced by computer language compilers. An equivalent need can be identified in the industrial data acquisition and process control domain: that of an integrated graphical development tool tailored to the application to hide the operating system, computer language, and data acquisition interface details. The GCB features a modular design which makes it suitable for technology transfer without significant rework. Control center-specific language elements can be replaced by elements specific to industrial process control.
Cloudbursting - Solving the 3-body problem
NASA Astrophysics Data System (ADS)
Chang, G.; Heistand, S.; Vakhnin, A.; Huang, T.; Zimdars, P.; Hua, H.; Hood, R.; Koenig, J.; Mehrotra, P.; Little, M. M.; Law, E.
2014-12-01
Many science projects in the future will be accomplished through collaboration among two or more NASA centers along with, potentially, external scientists. Science teams will be composed of more geographically dispersed individuals and groups. However, the current computing environment does not make this easy and seamless. By sharing computing resources among members of a multi-center team working on a science/engineering project, limited pre-competition funds could be applied more efficiently and technical work could be conducted more effectively, with less time spent moving data or waiting for computing resources to free up. Based on work from a NASA CIO IT Labs task, this presentation will highlight our prototype work in assessing the feasibility of, and identifying the obstacles, both technical and managerial, to performing "cloudbursting" among private clouds located at three different centers. We demonstrate the use of private cloud computing infrastructure at the Jet Propulsion Laboratory, Langley Research Center, and Ames Research Center to provide elastic computation to one another for parallel Earth science data imaging. We leverage elastic load balancing and auto-scaling features at each data center so that each location can independently define how many resources to allocate to a particular job that was "bursted" from another data center, and we demonstrate that compute capacity scales up and down with the job. We will also discuss future work in the area, which could include the use of cloud infrastructure from different cloud framework providers as well as other cloud service providers.
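The prototype's code is not described in this abstract. As a purely hypothetical sketch of the bursting decision it implies, namely keep what fits at the home center, spill the overflow to the partner advertising the most free capacity, and let auto-scaling absorb it, one might write (the Center class, thresholds, and numbers are all invented):

```python
from dataclasses import dataclass

@dataclass
class Center:
    name: str
    queued_jobs: int
    free_nodes: int

def plan_burst(home, partners, per_node_jobs=4):
    """Keep what fits locally; spill overflow to whichever partner
    currently advertises the most free capacity. Any remainder simply
    waits at home until auto-scaling frees resources somewhere."""
    local_capacity = home.free_nodes * per_node_jobs
    overflow = max(0, home.queued_jobs - local_capacity)
    plan = []
    for p in sorted(partners, key=lambda c: c.free_nodes, reverse=True):
        if overflow == 0:
            break
        take = min(overflow, p.free_nodes * per_node_jobs)
        if take:
            plan.append((p.name, take))
            overflow -= take
    return plan

# Example: JPL has 120 queued jobs but room for only 40; 80 jobs burst
# to Langley, which has the most advertised free nodes.
jpl = Center("JPL", queued_jobs=120, free_nodes=10)
print(plan_burst(jpl, [Center("LaRC", 0, 25), Center("ARC", 0, 8)]))
```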
Computers and Media Centers--A Winning Combination.
ERIC Educational Resources Information Center
Graf, Nancy
1984-01-01
Profile of the computer program offered by the library/media center at Chief Joseph Junior High School in Richland, Washington, highlights program background, operator's licensing procedure, the trainer license, assistance from high school students, need for more computers, handling of software, and helpful hints. (EJS)