Building CHAOS: An Operating System for Livermore Linux Clusters
DOE Office of Scientific and Technical Information (OSTI.GOV)
Garlick, J E; Dunlap, C M
2003-02-21
The Livermore Computing (LC) Linux Integration and Development Project (the Linux Project) produces and supports the Clustered High Availability Operating System (CHAOS), a cluster operating environment based on Red Hat Linux. Each CHAOS release begins with a set of requirements and ends with a formally tested, packaged, and documented release suitable for use on LC's production Linux clusters. One characteristic of CHAOS is that component software packages come from different sources under varying degrees of project control. Some are developed by the Linux Project, some are developed by other LC projects, some are external open source projects, and some are commercial software packages. A challenge to the Linux Project is to adhere to release schedules and testing disciplines in a diverse, highly decentralized development environment. Communication channels are maintained for externally developed packages in order to obtain support, influence development decisions, and coordinate/understand release schedules. The Linux Project embraces open source by releasing locally developed packages under open source license, by collaborating with open source projects where mutually beneficial, and by preferring open source over proprietary software. Project members generally use open source development tools. The Linux Project requires system administrators and developers to work together to resolve problems that arise in production. This tight coupling of production and development is a key strategy for making a product that directly addresses LC's production requirements. It is another challenge to balance support and development activities in such a way that one does not overwhelm the other.
Data Intensive Computing on Amazon Web Services
DOE Office of Scientific and Technical Information (OSTI.GOV)
Magana-Zook, S. A.
The Geophysical Monitoring Program (GMP) has spent the past few years building up the capability to perform data intensive computing using what have been referred to as “big data” tools. These big data tools would be used against massive archives of seismic signals (>300 TB) to conduct research not previously possible. Examples of such tools include Hadoop (HDFS, MapReduce), HBase, Hive, Storm, Spark, Solr, and many more by the day. These tools are useful for performing data analytics on datasets that exceed the resources of traditional analytic approaches. To this end, a research big data cluster (“Cluster A”) was set up as a collaboration between GMP and Livermore Computing (LC).
Science & Technology Review June 2003
DOE Office of Scientific and Technical Information (OSTI.GOV)
McMahon, D
This month's issue has the following articles: (1) Livermore's Three-Pronged Strategy for High-Performance Computing, Commentary by Dona Crawford; (2) Riding the Waves of Supercomputing Technology--Livermore's Computation Directorate is exploiting multiple technologies to ensure high-performance, cost-effective computing; (3) Chromosome 19 and Lawrence Livermore Form a Long-Lasting Bond--Lawrence Livermore biomedical scientists have played an important role in the Human Genome Project through their long-term research on chromosome 19; (4) A New Way to Measure the Mass of Stars--For the first time, scientists have determined the mass of a star in isolation from other celestial bodies; and (5) Flexibly Fueled Storage Tank Brings Hydrogen-Powered Cars Closer to Reality--Livermore's cryogenic hydrogen fuel storage tank for passenger cars of the future can accommodate three forms of hydrogen fuel separately or in combination.
Zhang, Xiaohua; Wong, Sergio E; Lightstone, Felice C
2013-04-30
A mixed parallel scheme that combines message passing interface (MPI) and multithreading was implemented in the AutoDock Vina molecular docking program. The resulting program, named VinaLC, was tested on the petascale high performance computing (HPC) machines at Lawrence Livermore National Laboratory. To exploit the typical cluster-type supercomputers, thousands of docking calculations were dispatched by the master process to run simultaneously on thousands of slave processes, where each docking calculation takes one slave process on one node, and within the node each docking calculation runs via multithreading on multiple CPU cores and shared memory. Input and output of the program and the data handling within the program were carefully designed to deal with large databases and ultimately achieve HPC on a large number of CPU cores. Parallel performance analysis of the VinaLC program shows that the code scales up to more than 15K CPUs with a very low overhead cost of 3.94%. One million flexible compound docking calculations took only 1.4 h to finish on about 15K CPUs. The docking accuracy of VinaLC has been validated against the DUD data set by the re-docking of X-ray ligands and an enrichment study; 64.4% of the top scoring poses have RMSD values under 2.0 Å. The program has been demonstrated to have good enrichment performance on 70% of the targets in the DUD data set. An analysis of the enrichment factors calculated at various percentages of the screening database indicates VinaLC has very good early recovery of actives. Copyright © 2013 Wiley Periodicals, Inc.
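The two-level scheme described in this abstract (a master dispatching independent docking jobs to workers, each worker parallelizing its own job across threads on shared memory) can be sketched as follows. This is an illustrative stand-in, not VinaLC's code: VinaLC uses MPI processes for the outer level, while here threads stand in for MPI ranks so the sketch is self-contained, and `score_pose` is a hypothetical placeholder for the docking kernel.

```python
# Two-level master/worker sketch. Outer level: a master farms out one
# docking job per worker (VinaLC uses MPI ranks; threads stand in here).
# Inner level: each job scores its poses concurrently on shared memory.
from concurrent.futures import ThreadPoolExecutor

def score_pose(ligand, pose):
    # Hypothetical per-pose scoring work, one CPU core's worth.
    return (ligand * 31 + pose * 7) % 100

def dock_ligand(ligand, n_threads=4, n_poses=8):
    # Inner level: within one "node", poses are scored by threads.
    with ThreadPoolExecutor(max_workers=n_threads) as pool:
        scores = pool.map(lambda p: score_pose(ligand, p), range(n_poses))
        return ligand, min(scores)  # keep the best (lowest) score

def run_master(ligands, n_workers=8):
    # Outer level: dispatch each ligand's docking job to a free worker,
    # analogous to rank 0 handing work to slave processes.
    with ThreadPoolExecutor(max_workers=n_workers) as pool:
        return dict(pool.map(dock_ligand, ligands))
```

Because each docking calculation is independent, the outer level is embarrassingly parallel, which is why the abstract's near-linear scaling to 15K CPUs with ~4% overhead is plausible.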
LINCS: Livermore's network architecture. [Octopus computing network]
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fletcher, J.G.
1982-01-01
Octopus, a local computing network that has been evolving at the Lawrence Livermore National Laboratory for over fifteen years, is currently undergoing a major revision. The primary purpose of the revision is to consolidate and redefine the variety of conventions and formats, which have grown up over the years, into a single standard family of protocols, the Livermore Interactive Network Communication Standard (LINCS). This standard treats the entire network as a single distributed operating system such that access to a computing resource is obtained in a single way, whether that resource is local (on the same computer as the accessing process) or remote (on another computer). LINCS encompasses not only communication but also such issues as the relationship of customer to server processes and the structure, naming, and protection of resources. The discussion includes: an overview of the Livermore user community and computing hardware, the functions and structure of each of the seven layers of LINCS protocol, and the reasons why we have designed our own protocols and why we are dissatisfied with the directions that current protocol standards are taking.
LTSS compendium: an introduction to the CDC 7600 and the Livermore Timesharing System
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fong, K. W.
1977-08-15
This report is an introduction to the CDC 7600 computer and to the Livermore Timesharing System (LTSS) used by the National Magnetic Fusion Energy Computer Center (NMFECC) and the Lawrence Livermore Laboratory Computer Center (LLLCC or Octopus network) on their 7600's. This report is based on a document originally written specifically about the system as it is implemented at NMFECC but has been broadened to point out differences in implementation at LLLCC. It also contains information about LLLCC not relevant to NMFECC. This report is written for computational physicists who want to prepare large production codes to run under LTSS on the 7600's. The generalized discussion of the operating system focuses on creating and executing controllees. This document and its companion, UCID-17557, CDC 7600 LTSS Programming Stratagems, provide a basis for understanding more specialized documents about individual parts of the system.
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
A video on computer security is described. Lonnie Moore, the Computer Security Manager, CSSM/CPPM at Lawrence Livermore National Laboratory (LLNL) and Gale Warshawsky, the Coordinator for Computer Security Education and Awareness at LLNL, wanted to share topics such as computer ethics, software piracy, privacy issues, and protecting information in a format that would capture and hold an audience's attention. Four Computer Security Short Subject videos were produced which ranged from 1--3 minutes each. These videos are very effective education and awareness tools that can be used to generate discussions about computer security concerns and good computing practices.
Massive Signal Analysis with Hadoop (Invited)
NASA Astrophysics Data System (ADS)
Addair, T.
2013-12-01
The Geophysical Monitoring Program (GMP) at Lawrence Livermore National Laboratory is in the process of transitioning from a primarily human-driven analysis pipeline to a more automated and exploratory system. Waveform correlation represents a significant part of this effort, and the results that come out of this processing could lead to the development of more sophisticated event detection and analysis systems that require less human interaction and address fundamental shortcomings in existing systems. Furthermore, use of distributed IO systems fundamentally addresses a scalability concern for the GMP as our data holdings continue to grow rapidly. As the data volume increases, it becomes less reasonable to rely upon human analysts to sift through all the information. Not only is more automation essential to keeping up with the ingestion rate, but we also require faster and more sophisticated tools for visualizing and interacting with the data. These issues of scalability are not unique to GMP or the seismic domain. All across the lab, and throughout industry, we hear about the promise of 'big data' to address the need to quickly analyze vast amounts of data in fundamentally new ways. Our waveform correlation system finds and correlates nearby seismic events across the entire Earth. In our original implementation of the system, we processed some 50 TB of data on an in-house traditional HPC cluster (44 cores, 1 filesystem) over the span of 42 days. Having determined the primary bottleneck in the performance to be reading waveforms off a single BlueArc file server, we began investigating distributed IO solutions like Hadoop. As a test case, we took a 1 TB subset of our data and ported it to Livermore Computing's development Hadoop cluster. Through a pilot project sponsored by Livermore Computing (LC), the GMP successfully implemented the waveform correlation system in the Hadoop distributed MapReduce computing framework.
Hadoop is an open source implementation of the MapReduce distributed programming framework. We used the Hadoop scripting framework known as Pig for putting together the multi-job MapReduce pipeline used to extract as much parallelism as possible from the algorithms. We also made use of the Sqoop data ingestion tool to pull metadata tables from our Oracle database into HDFS (the Hadoop Distributed Filesystem). Running on our in-house HPC cluster, processing this test dataset took 58 hours to complete. In contrast, running our Hadoop implementation on LC's 10 node (160 core) cluster, we were able to cross-correlate the 1 TB of nearby seismic events in just under 3 hours, more than a factor of 19 improvement over our existing implementation. This project is one of the first major data mining and analysis tasks performed at the lab or anywhere else correlating the entire Earth's seismicity. Through the success of this project, we believe we've shown that a MapReduce solution can be appropriate for many large-scale Earth science data analysis and exploration problems. Given Hadoop's position as the dominant data analytics solution in industry, we believe Hadoop can be applied to many previously intractable Earth science problems.
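The core MapReduce idea behind a "nearby events" correlation pipeline like the one described can be sketched in a few lines. This is not the authors' Pig pipeline; it is a minimal local stand-in that shows the key trick such pipelines rely on: the map phase keys each event by a spatial grid cell so that nearby events meet at the same reducer, which then cross-correlates every pair in its cell. The grid-cell size, event schema, and toy zero-lag correlation below are all illustrative assumptions.

```python
# MapReduce-style sketch of spatial pairing for waveform correlation.
# map_event keys events by grid cell (the "shuffle key"); reduce_cell
# correlates all pairs that co-locate in a cell.
from collections import defaultdict
from itertools import combinations

def map_event(event, cell_deg=2.0):
    # Emit (cell, event); events sharing a cell are "nearby" candidates.
    # A production pipeline would also emit neighboring cells to catch
    # pairs straddling a cell boundary.
    cell = (int(event["lat"] // cell_deg), int(event["lon"] // cell_deg))
    yield cell, event

def correlate(x, y):
    # Toy zero-lag normalized cross-correlation as a stand-in.
    num = sum(xi * yi for xi, yi in zip(x, y))
    den = (sum(xi * xi for xi in x) * sum(yi * yi for yi in y)) ** 0.5
    return num / den if den else 0.0

def reduce_cell(cell, events):
    # Correlate every pair of events that landed in this cell.
    for a, b in combinations(events, 2):
        yield a["id"], b["id"], correlate(a["waveform"], b["waveform"])

def run_pipeline(events):
    # Local stand-in for the Hadoop shuffle: group map output by key,
    # then run the reducer per key.
    groups = defaultdict(list)
    for e in events:
        for cell, ev in map_event(e):
            groups[cell].append(ev)
    return [r for cell, evs in groups.items() for r in reduce_cell(cell, evs)]
```

Because the expensive pairwise correlation only happens within a cell, the work partitions cleanly across reducers, which is what lets a 160-core Hadoop cluster beat a single shared file server by the margin reported above.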
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fong, K. W.
1977-08-15
This report deals with some techniques in applied programming using the Livermore Timesharing System (LTSS) on the CDC 7600 computers at the National Magnetic Fusion Energy Computer Center (NMFECC) and the Lawrence Livermore Laboratory Computer Center (LLLCC or Octopus network). This report is based on a document originally written specifically about the system as it is implemented at NMFECC but has been revised to accommodate differences between LLLCC and NMFECC implementations. Topics include: maintaining programs, debugging, recovering from system crashes, and using the central processing unit, memory, and input/output devices efficiently and economically. Routines that aid in these procedures are mentioned. The companion report, UCID-17556, An LTSS Compendium, discusses the hardware and operating system and should be read before reading this report.
DOE Office of Scientific and Technical Information (OSTI.GOV)
East, D. R.; Sexton, J.
This was a collaborative effort between Lawrence Livermore National Security, LLC as manager and operator of Lawrence Livermore National Laboratory (LLNL) and IBM TJ Watson Research Center to research, assess feasibility and develop an implementation plan for a High Performance Computing Innovation Center (HPCIC) in the Livermore Valley Open Campus (LVOC). The ultimate goal of this work was to help advance the State of California and U.S. commercial competitiveness in the arena of High Performance Computing (HPC) by accelerating the adoption of computational science solutions, consistent with recent DOE strategy directives. The desired result of this CRADA was a well-researched, carefully analyzed market evaluation that would identify those firms in core sectors of the US economy seeking to adopt or expand their use of HPC to become more competitive globally, and to define how those firms could be helped by the HPCIC with IBM as an integral partner.
Computation Directorate 2008 Annual Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Crawford, D L
2009-03-25
Whether a computer is simulating the aging and performance of a nuclear weapon, the folding of a protein, or the probability of rainfall over a particular mountain range, the necessary calculations can be enormous. Our computers help researchers answer these and other complex problems, and each new generation of system hardware and software widens the realm of possibilities. Building on Livermore's historical excellence and leadership in high-performance computing, Computation added more than 331 trillion floating-point operations per second (teraFLOPS) of power to LLNL's computer room floors in 2008. In addition, Livermore's next big supercomputer, Sequoia, advanced ever closer to its 2011-2012 delivery date, as architecture plans and the procurement contract were finalized. Hyperion, an advanced technology cluster test bed that teams Livermore with 10 industry leaders, made a big splash when it was announced during Michael Dell's keynote speech at the 2008 Supercomputing Conference. The Wall Street Journal touted Hyperion as a 'bright spot amid turmoil' in the computer industry. Computation continues to measure and improve the costs of operating LLNL's high-performance computing systems by moving hardware support in-house, by measuring causes of outages to apply resources asymmetrically, and by automating most of the account and access authorization and management processes. These improvements enable more dollars to go toward fielding the best supercomputers for science, while operating them at less cost and greater responsiveness to the customers.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Clouse, C. J.; Edwards, M. J.; McCoy, M. G.
2015-07-07
Through its Advanced Scientific Computing (ASC) and Inertial Confinement Fusion (ICF) code development efforts, Lawrence Livermore National Laboratory (LLNL) provides a world leading numerical simulation capability for the National HED/ICF program in support of the Stockpile Stewardship Program (SSP). In addition the ASC effort provides high performance computing platform capabilities upon which these codes are run. LLNL remains committed to, and will work with, the national HED/ICF program community to help insure numerical simulation needs are met and to make those capabilities available, consistent with programmatic priorities and available resources.
Webinar: Delivering Transformational HPC Solutions to Industry
Streitz, Frederick
2018-01-16
Dr. Frederick Streitz, director of the High Performance Computing Innovation Center, discusses Lawrence Livermore National Laboratory computational capabilities and expertise available to industry in this webinar.
Algorithms and Architectures for Elastic-Wave Inversion Final Report CRADA No. TC02144.0
DOE Office of Scientific and Technical Information (OSTI.GOV)
Larsen, S.; Lindtjorn, O.
2017-08-15
This was a collaborative effort between Lawrence Livermore National Security, LLC as manager and operator of Lawrence Livermore National Laboratory (LLNL) and Schlumberger Technology Corporation (STC), to perform a computational feasibility study that investigates hardware platforms and software algorithms applicable to STC for Reverse Time Migration (RTM) / Reverse Time Inversion (RTI) of 3-D seismic data.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Blackwell, Matt; Rodger, Arthur; Kennedy, Tom
When the California Academy of Sciences created the "Earthquake: Evidence of a Restless Planet" exhibit, they called on Lawrence Livermore to help combine seismic research with the latest data-driven visualization techniques. The outcome is a series of striking visualizations of earthquakes, tsunamis and tectonic plate evolution. Seismic-wave research is a core competency at Livermore. While most often associated with earthquakes, the research has many other applications of national interest, such as nuclear explosion monitoring, explosion forensics, energy exploration, and seismic acoustics. For the Academy effort, Livermore researchers simulated the San Andreas and Hayward fault events at high resolutions. Such calculations require significant computational resources. To simulate the 1906 earthquake, for instance, visualizing 125 seconds of ground motion required over 1 billion grid points, 10,000 time steps, and 7.5 hours of processor time on 2,048 cores of Livermore's Sierra machine.
Supercomputing meets seismology in earthquake exhibit
Blackwell, Matt; Rodger, Arthur; Kennedy, Tom
2018-02-14
When the California Academy of Sciences created the "Earthquake: Evidence of a Restless Planet" exhibit, they called on Lawrence Livermore to help combine seismic research with the latest data-driven visualization techniques. The outcome is a series of striking visualizations of earthquakes, tsunamis and tectonic plate evolution. Seismic-wave research is a core competency at Livermore. While most often associated with earthquakes, the research has many other applications of national interest, such as nuclear explosion monitoring, explosion forensics, energy exploration, and seismic acoustics. For the Academy effort, Livermore researchers simulated the San Andreas and Hayward fault events at high resolutions. Such calculations require significant computational resources. To simulate the 1906 earthquake, for instance, visualizing 125 seconds of ground motion required over 1 billion grid points, 10,000 time steps, and 7.5 hours of processor time on 2,048 cores of Livermore's Sierra machine.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bennett, C. V.; Mendez, A. J.
This was a collaborative effort between Lawrence Livermore National Security, LLC (formerly The Regents of the University of California)/Lawrence Livermore National Laboratory (LLNL) and Mendez R & D Associates (MRDA) to develop and demonstrate a reconfigurable and cost effective design for optical code division multiplexing (O-CDM) with high spectral efficiency and throughput, as applied to the field of distributed computing, including multiple accessing (sharing of communication resources) and bidirectional data distribution in fiber-to-the-premise (FTTx) networks.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nikolic, R J
This month's issue has the following articles: (1) Dawn of a New Era of Scientific Discovery - Commentary by Edward I. Moses; (2) At the Frontiers of Fundamental Science Research - Collaborators from national laboratories, universities, and international organizations are using the National Ignition Facility to probe key fundamental science questions; (3) Livermore Responds to Crisis in Post-Earthquake Japan - More than 70 Laboratory scientists provided round-the-clock expertise in radionuclide analysis and atmospheric dispersion modeling as part of the nation's support to Japan following the March 2011 earthquake and nuclear accident; (4) A Comprehensive Resource for Modeling, Simulation, and Experiments - A new Web-based resource called MIDAS is a central repository for material properties, experimental data, and computer models; and (5) Finding Data Needles in Gigabit Haystacks - Livermore computer scientists have developed a novel computer architecture based on 'persistent' memory to ease data-intensive computations.
CUBE (Computer Use By Engineers) symposium abstracts. [LASL, October 4--6, 1978]
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ruminer, J.J.
1978-07-01
This report presents the abstracts for the CUBE (Computer Use by Engineers) Symposium, October 4 through 6, 1978. Contributors are from Lawrence Livermore Laboratory, Los Alamos Scientific Laboratory, and Sandia Laboratories.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
Lonnie Moore, the Computer Security Manager, CSSM/CPPM at Lawrence Livermore National Laboratory (LLNL) and Gale Warshawsky, the Coordinator for Computer Security Education & Awareness at LLNL, wanted to share topics such as computer ethics, software piracy, privacy issues, and protecting information in a format that would capture and hold an audience's attention. Four Computer Security Short Subject videos were produced which ranged from 1-3 minutes each. These videos are very effective education and awareness tools that can be used to generate discussions about computer security concerns and good computing practices. Leaders may incorporate the Short Subjects into presentations. After talking about a subject area, one of the Short Subjects may be shown to highlight that subject matter. Another method for sharing them could be to show a Short Subject first and then lead a discussion about its topic. The cast of characters and a bit of information about their personalities in the LLNL Computer Security Short Subjects is included in this report.
New design for interfacing computers to the Octopus network
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sloan, L.J.
1977-03-14
The Lawrence Livermore Laboratory has several large-scale computers which are connected to the Octopus network. Several difficulties arise in providing adequate resources along with reliable performance. To alleviate some of these problems a new method of bringing large computers into the Octopus environment is proposed.
Computations for Truck Sliding with TRUCK 3.1 Code
1989-08-01
REFERENCES: 1. Lu, William N., Hobbs, Norman P., and Atkinson, Michael. TRUCK 3.1--An Improved Digital Computer Program for Calculating the Response...
Computational Studies of X-ray Framing Cameras for the National Ignition Facility
2013-06-01
The NIF is the world's most powerful laser facility and is ... a phosphor screen where the output is recorded. The x-ray framing cameras have provided excellent information. As the yields at NIF have increased ... experiments on the NIF. The basic operation of these cameras is shown in Fig. 1. Incident photons generate photoelectrons both in the pores of the MCP and ...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Davis, G.; Mansur, D.L.; Ruhter, W.D.
1994-10-01
This report presents the details of the Lawrence Livermore National Laboratory safeguards and security program. This program is focused on developing new technology, such as x- and gamma-ray spectrometry, for measurement of special nuclear materials. This program supports the Office of Safeguards and Security in the following five areas: safeguards technology, safeguards and decision support, computer security, automated physical security, and automated visitor access control systems.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Barney, B; Shuler, J
2006-08-21
Purple is an Advanced Simulation and Computing (ASC) funded massively parallel supercomputer located at Lawrence Livermore National Laboratory (LLNL). The Purple Computational Environment documents the capabilities and the environment provided for the FY06 LLNL Level 1 General Availability Milestone. This document describes specific capabilities, tools, and procedures to support both local and remote users. The model is focused on the needs of the ASC user working in the secure computing environments at Los Alamos National Laboratory, Lawrence Livermore National Laboratory, and Sandia National Laboratories, but also documents needs of the LLNL and Alliance users working in the unclassified environment. Additionally, the Purple Computational Environment maps the provided capabilities to the Tri-lab ASC Computing Environment (ACE) Version 8.0 requirements. The ACE requirements reflect the high performance computing requirements for the General Availability user environment capabilities of the ASC community. Appendix A lists these requirements and includes a description of ACE requirements met and those requirements that are not met for each section of this document. The Purple Computing Environment, along with the ACE mappings, has been issued and reviewed throughout the Tri-lab community.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shoopman, J. D.
This report documents Livermore Computing (LC) activities in support of ASC L2 milestone 5589: Modernization and Expansion of LLNL Archive Disk Cache, due March 31, 2016. The full text of the milestone is included in Attachment 1. The description of the milestone is: Description: Configuration of archival disk cache systems will be modernized to reduce fragmentation, and new, higher capacity disk subsystems will be deployed. This will enhance archival disk cache capability for ASC archive users, enabling files written to the archives to remain resident on disk for many (6–12) months, regardless of file size. The milestone was completed in three phases. On August 26, 2015 subsystems with 6 PB of disk cache were deployed for production use in LLNL’s unclassified HPSS environment. Following that, on September 23, 2015 subsystems with 9 PB of disk cache were deployed for production use in LLNL’s classified HPSS environment. On January 31, 2016, the milestone was fully satisfied when the legacy Data Direct Networks (DDN) archive disk cache subsystems were fully retired from production use in both LLNL’s unclassified and classified HPSS environments, and only the newly deployed systems were in use.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hazi, A U
2007-02-06
Setting performance goals is part of the business plan for almost every company. The same is true in the world of supercomputers. Ten years ago, the Department of Energy (DOE) launched the Accelerated Strategic Computing Initiative (ASCI) to help ensure the safety and reliability of the nation's nuclear weapons stockpile without nuclear testing. ASCI, which is now called the Advanced Simulation and Computing (ASC) Program and is managed by DOE's National Nuclear Security Administration (NNSA), set an initial 10-year goal to obtain computers that could process up to 100 trillion floating-point operations per second (teraflops). Many computer experts thought the goal was overly ambitious, but the program's results have proved them wrong. Last November, a Livermore-IBM team received the 2005 Gordon Bell Prize for achieving more than 100 teraflops while modeling the pressure-induced solidification of molten metal. The prestigious prize, which is named for a founding father of supercomputing, is awarded each year at the Supercomputing Conference to innovators who advance high-performance computing. Recipients for the 2005 prize included six Livermore scientists--physicists Fred Streitz, James Glosli, and Mehul Patel and computer scientists Bor Chan, Robert Yates, and Bronis de Supinski--as well as IBM researchers James Sexton and John Gunnels. This team produced the first atomic-scale model of metal solidification from the liquid phase with results that were independent of system size. The record-setting calculation used Livermore's domain decomposition molecular-dynamics (ddcMD) code running on BlueGene/L, a supercomputer developed by IBM in partnership with the ASC Program. BlueGene/L reached 280.6 teraflops on the Linpack benchmark, the industry standard used to measure computing speed. As a result, it ranks first on the list of Top500 Supercomputer Sites released in November 2005.
To evaluate the performance of nuclear weapons systems, scientists must understand how materials behave under extreme conditions. Because experiments at high pressures and temperatures are often difficult or impossible to conduct, scientists rely on computer models that have been validated with obtainable data. Of particular interest to weapons scientists is the solidification of metals. "To predict the performance of aging nuclear weapons, we need detailed information on a material's phase transitions," says Streitz, who leads the Livermore-IBM team. For example, scientists want to know what happens to a metal as it changes from molten liquid to a solid and how that transition affects the material's characteristics, such as its strength.
Cross-scale MD simulations of dynamic strength of tantalum
NASA Astrophysics Data System (ADS)
Bulatov, Vasily
2017-06-01
Dislocations are ubiquitous in metals where their motion presents the dominant and often the only mode of plastic response to straining. Over the last 25 years computational prediction of plastic response in metals has relied on Discrete Dislocation Dynamics (DDD) as the most fundamental method to account for collective dynamics of moving dislocations. Here we present first direct atomistic MD simulations of dislocation-mediated plasticity that are sufficiently large and long to compute plasticity response of single crystal tantalum while tracing the underlying dynamics of dislocations in all atomistic details. Where feasible, direct MD simulations sidestep DDD altogether thus reducing uncertainties of strength predictions to those of the interatomic potential. In the specific context of shock-induced material dynamics, the same MD models predict when, under what conditions and how dislocations interact and compete with other fundamental mechanisms of dynamic response, e.g. twinning, phase-transformations, fracture. In collaboration with: Luis Zepeda-Ruiz, Lawrence Livermore National Laboratory; Alexander Stukowski, Technische Universitat Darmstadt; Tomas Oppelstrup, Lawrence Livermore National Laboratory. This work was performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344.
"TIS": An Intelligent Gateway Computer for Information and Modeling Networks. Overview.
ERIC Educational Resources Information Center
Hampel, Viktor E.; And Others
TIS (Technology Information System) is being used at the Lawrence Livermore National Laboratory (LLNL) to develop software for Intelligent Gateway Computers (IGC) suitable for the prototyping of advanced, integrated information networks. Dedicated to information management, TIS leads the user to available information resources, on TIS or…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Quirk, W.J.; Canada, J.; de Vore, L.
1994-04-01
This issue highlights the Lawrence Livermore National Laboratory's 1993 accomplishments in our mission areas and core programs: economic competitiveness, national security, energy, the environment, lasers, biology and biotechnology, engineering, physics, chemistry, materials science, computers and computing, and science and math education. Secondary topics include: nonproliferation, arms control, international security, environmental remediation, and waste management.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Taylor, Valerie
Given the significant impact of computing on society, it is important that all cultures, especially underrepresented cultures, are fully engaged in the field of computing to ensure that everyone benefits from the advances in computing. This proposal is focused on the field of high performance computing. The lack of cultural diversity in computing, in particular high performance computing, is especially evident with respect to the following ethnic groups – African Americans, Hispanics, and Native Americans – as well as People with Disabilities. The goal of this proposal is to organize and coordinate a National Laboratory Career Development Workshop focused on underrepresented cultures (ethnic cultures and disability cultures) in high performance computing. It is expected that the proposed workshop will increase the engagement of underrepresented cultures in HPC through increased exposure to the excellent work at the national laboratories. The National Laboratory Workshops are focused on the recruitment of senior graduate students and the retention of junior lab staff through the various panels and discussions at the workshop. Further, the workshop will include a community building component that extends beyond the workshop. The workshop was held at the Lawrence Livermore National Laboratory campus in Livermore, CA, from June 14-15, 2012. The grant provided funding for 25 participants from underrepresented groups. The workshop also included another 25 local participants in the summer programs at Lawrence Livermore National Laboratory. Below are some key results from the assessment of the workshop: 86% of the participants indicated strongly agree or agree to the statement "I am more likely to consider/continue a career at a national laboratory as a result of participating in this workshop." 77% indicated strongly agree or agree to the statement "I plan to pursue a summer internship at a national laboratory."
100% of the participants indicated strongly agree or agree to the statement "The CMD-IT NLPDEV workshop was a valuable experience."
LLNL NESHAPs 2015 Annual Report - June 2016
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wilson, K. R.; Gallegos, G. M.; MacQueen, D. H.
2016-06-01
Lawrence Livermore National Security, LLC operates facilities at Lawrence Livermore National Laboratory (LLNL) in which radionuclides are handled and stored. These facilities are subject to the U.S. Environmental Protection Agency (EPA) National Emission Standards for Hazardous Air Pollutants (NESHAPs) in Code of Federal Regulations (CFR) Title 40, Part 61, Subpart H, which regulates radionuclide emissions to air from Department of Energy (DOE) facilities. Specifically, NESHAPs limits the emission of radionuclides to the ambient air to levels resulting in an annual effective dose equivalent of 10 mrem (100 μSv) to any member of the public. Using measured and calculated emissions, and building-specific and common parameters, LLNL personnel applied the EPA-approved computer code, CAP88-PC, Version 4.0.1.17, to calculate the dose to the maximally exposed individual member of the public for the Livermore Site and Site 300.
DOE R&D Accomplishments Database
2002-01-01
For 50 years, Lawrence Livermore National Laboratory has been making history and making a difference. The outstanding efforts by a dedicated work force have led to many remarkable accomplishments. Creative individuals and interdisciplinary teams at the Laboratory have sought breakthrough advances to strengthen national security and to help meet other enduring national needs. The Laboratory's rich history includes many interwoven stories -- from the first nuclear test failure to accomplishments meeting today's challenges. Many stories are tied to Livermore's national security mission, which has evolved to include ensuring the safety, security, and reliability of the nation's nuclear weapons without conducting nuclear tests and preventing the proliferation and use of weapons of mass destruction. Throughout its history and in its wide range of research activities, Livermore has achieved breakthroughs in applied and basic science, remarkable feats of engineering, and extraordinary advances in experimental and computational capabilities. From the many stories to tell, one has been selected for each year of the Laboratory's history. Together, these stories give a sense of the Laboratory -- its lasting focus on important missions, dedication to scientific and technical excellence, and drive to make the world more secure and a better place to live.
Science and Technology Review, January-February 1997
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
Table of contents: accelerators at Livermore; the B-Factory and the Big Bang; assessing exposure to radiation; next generation of computer storage; and a powerful new tool to detect clandestine nuclear tests.
User's manual for a material transport code on the Octopus Computer Network
DOE Office of Scientific and Technical Information (OSTI.GOV)
Naymik, T.G.; Mendez, G.D.
1978-09-15
A code to simulate material transport through porous media was developed at Oak Ridge National Laboratory. This code has been modified and adapted for use at Lawrence Livermore Laboratory. This manual, in conjunction with report ORNL-4928, explains the input, output, and execution of the code on the Octopus Computer Network.
Brusniak, Mi-Youn; Bodenmiller, Bernd; Campbell, David; Cooke, Kelly; Eddes, James; Garbutt, Andrew; Lau, Hollis; Letarte, Simon; Mueller, Lukas N; Sharma, Vagisha; Vitek, Olga; Zhang, Ning; Aebersold, Ruedi; Watts, Julian D
2008-01-01
Background Quantitative proteomics holds great promise for identifying proteins that are differentially abundant between populations representing different physiological or disease states. A range of computational tools is now available for both isotopically labeled and label-free liquid chromatography mass spectrometry (LC-MS) based quantitative proteomics. However, these tools are generally not comparable to each other in functionality, user interface, or information input/output, and they do not readily facilitate appropriate statistical data analysis. These limitations, along with the array of choices, present a daunting prospect for biologists, and other researchers not trained in bioinformatics, who wish to use LC-MS-based quantitative proteomics. Results We have developed Corra, a computational framework and tools for discovery-based LC-MS proteomics. Corra extends and adapts existing algorithms used for LC-MS-based proteomics, along with statistical algorithms originally developed for microarray data analysis, making them appropriate for LC-MS data. Corra also adapts software engineering technologies (e.g. Google Web Toolkit, distributed processing) so that computationally intense data processing and statistical analyses can run on a remote server, while the user controls and manages the process from their own computer via a simple web interface. Corra also allows the user to output significantly differentially abundant LC-MS-detected peptide features in a form compatible with subsequent sequence identification via tandem mass spectrometry (MS/MS). We present two case studies to illustrate the application of Corra to commonly performed LC-MS-based biological workflows: a pilot biomarker discovery study of glycoproteins isolated from human plasma samples relevant to type 2 diabetes, and a study in yeast to identify in vivo targets of the protein kinase Ark1 via phosphopeptide profiling.
Conclusion The Corra computational framework leverages computational innovation to enable biologists and other researchers to process, analyze, and visualize LC-MS data that would otherwise require a complex suite of tools unfriendly to non-specialists. Corra enables appropriate statistical analyses, with controlled false-discovery rates, ultimately to inform subsequent targeted identification of differentially abundant peptides by MS/MS. For the user not trained in bioinformatics, Corra represents a complete, customizable, free and open source computational platform enabling LC-MS-based proteomic workflows, and as such, addresses an unmet need in the LC-MS proteomics field. PMID:19087345
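Statistical analysis with a controlled false-discovery rate, as the Corra abstract describes, is commonly done with the Benjamini-Hochberg step-up procedure from the microarray literature. The abstract does not state which FDR method Corra uses, so the following plain-Python sketch is illustrative only:

```python
def benjamini_hochberg(pvals, alpha=0.05):
    """Benjamini-Hochberg step-up procedure.

    Returns a list of booleans, one per input p-value, marking the
    hypotheses rejected at false-discovery rate alpha.
    """
    n = len(pvals)
    # Sort p-values while remembering their original positions.
    order = sorted(range(n), key=lambda i: pvals[i])
    # Find the largest rank k (1-based) with p_(k) <= (k/n) * alpha.
    k = 0
    for rank, idx in enumerate(order, start=1):
        if pvals[idx] <= alpha * rank / n:
            k = rank
    # Reject the k smallest p-values (step-up: all ranks below k, even
    # those that individually miss their own threshold).
    significant = [False] * n
    for idx in order[:k]:
        significant[idx] = True
    return significant
```

The step-up character matters: a p-value that misses its own per-rank threshold is still rejected if a larger p-value downstream meets its threshold.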
Join the Center for Applied Scientific Computing
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gamblin, Todd; Bremer, Timo; Van Essen, Brian
The Center for Applied Scientific Computing serves as Livermore Lab’s window to the broader computer science, computational physics, applied mathematics, and data science research communities. In collaboration with academic, industrial, and other government laboratory partners, we conduct world-class scientific research and development on problems critical to national security. CASC applies the power of high-performance computing and the efficiency of modern computational methods to the realms of stockpile stewardship, cyber and energy security, and knowledge discovery for intelligence applications.
User's manual for a two-dimensional, ground-water flow code on the Octopus computer network
DOE Office of Scientific and Technical Information (OSTI.GOV)
Naymik, T.G.
1978-08-30
A ground-water hydrology computer code, programmed by R.L. Taylor (in Proc. American Society of Civil Engineers, Journal of Hydraulics Division, 93(HY2), pp. 25-33 (1967)), has been adapted to the Octopus computer system at Lawrence Livermore Laboratory. Using an example problem, this manual details the input, output, and execution options of the code.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Willis, D. K.
2016-12-01
High performance computing (HPC) has been a defining strength of Lawrence Livermore National Laboratory (LLNL) since its founding. Livermore scientists have designed and used some of the world’s most powerful computers to drive breakthroughs in nearly every mission area. Today, the Laboratory is recognized as a world leader in the application of HPC to complex science, technology, and engineering challenges. Most importantly, HPC has been integral to the National Nuclear Security Administration’s (NNSA’s) Stockpile Stewardship Program—designed to ensure the safety, security, and reliability of our nuclear deterrent without nuclear testing. A critical factor behind Lawrence Livermore’s preeminence in HPC is the ongoing investments made by the Laboratory Directed Research and Development (LDRD) Program in cutting-edge concepts to enable efficient utilization of these powerful machines. Congress established the LDRD Program in 1991 to maintain the technical vitality of the Department of Energy (DOE) national laboratories. Since then, LDRD has been, and continues to be, an essential tool for exploring anticipated needs that lie beyond the planning horizon of our programs and for attracting the next generation of talented visionaries. Through LDRD, Livermore researchers can examine future challenges, propose and explore innovative solutions, and deliver creative approaches to support our missions. The present scientific and technical strengths of the Laboratory are, in large part, a product of past LDRD investments in HPC. Here, we provide seven examples of LDRD projects from the past decade that have played a critical role in building LLNL’s HPC, computer science, mathematics, and data science research capabilities, and describe how they have impacted LLNL’s mission.
Science & Technology Review June 2012
DOE Office of Scientific and Technical Information (OSTI.GOV)
Poyneer, L A
2012-04-20
This month's issue has the following articles: (1) A New Era in Climate System Analysis - Commentary by William H. Goldstein; (2) Seeking Clues to Climate Change - By comparing past climate records with results from computer simulations, Livermore scientists can better understand why Earth's climate has changed and how it might change in the future; (3) Finding and Fixing a Supercomputer's Faults - Livermore experts have developed innovative methods to detect hardware faults in supercomputers and help applications recover from errors that do occur; (4) Targeting Ignition - Enhancements to the cryogenic targets for National Ignition Facility experiments are furthering work to achieve fusion ignition with energy gain; (5) Neural Implants Come of Age - A new generation of fully implantable, biocompatible neural prosthetics offers hope to patients with neurological impairment; and (6) Incubator Busy Growing Energy Technologies - Six collaborations with industrial partners are using the Laboratory's high-performance computing resources to find solutions to urgent energy-related problems.
Description and use of LSODE, the Livermore Solver for Ordinary Differential Equations
NASA Technical Reports Server (NTRS)
Radhakrishnan, Krishnan; Hindmarsh, Alan C.
1993-01-01
LSODE, the Livermore Solver for Ordinary Differential Equations, is a package of FORTRAN subroutines designed for the numerical solution of the initial value problem for a system of ordinary differential equations. It is particularly well suited for 'stiff' differential systems, for which the backward differentiation formula method of orders 1 to 5 is provided. The code includes the Adams-Moulton method of orders 1 to 12, so it can be used for nonstiff problems as well. In addition, the user can easily switch methods to increase computational efficiency for problems that change character. For both methods a variety of corrector iteration techniques is included in the code. Also, to minimize computational work, both the step size and method order are varied dynamically. This report presents complete descriptions of the code and integration methods, including their implementation. It also provides a detailed guide to the use of the code, as well as an illustrative example problem.
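The stiff/nonstiff distinction above is what drives LSODE's choice between the backward differentiation formulas and the Adams-Moulton methods. As a minimal stdlib-only illustration (not LSODE itself), backward Euler, the order-1 member of the BDF family, stays stable on a stiff test equation at a step size where the explicit forward Euler method diverges:

```python
# Minimal sketch (not LSODE itself): backward Euler is the order-1 BDF method.
# On the stiff test equation y' = lam * y with lam = -1000, it remains stable
# at a step size far beyond forward Euler's stability limit of 2/|lam|.

lam = -1000.0
h = 0.01                              # forward Euler would need h < 0.002 here
y_be = y_fe = 1.0                     # same initial condition for both methods
for _ in range(100):                  # integrate from t = 0 to t = 1
    y_be = y_be / (1.0 - h * lam)     # implicit (backward Euler) update
    y_fe = y_fe + h * lam * y_fe      # explicit (forward Euler) update

# y_be has decayed toward zero like the true solution exp(lam * t);
# y_fe has blown up, which is why implicit BDF methods suit stiff problems.
```

LSODE's production implementation adds the variable step size, variable order, and corrector iteration machinery the report describes; the sketch shows only the stability property that motivates the implicit family.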
DOE Office of Scientific and Technical Information (OSTI.GOV)
Alchorn, A L
Thank you for your interest in the activities of the Lawrence Livermore National Laboratory Computation Directorate. This collection of articles from the Laboratory's Science & Technology Review highlights the most significant computational projects, achievements, and contributions during 2002. In 2002, LLNL marked the 50th anniversary of its founding. Scientific advancement in support of our national security mission has always been the core of the Laboratory. So that researchers could better understand and predict complex physical phenomena, the Laboratory has pushed the limits of the largest, fastest, most powerful computers in the world. In the late 1950s, Edward Teller--one of the LLNL founders--proposed that the Laboratory commission a Livermore Advanced Research Computer (LARC) built to Livermore's specifications. He tells the story of being in Washington, DC, when John Von Neumann asked to talk about the LARC. Von Neumann thought Teller wanted too much memory in the machine. (The specifications called for 20-30,000 words.) Teller was too smart to argue with him. Later Teller invited Von Neumann to the Laboratory and showed him one of the design codes being prepared for the LARC. He asked Von Neumann for suggestions on fitting the code into 10,000 words of memory, and flattered him about "Labbies" not being smart enough to figure it out. Von Neumann dropped his objections, and the LARC arrived with 30,000 words of memory. Memory, and how close memory is to the processor, is still of interest to us today. Livermore's first supercomputer was the Remington-Rand Univac-1. It had 5600 vacuum tubes and was 2 meters wide by 4 meters long. This machine was commonly referred to as a 1 KFlop machine [E+3]. Skip ahead 50 years. The ASCI White machine at the Laboratory today, produced by IBM, is rated at a peak performance of 12.3 TFlops or E+13.
We've improved computer processing power by 10 orders of magnitude in 50 years, and I do not believe there's any reason to think we won't improve another 10 orders of magnitude in the next 50 years. For years I have heard talk of hitting the physical limits of Moore's Law, but new technologies will take us into the next phase of computer processing power such as 3-D chips, molecular computing, quantum computing, and more. Big computers are icons or symbols of the culture and larger infrastructure that exists at LLNL to guide scientific discovery and engineering development. We have dealt with balance issues for 50 years and will continue to do so in our quest for a digital proxy of the properties of matter at extremely high temperatures and pressures. I believe that the next big computational win will be the merger of high-performance computing with information management. We already create terabytes--soon to be petabytes--of data. Efficiently storing, finding, visualizing and extracting data and turning that into knowledge which aids decision-making and scientific discovery is an exciting challenge. In the meantime, please enjoy this retrospective on computational physics, computer science, advanced software technologies, and applied mathematics performed by programs and researchers at LLNL during 2002. It offers a glimpse into the stimulating world of computational science in support of the national missions and homeland defense.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bertsch, Adam; Draeger, Erik; Richards, David
2017-01-12
With Sequoia at Lawrence Livermore National Laboratory, researchers explore grand challenge problems and are generating results at scales never before achieved. Sequoia is the first computer to have more than one million processors and is one of the fastest supercomputers in the world.
First-Principles Equation of State and Shock Compression of Warm Dense Aluminum and Hydrocarbons
NASA Astrophysics Data System (ADS)
Driver, Kevin; Soubiran, Francois; Zhang, Shuai; Militzer, Burkhard
2017-10-01
Theoretical studies of warm dense plasmas are a key component of progress in fusion science, defense science, and astrophysics programs. Path integral Monte Carlo (PIMC) and density functional theory molecular dynamics (DFT-MD), two state-of-the-art, first-principles, electronic-structure simulation methods, provide a consistent description of plasmas over a wide range of density and temperature conditions. Here, we combine high-temperature PIMC data with lower-temperature DFT-MD data to compute coherent equations of state (EOS) for aluminum and hydrocarbon plasmas. Subsequently, we derive shock Hugoniot curves from these EOSs and extract the temperature-density evolution of plasma structure and ionization behavior from pair-correlation function analyses. Since PIMC and DFT-MD accurately treat effects of atomic shell structure, we find compression maxima along Hugoniot curves attributed to K-shell and L-shell ionization, which provide a benchmark for widely used EOS tables, such as SESAME and LEOS, and more efficient models. LLNL-ABS-734424. Funding provided by the DOE (DE-SC0010517) and in part under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344. Computational resources provided by Blue Waters (NSF ACI1640776) and NERSC. K. Driver's and S. Zhang's current address is Lawrence Livermore Natl. Lab, Livermore, CA, 94550, USA.
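Deriving a shock Hugoniot curve from a computed EOS, as described above, conventionally means locating states on the EOS surface that satisfy the Rankine-Hugoniot jump conditions. The energy balance is the textbook relation (not a formula specific to this work):

```latex
% Rankine-Hugoniot energy condition relating a shocked state (E, P, V)
% to the unshocked initial state (E_0, P_0, V_0):
E - E_0 = \tfrac{1}{2}\,(P + P_0)\,(V_0 - V)
```

Sweeping temperature at fixed composition and solving this condition for density traces out the Hugoniot curve whose compression maxima the abstract attributes to K-shell and L-shell ionization.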
2005 White Paper on Institutional Capability Computing Requirements
DOE Office of Scientific and Technical Information (OSTI.GOV)
Carnes, B; McCoy, M; Seager, M
This paper documents the need for a significant increase in the computing infrastructure provided to scientists working in the unclassified domains at Lawrence Livermore National Laboratory (LLNL). This need could be viewed as the next step in a broad strategy outlined in the January 2002 White Paper (UCRL-ID-147449) that bears essentially the same name as this document. Therein we wrote: 'This proposed increase could be viewed as a step in a broader strategy linking hardware evolution to applications development that would take LLNL unclassified computational science to a position of distinction if not preeminence by 2006.' This position of distinction has certainly been achieved. This paper provides a strategy for sustaining this success but will diverge from its 2002 predecessor in that it will: (1) Amplify the scientific and external success LLNL has enjoyed because of the investments made in 2002 (MCR, 11 TF) and 2004 (Thunder, 23 TF). (2) Describe in detail the nature of additional investments that are important to meet both the institutional objectives of advanced capability for breakthrough science and the scientists' clearly stated request for adequate capacity and more rapid access to moderate-sized resources. (3) Put these requirements in the context of an overall strategy for simulation science and external collaboration. While our strategy for Multiprogrammatic and Institutional Computing (M&IC) has worked well, three challenges must be addressed to assure and enhance our position. The first is that while we now have over 50 important classified and unclassified simulation codes available for use by our computational scientists, we find ourselves coping with high demand for access and long queue wait times. This point was driven home in the 2005 Institutional Computing Executive Group (ICEG) 'Report Card' to the Deputy Director for Science and Technology (DDST) Office and Computation Directorate management.
The second challenge is related to the balance that should be maintained in the simulation environment. With the advent of Thunder, the institution directed a change in course from past practice. Instead of making Thunder available to the large body of scientists, as was MCR, and effectively using it as a capacity system, the intent was to make it available to perhaps ten projects so that these teams could run very aggressive problems for breakthrough science. This usage model established Thunder as a capability system. The challenge this strategy raises is that the majority of scientists have not seen an improvement in capacity computing resources since MCR, thus creating significant tension in the system. The question then is: 'How do we address the institution's desire to maintain the potential for breakthrough science and also meet the legitimate requests from the ICEG to achieve balance?' Both the capability and the capacity environments must be addressed through this one procurement. The third challenge is to reach out more aggressively to the national science community to encourage access to LLNL resources as part of a strategy for sharpening our science through collaboration. Related to this, LLNL has been unable in the past to provide access for sensitive foreign nationals (SFNs) to the Livermore Computing (LC) unclassified 'yellow' network. Identifying some mechanism for data sharing between LLNL computational scientists and SFNs would be a first practical step in fostering cooperative, collaborative relationships with an important and growing sector of the American science community.
US Department of Energy High School Student Supercomputing Honors Program: A follow-up assessment
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1987-01-01
The US DOE High School Student Supercomputing Honors Program was designed to recognize high school students with superior skills in mathematics and computer science and to provide them with formal training and experience with advanced computer equipment. This document reports on the participants who attended the first such program, which was held at the National Magnetic Fusion Energy Computer Center at the Lawrence Livermore National Laboratory (LLNL) during August 1985.
Dynamic Fracture Simulations of Explosively Loaded Cylinders
DOE Office of Scientific and Technical Information (OSTI.GOV)
Arthur, Carly W.; Goto, D. M.
2015-11-30
This report documents the modeling results of high explosive experiments investigating dynamic fracture of steel (AerMet® 100 alloy) cylinders. The experiments were conducted at Lawrence Livermore National Laboratory (LLNL) during 2007 to 2008 [10]. A principal objective of this study was to gain an understanding of dynamic material failure through the analysis of hydrodynamic computer code simulations. Two-dimensional and three-dimensional computational cylinder models were analyzed using the ALE3D multi-physics computer code.
2004-10-01
Monitoring agency: Defense Advanced Research Projects Agency, AFRL/IFTC, 3701 North Fairfax Drive… Cited: "Scalable Parallel Libraries for Large-Scale Concurrent Applications," Technical Report UCRL-JC-109251, Lawrence Livermore National Laboratory.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chrzanowski, P; Walter, K
For the Laboratory and staff, 2006 was a year of outstanding achievements. As our many accomplishments in this annual report illustrate, the Laboratory's focus on important problems that affect our nation's security and our researchers' breakthroughs in science and technology have led to major successes. As a national laboratory that is part of the Department of Energy's National Nuclear Security Administration (DOE/NNSA), Livermore is a key contributor to the Stockpile Stewardship Program for maintaining the safety, security, and reliability of the nation's nuclear weapons stockpile. The program has been highly successful, and our annual report features some of the Laboratory's significant stockpile stewardship accomplishments in 2006. A notable example is a long-term study with Los Alamos National Laboratory, which found that weapon pit performance will not sharply degrade from the aging effects on plutonium. The conclusion was based on a wide range of nonnuclear experiments, detailed simulations, theoretical advances, and thorough analyses of the results of past nuclear tests. The study was a superb scientific effort. The continuing success of stockpile stewardship enabled NNSA in 2006 to lay out Complex 2030, a vision for a transformed nuclear weapons complex that is more responsive, cost efficient, and highly secure. One of the ways our Laboratory will help lead this transformation is through the design and development of reliable replacement warheads (RRWs). Compared to current designs, these warheads would have enhanced performance margins and security features and would be less costly to manufacture and maintain in a smaller, modernized production complex. In early 2007, NNSA selected Lawrence Livermore and Sandia National Laboratories-California to develop "RRW-1" for the U.S. Navy.
Design efforts for the RRW, the plutonium aging work, and many other stockpile stewardship accomplishments rely on computer simulations performed on NNSA's Advanced Simulation and Computing (ASC) Program supercomputers at Livermore. ASC Purple and BlueGene/L, the world's fastest computer, together provide nearly a half petaflop (500 trillion operations per second) of computer power for use by the three NNSA national laboratories. Livermore-led teams were awarded the Gordon Bell Prize for Peak Performance in both 2005 and 2006. The winning simulations, run on BlueGene/L, investigated the properties of materials at the length and time scales of atomic interactions. The computing power that makes possible such detailed simulations provides unprecedented opportunities for scientific discovery. Laboratory scientists are meeting the extraordinary challenge of creating experimental capabilities to match the resolution of supercomputer simulations. Working with a wide range of collaborators, we are developing experimental tools that gather better data at the nanometer and subnanosecond scales. Applications range from imaging biomolecules to studying matter at extreme conditions of pressure and temperature. The premier high-energy-density experimental physics facility in the world will be the National Ignition Facility (NIF) when construction is completed in 2009. We are leading the national effort to perform the first fusion ignition experiments using NIF's 192-beam laser and prepare to explore some of the remaining important issues in weapons physics. With scientific colleagues from throughout the nation, we are also designing revolutionary experiments on NIF to advance the fields of astrophysics, planetary physics, and materials science. Mission-directed, multidisciplinary science and technology at Livermore is also focused on reducing the threat posed by the proliferation of weapons of mass destruction as well as their acquisition and use by terrorists. 
The Laboratory helps this important national effort by providing its unique expertise, integration analyses, and operational support to the Department of Homeland Security. For this vital facet of the Laboratory's national security mission, we are developing advanced technologies, such as a pocket-size explosives detector and an airborne persistent surveillance system, both of which earned R&D 100 Awards. Altogether, Livermore won seven R&D 100 Awards in 2006, the most for any organization. Emerging threats to national and global security go beyond defense and homeland security. Livermore pursues major scientific and technical advances to meet the need for a clean environment; clean, abundant energy; better water management; and improved human health. Our annual report highlights the link between human activities and the warming of tropical oceans, as well as techniques for imaging biological molecules and detecting bone cancer in its earliest stages. In addition, we showcase many scientific discoveries: distant planets, the composition of comets, a new superheavy element.
ERIC Educational Resources Information Center
Havas, George D.
This brief guide to materials in the Library of Congress (LC) on computer aided design and/or computer aided manufacturing lists reference materials and other information sources under 13 headings: (1) brief introductions; (2) LC subject headings used for such materials; (3) textbooks; (4) additional titles; (5) glossaries and handbooks; (6)…
Spherical harmonic results for the 3D Kobayashi Benchmark suite
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brown, P N; Chang, B; Hanebutte, U R
1999-03-02
Spherical harmonic solutions are presented for the Kobayashi benchmark suite. The results were obtained with Ardra, a scalable, parallel neutron transport code developed at Lawrence Livermore National Laboratory (LLNL). The calculations were performed on the IBM ASCI Blue-Pacific computer at LLNL.
Computer Aided Self-Forging Fragment Design,
1978-06-01
This value is reached so quickly that HEMP solutions using work hardening and those using only elastic-perfectly plastic formulations are quite… Elastic-Plastic Flow, UCRL-7322, Lawrence Radiation Laboratory, Livermore, California (1969). 4. Giroux, E. D., HEMP Users Manual, UCRL-51079… Laboratory, the HEMP computer code has been developed to serve as an effective design tool to simplify this task considerably. Using this code, warheads…
Jack Rabbit Pretest Data For TATB Based IHE Model Development
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hart, M M; Strand, O T; Bosson, S T
The Jack Rabbit Pretest series consisted of five focused hydrodynamic experiments: 2021E PT3, PT4, PT5, PT6, and PT7. They were fired in March and April of 2008 at the Contained Firing Facility, Site 300, Lawrence Livermore National Laboratory, Livermore, California. These experiments measured dead-zone formation and impulse gradients created during the detonation of TATB-based insensitive high explosive. This document contains reference data tables for all five experiments. These data tables include: (1) measured laser velocimetry of the experiment diagnostic plate; (2) computed diagnostic plate profile contours through velocity integration; (3) computed center-axis pressures through velocity differentiation. All times are in microseconds, referenced from detonator circuit current start. All dimensions are in millimeters. Schematic axisymmetric cross sections are shown for each experiment. These schematics detail the materials used and the dimensions of the experiment and component parts. The data should allow anyone to evaluate a TATB-based insensitive-high-explosive detonation model against experiment. These data are particularly relevant in examining reactive flow detonation model predictions in computational simulation of dead-zone formation and the resulting impulse gradients produced by detonating TATB-based explosive.
Experimental Studies of Very-High Mach Number Hydrodynamics
1994-02-14
Buckingham, Lawrence Livermore National Laboratory, Livermore, California; Ira Kohlberg, Kohlberg Associates, Inc., Alexandria, Virginia; Plasma Physics Division, Naval Research Laboratory, Washington, DC 20375, USA.
Science & Technology Review: September 2016
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vogt, Ramona L.; Meissner, Caryn N.; Chinn, Ken B.
2016-09-30
This is the September issue of the Lawrence Livermore National Laboratory's Science & Technology Review, which communicates, to a broad audience, the Laboratory’s scientific and technological accomplishments in fulfilling its primary missions. This month, there are features on "Laboratory Investments Drive Computational Advances" and "Laying the Groundwork for Extreme-Scale Computing." Research highlights include "Nuclear Data Moves into the 21st Century", "Peering into the Future of Lick Observatory", and "Facility Drives Hydrogen Vehicle Innovations."
76 FR 28305 - Amendment of Class D and Class E Airspace; Livermore, CA
Federal Register 2010, 2011, 2012, 2013, 2014
2011-05-17
... E airspace at Livermore, CA, to accommodate aircraft using new Instrument Landing System (ILS... surface of the earth. * * * * * AWP CA E5 Livermore, CA [Amended] Livermore Municipal Airport, CA (Lat. 37...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Theys, M.
1994-05-06
Beamlet is a high-power laser currently being built at Lawrence Livermore National Laboratory as a proof of concept for the National Ignition Facility (NIF). Beamlet is testing several areas of laser advancement, such as a 37-cm Pockels cell, a square amplifier, and propagation of a square beam. The diagnostics on Beamlet tell the operators how much energy the beam has in different locations, the pulse shape, the energy distribution, and other important information regarding the beam. This information is being used to evaluate new amplifier designs and extrapolate performance to the NIF laser. In my term at Lawrence Livermore National Laboratory I have designed and built a diagnostic, calibrated instruments used on diagnostics, set up instruments, hooked up communication lines to the instruments, and set up computers to control specific diagnostics.
Computational Methods for Crashworthiness
NASA Technical Reports Server (NTRS)
Noor, Ahmed K. (Compiler); Carden, Huey D. (Compiler)
1993-01-01
Presentations and discussions from the joint UVA/NASA Workshop on Computational Methods for Crashworthiness held at Langley Research Center on 2-3 Sep. 1992 are included. The presentations addressed activities in the area of impact dynamics. Workshop attendees represented NASA, the Army and Air Force, the Lawrence Livermore and Sandia National Laboratories, the aircraft and automotive industries, and academia. The workshop objectives were to assess the state-of-technology in the numerical simulation of crash and to provide guidelines for future research.
Programming for 1.6 Million cores: Early experiences with IBM's BG/Q SMP architecture
NASA Astrophysics Data System (ADS)
Glosli, James
2013-03-01
With the stall in clock cycle improvements a decade ago, the drive for computational performance has continued along a path of increasing core counts on a processor. The multi-core evolution has been expressed in both symmetric multiprocessor (SMP) architectures and CPU/GPU architectures. Debates rage in the high performance computing (HPC) community over which architecture best serves HPC. In this talk I will not attempt to resolve that debate but perhaps fuel it. I will discuss the experience of exploiting Sequoia, a 98,304-node IBM Blue Gene/Q SMP at Lawrence Livermore National Laboratory. The advantages and challenges of leveraging the computational power of BG/Q will be detailed through the discussion of two applications. The first application is a Molecular Dynamics code called ddcMD. This is a code developed over the last decade at LLNL and ported to BG/Q. The second application is a cardiac modeling code called Cardioid. This is a code that was recently designed and developed at LLNL to exploit the fine-scale parallelism of BG/Q's SMP architecture. Through the lenses of these efforts I'll illustrate the need to rethink how we express and implement our computational approaches. This work was performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344.
Improved Algorithms Speed It Up for Codes
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hazi, A
2005-09-20
Huge computers, huge codes, complex problems to solve. The longer it takes to run a code, the more it costs. One way to speed things up and save time and money is through hardware improvements--faster processors, different system designs, bigger computers. But another side of supercomputing can reap savings in time and speed: software improvements to make codes--particularly the mathematical algorithms that form them--run faster and more efficiently. Speed up math? Is that really possible? According to Livermore physicist Eugene Brooks, the answer is a resounding yes. ''Sure, you get great speed-ups by improving hardware,'' says Brooks, the deputy leader for Computational Physics in N Division, which is part of Livermore's Physics and Advanced Technologies (PAT) Directorate. ''But the real bonus comes on the software side, where improvements in software can lead to orders of magnitude improvement in run times.'' Brooks knows whereof he speaks. Working with Laboratory physicist Abraham Szoeke and others, he has been instrumental in devising ways to shrink the running time of what has, historically, been a tough computational nut to crack: radiation transport codes based on the statistical or Monte Carlo method of calculation. And Brooks is not the only one. Others around the Laboratory, including physicists Andrew Williamson, Randolph Hood, and Jeff Grossman, have come up with innovative ways to speed up Monte Carlo calculations using pure mathematics.
LLL Octopus network: some lessons and future directions
DOE Office of Scientific and Technical Information (OSTI.GOV)
Watson, R.W.
1978-06-27
The Octopus network, designed and developed by the Lawrence Livermore Laboratory, is a pioneering, high-performance, local computer network. Several lessons derived from the 14 years of experience in the evolution of Octopus are described, and some of the directions to be taken in its medium-term future are indicated. 3 figures.
2008-02-01
Livermore, California. 32. Martini, K. (1996a). "Research in the out-of-plane behavior of unreinforced masonry." Ancient Reconstruction of the Pompeii Forum. School of Architecture, University of Virginia.
Rarefaction Wave Eliminator Concepts For A Large Blast/Thermal Simulator.
1985-02-01
...hard copies of the pressure-time records. Final data processing was completed with the computer, printer, and plotter. Plots of pressure-time records...
Livermore Site Spill Prevention, Control, and Countermeasures Plan, May 2017
DOE Office of Scientific and Technical Information (OSTI.GOV)
Griffin, D.; Mertesdorf, E.
This Spill Prevention, Control, and Countermeasure (SPCC) Plan describes the measures that are taken at Lawrence Livermore National Laboratory’s (LLNL) Livermore Site in Livermore, California, to prevent, control, and handle potential spills from aboveground containers that can contain 55 gallons or more of oil.
Emulating a million machines to investigate botnets.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rudish, Donald W.
2010-06-01
Researchers at Sandia National Laboratories in Livermore, California are creating what is in effect a vast digital petri dish able to hold one million operating systems at once in an effort to study the behavior of rogue programs known as botnets. Botnets are used extensively by malicious computer hackers to steal computing power from Internet-connected computers. The hackers harness the stolen resources into a scattered but powerful computer that can be used to send spam, execute phishing scams, or steal digital information. These remote-controlled 'distributed computers' are difficult to observe and track. Botnets may take over parts of tens of thousands or in some cases even millions of computers, making them among the world's most powerful computers for some applications.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hazi, A
According to the results from a Livermore computer model, some of the small change jingling in your pocket contains zinc and copper created in massive gamma-ray bursts (GRBs) that rank as the most impressive light shows in the universe. Livermore astrophysicist Jason Pruet and his colleagues Rebecca Surman and Gail McLaughlin from North Carolina State University (NCSU) reported on their calculations in the February 20, 2004, issue of ''Astrophysical Journal Letters''. They found that GRBs from black holes surrounded by a disk of dense, hot plasma may have contributed heavily to the galactic inventory of elements such as calcium, scandium, titanium, zinc, and copper. ''A typical GRB of this kind briefly outshines all the stars in millions of galaxies combined'', says Pruet. ''Plus it makes about 100 times as much of some common elements as an ordinary supernova''.
Reduced Chemical Kinetic Mechanisms for Hydrocarbon Fuels
2006-01-01
Technologies; Reaction Engineering International, 77 West 200 South, Suite 210, Salt Lake City, UT 84101; Professor, Department of Mechanical Engineering, University of California, Berkeley, Berkeley, CA 94720; Program Leader for Computational Chemistry, Lawrence Livermore National Laboratory...species by the error introduced by assuming they are in quasi-steady state. The reduced mechanisms have been compared to detailed chemistry calculations...
Emission Measurements of Ultracell XX25 Reformed Methanol Fuel Cell System
2008-06-01
...combination of electrochemical devices such as a fuel cell and battery. Polymer electrolyte membrane fuel cells (PEMFC) using hydrogen or liquid...communications and computers, sensors, and night vision capabilities. High temperature PEMFC offers some advantages such as enhanced electrode kinetics and better...tolerance of carbon monoxide that will poison the conventional PEMFC. Ultracell Corporation, Livermore, California has developed a first...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vrieling, P. Douglas
2016-01-01
The Livermore Valley Open Campus (LVOC), a joint initiative of the National Nuclear Security Administration (NNSA), Lawrence Livermore National Laboratory (LLNL), and Sandia National Laboratories (SNL), enhances the national security missions of NNSA by promoting greater collaboration between world-class scientists at the national security laboratories and their partners in industry and academia. Strengthening the science, technology, and engineering (ST&E) base of our nation is one of the NNSA's top goals. By conducting coordinated and collaborative programs, LVOC enhances both the NNSA and the broader national science and technology base, and helps to ensure the health of core capabilities at LLNL and SNL. These capabilities must remain strong to enable the laboratories to execute their primary mission for NNSA.
DOE Office of Scientific and Technical Information (OSTI.GOV)
McCallen, R; Salari, K; Ortega, J
2003-05-01
A Working Group Meeting on Heavy Vehicle Aerodynamic Drag was held at Lawrence Livermore National Laboratory on May 29-30, 2003. The purpose of the meeting was to present and discuss suggested guidance and direction for the design of drag reduction devices determined from experimental and computational studies. Representatives from the Department of Energy (DOE)/Office of Energy Efficiency and Renewable Energy/Office of FreedomCAR & Vehicle Technologies, Lawrence Livermore National Laboratory (LLNL), Sandia National Laboratories (SNL), NASA Ames Research Center (NASA), University of Southern California (USC), California Institute of Technology (Caltech), Georgia Tech Research Institute (GTRI), Argonne National Laboratory (ANL), Clarkson University, and PACCAR participated in the meeting. This report contains the technical presentations (viewgraphs) delivered at the Meeting, briefly summarizes the comments and conclusions, provides some highlighted items, and outlines the future action items.
G2LC: Resources Autoscaling for Real Time Bioinformatics Applications in IaaS.
Hu, Rongdong; Liu, Guangming; Jiang, Jingfei; Wang, Lixin
2015-01-01
Cloud computing has started to change the way bioinformatics research is carried out. Researchers who have taken advantage of this technology can process larger amounts of data and speed up scientific discovery. The variability in data volume results in variable computing requirements. Therefore, bioinformatics researchers are pursuing more reliable and efficient methods for conducting sequencing analyses. This paper proposes an automated resource provisioning method, G2LC, for bioinformatics applications in IaaS. It enables an application to output results in a real-time manner. Its main purpose is to guarantee application performance while improving resource utilization. Real sequence searching data of BLAST are used to evaluate the effectiveness of G2LC. Experimental results show that G2LC guarantees application performance while saving up to 20.14% of resources.
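The abstract does not spell out G2LC's provisioning rule, so the following is only a generic illustration of the kind of decision loop an IaaS autoscaler makes; the function name, thresholds, and bounds are all hypothetical, not G2LC's actual algorithm.

```python
# Illustrative threshold-based autoscaler (NOT the actual G2LC method).
# Scales a worker pool up when the observed load per worker exceeds a
# target, and down when it falls well below, within min/max bounds.

def scale_decision(pending_tasks, workers, target_per_worker=4,
                   min_workers=1, max_workers=32):
    """Return the worker count for the next provisioning interval."""
    if workers <= 0:
        return min_workers
    load = pending_tasks / workers
    if load > target_per_worker:            # falling behind: add capacity
        desired = -(-pending_tasks // target_per_worker)  # ceil division
    elif load < target_per_worker / 2:      # over-provisioned: shrink
        desired = max(-(-pending_tasks // target_per_worker), min_workers)
    else:
        desired = workers                   # within band: hold steady
    return max(min_workers, min(desired, max_workers))

print(scale_decision(40, 4))   # heavy backlog: grow the pool to 10
print(scale_decision(2, 8))    # nearly idle: shrink to 1
```

A real autoscaler would also smooth over several intervals to avoid oscillation; that is omitted here for brevity.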
G2LC: Resources Autoscaling for Real Time Bioinformatics Applications in IaaS
Hu, Rongdong; Liu, Guangming; Jiang, Jingfei; Wang, Lixin
2015-01-01
Cloud computing has started to change the way bioinformatics research is carried out. Researchers who have taken advantage of this technology can process larger amounts of data and speed up scientific discovery. The variability in data volume results in variable computing requirements. Therefore, bioinformatics researchers are pursuing more reliable and efficient methods for conducting sequencing analyses. This paper proposes an automated resource provisioning method, G2LC, for bioinformatics applications in IaaS. It enables an application to output results in a real-time manner. Its main purpose is to guarantee application performance while improving resource utilization. Real sequence searching data of BLAST are used to evaluate the effectiveness of G2LC. Experimental results show that G2LC guarantees application performance while saving up to 20.14% of resources. PMID:26504488
Cross Domain Deterrence: Livermore Technical Report, 2014-2016
DOE Office of Scientific and Technical Information (OSTI.GOV)
Barnes, Peter D.; Bahney, Ben; Matarazzo, Celeste
2016-08-03
Lawrence Livermore National Laboratory (LLNL) is an original collaborator on the project titled "Deterring Complex Threats: The Effects of Asymmetry, Interdependence, and Multi-polarity on International Strategy" (CDD Project), led by the UC Institute on Global Conflict and Cooperation at UCSD under PIs Jon Lindsay and Erik Gartzke, and funded through the DoD Minerva Research Initiative. In addition to participating in workshops and facilitating interaction among UC social scientists, LLNL is leading the computational modeling effort and assisting with empirical case studies to probe the viability of analytic, modeling, and data analysis concepts. This report summarizes LLNL work on the CDD Project to date, primarily in Project Years 1-2, corresponding to Federal fiscal year 2015. LLNL brings two unique domains of expertise to bear on this Project: (1) access to scientific expertise on the technical dimensions of emerging threat technology, and (2) high performance computing (HPC) expertise, required for analyzing the complexity of bargaining interactions in the envisioned threat models. In addition, we have a small group of researchers trained as social scientists who are intimately familiar with International Relations research. We find that pairing simulation scientists, who are typically trained in computer science, with domain experts, social scientists in this case, is the most effective route to developing powerful new simulation tools capable of representing domain concepts accurately and answering challenging questions in the field.
Harvey Mudd 2014-2015 Computer Science Conduit Clinic Final Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Aspesi, G; Bai, J; Deese, R
2015-05-12
Conduit, a new open-source library developed at Lawrence Livermore National Laboratory, provides a C++ application programming interface (API) to describe and access scientific data. Conduit's primary use is for in-memory data exchange in high performance computing (HPC) applications. Our team tested and improved Conduit to make it more appealing to potential adopters in the HPC community. We extended Conduit's capabilities by prototyping four libraries: one for parallel communication using MPI, one for I/O functionality, one for aggregating performance data, and one for data visualization.
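The core idea behind such a hierarchical data-description API is values addressed by slash-separated paths in a tree of nodes. The sketch below is NOT Conduit's actual C++ API, only a Python illustration of that access pattern; the `Node` class and mesh paths are invented for this example.

```python
# Hypothetical path-addressed node tree, illustrating the style of
# hierarchical in-memory data description used by libraries like Conduit.
class Node:
    def __init__(self):
        self.children = {}
        self.value = None

    def __setitem__(self, path, value):
        node = self
        for part in path.split("/"):          # walk/create the tree
            node = node.children.setdefault(part, Node())
        node.value = value

    def __getitem__(self, path):
        node = self
        for part in path.split("/"):          # walk existing nodes
            node = node.children[part]
        return node.value

mesh = Node()
mesh["coords/x"] = [0.0, 1.0, 2.0]
mesh["coords/y"] = [0.0, 0.5, 1.0]
mesh["fields/pressure"] = [101.3, 99.8, 100.1]
print(mesh["coords/x"])  # [0.0, 1.0, 2.0]
```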
DOE Office of Scientific and Technical Information (OSTI.GOV)
F. Cui; F.J. Presuel-Moreno; R.G. Kelly
2005-10-13
The ability of a SS316L surface wetted with a thin electrolyte layer to serve as an effective cathode for an active localized corrosion site was studied computationally. The dependence of the total net cathodic current, I{sub net}, supplied at the repassivation potential E{sub rp} (of the anodic crevice) on relevant physical parameters including water layer thickness (WL), chloride concentration ([Cl{sup -}]) and length of cathode (Lc) were investigated using a three-level, full factorial design. The effects of kinetic parameters including the exchange current density (i{sub o,c}) and Tafel slope ({beta}{sub c}) of oxygen reduction, the anodic passive current density (i{sub p}) (on the cathodic surface), and E{sub rp} were studied as well using three-level full factorial designs of [Cl{sup -}] and Lc with a fixed WL of 25 {micro}m. The study found that all three parameters WL, [Cl{sup -}] and Lc as well as the interactions Lc x WL and Lc x [Cl{sup -}] had significant impact on I{sub net}. A five-factor regression equation was obtained which fits the computation results reasonably well, but demonstrated that interactions are more complicated than can be explained with a simple linear model. Significant effects on I{sub net} were found upon varying either i{sub o,c}, {beta}{sub c}, or E{sub rp}, whereas i{sub p} in the studied range was found to have little impact. It was observed that I{sub net} asymptotically approached maximum values (I{sub max}) when Lc increased to critical minimum values. I{sub max} can be used to determine the stability of coupled localized corrosion and the critical Lc provides important information for experimental design and corrosion protection.
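A three-level full factorial design of the kind described above simply enumerates every combination of coded factor levels. This sketch uses the abstract's factor names but assumed -1/0/+1 codings (the paper's actual physical level values are not given here), producing the 3^3 = 27-run layout.

```python
# Three-level full factorial design: every combination of three factors
# at three coded levels (-1 = low, 0 = mid, +1 = high) -> 27 runs.
from itertools import product

factors = {
    "WL": [-1, 0, 1],   # water layer thickness (coded level)
    "Cl": [-1, 0, 1],   # chloride concentration (coded level)
    "Lc": [-1, 0, 1],   # cathode length (coded level)
}

runs = [dict(zip(factors, levels)) for levels in product(*factors.values())]
print(len(runs))           # 27 runs
print(runs[0])             # first run: all factors at their low level
print(runs[-1])            # last run: all factors at their high level
```

Fitting a regression with main effects plus Lc x WL and Lc x [Cl-] interaction terms to responses measured at these 27 points is the analysis step the abstract describes.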
Federal Register 2010, 2011, 2012, 2013, 2014
2010-01-29
...: Lawrence Livermore National Laboratory. Location: Livermore, California. Job Titles and/or Job Duties: All... L. Hinnefeld, Interim Director, Office of Compensation Analysis and Support, National Institute for...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Matzke, Melissa M.; Brown, Joseph N.; Gritsenko, Marina A.
2013-02-01
Liquid chromatography coupled with mass spectrometry (LC-MS) is widely used to identify and quantify peptides in complex biological samples. In particular, label-free shotgun proteomics is highly effective for the identification of peptides and subsequently obtaining a global protein profile of a sample. As a result, this approach is widely used for discovery studies. Typically, the objective of these discovery studies is to identify proteins that are affected by some condition of interest (e.g. disease, exposure). However, for complex biological samples, label-free LC-MS proteomics experiments measure peptides and do not directly yield protein quantities. Thus, protein quantification must be inferred from one or more measured peptides. In recent years, many computational approaches to relative protein quantification of label-free LC-MS data have been published. In this review, we examine the most commonly employed quantification approaches to relative protein abundance from peak intensity values, evaluate their individual merits, and discuss challenges in the use of the various computational approaches.
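As a concrete illustration of inferring a relative protein quantity from its measured peptides, here is a minimal "top-n" rollup, one commonly used family of strategies; the peptide intensities below are invented, and this is not claimed to be any specific tool's implementation.

```python
# Minimal "top-n" rollup: a protein's relative abundance is taken as the
# mean intensity of its n most intense peptides. Data are invented.
from collections import defaultdict

def top_n_rollup(peptide_intensities, n=3):
    """peptide_intensities: list of (protein_id, intensity) pairs.
    Returns {protein_id: mean of its n most intense peptides}."""
    by_protein = defaultdict(list)
    for protein, intensity in peptide_intensities:
        by_protein[protein].append(intensity)
    return {p: sum(sorted(v, reverse=True)[:n]) / min(len(v), n)
            for p, v in by_protein.items()}

peptides = [("P1", 100.0), ("P1", 80.0), ("P1", 10.0), ("P1", 5.0),
            ("P2", 50.0)]
for protein, value in sorted(top_n_rollup(peptides).items()):
    print(protein, round(value, 2))  # P1 63.33, then P2 50.0
```

Summing all peptides, or taking the median, are equally simple alternatives; the review's point is that each choice has different robustness to shared and mis-identified peptides.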
Overview of the LINCS architecture
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fletcher, J.G.; Watson, R.W.
1982-01-13
Computing at the Lawrence Livermore National Laboratory (LLNL) has evolved over the past 15 years into a computer-network-based resource-sharing environment. The increasing use of low-cost and high-performance micro, mini and midi computers and commercially available local networking systems will accelerate this trend. Further, even the large-scale computer systems, on which much of the LLNL scientific computing depends, are evolving into multiprocessor systems. It is our belief that the most cost-effective use of this environment will depend on the development of application systems structured into cooperating concurrent program modules (processes) distributed appropriately over different nodes of the environment. A node is defined as one or more processors with a local (shared) high-speed memory. Given the latter view, the environment can be characterized as consisting of: multiple nodes communicating over noisy channels with arbitrary delays and throughput, heterogeneous base resources and information encodings, no single administration controlling all resources, distributed system state, and no uniform time base. The system design problem is how to turn the heterogeneous base hardware/firmware/software resources of this environment into a coherent set of resources that facilitate development of cost-effective, reliable, and human-engineered applications. We believe the answer lies in developing a layered, communication-oriented distributed system architecture; layered and modular to support ease of understanding, reconfiguration, extensibility, and hiding of implementation or nonessential local details; communication-oriented because that is a central feature of the environment. The Livermore Interactive Network Communication System (LINCS) is a hierarchical architecture designed to meet the above needs. While having characteristics in common with other architectures, it differs in several respects.
Annotation: a computational solution for streamlining metabolomics analysis
Domingo-Almenara, Xavier; Montenegro-Burke, J. Rafael; Benton, H. Paul; Siuzdak, Gary
2017-01-01
Metabolite identification is still considered an imposing bottleneck in liquid chromatography mass spectrometry (LC/MS) untargeted metabolomics. The identification workflow usually begins with detecting relevant LC/MS peaks via peak-picking algorithms and retrieving putative identities based on accurate mass searching. However, accurate mass search alone provides poor evidence for metabolite identification. For this reason, computational annotation is used to reveal the underlying metabolites' monoisotopic masses, improving putative identification in addition to confirmation with tandem mass spectrometry. This review examines LC/MS data from a computational and analytical perspective, focusing on the occurrence of neutral losses and in-source fragments, to understand the challenges in computational annotation methodologies. Herein, we examine the state-of-the-art strategies for computational annotation including: (i) peak grouping or full scan (MS1) pseudo-spectra extraction, i.e., clustering all mass spectral signals stemming from each metabolite; (ii) annotation using ion adduction and mass distance among ion peaks; (iii) incorporation of biological knowledge such as biotransformations or pathways; (iv) tandem MS data; and (v) metabolite retention time calibration, usually achieved by prediction from molecular descriptors. Advantages and pitfalls of each of these strategies are discussed, as well as expected future trends in computational annotation. PMID:29039932
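Strategy (ii), annotation by mass distance among ion peaks, can be sketched as follows: two co-eluting peaks are consistent with the same metabolite if, after subtracting known adduct masses, they imply the same neutral monoisotopic mass. The adduct masses below are standard values; the two-peak example (a glucose-like neutral mass) and the function names are illustrative, not from any particular annotation tool.

```python
# Sketch of adduct-based annotation: find peak pairs explained by one
# neutral monoisotopic mass under two different (singly charged,
# positive-mode) adducts.
PROTON = 1.007276  # Da

ADDUCTS = {"[M+H]+": PROTON, "[M+Na]+": 22.989218, "[M+NH4]+": 18.033823}

def consistent_adducts(peaks, tol=0.005):
    """Return (mz_a, adduct_a, mz_b, adduct_b, neutral_mass) for every
    peak pair whose implied neutral masses agree within tol Da."""
    hits = []
    for i, a in enumerate(peaks):
        for b in peaks[i + 1:]:
            for name_a, delta_a in ADDUCTS.items():
                for name_b, delta_b in ADDUCTS.items():
                    if name_a != name_b and abs((a - delta_a) - (b - delta_b)) < tol:
                        hits.append((a, name_a, b, name_b, a - delta_a))
    return hits

# Two co-eluting peaks consistent with a neutral mass near 180.0634 Da
peaks = [181.0707, 203.0526]
for a, name_a, b, name_b, m in consistent_adducts(peaks):
    print(f"{a} as {name_a} and {b} as {name_b} -> neutral mass {m:.4f}")
```

Real annotation tools additionally require the peaks to share a retention time window and correlated intensity profiles, which this sketch omits.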
A computer method of finding valuations forcing validity of LC formulae
NASA Astrophysics Data System (ADS)
Godlewski, Łukasz; Świetorzecka, Kordula; Mulawka, Jan
2014-11-01
The purpose of this paper is to present the computer implementation of a system known as LC temporal logic [1]. Firstly, to become familiar with some theoretical issues, a short introduction to this logic is given. Algorithms allowing a deep analysis of the formulae of LC logic are considered. In particular we discuss how to determine whether a formula is a tautology, a countertautology, or satisfiable. Next, we show how to find all valuations that satisfy the formula. Finally, we consider finding histories generated by the formula and transforming these histories into a state machine. Moreover, a description of the experiments that verify the implementation is briefly presented.
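In the plain propositional case, the classification just described (tautology, countertautology, or merely satisfiable, plus the list of satisfying valuations) reduces to enumerating all 2^n valuations. This sketch does exactly that, with the formula given as a Python predicate; the LC-specific temporal machinery of the paper is not reproduced here.

```python
# Brute-force valuation search: classify a propositional formula and
# return every valuation that satisfies it.
from itertools import product

def classify(formula, variables):
    """formula: callable taking {var: bool}. Returns (kind, satisfying
    valuations), where kind is 'tautology', 'countertautology', or
    'satisfiable'."""
    satisfying = []
    for values in product([False, True], repeat=len(variables)):
        valuation = dict(zip(variables, values))
        if formula(valuation):
            satisfying.append(valuation)
    if len(satisfying) == 2 ** len(variables):
        kind = "tautology"
    elif not satisfying:
        kind = "countertautology"
    else:
        kind = "satisfiable"
    return kind, satisfying

# p -> (q -> p), a classical tautology, written with 'not'/'or'
kind, _ = classify(lambda v: (not v["p"]) or ((not v["q"]) or v["p"]),
                   ["p", "q"])
print(kind)  # tautology
```

Enumeration is exponential in the number of variables, which is why practical implementations (like the one the paper describes) need pruning or symbolic techniques beyond this sketch.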
27 CFR 9.46 - Livermore Valley.
Code of Federal Regulations, 2011 CFR
2011-04-01
....” (b) Approved maps. The appropriate maps for determining the boundary of the Livermore Valley... 1980); (12) Hayward, CA (1993); and (13) Las Trampas Ridge, CA (1995). (c) Boundary. The Livermore... miles, passing through the Dublin map near Walpert Ridge, onto the Hayward map to the point where the...
27 CFR 9.46 - Livermore Valley.
Code of Federal Regulations, 2010 CFR
2010-04-01
....” (b) Approved maps. The appropriate maps for determining the boundary of the Livermore Valley... 1980); (12) Hayward, CA (1993); and (13) Las Trampas Ridge, CA (1995). (c) Boundary. The Livermore... miles, passing through the Dublin map near Walpert Ridge, onto the Hayward map to the point where the...
03-NIF Dedication: Norm Pattiz
DOE Office of Scientific and Technical Information (OSTI.GOV)
Norm Pattiz
2009-07-02
The National Ignition Facility, the world's largest laser system, was dedicated at a ceremony on May 29, 2009 at Lawrence Livermore National Laboratory. These are the remarks by Norm Pattiz, the chairman of Lawrence Livermore National Security, which manages Lawrence Livermore National Laboratory for the U.S. Department of Energy.
03-NIF Dedication: Norm Pattiz
Norm Pattiz
2017-12-09
The National Ignition Facility, the world's largest laser system, was dedicated at a ceremony on May 29, 2009 at Lawrence Livermore National Laboratory. These are the remarks by Norm Pattiz, the chairman of Lawrence Livermore National Security, which manages Lawrence Livermore National Laboratory for the U.S. Department of Energy.
Science & Technology Review November 2002
DOE Office of Scientific and Technical Information (OSTI.GOV)
Budil, K
This month's issue of Science and Technology Review has the following articles: (1) High-Tech Help for Fighting Wildfires--Commentary by Leland W. Younker; (2) This Model Can Take the Heat--A physics-based simulation program to combat wildfires combines the capabilities and resources of Lawrence Livermore and Los Alamos national laboratories. (3) The Best and the Brightest Come to Livermore--The Lawrence Fellowship Program attracts the most sought-after postdoctoral researchers to the Laboratory. (4) A View to Kill--Livermore sensors are aimed at the ''kill'' vehicle when it intercepts an incoming ballistic missile. (5) 50th Anniversary Highlight--Biological Research Evolves at Livermore--Livermore's biological research program keeps pace with emerging national issues, from studying the effects of ionizing radiation to detecting agents of biological warfare.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stanton, Courtney; Kuo, I-F W.; Mundy, Christopher J.
2007-11-01
Despite decades of study, the mechanism of orotidine-5'-monophosphate decarboxylase (ODCase) remains unresolved. A computational investigation of the direct decarboxylation mechanism has been performed using mixed quantum mechanical/molecular mechanical (QM/MM) dynamics simulations. The study was performed with the program CP2K, which integrates classical dynamics and ab initio dynamics based on the Born-Oppenheimer approach. Two different QM regions were explored. It was found that the size of the QM region has a dramatic effect on the calculated reaction barrier. The free energy barriers for decarboxylation of orotidine-5'-monophosphate (OMP) in solution and in the enzyme were determined with the metadynamics method to be 40 kcal/mol and 33 kcal/mol, respectively. The calculated change in activation free energy (ΔΔG‡) on going from solution to the enzyme is therefore -7 kcal/mol, far less than the experimental change of -23 kcal/mol (for kcat/kuncat; Radzicka, A.; Wolfenden, R. Science 1995, 267, 90-92). These results do not support the direct decarboxylation mechanism in the enzyme. Funding was provided by the University of California Lawrence Livermore National Laboratory (LLNL) and the National Institutes of Health (NIH). Part of this work was performed under the auspices of the U.S. Department of Energy by LLNL under contract No. W-7405-Eng-48. Computer resources were provided by Livermore Computing.
LLNL Partners with IBM on Brain-Like Computing Chip
DOE Office of Scientific and Technical Information (OSTI.GOV)
Van Essen, Brian
Lawrence Livermore National Laboratory (LLNL) will receive a first-of-a-kind brain-inspired supercomputing platform for deep learning developed by IBM Research. Based on a breakthrough neurosynaptic computer chip called IBM TrueNorth, the scalable platform will process the equivalent of 16 million neurons and 4 billion synapses and consume the energy equivalent of a hearing aid battery – a mere 2.5 watts of power. The brain-like, neural network design of the IBM Neuromorphic System is able to infer complex cognitive tasks such as pattern recognition and integrated sensory processing far more efficiently than conventional chips.
Early years of Computational Statistical Mechanics
NASA Astrophysics Data System (ADS)
Mareschal, Michel
2018-05-01
Evidence that a model of hard spheres exhibits a first-order solid-fluid phase transition was provided in the late fifties by two new numerical techniques known as Monte Carlo and Molecular Dynamics. This result can be considered as the starting point of computational statistical mechanics: at the time, it was a confirmation of a counter-intuitive (and controversial) theoretical prediction by J. Kirkwood. It necessitated an intensive collaboration between the Los Alamos team, with Bill Wood developing the Monte Carlo approach, and the Livermore group, where Berni Alder was inventing Molecular Dynamics. This article tells how it happened.
User's guide for FRMOD, a zero dimensional FRM burn code
DOE Office of Scientific and Technical Information (OSTI.GOV)
Driemeryer, D.; Miley, G.H.
1979-10-15
The zero-dimensional FRM plasma burn code, FRMOD, is written in FORTRAN and is currently available on the Control Data Corporation (CDC) 7600 computer at the Magnetic Fusion Energy Computer Center (MFECC), sponsored by the US Department of Energy, in Livermore, CA. This guide assumes that the user is familiar with the system architecture and some of the utility programs available on the MFE-7600 machine. Online documentation for system routines is available through the DOCUMENT utility, and users may refer to it for answers to system-related questions.
The electromechanical battery: The new kid on the block
DOE Office of Scientific and Technical Information (OSTI.GOV)
Post, R.F.
1993-08-01
In a funded program at the Lawrence Livermore National Laboratory, new materials and novel designs are being incorporated into a new approach to an old concept: flywheel energy storage. Modular devices, dubbed "electromechanical batteries" (EMBs), are being developed that should represent an important alternative to the electrochemical storage battery for use in electric vehicles or for stationary applications such as computer back-up power or utility load-leveling.
Crashworthiness simulation of composite automotive structures
DOE Office of Scientific and Technical Information (OSTI.GOV)
Botkin, M E; Johnson, N L; Simunovic, S
1998-06-01
In 1990 the Automotive Composites Consortium (ACC) began the investigation of crashworthiness simulation methods for composite materials. A contract was given to Livermore Software Technology Corporation (LSTC) to implement a new damage model in LS-DYNA3D specifically for composite structures. This model is in LS-DYNA3D and is in use by the ACC partners. In 1994 USCAR, a partnership of American auto companies, entered into a partnership called SCAAP (Super Computing Automotive Applications Partnership) for the express purpose of working with the national laboratories on computationally oriented research. A CRADA (Cooperative Research and Development Agreement) was signed with Lawrence Livermore National Laboratory, Oak Ridge National Laboratory, Sandia National Laboratory, Argonne National Laboratory, and Los Alamos National Laboratory to work in three distinctly different technical areas, one of which was composite material modeling for crashworthiness. Each laboratory was assigned a specific modeling task. The ACC was responsible for the technical direction of the composites project and provided all test data for code verification. All new models were to be implemented in DYNA3D and periodically distributed to all partners for testing. Several new models have been developed and implemented, and excellent agreement has been shown between tube crush simulations and experiments.
1990-05-30
…phase HPLC using an IBM Instruments Inc. model LC 9533 ternary liquid chromatograph attached to a model F9522 fixed UV module and a model F9523… …acid analyses are done by separation and quantitation of phenylthiocarbamyl amino acid derivatives using a second IBM model LC 9533 ternary liquid… …computer which controls the HPLC and an IBM Instruments Inc. model LC 9505 automatic sampler. The hemoglobin present in the effluent from large…
Lawrence Livermore National Laboratory Environmental Report 2014
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jones, H. E.; Bertoldo, N. A.; Blake, R. G.
The purposes of the Lawrence Livermore National Laboratory Environmental Report 2014 are to record Lawrence Livermore National Laboratory’s (LLNL’s) compliance with environmental standards and requirements, describe LLNL’s environmental protection and remediation programs, and present the results of environmental monitoring at the two LLNL sites—the Livermore Site and Site 300. The report is prepared for the U.S. Department of Energy (DOE) by LLNL’s Environmental Functional Area. Submittal of the report satisfies requirements under DOE Order 231.1B, “Environment, Safety and Health Reporting,” and DOE Order 458.1, “Radiation Protection of the Public and Environment.”
Lawrence Livermore National Laboratory Environmental Report 2015
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rosene, C. A.; Jones, H. E.
The purposes of the Lawrence Livermore National Laboratory Environmental Report 2015 are to record Lawrence Livermore National Laboratory’s (LLNL’s) compliance with environmental standards and requirements, describe LLNL’s environmental protection and remediation programs, and present the results of environmental monitoring at the two LLNL sites—the Livermore Site and Site 300. The report is prepared for the U.S. Department of Energy (DOE) by LLNL’s Environmental Functional Area. Submittal of the report satisfies requirements under DOE Order 231.1B, “Environment, Safety and Health Reporting,” and DOE Order 458.1, “Radiation Protection of the Public and Environment.”
NASA Technical Reports Server (NTRS)
Huang, Xinchuan; Fortenberry, Ryan Clifton; Lee, Timothy J.
2013-01-01
Very recently, molecular rotational transitions observed in the photon-dominated region of the Horsehead nebula have been attributed to l-C3H+. In an effort to corroborate this finding, we employed state-of-the-art and proven high-accuracy quantum chemical techniques to compute spectroscopic constants for this cation and its isotopologues. Even though the B rotational constant from the fit of the observed spectrum and our computations agree to within 20 MHz, a typical level of accuracy, the D rotational constant differs by more than 40%, while the H rotational constant differs by three orders of magnitude. With the likely errors in the rotational transition energies resulting from this difference in D on the order of 1 MHz for the lowest observed transition (J = 4 → 3) and growing as J increases, the assignment of the observed rotational lines from the Horsehead nebula to l-C3H+ is questionable.
Improved detonation modeling with CHEETAH
DOE Office of Scientific and Technical Information (OSTI.GOV)
Heller, A.
1997-11-01
A Livermore software program called CHEETAH, an important, even indispensable tool for energetic materials researchers worldwide, was made more powerful in the summer of 1997 with the release of CHEETAH 2.0, an advanced version that simulates a wider variety of detonations. Derived from more than 40 years of experiments on high explosives at Lawrence Livermore and Los Alamos national laboratories, CHEETAH predicts the results from detonating a mixture of specified reactants. It operates by solving thermodynamic equations to predict detonation products and such properties as temperature, pressure, volume, and total energy released. The code is prized by synthesis chemists and other researchers because it allows them to vary the starting molecules and conditions to optimize the desired performance properties. One of the Laboratory's most popular computer codes, CHEETAH is used at more than 200 sites worldwide, including ones in England, Canada, Sweden, Switzerland, and France. Most sites are defense-related, although a few users, such as Japanese fireworks researchers, are in the civilian sector.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gutmacher, R.; Crawford, R.
This comprehensive guide to the analytical capabilities of Lawrence Livermore Laboratory's General Chemistry Division describes each analytical method in terms of its principle, field of application, and qualitative and quantitative uses. Also described are the state and quantity of sample required for analysis, processing time, available instrumentation, and responsible personnel.
Big Data: Next-Generation Machines for Big Science
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hack, James J.; Papka, Michael E.
Addressing the scientific grand challenges identified by the US Department of Energy's (DOE's) Office of Science's programs alone demands a total leadership-class computing capability of 150 to 400 Pflops by the end of this decade. The successors to three of the DOE's most powerful leadership-class machines are set to arrive in 2017 and 2018: the products of the Collaboration Oak Ridge Argonne Livermore (CORAL) initiative, a national laboratory-industry design/build approach to engineering next-generation petascale computers for grand challenge science. These mission-critical machines will enable discoveries in key scientific fields such as energy, biotechnology, nanotechnology, materials science, and high-performance computing, and serve as a milestone on the path to deploying exascale computing capabilities.
Effects of Peripapillary Scleral Stiffening on the Deformation of the Lamina Cribrosa
Coudrillier, Baptiste; Campbell, Ian C.; Read, A. Thomas; Geraldes, Diogo M.; Vo, Nghia T.; Feola, Andrew; Mulvihill, John; Albon, Julie; Abel, Richard L.; Ethier, C. Ross
2016-01-01
Purpose: Scleral stiffening has been proposed as a treatment for glaucoma to protect the lamina cribrosa (LC) from excessive intraocular pressure-induced deformation. Here we experimentally evaluated the effects of moderate stiffening of the peripapillary sclera on the deformation of the LC. Methods: An annular sponge, saturated with 1.25% glutaraldehyde, was applied to the external surface of the peripapillary sclera for 5 minutes to stiffen the sclera. Tissue deformation was quantified in two groups of porcine eyes, using digital image correlation (DIC) or computed tomography imaging and digital volume correlation (DVC). In group A (n = 14), eyes were subjected to inflation testing before and after scleral stiffening. Digital image correlation was used to measure scleral deformation and quantify the magnitude of scleral stiffening. In group B (n = 5), the optic nerve head region was imaged using synchrotron radiation phase-contrast microcomputed tomography (PC μCT) at an isotropic spatial resolution of 3.2 μm. Digital volume correlation was used to compute the full-field three-dimensional deformation within the LC and evaluate the effects of peripapillary scleral cross-linking on LC biomechanics. Results: On average, scleral treatment with glutaraldehyde caused a 34 ± 14% stiffening of the peripapillary sclera measured at 17 mm Hg and a 47 ± 12% decrease in the maximum tensile strain in the LC measured at 15 mm Hg. The reduction in LC strains was not due to cross-linking of the LC. Conclusions: Peripapillary scleral stiffening is effective at reducing the magnitude of biomechanical strains within the LC. Its potential and future utilization in glaucoma axonal neuroprotection requires further investigation. PMID:27183053
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hattem, M V; Paterson, L; Woollett, J
2008-08-20
Sixty-five surveys were completed in 2002 to assess the current distribution of special status amphibians at the Lawrence Livermore National Laboratory's (LLNL) Livermore Site and Site 300. Combined with historical information from previous years, the information presented herein illustrates the dynamic conditions and probable risks that amphibian populations face at both sites. The Livermore Site is developed, in stark contrast to the mostly undeveloped Site 300, yet both sites have significant issues threatening the long-term sustainability of their respective amphibian populations. Livermore Site amphibians face a suite of challenges inherent to urban interfaces, most notably the bullfrog (Rana catesbeiana), while Site 300's erosion issues and periodic feral pig (Sus scrofa) infestations reduce and threaten populations. The long-term sustainability of LLNL's special status amphibians will require active management and resource commitment to maintain and restore amphibian habitat at both sites.
A Microwave Method for Measuring Moisture Content, Density, and Grain Angle of Wood.
1985-03-01
Livermore, CA 94550. James, William L; Yen , You - Hsin ; King, Ray J. A microwave method for measuring moisture content,density, and grain angle of wood...Note S FPL-0250 March 1985 Density, and Grain 8 Angle of Wood William L. James, Physicist Forest Products Laboratory, Madison, WI You -Hain Yen ... Yen . You -1tsin. Microwave electromagnetic nondestructive testing of wood in real- time. Madison. WI: Department of Electronic and Computer
Progress report on Nuclear Density project with Lawrence Livermore National Lab Year 2010
DOE Office of Scientific and Technical Information (OSTI.GOV)
Johnson, C W; Krastev, P; Ormand, W E
2011-03-11
The main goal for year 2010 was to improve parallelization of the configuration interaction code BIGSTICK, co-written by W. Erich Ormand (LLNL) and Calvin W. Johnson (SDSU), with the parallelization carried out primarily by Plamen Krastev, a postdoc at SDSU and funded in part by this grant. The central computational algorithm is the Lanczos algorithm, which consists of a matrix-vector multiplication (matvec), followed by a Gram-Schmidt reorthogonalization.
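The Lanczos algorithm described above (a matrix-vector multiplication followed by Gram-Schmidt reorthogonalization) can be sketched in a few lines. The following is a minimal illustrative NumPy version with full reorthogonalization; the `lanczos` helper and its signature are assumptions for illustration, not BIGSTICK's parallel implementation:

```python
import numpy as np

def lanczos(matvec, v0, k):
    """Minimal Lanczos iteration with full Gram-Schmidt reorthogonalization.

    matvec: function computing A @ v for a symmetric matrix A
    v0: starting vector; k: number of Lanczos vectors to build
    Returns the k x k tridiagonal matrix T and the orthonormal basis V.
    """
    n = v0.size
    V = np.zeros((k, n))
    alpha = np.zeros(k)       # diagonal of T
    beta = np.zeros(k - 1)    # off-diagonal of T
    V[0] = v0 / np.linalg.norm(v0)
    for j in range(k):
        w = matvec(V[j])                      # the matvec step
        alpha[j] = V[j] @ w
        # full Gram-Schmidt reorthogonalization against all previous vectors
        for i in range(j + 1):
            w -= (V[i] @ w) * V[i]
        if j + 1 < k:
            beta[j] = np.linalg.norm(w)
            V[j + 1] = w / beta[j]
    T = np.diag(alpha) + np.diag(beta, 1) + np.diag(beta, -1)
    return T, V
```

The extreme eigenvalues of the small tridiagonal matrix T converge quickly to those of A, which is why the method suits large configuration interaction matrices where only matvec products are affordable.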
’Do-It-Yourself’ Fallout/Blast Shelter Evaluation
1984-03-01
Lawrence Livermore National Laboratory; Unclassified. …the data from the transient recorder memory through the Computer Automated Measurement and Control (CAMAC) data bus and stores them on an 8-inch… Command and Control Technical Center, Department of Defense, The Pentagon; Emergency Technology Division, Oak Ridge National Laboratory.
2003-06-01
…Albuquerque, NM, 1992. Dobratz, B. M. LLNL Explosives Handbook; UCRL-5299; Lawrence Livermore Laboratory: Livermore, CA, 1981. Geiger, W.; Honcia, G.… Lee, E. L.; Hornig, H. C.; Kury, J. W. Adiabatic Expansion of High Explosive Detonation Products; UCRL-50422; Lawrence Livermore National Laboratory.
Federal Register 2010, 2011, 2012, 2013, 2014
2012-04-11
... DEPARTMENT OF ENERGY Federal Energy Regulatory Commission [Docket No. ER12-1432-000] ReEnergy Livermore Falls LLC; Supplemental Notice That Revised Market-Based Rate Tariff Filing Includes Request for Blanket Section 204 Authorization This is a supplemental notice in the above-referenced proceeding of ReEnergy Livermore Falls LLC's tariff...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gharibyan, N.
In order to fully characterize the NIF neutron spectrum, the SAND-II-SNL software was requested and received from the Radiation Safety Information Computational Center. The software is designed to determine the neutron energy spectrum through analysis of experimental activation data. However, given that the source code was developed on a SPARCstation 10, it is not compatible with current versions of FORTRAN. Accounts have been established through Lawrence Livermore National Laboratory's High Performance Computing resources in order to access different compilers for FORTRAN (e.g., pgf77, pgf90). Additionally, several of the subroutines included in the SAND-II-SNL package have required debugging efforts to allow for proper compiling of the code.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gerhard, M.A.; Sommer, S.C.
1995-04-01
AUTOCASK (AUTOmatic Generation of 3-D CASK models) is a microcomputer-based system of computer programs and databases developed at the Lawrence Livermore National Laboratory (LLNL) for the structural analysis of shipping casks for radioactive material. Model specification is performed on the microcomputer, and the analyses are performed on an engineering workstation or mainframe computer. AUTOCASK is based on 80386/80486 compatible microcomputers. The system is composed of a series of menus, input programs, display programs, a mesh generation program, and archive programs. All data is entered through fill-in-the-blank input screens that contain descriptive data requests.
Livermore Big Artificial Neural Network Toolkit
DOE Office of Scientific and Technical Information (OSTI.GOV)
Essen, Brian Van; Jacobs, Sam; Kim, Hyojin
2016-07-01
LBANN is a toolkit designed to train artificial neural networks efficiently on high performance computing architectures. It is optimized to take advantage of key high performance computing features to accelerate neural network training: specifically, low-latency, high-bandwidth interconnects, node-local NVRAM, node-local GPU accelerators, and high-bandwidth parallel file systems. It is built on top of the open source Elemental distributed-memory dense and sparse-direct linear algebra and optimization library, which is released under the BSD license. The algorithms contained within LBANN are drawn from the academic literature and implemented to work within a distributed-memory framework.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hazi, A
2005-09-20
Institutions that conduct research similar or complementary to that of Lawrence Livermore National Laboratory often excel through collaboration. Indeed, much of Lawrence Livermore's research involves collaboration with other institutions, including universities, other national laboratories, government agencies, and private industry. In particular, Livermore's strategic collaborations with other University of California (UC) campuses have proven exceptionally successful in combining basic science and applied multidisciplinary research. In joint projects, the collaborating institutions benefit from sharing expertise and resources as they work toward their distinctive missions in education, research, and public service. As Laboratory scientists and engineers identify resources needed to conduct their work, they often turn to university researchers with complementary expertise. Successful projects can expand in scope to include additional scientists and engineers both from the Laboratory and from UC, and these projects may become an important element of the research portfolios of the cognizant Livermore directorate and the university department. Additional funding may be provided to broaden or deepen a research project or perhaps develop it for transfer to the private sector for commercial release. Occasionally, joint projects evolve into a strategic collaboration at the institutional level, attracting the attention of the Laboratory director and the UC chancellor. Government agencies or private industries may contribute funding in recognition of the potential payoff of the joint research, and a center may be established at one of the UC campuses. Livermore scientists and engineers and UC faculty are recruited to these centers to focus on a particular area and achieve goals through interdisciplinary research. Some of these researchers hold multilocation appointments, allowing them to work at Livermore and another UC campus.
Such centers also attract postdoctoral researchers and graduate students pursuing careers in the centers' specialized areas of science. Another way the Laboratory fosters university collaboration is through its institutes, which have been established to focus university outreach efforts in fields of scientific importance to Livermore's programs and missions. Some of these joint projects may grow to the level of a strategic collaboration. Others may assist in Livermore's national security mission; provide a recruiting pipeline from universities to the Laboratory; or enhance university interactions and the vitality of Livermore's science and technology environment through seminars, workshops, and visitor programs.
Varying execution discipline to increase performance
DOE Office of Scientific and Technical Information (OSTI.GOV)
Campbell, P.L.; Maccabe, A.B.
1993-12-22
This research investigates the relationship between execution discipline and performance. The hypothesis has two parts: 1. Different execution disciplines exhibit different performance for different computations, and 2. These differences can be effectively predicted by heuristics. A machine model is developed that can vary its execution discipline. That is, the model can execute a given program using either the control-driven, data-driven, or demand-driven execution discipline. This model is referred to as a "variable-execution-discipline" machine. The instruction set for the model is the Program Dependence Web (PDW). The first part of the hypothesis will be tested by simulating the execution of the machine model on a suite of computations, based on the Livermore Fortran Kernel (LFK) Test (a.k.a. the Livermore Loops), using all three execution disciplines. Heuristics are developed to predict relative performance. These heuristics predict (a) the execution time under each discipline for one iteration of each loop and (b) the number of iterations taken by that loop; then the heuristics use those predictions to develop a prediction for the execution of the entire loop. Similar calculations are performed for branch statements. The second part of the hypothesis will be tested by comparing the results of the simulated execution with the predictions produced by the heuristics. If the hypothesis is supported, then the door is open for the development of machines that can vary execution discipline to increase performance.
LC-MSsim – a simulation software for liquid chromatography mass spectrometry data
Schulz-Trieglaff, Ole; Pfeifer, Nico; Gröpl, Clemens; Kohlbacher, Oliver; Reinert, Knut
2008-01-01
Background: Mass spectrometry coupled to liquid chromatography (LC-MS) is commonly used to analyze the protein content of biological samples in large-scale studies. The data resulting from an LC-MS experiment is huge, highly complex, and noisy. Accordingly, it has sparked new developments in bioinformatics, especially in the fields of algorithm development, statistics, and software engineering. In a quantitative label-free mass spectrometry experiment, crucial steps are the detection of peptide features in the mass spectra and the alignment of samples by correcting for shifts in retention time. At the moment, it is difficult to compare the plethora of algorithms for these tasks. So far, curated benchmark data exists only for peptide identification algorithms, but no data that represents a ground truth for the evaluation of feature detection, alignment, and filtering algorithms. Results: We present LC-MSsim, a simulation software for LC-ESI-MS experiments. It simulates ESI spectra on the MS level. It reads a list of proteins from a FASTA file and digests the protein mixture using a user-defined enzyme. The software creates an LC-MS data set using a predictor for the retention time of the peptides and a model for peak shapes and elution profiles of the mass spectral peaks. Our software also offers the possibility to add contaminants and to change the background noise level, and includes a model for the detectability of peptides in mass spectra. After the simulation, LC-MSsim writes the simulated data to mzData, a public XML format. The software also stores the positions (monoisotopic m/z and retention time) and ion counts of the simulated ions in separate files. Conclusion: LC-MSsim generates simulated LC-MS data sets and incorporates models for peak shapes and contaminations. Algorithm developers can match the results of feature detection and alignment algorithms against the simulated ion lists, and meaningful error rates can be computed.
We anticipate that LC-MSsim will be useful to the wider community to perform benchmark studies and comparisons between computational tools. PMID:18842122
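The benchmarking idea described in the conclusion (matching detected features against the simulated ion lists to compute error rates) can be sketched as follows. This is a minimal illustration; the function name and tolerance values are assumptions, not part of LC-MSsim:

```python
def match_features(detected, truth, mz_tol=0.01, rt_tol=30.0):
    """Greedily match detected (m/z, retention time) features to simulated
    ground-truth ions within the given tolerances.

    detected, truth: lists of (mz, rt) tuples.
    Returns (true_positives, false_positives, false_negatives).
    """
    unmatched = list(truth)
    tp = 0
    for mz, rt in detected:
        # find the first unmatched ground-truth ion within both tolerances
        hit = next((t for t in unmatched
                    if abs(t[0] - mz) <= mz_tol and abs(t[1] - rt) <= rt_tol),
                   None)
        if hit is not None:
            unmatched.remove(hit)
            tp += 1
    fp = len(detected) - tp   # detections with no ground-truth partner
    fn = len(unmatched)       # simulated ions the algorithm missed
    return tp, fp, fn
```

From these counts, precision (tp / (tp + fp)) and recall (tp / (tp + fn)) follow directly, which is exactly the kind of meaningful error rate a simulated ground truth makes possible.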
Computation Directorate Annual Report 2003
DOE Office of Scientific and Technical Information (OSTI.GOV)
Crawford, D L; McGraw, J R; Ashby, S F
Big computers are icons: symbols of the culture, and of the larger computing infrastructure that exists at Lawrence Livermore. Through the collective effort of Laboratory personnel, they enable scientific discovery and engineering development on an unprecedented scale. For more than three decades, the Computation Directorate has supplied the big computers that enable the science necessary for Laboratory missions and programs. Livermore supercomputing is uniquely mission driven. The high-fidelity weapon simulation capabilities essential to the Stockpile Stewardship Program compel major advances in weapons codes and science, compute power, and computational infrastructure. Computation's activities align with this vital mission of the Department of Energy. Increasingly, non-weapons Laboratory programs also rely on computer simulation. World-class achievements have been accomplished by LLNL specialists working in multi-disciplinary research and development teams. In these teams, Computation personnel employ a wide array of skills, from desktop support expertise, to complex applications development, to advanced research. Computation's skilled professionals make the Directorate the success that it has become. These individuals know the importance of the work they do and the many ways it contributes to Laboratory missions. They make appropriate and timely decisions that move the entire organization forward. They make Computation a leader in helping LLNL achieve its programmatic milestones. I dedicate this inaugural Annual Report to the people of Computation in recognition of their continuing contributions. I am proud that we perform our work securely and safely. Despite increased cyber attacks on our computing infrastructure from the Internet, advanced cyber security practices ensure that our computing environment remains secure. Through Integrated Safety Management (ISM) and diligent oversight, we address safety issues promptly and aggressively.
The safety of our employees, whether at work or at home, is a paramount concern. Even as the Directorate meets today's supercomputing requirements, we are preparing for the future. We are investigating open-source cluster technology, the basis of our highly successful Multiprogrammatic Capability Resource (MCR). Several breakthrough discoveries have resulted from MCR calculations coupled with theory and experiment, prompting Laboratory scientists to demand ever-greater capacity and capability. This demand is being met by a new 23-TF system, Thunder, with architecture modeled on MCR. In preparation for the "after-next" computer, we are researching technology even farther out on the horizon: cell-based computers. Assuming that the funding and the technology hold, we will acquire the cell-based machine BlueGene/L within the next 12 months.
Takano, Kouji; Komatsu, Tomoaki; Hata, Naoki; Nakajima, Yasoichi; Kansaku, Kenji
2009-08-01
The white/gray flicker matrix has been used as a visual stimulus for the so-called P300 brain-computer interface (BCI), but the white/gray flash stimuli might induce discomfort. In this study, we investigated the effectiveness of green/blue flicker matrices as visual stimuli. Ten able-bodied, non-trained subjects performed Alphabet Spelling (Japanese Alphabet: Hiragana) using an 8 x 10 matrix with three types of intensification/rest flicker combinations (L, luminance; C, chromatic; LC, luminance and chromatic); both online and offline performances were evaluated. The accuracy rate under the online LC condition was 80.6%. Offline analysis showed that the LC condition was associated with significantly higher accuracy than was the L or C condition (Tukey-Kramer, p < 0.05). No significant difference was observed between L and C conditions. The LC condition, which used the green/blue flicker matrix was associated with better performances in the P300 BCI. The green/blue chromatic flicker matrix can be an efficient tool for practical BCI application.
ERIC Educational Resources Information Center
California State Postsecondary Education Commission, Sacramento.
The Livermore Education Center (LEC), an off-campus center of Chabot College, was established in 1975. In 1986, the South County Community College District designated the LEC a full-service community college campus eligible for state funding of facilities, and in 1988, the Board of Governors of the California Community Colleges approved Las…
Watanabe, Satoshi; Saeki, Keigo; Waseda, Yuko; Murata, Akari; Takato, Hazuki; Ichikawa, Yukari; Yasui, Masahide; Kimura, Hideharu; Hamaguchi, Yasuhito; Matsushita, Takashi; Yamada, Kazunori; Kawano, Mitsuhiro; Furuichi, Kengo; Wada, Takashi; Kasahara, Kazuo
2018-02-01
Lung cancer (LC) adversely impacts survival in patients with idiopathic pulmonary fibrosis. However, little is known about LC in patients with connective tissue disease-associated interstitial lung disease (CTD-ILD). The aim of this study was to evaluate the prevalence of and risk factors for LC in CTD-ILD, and the clinical characteristics and survival of CTD-ILD patients with LC. We conducted a single-center, retrospective review of patients with CTD-ILD from 2003 to 2016. Patients with pathologically diagnosed LC were identified. The prevalence, risk factors, and clinical features of LC and the impact of LC on CTD-ILD patient outcomes were observed. Of 266 patients with CTD-ILD, 24 (9.0%) had LC. CTD-ILD with LC was more likely in patients who were older, male, and smokers; had rheumatoid arthritis, a usual interstitial pneumonia pattern, emphysema on chest computed tomography scan, and lower percent predicted diffusing capacity of the lung for carbon monoxide (DLco); and were not receiving immunosuppressive therapy. Multivariate analysis indicated that the presence of emphysema [odds ratio (OR), 8.473; 95% confidence interval (CI), 2.241-32.033] and nonuse of immunosuppressive therapy (OR, 8.111; 95% CI, 2.457-26.775) were independent risk factors for LC. CTD-ILD patients with LC had significantly worse survival than patients without LC (10-year survival rate: 28.5% vs. 81.8%, P<0.001). LC is associated with the presence of emphysema and nonuse of immunosuppressive therapy, and contributes to increased mortality in patients with CTD-ILD.
CAFE: Computer aided fabric evaluation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sims, J.E.
1994-05-06
With the intent of automating the inspection of color-printed fabrics for defects, the Engineering Research Division of the Lawrence Livermore National Laboratory, together with several other national laboratories and the textile industry, has initiated the CAFE project. The project's objective is the development, implementation, and testing of an algorithm for the inspection of color-printed fabrics, taking advantage of the wide-ranging applications possible with computer vision. The first job of the algorithm is to teach the computer the "correct" repeat; using this repeat as the reference, the algorithm then tests the remaining repeats in the pattern. There are two different ways to accomplish this first job, and this paper describes both methods.
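The reference-repeat comparison idea can be illustrated with a minimal sketch: tile the fabric image into pattern repeats, take one repeat as the reference, and flag repeats whose pixel-wise difference exceeds a threshold. The function name, tiling scheme, and threshold here are hypothetical simplifications, not the CAFE algorithm itself:

```python
import numpy as np

def flag_defective_repeats(image, repeat_h, repeat_w, threshold=10.0):
    """Split a fabric image (2-D array) into repeat_h x repeat_w tiles, use
    the first tile as the reference repeat, and return (row, col) indices of
    tiles whose mean absolute difference from the reference exceeds the
    threshold. Illustrative only; real inspection would register tiles and
    handle color channels and print variation."""
    rows = image.shape[0] // repeat_h
    cols = image.shape[1] // repeat_w
    reference = image[:repeat_h, :repeat_w].astype(float)
    defects = []
    for r in range(rows):
        for c in range(cols):
            tile = image[r * repeat_h:(r + 1) * repeat_h,
                         c * repeat_w:(c + 1) * repeat_w].astype(float)
            if np.abs(tile - reference).mean() > threshold:
                defects.append((r, c))
    return defects
```

In this toy form, the reference is simply the first tile; the abstract's second method of establishing the "correct" repeat would substitute a different reference-selection step while keeping the same comparison loop.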
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lawrence, J.D.
1970-03-12
The Control Data 405 card reader, modified by the Control Data 3649 card read controller, is the primary mechanism for transferring information from a deck of punched cards into the CDC 6600 and CDC 7600 computers of the LLL Octopus system. The card reader operates at a maximum rate of 1200 cards per minute. A description of the card reader and its operation is given. A discussion of formats is included. (RWR)
Science & Technology Review
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
1996-07-01
This review is published ten times a year to communicate, to a broad audience, Lawrence Livermore National Laboratory's scientific and technological accomplishments, particularly in the Laboratory's core mission areas - global security, energy and the environment, and bioscience and biotechnology. This review for the month of July 1996 discusses: Frontiers of research in advanced computations, The multibeam Fabry-Perot velocimeter: Efficient measurement of high velocities, High-tech tools for the American textile industry, and Rock mechanics: Can the Tuff take the stress?
Breaux, Justin H. S.
2017-03-15
The US Department of Energy (DOE) has partnered with the National Cancer Institute (NCI) to use DOE supercomputers to aid in the fight against cancer by building sophisticated models based on data available at the population, patient, and molecular levels. Here, through a three-year pilot project called the Joint Design of Advanced Computing Solutions for Cancer (JDACSC), four participating national laboratories--Argonne, Lawrence Livermore, Los Alamos, and Oak Ridge--will focus on three problems singled out by the NCI as the biggest bottlenecks to advancing cancer research.
Livermore Site Spill Prevention, Control, and Countermeasures (SPCC) Plan
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bellah, W.; Griffin, D.; Mertesdorf, E.
This Spill Prevention, Control, and Countermeasure (SPCC) Plan describes the measures that are taken at Lawrence Livermore National Laboratory’s (LLNL) Livermore Site in Livermore, California, to prevent, control, and handle potential spills from aboveground containers that can contain 55 gallons or more of oil. This SPCC Plan complies with the Oil Pollution Prevention regulation in Title 40 of the Code of Federal Regulations (40 CFR), Part 112 (40 CFR 112) and with 40 CFR 761.65(b) and (c), which regulates the temporary storage of polychlorinated biphenyls (PCBs). This Plan has also been prepared in accordance with Division 20, Chapter 6.67 of the California Health and Safety Code (HSC 6.67) requirements for oil pollution prevention (referred to as the Aboveground Petroleum Storage Act [APSA]), and the United States Department of Energy (DOE) Order No. 436.1. This SPCC Plan establishes procedures, methods, equipment, and other requirements to prevent the discharge of oil into or upon the navigable waters of the United States or adjoining shorelines for aboveground oil storage and use at the Livermore Site.
Scientific developments of liquid crystal-based optical memory: a review
NASA Astrophysics Data System (ADS)
Prakash, Jai; Chandran, Achu; Biradar, Ashok M.
2017-01-01
The memory behavior in liquid crystals (LCs), although rarely observed, has made very significant headway over the past three decades since its discovery in nematic-type LCs. It has gone from a mere scientific curiosity to application in a variety of commodities. Memory elements formed from numerous LCs have been protected by patents, and some have been commercialized and used to complement non-volatile memory devices and as memory in personal computers and digital cameras. They also offer the low cost, large area, high speed, and high density needed for advanced computers and digital electronics. Short- and long-duration memory behavior for industrial applications has been obtained from several LC materials, and LC memories with interesting features and applications have been demonstrated using numerous LCs. However, considerable challenges still exist in the search for highly efficient, stable, and long-lifespan materials and methods that would make the development of useful memory devices possible. This review focuses on the scientific and technological aspects of the fascinating applications of LC-based memory. We address the introduction, development status, novel design and engineering principles, and parameters of LC memory. We also address how the amalgamation of LCs could bring significant change/improvement in memory effects in the emerging field of nanotechnology, and the application of LC memory as the active component of futuristic and interesting memory devices.
Scalable load balancing for massively parallel distributed Monte Carlo particle transport
DOE Office of Scientific and Technical Information (OSTI.GOV)
O'Brien, M. J.; Brantley, P. S.; Joy, K. I.
2013-07-01
In order to run computer simulations efficiently on massively parallel computers with hundreds of thousands or millions of processors, care must be taken that the calculation is load balanced across the processors. Examining the workload of every processor leads to an unscalable algorithm, with run time at least as large as O(N), where N is the number of processors. We present a scalable load balancing algorithm, with run time O(log N), that involves iterated processor-pair-wise balancing steps, ultimately leading to a globally balanced workload. We demonstrate scalability of the algorithm up to 2 million processors on the Sequoia supercomputer at Lawrence Livermore National Laboratory. (authors)
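The processor-pair-wise balancing idea described in the abstract can be illustrated with a small serial sketch. In the real MPI code each step would exchange actual Monte Carlo particles between partner ranks; here a hypothetical `pairwise_balance` function merely averages scalar workloads between hypercube partners (rank XOR 2^r), which reaches the global mean in log2(N) rounds when N is a power of two. Everything below is illustrative, not LLNL's implementation.

```python
def pairwise_balance(loads):
    """Hypercube-style iterated pairwise balancing (serial sketch).

    In round r, each rank pairs with rank XOR 2**r and the pair
    averages its workload. For N a power of two, log2(N) rounds
    leave every rank holding the global mean -- the O(log N)
    behavior described in the abstract. A production code would
    move particles between MPI ranks instead of averaging scalars.
    """
    n = len(loads)
    rounds = n.bit_length() - 1          # log2(n) for a power of two
    for r in range(rounds):
        for rank in range(n):
            partner = rank ^ (1 << r)
            if rank < partner < n:       # each disjoint pair balances once
                avg = (loads[rank] + loads[partner]) / 2.0
                loads[rank] = loads[partner] = avg
    return loads

# 8 "processors" with unequal work; the global mean is 50.0
balanced = pairwise_balance([120.0, 0.0, 40.0, 60.0, 10.0, 90.0, 30.0, 50.0])
```

Within each round the pairs are disjoint, so a parallel implementation needs only one message exchange per rank per round: O(log N) steps in total, versus the O(N) cost of any scheme that inspects every processor's load centrally.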
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fu, Pengchen; Settgast, Randolph R.; Johnson, Scott M.
2014-12-17
GEOS is a massively parallel, multi-physics simulation application utilizing high performance computing (HPC) to address subsurface reservoir stimulation activities with the goal of optimizing current operations and evaluating innovative stimulation methods. GEOS enables coupling of different solvers associated with the various physical processes occurring during reservoir stimulation in unique and sophisticated ways, adapted to various geologic settings, materials, and stimulation methods. Developed at the Lawrence Livermore National Laboratory (LLNL) as part of a Laboratory-Directed Research and Development (LDRD) Strategic Initiative (SI) project, GEOS represents the culmination of a multi-year, ongoing code development and improvement effort that has leveraged existing code capabilities and staff expertise to design new computational geosciences software.
A Collection of Articles Reprinted from Science & Technology Review on University Relations Program
DOE Office of Scientific and Technical Information (OSTI.GOV)
Radousky, H; Rennie, G; Henke, A
2006-08-23
This month's issue has the following articles: (1) The Power of Partnership--Livermore researchers forge strategic collaborations with colleagues from other University of California campuses to further science and better protect the nation; (2) Collaborative Research Prepares Our Next-Generation Scientists and Engineers--Commentary by Laura R. Gilliom; (3) Next-Generation Scientists and Engineers Tap Lab's Resources--University of California Ph.D. candidates work with Livermore scientists and engineers to conduct fundamental research as part of their theses; (4) The Best and the Brightest Come to Livermore--The Lawrence Fellowship Program attracts the most sought-after postdoctoral researchers to the Laboratory; and (5) Faculty on Sabbatical Find a Good Home at Livermore--Faculty members from around the world come to the Laboratory as sabbatical scholars.
DOT National Transportation Integrated Search
1995-09-01
This report, prepared by Lawrence Livermore National Laboratory (LLNL) for the U.S. Department of Energy, Oakland Operations Office (DOE/OAK), provides a comprehensive summary of the environmental program activities at Lawrence Livermore National Lab...
DOT National Transportation Integrated Search
1996-09-03
This report, prepared by Lawrence Livermore National Laboratory (LLNL) for the U.S. Department of Energy, Oakland Operations Office (DOE/OAK), provides a comprehensive summary of the environmental program activities at Lawrence Livermore National Lab...
DOT National Transportation Integrated Search
1994-09-01
This report, prepared by Lawrence Livermore National Laboratory (LLNL) for the U.S. Department of Energy, Oakland Operations Office (DOE/OAK), provides a comprehensive summary of the environmental program activities at Lawrence Livermore National Lab...
User Guide to the 1981 LC (Language Census) Database. Version 1.2.
ERIC Educational Resources Information Center
Kimbrough, Kenneth L.
This guide is designed to introduce potential users of the California Language Census data to a means of accessing that data using an online, interactive computer system known as "1981 LC." The language census is an actual count of the numbers of pupils with a primary language other than English in California public schools as of March 1…
Environmental Report 1996 Volume 2
DOT National Transportation Integrated Search
1997-09-01
This report, prepared by Lawrence Livermore National Laboratory (LLNL) for the U.S. Department of Energy, Oakland Operations Office (DOE/OAK), provides a comprehensive summary of the environmental program activities at Lawrence Livermore National Lab...
Environmental Report 1996 Volume 1
DOT National Transportation Integrated Search
1997-09-01
This report, prepared by Lawrence Livermore National Laboratory (LLNL) for the U.S. Department of Energy, Oakland Operations Office (DOE/OAK), provides a comprehensive summary of the environmental program activities at Lawrence Livermore National Lab...
Federal Register 2010, 2011, 2012, 2013, 2014
2011-11-21
.../National Historic Landmarks Program. CALIFORNIA Alameda County Livermore Carnegie Library and Park, (California Carnegie Libraries MPS) 2155 3rd St., Livermore, 11000876 COLORADO Routt County Steamboat...
Environmental Report 1995, Volume 2
DOT National Transportation Integrated Search
1996-09-03
This report, prepared by Lawrence Livermore National Laboratory (LLNL) for the U.S. Department of Energy, Oakland Operations Office (DOE/OAK), provides a comprehensive summary of the environmental program activities at Lawrence Livermore National Lab...
Science & Technology Review October 2005
DOE Office of Scientific and Technical Information (OSTI.GOV)
Aufderheide III, M B
This month's issue has the following articles: (1) Important Missions, Great Science, and Innovative Technology--Commentary by Cherry A. Murray; (2) NanoFoil® Solders with Less Heat--Soldering and brazing to join an array of materials are now possible without furnaces, torches, or lead; (3) Detecting Radiation on the Move--An award-winning technology can detect even small amounts of radioactive material in transit; (4) Identifying Airborne Pathogens in Time to Respond--A mass spectrometer identifies airborne spores in less than a minute with no false positives; (5) Picture Perfect with VisIt--The Livermore-developed software tool VisIt helps scientists visualize and analyze large data sets; (6) Revealing the Mysteries of Water--Scientists are using Livermore's Thunder supercomputer and new algorithms to understand the phases of water; and (7) Lightweight Target Generates Bright, Energetic X Rays--Livermore scientists are producing aerogel targets for use in inertial confinement fusion experiments and radiation-effects testing.
Birds of a Feather: Supporting Secure Systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Braswell III, H V
2006-04-24
Over the past few years Lawrence Livermore National Laboratory has begun the process of moving to a diskless environment in the Secure Computer Support realm. This movement has included many moving targets and increasing support complexity. We would like to set up a forum for Security and Support professionals to get together from across the Complex and discuss current deployments, lessons learned, and next steps. This would include what hardware, software, and hard-copy-based solutions are being used to manage Secure Computing. The topics to be discussed include but are not limited to: diskless computing, port locking and management, PC, Mac, and Linux/UNIX support and setup, system imaging, security setup documentation and templates, security documentation and management, customer tracking, ticket tracking, software download and management, log management, backup/disaster recovery, and mixed media environments.
NASA Technical Reports Server (NTRS)
Fortenberry, Ryan C.; Huang, Xinchuan; Crawford, T. Daniel; Lee, Timothy J.
2013-01-01
It has been shown that the rotational lines observed in the Horsehead nebula photon-dominated region (PDR) are probably not caused by l-C3H+, as was originally suggested. In the search for viable alternative candidate carriers, quartic force fields are employed here to provide highly accurate rotational constants, as well as fundamental vibrational frequencies, for another candidate carrier: 1¹A′ C3H−. The ab initio computed spectroscopic constants provided in this work are, compared to those necessary to define the observed lines, as accurate as the computed spectroscopic constants for many of the known interstellar anions. Additionally, the computed D-eff for C3H− is three times closer to the D deduced from the observed Horsehead nebula lines than that of l-C3H+. As a result, 1¹A′ C3H− is a more viable candidate for these observed rotational transitions and would be the seventh confirmed interstellar anion detected within the past decade, and the first CnH− molecular anion with an odd n.
10-NIF Dedication: Ellen Tauscher
DOE Office of Scientific and Technical Information (OSTI.GOV)
Congresswoman Ellen Tauscher
2009-07-02
The National Ignition Facility, the world's largest laser system, was dedicated at a ceremony on May 29, 2009 at Lawrence Livermore National Laboratory. These are the remarks by Congresswoman Ellen Tauscher, of California's 10th district, which includes Livermore.
Science & Technology Review September 2003
DOE Office of Scientific and Technical Information (OSTI.GOV)
McMahon, D
2003-09-01
This September 2003 issue of ''Science and Technology Review'' covers the following articles: (1) ''The National Ignition Facility Is Born''; (2) ''The National Ignition Facility Comes to Life'' Over the last 15 years, thousands of Livermore engineers, scientists, and technicians as well as hundreds of industrial partners have worked to bring the National Ignition Facility into being. (3) ''Tracking the Activity of Bacteria Underground'' Using real-time polymerase chain reaction and liquid chromatography/tandem mass spectrometry, researchers at Livermore are gaining knowledge on how bacteria work underground to break down compounds of environmental concern. (4) ''When Every Second Counts--Pathogen Identification in Less Than a Minute'' Livermore has developed a system that can quickly identify airborne pathogens such as anthrax. (5) ''Portable Radiation Detector Provides Laboratory-Scale Precision in the Field'' A team of Livermore physicists and engineers has developed a handheld, mechanically cooled germanium detector designed to identify radioisotopes.
Thrifty: An Exascale Architecture for Energy Proportional Computing
DOE Office of Scientific and Technical Information (OSTI.GOV)
Torrellas, Josep
2014-12-23
The objective of this project is to design different aspects of a novel exascale architecture called Thrifty. Our goal is to focus on the challenges of power/energy efficiency, performance, and resiliency in exascale systems. The project includes work on computer architecture (Josep Torrellas from University of Illinois), compilation (Daniel Quinlan from Lawrence Livermore National Laboratory), runtime and applications (Laura Carrington from University of California San Diego), and circuits (Wilfred Pinfold from Intel Corporation). In this report, we focus on the progress at the University of Illinois during the last year of the grant (September 1, 2013 to August 31, 2014). We also point to the progress in the other collaborating institutions when needed.
Culbertson, C N; Wangerin, K; Ghandourah, E; Jevremovic, T
2005-08-01
The goal of this study was to evaluate the COG Monte Carlo radiation transport code, developed and tested by Lawrence Livermore National Laboratory, for neutron capture therapy related modeling. A boron neutron capture therapy model was analyzed comparing COG calculational results to results from the widely used MCNP4B (Monte Carlo N-Particle) transport code. The approach for computing neutron fluence rate and each dose component relevant in boron neutron capture therapy is described, and calculated values are shown in detail. The differences between the COG and MCNP predictions are qualified and quantified. The differences are generally small and suggest that the COG code can be applied for BNCT research related problems.
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
1999-03-01
The US Department of Energy (DOE) prepared a draft Supplement Analysis (SA) for Continued Operation of Lawrence Livermore National Laboratory (LLNL) and Sandia National Laboratories, Livermore (SNL-L), in accordance with DOE's requirements for implementation of the National Environmental Policy Act of 1969 (NEPA) (10 Code of Federal Regulations [CFR] Part 1021.314). It considers whether the Final Environmental Impact Statement and Environmental Impact Report for Continued Operation of Lawrence Livermore National Laboratory and Sandia National Laboratories, Livermore (1992 EIS/EIR) should be supplemented, whether a new environmental impact statement (EIS) should be prepared, or whether no further NEPA documentation is required. The SA examines the current project and program plans and proposals for LLNL and SNL-L operations to identify new or modified projects or operations, or new information, for the period from 1998 to 2002 that was not considered in the 1992 EIS/EIR. When such changes, modifications, and information are identified, they are examined to determine whether they could be considered substantial or significant in reference to the 1992 proposed action and the 1993 Record of Decision (ROD). DOE released the draft SA to the public to obtain stakeholder comments and to consider those comments in the preparation of the final SA. DOE distributed copies of the draft SA to those who were known to have an interest in LLNL or SNL-L activities, in addition to those who requested a copy. In response to comments received, DOE prepared this Comment Response Document.
Modeling of Near-Field Blast Performance
2013-11-01
The freeze-out temperature is chosen by comparison of calorimetry experiments (2, 3) and thermoequilibrium calculations using CHEETAH (4). The near...P.; Vitello, P. CHEETAH Users Manual; Lawrence Livermore National Laboratory: Livermore, CA, 2012. 5. Walter, P. Introduction to Air Blast
07-NIF Dedication: Jerry McNerney
DOE Office of Scientific and Technical Information (OSTI.GOV)
Congressman Jerry McNerney
2009-07-02
The National Ignition Facility, the world's largest laser system, was dedicated at a ceremony on May 29, 2009 at Lawrence Livermore National Laboratory. These are the remarks by Congressman Jerry McNerney, of California's 11th district, which adjoins Livermore.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Harms, Kevin; Oral, H. Sarp; Atchley, Scott
The Oak Ridge and Argonne Leadership Computing Facilities are both receiving new systems under the Collaboration of Oak Ridge, Argonne, and Livermore (CORAL) program. Because they are both part of the INCITE program, applications need to be portable between these two facilities. However, the Summit and Aurora systems will be vastly different architectures, including their I/O subsystems. While both systems will have POSIX-compliant parallel file systems, their Burst Buffer technologies will be different. This difference may pose challenges to application portability between facilities. Application developers need to pay attention to specific burst buffer implementations to maximize code portability.
ERIC Educational Resources Information Center
Mozelius, Peter; Hettiarachchi, Enosha
2012-01-01
This paper describes the iterative development process of a Learning Object Repository (LOR), named eNOSHA. Discussions on a project for a LOR started at the e-Learning Centre (eLC) at The University of Colombo, School of Computing (UCSC) in 2007. The eLC has during the last decade been developing learning content for a nationwide e-learning…
Precision and manufacturing at the Lawrence Livermore National Laboratory
NASA Technical Reports Server (NTRS)
Saito, Theodore T.; Wasley, Richard J.; Stowers, Irving F.; Donaldson, Robert R.; Thompson, Daniel C.
1994-01-01
Precision Engineering is one of the Lawrence Livermore National Laboratory's core strengths. This paper discusses the past and present technology transfer efforts of LLNL's Precision Engineering program and the Livermore Center for Advanced Manufacturing and Productivity (LCAMP). More than a year ago the Precision Machine Commercialization project embodied several successful methods of transferring high technology from the National Laboratories to industry. Currently, LCAMP has already demonstrated successful technology transfer and is involved in a broad spectrum of programs. In addition, this paper discusses other technologies ripe for future transition, including the Large Optics Diamond Turning Machine.
Computational electronics and electromagnetics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shang, C C
The Computational Electronics and Electromagnetics thrust area serves as the focal point for Engineering R&D activities for developing computer-based design and analysis tools. Representative applications include design of particle accelerator cells and beamline components; design of transmission line components; engineering analysis and design of high-power (optical and microwave) components; photonics and optoelectronics circuit design; electromagnetic susceptibility analysis; and antenna synthesis. The FY-97 effort focuses on development and validation of (1) accelerator design codes; (2) 3-D massively parallel, time-dependent EM codes; (3) material models; (4) coupling and application of engineering tools for analysis and design of high-power components; and (5) development of beam control algorithms coupled to beam transport physics codes. These efforts are in association with technology development in the power conversion, nondestructive evaluation, and microtechnology areas. The efforts complement technology development in Lawrence Livermore National Laboratory programs.
Computational electronics and electromagnetics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shang, C. C.
The Computational Electronics and Electromagnetics thrust area at Lawrence Livermore National Laboratory serves as the focal point for engineering R&D activities for developing computer-based design and analysis tools. Key representative applications include design of particle accelerator cells and beamline components; engineering analysis and design of high-power components; photonics and optoelectronics circuit design; EMI susceptibility analysis; and antenna synthesis. The FY-96 technology-base effort focused code development on (1) accelerator design codes; (2) 3-D massively parallel, object-oriented time-domain EM codes; (3) material models; (4) coupling and application of engineering tools for analysis and design of high-power components; (5) 3-D spectral-domain CEM tools; and (6) enhancement of laser drilling codes. Joint efforts with the Power Conversion Technologies thrust area include development of antenna systems for compact, high-performance radar, in addition to novel, compact Marx generators. 18 refs., 25 figs., 1 tab.
Composite Flywheel Development for Energy Storage
2005-01-01
Fiber-Composite Flywheel Program: Quarterly Progress Report; UCRL-50033-76-4; Lawrence Livermore National Laboratory: Livermore, CA, 1976.
1984-10-01
At the beginning of this contract, both we and the rest of the optical community imagined that simple analog optical computers could produce satisfactory solutions to eigenproblems. Early in this contract we improved optical computing...
ICPD-a new peak detection algorithm for LC/MS.
Zhang, Jianqiu; Haskins, William
2010-12-01
The identification and quantification of proteins using label-free Liquid Chromatography/Mass Spectrometry (LC/MS) play crucial roles in biological and biomedical research. Increasing evidence has shown that biomarkers are often low-abundance proteins. However, LC/MS systems are subject to considerable noise and sample variability, whose statistical characteristics are still elusive, making computational identification of low-abundance proteins extremely challenging. As a result, the inability to identify low-abundance proteins in a proteomic study is the main bottleneck in protein biomarker discovery. In this paper, we propose a new peak detection method called Information Combining Peak Detection (ICPD) for high-resolution LC/MS. In LC/MS, peptides elute during a certain time period and, as a result, peptide isotope patterns are registered in multiple MS scans. The key feature of the new algorithm is that the observed isotope patterns registered in multiple scans are combined together for estimating the likelihood of the peptide's existence. An isotope pattern matching score based on the likelihood probability is provided and utilized for peak detection. The performance of the new algorithm is evaluated on protein standards with 48 known proteins. The evaluation shows better peak detection accuracy for low-abundance proteins than other LC/MS peak detection methods.
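The multi-scan idea can be sketched as follows: assume each MS scan yields an observed intensity vector for a candidate isotope cluster, score each scan against the theoretical isotope envelope, and pool the per-scan evidence. The pooling below (summed log-odds of a cosine-similarity score) is illustrative only, not the paper's exact likelihood model, and all names and envelope values are hypothetical.

```python
import math

def pattern_score(observed, theoretical):
    """Cosine similarity between an observed peak cluster and the
    theoretical isotope pattern (both given as intensity vectors)."""
    dot = sum(o * t for o, t in zip(observed, theoretical))
    no = math.sqrt(sum(o * o for o in observed))
    nt = math.sqrt(sum(t * t for t in theoretical))
    return dot / (no * nt) if no and nt else 0.0

def combined_evidence(scans, theoretical):
    """Pool per-scan pattern scores across the elution window.

    A real peptide leaves consistent isotope patterns in many
    consecutive scans, while noise does not, so summing per-scan
    log-odds rewards consistency. Illustrative stand-in for the
    likelihood model used by ICPD.
    """
    eps = 1e-12
    total = 0.0
    for s in scans:
        p = pattern_score(s, theoretical)
        total += math.log((p + eps) / (1.0 - p + eps))
    return total

theoretical = [1.0, 0.8, 0.4, 0.15]   # hypothetical isotope envelope
scans = [[0.9, 0.85, 0.35, 0.1],      # same peptide observed in 3 scans
         [1.1, 0.75, 0.45, 0.2],
         [0.95, 0.8, 0.4, 0.12]]
evidence = combined_evidence(scans, theoretical)  # large positive => likely real
```

Because the evidence accumulates over scans, a weak but consistent low-abundance peptide can outscore a single loud noise spike, which is the intuition behind combining information across scans.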
Tiger Team Assessment of the Sandia National Laboratories, Livermore, California
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1990-08-01
This report provides the results of the Tiger Team Assessment of the Sandia National Laboratories (SNL) in Livermore, California, conducted from April 30 to May 18, 1990. The purpose of the assessment was to provide the Secretary of Energy with the status of environment, safety and health (ES&H) activities at SNL, Livermore. The assessment was conducted by a team consisting of three subteams of federal and private sector technical specialists in the disciplines of environment, safety and health, and management. On-site activities for the assessment included document reviews, observation of site operations, and discussions and interviews with DOE personnel, site contractor personnel, and regulators. Using these sources of information and data, the Tiger Team identified a significant number of findings and concerns having to do with the environment, safety and health, and management, as well as concerns regarding noncompliance with Occupational Safety and Health Administration (OSHA) standards. Although the Tiger Team concluded that none of the findings or concerns necessitated immediate cessation of any operations at SNL, Livermore, it does believe that a sizable number of them require prompt management attention. A special area of concern identified for the near-term health and safety of on-site personnel pertained to the on-site Trudell Auto Repair Shop site. Several significant OSHA concerns and environmental findings relating to this site prompted the Tiger Team Leader to immediately advise SNL, Livermore and AL management of the situation. A case study was prepared by the Team, because the root causes of the problems associated with this site were believed to reflect the overall root causes for the areas of ES&H noncompliance at SNL, Livermore. 4 figs., 3 tabs.
Reduction of risk of dying from tobacco-related diseases after quitting smoking in Italy.
Carreras, Giulia; Pistelli, Francesco; Falcone, Franco; Carrozzi, Laura; Martini, Andrea; Viegi, Giovanni; Gorini, Giuseppe
2015-01-01
The aims of this paper are to compute the risks of dying of ischemic heart disease (IHD), lung cancer (LC), stroke, and chronic obstructive pulmonary disease (COPD) for Italian smokers by gender, age and daily number of cigarettes smoked, and to estimate the benefit of stopping smoking in terms of risk reduction. Life tables by sex and smoking status were computed for each smoking-related disease based on Italian smoking data, and risk charts with 10-year probabilities of death were computed for never, current and former smokers. Men aged 45-49 years who currently smoke have an 8, 10, 3 and 1 in 1,000 chance of dying of IHD, LC, stroke and COPD, respectively, whereas women with the same characteristics have a 2, 6, 3 and 1 in 1,000 chance, respectively, for all smokers combined, i.e., independent of smoking intensity. The risk reduction rates from quitting smoking are remarkable: a man who quits smoking at 45-49 years can reduce the risk of dying of IHD, LC, stroke and COPD in the next 10 years by 43%, 53%, 57% and 55%, respectively; a woman by 49%, 49%, 59% and 57%, respectively. Estimates of risk reduction by quitting smoking are useful to provide a sounder scientific basis for public health messages and clinical advice.
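The quoted reductions translate directly into absolute risks. A minimal sketch of that arithmetic (the function name and per-1,000 convention are illustrative, not from the paper), applying a reported percentage reduction to a current smoker's 10-year risk:

```python
def risk_after_quitting(risk_per_1000, reduction_pct):
    """10-year death risk (per 1,000) for a quitter, given the
    current-smoker risk and the reported percentage reduction."""
    return risk_per_1000 * (1.0 - reduction_pct / 100.0)

# Example from the abstract: a male smoker aged 45-49 has an 8 in 1,000
# chance of dying of IHD; quitting reduces that risk by 43%.
ihd_after_quit = risk_after_quitting(8, 43)  # about 4.6 in 1,000
```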
Numerical Simulations of 3D Seismic Data Final Report CRADA No. TC02095.0
DOE Office of Scientific and Technical Information (OSTI.GOV)
Friedmann, S. J.; Kostov, C.
This was a collaborative effort between Lawrence Livermore National Security, LLC (formerly The Regents of the University of California)/Lawrence Livermore National Laboratory (LLNL) and Schlumberger Cambridge Research (SCR), to develop synthetic seismic data sets and supporting codes.
NASA Technical Reports Server (NTRS)
Beiersdorfer, P.; Brown, G. V.; Gu, M.-F.; Harris, C. L.; Kahn, S. M.; Kim, S.-H.; Neill, P. A.; Savin, D. W.; Smith, A. J.; Utter, S. B.
2000-01-01
Using the EBIT facility in Livermore we produce definitive atomic data for input into spectral synthesis codes. Recent measurements of line excitation and dielectronic recombination of highly charged K-shell and L-shell ions are presented to illustrate this point.
Human lamina cribrosa insertion and age.
Sigal, Ian A; Flanagan, John G; Lathrop, Kira L; Tertinegg, Inka; Bilonick, Richard
2012-10-03
To test the hypothesis that in healthy human eyes the lamina cribrosa (LC) insertion into the pia mater increases with age. The optic nerve heads (ONHs) of donor eyes fixed at either 5 or 50 mm Hg of IOP were sectioned, stained, and imaged under bright- and dark-field conditions. A 3-dimensional (3D) model of each ONH was reconstructed. From the 3D models we measured the area of LC insertion into the peripapillary scleral flange and into the pia, and computed the total area of insertion and fraction of LC inserting into the pia. Linear mixed effect models were used to determine if the measurements were associated with age or IOP. We analyzed 21 eyes from 11 individuals between 47 and 91 years old. The LC inserted into the pia in all eyes. The fraction of LC inserting into the pia (2.2%-29.6%) had a significant decrease with age (P = 0.049), which resulted from a nonsignificant increase in the total area of LC insertion (P = 0.41) and a nonsignificant decrease in the area of LC insertion into the pia (P = 0.55). None of the measures was associated with fixation IOP (P values 0.44-0.81). Differences between fellow eyes were smaller than differences between unrelated eyes. The LC insertion into the pia mater is common in middle-aged and older eyes, and does not increase with age. The biomechanical and vascular implications of the LC insertion into the pia mater are not well understood and should be investigated further.
Computer-Aided Design and Analysis of Digital Guidance and Control Systems.
1983-07-01
Science & Technology Review November 2006
DOE Office of Scientific and Technical Information (OSTI.GOV)
Radousky, H
This month's issue has the following articles: (1) Expanded Supercomputing Maximizes Scientific Discovery--Commentary by Dona Crawford; (2) Thunder's Power Delivers Breakthrough Science--Livermore's Thunder supercomputer allows researchers to model systems at scales never before possible. (3) Extracting Key Content from Images--A new system called the Image Content Engine is helping analysts find significant but hard-to-recognize details in overhead images. (4) Got Oxygen?--Oxygen, especially oxygen metabolism, was key to evolution, and a Livermore project helps find out why. (5) A Shocking New Form of Laserlike Light--According to research at Livermore, smashing a crystal with a shock wave can result in coherent light.
Large Colloids in Cholesteric Liquid Crystals
NASA Astrophysics Data System (ADS)
Stratford, K.; Gray, A.; Lintuvuori, J. S.
2015-12-01
We describe a coarse-grained Landau-de Gennes model of liquid crystals (LCs) including hydrodynamics based on the Beris-Edwards equations. The model is employed to study the impact of large colloids on the long-range LC defect structure in the cholesteric LC blue phases. `Large' here means that the particle size is comparable to the cholesteric pitch, the length scale on which the LC order undergoes a helical twist. We investigate the case of a single particle, with either normal or degenerate planar anchoring, placed initially in an equilibrium blue phase LC. It is found that in some cases a well-defined steady disclination structure emerges at the particle surface, while in other cases no clear steady state is reached in the simulations, and disclination reorganisation appears to proliferate through the bulk LC. These systems are of potential interest in the context of using LCs to template self-assembly of colloid structures, e.g., for opto-electronic devices. Computationally, we demonstrate that a parallel approach using a mixed message-passing and threaded model on graphics processing units allows effective and efficient progress on this problem.
Ribeiro, Isabella Lima Arrais; Campos, Fernanda; Sousa, Rafael Santiago; Alves, Maria Luiza Lima; Rodrigues, Dalton Matos; Souza, Rodrigo Othavio Assuncão; Bottino, Marco Antonio
2015-01-01
Discrepancies at the abutment/crown interface can affect the longevity of zirconia restorations. The aim was to evaluate the marginal and internal discrepancies (MD and ID) of zirconia copings manufactured by two milling systems with different finish lines. Three aluminum master dies (h = 5.5 mm; Ψ = 7.5 mm; 6), with different finish lines (large chamfer [LC]; tilted chamfer [TC]; rounded shoulder [RS]) were fabricated. Twenty impressions were made from each master die and poured. Sixty zirconia copings were manufactured and divided according to the factors "finish line" and "milling system" (n = 10): CAD LC = computer-aided design/computer-aided manufacturing (CAD/CAM) + LC; CAD TC = CAD/CAM + TC; CAD RS = CAD/CAM + RS; MAD LC = manually aided design/manually aided manufacturing (MAD/MAM) + LC; MAD TC = MAD/MAM + TC; and MAD RS = MAD/MAM + RS. For MD analysis, each coping was fixed, and the distance between the external edges of the coping and the edge of the cervical preparation was measured (50 points). Using the same copings, the ID of each coping was evaluated, by the replica technique, at 12 points equally distributed among the regions (n = 10): Ray (R), axial (A), and occlusal (Occl). The measurements were performed by optical microscopy (×250). The data (μm) were subjected to parametric and non-parametric statistical analyses. For the MAD/MAM system, the "finish line" (P = 0.0001) affected significantly the MD median values (μm): LC = 251.80a, RS = 68.40a and TC = 8.10b (Dunn's test). For the CAD/CAM system, the median MD values (μm) were not affected by the factor "finish line" (P = 0.4037): LC = 0.82a, RS = 0.52a, and TC = 0.89a. For the ID, an interaction was observed between the finish line types and the region (P = 0.0001) and between region and milling system (P = 0.0031) (RM-ANOVA). The CAD/CAM system presented lower MD values, regardless of the finish line. However, the MAD/MAM system showed ID values smaller than those of CAD/CAM.
Computer-aided diagnostic system for diffuse liver diseases with ultrasonography by neural networks
NASA Astrophysics Data System (ADS)
Ogawa, K.; Fukushima, M.; Kubota, K.; Hisa, N.
1998-12-01
The aim of the study is to establish a computer-aided diagnostic system for diffuse liver diseases such as chronic active hepatitis (CAH) and liver cirrhosis (LC). The authors introduced an artificial neural network for the classification of these diseases. In this system the neural network was trained by feature parameters extracted from B-mode ultrasonic images of normal liver (NL), CAH and LC. For input data the authors used six parameters calculated from a region of interest (ROI) and a parameter calculated from five ROIs in each image. They were variance of pixel values, coefficient of variation, annular Fourier power spectrum, and longitudinal Fourier power spectrum, which were calculated for the ROI, and variation of the means of the five ROIs. In addition, the authors used two more parameters calculated from a co-occurrence matrix of pixel values in the ROI. The results showed that the neural network classifier achieved 83.8% sensitivity for LC, 90.0% sensitivity for CAH, and 93.6% specificity, and the system was considered to be helpful for clinical and educational use.
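Two of the input features named in the abstract are simple first-order statistics of an ROI. A hedged sketch (the function name is illustrative; pixel values are assumed to arrive as a plain list) of the variance and coefficient-of-variation features:

```python
import math

def roi_features(pixels):
    """Variance and coefficient of variation of pixel values in an ROI,
    two of the texture features fed to the neural network classifier."""
    n = len(pixels)
    mean = sum(pixels) / n
    var = sum((p - mean) ** 2 for p in pixels) / n   # population variance
    cv = math.sqrt(var) / mean if mean else 0.0      # std / mean
    return var, cv
```

A uniform ROI yields zero for both features, while heterogeneous (e.g., cirrhotic-looking) texture raises them, which is what makes them usable as classifier inputs.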
Final Report on Contract N00014-92-C-0173 (Office of Naval Research)
2001-01-10
PHILPOTT, Lawrence Livermore National Laboratory, University of California, Livermore, CA 94550, USA; IBM Research Division, Almaden Research Center… defines the IHP on one electrode and adsorbed hydrated lithium ion defines the OHP on the second electrode. Ions have been classified according to
High-Resolution Regional Phase Attenuation Models of the Iranian Plateau and Zagros (Postprint)
2012-05-12
15 September 2011, Tucson, AZ, Volume I, pp. 153-160. Government Purpose Rights. University of Missouri (1), Johann Wolfgang Goethe-Universität (2), and Lawrence Livermore National Laboratory (3). Sponsored by the Air Force
360 Video Tour of 3D Printing Labs at LLNL
DOE Office of Scientific and Technical Information (OSTI.GOV)
None
Additive manufacturing is changing the way the world thinks about manufacturing and design. And here at Lawrence Livermore National Laboratory, it’s changing the way our scientists approach research and development. Today we’ll look around three of the additive manufacturing research labs on the Lawrence Livermore campus.
Analysis of Proton Transport Experiments.
1980-09-05
which can inhibit transport, may grow. The abrupt loss of transport at higher currents in the small channel suggests this possibility. Future experiments…
Sandia National Laboratories: Livermore Valley Open Campus (LVOC)
Livermore Valley Open Campus (LVOC). Open engagement: expanding opportunities for open engagement of the broader scientific community. Building on success: Sandia's Combustion Research Facility pioneered open collaboration over 30 years ago. Access to DOE-funded capabilities: expanding access
Weiss, Gunter; Schlegel, Anne; Kottwitz, Denise; König, Thomas; Tetzner, Reimo
2017-01-01
Low-dose computed tomography (LDCT) is used for screening for lung cancer (LC) in high-risk patients in the United States. The definition of high risk and the impact of frequent false-positive results of low-dose computed tomography remain a challenge. DNA methylation biomarkers are valuable noninvasive diagnostic tools for cancer detection. This study reports on the evaluation of methylation markers in plasma DNA for LC detection and discrimination of malignant from nonmalignant lung disease. Circulating DNA was extracted from 3.5-mL plasma samples, treated with bisulfite using a commercially available kit, purified, and assayed by real-time polymerase chain reaction for assessment of DNA methylation of short stature homeobox 2 gene (SHOX2), prostaglandin E receptor 4 gene (PTGER4), and forkhead box L2 gene (FOXL2). In three independent case-control studies these assays were evaluated and optimized. The resultant assay, a triplex polymerase chain reaction combining SHOX2, PTGER4, and the reference gene actin, beta gene (ACTB), was validated using plasma from patients with and without malignant disease. A panel of SHOX2 and PTGER4 provided promising results in three independent case-control studies examining a total of 330 plasma specimens (area under the receiver operating characteristic curve = 91%-98%). A validation study with 172 patient samples demonstrated significant discriminatory performance in distinguishing patients with LC from subjects without malignancy (area under the curve = 0.88). At a fixed specificity of 90%, sensitivity for LC was 67%; at a fixed sensitivity of 90%, specificity was 73%. Measurement of SHOX2 and PTGER4 methylation in plasma DNA allowed detection of LC and differentiation of nonmalignant diseases. Development of a diagnostic test based on this panel may provide clinical utility in combination with current imaging techniques to improve LC risk stratification.
Smith, Kathleen S.; Ranville, James F.; Adams, M.; Choate, LaDonna M.; Church, Stan E.; Fey, David L.; Wanty, Richard B.; Crock, James G.
2006-01-01
The chemical speciation of metals influences their biological effects. The Biotic Ligand Model (BLM) is a computational approach to predict chemical speciation and acute toxicological effects of metals on aquatic biota. Recently, the U.S. Environmental Protection Agency incorporated the BLM into their regulatory water-quality criteria for copper. Results from three different laboratory copper toxicity tests were compared with BLM predictions for simulated test-waters. This was done to evaluate the ability of the BLM to accurately predict the effects of hardness and concentrations of dissolved organic carbon (DOC) and iron on aquatic toxicity. In addition, we evaluated whether the BLM and the three toxicity tests provide consistent results. Comparison of BLM predictions with two types of Ceriodaphnia dubia toxicity tests shows that there is fairly good agreement between predicted LC50 values computed by the BLM and LC50 values determined from the two toxicity tests. Specifically, the effect of increasing calcium concentration (and hardness) on copper toxicity appears to be minimal. Also, there is fairly good agreement between the BLM and the two toxicity tests for test solutions containing elevated DOC, for which the LC50 is 3-to-5 times greater (less toxic) than the LC50 for the lower-DOC test water. This illustrates the protective effects of DOC on copper toxicity and demonstrates the ability of the BLM to predict these protective effects. In contrast, for test solutions with added iron there is a decrease in LC50 values (increase in toxicity) in results from the two C. dubia toxicity tests, and the agreement between BLM LC50 predictions and results from these toxicity tests is poor. The inability of the BLM to account for competitive iron binding to DOC or DOC fractionation may be a significant shortcoming of the BLM for predicting site- specific water-quality criteria in streams affected by iron-rich acidic drainage in mined and mineralized areas.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhong, H; Li, H; Gordon, J
Purpose: To investigate radiotherapy outcomes by incorporating 4DCT-based physiological and tumor elasticity functions for lung cancer patients. Methods: 4DCT images were acquired from 28 lung SBRT patients before radiation treatment. Deformable image registration (DIR) was performed from the end-inhale to the end-exhale using a B-Spline-based algorithm (Elastix, an open source software package). The resultant displacement vector fields (DVFs) were used to calculate a relative Jacobian function (RV) for each patient. The computed functions in the lung and tumor regions represent lung ventilation and tumor elasticity properties, respectively. The 28 patients were divided into two groups: 16 with two-year tumor local control (LC) and 12 with local failure (LF). The ventilation- and elasticity-related RV functions were calculated for each of these patients. Results: The LF patients have larger RV values than the LC patients. The mean RV value in the lung region was 1.15 (±0.67) for the LF patients, higher than 1.06 (±0.59) for the LC patients. In the tumor region, the elasticity-related RV values are 1.2 (±0.97) and 0.86 (±0.64) for the LF and LC patients, respectively. Among the 16 LC patients, 3 have mean RV values greater than 1.0 in the tumors. These tumors were located near the diaphragm, where the displacements are relatively large. RV functions calculated in the tumor were better correlated with treatment outcomes than those calculated in the lung. Conclusion: The ventilation- and elasticity-related RV functions in the lung and tumor regions were calculated from 4DCT images, and the resultant values showed differences between the LC and LF patients. Further investigation of the impact of the displacements on the computed RV is warranted. Results suggest that the RV images might be useful for evaluation of treatment outcome for lung cancer patients.
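A Jacobian determinant computed from a DVF is the standard way to express the local volume change that such ventilation/elasticity measures are built on. A sketch with NumPy, assuming the DVF is stored as a (3, nx, ny, nz) array on a regular voxel grid (the abstract's exact RV definition may differ from this plain determinant):

```python
import numpy as np

def jacobian_determinant(dvf, spacing=(1.0, 1.0, 1.0)):
    """Voxel-wise Jacobian determinant of a 3D displacement vector field.

    dvf: array of shape (3, nx, ny, nz); dvf[i] is the i-th displacement
    component. Returns an (nx, ny, nz) array; values > 1 indicate local
    expansion, values < 1 local contraction.
    """
    grads = np.empty((3, 3) + dvf.shape[1:])
    for i in range(3):
        # d u_i / d x_j for j = 0, 1, 2 via central finite differences
        for j, g in enumerate(np.gradient(dvf[i], *spacing)):
            grads[i, j] = g
    # J = det(I + du/dx), evaluated at every voxel
    jac = np.eye(3)[:, :, None, None, None] + grads
    jac = np.moveaxis(jac, (0, 1), (-2, -1))  # matrix axes last for det
    return np.linalg.det(jac)
```

For a uniform 10% expansion field (u_i = 0.1 x_i) the determinant is 1.1³ ≈ 1.331 everywhere, a quick sanity check on the implementation.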
CP/M: A Family of 8- and 16-Bit Computer Operating Systems.
ERIC Educational Resources Information Center
Kildall, Gary
1982-01-01
Traces the development of the computer CP/M (Control Program for Microcomputers) and MP/M (Multiprogramming Monitor Microcomputers) operating system by Gary Kildall of Digital Research Company. Discusses the adaptation of these operating systems to the newly emerging 16 and 32 bit microprocessors. (Author/LC)
A Monte Carlo Simulation of Brownian Motion in the Freshman Laboratory
ERIC Educational Resources Information Center
Anger, C. D.; Prescott, J. R.
1970-01-01
Describes a "dry-lab" experiment for the college freshman laboratory, in which the essential features of Brownian motion are simulated using the Monte Carlo technique. Calculations are carried out by a computation scheme based on computer language. Bibliography. (LC)
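The kind of simulation such a freshman exercise describes can be sketched in a few lines; this toy version (all names illustrative, not from the article) performs fixed-step 2D random walks and recovers the Brownian signature that the mean squared displacement grows linearly with the number of steps:

```python
import math
import random

def brownian_walk_2d(n_steps, step=1.0, rng=None):
    """One 2D random walk: fixed step length, uniformly random direction."""
    rng = rng or random.Random()
    x = y = 0.0
    for _ in range(n_steps):
        theta = rng.uniform(0.0, 2.0 * math.pi)
        x += step * math.cos(theta)
        y += step * math.sin(theta)
    return x, y

def mean_squared_displacement(n_walks, n_steps, seed=1):
    """Average r^2 over many walks; for this model E[r^2] = n_steps * step^2."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_walks):
        x, y = brownian_walk_2d(n_steps, rng=rng)
        total += x * x + y * y
    return total / n_walks
```

With 2,000 walks of 50 unit steps, the estimated mean squared displacement comes out close to the theoretical value of 50, which is the linear-in-time behavior a Brownian-motion lab aims to demonstrate.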
1994-01-01
These enhancements have allowed us to use GEMACS to model very small (electrical) features such as 0.1V pins on printed circuit boards without the… "Enhancements and Limitations of the Code NEC for Modeling Electrically Small Antennas," Lawrence Livermore National Laboratory, Report UCID-20970, January… electrical lengths of the coupling paths are also shown in Figure 6. The "LB" indicates the large box dimensions (1/4.4 scale model) and "SB" indicates the
Lapi, Dominga; Sabatino, Lina; Altobelli, Giovanna Giuseppina; Mondola, Paolo; Cimini, Vincenzo; Colantuoni, Antonio
2010-01-01
Propionyl-l-carnitine (pLc) exerts protective effects in different experimental models of ischemia-reperfusion (I/R). The aim of the present study was to assess the effects of intravenously and topically applied pLc on the microvascular permeability increase induced by I/R in the hamster cheek pouch preparation. The hamster cheek pouch microcirculation was visualized by fluorescence microscopy. Microvascular permeability, leukocyte adhesion to venular walls, perfused capillary length, and capillary red blood cell velocity (V(RBC)) were evaluated by computer-assisted methods. E-selectin expression was assessed by in vitro analysis. Lipid peroxidation and reactive oxygen species (ROS) formation were determined by thiobarbituric acid-reactive substances (TBARS) and 2'-7'-dichlorofluorescein (DCF), respectively. In control animals, I/R caused a significant increase in permeability and in leukocyte adhesion in venules. Capillary perfusion and V(RBC) decreased. TBARS levels and DCF fluorescence significantly increased compared with baseline. Intravenously infused pLc dose-dependently prevented leakage and leukocyte adhesion, preserved capillary perfusion, and induced vasodilation at the end of reperfusion, while ROS concentration decreased. Inhibition of nitric oxide synthase prior to pLc caused vasoconstriction and partially blunted the pLc-induced protective effects; inhibition of the endothelium-derived hyperpolarizing factor (EDHF) abolished pLc effects. Topical application of pLc on the cheek pouch membrane produced the same effects as observed with intravenous administration. pLc decreased E-selectin expression. pLc prevents microvascular changes induced by I/R injury. The reduction in the permeability increase could be due mainly to EDHF release, which induces vasodilatation together with NO. The reduction of E-selectin expression prevents leukocyte adhesion and permeability increase.
Analysis of Ethanolamines: Validation of Semi-Volatile Analysis by HPLC-MS/MS by EPA Method MS888
DOE Office of Scientific and Technical Information (OSTI.GOV)
Owens, J; Vu, A; Koester, C
The Environmental Protection Agency's (EPA) Region 5 Chicago Regional Laboratory (CRL) developed a method titled 'Analysis of Diethanolamine, Triethanolamine, n-Methyldiethanolamine, and n-Ethyldiethanolamine in Water by Single Reaction Monitoring Liquid Chromatography/Tandem Mass Spectrometry (LC/MS/MS): EPA Method MS888'. This draft standard operating procedure (SOP) was distributed to multiple EPA laboratories and to Lawrence Livermore National Laboratory, which was tasked to serve as a reference laboratory for EPA's Environmental Reference Laboratory Network (ERLN) and to develop and validate analytical procedures. The primary objective of this study was to validate and verify the analytical procedures described in 'EPA Method MS888' for analysis of the listed ethanolamines in aqueous samples. The gathered data from this validation study will be used to: (1) demonstrate analytical method performance; (2) generate quality control acceptance criteria; and (3) revise the SOP to provide a validated method that would be available for use during a homeland security event. The data contained in this report will be compiled, by EPA CRL, with data generated by other EPA Regional laboratories so that performance metrics of 'EPA Method MS888' can be determined.
Science and Technology Review December 2006
DOE Office of Scientific and Technical Information (OSTI.GOV)
Radousky, H B
2006-10-30
This month's issue has the following articles: (1) Livermore's Biosecurity Research Directly Benefits Public Health--Commentary by Raymond J. Juzaitis; (2) Diagnosing Flu Fast--Livermore's FluIDx device can diagnose flu and four other respiratory viruses in just two hours; (3) An Action Plan to Reopen a Contaminated Airport--New planning tools and faster sample analysis methods will hasten restoration of a major airport to full use following a bioterrorist attack; (4) Early Detection of Bone Disease--A Livermore technique detects small changes in skeletal calcium balance that may signal bone disease; and (5) Taking a Gander with Gamma Rays--Gamma rays may be the next source for looking deep inside the atom.
Signal intensity of lanthanum carbonate on magnetic resonance images: phantom study.
Nakamura, Shinichi; Awai, Kazuo; Komi, Masanori; Morita, Kosuke; Namimoto, Tomohiro; Yanaga, Yumi; Utsunomiya, Daisuke; Date, Shuji; Yamashita, Yasuyuki
2011-06-01
Lanthanum carbonate (LC) is used to treat hyperphosphatemia. The purpose of this study was to investigate the signal intensity (SI) of LC on magnetic resonance imaging (MRI) scans of phantoms. LC tablets were thoroughly ground and mixed with distilled water or edible agar (0.05, 0.25, 0.5, and 2.5 mg/ml) in plastic bottles. Four intact tablets were placed in plastic bottles that did or did not contain distilled water or agar. Two radiologists consensually evaluated T1- and T2-weighted images (WIs) obtained with 1.5- and 3.0-T MRI systems for the SI of unground and ground tablets. On T1- and T2WI, the SIs of the LC suspensions and the solvents alone were similar; the SIs of unground tablets alone and of the air were also similar. Unground tablets in phantoms filled with solvent exhibited lower SI than the solvent. Ground tablets in suspension were not visualized on MRI or computed tomography. These results remained unchanged regardless of differences in magnetic field strength or the solvent used. Ground LC had no contrast enhancement effect on T1WI; on T2WI it did not affect the SI of the solvent. Unground LC tablets may be visualized as a "filling defect" on MRI.
Shigefuku, Ryuta; Takahashi, Hideaki; Nakano, Hiroyasu; Watanabe, Tsunamasa; Matsunaga, Kotaro; Matsumoto, Nobuyuki; Kato, Masaki; Morita, Ryo; Michikawa, Yousuke; Tamura, Tomohiro; Hiraishi, Tetsuya; Hattori, Nobuhiro; Noguchi, Yohei; Nakahara, Kazunari; Ikeda, Hiroki; Ishii, Toshiya; Okuse, Chiaki; Sase, Shigeru; Itoh, Fumio; Suzuki, Michihiro
2016-09-14
The progression of chronic liver disease differs by etiology. The aim of this study was to elucidate the difference in disease progression between chronic hepatitis C (CHC) and nonalcoholic fatty liver disease (NAFLD) by means of fibrosis markers, liver function, and hepatic tissue blood flow (TBF). Xenon computed tomography (Xe-CT) was performed in 139 patients with NAFLD and 152 patients with CHC (including liver cirrhosis (LC)). The cutoff values for fibrosis markers were compared between NAFLD and CHC, and correlations between hepatic TBF and liver function tests were examined at each fibrosis stage. The cutoff values for detection of the advanced fibrosis stage were lower in NAFLD than in CHC. Although portal venous TBF (PVTBF) correlated with liver function tests, PVTBF in initial LC caused by nonalcoholic steatohepatitis (NASH-LC) was significantly lower than that in hepatitis C virus (C-LC) (p = 0.014). Conversely, the liver function tests in NASH-LC were higher than those in C-LC (p < 0.05). It is important to recognize the difference between NAFLD and CHC. We concluded that changes in hepatic blood flow occurred during the earliest stage of hepatic fibrosis in patients with NAFLD; therefore, patients with NAFLD need to be followed carefully.
ICPD-A New Peak Detection Algorithm for LC/MS
2010-01-01
Background The identification and quantification of proteins using label-free Liquid Chromatography/Mass Spectrometry (LC/MS) play crucial roles in biological and biomedical research. Increasing evidence has shown that biomarkers are often low abundance proteins. However, LC/MS systems are subject to considerable noise and sample variability, whose statistical characteristics are still elusive, making computational identification of low abundance proteins extremely challenging. As a result, the inability of identifying low abundance proteins in a proteomic study is the main bottleneck in protein biomarker discovery. Results In this paper, we propose a new peak detection method called Information Combining Peak Detection (ICPD ) for high resolution LC/MS. In LC/MS, peptides elute during a certain time period and as a result, peptide isotope patterns are registered in multiple MS scans. The key feature of the new algorithm is that the observed isotope patterns registered in multiple scans are combined together for estimating the likelihood of the peptide existence. An isotope pattern matching score based on the likelihood probability is provided and utilized for peak detection. Conclusions The performance of the new algorithm is evaluated based on protein standards with 48 known proteins. The evaluation shows better peak detection accuracy for low abundance proteins than other LC/MS peak detection methods. PMID:21143790
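The core idea in the abstract, accumulating isotope-pattern evidence across the multiple scans of a peptide's elution window, can be sketched as follows. This is a toy illustration, not the published algorithm: cosine similarity stands in for the paper's likelihood-based matching score, and the log-sum combination (multiplying independent per-scan likelihoods) is an assumption of the sketch:

```python
import math

def isotope_match_score(observed, theoretical):
    """Cosine similarity between an observed and a theoretical isotope
    intensity pattern for one MS scan (1.0 = perfect shape match)."""
    dot = sum(o * t for o, t in zip(observed, theoretical))
    norm_o = math.sqrt(sum(o * o for o in observed))
    norm_t = math.sqrt(sum(t * t for t in theoretical))
    return dot / (norm_o * norm_t) if norm_o and norm_t else 0.0

def combined_evidence(scans, theoretical):
    """Combine per-scan match scores across an elution window; summing
    log-scores mirrors multiplying independent per-scan likelihoods."""
    return sum(math.log(max(isotope_match_score(s, theoretical), 1e-12))
               for s in scans)
```

A peptide whose pattern matches well in every scan gets a combined score near 0 (log of 1), while noise that matches only in isolated scans is penalized, which is the intuition behind combining information across scans for low-abundance peptides.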
Nishihara, Hitoshi; Ayaki, Masahiko; Watanabe, Tomiko; Ohnishi, Takeo; Kageyama, Toshiyuki; Yaguchi, Shigeo
2004-03-01
To compare the long-term clinical and experimental results of soft acrylic intraocular lenses (IOLs) manufactured by the lathe-cut (LC) method and by the cast-molding (CM) method. This was a retrospective study of 20 patients (22 eyes) who were examined in a 5- and 7-year follow-up study. Sixteen eyes were implanted with polyacrylic IOLs manufactured by the LC method and 6 eyes were implanted with polyacrylic IOLs manufactured by the CM method. Postoperative measurements included best corrected visual acuity, contrast sensitivity, biomicroscopic examination, and Scheimpflug slit-lamp images to evaluate surface light scattering. Scanning electron microscopy and three-dimensional surface analysis were conducted. At 7 years, the mean visual acuity was 1.08 +/- 0.24 (mean +/- standard deviation) in the LC group and 1.22 +/- 0.27 in the CM group. Surface light scatter was 12.0 +/- 4.0 computer-compatible tape (CCT) units in the LC group and 37.4 +/- 5.4 CCT units in the CM group. Mean surface roughness was 0.70 +/- 0.07 nm in the LC group and 6.16 +/- 0.97 nm in the CM group. Acrylic IOLs manufactured by the LC method are more stable in long-term use.
High explosive corner turning performance and the LANL Mushroom test
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hill, L.G.; Seitz, W.L.; Forest, C.A.
1997-09-01
The Mushroom test is designed to characterize the corner-turning performance of a new generation of less sensitive booster explosives. The test is described in detail, and three corner-turning figures of merit are examined using pure TATB (both Livermore's Ultrafine and a Los Alamos research blend) and PBX9504 as examples.
HCCI Combustion Engines Final Report CRADA No. TC02032.0
DOE Office of Scientific and Technical Information (OSTI.GOV)
Aceves, S.; Lyford-Pike, E.
This was a collaborative effort between Lawrence Livermore National Security, LLC (formerly The Regents of the University of California)/Lawrence Livermore National Laboratory (LLNL) and Cummins Engine Company (Cummins) to advance the state of the art in Homogeneous-Charge Compression-Ignition (HCCI) engines, resulting in a clean, high-efficiency alternative to diesel engines.
2015 Cross-Domain Deterrence Seminar Summary Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Juarez, A.
2016-01-11
Lawrence Livermore National Laboratory (LLNL) hosted the 2nd Annual Cross-Domain Deterrence Seminar on November 17th, 2015 in Livermore, CA. The seminar was sponsored by LLNL’s Center for Global Security Research (CGSR), National Security Office (NSO), and Global Security program. This summary covers the seminar’s panels and subsequent discussions.
ERIC Educational Resources Information Center
Khoury, Anne
2006-01-01
Leadership development, a component of HRD, is becoming an area of increasingly important practice for all organizations. When companies such as Lawrence Livermore National Laboratory rely on knowledge workers for success, leadership becomes even more important. This research paper tests the hypothesis that leadership credibility and the courage…
Rapid Assessment of Individual Soldier Operational Readiness Final Report CRADA No. TC02104.0
DOE Office of Scientific and Technical Information (OSTI.GOV)
Turteltaub, K.; Mapes, J.
This was a collaborative effort between Lawrence Livermore National Security (LLNS) (formerly The Regents of the University of California), Lawrence Livermore National Laboratory (LLNL), and Rules Based Medicine, Inc. (RBM) to identify markers in blood that would be candidates for determining the combat readiness of troops.
High Peak Power Ka-Band Gyrotron Oscillator Experiments with Slotted and Unslotted Cavities.
1987-11-10
cylindrical graphite cathode by explosive plasma formation. (In order to optimize the compression ratio for these experiments, a graphite cathode was employed...
Megavolt, Multi-Kiloamp Ka-Band Gyrotron Oscillator Experiment
1989-03-15
pulseline accelerator with 20 K2 output impedance and 55 nsec voltage pulse was used to generate a multi-kiloamp annular electron beam by explosive plasma...
Computer Generated Holography with Intensity-Graded Patterns
Conti, Rossella; Assayag, Osnath; de Sars, Vincent; Guillon, Marc; Emiliani, Valentina
2016-01-01
Computer Generated Holography achieves patterned illumination at the sample plane through phase modulation of the laser beam at the objective back aperture. This is obtained by using liquid crystal-based spatial light modulators (LC-SLMs), which modulate the spatial phase of the incident laser beam. A variety of algorithms is employed to calculate the phase modulation masks addressed to the LC-SLM. These algorithms range from simple gratings-and-lenses approaches to generate multiple diffraction-limited spots, to iterative Fourier-transform algorithms capable of generating arbitrary illumination shapes perfectly tailored to the target contour. Applications for holographic light patterning include multi-trap optical tweezers, patterned voltage imaging, and optical control of neuronal excitation using uncaging or optogenetics. These past implementations of computer generated holography used binary input profiles to generate binary light distributions at the sample plane. Here we demonstrate that using graded input sources enables the generation of intensity-graded light patterns and extends the range of application of holographic illumination. First, we use intensity-graded holograms to compensate for the LC-SLM's position-dependent diffraction efficiency or for sample fluorescence inhomogeneity. Finally, we show that intensity-graded holography can be used to equalize photo-evoked currents from cells expressing different levels of channelrhodopsin-2 (ChR2), one of the most commonly used optogenetic light-gated channels, taking into account the nonlinear dependence of channel opening on incident light. PMID:27799896
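The iterative Fourier-transform approach mentioned above, extended to a graded (non-binary) target, can be sketched with a Gerchberg-Saxton-style loop. This is a minimal illustration with hypothetical names, not the authors' code: the far-field amplitude is constrained to the graded target while only the phase is retained at the SLM plane.

```python
import numpy as np

def gs_phase_mask(target_intensity, iterations=50, seed=0):
    """Gerchberg-Saxton sketch: find a phase-only mask at the SLM plane
    whose far field approximates a graded target intensity pattern."""
    rng = np.random.default_rng(seed)
    target_amp = np.sqrt(target_intensity)
    phase = rng.uniform(0.0, 2.0 * np.pi, target_amp.shape)
    for _ in range(iterations):
        slm_field = np.exp(1j * phase)          # unit-amplitude SLM plane
        far_field = np.fft.fft2(slm_field)      # propagate to sample plane
        # impose the graded target amplitude, keep the computed phase
        far_field = target_amp * np.exp(1j * np.angle(far_field))
        # back-propagate and keep only the phase at the SLM plane
        phase = np.angle(np.fft.ifft2(far_field))
    return phase
```

With a graded target (spots of different intensities), the reconstructed far field concentrates proportionally more energy on the brighter spots, which is the behavior the paper exploits for diffraction-efficiency compensation.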
DOE Office of Scientific and Technical Information (OSTI.GOV)
Davis, G.; Mansur, D.L.; Ruhter, W.D.
1994-01-01
The Lawrence Livermore National Laboratory (LLNL) carries out safeguards and security activities for the Department of Energy (DOE), Office of Safeguards and Security (OSS), as well as other organizations, both within and outside the DOE. This document summarizes the activities conducted for the OSS during the first quarter of fiscal year 1994 (October through December, 1993). The nature and scope of the activities carried out for OSS at LLNL require a broad base of technical expertise. To assure projects are staffed and executed effectively, projects are conducted by the organization at LLNL best able to supply the needed technical expertise. These projects are developed and managed by senior program managers. Institutional oversight and coordination is provided through the LLNL Deputy Director's office. At present, the Laboratory is supporting OSS in five areas: (1) Safeguards Technology, (2) Safeguards and Decision Support, (3) Computer Security, (4) DOE Automated Physical Security, and (5) DOE Automated Visitor Access Control System. This report describes the activities in each of these five areas. The information provided includes an introduction which briefly describes the activity, summary of major accomplishments, task descriptions with quarterly progress, summaries of milestones and deliverables, and publications published this quarter.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Davis, G.; Mansur, D.L.; Ruhter, W.D.
The Lawrence Livermore National Laboratory (LLNL) carries out safeguards and security activities for the Department of Energy (DOE), Office of Safeguards and Security (OSS), as well as other organizations, both within and outside the DOE. This document summarizes the activities conducted for the OSS during the First Quarter of Fiscal Year 1997 (October through December, 1996). The nature and scope of the activities carried out for OSS at LLNL require a broad base of technical expertise. To assure projects are staffed and executed effectively, projects are conducted by the organization at LLNL best able to supply the needed technical expertise. These projects are developed and managed by senior program managers. Institutional oversight and coordination is provided through the LLNL Deputy Director's office. At present, the Laboratory is supporting OSS in four areas: (1) safeguards technology; (2) safeguards and material accountability; (3) computer security--distributed systems; and (4) physical and personnel security support. The remainder of this report describes the activities in each of these four areas. The information provided includes an introduction which briefly describes the activity, summary of major accomplishments, task descriptions with quarterly progress, summaries of milestones and deliverables, and publications published this quarter.
Cognitive training transfer using a personal computer-based game: A close quarters battle case study
NASA Astrophysics Data System (ADS)
Woodman, Michael D.
In this dissertation, liquid crystal (LC) materials and devices are investigated in order to meet the challenges of photonics and display applications. We have studied three kinds of liquid crystal materials: positive dielectric anisotropic LCs, negative dielectric anisotropic LCs, and dual-frequency LCs. For the positive dielectric anisotropic LCs, we have developed high-birefringence isothiocyanato tolane LC compounds with birefringence ~0.4, and super-high-birefringence isothiocyanato biphenyl-bistolane LC compounds with birefringence as high as ~0.7. Moreover, we have studied the photostability of several high-birefringence LC compounds, mixtures, and LC alignment layers in order to determine the failure mechanism governing the lifetime of LC devices. Although cyano and isothiocyanato LC compounds have similar absorption peaks, the isothiocyanato compounds are more stable than their cyano counterparts under the same illumination conditions. This ultraviolet-durable performance of isothiocyanato compounds originates from their molecular structure and delocalized electron distribution. We have investigated the alignment performance of negative dielectric anisotropic LCs in a homeotropic (vertically aligned, VA) LC cell. Some (2,3) laterally difluorinated biphenyls, terphenyls, and tolanes were selected for this study. Due to the strong repulsive force between the LCs and the alignment layer, (2,3) laterally difluorinated terphenyls and tolanes do not align well in a VA cell, resulting in a poor contrast ratio for the LC panel. We have developed a novel method to suppress the light leakage in the dark state. By doping positive-Δε or non-polar LC compounds or mixtures into the host negative LC mixtures, the repulsive force is reduced and the cell exhibits an excellent dark state. In addition, these dopants increase the birefringence and reduce the viscosity of the host LCs, which leads to a faster response time.
In this dissertation, we also investigate the dielectric heating effect of dual-frequency LCs. Because the absorption peak of the imaginary dielectric constant occurs in the high-frequency region (~MHz), heat is generated when the LC cell is operated with a high-frequency voltage. We have formulated a new dual-frequency LC mixture which greatly reduces the dielectric heating effect while maintaining good physical properties. Another achievement of this thesis is a polarization-independent phase modulator based on a negative dielectric anisotropic LC gel. (Abstract shortened by UMI.)
Liquid crystal light valve technologies for display applications
NASA Astrophysics Data System (ADS)
Kikuchi, Hiroshi; Takizawa, Kuniharu
2001-11-01
The liquid crystal (LC) light valve, a spatial light modulator that uses LC material, is a very important device in display development, image processing, optical computing, holography, etc. In particular, there have been dramatic developments in the past few years in the application of the LC light valve to projectors and other display technologies. Various LC operating modes have been developed, including thin-film transistors, MOS-FETs, and other active-matrix drive techniques, to meet the requirements for higher resolution, and substantial improvements have been achieved in the performance of optical systems, resulting in brighter display images. Given this background, the number of applications for the LC light valve has greatly increased. The resolution has increased from QVGA (320 x 240) to QXGA (2048 x 1536) or even a super-high resolution of eight million pixels. In the area of optical output, projectors of 600 to 13,000 lm are now available, and they are used for presentations, home theatres, electronic cinema, and other diverse applications. Projectors using the LC light valve can display high-resolution images on large screens. They are now expected to be developed further as part of hyper-reality visual systems. This paper provides an overview of the needs for large-screen displays, human factors related to visual effects, the way in which LC light valves are applied to projectors, improvements in moving-picture quality, and the results of the latest studies made to improve the quality of still and moving images.
Conversion of the Livermore Education Center to College Status.
ERIC Educational Resources Information Center
Freitas, Joseph M.; And Others
In March 1988, the South County Community College District (SCCCD) requested the approval of the Board of Governors of the California Community Colleges to change the status of the Livermore Education Center from an "educational center" to a "college." An analysis by the Chancellor's Office of the request indicated that the District met Title 5…
360 Video Tour of 3D Printing Labs at LLNL
None
2018-01-16
Additive manufacturing is changing the way the world thinks about manufacturing and design. And here at Lawrence Livermore National Laboratory, it's changing the way our scientists approach research and development. Today we'll look around three of the additive manufacturing research labs on the Lawrence Livermore campus.
A Uniaxial Nonlinear Thermoviscoelastic Constitutive Model with Damage for M30 Gun Propellant
1994-06-01
"Gun Propellants at High Pressure," Lawrence Livermore National Laboratory, UCRL-88521, 1983.
2004-12-01
Keywords: radar and EM sensing, voiced speech excitations. Recoverable citations: "New Ideas for Speech Recognition and Related Technologies," Lawrence Livermore National Laboratory Report UCRL-UR-120310, 1995; Holzrichter, J. F., Kobler, J. B., Rosowski, J. J., and Burke, G. J. (2003), "EM wave...," Lawrence Livermore Laboratory report UCRL-JC-134775M.
LLNL: Science in the National Interest
George Miller
2017-12-09
This is Lawrence Livermore National Laboratory. Located in the Livermore Valley about 50 miles east of San Francisco, the Lab is where the nation's topmost science, engineering, and technology come together. National security, counter-terrorism, medical technologies, energy, climate change: our researchers are working to develop solutions to these challenges. For more than 50 years, we have been keeping America strong.
LIP: The Livermore Interpolation Package, Version 1.6
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fritsch, F. N.
2016-01-04
This report describes LIP, the Livermore Interpolation Package. LIP was totally rewritten from the package described in [1]. In particular, the independent variables are now referred to as x and y, since it is a general-purpose package that need not be restricted to equation of state data, which uses variables ρ (density) and T (temperature).
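As an illustration of the kind of two-variable table interpolation such a package provides, here is a minimal bilinear lookup in x and y. This is a generic sketch with hypothetical names, not LIP's actual API.

```python
import bisect

def bilinear(xs, ys, table, x, y):
    """Bilinear lookup on a rectangular grid: xs and ys are ascending
    coordinate arrays, and table[i][j] holds the value at (xs[i], ys[j]).
    Points outside the grid are linearly extrapolated from the edge cell."""
    # locate the cell containing (x, y), clamped to the grid interior
    i = min(max(bisect.bisect_right(xs, x) - 1, 0), len(xs) - 2)
    j = min(max(bisect.bisect_right(ys, y) - 1, 0), len(ys) - 2)
    # fractional position within the cell
    tx = (x - xs[i]) / (xs[i + 1] - xs[i])
    ty = (y - ys[j]) / (ys[j + 1] - ys[j])
    # weighted combination of the four corner values
    return ((1 - tx) * (1 - ty) * table[i][j]
            + tx * (1 - ty) * table[i + 1][j]
            + (1 - tx) * ty * table[i][j + 1]
            + tx * ty * table[i + 1][j + 1])
```

Bilinear interpolation reproduces any function that is linear in x and y exactly, which makes it easy to sanity-check against a tabulated affine function.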
Science and Technology Review June 2006
DOE Office of Scientific and Technical Information (OSTI.GOV)
Radousky, H
2006-04-20
This month's issue has the following articles: (1) Maintaining Excellence through Intellectual Vitality--Commentary by Cherry A. Murray; (2) Next-Generation Scientists and Engineers Tap Lab's Resources--University of California Ph.D. candidates work with Livermore scientists and engineers to conduct fundamental research as part of their theses; (3) Adaptive Optics Provide a Clearer View--The Center for Adaptive Optics is sharpening the view of celestial objects and retinal cells; (4) Wired on the Nanoscale--A Lawrence Fellow at Livermore is using genetically engineered viruses to create nanostructures such as tiny gold wires; and (5) Too Hot to Handle--Livermore scientists couple carbon-cycle and climate models to predict the global effects of depleting Earth's fossil-fuel supply.
Wang, Yu-Jen; Chen, Po-Ju; Liang, Xiao; Lin, Yi-Hsin
2017-03-27
Augmented reality (AR), which uses computer-aided projected information to augment our senses, has an important impact on human life, especially for elderly people. However, there are three major challenges for the optical system in an AR system: registration, vision correction, and readability under strong ambient light. Here, we solve these three challenges simultaneously for the first time using two liquid crystal (LC) lenses and a polarizer-free attenuator integrated in an optical-see-through AR system. One of the LC lenses is used to electrically adjust the position of the projected virtual image, the so-called registration. The other LC lens, with a larger aperture and a polarization-independent characteristic, is in charge of vision correction, such as myopia and presbyopia. The linearity of the lens powers of the two LC lenses is also discussed. The readability of virtual images under strong ambient light is addressed by the electrically switchable transmittance of the LC attenuator, which originates from light scattering and light absorption. The concept demonstrated in this paper could be further extended to other electro-optical devices as long as the devices exhibit the capability of phase and amplitude modulation.
Computational simulation of the creep-rupture process in filamentary composite materials
NASA Technical Reports Server (NTRS)
Slattery, Kerry T.; Hackett, Robert M.
1991-01-01
A computational simulation of the internal damage accumulation which causes the creep-rupture phenomenon in filamentary composite materials is developed. The creep-rupture process involves complex interactions between several damage mechanisms. A statistically-based computational simulation using a time-differencing approach is employed to model these progressive interactions. The finite element method is used to calculate the internal stresses. The fibers are modeled as a series of bar elements which are connected transversely by matrix elements. Flaws are distributed randomly throughout the elements in the model. Load is applied, and the properties of the individual elements are updated at the end of each time step as a function of the stress history. The simulation is continued until failure occurs. Several cases, with different initial flaw dispersions, are run to establish a statistical distribution of the time-to-failure. The calculations are performed on a supercomputer. The simulation results compare favorably with the results of creep-rupture experiments conducted at the Lawrence Livermore National Laboratory.
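The simulation strategy described, randomly distributed flaws, load sharing among surviving elements, and per-step property updates driven by stress history, can be illustrated with a toy global-load-sharing fiber-bundle model. All names, parameter values, and the power-law damage rate are assumptions for illustration, not the authors' finite-element formulation.

```python
import random

def creep_rupture_time(n_fibers=100, load=50.0, dt=0.01, seed=1):
    """Toy time-stepping fiber-bundle sketch of creep rupture: each fiber
    gets a random strength (the 'flaw' dispersion), surviving fibers share
    the load equally, and each fiber's damage grows per step at a rate
    driven by its current stress. Returns the time at which all fibers
    have failed."""
    rng = random.Random(seed)
    strength = [rng.uniform(0.8, 1.2) for _ in range(n_fibers)]
    damage = [0.0] * n_fibers
    alive = [True] * n_fibers
    t = 0.0
    while any(alive):
        stress = load / sum(alive)       # global load sharing among survivors
        for k in range(n_fibers):
            if alive[k]:
                # assumed power-law damage accumulation rate
                damage[k] += dt * (stress / strength[k]) ** 4
                if damage[k] >= 1.0:
                    alive[k] = False     # fiber breaks; load redistributes
        t += dt
        if t > 1e4:                      # safety cutoff for the sketch
            break
    return t
```

Running the model repeatedly with different seeds yields a statistical distribution of times-to-failure, mirroring the Monte Carlo use of the full simulation described in the abstract.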
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chen, T.F.; Gerhard, M.A.; Trummer, D.J.
CASKS (Computer Analysis of Storage casKS) is a microcomputer-based system of computer programs and databases developed at the Lawrence Livermore National Laboratory (LLNL) for evaluating safety analysis reports on spent-fuel storage casks. The bulk of the complete program and this user's manual are based upon the SCANS (Shipping Cask ANalysis System) program previously developed at LLNL. A number of enhancements and improvements were added to the original SCANS program to meet requirements unique to storage casks. CASKS is an easy-to-use system that calculates the global response of storage casks to impact loads, pressure loads, and thermal conditions. This provides reviewers with a tool for an independent check on analyses submitted by licensees. CASKS is based on microcomputers compatible with the IBM-PC family of computers. The system is composed of a series of menus, input programs, cask analysis programs, and output display programs. All data are entered through fill-in-the-blank input screens that contain descriptive data requests.
Enomoto, Yasunori; Inui, Naoki; Yoshimura, Katsuhiro; Nishimoto, Koji; Mori, Kazutaka; Kono, Masato; Fujisawa, Tomoyuki; Enomoto, Noriyuki; Nakamura, Yutaro; Iwashita, Toshihide; Suda, Takafumi
2016-12-01
Previous studies have reported that patients with idiopathic pulmonary fibrosis occasionally develop lung cancer (LC). However, in connective tissue disease (CTD)-related interstitial lung disease (ILD), there are few data regarding the LC development. The aim of the present study was to evaluate the clinical significance of LC development in patients with CTD-ILD. A retrospective review of our database of 562 patients with ILD between 2000 and 2014 identified 127 patients diagnosed with CTD-ILD. The overall and cumulative incidences of LC were calculated. In addition, the risk factors and prognostic impact of LC development were evaluated. The median age at the ILD diagnosis was 63 years (range 37-84 years), and 73 patients (57.5%) were female. The median follow-up period from the ILD diagnosis was 67.4 months (range 10.4-322.1 months). During the period, 7 out of the 127 patients developed LC (overall incidence 5.5%). The cumulative incidences at 1, 3, and 5 years were 0.0%, 1.8%, and 2.9%, respectively. The risk of LC development was significantly higher in patients with higher smoking pack-year (odds ratio [OR] 1.028; 95% confidence interval [CI] 1.008-1.049; P = 0.007) and emphysema on chest high-resolution computed tomography (OR 14.667; 95% CI 2.871-74.926; P = 0.001). The median overall survival time after developing LC was 7.0 months (95% CI 4.9-9.1 months), and the most common cause of death was LC, not ILD. According to the Cox proportional hazard model analysis with time-dependent covariates, patients who developed LC showed significantly poorer prognosis than those who did not (hazard ratio 87.86; 95% CI 19.56-394.67; P < 0.001). In CTD-ILD, clinicians should be careful with the risk of LC development in patients with a heavy smoking history and subsequent emphysema. Although not so frequent, the complication could be a poor prognostic determinant.
Rarefaction Shock Wave Cutter for Offshore Oil-Gas Platform Removal Final Report CRADA No. TC02009.0
DOE Office of Scientific and Technical Information (OSTI.GOV)
Glenn, L. A.; Barker, J.
This was a collaborative effort between Lawrence Livermore National Security, LLC/Lawrence Livermore National Laboratory (LLNL) (formerly the University of California) and Jet Research Center, a wholly owned division of Halliburton Energy Services, Inc. to design and prototype an improved explosive cutter for cutting the support legs of offshore oil and gas platforms.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Preston, R E
This report presents the results of Jones & Stokes special-status plant surveys and vegetation mapping for the University of California, Lawrence Livermore National Laboratory (LLNL). Special-status plant surveys were conducted at Site 300 in April to May 1997 and in March to April 2002. Eight special-status plants were identified at Site 300: large-flowered fiddleneck, big tarplant, diamond-petaled poppy, round-leaved filaree, gypsum-loving larkspur, California androsace, stinkbells, and hogwallow starfish. Maps identifying the locations of these species, a discussion of the occurrence of these species at Site 300, and a checklist of the flora of Site 300 are presented. A reconnaissance survey of the LLNL Livermore Site was conducted in June 2002. This survey concluded that no special-status plants occur at the Livermore Site. Vegetation mapping was conducted in 2001 at Site 300 to update a previous vegetation study done in 1986. The purpose of the vegetation mapping was to update and to delineate more precisely the boundaries between vegetation types and to map vegetation types that previously were not mapped. The vegetation map is presented with a discussion of the vegetation classification used.
Science & Technology Review October/November 2016
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vogt, R. L.; Meissner, C. N.; Kotta, P. R.
At Lawrence Livermore National Laboratory, we focus on science and technology research to ensure our nation’s security. We also apply that expertise to solve other important national problems in energy, bioscience, and the environment. Science & Technology Review is published eight times a year to communicate, to a broad audience, the Laboratory’s scientific and technological accomplishments in fulfilling its primary missions. The publication’s goal is to help readers understand these accomplishments and appreciate their value to the individual citizen, the nation, and the world. The Laboratory is operated by Lawrence Livermore National Security, LLC (LLNS), for the Department of Energy’s National Nuclear Security Administration. LLNS is a partnership involving Bechtel National, University of California, Babcock & Wilcox, Washington Division of URS Corporation, and Battelle in affiliation with Texas A&M University. More information about LLNS is available online at www.llnsllc.com. Please address any correspondence (including name and address changes) to S&TR, Mail Stop L-664, Lawrence Livermore National Laboratory, P.O. Box 808, Livermore, California 94551, or telephone (925) 423-3893. Our e-mail address is str-mail@llnl.gov. S&TR is available on the Web at str.llnl.gov.
Hartzell, Stephen; Leeds, Alena L.; Ramirez-Guzman, Leonardo; Allen, James P.; Schmitt, Robert G.
2016-01-01
Thirty‐two accelerometers were deployed in the Livermore Valley, California, for approximately one year to study sedimentary basin effects. Many local and near‐regional earthquakes were recorded, including the 24 August 2014 Mw 6.0 Napa, California, earthquake. The resulting ground‐motion data set is used to quantify the seismic response of the Livermore basin, a major structural depression in the California Coast Range Province bounded by active faults. Site response is calculated by two methods: the reference‐site spectral ratio method and a source‐site spectral inversion method. Longer‐period (≥1 s) amplification factors follow the same general pattern as Bouguer gravity anomaly contours. Site response spectra are inverted for shallow shear‐wave velocity profiles, which are consistent with independent information. Frequency–wavenumber analysis is used to analyze plane‐wave propagation across the Livermore Valley and to identify basin‐edge‐induced surface waves with back azimuths different from the source back azimuth. Finite‐element simulations in a 3D velocity model of the region illustrate the generation of basin‐edge‐induced surface waves and point out strips of elevated ground velocities along the margins of the basin.
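The reference-site spectral ratio method mentioned here divides the amplitude spectrum recorded at a basin station by that at a nearby rock (reference) station, approximately canceling source and path effects common to both. A minimal sketch with hypothetical names, not the authors' processing chain:

```python
import numpy as np

def spectral_ratio(basin_record, reference_record, dt):
    """Reference-site spectral ratio: amplitude spectrum of a basin-site
    record divided by that of a reference (rock) record sampled at the
    same interval dt. Returns (frequencies, ratio)."""
    n = min(len(basin_record), len(reference_record))
    freqs = np.fft.rfftfreq(n, d=dt)
    basin_amp = np.abs(np.fft.rfft(basin_record[:n]))
    ref_amp = np.abs(np.fft.rfft(reference_record[:n]))
    eps = 1e-12  # guard against division by zero at spectral nulls
    return freqs, basin_amp / (ref_amp + eps)
```

In practice the ratio is usually smoothed and averaged over many events; a ratio well above 1 at a given frequency indicates basin amplification there.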
Proceedings of the Fifth Joint Russian-American Computational Mathematics Conference
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
1997-12-31
These proceedings contain a record of the talks presented and papers submitted by participants. The conference participants represented three institutions from the United States (Sandia National Laboratories (SNL), Los Alamos National Laboratory (LANL), and Lawrence Livermore National Laboratory (LLNL)) and two from Russia (the Russian Federal Nuclear Center--All-Russian Research Institute of Experimental Physics (RFNC-VNIIEF/Arzamas-16) and the Russian Federal Nuclear Center--All-Russian Research Institute of Technical Physics (RFNC-VNIITF/Chelyabinsk-70)). The presentations and papers cover a wide range of applications, from radiation transport to materials. Selected papers have been indexed separately for inclusion in the Energy Science and Technology Database.
Science & Technology Review January/February 2010
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bearinger, J P
2009-11-30
This month's issue has the following articles: (1) Innovative Materials Rise to the Radiation Challenge - Commentary by Bruce Warner; (2) The Hunt for Better Radiation Detection - New materials will help radiation detectors pick up weak signals and accurately identify illicit radioactive sources; (3) Time-Critical Technology Identifies Deadly Bloodborne Pathogens - A portable device can simultaneously distinguish up to five bloodborne pathogens in just minutes; (4) Defending Computer Networks against Attack - A Laboratory effort takes a new approach to detecting increasingly sophisticated cyber attacks; and (5) Imaging Cargo's Inner Secrets - Livermore-University of California collaborators are modeling a new radiographic technique for identifying nuclear materials concealed inside cargo containers.
Nuclear winter from gulf war discounted
DOE Office of Scientific and Technical Information (OSTI.GOV)
Marshall, E.
Would a major conflagration in Kuwait's oil fields trigger a climate catastrophe akin to the 'nuclear winter' that got so much attention in the 1980s? This question prompted a variety of opinions. The British Meteorological Office and researchers at Lawrence Livermore National Laboratory concluded that the effect of smoke from major oil fires in Kuwait on global temperatures is likely to be small; however, the obscuration of sunlight might significantly reduce surface temperatures locally. Michael MacCracken, leader of the researchers at Livermore, predicts that the worst plausible oil fires in the Gulf would produce a cloud of pollution about as severe as that found on a bad day at the Los Angeles airport. The results of some mathematical modeling by the Livermore research group are reported.
Science & Technology Review November 2007
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chinn, D J
2007-10-16
This month's issue has the following articles: (1) Simulating the Electromagnetic World--Commentary by Steven R. Patterson; (2) A Code to Model Electromagnetic Phenomena--EMSolve, a Livermore supercomputer code that simulates electromagnetic fields, is helping advance a wide range of research efforts; (3) Characterizing Virulent Pathogens--Livermore researchers are developing multiplexed assays for rapid detection of pathogens; (4) Imaging at the Atomic Level--A powerful new electron microscope at the Laboratory is resolving materials at the atomic level for the first time; (5) Scientists without Borders--Livermore scientists lend their expertise on peaceful nuclear applications to their counterparts in other countries; and (6) Probing Deep into the Nucleus--Edward Teller's contributions to the fast-growing fields of nuclear and particle physics were part of a physics golden age.
The PDAPP mouse model of Alzheimer's disease: locus coeruleus neuronal shrinkage.
German, Dwight C; Nelson, Omar; Liang, Fen; Liang, Chang-Lin; Games, Dora
2005-11-28
Alzheimer's disease is characterized by neuronal degeneration in the cerebral cortex and hippocampus and subcortical neuronal degeneration in such nuclei as the locus coeruleus (LC). Transgenic mice overexpressing mutant human amyloid precursor protein V717F, PDAPP mice, develop several Alzheimer's disease-like lesions. The present study sought to determine whether there is also loss of LC noradrenergic neurons or evidence of degenerative changes in these animals. PDAPP hemizygous and wild-type littermate control mice were examined at 23 months of age, at a time when there are numerous amyloid-beta (Abeta) plaques in the neocortex and hippocampus. Tissue sections were stained immunohistochemically with an antibody against tyrosine hydroxylase (TH) to identify LC neurons. Computer imaging procedures were used to count the TH-immunoreactive somata in sections through the rostral-caudal extent of the nucleus. There was no loss of LC neurons in the hemizygous mice. In a second experiment, homozygous PDAPP and wild-type mice were examined, at 2 months and 24 months of age. Again there was no age-related loss of neurons in the homozygous animals. In the portion of the LC where neurons reside that project to the cortex and hippocampus, however, the neurons were decreased in size selectively in the 24-month-old transgenic animals. These data indicate that overt LC cell loss does not occur following abundant overexpression of Abeta peptide. However, the selective size reduction of the LC neuronal population projecting to cortical and hippocampal regions containing Abeta-related neuropathology implies that these cells may be subjected to a retrograde-mediated stress. Copyright 2005 Wiley-Liss, Inc.
Hypercube matrix computation task
NASA Technical Reports Server (NTRS)
Calalo, R.; Imbriale, W.; Liewer, P.; Lyons, J.; Manshadi, F.; Patterson, J.
1987-01-01
The Hypercube Matrix Computation (Year 1986-1987) task investigated the applicability of a parallel computing architecture to the solution of large scale electromagnetic scattering problems. Two existing electromagnetic scattering codes were selected for conversion to the Mark III Hypercube concurrent computing environment. They were selected so that the underlying numerical algorithms would differ, thereby providing a more thorough evaluation of the appropriateness of the parallel environment for these types of problems. The first code was a frequency domain method of moments solution, NEC-2, developed at Lawrence Livermore National Laboratory. The second code was a time domain finite difference solution of Maxwell's equations to solve for the scattered fields. Once the codes were implemented on the hypercube and verified to obtain correct solutions by comparing the results with those from sequential runs, several measures were used to evaluate the performance of the two codes. First, a comparison was provided of the problem size possible on the hypercube with 128 megabytes of memory for a 32-node configuration with that available in a typical sequential user environment of 4 to 8 megabytes. Then, the performance of the codes was analyzed for the computational speedup attained by the parallel architecture.
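The two performance measures named above, speedup and the problem-size comparison, reduce to simple ratios. A minimal sketch follows; the timings are hypothetical placeholders, not results from the Mark III study.

```python
def speedup(t_serial, t_parallel):
    """Parallel speedup S = T_serial / T_parallel."""
    return t_serial / t_parallel

def efficiency(t_serial, t_parallel, n_nodes):
    """Parallel efficiency E = S / N (1.0 is ideal scaling)."""
    return speedup(t_serial, t_parallel) / n_nodes

# Hypothetical timings for a 32-node run, for illustration only
t_seq, t_par = 3200.0, 125.0
print(speedup(t_seq, t_par))         # 25.6
print(efficiency(t_seq, t_par, 32))  # 0.8
```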
Delivering Insight The History of the Accelerated Strategic Computing Initiative
DOE Office of Scientific and Technical Information (OSTI.GOV)
Larzelere II, A R
2007-01-03
The history of the Accelerated Strategic Computing Initiative (ASCI) tells of the development of computational simulation into a third fundamental piece of the scientific method, on a par with theory and experiment. ASCI did not invent the idea, nor was it alone in bringing it to fruition. But ASCI provided the wherewithal - hardware, software, environment, funding, and, most of all, the urgency - that made it happen. On October 1, 2005, the Initiative completed its tenth year of funding. The advances made by ASCI over its first decade are truly incredible. Lawrence Livermore, Los Alamos, and Sandia National Laboratories, along with leadership provided by the Department of Energy's Defense Programs Headquarters, fundamentally changed computational simulation and how it is used to enable scientific insight. To do this, astounding advances were made in simulation applications, computing platforms, and user environments. ASCI dramatically changed existing - and forged new - relationships, both among the Laboratories and with outside partners. By its tenth anniversary, despite daunting challenges, ASCI had accomplished all of the major goals set at its beginning. The history of ASCI is about the vision, leadership, endurance, and partnerships that made these advances possible.
Active and passive computed tomography mixed waste focus area final report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Roberson, G P
1998-08-19
The Mixed Waste Focus Area (MWFA) Characterization Development Strategy delineates an approach to resolve technology deficiencies associated with the characterization of mixed wastes. The intent of this strategy is to ensure the availability of technologies to support the Department of Energy's (DOE) mixed waste low-level or transuranic (TRU) contaminated waste characterization management needs. To this end, the MWFA has defined and coordinated characterization development programs to ensure that the data and test results necessary to evaluate the utility of non-destructive assay technologies are available to meet site contact-handled waste management schedules. Requirements used as technology development project benchmarks are based in the National TRU Program Quality Assurance Program Plan. These requirements include the ability to determine total bias and total measurement uncertainty. These parameters must be completely evaluated for the waste types to be processed through a given nondestructive waste assay system, and that evaluation constitutes the foundation of activities undertaken in technology development projects. Once development and testing activities have been completed, Innovative Technology Summary Reports are generated to provide results and conclusions to support EM-30, -40, or -60 end user/customer technology selection.
The Active and Passive Computed Tomography non-destructive assay system is one of the technologies selected for development by the MWFA. Lawrence Livermore National Laboratory (LLNL) is developing the Active and Passive Computed Tomography (A&PCT) nondestructive assay (NDA) technology to identify and accurately quantify all detectable radioisotopes in closed containers of waste. This technology will be applicable to all types of waste regardless of their classification: low-level, transuranic, or mixed waste containing both radioactivity and hazardous organic species. The scope of our technology is to develop a non-invasive waste-drum scanner that employs the principles of computed tomography and gamma-ray spectral analysis to identify and quantify all of the detectable radioisotopes. Once this and other applicable technologies are developed, waste drums can be non-destructively and accurately characterized to satisfy repository and regulatory guidelines prior to disposal.
Sonk, Jason A; Schlegel, H Bernhard
2011-10-27
Time-dependent configuration interaction (TD-CI) simulations can be used to simulate molecules in intense laser fields. TD-CI calculations use the excitation energies and transition dipoles calculated in the absence of a field. The EOM-CCSD method provides a good estimate of the field-free excited states but is rather expensive. Linear-response time-dependent density functional theory (TD-DFT) is an inexpensive alternative for computing the field-free excitation energies and transition dipoles needed for TD-CI simulations. Linear-response TD-DFT calculations were carried out with standard functionals (B3LYP, BH&HLYP, HSE2PBE (HSE03), BLYP, PBE, PW91, and TPSS) and long-range corrected functionals (LC-ωPBE, ωB97XD, CAM-B3LYP, LC-BLYP, LC-PBE, LC-PW91, and LC-TPSS). These calculations used the 6-31G(d,p) basis set augmented with three sets of diffuse sp functions on each heavy atom. Butadiene was employed as a test case, and 500 excited states were calculated with each functional. Standard functionals yield average excitation energies that are significantly lower than the EOM-CC values, while long-range corrected functionals tend to produce average excitation energies that are slightly higher. Long-range corrected functionals also yield transition dipoles that are somewhat larger than EOM-CC on average. The TD-CI simulations were carried out with a three-cycle Gaussian pulse (ω = 0.06 au, 760 nm) with intensities up to 1.26 × 10^14 W cm^-2 directed along the vector connecting the end carbons. The nonlinear response as indicated by the residual populations of the excited states after the pulse is far too large with standard functionals, primarily because the excitation energies are too low. The LC-ωPBE, LC-PBE, LC-PW91, and LC-TPSS long-range corrected functionals produce responses comparable to EOM-CC.
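The TD-CI scheme described above, field-free state energies and transition dipoles assembled into a time-dependent Hamiltonian H(t) = H0 - E(t)·μ and propagated through a pulse, can be sketched on a synthetic three-state model. All numbers below (energies, dipoles, field strength) are illustrative placeholders, and a sin² envelope stands in for the paper's Gaussian pulse.

```python
import numpy as np

# Synthetic 3-state model: ground state plus two excited states.
# Energies and transition dipoles are illustrative placeholders.
E = np.array([0.0, 0.23, 0.31])        # field-free state energies (hartree)
mu = np.zeros((3, 3))
mu[0, 1] = mu[1, 0] = 1.2              # ground-excited transition dipoles (au)
mu[0, 2] = mu[2, 0] = 0.4

omega, e0 = 0.06, 0.02                 # carrier frequency and peak field (au)
T = 3 * 2 * np.pi / omega              # three optical cycles

def field(t):
    # sin^2 envelope: strictly zero outside [0, T]
    if 0.0 <= t <= T:
        return e0 * np.sin(np.pi * t / T) ** 2 * np.cos(omega * (t - T / 2))
    return 0.0

def rhs(t, c):
    # i dc/dt = H(t) c  with  H(t) = diag(E) - field(t) * mu
    H = np.diag(E) - field(t) * mu
    return -1j * (H @ c)

# RK4 propagation starting from the ground state
c = np.array([1.0, 0.0, 0.0], dtype=complex)
t, dt = 0.0, 0.05
while t < T:
    k1 = rhs(t, c)
    k2 = rhs(t + dt / 2, c + dt / 2 * k1)
    k3 = rhs(t + dt / 2, c + dt / 2 * k2)
    k4 = rhs(t + dt, c + dt * k3)
    c += dt / 6 * (k1 + 2 * k2 + 2 * k3 + k4)
    t += dt

pops = np.abs(c) ** 2
print("residual excited-state population:", pops[1:].sum())
```

The residual excited-state population after the pulse is the nonlinear-response indicator the abstract compares across functionals.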
Fixatives Application for Risk Mitigation Following Contamination with a Biological Agent
2011-11-02
Gruinard Island: 5% formaldehyde. Sverdlovsk release: unknown, but washing, chloramines, and soil disposal are believed to have been used. Disinfectants achieving >6-log reduction on materials (EPA, 2010a,b; Wood et al., 2011). High disinfectant concentrations increase operational costs and risk. (Presentation slides, LLNL-PRES-507816, Lawrence Livermore National Laboratory.)
Material Modeling for Terminal Ballistic Simulation
1992-09-01
DYNA3D--a nonlinear, explicit, three-dimensional finite element code for solid and structural mechanics: user manual. Technical Report UCRL-MA-... Rep. UCRL-50108, Rev. 1, Lawrence Livermore Laboratory, 1977. [34] S. P. Marsh. LASL Shock Hugoniot Data. University of California Press, Berkeley, CA... D. J. Steinberg. Equation of state and strength properties of selected materials. Tech. Rep. UCRL-MA-106439, Lawrence Livermore National Laboratory, 1991. [37]
Characterization of Jets From Exploding Bridge Wire Detonators
2005-05-01
Laboratories: Albuquerque, NM, 1992. 8. Lee, E. L.; Hornig, H. C.; Kury, J. W. Adiabatic Expansion of High Explosive Detonation Products; UCRL-... Dobratz, B. M. LLNL Explosives Handbook; UCRL-5299; Lawrence Livermore Laboratory, University of California: Livermore, CA, 1981. 22... ATTN AFATL DLJR D LAMBERT EGLIN AFB FL 32542-6810 2 DARPA ATTN W SNOWDEN S WAX 3701 N FAIRFAX DR ARLINGTON VA 22203-1714 2 LOS
Federal Register 2010, 2011, 2012, 2013, 2014
2013-09-13
...NIOSH gives notice as required by Department of Health and Human Services regulations of a decision to evaluate a petition to designate a class of employees from the Sandia National Laboratory- Livermore in Livermore, California to be included in the Special Exposure Cohort under the Energy Employees Occupational Illness Compensation Program Act of 2000.
Astronomy Applications of Adaptive Optics at Lawrence Livermore National Laboratory
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bauman, B J; Gavel, D T
2003-04-23
Astronomical applications of adaptive optics at Lawrence Livermore National Laboratory (LLNL) have a history extending back to 1984. The program started with the Lick Observatory adaptive optics system and has progressed through the years to ever-larger telescopes: Keck, and now the proposed CELT (California Extremely Large Telescope) 30-m telescope. LLNL AO continues to be at the forefront of AO development and science.
Livermore Accelerator Source for Radionuclide Science (LASRS)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Anderson, Scott; Bleuel, Darren; Johnson, Micah
The Livermore Accelerator Source for Radionuclide Science (LASRS) will generate intense photon and neutron beams to address important gaps in the study of radionuclide science that directly impact Stockpile Stewardship, Nuclear Forensics, and Nuclear Material Detection. The co-location of MeV-scale neutral and photon sources with radiochemical analytics provides a unique facility to meet current and future challenges in nuclear security and nuclear science.
Running SW4 On New Commodity Technology Systems (CTS-1) Platform
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rodgers, Arthur J.; Petersson, N. Anders; Pitarka, Arben
We have recently been running earthquake ground motion simulations with SW4 on the new capacity computing systems, called the Commodity Technology Systems - 1 (CTS-1) at Lawrence Livermore National Laboratory (LLNL). SW4 is a fourth order time domain finite difference code developed by LLNL and distributed by the Computational Infrastructure for Geodynamics (CIG). SW4 simulates seismic wave propagation in complex three-dimensional Earth models including anelasticity and surface topography. We are modeling near-fault earthquake strong ground motions for the purposes of evaluating the response of engineered structures, such as nuclear power plants and other critical infrastructure. Engineering analysis of structures requires the inclusion of high frequencies which can cause damage, but are often difficult to include in simulations because of the need for large memory to model fine grid spacing on large domains.
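The memory pressure from high frequencies follows from the grid-resolution requirement: finite-difference spacing must scale with the shortest wavelength, roughly vmin/fmax. A back-of-the-envelope sketch; all values, including the bytes-per-point figure, are assumptions for illustration, not SW4 internals.

```python
def fd_grid_estimate(vmin_mps, fmax_hz, ppw, domain_km, depth_km,
                     bytes_per_point=128):
    """Rough size of a uniform finite-difference grid.

    h = vmin / (fmax * ppw) resolves the shortest wavelength with `ppw`
    grid points; bytes_per_point assumes ~16 double-precision variables
    per point (an illustrative figure, not SW4's actual footprint).
    """
    h = vmin_mps / (fmax_hz * ppw)              # grid spacing in metres
    nx = ny = int(domain_km * 1000 / h)
    nz = int(depth_km * 1000 / h)
    n = nx * ny * nz
    return h, n, n * bytes_per_point / 1e9      # spacing, points, memory (GB)

# Doubling the maximum frequency halves the spacing and multiplies the
# memory requirement by ~8 (assumed velocities and domain sizes):
h1, n1, gb1 = fd_grid_estimate(500.0, 2.5, 8, 100.0, 30.0)
h2, n2, gb2 = fd_grid_estimate(500.0, 5.0, 8, 100.0, 30.0)
print(f"2.5 Hz: h = {h1:.1f} m, {gb1:,.0f} GB")
print(f"5.0 Hz: h = {h2:.1f} m, {gb2:,.0f} GB")
```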
Wangerin, K; Culbertson, C N; Jevremovic, T
2005-08-01
The goal of this study was to evaluate the COG Monte Carlo radiation transport code, developed and tested by Lawrence Livermore National Laboratory, for gadolinium neutron capture therapy (GdNCT)-related modeling. The validity of the COG NCT model had been established previously; here the calculation was extended to analyze the effect of various gadolinium concentrations on the dose distribution and cell-kill effect of the GdNCT modality and to determine the optimum therapeutic conditions for treating brain cancers. The computational results were compared with those of the widely used MCNP code. The differences between the COG and MCNP predictions were generally small, suggesting that the COG code can be applied to similar research problems in NCT. Results of this study also showed that a concentration of 100 ppm gadolinium in the tumor was most beneficial when using an epithermal neutron beam.
Computational Study of the Richtmyer-Meshkov Instability with a Complex Initial Condition
NASA Astrophysics Data System (ADS)
McFarland, Jacob; Reilly, David; Greenough, Jeffrey; Ranjan, Devesh
2014-11-01
Results are presented for a computational study of the Richtmyer-Meshkov instability with a complex initial condition. This study covers experiments which will be conducted at the newly-built inclined shock tube facility at the Georgia Institute of Technology. The complex initial condition employed consists of an underlying inclined interface perturbation with a broadband spectrum of modes superimposed. A three-dimensional staggered mesh arbitrary Lagrange Eulerian (ALE) hydrodynamics code developed at Lawrence Livermore National Laboratory called ARES was used to obtain both qualitative and quantitative results. Qualitative results are discussed using time series of density plots from which mixing width may be extracted. Quantitative results are also discussed using vorticity fields, circulation components, and energy spectra. The inclined interface case is compared to the complex interface case in order to study the effect of initial conditions on shocked, variable-density flows.
Research in Information Processing and Computer Science. Final Technical Report.
ERIC Educational Resources Information Center
Carnegie-Mellon Univ., Pittsburgh, PA. Social Studies Curriculum Center.
This is the final scientific research report for the research in programming at Carnegie-Mellon University during 1968-1970. Three team programming efforts during the past two years have been the development of (1) BLISS--a system building language on the PDP-10 computer, (2) LC2--a conversational system on the IBM/360, and L*--a system building…
Liu, Nai-Yu; Lee, Hsiao-Hui; Chang, Zee-Fen; Tsay, Yeou-Guang
2015-09-10
It has been observed that a modified peptide and its non-modified counterpart, when analyzed with reverse phase liquid chromatography, usually share a very similar elution property [1-3]. Inasmuch as this property is common to many different types of protein modifications, we propose an informatics-based approach, featuring the generation of segmental average mass spectra ((sa)MS), that is capable of locating different types of modified peptides in two-dimensional liquid chromatography-mass spectrometric (LC-MS) data collected for regular protease digests from proteins in gels or solutions. To enable the localization of these peptides in the LC-MS map, we have implemented a set of computer programs, or the (sa)MS package, that perform the needed functions, including generating a complete set of segmental average mass spectra, compiling the peptide inventory from the Sequest/TurboSequest results, searching modified peptide candidates and annotating a tandem mass spectrum for final verification. Using ROCK2 as an example, our programs were applied to identify multiple types of modified peptides, such as phosphorylated and hexosylated ones, which particularly include those peptides that could have been ignored due to their peculiar fragmentation patterns and consequent low search scores. Hence, we demonstrate that, when complemented with peptide search algorithms, our approach and the entailed computer programs can add the sequence information needed for bolstering the confidence of data interpretation by the present analytical platforms and facilitate the mining of protein modification information out of complicated LC-MS/MS data. Copyright © 2015 Elsevier B.V. All rights reserved.
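The core matching idea above, that a modified peptide should appear at the unmodified peptide's mass plus a known modification shift and at a nearby retention time, can be sketched as follows. The peptide masses, retention times, and tolerances are invented for illustration; this is not the (sa)MS package's actual interface.

```python
PHOSPHO = 79.96633   # phosphorylation mass shift (Da)
HEXOSE = 162.05282   # hexosylation mass shift (Da)

def find_modified_candidates(peptides, features, delta,
                             mass_tol=0.01, rt_tol=2.0):
    """peptides: [(name, mass, retention_time)]; features: [(mass, rt)].

    Flags LC-MS features whose mass matches a known peptide plus the
    modification shift `delta` AND whose retention time is nearby,
    exploiting the shared elution property of modified peptides."""
    hits = []
    for name, pmass, prt in peptides:
        for fmass, frt in features:
            if (abs(fmass - (pmass + delta)) <= mass_tol
                    and abs(frt - prt) <= rt_tol):
                hits.append((name, (fmass, frt)))
    return hits

peptides = [("PEPTIDEK", 1000.500, 35.2)]
features = [(1080.466, 34.8),   # right mass shift, co-eluting -> candidate
            (1080.466, 60.0),   # right mass shift, wrong retention time
            (1162.553, 35.5)]   # matches the hexose shift instead
print(find_modified_candidates(peptides, features, PHOSPHO))
print(find_modified_candidates(peptides, features, HEXOSE))
```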
Patient-specific lean body mass can be estimated from limited-coverage computed tomography images.
Devriese, Joke; Beels, Laurence; Maes, Alex; van de Wiele, Christophe; Pottel, Hans
2018-06-01
In PET/CT, quantitative evaluation of tumour metabolic activity is possible through standardized uptake values, usually normalized for body weight (BW) or lean body mass (LBM). Patient-specific LBM can be estimated from whole-body (WB) CT images. As most clinical indications only warrant PET/CT examinations covering head to midthigh, the aim of this study was to develop a simple and reliable method to estimate LBM from limited-coverage (LC) CT images and test its validity. Head-to-toe PET/CT examinations were retrospectively retrieved and semiautomatically segmented into tissue types based on thresholding of CT Hounsfield units. LC was obtained by omitting image slices. Image segmentation was validated on the WB CT examinations by comparing CT-estimated BW with actual BW, and LBM estimated from LC images were compared with LBM estimated from WB images. A direct method and an indirect method were developed and validated on an independent data set. Comparing LBM estimated from LC examinations with estimates from WB examinations (LBMWB) showed a significant but limited bias of 1.2 kg (direct method) and nonsignificant bias of 0.05 kg (indirect method). This study demonstrates that LBM can be estimated from LC CT images with no significant difference from LBMWB.
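The segmentation step described above, classifying CT voxels by Hounsfield-unit thresholds and converting tissue volumes to masses, can be sketched in a few lines. The HU ranges, tissue densities, and the lean+bone LBM definition are illustrative assumptions, not the paper's calibrated values.

```python
import numpy as np

def estimate_masses(hu, voxel_volume_ml):
    """Classify CT voxels by Hounsfield-unit thresholds and sum masses.

    HU ranges, tissue densities (g/ml), and the lean+bone LBM definition
    are illustrative round numbers, not the paper's calibration."""
    lean = (hu >= -29) & (hu <= 150)    # lean soft tissue
    fat = (hu >= -190) & (hu < -29)     # adipose tissue
    bone = hu > 150                     # bone
    grams = {
        "lean": lean.sum() * voxel_volume_ml * 1.04,
        "fat": fat.sum() * voxel_volume_ml * 0.92,
        "bone": bone.sum() * voxel_volume_ml * 1.60,
    }
    grams["lbm"] = grams["lean"] + grams["bone"]
    return grams

# Tiny synthetic volume: air (-1000), fat (-100), muscle (40), bone (400)
hu = np.array([[[-1000, -100], [40, 400]]])
print(estimate_masses(hu, voxel_volume_ml=1.0))
```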
Letarte, Simon; Brusniak, Mi-Youn; Campbell, David; Eddes, James; Kemp, Christopher J; Lau, Hollis; Mueller, Lukas; Schmidt, Alexander; Shannon, Paul; Kelly-Spratt, Karen S; Vitek, Olga; Zhang, Hui; Aebersold, Ruedi; Watts, Julian D
2008-12-01
A proof-of-concept demonstration of the use of label-free quantitative glycoproteomics for biomarker discovery workflow is presented here, using a mouse model for skin cancer as an example. Blood plasma was collected from 10 control mice, and 10 mice with a mutation in the p19(ARF) gene that confers a high propensity to develop skin cancer after carcinogen exposure. We enriched for N-glycosylated plasma proteins, ultimately generating deglycosylated forms of the modified tryptic peptides for liquid chromatography mass spectrometry (LC-MS) analyses. LC-MS runs for each sample were then performed with a view to identifying proteins that were differentially abundant between the two mouse populations. We then used a recently developed computational framework, Corra, to perform peak picking and alignment, and to compute the statistical significance of any observed changes in individual peptide abundances. Once determined, the most discriminating peptide features were then fragmented and identified by tandem mass spectrometry with the use of inclusion lists. We next assessed the identified proteins to see if there were sets of proteins indicative of specific biological processes that correlate with the presence of disease, and specifically cancer, according to their functional annotations. As expected for such sick animals, many of the proteins identified were related to host immune response. However, a significant number of proteins were also directly associated with processes linked to cancer development, including proteins related to the cell cycle, localisation, transport, and cell death. Additional analysis of the same samples in profiling mode, and in triplicate, confirmed that replicate MS analysis of the same plasma sample generated less variation than that observed between plasma samples from different individuals, demonstrating that the reproducibility of the LC-MS platform was sufficient for this application. 
These results thus show that an LC-MS-based workflow can be a useful tool for the generation of candidate proteins of interest as part of a disease biomarker discovery effort.
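The per-feature statistical comparison that a framework like Corra performs can be illustrated in miniature: log-transform aligned feature intensities and score each feature's group difference, here with Welch's t statistic. The intensities are synthetic, and this is a sketch of the idea, not Corra's actual model.

```python
import numpy as np

def welch_t(a, b):
    """Welch's t statistic for an unequal-variance two-group comparison."""
    a, b = np.asarray(a, float), np.asarray(b, float)
    var_a = a.var(ddof=1) / len(a)
    var_b = b.var(ddof=1) / len(b)
    return (a.mean() - b.mean()) / np.sqrt(var_a + var_b)

# Synthetic aligned-feature intensities for one peptide, 4 mice per group
control = np.log2([5200.0, 4800.0, 5100.0, 4950.0])
mutant = np.log2([10400.0, 9800.0, 11000.0, 9900.0])

log2fc = mutant.mean() - control.mean()
t_stat = welch_t(mutant, control)
print(f"log2 fold change = {log2fc:.2f}, Welch t = {t_stat:.1f}")
```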
Letarte, Simon; Brusniak, Mi-Youn; Campbell, David; Eddes, James; Kemp, Christopher J.; Lau, Hollis; Mueller, Lukas; Schmidt, Alexander; Shannon, Paul; Kelly-Spratt, Karen S.; Vitek, Olga; Zhang, Hui; Aebersold, Ruedi; Watts, Julian D.
2010-01-01
A proof-of-concept demonstration of the use of label-free quantitative glycoproteomics for biomarker discovery workflow is presented here, using a mouse model for skin cancer as an example. Blood plasma was collected from 10 control mice, and 10 mice with a mutation in the p19ARF gene that confers a high propensity to develop skin cancer after carcinogen exposure. We enriched for N-glycosylated plasma proteins, ultimately generating deglycosylated forms of the modified tryptic peptides for liquid chromatography mass spectrometry (LC-MS) analyses. LC-MS runs for each sample were then performed with a view to identifying proteins that were differentially abundant between the two mouse populations. We then used a recently developed computational framework, Corra, to perform peak picking and alignment, and to compute the statistical significance of any observed changes in individual peptide abundances. Once determined, the most discriminating peptide features were then fragmented and identified by tandem mass spectrometry with the use of inclusion lists. We next assessed the identified proteins to see if there were sets of proteins indicative of specific biological processes that correlate with the presence of disease, and specifically cancer, according to their functional annotations. As expected for such sick animals, many of the proteins identified were related to host immune response. However, a significant number of proteins were also directly associated with processes linked to cancer development, including proteins related to the cell cycle, localisation, transport, and cell death. Additional analysis of the same samples in profiling mode, and in triplicate, confirmed that replicate MS analysis of the same plasma sample generated less variation than that observed between plasma samples from different individuals, demonstrating that the reproducibility of the LC-MS platform was sufficient for this application. 
These results thus show that an LC-MS-based workflow can be a useful tool for the generation of candidate proteins of interest as part of a disease biomarker discovery effort. PMID:20157627
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nikolic, R J
This month's issue has the following articles: (1) Honoring a Legacy of Service to the Nation - The nation pays tribute to George Miller, who retired in December 2011 as the Laboratory's tenth director; (2) Life-Extension Programs Encompass All Our Expertise - Commentary by Bruce T. Goodwin; (3) Extending the Life of an Aging Weapon - Stockpile stewards have begun work on a multiyear effort to extend the service life of the aging W78 warhead by 30 years; (4) Materials by Design - Material microstructures go three-dimensional with improved additive manufacturing techniques developed at Livermore; (5) Friendly Microbes Power Energy-Producing Devices - Livermore researchers are demonstrating how electrogenic bacteria and microbial fuel cell technologies can produce clean, renewable energy and purify water; and (6) Chemical Sensor Is All Wires, No Batteries - Livermore's 'batteryless' nanowire sensor could benefit applications in diverse fields such as homeland security and medicine.
2011-11-16
Security, LLC. 2011 CBD S&T Conference, November 16, 2011. Gruinard Island: 5% formaldehyde. Sverdlovsk release: unknown, but washing, chloramines, and soil disposal are believed to have been used. Disinfectants achieving >6-log reduction on materials (EPA, 2010a,b; Wood et al., 2011). (Presentation slides, LLNL-PRES-508394, Lawrence Livermore National Laboratory.)
Numerical Modeling of Buried Mine Explosions
2001-03-01
Lawrence Livermore Laboratory Report, UCRL-50108, Rev. 1, June 1977. 12. Dobratz, B. M., and P. C. Crawford. “LLNL Explosives Handbook.” Lawrence Livermore National Laboratory Report, UCRL-52997, January 1985. 13. Kerley, G. I. “Multiphase Equation of State for Iron.” Sandia National Laboratories... BOX 202797 AUSTIN TX 78720-2797 1 DARPA B KASPAR 3701 N FAIRFAX DR ARLINGTON VA 22203-1714 1 US MILITARY ACADEMY MATH SCI
Studies in Seismic Verification
1992-05-01
NTS and Shagan River nuclear explosions, Rep. UCRL-102276, Lawrence Livermore Natl. Lab., Livermore, Calif., 1990. Taylor, S. R., and P. D. Marshall... western U.S. earthquakes and implications for the tectonic stress field, Report UCRL-JC-105880, 36 pp., 1990. Randall, M. J., The spectral theory of... Alewine, III Dr. Stephen Bratt DARPA/NMRO Center for Seismic Studies 3701 North Fairfax Drive 1300 North 17th Street Arlington, VA 22203-1714 Suite 1450
Final Report Bald and Golden Eagle Territory Surveys for the Lawrence Livermore National Laboratory
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fratanduono, M. L.
2014-11-25
Garcia and Associates (GANDA) was contracted by the Lawrence Livermore National Laboratory (LLNL) to conduct surveys for bald eagles (Haliaeetus leucocephalus) and golden eagles (Aquila chrysaetos) at Site 300 and in the surrounding area out to 10-miles. The survey effort was intended to document the boundaries of eagle territories by careful observation of eagle behavior from selected viewing locations throughout the study area.
Sending an Instrument to Psyche, the Largest Metal Asteroid in the Solar System
DOE Office of Scientific and Technical Information (OSTI.GOV)
Burks, Morgan
In a few years, an instrument designed and built by Lawrence Livermore National Laboratory researchers will be flying hundreds of millions of miles through space to explore a rare, largely metal asteroid. The Livermore gamma ray spectrometer will be built in collaboration with researchers from the Johns Hopkins Applied Physics Laboratory for the first-ever visit to Psyche, the largest metal asteroid in the solar system.
The effect of Livermore OPAL opacities on the evolutionary masses of RR Lyrae stars
NASA Technical Reports Server (NTRS)
Yi, Sukyoung; Lee, Young-Wook; Demarque, Pierre
1993-01-01
We have investigated the effect of the new Livermore OPAL opacities on the evolution of horizontal-branch (HB) stars. This work was motivated by the recent stellar pulsation calculations using the new Livermore opacities, which suggest that the masses of double-mode RR Lyrae stars are 0.1-0.2 solar mass larger than those based on earlier opacities. Unlike the pulsation calculations, we find that the effect of opacity change on the evolution of HB stars is not significant. In particular, the effect on the mean masses of RR Lyrae stars is very small, showing a decrease of only 0.01-0.02 solar mass compared to the models based on old Cox-Stewart opacities. Consequently, with the new Livermore OPAL opacities, both the stellar pulsation and evolution models now predict approximately the same masses for the RR Lyrae stars. Our evolutionary models suggest that the mean masses of the RR Lyrae stars are about 0.76 and about 0.71 solar mass for M15 (Oosterhoff group II) and M3 (group I), respectively. If [alpha/Fe] = 0.4, these values are decreased by about 0.03 solar mass. Variations of the mean masses of RR Lyrae stars with HB morphology and metallicity are also presented.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vigil,Benny Manuel; Ballance, Robert; Haskell, Karen
Cielo is a massively parallel supercomputer funded by the DOE/NNSA Advanced Simulation and Computing (ASC) program, and operated by the Alliance for Computing at Extreme Scale (ACES), a partnership between Los Alamos National Laboratory (LANL) and Sandia National Laboratories (SNL). The primary Cielo compute platform is physically located at Los Alamos National Laboratory. This Cielo Computational Environment Usage Model documents the capabilities and the environment to be provided for the Q1 FY12 Level 2 Cielo Capability Computing (CCC) Platform Production Readiness Milestone. This document describes specific capabilities, tools, and procedures to support both local and remote users. The model is focused on the needs of the ASC user working in the secure computing environments at Lawrence Livermore National Laboratory (LLNL), Los Alamos National Laboratory, or Sandia National Laboratories, but also addresses the needs of users working in the unclassified environment. The Cielo Computational Environment Usage Model maps the provided capabilities to the tri-Lab ASC Computing Environment (ACE) Version 8.0 requirements. The ACE requirements reflect the high performance computing requirements for the Production Readiness Milestone user environment capabilities of the ASC community. A description of ACE requirements met, and those requirements that are not met, are included in each section of this document. The Cielo Computing Environment, along with the ACE mappings, has been issued and reviewed throughout the tri-Lab community.
Liu, Qi; Fu, Xiao-Long; Yu, Wen; Zhu, Zheng-Fei; Zhang, Ying-Jian
2015-05-01
The aim of the study was to evaluate the predictive value of pretreatment fluorine-18 fluorodeoxyglucose (¹⁸F-FDG) PET/computed tomography (CT) on local control (LC) and survival after radical radiotherapy or chemoradiotherapy in patients with locally advanced esophageal squamous cell carcinoma (SCC), and to discuss its potential value for establishing optimal radiation treatment plans. Fifty-eight patients with pathologically proven esophageal SCC who underwent ¹⁸F-FDG PET/CT before treatment in our center were retrospectively reviewed. We examined the correlation between the PET parameters of primary tumors and LC and overall survival. The coefficient of variation was used to estimate the heterogeneity of ¹⁸F-FDG uptake. The mean duration of follow-up for surviving patients was 38 months, and 36 patients died of tumor recurrence or other diseases. The rates of 3-year overall survival and LC were 40.4 and 50.4%, respectively. Multivariate analysis of LC revealed that a metabolic tumor volume (MTV) greater than 16.08 ml was the only predictor of outcome, with a lower 3-year LC (P=0.017, hazard ratio: 1.608, 95% confidence interval: 1.090-2.371). The coefficients of variation of the primary lesions of patients with larger MTVs were higher than those of patients with smaller MTVs. In this study, MTV assessed by PET/CT might be an adverse factor for predicting LC in esophageal SCC. For those with higher MTVs, higher intratumor heterogeneity suggests that irradiation may need to be boosted in stable high-uptake regions to improve LC. These results need to be prospectively validated in larger cohorts.
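The two PET metrics the study leans on, metabolic tumor volume (MTV) and the coefficient of variation as a heterogeneity index, reduce to simple voxel arithmetic. The SUV values and the 2.5 threshold below are illustrative, not the study's segmentation protocol.

```python
import numpy as np

def mtv_ml(suv, voxel_volume_ml, threshold=2.5):
    """Metabolic tumor volume: total volume of voxels above the SUV threshold."""
    return (suv > threshold).sum() * voxel_volume_ml

def coefficient_of_variation(suv, threshold=2.5):
    """CV = std/mean of the above-threshold voxels (a heterogeneity index)."""
    vals = suv[suv > threshold]
    return vals.std(ddof=1) / vals.mean()

suv = np.array([1.0, 3.0, 5.0, 7.0, 9.0, 2.0])   # synthetic SUV samples
print(mtv_ml(suv, voxel_volume_ml=0.5))           # 4 voxels above 2.5 -> 2.0
print(coefficient_of_variation(suv))
```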
Garrido, Pilar; Sánchez, Marcelo; Belda Sanchis, José; Moreno Mata, Nicolás; Artal, Ángel; Gayete, Ángel; Matilla González, José María; Galbis Caravajal, José Marcelo; Isla, Dolores; Paz-Ares, Luis; Seijo, Luis M
2017-10-01
Lung cancer (LC) is a major public health issue. Despite recent advances in treatment, primary prevention and early diagnosis are key to reducing the incidence and mortality of this disease. A recent clinical trial demonstrated the efficacy of selective screening by low-dose computed tomography (LDCT) in reducing the risk of both lung cancer mortality and all-cause mortality in high-risk individuals. This article contains the reflections of an expert group on the use of LDCT for early diagnosis of LC in high-risk individuals, and how to evaluate its implementation in Spain. The expert group was set up by the Spanish Society of Pulmonology and Thoracic Surgery (SEPAR), the Spanish Society of Thoracic Surgery (SECT), the Spanish Society of Radiology (SERAM) and the Spanish Society of Medical Oncology (SEOM). Copyright © 2017 SEPAR. Published by Elsevier España, S.L.U. All rights reserved.
NASA Technical Reports Server (NTRS)
Wasynczuk, O.; Krause, P. C.; Biess, J. J.; Kapustka, R.
1990-01-01
A detailed computer simulation was used to illustrate the steady-state and dynamic operating characteristics of a 20-kHz resonant spacecraft power system. The simulated system consists of a parallel-connected set of DC-inductor resonant inverters (drivers), a 440-V cable, a node transformer, a 220-V cable, and a transformer-rectifier-filter (TRF) AC-to-DC receiver load. Also included in the system are a 1-kW 0.8-pf RL load and a double-LC filter connected at the receiving end of the 20-kHz AC system. The detailed computer simulation was used to illustrate the normal steady-state operating characteristics and the dynamic system performance following, for example, TRF startup. It is shown that without any filtering the given system exhibits harmonic resonances due to an interaction between the switching of the source and/or load converters and the AC system. However, the double-LC filter at the receiving-end of the AC system and harmonic traps connected in series with each of the drivers significantly reduce the harmonic distortion of the 20-kHz bus voltage. Significant additional improvement in the waveform quality can be achieved by including a double-LC filter with each driver.
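For context on the filter and trap components described above, an ideal LC tank resonates at f = 1/(2π√(LC)). The sketch below uses illustrative component values (not taken from the simulated system) chosen to land near the 20-kHz bus frequency:

```python
import math

def resonant_frequency_hz(inductance_h, capacitance_f):
    """Resonant frequency of an ideal LC circuit: f = 1 / (2*pi*sqrt(L*C))."""
    return 1.0 / (2.0 * math.pi * math.sqrt(inductance_h * capacitance_f))

# Illustrative values only: roughly 63.3 uH with 1 uF resonates close to 20 kHz.
f = resonant_frequency_hz(63.3e-6, 1.0e-6)
assert abs(f - 20e3) / 20e3 < 0.01  # within 1% of the 20-kHz bus frequency
```

A harmonic trap, by contrast, would be tuned to the offending harmonic of the switching frequency rather than the fundamental; the formula is the same.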
The language capacity: architecture and evolution.
Chomsky, Noam
2017-02-01
There is substantial evidence that the human language capacity (LC) is a species-specific biological property, essentially unique to humans, invariant among human groups, and dissociated from other cognitive systems. Each language, an instantiation of LC, consists of a generative procedure that yields a discrete infinity of hierarchically structured expressions with semantic interpretations, hence a kind of "language of thought" (LOT), along with an operation of externalization (EXT) to some sensory-motor system, typically sound. There is mounting evidence that generation of LOT observes language-independent principles of computational efficiency and is based on the simplest computational operations, and that EXT is an ancillary process not entering into the core semantic properties of LOT and is the primary locus of the apparent complexity, diversity, and mutability of language. These conclusions are not surprising, since the internal system is acquired virtually without evidence in fundamental respects, and EXT relates it to sensory-motor systems that are unrelated to it. Even such properties as the linear order of words appear to be reflexes of the sensory motor system, not available to generation of LOT. The limited evidence from the evolutionary record lends support to these conclusions, suggesting that LC emerged with Homo sapiens or not long after, and has not evolved since human groups dispersed.
NASA Technical Reports Server (NTRS)
Wilson, Timmy R.; Beech, Geoffrey; Johnston, Ian
2009-01-01
The NESC Assessment Team reviewed a computer simulation of the LC-39 External Tank (ET) GH2 Vent Umbilical system developed by United Space Alliance (USA) for the Space Shuttle Program (SSP) and designated KSC Analytical Tool ID 451 (KSC AT-451). The team verified that the vent arm kinematics were correctly modeled, but noted that there were relevant system sensitivities. Also, the structural stiffness used in the math model varied somewhat from the analytic calculations. Results of the NESC assessment were communicated to the model developers.
Progress Toward a Multidimensional Representation of the 5.56-mm Interior Ballistics
2009-08-01
were performed as a check of all the major species formed at one atmosphere pressure. Cheetah (17) thermodynamics calculations were performed under...in impermeable boundaries that only yield to gas-dynamic flow after a prescribed pressure load is reached act as rigid bodies within the chamber... Cheetah Code, version 4.0; Lawrence Livermore National Laboratory: Livermore, CA, 2005. 18. Williams, A. W.; Brant, A. L.; Kaste, P. J.; Colburn, J. W
Development of a Laser for Landmine Destruction Final Report CRADA No. TC02126.0
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yamamoto, R.; Sheppard, C.
2017-08-31
This was one of two CRADAs between Lawrence Livermore National Security, LLC, as manager and operator of Lawrence Livermore National Laboratory (LLNL), and First Alliance Technologies, LLC (First Alliance). The goal was to conduct research and development toward an integrated system for detecting, locating, and destroying landmines and unexploded ordnance, using a laser for destruction in combination with First Alliance's Land Mine Locator (LML) system.
Trends in Anti-Nuclear Protests in the United States, 1984-1987
1989-01-01
Obispo, CA. 2 days of peaceful protests at Diablo Canyon nuclear powerplant against licensing of plant. Date: January 12 and 13, 1984 Group: Abalone ...Members of the Abalone Alliance and the Livermore Action Group blocked entrance to Bohemian Grove club, a conservative all-male club to which Reagan...belongs, to protest the club members’ connections to the nuclear weapons industry. Date: July 22, 1984 Group: Abalone and Livermore Action Group
1991-07-16
UCRL -51414-REV1, Lawrence Livermore Laboratory, University of California, CA. - 47 - North, R. G. (1977). Station magnitude bias --- its determination...1976 at and near the nuclear testing ground in eastern Kazakhstan, UCRL -52856, Lawrence Livermore Laboratory, University of California, CA. Ryall, A...VA 24061 Dr. Ralph Alewine, I Dr. Stephen Bratt DARPA/NMRO Center for Seismic Studies 3701 North Fairfax Drive 1300 North 17th Street Arlington, VA
DOE Office of Scientific and Technical Information (OSTI.GOV)
Barbee, T. W.; Schena, D.
This was a collaborative effort between Lawrence Livermore National Security, LLC, as manager and operator of Lawrence Livermore National Laboratory (LLNL), and TroyCap LLC to develop manufacturing steps for commercial production of nano-structured capacitors. The technical objective of this project was to demonstrate deposition rates of selected dielectric materials 2 to 5 times higher than is typical with current technology.
Emergency Response Capability Baseline Needs Assessment - Compliance Assessment
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sharry, John A.
This document was prepared by John A. Sharry, LLNL Fire Marshal and Division Leader for Fire Protection, and was reviewed by LLNL Emergency Management Department Head James Colson. It is the second of a two-part analysis of the emergency response capabilities of Lawrence Livermore National Laboratory. The first part, the 2016 Baseline Needs Assessment Requirements Document, established the minimum performance criteria necessary to meet mandatory requirements. This second part analyzes the performance of the Lawrence Livermore National Laboratory Emergency Management Department against the contents of the Requirements Document. The document was prepared based on an extensive review of information contained in the 2016 BNA; a review of Emergency Planning Hazards Assessments; and a review of building construction, occupancy, fire protection features, dispatch records, LLNL alarm system records, fire department training records, and fire department policies and procedures. The 2013 BNA was approved by NNSA's Livermore Field Office on January 22, 2014.
Science & Technology Review March 2004
DOE Office of Scientific and Technical Information (OSTI.GOV)
McMahon, D H
2004-01-23
This month's issue has the following articles: (1) ''Rethinking Atoms for Peace and the Future of Nuclear Technology'', a commentary by Ronald F. Lehman II; (2) ''Rich Legacy from Atoms for Peace''--In 1953, President Eisenhower encouraged world leaders to pursue peaceful uses of nuclear technology; many of Livermore's contributions in the spirit of this initiative continue to benefit society today. (3) ''Tropopause Height Becomes Another Climate-Change Fingerprint''--Simulations and observational data show that human activities are largely responsible for the steady elevation of the tropopause, the boundary between the troposphere and the stratosphere. (4) ''A Better Method for Certifying the Nuclear Stockpile''--Livermore and Los Alamos are developing a common framework for evaluating the reliability and safety of nuclear weapons. (5) ''Observing How Proteins Loop the Loop''--A new experimental method developed at Livermore allows scientists to monitor the folding processes of proteins, one molecule at a time.
Supercomputers for engineering analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Goudreau, G.L.; Benson, D.J.; Hallquist, J.O.
1986-07-01
The Cray-1 and Cray X-MP/48 experience in engineering computations at the Lawrence Livermore National Laboratory is surveyed. The fully vectorized explicit DYNA and implicit NIKE finite element codes are discussed with respect to solid and structural mechanics. The main efficiencies for production analyses are currently obtained by simple CFT compiler exploitation of pipeline architecture for inner do-loop optimization. Current development of outer-loop multitasking is also discussed. Applications emphasis is on 3D examples spanning earth penetrator loads analysis, target lethality assessment, and crashworthiness. The use of a vectorized large-deformation shell element in both DYNA and NIKE has substantially expanded 3D nonlinear capability. 25 refs., 7 figs.
Science and Technology Review June 2005
DOE Office of Scientific and Technical Information (OSTI.GOV)
Aufderheide, M
2005-05-03
These are the articles in this month's issue: (1) Close Collaborations Advance Progress in Genomic Research--Commentary by Elbert Branscomb; (2) Mining Genomes--Livermore computer programs help locate the stretches of DNA in gene deserts that regulate protein-making genes; (3) Shedding Light on Quantum Physics--Laboratory laser research builds from the foundation of Einstein's description of the quantization of light; (4) The Sharper Image for Surveillance--Speckle imaging, an image-processing technique used in astronomy, is bringing long-distance surveillance into sharper focus; (5) Keeping Cool Close to the Sun--The specially coated gamma-ray spectrometer aboard the MESSENGER spacecraft will help scientists determine the abundance of elements in Mercury's crust.
Science & Technology Review November 2003
DOE Office of Scientific and Technical Information (OSTI.GOV)
McMahon, D
2003-11-01
This issue of Science & Technology Review covers the following topics: (1) We Will Always Need Basic Science--Commentary by Tomas Diaz de la Rubia; (2) When Semiconductors Go Nano--Experiments and computer simulations reveal some surprising behavior of semiconductors at the nanoscale; (3) Retinal Prosthesis Provides Hope for Restoring Sight--A microelectrode array is being developed for a retinal prosthesis; (4) Maglev on the Development Track for Urban Transportation--Inductrack, a Livermore concept to levitate train cars using permanent magnets, will be demonstrated on a 120-meter-long test track; and (5) Power Plant on a Chip Moves Closer to Reality--Laboratory-designed fuel processor gives power boost to dime-size fuel cell.
Shock Driven Instability of a Multi-Phase Particle-Gas System
NASA Astrophysics Data System (ADS)
McFarland, Jacob; Black, Wolfgang; Dahal, Jeevan; Morgan, Brandon
2015-11-01
A computational study of a shock-driven instability of a multiphase particle-gas system is presented. This instability can evolve in a similar fashion to the Richtmyer-Meshkov (RM) instability, but has additional parameters to be considered. Particle relaxation times and density differences of the gas and particle-gas system can be adjusted to produce results that differ from the classical RM instability. We will show simulation results from the Ares code, developed at Lawrence Livermore National Laboratory, which uses a particle-in-cell approach to study the effects of the particle-gas system parameters. Mixing parameters will be presented to highlight the suppression of circulation and gas mixing by the particle phase.
2013 R&D 100 Award: "Miniapps" Bolster High Performance Computing
Belak, Jim; Richards, David
2018-06-12
Two Livermore computer scientists served on a Sandia National Laboratories-led team that developed Mantevo Suite 1.0, the first integrated suite of small software programs, also called "miniapps," to be made available to the high performance computing (HPC) community. These miniapps facilitate the development of new HPC systems and the applications that run on them. Miniapps (miniature applications) serve as stripped-down surrogates for complex, full-scale applications, which can require a great deal of time and effort to port to a new HPC system because they often consist of hundreds of thousands of lines of code. A miniapp is a prototype that contains some or all of the essentials of the real application but with many fewer lines of code, making it more versatile for experimentation. This allows researchers to more rapidly explore options and optimize system design, greatly improving the chances that the full-scale application will perform successfully. Miniapps have become essential tools for exploring complex design spaces because they can reliably predict the performance of full applications.
NASA Astrophysics Data System (ADS)
Hartmann Siantar, Christine L.; Moses, Edward I.
1998-11-01
When using radiation to treat cancer, doctors rely on physics and computer technology to predict where the radiation dose will be deposited in the patient. The accuracy of computerized treatment planning plays a critical role in the ultimate success or failure of the radiation treatment. Inaccurate dose calculations can result in either insufficient radiation for cure, or excessive radiation to nearby healthy tissue, which can reduce the patient's quality of life. This paper describes how advanced physics, computer, and engineering techniques originally developed for nuclear weapons and high-energy physics research are being used to predict radiation dose in cancer patients. Results for radiation therapy planning achieved in the Lawrence Livermore National Laboratory (LLNL) program show that these tools can give doctors new insights into their patients' treatments by providing substantially more accurate dose distributions than have been available in the past. It is believed that greater accuracy in radiation therapy treatment planning will save lives by improving doctors' ability to target radiation to the tumour and reduce suffering by reducing the incidence of radiation-induced complications.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Huang, Wei; Department of Radiation Oncology, Shandong's Key Laboratory of Radiation Oncology, Shandong Cancer Hospital and Institute, Jinan; Currey, Adam
2016-03-15
Purpose: To compare lumpectomy cavity (LC) and planning target volume (PTV) delineated with the use of magnetic resonance imaging (MRI) and computed tomography (CT) and to examine the possibility of replacing CT with MRI for radiation therapy (RT) planning for breast cancer. Methods and Materials: MRI and CT data were acquired for 15 patients with early-stage breast cancer undergoing lumpectomy during RT simulation in the prone position, the same as their RT treatment positions. The LCs were delineated manually on both CT (LC-CT) and MRI acquired with 4 sequences: T1, T2, STIR, and DCE. Various PTVs were created by expanding a 15-mm margin from the corresponding LCs and from the union of the LCs for the 4 MRI sequences (PTV-MRI). Differences were measured in terms of cavity visualization score (CVS) and Dice coefficient (DC). Results: The mean CVSs for T1-, T2-, STIR-, DCE-, and CT-defined LCs were 3.47, 3.47, 3.87, 3.50, and 2.60, respectively, implying that the LC is most visible with a STIR sequence. The mean reductions of LCs from those for CT were 22%, 43%, 36%, and 17% for T1, T2, STIR, and DCE, respectively. In 14 of 15 cases, the MRI (union of T1, T2, STIR, and DCE) defined LC included extra regions that would not be visible from CT. The DCs between CT- and MRI (union of T1, T2, STIR, and DCE)-defined volumes were 0.65 ± 0.20 for LCs and 0.85 ± 0.06 for PTVs. There was no obvious difference between the volumes of PTV-MRI and PTV-CT, and the average PTV-STIR/PTV-CT volume ratio was 0.83 ± 0.23. Conclusions: The use of MRI improves the visibility of the LC in comparison with CT. The volumes of LC and PTV generated based on an MRI sequence are substantially smaller than those based on CT, and the PTV-MRI volumes, defined by the union of T1, T2, STIR, and DCE, were comparable with those of PTV-CT for most of the cases studied.
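The Dice coefficient (DC) used above measures overlap between two delineated volumes as twice their intersection divided by the sum of their sizes. A minimal sketch on hypothetical voxel index sets (illustrative coordinates, not patient data):

```python
def dice_coefficient(a, b):
    """DC = 2*|A ∩ B| / (|A| + |B|); 1.0 means identical volumes, 0.0 means no overlap."""
    a, b = set(a), set(b)
    if not a and not b:
        return 1.0  # convention: two empty volumes are treated as identical
    return 2.0 * len(a & b) / (len(a) + len(b))

# Hypothetical voxel index sets for CT- and MRI-defined cavities:
ct_voxels = {(0, 0, 0), (0, 0, 1), (0, 1, 0), (1, 0, 0)}
mri_voxels = {(0, 0, 0), (0, 0, 1), (0, 1, 0)}
print(dice_coefficient(ct_voxels, mri_voxels))  # 2*3 / (4+3) = 6/7 ≈ 0.857
```

In practice the volumes are binary masks on the imaging grid rather than explicit coordinate sets, but the formula is identical.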
Voorhees, A P; Jan, N-J; Sigal, I A
2017-08-01
It is widely considered that intraocular pressure (IOP)-induced deformation within the neural tissue pores of the lamina cribrosa (LC) contributes to neurodegeneration and glaucoma. Our goal was to study how the LC microstructure and mechanical properties determine the mechanical insult to the neural tissues within the pores of the LC. Polarized light microscopy was used to measure the collagen density and orientation in histology sections of three sheep optic nerve heads (ONH) at both mesoscale (4.4 μm) and microscale (0.73 μm) resolutions. Mesoscale fiber-aware finite element (FE) models were first used to calculate ONH deformations at an IOP of 30 mmHg. The results were then used as boundary conditions for microscale models of LC regions. Models predicted large insult to the LC neural tissues, with 95th percentile first principal strains ranging from 7 to 12%. Pores near the scleral boundary suffered significantly higher stretch compared to pores in more central regions (10.0±1.4% vs. 7.2±0.4%; p=0.014; mean±SD). Variations in material properties altered the minimum, median, and maximum levels of neural tissue insult but largely did not alter the patterns of pore-to-pore variation, suggesting these patterns are determined by the underlying structure and geometry of the LC beams and pores. To the best of our knowledge, this is the first computational model that reproduces the highly heterogeneous neural tissue strain fields observed experimentally. The loss of visual function associated with glaucoma has been attributed to sustained mechanical insult to the neural tissues of the lamina cribrosa due to elevated intraocular pressure. Our study is the first computational model built from specimen-specific tissue microstructure to consider the mechanics of the neural tissues of the lamina separately from the connective tissue. We found that the deformation of the neural tissue was much larger than that predicted by any recent microstructure-aware models of the lamina.
These results are consistent with recent experimental data and the highest deformations were found in the region of the lamina where glaucomatous damage first occurs. This study provides new insight into the complex biomechanical environment within the lamina. Copyright © 2017 Acta Materialia Inc. Published by Elsevier Ltd. All rights reserved.
Campbell, Ian C.; Coudrillier, Baptiste; Mensah, Johanne; Abel, Richard L.; Ethier, C. Ross
2015-01-01
The lamina cribrosa (LC) is a tissue in the posterior eye with a complex trabecular microstructure. This tissue is of great research interest, as it is likely the initial site of retinal ganglion cell axonal damage in glaucoma. Unfortunately, the LC is difficult to access experimentally, and thus imaging techniques in tandem with image processing have emerged as powerful tools to study the microstructure and biomechanics of this tissue. Here, we present a staining approach to enhance the contrast of the microstructure in micro-computed tomography (micro-CT) imaging as well as a comparison between tissues imaged with micro-CT and second harmonic generation (SHG) microscopy. We then apply a modified version of Frangi's vesselness filter to automatically segment the connective tissue beams of the LC and determine the orientation of each beam. This approach successfully segmented the beams of a porcine optic nerve head from micro-CT in three dimensions and SHG microscopy in two dimensions. As an application of this filter, we present finite-element modelling of the posterior eye that suggests that connective tissue volume fraction is the major driving factor of LC biomechanics. We conclude that segmentation with Frangi's filter is a powerful tool for future image-driven studies of LC biomechanics. PMID:25589572
A Damage Mechanics Source Model for Underground Nuclear Explosions.
1991-08-01
California Institute of Technology Reston, VA 22091 Pasadena, CA 91125 Mr. William J. Best Prof. F. A. Dahlen 907 Westwood Drive Geological and Geophysical...ENSCO, Inc. Department of Geological Sciences 445 Pineda Court . , -7’- 9 Meibcurr..e, F 3940 6 William Kikendall Prof. Amos Nur Teledyne Geotech...Teledyne Geotech Lawrence Livermore National Laboratory 3a¢,l Shiloh Road L-205 Garland, TX 75041 P. 0. Box 808 Livermore, CA 94550 Dr. Matthew Sibol
Fiber Based Optical Amplifier for High Energy Laser Pulses Final Report CRADA No. TC02100.0
DOE Office of Scientific and Technical Information (OSTI.GOV)
Messerly, M.; Cunningham, P.
This was a collaborative effort between Lawrence Livermore National Security, LLC (formerly The Regents of the University of California)/Lawrence Livermore National Laboratory (LLNL), and The Boeing Company to develop an optical fiber-based laser amplifier capable of producing and sustaining very high-energy, nanosecond-scale optical pulses. The overall technical objective of this CRADA was to research, design, and develop an optical fiber-based amplifier that would meet specific metrics.
1992-03-01
Propagation of Lg Waves Across Eastern Europe and Asia, Lawrence Livermore National Laboratory Report, LLNL Report No. UCRL -52494. Press, F., and M. Ewing...the Nuclear Testing Ground in Eastern Kazakhstan, Lawrence Livermore National Laboratory Report, LLNL Report No. UCRL -52856. Ruzaikin, A., I. Nersesov...Derring Hall University Park, PA 16802 Blacksburg, VA 24061 Dr. Ralph Alewine, III Dr. Stephen Bratt DARPAftMRO Center for Seismic Studies 3701 North Fairax
The Future Role and Need for Nuclear Weapons in the 21st Century
2007-01-01
program, the Manhattan Project : Einstein‘s letter to Roosevelt in 1939 regarding the use of the energy from uranium for bombs, ―the imaginary German...succeed, nuclear weapons were introduced by the US into our world in 1945. The Manhattan Project efforts produced four bombs within its first three...Proceedings‖ (Livermore, CA: Lawrence Livermore National Laboratory, 1991), 14. 6 Ibid. , 12. 7 ― Manhattan Project ,‖ MSN Encarta, 2, http://encarta
2003 Lawrence Livermore National Laboratory Annual Illness and Injury Surveillance Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
U.S. Department of Energy, Office of Health, Safety and Security, Office of Illness and Injury Prevention Programs
2007-05-23
Annual Illness and Injury Surveillance Program report for 2003 for Lawrence Livermore National Laboratory. The U.S. Department of Energy's (DOE) commitment to assuring the health and safety of its workers includes the conduct of epidemiologic surveillance activities that provide an early warning system for health problems among workers. The Illness and Injury Surveillance Program (IISP) monitors illnesses and health conditions that result in an absence of workdays, occupational injuries and illnesses, and disabilities and deaths among current workers.
Calculating the Vulnerability of Synthetic Polymers to Autoignition during Nuclear Flash.
1985-03-01
Lawrence Livermore National Laboratory P.O. Box 808 2561C Livermore, California 94550 II. CONTROLLING OFFICE NAME AND ADDRESS 12. REPORT DATE~March...34Low Emissivity and Solar Control Coatings on Architectural Glass," Proc. SPIE 37, 324 (1982). 10. R. C. Weast, Ed., Handbook of Chemistry and Physics...Attn: Michael Frankel Chief of Engineers Washington, D.C. 20305 Department of the Army Attn: DAEN-RDZ-A Command and Control Technical Center Washington
Simón-Manso, Yamil; Lowenthal, Mark S; Kilpatrick, Lisa E; Sampson, Maureen L; Telu, Kelly H; Rudnick, Paul A; Mallard, W Gary; Bearden, Daniel W; Schock, Tracey B; Tchekhovskoi, Dmitrii V; Blonder, Niksa; Yan, Xinjian; Liang, Yuxue; Zheng, Yufang; Wallace, William E; Neta, Pedatsur; Phinney, Karen W; Remaley, Alan T; Stein, Stephen E
2013-12-17
Recent progress in metabolomics and the development of increasingly sensitive analytical techniques have renewed interest in global profiling, i.e., semiquantitative monitoring of all chemical constituents of biological fluids. In this work, we have performed global profiling of NIST SRM 1950, "Metabolites in Human Plasma", using GC-MS, LC-MS, and NMR. Metabolome coverage, difficulties, and reproducibility of the experiments on each platform are discussed. A total of 353 metabolites have been identified in this material. GC-MS provides 65 unique identifications, and most of the identifications from NMR overlap with the LC-MS identifications, except for some small sugars that are not directly found by LC-MS. Also, repeatability and intermediate precision analyses show that the SRM 1950 profiling is reproducible enough to consider this material as a good choice to distinguish between analytical and biological variability. Clinical laboratory data shows that most results are within the reference ranges for each assay. In-house computational tools have been developed or modified for MS data processing and interactive web display. All data and programs are freely available online at http://peptide.nist.gov/ and http://srmd.nist.gov/ .
Physics and Advanced Technologies 2003 Annual Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hazi, A; Sketchley, J
2005-01-20
The Physics and Advanced Technologies (PAT) Directorate overcame significant challenges in 2003 to deliver a wealth of scientific and programmatic milestones, and move toward closer alignment with programs at Lawrence Livermore National Laboratory. We acted aggressively in enabling the PAT Directorate to contribute to future, growing Lawrence Livermore missions in homeland security and at the National Ignition Facility (NIF). We made heavy investments to bring new capabilities to the Laboratory, to initiate collaborations with major Laboratory programs, and to align with future Laboratory directions. Consistent with our mission, we sought to ensure that Livermore programs have access to the best science and technology, today and tomorrow. For example, in a move aimed at revitalizing the Laboratory's expertise in nuclear and radiation detection, we brought the talented Measurement Sciences Group to Livermore from Lawrence Berkeley National Laboratory, after its mission there had diminished. The transfer to our I Division entailed significant investment by PAT in equipment and infrastructure required by the group. In addition, the move occurred at a time when homeland security funding was expected, but not yet available. By the end of the year, though, the group was making crucial contributions to the radiation detection program at Livermore, and nearly every member was fully engaged in programmatic activities. Our V Division made a move of a different sort, relocating en masse from Building 121 to the NIF complex. This move was designed to enhance interaction and collaboration among high-energy-density experimental scientists at the Laboratory, a goal that is essential to the effective use of NIF in the future. Since then, V Division has become increasingly integrated with NIF activities. Division scientists are heavily involved in diagnostic development and fielding and are poised to perform equation-of-state and high-temperature hohlraum experiments in 2004 as part of the NIF Early Light program.
Laboratory Directed Research and Development FY2011 Annual Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Craig, W; Sketchley, J; Kotta, P
2012-03-22
A premier applied-science laboratory, Lawrence Livermore National Laboratory (LLNL) has earned the reputation as a leader in providing science and technology solutions to the most pressing national and global security problems. The LDRD Program, established by Congress at all DOE national laboratories in 1991, is LLNL's most important single resource for fostering excellent science and technology for today's needs and tomorrow's challenges. The LDRD internally directed research and development funding at LLNL enables high-risk, potentially high-payoff projects at the forefront of science and technology. The LDRD Program at Livermore serves to: (1) Support the Laboratory's missions, strategic plan, and foundational science; (2) Maintain the Laboratory's science and technology vitality; (3) Promote recruiting and retention; (4) Pursue collaborations; (5) Generate intellectual property; and (6) Strengthen the U.S. economy. Myriad LDRD projects over the years have made important contributions to every facet of the Laboratory's mission and strategic plan, including its commitment to nuclear, global, and energy and environmental security, as well as cutting-edge science and technology and engineering in high-energy-density matter, high-performance computing and simulation, materials and chemistry at the extremes, information systems, measurements and experimental science, and energy manipulation. A summary of each project was submitted by the principal investigator. Project summaries include the scope, motivation, goals, relevance to DOE/NNSA and LLNL mission areas, the technical progress achieved in FY11, and a list of publications that resulted from the research. The projects are: (1) Nuclear Threat Reduction; (2) Biosecurity; (3) High-Performance Computing and Simulation; (4) Intelligence; (5) Cybersecurity; (6) Energy Security; (7) Carbon Capture; (8) Material Properties, Theory, and Design; (9) Radiochemistry; (10) High-Energy-Density Science; (11) Laser Inertial-Fusion Energy; (12) Advanced Laser Optical Systems and Applications; (13) Space Security; (14) Stockpile Stewardship Science; (15) National Security; (16) Alternative Energy; and (17) Climatic Change.
Pairwise alignment of chromatograms using an extended Fisher-Rao metric.
Wallace, W E; Srivastava, A; Telu, K H; Simón-Manso, Y
2014-09-02
A conceptually new approach for aligning chromatograms is introduced and applied to examples of metabolite identification in human blood plasma by liquid chromatography-mass spectrometry (LC-MS). A square-root representation of the chromatogram's derivative coupled with an extended Fisher-Rao metric enables the computation of relative differences between chromatograms. Minimization of these differences using a common dynamic programming algorithm brings the chromatograms into alignment. Application to a complex sample, National Institute of Standards and Technology (NIST) Standard Reference Material 1950, Metabolites in Human Plasma, analyzed by two different LC-MS methods having significantly different ranges of elution time is described. Published by Elsevier B.V.
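The square-root representation mentioned above maps a chromatogram f to q(t) = sign(f′(t))·√|f′(t)|, and alignment then minimizes a distance between such curves by dynamic programming. The sketch below is a simplified discrete illustration of this idea (a finite-difference square-root slope transform plus a generic DTW-style cost), not the paper's exact Fisher-Rao implementation:

```python
import math

def srsf(signal, dt=1.0):
    """Square-root slope function: q = sign(f') * sqrt(|f'|), via finite differences."""
    deriv = [(b - a) / dt for a, b in zip(signal, signal[1:])]
    return [math.copysign(math.sqrt(abs(d)), d) for d in deriv]

def alignment_cost(q1, q2):
    """Dynamic-programming (DTW-style) squared-difference cost between two SRSF curves."""
    n, m = len(q1), len(q2)
    inf = float("inf")
    cost = [[inf] * (m + 1) for _ in range(n + 1)]
    cost[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = (q1[i - 1] - q2[j - 1]) ** 2
            cost[i][j] = d + min(cost[i - 1][j], cost[i][j - 1], cost[i - 1][j - 1])
    return cost[n][m]

# A chromatogram aligned against a time-shifted copy of itself should cost less
# than alignment against an unrelated (flat) trace:
peak = [0, 1, 4, 1, 0, 0, 0]
shifted = [0, 0, 1, 4, 1, 0, 0]
flat = [0, 0, 0, 0, 0, 0, 0]
assert alignment_cost(srsf(peak), srsf(shifted)) < alignment_cost(srsf(peak), srsf(flat))
```

The actual Fisher-Rao approach optimizes over smooth time warpings rather than discrete DTW paths, which makes the resulting distance a proper metric; this sketch only conveys the structure of the computation.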
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lowdermilk, W. H.; Brothers, L. J.
This was a collaborative effort by Lawrence Livermore National Security (formerly the University of California)/Lawrence Livermore National Laboratory (LLNL), Valley Forge Composite Technologies, Inc., and the following Russian institutes: P. N. Lebedev Physical Institute (LPI), Innovative Technologies Center (AUO CIT), Central Design Bureau Almaz (CDB Almaz), Moscow Instrument Automation Research Institute, and the Institute for High Energy Physics (IHEP), to develop equipment and procedures for detecting explosive materials concealed in airline checked baggage and cargo.
Manufacturing and Characterization of Ultra Pure Ferrous Alloys Final Report CRADA No. TC02069.0
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lesuer, D.; McGreevy, T. E.
This CRADA was a collaborative effort between Lawrence Livermore National Security, LLC (formerly University of California)/Lawrence Livermore National Laboratory (LLNL), and Caterpillar Inc. (Caterpillar), to further advance levitation casting techniques (developed at the Central Research Institute for Material (CRIM) in St. Petersburg, Russia) for use in manufacturing high-purity metal alloys. This DOE Global Initiatives for Proliferation Prevention Program (IPP) project was to develop and demonstrate the levitation casting technology for producing ultra-pure alloys.
1991-12-04
1992-08-17
DOE Office of Scientific and Technical Information (OSTI.GOV)
Paterson, Lisa E.; Woollett, Jim S.
2014-01-01
The Lawrence Livermore National Laboratory’s (LLNL’s) Environmental Restoration Department (ERD) is required to conduct an ecological review at least every five years to ensure that biological and contaminant conditions in areas undergoing remediation have not changed such that existing conditions pose an ecological hazard (Dibley et al. 2009a). This biological review is being prepared by the Natural Resources Team within LLNL’s Environmental Functional Area (EFA) to support the 2013 five-year ecological review.
PHYSICS: Will Livermore Laser Ever Burn Brightly?
Seife, C; Malakoff, D
2000-08-18
The National Ignition Facility (NIF), a superlaser being built here at Lawrence Livermore National Laboratory in an effort to use lasers rather than nuclear explosions to create a fusion reaction, is supposed to allow weapons makers to preserve the nuclear arsenal--and do nifty fusion science, too. But a new report that examines its troubled past also casts doubt on its future. Even some of NIF's scientific and political allies are beginning to talk openly of a scaled-down version of the original 192-laser design.
The Use of Carbon Aerogel Electrodes for Deionizing Water and Treating Aqueous Process Wastes
1996-01-01
Farmer, Joseph C.; Mack, Gregory V.; Fix, David V. (Lawrence Livermore National Laboratory, Livermore, California 94550)
1985-04-01
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kupper, L.L.; Setzer, R.W.; Schwartzbaum, J.
1987-07-01
This document reports on a reevaluation of data obtained in a previous report on occupational factors associated with the development of malignant melanomas at Lawrence Livermore National Laboratory. The current report reduces the number of these factors from five to three based on a rigorous statistical analysis of the original data. Recommendations include restructuring the original questionnaire and trying to contact more individuals who worked with volatile photographic chemicals. 17 refs., 7 figs., 22 tabs. (TEM)
Antisubmarine Warfare (ASW) Lexicon
1990-01-01
Sample lexicon entries: CRT, Cathode Ray Tube; COMNAVSURFLANT, Commander, Naval Surface Force, U.S. Atlantic Fleet; CS, Combat System; Computer Subsystem; LSD, Large Screen Display; LC, Launch Control; LSI, Low Ship Impact.
Diffraction Plates for Classroom Demonstrations
ERIC Educational Resources Information Center
Hoover, Richard B.
1969-01-01
Describes the computer generation of random and regular arrays of apertures on photographic film and their applications for classroom demonstrations of the Fraunhofer patterns produced by simple and complex apertures, Babinet's principle, resolution according to the Rayleigh criterion, and many other aspects of diffraction. (LC)
Probabilistic neural networks modeling of the 48-h LC50 acute toxicity endpoint to Daphnia magna.
Niculescu, S P; Lewis, M A; Tigner, J
2008-01-01
Two modeling experiments based on the maximum likelihood estimation paradigm and targeting prediction of the Daphnia magna 48-h LC50 acute toxicity endpoint for both organic and inorganic compounds are reported. The resulting models' computational algorithms are implemented as basic probabilistic neural networks with Gaussian kernel (statistical corrections included). The first experiment uses strictly D. magna information for 971 structures as training/learning data and the resulting model targets practical applications. The second experiment uses the same training/learning information plus additional data on another 29 compounds whose endpoint information originates from D. pulex and Ceriodaphnia dubia. It only targets investigation of the effect of mixing strictly D. magna 48-h LC50 modeling information with small amounts of similar information estimated from related species, and this is done as part of the validation process. A complementary 81-compound dataset (involving only strictly D. magna information) is used to perform external testing. On this external test set, the Gaussian character of the distribution of the residuals is confirmed for both models. This allows the use of traditional statistical methodology to implement computation of confidence intervals for the unknown measured values based on the models' predictions. Examples are provided for the model targeting practical applications. For the same model, a comparison with other existing models targeting the same endpoint is performed.
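A basic probabilistic neural network with Gaussian kernel, of the general kind described above, can be sketched as a kernel-weighted average over training endpoints. The names, the single toy descriptor, and the bandwidth `sigma` are hypothetical; the published models include statistical corrections not shown here.

```python
import numpy as np

def pnn_predict(X_train, y_train, X_new, sigma=0.5):
    """Gaussian-kernel probabilistic network for regression: each training
    structure contributes a Gaussian weight based on descriptor distance,
    and the prediction is the normalized weighted average of endpoints."""
    preds = []
    for x in np.atleast_2d(X_new):
        d2 = np.sum((X_train - x) ** 2, axis=1)      # squared distances
        w = np.exp(-d2 / (2.0 * sigma ** 2))          # Gaussian kernel weights
        preds.append(np.dot(w, y_train) / np.sum(w))  # weighted mean endpoint
    return np.array(preds)

# Toy one-dimensional descriptors and log LC50 endpoints (invented values).
X = np.array([[0.0], [1.0], [2.0], [3.0]])
y = np.array([1.0, 0.5, 0.0, -0.5])
est = pnn_predict(X, y, np.array([[1.5]]))[0]
```

By symmetry of the toy data around the query point, the estimate falls midway between the two central endpoints.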
Secular change and inter-annual variability of the Gulf Stream position, 1993-2013, 70°-55°W
NASA Astrophysics Data System (ADS)
Bisagni, James J.; Gangopadhyay, Avijit; Sanchez-Franks, Alejandra
2017-07-01
The Gulf Stream (GS) is the northeastward-flowing surface limb of the Atlantic Ocean's meridional overturning circulation (AMOC) "conveyor belt" that flows towards Europe and the Nordic Seas. Changes in the GS position after its separation from the coast at Cape Hatteras, i.e., from 75°W to 50°W, may be key to understanding the AMOC, sea level variability and ecosystem behavior along the east coast of North America. In this study we compare secular change and inter-annual variability (IAV) of the Gulf Stream North Wall (GSNW) position with equator-ward Labrador Current (LC) transport along the southwestern Grand Banks near 52°W using 21 years (1993-2013) of satellite altimeter data. Results at 55°, 60°, and 65°W show a significant southward (negative) secular trend for the GSNW, decreasing to a small but insignificant southward trend at 70°W. IAV of de-trended GSNW position residuals also decreases to the west. The long-term secular trend of annual mean upper layer (200 m) LC transport near 52°W is positive. Furthermore, IAV of LC transport residuals near 52°W along the southwestern Grand Banks is significantly correlated with GSNW position residuals at 55°W at a lag of +1-year, with positive (negative) LC transport residuals corresponding to southward (northward) GSNW positions one year later. The Taylor-Stephens index (TSI) computed from the first principal component of the GSNW position from 79° to 65°W shows a similar relationship with a more distal LC index computed along altimeter ground track 250 located north of the Grand Banks across Hamilton Bank in the western Labrador Sea. Increased (decreased) sea height differences along ground track 250 are significantly correlated with a more southward (northward) TSI two years later (lag of +2-years).
Spectral analysis of IAV reveals corresponding spectral peaks at 5-7 years and 2-3 years for the North Atlantic Oscillation (NAO), GSNW (70°-55°W) and LC transport near 52°W for the 1993-2013 period suggesting a connection between these phenomena. An upper-layer (200 m) slope water volume calculation using the LC IAV rms residual of +1.04 Sv near 52°W results in an estimated GSNW IAV residual of 79 km, or 63% of the observed 125.6 km (1.13°) rms value at 55°W. A similar upper-layer slope water volume calculation using the positive long-term, upper-layer LC transport trend accounts for 68% of the mean observed secular southward shift of the GSNW between 55° and 70°W over the 1993-2013 period. Our work provides additional observational evidence of important interactions between the upper layers of the sub-polar and sub-tropical gyres within the North Atlantic over both secular and inter-annual time scales as suggested by previous studies.
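The lagged relationships reported above (e.g., LC transport residuals leading GSNW position residuals by one year) reduce to Pearson correlations of shifted annual series. A minimal sketch with synthetic residuals, assuming the sign convention that southward GSNW positions are negative; the data here are random stand-ins, not the altimeter record:

```python
import numpy as np

def lagged_corr(x, y, lag):
    """Pearson correlation of x(t) with y(t + lag) for annual residual series."""
    if lag > 0:
        x, y = x[:-lag], y[lag:]
    elif lag < 0:
        x, y = x[-lag:], y[:lag]
    return np.corrcoef(x, y)[0, 1]

rng = np.random.default_rng(0)
lc = rng.standard_normal(21)      # 21 years of synthetic LC transport residuals
gsnw = -np.roll(lc, 1)            # southward (negative) GSNW response one year later
gsnw[0] = rng.standard_normal()   # first year has no predecessor
r = lagged_corr(lc, gsnw, 1)      # correlate LC(t) with GSNW(t + 1)
```

With this construction the lag +1 correlation is strongly negative, matching the sign convention in the abstract (positive LC transport, southward GSNW a year later).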
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chrzanowski, P; Walter, K
Lawrence Livermore National Laboratory's many outstanding accomplishments in 2007 are a tribute to a dedicated staff, which is shaping the Laboratory's future as we go through a period of transition and transformation. The achievements highlighted in this annual report illustrate our focus on the important problems that affect our nation's security and global stability, our application of breakthrough science and technology to tackle those problems, and our commitment to safe, secure, and efficient operations. In May 2007, the Department of Energy (DOE) awarded Lawrence Livermore National Security, LLC (LLNS), a new public-private partnership, the contract to manage and operate the Laboratory starting in October. Since its inception in 1952, the Laboratory had been managed by the University of California (UC) for the DOE's National Nuclear Security Administration (NNSA) and predecessor organizations. UC is one of the parent organizations that make up LLNS, and UC's presence in the new management entity will help us carry forward our strong tradition of multidisciplinary science and technology. 'Team science' applied to big problems was pioneered by the Laboratory's co-founder and namesake, Ernest O. Lawrence, and has been our hallmark ever since. Transition began fully a year before DOE's announcement. More than 1,600 activities had to be carried out to transition the Laboratory from management by a not-for-profit to a private entity. People, property, and procedures as well as contracts, formal agreements, and liabilities had to be transferred to LLNS. The pre-transition and transition teams did a superb job, and I thank them for their hard work. Transformation is an ongoing process at Livermore. We continually reinvent ourselves as we seek breakthroughs that impact emerging national needs.
An example is our development in the late 1990s of a portable instrument that could rapidly detect DNA signatures, research that started with a view toward the potential threat of terrorist use of biological weapons. As featured in our annual report, activities in this area have grown to many important projects contributing to homeland security and disease prevention and control. At times transformation happens in large steps. Such was the case when nuclear testing stopped in the early 1990s. As one of the nation's nuclear weapon design laboratories, Livermore embarked on the Stockpile Stewardship Program. The objectives are to ensure the safety, security, and reliability of the nation's nuclear weapons stockpile and to develop a science-based, thorough understanding of the performance of nuclear weapons. The ultimate goal is to sustain confidence in an aging stockpile without nuclear testing. Now is another time of major change for the Laboratory as the nation is resizing its nuclear deterrent and NNSA begins taking steps to transform the nuclear weapons complex to meet 21st-century national security needs. As you will notice in the opening commentary to each section of this report, the Laboratory's senior management team is a mixture of new and familiar faces. LLNS drew the best talent from its parent organizations--Bechtel National, UC, Babcock & Wilcox, the Washington Group Division of URS, and Battelle--to lead the Laboratory. We are honored to take on the responsibility and see a future with great opportunities for Livermore to apply its exceptional science and technology to important national problems. We will work with NNSA to build on the successful Stockpile Stewardship Program and transform the nation's nuclear weapons complex to become smaller, safer, more secure, and more cost effective. Our annual report highlights progress in many relevant areas. 
Laboratory scientists are using astonishing computational capabilities--including BlueGene/L, the world's fastest supercomputer with a revolutionary architecture and over 200,000 processors--to gain key insights about performance of aging nuclear weapons. What we learn will help us sustain the stockpile without nuclear testing. Preparations are underway to start experiments at the National Ignition Facility (NIF), the world's largest laser. They will help us resolve the most important questions we still have about nuclear weapons performance. Future NIF experiments will also explore the promise of an essentially inexhaustible source of clean energy from nuclear fusion. In addition, we have begun the process of eliminating significant quantities of special nuclear materials from the Livermore site. We will carry forward Livermore's tradition of exceptional science and technology. This is the S&T that led to the design and construction of NIF and leadership in an international consortium that is developing the Gemini Planet Imager. When the Imager comes on line in 2010 at an observatory in Chile, it will bring into sharp focus planets that are 30 to 150 light years from our solar system.
The Effect of Large Scale Salinity Gradient on Langmuir Turbulence
NASA Astrophysics Data System (ADS)
Fan, Y.; Jarosz, E.; Yu, Z.; Jensen, T.; Sullivan, P. P.; Liang, J.
2017-12-01
Langmuir circulation (LC) is believed to be one of the leading-order causes of turbulent mixing in the upper ocean. It is important for momentum and heat exchange across the mixed layer (ML) and directly impacts the dynamics and thermodynamics in the upper ocean and lower atmosphere, including the vertical distributions of chemical, biological, optical, and acoustic properties. Based on Craik and Leibovich (1976) theory, large eddy simulation (LES) models have been developed to simulate LC in the upper ocean, yielding new insights that could not be obtained from field observations and turbulent closure models. Due to its high computational cost, LES models are usually limited to small domain sizes and cannot resolve large-scale flows. Furthermore, most LES models used in LC simulations use periodic boundary conditions in the horizontal direction, which assumes the physical properties (i.e. temperature and salinity) and expected flow patterns in the area of interest are of a periodically repeating nature so that the limited small LES domain is representative of the larger area. Using periodic boundary conditions can significantly reduce computational effort, and they are a good assumption for isotropic shear turbulence. However, LC is anisotropic (McWilliams et al 1997) and was observed to be modulated by crosswind tidal currents (Kukulka et al 2011). Using symmetrical domains, idealized LES studies also indicate LC could interact with oceanic fronts (Hamlington et al 2014) and standing internal waves (Chini and Leibovich, 2005). The present study expands our previous LES modeling investigations of Langmuir turbulence to real ocean conditions with large scale environmental motion that features fresh water inflow into the study region. Large scale gradient forcing is introduced to the NCAR LES model through scale separation analysis.
The model is applied to a field observation in the Gulf of Mexico in July, 2016 when the measurement site was impacted by large fresh water inflow due to flooding from the Mississippi river. Model results indicate that the strong salinity gradient can reduce the mean flow in the ML and inhibit the turbulence in the planetary boundary layer. The Langmuir cells are also rotated clockwise by the pressure gradient.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Woollett, J J
2008-09-18
The purpose of this report is to present the results of live-trapping and visual surveys for special status reptiles at the Site 300 Facilities of Lawrence Livermore National Laboratory (LLNL). The survey was conducted under the authority of the Federal recovery permit of Swaim Biological Consulting (PRT-815537) and a Memorandum of Understanding issued from the California Department of Fish and Game. Site 300 is located between Livermore and Tracy just north of Tesla Road (Alameda County) and Corral Hollow Road (San Joaquin County) and straddles the Alameda and San Joaquin County line (Figures 1 and 2). It encompasses portions of the USGS 7.5 minute Midway and Tracy quadrangles (Figure 2). Focused surveys were conducted for four special status reptiles including the Alameda whipsnake (Masticophis lateralis euryxanthus), the San Joaquin whipsnake (Masticophis flagellum ruddocki), the silvery legless lizard (Anniella pulchra pulchra), and the California horned lizard (Phrynosoma coronatum frontale).
Multi-pulse power injection and spheromak sustainment in SSPX
NASA Astrophysics Data System (ADS)
Stallard, B. W.; Hill, D. N.; Hooper, E. B.; Bulmer, R. H.; McLean, H. S.; Wood, R. D.; Woodruff, S.; Sspx Team
2000-10-01
Lawrence Livermore National Laboratory, Livermore, CA 94550, USA. Spheromak formation (gun injection phase) and sustainment experiments are now routine in SSPX using a multi-bank power system. Gun voltage, impedance, and power coupling show a clear current threshold dependence on gun flux (I_th~=λ_0φ_gun/μ_0), increasing with current above the threshold, and are compared with CTX results. The characteristic gun inductance, L_gun~=0.6 μH, derived from the gun voltage dependence on di/dt, is larger than expected from Corsica modeling of the spheromak equilibrium. Its value is consistent with the n=1 ‘doughook’ mode structure reported in SPHEX and believed important for helicity injection and toroidal current drive. Results of helicity and power balance calculations of spheromak poloidal field buildup are compared with experiment and used to project sustainment with a future longer pulse power supply. This work was performed under the auspices of US DOE by the University of California Lawrence Livermore National Laboratory under Contract No. W-7405-ENG-48.
Lawrence Livermore National Laboratory environmental report for 1990
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sims, J.M.; Surano, K.A.; Lamson, K.C.
1990-01-01
This report documents the results of the Environmental Monitoring Program at the Lawrence Livermore National Laboratory (LLNL) and presents summary information about environmental compliance for 1990. To evaluate the effect of LLNL operations on the local environment, measurements of direct radiation and a variety of radionuclides and chemical compounds in ambient air, soil, sewage effluent, surface water, groundwater, vegetation, and foodstuff were made at both the Livermore site and at Site 300 nearby. LLNL's compliance with all applicable guides, standards, and limits for radiological and nonradiological emissions to the environment was evaluated. Aside from an August 13 observation of silver concentrations slightly above guidelines for discharges to the sanitary sewer, all the monitoring data demonstrated LLNL compliance with environmental laws and regulations governing emission and discharge of materials to the environment. In addition, the monitoring data demonstrated that the environmental impacts of LLNL are minimal and pose no threat to the public or to the environment. 114 refs., 46 figs., 79 tabs.
GlycReSoft: A Software Package for Automated Recognition of Glycans from LC/MS Data
DOE Office of Scientific and Technical Information (OSTI.GOV)
Maxwell, Evan; Tan, Yan; Tan, Yuxiang
2012-09-26
Glycosylation modifies the physicochemical properties and protein binding functions of glycoconjugates. These modifications are biosynthesized in the endoplasmic reticulum and Golgi apparatus by a series of enzymatic transformations that are under complex control. As a result, mature glycans on a given site are heterogeneous mixtures of glycoforms. This gives rise to a spectrum of adhesive properties that strongly influences interactions with binding partners and resultant biological effects. In order to understand the roles glycosylation plays in normal and disease processes, efficient structural analysis tools are necessary. In the field of glycomics, liquid chromatography/mass spectrometry (LC/MS) is used to profile the glycans present in a given sample. This technology enables comparison of glycan compositions and abundances among different biological samples, i.e. normal versus disease, normal versus mutant, etc. Manual analysis of the glycan profiling LC/MS data is extremely time-consuming and efficient software tools are needed to eliminate this bottleneck. In this work, we have developed a tool to computationally model LC/MS data to enable efficient profiling of glycans. Using LC/MS data deconvoluted by Decon2LS/DeconTools, we built a list of unique neutral masses corresponding to candidate glycan compositions summarized over their various charge states, adducts and range of elution times. Our work aims to provide confident identification of true compounds in complex data sets that are not amenable to manual interpretation. This capability is an essential part of glycomics work flows. We demonstrate this tool, GlycReSoft, using an LC/MS dataset on tissue derived heparan sulfate oligosaccharides. The software, code and a test data set are publicly archived under an open source license.
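The step of building a list of unique neutral masses summarized over charge states, adducts, and elution times amounts to grouping deconvoluted observations within a mass tolerance. A simplified sketch of that grouping step; the tolerance value, field names, and example masses are invented for illustration and GlycReSoft's actual scoring is richer:

```python
def group_neutral_masses(observations, tol=0.02):
    """Collapse deconvoluted (neutral_mass, abundance) observations seen at
    different charge states or elution times into unique neutral-mass
    candidates. Simple tolerance-based grouping; tol (in Da) is an assumption."""
    groups = []
    for mass, abundance in sorted(observations):
        if groups and mass - groups[-1]["masses"][-1] <= tol:
            groups[-1]["masses"].append(mass)       # same candidate composition
            groups[-1]["abundance"] += abundance    # sum over charge states
        else:
            groups.append({"masses": [mass], "abundance": abundance})
    # Report the mean mass and total abundance for each candidate.
    return [(sum(g["masses"]) / len(g["masses"]), g["abundance"]) for g in groups]

obs = [(1216.423, 100.0), (1216.431, 40.0),  # one glycan seen at two charge states
       (1378.476, 55.0)]                      # a second candidate composition
peaks = group_neutral_masses(obs)
```

The two near-identical masses merge into one candidate with summed abundance, while the distant mass stays separate.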
ERIC Educational Resources Information Center
Lindsay, Robert E.
1970-01-01
Describes a novel instructional method for physics involving the use of a computer assisted instruction system equipped with cathode-ray-tube terminals, light pen, and keyboard input. Discusses exercises with regard to content, mediation, scoring and control. Several examples of exercises are given along with results from student evaluation. (LC)
High-Resolution Large-Field-of-View Three-Dimensional Hologram Display System and Method Thereof
NASA Technical Reports Server (NTRS)
Chao, Tien-Hsin (Inventor); Mintz, Frederick W. (Inventor); Tsou, Peter (Inventor); Bryant, Nevin A. (Inventor)
2001-01-01
A real-time, dynamic, free space-virtual reality, 3-D image display system is enabled by using a unique form of Aerogel as the primary display media. A preferred embodiment of this system comprises a 3-D mosaic topographic map which is displayed by fusing four projected hologram images. In this embodiment, four holographic images are projected from four separate holograms. Each holographic image subtends a quadrant of the 4π solid angle. By fusing these four holographic images, a static 3-D image such as a featured terrain map would be visible for 360 deg in the horizontal plane and 180 deg in the vertical plane. An input, either acquired by 3-D image sensor or generated by computer animation, is first converted into a 2-D computer generated hologram (CGH). This CGH is then downloaded into a large liquid crystal (LC) panel. A laser projector illuminates the CGH-filled LC panel and generates and displays a real 3-D image in the Aerogel matrix.
Saiki, M.K.; Monda, D.P.; Bellerud, B.L.
1999-01-01
Resource managers hypothesize that occasional fish kills during summer-early fall in Upper Klamath Lake, Oregon, may be linked to unfavorable water quality conditions created by massive algal blooms. In a preliminary effort to address this concern, short-term (96-h-long) laboratory tests were conducted with larval and juvenile Lost River (Deltistes luxatus) and shortnose (Chasmistes brevirostris) suckers to determine the upper median lethal concentrations (LC50s; also referred to as median tolerance limits) for pH, un-ionized ammonia, and water temperature, and the lower LC50s for dissolved oxygen. The mean LC50s varied among species and life stages as follows: for pH, 10.30-10.39; for un-ionized ammonia, 0.48-1.06 mg litre-1; for temperature, 30.35-31.82 °C; and for dissolved oxygen, 1.34-2.10 mg litre-1. Comparisons of 95% confidence limits indicated that, on average, the 96-h LC50s were not significantly different from those computed for shorter exposure times (i.e., 24 h, 48 h, and 72 h). According to two-way analysis of variance, LC50s for the four water quality variables did not vary significantly (p > 0.05) between fish species. However, LC50s for pH (exposure times of 24 h and 48 h) and dissolved oxygen (exposure times of 48 h, 72 h, and 96 h) differed significantly (p ≤ 0.05) between life stages, whereas LC50s for un-ionized ammonia and water temperature did not exhibit significant differences. In general, larvae were more sensitive than juveniles to high pH and low dissolved oxygen concentrations. When compared to ambient water quality conditions in Upper Klamath Lake, our results strongly suggest that near-anoxic conditions associated with the senescence phase of algal blooms are most likely to cause high mortalities of larval and juvenile suckers.
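An LC50 of the kind reported above is the concentration at which 50% mortality is expected over the exposure period. One simple estimator, shown only as an illustration (the data are hypothetical and the authors' statistical method is not specified in this abstract), is log-linear interpolation between the two test concentrations that bracket 50% mortality:

```python
import math

def lc50_interpolate(concs, mortality):
    """Estimate the LC50 by log-linear interpolation between the two test
    concentrations whose observed mortality fractions bracket 0.5."""
    pairs = list(zip(concs, mortality))
    for (c1, m1), (c2, m2) in zip(pairs, pairs[1:]):
        if m1 <= 0.5 <= m2:
            frac = (0.5 - m1) / (m2 - m1)
            log_lc50 = math.log10(c1) + frac * (math.log10(c2) - math.log10(c1))
            return 10 ** log_lc50
    raise ValueError("50% mortality not bracketed by the tested concentrations")

# Hypothetical un-ionized ammonia series (mg/L) and 96-h mortality fractions.
concs = [0.25, 0.5, 1.0, 2.0]
mort = [0.05, 0.30, 0.70, 0.95]
lc50 = lc50_interpolate(concs, mort)
```

For these invented data the bracketing pair is 0.5 and 1.0 mg/L, giving an LC50 of roughly 0.71 mg/L, within the 0.48-1.06 mg/L range the abstract reports.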
Warp-X: A new exascale computing platform for beam–plasma simulations
Vay, J. -L.; Almgren, A.; Bell, J.; ...
2018-01-31
Turning the current experimental plasma accelerator state-of-the-art from a promising technology into mainstream scientific tools depends critically on high-performance, high-fidelity modeling of complex processes that develop over a wide range of space and time scales. As part of the U.S. Department of Energy's Exascale Computing Project, a team from Lawrence Berkeley National Laboratory, in collaboration with teams from SLAC National Accelerator Laboratory and Lawrence Livermore National Laboratory, is developing a new plasma accelerator simulation tool that will harness the power of future exascale supercomputers for high-performance modeling of plasma accelerators. We present the various components of the codes such as the new Particle-In-Cell Scalable Application Resource (PICSAR) and the redesigned adaptive mesh refinement library AMReX, which are combined with redesigned elements of the Warp code, in the new WarpX software. Lastly, the code structure, status, early examples of applications and plans are discussed.
FBIS report. Science and technology: Europe/International, March 29, 1996
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
1996-03-29
Partial Contents: Advanced Materials (EU Project to Improve Production in Metal Matrix Compounds Noted, Germany: Extremely Hard Carbon Coating Development, Italy: Director of CNR Metallic Materials Institute Interviewed); Aerospace (ESA Considers Delays, Reductions as Result of Budget Cuts, Italy: Space Agency's Director on Restructuring, Future Plans); Automotive, Transportation (EU: Clean Diesel Engine Technology Research Reviewed); Biotechnology (Germany's Problems, Successes in Biotechnology Discussed); Computers (EU Europort Parallel Computing Project Concluded, Italy: PQE 2000 Project on Massively Parallel Systems Viewed); Defense R&D (France: Future Tasks of 'Brevel' Military Intelligence Drone Noted); Energy, Environment (German Scientist Tests Elimination of Phosphates); Advanced Manufacturing (France: Advanced Rapid Prototyping System Presented); Lasers, Sensors, Optics (France: Strategy of Cilas Laser Company Detailed); Microelectronics (France: Simulation Company to Develop Microelectronic Manufacturing Application); Nuclear R&D (France: Megajoule Laser Plan, Cooperation with Livermore Lab Noted); S&T Policy (EU Efforts to Aid Small Companies' Research Viewed); Telecommunications (France Telecom's Way to Internet).
CLARET user's manual: Mainframe Logs. Revision 1
DOE Office of Scientific and Technical Information (OSTI.GOV)
Frobose, R.H.
1984-11-12
CLARET (Computer Logging and RETrieval) is a stand-alone PDP 11/23 system that can support 16 terminals. It provides a forms-oriented front end by which operators enter online activity logs for the Lawrence Livermore National Laboratory's OCTOPUS computer network. The logs are stored on the PDP 11/23 disks for later retrieval, and hardcopy reports are generated both automatically and upon request. Online viewing of the current logs is provided to management. As each day's logs are completed, the information is automatically sent to a CRAY and included in an online database system. The terminal used for the CLARET system is a dual-port Hewlett Packard 2626 terminal that can be used as either the CLARET logging station or as an independent OCTOPUS terminal. Because this is a stand-alone system, it does not depend on the availability of the OCTOPUS network to run and, in the event of a power failure, can be brought up independently.
A Computational Model for Thrombus Formation in Response to Cardiovascular Implantable Devices
NASA Astrophysics Data System (ADS)
Horn, John; Ortega, Jason; Maitland, Duncan
2014-11-01
Cardiovascular implantable devices elicit complex physiological responses within blood. Notably, alterations in blood flow dynamics and interactions between blood proteins and biomaterial surface chemistry may lead to the formation of thrombus. For some devices, such as stents and heart valves, this is an adverse outcome. For other devices, such as embolic aneurysm treatments, efficient blood clot formation is desired. Thus a method to study how biomedical devices induce thrombosis is paramount to device development and optimization. A multiscale, multiphysics computational model is developed to predict thrombus formation within the vasculature. The model consists of a set of convection-diffusion-reaction partial differential equations for blood protein constituents involved in the progression of the clotting cascades. This model is used to study thrombus production from endovascular devices with the goal of optimizing the device design to generate the desired clotting response. This work was performed in part under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344.
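The model's core, a set of convection-diffusion-reaction partial differential equations for blood protein constituents, can be illustrated with a one-dimensional explicit finite-difference step for a single species. The upwind scheme, parameter values, and first-order decay term are illustrative assumptions, not the authors' full multiscale, multiphysics formulation:

```python
import numpy as np

def step_cdr(c, u, D, k, dx, dt):
    """One explicit step of dc/dt = -u dc/dx + D d2c/dx2 - k*c on a periodic
    1-D grid (first-order upwind convection, assuming u > 0)."""
    conv = -u * (c - np.roll(c, 1)) / dx                          # upwind advection
    diff = D * (np.roll(c, -1) - 2.0 * c + np.roll(c, 1)) / dx ** 2  # diffusion
    return c + dt * (conv + diff - k * c)                         # k*c: consumption

nx, dx, dt = 100, 0.01, 1e-4
c = np.zeros(nx)
c[40:60] = 1.0                      # initial slug of a clotting protein
for _ in range(100):                # 100 small, stability-respecting time steps
    c = step_cdr(c, u=0.5, D=1e-3, k=5.0, dx=dx, dt=dt)
```

On a periodic grid, convection and diffusion conserve total mass while the reaction term removes it at rate k, which gives a simple sanity check on the scheme.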
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mok, G.C.; Thomas, G.R.; Gerhard, M.A.
SCANS (Shipping Cask ANalysis System) is a microcomputer-based system of computer programs and databases developed at the Lawrence Livermore National Laboratory (LLNL) for evaluating safety analysis reports on spent fuel shipping casks. SCANS is an easy-to-use system that calculates the global response to impact loads, pressure loads, and thermal conditions, providing reviewers with an independent check on analyses submitted by licensees. SCANS runs on microcomputers compatible with the IBM-PC family of computers. The system is composed of a series of menus, input programs, cask analysis programs, and output display programs. All data are entered through fill-in-the-blank input screens that contain descriptive data requests. Analysis options are based on regulatory cases described in the Code of Federal Regulations 10 CFR 71 and Regulatory Guides published by the US Nuclear Regulatory Commission in 1977 and 1978.
A comparison of non-local electron transport models relevant to inertial confinement fusion
NASA Astrophysics Data System (ADS)
Sherlock, Mark; Brodrick, Jonathan; Ridgers, Christopher
2017-10-01
We compare the reduced non-local electron transport model developed by Schurtz et al. to Vlasov-Fokker-Planck simulations. Two new test cases are considered: the propagation of a heat wave through a high density region into a lower density gas, and a 1-dimensional hohlraum ablation problem. We find the reduced model reproduces the peak heat flux well in the ablation region but significantly over-predicts the coronal preheat. The suitability of the reduced model for computing non-local transport effects other than thermal conductivity is considered by comparing the computed distribution function to the Vlasov-Fokker-Planck distribution function. It is shown that even when the reduced model reproduces the correct heat flux, the distribution function is significantly different from the Vlasov-Fokker-Planck prediction. Two simple modifications are considered which improve agreement between models in the coronal region. This work was performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344.
Stockpile Stewardship: How We Ensure the Nuclear Deterrent Without Testing
None
2018-01-16
In the 1990s, the U.S. nuclear weapons program shifted emphasis from developing new designs to dismantling thousands of existing weapons and maintaining a much smaller enduring stockpile. The United States ceased underground nuclear testing, and the Department of Energy created the Stockpile Stewardship Program to maintain the safety, security, and reliability of the U.S. nuclear deterrent without full-scale testing. This video gives a behind-the-scenes look at a set of unique capabilities at Lawrence Livermore that are indispensable to the Stockpile Stewardship Program: high-performance computing, the Superblock Category II nuclear facility, the JASPER two-stage gas gun, the High Explosive Applications Facility (HEAF), the National Ignition Facility (NIF), and the Site 300 contained firing facility.
The AMTEX Partnership{trademark} mid year report, fiscal year 1997
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
1997-03-01
The AMTEX Partnership{trademark} is a collaborative research and development program among the US Integrated Textile Complex (ITC), the US Department of Energy (DOE), the DOE national laboratories, other federal agencies and laboratories, and universities. The goal of AMTEX is to strengthen the competitiveness of this vital industry, thereby preserving and creating US jobs. Three AMTEX projects funded in FY 1997 are the Demand Activated Manufacturing Architecture (DAMA), Computer-Aided Fabric Evaluation (CAFE), and Textile Resource Conservation (TReC). The five sites involved in AMTEX work are Sandia National Laboratories (SNL), Los Alamos National Laboratory (LANL), Lawrence Livermore National Laboratory (LLNL), the Oak Ridge Y-12 Plant, and the Oak Ridge National Laboratory (ORNL) (the latter is funded through Y-12).
NASA Astrophysics Data System (ADS)
Awwal, Abdul A. S.
2016-09-01
Every summer in the National Ignition Facility (NIF) at Lawrence Livermore National Laboratory, students are brought in to gain interesting research and development experience. In this work, we will review some case studies of past research experiences with students inside and outside NIF that led to successful journal and conference publications. Several of these works will be reviewed to demonstrate how problems were chosen and defined so that meaningful results could be obtained within a limited time frame. It is anticipated that success with such projects will go a long way in motivating students in their future graduate careers. Projects from laser measurement, optical computing, and the application of matched filtering in laser beam alignment will be reviewed to demonstrate this approach.
The application of latent curve analysis to testing developmental theories in intervention research.
Curran, P J; Muthén, B O
1999-08-01
The effectiveness of a prevention or intervention program has traditionally been assessed using time-specific comparisons of mean levels between the treatment and the control groups. However, many times the behavior targeted by the intervention is naturally developing over time, and the goal of the treatment is to alter this natural or normative developmental trajectory. Examining time-specific mean levels can be both limiting and potentially misleading when the behavior of interest is developing systematically over time. It is argued here that there are both theoretical and statistical advantages associated with recasting intervention treatment effects in terms of normative and altered developmental trajectories. The recently developed technique of latent curve (LC) analysis is reviewed and extended to a true experimental design setting in which subjects are randomly assigned to a treatment intervention or a control condition. LC models are applied to both artificially generated and real intervention data sets to evaluate the efficacy of an intervention program. Not only do the LC models provide a more comprehensive understanding of the treatment and control group developmental processes compared to more traditional fixed-effects models, but LC models have greater statistical power to detect a given treatment effect. Finally, the LC models are modified to allow for the computation of specific power estimates under a variety of conditions and assumptions that can provide much needed information for the planning and design of more powerful but cost-efficient intervention programs for the future.
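In standard latent growth curve notation, the LC model for a randomized trial can be sketched as follows (a generic illustration, not the paper's own notation):

```latex
y_{it} = \eta_{0i} + \lambda_t\,\eta_{1i} + \varepsilon_{it},\qquad
\eta_{0i} = \alpha_0 + \gamma_0 T_i + \zeta_{0i},\qquad
\eta_{1i} = \alpha_1 + \gamma_1 T_i + \zeta_{1i},
```

where y_{it} is the outcome for subject i at time t, \eta_{0i} and \eta_{1i} are the latent intercept and slope factors with loadings \lambda_t, T_i is the randomized treatment indicator, and \gamma_1 captures the treatment-induced alteration of the normative developmental trajectory.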
Kiefer, Patrick; Schmitt, Uwe; Vorholt, Julia A
2013-04-01
The Python-based, open-source eMZed framework was developed for mass spectrometry (MS) users to create tailored workflows for liquid chromatography (LC)/MS data analysis. The goal was to establish a unique framework with comprehensive basic functionalities that are easy to apply and allow for the extension and modification of the framework in a straightforward manner. eMZed supports the iterative development and prototyping of individual evaluation strategies by providing a computing environment and tools for inspecting and modifying underlying LC/MS data. The framework specifically addresses non-expert programmers, as it requires only basic knowledge of Python and relies largely on existing successful open-source software, e.g. OpenMS. The framework eMZed and its documentation are freely available at http://emzed.biol.ethz.ch/. eMZed is published under the GPL 3.0 license, and an online discussion group is available at https://groups.google.com/group/emzed-users. Supplementary data are available at Bioinformatics online.
Liquid crystalline fiber optic colorimeter for hydrostatic pressure measurement
NASA Astrophysics Data System (ADS)
Wolinski, Tomasz R.; Bajdecki, Waldemar K.; Domanski, Andrzej W.; Karpierz, Miroslaw A.; Konopka, Witold; Nasilowski, T.; Sierakowski, Marek W.; Swillo, Marcin; Dabrowski, Roman S.; Nowinowski-Kruszelnicki, Edward; Wasowski, Janusz
2001-08-01
This paper presents results of tests performed on a fiber optic system with a liquid crystalline transducer for hydrostatic pressure monitoring based on the principles of colorimetry. The system employs pressure-induced deformations occurring in liquid crystalline (LC) cells configured in a homogeneous Frederiks geometry. The sensor is composed of a round LC cell placed inside a specially designed pressure chamber. As a light source we used a typical diode operating at a red wavelength, modulated using standard techniques. The pressure transducer was connected to a computer through a specially designed interface built on the basis of advanced ADAM modules. Results indicate that the system offers high sensitivity to pressure with reduced temperature sensitivity and, depending on the LC cell used, can be adjusted for monitoring of low hydrostatic pressures up to 6 MPa. These studies have demonstrated the feasibility of a fiber optic liquid crystal colorimeter for hydrostatic pressure sensing dedicated to pipelines, mining instrumentation, and process-control technologies.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chinn, D J
This month's issue has the following articles: (1) The Edward Teller Centennial--Commentary by George H. Miller; (2) Edward Teller's Century: Celebrating the Man and His Vision--Colleagues at the Laboratory remember Edward Teller, cofounder of Lawrence Livermore, adviser to U.S. presidents, and physicist extraordinaire, on the 100th anniversary of his birth; (3) Quark Theory and Today's Supercomputers: It's a Match--Thanks to the power of BlueGene/L, Livermore has become an epicenter for theoretical advances in particle physics; and (4) The Role of Dentin in Tooth Fracture--Studies on tooth dentin show that its mechanical properties degrade with age.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tweed, J.
1996-10-01
This Quinquennial Review Report of the Lawrence Livermore National Laboratory (LLNL) branch of the Institute for Geophysics and Planetary Physics (IGPP) provides an overview of IGPP-LLNL, its mission, and research highlights of current scientific activities. This report also presents an overview of the University Collaborative Research Program (UCRP), a summary of the UCRP Fiscal Year 1997 proposal process and the project selection list, a funding summary for 1993-1996, seminars presented, and scientific publications. 2 figs., 3 tabs.
Electronic and thermally tunable infrared metamaterial absorbers
NASA Astrophysics Data System (ADS)
Shrekenhamer, David; Miragliotta, Joseph A.; Brinkley, Matthew; Fan, Kebin; Peng, Fenglin; Montoya, John A.; Gauza, Sebastian; Wu, Shin-Tson; Padilla, Willie J.
2016-09-01
In this paper, we report a computational and experimental study using tunable infrared (IR) metamaterial absorbers (MMAs) to demonstrate frequency-tunable (7%) and amplitude-modulation (61%) designs. The dynamic tuning of each structure was achieved through the addition of an active material, liquid crystals (LC) or vanadium dioxide (VO2), within the unit cell of the MMA architecture. In both systems, an applied stimulus (electric field or temperature) induced a dielectric change in the active material and a subsequent variation in the absorption and reflection properties of the MMA in the mid- to long-wavelength region of the IR (MWIR and LWIR, respectively). These changes were observed to be reversible for both systems and dynamic in the LC-based structure.
Red Storm Usage Model: Version 1.12
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jefferson, Karen L.; Sturtevant, Judith E.
Red Storm is an Advanced Simulation and Computing (ASC) funded massively parallel supercomputer located at Sandia National Laboratories (SNL). The Red Storm Usage Model (RSUM) documents the capabilities and the environment provided for the FY05 Tri-Lab Level II Limited Availability Red Storm User Environment Milestone and the FY05 SNL Level II Limited Availability Red Storm Platform Milestone. This document describes specific capabilities, tools, and procedures to support both local and remote users. The model is focused on the needs of the ASC user working in the secure computing environments at Los Alamos National Laboratory (LANL), Lawrence Livermore National Laboratory (LLNL), and SNL. Additionally, the Red Storm Usage Model maps the provided capabilities to the Tri-Lab ASC Computing Environment (ACE) requirements. The ACE requirements reflect the high performance computing requirements for the ASC community and have been updated in FY05 to reflect the community's needs. For each section of the RSUM, Appendix I maps the ACE requirements to the Limited Availability User Environment capabilities and includes a description of ACE requirements met and those requirements that are not met in that particular section. The Red Storm Usage Model, along with the ACE mappings, has been issued and vetted throughout the Tri-Lab community.
Computationally efficient optimization of radiation drives
NASA Astrophysics Data System (ADS)
Zimmerman, George; Swift, Damian
2017-06-01
For many applications of pulsed radiation, the temporal pulse shape is designed to induce a desired time-history of conditions. This optimization is normally performed using multi-physics simulations of the system, adjusting the shape until the desired response is induced. These simulations may be computationally intensive, and iterative forward optimization is then expensive and slow. In principle, a simulation program could be modified to adjust the radiation drive automatically until the desired instantaneous response is achieved, but this may be impracticable in a complicated multi-physics program. However, the computational time increment is typically much shorter than the time scale of changes in the desired response, so the radiation intensity can be adjusted so that the response tends toward the desired value. This relaxed in-situ optimization method can give an adequate design for a pulse shape in a single forward simulation, giving a typical gain in computational efficiency of tens to thousands. This approach was demonstrated for the design of laser pulse shapes to induce ramp loading to high pressure in target assemblies where different components had significantly different mechanical impedance, requiring careful pulse shaping. This work was performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344.
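The relaxed in-situ idea can be sketched with a toy first-order response model in place of a multi-physics code; the model, gains, and target history below are all hypothetical stand-ins, not the authors' implementation:

```python
def relaxed_inline_optimization(desired, dt=1e-3, k=50.0, gain=200.0):
    """Single forward simulation that adjusts the drive intensity so the
    response relaxes toward the desired history (illustrative toy model)."""
    intensity, response = 0.0, 0.0
    history = []
    for target in desired:
        # Toy physics: the response relaxes toward the drive intensity.
        response += k * (intensity - response) * dt
        # Relaxed in-situ adjustment: nudge the drive toward whatever
        # closes the gap between achieved and desired response.
        intensity += gain * (target - response) * dt
        history.append((intensity, response))
    return history

# Desired response: a smooth ramp to 1.0 over the first half of the run.
desired = [min(1.0, i / 500.0) for i in range(1000)]
hist = relaxed_inline_optimization(desired)
```

Because the drive is corrected at every computational time increment inside the one forward run, no outer iteration over full simulations is needed, which is the source of the quoted efficiency gain.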
NASA Astrophysics Data System (ADS)
Klein, R.; Woodward, C. S.; Johannesson, G.; Domyancic, D.; Covey, C. C.; Lucas, D. D.
2012-12-01
Uncertainty Quantification (UQ) is a critical field within 21st century simulation science that resides at the very center of the web of emerging predictive capabilities. The science of UQ holds the promise of giving much greater meaning to the results of complex large-scale simulations, allowing for quantifying and bounding uncertainties. This powerful capability will yield new insights into scientific predictions (e.g. Climate) of great impact on both national and international arenas, allow informed decisions on the design of critical experiments (e.g. ICF capsule design, MFE, NE) in many scientific fields, and assign confidence bounds to scientifically predictable outcomes (e.g. nuclear weapons design). In this talk I will discuss a major new strategic initiative (SI) we have developed at Lawrence Livermore National Laboratory to advance the science of Uncertainty Quantification at LLNL focusing in particular on (a) the research and development of new algorithms and methodologies of UQ as applied to multi-physics multi-scale codes, (b) incorporation of these advancements into a global UQ Pipeline (i.e. a computational superstructure) that will simplify user access to sophisticated tools for UQ studies as well as act as a self-guided, self-adapting UQ engine for UQ studies on extreme computing platforms and (c) use laboratory applications as a test bed for new algorithms and methodologies. The initial SI focus has been on applications for the quantification of uncertainty associated with Climate prediction, but the validated UQ methodologies we have developed are now being fed back into Science Based Stockpile Stewardship (SSS) and ICF UQ efforts. 
To make advancements in several of these UQ grand challenges, I will focus in this talk on the following three research areas in our Strategic Initiative: error estimation in multi-physics and multi-scale codes; tackling the "Curse of High Dimensionality"; and development of an advanced UQ Computational Pipeline to enable complete UQ workflow and analysis for ensemble runs at the extreme scale (e.g. exascale) with self-guiding adaptation in the UQ Pipeline engine. This work was performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344 and was funded by the Uncertainty Quantification Strategic Initiative Laboratory Directed Research and Development Project at LLNL under project tracking code 10-SI-013 (UCRL LLNL-ABS-569112).
Nonlinear optics quantum computing with circuit QED.
Adhikari, Prabin; Hafezi, Mohammad; Taylor, J M
2013-02-08
One approach to quantum information processing is to use photons as quantum bits and rely on linear optical elements for most operations. However, some optical nonlinearity is necessary to enable universal quantum computing. Here, we suggest a circuit-QED approach to nonlinear optics quantum computing in the microwave regime, including a deterministic two-photon phase gate. Our specific example uses a hybrid quantum system comprising an LC resonator coupled to a superconducting flux qubit to implement a nonlinear coupling. Compared to the self-Kerr nonlinearity, we find that our approach has improved tolerance to noise in the qubit while maintaining fast operation.
Sadygov, Rovshan G; Maroto, Fernando Martin; Hühmer, Andreas F R
2006-12-15
We present an algorithmic approach to align three-dimensional chromatographic surfaces of LC-MS data of complex mixture samples. The approach consists of two steps. In the first step, we prealign chromatographic profiles: two-dimensional projections of chromatographic surfaces. This is accomplished by correlation analysis using fast Fourier transforms. In this step, a temporal offset that maximizes the overlap and dot product between two chromatographic profiles is determined. In the second step, the algorithm generates correlation matrix elements between full mass scans of the reference and sample chromatographic surfaces. The temporal offset from the first step indicates a range of the mass scans that are possibly correlated, then the correlation matrix is calculated only for these mass scans. The correlation matrix carries information on highly correlated scans, but it does not itself determine the scan or time alignment. Alignment is determined as a path in the correlation matrix that maximizes the sum of the correlation matrix elements. The computational complexity of the optimal path generation problem is reduced by the use of dynamic programming. The program produces time-aligned surfaces. The use of the temporal offset from the first step in the second step reduces the computation time for generating the correlation matrix and speeds up the process. The algorithm has been implemented in a program, ChromAlign, developed in C++ language for the .NET2 environment in WINDOWS XP. In this work, we demonstrate the applications of ChromAlign to alignment of LC-MS surfaces of several datasets: a mixture of known proteins, samples from digests of surface proteins of T-cells, and samples prepared from digests of cerebrospinal fluid. ChromAlign accurately aligns the LC-MS surfaces we studied. In these examples, we discuss various aspects of the alignment by ChromAlign, such as constant time axis shifts and warping of chromatographic surfaces.
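The first step, estimating a temporal offset by correlating two chromatographic profiles with fast Fourier transforms, can be sketched as follows (an illustrative reimplementation, not ChromAlign's actual code; the synthetic profiles are made up):

```python
import numpy as np

def estimate_temporal_offset(reference, sample):
    """Estimate the scan offset that best aligns two chromatographic
    profiles using FFT-based cross-correlation.
    Returns lag such that sample[t] ~= reference[t + lag]."""
    n = len(reference) + len(sample) - 1
    nfft = 1 << (n - 1).bit_length()  # zero-pad to a power of two
    # Cross-correlation via the convolution theorem.
    corr = np.fft.ifft(np.fft.fft(reference, nfft) *
                       np.conj(np.fft.fft(sample, nfft))).real
    lag = int(np.argmax(corr))
    if lag > nfft // 2:  # map wrapped indices to negative lags
        lag -= nfft
    return lag

# Synthetic profiles: the sample is the reference delayed by 7 scans.
ref = np.zeros(200)
ref[50:60] = 1.0
smp = np.zeros(200)
smp[57:67] = 1.0
offset = estimate_temporal_offset(ref, smp)  # -7: sample lags by 7 scans
```

The returned offset then restricts which full mass scans need pairwise correlation in the second step, which is what keeps the correlation matrix small.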
Initial Results of the SSPX Transient Internal Probe System for Measuring Toroidal Field Profiles
NASA Astrophysics Data System (ADS)
Holcomb, C. T.; Jarboe, T. R.; Mattick, A. T.; Hill, D. N.; McLean, H. S.; Wood, R. D.; Cellamare, V.
2000-10-01
The Sustained Spheromak Physics Experiment (SSPX) at Lawrence Livermore National Laboratory, Livermore, CA 94550, USA, is using a field profile diagnostic called the Transient Internal Probe (TIP). TIP consists of a verdet-glass bullet that is used to measure the magnetic field by Faraday rotation. This probe is shot through the spheromak by a light gas gun at speeds near 2 km/s. An argon laser is aligned along the path of the probe. The light passes through the probe and is retro-reflected to an ellipsometer that measures the change in polarization angle. The measurement is spatially resolved down to the probe's 1 cm length to within 15 Gauss. Initial testing results are given. This and future data will be used to determine the field profile for equilibrium reconstruction. TIP can also be used in conjunction with wall probes to map out toroidal mode amplitudes and phases internally. This work was performed under the auspices of US DOE by the University of California Lawrence Livermore National Laboratory under Contract No. W-7405-ENG-48.
Proceedings of the 3rd US-Japan Workshop on Plasma Polarization Spectroscopy
DOE Office of Scientific and Technical Information (OSTI.GOV)
Beiersdorfer, P; Fujimoto, T
The third US-Japan Workshop on Plasma Polarization Spectroscopy was held at the Lawrence Livermore National Laboratory in Livermore, California, on June 18-21, 2001. The talks presented at this workshop are summarized in these proceedings. The papers cover both experimental investigation and applications of plasma polarization spectroscopy as well as the theoretical foundation and formalisms to understand and describe the polarization phenomena. The papers give an overview of the history of plasma polarization spectroscopy, derive the formal aspects of polarization spectroscopy, including the effects of electric and magnetic fields, discuss spectra perturbed by intense microwave fields, charge exchange, and dielectronic recombination, and present calculations of various collisional excitation and ionization cross sections and the modeling of plasma polarization spectroscopy phenomena. Experimental results are given from the WT-3 tokamak, the MST reverse field pinch, the Large Helical Device, the GAMMA 10 mirror machine, the Nevada Terrawatt Facility, the Livermore EBIT-II electron beam ion trap, and beam-foil spectroscopy. In addition, results were presented from studies of several laser-produced plasma experiments and new instrumental techniques were demonstrated.
Design of Incident Field B-Dot Sensor for the Nose Boom of NASA F-106B Aircraft.
1984-04-01
Figures 14 to 21 show the computed cBM(t), whereas Figures 22 to 29 show the computed M(t) as well as the sensor pick-up voltage Vs. There is adequate signal corresponding to the incident field.
Cheng, Wing-Chi; Yau, Tsan-Sang; Wong, Ming-Kei; Chan, Lai-Ping; Mok, Vincent King-Kuen
2006-10-16
A rapid urinalysis system based on SPE-LC-MS/MS with an in-house post-analysis data management system has been developed for the simultaneous identification and semi-quantitation of opiates (morphine, codeine), methadone, amphetamines (amphetamine, methylamphetamine (MA), 3,4-methylenedioxyamphetamine (MDA), and 3,4-methylenedioxymethamphetamine (MDMA)), 11 benzodiazepines or their metabolites, and ketamine. The urine samples are subjected to automated solid phase extraction prior to analysis by LC-MS (Finnigan Surveyor LC connected to a Finnigan LCQ Advantage) fitted with an Alltech Rocket Platinum EPS C-18 column. With a single-point calibration at the cut-off concentration for each analyte, simultaneous identification and semi-quantitation of the above-mentioned drugs can be achieved in a 10 min run per urine sample. A computer macro-program package was developed to automatically retrieve appropriate data from the analytical data files, compare results with preset values (such as cut-off concentrations and MS matching scores) for each drug being analyzed, and generate user-defined Excel reports that indicate all positive and negative results in a batch-wise manner for ease of checking. The final analytical results are automatically copied into an Access database for report generation purposes. Through the use of automation in sample preparation, simultaneous identification and semi-quantitation by LC-MS/MS, and a tailor-made post-analysis data management system, this new urinalysis system significantly improves the quality of results, reduces post-analysis data treatment time and data-transfer errors, and is suitable for high-throughput laboratories operating in a batch-wise manner.
Series hybrid vehicles and optimized hydrogen engine design
NASA Astrophysics Data System (ADS)
Smith, J. R.; Aceves, S.; Vanblarigan, P.
1995-05-01
Lawrence Livermore, Sandia Livermore and Los Alamos National Laboratories have a joint project to develop an optimized hydrogen fueled engine for series hybrid automobiles. The major divisions of responsibility are: system analysis, engine design and kinetics modeling by LLNL; performance and emission testing, and friction reduction by SNL; computational fluid mechanics and combustion modeling by LANL. This project is a component of the Department of Energy, Office of Utility Technology, National Hydrogen Program. We report here on the progress on system analysis and preliminary engine testing. We have done system studies of series hybrid automobiles that approach the PNGV design goal of 34 km/liter (80 mpg), for 384 km (240 mi) and 608 km (380 mi) ranges. Our results indicate that such a vehicle appears feasible using an optimized hydrogen engine. The impact of various on-board storage options on fuel economy is evaluated. Experiments with an available engine at the Sandia Combustion Research Facility demonstrated NO(x) emissions of 10 to 20 ppm at an equivalence ratio of 0.4, rising to about 500 ppm at 0.5 equivalence ratio using neat hydrogen. Hybrid vehicle simulation studies indicate that exhaust NO(x) concentrations must be less than 180 ppm to meet the 0.2 g/mile California Air Resources Board ULEV or Federal Tier-2 emissions regulations. We have designed and fabricated a first generation optimized hydrogen engine head for use on an existing single cylinder Onan engine. This head currently features 14.8:1 compression ratio, dual ignition, water cooling, two valves and an open quiescent combustion chamber to minimize heat transfer losses.
Workshop on Incomplete Network Data Held at Sandia National Labs – Livermore
DOE Office of Scientific and Technical Information (OSTI.GOV)
Soundarajan, Sucheta; Wendt, Jeremy D.
2016-06-01
While network analysis is applied in a broad variety of scientific fields (including physics, computer science, biology, and the social sciences), how networks are constructed and the resulting bias and incompleteness have drawn more limited attention. For example, in biology, gene networks are typically developed via experiment; many actual interactions are likely yet to be discovered. In addition to this incompleteness, the data-collection processes can introduce significant bias into the observed network datasets. For instance, if you observe part of the World Wide Web network through a classic random walk, then high-degree nodes are more likely to be found than if you had selected nodes at random. Unfortunately, such incomplete and biasing data collection methods must often be used.
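The random-walk bias mentioned above is easy to demonstrate on a toy graph (a hypothetical example for illustration, not data from the workshop):

```python
import random
from collections import Counter

random.seed(0)

# Toy graph: a ring of 60 nodes plus one hub linked to every ring node.
n = 60
adj = {i: [(i - 1) % n, (i + 1) % n, "hub"] for i in range(n)}
adj["hub"] = list(range(n))

def random_walk(adj, start, steps):
    """Classic random walk: at each step, move to a uniformly random neighbor."""
    node, visits = start, Counter()
    for _ in range(steps):
        node = random.choice(adj[node])
        visits[node] += 1
    return visits

visits = random_walk(adj, 0, 30_000)
hub_share = visits["hub"] / 30_000
# The walk's stationary distribution is proportional to node degree:
# hub degree 60 vs. ring degree 3, so the hub receives roughly
# 60 / (60 + 60 * 3) = 0.25 of all visits, versus 1/61 under
# uniform node sampling.
```

With a seeded run, hub_share comes out near 0.25, illustrating how a crawl-style sample over-represents high-degree nodes relative to uniform selection.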
Stockpile Stewardship: How We Ensure the Nuclear Deterrent Without Testing
DOE Office of Scientific and Technical Information (OSTI.GOV)
None
2014-09-04
In the 1990s, the U.S. nuclear weapons program shifted emphasis from developing new designs to dismantling thousands of existing weapons and maintaining a much smaller enduring stockpile. The United States ceased underground nuclear testing, and the Department of Energy created the Stockpile Stewardship Program to maintain the safety, security, and reliability of the U.S. nuclear deterrent without full-scale testing. This video gives a behind-the-scenes look at a set of unique capabilities at Lawrence Livermore that are indispensable to the Stockpile Stewardship Program: high-performance computing, the Superblock Category II nuclear facility, the JASPER two-stage gas gun, the High Explosive Applications Facility (HEAF), the National Ignition Facility (NIF), and the Site 300 contained firing facility.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bearinger, J P
This month's issue has the following articles: (1) Science Translated for the Greater Good--Commentary by Steven D. Liedle; (2) The New Face of Industrial Partnerships--An entrepreneurial spirit is blossoming at Lawrence Livermore; (3) Monitoring a Nuclear Weapon from the Inside--Livermore researchers are developing tiny sensors to warn of detrimental chemical and physical changes inside nuclear warheads; (4) Simulating the Biomolecular Structure of Nanometer-Size Particles--Grand Challenge simulations reveal the size and structure of nanolipoprotein particles used to study membrane proteins; and (5) Antineutrino Detectors Improve Reactor Safeguards--Antineutrino detectors track the consumption and production of fissile materials inside nuclear reactors.
Development of a Landmine Detection Sensor Final Report CRADA No. TC02133.0
DOE Office of Scientific and Technical Information (OSTI.GOV)
Romero, C. E.; Sheppard, C.
2017-09-06
This was one of two CRADAs between Lawrence Livermore National Security, LLC, as manager and operator of Lawrence Livermore National Laboratory (LLNL), and First Alliance Technologies, LLC (First Alliance), to conduct research and development toward an integrated system for detecting, locating, and destroying landmines and unexploded ordnance, combining a laser for destroying them with First Alliance's Land Mine Locator (LML) system. The focus of this CRADA was on developing a sensor system that accurately detects landmines and provides exact location information in a timely manner with extreme reliability.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Savage, J.W.
1983-03-10
A human factors engineering design review/audit of the Waterford-3 control room was performed at the site on May 10 through May 13, 1982. The report was prepared on the basis of the HFEB's review of the applicant's Preliminary Human Engineering Discrepancy (PHED) report and the human factors engineering design review performed at the site. This design review was carried out by a team from the Human Factors Engineering Branch, Division of Human Factors Safety. The review team was assisted by consultants from Lawrence Livermore National Laboratory (University of California), Livermore, California.
Science & Technology Review September 2005
DOE Office of Scientific and Technical Information (OSTI.GOV)
Aufderheide III, M B
2005-07-19
This month's issue has the following articles: (1) The Pursuit of Fusion Energy--Commentary by William H. Goldstein; (2) A Dynamo of a Plasma--The self-organizing magnetized plasmas in a Livermore fusion energy experiment are akin to solar flares and galactic jets; (3) How One Equation Changed the World--A three-page paper by Albert Einstein revolutionized physics by linking mass and energy; (4) Recycled Equations Help Verify Livermore Codes--New analytic solutions for imploding spherical shells give scientists additional tools for verifying codes; and (5) Dust That's Worth Keeping--Scientists have solved the mystery of an astronomical spectral feature in interplanetary dust particles.
Patnaik, Rajashree; Padhy, Rabindra N
2018-05-11
Toxicities of methylmercury chloride (CH3HgCl) and methylmercury hydroxide (CH3HgOH) to the cultured neuroblastoma cell line SH-SY5Y were evaluated in vitro. This comparative study of the two methylmercury compounds determined the extent to which each is toxic to the SH-SY5Y cell line. Both cytotoxicity and genotoxicity experiments were carried out to identify the more toxic compound. For the cytotoxicity study, four staining assays were used independently, with trypan blue (TB), acridine orange/ethidium bromide (AO/EB), 3-(4,5-dimethylthiazol-2-yl)-2,5-diphenyl tetrazolium bromide (MTT), and neutral red (NR); the comet assay was used for the genotoxicity study. The resulting toxicity data were used for probit analysis. In cytotoxicity, CH3HgCl had a minimum inhibitory concentration (MIC) of 3 mg/L invariably in each assay; LC25 values were in the range 7.41 to 10.23 mg/L, LC50 values were 14.79 to 15.48 mg/L, and LC75 values were 20.89 to 26.91 mg/L. The LC100 value for CH3HgCl, determined from comet assay experiments, was 30 mg/L. Similarly for CH3HgOH, the MIC in each assay was invariably 3 mg/L, LC25 values were in the range 12.58 to 16.59 mg/L, LC50 values were 19.49 to 23.44 mg/L, LC75 values were 27.54 to 30.90 mg/L, and the LC100 value was 42 mg/L in each assay done for the cytotoxicity and genotoxicity studies. Computed DNA fragmentation indices in comet assays were 98.6 ± 0.57 at 30 mg/L with CH3HgCl and 76 ± 5.29 at 30 mg/L with CH3HgOH. This study clearly indicated that methylmercury chloride is more toxic than methylmercury hydroxide to the SH-SY5Y cell line. The toxicity of Hg was thus quantified with an in vitro cultured human neuroblastoma cell line; since Hg has neurotoxic effects, this neural evaluation has implications for environmental health issues.
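The probit analysis mentioned above can be sketched as a regression of the probit-transformed mortality fraction on log dose, solving for the dose at 50% mortality. This uses only Python's standard library (`statistics.NormalDist`); the dose-response data are illustrative, not the paper's measurements.

```python
from statistics import NormalDist
import math

def lc50_probit(doses, mortality):
    """Estimate LC50 by probit analysis: regress the probit
    (inverse normal CDF) of the mortality fraction on log10(dose),
    then solve for the dose giving 50% mortality (probit = 0)."""
    nd = NormalDist()
    x = [math.log10(d) for d in doses]
    y = [nd.inv_cdf(p) for p in mortality]  # probit transform
    n = len(x)
    xm, ym = sum(x) / n, sum(y) / n
    slope = sum((a - xm) * (b - ym) for a, b in zip(x, y)) / \
            sum((a - xm) ** 2 for a in x)
    intercept = ym - slope * xm
    return 10 ** (-intercept / slope)

# Illustrative dose-response data (mg/L, fraction killed)
doses = [3.0, 7.5, 15.0, 22.5, 30.0]
mortality = [0.05, 0.25, 0.50, 0.75, 0.95]
lc50 = lc50_probit(doses, mortality)
```

In practice probit software also weights points by their binomial variance; this unweighted least-squares sketch shows only the core transform-and-regress idea.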
A group contribution method has been developed to correlate the acute toxicity (96 h LC50) to the fathead minnow (Pimephales promelas) for 379 organic chemicals. Multilinear regression and computational neural networks (CNNs) were used for model building. The multilinear linear m...
LC21: A Digital Strategy for the Library of Congress.
ERIC Educational Resources Information Center
National Academy of Sciences - National Research Council, Washington, DC.
The Library of Congress asked the Computer Science and Telecommunications Board (CSTB) of the National Academies to conduct a study to provide strategic advice concerning the information technology path that the Library of Congress should traverse over the coming decade. The Committee on an Information Technology Strategy for the Library of…
Xenon NMR of liquid crystals confined to cylindrical nanocavities: a simulation study.
Karjalainen, Jouni; Vaara, Juha; Straka, Michal; Lantto, Perttu
2015-03-21
Applications of liquid crystals (LCs), such as smart windows and the ubiquitous display devices, are based on controlling the orientational and translational order in a small volume of LC medium. Hence, understanding the effects of confinement on liquid crystal phase behaviour is essential. The NMR shielding of (129)Xe atoms dissolved in LCs constitutes a very sensitive probe of the details of the LC environment. Linking the experimental results to microscopic phenomena calls for molecular simulations. In this work, the NMR shielding of atomic (129)Xe dissolved in a uniaxial thermotropic LC confined to nanosized cylindrical cavities is computed from coarse-grained (CG) isobaric Monte Carlo (MC) simulations with a quantum-chemically (QC) pre-parameterised pairwise-additive model for the Xe nuclear shielding tensor. We report the results for the (129)Xe nuclear shielding and its connection to the structure and order of the LC in two different cavity sizes, as well as a comparison to the results of bulk (non-confined) simulations. We find that the confinement changes the LC phase structure dramatically and gives rise to the coexistence of varying degrees of LC order, which is reflected in the Xe shielding. Furthermore, we qualitatively reproduce the behaviour of the mean (129)Xe chemical shift with respect to temperature for atomic Xe dissolved in LC confined to controlled-pore glass materials. In the small-radius cavity the nematic-paranematic phase transition is revealed only by the anisotropic component of the (129)Xe nuclear shielding. In the larger cavity, the nematic-paranematic-isotropic transition is clearly seen in the Xe shielding. The simulated (129)Xe NMR shielding is insensitive to the smectic-A to nematic transition, since in the smectic-A phase the Xe atoms largely occupy the imperfect layer structure near the cavity walls.
The direct contribution of the cavity wall to (129)Xe nuclear shielding is dependent on the cavity size but independent of temperature. Our results show that the combination of CG simulations and a QC pre-parameterised (129)Xe NMR shielding allows efficient studies of the phase behaviour and structure of complex systems containing thousands of molecules, and brings us closer to the simulation of NMR experiments.
Air-Gapped Structures as Magnetic Elements for Use in Power Processing Systems. Ph.D. Thesis
NASA Technical Reports Server (NTRS)
Ohri, A. K.
1977-01-01
Methodical approaches to the design of inductors for use in LC filters and dc-to-dc converters using air gapped magnetic structures are presented. Methods for the analysis and design of full wave rectifier LC filter circuits operating with the inductor current in both the continuous conduction and the discontinuous conduction modes are also described. In the continuous conduction mode, linear circuit analysis techniques are employed, while in the case of the discontinuous mode, the method of analysis requires computer solutions of the piecewise linear differential equations which describe the filter in the time domain. Procedures for designing filter inductors using air gapped cores are presented. The first procedure requires digital computation to yield a design which is optimized in the sense of minimum core volume and minimum number of turns. The second procedure does not yield an optimized design as defined above, but the design can be obtained by hand calculations or with a small calculator. The third procedure is based on the use of specially prepared magnetic core data and provides an easy way to quickly reach a workable design.
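The design procedures above rest on the standard gapped-core inductance relation, in which the air gap dominates the magnetic reluctance. A minimal sketch, with illustrative component values that are not taken from the thesis:

```python
import math

MU0 = 4e-7 * math.pi  # vacuum permeability (H/m)

def gapped_inductance(n_turns, core_area, gap_len, path_len, mu_r):
    """Inductance of an air-gapped core, L = mu0 * N^2 * Ac / (lg + le/mu_r).
    The gap term lg usually dominates, which linearizes the inductor
    and raises the dc current it can carry before saturating."""
    return MU0 * n_turns**2 * core_area / (gap_len + path_len / mu_r)

def lc_corner_freq(l_henry, c_farad):
    """Corner (resonant) frequency of an LC filter: f = 1/(2*pi*sqrt(LC))."""
    return 1.0 / (2.0 * math.pi * math.sqrt(l_henry * c_farad))

# Illustrative design point: 60 turns, 1.2 cm^2 core, 1 mm gap,
# 8 cm magnetic path, relative permeability 2000
L = gapped_inductance(n_turns=60, core_area=1.2e-4, gap_len=1.0e-3,
                      path_len=8.0e-2, mu_r=2000.0)
f_c = lc_corner_freq(L, 100e-6)  # with a 100 uF filter capacitor
```

Note how the gap length, not the core permeability, sets the inductance here: doubling mu_r barely moves L, which is the point of gapping the core.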
DOE Office of Scientific and Technical Information (OSTI.GOV)
Raboin, P J
1998-01-01
The Computational Mechanics thrust area is a vital and growing facet of the Mechanical Engineering Department at Lawrence Livermore National Laboratory (LLNL). This work supports the development of computational analysis tools in the areas of structural mechanics and heat transfer. Over 75 analysts depend on thrust area-supported software running on a variety of computing platforms to meet the demands of LLNL programs. Interactions with the Department of Defense (DOD) High Performance Computing and Modernization Program and the Defense Special Weapons Agency are of special importance as they support our ParaDyn project in its development of new parallel capabilities for DYNA3D. Working with DOD customers has been invaluable to driving this technology in directions mutually beneficial to the Department of Energy. Other projects associated with the Computational Mechanics thrust area include work with the Partnership for a New Generation Vehicle (PNGV) for ''Springback Predictability'' and with the Federal Aviation Administration (FAA) for the ''Development of Methodologies for Evaluating Containment and Mitigation of Uncontained Engine Debris.'' In this report for FY-97, there are five articles detailing three code development activities and two projects that synthesized new code capabilities with new analytic research in damage/failure and biomechanics. The articles this year are: (1) Energy- and Momentum-Conserving Rigid-Body Contact for NIKE3D and DYNA3D; (2) Computational Modeling of Prosthetics: A New Approach to Implant Design; (3) Characterization of Laser-Induced Mechanical Failure Damage of Optical Components; (4) Parallel Algorithm Research for Solid Mechanics Applications Using Finite Element Analysis; and (5) An Accurate One-Step Elasto-Plasticity Algorithm for Shell Elements in DYNA3D.
A liquid-crystal-on-silicon color sequential display using frame buffer pixel circuits
NASA Astrophysics Data System (ADS)
Lee, Sangrok
Next generation liquid-crystal-on-silicon (LCOS) high definition (HD) televisions and image projection displays will need to be low-cost and high quality to compete with existing systems based on digital micromirror devices (DMDs), plasma displays, and direct view liquid crystal displays. In this thesis, a novel frame buffer pixel architecture is presented that buffers data for the next image frame while displaying the current frame, offering such a competitive solution. The primary goal of the thesis is to demonstrate the LCOS microdisplay architecture for high quality image projection displays at potentially low cost. The thesis covers four main research areas: new frame buffer pixel circuits to improve the LCOS performance, backplane architecture design and testing, liquid crystal modes for the LCOS microdisplay, and system integration and demonstration. The design requirements for the LCOS backplane with a 64 x 32 pixel array are addressed, and measured electrical characteristics match computer simulation results. Various liquid crystal (LC) modes applicable to LCOS microdisplays and their physical properties are discussed. One- and two-dimensional director simulations are performed for the selected LC modes. Test liquid crystal cells with the selected LC modes are made and their electro-optic effects are characterized. The 64 x 32 LCOS microdisplays fabricated with the best LC mode are optically tested with interface circuitry. The characteristics of the LCOS microdisplays are summarized with the successful demonstration.
Towards an optimal contact metal for CNTFETs.
Fediai, Artem; Ryndyk, Dmitry A; Seifert, Gotthard; Mothes, Sven; Claus, Martin; Schröter, Michael; Cuniberti, Gianaurelio
2016-05-21
Downscaling of the contact length Lc of a side-contacted carbon nanotube field-effect transistor (CNTFET) is challenging because the contact resistance increases rapidly as Lc falls below 20-50 nm. Theoretical work, if in agreement with existing experimental results, might answer which metals yield the lowest CNT-metal contact resistance and what physical mechanisms govern the geometry dependence of the contact resistance. However, at the scale of 10 nm, parameter-free models of electron transport become computationally prohibitively expensive. In our work we used a dedicated combination of the Green function formalism and density functional theory to perform an overall ab initio simulation of extended CNT-metal contacts of arbitrary length (including infinite), a previously unachievable level of simulation. We provide a systematic and comprehensive discussion of metal-CNT contact properties as a function of the metal type and the contact length. We have found, and been able to explain, very uncommon relations between the chemical, physical, and electrical properties observed in CNT-metal contacts. The calculated electrical characteristics are in reasonable quantitative agreement with, and exhibit similar trends as, the latest experimental data in terms of: (i) contact resistance for Lc = ∞; (ii) scaling of the contact resistance Rc(Lc); and (iii) metal-defined polarity of a CNTFET. Our results can guide technology development and contact material selection for downscaling the length of side-contacts below 10 nm.
Comparison of x-ray cross sections for diagnostic and therapeutic medical physics.
Boone, J M; Chavez, A E
1996-12-01
The purpose of this technical report is to make available an up-to-date source of attenuation coefficient data to the medical physics community, and to compare these data with other more familiar sources. Data files from Lawrence Livermore National Laboratory (in Livermore, CA) were truncated to match the needs of the medical physics community, and an interpolation routine was written to calculate a continuous set of cross sections spanning energies from 1 keV to 50 MeV. Coefficient data are available for elements Z = 1 through Z = 100. Values for mass attenuation coefficients, mass-energy-transfer coefficients, and mass-energy absorption coefficients are produced by a single computer subroutine. In addition to total interaction cross sections, the cross sections for photoelectric, Rayleigh, Compton, pair, and some triplet interactions are also produced by this single program. The coefficients were compared to the 1970 data of Storm and Israel over the energy interval from 1 to 1000 keV; for elements 10, 20, 30, 40, 50, 60, 70, and 80, the average positive difference between the Storm and Israel coefficients and the coefficients reported here are 1.4%, 2.7%, and 2.6%, for the mass attenuation, mass energy-transfer, and mass-energy absorption coefficients, respectively. The 1969 data compilation of mass attenuation coefficients from McMaster et al. were also compared with the newer LLNL data. Over the energy region from 10 keV to 1000 keV, and from elements Z = 1 to Z = 82 (inclusive), the overall average difference was 1.53% (sigma = 0.85%). While the overall average difference was small, there was larger variation (> 5%) between cross sections for some elements. In addition to coefficient data, other useful data such as the density, atomic weight, K, L1, L2, L3, M, and N edges, and numerous characteristic emission energies are output by the program, depending on a single input variable. 
The computer source code, written in C, can be accessed and downloaded from the World Wide Web at: http://www.aip.org/epaps/epaps.html [E-MPHSA-23-1977].
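The interpolation routine the report describes is not specified in the abstract; a common scheme for photon attenuation data between tabulated energies (away from absorption edges, where the tables must be split) is log-log linear interpolation. A sketch with a few published-style water values used purely for illustration, not the LLNL tables:

```python
import math

def loglog_interp(e, energies, mus):
    """Log-log linear interpolation of an attenuation coefficient at
    energy e, given tabulated (energy, mu) pairs in ascending order.
    Attenuation data are close to power laws, so interpolating the
    logarithms is far more accurate than linear interpolation."""
    for i in range(len(energies) - 1):
        if energies[i] <= e <= energies[i + 1]:
            t = (math.log(e) - math.log(energies[i])) / (
                math.log(energies[i + 1]) - math.log(energies[i]))
            return math.exp(math.log(mus[i]) * (1 - t)
                            + math.log(mus[i + 1]) * t)
    raise ValueError("energy outside tabulated range")

# Illustrative mass attenuation coefficients mu/rho for water (cm^2/g)
energies = [0.05, 0.10, 0.50, 1.00]   # MeV
mus      = [0.227, 0.171, 0.0969, 0.0707]
mu_80kev = loglog_interp(0.08, energies, mus)
```

Real cross-section libraries handle edge discontinuities by storing separate tables above and below each edge; this sketch assumes an edge-free interval.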
Advantages and limitations of computed tomography scans for treatment planning of lung cancer
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mira, J.G.; Potter, J.L.; Fullerton, G.D.
1982-09-01
Forty-five chest computed tomography (CT) scans performed on patients with lung carcinoma (LC) were evaluated in an attempt to understand the pattern of intrathoracic tumor spread and the advantages and limitations this technique offers for treatment planning when compared to planning done by conventional X rays. The following findings can help treatment planning. (1) When regular X rays do not show location (i.e., hemithorax opacification), CT scan will show it in 68% of patients. If regular X rays show a well localized mass, unsuspected tumor extensions were disclosed in 78% of these patients. Hence, CT scans should be done in all LC patients prior to treatment planning; (2) Mediastinal masses frequently spread anteriorly toward the sternum and posteriorly around the vertebral bodies toward the cord and costal pleura. This should be considered for radiotherapy boost techniques; (3) Lung masses spread in one third of cases toward the lateral costal pleura. Thus, the usual 1-2cm of safety margin around the LC are not sufficient in some cases; (4) Tumor size can appear much smaller in regular X rays than in CT scans. Hence, CT scans are necessary for accurate staging and evaluation of tumor response. Some CT scan limitations are: (1) Atelectasis blends with tumor in approximately half of the patients, thus obscuring tumor boundaries; (2) CT numbers and contrast enhancement did not help to differentiate between these two structures; and (3) Limited definition of CT scan prevents investigation of suspected microscopic spread around tumor masses.
NASA Astrophysics Data System (ADS)
Wunenburger, R.; Chatain, D.; Garrabos, Y.; Beysens, D.
2000-07-01
We report a study concerning the compensation of gravity forces in two-phase (p-) hydrogen. The sample is placed near one end of the vertical z axis of a superconducting coil, where there is a near-uniform magnetic field gradient. A variable effective gravity level g can thus be applied to the two-phase fluid system. The vanishing behavior of the capillary length lC at the critical point is compensated by a decrease in g and lC is kept much smaller than the cell dimension. For g ranging from 1 to 0.25 times Earth's gravity (modulus g0) we compare the actual shape of the meniscus to the expected shape in a homogeneous gravity field. We determine lC in a wide range of reduced temperature τ=(TC-T)/TC=[10-4-0.02] from a fit of the meniscus shape. The data are in agreement with previous measurements further from TC performed in n-H2 under Earth's gravity. The effective gravity is homogeneous within 10-2g0 for a 3 mm diameter and 2 mm thickness sample and is in good agreement with the computed one, validating the use of the apparatus as a variable gravity facility. In the vicinity of the levitation point (where magnetic forces exactly compensate Earth's gravity), the computed axial component of the acceleration is found to be quadratic in z, whereas its radial component is proportional to the distance to the axis, which explains the gas-liquid patterns observed near the critical point.
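The compensation principle above follows from the standard definition of the capillary length. A small sketch with illustrative numbers, not the paper's hydrogen data:

```python
import math

def capillary_length(sigma, delta_rho, g):
    """Capillary length l_C = sqrt(sigma / (delta_rho * g)), the scale
    below which surface tension beats gravity in shaping an interface.
    Near T_C both sigma and delta_rho vanish, so l_C -> 0; lowering
    the effective gravity g compensates and keeps l_C large."""
    return math.sqrt(sigma / (delta_rho * g))

# Illustrative near-critical values: sigma in N/m, delta_rho in kg/m^3
g0 = 9.81
l_earth = capillary_length(sigma=1e-4, delta_rho=10.0, g=g0)
l_quarter = capillary_length(sigma=1e-4, delta_rho=10.0, g=0.25 * g0)
# Reducing g by a factor of 4 doubles l_C (square-root dependence).
```

This square-root dependence is why the magnetic gradient trick works: modest reductions in effective g already keep l_C well above the cell dimension close to T_C.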
Numerical simulation of long-duration blast wave evolution in confined facilities
NASA Astrophysics Data System (ADS)
Togashi, F.; Baum, J. D.; Mestreau, E.; Löhner, R.; Sunshine, D.
2010-10-01
The objective of this research effort was to investigate the quasi-steady flow field produced by explosives in confined facilities. In this effort we modeled tests in which a high explosive (HE) cylindrical charge was hung in the center of a room and detonated. The HEs used for the tests were C-4 and AFX 757. While C-4 is just slightly under-oxidized and is typically modeled as an ideal explosive, AFX 757 includes a significant percentage of aluminum particles, so long-time afterburning and energy release must be considered. The Lawrence Livermore National Laboratory (LLNL)-produced thermo-chemical equilibrium algorithm, “Cheetah”, was used to estimate the remaining burnable detonation products. From these remaining species, the afterburning energy was computed and added to the flow field. Computations of the detonation and afterburn of two HEs in the confined multi-room facility were performed. The results demonstrate excellent agreement with available experimental data in terms of blast wave time of arrival, peak shock amplitude, reverberation, and total impulse (and hence total energy release, via either the detonation or afterburn processes).
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rodgers, Arthur J.; Dreger, Douglas S.; Pitarka, Arben
We performed three-dimensional (3D) anelastic ground motion simulations of the South Napa earthquake to investigate the performance of different finite rupture models and the effects of 3D structure on the observed wavefield. We considered rupture models reported by Dreger et al. (2015), Ji et al., (2015), Wei et al. (2015) and Melgar et al. (2015). We used the SW4 anelastic finite difference code developed at Lawrence Livermore National Laboratory (Petersson and Sjogreen, 2013) and distributed by the Computational Infrastructure for Geodynamics. This code can compute the seismic response for fully 3D sub-surface models, including surface topography and linear anelasticity. We use the 3D geologic/seismic model of the San Francisco Bay Area developed by the United States Geological Survey (Aagaard et al., 2008, 2010). Evaluation of earlier versions of this model indicated that the structure can reproduce main features of observed waveforms from moderate earthquakes (Rodgers et al., 2008; Kim et al., 2010). Simulations were performed for a domain covering local distances (< 25 km) and at a resolution providing simulated ground motions valid to 1 Hz.
HLYWD: a program for post-processing data files to generate selected plots or time-lapse graphics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Munro, J.K. Jr.
1980-05-01
The program HLYWD is a post-processor of output files generated by large plasma simulation computations or of data files containing a time sequence of plasma diagnostics. It is intended to be used in a production mode for either type of application; i.e., it allows one to generate, along with the graphics sequence, segments containing a title, credits to those who performed the work, text to describe the graphics, and acknowledgement of the funding agency. The current version is designed to generate 3D plots and allows one to select the type of display (linear or semi-log scales), the choice of normalization of function values for display purposes, the viewing perspective, and an option to allow continuous rotations of surfaces. This program was developed with the intention of being relatively easy to use, reasonably flexible, and requiring a minimum investment of the user's time. It uses the TV80 library of graphics software and ORDERLIB system software on the CDC 7600 at the National Magnetic Fusion Energy Computing Center at Lawrence Livermore Laboratory in California.
NASA Astrophysics Data System (ADS)
Lindsey, Rebecca; Goldman, Nir; Fried, Laurence
2017-06-01
Atomistic modeling of chemistry at extreme conditions remains a challenge, despite continuing advances in computing resources and simulation tools. While first principles methods provide a powerful predictive tool, the time and length scales associated with chemistry at extreme conditions (ns and μm, respectively) largely preclude extension of such models to molecular dynamics. In this work, we develop a simulation approach that retains the accuracy of density functional theory (DFT) while decreasing computational effort by several orders of magnitude. We generate n-body descriptions for atomic interactions by mapping forces arising from short DFT trajectories onto simple Chebyshev polynomial series. We examine the importance of including greater than 2-body interactions, model transferability to different state points, and discuss approaches to ensure smooth and reasonable model shape outside of the distance domain sampled by the DFT training set. This work was performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344.
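The force-matching idea described above, mapping reference forces onto a Chebyshev series by least squares, can be sketched as follows. The "DFT" forces here are a synthetic stand-in, and all fitting details (domain, degree) are assumptions for illustration, not the authors' implementation:

```python
import numpy as np

# Sample pair distances on an assumed domain [1, 4] Angstrom and
# generate synthetic reference forces standing in for DFT data.
rng = np.random.default_rng(0)
r = rng.uniform(1.0, 4.0, size=200)
f_ref = 12.0 / r**7 - 6.0 / r**4          # synthetic pairwise force

# Map r onto [-1, 1], the natural domain of Chebyshev polynomials,
# then fit a degree-8 Chebyshev series to the forces by least squares.
x = 2.0 * (r - 1.0) / (4.0 - 1.0) - 1.0
coeffs = np.polynomial.chebyshev.chebfit(x, f_ref, deg=8)
f_model = np.polynomial.chebyshev.chebval(x, coeffs)

rms_err = np.sqrt(np.mean((f_model - f_ref) ** 2))
```

Because Chebyshev evaluation is just a short recurrence, the fitted model costs orders of magnitude less per force call than the DFT it was trained on, which is the speedup the abstract describes.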
Pichler, Peter; Mazanek, Michael; Dusberger, Frederico; Weilnböck, Lisa; Huber, Christian G; Stingl, Christoph; Luider, Theo M; Straube, Werner L; Köcher, Thomas; Mechtler, Karl
2012-11-02
While the performance of liquid chromatography (LC) and mass spectrometry (MS) instrumentation continues to increase, applications such as analyses of complete or near-complete proteomes and quantitative studies require constant and optimal system performance. For this reason, research laboratories and core facilities alike are recommended to implement quality control (QC) measures as part of their routine workflows. Many laboratories perform sporadic quality control checks. However, successive and systematic longitudinal monitoring of system performance would be facilitated by dedicated automatic or semiautomatic software solutions that aid an effortless analysis and display of QC metrics over time. We present the software package SIMPATIQCO (SIMPle AuTomatIc Quality COntrol) designed for evaluation of data from LTQ Orbitrap, Q-Exactive, LTQ FT, and LTQ instruments. A centralized SIMPATIQCO server can process QC data from multiple instruments. The software calculates QC metrics supervising every step of data acquisition from LC and electrospray to MS. For each QC metric the software learns the range indicating adequate system performance from the uploaded data using robust statistics. Results are stored in a database and can be displayed in a comfortable manner from any computer in the laboratory via a web browser. QC data can be monitored for individual LC runs as well as plotted over time. SIMPATIQCO thus assists the longitudinal monitoring of important QC metrics such as peptide elution times, peak widths, intensities, total ion current (TIC) as well as sensitivity, and overall LC-MS system performance; in this way the software also helps identify potential problems. The SIMPATIQCO software package is available free of charge.
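SIMPATIQCO is described as learning the range of adequate system performance from uploaded data using robust statistics. One common robust choice, sketched here as an assumption and not the tool's actual algorithm, is an acceptance band of median ± k scaled-MAD:

```python
import statistics

def robust_range(history, k=3.0):
    """Learn an acceptance band for a QC metric from past runs using
    the median and the median absolute deviation (MAD), both of which
    are insensitive to occasional outlier runs.  The factor 1.4826
    scales MAD to a standard-deviation equivalent for normal data."""
    med = statistics.median(history)
    mad = statistics.median(abs(x - med) for x in history)
    half_width = k * 1.4826 * mad
    return med - half_width, med + half_width

# Hypothetical history: median peptide peak widths (s) from past
# LC-MS runs, including one bad run that must not skew the band.
history = [14.8, 15.1, 15.0, 14.9, 15.3, 15.0, 38.2, 15.2]
lo, hi = robust_range(history)
ok = lo <= 15.4 <= hi            # normal run passes
flagged = not (lo <= 38.2 <= hi)  # degraded run is flagged
```

A mean-and-standard-deviation band computed from the same history would be dragged wide open by the 38.2 s run; the median/MAD band is why robust statistics are preferred for longitudinal QC.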
Urinary Amino Acid Analysis: A Comparison of iTRAQ®-LC-MS/MS, GC-MS, and Amino Acid Analyzer
Kaspar, Hannelore; Dettmer, Katja; Chan, Queenie; Daniels, Scott; Nimkar, Subodh; Daviglus, Martha L.; Stamler, Jeremiah; Elliott, Paul; Oefner, Peter J.
2009-01-01
Urinary amino acid analysis is typically done by cation-exchange chromatography followed by post-column derivatization with ninhydrin and UV detection. This method lacks throughput and specificity. Two recently introduced stable isotope ratio mass spectrometric methods promise to overcome those shortcomings. Using two blinded sets of urine replicates and a certified amino acid standard, we compared the precision and accuracy of gas chromatography/mass spectrometry (GC-MS) and liquid chromatography-tandem mass spectrometry (LC-MS/MS) of propyl chloroformate and iTRAQ® derivatized amino acids, respectively, to conventional amino acid analysis. The GC-MS method builds on the direct derivatization of amino acids in diluted urine with propyl chloroformate, GC separation and mass spectrometric quantification of derivatives using stable isotope labeled standards. The LC-MS/MS method requires prior urinary protein precipitation followed by labeling of urinary and standard amino acids with iTRAQ® tags containing different cleavable reporter ions distinguishable by MS/MS fragmentation. Means and standard deviations of percent technical error (%TE) computed for 20 amino acids determined by amino acid analyzer, GC-MS, and iTRAQ®-LC-MS/MS analyses of 33 duplicate and triplicate urine specimens were 7.27±5.22, 21.18±10.94, and 18.34±14.67, respectively. Corresponding values for 13 amino acids determined in a second batch of 144 urine specimens measured in duplicate or triplicate were 8.39±5.35, 6.23±3.84, and 35.37±29.42. Both GC-MS and iTRAQ®-LC-MS/MS are suited for high-throughput amino acid analysis, with the former offering at present higher reproducibility and completely automated sample pretreatment, while the latter covers more amino acids and related amines. PMID:19481989
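The %TE metric for duplicate measurements is commonly computed with a Dahlberg-style formula; the paper's exact formula is not given in the abstract, so the definition below is an assumption, and the measurements are hypothetical:

```python
import math

def percent_technical_error(pairs):
    """Percent technical error for duplicate measurements, using the
    common Dahlberg-style definition:
        %TE = 100 * sqrt(sum(d_i^2) / (2N)) / mean
    where d_i is the within-pair difference and the mean is taken
    over all 2N individual measurements."""
    n = len(pairs)
    sq = sum((a - b) ** 2 for a, b in pairs)
    mean = sum(a + b for a, b in pairs) / (2 * n)
    return 100.0 * math.sqrt(sq / (2 * n)) / mean

# Hypothetical duplicate glycine measurements (umol/L) in three urines
pairs = [(102.0, 98.0), (55.0, 57.0), (210.0, 205.0)]
te = percent_technical_error(pairs)
```

Lower %TE means better run-to-run reproducibility, which is how the study ranks the amino acid analyzer against the GC-MS and iTRAQ-LC-MS/MS methods.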
NASA Astrophysics Data System (ADS)
Compton, Duane C.; Snapp, Robert R.
2007-09-01
TWiGS (two-dimensional wavelet transform with generalized cross validation and soft thresholding) is a novel algorithm for denoising liquid chromatography-mass spectrometry (LC-MS) data for use in "shot-gun" proteomics. Proteomics, the study of all proteins in an organism, is an emerging field that has already proven successful for drug and disease discovery in humans. There are a number of constraints that limit the effectiveness of LC-MS for shot-gun proteomics, where the chemical signals are typically weak, and data sets are computationally large. Most algorithms suffer greatly from a researcher driven bias, making the results irreproducible and unusable by other laboratories. We thus introduce a new algorithm, TWiGS, that removes electrical (additive white) and chemical noise from LC-MS data sets. TWiGS is developed to be a true two-dimensional algorithm, which operates in the time-frequency domain, and minimizes the amount of researcher bias. It is based on the traditional discrete wavelet transform (DWT), which allows for fast and reproducible analysis. The separable two-dimensional DWT decomposition is paired with generalized cross validation and soft thresholding. The choice among the Haar, Coiflet-6, and Daubechies-4 wavelets and the number of decomposition levels are determined based on observed experimental results. Using a synthetic LC-MS data model, TWiGS accurately retains key characteristics of the peaks in both the time and m/z domain, and can detect peaks from noise of the same intensity. TWiGS is applied to angiotensin I and II samples run on an LC-ESI-TOF-MS (liquid chromatography electrospray ionization time-of-flight mass spectrometer) to demonstrate its utility for the detection of low-lying peaks obscured by noise.
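A minimal stand-in for the core TWiGS step, a separable 2D DWT with soft thresholding of the detail subbands, can be sketched with a one-level Haar transform. The real algorithm uses multiple levels, other wavelets, and a threshold chosen by generalized cross validation; here the threshold and data are illustrative:

```python
import numpy as np

def soft(x, t):
    """Soft thresholding: shrink coefficients toward zero by t."""
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def haar2d_denoise(img, threshold):
    """One-level separable 2D Haar transform, soft-threshold the three
    detail subbands (LH, HL, HH), and invert.  The approximation (LL)
    subband is left untouched.  Requires even image dimensions."""
    a = (img[0::2, :] + img[1::2, :]) / 2.0   # row averages
    d = (img[0::2, :] - img[1::2, :]) / 2.0   # row differences
    ll = (a[:, 0::2] + a[:, 1::2]) / 2.0
    lh = (a[:, 0::2] - a[:, 1::2]) / 2.0
    hl = (d[:, 0::2] + d[:, 1::2]) / 2.0
    hh = (d[:, 0::2] - d[:, 1::2]) / 2.0
    lh, hl, hh = soft(lh, threshold), soft(hl, threshold), soft(hh, threshold)
    # invert: columns first, then rows
    a = np.empty_like(img[0::2, :], dtype=float)
    a[:, 0::2], a[:, 1::2] = ll + lh, ll - lh
    d = np.empty_like(a)
    d[:, 0::2], d[:, 1::2] = hl + hh, hl - hh
    out = np.empty(img.shape, dtype=float)
    out[0::2, :], out[1::2, :] = a + d, a - d
    return out

# Synthetic "spectrum": one strong peak plus additive white noise
rng = np.random.default_rng(1)
clean = np.zeros((64, 64)); clean[30:34, 30:34] = 10.0
noisy = clean + rng.normal(0.0, 0.5, clean.shape)
denoised = haar2d_denoise(noisy, threshold=0.75)
```

Soft thresholding zeroes small detail coefficients (mostly noise) while shrinking large ones (mostly signal) only slightly, which is why the peak survives while the background flattens.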
A short course on measure and probability theories
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pebay, Philippe Pierre
2004-02-01
This brief Introduction to Measure Theory, and its applications to Probabilities, corresponds to the lecture notes of a seminar series given at Sandia National Laboratories in Livermore, during the spring of 2003. The goal of these seminars was to provide a minimal background to Computational Combustion scientists interested in using more advanced stochastic concepts and methods, e.g., in the context of uncertainty quantification. Indeed, most mechanical engineering curricula do not provide students with formal training in the field of probability, and even less in measure theory. However, stochastic methods have been used more and more extensively in the past decade, and have provided increasingly successful computational tools. Scientists at the Combustion Research Facility of Sandia National Laboratories have been using computational stochastic methods for years. Addressing more and more complex applications, and facing difficult problems that arose in applications, showed the need for a better understanding of theoretical foundations. This is why the seminar series was launched, and these notes summarize most of the concepts which have been discussed. The goal of the seminars was to bring a group of mechanical engineers and computational combustion scientists to a full understanding of N. Wiener's polynomial chaos theory. Therefore, these lecture notes are built along those lines, and are not intended to be exhaustive. In particular, the author welcomes any comments or criticisms.
Goudey, Benjamin; Abedini, Mani; Hopper, John L; Inouye, Michael; Makalic, Enes; Schmidt, Daniel F; Wagner, John; Zhou, Zeyu; Zobel, Justin; Reumann, Matthias
2015-01-01
Genome-wide association studies (GWAS) are a common approach for systematic discovery of single nucleotide polymorphisms (SNPs) associated with a given disease. The univariate analysis approaches commonly employed may miss important SNP associations that only appear through multivariate analysis in complex diseases. However, multivariate SNP analysis is currently limited by its inherent computational complexity. In this work, we present a computational framework that harnesses supercomputers. Based on our results, we estimate that a three-way interaction analysis of 1.1 million-SNP GWAS data would require over 5.8 years on the full "Avoca" IBM Blue Gene/Q installation at the Victorian Life Sciences Computation Initiative. This is hundreds of times faster than estimates for other CPU-based methods and four times faster than runtimes estimated for GPU methods, indicating how the improvement in the level of hardware applied to interaction analysis may alter the types of analysis that can be performed. Furthermore, the same analysis would take under 3 months on the currently largest IBM Blue Gene/Q supercomputer, "Sequoia" at the Lawrence Livermore National Laboratory, assuming linear scaling is maintained as our results suggest. Given that the implementation used in this study can be further optimised, this runtime means it is becoming feasible to carry out exhaustive analysis of higher order interaction studies on large modern GWAS.
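The scale of the problem follows from simple combinatorics: an exhaustive k-way analysis must test C(n, k) SNP combinations. A sketch of the arithmetic, where the aggregate test rate is a back-calculated illustrative figure and not a number reported by the study:

```python
from math import comb

n_snps = 1_100_000
triples = comb(n_snps, 3)  # ~2.2e17 candidate three-way interactions

# Hypothetical machine-wide test rate (tests/second), chosen only to
# illustrate how a multi-year wall time arises from the combination count.
rate = 1.2e9
seconds = triples / rate
years = seconds / (365.25 * 24 * 3600)  # on the order of several years
```

The cubic growth of C(n, 3) is why each hardware generation changes which interaction orders are feasible to search exhaustively.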
DOE Office of Scientific and Technical Information (OSTI.GOV)
Althouse, P.; McKannay, R. H.
This was a collaborative effort between Lawrence Livermore National Security, LLC as manager and operator of Lawrence Livermore National Laboratory (LLNL) and ISOFLEX USA (ISOFLEX) to 1) develop and test a prototype waste destruction system ("System") using AC plasma torch technology to break down and drastically reduce the volume of carbon-14 (C-14) contaminated medical laboratory wastes while satisfying all environmental regulations, and 2) develop and demonstrate methods for recovering 99%+ of the carbon, including the C-14, allowing for possible re-use as a tagging and labeling tool in the biomedical industry.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dingell, J.D.
1991-02-01
The Department of Energy's (DOE) Lawrence Livermore National Laboratory, located in Livermore, California, generates and controls large numbers of classified documents associated with the research and testing of nuclear weapons. Concern has been raised about the potential for espionage at the laboratory and the national security implications of classified documents being stolen. This paper determines the extent of missing classified documents at the laboratory and assesses the adequacy of accountability over classified documents in the laboratory's custody. Audit coverage was limited to the approximately 600,000 secret documents in the laboratory's custody. The adequacy of DOE's oversight of the laboratory's secret document control program was also assessed.
322-R2U2 Engineering Assessment - August 2015
DOE Office of Scientific and Technical Information (OSTI.GOV)
Abri, M.; Griffin, D.
This Engineering Assessment and Certification of Integrity of retention tank system 322-R2 has been prepared for tank systems that store and neutralize hazardous waste and have secondary containment. The regulations require that this assessment be completed periodically and certified by an independent, qualified, California-registered professional engineer. Abri Environmental Engineering performed an inspection of the 322-R2 tank system at the Lawrence Livermore National Laboratory (LLNL) in Livermore, CA. Mr. William W. Moore, P.E., conducted this inspection on March 16, 2015. Mr. Moore is a California Registered Civil Engineer with extensive experience in civil engineering and hazardous waste management.
Rethinking Approaches to Strategic Stability in the 21st Century
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rose, Brian
Lawrence Livermore National Laboratory (LLNL) hosted a two-day conference on rethinking approaches to strategic stability in the 21st century on October 20-21, 2016 in Livermore, CA. The conference was jointly convened by Lawrence Livermore, Los Alamos, and Sandia National Laboratories, and was held in partnership with the United States Department of State’s Bureau of Arms Control, Verification and Compliance. The conference took place at LLNL’s Center for Global Security Research (CGSR) and included a range of representatives from U.S. government, academic, and private institutions, as well as representatives from U.S. allies in Europe and Asia. The following summary covers topics and discussions from each of the panels. It is not intended to capture every point in detail, but seeks to outline the range of views on these complex and inter-related issues while providing a general overview of the panel topics and discussions that took place. The conference was held under the Chatham House rule and does not attribute any remarks to any specific individual or institution. The views reflected in this report do not represent the United States Government, Department of State, or the national laboratories.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kilmer, J.
Various Department of Energy Orders incorporate, by reference, health and safety regulations promulgated by the Occupational Safety and Health Administration (OSHA). One of the OSHA regulations, 29 CFR 1910.120, Hazardous Waste Operations and Emergency Response, requires that site safety plans be written for activities such as those covered by work plans for Site 300 environmental investigations. Based upon available data, this Site Safety Plan (Plan) for environmental restoration has been prepared specifically for the Lawrence Livermore National Laboratory Site 300, located approximately 15 miles east of Livermore, California. As additional facts, monitoring data, or analytical data on hazards are provided, this Plan may need to be modified. It is the responsibility of the Environmental Restoration Program and Division (ERD) Site Safety Officer (SSO), with the assistance of Hazards Control, to evaluate data which may impact health and safety during these activities and to modify the Plan as appropriate. This Plan is not 'cast-in-concrete.' The SSO shall have the authority, with the concurrence of Hazards Control, to institute any change to maintain health and safety protection for workers at Site 300.
Lawrence Livermore National Laboratory Environmental Report 2012
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jones, Henry E.; Armstrong, Dave; Blake, Rick G.
Lawrence Livermore National Laboratory (LLNL) is a premier research laboratory that is part of the National Nuclear Security Administration (NNSA) within the U.S. Department of Energy (DOE). As a national security laboratory, LLNL is responsible for ensuring that the nation’s nuclear weapons remain safe, secure, and reliable. The Laboratory also meets other pressing national security needs, including countering the proliferation of weapons of mass destruction and strengthening homeland security, and conducting major research in atmospheric, earth, and energy sciences; bioscience and biotechnology; and engineering, basic science, and advanced technology. The Laboratory is managed and operated by Lawrence Livermore National Security, LLC (LLNS), and serves as a scientific resource to the U.S. government and a partner to industry and academia. LLNL operations have the potential to release a variety of constituents into the environment via atmospheric, surface water, and groundwater pathways. Some of the constituents, such as particles from diesel engines, are common at many types of facilities while others, such as radionuclides, are unique to research facilities like LLNL. All releases are highly regulated and carefully monitored. LLNL strives to maintain a safe, secure and efficient operational environment for its employees and neighboring communities. Experts in environment, safety and health (ES&H) support all Laboratory activities. LLNL’s radiological control program ensures that radiological exposures and releases are reduced to as low as reasonably achievable to protect the health and safety of its employees, contractors, the public, and the environment. LLNL is committed to enhancing its environmental stewardship and managing the impacts its operations may have on the environment through a formal Environmental Management System.
The Laboratory encourages the public to participate in matters related to the Laboratory’s environmental impact on the community by soliciting citizens’ input on matters of significant public interest and through various communications. The Laboratory also provides public access to information on its ES&H activities. LLNL consists of two sites—an urban site in Livermore, California, referred to as the “Livermore Site,” which occupies 1.3 square miles; and a rural Experimental Test Site, referred to as “Site 300,” near Tracy, California, which occupies 10.9 square miles. In 2012 the Laboratory had a staff of approximately 7,000.
Lawrence Livermore National Laboratory Environmental Report 2013
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jones, H. E.; Bertoldo, N. A.; Blake, R. G.
Lawrence Livermore National Laboratory (LLNL) is a premier research laboratory that is part of the National Nuclear Security Administration (NNSA) within the U.S. Department of Energy (DOE). As a national security laboratory, LLNL is responsible for ensuring that the nation’s nuclear weapons remain safe, secure, and reliable. The Laboratory also meets other pressing national security needs, including countering the proliferation of weapons of mass destruction and strengthening homeland security, and conducting major research in atmospheric, earth, and energy sciences; bioscience and biotechnology; and engineering, basic science, and advanced technology. The Laboratory is managed and operated by Lawrence Livermore National Security, LLC (LLNS), and serves as a scientific resource to the U.S. government and a partner to industry and academia. LLNL operations have the potential to release a variety of constituents into the environment via atmospheric, surface water, and groundwater pathways. Some of the constituents, such as particles from diesel engines, are common at many types of facilities while others, such as radionuclides, are unique to research facilities like LLNL. All releases are highly regulated and carefully monitored. LLNL strives to maintain a safe, secure and efficient operational environment for its employees and neighboring communities. Experts in environment, safety and health (ES&H) support all Laboratory activities. LLNL’s radiological control program ensures that radiological exposures and releases are reduced to as low as reasonably achievable to protect the health and safety of its employees, contractors, the public, and the environment. LLNL is committed to enhancing its environmental stewardship and managing the impacts its operations may have on the environment through a formal Environmental Management System.
The Laboratory encourages the public to participate in matters related to the Laboratory’s environmental impact on the community by soliciting citizens’ input on matters of significant public interest and through various communications. The Laboratory also provides public access to information on its ES&H activities. LLNL consists of two sites—an urban site in Livermore, California, referred to as the “Livermore Site,” which occupies 1.3 square miles; and a rural Experimental Test Site, referred to as “Site 300,” near Tracy, California, which occupies 10.9 square miles. In 2013 the Laboratory had a staff of approximately 6,300.
Fast-responding liquid crystal light-valve technology for color-sequential display applications
NASA Astrophysics Data System (ADS)
Janssen, Peter J.; Konovalov, Victor A.; Muravski, Anatoli A.; Yakovenko, Sergei Y.
1996-04-01
A color sequential projection system has some distinct advantages over conventional systems which make it uniquely suitable for consumer TV as well as high-performance professional applications such as computer monitors and electronic cinema. A fast-responding light-valve is, clearly, essential for a well-performing system. Response speed of transmissive LC light-valves has been marginal thus far for good color rendition. Recently, Sevchenko Institute has made some very fast reflective LC cells which were evaluated at Philips Labs. These devices showed sub-millisecond large-signal response times, even at room temperature, and produced good color in a projector emulation testbed. In our presentation we describe our highly efficient color sequential projector and demonstrate its operation on video tape. Next we discuss light-valve requirements and reflective light-valve test results.
Accelerated wavefront determination technique for optical imaging through scattering medium
NASA Astrophysics Data System (ADS)
He, Hexiang; Wong, Kam Sing
2016-03-01
Wavefront shaping applied to scattered light is a promising optical imaging method in biological systems. Normally, the optimized modulation is obtained by hardware iteration between a liquid-crystal spatial light modulator (LC-SLM) and a CCD. Here we introduce an improved method for this optimization process. The core of the proposed method is to first detect the disturbed wavefront, and then to calculate the modulation phase pattern by computer simulation. In particular, the phase retrieval method together with phase conjugation is most effective. In this way, the LC-SLM-based system can complete the wavefront optimization and image restoration within several seconds, which is two orders of magnitude faster than the conventional technique. The experimental results show good imaging quality and may contribute to real-time image recovery in scattering media.
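The payoff of computing the modulation rather than iterating it can be sketched with a toy scattering model: once the complex transmission coefficients from each SLM segment to the target are known (here randomly generated), the optimal phase pattern is simply their conjugate. All names and numbers below are illustrative assumptions, not the authors' implementation.

```python
import cmath
import random

random.seed(0)
N = 64  # number of SLM segments (illustrative)

# Random complex coefficients standing in for the scattering medium's
# response from each segment to the desired focus point.
t = [random.random() * cmath.exp(2j * cmath.pi * random.random())
     for _ in range(N)]

def focus_intensity(phases):
    """Intensity at the target when segment i carries phase phases[i]."""
    field = sum(ti * cmath.exp(1j * p) for ti, p in zip(t, phases))
    return abs(field) ** 2

flat = [0.0] * N                            # unshaped wavefront
conjugate = [-cmath.phase(ti) for ti in t]  # computed phase-conjugate pattern
```

With the conjugate pattern every contribution arrives in phase, so the intensity equals (Σ|t_i|)², far above the unshaped case; computing this pattern once replaces thousands of camera-in-the-loop iterations.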
LLNL Mercury Project Trinity Open Science Final Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brantley, Patrick; Dawson, Shawn; McKinley, Scott
2016-04-20
The Mercury Monte Carlo particle transport code developed at Lawrence Livermore National Laboratory (LLNL) is used to simulate the transport of radiation through urban environments. These challenging calculations include complicated geometries and require significant computational resources to complete. As a result, a question arises as to the level of convergence of the calculations with Monte Carlo simulation particle count. In the Trinity Open Science calculations, one main focus was to investigate convergence of the relevant simulation quantities with Monte Carlo particle count to assess the current simulation methodology. Both for this application space and for more general applicability, we also investigated the impact of code algorithms on parallel scaling on the Trinity machine, as well as the utilization of the Trinity DataWarp burst buffer technology in Mercury via the LLNL Scalable Checkpoint/Restart (SCR) library.
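The convergence question rests on the standard Monte Carlo scaling argument: for independent samples, the statistical error of a tally shrinks roughly as 1/√n. A toy sketch estimating a known mean rather than a transport tally; mc_estimate is a hypothetical helper, not Mercury code.

```python
import random

random.seed(1)

def mc_estimate(n):
    """Monte Carlo estimate of E[x] for x ~ U(0,1); the exact answer is 0.5."""
    return sum(random.random() for _ in range(n)) / n

# Assessing convergence means sweeping the particle count and watching the
# spread of the tallies tighten as roughly 1/sqrt(n): 100x more particles
# buys about one more decimal digit of statistical precision.
estimates = {n: mc_estimate(n) for n in (100, 10_000, 1_000_000)}
```

This slow square-root convergence is why urban-scale transport runs need both very large particle counts and the supercomputer time to push them.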
LLNL Scientists Use NERSC to Advance Global Aerosol Simulations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bergmann, D J; Chuang, C; Rotman, D
2004-10-13
While "greenhouse gases" have been the focus of climate change research for a number of years, DOE's "Aerosol Initiative" is now examining how aerosols (small particles of approximately micron size) affect the climate on both a global and regional scale. Scientists in the Atmospheric Science Division at Lawrence Livermore National Laboratory (LLNL) are using NERSC's IBM supercomputer and LLNL's IMPACT (atmospheric chemistry) model to perform simulations showing the historic effects of sulfur aerosols at a finer spatial resolution than ever before. Simulations were carried out for five decades, from the 1950s through the 1990s. The results clearly show the effects of the changing global pattern of sulfur emissions. Whereas in 1950 the United States emitted 41 percent of the world's sulfur aerosols, this figure had dropped to 15 percent by 1990, due to conservation and anti-pollution policies. By contrast, the fraction of total sulfur emissions of European origin has only dropped by a factor of 2, and the Asian emission fraction jumped sixfold during the same time, from 7 percent in 1950 to 44 percent in 1990. Under a special allocation of computing time provided by the Office of Science INCITE (Innovative and Novel Computational Impact on Theory and Experiment) program, Dan Bergmann, working with a team of LLNL scientists including Cathy Chuang, Philip Cameron-Smith, and Bala Govindasamy, was able to carry out a large number of calculations during the past month, making the aerosol project one of the largest users of NERSC resources. The applications ran on 128 and 256 processors. The objective was to assess the effects of anthropogenic (man-made) sulfate aerosols. The IMPACT model calculates the rate at which SO2 (a gas emitted by industrial activity) is oxidized and forms particles known as sulfate aerosols. These particles have a short lifespan in the atmosphere, often washing out in about a week.
This means that their effects on climate tend to be more regional, occurring near the area where the SO2 is emitted. To accurately study these regional effects, Bergmann needed to run the simulations at a finer horizontal resolution, as the coarser resolution (typically 300 km by 300 km) of other climate models is insufficient for studying changes on a regional scale. Livermore's use of CAM3, the Community Atmospheric Model, a high-resolution climate model developed at NCAR (with collaboration from DOE), allows a 100 km by 100 km grid to be applied. NERSC's terascale computing capability provided the needed computational horsepower to run the application at the finer level.
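As a back-of-envelope sense of the cost (an illustrative sketch, not figures from the project): refining the horizontal grid from 300 km to 100 km spacing alone multiplies the number of grid columns by nine, before accounting for the shorter time step a finer grid typically requires.

```python
import math

earth_area_km2 = 4 * math.pi * 6371.0 ** 2   # ~5.1e8 km^2 of surface

cells_coarse = earth_area_km2 / (300 * 300)  # ~300 km grid boxes
cells_fine = earth_area_km2 / (100 * 100)    # ~100 km grid boxes
ratio = cells_fine / cells_coarse            # 9x more columns to integrate
```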
FY10 Engineering Innovations, Research and Technology Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lane, M A; Aceves, S M; Paulson, C N
This report summarizes key research, development, and technology advancements in Lawrence Livermore National Laboratory's Engineering Directorate for FY2010. These efforts exemplify Engineering's nearly 60-year history of developing and applying the technology innovations needed for the Laboratory's national security missions, and embody Engineering's mission to "Enable program success today and ensure the Laboratory's vitality tomorrow." Leading off the report is a section featuring compelling engineering innovations. These innovations range from advanced hydrogen storage that enables clean vehicles, to new nuclear material detection technologies, to a landmine detection system using ultra-wideband ground-penetrating radar. Many have been recognized with R&D Magazine's prestigious R&D 100 Award; all are examples of the forward-looking application of innovative engineering to pressing national problems and challenging customer requirements. Engineering's capability development strategy includes both fundamental research and technology development. Engineering research creates the competencies of the future where discovery-class groundwork is required. Our technology development (or reduction to practice) efforts enable many of the research breakthroughs across the Laboratory to translate from the world of basic research to the national security missions of the Laboratory. This portfolio approach produces new and advanced technological capabilities, and is a unique component of the value proposition of the Lawrence Livermore Laboratory. The balance of the report highlights this work in research and technology, organized into thematic technical areas: Computational Engineering; Micro/Nano-Devices and Structures; Measurement Technologies; Engineering Systems for Knowledge Discovery; and Energy Manipulation.
Our investments in these areas not only serve the known programmatic requirements of today and tomorrow, but also anticipate the breakthrough engineering innovations that will be needed in the future.
Zoppetti, Nicola; Andreuccetti, Daniele; Bellieni, Carlo; Bogi, Andrea; Pinto, Iole
2011-12-01
Portable - or "laptop" - computers (LCs) are widely and increasingly used all over the world. Since LCs are often used in tight contact with the body, even by pregnant women, fetal exposure to the low frequency magnetic fields generated by these units can occur. LC emissions are usually characterized by complex waveforms and are often generated by the main AC power supply (when connected) and by the display power supply sub-system. In the present study, low frequency magnetic field emissions were measured for a set of five models of portable computers. For each of them, the magnetic flux density was characterized not just in terms of field amplitude, but also of the so-called "weighted peak" (WP) index, introduced in the 2003 ICNIRP Statement on complex waveforms and confirmed in the 2010 ICNIRP Guidelines for low frequency fields. For the LC model presenting the highest emission, a deeper analysis was also carried out, using numerical dosimetry techniques to calculate internal quantities (current density and in-situ electric field) with reference to a digital body model of a pregnant woman. Since internal quantities have complex waveforms too, the concept of the WP index was extended to them, considering the ICNIRP basic restrictions defined in the 1998 Guidelines for the current density and in the 2010 Guidelines for the in-situ electric field. Induced quantities and WP indexes were computed using an appropriate original formulation of the well-known Scalar Potential Finite Difference (SPFD) numerical method for electromagnetic dosimetry in quasi-static conditions. Copyright © 2011 Elsevier Ltd. All rights reserved.
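The weighted-peak idea can be sketched for a multi-frequency field: each spectral component is scaled by the reciprocal of the exposure limit at its frequency, and the peak of the weighted sum is the compliance index (≤ 1 means compliant). This is a simplified amplitude-only sketch; the exposure_limit function and every number below are illustrative placeholders, and the real ICNIRP method also applies the weighting filter's phase response.

```python
import math

def exposure_limit(f_hz):
    """Hypothetical reference level EL(f) in tesla, for illustration only."""
    return 2.0e-4 / f_hz if f_hz > 0 else float("inf")

def weighted_peak_index(components, duration=1.0, steps=2000):
    """Amplitude-only sketch of a weighted-peak index.

    components: list of (amplitude_tesla, frequency_hz, phase_rad) tuples.
    Each component is scaled by 1/EL(f); the index is the peak of the
    weighted sum sampled over `duration` seconds.
    """
    peak = 0.0
    for k in range(steps):
        tt = duration * k / steps
        s = sum(a / exposure_limit(f) * math.cos(2 * math.pi * f * tt + p)
                for a, f, p in components)
        peak = max(peak, abs(s))
    return peak

# A 50 Hz fundamental plus a third harmonic (made-up amplitudes).
idx = weighted_peak_index([(1e-6, 50.0, 0.0), (2e-7, 150.0, 0.0)])
```

The point of the index is that it handles exactly the complex, multi-harmonic waveforms the study measured, where comparing a single amplitude against a single-frequency limit would be ill-defined.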
Børsting, M W; Qvist, K B; Brockmann, E; Vindeløv, J; Pedersen, T L; Vogensen, F K; Ardö, Y
2015-01-01
Lactococcus lactis strains depend on a proteolytic system for growth in milk to release essential AA from casein. The cleavage specificities of the cell envelope proteinase (CEP) can vary between strains and environments and whether the enzyme is released or bound to the cell wall. Thirty-eight Lc. lactis strains were grouped according to their CEP AA sequences and according to identified peptides after hydrolysis of milk. Finally, AA positions in the substrate binding region were suggested by the use of a new CEP template based on Streptococcus C5a CEP. Aligning the CEP AA sequences of 38 strains of Lc. lactis showed that 21 strains, which were previously classified as group d, could be subdivided into 3 groups. Independently, similar subgroupings were found based on comparison of the Lc. lactis CEP AA sequences and based on normalized quantity of identified peptides released from αS1-casein and β-casein. A model structure of Lc. lactis CEP based on the crystal structure of Streptococcus C5a CEP was used to investigate the AA positions in the substrate-binding region. New AA positions were suggested, which could be relevant for the cleavage specificity of CEP; however, these could only explain 2 out of 3 found subgroups. The third subgroup could be explained by 1 to 5 AA positions located opposite the substrate binding region. Copyright © 2015 American Dairy Science Association. Published by Elsevier Inc. All rights reserved.
On the physical origins of interaction-induced vibrational (hyper)polarizabilities.
Zaleśny, Robert; Garcia-Borràs, Marc; Góra, Robert W; Medved', Miroslav; Luis, Josep M
2016-08-10
This paper presents the results of a pioneering exploration of the physical origins of vibrational contributions to the interaction-induced electric properties of molecular complexes. In order to analyze the excess nuclear relaxation (hyper)polarizabilities, a new scheme was proposed which relies on the computationally efficient Bishop-Hasan-Kirtman method for determining the nuclear relaxation contributions to electric properties. The extension presented herein is general and can be used with any interaction-energy partitioning method. As an example, in this study we employed the variational-perturbational interaction-energy decomposition scheme (at the MP2/aug-cc-pVQZ level) and the extended transition state method, employing three exchange-correlation functionals (BLYP, LC-BLYP, and LC-BLYP-dDsC), to study the excess properties of the HCN dimer. It was observed that the first-order electrostatic contribution to the excess nuclear relaxation polarizability cancels out with the negative exchange repulsion term to a large extent, resulting in a positive value of Δα(nr) due to the contributions from the delocalization and dispersion terms. In the case of the excess nuclear relaxation first hyperpolarizability, the pattern of interaction contributions is very similar to that for Δα(nr), both in terms of sign and relative magnitude. Finally, our results show that the LC-BLYP and LC-BLYP-dDsC functionals, which yield smaller values of the orbital relaxation term than BLYP, are more successful in predicting excess properties.
Expectation Maximization and its Application in Modeling, Segmentation and Anomaly Detection
2008-05-01
…incomplete data problems. The incompleteness of the data may be due to missing data, censored distributions, etc. One such case is a… Estimation Techniques in Computer… Huiyan, Z., Yongfeng, C., Wen, Y. SAR Image Segmentation Using MPM Constrained Stochastic Relaxation. Civil Engineering
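Expectation maximization is concrete in the classic incomplete-data setting of a Gaussian mixture, where the missing data are the component labels. A minimal sketch for a two-component, equal-variance 1-D mixture; em_gmm_1d and all numbers are illustrative, not from the report.

```python
import math
import random

def em_gmm_1d(data, iters=50):
    """EM for a two-component 1-D Gaussian mixture (variance fixed at 1)."""
    mu1, mu2 = min(data), max(data)  # crude initialization
    w = 0.5                          # mixing weight of component 1
    for _ in range(iters):
        # E-step: responsibility of component 1 for each point.
        r = []
        for x in data:
            p1 = w * math.exp(-(x - mu1) ** 2 / 2)
            p2 = (1 - w) * math.exp(-(x - mu2) ** 2 / 2)
            r.append(p1 / (p1 + p2))
        # M-step: re-estimate weight and means from the responsibilities.
        n1 = sum(r)
        w = n1 / len(data)
        mu1 = sum(ri * x for ri, x in zip(r, data)) / n1
        mu2 = sum((1 - ri) * x for ri, x in zip(r, data)) / (len(data) - n1)
    return w, mu1, mu2

random.seed(2)
data = ([random.gauss(0.0, 1.0) for _ in range(300)] +
        [random.gauss(6.0, 1.0) for _ in range(300)])
w, mu1, mu2 = em_gmm_1d(data)
```

The same alternation of soft labeling (E-step) and parameter re-estimation (M-step) underlies the MPM-style image segmentation cited above.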
The Use of Help Options in Multimedia Listening Environments to Aid Language Learning: A Review
ERIC Educational Resources Information Center
Mohsen, Mohammed Ali
2016-01-01
This paper provides a comprehensive review on the use of help options (HOs) in the multimedia listening context to aid listening comprehension (LC) and improve incidental vocabulary learning. The paper also aims to synthesize the research findings obtained from the use of HOs in Computer-Assisted Language Learning (CALL) literature and reveals the…
Evaluating the ISDN line to deliver interactive multimedia experiences
DOE Office of Scientific and Technical Information (OSTI.GOV)
Michaels, D.K.
1994-05-06
We will use the 128 kilobit/sec ISDN connection from the Lawrence Livermore National Laboratory to the Livermore High School Math Learning Center to provide students there with interactive multimedia educational experiences. These experiences may consist of tutorials, exercises, and interactive puzzles to teach students' course material. We will determine if it is possible to store the multimedia files at LLNL and deliver them to the student machines via FTP as they are needed. An evaluation of the effect of the ISDN data rate is a substantial component of our research, and suggestions on how to best use the ISDN line in this capacity will be given.
Emergency Response Capability Baseline Needs Assessment Compliance Assessment
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sharry, John A.
2013-09-16
This document is the second of a two-part analysis of the emergency response capabilities of Lawrence Livermore National Laboratory. The first part, the 2013 Baseline Needs Assessment Requirements Document, established the minimum performance criteria necessary to meet mandatory requirements. This second part analyzes the performance of the Lawrence Livermore Laboratory Emergency Management Department against the contents of the Requirements Document. The document was prepared based on an extensive review of information contained in the 2009 BNA, the 2012 BNA document, a review of Emergency Planning Hazards Assessments, a review of building construction, occupancy, fire protection features, dispatch records, LLNL alarm system records, fire department training records, and fire department policies and procedures.
Livermore study says oil leaks not severe
DOE Office of Scientific and Technical Information (OSTI.GOV)
Patrick, L.
The Petroleum Marketers Association of America (PMAA), which is working to reform the federal Leaking Underground Storage Tank program, got some strong ammunition last month. A study that the Lawrence Livermore National Laboratory performed for the California State Water Resources Control Board has found that the environmental threat of leaks is not as severe as formerly thought. The study said: such leaks rarely jeopardize drinking water; fuel hydrocarbons have limited impacts on health, the environment, and groundwater; and cleanups often are done contrary to the knowledge and experience gained from prior remediations. As a result of the study, Gov. Pete Wilson ordered California cleanups halted at sites more than 250 feet from drinking water supplies.
Mosaic Transparent Armor System Final Report CRADA No. TC02162.0
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kuntz, J. D.; Breslin, M.
This was a collaborative effort between Lawrence Livermore National Security, LLC as manager and operator of Lawrence Livermore National Laboratory (LLNL) and The Protective Group, Inc. (TPG) to improve the performance of the mosaic transparent armor system (MTAS) for transparent armor applications, military and civilian. LLNL was to provide the unique MTAS technology and designs to TPG for innovative construction and ballistic testing of improvements needed for current and near-future application of the armor windows on vehicles and aircraft. The goal of the project was to advance the technology of MTAS to the point that these mosaic transparent windows would be introduced and commercially manufactured for military vehicles and aircraft.
Slurry Coating System Statement of Work and Specification
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chan, S. M.
2017-02-06
The Slurry Coating System will be used to coat crystals with a polymer to support Lawrence Livermore National Security, LLC (LLNS) research and development at Lawrence Livermore National Laboratory (LLNL). The crystals will be suspended in water in a kettle. A polymer solution is added, the temperature of the kettle is raised, and aggregates of the crystals and polymer form. The slurry is heated under vacuum to drive off the solvents and slowly cooled to room temperature while mixing. The resulting aggregates are then filtered and dried. The performance characteristics and fielding constraints define a unique set of requirements for a new system. This document presents the specifications and requirements for the system.
Teng, Y. G.; Berger, W. T.; Nesbitt, N. M.; ...
2015-07-27
Botulinum neurotoxins (BoNTs) are among the most potent biological toxins known to humans, and are classified as Category A bioterrorism agents by the Centers for Disease Control and Prevention (CDC). There are seven known BoNT serotypes (A-G) identified thus far in the literature. BoNTs have been shown to block neurotransmitter release by cleaving proteins of the soluble NSF attachment protein receptor (SNARE) complex. Disruption of the SNARE complex leads to motor neuron failure, which ultimately results in flaccid paralysis in humans and animals. Currently, there are no effective therapeutic treatments against the neurotoxin light chain (LC) after translocation into the cytosols of motor neurons. In this work, high-throughput virtual screening was employed to screen a library of commercially available compounds from the ZINC database against BoNT/A-LC. Among the hit compounds from the in silico screening, two lead compounds were identified and found to have potent inhibitory activity against BoNT/A-LC in vitro, as well as in Neuro-2a cells. A few analogues of the lead compounds were synthesized and their potency examined. One of these analogues showed enhanced activity relative to the lead compounds.
Optoelectronic stereoscopic device for diagnostics, treatment, and developing of binocular vision
NASA Astrophysics Data System (ADS)
Pautova, Larisa; Elkhov, Victor A.; Ovechkis, Yuri N.
2003-08-01
Operation of the device is based on alternating generation of pictures for the left and right eyes on the monitor screen. A controller sends pulses to the liquid-crystal glasses (LCG) so that the shutter for the left or right eye opens synchronously with the pictures. The device provides a switching frequency of more than 100 Hz, so flickering is absent. Thus, images are demonstrated separately to the left and right eyes in turn without the patient being aware of it, creating conditions of binocular perception close to natural ones without any additional separation of the visual fields. Coordination of the LC-cell transfer characteristic with the time parameters of the monitor screen has improved stereo image quality. A complicating problem of computer stereo imaging with LC glasses is the so-called 'ghosts': noise images that reach the blocked eye. We reduced their influence by adapting the stereo images to the phosphor and LC-cell characteristics. The device is intended for diagnostics and treatment of strabismus, amblyopia, and other binocular and stereoscopic vision impairments; for cultivating, training, and developing stereoscopic vision; for measurements of horizontal and vertical phoria, fusion reserves, and stereovision acuity; and for fixing the borders of central scotoma, as well as suppression scotoma in strabismus.
Smith, Kathleen S.
2005-01-01
This work evaluates the use of the biotic ligand model (BLM), an aquatic toxicity model, to predict toxic effects of metals on aquatic biota in areas underlain by different rock types. The chemical composition of water, soil, and sediment is largely derived from the composition of the underlying rock. Geologic source materials control key attributes of water chemistry that affect metal toxicity to aquatic biota, including: 1) potentially toxic elements, 2) alkalinity, 3) total dissolved solids, and 4) soluble major elements, such as Ca and Mg, which contribute to water hardness. Miller (2002) compiled chemical data for water samples collected in watersheds underlain by ten different rock types, and in a mineralized area in western Colorado. He found that each rock type has a unique range of water chemistry. In this study, the ten rock types were grouped into two general categories, igneous and sedimentary. Water collected in watersheds underlain by sedimentary rock has higher mean pH, alkalinity, and calcium concentrations than water collected in watersheds underlain by igneous rock. Water collected in the mineralized area had elevated concentrations of calcium and sulfate in addition to other chemical constituents. Miller's water-chemistry data were used in the BLM (computer program) to determine copper and zinc toxicity to Daphnia magna. Modeling results show that waters from watersheds underlain by different rock types have characteristic ranges of predicted LC50 values (a measurement of aquatic toxicity) for copper and zinc, with watersheds underlain by igneous rock having lower predicted LC50 values than watersheds underlain by sedimentary rock. Lower predicted LC50 values suggest that aquatic biota in watersheds underlain by igneous rock may be more vulnerable to copper and zinc inputs than aquatic biota in watersheds underlain by sedimentary rock.
For both copper and zinc, there is a trend of increasing predicted LC50 values with increasing dissolved organic carbon (DOC) concentrations. Predicted copper LC50 values are extremely sensitive to DOC concentrations, whereas alkalinity appears to have an influence on zinc toxicity at alkalinities in excess of about 100 mg/L CaCO3. These findings show promise for coupling the BLM (computer program) with measured water-chemistry data to predict metal toxicity to aquatic biota in different geologic settings and under different scenarios. This approach may ultimately be a useful tool for mine-site planning, mitigation and remediation strategies, and ecological risk assessment.
NASA Astrophysics Data System (ADS)
Aslan, N.; Koc-San, D.
2016-06-01
The main objectives of this study are (i) to calculate Land Surface Temperature (LST) from Landsat imagery, (ii) to determine urban heat island (UHI) effects from Landsat 7 ETM+ (June 5, 2001) and Landsat 8 OLI (June 17, 2014) imagery, and (iii) to examine the relationship between LST and different Land Use/Land Cover (LU/LC) types for the years 2001 and 2014. The study was carried out in the central districts of Antalya. Initially, brightness temperatures were retrieved and LST values were calculated from the Landsat thermal images. Then, LU/LC maps were created from Landsat pan-sharpened images using a Random Forest (RF) classifier. A Normalized Difference Vegetation Index (NDVI) image, the ASTER Global Digital Elevation Model (GDEM), and DMSP-OLS nighttime lights data were used as auxiliary data during the classification procedure. Finally, the UHI effect was determined and the LST values were compared with the LU/LC classes. The overall accuracies of the RF classification results were higher than 88% for both Landsat images. Over the 13-year interval, urban and industrial areas increased significantly. Maximum LST values were detected for the dry agriculture, urban, and bareland classes, while minimum LST values were detected for the vegetation and irrigated agriculture classes. The UHI effect was computed as 5.6 °C for 2001 and 6.8 °C for 2014. The validity of the results was assessed using MODIS/Terra LST and Emissivity data, and a high correlation was found between Landsat LST and MODIS LST (r² = 0.7 and r² = 0.9 for 2001 and 2014, respectively).
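The brightness-temperature and LST retrieval steps described in this abstract follow a standard single-channel workflow. A minimal sketch of that workflow, assuming Landsat 8 TIRS band 10 thermal constants and an illustrative NDVI-threshold emissivity model (the constants, thresholds, and function names here are assumptions for illustration, not values taken from the study):

```python
import math

# Landsat 8 TIRS band 10 thermal conversion constants (per the Landsat 8
# Data Users Handbook; treated here as illustrative assumptions).
K1 = 774.8853   # W / (m^2 * sr * um)
K2 = 1321.0789  # Kelvin

def brightness_temperature(radiance):
    """Top-of-atmosphere brightness temperature (K) from spectral radiance."""
    return K2 / math.log(K1 / radiance + 1.0)

def ndvi(nir, red):
    """Normalized Difference Vegetation Index from NIR and red reflectance."""
    return (nir - red) / (nir + red)

def emissivity(ndvi_value):
    """Simple NDVI-threshold emissivity estimate (illustrative values)."""
    if ndvi_value < 0.2:       # bare soil / built-up
        return 0.96
    if ndvi_value > 0.5:       # fully vegetated
        return 0.99
    # mixed pixel: interpolate via fractional vegetation cover
    pv = ((ndvi_value - 0.2) / (0.5 - 0.2)) ** 2
    return 0.96 + 0.03 * pv

def land_surface_temperature(radiance, nir, red, wavelength_um=10.895):
    """Emissivity-corrected LST (K) via the single-channel approximation."""
    bt = brightness_temperature(radiance)
    eps = emissivity(ndvi(nir, red))
    rho = 14380.0  # h*c/sigma in um*K
    return bt / (1.0 + (wavelength_um * bt / rho) * math.log(eps))
```

In practice these scalar functions would be applied per pixel over the calibrated Landsat bands before comparing LST statistics across LU/LC classes.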
Bit storage and bit flip operations in an electromechanical oscillator.
Mahboob, I; Yamaguchi, H
2008-05-01
The Parametron was first proposed as a logic-processing system almost 50 years ago. In this approach the two stable phases of an excited harmonic oscillator provide the basis for logic operations. Computer architectures based on LC oscillators were developed for this approach, but high power consumption and difficulties with integration meant that the Parametron was rendered obsolete by the transistor. Here we propose an approach to mechanical logic based on nanoelectromechanical systems that is a variation on the Parametron architecture and, as a first step towards a possible nanomechanical computer, we demonstrate both bit storage and bit flip operations.
The ASCI Network for SC 2000: Gigabyte Per Second Networking
DOE Office of Scientific and Technical Information (OSTI.GOV)
PRATT, THOMAS J.; NAEGLE, JOHN H.; MARTINEZ JR., LUIS G.
2001-11-01
This document highlights the DISCOM Distance Computing and Communication team's activities at the SC2000 supercomputing conference in Dallas, Texas. The conference is sponsored by the IEEE and ACM. Sandia's participation in the conference has now spanned a decade; for the last five years, Sandia National Laboratories, Los Alamos National Laboratory, and Lawrence Livermore National Laboratory have come together at the conference under the DOE's ASCI (Accelerated Strategic Computing Initiative) program to demonstrate ASCI's emerging capabilities in computational science and the combined expertise in high-performance computer science and communication networking developed within the program. At SC2000, DISCOM2 demonstrated a pre-standard implementation of 10 Gigabit Ethernet, the first gigabyte-per-second IP network data transfer application, and VPN technology that enabled a remote Distributed Resource Management tools demonstration. Additionally, a national OC48 POS network was constructed to support applications running between the show floor and home facilities. This network created the opportunity to test PSE's Parallel File Transfer Protocol (PFTP) across a network with speeds and distances similar to the then-proposed DISCOM WAN. SCinet at SC2000 showcased wireless networking, and the networking team had the opportunity to explore this emerging technology while on the booth. We also supported the production networking needs of the convention exhibit floor. This paper documents these accomplishments, discusses the details of the implementation, and describes how these demonstrations support DISCOM's overall strategies in high-performance computing and networking.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Beck, James B.
The National Security Office (NSO) newsletter's main highlight is the annual Strategic Weapons in the 21st Century conference that the Los Alamos and Lawrence Livermore National Laboratories host in Washington, DC.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bernhardt, A. F.; Smith, P. M.
This project was a collaborative effort between the University of California, Lawrence Livermore National Laboratory (LLNL) and FlexICs, Inc. to develop thin film transistor (TFT) electronics for active matrix displays.
Rymer, Caroline; Givens, D Ian
2010-08-15
Enriching poultry meat with long-chain n-3 polyunsaturated fatty acids (LC n-3 PUFA) can increase low population intakes of LC n-3 PUFA, but fishy taints can spoil reheated meat. This experiment determined the effect of different amounts of LC n-3 PUFA and vitamin E in the broiler diet on the fatty acid composition and sensory characteristics of the breast meat. Ross 308 broilers (120) were randomly allocated to one of five treatments from 21 to 42 days of age. Diets contained (g kg(-1)) 0, 9 or 18 LC n-3 PUFA (0LC, 9LC, 18LC), and 100, 150 or 200 mg DL-alpha-tocopherol acetate kg(-1) (E). The five diets were 0LC100E, 9LC100E, 18LC100E, 18LC150E and 18LC200E, with four pens per diet, except 18LC100E (eight pens). Breast meat was analysed for fatty acids (uncooked) and by sensory analysis using the R-index (reheated). LC n-3 PUFA content (mg kg(-1) meat) was 514 (0LC100E) and 2236 (9LC and 18LC). Compared with 0LC100E, meat from 18LC100E and 18LC150E tasted significantly different, while 23% of panellists detected fishy taints in 9LC100E and 18LC200E. Chicken meat can be enriched with nutritionally meaningful amounts of LC n-3 PUFA, but > 100 mg DL-alpha-tocopherol acetate kg(-1) broiler diet is needed to protect reheated meat from oxidative deterioration. Copyright (c) 2010 Society of Chemical Industry.
ASC Tri-lab Co-design Level 2 Milestone Report 2015
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hornung, Rich; Jones, Holger; Keasler, Jeff
2015-09-23
In 2015, the three Department of Energy (DOE) national laboratories that make up the Advanced Scientific Computing (ASC) Program (Sandia, Lawrence Livermore, and Los Alamos) collaboratively explored performance-portability programming environments in the context of several ASC co-design proxy applications, as part of a tri-lab L2 milestone executed by the co-design teams at each laboratory. The programming environments studied included Kokkos (developed at Sandia), RAJA (LLNL), and Legion (Stanford University). The proxy apps studied included miniAero, LULESH, CoMD, Kripke, and SNAP. These programming models and proxy apps are described herein. Each lab focused on a particular combination of abstractions and proxy apps, with the goal of assessing performance portability using those. Performance portability was determined by: a) the ability to run a single application source code on multiple advanced architectures, and b) comparing runtime performance between
Application Modernization at LLNL and the Sierra Center of Excellence
DOE Office of Scientific and Technical Information (OSTI.GOV)
Neely, J. Robert; de Supinski, Bronis R.
In 2014, Lawrence Livermore National Laboratory began acquisition of Sierra, a pre-exascale system from IBM and Nvidia. It marks a significant shift in direction for LLNL by introducing the concept of heterogeneous computing via GPUs. LLNL's mission requires application teams to prepare for this paradigm shift. Thus, the Sierra procurement required a proposed Center of Excellence that would align the expertise of the chosen vendors with laboratory personnel representing the application developers, system software, and tool providers in a concentrated effort to prepare the laboratory's codes in advance of the system transitioning to production in 2018. This article presents LLNL's overall application strategy, with a focus on how LLNL is collaborating with IBM and Nvidia to ensure a successful transition of its mission-oriented applications into the exascale era.
Weapon Physicist Declassifies Rescued Nuclear Test Films
DOE Office of Scientific and Technical Information (OSTI.GOV)
Spriggs, Greg; Moye, Jim
2017-03-15
The U.S. conducted 210 atmospheric nuclear tests between 1945 and 1962, with multiple cameras capturing each event at around 2,400 frames per second. But in the decades since, around 10,000 of these films sat idle, scattered across the country in high-security vaults. Not only were they gathering dust, but the film material itself was slowly decomposing, bringing the data they contained to the brink of being lost forever. For the past five years, Lawrence Livermore National Laboratory (LLNL) weapon physicist Greg Spriggs and a crack team of film experts, archivists and software developers have been on a mission to hunt down, scan, reanalyze and declassify these decomposing films. The goals are to preserve the films' content before it's lost forever, and to provide better data to the post-testing-era scientists who use computer codes to help certify that the aging U.S. nuclear deterrent remains safe, secure and effective.
Overview of the FuZE Fusion Z-Pinch Experiment
NASA Astrophysics Data System (ADS)
Shumlak, U.; Nelson, B. A.; Claveau, E. L.; Forbes, E. G.; Golingo, R. P.; Stepanov, A. D.; Weber, T. R.; Zhang, Y.; McLean, H. S.; Higginson, D. P.; Schmidt, A.; Tummel, K. K.
2017-10-01
Successful results of the sheared flow stabilized (SFS) Z-pinch from ZaP and ZaP-HD have motivated the new FuZE project to scale the plasma performance to fusion conditions. The SFS Z-pinch is immune to the instabilities that plague the conventional Z-pinch yet maintains the same favorable radial scaling. The plasma density and temperature increase rapidly with decreasing plasma radius, which naturally leads to a compact configuration at fusion conditions. The SFS Z-pinch is being investigated as a novel approach to a compact fusion device in a collaborative ARPA-E ALPHA project with the University of Washington and Lawrence Livermore National Laboratory. The project includes an experimental effort coupled with high-fidelity physics modeling using kinetic and fluid simulations. Along with scaling law analysis, computational and experimental results from the FuZE device are presented. This work is supported by an award from US ARPA-E.
Effects of numerical tolerance levels on an atmospheric chemistry model for mercury
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ferris, D.C.; Burns, D.S.; Shuford, J.
1996-12-31
A box model was developed to investigate the atmospheric oxidation processes of mercury in the environment. Previous results indicated that the most important influences on the atmospheric concentration of HgO(g) are (i) the flux of HgO(g) volatilization, which is related to the surface medium, extent of contamination, and temperature, and (ii) the presence of Cl2 in the atmosphere. The numerical solver incorporated into the ORganic CHemistry Integrated Dispersion (ORCHID) model uses the Livermore Solver for Ordinary Differential Equations (LSODE). In solving the ODEs, LSODE uses numerical tolerances. These tolerances affect computer run time, the relative accuracy of the calculated species concentrations, and whether or not LSODE converges to a solution for this system of equations. The effects of varying these tolerances on the solution of the box model and the ORCHID model will be discussed.
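The trade-off the abstract describes, where looser tolerances buy shorter run times at the cost of accuracy, can be illustrated with a toy adaptive integrator. This is a generic sketch, not LSODE itself: the step-doubling Euler scheme, the growth/shrink factors, and the test equation dy/dt = -y are all assumptions chosen for brevity.

```python
import math

def integrate(f, y0, t0, t1, rtol):
    """Adaptive step-doubling Euler integrator: a step is accepted only if
    the difference between one full step and two half steps meets rtol."""
    t, y, dt = t0, y0, (t1 - t0) / 10.0
    steps = 0
    while t < t1:
        dt = min(dt, t1 - t)
        full = y + dt * f(t, y)                       # one Euler step of dt
        half = y + 0.5 * dt * f(t, y)                 # two Euler steps of dt/2
        two_half = half + 0.5 * dt * f(t + 0.5 * dt, half)
        err = abs(two_half - full)                    # local error estimate
        if err <= rtol * max(abs(two_half), 1e-12):
            t += dt
            y = two_half
            steps += 1
            dt *= 1.5          # accepted: try a larger step next time
        else:
            dt *= 0.5          # rejected: retry with a smaller step
    return y, steps

f = lambda t, y: -y            # dy/dt = -y, exact solution exp(-t)
loose, n_loose = integrate(f, 1.0, 0.0, 5.0, rtol=1e-2)
tight, n_tight = integrate(f, 1.0, 0.0, 5.0, rtol=1e-5)
exact = math.exp(-5.0)
```

Tightening the tolerance forces many more (smaller) steps and yields a noticeably more accurate result, which is the same cost/accuracy/convergence balance being tuned in LSODE.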
3D integrated HYDRA simulations of hohlraums including fill tubes
NASA Astrophysics Data System (ADS)
Marinak, M. M.; Milovich, J.; Hammel, B. A.; Macphee, A. G.; Smalyuk, V. A.; Kerbel, G. D.; Sepke, S.; Patel, M. V.
2017-10-01
Measurements of fill tube perturbations from hydro growth radiography (HGR) experiments on the National Ignition Facility show spoke perturbations in the ablator radiating from the base of the tube. These correspond to the shadow of the 10 μm diameter glass fill tube cast by hot spots at early time. We present 3D integrated HYDRA simulations of these experiments which include the fill tube. Meshing techniques are described which were employed to resolve the fill tube structure and associated perturbations in the simulations. We examine the extent to which the specific illumination geometry necessary to accommodate a backlighter in the HGR experiment contributes to the spoke pattern. Simulations presented include high resolution calculations run on the Trinity machine operated by the Alliance for Computing at Extreme Scale (ACES) partnership. This work was performed under the auspices of the Lawrence Livermore National Security, LLC, (LLNS) under Contract No. DE-AC52-07NA27344.
NASA Astrophysics Data System (ADS)
Nora, R.; Field, J. E.; Peterson, J. Luc; Spears, B.; Kruse, M.; Humbird, K.; Gaffney, J.; Springer, P. T.; Brandon, S.; Langer, S.
2017-10-01
We present an experimentally corroborated hydrodynamic extrapolation of several recent BigFoot implosions on the National Ignition Facility. An estimate of the value and error of the hydrodynamic scale necessary for ignition (for each individual BigFoot implosion) is found by hydrodynamically scaling a distribution of multi-dimensional HYDRA simulations whose outputs correspond to their experimental observables. The 11-parameter database of simulations, which includes arbitrary drive asymmetries, dopant fractions, hydrodynamic scaling parameters, and surface perturbations due to surrogate tent and fill-tube engineering features, was computed on the TRINITY supercomputer at Los Alamos National Laboratory. This simple extrapolation is the first step in providing a rigorous calibration of our workflow to provide an accurate estimate of the efficacy of achieving ignition on the National Ignition Facility. This work was performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344.
Application Modernization at LLNL and the Sierra Center of Excellence
Neely, J. Robert; de Supinski, Bronis R.
2017-09-01
In 2014, Lawrence Livermore National Laboratory began acquisition of Sierra, a pre-exascale system from IBM and Nvidia. It marks a significant shift in direction for LLNL by introducing the concept of heterogeneous computing via GPUs. LLNL's mission requires application teams to prepare for this paradigm shift. Thus, the Sierra procurement required a proposed Center of Excellence that would align the expertise of the chosen vendors with laboratory personnel representing the application developers, system software, and tool providers in a concentrated effort to prepare the laboratory's codes in advance of the system transitioning to production in 2018. This article presents LLNL's overall application strategy, with a focus on how LLNL is collaborating with IBM and Nvidia to ensure a successful transition of its mission-oriented applications into the exascale era.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lefrancois, A.; Reisman, D. B.; Bastea, M.
2006-02-13
Isentropic compression experiments and numerical simulations on metals are performed at the Z accelerator facility at Sandia National Laboratories and at Lawrence Livermore National Laboratory in order to study the isentrope, the associated Hugoniot, and phase changes of these metals. 3D configurations have been calculated here to benchmark the new beta version of the electromagnetism package coupled with the dynamics in LS-DYNA, and compared with ICE Z shots 1511 and 1555. The electromagnetism module is being developed in the general-purpose explicit and implicit finite element program LS-DYNA® in order to perform coupled mechanical/thermal/electromagnetic simulations. The Maxwell equations are solved using a Finite Element Method (FEM) for the solid conductors coupled with a Boundary Element Method (BEM) for the surrounding air (or vacuum). More details can be found in the references.
NASA Technical Reports Server (NTRS)
Jensen, K. A.; Ripoll, J.-F.; Wray, A. A.; Joseph, D.; ElHafi, M.
2004-01-01
Five computational methods for solving the radiative transfer equation in an absorbing-emitting, non-scattering gray medium were compared on a 2 m JP-8 pool fire. The temperature and absorption coefficient fields were taken from a synthetic fire, due to the lack of a complete set of experimental data for fires of this size. These quantities were generated by a code that has been shown to agree well with the limited quantity of relevant data in the literature. Reference solutions to the governing equation were determined using the Monte Carlo method and a ray-tracing scheme with high angular resolution. Solutions using the discrete transfer method, the discrete ordinates method (DOM) with both S4 and LC11 quadratures, and a moment model using the M1 closure were compared to the reference solutions in both isotropic and anisotropic regions of the computational domain. DOM with the LC11 quadrature is shown to be more accurate than the commonly used S4 quadrature technique, especially in anisotropic regions of the fire domain. This represents the first study in which the M1 method was applied to a combustion problem occurring in a complex three-dimensional geometry. The M1 results agree well with the other solution techniques, which is encouraging for future applications to similar problems, since M1 is computationally the least expensive technique. Moreover, the M1 results are comparable to DOM S4.
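For an absorbing-emitting, non-scattering gray medium of the kind compared above, the transfer equation along a single ray reduces to dI/ds = κ(Ib − I), which is what ray-tracing reference solutions integrate over many directions. A minimal one-ray sketch (constant properties; the function names and parameter values are assumptions for illustration), checked against the closed-form solution:

```python
import math

SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W/(m^2 K^4)

def march_intensity(i0, kappa, temperature, length, n_steps=10000):
    """First-order marching of dI/ds = kappa * (Ib - I) along a ray
    through a gray, absorbing-emitting, non-scattering medium."""
    ib = SIGMA * temperature**4 / math.pi   # blackbody intensity
    ds = length / n_steps
    i = i0
    for _ in range(n_steps):
        i += ds * kappa * (ib - i)          # absorb incoming, add emission
    return i

def analytic_intensity(i0, kappa, temperature, length):
    """Closed-form solution for constant kappa and temperature:
    I = I0*exp(-tau) + Ib*(1 - exp(-tau)), with optical depth tau."""
    ib = SIGMA * temperature**4 / math.pi
    tau = kappa * length
    return i0 * math.exp(-tau) + ib * (1.0 - math.exp(-tau))
```

In a real fire calculation κ and T vary along the ray, so the marching form (with per-cell properties) is used and the angular integration over many such rays is where the quadrature choice (S4 vs. LC11) matters.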
DOE Office of Scientific and Technical Information (OSTI.GOV)
Edward Moses
The National Ignition Facility, the world's largest laser system, was dedicated at a ceremony on May 29, 2009 at Lawrence Livermore National Laboratory. These are the remarks by NIF Director Edward Moses.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ruggiero, A.; Orgren, A.
This project was a collaborative effort between Lawrence Livermore National Security, LLC (formerly The Regents of the University of California)/Lawrence Livermore National Laboratory (LLNL) and LGS Innovations, LLC (formerly Lucent Technologies, Inc.) to develop long-range and mobile operational free-space optical (FSO) laser communication systems for specialized government applications. LLNL and LGS Innovations (formerly Lucent Bell Laboratories Government Communications Systems) performed this work for a United States Government (USG) Intelligence Work for Others (I-WFO) customer, also referred to as the "Government Customer," "Customer," or "Government Sponsor." The CRADA was a critical and required part of the LLNL technology transfer plan for the customer.
DOE Office of Scientific and Technical Information (OSTI.GOV)
McDonald, J.R.; Minor, J.E.; Mehta, K.C.
1975-11-01
Criteria are prescribed and guidance is provided for professional personnel involved with evaluating the ability of existing buildings and facilities at Site 300 near Livermore, California to resist the possible effects of extreme winds and tornadoes. The development of parameters for the effects of tornadoes and extreme winds, and guidelines for the evaluation and design of structures, are presented. The investigations conducted are summarized, and the techniques used to arrive at the combined tornado and extreme wind risk model are discussed. Guidelines for structural design, methods for calculating pressure distributions on the walls and roofs of structures, and methods for accommodating impact loads from missiles are also presented.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Adams, C.; Arsenlis, T.; Bailey, A.
Lawrence Livermore National Laboratory Campus Capability Plan for 2018-2028. Lawrence Livermore National Laboratory (LLNL) is one of three national laboratories that are part of the National Nuclear Security Administration. LLNL provides critical expertise to strengthen U.S. security through development and application of world-class science and technology that: ensures the safety, reliability, and performance of the U.S. nuclear weapons stockpile; promotes international nuclear safety and nonproliferation; reduces global danger from weapons of mass destruction; and supports U.S. leadership in science and technology. Essential to the execution and continued advancement of these mission areas are responsive infrastructure capabilities. This report showcases each LLNL capability area and describes the mission, science, and technology efforts enabled by LLNL infrastructure, as well as future infrastructure plans.
Science and Technology Review July/August 2010
DOE Office of Scientific and Technical Information (OSTI.GOV)
Blobaum, K M
2010-05-27
This issue has the following articles: (1) Deterrence with a Minimum Nuclear Stockpile - Commentary by Bruce T. Goodwin; (2) Enhancing Confidence in the Nation's Nuclear Stockpile - Livermore experts are participating in a national effort aimed at predicting how nuclear weapon materials and systems will likely change over time; (3) Narrowing Uncertainties - For climate modeling and many other fields, understanding uncertainty, or margin of error, is critical; (4) Insight into a Deadly Disease - Laboratory experiments reveal the pathogenesis of tularemia in host cells, bringing scientists closer to developing a vaccine for this debilitating disease; (5) Return to Rongelap - On Rongelap Atoll, Livermore scientists are working to minimize radiological exposure for natives now living on or wishing to return to the islands.
Li, Wei-Cai; Wu, Jian-Yang; Zhang, Hong-Na; Shi, Sheng-You; Liu, Li-Qin; Shu, Bo; Liang, Qing-Zhi; Xie, Jiang-Hui; Wei, Yong-Zan
2014-01-01
Fruit cracking has long been a topic of great concern for growers and researchers of litchi (Litchi chinensis Sonn.). To understand the molecular mechanisms underlying fruit cracking, high-throughput RNA sequencing (RNA-Seq) was first used for de novo assembly and characterization of the transcriptome of cracking pericarp of litchi. Comparative transcriptomic analyses were performed on non-cracking and cracking fruits. A total of approximately 26 million and 29 million high quality reads were obtained from the two groups of samples, and were assembled into 46,641 unigenes with an average length of 993 bp. These unigenes can be useful resources for future molecular studies of the pericarp in litchi. Furthermore, four genes (LcAQP, 1; LcPIP, 1; LcNIP, 1; LcSIP, 1) involved in water transport, five genes (LcKS, 2; LcGA2ox, 2; LcGID1, 1) involved in GA metabolism, 21 genes (LcCYP707A, 2; LcGT, 9; Lcβ-Glu, 6; LcPP2C, 2; LcABI1, 1; LcABI5, 1) involved in ABA metabolism, 13 genes (LcTPC, 1; Ca2+/H+ exchanger, 3; Ca2+-ATPase, 4; LcCDPK, 2; LcCBL, 3) involved in Ca transport and 24 genes (LcPG, 5; LcEG, 1; LcPE, 3; LcEXP, 5; Lcβ-Gal, 9; LcXET, 1) involved in cell wall metabolism were identified as genes that are differentially expressed in cracked fruits compared to non-cracked fruits. Our results open new doors to further understand the molecular mechanisms behind fruit cracking in litchi and other fruits, especially Sapindaceae plants. PMID:25272225
The NISTmAb tryptic peptide spectral library for monoclonal antibody characterization.
Dong, Qian; Liang, Yuxue; Yan, Xinjian; Markey, Sanford P; Mirokhin, Yuri A; Tchekhovskoi, Dmitrii V; Bukhari, Tallat H; Stein, Stephen E
2018-04-01
We describe the creation of a mass spectral library composed of all identifiable spectra derived from the tryptic digest of the NISTmAb IgG1κ. The library is a unique reference spectral collection developed from over six million peptide-spectrum matches acquired by liquid chromatography-mass spectrometry (LC-MS) over a wide range of collision energy. Conventional one-dimensional (1D) LC-MS was used for various digestion conditions and 20- and 24-fraction two-dimensional (2D) LC-MS studies permitted in-depth analyses of single digests. Computer methods were developed for automated analysis of LC-MS isotopic clusters to determine the attributes for all ions detected in the 1D and 2D studies. The library contains a selection of over 12,600 high-quality tandem spectra of more than 3,300 peptide ions identified and validated by accurate mass, differential elution pattern, and expected peptide classes in peptide map experiments. These include a variety of biologically modified peptide spectra involving glycosylated, oxidized, deamidated, glycated, and N/C-terminal modified peptides, as well as artifacts. A complete glycation profile was obtained for the NISTmAb with spectra for 58% and 100% of all possible glycation sites in the heavy and light chains, respectively. The site-specific quantification of methionine oxidation in the protein is described. The utility of this reference library is demonstrated by the analysis of a commercial monoclonal antibody (adalimumab, Humira®), where 691 peptide ion spectra are identifiable in the constant regions, accounting for 60% coverage for both heavy and light chains. The NIST reference library platform may be used as a tool for facile identification of the primary sequence and post-translational modifications, as well as the recognition of LC-MS method-induced artifacts for human and recombinant IgG antibodies. Its development also provides a general method for creating comprehensive peptide libraries of individual proteins.
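The core operation when using such a reference library for "facile identification" is scoring a query spectrum against each library spectrum, commonly with a normalized dot product over matched peaks. The sketch below is a generic illustration of that scoring step, not the NIST implementation; the peak lists, m/z binning, and peptide identifiers are hypothetical.

```python
import math

def spectral_match_score(query, library_entry):
    """Normalized dot-product (cosine) score between two peak lists,
    each a dict mapping binned m/z -> intensity. 1.0 means identical."""
    mz_values = set(query) | set(library_entry)
    dot = sum(query.get(mz, 0.0) * library_entry.get(mz, 0.0)
              for mz in mz_values)
    norm_q = math.sqrt(sum(v * v for v in query.values()))
    norm_l = math.sqrt(sum(v * v for v in library_entry.values()))
    if norm_q == 0.0 or norm_l == 0.0:
        return 0.0
    return dot / (norm_q * norm_l)

def best_library_match(query, library):
    """Return (peptide_id, score) of the best-scoring library spectrum."""
    return max(((pid, spectral_match_score(query, spectrum))
                for pid, spectrum in library.items()),
               key=lambda pair: pair[1])
```

A production search additionally handles m/z tolerance windows, intensity weighting, and precursor-mass filtering, but the ranking principle is the same.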
The NISTmAb tryptic peptide spectral library for monoclonal antibody characterization
Dong, Qian; Liang, Yuxue; Yan, Xinjian; Markey, Sanford P.; Mirokhin, Yuri A.; Tchekhovskoi, Dmitrii V.; Bukhari, Tallat H.; Stein, Stephen E.
2018-01-01
We describe the creation of a mass spectral library composed of all identifiable spectra derived from the tryptic digest of the NISTmAb IgG1κ. The library is a unique reference spectral collection developed from over six million peptide-spectrum matches acquired by liquid chromatography-mass spectrometry (LC-MS) over a wide range of collision energy. Conventional one-dimensional (1D) LC-MS was used for various digestion conditions and 20- and 24-fraction two-dimensional (2D) LC-MS studies permitted in-depth analyses of single digests. Computer methods were developed for automated analysis of LC-MS isotopic clusters to determine the attributes for all ions detected in the 1D and 2D studies. The library contains a selection of over 12,600 high-quality tandem spectra of more than 3,300 peptide ions identified and validated by accurate mass, differential elution pattern, and expected peptide classes in peptide map experiments. These include a variety of biologically modified peptide spectra involving glycosylated, oxidized, deamidated, glycated, and N/C-terminal modified peptides, as well as artifacts. A complete glycation profile was obtained for the NISTmAb with spectra for 58% and 100% of all possible glycation sites in the heavy and light chains, respectively. The site-specific quantification of methionine oxidation in the protein is described. The utility of this reference library is demonstrated by the analysis of a commercial monoclonal antibody (adalimumab, Humira®), where 691 peptide ion spectra are identifiable in the constant regions, accounting for 60% coverage for both heavy and light chains. The NIST reference library platform may be used as a tool for facile identification of the primary sequence and post-translational modifications, as well as the recognition of LC-MS method-induced artifacts for human and recombinant IgG antibodies.
Its development also provides a general method for creating comprehensive peptide libraries of individual proteins. PMID:29425077
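Spectral library search of the kind described above typically scores a query tandem spectrum against each library entry with a normalized dot product over m/z-matched peaks. A minimal sketch of that scoring step (not the NIST software; the peak lists, tolerance, and square-root intensity weighting are illustrative assumptions):

```python
import numpy as np

def cosine_match(query_mz, query_int, lib_mz, lib_int, tol=0.01):
    """Normalized dot-product score between a query MS/MS spectrum and a
    library spectrum; peaks are matched greedily within an m/z tolerance (Da)."""
    q = np.sqrt(np.asarray(query_int, float))  # sqrt weighting de-emphasizes base peaks
    l = np.sqrt(np.asarray(lib_int, float))
    score, used = 0.0, set()
    for i, mz in enumerate(query_mz):
        for j, lmz in enumerate(lib_mz):
            if j not in used and abs(mz - lmz) <= tol:
                score += q[i] * l[j]
                used.add(j)
                break
    norm = np.linalg.norm(q) * np.linalg.norm(l)
    return score / norm if norm else 0.0

mz, inten = [300.1, 450.2, 612.3], [100.0, 40.0, 10.0]
identical = cosine_match(mz, inten, mz, inten)  # a perfect match scores 1.0
```

A production search engine would add indexed peak lookup and more elaborate weighting, but scores near 1 indicating a confident library match is the same principle.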
Kou, Longfa; Yao, Qing; Sun, Mengchi; Wu, Chunnuan; Wang, Jia; Luo, Qiuhua; Wang, Gang; Du, Yuqian; Fu, Qiang; Wang, Jian; He, Zhonggui; Ganapathy, Vadivel; Sun, Jin
2017-09-01
OCTN2 (SLC22A5) is a Na+-coupled absorption transporter for l-carnitine in the small intestine. This study tests the potential of this transporter for oral delivery of therapeutic drugs encapsulated in l-carnitine-conjugated poly(lactic-co-glycolic acid) (PLGA) nanoparticles (LC-PLGA NPs) and discloses the molecular mechanism for cellular endocytosis of transporter-targeting nanoparticles. Conjugation of l-carnitine to the surface of PLGA-NPs enhances the cellular uptake and intestinal absorption of the encapsulated drug. In both cases, the uptake process is dependent on the cotransported Na+ ion. Computational OCTN2 docking analysis shows that the presence of Na+ is important for the formation of the energetically stable intermediate complex of transporter-Na+-LC-PLGA NPs, which is also the first step in cellular endocytosis of the nanoparticles. The transporter-mediated intestinal absorption of LC-PLGA NPs occurs via endocytosis/transcytosis rather than via traditional transmembrane transport. The portal blood versus the lymphatic route was evaluated by the plasma appearance of the drug in control and lymph duct-ligated rats. Absorption via the lymphatic system is the predominant route in the oral delivery of the NPs. In summary, LC-PLGA NPs can effectively target OCTN2 on enterocytes to enhance the oral delivery of drugs, and the critical role of cotransported ions should be considered in designing transporter-targeting nanoparticles. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
The impact of overdiagnosis on the selection of efficient lung cancer screening strategies.
Han, Summer S; Ten Haaf, Kevin; Hazelton, William D; Munshi, Vidit N; Jeon, Jihyoun; Erdogan, Saadet A; Johanson, Colden; McMahon, Pamela M; Meza, Rafael; Kong, Chung Yin; Feuer, Eric J; de Koning, Harry J; Plevritis, Sylvia K
2017-06-01
The U.S. Preventive Services Task Force (USPSTF) recently updated their national lung screening guidelines and recommended low-dose computed tomography (LDCT) for lung cancer (LC) screening through age 80. However, the risk of overdiagnosis among older populations is a concern. Using four comparative models from the Cancer Intervention and Surveillance Modeling Network, we evaluate the overdiagnosis of the screening program recommended by the USPSTF in the U.S. 1950 birth cohort. We estimate the number of LC deaths averted by screening (D) per overdiagnosed case (O), yielding the ratio D/O, to quantify the trade-off between the harms and benefits of LDCT. We analyze 576 hypothetical screening strategies that vary by age, smoking, and screening frequency and evaluate efficient screening strategies that maximize the D/O ratio and other metrics, including D and life-years gained (LYG) per overdiagnosed case. The estimated D/O ratio for the USPSTF screening program is 2.85 (model range: 1.5-4.5) in the 1950 birth cohort, implying that LDCT can prevent ∼3 LC deaths per overdiagnosed case. This D/O ratio increases by 22% when the program stops screening at an earlier age of 75 instead of 80. Efficiency frontier analysis shows that while the most efficient screening strategies that maximize the mortality reduction (D) irrespective of overdiagnosis screen through age 80, screening strategies that stop at age 75 versus 80 produce greater efficiency in increasing life-years gained per overdiagnosed case. Given the risk of overdiagnosis with LC screening, the stopping age of screening merits further consideration when balancing benefits and harms. © 2017 UICC.
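The D/O metric above is simple to compute once a model supplies deaths averted (D) and overdiagnosed cases (O) for each strategy, and selecting the most efficient strategy is then a one-liner. A sketch with invented strategy counts (not the CISNET model outputs, though the stop-at-80 numbers are chosen to reproduce the 2.85 ratio quoted above):

```python
def d_over_o(deaths_averted, overdiagnosed):
    """LC deaths averted by screening per overdiagnosed case (the D/O ratio)."""
    return deaths_averted / overdiagnosed

# Hypothetical per-cohort counts for two stopping ages
strategies = {
    "stop_at_80": {"D": 570, "O": 200},
    "stop_at_75": {"D": 520, "O": 150},
}
ratios = {name: d_over_o(s["D"], s["O"]) for name, s in strategies.items()}
most_efficient = max(ratios, key=ratios.get)  # here "stop_at_75": fewer overdiagnoses per death averted
```

This mirrors the abstract's finding: stopping earlier averts slightly fewer deaths overall but trades off fewer overdiagnosed cases per death averted.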
Chen, Wenduo; Zhu, Youliang; Cui, Fengchao; Liu, Lunyang; Sun, Zhaoyan; Chen, Jizhong; Li, Yunqi
2016-01-01
The Gay-Berne (GB) potential is regarded as an accurate model for the simulation of anisotropic particles, especially liquid crystal (LC) mesogens. However, its computational complexity makes simulations of large systems extremely time-consuming. Here, we developed a GPU-accelerated molecular dynamics (MD) simulation with a coarse-grained GB potential, implemented in the GALAMOST package, to investigate LC phase transitions for mesogens in small molecules and in main-chain or side-chain polymers. For identical mesogens in the three different molecules, on cooling from fully isotropic melts, the small molecules form a single-domain smectic-B phase, while the main-chain LC polymers prefer a single-domain nematic phase as a result of connective restraints between neighboring mesogens. The phase transition of side-chain LC polymers undergoes a two-step process: nucleation of nematic islands and formation of a multi-domain nematic texture. This particular behavior originates from the fact that the rotational orientation of the mesogens is hindered by the polymer backbones. Both the global distribution and the local orientation of mesogens are critical for the phase transition of anisotropic particles. Furthermore, compared with MD simulation in LAMMPS, our GPU-accelerated code is about 4 times faster than the GPU version of LAMMPS and at least 200 times faster than the CPU version of LAMMPS. This study clearly shows that GPU-accelerated MD simulation with the GB potential in GALAMOST can efficiently handle systems with anisotropic particles and interactions, and accurately explore phase differences originating from molecular structures.
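For reference, the uniaxial Gay-Berne pair energy can be written in a few lines; its cost relative to Lennard-Jones comes from the orientation-dependent range and well-depth functions. A sketch of a single pair evaluation with the common GB(κ=3, κ'=5, μ=2, ν=1) parameterization (illustrative, not the GALAMOST implementation):

```python
import numpy as np

def gay_berne(r_vec, u1, u2, eps0=1.0, sigma0=1.0,
              kappa=3.0, kappap=5.0, mu=2.0, nu=1.0):
    """Uniaxial Gay-Berne pair potential for two unit orientation vectors
    u1, u2 separated by r_vec; kappa is the length/breadth ratio and
    kappap the side-by-side/end-to-end well-depth ratio."""
    r = np.linalg.norm(r_vec)
    rh = r_vec / r
    chi = (kappa**2 - 1.0) / (kappa**2 + 1.0)
    chip = (kappap**(1.0 / mu) - 1.0) / (kappap**(1.0 / mu) + 1.0)
    a, b, c = rh @ u1, rh @ u2, u1 @ u2

    def aniso(x):  # orientation-dependent term shared by sigma and epsilon
        return 0.5 * x * ((a + b)**2 / (1 + x * c) + (a - b)**2 / (1 - x * c))

    sigma = sigma0 / np.sqrt(1.0 - aniso(chi))          # contact distance
    eps1 = 1.0 / np.sqrt(1.0 - chi**2 * c**2)
    eps = eps0 * eps1**nu * (1.0 - aniso(chip))**mu     # well depth
    rho = sigma0 / (r - sigma + sigma0)                 # shifted LJ coordinate
    return 4.0 * eps * (rho**12 - rho**6)

u = np.array([0.0, 0.0, 1.0])
U_side = gay_berne(np.array([2**(1/6), 0.0, 0.0]), u, u)       # side-by-side minimum
U_end = gay_berne(np.array([0.0, 0.0, 2.0 + 2**(1/6)]), u, u)  # end-to-end minimum
```

With these parameters the side-by-side well is κ' = 5 times deeper than the end-to-end well, the anisotropy that favors the layered (smectic) packing seen in the small-molecule system.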
NASA Astrophysics Data System (ADS)
Bender, Jason; Raman, Kumar; Huntington, Channing; Nagel, Sabrina; Morgan, Brandon; Prisbrey, Shon; MacLaren, Stephan
2017-10-01
Experiments at the National Ignition Facility (NIF) are studying Richtmyer-Meshkov and Rayleigh-Taylor hydrodynamic instabilities in multiply-shocked plasmas. Targets feature two different-density fluids with a multimode initial perturbation at the interface, which is struck by two X-ray-driven shock waves. Here we discuss computational hydrodynamics simulations investigating the effect of second-shock ("reshock") strength on instability growth, and how these simulations are informing target design for the ongoing experimental campaign. A Reynolds-Averaged Navier-Stokes (RANS) model was used to predict the motion of the spike and bubble fronts and the mixing-layer width. In addition to reshock strength, the reshock ablator thickness and the total length of the target were varied; all three parameters were found to be important for target design, particularly for ameliorating undesirable reflected shocks. The RANS data are compared to theoretical models that predict multimode instability growth proportional to the shock-induced change in interface velocity, and to currently available data from the NIF experiments. Work performed under the auspices of the U.S. D.O.E. by Lawrence Livermore National Laboratory under Contract No. DE-AC52-07NA27344. LLNL-ABS-734611.
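The theoretical comparison mentioned above is typically the impulsive (Richtmyer) model, in which the post-shock perturbation growth rate scales with the interface velocity jump. A minimal sketch of that scaling (textbook formula; the numbers are illustrative, not NIF target values):

```python
def impulsive_growth_rate(k, atwood, delta_u, a0):
    """Richtmyer impulsive-model estimate of the Richtmyer-Meshkov growth
    rate: da/dt ~ k * A * du * a0, with perturbation wavenumber k,
    post-shock Atwood number A, interface velocity jump du, and
    post-shock amplitude a0."""
    return k * atwood * delta_u * a0

# Doubling the reshock-induced velocity jump doubles the predicted growth rate.
rate = impulsive_growth_rate(k=2.0, atwood=0.5, delta_u=10.0, a0=0.1)
```

This linear-in-Δu scaling is why reshock strength is the leading design knob for instability growth in the simulations described above.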
An ARM data-oriented diagnostics package to evaluate the climate model simulation
NASA Astrophysics Data System (ADS)
Zhang, C.; Xie, S.
2016-12-01
A set of diagnostics that utilizes long-term, high-frequency measurements from the DOE Atmospheric Radiation Measurement (ARM) program is developed for evaluating the regional simulation of clouds, radiation, and precipitation in climate models. The diagnostic results are computed and visualized automatically in a python-based package that aims to serve as an easy entry point for evaluating climate simulations using the ARM data, as well as the CMIP5 multi-model simulations. Basic performance metrics are computed to measure the accuracy of the mean state and variability of the simulated regional climate. The evaluated physical quantities include vertical profiles of clouds, temperature, relative humidity, cloud liquid water path, total column water vapor, precipitation, sensible and latent heat fluxes, radiative fluxes, and aerosol and cloud microphysical properties. Process-oriented diagnostics focusing on individual cloud- and precipitation-related phenomena are developed for the evaluation and development of specific model physical parameterizations. Application of the ARM diagnostics package will be presented in the AGU session. This work is performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344; IM release number LLNL-ABS-698645.
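The basic performance metrics described, mean-state bias, RMSE, and correlation against ARM observations, reduce to a few lines of numpy. A sketch (illustrative, not the package's actual API):

```python
import numpy as np

def performance_metrics(model, obs):
    """Mean-state skill metrics for a simulated field versus site observations."""
    model, obs = np.asarray(model, float), np.asarray(obs, float)
    diff = model - obs
    return {
        "bias": float(np.mean(diff)),                  # mean model-minus-obs error
        "rmse": float(np.sqrt(np.mean(diff ** 2))),    # root-mean-square error
        "corr": float(np.corrcoef(model, obs)[0, 1]),  # agreement in variability
    }

# e.g. monthly-mean precipitation (mm/day) at an ARM site (made-up values)
metrics = performance_metrics(model=[2.5, 3.1, 5.6, 4.0], obs=[2.1, 3.4, 5.0, 4.2])
```

The same three numbers, computed per variable and per site, are the kind of summary a diagnostics package can tabulate across the CMIP5 ensemble.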
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brady, M; Browand, F; Flowers, D
A Working Group Meeting on Heavy Vehicle Aerodynamic Drag was held at the University of Southern California, Los Angeles, California on July 30, 1999. The purpose of the meeting was to present technical details on the experimental and computational plans and approaches and provide an update on progress in obtaining experimental results, model developments, and simulations. The focus of the meeting was a review of the University of Southern California's (USC) experimental plans and results and the computational results from Lawrence Livermore National Laboratory (LLNL) and Sandia National Laboratories (SNL) for the integrated tractor-trailer benchmark geometry called the Sandia Model. Much of the meeting discussion involved the NASA Ames 7 ft x 10 ft wind tunnel tests and the need for documentation of the results. The present and projected budget and funding situation was also discussed. Presentations were given by representatives from the Department of Energy (DOE) Office of Transportation Technology Office of Heavy Vehicle Technology (OHVT), LLNL, SNL, USC, and the California Institute of Technology (Caltech). This report contains the technical presentations (viewgraphs) delivered at the Meeting, briefly summarizes the comments and conclusions, and outlines the future action items.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Frank, R.N.
1990-02-28
The Inspection Shop at Lawrence Livermore Lab recently purchased a Sheffield Apollo RS50 Direct Computer Control Coordinate Measuring Machine. The performance of the machine was specified to conform to the B89 standard, which relies heavily upon using the measuring machine in its intended manner to verify its accuracy (rather than parametric tests). Although it would be possible to use the interactive measurement system to perform these tasks, a more thorough and efficient job can be done by creating Function Library programs for certain tasks which integrate the Hewlett-Packard Basic 5.0 language and calls to proprietary analysis and machine control routines. This combination provides efficient use of the measuring machine with a minimum of keyboard input, plus an analysis of the data with respect to the B89 standard rather than a CMM analysis which would require subsequent interpretation. This paper discusses some characteristics of the Sheffield machine control and analysis software and my use of the H-P Basic language to create automated measurement programs to support the B89 performance evaluation of the CMM. 1 ref.
Hui, Wang; Young, David A; Rowan, Andrew D; Xu, Xin; Cawston, Tim E; Proctor, Carole J
2016-01-01
Objective To use a computational approach to investigate the cellular and extracellular matrix changes that occur with age in the knee joints of mice. Methods Knee joints from an inbred C57/BL1/6 (ICRFa) mouse colony were harvested at 3–30 months of age. Sections were stained with H&E, Safranin-O, and Picro-sirius red, and antibodies to matrix metalloproteinase-13 (MMP-13), nitrotyrosine, LC-3B, Bcl-2, and cleaved type II collagen were used for immunohistochemistry. Based on this and other data from the literature, a computer simulation model was built using the Systems Biology Markup Language through an iterative approach of data analysis and modelling. Individual parameters were subsequently altered to assess their effect on the model. Results A progressive loss of cartilage matrix occurred with age. Nitrotyrosine, MMP-13 and activin receptor-like kinase-1 (ALK1) staining in cartilage increased with age, with a concomitant decrease in LC-3B and Bcl-2. Stochastic simulations from the computational model showed good agreement with these data once transforming growth factor-β signalling via ALK1/ALK5 receptors was included. Oxidative stress and the interleukin 1 pathway were identified as key factors driving the cartilage breakdown associated with ageing. Conclusions A progressive loss of cartilage matrix and cellularity occurs with age. This is accompanied by increased levels of oxidative stress, apoptosis and MMP-13, and a decrease in chondrocyte autophagy. These changes explain the marked predisposition of joints to develop osteoarthritis with age. Computational modelling provides useful insights into the underlying mechanisms involved in age-related changes in musculoskeletal tissues. PMID:25475114
Loi, Gianfranco; Dominietto, Marco; Manfredda, Irene; Mones, Eleonora; Carriero, Alessandro; Inglese, Eugenio; Krengli, Marco; Brambilla, Marco
2008-09-01
This note describes a method to characterize the performance of image fusion software (Syntegra) with respect to accuracy and robustness. Computed tomography (CT), magnetic resonance imaging (MRI), and single-photon emission computed tomography (SPECT) studies were acquired from two phantoms and 10 patients. Image registration was performed independently by two teams, each composed of one radiotherapist and one physicist, by means of superposition of anatomic landmarks. Each team performed the registration jointly and saved it. The two solutions were averaged to obtain the gold-standard registration. A new set of estimators was defined to identify translation and rotation errors along the coordinate axes, independently of point position in the image field of view (FOV). The algorithms evaluated were local correlation (LC) for CT-MRI registrations and normalized mutual information (MI) for CT-MRI and CT-SPECT registrations. To evaluate accuracy, estimator values were compared to limiting values for the algorithms employed, both in phantoms and in patients. To evaluate robustness, different alignments between images taken from a sample patient were produced and registration errors determined. The LC algorithm was accurate in CT-MRI registrations in phantoms, but exceeded limiting values in 3 of 10 patients. The MI algorithm was accurate in CT-MRI and CT-SPECT registrations in phantoms; limiting values were exceeded in one CT-MRI case and never reached in CT-SPECT registrations. Thus, the evaluation of robustness was restricted to the MI algorithm for both CT-MRI and CT-SPECT registrations. The MI algorithm proved to be robust: limiting values were not exceeded with translation perturbations up to 2.5 cm, rotation perturbations up to 10 degrees, and roto-translational perturbations up to 3 cm and 5 degrees.
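The MI similarity measure that such registration algorithms maximize can be computed directly from a joint intensity histogram. A minimal normalized-mutual-information sketch (illustrative, not the Syntegra code):

```python
import numpy as np

def normalized_mutual_information(img_a, img_b, bins=32):
    """NMI = (H(A) + H(B)) / H(A, B); it peaks when the two images are
    best aligned, which is what MI-based registration exploits."""
    joint, _, _ = np.histogram2d(np.ravel(img_a), np.ravel(img_b), bins=bins)
    pxy = joint / joint.sum()
    px, py = pxy.sum(axis=1), pxy.sum(axis=0)

    def entropy(p):
        p = p[p > 0]
        return -np.sum(p * np.log(p))

    return (entropy(px) + entropy(py)) / entropy(pxy)

rng = np.random.default_rng(0)
img = rng.random((64, 64))
nmi_self = normalized_mutual_information(img, img)                    # identical images: NMI = 2
nmi_indep = normalized_mutual_information(img, rng.random((64, 64)))  # unrelated images: near 1
```

A registration optimizer would evaluate this score over candidate translations and rotations and keep the transform that maximizes it.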
2011 Computation Directorate Annual Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Crawford, D L
2012-04-11
From its founding in 1952 until today, Lawrence Livermore National Laboratory (LLNL) has made significant strategic investments to develop high performance computing (HPC) and its application to national security and basic science. Now, 60 years later, the Computation Directorate and its myriad resources and capabilities have become a key enabler for LLNL programs and an integral part of the effort to support our nation's nuclear deterrent and, more broadly, national security. In addition, the technological innovation HPC makes possible is seen as vital to the nation's economic vitality. LLNL, along with other national laboratories, is working to make supercomputing capabilities and expertise available to industry to boost the nation's global competitiveness. LLNL is on the brink of an exciting milestone with the 2012 deployment of Sequoia, the National Nuclear Security Administration's (NNSA's) 20-petaFLOP/s resource that will apply uncertainty quantification to weapons science. Sequoia will bring LLNL's total computing power to more than 23 petaFLOP/s, all brought to bear on basic science and national security needs. The computing systems at LLNL provide game-changing capabilities. Sequoia and other next-generation platforms will enable predictive simulation in the coming decade and leverage industry trends, such as massively parallel and multicore processors, to run petascale applications. Efficient petascale computing necessitates refining accuracy in materials property data, improving models for known physical processes, identifying and then modeling for missing physics, quantifying uncertainty, and enhancing the performance of complex models and algorithms in macroscale simulation codes. Nearly 15 years ago, NNSA's Accelerated Strategic Computing Initiative (ASCI), now called the Advanced Simulation and Computing (ASC) Program, was the critical element needed to shift from test-based confidence to science-based confidence.
Specifically, ASCI/ASC accelerated the development of simulation capabilities necessary to ensure confidence in the nuclear stockpile-far exceeding what might have been achieved in the absence of a focused initiative. While stockpile stewardship research pushed LLNL scientists to develop new computer codes, better simulation methods, and improved visualization technologies, this work also stimulated the exploration of HPC applications beyond the standard sponsor base. As LLNL advances to a petascale platform and pursues exascale computing (1,000 times faster than Sequoia), ASC will be paramount to achieving predictive simulation and uncertainty quantification. Predictive simulation and quantifying the uncertainty of numerical predictions where little-to-no data exists demands exascale computing and represents an expanding area of scientific research important not only to nuclear weapons, but to nuclear attribution, nuclear reactor design, and understanding global climate issues, among other fields. Aside from these lofty goals and challenges, computing at LLNL is anything but 'business as usual.' International competition in supercomputing is nothing new, but the HPC community is now operating in an expanded, more aggressive climate of global competitiveness. More countries understand how science and technology research and development are inextricably linked to economic prosperity, and they are aggressively pursuing ways to integrate HPC technologies into their native industrial and consumer products. In the interest of the nation's economic security and the science and technology that underpins it, LLNL is expanding its portfolio and forging new collaborations. We must ensure that HPC remains an asymmetric engine of innovation for the Laboratory and for the U.S. and, in doing so, protect our research and development dynamism and the prosperity it makes possible. One untapped area of opportunity LLNL is pursuing is to help U.S. 
industry understand how supercomputing can benefit their business. Industrial investment in HPC applications has historically been limited by the prohibitive cost of entry, the inaccessibility of software to run the powerful systems, and the years it takes to grow the expertise to develop codes and run them in an optimal way. LLNL is helping industry better compete in the global market place by providing access to some of the world's most powerful computing systems, the tools to run them, and the experts who are adept at using them. Our scientists are collaborating side by side with industrial partners to develop solutions to some of industry's toughest problems. The goal of the Livermore Valley Open Campus High Performance Computing Innovation Center is to allow American industry the opportunity to harness the power of supercomputing by leveraging the scientific and computational expertise at LLNL in order to gain a competitive advantage in the global economy.
Requirements for a network storage service
NASA Technical Reports Server (NTRS)
Kelly, Suzanne M.; Haynes, Rena A.
1992-01-01
Sandia National Laboratories provides a high performance classified computer network as a core capability in support of its mission of nuclear weapons design and engineering, physical sciences research, and energy research and development. The network, locally known as the Internal Secure Network (ISN), was designed in 1989 and comprises multiple distributed local area networks (LAN's) residing in Albuquerque, New Mexico and Livermore, California. The TCP/IP protocol suite is used for inter-node communications. Scientific workstations and mid-range computers, running UNIX-based operating systems, compose most LAN's. One LAN, operated by the Sandia Corporate Computing Directorate, is a general purpose resource providing a supercomputer and a file server to the entire ISN. The current file server on the supercomputer LAN is an implementation of the Common File System (CFS) developed by Los Alamos National Laboratory. Subsequent to the design of the ISN, Sandia reviewed its mass storage requirements and chose to enter into a competitive procurement to replace the existing file server with one more adaptable to a UNIX/TCP/IP environment. The requirements study for the network was the starting point for the requirements study for the new file server. The file server is called the Network Storage Service (NSS), and its requirements are described in this paper. The next section gives an application or functional description of the NSS. The final section adds performance, capacity, and access constraints to the requirements.
Atomistic study of mixing at high Z / low Z interfaces at Warm Dense Matter Conditions
NASA Astrophysics Data System (ADS)
Haxhimali, Tomorr; Glosli, James; Rudd, Robert; Lawrence Livermore National Laboratory Team
2016-10-01
We use atomistic simulations to study different aspects of mixing occurring at an initially sharp interface between high Z and low Z plasmas in the Warm/Hot Dense Matter regime. We consider a system of diamond (the low Z component) in contact with Ag (the high Z component), which undergoes rapid isochoric heating from room temperature up to 10 eV, rapidly changing the solids into warm dense matter at solid density. We simulate the motion of ions via a screened Coulomb potential. The electric field, electron density, and ionization levels are computed on the fly by solving the Poisson equation. The spatially varying screening lengths computed from the electron cloud are included in this effective interaction; the electrons are not simulated explicitly. We compute the electric field generated at the Ag-C interface as well as the dynamics of the ions during the mixing process occurring at the plasma interface. Preliminary results indicate an anomalous transport of high Z ions (Ag) into the low Z component (C), a phenomenon that is partially related to the enhanced transport of ions due to the generated electric field. These results are in agreement with recent experimental observations of the Au-diamond plasma interface. This work was performed under the auspices of the US Dept. of Energy by Lawrence Livermore National Security, LLC under Contract DE-AC52-07NA27344.
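The screened Coulomb interaction used for the ions takes the Yukawa form, with the screening length supplied by the on-the-fly electron solve. A sketch in reduced units with e = 1 (the effective charge states below are assumptions for illustration, not the simulation's computed ionization levels):

```python
import numpy as np

def screened_coulomb(r, z1, z2, lam):
    """Yukawa pair potential U(r) = Z1*Z2*exp(-r/lam)/r between two ions
    with effective charges z1, z2 and screening length lam."""
    return z1 * z2 * np.exp(-r / lam) / r

# Hypothetical effective charges for Ag and C ions at ~10 eV
U_ag_c = screened_coulomb(r=1.0, z1=8.0, z2=3.0, lam=0.5)
unscreened = screened_coulomb(1.0, 8.0, 3.0, float("inf"))  # recovers bare Coulomb
```

Making lam a spatially varying field, as the abstract describes, is what couples the ion dynamics to the local electron density.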
Simulation of 0.3 MWt AFBC test rig burning Turkish lignites
DOE Office of Scientific and Technical Information (OSTI.GOV)
Selcuk, N.; Degirmenci, E.; Oymak, O.
1997-12-31
A system model coupling bed and freeboard models for continuous combustion of lignite particles of wide size distribution, burning in their own ash in a fluidized bed combustor, was modified to incorporate: (1) a procedure for faster computation of particle size distributions (PSDs) without any sacrifice in accuracy; (2) an energy balance on char particles for the determination of the variation of temperature with particle size; and (3) a plug flow assumption for the interstitial gas. An efficient and accurate computer code developed for the solution of the conservation equations for energy and chemical species was applied to the prediction of the behavior of a 0.3 MWt AFBC test rig burning low quality Turkish lignites. The construction and operation of the test rig was carried out within the scope of a cooperation agreement between Middle East Technical University (METU) and Babcock and Wilcox GAMA (BWG) under the auspices of the Canadian International Development Agency (CIDA). Predicted concentration and temperature profiles and particle size distributions of solid streams were compared with measured data and found to be in reasonable agreement. The computer code replaces the conventional numerical integration of the analytical solution of the population balance with direct integration in ODE form using the powerful integrator LSODE (Livermore Solver for Ordinary Differential Equations), resulting in a two-order-of-magnitude decrease in CPU (Central Processing Unit) time.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Thakkar, Ajit J., E-mail: ajit@unb.ca; Wu, Taozhe
2015-10-14
Static electronic dipole polarizabilities for 135 molecules are calculated using second-order Møller-Plesset perturbation theory and six density functionals recently recommended for polarizabilities. Comparison is made with the best gas-phase experimental data. The lowest mean absolute percent deviations from the best experimental values for all 135 molecules are 3.03% and 3.08% for the LC-τHCTH and M11 functionals, respectively. Excluding the eight extreme outliers for which the experimental values are almost certainly in error, the mean absolute percent deviation for the remaining 127 molecules drops to 2.42% and 2.48% for the LC-τHCTH and M11 functionals, respectively. Detailed comparison enables us to identify 32 molecules for which the discrepancy between the calculated and experimental values warrants further investigation.
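The headline statistic here, mean absolute percent deviation from experiment, is worth pinning down because outlier handling changes it. A sketch with toy values (not the paper's data):

```python
import numpy as np

def mean_abs_percent_deviation(calc, expt):
    """Mean of |calc - expt| / |expt|, expressed in percent."""
    calc, expt = np.asarray(calc, float), np.asarray(expt, float)
    return 100.0 * float(np.mean(np.abs(calc - expt) / np.abs(expt)))

# Toy polarizabilities (a.u.); dropping a suspect experimental outlier lowers
# the deviation, as in the 135-molecule vs 127-molecule comparison above.
full = mean_abs_percent_deviation([10.3, 20.1, 50.0], [10.0, 20.0, 40.0])
trimmed = mean_abs_percent_deviation([10.3, 20.1], [10.0, 20.0])
```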
Gompertz type dechanneling functions for protons in <1 0 0>, <1 1 0> and <1 1 1> Si crystal channels
NASA Astrophysics Data System (ADS)
Petrović, S.; Erić, M.; Kokkoris, M.; Nešković, N.
2007-03-01
In this work, the energy dependences of the Gompertz type sigmoidal dechanneling function parameters for protons in <1 0 0>, <1 1 0> and <1 1 1> Si crystal channels are investigated theoretically. The proton energy range considered is between 1 and 10 MeV. The original dechanneling functions are generated using a realistic Monte Carlo computer simulation code. We show that the Gompertz type dechanneling function, having two parameters, lc and k, representing the dechanneling range and rate, respectively, accurately approximates the original dechanneling function. It is also shown that the energy dependences of parameters lc and k can be approximated by a linear function and a sum of two exponential functions, respectively. The results obtained can be used for accurate reproduction of experimental proton channeling spectra recorded in the backscattering geometry.
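The Gompertz sigmoid has a standard closed form; writing the dechanneled fraction as a function of depth z, one common parameterization consistent with the two parameters named above (an assumed form, not necessarily the paper's exact convention) is chi(z) = exp(-exp(-k(z - lc))). A sketch:

```python
import math

def gompertz_dechanneling(z, lc, k):
    """Gompertz-type sigmoid rising from ~0 to 1 with depth z; lc sets the
    dechanneling range (the inflection point) and k the dechanneling rate."""
    return math.exp(-math.exp(-k * (z - lc)))

# At z = lc the function passes through exp(-1) ~ 0.368, its inflection value.
chi_mid = gompertz_dechanneling(z=5.0, lc=5.0, k=2.0)
```

Fitting lc(E) with a linear function and k(E) with a sum of two exponentials, as the abstract reports, then gives the dechanneling function at any energy in the 1-10 MeV range.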
Mizanur, Rahman M; Frasca, Verna; Swaminathan, Subramanyam; Bavari, Sina; Webb, Robert; Smith, Leonard A; Ahmed, S Ashraf
2013-08-16
Botulinum neurotoxins are the most toxic of all compounds. The toxicity is related to the zinc endopeptidase activity located in a 50-kDa domain known as the light chain (Lc) of the toxin. The C-terminal tail of Lc is not visible in any of the currently available x-ray structures, and it has no known function but undergoes autocatalytic truncations during purification and storage. By synthesizing C-terminal peptides of various lengths, in this study, we have shown that these peptides competitively inhibit the normal catalytic activity of the Lc of serotype A (LcA) and have defined the length of the mature LcA to consist of the first 444 residues. Two catalytically inactive mutants also inhibited LcA activity. Our results suggested that the C terminus of LcA might interact at or near its own active site. By using synthetic C-terminal peptides from LcB, LcC1, LcD, LcE, and LcF and their respective substrate peptides, we have shown that the inhibition of activity is specific only to LcA. Although a potent inhibitor with a Ki of 4.5 μM, the largest of our LcA C-terminal peptides stimulated LcA activity when added at near-stoichiometric concentration to three versions of LcA differing in their C-terminal lengths. The result suggested a product-removal role for the LcA C terminus. This suggestion is supported by a weak but specific interaction, determined by isothermal titration calorimetry, between an LcA C-terminal peptide and the N-terminal product of a peptide substrate of LcA. Our results also underscore the importance of using a mature LcA as an inhibitor screening target.
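Competitive inhibition of the kind measured here follows standard Michaelis-Menten kinetics, where the inhibitor raises the apparent Km by a factor of (1 + [I]/Ki). A sketch using the 4.5 μM Ki from the abstract (the other kinetic constants are invented for illustration):

```python
def competitive_rate(s, i, vmax=1.0, km=10.0, ki=4.5):
    """v = Vmax*[S] / (Km*(1 + [I]/Ki) + [S]); concentrations in uM.
    km and vmax here are illustrative, only ki comes from the abstract."""
    return vmax * s / (km * (1.0 + i / ki) + s)

v_uninhibited = competitive_rate(s=10.0, i=0.0)  # 0.5 * Vmax at S = Km
v_inhibited = competitive_rate(s=10.0, i=4.5)    # apparent Km doubles, rate drops
```

At sufficiently high substrate the inhibited and uninhibited rates converge, the defining signature of competitive (active-site) inhibition that the authors use to argue the C terminus binds at or near the active site.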
3D shear wave velocity structure revealed with ambient noise tomography on a DAS array
NASA Astrophysics Data System (ADS)
Zeng, X.; Thurber, C. H.; Wang, H. F.; Fratta, D.
2017-12-01
An 8700-m Distributed Acoustic Sensing (DAS) cable was deployed at Brady's Hot Springs, Nevada in March 2016 in a 1.5 by 0.5 km study area. The layout of the DAS array was designed with a zig-zag geometry to obtain relatively uniform areal and varied angular coverage, providing very dense coverage with a one-meter channel spacing. This array continuously recorded signals from a vibroseis truck, earthquakes, and traffic noise during the 15-day deployment. As shown in a previous study (Zeng et al., 2017), ambient noise tomography can be applied to DAS continuous records to image shear wave velocity structure in the near surface. To avoid effects of the vibroseis truck operation, only continuous data recorded during the nighttime were used to compute noise cross-correlation functions for channel pairs within a given linear segment. The frequency band for whitening was set at 5 to 15 Hz and the length of the cross-correlation time window was set to 60 seconds. The phase velocities were determined using the multichannel analysis of surface waves (MASW) methodology. The phase velocity dispersion curve was then used to invert for shear wave velocity profiles. A preliminary velocity model for Brady's Hot Springs (Lawrence Livermore National Laboratory, 2015) was used as the starting model, and the sensitivity kernels of Rayleigh wave group and phase velocities were computed with this model. As the sensitivity kernels show, shear wave velocity in the top 200 m can be constrained with Rayleigh wave group and phase velocities in our frequency band. With the picked phase velocity data, the shear wave velocity structure can be obtained via Occam's inversion (Constable et al., 1987; Lai 1998). Shear wave velocity gradually increases with depth and is generally faster than in the Lawrence Livermore National Laboratory (2015) model, which has only limited constraints at shallow depths.
The strong spatial variation is interpreted to reflect the different sediments and sediment thicknesses in the near surface. Shear wave velocities in the northeast corner of the tested area are high, whereas loose soil reduces shear wave velocities in the central part of the tested area. This spatial variation pattern is very similar to the results obtained with ambient noise tomography using the 238-geophone array deployed in the experiment.
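The cross-correlation and peak-picking step this record describes can be sketched in a few lines. This is a toy illustration (synthetic signals, brute-force time-domain correlation), not the project's actual processing chain, and the function names are hypothetical.

```python
# Hypothetical sketch of noise cross-correlation between two DAS channels:
# correlate the traces over a range of lags and pick the lag of the peak,
# which corresponds to the inter-channel travel time of a coherent wave.
import math

def cross_correlate(a, b, max_lag):
    """Time-domain cross-correlation of equal-length traces for lags in [-max_lag, max_lag]."""
    n = len(a)
    out = {}
    for lag in range(-max_lag, max_lag + 1):
        s = 0.0
        for i in range(n):
            j = i + lag
            if 0 <= j < n:
                s += a[i] * b[j]
        out[lag] = s
    return out

def peak_lag(a, b, max_lag):
    """Lag (in samples) at which the cross-correlation is maximal."""
    cc = cross_correlate(a, b, max_lag)
    return max(cc, key=cc.get)

# Identical signals delayed by 3 samples: the correlation peak appears at lag 3.
sig = [math.sin(0.3 * i) for i in range(200)]
delayed = [0.0] * 3 + sig[:-3]
print(peak_lag(sig, delayed, 10))
```

In practice the study's workflow adds spectral whitening in the 5-15 Hz band and stacks many 60-second windows before picking dispersion, which this sketch omits.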
DLP™-based dichoptic vision test system
NASA Astrophysics Data System (ADS)
Woods, Russell L.; Apfelbaum, Henry L.; Peli, Eli
2010-01-01
It can be useful to present a different image to each of the two eyes while they cooperatively view the world. Such dichoptic presentation can occur in investigations of stereoscopic and binocular vision (e.g., strabismus, amblyopia) and vision rehabilitation in clinical and research settings. Various techniques have been used to construct dichoptic displays. The most common and most flexible modern technique uses liquid-crystal (LC) shutters. When used in combination with cathode ray tube (CRT) displays, there is often leakage of light from the image intended for one eye into the view of the other eye. Such interocular crosstalk is 14% even in our state-of-the-art CRT-based dichoptic system. While such crosstalk may have minimal impact on stereo movie or video game experiences, it can defeat clinical and research investigations. We use micromirror digital light processing (DLP™) technology to create a novel dichoptic visual display system with substantially lower interocular crosstalk (0.3% remaining crosstalk comes from the LC shutters). The DLP system normally uses a color wheel to display color images. Our approach is to disable the color wheel, synchronize the display directly to the computer's sync signal, allocate each of the three (former) color presentations to one or both eyes, and open and close the LC shutters in synchrony with those color events.
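The subframe-to-eye allocation described above amounts to simple timing bookkeeping: with the color wheel disabled, each video frame still contains three sequential presentation slots, and each slot is assigned to one or both eyes. The sketch below is a hypothetical illustration of that bookkeeping; the frame rate, slot assignment, and function name are assumptions, not the system's actual parameters.

```python
# Hypothetical shutter-timing sketch: divide one video frame into three equal
# presentation slots and derive the LC-shutter open windows for each eye.
def shutter_windows(frame_hz, assignment):
    """assignment: list of 3 eye labels ('L', 'R', or 'LR'), one per slot.
    Returns {'L': [(t0, t1), ...], 'R': [...]} in milliseconds within one frame."""
    slot_ms = 1000.0 / frame_hz / 3
    windows = {"L": [], "R": []}
    for k, eyes in enumerate(assignment):
        t0, t1 = k * slot_ms, (k + 1) * slot_ms
        for eye in eyes:  # 'LR' opens both shutters for that slot
            windows[eye].append((round(t0, 3), round(t1, 3)))
    return windows

# Illustrative 60 Hz frame: slot 0 to the left eye, slot 1 to the right,
# slot 2 shown to both.
print(shutter_windows(60, ["L", "R", "LR"]))
```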
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ocampo, Ruben P.; Bellah, Wendy
The existing Lawrence Livermore National Laboratory (LLNL) Site 300 drinking water system operation schematic is shown in Figures 1 and 2 below. The sources of water are from two Site 300 wells (Well #18 and Well #20) and San Francisco Public Utilities Commission (SFPUC) Hetch-Hetchy water through the Thomas shaft pumping station. Currently, Well #20 with 300 gallons per minute (gpm) pump capacity is the primary source of well water used during the months of September through July, while Well #18 with 225 gpm pump capacity is the source of well water for the month of August. The well water is chlorinated using sodium hypochlorite to provide required residual chlorine throughout Site 300. Well water chlorination is covered in the Lawrence Livermore National Laboratory Experimental Test Site (Site 300) Chlorination Plan ("the Chlorination Plan"; LLNL-TR-642903; current version dated August 2013). The third source of water is the SFPUC Hetch-Hetchy Water System through the Thomas shaft facility with a 150 gpm pump capacity. At the Thomas shaft station the pumped water is treated through SFPUC-owned and operated ultraviolet (UV) reactor disinfection units on its way to Site 300. The Thomas Shaft Hetch-Hetchy water line is connected to the Site 300 water system through the line common to Well pumps #18 and #20 at valve box #1.
High-Power Microwave Metamaterials for Phased-Array, anti-HPM, and Pulse-Shaping Applications
2014-07-23
examined single-layer metasurfaces composed of miniature LC resonators arranged in a 2-D periodic lattice. These metasurfaces are engineered to be...with a reasonable degree of accuracy. Additionally, when the unit cell of the metasurface was composed of two different resonators, breakdown was...Electrical and Computer Engineering of the University of Wisconsin-Madison, we demonstrated that such single-layer metasurfaces can be used to reduce
High-Q Superconducting Coplanar Waveguide Resonators for Integration into Molecule Ion Traps
2010-05-01
Wm = V1²C/4 (3.13) and We = V1²/(4ω²L) (3.14), finally yielding Q = ω₀·2Wm/P = R/(ω₀L) = ω₀RC, (3.15) where ω₀ = 1/√(LC) is the resonant frequency of the...small. The primary challenge with simulating the microresonators was refining the mesh while remaining under memory limits of the modeling computer. It
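The parallel-RLC quality-factor relations visible in this snippet (Q = ω₀RC = R/(ω₀L), with ω₀ = 1/√(LC)) can be checked numerically. The component values below are illustrative placeholders, not those of the actual superconducting microresonators.

```python
# Numerical sketch of the parallel-RLC resonator relations, with invented
# component values (1 nH, 1 pF, 1 MOhm) purely for illustration.
import math

def resonator(L, C, R):
    w0 = 1.0 / math.sqrt(L * C)  # resonant angular frequency, w0 = 1/sqrt(LC)
    Q = w0 * R * C               # quality factor, Q = w0*R*C = R/(w0*L)
    return w0, Q

w0, Q = resonator(L=1e-9, C=1e-12, R=1e6)
print(w0, Q)
# Consistency check: the two equivalent forms of Q agree.
assert abs(Q - 1e6 / (w0 * 1e-9)) / Q < 1e-9
```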
Acceleration of Binding Site Comparisons by Graph Partitioning.
Krotzky, Timo; Klebe, Gerhard
2015-08-01
The comparison of protein binding sites is a prominent task in computational chemistry and has been studied in many different ways. For the automatic detection and comparison of putative binding cavities the Cavbase system has been developed, which uses a coarse-grained set of pseudocenters to represent the physicochemical properties of a binding site and employs a graph-based procedure to calculate similarities between two binding sites. However, the comparison of two graphs is computationally quite demanding, which makes large-scale studies such as the rapid screening of entire databases hardly feasible. In a recent work, we proposed the method Local Cliques (LC) for the efficient comparison of Cavbase binding sites. It employs a clique heuristic to detect the maximum common subgraph of two binding sites and an extended graph model to additionally compare the shape of individual surface patches. In this study, we present an alternative to further accelerate the LC method by partitioning the binding-site graphs into disjoint components prior to their comparison. The pseudocenter sets are split with regard to their assigned physicochemical type, which leads to seven much smaller graphs than the original one. Applying this approach to the same test scenarios as the former comprehensive method yields a significant speed-up without sacrificing accuracy. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
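The partition-before-compare idea can be sketched schematically. This is not the Cavbase/LC implementation: the type labels are invented, and the toy per-component score below merely stands in for the actual clique-based common-subgraph comparison.

```python
# Schematic sketch: split each binding site's pseudocenters by physicochemical
# type so the expensive pairwise comparison runs on several small graphs
# instead of one large one, then sum the per-type scores.
from collections import defaultdict

def partition(pseudocenters):
    """pseudocenters: iterable of (type, coordinates) tuples -> {type: [coords]}."""
    parts = defaultdict(list)
    for ptype, coord in pseudocenters:
        parts[ptype].append(coord)
    return parts

def compare_sites(site_a, site_b, compare_component):
    """Compare two sites component-by-component and sum the per-type scores."""
    pa, pb = partition(site_a), partition(site_b)
    return sum(compare_component(pa[t], pb[t]) for t in set(pa) | set(pb))

# Toy component comparison: score = size of the smaller point set (a crude
# stand-in for a maximum-common-subgraph score).
score = compare_sites(
    [("donor", (0, 0, 0)), ("acceptor", (1, 0, 0)), ("donor", (0, 1, 0))],
    [("donor", (0, 0, 1)), ("aromatic", (2, 0, 0))],
    lambda a, b: min(len(a), len(b)),
)
print(score)
```

The speed-up comes from the quadratic-or-worse cost of graph matching: seven graphs of size n/7 are far cheaper to match than one graph of size n.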
Meteoroid head echo polarization features studied by numerical electromagnetics modeling
NASA Astrophysics Data System (ADS)
Vertatschitsch, L. E.; Sahr, J. D.; Colestock, P.; Close, S.
2011-12-01
Meteoroid head echoes are radar returns associated with scatter from the dense plasma surrounding meteoroids striking the Earth's atmosphere. Such echoes are detected by high power, large aperture (HPLA) radars. Frequently such detections show large variations in signal strength that suggest constructive and destructive interference. Using the ARPA Long-Range Tracking and Instrumentation Radar (ALTAIR) we can also observe the polarization of the returns. Usually, scatter from head echoes resembles scatter from a small sphere; when transmitting right circular polarization (RC), the received signal consists entirely of left circular polarization (LC). For some detections, power is also received in the RC channel, which indicates the presence of a more complicated scattering process. Radar returns of a fragmenting meteoroid are simulated using a hard-sphere scattering model numerically evaluated in the resonant region of Mie scatter. The cross- and co-polar scattering cross-sections are computed for pairs of spheres lying within a few wavelengths, simulating the earliest stages of fragmentation upon atmospheric impact. The likelihood of detecting this sort of idealized fragmentation event is small, but this demonstrates that the measurements resulting from such an event would display RC power comparable to LC power, matching the anomalous data. The resulting computations show that fragmentation is a consistent interpretation for these head echo radar returns.
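The constructive/destructive interference behind the signal-strength variations can be illustrated with a far simpler toy model than the Mie-regime computation used in the study: two idealized point scatterers separated along the radar line of sight, whose coherent returns add or cancel with the two-way path difference.

```python
# Toy far-field interference model (NOT the study's hard-sphere Mie code):
# coherent return power from two unit-amplitude point scatterers separated
# by `separation` along the radar line of sight.
import cmath, math

def two_scatterer_power(wavelength, separation):
    """Relative received power; the two-way path difference 2*separation
    sets the interference phase between the two echoes."""
    k = 2 * math.pi / wavelength
    field = cmath.exp(1j * k * 0.0) + cmath.exp(1j * k * 2 * separation)
    return abs(field) ** 2

# Constructive at zero separation (4x a single scatterer's power) and
# destructive when the two-way path difference is half a wavelength.
print(two_scatterer_power(1.0, 0.0))
print(two_scatterer_power(1.0, 0.25))
```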
Vijay, Sonam; Rawat, Manmeet; Sharma, Arun
2014-01-01
Salivary gland proteins of Anopheles mosquitoes offer attractive targets to understand interactions with sporozoites, blood feeding behavior, homeostasis, and immunological evaluation of malaria vectors and parasite interactions. To date, limited studies have been carried out to elucidate salivary proteins of An. stephensi salivary glands. The aim of the present study was to provide detailed analytical attributes of functional salivary gland proteins of the urban malaria vector An. stephensi. A proteomic approach combining one-dimensional electrophoresis (1DE), ion trap liquid chromatography mass spectrometry (LC/MS/MS), and computational bioinformatic analysis was adopted to provide the first direct insight into identification and functional characterization of known and novel salivary proteins of An. stephensi. Computational studies using the MASCOT and OMSSA algorithms identified a total of 36 known salivary proteins and 123 novel proteins analysed by LC/MS/MS. This first report describes a baseline proteomic catalogue of 159 salivary proteins, belonging to various categories of signal transduction, regulation of the blood coagulation cascade, and various immune and energy pathways, of the An. stephensi sialotranscriptome by mass spectrometry. Our results may serve as a basis for assigning putative functional roles to these proteins in blood feeding, biting behavior, and other aspects of vector-parasite-host interactions for parasite development in anopheline mosquitoes. PMID:25126571
Baron, G; Altomare, A; Regazzoni, L; Redaelli, V; Grandi, S; Riva, A; Morazzoni, P; Mazzolari, A; Carini, M; Vistoli, G; Aldini, G
2017-09-10
The aim of the present investigation was to better understand the pharmacokinetic profile of bilberry (Vaccinium myrtillus) anthocyanins and the role of glucose transporters (sGLT1 and GLUT2) in their absorption. In particular, the absorption of 15 different anthocyanins contained in a standardized bilberry extract (Mirtoselect®) was measured in rats by a validated LC-ESI-MS/MS approach. The plasma concentration peak (Cmax) of 11.1 ng/mL was reached after 30 min, and fasting significantly increased the bioavailability of anthocyanins more than 7-fold with respect to fed rats. Glucose co-administration did not interfere with the overall anthocyanin uptake. The bioavailability of each anthocyanin was then estimated by comparing its relative content in plasma versus the extract. The 15 anthocyanins behaved differently in terms of bioavailability, and both the aglycone and the sugar moiety were found to affect absorption. For instance, an arabinoside moiety was detrimental while cyanidin enhanced bioavailability. Computational studies made it possible to rationalize these results, highlighting the role of the glucose transporters sGLT1 and GLUT2 in anthocyanin absorption. In particular, a significant correlation was found across the 15 anthocyanins between sGLT1 and GLUT2 recognition and absorption. Copyright © 2017 Elsevier B.V. All rights reserved.
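The correlation step mentioned above can be sketched generically: a Pearson correlation between per-compound transporter-recognition scores and measured absorption. The implementation is the standard formula; the data values below are made-up placeholders, not results from the study.

```python
# Pearson correlation between hypothetical transporter-recognition scores and
# hypothetical relative bioavailability values (placeholder data).
import math

def pearson(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

recognition = [0.1, 0.4, 0.35, 0.8, 0.6]   # invented docking-style scores
absorption = [0.05, 0.3, 0.4, 0.75, 0.55]  # invented relative bioavailability
print(round(pearson(recognition, absorption), 3))
```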
Xu, Yin-Yin; Lv, Wen-Juan; Ren, Cui-Ling; Niu, Xiao-Ying; Chen, Hong-Li; Chen, Xing-Guo
2018-01-12
Novel nanoparticle-coated capillary columns have attracted widespread attention from researchers. Metal organic frameworks (MOFs) with special structure and chemical properties have received great interest in separation science. This work presents an investigation of HKUST-1 (Hong Kong University of Science and Technology-1, also called Cu3(BTC)2 or MOF-199) nanoparticles as a new type of coating material for capillary electrochromatography. For the first time, three-layer coating (3-LC), five-layer coating (5-LC), ten-layer coating (10-LC), fifteen-layer coating (15-LC), twenty-layer coating (20-LC) and twenty-five-layer coating (25-LC) capillary columns coated with HKUST-1 nanoparticles were synthesized by covalent bonding with an in situ, layer-by-layer self-assembly approach. The results of scanning electron microscopy (SEM), X-ray diffraction (XRD) and inductively coupled plasma atomic emission spectrometry (ICP-AES) indicated that HKUST-1 was successfully grafted on the inner wall of the capillary. The separation performance of the 3-LC, 5-LC, 10-LC, 15-LC, 20-LC and 25-LC open tubular (OT) capillary columns was studied with some neutral small organic molecules. The results indicated that the neutral small organic molecules were separated successfully with the 10-LC, 15-LC and 20-LC OT capillary columns because of the size selectivity of the lattice aperture and the hydrophobicity of the organic ligands. In addition, the 10-LC and 15-LC OT capillary columns showed better performance for the separation of certain phenolic compounds. Furthermore, the 10-LC, 15-LC and 20-LC OT capillary columns exhibited good intra-day repeatability, with relative standard deviations (RSDs; %) of migration time and peak areas lying in the range of 0.3-1.2% and 0.5-4.2%, respectively. For inter-day reproducibility, the RSDs of the three OT capillary columns were found to lie in the range of 0.3-5.5% and 0.3-4.5% for migration time and peak area, respectively.
The RSDs of retention times for column-to-column comparisons across three batches of 10-LC, 15-LC and 20-LC OT capillary columns were in the range of 2.3% to 7.2%. Moreover, the fabricated 10-LC, 15-LC and 20-LC OT capillary columns exhibited good repeatability and stability for separation, and could be used successively for more than 120 runs with no observable change in separation efficiency. Copyright © 2017 Elsevier B.V. All rights reserved.
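For reference, the repeatability figures quoted in this record are relative standard deviations, RSD% = 100 × (standard deviation / mean). A minimal sketch with hypothetical migration times:

```python
# Relative standard deviation (RSD%) of repeated measurements, as used to
# report intra-day, inter-day, and column-to-column repeatability above.
from statistics import mean, stdev

def rsd_percent(values):
    """100 * sample standard deviation / mean."""
    return 100.0 * stdev(values) / mean(values)

migration_times = [5.02, 5.05, 4.99, 5.03, 5.01]  # minutes, invented example
print(round(rsd_percent(migration_times), 2))
```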
Certification of Completion of ASC FY08 Level-2 Milestone ID #2933
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lipari, D A
2008-06-12
This report documents the satisfaction of the completion criteria associated with ASC FY08 Milestone ID No. 2933: 'Deploy Moab resource management services on BlueGene/L'. Specifically, this milestone represents LLNL efforts to enhance both SLURM and Moab to extend Moab's capabilities to schedule and manage BlueGene/L, and to increase portability of user scripts between ASC systems. The completion criteria for the milestone are the following: (1) Batch jobs can be specified, submitted to Moab, scheduled, and run on the BlueGene/L system; (2) Moab will be able to support the markedly increased scale in node count as well as the wiring geometry that is unique to BlueGene/L; and (3) Moab will also prepare and report statistics of job CPU usage just as it does for the current systems it supports. This document presents the completion evidence for both of the stated milestone certification methods: completion evidence for this milestone will be in the form of (1) documentation--a report that certifies that the completion criteria have been met; and (2) user hand-off. As the selected Tri-Lab workload manager, Moab was chosen to replace LCRM as the enterprise-wide scheduler across Livermore Computing (LC) systems. While LCRM/SLURM successfully scheduled jobs on BG/L, the effort to replace LCRM with Moab on BG/L represented a significant challenge. Moab is a commercial product developed and sold by Cluster Resources, Inc. (CRI). Moab receives users' batch job requests and dispatches these jobs to run on a specific cluster. SLURM is an open-source resource manager whose development is managed by members of the Integrated Computational Resource Management Group (ICRMG) within the Services and Development Division at LLNL. SLURM is responsible for launching and running jobs on an individual cluster. Replacing LCRM with Moab on BG/L required substantial changes to both Moab and SLURM.
While the ICRMG could directly manage the SLURM development effort, the work to enhance Moab had to be done by Moab's vendor. Members of the ICRMG held many meetings with CRI developers to develop the design and specify the requirements for what Moab needed to do. Extensions to SLURM are used to run jobs on the BlueGene/L architecture. These extensions support the three-dimensional network topology unique to BG/L. While BG/L geometry support was already in SLURM, enhancements were needed to provide backfill capability and answer 'will-run' queries from Moab. For its part, the Moab architecture needed to be modified to interact with SLURM in a more coordinated way. It needed enhancements to support SLURM's shorthand notation for representing thousands of compute nodes and report this information using Moab's existing status commands. The LCRM wrapper scripts that emulated LCRM commands also needed to be enhanced to support BG/L usage. The effort was successful: Moab 5.2.2 and SLURM 1.3 were installed on the 106,496-node BG/L machine on May 21, 2008, and turned over to the users to run production.
Ceramic High Efficiency Particulate Air (HEPA) Filter Final Report CRADA No. TC02102.0
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mitchell, M.; Morse, T.
This was a collaborative effort between Lawrence Livermore National Security, LLC (formerly The Regents of the University of California)/Lawrence Livermore National Laboratory (LLNL) and Flanders-Precisionaire (Flanders), to develop ceramic HEPA filters under a Thrust II Initiative for Proliferation Prevention (IPP) project. The research was conducted via the IPP Program at Commonwealth of Independent States (CIS) institutes, which are handled under a separate agreement. The institutes (collectively referred to as "CIS Institutes") involved with this project were: Bochvar: Federal State Unitarian Enterprise All-Russia Scientific and Research Institute of Inorganic Materials (FSUE VNIINM); Radium Khlopin: Federal State Unitarian Enterprise NPO Radium Institute (FSUE NPO Radium Institute); and Bakor: Science and Technology Center Bakor (STC Bakor).
Emission line spectra of S VII–S XIV in the 20–75 Å wavelength region
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lepson, J K; Beiersdorfer, P; Behar, E
As part of a larger project to complete a comprehensive catalogue of astrophysically relevant emission lines in support of new-generation X-ray observatories using the Lawrence Livermore electron beam ion traps EBIT-I and EBIT-II, the authors present observations of sulfur lines in the soft X-ray and extreme ultraviolet regions. The database includes wavelength measurements with standard errors, relative intensities, and line assignments for 127 transitions of S VII through S XIV between 20 and 75 Å. The experimental data are complemented with a full set of calculations using the Hebrew University Lawrence Livermore Atomic Code (HULLAC). A comparison of the laboratory data with Chandra measurements of Procyon allows them to identify S VII-S XI lines.
Commercialization of Ultra-Hard Ceramics for Cutting Tools Final Report CRADA No. TC0279.0
DOE Office of Scientific and Technical Information (OSTI.GOV)
Landingham, R.; Neumann, T.
This was a collaborative effort between Lawrence Livermore National Security, LLC as manager and operator of Lawrence Livermore National Laboratory (LLNL) and Greenleaf Corporation (Greenleaf) to develop a unique process for forming precursor nano-powders that can be consolidated into ceramic products for industry. LLNL researchers have developed a sol-gel process for forming nano-ceramic powders. The nano-powders are highly tailorable, allowing the explicit design of desired properties that lead to ultra-hard materials with fine grain size. The present CRADA would allow the two parties to continue the development of the sol-gel process and the consolidation process in order to develop an industrially sound process for the manufacture of these ultra-hard materials.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pruet, J
2007-06-23
This report describes Kiwi, a program developed at Livermore to enable mature studies of the relation between imperfectly known nuclear physics and uncertainties in simulations of complicated systems. Kiwi includes a library of evaluated nuclear data uncertainties, tools for modifying data according to these uncertainties, and a simple interface for generating processed data used by transport codes. As well, Kiwi provides access to calculations of k eigenvalues for critical assemblies. This allows the user to check implications of data modifications against integral experiments for multiplying systems. Kiwi is written in Python. The uncertainty library has the same format and directory structure as the native ENDL used at Livermore. Calculations for critical assemblies rely on deterministic and Monte Carlo codes developed by B Division.
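The perturbation idea Kiwi embodies can be sketched conceptually: sample modified nuclear data within the stated uncertainties, then feed the modified data to a transport calculation and watch how the outputs move. This is not Kiwi's actual interface; the function name, data values, and Gaussian uncertainty model below are assumptions for illustration only.

```python
# Conceptual sketch of uncertainty-driven data perturbation: multiply each
# cross-section value by a Gaussian factor with a given relative 1-sigma
# uncertainty (clamped so values stay non-negative).
import random

def perturb_cross_sections(xs, rel_uncertainty, seed=0):
    rng = random.Random(seed)  # fixed seed for reproducible samples
    return [x * max(1.0 + rng.gauss(0.0, rel_uncertainty), 0.0) for x in xs]

baseline = [2.1, 1.7, 0.9, 0.4]  # barns, invented values
sample = perturb_cross_sections(baseline, rel_uncertainty=0.05)
print(sample)
```

Repeating this sampling many times and re-running the transport calculation on each sample yields a distribution of outputs (e.g. k eigenvalues) that reflects the input data uncertainty.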
Water Treatment Using Advanced Ultraviolet Light Sources Final Report CRADA No. TC02089.0
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hoppes, W.; Oster, S.
This was a collaborative effort between Lawrence Livermore National Security, LLC as manager and operator of Lawrence Livermore National Laboratory (LLNL) and Teknichal Services, LLC (TkS), to develop water treatment systems using advanced ultraviolet light sources. The Russian institutes involved with this project were The High Current Electronics Institute (HCEI) and Russian Institute of Technical Physics-Institute of Experimental Physics (VNIIEF). HCEI and VNIIEF developed and demonstrated the potential commercial viability of short-wavelength ultraviolet excimer lamps under a Thrust 1 Initiatives for Proliferation Prevention (IPP) Program. The goals of this collaboration were to demonstrate both the commercial viability of excilamp-based water disinfection and achieve further substantial operational improvement in the lamps themselves, particularly in the area of energy efficiency.
Catalytic Properties of Botulinum Neurotoxins Subtypes A3 and A4
Henkel, James S.; Jacobson, Mark; Tepp, William; Pier, Christina; Johnson, Eric A.; Barbieri, Joseph T.
2009-01-01
Botulinum toxins (BoNT) are zinc proteases (serotypes A-G) which cause flaccid paralysis through the cleavage of SNARE proteins within motor neurons. BoNT/A was originally organized into two subtypes: BoNT/A1 and BoNT/A2, which are ~95% homologous and possess similar catalytic activities. Subsequently, two additional subtypes were identified: BoNT/A3 (Loch Maree) and BoNT/A4 (657Ba), which have 81 and 88% homology with BoNT/A1, respectively. Alignment studies predicted that BoNT/A3 and BoNT/A4 were sufficiently different from BoNT/A1 to affect SNAP25 binding and cleavage. Recombinant Light Chain (LC) of BoNT/A3 (LC/A3) and BoNT/A4 (LC/A4) were subjected to biochemical analysis. LC/A3 cleaved SNAP25 at 50% the rate of LC/A1, but cleaved SNAPtide® at a faster rate than LC/A1, while LC/A4 cleaved SNAP25 and SNAPtide® at slower rates than LC/A1. LC/A3 and LC/A4 had Km values for SNAP25 similar to LC/A1, while the kcat for LC/A4 was 10-fold slower than LC/A1, suggesting a defect in substrate cleavage. Neither LC/A3 nor LC/A4 possessed autocatalytic activity, a property of LC/A1 and LC/A2. Thus, the four subtypes of BoNT/A bind SNAP25 with similar affinity but have different catalytic capacities for SNAP25 cleavage, SNAPtide® cleavage, and autocatalysis. The catalytic properties identified among the subtypes of LC/A may influence strategies for the development of small-molecule or peptide inhibitors as therapies against botulism. PMID:19256469
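The Km/kcat comparison follows standard Michaelis-Menten kinetics. A sketch with invented parameter values (not the measured constants for these subtypes) shows how a 10-fold slower kcat at equal Km translates directly into a 10-fold slower cleavage rate at any given substrate level:

```python
# Michaelis-Menten initial velocity: v = kcat * [E] * [S] / (Km + [S]).
def mm_rate(kcat, Km, E, S):
    return kcat * E * S / (Km + S)

# Two hypothetical enzymes with equal Km but a 10-fold slower kcat, mirroring
# the LC/A1 vs LC/A4 pattern described above (all numbers invented).
v1 = mm_rate(kcat=10.0, Km=5.0, E=0.1, S=50.0)
v4 = mm_rate(kcat=1.0, Km=5.0, E=0.1, S=50.0)
print(v1, v4)
```

Equal Km means equal apparent substrate affinity (half-maximal velocity at the same [S]); the difference shows up only in turnover.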
Nanoliposomes protect against AL amyloid light chain protein-induced endothelial injury.
Truran, Seth; Weissig, Volkmar; Ramirez-Alvarado, Marina; Franco, Daniel A; Burciu, Camelia; Georges, Joseph; Murarka, Shishir; Okoth, Winter A; Schwab, Sara; Hari, Parameswaran; Migrino, Raymond Q
2014-03-01
A newly-recognized pathogenic mechanism underlying light chain amyloidosis (AL) involves endothelial dysfunction and cell injury caused by misfolded light chain proteins (LC). Nanoliposomes (NL) are artificial phospholipid vesicles that could attach to misfolded proteins and reduce tissue injury. We tested whether co-treatment with NL reduces LC-induced endothelial dysfunction and cell death. Abdominal subcutaneous adipose arterioles from 14 non-AL subjects were cannulated; dilator responses to acetylcholine and papaverine were measured at baseline and following 1-hour exposure to LC (20 µg/mL; 2 purified from AL subjects' urine, 1 from human recombinant LC [AL-09]) ± NL (phosphatidylcholine/cholesterol/phosphatidic acid, 70/25/5 molar ratio) or NL alone. Human aortic artery endothelial cells (HAEC) were exposed to Oregon Green-labeled LC ± NL for 24 hours, and intracellular LC and apoptosis (Hoechst stain) were measured. Circular dichroism spectroscopy was performed on AL-09 LC ± NL to follow changes in secondary structure and protein thermal stability. LC caused impaired dilation to acetylcholine that was restored by NL (control - 94.0 ± 1.8%, LC - 65.0 ± 7.1%, LC + NL - 95.3 ± 1.8%, p ≤ 0.001 LC versus control or LC + NL). NL protection was inhibited by L-NG-nitroarginine methyl ester. NL increased the beta sheet structure of LC, reduced endothelial cell internalization of LC, and protected against LC-induced endothelial cell death. LC induced human adipose arteriole endothelial dysfunction and endothelial cell death, which were reversed by co-treatment with NL. This protection may partly be due to enhancing LC protein structure and reducing LC internalization. Nanoliposomes represent a promising new class of agents to ameliorate tissue injury from protein misfolding diseases such as AL.
01-NIF Dedication: George Miller
George Miller
2017-12-09
The National Ignition Facility, the world's largest laser system, was dedicated at a ceremony on May 29, 2009 at Lawrence Livermore National Laboratory. These are the remarks by Lab Director George Miller.
09-NIF Dedication: Arnold Schwarzenegger
DOE Office of Scientific and Technical Information (OSTI.GOV)
Governor Arnold Schwarzenegger
2009-07-02
The National Ignition Facility, the world's largest laser system, was dedicated at a ceremony on May 29, 2009 at Lawrence Livermore National Laboratory. These are the remarks by California Governor Arnold Schwarzenegger.
02-NIF Dedication: Edward Moses
Edward Moses
2017-12-09
The National Ignition Facility, the world's largest laser system, was dedicated at a ceremony on May 29, 2009 at Lawrence Livermore National Laboratory. These are the remarks by NIF Director Edward Moses.
LC-MS/MS signal suppression effects in the analysis of pesticides in complex environmental matrices.
Choi, B K; Hercules, D M; Gusev, A I
2001-02-01
The application of LC separation and mobile phase additives in addressing LC-MS/MS matrix signal suppression effects for the analysis of pesticides in a complex environmental matrix was investigated. It was shown that signal suppression is most significant for analytes eluting early in the LC-MS analysis. Introduction of different buffers (e.g. ammonium formate, ammonium hydroxide, formic acid) into the LC mobile phase was effective in improving signal correlation between the matrix and standard samples. The signal improvement is dependent on buffer concentration as well as LC separation of the matrix components. The application of LC separation alone was not effective in addressing suppression effects when characterizing complex matrix samples. Overloading of the LC column by matrix components was found to significantly contribute to analyte-matrix co-elution and suppression of signal. This signal suppression effect can be efficiently compensated by 2D LC (LC-LC) separation techniques. The effectiveness of buffers and LC separation in improving signal correlation between standard and matrix samples is discussed.
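Signal suppression of this kind is commonly quantified as a matrix effect: the analyte response in matrix relative to the response in a clean standard. A minimal sketch with hypothetical peak areas (the formula is standard; the numbers are invented):

```python
# Matrix effect as a percentage: 100% = no suppression or enhancement,
# values below 100% indicate ionization suppression by co-eluting matrix.
def matrix_effect_percent(area_in_matrix, area_in_standard):
    return 100.0 * area_in_matrix / area_in_standard

print(matrix_effect_percent(area_in_matrix=4.2e5, area_in_standard=7.0e5))
```

Comparing this ratio for early- versus late-eluting analytes is one simple way to see the elution-time dependence of suppression described above.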
NASA Astrophysics Data System (ADS)
Lundquist, J. K.; Sugiyama, G.; Nasstrom, J.
2007-12-01
This presentation describes the tools and services provided by the National Atmospheric Release Advisory Center (NARAC) at Lawrence Livermore National Laboratory (LLNL) for modeling the impacts of airborne hazardous materials. NARAC provides atmospheric plume modeling tools and services for chemical, biological, radiological, and nuclear airborne hazards. NARAC can simulate downwind effects from a variety of scenarios, including fires, industrial and transportation accidents, radiation dispersal device explosions, hazardous material spills, sprayers, nuclear power plant accidents, and nuclear detonations. NARAC collaborates on radiological dispersion source terms and effects models with Sandia National Laboratories and the U.S. Nuclear Regulatory Commission. NARAC was designated the interim provider of capabilities for the Department of Homeland Security's Interagency Modeling and Atmospheric Assessment Center by the Homeland Security Council in April 2004. The NARAC suite of software tools includes simple stand-alone, local-scale plume modeling tools for end-users' computers, and Web- and Internet-based software to access advanced modeling tools and expert analyses from the national center at LLNL. Initial automated, 3-D predictions of plume exposure limits and protective action guidelines for emergency responders and managers are available from the center in 5-10 minutes. These can be followed immediately by quality-assured, refined analyses by 24 x 7 on-duty or on-call NARAC staff. NARAC continues to refine calculations using updated on-scene information, including measurements, until all airborne releases have stopped and the hazardous threats are mapped and impacts assessed. Model predictions include the 3-D spatial and time-varying effects of weather, land use, and terrain, on scales from the local to regional to global. Real-time meteorological data and forecasts are provided by redundant communications links to the U.S.
National Oceanic and Atmospheric Administration (NOAA), U.S. Navy, and U.S. Air Force, as well as an in-house mesoscale numerical weather prediction model. NARAC provides an easy-to-use Geographical Information System (GIS) for display of plume predictions with affected population counts and detailed maps, and the ability to export plume predictions to other standard GIS capabilities. Data collection and product distribution are provided through a variety of communication methods, including dial-up, satellite, and wired and wireless networks. Ongoing research and development activities will be highlighted. The NARAC scientific support team is developing urban parameterizations for use in a regional dispersion model (see companion paper by Delle Monache). Modifications to the numerical weather prediction model WRF to account for characteristics of urban dynamics are also in progress, as is boundary-layer turbulence model development for simulations with resolutions greater than 1 km. The NARAC building-resolving computational fluid dynamics capability, FEM3MP, enjoys ongoing development activities such as the expansion of its ability to model releases of dense gases. Other research activities include sensor-data fusion, such as the reconstruction of unknown source terms from sparse and disparate observations. This work was performed under the auspices of the U.S. Department of Energy by the University of California, Lawrence Livermore National Laboratory under contract No. W-7405-Eng-48. The Department of Homeland Security sponsored the production of this material under the Department of Energy contract for the management and operation of Lawrence Livermore National Laboratory. UCRL-PROC-234355
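For context, the simplest textbook dispersion model is the ground-reflecting Gaussian plume sketched below. NARAC's operational 3-D models are far more sophisticated (time-varying weather, terrain, land use, deposition), so this is only an illustrative sketch with invented inputs, not anything NARAC uses.

```python
# Textbook Gaussian plume with ground reflection: concentration at (y, z) for
# source rate Q (g/s), wind speed u (m/s), effective release height H (m),
# and dispersion parameters sigma_y, sigma_z (m) at the downwind distance
# of interest.
import math

def gaussian_plume(Q, u, y, z, H, sigma_y, sigma_z):
    lateral = math.exp(-y**2 / (2 * sigma_y**2))
    vertical = (math.exp(-(z - H)**2 / (2 * sigma_z**2))
                + math.exp(-(z + H)**2 / (2 * sigma_z**2)))  # image-source reflection
    return Q / (2 * math.pi * u * sigma_y * sigma_z) * lateral * vertical

# Centerline ground-level concentration for a hypothetical 50 m release
# (all input values invented for illustration).
c = gaussian_plume(Q=1.0, u=5.0, y=0.0, z=0.0, H=50.0, sigma_y=80.0, sigma_z=40.0)
print(c)
```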
NASA Astrophysics Data System (ADS)
Soloviev, A.; Dean, C.; Lukas, R.; Donelan, M. A.; Terray, E. A.
2016-12-01
Surface-wave breaking is a powerful mechanism producing significant energy flux to small-scale turbulence. Most of the turbulent energy produced by breaking waves dissipates within one significant wave height, while the turbulent diffusion layer extends to approximately ten significant wave heights. Notably, the near-surface shear may practically vanish within the wave-stirred layer due to small-scale turbulent mixing. The surface ocean temperature-salinity structure, circulation, and mass exchanges (including greenhouse gases and pollutants) substantially depend on turbulent mixing and non-local transport in the near-surface layer of the ocean. Spatially coherent organized motions have been recognized as an important part of non-local transport. Langmuir circulation (LC) and ramp-like structures are believed to vertically transfer an appreciable portion of the momentum, heat, gases, pollutants (e.g., oil), and other substances in the upper layer of the ocean. The free surface significantly complicates the analysis of turbulent exchanges at the air-sea interface, and the coherent structures are not yet completely understood. In particular, there is growing observational evidence that in the case of developing seas, when the wind direction may not coincide with the direction of the energy-containing waves, the Langmuir lines are oriented in the wind rather than the wave direction. In addition, the vortex force due to Stokes drift in traditional models is altered in the breaking-wave-stirred layer. Another complication is that the ramp-like structures in the upper ocean turbulent boundary layer have axes perpendicular to the axes of LC. The ramp-like structures are not considered in the traditional model. We have developed a new model, which treats the LC and ramp-like structures in the near-surface layer of the ocean as a coupled system. 
Using computational fluid dynamics tools (LES), we have been able to reproduce both LC and ramp-like structures coexisting in space though intermittent in time. In the model, helicity isosurfaces appear to be tilted and, in general, coordinated with the tilted velocity isosurfaces produced by ramp-like structures. This is an indication of coupling between the LC and ramp-like structures. Remarkably, the new model is able to explain observations of LC under developing seas.
Yin, Rui; Wang, Min; Huang, Ying-Ying; Huang, Huang-Chiao; Avci, Pinar; Chiang, Long Y; Hamblin, Michael R
2014-05-01
We report the synthesis and anticancer photodynamic properties of two new decacationic fullerene (LC14) and red light-harvesting antenna-fullerene conjugated monoadduct (LC15) derivatives. The antenna of LC15 was attached covalently to C60 at a distance of only <3.0 Å to facilitate ultrafast intramolecular photoinduced electron transfer (for type-I photochemistry) and photon absorption at longer wavelengths. Because LC15 was hydrophobic, we compared formulation in Cremophor EL micelles with direct dilution from dimethylacetamide (DMA). LC14 produced more ¹O₂ than LC15, while LC15 produced much more HO· than LC14, as measured by specific fluorescent probes. When delivered by DMA, LC14 killed more HeLa cells than LC15 when excited by UVA light, while LC15 killed more cells when excited by white light, consistent with the antenna effect. However, LC15 was more effective than LC14 when delivered by micelles regardless of the excitation light. Micellar delivery produced earlier apoptosis and damage to the endoplasmic reticulum as well as to lysosomes and mitochondria. This team of authors reports the synthesis and the photodynamic properties of two new derivatives for cancer treatment; one is a decacationic fullerene (LC14) and the other is a red light-harvesting antenna-fullerene conjugated monoadduct (LC15), utilizing a HeLa cell model. Copyright © 2014 Elsevier Inc. All rights reserved.
Hayes, Taylor R; Petrov, Alexander A
2016-02-01
The ability to adaptively shift between exploration and exploitation control states is critical for optimizing behavioral performance. Converging evidence from primate electrophysiology and computational neural modeling has suggested that this ability may be mediated by the broad norepinephrine projections emanating from the locus coeruleus (LC) [Aston-Jones, G., & Cohen, J. D. An integrative theory of locus coeruleus-norepinephrine function: Adaptive gain and optimal performance. Annual Review of Neuroscience, 28, 403-450, 2005]. There is also evidence that pupil diameter covaries systematically with LC activity. Although imperfect and indirect, this link makes pupillometry a useful tool for studying the locus coeruleus norepinephrine system in humans and in high-level tasks. Here, we present a novel paradigm that examines how the pupillary response during exploration and exploitation covaries with individual differences in fluid intelligence during analogical reasoning on Raven's Advanced Progressive Matrices. Pupillometry was used as a noninvasive proxy for LC activity, and concurrent think-aloud verbal protocols were used to identify exploratory and exploitative solution periods. This novel combination of pupillometry and verbal protocols from 40 participants revealed a decrease in pupil diameter during exploitation and an increase during exploration. The temporal dynamics of the pupillary response were characterized by a steep increase during the transition to exploratory periods, sustained dilation for many seconds afterward, and a gradual return to baseline. Moreover, the individual differences in the relative magnitude of pupillary dilation accounted for 16% of the variance in Advanced Progressive Matrices scores. 
Assuming that pupil diameter is a valid index of LC activity, these results establish promising preliminary connections between the literature on locus coeruleus norepinephrine-mediated cognitive control and the literature on analogical reasoning and fluid intelligence.
Hoenner, Xavier; Whiting, Scott D; Hindell, Mark A; McMahon, Clive R
2012-01-01
Accurately quantifying animals' spatial utilisation is critical for conservation, but has long remained an elusive goal due to technological impediments. The Argos telemetry system has been extensively used to remotely track marine animals, however location estimates are characterised by substantial spatial error. State-space models (SSM) constitute a robust statistical approach to refine Argos tracking data by accounting for observation errors and stochasticity in animal movement. Despite their wide use in ecology, few studies have thoroughly quantified the error associated with SSM predicted locations and no research has assessed their validity for describing animal movement behaviour. We compared home ranges and migratory pathways of seven hawksbill sea turtles (Eretmochelys imbricata) estimated from (a) highly accurate Fastloc GPS data and (b) locations computed using common Argos data analytical approaches. Argos 68th percentile error was <1 km for LC 1, 2, and 3 but markedly larger (>4 km) for LC ≤ 0. Argos error structure was highly longitudinally skewed and was, for all LC, adequately modelled by a Student's t distribution. Both habitat use and migration routes were best recreated using SSM locations post-processed by re-adding good Argos positions (LC 1, 2 and 3) and filtering terrestrial points (mean distance to migratory tracks ± SD = 2.2 ± 2.4 km; mean home range overlap and error ratio = 92.2% and 285.6, respectively). This parsimonious and objective statistical procedure however still markedly overestimated true home range sizes, especially for animals exhibiting restricted movements. Post-processing SSM locations nonetheless constitutes the best analytical technique for remotely sensed Argos tracking data and we therefore recommend using this approach to rework historical Argos datasets for better estimation of animal spatial utilisation for research and evidence-based conservation purposes.
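The error model described above, a heavy-tailed, longitudinally skewed distribution fit by a Student's t, can be illustrated with a short sketch. The snippet below uses synthetic errors (not the hawksbill dataset; the parameter values are assumptions for illustration) and fits a Student's t by maximum likelihood with SciPy:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Synthetic stand-in for Argos longitude errors (km): heavy-tailed,
# as the study found for all location classes (LC).
errors_km = stats.t.rvs(df=3, loc=0.0, scale=1.5, size=5000, random_state=rng)

# Fit a Student's t by maximum likelihood, as in the error model.
df, loc, scale = stats.t.fit(errors_km)
print(f"fitted df={df:.2f}, loc={loc:.2f} km, scale={scale:.2f} km")

# A low fitted df (well below ~30) indicates tails heavier than Gaussian,
# which is what makes the t distribution preferable here.
assert df < 10
```

A Gaussian observation model would underweight the frequent multi-kilometre outliers in low-quality location classes; the t distribution's fitted degrees of freedom capture exactly that tail heaviness.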
DOE Office of Scientific and Technical Information (OSTI.GOV)
Edward Moses
The National Ignition Facility, the world's largest laser system, was dedicated at a ceremony on May 29, 2009 at Lawrence Livermore National Laboratory. These are the concluding remarks by NIF Director Edward Moses, and a brief video presentation.
Small Optics Laser Damage Test Procedure
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wolfe, Justin
2017-10-19
This specification defines the requirements and procedure for laser damage testing of coatings and bare surfaces designated for small optics in the National Ignition Facility (NIF) at Lawrence Livermore National Laboratory (LLNL).
Computational Electronics and Electromagnetics
DOE Office of Scientific and Technical Information (OSTI.GOV)
DeFord, J.F.
The Computational Electronics and Electromagnetics thrust area is a focal point for computer modeling activities in electronics and electromagnetics in the Electronics Engineering Department of Lawrence Livermore National Laboratory (LLNL). Traditionally, they have focused their efforts in technical areas of importance to existing and developing LLNL programs, and this continues to form the basis for much of their research. A relatively new and increasingly important emphasis for the thrust area is the formation of partnerships with industry and the application of their simulation technology and expertise to the solution of problems faced by industry. The activities of the thrust area fall into three broad categories: (1) the development of theoretical and computational models of electronic and electromagnetic phenomena, (2) the development of useful and robust software tools based on these models, and (3) the application of these tools to programmatic and industrial problems. In FY-92, they worked on projects in all of the areas outlined above. The object of their work on numerical electromagnetic algorithms continues to be the improvement of time-domain algorithms for electromagnetic simulation on unstructured conforming grids. The thrust area is also investigating various technologies for conforming-grid mesh generation to simplify the application of their advanced field solvers to design problems involving complicated geometries. They are developing a major code suite based on the three-dimensional (3-D), conforming-grid, time-domain code DSI3D. They continue to maintain and distribute the 3-D, finite-difference time-domain (FDTD) code TSAR, which is installed at several dozen university, government, and industry sites.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gallegos, G; Daniels, J; Wegrecki, A
2006-04-24
This document contains the human health and ecological risk assessment for the Resource Conservation and Recovery Act (RCRA) permit renewal for the Explosives Waste Treatment Facility (EWTF). Volume 1 is the text of the risk assessment, and Volume 2 (provided on a compact disc) is the supporting modeling data. The EWTF is operated by the Lawrence Livermore National Laboratory (LLNL) at Site 300, which is located in the foothills between the cities of Livermore and Tracy, approximately 17 miles east of Livermore and 8 miles southwest of Tracy. Figure 1 is a map of the San Francisco Bay Area, showing the location of Site 300 and other points of reference. One of the principal activities of Site 300 is to test what are known as 'high explosives' for nuclear weapons. These are the highly energetic materials that provide the force to drive fissionable material to criticality. LLNL scientists develop and test the explosives and the integrated non-nuclear components in support of the United States nuclear stockpile stewardship program as well as in support of conventional weapons and the aircraft, mining, oil exploration, and construction industries. Many Site 300 facilities are used in support of high explosives research. Some facilities are used in the chemical formulation of explosives; others are locations where explosive charges are mechanically pressed; others are locations where the materials are inspected radiographically for such defects as cracks and voids. Finally, some facilities are locations where the machined charges are assembled before they are sent to the on-site test firing facilities, and additional facilities are locations where materials are stored. Wastes generated from high-explosives research are treated by open burning (OB) and open detonation (OD). 
OB and OD treatments are necessary because they are the safest methods for treating explosives wastes generated at these facilities, and they eliminate the requirement for further handling and transportation that would be required if the wastes were treated off site.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gallegos, G; Daniels, J; Wegrecki, A
2007-10-01
This document contains the human health and ecological risk assessment for the Resource Conservation and Recovery Act (RCRA) permit renewal for the Explosives Waste Treatment Facility (EWTF). Volume 1 is the text of the risk assessment, and Volume 2 (provided on a compact disc) is the supporting modeling data. The EWTF is operated by the Lawrence Livermore National Laboratory (LLNL) at Site 300, which is located in the foothills between the cities of Livermore and Tracy, approximately 17 miles east of Livermore and 8 miles southwest of Tracy. Figure 1 is a map of the San Francisco Bay Area, showing the location of Site 300 and other points of reference. One of the principal activities of Site 300 is to test what are known as 'high explosives' for nuclear weapons. These are the highly energetic materials that provide the force to drive fissionable material to criticality. LLNL scientists develop and test the explosives and the integrated non-nuclear components in support of the United States nuclear stockpile stewardship program as well as in support of conventional weapons and the aircraft, mining, oil exploration, and construction industries. Many Site 300 facilities are used in support of high explosives research. Some facilities are used in the chemical formulation of explosives; others are locations where explosive charges are mechanically pressed; others are locations where the materials are inspected radiographically for such defects as cracks and voids. Finally, some facilities are locations where the machined charges are assembled before they are sent to the onsite test firing facilities, and additional facilities are locations where materials are stored. Wastes generated from high-explosives research are treated by open burning (OB) and open detonation (OD). 
OB and OD treatments are necessary because they are the safest methods for treating explosives wastes generated at these facilities, and they eliminate the requirement for further handling and transportation that would be required if the wastes were treated off site.
The Weapons Laboratory Technical Library: Automating with ’Stilas’
1990-03-01
version of the system to LC in October 1988. A small business specializing in library automation, SIRSI was founded in 1979 by library and...computer specialists, and has a strong reputation based upon the success of their UNIX-based Unicorn Collection Management System. SIRSI offers a complete...system based on the Unicorn and BRS/Search systems. The contracted STILAS package includes UNISYS hardware, software written in the C language
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cooper, J. F.; Berner, J. K.
This was a collaborative effort between The Regents of the University of California, Lawrence Livermore National Laboratory (LLNL) and Contained Energy, Inc. (CEI), to conduct necessary research and to develop, fabricate and test a multi-cell carbon fuel cell.
Wang, Dan; Zhao, Jietang; Hu, Bing; Li, Jiaqi; Qin, Yaqi; Chen, Linhuan; Qin, Yonghua
2018-01-01
Sucrose phosphate synthase (SPS, EC 2.4.1.14) is a key enzyme that regulates sucrose biosynthesis in plants. SPS is encoded by different gene families which display differential expression patterns and functional divergence. Genome-wide identification and expression analyses of SPS gene families have been performed in Arabidopsis, rice, and sugarcane, but a comprehensive analysis of the SPS gene family in Litchi chinensis Sonn. has not yet been reported. In the current study, four SPS genes (LcSPS1, LcSPS2, LcSPS3, and LcSPS4) were isolated from litchi. Genomic organization analysis indicated that the four litchi SPS genes have very similar exon-intron structures. A phylogenetic tree showed that LcSPS1-4 were grouped into different SPS families (LcSPS1 and LcSPS2 in the A family, LcSPS3 in the B family, and LcSPS4 in the C family). LcSPS1 and LcSPS4 were strongly expressed in flowers, while LcSPS3 was most highly expressed in mature leaves. RT-qPCR results showed that LcSPS genes were expressed differentially during aril development between cultivars with different hexose/sucrose ratios. A higher level of LcSPS gene expression was detected in Wuheli, which accumulates more sucrose in the aril at maturity. The tissue- and developmental stage-specific expression of LcSPS1-4 genes uncovered in this study increases our understanding of the important roles played by these genes in litchi fruits. PMID:29473005
Human Langerhans cells express E-cadherin.
Blauvelt, A; Katz, S I; Udey, M C
1995-02-01
Murine Langerhans cells (LC) synthesize and express E-cadherin, a Ca²⁺-dependent homophilic cell adhesion molecule that mediates LC-keratinocyte (KC) binding in vitro. In vivo, E-cadherin expression by LC may promote localization and persistence of LC within the epidermis through LC-KC adhesion. In addition, changes in LC E-cadherin expression or affinity may be an important factor in the egress of LC from the epidermis after exposure to antigen. The aim of the present study was to determine if human LC also express E-cadherin. Suction blister roofs were obtained from normal volunteers and epidermal cell (EC) suspensions were prepared by limited trypsinization in the presence of 1 mM Ca²⁺. EC were then incubated with antibodies to E-cadherin and CD1a or HLA-DR, and examined by two-color analytical flow cytometry or immunofluorescence microscopy. Most (82.9% ± 7.4% [mean ± SD], range 67-89%, n = 7) freshly prepared human LC expressed E-cadherin, as did the majority of KC. The amount of E-cadherin (as determined by mean fluorescence intensity) expressed by LC and KC was similar. Trypsin/EDTA treatment of freshly prepared EC abrogated expression of E-cadherin by LC and KC, whereas E-cadherin was not degraded by trypsin in the presence of Ca²⁺. LC expressed lower levels of E-cadherin after 3 d in culture. Thus, human LC, like murine LC, express the homophilic adhesion molecule E-cadherin, which may be important in establishing and maintaining interactions between LC and KC in mammalian epidermis.
Signorelli, Luca; Patcas, Raphael; Peltomäki, Timo; Schätzle, Marc
2016-01-01
The aim of this study was to determine radiation doses of different cone-beam computed tomography (CBCT) scan modes in comparison to a conventional set of orthodontic radiographs (COR) by means of phantom dosimetry. Thermoluminescent dosimeter (TLD) chips (3 × 1 × 1 mm) were used on an adult male tissue-equivalent phantom to record the distribution of the absorbed radiation dose. Three different scanning modes (i.e., portrait, normal landscape, and fast scan landscape) were compared to CORs [i.e., conventional lateral (LC) and posteroanterior (PA) cephalograms and digital panoramic radiograph (OPG)]. The following radiation levels were measured: 131.7, 91, and 77 μSv in the portrait, normal landscape, and fast landscape modes, respectively. The overall effective dose for a COR was 35.81 μSv (PA: 8.90 μSv; OPG: 21.87 μSv; LC: 5.03 μSv). Although one CBCT scan may replace all CORs, one set of CORs still entails 2-4 times less radiation than one CBCT. Depending on the scan mode, the radiation dose of a CBCT is about 3-6 times an OPG, 8-14 times a PA, and 15-26 times an LC. Finally, in order to fully reconstruct cephalograms including the cranial base and other important structures, the CBCT portrait mode must be chosen, rendering the difference in radiation exposure even clearer (131.7 vs. 35.81 μSv). Shielding radiation-sensitive organs can reduce the effective dose considerably. CBCT should not be recommended for use in all orthodontic patients as a substitute for a conventional set of radiographs. In CBCT, reducing the height of the field of view and shielding the thyroid are advisable methods and must be implemented to lower the exposure dose.
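The dose comparisons above reduce to simple ratios of the reported effective doses. A minimal sketch (all values taken from the abstract) reproduces the stated multiples:

```python
# Effective doses (uSv) reported in the abstract.
cbct = {"portrait": 131.7, "normal landscape": 91.0, "fast landscape": 77.0}
cor = {"PA": 8.90, "OPG": 21.87, "LC": 5.03}

# Sum of the conventional set; the abstract rounds this to 35.81 uSv.
cor_total = sum(cor.values())

for mode, dose in cbct.items():
    print(f"{mode}: {dose / cor_total:.1f}x a full COR set, "
          f"{dose / cor['OPG']:.1f}x an OPG, "
          f"{dose / cor['PA']:.1f}x a PA, "
          f"{dose / cor['LC']:.1f}x an LC")
```

Running this recovers the abstract's ranges: roughly 2-4 times a full COR set, 3-6 times an OPG, 8-14 times a PA, and 15-26 times an LC, depending on scan mode.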
Wang, Liang; Chen, Min; Yang, Jie; Zhang, Zhihong
2013-01-01
LC3 is a marker protein that is involved in the formation of autophagosomes and autolysosomes, which are usually characterized and monitored by fluorescence microscopy using fluorescent protein-tagged LC3 probes (FP-LC3). FP-LC3 and even endogenous LC3 can also be incorporated into intracellular protein aggregates in an autophagy-independent manner. However, the dynamic process of LC3 associated with autophagosomes and autolysosomes or protein aggregates in living cells remains unclear. Here, we explored the dynamic properties of the two types of FP-LC3-containing puncta using fluorescence microscopy techniques, including fluorescence recovery after photobleaching (FRAP) and fluorescence resonance energy transfer (FRET). The FRAP data revealed that the fluorescent signals of FP-LC3 attached to phagophores or in mature autolysosomes showed either minimal or no recovery after photobleaching, indicating that the dissociation of LC3 from the autophagosome membranes may be very slow. In contrast, FP-LC3 in the protein aggregates exhibited nearly complete recovery (more than 80%) and rapid kinetics of association and dissociation (half-time < 1 sec), indicating a rapid exchange occurs between the aggregates and cytoplasmic pool, which is mainly due to the transient interaction of LC3 and SQSTM1/p62. Based on the distinct dynamic properties of FP-LC3 in the two types of punctate structures, we provide a convenient and useful FRAP approach to distinguish autophagosomes from LC3-involved protein aggregates in living cells. Using this approach, we find the FP-LC3 puncta that adjacently localized to the phagophore marker ATG16L1 were protein aggregate-associated LC3 puncta, which exhibited different kinetics compared with that of autophagic structures. PMID:23482084
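The FRAP contrast described above, near-complete recovery for aggregate-associated LC3 versus minimal recovery for membrane-bound LC3, is conventionally summarized by a mobile-fraction estimate. The sketch below computes it with the standard formula (F_end − F_post)/(F_pre − F_post); the intensity values are hypothetical, chosen only to mirror the qualitative pattern reported:

```python
def mobile_fraction(f_pre, f_post, f_end):
    """Fraction of fluorescence that recovers after photobleaching.

    f_pre:  mean intensity before the bleach
    f_post: intensity immediately after the bleach
    f_end:  plateau intensity after recovery
    """
    return (f_end - f_post) / (f_pre - f_post)

# Aggregate-associated LC3: rapid, near-complete exchange (>80% recovery).
aggregate = mobile_fraction(f_pre=1.0, f_post=0.2, f_end=0.9)

# Membrane-bound LC3 on autophagosomes/autolysosomes: little or no recovery.
membrane = mobile_fraction(f_pre=1.0, f_post=0.2, f_end=0.25)

print(f"aggregate: {aggregate:.2f}, membrane: {membrane:.2f}")
```

A mobile fraction near 1 with a short recovery half-time indicates fast exchange with the cytoplasmic pool, the signature the authors use to distinguish aggregates from genuine autophagic structures.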
08-NIF Dedication: Zoe Lofgren
Congresswoman Zoe Lofgren
2017-12-09
The National Ignition Facility, the world's largest laser system, was dedicated at a ceremony on May 29, 2009 at Lawrence Livermore National Laboratory. These are the remarks by Congresswoman Zoe Lofgren, of California's 16th district.
11-NIF Dedication: Dianne Feinstein
U.S. Senator Dianne Feinstein
2017-12-09
The National Ignition Facility, the world's largest laser system, was dedicated at a ceremony on May 29, 2009 at Lawrence Livermore National Laboratory. These are the remarks by U.S. Senator Dianne Feinstein of California.
08-NIF Dedication: Zoe Lofgren
DOE Office of Scientific and Technical Information (OSTI.GOV)
Congresswoman Zoe Lofgren
2009-07-02
The National Ignition Facility, the world's largest laser system, was dedicated at a ceremony on May 29, 2009 at Lawrence Livermore National Laboratory. These are the remarks by Congresswoman Zoe Lofgren, of California's 16th district.
11-NIF Dedication: Dianne Feinstein
DOE Office of Scientific and Technical Information (OSTI.GOV)
U.S. Senator Dianne Feinstein
2009-07-02
The National Ignition Facility, the world's largest laser system, was dedicated at a ceremony on May 29, 2009 at Lawrence Livermore National Laboratory. These are the remarks by U.S. Senator Dianne Feinstein of California.
NASA Spacecraft Images Texas Wildfire
2012-05-15
The Livermore and Spring Ranch fires near the Davis Mountain Resort, Texas, burned 13,000 and 11,000 acres, respectively. When NASA's Terra spacecraft acquired this image on May 12, 2012, both fires had been contained.
Gao, Wentao; Chen, Zhixia; Wang, Wei; Stang, Michael T.
2013-01-01
p62 is constitutively degraded by autophagy via its interaction with LC3. However, the interaction of p62 with LC3 species in the context of the LC3 lipidation process is not specified. Further, the p62-mediated protein aggregation’s effect on autophagy is unclear. We systemically analyzed the interactions of p62 with all known Atg proteins involved in LC3 lipidation. We find that p62 does not interact with LC3 at the stages when it is being processed by Atg4B or when it is complexed or conjugated with Atg3. p62 does interact with LC3-I and LC3-I:Atg7 complex and is preferentially recruited by LC3-II species under autophagic stimulation. Given that Atg4B, Atg3 and LC3-Atg3 are indispensable for LC3-II conversion, our study reveals a protective mechanism for Atg4B, Atg3 and LC3-Atg3 conjugate from being inappropriately sequestered into p62 aggregates. Our findings imply that p62 could potentially impair autophagy by negatively affecting LC3 lipidation and contribute to the development of protein aggregate diseases. PMID:24023838
Gao, Wentao; Chen, Zhixia; Wang, Wei; Stang, Michael T
2013-01-01
p62 is constitutively degraded by autophagy via its interaction with LC3. However, the interaction of p62 with LC3 species in the context of the LC3 lipidation process is not specified. Further, the p62-mediated protein aggregation's effect on autophagy is unclear. We systemically analyzed the interactions of p62 with all known Atg proteins involved in LC3 lipidation. We find that p62 does not interact with LC3 at the stages when it is being processed by Atg4B or when it is complexed or conjugated with Atg3. p62 does interact with LC3-I and LC3-I:Atg7 complex and is preferentially recruited by LC3-II species under autophagic stimulation. Given that Atg4B, Atg3 and LC3-Atg3 are indispensable for LC3-II conversion, our study reveals a protective mechanism for Atg4B, Atg3 and LC3-Atg3 conjugate from being inappropriately sequestered into p62 aggregates. Our findings imply that p62 could potentially impair autophagy by negatively affecting LC3 lipidation and contribute to the development of protein aggregate diseases.
2002 Small Mammal Inventory at Lawrence Livermore National Laboratory, Site 300
DOE Office of Scientific and Technical Information (OSTI.GOV)
West, E; Woollett, J
2004-11-16
To assist the University of California in obtaining biological assessment information for the '2004 Environmental Impact Statement for Continued Operation of Lawrence Livermore National Laboratory (LLNL)', Jones & Stokes conducted an inventory of small mammals in six major vegetation communities at Site 300. These communities were annual grassland, native grassland, oak savanna, riparian corridor, coastal scrub, and seep/spring wetlands. The principal objective of this study was to assess the diversity and abundance of small mammal species in these communities, as well as the current status of any special-status small mammal species found in these communities. Surveys in the native grassland community were conducted before and after a controlled fire management burn of the grasslands to qualitatively evaluate any potential effects of fire on small mammals in the area.
Adaptive Optics at Lawrence Livermore National Laboratory
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gavel, D T
2003-03-10
Adaptive optics enables high-resolution imaging through the atmosphere by correcting for the turbulent air's aberrations to the light waves passing through it. Lawrence Livermore National Laboratory has for a number of years been at the forefront of applying adaptive optics technology to astronomy on the world's largest astronomical telescopes, in particular at the Keck 10-meter telescope on Mauna Kea, Hawaii. The technology includes the development of high-speed electrically driven deformable mirrors, high-speed low-noise CCD sensors, and real-time wavefront reconstruction and control hardware. Adaptive optics finds applications in many other areas where light beams pass through aberrating media and must be corrected to maintain diffraction-limited performance. We describe systems and results in astronomy, medicine (vision science), and horizontal path imaging, all active programs in our group.
The next phase of the Axion Dark Matter eXperiment
NASA Astrophysics Data System (ADS)
Carosi, Gianpaolo; Asztalos, S.; Hagmann, C.; Kinion, D.; van Bibber, K.; Hotz, M.; Lyapustin, D.; Rosenberg, L.; Rybka, G.; Wagner, A.; Hoskins, J.; Martin, C.; Sikivie, P.; Sullivan, N.; Tanner, D.; Bradley, R.; Clarke, J.; ADMX Collaboration
2011-04-01
Axions are a well-motivated dark matter candidate that may be detected by their resonant conversion to photons in the presence of a large static magnetic field. The Axion Dark Matter eXperiment recently finished a search for DM axions using a new ultralow-noise microwave receiver based on a SQUID amplifier. The success of this precursor experiment has paved the way for a definitive axion search which will see the system noise temperature lowered from 1.8 K to 100 mK, dramatically increasing sensitivity to even pessimistic axion models as well as increasing scan speed. Here we discuss the implementation of this next experimental phase. This work was performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Security, LLC, Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Taffet, Michael J.; Esser, Bradley K.; Madrid, Victor M.
This report summarizes work performed by Lawrence Livermore National Laboratory (LLNL) under Navajo Nation Services Contract CO9729 in support of the Navajo Abandoned Mine Lands Reclamation Program (NAMLRP). Due to restrictions on access to uranium mine waste sites at Tse Tah, Arizona that developed during the term of the contract, not all of the work scope could be performed. LLNL was able to interpret environmental monitoring data provided by NAMLRP. Summaries of these data evaluation activities are provided in this report. Additionally, during the contract period, LLNL provided technical guidance, instructional meetings, and review of relevant work performed by NAMLRP and its contractors that was not contained in the contract work scope.
Double-shell target fabrication workshop-2016 report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wang, Y. Morris; Oertel, John; Farrell, Michael
On June 30, 2016, over 40 representatives from Lawrence Livermore National Laboratory (LLNL), Los Alamos National Laboratory (LANL), General Atomics (GA), the Laboratory for Laser Energetics (LLE), Schafer Corporation, and NNSA headquarters attended a double-shell (DS) target fabrication workshop at Livermore, California. Pushered-single-shell (PSS) and DS metal-gas platforms potentially have a large impact on programmatic applications. The goal of this focused workshop was to bring together target fabrication scientists, physicists, and designers to brainstorm future PSS and DS target fabrication needs and strategies. This one-day workshop was intended to give an overall view of historical information, recent approaches, and future research activities at each participating organization. Five topical areas were discussed that are vital to the success of future DS target fabrication, including inner metal shells, foam spheres, outer ablators, fill tube assembly, and metrology.
NASA Astrophysics Data System (ADS)
Elgin, L.; Handy, T.; Malamud, G.; Huntington, C. M.; Trantham, M. R.; Klein, S. R.; Kuranz, C. C.; Drake, R. P.; Shvarts, D.
2017-10-01
Potential flow models predict that a Rayleigh-Taylor unstable system will reach a terminal velocity (and constant Froude number) at low Atwood numbers. Numerical simulations predict a re-acceleration phase of Rayleigh-Taylor instability (RTI) and higher Froude number at late times. To observe this effect, we are conducting a series of experiments at OMEGA 60 to measure single-mode RTI growth at low and high Atwood numbers and late times. X-ray radiographs spanning 40+ ns capture the evolution of these systems. Experimental design challenges and initial results are discussed here. This work is funded by the Lawrence Livermore National Laboratory under subcontract B614207, and was performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract No. DE-AC52-07NA27344.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pagoria, P.; Racoveanu, A.
This was a collaborative effort between Lawrence Livermore National Security, LLC, as manager and operator of Lawrence Livermore National Laboratory (LLNL), and Physical Sciences, Inc. (PSI), to develop a synthesis of two novel energetic heterocyclic oxidizers as possible replacements for ammonium perchlorate (AP) in rocket propellant formulations. This CRADA resulted from the award of a Phase I Small Business Technology Transfer (STTR) grant from DOD. The CRADA consisted of two phases. The goal of Phase 1 was to produce a new oxidizer called TNMDNP. Phase 2 was optional, to be performed contingent on the successful results of Phase 1, with the goal of producing a new oxidizer called TNMDNT.
Acha, Robert; Brey, Richard; Capello, Kevin
2013-02-01
A torso phantom was developed by the Lawrence Livermore National Laboratory (LLNL) that serves as a standard for intercomparison and intercalibration of detector systems used to measure low-energy photons from radionuclides, such as americium deposited in the lungs. DICOM images of the second-generation Human Monitoring Laboratory-Lawrence Livermore National Laboratory (HML-LLNL) torso phantom were segmented and converted into three-dimensional (3D) voxel phantoms to simulate the response of high purity germanium (HPGe) detector systems, as found in the HML's new lung counter, using a Monte Carlo technique. The photon energies of interest in this study were 17.5, 26.4, 45.4, 59.5, 122, 244, and 344 keV. The detection efficiencies at these photon energies were predicted for different chest wall thicknesses (1.49 to 6.35 cm) and compared to measured values obtained with lungs containing ²⁴¹Am (34.8 kBq) and ¹⁵²Eu (10.4 kBq). It was observed that no statistically significant differences exist at the 95% confidence level between the mean values of simulated and measured detection efficiencies. Comparisons between the simulated and measured detection efficiencies reveal a variation of 20% at 17.5 keV and 1% at 59.5 keV. It was found that small changes in the formulation of the tissue substitute material caused no significant change in the outcome of Monte Carlo simulations.
Classification of lung cancer histology by gold nanoparticle sensors
Barash, Orna; Peled, Nir; Tisch, Ulrike; Bunn, Paul A.; Hirsch, Fred R.; Haick, Hossam
2016-01-01
We propose a nanomedical device for the classification of lung cancer (LC) histology. The device profiles volatile organic compounds (VOCs) in the headspace of (subtypes of) LC cells, using gold nanoparticle (GNP) sensors that are suitable for detecting LC-specific patterns of VOC profiles, as determined by gas chromatography–mass spectrometry analysis. Analyzing the GNP sensing signals by support vector machine allowed significant discrimination between (i) LC and healthy cells; (ii) small cell LC and non–small cell LC; and between (iii) two subtypes of non–small cell LC: adenocarcinoma and squamous cell carcinoma. The discriminative power of the GNP sensors was then linked with the chemical nature and composition of the headspace VOCs of each LC state. These proof-of-concept findings could totally revolutionize LC screening and diagnosis, and might eventually allow early and differential diagnosis of LC subtypes with detectable or unreachable lung nodules. PMID:22033081
Toxicity of botanical formulations to nursery-infesting white grubs (Coleoptera: Scarabaeidae).
Ranger, Christopher M; Reding, Michael E; Oliver, Jason B; Moyseenko, James J; Youssef, Nadeer N
2009-02-01
The toxicity of eight botanically based biopesticides was evaluated against third instars of the scarab larvae (Coleoptera: Scarabaeidae) Popillia japonica Newman, Rhizotrogus majalis (Razoumowsky), Anomala orientalis Waterhouse, and Cyclocephala borealis Arrow. Soil dip bioassays were used to obtain concentration-mortality data 7 d after treatment of larvae, leading to the calculation of LC50 and LC90 values. A wide range of LC50 and LC90 values was exhibited among the formulations. The product Armorex was one of the most active formulations against P. japonica (LC50 = 0.42 ml/liter), R. majalis (LC50 = 0.48 ml/liter), A. orientalis (LC50 = 0.39 ml/liter), and C. borealis (LC50 = 0.49 ml/liter). Armorex is composed of extracts from diverse botanical sources, including 84.5% sesame oil, 2.0% garlic oil, 2.0% clove oil, 1.0% rosemary oil, and 0.5% white pepper extract. The product Azatin, composed of 3% azadirachtin, also exhibited high toxicity to P. japonica (LC50 = 1.13 ml/liter), R. majalis (LC50 = 0.81 ml/liter), and A. orientalis (LC50 = 1.87 ml/liter). Veggie Pharm is also composed of extracts from diverse sources, but this product showed the lowest toxicity to P. japonica (LC50 = 35.19 ml/liter), R. majalis (LC50 = 62.10 ml/liter), A. orientalis (LC50 = 43.76 ml/liter), and C. borealis (LC50 = 50.24 ml/liter). These results document the potential for botanical formulations to control white grubs, but blending extracts from diverse botanical sources does not ensure enhanced biological activity.
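The concentration-mortality analysis behind LC50/LC90 values like those above reduces, in its simplest form, to locating the dose at which mortality crosses 50% (or 90%). The sketch below uses linear interpolation on a log10 dose scale as a simplified stand-in for the probit analysis typically used in such studies; the dose-response numbers are hypothetical, not data from this paper:

```python
import numpy as np

def lc_value(concs, mortality, level=0.5):
    """Interpolate the concentration producing a target mortality fraction.

    A simplified stand-in for probit analysis: mortality is interpolated
    linearly on a log10 concentration scale. `concs` (e.g. ml/liter) must be
    sorted ascending with monotonically increasing mortality fractions.
    """
    logc = np.log10(np.asarray(concs, dtype=float))
    m = np.asarray(mortality, dtype=float)
    return float(10 ** np.interp(level, m, logc))

# Hypothetical dose-response data (illustrative only)
concs = [0.1, 0.3, 1.0, 3.0]        # ml/liter
mortality = [0.05, 0.30, 0.70, 0.95]  # fraction dead at 7 d

lc50 = lc_value(concs, mortality, 0.5)
lc90 = lc_value(concs, mortality, 0.9)
```

Real bioassay software fits a probit or logit model with confidence limits; interpolation only conveys the basic idea of reading a dose off the fitted mortality curve.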
Achari, Arunkumar E; Jain, Sushil K
2017-09-15
Diabetic patients have lower blood levels of l-cysteine (LC) and glutathione (GSH). This study examined the hypothesis that LC supplementation positively upregulates the effects of insulin on GSH and glucose metabolism in the 3T3-L1 adipocyte model. 3T3-L1 adipocytes were treated with LC (250 μM, 2 h) and/or insulin (15 or 30 nM, 2 h), and high glucose (HG, 25 mM, 20 h). Results showed that HG caused a significant increase (95%) in ROS and reductions in the protein levels of DsbA-L (43%), adiponectin (64%), GCLC (20%), GCLM (21%), GSH (50%), and GLUT-4 (23%) in adipocytes. Furthermore, HG caused a reduction in total (35%) and HMW adiponectin (30%) secretion. Treatment with insulin alone significantly (p < 0.05) reduced ROS levels, increased DsbA-L, adiponectin, GCLC, GCLM, GSH, and GLUT-4 protein levels and glucose utilization, and improved total and HMW adiponectin secretion in HG-treated adipocytes compared to HG alone. Interestingly, LC supplementation along with insulin caused a greater reduction in ROS levels and significantly (p < 0.05) boosted DsbA-L (41% vs LC, 29% vs insulin) and adiponectin (92% vs LC, 84% vs insulin) protein levels as well as total (32% vs LC, 22% vs insulin) and HMW adiponectin (75% vs LC, 39% vs insulin) secretion compared with either insulin or LC alone in HG-treated cells. In addition, LC supplementation along with insulin increased GCLC (21% vs LC, 14% vs insulin), GCLM (28% vs LC, 16% vs insulin), and GSH (25% vs LC and insulin) levels compared with either insulin or LC alone in HG-treated cells. Furthermore, LC and insulin increased GLUT-4 protein expression (65% vs LC, 18% vs insulin) and glucose utilization (57% vs LC, 27% vs insulin) compared with either insulin or LC alone in HG-treated cells. Similarly, LC supplementation increased insulin action significantly in cells maintained in medium containing control glucose.
To explore whether the beneficial effect of LC is mediated by the upregulation of GCLC, we knocked down GCLC using siRNA in adipocytes. There was a significant decrease in DsbA-L and GLUT-4 mRNA levels and GSH levels in GCLC-knockdown adipocytes, and LC supplementation upregulated GCLC, DsbA-L, and GLUT-4 mRNA expression and GSH levels in GCLC-knockdown cells. These results demonstrated that LC along with insulin increases GSH levels, thereby improving adiponectin secretion and glucose utilization in adipocytes. This suggests that LC supplementation can increase insulin sensitivity and can be used as an adjuvant therapy for diabetes. Copyright © 2017. Published by Elsevier Inc.
Movements and Spatial Use of False Killer Whales in Hawaii: Satellite Tagging Studies in 2009
2011-02-07
… with estimated errors of between 500 and 1,500 m), as well as LC0, LCA, LCB, and LCZ locations (with no estimate of accuracy), were only retained … for each individual that passed the Douglas Argos-Filter, by location class (LC). [Table residue: number of locations after filtering, per tag ID, by location class: LC3, LC2, LC1, LC0, LCA, LCB, LCZ.]
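The class-based retention step described in this fragment can be sketched as a simple filter over Argos location fixes. This is only the location-class part of the workflow; the actual Douglas Argos-Filter applies further plausibility checks (travel speed, turning angle, redundant fixes). The tag IDs and fixes below are hypothetical:

```python
# Argos satellite-telemetry location classes, ordered from most to least
# accurate. LC3-LC0 carry error estimates; LCA, LCB, and LCZ do not.
ARGOS_CLASSES = ["LC3", "LC2", "LC1", "LC0", "LCA", "LCB", "LCZ"]

def filter_by_class(fixes, keep=("LC3", "LC2", "LC1")):
    """Keep only location fixes whose class is in `keep`. A toy stand-in for
    the Douglas Argos-Filter, which also applies plausibility checks
    (speed, turning angle, redundancy) beyond location class alone."""
    return [fix for fix in fixes if fix["lc"] in keep]

# Hypothetical fixes for one tagged whale
fixes = [
    {"id": "tag01", "lc": "LC3"},
    {"id": "tag01", "lc": "LCB"},
    {"id": "tag01", "lc": "LC1"},
    {"id": "tag01", "lc": "LCZ"},
]
good = filter_by_class(fixes)
```

Tallying the retained fixes per individual and per class reproduces the kind of summary table the fragment's residue refers to.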
Risk factors and serological markers of liver cirrhosis after Fontan procedure.
Shimizu, Mikiko; Miyamoto, Kenji; Nishihara, Yunosuke; Izumi, Gaku; Sakai, Shuji; Inai, Kei; Nishikawa, Toshio; Nakanishi, Toshio
2016-09-01
Liver cirrhosis (LC), which may result in hepatic failure or cancer, has been reported in patients after the Fontan procedure. The purpose of this study was to clarify the frequency and histological characteristics of LC, and to evaluate the risk factors and serological markers of LC with Fontan circulation. Retrospective review of contrast-enhanced CT scans (CT) of the liver was carried out in 57 patients after Fontan procedure. Patients were divided into two groups: LC group (n = 31) and no-LC group (n = 26). Age at Fontan procedure, duration after Fontan procedure, catheterization data, and history of failing Fontan circulation were compared between groups. Serological data including γ-GTP and hyaluronic acid were compared. Histology of autopsy specimens was assessed when available. Duration after Fontan procedure was significantly longer in the LC group than the no-LC group. History of failing Fontan circulation was more frequent in the LC group than in the no-LC group. There was no correlation between type of procedure (APC/Bjork/lateral tunnel/TCPC) and LC in this series. Serum hyaluronic acid, γ-GTP, and Forns index were significantly higher in the LC group. The significant risk factor for LC was duration after Fontan procedure (>20 years). In autopsy specimens, histopathological changes of LC were observed predominantly in the central venous area. LC diagnosed with CT is frequent in patients long after Fontan procedure, especially after 20 years. Hyaluronic acid and γ-GTP could be useful markers to monitor the progression of liver fibrosis in Fontan patients.
Scheltema, Richard A; Mann, Matthias
2012-06-01
With the advent of high-throughput mass spectrometry (MS)-based proteomics, the magnitude and complexity of the performed experiments has increased dramatically. Likewise, investments in chromatographic and MS instrumentation are a large proportion of the budget of proteomics laboratories. Guarding measurement quality and maximizing uptime of the LC-MS/MS systems therefore requires constant care despite automated workflows. We describe a real-time surveillance system, called SprayQc, that continuously monitors the status of the peripheral equipment to ensure that operational parameters are within an acceptable range. SprayQc is composed of multiple plug-in software components that use computer vision to analyze electrospray conditions, monitor the chromatographic device for stable backpressure, interact with a column oven to control pressure by temperature, and ensure that the mass spectrometer is still acquiring data. Action is taken when a failure condition has been detected, such as stopping the column oven and the LC flow, as well as automatically notifying the appropriate operator. Additionally, all defined metrics can be recorded synchronized on retention time with the MS acquisition file, allowing for later inspection and providing valuable information for optimization. SprayQc has been extensively tested in our laboratory, supports third-party plug-in development, and is freely available for download from http://sourceforge.org/projects/sprayqc.
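SprayQc's plug-in design, in which each component watches one operational parameter and an out-of-range reading triggers corrective action and notification, can be sketched as below. All names, thresholds, and readings are hypothetical illustrations of the pattern, not SprayQc's actual API:

```python
from dataclasses import dataclass
from typing import Callable, Dict, List, Tuple

@dataclass
class Monitor:
    """One plug-in check, loosely analogous to a SprayQc component
    (names and thresholds hypothetical)."""
    name: str
    read: Callable[[], float]   # poll the current metric value
    low: float                  # lower bound of acceptable range
    high: float                 # upper bound of acceptable range

def poll_all(monitors: List[Monitor],
             on_failure: Callable[[str, float], None]) -> Dict[str, Tuple[float, bool]]:
    """Poll every monitor once; invoke the failure callback whenever a metric
    leaves its acceptable range, mirroring the stop-and-notify step."""
    status = {}
    for mon in monitors:
        value = mon.read()
        ok = mon.low <= value <= mon.high
        status[mon.name] = (value, ok)
        if not ok:
            on_failure(mon.name, value)
    return status

# Example: a simulated LC backpressure reading outside its window
alerts = []
monitors = [Monitor("backpressure_bar", lambda: 310.0, 150.0, 280.0)]
status = poll_all(monitors, lambda name, v: alerts.append((name, v)))
```

In the real system each poll would also be timestamped against retention time so failures can be correlated with the MS acquisition file afterwards.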
DOE Office of Scientific and Technical Information (OSTI.GOV)
McDonald, Rachel; Probyn, Linda; Poon, Ian
Purpose: To evaluate the applicability of the Response Evaluation Criteria in Solid Tumors (RECIST 1.1) and University of Texas MD Anderson (MDA) Cancer Center criteria in the setting of stereotactic body radiation therapy (SBRT) to nonspine bone metastases. Methods: Patients who were treated with SBRT to nonspine bone metastases were identified by retrospective chart review. An independent musculoskeletal radiologist evaluated response to treatment using computed tomography (CT) scans. Results: Thirty-three patients were treated to 42 nonspine bone metastases. The most common primary cancer sites were renal cell carcinoma (RCC) (33.3%), lung (24.2%), and prostate (18.2%). Bone metastases were either mainly lytic (57.1%), mainly sclerotic (28.6%), or mixed (14.3%). When lytic and sclerotic lesions were evaluated according to RECIST 1.1, local control (LC) was 83%, 85%, 88%, and 80% for those with CT imaging between months 1 to 3, 4 to 6, 7 to 9, and 10 to 12, respectively. When evaluated by the MDA criteria by density, LC within each time period was slightly greater. Overall LC decreased considerably when evaluated by MDA in terms of size. Conclusions: Consensus definitions of response are required, as they have implications for clinical trials and disease management. Without consistent response criteria, outcomes from clinical trials cannot be compared and treatment efficacy remains undetermined.
Image analysis tools and emerging algorithms for expression proteomics
English, Jane A.; Lisacek, Frederique; Morris, Jeffrey S.; Yang, Guang-Zhong; Dunn, Michael J.
2012-01-01
Since their origins in academic endeavours in the 1970s, computational analysis tools have matured into a number of established commercial packages that underpin research in expression proteomics. In this paper we describe the image analysis pipeline for the established 2-D Gel Electrophoresis (2-DE) technique of protein separation, and by first covering signal analysis for Mass Spectrometry (MS), we also explain the current image analysis workflow for the emerging high-throughput ‘shotgun’ proteomics platform of Liquid Chromatography coupled to MS (LC/MS). The bioinformatics challenges for both methods are illustrated and compared, whilst existing commercial and academic packages and their workflows are described from both a user’s and a technical perspective. Attention is given to the importance of sound statistical treatment of the resultant quantifications in the search for differential expression. Despite wide availability of proteomics software, a number of challenges have yet to be overcome regarding algorithm accuracy, objectivity and automation, generally due to deterministic spot-centric approaches that discard information early in the pipeline, propagating errors. We review recent advances in signal and image analysis algorithms in 2-DE, MS, LC/MS and Imaging MS. Particular attention is given to wavelet techniques, automated image-based alignment and differential analysis in 2-DE, Bayesian peak mixture models and functional mixed modelling in MS, and group-wise consensus alignment methods for LC/MS. PMID:21046614
Yan, Zhengyin; Maher, Noureddine; Torres, Rhoda; Cotto, Carlos; Hastings, Becki; Dasgupta, Malini; Hyman, Rolanda; Huebert, Norman; Caldwell, Gary W
2008-07-01
In addition to matrix effects, common interferences observed in liquid chromatography/tandem mass spectrometry (LC/MS/MS) analyses can be caused by the response of drug-related metabolites to the multiple reaction monitoring (MRM) channel of a given drug, as a result of in-source reactions or decomposition of either phase I or II metabolites. However, it has been largely ignored that, for some drugs, metabolism can lead to the formation of isobaric or isomeric metabolites that exhibit the same MRM transitions as the parent drugs. The present study describes two examples demonstrating that interference caused by isobaric or isomeric metabolites is a practical issue in analyzing biological samples by LC/MS/MS. In the first case, two sequential metabolic reactions, demethylation followed by oxidation of a primary alcohol moiety to a carboxylic acid, produced an isobaric metabolite that exhibits an MRM transition identical to the parent drug. Because the drug compound was rapidly metabolized in rats and completely disappeared in plasma samples, the isobaric metabolite appeared as a single peak in the total ion current (TIC) trace and could easily be quantified as the drug, since it was eluted at a retention time very close to that of the drug in a 12-min LC run. In the second example, metabolism via ring-opening of a substituted isoxazole moiety led to the formation of an isomeric product that showed an almost identical collision-induced dissociation (CID) MS spectrum to the original drug. Because the two components were co-eluted, the isomeric product could be mistakenly quantified and reported by data processing software as the parent drug if the TIC trace was not carefully inspected. Nowadays, all LC/MS data are processed by computer software in a highly automated fashion, and some analysts may spend much less time visually examining raw TIC traces than they used to.
Two examples described in this article remind us that quality data require both adequate chromatographic separations and close examination of raw data in LC/MS/MS analyses of drugs in biological matrix.
Brahme, Anders; Nyman, Peter; Skatt, Björn
2008-05-01
A four-dimensional (4D) laser camera (LC) has been developed for accurate patient imaging in diagnostic and therapeutic radiology. A complementary metal-oxide semiconductor camera images the intersection of a scanned fan-shaped laser beam with the surface of the patient and allows real-time recording of movements in a three-dimensional (3D) or four-dimensional (4D) format (3D + time). The LC system was first designed as an accurate patient setup tool during diagnostic and therapeutic applications but was found to be of much wider applicability as a general 4D photon "tag" for the surface of the patient in different clinical procedures. It is presently used as a 3D or 4D optical benchmark or tag for accurate delineation of the patient surface, as demonstrated for patient auto setup and breathing and heart motion detection. Furthermore, its future potential applications in gating, adaptive therapy, 3D or 4D image fusion between most imaging modalities, and image processing are discussed. It is shown that the LC system has a geometrical resolution of about 0.1 mm and that the rigid-body repositioning accuracy is about 0.5 mm for displacements below 20 mm, 1 mm below 40 mm, and better than 2 mm at 70 mm. This indicates a slight need for repeated repositioning when the initial error is larger than about 50 mm. The positioning accuracy with standard patient setup procedures for prostate cancer at Karolinska was found to be about 5-6 mm when independently measured using the LC system. The system was found valuable for positron emission tomography-computed tomography (PET-CT) in vivo tumor and dose delivery imaging, where it potentially may allow effective correction for breathing artifacts in 4D PET-CT and image fusion with lymph node atlases for accurate target volume definition in oncology.
With a LC system in all imaging and radiation therapy rooms, auto setup during repeated diagnostic and therapeutic procedures may save around 5 min per session, increase accuracy and allow efficient image fusion between all imaging modalities employed.
Method of preparing a tunable-focus liquid-crystal (LC) lens
NASA Astrophysics Data System (ADS)
Li, Xiaolong; Zhou, Zuowei; Ren, Hongwen
2018-02-01
A liquid crystal (LC) lens is prepared by controlling the alignment of an LC using a homogeneous polyimide (PI) layer and a homeotropic PI layer. The rubbed homogeneous PI layer has a concave surface and the homeotropic PI layer is flat. The LC sandwiched between the two PI layers adopts a hybrid alignment, which has the largest gradient of refractive index (GRIN) distribution. The LC layer exhibits a lens character because of its convex shape. Since the effective refractive index of the LC is larger than that of the homogeneous PI, the LC lens focuses light with its shortest focal length in the voltage-off state. By applying an external voltage, the LC molecules can be reoriented along the electric field. As a result, the focal length of the LC lens is increased. The focal length of the LC lens can be tuned from 30 to 120 μm when the voltage is changed from 0 to 7 Vrms. This LC lens has the advantages of no threshold, low operating voltage, and simple fabrication.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Niven, W.A.
The long-term position accuracy of an inertial navigation system depends primarily on the ability of the gyroscopes to maintain a near-perfect reference orientation. Small imperfections in the gyroscopes cause them to drift slowly away from their initial orientation, thereby producing errors in the system's calculations of position. A3FIX is a computer program subroutine developed to estimate inertial navigation system gyro drift rates with the navigator stopped or moving slowly. It processes data on the navigation system's position error to arrive at estimates of the north-south and vertical gyro drift rates. It also computes changes in the east-west gyro drift rate if the navigator is stopped and if data on the system's azimuth error changes are also available. The report describes the subroutine and its capabilities, and gives examples of gyro drift rate estimates that were computed during the testing of a high-quality inertial system under the PASSPORT program at the Lawrence Livermore Laboratory. The appendices provide mathematical derivations of the estimation equations used in the subroutine, a discussion of the estimation errors, and a program listing and flow diagram. The appendices also contain a derivation of closed-form solutions to the navigation equations to clarify the effects that motion and time-varying drift rates induce in the phase-plane relationships between the Schuler-filtered errors in latitude and azimuth and between the Schuler-filtered errors in latitude and longitude.
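In the simplest view, a constant gyro drift rate shows up as a steady growth of position error, so a least-squares slope fitted to the error history gives a first-cut estimate. The sketch below illustrates only that basic idea with hypothetical numbers; the real A3FIX subroutine works with Schuler-filtered latitude, longitude, and azimuth errors and is considerably more elaborate:

```python
import numpy as np

def estimate_drift_rate(times_h, position_error_nmi):
    """Least-squares slope of position error versus time: a toy stand-in for
    A3FIX-style drift-rate estimation (the actual subroutine models
    Schuler-filtered errors; this shows only the underlying intuition)."""
    slope, _intercept = np.polyfit(times_h, position_error_nmi, 1)
    return float(slope)  # error growth rate, e.g. nautical miles per hour

# Hypothetical position-error history sampled hourly (illustrative only)
times = [0.0, 1.0, 2.0, 3.0, 4.0]
errors = [0.02, 0.51, 1.01, 1.49, 2.02]
drift = estimate_drift_rate(times, errors)
```

A stationary navigator makes this easier because true position is known exactly, so the entire recorded "position" change is attributable to system error.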
el-Sayed, Kamelia Abass
2006-08-01
The aqueous extract of the sea anemone Parasicyonis actinostoloides showed a molluscicidal effect against vector snails of Schistosoma haematobium and Fasciola gigantica after 24 hours of exposure. LC50 and LC90 values for P. actinostoloides were 40 and 78.6 ppm for Bulinus truncatus and 46.6 and 86.5 ppm for Lymnaea natalensis, respectively. The effect of continuous exposure of B. truncatus and L. natalensis to sublethal aqueous extract concentrations (LC0, LC10, and LC25) on survival rate, egg production, and susceptibility to infection with S. haematobium and F. gigantica miracidia was studied. The data showed that no B. truncatus survived more than 42, 32, and 27 days after exposure, with mean life spans of 18.5, 13.3, and 11.1 days, respectively. The death rate of B. truncatus with LC0 was highly significant compared to treatment with LC10 and LC25 (p < 0.01). L. natalensis were more susceptible to the effect of the aqueous extract than B. truncatus: LC0, LC10, and LC25 extract killed all L. natalensis within 32, 27, and 22 days. The mean life span of those exposed to LC0 was 12.37 days, highly significant when compared with LC10- and LC25-treated ones (p < 0.01). The cumulative mortality rates of B. truncatus and L. natalensis in controls during the experimental study (52 days) were 60% and 75%, respectively. Egg production of B. truncatus and L. natalensis was affected by sublethal concentrations: control snails laid significantly more eggs than treated ones. B. truncatus stopped egg laying 17 days after exposure to LC25; those treated with LC10 and LC0 ceased to deposit eggs after 22 and 27 days, respectively. The percent reduction in egg-laying capacity of B. truncatus treated with LC0, LC10, and LC25 compared to controls was 77.1%, 93.2%, and 92.8%, respectively (p < 0.01).
A similar reduction occurred in egg production of treated L. natalensis: the percent reduction for snails treated with LC0, LC10, and LC25 relative to controls was 78.4%, 92.4%, and 94.7%, respectively. Sublethal concentrations of the aqueous extract of P. actinostoloides affected hatchability of B. truncatus and L. natalensis eggs. The data showed that eggs of B. truncatus and L. natalensis can hatch at all tested concentrations but at different rates. Hatchability of 5-day-old eggs exposed to LC0, LC10, and LC25 extract was 44%, 38%, and 30% in B. truncatus, respectively. In L. natalensis eggs, the corresponding rates were lower: 28%, 24%, and 18%, respectively. Infection of B. truncatus and L. natalensis with S. haematobium and F. gigantica miracidia was greatly reduced by the sublethal concentrations of the aqueous extract of P. actinostoloides. The reduction of infection rate increased with increasing sublethal concentration. In B. truncatus the reduction was 43.2%, 57.6%, and 76.6% compared to controls, and in L. natalensis it was 56.3%, 70.2%, and 77.4%, respectively.
Experiments and simulations of flux rope dynamics in a plasma
NASA Astrophysics Data System (ADS)
Intrator, Thomas; Abbate, Sara; Ryutov, Dmitri
2005-10-01
The behavior of flux ropes is a key issue in solar, space, and astrophysics. For instance, magnetic fields and currents on the Sun are sheared and twisted as they store energy, experience an as-yet-unidentified instability, open into interplanetary space, eject the plasma trapped in them, and cause a flare. The Reconnection Scaling Experiment (RSX) provides a simple means to systematically characterize the linear and non-linear evolution of driven, dissipative, unstable plasma-current filaments. Topology evolves in three dimensions, supports multiple modes, and can bifurcate to quasi-helical equilibria. The ultimate saturation to a nonlinear force and energy balance is the link to a spectrum of relaxation processes. RSX has adjustable energy density (from β << 1 to β ~ 1), non-negligible equilibrium plasma flows, driven steady-state scenarios, and adjustable line tying at boundaries. We will show the magnetic structure of a kinking, rotating, single line-tied column, magnetic reconnection between two flux ropes, and pictures of three braided flux ropes. We use computed simulation movies to bridge the gap between solar physics scales and experimental data with computational modeling. In collaboration with Ivo Furno, Tsitsi Madziwa-Nussinov, Giovanni Lapenta, and Adam Light, Los Alamos National Laboratory; Sara Abbate, Torino Polytecnico; and Dmitri Ryutov, Lawrence Livermore National Laboratory.
Functional group interactions with single wall carbon NT studied by ab-initio calculations
NASA Astrophysics Data System (ADS)
Cicero, Giancarlo
2005-03-01
With the goal of designing functionalized nanotube materials, recent AFM measurements have succeeded in determining the force between individual chemical groups and single-wall carbon nanotubes (SWCNT) [1]. In order to rationalize and understand these experimental results, we have performed Density Functional Theory calculations for a number of structural arrangements of model tips functionalized with the same groups as those used experimentally. Our calculations include full geometry optimization of the composite SWCNT/tip system as well as `pulling-out' simulations to compute interaction forces. We considered (14,0) semiconducting tubes, and AFM tips were modeled by a SiH3CH2-X molecule, with X representing -CN, -CH3, -NH2, or -CH2OCH2. As X is varied, computed forces reproduce the same trend as that observed experimentally when n-doped SWCNT are considered; significantly different trends are observed for neutral and p-doped tubes. We propose that the polar solvent present in the experimental setup may be responsible for the n-doping of the nanotube suggested by our calculations. This work was performed under the auspices of the U.S. Department of Energy by University of California Lawrence Livermore National Laboratory under contract No. W-7405-Eng-48. [1] M.C. LeMieux et al., preprint
The ASCI Network for SC '99: A Step on the Path to a 100 Gigabit Per Second Supercomputing Network
DOE Office of Scientific and Technical Information (OSTI.GOV)
PRATT,THOMAS J.; TARMAN,THOMAS D.; MARTINEZ,LUIS M.
2000-07-24
This document highlights the DISCOM² (Distance Computing and Communication) team's activities at the 1999 Supercomputing conference in Portland, Oregon. This conference is sponsored by the IEEE and ACM. Sandia, Lawrence Livermore, and Los Alamos National Laboratories have participated in this conference for eleven years. For the last four years the three laboratories have come together at the conference under the DOE's ASCI (Accelerated Strategic Computing Initiative) rubric. Communication support for the ASCI exhibit is provided by the ASCI DISCOM² project. The DISCOM² communication team uses this forum to demonstrate and focus communication and networking developments within the community. At SC 99, DISCOM built a prototype of the next-generation ASCI network, demonstrated remote clustering techniques, demonstrated the capabilities of emerging terabit router products, demonstrated the latest technologies for delivering visualization data to scientific users, and demonstrated the latest in encryption methods, including IP VPN technologies and ATM encryption research. The authors also coordinated the other production networking activities within the booth and between their demonstration partners on the exhibit floor. This paper documents those accomplishments, discusses the details of their implementation, and describes how these demonstrations support Sandia's overall strategies in ASCI networking.
Zecca, Luigi; Stroppolo, Antonella; Gatti, Alberto; Tampellini, Davide; Toscani, Marco; Gallorini, Mario; Giaveri, Giuseppe; Arosio, Paolo; Santambrogio, Paolo; Fariello, Ruggero G.; Karatekin, Erdem; Kleinman, Mark H.; Turro, Nicholas; Hornykiewicz, Oleh; Zucca, Fabio A.
2004-01-01
In this study, a comparative analysis of metal-related neuronal vulnerability was performed in two brainstem nuclei, the locus coeruleus (LC) and substantia nigra (SN), known targets of the etiological noxae in Parkinson's disease and related disorders. LC and SN pars compacta neurons both degenerate in Parkinson's disease and other Parkinsonisms; however, LC neurons are comparatively less affected and with a variable degree of involvement. In this study, iron, copper, and their major molecular forms, such as ferritins, ceruloplasmin, neuromelanin (NM), manganese-superoxide dismutase (SOD), and copper/zinc-SOD, were measured in the LC and SN of normal subjects at different ages. Iron content in the LC was much lower than that in the SN, and the ratio of heavy-chain ferritin to iron in the LC was higher than in the SN. The NM concentration was similar in LC and SN, but the iron content in the NM of the LC was much lower than in the SN. In both regions, heavy- and light-chain ferritins were present only in glia and were not detectable in neurons. These data suggest that in LC neurons, iron mobilization and toxicity are lower than in the SN and are efficiently buffered by NM. The greater damage occurring in the SN could be related to its higher content of iron. Ferritins accomplish the same iron-buffering function in glial cells. Ceruloplasmin levels were similar in LC and SN, but copper was higher in the LC. Moreover, the copper content in the NM of the LC was higher than that of the SN, indicating a higher copper mobilization in LC neurons. Manganese-SOD and copper/zinc-SOD showed similar age trends in LC and SN. These results may explain at least one of the reasons underlying the lower vulnerability of the LC compared to the SN in Parkinsonian syndromes. PMID:15210960
Star Power on Earth: Path to Clean Energy Future
Ed Moses
2017-12-09
Lawrence Livermore National Laboratory's "Science on Saturday" lecture series presents Ed Moses, Director of the National Ignition Facility, discussing the world's largest laser system and its potential impact on society's upcoming energy needs.
Wan, Cuihong; Liu, Jian; Fong, Vincent; Lugowski, Andrew; Stoilova, Snejana; Bethune-Waddell, Dylan; Borgeson, Blake; Havugimana, Pierre C; Marcotte, Edward M; Emili, Andrew
2013-04-09
The experimental isolation and characterization of stable multi-protein complexes are essential to understanding the molecular systems biology of a cell. To this end, we have developed a high-throughput proteomic platform for the systematic identification of native protein complexes based on extensive fractionation of soluble protein extracts by multi-bed ion exchange high performance liquid chromatography (IEX-HPLC) combined with exhaustive label-free LC/MS/MS shotgun profiling. To support these studies, we have built a companion data analysis software pipeline, termed ComplexQuant. Proteins present in the hundreds of fractions typically collected per experiment are first identified by exhaustively interrogating MS/MS spectra using multiple database search engines within an integrative probabilistic framework, while accounting for possible post-translational modifications. Protein abundance is then measured across the fractions based on normalized total spectral counts and precursor ion intensities using a dedicated tool, PepQuant. This analysis allows co-complex membership to be inferred based on the similarity of extracted protein co-elution profiles. Each computational step has been optimized for processing large-scale biochemical fractionation datasets, and the reliability of the integrated pipeline has been benchmarked extensively. This article is part of a Special Issue entitled: From protein structures to clinical applications. Copyright © 2012 Elsevier B.V. All rights reserved.
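The core idea behind inferring co-complex membership from fractionation data is that subunits of the same complex co-elute, so their per-fraction abundance profiles are highly correlated. A minimal sketch of that similarity measure (a hypothetical helper illustrating the concept, not code from ComplexQuant or PepQuant):

```python
import math

def coelution_score(profile_a, profile_b):
    """Pearson correlation of two per-fraction abundance profiles
    (e.g., normalized spectral counts across HPLC fractions)."""
    n = len(profile_a)
    ma = sum(profile_a) / n
    mb = sum(profile_b) / n
    cov = sum((a - ma) * (b - mb) for a, b in zip(profile_a, profile_b))
    sa = math.sqrt(sum((a - ma) ** 2 for a in profile_a))
    sb = math.sqrt(sum((b - mb) ** 2 for b in profile_b))
    return cov / (sa * sb)

# Made-up elution profiles: two subunits of one complex peak in the
# same fractions; an unrelated protein elutes elsewhere.
subunit1 = [0, 2, 10, 25, 12, 3, 0, 0]
subunit2 = [0, 1, 8, 22, 11, 2, 1, 0]
unrelated = [9, 7, 1, 0, 0, 2, 8, 10]
print(round(coelution_score(subunit1, subunit2), 3))   # near +1
print(round(coelution_score(subunit1, unrelated), 3))  # negative
```

In practice a pipeline like this would score all protein pairs and cluster high-scoring pairs into putative complexes.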
Tung, Ying-Tsen; Hsu, Wen-Ming; Lee, Hsinyu; Huang, Wei-Pang; Liao, Yung-Feng
2010-07-01
Mammalian p62/sequestosome-1 protein binds to both LC3, the mammalian homologue of yeast Atg8, and polyubiquitinated cargo proteins destined to undergo autophagy-mediated degradation. We previously identified a cargo receptor-binding domain in Atg8 that is essential for its interaction with the cargo receptor Atg19 in selective autophagic processes in yeast. We, thus, sought to determine whether this interaction is evolutionarily conserved from yeast to mammals. Using an amino acid replacement approach, we demonstrate that cells expressing mutant LC3 (LC3-K30D, LC3-K51A, or LC3-L53A) all exhibit defective lipidation of LC3, a disrupted LC3-p62 interaction, and impaired autophagic degradation of p62, suggesting that the p62-binding site of LC3 is localized within an evolutionarily conserved domain. Importantly, whereas cells expressing these LC3 mutants exhibited overall autophagic activity comparable to that of cells expressing wild-type LC3, autophagy-mediated clearance of the aggregation-prone mutant Huntingtin was defective in the mutant-expressing cells. Together, these results suggest that p62 directly binds to the evolutionarily conserved cargo receptor-binding domain of Atg8/LC3 and selectively mediates the clearance of mutant Huntingtin.
Deep sequencing reveals microbiota dysbiosis of tongue coat in patients with liver carcinoma.
Lu, Haifeng; Ren, Zhigang; Li, Ang; Zhang, Hua; Jiang, Jianwen; Xu, Shaoyan; Luo, Qixia; Zhou, Kai; Sun, Xiaoli; Zheng, Shusen; Li, Lanjuan
2016-09-08
Liver carcinoma (LC) is a common malignancy worldwide, associated with high morbidity and mortality. Characterizing the microbiome profile of the tongue coat may provide useful insights and a potential diagnostic marker for LC patients. Herein, we investigate for the first time the tongue coat microbiome of LC patients with cirrhosis, based on 16S ribosomal RNA (rRNA) gene sequencing. After applying strict inclusion and exclusion criteria, 35 early LC patients with cirrhosis and 25 matched healthy subjects were enrolled. Microbiome diversity of the tongue coat in LC patients was significantly increased, as shown by the Shannon, Simpson, and Chao 1 indexes. The tongue coat microbiome significantly distinguished LC patients from healthy subjects by principal component analysis. Tongue coat microbial profiles comprised 38 operational taxonomic units, assigned to 23 different genera, that distinguished LC patients. Linear discriminant analysis (LDA) effect size (LEfSe) revealed significant microbial dysbiosis of the tongue coat in LC patients. Strikingly, Oribacterium and Fusobacterium could distinguish LC patients from healthy subjects. LEfSe output showed that microbial gene functions related to the categories of nickel/iron_transport, amino_acid_transport, energy production, and metabolism differed between LC patients and healthy subjects. These findings are the first to identify microbiota dysbiosis of the tongue coat in LC patients and may provide a novel, non-invasive potential diagnostic biomarker for LC.
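The alpha-diversity indexes named above are simple functions of per-taxon counts. A minimal sketch of the Shannon and Simpson indexes (illustrative only; real 16S analyses typically use packages such as QIIME or scikit-bio, and the counts below are made up):

```python
import math

def shannon_index(counts):
    """Shannon diversity H' = -sum(p_i * ln p_i) over nonzero taxa."""
    total = sum(counts)
    ps = [c / total for c in counts if c > 0]
    return -sum(p * math.log(p) for p in ps)

def simpson_index(counts):
    """Simpson diversity 1 - sum(p_i^2); higher means more diverse."""
    total = sum(counts)
    return 1.0 - sum((c / total) ** 2 for c in counts if c > 0)

even = [25, 25, 25, 25]    # evenly distributed community
skewed = [97, 1, 1, 1]     # community dominated by one taxon
print(round(shannon_index(even), 3))   # ln(4) ≈ 1.386
print(round(shannon_index(skewed), 3)) # much lower
```

An even community maximizes both indexes; dominance by a single taxon drives them down, which is why increased index values signal higher diversity.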
NASA Technical Reports Server (NTRS)
Niedra, Janis M.; Gerber, Scott S.
1995-01-01
The L-C resonant decay technique for measuring circuit Q or losses is improved by eliminating the switch from the inductor-capacitor loop. A MOSFET switch is used instead to momentarily connect the resonant circuit to an existing voltage source, which itself is gated off during the decay transient. Very reproducible, low duty cycle data could be taken this way over a dynamic voltage range of at least 10:1. Circuit Q is computed from a polynomial fit to the sequence of the decaying voltage maxima. This method was applied to measure the losses at 60 kHz in inductors having loose powder cores of moly permalloy and an Mn-Zn power ferrite. After the copper and capacitor losses are separated out, the resulting specific core loss is shown to be roughly as expected for the MPP powder, but anomalously high for the ferrite powder. Possible causes are mentioned.
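Computing Q from the decaying voltage maxima follows from the ringdown envelope: for maxima one period apart, ln(V_n) falls linearly with cycle index n, and Q = π divided by the per-cycle logarithmic decrement. A minimal sketch of this fit (a first-order polynomial fit on synthetic data; the paper's actual fitting code is not shown in the abstract):

```python
import math

def q_from_maxima(maxima):
    """Estimate Q from successive ringdown voltage maxima (one per cycle)
    via a least-squares line fit to ln(V_n) versus cycle index n."""
    n = len(maxima)
    xs = list(range(n))
    ys = [math.log(v) for v in maxima]
    mx = sum(xs) / n
    my = sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    delta = -slope              # per-cycle logarithmic decrement
    return math.pi / delta

# Synthetic ringdown with Q = 100: V_n = exp(-pi * n / Q)
q_true = 100.0
peaks = [math.exp(-math.pi * n / q_true) for n in range(20)]
print(round(q_from_maxima(peaks), 1))  # → 100.0
```

Fitting all maxima, rather than taking the ratio of just two, averages out measurement noise in the same spirit as the polynomial fit described in the abstract.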
A role for locus coeruleus in Parkinson tremor
Isaias, Ioannis U.; Marzegan, Alberto; Pezzoli, Gianni; Marotta, Giorgio; Canesi, Margherita; Biella, Gabriele E. M.; Volkmann, Jens; Cavallari, Paolo
2012-01-01
We analyzed rest tremor, one of the etiologically most elusive hallmarks of Parkinson disease (PD), in 12 consecutive PD patients during a specific task activating the locus coeruleus (LC) to investigate a putative role of noradrenaline (NA) in tremor generation and suppression. Clinical diagnosis was confirmed in all subjects by reduced dopamine reuptake transporter (DAT) binding values investigated by single-photon emission computed tomography (SPECT) with [123I] N-ω-fluoropropyl-2β-carbomethoxy-3β-(4-iodophenyl) tropane (FP-CIT). The intensity of tremor (i.e., the power of electromyography [EMG] signals), but not its frequency, significantly increased during the task. In six subjects, tremor appeared selectively during the task. In a second part of the study, we retrospectively reviewed SPECT with FP-CIT data and confirmed the lack of correlation between dopaminergic loss and tremor by comparing DAT binding values of 82 PD subjects with bilateral tremor (n = 27), unilateral tremor (n = 22), and no tremor (n = 33). This study suggests a role of the LC in Parkinson tremor. PMID:22287946
A three-dimensional algebraic grid generation scheme for gas turbine combustors with inclined slots
NASA Technical Reports Server (NTRS)
Yang, S. L.; Cline, M. C.; Chen, R.; Chang, Y. L.
1993-01-01
A 3D algebraic grid generation scheme is presented for generating the grid points inside gas turbine combustors with inclined slots. The scheme is based on the 2D transfinite interpolation method. Since the scheme is a 2D approach, it is very efficient and can easily be extended to gas turbine combustors with either dilution-hole or slot configurations. To demonstrate the feasibility and the usefulness of the technique, a numerical study of the quick-quench/lean-combustion (QQ/LC) zones of a staged turbine combustor is given. Preliminary results illustrate some of the major features of the flow and temperature fields in the QQ/LC zones. The formation of co- and counter-rotating bulk flows and the shaping of the temperature field can be observed clearly, and the resulting patterns are consistent with experimental observations typical of the confined slanted jet-in-cross-flow. Numerical solutions show the method to be an efficient and reliable tool for generating computational grids for analyzing gas turbine combustors with slanted slots.
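The 2D transfinite interpolation at the heart of such schemes blends the four boundary curves of a region (a Coons patch), subtracting the doubly counted corner contributions. A minimal sketch under simple assumptions (each boundary is a function of a parameter in [0, 1] returning (x, y), with matching corners; this is the generic method, not the paper's combustor-specific implementation):

```python
def transfinite_interp(bottom, top, left, right, nu, nv):
    """Generate an (nu+1) x (nv+1) grid of (x, y) points inside the
    region bounded by four parametric curves via transfinite
    interpolation: edge blend minus corner (bilinear) correction."""
    grid = []
    for i in range(nu + 1):
        u = i / nu
        row = []
        for j in range(nv + 1):
            v = j / nv
            pt = []
            for k in range(2):  # x and y components
                val = ((1 - v) * bottom(u)[k] + v * top(u)[k]
                       + (1 - u) * left(v)[k] + u * right(v)[k]
                       - (1 - u) * (1 - v) * bottom(0)[k]
                       - u * (1 - v) * bottom(1)[k]
                       - (1 - u) * v * top(0)[k]
                       - u * v * top(1)[k])
                pt.append(val)
            row.append(tuple(pt))
        grid.append(row)
    return grid

# On a unit square the scheme reproduces a uniform grid:
g = transfinite_interp(lambda u: (u, 0.0), lambda u: (u, 1.0),
                       lambda v: (0.0, v), lambda v: (1.0, v), 4, 4)
print(g[2][2])  # center point, → (0.5, 0.5)
```

Because interior points are pure algebra on the boundary curves, the method is fast and deterministic, which is what makes extending it slot-by-slot to 3D combustor geometries practical.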
Napolitano, José G.; Gödecke, Tanja; Lankin, David C.; Jaki, Birgit U.; McAlpine, James B.; Chen, Shao-Nong; Pauli, Guido F.
2013-01-01
The development of analytical methods for parallel characterization of multiple phytoconstituents is essential to advance the quality control of herbal products. While chemical standardization is commonly carried out by targeted analysis using gas or liquid chromatography-based methods, more universal approaches based on quantitative 1H NMR (qHNMR) measurements are being used increasingly in the multi-targeted assessment of these complex mixtures. The present study describes the development of a 1D qHNMR-based method for simultaneous identification and quantification of green tea constituents. This approach utilizes computer-assisted 1H iterative Full Spin Analysis (HiFSA) and enables rapid profiling of seven catechins in commercial green tea extracts. The qHNMR results were cross-validated against quantitative profiles obtained with an orthogonal LC-MS/MS method. The relative strengths and weaknesses of both approaches are discussed, with special emphasis on the role of identical reference standards in qualitative and quantitative analyses. PMID:23870106
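Quantification by qHNMR rests on the proportionality between a signal's integral and the number of protons it represents, referenced to an internal calibrant of known mass and purity. A sketch of that standard relation (generic qHNMR arithmetic with made-up numbers, not data or code from this study):

```python
def qhnmr_percent_w_w(integral_analyte, n_h_analyte, mw_analyte,
                      integral_std, n_h_std, mw_std,
                      mass_std, mass_sample):
    """Analyte content (% w/w) from qHNMR integrals versus an internal
    standard: (I_a/I_s) * (N_s/N_a) * (M_a/M_s) * (m_s/m_sample) * 100,
    where N is protons per signal and M is molar mass."""
    return (integral_analyte / integral_std) \
         * (n_h_std / n_h_analyte) \
         * (mw_analyte / mw_std) \
         * (mass_std / mass_sample) * 100.0

# Illustrative numbers only: an analyte signal twice the standard's
# integral, equal proton counts and molar masses, 5 mg standard in
# a 50 mg sample → 20% w/w.
print(qhnmr_percent_w_w(2.0, 1, 100.0, 1.0, 1, 100.0, 5.0, 50.0))
```

A multi-target profile like the seven-catechin assay applies this same relation once per resolved signal, which is why full spin analysis (HiFSA) of overlapping multiplets matters.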
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gyllenhaal, J.
CLOMP is the C version of the Livermore OpenMP benchmark, developed to measure OpenMP overheads and other performance impacts due to threading. For simplicity, it does not use MPI by default, but it is expected to be run on the resources a threaded MPI task would use (e.g., a portion of a shared-memory compute node). Compiling with -DWITH_MPI allows packing one or more nodes with CLOMP tasks and having CLOMP report OpenMP performance for the slowest MPI task. On current systems, the strong-scaling performance results for 4, 8, or 16 threads are of the most interest. Suggested weak-scaling inputs are provided for evaluating future systems. Since MPI is often used to place at least one MPI task per coherence or NUMA domain, it is recommended to focus OpenMP runtime measurements on a subset of node hardware where low OpenMP overheads are most achievable (e.g., within one coherence domain or NUMA domain).