The grand challenge of managing the petascale facility.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Aiken, R. J.; Mathematics and Computer Science
2007-02-28
This report is the result of a study of networks and how they may need to evolve to support petascale leadership computing and science. As Dr. Ray Orbach, director of the Department of Energy's Office of Science, says in the spring 2006 issue of SciDAC Review, 'One remarkable example of growth in unexpected directions has been in high-end computation'. In the same article Dr. Michael Strayer states, 'Moore's law suggests that before the end of the next cycle of SciDAC, we shall see petaflop computers'. Given the Office of Science's strong leadership and support for petascale computing and facilities, we should expect to see petaflop computers in operation in support of science before the end of the decade, and DOE/SC Advanced Scientific Computing Research programs are focused on making this a reality. This study took its lead from this strong focus on petascale computing and the networks required to support such facilities, but it grew to include almost all aspects of the DOE/SC petascale computational and experimental science facilities, all of which will face daunting challenges in managing and analyzing the voluminous amounts of data expected. In addition, trends indicate the increased coupling of unique experimental facilities with computational facilities, along with the integration of multidisciplinary datasets and high-end computing with data-intensive computing; and we can expect these trends to continue at the petascale level and beyond. Coupled with recent technology trends, they clearly indicate the need for including capability petascale storage, networks, and experiments, as well as collaboration tools and programming environments, as integral components of the Office of Science's petascale capability metafacility. The objective of this report is to recommend a new cross-cutting program to support the management of petascale science and infrastructure. The appendices of the report document current and projected DOE computation facilities, science trends, and technology trends, whose combined impact can affect the manageability and stewardship of DOE's petascale facilities. This report is not meant to be all-inclusive. Rather, the facilities, science projects, and research topics presented are to be considered examples to clarify a point.
Final Project Report. Scalable fault tolerance runtime technology for petascale computers
DOE Office of Scientific and Technical Information (OSTI.GOV)
Krishnamoorthy, Sriram; Sadayappan, P
With the massive number of components comprising the forthcoming petascale computer systems, hardware failures will be routinely encountered during execution of large-scale applications. Due to the multidisciplinary, multiresolution, and multiscale nature of scientific problems that drive the demand for high-end systems, applications place increasingly differing demands on the system resources: disk, network, memory, and CPU. In addition to MPI, future applications are expected to use advanced programming models such as those developed under the DARPA HPCS program as well as existing global address space programming models such as Global Arrays, UPC, and Co-Array Fortran. While there has been a considerable amount of work in fault-tolerant MPI, with a number of strategies and extensions for fault tolerance proposed, virtually none of the advanced models proposed for emerging petascale systems is currently fault aware. To achieve fault tolerance, underlying runtime and OS technologies able to scale to the petascale level must be developed. This project evaluated a range of runtime techniques for fault tolerance for advanced programming models.
Real science at the petascale.
Saksena, Radhika S; Boghosian, Bruce; Fazendeiro, Luis; Kenway, Owain A; Manos, Steven; Mazzeo, Marco D; Sadiq, S Kashif; Suter, James L; Wright, David; Coveney, Peter V
2009-06-28
We describe computational science research that uses petascale resources to achieve scientific results at unprecedented scales and resolution. The applications span a wide range of domains, from investigation of fundamental problems in turbulence through computational materials science research to biomedical applications at the forefront of HIV/AIDS research and cerebrovascular haemodynamics. This work was mainly performed on the US TeraGrid 'petascale' resource, Ranger, at Texas Advanced Computing Center, in the first half of 2008 when it was the largest computing system in the world available for open scientific research. We have sought to use this petascale supercomputer optimally across application domains and scales, exploiting the excellent parallel scaling performance found on up to at least 32 768 cores for certain of our codes in the so-called 'capability computing' category as well as high-throughput intermediate-scale jobs for ensemble simulations in the 32-512 core range. Furthermore, this activity provides evidence that conventional parallel programming with MPI should be successful at the petascale in the short to medium term. We also report on the parallel performance of some of our codes on up to 65 536 cores on the IBM Blue Gene/P system at the Argonne Leadership Computing Facility, which has recently been named the fastest supercomputer in the world for open science.
The Petascale Data Storage Institute
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gibson, Garth; Long, Darrell; Honeyman, Peter
2013-07-01
Petascale computing infrastructures for scientific discovery make petascale demands on information storage capacity, performance, concurrency, reliability, availability, and manageability. The Petascale Data Storage Institute focuses on the data storage problems found in petascale scientific computing environments, with special attention to community issues such as interoperability, community buy-in, and shared tools. The Petascale Data Storage Institute is a collaboration between researchers at Carnegie Mellon University, National Energy Research Scientific Computing Center, Pacific Northwest National Laboratory, Oak Ridge National Laboratory, Sandia National Laboratory, Los Alamos National Laboratory, University of Michigan, and the University of California at Santa Cruz.
MOGO: Model-Oriented Global Optimization of Petascale Applications
DOE Office of Scientific and Technical Information (OSTI.GOV)
Malony, Allen D.; Shende, Sameer S.
The MOGO project was initiated in 2008 under the DOE Program Announcement for Software Development Tools for Improved Ease-of-Use on Petascale systems (LAB 08-19). The MOGO team consisted of Oak Ridge National Lab, Argonne National Lab, and the University of Oregon. The overall goal of MOGO was to attack petascale performance analysis by developing a general framework where empirical performance data could be efficiently and accurately compared with performance expectations at various levels of abstraction. This information could then be used to automatically identify and remediate performance problems. MOGO was based on performance models derived from application knowledge, performance experiments, and symbolic analysis. MOGO was able to make a reasonable impact on existing DOE applications and systems. New tools and techniques were developed, which, in turn, were used on important DOE applications on DOE LCF systems to show significant performance improvements.
Foundational Tools for Petascale Computing
DOE Office of Scientific and Technical Information (OSTI.GOV)
Miller, Barton
2014-05-19
The Paradyn project has a history of developing algorithms, techniques, and software that push the cutting edge of tool technology for high-end computing systems. Under this funding, we are working on a three-year agenda to make substantial new advances in support of new and emerging Petascale systems. The overall goal for this work is to address the steady increase in complexity of these petascale systems. Our work covers two key areas: (1) The analysis, instrumentation and control of binary programs. Work in this area falls under the general framework of the Dyninst API tool kits. (2) Infrastructure for building tools and applications at extreme scale. Work in this area falls under the general framework of the MRNet scalability framework. Note that work done under this funding is closely related to work done under a contemporaneous grant, “High-Performance Energy Applications and Systems”, SC0004061/FG02-10ER25972, UW PRJ36WV.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sprague, Michael A.
Enabled by petascale supercomputing, the next generation of computer models for wind energy will simulate a vast range of scales and physics, spanning from turbine structural dynamics and blade-scale turbulence to mesoscale atmospheric flow. A single model covering all scales and physics is not feasible. Thus, these simulations will require the coupling of different models/codes, each for different physics, interacting at their domain boundaries.
High-resolution coupled physics solvers for analysing fine-scale nuclear reactor design problems.
Mahadevan, Vijay S; Merzari, Elia; Tautges, Timothy; Jain, Rajeev; Obabko, Aleksandr; Smith, Michael; Fischer, Paul
2014-08-06
An integrated multi-physics simulation capability for the design and analysis of current and future nuclear reactor models is being investigated, to tightly couple neutron transport and thermal-hydraulics physics under the SHARP framework. Over several years, high-fidelity, validated mono-physics solvers with proven scalability on petascale architectures have been developed independently. Based on a unified component-based architecture, these existing codes can be coupled with a mesh-data backplane and a flexible coupling-strategy-based driver suite to produce a viable tool for analysts. The goal of the SHARP framework is to perform fully resolved coupled physics analysis of a reactor on heterogeneous geometry, in order to reduce the overall numerical uncertainty while leveraging available computational resources. The coupling methodology and software interfaces of the framework are presented, along with verification studies on two representative fast sodium-cooled reactor demonstration problems to prove the usability of the SHARP framework.
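The abstract above describes coupling independently developed single-physics solvers through a mesh-data backplane and a coupling-strategy driver. As a rough illustration of one common coupling strategy (Picard, or fixed-point, iteration between a neutronics solver and a thermal-hydraulics solver), the sketch below is a toy model only; the solver classes, field names, and convergence test are assumptions and do not reproduce the SHARP interfaces.

# Illustrative Picard (fixed-point) coupling loop between two single-physics
# solvers exchanging fields on a shared mesh.  Solver classes and fields are
# hypothetical stand-ins, not SHARP's actual components.
import numpy as np

class NeutronicsSolver:
    def solve(self, temperature):
        # Toy model: power density falls as temperature feedback increases.
        return 1.0e3 / (1.0 + 1.0e-3 * temperature)

class ThermalHydraulicsSolver:
    def solve(self, power_density):
        # Toy model: temperature rises with deposited power.
        return 300.0 + 0.5 * power_density

def coupled_solve(n_cells=1000, tol=1.0e-8, max_iters=50):
    neutronics, th = NeutronicsSolver(), ThermalHydraulicsSolver()
    temperature = np.full(n_cells, 300.0)      # initial guess on the shared mesh
    for it in range(max_iters):
        power = neutronics.solve(temperature)  # transport with current feedback
        new_temperature = th.solve(power)      # heat transfer with current power
        delta = np.linalg.norm(new_temperature - temperature) / np.sqrt(n_cells)
        temperature = new_temperature
        if delta < tol:                        # converged coupled solution
            return it + 1, temperature
    return max_iters, temperature

if __name__ == "__main__":
    iters, temp = coupled_solve()
    print(f"converged in {iters} Picard iterations, mean T = {temp.mean():.2f} K")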
High-resolution coupled physics solvers for analysing fine-scale nuclear reactor design problems
Mahadevan, Vijay S.; Merzari, Elia; Tautges, Timothy; Jain, Rajeev; Obabko, Aleksandr; Smith, Michael; Fischer, Paul
2014-01-01
An integrated multi-physics simulation capability for the design and analysis of current and future nuclear reactor models is being investigated, to tightly couple neutron transport and thermal-hydraulics physics under the SHARP framework. Over several years, high-fidelity, validated mono-physics solvers with proven scalability on petascale architectures have been developed independently. Based on a unified component-based architecture, these existing codes can be coupled with a mesh-data backplane and a flexible coupling-strategy-based driver suite to produce a viable tool for analysts. The goal of the SHARP framework is to perform fully resolved coupled physics analysis of a reactor on heterogeneous geometry, in order to reduce the overall numerical uncertainty while leveraging available computational resources. The coupling methodology and software interfaces of the framework are presented, along with verification studies on two representative fast sodium-cooled reactor demonstration problems to prove the usability of the SHARP framework. PMID:24982250
Parcels v0.9: prototyping a Lagrangian ocean analysis framework for the petascale age
NASA Astrophysics Data System (ADS)
Lange, Michael; van Sebille, Erik
2017-11-01
As ocean general circulation models (OGCMs) move into the petascale age, where the output of single simulations exceeds petabytes of storage space, tools to analyse the output of these models will need to scale up too. Lagrangian ocean analysis, where virtual particles are tracked through hydrodynamic fields, is an increasingly popular way to analyse OGCM output, by mapping pathways and connectivity of biotic and abiotic particulates. However, the current software stack of Lagrangian ocean analysis codes is not dynamic enough to cope with the increasing complexity, scale and need for customization of use-cases. Furthermore, most community codes are developed for stand-alone use, making it a nontrivial task to integrate virtual particles at runtime of the OGCM. Here, we introduce the new Parcels code, which was designed from the ground up to be sufficiently scalable to cope with petascale computing. We highlight its API design that combines flexibility and customization with the ability to optimize for HPC workflows, following the paradigm of domain-specific languages. Parcels is primarily written in Python, utilizing the wide range of tools available in the scientific Python ecosystem, while generating low-level C code and using just-in-time compilation for performance-critical computation. We show a worked-out example of its API, and validate the accuracy of the code against seven idealized test cases. This version 0.9 of Parcels is focused on laying out the API, with future work concentrating on support for curvilinear grids, optimization, efficiency and at-runtime coupling with OGCMs.
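Parcels itself generates low-level C and compiles it just in time from Python kernel definitions; as a rough stand-in for that "Python front end, compiled inner loop" pattern, the sketch below advects virtual particles through a toy analytic velocity field with the time-critical loop JIT-compiled via numba. The field, kernel, and particle layout are assumptions for illustration, not the Parcels API.

# Minimal Lagrangian particle advection sketch with a JIT-compiled kernel.
import numpy as np
from numba import njit

@njit(cache=True)
def velocity(x, y):
    # Toy steady circulation field standing in for OGCM output.
    return -np.sin(np.pi * y), np.sin(np.pi * x)

@njit(cache=True)
def advect(lon, lat, dt, n_steps):
    # Explicit Euler advection of every particle; this is the inner loop that
    # a tool like Parcels would emit as C and compile just in time.
    for _ in range(n_steps):
        for p in range(lon.size):
            u, v = velocity(lon[p], lat[p])
            lon[p] += u * dt
            lat[p] += v * dt
    return lon, lat

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    lon = rng.uniform(-1.0, 1.0, 10_000)
    lat = rng.uniform(-1.0, 1.0, 10_000)
    advect(lon, lat, dt=0.01, n_steps=1_000)
    print(lon[:3], lat[:3])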
Adjusting process count on demand for petascale global optimization
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sosonkina, Masha; Watson, Layne T.; Radcliffe, Nicholas R.
2012-11-23
There are many challenges that need to be met before efficient and reliable computation at the petascale is possible. Many scientific and engineering codes running at the petascale are likely to be memory intensive, which makes thrashing a serious problem for many petascale applications. One way to overcome this challenge is to use a dynamic number of processes, so that the total amount of memory available for the computation can be increased on demand. This paper describes modifications made to the massively parallel global optimization code pVTdirect in order to allow for a dynamic number of processes. In particular, the modified version of the code monitors memory use and spawns new processes if the amount of available memory is determined to be insufficient. The primary design challenges are discussed, and performance results are presented and analyzed.
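The paper's modification is specific to pVTdirect, but the general MPI mechanism it relies on (monitoring memory and growing the process count with MPI dynamic process management) can be sketched as below. The memory watermark, the worker script name, and the work-redistribution step are assumptions for illustration only.

# Sketch of on-demand process spawning driven by a memory watermark, in the
# spirit of the pVTdirect modification described above.  worker.py, the
# threshold, and the redistribution message are hypothetical placeholders.
import sys
import psutil
from mpi4py import MPI

LOW_MEMORY_BYTES = 2 * 1024**3          # assumed 2 GiB watermark

def memory_pressure():
    # True when available memory on this node drops below the watermark.
    return psutil.virtual_memory().available < LOW_MEMORY_BYTES

def maybe_grow(intercomm=None, extra_procs=4):
    # Spawn extra worker processes if memory is running low.
    if intercomm is None and memory_pressure():
        intercomm = MPI.COMM_SELF.Spawn(
            sys.executable, args=["worker.py"], maxprocs=extra_procs)
        # Hand part of the local task list to the new workers.
        intercomm.bcast({"action": "take_over", "boxes": []}, root=MPI.ROOT)
    return intercomm

if __name__ == "__main__":
    intercomm = None
    for iteration in range(100):
        # ... perform one optimization sweep over the local work ...
        intercomm = maybe_grow(intercomm)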
Mapping to Irregular Torus Topologies and Other Techniques for Petascale Biomolecular Simulation
Phillips, James C.; Sun, Yanhua; Jain, Nikhil; Bohm, Eric J.; Kalé, Laxmikant V.
2014-01-01
Currently deployed petascale supercomputers typically use toroidal network topologies in three or more dimensions. While these networks perform well for topology-agnostic codes on a few thousand nodes, leadership machines with 20,000 nodes require topology awareness to avoid network contention for communication-intensive codes. Topology adaptation is complicated by irregular node allocation shapes and holes due to dedicated input/output nodes or hardware failure. In the context of the popular molecular dynamics program NAMD, we present methods for mapping a periodic 3-D grid of fixed-size spatial decomposition domains to 3-D Cray Gemini and 5-D IBM Blue Gene/Q toroidal networks to enable hundred-million atom full machine simulations, and to similarly partition node allocations into compact domains for smaller simulations using multiple-copy algorithms. Additional enabling techniques are discussed and performance is reported for NCSA Blue Waters, ORNL Titan, ANL Mira, TACC Stampede, and NERSC Edison. PMID:25594075
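The paper's mapping algorithms are specific to NAMD and the Gemini and Blue Gene/Q networks; as a much simpler illustration of the underlying idea of topology-aware placement, the sketch below folds a periodic 3-D grid of spatial-decomposition domains onto a 3-D torus of nodes so that grid neighbors land on (wrap-around) torus neighbors. The dimensions and the scaling rule are assumptions, not the published algorithm.

# Toy topology-aware placement of a periodic 3-D domain grid onto a 3-D torus.
import itertools

def map_grid_to_torus(grid_dims, torus_dims):
    # Return {domain (x,y,z): torus node (a,b,c)} preserving periodic locality.
    mapping = {}
    for domain in itertools.product(*(range(d) for d in grid_dims)):
        # Scale each grid coordinate into the torus extent; the modulo keeps
        # periodic (wrap-around) neighbors adjacent on the torus as well.
        node = tuple((coord * t) // g % t
                     for coord, g, t in zip(domain, grid_dims, torus_dims))
        mapping[domain] = node
    return mapping

def hop_distance(a, b, torus_dims):
    # Minimal hop count between two nodes on a torus (per-dimension wrap).
    return sum(min(abs(x - y), t - abs(x - y))
               for x, y, t in zip(a, b, torus_dims))

if __name__ == "__main__":
    grid, torus = (16, 16, 16), (8, 8, 8)
    m = map_grid_to_torus(grid, torus)
    # Neighboring domains should be at most one hop apart on the torus.
    d0, d1 = (0, 0, 0), (1, 0, 0)
    print(m[d0], m[d1], hop_distance(m[d0], m[d1], torus))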
High-resolution coupled physics solvers for analysing fine-scale nuclear reactor design problems
Mahadevan, Vijay S.; Merzari, Elia; Tautges, Timothy; ...
2014-06-30
An integrated multi-physics simulation capability for the design and analysis of current and future nuclear reactor models is being investigated, to tightly couple neutron transport and thermal-hydraulics physics under the SHARP framework. Over several years, high-fidelity, validated mono-physics solvers with proven scalability on petascale architectures have been developed independently. Based on a unified component-based architecture, these existing codes can be coupled with a mesh-data backplane and a flexible coupling-strategy-based driver suite to produce a viable tool for analysts. The goal of the SHARP framework is to perform fully resolved coupled physics analysis of a reactor on heterogeneous geometry, in order to reduce the overall numerical uncertainty while leveraging available computational resources. Finally, the coupling methodology and software interfaces of the framework are presented, along with verification studies on two representative fast sodium-cooled reactor demonstration problems to prove the usability of the SHARP framework.
Lightweight and Statistical Techniques for Petascale Debugging
DOE Office of Scientific and Technical Information (OSTI.GOV)
Miller, Barton
2014-06-30
This project investigated novel techniques for debugging scientific applications on petascale architectures. In particular, we developed lightweight tools that narrow the problem space when bugs are encountered. We also developed techniques that either limit the number of tasks and the code regions to which a developer must apply a traditional debugger or that apply statistical techniques to provide direct suggestions of the location and type of error. We extended previous work on the Stack Trace Analysis Tool (STAT), which has already demonstrated scalability to over one hundred thousand MPI tasks. We also extended statistical techniques developed to isolate programming errors in widely used sequential or threaded applications in the Cooperative Bug Isolation (CBI) project to large-scale parallel applications. Overall, our research substantially improved productivity on petascale platforms through a tool set for debugging that complements existing commercial tools. Previously, Office of Science application developers relied either on primitive manual debugging techniques based on printf or on tools, such as TotalView, that do not scale beyond a few thousand processors. However, bugs often arise at scale, and substantial effort and computation cycles are wasted either in reproducing the problem in a smaller run that can be analyzed with the traditional tools or in repeated runs at scale that use the primitive techniques. New techniques that work at scale and automate the process of identifying the root cause of errors were needed. These techniques significantly reduced the time spent debugging petascale applications, thus leading to a greater overall amount of time for application scientists to pursue the scientific objectives for which the systems are purchased. We developed a new paradigm for debugging at scale: techniques that reduce the debugging scenario to a scale suitable for traditional debuggers, e.g., by narrowing the search for the root-cause analysis to a small set of nodes or by identifying equivalence classes of nodes and sampling our debug targets from them. We implemented these techniques as lightweight tools that work efficiently at the full scale of the target machine. We explored four lightweight debugging refinements: generic classification parameters, such as stack traces; application-specific classification parameters, such as global variables; statistical data acquisition techniques; and machine-learning-based approaches to perform root cause analysis. Work done under this project can be divided into two categories: new algorithms and techniques for scalable debugging, and foundational infrastructure work on our MRNet multicast-reduction framework for scalability and the Dyninst binary analysis and instrumentation toolkits.
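STAT itself gathers and merges stack traces over MRNet; purely as a self-contained illustration of the underlying idea (collapsing many tasks into a few equivalence classes and sampling one representative per class as a target for a traditional debugger), the sketch below gathers each rank's own call stack over MPI and groups identical traces. All names are illustrative, not the STAT implementation.

# Toy stack-trace equivalence classing in the spirit of STAT.
import traceback
from collections import defaultdict
from mpi4py import MPI

def current_stack():
    # Return this process's call stack as a single hashable string.
    return "".join(traceback.format_stack(limit=10))

def classify_ranks(comm=MPI.COMM_WORLD):
    stacks = comm.gather(current_stack(), root=0)
    if comm.Get_rank() != 0:
        return None
    classes = defaultdict(list)
    for rank, stack in enumerate(stacks):
        classes[stack].append(rank)
    # One representative rank per equivalence class is enough for a debugger.
    return {stack: ranks[0] for stack, ranks in classes.items()}

if __name__ == "__main__":
    reps = classify_ranks()
    if reps is not None:
        print(f"{len(reps)} equivalence classes; debug targets: "
              f"{sorted(reps.values())}")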
The Development of the Non-hydrostatic Unified Model of the Atmosphere (NUMA)
2011-09-19
capabilities: (1) highly scalable on current and future computer architectures (exascale computing: this means CPUs and GPUs); (2) flexibility to use a...
From Terascale to Petascale/Exascale Computing: 10 of the Top 500 systems are already in the petascale range; 3 of the top 10 are GPU-based machines.
Zhang, Xiaohua; Wong, Sergio E; Lightstone, Felice C
2013-04-30
A mixed parallel scheme that combines message passing interface (MPI) and multithreading was implemented in the AutoDock Vina molecular docking program. The resulting program, named VinaLC, was tested on the petascale high performance computing (HPC) machines at Lawrence Livermore National Laboratory. To exploit the typical cluster-type supercomputers, thousands of docking calculations were dispatched by the master process to run simultaneously on thousands of slave processes, where each docking calculation takes one slave process on one node, and within the node each docking calculation runs via multithreading on multiple CPU cores and shared memory. Input and output of the program and the data handling within the program were carefully designed to deal with large databases and ultimately achieve HPC on a large number of CPU cores. Parallel performance analysis of the VinaLC program shows that the code scales up to more than 15K CPUs with a very low overhead cost of 3.94%. One million flexible compound docking calculations took only 1.4 h to finish on about 15K CPUs. The docking accuracy of VinaLC has been validated against the DUD data set by the re-docking of X-ray ligands and an enrichment study; 64.4% of the top scoring poses have RMSD values under 2.0 Å. The program has been demonstrated to have good enrichment performance on 70% of the targets in the DUD data set. An analysis of the enrichment factors calculated at various percentages of the screening database indicates VinaLC has very good early recovery of actives. Copyright © 2013 Wiley Periodicals, Inc.
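The abstract describes a master process dispatching one docking calculation at a time to each node-level worker, which then runs multithreaded on that node's cores. The mpi4py sketch below illustrates that master/worker dispatch pattern only; the job list and run_docking() are placeholders, not VinaLC's code.

# Master/worker task dispatch in the spirit of the VinaLC scheme.
from mpi4py import MPI

TAG_WORK, TAG_STOP = 1, 2

def run_docking(ligand):
    # Placeholder for the per-node multithreaded docking calculation.
    return {"ligand": ligand, "score": -7.5}

def master(comm, jobs):
    status = MPI.Status()
    results, active = [], comm.Get_size() - 1
    queue = list(jobs)
    while active > 0:
        result = comm.recv(source=MPI.ANY_SOURCE, tag=MPI.ANY_TAG, status=status)
        if result is not None:
            results.append(result)
        if queue:                                   # send next job to this worker
            comm.send(queue.pop(), dest=status.Get_source(), tag=TAG_WORK)
        else:                                       # no work left: retire worker
            comm.send(None, dest=status.Get_source(), tag=TAG_STOP)
            active -= 1
    return results

def worker(comm):
    comm.send(None, dest=0)                         # announce readiness
    status = MPI.Status()
    while True:
        job = comm.recv(source=0, tag=MPI.ANY_TAG, status=status)
        if status.Get_tag() == TAG_STOP:
            break
        comm.send(run_docking(job), dest=0)

if __name__ == "__main__":
    comm = MPI.COMM_WORLD
    if comm.Get_rank() == 0:
        print(len(master(comm, jobs=[f"ligand_{i}" for i in range(1000)])))
    else:
        worker(comm)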
DOE Office of Scientific and Technical Information (OSTI.GOV)
Spentzouris, Panagiotis (Fermilab); Cary, John
The design and performance optimization of particle accelerators are essential for the success of the DOE scientific program in the next decade. Particle accelerators are very complex systems whose accurate description involves a large number of degrees of freedom and requires the inclusion of many physics processes. Building on the success of the SciDAC-1 Accelerator Science and Technology project, the SciDAC-2 Community Petascale Project for Accelerator Science and Simulation (ComPASS) is developing a comprehensive set of interoperable components for beam dynamics, electromagnetics, electron cooling, and laser/plasma acceleration modelling. ComPASS is providing accelerator scientists the tools required to enable the necessary accelerator simulation paradigm shift from high-fidelity single physics process modeling (covered under SciDAC-1) to high-fidelity multiphysics modeling. Our computational frameworks have been used to model the behavior of a large number of accelerators and accelerator R&D experiments, assisting both their design and performance optimization. As parallel computational applications, the ComPASS codes have been shown to make effective use of thousands of processors.
Data-intensive computing on numerically-insensitive supercomputers
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ahrens, James P; Fasel, Patricia K; Habib, Salman
2010-12-03
With the advent of the era of petascale supercomputing, via the delivery of the Roadrunner supercomputing platform at Los Alamos National Laboratory, there is a pressing need to address the problem of visualizing massive petascale-sized results. In this presentation, I discuss progress on a number of approaches including in-situ analysis, multi-resolution out-of-core streaming and interactive rendering on the supercomputing platform. These approaches are placed in context by the emerging area of data-intensive supercomputing.
Final Report: Correctness Tools for Petascale Computing
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mellor-Crummey, John
2014-10-27
In the course of developing parallel programs for leadership computing systems, subtle programming errors often arise that are extremely difficult to diagnose without tools. To meet this challenge, the University of Maryland, the University of Wisconsin-Madison, and Rice University worked to develop lightweight tools to help code developers pinpoint a variety of program correctness errors that plague parallel scientific codes. The aim of this project was to develop software tools that help diagnose program errors including memory leaks, memory access errors, round-off errors, and data races. Research at Rice University focused on developing algorithms and data structures to support efficient monitoring of multithreaded programs for memory access errors and data races. This is a final report about research and development work at Rice University as part of this project.
Mira: Argonne's 10-petaflops supercomputer
Papka, Michael; Coghlan, Susan; Isaacs, Eric; Peters, Mark; Messina, Paul
2018-02-13
Mira, Argonne's petascale IBM Blue Gene/Q system, ushers in a new era of scientific supercomputing at the Argonne Leadership Computing Facility. An engineering marvel, the 10-petaflops supercomputer is capable of carrying out 10 quadrillion calculations per second. As a machine for open science, any researcher with a question that requires large-scale computing resources can submit a proposal for time on Mira, typically in allocations of millions of core-hours, to run programs for their experiments. This adds up to billions of hours of computing time per year.
Mira: Argonne's 10-petaflops supercomputer
DOE Office of Scientific and Technical Information (OSTI.GOV)
Papka, Michael; Coghlan, Susan; Isaacs, Eric
2013-07-03
Mira, Argonne's petascale IBM Blue Gene/Q system, ushers in a new era of scientific supercomputing at the Argonne Leadership Computing Facility. An engineering marvel, the 10-petaflops supercomputer is capable of carrying out 10 quadrillion calculations per second. As a machine for open science, any researcher with a question that requires large-scale computing resources can submit a proposal for time on Mira, typically in allocations of millions of core-hours, to run programs for their experiments. This adds up to billions of hours of computing time per year.
MADNESS: A Multiresolution, Adaptive Numerical Environment for Scientific Simulation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Harrison, Robert J.; Beylkin, Gregory; Bischoff, Florian A.
2016-01-01
MADNESS (multiresolution adaptive numerical environment for scientific simulation) is a high-level software environment for solving integral and differential equations in many dimensions that uses adaptive and fast harmonic analysis methods with guaranteed precision based on multiresolution analysis and separated representations. Underpinning the numerical capabilities is a powerful petascale parallel programming environment that aims to increase both programmer productivity and code scalability. This paper describes the features and capabilities of MADNESS and briefly discusses some current applications in chemistry and several areas of physics.
Toward Petascale Biologically Plausible Neural Networks
NASA Astrophysics Data System (ADS)
Long, Lyle
This talk will describe an approach to achieving petascale neural networks. Artificial intelligence has been oversold for many decades. Computers in the beginning could only do about 16,000 operations per second. Computer processing power, however, has been doubling every two years thanks to Moore's law, and growing even faster due to massively parallel architectures. Finally, 60 years after the first AI conference, we have computers on the order of the performance of the human brain (10^16 operations per second). The main issues now are algorithms, software, and learning. We have excellent models of neurons, such as the Hodgkin-Huxley model, but we do not know how the human neurons are wired together. With careful attention to efficient parallel computing, event-driven programming, table lookups, and memory minimization, massive-scale simulations can be performed. The code that will be described was written in C++ and uses the Message Passing Interface (MPI). It uses the full Hodgkin-Huxley neuron model, not a simplified model. It also allows arbitrary network structures (deep, recurrent, convolutional, all-to-all, etc.). The code is scalable, and has, so far, been tested on up to 2,048 processor cores using 10^7 neurons and 10^9 synapses.
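The full Hodgkin-Huxley membrane model mentioned above is a standard four-variable system of ODEs. The sketch below integrates a single neuron with forward Euler using the classic squid-axon parameters; the talk's code couples many such neurons with synapses over MPI and event-driven messaging, which this toy example omits.

# Single Hodgkin-Huxley neuron, forward Euler integration, classic parameters.
import numpy as np

C, G_NA, G_K, G_L = 1.0, 120.0, 36.0, 0.3        # uF/cm^2, mS/cm^2
E_NA, E_K, E_L = 50.0, -77.0, -54.4              # mV

def alpha_beta(v):
    am = 0.1 * (v + 40.0) / (1.0 - np.exp(-(v + 40.0) / 10.0))
    bm = 4.0 * np.exp(-(v + 65.0) / 18.0)
    ah = 0.07 * np.exp(-(v + 65.0) / 20.0)
    bh = 1.0 / (1.0 + np.exp(-(v + 35.0) / 10.0))
    an = 0.01 * (v + 55.0) / (1.0 - np.exp(-(v + 55.0) / 10.0))
    bn = 0.125 * np.exp(-(v + 65.0) / 80.0)
    return am, bm, ah, bh, an, bn

def simulate(i_ext=10.0, dt=0.01, t_end=50.0):
    v, m, h, n = -65.0, 0.05, 0.6, 0.32          # resting state
    trace = []
    for _ in range(int(t_end / dt)):
        am, bm, ah, bh, an, bn = alpha_beta(v)
        m += dt * (am * (1.0 - m) - bm * m)      # gating variable updates
        h += dt * (ah * (1.0 - h) - bh * h)
        n += dt * (an * (1.0 - n) - bn * n)
        i_ion = (G_NA * m**3 * h * (v - E_NA)
                 + G_K * n**4 * (v - E_K) + G_L * (v - E_L))
        v += dt * (i_ext - i_ion) / C            # membrane potential update
        trace.append(v)
    return np.array(trace)

if __name__ == "__main__":
    v = simulate()
    print(f"peak membrane potential: {v.max():.1f} mV")   # spikes past 0 mV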
Harnessing the power of emerging petascale platforms
NASA Astrophysics Data System (ADS)
Mellor-Crummey, John
2007-07-01
As part of the US Department of Energy's Scientific Discovery through Advanced Computing (SciDAC-2) program, science teams are tackling problems that require computational simulation and modeling at the petascale. A grand challenge for computer science is to develop software technology that makes it easier to harness the power of these systems to aid scientific discovery. As part of its activities, the SciDAC-2 Center for Scalable Application Development Software (CScADS) is building open source software tools to support efficient scientific computing on the emerging leadership-class platforms. In this paper, we describe two tools for performance analysis and tuning that are being developed as part of CScADS: a tool for analyzing scalability and performance, and a tool for optimizing loop nests for better node performance. We motivate these tools by showing how they apply to S3D, a turbulent combustion code under development at Sandia National Laboratory. For S3D, our node performance analysis tool helped uncover several performance bottlenecks. Using our loop nest optimization tool, we transformed S3D's most costly loop nest to reduce execution time by a factor of 2.94 for a processor working on a 50^3 domain.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rojas, Joseph Maurice
We summarize the contributions of the Texas A&M University Group to the project (DE-FG02-09ER25949/DE-SC0002505: Topology for Statistical Modeling of Petascale Data - an ASCR-funded collaboration between Sandia National Labs, Texas A&M U, and U Utah) during 6/9/2011 to 2/27/2013.
Turner, Michael S
2007-01-05
Cosmology is in the midst of a period of revolutionary discovery, propelled by bold ideas from particle physics and by technological advances from gigapixel charge-coupled device cameras to peta-scale computing. The basic features of the universe have now been determined: It is 13.7 billion years old, spatially flat, and expanding at an accelerating rate; it is composed of atoms (4%), exotic dark matter (20%), and dark energy (76%); and there is evidence that galaxies and other structures were seeded by quantum fluctuations. Although we know much about the universe, we understand far less. Poised to dramatically advance our understanding of both the universe and the laws that govern it, cosmology is on the verge of a golden age.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Barrett, Brian W.; Hemmert, K. Scott; Underwood, Keith Douglas
Achieving the next three orders of magnitude performance increase to move from petascale to exascale computing will require significant advancements in several fundamental areas. Recent studies have outlined many of the hardware and software challenges that will need to be addressed. In this paper, we examine these challenges with respect to high-performance networking. We describe the repercussions of anticipated changes to computing and networking hardware and discuss the impact that alternative parallel programming models will have on the network software stack. We also present some ideas on possible approaches that address some of these challenges.
MADNESS: A Multiresolution, Adaptive Numerical Environment for Scientific Simulation
Harrison, Robert J.; Beylkin, Gregory; Bischoff, Florian A.; ...
2016-01-01
We present MADNESS (multiresolution adaptive numerical environment for scientific simulation) that is a high-level software environment for solving integral and differential equations in many dimensions that uses adaptive and fast harmonic analysis methods with guaranteed precision that are based on multiresolution analysis and separated representations. Underpinning the numerical capabilities is a powerful petascale parallel programming environment that aims to increase both programmer productivity and code scalability. This paper describes the features and capabilities of MADNESS and briefly discusses some current applications in chemistry and several areas of physics.
Understanding I/O workload characteristics of a Peta-scale storage system
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kim, Youngjae; Gunasekaran, Raghul
2015-01-01
Understanding workload characteristics is critical for optimizing and improving the performance of current systems and software, and for architecting new storage systems based on observed workload patterns. In this paper, we characterize the I/O workloads of scientific applications on one of the world's fastest high performance computing (HPC) storage clusters, Spider, at the Oak Ridge Leadership Computing Facility (OLCF). OLCF's flagship petascale simulation platform, Titan, and other large HPC clusters, totaling over 250 thousand compute cores, depend on Spider for their I/O needs. We characterize the system utilization, the demands of reads and writes, idle time, storage space utilization, and the distribution of read requests to write requests for this peta-scale storage system. From this study, we develop synthesized workloads, and we show that the read and write I/O bandwidth usage as well as the inter-arrival time of requests can be modeled as a Pareto distribution. We also study I/O load imbalance problems using I/O performance data collected from the Spider storage system.
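The abstract reports that bandwidth usage and request inter-arrival times fit a Pareto distribution. As a minimal sketch of that modeling step, the code below fits a Pareto distribution to synthetic inter-arrival times with SciPy and checks a tail probability; the data here are generated on the fly and are not the actual Spider telemetry.

# Fitting a Pareto distribution to (synthetic) request inter-arrival times.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
# Stand-in for measured inter-arrival times (seconds), heavy tailed on purpose.
inter_arrivals = stats.pareto.rvs(b=1.8, scale=0.01, size=50_000, random_state=rng)

# Fit shape (b) and scale, pinning the location parameter at zero.
b_hat, loc_hat, scale_hat = stats.pareto.fit(inter_arrivals, floc=0)
print(f"fitted Pareto shape={b_hat:.2f}, scale={scale_hat:.4f}")

# A simple adequacy check: compare empirical and model tail probabilities.
threshold = np.quantile(inter_arrivals, 0.99)
empirical_tail = float(np.mean(inter_arrivals > threshold))
model_tail = float(stats.pareto.sf(threshold, b_hat, loc=loc_hat, scale=scale_hat))
print(f"P(X > {threshold:.3f}s): empirical={empirical_tail:.4f}, model={model_tail:.4f}")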
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ahrens, James P; Patchett, John M; Lo, Li - Ta
2011-01-24
This report provides documentation for the completion of the Los Alamos portion of the ASC Level II 'Visualization on the Supercomputing Platform' milestone. This ASC Level II milestone is a joint milestone between Sandia National Laboratory and Los Alamos National Laboratory. The milestone text is shown in Figure 1 with the Los Alamos portions highlighted in boldfaced text. Visualization and analysis of petascale data is limited by several factors which must be addressed as ACES delivers the Cielo platform. Two primary difficulties are: (1) Performance of interactive rendering, which is the most computationally intensive portion of the visualization process. For terascale platforms, commodity clusters with graphics processors (GPUs) have been used for interactive rendering. For petascale platforms, visualization and rendering may be able to run efficiently on the supercomputer platform itself. (2) I/O bandwidth, which limits how much information can be written to disk. If we simply analyze the sparse information that is saved to disk we miss the opportunity to analyze the rich information produced every timestep by the simulation. For the first issue, we are pursuing in-situ analysis, in which simulations are coupled directly with analysis libraries at runtime. This milestone will evaluate the visualization and rendering performance of current and next generation supercomputers in contrast to GPU-based visualization clusters, and evaluate the performance of common analysis libraries coupled with the simulation that analyze and write data to disk during a running simulation. This milestone will explore, evaluate and advance the maturity level of these technologies and their applicability to problems of interest to the ASC program. In conclusion, we improved CPU-based rendering performance by a factor of 2-10 times on our tests. In addition, we evaluated CPU- and GPU-based rendering performance. We encourage production visualization experts to consider using CPU-based rendering solutions when appropriate. For example, on remote supercomputers CPU-based rendering can offer a means of viewing data without having to offload the data or geometry onto a GPU-based visualization system. In terms of the comparative performance of the CPU and GPU, we believe that further optimizations of both CPU- and GPU-based rendering are possible. The simulation community is currently confronting this reality as they work to port their simulations to different hardware architectures. What is interesting about CPU rendering of massive datasets is that for the past two decades GPU performance has significantly outperformed CPU-based systems. Based on our advancements, evaluations and explorations we believe that CPU-based rendering has returned as one viable option for the visualization of massive datasets.
NASA Astrophysics Data System (ADS)
Wyborn, L. A.; Evans, B. J. K.; Pugh, T.; Lescinsky, D. T.; Foster, C.; Uhlherr, A.
2014-12-01
The National Computational Infrastructure (NCI) at the Australian National University (ANU) is a partnership between CSIRO, ANU, Bureau of Meteorology (BoM) and Geoscience Australia. Recent investments in a 1.2 PFlop Supercomputer (Raijin), ~ 20 PB data storage using Lustre filesystems and a 3000 core high performance cloud have created a hybrid platform for high-performance computing and data-intensive science to enable large scale earth and climate systems modelling and analysis. There are > 3000 users actively logging in and > 600 projects on the NCI system. Efficiently scaling and adapting data and software systems to petascale infrastructures requires the collaborative development of an architecture that is designed, programmed and operated to enable users to interactively invoke different forms of in-situ computation over complex and large scale data collections. NCI makes available major and long tail data collections from both the government and research sectors based on six themes: 1) weather, climate and earth system science model simulations, 2) marine and earth observations, 3) geosciences, 4) terrestrial ecosystems, 5) water and hydrology and 6) astronomy, bio and social. Collectively they span the lithosphere, crust, biosphere, hydrosphere, troposphere, and stratosphere. Collections are the operational form for data management and access. Similar data types from individual custodians are managed cohesively. Use of international standards for discovery and interoperability allow complex interactions within and between the collections. This design facilitates a transdisciplinary approach to research and enables a shift from small scale, 'stove-piped' science efforts to large scale, collaborative systems science. This new and complex infrastructure requires a move to shared, globally trusted software frameworks that can be maintained and updated. Workflow engines become essential and need to integrate provenance, versioning, traceability, repeatability and publication. There are also human resource challenges as highly skilled HPC/HPD specialists, specialist programmers, and data scientists are required whose skills can support scaling to the new paradigm of effective and efficient data-intensive earth science analytics on petascale, and soon exascale, systems.
Multi-petascale highly efficient parallel supercomputer
DOE Office of Scientific and Technical Information (OSTI.GOV)
Asaad, Sameh; Bellofatto, Ralph E.; Blocksome, Michael A.
A Multi-Petascale Highly Efficient Parallel Supercomputer of 100 petaflop-scale includes node architectures based upon System-On-a-Chip technology, where each processing node comprises a single Application Specific Integrated Circuit (ASIC). The ASIC nodes are interconnected by a five-dimensional torus network that optimally maximizes the throughput of packet communications between nodes and minimizes latency. The network implements a collective network and a global asynchronous network that provides global barrier and notification functions. Integrated in the node design is a list-based prefetcher. The memory system implements transactional memory, thread-level speculation, and a multiversioning cache that improves soft error rate and at the same time supports DMA functionality, allowing for parallel processing message-passing.
Capturing Petascale Application Characteristics with the Sequoia Toolkit
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vetter, Jeffrey S; Bhatia, Nikhil; Grobelny, Eric M
2005-01-01
Characterization of the computation, communication, memory, and I/O demands of current scientific applications is crucial for identifying which technologies will enable petascale scientific computing. In this paper, we present the Sequoia Toolkit for characterizing HPC applications. The Sequoia Toolkit consists of the Sequoia trace capture library and the Sequoia Event Analysis Library, or SEAL, that facilitates the development of tools for analyzing Sequoia event traces. Using the Sequoia Toolkit, we have characterized the behavior of application runs with up to 2048 application processes. To illustrate the use of the Sequoia Toolkit, we present a preliminary characterization of LAMMPS, a molecular dynamics application of great interest to the computational biology community.
DEM Based Modeling: Grid or TIN? The Answer Depends
NASA Astrophysics Data System (ADS)
Ogden, F. L.; Moreno, H. A.
2015-12-01
The availability of petascale supercomputing power has enabled process-based hydrological simulations on large watersheds and two-way coupling with mesoscale atmospheric models. Of course with increasing watershed scale come corresponding increases in watershed complexity, including wide ranging water management infrastructure and objectives, and ever increasing demands for forcing data. Simulations of large watersheds using grid-based models apply a fixed resolution over the entire watershed. In large watersheds, this means an enormous number of grids, or coarsening of the grid resolution to reduce memory requirements. One alternative to grid-based methods is the triangular irregular network (TIN) approach. TINs provide the flexibility of variable resolution, which allows optimization of computational resources by providing high resolution where necessary and low resolution elsewhere. TINs also increase required effort in model setup, parameter estimation, and coupling with forcing data which are often gridded. This presentation discusses the costs and benefits of the use of TINs compared to grid-based methods, in the context of large watershed simulations within the traditional gridded WRF-HYDRO framework and the new TIN-based ADHydro high performance computing watershed simulator.
Petascale Simulation Initiative Tech Base: FY2007 Final Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
May, J; Chen, R; Jefferson, D
The Petascale Simulation Initiative began as an LDRD project in the middle of Fiscal Year 2004. The goal of the project was to develop techniques to allow large-scale scientific simulation applications to better exploit the massive parallelism that will come with computers running at petaflops per second. One of the major products of this work was the design and prototype implementation of a programming model and a runtime system that lets applications extend data-parallel applications to use task parallelism. By adopting task parallelism, applications can use processing resources more flexibly, exploit multiple forms of parallelism, and support more sophisticated multiscale and multiphysics models. Our programming model was originally called the Symponents Architecture but is now known as Cooperative Parallelism, and the runtime software that supports it is called Coop. (However, we sometimes refer to the programming model as Coop for brevity.) We have documented the programming model and runtime system in a submitted conference paper [1]. This report focuses on the specific accomplishments of the Cooperative Parallelism project (as we now call it) under Tech Base funding in FY2007. Development and implementation of the model under LDRD funding alone proceeded to the point of demonstrating a large-scale materials modeling application using Coop on more than 1300 processors by the end of FY2006. Beginning in FY2007, the project received funding from both LDRD and the Computation Directorate Tech Base program. Later in the year, after the three-year term of the LDRD funding ended, the ASC program supported the project with additional funds. The goal of the Tech Base effort was to bring Coop from a prototype to a production-ready system that a variety of LLNL users could work with. Specifically, the major tasks that we planned for the project were: (1) Port SARS [former name of the Coop runtime system] to another LLNL platform, probably Thunder or Peloton (depending on when Peloton becomes available); (2) Improve SARS's robustness and ease-of-use, and develop user documentation; and (3) Work with LLNL code teams to help them determine how Symponents could benefit their applications. The original funding request was $296,000 for the year, and we eventually received $252,000. The remainder of this report describes our efforts and accomplishments for each of the goals listed above.
Advances in petascale kinetic plasma simulation with VPIC and Roadrunner
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bowers, Kevin J; Albright, Brian J; Yin, Lin
2009-01-01
VPIC, a first-principles 3d electromagnetic charge-conserving relativistic kinetic particle-in-cell (PIC) code, was recently adapted to run on Los Alamos's Roadrunner, the first supercomputer to break a petaflop (10^15 floating point operations per second) in the TOP500 supercomputer performance rankings. They give a brief overview of the modeling capabilities and optimization techniques used in VPIC and the computational characteristics of petascale supercomputers like Roadrunner. They then discuss three applications enabled by VPIC's unprecedented performance on Roadrunner: modeling laser plasma interaction in upcoming inertial confinement fusion experiments at the National Ignition Facility (NIF), modeling short pulse laser GeV ion acceleration, and modeling reconnection in magnetic confinement fusion experiments.
NASA Astrophysics Data System (ADS)
Keyes, David E.
2007-09-01
It takes a village to perform a petascale computation—domain scientists, applied mathematicians, computer scientists, computer system vendors, program managers, and support staff—and the village was assembled during 24-28 June 2007 in Boston's Westin Copley Place for the third annual Scientific Discovery through Advanced Computing (SciDAC) 2007 Conference. Over 300 registered participants networked around 76 posters, focused on achievements and challenges in 36 plenary talks, and brainstormed in two panels. In addition, with an eye to spreading the vision for simulation at the petascale and to growing the workforce, 115 participants—mostly doctoral students and post-docs complementary to the conferees—were gathered on 29 June 2007 in classrooms of the Massachusetts Institute of Technology for a full day of tutorials on the use of SciDAC software. Eleven SciDAC-sponsored research groups presented their software at an introductory level, in both lecture and hands-on formats that included live runs on a local BlueGene/L. Computation has always been about garnering insight into the behavior of systems too complex to explore satisfactorily by theoretical means alone. Today, however, computation is about much more: scientists and decision makers expect quantitatively reliable predictions from simulations ranging in scale from that of the Earth's climate, down to quarks, and out to colliding black holes. Predictive simulation lies at the heart of policy choices in energy and environment affecting billions of lives and expenditures of trillions of dollars. It is also at the heart of scientific debates on the nature of matter and the origin of the universe. The petascale is barely adequate for such demands and we are barely established at the levels of resolution and throughput that this new scale of computation affords. However, no scientific agenda worldwide is pushing the petascale frontier on all its fronts as vigorously as SciDAC. The breadth of this conference archive reflects the philosophy of the SciDAC program, which was introduced as a collaboration of all of the program offices in the Office of Science of the U.S. Department of Energy (DOE) in Fall 2001 and was renewed for a second period of five years in Fall 2006, with additional support in certain areas from the DOE's National Nuclear Security Administration (NNSA) and the U.S. National Science Foundation (NSF). All of the projects in the SciDAC portfolio were represented at the conference and most are captured in this volume. In addition, the Organizing Committee incorporated into the technical program a number of computational science highlights from outside of SciDAC, and, indeed, from outside of the United States. As implied by the title, scientific discovery is the driving deliverable of the SciDAC program, spanning the full range of the DOE Office of Science: accelerator design, astrophysics, chemistry and materials science, climate science, combustion, life science, nuclear physics, plasma physics, and subsurface physics. As articulated in the eponymous report that launched SciDAC, the computational challenges of these diverse areas are remarkably common. Each is profoundly multiscale in space and time and therefore continues to benefit at any margin from access to the largest and fastest computers available. Optimality of representation and execution requires adaptive, scalable mathematical algorithms in both continuous (geometrically complex domain) and discrete (mesh and graph) aspects. 
Programmability and performance optimality require software environments that both manage the intricate details of the underlying hardware and abstract them for scientific users. Running effectively on remote specialized hardware requires transparent workflow systems. Comprehending the petascale data sets generated in such simulations requires automated tools for data exploration and visualization. Archiving and sharing access to this data within the inevitably distributed community of leading scientists requires networked collaborative environments. Each of these elements is a research and development project in its own right. SciDAC does not replace theoretical programs oriented towards long-term basic research, but harvests them for contemporary, complementary state-of-the-art computational campaigns. By clustering researchers from applications and enabling technologies into coordinated, mission-driven projects, SciDAC accomplishes two ends with remarkable effectiveness: (1) it enriches the scientific perspective of both applications and enabling communities through mutual interaction and (2) it leverages, between applications, solutions and effort encapsulated in software. Though SciDAC is unique, its objective of multiscale science at extreme computational scale is shared and approached through different programmatic mechanisms, notably NNSA's ASC program, NSF's Cyberinfrastructure program, and DoD's CREATE program in the U.S., and RIKEN's computational simulation programs in Japan. Representatives of each of these programs were given the podium at SciDAC 2007 and communication occurred that will be valuable towards the ends of complementarity, leverage, and promulgation of best practices. The 2007 conference was graced with additional welcome program announcements. Michael Strayer announced a new program of postdoctoral research fellowships in the enabling technologies. (The computer science post-docs will be named after the late Professor Ken Kennedy, who briefly led the SciDAC project Center for Scalable Application Development Software (CScADS) until his untimely death in February 2007.) IBM announced its petascale BlueGene/P system on June 26. Meanwhile, at ISC07 in Dresden, the semi-annual posting of a revised Top 500 list on June 27 showed several new Top 10 systems accessible to various SciDAC participants. While SciDAC is dominated in 2007 by the classical scientific pursuit of understanding through reduction to components and isolation of causes and effects, simulation at scale is beginning to offer something even more tantalizing: synthesis and integration of multiple interacting phenomena in complex systems. Indeed, the design-oriented elements of SciDAC, such as accelerator and tokamak modeling, are already emphasizing multiphysics coupling, and climate science has been doing so for years in the coupling of models of the ocean, atmosphere, ice, and land. In one of the panels at SciDAC 2007, leaders of a three-stage 'progressive workshop' on exascale simulation for energy and environment (E3) considered prospects for whole-system modeling in a variety of scientific areas within the domain of DOE related to energy, environmental, and global security. Computer vendors were invited to comment on the prospects for delivering exascale computing systems in another panel. The daunting nature of this challenge is summarized with the observation that the peak processing power of the entire Top 500 list of June 2007 is only 0.0052 exaflop/s.
It takes the combined power of most of the computers on the internet today worldwide to reach 1 exaflop/s or 10^18 floating point operations per second. The program of SciDAC 2007 followed a template honed by its predecessor meetings in San Francisco in 2005 and Denver in 2006. The Boston venue permitted outreach to a number of universities in the immediate region and throughout southern New England, including SciDAC campuses of Boston University, Harvard, and MIT, and a dozen others including most of the Ivy League. Altogether 55 universities, 20 laboratories, 14 private companies, 5 agencies, and 4 countries were represented among the conference and tutorial workshop participants. Approximately 47% of the conference participants were from government laboratories, 37% from universities, 9% from federal program offices, and 7% from industry. Keys to the success of SciDAC 2007 were the informal poster receptions, coffee breaks, working breakfasts and lunches, and even the 'Right-brain Night' featuring artistic statements, both reverent and irreverent, by computational scientists, inspired by their work. The organizers thank the sponsors for their generosity in attracting participants to these informal occasions with sumptuous snacks and beverages: AMD, Cray, DataDirect, IBM, SGI, SiCortex, and the Institute of Physics. A conference as logistically complex as SciDAC 2007 cannot possibly and should not be executed primarily by the scientists, themselves. It is a great pleasure to acknowledge the many talented staff that contributed to a productive time for all participants and near-perfect adherence to schedule. Chief among them is Betsy Riley, currently detailed from ORNL to the program office in Germantown, with degrees in mathematics and computer science, but a passion for organizing interdisciplinary scientific programs. Betsy staffed the organizing committee during the year of telecon meetings leading up to the conference and masterminded sponsorship, invitations, and the compilation of the proceedings. Assisting her from ORNL in managing the program were Daniel Pack, Angela Beach, and Angela Fincher. Cynthia Latham of ORNL performed admirably in website and graphic design for all aspects of the online and printed materials of the meeting. John Bui, John Smith, and Missy Smith of ORNL ran their customary tight ship with respect to audio-visual execution and capture, assisted by Eric Ecklund and Keith Quinn of the Westin. Pamelia Nixon-Hartje of Ambassador Services was personally invaluable in getting the most out of the hotel and its staff. We thank Jeff Nichols of ORNL for managing the primary subcontract for the meeting. The SciDAC tutorial program was a joint effort of Professor John Negele of MIT, David Skinner, PI of the SciDAC Outreach Center, and the SciDAC 2007 Chair. Sponsorship from the Outreach Center in the form of travel scholarships for students, and of the local area SciDAC university delegation of BU, Harvard, and MIT for food and facilities is gratefully acknowledged. Of course, the archival success of a scientific meeting rests with the willingness of the presenters to make the extra effort to package their field-leading science in a form suitable for interaction with colleagues from other disciplines rather than fellow specialists. This goal, oft-stated in the run up to the meeting, was achieved to an admirable degree, both in the live presentations and in these proceedings.
This effort is its own reward, since it leads to enhanced communication and accelerated scientific progress. Our greatest thanks are reserved for Michael Strayer, Associate Director for OASCR and the Director of SciDAC, for envisioning this celebratory meeting three years ago and sustaining it with his own enthusiasm, in order to provide a highly visible manifestation of the fruits of SciDAC. He and the other Office of Science program managers in attendance, who work in Washington, DC to communicate the opportunities afforded by SciDAC, deserve the gratitude of a new virtual scientific village created and cemented under the vision of scientific discovery through advanced computing. David E. Keyes, Fu Foundation Professor of Applied Mathematics
Hadwiger, M; Beyer, J; Jeong, Won-Ki; Pfister, H
2012-12-01
This paper presents the first volume visualization system that scales to petascale volumes imaged as a continuous stream of high-resolution electron microscopy images. Our architecture scales to dense, anisotropic petascale volumes because it: (1) decouples construction of the 3D multi-resolution representation required for visualization from data acquisition, and (2) decouples sample access time during ray-casting from the size of the multi-resolution hierarchy. Our system is designed around a scalable multi-resolution virtual memory architecture that handles missing data naturally, does not pre-compute any 3D multi-resolution representation such as an octree, and can accept a constant stream of 2D image tiles from the microscopes. A novelty of our system design is that it is visualization-driven: we restrict most computations to the visible volume data. Leveraging the virtual memory architecture, missing data are detected during volume ray-casting as cache misses, which are propagated backwards for on-demand out-of-core processing. 3D blocks of volume data are only constructed from 2D microscope image tiles when they have actually been accessed during ray-casting. We extensively evaluate our system design choices with respect to scalability and performance, compare to previous best-of-breed systems, and illustrate the effectiveness of our system for real microscopy data from neuroscience.
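The cache-miss-driven, on-demand construction described above can be illustrated with a small, hedged sketch (not the authors' implementation): the block and tile sizes, and the fetch_tile callback standing in for the microscope tile stream, are illustrative assumptions.

import numpy as np

TILE = 32  # hypothetical edge length (voxels) of a cached 3D block

class OnDemandVolume:
    """Toy model of a visualization-driven virtual-memory volume:
    3D blocks are built from 2D image tiles only when ray-casting
    first touches them (a 'cache miss')."""

    def __init__(self, fetch_tile):
        self.fetch_tile = fetch_tile      # callable(z) -> 2D numpy slice
        self.cache = {}                   # block index -> 3D block

    def block(self, bx, by, bz):
        key = (bx, by, bz)
        if key not in self.cache:         # cache miss: build the block on demand
            slices = [self.fetch_tile(bz * TILE + dz)[
                          by * TILE:(by + 1) * TILE,
                          bx * TILE:(bx + 1) * TILE] for dz in range(TILE)]
            self.cache[key] = np.stack(slices, axis=0)
        return self.cache[key]

    def sample(self, x, y, z):
        """Nearest-neighbour sample as a ray-caster would request it."""
        b = self.block(x // TILE, y // TILE, z // TILE)
        return b[z % TILE, y % TILE, x % TILE]

# Usage: synthetic tiles stand in for the continuous acquisition stream.
vol = OnDemandVolume(lambda z: np.full((1024, 1024), z, dtype=np.uint8))
print(vol.sample(100, 200, 40))   # only the touched block is constructed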
DOE Office of Scientific and Technical Information (OSTI.GOV)
Spentzouris, P. (Fermilab); Cary, J.
The design and performance optimization of particle accelerators are essential for the success of the DOE scientific program in the next decade. Particle accelerators are very complex systems whose accurate description involves a large number of degrees of freedom and requires the inclusion of many physics processes. Building on the success of the SciDAC-1 Accelerator Science and Technology project, the SciDAC-2 Community Petascale Project for Accelerator Science and Simulation (ComPASS) is developing a comprehensive set of interoperable components for beam dynamics, electromagnetics, electron cooling, and laser/plasma acceleration modelling. ComPASS is providing accelerator scientists the tools required to enable the necessary accelerator simulation paradigm shift from high-fidelity single physics process modeling (covered under SciDAC-1) to high-fidelity multiphysics modeling. Our computational frameworks have been used to model the behavior of a large number of accelerators and accelerator R&D experiments, assisting both their design and performance optimization. As parallel computational applications, the ComPASS codes have been shown to make effective use of thousands of processors. ComPASS is in the first year of executing its plan to develop the next-generation HPC accelerator modeling tools. ComPASS aims to develop an integrated simulation environment that will utilize existing and new accelerator physics modules with petascale capabilities, by employing modern computing and solver technologies. The ComPASS vision is to deliver to accelerator scientists a virtual accelerator and virtual prototyping modeling environment, with the necessary multiphysics, multiscale capabilities. The plan for this development includes delivering accelerator modeling applications appropriate for each stage of the ComPASS software evolution. Such applications are already being used to address challenging problems in accelerator design and optimization. The ComPASS organization for software development and applications accounts for the natural domain areas (beam dynamics, electromagnetics, and advanced acceleration), and all areas depend on the enabling technologies activities, such as solvers and component technology, to deliver the desired performance and integrated simulation environment. The ComPASS applications focus on computationally challenging problems important for design or performance optimization to all major HEP, NP, and BES accelerator facilities. With the cost and complexity of particle accelerators rising, the use of computation to optimize their designs and find improved operating regimes becomes essential, potentially leading to significant cost savings with modest investment.
Final Scientific Report: A Scalable Development Environment for Peta-Scale Computing
DOE Office of Scientific and Technical Information (OSTI.GOV)
Karbach, Carsten; Frings, Wolfgang
2013-02-22
This document is the final scientific report of the project DE-SC000120 (A Scalable Development Environment for Peta-Scale Computing). The objective of this project is the extension of the Parallel Tools Platform (PTP) for applying it to peta-scale systems. PTP is an integrated development environment for parallel applications. It comprises code analysis, performance tuning, parallel debugging and system monitoring. The contribution of the Juelich Supercomputing Centre (JSC) aims to provide a scalable solution for system monitoring of supercomputers. This includes the development of a new communication protocol for exchanging status data between the target remote system and the client running PTP. The communication has to work even over high-latency connections. PTP needs to be implemented robustly and should hide the complexity of the supercomputer's architecture in order to provide transparent access to various remote systems via a uniform user interface. This simplifies the porting of applications to different systems, because PTP functions as an abstraction layer between the parallel application developer and the compute resources. The common requirement for all PTP components is that they have to interact with the remote supercomputer. For example, applications are built remotely, performance tools are attached to job submissions, and their output data resides on the remote system. Status data has to be collected by evaluating outputs of the remote job scheduler, and the parallel debugger needs to control an application executed on the supercomputer. The challenge is to provide this functionality for peta-scale systems in real time. The client-server architecture of the established monitoring application LLview, developed by the JSC, can be applied to PTP's system monitoring. LLview provides a well-arranged overview of the supercomputer's current status. A set of statistics, a list of running and queued jobs, and a node display mapping running jobs to their compute resources form the user display of LLview. These monitoring features have to be integrated into the development environment. Besides showing the current status, PTP's monitoring also needs to allow for submitting and canceling user jobs. Monitoring peta-scale systems especially means presenting the large amount of status data in a useful manner. Users need to be able to select arbitrary levels of detail. The monitoring views have to provide a quick overview of the system state, but also need to allow for zooming into the specific parts of the system in which the user is interested. At present, the major batch systems running on supercomputers are PBS, TORQUE, ALPS and LoadLeveler, which have to be supported by both the monitoring and the job-controlling component. Finally, PTP needs to be designed as generically as possible, so that it can be extended for future batch systems.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gibson, Garth
Petascale computing infrastructures for scientific discovery make petascale demands on information storage capacity, performance, concurrency, reliability, availability, and manageability. The Petascale Data Storage Institute focuses on the data storage problems found in petascale scientific computing environments, with special attention to community issues such as interoperability, community buy-in, and shared tools. The Petascale Data Storage Institute is a collaboration between researchers at Carnegie Mellon University, National Energy Research Scientific Computing Center, Pacific Northwest National Laboratory, Oak Ridge National Laboratory, Sandia National Laboratory, Los Alamos National Laboratory, University of Michigan, and the University of California at Santa Cruz. Because the Institute focuses on low-level file systems and storage systems, its role in improving SciDAC systems was one of supporting application middleware such as data management and system-level performance tuning. In retrospect, the Petascale Data Storage Institute's most innovative and impactful contribution is the Parallel Log-structured File System (PLFS). Published at SC09, PLFS is middleware that operates within MPI-IO or, for non-MPI applications, embedded in FUSE. Its function is to decouple concurrently written files into a per-process log file, whose impact (the contents of the single file that the parallel application was concurrently writing) is determined on later reading, rather than during its writing. PLFS is transparent to the parallel application, offering a POSIX or MPI-IO interface, and it shows an order of magnitude speedup on the Chombo benchmark and two orders of magnitude on the FLASH benchmark. Moreover, LANL production applications see speedups of 5X to 28X, so PLFS has been put into production at LANL. Originally conceived and prototyped in a PDSI collaboration between LANL and CMU, it has grown to engage many other PDSI institutes and international partners like AWE, and has a large team at EMC supporting and enhancing it. PLFS is open sourced with a BSD license on SourceForge. Post-PDSI funding comes from NNSA and industry sources. Moreover, PLFS has spun out half a dozen or more papers, partnered on research with multiple schools and vendors, and has projects to transparently 1) distribute metadata over independent metadata servers, 2) exploit drastically non-POSIX Hadoop storage for HPC POSIX applications, 3) compress checkpoints on the fly, 4) batch delayed writes for write speed, 5) compress read-back indexes and parallelize their redistribution, 6) double-buffer writes in NAND Flash storage to decouple host blocking during checkpoint from disk write time in the storage system, and 7) pack small files into a smaller number of bigger containers. There are two large-scale open source Linux software projects that PDSI significantly incubated, though neither was initiated in PDSI. These are 1) Ceph, a UCSC parallel object storage research project that has continued to be a vehicle for research and has become a released part of Linux, and 2) Parallel NFS (pNFS), a portion of the IETF's NFSv4.1 that brings the core data parallelism found in Lustre, PanFS, PVFS, and Ceph to the industry-standard NFS, with released code in Linux 3.0 and vendor offerings, with products from NetApp, EMC, BlueArc and RedHat. Both are fundamentally supported and advanced by vendor companies now, but were critically transferred from research demonstration to viable product with funding from PDSI, in part. 
At this point Lustre remains the primary path to scalable I/O in exascale systems, but both Ceph and pNFS are viable alternatives with different fundamental advantages. Finally, research community building was a big success for PDSI. Through the HECFSIO workshops and the HECURA project with NSF, PDSI stimulated and helped to steer leveraged funding of over $25M. Through the Petascale (now Parallel) Data Storage Workshop series, www.pdsw.org, colocated with SCxy each year, PDSI created and incubated five offerings of this high-attendance workshop. The workshop has gone on without PDSI support with two more highly successful workshops, rewriting its organizational structure to be community managed. More than 70 peer-reviewed papers have been presented at PDSW workshops.
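As a rough illustration of the log-structured decoupling idea behind PLFS (not its actual code or on-disk format), the sketch below has each writer append to its own log plus an index of logical offsets, and a reader reassembles the shared file only when it is read; all file names and index formats here are hypothetical.

import json, os

# Toy illustration of log-structured write decoupling: each writer appends to
# its own log plus an index of (logical_offset, local_offset, length); a reader
# reassembles the logical file lazily.

def write_chunk(rank, logical_offset, data, directory="plfs_like"):
    os.makedirs(directory, exist_ok=True)
    log = os.path.join(directory, f"data.{rank}")
    index = os.path.join(directory, f"index.{rank}")
    with open(log, "ab") as f:
        local_offset = f.tell()
        f.write(data)
    with open(index, "a") as f:
        f.write(json.dumps([logical_offset, local_offset, len(data)]) + "\n")

def read_logical(size, nranks, directory="plfs_like"):
    out = bytearray(size)
    for rank in range(nranks):
        with open(os.path.join(directory, f"index.{rank}")) as idx, \
             open(os.path.join(directory, f"data.{rank}"), "rb") as log:
            for line in idx:
                logical, local, length = json.loads(line)
                log.seek(local)
                out[logical:logical + length] = log.read(length)
    return bytes(out)

# Two "processes" write interleaved regions of one logical checkpoint file.
write_chunk(0, 0, b"AAAA")
write_chunk(1, 4, b"BBBB")
print(read_logical(8, 2))   # b'AAAABBBB'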
Pope, Bernard J; Fitch, Blake G; Pitman, Michael C; Rice, John J; Reumann, Matthias
2011-01-01
Future multiscale and multiphysics models must use the power of high performance computing (HPC) systems to enable research into human disease, translational medical science, and treatment. Previously we showed that computationally efficient multiscale models will require the use of sophisticated hybrid programming models, mixing distributed message passing processes (e.g. the message passing interface (MPI)) with multithreading (e.g. OpenMP, POSIX pthreads). The objective of this work is to compare the performance of such hybrid programming models when applied to the simulation of a lightweight multiscale cardiac model. Our results show that the hybrid models do not perform favourably when compared to an implementation using only MPI which is in contrast to our results using complex physiological models. Thus, with regards to lightweight multiscale cardiac models, the user may not need to increase programming complexity by using a hybrid programming approach. However, considering that model complexity will increase as well as the HPC system size in both node count and number of cores per node, it is still foreseeable that we will achieve faster than real time multiscale cardiac simulations on these systems using hybrid programming models.
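A minimal sketch of the hybrid pattern discussed above, assuming mpi4py is available and using a Python thread pool as a stand-in for OpenMP threads; the cell counts and the per-cell update are placeholders, not the authors' cardiac model.

# Hybrid distributed + threaded parallelism: MPI ranks across nodes,
# a thread pool within each rank.
from concurrent.futures import ThreadPoolExecutor

import numpy as np
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank, nprocs = comm.Get_rank(), comm.Get_size()

N_CELLS = 1_000_000                       # illustrative problem size
local = np.ones(N_CELLS // nprocs)        # this rank's share of the cells

def update(chunk):
    # Placeholder for a per-cell membrane-model update.
    return np.sum(0.5 * chunk * chunk)

# Intra-node parallelism: split the local cells across threads.
with ThreadPoolExecutor(max_workers=4) as pool:
    partial = sum(pool.map(update, np.array_split(local, 4)))

# Inter-node parallelism: combine partial results across MPI ranks.
total = comm.allreduce(partial, op=MPI.SUM)
if rank == 0:
    print("global sum:", total)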
NASA Astrophysics Data System (ADS)
Wyborn, L.
2012-04-01
The advent of the petascale era, in both storage and compute facilities, will offer new opportunities for earth scientists to transform the way they do their science and to undertake cross-disciplinary science at a global scale. No longer will data have to be averaged and subsampled: it can be analysed at its fullest resolution at national or even global scales. Much larger data volumes can be analysed in single passes and at higher resolution: large scale cross domain science is now feasible. However, in general, earth sciences have been slow to capitalise on the potential of these new petascale compute facilities: many struggle to even use terascale facilities. Our chances of using these new facilities will require a vast improvement in the management of the full life cycle of data: in reality it will need to be transformed. Many of our current issues with earth science data are historic and stem from the limitations of early data storage systems. As storage was so expensive, metadata was usually stored separately from the data and attached as a readme file. Likewise, attributes that defined uncertainty, reliability and traceability were recorded in lab notebooks and rarely stored with the data. Data were routinely transferred as files. The new opportunities mean that the traditional paradigm of discover, display, download and process locally is too limited. For data access and assimilation to be improved, data will need to be self-describing. For heterogeneous data to be rapidly integrated, attributes such as reliability, uncertainty and traceability will need to be systematically recorded with each observation. The petascale era also requires that individual data files be transformed and aggregated into calibrated data arrays or data cubes. Standards become critical and are the enablers of integration. These changes are common to almost every science discipline. What makes earth sciences unique is that many domains record time series data, particularly in the environmental geosciences areas (weathering, soil changes, climate change). The data life cycle will be measured in decades and centuries, not years. Preservation over such time spans is quite a challenge for the earth sciences, as data will have to be managed over many evolutions of software and hardware. The focus has to be on managing the data and not the media. Currently storage is not an issue, but it is predicted that data volumes will soon exceed the effective storage media that can be physically manufactured. This means that organisations will have to think about disposal and destruction of data. For earth sciences, this will be a particularly sensitive issue. Petascale computing offers many new opportunities to the earth sciences, and by 2020 exascale computers will be a reality. To fully realise these opportunities, the earth sciences need to actively and systematically rethink the ramifications these new systems will have for current practices in data storage, discovery, access and assimilation.
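One way to make a record self-describing, sketched here with the netCDF4 Python bindings under illustrative variable names and attribute conventions, is to store uncertainty alongside each observation and carry provenance as file attributes.

from netCDF4 import Dataset
import numpy as np

# Minimal sketch of a self-describing record: the measurement is stored next
# to its uncertainty, and provenance travels with the file as attributes.
with Dataset("soil_moisture.nc", "w") as ds:           # illustrative file name
    ds.title = "Soil moisture time series"
    ds.history = "2012-03-01: calibrated against field sensor network v2"
    ds.source = "station XYZ, instrument serial 1234"  # traceability

    ds.createDimension("time", None)
    t = ds.createVariable("time", "f8", ("time",))
    t.units = "days since 2012-01-01"

    sm = ds.createVariable("soil_moisture", "f4", ("time",))
    sm.units = "m3 m-3"
    sm_unc = ds.createVariable("soil_moisture_uncertainty", "f4", ("time",))
    sm_unc.units = "m3 m-3"

    t[:] = np.arange(3)
    sm[:] = [0.21, 0.23, 0.22]
    sm_unc[:] = [0.02, 0.02, 0.03]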
Introducing Mira, Argonne's Next-Generation Supercomputer
DOE Office of Scientific and Technical Information (OSTI.GOV)
None
2013-03-19
Mira, the new petascale IBM Blue Gene/Q system installed at the ALCF, will usher in a new era of scientific supercomputing. An engineering marvel, the 10-petaflops machine is capable of carrying out 10 quadrillion calculations per second.
Towards Petascale DNS of High Reynolds-Number Turbulent Boundary Layer
NASA Astrophysics Data System (ADS)
Webster, Keegan R.
In flight vehicles, a large portion of fuel consumption is due to skin-friction drag. Reduction of this drag will significantly reduce the fuel consumption of flight vehicles and help our nation to reduce CO2 emissions. In order to reduce skin-friction drag, an increased understanding of wall-turbulence is needed. Direct numerical simulation (DNS) of spatially developing turbulent boundary layers (SDTBL) can provide the fundamental understanding of wall-turbulence in order to produce models for Reynolds averaged Navier-Stokes (RANS) and large-eddy simulations (LES). DNS of SDTBL over a flat plate at Reθ = 1430-2900 were performed. Improvements were made to the DNS code allowing for higher Reynolds number simulations towards petascale DNS of turbulent boundary layers. Mesh refinement and improvements to the inflow and outflow boundary conditions have resulted in turbulence statistics that match more closely to experimental results. The Reynolds stresses and the terms of their evolution equations are reported.
Active Storage with Analytics Capabilities and I/O Runtime System for Petascale Systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Choudhary, Alok
Computational scientists must understand results from experimental, observational and computational simulation generated data to gain insights and perform knowledge discovery. As systems approach the petascale range, problems that were unimaginable a few years ago are within reach. With the increasing volume and complexity of data produced by ultra-scale simulations and high-throughput experiments, understanding the science is largely hampered by the lack of comprehensive I/O, storage, acceleration of data manipulation, analysis, and mining tools. Scientists require techniques, tools and infrastructure to facilitate better understanding of their data, in particular the ability to effectively perform complex data analysis, statistical analysis and knowledge discovery. The goal of this work is to enable more effective analysis of scientific datasets through the integration of enhancements in the I/O stack, from active storage support at the file system layer to the MPI-IO and high-level I/O library layers. We propose to provide software components to accelerate data analytics, mining, I/O, and knowledge discovery for large-scale scientific applications, thereby increasing the productivity of both scientists and the systems. Our approaches include 1) designing the interfaces in high-level I/O libraries, such as parallel netCDF, for applications to activate data mining operations at the lower I/O layers; 2) enhancing MPI-IO runtime systems to incorporate the functionality developed as a part of the runtime system design; 3) developing parallel data mining programs as part of the runtime library for the server-side file system in the PVFS file system; and 4) prototyping an active storage cluster, which will utilize multicore CPUs, GPUs, and FPGAs to carry out the data mining workload.
Big Data: Next-Generation Machines for Big Science
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hack, James J.; Papka, Michael E.
Addressing the scientific grand challenges identified by the US Department of Energy's (DOE's) Office of Science's programs alone demands a total leadership-class computing capability of 150 to 400 Pflops by the end of this decade. The successors to three of the DOE's most powerful leadership-class machines are set to arrive in 2017 and 2018, the products of the Collaboration Oak Ridge Argonne Livermore (CORAL) initiative, a national laboratory-industry design/build approach to engineering next-generation petascale computers for grand challenge science. These mission-critical machines will enable discoveries in key scientific fields such as energy, biotechnology, nanotechnology, materials science, and high-performance computing, and serve as a milestone on the path to deploying exascale computing capabilities.
On-line Machine Learning and Event Detection in Petascale Data Streams
NASA Astrophysics Data System (ADS)
Thompson, David R.; Wagstaff, K. L.
2012-01-01
Traditional statistical data mining involves off-line analysis in which all data are available and equally accessible. However, petascale datasets have challenged this premise since it is often impossible to store, let alone analyze, the relevant observations. This has led the machine learning community to investigate adaptive processing chains where data mining is a continuous process. Here pattern recognition permits triage and followup decisions at multiple stages of a processing pipeline. Such techniques can also benefit new astronomical instruments such as the Large Synoptic Survey Telescope (LSST) and Square Kilometre Array (SKA) that will generate petascale data volumes. We summarize some machine learning perspectives on real time data mining, with representative cases of astronomical applications and event detection in high volume datastreams. The first is a "supervised classification" approach currently used for transient event detection at the Very Long Baseline Array (VLBA). It injects known signals of interest - faint single-pulse anomalies - and tunes system parameters to recover these events. This permits meaningful event detection for diverse instrument configurations and observing conditions whose noise cannot be well-characterized in advance. Second, "semi-supervised novelty detection" finds novel events based on statistical deviations from previous patterns. It detects outlier signals of interest while considering known examples of false alarm interference. Applied to data from the Parkes pulsar survey, the approach identifies anomalous "peryton" phenomena that do not match previous event models. Finally, we consider online light curve classification that can trigger adaptive followup measurements of candidate events. Classifier performance analyses suggest optimal survey strategies, and permit principled followup decisions from incomplete data. These examples trace a broad range of algorithm possibilities available for online astronomical data mining. This talk describes research performed at the Jet Propulsion Laboratory, California Institute of Technology. Copyright 2012, All Rights Reserved. U.S. Government support acknowledged.
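As a hedged illustration of the semi-supervised novelty-detection stage described above (not the VLBA or Parkes pipelines), a generic outlier model can be fit on previously seen events and used to flag candidates in a new streaming batch; the features, counts, and thresholds below are synthetic assumptions.

import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(0)

# "Known" behaviour: features of previously seen pulses and false alarms.
known = rng.normal(loc=0.0, scale=1.0, size=(5000, 3))

# Fit an outlier model on past data, then score a new streaming batch.
detector = IsolationForest(contamination=0.001, random_state=0).fit(known)

batch = np.vstack([rng.normal(size=(100, 3)),      # ordinary events
                   np.array([[8.0, 9.0, 7.5]])])   # one anomalous event
labels = detector.predict(batch)                    # -1 marks candidate novelties
print("candidate events:", np.where(labels == -1)[0])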
NASA Astrophysics Data System (ADS)
Reed, P. M.; Chaney, N.; Herman, J. D.; Wood, E. F.; Ferringer, M. P.
2015-12-01
This research represents a multi-institutional collaboration between Cornell University, The Aerospace Corporation, and Princeton University that has completed a petascale diagnostic assessment of the current 10 satellite missions providing rainfall observations. Our diagnostic assessment has required four core tasks: (1) formally linking high-resolution astrodynamics design and coordination of space assets with their global hydrological impacts within a petascale "many-objective" global optimization framework, (2) developing a baseline diagnostic evaluation of a 1-degree resolution global implementation of the Variable Infiltration Capacity (VIC) model to establish the required satellite observation frequencies and coverage to maintain acceptable global flood forecasts, (3) evaluating the limitations and vulnerabilities of the full suite of current satellite precipitation missions including the recently approved Global Precipitation Measurement (GPM) mission, and (4) conceptualizing the next generation of space-based platforms for water cycle observation. Our team exploited over 100 million hours of computing access on the 700,000+ core Blue Waters machine to radically advance our ability to discover and visualize key system tradeoffs and sensitivities. This project represents, to our knowledge, the first attempt to develop a 10,000-member Monte Carlo global hydrologic simulation at one degree resolution that characterizes the uncertain effects of changing the available frequencies of satellite precipitation on drought and flood forecasts. The simulation-optimization components of the work have set a theoretical baseline for the best possible frequencies and coverages for global precipitation given unlimited investment, broad international coordination in reconfiguring existing assets, and new satellite constellation design objectives informed directly by key global hydrologic forecasting requirements. Our research is a step towards realizing the integrated global water cycle observatory long sought by the World Climate Research Programme, which has to date eluded the world's space agencies.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bland, Arthur S Buddy; Hack, James J; Baker, Ann E
Oak Ridge National Laboratory's (ORNL's) Cray XT5 supercomputer, Jaguar, kicked off the era of petascale scientific computing in 2008 with applications that sustained more than a thousand trillion floating point calculations per second - or 1 petaflop. Jaguar continues to grow even more powerful as it helps researchers broaden the boundaries of knowledge in virtually every domain of computational science, including weather and climate, nuclear energy, geosciences, combustion, bioenergy, fusion, and materials science. Their insights promise to broaden our knowledge in areas that are vitally important to the Department of Energy (DOE) and the nation as a whole, particularly energy assurance and climate change. The science of the 21st century, however, will demand further revolutions in computing, supercomputers capable of a million trillion calculations a second - 1 exaflop - and beyond. These systems will allow investigators to continue attacking global challenges through modeling and simulation and to unravel longstanding scientific questions. Creating such systems will also require new approaches to daunting challenges. High-performance systems of the future will need to be codesigned for scientific and engineering applications with best-in-class communications networks and data-management infrastructures and teams of skilled researchers able to take full advantage of these new resources. The Oak Ridge Leadership Computing Facility (OLCF) provides the nation's most powerful open resource for capability computing, with a sustainable path that will maintain and extend national leadership for DOE's Office of Science (SC). The OLCF has engaged a world-class team to support petascale science and to take a dramatic step forward, fielding new capabilities for high-end science. This report highlights the successful delivery and operation of a petascale system and shows how the OLCF fosters application development teams, developing cutting-edge tools and resources for next-generation systems.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Srinath Vadlamani; Scott Kruger; Travis Austin
Extended magnetohydrodynamic (MHD) codes are used to model the large, slow-growing instabilities that are projected to limit the performance of the International Thermonuclear Experimental Reactor (ITER). The multiscale nature of the extended MHD equations requires an implicit approach. The current linear solvers needed for the implicit algorithm scale poorly because the resultant matrices are so ill-conditioned. A new solver is needed, especially one that scales to the petascale. The most successful scalable parallel processor solvers to date are multigrid solvers. Applying multigrid techniques to a set of equations whose fundamental modes are dispersive waves is a promising solution to CEMM problems. For Phase 1, we implemented multigrid preconditioners from the HYPRE project of the Center for Applied Scientific Computing at LLNL, via PETSc of the DOE SciDAC TOPS project, for the real matrix systems of the extended MHD code NIMROD, which is one of the primary modeling codes of the OFES-funded Center for Extended Magnetohydrodynamic Modeling (CEMM) SciDAC. We implemented the multigrid solvers on the fusion test problem that allows for real matrix systems with success, and in the process learned about the details of NIMROD data structures and the difficulties of inverting NIMROD operators. The further success of this project will allow for efficient usage of future petascale computers at the National Leadership Facilities: Oak Ridge National Laboratory, Argonne National Laboratory, and the National Energy Research Scientific Computing Center. The project will be a collaborative effort between computational plasma physicists and applied mathematicians at Tech-X Corporation, applied mathematicians at Front Range Scientific Computations, Inc. (who are collaborators on the HYPRE project), and other computational plasma physicists involved with the CEMM project.
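The following sketch illustrates the general idea of a multigrid-preconditioned Krylov solve on a model Poisson problem; pyamg and SciPy are used here as stand-ins for the HYPRE-via-PETSc stack, and the model matrix is a placeholder for the far more ill-conditioned NIMROD operators named above.

import numpy as np
import pyamg
from scipy.sparse.linalg import gmres

# Model problem: a 2D Poisson matrix stands in for the extended-MHD operators.
A = pyamg.gallery.poisson((200, 200), format="csr")
b = np.ones(A.shape[0])

# Algebraic multigrid hierarchy used as a preconditioner for a Krylov solve.
ml = pyamg.smoothed_aggregation_solver(A)
M = ml.aspreconditioner()

x, info = gmres(A, b, M=M)
print("converged" if info == 0 else f"gmres info={info}")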
Petascale supercomputing to accelerate the design of high-temperature alloys
Shin, Dongwon; Lee, Sangkeun; Shyam, Amit; ...
2017-10-25
Recent progress in high-performance computing and data informatics has opened up numerous opportunities to aid the design of advanced materials. Herein, we demonstrate a computational workflow that includes rapid population of high-fidelity materials datasets via petascale computing and subsequent analyses with modern data science techniques. We use a first-principles approach based on density functional theory to derive the segregation energies of 34 microalloying elements at the coherent and semi-coherent interfaces between the aluminium matrix and the θ'-Al2Cu precipitate, which requires several hundred supercell calculations. We also perform extensive correlation analyses to identify materials descriptors that affect the segregation behaviour of solutes at the interfaces. Finally, we show an example of leveraging machine learning techniques to predict segregation energies without performing computationally expensive physics-based simulations. As a result, the approach demonstrated in the present work can be applied to any high-temperature alloy system for which key materials data can be obtained using high-performance computing.
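A hedged sketch of the final step, using synthetic descriptors and scikit-learn in place of the paper's actual feature set and model, to show how a trained surrogate could screen solutes without further DFT supercell calculations.

import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)

# Synthetic stand-ins for elemental descriptors (e.g., atomic radius,
# electronegativity, cohesive energy) and DFT segregation energies.
n_elements = 34
X = rng.normal(size=(n_elements, 3))
y = 0.8 * X[:, 0] - 0.5 * X[:, 1] + 0.1 * rng.normal(size=n_elements)

model = RandomForestRegressor(n_estimators=200, random_state=0)
scores = cross_val_score(model, X, y, cv=5, scoring="neg_mean_absolute_error")
print("CV MAE (synthetic data):", -scores.mean())

# Once trained on the DFT dataset, the surrogate can screen new solutes
# without additional first-principles supercell calculations.
model.fit(X, y)
print("predicted segregation energy:", model.predict(rng.normal(size=(1, 3)))[0])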
Petascale supercomputing to accelerate the design of high-temperature alloys
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shin, Dongwon; Lee, Sangkeun; Shyam, Amit
Recent progress in high-performance computing and data informatics has opened up numerous opportunities to aid the design of advanced materials. Herein, we demonstrate a computational workflow that includes rapid population of high-fidelity materials datasets via petascale computing and subsequent analyses with modern data science techniques. We use a first-principles approach based on density functional theory to derive the segregation energies of 34 microalloying elements at the coherent and semi-coherent interfaces between the aluminium matrix and the θ'-Al2Cu precipitate, which requires several hundred supercell calculations. We also perform extensive correlation analyses to identify materials descriptors that affect the segregation behaviour of solutes at the interfaces. Finally, we show an example of leveraging machine learning techniques to predict segregation energies without performing computationally expensive physics-based simulations. As a result, the approach demonstrated in the present work can be applied to any high-temperature alloy system for which key materials data can be obtained using high-performance computing.
Multi-petascale highly efficient parallel supercomputer
Asaad, Sameh; Bellofatto, Ralph E.; Blocksome, Michael A.; Blumrich, Matthias A.; Boyle, Peter; Brunheroto, Jose R.; Chen, Dong; Cher, Chen -Yong; Chiu, George L.; Christ, Norman; Coteus, Paul W.; Davis, Kristan D.; Dozsa, Gabor J.; Eichenberger, Alexandre E.; Eisley, Noel A.; Ellavsky, Matthew R.; Evans, Kahn C.; Fleischer, Bruce M.; Fox, Thomas W.; Gara, Alan; Giampapa, Mark E.; Gooding, Thomas M.; Gschwind, Michael K.; Gunnels, John A.; Hall, Shawn A.; Haring, Rudolf A.; Heidelberger, Philip; Inglett, Todd A.; Knudson, Brant L.; Kopcsay, Gerard V.; Kumar, Sameer; Mamidala, Amith R.; Marcella, James A.; Megerian, Mark G.; Miller, Douglas R.; Miller, Samuel J.; Muff, Adam J.; Mundy, Michael B.; O'Brien, John K.; O'Brien, Kathryn M.; Ohmacht, Martin; Parker, Jeffrey J.; Poole, Ruth J.; Ratterman, Joseph D.; Salapura, Valentina; Satterfield, David L.; Senger, Robert M.; Smith, Brian; Steinmacher-Burow, Burkhard; Stockdell, William M.; Stunkel, Craig B.; Sugavanam, Krishnan; Sugawara, Yutaka; Takken, Todd E.; Trager, Barry M.; Van Oosten, James L.; Wait, Charles D.; Walkup, Robert E.; Watson, Alfred T.; Wisniewski, Robert W.; Wu, Peng
2015-07-14
A Multi-Petascale Highly Efficient Parallel Supercomputer of 100 petaOPS-scale computing, at decreased cost, power and footprint, that allows for a maximum packaging density of processing nodes from an interconnect point of view. The Supercomputer exploits technological advances in VLSI that enable a computing model where many processors can be integrated into a single Application Specific Integrated Circuit (ASIC). Each ASIC computing node comprises a system-on-chip ASIC utilizing four or more processors integrated into one die, with each having full access to all system resources, enabling adaptive partitioning of the processors to functions such as compute or messaging I/O on an application-by-application basis and, preferably, adaptive partitioning of functions in accordance with various algorithmic phases within an application; if I/O or other processors are underutilized, they can participate in computation or communication. Nodes are interconnected by a five-dimensional torus network with DMA that optimally maximizes the throughput of packet communications between nodes and minimizes latency.
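The wraparound connectivity of a torus network can be sketched as follows; the dimensions chosen are illustrative, not the machine's actual shape, but they show why each node in a five-dimensional torus has ten nearest-neighbour links.

def torus_neighbors(coord, dims):
    """Nearest neighbours of a node in a d-dimensional torus (wraparound links)."""
    neighbors = []
    for axis in range(len(dims)):
        for step in (-1, +1):
            n = list(coord)
            n[axis] = (n[axis] + step) % dims[axis]   # wrap around the torus
            neighbors.append(tuple(n))
    return neighbors

# Illustrative 5D torus shape and node coordinate.
dims = (4, 4, 4, 8, 2)
node = (0, 3, 2, 7, 1)
links = torus_neighbors(node, dims)
print(len(links), "links per node:", links)   # 2 links per dimension = 10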
Petascale supercomputing to accelerate the design of high-temperature alloys
NASA Astrophysics Data System (ADS)
Shin, Dongwon; Lee, Sangkeun; Shyam, Amit; Haynes, J. Allen
2017-12-01
Recent progress in high-performance computing and data informatics has opened up numerous opportunities to aid the design of advanced materials. Herein, we demonstrate a computational workflow that includes rapid population of high-fidelity materials datasets via petascale computing and subsequent analyses with modern data science techniques. We use a first-principles approach based on density functional theory to derive the segregation energies of 34 microalloying elements at the coherent and semi-coherent interfaces between the aluminium matrix and the θ‧-Al2Cu precipitate, which requires several hundred supercell calculations. We also perform extensive correlation analyses to identify materials descriptors that affect the segregation behaviour of solutes at the interfaces. Finally, we show an example of leveraging machine learning techniques to predict segregation energies without performing computationally expensive physics-based simulations. The approach demonstrated in the present work can be applied to any high-temperature alloy system for which key materials data can be obtained using high-performance computing.
Working Towards New Transformative Geoscience Analytics Enabled by Petascale Computing
NASA Astrophysics Data System (ADS)
Woodcock, R.; Wyborn, L.
2012-04-01
Currently the top 10 supercomputers in the world are petascale, and already exascale computers are being planned. Cloud computing facilities are becoming mainstream either as private or commercial investments. These computational developments will provide abundant opportunities for the earth science community to tackle the data deluge which has resulted from new instrumentation enabling data to be gathered at a greater rate and at higher resolution. Combined, the new computational environments should enable the earth sciences to be transformed. However, experience in Australia and elsewhere has shown that it is not easy to scale existing earth science methods, software and analytics to take advantage of the increased computational capacity that is now available. It is not simply a matter of 'transferring' current work practices to the new facilities: they have to be extensively 'transformed'. In particular, new geoscientific methods will need to be developed using advanced data mining, assimilation, machine learning and integration algorithms. Software will have to be capable of operating in highly parallelised environments, and will also need to be able to scale as the compute systems grow. Data access will have to improve, and the earth science community needs to move from the file discovery, display and locally download paradigm to self-describing data cubes and data arrays that are available as online resources from either major data repositories or in the cloud. In the new transformed world, rather than analysing satellite data scene by scene, sensor-agnostic data cubes of calibrated earth observation data will enable researchers to move across data from multiple sensors at varying spatial data resolutions. In using geophysics to characterise basement and cover, rather than analysing individual gridded airborne geophysical data sets and then combining the results, petascale computing will enable analysis of multiple data types, collected at varying resolutions, with integration and validation across data type boundaries. Increased capacity of storage and compute will mean that uncertainty and reliability of individual observations will consistently be taken into account and propagated throughout the processing chain. If these data access difficulties can be overcome, the increased compute capacity will also mean that larger scale, more complex models can be run at higher resolution; instead of single-pass modelling runs, ensembles of models will be able to be run to simultaneously test multiple hypotheses. Petascale computing and high performance data offer more than "bigger, faster": they offer an opportunity for a transformative change in the way in which geoscience research is routinely conducted.
Open-Source Development of the Petascale Reactive Flow and Transport Code PFLOTRAN
NASA Astrophysics Data System (ADS)
Hammond, G. E.; Andre, B.; Bisht, G.; Johnson, T.; Karra, S.; Lichtner, P. C.; Mills, R. T.
2013-12-01
Open-source software development has become increasingly popular in recent years. Open-source encourages collaborative and transparent software development and promotes unlimited free redistribution of source code to the public. Open-source development is good for science as it reveals implementation details that are critical to scientific reproducibility, but generally excluded from journal publications. In addition, research funds that would have been spent on licensing fees can be redirected to code development that benefits more scientists. In 2006, the developers of PFLOTRAN open-sourced their code under the U.S. Department of Energy SciDAC-II program. Since that time, the code has gained popularity among code developers and users from around the world seeking to employ PFLOTRAN to simulate thermal, hydraulic, mechanical and biogeochemical processes in the Earth's surface/subsurface environment. PFLOTRAN is a massively-parallel subsurface reactive multiphase flow and transport simulator designed from the ground up to run efficiently on computing platforms ranging from the laptop to leadership-class supercomputers, all from a single code base. The code employs domain decomposition for parallelism and is founded upon the well-established and open-source parallel PETSc and HDF5 frameworks. PFLOTRAN leverages modern Fortran (i.e. Fortran 2003-2008) in its extensible object-oriented design. The use of this progressive, yet domain-friendly programming language has greatly facilitated collaboration in the code's software development. Over the past year, PFLOTRAN's top-level data structures were refactored as Fortran classes (i.e. extendible derived types) to improve the flexibility of the code, ease the addition of new process models, and enable coupling to external simulators. For instance, PFLOTRAN has been coupled to the parallel electrical resistivity tomography code E4D to enable hydrogeophysical inversion while the same code base can be used as a third-party library to provide hydrologic flow, energy transport, and biogeochemical capability to the community land model, CLM, part of the open-source community earth system model (CESM) for climate. In this presentation, the advantages and disadvantages of open source software development in support of geoscience research at government laboratories, universities, and the private sector are discussed. Since the code is open-source (i.e. it's transparent and readily available to competitors), the PFLOTRAN team's development strategy within a competitive research environment is presented. Finally, the developers discuss their approach to object-oriented programming and the leveraging of modern Fortran in support of collaborative geoscience research as the Fortran standard evolves among compiler vendors.
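The extensible process-model design described above can be illustrated, language-agnostically, with a short Python sketch (PFLOTRAN itself uses Fortran 2003 classes, and the names and updates below are placeholders): the driver depends only on an abstract interface, so new process models or external couplings plug in without changing the time-stepping loop.

from abc import ABC, abstractmethod

class ProcessModel(ABC):
    """Base class: each physical process advances its own state for one step."""
    @abstractmethod
    def advance(self, state: dict, dt: float) -> None: ...

class FlowModel(ProcessModel):
    def advance(self, state, dt):
        state["pressure"] += dt * 0.1                   # placeholder flow update

class ReactiveTransportModel(ProcessModel):
    def advance(self, state, dt):
        state["concentration"] *= (1.0 - 0.05 * dt)     # placeholder decay

def run(models, state, dt, nsteps):
    # The driver only knows the ProcessModel interface, so a new coupling
    # (e.g. an external geophysics or land-surface model) can be added
    # without touching this loop.
    for _ in range(nsteps):
        for m in models:
            m.advance(state, dt)
    return state

print(run([FlowModel(), ReactiveTransportModel()],
          {"pressure": 1.0e5, "concentration": 1.0}, dt=0.5, nsteps=4))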
In situ visualization for large-scale combustion simulations.
Yu, Hongfeng; Wang, Chaoli; Grout, Ray W; Chen, Jacqueline H; Ma, Kwan-Liu
2010-01-01
As scientific supercomputing moves toward petascale and exascale levels, in situ visualization stands out as a scalable way for scientists to view the data their simulations generate. This full picture is crucial particularly for capturing and understanding highly intermittent transient phenomena, such as ignition and extinction events in turbulent combustion.
NASA Astrophysics Data System (ADS)
To, Albert C.; Liu, Wing Kam; Olson, Gregory B.; Belytschko, Ted; Chen, Wei; Shephard, Mark S.; Chung, Yip-Wah; Ghanem, Roger; Voorhees, Peter W.; Seidman, David N.; Wolverton, Chris; Chen, J. S.; Moran, Brian; Freeman, Arthur J.; Tian, Rong; Luo, Xiaojuan; Lautenschlager, Eric; Challoner, A. Dorian
2008-09-01
Microsystems have become an integral part of our lives and can be found in homeland security, medical science, aerospace applications and beyond. Many critical microsystem applications are in harsh environments, in which long-term reliability needs to be guaranteed and repair is not feasible. For example, gyroscope microsystems on satellites need to function for over 20 years under severe radiation, thermal cycling, and shock loading. Hence, predictive-science-based, verified and validated computational models and algorithms to predict the performance and materials integrity of microsystems in these situations are needed. Confidence in these predictions is improved by quantifying uncertainties and approximation errors. With no full-system testing and limited sub-system testing, petascale computing is certainly necessary to span both time and space scales and to reduce the uncertainty in the prediction of long-term reliability. This paper presents the necessary steps to develop a predictive-science-based multiscale modeling and simulation system. The development of this system will be focused on the prediction of the long-term performance of a gyroscope microsystem. The environmental effects to be considered include radiation, thermo-mechanical cycling and shock. Since there will be many material performance issues, attention is restricted to creep resulting from thermal aging and radiation-enhanced mass diffusion, material instability due to radiation and thermo-mechanical cycling, and damage and fracture due to shock. To meet these challenges, we aim to develop an integrated multiscale software analysis system that spans the length scales from the atomistic scale to the scale of the device. The proposed software system will include molecular mechanics, phase field evolution, micromechanics and continuum mechanics software, and state-of-the-art model identification strategies where atomistic properties are calibrated by quantum calculations. We aim to predict the long-term (in excess of 20 years) integrity of the resonator, electrode base, multilayer metallic bonding pads, and vacuum seals in a prescribed mission. Although multiscale simulations are efficient in the sense that they focus the most computationally intensive models and methods on only the portions of the space-time domain needed, the execution of the multiscale simulations associated with evaluating materials and device integrity for aerospace microsystems will require the application of petascale computing. A component-based software strategy will be used in the development of our massively parallel multiscale simulation system. This approach will allow us to take full advantage of existing single-scale modeling components. An extensive, pervasive thrust in the software system development is verification, validation, and uncertainty quantification (UQ). Each component and the integrated software system need to be carefully verified. A UQ methodology that determines the quality of predictive information available from experimental measurements and packages the information in a form suitable for UQ at various scales needs to be developed. Experiments to validate the model at the nanoscale, microscale, and macroscale are proposed. 
The development of a petascale predictive-science-based multiscale modeling and simulation system will advance the field of predictive multiscale science so that it can be used to reliably analyze problems of unprecedented complexity, where limited testing resources can be adequately replaced by petascale computational power, advanced verification, validation, and UQ methodologies.
Pope, Bernard J; Fitch, Blake G; Pitman, Michael C; Rice, John J; Reumann, Matthias
2011-10-01
Future multiscale and multiphysics models that support research into human disease, translational medical science, and treatment can utilize the power of high-performance computing (HPC) systems. We anticipate that computationally efficient multiscale models will require the use of sophisticated hybrid programming models, mixing distributed message-passing processes [e.g., the message-passing interface (MPI)] with multithreading (e.g., OpenMP, Pthreads). The objective of this study is to compare the performance of such hybrid programming models when applied to the simulation of a realistic physiological multiscale model of the heart. Our results show that the hybrid models perform favorably when compared to an implementation using only the MPI and, furthermore, that OpenMP in combination with the MPI provides a satisfactory compromise between performance and code complexity. Having the ability to use threads within MPI processes enables the sophisticated use of all processor cores for both computation and communication phases. Considering that HPC systems in 2012 will have two orders of magnitude more cores than what was used in this study, we believe that faster than real-time multiscale cardiac simulations can be achieved on these systems.
NASA Astrophysics Data System (ADS)
Balcas, J.; Hendricks, T. W.; Kcira, D.; Mughal, A.; Newman, H.; Spiropulu, M.; Vlimant, J. R.
2017-10-01
The SDN Next Generation Integrated Architecture (SDN-NGeNIA) project addresses some of the key challenges facing the present and next generations of science programs in HEP, astrophysics, and other fields, whose potential discoveries depend on their ability to distribute, process and analyze globally distributed petascale to exascale datasets. The SDN-NGeNIA system under development by Caltech and partner HEP and network teams is focused on the coordinated use of network, computing and storage infrastructures, through a set of developments that build on the experience gained in recently completed and previous projects that use dynamic circuits with bandwidth guarantees to support major network flows, as demonstrated across the LHC Open Network Environment [1] and in large-scale demonstrations over the last three years, and recently integrated with the PhEDEx and Asynchronous Stage Out data management applications of the CMS experiment at the Large Hadron Collider. In addition to the general program goals of supporting the network needs of the LHC and other science programs with similar needs, a recent focus is the use of the Leadership HPC facility at Argonne National Lab (ALCF) for data-intensive applications.
Current and Future Development of a Non-hydrostatic Unified Atmospheric Model (NUMA)
2010-09-09
Targeted capabilities include: (1) high scalability on current and future computer architectures (exascale computing and beyond, including GPUs), and (2) flexible numerics. Ten of the Top 500 systems are already in the petascale range, and GPU-based systems (e.g., Mare Nostrum) also merit attention.
Large-scale large eddy simulation of nuclear reactor flows: Issues and perspectives
DOE Office of Scientific and Technical Information (OSTI.GOV)
Merzari, Elia; Obabko, Aleks; Fischer, Paul
Numerical simulation has been an intrinsic part of nuclear engineering research since its inception. In recent years a transition is occurring toward predictive, first-principle-based tools such as computational fluid dynamics. Even with the advent of petascale computing, however, such tools still have significant limitations. In the present work some of these issues, and in particular the presence of massive multiscale separation, are discussed, as well as some of the research conducted to mitigate them. Petascale simulations at high fidelity (large eddy simulation/direct numerical simulation) were conducted with the massively parallel spectral element code Nek5000 on a series of representative problems. These simulations shed light on the requirements of several types of simulation: (1) axial flow around fuel rods, with particular attention to wall effects; (2) natural convection in the primary vessel; and (3) flow in a rod bundle in the presence of spacing devices. Finally, the focus of the work presented here is on the lessons learned and the requirements to perform these simulations at exascale. Additional physical insight gained from these simulations is also emphasized.
Large-scale large eddy simulation of nuclear reactor flows: Issues and perspectives
Merzari, Elia; Obabko, Aleks; Fischer, Paul; ...
2016-11-03
Numerical simulation has been an intrinsic part of nuclear engineering research since its inception. In recent years a transition is occurring toward predictive, first-principle-based tools such as computational fluid dynamics. Even with the advent of petascale computing, however, such tools still have significant limitations. In the present work some of these issues, and in particular the presence of massive multiscale separation, are discussed, as well as some of the research conducted to mitigate them. Petascale simulations at high fidelity (large eddy simulation/direct numerical simulation) were conducted with the massively parallel spectral element code Nek5000 on a series of representative problems. These simulations shed light on the requirements of several types of simulation: (1) axial flow around fuel rods, with particular attention to wall effects; (2) natural convection in the primary vessel; and (3) flow in a rod bundle in the presence of spacing devices. Finally, the focus of the work presented here is on the lessons learned and the requirements to perform these simulations at exascale. Additional physical insight gained from these simulations is also emphasized.
Analysis Report for Exascale Storage Requirements for Scientific Data.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ruwart, Thomas M.
Over the next 10 years, the Department of Energy will be transitioning from Petascale to Exascale Computing, resulting in data storage, networking, and infrastructure requirements that increase by three orders of magnitude. The technologies and best practices used today are the result of a relatively slow evolution of ancestral technologies developed in the 1950s and 1960s. These include magnetic tape, magnetic disk, networking, databases, file systems, and operating systems. These technologies will continue to evolve over the next 10 to 15 years on a reasonably predictable path. Experience with the challenges involved in transitioning these fundamental technologies from Terascale to Petascale computing systems has raised questions about how these will scale another 3 or 4 orders of magnitude to meet the requirements imposed by Exascale computing systems. This report is focused on the most concerning scaling issues with data storage systems as they relate to High Performance Computing, and presents options for a path forward. Given the ability to store exponentially increasing amounts of data, far more advanced concepts and use of metadata will be critical to managing data in Exascale computing systems.
Freud: a software suite for high-throughput simulation analysis
NASA Astrophysics Data System (ADS)
Harper, Eric; Spellings, Matthew; Anderson, Joshua; Glotzer, Sharon
Computer simulation is an indispensable tool for the study of a wide variety of systems. As simulations scale to fill petascale and exascale supercomputing clusters, so too does the size of the data produced, as well as the difficulty in analyzing these data. We present Freud, an analysis software suite for efficient analysis of simulation data. Freud makes no assumptions about the system being analyzed, allowing for general analysis methods to be applied to nearly any type of simulation. Freud includes standard analysis methods such as the radial distribution function, as well as new methods including the potential of mean force and torque and local crystal environment analysis. Freud combines a Python interface with fast, parallel C++ analysis routines to run efficiently on laptops, workstations, and supercomputing clusters. Data analysis on clusters reduces data transfer requirements, a prohibitive cost for petascale computing. Used in conjunction with simulation software, Freud allows for smart simulations that adapt to the current state of the system, enabling the study of phenomena such as nucleation and growth, intelligent investigation of phases and phase transitions, and determination of effective pair potentials.
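As an illustration of the kind of analysis such a suite provides (this is a generic NumPy sketch, not Freud's API), the radial distribution function for a cubic periodic box can be computed by brute-force pair counting with the minimum-image convention.

import numpy as np

def radial_distribution(points, box_length, r_max, nbins=50):
    """g(r) for a cubic periodic box, by brute-force pair counting."""
    n = len(points)
    dr = r_max / nbins
    counts = np.zeros(nbins)
    for i in range(n - 1):
        d = points[i + 1:] - points[i]
        d -= box_length * np.round(d / box_length)      # minimum-image convention
        r = np.linalg.norm(d, axis=1)
        hist, _ = np.histogram(r[r < r_max], bins=nbins, range=(0.0, r_max))
        counts += hist
    shell_vol = 4.0 / 3.0 * np.pi * ((np.arange(1, nbins + 1) * dr) ** 3
                                     - (np.arange(nbins) * dr) ** 3)
    density = n / box_length ** 3
    # Each pair was counted once; normalise by the ideal-gas expectation.
    return 2.0 * counts / (n * density * shell_vol)

rng = np.random.default_rng(0)
pts = rng.uniform(0.0, 10.0, size=(500, 3))
g = radial_distribution(pts, box_length=10.0, r_max=4.0)
print(g[:5])   # close to 1 for an ideal (random) configuration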
DOE Office of Scientific and Technical Information (OSTI.GOV)
de Supinski, B R; Miller, B P; Liblit, B
2011-09-13
Petascale platforms with O(10^5) and O(10^6) processing cores are driving advancements in a wide range of scientific disciplines. These large systems create unprecedented application development challenges. Scalable correctness tools are critical to shorten the time-to-solution on these systems. Currently, many DOE application developers use primitive manual debugging based on printf or traditional debuggers such as TotalView or DDT. This paradigm breaks down beyond a few thousand cores, yet bugs often arise above that scale. Programmers must reproduce problems in smaller runs to analyze them with traditional tools, or else perform repeated runs at scale using only primitive techniques. Even when traditional tools run at scale, the approach wastes substantial effort and computation cycles. Continued scientific progress demands new paradigms for debugging large-scale applications. The Correctness on Petascale Systems (CoPS) project is developing a revolutionary debugging scheme that will reduce the debugging problem to a scale that human developers can comprehend. The scheme can provide precise diagnoses of the root causes of failure, including suggestions of the location and the type of errors down to the level of code regions or even a single execution point. Our fundamentally new strategy combines and expands three relatively new complementary debugging approaches. The Stack Trace Analysis Tool (STAT), a 2011 R&D 100 Award Winner, identifies behavior equivalence classes in MPI jobs and highlights behavior when elements of the class demonstrate divergent behavior, often the first indicator of an error. The Cooperative Bug Isolation (CBI) project has developed statistical techniques for isolating programming errors in widely deployed code that we will adapt to large-scale parallel applications. Finally, we are developing a new approach to parallelizing expensive correctness analyses, such as analysis of memory usage in the Memgrind tool. In the first two years of the project, we have successfully extended STAT to determine the relative progress of different MPI processes. We have shown that STAT, which is now included in the debugging tools distributed by Cray with their large-scale systems, substantially reduces the scale at which traditional debugging techniques are applied. We have extended CBI to large-scale systems and developed new compiler-based analyses that reduce its instrumentation overhead. Our results demonstrate that CBI can identify the source of errors in large-scale applications. Finally, we have developed MPIecho, a new technique that will reduce the time required to perform key correctness analyses, such as the detection of writes to unallocated memory. Overall, our research results are the foundations for new debugging paradigms that will improve application scientist productivity by reducing the time to determine which package or module contains the root cause of a problem that arises at all scales of our high end systems. While we have made substantial progress in the first two years of CoPS research, significant work remains. While STAT provides scalable debugging assistance for incorrect application runs, we could apply its techniques to assertions in order to observe deviations from expected behavior. Further, we must continue to refine STAT's techniques to represent behavioral equivalence classes efficiently as we expect systems with millions of threads in the next year. 
We are exploring new CBI techniques that can assess the likelihood that execution deviations from past behavior are the source of erroneous execution. Finally, we must develop usable correctness analyses that apply the MPIecho parallelization strategy in order to locate coding errors. We expect to make substantial progress on these directions in the next year but anticipate that significant work will remain to provide usable, scalable debugging paradigms.
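To make the statistical bug isolation idea concrete, the following minimal Python sketch ranks instrumented predicates by how strongly their being true correlates with failing runs, in the spirit of CBI's Increase score. The predicates and counts are hypothetical; the actual CoPS/CBI instrumentation and importance metric are more involved.

# Rank predicates by Increase(P) = Failure(P) - Context(P), the core idea of
# CBI-style statistical debugging. All counts below are made up.
def increase(f_true, s_true, f_obs, s_obs):
    failure = f_true / (f_true + s_true) if (f_true + s_true) else 0.0
    context = f_obs / (f_obs + s_obs) if (f_obs + s_obs) else 0.0
    return failure - context

# predicate -> (failing runs with P true, passing runs with P true,
#               failing runs where P was observed, passing runs where P was observed)
counts = {
    "recv_count > buffer_len": (40, 2, 50, 60),
    "msg_size == 0":           (10, 9, 50, 60),
    "iteration > 1000":        (25, 30, 50, 60),
}
for pred in sorted(counts, key=lambda p: increase(*counts[p]), reverse=True):
    print(f"{pred:25s} Increase = {increase(*counts[pred]):+.3f}")

Predicates with a large positive score implicate the code region where they were instrumented, which is how a statistical approach can narrow a failure seen only at scale down to a few candidate locations.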
Petascale Computing: Impact on Future NASA Missions
NASA Technical Reports Server (NTRS)
Brooks, Walter
2006-01-01
This slide presentation reviews NASA's use of a new supercomputer, called Columbia, capable of operating at 62 teraflops. This computer is the fourth fastest in the world and will serve all mission directorates. The applications it will serve include aerospace analysis and design, propulsion subsystem analysis, climate modeling, hurricane prediction, and astrophysics and cosmology.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wang, Ucilia
This report has the following articles: (1) Deconstructing Microbes--metagenomic research on bugs in termites relies on new data analysis tools; (2) Popular Science--a nanomaterial research paper in Nano Letters drew strong interest from the scientific community; (3) Direct Approach--researchers employ an algorithm to solve an energy-reduction issue essential in describing complex physical system; and (4) SciDAC Special--A science journal features research on petascale enabling technologies.
Highlights of X-Stack ExM Deliverable Swift/T
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wozniak, Justin M.
Swift/T is a key success from the ExM: System support for extreme-scale, many-task applications X-Stack project, which proposed to use concurrent dataflow as an innovative programming model to exploit extreme parallelism in exascale computers. The Swift/T component of the project reimplemented the Swift language from scratch to allow applications that compose scientific modules together to be built and run on available petascale computers (Blue Gene, Cray). Swift/T does this via a new compiler and runtime that generates and executes the application as an MPI program. We assume that mission-critical emerging exascale applications will be composed as scalable applications using existing software components, connected by data dependencies. Developers wrap native code fragments using a higher-level language, then build composite applications to form a computational experiment. This exemplifies hierarchical concurrency: lower-level messaging libraries are used for fine-grained parallelism; high-level control is used for inter-task coordination. These patterns are best expressed with dataflow, but static DAGs (i.e., other workflow languages) limit the applications that can be built; they do not provide the expressiveness of Swift, such as conditional execution, iteration, and recursive functions.
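As a rough illustration of the hierarchical-concurrency pattern described above (not Swift/T syntax), the following Python sketch composes a hypothetical wrapped native task into an ensemble whose results feed a downstream analysis step once all inputs are ready.

# Generic dataflow-style composition: leaf tasks run in parallel, and the
# composite step fires only when its input futures have resolved.
from concurrent.futures import ProcessPoolExecutor

def simulate(param):
    # stand-in for a wrapped native code fragment
    return param * param

def analyze(values):
    # composite step consuming the upstream results
    return sum(values) / len(values)

if __name__ == "__main__":
    with ProcessPoolExecutor() as pool:
        sims = [pool.submit(simulate, p) for p in range(8)]   # fine-grained parallelism
        result = analyze([f.result() for f in sims])          # high-level coordination
    print("ensemble mean:", result)

In Swift/T the analogous coordination is compiled into an MPI program, and data dependencies, conditionals, iteration, and recursion are expressed directly in the language; the sketch only conveys the compose-by-data-dependency idea.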
NASA Astrophysics Data System (ADS)
Stevens, Rick
2008-07-01
The fourth annual Scientific Discovery through Advanced Computing (SciDAC) Conference was held June 13-18, 2008, in Seattle, Washington. The SciDAC conference series is the premier communitywide venue for presentation of results from the DOE Office of Science's interdisciplinary computational science program. Started in 2001 and renewed in 2006, the DOE SciDAC program is the country's - and arguably the world's - most significant interdisciplinary research program supporting the development of advanced scientific computing methods and their application to fundamental and applied areas of science. SciDAC supports computational science across many disciplines, including astrophysics, biology, chemistry, fusion sciences, and nuclear physics. Moreover, the program actively encourages the creation of long-term partnerships among scientists focused on challenging problems and computer scientists and applied mathematicians developing the technology and tools needed to address those problems. The SciDAC program has played an increasingly important role in scientific research by allowing scientists to create more accurate models of complex processes, simulate problems once thought to be impossible, and analyze the growing amount of data generated by experiments. To help further the research community's ability to tap into the capabilities of current and future supercomputers, Under Secretary for Science, Raymond Orbach, launched the Innovative and Novel Computational Impact on Theory and Experiment (INCITE) program in 2003. The INCITE program was conceived specifically to seek out computationally intensive, large-scale research projects with the potential to significantly advance key areas in science and engineering. The program encourages proposals from universities, other research institutions, and industry. During the first two years of the INCITE program, 10 percent of the resources at NERSC were allocated to INCITE awardees. However, demand for supercomputing resources far exceeded available systems; and in 2003, the Office of Science identified increasing computing capability by a factor of 100 as the second priority on its Facilities of the Future list. The goal was to establish leadership-class computing resources to support open science. As a result of a peer reviewed competition, the first leadership computing facility was established at Oak Ridge National Laboratory in 2004. A second leadership computing facility was established at Argonne National Laboratory in 2006. This expansion of computational resources led to a corresponding expansion of the INCITE program. In 2008, Argonne, Lawrence Berkeley, Oak Ridge, and Pacific Northwest national laboratories all provided resources for INCITE. By awarding large blocks of computer time on the DOE leadership computing facilities, the INCITE program enables the largest-scale computations to be pursued. In 2009, INCITE will award over half a billion node-hours of time. The SciDAC conference celebrates progress in advancing science through large-scale modeling and simulation. Over 350 participants attended this year's talks, poster sessions, and tutorials, spanning the disciplines supported by DOE. While the principal focus was on SciDAC accomplishments, this year's conference also included invited presentations and posters from DOE INCITE awardees. 
Another new feature in the SciDAC conference series was an electronic theater and video poster session, which provided an opportunity for the community to see over 50 scientific visualizations in a venue equipped with many high-resolution large-format displays. To highlight the growing international interest in petascale computing, this year's SciDAC conference included a keynote presentation by Herman Lederer from the Max Planck Institut, one of the leaders of the DEISA (Distributed European Infrastructure for Supercomputing Applications) project and a member of the PRACE consortium, Europe's main petascale project. We also heard excellent talks from several European groups, including Laurent Gicquel of CERFACS, who spoke on `Large-Eddy Simulations of Turbulent Reacting Flows of Real Burners: Status and Challenges', and Jean-Francois Hamelin from EDF, who presented a talk on `Getting Ready for Petaflop Capacities and Beyond: A Utility Perspective'. Two other compelling addresses gave attendees a glimpse into the future. Tomas Diaz de la Rubia of Lawrence Livermore National Laboratory spoke on a vision for a fusion/fission hybrid reactor known as the `LIFE Engine' and discussed some of the materials and modeling challenges that need to be overcome to realize the vision for a 1000-year greenhouse-gas-free power source. Dan Reed from Microsoft gave a capstone talk on the convergence of technology, architecture, and infrastructure for cloud computing, data-intensive computing, and exascale computing (10^18 flops/sec). High-performance computing is making rapid strides. The SciDAC community's computational resources are expanding dramatically. In the summer of 2008 the first general-purpose petascale system (the IBM Cell-based Roadrunner at Los Alamos National Laboratory) was recognized in the Top 500 list of fastest machines, heralding the dawn of the petascale era. The DOE's leadership computing facility at Argonne reached number three on the Top 500 and is at the moment the most capable open-science machine, an IBM BG/P system with a peak performance of over 550 teraflops/sec. Later this year Oak Ridge is expected to deploy a 1 petaflops/sec Cray XT system. And even before the scientific community has had an opportunity to make significant use of petascale systems, the computer science research community is forging ahead with ideas and strategies for the development of systems that may, by the end of the next decade, sustain exascale performance. Several talks addressed barriers to, and strategies for, achieving exascale capabilities. The last day of the conference was devoted to tutorials hosted by Microsoft Research at a new conference facility in Redmond, Washington. Over 90 people attended the tutorials, which covered topics ranging from an introduction to BG/P programming to advanced numerical libraries. The SciDAC and INCITE programs and the DOE Office of Advanced Scientific Computing Research core program investments in applied mathematics, computer science, and computational and networking facilities provide a nearly optimum framework for advancing computational science for DOE's Office of Science. At a broader level this framework is also benefiting the entire American scientific enterprise. As we look forward, it is clear that computational approaches will play an increasingly significant role in addressing challenging problems in basic science, energy, and environmental research.
It takes many people to organize and support the SciDAC conference, and I would like to thank as many of them as possible. The backbone of the conference is the technical program; and the task of selecting, vetting, and recruiting speakers is the job of the organizing committee. I thank the members of this committee for all the hard work and the many tens of conference calls that enabled a wonderful program to be assembled. This year the following people served on the organizing committee: Jim Ahrens, LANL; David Bader, LLNL; Bryan Barnett, Microsoft; Peter Beckman, ANL; Vincent Chan, GA; Jackie Chen, SNL; Lori Diachin, LLNL; Dan Fay, Microsoft; Ian Foster, ANL; Mark Gordon, Ames; Mohammad Khaleel, PNNL; David Keyes, Columbia University; Bob Lucas, University of Southern California; Tony Mezzacappa, ORNL; Jeff Nichols, ORNL; David Nowak, ANL; Michael Papka, ANL; Thomas Schulthess, ORNL; Horst Simon, LBNL; David Skinner, LBNL; Panagiotis Spentzouris, Fermilab; Bob Sugar, UCSB; and Kathy Yelick, LBNL. I owe a special thanks to Mike Papka and Jim Ahrens for handling the electronic theater. I also thank all those who submitted videos. It was a highly successful experiment. Behind the scenes an enormous amount of work is required to make a large conference go smoothly. First I thank Cheryl Zidel for her tireless efforts as organizing committee liaison and posters chair and, in general, handling all of my end of the program and keeping me calm. I also thank Gail Pieper for her work in editing the proceedings, Beth Cerny Patino for her work on the Organizing Committee website and electronic theater, and Ken Raffenetti for his work in keeping that website working. Jon Bashor and John Hules did an excellent job in handling conference communications. I thank Caitlin Youngquist for the striking graphic design; Dan Fay for tutorials arrangements; and Lynn Dory, Suzanne Stevenson, Sarah Pebelske and Sarah Zidel for on-site registration and conference support. We all owe Yeen Mankin an extra-special thanks for choosing the hotel, handling contracts, arranging menus, securing venues, and reassuring the chair that everything was under control. We are pleased to have obtained corporate sponsorship from Cray, IBM, Intel, HP, and SiCortex. I thank all the speakers and panel presenters. I also thank the former conference chairs Tony Mezzacappa, Bill Tang, and David Keyes, who were never far away for advice and encouragement. Finally, I offer my thanks to Michael Strayer, without whose leadership, vision, and persistence the SciDAC program would not have come into being and flourished. I am honored to be part of his program and his friend. Rick Stevens Seattle, Washington July 18, 2008
Using Formal Grammars to Predict I/O Behaviors in HPC: The Omnisc'IO Approach
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dorier, Matthieu; Ibrahim, Shadi; Antoniu, Gabriel
2016-08-01
The increasing gap between the computation performance of post-petascale machines and the performance of their I/O subsystem has motivated many I/O optimizations including prefetching, caching, and scheduling. In order to further improve these techniques, modeling and predicting spatial and temporal I/O patterns of HPC applications as they run has become crucial. In this paper we present Omnisc'IO, an approach that builds a grammar-based model of the I/O behavior of HPC applications and uses it to predict when future I/O operations will occur, and where and how much data will be accessed. To infer grammars, Omnisc'IO is based on StarSequitur, a novel algorithm extending Nevill-Manning's Sequitur algorithm. Omnisc'IO is transparently integrated into the POSIX and MPI I/O stacks and does not require any modification in applications or higher-level I/O libraries. It works without any prior knowledge of the application and converges to accurate predictions of any N future I/O operations within a couple of iterations. Its implementation is efficient in both computation time and memory footprint.
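The following deliberately simplified Python sketch conveys the learn-then-predict idea with a plain digram (order-1 Markov) counter over a symbolic I/O trace; Omnisc'IO itself builds a grammar with StarSequitur and also predicts the location and size of accesses, which this toy does not attempt.

# Toy on-line predictor of the next I/O operation from the observed stream.
from collections import defaultdict, Counter

class DigramPredictor:
    def __init__(self):
        self.counts = defaultdict(Counter)   # previous symbol -> next-symbol counts
        self.prev = None

    def observe(self, symbol):
        if self.prev is not None:
            self.counts[self.prev][symbol] += 1
        self.prev = symbol

    def predict(self):
        nxt = self.counts.get(self.prev)
        return nxt.most_common(1)[0][0] if nxt else None

trace = ["open", "write", "write", "close", "open", "write", "write", "close"]
p = DigramPredictor()
for op in trace:
    p.observe(op)
print("predicted next I/O operation:", p.predict())   # "open": the next phase begins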
RELIABILITY, AVAILABILITY, AND SERVICEABILITY FOR PETASCALE HIGH-END COMPUTING AND BEYOND
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chokchai "Box" Leangsuksun
2011-05-31
Our project is a multi-institutional research effort that adopts the interplay of reliability, availability, and serviceability (RAS) aspects for solving resilience issues in high-end scientific computing on the next generation of supercomputers. Results lie in the following tracks: failure prediction in large-scale HPC; investigation of reliability issues and mitigation techniques, including in GPGPU-based HPC systems; and HPC resilience runtime and tools.
2009 fault tolerance for extreme-scale computing workshop, Albuquerque, NM - March 19-20, 2009.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Katz, D. S.; Daly, J.; DeBardeleben, N.
2009-02-01
This is a report on the third in a series of petascale workshops co-sponsored by Blue Waters and TeraGrid to address challenges and opportunities for making effective use of emerging extreme-scale computing. This workshop was held to discuss fault tolerance on large systems for running large, possibly long-running applications. The main point of the workshop was to have systems people, middleware people (including fault-tolerance experts), and applications people talk about the issues and figure out what needs to be done, mostly at the middleware and application levels, to run such applications on the emerging petascale systems, without having faults cause large numbers of application failures. The workshop found that there is considerable interest in fault tolerance, resilience, and reliability of high-performance computing (HPC) systems in general, at all levels of HPC. The only way to recover from faults is through the use of some redundancy, either in space or in time. Redundancy in time, in the form of writing checkpoints to disk and restarting at the most recent checkpoint after a fault that causes an application to crash/halt, is the most common tool used in applications today, but there are questions about how long this can continue to be a good solution as systems and memories grow faster than I/O bandwidth to disk. There is interest in both modifications to this, such as checkpoints to memory, partial checkpoints, and message logging, and alternative ideas, such as in-memory recovery using residues. We believe that systematic exploration of these ideas holds the most promise for the scientific applications community. Fault tolerance has been an issue of discussion in the HPC community for at least the past 10 years; but much like other issues, the community has managed to put off addressing it during this period. There is a growing recognition that as systems continue to grow to petascale and beyond, the field is approaching the point where we don't have any choice but to address this through R&D efforts.
A survey of CPU-GPU heterogeneous computing techniques
Mittal, Sparsh; Vetter, Jeffrey S.
2015-07-04
As both CPU and GPU become employed in a wide range of applications, it has been acknowledged that both of these processing units (PUs) have unique features and strengths; hence, CPU-GPU collaboration is inevitable to achieve high-performance computing. This has motivated a significant amount of research on heterogeneous computing techniques, along with the design of CPU-GPU fused chips and petascale heterogeneous supercomputers. In this paper, we survey heterogeneous computing techniques (HCTs) such as workload partitioning, which enable utilizing both the CPU and GPU to improve performance and/or energy efficiency. We review heterogeneous computing approaches at the runtime, algorithm, programming, compiler, and application levels. Further, we review both discrete and fused CPU-GPU systems; and discuss benchmark suites designed for evaluating heterogeneous computing systems (HCSs). Furthermore, we believe that this paper will provide insights into the working and scope of applications of HCTs to researchers and motivate them to further harness the computational powers of CPUs and GPUs to achieve the goal of exascale performance.
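As a minimal illustration of one such HCT, workload partitioning, the Python sketch below splits a batch of work items between a CPU and a GPU in proportion to their measured throughputs so that both finish at roughly the same time. The throughput figures are hypothetical placeholders; real partitioners surveyed in the paper also adapt the split at runtime.

# Static CPU/GPU workload partitioning by relative throughput.
def partition(n_items, cpu_rate, gpu_rate):
    alpha = gpu_rate / (cpu_rate + gpu_rate)   # fraction of work assigned to the GPU
    n_gpu = round(n_items * alpha)
    return n_items - n_gpu, n_gpu

cpu_rate, gpu_rate = 2.0e9, 14.0e9             # items/s from an assumed calibration run
n_cpu, n_gpu = partition(1_000_000, cpu_rate, gpu_rate)
print(f"CPU gets {n_cpu} items, GPU gets {n_gpu} items")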
Petascale Many Body Methods for Complex Correlated Systems
NASA Astrophysics Data System (ADS)
Pruschke, Thomas
2012-02-01
Correlated systems constitute an important class of materials in modern condensed matter physics. Correlations among electrons are at the heart of all ordering phenomena and many intriguing novel aspects, such as quantum phase transitions or topological insulators, observed in a variety of compounds. Yet, theoretically describing these phenomena is still a formidable task, even if one restricts the models used to the smallest possible set of degrees of freedom. Here, modern computer architectures play an essential role, and the joint effort to devise efficient algorithms and implement them on state-of-the-art hardware has become an extremely active field in condensed-matter research. To tackle this task single-handed is quite obviously not possible. The NSF-OISE funded PIRE collaboration ``Graduate Education and Research in Petascale Many Body Methods for Complex Correlated Systems'' is a successful initiative to bring together leading experts around the world to form a virtual international organization for addressing these emerging challenges and educating the next generation of computational condensed matter physicists. The collaboration includes research groups developing novel theoretical tools to reliably and systematically study correlated solids, experts in the efficient computational algorithms needed to solve the emerging equations, and those able to use modern heterogeneous computer architectures to make them working tools for the growing community.
Spiking network simulation code for petascale computers.
Kunkel, Susanne; Schmidt, Maximilian; Eppler, Jochen M; Plesser, Hans E; Masumoto, Gen; Igarashi, Jun; Ishii, Shin; Fukai, Tomoki; Morrison, Abigail; Diesmann, Markus; Helias, Moritz
2014-01-01
Brain-scale networks exhibit a breathtaking heterogeneity in the dynamical properties and parameters of their constituents. At cellular resolution, the entities of theory are neurons and synapses and over the past decade researchers have learned to manage the heterogeneity of neurons and synapses with efficient data structures. Already early parallel simulation codes stored synapses in a distributed fashion such that a synapse solely consumes memory on the compute node harboring the target neuron. As petaflop computers with some 100,000 nodes become increasingly available for neuroscience, new challenges arise for neuronal network simulation software: Each neuron contacts on the order of 10,000 other neurons and thus has targets only on a fraction of all compute nodes; furthermore, for any given source neuron, at most a single synapse is typically created on any compute node. From the viewpoint of an individual compute node, the heterogeneity in the synaptic target lists thus collapses along two dimensions: the dimension of the types of synapses and the dimension of the number of synapses of a given type. Here we present a data structure taking advantage of this double collapse using metaprogramming techniques. After introducing the relevant scaling scenario for brain-scale simulations, we quantitatively discuss the performance on two supercomputers. We show that the novel architecture scales to the largest petascale supercomputers available today.
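The schematic Python sketch below illustrates the storage idea described above (it is not the actual NEST implementation): a compute node keeps, for each source neuron, only the few synapses whose targets it hosts, grouped by synapse type, so that holding zero or one synapse of a given type costs almost nothing.

from collections import defaultdict

class NodeLocalConnectivity:
    def __init__(self, local_neurons):
        self.local = set(local_neurons)
        # source neuron -> synapse type -> list of (local target, weight)
        self.synapses = defaultdict(lambda: defaultdict(list))

    def add_synapse(self, source, target, syn_type, weight):
        if target in self.local:                 # store only if the target is hosted here
            self.synapses[source][syn_type].append((target, weight))

    def targets_of(self, source):
        return dict(self.synapses.get(source, {}))

node = NodeLocalConnectivity(local_neurons=range(0, 1000))
node.add_synapse(source=123456, target=42, syn_type="static", weight=0.1)
node.add_synapse(source=123456, target=5000, syn_type="static", weight=0.1)  # ignored: not local
print(node.targets_of(123456))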
Reducing I/O variability using dynamic I/O path characterization in petascale storage systems
Son, Seung Woo; Sehrish, Saba; Liao, Wei-keng; ...
2016-11-01
In petascale systems with a million CPU cores, scalable and consistent I/O performance is becoming increasingly difficult to sustain, mainly because of I/O variability. This I/O variability is caused by concurrently running processes/jobs competing for I/O or by a RAID rebuild when a disk drive fails. We present a mechanism that stripes across a selected subset of I/O nodes with the lightest workload at runtime to achieve the highest I/O bandwidth available in the system. In this paper, we propose a probing mechanism to enable application-level dynamic file striping to mitigate I/O variability. We also implement the proposed mechanism in the high-level I/O library that enables memory-to-file data layout transformation and allows transparent file partitioning using subfiling. Subfiling is a technique that partitions data into a set of smaller files and manages access to them, while allowing the data to be treated as a single, normal file by users. Here, we demonstrate that our bandwidth probing mechanism can successfully identify temporally slower I/O nodes without noticeable runtime overhead. Experimental results on NERSC's systems also show that our approach isolates I/O variability effectively on shared systems and improves overall collective I/O performance with less variation.
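The selection step can be pictured with the short Python sketch below: probe the momentary bandwidth of each I/O node and stripe the subfiles over the fastest ones. The probe values are invented; the real mechanism sits inside the high-level I/O library and measures bandwidth at runtime rather than using fixed numbers.

# Choose the k lightest-loaded (fastest) I/O nodes for striping.
def choose_stripe_targets(probed_bw, k):
    """probed_bw: {io_node_id: MB/s observed by a short probe}"""
    return sorted(probed_bw, key=probed_bw.get, reverse=True)[:k]

probed = {0: 310.0, 1: 95.0, 2: 280.0, 3: 150.0, 4: 305.0, 5: 60.0}
print("stripe subfiles over I/O nodes:", choose_stripe_targets(probed, k=3))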
Scemama, Anthony; Caffarel, Michel; Oseret, Emmanuel; Jalby, William
2013-04-30
Various strategies to efficiently implement quantum Monte Carlo (QMC) simulations for large chemical systems are presented. These include: (i) the introduction of an efficient algorithm to calculate the computationally expensive Slater matrices, a novel scheme based on the highly localized character of atomic Gaussian basis functions (not the molecular orbitals, as usually done); (ii) the possibility of keeping the memory footprint minimal; (iii) the important enhancement of single-core performance when efficient optimization tools are used; and (iv) the definition of a universal, dynamic, fault-tolerant, and load-balanced framework adapted to all kinds of computational platforms (massively parallel machines, clusters, or distributed grids). These strategies have been implemented in the QMC=Chem code developed at Toulouse and illustrated with numerical applications on small peptides of increasing sizes (158, 434, 1056, and 1731 electrons). Using 10,000-80,000 computing cores of the Curie machine (GENCI-TGCC-CEA, France), QMC=Chem has been shown to be capable of running at the petascale level, thus demonstrating that for this machine a large part of the peak performance can be achieved. Implementation of large-scale QMC simulations for future exascale platforms with a comparable level of efficiency is expected to be feasible. Copyright © 2013 Wiley Periodicals, Inc.
NASA Astrophysics Data System (ADS)
Morin, Paul; Porter, Claire; Cloutier, Michael; Howat, Ian; Noh, Myoung-Jong; Willis, Michael; Kramer, WIlliam; Bauer, Greg; Bates, Brian; Williamson, Cathleen
2017-04-01
Surface topography is among the most fundamental data sets for the geosciences, essential for disciplines ranging from glaciology to geodynamics. Two new projects are using sub-meter commercial imagery licensed by the National Geospatial-Intelligence Agency and open-source photogrammetry software to produce a time-tagged 2m posting elevation model of the Arctic and an 8m posting reference elevation model for the Antarctic. When complete, this publicly available data will be at higher resolution than any elevation models that cover the entirety of the Western United States. These two polar projects are made possible by three equally important factors: 1) open-source photogrammetry software, 2) petascale computing, and 3) sub-meter imagery licensed to the United States Government. Our talk will detail the technical challenges of using automated photogrammetry software; the rapid workflow evolution to allow DEM production; the task of deploying the workflow on one of the world's largest supercomputers; the trials of moving massive amounts of data; and the management challenges the team needed to solve in order to meet deadlines. Finally, we will discuss the implications of this type of collaboration for future multi-team use of leadership-class systems such as Blue Waters, and for further elevation mapping.
Manyscale Computing for Sensor Processing in Support of Space Situational Awareness
NASA Astrophysics Data System (ADS)
Schmalz, M.; Chapman, W.; Hayden, E.; Sahni, S.; Ranka, S.
2014-09-01
Increasing image and signal data burden associated with sensor data processing in support of space situational awareness implies continuing computational throughput growth beyond the petascale regime. In addition to growing applications data burden and diversity, the breadth, diversity and scalability of high performance computing architectures and their various organizations challenge the development of a single, unifying, practicable model of parallel computation. Therefore, models for scalable parallel processing have exploited architectural and structural idiosyncrasies, yielding potential misapplications when legacy programs are ported among such architectures. In response to this challenge, we have developed a concise, efficient computational paradigm and software called Manyscale Computing to facilitate efficient mapping of annotated application codes to heterogeneous parallel architectures. Our theory, algorithms, software, and experimental results support partitioning and scheduling of application codes for envisioned parallel architectures, in terms of work atoms that are mapped (for example) to threads or thread blocks on computational hardware. Because of the rigor, completeness, conciseness, and layered design of our manyscale approach, application-to-architecture mapping is feasible and scalable for architectures at petascales, exascales, and above. Further, our methodology is simple, relying primarily on a small set of primitive mapping operations and support routines that are readily implemented on modern parallel processors such as graphics processing units (GPUs) and hybrid multi-processors (HMPs). In this paper, we overview the opportunities and challenges of manyscale computing for image and signal processing in support of space situational awareness applications. We discuss applications in terms of a layered hardware architecture (laboratory > supercomputer > rack > processor > component hierarchy). Demonstration applications include performance analysis and results in terms of execution time as well as storage, power, and energy consumption for bus-connected and/or networked architectures. The feasibility of the manyscale paradigm is demonstrated by addressing four principal challenges: (1) architectural/structural diversity, parallelism, and locality, (2) masking of I/O and memory latencies, (3) scalability of design as well as implementation, and (4) efficient representation/expression of parallel applications. Examples will demonstrate how manyscale computing helps solve these challenges efficiently on real-world computing systems.
NASA Astrophysics Data System (ADS)
Borne, K. D.; Fortson, L.; Gay, P.; Lintott, C.; Raddick, M. J.; Wallin, J.
2009-12-01
The remarkable success of Galaxy Zoo as a citizen science project for galaxy classification within a terascale astronomy data collection has led to the development of a broader collaboration, known as the Zooniverse. Activities will include astronomy, lunar science, solar science, and digital humanities. Some features of our program include development of a unified framework for citizen science projects, development of a common set of user-based research tools, engagement of the machine learning community to apply machine learning algorithms on the rich training data provided by citizen scientists, and extension across multiple research disciplines. The Zooniverse collaboration is just getting started, but already we are implementing a scientifically deep follow-on to Galaxy Zoo. This project, tentatively named Galaxy Merger Zoo, will engage users in running numerical simulations, whose input parameter space is voluminous and therefore demands a clever solution, such as allowing the citizen scientists to select their own sets of parameters, which then trigger new simulations of colliding galaxies. The user interface design has many of the engaging features that retain users, including rapid feedback, visually appealing graphics, and the sense of playing a competitive game for the benefit of science. We will discuss these topics. In addition, we will also describe applications of Citizen Science that are being considered for the petascale science project LSST (Large Synoptic Survey Telescope). LSST will produce a scientific data system that consists of a massive image archive (nearly 100 petabytes) and a similarly massive scientific parameter database (20-40 petabytes). Applications of Citizen Science for such an enormous data collection will enable greater scientific return in at least two ways. First, citizen scientists work with real data and perform authentic research tasks of value to the advancement of the science, providing "human computation" capabilities and resources to review, annotate, and explore aspects of the data that are too overwhelming for the science team. Second, citizen scientists' inputs (in the form of rich training data and class labels) can be used to improve the classifiers that the project team uses to classify and prioritize new events detected in the petascale data stream. This talk will review these topics and provide an update on the Zooniverse project.
Quantum transport and nanoplasmonics with carbon nanorings - using HPC in computational nanoscience
NASA Astrophysics Data System (ADS)
Jack, Mark A.
2011-10-01
The central theme of this talk is the theoretical study of toroidal carbon nanostructures as a new form of metamaterial. The interference of ring-generated electromagnetic radiation in a regular array of nanorings driven by an incoming polarized wave front may lead to fascinating new optoelectronics applications. The tight-binding method is used to model charge transport in a carbon nanotorus: all transport observables can be derived from the Green's function of the device region in a non-equilibrium Green's function algorithm. We have calculated the density of states D(E) and transmissivities T(E) between two metallic leads under a small voltage bias. Electron-phonon coupling is included for low-energy phonon modes of armchair and zigzag nanorings, with atomic displacements determined by a collaborator's finite-element-based code. A numerically fast and stable algorithm has been developed via parallel linear algebra matrix routines (PETSc) with MPI parallelism to reach significant speed-up. Production runs are planned on the NSF XSEDE network. This project was supported in part by a 2010 NSF TeraGrid Fellowship and the Sunshine State Education and Research Computing Alliance (SSERCA). Two summer students were supported as 2010 and 2011 NCSI/Shodor Petascale Computing undergraduate interns. In collaboration with Leon W. Durivage, Adam Byrd, and Mario Encinosa.
High Performance Visualization using Query-Driven Visualization and Analytics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bethel, E. Wes; Campbell, Scott; Dart, Eli
2006-06-15
Query-driven visualization and analytics is a unique approach for high-performance visualization that offers new capabilities for knowledge discovery and hypothesis testing. The new capabilities, akin to finding needles in haystacks, are the result of combining technologies from the fields of scientific visualization and scientific data management. This approach is crucial for rapid data analysis and visualization in the petascale regime. This article describes how query-driven visualization is applied to a hero-sized network traffic analysis problem.
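A toy Python sketch of the approach: evaluate a compound query first and hand only the matching records to the expensive visualization stage. The field names and thresholds are hypothetical network-flow attributes, not the article's actual data.

records = [
    {"src": "10.0.0.1", "bytes": 120,       "duration": 0.2},
    {"src": "10.0.0.2", "bytes": 9_800_000, "duration": 45.0},
    {"src": "10.0.0.3", "bytes": 7_500_000, "duration": 0.3},
]

# the "needle" definition: large, short-lived flows
query = lambda r: r["bytes"] > 1_000_000 and r["duration"] < 1.0

needles = [r for r in records if query(r)]
print(f"visualizing {len(needles)} of {len(records)} records:", needles)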
The Next Frontier in Computing
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sarrao, John
2016-11-16
Exascale computing refers to computing systems capable of at least one exaflop, or a billion billion (10^18) calculations per second. That is 50 times faster than the most powerful supercomputers being used today and represents a thousand-fold increase over the first petascale computer that came into operation in 2008. How we use these large-scale simulation resources is the key to solving some of today's most pressing problems, including clean energy production, nuclear reactor lifetime extension, and nuclear stockpile aging.
SciDAC GSEP: Gyrokinetic Simulation of Energetic Particle Turbulence and Transport
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lin, Zhihong
Energetic particle (EP) confinement is a key physics issue for the burning plasma experiment ITER, the crucial next step in the quest for clean and abundant energy, since ignition relies on self-heating by energetic fusion products (α-particles). Due to the strong coupling of EPs with burning thermal plasmas, plasma confinement properties in the ignition regime are one of the most uncertain factors when extrapolating from existing fusion devices to the ITER tokamak. EP populations in current tokamaks are mostly produced by auxiliary heating such as neutral beam injection (NBI) and radio frequency (RF) heating. Remarkable progress in developing comprehensive EP simulation codes and understanding basic EP physics has been made by two concurrent SciDAC EP projects (GSEP) funded by the Department of Energy (DOE) Office of Fusion Energy Science (OFES), which have successfully established gyrokinetic turbulence simulation as a necessary paradigm shift for studying EP confinement in burning plasmas. Verification and validation have rapidly advanced through close collaborations between simulation, theory, and experiment. Furthermore, productive collaborations with computational scientists have enabled EP simulation codes to effectively utilize current petascale computers and emerging exascale computers. We review here key physics progress in the GSEP projects regarding verification and validation of gyrokinetic simulations, nonlinear EP physics, EP coupling with thermal plasmas, and reduced EP transport models. Advances in high performance computing through collaborations with computational scientists that enable these large-scale electromagnetic simulations are also highlighted. These results have been widely disseminated in numerous peer-reviewed publications, including many Phys. Rev. Lett. papers, and in many invited presentations at prominent fusion conferences such as the biennial International Atomic Energy Agency (IAEA) Fusion Energy Conference and the annual meeting of the American Physical Society, Division of Plasma Physics (APS-DPP).
NASA Astrophysics Data System (ADS)
Fonseca, R. A.; Vieira, J.; Fiuza, F.; Davidson, A.; Tsung, F. S.; Mori, W. B.; Silva, L. O.
2013-12-01
A new generation of laser wakefield accelerators (LWFA), supported by the extreme accelerating fields generated in the interaction of PW-class lasers and underdense targets, promises the production of high-quality electron beams in short distances for multiple applications. Achieving this goal will rely heavily on numerical modelling to further understand the underlying physics and identify optimal regimes, but large-scale modelling of these scenarios is computationally heavy and requires the efficient use of state-of-the-art petascale supercomputing systems. We discuss the main difficulties involved in running these simulations and the new developments implemented in the OSIRIS framework to address these issues, ranging from multi-dimensional dynamic load balancing and hybrid distributed/shared memory parallelism to the vectorization of the PIC algorithm. We present the results of the OASCR Joule Metric program on the issue of large-scale modelling of LWFA, demonstrating speedups of over one order of magnitude on the same hardware. Finally, scalability to over ~10^6 cores and sustained performance over ~2 PFlops are demonstrated, opening the way for large-scale modelling of LWFA scenarios.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gupta, R.; Naik, H.; Beckman, P.
Providing fault tolerance in high-end petascale systems, consisting of millions of hardware components and complex software stacks, is becoming an increasingly challenging task. Checkpointing continues to be the most prevalent technique for providing fault tolerance in such high-end systems. Considerable research has focused on optimizing checkpointing; however, in practice, checkpointing still involves a high-cost overhead for users. In this paper, we study the checkpointing overhead seen by various applications running on leadership-class machines like the IBM Blue Gene/P at Argonne National Laboratory. In addition to studying popular applications, we design a methodology to help users understand and intelligently choose an optimal checkpointing frequency to reduce the overall checkpointing overhead incurred. In particular, we study the Grid-Based Projector-Augmented Wave application, the Carr-Parrinello Molecular Dynamics application, the Nek5000 computational fluid dynamics application, and the Parallel Ocean Program application, and analyze their memory usage and possible checkpointing trends on 65,536 processors of the Blue Gene/P system.
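One common back-of-the-envelope way to reason about checkpointing frequency is Young's first-order approximation, tau ≈ sqrt(2 · C · MTBF), which balances the cost C of writing a checkpoint against the expected rework after a failure. The Python sketch below applies it with assumed numbers; it is an illustration of the trade-off being optimized, not the methodology of the paper.

import math

def optimal_interval(checkpoint_cost_s, mtbf_s):
    # Young's approximation for the checkpoint interval
    return math.sqrt(2.0 * checkpoint_cost_s * mtbf_s)

C = 600.0            # seconds to write one checkpoint (assumed)
MTBF = 24 * 3600.0   # one failure per day across the allocation (assumed)
tau = optimal_interval(C, MTBF)
print(f"checkpoint roughly every {tau / 3600.0:.1f} hours")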
The Next Frontier in Computing
Sarrao, John
2018-06-13
Exascale computing refers to computing systems capable of at least one exaflop, or a billion billion (10^18) calculations per second. That is 50 times faster than the most powerful supercomputers being used today and represents a thousand-fold increase over the first petascale computer that came into operation in 2008. How we use these large-scale simulation resources is the key to solving some of today's most pressing problems, including clean energy production, nuclear reactor lifetime extension and nuclear stockpile aging.
Rapid insights from remote sensing in the geosciences
NASA Astrophysics Data System (ADS)
Plaza, Antonio
2015-03-01
The growing availability of capacity computing for atomistic materials modeling has encouraged the use of high-accuracy computationally intensive interatomic potentials, such as SNAP. These potentials also happen to scale well on petascale computing platforms. SNAP has a very general form and uses machine-learning techniques to reproduce the energies, forces, and stress tensors of a large set of small configurations of atoms, which are obtained using high-accuracy quantum electronic structure (QM) calculations. The local environment of each atom is characterized by a set of bispectrum components of the local neighbor density projected on to a basis of hyperspherical harmonics in four dimensions. The computational cost per atom is much greater than that of simpler potentials such as Lennard-Jones or EAM, while the communication cost remains modest. We discuss a variety of strategies for implementing SNAP in the LAMMPS molecular dynamics package. We present scaling results obtained running SNAP on three different classes of machine: a conventional Intel Xeon CPU cluster; the Titan GPU-based system; and the combined Sequoia and Vulcan BlueGene/Q. Sandia National Laboratories is a multi-program laboratory managed and operated by Sandia Corporation, a wholly owned subsidiary of Lockheed Martin Corp., for the U.S. Dept. of Energy's National Nuclear Security Admin. under Contract DE-AC04-94AL85000.
From Petascale to Exascale: Eight Focus Areas of R&D Challenges for HPC Simulation Environments
DOE Office of Scientific and Technical Information (OSTI.GOV)
Springmeyer, R; Still, C; Schulz, M
2011-03-17
Programming models bridge the gap between the underlying hardware architecture and the supporting layers of software available to applications. Programming models are different from both programming languages and application programming interfaces (APIs). Specifically, a programming model is an abstraction of the underlying computer system that allows for the expression of both algorithms and data structures. In comparison, languages and APIs provide implementations of these abstractions and allow the algorithms and data structures to be put into practice - a programming model exists independently of the choice of both the programming language and the supporting APIs. Programming models are typically focused on achieving increased developer productivity, performance, and portability to other system designs. The rapidly changing nature of processor architectures and the complexity of designing an exascale platform provide significant challenges for these goals. Several other factors are likely to impact the design of future programming models. In particular, the representation and management of increasing levels of parallelism, concurrency and memory hierarchies, combined with the ability to maintain a progressive level of interoperability with today's applications are of significant concern. Overall the design of a programming model is inherently tied not only to the underlying hardware architecture, but also to the requirements of applications and libraries including data analysis, visualization, and uncertainty quantification. Furthermore, the successful implementation of a programming model is dependent on exposed features of the runtime software layers and features of the operating system. Successful use of a programming model also requires effective presentation to the software developer within the context of traditional and new software development tools. Consideration must also be given to the impact of programming models on both languages and the associated compiler infrastructure. Exascale programming models must reflect several, often competing, design goals. These design goals include desirable features such as abstraction and separation of concerns. However, some aspects are unique to large-scale computing. For example, interoperability and composability with existing implementations will prove critical. In particular, performance is the essential underlying goal for large-scale systems. A key evaluation metric for exascale models will be the extent to which they support these goals rather than merely enable them.
I/O-aware bandwidth allocation for petascale computing systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhou, Zhou; Yang, Xu; Zhao, Dongfang
In the Big Data era, the gap between the storage performance and an application's I/O requirement is increasing. I/O congestion caused by concurrent storage accesses from multiple applications is inevitable and severely harms the performance. Conventional approaches either focus on optimizing an application's access pattern individually or handle I/O requests on a low-level storage layer without any knowledge from the upper-level applications. In this paper, we present a novel I/O-aware bandwidth allocation framework to coordinate ongoing I/O requests on petascale computing systems. The motivation behind this innovation is that the resource management system has a holistic view of both the system state and jobs' activities and can dynamically control the jobs' status or allocate resources on the fly during their execution. We treat a job's I/O requests as periodical subjobs within its lifecycle and transform the I/O congestion issue into a classical scheduling problem. Based on this model, we propose a bandwidth management mechanism as an extension to the existing scheduling system. We design several bandwidth allocation policies with different optimization objectives, either on user-oriented metrics or system performance. We conduct extensive trace-based simulations using real job traces and I/O traces from a production IBM Blue Gene/Q system at Argonne National Laboratory. Experimental results demonstrate that our new design can improve job performance by more than 30%, as well as increasing system performance.
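A toy Python sketch of the scheduling view taken above: pending I/O subjobs request bandwidth, and the allocator admits them only while the aggregate demand stays under the system's I/O bandwidth, deferring the rest. The admission policy and the numbers are illustrative, not the paper's actual allocation policies.

SYSTEM_BW = 100.0   # GB/s available from the storage system (assumed)

def admit(pending, in_flight_bw):
    admitted, deferred = [], []
    budget = SYSTEM_BW - in_flight_bw
    for job, bw in sorted(pending, key=lambda x: x[1]):   # smallest demand first
        if bw <= budget:
            admitted.append(job)
            budget -= bw
        else:
            deferred.append(job)
    return admitted, deferred

pending = [("jobA", 40.0), ("jobB", 70.0), ("jobC", 20.0)]
print(admit(pending, in_flight_bw=10.0))   # jobC and jobA fit; jobB is deferred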
Performance Engineering Research Institute SciDAC-2 Enabling Technologies Institute Final Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hall, Mary
2014-09-19
Enhancing the performance of SciDAC applications on petascale systems has high priority within DOE SC. As we look to the future, achieving expected levels of performance on high-end computing (HEC) systems is growing ever more challenging due to enormous scale, increasing architectural complexity, and increasing application complexity. To address these challenges, PERI has implemented a unified, tripartite research plan encompassing: (1) performance modeling and prediction; (2) automatic performance tuning; and (3) performance engineering of high profile applications. The PERI performance modeling and prediction activity is developing and refining performance models, significantly reducing the cost of collecting the data upon which the models are based, and increasing model fidelity, speed and generality. Our primary research activity is automatic tuning (autotuning) of scientific software. This activity is spurred by the strong user preference for automatic tools and is based on previous successful activities such as ATLAS, which has automatically tuned components of the LAPACK linear algebra library, and other recent work on autotuning domain-specific libraries. Our third major component is application engagement, to which we are devoting approximately 30% of our effort to work directly with SciDAC-2 applications. This last activity not only helps DOE scientists meet their near-term performance goals, but also helps keep PERI research focused on the real challenges facing DOE computational scientists as they enter the Petascale Era.
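Empirical autotuning in this spirit amounts to searching a space of code variants or parameters and keeping the fastest. The Python sketch below times a trivial stand-in kernel over a few candidate block sizes; real autotuners such as those PERI built search far richer spaces (tilings, unrollings, schedules) and measure the actual target hardware.

import time

def kernel(block):
    # trivial stand-in for a parameterized computational kernel
    s = 0
    for i in range(0, 200_000, block):
        s += i
    return s

best = None
for block in (1, 2, 4, 8, 16):
    t0 = time.perf_counter()
    kernel(block)
    elapsed = time.perf_counter() - t0
    if best is None or elapsed < best[1]:
        best = (block, elapsed)

print(f"selected block = {best[0]} ({best[1] * 1e3:.2f} ms)")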
eXascale PRogramming Environment and System Software (XPRESS)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chapman, Barbara; Gabriel, Edgar
Exascale systems, with a thousand times the compute capacity of today's leading edge petascale computers, are expected to emerge during the next decade. Their software systems will need to facilitate the exploitation of exceptional amounts of concurrency in applications, and ensure that jobs continue to run despite the occurrence of system failures and other kinds of hard and soft errors. Adapting computations at runtime to cope with changes in the execution environment, as well as to improve power and performance characteristics, is likely to become the norm. As a result, considerable innovation is required to develop system support to meet the needs of future computing platforms. The XPRESS project aims to develop and prototype a revolutionary software system for extreme-scale computing for both exascale and strong-scaled problems. The XPRESS collaborative research project will advance the state-of-the-art in high performance computing and enable exascale computing for current and future DOE mission-critical applications and supporting systems. The goals of the XPRESS research project are to: A. enable exascale performance capability for DOE applications, both current and future; B. develop and deliver a practical computing system software X-stack, OpenX, for future practical DOE exascale computing systems; and C. provide programming methods and environments for effective means of expressing application and system software for portable exascale system execution.
Building confidence and credibility amid growing model and computing complexity
NASA Astrophysics Data System (ADS)
Evans, K. J.; Mahajan, S.; Veneziani, C.; Kennedy, J. H.
2017-12-01
As global Earth system models are developed to answer an ever-wider range of science questions, software products that provide robust verification, validation, and evaluation must evolve in tandem. Measuring the degree to which these new models capture past behavior, predict the future, and provide the certainty of predictions is becoming ever more difficult for reasons that are generally well known, yet still challenging to address. Two specific and divergent needs for analysis of the Accelerated Climate Model for Energy (ACME) model - but with a similar software philosophy - are presented to show how a model-developer-based focus can address analysis needs during expansive model changes to provide greater fidelity and execute on multi-petascale computing facilities. A-PRIME is a Python-script-based quick-look overview of a fully coupled global model configuration to determine quickly whether it captures specific behavior before significant computer time and expense is invested. EVE is an ensemble-based software framework that focuses on verification of performance-based ACME model development, such as compiler or machine settings, to determine the equivalence of relevant climate statistics. The challenges and solutions for analysis of multi-petabyte output data are highlighted from the perspective of the scientist using the software, with the aim of fostering discussion and further input from the community about improving developer confidence and community credibility.
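The ensemble-equivalence idea can be sketched in a few lines of Python: compare a baseline ensemble of some climate statistic against an ensemble from a rebuilt model (new compiler, new machine) and flag statistically distinguishable differences. The variable, ensemble sizes, and the Kolmogorov-Smirnov test below are assumptions for illustration; EVE's actual tests and workflow differ.

import numpy as np
from scipy.stats import ks_2samp

rng = np.random.default_rng(0)
baseline  = rng.normal(loc=287.5, scale=0.15, size=30)   # e.g. annual global-mean T (K), assumed
candidate = rng.normal(loc=287.5, scale=0.15, size=30)   # rebuilt model, same climate

stat, p = ks_2samp(baseline, candidate)
print(f"KS statistic = {stat:.3f}, p = {p:.3f}")
print("equivalent" if p > 0.05 else "statistically distinguishable")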
2011 Computation Directorate Annual Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Crawford, D L
2012-04-11
From its founding in 1952 until today, Lawrence Livermore National Laboratory (LLNL) has made significant strategic investments to develop high performance computing (HPC) and its application to national security and basic science. Now, 60 years later, the Computation Directorate and its myriad resources and capabilities have become a key enabler for LLNL programs and an integral part of the effort to support our nation's nuclear deterrent and, more broadly, national security. In addition, the technological innovation HPC makes possible is seen as vital to the nation's economic vitality. LLNL, along with other national laboratories, is working to make supercomputing capabilities and expertise available to industry to boost the nation's global competitiveness. LLNL is on the brink of an exciting milestone with the 2012 deployment of Sequoia, the National Nuclear Security Administration's (NNSA's) 20-petaFLOP/s resource that will apply uncertainty quantification to weapons science. Sequoia will bring LLNL's total computing power to more than 23 petaFLOP/s, all brought to bear on basic science and national security needs. The computing systems at LLNL provide game-changing capabilities. Sequoia and other next-generation platforms will enable predictive simulation in the coming decade and leverage industry trends, such as massively parallel and multicore processors, to run petascale applications. Efficient petascale computing necessitates refining accuracy in materials property data, improving models for known physical processes, identifying and then modeling missing physics, quantifying uncertainty, and enhancing the performance of complex models and algorithms in macroscale simulation codes. Nearly 15 years ago, NNSA's Accelerated Strategic Computing Initiative (ASCI), now called the Advanced Simulation and Computing (ASC) Program, was the critical element needed to shift from test-based confidence to science-based confidence. Specifically, ASCI/ASC accelerated the development of simulation capabilities necessary to ensure confidence in the nuclear stockpile, far exceeding what might have been achieved in the absence of a focused initiative. While stockpile stewardship research pushed LLNL scientists to develop new computer codes, better simulation methods, and improved visualization technologies, this work also stimulated the exploration of HPC applications beyond the standard sponsor base. As LLNL advances to a petascale platform and pursues exascale computing (1,000 times faster than Sequoia), ASC will be paramount to achieving predictive simulation and uncertainty quantification. Predictive simulation and quantifying the uncertainty of numerical predictions where little-to-no data exists demand exascale computing and represent an expanding area of scientific research important not only to nuclear weapons, but to nuclear attribution, nuclear reactor design, and understanding global climate issues, among other fields. Aside from these lofty goals and challenges, computing at LLNL is anything but 'business as usual.' International competition in supercomputing is nothing new, but the HPC community is now operating in an expanded, more aggressive climate of global competitiveness. More countries understand how science and technology research and development are inextricably linked to economic prosperity, and they are aggressively pursuing ways to integrate HPC technologies into their native industrial and consumer products.
In the interest of the nation's economic security and the science and technology that underpins it, LLNL is expanding its portfolio and forging new collaborations. We must ensure that HPC remains an asymmetric engine of innovation for the Laboratory and for the U.S. and, in doing so, protect our research and development dynamism and the prosperity it makes possible. One untapped area of opportunity LLNL is pursuing is to help U.S. industry understand how supercomputing can benefit its business. Industrial investment in HPC applications has historically been limited by the prohibitive cost of entry, the inaccessibility of software to run the powerful systems, and the years it takes to grow the expertise to develop codes and run them in an optimal way. LLNL is helping industry better compete in the global marketplace by providing access to some of the world's most powerful computing systems, the tools to run them, and the experts who are adept at using them. Our scientists are collaborating side by side with industrial partners to develop solutions to some of industry's toughest problems. The goal of the Livermore Valley Open Campus High Performance Computing Innovation Center is to allow American industry the opportunity to harness the power of supercomputing by leveraging the scientific and computational expertise at LLNL in order to gain a competitive advantage in the global economy.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Livny, Miron; Shank, James; Ernst, Michael
Under this SciDAC-2 grant the project’s goal was to stimulate new discoveries by providing scientists with effective and dependable access to an unprecedented national distributed computational facility: the Open Science Grid (OSG). We proposed to achieve this through the work of the Open Science Grid Consortium: a unique hands-on multi-disciplinary collaboration of scientists, software developers and providers of computing resources. Together the stakeholders in this consortium sustain and use a shared distributed computing environment that transforms simulation and experimental science in the US. The OSG consortium is an open collaboration that actively engages new research communities. We operate an open facility that brings together a broad spectrum of compute, storage, and networking resources and interfaces to other cyberinfrastructures, including the US XSEDE (previously TeraGrid) and the European Enabling Grids for E-sciencE (EGEE), as well as campus and regional grids. We leverage middleware provided by computer science groups, facility IT support organizations, and computing programs of application communities for the benefit of consortium members and the US national CI.
Designing for Peta-Scale in the LSST Database
NASA Astrophysics Data System (ADS)
Kantor, J.; Axelrod, T.; Becla, J.; Cook, K.; Nikolaev, S.; Gray, J.; Plante, R.; Nieto-Santisteban, M.; Szalay, A.; Thakar, A.
2007-10-01
The Large Synoptic Survey Telescope (LSST), a proposed ground-based 8.4 m telescope with a 10 deg^2 field of view, will generate 15 TB of raw images every observing night. When calibration and processed data are added, the image archive, catalogs, and meta-data will grow by 15 PB yr^{-1} on average. The LSST Data Management System (DMS) must capture, process, store, index, replicate, and provide open access to this data. Alerts must be triggered within 30 s of data acquisition. Doing this in real time at these data volumes will require advances in data management, database, and file system techniques. This paper describes the design of the LSST DMS and emphasizes features for peta-scale data. The LSST DMS will employ a combination of distributed database and file systems, with schema, partitioning, and indexing oriented for parallel operations. Image files are stored in a distributed file system with references to, and meta-data from, each file stored in the databases. The schema design supports pipeline processing, rapid ingest, and efficient query. Vertical partitioning reduces disk input/output requirements, while horizontal partitioning allows parallel data access using arrays of servers and disks. Indexing is extensive, utilizing both conventional RAM-resident indexes and column-narrow, row-deep tag tables/covering indices that are extracted from tables that contain many more attributes. The DMS Data Access Framework is encapsulated in a middleware framework to provide a uniform service interface to all framework capabilities. This framework will provide the automated work-flow, replication, and data analysis capabilities necessary to make data processing and data quality analysis feasible at this scale.
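The two partitioning ideas above translate naturally into code. The following is a minimal Python sketch, not the LSST DMS or Qserv implementation; the chunk count, column names, and declination-band chunking rule are illustrative assumptions only.

```python
# Illustrative sketch only -- not the LSST DMS/Qserv code. It shows horizontal
# partitioning of a source catalog by a spatial key (so chunks can be scanned
# in parallel on separate servers) and a column-narrow "tag table" extracted
# from a wide table to serve the most common queries.
import math
from collections import defaultdict

NUM_CHUNKS = 64  # hypothetical number of horizontal partitions

def chunk_id(ra_deg, dec_deg, num_chunks=NUM_CHUNKS):
    """Map a sky position to a partition by binning declination bands."""
    band = int((dec_deg + 90.0) / 180.0 * num_chunks)
    return min(band, num_chunks - 1)

def partition(catalog_rows):
    """Horizontal partitioning: group full rows by spatial chunk."""
    chunks = defaultdict(list)
    for row in catalog_rows:
        chunks[chunk_id(row["ra"], row["dec"])].append(row)
    return chunks

def tag_table(catalog_rows, columns=("object_id", "ra", "dec", "mag_r")):
    """Vertical partitioning: keep only the few columns most queries touch."""
    return [{c: row[c] for c in columns} for row in catalog_rows]

if __name__ == "__main__":
    rows = [{"object_id": i, "ra": (i * 3.7) % 360, "dec": (i % 180) - 90,
             "mag_r": 20 + math.sin(i), "shape_xx": 0.1, "shape_yy": 0.2}
            for i in range(1000)]
    print(len(partition(rows)), "chunks;", len(tag_table(rows)[0]), "columns in tag table")
```

In a real deployment the chunks would live on separate database servers, so a spatially constrained query only touches the partitions overlapping its region.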
Challenges at Petascale for Pseudo-Spectral Methods on Spheres (A Last Hurrah?)
NASA Technical Reports Server (NTRS)
Clune, Thomas
2011-01-01
Conclusions: a) Proper software abstractions should enable rapid exploration of platform-specific optimizations and tradeoffs. b) Pseudo-spectral methods are marginally viable for at least some classes of petascale problems; a GPU-based machine with good bisection bandwidth would be best. c) Scalability at exascale is possible, but the necessary resolution will make the algorithm prohibitively expensive. Efficient implementations of realistic global transposes are intricate and tedious in MPI. Pseudo-spectral methods at petascale require exploration of a variety of strategies for spreading local and remote communications. PGAS allows a far simpler implementation and thus rapid exploration of variants.
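The global transpose mentioned in these conclusions is the communication-heavy step of a 1-D-decomposed pseudo-spectral method. The sketch below, written with mpi4py rather than any code from the presentation, shows the basic pack/Alltoall/unpack pattern for redistributing a row-decomposed array into a column decomposition; the grid size and block layout are illustrative assumptions. Run with, e.g., `mpiexec -n 4 python transpose.py`.

```python
# Minimal sketch of a distributed matrix transpose for a 1-D decomposition:
# each rank owns a block of rows and, after the exchange, a block of columns.
# Assumes the global dimension N is divisible by the number of ranks.
import numpy as np
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()
N = 8 * size                     # hypothetical global grid dimension
rows = N // size                 # rows owned by this rank

local = np.arange(rank * rows * N, (rank + 1) * rows * N,
                  dtype=np.float64).reshape(rows, N)

# Pack: split the local rows into `size` column blocks, one per destination.
sendbuf = np.ascontiguousarray(
    local.reshape(rows, size, rows).transpose(1, 0, 2))
recvbuf = np.empty_like(sendbuf)

comm.Alltoall(sendbuf, recvbuf)

# Unpack: the received blocks stack into this rank's slab of the transpose.
transposed = recvbuf.transpose(2, 0, 1).reshape(rows, N)

full = np.arange(N * N, dtype=np.float64).reshape(N, N)
assert np.array_equal(transposed, full.T[rank * rows:(rank + 1) * rows])
if rank == 0:
    print("transpose verified on", size, "ranks")
```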
Experiment-scale molecular simulation study of liquid crystal thin films
NASA Astrophysics Data System (ADS)
Nguyen, Trung Dac; Carrillo, Jan-Michael Y.; Matheson, Michael A.; Brown, W. Michael
2014-03-01
Supercomputers have now reached a performance level adequate for studying thin films with molecular detail at the relevant scales. By exploiting the power of GPU accelerators on Titan, we have been able to perform simulations of characteristic liquid crystal films that provide remarkable qualitative agreement with experimental images. We have demonstrated that key features of spinodal instability can only be observed with sufficiently large system sizes, which were not accessible with previous simulation studies. Our study emphasizes the capability and significance of petascale simulations in providing molecular-level insights in thin film systems as well as other interfacial phenomena.
The LSST Data Mining Research Agenda
NASA Astrophysics Data System (ADS)
Borne, K.; Becla, J.; Davidson, I.; Szalay, A.; Tyson, J. A.
2008-12-01
We describe features of the LSST science database that are amenable to scientific data mining, object classification, outlier identification, anomaly detection, image quality assurance, and survey science validation. The data mining research agenda includes: scalability (at petabyte scales) of existing machine learning and data mining algorithms; development of grid-enabled parallel data mining algorithms; designing a robust system for brokering classifications from the LSST event pipeline (which may produce 10,000 or more event alerts per night); multi-resolution methods for exploration of petascale databases; indexing of multi-attribute multi-dimensional astronomical databases (beyond spatial indexing) for rapid querying of petabyte databases; and more.
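As a concrete, hedged illustration of the outlier-identification item on this agenda (not the LSST pipeline itself), a tree-ensemble anomaly detector can be run over a handful of catalog features; the feature choices and data below are hypothetical.

```python
# Hedged illustration: flag the low-score tail of a catalog with an
# Isolation Forest. The "catalog" here is synthetic and the columns are
# stand-ins for photometric colors and a magnitude.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(0)
catalog = np.column_stack([
    rng.normal(0.6, 0.2, 100_000),   # fake g-r color
    rng.normal(0.3, 0.1, 100_000),   # fake r-i color
    rng.normal(21.0, 1.5, 100_000),  # fake r-band magnitude
])

clf = IsolationForest(n_estimators=100, contamination=0.001, random_state=0)
labels = clf.fit_predict(catalog)          # -1 marks candidate anomalies
candidates = np.flatnonzero(labels == -1)  # indices to route to follow-up
print(f"{candidates.size} anomaly candidates out of {len(catalog)} objects")
```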
Petascale Simulations of the Morphology and the Molecular Interface of Bulk Heterojunctions
Carrillo, Jan-Michael Y.; Seibers, Zach; Kumar, Rajeev; ...
2016-07-14
Understanding how additives interact and segregate within bulk heterojunction (BHJ) thin films is critical for exercising control over structure at multiple length scales and delivering improvements in photovoltaic performance. The morphological evolution of poly(3-hexylthiophene) (P3HT) and phenyl-C61-butyric acid methyl ester (PCBM) blends that are commensurate with the size of a BHJ thin film is examined using petascale coarse-grained molecular dynamics simulations. When comparing two-component and three-component systems containing short P3HT chains as additives undergoing thermal annealing, we demonstrate that the short chains alter the morphology in apparently useful ways: They efficiently migrate to the P3HT/PCBM interface, increasing the P3HT domain size and interfacial area. Simulation results agree with depth profiles determined from neutron reflectometry measurements that reveal PCBM enrichment near substrate and air interfaces, but a decrease in that PCBM enrichment when a small amount of short P3HT chains is integrated into the BHJ blend. Atomistic simulations of the P3HT/PCBM blend interfaces show a non-monotonic dependence of the interfacial thickness as a function of the number of repeat units in the oligomeric P3HT additive, and the thiophene rings orient parallel to the interfacial plane as they approach the PCBM domain. Using the nanoscale geometries of the P3HT oligomers, LUMO and HOMO energy levels calculated by density functional theory are found to be invariant across the donor/acceptor interface. Finally, these connections between additives, processing, and morphology at all length scales are generally useful for efforts to improve device performance.
Using 100G Network Technology in Support of Petascale Science
NASA Technical Reports Server (NTRS)
Gary, James P.
2011-01-01
NASA in collaboration with a number of partners conducted a set of individual experiments and demonstrations during SC10 that collectively were titled "Using 100G Network Technology in Support of Petascale Science". The partners included iCAIR, Internet2, LAC, MAX, National LambdaRail (NLR), NOAA and the SCinet Research Sandbox (SRS), as well as the vendors Ciena, Cisco, ColorChip, cPacket, Extreme Networks, Fusion-io, HP and Panduit, who most generously allowed some of their leading edge 40G/100G optical transport, Ethernet switch and Internet Protocol router equipment and file server technologies to be involved. The experiments and demonstrations featured different vendor-provided 40G/100G network technology solutions for full-duplex 40G and 100G LAN data flows across SRS-deployed single-node fiber-pairs among the Exhibit Booths of NASA, the National Center for Data Mining, NOAA and the SCinet Network Operations Center, as well as between the NASA Exhibit Booth in New Orleans and the StarLight communications exchange facility in Chicago across special SC10-only 80- and 100-Gbps wide area network links provisioned respectively by the NLR and Internet2, then on to GSFC across a 40-Gbps link provisioned by the Mid-Atlantic Crossroads. The networks and vendor equipment were load-stressed by sets of NASA/GSFC High End Computer Network Team-built, relatively inexpensive, net-test-workstations that are capable of demonstrating greater than 100-Gbps uni-directional nuttcp-enabled memory-to-memory data transfers, greater than 80-Gbps aggregate bidirectional memory-to-memory data transfers, and near 40-Gbps uni-directional disk-to-disk file copying. This paper will summarize the background context, key accomplishments and the significance of these experiments and demonstrations.
ERIC Educational Resources Information Center
Bodenmann, Guy; Shantinath, S. D.
2004-01-01
We describe a distress prevention training program for couples and three empirical studies that support its effectiveness. The program, Couples Coping Enhancement Training (CCET), is based both upon stress and coping theory and research on couples. In addition to traditional elements of couples programs (e.g., communication and problem-solving…
Design of the protoDUNE raw data management infrastructure
Fuess, S.; Illingworth, R.; Mengel, M.; ...
2017-10-01
The Deep Underground Neutrino Experiment (DUNE) will employ a set of Liquid Argon Time Projection Chambers (LArTPC) with a total mass of 40 kt as the main components of its Far Detector. In order to validate this technology and characterize the detector performance at full scale, an ambitious experimental program (called “protoDUNE”) has been initiated which includes a test of the large-scale prototypes for the single-phase and dual-phase LArTPC technologies, which will run in a beam at CERN. The total raw data volume that is slated to be collected during the scheduled 3-month beam run is estimated to be in excess of 2.5 PB for each detector. This data volume will require that the protoDUNE experiment carefully design the DAQ, data handling and data quality monitoring systems to be capable of dealing with challenges inherent with peta-scale data management while simultaneously fulfilling the requirements of disseminating the data to a worldwide collaboration and DUNE associated computing sites. In this paper, we present our approach to solving these problems by leveraging the design, expertise and components created for the LHC and Intensity Frontier experiments into a unified architecture that is capable of meeting the needs of protoDUNE.
Adapting Wave-front Algorithms to Efficiently Utilize Systems with Deep Communication Hierarchies
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kerbyson, Darren J.; Lang, Michael; Pakin, Scott
2011-09-30
Large-scale systems increasingly exhibit a differential between intra-chip and inter-chip communication performance, especially in hybrid systems using accelerators. Processor cores on the same socket are able to communicate at lower latencies, and with higher bandwidths, than cores on different sockets either within the same node or between nodes. A key challenge is to efficiently use this communication hierarchy and hence optimize performance. We consider here the class of applications that contains wavefront processing. In these applications data can only be processed after their upstream neighbors have been processed. Similar dependencies result between processors, in which communication is required to pass boundary data downstream and whose cost is typically impacted by the slowest communication channel in use. In this work we develop a novel hierarchical wave-front approach that reduces the use of slower communications in the hierarchy but at the cost of additional steps in the parallel computation and higher use of on-chip communications. This tradeoff is explored using a performance model. An implementation using the Reverse-acceleration programming model on the petascale Roadrunner system demonstrates a 27% performance improvement at full system-scale on a kernel application. The approach is generally applicable to large-scale multi-core and accelerated systems where a differential in system communication performance exists.
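The data dependence that defines wavefront processing is easy to state in code. The following single-process Python sketch (not the paper's Roadrunner implementation) shows the anti-diagonal ordering; cells on the same diagonal are mutually independent and are the units a parallel, hierarchical scheme would distribute across the communication hierarchy.

```python
# Single-process wavefront sketch: cell (i, j) may only be processed after its
# upstream neighbors (i-1, j) and (i, j-1). Anti-diagonals group the cells
# that could be updated concurrently in a parallel implementation, with
# boundary data passed downstream between diagonals.
import numpy as np

def wavefront_sweep(nx, ny, update):
    grid = np.zeros((nx, ny))
    for d in range(nx + ny - 1):                  # sweep anti-diagonals
        cells = [(i, d - i) for i in range(max(0, d - ny + 1), min(nx, d + 1))]
        for i, j in cells:                        # independent within a diagonal
            up = grid[i - 1, j] if i > 0 else 0.0
            left = grid[i, j - 1] if j > 0 else 0.0
            grid[i, j] = update(i, j, up, left)
    return grid

result = wavefront_sweep(6, 4, lambda i, j, up, left: up + left + 1.0)
print(result)
```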
Performance Engineering Research Institute SciDAC-2 Enabling Technologies Institute Final Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lucas, Robert
2013-04-20
Enhancing the performance of SciDAC applications on petascale systems had high priority within DOE SC at the start of the second phase of the SciDAC program, SciDAC-2, as it continues to do so today. Achieving expected levels of performance on high-end computing (HEC) systems is growing ever more challenging due to enormous scale, increasing architectural complexity, and increasing application complexity. To address these challenges, the University of Southern California's Information Sciences Institute organized the Performance Engineering Research Institute (PERI). PERI implemented a unified, tripartite research plan encompassing: (1) performance modeling and prediction; (2) automatic performance tuning; and (3) performance engineering of high-profile applications. Within PERI, USC's primary research activity was automatic tuning (autotuning) of scientific software. This activity was spurred by the strong user preference for automatic tools and was based on previous successful activities such as ATLAS, which automatically tuned components of the LAPACK linear algebra library, and other recent work on autotuning domain-specific libraries. Our other major component was application engagement, to which we devoted approximately 30% of our effort to work directly with SciDAC-2 applications. This report is a summary of the overall results of the USC PERI effort.
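Autotuning in the ATLAS sense reduces, at its core, to empirically timing kernel variants and keeping the winner. The toy Python example below illustrates that loop for a blocked matrix multiply; it is not PERI's tooling, and the kernel and candidate block sizes are arbitrary.

```python
# Toy autotuner: time one kernel over a small parameter space (the block size
# of a blocked matrix multiply) and report the fastest variant on this machine.
import time
import numpy as np

def blocked_matmul(A, B, bs):
    n = A.shape[0]
    C = np.zeros_like(A)
    for i in range(0, n, bs):
        for k in range(0, n, bs):
            for j in range(0, n, bs):
                C[i:i+bs, j:j+bs] += A[i:i+bs, k:k+bs] @ B[k:k+bs, j:j+bs]
    return C

def autotune(n=512, candidates=(32, 64, 128, 256)):
    A, B = np.random.rand(n, n), np.random.rand(n, n)
    timings = {}
    for bs in candidates:
        t0 = time.perf_counter()
        blocked_matmul(A, B, bs)
        timings[bs] = time.perf_counter() - t0
    return min(timings, key=timings.get), timings

best, timings = autotune()
print("best block size:", best, timings)
```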
Ellingson, Sally R; Dakshanamurthy, Sivanesan; Brown, Milton; Smith, Jeremy C; Baudry, Jerome
2014-04-25
In this paper we give the current state of high-throughput virtual screening. We describe a case study of using a task-parallel MPI (Message Passing Interface) version of Autodock4 [1], [2] to run a virtual high-throughput screen of one-million compounds on the Jaguar Cray XK6 Supercomputer at Oak Ridge National Laboratory. We include a description of scripts developed to increase the efficiency of the predocking file preparation and postdocking analysis. A detailed tutorial, scripts, and source code for this MPI version of Autodock4 are available online at http://www.bio.utk.edu/baudrylab/autodockmpi.htm.
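The task-parallel structure described above is a classic manager/worker farm. The sketch below uses mpi4py (not the authors' MPI wrapper around Autodock4), a placeholder dock() function, and hypothetical ligand file names; run with at least two MPI ranks.

```python
# Manager/worker task farm sketch: rank 0 hands out ligand names one at a
# time, workers "dock" them (placeholder) and return a score.
from mpi4py import MPI

TAG_WORK, TAG_DONE = 1, 2

def dock(ligand):
    return (ligand, -7.5)  # placeholder for an Autodock4 invocation

comm = MPI.COMM_WORLD
rank = comm.Get_rank()

if rank == 0:
    ligands = [f"ligand_{i:06d}.pdbqt" for i in range(1000)]  # hypothetical
    results, next_task = [], 0
    for worker in range(1, comm.Get_size()):          # prime each worker
        comm.send(ligands[next_task], dest=worker, tag=TAG_WORK)
        next_task += 1
    while len(results) < len(ligands):
        status = MPI.Status()
        results.append(comm.recv(source=MPI.ANY_SOURCE, tag=TAG_DONE, status=status))
        if next_task < len(ligands):                  # keep the worker busy
            comm.send(ligands[next_task], dest=status.Get_source(), tag=TAG_WORK)
            next_task += 1
        else:                                         # no work left: tell it to stop
            comm.send(None, dest=status.Get_source(), tag=TAG_WORK)
    print("best score:", min(results, key=lambda r: r[1]))
else:
    while True:
        task = comm.recv(source=0, tag=TAG_WORK)
        if task is None:
            break
        comm.send(dock(task), dest=0, tag=TAG_DONE)
```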
Schmied, Virginia; Myors, Karen; Wills, Jo; Cooke, Margaret
2002-01-01
This paper describes a pilot antenatal education program intended to better prepare couples for the early weeks of lifestyle changes and parenting. Eight weeks after birth, data were collected by questionnaire from 19 couples who participated in a pilot program and from 14 couples who were enrolled in a routine hospital program. Women in the pilot program were significantly more satisfied with their experience of parenthood. Facilitated gender-specific discussion groups formed a key strategy in the pilot program. PMID:17273305
Premarital screening for hemoglobinopathies: experience of a single center in Kurdistan, Iraq.
Al-Allawi, Nasir A S; Al-Doski, Adnan A S; Markous, Raji S D; Mohamad Amin, Khyria A K; Eissa, Adil A Z; Badi, Ameer I A; Asmaro, Rafal R H; Hamamy, Hanan
2015-01-01
A program for the prevention of major hemoglobinopathies was initiated in 2008 in the Kurdistan region of Iraq. This study reports on the achievements and challenges of the program. A total of 102,554 individuals (51,277 couples) visiting a premarital center between 2008 and 2012 were screened for carrier status of hemoglobinopathies, and at-risk couples were counseled. A total of 223 (4.3/1,000) couples were identified and counseled as high-risk couples. Available data on 198 high-risk couples indicated that 90.4% proceeded with their marriage plans, and 15% of these married couples decided to have prenatal diagnosis (PND) in subsequent pregnancies with the identification of 8 affected fetuses; all were terminated as chosen by the parents. Thirty affected births were recorded among the high-risk couples. The premarital program managed to reduce the affected birth rate of major hemoglobinopathies by 21.1%. Of the 136 affected babies born during the study period, 77.9% were born to couples married prior to the start of the program, while 22.1% were born to couples identified as having a high risk. The main reason for not taking the option of PND was unaffordable costs. Financial support would have increased opting for PND by high-risk couples. Further reduction in affected birth rates could be achieved by including parallel antenatal screening programs to cover those married before the initiation of the premarital program and improving the public health education and counseling programs. © 2015 S. Karger AG, Basel.
OPENING REMARKS: Scientific Discovery through Advanced Computing
NASA Astrophysics Data System (ADS)
Strayer, Michael
2006-01-01
Good morning. Welcome to SciDAC 2006 and Denver. I share greetings from the new Undersecretary for Energy, Ray Orbach. Five years ago SciDAC was launched as an experiment in computational science. The goal was to form partnerships among science applications, computer scientists, and applied mathematicians to take advantage of the potential of emerging terascale computers. This experiment has been a resounding success. SciDAC has emerged as a powerful concept for addressing some of the biggest challenges facing our world. As significant as these successes were, I believe there is also significance in the teams that achieved them. In addition to their scientific aims these teams have advanced the overall field of computational science and set the stage for even larger accomplishments as we look ahead to SciDAC-2. I am sure that many of you are expecting to hear about the results of our current solicitation for SciDAC-2. I’m afraid we are not quite ready to make that announcement. Decisions are still being made and we will announce the results later this summer. Nearly 250 unique proposals were received and evaluated, involving literally thousands of researchers, postdocs, and students. These collectively requested more than five times our expected budget. This response is a testament to the success of SciDAC in the community. In SciDAC-2 our budget has been increased to about $70 million for FY 2007 and our partnerships have expanded to include the Environment and National Security missions of the Department. The National Science Foundation has also joined as a partner. These new partnerships are expected to expand the application space of SciDAC, and broaden the impact and visibility of the program. We have, with our recent solicitation, expanded to turbulence, computational biology, and groundwater reactive modeling and simulation. We are currently talking with the Department’s applied energy programs about risk assessment, optimization of complex systems (such as the national and regional electricity grid), carbon sequestration, virtual engineering, and the nuclear fuel cycle. The successes of the first five years of SciDAC have demonstrated the power of using advanced computing to enable scientific discovery. One measure of this success could be found in the President’s State of the Union address in which President Bush identified ‘supercomputing’ as a major focus area of the American Competitiveness Initiative. Funds were provided in the FY 2007 President’s Budget request to increase the size of the NERSC-5 procurement to between 100 and 150 teraflops, to upgrade the LCF Cray XT3 at Oak Ridge to 250 teraflops, and to acquire a 100-teraflop IBM BlueGene/P to establish the Leadership Computing Facility at Argonne. We believe that we are on a path to establish a petascale computing resource for open science by 2009. We must develop software tools, packages, and libraries as well as the scientific application software that will scale to hundreds of thousands of processors. Computer scientists from universities and the DOE’s national laboratories will be asked to collaborate on the development of the critical system software components such as compilers, light-weight operating systems and file systems. Standing up these large machines will not be business as usual for ASCR.
We intend to develop a series of interconnected projects that identify cost, schedule, risks, and scope for the upgrades at the LCF at Oak Ridge, the establishment of the LCF at Argonne, and the development of the software to support these high-end computers. The critical first step in defining the scope of the project is to identify a set of early application codes for each leadership class computing facility. These codes will have access to the resources during the commissioning phase of the facility projects and will be part of the acceptance tests for the machines. Applications will be selected, in part, by breakthrough science, scalability, and ability to exercise key hardware and software components. Possible early applications might include climate models; studies of the magnetic properties of nanoparticles as they relate to ultra-high density storage media; the rational design of chemical catalysts, the modeling of combustion processes that will lead to cleaner burning coal, and fusion and astrophysics research. I have presented just a few of the challenges that we look forward to on the road to petascale computing. Our road to petascale science might be paraphrased by the quote from e e cummings, ‘somewhere I have never traveled, gladly beyond any experience . . .’
Doss, Brian D.; Cicila, Larisa N.; Georgia, Emily J.; Roddy, McKenzie K.; Nowlan, Kathryn M.; Benson, Lisa A.; Christensen, Andrew
2016-01-01
Objective Within the United States, one-third of married couples are distressed and almost half of first marriages (and more than half of unmarried cohabiting relationships) end in divorce/separation. Additionally, relationship distress has been linked to mental and physical health problems in partners and their children. Although couple therapy is effective in reducing relationship distress, it is utilized by less than one third of divorcing couples. Therefore, more accessible interventions for relationship distress are needed. Method This study tests the efficacy of the OurRelationship (OR) program, an eight-hour online program adapted from an empirically-based, in-person couple therapy. In the program, couples complete online activities and have four, 15-minute calls with project staff. Nationwide, 300 heterosexual couples (N = 600 participants) participated; couples were generally representative of the US in terms of race, ethnicity, and education. Couples were randomly assigned to begin the program immediately or to a two month waitlist control group. Results Compared to the waitlist group, intervention couples reported significant improvements in relationship satisfaction (Cohen’s d=0.69), relationship confidence (d=0.47), and negative relationship quality (d=0.57). Additionally, couples reported significant improvements in multiple domains of individual functioning, especially when individuals began the program with difficulties in that domain: depressive (d=0.71) and anxious symptoms (d=0.94), perceived health (d=0.51), work functioning (d=0.57), and quality of life (d=0.44). Conclusions In a nationally-representative sample of couples, the OR program was effective in significantly improving both relationship and individual functioning, suggesting it can substantially increase the reach of current interventions through its low-cost, web-based format. PMID:26999504
Extreme scale multi-physics simulations of the tsunamigenic 2004 Sumatra megathrust earthquake
NASA Astrophysics Data System (ADS)
Ulrich, T.; Gabriel, A. A.; Madden, E. H.; Wollherr, S.; Uphoff, C.; Rettenberger, S.; Bader, M.
2017-12-01
SeisSol (www.seissol.org) is an open-source software package based on an arbitrary high-order derivative Discontinuous Galerkin method (ADER-DG). It solves spontaneous dynamic rupture propagation on pre-existing fault interfaces according to non-linear friction laws, coupled to seismic wave propagation with high-order accuracy in space and time (minimal dispersion errors). SeisSol exploits unstructured meshes to account for complex geometries, e.g. high resolution topography and bathymetry, 3D subsurface structure, and fault networks. We present the largest (1500 km of faults) and longest (500 s) dynamic rupture simulation to date, modeling the 2004 Sumatra-Andaman earthquake. We demonstrate the need for end-to-end optimization and petascale performance of scientific software to realize realistic simulations on the extreme scales of subduction zone earthquakes: considering the full complexity of subduction zone geometries leads inevitably to huge differences in element sizes. The main code improvements include a cache-aware wave propagation scheme and optimizations of the dynamic rupture kernels using code generation. In addition, a novel clustered local-time-stepping scheme for dynamic rupture has been established. Finally, asynchronous output has been implemented to overlap I/O and compute time. We resolve the frictional sliding process on the curved megathrust and a system of splay faults, as well as the seismic wave field and seafloor displacement with frequency content up to 2.2 Hz. We validate the scenario by geodetic, seismological and tsunami observations. The resulting rupture dynamics shed new light on the activation and importance of splay faults.
Jiang, Wei; Luo, Yun; Maragliano, Luca; Roux, Benoît
2012-11-13
An extremely scalable computational strategy is described for calculations of the potential of mean force (PMF) in multidimensions on massively distributed supercomputers. The approach involves coupling thousands of umbrella sampling (US) simulation windows distributed to cover the space of order parameters with a Hamiltonian molecular dynamics replica-exchange (H-REMD) algorithm to enhance the sampling of each simulation. In the present application, US/H-REMD is carried out in a two-dimensional (2D) space and exchanges are attempted alternately along the two axes corresponding to the two order parameters. The US/H-REMD strategy is implemented on the basis of a parallel/parallel multiple copy protocol at the MPI level, and therefore can fully exploit the computing power of large-scale supercomputers. Here the novel technique is illustrated using the leadership supercomputer IBM Blue Gene/P with an application to a typical biomolecular calculation of general interest, namely the binding of calcium ions to the small protein Calbindin D9k. The free energy landscape associated with two order parameters, the distance between the ion and its binding pocket and the root-mean-square deviation (rmsd) of the binding pocket relative to the crystal structure, was calculated using the US/H-REMD method. The results are then used to estimate the absolute binding free energy of calcium ion to Calbindin D9k. The tests demonstrate that the 2D US/H-REMD scheme greatly accelerates the configurational sampling of the binding pocket, thereby improving the convergence of the potential of mean force calculation.
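The exchange step at the heart of US/H-REMD can be illustrated with a small, serial Python sketch. It applies the Metropolis criterion to swap configurations between neighboring umbrella windows along one order parameter and alternates even/odd pairs, in the spirit of the alternating-axis attempts described above; the force constant, temperature, and window spacing are illustrative assumptions, and the actual implementation runs the MD and exchanges in parallel at the MPI level.

```python
# Serial illustration of the window-exchange step only (no MD is run here):
# swap configurations of two neighboring umbrella windows with the Metropolis
# probability computed from the harmonic biasing potentials.
import math
import random

BETA = 1.0 / (0.001987 * 300.0)   # 1/(kcal/mol) at 300 K, illustrative

def bias(x, center, k=10.0):
    """Harmonic umbrella restraint on the order parameter x."""
    return 0.5 * k * (x - center) ** 2

def try_exchange(x_i, center_i, x_j, center_j):
    """Return the (possibly swapped) order-parameter values of two windows."""
    delta = BETA * ((bias(x_j, center_i) + bias(x_i, center_j))
                    - (bias(x_i, center_i) + bias(x_j, center_j)))
    if delta <= 0.0 or random.random() < math.exp(-delta):
        return x_j, x_i          # accept: windows swap configurations
    return x_i, x_j              # reject

# Alternate exchange attempts between even and odd neighbor pairs, analogous
# to alternating attempts along the two axes of the 2D scheme.
centers = [1.0 + 0.5 * w for w in range(8)]
coords = [c + random.gauss(0.0, 0.1) for c in centers]
for sweep in range(100):
    for w in range(sweep % 2, len(centers) - 1, 2):
        coords[w], coords[w + 1] = try_exchange(coords[w], centers[w],
                                                coords[w + 1], centers[w + 1])
print(coords)
```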
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lichtner, Peter C.; Hammond, Glenn E.
Evolution of a hexavalent uranium [U(VI)] plume at the Hanford 300 Area bordering the Columbia River is investigated to evaluate the roles of labile and nonlabile forms of U(VI) on the longevity of the plume. A high fidelity, three-dimensional, field-scale, reactive flow and transport model is used to represent the system. The Richards equation, coupled to multicomponent reactive transport equations, is solved for times up to 100 years, taking into account rapid fluctuations in the Columbia River stage resulting in pulse releases of U(VI) into the river. The peta-scale computer code PFLOTRAN developed under a DOE SciDAC-2 project is employed in the simulations and executed on ORNL's Cray XT5 supercomputer Jaguar. Labile U(VI) is represented in the model through surface complexation reactions and its nonlabile form through dissolution of metatorbernite used as a surrogate mineral. Initial conditions are constructed corresponding to the U(VI) plume already in place to avoid uncertainties associated with the lack of historical data for the waste stream. The cumulative U(VI) flux into the river is compared for cases of equilibrium and multirate sorption models and for no sorption. The sensitivity of the U(VI) flux into the river on the initial plume configuration is investigated. The presence of nonlabile U(VI) was found to be essential in explaining the longevity of the U(VI) plume and the prolonged high U(VI) concentrations at the site exceeding the EPA MCL for uranium.
SequenceL: Automated Parallel Algorithms Derived from CSP-NT Computational Laws
NASA Technical Reports Server (NTRS)
Cooke, Daniel; Rushton, Nelson
2013-01-01
With the introduction of new parallel architectures like the cell and multicore chips from IBM, Intel, AMD, and ARM, as well as the petascale processing available for high-end computing, a larger number of programmers will need to write parallel codes. Adding the parallel control structure to the sequence, selection, and iterative control constructs increases the complexity of code development, which often results in increased development costs and decreased reliability. SequenceL is a high-level programming language, that is, a programming language that is closer to a human's way of thinking than to a machine's. Historically, high-level languages have resulted in decreased development costs and increased reliability, at the expense of performance. In recent applications at JSC and in industry, SequenceL has demonstrated the usual advantages of high-level programming in terms of low cost and high reliability. SequenceL programs, however, have run at speeds typically comparable with, and in many cases faster than, their counterparts written in C and C++ when run on single-core processors. Moreover, SequenceL is able to generate parallel executables automatically for multicore hardware, gaining parallel speedups without any extra effort from the programmer beyond what is required to write the sequential/single-core code. A SequenceL-to-C++ translator has been developed that automatically renders readable multithreaded C++ from a combination of a SequenceL program and sample data input. The SequenceL language is based on two fundamental computational laws, Consume-Simplify-Produce (CSP) and Normalize-Transpose (NT), which enable it to automate the creation of parallel algorithms from high-level code that has no annotations of parallelism whatsoever. In our anecdotal experience, SequenceL development has been in every case less costly than development of the same algorithm in sequential (that is, single-core, single process) C or C++, and an order of magnitude less costly than development of comparable parallel code. Moreover, SequenceL not only automatically parallelizes the code, but since it is based on CSP-NT, it is provably race free, thus eliminating the largest quality challenge the parallelized software developer faces.
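SequenceL's Normalize-Transpose semantics have no direct Python equivalent, but the visible effect, a scalar-level function applied implicitly and in parallel across a whole sequence with no parallel annotations in the user code, can be roughly sketched as follows; this is an analogy only, not SequenceL.

```python
# Analogy only: the function is written at the scalar level, and the runtime
# (here a process pool) supplies the data-parallel application over a sequence.
from multiprocessing import Pool

def f(x):
    return x * x + 1          # scalar-level definition, no parallel constructs

if __name__ == "__main__":
    data = list(range(1_000_000))
    with Pool() as pool:
        result = pool.map(f, data, chunksize=10_000)
    print(result[:5])
```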
Planning for Pre-Exascale Platform Environment (Fiscal Year 2015 Level 2 Milestone 5216)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Springmeyer, R.; Lang, M.; Noe, J.
This Plan for ASC Pre-Exascale Platform Environments document constitutes the deliverable for the fiscal year 2015 (FY15) Advanced Simulation and Computing (ASC) Program Level 2 milestone Planning for Pre-Exascale Platform Environment. It acknowledges and quantifies challenges and recognized gaps for moving the ASC Program towards effective use of exascale platforms and recommends strategies to address these gaps. This document also presents an update to the concerns, strategies, and plans presented in the FY08 predecessor document that dealt with the upcoming (at the time) petascale high performance computing (HPC) platforms. With the looming push towards exascale systems, a review of the earlier document was appropriate in light of the myriad architectural choices currently under consideration. The ASC Program believes the platforms to be fielded in the 2020s will be fundamentally different systems that stress ASC’s ability to modify codes to take full advantage of new or unique features. In addition, the scale of components will increase the difficulty of maintaining an error-free system, thus driving new approaches to resilience and error detection/correction. The code revamps of the past, from serial- to vector-centric code to distributed memory to threaded implementations, will be revisited as codes adapt to a new message passing interface (MPI) plus “x” or more advanced and dynamic programming models based on architectural specifics. Development efforts are already underway in some cases, and more difficult or uncertain aspects of the new architectures will require research and analysis that may inform future directions for program choices. In addition, the potential diversity of system architectures may require parallel if not duplicative efforts to analyze and modify environments, codes, subsystems, libraries, debugging tools, and performance analysis techniques as well as exploring new monitoring methodologies. It is difficult if not impossible to selectively eliminate some of these activities until more information is available through simulations of potential architectures, analysis of systems designs, and informed study of commodity technologies that will be the constituent parts of future platforms.
Defense Science Board Report on Advanced Computing
2009-03-01
computers will require extensive research and development to have a chance of reaching the exascale level. Even if exascale level machines can...generations of petascale and then exascale level computing capability. This includes both the hardware and the complex software that may be...required for the architectures needed for exascale capability. The challenges are extremely daunting, especially at the exascale
ERIC Educational Resources Information Center
Roiger, Trevor
2009-01-01
Some research exists relative to the personnel relationship between athletic training education programs (ATEPs) and intercollegiate athletic departments, yet little research has examined program directors' general perceptions of coupling or coupling related to the Commission on Accreditation of Athletic Training Education (CAATE) standards of…
An Evaluation of a Program to Help Dual-Earner Couples Share the Second Shift.
ERIC Educational Resources Information Center
Hawkins, Alan J.; And Others
1994-01-01
Used both traditional scientific and feminist methodologies to evaluate effectiveness of family life education program designed to help dual-earner couples (n=14 couples) share domestic labor. Both quantitative and qualitative data suggest that program produced small increases in husbands' involvement in both housework and child care and large…
Bradford, Angela B; Hawkins, Alan J; Acker, Jennifer
2015-12-01
Over the past decade, public funding for Couple and Relationship Education programs has expanded. As program administrators have been able to extend their reach to low-income individuals and couples using this support, it has become apparent that greater numbers of relationally distressed couples are attending classes than previously anticipated. Because psychoeducational programs for couples have traditionally served less distressed couples, this dynamic highlights the need to examine the policy and practice implications of more distressed couples accessing these services. This paper reviews some of the most immediate issues, including screening for domestic violence and couple needs, pedagogical considerations, and the potential integration of therapy and education services. We also make suggestions for future research that can inform policy and practice efforts. © 2015 Family Process Institute.
Whitton, Sarah W; Weitbrecht, Eliza M; Kuryluk, Amanda D; Hutsell, David W
2016-09-01
Relationship education, effective in improving relationship quality among different-sex couples, represents a promising and nonstigmatizing approach to promoting the health and stability of same-sex couples. A new culturally sensitive relationship education program was developed specifically for male same-sex couples, which includes adaptations of evidence-based strategies to build core relationship skills (e.g., communication skills training) and newly developed content to address unique challenges faced by this group (e.g., discrimination; low social support). A small randomized waitlist-control trial (N = 20 couples) was conducted to evaluate the program. To assess program efficacy, dyadic longitudinal data (collected at pre- and postprogram and 3-month follow-up) were analyzed using multilevel models that accounted for nonindependence in data from indistinguishable dyads. Results indicated significant program effects in comparison to waitlist controls on couple constructive and destructive communication, perceived stress, and relationship satisfaction. Gains in each of these areas were maintained at 3-month follow-up. Although there was no evidence of within-person program effects on social support, satisfaction, or relationship instability immediately postprogram, all 3 showed within-person improvements by follow-up. Ratings of program satisfaction were high. In summary, study findings support the feasibility, acceptability, and initial efficacy of the program and highlight the potential value of culturally sensitive adaptations of relationship education for same-sex couples. (PsycINFO Database Record (c) 2016 APA, all rights reserved).
Social marketing program sales.
1987-01-01
This table presents data on social marketing program sales for projects that provide more than 5000 couple-years of protection. Cited are social marketing programs in Bangladesh, Costa Rica, Egypt, El Salvador, Guatemala, Honduras, India, Indonesia, Jamaica, Nepal, Pakistan, Peru, and Sri Lanka. Included in the table are data on program funding, product sales (generally condoms, pills, and foaming tablets), and couple-years of protection provided. Among the social marketing programs reporting particularly high couple-years of protection levels are the Bangladesh Family Planning Social Marketing Program (1,165,100), the Egyptian Family Planning Association's Family for the Future Program (732,200), India's Nirodh Marketing Program (2,225,000), and Pakistan's Social Marketing Contraceptive Program (280,000).
Halford, W Kim; Petch, Jemima; Creedy, Debra K
2010-03-01
The transition to parenthood is often associated with a decline in couple relationship adjustment. Couples (n = 71) expecting their first child were randomly assigned to either: (a) Becoming a Parent (BAP), a maternal parenting education program; or (b) Couple CARE for Parents (CCP), a couple relationship and parenting education program. Couples were assessed pre-intervention (last trimester of pregnancy), post-intervention (5 months postpartum), and follow-up (12 months postpartum). Relative to BAP, CCP reduced negative couple communication from pre- to post-intervention, and prevented erosion of relationship adjustment and self-regulation in women but not men from pre-intervention to follow-up. Mean parenting stress reflected positive adjustment to parenthood with no differences between BAP and CCP. CCP shows promise as a brief program that can enhance couple communication and women's adjustment to parenthood.
Probabilistic Photometric Redshifts in the Era of Petascale Astronomy
DOE Office of Scientific and Technical Information (OSTI.GOV)
Carrasco Kind, Matias
2014-01-01
With the growth of large photometric surveys, accurately estimating photometric redshifts, preferably as a probability density function (PDF), and fully understanding the implicit systematic uncertainties in this process has become increasingly important. These surveys are expected to obtain images of billions of distinct galaxies. As a result, storing and analyzing all of these photometric redshift PDFs will be non-trivial, and this challenge becomes even more severe if a survey plans to compute and store multiple different PDFs. In this thesis, we have developed an end-to-end framework that will compute accurate and robust photometric redshift PDFs for massive data sets by using two new, state-of-the-art machine learning techniques that are based on a random forest and a random atlas, respectively. By using data from several photometric surveys, we demonstrate the applicability of these new techniques, and we demonstrate that our new approach is among the best techniques currently available. We also show how different techniques can be combined by using novel Bayesian techniques to improve the photometric redshift precision to unprecedented levels while also presenting new approaches to better identify outliers. In addition, our framework provides supplementary information regarding the data being analyzed, including unbiased estimates of the accuracy of the technique without resorting to a validation data set, identification of poor photometric redshift areas within the parameter space occupied by the spectroscopic training data, and a quantification of the relative importance of the variables used during the estimation process. Furthermore, we present a new approach to represent and store photometric redshift PDFs by using a sparse representation with outstanding compression and reconstruction capabilities. We also demonstrate how this framework can be directly incorporated into cosmological analyses. The new techniques presented in this thesis are crucial to enable the development of precision cosmology in the era of petascale astronomical surveys.
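A heavily simplified illustration of the random-forest half of this approach is sketched below using scikit-learn rather than the thesis's purpose-built codes; the synthetic colors, the training split, and the use of per-tree scatter as a stand-in for a redshift PDF are all assumptions made for demonstration.

```python
# Illustration only: train a tree ensemble on photometric colors with
# spectroscopic redshifts as labels, then use the spread of the per-tree
# predictions as a crude per-object uncertainty.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(42)
n = 20_000
colors = rng.normal(size=(n, 4))                       # fake u-g, g-r, r-i, i-z
z_spec = np.abs(0.4 + 0.2 * colors[:, 1] + 0.05 * rng.normal(size=n))

model = RandomForestRegressor(n_estimators=200, n_jobs=-1, random_state=0)
model.fit(colors[:15_000], z_spec[:15_000])

test = colors[15_000:]
per_tree = np.stack([t.predict(test) for t in model.estimators_])  # (trees, objects)
z_mean, z_sigma = per_tree.mean(axis=0), per_tree.std(axis=0)
print("first object: z =", round(z_mean[0], 3), "+/-", round(z_sigma[0], 3))
```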
PoPLAR: Portal for Petascale Lifescience Applications and Research
2013-01-01
Background We are focusing specifically on fast data analysis and retrieval in bioinformatics that will have a direct impact on the quality of human health and the environment. The exponential growth of data generated in biology research, from small atoms to big ecosystems, necessitates an increasingly large computational component to perform analyses. Novel DNA sequencing technologies and complementary high-throughput approaches, such as proteomics, genomics, metabolomics, and meta-genomics, drive data-intensive bioinformatics. While individual research centers or universities could once provide for these applications, this is no longer the case. Today, only specialized national centers can deliver the level of computing resources required to meet the challenges posed by rapid data growth and the resulting computational demand. Consequently, we are developing massively parallel applications to analyze the growing flood of biological data and contribute to the rapid discovery of novel knowledge. Methods The efforts of previous National Science Foundation (NSF) projects provided for the generation of parallel modules for widely used bioinformatics applications on the Kraken supercomputer. We have profiled and optimized the code of some of the scientific community's most widely used desktop and small-cluster-based applications, including BLAST from the National Center for Biotechnology Information (NCBI), HMMER, and MUSCLE; scaled them to tens of thousands of cores on high-performance computing (HPC) architectures; made them robust and portable to next-generation architectures; and incorporated these parallel applications in science gateways with a web-based portal. Results This paper will discuss the various developmental stages, challenges, and solutions involved in taking bioinformatics applications from the desktop to petascale with a front-end portal for very-large-scale data analysis in the life sciences. Conclusions This research will help to bridge the gap between the rate of data generation and the speed at which scientists can study this data. The ability to rapidly analyze data at such a large scale is having a significant, direct impact on science achieved by collaborators who are currently using these tools on supercomputers. PMID:23902523
DOE Office of Scientific and Technical Information (OSTI.GOV)
Madduri, Kamesh; Im, Eun-Jin; Ibrahim, Khaled Z.
The next decade of high-performance computing (HPC) systems will see a rapid evolution and divergence of multi- and manycore architectures as power and cooling constraints limit increases in microprocessor clock speeds. Understanding efficient optimization methodologies on diverse multicore designs in the context of demanding numerical methods is one of the greatest challenges faced today by the HPC community. In this paper, we examine the efficient multicore optimization of GTC, a petascale gyrokinetic toroidal fusion code for studying plasma microturbulence in tokamak devices. For GTC’s key computational components (charge deposition and particle push), we explore efficient parallelization strategies across a broad range of emerging multicore designs, including the recently-released Intel Nehalem-EX, the AMD Opteron Istanbul, and the highly multithreaded Sun UltraSparc T2+. We also present the first study on tuning gyrokinetic particle-in-cell (PIC) algorithms for graphics processors, using the NVIDIA C2050 (Fermi). Our work discusses several novel optimization approaches for gyrokinetic PIC, including mixed-precision computation, particle binning and decomposition strategies, grid replication, SIMDized atomic floating-point operations, and effective GPU texture memory utilization. Overall, we achieve significant performance improvements of 1.3–4.7× on these complex PIC kernels, despite the inherent challenges of data dependency and locality. Finally, our work also points to several architectural and programming features that could significantly enhance PIC performance and productivity on next-generation architectures.
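Charge deposition is the scatter-type kernel that makes PIC hard to parallelize: many particles update the same grid cells. The toy NumPy sketch below (not GTC) shows linear-weight deposition with the particles sorted into cell bins first, a simplified, serial stand-in for the binning optimization discussed above.

```python
# Toy 1-D charge deposition: each particle scatters its charge to its two
# neighboring grid points (linear weighting). Sorting particles by cell first
# improves memory locality, which is the point of particle binning.
import numpy as np

def deposit(positions, charges, n_grid, length=1.0):
    dx = length / n_grid
    rho = np.zeros(n_grid)
    cell = np.floor(positions / dx).astype(int) % n_grid
    frac = positions / dx - np.floor(positions / dx)
    order = np.argsort(cell, kind="stable")     # binning: process in cell order
    c, f, q = cell[order], frac[order], charges[order]
    np.add.at(rho, c, q * (1.0 - f))            # scatter to left grid point
    np.add.at(rho, (c + 1) % n_grid, q * f)     # scatter to right grid point
    return rho / dx

rng = np.random.default_rng(1)
rho = deposit(rng.random(100_000), np.ones(100_000), n_grid=128)
print(rho.sum() * (1.0 / 128))   # total deposited charge equals the particle count
```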
Kim, Soo-Yeon; Kang, Hye-Won; Chung, Yong-Chul; Park, Seungha
2013-01-01
In the field of marital therapy, it is known that a couple movement program helps married couples faced with conflict situations to rebuild the relationship and to maintain family homeostasis. The purpose of this study was to configure and apply a kinesthetic empathy program and to assess its effectiveness for married couples in conflict. To achieve the research aims, a qualitative research method was used, with three couples (six people) who participated in the expressive movement program for this study. The study used the focus group interview method for collecting data and employed an interview format mixing semi-structured and unstructured questionnaires. The results were the following. First, through the kinesthetic empathy enhancing program, one could develop self-awareness and emotional attunement. Second, the results showed the relationship between intention and empathy: “knowing the spouse’s hidden intention” is a significant factor in understanding others. Third, the kinesthetic empathy program could complement general marriage counseling programs. The results of this study provide empirical evidence that the movement program functions as an empathy enhancer through the process of perceiving, feeling, thinking, and interacting with others. PMID:24278896
Supercomputing Sheds Light on the Dark Universe
DOE Office of Scientific and Technical Information (OSTI.GOV)
Habib, Salman; Heitmann, Katrin
2012-11-15
At Argonne National Laboratory, scientists are using supercomputers to shed light on one of the great mysteries in science today, the Dark Universe. With Mira, a petascale supercomputer at the Argonne Leadership Computing Facility, a team led by physicists Salman Habib and Katrin Heitmann will run the largest, most complex simulation of the universe ever attempted. By contrasting the results from Mira with state-of-the-art telescope surveys, the scientists hope to gain new insights into the distribution of matter in the universe, advancing future investigations of dark energy and dark matter into a new realm. The team's research was named a finalist for the 2012 Gordon Bell Prize, an award recognizing outstanding achievement in high-performance computing.
Coupled rotor/airframe vibration analysis program manual. Volume 2: Sample input and output listings
NASA Technical Reports Server (NTRS)
Cassarino, S.; Sopher, R.
1982-01-01
Sample input and output listings obtained with the base program (SIMVIB) of the coupled rotor/airframe vibration analysis and the external programs, G400/F389 and E927 are presented. Results for five of the base program test cases are shown. They represent different applications of the SIMVIB program to study the vibration characteristics of various dynamic configurations. Input and output listings obtained for one cycle of the G400/F389 coupled program are presented. Results from the rotor aeroelastic analysis E927 also appear. A brief description of the check cases is provided. A summary of the check cases for all the external programs interacting with the SIMVIB program is illustrated.
The Numerical Technique for the Landslide Tsunami Simulations Based on Navier-Stokes Equations
NASA Astrophysics Data System (ADS)
Kozelkov, A. S.
2017-12-01
The paper presents an integral technique simulating all phases of a landslide-driven tsunami. The technique is based on the numerical solution of the system of Navier-Stokes equations for multiphase flows. The numerical algorithm uses a fully implicit approximation method, in which the equations of continuity and momentum conservation are coupled through implicit summands of pressure gradient and mass flow. The method we propose removes severe restrictions on the time step and allows simulation of tsunami propagation to arbitrarily large distances. The landslide origin is simulated as an individual phase being a Newtonian fluid with its own density and viscosity and separated from the water and air phases by an interface. The basic formulas of equation discretization and expressions for coefficients are presented, and the main steps of the computation procedure are described in the paper. To enable simulations of tsunami propagation across wide water areas, we propose a parallel algorithm of the technique implementation, which employs an algebraic multigrid method. The implementation of the multigrid method is based on the global level and cascade collection algorithms that impose no limitations on the paralleling scale and make this technique applicable to petascale systems. We demonstrate the possibility of simulating all phases of a landslide-driven tsunami, including its generation, propagation and uprush. The technique has been verified against the problems supported by experimental data. The paper describes the mechanism of incorporating bathymetric data to simulate tsunamis in real water areas of the world ocean. Results of comparison with the nonlinear dispersion theory, which has demonstrated good agreement, are presented for the case of a historical tsunami of volcanic origin on the Montserrat Island in the Caribbean Sea.
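The algebraic multigrid solver is the piece that keeps the fully implicit pressure-velocity coupling scalable. As a hedged, single-process illustration of the underlying idea (not the paper's parallel cascade or global-level algorithms), a two-grid cycle for a 1-D Poisson problem looks like this:

```python
# Compact, serial two-grid sketch for -u'' = f with zero Dirichlet boundaries:
# damped-Jacobi smoothing, full-weighting restriction, a direct coarse solve,
# and linear-interpolation prolongation. Illustrative only.
import numpy as np

def jacobi(u, f, h, sweeps=3, omega=2.0 / 3.0):
    for _ in range(sweeps):
        u[1:-1] += omega * 0.5 * (h**2 * f[1:-1] + u[:-2] + u[2:] - 2.0 * u[1:-1])
    return u

def residual(u, f, h):
    r = np.zeros_like(u)
    r[1:-1] = f[1:-1] - (2.0 * u[1:-1] - u[:-2] - u[2:]) / h**2
    return r

def two_grid(u, f, h):
    u = jacobi(u, f, h)                               # pre-smooth
    r = residual(u, f, h)
    rc = np.zeros((len(u) + 1) // 2)                  # full-weighting restriction
    rc[1:-1] = 0.25 * r[1:-2:2] + 0.5 * r[2:-1:2] + 0.25 * r[3::2]
    hc, m = 2.0 * h, len(rc)
    A = (np.diag(np.full(m - 2, 2.0)) + np.diag(np.full(m - 3, -1.0), 1)
         + np.diag(np.full(m - 3, -1.0), -1)) / hc**2
    ec = np.zeros(m)
    ec[1:-1] = np.linalg.solve(A, rc[1:-1])           # exact coarse correction
    e = np.zeros_like(u)
    e[::2] = ec                                       # prolong to the fine grid
    e[1:-1:2] = 0.5 * (e[:-2:2] + e[2::2])
    return jacobi(u + e, f, h)                        # post-smooth

n, h = 129, 1.0 / 128
x = np.linspace(0.0, 1.0, n)
f = np.pi**2 * np.sin(np.pi * x)
u = np.zeros(n)
for _ in range(10):
    u = two_grid(u, f, h)
print("max error vs exact sin(pi*x):", float(np.abs(u - np.sin(np.pi * x)).max()))
```

A production solver recurses this correction over many levels and parallelizes both the smoothing and the coarse-grid work, which is where the cascade collection algorithm mentioned above comes in.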
ogs6 - a new concept for porous-fractured media simulations
NASA Astrophysics Data System (ADS)
Naumov, Dmitri; Bilke, Lars; Fischer, Thomas; Rink, Karsten; Wang, Wenqing; Watanabe, Norihiro; Kolditz, Olaf
2015-04-01
OpenGeoSys (OGS) is a scientific open-source initiative for numerical simulation of thermo-hydro-mechanical/chemical (THMC) processes in porous and fractured media, continuously developed since the mid-eighties. The basic concept is to provide a flexible numerical framework for solving coupled multi-field problems. OGS mainly targets applications in environmental geoscience, e.g. in the fields of contaminant hydrology, water resources management, waste deposits, or geothermal energy systems, but it has also been successfully applied to new topics in energy storage recently. OGS actively participates in several international benchmarking initiatives, e.g. DECOVALEX (waste management), CO2BENCH (CO2 storage and sequestration), SeSBENCH (reactive transport processes) and HM-Intercomp (coupled hydrosystems). Despite the broad applicability of OGS in the geo-, hydro- and energy-sciences, several shortcomings became obvious concerning computational efficiency, and the code structure became too complex for further efficient development. OGS-5 was designed for object-oriented FEM applications. However, in many multi-field problems a certain flexibility of tailored numerical schemes is essential. Therefore, a new concept was designed to overcome existing bottlenecks. The paradigms for ogs6 are: flexibility of numerical schemes (FEM, FVM, FDM); computational efficiency (petascale-ready); and developer- and user-friendliness. ogs6 has a module-oriented architecture based on thematic libraries (e.g. MeshLib, NumLib) on the large scale and uses an object-oriented approach for the small-scale interfaces. Usage of a linear algebra library (Eigen3) for the mathematical operations together with the ISO C++11 standard increases the expressiveness of the code and makes it more developer-friendly. The new C++ standard also makes the template metaprogramming code used for compile-time optimizations more compact. We have transitioned the main code development to the GitHub code hosting system (https://github.com/ufz/ogs). The very flexible revision control system Git, in combination with issue tracking, developer feedback and code review options, improves the code quality and the development process in general. The continuous testing procedure of the benchmarks as it was established for OGS-5 is maintained. Additionally, unit testing, which is automatically triggered by any code changes, is executed by two continuous integration frameworks (Jenkins CI, Travis CI) which build and test the code on different operating systems (Windows, Linux, Mac OS), in multiple configurations and with different compilers (GCC, Clang, Visual Studio). To improve the testing possibilities further, XML-based file input formats are introduced, helping with automatic validation of user-contributed benchmarks. The first ogs6 prototype, version 6.0.1, has been implemented for solving generic elliptic problems. Next steps are envisaged for transient, non-linear and coupled problems. Literature: [1] Kolditz O, Shao H, Wang W, Bauer S (eds) (2014): Thermo-Hydro-Mechanical-Chemical Processes in Fractured Porous Media: Modelling and Benchmarking - Closed Form Solutions. In: Terrestrial Environmental Sciences, Vol. 1, Springer, Heidelberg, ISBN 978-3-319-11893-2, 315pp.
http://www.springer.com/earth+sciences+and+geography/geology/book/978-3-319-11893-2 [2] Naumov D (2015): Computational Fluid Dynamics in Unconsolidated Sediments: Model Generation and Discrete Flow Simulations, PhD thesis, Technische Universität Dresden.
Stepfamily Enrichment Program: A Preventive Intervention for Remarried Couples
ERIC Educational Resources Information Center
Michaels, Marcia L.
2006-01-01
The Stepfamily Enrichment Program is a multi-couple group intervention intended to help stepfamilies successfully negotiate the early stages of family formation. Theory, research, and clinical findings were integrated in this intervention designed specifically for remarried couples. Emphasis is placed on strengthening and improving family…
Falconier, Mariana K
2015-04-01
The accumulated knowledge about the negative impact of financial strain on couples' relationship functioning and the magnitude of the latest economic downturn have brought together the fields of financial counseling and couples therapy. This article describes the development of a new interdisciplinary program that aims to help couples under financial strain improve their financial management, communication, and dyadic coping skills. The article also reports the results of its initial pilot testing, with data collected from 18 financially distressed couples before and after participation in the program and 3 months later. Results from repeated-measures ANOVAs suggest that the program may help reduce both partners' financial strain and males' negative communication, and improve both partners' financial management skills and strategies for coping together with financial strain, as well as males' relationship satisfaction. These findings, together with the high satisfaction reported by participants regarding the structure and content of the sessions and homework, suggest that this program may be a promising approach to helping couples experiencing financial strain. Gender differences, clinical implications, and possibilities for further research are also discussed. © 2014 American Association for Marriage and Family Therapy.
Development of a computer code to couple PWR-GALE output and PC-CREAM input
NASA Astrophysics Data System (ADS)
Kuntjoro, S.; Budi Setiawan, M.; Nursinta Adi, W.; Deswandri; Sunaryo, G. R.
2018-02-01
Radionuclide dispersion analysis is an important part of reactor safety analysis. From this analysis, the doses received by radiation workers and by communities around the nuclear reactor can be obtained. The radionuclide dispersion analysis under normal operating conditions is carried out using the PC-CREAM code, which requires input data such as the source term and the population distribution. The input data are derived from the output of another program, PWR-GALE, and from population distribution data written in a specific format. Compiling inputs for PC-CREAM manually requires high accuracy, as it involves large amounts of data in prescribed formats, and manual preparation frequently introduces errors. To minimize such errors, we developed a coupling program for PWR-GALE and PC-CREAM, along with a program for writing population distribution data in the PC-CREAM input format. This work was therefore conducted to create the coupling between PWR-GALE output and PC-CREAM input and a program to write population data in the required formats. The programming was done in the Python language, which has the advantages of being multiplatform, object-oriented and interactive. The result of this work is software that couples the source-term data and writes the population distribution data, so that inputs to the PC-CREAM program can be prepared easily and formatting errors are avoided. The source-term coupling program for PWR-GALE and PC-CREAM has been completed, so that PC-CREAM inputs for the source term and the population distribution can be created easily and in the desired format.
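A minimal sketch of the coupling idea in Python is given below; the file names, the line pattern assumed for the PWR-GALE listing, and the fixed-width output layout are hypothetical placeholders, not the actual PWR-GALE or PC-CREAM formats.

import re

def read_source_term(path="pwr_gale_output.txt"):
    """Collect (nuclide, release rate) pairs from lines such as 'Kr-85  1.23E+02' (hypothetical layout)."""
    pattern = re.compile(r"^\s*([A-Z][a-z]?-\d+m?)\s+([0-9.Ee+-]+)")
    nuclides = {}
    with open(path) as fh:
        for line in fh:
            match = pattern.match(line)
            if match:
                nuclides[match.group(1)] = float(match.group(2))
    return nuclides

def write_pc_cream_input(nuclides, path="pc_cream_source_term.dat"):
    """Write one fixed-width record per nuclide (10-character name, 12-character value); widths are illustrative."""
    with open(path, "w") as fh:
        for name, rate in sorted(nuclides.items()):
            fh.write(f"{name:<10s}{rate:12.4E}\n")

# write_pc_cream_input(read_source_term())   # hypothetical file names; enable once real paths exist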
Evaluation of Time Domain EM Coupling Techniques. Volume II.
1980-08-01
tool for the analysis of electromagnetic coupling and shielding problems: the finite-difference, time-domain (FD-TD) solution of Maxwell's equations... The objective of the program was to evaluate the suitability of the FD-TD method to determine the amount of electromagnetic coupling through an... specific questions were addressed during this program: 1. Can the FD-TD method accurately model electromagnetic coupling into a conducting structure for
Coupled oscillators: interesting experiments for high school students
NASA Astrophysics Data System (ADS)
Kodejška, Č.; Lepil, O.; Sedláčková, H.
2018-07-01
This work deals with the experimental demonstration of coupled oscillators using simple tools in the form of mechanically coupled pendulums, magnetically coupled elastic strings or electromagnetic oscillators. For the evaluation of results, the Vernier LabQuest data logger and video analysis in the Tracker program were used. In the first part of this work, coupled mechanical oscillators of different types are shown, together with the data analysis performed in the Tracker or Vernier Logger Pro programs. The second part describes a measurement using two LC circuits as inductively or capacitively coupled electromagnetic oscillators and presents the experimental results obtained.
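For readers who want a quick model to compare against such measurements, the following sketch (not from the paper) integrates two weakly coupled oscillators with SciPy; the natural frequency and coupling constant are arbitrary illustrative values.

import numpy as np
from scipy.integrate import solve_ivp

omega0, coupling = 2.0 * np.pi, 0.4          # natural frequency and weak coupling strength (illustrative)

def rhs(t, y):
    x1, v1, x2, v2 = y
    return [v1, -omega0**2 * x1 - coupling * (x1 - x2),
            v2, -omega0**2 * x2 - coupling * (x2 - x1)]

# Start with only the first oscillator displaced; energy is slowly handed to the second one ("beats").
sol = solve_ivp(rhs, (0.0, 30.0), [1.0, 0.0, 0.0, 0.0], max_step=0.01)
x1, x2 = sol.y[0], sol.y[2]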
Palinkas, Lawrence A.; Robertson, Angela M.; Syvertsen, Jennifer L.; Hernandez, Daniel O.; Ulibarri, Monica D.; Rangel, M. Gudelia; Martinex, Gustavo; Strathdee, Steffanie A.
2014-01-01
This mixed-methods study examined the acceptability of a hypothetical couples-based HIV prevention program for female sex workers and their intimate (non-commercial) male partners in Mexico. Among 320 participants, 67% preferred couples-based over individual programs, particularly among men. Reasons cited for preferring couples-based programs included convenience and health benefits for both partners. Participants reported that they would benefit from general health information and services, HIV counseling and testing, job training (particularly for men) and other services. However, qualitative interviews revealed that barriers relating to the environment (i.e., poor access to services), providers (i.e., lack of a therapeutic alliance), and intimate relationships (i.e., mistrust or instability) would need to be addressed before such a program could be successfully implemented. Despite women’s concerns about privacy and men’s preferences for gender-specific services, couples-based HIV prevention programs were largely acceptable to female sex workers and their intimate male partners. PMID:24510364
Does Couple and Relationship Education Work for Individuals in Stepfamilies? A Meta-Analytic Study
ERIC Educational Resources Information Center
Lucier-Greer, Mallory; Adler-Baeder, Francesca
2012-01-01
Recent meta-analytic efforts have documented how couple and relationship education (CRE) programs promote healthy relationship and family functioning. The current meta-analysis contributes to this body of literature by examining stepfamily couples, an at-risk, subpopulation of participants, and assessing the effectiveness of CRE programs for…
Measuring the effectiveness of contraceptive marketing programs: Preethi in Sri Lanka.
Davies, J; Louis, T D
1977-04-01
The Preethi marketing program resulted in sales of more than 11 million condoms during its first 33 months, multiplying Sri Lanka's annual per capita condom use by a factor of five. Estimates for 1974 show 144,000 new acceptors (8% of married women of reproductive age), totaling 50,000 couple-years of protection. It is also estimated that more than half of the nation's 1.8 million couples of childbearing age were educated about the function of a condom. Unit costs were low for this new, nationwide program: about US $2.00 for each new acceptor, US $6.00 for each couple-year of protection, and US $0.09 for each couple educated. The program's success suggests that a social marketing approach can advance family planning at a relatively low cost.
Final Report for DOE Award ER25756
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kesselman, Carl
2014-11-17
The SciDAC-funded Center for Enabling Distributed Petascale Science (CEDPS) was established to address technical challenges that arise due to the frequent geographic distribution of data producers (in particular, supercomputers and scientific instruments) and data consumers (people and computers) within the DOE laboratory system. Its goal is to produce technical innovations that meet DOE end-user needs for (a) rapid and dependable placement of large quantities of data within a distributed high-performance environment, and (b) the convenient construction of scalable science services that provide for the reliable and high-performance processing of computation and data analysis requests from many remote clients. The Center is also addressing (c) the important problem of troubleshooting these and other related ultra-high-performance distributed activities from the perspective of both performance and functionality.
Multi-dimensional simulations of core-collapse supernova explosions with CHIMERA
NASA Astrophysics Data System (ADS)
Messer, O. E. B.; Harris, J. A.; Hix, W. R.; Lentz, E. J.; Bruenn, S. W.; Mezzacappa, A.
2018-04-01
Unraveling the core-collapse supernova (CCSN) mechanism is a problem that remains essentially unsolved despite more than four decades of effort. Spherically symmetric models with otherwise high physical fidelity generally fail to produce explosions, and it is widely accepted that CCSNe are inherently multi-dimensional. Progress in realistic modeling has occurred recently through the availability of petascale platforms and the increasing sophistication of supernova codes. We will discuss our most recent work on understanding neutrino-driven CCSN explosions employing multi-dimensional neutrino-radiation hydrodynamics simulations with the Chimera code. We discuss the inputs and resulting outputs from these simulations, the role of neutrino radiation transport, and the importance of multi-dimensional fluid flows in shaping the explosions. We also highlight the production of 48Ca in long-running Chimera simulations.
Programming PHREEQC calculations with C++ and Python: a comparative study
Charlton, Scott R.; Parkhurst, David L.; Muller, Mike
2011-01-01
The new IPhreeqc module provides an application programming interface (API) to facilitate coupling of other codes with the U.S. Geological Survey geochemical model PHREEQC. Traditionally, loose coupling of PHREEQC with other applications required methods to create PHREEQC input files, start external PHREEQC processes, and process PHREEQC output files. IPhreeqc eliminates most of this effort by providing direct access to PHREEQC capabilities through a component object model (COM), a library, or a dynamically linked library (DLL). Input and calculations can be specified through internally programmed strings, and all data exchange between an application and the module can occur in computer memory. This study compares simulations programmed in C++ and Python that are tightly coupled with IPhreeqc modules to the traditional simulations that are loosely coupled to PHREEQC. The study compares performance, quantifies effort, and evaluates lines of code and the complexity of the design. The comparisons show that IPhreeqc offers a more powerful and simpler approach for incorporating PHREEQC calculations into transport models and other applications that need to perform PHREEQC calculations. The IPhreeqc module facilitates the design of coupled applications and significantly reduces run times. Even a moderate knowledge of one of the supported programming languages allows more efficient use of PHREEQC than the traditional loosely coupled approach.
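The sketch below illustrates the tight-coupling pattern described above; the IPhreeqcWrapper class and its method names are hypothetical stand-ins for a Python binding, not the actual IPhreeqc interface, and the PHREEQC input text is only schematic.

class IPhreeqcWrapper:
    """Placeholder standing in for an in-memory PHREEQC instance (hypothetical interface)."""
    def load_database(self, path): ...
    def run_string(self, input_text): ...
    def get_selected_output(self): ...

phreeqc = IPhreeqcWrapper()
phreeqc.load_database("phreeqc.dat")         # hypothetical database path

# Instead of writing input files and launching an external process, a transport model
# assembles input as a string and exchanges results in memory at every step.
for step in range(10):
    phreeqc.run_string(f"""
        SOLUTION 1
            temp 25
            pH 7
            Ca  {1.0 + 0.1 * step} mmol/kgw
        SELECTED_OUTPUT
            -totals Ca
        END
    """)
    results = phreeqc.get_selected_output()  # rows/columns held in memory, no intermediate files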
Gyrokinetic particle-in-cell optimization on emerging multi- and manycore platforms
Madduri, Kamesh; Im, Eun-Jin; Ibrahim, Khaled Z.; ...
2011-03-02
The next decade of high-performance computing (HPC) systems will see a rapid evolution and divergence of multi- and manycore architectures as power and cooling constraints limit increases in microprocessor clock speeds. Understanding efficient optimization methodologies on diverse multicore designs in the context of demanding numerical methods is one of the greatest challenges faced today by the HPC community. In this paper, we examine the efficient multicore optimization of GTC, a petascale gyrokinetic toroidal fusion code for studying plasma microturbulence in tokamak devices. For GTC’s key computational components (charge deposition and particle push), we explore efficient parallelization strategies across a broad range of emerging multicore designs, including the recently-released Intel Nehalem-EX, the AMD Opteron Istanbul, and the highly multithreaded Sun UltraSparc T2+. We also present the first study on tuning gyrokinetic particle-in-cell (PIC) algorithms for graphics processors, using the NVIDIA C2050 (Fermi). Our work discusses several novel optimization approaches for gyrokinetic PIC, including mixed-precision computation, particle binning and decomposition strategies, grid replication, SIMDized atomic floating-point operations, and effective GPU texture memory utilization. Overall, we achieve significant performance improvements of 1.3–4.7× on these complex PIC kernels, despite the inherent challenges of data dependency and locality. Finally, our work also points to several architectural and programming features that could significantly enhance PIC performance and productivity on next-generation architectures.
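A toy NumPy sketch of the charge-deposition hazard discussed above follows (it is not GTC code); the 1-D grid, particle counts, and weights are arbitrary.

import numpy as np

rng = np.random.default_rng(0)
n_particles, n_grid = 1_000_000, 4096
positions = rng.uniform(0.0, 1.0, n_particles)      # 1-D stand-in for particle coordinates
weights = rng.uniform(0.5, 1.5, n_particles)        # particle charge weights

cells = np.minimum((positions * n_grid).astype(np.int64), n_grid - 1)
charge = np.zeros(n_grid)
np.add.at(charge, cells, weights)                    # unbuffered scatter-add: the serial, collision-safe
                                                     # analogue of an atomic floating-point update

# Binning particles by cell is one of the strategies mentioned above: sorting turns
# random memory access into streaming access before deposition.
order = np.argsort(cells, kind="stable")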
Uvf - Unified Volume Format: A General System for Efficient Handling of Large Volumetric Datasets.
Krüger, Jens; Potter, Kristin; Macleod, Rob S; Johnson, Christopher
2008-01-01
With the continual increase in computing power, volumetric datasets with sizes ranging from only a few megabytes to petascale are generated thousands of times per day. Such data may come from an ordinary source such as simple everyday medical imaging procedures, while larger datasets may be generated from cluster-based scientific simulations or measurements of large scale experiments. In computer science an incredible amount of work worldwide is put into the efficient visualization of these datasets. As researchers in the field of scientific visualization, we often have to face the task of handling very large data from various sources. This data usually comes in many different data formats. In medical imaging, the DICOM standard is well established, however, most research labs use their own data formats to store and process data. To simplify the task of reading the many different formats used with all of the different visualization programs, we present a system for the efficient handling of many types of large scientific datasets (see Figure 1 for just a few examples). While primarily targeted at structured volumetric data, UVF can store just about any type of structured and unstructured data. The system is composed of a file format specification with a reference implementation of a reader. It is not only a common, easy to implement format but also allows for efficient rendering of most datasets without the need to convert the data in memory.
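As a rough illustration of the "store once, slice lazily" idea (not the UVF reference reader), the sketch below memory-maps a raw structured volume; the 16-byte header layout and file name are hypothetical.

import numpy as np

def open_volume(path):
    """Map a raw scalar brick into memory; header = three uint32 dimensions plus a dtype code (hypothetical)."""
    header = np.fromfile(path, dtype=np.uint32, count=4)
    nx, ny, nz, dtype_code = (int(v) for v in header)
    dtype = {0: np.uint8, 1: np.uint16, 2: np.float32}[dtype_code]
    # Memory-map the payload so even very large volumes can be sliced without conversion.
    return np.memmap(path, dtype=dtype, mode="r", offset=16, shape=(nz, ny, nx))

# volume = open_volume("dataset.raw")            # hypothetical file
# mid_slice = volume[volume.shape[0] // 2]       # touches only the pages it needs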
Baker, Zachary Kent; Power, John Fredrick; Tripp, Justin Leonard; Dunham, Mark Edward; Stettler, Matthew W; Jones, John Alexander
2014-10-14
Disclosed is a method and system for performing operations on at least one input data vector in order to produce at least one output vector to permit easy, scalable and fast programming of a petascale equivalent supercomputer. A PetaFlops Router may comprise one or more PetaFlops Nodes, which may be connected to each other and/or external data provider/consumers via a programmable crossbar switch external to the PetaFlops Node. Each PetaFlops Node has a FPGA and a programmable intra-FPGA crossbar switch that permits input and output variables to be configurably connected to various physical operators contained in the FPGA as desired by a user. This allows a user to specify the instruction set of the system on a per-application basis. Further, the intra-FPGA crossbar switch permits the output of one operation to be delivered as an input to a second operation. By configuring the external crossbar switch, the output of a first operation on a first PetaFlops Node may be used as the input for a second operation on a second PetaFlops Node. An embodiment may provide an ability for the system to recognize and generate pipelined functions. Streaming operators may be connected together at run-time and appropriately staged to allow data to flow through a series of functions. This allows the system to provide high throughput and parallelism when possible. The PetaFlops Router may implement the user desired instructions by appropriately configuring the intra-FPGA crossbar switch on each PetaFlops Node and the external crossbar switch.
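A loose software analogy of the crossbar idea (not the patented hardware or its instruction set) is sketched below: a routing table connects streaming operators back-to-back, so the same operators can be rewired per application.

def scale(stream, factor=2.0):
    for x in stream:
        yield x * factor

def accumulate(stream):
    total = 0.0
    for x in stream:
        total += x
        yield total

OPERATORS = {"scale": scale, "accumulate": accumulate}
CROSSBAR = ["scale", "accumulate"]        # reconfigurable wiring: output of one operator feeds the next

def run(crossbar, source):
    stream = iter(source)
    for name in crossbar:
        stream = OPERATORS[name](stream)  # stage operators as a pipeline, as in the streaming description
    return list(stream)

print(run(CROSSBAR, [1.0, 2.0, 3.0]))     # [2.0, 6.0, 12.0]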
Relationship Education for Military Couples: Recommendations for Best Practice.
Bakhurst, Melissa G; Loew, Benjamin; McGuire, Annabel C L; Halford, W Kim; Markman, Howard J
2017-06-01
Military couples have a number of distinctive strengths and challenges that are likely to influence their relationship adjustment. Military couples' strengths include stable employment, financial security, and subsidized health and counseling services. At the same time, military couples often experience long periods of separation and associated difficulties with emotional disconnect, trauma symptoms, and reintegrating the family. This paper describes best practice recommendations for working with military couples, including: addressing the distinctive challenges of the military lifestyle, ensuring program delivery is seen as relevant by military couples, and providing relationship education in formats that enhance the accessibility of programs. © 2016 Family Process Institute.
ERIC Educational Resources Information Center
Rahimi, Mohd Khairul Anuar
2017-01-01
This phenomenological study explored the experiences of international students in CACREP-accredited marriage, couple, and family counseling programs. Seven former international students from the program who have practiced counseling in their home country were interviewed to understand their learning experiences, adaptation process and counseling…
ERIC Educational Resources Information Center
Casquarelli, Elaine J.; Fallon, Kathleen M.
2011-01-01
Research shows that premarital counseling programs help engaged couples develop interpersonal and problem-solving skills that enhance their marital relationships. Yet, there are limited services for same-sex couples. This article assumes an integrated humanistic and social justice advocacy stance to explore the needs of lesbian, gay, and bisexual…
Improving Dyadic Coping in Couples with a Stress-Oriented Approach: A 2-Year Longitudinal Study
ERIC Educational Resources Information Center
Bodenmann, Guy; Pihet, Sandrine; Shantinath, Shachi D.; Cina, Annette; Widmer, Kathrin
2006-01-01
This study sought to assess the effectiveness of a marital distress prevention program for couples by examining how marital quality, especially marital competencies such as dyadic coping, could be improved by means of a prevention program focusing on the enhancement of coping resources (Couples Coping Enhancement Training). The study consisted of…
Feinberg, Mark E; Jones, Damon E; Hostetler, Michelle L; Roettger, Michael E; Paul, Ian M; Ehrenthal, Deborah B
2016-08-01
The transition to parenthood is a stressful period for most parents as individuals and as couples, with variability in parent mental health and couple relationship functioning linked to children's long-term emotional, mental health, and academic outcomes. Few couple-focused prevention programs targeting this period have been shown to be effective. The purpose of this study was to test the short-term efficacy of a brief, universal, transition-to-parenthood intervention (Family Foundations) and report the results of this randomized trial at 10 months postpartum. This was a randomized controlled trial; 399 couples expecting their first child were randomly assigned to intervention or control conditions after pretest. Intervention couples received a manualized nine-session (five prenatal and four postnatal classes) psychoeducational program delivered in small groups. Intent-to-treat analyses indicated that intervention couples demonstrated better posttest levels than control couples on more than two thirds of measures of coparenting, parent mental health, parenting, child adjustment, and family violence. Program effects on family violence were particularly large. Of eight outcome variables that did not demonstrate main effects, seven showed moderated intervention impact, such that intervention couples at higher levels of risk during pregnancy showed better outcomes than control couples at similar levels of risk. These findings replicate a prior smaller study of Family Foundations, indicating that the Family Foundations approach to supporting couples making the transition to parenthood can have broad impact for parents, family relationships, and children's adjustment. Program effects are consistent and benefit all families, with particularly notable effects for families at elevated prenatal risk.
Whitton, Sarah W; Scott, Shelby B; Dyar, Christina; Weitbrecht, Eliza M; Hutsell, David W; Kuryluk, Amanda D
2017-10-01
Relationship education represents a promising, nonstigmatizing approach to promoting the health and stability of same-sex couples. A new culturally sensitive adaptation of relationship education was developed specifically for female same-sex couples (The Strengthening Same-Sex Relationships Program, Female version; SSSR-F). SSSR-F includes adaptations of evidence-based strategies to build core relationship skills (e.g., communication skills training) as well as new content to address unique challenges faced by this population (e.g., discrimination; low social support). A small randomized waitlist-control trial (N = 37 couples) was conducted to evaluate program feasibility, acceptability, and efficacy. Three proximal outcomes targeted by SSSR-F (communication, perceived stress, social support) and 3 distal outcomes (global relationship satisfaction, instability, and confidence) were assessed at pre- and posttreatment and 3-month follow-up. Results of multilevel models accounting for nonindependence in dyadic data indicated statistically significant program effects on positive and negative couple communication, relationship satisfaction, and relationship confidence and small, nonsignificant program effects on stress, social support, and relationship instability. Analyses of follow-up data suggest maintenance of effects on the proximal but not the distal outcomes. Ratings of program satisfaction were high. Overall, findings support the feasibility, acceptability, and initial efficacy of SSSR-F, highlighting the potential value of culturally sensitive relationship education for same-sex couples. Continued efforts are needed to increase sustainability of program effects on global relationship quality over time. (PsycINFO Database Record (c) 2017 APA, all rights reserved).
ABINIT: Plane-Wave-Based Density-Functional Theory on High Performance Computers
NASA Astrophysics Data System (ADS)
Torrent, Marc
2014-03-01
For several years, a continuous effort has been made to adapt electronic structure codes based on Density-Functional Theory to future computing architectures. Among these codes, ABINIT is based on a plane-wave description of the wave functions, which makes it possible to treat systems of any kind. Porting such a code to petascale architectures poses difficulties related to the many-body nature of the DFT equations. To improve the performance of ABINIT - especially for standard LDA/GGA ground-state and response-function calculations - several strategies have been followed. A full multi-level MPI parallelisation scheme has been implemented, exploiting all possible levels and distributing both computation and memory. It increases the number of distributed processes and could not have been achieved without a strong restructuring of the code. The core algorithm used to solve the eigenproblem (``Locally Optimal Block Conjugate Gradient''), a blocked Davidson-like algorithm, is based on a distribution of processes combining plane waves and bands. In addition to the distributed-memory parallelization, a full hybrid scheme has been implemented, using standard shared-memory directives (OpenMP/OpenACC) or porting some time-consuming code sections to Graphics Processing Units (GPU). As no simple performance model exists, the complexity of use has increased: the code efficiency strongly depends on the distribution of processes among the numerous levels. ABINIT is able to predict the performance of several process distributions and automatically choose the most favourable one. In parallel, a considerable effort has been carried out to analyse the performance of the code on petascale architectures, showing which sections of the code have to be improved; they are all related to matrix algebra (diagonalisation, orthogonalisation). The different strategies employed to improve the code's scalability will be described. They are based on the exploration of new diagonalization algorithms, as well as the use of external optimized libraries. Part of this work has been supported by the European PRACE project (Partnership for Advanced Computing in Europe) in the framework of its work package 8.
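As a generic illustration of the blocked iterative diagonalization named above (not ABINIT code), the sketch below calls SciPy's LOBPCG implementation on a stand-in operator; the operator and block size are arbitrary.

import numpy as np
from scipy.sparse import diags
from scipy.sparse.linalg import lobpcg

n, nbands = 500, 8                                   # problem size and number of "bands" (eigenpairs)
H = diags([np.arange(1, n + 1, dtype=float)], [0])   # stand-in Hermitian operator
X = np.random.default_rng(0).standard_normal((n, nbands))  # initial block of trial vectors

eigvals, eigvecs = lobpcg(H, X, largest=False, tol=1e-8, maxiter=500)
print(eigvals)   # lowest eigenvalues; columns of eigvecs are the corresponding "band" vectors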
Federal Register 2010, 2011, 2012, 2013, 2014
2013-06-17
... predicted topological properties of superconductors in two dimensions, to program fundamental couplings at near-atomic...
Atomic Detail Visualization of Photosynthetic Membranes with GPU-Accelerated Ray Tracing
Vandivort, Kirby L.; Barragan, Angela; Singharoy, Abhishek; Teo, Ivan; Ribeiro, João V.; Isralewitz, Barry; Liu, Bo; Goh, Boon Chong; Phillips, James C.; MacGregor-Chatwin, Craig; Johnson, Matthew P.; Kourkoutis, Lena F.; Hunter, C. Neil
2016-01-01
The cellular process responsible for providing energy for most life on Earth, namely photosynthetic light-harvesting, requires the cooperation of hundreds of proteins across an organelle, involving length and time scales spanning several orders of magnitude over quantum and classical regimes. Simulation and visualization of this fundamental energy conversion process pose many unique methodological and computational challenges. We present, in two accompanying movies, light-harvesting in the photosynthetic apparatus found in purple bacteria, the so-called chromatophore. The movies are the culmination of three decades of modeling efforts, featuring the collaboration of theoretical, experimental, and computational scientists. We describe the techniques that were used to build, simulate, analyze, and visualize the structures shown in the movies, and we highlight cases where scientific needs spurred the development of new parallel algorithms that efficiently harness GPU accelerators and petascale computers. PMID:27274603
An Optimizing Compiler for Petascale I/O on Leadership Class Architectures
DOE Office of Scientific and Technical Information (OSTI.GOV)
Choudhary, Alok; Kandemir, Mahmut
In high-performance computing systems, parallel I/O architectures usually have very complex hierarchies with multiple layers that collectively constitute an I/O stack, including high-level I/O libraries such as PnetCDF and HDF5, I/O middleware such as MPI-IO, and parallel file systems such as PVFS and Lustre. Our project explored automated instrumentation and compiler support for I/O intensive applications. Our project made significant progress towards understanding the complex I/O hierarchies of high-performance storage systems (including storage caches, HDDs, and SSDs), and designing and implementing state-of-the-art compiler/runtime system technology targeting I/O-intensive HPC applications on leadership-class machines. This final report summarizes the major achievements of the project and also points out promising future directions.
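A minimal sketch of the high-level end of the I/O stack described above follows. It is a generic pattern, not the project's tooling; it assumes h5py built against a parallel HDF5 together with mpi4py, and the file name and dataset layout are arbitrary.

from mpi4py import MPI
import h5py
import numpy as np

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()
n_local = 1000

# Each rank writes its own slice of one shared dataset; HDF5 maps this onto MPI-IO underneath.
with h5py.File("output.h5", "w", driver="mpio", comm=comm) as f:
    dset = f.create_dataset("x", shape=(size * n_local,), dtype="f8")
    dset[rank * n_local:(rank + 1) * n_local] = np.full(n_local, rank, dtype="f8")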
Multi-dimensional simulations of core-collapse supernova explosions with CHIMERA
DOE Office of Scientific and Technical Information (OSTI.GOV)
Messer, Bronson; Harris, James Austin; Hix, William Raphael
Unraveling the core-collapse supernova (CCSN) mechanism is a problem that remains essentially unsolved despite more than four decades of effort. Spherically symmetric models with otherwise high physical fidelity generally fail to produce explosions, and it is widely accepted that CCSNe are inherently multi-dimensional. Progress in realistic modeling has occurred recently through the availability of petascale platforms and the increasing sophistication of supernova codes. We will discuss our most recent work on understanding neutrino-driven CCSN explosions employing multi-dimensional neutrino-radiation hydrodynamics simulations with the Chimera code. We discuss the inputs and resulting outputs from these simulations, the role of neutrino radiation transport, and the importance of multi-dimensional fluid flows in shaping the explosions. We also highlight the production of 48Ca in long-running Chimera simulations.
NASA Astrophysics Data System (ADS)
Pordes, Ruth; OSG Consortium; Petravick, Don; Kramer, Bill; Olson, Doug; Livny, Miron; Roy, Alain; Avery, Paul; Blackburn, Kent; Wenaus, Torre; Würthwein, Frank; Foster, Ian; Gardner, Rob; Wilde, Mike; Blatecky, Alan; McGee, John; Quick, Rob
2007-07-01
The Open Science Grid (OSG) provides a distributed facility where the Consortium members provide guaranteed and opportunistic access to shared computing and storage resources. OSG provides support for and evolution of the infrastructure through activities that cover operations, security, software, troubleshooting, addition of new capabilities, and support for existing and engagement with new communities. The OSG SciDAC-2 project provides specific activities to manage and evolve the distributed infrastructure and support its use. The innovative aspects of the project are the maintenance and performance of a collaborative (shared & common) petascale national facility over tens of autonomous computing sites, for many hundreds of users, transferring terabytes of data a day, executing tens of thousands of jobs a day, and providing robust and usable resources for scientific groups of all types and sizes. More information can be found at the OSG web site: www.opensciencegrid.org.
High-performance metadata indexing and search in petascale data storage systems
NASA Astrophysics Data System (ADS)
Leung, A. W.; Shao, M.; Bisson, T.; Pasupathy, S.; Miller, E. L.
2008-07-01
Large-scale storage systems used for scientific applications can store petabytes of data and billions of files, making the organization and management of data in these systems a difficult, time-consuming task. The ability to search file metadata in a storage system can address this problem by allowing scientists to quickly navigate experiment data and code while allowing storage administrators to gather the information they need to properly manage the system. In this paper, we present Spyglass, a file metadata search system that achieves scalability by exploiting storage system properties, providing the scalability that existing file metadata search tools lack. In doing so, Spyglass can achieve search performance up to several thousand times faster than existing database solutions. We show that Spyglass enables important functionality that can aid data management for scientists and storage administrators.
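Spyglass itself exploits storage-system structure; as a much simpler point of comparison, the toy sketch below builds a flat SQLite index of file metadata and runs one example query. The paths and the query are arbitrary, and nothing here is taken from the Spyglass implementation.

import os
import sqlite3

def build_index(root, db_path="metadata.db"):
    """Walk a directory tree and record basic metadata for each file in one SQLite table."""
    con = sqlite3.connect(db_path)
    con.execute("CREATE TABLE IF NOT EXISTS files (path TEXT, size INTEGER, mtime REAL, owner INTEGER)")
    rows = []
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            full = os.path.join(dirpath, name)
            try:
                st = os.stat(full)
            except OSError:
                continue                         # skip files that vanish or are unreadable
            rows.append((full, st.st_size, st.st_mtime, st.st_uid))
    con.executemany("INSERT INTO files VALUES (?, ?, ?, ?)", rows)
    con.commit()
    return con

con = build_index(".")
# Example query a scientist or administrator might run: the ten largest files touched in the last day.
recent = con.execute(
    "SELECT path, size FROM files WHERE mtime > strftime('%s','now') - 86400 "
    "ORDER BY size DESC LIMIT 10").fetchall()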
Greater emotional arousal predicts poorer long-term memory of communication skills in couples.
Baucom, Brian R; Weusthoff, Sarah; Atkins, David C; Hahlweg, Kurt
2012-06-01
Many studies have examined the importance of learning skills in behaviorally based couple interventions but none have examined predictors of long-term memory for skills. Associations between emotional arousal and long-term recall of communication skills delivered to couples during a behaviorally based relationship distress prevention program were examined in a sample of 49 German couples. Fundamental frequency (f(0)), a vocal measure of encoded emotional arousal, was measured during pre-treatment couple conflict. Higher levels of f(0) were linked to fewer skills remembered 11 years after completing the program, and women remembered more skills than men. Implications of results for behaviorally based couple interventions are discussed. Copyright © 2012 Elsevier Ltd. All rights reserved.
Distressed Couples and Marriage Education
ERIC Educational Resources Information Center
DeMaria, Rita M.
2005-01-01
Professionals generally believe that couples who choose to attend marriage education programs are not as distressed as are clinical couples and that distressed couples are not good candidates for marriage education. We examined these assumptions in 129 married couples who enrolled in a PAIRS, Practical Application of Intimate Relationship Skills…
NASA Astrophysics Data System (ADS)
Buaria, D.; Yeung, P. K.
2017-12-01
A new parallel algorithm utilizing a partitioned global address space (PGAS) programming model to achieve high scalability is reported for particle tracking in direct numerical simulations of turbulent fluid flow. The work is motivated by the desire to obtain Lagrangian information necessary for the study of turbulent dispersion at the largest problem sizes feasible on current and next-generation multi-petaflop supercomputers. A large population of fluid particles is distributed among parallel processes dynamically, based on instantaneous particle positions such that all of the interpolation information needed for each particle is available either locally on its host process or on neighboring processes holding adjacent sub-domains of the velocity field. With cubic splines as the preferred interpolation method, the new algorithm is designed to minimize the need for communication, by transferring between adjacent processes only those spline coefficients determined to be necessary for specific particles. This transfer is implemented very efficiently as a one-sided communication, using Co-Array Fortran (CAF) features which facilitate small data movements between different local partitions of a large global array. The cost of monitoring transfer of particle properties between adjacent processes for particles migrating across sub-domain boundaries is found to be small. Detailed benchmarks are obtained on the Cray petascale supercomputer Blue Waters at the University of Illinois, Urbana-Champaign. For operations on the particles in an 8192³ simulation (0.55 trillion grid points) on 262,144 Cray XE6 cores, the new algorithm is found to be orders of magnitude faster relative to a prior algorithm in which each particle is tracked by the same parallel process at all times. This large speedup reduces the additional cost of tracking of order 300 million particles to just over 50% of the cost of computing the Eulerian velocity field at this scale. Improving support for PGAS models in major compilers suggests that this algorithm will be of wider applicability on most upcoming supercomputers.
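A serial NumPy sketch of the ownership idea (not the Co-Array Fortran implementation) follows; the slab decomposition, box size, and particle counts are arbitrary placeholders.

import numpy as np

n_procs, box = 8, 2.0 * np.pi
edges = np.linspace(0.0, box, n_procs + 1)           # slab boundaries of a 1-D decomposition

rng = np.random.default_rng(2)
x = rng.uniform(0.0, box, 1_000_000)                  # particle positions along the split axis

owner = np.searchsorted(edges, x, side="right") - 1   # which slab (process) holds each particle
owner = np.clip(owner, 0, n_procs - 1)

# After a time step, only particles whose owner changed need to migrate, and only to
# adjacent slabs when the step is small compared with a slab width.
counts = np.bincount(owner, minlength=n_procs)
print(counts)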
NASA Astrophysics Data System (ADS)
Noh, M. J.; Howat, I. M.; Porter, C. C.; Willis, M. J.; Morin, P. J.
2016-12-01
The Arctic is undergoing rapid change associated with climate warming. Digital Elevation Models (DEMs) provide critical information for change measurement and infrastructure planning in this vulnerable region, yet the existing quality and coverage of DEMs in the Arctic is poor. Low contrast and repeatedly-textured surfaces, such as snow and glacial ice and mountain shadows, all common in the Arctic, challenge existing stereo-photogrammetric techniques. Submeter resolution, stereoscopic satellite imagery with high geometric and radiometric quality, and wide spatial coverage are becoming increasingly accessible to the scientific community. To utilize this imagery for extracting DEMs at a large scale over glaciated and high-latitude regions, we developed the Surface Extraction from TIN-based Searchspace Minimization (SETSM) algorithm. SETSM is fully automatic (i.e. no search parameter settings are needed) and uses only the satellite rational polynomial coefficients (RPCs). Using SETSM, we have generated a large number of DEMs (> 100,000 scene pairs) from WorldView, GeoEye and QuickBird stereo images collected by DigitalGlobe Inc. and archived by the Polar Geospatial Center (PGC) at the University of Minnesota through an academic licensing program maintained by the US National Geospatial-Intelligence Agency (NGA). SETSM is the primary DEM generation software for the US National Science Foundation's ArcticDEM program, with the objective of generating high resolution (2-8 m) topography for the entire Arctic landmass, including seamless DEM mosaics and repeat DEM strips for change detection. ArcticDEM is a collaboration among multiple US universities, governmental agencies and private companies, as well as international partners assisting with quality control and registration. ArcticDEM is being produced using the petascale Blue Waters supercomputer at the National Center for Supercomputing Applications at the University of Illinois. In this paper, we introduce the SETSM algorithm and the processing system used for the ArcticDEM project, as well as provide notable examples of ArcticDEM products.
Operario, Don; Gamarel, Kristi E; Iwamoto, Mariko; Suzuki, Sachico; Suico, Sabrina; Darbes, Lynae; Nemoto, Tooru
2017-08-01
HIV risk among transgender women has been attributed to condomless sex with primary male partners. This study pilot tested a couples-focused HIV intervention program for transgender women and their primary male partners. We analyzed data from 56 transgender women and their male partners (n = 112 participants) who were randomized as a couple to one of two groups. Participants in the intervention group (27 couples) received 3 counseling sessions: 2 couples-focused sessions, which discussed relationship dynamics, communication, and HIV risk, and 1 individual-focused session on HIV prevention concerns. Participants in the control group (29 couples) received 1 session on general HIV prevention information delivered to both partners together. At 3-month follow-up, participants in the intervention reported lower odds of condomless sex with primary partners (OR 0.5, 95 % CI 0.3-1.0), reduced odds of engaging in sex with a casual partner (OR 0.3, 95 % CI 0.1-1.0), and reduction in the number of casual partners (B = -1.45, SE = 0.4) compared with the control group. Findings provide support for the feasibility and promise of a couples-focused HIV prevention intervention for transgender women and their primary male partners.
Compiled MPI: Cost-Effective Exascale Applications Development
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bronevetsky, G; Quinlan, D; Lumsdaine, A
2012-04-10
The complexity of petascale and exascale machines makes it increasingly difficult to develop applications that can take advantage of them. Future systems are expected to feature billion-way parallelism, complex heterogeneous compute nodes and poor availability of memory (Peter Kogge, 2008). This new challenge for application development is motivating a significant amount of research and development on new programming models and runtime systems designed to simplify large-scale application development. Unfortunately, DoE has significant multi-decadal investment in a large family of mission-critical scientific applications. Scaling these applications to exascale machines will require a significant investment that will dwarf the costs of hardware procurement. A key reason for the difficulty in transitioning today's applications to exascale hardware is their reliance on explicit programming techniques, such as the Message Passing Interface (MPI) programming model to enable parallelism. MPI provides a portable and high performance message-passing system that enables scalable performance on a wide variety of platforms. However, it also forces developers to lock the details of parallelization together with application logic, making it very difficult to adapt the application to significant changes in the underlying system. Further, MPI's explicit interface makes it difficult to separate the application's synchronization and communication structure, reducing the amount of support that can be provided by compiler and run-time tools. This is in contrast to the recent research on more implicit parallel programming models such as Chapel, OpenMP and OpenCL, which promise to provide significantly more flexibility at the cost of reimplementing significant portions of the application. We are developing CoMPI, a novel compiler-driven approach to enable existing MPI applications to scale to exascale systems with minimal modifications that can be made incrementally over the application's lifetime. It includes: (1) New set of source code annotations, inserted either manually or automatically, that will clarify the application's use of MPI to the compiler infrastructure, enabling greater accuracy where needed; (2) A compiler transformation framework that leverages these annotations to transform the original MPI source code to improve its performance and scalability; (3) Novel MPI runtime implementation techniques that will provide a rich set of functionality extensions to be used by applications that have been transformed by our compiler; and (4) A novel compiler analysis that leverages simple user annotations to automatically extract the application's communication structure and synthesize most complex code annotations.
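For readers unfamiliar with the explicit style being discussed, the short mpi4py sketch below shows how communication partners and buffers sit interleaved with the numerics; it is a generic halo-exchange illustration, not CoMPI and not its annotation syntax.

from mpi4py import MPI
import numpy as np

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()

local = np.full(10, float(rank))          # each rank's slice of a 1-D field
left = (rank - 1) % size
right = (rank + 1) % size

# Halo exchange: the stencil below depends on this explicit, hand-written pattern,
# which is exactly the coupling of parallelization details and application logic described above.
recv_left = np.empty(1)
recv_right = np.empty(1)
comm.Sendrecv(local[-1:], dest=right, recvbuf=recv_left, source=left)
comm.Sendrecv(local[:1], dest=left, recvbuf=recv_right, source=right)

laplacian_left = recv_left[0] - 2.0 * local[0] + local[1]   # boundary cell uses the halo value directly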
NASA Technical Reports Server (NTRS)
Bennett, R. L.
1975-01-01
The analytical techniques and computer program developed in the fully-coupled rotor vibration study are described. The rotor blade natural frequency and mode shape analysis was implemented in a digital computer program designated DF1758. The program computes collective, cyclic, and scissor modes for a single blade within a specified range of frequency for specified values of rotor RPM and collective angle. The analysis includes effects of blade twist, cg offset from reference axis, and shear center offset from reference axis. Coupled inplane, out-of-plane, and torsional vibrations are considered. Normalized displacements, shear forces and moments may be printed out and Calcomp plots of natural frequencies as a function of rotor RPM may be produced.
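A generic sketch of how natural frequencies and mode shapes emerge from a structural model (not the DF1758 code) follows, via the generalized eigenproblem K x = w^2 M x; the 4-degree-of-freedom stiffness and mass values are arbitrary illustrative numbers.

import numpy as np
from scipy.linalg import eigh

K = np.array([[ 2., -1.,  0.,  0.],
              [-1.,  2., -1.,  0.],
              [ 0., -1.,  2., -1.],
              [ 0.,  0., -1.,  1.]]) * 1.0e5   # stiffness matrix [N/m], illustrative values
M = np.diag([1.0, 1.0, 1.0, 0.5])              # lumped mass matrix [kg], illustrative values

w2, modes = eigh(K, M)                 # eigenvalues are squared circular frequencies
freqs_hz = np.sqrt(w2) / (2.0 * np.pi)
print(freqs_hz)                        # natural frequencies; columns of `modes` are the mode shapes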
Screening of redox couples and electrode materials
NASA Technical Reports Server (NTRS)
Giner, J.; Swette, L.; Cahill, K.
1976-01-01
Electrochemical parameters of selected redox couples that might be potentially promising for application in bulk energy storage systems were investigated. This was carried out in two phases: a broad investigation of the basic characteristics and behavior of various redox couples, followed by a more limited investigation of their electrochemical performance in a redox flow reactor configuration. In the first phase of the program, eight redox couples were evaluated under a variety of conditions in terms of their exchange current densities as measured by the rotating disk electrode procedure. The second phase of the program involved the testing of four couples in a redox reactor under flow conditions with a variety of electrode materials and structures.
NASA Astrophysics Data System (ADS)
Strayer, Michael
2007-09-01
Good morning. Welcome to Boston, the home of the Red Sox, Celtics and Bruins, baked beans, tea parties, Robert Parker, and SciDAC 2007. A year ago I stood before you to share the legacy of the first SciDAC program and identify the challenges that we must address on the road to petascale computing—a road E. E. Cummings described as `. . . never traveled, gladly beyond any experience.' Today, I want to explore the preparations for the rapidly approaching extreme scale (X-scale) generation. These preparations are the first step propelling us along the road of burgeoning scientific discovery enabled by the application of X-scale computing. We look to petascale computing and beyond to open up a world of discovery that cuts across scientific fields and leads us to a greater understanding of not only our world, but our universe. As part of the President's American Competitiveness Initiative, the ASCR Office has been preparing a ten-year vision for computing. As part of this planning, LBNL, together with ORNL and ANL, hosted three town hall meetings on Simulation and Modeling at the Exascale for Energy, Ecological Sustainability and Global Security (E3). The proposed E3 initiative is organized around four programmatic themes: engaging our top scientists, engineers, computer scientists and applied mathematicians; investing in pioneering large-scale science; developing scalable analysis algorithms and storage architectures to accelerate discovery; and accelerating the build-out and future development of the DOE open computing facilities. It is clear that we have only just started down the path to extreme scale computing. Plan to attend Thursday's session on the out-briefing and discussion of these meetings. The road to the petascale has been at best rocky. In FY07, the continuing resolution provided 12% less money for Advanced Scientific Computing than either the President, the Senate, or the House. As a consequence, many of you had to absorb a no-cost extension for your SciDAC work. I am pleased that the President's FY08 budget restores the funding for SciDAC. Quoting from the Advanced Scientific Computing Research description in the House Energy and Water Development Appropriations Bill for FY08, "Perhaps no other area of research at the Department is so critical to sustaining U.S. leadership in science and technology, revolutionizing the way science is done and improving research productivity." As a society we need to revolutionize our approaches to energy, environmental and global security challenges. As we go forward along the road to the X-scale generation, the use of computation will continue to be a critical tool along with theory and experiment in understanding the behavior of the fundamental components of nature as well as for fundamental discovery and exploration of the behavior of complex systems. The foundation to overcome these societal challenges will build from the experiences and knowledge gained as you, members of our SciDAC research teams, work together to attack problems at the tera- and peta-scale. If SciDAC is viewed as an experiment for revolutionizing scientific methodology, then a strategic goal of the ASCR program must be to broaden the intellectual base prepared to address the challenges of the new X-scale generation of computing. We must focus our computational science experiences gained over the past five years on the opportunities introduced with extreme scale computing. Our facilities are on a path to provide the resources needed to undertake the first part of our journey.
Using the newly upgraded 119 teraflop Cray XT system at the Leadership Computing Facility, SciDAC research teams have in three days performed a 100-year study of the time evolution of the atmospheric CO2 concentration originating from the land surface. The simulation of the El Nino/Southern Oscillation, which was part of this study, has been characterized as `the most impressive new result in ten years'. SciDAC teams also gained new insight into the behavior of superheated ionic gas in the ITER reactor as a result of an AORSA run on 22,500 processors that achieved over 87 trillion calculations per second (87 teraflops), 74% of the system's theoretical peak. Tomorrow, Argonne and IBM will announce that the first IBM Blue Gene/P, a 100 teraflop system, will be shipped to the Argonne Leadership Computing Facility later this fiscal year. By the end of FY2007, ASCR high-performance and leadership computing resources will include the 114 teraflop IBM Blue Gene/P, a 102 teraflop Cray XT4 at NERSC, and a 119 teraflop Cray XT system at Oak Ridge. Before ringing in the New Year, Oak Ridge will upgrade to 250 teraflops by replacing the dual-core processors with quad-core processors, and Argonne will upgrade to between 250 and 500 teraflops; next year, a petascale Cray Baker system is scheduled for delivery at Oak Ridge. The multidisciplinary teams in our SciDAC Centers for Enabling Technologies and our SciDAC Institutes must continue to work with our Scientific Application teams to overcome the barriers that prevent effective use of these new systems. These challenges include: the need for new algorithms as well as operating system and runtime software and tools which scale to parallel systems composed of hundreds of thousands of processors; program development environments and tools which scale effectively and provide ease of use for developers and scientific end users; and visualization and data management systems that support moving, storing, analyzing, manipulating and visualizing multi-petabytes of scientific data and objects. The SciDAC Centers, located primarily at our DOE national laboratories, will take the lead in ensuring that critical computer science and applied mathematics issues are addressed in a timely and comprehensive fashion and will address issues associated with the research software lifecycle. In contrast, the SciDAC Institutes, which are university-led centers of excellence, will have more flexibility to pursue new research topics through a range of research collaborations. The Institutes will also work to broaden the intellectual and researcher base—conducting short courses and summer schools to take advantage of new high performance computing capabilities. The SciDAC Outreach Center at Lawrence Berkeley National Laboratory complements the outreach efforts of the SciDAC Institutes. The Outreach Center is our clearinghouse for SciDAC activities and resources and will communicate with the high performance computing community in part to understand their needs for workshops, summer schools and institutes. SciDAC is not ASCR's only effort to broaden the computational science community needed to meet the challenges of the new X-scale generation. I hope that you were able to attend the Computational Science Graduate Fellowship poster session last night. ASCR developed the fellowship in 1991 to meet the nation's growing need for scientists and technology professionals with advanced computer skills. CSGF, now jointly funded by ASCR and NNSA, is more than a traditional academic fellowship.
It has provided more than 200 of the best and brightest graduate students with guidance, support and community in preparing them as computational scientists. Today CSGF alumni are bringing their diverse top-level skills and knowledge to research teams at DOE laboratories and in industries such as Procter and Gamble, Lockheed Martin and Intel. At universities they are working to train the next generation of computational scientists. To build on this success, we intend to develop a wholly new Early Career Principal Investigator (ECPI) program. Our objective is to stimulate academic research in scientific areas within ASCR's purview, especially among faculty in the early stages of their academic careers. Last February, we lost Ken Kennedy, one of the leading lights of our community. As we move forward into the extreme computing generation, his vision and insight will be greatly missed. In memory of Ken Kennedy, we shall designate the ECPI grants to beginning faculty in Computer Science as the Ken Kennedy Fellowship. Watch the ASCR website for more information about ECPI and other early career programs in the computational sciences. We look to you, our scientists, researchers, and visionaries, to take X-scale computing and use it to explode scientific discovery in your fields. We at SciDAC will work to ensure that this tool is the sharpest and most precise and efficient instrument to carve away the unknown and reveal the most exciting secrets and stimulating scientific discoveries of our time. The partnership between research and computing is the marriage that will spur greater discovery, and as Spenser said to Susan in Robert Parker's novel, `Sudden Mischief', `We stick together long enough, and we may get as smart as hell'. Michael Strayer
GPU Implementation of High Rayleigh Number Three-Dimensional Mantle Convection
NASA Astrophysics Data System (ADS)
Sanchez, D. A.; Yuen, D. A.; Wright, G. B.; Barnett, G. A.
2010-12-01
Although we have entered the age of petascale computing, many factors are still prohibiting high-performance computing (HPC) from infiltrating all suitable scientific disciplines. For this reason and others, application of GPU to HPC is gaining traction in the scientific world. With its low price point, high performance potential, and competitive scalability, GPU has been an option well worth considering for the last few years. Moreover, with the advent of NVIDIA's Fermi architecture, which brings ECC memory, better double-precision performance, and more RAM to GPU, there is a strong message of corporate support for GPU in HPC. However, many doubts linger concerning the practicality of using GPU for scientific computing. In particular, GPU has a reputation for being difficult to program and suitable for only a small subset of problems. Although inroads have been made in addressing these concerns, for many scientists GPU still has hurdles to clear before becoming an acceptable choice. We explore the applicability of GPU to geophysics by implementing a three-dimensional, second-order finite-difference model of Rayleigh-Benard thermal convection on an NVIDIA GPU using C for CUDA. Our code reaches sufficient resolution, on the order of 500x500x250 evenly-spaced finite-difference gridpoints, on a single GPU. We make extensive use of highly optimized CUBLAS routines, allowing us to achieve performance on the order of 0.1 µs per timestep per gridpoint at this resolution. This performance has allowed us to study high Rayleigh number simulations, on the order of 2x10^7, on a single GPU.
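As a conceptual stand-in only (the paper's code is C for CUDA with CUBLAS), the NumPy sketch below performs one explicit second-order finite-difference diffusion step of a temperature field, the kind of per-gridpoint stencil update the convection model executes every timestep; the resolution and coefficients are arbitrary.

import numpy as np

nx = ny = nz = 64                       # far below the 500x500x250 resolution quoted above
T = np.random.default_rng(3).random((nx, ny, nz))
kappa, dt, dx = 1.0, 1.0e-5, 1.0 / nx

lap = (
    np.roll(T, 1, 0) + np.roll(T, -1, 0) +
    np.roll(T, 1, 1) + np.roll(T, -1, 1) +
    np.roll(T, 1, 2) + np.roll(T, -1, 2) - 6.0 * T
) / dx**2                               # second-order Laplacian with periodic wrap-around

T_new = T + dt * kappa * lap            # forward-Euler update of the diffusive part of the energy equation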
A Pervasive Parallel Processing Framework for Data Visualization and Analysis at Extreme Scale
DOE Office of Scientific and Technical Information (OSTI.GOV)
Moreland, Kenneth; Geveci, Berk
2014-11-01
The evolution of the computing world from teraflop to petaflop has been relatively effortless, with several of the existing programming models scaling effectively to the petascale. The migration to exascale, however, poses considerable challenges. All industry trends infer that the exascale machine will be built using processors containing hundreds to thousands of cores per chip. It can be inferred that efficient concurrency on exascale machines requires a massive amount of concurrent threads, each performing many operations on a localized piece of data. Currently, visualization libraries and applications are based off what is known as the visualization pipeline. In the pipeline model, algorithms are encapsulated as filters with inputs and outputs. These filters are connected by setting the output of one component to the input of another. Parallelism in the visualization pipeline is achieved by replicating the pipeline for each processing thread. This works well for today’s distributed memory parallel computers but cannot be sustained when operating on processors with thousands of cores. Our project investigates a new visualization framework designed to exhibit the pervasive parallelism necessary for extreme scale machines. Our framework achieves this by defining algorithms in terms of worklets, which are localized stateless operations. Worklets are atomic operations that execute when invoked unlike filters, which execute when a pipeline request occurs. The worklet design allows execution on a massive amount of lightweight threads with minimal overhead. Only with such fine-grained parallelism can we hope to fill the billions of threads we expect will be necessary for efficient computation on an exascale machine.
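A toy Python illustration of the worklet idea (not the project's framework) follows: the operation is stateless and per-element, so it can be mapped over the data by whatever threads are available instead of replicating a whole pipeline per thread.

import numpy as np
from concurrent.futures import ThreadPoolExecutor

def magnitude_worklet(vec):
    """Stateless worklet: the output depends only on the single input element."""
    return float(np.sqrt(vec @ vec))

vectors = np.random.default_rng(4).random((10_000, 3))

with ThreadPoolExecutor() as pool:                    # stand-in for many lightweight threads
    magnitudes = list(pool.map(magnitude_worklet, vectors))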
A Sexual Enhancement Program for Elderly Couples
ERIC Educational Resources Information Center
Rowland, Kay F.; Haynes, Stephen N.
1978-01-01
This study examined effects of a group sexual enhancement program for elderly couples. The three two-week phases, pretreatment with no therapist contact, education on human sexual functioning in aging people, and communication exercises-sexual techniques, were methods to improve communication and increase enjoyment of sexual contact. (Author)
2013-12-11
coupled relationship (i.e., married or unmarried couple) breastfeeding were nearly 3.8 times as likely as women who were not in a coupled relationship... support, negative life events, being unmarried, and certain maternal hospital experiences, which may lead to formula supplementation. Moreover, these... live births, without chronic illnesses that might adversely affect the fetus, and with at least two socioeconomic risk factors (e.g., unmarried
Bodenmann, Guy; Cina, Annette; Ledermann, Thomas; Sanders, Matthew R
2008-04-01
The aim of this randomized controlled trial was to evaluate the efficacy of an evidence-based parenting program (the Triple P-Positive Parenting Program), intending to improve parenting skills and children's well-being. Parents participating in a Group Triple P program (n=50 couples) were compared with parents of a non-treated control group (n=50 couples) and parents participating in a marital distress prevention program (couples coping enhancement training (CCET)) (n=50 couples). The two major goals of this study were (a) to evaluate the efficacy of Triple P compared with the two other treatment conditions over a time-span of 1 year and (b) to answer the question whether this program that was developed in Australia is culturally accepted by Swiss parents. Results revealed that Triple P was effective with Swiss families. Mothers of the Triple P group showed significant improvements in parenting, parenting self-esteem, and a decrease in stressors related to parenting. Women trained in Triple P also reported significantly lower rates of child's misbehavior than women of the two other conditions. However, in men only a few significant results were found. Positive effects of the relationship training (CCET) were somewhat lower than those for the Triple P. These findings are further discussed.
Combustion energy frontier research center (CEFRC) final report (August 1, 2009 – July 31, 2016)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Law, Chung
The Combustion Energy Frontier Research Center (CEFRC) was established to tackle the single overarching grand challenge of energy sustainability, energy security, and global warming: to develop a "validated, predictive, multi-scale, combustion modeling capability to optimize the design and operation of evolving fuels in advanced engines for transportation applications," as identified in the DOE report on "Basic Energy Needs for Clean and Efficient Combustion of 21st Century Transportation Fuels". The challenge is particularly daunting since energy conversion efficiencies and exhaust emissions are governed by coupled chemical and transport processes at multiple length scales, ranging from electron excitation to molecular rearrangements to nanoscale particulate formation to turbulent fuel/air mixing. To tackle this challenge, the CEFRC assembled a world-class team of 15 principal investigators, with the objectives to: 1) develop and test theoretical models to predict elementary reaction rates, molecule thermalization rates, chemical bond dissociation energies, and nonequilibrium transport properties using quantum chemistry calculations that account for strong electron correlation and multiple electronic potential energy surfaces; 2) develop automated kinetic mechanism generation, reduction, and error control methods for predicting alternative fuel (including biofuel) oxidation, fuel droplet decomposition, and NOx and particulate formation; 3) validate and improve the predictions of these models by measuring ignition delay times, species profiles, flame structures, burning limits, turbulence-kinetic coupling, and NOx and soot emissions at high pressures and near-limit conditions, by using advanced experimental diagnostic techniques including multiple laser techniques, molecular beam sampling and synchrotron photoionization, and by conducting the measurements in high-pressure shock tubes, jet-stirred and flow reactors, flame bombs, counterflow flames, and advanced-design rapid compression ignition instruments; 4) develop a suite of validated petascale high-fidelity simulation and modeling capabilities to understand and predict chemistry-turbulence-radiation coupling for new fuels in new regimes, including high-pressure, low-temperature combustion in advanced engine and turbine designs; and 5) establish a knowledge highway between researchers and engineers in academia, national laboratories, and industry to facilitate the dissemination and exchange of knowledge on national and international levels, and enrich the talent pool and capabilities of the next generation of combustion scientists and engineers. The technical activities of the CEFRC were conducted through three Disciplinary Working Groups – Chemistry Theory, Experiment and Mechanism, and Reacting Flows – which coordinated the Center's research on the development of combustion chemistry of Foundation Fuels (C0–C4 hydrocarbons), Alcohols, and Biodiesel through three corresponding Mechanism Thrust Groups. Such a two-dimensional, coordinated, and tightly interwoven research structure proved highly effective in ensuring the interplay between the development of the fundamentals of combustion science and the utilization of the various categories of fuels. The Center accomplished the above goals over the five-year period (August 1, 2009 – July 31, 2014) with appropriated funding, followed by two additional no-cost-extension (NCE) years.
The research results are documented in 230 journal publications, with six legacy review papers on the study of combustion chemistry using shock tubes, flow reactors, rapid compression machines, and flames, on uncertainty quantification, and on theoretical reaction dynamics and chemical modeling of combustion. A robust outreach program complemented these PI-led research activities, consisting of: 1) a roving post-doc program comprising a corps of Center-appointed, co- or multi-sponsored post-doctoral fellows with rotating assignments to conduct seed projects initiated by at least two PIs, in residence with these sponsoring PIs, to rapidly pursue new and high-risk, high-payoff interdisciplinary ideas; 2) an annual summer school on combustion, heavily attended (~200 participants) by senior graduate students and practicing researchers, covering advanced topics on chemical kinetics, fluid mechanics, turbulent combustion, engine combustion, new technologies, etc.; 3) a robust open web site providing Center and community information as well as the lecture videos and notes of the summer school; and 4) widely distributed biannual newsletters.
ERIC Educational Resources Information Center
Todahl, Jeff; Linville, Deanna; Tuttle Shamblin, Abby F.; Ball, David
2012-01-01
A handful of clinical trials have concluded that conjoint couples treatment for intimate partner violence is safe and at least as effective as conventional batterer intervention programs, yet very few researchers have explored couples' perspectives on conjoint treatment. Using qualitative narrative analysis methodology, the researchers conducted…
Assessing the Impact of a Multiyear Marriage Education Program
ERIC Educational Resources Information Center
O'Halloran, Mary Sean; Rizzolo, Sonja; Cohen, Marsha L.; Wacker, Robbyn
2013-01-01
This study measured marital satisfaction of low-income couples in a Western state following participation in the Building Healthy Marriages program, which aimed to educate couples and increase relationship satisfaction. The researchers' goals were the following: To determine the areas in which participants experienced the greatest number of…
Preparing Groups of Engaged Couples for Marriage.
ERIC Educational Resources Information Center
Rolfe, David J.
This paper outlines a program designed for preparing groups of engaged couples for marriage in circumstances where program time is limited to two afternoon sessions. Six topic areas are covered: Adjustments and Priorities; Communication Skills; Parenthood; Money Management; Religious Dimensions in Marriage; and Sexuality. The method used is one of…
Marriage and Fatherhood Programs
ERIC Educational Resources Information Center
Cowan, Phillip A.; Cowan, Carolyn Pape; Knox, Virginia
2010-01-01
To improve the quality and stability of couple and father-child relationships in fragile families, researchers are beginning to consider how to tailor existing couple-relationship and father-involvement interventions, which are now targeted on married couples, to the specific needs of unwed couples in fragile families. The goal, explain Philip…
Atomic detail visualization of photosynthetic membranes with GPU-accelerated ray tracing
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stone, John E.; Sener, Melih; Vandivort, Kirby L.
The cellular process responsible for providing energy for most life on Earth, namely, photosynthetic light-harvesting, requires the cooperation of hundreds of proteins across an organelle, involving length and time scales spanning several orders of magnitude over quantum and classical regimes. Simulation and visualization of this fundamental energy conversion process pose many unique methodological and computational challenges. In this paper, we present, in two accompanying movies, light-harvesting in the photosynthetic apparatus found in purple bacteria, the so-called chromatophore. The movies are the culmination of three decades of modeling efforts, featuring the collaboration of theoretical, experimental, and computational scientists. Finally, we describe the techniques that were used to build, simulate, analyze, and visualize the structures shown in the movies, and we highlight cases where scientific needs spurred the development of new parallel algorithms that efficiently harness GPU accelerators and petascale computers.
A Petascale Non-Hydrostatic Atmospheric Dynamical Core in the HOMME Framework
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tufo, Henry
The High-Order Method Modeling Environment (HOMME) is a framework for building scalable, conservative atmospheric models for climate simulation and general atmospheric-modeling applications. Its spatial discretizations are based on Spectral-Element (SE) and Discontinuous Galerkin (DG) methods. These are local methods employing high-order accurate spectral basis functions that have been shown to perform well on massively parallel supercomputers at any resolution and to scale particularly well at high resolutions. HOMME provides the framework upon which the CAM-SE community atmosphere model dynamical core is constructed. In its current incarnation, CAM-SE employs the hydrostatic primitive equations (PE) of motion, which limits its resolution to simulations coarser than 0.1° per grid cell. The primary objective of this project is to remove this resolution limitation by providing HOMME with the capabilities needed to build nonhydrostatic models that solve the compressible Euler/Navier-Stokes equations.
Two-Particle Dispersion in Isotropic Turbulent Flows
NASA Astrophysics Data System (ADS)
Salazar, Juan P. L. C.; Collins, Lance R.
2009-01-01
Two-particle dispersion is of central importance to a wide range of natural and industrial applications. It has been an active area of research since Richardson's (1926) seminal paper. This review emphasizes recent results from experiments, high-end direct numerical simulations, and modern theoretical discussions. Our approach is complementary to Sawford's (2001), whose review focused primarily on stochastic models of pair dispersion. We begin by reviewing the theoretical foundations of relative dispersion, followed by experimental and numerical findings for the dissipation subrange and inertial subrange. We discuss the findings in the context of the relevant theory for each regime. We conclude by providing a critical analysis of our current understanding and by suggesting paths toward further progress that take full advantage of exciting developments in modern experimental methods and peta-scale supercomputing.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Begoli, Edmon; Bates, Jack; Kistler, Derek E
The Polystore architecture revisits the federated approach to accessing and querying standalone, independent databases in a uniform and optimized fashion, but this time in the context of heterogeneous data and specialized analyses. In light of this architectural philosophy, and in light of the major data architecture development efforts at the U.S. Department of Veterans Affairs (VA), we discuss the need for a heterogeneous data store consisting of a large relational data warehouse, an image and text datastore, and a peta-scale genomic repository. The VA's heterogeneous datastore would, to a larger or smaller degree, follow the architectural blueprint proposed by the polystore architecture. To this end, we discuss the current state of the data architecture at the VA, architectural alternatives for development of the heterogeneous datastore, the anticipated challenges, and the drawbacks and benefits of adopting the polystore architecture.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Boman, Erik G.; Catalyurek, Umit V.; Chevalier, Cedric
2015-01-16
This final progress report summarizes the work accomplished at the Combinatorial Scientific Computing and Petascale Simulations Institute. We developed Zoltan, a parallel mesh partitioning library that made use of accurate hypergraph models to provide load balancing in mesh-based computations. We developed several graph coloring algorithms for computing Jacobian and Hessian matrices and organized them into a software package called ColPack. We developed parallel algorithms for graph coloring and graph matching problems, and also designed multi-scale graph algorithms. Three PhD students graduated, six more are continuing their PhD studies, and four postdoctoral scholars were advised. Six of these students and fellows have joined DOE laboratories (Sandia, Berkeley) as staff scientists or postdoctoral scientists. We also organized the SIAM Workshop on Combinatorial Scientific Computing (CSC) in 2007, 2009, and 2011 to continue to foster the CSC community.
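By way of illustration (a generic textbook kernel, not ColPack's implementation), greedy distance-1 coloring of the kind used for structured Jacobian and Hessian computation assigns each vertex the smallest color not already used by a colored neighbor:

```cpp
// Minimal sketch (not ColPack): greedy distance-1 graph coloring.
// No two adjacent vertices end up with the same color.
#include <cstdio>
#include <vector>

std::vector<int> greedy_color(const std::vector<std::vector<int>>& adj) {
    const int n = static_cast<int>(adj.size());
    std::vector<int> color(n, -1);
    std::vector<char> forbidden(n, 0);          // colors used by neighbors
    for (int v = 0; v < n; ++v) {
        for (int u : adj[v]) if (color[u] >= 0) forbidden[color[u]] = 1;
        int c = 0;
        while (forbidden[c]) ++c;                // smallest available color
        color[v] = c;
        for (int u : adj[v]) if (color[u] >= 0) forbidden[color[u]] = 0;
    }
    return color;
}

int main() {
    // A 4-cycle: two colors suffice.
    std::vector<std::vector<int>> adj = {{1, 3}, {0, 2}, {1, 3}, {0, 2}};
    auto color = greedy_color(adj);
    for (int v = 0; v < 4; ++v)
        std::printf("vertex %d -> color %d\n", v, color[v]);
    return 0;
}
```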
Hierarchical Petascale Simulation Framework for Stress Corrosion Cracking
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vashishta, Priya
2014-12-01
Reaction Dynamics in Energetic Materials: Detonation is a prototype of mechanochemistry, in which mechanically and thermally induced chemical reactions far from equilibrium exhibit vastly different behaviors. It is also one of the hardest multiscale physics problems, in which diverse length and time scales play important roles. The CACS group has performed multimillion-atom reactive MD simulations to reveal a novel two-stage reaction mechanism during the detonation of cyclotrimethylenetrinitramine (RDX) crystal. Rapid production of N2 and H2O within ~10 ps is followed by delayed production of CO molecules within ~1 ns. They found that further decomposition towards the final products is inhibited by the formation of large metastable C- and O-rich clusters with fractal geometry. The CACS group has also simulated the oxidation dynamics of close-packed aggregates of aluminum nanoparticles passivated by oxide shells. Their simulation results suggest an unexpectedly active role of the oxide shell as a nanoreactor.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dorier, Matthieu; Mubarak, Misbah; Ross, Rob
Two-tiered direct network topologies such as Dragonflies have been proposed for future post-petascale and exascale machines, since they provide a high-radix, low-diameter, fast interconnection network. Such topologies call for redesigning MPI collective communication algorithms in order to attain the best performance. Yet as more and more applications share a machine, it is not clear how these topology-aware algorithms will react to interference with concurrent jobs accessing the same network. In this paper, we study three topology-aware broadcast algorithms, including one of our own design. We evaluate their performance through event-driven simulation for small- and large-sized broadcasts (in terms of both data size and number of processes). We study the effect of different routing mechanisms on the topology-aware collective algorithms, as well as their sensitivity to network contention with other jobs. Our results show that while topology-aware algorithms dramatically reduce link utilization, their advantage in terms of latency is more limited.
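For context, the baseline such topology-aware collectives improve upon can be pictured as an ordinary binomial-tree broadcast built from point-to-point messages; the sketch below is a generic version with assumed message sizes, not one of the paper's three algorithms:

```cpp
// Minimal sketch (not the paper's algorithms): a plain binomial-tree
// broadcast built from MPI point-to-point calls. Topology-aware variants
// rearrange a tree like this one around the network topology.
#include <mpi.h>
#include <cstdio>
#include <vector>

void binomial_bcast(std::vector<double>& buf, int root, MPI_Comm comm) {
    int rank, size;
    MPI_Comm_rank(comm, &rank);
    MPI_Comm_size(comm, &size);
    const int count = static_cast<int>(buf.size());
    const int vrank = (rank - root + size) % size;   // rotate so the root is 0
    for (int mask = 1; mask < size; mask <<= 1) {
        if (vrank < mask) {                           // already has the data
            const int dst = vrank + mask;
            if (dst < size)
                MPI_Send(buf.data(), count, MPI_DOUBLE,
                         (dst + root) % size, 0, comm);
        } else if (vrank < (mask << 1)) {             // receives this round
            const int src = vrank - mask;
            MPI_Recv(buf.data(), count, MPI_DOUBLE,
                     (src + root) % size, 0, comm, MPI_STATUS_IGNORE);
        }
    }
}

int main(int argc, char** argv) {
    MPI_Init(&argc, &argv);
    int rank;
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    std::vector<double> buf(1024, rank == 0 ? 3.14 : 0.0);
    binomial_bcast(buf, 0, MPI_COMM_WORLD);
    std::printf("rank %d: buf[0] = %f\n", rank, buf[0]);
    MPI_Finalize();
    return 0;
}
```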
Ab initio results for intermediate-mass, open-shell nuclei
NASA Astrophysics Data System (ADS)
Baker, Robert B.; Dytrych, Tomas; Launey, Kristina D.; Draayer, Jerry P.
2017-01-01
A theoretical understanding of nuclei in the intermediate-mass region is vital to astrophysical models, especially for nucleosynthesis. Here, we employ the ab initio symmetry-adapted no-core shell model (SA-NCSM) in an effort to push first-principle calculations across the sd-shell region. The ab initio SA-NCSM's advantages come from its ability to control the growth of model spaces by including only physically relevant subspaces, which allows us to explore ultra-large model spaces beyond the reach of other methods. We report on calculations for 19Ne and 20Ne up through 13 harmonic oscillator shells using realistic interactions and discuss the underlying structure as well as implications for various astrophysical reactions. This work was supported by the U.S. NSF (OCI-0904874 and ACI-1516338) and the U.S. DOE (DE-SC0005248), and also benefitted from the Blue Waters sustained-petascale computing project and high performance computing resources provided by LSU.
Design Aspects of the Rayleigh Convection Code
NASA Astrophysics Data System (ADS)
Featherstone, N. A.
2017-12-01
Understanding the long-term generation of planetary or stellar magnetic field requires complementary knowledge of the large-scale fluid dynamics pervading large fractions of the object's interior. Such large-scale motions are sensitive to the system's geometry which, in planets and stars, is spherical to a good approximation. As a result, computational models designed to study such systems often solve the MHD equations in spherical geometry, frequently employing a spectral approach involving spherical harmonics. We present computational and user-interface design aspects of one such modeling tool, the Rayleigh convection code, which is suitable for deployment on desktop and petascale HPC architectures alike. In this poster, we will present an overview of this code's parallel design and its built-in diagnostics-output package. Rayleigh has been developed with NSF support through the Computational Infrastructure for Geodynamics and is expected to be released as open-source software in winter 2017/2018.
An Optimizing Compiler for Petascale I/O on Leadership-Class Architectures
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kandemir, Mahmut Taylan; Choudhary, Alok; Thakur, Rajeev
In high-performance computing (HPC), parallel I/O architectures usually have very complex hierarchies with multiple layers that collectively constitute an I/O stack, including high-level I/O libraries such as PnetCDF and HDF5, I/O middleware such as MPI-IO, and parallel file systems such as PVFS and Lustre. Our DOE project explored automated instrumentation and compiler support for I/O-intensive applications. The project made significant progress toward understanding the complex I/O hierarchies of high-performance storage systems (including storage caches, HDDs, and SSDs) and toward designing and implementing state-of-the-art compiler/runtime technology for I/O-intensive HPC applications that target leadership-class machines. This final report summarizes the major achievements of the project and also points out promising future directions. Two new sections in this report, relative to the previous report, cover IOGenie and SSD/NVM-specific optimizations.
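As an illustration of the MPI-IO middleware layer in that stack (a minimal sketch, not the project's compiler-generated code), each rank can write its own block of a shared file with a single collective call; the file name and block size below are hypothetical:

```cpp
// Minimal sketch: contiguous per-rank blocks written to one shared file
// through the MPI-IO middleware layer of the I/O stack.
#include <mpi.h>
#include <cstdio>
#include <vector>

int main(int argc, char** argv) {
    MPI_Init(&argc, &argv);
    int rank;
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);

    const int count = 1024;                       // doubles per rank (assumed)
    std::vector<double> block(count, static_cast<double>(rank));

    MPI_File fh;
    MPI_File_open(MPI_COMM_WORLD, "checkpoint.dat",
                  MPI_MODE_CREATE | MPI_MODE_WRONLY, MPI_INFO_NULL, &fh);
    // Collective write: rank r owns the byte range
    // [r*count, (r+1)*count) * sizeof(double).
    const MPI_Offset offset =
        static_cast<MPI_Offset>(rank) * count * sizeof(double);
    MPI_File_write_at_all(fh, offset, block.data(), count, MPI_DOUBLE,
                          MPI_STATUS_IGNORE);
    MPI_File_close(&fh);

    if (rank == 0) std::printf("wrote checkpoint.dat\n");
    MPI_Finalize();
    return 0;
}
```

Collective calls like this one are the layer where the higher-level libraries (PnetCDF, HDF5) and the lower-level file systems (PVFS, Lustre) meet, which is why they are a natural target for automated instrumentation.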
Extending Strong Scaling of Quantum Monte Carlo to the Exascale
NASA Astrophysics Data System (ADS)
Shulenburger, Luke; Baczewski, Andrew; Luo, Ye; Romero, Nichols; Kent, Paul
Quantum Monte Carlo is one of the most accurate and most computationally expensive methods for solving the electronic structure problem. In spite of its significant computational expense, its massively parallel nature is ideally suited to petascale computers, which have enabled a wide range of applications to relatively large molecular and extended systems. Exascale capabilities have the potential to enable the application of QMC to significantly larger systems, capturing much of the complexity of real materials such as defects and impurities. However, both memory and computational demands will require significant changes to current algorithms to realize this possibility. This talk will detail both the causes of the problem and potential solutions. Sandia National Laboratories is a multi-mission laboratory managed and operated by Sandia Corp., a wholly owned subsidiary of Lockheed Martin Corp., for the US Department of Energy's National Nuclear Security Administration under contract DE-AC04-94AL85000.
Scalable parallel distance field construction for large-scale applications
Yu, Hongfeng; Xie, Jinrong; Ma, Kwan-Liu; ...
2015-10-01
Computing distance fields is fundamental to many scientific and engineering applications. Distance fields can be used to direct analysis and reduce data. In this paper, we present a highly scalable method for computing 3D distance fields on massively parallel distributed-memory machines. A new distributed spatial data structure, named parallel distance tree, is introduced to manage the level sets of data and facilitate surface tracking over time, resulting in significantly reduced computation and communication costs for calculating the distance to the surface of interest from any spatial location. Our method supports several data types and distance metrics from real-world applications. We demonstrate its efficiency and scalability on state-of-the-art supercomputers using both large-scale volume datasets and surface models. We also demonstrate in-situ distance field computation on dynamic turbulent flame surfaces for a petascale combustion simulation. In conclusion, our work greatly extends the usability of distance fields for demanding applications.
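For reference, the quantity being computed can be sketched with a brute-force serial kernel (illustrative only; the paper's contribution is precisely the parallel distance tree that avoids this all-pairs cost):

```cpp
// Minimal serial sketch (not the parallel distance tree): each grid point of
// a 3D field stores its Euclidean distance to the nearest surface sample.
#include <algorithm>
#include <cmath>
#include <cstdio>
#include <limits>
#include <vector>

struct Pt { double x, y, z; };

std::vector<double> distance_field(int nx, int ny, int nz, double h,
                                   const std::vector<Pt>& surface) {
    std::vector<double> d(static_cast<size_t>(nx) * ny * nz);
    for (int k = 0; k < nz; ++k)
      for (int j = 0; j < ny; ++j)
        for (int i = 0; i < nx; ++i) {
            const Pt p{i * h, j * h, k * h};
            double best = std::numeric_limits<double>::max();
            for (const Pt& s : surface) {           // brute-force nearest point
                const double dx = p.x - s.x, dy = p.y - s.y, dz = p.z - s.z;
                best = std::min(best, dx * dx + dy * dy + dz * dz);
            }
            d[(static_cast<size_t>(k) * ny + j) * nx + i] = std::sqrt(best);
        }
    return d;
}

int main() {
    const std::vector<Pt> surface = {{0.5, 0.5, 0.5}};  // one surface sample
    const auto d = distance_field(8, 8, 8, 0.125, surface);
    std::printf("distance at origin: %f\n", d[0]);
    return 0;
}
```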
SWIFT: SPH With Inter-dependent Fine-grained Tasking
NASA Astrophysics Data System (ADS)
Schaller, Matthieu; Gonnet, Pedro; Chalk, Aidan B. G.; Draper, Peter W.
2018-05-01
SWIFT runs cosmological simulations on peta-scale machines, solving gravity and SPH. It uses the Fast Multipole Method (FMM) to calculate gravitational forces between nearby particles, combining these with long-range forces provided by a mesh that captures both the periodic nature of the calculation and the expansion of the simulated universe. SWIFT currently uses a single fixed but time-variable softening length for all the particles. Many useful external potentials are also available, such as galaxy haloes or stratified boxes that are used in idealised problems. SWIFT implements a standard ΛCDM cosmology background expansion and solves the equations in a comoving frame; the dark-energy equation of state evolves with scale factor. The structure of the code allows modified-gravity solvers or self-interacting dark matter schemes to be implemented. Many hydrodynamics schemes are implemented in SWIFT, and the software allows users to add their own.
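The softening length mentioned above enters the gravity calculation by regularizing the point-mass potential at small separations; one common choice in N-body codes (stated here as a generic example, not as SWIFT's exact kernel) is the Plummer form:

```latex
\Phi(r) \;=\; -\,\frac{G\,m}{\sqrt{r^{2} + \epsilon^{2}}}
```

where \(\epsilon\) is the (time-variable) softening length and the force on a particle follows from the gradient of \(\Phi\); as \(\epsilon \to 0\) the usual Newtonian point-mass potential is recovered.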
Systems, methods, and products for graphically illustrating and controlling a droplet actuator
NASA Technical Reports Server (NTRS)
Brafford, Keith R. (Inventor); Pamula, Vamsee K. (Inventor); Paik, Philip Y. (Inventor); Pollack, Michael G. (Inventor); Sturmer, Ryan A. (Inventor); Smith, Gregory F. (Inventor)
2010-01-01
Systems for controlling a droplet microactuator are provided. According to one embodiment, a system is provided and includes a controller, a droplet microactuator electronically coupled to the controller, and a display device displaying a user interface electronically coupled to the controller, wherein the system is programmed and configured to permit a user to effect a droplet manipulation by interacting with the user interface. According to another embodiment, a system is provided and includes a processor, a display device electronically coupled to the processor, and software loaded and/or stored in a storage device electronically coupled to the controller, a memory device electronically coupled to the controller, and/or the controller and programmed to display an interactive map of a droplet microactuator. According to yet another embodiment, a system is provided and includes a controller, a droplet microactuator electronically coupled to the controller, a display device displaying a user interface electronically coupled to the controller, and software for executing a protocol loaded and/or stored in a storage device electronically coupled to the controller, a memory device electronically coupled to the controller, and/or the controller.
Lambdin, Barrot; Kanweka, William; Inambao, Mubiana; Mwananyanda, Lawrence; Shah, Heena; Linton, Sabriya; Wong, Frank; Luisi, Nicole; Tichacek, Amanda; Kalowa, James; Chomba, Elwyn; Allen, Susan
2011-01-01
Couples in sub-Saharan Africa are the largest group in the world at risk for HIV infection. Couples counseling and testing programs have been shown to reduce HIV transmission, but such programs remain rare in Africa. Before couples counseling and testing can become the norm, it is essential to increase demand for the services. We evaluated the effectiveness of several promotional strategies during a two-year program in Kitwe and Ndola, Zambia. The program attracted more than 7,600 couples through the use of radio broadcasts, billboards, and other strategies. The most effective recruiting technique was the use of local residents trained as "influence agents" to reach out to friends, neighbors, and others in their sphere of influence. Of the estimated 2.5 million new cases of HIV in adults and children in 2009, more than two-thirds occurred in sub-Saharan Africa.[1] In Zambia, the prevalence of HIV among adults in urban and rural areas is estimated at 19 and 10 percent, respectively.[2] Most HIV transmission in sub-Saharan Africa is heterosexual and occurs between cohabiting partners with discordant HIV test results[3-5]; that is, only one partner is HIV-positive. Thus, most new cases of HIV occur when someone infects his or her heterosexual partner. Sub-Saharan African couples with discordant HIV test results are the world's largest risk group for HIV.[6] Approximately 20 percent of Zambian couples have discordant HIV results, a rate consistent with estimates from Uganda, Rwanda, Tanzania, and Kenya.[7-14] PMID:21821565
Interchannel Coupling in the Photoionization of Atoms and Ions in the X-Ray Range
NASA Technical Reports Server (NTRS)
Manson, Steven T.; Chakraborty, Himadri S.; Deshmukh, Pranawa C.
2002-01-01
To understand how this interchannel coupling, so important in neutral atoms, applies to positive ions, a research program has been initiated to deal with this question, i.e., a program to quantify the effects of interchannel coupling in ionic photoionization, thereby assessing existing photoionization data bases in the x-ray region. To accomplish this task, we have employed the Relativistic Random-Phase-Approximation (RRPA) methodology which includes significant aspects of electron-electron correlation, including interchannel coupling. The RRPA methodology has been found to produce excellent agreement with experiment for neutral Ne at photon energies in the 1 keV range.
NASA Astrophysics Data System (ADS)
Tang, William M., Dr.
2006-01-01
The second annual Scientific Discovery through Advanced Computing (SciDAC) Conference was held from June 25-29, 2006 at the new Hyatt Regency Hotel in Denver, Colorado. This conference showcased outstanding SciDAC-sponsored computational science results achieved during the past year across many scientific domains, with an emphasis on science at scale. Exciting computational science that has been accomplished outside of the SciDAC program both nationally and internationally was also featured to help foster communication between SciDAC computational scientists and those funded by other agencies. This was illustrated by many compelling examples of how domain scientists collaborated productively with applied mathematicians and computer scientists to effectively take advantage of terascale computers (capable of performing trillions of calculations per second) not only to accelerate progress in scientific discovery in a variety of fields but also to show great promise for being able to utilize the exciting petascale capabilities in the near future. The SciDAC program was originally conceived as an interdisciplinary computational science program based on the guiding principle that strong collaborative alliances between domain scientists, applied mathematicians, and computer scientists are vital to accelerated progress and associated discovery on the world's most challenging scientific problems. Associated verification and validation are essential in this successful program, which was funded by the US Department of Energy Office of Science (DOE OS) five years ago. As is made clear in many of the papers in these proceedings, SciDAC has fundamentally changed the way that computational science is now carried out in response to the exciting challenge of making the best use of the rapid progress in the emergence of more and more powerful computational platforms. In this regard, Dr. Raymond Orbach, Energy Undersecretary for Science at the DOE and Director of the OS has stated: `SciDAC has strengthened the role of high-end computing in furthering science. It is defining whole new fields for discovery.' (SciDAC Review, Spring 2006, p8). Application domains within the SciDAC 2006 conference agenda encompassed a broad range of science including: (i) the DOE core mission of energy research involving combustion studies relevant to fuel efficiency and pollution issues faced today and magnetic fusion investigations impacting prospects for future energy sources; (ii) fundamental explorations into the building blocks of matter, ranging from quantum chromodynamics - the basic theory that describes how quarks make up the protons and neutrons of all matter - to the design of modern high-energy accelerators; (iii) the formidable challenges of predicting and controlling the behavior of molecules in quantum chemistry and the complex biomolecules determining the evolution of biological systems; (iv) studies of exploding stars for insights into the nature of the universe; and (v) integrated climate modeling to enable realistic analysis of earth's changing climate. Associated research has made it quite clear that advanced computation is often the only means by which timely progress is feasible when dealing with these complex, multi-component physical, chemical, and biological systems operating over huge ranges of temporal and spatial scales. 
Working with the domain scientists, applied mathematicians and computer scientists have continued to develop the discretizations of the underlying equations and the complementary algorithms to enable improvements in solutions on modern parallel computing platforms as they evolve from the terascale toward the petascale regime. Moreover, the associated tremendous growth of data generated from the terabyte to the petabyte range demands not only the advanced data analysis and visualization methods to harvest the scientific information but also the development of efficient workflow strategies which can deal with the data input/output, management, movement, and storage challenges. If scientific discovery is expected to keep apace with the continuing progression from tera- to petascale platforms, the vital alliance between domain scientists, applied mathematicians, and computer scientists will be even more crucial. During the SciDAC 2006 Conference, some of the future challenges and opportunities in interdisciplinary computational science were emphasized in the Advanced Architectures Panel and by Dr. Victor Reis, Senior Advisor to the Secretary of Energy, who gave a featured presentation on `Simulation, Computation, and the Global Nuclear Energy Partnership.' Overall, the conference provided an excellent opportunity to highlight the rising importance of computational science in the scientific enterprise and to motivate future investment in this area. As Michael Strayer, SciDAC Program Director, has noted: `While SciDAC may have started out as a specific program, Scientific Discovery through Advanced Computing has become a powerful concept for addressing some of the biggest challenges facing our nation and our world.' Looking forward to next year, the SciDAC 2007 Conference will be held from June 24-28 at the Westin Copley Plaza in Boston, Massachusetts. Chairman: David Keyes, Columbia University. The Organizing Committee for the SciDAC 2006 Conference would like to acknowledge the individuals whose talents and efforts were essential to the success of the meeting. Special thanks go to Betsy Riley for her leadership in building the infrastructure support for the conference, for identifying and then obtaining contributions from our corporate sponsors, for coordinating all media communications, and for her efforts in organizing and preparing the conference proceedings for publication; to Tim Jones for handling the hotel scouting, subcontracts, and exhibits and stage production; to Angela Harris for handling supplies, shipping, and tracking, poster sessions set-up, and for her efforts in coordinating and scheduling the promotional activities that took place during the conference; to John Bui and John Smith for their superb wireless networking and A/V set-up and support; to Cindy Latham for Web site design, graphic design, and quality control of proceedings submissions; and to Pamelia Nixon-Hartje of Ambassador for budget and quality control of catering. We are grateful for the highly professional dedicated efforts of all of these individuals, who were the cornerstones of the SciDAC 2006 Conference. Thanks also go to Angela Beach of the ORNL Conference Center for her efforts in executing the contracts with the hotel, Carolyn James of Colorado State for on-site registration supervision, Lora Wolfe and Brittany Hagen for administrative support at ORNL, and Dami Rich and Andrew Sproles for graphic design and production. 
We are also most grateful to the Oak Ridge National Laboratory, especially Jeff Nichols, and to our corporate sponsors, Data Direct Networks, Cray, IBM, SGI, and Institute of Physics Publishing for their support. We especially express our gratitude to the featured speakers, invited oral speakers, invited poster presenters, session chairs, and advanced architecture panelists and chair for their excellent contributions on behalf of SciDAC 2006. We would like to express our deep appreciation to Lali Chatterjee, Graham Douglas, Margaret Smith, and the production team of Institute of Physics Publishing, who worked tirelessly to publish the final conference proceedings in a timely manner. Finally, heartfelt thanks are extended to Michael Strayer, Associate Director for OASCR and SciDAC Director, and to the DOE program managers associated with SciDAC for their continuing enthusiasm and strong support for the annual SciDAC Conferences as a special venue to showcase the exciting scientific discovery achievements enabled by the interdisciplinary collaborations championed by the SciDAC program.
NASA Astrophysics Data System (ADS)
Davini, Paolo; von Hardenberg, Jost; Corti, Susanna; Christensen, Hannah M.; Juricke, Stephan; Subramanian, Aneesh; Watson, Peter A. G.; Weisheimer, Antje; Palmer, Tim N.
2017-03-01
The Climate SPHINX (Stochastic Physics HIgh resolutioN eXperiments) project is a comprehensive set of ensemble simulations aimed at evaluating the sensitivity of present and future climate to model resolution and stochastic parameterisation. The EC-Earth Earth system model is used to explore the impact of stochastic physics in a large ensemble of 30-year climate integrations at five different atmospheric horizontal resolutions (from 125 up to 16 km). The project includes more than 120 simulations in both a historical scenario (1979-2008) and a climate change projection (2039-2068), together with coupled transient runs (1850-2100). A total of 20.4 million core hours have been used, made available from a single year grant from PRACE (the Partnership for Advanced Computing in Europe), and close to 1.5 PB of output data have been produced on SuperMUC IBM Petascale System at the Leibniz Supercomputing Centre (LRZ) in Garching, Germany. About 140 TB of post-processed data are stored on the CINECA supercomputing centre archives and are freely accessible to the community thanks to an EUDAT data pilot project. This paper presents the technical and scientific set-up of the experiments, including the details on the forcing used for the simulations performed, defining the SPHINX v1.0 protocol. In addition, an overview of preliminary results is given. An improvement in the simulation of Euro-Atlantic atmospheric blocking following resolution increase is observed. It is also shown that including stochastic parameterisation in the low-resolution runs helps to improve some aspects of the tropical climate - specifically the Madden-Julian Oscillation and the tropical rainfall variability. These findings show the importance of representing the impact of small-scale processes on the large-scale climate variability either explicitly (with high-resolution simulations) or stochastically (in low-resolution simulations).
Lagardère, Louis; Jolly, Luc-Henri; Lipparini, Filippo; Aviat, Félix; Stamm, Benjamin; Jing, Zhifeng F; Harger, Matthew; Torabifard, Hedieh; Cisneros, G Andrés; Schnieders, Michael J; Gresh, Nohad; Maday, Yvon; Ren, Pengyu Y; Ponder, Jay W; Piquemal, Jean-Philip
2018-01-28
We present Tinker-HP, a massively MPI-parallel package dedicated to classical molecular dynamics (MD) and to multiscale simulations, using advanced polarizable force fields (PFF) encompassing distributed multipole electrostatics. Tinker-HP is an evolution of the popular Tinker package that conserves its simplicity of use and its reference double-precision implementation for CPUs. Grounded in interdisciplinary efforts with applied mathematics, Tinker-HP allows for long polarizable MD simulations on large systems of up to millions of atoms. In the paper we detail the newly developed extension of massively parallel 3D spatial decomposition to point-dipole polarizable models, as well as its coupling to efficient Krylov iterative and non-iterative polarization solvers. The design of the code allows the use of various computer systems, ranging from laboratory workstations to modern petascale supercomputers with thousands of cores. Tinker-HP therefore provides the first high-performance, scalable CPU computing environment for the development of next-generation point-dipole PFFs and for production simulations. Strategies linking Tinker-HP to quantum mechanics (QM) in the framework of multiscale polarizable self-consistent QM/MD simulations are also provided. The capabilities, performance, and scalability of the software are demonstrated via benchmark calculations using the polarizable AMOEBA force field on systems ranging from large water boxes of increasing size and ionic liquids to (very) large biosystems encompassing several proteins as well as the complete satellite tobacco mosaic virus and ribosome structures. For small systems, Tinker-HP appears to be competitive with the Tinker-OpenMM GPU implementation of Tinker. As the system size grows, Tinker-HP remains operational thanks to its access to distributed memory and takes advantage of its new algorithms, enabling stable long-timescale polarizable simulations. Overall, a several thousand-fold acceleration over a single-core computation is observed for the largest systems. The extension of the present CPU implementation of Tinker-HP to other computational platforms is discussed.
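The 3D spatial decomposition can be illustrated with a generic MPI Cartesian-grid sketch (not Tinker-HP's actual code): ranks are arranged in a 3D grid and each one owns the atoms inside its sub-box of the periodic simulation cell.

```cpp
// Minimal sketch (not Tinker-HP): assign each MPI rank a sub-box of a
// periodic simulation cell via a 3D Cartesian communicator.
#include <mpi.h>
#include <cstdio>

int main(int argc, char** argv) {
    MPI_Init(&argc, &argv);
    int nranks;
    MPI_Comm_size(MPI_COMM_WORLD, &nranks);

    int dims[3] = {0, 0, 0};
    MPI_Dims_create(nranks, 3, dims);       // factor the ranks into a 3D grid
    int periods[3] = {1, 1, 1};             // periodic boundary conditions
    MPI_Comm cart;
    MPI_Cart_create(MPI_COMM_WORLD, 3, dims, periods, 1, &cart);

    int cart_rank, coords[3];
    MPI_Comm_rank(cart, &cart_rank);
    MPI_Cart_coords(cart, cart_rank, 3, coords);

    // Rank (i, j, k) owns the fraction [i/dims[0], (i+1)/dims[0]) of the box
    // in x, and similarly in y and z; neighbor exchanges handle atoms that
    // cross sub-box boundaries.
    std::printf("rank %d owns cell (%d, %d, %d) of (%d, %d, %d)\n",
                cart_rank, coords[0], coords[1], coords[2],
                dims[0], dims[1], dims[2]);

    MPI_Comm_free(&cart);
    MPI_Finalize();
    return 0;
}
```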
CAL3JHH: a Java program to calculate the vicinal coupling constants (3J H,H) of organic molecules.
Aguirre-Valderrama, Alonso; Dobado, José A
2008-12-01
Here, we present a free, web-accessible application, developed in the Java programming language, for the calculation of vicinal coupling constants (3J(H,H)) of organic molecules with the H-Csp3-Csp3-H fragment. This Java applet is oriented to assist chemists in structural and conformational analyses, allowing the user to calculate the averaged 3J(H,H) values among conformers according to their Boltzmann populations. The CAL3JHH program uses the Haasnoot-de Leeuw-Altona equation and, by reading the molecule geometry from a Protein Data Bank (PDB) file or from multiple PDB files, automatically detects all the coupled hydrogens, evaluating the data needed for this equation. Moreover, a "Graphical viewer" menu allows the display of the results on the 3D molecule structure, as well as the plotting of the Newman projection for the couplings.
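The conformer averaging mentioned above is the standard Boltzmann-weighted mean (written generically here; the program's exact bookkeeping may differ):

```latex
\left\langle {}^{3}J_{H,H} \right\rangle
  \;=\;
  \frac{\sum_{i} {}^{3}J_{H,H}^{(i)}\; e^{-E_{i}/k_{B}T}}
       {\sum_{i} e^{-E_{i}/k_{B}T}}
```

where the sum runs over conformers \(i\) with relative energies \(E_{i}\), and each per-conformer coupling \({}^{3}J_{H,H}^{(i)}\) is obtained from the dihedral-angle-dependent (Karplus-type) equation evaluated for that geometry.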
Equations of motion for coupled n-body systems
NASA Technical Reports Server (NTRS)
Frisch, H. P.
1980-01-01
Computer program, developed to analyze spacecraft attitude dynamics, can be applied to large class of problems involving objects that can be simplified into component parts. Systems of coupled rigid bodies, point masses, symmetric wheels, and elastically flexible bodies can be analyzed. Program derives complete set of non-linear equations of motion in vector-dyadic format. Numerical solutions may be printed out. Program is in FORTRAN IV for batch execution and has been implemented on IBM 360.
ERIC Educational Resources Information Center
Petch, Jemima F.; Halford, W. Kim; Creedy, Debra K.; Gamble, Jenny
2012-01-01
Objective: This study evaluated the effectiveness of couple relationship education in assisting couples to sustain relationship functioning and parenting sensitivity, and whether benefits were moderated by risk of maladjustment in the transition to parenthood ("risk"). Method: Two hundred fifty couples expecting their first child were assessed on…
Hybrid thermocouple development program
NASA Technical Reports Server (NTRS)
Garvey, L. P.; Krebs, T. R.; Lee, E.
1971-01-01
The design and development of a hybrid thermocouple, having a segmented SiGe-PbTe n-leg encapsulated within a hollow cylindrical p-SiGe leg, is described. Hybrid couple efficiency is calculated to be 10% to 15% better than that of an all-SiGe couple. A preliminary design of a planar RTG, employing hybrid couples and a water heat pipe radiator, is described as an example of a possible system application. Hybrid couples fabricated initially were characterized by higher-than-predicted resistance and, in some cases, bond separations. Couples made later in the program, using improved fabrication techniques, exhibited normal resistances, both as-fabricated and after 700 hours of testing. Two flat-plate sections of the reference design thermoelectric converter were fabricated and delivered to NASA Lewis for testing and evaluation.
Beach, Steven R H; Barton, Allen W; Lei, Man Kit; Brody, Gene H; Kogan, Steven M; Hurt, Tera R; Fincham, Frank D; Stanley, Scott M
2014-12-01
African American couples (n = 331) with children, 89% of whom were married, were assigned to either (a) a culturally sensitive couple- and parenting-enhancement program (ProSAAF) or (b) an information-only control condition in which couples received self-help materials. Husbands averaged 41 years of age and wives averaged 39 years. We found significant effects of program participation in the short term on couple communication, which was targeted by the intervention, as well as over the long term, on self-reported arguing in front of children. Long-term parenting outcomes were fully mediated by changes in communication for wives, but not for husbands. For husbands, positive change depended on amount of wife reported change. We conclude that wives' changes in communication from baseline to posttest may be more pivotal for the couples' long-term experience of decreased arguing in front of children than are husbands' changes, with wives' changes leading to changes in both partners' reports of arguments in front of children. © 2014 Family Process Institute.
Hernando, Victoria; del Romero, Jorge; García, Soledad; Rodríguez, Carmen; del Amo, Julia; Castilla, Jesús
2009-10-01
To assess the effect of an HIV counseling and testing program targeting steady heterosexual serodiscordant couples. We studied 564 couples who attended a sexually transmitted infections/HIV clinic in Madrid in the period 1989 to 2007 and participated in couples counseling and testing. Sociodemographic, epidemiologic, clinical, and behavioral information of both partners was obtained before testing the nonindex partner. Sexual practices reported in the first (preintervention) and second visit were compared, as well those reported in 4 additional visits. Among the 399 couples who returned for a second visit (71%), the median number of sexual risk practices in the previous 6 months decreased (26.9-0; P <0.001) and the percentage of couples who had not engaged in sexual risk behavior increased (46.1-66.7; P <0.001). This reduction was maintained by the 143 couples who had 4 return visits. The diagnosis of HIV-infection in the index case previous to entering the program was associated with a lower frequency of sexual risk behavior. Independent predictors of postintervention risky sexual behavior included preintervention sexual risk behavior (odds ratio [OR]: 2.8, 95% confidence interval: 1.7-4.4), index case aged over 35 (OR: 2.0, 1.2-3.3), and a recent pregnancy (OR: 3.1, 1.6-6.3). The incidence of HIV seroconversion was 3.9 per 1000 couple-years (1.4-9.7). The diagnosis of HIV-infection and counseling appears to provide complementary reductions in sexual risk behaviors among serodiscordant steady heterosexual couples at follow-up, but the risk of transmission was not totally eliminated.
NASA Astrophysics Data System (ADS)
Borne, K. D.
2009-12-01
The emergence of e-Science over the past decade as a paradigm for Internet-based science was an inevitable evolution of science that built upon the web protocols and access patterns that were prevalent at that time, including Web Services, XML-based information exchange, machine-to-machine communication, service registries, the Grid, and distributed data. We now see a major shift in web behavior patterns to social networks, user-provided content (e.g., tags and annotations), ubiquitous devices, user-centric experiences, and user-led activities. The inevitable accrual of these social networking patterns and protocols by scientists and science projects leads to U-Science as a new paradigm for online scientific research (i.e., ubiquitous, user-led, untethered, You-centered science). U-Science applications include components from semantic e-science (ontologies, taxonomies, folksonomies, tagging, annotations, and classification systems), which is much more than Web 2.0-based science (Wikis, blogs, and online environments like Second Life). Among the best examples of U-Science are Citizen Science projects, including Galaxy Zoo, Stardust@Home, Project Budburst, Volksdata, CoCoRaHS (the Community Collaborative Rain, Hail and Snow network), and projects utilizing Volunteer Geographic Information (VGI). There are also scientist-led projects for scientists that engage a wider community in building knowledge through user-provided content. Among the semantic-based U-Science projects for scientists are those that specifically enable user-based annotation of scientific results in databases. These include the Heliophysics Knowledgebase, BioDAS, WikiProteins, The Entity Describer, and eventually AstroDAS. Such collaborative tagging of scientific data addresses several petascale data challenges for scientists: how to find the most relevant data, how to reuse those data, how to integrate data from multiple sources, how to mine and discover new knowledge in large databases, how to represent and encode the new knowledge, and how to curate the discovered knowledge. This talk will address the emergence of U-Science as a type of Semantic e-Science, and will explore challenges, implementations, and results. Semantic e-Science and U-Science applications and concepts will be discussed within the context of one particular implementation (AstroDAS: Astronomy Distributed Annotation System) and its applicability to petascale science projects such as the LSST (Large Synoptic Survey Telescope), coming online within the next few years.
20 CFR 416.2020 - Federally administered supplementary payments.
Code of Federal Regulations, 2010 CFR
2010-04-01
... for couples) for each title in effect for January 1972: (1) Since a State with a title XVI program had... disabled), the couple (both of whom are aged, blind, or disabled). (2) Other States could supplement up to...) Aged Individual, (ii) Aged Couple, (iii) Blind Individual, (iv) Blind Couple, (v) Disabled Individual...
Coupled Oscillators: Interesting Experiments for High School Students
ERIC Educational Resources Information Center
Kodejška, C.; Lepil, O.; Sedlácková, H.
2018-01-01
This work deals with the experimental demonstration of coupled oscillators using simple tools in the form of mechanical coupled pendulums, magnetically coupled elastic strings, or electromagnetic oscillators. For the evaluation of results, the Vernier LabQuest data logger and video analysis in the Tracker program were used. In the first part of…
Simulated E-Bomb Effects on Electronically Equipped Targets
2009-09-01
coupling model program (CEMPAT), pursuing a feasible geometry of attack, practical antennas, best coupling approximations of ground conductivity and... procedure to determine these possible effects is to estimate the electromagnetic coupling from first principles and simulations using a coupling model... (report sections include "System of Interest Model as a Target" and "Shielding Methods")
Factors Associated with Involvement in Marriage Preparation Programs
ERIC Educational Resources Information Center
Duncan, Stephen F.; Holman, Thomas B.; Yang, Chongming
2007-01-01
Little is known empirically about the characteristics of couples who do and do not participate in marriage preparation. This study assessed the individual, couple, family, and sociocultural context variables that distinguish couples who become involved in marriage preparation from those who do not, using a sample of 7,331 couples. The results…
NASA Technical Reports Server (NTRS)
Omalley, T. A.; Connolly, D. J.
1977-01-01
The use of the coupled cavity traveling wave tube for space communications has led to an increased interest in improving the efficiency of the basic interaction process in these devices through velocity resynchronization and other methods. To analyze these methods, a flexible, large signal computer program for use on the IBM 360/67 time-sharing system has been developed. The present report is a users' manual for this program.
NASA Technical Reports Server (NTRS)
Cassarino, S.; Sopher, R.
1982-01-01
User instructions and software descriptions for the base program of the coupled rotor/airframe vibration analysis are provided. The functional capabilities and procedures for running the program are provided. Interfaces with external programs are discussed. The procedure for synthesizing a dynamic system and the various solution methods are described. Input data and output results are presented. Detailed information is provided on the program structure. Sample test case results for five representative dynamic configurations are provided and discussed. System responses are plotted to demonstrate the plotting capabilities available. Instructions to install and execute SIMVIB on the CDC computer system are provided.
Jones, Damon E; Feinberg, Mark E; Hostetler, Michelle L
2014-06-01
The transition to parenthood involves many stressors that can have implications for the couple relationship as well as the developmental environment of the child. Scholars and policymakers have recognized the potential for interventions that can help couples navigate these stressors to improve parenting and coparenting strategies. Such evidence-based programs are scarcely available, however, and little is known about the resources necessary to carry out these programs. This study examines the costs and resources necessary to implement Family Foundations, a program that addresses the multifaceted issues facing first-time parents through a series of pre- and post-natal classes. Costs were determined using a 6-step analytic process and are based on the first implementation of the program carried out through a five-year demonstration project. This assessment demonstrates how overall costs change across years as new cohorts of families are introduced, and how cost breakdowns differ by category as needs shift from training group leaders to sustaining program services. Information from this cost analysis helps clarify how the program could be made more efficient in subsequent implementations. We also consider how results may be used in future research examining economic benefits of participation in the program. Copyright © 2014 Elsevier Ltd. All rights reserved.
Jones, Damon E.; Feinberg, Mark E.; Hostetler, Michelle
2014-01-01
The transition to parenthood involves many stressors that can have implications for the couple relationship as well as the developmental environment of the child. Scholars and policymakers have recognized the potential for interventions that can help couples navigate these stressors to improve parenting and coparenting strategies. Such evidence-based programs are scarcely available, however, and little is known about the resources necessary to carry out these programs. This study examines the costs and resources necessary to implement Family Foundations, a program that addresses the multifaceted issues facing first-time parents through a series of pre- and post-natal classes. Costs were determined using a 6-step analytic process and are based on the first implementation of the program carried out through a five-year demonstration project. This assessment demonstrates how overall costs change across years as new cohorts of families are introduced, and how cost breakdowns differ by category as needs shift from training group leaders to sustaining program services. Information from this cost analysis helps clarify how the program could be made more efficient in subsequent implementations. We also consider how results may be used in future research examining economic benefits of participation in the program. PMID:24603052
Osuka, Yosuke; Jung, Songee; Kim, Taeho; Okubo, Yoshiro; Kim, Eunbi; Tanaka, Kiyoji
2017-07-31
Family support can help older adults better adhere to an exercise routine, but it remains unclear whether an exercise program targeting older married couples would have stronger effects on exercise adherence than would a program for individuals. The purpose of this study was to determine the effects of an exercise program on the exercise adherence of older married couples over a 24-week follow-up period. Thirty-four older married couples and 59 older adults participated in this study as couple and non-couple groups (CG and NCG, respectively). All participants attended an 8-week supervised program (once a week, plus a home-based exercise program comprising walking and strength exercises) and then participated in a follow-up measurement (24 weeks after the post-intervention measurement). Exercise adherence was prospectively measured via an exercise habituation diary during the follow-up period; specifically, we asked participants to record practice rates for walking (≥2 days/week) and strength exercises (≥6 items for 2 days/week). A multivariate logistic regression analysis was conducted to obtain the CG's odds ratios (ORs) and 95% confidence intervals (CIs) for adherence to walking and strength exercise, adjusted for potential confounders (with NCG as the reference). Although the adherence rate of walking exercise in the CG was significantly higher than that in the NCG (29.2%; P < 0.001), there was no significant difference in the adherence rate of strength exercise between the two groups (P = 0.199). The multivariate logistic regression analysis showed that the CG had significantly higher odds of adherence to walking exercise compared with the NCG (3.68 [1.57-8.60]). However, the odds of adherence to strength exercise did not significantly differ between the two groups (1.30 [0.52-3.26]). These results suggest that an exercise program targeting older married couples may be a useful strategy for maintaining walking adherence, even six months after the supervised program has ceased. A blinded randomized controlled trial will be needed to confirm this conclusion. Retrospectively registered. UMIN Clinical Trials Registry (Registered: 02/11/16) UMIN000024689.
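As context for the adjusted odds ratios reported above, here is a minimal sketch of how an unadjusted odds ratio and its Wald 95% confidence interval are computed from a 2x2 adherence table; the counts are hypothetical and are not taken from the study.

```python
import math

# Hypothetical 2x2 table: rows = group (couple, non-couple), columns = (adhered, did not adhere).
# These counts are illustrative only and do not come from the paper.
a, b = 40, 20   # couple group: adhered, not adhered
c, d = 25, 30   # non-couple group: adhered, not adhered

odds_ratio = (a * d) / (b * c)
# Wald 95% confidence interval computed on the log-odds scale.
se_log_or = math.sqrt(1/a + 1/b + 1/c + 1/d)
lo = math.exp(math.log(odds_ratio) - 1.96 * se_log_or)
hi = math.exp(math.log(odds_ratio) + 1.96 * se_log_or)
print(f"OR = {odds_ratio:.2f}, 95% CI [{lo:.2f}, {hi:.2f}]")
```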
ERIC Educational Resources Information Center
Shambleau, Krista M.
2010-01-01
Federally funded Healthy Marriage Initiative (HMI) programs provide marriage education as well as other services to low-income diverse individuals and couples at many points along the marital continuum with improving children's well-being as the overarching purpose. These programs need appropriate measures of healthy marriage for couples with…
Code of Federal Regulations, 2010 CFR
2010-04-01
... criteria for individuals currently eligible for SSI benefits. We consider an individual or couple currently... couple's current monthly income (that is, the income upon which the individual's or couple's eligibility...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chace, D.A.; Roberts, R.M.; Palmer, J.B.
WIPP Salado Hydrology Program Data Report #3 presents hydrologic data collected during permeability testing, coupled permeability and hydrofracture testing, and gas-threshold-pressure testing of the Salado Formation performed from November 1991 through October 1995. Fluid-pressure monitoring data representing August 1989 through May 1995 are also included. The report presents data from the drilling and testing of three boreholes associated with the permeability testing program, nine boreholes associated with the coupled permeability and hydrofracture testing program, and three boreholes associated with the gas-threshold-pressure testing program. The purpose of the permeability testing program was to provide data with which to interpret the disturbed and undisturbed permeability and pore pressure characteristics of the different Salado Formation lithologies. The purpose of the coupled permeability and hydrofracture testing program was to provide data with which to characterize the occurrence, propagation, and direction of pressure-induced fractures in the Salado Formation lithologies, especially MB139. The purpose of the gas-threshold-pressure testing program was to provide data with which to characterize the conditions under which pressurized gas displaces fluid in the brine-saturated Salado Formation lithologies. All of the holes were drilled from the WIPP underground facility 655 m below ground surface in the Salado Formation.
BSR: B-spline atomic R-matrix codes
NASA Astrophysics Data System (ADS)
Zatsarinny, Oleg
2006-02-01
BSR is a general program to calculate atomic continuum processes using the B-spline R-matrix method, including electron-atom and electron-ion scattering, and radiative processes such as bound-bound transitions, photoionization and polarizabilities. The calculations can be performed in LS-coupling or in an intermediate-coupling scheme by including terms of the Breit-Pauli Hamiltonian. New version program summary. Title of program: BSR. Catalogue identifier: ADWY. Program summary URL: http://cpc.cs.qub.ac.uk/summaries/ADWY. Program obtainable from: CPC Program Library, Queen's University of Belfast, N. Ireland. Computers on which the program has been tested: Microway Beowulf cluster; Compaq Beowulf cluster; DEC Alpha workstation; DELL PC. Operating systems under which the new version has been tested: UNIX, Windows XP. Programming language used: FORTRAN 95. Memory required to execute with typical data: typically 256-512 Mwords; since all the principal dimensions are allocatable, the available memory defines the maximum complexity of the problem. No. of bits in a word: 8. No. of processors used: 1. Has the code been vectorized or parallelized?: no. No. of lines in distributed program, including test data, etc.: 69 943. No. of bytes in distributed program, including test data, etc.: 746 450. Peripherals used: scratch disk store; permanent disk store. Distribution format: tar.gz. Nature of physical problem: this program uses the R-matrix method to calculate electron-atom and electron-ion collision processes, with options to calculate radiative data, photoionization, etc. The calculations can be performed in LS-coupling or in an intermediate-coupling scheme, with options to include Breit-Pauli terms in the Hamiltonian. Method of solution: the R-matrix method is used [P.G. Burke, K.A. Berrington, Atomic and Molecular Processes: An R-Matrix Approach, IOP Publishing, Bristol, 1993; P.G. Burke, W.D. Robb, Adv. At. Mol. Phys. 11 (1975) 143; K.A. Berrington, W.B. Eissner, P.H. Norrington, Comput. Phys. Comm. 92 (1995) 290].
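As a side note on the numerical machinery named here, the sketch below evaluates a clamped cubic B-spline basis on a radial grid with SciPy; the knot sequence and spline order are illustrative assumptions and are not the grids used by BSR itself.

```python
import numpy as np
from scipy.interpolate import BSpline

# Illustrative clamped knot sequence for cubic B-splines on [0, 10].
degree = 3
knots = np.concatenate(([0.0] * degree, np.linspace(0.0, 10.0, 11), [10.0] * degree))
n_basis = len(knots) - degree - 1   # number of B-spline basis functions

r = np.linspace(0.0, 10.0, 200)
# Evaluate each basis function B_i(r) by giving it a unit coefficient vector.
basis = np.array([BSpline(knots, np.eye(n_basis)[i], degree)(r) for i in range(n_basis)])
print(basis.shape)  # (n_basis, 200): one row per basis function on the radial grid
```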
Knowledge, attitude, and practice of reproductive behavior in Iranian minor thalassemia couples.
Kosaryan, Mehrnoosh; Vahidshahi, Koorosh; Siami, Rita; Nazari, Meisam; Karami, Hosein; Ehteshami, Sara
2009-06-01
To investigate the knowledge, attitude, and practice of reproductive behavior in Iranian minor thalassemia couples in Ghaemshahr City, Mazandaran, Iran. This is a cross-sectional descriptive survey conducted in 2006. Birth rates from 1997-2005 and the number of newly registered patients from at-risk couples were recorded. Tools for data collection were a valid questionnaire containing epidemiologic characteristics of couples, knowledge (20 questions), attitude (20 statements), and practice as assessed from the family file in health centers. Questionnaires were completed by husband and wife separately. Actual versus expected numbers of patients born in that period were compared. The data were analyzed using the Statistical Package for Social Science version 13.00, and p<0.05 was interpreted as significant. Of the 240 at-risk couples, 100 were studied. Of them, 82% had good knowledge of thalassemia, and 68.5% had a positive attitude toward the thalassemia prevention program. Correlations of knowledge with attitude were significant (p<0.001), and 50% of the couples had unfavorable practice, including unplanned pregnancy, fetal abortion without prenatal diagnosis (PND), delivery without PND, and having a child affected by thalassemia major (TM). Without PND, 4 TM patients were born. Ninety-eight episodes of unfavorable practice were reported. Meanwhile, the contraceptive method used by 12% of couples was unsafe. Suspected TM patients with no prevention program were 25; thus, the birth of 2 TM was prevented (92% reduction). We achieved great success during the last 9 years in the region, and the TM prevention program improved knowledge, attitude, and practice in high-risk couples and carrier families.
The Effect of the Family Training Program on Married Women's Couple-Burnout Levels
ERIC Educational Resources Information Center
Sirin, Hatice Deveci; Deniz, M. Engin
2016-01-01
This study aims to investigate the effect of Modules 2 and 3 of the Family Communication Section of the Family Training Program as prepared by the Ministry of Family and Social Policies on married women's couple-burnout levels. The study group consists of 40 married women in total: 20 constituting the experimental group and the remaining 20…
ERIC Educational Resources Information Center
Johnson, Matthew D.
2013-01-01
The author is gratified and encouraged that such an esteemed group of relationship scientists as Hawkins et al. (2013, this issue) want to continue the discussion of government-supported marriage and relationship education (MRE) programs for lower income couples by responding to his article (Johnson, May-June 2012). In their comment, they argued…
ERIC Educational Resources Information Center
Petty, Barbara D.
2010-01-01
Couples can improve their marriages by implementing relationship building skills they learn while participating in a marriage education program. This study addresses how marriages improved as a result of participating in the marriage education program, Married and Loving It![R] and what specific components of the learning experience facilitated…
ERIC Educational Resources Information Center
Hawkins, Alan J.; Stanley, Scott M.; Cowan, Philip A.; Fincham, Frank D.; Beach, Steven R. H.; Cowan, Carolyn Pape; Rhoades, Galena K.; Markman, Howard J.; Daire, Andrew P.
2013-01-01
In the past decade, the federal government, some states, and numerous communities have initiated programs to help couples form and sustain healthy marriages and relationships in order to increase family stability for children. Thus, the authors value the attention given to this emerging policy area by the "American Psychologist" in a recent…
Ngure, Kenneth; Vusha, Sophie; Mugo, Nelly; Emmanuel-Fabula, Mira; Ngutu, Mariah; Celum, Connie; Baeten, Jared M; Heffron, Renee
2016-12-01
In spite of access to behavioral and biomedical HIV prevention strategies, HIV transmission occurs. For HIV-serodiscordant couples, prevention programs can be tailored to address individual and couples' needs to preserve their relationship while minimizing HIV risk. Programs for serodiscordant couples may benefit from learning from experiences of couples who transmit HIV. We conducted 20 individual in-depth interviews with 10 initially HIV-serodiscordant couples who transmitted HIV during prospective follow-up at a peri-urban research site in Thika, Kenya. Data were analyzed inductively to identify situations that led to prevention failure and coping mechanisms. Inconsistent condom use driven by low HIV risk perception and alcohol use often preceded seroconversion while persistent blame frequently hindered couples' communication soon after seroconversion. In this emerging era of antiretroviral-based HIV prevention, couples' counseling can capitalize on opportunities to foster a supportive environment to discuss initiation and adherence to time-limited pre-exposure prophylaxis and lifelong antiretroviral therapy, in addition to strategies to reduce alcohol use, diffuse blame, and use condoms.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Marquez, Andres; Manzano Franco, Joseph B.; Song, Shuaiwen
With Exascale performance and its challenges in mind, one ubiquitous concern among architects is energy efficiency. Petascale systems projected to Exascale systems are unsustainable at current power consumption rates. One major contributor to system-wide power consumption is the number of memory operations leading to data movement and management techniques applied by the runtime system. To address this problem, we present the concept of the Architected Composite Data Types (ACDT) framework. The framework is made aware of data composites, assigning them a specific layout, transformations and operators. Data manipulation overhead is amortized over a larger number of elements, and program performance and power efficiency can be significantly improved. We developed the fundamentals of an ACDT framework on a massively multithreaded adaptive runtime system geared towards Exascale clusters. Showcasing the capability of ACDT, we exercised the framework with two representative processing kernels - Matrix Vector Multiply and the Cholesky Decomposition - applied to sparse matrices. As transformation modules, we applied optimized compress/decompress engines and configured invariant operators for maximum energy/performance efficiency. Additionally, we explored two different approaches based on transformation opaqueness in relation to the application. Under the first approach, the application is agnostic to compression and decompression activity. Such an approach entails minimal changes to the original application code, but leaves out potential application-specific optimizations. The second approach exposes the decompression process to the application, thereby exposing optimization opportunities that can only be exploited with application knowledge. The experimental results show that the two approaches have their strengths in HW and SW respectively, where the SW approach can yield performance and power improvements that are an order of magnitude better than ACDT-oblivious, hand-optimized implementations. We consider the ACDT runtime framework an important component of compute nodes that will lead towards power-efficient Exascale clusters.
Sanibel Symposium in the Petascale-Exascale Computational Era
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cheng, Hai-Ping
The 56th Sanibel Symposium was held February 14-19, 2016 at the King and Prince Hotel, St. Simons Island, GA. It successfully brought quantum chemists and chemical and condensed matter physicists together in presentations, posters, and informal discussions bridging those two communities. The Symposium has had a significant role in preparing generations of quantum theorists. As computational potency and algorithmic sophistication have grown, the Symposium has evolved to emphasize more heavily computationally oriented method development in chemistry and materials physics, including nanoscience, complex molecular phenomena, and even bio-molecular methods and problems. Given this context, the 56th Sanibel meeting systematically and deliberately had sessions focused on exascale computation. A selection of outstanding theoretical problems that need serious attention was included. Five invited sessions, two contributed sessions (hot topics), and a poster session were organized with the exascale theme. This was a historic milestone in the evolution of the Symposia. Just as years ago linear algebra, perturbation theory, density matrices, and band-structure methods dominated early Sanibel Symposia, the exascale sessions of the 56th meeting contributed a transformative influence to add structure and strength to the computational physical science community in an unprecedented way. A copy of the full program of the 56th Symposium is attached. The exascale sessions were Linear Scaling, Non-Adiabatic Dynamics, Interpretive Theory and Models, Computation, Software, and Algorithms, and Quantum Monte Carlo. The Symposium Proceedings will be published in Molecular Physics (2017). Note that the Sanibel proceedings from 2015 and 2014 were published as Molecular Physics vol. 114, issue 3-4 (2016) and vol. 113, issue 3-4 (2015) respectively.
Level-2 Milestone 3244: Deploy Dawn ID Machine for Initial Science Runs
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fox, D
2009-09-21
This report documents the delivery, installation, integration, testing, and acceptance of the Dawn system, ASC L2 milestone 3244: Deploy Dawn ID Machine for Initial Science Runs, due September 30, 2009. The full text of the milestone is included in Attachment 1. The description of the milestone is: This milestone will be a result of work started three years ago with the planning for a multi-petaFLOPS UQ-focused platform (Sequoia) and will be satisfied when a smaller ID version of the final system is delivered, installed, integrated, tested, accepted, and deployed at LLNL for initial science runs in support of the SSP mission. The deliverable for this milestone will be an LA petascale computing system (named Dawn) usable for code development and scaling necessary to ensure effective use of a final Sequoia platform (expected in 2011-2012), and for urgent SSP program needs. Allocation and scheduling of Dawn as an LA system will likely be performed informally, similar to what has been used for BlueGene/L. However, provision will be made to allow for dedicated access times for application scaling studies across the entire Dawn resource. The milestone was completed on April 1, 2009, when science runs began on the Dawn system. The following sections describe the Dawn system architecture, current status, installation and integration time line, and testing and acceptance process. A project plan is included as Attachment 2. Attachment 3 is a letter certifying the handoff of the system to a nuclear weapons stockpile customer. Attachment 4 presents the results of science runs completed on the system.
Is Communication a Mechanism of Relationship Education Effects among Rural African Americans?
Barton, Allen W; Beach, Steven R H; Lavner, Justin A; Bryant, Chalandra M; Kogan, Steven M; Brody, Gene H
2017-10-01
Enhancing communication as a means of promoting relationship quality has been increasingly questioned, particularly for couples at elevated sociodemographic risk. In response, the current study investigated communication change as a mechanism accounting for changes in relationship satisfaction and confidence among 344 rural, predominantly low-income African American couples with an early adolescent child who participated in a randomized controlled trial of the Protecting Strong African American Families (ProSAAF) program. Approximately 9 months after baseline assessment, intent-to-treat analyses indicated ProSAAF couples demonstrated improved communication, satisfaction, and confidence compared with couples in the control condition. Improvements in communication mediated ProSAAF effects on relationship satisfaction and confidence; conversely, neither satisfaction nor confidence mediated intervention effects on changes in communication. These results underscore the short-term efficacy of a communication-focused, culturally sensitive prevention program and suggest that communication is a possible mechanism of change in relationship quality among low-income African American couples.
Cognitive Restructuring and a Collaborative Set in Couples' Work.
ERIC Educational Resources Information Center
Huber, Charles H.; Milstein, Barbara
1985-01-01
Investigated effects of cognitive restructuring efforts to modify unrealistic beliefs of marital partners in 17 couples. Treatment program sought to impact proactively upon positive therapeutic expectations and relationship goals and enhanced base level of marital satisfaction. On all outcome measures, treatment group (N=9 couples) showed…
Laser-direct-drive program: Promise, challenge, and path forward
DOE Office of Scientific and Technical Information (OSTI.GOV)
Campbell, E. M.; Goncharov, V. N.; Sangster, T. C.
Along with laser-indirect (x-ray)-drive and magnetic-drive target concepts, laser direct drive is a viable approach to achieving ignition and gain with inertial confinement fusion. In the United States, a national program has been established to demonstrate and understand the physics of laser direct drive. The program utilizes the Omega Laser Facility to conduct implosion and coupling physics at the nominally 30-kJ scale and laser–plasma interaction and coupling physics at the MJ scale at the National Ignition Facility. This paper will discuss the motivation and challenges for laser direct drive and the broad-based program presently underway in the United States.
Computer program for determining rotational line intensity factors for diatomic molecules
NASA Technical Reports Server (NTRS)
Whiting, E. E.
1973-01-01
A FORTRAN IV computer program that provides a new research tool for determining reliable rotational line intensity factors (also known as Honl-London factors) for most electric and magnetic dipole allowed diatomic transitions is described in detail. This users' manual includes instructions for preparing the input data, a program listing, detailed flow charts, and three sample cases. The program is applicable to spin-allowed dipole transitions with either or both states intermediate between Hund's case (a) and Hund's case (b) coupling and to spin-forbidden dipole transitions with either or both states intermediate between Hund's case (c) and Hund's case (b) coupling.
Bifilar analysis users manual, volume 2
NASA Technical Reports Server (NTRS)
Cassarino, S. J.
1980-01-01
The digital computer program developed to study the vibration response of a coupled rotor/bifilar/airframe system is described. The theoretical development of the rotor/airframe system equations of motion is provided. The fuselage and bifilar absorber equations of motion are discussed. The modular block approach used in the make-up of this computer program is described. The input data needed to run the rotor and bifilar absorber analyses are described. Sample output formats are presented and discussed. The results for four test cases, which use the major logic paths of the computer program, are presented. The overall program structure is discussed in detail. The FORTRAN subroutines are described in detail.
Laser-direct-drive program: Promise, challenge, and path forward
Campbell, E. M.; Goncharov, V. N.; Sangster, T. C.; ...
2017-03-19
Along with laser-indirect (x-ray)-drive and magnetic-drive target concepts, laser direct drive is a viable approach to achieving ignition and gain with inertial confinement fusion. In the United States, a national program has been established to demonstrate and understand the physics of laser direct drive. The program utilizes the Omega Laser Facility to conduct implosion and coupling physics at the nominally 30-kJ scale and laser–plasma interaction and coupling physics at the MJ scale at the National Ignition Facility. This paper will discuss the motivation and challenges for laser direct drive and the broad-based program presently underway in the United States.
Predicting the Coupling Properties of Axially-Textured Materials.
Fuentes-Cobas, Luis E; Muñoz-Romero, Alejandro; Montero-Cabrera, María E; Fuentes-Montero, Luis; Fuentes-Montero, María E
2013-10-30
A description of methods and computer programs for the prediction of "coupling properties" in axially-textured polycrystals is presented. Starting data are the single-crystal properties, texture and stereography. The validity and proper protocols for applying the Voigt, Reuss and Hill approximations to estimate coupling properties effective values is analyzed. Working algorithms for predicting mentioned averages are given. Bunge's symmetrized spherical harmonics expansion of orientation distribution functions, inverse pole figures and (single and polycrystals) physical properties is applied in all stages of the proposed methodology. The established mathematical route has been systematized in a working computer program. The discussion of piezoelectricity in a representative textured ferro-piezoelectric ceramic illustrates the application of the proposed methodology. Polycrystal coupling properties, predicted by the suggested route, are fairly close to experimentally measured ones.
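The Voigt, Reuss and Hill estimates referred to above reduce, in the simplest scalar setting, to volume-fraction-weighted arithmetic and harmonic means and their midpoint; the sketch below illustrates this with hypothetical numbers rather than the full tensor averaging over the orientation distribution function performed in the paper.

```python
import numpy as np

# Hypothetical single-crystal property values for a few orientations and their fractions.
c = np.array([50.0, 80.0, 120.0])   # stiffness-like scalar property (e.g., GPa)
f = np.array([0.3, 0.5, 0.2])       # orientation/volume fractions (sum to 1)

voigt = np.sum(f * c)               # arithmetic (iso-strain) average: upper bound
reuss = 1.0 / np.sum(f / c)         # harmonic (iso-stress) average: lower bound
hill = 0.5 * (voigt + reuss)        # Hill average: midpoint of the two bounds

print(f"Voigt = {voigt:.1f}, Reuss = {reuss:.1f}, Hill = {hill:.1f}")
```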
Predicting the Coupling Properties of Axially-Textured Materials
Fuentes-Cobas, Luis E.; Muñoz-Romero, Alejandro; Montero-Cabrera, María E.; Fuentes-Montero, Luis; Fuentes-Montero, María E.
2013-01-01
A description of methods and computer programs for the prediction of “coupling properties” in axially-textured polycrystals is presented. Starting data are the single-crystal properties, texture and stereography. The validity and proper protocols for applying the Voigt, Reuss and Hill approximations to estimate coupling properties effective values is analyzed. Working algorithms for predicting mentioned averages are given. Bunge’s symmetrized spherical harmonics expansion of orientation distribution functions, inverse pole figures and (single and polycrystals) physical properties is applied in all stages of the proposed methodology. The established mathematical route has been systematized in a working computer program. The discussion of piezoelectricity in a representative textured ferro-piezoelectric ceramic illustrates the application of the proposed methodology. Polycrystal coupling properties, predicted by the suggested route, are fairly close to experimentally measured ones. PMID:28788370
Hanrath, Michael; Engels-Putzka, Anna
2010-08-14
In this paper, we present an efficient implementation of general tensor contractions, which is part of a new coupled-cluster program. The tensor contractions, used to evaluate the residuals in each coupled-cluster iteration, are particularly important for the performance of the program. We developed a generic procedure, which carries out contractions of two tensors irrespective of their explicit structure. It can handle coupled-cluster-type expressions of arbitrary excitation level. To make the contraction efficient without losing flexibility, we use a three-step procedure. First, the data contained in the tensors are rearranged into matrices, then a matrix-matrix multiplication is performed, and finally the result is backtransformed to a tensor. The current implementation is significantly more efficient than previous ones capable of treating arbitrarily high excitations.
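The three-step contraction described above (rearrange into matrices, multiply, backtransform) can be illustrated with NumPy; the tensor shapes and index labels below are illustrative assumptions and are not those of the authors' program.

```python
import numpy as np

# Contract T1[a, i, b, j] with T2[b, j, c, k] over the shared indices (b, j),
# i.e., R[a, i, c, k] = sum_{b, j} T1[a, i, b, j] * T2[b, j, c, k].
a, i, b, j, c, k = 4, 3, 5, 2, 6, 3          # illustrative dimensions
T1 = np.random.rand(a, i, b, j)
T2 = np.random.rand(b, j, c, k)

# Step 1: rearrange the tensors into matrices (free indices x contracted indices).
M1 = T1.reshape(a * i, b * j)
M2 = T2.reshape(b * j, c * k)

# Step 2: a single matrix-matrix multiplication performs the contraction.
M = M1 @ M2

# Step 3: backtransform the result matrix into a tensor.
R = M.reshape(a, i, c, k)

# Cross-check against a direct einsum contraction.
assert np.allclose(R, np.einsum('aibj,bjck->aick', T1, T2))
```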
Coupled rotor/airframe vibration analysis
NASA Technical Reports Server (NTRS)
Sopher, R.; Studwell, R. E.; Cassarino, S.; Kottapalli, S. B. R.
1982-01-01
A coupled rotor/airframe vibration analysis developed as a design tool for predicting helicopter vibrations and a research tool to quantify the effects of structural properties, aerodynamic interactions, and vibration reduction devices on vehicle vibration levels is described. The analysis consists of a base program utilizing an impedance matching technique to represent the coupled rotor/airframe dynamics of the system, supported by inputs from several external programs supplying sophisticated rotor and airframe aerodynamic and structural dynamic representations. The theoretical background, computer program capabilities, and limited correlation results are presented in this report. Correlation with scale-model wind tunnel results shows that the analysis can adequately predict trends of vibration variations with airspeed and higher harmonic control effects. Predictions of absolute values of vibration levels were found to be very sensitive to modal characteristics, and results were not representative of measured values.
Orbital Maneuvering Engine Feed System Coupled Stability Investigation, Computer User's Manual
NASA Technical Reports Server (NTRS)
Schuman, M. D.; Fertig, K. W.; Hunting, J. K.; Kahn, D. R.
1975-01-01
An operating manual for the feed-system coupled stability model is given, in partial fulfillment of a program designed to develop, verify, and document a digital computer model that can be used to analyze and predict engine/feed system coupled instabilities in pressure-fed storable-propellant propulsion systems over a frequency range of 10 to 1,000 Hz. The first section describes the analytical approach to modeling the feed system hydrodynamics, combustion dynamics, chamber dynamics, and overall engineering model structure, and presents the governing equations in each of the technical areas. This is followed by the program user's guide, which is a complete description of the structure and operation of the computerized model. Finally, appendices provide an alphabetized FORTRAN symbol table, detailed program logic diagrams, computer code listings, and sample case input and output data listings.
Accelerating scientific discovery : 2007 annual report.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Beckman, P.; Dave, P.; Drugan, C.
2008-11-14
As a gateway for scientific discovery, the Argonne Leadership Computing Facility (ALCF) works hand in hand with the world's best computational scientists to advance research in a diverse span of scientific domains, ranging from chemistry, applied mathematics, and materials science to engineering physics and life sciences. Sponsored by the U.S. Department of Energy's (DOE) Office of Science, researchers are using the IBM Blue Gene/L supercomputer at the ALCF to study and explore key scientific problems that underlie important challenges facing our society. For instance, a research team at the University of California-San Diego/SDSC is studying the molecular basis of Parkinson's disease. The researchers plan to use the knowledge they gain to discover new drugs to treat the disease and to identify risk factors for other diseases that are equally prevalent. Likewise, scientists from Pratt & Whitney are using the Blue Gene to understand the complex processes within aircraft engines. Expanding our understanding of jet engine combustors is the secret to improved fuel efficiency and reduced emissions. Lessons learned from the scientific simulations of jet engine combustors have already led Pratt & Whitney to newer designs with unprecedented reductions in emissions, noise, and cost of ownership. ALCF staff members provide in-depth expertise and assistance to those using the Blue Gene/L and optimizing user applications. Both the Catalyst and Applications Performance Engineering and Data Analytics (APEDA) teams support the users' projects. In addition to working with scientists running experiments on the Blue Gene/L, we have become a nexus for the broader global community. In partnership with the Mathematics and Computer Science Division at Argonne National Laboratory, we have created an environment where the world's most challenging computational science problems can be addressed. Our expertise in high-end scientific computing enables us to provide guidance for applications that are transitioning to petascale as well as to produce software that facilitates their development, such as the MPICH library, which provides a portable and efficient implementation of the MPI standard--the prevalent programming model for large-scale scientific applications--and the PETSc toolkit, which provides a programming paradigm that eases the development of many scientific applications on high-end computers.
ERIC Educational Resources Information Center
Gubits, Daniel; Lowenstein, Amy E.; Harris, Jorgen; Hsueh, JoAnn
2014-01-01
The Supporting Healthy Marriage (SHM) evaluation was launched in 2003 to test the effectiveness of a skills-based relationship education program designed to help low-and modest-income married couples strengthen their relationships and to support more stable and more nurturing home environments and more positive outcomes for parents and their…
Marriage education for clinicians.
Wetzler, Scott; Frame, Laura; Litzinger, Samantha
2011-01-01
The field of marriage education has come to be dominated by nonprofessionals with no clinical training because clinicians interested in relationships typically provide marital therapy to couples in distress rather than marriage education to healthy couples. In this paper, we encourage clinicians to participate in the development of marriage education programs, such as that described by our Supporting Healthy Marriage program, which serves a large number of low-income couples, and propose a psychological conceptual framework for delivering marriage education services. It makes sense for clinicians to consider using this novel approach given the opportunity to impact such a large segment of society that might not receive psychological services.
Profiles of dyadic adjustment for advanced prostate cancer to inform couple-based intervention.
Elliott, Kate-Ellen J; Scott, Jennifer L; Monsour, Michael; Nuwayhid, Fadi
2015-01-01
The purpose of the study is to describe, from a relational perspective, partners' psychological adjustment, coping, and support needs for advanced prostate cancer. A mixed methods design was adopted, employing triangulation of qualitative and quantitative data, to produce dyadic profiles of adjustment for six couples recruited from the urology clinics of local hospitals in Tasmania, Australia. Dyads completed a video-taped communication task, semi-structured interview, and standardised self-report questionnaires. Themes identified were associated with the dyadic challenges of the disease experience (e.g. relationship intimacy, disease progression and carer burden). Couples with poor psychological adjustment profiles had both clinical and global locus of distress, treatment side-effects, carer burden and poor general health. Resilient couples demonstrated relationship closeness and adaptive cognitive and behavioural coping strategies. The themes informed the adaptation of an effective program for couples coping with women's cancers (CanCOPE) to create a program for couples facing advanced prostate cancer (ProCOPE-Adv). Mixed method results inform the development of psychological therapy components for couples coping with advanced prostate cancer. The concomitance of co-morbid health problems may have implications for access and engagement for older adult populations in face-to-face intervention.
Code of Federal Regulations, 2011 CFR
2011-07-01
... receiving property of retarder and car coupling noise. 201.26 Section 201.26 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) NOISE ABATEMENT PROGRAMS NOISE EMISSION STANDARDS FOR TRANSPORTATION... receiving property of retarder and car coupling noise. (a) Retarders—(1) Microphone. The microphone must be...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Guo, Yi; Errichello, Robert
2013-08-29
An analytical model is developed to evaluate the design of a spline coupling. For a given torque and shaft misalignment, the model calculates the number of teeth in contact, tooth loads, stiffnesses, stresses, and safety factors. The analytic model provides essential spline coupling design and modeling information and could be easily integrated into gearbox design and simulation tools.
The Effectiveness of PREP with Lower-Income Racial/Ethnic Minority Couples
ERIC Educational Resources Information Center
Owen, Jesse; Quirk, Kelley; Bergen, Carrie; Inch, Leslie J.; France, Tiffany
2012-01-01
The current study examined the effectiveness of the Prevention and Relationship Enhancement Program (PREP) with lower-income and racial/ethnic minority (African American and Latino/a) couples. Additionally, we tested whether relationship outcomes varied based on the delivery format (i.e., group format vs. couple format). The sample included 321…
Fredman, Steffany J; Le, Yunying; Marshall, Amy D; Brick, Timothy R; Feinberg, Mark E
2017-06-01
Posttraumatic stress disorder (PTSD) symptoms are associated with disruptions in both couple functioning and parenting, and limited research suggests that, among military couples, perceptions of couple functioning and parenting stress are a function of both one's own and one's partner's mental health symptoms. However, this work has not been generalized to civilian couples, and little is known about the associations between PTSD symptoms and family adjustment in specific family developmental contexts. We examined PTSD symptoms' associations with perceived couple functioning and parenting stress within a dyadic context in civilian couples who had participated in a randomized controlled trial of a universal, couple-based transition to parenthood program and at least one member of the couple reported having experienced a Criterion A1 traumatic event. Results of actor-partner interdependence models revealed that parents' own and partners' PTSD symptoms were negatively associated with perceived couple functioning; contrary to expectation, the association of partners' PTSD symptoms with perceived couple functioning was strongest among men who received the intervention. A parent's own PTSD symptoms were positively associated with parenting stress for both men and women and were unexpectedly strongest for men who received the intervention. Partner PTSD symptoms were also positively associated with increased parenting stress for both men and women. Findings support a dyadic conceptualization of the associations between spouses' PTSD symptoms and family outcomes during the transition to parenthood and suggest that participating in a couple-based, psychoeducational program during this phase in the family life cycle may be particularly salient for men.
Fredman, Steffany J.; Le, Yunying; Marshall, Amy D.; Brick, Timothy R.; Feinberg, Mark E.
2017-01-01
Posttraumatic stress disorder (PTSD) symptoms are associated with disruptions in both couple functioning and parenting, and limited research suggests that, among military couples, perceptions of couple functioning and parenting stress are a function of both one’s own and one’s partner’s mental health symptoms. However, this work has not been generalized to civilian couples, and little is known about the associations between PTSD symptoms and family adjustment in specific family developmental contexts. We examined PTSD symptoms’ associations with perceived couple functioning and parenting stress within a dyadic context in civilian couples who had participated in a randomized controlled trial of a universal, couple-based transition to parenthood program and at least one member of the couple reported having experienced a Criterion A1 traumatic event. Results of actor-partner interdependence models revealed that parents’ own and partners’ PTSD symptoms were negatively associated with perceived couple functioning; contrary to expectation, the association of partners’ PTSD symptoms with perceived couple functioning was strongest among men who received the intervention. A parent’s own PTSD symptoms were positively associated with parenting stress for both men and women and were unexpectedly strongest for men who received the intervention. Partner PTSD symptoms were also positively associated with increased parenting stress for both men and women. Findings support a dyadic conceptualization of the associations between spouses’ PTSD symptoms and family outcomes during the transition to parenthood and suggest that participating in a couple-based, psychoeducational program during this phase in the family life cycle may be particularly salient for men. PMID:29104817
NASA Astrophysics Data System (ADS)
Chernyavskiy, Andrey; Khamitov, Kamil; Teplov, Alexey; Voevodin, Vadim; Voevodin, Vladimir
2016-10-01
In recent years, quantum information technologies (QIT) have shown great development, although the implementation of QIT faces serious difficulties, some of which are challenging computational tasks. This work is devoted to a deep and broad analysis of the parallel algorithmic properties of such tasks. As an example we take one- and two-qubit transformations of a many-qubit quantum state, which are the most critical kernels of many important QIT applications. The analysis of the algorithms uses the methodology of the AlgoWiki project (algowiki-project.org) and consists of two parts: theoretical and experimental. The theoretical part includes features such as sequential and parallel complexity, macro structure, and a visual information graph. The experimental part was carried out on the petascale Lomonosov supercomputer (Moscow State University, Russia) and includes the analysis of locality and memory access, scalability, and a set of more specific dynamic characteristics of the realization. This approach allowed us to identify bottlenecks and generate ideas for efficiency improvement.
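As a minimal illustration of the one-qubit kernel analyzed here, the sketch below applies a 2x2 gate to one qubit of an n-qubit state vector by reshaping; it is not the authors' implementation, and the little-endian qubit ordering is an assumption.

```python
import numpy as np

def apply_single_qubit_gate(state, gate, target, n_qubits):
    """Apply a 2x2 gate to qubit `target` of an n-qubit state vector."""
    psi = state.reshape([2] * n_qubits)
    # Assumption: qubit 0 is the least significant bit of the flat index.
    axis = n_qubits - 1 - target
    psi = np.moveaxis(psi, axis, 0)                 # bring the target qubit's axis to the front
    psi = np.tensordot(gate, psi, axes=([1], [0]))  # 2x2 matrix acting on that axis
    psi = np.moveaxis(psi, 0, axis)                 # restore the original axis order
    return psi.reshape(-1)

n = 20
state = np.zeros(2 ** n, dtype=complex)
state[0] = 1.0                                      # |00...0>
hadamard = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
state = apply_single_qubit_gate(state, hadamard, target=0, n_qubits=n)
```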
Stochastic simulation of uranium migration at the Hanford 300 Area.
Hammond, Glenn E; Lichtner, Peter C; Rockhold, Mark L
2011-03-01
This work focuses on the quantification of groundwater flow and subsequent U(VI) transport uncertainty due to heterogeneity in the sediment permeability at the Hanford 300 Area. U(VI) migration at the site is simulated with multiple realizations of stochastically-generated high resolution permeability fields and comparisons are made of cumulative water and U(VI) flux to the Columbia River. The massively parallel reactive flow and transport code PFLOTRAN is employed utilizing 40,960 processor cores on DOE's petascale Jaguar supercomputer to simultaneously execute 10 transient, variably-saturated groundwater flow and U(VI) transport simulations within 3D heterogeneous permeability fields using the code's multi-realization simulation capability. Simulation results demonstrate that the cumulative U(VI) flux to the Columbia River is less responsive to fine scale heterogeneity in permeability and more sensitive to the distribution of permeability within the river hyporheic zone and mean permeability of larger-scale geologic structures at the site. Copyright © 2010 Elsevier B.V. All rights reserved.
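A minimal sketch of the multi-realization idea follows: it draws several uncorrelated log-normal permeability fields, whereas the actual study used geostatistically generated, high-resolution fields conditioned to site data; all parameter values below are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(42)
n_realizations, nx, ny, nz = 10, 40, 40, 20        # illustrative ensemble size and grid

log10_k_mean, log10_k_std = -12.0, 0.5             # hypothetical mean/std of log10 permeability (m^2)
fields = 10.0 ** rng.normal(log10_k_mean, log10_k_std,
                            size=(n_realizations, nx, ny, nz))

# Each realization would drive one flow/transport simulation; here we only summarize the ensemble.
print(fields.shape, fields.mean(), fields.min(), fields.max())
```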
Developing Discontinuous Galerkin Methods for Solving Multiphysics Problems in General Relativity
NASA Astrophysics Data System (ADS)
Kidder, Lawrence; Field, Scott; Teukolsky, Saul; Foucart, Francois; SXS Collaboration
2016-03-01
Multi-messenger observations of the mergers of black hole-neutron star and neutron star-neutron star binaries, and of supernova explosions, will probe fundamental physics inaccessible to terrestrial experiments. Modeling these systems requires a relativistic treatment of hydrodynamics, including magnetic fields, as well as neutrino transport and nuclear reactions. The accuracy, efficiency, and robustness of current codes that treat all of these problems are not sufficient to keep up with the observational needs. We are building a new numerical code that uses the Discontinuous Galerkin method with a task-based parallelization strategy, a promising combination that will allow multiphysics applications to be treated both accurately and efficiently on petascale and exascale machines. The code will scale to more than 100,000 cores for efficient exploration of the parameter space of potential sources and allowed physics, and the high-fidelity predictions needed to realize the promise of multi-messenger astronomy. I will discuss the current status of the development of this new code.
Analyzing checkpointing trends for applications on the IBM Blue Gene/P system.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Naik, H.; Gupta, R.; Beckman, P.
Current petascale systems have tens of thousands of hardware components and complex system software stacks, which increase the probability of faults occurring during the lifetime of a process. Checkpointing has been a popular method of providing fault tolerance in high-end systems. While considerable research has been done to optimize checkpointing, in practice the method still involves a high-cost overhead for users. In this paper, we study the checkpointing overhead seen by applications running on leadership-class machines such as the IBM Blue Gene/P at Argonne National Laboratory. We study various applications and design a methodology to assist users in understanding and choosing checkpointing frequency and reducing the overhead incurred. In particular, we study three popular applications -- the Grid-Based Projector-Augmented Wave application, the Carr-Parrinello Molecular Dynamics application, and a Nek5000 computational fluid dynamics application -- and analyze their memory usage and possible checkpointing trends on 32,768 processors of the Blue Gene/P system.
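A common first-order guide for the checkpoint-frequency question studied here is Young's approximation, sketched below; the checkpoint cost and mean time between failures are hypothetical values, not measurements from the Blue Gene/P study.

```python
import math

def optimal_checkpoint_interval(checkpoint_cost_s, mtbf_s):
    """Young's first-order approximation: tau ~ sqrt(2 * C * MTBF)."""
    return math.sqrt(2.0 * checkpoint_cost_s * mtbf_s)

checkpoint_cost = 600.0          # hypothetical: 10 minutes to write one checkpoint
mtbf = 24 * 3600.0               # hypothetical: one failure per day across the partition
tau = optimal_checkpoint_interval(checkpoint_cost, mtbf)
print(f"Checkpoint roughly every {tau / 3600:.1f} hours")
```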
Extremely Scalable Spiking Neuronal Network Simulation Code: From Laptops to Exascale Computers.
Jordan, Jakob; Ippen, Tammo; Helias, Moritz; Kitayama, Itaru; Sato, Mitsuhisa; Igarashi, Jun; Diesmann, Markus; Kunkel, Susanne
2018-01-01
State-of-the-art software tools for neuronal network simulations scale to the largest computing systems available today and enable investigations of large-scale networks of up to 10 % of the human cortex at a resolution of individual neurons and synapses. Due to an upper limit on the number of incoming connections of a single neuron, network connectivity becomes extremely sparse at this scale. To manage computational costs, simulation software ultimately targeting the brain scale needs to fully exploit this sparsity. Here we present a two-tier connection infrastructure and a framework for directed communication among compute nodes accounting for the sparsity of brain-scale networks. We demonstrate the feasibility of this approach by implementing the technology in the NEST simulation code and we investigate its performance in different scaling scenarios of typical network simulations. Our results show that the new data structures and communication scheme prepare the simulation kernel for post-petascale high-performance computing facilities without sacrificing performance in smaller systems.
Computational sciences in the upstream oil and gas industry
Halsey, Thomas C.
2016-01-01
The predominant technical challenge of the upstream oil and gas industry has always been the fundamental uncertainty of the subsurface from which it produces hydrocarbon fluids. The subsurface can be detected remotely by, for example, seismic waves, or it can be penetrated and studied in the extremely limited vicinity of wells. Inevitably, a great deal of uncertainty remains. Computational sciences have been a key avenue to reduce and manage this uncertainty. In this review, we discuss at a relatively non-technical level the current state of three applications of computational sciences in the industry. The first of these is seismic imaging, which is currently being revolutionized by the emergence of full wavefield inversion, enabled by algorithmic advances and petascale computing. The second is reservoir simulation, also being advanced through the use of modern highly parallel computing architectures. Finally, we comment on the role of data analytics in the upstream industry. This article is part of the themed issue ‘Energy and the subsurface’. PMID:27597785
Extremely Scalable Spiking Neuronal Network Simulation Code: From Laptops to Exascale Computers
Jordan, Jakob; Ippen, Tammo; Helias, Moritz; Kitayama, Itaru; Sato, Mitsuhisa; Igarashi, Jun; Diesmann, Markus; Kunkel, Susanne
2018-01-01
State-of-the-art software tools for neuronal network simulations scale to the largest computing systems available today and enable investigations of large-scale networks of up to 10 % of the human cortex at a resolution of individual neurons and synapses. Due to an upper limit on the number of incoming connections of a single neuron, network connectivity becomes extremely sparse at this scale. To manage computational costs, simulation software ultimately targeting the brain scale needs to fully exploit this sparsity. Here we present a two-tier connection infrastructure and a framework for directed communication among compute nodes accounting for the sparsity of brain-scale networks. We demonstrate the feasibility of this approach by implementing the technology in the NEST simulation code and we investigate its performance in different scaling scenarios of typical network simulations. Our results show that the new data structures and communication scheme prepare the simulation kernel for post-petascale high-performance computing facilities without sacrificing performance in smaller systems. PMID:29503613
TECA: Petascale pattern recognition for climate science
DOE Office of Scientific and Technical Information (OSTI.GOV)
Prabhat, .; Byna, Surendra; Vishwanath, Venkatram
Climate Change is one of the most pressing challenges facing humanity in the 21st century. Climate simulations provide us with a unique opportunity to examine effects of anthropogenic emissions. High-resolution climate simulations produce "Big Data": contemporary climate archives are ≈ 5 PB in size and we expect future archives to measure on the order of exabytes. In this work, we present the successful application of the TECA (Toolkit for Extreme Climate Analysis) framework for extracting extreme weather patterns such as Tropical Cyclones, Atmospheric Rivers and Extra-Tropical Cyclones from TB-sized simulation datasets. TECA has been run at full-scale on Cray XE6 and IBM BG/Q systems, and has reduced the runtime for pattern detection tasks from years to hours. TECA has been utilized to evaluate the performance of various computational models in reproducing the statistics of extreme weather events, and for characterizing the change in frequency of storm systems in the future.
Scaling of Multimillion-Atom Biological Molecular Dynamics Simulation on a Petascale Supercomputer
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schulz, Roland; Lindner, Benjamin; Petridis, Loukas
2009-01-01
A strategy is described for a fast all-atom molecular dynamics simulation of multimillion-atom biological systems on massively parallel supercomputers. The strategy is developed using benchmark systems of particular interest to bioenergy research, comprising models of cellulose and lignocellulosic biomass in an aqueous solution. The approach involves using the reaction field (RF) method for the computation of long-range electrostatic interactions, which permits efficient scaling on many thousands of cores. Although the range of applicability of the RF method for biomolecular systems remains to be demonstrated, for the benchmark systems the use of the RF produces molecular dipole moments, Kirkwood G factors, other structural properties, and mean-square fluctuations in excellent agreement with those obtained with the commonly used Particle Mesh Ewald method. With RF, three million- and five million-atom biological systems scale well up to 30k cores, producing 30 ns/day. Atomistic simulations of very large systems for time scales approaching the microsecond would, therefore, appear now to be within reach.
Scaling of Multimillion-Atom Biological Molecular Dynamics Simulation on a Petascale Supercomputer.
Schulz, Roland; Lindner, Benjamin; Petridis, Loukas; Smith, Jeremy C
2009-10-13
A strategy is described for a fast all-atom molecular dynamics simulation of multimillion-atom biological systems on massively parallel supercomputers. The strategy is developed using benchmark systems of particular interest to bioenergy research, comprising models of cellulose and lignocellulosic biomass in an aqueous solution. The approach involves using the reaction field (RF) method for the computation of long-range electrostatic interactions, which permits efficient scaling on many thousands of cores. Although the range of applicability of the RF method for biomolecular systems remains to be demonstrated, for the benchmark systems the use of the RF produces molecular dipole moments, Kirkwood G factors, other structural properties, and mean-square fluctuations in excellent agreement with those obtained with the commonly used Particle Mesh Ewald method. With RF, three million- and five million-atom biological systems scale well up to ∼30k cores, producing ∼30 ns/day. Atomistic simulations of very large systems for time scales approaching the microsecond would, therefore, appear now to be within reach.
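For reference, the standard reaction-field-corrected Coulomb pair interaction has the closed form sketched below; the cutoff, dielectric constant, and charges are illustrative assumptions, not the exact settings used in these simulations.

```python
import numpy as np

def reaction_field_energy(qi, qj, r, r_cut, eps_rf, ke=138.935458):
    """RF-corrected Coulomb energy (kJ/mol, charges in e, distances in nm, GROMACS-style constant ke)."""
    k_rf = (eps_rf - 1.0) / (2.0 * eps_rf + 1.0) / r_cut**3   # reaction-field constant
    c_rf = 1.0 / r_cut + k_rf * r_cut**2                       # shift so the energy vanishes at the cutoff
    return np.where(r < r_cut, ke * qi * qj * (1.0 / r + k_rf * r**2 - c_rf), 0.0)

r = np.linspace(0.2, 1.5, 6)
print(reaction_field_energy(0.5, -0.5, r, r_cut=1.2, eps_rf=78.0))
```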
Workload Characterization of a Leadership Class Storage Cluster
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kim, Youngjae; Gunasekaran, Raghul; Shipman, Galen M
2010-01-01
Understanding workload characteristics is critical for optimizing and improving the performance of current systems and software, and for architecting new storage systems based on observed workload patterns. In this paper, we characterize the scientific workloads of the world's fastest HPC (High Performance Computing) storage cluster, Spider, at the Oak Ridge Leadership Computing Facility (OLCF). Spider provides an aggregate bandwidth of over 240 GB/s with over 10 petabytes of RAID 6 formatted capacity. OLCF's flagship petascale simulation platform, Jaguar, and other large HPC clusters, with over 250 thousand compute cores in total, depend on Spider for their I/O needs. We characterize the system utilization, the demands of reads and writes, idle time, and the distribution of read requests to write requests for the storage system observed over a period of 6 months. From this study we develop synthesized workloads, and we show that the read and write I/O bandwidth usage as well as the inter-arrival time of requests can be modeled as a Pareto distribution.
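As an illustration of the Pareto modeling mentioned above, the sketch below draws synthetic inter-arrival times from a Pareto law by inverse-CDF sampling and recovers the shape parameter with the standard maximum-likelihood estimator; the parameter values are hypothetical, not fits to the Spider traces.

```python
import numpy as np

rng = np.random.default_rng(0)
shape, scale = 1.5, 0.01        # hypothetical Pareto shape (alpha) and minimum inter-arrival time (s)

# Inverse-CDF sampling: x = scale * U**(-1/alpha) for U ~ Uniform(0, 1).
u = rng.random(100_000)
inter_arrivals = scale * u ** (-1.0 / shape)

# Maximum-likelihood estimate of the shape parameter (known scale).
alpha_hat = len(inter_arrivals) / np.sum(np.log(inter_arrivals / scale))
print(f"true alpha = {shape}, estimated alpha = {alpha_hat:.3f}")
```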
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fakcharoenphol, Perapon; Xiong, Yi; Hu, Litang
TOUGH2-EGS is a numerical simulation program coupling geomechanics and chemical reactions for fluid and heat flows in porous media and fractured reservoirs of enhanced geothermal systems. The simulator includes the fully-coupled geomechanical (THM) module, the fully-coupled geochemical (THC) module, and the sequentially coupled reactive geochemistry (THMC) module. The fully-coupled flow-geomechanics model is developed from the linear elastic theory for the thermo-poro-elastic system and is formulated with the mean normal stress as well as pore pressure and temperature. The chemical reaction is sequentially coupled after solution of the flow equations, which provides the flow velocity and phase saturation for the solute transport calculation at each time step. In addition, reservoir rock properties, such as porosity and permeability, are subject to change due to rock deformation and chemical reactions. The relationships between rock properties and geomechanical and chemical effects from poro-elasticity theories and empirical correlations are incorporated into the simulator. This report provides the user with detailed information on both the mathematical models and instructions for using TOUGH2-EGS for THM, THC or THMC simulations. The mathematical models include the fluid and heat flow equations, geomechanical equation, reactive geochemistry equations, and discretization methods. Although TOUGH2-EGS has the capability for simulating fluid and heat flows coupled with both geomechanical and chemical effects, it is up to the users to select the specific coupling process, such as THM, THC, or THMC, in a simulation. There are several example problems illustrating the applications of this program. These example problems are described in detail and their input data are presented. The results demonstrate that this program can be used for field-scale geothermal reservoir simulation with fluid and heat flow, geomechanical effect, and chemical reaction in porous and fractured media.
NASA Astrophysics Data System (ADS)
Klein, Andreas; Gerlach, Gerald
1998-09-01
This paper deals with the simulation of fluid-structure interaction phenomena in micropumps. The proposed solution approach is based on the external coupling of two different solvers, which are considered here as 'black boxes'. Therefore, no specific intervention into the program code is necessary, and the solvers can be exchanged arbitrarily. For the realization of the external iteration loop, two algorithms are considered: the relaxation-based Gauss-Seidel method and the computationally more expensive Newton method. It is demonstrated, in terms of a simplified test case, that for rather weak coupling the Gauss-Seidel method is sufficient. However, by simply changing the considered fluid from air to water, the two physical domains become strongly coupled, and the Gauss-Seidel method fails to converge in this case. The Newton iteration scheme must be used instead.
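A minimal sketch of the relaxation-based partitioned coupling discussed above: two 'black-box' solvers are represented by toy linear response functions, and an under-relaxed Gauss-Seidel (fixed-point) loop iterates between them until the interface displacement stops changing. The response coefficients and the relaxation factor are invented for illustration, not taken from the paper.

```python
def fluid_solver(displacement):
    """Black-box stand-in for the fluid code: pressure load on the membrane."""
    return 100.0 - 40.0 * displacement          # invented linear response

def structure_solver(pressure):
    """Black-box stand-in for the structural code: membrane deflection."""
    return pressure / 250.0                     # invented compliance

def gauss_seidel_coupling(omega=0.7, tol=1e-10, max_iter=100):
    """Under-relaxed Gauss-Seidel iteration between two external solvers."""
    d = 0.0
    for it in range(max_iter):
        p = fluid_solver(d)                     # fluid step with current geometry
        d_new = structure_solver(p)             # structural step with new load
        if abs(d_new - d) < tol:
            return d_new, it
        d = (1.0 - omega) * d + omega * d_new   # relaxation for stability
    raise RuntimeError("coupling iteration did not converge")

print(gauss_seidel_coupling())
```

For strongly coupled cases, as the abstract notes, a fixed-point loop like this may diverge and a Newton-type interface iteration is needed instead.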
Computer program for afterheat temperature distribution for mobile nuclear power plant
NASA Technical Reports Server (NTRS)
Parker, W. G.; Vanbibber, L. E.
1972-01-01
The ESATA computer program was developed to analyze the thermal safety aspects of post-impacted mobile nuclear power plants. The program is written in FORTRAN IV and designed for the IBM 7094/7044 direct-coupled system.
Bangladesh's SMP earns top marks.
1984-01-01
A recent evaluation funded by the US Agency for International Development (AID) confirms that Bangladesh's contraceptive social marketing program has exceeded its planners' goals and demonstrated the ability of such a system to widely distribute contraceptive products at a low cost. The project, which began contraceptive sales in 1975, distributes condoms, oral contraceptives, and foaming vaginal tablets. Almost 25% of contraceptive users in Bangladesh are serviced by the social marketing program. By the end of 1983, the program was providing 1,022,000 couple years of protection; this included 84 million condoms, 1.7 million pill cycles, and 5.1 million spermicidal tablets each year. The program's cost for 1 couple year of protection is US$1.66. Social marketing sales have accounted for all increases in couple years of protection experienced by the country's national population program since 1975. Sales have been boosted by recent efforts to draw rural medical practitioners into family planning activities. Mobile film units have further increased sales. The USAID report identifies 3 elements that have spearheaded the social marketing program's achievements: 1) the existence of a committed core management team, 2) the granting of autonomy to make daily decisions to this management team, and 3) central control of the product distribution system by management rather than by subcontractors. Overall, the social marketing program is credited with legitimizing discussion of contraception in a country formerly considered too conservative to tolerate open product promotion.
Dynamics of the Pin Pallet Runaway Escapement
1978-06-01
Report excerpt (table of contents and text fragments): the appendixes cover the kinematics of coupled motion, the differential equation of coupled motion, and moment arms; expressions for these quantities are derived in appendix D. The differential equations for the free motion of the pallet and the escape wheel are derived, and to solve the differential equation of coupled motion (equation B-10 of appendix B) the main program calls on ...
What, Why, and for Whom: Couples Interventions--A Deconstruction Approach
ERIC Educational Resources Information Center
Sher, Tamara Goldman
2012-01-01
This paper provides a commentary on the special series on universal processes and common factors in couple therapy. The authors in this section share their insights, from varying perspectives, about what it is in couples therapy and relationship education programs that work, why they work, and for whom they work best. In so doing, these articles…
NASA Technical Reports Server (NTRS)
Wlezien, R. W.; Horner, G. C.; McGowan, A. R.; Padula, S. L.; Scott, M. A.; Silcox, R. J.; Simpson, J. O.
1998-01-01
In the last decade smart technologies have become enablers that cut across traditional boundaries in materials science and engineering. Here we define smart to mean embedded actuation, sensing, and control logic in a tightly coupled feedback loop. While multiple successes have been achieved in the laboratory, we have yet to see the general applicability of smart devices to real aircraft systems. The NASA Aircraft Morphing program is an attempt to couple research across a wide range of disciplines to integrate smart technologies into high payoff aircraft applications. The program bridges research in seven individual disciplines and combines the effort into activities in three primary program thrusts. System studies are used to assess the highest-payoff program objectives, and specific research activities are defined to address the technologies required for development of smart aircraft systems. In this paper we address the overall program goals and programmatic structure, and discuss the challenges associated with bringing the technologies to fruition.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Miller, Barton
2014-06-30
Peta-scale computing environments pose significant challenges for both system and application developers, and addressing them requires more than simply scaling up existing tera-scale solutions. Performance analysis tools play an important role in gaining the needed understanding, but previous monolithic tools with fixed feature sets have not sufficed. Instead, this project worked on the design, implementation, and evaluation of a general, flexible tool infrastructure supporting the construction of performance tools as “pipelines” of high-quality tool building blocks. These tool building blocks provide common performance tool functionality, and are designed for scalability, lightweight data acquisition and analysis, and interoperability. For this project, we built on Open|SpeedShop, a modular and extensible open source performance analysis tool set. The design and implementation of such a general and reusable infrastructure targeted for petascale systems required us to address several challenging research issues. All components needed to be designed for scale, a task made more difficult by the need to provide general modules. The infrastructure needed to support online data aggregation to cope with the large amounts of performance and debugging data. We needed to be able to map any combination of tool components to each target architecture. And we needed to design interoperable tool APIs and workflows that were concrete enough to support the required functionality, yet provided the necessary flexibility to address a wide range of tools. A major result of this project is the ability to use this scalable infrastructure to quickly create tools that match a machine architecture and a performance problem that needs to be understood. Another benefit is the ability for application engineers to use the highly scalable, interoperable version of Open|SpeedShop, which is reassembled from the tool building blocks into a flexible, multi-user set of tools. This set of tools is targeted at Office of Science Leadership Class computer systems and selected Office of Science application codes. We describe the contributions made by the team at the University of Wisconsin. The project built on the efforts in Open|SpeedShop funded by DOE/NNSA and the DOE/NNSA Tri-Lab community, extended Open|SpeedShop to the Office of Science Leadership Class Computing Facilities, and addressed new challenges found on these cutting-edge systems. Work done under this project at Wisconsin can be divided into two categories: new algorithms and techniques for debugging, and foundation infrastructure work on our Dyninst binary analysis and instrumentation toolkits and the MRNet scalability infrastructure.
Barton, Allen W; Beach, Steven R H; Bryant, Chalandra M; Lavner, Justin A; Brody, Gene H
2018-03-01
This study investigated (a) the stress spillover pathways linking contextual stressors, changes in couple relationship functioning and depressive symptoms, and changes in individuals' physical health, and (b) the stress-buffering effect of participation in an efficacious, family centered prevention program designed to protect couples from the deleterious effects of stressors. The sample consisted of 346 rural African American couples (63% married) who participated in a randomized controlled trial of the Protecting Strong African American Families (ProSAAF) program. Participants were assessed at three time points across 17 months. Results examining stress spillover within the control group indicated that elevated current, but not prior, financial hardship was associated with decreased effective communication, relationship satisfaction, and relationship confidence as well as increased depressive symptoms; current levels of racial discrimination also predicted greater depressive symptoms. Relationship confidence and relationship satisfaction, but not communication or depressive symptoms, in turn predicted declines in self-reported physical health. Results examining stress-buffering effects suggested that participation in ProSAAF protected individuals' relationship confidence from declines associated with elevated financial hardship. In addition, the indirect effect linking financial hardship to declines in physical health through relationship confidence that emerged among participants in the control group was no longer evident for ProSAAF couples. Results highlight the effect of contextual stressors on African Americans' couple and individual well-being and the potential for the ProSAAF program to provide a constructed resilience resource, protecting couple's confidence in their relationship from the negative effects of financial hardship and, consequently, promoting physical health. (PsycINFO Database Record (c) 2018 APA, all rights reserved).
Zhang, Shuo; Zhang, Chengning; Han, Guangwei; Wang, Qinghui
2014-01-01
A dual-motor coupling-propulsion electric bus (DMCPEB) is modeled, and its optimal control strategy is studied in this paper. The necessary dynamic features of energy loss for the subsystems are modeled. A dynamic programming (DP) technique is applied to find the optimal control strategy, including the upshift threshold, downshift threshold, and power split ratio between the main motor and auxiliary motor. Improved control rules are extracted from the DP-based control solution, forming near-optimal control strategies. Simulation results demonstrate that a significant improvement in reducing the energy loss due to operation of the dual-motor coupling-propulsion system (DMCPS) is realized without increasing the frequency of mode switches. PMID:25540814
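The following sketch illustrates the dynamic-programming idea in the abstract above: a backward recursion over a short power-demand profile chooses, at each step, between a main-motor-only mode and a dual-motor mode, with a penalty that discourages frequent mode switches. The loss models, the profile, and the switch penalty are invented for illustration and are not the paper's vehicle model.

```python
# Hedged sketch of DP-based mode selection; numbers below are made up.
power_demand = [10.0, 25.0, 60.0, 80.0, 35.0, 15.0]   # kW at each time step
MODES = ("main_only", "dual")
SWITCH_PENALTY = 0.8                                  # discourages frequent switching

def stage_loss(power, mode):
    if mode == "main_only":
        return 0.08 * power + 0.004 * power**2        # single motor, worse at high load
    return 1.5 + 0.06 * power + 0.002 * power**2      # dual: fixed overhead, better at high load

def optimal_modes(profile):
    n = len(profile)
    cost = {m: 0.0 for m in MODES}                    # cost-to-go after the last step
    policy = [dict() for _ in range(n)]
    for t in reversed(range(n)):                      # backward DP recursion
        new_cost = {}
        for prev in MODES:
            best, best_mode = float("inf"), None
            for m in MODES:
                c = stage_loss(profile[t], m) + cost[m]
                if m != prev:
                    c += SWITCH_PENALTY
                if c < best:
                    best, best_mode = c, m
            new_cost[prev] = best
            policy[t][prev] = best_mode
        cost = new_cost
    mode, schedule = "main_only", []                  # roll the policy forward
    for t in range(n):
        mode = policy[t][mode]
        schedule.append(mode)
    return schedule, cost["main_only"]

print(optimal_modes(power_demand))
```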
Relating engagement to outcomes in prevention: the case of a parenting program for couples.
Brown, Louis D; Goslin, Megan C; Feinberg, Mark E
2012-09-01
Analyses of program engagement can provide critical insight into how program involvement leads to outcomes. This study examines the relation between participant engagement and program outcomes in Family Foundations (FF), a universal preventive intervention designed to help couples manage the transition to parenthood by improving coparenting relationship quality. Previous intent-to-treat outcome analyses from a randomized trial indicate FF improves parental adjustment, interparental relationships, and parenting. Analyses for the current study use the same sample, and yield statistically reliable relations between participant engagement and interparental relationships but not parental adjustment or parenting. Discussion considers implications for FF and the difficulties researchers face when examining the relation between engagement and outcomes in preventive interventions.
Effect of heat waves on VOC emissions from vegetation and urban air quality
NASA Astrophysics Data System (ADS)
Churkina, G.; Kuik, F.; Lauer, A.; Bonn, B.; Butler, T. M.
2015-12-01
Programs to plant millions of trees in cities around the world aim to reduce summer temperatures, increase carbon storage, control storm water, provide space for recreation, and alleviate poverty. Although these multiple benefits speak positively for urban greening programs, the programs do not take into account how closely human and natural systems are coupled in urban areas. Elevated temperatures together with anthropogenic emissions of air and water pollutants distinguish the urban system. Urban and sub-urban vegetation responds to ambient changes and reacts with pollutants. Neglecting this coupling may lead to unforeseen drawbacks of urban greening programs. The potential for emissions of volatile organic compounds (VOC) from vegetation combined with anthropogenic emissions to produce ozone has long been recognized. This potential increases under rising temperatures. Here we investigate how heat waves affect emissions of VOC from urban vegetation and the corresponding ground-level ozone. In this study we use the Weather Research and Forecasting model with coupled atmospheric chemistry (WRF-CHEM) to quantify these feedbacks in Berlin, Germany, during the 2006 heat wave. VOC emissions from vegetation are simulated with MEGAN 2.0 coupled with WRF-CHEM. Our preliminary results indicate that the contribution of VOCs from vegetation to ozone formation may increase more than twofold during the heat wave period. We highlight the importance of the vegetation for urban areas under a changing climate and discuss associated tradeoffs.
van Lankveld, J J; Grotjohann, Y; van Lokven, B M; Everaerd, W
1999-01-01
This study compared characteristics of couples with different sexual dysfunctions who were recruited for participation in a bibliotherapy program via two routes: in response to media advertisements and through their presence on a waiting list for therapist-administered treatment in an outpatient sexology clinic. Data were collected from 492 subjects (246 couples). Male sexology patients were younger than media-recruited males. However, type of sexual dysfunction accounted for a substantially larger proportion of variance in the demographic and psychometric data. An interaction effect of recruitment strategy and sexual dysfunction type was found with respect to female anorgasmia. We conclude from the absence of differences between the two study groups that the Wills and DePaulo (1991) model of help-seeking behavior for mental problems does not apply to couples with sexual dysfunctions joining a bibliotherapy program who either primarily requested professional treatment or who responded to media advertising.
Design of 20 W fiber-coupled green laser diode by Zemax
NASA Astrophysics Data System (ADS)
Qi, Yunfei; Zhao, Pengfei; Wu, Yulong; Chen, Yongqi; Zou, Yonggang
2017-09-01
We present a design of a 20 W, fiber-coupled diode laser module based on 26 single emitters at 520 nm. The module can produce more than 20 W of output power from a standard fiber with a core diameter of 400 μm and a numerical aperture (NA) of 0.22. To achieve a 20 W laser beam, spatial beam combination and polarization beam combination by a polarization beam splitter are used to combine the output of the 26 single emitters into a single beam, and then an aspheric lens is used to couple the combined beam into an optical fiber. The simulation shows that the total coupling efficiency is more than 95%. Project supported by the National Key R&D Program of China (No. 2016YFB0402105), the Key Deployment Program of the Chinese Academy of Sciences (No. KGZD-SW-T01-2), and the National Natural Science Foundation of China (No. 61404135).
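A rough feasibility check consistent with the design above is to compare the combined beam against the fiber's acceptance, commonly expressed as a beam-parameter-product (BPP) limit of roughly core radius times NA. The sketch below computes that limit for the 400 μm, NA 0.22 fiber and a back-of-the-envelope output power; the per-emitter power and combining efficiency used here are assumptions for illustration, not values from the paper.

```python
core_diameter_um = 400.0
numerical_aperture = 0.22

core_radius_mm = (core_diameter_um / 2.0) / 1000.0      # 0.2 mm
half_angle_mrad = numerical_aperture * 1000.0           # ~220 mrad (small-angle approximation)
fiber_bpp = core_radius_mm * half_angle_mrad            # ~44 mm*mrad acceptance
print(f"Fiber acceptance (BPP): {fiber_bpp:.1f} mm*mrad")

# Assumed per-emitter power and combining efficiency (illustrative only)
emitters, power_per_emitter_w, combining_efficiency = 26, 0.85, 0.95
print(f"Estimated output: {emitters * power_per_emitter_w * combining_efficiency:.1f} W")
```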
Kröger, Christoph; Kliem, Sören; Zimmermann, Peter; Kowalski, Jens
2018-04-01
This study examines the short-term effectiveness of a relationship education program designed for military couples. Distressed couples were randomly placed in either a wait-list control group or an intervention group. We conducted training sessions before a 3-month foreign assignment, and refresher courses approximately 6 weeks post-assignment. We analyzed the dyadic data of 32 couples, using hierarchical linear modeling in a two-level model. Reduction in unresolved conflicts was found in the intervention group, with large pre-post effects for both partners. Relationship satisfaction scores were improved, with moderate-to-large effects only for soldiers, rather than their partners. Post-follow-up effect sizes suggested further improvement in the intervention group. Future research should examine the long-term effectiveness of this treatment. © 2017 American Association for Marriage and Family Therapy.
Rhoades, Galena K
2015-12-01
This study examined the effectiveness of a couple-based relationship education program, Within Our Reach. Secondary data (n = 3,609) were analyzed from the federal Supporting Healthy Marriage project. Couples were randomly assigned to receive Within Our Reach and associated services or to a no-treatment (treatment-as-usual) control group. Those assigned to Within Our Reach reported better couple and individual outcomes on 8 of 12 outcomes measured (M ES = .15) at the 12-month follow-up and 6 of 10 outcomes measured at the 30-month follow-up (M ES = .14), including higher relationship happiness, more warmth and support, more positive communication, less negative behavior and emotion, less psychological abuse, less physical assault (for men), lower psychological distress (for women), and less infidelity. They were also less likely to report that their marriage was in trouble. These effects were generally small in size and many were replicated across the two follow-ups. There were no significant differences between those assigned to Within Our Reach versus control on cooperative parenting, severe psychological assault, or percent married. Implications for future research, programming, and policy are discussed. © 2015 Family Process Institute.
NASA Astrophysics Data System (ADS)
Youn, Dong Joon
This thesis presents the development and validation of an advanced hydro-mechanical coupled finite element program analyzing hydraulic fracture propagation within unconventional hydrocarbon formations under various conditions. Realistic modeling of hydraulic fracturing is needed to improve the understanding and efficiency of the stimulation technique. Such modeling remains highly challenging, however, due to factors including the complexity of fracture propagation mechanisms, the coupled behavior of fracture displacement and fluid pressure, the interactions between pre-existing natural and initiated hydraulic fractures, and the formation heterogeneity of the target reservoir. In this research, an eXtended Finite Element Method (XFEM) scheme is developed allowing for representation of single or multiple fracture propagations without any need for re-meshing. Also, the coupled flows through the fracture are considered in the program to account for their influence on stresses and deformations along the hydraulic fracture. In this research, a sequential coupling scheme is applied to estimate fracture aperture and fluid pressure with the XFEM. Later, the coupled XFEM program is used to estimate wellbore bottomhole pressure during fracture propagation, and the pressure variations are analyzed to determine the geometry and performance of the hydraulic fracturing, as in a pressure leak-off test. Finally, material heterogeneity is included in the XFEM program to check the effect of random formation property distributions on the hydraulic fracture geometry. Random field theory is used to create the random realization of the material heterogeneity with consideration of the mean, standard deviation, and property correlation length. These analyses lead to probabilistic information on the response of unconventional reservoirs and offer a more scientific approach regarding risk management for unconventional reservoir stimulation. The new stochastic approach combining XFEM and random fields is named the eXtended Random Finite Element Method (XRFEM). All the numerical analysis codes in this thesis are written in Fortran 2003, and these codes are applicable as a series of sub-modules within a suite of finite element codes developed by Smith and Griffiths (2004).
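Material heterogeneity of the kind described above is often represented as a correlated Gaussian random field parameterized by a mean, standard deviation, and correlation length. The sketch below generates such a field in one dimension using an assumed exponential covariance and a Cholesky factorization; it is illustrative only and is not the thesis code (which is written in Fortran 2003).

```python
import numpy as np

def gaussian_random_field_1d(n, dx, mean, std, corr_length, seed=0):
    """Correlated 1-D Gaussian random field via Cholesky factorization.

    Minimal sketch: an exponential covariance model is assumed here purely
    for illustration of the mean / std / correlation-length parameterization.
    """
    x = np.arange(n) * dx
    cov = std**2 * np.exp(-np.abs(x[:, None] - x[None, :]) / corr_length)
    L = np.linalg.cholesky(cov + 1e-10 * np.eye(n))     # small jitter for stability
    rng = np.random.default_rng(seed)
    return mean + L @ rng.standard_normal(n)

# Example: a made-up log-modulus profile along a 100 m section
field = gaussian_random_field_1d(n=200, dx=0.5, mean=10.0, std=0.3, corr_length=5.0)
print(field[:5])
```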
NASA Astrophysics Data System (ADS)
Wissmeier, L. C.; Barry, D. A.
2009-12-01
Computer simulations of water availability and quality play an important role in state-of-the-art water resources management. However, many of the most utilized software programs focus either on physical flow and transport phenomena (e.g., MODFLOW, MT3DMS, FEFLOW, HYDRUS) or on geochemical reactions (e.g., MINTEQ, PHREEQC, CHESS, ORCHESTRA). In recent years, several couplings between both genres of programs evolved in order to consider interactions between flow and biogeochemical reactivity (e.g., HP1, PHWAT). Software coupling procedures can be categorized as ‘close couplings’, where programs pass information via the memory stack at runtime, and ‘remote couplings’, where the information is exchanged at each time step via input/output files. The former generally involves modifications of software codes and therefore expert programming skills are required. We present a generic recipe for remotely coupling the PHREEQC geochemical modeling framework and flow and solute transport (FST) simulators. The iterative scheme relies on operator splitting with continuous re-initialization of PHREEQC and the FST of choice at each time step. Since PHREEQC calculates the geochemistry of aqueous solutions in contact with soil minerals, the procedure is primarily designed for couplings to FST’s for liquid phase flow in natural environments. It requires the accessibility of initial conditions and numerical parameters such as time and space discretization in the input text file for the FST and control of the FST via commands to the operating system (batch on Windows; bash/shell on Unix/Linux). The coupling procedure is based on PHREEQC’s capability to save the state of a simulation with all solid, liquid and gaseous species as a PHREEQC input file by making use of the dump file option in the TRANSPORT keyword. The output from one reaction calculation step is therefore reused as input for the following reaction step where changes in element amounts due to advection/dispersion are introduced as irreversible reactions. An example for the coupling of PHREEQC and MATLAB for the solution of unsaturated flow and transport is provided.
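The remote-coupling recipe above amounts to a loop in which each code is re-initialized from a state file written by the other at every time step. The sketch below shows only that file-exchange pattern: the 'transport' and 'chemistry' steps are trivial Python stand-ins rather than calls to an actual flow and solute transport simulator or to PHREEQC, and the file names and state format are invented for illustration (a real setup would launch the executables, e.g. via batch or shell commands, as the abstract describes).

```python
import json
import pathlib

WORKDIR = pathlib.Path("coupling_run")
WORKDIR.mkdir(exist_ok=True)

def run_transport_step(state_file, out_file, dt):
    """Stand-in for the flow/solute-transport code: remove a little mass."""
    state = json.loads(pathlib.Path(state_file).read_text())
    state["concentration"] *= (1.0 - 0.05 * dt)          # invented transport effect
    pathlib.Path(out_file).write_text(json.dumps(state))

def run_chemistry_step(state_file, out_file, dt):
    """Stand-in for the geochemistry code: relax toward an assumed equilibrium."""
    state = json.loads(pathlib.Path(state_file).read_text())
    state["concentration"] += 0.02 * dt * (0.5 - state["concentration"])
    pathlib.Path(out_file).write_text(json.dumps(state))

# Initial condition written once; the two "codes" then alternate, each one
# re-initialized at every step from the file the other just produced.
(WORKDIR / "step0.json").write_text(json.dumps({"concentration": 1.0}))
for step in range(3):
    run_transport_step(WORKDIR / f"step{step}.json", WORKDIR / "after_transport.json", dt=1.0)
    run_chemistry_step(WORKDIR / "after_transport.json", WORKDIR / f"step{step + 1}.json", dt=1.0)

print(json.loads((WORKDIR / "step3.json").read_text()))
```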
NASA Technical Reports Server (NTRS)
Ferguson, D. R.
1972-01-01
The streamtube curvature program (STC) has been developed to predict the inviscid flow field and the pressure distribution about nacelles at transonic speeds. The effects of boundary layer are to displace the inviscid flow and effectively change the body shape. Thus, the body shape must be corrected by the displacement thickness in order to calculate the correct pressure distribution. This report describes the coupling of the Stratford and Beavers boundary layer solution with the inviscid STC analysis so that all nacelle pressure forces, friction drag, and incipient separation may be predicted. The usage of the coupled STC-SAB computer program is outlined and the program input and output are defined. Included in this manual are descriptions of the principal boundary layer tables and other revisions to the STC program. The use of the viscous option is controlled by the engineer during program input definition.
Conradi, Henk Jan; Dingemanse, Pieter; Noordhof, Arjen; Finkenauer, Catrin; Kamphuis, Jan H
2017-09-04
While evidence-based couple therapies are available, only a minority of troubled couples seek help and they often do this too late. To reach more couples earlier, the couple relationship education (CRE) group program "Hold me Tight" (HmT) based on Emotionally Focused Couples Therapy (EFCT) was developed. This study is the first to examine the effectiveness of HmT. Using a three-wave (waiting period, treatment, and follow-up) within-subject design, HmT was delivered to 79 self-referred couples and 50 clinician-referred couples. We applied a comprehensive outcome measure battery. Our main findings were that (1) self-referred couples significantly improved during HmT on all measures, that is relationship satisfaction, security of partner-bond, forgiveness, daily coordination, maintenance behavior, and psychological complaints, with a moderate-to-large mean effect size (d = .63), which was maintained (d = .57) during the 3.5 month follow-up; (2) in clinician-referred couples, who were vulnerable in terms of insecure attachment status and psychopathology, the improvement during HmT was moderate (d = .42), but this was reduced during the 3.5-month follow-up to a small effect (d = .22); (3) emotional functioning (typical HmT target) as well as behavioral functioning (typical Behavioral Couples Therapy-based CRE target) improved during HmT; and (4) individual psychological complaints, although not specifically targeted, were reduced during HmT. These findings suggest that HmT is a promising intervention for enhancement of relationship functioning. Clinical implications are discussed. © 2017 Family Process Institute.
ERIC Educational Resources Information Center
Faircloth, W. Brad; Schermerhorn, Alice C.; Mitchell, Patricia M.; Cummings, Jennifer S.; Cummings, E. Mark
2011-01-01
Family-focused prevention programs for community samples have potentially broad, clinically relevant implications but few studies have examined whether any program benefits continue to be observed over the long term. Although benefits of a marital conflict focused parent education program, the Happy Couples and Happy Kids (i.e., HCHK) program,…
NASA's space physics theory program - An opportunity for collaboration
NASA Technical Reports Server (NTRS)
Vinas, Adolfo F.
1990-01-01
The field of theoretical space physics offers a unique opportunity to Latin American scientists for collaborative participation in NASA programs where the greatly increased complexity of both experimental observations and theoretical simulations requires in-depth comparisons between theory and observational data. The key problem areas identified by NASA for aggressive work in the decade of the 1990s are the nature of flows and turbulence, acceleration and transport of particles, the coupling of microphysics and macrophysics, the coupling of local and global dynamics, and nonclassical plasmas.
Development of a CCD based solar speckle imaging system
NASA Astrophysics Data System (ADS)
Nisenson, Peter; Stachnik, Robert V.; Noyes, Robert W.
1986-02-01
A program to develop software and hardware for the purpose of obtaining high angular resolution images of the solar surface is described. The program included the procurement of a Charge Coupled Device (CCD) imaging system; extensive laboratory and remote-site testing of the camera system; the development of a software package for speckle image reconstruction, which was eventually installed and tested at the Sacramento Peak Observatory; and experiments with the CCD system (coupled to an image intensifier) for low light level, narrow spectral band solar imaging.
Buttles, John W [Idaho Falls, ID
2011-12-20
Wireless communication devices include a software-defined radio coupled to processing circuitry. The processing circuitry is configured to execute computer programming code. Storage media is coupled to the processing circuitry and includes computer programming code configured to cause the processing circuitry to configure and reconfigure the software-defined radio to operate on each of a plurality of communication networks according to a selected sequence. Methods for communicating with a wireless device and methods of wireless network-hopping are also disclosed.
Buttles, John W
2013-04-23
Wireless communication devices include a software-defined radio coupled to processing circuitry. The system controller is configured to execute computer programming code. Storage media is coupled to the system controller and includes computer programming code configured to cause the system controller to configure and reconfigure the software-defined radio to operate on each of a plurality of communication networks according to a selected sequence. Methods for communicating with a wireless device and methods of wireless network-hopping are also disclosed.
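As a purely illustrative sketch of the network-hopping behavior described in these two records, the code below cycles a software-defined radio through a selected sequence of network profiles. The NetworkProfile fields, the radio class, and the dwell logic are invented stand-ins, not an actual device or patent API.

```python
import itertools
import time
from dataclasses import dataclass

@dataclass
class NetworkProfile:
    name: str
    center_freq_hz: float
    modulation: str

class SoftwareDefinedRadio:
    """Stand-in for the reconfigurable radio; configure() just reports the retune."""
    def configure(self, profile: NetworkProfile) -> None:
        print(f"retuning to {profile.name}: "
              f"{profile.center_freq_hz / 1e6:.1f} MHz, {profile.modulation}")

SEQUENCE = [
    NetworkProfile("network-A", 915e6, "GFSK"),
    NetworkProfile("network-B", 2.412e9, "OFDM"),
    NetworkProfile("network-C", 433e6, "LoRa"),
]

def network_hop(radio: SoftwareDefinedRadio, dwell_s: float = 0.01, hops: int = 5):
    """Cycle the radio through the selected sequence, dwelling briefly on each network."""
    for profile in itertools.islice(itertools.cycle(SEQUENCE), hops):
        radio.configure(profile)
        time.sleep(dwell_s)          # stand-in for exchanging traffic on this network

network_hop(SoftwareDefinedRadio())
```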
Lagardère, Louis; Jolly, Luc-Henri; Lipparini, Filippo; Aviat, Félix; Stamm, Benjamin; Jing, Zhifeng F.; Harger, Matthew; Torabifard, Hedieh; Cisneros, G. Andrés; Schnieders, Michael J.; Gresh, Nohad; Maday, Yvon; Ren, Pengyu Y.; Ponder, Jay W.
2017-01-01
We present Tinker-HP, a massively MPI parallel package dedicated to classical molecular dynamics (MD) and to multiscale simulations, using advanced polarizable force fields (PFF) encompassing distributed multipoles electrostatics. Tinker-HP is an evolution of the popular Tinker package code that conserves its simplicity of use and its reference double precision implementation for CPUs. Grounded on interdisciplinary efforts with applied mathematics, Tinker-HP allows for long polarizable MD simulations on large systems up to millions of atoms. We detail in the paper the newly developed extension of massively parallel 3D spatial decomposition to point dipole polarizable models as well as their coupling to efficient Krylov iterative and non-iterative polarization solvers. The design of the code allows the use of various computer systems ranging from laboratory workstations to modern petascale supercomputers with thousands of cores. Tinker-HP proposes therefore the first high-performance scalable CPU computing environment for the development of next generation point dipole PFFs and for production simulations. Strategies linking Tinker-HP to Quantum Mechanics (QM) in the framework of multiscale polarizable self-consistent QM/MD simulations are also provided. The possibilities, performances and scalability of the software are demonstrated via benchmarks calculations using the polarizable AMOEBA force field on systems ranging from large water boxes of increasing size and ionic liquids to (very) large biosystems encompassing several proteins as well as the complete satellite tobacco mosaic virus and ribosome structures. For small systems, Tinker-HP appears to be competitive with the Tinker-OpenMM GPU implementation of Tinker. As the system size grows, Tinker-HP remains operational thanks to its access to distributed memory and takes advantage of its new algorithmic enabling for stable long timescale polarizable simulations. Overall, a several thousand-fold acceleration over a single-core computation is observed for the largest systems. The extension of the present CPU implementation of Tinker-HP to other computational platforms is discussed. PMID:29732110
NASA Technical Reports Server (NTRS)
Meyer, H. D.
1993-01-01
The Acoustic Radiation Code (ARC) is a finite element program used on the IBM mainframe to predict far-field acoustic radiation from a turbofan engine inlet. In this report, requirements for developers of internal aerodynamic codes regarding use of their program output as input for the ARC are discussed. More specifically, the particular input needed from the Bolt, Beranek and Newman/Pratt and Whitney (turbofan source noise generation) Code (BBN/PWC) is described. In a separate analysis, a method of coupling the source and radiation models, one that recognizes waves crossing the interface in both directions, has been derived. A preliminary version of the coupled code has been developed and used for initial evaluation of coupling issues. Results thus far have shown that reflection from the inlet is sufficient to indicate that full coupling of the source and radiation fields is needed for accurate noise predictions. Also, for this contract, the ARC has been modified for use on the Sun and Silicon Graphics Iris UNIX workstations. Changes and additions involved in this effort are described in an appendix.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kim, MooHyun
2014-08-01
This report presents the development of offshore anchor data sets which are intended to be used to develop a database that allows preliminary selection and sizing of anchors for the conceptual design of floating offshore wind turbines (FOWTs). The study is part of a project entitled “Development of Mooring-Anchor Program in Public Domain for Coupling with Floater Program for FOWTs (Floating Offshore Wind Turbines)”, under the direction of Dr. Moo-Hyun Kim at Texas A&M University and with sponsorship from the US Department of Energy (Contract No. DE-EE0005479, CFDA # 81.087 for DE-FOA-0000415, Topic Area 1.3: Subsurface Mooring and Anchoring Dynamics Models).
ERIC Educational Resources Information Center
Burgin, Stephen R.; Sadler, Troy D.
2013-01-01
This article describes summer programs that allow high school students to participate in an "authentic scientific research experience" (ASRE). These summer programs are specifically designed to embed students in working laboratories and research groups. Summer ASRE programs for secondary learners range in length from a couple of weeks to…
Evaluation plan for the ticketing aggressive cars and trucks (TACT) program in Kentucky.
DOT National Transportation Integrated Search
2009-03-01
The objective of the program is to alter driver behavior around large commercial vehicles through education and enforcement. The key components of TACT are communications/media coupled with enforcement and evaluation. The program consisted of two med...
Coupled structural/thermal/electromagnetic analysis/tailoring of graded composite structures
NASA Technical Reports Server (NTRS)
Mcknight, R. L.; Huang, H.; Hartle, M.
1992-01-01
Accomplishments are described for the fourth year's effort of a 5-year program to develop a methodology for coupled structural/thermal/electromagnetic analysis/tailoring of graded component structures. These accomplishments include: (1) demonstration of coupled solution capability; (2) alternate CSTEM electromagnetic technology; (3) CSTEM acoustic capability; (4) CSTEM tailoring; (5) CSTEM composite micromechanics using ICAN; and (6) multiple layer elements in CSTEM.
Ablation and radiation coupled viscous hypersonic shock layers, volume 1
NASA Technical Reports Server (NTRS)
Engel, C. D.
1971-01-01
The results for a stagnation-line analysis of the radiative heating of a phenolic-nylon ablator are presented. The analysis includes flow field coupling with the ablator surface, equilibrium chemistry, a step-function diffusion model, and a coupled line and continuum radiation calculation. This report serves as the documentation, i.e., the user's manual and operating instructions, for the computer programs listed in the report.
The effects of work-related values on communication between R and D groups, part 1. Ph.D. Thesis
NASA Technical Reports Server (NTRS)
Douds, C. F.
1970-01-01
The research concerned with the liaison, interface, coupling, and technology transfer processes that occur in research and development is reported. Overviews of the functions of communication and coupling in the R and D processes, and the theoretical considerations of coupling, communication, and values are presented along with descriptions of the field research program and the instrumentation.
Al-Nood, Hafiz; Al-Hadi, Abdulrahman
2013-01-01
In Yemen, the prevalence of sickle cell trait and β-thalassemia trait is high. The aim of this premarital program is to identify sickle cell and thalassemia carrier couples in Yemen before marriage proposals are completed, in order to prevent affected births. This can be achieved by applying a low-cost premarital screening program using simple blood tests compatible with the limited health resources of the country. If microcytosis or a positive sickle cell test is found in both partners, or if one partner has microcytosis and the other has a positive sickle cell test, then their children are at high risk of having sickle cell and/or thalassemia disease. Carrier couples will be referred to genetic counseling. The outcomes of this preventive program are predicted to decrease the incidence of affected births and reduce the health burden of these disorders. The success of this program also requires governmental, educational and religious support. PMID:25003062
Programmed coherent coupling in a synthetic DNA-based excitonic circuit
NASA Astrophysics Data System (ADS)
Boulais, Étienne; Sawaya, Nicolas P. D.; Veneziano, Rémi; Andreoni, Alessio; Banal, James L.; Kondo, Toru; Mandal, Sarthak; Lin, Su; Schlau-Cohen, Gabriela S.; Woodbury, Neal W.; Yan, Hao; Aspuru-Guzik, Alán; Bathe, Mark
2018-02-01
Natural light-harvesting systems spatially organize densely packed chromophore aggregates using rigid protein scaffolds to achieve highly efficient, directed energy transfer. Here, we report a synthetic strategy using rigid DNA scaffolds to similarly program the spatial organization of densely packed, discrete clusters of cyanine dye aggregates with tunable absorption spectra and strongly coupled exciton dynamics present in natural light-harvesting systems. We first characterize the range of dye-aggregate sizes that can be templated spatially by A-tracts of B-form DNA while retaining coherent energy transfer. We then use structure-based modelling and quantum dynamics to guide the rational design of higher-order synthetic circuits consisting of multiple discrete dye aggregates within a DX-tile. These programmed circuits exhibit excitonic transport properties with prominent circular dichroism, superradiance, and fast delocalized exciton transfer, consistent with our quantum dynamics predictions. This bottom-up strategy offers a versatile approach to the rational design of strongly coupled excitonic circuits using spatially organized dye aggregates for use in coherent nanoscale energy transport, artificial light-harvesting, and nanophotonics.
Existing and Emerging Third-Party: Certification Programs
ERIC Educational Resources Information Center
Wagner, Dan
2012-01-01
When one considers the necessary elements of a green cleaning program, it is tough to know where to begin. After all, green cleaning has evolved considerably from the days when a program simply involved using a couple of "green" chemicals. Over the last several years, successful green cleaning programs have grown in sophistication and are now…
32 CFR 22.320 - Special competitions.
Code of Federal Regulations, 2011 CFR
2011-07-01
.... An example would be a program to enhance U.S. capabilities for academic research and research-coupled graduate education in defense-critical, science and engineering disciplines, a program that would be...
NASA Technical Reports Server (NTRS)
Kraft, R. E.
1996-01-01
The objective of this effort is to develop an analytical model for the coupling of active noise control (ANC) piston-type actuators that are mounted flush to the inner and outer walls of an annular duct to the modes in the duct generated by the actuator motion. The analysis will be used to couple the ANC actuators to the modal analysis propagation computer program for the annular duct, to predict the effects of active suppression of fan-generated engine noise sources. This combined program will then be available to assist in the design or evaluation of ANC systems in fan engine annular exhaust ducts. An analysis has been developed to predict the modes generated in an annular duct due to the coupling of flush-mounted ring actuators on the inner and outer walls of the duct. The analysis has been combined with a previous analysis for the coupling of modes to a cylindrical duct in a FORTRAN computer program to perform the computations. The method includes the effects of uniform mean flow in the duct. The program can be used for design or evaluation purposes for active noise control hardware for turbofan engines. Predictions for some sample cases modeled after the geometry of the NASA Lewis ANC Fan indicate very efficient coupling in both the inlet and exhaust ducts for the m = 6 spinning mode at frequencies where only a single radial mode is cut-on. Radial mode content in higher order cut-off modes at the source plane and the required actuator displacement amplitude to achieve 110 dB SPL levels in the desired mode were predicted. Equivalent cases with and without flow were examined for the cylindrical and annular geometry, and little difference was found for a duct flow Mach number of 0.1. The actuator ring coupling program will be adapted as a subroutine to the cylindrical duct modal analysis and the exhaust duct modal analysis. This will allow the fan source to be defined in terms of characteristic modes at the fan source plane and predict the propagation to the arbitrarily-located ANC source plane. The actuator velocities can then be determined to generate the anti-phase mode. The resulting combined fan source/ANC pressure can then be calculated at any desired wall sensor position. The actuator velocities can be determined manually or using a simulation of a control system feedback loop. This will provide a very useful ANC system design and evaluation tool.
Improving Quality of Life and Depression After Stroke Through Telerehabilitation
Linder, Susan M.; Rosenfeldt, Anson B.; Bay, R. Curtis; Sahu, Komal; Wolf, Steven L.
2015-01-01
OBJECTIVE. The aim of this study was to determine the effects of home-based robot-assisted rehabilitation coupled with a home exercise program compared with a home exercise program alone on depression and quality of life in people after stroke. METHOD. A multisite randomized controlled clinical trial was completed with 99 people <6 mo after stroke who had limited access to formal therapy. Participants were randomized into one of two groups, (1) a home exercise program or (2) a robot-assisted therapy + home exercise program, and participated in an 8-wk home intervention. RESULTS. We observed statistically significant changes in all but one domain on the Stroke Impact Scale and the Center for Epidemiologic Studies Depression Scale for both groups. CONCLUSION. A robot-assisted intervention coupled with a home exercise program and a home exercise program alone administered using a telerehabilitation model may be valuable approaches to improving quality of life and depression in people after stroke. PMID:26122686
Improving Quality of Life and Depression After Stroke Through Telerehabilitation.
Linder, Susan M; Rosenfeldt, Anson B; Bay, R Curtis; Sahu, Komal; Wolf, Steven L; Alberts, Jay L
2015-01-01
The aim of this study was to determine the effects of home-based robot-assisted rehabilitation coupled with a home exercise program compared with a home exercise program alone on depression and quality of life in people after stroke. A multisite randomized controlled clinical trial was completed with 99 people<6 mo after stroke who had limited access to formal therapy. Participants were randomized into one of two groups, (1) a home exercise program or (2) a robot-assisted therapy+home exercise program, and participated in an 8-wk home intervention. We observed statistically significant changes in all but one domain on the Stroke Impact Scale and the Center for Epidemiologic Studies Depression Scale for both groups. A robot-assisted intervention coupled with a home exercise program and a home exercise program alone administered using a telerehabilitation model may be valuable approaches to improving quality of life and depression in people after stroke. Copyright © 2015 by the American Occupational Therapy Association, Inc.
Negovanska, V; Hergueta, T; Guichart-Gomez, E; Dubois, B; Sarazin, M; Bungener, C
2011-02-01
Over the last decade, several programs have been developed for caregivers of Alzheimer disease patients. In France however, studies exploring their effects are still scarce. We conducted a study to compare two different interventions: a structured multidisciplinary program versus a classical intervention designed for Alzheimer disease patients and their spouses. Sixteen couples (Alzheimer's disease patient and spouse) residing in our administrative district participated in this monocentric study. For at least two years, these couples participated in a multidisciplinary program (n=8 couples) or received usual care (n=8 couples). The multidisciplinary program involved biannual consultations with a neurologist, a neuropsychologist and a psychologist, in addition to an annual meeting, stratified on the patient's MMSE score, for spouses). Usual care involved biannual consultations with the neurologist. The multidisciplinary program included a psychological intervention based on cognitive behavioral theories and centered on psycho-education, problem solving, adaptation strategies and on prevention of depression and anxiety. The spouses and the patients evaluated the 2-year follow-up during clinical interviews, completed by questionnaires. Sociodemographic data were noted for the patients and their spouses. Levels of depression and anxiety (Mini International Neuropsychiatric Inventory, Montgomery and Asberg Depression Scale, State-Trait Anxiety Inventory), perceived stress (Perceived Stress Scale) and care burden (Zarit Burden Inventory) were evaluated in spouses. Levels of cognitive impairment (Mini Mental State Examination), autonomy (Instrumental Activities of Daily Living), psychological state (Montgomery and Asberg Depression Scale, Covi Anxiety Scale), and behavioral symptoms frequency (Neuropsychiatric Inventory) were assessed in patients. The main significant result showed that the spouses' state of anxiety was lower among participants in the multidisciplinary program, compared with the classical neurological intervention. It also was found that the spouses and the patients who participated in this multidisciplinary program were less depressed. This study shows that a multidisciplinary structured intervention, with only two annual consultations and one annual meeting for spouses, can contribute to decrease significantly the spouses' state of anxiety. Further studies including a larger number of subjects should be conducted to confirm these findings. Copyright © 2010 Elsevier Masson SAS. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Xiong, Yi; Fakcharoenphol, Perapon; Wang, Shihao
2013-12-01
TOUGH2-EGS-MP is a parallel numerical simulation program coupling geomechanics with fluid and heat flow in fractured and porous media, and is applicable for simulation of enhanced geothermal systems (EGS). TOUGH2-EGS-MP is based on the TOUGH2-MP code, the massively parallel version of TOUGH2. In TOUGH2-EGS-MP, the fully-coupled flow-geomechanics model is developed from linear elastic theory for thermo-poro-elastic systems and is formulated in terms of mean normal stress as well as pore pressure and temperature. Reservoir rock properties such as porosity and permeability depend on rock deformation, and the relationships between these, obtained from poro-elasticity theories and empirical correlations, are incorporated into the simulation. This report provides the user with detailed information on the TOUGH2-EGS-MP mathematical model and instructions for using it for Thermal-Hydrological-Mechanical (THM) simulations. The mathematical model includes the fluid and heat flow equations, the geomechanical equation, and the discretization of those equations. In addition, the parallel aspects of the code, such as domain partitioning and communication between processors, are also included. Although TOUGH2-EGS-MP has the capability for simulating fluid and heat flows coupled with geomechanical effects, it is up to the user to select the specific coupling process, such as THM or only TH, in a simulation. There are several example problems illustrating applications of this program. These example problems are described in detail and their input data are presented. Their results demonstrate that this program can be used for field-scale geothermal reservoir simulation in porous and fractured media with fluid and heat flow coupled with geomechanical effects.
Developing Healthy Adolescents--A Progressive Health Care Partnership Program.
ERIC Educational Resources Information Center
Griesemer, Bernard A.; Hough, David L.
1993-01-01
A 1991 partnership coupling Southwest Missouri State University with Saint John's Regional Health Center spawned the Midwest Sports Medicine Center, originally designed to treat orthopedic injuries. Soon the center developed major educational initiatives, including SportsPACE, a program integrating health care programs into the secondary core…
ERIC Educational Resources Information Center
Vernon, David H.
1989-01-01
The paper reviews and critiques the 13 existing (1987) law school assistance programs and proposes a national repayment-assistance debt-forgiveness program which would involve an income-contingent repayment "tax" coupled with an assurance to creditors of repayment by means of a "guarantee" or "insurance" fund. (DB)
Daily Couple Experiences and Parent Affect in Families of Children with versus without Autism
Hartley, Sigan L.; DaWalt, Leann Smith; Schultz, Haley M.
2017-01-01
We examined daily couple experiences in 174 couples who had a child with autism spectrum disorder (ASD) relative to 179 couples who had a child without disabilities and their same-day association with parent affect. Parents completed a 14-day daily diary in which they reported time with partner, partner support, partner closeness, and positive and negative couple interactions and level of positive and negative affect. One-way multivariate analyses of covariance and dyadic multilevel models were conducted. Parents of children with ASD reported less time with partner, lower partner closeness, and fewer positive couple interactions than the comparison group. Daily couple experiences were more strongly associated with parent affect in the ASD than comparison group. Findings have implications for programs and supports. PMID:28275928
Decreasing Substance Use Risk Among African American Youth: Parent-based Mechanisms of Change
Beach, Steven R. H.; Barton, Allen W.; Lei, Man Kit; Mandara, Jelani; Wells, Ashley C.; Kogan, Steven M.; Brody, Gene H.
2017-01-01
African American couples (N = 139; 67.7% married; with children between the ages of 9 and 14) were randomly assigned to (a) a culturally sensitive, couple- and parenting-focused program designed to prevent stress-spillover (n = 70) or (b) an information-only control condition in which couples received self-help materials (n = 69). Eight months after baseline, youth whose parents participated in the program, compared with control youth, reported increased parental monitoring, positive racial socialization, and positive self-concept, as well as decreased conduct problems and self-reported substance use. Changes in youth-reported parenting behavior partially mediated the effect of the intervention on conduct problems and fully mediated its impact on positive self-concept, but did not mediate effects on lifetime substance use initiation. Results suggest the potential for a culturally sensitive family-based intervention targeting adults’ couple and parenting processes to enhance multiple parenting behaviors as well as decrease youths’ substance use onset and vulnerability. PMID:27129477
NASA Astrophysics Data System (ADS)
Lourderaj, Upakarasamy; Sun, Rui; Kohale, Swapnil C.; Barnes, George L.; de Jong, Wibe A.; Windus, Theresa L.; Hase, William L.
2014-03-01
The interface for VENUS and NWChem, and the resulting software package for direct dynamics simulations are described. The coupling of the two codes is considered to be a tight coupling since the two codes are compiled and linked together and act as one executable with data being passed between the two codes through routine calls. The advantages of this type of coupling are discussed. The interface has been designed to have as little interference as possible with the core codes of both VENUS and NWChem. VENUS is the code that propagates the direct dynamics trajectories and, therefore, is the program that drives the overall execution of VENUS/NWChem. VENUS has remained an essentially sequential code, which uses the highly parallel structure of NWChem. Subroutines of the interface that accomplish the data transmission and communication between the two computer programs are described. Recent examples of the use of VENUS/NWChem for direct dynamics simulations are summarized.
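The driver pattern described above, in which the dynamics code requests energies and gradients from the electronic-structure code at every step, can be sketched as a velocity Verlet loop with a gradient callback. In the sketch below the callback is a toy harmonic potential; in VENUS/NWChem the corresponding call goes into NWChem's linked, parallel routines, so this is an illustration of the control flow only, not the package's interface.

```python
import numpy as np

def electronic_structure_gradient(x, k=1.0):
    """Stand-in for the QM code: returns potential energy and gradient (toy harmonic)."""
    return 0.5 * k * np.dot(x, x), k * x

def velocity_verlet(x, v, mass=1.0, dt=0.05, nsteps=200):
    """Sequential dynamics driver that calls the 'QM' layer once per step."""
    energy, grad = electronic_structure_gradient(x)
    for _ in range(nsteps):
        v_half = v - 0.5 * dt * grad / mass
        x = x + dt * v_half
        energy, grad = electronic_structure_gradient(x)   # one QM call per step
        v = v_half - 0.5 * dt * grad / mass
    return x, v, energy + 0.5 * mass * np.dot(v, v)

x0, v0 = np.array([1.0, 0.0]), np.array([0.0, 1.0])
print(velocity_verlet(x0, v0))    # total energy should stay near its initial value
```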
78 FR 7399 - Application(s) for Duty-Free Entry of Scientific Instruments
Federal Register 2010, 2011, 2012, 2013, 2014
2013-02-01
... mechanisms, by using predicted topological properties of superconductors in two dimensions, to program fundamental couplings at near-atomic scales and quantum simulation...
A vector-dyadic development of the equations of motion for N-coupled rigid bodies and point masses
NASA Technical Reports Server (NTRS)
Frisch, H. P.
1974-01-01
The equations of motion are derived, in vector-dyadic format, for a topological tree of coupled rigid bodies, point masses, and symmetrical momentum wheels. These equations were programmed and form the basis for the general-purpose digital computer program N-BOD. A complete derivation of the equations of motion is included, along with a description of the methods used for kinematics, constraint elimination, and the inclusion of nongyroscopic forces and torques acting external or internal to the system.
Improvements and applications of COBRA-TF for stand-alone and coupled LWR safety analyses
DOE Office of Scientific and Technical Information (OSTI.GOV)
Avramova, M.; Cuervo, D.; Ivanov, K.
2006-07-01
The advanced thermal-hydraulic subchannel code COBRA-TF has been recently improved and applied for stand-alone and coupled LWR core calculations at the Pennsylvania State Univ. in cooperation with AREVA NP GmbH (Germany) and the Technical Univ. of Madrid. To enable COBRA-TF for academic and industrial applications, including safety margins evaluations and LWR core design analyses, the code programming, numerics, and basic models were revised and substantially improved. The code has undergone an extensive validation, verification, and qualification program. (authors)
Coupled structural/thermal/electromagnetic analysis/tailoring of graded composite structures
NASA Technical Reports Server (NTRS)
Hartle, M. S.; Mcknight, R. L.; Huang, H.; Holt, R.
1992-01-01
Described here are the accomplishments of a 5-year program to develop a methodology for coupled structural, thermal, electromagnetic analysis tailoring of graded component structures. The capabilities developed over the course of the program are the analyzer module and the tailoring module for the modeling of graded materials. Highlighted accomplishments for the past year include the addition of a buckling analysis capability, the addition of mode shape slope calculation for flutter analysis, verification of the analysis modules using simulated components, and verification of the tailoring module.
Neural network error correction for solving coupled ordinary differential equations
NASA Technical Reports Server (NTRS)
Shelton, R. O.; Darsey, J. A.; Sumpter, B. G.; Noid, D. W.
1992-01-01
A neural network is presented to learn errors generated by a numerical algorithm for solving coupled nonlinear differential equations. The method is based on using a neural network to correctly learn the error generated by, for example, Runge-Kutta on a model molecular dynamics (MD) problem. The neural network programs used in this study were developed by NASA. Comparisons are made for training the neural network using backpropagation and a new method which was found to converge with fewer iterations. The neural net programs, the MD model and the calculations are discussed.
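A minimal sketch of the idea, assuming a simple harmonic oscillator as the model problem and a scikit-learn regressor in place of the NASA neural-network codes: the network is trained on the difference between a coarse Runge-Kutta step and a finer reference solution, and its prediction is then added as a correction to each coarse step.

```python
# Illustrative sketch (not the NASA neural-net codes): learn the local error of
# a coarse fixed-step RK4 integrator on a model oscillator, then correct new
# steps with the trained network.
import numpy as np
from sklearn.neural_network import MLPRegressor

def f(y):                      # model ODE: simple harmonic oscillator, y = [x, v]
    return np.array([y[1], -y[0]])

def rk4_step(y, h):
    k1 = f(y); k2 = f(y + 0.5*h*k1); k3 = f(y + 0.5*h*k2); k4 = f(y + h*k3)
    return y + (h/6.0)*(k1 + 2*k2 + 2*k3 + k4)

def reference_step(y, h, n_sub=50):   # much finer RK4 used as the "truth"
    hs = h / n_sub
    for _ in range(n_sub):
        y = rk4_step(y, hs)
    return y

# Training data: state -> error of the coarse step relative to the reference.
rng = np.random.default_rng(0)
h = 0.5
states = rng.uniform(-1.0, 1.0, size=(2000, 2))
errors = np.array([reference_step(y, h) - rk4_step(y, h) for y in states])

net = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=2000, random_state=0)
net.fit(states, errors)

# Corrected integration: coarse RK4 plus the learned error term.
y = np.array([1.0, 0.0])
for _ in range(20):
    y = rk4_step(y, h) + net.predict(y.reshape(1, -1))[0]
```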
NASA Astrophysics Data System (ADS)
Karimabadi, Homa
2012-03-01
Recent advances in simulation technology and hardware are enabling breakthrough science where many longstanding problems can now be addressed for the first time. In this talk, we focus on kinetic simulations of the Earth's magnetosphere and magnetic reconnection process which is the key mechanism that breaks the protective shield of the Earth's dipole field, allowing the solar wind to enter the Earth's magnetosphere. This leads to the so-called space weather where storms on the Sun can affect space-borne and ground-based technological systems on Earth. The talk will consist of three parts: (a) overview of a new multi-scale simulation technique where each computational grid is updated based on its own unique timestep, (b) Presentation of a new approach to data analysis that we refer to as Physics Mining which entails combining data mining and computer vision algorithms with scientific visualization to extract physics from the resulting massive data sets. (c) Presentation of several recent discoveries in studies of space plasmas including the role of vortex formation and resulting turbulence in magnetized plasmas.
The Spider Center Wide File System; From Concept to Reality
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shipman, Galen M; Dillow, David A; Oral, H Sarp
2009-01-01
The Leadership Computing Facility (LCF) at Oak Ridge National Laboratory (ORNL) has a diverse portfolio of computational resources ranging from a petascale XT4/XT5 simulation system (Jaguar) to numerous other systems supporting development, visualization, and data analytics. In order to support the vastly different I/O needs of these systems, Spider, a Lustre-based center-wide file system, was designed and deployed to provide over 240 GB/s of aggregate throughput with over 10 Petabytes of formatted capacity. A multi-stage InfiniBand network, dubbed the Scalable I/O Network (SION), with over 889 GB/s of bisectional bandwidth was deployed as part of Spider to provide connectivity to our simulation, development, visualization, and other platforms. To our knowledge, as of this writing, Spider is the largest and fastest POSIX-compliant parallel file system in production. This paper will detail the overall architecture of the Spider system, challenges in deploying and initial testing of a file system of this scale, and novel solutions to these challenges which offer key insights into file system design in the future.
Computational biology in the cloud: methods and new insights from computing at scale.
Kasson, Peter M
2013-01-01
The past few years have seen both explosions in the size of biological data sets and the proliferation of new, highly flexible on-demand computing capabilities. The sheer amount of information available from genomic and metagenomic sequencing, high-throughput proteomics, experimental and simulation datasets on molecular structure and dynamics affords an opportunity for greatly expanded insight, but it creates new challenges of scale for computation, storage, and interpretation of petascale data. Cloud computing resources have the potential to help solve these problems by offering a utility model of computing and storage: near-unlimited capacity, the ability to burst usage, and cheap and flexible payment models. Effective use of cloud computing on large biological datasets requires dealing with non-trivial problems of scale and robustness, since performance-limiting factors can change substantially when a dataset grows by a factor of 10,000 or more. New computing paradigms are thus often needed. The use of cloud platforms also creates new opportunities to share data, reduce duplication, and to provide easy reproducibility by making the datasets and computational methods easily available.
Are Earth System model software engineering practices fit for purpose? A case study.
NASA Astrophysics Data System (ADS)
Easterbrook, S. M.; Johns, T. C.
2009-04-01
We present some analysis and conclusions from a case study of the culture and practices of scientists at the Met Office and Hadley Centre working on the development of software for climate and Earth System models using the MetUM infrastructure. The study examined how scientists think about software correctness, prioritize their requirements in making changes, and develop a shared understanding of the resulting models. We conclude that highly customized techniques driven strongly by scientific research goals have evolved for verification and validation of such models. In a formal software engineering context, these represent costly, but invaluable, software integration tests with considerable benefits. The software engineering practices seen also exhibit recognisable features of both agile and open source software development projects - self-organisation of teams consistent with a meritocracy rather than top-down organisation, extensive use of informal communication channels, and software developers who are generally also users and science domain experts. We draw some general conclusions on whether these practices work well, and what new software engineering challenges may lie ahead as Earth System models become ever more complex and petascale computing becomes the norm.
Autonomic Closure for Turbulent Flows Using Approximate Bayesian Computation
NASA Astrophysics Data System (ADS)
Doronina, Olga; Christopher, Jason; Hamlington, Peter; Dahm, Werner
2017-11-01
Autonomic closure is a new technique for achieving fully adaptive and physically accurate closure of coarse-grained turbulent flow governing equations, such as those solved in large eddy simulations (LES). Although autonomic closure has been shown in recent a priori tests to more accurately represent unclosed terms than do dynamic versions of traditional LES models, the computational cost of the approach makes it challenging to implement for simulations of practical turbulent flows at realistically high Reynolds numbers. The optimization step used in the approach introduces large matrices that must be inverted and is highly memory intensive. In order to reduce memory requirements, here we propose to use approximate Bayesian computation (ABC) in place of the optimization step, thereby yielding a computationally-efficient implementation of autonomic closure that trades memory-intensive for processor-intensive computations. The latter challenge can be overcome as co-processors such as general purpose graphical processing units become increasingly available on current generation petascale and exascale supercomputers. In this work, we outline the formulation of ABC-enabled autonomic closure and present initial results demonstrating the accuracy and computational cost of the approach.
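The following is a schematic sketch of the ABC rejection step that the abstract proposes in place of the optimization: candidate closure coefficients are drawn from a prior and retained only when the modeled unclosed term falls within a tolerance of the reference data. The linear closure form and synthetic data are invented solely for illustration.

```python
# Schematic of the approximate Bayesian computation (ABC) rejection step that
# replaces a memory-intensive least-squares solve: sample candidate closure
# coefficients from a prior, evaluate the modeled unclosed term by forward
# calculation only, and keep samples whose mismatch with the reference data
# falls below a tolerance. The closure form and data here are synthetic.
import numpy as np

rng = np.random.default_rng(1)

# Synthetic "reference" unclosed term built from known coefficients plus noise.
basis = rng.normal(size=(500, 3))          # resolved-field basis functions at 500 points
c_true = np.array([0.8, -0.3, 0.1])
tau_ref = basis @ c_true + 0.05 * rng.normal(size=500)

def distance(c):
    """Normalized mismatch between modeled and reference unclosed terms."""
    return np.linalg.norm(basis @ c - tau_ref) / np.linalg.norm(tau_ref)

# ABC rejection sampling: no matrix inversion, only forward evaluations.
n_samples, tol = 200_000, 0.10
prior = rng.uniform(-1.0, 1.0, size=(n_samples, 3))
accepted = np.array([c for c in prior if distance(c) < tol])

c_post = accepted.mean(axis=0)             # posterior-mean closure coefficients
print(f"accepted {len(accepted)} samples, coefficients ~ {c_post}")
```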
Katouda, Michio; Naruse, Akira; Hirano, Yukihiko; Nakajima, Takahito
2016-11-15
A new parallel algorithm and its implementation for the RI-MP2 energy calculation utilizing peta-flop-class many-core supercomputers are presented. Several improvements over the previous algorithm (J. Chem. Theory Comput. 2013, 9, 5373) have been made: (1) a dual-level hierarchical parallelization scheme that enables the use of more than 10,000 Message Passing Interface (MPI) processes and (2) a new data communication scheme that reduces network communication overhead. A multi-node and multi-GPU implementation of the present algorithm is presented for calculations on a central processing unit (CPU)/graphics processing unit (GPU) hybrid supercomputer. Benchmark results of the new algorithm and its implementation using the K computer (CPU clustering system) and TSUBAME 2.5 (CPU/GPU hybrid system) demonstrate high efficiency. The peak performance of 3.1 PFLOPS is attained using 80,199 nodes of the K computer. The peak performance of the multi-node and multi-GPU implementation is 514 TFLOPS using 1349 nodes and 4047 GPUs of TSUBAME 2.5. © 2016 Wiley Periodicals, Inc.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Greenough, Jeffrey A.; de Supinski, Bronis R.; Yates, Robert K.
2005-04-25
We describe the performance of the block-structured Adaptive Mesh Refinement (AMR) code Raptor on the 32k node IBM BlueGene/L computer. This machine represents a significant step forward towards petascale computing. As such, it presents Raptor with many challenges for utilizing the hardware efficiently. In terms of performance, Raptor shows excellent weak and strong scaling when running in single level mode (no adaptivity). Hardware performance monitors show Raptor achieves an aggregate performance of 3.0 Tflops in the main integration kernel on the 32k system. Results from preliminary AMR runs on a prototype astrophysical problem demonstrate the efficiency of the current software when running at large scale. The BG/L system is enabling a physics problem to be considered that represents a factor of 64 increase in overall size compared to the largest ones of this type computed to date. Finally, we provide a description of the development work currently underway to address our inefficiencies.
Matrix Algebra for GPU and Multicore Architectures (MAGMA) for Large Petascale Systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dongarra, Jack J.; Tomov, Stanimire
2014-03-24
The goal of the MAGMA project is to create a new generation of linear algebra libraries that achieve the fastest possible time to an accurate solution on hybrid Multicore+GPU-based systems, using all the processing power that future high-end systems can make available within given energy constraints. Our efforts at the University of Tennessee achieved the goals set in all of the five areas identified in the proposal: 1. Communication optimal algorithms; 2. Autotuning for GPU and hybrid processors; 3. Scheduling and memory management techniques for heterogeneity and scale; 4. Fault tolerance and robustness for large scale systems; 5. Building energy efficiency into software foundations. The University of Tennessee's main contributions, as proposed, were the research and software development of new algorithms for hybrid multi/many-core CPUs and GPUs, as related to two-sided factorizations and complete eigenproblem solvers, hybrid BLAS, and energy efficiency for dense, as well as sparse, operations. Furthermore, as proposed, we investigated and experimented with various techniques targeting the five main areas outlined.
Lagrangian ocean analysis: Fundamentals and practices
van Sebille, Erik; Griffies, Stephen M.; Abernathey, Ryan; ...
2017-11-24
Lagrangian analysis is a powerful way to analyse the output of ocean circulation models and other ocean velocity data such as from altimetry. In the Lagrangian approach, large sets of virtual particles are integrated within the three-dimensional, time-evolving velocity fields. A variety of tools and methods for this purpose have emerged, over several decades. Here, we review the state of the art in the field of Lagrangian analysis of ocean velocity data, starting from a fundamental kinematic framework and with a focus on large-scale open ocean applications. Beyond the use of explicit velocity fields, we consider the influence of unresolved physics and dynamics on particle trajectories. We comprehensively list and discuss the tools currently available for tracking virtual particles. We then showcase some of the innovative applications of trajectory data, and conclude with some open questions and an outlook. Our overall goal of this review paper is to reconcile some of the different techniques and methods in Lagrangian ocean analysis, while recognising the rich diversity of codes that have and continue to emerge, and the challenges of the coming age of petascale computing.
NASA Astrophysics Data System (ADS)
Zhou, Quan; Liu, Lijun; Hu, Jiashun
2018-05-01
In the version of this Article originally published, data points representing mafic eruptions were missing from Fig. 4b; the corrected version is shown below. Furthermore, the authors omitted to include the following acknowledgements to the provider of the computational resources: "This research is part of the Blue Waters sustained-petascale computing project, which is supported by the National Science Foundation (awards OCI-0725070 and ACI-1238993) and the state of Illinois. Blue Waters is a joint effort of the University of Illinois at Urbana-Champaign and its National Center for Supercomputing Applications. This work is also part of the `PRAC Title 4-D Geodynamic Modeling With Data Assimilation: Origin Of Intra-Plate Volcanism In The Pacific Northwest' PRAC allocation supported by the National Science Foundation (award number ACI 1516586). This work also used the Extreme Science and Engineering Discovery Environment (XSEDE), which is supported by National Science Foundation grant number ACI-1548562." Figure 4 and the Acknowledgements section have been updated in the online version of the Article.
Lagrangian ocean analysis: Fundamentals and practices
NASA Astrophysics Data System (ADS)
van Sebille, Erik; Griffies, Stephen M.; Abernathey, Ryan; Adams, Thomas P.; Berloff, Pavel; Biastoch, Arne; Blanke, Bruno; Chassignet, Eric P.; Cheng, Yu; Cotter, Colin J.; Deleersnijder, Eric; Döös, Kristofer; Drake, Henri F.; Drijfhout, Sybren; Gary, Stefan F.; Heemink, Arnold W.; Kjellsson, Joakim; Koszalka, Inga Monika; Lange, Michael; Lique, Camille; MacGilchrist, Graeme A.; Marsh, Robert; Mayorga Adame, C. Gabriela; McAdam, Ronan; Nencioli, Francesco; Paris, Claire B.; Piggott, Matthew D.; Polton, Jeff A.; Rühs, Siren; Shah, Syed H. A. M.; Thomas, Matthew D.; Wang, Jinbo; Wolfram, Phillip J.; Zanna, Laure; Zika, Jan D.
2018-01-01
Lagrangian analysis is a powerful way to analyse the output of ocean circulation models and other ocean velocity data such as from altimetry. In the Lagrangian approach, large sets of virtual particles are integrated within the three-dimensional, time-evolving velocity fields. Over several decades, a variety of tools and methods for this purpose have emerged. Here, we review the state of the art in the field of Lagrangian analysis of ocean velocity data, starting from a fundamental kinematic framework and with a focus on large-scale open ocean applications. Beyond the use of explicit velocity fields, we consider the influence of unresolved physics and dynamics on particle trajectories. We comprehensively list and discuss the tools currently available for tracking virtual particles. We then showcase some of the innovative applications of trajectory data, and conclude with some open questions and an outlook. The overall goal of this review paper is to reconcile some of the different techniques and methods in Lagrangian ocean analysis, while recognising the rich diversity of codes that have and continue to emerge, and the challenges of the coming age of petascale computing.
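A minimal sketch of the Lagrangian approach reviewed above, assuming an analytic double-gyre velocity field in place of ocean-model or altimetry-derived velocities: virtual particles are integrated through the time-evolving field with a fourth-order Runge-Kutta scheme.

```python
# Minimal sketch of Lagrangian particle tracking: integrate virtual particles
# through a velocity field with RK4. An analytic, time-dependent double-gyre
# stream function stands in for ocean-model or altimetry-derived velocities.
import numpy as np

def velocity(x, y, t, A=0.1, eps=0.25, omega=2*np.pi/10):
    """Classic time-dependent double-gyre velocity field on [0,2] x [0,1]."""
    a = eps * np.sin(omega * t)
    b = 1.0 - 2.0 * eps * np.sin(omega * t)
    f = a * x**2 + b * x
    dfdx = 2.0 * a * x + b
    u = -np.pi * A * np.sin(np.pi * f) * np.cos(np.pi * y)
    v = np.pi * A * np.cos(np.pi * f) * np.sin(np.pi * y) * dfdx
    return u, v

def rk4_advect(x, y, t, dt):
    """One RK4 step for all particles at once."""
    k1u, k1v = velocity(x, y, t)
    k2u, k2v = velocity(x + 0.5*dt*k1u, y + 0.5*dt*k1v, t + 0.5*dt)
    k3u, k3v = velocity(x + 0.5*dt*k2u, y + 0.5*dt*k2v, t + 0.5*dt)
    k4u, k4v = velocity(x + dt*k3u, y + dt*k3v, t + dt)
    x_new = x + (dt/6.0)*(k1u + 2*k2u + 2*k3u + k4u)
    y_new = y + (dt/6.0)*(k1v + 2*k2v + 2*k3v + k4v)
    return x_new, y_new

# Seed a small cloud of virtual particles and track their trajectories.
rng = np.random.default_rng(0)
x = rng.uniform(0.8, 1.2, 1000)
y = rng.uniform(0.4, 0.6, 1000)
dt, t = 0.1, 0.0
trajectory = [(x.copy(), y.copy())]
for _ in range(200):
    x, y = rk4_advect(x, y, t, dt)
    t += dt
    trajectory.append((x.copy(), y.copy()))
```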
DNS of droplet motion in a turbulent flow
NASA Astrophysics Data System (ADS)
Rosso, Michele; Elghobashi, S.
2013-11-01
The objective of our research is to study the multi-way interactions between turbulence and vaporizing liquid droplets by performing direct numerical simulations (DNS). The freely-moving droplets are fully resolved in 3D space and time and all the relevant scales of the turbulent motion are simultaneously resolved down to the smallest length- and time-scales. Our DNS solve the unsteady three-dimensional Navier-Stokes and continuity equations throughout the whole computational domain, including the interior of the liquid droplets. The droplet surface motion and deformation are captured accurately by using the Level Set method. The pressure jump condition, density and viscosity discontinuities across the interface as well as surface tension are accounted for. Here, we present only the results of the first stage of our research which considers the effects of turbulence on the shape change of an initially spherical liquid droplet, at density ratio (of liquid to carrier fluid) of 1000, moving in isotropic turbulent flow. We validate our results via comparison with available experimental data. This research has been supported by NSF-CBET Award 0933085 and NSF PRAC (Petascale Computing Resource Allocation) Award.
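A toy one-dimensional sketch of the level-set idea mentioned above: the interface is represented implicitly as the zero crossing of a signed-distance function that is advected with the local velocity. The first-order upwind scheme and uniform velocity are simplifications for illustration; the DNS described uses far more accurate discretizations and reinitialization.

```python
# Toy 1-D sketch of the level-set idea used to capture the droplet surface: the
# interface is the zero crossing of a signed-distance function phi that is
# advected with the local velocity. First-order upwind differencing keeps the
# example short; production DNS codes use high-order schemes plus
# reinitialization.
import numpy as np

n, L = 400, 1.0
x = np.linspace(0.0, L, n)
dx = x[1] - x[0]
u = 0.3 * np.ones(n)                    # uniform velocity to the right

# Signed distance to a "droplet" occupying 0.2 < x < 0.4 (negative inside).
phi = np.maximum(0.2 - x, x - 0.4)

dt = 0.4 * dx / np.max(np.abs(u))       # CFL-limited time step
for _ in range(300):
    dphi = np.zeros(n)
    dphi[1:] = (phi[1:] - phi[:-1]) / dx   # backward (upwind) difference for u > 0
    phi = phi - dt * u * dphi

# Interface positions: sign changes of phi
crossings = x[:-1][np.sign(phi[:-1]) != np.sign(phi[1:])]
print("interface located near:", crossings)
```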
Computer program to compute buckling loads of simply supported anisotropic plates
NASA Technical Reports Server (NTRS)
Chamis, C. C.
1973-01-01
Program handles several types of composites and several load conditions for each plate, including both compressive and tensile membrane loads, and bending-stretching coupling via the concept of reduced bending rigidities. Vibration frequencies of homogeneous or layered anisotropic plates can be calculated by slightly modifying the program.
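As a hedged illustration of the kind of calculation such a program performs, the sketch below evaluates the classical closed-form buckling load for a simply supported, specially orthotropic plate under uniaxial compression, minimizing over the half-wave numbers; the reduced-bending-rigidity treatment of bending-stretching coupling is not reproduced here, and the stiffness values are illustrative.

```python
# Hedged sketch of the classical buckling result for a simply supported,
# specially orthotropic plate under uniaxial compression N_x; the
# bending-stretching coupling handled in the program via reduced bending
# rigidities is not reproduced here.
import numpy as np

def nx_critical(D11, D12, D22, D66, a, b, m_max=10, n_max=10):
    """Minimum buckling load N_x,cr over the half-wave numbers (m, n)."""
    best = np.inf
    for m in range(1, m_max + 1):
        for n in range(1, n_max + 1):
            num = (D11 * (m / a)**4
                   + 2.0 * (D12 + 2.0 * D66) * (m / a)**2 * (n / b)**2
                   + D22 * (n / b)**4)
            best = min(best, np.pi**2 * num / (m / a)**2)
    return best

# Example with illustrative bending stiffnesses (N*m) and plate dimensions (m).
print(nx_critical(D11=50.0, D12=15.0, D22=20.0, D66=12.0, a=0.5, b=0.25))
```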
Facts on Employee Assistance Programs. Clearinghouse Fact Sheet.
ERIC Educational Resources Information Center
Desmond, Thomas C.
Employee Assistance Programs (EAPs) offer employees confidential, professional assistance for the kinds of personal problems that adversely affect their lives and their jobs. These programs started when acute worker shortages during World War II, coupled with the successes of Alcoholics Anonymous, prompted some companies in the 1940s to develop…
Inexpensive Timeshared Graphics on the SIGMA 7.
ERIC Educational Resources Information Center
Bork, Alfred M.
This paper gives a technical description of various computer graphics programs developed on the Sigma 7 computer. Terminals used are the Adage 100 and the Tektronix 4002-4010. Commands are Metasymbol procedures which access Metasymbol library subroutines; programs can also be coupled with FORTRAN programs. Available, inexpensive graphic terminals…
Programmed Writing as a Therapeutic Intervention: A Review of the Literature.
ERIC Educational Resources Information Center
Bibby, Robert Christopher
Traumatic life events are stressful and when coupled with decreased emotional expression they can have deleterious effects on a person's psychological and physical health. The empirical literature on programmed assignments, specifically Programmed Writing (PW) as a therapeutic intervention, was examined for its potential as a means to facilitate…
Hybrid Circuit Quantum Electrodynamics: Coupling a Single Silicon Spin Qubit to a Photon
2015-01-01
Report documentation fragment: Hybrid Circuit Quantum Electrodynamics: Coupling a Single Silicon Spin Qubit to a Photon. Princeton University, January 2015, final report; contract number FA8750-12-2-0296; author: Jason R. Petta. Subject terms: quantum computing, quantum hybrid circuits, quantum electrodynamics, coupling a single silicon spin qubit to a photon.
Physics of Coupled CME and Flare Systems
2016-12-21
Report documentation fragment: AFRL-RV-PS-TR-2016-0162, Physics of Coupled CME and Flare Systems, K. S. Balasubramaniam et al., final report, 21 December 2016 (reporting period ending 30 Sep 2016); program element number 61102F. The objectives for this task were to (i) derive measurable physical properties and discernible structural circumstances in solar active regions that…
NASA Astrophysics Data System (ADS)
Lee, J. S.; Carena, M.; Ellis, J.; Pilaftsis, A.; Wagner, C. E. M.
2009-02-01
We describe the Fortran code CPsuperH2.0, which contains several improvements and extensions of its predecessor CPsuperH. It implements improved calculations of the Higgs-boson pole masses, notably a full treatment of the 4×4 neutral Higgs propagator matrix including the Goldstone boson and a more complete treatment of threshold effects in self-energies and Yukawa couplings, improved treatments of two-body Higgs decays, some important three-body decays, and two-loop Higgs-mediated contributions to electric dipole moments. CPsuperH2.0 also implements an integrated treatment of several B-meson observables, including the branching ratios of B→μμ, B→ττ, B→τν, B→Xγ and the latter's CP-violating asymmetry A, and the supersymmetric contributions to the B⁰s,d-B̄⁰s,d mass differences. These additions make CPsuperH2.0 an attractive integrated tool for analyzing supersymmetric CP and flavour physics as well as searches for new physics at high-energy colliders such as the Tevatron, LHC and linear colliders. Program summary: Program title: CPsuperH2.0 Catalogue identifier: ADSR_v2_0 Program summary URL: http://cpc.cs.qub.ac.uk/summaries/ADSR_v2_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html No. of lines in distributed program, including test data, etc.: 13 290 No. of bytes in distributed program, including test data, etc.: 89 540 Distribution format: tar.gz Programming language: Fortran 77 Computer: PC running under Linux and computers in Unix environment Operating system: Linux RAM: 32 Mbytes Classification: 11.1 Catalogue identifier of the previous version: ADSR_v1_0 Journal reference of the previous version: CPC 156 (2004) 283 Does the new version supersede the previous version?: Yes Nature of problem: The calculations of mass spectrum, decay widths and branching ratios of the neutral and charged Higgs bosons in the Minimal Supersymmetric Standard Model with explicit CP violation have been improved. The program is based on recent renormalization-group-improved diagrammatic calculations that include dominant higher-order logarithmic and threshold corrections, b-quark Yukawa-coupling resummation effects and improved treatment of Higgs-boson pole-mass shifts. The couplings of the Higgs bosons to the Standard Model gauge bosons and fermions, to their supersymmetric partners and all the trilinear and quartic Higgs-boson self-couplings are also calculated. The new implementations include a full treatment of the 4×4 (2×2) neutral (charged) Higgs propagator matrix together with the center-of-mass dependent Higgs-boson couplings to gluons and photons, two-loop Higgs-mediated contributions to electric dipole moments, and an integrated treatment of several B-meson observables. Solution method: One-dimensional numerical integration for several Higgs-decay modes, iterative treatment of the threshold corrections and Higgs-boson pole masses, and the numerical diagonalization of the neutralino mass matrix. Reasons for new version: Mainly to provide a coherent numerical framework which calculates consistently observables for both low- and high-energy experiments. Summary of revisions: Improved treatment of Higgs-boson masses and propagators. Improved treatment of Higgs-boson couplings and decays. Higgs-mediated two-loop electric dipole moments. B-meson observables. Running time: Less than 0.1 seconds. The program may be obtained from http://www.hep.man.ac.uk/u/jslee/CPsuperH.html.
Test program, helium II orbital resupply coupling
NASA Technical Reports Server (NTRS)
Hyatt, William S.
1991-01-01
The full scope of this program was to have included development tests, design and production of custom test equipment and acceptance and qualification testing of prototype and protoflight coupling hardware. This program was performed by Ball Aerospace Systems Division, Boulder, Colorado until its premature termination in May 1991. Development tests were performed on cryogenic face seals and flow control devices at superfluid helium (He II) conditions. Special equipment was developed to allow quantified leak detection at large leak rates up to 8.4 x 10(exp -4) SCCS. Two major fixtures were developed and characterized: The Cryogenic Test Fixture (CTF) and the Thermal Mismatch Fixture (Glovebox). The CTF allows the coupling hardware to be filled with liquid nitrogen (LN2), liquid helium (LHe) or sub-cooled liquid helium when hardware flow control valves are either open or closed. Heat leak measurements, internal and external helium leakage measurements, cryogenic proof pressure tests and external load applications are performed in this fixture. Special reusable MLI closures were developed to provide repeatable installations in the CTF. The Thermal Mismatch Fixture allows all design configurations of coupling hardware to be engaged and disengaged while measuring applied forces and torques. Any two hardware components may be individually thermally preconditioned within the range of 117 deg K to 350 deg K prior to engage/disengage cycling. This verifies dimensional compatibility and operation when thermally mismatched. A clean, dry GN2 atmosphere is maintained in the fixture at all times. The first shipset of hardware was received, inspected and cycled at room temperature just prior to program termination.
Kim Halford, W; Pepping, Christopher A; Hilpert, Peter; Bodenmann, Guy; Wilson, Keithia L; Busby, Dean; Larson, Jeffry; Holman, Thomas
2015-05-01
Couple relationship education (RE) usually is conceived of as relationship enhancement for currently satisfied couples, with a goal of helping couples sustain satisfaction. However, RE also might be useful as a brief, accessible intervention for couples with low satisfaction. Two studies were conducted that tested whether couples with low relationship satisfaction show meaningful gains after RE. Study 1 was a three-condition randomized controlled trial in which 182 couples were randomly assigned to RELATE with Couple CARE (RCC), a flexible delivery education program for couples, or one of two control conditions. Couples with initially low satisfaction receiving RCC showed a moderate increase in relationship satisfaction (d=0.50) relative to the control. In contrast, couples initially high in satisfaction showed little change and there was no difference between RCC and the control conditions. Study 2 was an uncontrolled trial of the Couple Coping Enhancement Training (CCET) administered to 119 couples. Couples receiving CCET that had initially low satisfaction showed a moderate increase in satisfaction (g=.44), whereas initially highly satisfied couples showed no change. Brief relationship education can assist somewhat distressed couples to enhance satisfaction, and has potential as a cost-effective way of enhancing the reach of couple interventions. Copyright © 2015. Published by Elsevier Ltd.
Program Aids Analysis And Optimization Of Design
NASA Technical Reports Server (NTRS)
Rogers, James L., Jr.; Lamarsh, William J., II
1994-01-01
NETS/PROSSS (NETS Coupled With Programming System for Structural Synthesis) computer program developed to provide system for combining NETS (MSC-21588), neural-network application program, and CONMIN (Constrained Function Minimization, ARC-10836), optimization program. Enables user to reach nearly optimal design. Design then used as starting point in normal optimization process, possibly enabling user to converge to optimal solution in significantly fewer iterations. NETS/PROSSS written in C language and FORTRAN 77.
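A sketch of the surrogate-seeded optimization pattern described above, not the NETS/PROSSS code itself: a small regression model trained on previously optimized designs predicts a near-optimal starting point for a new load case, which a conventional optimizer then refines, typically in fewer iterations than a cold start. The objective function and data are invented for illustration.

```python
# Sketch of surrogate-seeded optimization (illustrative, not NETS/PROSSS): a
# regression model trained on past optimal designs supplies a warm-start point
# that a gradient-based optimizer refines for a new load case.
import numpy as np
from sklearn.neural_network import MLPRegressor
from scipy.optimize import minimize

def objective(x, load):
    """Toy structural objective: a penalized 'weight' depending on the load case."""
    return np.sum(x**2) + load * np.sum((x - load) ** 4)

# Database of past optimizations: load case -> optimal design variables.
rng = np.random.default_rng(0)
loads = rng.uniform(0.5, 2.0, size=200)
optima = np.array([minimize(objective, np.ones(3), args=(L,)).x for L in loads])

surrogate = MLPRegressor(hidden_layer_sizes=(16, 16), max_iter=3000, random_state=0)
surrogate.fit(loads.reshape(-1, 1), optima)

# New load case: seed the optimizer with the surrogate's prediction.
load_new = 1.3
x0 = surrogate.predict([[load_new]])[0]
cold = minimize(objective, np.ones(3), args=(load_new,))
warm = minimize(objective, x0, args=(load_new,))
print("iterations, cold vs. warm start:", cold.nit, warm.nit)
```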
Marital Dissolution among Interracial Couples
ERIC Educational Resources Information Center
Zhang, Yuanting; Van Hook, Jennifer
2009-01-01
Increases in interracial marriage have been interpreted as reflecting reduced social distance among racial and ethnic groups, but little is known about the stability of interracial marriages. Using six panels of Survey of Income and Program Participation (N = 23,139 married couples), we found that interracial marriages are less stable than…
The effect of a couples intervention to increase breast cancer screening among Korean Americans.
Lee, Eunice; Menon, Usha; Nandy, Karabi; Szalacha, Laura; Kviz, Frederick; Cho, Young; Miller, Arlene; Park, Hanjong
2014-05-01
To assess the efficacy of Korean Immigrants and Mammography-Culture-Specific Health Intervention (KIM-CHI), an educational program for Korean American (KA) couples designed to improve mammography uptake among KA women. A two-group cluster randomized, longitudinal, controlled design. 50 KA religious organizations in the Chicago area. 428 married KA women 40 years of age or older who had not had a mammogram in the past year. The women and their husbands were recruited from 50 KA religious organizations. Couples were randomly assigned to intervention or attention control groups. Those in the KIM-CHI program (n = 211 couples) were compared to an attention control group (n = 217 couples) at baseline, as well as at 6 and 15 months postintervention on mammogram uptake. Sociodemographic variables and mammography uptake were measured. Level of acculturation was measured using the Suinn-Lew Asian Self-Identity Acculturation Scale. Researchers asked questions about healthcare resources and use, health insurance status, usual source of care, physical examinations in the past two years, family history of breast cancer, and history of mammography. The KIM-CHI group showed statistically significant increases in mammography uptake compared to the attention control group at 6 months and 15 months postintervention. The culturally targeted KIM-CHI program was effective in increasing mammogram uptake among nonadherent KA women. Nurses and healthcare providers should consider specific health beliefs as well as inclusion of husbands or significant others. They also should target education to be culturally relevant for KA women to effectively improve frequency of breast cancer screening.
Intrasystem Analysis Program (IAP) code summaries
NASA Astrophysics Data System (ADS)
Dobmeier, J. J.; Drozd, A. L. S.; Surace, J. A.
1983-05-01
This report contains detailed descriptions and capabilities of the codes that comprise the Intrasystem Analysis Program. The four codes are: Intrasystem Electromagnetic Compatibility Analysis Program (IEMCAP), General Electromagnetic Model for the Analysis of Complex Systems (GEMACS), Nonlinear Circuit Analysis Program (NCAP), and Wire Coupling Prediction Models (WIRE). IEMCAP is used for computer-aided evaluation of electromagnetic compatibility (EMC) at all stages of an Air Force system's life cycle, applicable to aircraft, space/missile, and ground-based systems. GEMACS utilizes a Method of Moments (MOM) formalism with the Electric Field Integral Equation (EFIE) for the solution of electromagnetic radiation and scattering problems. The code employs both full matrix decomposition and Banded Matrix Iteration solution techniques and is expressly designed for large problems. NCAP is a circuit analysis code which uses the Volterra approach to solve for the transfer functions and node voltages of weakly nonlinear circuits. The WIRE programs deal with the application of multiconductor transmission line theory to the prediction of cable coupling for specific classes of problems.
Coupling Network Computing Applications in Air-cooled Turbine Blades Optimization
NASA Astrophysics Data System (ADS)
Shi, Liang; Yan, Peigang; Xie, Ming; Han, Wanjin
2018-05-01
Control parameters established from the blade exterior to its interior enable a parametric design of air-cooled turbine blades based on the airfoil. Building on rapid updating of structural features and generation of the solid model, a complex cooling system is created. The different flow units are modeled as a network topology with parallel and serial connections. Applying one-dimensional flow theory, programs compute the physical quantities along each flow path of the pipeline network, including flow rate, pressure, temperature and other parameters. These inner-unit parameters are set, by interpolation, as inner boundary conditions for the external flow field calculation program HIT-3D, thereby achieving a full-field thermally coupled simulation. Studies in the literature are referenced to verify the effectiveness of the pipeline network program and the coupling algorithm. On the basis of a modified design, and with the help of iSIGHT-FD, an optimization platform was established; through the MIGA mechanism, cooling efficiency was enhanced and thermal stress was effectively reduced. The research in this paper supports rapid deployment of cooling structure designs.
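A small sketch of the serial/parallel pipeline-network idea, under the simplifying assumption that each cooling passage behaves as a hydraulic resistance with a quadratic pressure-drop law: parallel branches share the same pressure drop, and serial segments add their drops along the flow path. The resistance law and values are illustrative only.

```python
# Small sketch of a serial/parallel cooling-passage network: each passage is a
# hydraulic resistance with dp = R * mdot**2, parallel branches share the same
# pressure drop, and serial segments add their drops along the flow path.
# Resistance law and values are illustrative only.
import numpy as np
from scipy.optimize import brentq

R_serial_in, R_branch_a, R_branch_b, R_serial_out = 2.0, 5.0, 8.0, 1.0
mdot_total = 0.4                       # total coolant mass flow (kg/s)

def branch_imbalance(mdot_a):
    """Pressure-drop mismatch between the two parallel branches."""
    mdot_b = mdot_total - mdot_a
    return R_branch_a * mdot_a**2 - R_branch_b * mdot_b**2

# Split the flow so both parallel branches see the same pressure drop.
mdot_a = brentq(branch_imbalance, 1e-9, mdot_total - 1e-9)
mdot_b = mdot_total - mdot_a

# Serial units simply add their pressure drops along the flow path.
dp_total = (R_serial_in * mdot_total**2
            + R_branch_a * mdot_a**2
            + R_serial_out * mdot_total**2)
print(f"branch flows: {mdot_a:.3f} / {mdot_b:.3f} kg/s, total dp: {dp_total:.3f}")
```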
Lesser, Janna; Verdugo, Robert L; Koniak-Griffin, Deborah; Tello, Jerry; Kappos, Barbara; Cumberland, William G
2005-08-01
This article describes a two-phase community and academic collaboration funded by the California Collaborative Research Initiative to develop and test the feasibility of an innovative HIV prevention program relevant to the needs of the population of inner-city Latino teen parenting couples and realistic for implementation in community settings. The article describes (a) the identification of special issues that needed to be addressed before formation of a productive academic-community-based organization research partnership, including integrating a dominant theoretical model used in health education with principles of practice derived from clinical experience; (b) the first phase of the project that helped to inform the development of the HIV prevention program for couples; (c) examples from the intervention pilot study (Phase 2) that illustrate both the intervention strategies and the young participants' responses to the curriculum; and (d) the feasibility of program implementation and evaluation in a community setting.
Direct Numerical Simulation of Turbulent Multi-Stage Autoignition Relevant to Engine Conditions
NASA Astrophysics Data System (ADS)
Chen, Jacqueline
2017-11-01
Due to the unrivaled energy density of liquid hydrocarbon fuels, combustion will continue to provide over 80% of the world's energy for at least the next fifty years. Hence, combustion needs to be understood and controlled to optimize combustion systems for efficiency to prevent further climate change, to reduce emissions and to ensure U.S. energy security. In this talk I will discuss recent progress in direct numerical simulations of turbulent combustion focused on providing fundamental insights into key `turbulence-chemistry' interactions that underpin the development of next generation fuel efficient, fuel flexible engines for transportation and power generation. Petascale direct numerical simulation (DNS) of multi-stage mixed-mode turbulent combustion in canonical configurations has elucidated key physics that govern autoignition and flame stabilization in engines and provide benchmark data for combustion model development under the conditions of advanced engines which operate near combustion limits to maximize efficiency and minimize emissions. Mixed-mode combustion refers to premixed or partially-premixed flames propagating into stratified autoignitive mixtures. Multi-stage ignition refers to hydrocarbon fuels with negative temperature coefficient behavior that undergo sequential low- and high-temperature autoignition. Key issues that will be discussed include: 1) the role of mixing in shear driven turbulence on the dynamics of multi-stage autoignition and cool flame propagation in diesel environments, 2) the role of thermal and composition stratification on the evolution of the balance of mixed combustion modes - flame propagation versus spontaneous ignition - which determines the overall combustion rate in autoignition processes, and 3) the role of cool flames on lifted flame stabilization. Finally, prospects for DNS of turbulent combustion at the exascale will be discussed in the context of anticipated heterogeneous machine architectures. This work was sponsored by the DOE Office of Basic Energy Sciences, with computing resources provided by the Oak Ridge Leadership Computing Facility through the DOE INCITE Program.
Coupled structural/thermal/electromagnetic analysis/tailoring of graded composite structures
NASA Technical Reports Server (NTRS)
Mcknight, R. L.; Huang, H.; Hartle, M.
1992-01-01
Accomplishments are described for the third year's effort of a 5-year program to develop a methodology for coupled structural/thermal/electromagnetic analysis/tailoring of graded composite structures. These accomplishments include: (1) structural analysis capability specialized for graded composite structures, including large deformation and deformation position eigenanalysis technologies; (2) a thermal analyzer specialized for graded composite structures; (3) absorption of electromagnetic waves by graded composite structures; and (4) coupled structural/thermal/electromagnetic analysis of graded composite structures.
Overcoming the Coupling Dilemma in DNA-Programmable Nanoparticle Assemblies by "Ag+ Soldering".
Wang, Huiqiao; Li, Yulin; Liu, Miao; Gong, Ming; Deng, Zhaoxiang
2015-05-20
Strong coupling between nanoparticles is critical for facilitating charge and energy transfers. Despite the great success of DNA-programmable nanoparticle assemblies, the very weak interparticle coupling represents a key barrier to various applications. Here, an extremely simple, fast, and highly efficient process combining DNA-programming and molecular/ionic bonding is developed to address this challenge, which exhibits a seamless fusion with DNA nanotechnology. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Research on Couple/Family Counselor Training: A Search for Research
ERIC Educational Resources Information Center
Kleist, David M.
2003-01-01
The initial goals of IAMFC included conducting and fostering programs of education in the field of family counseling, and promoting and conducting programs of research in the field of marriage and family counseling (International Association of Marriage and Family Counselors, 1990). The goal of fostering program development shows signs of success,…
Cybersecurity Curriculum Development: Introducing Specialties in a Graduate Program
ERIC Educational Resources Information Center
Bicak, Ali; Liu, Michelle; Murphy, Diane
2015-01-01
The cybersecurity curriculum has grown dramatically over the past decade: once it was just a couple of courses in a computer science graduate program. Today cybersecurity is introduced at the high school level, incorporated into undergraduate computer science and information systems programs, and has resulted in a variety of cybersecurity-specific…
NASA Astrophysics Data System (ADS)
Evans, Ben; Allen, Chris; Antony, Joseph; Bastrakova, Irina; Gohar, Kashif; Porter, David; Pugh, Tim; Santana, Fabiana; Smillie, Jon; Trenham, Claire; Wang, Jingbo; Wyborn, Lesley
2015-04-01
The National Computational Infrastructure (NCI) has established a powerful and flexible in-situ petascale computational environment to enable both high performance computing and Data-intensive Science across a wide spectrum of national environmental and earth science data collections - in particular climate, observational data and geoscientific assets. This paper examines 1) the computational environments that support the modelling and data processing pipelines, 2) the analysis environments and methods to support data analysis, and 3) the progress so far to harmonise the underlying data collections for future interdisciplinary research across these large volume data collections. NCI has established 10+ PBytes of major national and international data collections from both the government and research sectors based on six themes: 1) weather, climate, and earth system science model simulations, 2) marine and earth observations, 3) geosciences, 4) terrestrial ecosystems, 5) water and hydrology, and 6) astronomy, social and biosciences. Collectively they span the lithosphere, crust, biosphere, hydrosphere, troposphere, and stratosphere. The data is largely sourced from NCI's partners (which include the custodians of many of the major Australian national-scale scientific collections), leading research communities, and collaborating overseas organisations. New infrastructures created at NCI mean the data collections are now accessible within an integrated High Performance Computing and Data (HPC-HPD) environment - a 1.2 PFlop supercomputer (Raijin), an HPC-class 3000-core OpenStack cloud system and several highly connected large-scale high-bandwidth Lustre filesystems. The hardware was designed at inception to ensure that it would allow the layered software environment to flexibly accommodate the advancement of future data science. New approaches to software technology and data models have also had to be developed to enable access to these large and exponentially increasing data volumes at NCI. Traditional HPC and data environments are still made available in a way that flexibly provides the tools, services and supporting software systems on these new petascale infrastructures. But to enable the research to take place at this scale, the data, metadata and software now need to evolve together - creating a new integrated high performance infrastructure. The new infrastructure at NCI currently supports a catalogue of integrated, reusable software and workflows from earth system and ecosystem modelling, weather research, satellite and other observed data processing and analysis. One of the challenges for NCI has been to support existing techniques and methods, while carefully preparing the underlying infrastructure for the transition needed for the next class of Data-intensive Science. In doing so, a flexible range of techniques and software can be made available for application across the corpus of data collections available, and to provide a new infrastructure for future interdisciplinary research.
NASA Astrophysics Data System (ADS)
Clay, M. P.; Buaria, D.; Gotoh, T.; Yeung, P. K.
2017-10-01
A new dual-communicator algorithm with very favorable performance characteristics has been developed for direct numerical simulation (DNS) of turbulent mixing of a passive scalar governed by an advection-diffusion equation. We focus on the regime of high Schmidt number (Sc), where because of low molecular diffusivity the grid-resolution requirements for the scalar field are stricter than those for the velocity field by a factor of √Sc. Computational throughput is improved by simulating the velocity field on a coarse grid of Nv³ points with a Fourier pseudo-spectral (FPS) method, while the passive scalar is simulated on a fine grid of Nθ³ points with a combined compact finite difference (CCD) scheme which computes first and second derivatives at eighth-order accuracy. A static three-dimensional domain decomposition and a parallel solution algorithm for the CCD scheme are used to avoid the heavy communication cost of memory transposes. A kernel is used to evaluate several approaches to optimize the performance of the CCD routines, which account for 60% of the overall simulation cost. On the petascale supercomputer Blue Waters at the University of Illinois, Urbana-Champaign, scalability is improved substantially with a hybrid MPI-OpenMP approach in which a dedicated thread per NUMA domain overlaps communication calls with computational tasks performed by a separate team of threads spawned using OpenMP nested parallelism. At a target production problem size of 8192³ (0.5 trillion) grid points on 262,144 cores, CCD timings are reduced by 34% compared to a pure-MPI implementation. Timings for 16384³ (4 trillion) grid points on 524,288 cores encouragingly maintain scalability greater than 90%, although the wall clock time is too high for production runs at this size. Performance monitoring with CrayPat for problem sizes up to 4096³ shows that the CCD routines can achieve nearly 6% of the peak flop rate. The new DNS code is built upon two existing FPS and CCD codes. With the grid ratio Nθ/Nv = 8, the disparity in the computational requirements for the velocity and scalar problems is addressed by splitting the global communicator MPI_COMM_WORLD into disjoint communicators for the velocity and scalar fields, respectively. Inter-communicator transfer of the velocity field from the velocity communicator to the scalar communicator is handled with discrete send and non-blocking receive calls, which are overlapped with other operations on the scalar communicator. For production simulations at Nθ = 8192 and Nv = 1024 on 262,144 cores for the scalar field, the DNS code achieves 94% strong scaling relative to 65,536 cores and 92% weak scaling relative to Nθ = 1024 and Nv = 128 on 512 cores.
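A sketch of the communicator-splitting pattern described above, using mpi4py; the group sizes, slab shape, and partner mapping are placeholders rather than the production code's layout.

```python
# Sketch of the dual-communicator pattern using mpi4py: ranks are split into
# disjoint velocity and scalar communicators, and the velocity field is sent
# across with a non-blocking receive so the transfer can overlap other work on
# the scalar side. Group sizes and payloads are placeholders.
from mpi4py import MPI
import numpy as np

world = MPI.COMM_WORLD
rank, size = world.Get_rank(), world.Get_size()

n_velocity = size // 9                 # e.g. roughly 1 velocity rank per 8 scalar ranks
color = 0 if rank < n_velocity else 1  # 0 = velocity group, 1 = scalar group
comm = world.Split(color, key=rank)    # disjoint sub-communicator for each field

if color == 0:
    # Velocity group: advance the coarse-grid field, then send a slab to a
    # partner rank in the scalar group (world ranks >= n_velocity).
    u_slab = np.zeros((64, 64, 64))
    partner = n_velocity + comm.Get_rank()
    world.Send(u_slab, dest=partner, tag=7)
else:
    # Scalar group: post a non-blocking receive for the velocity slab and keep
    # doing compact-finite-difference work while it is in flight.
    u_slab = np.empty((64, 64, 64))
    if comm.Get_rank() < n_velocity:
        req = world.Irecv(u_slab, source=comm.Get_rank(), tag=7)
        # ... overlap: scalar-field derivative evaluations would go here ...
        req.Wait()
```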
Methods work better when couples talk.
Keller, S
1996-01-01
Sexual partners who communicate about reproductive health issues reduce their risk of acquiring a sexually transmitted disease (STD) or of unintended pregnancy, but few couples feel comfortable talking openly about sex. AIDS prevention programs have focused on improving couple communication, but family planning programs have emphasized women-controlled contraception as more reliable than barrier methods. The effectiveness of barrier methods would likely improve, however, if clients are counseled in couple communication. Effective communication about sexual issues requires self-confidence, and strengthening a woman's self-confidence may also improve her ability to negotiate condom use. Small discussion groups held among female factory workers in Thailand in 1993-94 led to an increase from 60% to 90% in the number of women who felt confident in discussing STD risk with a partner and to an increase from 36% to 82% in those who said they would not be embarrassed to give a partner a condom. A Nigerian study also suggested that more education may also improve prospects for couple communication and contraceptive usage. A US study showed that adolescent women who communicated openly with their partners reduced their risks of unintended pregnancy and STDs, and a Kenyan study indicated that communication increases contraceptive usage among married couples. Various projects around the world are attempting to counsel women on communication and condom negotiation, and counselors are beginning the difficult task of teaching women how to convince men to use condoms.
Computer program for analysis of coupled-cavity traveling wave tubes
NASA Technical Reports Server (NTRS)
Connolly, D. J.; Omalley, T. A.
1977-01-01
A flexible, accurate, large signal computer program was developed for the design of coupled cavity traveling wave tubes. The program is written in FORTRAN IV for an IBM 360/67 time sharing system. The beam is described by a disk model and the slow wave structure by a sequence of cavities, or cells. The computational approach is arranged so that each cavity may have geometrical or electrical parameters different from those of its neighbors. This allows the program user to simulate a tube of almost arbitrary complexity. Input and output couplers, severs, complicated velocity tapers, and other features peculiar to one or a few cavities may be modeled by a correct choice of input data. The beam-wave interaction is handled by an approach in which the radio frequency fields are expanded in solutions to the transverse magnetic wave equation. All significant space harmonics are retained. The program was used to perform a design study of the traveling-wave tube developed for the Communications Technology Satellite. Good agreement was obtained between the predictions of the program and the measured performance of the flight tube.
NASA Technical Reports Server (NTRS)
Corrigan, J. C.; Cronkhite, J. D.; Dompka, R. V.; Perry, K. S.; Rogers, J. P.; Sadler, S. G.
1989-01-01
Under a research program designated Design Analysis Methods for VIBrationS (DAMVIBS), existing analytical methods are used for calculating coupled rotor-fuselage vibrations of the AH-1G helicopter for correlation with flight test data from an AH-1G Operational Load Survey (OLS) test program. The analytical representation of the fuselage structure is based on a NASTRAN finite element model (FEM), which has been developed, extensively documented, and correlated with ground vibration test. One procedure that was used for predicting coupled rotor-fuselage vibrations using the advanced Rotorcraft Flight Simulation Program C81 and NASTRAN is summarized. Detailed descriptions of the analytical formulation of rotor dynamics equations, fuselage dynamic equations, coupling between the rotor and fuselage, and solutions to the total system of equations in C81 are included. Analytical predictions of hub shears for main rotor harmonics 2p, 4p, and 6p generated by C81 are used in conjunction with 2p OLS measured control loads and a 2p lateral tail rotor gearbox force, representing downwash impingement on the vertical fin, to excite the NASTRAN model. NASTRAN is then used to correlate with measured OLS flight test vibrations. Blade load comparisons predicted by C81 showed good agreement. In general, the fuselage vibration correlations show good agreement between analysis and test in vibration response through 15 to 20 Hz.
Sickle cell anemia in northern Israel: screening and prevention.
Koren, Ariel; Zalman, Lucia; Palmor, Haya; Zamir, Ronit Bril; Levin, Carina; Openheim, Ariella; Daniel-Spiegel, Etty; Shalev, Stavit; Filon, Dvora
2009-04-01
Sickle cell anemia is a hemolytic anemia caused by a single mutation in position 6 of the beta globin molecule. About 80 patients with SCA in northern Israel are currently receiving treatment. To assess a screening program in northern Israel aimed at detecting couples at risk for having offspring with SCA. Since 1987, screening for beta thalassemia in pregnant women in northern Israel has been conducted, and from 1999 all the samples were also tested for hemoglobin S, Hgb C, Hgb D, Hgb O Arab and others. During the 20 year period 1987-2006 a total of 69,340 women were screened; 114 couples who carried Hgb S were detected and 187 prenatal diagnoses were performed in couples at risk for having an offspring with Hgb S. The mean gestational age was 13 +/- 4 weeks. Fifty-four of those diagnoses revealed affected fetuses and in 4 cases the couple declined to perform therapeutic abortion. The economic burden to the health services for treating SCA patients is about U.S.$ 7000 per year, and the institution of prevention programs has proven cost-effective in populations with a high frequency of carriers. Since our program is aimed to also detect beta thalassemia, a disease that is more frequent in this area (> 2.5%), the added cost for the prevention of SCA is less significant despite the low incidence of the S gene in our population, namely < 1%.
Tur-Kaspa, I; Aljadeff, G; Rechitsky, S; Grotjan, H E; Verlinsky, Y
2010-08-01
Over 1000 children affected with cystic fibrosis (CF) are born annually in the USA. Since IVF with preimplantation genetic diagnosis (PGD) is an alternative to raising a sick child or to aborting an affected fetus, a cost-benefit analysis was performed for a national IVF-PGD program for preventing CF. The amount spent to deliver healthy children for all CF carrier-couples by IVF-PGD was compared with the average annual and lifetime direct medical costs per CF patient avoided. Treating annually about 4000 CF carrier-couples with IVF-PGD would result in 3715 deliveries of non-affected children at a cost of $57,467 per baby. Because the average annual direct medical cost per CF patient was $63,127 and life expectancy is 37 years, savings would be $2.3 million per patient and $2.2 billion for all new CF patients annually in lifetime treatment costs. Cumulated net saving of an IVF-PGD program for all carrier-couples for 37 years would be $33.3 billion. A total of 618,714 cumulative years of patients suffering because of CF and thousands of abortions could be prevented. A national IVF-PGD program is a highly cost-effective novel modality of preventive medicine and would avoid most births of individuals affected with debilitating genetic disease. 2010 Reproductive Healthcare Ltd. Published by Elsevier Ltd. All rights reserved.
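A back-of-the-envelope reproduction of the headline arithmetic, using only the figures quoted in the abstract.

```python
# Quick check of the figures quoted in the abstract; this is the headline
# arithmetic only, not the study's cost model.
annual_cost = 63_127                 # average annual direct medical cost per CF patient (USD)
life_expectancy = 37                 # years
lifetime_cost = annual_cost * life_expectancy
print(f"lifetime treatment cost avoided per patient: ${lifetime_cost:,}")  # ~$2.3 million, as quoted

deliveries = 3_715                   # unaffected babies delivered per annual cohort
cost_per_baby = 57_467               # IVF-PGD cost per delivered baby (USD)
print(f"annual IVF-PGD program cost: ${deliveries * cost_per_baby:,}")
```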
Modeling the missile-launch tube problem in DYSCO
NASA Technical Reports Server (NTRS)
Berman, Alex; Gustavson, Bruce A.
1989-01-01
DYSCO is a versatile, general purpose dynamic analysis program which assembles equations and solves dynamics problems. The executive manages a library of technology modules which contain routines that compute the matrix coefficients of the second order ordinary differential equations of the components. The executive performs the coupling of the equations of the components and manages the solution of the coupled equations. Any new component representation may be added to the library if, given the state vector, a FORTRAN program can be written to compute M, C, K, and F. The problem described demonstrates the generality of this statement.
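A schematic of the component/executive pattern described above, written in Python rather than FORTRAN for brevity: each component returns its M, C, K, and F given the state vector, and the executive block-assembles them into the coupled system M q'' + C q' + K q = F. The component classes here are illustrative, not DYSCO technology modules.

```python
# Schematic of the component/executive pattern: every component, given the
# state vector, returns its mass, damping, and stiffness matrices and force
# vector (M, C, K, F); the executive couples them by block assembly into the
# global second-order system. Components here are illustrative only.
import numpy as np

class SpringMassComponent:
    """A 1-DOF component: given the state vector, returns its M, C, K, F."""
    def __init__(self, dof, m, c, k):
        self.dof = dof                         # global DOF indices owned by this component
        self.m, self.c, self.k = m, c, k
    def matrices(self, state):
        return (np.array([[self.m]]), np.array([[self.c]]),
                np.array([[self.k]]), np.zeros(1))

class CouplingSpring:
    """A spring joining two components' DOFs; its stiffness couples their equations."""
    def __init__(self, dof, k):
        self.dof, self.k = dof, k
    def matrices(self, state):
        K = self.k * np.array([[1.0, -1.0], [-1.0, 1.0]])
        return np.zeros((2, 2)), np.zeros((2, 2)), K, np.zeros(2)

def assemble(components, n_dof, state):
    """Executive: couple the component equations by summing into global matrices."""
    M = np.zeros((n_dof, n_dof)); C = np.zeros_like(M); K = np.zeros_like(M)
    F = np.zeros(n_dof)
    for comp in components:
        Mc, Cc, Kc, Fc = comp.matrices(state)
        idx = np.ix_(comp.dof, comp.dof)
        M[idx] += Mc; C[idx] += Cc; K[idx] += Kc
        F[comp.dof] += Fc
    return M, C, K, F

components = [SpringMassComponent([0], m=1.0, c=0.1, k=10.0),
              SpringMassComponent([1], m=2.0, c=0.2, k=20.0),
              CouplingSpring([0, 1], k=5.0)]
M, C, K, F = assemble(components, n_dof=2, state=np.zeros(4))
```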
Applications of Microcomputers in the Teaching of Physics 6502 Software.
ERIC Educational Resources Information Center
Marsh, David P.
1980-01-01
Described is a variety of uses of the microcomputer when coupled with software available for systems using 6502 microprocessors. Included are several computer programs which exhibit some of the possibilities for programming the 6502 microprocessors. (DS)
Tilahun, Tizta; Coene, Gily; Temmerman, Marleen; Degomme, Olivier
2014-04-04
To assess spousal agreement levels regarding fertility preference and spousal communication, and to look at how it affects contraceptive use by couples. We conducted a cross-sectional study to collect quantitative data from March to May 2010 in Jimma zone, Ethiopia, using a multistage sampling design covering six districts. In each of the 811 couples included in the survey, both spouses were interviewed. Concordance between the husband and wife was assessed using different statistics and tests including concordance rates, ANOVA, Cohen's Κ and McNemar's test for paired samples. Multivariate analysis was computed to ascertain factors associated with contraceptive use. Over half of the couples wanted more children and 27.8% of the spouses differed about the desire for more children. In terms of sex preference, there was a 48.7% discord in couples who wanted to have more children. At large, spousal concordance on the importance of family planning was positive. However, it was the husband's favourable attitude towards family planning that determined a couple's use of contraception. Overall, contraceptive prevalence was 42.9%. Among the groups with the highest level of contraceptive users, were couples where the husband does not want any more children. Spousal communication about the decision to use contraception showed a positive association with a couple's contraceptive prevalence. Family planning programs aiming to increase contraceptive uptake could benefit from findings on spousal agreement regarding fertility desire, because the characteristics of each spouse influences the couple's fertility level. Disparities between husband and wife about the desire for more children sustain the need for male consideration while analysing the unmet need for contraception. Moreover, men play a significant role in the decision making concerning contraceptive use. Accordingly, involving men in family planning programs could increase a couple's contraceptive practice in the future.
The no conclusion intervention for couples in conflict.
Migerode, Lieven
2014-07-01
Dealing with difference is central to all couple therapy. This article presents an intervention designed to assist couples in handling conflict. Central to this approach is the acceptance that most conflicts cannot be solved; couples instead need a different understanding of couple conflict. This understanding is found in the analysis of love in context and in relational dialectics. Couples are guided through several steps: deciding on the valence of the issue as individuals, deciding which differences can be resolved and which issues demand new ways of living with the inevitable, and being introduced to the suggested 'no conclusion' dialogue. The article briefly describes the five-day intensive couple therapy program in which the no conclusion intervention is embedded. The theoretical foundation of the intervention, followed by a step-by-step description of the intervention, forms the major part of the article. A case vignette illustrates this approach. © 2012 American Association for Marriage and Family Therapy.
Sasaki, Kosei; Cropper, Elizabeth C; Weiss, Klaudiusz R; Jing, Jian
2013-01-01
Although electrical coupling is present in many microcircuits, the extent to which it determines neuronal firing patterns and network activity remains poorly understood. This is particularly true when the coupling is present in a population of heterogeneous, or intrinsically distinct, circuit elements. We examine this question in the Aplysia californica feeding motor network in five electrically coupled identified cells: B64, B4/5, B70, B51, and a newly identified interneuron, B71. These neurons exhibit distinct activity patterns during the radula retraction phase of motor programs. In a subset of motor programs, retraction can be flexibly extended by adding a phase of network activity (hyper-retraction). This is manifested most prominently as an additional burst in the radula closure motoneuron B8. Two neurons that excite B8 (B51 and B71) and one that inhibits it (B70) are active during hyper-retraction. Consistent with their near-synchronous firing, B51 and B71 showed one of the strongest coupling ratios in this group of neurons. Nonetheless, by manipulating their activity, we found that B51 preferentially acted as a driver of B64/B71 activity, whereas B71 played a larger role in driving B8 activity. In contrast, B70 was weakly coupled to other neurons and its inhibition of B8 counteracted the excitatory drive to B8. Finally, the distinct firing patterns of the electrically coupled neurons were fine-tuned by their intrinsic properties and the largely chemical cross-inhibition between some of them. Thus, the small microcircuit of the Aplysia feeding network is advantageous for understanding how a population of electrically coupled heterogeneous neurons may fulfill specific network functions. PMID:23283325
ERIC Educational Resources Information Center
Roiger, Trevor C.; Card, Karen A.
2012-01-01
Context: Coupling theory, based on a tight-loose continuum, describes the nature of a connection, relationship, or interaction between entities. Understanding the nature of an ATEP's relationship with intercollegiate athletic departments is important to their growth and responsiveness to environmental change. Objective: To determine program…
The flow of plasma in the solar terrestrial environment
NASA Technical Reports Server (NTRS)
Schunk, R. W.; Birmingham, T. J.
1992-01-01
The scientific goals of the program are outlined, and some of the papers submitted for publication within the last six months are briefly highlighted. Some of the topics covered include ionosphere-magnetosphere coupling, polar cap arcs, polar wind, convection vortices, ionosphere-plasmasphere coupling, and the validity of macroscopic plasma flow models.
Earth-moon system: Dynamics and parameter estimation
NASA Technical Reports Server (NTRS)
Breedlove, W. J., Jr.
1975-01-01
A theoretical development of the equations of motion governing the earth-moon system is presented. The earth and moon were treated as finite rigid bodies and a mutual potential was utilized. The sun and remaining planets were treated as particles. Relativistic, non-rigid, and dissipative effects were not included. The translational and rotational motion of the earth and moon were derived in a fully coupled set of equations. Euler parameters were used to model the rotational motions. The mathematical model is intended for use with data analysis software to estimate physical parameters of the earth-moon system using primarily LURE type data. Two program listings are included. Program ANEAMO computes the translational/rotational motion of the earth and moon from analytical solutions. Program RIGEM numerically integrates the fully coupled motions as described above.
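As a minimal illustration of integrating fully coupled translational and rotational motion with Euler parameters (quaternions), the sketch below propagates a single rigid body in a point-mass gravity field. It is not ANEAMO or RIGEM, only a toy with the same state structure (position, velocity, quaternion, angular velocity); the gravitational parameter, inertia tensor, and initial conditions are made-up values.

```python
import numpy as np

MU = 3.986e14                              # m^3/s^2, central-body gravitational parameter (illustrative)
I_BODY = np.diag([1.0e3, 2.0e3, 2.5e3])    # kg m^2, hypothetical inertia tensor

def deriv(state):
    """Coupled translational/rotational derivatives for one rigid body (torque-free here)."""
    r, v, q, w = state[:3], state[3:6], state[6:10], state[10:13]
    a = -MU * r / np.linalg.norm(r)**3                     # point-mass gravity
    wx, wy, wz = w
    # Euler-parameter (quaternion, scalar-first) kinematics: q_dot = 0.5 * Omega(w) q
    omega = np.array([[0.0, -wx, -wy, -wz],
                      [wx,  0.0,  wz, -wy],
                      [wy, -wz,  0.0,  wx],
                      [wz,  wy, -wx,  0.0]])
    q_dot = 0.5 * omega @ q
    w_dot = np.linalg.solve(I_BODY, -np.cross(w, I_BODY @ w))  # Euler's equations, no external torque
    return np.concatenate([v, a, q_dot, w_dot])

def rk4_step(state, dt):
    k1 = deriv(state); k2 = deriv(state + 0.5*dt*k1)
    k3 = deriv(state + 0.5*dt*k2); k4 = deriv(state + dt*k3)
    state = state + dt*(k1 + 2*k2 + 2*k3 + k4)/6.0
    state[6:10] /= np.linalg.norm(state[6:10])             # re-normalize the Euler parameters
    return state

state = np.concatenate([[7.0e6, 0.0, 0.0], [0.0, 7.5e3, 0.0], [1.0, 0.0, 0.0, 0.0], [0.0, 0.01, 0.02]])
for _ in range(1000):
    state = rk4_step(state, dt=1.0)
print(state[:3], state[6:10])
```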
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mann, Amanda K; Wu, Zili; Calaza, Florencia
2014-01-01
CeO2 cubes with {100} facets, octahedra with {111} facets, and wires with highly defective structures were utilized to probe the structure-dependent reactivity of acetaldehyde. Using temperature-programmed desorption (TPD), temperature-programmed surface reactions (TPSR), and in situ infrared spectroscopy, it was found that acetaldehyde desorbs unreacted or undergoes reduction, coupling, or C-C bond scission reactions depending on the surface structure of CeO2. Room temperature FTIR indicates that acetaldehyde binds primarily as 1-acetaldehyde on the octahedra, in a variety of conformations on the cubes, including coupling products and acetate and enolate species, and primarily as coupling products on the wires. The percent consumption of acetaldehyde follows the order of wires > cubes > octahedra. All the nanoshapes produce the coupling product crotonaldehyde; however, the selectivity to produce ethanol follows the order wires cubes >> octahedra. The selectivity and other differences can be attributed to the variation in the basicity of the surfaces, defect densities, coordination numbers of surface atoms, and the reducibility of the nanoshapes.
NASA Astrophysics Data System (ADS)
Liu, Zhebing; Huntington, Lee M. J.; Nooijen, Marcel
2015-10-01
The recently introduced multireference equation of motion (MR-EOM) approach is combined with a simple treatment of spin-orbit coupling, as implemented in the ORCA program. The resulting multireference equation of motion spin-orbit coupling (MR-EOM-SOC) approach is applied to the first-row transition metal atoms Cr, Mn, Fe and Co, for which experimental data are readily available. Using the MR-EOM-SOC approach, the splittings in each L-S multiplet can be accurately assessed (root mean square (RMS) errors of about 70 cm-1). The RMS errors for J-specific excitation energies range from 414 to 783 cm-1 and are comparable to previously reported J-averaged MR-EOM results using the ACESII program. The MR-EOM approach is highly efficient. A typical MR-EOM calculation of a full spin-orbit spectrum takes about 2 CPU hours on a single processor of a 12-core node, consisting of Intel XEON 2.93 GHz CPUs with 12.3 MB of shared cache memory.
Neuro-Linguistic Programming and Family Therapy.
ERIC Educational Resources Information Center
Davis, Susan L. R.; Davis, Donald I.
1983-01-01
Presents a brief introduction to Neuro-Linguistic Programming (NLP), followed by case examples which illustrate some of the substantive gains which NLP techniques have provided in work with couples and families. NLP's major contributions involve understanding new models of human experience. (WAS)
Evaluation of biochars by temperature programmed oxidation/mass spectroscopy
USDA-ARS?s Scientific Manuscript database
Biochar from the thermochemical conversion of biomass was evaluated by Temperature Programmed Oxidation (TPO) coupled with mass spectroscopy. This technique can be used to assess the oxidative reactivity of carbonaceous solids where higher temperature reactivity indicates greater structural order. ...
An Interactive Web-Based Program for Stepfamilies: Development and Evaluation of Efficacy
ERIC Educational Resources Information Center
Gelatt, Vicky A.; Adler-Baeder, Francesca; Seeley, John R.
2010-01-01
This study evaluated the efficacy of a family life education program for stepfamilies that is self-administered, interactive, and web-based. The program uses behavior-modeling videos to demonstrate effective couple, parenting, and stepparenting practices. A diverse sample of 300 parents/stepparents of a child aged 11-15 years were randomized into…
Brown, J Lynne; Wenrich, Tionni R
2012-08-01
Few Americans eat sufficient vegetables, especially the protective deep orange and dark green vegetables. To address this, a community-based wellness program to broaden vegetables served at evening meals targeting Appalachian food preparers and their families was tested in a randomized, controlled intervention. Food preparers (n=50) were predominately married (88%), white (98%), and female (94%), with several children living at home. Experimental food preparers (n=25) attended the program sessions and controls (n=25) were mailed relevant handouts and recipes. At program sessions, participants received nutrition information, hands-on cooking instruction, and prepared recipes to take home for family evaluation. As qualitative assessment, 10 couples from each treatment group (n=20 couples) were randomly selected for baseline and immediate post-intervention interviews to explore impact on the food preparer's family. These in-depth interviews with the food preparer and their adult partner were tape-recorded and transcribed verbatim. Two researchers conducted thematic analysis using constant comparison. Family flexibility about food choices was assessed using roles, rules, and power concepts from Family Systems Theory. Interviews at baseline revealed dinner vegetable variety was very limited because food preparers served only what everyone liked (a role expectation) and deferred to male partner and children's narrow vegetable preferences (power). Control couples reported no change in vegetable dinner variety post-intervention. Most experimental couples reported in-home tasting and evaluation was worthwhile and somewhat broadened vegetables served at dinners. But the role expectation of serving only what everyone liked and the practice of honoring powerful family members' vegetable preferences remained major barriers to change. Copyright © 2012 Academy of Nutrition and Dietetics. Published by Elsevier Inc. All rights reserved.
Effect of VOC emissions from vegetation on urban air quality during hot periods
NASA Astrophysics Data System (ADS)
Churkina, Galina; Kuik, Friderike; Bonn, Boris; Lauer, Axel; Grote, Ruediger; Butler, Tim
2016-04-01
Programs to plant millions of trees in cities around the world aim at the reduction of summer temperatures, increase of carbon storage, storm water control, and recreational space, as well as at poverty alleviation. These urban greening programs, however, do not take into account how closely human and natural systems are coupled in urban areas. Compared with the surroundings of cities, elevated temperatures together with high anthropogenic emissions of air and water pollutants are quite typical in urban systems. Urban and sub-urban vegetation respond to changes in meteorology and air quality and can react to pollutants. Neglecting this coupling may lead to unforeseen negative effects on air quality resulting from urban greening programs. The potential of emissions of volatile organic compounds (VOC) from vegetation combined with anthropogenic emissions of air pollutants to produce ozone has long been recognized. This ozone formation potential increases under rising temperatures. Here we investigate how emissions of VOC from urban vegetation affect corresponding ground-level ozone and PM10 concentrations in summer and especially during heat wave periods. We use the Weather Research and Forecasting Model with coupled atmospheric chemistry (WRF-CHEM) to quantify these feedbacks in the Berlin-Brandenburg region, Germany during the two summers of 2006 (heat wave) and 2014 (reference period). VOC emissions from vegetation are calculated by MEGAN 2.0 coupled online with WRF-CHEM. Our preliminary results indicate that the contribution of VOCs from vegetation to ozone formation may increase by more than twofold during heat wave periods. We highlight the importance of the vegetation for urban areas in the context of a changing climate and discuss potential tradeoffs of urban greening programs.
Advanced RTG and thermoelectric materials study
NASA Technical Reports Server (NTRS)
Eggers, P. E.
1971-01-01
A comprehensive, generalized two-dimensional RTG analysis computer program was developed. This program is capable of analyzing any specified RTG design under a wide range of transient as well as steady-state operating conditions. The feasibility of a new concept for the design of segmented (or single-phase) thermoelectric couples was demonstrated. A SiGe-PbTe segmented couple involving pressure contacted junctions at the intermediate- and hot-junction temperatures was successfully encapsulated in a hermetically sealed bellows enclosure. This bellows-encapsulated couple was operated between a hot- and cold-junction temperature of 1200 K and 450 K, respectively, with a measured energy conversion efficiency of 7.6 ± 0.5 per cent. An experimental study of selected sublimation barrier schemes revealed that a significant reduction in the sublimation rate of p-type PbTe could be achieved by using multiple layers of SiO2 fibers. A comparison of the barrier effectiveness is given for three different barrier designs.
Role of IAC in large space systems thermal analysis
NASA Technical Reports Server (NTRS)
Jones, G. K.; Skladany, J. T.; Young, J. P.
1982-01-01
Computer analysis programs to evaluate critical coupling effects that can significantly influence spacecraft system performance are described. These coupling effects arise from the varied parameters of the spacecraft systems, environments, and forcing functions associated with disciplines such as thermal, structures, and controls. Adverse effects can be expected to significantly impact system design aspects such as structural integrity, controllability, and mission performance. One such needed design analysis capability is a software system that can integrate individual discipline computer codes into a highly user-oriented/interactive-graphics-based analysis capability. The integrated analysis capability (IAC) system can be viewed as: a core framework system which serves as an integrating base whereby users can readily add desired analysis modules and as a self-contained interdisciplinary system analysis capability having a specific set of fully integrated multidisciplinary analysis programs that deal with the coupling of thermal, structures, controls, antenna radiation performance, and instrument optical performance disciplines.
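As a present-day, hedged sketch of the "core framework plus pluggable analysis modules" idea described in the abstract above, the Python fragment below has discipline modules register with a tiny executive and exchange data through a shared dictionary, iterating to capture their coupling. The class names, the made-up thermal/structural relations, and the fixed-point loop are hypothetical and far simpler than IAC itself.

```python
class Executive:
    """Toy integrating framework: registered discipline modules read/write a shared data bus."""
    def __init__(self):
        self.modules, self.bus = [], {}
    def register(self, module):
        self.modules.append(module)
    def run(self, iterations=3):
        for _ in range(iterations):            # fixed-point iteration over coupled disciplines
            for module in self.modules:
                module.run(self.bus)
        return self.bus

class ThermalModule:
    def run(self, bus):
        # Temperature responds to the current structural deflection (made-up relation).
        bus["temperature"] = 300.0 + 5.0 * bus.get("deflection", 0.0)

class StructuresModule:
    def run(self, bus):
        # Deflection responds to the thermal state (made-up relation).
        bus["deflection"] = 0.01 * (bus.get("temperature", 300.0) - 300.0) + 1.0

executive = Executive()
executive.register(ThermalModule())
executive.register(StructuresModule())
print(executive.run())
```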
Sexuality and the middle-aged cardiac patient.
Watts, R J
1976-06-01
Counseling for the resumption of sexual activity deserves as much attention in a cardiac rehabilitation program as walking or jogging. Research findings enable the counselor to give specific sexual advice. The energy expenditure during coitus for long-married couples is equivalent to that of climbing stairs, and consequently the risk of heart attack is low. However, clustering of psychosocial and physiologic demands, such as illicit affairs, outbursts of anger, alcohol, and hearty meals, may precipitate reinfarction or death. A sexual activities program is successful only if each partner is committed to giving and receiving pleasure. Knowledgeable and sensitive counseling will enable the couple to explore extra-coital options for lovemaking prior to the resumption of intercourse. This writer has observed that once couples are "turned on" to the pleasuring exercises, coital activity is attempted at an earlier date without untoward side effects in the cardiac patient.
Numerical Flight Mechanics Analysis Of The SHEFEX I Ascent And Re-Entry Phases
NASA Astrophysics Data System (ADS)
Bartolome Calvo, Javier; Eggers, Thino
2011-08-01
The SHarp Edge Flight EXperiment (SHEFEX) I provides a huge amount of scientific data to validate numerical tools in hypersonic flows. These data allow the direct comparison of flight measurements with the current numerical tools available at DLR. Therefore, this paper is devoted to applying a recently developed direct coupling between aerodynamics and flight dynamics to the SHEFEX I flight. In a first step, mission analyses are carried out using the trajectory optimization program REENT 6D coupled to missile DATCOM. In a second step, the direct coupling between the trajectory program and the DLR TAU code, in which the unsteady Euler equations including rigid body motion are solved, is applied to analyze some interesting parts of the ascent and re-entry phases of the flight experiment. The agreement of the numerical predictions with the obtained flight data is satisfactory, assuming a variable fin deflection angle.
NASA Astrophysics Data System (ADS)
Li, Tiefu; Chen, Zhen; Wang, Yimin; Tian, Lin; Qiu, Yueyin; Inomata, Kunihiro; Yoshihara, Fumiki; Han, Siyuan; Nori, Franco; Tsai, Jaw-Shen; You, J. Q.
We report the experimental observation of high-order sideband transitions at the single-photon level in a quantum circuit system of a flux qubit ultrastrongly coupled to a coplanar waveguide resonator. With the coupling strength reaching 10% of the resonator's fundamental frequency, we obtain clear signatures of higher-order red- and first-order blue-sideband transitions. These transitions arise from the ultrastrong Rabi coupling rather than from the driving power. Our observation advances the understanding of ultrastrongly-coupled systems and paves the way to studying high-order processes in the quantum Rabi model. This work is supported by the National Basic Research Program of China and the National Natural Science Foundation of China.
NASA Technical Reports Server (NTRS)
Barnett, Alan R.; Ibrahim, Omar M.; Abdallah, Ayman A.; Sullivan, Timothy L.
1993-01-01
By utilizing MSC/NASTRAN DMAP (Direct Matrix Abstraction Program) in an existing NASA Lewis Research Center coupled loads methodology, solving modal equations of motion with initial conditions is possible using either coupled (Newmark-Beta) or uncoupled (exact mode superposition) integration available within module TRD1. Both the coupled and newly developed exact mode superposition methods have been used to perform transient analyses of various space systems. However, experience has shown that in most cases, significant time savings are realized when the equations of motion are integrated using the uncoupled solver instead of the coupled solver. Through the results of a real-world engineering analysis, advantages of using the exact mode superposition methodology are illustrated.
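To make the coupled-versus-uncoupled distinction concrete, the sketch below integrates a single undamped modal equation, q'' + omega^2 q = 0, with nonzero initial conditions, once with a Newmark-Beta stepper and once with the exact closed-form solution that uncoupled mode superposition uses per mode. This is a generic illustration under assumed parameters, not the TRD1 module or the NASA Lewis methodology.

```python
import numpy as np

omega = 2.0 * np.pi          # modal frequency (rad/s), assumed
q0, qd0 = 1.0, 0.0           # initial displacement and velocity, assumed
dt, n_steps = 0.01, 200

# --- Newmark-Beta (average acceleration: beta = 1/4, gamma = 1/2) ---
beta, gamma = 0.25, 0.5
q, qd = q0, qd0
qdd = -omega**2 * q          # initial acceleration from the equation of motion
for _ in range(n_steps):
    q_pred  = q + dt*qd + dt**2*(0.5 - beta)*qdd
    qd_pred = qd + dt*(1.0 - gamma)*qdd
    qdd_new = -omega**2 * q_pred / (1.0 + beta*dt**2*omega**2)   # solve EOM at the new step
    q  = q_pred + beta*dt**2*qdd_new
    qd = qd_pred + gamma*dt*qdd_new
    qdd = qdd_new

# --- Exact modal solution at the same time (the "uncoupled" route) ---
t = n_steps * dt
q_exact = q0*np.cos(omega*t) + (qd0/omega)*np.sin(omega*t)

print(f"Newmark-Beta: {q:.6f}   exact mode superposition: {q_exact:.6f}")
```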
ERIC Educational Resources Information Center
Butler, Laurel
2014-01-01
The Young Artists at Work Program at the Yerba Buena Center for the Arts (YBCA) recently shifted its model from an afterschool arts program to a young artists' residency. This decision arose from a desire to reposition the youth program as a priority within the larger organization, coupled with a commitment to deepening the pedagogical values of…
Numerical Propulsion System Simulation: An Overview
NASA Technical Reports Server (NTRS)
Lytle, John K.
2000-01-01
The cost of implementing new technology in aerospace propulsion systems is becoming prohibitively expensive and time-consuming. One of the main contributors to the high cost and lengthy time is the need to perform many large-scale hardware tests and the inability to integrate all appropriate subsystems early in the design process. The NASA Glenn Research Center is developing the technologies required to enable simulations of full aerospace propulsion systems in sufficient detail to resolve critical design issues early in the design process before hardware is built. This concept, called the Numerical Propulsion System Simulation (NPSS), is focused on the integration of multiple disciplines such as aerodynamics, structures and heat transfer with computing and communication technologies to capture complex physical processes in a timely and cost-effective manner. The vision for NPSS, as illustrated, is to be a "numerical test cell" that enables full engine simulation overnight on cost-effective computing platforms. There are several key elements within NPSS that are required to achieve this capability: 1) clear data interfaces through the development and/or use of data exchange standards, 2) modular and flexible program construction through the use of object-oriented programming, 3) integrated multiple fidelity analysis (zooming) techniques that capture the appropriate physics at the appropriate fidelity for the engine systems, 4) multidisciplinary coupling techniques and finally 5) high performance parallel and distributed computing. The current state of development in these five areas focuses on air-breathing gas turbine engines and is reported in this paper. However, many of the technologies are generic and can be readily applied to rocket-based systems and combined cycles currently being considered for low-cost access-to-space applications. Recent accomplishments include: (1) the development of an industry-standard engine cycle analysis program and plug 'n play architecture, called NPSS Version 1; (2) a full engine simulation that combines a 3D low-pressure subsystem with a 0D high pressure core simulation, demonstrating the ability to integrate analyses at different levels of detail and to aerodynamically couple components, the fan/booster and low-pressure turbine, through a 3D computational fluid dynamics simulation; (3) simulation of all of the turbomachinery in a modern turbofan engine on a parallel computing platform for rapid and cost-effective execution, a capability that can also be used to generate a full compressor map, requiring both design and off-design simulation; (4) three levels of coupling that characterize the multidisciplinary analysis under NPSS: loosely coupled, process coupled and tightly coupled. The loosely coupled and process coupled approaches require a common geometry definition to link CAD to analysis tools. The tightly coupled approach is currently validating the use of an arbitrary Lagrangian/Eulerian formulation for rotating turbomachinery. The validation includes both centrifugal and axial compression systems, and its results will be reported in the paper. (5) The demonstration of significant computing cost/performance reduction for turbine engine applications using PC clusters. The NPSS Project is supported under the NASA High Performance Computing and Communications Program.
Study on the Transient Process of 500kV Substations Secondary Equipment
NASA Astrophysics Data System (ADS)
Li, Hongbo; Li, Pei; Zhang, Yanyan; Niu, Lin; Gao, Nannan; Si, Tailong; Guo, Jiadong; Xu, Min-min; Li, Guofeng; Guo, Liangfeng
2017-05-01
By analyzing the causes of lightning accidents in substations, the ways in which incoming lightning surges invade the secondary system are summarized. The interference source acts on the secondary system through various coupling paths, mainly of four kinds: the conductive coupling mode, the capacitive coupling mode, the inductive coupling mode, and the radiated interference mode. These paths were then simulated with the ATP program. Finally, lightning protection measures are put forward from three aspects: the low-voltage power supply system, the potential distribution of the grounding grid under lightning impact, and the secondary and computer systems.
Cavity optomechanics: Manipulating photons and phonons towards the single-photon strong coupling
NASA Astrophysics Data System (ADS)
Liu, Yu-long; Wang, Chong; Zhang, Jing; Liu, Yu-xi
2018-02-01
Abstract not available. Project supported by the National Basic Research Program of China (Grant No. 2014CB921401), the Tsinghua University Initiative Scientific Research Program, and the Tsinghua National Laboratory for Information Science and Technology (TNList) Cross-discipline Foundation.
XMDS2: Fast, scalable simulation of coupled stochastic partial differential equations
NASA Astrophysics Data System (ADS)
Dennis, Graham R.; Hope, Joseph J.; Johnsson, Mattias T.
2013-01-01
XMDS2 is a cross-platform, GPL-licensed, open source package for numerically integrating initial value problems that range from a single ordinary differential equation up to systems of coupled stochastic partial differential equations. The equations are described in a high-level XML-based script, and the package generates low-level optionally parallelised C++ code for the efficient solution of those equations. It combines the advantages of high-level simulations, namely fast and low-error development, with the speed, portability and scalability of hand-written code. XMDS2 is a complete redesign of the XMDS package, and features support for a much wider problem space while also producing faster code. Program summary: Program title: XMDS2. Catalogue identifier: AENK_v1_0. Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AENK_v1_0.html. Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland. Licensing provisions: GNU General Public License, version 2. No. of lines in distributed program, including test data, etc.: 872490. No. of bytes in distributed program, including test data, etc.: 45522370. Distribution format: tar.gz. Programming language: Python and C++. Computer: Any computer with a Unix-like system, a C++ compiler and Python. Operating system: Any Unix-like system; developed under Mac OS X and GNU/Linux. RAM: Problem dependent (roughly 50 bytes per grid point). Classification: 4.3, 6.5. External routines: The external libraries required are problem-dependent. Uses FFTW3 Fourier transforms (used only for FFT-based spectral methods), dSFMT random number generation (used only for stochastic problems), MPI message-passing interface (used only for distributed problems), HDF5, GNU Scientific Library (used only for Bessel-based spectral methods) and a BLAS implementation (used only for non-FFT-based spectral methods). Nature of problem: General coupled initial-value stochastic partial differential equations. Solution method: Spectral method with method-of-lines integration. Running time: Determined by the size of the problem.
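XMDS2 problems are specified in XML and compiled to generated C++; purely as a hand-written stand-in for the class of equations it targets, the Python sketch below integrates a pair of coupled stochastic differential equations with the Euler-Maruyama method. It illustrates the problem type only, not XMDS2's input format or generated code, and the coupling and noise parameters are invented.

```python
import numpy as np

rng = np.random.default_rng(0)
dt, n_steps = 1e-3, 5000
gamma, kappa, g, noise = 0.5, 0.3, 1.0, 0.1   # made-up damping, coupling, and noise parameters

# Two coupled stochastic ODEs:
#   dx = (-gamma*x + g*y) dt + noise dW1
#   dy = (-kappa*y - g*x) dt + noise dW2
x, y = 1.0, 0.0
for _ in range(n_steps):
    dW1, dW2 = rng.normal(0.0, np.sqrt(dt), size=2)   # independent Wiener increments
    x_new = x + (-gamma*x + g*y)*dt + noise*dW1
    y_new = y + (-kappa*y - g*x)*dt + noise*dW2
    x, y = x_new, y_new

print(f"x(t={n_steps*dt:.1f}) = {x:.4f}, y = {y:.4f}")
```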
Highway to Reform: The Coupling of District Reading Policy and Instructional Practice
ERIC Educational Resources Information Center
Woulfin, Sarah L.
2015-01-01
This article presents findings on teachers' implementation of a reading reform in an urban school district. Findings are based on observation, interview, and document data related to 12 elementary teachers' responses to a new reading program, the Teachers College Reading and Writing Workshop. Utilizing coupling theory and the concept of routines,…
Relationship Interventions during the Transition to Parenthood: Issues of Timing and Efficacy
ERIC Educational Resources Information Center
Trillingsgaard, Tea; Baucom, Katherine J. W.; Heyman, Richard E.; Elklit, Ask
2012-01-01
This study evaluated the efficacy of the Prevention and Relationship Enhancement Program (PREP) adapted for Danish couples expecting their first child. Couples were recruited consecutively through a public maternity ward (N = 290). On the basis of due dates, they were allocated to (a) PREP, (b) an information-based control group (INFO), or (c)…
NASA Astrophysics Data System (ADS)
Stock, Joachim W.; Kitzmann, Daniel; Patzer, A. Beate C.; Sedlmayr, Erwin
2018-06-01
For the calculation of complex neutral/ionized gas phase chemical equilibria, we present a versatile and efficient semi-analytical computer program, called FastChem. The applied method is based on the solution of a system of coupled nonlinear (and linear) algebraic equations, namely the law of mass action and the element conservation equations including charge balance, in many variables. Specifically, the system of equations is decomposed into a set of coupled nonlinear equations in one variable each, which are solved analytically whenever feasible to reduce computation time. Notably, the electron density is determined by using the method of Nelder and Mead at low temperatures. The program is written in object-oriented C++, which makes it easy to couple the code with other programs, although a stand-alone version is provided. FastChem can be used in parallel or sequentially and is available under the GNU General Public License version 3 at https://github.com/exoclime/FastChem together with several sample applications. The code has been successfully validated against previous studies and its convergence behavior has been tested even for extreme physical parameter ranges down to 100 K and up to 1000 bar. FastChem converges stably and robustly in even the most demanding chemical situations, which have sometimes posed extreme challenges for previous algorithms.
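As a toy of the decomposition strategy described above (law of mass action plus element conservation reduced to a single-variable equation solved analytically), the sketch below handles the hydrogen dissociation equilibrium H2 <=> 2H for a given total hydrogen budget and equilibrium constant. The numbers are illustrative and this is not FastChem's algorithm in detail.

```python
import math

def hydrogen_equilibrium(n_h_total, k_form):
    """Solve H2 <=> 2H in closed form.

    Law of mass action:    n_H2 = k_form * n_H**2
    Element conservation:  n_H + 2*n_H2 = n_h_total
    => 2*k_form*n_H**2 + n_H - n_h_total = 0, a single-variable quadratic.
    """
    n_h = (-1.0 + math.sqrt(1.0 + 8.0 * k_form * n_h_total)) / (4.0 * k_form)
    n_h2 = k_form * n_h**2
    return n_h, n_h2

# Illustrative values only, not a real temperature/pressure point.
n_h, n_h2 = hydrogen_equilibrium(n_h_total=1.0e12, k_form=1.0e-13)
print(f"n_H = {n_h:.3e}, n_H2 = {n_h2:.3e}, conservation check = {n_h + 2*n_h2:.3e}")
```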
Heat Waves, Urban Vegetation, and Air Pollution
NASA Astrophysics Data System (ADS)
Churkina, G.; Grote, R.; Butler, T. M.
2014-12-01
Fast-track programs to plant millions of trees in cities around the world aim at the reduction of summer temperatures, increased carbon storage, storm water control, provision of space for recreation, and poverty alleviation. Although these multiple benefits speak positively for urban greening programs, the programs do not take into account how closely human and natural systems are coupled in urban areas. Elevated temperatures together with anthropogenic emissions of air and water pollutants distinguish the urban system. Urban and sub-urban vegetation responds to ambient changes and reacts with pollutants. Neglecting the existence of this coupling may lead to unforeseen drawbacks of urban greening programs. The potential for emissions from urban vegetation combined with anthropogenic emissions to produce ozone has long been recognized. This potential increases under rising temperatures. Here we investigate how global-change-induced heat waves affect emissions of volatile organic compounds (VOC) from urban vegetation and corresponding ground-level ozone levels. We also quantify other ecosystem services provided by urban vegetation (e.g., cooling and carbon storage) and their sensitivity to climate change. In this study we use the Weather Research and Forecasting Model with coupled atmospheric chemistry (WRF-CHEM) to quantify these feedbacks in Berlin, Germany, during the heat waves in 2003 and 2006. We highlight the importance of the vegetation for urban areas under a changing climate and discuss associated tradeoffs.
Moodi, Mitra; Miri, Mohammad-Reza; Reza Sharifirad, Gholam
2013-01-01
Background: Marriage and establishing a family is one of the most important events in a person's life. It has significant effects on personal and social health if it occurs with sufficient knowledge and in the proper conditions. The aim of this study was to determine the effect of pre-marriage instruction on the knowledge and health attitudes of couples attending pre-marriage counseling classes. Materials and Methods: This pre- and post-test quasi-experimental study was conducted on 250 couples attending pre-marriage counseling classes. The required information was collected using an autonomous questionnaire designed according to the research objectives. The questionnaire included three parts: demographic information, knowledge (27 questions), and attitude (18 questions). The questionnaire was filled out before and after the pre-marriage counseling program, which was presented as lectures. The effect of the instructional program was analyzed using statistical tests. Results: Before the intervention, 83.2% of the couples had poor knowledge, 16% average knowledge, and 0.8% good knowledge. After the intervention, 60.4% of couples had poor knowledge, 31.6% average knowledge, and 8% good knowledge. The results also revealed that the difference in mean scores of knowledge and attitudes regarding reproductive health, family planning, genetic diseases, and disabilities was statistically significant (P < 0.001). Conclusions: Although the mean scores of knowledge and attitude of the couples increased after the instructional intervention, the increase in knowledge level was not large: the knowledge score of the couples increased by just 4.3%, and only 8% of the couples had good knowledge after the intervention. Therefore, to achieve relatively stable behavior change in individuals and to improve the health of young couples, it is recommended that more attention be paid to the quality of the instructional classes. PMID:24251288
Upper Atmosphere Research Satellite (UARS): A program to study global ozone change
NASA Technical Reports Server (NTRS)
1991-01-01
A general overview of NASA's Upper Atmosphere Research Satellite (UARS) program is presented in a broad based informational publication. The UARS will be responsible for carrying out the first systematic, comprehensive study of the stratosphere and will furnish important new data on the mesosphere and thermosphere. The UARS mission objectives are to provide an increased understanding of energy input into the upper atmosphere; global photochemistry of the upper atmosphere; dynamics of the upper atmosphere; coupling among these processes; and coupling between the upper and lower atmosphere. These mission objectives are briefly described along with the UARS on-board instrumentation and related data management systems.
NASA Technical Reports Server (NTRS)
Rhodes, M. D.; Selberg, B. P.
1982-01-01
An investigation was performed to compare closely coupled dual wing and swept-forward/swept-rearward wing aircraft to corresponding single wing 'baseline' designs to judge the advantages offered by aircraft designed with multiple wing systems. The optimum multiple wing geometry used on the multiple wing designs was determined in an analytic study which investigated the two- and three-dimensional aerodynamic behavior of a wide range of multiple wing configurations in order to find the wing geometry that created the minimum cruise drag. This analysis used a multi-element inviscid vortex panel program coupled to a momentum integral boundary layer analysis program to account for the aerodynamic coupling between the wings and to provide the two-dimensional aerodynamic data, which was then used as input for a three-dimensional vortex lattice program, which calculated the three-dimensional aerodynamic data. The low drag of the multiple wing configurations is due to a combination of two-dimensional drag reductions, tailoring of the three-dimensional drag for the swept-forward/swept-rearward design, and the structural advantages of the two wings, whose structural connections permitted higher aspect ratios.
Shamblen, Stephen R; Arnold, Brooke B; McKiernan, Patrick; Collins, David A; Strader, Ted N
2013-09-01
Divorce proportions are currently high in the US and they are even higher among those who are incarcerated with substance abuse problems. Although much research has examined marital interventions, only two studies have examined marital interventions with prison populations. There is some empirical evidence that incarcerated couples benefit from traditional marital therapy (O'Farrell and Fals-Stewart, 1999, Addictions: A comprehensive guidebook, New York, Oxford University Press). An adaptation of the evidence-based Creating Lasting Family Connections program was implemented with 144 married couples, where one spouse was incarcerated, in a southern state with particularly high divorce and incarceration proportions. Results suggested that married men exposed to the program had larger improvements in some relationship skills relative to a convenience sample of men not so exposed. Both husbands and wives exposed to the program exhibited similar and significant increases in relationship skills. The results were comparable to a Prevention and Relationship Enhancement Program adaptation for inmates. The implications of the findings for prevention practitioners are discussed. © FPI, Inc.
Parent Couples' Coping Resources and Involvement in their Children's Intervention Program.
Brand, Devora; Zaidman-Zait, Anat; Most, Tova
2018-07-01
Parental involvement is vital to the implementation of intervention programs for deaf and hard-of-hearing (DHH) children. The current study examined the dyadic relationships between mothers' and fathers' coping resources and their involvement in their child's intervention program. In addition, the moderating roles of parent gender and family religiosity on the associations between coping resources and involvement were examined. Seventy Jewish couples of parents of DHH children, representing various levels of religiosity, completed questionnaires regarding involvement in their child's intervention program, child acceptance, parental self-efficacy, and perceived social support. Multilevel modeling analyses were used to test actor-partner interdependence. The findings indicated significant actor effects for child acceptance, parental self-efficacy, and social support; all were positively associated with parental involvement. Gender was found to moderate the actor effect of child acceptance. Partner effects were found only for mothers, for child acceptance and social support: fathers' child acceptance and social support were negatively associated with mothers' involvement. Religiosity moderated neither actor nor partner effects. These results have important implications for planning intervention programs that are sensitive to each parent's needs.
A Cost-Benefit Analysis of the National Guard Youth ChalleNGe Program. Technical Report
ERIC Educational Resources Information Center
Perez-Arce, Francisco; Constant, Louay; Loughran, David S.; Karoly, Lynn A.
2012-01-01
Decades of research show that high school dropouts are more likely than graduates to commit crimes, abuse drugs and alcohol, have children out of wedlock, earn low wages, be unemployed, and suffer from poor health. The ChalleNGe program, currently operating in 27 states, is a residential program coupled with post-residential mentoring that seeks…
ERIC Educational Resources Information Center
Al-Imamy, Samer; Alizadeh, Javanshir; Nour, Mohamed A.
2006-01-01
One of the major issues related to teaching an introductory programming course is the excessive amount of time spent on the language's syntax, which leaves little time for developing skills in program design and solution creativity. The wide variation in the students' backgrounds, coupled with the traditional classroom (one size-fits-all) teaching…
Does Like Seek Like?: The Formation of Working Groups in a Programming Project
ERIC Educational Resources Information Center
Sanou Gozalo, Eduard; Hernández-Fernández, Antoni; Arias, Marta; Ferrer-i-Cancho, Ramon
2017-01-01
In a course of a computer science degree, the programming project has changed from individual to team work, tentatively in couples (pair programming). Students have full freedom to team up, with minimum intervention from teachers. The analysis of the working groups formed indicates that students do not tend to associate with students with a…
NASA Technical Reports Server (NTRS)
Hodges, D. H.; Hopkins, A. S.; Kunz, D. L.; Hinnant, H. E.
1986-01-01
The General Rotorcraft Aeromechanical Stability Program (GRASP), which is a hybrid between finite element programs and spacecraft-oriented multibody programs, is described in terms of its design and capabilities. Numerical results from GRASP are presented and compared with the results from an existing, special-purpose coupled rotor/body aeromechanical stability program and with experimental data of Dowell and Traybar (1975 and 1977) for large deflections of an end-loaded cantilevered beam. The agreement is excellent in both cases.
Television camera as a scientific instrument
NASA Technical Reports Server (NTRS)
Smokler, M. I.
1970-01-01
A rigorous calibration program, coupled with a sophisticated data-processing program that introduced compensation for system response to correct photometry, geometric linearity, and resolution, converted a television camera into a quantitative measuring instrument. The output data are in the form of both numeric printout records and photographs.
California Guide to Traffic Safety Education.
ERIC Educational Resources Information Center
California State Dept. of Education, Sacramento.
The guide proposes an elementary through high school program encompassing many aspects of traffic safety. Chapter 1 presents definitions, instructional goals, behavioral objectives, and K-6 traffic safety concepts coupled with student performance indicators. Various elements of program administration are covered in Chapter 2. Chapter 3 includes…
Tenure--A Management Problem.
ERIC Educational Resources Information Center
Brassie, Stan
Tenure saturation coupled with declining enrollments, abolishment of general university requirements, program diversity, and affirmative action programs makes tenure an issue. These factors are representative of many facing university management today. Serious examination of the concept of tenure reveals that 85 percent of all colleges have tenure,…
Reyes-Reyes, Elsa M; Aispuro, Ivan; Tavera-Garcia, Marco A; Field, Matthew; Moore, Sara; Ramos, Irma; Ramos, Kenneth S
2017-11-28
Although several lines of evidence have established the central role of epithelial-to-mesenchymal transition (EMT) in malignant progression of non-small cell lung cancers (NSCLCs), the molecular events connecting EMT to malignancy remain poorly understood. This study presents evidence that the Long Interspersed Nuclear Element-1 (LINE-1) retrotransposon couples EMT programming with malignancy in human bronchial epithelial cells (BEAS-2B). This conclusion is supported by studies showing that: 1) activation of EMT programming by TGF-β1 increases LINE-1 mRNAs and protein; 2) the lung carcinogen benzo(a)pyrene coregulates TGF-β1 and LINE-1 mRNAs, with LINE-1 positioned downstream of TGF-β1 signaling; and 3) forced expression of LINE-1 in BEAS-2B cells recapitulates EMT programming and induces malignant phenotypes and tumorigenesis in vivo. These findings identify a TGF-β1-LINE-1 axis as a critical effector pathway that can be targeted for the development of precision therapies during malignant progression of intractable NSCLCs.
The Argonne Leadership Computing Facility 2010 annual report.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Drugan, C.
Researchers found more ways than ever to conduct transformative science at the Argonne Leadership Computing Facility (ALCF) in 2010. Both familiar initiatives and innovative new programs at the ALCF are now serving a growing, global user community with a wide range of computing needs. The Department of Energy's (DOE) INCITE Program remained vital in providing scientists with major allocations of leadership-class computing resources at the ALCF. For calendar year 2011, 35 projects were awarded 732 million supercomputer processor-hours for computationally intensive, large-scale research projects with the potential to significantly advance key areas in science and engineering. Argonne also continued to provide Director's Discretionary allocations - 'start up' awards - for potential future INCITE projects. And DOE's new ASCR Leadership Computing (ALCC) Program allocated resources to 10 ALCF projects, with an emphasis on high-risk, high-payoff simulations directly related to the Department's energy mission, national emergencies, or for broadening the research community capable of using leadership computing resources. While delivering more science today, we've also been laying a solid foundation for high performance computing in the future. After a successful DOE Lehman review, a contract was signed to deliver Mira, the next-generation Blue Gene/Q system, to the ALCF in 2012. The ALCF is working with the 16 projects that were selected for the Early Science Program (ESP) to enable them to be productive as soon as Mira is operational. Preproduction access to Mira will enable ESP projects to adapt their codes to its architecture and collaborate with ALCF staff in shaking down the new system. We expect the 10-petaflops system to stoke economic growth and improve U.S. competitiveness in key areas such as advancing clean energy and addressing global climate change. Ultimately, we envision Mira as a stepping-stone to exascale-class computers that will be faster than petascale-class computers by a factor of a thousand. Pete Beckman, who served as the ALCF's Director for the past few years, has been named director of the newly created Exascale Technology and Computing Institute (ETCi). The institute will focus on developing exascale computing to extend scientific discovery and solve critical science and engineering problems. Just as Pete's leadership propelled the ALCF to great success, we know that the ETCi will benefit immensely from his expertise and experience. Without question, the future of supercomputing is certainly in good hands. I would like to thank Pete for all his effort over the past two years, during which he oversaw the establishment of ALCF2, the deployment of the Magellan project, and increases in utilization, availability, and the number of projects using ALCF1. He managed the rapid growth of ALCF staff and made the facility what it is today. All the staff and users are better for Pete's efforts.
SCIDAC Center for simulation of wave particle interactions CompX participation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Harvey, R.W.
Harnessing the energy that is released in fusion reactions would provide a safe and abundant source of power to meet the growing energy needs of the world population. The next step toward the development of fusion as a practical energy source is the construction of ITER, a device capable of producing and controlling the high performance plasma required for self-sustaining fusion reactions, or “burning” plasma. The input power required to drive the ITER plasma into the burning regime will be supplied primarily with a combination of external power from radio frequency waves in the ion cyclotron range of frequencies and energetic ions from neutral beam injection sources, in addition to internally generated Ohmic heating from the induced plasma current that also serves to create the magnetic equilibrium for the discharge. The ITER project is a large multi-billion dollar international project in which the US participates. Because the success of the ITER project depends critically on the ability to create and maintain burning plasma conditions, it is absolutely necessary to have physics-based models that can accurately simulate the RF processes that affect the dynamical evolution of the ITER discharge. The Center for Simulation of Wave-Plasma Interactions (CSWPI), also known as RF-SciDAC, is a multi-institutional collaboration that has conducted ongoing research aimed at developing: (1) coupled core-to-edge simulations that will lead to an increased understanding of parasitic losses of the applied RF power in the boundary plasma between the RF antenna and the core plasma; (2) models for core interactions of RF waves with energetic electrons and ions (including fusion alpha particles and fast neutral beam ions) that include a more accurate representation of the particle dynamics in the combined equilibrium and wave fields; and (3) improved algorithms that will take advantage of massively parallel computing platforms at the petascale level and beyond to achieve the needed physics, resolution, and/or statistics to address these issues. CompX provides computer codes and analysis for the calculation of the electron and ion distributions in velocity-space and plasma radius, which are necessary for reliable calculations of power deposition and toroidal current drive due to combined radiofrequency and neutral beam injection at high injected powers. It has also contributed to ray tracing modeling of injected radiofrequency powers, and to coupling between full-wave radiofrequency wave models and the distribution function calculations. In the course of this research, the Fokker-Planck distribution function calculation was made substantially more realistic by inclusion of finite-width drift-orbit effects (FOW). FOW effects were also implemented in a calculation of the phase-space diffusion resulting from radiofrequency full-wave models. The average level of funding for CompX was approximately three man-months per year.
Community Maintenance Programs for Sexual Offenders
ERIC Educational Resources Information Center
Youssef, Carollyne
2013-01-01
While optimism regarding the treatment of sexual offenders has increased over the past couple of decades, research into the factors that assist offenders in maintaining therapeutic changes remains in the dark. Maintenance programs for offenders, while theoretically appearing to have a solid place in offender rehabilitation, surprisingly have not…
2010 Military Family Life Project (MFLP) - Couples: Tabulations of Responses
2013-08-31
Effect of Sexual Education on Sexual Health in Iran
ERIC Educational Resources Information Center
Farnam, Farnaz; Pakgohar, Minoo; Mirmohamadali, Mandana; Mahmoodi, Mahmood
2008-01-01
The purpose of this study was to evaluate the effect of a special sex education program in sexual health on Iranian newly-wed couples. A sample of 64 couples referred to three health centers of Tehran Medicine University, a few months prior to their marriage, were divided into case and control groups. The case group received three lecture sessions…
Federal Register 2010, 2011, 2012, 2013, 2014
2013-10-30
.... Thus, the availability of same-sex marriage in a particular state was not relevant to our determination... same-sex domestic partners living in states that do not allow same-sex couples to marry to be covered... for the children of same-sex domestic partners is limited to those states in which same-sex couples...
ERIC Educational Resources Information Center
Chaney, Cassandra; Monroe, Pamela
2011-01-01
With passage of the Welfare Reform Law of 1996, various national, state, and local programs were created to encourage marriage, particularly among low-income African American cohabiting couples with children. However, policy makers know little about the deterrents to marriage for members of this group. More specifically, there is a lack of data…
NASA Astrophysics Data System (ADS)
Hoek, Jaap
1983-02-01
A set of programs to calculate algebraically the generating functional (free energy) of a gauge system with arbitrary external sources on a lattice has been developed. It makes use of the strong coupling expansion. For theories with the standard Tr(UUU†U†) action, results have been obtained up to fourth order.
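For orientation, the strong coupling (small beta) expansion referred to here has the generic textbook structure sketched below for a Wilson-type plaquette action with link variables U_l, plaquettes U_p, and external sources J_l; the precise source coupling and normalization conventions are assumptions, and this is not the program's specific fourth-order result.

```latex
% Schematic form of the strong-coupling expansion of the generating functional.
% Conventions (source term, normalization) are illustrative assumptions.
\begin{aligned}
Z[J] &= \int \prod_{\ell}\mathrm{d}U_{\ell}\,
        \exp\!\Big(\beta \sum_{p}\operatorname{Re}\operatorname{Tr} U_{p}
        + \sum_{\ell}\operatorname{Tr}\!\big(J_{\ell}^{\dagger} U_{\ell}
        + U_{\ell}^{\dagger} J_{\ell}\big)\Big),\\
W[J] &= \ln Z[J]
      = \sum_{n \ge 0} \beta^{\,n}\, w_{n}[J]
      \qquad \text{(here computed through } n = 4\text{)},
\end{aligned}
```

where the small-beta expansion of the plaquette Boltzmann factor reduces each order to group integrals over the link variables against the source term.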
Beating the tyranny of scale with a private cloud configured for Big Data
NASA Astrophysics Data System (ADS)
Lawrence, Bryan; Bennett, Victoria; Churchill, Jonathan; Juckes, Martin; Kershaw, Philip; Pepler, Sam; Pritchard, Matt; Stephens, Ag
2015-04-01
The Joint Analysis System, JASMIN, consists of five significant hardware components: a batch computing cluster, a hypervisor cluster, bulk disk storage, high performance disk storage, and access to a tape robot. Each of the computing clusters consists of a heterogeneous set of servers, supporting a range of possible data analysis tasks - and a unique network environment makes it relatively trivial to migrate servers between the two clusters. The high performance disk storage will include the world's largest (publicly visible) deployment of the Panasas parallel disk system. Initially deployed in April 2012, JASMIN has already undergone two major upgrades, culminating in a system which, by April 2015, will have in excess of 16 PB of disk and 4000 cores. Layered on the basic hardware are a range of services, ranging from managed services, such as the curated archives of the Centre for Environmental Data Archival or the data analysis environment for the National Centres for Atmospheric Science and Earth Observation, to a generic Infrastructure as a Service (IaaS) offering for the UK environmental science community. Here we present examples of some of the big data workloads being supported in this environment - ranging from data management tasks, such as checksumming 3 PB of data held in over one hundred million files, to science tasks, such as re-processing satellite observations with new algorithms, or calculating new diagnostics on petascale climate simulation outputs. We will demonstrate how the provision of a cloud environment closely coupled to a batch computing environment, all sharing the same high performance disk system, allows massively parallel processing without the necessity to shuffle data excessively - even as it supports many different virtual communities, each with guaranteed performance. We will discuss the advantages of having a heterogeneous range of servers with available memory from tens of GB at the low end to (currently) two TB at the high end. There are some limitations of the JASMIN environment: the high performance disk environment is not fully available in the IaaS environment, and a planned ability to burst compute-heavy jobs into the public cloud is not yet fully available. There are load balancing and performance issues that need to be understood. We will conclude with projections for future usage, and our plans to meet those requirements.
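Checksumming petabytes spread over roughly a hundred million files is an embarrassingly parallel workload; a minimal Python sketch of that pattern (hash each file on a worker pool, streaming in large chunks) is shown below. The archive path, hash choice, and pool size are illustrative and are not JASMIN's actual tooling.

```python
import hashlib
from multiprocessing import Pool
from pathlib import Path

def checksum(path, chunk_size=16 * 1024 * 1024):
    """Stream a file through SHA-256 in large chunks to keep memory use flat."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return str(path), h.hexdigest()

if __name__ == "__main__":
    # Hypothetical archive root; in practice the file list would come from a catalogue.
    files = [p for p in Path("/archive").rglob("*") if p.is_file()]
    with Pool(processes=16) as pool:
        for name, digest in pool.imap_unordered(checksum, files, chunksize=64):
            print(digest, name)
```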
Developing Models for Predictive Climate Science
DOE Office of Scientific and Technical Information (OSTI.GOV)
Drake, John B; Jones, Philip W
2007-01-01
The Community Climate System Model results from a multi-agency collaboration designed to construct cutting-edge climate science simulation models for a broad research community. Predictive climate simulations are currently being prepared for the petascale computers of the near future. Modeling capabilities are continuously being improved in order to provide better answers to critical questions about Earth's climate. Climate change and its implications are front page news in today's world. Could global warming be responsible for the July 2006 heat waves in Europe and the United States? Should more resources be devoted to preparing for an increase in the frequency of strong tropical storms and hurricanes like Katrina? Will coastal cities be flooded due to a rise in sea level? The National Climatic Data Center (NCDC), which archives all weather data for the nation, reports that global surface temperatures have increased over the last century, and that the rate of increase is three times greater since 1976. Will temperatures continue to climb at this rate, will they decline again, or will the rate of increase become even steeper? To address such a flurry of questions, scientists must adopt a systematic approach and develop a predictive framework. With responsibility for advising on energy and technology strategies, the DOE is dedicated to advancing climate research in order to elucidate the causes of climate change, including the role of carbon loading from fossil fuel use. Thus, climate science--which by nature involves advanced computing technology and methods--has been the focus of a number of DOE's SciDAC research projects. Dr. John Drake (ORNL) and Dr. Philip Jones (LANL) served as principal investigators on the SciDAC project, 'Collaborative Design and Development of the Community Climate System Model for Terascale Computers.' The Community Climate System Model (CCSM) is a fully-coupled global system that provides state-of-the-art computer simulations of the Earth's past, present, and future climate states. The collaborative SciDAC team--including over a dozen researchers at institutions around the country--developed, validated, documented, and optimized the performance of CCSM using the latest software engineering approaches, computational technology, and scientific knowledge. Many of the factors that must be accounted for in a comprehensive model of the climate system are illustrated in figure 1.
Rupture mechanism of liquid crystal thin films realized by large-scale molecular simulations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nguyen, Trung D; Carrillo, Jan-Michael Y; Brown, W Michael
2014-01-01
The ability of liquid crystal (LC) molecules to respond to changes in their environment makes them an interesting candidate for thin film applications, particularly in bio-sensing, bio-mimicking devices, and optics. Yet the understanding of the (in)stability of this family of thin films has been limited by the inherent challenges encountered by experiment and continuum models. Using unprecedented large-scale molecular dynamics (MD) simulations, we address the rupture origin of LC thin films wetting a solid substrate at length scales similar to those in experiment. Our simulations show the key signatures of spinodal instability in isotropic and nematic films on top of thermal nucleation, and importantly, for the first time, evidence of a common rupture mechanism independent of initial thickness and LC orientational ordering. We further demonstrate that the primary driving force for rupture is closely related to the tendency of the LC mesogens to recover their local environment in the bulk state. Our study not only provides new insights into the rupture mechanism of liquid crystal films, but also sets the stage for future investigations of thin film systems using petascale molecular dynamics simulations.
SCaLeM: A Framework for Characterizing and Analyzing Execution Models
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chavarría-Miranda, Daniel; Manzano Franco, Joseph B.; Krishnamoorthy, Sriram
2014-10-13
As scalable parallel systems evolve towards more complex nodes with many-core architectures and larger trans-petascale and upcoming exascale deployments, there is a need to understand, characterize and quantify the underlying execution models being used on such systems. Execution models are a conceptual layer between applications and algorithms and the underlying parallel hardware and systems software on which those applications run. This paper presents the SCaLeM (Synchronization, Concurrency, Locality, Memory) framework for characterizing and analyzing execution models. SCaLeM consists of three basic elements: attributes, compositions, and the mapping of these compositions to abstract parallel systems. The fundamental Synchronization, Concurrency, Locality and Memory attributes are used to characterize each execution model, while combinations of those attributes in the form of compositions are used to describe the primitive operations of the execution model. The mapping of the execution model's primitive operations, described by compositions, to an underlying abstract parallel system can be evaluated quantitatively to determine its effectiveness. Finally, SCaLeM also enables the representation and analysis of applications in terms of execution models, for the purpose of evaluating the effectiveness of such a mapping.
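The attribute/composition structure can be pictured with a small data-model sketch; the attribute names follow the abstract, but the example compositions and the cost-based "abstract system" scoring below are invented purely for illustration and are not the SCaLeM formalism itself.

```python
# Illustrative only: a toy encoding of execution-model primitives (compositions
# of attributes) and a quantitative mapping onto an abstract parallel system.
from dataclasses import dataclass
from enum import Enum, auto

class Attribute(Enum):
    SYNCHRONIZATION = auto()
    CONCURRENCY = auto()
    LOCALITY = auto()
    MEMORY = auto()

@dataclass(frozen=True)
class Composition:
    """A primitive operation of an execution model, described by the attributes it combines."""
    name: str
    attributes: frozenset

@dataclass
class AbstractSystem:
    """Toy abstract parallel system: a relative cost for exercising each attribute."""
    costs: dict  # Attribute -> float

    def cost(self, comp: Composition) -> float:
        return sum(self.costs[a] for a in comp.attributes)

# Example: a one-sided put-style primitive versus a global barrier.
put = Composition("remote_put", frozenset({Attribute.MEMORY, Attribute.LOCALITY}))
barrier = Composition("barrier", frozenset({Attribute.SYNCHRONIZATION, Attribute.CONCURRENCY}))
system = AbstractSystem(costs={Attribute.SYNCHRONIZATION: 4.0, Attribute.CONCURRENCY: 1.0,
                               Attribute.LOCALITY: 0.5, Attribute.MEMORY: 1.5})
for comp in (put, barrier):
    print(comp.name, system.cost(comp))
```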
Analytics-Driven Lossless Data Compression for Rapid In-situ Indexing, Storing, and Querying
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jenkins, John; Arkatkar, Isha; Lakshminarasimhan, Sriram
2013-01-01
The analysis of scientific simulations is highly data-intensive and is becoming an increasingly important challenge. Peta-scale data sets require the use of light-weight query-driven analysis methods, as opposed to heavy-weight schemes that optimize for speed at the expense of size. This paper is an attempt in the direction of query processing over losslessly compressed scientific data. We propose a co-designed double-precision compression and indexing methodology for range queries by performing unique-value-based binning on the most significant bytes of double precision data (sign, exponent, and most significant mantissa bits), and inverting the resulting metadata to produce an inverted index over a reduced data representation. Without the inverted index, our method matches or improves compression ratios over both general-purpose and floating-point compression utilities. The inverted index is light-weight, and the overall storage requirement for both reduced column and index is less than 135%, whereas existing DBMS technologies can require 200-400%. As a proof-of-concept, we evaluate univariate range queries that additionally return column values, a critical component of data analytics, against state-of-the-art bitmap indexing technology, showing multi-fold query performance improvements.
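As a rough illustration of the binning-plus-inverted-index idea described above (not the authors' code), the sketch below bins double-precision values by their most significant bytes and builds an inverted index from bin value to row positions; keeping the top two bytes is an arbitrary choice for this example.

```python
# Sketch: bin float64 values by their most significant bytes (sign, exponent,
# leading mantissa bits) and build an inverted index from bin to row ids.
# Prefix comparison for pruning works for non-negative values, whose IEEE-754
# bit patterns are ordered like the values themselves.
from collections import defaultdict
import numpy as np

PREFIX_BYTES = 2
SHIFT = 8 * (8 - PREFIX_BYTES)

def bit_prefix(x: float) -> int:
    return np.array(x, dtype=np.float64).view(np.uint64).item() >> SHIFT

def build_inverted_index(values: np.ndarray):
    bits = np.asarray(values, dtype=np.float64).view(np.uint64)
    prefixes = bits >> np.uint64(SHIFT)            # reduced representation
    index = defaultdict(list)                      # bin value -> row ids
    for row, p in enumerate(prefixes):
        index[int(p)].append(row)
    return index

def range_query(index, values, lo, hi):
    """Row ids with lo <= value <= hi: prune bins by prefix, then refine."""
    lo_p, hi_p = bit_prefix(lo), bit_prefix(hi)
    candidates = (r for p, rows in index.items() if lo_p <= p <= hi_p for r in rows)
    return [r for r in candidates if lo <= values[r] <= hi]

data = np.random.default_rng(0).uniform(0.0, 100.0, size=10_000)
index = build_inverted_index(data)
print(len(range_query(index, data, 10.0, 20.0)))   # roughly 1000 hits expected
```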
Dynamic load balancing for petascale quantum Monte Carlo applications: The Alias method
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sudheer, C. D.; Krishnan, S.; Srinivasan, A.
Diffusion Monte Carlo is the most accurate widely used Quantum Monte Carlo method for the electronic structure of materials, but it requires frequent load balancing or population redistribution steps to maintain efficiency and avoid accumulation of systematic errors on parallel machines. The load balancing step can be a significant factor affecting performance, and will become more important as the number of processing elements increases. We propose a new dynamic load balancing algorithm, the Alias Method, and evaluate it theoretically and empirically. An important feature of the new algorithm is that the load can be perfectly balanced with each process receiving at most one message. It is also optimal in the maximum size of messages received by any process. We also optimize its implementation to reduce network contention, a process facilitated by the low messaging requirement of the algorithm. Empirical results on the petaflop Cray XT Jaguar supercomputer at ORNL show up to 30% improvement in performance on 120,000 cores. The load balancing algorithm may be straightforwardly implemented in existing codes. The algorithm may also be employed by any method with many nearly identical computational tasks that requires load balancing.
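A minimal sketch of the pairing idea behind an alias-style redistribution follows, under the simplifying assumption that the total load divides evenly among processes; each under-loaded process is paired with exactly one donor, so no process receives more than one message. This illustrates the general scheme, not the paper's implementation.

```python
# Sketch of alias-style load redistribution: each receiver is paired with a
# single donor, so every process receives at most one message. Assumes the
# total load is an exact multiple of the number of processes.
def plan_transfers(loads):
    n = len(loads)
    target = sum(loads) // n
    surplus = {p: loads[p] - target for p in range(n)}
    small = [p for p in range(n) if surplus[p] < 0]     # need work
    large = [p for p in range(n) if surplus[p] > 0]     # have extra work
    transfers = []                                      # (donor, receiver, amount)
    while small:
        r = small.pop()
        d = large[-1]
        amount = -surplus[r]
        transfers.append((d, r, amount))                # r receives exactly one message
        surplus[d] -= amount
        surplus[r] = 0
        if surplus[d] == 0:
            large.pop()
        elif surplus[d] < 0:                            # donor dipped below target;
            large.pop()                                 # it will itself receive one
            small.append(d)                             # message from another donor
    return transfers

print(plan_transfers([7, 1, 4, 4]))   # target 4 -> [(0, 1, 3)]
```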
High-efficiency wavefunction updates for large scale Quantum Monte Carlo
NASA Astrophysics Data System (ADS)
Kent, Paul; McDaniel, Tyler; Li, Ying Wai; D'Azevedo, Ed
Within ab initio Quantum Monte Carlo (QMC) simulations, the leading numerical cost for large systems is the computation of the values of the Slater determinants in the trial wavefunctions. The evaluation of each Monte Carlo move requires finding the determinant of a dense matrix, which is traditionally iteratively evaluated using a rank-1 Sherman-Morrison updating scheme to avoid repeated explicit calculation of the inverse. For calculations with thousands of electrons, this operation dominates the execution profile. We propose a novel rank-k delayed update scheme. This strategy enables probability evaluation for multiple successive Monte Carlo moves, with application of accepted moves to the matrices delayed until after a predetermined number of moves, k. Accepted events grouped in this manner are then applied to the matrices en bloc with enhanced arithmetic intensity and computational efficiency. This procedure does not change the underlying Monte Carlo sampling or the sampling efficiency. For large systems and algorithms such as diffusion Monte Carlo where the acceptance ratio is high, order-of-magnitude speedups can be obtained on both multi-core CPUs and on GPUs, making this algorithm highly advantageous for current petascale and future exascale computations.
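To make the contrast between rank-1 and blocked updates concrete, here is a small numerical sketch (not the QMC production kernel) that accumulates k rank-1 corrections and applies them to an inverse en bloc with the Woodbury identity; the matrix sizes and random updates are illustrative only.

```python
# Sketch: apply k accumulated rank-1 updates to a matrix inverse in one block
# step using the Woodbury identity, instead of k separate Sherman-Morrison
# updates. Sizes and updates are arbitrary; this only illustrates the algebra.
import numpy as np

def blocked_inverse_update(A_inv, U, V):
    """Return (A + U @ V.T)^{-1} given A^{-1}, with U, V of shape (n, k)."""
    k = U.shape[1]
    AinvU = A_inv @ U                                    # n x k
    S = np.eye(k) + V.T @ AinvU                          # k x k capacitance matrix
    return A_inv - AinvU @ np.linalg.solve(S, V.T @ A_inv)

rng = np.random.default_rng(1)
n, k = 200, 8
A = rng.standard_normal((n, n)) + n * np.eye(n)          # well-conditioned test matrix
A_inv = np.linalg.inv(A)
U = rng.standard_normal((n, k))                           # k delayed rank-1 updates,
V = rng.standard_normal((n, k))                           # stored as columns of U and V

updated = blocked_inverse_update(A_inv, U, V)
reference = np.linalg.inv(A + U @ V.T)
print(np.max(np.abs(updated - reference)))                # should be near machine precision
```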
WRF Test on IBM BG/L: Toward High Performance Application to Regional Climate Research
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chin, H S
The effects of climate change will mostly be felt on local to regional scales (Solomon et al., 2007). To develop better forecast skill in regional climate change, an integrated multi-scale modeling capability (i.e., a pair of global and regional climate models) becomes crucially important in understanding and preparing for the impacts of climate change on the temporal and spatial scales that are critical to California's and the nation's future environmental quality and economic prosperity. Accurate knowledge of detailed local impacts on the water management system from climate change requires a resolution of 1 km or so. To this end, a high performance computing platform at the petascale appears to be an essential tool in providing such local scale information to formulate high quality adaptation strategies for local and regional climate change. As a key component of this modeling system at LLNL, the Weather Research and Forecast (WRF) model is implemented and tested on the IBM BG/L machine. The objective of this study is to examine the scaling behavior of WRF on BG/L for optimal performance, and to assess the numerical accuracy of the WRF solution on BG/L.
Emerging CAE technologies and their role in Future Ambient Intelligence Environments
NASA Astrophysics Data System (ADS)
Noor, Ahmed K.
2011-03-01
Dramatic improvements are on the horizon in Computer Aided Engineering (CAE) and various simulation technologies. The improvements are due, in part, to developments in a number of leading-edge technologies and their synergistic combinations and convergence. The technologies include ubiquitous, cloud, and petascale computing; ultra high-bandwidth networks and pervasive wireless communication; knowledge based engineering; networked immersive virtual environments and virtual worlds; novel human-computer interfaces; and powerful game engines and facilities. This paper describes the frontiers of emerging simulation technologies and their role in future virtual product creation and learning/training environments. The environments will be ambient intelligence environments, incorporating a synergistic combination of novel agent-supported visual simulations (with cognitive learning and understanding abilities); immersive 3D virtual world facilities; development chain management systems and facilities (incorporating a synergistic combination of intelligent engineering and management tools); nontraditional methods; intelligent, multimodal and human-like interfaces; and mobile wireless devices. The virtual product creation environment will significantly enhance productivity and will stimulate creativity and innovation in future global virtual collaborative enterprises. The facilities in the learning/training environment will provide timely, engaging, personalized/collaborative and tailored visual learning.
NASA Astrophysics Data System (ADS)
Hassan, A. H.; Fluke, C. J.; Barnes, D. G.
2012-09-01
Upcoming and future astronomy research facilities will systematically generate terabyte-sized data sets, moving astronomy into the petascale data era. While such facilities will provide astronomers with unprecedented levels of accuracy and coverage, the increases in dataset size and dimensionality will pose serious computational challenges for many current astronomy data analysis and visualization tools. With such data sizes, even simple data analysis tasks (e.g. calculating a histogram or computing data minimum/maximum) may not be achievable without access to a supercomputing facility. To effectively handle such dataset sizes, which exceed today's single machine memory and processing limits, we present a framework that exploits the distributed power of GPUs and many-core CPUs, with a goal of providing data analysis and visualization tasks as a service for astronomers. By mixing shared and distributed memory architectures, our framework effectively utilizes the underlying hardware infrastructure, handling both batched and real-time data analysis and visualization tasks. Offering such functionality as a service in a “software as a service” manner will reduce the total cost of ownership, provide an easy-to-use tool to the wider astronomical community, and enable a more optimized utilization of the underlying hardware infrastructure.
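The point about even simple reductions becoming hard at scale can be illustrated with a chunked, out-of-core pass. The sketch below (illustrative only, single node, hypothetical file name) streams a large array in blocks and merges partial minima, maxima, and histograms, which is the same pattern a distributed backend would apply per worker before a final merge.

```python
# Sketch: out-of-core min/max and histogram over a large memory-mapped array,
# processed in chunks and merged, mirroring a distributed partial reduction.
# "survey.npy" is a hypothetical file of float64 samples.
import numpy as np

def chunked_stats(path, bins=64, lo=0.0, hi=1.0, chunk=1_000_000):
    data = np.load(path, mmap_mode="r")                 # avoid reading everything at once
    edges = np.linspace(lo, hi, bins + 1)
    hist = np.zeros(bins, dtype=np.int64)
    vmin, vmax = np.inf, -np.inf
    for start in range(0, data.shape[0], chunk):
        block = np.asarray(data[start:start + chunk])   # materialize one chunk
        vmin = min(vmin, float(block.min()))
        vmax = max(vmax, float(block.max()))
        hist += np.histogram(block, bins=edges)[0]      # partial histogram, then merge
    return vmin, vmax, hist

if __name__ == "__main__":
    np.save("survey.npy", np.random.default_rng(0).random(5_000_000))
    print(chunked_stats("survey.npy"))
```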
Data Mining and Machine Learning in Astronomy
NASA Astrophysics Data System (ADS)
Ball, Nicholas M.; Brunner, Robert J.
We review the current state of data mining and machine learning in astronomy. Data mining can have a somewhat mixed connotation from the point of view of a researcher in this field. If used correctly, it can be a powerful approach, holding the potential to fully exploit the exponentially increasing amount of available data, promising great scientific advance. However, if misused, it can be little more than the black-box application of complex computing algorithms that may give little physical insight and provide questionable results. Here, we give an overview of the entire data mining process, from data collection through to the interpretation of results. We cover common machine learning algorithms, such as artificial neural networks and support vector machines; applications from a broad range of astronomy, emphasizing those in which data mining techniques directly contributed to improving science; and important current and future directions, including probability density functions, parallel algorithms, petascale computing, and the time domain. We conclude that, so long as one carefully selects an appropriate algorithm and is guided by the astronomical problem at hand, data mining can be a very powerful tool, and not a questionable black box.
Approaching the exa-scale: a real-world evaluation of rendering extremely large data sets
DOE Office of Scientific and Technical Information (OSTI.GOV)
Patchett, John M; Ahrens, James P; Lo, Li-Ta
2010-10-15
Extremely large scale analysis is becoming increasingly important as supercomputers and their simulations move from petascale to exascale. The lack of dedicated hardware acceleration for rendering on today's supercomputing platforms motivates our detailed evaluation of the possibility of interactive rendering on the supercomputer. In order to facilitate our understanding of rendering on the supercomputing platform, we focus on scalability of rendering algorithms and the architecture envisioned for exascale datasets. To understand tradeoffs for dealing with extremely large datasets, we compare three different rendering algorithms for large polygonal data: software-based ray tracing, software-based rasterization, and hardware-accelerated rasterization. We present a case study of strong and weak scaling of rendering extremely large data on both GPU- and CPU-based parallel supercomputers using ParaView, a parallel visualization tool. We use three different data sets: two synthetic and one from a scientific application. At an extreme scale, algorithmic rendering choices make a difference and should be considered while approaching exascale computing, visualization, and analysis. We find software-based ray tracing offers a viable approach for scalable rendering of the projected future massive data sizes.
Building the future an atom at a time: Realizing Feynman's vision
NASA Astrophysics Data System (ADS)
Madia, William J.
2006-10-01
Since Feynman’s 1959 lecture, “There’s Plenty of Room at the Bottom,” and particularly in the last 15 years, advances in instrumentation have permitted us to observe and characterize materials at atomic scale. New and even more powerful capabilities are rapidly becoming available. At the same time, our theoretical understanding and ability to model complex systems have matured to a level that enables us to begin making useful predictions in many areas, with the promise of further progress as we approach petascale computing. Progress in making and structuring nanoscale materials in commercially useful quantities is also being made, albeit more selectively. Exploiting chemistry and biochemistry to mimic nature’s accomplishments in living systems is a promising approach that is opening new possibilities. The remarkable progress of the last few years is already producing technological advances, and more can be expected as investments in nanoscience and nanotechnology increase. Just as advances in information technology during the second half of the 20th century produced dramatic technological, economic, and societal changes, so the coming nanoscale revolution will affect virtually every aspect of life in the 21st century.
PFLOTRAN: Reactive Flow & Transport Code for Use on Laptops to Leadership-Class Supercomputers
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hammond, Glenn E.; Lichtner, Peter C.; Lu, Chuan
PFLOTRAN, a next-generation reactive flow and transport code for modeling subsurface processes, has been designed from the ground up to run efficiently on machines ranging from leadership-class supercomputers to laptops. Based on an object-oriented design, the code is easily extensible to incorporate additional processes. It can interface seamlessly with Fortran 9X, C and C++ codes. Domain decomposition parallelism is employed, with the PETSc parallel framework used to manage parallel solvers, data structures and communication. Features of the code include a modular input file, implementation of high-performance I/O using parallel HDF5, the ability to perform multiple-realization simulations with multiple processors per realization in a seamless manner, and multiple modes for multiphase flow and multicomponent geochemical transport. Chemical reactions currently implemented in the code include homogeneous aqueous complexing reactions and heterogeneous mineral precipitation/dissolution, ion exchange, surface complexation and a multirate kinetic sorption model. PFLOTRAN has demonstrated petascale performance using 2^17 processor cores with over 2 billion degrees of freedom. Accomplishments achieved to date include applications to the Hanford 300 Area and modeling CO2 sequestration in deep geologic formations.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schanen, Michel; Marin, Oana; Zhang, Hong
Adjoints are an important computational tool for large-scale sensitivity evaluation, uncertainty quantification, and derivative-based optimization. An essential component of their performance is the storage/recomputation balance, in which efficient checkpointing methods play a key role. We introduce a novel asynchronous two-level adjoint checkpointing scheme for multistep numerical time discretizations targeted at large-scale numerical simulations. The checkpointing scheme combines bandwidth-limited disk checkpointing and binomial memory checkpointing. Based on assumptions about the target petascale systems, which we later demonstrate to be realistic on the IBM Blue Gene/Q system Mira, we create a model of the expected performance of our checkpointing approach and validate it using the highly scalable Navier-Stokes spectral-element solver Nek5000 on small to moderate subsystems of the Mira supercomputer. In turn, this allows us to predict optimal algorithmic choices when using all of Mira. We also demonstrate that two-level checkpointing is significantly superior to single-level checkpointing when adjoining a large number of time integration steps. To our knowledge, this is the first time two-level checkpointing has been designed, implemented, tuned, and demonstrated on fluid dynamics codes at a large scale of 50k+ cores.
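As a toy illustration of the two-level idea (coarse disk checkpoints plus finer in-memory storage within each segment), the sketch below runs an adjoint sweep over a generic time-stepping loop. It uses uniform in-memory storage rather than a binomial schedule, and the step functions are placeholders, so it is a structural sketch rather than the scheme in the paper.

```python
# Toy two-level checkpointing for an adjoint sweep: states are written to
# "disk" every `segment` steps during the forward run; during the reverse
# sweep each segment is recomputed from its disk checkpoint into memory.
# Assumes n_steps is a multiple of segment; uniform in-memory storage stands
# in for the binomial schedule used in the paper.
def forward_step(state):
    return 0.5 * state + 1.0          # placeholder forward model

def adjoint_step(state, adj):
    return 0.5 * adj                  # placeholder adjoint of forward_step

def adjoint_with_two_level_checkpoints(x0, n_steps, segment=10):
    disk = {0: x0}                    # level 1: sparse checkpoints ("disk")
    state = x0
    for i in range(n_steps):          # forward sweep, storing every `segment` steps
        state = forward_step(state)
        if (i + 1) % segment == 0:
            disk[i + 1] = state

    adj = 1.0                         # seed adjoint at the final time
    for seg_end in range(n_steps, 0, -segment):
        seg_start = max(seg_end - segment, 0)
        memory = [disk[seg_start]]    # level 2: recompute the segment into memory
        s = disk[seg_start]
        for _ in range(seg_start, seg_end - 1):
            s = forward_step(s)
            memory.append(s)
        for i in range(seg_end - 1, seg_start - 1, -1):
            adj = adjoint_step(memory[i - seg_start], adj)
    return adj

print(adjoint_with_two_level_checkpoints(x0=2.0, n_steps=100))   # 0.5**100, about 8e-31
```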
NASA Astrophysics Data System (ADS)
Claver, Chuck F.; Dubois-Felsmann, G. P.; Delgado, F.; Hascall, P.; Horn, D.; Marshall, S.; Nordby, M.; Schalk, T. L.; Schumacher, G.; Sebag, J.; LSST Project Team
2010-01-01
The LSST is a complete observing system that acquires and archives images, processes and analyzes them, and publishes reduced images and catalogs of sources and objects. The LSST will operate over a ten year period producing a survey of 20,000 square degrees over the entire southern sky in 6 filters (ugrizy), with each field having been visited several hundred times, enabling a wide spectrum of science from fast transients to exploration of dark matter and dark energy. The LSST itself is a complex system of systems consisting of the 8.4m three-mirror telescope, a 3.2 billion pixel camera, and a peta-scale data management system. The LSST project uses a Model Based Systems Engineering (MBSE) methodology to ensure an integrated approach to system design and rigorous definition of system interfaces and specifications. The MBSE methodology is applied through modeling of the LSST's systems with the System Modeling Language (SysML). The SysML modeling recursively establishes the threefold relationship between requirements, logical and physical functional decomposition and definition, and system and component behavior at successively deeper levels of abstraction and detail. The MBSE approach is applied throughout all stages of the project, from design to validation and verification, through to commissioning.
The Shock and Vibration Digest, Volume 13, Number 1
1981-01-01
Transfer matrices of different elements constituting expansion chamber mufflers have been used to develop a computer program for muffler design. Figure captions include "Suppression of Flow-Acoustic Coupling by a Bridge Perforate" [57]. Paper session topics include: Environmental Stress Screening Theory; Screening Program Development/Implementation; Screening Case Histories.
Costs of Extending the Noncontributory Pension Program for Elderly: The Mexican Case.
Aguila, Emma; Mejia, Nelly; Perez-Arce, Francisco; Ramirez, Edgar; Rivera Illingworth, Alfonso
2016-01-01
Population aging coupled with high poverty rates among older persons and a lack of access to social-security benefits or traditional support systems have led governments in low and middle-income countries to introduce non-contributory pension programs for the elderly. This article reviews a non-contributory pension program introduced in Mexico in 2007 that has since expanded greatly. We use a variety of sources to estimate current and future costs of this program.
Female Couples Undergoing IVF with Partner Eggs (Co-IVF): Pathways to Parenthood.
Yeshua, Arielle; Lee, Joseph A; Witkin, Georgia; Copperman, Alan B
2015-06-01
Egg sharing can be used to allow dual participation of female couples in the pregnancy process. The oocyte donor-partner provides the eggs and the recipient partner provides the uterine environment for gestation. We present descriptive data of our experience in female couples to establish a better understanding of the utilization of co-in vitro fertilization (Co-IVF) for social and medical reasons. Female couples enrolled in a third party reproduction program that engaged in at least one Co-IVF cycle were included. Previous assisted reproductive technology (ART) cycle data, Co-IVF cycle information and pregnancy outcomes were evaluated. Female couples (n=21) who participated in Co-IVF cycles were analyzed. Over time, 16/21 (76%) of couples achieved at least one pregnancy, 9 (42%) couples delivered, and there are another 5 (23%) ongoing pregnancies. Our analysis presents descriptive data and establishes realistic expectations for Co-IVF couples. Co-IVF cycles can result in a shared experience with regard to the process of creating a family, while preserving a female couple's desire for dual partner participation in the gestational process. We encourage centers treating female couples to consider departing from the traditional nomenclature of "donors" and "recipients" and adopting the nomenclature "Co-IVF" to describe the modern understanding of the shared experience. Even if female couples have experienced prior unsuccessful cycles, couples ultimately retain an excellent prognosis for reproductive success using Co-IVF.
ERIC Educational Resources Information Center
Ulrici, Donna; And Others
1981-01-01
Provides a model for categorizing marital and family skill training programs according to their theoretical orientation. Describes emotional, reasoning, and action approaches to intervention which allow counselors to examine the relationship between client characteristics and intervention approaches. (JAC)
A Rubric for Assessing Students' Experimental Problem-Solving Ability
ERIC Educational Resources Information Center
Shadle, Susan E.; Brown, Eric C.; Towns, Marcy H.; Warner, Don L.
2012-01-01
The ability to couple problem solving both to the understanding of chemical concepts and to laboratory practices is an essential skill for undergraduate chemistry programs to foster in our students. Therefore, chemistry programs must offer opportunities to answer real problems that require use of problem-solving processes used by practicing…
This manual describes a two-dimensional, finite element model for coupled multiphase flow and multicomponent transport in planar or radially symmetric vertical sections. Flow and transport of three fluid phases, including water, nonaqueous phase liquid (NAPL), and gas are consider...
Relocating Two-Earner Couples: What Companies Are Doing. Research Bulletin No. 247.
ERIC Educational Resources Information Center
Johnson, Arlene A.
Forty employer organizations granted interviews to discuss the status of their spouse employment assistance programs, motivation for the programs, and implementation experiences. Representatives of 21 relocation consulting and research organizations supplied information on the rationale for the services they offer, ways employers use their…
Reintegrating Family Therapy Training in Psychiatric Residency Programs: Making the Case
ERIC Educational Resources Information Center
Rait, Douglas; Glick, Ira
2008-01-01
Objective: Given the marginalization of couples and family therapy in psychiatric residency programs over the past two decades, the authors propose a rationale for the reintegration of these important psychosocial treatments into the mainstream of general psychiatric residency education. Methods: After reviewing recent trends in the field that…
High temperature material interactions of thermoelectric systems using silicon germanium.
NASA Technical Reports Server (NTRS)
Stapfer, G.; Truscello, V. C.
1973-01-01
The efficient use of silicon germanium thermoelectric material for radioisotope thermoelectric generators (RTG) is achieved by operation at relatively high temperatures. The insulation technique which is most appropriate for this application uses multiple layers of molybdenum foil and astroquartz. Even so, the long term operation of these materials at elevated temperatures can cause material interaction to occur within the system. To investigate these material interactions, the Jet Propulsion Laboratory is currently testing a number of thermoelectric modules which use four silicon germanium thermoelectric couples in conjunction with the multifoil thermal insulation. The paper discusses the results of the ongoing four-couple module test program and correlates test results with those of a basic material test program.
PyMT: A Python package for model-coupling in the Earth sciences
NASA Astrophysics Data System (ADS)
Hutton, E.
2016-12-01
The current landscape of Earth-system models is not only broad in scientific scope, but also broad in type. On the one hand, the large variety of models is exciting, as it provides fertile ground for extending or linking models together in novel ways to answer new scientific questions. However, the heterogeneity in model type acts to inhibit model coupling, model development, or even model use. Existing models are written in a variety of programming languages, operate on different grids, use their own file formats (both for input and output), have different user interfaces, have their own time steps, etc. Each of these factors becomes an obstruction to scientists wanting to couple, extend, or simply run existing models. For scientists whose main focus may not be computer science, these barriers become even larger and pose significant logistical hurdles, all before the scientific difficulties of coupling or running models are addressed. The CSDMS Python Modeling Toolkit (PyMT) was developed to help non-computer scientists deal with these sorts of modeling logistics. PyMT is the fundamental package the Community Surface Dynamics Modeling System uses for the coupling of models that expose the Basic Modeling Interface (BMI). It contains: tools necessary for coupling models of disparate time and space scales (including grid mappers); time-steppers that coordinate the sequencing of coupled models; exchange of data between BMI-enabled models; wrappers that automatically load BMI-enabled models into the PyMT framework; utilities that support open-source interfaces (UGRID, SGRID, CSDMS Standard Names, etc.); a collection of community-submitted models, written in a variety of programming languages and from a variety of process domains, all usable from within the Python programming language; and a plug-in framework for adding additional BMI-enabled models to the framework. In this presentation we introduce the basics of the PyMT as well as provide an example of coupling models of different domains and grid types.
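A minimal sketch of the kind of BMI-style coupling loop PyMT automates is shown below; the two toy models, their variable names, and their placeholder physics are invented for illustration, and this is not the actual PyMT or BMI API.

```python
# Toy coupling of two BMI-like models with different time steps: a driver
# advances each model and passes one variable between them. Model classes,
# variable names, and dynamics are invented purely for illustration.
class ToyRiver:
    def __init__(self):
        self.time, self.dt = 0.0, 1.0          # days
        self.discharge = 100.0                 # m^3/s

    def update(self):
        self.discharge *= 0.99                 # placeholder dynamics
        self.time += self.dt

class ToyDelta:
    def __init__(self):
        self.time, self.dt = 0.0, 0.5          # days
        self.sediment_in = 0.0
        self.deposited = 0.0

    def set_value(self, name, value):
        if name == "sediment_in":
            self.sediment_in = value

    def update(self):
        self.deposited += 0.01 * self.sediment_in * self.dt
        self.time += self.dt

river, delta = ToyRiver(), ToyDelta()
t_end = 10.0
while delta.time < t_end:
    # keep the coarser-stepping model no more than one step behind the finer one
    if river.time <= delta.time:
        river.update()
    delta.set_value("sediment_in", river.discharge)   # exchange step
    delta.update()
print(round(delta.deposited, 3))
```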
NGA Industry Critique of the Exploration Component
DOE Office of Scientific and Technical Information (OSTI.GOV)
Iovanetti, J.L.
1992-03-24
The author critiques the Exploration component of the U.S. Department of Energy (DOE) Geothermal Program Review X. The comments focus principally on the hydrothermal portion of the DOE program, but he also makes some commentary on the Long Valley Exploratory Well and Geopressured-Geothermal components of the program, as well as some general comments. Before I do that, I would like to review the current state of geothermal exploration in the United States. According to Koenig (1989, 1990), who critiqued the DOE Geothermal Program in those years, geothermal exploration in the western U.S. has been conducted in virtually all of the apparent geothermal resource areas. Many of these areas, which were under exploration in the 1960s and 1970s and were explored in part under the U.S. DOE Industry Coupled Program, have progressed to commercial status in the 80s. The DOE March (1992) Draft Multi-Year Program Plan for FY 1993-1997 states that 8 out of the 14 geothermal resource areas explored under this Industry Coupled Program in the late 1970s are currently under production. I do not think we will find anyone in this room, in the geothermal industry, or in the United States that will argue with the clear and outstanding success of that government program. When the price of oil dropped in the 1980s, many geothermal operators left the industry, and with the dramatic decrease in activity, many of the service companies went by the wayside also. By and large, the domestic geothermal industry today is emaciated. As a result of the capital intensive nature of geothermal development, the historical long lead times to go from exploration to production, the highly entrepreneurial nature of the industry, and the lack of an economic market, virtually no new exploration has been conducted in the U.S. in about 10 years. The consequence of this lack of activity is an almost nonexistent geothermal reserve base, outside of known producing fields and their immediate surrounds. The U.S. DOE Deep Thermal Gradient Drilling Program in the Cascade Range is a notable exception to this stagnant condition. Like its predecessor, the Industry Coupled Program, the Thermal Gradient Drilling Program identified at least one potentially viable geothermal resource: Newberry Volcano.
Robey, B
1985-11-01
Demographers from the East-West Population Institute (EWPI) and the China State Family Planning Commission, jointly analyzing data from computer tapes of China's 1982 national fertility survey, have produced new evidence of the extent to which families in China prefer male children. Evidence exists in almost every part of China that couples prefer sons to daughters, according to researchers Fred Arnold and Liu Zhaoxiang. Only Beijing and Shanghai are exceptions to this pattern. The persistence of such attitudes in China demonstrates the difficulty of overcoming deeply rooted Confucian traditions. Following the Chinese revolution, the government guaranteed sexual equality in political, economic, and cultural life, but patriarchal attitudes still prevail, particularly in the countryside. Historically, couples have favored sons for a variety of reasons, including to continue the family name, provide security for the parents' old age, add to the family labor force, and perform ancestral rites. Believing that such attitudes block successful implementation of China's 1-child family policy, the government has launched a campaign to try to change them. At the time of the 1982 fertility survey, China's 1-child certificate program had been in effect for 3 years. The program provides incentives such as monetary bonuses, preferential housing allocation, and special consideration for the child in education and job assignments to couples who agree not to have a 2nd child. According to the survey, 37% of all 1-child couples had accepted the 1-child certificate. Significantly, 60% of all 1-child certificate holders have a son. Of couples whose 1st child was a boy, 40% obtained the 1-child certificate, versus only 34% of those whose 1st child was a girl. Despite penalties for renouncing the 1-child certificate, about 1 out of every 10 mothers in the program had given birth to a 2nd child by the time of the survey. The 1st child of these mothers was twice as likely to be a girl as to be a boy. Son preference is strongest in towns and rural farm villages and weakest in the cities, further evidence that traditional attitudes have a strong impact on the preference for sons. Son preference is also weaker among the more educated. Statistics on contraceptive use and abortion in China also suggest a preference for sons, with 69% of 1-male-child couples using some form of contraception, compared to 63% of 1-female-child couples. Other studies reported at the Beijing symposium, using different measures, confirm Arnold's and Liu's 1982 findings. The 1982 fertility survey shows that while son preference still exists in China today, it is less likely than one might expect to be a major barrier to the success of the family planning program.
McLeod, Deborah; Stephen, Joanne
2015-01-01
Development of psychological interventions delivered via the Internet is a rapidly growing field with the potential to make vital services more accessible. However, there is a corresponding need for careful examination of factors that contribute to the effectiveness of Internet-delivered interventions, especially given the observed high dropout rates relative to traditional in-person (IP) interventions. Research has found that the involvement of an online therapist in a Web-based intervention reduces treatment dropout. However, the role of such online therapists is seldom well articulated and varies considerably across programs, making it difficult to discern processes that are important for online therapist involvement. In this paper, we introduce the concept of “therapeutic facilitation” to describe the role of the online therapist that was developed and further refined in the context of a Web-based, asynchronous psychosocial intervention for couples affected by breast cancer called Couplelinks. Couplelinks is structured into 6 dyadic learning modules designed to be completed on a weekly basis in consultation with a facilitator through regular, asynchronous, online text-based communication. Principles of therapeutic facilitation derived from a combination of theory underlying the intervention and pilot-testing of the first iteration of the program are described. Case examples to illustrate these principles as well as commonly encountered challenges to online facilitation are presented. Guidelines and principles for therapeutic facilitation hold relevance for professionally delivered online programs more broadly, beyond interventions for couples and cancer.
HIV status and gender: a brief report from heterosexual couples in Thailand.
Li, Li; Liang, Li-Jung; Lee, Sung-Jae; Farmer, Shu C
2012-01-01
Although the impact of HIV falls on both partners of a married couple, the burden of stress may not be necessarily shared evenly. The researchers in this study examined the relations among HIV status, gender, and depressive symptoms among 152 married or cohabitating couples living with HIV in the northern and northeastern regions of Thailand. Depressive symptoms were assessed using a 15-item depressive symptom screening test that was developed and used previously in Thailand. Among the 152 couples, 59% were couples in which both members were people living with HIV (seroconcordant; both people living with HIV couples), 28% had only female members with HIV (serodiscordant; females living with HIV couples), and 13% had only male members with HIV (serodiscordant; males living with HIV couples). The prevalence of depressive symptoms between seroconcordant and serodiscordant groups was similar. However, females living with HIV reported significantly higher levels of depressive symptoms, regardless of their partners' HIV status. Future prevention programs focusing on serodiscordant couples should be planned to target HIV risk, as well as emphasis on mental health, with a particular focus on women's increased susceptibility to negative mental health outcomes.
Summary of AH-1G flight vibration data for validation of coupled rotor-fuselage analyses
NASA Technical Reports Server (NTRS)
Dompka, R. V.; Cronkhite, J. D.
1986-01-01
Under a NASA research program designated DAMVIBS (Design Analysis Methods for VIBrationS), four U.S. helicopter industry participants (Bell Helicopter, Boeing Vertol, McDonnell Douglas Helicopter, and Sikorsky Aircraft) are to apply existing analytical methods for calculating coupled rotor-fuselage vibrations of the AH-1G helicopter for correlation with flight test data from an AH-1G Operational Load Survey (OLS) test program. Bell Helicopter, as the manufacturer of the AH-1G, was asked to provide pertinent rotor data and to collect the OLS flight vibration data needed to perform the correlations. The analytical representation of the fuselage structure is based on a NASTRAN finite element model (FEM) developed by Bell which has been extensively documented and correlated with ground vibration tests. The AH-1G FEM was provided to each of the participants for use in their coupled rotor-fuselage analyses. This report describes the AH-1G OLS flight test program and provides the flight conditions and measured vibration data to be used by each participant in their correlation effort. In addition, the mechanical, structural, inertial and aerodynamic data for the AH-1G two-bladed teetering main rotor system are presented. Furthermore, modifications to the NASTRAN FEM of the fuselage structure that are necessary to make it compatible with the OLS test article are described. The AH-1G OLS flight test data were found to be well documented and provide a sound basis for evaluating currently existing analysis methods used for calculation of coupled rotor-fuselage vibrations.
ARO - Terrestrial Research Program, Methodologies and Protocols for Characterization of Geomaterials
2015-05-14
Excerpts: analysis of ice involves melting, digestion, and analysis using inductively coupled plasma mass spectrometry (ICP-MS); knowledge of the chemical composition of the mineral assemblage present in a rock is critical; characterization techniques range from instrumental neutron activation analysis (INAA) to inductively coupled plasma analysis and mass spectrometry (ICP & ICP-MS), mass spectrometry (MS), and laser ablation.
ERIC Educational Resources Information Center
Bhattacharya, Gauri
2004-01-01
This article examines sociocultural expectations of sexual behavior and the reasons why not using condoms may be logical to married heterosexual couples in India. Married women who report monogamous sexual relationships with their husbands are a high-risk group for HIV infection in India. Based on the public health model and a population-based…
Hamilton, Alison B; Mittman, Brian S; Williams, John K; Liu, Honghu H; Eccles, Alicia M; Hutchinson, Craig S; Wyatt, Gail E
2014-06-20
The HIV/AIDS epidemic continues to disproportionately affect African American communities in the US, particularly those located in urban areas. Despite the fact that HIV is often transmitted from one sexual partner to another, most HIV prevention interventions have focused only on individuals, rather than couples. This five-year study investigates community-based implementation, effectiveness, and sustainability of 'Eban II,' an evidence-based risk reduction intervention for African-American heterosexual, serodiscordant couples. This hybrid implementation/effectiveness implementation study is guided by organizational change theory as conceptualized in the Texas Christian University Program Change Model (PCM), a model of phased organizational change from exposure to adoption, implementation, and sustainability. The primary implementation aims are to assist 10 community-based organizations (CBOs) to implement and sustain Eban II; specifically, to partner with CBOs to expose providers to the intervention; facilitate its adoption, implementation and sustainment; and to evaluate processes and determinants of implementation, effectiveness, fidelity, and sustainment. The primary effectiveness aim is to evaluate the effect of Eban II on participant (n = 200 couples) outcomes, specifically incidents of protected sex and proportion of condom use. We will also determine the cost-effectiveness of implementation, as measured by implementation costs and potential cost savings. A mixed methods evaluation will examine implementation at the agency level; staff members from the CBOs will complete baseline measures of organizational context and climate, while key stakeholders will be interviewed periodically throughout implementation. Effectiveness of Eban II will be assessed using a randomized delayed enrollment (waitlist) control design to evaluate the impact of treatment on outcomes at posttest and three-month follow-up. Multi-level hierarchical modeling with a multi-level nested structure will be used to evaluate the effects of agency- and couples-level characteristics on couples-level outcomes (e.g., condom use). This study will produce important information regarding the value of the Eban II program and a theory-guided implementation process and tools designed for use in implementing Eban II and other evidence-based programs in demographically diverse, resource-constrained treatment settings. NCT00644163.
Prevention and control of Hb Bart's disease in Guangxi Zhuang Autonomous Region, China.
He, Sheng; Zhang, Qiang; Li, Dongming; Chen, Shaoke; Tang, Yanqing; Chen, Qiuli; Zheng, Chenguang
2014-07-01
To demonstrate the performance of Hb Bart's disease prevention in Guangxi Zhuang Autonomous Region, China, a prenatal control program for Hb Bart's disease was conducted from January 2006 to December 2012. A total of 17,555 pregnant women were screened for α-thalassemia in our prenatal screening program. Pregnancies at risk for Hb Bart's disease were offered the choice of direct invasive testing or a non-invasive approach with serial ultrasonography. A total of 1425 at-risk couples attended the prenatal diagnosis. Three hundred ninety couples were screened at our own hospital, and the remaining 1035 couples were referred from other hospitals. Two hundred and three pregnant women chose the non-invasive approach, and 1122 chose invasive testing. A total of 365 fetuses were diagnosed with Hb Bart's disease. All cases were finally confirmed by fetal DNA analysis. Eighty-two cases (22.4%) were diagnosed by chorionic villus sampling and 194 (53.2%) by amniocentesis samples. The other 89 (24.4%) cases were diagnosed by cordocentesis. All of the affected pregnancies were terminated. Implementation of a prevention and control program, together with a referral system for prenatal diagnosis, is technically feasible in Guangxi Zhuang Autonomous Region, China.
ERIC Educational Resources Information Center
Stassun, Keivan Guadalupe; Burger, Arnold; Lange, Sheila Edwards
2010-01-01
We describe the Fisk-Vanderbilt Masters-to-PhD Bridge program as a successful model for effective partnerships with minority-serving institutions toward significantly broadening the participation of underrepresented groups in the physical sciences. The program couples targeted recruitment with active retention strategies, and is built upon a…
Evaluating employee assistance programs. A review of methods, outcomes, and future directions.
Jerrell, J M; Rightmyer, J F
1982-01-01
Renewed interest in assisting troubled employees has led to an upsurge in the development of employee assistance programs, coupled with demands for demonstrable effectiveness. This review examines the nature and scope of these programs, their administrative and methodological context, and the types and outcomes of evaluation studies conducted thus far. Proposals for improving future investigations through a number of different approaches and strategies are then made.
Simmons, Janie
2006-01-01
Background: The drug treatment field tends to place emphasis on the individual rather than the individual in social context. While there are a growing number of studies indicating that drug-using intimate partners are likely to play an important role in determining treatment options, little attention has been given to the experience and complex treatment needs of illicit drug-using (heroin, cocaine, crack) couples. Methods: This exploratory study used in-depth interviews and ethnographic engagement to better understand the relationship between interpersonal dynamics and the treatment experience of ten relatively stable drug-using couples in Hartford, CT. Semi-structured and open-ended qualitative interviews were conducted with each couple and separately with each partner. Whenever possible, the day-to-day realities and contexts of risk were also observed via participant and non-participant observation of these couples in the community. A grounded theory approach was used to inductively code and analyze nearly 40 transcripts of 60–90 minute interviews as well as fieldnotes. Results: This study builds on a concept of complex interpersonal dynamics among drug users. Interpersonal dynamics of care and collusion were identified: couples cared for each other and colluded to acquire and use drugs. Care and collusion operate at the micro level of the risk environment. Treatment barriers and inadequacies were identified as part of the risk environment at the meso or intermediate level of analysis, and larger social forces such as gender dynamics, poverty and the "War on Drugs" were identified at the macro level. Interpersonal dynamics posed problems for couples when one or both partners were interested in accessing treatment. Structural barriers presented additional obstacles with the denial of admittance of both partners to treatment programs which had a sole focus on the individual and avoided treating couples. Conclusion: Detoxification and treatment facilities need to recognize the complex interplay between interpersonal dynamics which shape the treatment experience of couples, and which are also shaped by larger structural dynamics, including barriers in the treatment system. Improvements to the treatment system in general will go a long way in improving treatment for couples. Couples-specific programming also needs to be developed.
Prediction of jump phenomena in roll-coupled maneuvers of airplanes
NASA Technical Reports Server (NTRS)
Schy, A. A.; Hannah, M. E.
1976-01-01
An easily computerized analytical method is developed for identifying critical airplane maneuvers in which nonlinear rotational coupling effects may cause sudden jumps in the response to pilot's control inputs. Fifth and ninth degree polynomials for predicting multiple pseudo-steady states of roll-coupled maneuvers are derived. The program calculates the pseudo-steady solutions and their stability. The occurrence of jump-like responses for several airplanes and a variety of maneuvers is shown to correlate well with the appearance of multiple stable solutions for critical control combinations. The analysis is extended to include aerodynamics nonlinear in angle of attack.
Daryasafar, Navid; Baghbani, Somaye; Moghaddasi, Mohammad Naser; Sadeghzade, Ramezanali
2014-01-01
We intend to design a broadband band-pass filter with a notch band that uses coupled transmission lines in its structure, based on new models of coupled transmission lines. To realize and present the new model, previous models are first simulated in the ADS program. Then, as their equations, and consequently the basic parameters of these models, are changed, the optimization of and dependency among these parameters, as well as their frequency response, are examined, and the results of these changes are brought together in the design of a new filter.
Huang, Hung-Yu; Chou, Pai-Chien; Joa, Wen-Ching; Chen, Li-Fei; Sheng, Te-Fang; Lin, Horng-Chyuan; Yang, Lan-Yan; Pan, Yu-Bin; Chung, Fu-Tsai; Wang, Chun-Hua; Kuo, Han-Pin
2016-10-01
Pulmonary rehabilitation (PR) brings benefits to patients with chronic obstructive pulmonary disease (COPD). Negative pressure ventilation (NPV) increases ventilation and decreases hyperinflation as well as breathing work in COPD. We evaluated the long-term effects of a hospital-based PR program coupled with NPV support in patients with COPD on clinical outcomes.One hundred twenty-nine patients with COPD were followed up for more than 5 years, with the NPV group (n = 63) receiving the support of NPV (20-30 cm H2O delivery pressure for 60 min) and unsupervised home exercise program of 20 to 30 min daily walk, while the control group (n = 6) only received unsupervised home exercise program. Pulmonary function tests and 6 min walk tests (6MWT) were performed every 3 to 6 months. Emergency room (ER) visits and hospitalization with medical costs were recorded.A significant time-by-group interaction in the yearly decline of forced expiratory volume in 1 s in the control group analyzed by mixed-model repeated-measure analysis was found (P = 0.048). The 6MWT distance of the NPV group was significantly increased during the first 4 years, with the interaction of time and group (P = 0.003), the time alone (P = 0.014), and the quadratic time (P < 0.001) being significant between the 2 groups. ER exacerbations and hospitalizations decreased by 66% (P < 0.0001) and 54% (P < 0.0001) in the NPV group, respectively. Patients on PR program coupled with NPV had a significant reduction of annual medical costs (P = 0.022).Our hospital-based multidisciplinary PR coupled with NPV reduced yearly decline of lung function, exacerbations, and hospitalization rates, and improved walking distance and medical costs in patients with COPD during a 5-year observation.
The Economic Foundations of Cohabiting Couples' Union Transitions.
Ishizuka, Patrick
2018-04-01
In recent decades, cohabitation has become an increasingly important relationship context for U.S. adults and their children, a union status characterized by high levels of instability. To understand why some cohabiting couples marry but others separate, researchers have drawn on theories emphasizing the benefits of specialization, the persistence of the male breadwinner norm, low income as a source of stress and conflict, and rising economic standards associated with marriage (the marriage bar). Because of conflicting evidence and data constraints, however, important theoretical questions remain. This study uses survival analysis with prospective monthly data from nationally representative panels of the Survey of Income and Program Participation from 1996-2013 to test alternative theories of how money and work affect whether cohabiting couples marry or separate. Analyses indicate that the economic foundations of cohabiting couples' union transitions do not lie in economic specialization or only men's ability to be good providers. Instead, results for marriage support marriage bar theory: adjusting for couples' absolute earnings, increases in wealth and couples' earnings relative to a standard associated with marriage strongly predict marriage. For dissolution, couples with higher and more equal earnings are significantly less likely to separate. Findings demonstrate that within-couple earnings equality promotes stability, and between-couple inequalities in economic resources are critical in producing inequalities in couples' relationship outcomes.
"The best is always yet to come": Relationship stages and processes among young LGBT couples.
Macapagal, Kathryn; Greene, George J; Rivera, Zenaida; Mustanski, Brian
2015-06-01
Limited research has examined relationship development among lesbian, gay, bisexual, and transgender (LGBT) couples in emerging adulthood. A better understanding of LGBT couples can inform the development of relationship education programs that reflect their unique needs. The following questions guided this study: (a) What are the stages and processes during young LGBT couples' relationship development? and (b) How do these compare with existing literature on heterosexual adults? A secondary goal was to explore similarities and differences between couples assigned male (MAAB) and female at birth (FAAB). Thirty-six couples completed interviews on their relationship history. Qualitative analyses showed that relationship stages and processes were similar to past research on heterosexuals, but participants' subjective experiences reflected their LGBT identities and emerging adulthood, which exerted additional stress on the relationship. These factors also affected milestones indicative of commitment among heterosexual adults (e.g., introducing partner to family). Mixed methods analyses indicated that MAAB couples described negotiating relationship agreements and safe sex in more depth than FAAB couples. Relationship development models warrant modifications to consider the impact of sexual and gender identity and emerging adulthood when applied to young LGBT couples. These factors should be addressed in interventions to promote relationship health among young LGBT couples. (c) 2015 APA, all rights reserved.
NASA Astrophysics Data System (ADS)
Oh, Kwang Jin; Kang, Ji Hoon; Myung, Hun Joo
2012-02-01
We have revised a general purpose parallel molecular dynamics simulation program mm_par using object-oriented programming. We parallelized the revised version using a hierarchical scheme in order to utilize more processors for a given system size. The benchmark result will be presented here. New version program summary. Program title: mm_par2.0 Catalogue identifier: ADXP_v2_0 Program summary URL: http://cpc.cs.qub.ac.uk/summaries/ADXP_v2_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: Standard CPC license, http://cpc.cs.qub.ac.uk/licence/licence.html No. of lines in distributed program, including test data, etc.: 2 390 858 No. of bytes in distributed program, including test data, etc.: 25 068 310 Distribution format: tar.gz Programming language: C++ Computer: Any system operated by Linux or Unix Operating system: Linux Classification: 7.7 External routines: We provide wrappers for FFTW [1], Intel MKL library [2] FFT routine, and Numerical recipes [3] FFT, random number generator, and eigenvalue solver routines, SPRNG [4] random number generator, Mersenne Twister [5] random number generator, space filling curve routine. Catalogue identifier of previous version: ADXP_v1_0 Journal reference of previous version: Comput. Phys. Comm. 174 (2006) 560 Does the new version supersede the previous version?: Yes Nature of problem: Structural, thermodynamic, and dynamical properties of fluids and solids from microscopic scales to mesoscopic scales. Solution method: Molecular dynamics simulation in NVE, NVT, and NPT ensemble, Langevin dynamics simulation, dissipative particle dynamics simulation. Reasons for new version: First, object-oriented programming has been used, which is known to be open for extension and closed for modification. It is also known to be better for maintenance. Second, version 1.0 was based on atom decomposition and domain decomposition schemes [6] for parallelization. However, atom decomposition is not popular due to its poor scalability. On the other hand, the domain decomposition scheme is better for scalability. It still has a limitation in utilizing a large number of cores on recent petascale computers due to the requirement that the domain size is larger than the potential cutoff distance. To go beyond such a limitation, a hierarchical parallelization scheme has been adopted in this new version and implemented using MPI [7] and OPENMP [8]. Summary of revisions: (1) Object-oriented programming has been used. (2) A hierarchical parallelization scheme has been adopted. (3) The SPME routine has been fully parallelized with parallel 3D FFT using a volumetric decomposition scheme [9]. K.J.O. thanks Mr. Seung Min Lee for useful discussion on programming and debugging. Running time: Running time depends on system size and methods used. For a test system containing a protein (PDB id: 5DHFR) with the CHARMM22 force field [10] and 7023 TIP3P [11] waters in a simulation box of dimensions 62.23 Å×62.23 Å×62.23 Å, the benchmark results are given in Fig. 1. Here the potential cutoff distance was set to 12 Å and the switching function was applied from 10 Å for the force calculation in real space. For the SPME [12] calculation, Kx, Ky, and Kz were set to 64 and the interpolation order was set to 4. To do the fast Fourier transform, we used the Intel MKL library. All bonds including hydrogen atoms were constrained using the SHAKE/RATTLE algorithms [13,14]. The code was compiled using Intel compiler version 11.1 and mvapich2 version 1.5. Fig. 2 shows performance gains from using a CUDA-enabled version [15] of mm_par for the 5DHFR simulation in water on an Intel Core2Quad 2.83 GHz and a GeForce GTX 580. Even though mm_par2.0 is not yet ported to GPU, these performance data give an indication of the performance mm_par2.0 might achieve on a GPU. (Figure captions: Fig. 1, timing results for 1000 MD steps, where 1, 2, 4, and 8 denote the number of OPENMP threads; Fig. 2, timing results for 1000 MD steps from double-precision simulation on CPU, single-precision simulation on GPU, and double-precision simulation on GPU.)
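The hierarchical MPI-plus-OpenMP pattern described in the summary above can be illustrated with a minimal sketch: one MPI rank per coarse block of atoms (standing in for a spatial domain), with OpenMP threads sharing the force work inside each block. This is not mm_par2.0 code; the toy pair force, the block decomposition, and all names are invented for illustration only.

```cpp
// Minimal hierarchical MPI + OpenMP force loop (illustrative sketch, not mm_par2.0).
#include <mpi.h>
#include <omp.h>
#include <algorithm>
#include <cstdio>
#include <vector>

int main(int argc, char** argv) {
    MPI_Init(&argc, &argv);
    int rank = 0, nprocs = 1;
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &nprocs);

    const int natoms = 4096;                         // toy system size
    std::vector<double> x(natoms), f(natoms, 0.0);
    for (int i = 0; i < natoms; ++i) x[i] = 0.01 * i;

    // Coarse level: each MPI rank owns a contiguous block of atoms.
    const int chunk = (natoms + nprocs - 1) / nprocs;
    const int lo = rank * chunk;
    const int hi = std::min(natoms, lo + chunk);

    // Fine level: OpenMP threads share the block's force work.
    #pragma omp parallel for schedule(dynamic)
    for (int i = lo; i < hi; ++i) {
        double fi = 0.0;
        for (int j = 0; j < natoms; ++j) {           // all-pairs toy interaction
            if (j == i) continue;
            double dx = x[i] - x[j];
            fi += dx / (1.0 + dx * dx);              // arbitrary smooth pair force
        }
        f[i] = fi;
    }

    // Combine per-rank results (here just a squared norm, for brevity).
    double local = 0.0, total = 0.0;
    for (int i = lo; i < hi; ++i) local += f[i] * f[i];
    MPI_Reduce(&local, &total, 1, MPI_DOUBLE, MPI_SUM, 0, MPI_COMM_WORLD);
    if (rank == 0) std::printf("||f||^2 = %g\n", total);

    MPI_Finalize();
    return 0;
}
```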
Mmeje, Okeoma; Njoroge, Betty; Akama, Eliud; Leddy, Anna; Breitnauer, Brooke; Darbes, Lynae; Brown, Joelle
2016-01-01
Reproduction is important to many HIV-affected individuals and couples and healthcare providers (HCPs) are responsible for providing resources to help them safely conceive while minimizing the risk of sexual and perinatal HIV transmission. In order to fulfill their reproductive goals, HIV-affected individuals and their partners need access to information regarding safer methods of conception. The objective of this qualitative study was to develop a Safer Conception Counseling Toolkit that can be used to train HCPs and counsel HIV-affected individuals and couples in HIV care and treatment clinics in Kenya. We conducted a two-phased qualitative study among HCPs and HIV-affected individuals and couples from eight HIV care and treatment sites in Kisumu, Kenya. We conducted in-depth interviews (IDIs) and focus group discussions (FGDs) to assess the perspectives of HCPs and HIV-affected individuals and couples in order to develop and refine the content of the Toolkit. Subsequently, IDIs were conducted among HCPs who were trained using the Toolkit and FGDs among HIV-affected individuals and couples who were counseled with the Toolkit. HIV-related stigma, fears, and recommendations for delivery of safer conception counseling were assessed during the discussions. One hundred and six individuals participated in FGDs and IDIs; 29 HCPs, 49 HIV-affected women and men, and 14 HIV-serodiscordant couples. Participants indicated that a safer conception counseling and training program for HCPs is needed and that routine provision of safer conception counseling may promote maternal and child health by enhancing reproductive autonomy among HIV-affected couples. They also reported that the Toolkit may help dispel the stigma and fears associated with reproduction in HIV-affected couples, while supporting them in achieving their reproductive goals. Additional research is needed to evaluate the Safer Conception Toolkit in order to support its implementation and use in HIV care and treatment programs in Kenya and other HIV endemic regions of sub-Saharan Africa.
Bad Questions: An Essay Involving Item Response Theory
ERIC Educational Resources Information Center
Thissen, David
2016-01-01
David Thissen, a professor in the Department of Psychology and Neuroscience, Quantitative Program at the University of North Carolina, has consulted and served on technical advisory committees for assessment programs that use item response theory (IRT) over the past couple decades. He has come to the conclusion that there are usually two purposes…
Breaking from Traditionalism: Strategies for the Recruitment of Physical Education Teachers
ERIC Educational Resources Information Center
O'Neil, Kason; Richards, K. Andrew R.
2018-01-01
Teacher education programs across the country are being asked to systematically and deliberately recruit teacher candidates who are not only highly qualified, but represent diverse backgrounds. Coupled with dwindling enrollments, these programs may want to reevaluate the types of students recruited into a career in physical education. This article…
Neuro-Linguistic Programming in Couple Therapy.
ERIC Educational Resources Information Center
Forman, Bruce D.
Neuro-Linguistic Programming (NLP) is a method of understanding the organization of subjective human experience. The NLP model provides a theoretical framework for directing or guiding therapeutic change. According to NLP, people experience the so-called real world indirectly and operate on the real world as if it were like the model of it they…
IOWA STATE MANPOWER DEVELOPMENT COUNCIL. TWELFTH PROGRESS REPORT, MARCH 1, APRIL 30, 1967.
ERIC Educational Resources Information Center
Office of Manpower Policy, Evaluation, and Research (DOL), Washington, DC.
THE COUNCIL NEGOTIATED SEVERAL ON-THE-JOB TRAINING SUBCONTRACTS FOR PLACEMENTS, INITIATED 4 COUPLED ON-THE-JOB TRAINING-VOCATIONAL EDUCATION PROGRAMS, ACQUIRED ADDITIONAL FUNDS TO ENLARGE BOTH PROJECTS, AND WAS INVOLVED IN A CONTROVERSY OVER THE TRAINING OF BRICKLAYERS UNDER A MANPOWER DEVELOPMENT TRAINING ACT-APPROVED PROGRAM. REPRODUCTIONS OF…
Beginning Teacher Induction: What the Data Tell Us
ERIC Educational Resources Information Center
Ingersoll, Richard M.
2012-01-01
Induction support programs for beginning teachers are an education reform whose time has come. The national data indicate that over the past couple of decades the number of beginning teachers has ballooned in the U.S. Simultaneously, there has been a large increase in the number of states, districts, and schools offering induction programs.…
Early Childhood Program Evaluations: A Decision-Maker's Guide
ERIC Educational Resources Information Center
National Forum on Early Childhood Program Evaluation, 2007
2007-01-01
Increasing demands for evidence-based early childhood services and the need by policymakers to know whether a program is effective or whether it warrants a significant investment of public and/or private funds--coupled with the often-politicized debate around these topics--make it imperative for policymakers and civic leaders to have independent…
Landscape analysis software tools
Don Vandendriesche
2008-01-01
Recently, several new computer programs have been developed to assist in landscape analysis. The "Sequential Processing Routine for Arraying Yields" (SPRAY) program was designed to run a group of stands with particular treatment activities to produce vegetation yield profiles for forest planning. SPRAY uses existing Forest Vegetation Simulator (FVS) software coupled...
Human Sexuality Instruction: Implications for Couple and Family Counselor Educators.
ERIC Educational Resources Information Center
Gray, Lizbeth A.; House, Reese M.; Eicken, Sigrid
1996-01-01
Reports the results of a sexual curricula questionnaire sent to all United States counselor education programs (N=506). Data based on 243 responses indicate that educators believe that there is a need for sexual curricula in counselor education programs. However, many educators are not systematically including such information in their training.…
Exploratory technology research program for electrochemical energy storage, annual report for 1997
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kinoshita, K.
The US Department of Energy's (DOE) Office of Transportation Technologies provides support for an Electrochemical Energy Storage Program that includes research and development on advanced rechargeable batteries. A major goal of this program is to develop electrochemical power sources suitable for application in electric vehicles (EVs) and hybrid systems. The program centers on advanced electrochemical systems that offer the potential for high performance and low life-cycle costs, both of which are necessary to permit significant penetration into commercial markets. The DOE Electric Vehicle Technology Program is divided into two project areas: the US Advanced Battery Consortium (USABC) and Advanced Battery R and D, which includes the Exploratory Technology Research (ETR) Program managed by the Lawrence Berkeley National Laboratory (LBNL). The specific goal of the ETR Program is to identify the most promising electrochemical technologies and transfer them to the USABC, the battery industry and/or other Government agencies for further development and scale-up. This report summarizes the research, financial and management activities relevant to the ETR Program in CY 1997. This is a continuing program, and reports for prior years have been published; they are listed at the end of this Executive Summary. The general R and D areas addressed by the program include identification of new electrochemical couples for advanced batteries, determination of technical feasibility of the new couples, improvements in battery components and materials, and establishment of engineering principles applicable to electrochemical energy storage. Major emphasis is given to applied research which will lead to superior performance and lower life-cycle costs.
QCDNUM: Fast QCD evolution and convolution
NASA Astrophysics Data System (ADS)
Botje, M.
2011-02-01
The QCDNUM program numerically solves the evolution equations for parton densities and fragmentation functions in perturbative QCD. Un-polarised parton densities can be evolved up to next-to-next-to-leading order in powers of the strong coupling constant, while polarised densities or fragmentation functions can be evolved up to next-to-leading order. Other types of evolution can be accessed by feeding alternative sets of evolution kernels into the program. A versatile convolution engine provides tools to compute parton luminosities, cross-sections in hadron-hadron scattering, and deep inelastic structure functions in the zero-mass scheme or in generalised mass schemes. Input to these calculations are either the QCDNUM evolved densities, or those read in from an external parton density repository. Included in the software distribution are packages to calculate zero-mass structure functions in un-polarised deep inelastic scattering, and heavy flavour contributions to these structure functions in the fixed flavour number scheme. Program summary. Program title: QCDNUM, version 17.00 Catalogue identifier: AEHV_v1_0 Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEHV_v1_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: GNU Public Licence No. of lines in distributed program, including test data, etc.: 45 736 No. of bytes in distributed program, including test data, etc.: 911 569 Distribution format: tar.gz Programming language: Fortran-77 Computer: All Operating system: All RAM: Typically 3 Mbytes Classification: 11.5 Nature of problem: Evolution of the strong coupling constant and parton densities, up to next-to-next-to-leading order in perturbative QCD. Computation of observable quantities by Mellin convolution of the evolved densities with partonic cross-sections. Solution method: Parametrisation of the parton densities as linear or quadratic splines on a discrete grid, and evolution of the spline coefficients by solving (coupled) triangular matrix equations with a forward substitution algorithm. Fast computation of convolution integrals as weighted sums of spline coefficients, with weights derived from user-given convolution kernels. Restrictions: Accuracy and speed are determined by the density of the evolution grid. Running time: Less than 10 ms on a 2 GHz Intel Core 2 Duo processor to evolve the gluon density and 12 quark densities at next-to-next-to-leading order over a large kinematic range.
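The "forward substitution" named in the solution method is the standard solve of a lower-triangular linear system; a minimal, generic sketch is shown below. It is illustrative only: QCDNUM itself is Fortran-77 and evolves spline coefficients through much larger coupled triangular systems, none of which is reproduced here.

```cpp
// Generic forward substitution for a lower-triangular system L y = b.
#include <cstdio>
#include <vector>

std::vector<double> forward_substitute(const std::vector<std::vector<double>>& L,
                                       const std::vector<double>& b) {
    const std::size_t n = b.size();
    std::vector<double> y(n, 0.0);
    for (std::size_t i = 0; i < n; ++i) {
        double s = b[i];
        for (std::size_t j = 0; j < i; ++j) s -= L[i][j] * y[j];  // subtract already-known terms
        y[i] = s / L[i][i];                                       // diagonal assumed nonzero
    }
    return y;
}

int main() {
    // Small 3x3 example; the solution can be checked by hand.
    std::vector<std::vector<double>> L = {{2, 0, 0}, {1, 3, 0}, {4, 1, 5}};
    std::vector<double> b = {2, 5, 14};
    for (double v : forward_substitute(L, b)) std::printf("%g ", v);
    std::printf("\n");
    return 0;
}
```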
Tan, Tony Xing; Jordan-Arthur, Brittany; Garafano, Jeffrey S; Curran, Laura
2017-01-01
We investigated 109 (79.8% female; 76% White; 83.5% heterosexual) mental health trainees' explicit and implicit attitudes toward heterosexual, lesbian, and gay White couples adopting and raising Black children. To determine explicit attitudes, we used a vignette depicting a Black child ready for adoption and three types of equally qualified White families headed by a heterosexual couple, a gay couple, or a lesbian couple. The trainees were asked to indicate which type of family they preferred to adopt the child. To determine implicit attitudes, we used a computer-programmed, latency-based multifactor implicit association test (IAT) protocol. The IAT data were collected from each participant individually. Explicit data showed that over 80% of the participants indicated no strong preference as to which type of family should adopt the child. However, IAT data showed that the trainees implicitly preferred lesbian couples. Overall, the degree of congruence between explicit and implicit attitudes was very low. Implications for training were discussed.
Reproductive goals and family planning attitudes in Pakistan: a couple-level analysis.
Mahmood, N
1998-01-01
This paper examined reproductive goals and family planning attitudes at the couple level in Pakistan. Data were based on the responses of the 1260 matched couples in the 1990-91 Pakistan Demographic and Health Survey. The questions included in the interview covered desired fertility, family size ideals, son preference, and family planning attitudes. Findings of the analysis showed that about 60% of the couples gave similar responses (agreeing either positively or negatively) to several fertility-related questions, whereas the remaining 40% differed in their attitudes. This divergence may partly stem from environmental factors such as a spouse's rural background, lack of education, and minimal communication between spouses. The analysis also indicates that a couple's joint approval, discussion of family planning, and the husband's desire for no more children have the strongest effects on promoting contraceptive use. Thus, it is concluded that couple agreement plays an important role in promoting the use of family planning, and men should be made equal targets of such programs in Pakistan.
Sevinc, Gunes; Hölzel, Britta K; Hashmi, Javeria; Greenberg, Jonathan; McCallister, Adrienne; Treadway, Michael; Schneider, Marissa L; Dusek, Jeffery A; Carmody, James; Lazar, Sara W
2018-06-01
We investigated common and dissociable neural and psychological correlates of two widely used meditation-based stress reduction programs. Participants were randomized to the Relaxation Response (RR; n = 18; 56% female) or the Mindfulness-Based Stress Reduction (MBSR; n = 16; 56% female) programs. Both programs use a "bodyscan" meditation; however, the RR program explicitly emphasizes physical relaxation during this practice, whereas the MBSR program emphasizes mindful awareness with no explicit relaxation instructions. After the programs, neural activity during the respective meditation was investigated using functional magnetic resonance imaging. Both programs were associated with reduced stress (for RR, from 14.1 ± 6.6 to 11.3 ± 5.5 [Cohen's d = 0.50]; for MBSR, from 17.7 ± 5.7 to 11.9 ± 5.0 [Cohen's d = 1.02]). Conjunction analyses revealed functional coupling between ventromedial prefrontal regions and supplementary motor areas (p < .001). The disjunction analysis indicated that the RR bodyscan was associated with stronger functional connectivity of the right inferior frontal gyrus-an important hub of intentional inhibition and control-with supplementary motor areas (p < .001, family-wise error [FWE] rate corrected). The MBSR program was uniquely associated with improvements in self-compassion and rumination, and the within-group analysis of MBSR bodyscan revealed significant functional connectivity of the right anterior insula-an important hub of sensory awareness and salience-with pregenual anterior cingulate during bodyscan meditation compared with rest (p = .03, FWE corrected). The bodyscan exercises in each program were associated with both overlapping and differential functional coupling patterns, which were consistent with each program's theoretical foundation. These results may have implications for the differential effects of these programs for the treatment of diverse conditions.
Pension plan participation among married couples.
Dushi, Irena; Iams, Howard M
2013-01-01
We present descriptive statistics on pension participation and types of pensions among married couples, using data from the 1996/2008 Panels of the Survey of Income and Program Participation and Social Security administrative records. Previous research has focused on pension coverage by marital status, but has not examined couples as a unit. Because couples usually share income, viewing them as a unit provides a better picture of potential access to income from retirement plans. Our analysis compares 1998 and 2009 data because substantial changes occurred in the pension landscape over this decade that could have influenced the prevalence of different pension plans, although we observe modest changes in participation rates and types of plans over the period. We find that in 20 percent of couples, neither spouse participated in a pension plan; in 10 percent, the wife was the only participant; and in 37 percent, the husband was the only participant.
The purpose of this SOP is to detail the operation and maintenance of an Instruments SA, Inc. Jobin-Yvon Model 70 (JY-70) inductively coupled plasma atomic emission spectrometer (ICP-AES). This procedure was followed to ensure consistent data retrieval during the Arizona NHEXA...
New technologies for the detection of millimeter and submillimeter waves
NASA Technical Reports Server (NTRS)
Richards, P. L.; Clarke, J.; Gildemeister, J. M.; Lanting, T.; Lee, A. T.
2001-01-01
Voltage-biased superconducting bolometers have many operational advantages over conventional bolometer technology including sensitivity, linearity, speed, and immunity from environmental disturbance. A review is given of the Berkeley program for developing this new technology. Developments include fully lithographed individual bolometers in the spiderweb configuration, arrays of 1024 close-packed absorber-coupled bolometers, antenna-coupled bolometers, and a frequency-domain SQUID (superconducting quantum interference device) readout multiplexer.
Coupled structural/thermal/electromagnetic analysis/tailoring of graded composite structures
NASA Technical Reports Server (NTRS)
Mcknight, R. L.; Chen, P. C.; Dame, L. T.; Huang, H.
1992-01-01
Accomplishments are described for the first year effort of a 5-year program to develop a methodology for coupled structural/thermal/electromagnetic analysis/tailoring of graded composite structures. These accomplishments include: (1) the results of the selective literature survey; (2) 8-, 16-, and 20-noded isoparametric plate and shell elements; (3) large deformation structural analysis; (4) eigenanalysis; (5) anisotropic heat transfer analysis; and (6) anisotropic electromagnetic analysis.
NASA Technical Reports Server (NTRS)
Sutton, L. R.
1975-01-01
A theoretical analysis is developed for a coupled helicopter rotor system to allow determination of the loads and dynamic response behavior of helicopter rotor systems in both steady-state forward flight and maneuvers. The effects of an anisotropically supported swashplate or gyroscope control system and a deformed free wake on the rotor system dynamic response behavior are included.
NASA Astrophysics Data System (ADS)
Kawakami, Takashi; Sano, Shinsuke; Saito, Toru; Sharma, Sandeep; Shoji, Mitsuo; Yamada, Satoru; Takano, Yu; Yamanaka, Shusuke; Okumura, Mitsutaka; Nakajima, Takahito; Yamaguchi, Kizashi
2017-09-01
Theoretical examinations of the ferromagnetic coupling in the m-phenylene-bis-methylene molecule and its oligomer were carried out. These systems are good candidates among exchange-coupled systems for investigating strong electronic correlations. We studied effective exchange integrals (J), which indicate the magnetic coupling between interacting spins in these species. First, theoretical calculations based on a broken-symmetry single-reference (SR) procedure, i.e. the UHF, UMP2, UMP4, UCCSD(T) and UB3LYP methods, were carried out with the GAUSSIAN program code using an SR wave function. From these results, the J value from the UHF method was largely positive because of the strong ferromagnetic spin polarisation effect. The J values from the UCCSD(T) and UB3LYP methods reduced this overestimation by accounting for dynamical electronic correlation. Next, the magnetic coupling among these spins was studied using CAS-based, symmetry-adapted multireference methods. The UNO DMRG CASCI (UNO, unrestricted natural orbital; DMRG, density matrix renormalised group; CASCI, complete active space configuration interaction) method was mainly employed with a combination of the ORCA and BLOCK program codes. DMRG CASCI calculations with valence electron counting, which included all orbitals up to full valence CI, provided the most reliable results and support the use of the UB3LYP method for extended systems.
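For context, broken-symmetry energies of the kind produced by the single-reference calculations above are commonly converted into an effective exchange integral with Yamaguchi's approximate spin-projection formula (written here for the Heisenberg Hamiltonian $\hat H = -2J\,\hat S_1\cdot\hat S_2$). The abstract does not state which estimator was used, so the expression below is quoted only as the usual convention, not as this study's procedure:

$$ J = \frac{E_{\mathrm{BS}} - E_{\mathrm{HS}}}{\langle \hat S^{2} \rangle_{\mathrm{HS}} - \langle \hat S^{2} \rangle_{\mathrm{BS}}}, $$

where BS and HS denote the broken-symmetry (low-spin) and high-spin single-determinant solutions, respectively.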
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gleicher, Frederick N.; Williamson, Richard L.; Ortensi, Javier
The MOOSE neutron transport application RATTLESNAKE was coupled to the fuels performance application BISON to provide a higher fidelity tool for fuel performance simulation. This project is motivated by the desire to couple a high fidelity core analysis program (based on the self-adjoint angular flux equations) to a high fidelity fuel performance program, both of which can simulate on unstructured meshes. RATTLESNAKE solves the self-adjoint angular flux transport equation and provides a sub-pin level resolution of the multigroup neutron flux with resonance treatment during burnup or a fast transient. BISON solves the coupled thermomechanical equations for the fuel on a sub-millimeter scale. Both applications are able to solve their respective systems on aligned and unaligned unstructured finite element meshes. The power density and local burnup was transferred from RATTLESNAKE to BISON with the MOOSE Multiapp transfer system. Multiple depletion cases were run with one-way data transfer from RATTLESNAKE to BISON. The eigenvalues are shown to agree well with values obtained from the lattice physics code DRAGON. The one-way data transfer of power density is shown to agree with the power density obtained from an internal Lassman-style model in BISON.
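The one-way transfer described above (a transport solve producing a power-density field that is then mapped onto the fuels mesh before each fuel-performance step) follows the generic pattern sketched below. This is schematic C++, not MOOSE or MultiApp code; every function name, mesh size, and number is invented purely for illustration.

```cpp
// Schematic one-way coupling loop: neutronics -> field transfer -> fuel step.
#include <cstdio>
#include <vector>

// Toy "neutronics" result: power density sampled at n axial points (made-up values).
std::vector<double> solve_neutronics(int n) {
    std::vector<double> q(n);
    for (int i = 0; i < n; ++i) q[i] = 2.0e8 * (0.5 + 0.5 * i / double(n - 1)); // W/m^3
    return q;
}

// Toy "transfer": nearest-point mapping from the coarse neutronics mesh onto a
// finer fuels mesh (a stand-in for a real mesh-to-mesh transfer system).
std::vector<double> transfer(const std::vector<double>& src, int n_dst) {
    std::vector<double> dst(n_dst);
    for (int i = 0; i < n_dst; ++i)
        dst[i] = src[(i * (src.size() - 1)) / (n_dst - 1)];
    return dst;
}

int main() {
    const int n_depletion_steps = 3;
    for (int step = 0; step < n_depletion_steps; ++step) {
        std::vector<double> power  = solve_neutronics(10);  // coarse neutronics mesh
        std::vector<double> q_fuel = transfer(power, 40);   // finer fuels mesh
        // A real fuel-performance solve would consume q_fuel as its heat source;
        // here we only report the transferred peak value.
        std::printf("step %d: peak power density on fuels mesh = %.3g W/m^3\n",
                    step, q_fuel.back());
    }
    return 0;
}
```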
Rosenberg, Nora E; Graybill, Lauren A; Wesevich, Austin; McGrath, Nuala; Golin, Carol E; Maman, Suzanne; Bhushan, Nivedita; Tsidya, Mercy; Chimndozi, Limbikani; Hoffman, Irving F; Hosseinipour, Mina C; Miller, William C
2017-08-01
In sub-Saharan Africa couple HIV testing and counseling (CHTC) has been associated with substantial increases in safe sex, especially when at least one partner is HIV infected. However, this relationship has not been characterized in an Option B+ context. The study was conducted at the antenatal clinic at Bwaila District Hospital in Lilongwe, Malawi in 2016 under an Option B+ program. Ninety heterosexual couples with an HIV-infected pregnant woman (female-positive couples) and 47 couples with an HIV-uninfected pregnant woman (female-negative couples) were enrolled in an observational study. Each couple member was assessed immediately before and 1 month after CHTC for safe sex (abstinence or consistent condom use in the last month). Generalized estimating equations were used to model change in safe sex before and after CHTC and to compare safe sex between female-positive and female-negative couples. Mean age was 26 years among women and 32 years among men. Before CHTC, safe sex was comparable among female-positive couples (8%) and female-negative couples (2%) [risk ratio (RR): 3.7, 95% confidence interval (CI): 0.5 to 29.8]. One month after CHTC, safe sex was higher among female-positive couples (75%) than among female-negative couples (3%) (RR: 30.0, 95% CI: 4.3 to 207.7). Safe sex increased substantially after CHTC for female-positive couples (RR 9.6, 95% CI: 4.6 to 20.0), but not for female-negative couples (RR: 1.2, 95% CI: 0.1 to 18.7). Engaging pregnant couples in CHTC can have prevention benefits for couples with an HIV-infected pregnant woman, but additional prevention approaches may be needed for couples with an HIV-uninfected pregnant woman.
The Effects of Partnered Exercise on Physical Intimacy in Couples Coping with Prostate Cancer
Lyons, Karen S.; Winters-Stone, Kerri M.; Bennett, Jill A.; Beer, Tomasz M.
2015-01-01
Objective The study examined whether couples coping with prostate cancer participating in a partnered exercise program - Exercising Together (ET) - experienced higher levels of physical intimacy (i.e., affectionate & sexual behavior) than couples in a usual care (UC) control group. Method Men and their wives (n=64 couples) were randomly assigned to either the ET or UC group. Couples in the ET group engaged in partnered strength-training twice weekly for six months. Multilevel modeling was used to explore the effects of ET on husband and wife engagement in both affectionate and sexual behaviors over time. Results Controlling for relationship quality, wives in ET showed significant increases in engagement in affectionate behaviors compared to wives in UC. No intervention effects were found for husbands. Conclusion Couple-based approaches to physical intimacy, after a cancer diagnosis, that facilitate collaborative engagement in non-sexual physical activities for the couple have potential to be effective for wives. More research is needed in this area to determine couples most amenable to such exercise strategies, optimal timing in the cancer trajectory, and the benefits of combining partnered exercise with more traditional relationship-focused strategies. PMID:26462060
Roddy, McKenzie K; Georgia, Emily J; Doss, Brian D
2017-04-20
In-person conjoint treatments for relationship distress are effective at increasing relationship satisfaction, and newly developed online programs are showing promising results. However, couples reporting even low levels of intimate partner violence (IPV) are traditionally excluded from these interventions. To improve the availability of couple-based treatment for couples with IPV, the present study sought to determine whether associations with IPV found in community samples generalized to couples seeking help for their relationship and whether web-based interventions for relationship distress worked equally well for couples with IPV. In the first aim, in a sample of 2,797 individuals who were seeking online help for their relationship, the levels and correlates of both low-intensity and clinically significant IPV largely matched what is found in community samples. In the second aim, in a sample of 300 couples who were randomly assigned to a web-based intervention or a waitlist control group, low-impact IPV did not moderate the effects of the intervention for relationship distress. Therefore, web-based interventions may be an effective (and easily accessible) intervention for relationship distress for couples with low-intensity IPV. © 2017 Family Process Institute.
NASA Technical Reports Server (NTRS)
Sharber, J. R.; Frahm, R. A.; Scherrer, J. R.
1997-01-01
Under this grant two instruments, a soft particle spectrometer and a Langmuir probe, were refurbished and calibrated, and flown on three instrumented rocket payloads as part of the Magnetosphere/Thermosphere Coupling program. The flights took place at the Poker Flat Research Range on February 12, 1994 (T(sub o) = 1316:00 UT), February 2, 1995 (T(sub o) = 1527:20 UT), and November 27, 1995 (T(sub o) = 0807:24 UT). In this report the observations of the particle instrumentation flown on all three of the flights are described, and brief descriptions of relevant geophysical activity for each flight are provided. Calibrations of the particle instrumentation for all ARIA flights are also provided.
CRRES combined radiation and release effects satellite program
NASA Technical Reports Server (NTRS)
Giles, B. L. (Compiler); Mccook, M. A. (Compiler); Mccook, M. W. (Compiler); Miller, G. P. (Compiler)
1995-01-01
The various regions of the magnetosphere-ionosphere system are coupled by flows of charged particle beams and electromagnetic waves. This coupling gives rise to processes that affect both technical and non-technical aspects of life on Earth. The CRRES Program sponsored experiments which were designed to produce controlled and known input to the space environment and the effects were measured with arrays of diagnostic instruments. Large amounts of material were used to modify and perturb the environment in a controlled manner, and response to this was studied. The CRRES and PEGSAT satellites were dual-mission spacecraft with a NASA mission to perform active chemical-release experiments, grouped into categories of tracer, modification, and simulation experiments. Two sounding rocket chemical release campaigns completed the study.
NASA Technical Reports Server (NTRS)
Rowlette, J. J. (Inventor)
1985-01-01
A coulometer for accurately measuring the state-of-charge of an open-cell battery utilizing an aqueous electrolyte includes a current meter for measuring the battery charge/discharge current and a flow meter for measuring the rate at which the battery produces gas during charge and discharge. Coupled to the flow meter is a gas analyzer which measures the oxygen fraction of the battery gas. The outputs of the current meter, flow meter, and gas analyzer are coupled to a programmed microcomputer which includes a CPU and program and data memories. The microcomputer calculates the fraction of the charge and discharge current consumed in the generation of gas so that the actual state-of-charge can be determined. The state-of-charge is then shown on a visual display.
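The correction described above amounts to coulomb counting with the gas-generation current subtracted from the measured current before integration. The sketch below shows only that bookkeeping; gassing_current is a hypothetical helper using a simplified Faraday-law estimate, and neither it nor any of the numbers reproduces the patented calculation.

```cpp
// Coulomb counting with a gas-generation correction (illustrative sketch only).
#include <cstdio>

// Placeholder: estimate the current (A) consumed by electrolysis from the measured
// gas evolution rate (mol/s) and its oxygen fraction. This is a crude assumption:
// it averages the two Faraday-law estimates (4 e- per O2, 2 e- per H2).
double gassing_current(double gas_flow_mol_per_s, double o2_fraction) {
    const double F = 96485.0;                               // Faraday constant, C/mol
    double n_o2 = o2_fraction * gas_flow_mol_per_s;
    double n_h2 = (1.0 - o2_fraction) * gas_flow_mol_per_s;
    return F * (4.0 * n_o2 + 2.0 * n_h2) / 2.0;
}

int main() {
    double soc_coulombs = 0.5 * 3600.0 * 100.0;             // start at 50% of a 100 Ah cell
    const double dt = 1.0;                                  // 1 s sampling interval
    for (int k = 0; k < 3600; ++k) {                        // one hour of charging
        double i_meas   = 20.0;                             // measured charge current, A
        double gas_flow = 1.0e-6;                           // measured gas evolution, mol/s
        double o2_frac  = 0.33;                             // measured oxygen fraction
        double i_stored = i_meas - gassing_current(gas_flow, o2_frac);
        soc_coulombs += i_stored * dt;                      // only the stored charge counts
    }
    std::printf("state of charge: %.1f Ah\n", soc_coulombs / 3600.0);
    return 0;
}
```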
Lolekha, Rangsima; Kullerk, Nareeluck; Wolfe, Mitchell I; Klumthanom, Kanyarat; Singhagowin, Thapanaporn; Pattanasin, Sarika; Sombat, Potjaman; Naiwatanakul, Thananda; Leartvanangkul, Chailai; Voramongkol, Nipunporn
2014-12-24
Couples HIV testing and counseling (CHTC) at antenatal care (ANC) settings allows pregnant women to learn the HIV status of themselves and their partners. Couples can make decisions together to prevent HIV transmission. In Thailand, men were tested at ANC settings only if their pregnant partners were HIV positive. A CHTC program based in ANC settings was developed and implemented at 16 pilot hospitals in 7 provinces during 2009-2010. Cross-sectional data were collected using standard data collection forms from all pregnant women and accompanying partners who presented at first ANC visit at 16 hospitals. CHTC data for women and partners were analyzed to determine service uptake and HIV test results among couples. In-depth interviews were conducted among hospital staff of participating hospitals during field supervision visits to assess feasibility and acceptability of CHTC services. During October 2009-April 2010, 4,524 women initiating ANC were enrolled. Of these, 2,435 (54%) women came for ANC alone; 2,089 (46%) came with partners. Among men presenting with partners, 2,003 (96%) received couples counseling. Of these, 1,723 (86%) men and all pregnant women accepted HIV testing. Among 1,723 couples testing for HIV, 1,604 (93%) returned for test results. Of these, 1,567 (98%) were concordant negative, 6 (0.4%) were concordant positive and 17 (1%) were HIV discordant (7 male+/female- and 10 male-/female+). Nine of ten (90%) executive hospital staff reported high acceptability of CHTC services. CHTC implemented in ANC settings helps identify more HIV-positive men whose partners were negative than previous practice, with high acceptability among hospital staff.
Karita, Etienne; Nsanzimana, Sabin; Ndagije, Felix; Mukamuyango, Jeannine; Mugwaneza, Placidie; Remera, Eric; Raghunathan, Pratima L.; Bayingana, Roger; Kayitenkore, Kayitesi; Bekan-Homawoo, Brigitte; Tichacek, Amanda; Allen, Susan
2016-01-01
Background: Couples' voluntary HIV counseling and testing (CVCT) is a WHO-recommended intervention for prevention of heterosexual HIV transmission which very few African couples have received. We report the successful nationwide implementation of CVCT in Rwanda. Methods: From 1988 to 1994 in Rwanda, pregnant and postpartum women were tested for HIV and requested testing for their husbands. Partner testing was associated with more condom use and lower HIV and sexually transmitted infection rates, particularly among HIV-discordant couples. After the 1994 genocide, the research team continued to refine CVCT procedures in Zambia. These were reintroduced to Rwanda in 2001 and continually tested and improved. In 2003, the Government of Rwanda (GoR) established targets for partner testing among pregnant women, with the proportion rising from 16% in 2003 to 84% in 2008 as the prevention of mother-to-child transmission program expanded to >400 clinics. In 2009, the GoR adopted joint posttest counseling procedures, and in 2010 a quarterly follow-up program for discordant couples was established in government clinics with training and technical assistance. An estimated 80%–90% of Rwandan couples have now been jointly counseled and tested resulting in prevention of >70% of new HIV infections. Conclusions: Rwanda is the first African country to have established CVCT as standard of care in antenatal care. More than 20 countries have sent providers to Rwanda for CVCT training. To duplicate Rwanda's success, training and technical assistance must be part of a coordinated effort to set national targets, timelines, indicators, and budgets. Governments, bilateral, and multilateral funding agencies must jointly prioritize CVCT for prevention of new HIV infections. PMID:27741033
Karita, Etienne; Nsanzimana, Sabin; Ndagije, Felix; Wall, Kristin M; Mukamuyango, Jeannine; Mugwaneza, Placidie; Remera, Eric; Raghunathan, Pratima L; Bayingana, Roger; Kayitenkore, Kayitesi; Bekan-Homawoo, Brigitte; Tichacek, Amanda; Allen, Susan
2016-11-01
Couples' voluntary HIV counseling and testing (CVCT) is a WHO-recommended intervention for prevention of heterosexual HIV transmission which very few African couples have received. We report the successful nationwide implementation of CVCT in Rwanda. From 1988 to 1994 in Rwanda, pregnant and postpartum women were tested for HIV and requested testing for their husbands. Partner testing was associated with more condom use and lower HIV and sexually transmitted infection rates, particularly among HIV-discordant couples. After the 1994 genocide, the research team continued to refine CVCT procedures in Zambia. These were reintroduced to Rwanda in 2001 and continually tested and improved. In 2003, the Government of Rwanda (GoR) established targets for partner testing among pregnant women, with the proportion rising from 16% in 2003 to 84% in 2008 as the prevention of mother-to-child transmission program expanded to >400 clinics. In 2009, the GoR adopted joint posttest counseling procedures, and in 2010 a quarterly follow-up program for discordant couples was established in government clinics with training and technical assistance. An estimated 80%-90% of Rwandan couples have now been jointly counseled and tested resulting in prevention of >70% of new HIV infections. Rwanda is the first African country to have established CVCT as standard of care in antenatal care. More than 20 countries have sent providers to Rwanda for CVCT training. To duplicate Rwanda's success, training and technical assistance must be part of a coordinated effort to set national targets, timelines, indicators, and budgets. Governments, bilateral, and multilateral funding agencies must jointly prioritize CVCT for prevention of new HIV infections.
Vural, Bilgin Kiray; Temel, Ayla Bayik
2009-09-01
Through its ability to address and remove fear and misunderstanding and the resulting sexual reluctance and related problems, pre-marital sexual education and counselling can contribute to sexual satisfaction. This quasi-experimental study, conducted with a pre-test-post-test control group design, aimed to examine the effectiveness of nursing interventions in a premarital counselling program and their impact on the sexual satisfaction of couples (36 couples in the experimental group and 35 couples in the control group). Although no difference was detected between the experimental and control groups in terms of the level of knowledge on pre-test point averages, the difference between them in terms of post-test knowledge gain averages was statistically significant. Approval rates for sexual myths in the pre-test were 27.87% in the experimental group and 37.03% in the control group; in the post-test they were 23.51% and 36.66%, respectively. In the experimental group, 80.6% of the women and 63.9% of the men, and in the control group, 77.1% of the women and 71.4% of the men, were established as having a problem-free sexual life. Levels of sexual satisfaction were also higher among the women and men in the experimental group, who had attended premarital sexual counselling education, than among those in the control group. A recommendation to encourage engaged couples to attend premarital sexual counselling is made based on the findings. It is thought that an intervention plan prepared within the framework of the Information, Motivation, Behavioural Skills theoretical model will help nurses guide recently-married couples to greater sexual satisfaction.
Koniak-Griffin, Deborah; Lesser, Janna; Takayanagi, Sumiko; Cumberland, William G
2011-04-01
To evaluate the efficacy and sustainability of a couple-focused human immunodeficiency virus (HIV) prevention intervention in reducing unprotected sex and increasing intent to use condoms and knowledge about AIDS. Randomized controlled trial. Urban community settings in Southern California. Primarily Latino couples (168 couples; 336 individuals) who were aged 14 to 25 years, English or Spanish speaking, and coparenting a child at least 3 months of age. A 12-hour theory-based, couple-focused HIV prevention program culturally tailored for young Latino parents, with emphasis on family protection, skill building, and issues related to gender and power. The 1½-hour control condition provided basic HIV-AIDS information. Primary outcome measures included self-report of condom use during the past 3 months; secondary, intent to use condoms and knowledge about AIDS. The HIV prevention intervention reduced the proportion of unprotected sex episodes (odds ratio, 0.87 per month from baseline to 6 months; 95% confidence interval [CI], 0.82-0.93) and increased intent to use condoms (slope increase, 0.20; 95% CI, 0.04-0.37) at the 6-month follow-up; however, these effects were not sustained at 12 months. Knowledge about AIDS was increased in both groups from baseline to 6 months (slope estimate, 0.57; 95% CI, 0.47-0.67) and was maintained in the intervention group only through 12 months. Female participants in both groups had higher intent to use condoms and knowledge about AIDS than male participants (P ≤ .01). The couple-focused HIV prevention intervention reduced risky sexual behaviors and improved intent to use condoms among young Latino parents at the 6-month evaluation. A maintenance program is needed to improve the sustainability of effects over time.
NASA Technical Reports Server (NTRS)
Omalley, T. A.
1984-01-01
The use of the coupled cavity traveling wave tube for space communications has led to an increased interest in improving the efficiency of the basic interaction process in these devices through velocity resynchronization and other methods. A flexible, three dimensional, axially symmetric, large signal computer program was developed for use on the IBM 370 time sharing system. A users' manual for this program is included.
Couple Characteristics and Contraceptive Use among Women and their Partners in Urban Kenya
Irani, Laili; Speizer, Ilene S.; Fotso, Jean-Christophe
2014-01-01
Background Few studies have used couple data to identify individual- and relationship-level characteristics that affect contraceptive use in urban areas. Using matched couple data from urban Kenya collected in 2010, this study determines the association between relationship-level characteristics (desire for another child, communication about desired number of children and FP use) and contraceptive use and intention to use among non-users. Methods Data were collected from three Kenyan cities: Nairobi, Mombasa and Kisumu. Baseline population-based survey data from the Measurement, Learning & Evaluation Project were used to identify 883 couples (weighted value = 840). Multivariate regressions used the couple as the unit of analysis. Results Almost two-thirds of couples currently used contraception. Adjusting for individual- and environmental-level characteristics, couples who desired another child were less likely to use contraception than couples wanting no more children. In addition, couples where both partners reported communicating with each other regarding desired number of children and FP use were more likely to use contraception compared to couples that did not communicate. Analyses testing the association of relationship-level characteristics and intention to use contraception, among non-users, resembled those of current contraceptive users. Conclusion Couple-level characteristics are associated with current contraceptive use and future intent to use. Couples that discussed their desired number of children and FP use were more likely to use contraception than couples that did not communicate with each other. FP programs should identify strategies to improve communication in FP among couples and to ensure better cooperation between partners. PMID:24733057
“The Best is Always Yet to Come”: Relationship Stages and Processes Among Young LGBT Couples
Macapagal, Kathryn; Greene, George J.; Rivera, Zenaida A.; Mustanski, Brian
2015-01-01
Limited research has examined relationship development among lesbian, gay, bisexual, and transgender (LGBT) couples in emerging adulthood. A better understanding of LGBT couples can inform the development of relationship education programs that reflect their unique needs. The following questions guided this study: 1) what are the stages and processes during young LGBT couples’ relationship development? and 2) how do these compare to existing literature on heterosexual adults? A secondary goal was to explore similarities and differences between couples assigned male (MAAB) and female at birth (FAAB). Thirty-six couples completed interviews on their relationship history. Qualitative analyses showed that relationship stages and processes were similar to past research on heterosexuals, but participants’ subjective experiences reflected their LGBT identities and emerging adulthood, which exerted additional stress on the relationship. These factors also affected milestones indicative of commitment among heterosexual adults (e.g., introducing partner to family). Mixed-methods analyses indicated that MAAB couples described negotiating relationship agreements and safe sex in more depth than FAAB couples. Relationship development models warrant modifications to consider the impact of sexual and gender identity and emerging adulthood when applied to young LGBT couples. These factors should be addressed in interventions to promote relationship health among young LGBT couples. PMID:26053345
Contraceptive social marketing in the Philippines. A new initiative.
Migallos, G; Araneta, A
1994-01-01
By offering contraceptives at subsidized prices through pharmacies, drugstores, grocery shops, and other conveniently-located retail outlets, and promoting them with modern marketing techniques, social marketing programs can do much to reduce the unmet need for family planning. Users obviously benefit, while the family planning program benefits from advertising and marketing skills and some cost recovery. The Philippine Contraceptive Social Marketing Project (PCSMP) was formally launched in the Philippines in 1993 in response to the large unmet need in the country, and initial results are promising. The project was started with funding from the US Agency for International Development to provide affordable, quality contraceptives through the private sector to Filipino couples who choose to practice family planning. A 1988 survey found that only 22.4% of women aged 15-44 years were using modern methods of contraception and 13.8% were using traditional methods; approximately three million women therefore had unmet need for family planning. The PCSMP established an AIDS prevention component and a birth spacing component, enlisting the participation of oral contraceptive manufacturers Wyeth, Organon, and Schering, along with one condom distributor, Philusa. These companies lowered their product prices by 20% for the program. Despite objections from the Catholic church, sales of both oral pills and condoms increased in the first year. In its second year, the program will advertise Sensation condoms and the Couple's Choice Pills via television, through intensive distribution drives, consumer and trade promotions, and the continuous training of health professionals. The contraceptive injectable DMPA will be added to the Couple's Choice product line in April 1994. This method, too, will be heavily promoted.
A novel potential/viscous flow coupling technique for computing helicopter flow fields
NASA Technical Reports Server (NTRS)
Summa, J. Michael; Strash, Daniel J.; Yoo, Sungyul
1993-01-01
The primary objective of this work was to demonstrate the feasibility of a new potential/viscous flow coupling procedure for reducing computational effort while maintaining solution accuracy. This closed-loop, overlapped velocity-coupling concept has been developed in a new two-dimensional code, ZAP2D (Zonal Aerodynamics Program - 2D), a three-dimensional code for wing analysis, ZAP3D (Zonal Aerodynamics Program - 3D), and a three-dimensional code for isolated helicopter rotors in hover, ZAPR3D (Zonal Aerodynamics Program for Rotors - 3D). Comparisons with large domain ARC3D solutions and with experimental data for a NACA 0012 airfoil have shown that the required domain size can be reduced to a few tenths of a percent chord for the low Mach and low angle of attack cases and to less than 2-5 chords for the high Mach and high angle of attack cases while maintaining solution accuracies to within a few percent. This represents CPU time reductions by a factor of 2-4 compared with ARC2D. The current ZAP3D calculation for a rectangular plan-form wing of aspect ratio 5 with an outer domain radius of about 1.2 chords represents a speed-up in CPU time over the ARC3D large domain calculation by about a factor of 2.5 while maintaining solution accuracies to within a few percent. A ZAPR3D simulation for a two-bladed rotor in hover with a reduced grid domain of about two chord lengths was able to capture the wake effects and compared accurately with the experimental pressure data. Further development is required in order to substantiate the promise of computational improvements due to the ZAPR3D coupling concept.
Comparisons of Solar Wind Coupling Parameters with Auroral Energy Deposition Rates
NASA Technical Reports Server (NTRS)
Elsen, R.; Brittnacher, M. J.; Fillingim, M. O.; Parks, G. K.; Germany G. A.; Spann, J. F., Jr.
1997-01-01
Measurement of the global rate of energy deposition in the ionosphere via auroral particle precipitation is one of the primary goals of the Polar UVI program and is an important component of the ISTP program. The instantaneous rate of energy deposition for the entire month of January 1997 has been calculated by applying models to the UVI images and is presented by Fillingim et al. in this session. A number of parameters that predict the rate of coupling of solar wind energy into the magnetosphere have been proposed in the last few decades. Some of these parameters, such as the epsilon parameter of Perreault and Akasofu, depend on the instantaneous values in the solar wind. Other parameters depend on the integrated values of solar wind parameters, especially IMF Bz, e.g., the applied flux, which predicts the net transfer of magnetic flux to the tail. While these parameters have often been used successfully in substorm studies, their validity in terms of global energy input has not yet been ascertained, largely because data such as those supplied by the ISTP program were lacking. We have calculated these and other energy coupling parameters for January 1997 using solar wind data provided by WIND and other solar wind monitors. The rates of energy input predicted by these parameters are compared to those measured through UVI data and correlations are sought. Whether these parameters are better at providing an instantaneous rate of energy input or an average input over some time period is addressed. We also study whether either type of parameter provides better correlations when a time delay is introduced; if so, this time delay may provide a characteristic time for energy transport in the coupled solar wind-magnetosphere-ionosphere system.
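For reference, the epsilon parameter mentioned above is usually quoted in the literature in the form

$$ \epsilon = \frac{4\pi}{\mu_0}\, v\, B^{2} \sin^{4}\!\left(\frac{\theta_c}{2}\right) l_0^{2}, $$

where $v$ is the solar wind speed, $B$ the interplanetary magnetic field magnitude, $\theta_c$ the IMF clock angle, and $l_0 \approx 7\,R_E$ an empirical scaling length. This standard form is given only for context; the normalization actually used in the study may differ.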
Coupled circuit numerical analysis of eddy currents in an open MRI system.
Akram, Md Shahadat Hossain; Terada, Yasuhiko; Keiichiro, Ishi; Kose, Katsumi
2014-08-01
We performed a new coupled circuit numerical simulation of eddy currents in an open compact magnetic resonance imaging (MRI) system. Following the coupled circuit approach, the conducting structures were divided into subdomains along the length (or width) and the thickness, and by implementing coupled circuit concepts we have simulated transient responses of eddy currents for subdomains in different locations. We implemented the Eigen matrix technique to solve the network of coupled differential equations to speed up our simulation program. On the other hand, to compute the coupling relations between the biplanar gradient coil and any other conducting structure, we implemented the solid angle form of Ampere's law. We have also calculated the solid angle in three dimensions to compute inductive couplings in any subdomain of the conducting structures. Details of the temporal and spatial distribution of the eddy currents were then implemented in the secondary magnetic field calculation by the Biot-Savart law. On a desktop computer (programming platform: Wolfram Mathematica 8.0®; processor: Intel(R) Core(TM)2 Duo E7500 @ 2.93 GHz; OS: Windows 7 Professional; memory (RAM): 4.00 GB), it took less than 3 min to simulate the entire calculation of eddy currents and fields, and approximately 6 min for the X-gradient coil. The results are given in the time-space domain for both the direct and the cross-terms of the eddy current magnetic fields generated by the Z-gradient coil. We have also conducted free induction decay (FID) experiments on the eddy fields using a nuclear magnetic resonance (NMR) probe to verify our simulation results. The simulation results were found to be in good agreement with the experimental results. In this study we have also conducted simulations of the transient and spatial responses of the secondary magnetic field induced by the X-gradient coil. Our approach is fast and has much less computational complexity than conventional electromagnetic numerical simulation methods. Copyright © 2014 Elsevier Inc. All rights reserved.
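A toy version of the coupled-circuit idea is sketched below: two eddy-current loops, inductively driven by a ramped gradient-coil current, are advanced by explicit time stepping of the circuit equations L dI/dt + R I = -M dIg/dt. This is illustrative only; the paper's model uses many subdomains, an eigen-matrix solution rather than explicit stepping, and solid-angle mutual inductances, none of which is reproduced here, and all values are invented.

```cpp
// Toy coupled-circuit eddy-current model: two loops driven by a gradient ramp.
#include <cstdio>

int main() {
    // Self/mutual inductances (H) and resistances (Ohm) of the two eddy loops (made up).
    double L[2][2] = {{1.0e-3, 0.2e-3}, {0.2e-3, 1.5e-3}};
    double R[2]    = {0.05, 0.08};
    // Mutual inductance between the gradient coil and each loop (H), also made up.
    double Mg[2]   = {0.1e-3, 0.05e-3};

    // Invert the 2x2 inductance matrix once.
    double det = L[0][0] * L[1][1] - L[0][1] * L[1][0];
    double Linv[2][2] = {{ L[1][1] / det, -L[0][1] / det},
                         {-L[1][0] / det,  L[0][0] / det}};

    double I[2] = {0.0, 0.0};          // eddy currents in the two loops
    const double dt = 1.0e-6;          // time step (s)
    for (int n = 0; n < 2000; ++n) {
        double t = n * dt;
        // Gradient-coil current: linear ramp to 100 A over 1 ms, then flat.
        double dIg_dt = (t < 1.0e-3) ? 1.0e5 : 0.0;
        // L dI/dt = -R I - Mg dIg/dt  =>  dI/dt = Linv * rhs
        double rhs[2] = {-R[0] * I[0] - Mg[0] * dIg_dt,
                         -R[1] * I[1] - Mg[1] * dIg_dt};
        double dI[2]  = {Linv[0][0] * rhs[0] + Linv[0][1] * rhs[1],
                         Linv[1][0] * rhs[0] + Linv[1][1] * rhs[1]};
        I[0] += dt * dI[0];
        I[1] += dt * dI[1];
        if (n % 500 == 0)
            std::printf("t=%.3f ms  I1=%.4f A  I2=%.4f A\n", t * 1e3, I[0], I[1]);
    }
    return 0;
}
```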
NASA Astrophysics Data System (ADS)
Chuluunbaatar, O.; Gusev, A. A.; Abrashkevich, A. G.; Amaya-Tapia, A.; Kaschiev, M. S.; Larsen, S. Y.; Vinitsky, S. I.
2007-10-01
A FORTRAN 77 program is presented which calculates energy values, the reaction matrix, and the corresponding radial wave functions in a coupled-channel approximation of the hyperspherical adiabatic approach. In this approach, a multi-dimensional Schrödinger equation is reduced to a system of coupled second-order ordinary differential equations on a finite interval with homogeneous boundary conditions of the third type. The resulting system of radial equations, which contains the potential matrix elements and first-derivative coupling terms, is solved using high-order accuracy approximations of the finite-element method. As a test case, the program is applied to the calculation of the energy values and reaction matrix for an exactly solvable 2D model of three identical particles on a line with pair zero-range potentials. Program summary: Program title: KANTBP. Catalogue identifier: ADZH_v1_0. Program summary URL: http://cpc.cs.qub.ac.uk/summaries/ADZH_v1_0.html. Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland. Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html. No. of lines in distributed program, including test data, etc.: 4224. No. of bytes in distributed program, including test data, etc.: 31 232. Distribution format: tar.gz. Programming language: FORTRAN 77. Computer: Intel Xeon EM64T, Alpha 21264A, AMD Athlon MP, Pentium IV Xeon, Opteron 248, Intel Pentium IV. Operating system: Linux, Unix AIX 5.3, SunOS 5.8, Solaris, Windows XP. RAM: depends on (a) the number of differential equations; (b) the number and order of finite elements; (c) the number of hyperradial points; and (d) the number of eigensolutions required. The test run requires 30 MB. Classification: 2.1, 2.4. External routines: GAULEG and GAUSSJ [W.H. Press, B.P. Flannery, S.A. Teukolsky, W.T. Vetterling, Numerical Recipes: The Art of Scientific Computing, Cambridge University Press, Cambridge, 1986]. Nature of problem: In the hyperspherical adiabatic approach [J. Macek, J. Phys. B 1 (1968) 831-843; U. Fano, Rep. Progr. Phys. 46 (1983) 97-165; C.D. Lin, Adv. Atom. Mol. Phys. 22 (1986) 77-142], a multi-dimensional Schrödinger equation for a two-electron system [A.G. Abrashkevich, D.G. Abrashkevich, M. Shapiro, Comput. Phys. Comm. 90 (1995) 311-339] or a hydrogen atom in a magnetic field [M.G. Dimova, M.S. Kaschiev, S.I. Vinitsky, J. Phys. B 38 (2005) 2337-2352] is reduced, by separating the radial coordinate ρ from the angular variables, to a system of second-order ordinary differential equations which contain potential matrix elements and first-derivative coupling terms. The purpose of this paper is to present a finite-element procedure, based on the use of high-order accuracy approximations, for calculating approximate eigensolutions of such systems of coupled differential equations. Solution method: The boundary problems for the coupled differential equations are solved by the finite-element method using high-order accuracy approximations [A.G. Abrashkevich, D.G. Abrashkevich, M.S. Kaschiev, I.V. Puzynin, Comput. Phys. Comm. 85 (1995) 40-64]. The generalized algebraic eigenvalue problem AF = EBF with respect to the pair of unknowns (E, F), arising after the replacement of the differential problem by the finite-element approximation, is solved by the subspace iteration method using the SSPACE program [K.J. Bathe, Finite Element Procedures in Engineering Analysis, Prentice-Hall, Englewood Cliffs, NJ, 1982].
The generalized algebraic eigenvalue problem (A-EB)F = λDF with respect to the pair of unknowns (λ, F), arising after the corresponding replacement of the scattering boundary problem in open channels at a fixed energy value E, is solved by LDL factorization of the symmetric matrix and back-substitution using the DECOMP and REDBAK programs, respectively [K.J. Bathe, Finite Element Procedures in Engineering Analysis, Prentice-Hall, Englewood Cliffs, NJ, 1982]. As a test case, the program is applied to the calculation of the energy values and reaction matrix for an exactly solvable 2D model of three identical particles on a line with pair zero-range potentials described in [Yu.A. Kuperin, P.B. Kurasov, Yu.B. Melnikov, S.P. Merkuriev, Ann. Phys. 205 (1991) 330-361; O. Chuluunbaatar, A.A. Gusev, S.Y. Larsen, S.I. Vinitsky, J. Phys. A 35 (2002) L513-L525; N.P. Mehta, J.R. Shepard, Phys. Rev. A 72 (2005) 032728-1-11; O. Chuluunbaatar, A.A. Gusev, M.S. Kaschiev, V.A. Kaschieva, A. Amaya-Tapia, S.Y. Larsen, S.I. Vinitsky, J. Phys. B 39 (2006) 243-269]. For this benchmark model, the needed analytical expressions for the potential matrix elements and first-derivative coupling terms, their asymptotics, and the asymptotics of the radial solutions of the boundary problems for the coupled differential equations were produced with the help of the MAPLE computer algebra system. Restrictions: The computer memory requirements depend on (a) the number of differential equations; (b) the number and order of finite elements; (c) the total number of hyperradial points; and (d) the number of eigensolutions required. Restrictions due to dimension sizes may be easily alleviated by altering PARAMETER statements (see the Long Write-Up and listing for details). The user must also supply subroutine POTCAL for evaluating the potential matrix elements, and subroutine ASYMEV (when solving the eigenvalue problem) or ASYMSC (when solving the scattering problem) to evaluate the asymptotics of the radial wave functions at the right boundary point in the case of a boundary condition of the third type. Running time: The running time depends critically upon (a) the number of differential equations; (b) the number and order of finite elements; (c) the total number of hyperradial points on the interval [0,ρ]; and (d) the number of eigensolutions required. The test run which accompanies this paper took 28.48 s, without calculation of the matrix potentials, on an Intel Pentium IV 2.4 GHz.
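The core numerical step described above, replacing the coupled radial equations by a finite-element approximation and solving the resulting generalized eigenvalue problem AF = EBF, can be illustrated with a short Python sketch. This is a minimal illustration only, using low-order linear elements and an invented two-channel potential matrix rather than the high-order elements and physical potentials of the FORTRAN 77 program.

import numpy as np
from scipy.linalg import eigh

# Discretize [-d^2/drho^2 delta_ij + V_ij(rho)] F_j = E F_i on [0, rho_max]
# with linear finite elements and Dirichlet boundary conditions.
n_el, rho_max, n_ch = 200, 15.0, 2         # elements, interval, channels
h = rho_max / n_el
nodes = np.linspace(0.0, rho_max, n_el + 1)

def V(rho):
    # Illustrative symmetric 2x2 potential matrix: two weakly coupled channels.
    return np.array([[rho**2,      0.1 * rho],
                     [0.1 * rho,  rho**2 + 2.0]])

n_dof = (n_el + 1) * n_ch
A = np.zeros((n_dof, n_dof))               # kinetic + potential matrix
B = np.zeros((n_dof, n_dof))               # overlap (mass) matrix

k_el = np.array([[1.0, -1.0], [-1.0, 1.0]]) / h       # element kinetic matrix
m_el = np.array([[2.0, 1.0], [1.0, 2.0]]) * h / 6.0   # element overlap matrix

for e in range(n_el):
    idx = [e, e + 1]
    Vmid = V(0.5 * (nodes[e] + nodes[e + 1]))          # midpoint rule for V_ij
    for c1 in range(n_ch):
        for c2 in range(n_ch):
            block = np.ix_([i * n_ch + c1 for i in idx],
                           [i * n_ch + c2 for i in idx])
            if c1 == c2:
                A[block] += k_el
                B[block] += m_el
            A[block] += Vmid[c1, c2] * m_el

# Homogeneous boundary conditions F(0) = F(rho_max) = 0
keep = np.arange(n_ch, n_dof - n_ch)
E, F = eigh(A[np.ix_(keep, keep)], B[np.ix_(keep, keep)])
print(E[:4])   # lowest few channel-coupled energy levels

For scattering at a fixed energy, the analogous step sketched in the abstract is an LDL factorization and back-substitution of the corresponding linear system rather than an eigenvalue iteration.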
ERIC Educational Resources Information Center
Lesser, Janna; Verdugo, Robert L.; Koniak-Griffin, Deborah; Tello, Jerry; Kappos, Barbara; Cumberland, William G.
2005-01-01
This article describes a two-phase community and academic collaboration funded by the California Collaborative Research Initiative to develop and test the feasibility of an innovative HIV prevention program relevant to the needs of the population of inner-city Latino teen parenting couples and realistic for implementation in community settings.…
Vo-Ag Educators Seek to Increase Numbers and Professionalism: Vo-Ag Education at Work in Wisconsin
ERIC Educational Resources Information Center
Lehrmann, Eugene
1978-01-01
Comments on Wisconsin's commitment to agriculture and agricultural education, focusing on the Young Farmer Program, a postsecondary program consisting of forty hours of classroom instruction coupled with on-farm instruction designed to provide people becoming established in farming with competencies that will assist them in making rational and…