Sample records for accelerated strategic computing

  1. Cooperative high-performance storage in the accelerated strategic computing initiative

    NASA Technical Reports Server (NTRS)

    Gary, Mark; Howard, Barry; Louis, Steve; Minuzzo, Kim; Seager, Mark

    1996-01-01

    The use and acceptance of new high-performance, parallel computing platforms will be impeded by the absence of an infrastructure capable of supporting orders-of-magnitude improvement in hierarchical storage and high-speed I/O (Input/Output). The distribution of these high-performance platforms and supporting infrastructures across a wide-area network further compounds this problem. We describe an architectural design and phased implementation plan for a distributed, Cooperative Storage Environment (CSE) to achieve the necessary performance, user transparency, site autonomy, communication, and security features needed to support the Accelerated Strategic Computing Initiative (ASCI). ASCI is a Department of Energy (DOE) program attempting to apply terascale platforms and Problem-Solving Environments (PSEs) toward real-world computational modeling and simulation problems. The ASCI mission must be carried out through a unified, multilaboratory effort, and will require highly secure, efficient access to vast amounts of data. The CSE provides a logically simple, geographically distributed, storage infrastructure of semi-autonomous cooperating sites to meet the strategic ASCI PSE goal of high-performance data storage and access at the user desktop.

  2. Delivering Insight: The History of the Accelerated Strategic Computing Initiative

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Larzelere II, A R

    2007-01-03

    The history of the Accelerated Strategic Computing Initiative (ASCI) tells of the development of computational simulation into a third fundamental piece of the scientific method, on a par with theory and experiment. ASCI did not invent the idea, nor was it alone in bringing it to fruition. But ASCI provided the wherewithal - hardware, software, environment, funding, and, most of all, the urgency - that made it happen. On October 1, 2005, the Initiative completed its tenth year of funding. The advances made by ASCI over its first decade are truly incredible. Lawrence Livermore, Los Alamos, and Sandia National Laboratories, along with leadership provided by the Department of Energy's Defense Programs Headquarters, fundamentally changed computational simulation and how it is used to enable scientific insight. To do this, astounding advances were made in simulation applications, computing platforms, and user environments. ASCI dramatically changed existing - and forged new - relationships, both among the Laboratories and with outside partners. By its tenth anniversary, despite daunting challenges, ASCI had accomplished all of the major goals set at its beginning. The history of ASCI is about the vision, leadership, endurance, and partnerships that made these advances possible.

  3. Applications of the Strategic Defense Initiative's compact accelerators

    NASA Technical Reports Server (NTRS)

    Montanarelli, Nick; Lynch, Ted

    1991-01-01

    The Strategic Defense Initiative's (SDI) investment in particle accelerator technology for its directed energy weapons program has produced breakthroughs in the size and power of new accelerators. These accelerators, in turn, have produced spinoffs in several areas: the radio frequency quadrupole linear accelerator (RFQ linac) was recently incorporated into the design of a cancer therapy unit at the Loma Linda University Medical Center, an SDI-sponsored compact induction linear accelerator may replace Cobalt-60 radiation and hazardous ethylene-oxide as a method for sterilizing medical products, and other SDIO-funded accelerators may be used to produce the radioactive isotopes oxygen-15, nitrogen-13, carbon-11, and fluorine-18 for positron emission tomography (PET). Other applications of these accelerators include bomb detection, non-destructive inspection, decomposing toxic substances in contaminated ground water, and eliminating nuclear waste.

  4. Accelerating Clean Energy Commercialization. A Strategic Partnership Approach

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Adams, Richard; Pless, Jacquelyn; Arent, Douglas J.

    Technology development in the clean energy and broader clean tech space has proven to be challenging. Long-standing methods for advancing clean energy technologies from science to commercialization are best known for relatively slow, linear progression through research and development, demonstration, and deployment (RDD&D); and characterized by well-known valleys of death for financing. Investment returns expected by traditional venture capital investors have been difficult to achieve, particularly for hardware-centric innovations, and companies that are subject to project finance risks. Commercialization support from incubators and accelerators has helped address these challenges by offering more support services to start-ups; however, more effort is needed to fulfill the desired clean energy future. The emergence of new strategic investors and partners in recent years has opened up innovative opportunities for clean tech entrepreneurs, and novel commercialization models are emerging that involve new alliances among clean energy companies, RDD&D, support systems, and strategic customers. For instance, Wells Fargo and Company (WFC) and the National Renewable Energy Laboratory (NREL) have launched a new technology incubator that supports faster commercialization through a focus on technology development. The incubator combines strategic financing, technology and technical assistance, strategic customer site validation, and ongoing financial support.

  5. Strategic Computing Computer Vision: Taking Image Understanding To The Next Plateau

    NASA Astrophysics Data System (ADS)

    Simpson, R. L., Jr.

    1987-06-01

    The overall objective of the Strategic Computing (SC) Program of the Defense Advanced Research Projects Agency (DARPA) is to develop and demonstrate a new generation of machine intelligence technology which can form the basis for more capable military systems in the future and also maintain a position of world leadership for the US in computer technology. Begun in 1983, SC represents a focused research strategy for accelerating the evolution of new technology and its rapid prototyping in realistic military contexts. Among the very ambitious demonstration prototypes being developed within the SC Program are: 1) the Pilot's Associate, which will aid the pilot in route planning, aerial target prioritization, evasion of missile threats, and aircraft emergency safety procedures during flight; 2) two battle management projects, one for the Army, which is just getting started, called the AirLand Battle Management program (ALBM), which will use knowledge-based systems technology to assist in the generation and evaluation of tactical options and plans at the Corps level; 3) the other, a more established program for the Navy, the Fleet Command Center Battle Management Program (FCCBMP) at Pearl Harbor. The FCCBMP is employing knowledge-based systems and natural language technology in an evolutionary testbed situated in an operational command center to demonstrate and evaluate intelligent decision-aids which can assist in the evaluation of fleet readiness and explore alternatives during contingencies; and 4) the Autonomous Land Vehicle (ALV), which integrates in a major robotic testbed the technologies for dynamic image understanding, knowledge-based route planning with replanning during execution, hosted on new advanced parallel architectures. The goal of the Strategic Computing computer vision technology base (SCVision) is to develop generic technology that will enable the construction of complete, robust, high performance image understanding systems to support a wide

  6. Computational Accelerator Physics. Proceedings

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bisognano, J.J.; Mondelli, A.A.

    1997-04-01

    The sixty-two papers appearing in this volume were presented at CAP96, the Computational Accelerator Physics Conference held in Williamsburg, Virginia from September 24-27, 1996. Science Applications International Corporation (SAIC) and the Thomas Jefferson National Accelerator Facility (Jefferson Lab) jointly hosted CAP96, with financial support from the U.S. Department of Energy's Office of Energy Research and the Office of Naval Research. Topics ranged from descriptions of specific codes to advanced computing techniques and numerical methods. Update talks were presented on nearly all of the accelerator community's major electromagnetic and particle tracking codes. Among all papers, thirty of them are abstracted for the Energy Science and Technology database. (AIP)

  7. Community Petascale Project for Accelerator Science and Simulation: Advancing Computational Science for Future Accelerators and Accelerator Technologies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Spentzouris, P.; /Fermilab; Cary, J.

    The design and performance optimization of particle accelerators are essential for the success of the DOE scientific program in the next decade. Particle accelerators are very complex systems whose accurate description involves a large number of degrees of freedom and requires the inclusion of many physics processes. Building on the success of the SciDAC-1 Accelerator Science and Technology project, the SciDAC-2 Community Petascale Project for Accelerator Science and Simulation (ComPASS) is developing a comprehensive set of interoperable components for beam dynamics, electromagnetics, electron cooling, and laser/plasma acceleration modelling. ComPASS is providing accelerator scientists the tools required to enable the necessary accelerator simulation paradigm shift from high-fidelity single physics process modeling (covered under SciDAC1) to high-fidelity multiphysics modeling. Our computational frameworks have been used to model the behavior of a large number of accelerators and accelerator R&D experiments, assisting both their design and performance optimization. As parallel computational applications, the ComPASS codes have been shown to make effective use of thousands of processors. ComPASS is in the first year of executing its plan to develop the next-generation HPC accelerator modeling tools. ComPASS aims to develop an integrated simulation environment that will utilize existing and new accelerator physics modules with petascale capabilities, by employing modern computing and solver technologies. The ComPASS vision is to deliver to accelerator scientists a virtual accelerator and virtual prototyping modeling environment, with the necessary multiphysics, multiscale capabilities. The plan for this development includes delivering accelerator modeling applications appropriate for each stage of the ComPASS software evolution. Such applications are already being used to address challenging problems in accelerator design and optimization. The Com

  8. Accelerating Climate Simulations Through Hybrid Computing

    NASA Technical Reports Server (NTRS)

    Zhou, Shujia; Sinno, Scott; Cruz, Carlos; Purcell, Mark

    2009-01-01

    Unconventional multi-core processors (e.g., IBM Cell B/E and NVIDIA GPU) have emerged as accelerators in climate simulation. However, climate models typically run on parallel computers with conventional processors (e.g., Intel and AMD) using MPI. Connecting accelerators to this architecture efficiently and easily becomes a critical issue. When using MPI for connection, we identified two challenges: (1) an identical MPI implementation is required in both systems, and (2) existing MPI code must be modified to accommodate the accelerators. In response, we have extended and deployed IBM Dynamic Application Virtualization (DAV) in a hybrid computing prototype system (one blade with two Intel quad-core processors, two IBM QS22 Cell blades, connected with Infiniband), allowing for seamlessly offloading compute-intensive functions to remote, heterogeneous accelerators in a scalable, load-balanced manner. Currently, a climate solar radiation model running with multiple MPI processes has been offloaded to multiple Cell blades with approx.10% network overhead.

  9. Terascale Computing in Accelerator Science and Technology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ko, Kwok

    2002-08-21

    We have entered the age of "terascale" scientific computing. Processors and system architecture both continue to evolve; hundred-teraFLOP computers are expected in the next few years, and petaFLOP computers toward the end of this decade are conceivable. This ever-increasing power to solve previously intractable numerical problems benefits almost every field of science and engineering and is revolutionizing some of them, notably including accelerator physics and technology. At existing accelerators, it will help us optimize performance, expand operational parameter envelopes, and increase reliability. Design decisions for next-generation machines will be informed by unprecedented comprehensive and accurate modeling, as well as computer-aided engineering; all this will increase the likelihood that even their most advanced subsystems can be commissioned on time, within budget, and up to specifications. Advanced computing is also vital to developing new means of acceleration and exploring the behavior of beams under extreme conditions. With continued progress it will someday become reasonable to speak of a complete numerical model of all phenomena important to a particular accelerator.

  10. Strategic Flexibility in Computational Estimation for Chinese- and Canadian-Educated Adults

    ERIC Educational Resources Information Center

    Xu, Chang; Wells, Emma; LeFevre, Jo-Anne; Imbo, Ineke

    2014-01-01

    The purpose of the present study was to examine factors that influence strategic flexibility in computational estimation for Chinese- and Canadian-educated adults. Strategic flexibility was operationalized as the percentage of trials on which participants chose the problem-based procedure that best balanced proximity to the correct answer with…

  11. Community Petascale Project for Accelerator Science and Simulation: Advancing Computational Science for Future Accelerators and Accelerator Technologies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Spentzouris, Panagiotis; /Fermilab; Cary, John

    The design and performance optimization of particle accelerators are essential for the success of the DOE scientific program in the next decade. Particle accelerators are very complex systems whose accurate description involves a large number of degrees of freedom and requires the inclusion of many physics processes. Building on the success of the SciDAC-1 Accelerator Science and Technology project, the SciDAC-2 Community Petascale Project for Accelerator Science and Simulation (ComPASS) is developing a comprehensive set of interoperable components for beam dynamics, electromagnetics, electron cooling, and laser/plasma acceleration modelling. ComPASS is providing accelerator scientists the tools required to enable the necessary accelerator simulation paradigm shift from high-fidelity single physics process modeling (covered under SciDAC1) to high-fidelity multiphysics modeling. Our computational frameworks have been used to model the behavior of a large number of accelerators and accelerator R&D experiments, assisting both their design and performance optimization. As parallel computational applications, the ComPASS codes have been shown to make effective use of thousands of processors.

  12. Cloud computing strategic framework (FY13 - FY15).

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Arellano, Lawrence R.; Arroyo, Steven C.; Giese, Gerald J.

    This document presents an architectural framework (plan) and roadmap for the implementation of a robust Cloud Computing capability at Sandia National Laboratories. It is intended to be a living document and serve as the basis for detailed implementation plans, project proposals and strategic investment requests.

  13. Accelerating artificial intelligence with reconfigurable computing

    NASA Astrophysics Data System (ADS)

    Cieszewski, Radoslaw

    Reconfigurable computing is emerging as an important area of research in computer architectures and software systems. Many algorithms can be greatly accelerated by placing the computationally intense portions of an algorithm into reconfigurable hardware. Reconfigurable computing combines many benefits of both software and ASIC implementations. Like software, the mapped circuit is flexible, and can be changed over the lifetime of the system. Similar to an ASIC, reconfigurable systems provide a method to map circuits into hardware. Reconfigurable systems therefore have the potential to achieve far greater performance than software as a result of bypassing the fetch-decode-execute operations of traditional processors, and possibly exploiting a greater level of parallelism. One such field, in which many different algorithms can be accelerated, is artificial intelligence. This paper presents example hardware implementations of Artificial Neural Networks, Genetic Algorithms and Expert Systems.

  14. Quantum Accelerators for High-performance Computing Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Humble, Travis S.; Britt, Keith A.; Mohiyaddin, Fahd A.

    We define some of the programming and system-level challenges facing the application of quantum processing to high-performance computing. Alongside barriers to physical integration, prominent differences in the execution of quantum and conventional programs challenge the intersection of these computational models. Following a brief overview of the state of the art, we discuss recent advances in programming and execution models for hybrid quantum-classical computing. We discuss a novel quantum-accelerator framework that uses specialized kernels to offload select workloads while integrating with existing computing infrastructure. We elaborate on the role of the host operating system to manage these unique accelerator resources, the prospects for deploying quantum modules, and the requirements placed on the language hierarchy connecting these different system components. We draw on recent advances in the modeling and simulation of quantum computing systems with the development of architectures for hybrid high-performance computing systems and the realization of software stacks for controlling quantum devices. Finally, we present simulation results that describe the expected system-level behavior of high-performance computing systems composed from compute nodes with quantum processing units. We describe performance for these hybrid systems in terms of time-to-solution, accuracy, and energy consumption, and we use simple application examples to estimate the performance advantage of quantum acceleration.

  15. Advanced Computing Tools and Models for Accelerator Physics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ryne, Robert; Ryne, Robert D.

    2008-06-11

    This paper is based on a transcript of my EPAC'08 presentation on advanced computing tools for accelerator physics. Following an introduction I present several examples, provide a history of the development of beam dynamics capabilities, and conclude with thoughts on the future of large scale computing in accelerator physics.

  16. Computing Models for FPGA-Based Accelerators

    PubMed Central

    Herbordt, Martin C.; Gu, Yongfeng; VanCourt, Tom; Model, Josh; Sukhwani, Bharat; Chiu, Matt

    2011-01-01

    Field-programmable gate arrays are widely considered as accelerators for compute-intensive applications. A critical phase of FPGA application development is finding and mapping to the appropriate computing model. FPGA computing enables models with highly flexible fine-grained parallelism and associative operations such as broadcast and collective response. Several case studies demonstrate the effectiveness of using these computing models in developing FPGA applications for molecular modeling. PMID:21603152

  17. Convergence acceleration of viscous flow computations

    NASA Technical Reports Server (NTRS)

    Johnson, G. M.

    1982-01-01

    A multiple-grid convergence acceleration technique introduced for application to the solution of the Euler equations by means of Lax-Wendroff algorithms is extended to treat compressible viscous flow. Computational results are presented for the solution of the thin-layer version of the Navier-Stokes equations using the explicit MacCormack algorithm, accelerated by a convective coarse-grid scheme. Extensions and generalizations are mentioned.
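
    The record above describes coarse-grid acceleration of an explicit flow solver. As a purely illustrative companion (not taken from the paper), the following sketch shows the generic two-grid correction idea on a 1D Poisson model problem: smooth on the fine grid, restrict the residual to a grid with twice the spacing, solve there, and prolong the correction back.

      import numpy as np

      def two_grid_step(u, f, h):
          """One two-grid cycle for -u'' = f on a uniform 1D grid (Dirichlet BCs)."""
          # Pre-smoothing: three damped Jacobi sweeps on the fine grid.
          for _ in range(3):
              u[1:-1] += 0.5 * (0.5 * (u[:-2] + u[2:]) + 0.5 * h**2 * f[1:-1] - u[1:-1])

          # Fine-grid residual r = f + u''.
          r = np.zeros_like(u)
          r[1:-1] = f[1:-1] + (u[:-2] - 2.0 * u[1:-1] + u[2:]) / h**2

          # Restrict the residual to the coarse grid (every other point).
          rc = r[::2].copy()
          nc, H = rc.size, 2.0 * h

          # Solve the coarse-grid error equation -e'' = r exactly (small system).
          A = (np.diag(np.full(nc - 2, 2.0)) +
               np.diag(np.full(nc - 3, -1.0), 1) +
               np.diag(np.full(nc - 3, -1.0), -1)) / H**2
          ec = np.zeros(nc)
          ec[1:-1] = np.linalg.solve(A, rc[1:-1])

          # Prolong the coarse correction back to the fine grid and apply it.
          e = np.interp(np.arange(u.size), np.arange(0, u.size, 2), ec)
          return u + e

      # Example: ten accelerated cycles on a 65-point grid.
      x = np.linspace(0.0, 1.0, 65)
      f, u = np.sin(np.pi * x), np.zeros(65)
      for _ in range(10):
          u = two_grid_step(u, f, x[1] - x[0])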

  18. Collaborative Strategic Board Games as a Site for Distributed Computational Thinking

    ERIC Educational Resources Information Center

    Berland, Matthew; Lee, Victor R.

    2011-01-01

    This paper examines the idea that contemporary strategic board games represent an informal, interactional context in which complex computational thinking takes place. When games are collaborative--that is, a game requires that players work in joint pursuit of a shared goal--the computational thinking is easily observed as distributed across…

  19. Accelerating Climate and Weather Simulations through Hybrid Computing

    NASA Technical Reports Server (NTRS)

    Zhou, Shujia; Cruz, Carlos; Duffy, Daniel; Tucker, Robert; Purcell, Mark

    2011-01-01

    Unconventional multi- and many-core processors (e.g. IBM (R) Cell B.E.(TM) and NVIDIA (R) GPU) have emerged as effective accelerators in trial climate and weather simulations. Yet these climate and weather models typically run on parallel computers with conventional processors (e.g. Intel, AMD, and IBM) using Message Passing Interface. To address challenges involved in efficiently and easily connecting accelerators to parallel computers, we investigated using IBM's Dynamic Application Virtualization (TM) (IBM DAV) software in a prototype hybrid computing system with representative climate and weather model components. The hybrid system comprises two Intel blades and two IBM QS22 Cell B.E. blades, connected with both InfiniBand(R) (IB) and 1-Gigabit Ethernet. The system significantly accelerates a solar radiation model component by offloading compute-intensive calculations to the Cell blades. Systematic tests show that IBM DAV can seamlessly offload compute-intensive calculations from Intel blades to Cell B.E. blades in a scalable, load-balanced manner. However, noticeable communication overhead was observed, mainly due to IP over the IB protocol. Full utilization of IB Sockets Direct Protocol and the lower latency production version of IBM DAV will reduce this overhead.
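
    IBM DAV itself is proprietary, so the following sketch only illustrates the general offload pattern the two hybrid-computing records describe: a host process ships compute-intensive kernels to workers and gathers the results. It uses Python's standard concurrent.futures as a stand-in for the accelerator link, and the kernel, its arguments, and the worker count are hypothetical.

      from concurrent.futures import ProcessPoolExecutor
      import numpy as np

      def radiation_kernel(column):
          """Hypothetical compute-intensive kernel, standing in for the solar
          radiation calculation that the records offload to Cell blades."""
          return float(np.sum(np.exp(-column) * column**2))

      def offload(columns, workers=4):
          # The host partitions the work and ships each piece to a worker,
          # collecting results as they complete (a stand-in for DAV's role).
          with ProcessPoolExecutor(max_workers=workers) as pool:
              return list(pool.map(radiation_kernel, columns))

      if __name__ == "__main__":
          columns = [np.random.rand(10_000) for _ in range(64)]
          print(sum(offload(columns)))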

  20. Strategic directions of computing at Fermilab

    NASA Astrophysics Data System (ADS)

    Wolbers, Stephen

    1998-05-01

    Fermilab computing has changed a great deal over the years, driven by the demands of the Fermilab experimental community to record and analyze larger and larger datasets, by the desire to take advantage of advances in computing hardware and software, and by the advances coming from the R&D efforts of the Fermilab Computing Division. The strategic directions of Fermilab Computing continue to be driven by the needs of the experimental program. The current fixed-target run will produce over 100 TBytes of raw data and systems must be in place to allow the timely analysis of the data. The collider run II, beginning in 1999, is projected to produce of order 1 PByte of data per year. There will be a major change in methodology and software language as the experiments move away from FORTRAN and into object-oriented languages. Increased use of automation and the reduction of operator-assisted tape mounts will be required to meet the needs of the large experiments and large data sets. Work will continue on higher-rate data acquisition systems for future experiments and projects. R&D projects will be pursued as necessary to provide software, tools, or systems which cannot be purchased or acquired elsewhere. A closer working relation with other high energy laboratories will be pursued to reduce duplication of effort and to allow effective collaboration on many aspects of HEP computing.

  1. A strategic plan to accelerate development of acute stroke treatments.

    PubMed

    Marler, John R

    2012-09-01

    In order to reenergize acute stroke research and accelerate the development of new treatments, we need to transform the usual design and conduct of clinical trials to test for small but significant improvements in effectiveness, and treat patients as soon as possible after stroke onset when treatment effects are most detectable. This requires trials that include thousands of acute stroke patients. A plan to make these trials possible is proposed. There are four components: (1) free access to the electronic medical record; (2) a large stroke emergency network and clinical trial coordinating center connected in real time to hundreds of emergency departments; (3) a clinical trial technology development center; and (4) strategic leadership to raise funds, motivate clinicians to participate, and interact with politicians, insurers, legislators, and other national and international organizations working to advance the quality of stroke care. © 2012 New York Academy of Sciences.

  2. A coarse-grid projection method for accelerating incompressible flow computations

    NASA Astrophysics Data System (ADS)

    San, Omer; Staples, Anne E.

    2013-01-01

    We present a coarse-grid projection (CGP) method for accelerating incompressible flow computations, which is applicable to methods involving Poisson equations as incompressibility constraints. The CGP methodology is a modular approach that facilitates data transfer with simple interpolations and uses black-box solvers for the Poisson and advection-diffusion equations in the flow solver. After solving the Poisson equation on a coarsened grid, an interpolation scheme is used to obtain the fine data for subsequent time stepping on the full grid. A particular version of the method is applied here to the vorticity-stream function, primitive variable, and vorticity-velocity formulations of incompressible Navier-Stokes equations. We compute several benchmark flow problems on two-dimensional Cartesian and non-Cartesian grids, as well as a three-dimensional flow problem. The method is found to accelerate these computations while retaining a level of accuracy close to that of the fine resolution field, which is significantly better than the accuracy obtained for a similar computation performed solely using a coarse grid. A linear acceleration rate is obtained for all the cases we consider due to the linear-cost elliptic Poisson solver used, with reduction factors in computational time between 2 and 42. The computational savings are larger when a suboptimal Poisson solver is used. We also find that the computational savings increase with increasing distortion ratio on non-Cartesian grids, making the CGP method a useful tool for accelerating generalized curvilinear incompressible flow solvers.
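
    The CGP procedure described above can be summarized in a few lines: restrict the Poisson right-hand side to a coarsened grid, solve there with a black-box solver, and interpolate the result back to the fine grid for the subsequent time step. The sketch below is an illustration of that data flow under simplifying assumptions (factor-of-two coarsening, a plain Jacobi stand-in for the black-box Poisson solver); it is not the authors' code.

      import numpy as np

      def solve_poisson(rhs, h, iters=2000):
          """Stand-in Poisson solver (plain Jacobi); CGP treats this as a black box."""
          p = np.zeros_like(rhs)
          for _ in range(iters):
              p[1:-1, 1:-1] = 0.25 * (p[:-2, 1:-1] + p[2:, 1:-1] +
                                      p[1:-1, :-2] + p[1:-1, 2:] -
                                      h**2 * rhs[1:-1, 1:-1])
          return p

      def cgp_pressure(rhs_fine, h_fine):
          """Coarse-grid projection step: solve the Poisson problem on a grid
          coarsened by a factor of two, then interpolate back to the fine grid."""
          rhs_coarse = rhs_fine[::2, ::2]                  # simple injection restriction
          p_coarse = solve_poisson(rhs_coarse, 2.0 * h_fine)
          ny, nx = rhs_fine.shape
          yc, xc = np.arange(0, ny, 2), np.arange(0, nx, 2)
          # Bilinear interpolation of the coarse solution onto the fine grid.
          rows = np.array([np.interp(np.arange(nx), xc, row) for row in p_coarse])
          return np.array([np.interp(np.arange(ny), yc, col) for col in rows.T]).T

      # Example: a 65 x 65 fine grid with a random divergence source term.
      rhs = np.random.rand(65, 65)
      p_fine = cgp_pressure(rhs, h_fine=1.0 / 64)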

  3. High Performance Computing Modeling Advances Accelerator Science for High-Energy Physics

    DOE PAGES

    Amundson, James; Macridin, Alexandru; Spentzouris, Panagiotis

    2014-07-28

    The development and optimization of particle accelerators are essential for advancing our understanding of the properties of matter, energy, space, and time. Particle accelerators are complex devices whose behavior involves many physical effects on multiple scales. Therefore, advanced computational tools utilizing high-performance computing are essential for accurately modeling them. In the past decade, the US Department of Energy's SciDAC program has produced accelerator-modeling tools that have been employed to tackle some of the most difficult accelerator science problems. The authors discuss the Synergia framework and its applications to high-intensity particle accelerator physics. Synergia is an accelerator simulation package capable of handling the entire spectrum of beam dynamics simulations. The authors present Synergia's design principles and its performance on HPC platforms.

  4. Accelerating sino-atrium computer simulations with graphic processing units.

    PubMed

    Zhang, Hong; Xiao, Zheng; Lin, Shien-fong

    2015-01-01

    Sino-atrial node cells (SANCs) play a significant role in rhythmic firing. To investigate their role in arrhythmia and interactions with the atrium, computer simulations based on cellular dynamic mathematical models are generally used. However, the large-scale computation usually makes research difficult, given the limited computational power of Central Processing Units (CPUs). In this paper, an accelerating approach with Graphic Processing Units (GPUs) is proposed in a simulation consisting of the SAN tissue and the adjoining atrium. By using the operator splitting method, the computational task was made parallel. Three parallelization strategies were then put forward. The strategy with the shortest running time was further optimized by considering block size, data transfer and partition. The results showed that for a simulation with 500 SANCs and 30 atrial cells, the execution time taken by the non-optimized program decreased 62% with respect to a serial program running on CPU. The execution time decreased by 80% after the program was optimized. The larger the tissue was, the more significant the acceleration became. The results demonstrated the effectiveness of the proposed GPU-accelerating methods and their promising applications in more complicated biological simulations.
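
    The record does not include code; the sketch below (a generic FitzHugh-Nagumo-style excitable medium, not the authors' SANC model) only illustrates the operator-splitting structure that makes the computation parallel: each time step alternates an independent per-cell reaction update, which maps naturally onto GPU threads, with a diffusion update that couples neighbouring cells.

      import numpy as np

      def split_step(v, w, laplacian, dt, D=0.1):
          """One operator-split step of a generic two-variable excitable model."""
          dv = v - v**3 / 3.0 - w           # FitzHugh-Nagumo-style kinetics
          dw = 0.08 * (v + 0.7 - 0.8 * w)
          v, w = v + dt * dv, w + dt * dw   # reaction sub-step (independent per cell)
          v = v + dt * D * (laplacian @ v)  # diffusion sub-step (neighbour coupling)
          return v, w

      # Example: a 1D strand of 530 cells (cf. the 500 SANCs + 30 atrial cells above).
      n = 530
      lap = (np.diag(np.full(n - 1, 1.0), -1) - 2.0 * np.eye(n) +
             np.diag(np.full(n - 1, 1.0), 1))
      v, w = -1.2 * np.ones(n), np.zeros(n)
      for _ in range(1000):
          v, w = split_step(v, w, lap, dt=0.05)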

  5. Strategic flexibility in computational estimation for Chinese- and Canadian-educated adults.

    PubMed

    Xu, Chang; Wells, Emma; LeFevre, Jo-Anne; Imbo, Ineke

    2014-09-01

    The purpose of the present study was to examine factors that influence strategic flexibility in computational estimation for Chinese- and Canadian-educated adults. Strategic flexibility was operationalized as the percentage of trials on which participants chose the problem-based procedure that best balanced proximity to the correct answer with simplification of the required calculation. For example, on 42 × 57, the optimal problem-based solution is 40 × 60 because 2,400 is closer to the exact answer 2,394 than is 40 × 50 or 50 × 60. In Experiment 1 (n = 50), where participants had free choice of estimation procedures, Chinese-educated participants were more likely to choose the optimal problem-based procedure (80% of trials) than Canadian-educated participants (50%). In Experiment 2 (n = 48), participants had to choose 1 of 3 solution procedures. They showed moderate strategic flexibility that was equal across groups (60%). In Experiment 3 (n = 50), participants were given the same 3 procedure choices as in Experiment 2 but different instructions and explicit feedback. When instructed to respond quickly, both groups showed moderate strategic flexibility as in Experiment 2 (60%). When instructed to respond as accurately as possible or to balance speed and accuracy, they showed very high strategic flexibility (greater than 90%). These findings suggest that solvers will show very different levels of strategic flexibility in response to instructions, feedback, and problem characteristics and that these factors interact with individual differences (e.g., arithmetic skills, nationality) to produce variable response patterns.
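
    The "optimal problem-based procedure" defined above amounts to a small selection rule; the snippet below is merely an illustration of that rule, choosing whichever rounding of the operands lands closest to the exact product (it reproduces the 42 × 57 example from the abstract).

      def best_rounding(a, b):
          """Pick the rounding of a x b closest to the exact product, mirroring
          the optimal problem-based estimation procedure described above."""
          down = lambda x: (x // 10) * 10
          up = lambda x: (x // 10 + 1) * 10
          exact = a * b
          options = {
              "both down": down(a) * down(b),
              "both up": up(a) * up(b),
              "mixed": min(down(a) * up(b), up(a) * down(b),
                           key=lambda est: abs(est - exact)),
          }
          return min(options.items(), key=lambda kv: abs(kv[1] - exact)), exact

      print(best_rounding(42, 57))   # (('mixed', 2400), 2394): 40 x 60 beats 40 x 50 and 50 x 60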

  6. Pennsylvania's Transition to Enterprise Computing as a Study in Strategic Alignment

    ERIC Educational Resources Information Center

    Sawyer, Steve; Hinnant, Charles C.; Rizzuto, Tracey

    2008-01-01

    We theorize about the strategic alignment of computing with organizational mission, using the Commonwealth of Pennsylvania's efforts to pursue digital government initiatives as evidence. To do this we draw on a decade (1995-2004) of changes in Pennsylvania to characterize how a state government shifts from an organizational to an enterprise…

  7. Fast Acceleration of 2D Wave Propagation Simulations Using Modern Computational Accelerators

    PubMed Central

    Wang, Wei; Xu, Lifan; Cavazos, John; Huang, Howie H.; Kay, Matthew

    2014-01-01

    Recent developments in modern computational accelerators like Graphics Processing Units (GPUs) and coprocessors provide great opportunities for making scientific applications run faster than ever before. However, efficient parallelization of scientific code using new programming tools like CUDA requires a high level of expertise that is not available to many scientists. This, plus the fact that parallelized code is usually not portable to different architectures, creates major challenges for exploiting the full capabilities of modern computational accelerators. In this work, we sought to overcome these challenges by studying how to achieve both automated parallelization using OpenACC and enhanced portability using OpenCL. We applied our parallelization schemes using GPUs as well as Intel Many Integrated Core (MIC) coprocessor to reduce the run time of wave propagation simulations. We used a well-established 2D cardiac action potential model as a specific case-study. To the best of our knowledge, we are the first to study auto-parallelization of 2D cardiac wave propagation simulations using OpenACC. Our results identify several approaches that provide substantial speedups. The OpenACC-generated GPU code achieved more than speedup above the sequential implementation and required the addition of only a few OpenACC pragmas to the code. An OpenCL implementation provided speedups on GPUs of at least faster than the sequential implementation and faster than a parallelized OpenMP implementation. An implementation of OpenMP on Intel MIC coprocessor provided speedups of with only a few code changes to the sequential implementation. We highlight that OpenACC provides an automatic, efficient, and portable approach to achieve parallelization of 2D cardiac wave simulations on GPUs. Our approach of using OpenACC, OpenCL, and OpenMP to parallelize this particular model on modern computational accelerators should be applicable to other computational models of wave propagation in

  8. GPU-accelerated computation of electron transfer.

    PubMed

    Höfinger, Siegfried; Acocella, Angela; Pop, Sergiu C; Narumi, Tetsu; Yasuoka, Kenji; Beu, Titus; Zerbetto, Francesco

    2012-11-05

    Electron transfer is a fundamental process that can be studied with the help of computer simulation. The underlying quantum mechanical description renders the problem a computationally intensive application. In this study, we probe the graphics processing unit (GPU) for suitability to this type of problem. Time-critical components are identified via profiling of an existing implementation and several different variants are tested involving the GPU at increasing levels of abstraction. A publicly available library supporting basic linear algebra operations on the GPU turns out to accelerate the computation approximately 50-fold with minor dependence on actual problem size. The performance gain does not compromise numerical accuracy and is of significant value for practical purposes. Copyright © 2012 Wiley Periodicals, Inc.
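
    The roughly 50-fold speedup reported above came from routing the basic linear-algebra operations through a GPU library. The snippet below is a generic illustration of that pattern using CuPy (an assumption; it is not the library used in the paper, and the function names and matrix size are made up). It requires a CUDA-capable GPU.

      import numpy as np
      import cupy as cp  # any GPU linear-algebra wrapper would serve the same role

      def overlap_cpu(h, c):
          return h @ c                               # time-critical dense product on the CPU

      def overlap_gpu(h, c):
          h_d, c_d = cp.asarray(h), cp.asarray(c)    # host-to-device transfer
          return cp.asnumpy(h_d @ c_d)               # GEMM on the GPU, copy result back

      n = 2048
      h, c = np.random.rand(n, n), np.random.rand(n, n)
      assert np.allclose(overlap_cpu(h, c), overlap_gpu(h, c))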

  9. Strategic Control Algorithm Development: Volume 3. Strategic Algorithm Report.

    DOT National Transportation Integrated Search

    1974-08-01

    The strategic algorithm report presents a detailed description of the functional basic strategic control arrival algorithm. This description is independent of a particular computer or language. Contained in this discussion are the geometrical and env...

  10. Accelerating EPI distortion correction by utilizing a modern GPU-based parallel computation.

    PubMed

    Yang, Yao-Hao; Huang, Teng-Yi; Wang, Fu-Nien; Chuang, Tzu-Chao; Chen, Nan-Kuei

    2013-04-01

    The combination of phase demodulation and field mapping is a practical method to correct echo planar imaging (EPI) geometric distortion. However, since phase dispersion accumulates in each phase-encoding step, the calculation complexity of phase modulation is Ny-fold higher than conventional image reconstructions. Thus, correcting EPI images via phase demodulation is generally a time-consuming task. Parallel computing by employing general-purpose calculations on graphics processing units (GPU) can accelerate scientific computing if the algorithm is parallelized. This study proposes a method that incorporates the GPU-based technique into phase demodulation calculations to reduce computation time. The proposed parallel algorithm was applied to a PROPELLER-EPI diffusion tensor data set. The GPU-based phase demodulation method reduced the EPI distortion correctly, and accelerated the computation. The total reconstruction time of the 16-slice PROPELLER-EPI diffusion tensor images with matrix size of 128 × 128 was reduced from 1,754 seconds to 101 seconds by utilizing the parallelized 4-GPU program. GPU computing is a promising method to accelerate EPI geometric correction. The resulting reduction in computation time of phase demodulation should accelerate postprocessing for studies performed with EPI, and should effectuate the PROPELLER-EPI technique for clinical practice. Copyright © 2011 by the American Society of Neuroimaging.
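
    As a rough illustration of why phase demodulation costs Ny times a conventional reconstruction, the sketch below reconstructs one phase-encoding line at a time and removes the off-resonance phase that line accrued, in the spirit of conjugate-phase correction; the data layout, units, and sign convention are assumptions, not details from the paper.

      import numpy as np

      def conjugate_phase_epi(kspace, fieldmap_hz, echo_spacing_s):
          """Correct EPI distortion by per-line phase demodulation (a sketch).

          Line k of k-space is acquired k * echo_spacing_s seconds into the echo
          train, so off-resonance has imprinted a phase 2*pi*f*k*esp on it; the
          loop reconstructs one line at a time and removes that phase, which is
          why the cost is Ny times that of a single FFT reconstruction.
          """
          ny, nx = kspace.shape
          image = np.zeros((ny, nx), dtype=complex)
          for k in range(ny):                        # one pass per phase-encode step
              single = np.zeros_like(kspace)
              single[k, :] = kspace[k, :]
              phase = np.exp(-2j * np.pi * fieldmap_hz * k * echo_spacing_s)
              image += np.fft.ifft2(single) * phase  # demodulate this line's contribution
          return image

      # Synthetic example: 64 x 64 data, 0.5 ms echo spacing, field map in Hz.
      ks = np.fft.fft2(np.random.rand(64, 64))
      fmap = 20.0 * np.random.rand(64, 64)
      img = conjugate_phase_epi(ks, fmap, echo_spacing_s=5e-4)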

  11. Developing Strategic and Reasoning Abilities with Computer Games at Primary School Level

    ERIC Educational Resources Information Center

    Bottino, R. M.; Ferlino, L.; Ott, M.; Tavella, M.

    2007-01-01

    The paper reports a small-scale, long-term pilot project designed to foster strategic and reasoning abilities in young primary school pupils by engaging them in a number of computer games, mainly those usually called mind games (brainteasers, puzzlers, etc.). In this paper, the objectives, work methodology, experimental setting, and tools used in…

  12. GPU acceleration of Dock6's Amber scoring computation.

    PubMed

    Yang, Hailong; Zhou, Qiongqiong; Li, Bo; Wang, Yongjian; Luan, Zhongzhi; Qian, Depei; Li, Hanlu

    2010-01-01

    Addressing the problem of virtual screening is a long-term goal in the drug discovery field which, if properly solved, can significantly shorten new drugs' R&D cycle. The scoring functionality that evaluates the fitness of the docking result is one of the major challenges in virtual screening. In general, scoring functionality in docking requires a large amount of floating-point calculations, which usually takes several weeks or even months to finish. This time-consuming procedure is unacceptable, especially when a highly fatal and infectious virus such as SARS or H1N1 arises, which forces the scoring task to be done in a limited time. This paper presents how to leverage the computational power of the GPU to accelerate Dock6's (http://dock.compbio.ucsf.edu/DOCK_6/) Amber (J. Comput. Chem. 25: 1157-1174, 2004) scoring with the NVIDIA CUDA (Compute Unified Device Architecture) platform (NVIDIA Corporation Technical Staff, Compute Unified Device Architecture - Programming Guide, NVIDIA Corporation, 2008). We also discuss many factors that greatly influence performance after porting the Amber scoring to the GPU, including thread management, data transfer, and divergence hiding. Our experiments show that the GPU-accelerated Amber scoring achieves a 6.5× speedup with respect to the original version running on an AMD dual-core CPU for the same problem size. This acceleration makes the Amber scoring more competitive and efficient for large-scale virtual screening problems.

  13. Strategic cognitive sequencing: a computational cognitive neuroscience approach.

    PubMed

    Herd, Seth A; Krueger, Kai A; Kriete, Trenton E; Huang, Tsung-Ren; Hazy, Thomas E; O'Reilly, Randall C

    2013-01-01

    We address strategic cognitive sequencing, the "outer loop" of human cognition: how the brain decides what cognitive process to apply at a given moment to solve complex, multistep cognitive tasks. We argue that this topic has been neglected relative to its importance for systematic reasons but that recent work on how individual brain systems accomplish their computations has set the stage for productively addressing how brain regions coordinate over time to accomplish our most impressive thinking. We present four preliminary neural network models. The first addresses how the prefrontal cortex (PFC) and basal ganglia (BG) cooperate to perform trial-and-error learning of short sequences; the next, how several areas of PFC learn to make predictions of likely reward, and how this contributes to the BG making decisions at the level of strategies. The third model addresses how PFC, BG, parietal cortex, and hippocampus can work together to memorize sequences of cognitive actions from instruction (or "self-instruction"). The last shows how a constraint satisfaction process can find useful plans. The PFC maintains current and goal states and associates from both of these to find a "bridging" state, an abstract plan. We discuss how these processes could work together to produce strategic cognitive sequencing and discuss future directions in this area.

  14. Multi-GPU Jacobian accelerated computing for soft-field tomography.

    PubMed

    Borsic, A; Attardo, E A; Halter, R J

    2012-10-01

    Image reconstruction in soft-field tomography is based on an inverse problem formulation, where a forward model is fitted to the data. In medical applications, where the anatomy presents complex shapes, it is common to use finite element models (FEMs) to represent the volume of interest and solve a partial differential equation that models the physics of the system. Over the last decade, there has been a shifting interest from 2D modeling to 3D modeling, as the underlying physics of most problems are 3D. Although the increased computational power of modern computers allows working with much larger FEM models, the computational time required to reconstruct 3D images on a fine 3D FEM model can be significant, on the order of hours. For example, in electrical impedance tomography (EIT) applications using a dense 3D FEM mesh with half a million elements, a single reconstruction iteration takes approximately 15-20 min with optimized routines running on a modern multi-core PC. It is desirable to accelerate image reconstruction to enable researchers to more easily and rapidly explore data and reconstruction parameters. Furthermore, providing high-speed reconstructions is essential for some promising clinical application of EIT. For 3D problems, 70% of the computing time is spent building the Jacobian matrix, and 25% of the time in forward solving. In this work, we focus on accelerating the Jacobian computation by using single and multiple GPUs. First, we discuss an optimized implementation on a modern multi-core PC architecture and show how computing time is bounded by the CPU-to-memory bandwidth; this factor limits the rate at which data can be fetched by the CPU. Gains associated with the use of multiple CPU cores are minimal, since data operands cannot be fetched fast enough to saturate the processing power of even a single CPU core. GPUs have much faster memory bandwidths compared to CPUs and better parallelism. We are able to obtain acceleration factors of 20

  15. Multi-GPU Jacobian Accelerated Computing for Soft Field Tomography

    PubMed Central

    Borsic, A.; Attardo, E. A.; Halter, R. J.

    2012-01-01

    Image reconstruction in soft-field tomography is based on an inverse problem formulation, where a forward model is fitted to the data. In medical applications, where the anatomy presents complex shapes, it is common to use Finite Element Models to represent the volume of interest and to solve a partial differential equation that models the physics of the system. Over the last decade, there has been a shifting interest from 2D modeling to 3D modeling, as the underlying physics of most problems are three-dimensional. Though the increased computational power of modern computers allows working with much larger FEM models, the computational time required to reconstruct 3D images on a fine 3D FEM model can be significant, on the order of hours. For example, in Electrical Impedance Tomography applications using a dense 3D FEM mesh with half a million elements, a single reconstruction iteration takes approximately 15 to 20 minutes with optimized routines running on a modern multi-core PC. It is desirable to accelerate image reconstruction to enable researchers to more easily and rapidly explore data and reconstruction parameters. Further, providing high-speed reconstructions is essential for some promising clinical applications of EIT. For 3D problems, 70% of the computing time is spent building the Jacobian matrix, and 25% of the time in forward solving. In the present work, we focus on accelerating the Jacobian computation by using single and multiple GPUs. First, we discuss an optimized implementation on a modern multi-core PC architecture and show how computing time is bounded by the CPU-to-memory bandwidth; this factor limits the rate at which data can be fetched by the CPU. Gains associated with use of multiple CPU cores are minimal, since data operands cannot be fetched fast enough to saturate the processing power of even a single CPU core. GPUs have much faster memory bandwidths compared to CPUs and better parallelism. We are able to obtain acceleration factors of

  16. Accelerated construction

    DOT National Transportation Integrated Search

    2004-01-01

    Accelerated Construction Technology Transfer (ACTT) is a strategic process that uses various innovative techniques, strategies, and technologies to minimize actual construction time, while enhancing quality and safety on today's large, complex multip...

  17. Hardware accelerated high performance neutron transport computation based on AGENT methodology

    NASA Astrophysics Data System (ADS)

    Xiao, Shanjie

    The spatial heterogeneity of the next generation Gen-IV nuclear reactor core designs brings challenges to the neutron transport analysis. The Arbitrary Geometry Neutron Transport (AGENT) code is a three-dimensional neutron transport analysis code being developed at the Laboratory for Neutronics and Geometry Computation (NEGE) at Purdue University. It can accurately describe the spatial heterogeneity in a hierarchical structure through the R-function solid modeler. The previous version of AGENT coupled the 2D transport MOC solver and the 1D diffusion NEM solver to solve the three-dimensional Boltzmann transport equation. In this research, the 2D/1D coupling methodology was expanded to couple two transport solvers, the radial 2D MOC solver and the axial 1D MOC solver, for better accuracy. The expansion was benchmarked with the widely applied C5G7 benchmark models and two fast breeder reactor models, and showed good agreement with the reference Monte Carlo results. In practice, accurate neutron transport analysis for a full reactor core is still time-consuming, which limits its application. Therefore, another part of my research focused on designing specific hardware based on the reconfigurable computing technique in order to accelerate AGENT computations. This is the first time an application of this type has been used in reactor physics and neutron transport for reactor design. The most time-consuming part of the AGENT algorithm was identified. Moreover, the architecture of the AGENT acceleration system was designed based on the analysis. Through parallel computation on the specially designed, highly efficient architecture, the acceleration design on FPGA achieves high performance at a much lower working frequency than CPUs. The whole design simulations show that the acceleration design would be able to speed up large-scale AGENT computations about 20 times. The high performance AGENT acceleration system will drastically shorten the

  18. Rocky Mountain Research Station: 2008 Strategic Framework Update

    Treesearch

    Lane Eskew

    2009-01-01

    The Rocky Mountain Research Station's 2008 Strategic Framework Update is an addendum to the 2003 RMRS Strategic Framework. It focuses on critical natural resources research topics over the next five to 10 years when we will see continued, if not accelerated, socioeconomic and...

  19. Cloud Computing and Validated Learning for Accelerating Innovation in IoT

    ERIC Educational Resources Information Center

    Suciu, George; Todoran, Gyorgy; Vulpe, Alexandru; Suciu, Victor; Bulca, Cristina; Cheveresan, Romulus

    2015-01-01

    Innovation in Internet of Things (IoT) requires more than just creation of technology and use of cloud computing or big data platforms. It requires accelerated commercialization or aptly called go-to-market processes. To successfully accelerate, companies need a new type of product development, the so-called validated learning process.…

  20. Acceleration of FDTD mode solver by high-performance computing techniques.

    PubMed

    Han, Lin; Xi, Yanping; Huang, Wei-Ping

    2010-06-21

    A two-dimensional (2D) compact finite-difference time-domain (FDTD) mode solver is developed based on wave equation formalism in combination with the matrix pencil method (MPM). The method is validated for calculation of both real guided and complex leaky modes of typical optical waveguides against the benchmark finite-difference (FD) eigenmode solver. By taking advantage of the inherent parallel nature of the FDTD algorithm, the mode solver is implemented on graphics processing units (GPUs) using the compute unified device architecture (CUDA). It is demonstrated that the high-performance computing technique leads to significant acceleration of the FDTD mode solver, with more than a 30-fold improvement in computational efficiency in comparison with the conventional FDTD mode solver running on the CPU of a standard desktop computer. The computational efficiency of the accelerated FDTD method is of the same order of magnitude as that of the standard finite-difference eigenmode solver and yet requires much less memory (e.g., less than 10%). Therefore, the new method may serve as an efficient, accurate and robust tool for mode calculation of optical waveguides even when the conventional eigenvalue mode solvers are no longer applicable due to memory limitation.
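
    A wave-equation FDTD mode solver of the kind described above time-steps the field on the waveguide cross-section and then extracts modal frequencies from a recorded time series (here via the matrix pencil method). The sketch below shows only the explicit time-stepping and probe-recording part, on a toy permittivity map with arbitrary units; it is an illustration, not the authors' implementation.

      import numpy as np

      def fdtd_probe_signal(eps, steps=2000, dt=0.5, probe=(20, 20)):
          """Leapfrog update of the 2D scalar wave equation on a waveguide
          cross-section (relative permittivity map eps), recording the field at
          one probe point; the resulting time series would be handed to the
          matrix pencil method to extract modal frequencies."""
          u_prev, u = np.zeros_like(eps), np.zeros_like(eps)
          u[probe] = 1.0                                   # broadband initial excitation
          signal = np.empty(steps)
          for n in range(steps):
              lap = (np.roll(u, 1, 0) + np.roll(u, -1, 0) +
                     np.roll(u, 1, 1) + np.roll(u, -1, 1) - 4.0 * u)
              u_prev, u = u, 2.0 * u - u_prev + (dt**2 / eps) * lap
              signal[n] = u[probe]
          return signal

      eps = np.ones((64, 64))
      eps[24:40, 24:40] = 12.0      # a square high-index core in a low-index cladding
      trace = fdtd_probe_signal(eps)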

  1. Method for computationally efficient design of dielectric laser accelerator structures

    DOE PAGES

    Hughes, Tyler; Veronis, Georgios; Wootton, Kent P.; ...

    2017-06-22

    Here, dielectric microstructures have generated much interest in recent years as a means of accelerating charged particles when powered by solid state lasers. The acceleration gradient (or particle energy gain per unit length) is an important figure of merit. To design structures with high acceleration gradients, we explore the adjoint variable method, a highly efficient technique used to compute the sensitivity of an objective with respect to a large number of parameters. With this formalism, the sensitivity of the acceleration gradient of a dielectric structure with respect to its entire spatial permittivity distribution is calculated by the use of only two full-field electromagnetic simulations, the original and ‘adjoint’. The adjoint simulation corresponds physically to the reciprocal situation of a point charge moving through the accelerator gap and radiating. Using this formalism, we perform numerical optimizations aimed at maximizing acceleration gradients, which generate fabricable structures of greatly improved performance in comparison to previously examined geometries.
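
    The adjoint bookkeeping described above can be stated compactly: one forward solve (laser drive), one adjoint solve (the reciprocal point-charge source), and a per-pixel product of the two fields as the sensitivity. The sketch below captures only that structure; the solver callback, the sources, and the omitted physical prefactor are all assumptions for illustration.

      import numpy as np

      def adjoint_sensitivity(eps, simulate, fwd_source, adj_source):
          """Schematic of the adjoint-variable bookkeeping described above.

          simulate(eps, source) stands for a black-box frequency-domain field
          solver (hypothetical here) returning the complex field on the same
          grid as eps.  Two solves suffice regardless of how many permittivity
          pixels are optimized; the physical prefactor is omitted.
          """
          e_fwd = simulate(eps, fwd_source)    # field driven by the laser
          e_adj = simulate(eps, adj_source)    # field driven by the reciprocal source
          return np.real(e_fwd * e_adj)        # one sensitivity value per design pixel

      # Stand-in solver so the function can be exercised without an EM code;
      # a design loop would add a small multiple of this map to eps each step.
      demo_solver = lambda eps, src: np.exp(1j * src) * np.ones_like(eps, dtype=complex)
      grad = adjoint_sensitivity(np.ones((32, 32)), demo_solver, fwd_source=0.0, adj_source=0.5)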

  2. From experiment to design -- Fault characterization and detection in parallel computer systems using computational accelerators

    NASA Astrophysics Data System (ADS)

    Yim, Keun Soo

    This dissertation summarizes experimental validation and co-design studies conducted to optimize the fault detection capabilities and overheads in hybrid computer systems (e.g., using CPUs and Graphics Processing Units, or GPUs), and consequently to improve the scalability of parallel computer systems using computational accelerators. The experimental validation studies were conducted to help us understand the failure characteristics of CPU-GPU hybrid computer systems under various types of hardware faults. The main characterization targets were faults that are difficult to detect and/or recover from, e.g., faults that cause long latency failures (Ch. 3), faults in dynamically allocated resources (Ch. 4), faults in GPUs (Ch. 5), faults in MPI programs (Ch. 6), and microarchitecture-level faults with specific timing features (Ch. 7). The co-design studies were based on the characterization results. One of the co-designed systems has a set of source-to-source translators that customize and strategically place error detectors in the source code of target GPU programs (Ch. 5). Another co-designed system uses an extension card to learn the normal behavioral and semantic execution patterns of message-passing processes executing on CPUs, and to detect abnormal behaviors of those parallel processes (Ch. 6). The third co-designed system is a co-processor that has a set of new instructions in order to support software-implemented fault detection techniques (Ch. 7). The work described in this dissertation gains more importance because heterogeneous processors have become an essential component of state-of-the-art supercomputers. GPUs were used in three of the five fastest supercomputers that were operating in 2011. Our work included comprehensive fault characterization studies in CPU-GPU hybrid computers. In CPUs, we monitored the target systems for a long period of time after injecting faults (a temporally comprehensive experiment), and injected faults into various types of

  3. Effects of different computer typing speeds on acceleration and peak contact pressure of the fingertips during computer typing.

    PubMed

    Yoo, Won-Gyu

    2015-01-01

    [Purpose] This study showed the effects of different computer typing speeds on the acceleration and peak contact pressure of the fingertips during computer typing. [Subjects] Twenty-one male computer workers voluntarily consented to participate in this study. They consisted of 7 workers who could type 200-300 characters/minute, 7 workers who could type 300-400 characters/minute, and 7 workers who could type 400-500 characters/minute. [Methods] The acceleration and peak contact pressure of the fingertips were measured for the different typing speed groups using an accelerometer and a CONFORMat system. [Results] The fingertip contact pressure was increased in the high typing speed group compared with the low and medium typing speed groups. The fingertip acceleration was increased in the high typing speed group compared with the low and medium typing speed groups. [Conclusion] The results of the present study indicate that a fast typing speed causes continuous pressure stress to be applied to the fingers, thereby creating pain in the fingers.

  4. Computational screening of organic polymer dielectrics for novel accelerator technologies

    DOE PAGES

    Pilania, Ghanshyam; Weis, Eric; Walker, Ethan M.; ...

    2018-06-18

    The use of infrared lasers to power accelerating dielectric structures is a developing area of research. Within this technology, the choice of the dielectric material forming the accelerating structures, such as the photonic band gap (PBG) structures, is dictated by a range of interrelated factors including their dielectric and optical properties, amenability to photo-polymerization, thermochemical stability and other target performance metrics of the particle accelerator. In this direction, electronic structure theory aided computational screening and design of dielectric materials can play a key role in identifying potential candidate materials with the targeted functionalities to guide experimental synthetic efforts. In an attempt to systematically understand the role of chemistry in controlling the electronic structure and dielectric properties of organic polymeric materials, here we employ empirical screening and density functional theory (DFT) computations, as a part of our multi-step hierarchal screening strategy. Our DFT based analysis focused on the bandgap, dielectric permittivity, and frequency-dependent dielectric losses due to lattice absorption as key properties to down-select promising polymer motifs. In addition to the specific application of dielectric laser acceleration, the general methodology presented here is deemed to be valuable in the design of new insulators with an attractive combination of dielectric properties.
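
    The down-selection criteria named above (bandgap, dielectric permittivity, and lattice-absorption loss) translate naturally into a hierarchical screening filter. The sketch below is only an illustration of that filtering step; the property names, thresholds, and candidate values are placeholders, not numbers from the study.

      from dataclasses import dataclass

      @dataclass
      class PolymerCandidate:
          name: str
          bandgap_ev: float       # from the DFT electronic structure
          permittivity: float     # static dielectric constant
          loss_tangent: float     # lattice-absorption loss near the drive frequency

      def screen(candidates, min_gap=4.0, min_eps=2.0, max_loss=1e-3):
          """Keep only candidates that clear every property threshold
          (thresholds here are illustrative placeholders)."""
          return [c for c in candidates
                  if c.bandgap_ev >= min_gap
                  and c.permittivity >= min_eps
                  and c.loss_tangent <= max_loss]

      pool = [PolymerCandidate("polymer-A", 5.1, 2.4, 4e-4),
              PolymerCandidate("polymer-B", 3.2, 3.1, 2e-4),
              PolymerCandidate("polymer-C", 4.8, 2.2, 5e-3)]
      print([c.name for c in screen(pool)])    # -> ['polymer-A']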

  5. Particle Identification on an FPGA Accelerated Compute Platform for the LHCb Upgrade

    NASA Astrophysics Data System (ADS)

    Faerber, Christian; Schwemmer, Rainer; Machen, Jonathan; Neufeld, Niko

    2017-07-01

    The current LHCb readout system will be upgraded in 2018 to a “triggerless” readout of the entire detector at the Large Hadron Collider collision rate of 40 MHz. The corresponding bandwidth from the detector down to the foreseen dedicated computing farm (event filter farm), which acts as the trigger, has to be increased by a factor of almost 100 from currently 500 Gb/s up to 40 Tb/s. The event filter farm will preanalyze the data and will select the events on an event by event basis. This will reduce the bandwidth down to a manageable size to write the interesting physics data to tape. The design of such a system is a challenging task, which is why different new technologies are being considered and have to be investigated for the different parts of the system. For use in the event building farm or in the event filter farm (trigger), an experimental field programmable gate array (FPGA) accelerated computing platform is considered and therefore tested. FPGA compute accelerators are used more and more in standard servers, for example for Microsoft Bing search or Baidu search. The platform we use hosts a general Intel CPU and a high-performance FPGA linked via the high-speed Intel QuickPath Interconnect. An accelerator is implemented on the FPGA. It is very likely that these platforms, which are built, in general, for high-performance computing, are also very interesting for the high-energy physics community. First, the performance results of smaller test cases performed at the beginning are presented. Afterward, a part of the existing LHCb RICH particle identification is ported to the experimental FPGA accelerated platform and tested. We have compared the performance of the LHCb RICH particle identification running on a normal CPU with the performance of the same algorithm running on the Xeon-FPGA compute accelerator platform.

  6. A Quantitative Study of the Relationship between Leadership Practice and Strategic Intentions to Use Cloud Computing

    ERIC Educational Resources Information Center

    Castillo, Alan F.

    2014-01-01

    The purpose of this quantitative correlational cross-sectional research study was to examine a theoretical model consisting of leadership practice, attitudes of business process outsourcing, and strategic intentions of leaders to use cloud computing and to examine the relationships between each of the variables respectively. This study…

  7. Convergence acceleration of the Proteus computer code with multigrid methods

    NASA Technical Reports Server (NTRS)

    Demuren, A. O.; Ibraheem, S. O.

    1992-01-01

    Presented here is the first part of a study to implement convergence acceleration techniques based on the multigrid concept in the Proteus computer code. A review is given of previous studies on the implementation of multigrid methods in computer codes for compressible flow analysis. Also presented is a detailed stability analysis of upwind and central-difference based numerical schemes for solving the Euler and Navier-Stokes equations. Results are given of a convergence study of the Proteus code on computational grids of different sizes. The results presented here form the foundation for the implementation of multigrid methods in the Proteus code.

  8. Convergence acceleration of the Proteus computer code with multigrid methods

    NASA Technical Reports Server (NTRS)

    Demuren, A. O.; Ibraheem, S. O.

    1995-01-01

    This report presents the results of a study to implement convergence acceleration techniques based on the multigrid concept in the two-dimensional and three-dimensional versions of the Proteus computer code. The first section presents a review of the relevant literature on the implementation of the multigrid methods in computer codes for compressible flow analysis. The next two sections present detailed stability analysis of numerical schemes for solving the Euler and Navier-Stokes equations, based on conventional von Neumann analysis and the bi-grid analysis, respectively. The next section presents details of the computational method used in the Proteus computer code. Finally, the multigrid implementation and applications to several two-dimensional and three-dimensional test problems are presented. The results of the present study show that the multigrid method always leads to a reduction in the number of iterations (or time steps) required for convergence. However, there is an overhead associated with the use of multigrid acceleration. The overhead is higher in 2-D problems than in 3-D problems, thus overall multigrid savings in CPU time are in general better in the latter. Savings of about 40-50 percent are typical in 3-D problems, but they are about 20-30 percent in large 2-D problems. The present multigrid method is applicable to steady-state problems and is therefore ineffective in problems with inherently unstable solutions.
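
    To make the coarse-grid correction idea concrete, the sketch below applies a single two-grid cycle (the simplest multigrid building block) to a 1D Poisson model problem. It is an illustrative stand-in only: the Proteus work applies multigrid to the compressible Euler and Navier-Stokes equations with different smoothers and transfer operators, and the grid size, sweep counts and cycle count here are arbitrary assumptions.

```python
import numpy as np

def jacobi(u, f, h, sweeps=3, omega=2/3):
    """Weighted-Jacobi smoothing for -u'' = f with u(0) = u(1) = 0."""
    for _ in range(sweeps):
        u[1:-1] = (1-omega)*u[1:-1] + omega*0.5*(u[:-2] + u[2:] + h*h*f[1:-1])
    return u

def two_grid(u, f, h):
    """One two-grid cycle: pre-smooth, coarse-grid residual correction, post-smooth."""
    u = jacobi(u, f, h)
    r = np.zeros_like(u)
    r[1:-1] = f[1:-1] - (-u[:-2] + 2*u[1:-1] - u[2:]) / (h*h)   # fine-grid residual
    rc = r[::2].copy()                                          # restriction by injection
    nc, hc = rc.size, 2*h
    # exact coarse solve of -e'' = r using a small tridiagonal matrix
    A = (np.diag(2*np.ones(nc-2)) - np.diag(np.ones(nc-3), 1)
         - np.diag(np.ones(nc-3), -1)) / (hc*hc)
    ec = np.zeros(nc)
    ec[1:-1] = np.linalg.solve(A, rc[1:-1])
    # linear prolongation of the coarse correction back to the fine grid
    e = np.interp(np.linspace(0, 1, u.size), np.linspace(0, 1, nc), ec)
    return jacobi(u + e, f, h)

n = 129
x = np.linspace(0, 1, n); h = x[1] - x[0]
f = np.pi**2 * np.sin(np.pi*x)          # -u'' = f  has exact solution u = sin(pi x)
u = np.zeros(n)
for _ in range(10):                     # a few cycles drive the error down quickly
    u = two_grid(u, f, h)
print("max error vs. exact solution:", np.abs(u - np.sin(np.pi*x)).max())
```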

  9. A coarse-grid projection method for accelerating incompressible flow computations

    NASA Astrophysics Data System (ADS)

    San, Omer; Staples, Anne

    2011-11-01

    We present a coarse-grid projection (CGP) algorithm for accelerating incompressible flow computations, which is applicable to methods involving Poisson equations as incompressibility constraints. CGP methodology is a modular approach that facilitates data transfer with simple interpolations and uses black-box solvers for the Poisson and advection-diffusion equations in the flow solver. Here, we investigate a particular CGP method for the vorticity-stream function formulation that uses the full weighting operation for mapping from fine to coarse grids, the third-order Runge-Kutta method for time stepping, and finite differences for the spatial discretization. After solving the Poisson equation on a coarsened grid, bilinear interpolation is used to obtain the fine data for consequent time stepping on the full grid. We compute several benchmark flows: the Taylor-Green vortex, a vortex pair merging, a double shear layer, decaying turbulence and the Taylor-Green vortex on a distorted grid. In all cases we use either FFT-based or V-cycle multigrid linear-cost Poisson solvers. Reducing the number of degrees of freedom of the Poisson solver by powers of two accelerates these computations while, for the first level of coarsening, retaining the same level of accuracy in the fine resolution vorticity field.
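
    The essential CGP step, solving the Poisson equation on a coarsened grid and interpolating the result back to the fine grid, can be sketched in a few lines. This is a simplified illustration under assumed conditions (periodic domain, FFT Poisson solver, restriction by simple injection rather than the full-weighting operator used in the paper); it is not the authors' flow solver.

```python
import numpy as np

def poisson_fft(f, L=2*np.pi):
    """Spectral solve of laplacian(psi) = f on a periodic [0, L)^2 grid."""
    n = f.shape[0]
    k = 2*np.pi*np.fft.fftfreq(n, d=L/n)
    kx, ky = np.meshgrid(k, k, indexing="ij")
    k2 = kx**2 + ky**2
    k2[0, 0] = 1.0                       # avoid dividing the mean mode by zero
    psi_hat = np.fft.fft2(f) / (-k2)
    psi_hat[0, 0] = 0.0                  # pin the arbitrary mean of psi to zero
    return np.real(np.fft.ifft2(psi_hat))

def prolong_bilinear(c):
    """Periodic bilinear interpolation from an n x n grid to a 2n x 2n grid."""
    cx, cy = np.roll(c, -1, axis=0), np.roll(c, -1, axis=1)
    f = np.empty((2*c.shape[0], 2*c.shape[1]))
    f[0::2, 0::2] = c
    f[1::2, 0::2] = 0.5*(c + cx)
    f[0::2, 1::2] = 0.5*(c + cy)
    f[1::2, 1::2] = 0.25*(c + cx + cy + np.roll(cx, -1, axis=1))
    return f

def cgp_poisson(f_fine):
    """Coarse-grid projection: restrict the RHS, solve coarse, interpolate back."""
    psi_coarse = poisson_fft(f_fine[::2, ::2])
    return prolong_bilinear(psi_coarse)

n = 128
x = np.linspace(0, 2*np.pi, n, endpoint=False)
X, Y = np.meshgrid(x, x, indexing="ij")
f = -2.0*np.sin(X)*np.sin(Y)             # laplacian of sin(x)*sin(y)
err = cgp_poisson(f) - np.sin(X)*np.sin(Y)
print("rms error of the coarse-grid projected solution:", np.sqrt((err**2).mean()))
```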

  10. Strategic Control Algorithm Development : Volume 4A. Computer Program Report.

    DOT National Transportation Integrated Search

    1974-08-01

    A description of the strategic algorithm evaluation model is presented, both at the user and programmer levels. The model representation of an airport configuration, environmental considerations, the strategic control algorithm logic, and the airplan...

  11. Ultrasound window-modulated compounding Nakagami imaging: Resolution improvement and computational acceleration for liver characterization.

    PubMed

    Ma, Hsiang-Yang; Lin, Ying-Hsiu; Wang, Chiao-Yin; Chen, Chiung-Nien; Ho, Ming-Chih; Tsui, Po-Hsiang

    2016-08-01

    Ultrasound Nakagami imaging is an attractive method for visualizing changes in envelope statistics. Window-modulated compounding (WMC) Nakagami imaging was reported to improve image smoothness. The sliding window technique is typically used for constructing ultrasound parametric and Nakagami images. Using a large window overlap ratio may improve the WMC Nakagami image resolution but reduces computational efficiency. Therefore, the objectives of this study include: (i) exploring the effects of the window overlap ratio on the resolution and smoothness of WMC Nakagami images; (ii) proposing a fast algorithm that is based on the convolution operator (FACO) to accelerate WMC Nakagami imaging. Computer simulations and preliminary clinical tests on liver fibrosis samples (n=48) were performed to validate the FACO-based WMC Nakagami imaging. The results demonstrated that the width of the autocorrelation function and the parameter distribution of the WMC Nakagami image decrease as the window overlap ratio increases. One-pixel shifting (i.e., sliding the window on the image data in steps of one pixel for parametric imaging) as the maximum overlap ratio significantly improves the WMC Nakagami image quality. Concurrently, the proposed FACO method combined with a computational platform that optimizes the matrix computation can accelerate WMC Nakagami imaging, allowing the detection of liver fibrosis-induced changes in envelope statistics. FACO-accelerated WMC Nakagami imaging is a new-generation Nakagami imaging technique with improved image quality and fast computation.
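
    The core computation being accelerated, a sliding-window moment estimate of the Nakagami shape parameter, can be expressed with box-filter convolutions, which is what makes a convolution-operator formulation attractive. The sketch below is a generic moment-based m-parameter map on a synthetic Rayleigh envelope, not the authors' WMC/FACO pipeline; the window size and test data are arbitrary assumptions.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def nakagami_m_map(envelope, win=15):
    """Sliding-window Nakagami m-parameter map from box-filter (convolution) moments.
    With intensity I = envelope^2, the moment estimator is m = E[I]^2 / Var(I)."""
    I = envelope.astype(float) ** 2
    mean_I = uniform_filter(I, size=win)
    mean_I2 = uniform_filter(I * I, size=win)
    var_I = np.maximum(mean_I2 - mean_I**2, 1e-12)   # guard against division by zero
    return mean_I**2 / var_I

# synthetic Rayleigh-distributed envelope, for which the true m equals 1
rng = np.random.default_rng(0)
env = rng.rayleigh(scale=1.0, size=(256, 256))
m_map = nakagami_m_map(env, win=15)
print("median m estimate:", np.median(m_map))        # should be close to 1
```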

  12. Acceleration of Image Segmentation Algorithm for (Breast) Mammogram Images Using High-Performance Reconfigurable Dataflow Computers

    PubMed Central

    Filipovic, Nenad D.

    2017-01-01

    Image segmentation is one of the most common procedures in medical imaging applications. It is also a very important task in breast cancer detection. Breast cancer detection procedure based on mammography can be divided into several stages. The first stage is the extraction of the region of interest from a breast image, followed by the identification of suspicious mass regions, their classification, and comparison with the existing image database. It is often the case that already existing image databases have large sets of data whose processing requires a lot of time, and thus the acceleration of each of the processing stages in breast cancer detection is a very important issue. In this paper, the implementation of the already existing algorithm for region-of-interest based image segmentation for mammogram images on High-Performance Reconfigurable Dataflow Computers (HPRDCs) is proposed. As a dataflow engine (DFE) of such HPRDC, Maxeler's acceleration card is used. The experiments for examining the acceleration of that algorithm on the Reconfigurable Dataflow Computers (RDCs) are performed with two types of mammogram images with different resolutions. There were, also, several DFE configurations and each of them gave a different acceleration value of algorithm execution. Those acceleration values are presented and experimental results showed good acceleration. PMID:28611851

  13. Acceleration of Image Segmentation Algorithm for (Breast) Mammogram Images Using High-Performance Reconfigurable Dataflow Computers.

    PubMed

    Milankovic, Ivan L; Mijailovic, Nikola V; Filipovic, Nenad D; Peulic, Aleksandar S

    2017-01-01

    Image segmentation is one of the most common procedures in medical imaging applications. It is also a very important task in breast cancer detection. Breast cancer detection procedure based on mammography can be divided into several stages. The first stage is the extraction of the region of interest from a breast image, followed by the identification of suspicious mass regions, their classification, and comparison with the existing image database. It is often the case that already existing image databases have large sets of data whose processing requires a lot of time, and thus the acceleration of each of the processing stages in breast cancer detection is a very important issue. In this paper, the implementation of the already existing algorithm for region-of-interest based image segmentation for mammogram images on High-Performance Reconfigurable Dataflow Computers (HPRDCs) is proposed. As a dataflow engine (DFE) of such HPRDC, Maxeler's acceleration card is used. The experiments for examining the acceleration of that algorithm on the Reconfigurable Dataflow Computers (RDCs) are performed with two types of mammogram images with different resolutions. There were, also, several DFE configurations and each of them gave a different acceleration value of algorithm execution. Those acceleration values are presented and experimental results showed good acceleration.

  14. Quantum computational complexity, Einstein's equations and accelerated expansion of the Universe

    NASA Astrophysics Data System (ADS)

    Ge, Xian-Hui; Wang, Bin

    2018-02-01

    We study the relation between quantum computational complexity and general relativity. The quantum computational complexity is proposed to be quantified by the shortest length of geodesic quantum curves. We examine the complexity/volume duality in a geodesic causal ball in the framework of Fermi normal coordinates and derive the full non-linear Einstein equation. Using insights from the complexity/action duality, we argue that the accelerated expansion of the universe could be driven by the quantum complexity and free from coincidence and fine-tuning problems.

  15. Accelerating Spaceborne SAR Imaging Using Multiple CPU/GPU Deep Collaborative Computing.

    PubMed

    Zhang, Fan; Li, Guojun; Li, Wei; Hu, Wei; Hu, Yuxin

    2016-04-07

    With the development of synthetic aperture radar (SAR) technologies in recent years, the huge amount of remote sensing data brings challenges for real-time imaging processing. Therefore, high performance computing (HPC) methods have been presented to accelerate SAR imaging, especially the GPU based methods. In the classical GPU based imaging algorithm, GPU is employed to accelerate image processing by massive parallel computing, and CPU is only used to perform the auxiliary work such as data input/output (IO). However, the computing capability of CPU is ignored and underestimated. In this work, a new deep collaborative SAR imaging method based on multiple CPU/GPU is proposed to achieve real-time SAR imaging. Through the proposed tasks partitioning and scheduling strategy, the whole image can be generated with deep collaborative multiple CPU/GPU computing. In the part of CPU parallel imaging, the advanced vector extension (AVX) method is firstly introduced into the multi-core CPU parallel method for higher efficiency. As for the GPU parallel imaging, not only the bottlenecks of memory limitation and frequent data transferring are broken, but also kinds of optimized strategies are applied, such as streaming, parallel pipeline and so on. Experimental results demonstrate that the deep CPU/GPU collaborative imaging method enhances the efficiency of SAR imaging on single-core CPU by 270 times and realizes the real-time imaging in that the imaging rate outperforms the raw data generation rate.

  16. Accelerating Spaceborne SAR Imaging Using Multiple CPU/GPU Deep Collaborative Computing

    PubMed Central

    Zhang, Fan; Li, Guojun; Li, Wei; Hu, Wei; Hu, Yuxin

    2016-01-01

    With the development of synthetic aperture radar (SAR) technologies in recent years, the huge amount of remote sensing data brings challenges for real-time imaging processing. Therefore, high performance computing (HPC) methods have been presented to accelerate SAR imaging, especially the GPU based methods. In the classical GPU based imaging algorithm, GPU is employed to accelerate image processing by massive parallel computing, and CPU is only used to perform the auxiliary work such as data input/output (IO). However, the computing capability of CPU is ignored and underestimated. In this work, a new deep collaborative SAR imaging method based on multiple CPU/GPU is proposed to achieve real-time SAR imaging. Through the proposed tasks partitioning and scheduling strategy, the whole image can be generated with deep collaborative multiple CPU/GPU computing. In the part of CPU parallel imaging, the advanced vector extension (AVX) method is firstly introduced into the multi-core CPU parallel method for higher efficiency. As for the GPU parallel imaging, not only the bottlenecks of memory limitation and frequent data transferring are broken, but also kinds of optimized strategies are applied, such as streaming, parallel pipeline and so on. Experimental results demonstrate that the deep CPU/GPU collaborative imaging method enhances the efficiency of SAR imaging on single-core CPU by 270 times and realizes the real-time imaging in that the imaging rate outperforms the raw data generation rate. PMID:27070606

  17. Computational study of radiation doses at UNLV accelerator facility

    NASA Astrophysics Data System (ADS)

    Hodges, Matthew; Barzilov, Alexander; Chen, Yi-Tung; Lowe, Daniel

    2017-09-01

    A Varian K15 electron linear accelerator (linac) has been considered for installation at University of Nevada, Las Vegas (UNLV). Before experiments can be performed, it is necessary to evaluate the photon and neutron spectra as generated by the linac, as well as the resulting dose rates within the accelerator facility. A computational study using MCNPX was performed to characterize the source terms for the bremsstrahlung converter. The 15 MeV electron beam available in the linac is above the photoneutron threshold energy for several materials in the linac assembly, and as a result, neutrons must be accounted for. The angular and energy distributions for bremsstrahlung flux generated by the interaction of the 15 MeV electron beam with the linac target were determined. This source term was used in conjunction with the K15 collimators to determine the dose rates within the facility.

  18. Strategic research in the social sciences

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bainbridge, W.S.

    1995-12-31

    The federal government has identified a number of multi-agency funding initiatives for science in strategic areas, such as the initiatives on global environmental change and high performance computing, that give some role to the social sciences. Seven strategic areas for social science research are given with potential for federal funding: (1) Democratization. (2) Human Capital. (3) Administrative Science. (4) Cognitive Science. (5) High Performance Computing and Digital Libraries. (6) Human Dimensions of Environmental Change. and (7) Human Genetic Diversity. The first two are addressed in detail and the remainder as a group. 10 refs.

  19. Fast hydrological model calibration based on the heterogeneous parallel computing accelerated shuffled complex evolution method

    NASA Astrophysics Data System (ADS)

    Kan, Guangyuan; He, Xiaoyan; Ding, Liuqian; Li, Jiren; Hong, Yang; Zuo, Depeng; Ren, Minglei; Lei, Tianjie; Liang, Ke

    2018-01-01

    Hydrological model calibration has been a hot issue for decades. The shuffled complex evolution method developed at the University of Arizona (SCE-UA) has been proved to be an effective and robust optimization approach. However, its computational efficiency deteriorates significantly when the amount of hydrometeorological data increases. In recent years, the rise of heterogeneous parallel computing has brought hope for the acceleration of hydrological model calibration. This study proposed a parallel SCE-UA method and applied it to the calibration of a watershed rainfall-runoff model, the Xinanjiang model. The parallel method was implemented on heterogeneous computing systems using OpenMP and CUDA. Performance testing and sensitivity analysis were carried out to verify its correctness and efficiency. Comparison results indicated that heterogeneous parallel computing-accelerated SCE-UA converged much more quickly than the original serial version and possessed satisfactory accuracy and stability for the task of fast hydrological model calibration.
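
    The speedup in this kind of calibration comes largely from evaluating many candidate parameter sets concurrently. The sketch below shows only that embarrassingly parallel objective-evaluation step, applied to a toy one-parameter rainfall-runoff model using Python processes; it is not the SCE-UA algorithm itself, and the model, synthetic data and parameter range are made up for illustration (the paper's implementation uses OpenMP and CUDA).

```python
import numpy as np
from functools import partial
from concurrent.futures import ProcessPoolExecutor

def simulate(k, rain):
    """Toy linear-reservoir model: storage decays at rate k, runoff = k * storage."""
    s, q = 0.0, np.empty_like(rain)
    for i, r in enumerate(rain):
        s = s + r - k * s
        q[i] = k * s
    return q

def neg_nse(k, rain, obs):
    """Negative Nash-Sutcliffe efficiency, so that lower values are better."""
    q = simulate(k, rain)
    return np.sum((q - obs)**2) / np.sum((obs - obs.mean())**2) - 1.0

if __name__ == "__main__":
    rng = np.random.default_rng(42)
    rain = rng.gamma(2.0, 1.0, size=365)
    obs = simulate(0.3, rain) + rng.normal(0.0, 0.05, size=365)   # "observed" runoff
    candidates = np.linspace(0.05, 0.95, 64)        # one batch of candidate parameters
    cost = partial(neg_nse, rain=rain, obs=obs)
    with ProcessPoolExecutor() as pool:             # parallel objective evaluations
        costs = list(pool.map(cost, candidates))
    print("best k:", candidates[int(np.argmin(costs))])
```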

  20. Accelerating Technology Development through Integrated Computation and Experimentation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shekhawat, Dushyant; Srivastava, Rameshwar D.; Ciferno, Jared

    2013-08-15

    This special section of Energy & Fuels comprises a selection of papers presented at the topical conference “Accelerating Technology Development through Integrated Computation and Experimentation”, sponsored and organized by the United States Department of Energy’s National Energy Technology Laboratory (NETL) as part of the 2012 American Institute of Chemical Engineers (AIChE) Annual Meeting held in Pittsburgh, PA, Oct 28-Nov 2, 2012. That topical conference focused on the latest research and development efforts in five main areas related to fossil energy, with each area focusing on the utilization of both experimental and computational approaches: (1) gas separations (membranes, sorbents, and solvents for CO₂, H₂, and O₂ production), (2) CO₂ utilization (enhanced oil recovery, chemical production, mineralization, etc.), (3) carbon sequestration (flow in natural systems), (4) advanced power cycles (oxy-combustion, chemical looping, gasification, etc.), and (5) fuel processing (H₂ production for fuel cells).

  1. Strategic Control Algorithm Development : Volume 4B. Computer Program Report (Concluded)

    DOT National Transportation Integrated Search

    1974-08-01

    A description of the strategic algorithm evaluation model is presented, both at the user and programmer levels. The model representation of an airport configuration, environmental considerations, the strategic control algorithm logic, and the airplan...

  2. Computer Simulation in Mass Emergency and Disaster Response: An Evaluation of Its Effectiveness as a Tool for Demonstrating Strategic Competency in Emergency Department Medical Responders

    ERIC Educational Resources Information Center

    O'Reilly, Daniel J.

    2011-01-01

    This study examined the capability of computer simulation as a tool for assessing the strategic competency of emergency department nurses as they responded to authentically computer simulated biohazard-exposed patient case studies. Thirty registered nurses from a large, urban hospital completed a series of computer-simulated case studies of…

  3. Computed lateral rate and acceleration power spectral response of conventional and STOL airplanes to atmospheric turbulence

    NASA Technical Reports Server (NTRS)

    Lichtenstein, J. H.

    1975-01-01

    Power-spectral-density calculations were made of the lateral responses to atmospheric turbulence for several conventional and short take-off and landing (STOL) airplanes. The turbulence was modeled as three orthogonal velocity components, which were uncorrelated, and each was represented with a one-dimensional power spectrum. Power spectral densities were computed for displacements, rates, and accelerations in roll, yaw, and sideslip. In addition, the power spectral density of the transverse acceleration was computed. Evaluation of ride quality based on a specific ride quality criterion was also made. The results show that the STOL airplanes generally had larger values for the rate and acceleration power spectra (and, consequently, larger corresponding root-mean-square values) than the conventional airplanes. The ride quality criterion gave poorer ratings to the STOL airplanes than to the conventional airplanes.
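
    As a reminder of the underlying quantity, a one-sided power spectral density of a response time history integrates to its variance, so the root-mean-square values cited in such studies follow directly from the spectra. The snippet below computes a Welch PSD and the corresponding RMS for a synthetic, stand-in lateral-acceleration signal; the sampling rate, filter and noise level are assumptions and have no connection to the report's airplane data.

```python
import numpy as np
from scipy import signal

fs = 100.0                                   # assumed sampling rate, Hz
t = np.arange(0, 600, 1/fs)
rng = np.random.default_rng(5)
# crude stand-in for a turbulence-driven lateral acceleration: filtered white noise
a_lat = signal.lfilter([1.0], [1.0, -0.95], rng.normal(0, 0.1, t.size))

f, psd = signal.welch(a_lat, fs=fs, nperseg=4096)   # one-sided PSD, (m/s^2)^2 per Hz
rms = np.sqrt(np.trapz(psd, f))                     # RMS follows from the PSD integral
print(f"RMS lateral acceleration: {rms:.3f} m/s^2")
```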

  4. Real-time dose computation: GPU-accelerated source modeling and superposition/convolution

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jacques, Robert; Wong, John; Taylor, Russell

    Purpose: To accelerate dose calculation to interactive rates using highly parallel graphics processing units (GPUs). Methods: The authors have extended their prior work in GPU-accelerated superposition/convolution with a modern dual-source model and have enhanced performance. The primary source algorithm supports both focused leaf ends and asymmetric rounded leaf ends. The extra-focal algorithm uses a discretized, isotropic area source and models multileaf collimator leaf height effects. The spectral and attenuation effects of static beam modifiers were integrated into each source's spectral function. The authors introduce the concepts of arc superposition and delta superposition. Arc superposition utilizes separate angular sampling for the total energy released per unit mass (TERMA) and superposition computations to increase accuracy and performance. Delta superposition allows single beamlet changes to be computed efficiently. The authors extended their concept of multi-resolution superposition to include kernel tilting. Multi-resolution superposition approximates solid angle ray-tracing, improving performance and scalability with a minor loss in accuracy. Superposition/convolution was implemented using the inverse cumulative-cumulative kernel and exact radiological path ray-tracing. The accuracy analyses were performed using multiple kernel ray samplings, both with and without kernel tilting and multi-resolution superposition. Results: Source model performance was <9 ms (data dependent) for a high resolution (400²) field using an NVIDIA (Santa Clara, CA) GeForce GTX 280. Computation of the physically correct multispectral TERMA attenuation was improved by a material centric approach, which increased performance by over 80%. Superposition performance was improved by approximately 24% to 0.058 and 0.94 s for 64³ and 128³ water phantoms; a speed-up of 101-144x over the highly optimized Pinnacle³ (Philips, Madison, WI) implementation.

  5. GeauxDock: Accelerating Structure-Based Virtual Screening with Heterogeneous Computing

    PubMed Central

    Fang, Ye; Ding, Yun; Feinstein, Wei P.; Koppelman, David M.; Moreno, Juana; Jarrell, Mark; Ramanujam, J.; Brylinski, Michal

    2016-01-01

    Computational modeling of drug binding to proteins is an integral component of direct drug design. Particularly, structure-based virtual screening is often used to perform large-scale modeling of putative associations between small organic molecules and their pharmacologically relevant protein targets. Because of a large number of drug candidates to be evaluated, an accurate and fast docking engine is a critical element of virtual screening. Consequently, highly optimized docking codes are of paramount importance for the effectiveness of virtual screening methods. In this communication, we describe the implementation, tuning and performance characteristics of GeauxDock, a recently developed molecular docking program. GeauxDock is built upon the Monte Carlo algorithm and features a novel scoring function combining physics-based energy terms with statistical and knowledge-based potentials. Developed specifically for heterogeneous computing platforms, the current version of GeauxDock can be deployed on modern, multi-core Central Processing Units (CPUs) as well as massively parallel accelerators, Intel Xeon Phi and NVIDIA Graphics Processing Unit (GPU). First, we carried out a thorough performance tuning of the high-level framework and the docking kernel to produce a fast serial code, which was then ported to shared-memory multi-core CPUs yielding a near-ideal scaling. Further, using Xeon Phi gives 1.9× performance improvement over a dual 10-core Xeon CPU, whereas the best GPU accelerator, GeForce GTX 980, achieves a speedup as high as 3.5×. On that account, GeauxDock can take advantage of modern heterogeneous architectures to considerably accelerate structure-based virtual screening applications. GeauxDock is open-sourced and publicly available at www.brylinski.org/geauxdock and https://figshare.com/articles/geauxdock_tar_gz/3205249. PMID:27420300

  6. GeauxDock: Accelerating Structure-Based Virtual Screening with Heterogeneous Computing.

    PubMed

    Fang, Ye; Ding, Yun; Feinstein, Wei P; Koppelman, David M; Moreno, Juana; Jarrell, Mark; Ramanujam, J; Brylinski, Michal

    2016-01-01

    Computational modeling of drug binding to proteins is an integral component of direct drug design. Particularly, structure-based virtual screening is often used to perform large-scale modeling of putative associations between small organic molecules and their pharmacologically relevant protein targets. Because of a large number of drug candidates to be evaluated, an accurate and fast docking engine is a critical element of virtual screening. Consequently, highly optimized docking codes are of paramount importance for the effectiveness of virtual screening methods. In this communication, we describe the implementation, tuning and performance characteristics of GeauxDock, a recently developed molecular docking program. GeauxDock is built upon the Monte Carlo algorithm and features a novel scoring function combining physics-based energy terms with statistical and knowledge-based potentials. Developed specifically for heterogeneous computing platforms, the current version of GeauxDock can be deployed on modern, multi-core Central Processing Units (CPUs) as well as massively parallel accelerators, Intel Xeon Phi and NVIDIA Graphics Processing Unit (GPU). First, we carried out a thorough performance tuning of the high-level framework and the docking kernel to produce a fast serial code, which was then ported to shared-memory multi-core CPUs yielding a near-ideal scaling. Further, using Xeon Phi gives 1.9× performance improvement over a dual 10-core Xeon CPU, whereas the best GPU accelerator, GeForce GTX 980, achieves a speedup as high as 3.5×. On that account, GeauxDock can take advantage of modern heterogeneous architectures to considerably accelerate structure-based virtual screening applications. GeauxDock is open-sourced and publicly available at www.brylinski.org/geauxdock and https://figshare.com/articles/geauxdock_tar_gz/3205249.

  7. Combining Acceleration Techniques for Low-Dose X-Ray Cone Beam Computed Tomography Image Reconstruction.

    PubMed

    Huang, Hsuan-Ming; Hsiao, Ing-Tsung

    2017-01-01

    Over the past decade, image quality in low-dose computed tomography has been greatly improved by various compressive sensing- (CS-) based reconstruction methods. However, these methods have some disadvantages including high computational cost and slow convergence rate. Many different speed-up techniques for CS-based reconstruction algorithms have been developed. The purpose of this paper is to propose a fast reconstruction framework that combines a CS-based reconstruction algorithm with several speed-up techniques. First, total difference minimization (TDM) was implemented using the soft-threshold filtering (STF). Second, we combined TDM-STF with the ordered subsets transmission (OSTR) algorithm for accelerating the convergence. To further speed up the convergence of the proposed method, we applied the power factor and the fast iterative shrinkage thresholding algorithm to OSTR and TDM-STF, respectively. Results obtained from simulation and phantom studies showed that many speed-up techniques could be combined to greatly improve the convergence speed of a CS-based reconstruction algorithm. More importantly, the increased computation time (≤10%) was minor as compared to the acceleration provided by the proposed method. In this paper, we have presented a CS-based reconstruction framework that combines several acceleration techniques. Both simulation and phantom studies provide evidence that the proposed method has the potential to satisfy the requirement of fast image reconstruction in practical CT.
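
    Two of the speed-up ingredients mentioned, soft-threshold filtering and FISTA-style momentum, are easy to show in isolation. The sketch below applies them to a generic l1-regularized least-squares toy problem rather than to ordered-subsets CT transmission reconstruction, so the operator A, step size and regularization weight are placeholders and not the paper's settings.

```python
import numpy as np

def soft_threshold(x, t):
    """Elementwise soft-thresholding, the proximal operator of the l1 norm."""
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def fista(A, b, lam=0.1, n_iter=200):
    """FISTA for min_x 0.5*||Ax - b||^2 + lam*||x||_1; the momentum term in y
    is what accelerates plain iterative soft-thresholding."""
    L = np.linalg.norm(A, 2) ** 2                 # Lipschitz constant of the gradient
    x = np.zeros(A.shape[1]); y = x.copy(); t = 1.0
    for _ in range(n_iter):
        grad = A.T @ (A @ y - b)
        x_new = soft_threshold(y - grad / L, lam / L)
        t_new = 0.5 * (1.0 + np.sqrt(1.0 + 4.0 * t * t))
        y = x_new + ((t - 1.0) / t_new) * (x_new - x)   # momentum step
        x, t = x_new, t_new
    return x

rng = np.random.default_rng(1)
A = rng.standard_normal((80, 200))
x_true = np.zeros(200); x_true[:5] = 3.0           # sparse ground truth
b = A @ x_true
print("recovered support:", np.nonzero(np.abs(fista(A, b)) > 1.0)[0])
```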

  8. Multiple-grid convergence acceleration of viscous and inviscid flow computations

    NASA Technical Reports Server (NTRS)

    Johnson, G. M.

    1983-01-01

    A multiple-grid algorithm for use in efficiently obtaining steady solution to the Euler and Navier-Stokes equations is presented. The convergence of a simple, explicit fine-grid solution procedure is accelerated on a sequence of successively coarser grids by a coarse-grid information propagation method which rapidly eliminates transients from the computational domain. This use of multiple-gridding to increase the convergence rate results in substantially reduced work requirements for the numerical solution of a wide range of flow problems. Computational results are presented for subsonic and transonic inviscid flows and for laminar and turbulent, attached and separated, subsonic viscous flows. Work reduction factors as large as eight, in comparison to the basic fine-grid algorithm, were obtained. Possibilities for further performance improvement are discussed.

  9. Linking Student Engagement and Strategic Initiatives: Using NSSE Results to Inform Campus Action

    ERIC Educational Resources Information Center

    Doherty, Kathryn

    2007-01-01

    Towson University (TU) is in a period of growth in both students and facilities. To guide this growth, TU relies on its strategic plan, Towson 2010, to focus its strategic decisions through 2010. Release of the National Survey of Student Engagement (NSSE) data for 2005 coincided with a call for academic excellence and accelerated growth at Towson…

  10. Acceleration of Radiance for Lighting Simulation by Using Parallel Computing with OpenCL

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zuo, Wangda; McNeil, Andrew; Wetter, Michael

    2011-09-06

    We report on the acceleration of annual daylighting simulations for fenestration systems in the Radiance ray-tracing program. The algorithm was optimized to reduce both the redundant data input/output operations and the floating-point operations. To further accelerate the simulation speed, the calculation for matrix multiplications was implemented using parallel computing on a graphics processing unit. We used OpenCL, which is a cross-platform parallel programming language. Numerical experiments show that the combination of the above measures can speed up the annual daylighting simulations 101.7 times or 28.6 times when the sky vector has 146 or 2306 elements, respectively.
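
    The matrix-multiplication stage being accelerated corresponds to the three-phase relation in which annual illuminance results come from chaining a view matrix, a transmission (BSDF) matrix, a daylight matrix, and the hourly sky vectors. The numpy sketch below uses made-up matrix sizes and random data purely to show why that stage is dense, reusable and well suited to GPU/OpenCL off-loading; it is not Radiance code and does not use Radiance's actual matrix dimensions.

```python
import numpy as np

# Illustrative dimensions only (assumptions, not Radiance's real sizes):
n_sensors, n_window, n_sky, n_hours = 100, 145, 146, 8760

rng = np.random.default_rng(3)
V = rng.random((n_sensors, n_window))   # view matrix: window patches -> sensor points
T = rng.random((n_window, n_window))    # transmission (BSDF) matrix of the fenestration
D = rng.random((n_window, n_sky))       # daylight matrix: sky patches -> window patches
S = rng.random((n_sky, n_hours))        # one sky vector per hour of the year

# One way to compute annual illuminance: multiply through the chain for all hours.
E_naive = V @ (T @ (D @ S))

# Precompute the hour-independent product once and reuse it for every sky vector;
# this dense multiplication is the stage that benefits most from parallel hardware.
VTD = V @ T @ D
E_fast = VTD @ S

print(np.allclose(E_naive, E_fast))     # identical annual illuminance results
```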

  11. Acceleration of Cherenkov angle reconstruction with the new Intel Xeon/FPGA compute platform for the particle identification in the LHCb Upgrade

    NASA Astrophysics Data System (ADS)

    Faerber, Christian

    2017-10-01

    The LHCb experiment at the LHC will upgrade its detector by 2018/2019 to a ‘triggerless’ readout scheme, where all the readout electronics and several sub-detector parts will be replaced. The new readout electronics will be able to read out the detector at 40 MHz. This increases the data bandwidth from the detector down to the Event Filter farm to 40 TBit/s, which also has to be processed to select the interesting proton-proton collisions for later storage. The architecture of such a computing farm, which can process this amount of data as efficiently as possible, is a challenging task, and several compute accelerator technologies are being considered for use inside the new Event Filter farm. In the high performance computing sector more and more FPGA compute accelerators are used to improve the compute performance and reduce the power consumption (e.g. in the Microsoft Catapult project and Bing search engine). Also for the LHCb upgrade the usage of an experimental FPGA accelerated computing platform in the Event Building or in the Event Filter farm is being considered and therefore tested. This platform from Intel hosts a general CPU and a high performance FPGA linked via a high-speed link, which for this platform is a QPI link. On the FPGA an accelerator is implemented. The system used is a two-socket platform from Intel with a Xeon CPU and an FPGA. The FPGA has cache-coherent memory access to the main memory of the server and can collaborate with the CPU. As a first step, a compute-intensive algorithm to reconstruct Cherenkov angles for the LHCb RICH particle identification was successfully ported in Verilog to the Intel Xeon/FPGA platform and accelerated by a factor of 35. The same algorithm was ported to the Intel Xeon/FPGA platform with OpenCL. The implementation work and the performance will be compared. Another FPGA accelerator, the Nallatech 385 PCIe accelerator with the same Stratix V FPGA, was also tested for performance. The results show that the Intel

  12. National Strategic Computing Initiative Strategic Plan

    DTIC Science & Technology

    2016-07-01

    Indexed excerpts from the plan's appendix of related efforts: Big Data (https://www.nitrd.gov/nitrdgroups/index.php?title=Big_Data_(BD_SSG)); National Nanotechnology Initiative (http://www.nano.gov); Precision... While not limited to neuromorphic technologies, the National Nanotechnology Initiative's first Grand Challenge seeks to achieve brain...

  13. Computing Principal Eigenvectors of Large Web Graphs: Algorithms and Accelerations Related to PageRank and HITS

    ERIC Educational Resources Information Center

    Nagasinghe, Iranga

    2010-01-01

    This thesis investigates and develops a few acceleration techniques for the search engine algorithms used in PageRank and HITS computations. PageRank and HITS methods are two highly successful applications of modern Linear Algebra in computer science and engineering. They constitute essential technologies that account for the immense growth and…
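
    For context, the baseline that such acceleration techniques improve upon is the power iteration, shown below for a tiny dense adjacency matrix. This is the textbook formulation, not the thesis' accelerated variants, and the damping factor and tolerance are conventional assumptions.

```python
import numpy as np

def pagerank(adj, d=0.85, tol=1e-10, max_iter=1000):
    """Basic power iteration for PageRank on a dense adjacency matrix."""
    n = adj.shape[0]
    out_deg = adj.sum(axis=1)
    # column-stochastic transition matrix; dangling nodes spread mass uniformly
    P = np.where(out_deg[:, None] > 0,
                 adj / np.maximum(out_deg[:, None], 1), 1.0 / n).T
    r = np.full(n, 1.0 / n)
    for _ in range(max_iter):
        r_new = d * P @ r + (1 - d) / n
        if np.abs(r_new - r).sum() < tol:
            break
        r = r_new
    return r

adj = np.array([[0, 1, 1, 0],
                [0, 0, 1, 0],
                [1, 0, 0, 0],
                [0, 0, 1, 0]], dtype=float)
print(pagerank(adj))        # PageRank scores, summing to 1
```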

  14. Acceleration of color computer-generated hologram from three-dimensional scenes with texture and depth information

    NASA Astrophysics Data System (ADS)

    Shimobaba, Tomoyoshi; Kakue, Takashi; Ito, Tomoyoshi

    2014-06-01

    We propose acceleration of color computer-generated holograms (CGHs) from three-dimensional (3D) scenes that are expressed as texture (RGB) and depth (D) images. These images are obtained by 3D graphics libraries and RGB-D cameras: for example, OpenGL and Kinect, respectively. We can regard them as two-dimensional (2D) cross-sectional images along the depth direction. The generation of CGHs from the 2D cross-sectional images requires multiple diffraction calculations. If we use convolution-based diffraction such as the angular spectrum method, the diffraction calculation takes a long time and requires large memory usage because the convolution diffraction calculation requires the expansion of the 2D cross-sectional images to avoid the wraparound noise. In this paper, we first describe the acceleration of the diffraction calculation using "Band-limited double-step Fresnel diffraction," which does not require the expansion. Next, we describe color CGH acceleration using color space conversion. In general, color CGHs are generated in RGB color space; however, we need to repeat the same calculation for each color component, so the computational burden of color CGH generation increases three-fold compared with monochrome CGH generation. We can reduce the computational burden by using YCbCr color space because the 2D cross-sectional images in YCbCr color space can be down-sampled without impairing the image quality.
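
    The color-space idea is easy to illustrate on its own: convert RGB to YCbCr, keep luma at full resolution, and down-sample the chroma channels before the expensive per-channel diffraction calculations. The snippet below shows only that conversion and down-sampling step, using standard BT.601 coefficients (an assumption; the abstract does not specify them), and leaves the hologram computation itself out.

```python
import numpy as np

def rgb_to_ycbcr(rgb):
    """ITU-R BT.601 full-range RGB -> YCbCr conversion for values in [0, 1]."""
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    y  =  0.299 * r + 0.587 * g + 0.114 * b
    cb = -0.168736 * r - 0.331264 * g + 0.5 * b + 0.5
    cr =  0.5 * r - 0.418688 * g - 0.081312 * b + 0.5
    return y, cb, cr

def downsample2(c):
    """2x2 block averaging of a chroma channel, reducing the number of
    diffraction calculations needed for Cb and Cr."""
    return 0.25 * (c[0::2, 0::2] + c[1::2, 0::2] + c[0::2, 1::2] + c[1::2, 1::2])

rgb = np.random.default_rng(4).random((256, 256, 3))
y, cb, cr = rgb_to_ycbcr(rgb)
cb_small, cr_small = downsample2(cb), downsample2(cr)
print(y.shape, cb_small.shape, cr_small.shape)   # (256, 256) (128, 128) (128, 128)
```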

  15. Experimental, Theoretical and Computational Studies of Plasma-Based Concepts for Future High Energy Accelerators

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Joshi, Chan; Mori, W.

    2013-10-21

    This is the final report on DOE grant number DE-FG02-92ER40727, titled “Experimental, Theoretical and Computational Studies of Plasma-Based Concepts for Future High Energy Accelerators.” During this grant period the UCLA program on Advanced Plasma Based Accelerators, headed by Professor C. Joshi, has made many key scientific advances and trained a generation of students, many of whom have stayed in this research field and even started research programs of their own. In this final report, however, we focus on the last three years of the grant and report on the scientific progress made in each of the four tasks listed under this grant: (1) Plasma Wakefield Accelerator Research at FACET, SLAC National Accelerator Laboratory; (2) In-House Research at UCLA’s Neptune and 20 TW Laser Laboratories; (3) Laser-Wakefield Acceleration (LWFA) in the Self-Guided Regime: Experiments at the Callisto Laser at LLNL; and (4) Theory and Simulations. Major scientific results have been obtained in each of the four tasks described in this report. These have led to publications in prestigious scientific journals, the graduation and continued training of high-quality Ph.D. students, and have kept the U.S. at the forefront of the plasma-based accelerator research field.

  16. Establishing a nursing strategic agenda: the whys and wherefores.

    PubMed

    Young, Claire

    2008-01-01

    The health system nursing leader is responsible for providing high-quality, service-oriented nursing care; for delivering such care with disciplined cost management; for leading and developing a group of nursing executives and managers at the facility level; for establishing nursing professional development programs; for building and maintaining an effective supply of nurses; and for advocating for nurses and patients. Balancing these imperatives requires thoughtful strategic planning and disciplined execution. In their absence, organizations flounder, addressing single problems in isolation and struggling to perform against outcomes. One organization approached the challenge by engaging in a comprehensive, accelerated strategic planning process. The experience brought together 11 hospital nursing executives in consensus around a prioritized strategic agenda. This article is a case study of the approach used to define a nursing agenda.

  17. Accelerated computer generated holography using sparse bases in the STFT domain.

    PubMed

    Blinder, David; Schelkens, Peter

    2018-01-22

    Computer-generated holography at high resolutions is a computationally intensive task. Efficient algorithms are needed to generate holograms at acceptable speeds, especially for real-time and interactive applications such as holographic displays. We propose a novel technique to generate holograms using a sparse basis representation in the short-time Fourier space combined with a wavefront-recording plane placed in the middle of the 3D object. By computing the point spread functions in the transform domain, we update only a small subset of the precomputed largest-magnitude coefficients to significantly accelerate the algorithm over conventional look-up table methods. We implement the algorithm on a GPU, and report a speedup factor of over 30. We show that this transform is superior to wavelet-based approaches, and show quantitative and qualitative improvements over the state-of-the-art WASABI method; we report accuracy gains of 2 dB PSNR, as well as improved view preservation.

  18. Computing Nash equilibria through computational intelligence methods

    NASA Astrophysics Data System (ADS)

    Pavlidis, N. G.; Parsopoulos, K. E.; Vrahatis, M. N.

    2005-03-01

    Nash equilibrium constitutes a central solution concept in game theory. The task of detecting the Nash equilibria of a finite strategic game remains a challenging problem to date. This paper investigates the effectiveness of three computational intelligence techniques, namely covariance matrix adaptation evolution strategies, particle swarm optimization, and differential evolution, to compute Nash equilibria of finite strategic games as global minima of a real-valued, nonnegative function. An issue of particular interest is detecting more than one Nash equilibrium of a game. The performance of the considered computational intelligence methods on this problem is investigated using multistart and deflection.
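
    A minimal version of the formulation described, Nash equilibria as global minima of a nonnegative function, can be sketched with a best-response "regret" function and an off-the-shelf evolutionary optimizer. The example below uses differential evolution from SciPy on the matching-pennies bimatrix game; the softmax parameterization of mixed strategies and the choice of game are illustrative assumptions, and the paper's CMA-ES/PSO variants, multistart and deflection are not reproduced.

```python
import numpy as np
from scipy.optimize import differential_evolution

A = np.array([[1.0, -1.0], [-1.0, 1.0]])   # matching pennies: row player payoffs
B = -A                                      # column player payoffs (zero-sum game)

def simplex(z):
    """Map unconstrained variables to a mixed strategy via softmax."""
    e = np.exp(z - z.max())
    return e / e.sum()

def regret(v):
    """Nonnegative function whose global minima (value 0) are Nash equilibria."""
    x, y = simplex(v[:2]), simplex(v[2:])
    rx = np.max(A @ y) - x @ A @ y          # row player's best-response regret
    ry = np.max(x @ B) - x @ B @ y          # column player's best-response regret
    return rx + ry

res = differential_evolution(regret, bounds=[(-5, 5)] * 4, seed=0, tol=1e-10)
print("x* =", simplex(res.x[:2]), " y* =", simplex(res.x[2:]), " f =", res.fun)
```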

  19. GPU-accelerated Lattice Boltzmann method for anatomical extraction in patient-specific computational hemodynamics

    NASA Astrophysics Data System (ADS)

    Yu, H.; Wang, Z.; Zhang, C.; Chen, N.; Zhao, Y.; Sawchuk, A. P.; Dalsing, M. C.; Teague, S. D.; Cheng, Y.

    2014-11-01

    Existing research on patient-specific computational hemodynamics (PSCH) relies heavily on software for anatomical extraction of blood arteries. Data reconstruction and mesh generation have to be done using existing commercial software due to the gap between medical image processing and CFD, which increases the computational burden and introduces inaccuracy during data transformation, thus limiting the medical applications of PSCH. We use the lattice Boltzmann method (LBM) to solve the level-set equation over an Eulerian distance field and implicitly and dynamically segment the artery surfaces from radiological CT/MRI imaging data. The segments feed seamlessly into the LBM-based CFD computation of PSCH, so explicit mesh construction and extra data management are avoided. The LBM is ideally suited for GPU (graphics processing unit)-based parallel computing. The parallel acceleration over GPU achieves excellent performance in PSCH computation. An application study is presented which segments an aortic artery from a chest CT dataset and models PSCH of the segmented artery.

  20. Product-market differentiation: a strategic planning model for community hospitals.

    PubMed

    Milch, R A

    1980-01-01

    Community hospitals would seem to have every reason to identify and capitalize on their product-market strengths. The strategic marketing/planning model provides a framework for rational analysis of the community hospital dilemma and for developing sensible solutions to the complex problems of accelerating hospital price-inflation.

  1. Accelerating phylogenetics computing on the desktop: experiments with executing UPGMA in programmable logic.

    PubMed

    Davis, J P; Akella, S; Waddell, P H

    2004-01-01

    Having greater computational power on the desktop for processing taxa data sets has been a dream of biologists/statisticians involved in phylogenetics data analysis. Many existing algorithms have been highly optimized; one example being Felsenstein's PHYLIP code, written in C, for UPGMA and neighbor joining algorithms. However, the ability to process more than a few tens of taxa in a reasonable amount of time using conventional computers has not yielded a satisfactory speedup in data processing, making it difficult for phylogenetics practitioners to quickly explore data sets, such as might be done from a laptop computer. We discuss the application of custom computing techniques to phylogenetics. In particular, we apply this technology to speed up UPGMA algorithm execution by a factor of a hundred, against that of PHYLIP code running on the same PC. We report on these experiments and discuss how custom computing techniques can be used to not only accelerate phylogenetics algorithm performance on the desktop, but also on larger, high-performance computing engines, thus enabling the high-speed processing of data sets involving thousands of taxa.
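
    For readers unfamiliar with the algorithm being accelerated, UPGMA itself is short: repeatedly merge the closest pair of clusters and recompute size-weighted average-linkage distances. The pure-Python/numpy sketch below is a reference implementation on a toy distance matrix; it says nothing about the FPGA/custom-computing implementation in the paper, and the brute-force pair scan it uses is exactly the cost that such hardware targets.

```python
import numpy as np

def upgma(D, names):
    """Minimal UPGMA: merge the closest pair, update distances by weighted averages."""
    D = D.astype(float).copy()
    clusters = [(name, 1) for name in names]        # (label, number of taxa in cluster)
    while len(clusters) > 1:
        n = len(clusters)
        i, j = min(((a, b) for a in range(n) for b in range(a + 1, n)),
                   key=lambda ij: D[ij[0], ij[1]])  # closest pair of clusters
        (la, sa), (lb, sb) = clusters[i], clusters[j]
        merged = (f"({la},{lb})", sa + sb)
        d_new = (sa * D[i] + sb * D[j]) / (sa + sb)  # average-linkage distances
        keep = [k for k in range(n) if k not in (i, j)]
        D = np.vstack([D[keep][:, keep], d_new[keep]])
        D = np.hstack([D, np.append(d_new[keep], 0.0)[:, None]])
        clusters = [clusters[k] for k in keep] + [merged]
    return clusters[0][0]

names = ["A", "B", "C", "D"]
D = np.array([[0, 2, 6, 6],
              [2, 0, 6, 6],
              [6, 6, 0, 4],
              [6, 6, 4, 0]], dtype=float)
print(upgma(D, names))    # expected grouping: ((A,B),(C,D))
```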

  2. Computer Assistance in Information Work. Part I: Conceptual Framework for Improving the Computer/User Interface in Information Work. Part II: Catalog of Acceleration, Augmentation, and Delegation Functions in Information Work.

    ERIC Educational Resources Information Center

    Paisley, William; Butler, Matilda

    This study of the computer/user interface investigated the role of the computer in performing information tasks that users now perform without computer assistance. Users' perceptual/cognitive processes are to be accelerated or augmented by the computer; a long term goal is to delegate information tasks entirely to the computer. Cybernetic and…

  3. Observed differences in upper extremity forces, muscle efforts, postures, velocities and accelerations across computer activities in a field study of office workers.

    PubMed

    Bruno Garza, J L; Eijckelhof, B H W; Johnson, P W; Raina, S M; Rynell, P W; Huysmans, M A; van Dieën, J H; van der Beek, A J; Blatter, B M; Dennerlein, J T

    2012-01-01

    This study, a part of the PRedicting Occupational biomechanics in OFfice workers (PROOF) study, investigated whether there are differences in field-measured forces, muscle efforts, postures, velocities and accelerations across computer activities. These parameters were measured continuously for 120 office workers performing their own work for two hours each. There were differences in nearly all forces, muscle efforts, postures, velocities and accelerations across keyboard, mouse and idle activities. Keyboard activities showed a 50% increase in the median right trapezius muscle effort when compared to mouse activities. Median shoulder rotation changed from 25 degrees internal rotation during keyboard use to 15 degrees external rotation during mouse use. Only keyboard use was associated with median ulnar deviations greater than 5 degrees. Idle activities led to the greatest variability observed in all muscle efforts and postures measured. In future studies, measurements of computer activities could be used to provide information on the physical exposures experienced during computer use. Practitioner Summary: Computer users may develop musculoskeletal disorders due to their force, muscle effort, posture and wrist velocity and acceleration exposures during computer use. We report that many physical exposures are different across computer activities. This information may be used to estimate physical exposures based on patterns of computer activities over time.

  4. Strategic Control in Decision Making under Uncertainty

    PubMed Central

    Venkatraman, Vinod; Huettel, Scott

    2012-01-01

    Complex economic decisions – whether investing money for retirement or purchasing some new electronic gadget – often involve uncertainty about the likely consequences of our choices. Critical for resolving that uncertainty are strategic meta-decision processes, which allow people to simplify complex decision problems, to evaluate outcomes against a variety of contexts, and to flexibly match behavior to changes in the environment. In recent years, substantial research implicates the dorsomedial prefrontal cortex (dmPFC) in the flexible control of behavior. However, nearly all such evidence comes from paradigms involving executive function or response selection, not complex decision making. Here, we review evidence that demonstrates that the dmPFC contributes to strategic control in complex decision making. This region contains a functional topography such that the posterior dmPFC supports response-related control while the anterior dmPFC supports strategic control. Activation in the anterior dmPFC signals changes in how a decision problem is represented, which in turn can shape computational processes elsewhere in the brain. Based on these findings, we argue both for generalized contributions of the dmPFC to cognitive control, and for specific computational roles for its subregions depending upon the task demands and context. We also contend that these strategic considerations are also likely to be critical for decision making in other domains, including interpersonal interactions in social settings. PMID:22487037

  5. Accelerated Reader.

    ERIC Educational Resources Information Center

    Education Commission of the States, Denver, CO.

    This paper provides an overview of Accelerated Reader, a system of computerized testing and record-keeping that supplements the regular classroom reading program. Accelerated Reader's primary goal is to increase literature-based reading practice. The program offers a computer-aided reading comprehension and management program intended to motivate…

  6. CCSI and the role of advanced computing in accelerating the commercial deployment of carbon capture systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Miller, David; Agarwal, Deborah A.; Sun, Xin

    2011-09-01

    The Carbon Capture Simulation Initiative is developing state-of-the-art computational modeling and simulation tools to accelerate the commercialization of carbon capture technology. The CCSI Toolset consists of an integrated multi-scale modeling and simulation framework, which includes extensive use of reduced order models (ROMs) and a comprehensive uncertainty quantification (UQ) methodology. This paper focuses on the interrelation among high performance computing, detailed device simulations, ROMs for scale-bridging, UQ and the integration framework.

  7. CCSI and the role of advanced computing in accelerating the commercial deployment of carbon capture systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Miller, D.; Agarwal, D.; Sun, X.

    2011-01-01

    The Carbon Capture Simulation Initiative is developing state-of-the-art computational modeling and simulation tools to accelerate the commercialization of carbon capture technology. The CCSI Toolset consists of an integrated multi-scale modeling and simulation framework, which includes extensive use of reduced order models (ROMs) and a comprehensive uncertainty quantification (UQ) methodology. This paper focuses on the interrelation among high performance computing, detailed device simulations, ROMs for scale-bridging, UQ and the integration framework.

  8. Strategic Reading, Ontologies, and the Future of Scientific Publishing

    NASA Astrophysics Data System (ADS)

    Renear, Allen H.; Palmer, Carole L.

    2009-08-01

    The revolution in scientific publishing that has been promised since the 1980s is about to take place. Scientists have always read strategically, working with many articles simultaneously to search, filter, scan, link, annotate, and analyze fragments of content. An observed recent increase in strategic reading in the online environment will soon be further intensified by two current trends: (i) the widespread use of digital indexing, retrieval, and navigation resources and (ii) the emergence within many scientific disciplines of interoperable ontologies. Accelerated and enhanced by reading tools that take advantage of ontologies, reading practices will become even more rapid and indirect, transforming the ways in which scientists engage the literature and shaping the evolution of scientific publishing.

  9. Biomaterials and computation: a strategic alliance to investigate emergent responses of neural cells.

    PubMed

    Sergi, Pier Nicola; Cavalcanti-Adam, Elisabetta Ada

    2017-03-28

    Topographical and chemical cues drive migration, outgrowth and regeneration of neurons in different and crucial biological conditions. In the natural extracellular matrix, their influences are so closely coupled that they result in complex cellular responses. As a consequence, engineered biomaterials are widely used to simplify in vitro conditions, disentangling intricate in vivo behaviours, and narrowing the investigation on particular emergent responses. Nevertheless, how topographical and chemical cues affect the emergent response of neural cells is still unclear, thus in silico models are used as additional tools to reproduce and investigate the interactions between cells and engineered biomaterials. This work aims at presenting the synergistic use of biomaterials-based experiments and computation as a strategic way to promote the discovering of complex neural responses as well as to allow the interactions between cells and biomaterials to be quantitatively investigated, fostering a rational design of experiments.

  10. Covariant Uniform Acceleration

    NASA Astrophysics Data System (ADS)

    Friedman, Yaakov; Scarr, Tzvi

    2013-04-01

    We derive a 4D covariant Relativistic Dynamics Equation. This equation canonically extends the 3D relativistic dynamics equation F = dp/dt, where F is the 3D force and p = m0γv is the 3D relativistic momentum. The standard 4D equation is only partially covariant. To achieve full Lorentz covariance, we replace the four-force F by a rank 2 antisymmetric tensor acting on the four-velocity. By taking this tensor to be constant, we obtain a covariant definition of uniformly accelerated motion. This solves a problem of Einstein and Planck. We compute explicit solutions for uniformly accelerated motion. The solutions are divided into four Lorentz-invariant types: null, linear, rotational, and general. For null acceleration, the worldline is cubic in the time. Linear acceleration covariantly extends 1D hyperbolic motion, while rotational acceleration covariantly extends pure rotational motion. We use Generalized Fermi-Walker transport to construct a uniformly accelerated family of inertial frames which are instantaneously comoving to a uniformly accelerated observer. We explain the connection between our approach and that of Mashhoon. We show that our solutions of uniformly accelerated motion have constant acceleration in the comoving frame. Assuming the Weak Hypothesis of Locality, we obtain local spacetime transformations from a uniformly accelerated frame K' to an inertial frame K. The spacetime transformations between two uniformly accelerated frames with the same acceleration are Lorentz. We compute the metric at an arbitrary point of a uniformly accelerated frame. We obtain velocity and acceleration transformations from a uniformly accelerated system K' to an inertial frame K. We introduce the 4D velocity, an adaptation of Horwitz and Piron's notion of "off-shell." We derive the general formula for the time dilation between accelerated clocks. We obtain a formula for the angular velocity of a uniformly accelerated object. Every rest point of K' is uniformly accelerated, and
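
    In equation form, the construction described in the abstract can be rendered roughly as follows; this is a sketch in conventional notation, and the exact symbols and normalizations are assumptions rather than text quoted from the paper.

```latex
% 3D relativistic dynamics (the equation being extended):
%   F = dp/dt,  with  p = m_0 \gamma v.
% Fully covariant extension: a constant antisymmetric tensor A acts on the
% four-velocity u; constant A defines uniformly accelerated motion.
\[
  c\,\frac{du^{\mu}}{d\tau} = A^{\mu}{}_{\nu}\, u^{\nu},
  \qquad
  A_{\mu\nu} = -A_{\nu\mu} = \text{const},
  \qquad
  u^{\mu} = \frac{dx^{\mu}}{d\tau}.
\]
```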

  11. Acceleration of the matrix multiplication of Radiance three phase daylighting simulations with parallel computing on heterogeneous hardware of personal computer

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zuo, Wangda; McNeil, Andrew; Wetter, Michael

    2013-05-23

    Building designers are increasingly relying on complex fenestration systems to reduce energy consumed for lighting and HVAC in low energy buildings. Radiance, a lighting simulation program, has been used to conduct daylighting simulations for complex fenestration systems. Depending on the configurations, the simulation can take hours or even days using a personal computer. This paper describes how to accelerate the matrix multiplication portion of a Radiance three-phase daylight simulation by conducting parallel computing on heterogeneous hardware of a personal computer. The algorithm was optimized and the computational part was implemented in parallel using OpenCL. The speed of the new approach was evaluated using various daylighting simulation cases on a multicore central processing unit and a graphics processing unit. Based on the measurements and analysis of the time usage for the Radiance daylighting simulation, further speedups can be achieved by using fast I/O devices and storing the data in a binary format.
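
    The computational core accelerated here is a chain of matrix products applied to many sky vectors. The fragment below is a minimal NumPy sketch of that structure, not the paper's OpenCL code; the matrix names (V, T, D, S) follow the usual three-phase-method convention and the problem sizes are assumed for illustration.

        import numpy as np

        # Illustrative (assumed) sizes: sensor points, window patches, sky patches, timesteps.
        n_sensor, n_window, n_sky, n_hours = 1000, 145, 146, 1000

        rng = np.random.default_rng(0)
        V = rng.random((n_sensor, n_window))  # view matrix: window -> sensor points
        T = rng.random((n_window, n_window))  # BSDF transmission matrix of the fenestration
        D = rng.random((n_window, n_sky))     # daylight matrix: sky patches -> window
        S = rng.random((n_sky, n_hours))      # one sky vector per timestep

        # Group the products so the time-independent part is formed once and then
        # applied to all timesteps in a single large, highly parallel product.
        VTD = V @ T @ D                       # (n_sensor x n_sky)
        illuminance = VTD @ S                 # (n_sensor x n_hours)
        print(illuminance.shape)

    The final product over all timesteps is the embarrassingly parallel piece; it is this kind of dense multiplication that benefits from being offloaded to a GPU or multicore CPU.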

  12. Cognitive Characteristics of Strategic and Non-strategic Gamblers.

    PubMed

    Mouneyrac, Aurélie; Lemercier, Céline; Le Floch, Valérie; Challet-Bouju, Gaëlle; Moreau, Axelle; Jacques, Christian; Giroux, Isabelle

    2018-03-01

    Participation in strategic and non-strategic games is mostly explained in the literature by gender: men gamble on strategic games, while women gamble on non-strategic games. However, little is known about the underlying cognitive factors that could also distinguish strategic and non-strategic gamblers. We suggest that cognitive style and need for cognition also explain participation in gambling subtypes. From a dual-process perspective, cognitive style is the tendency to reject or accept the fast, automatic answer that comes immediately in response to a problem. Individuals that preferentially reject the automatic response use an analytic style, which suggest processing information in a slow way, with deep treatment. The intuitive style supposes a reliance on fast, automatic answers. The need for cognition provides a motivation to engage in effortful activities. One hundred and forty-nine gamblers (53 strategic and 96 non-strategic) answered the Cognitive Reflection Test, Need For Cognition Scale, and socio-demographic questions. A logistic regression was conducted to evaluate the influence of gender, cognitive style and need for cognition on participation in strategic and non-strategic games. Our results show that a model with both gender and cognitive variables is more accurate than a model with gender alone. Analytic (vs. intuitive) style, high (vs. low) need for cognition and being male (vs. female) are characteristics of strategic gamblers (vs. non-strategic gamblers). This study highlights the importance of considering the cognitive characteristics of strategic and non-strategic gamblers in order to develop preventive campaigns and treatments that fit the best profiles for gamblers.

  13. Proposal for an Accelerator R&D User Facility at Fermilab's Advanced Superconducting Test Accelerator (ASTA)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Church, M.; Edwards, H.; Harms, E.

    2013-10-01

    Fermilab is the nation’s particle physics laboratory, supported by the DOE Office of High Energy Physics (OHEP). Fermilab is a world leader in accelerators, with a demonstrated track record—spanning four decades—of excellence in accelerator science and technology. We describe the significant opportunity to complete, in a highly leveraged manner, a unique accelerator research facility that supports the broad strategic goals in accelerator science and technology within the OHEP. While the US accelerator-based HEP program is oriented toward the Intensity Frontier, which requires modern superconducting linear accelerators and advanced high-intensity storage rings, there are no accelerator test facilities that support the accelerator science of the Intensity Frontier. Further, nearly all proposed future accelerators for Discovery Science will rely on superconducting radiofrequency (SRF) acceleration, yet there are no dedicated test facilities to study SRF capabilities for beam acceleration and manipulation in prototypic conditions. Finally, there is a wide range of experiments and research programs beyond particle physics that require the unique beam parameters that will only be available at Fermilab’s Advanced Superconducting Test Accelerator (ASTA). To address these needs we submit this proposal for an Accelerator R&D User Facility at ASTA. The ASTA program is based on the capability provided by an SRF linac (which provides electron beams from 50 MeV to nearly 1 GeV) and a small storage ring (with the ability to store either electrons or protons) to enable a broad range of beam-based experiments to study fundamental limitations to beam intensity and to develop transformative approaches to particle-beam generation, acceleration and manipulation which cannot be done elsewhere. It will also establish a unique resource for R&D towards Energy Frontier facilities and a test-bed for SRF accelerators and high brightness beam applications in support of the

  14. LANDSAT-D accelerated payload correction subsystem output computer compatible tape format

    NASA Technical Reports Server (NTRS)

    1982-01-01

    The NASA GSFC LANDSAT-D Ground Segment (GS) is developing an Accelerated Payload Correction Subsystem (APCS) to provide Thematic Mapper (TM) image correction data to be used outside the GS. This correction data is computed from a subset of the TM Payload Correction Data (PCD), which is downlinked from the spacecraft in a 32 Kbps data stream, and mirror scan correction data (MSCD), which is extracted from the wideband video data. This correction data is generated in the GS Thematic Mapper Mission Management Facility (MMF-T), and is recorded on a 9-track 1600 bit per inch computer compatible tape (CCT). This CCT is known as an APCS Output CCT (AOT). The AOT follows standardized conventions with respect to data formats, record construction and record identification. Applicable documents are delineated; common conventions which are used in further defining the structure, format and content of the AOT are defined; and the structure and content of the AOT are described.

  15. Commissioning the GTA accelerator

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sander, O.R.; Atkins, W.H.; Bolme, G.O.

    1992-09-01

    The Ground Test Accelerator (GTA) is supported by the Strategic Defense Command as part of their Neutral Particle Beam (NPB) program. Neutral particles have the advantage that in space they are unaffected by the earth's magnetic field and travel in straight lines unless they enter the earth's atmosphere and become charged by stripping. Heavy particles are difficult to stop and can probe the interior of space vehicles; hence, NPB can function as a discriminator between warheads and decoys. We are using GTA to resolve the physics and engineering issues related to accelerating, focusing, and steering a high-brightness, high-current H⁻ beam and then neutralizing it. Our immediate goal is to produce a 24-MeV, 50-mA device with a 2% duty factor.

  16. Commissioning the GTA accelerator

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sander, O.R.; Atkins, W.H.; Bolme, G.O.

    1992-01-01

    The Ground Test Accelerator (GTA) is supported by the Strategic Defense Command as part of their Neutral Particle Beam (NPB) program. Neutral particles have the advantage that in space they are unaffected by the earth's magnetic field and travel in straight lines unless they enter the earth's atmosphere and become charged by stripping. Heavy particles are difficult to stop and can probe the interior of space vehicles; hence, NPB can function as a discriminator between warheads and decoys. We are using GTA to resolve the physics and engineering issues related to accelerating, focusing, and steering a high-brightness, high-current H⁻ beam and then neutralizing it. Our immediate goal is to produce a 24-MeV, 50-mA device with a 2% duty factor.

  17. Strategic control in decision-making under uncertainty.

    PubMed

    Venkatraman, Vinod; Huettel, Scott A

    2012-04-01

    Complex economic decisions - whether investing money for retirement or purchasing some new electronic gadget - often involve uncertainty about the likely consequences of our choices. Critical for resolving that uncertainty are strategic meta-decision processes, which allow people to simplify complex decision problems, evaluate outcomes against a variety of contexts, and flexibly match behavior to changes in the environment. In recent years, substantial research has implicated the dorsomedial prefrontal cortex (dmPFC) in the flexible control of behavior. However, nearly all such evidence comes from paradigms involving executive function or response selection, not complex decision-making. Here, we review evidence that demonstrates that the dmPFC contributes to strategic control in complex decision-making. This region contains a functional topography such that the posterior dmPFC supports response-related control, whereas the anterior dmPFC supports strategic control. Activation in the anterior dmPFC signals changes in how a decision problem is represented, which in turn can shape computational processes elsewhere in the brain. Based on these findings, we argue for both generalized contributions of the dmPFC to cognitive control, and specific computational roles for its subregions depending upon the task demands and context. We also contend that these strategic considerations are likely to be critical for decision-making in other domains, including interpersonal interactions in social settings. © 2012 The Authors. European Journal of Neuroscience © 2012 Federation of European Neuroscience Societies and Blackwell Publishing Ltd.

  18. A coarse-grid-projection acceleration method for finite-element incompressible flow computations

    NASA Astrophysics Data System (ADS)

    Kashefi, Ali; Staples, Anne; FiN Lab Team

    2015-11-01

    Coarse grid projection (CGP) methodology provides a framework for accelerating computations by performing some part of the computation on a coarsened grid. We apply the CGP to pressure projection methods for finite element-based incompressible flow simulations. In this approach, the predicted velocity field data are restricted to a coarsened grid, the pressure is determined by solving the Poisson equation on the coarse grid, and the resulting data are prolonged to the preset fine grid. The contributions of the CGP method to the pressure correction technique are twofold: first, it substantially lessens the computational cost devoted to the Poisson equation, which is the most time-consuming part of the simulation process. Second, it preserves the accuracy of the velocity field. The velocity and pressure spaces are approximated by Galerkin spectral elements using piecewise linear basis functions. A restriction operator is designed so that fine data are directly injected into the coarse grid. The Laplacian and divergence matrices are derived by taking inner products of coarse grid shape functions. Linear interpolation is implemented to construct a prolongation operator. A study of the data accuracy and the CPU time for the CGP-based versus non-CGP computations is presented. (FiN Lab: Laboratory for Fluid Dynamics in Nature.)
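
    The projection step can be illustrated with a toy one-dimensional periodic example: restrict the divergence of the predicted velocity to a coarse grid by direct injection, solve the Poisson equation there, and prolong the pressure back by linear interpolation. The sketch below is a minimal NumPy illustration under those stated assumptions, not the finite-element implementation described above.

        import numpy as np

        def poisson_periodic_1d(rhs, dx):
            """Solve d2p/dx2 = rhs on a periodic grid with an FFT (zero-mean solution)."""
            n = rhs.size
            k = 2.0 * np.pi * np.fft.fftfreq(n, d=dx)
            rhs_hat = np.fft.fft(rhs)
            p_hat = np.zeros_like(rhs_hat)
            p_hat[1:] = -rhs_hat[1:] / (k[1:] ** 2)
            return np.real(np.fft.ifft(p_hat))

        # Fine and coarse grids (a coarsening factor of 2 is an assumed choice).
        nf, L = 256, 1.0
        nc = nf // 2
        dxf, dxc = L / nf, L / nc
        xf, xc = np.arange(nf) * dxf, np.arange(nc) * dxc

        # Stand-in for div(u*) of the predicted velocity field on the fine grid.
        div_u_star = np.sin(2 * np.pi * xf) + 0.3 * np.sin(8 * np.pi * xf)

        # CGP step: inject to the coarse grid, solve Poisson there, prolong back.
        rhs_coarse = div_u_star[::2]                     # restriction by direct injection
        p_coarse = poisson_periodic_1d(rhs_coarse, dxc)  # cheap coarse-grid solve
        p_fine = np.interp(xf, xc, p_coarse, period=L)   # linear-interpolation prolongation

        # Reference fine-grid solve, to gauge the accuracy given up by the projection.
        p_ref = poisson_periodic_1d(div_u_star, dxf)
        print("max |p_CGP - p_ref| =", np.abs(p_fine - p_ref).max())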

  19. Accelerated Application Development: The ORNL Titan Experience

    DOE PAGES

    Joubert, Wayne; Archibald, Richard K.; Berrill, Mark A.; ...

    2015-05-09

    The use of computational accelerators such as NVIDIA GPUs and Intel Xeon Phi processors is now widespread in the high performance computing community, with many applications delivering impressive performance gains. However, programming these systems for high performance, performance portability and software maintainability has been a challenge. In this paper we discuss experiences porting applications to the Titan system. Titan, which began planning in 2009 and was deployed for general use in 2013, was the first multi-petaflop system based on accelerator hardware. To ready applications for accelerated computing, a preparedness effort was undertaken prior to delivery of Titan. In this paper we report experiences and lessons learned from this process and describe how users are currently making use of computational accelerators on Titan.

  20. Accelerated application development: The ORNL Titan experience

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Joubert, Wayne; Archibald, Rick; Berrill, Mark

    2015-08-01

    The use of computational accelerators such as NVIDIA GPUs and Intel Xeon Phi processors is now widespread in the high performance computing community, with many applications delivering impressive performance gains. However, programming these systems for high performance, performance portability and software maintainability has been a challenge. In this paper we discuss experiences porting applications to the Titan system. Titan, which began planning in 2009 and was deployed for general use in 2013, was the first multi-petaflop system based on accelerator hardware. To ready applications for accelerated computing, a preparedness effort was undertaken prior to delivery of Titan. In this paper we report experiences and lessons learned from this process and describe how users are currently making use of computational accelerators on Titan.

  1. A heterogeneous computing accelerated SCE-UA global optimization method using OpenMP, OpenCL, CUDA, and OpenACC.

    PubMed

    Kan, Guangyuan; He, Xiaoyan; Ding, Liuqian; Li, Jiren; Liang, Ke; Hong, Yang

    2017-10-01

    The shuffled complex evolution optimization developed at the University of Arizona (SCE-UA) has been successfully applied in various kinds of scientific and engineering optimization applications, such as hydrological model parameter calibration, for many years. The algorithm possesses good global optimality, convergence stability and robustness. However, benchmark and real-world applications reveal the poor computational efficiency of the SCE-UA. This research aims at the parallelization and acceleration of the SCE-UA method based on powerful heterogeneous computing technology. The parallel SCE-UA is implemented on an Intel Xeon multi-core CPU (by using OpenMP and OpenCL) and an NVIDIA Tesla many-core GPU (by using OpenCL, CUDA, and OpenACC). The serial and parallel SCE-UA were tested on the Griewank benchmark function. Comparison results indicate that the parallel SCE-UA significantly improves computational efficiency compared to the original serial version. The OpenCL implementation obtains the best overall acceleration results; however, it also has the most complex source code. The parallel SCE-UA has bright prospects to be applied in real-world applications.
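
    The piece of SCE-UA that parallelizes naturally is the repeated evaluation of the objective over the members of the evolving complexes. The sketch below illustrates that pattern on the Griewank benchmark named above, using Python's standard multiprocessing module as a stand-in for the OpenMP/OpenCL/CUDA/OpenACC back ends; the population size and dimensionality are assumed.

        import numpy as np
        from multiprocessing import Pool

        def griewank(x):
            """Griewank benchmark function; global minimum 0 at x = 0."""
            x = np.asarray(x, dtype=float)
            i = np.arange(1, x.size + 1)
            return 1.0 + np.sum(x ** 2) / 4000.0 - np.prod(np.cos(x / np.sqrt(i)))

        if __name__ == "__main__":
            rng = np.random.default_rng(1)
            # A population of candidate parameter sets, as SCE-UA holds in its complexes.
            population = [rng.uniform(-600.0, 600.0, size=10) for _ in range(1024)]

            # The expensive, embarrassingly parallel step: objective evaluation.
            with Pool() as pool:
                costs = pool.map(griewank, population)

            print("best cost in population:", min(costs))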

  2. Computational Materials Science and Chemistry: Accelerating Discovery and Innovation through Simulation-Based Engineering and Science

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Crabtree, George; Glotzer, Sharon; McCurdy, Bill

    This report is based on an SC Workshop on Computational Materials Science and Chemistry for Innovation on July 26-27, 2010, to assess the potential of state-of-the-art computer simulations to accelerate understanding and discovery in materials science and chemistry, with a focus on potential impacts in energy technologies and innovation. The urgent demand for new energy technologies has greatly exceeded the capabilities of today's materials and chemical processes. To convert sunlight to fuel, efficiently store energy, or enable a new generation of energy production and utilization technologies requires the development of new materials and processes of unprecedented functionality and performance. New materials and processes are critical pacing elements for progress in advanced energy systems and virtually all industrial technologies. Over the past two decades, the United States has developed and deployed the world's most powerful collection of tools for the synthesis, processing, characterization, and simulation and modeling of materials and chemical systems at the nanoscale, dimensions of a few atoms to a few hundred atoms across. These tools, which include world-leading x-ray and neutron sources, nanoscale science facilities, and high-performance computers, provide an unprecedented view of the atomic-scale structure and dynamics of materials and the molecular-scale basis of chemical processes. For the first time in history, we are able to synthesize, characterize, and model materials and chemical behavior at the length scale where this behavior is controlled. This ability is transformational for the discovery process and, as a result, confers a significant competitive advantage. Perhaps the most spectacular increase in capability has been demonstrated in high performance computing. Over the past decade, computational power has increased by a factor of a million due to advances in hardware and software. This rate of improvement, which shows no sign of

  3. Using Equation-Free Computation to Accelerate Network-Free Stochastic Simulation of Chemical Kinetics.

    PubMed

    Lin, Yen Ting; Chylek, Lily A; Lemons, Nathan W; Hlavacek, William S

    2018-06-21

    The chemical kinetics of many complex systems can be concisely represented by reaction rules, which can be used to generate reaction events via a kinetic Monte Carlo method that has been termed network-free simulation. Here, we demonstrate accelerated network-free simulation through a novel approach to equation-free computation. In this process, variables are introduced that approximately capture system state. Derivatives of these variables are estimated using short bursts of exact stochastic simulation and finite differencing. The variables are then projected forward in time via a numerical integration scheme, after which a new exact stochastic simulation is initialized and the whole process repeats. The projection step increases efficiency by bypassing the firing of numerous individual reaction events. As we show, the projected variables may be defined as populations of building blocks of chemical species. The maximal number of connected molecules included in these building blocks determines the degree of approximation. Equation-free acceleration of network-free simulation is found to be both accurate and efficient.
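
    The loop described above alternates short bursts of exact stochastic simulation with projective jumps of coarse variables. The sketch below is a minimal Python illustration of that loop; a simple birth-death process stands in for the rule-based, network-free simulator, the single coarse variable is the copy number, and the burst and projection lengths are assumed values.

        import numpy as np

        rng = np.random.default_rng(2)

        def gillespie_burst(n0, t_burst, k_prod=50.0, k_deg=0.5):
            """Short burst of exact stochastic simulation (placeholder birth-death process)."""
            t, n = 0.0, n0
            times, counts = [0.0], [float(n0)]
            while t < t_burst:
                a_prod, a_deg = k_prod, k_deg * n
                a_total = a_prod + a_deg
                t += rng.exponential(1.0 / a_total)
                n += 1 if rng.random() < a_prod / a_total else -1
                times.append(t)
                counts.append(float(n))
            return np.array(times), np.array(counts)

        # Equation-free (coarse projective integration) loop.
        n, t = 10.0, 0.0
        t_burst, t_project = 0.2, 0.3   # assumed burst and projection horizons
        for _ in range(20):
            times, counts = gillespie_burst(int(round(n)), t_burst)
            # Estimate the coarse derivative from the burst by a least-squares slope.
            slope = np.polyfit(times, counts, 1)[0]
            # Project the coarse variable forward, skipping many individual reaction events.
            n = max(counts[-1] + slope * t_project, 0.0)
            t += t_burst + t_project
        print(f"t = {t:.1f}, projected copy number = {n:.1f} (steady state is about 100)")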

  4. Working and strategic memory deficits in schizophrenia

    NASA Technical Reports Server (NTRS)

    Stone, M.; Gabrieli, J. D.; Stebbins, G. T.; Sullivan, E. V.

    1998-01-01

    Working memory and its contribution to performance on strategic memory tests in schizophrenia were studied. Patients (n = 18) and control participants (n = 15), all men, received tests of immediate memory (forward digit span), working memory (listening, computation, and backward digit span), and long-term strategic (free recall, temporal order, and self-ordered pointing) and nonstrategic (recognition) memory. Schizophrenia patients performed worse on all tests. Education, verbal intelligence, and immediate memory capacity did not account for deficits in working memory in schizophrenia patients. Reduced working memory capacity accounted for group differences in strategic memory but not in recognition memory. Working memory impairment may be central to the profile of impaired cognitive performance in schizophrenia and is consistent with hypothesized frontal lobe dysfunction associated with this disease. Additional medial-temporal dysfunction may account for the recognition memory deficit.

  5. TU-FG-201-04: Computer Vision in Autonomous Quality Assurance of Linear Accelerators

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yu, H; Jenkins, C; Yu, S

    Purpose: Routine quality assurance (QA) of linear accelerators represents a critical and costly element of a radiation oncology center. Recently, a system was developed to autonomously perform routine quality assurance on linear accelerators. The purpose of this work is to extend this system and contribute computer vision techniques for obtaining quantitative measurements for a monthly multi-leaf collimator (MLC) QA test specified by TG-142, namely leaf position accuracy, and demonstrate extensibility for additional routines. Methods: Grayscale images of a picket fence delivery on a radioluminescent phosphor coated phantom are captured using a CMOS camera. Collected images are processed to correct for camera distortions, rotation and alignment, reduce noise, and enhance contrast. The location of each MLC leaf is determined through logistic fitting and a priori modeling based on knowledge of the delivered beams. Using the data collected and the criteria from TG-142, a decision is made on whether or not the leaf position accuracy of the MLC passes or fails. Results: The locations of all MLC leaf edges are found for three different picket fence images in a picket fence routine to 0.1 mm (1 pixel) precision. The program to correct for image alignment and determination of leaf positions requires a runtime of 21–25 seconds for a single picket, and 44–46 seconds for a group of three pickets on a standard workstation CPU, 2.2 GHz Intel Core i7. Conclusion: MLC leaf edges were successfully found using techniques in computer vision. With the addition of computer vision techniques to the previously described autonomous QA system, the system is able to quickly perform complete QA routines with minimal human contribution.
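
    The logistic-fitting step mentioned in the Methods can be sketched directly: fit a logistic step function to a one-dimensional intensity profile drawn across a leaf edge and read the sub-pixel edge position from the fitted center. The fragment below uses synthetic data with assumed pixel values; it illustrates the general technique rather than the authors' implementation.

        import numpy as np
        from scipy.optimize import curve_fit

        def logistic_edge(x, amplitude, center, width, offset):
            """Logistic model of the intensity step across an MLC leaf edge."""
            return offset + amplitude / (1.0 + np.exp(-(x - center) / width))

        # Synthetic 1-D intensity profile across one leaf edge (pixel units are assumed).
        pixels = np.arange(0, 60, dtype=float)
        rng = np.random.default_rng(3)
        truth = logistic_edge(pixels, amplitude=180.0, center=31.4, width=1.2, offset=20.0)
        profile = truth + rng.normal(0.0, 3.0, size=pixels.size)

        # Fit the logistic curve; the fitted 'center' is the sub-pixel leaf-edge position.
        p0 = [profile.max() - profile.min(), pixels.mean(), 2.0, profile.min()]
        params, _ = curve_fit(logistic_edge, pixels, profile, p0=p0)
        print(f"estimated leaf edge at pixel {params[1]:.2f} (true value 31.40)")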

  6. Acceleration and Velocity Sensing from Measured Strain

    NASA Technical Reports Server (NTRS)

    Pak, Chan-Gi; Truax, Roger

    2015-01-01

    A simple approach for computing acceleration and velocity of a structure from the strain is proposed in this study. First, deflection and slope of the structure are computed from the strain using a two-step theory. Frequencies of the structure are computed from the time histories of strain using a parameter estimation technique together with an autoregressive moving average model. From deflection, slope, and frequencies of the structure, acceleration and velocity of the structure can be obtained using the proposed approach. Simple harmonic motion is assumed for the acceleration computations, and the central difference equation with a linear autoregressive model is used for the computations of velocity. A cantilevered rectangular wing model is used to validate the simple approach. The quality of the computed deflection, acceleration, and velocity values is independent of the number of fibers. The central difference equation with a linear autoregressive model proposed in this study follows the target response with reasonable accuracy. Therefore, the handicap of the backward difference equation, phase shift, is successfully overcome.
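
    The two ingredients named here, a simple-harmonic-motion assumption for acceleration and a central difference for velocity, can be written down directly. The fragment below works on a synthetic single-mode deflection history (the sample rate and modal frequency are assumed values) and omits the strain-to-deflection step and the linear autoregressive refinement described in the abstract.

        import numpy as np

        fs = 200.0      # sample rate in Hz (assumed)
        f_mode = 5.0    # dominant modal frequency in Hz (assumed identified via ARMA)
        t = np.arange(0.0, 2.0, 1.0 / fs)
        deflection = 0.01 * np.sin(2.0 * np.pi * f_mode * t)   # stand-in for strain-derived deflection

        # Acceleration under the simple-harmonic-motion assumption: a = -(2*pi*f)^2 * x.
        omega = 2.0 * np.pi * f_mode
        acceleration = -(omega ** 2) * deflection

        # Velocity from the central difference of the deflection history.
        dt = 1.0 / fs
        velocity = np.empty_like(deflection)
        velocity[1:-1] = (deflection[2:] - deflection[:-2]) / (2.0 * dt)
        velocity[0], velocity[-1] = velocity[1], velocity[-2]   # simple endpoint fill

        # Sanity check against the analytic derivative of the synthetic signal.
        print("max velocity error:", np.abs(velocity - 0.01 * omega * np.cos(omega * t)).max())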

  7. QALMA: A computational toolkit for the analysis of quality protocols for medical linear accelerators in radiation therapy

    NASA Astrophysics Data System (ADS)

    Rahman, Md Mushfiqur; Lei, Yu; Kalantzis, Georgios

    2018-01-01

    Quality Assurance (QA) for medical linear accelerators (linacs) is one of the primary concerns in external beam radiation therapy. Continued advancements in clinical accelerators and computer control technology make the QA procedures more complex and time consuming, often requiring adequate software accompanied by specific phantoms. To ameliorate that matter, we introduce QALMA (Quality Assurance for Linac with MATLAB), a MATLAB toolkit which aims to simplify the quantitative analysis of QA for linacs, including Star-Shot analysis, the Picket Fence test, the Winston-Lutz test, Multileaf Collimator (MLC) log file analysis and verification of the light and radiation field coincidence test.

  8. Strategic Planning: What's so Strategic about It?

    ERIC Educational Resources Information Center

    Strong, Bart

    2005-01-01

    The words "strategic" and "planning" used together can lead to confusion unless one spent the early years of his career in never-ending, team-oriented, corporate training sessions. Doesn't "strategic" have something to do with extremely accurate bombing or a defensive missile system or Star Wars or something? Don't "strategic" and "planning" both…

  9. Semiempirical Quantum Chemical Calculations Accelerated on a Hybrid Multicore CPU-GPU Computing Platform.

    PubMed

    Wu, Xin; Koslowski, Axel; Thiel, Walter

    2012-07-10

    In this work, we demonstrate that semiempirical quantum chemical calculations can be accelerated significantly by leveraging the graphics processing unit (GPU) as a coprocessor on a hybrid multicore CPU-GPU computing platform. Semiempirical calculations using the MNDO, AM1, PM3, OM1, OM2, and OM3 model Hamiltonians were systematically profiled for three types of test systems (fullerenes, water clusters, and solvated crambin) to identify the most time-consuming sections of the code. The corresponding routines were ported to the GPU and optimized employing both existing library functions and a GPU kernel that carries out a sequence of noniterative Jacobi transformations during pseudodiagonalization. The overall computation times for single-point energy calculations and geometry optimizations of large molecules were reduced by one order of magnitude for all methods, as compared to runs on a single CPU core.

  10. Analysis of optoelectronic strategic planning in Taiwan by artificial intelligence portfolio tool

    NASA Astrophysics Data System (ADS)

    Chang, Rang-Seng

    1992-05-01

    Taiwan ROC has achieved significant advances in the optoelectronic industry with some Taiwan products ranked high in the world market and technology. Six segments of the optoelectronic industry were planned. Each one was divided into several strategic items, and an artificial intelligence portfolio tool (AIPT) was designed to analyze optoelectronic strategic planning in Taiwan. The portfolio is designed to provoke strategic thinking intelligently. This computer-generated strategy should be selected and modified by the individual. Some strategies for the development of the Taiwan optoelectronic industry are also discussed in this paper.

  11. Vacuum Brazing of Accelerator Components

    NASA Astrophysics Data System (ADS)

    Singh, Rajvir; Pant, K. K.; Lal, Shankar; Yadav, D. P.; Garg, S. R.; Raghuvanshi, V. K.; Mundra, G.

    2012-11-01

    Commonly used materials for accelerator components are those which are vacuum compatible and thermally conductive. Stainless steel, aluminum and copper are common among them. Stainless steel is a poor heat conductor and not very common in use where good thermal conductivity is required. Aluminum and copper and their alloys meet the above requirements and are frequently used for the above purpose. Fabrication of accelerator components from aluminum and its alloys by welding has become common practice nowadays. It is mandatory to use copper and its other grades in RF devices required for accelerators. Beam line and Front End components of the accelerators are fabricated from stainless steel and OFHC copper. Fabrication of copper components by welding is very difficult and in most cases impossible. Fabrication and joining in such cases is possible using a brazing process, especially under vacuum or an inert gas atmosphere. Several accelerator components have been vacuum brazed for Indus projects at Raja Ramanna Centre for Advanced Technology (RRCAT), Indore using the vacuum brazing facility available at RRCAT, Indore. This paper presents details regarding development of the above mentioned high value and strategic components/assemblies. It includes the basics required for vacuum brazing, details of the vacuum brazing facility, joint design, fixturing of the jobs, selection of filler alloys, optimization of brazing parameters so as to obtain high quality brazed joints, and a brief description of vacuum brazed accelerator components.

  12. Strategic Analysis Overview

    NASA Technical Reports Server (NTRS)

    Cirillo, William M.; Earle, Kevin D.; Goodliff, Kandyce E.; Reeves, J. D.; Stromgren, Chel; Andraschko, Mark R.; Merrill, R. Gabe

    2008-01-01

    NASA's Constellation Program employs a strategic analysis methodology to provide an integrated analysis capability for lunar exploration scenarios and to support strategic decision-making regarding those scenarios. The strategic analysis methodology integrates the assessment of the major contributors to strategic objective satisfaction (performance, affordability, and risk) and captures the linkages and feedbacks between all three components. Strategic analysis supports strategic decision making by senior management through comparable analysis of alternative strategies, provision of a consistent set of high level value metrics, and the enabling of cost-benefit analysis. The tools developed to implement the strategic analysis methodology are not element design and sizing tools. Rather, these models evaluate strategic performance using predefined elements, imported into a library from expert-driven design/sizing tools or expert analysis. Specific components of the strategic analysis tool set include scenario definition, requirements generation, mission manifesting, scenario lifecycle costing, crew time analysis, objective satisfaction benefit, risk analysis, and probabilistic evaluation. Results from all components of strategic analysis are evaluated against a set of pre-defined figures of merit (FOMs). These FOMs capture the high-level strategic characteristics of all scenarios and facilitate direct comparison of options. The strategic analysis methodology that is described in this paper has previously been applied to the Space Shuttle and International Space Station Programs and is now being used to support the development of the baseline Constellation Program lunar architecture. This paper will present an overview of the strategic analysis methodology and will present sample results from the application of the strategic analysis methodology to the Constellation Program lunar architecture.

  13. Computer modeling of test particle acceleration at oblique shocks

    NASA Technical Reports Server (NTRS)

    Decker, Robert B.

    1988-01-01

    The present evaluation of the basic techniques and illustrative results of charged particle-modeling numerical codes suitable for particle acceleration at oblique, fast-mode collisionless shocks emphasizes the treatment of ions as test particles, calculating particle dynamics through numerical integration along exact phase-space orbits. Attention is given to the acceleration of particles at planar, infinitesimally thin shocks, as well as to plasma simulations in which low-energy ions are injected and accelerated at quasi-perpendicular shocks with internal structure.
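
    The core numerical task in such test-particle codes is integrating a charged particle's motion in prescribed electromagnetic fields. The sketch below is a minimal Boris-push integrator for a proton gyrating in a uniform magnetic field, with all field values and step sizes assumed for illustration; an actual oblique-shock calculation would add the shock's field jump and the motional electric field.

        import numpy as np

        def boris_push(x, v, q_over_m, E, B, dt, n_steps):
            """Advance a test particle along its phase-space orbit with the Boris scheme."""
            traj = np.empty((n_steps, 3))
            for i in range(n_steps):
                v_minus = v + 0.5 * dt * q_over_m * E
                t_vec = 0.5 * dt * q_over_m * B
                s_vec = 2.0 * t_vec / (1.0 + np.dot(t_vec, t_vec))
                v_prime = v_minus + np.cross(v_minus, t_vec)
                v = v_minus + np.cross(v_prime, s_vec) + 0.5 * dt * q_over_m * E
                x = x + v * dt
                traj[i] = x
            return x, v, traj

        # Proton-like test particle in a uniform upstream-style field (assumed values).
        x0, v0 = np.zeros(3), np.array([1.0e5, 0.0, 2.0e4])   # m, m/s
        B = np.array([0.0, 0.0, 5.0e-9])                      # tesla
        E = np.zeros(3)                                       # motional E field omitted here
        q_over_m = 9.58e7                                     # C/kg for a proton

        x, v, traj = boris_push(x0, v0, q_over_m, E, B, dt=0.05, n_steps=2000)
        print("gyro-orbit radius is roughly", np.ptp(traj[:, 0]) / 2.0, "m")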

  14. Executive summary of the Strategic Plan for National Institutes of Health Obesity Research.

    PubMed

    Spiegel, Allen M; Alving, Barbara M

    2005-07-01

    The Strategic Plan for National Institutes of Health (NIH) Obesity Research is intended to serve as a guide for coordinating obesity research activities across the NIH and for enhancing the development of new efforts based on identification of areas of greatest scientific opportunity and challenge. Developed by the NIH Obesity Research Task Force with critical input from external scientists and the public, the Strategic Plan reflects a dynamic planning process and presents a multidimensional research agenda, with an interrelated set of goals and strategies for achieving the goals. The major scientific themes around which the Strategic Plan is framed include the following: preventing and treating obesity through lifestyle modification; preventing and treating obesity through pharmacologic, surgical, or other medical approaches; breaking the link between obesity and its associated health conditions; and cross-cutting topics, including health disparities, technology, fostering of interdisciplinary research teams, investigator training, translational research, and education/outreach efforts. Through the efforts described in the Strategic Plan for NIH Obesity Research, the NIH will strive to facilitate and accelerate progress in obesity research to improve public health.

  15. Neurocognitive dysfunction in strategic and non-strategic gamblers.

    PubMed

    Grant, Jon E; Odlaug, Brian L; Chamberlain, Samuel R; Schreiber, Liana R N

    2012-08-07

    It has been theorized that there may be subtypes of pathological gambling, particularly in relation to the main type of gambling activities undertaken. Whether or not putative pathological gambling subtypes differ in terms of their clinical and cognitive profiles has received little attention. Subjects meeting DSM-IV criteria for pathological gambling were grouped into two categories of preferred forms of gambling - strategic (e.g., cards, dice, sports betting, stock market) and non-strategic (e.g., slots, video poker, pull tabs). Groups were compared on clinical characteristics (gambling severity, and time and money spent gambling), psychiatric comorbidity, and neurocognitive tests assessing motor impulsivity and cognitive flexibility. Seventy-seven subjects were included in this sample (45.5% females; mean age: 42.7±14.9) which consisted of the following groups: strategic (n=22; 28.6%) and non-strategic (n=55; 71.4%). Non-strategic gamblers were significantly more likely to be older, female, and divorced. Money spent gambling did not differ significantly between groups although one measure of gambling severity reflected more severe problems for strategic gamblers. Strategic and non-strategic gamblers did not differ in terms of cognitive function; both groups showed impairments in cognitive flexibility and inhibitory control relative to matched healthy volunteers. These preliminary results suggest that preferred form of gambling may be associated with specific clinical characteristics but are not dissociable in terms of cognitive inflexibility and motor impulsivity. Copyright © 2012 Elsevier Inc. All rights reserved.

  16. Strategic Leadership

    ERIC Educational Resources Information Center

    Davies, Barbara; Davies, Brent

    2004-01-01

    This article explores the nature of strategic leadership and assesses whether a framework can be established to map the dimensions of strategic leadership. In particular it establishes a model which outlines both the organizational abilities and the individual characteristics of strategic leaders.

  17. Networking as a Strategic Tool, 1991

    NASA Technical Reports Server (NTRS)

    1991-01-01

    This conference focuses on the technological advances, pitfalls, requirements, and trends involved in planning and implementing an effective computer network system. The basic theme of the conference is networking as a strategic tool. Tutorials and conference presentations explore the technology and methods involved in this rapidly changing field. Future directions are explored from a global, as well as local, perspective.

  18. Strategic Budgeting.

    ERIC Educational Resources Information Center

    Jones, Dennis P.

    1993-01-01

    An approach to college budgeting that encompasses strategic as well as operational decisions is proposed. Strategic decisions focus on creation and maintenance of institutional capacity, whereas operational decisions focus on use of that capacity to accomplish specific purposes. Strategic budgeting must emphasize institutional assets and their…

  19. Leading Strategic & Cultural Change through Technology. Proceedings of the Association of Small Computer Users in Education (ASCUE) Annual Conference (37th, Myrtle Beach, South Carolina, June 6-10, 2004)

    ERIC Educational Resources Information Center

    Smith, Peter, Ed.; Smith, Carol L., Ed.

    2004-01-01

    This 2004 Association of Small Computer Users in Education (ASCUE) conference proceedings presented the theme "Leading Strategic & Cultural Change through Technology." The conference introduced its ASCUE Officers and Directors, and provides abstracts of the pre-conference workshops. The full-text conference papers in this document…

  20. Accelerating Approximate Bayesian Computation with Quantile Regression: application to cosmological redshift distributions

    NASA Astrophysics Data System (ADS)

    Kacprzak, T.; Herbel, J.; Amara, A.; Réfrégier, A.

    2018-02-01

    Approximate Bayesian Computation (ABC) is a method to obtain a posterior distribution without a likelihood function, using simulations and a set of distance metrics. For that reason, it has recently been gaining popularity as an analysis tool in cosmology and astrophysics. Its drawback, however, is a slow convergence rate. We propose a novel method, which we call qABC, to accelerate ABC with Quantile Regression. In this method, we create a model of quantiles of the distance measure as a function of input parameters. This model is trained on a small number of simulations and estimates which regions of the prior space are likely to be accepted into the posterior. Other regions are then immediately rejected. This procedure is then repeated as more simulations become available. We apply it to the practical problem of estimation of the redshift distribution of cosmological samples, using forward modelling developed in previous work. The qABC method converges to nearly the same posterior as the basic ABC. It uses, however, only 20% of the number of simulations compared to basic ABC, achieving a fivefold gain in execution time for our problem. For other problems the acceleration rate may vary; it depends on how close the prior is to the final posterior. We discuss possible improvements and extensions to this method.
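
    The mechanism described here, learning a quantile of the ABC distance as a function of the parameters and skipping simulations wherever even an optimistic quantile exceeds the acceptance threshold, can be sketched with an off-the-shelf quantile regressor. The fragment below uses a toy one-parameter simulator, and the pilot-sample size, quantile level and threshold are assumed; it illustrates the idea rather than the authors' pipeline.

        import numpy as np
        from sklearn.ensemble import GradientBoostingRegressor

        rng = np.random.default_rng(4)

        def run_simulation(theta):
            """Placeholder forward simulator returning an ABC distance for parameter theta."""
            return abs(np.sin(3.0 * theta) + 0.1 * rng.normal() - 0.5)

        # Stage 1: a small pilot set of simulations drawn from the prior.
        theta_pilot = rng.uniform(0.0, np.pi, size=200)
        dist_pilot = np.array([run_simulation(t) for t in theta_pilot])

        # Model a low quantile of the distance as a function of the parameter.
        q_model = GradientBoostingRegressor(loss="quantile", alpha=0.1, n_estimators=200)
        q_model.fit(theta_pilot.reshape(-1, 1), dist_pilot)

        # Stage 2: only simulate proposals whose predicted 10th-percentile distance
        # falls below the current ABC acceptance threshold; reject the rest outright.
        epsilon = 0.2
        theta_new = rng.uniform(0.0, np.pi, size=2000)
        promising = q_model.predict(theta_new.reshape(-1, 1)) < epsilon
        print(f"simulating {promising.sum()} of {theta_new.size} proposals")
        accepted = [t for t in theta_new[promising] if run_simulation(t) < epsilon]
        print(f"accepted {len(accepted)} samples into the approximate posterior")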

  1. Strategic Computing. New-Generation Computing Technology: A Strategic Plan for Its Development and Application to Critical Problems in Defense

    DTIC Science & Technology

    1983-10-28

    Computing. By seizing an opportunity to leverage recent advances in artificial intelligence, computer science, and microelectronics, the Agency plans...occurred in many separated areas of artificial intelligence, computer science, and microelectronics. Advances in "expert system" technology now...and expert knowledge o Advances in Artificial Intelligence: Mechanization of speech recognition, vision, and natural language understanding. o

  2. 75 FR 67695 - U.S. Strategic Command Strategic Advisory Group Closed Meeting

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-11-03

    ... DEPARTMENT OF DEFENSE Office of the Secretary of Defense U.S. Strategic Command Strategic Advisory... meeting notice of the U.S. Strategic Command Strategic Advisory Group. DATES: December 9, 2010: 8 a.m. to..., intelligence, and policy-related issues to the Commander, U.S. Strategic Command, during the development of the...

  3. Particle tracking acceleration via signed distance fields in direct-accelerated geometry Monte Carlo

    DOE PAGES

    Shriwise, Patrick C.; Davis, Andrew; Jacobson, Lucas J.; ...

    2017-08-26

    Computer-aided design (CAD)-based Monte Carlo radiation transport is of value to the nuclear engineering community for its ability to conduct transport on high-fidelity models of nuclear systems, but it is more computationally expensive than native geometry representations. This work describes the adaptation of a rendering data structure, the signed distance field, as a geometric query tool for accelerating CAD-based transport in the direct-accelerated geometry Monte Carlo toolkit. Demonstrations of its effectiveness are shown for several problems. The beginnings of a predictive model for the data structure's utilization based on various problem parameters are also introduced.
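
    A signed distance field answers "how far is this point from the nearest surface?" cheaply, which lets a tracking loop skip expensive CAD surface queries whenever the particle cannot reach a surface within its next step. The sketch below is a minimal Python illustration with a sphere standing in for the CAD geometry and a gridded, interpolated distance field; the grid resolution, step length and fallback query are assumed placeholders.

        import numpy as np
        from scipy.interpolate import RegularGridInterpolator

        # Signed distance field to a unit sphere, sampled on a coarse grid.
        axis = np.linspace(-2.0, 2.0, 41)
        X, Y, Z = np.meshgrid(axis, axis, axis, indexing="ij")
        sdf = RegularGridInterpolator((axis, axis, axis),
                                      np.sqrt(X**2 + Y**2 + Z**2) - 1.0)

        def advance(position, direction, step):
            """One tracking step, using the SDF as a cheap safety-distance query."""
            safety = float(sdf([position])[0])
            if abs(safety) > step:
                # The surface is unreachable within this step: skip the expensive
                # CAD facet-intersection test entirely.
                return position + step * direction, "surface check skipped"
            # Otherwise fall back to the full geometric query (omitted in this sketch).
            return position + step * direction, "full surface check needed"

        pos, dirn = np.array([-1.8, 0.0, 0.0]), np.array([1.0, 0.0, 0.0])
        for _ in range(6):
            pos, action = advance(pos, dirn, step=0.3)
            print(pos, action)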

  4. Computationally efficient methods for modelling laser wakefield acceleration in the blowout regime

    NASA Astrophysics Data System (ADS)

    Cowan, B. M.; Kalmykov, S. Y.; Beck, A.; Davoine, X.; Bunkers, K.; Lifschitz, A. F.; Lefebvre, E.; Bruhwiler, D. L.; Shadwick, B. A.; Umstadter, D. P.

    2012-08-01

    Electron self-injection and acceleration until dephasing in the blowout regime is studied for a set of initial conditions typical of recent experiments with 100-terawatt-class lasers. Two different approaches to computationally efficient, fully explicit, 3D particle-in-cell modelling are examined. First, the Cartesian code vorpal (Nieter, C. and Cary, J. R. 2004 VORPAL: a versatile plasma simulation code. J. Comput. Phys. 196, 538) using a perfect-dispersion electromagnetic solver precisely describes the laser pulse and bubble dynamics, taking advantage of coarser resolution in the propagation direction, with a proportionally larger time step. Using third-order splines for macroparticles helps suppress the sampling noise while keeping the usage of computational resources modest. The second way to reduce the simulation load is using reduced-geometry codes. In our case, the quasi-cylindrical code calder-circ (Lifschitz, A. F. et al. 2009 Particle-in-cell modelling of laser-plasma interaction using Fourier decomposition. J. Comput. Phys. 228(5), 1803-1814) uses decomposition of fields and currents into a set of poloidal modes, while the macroparticles move in the Cartesian 3D space. Cylindrical symmetry of the interaction allows using just two modes, reducing the computational load to roughly that of a planar Cartesian simulation while preserving the 3D nature of the interaction. This significant economy of resources allows using fine resolution in the direction of propagation and a small time step, making numerical dispersion vanishingly small, together with a large number of particles per cell, enabling good particle statistics. Quantitative agreement of two simulations indicates that these are free of numerical artefacts. Both approaches thus retrieve the physically correct evolution of the plasma bubble, recovering the intrinsic connection of electron self-injection to the nonlinear optical evolution of the driver.

  5. Charting the expansion of strategic exploratory behavior during adolescence.

    PubMed

    Somerville, Leah H; Sasse, Stephanie F; Garrad, Megan C; Drysdale, Andrew T; Abi Akar, Nadine; Insel, Catherine; Wilson, Robert C

    2017-02-01

    Although models of exploratory decision making implicate a suite of strategies that guide the pursuit of information, the developmental emergence of these strategies remains poorly understood. This study takes an interdisciplinary perspective, merging computational decision making and developmental approaches to characterize age-related shifts in exploratory strategy from adolescence to young adulthood. Participants were 149 12-28-year-olds who completed a computational explore-exploit paradigm that manipulated reward value, information value, and decision horizon (i.e., the utility that information holds for future choices). Strategic directed exploration, defined as information seeking selective for long time horizons, emerged during adolescence and maintained its level through early adulthood. This age difference was partially driven by adolescents valuing immediate reward over new information. Strategic random exploration, defined as stochastic choice behavior selective for long time horizons, was invoked at comparable levels over the age range, and predicted individual differences in attitudes toward risk taking in daily life within the adolescent portion of the sample. Collectively, these findings reveal an expansion of the diversity of strategic exploration over development, implicate distinct mechanisms for directed and random exploratory strategies, and suggest novel mechanisms for adolescent-typical shifts in decision making. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  6. GPU-accelerated computational tool for studying the effectiveness of asteroid disruption techniques

    NASA Astrophysics Data System (ADS)

    Zimmerman, Ben J.; Wie, Bong

    2016-10-01

    This paper presents the development of a new Graphics Processing Unit (GPU) accelerated computational tool for asteroid disruption techniques. Numerical simulations are completed using the high-order spectral difference (SD) method. Due to the compact nature of the SD method, it is well suited for implementation with the GPU architecture, hence solutions are generated orders of magnitude faster than with the Central Processing Unit (CPU) counterpart. A multiphase model integrated with the SD method is introduced, and several asteroid disruption simulations are conducted, including kinetic-energy impactors, multi-kinetic energy impactor systems, and nuclear options. Results illustrate the benefits of using multi-kinetic energy impactor systems when compared to a single impactor system. In addition, the effectiveness of nuclear options is observed.

  7. Dissociable neural representations of reinforcement and belief prediction errors underlie strategic learning

    PubMed Central

    Zhu, Lusha; Mathewson, Kyle E.; Hsu, Ming

    2012-01-01

    Decision-making in the presence of other competitive intelligent agents is fundamental for social and economic behavior. Such decisions require agents to behave strategically, where in addition to learning about the rewards and punishments available in the environment, they also need to anticipate and respond to actions of others competing for the same rewards. However, whereas we know much about strategic learning at both theoretical and behavioral levels, we know relatively little about the underlying neural mechanisms. Here, we show using a multi-strategy competitive learning paradigm that strategic choices can be characterized by extending the reinforcement learning (RL) framework to incorporate agents’ beliefs about the actions of their opponents. Furthermore, using this characterization to generate putative internal values, we used model-based functional magnetic resonance imaging to investigate neural computations underlying strategic learning. We found that the distinct notions of prediction errors derived from our computational model are processed in a partially overlapping but distinct set of brain regions. Specifically, we found that the RL prediction error was correlated with activity in the ventral striatum. In contrast, activity in the ventral striatum, as well as the rostral anterior cingulate (rACC), was correlated with a previously uncharacterized belief-based prediction error. Furthermore, activity in rACC reflected individual differences in degree of engagement in belief learning. These results suggest a model of strategic behavior where learning arises from interaction of dissociable reinforcement and belief-based inputs. PMID:22307594

  8. Dissociable neural representations of reinforcement and belief prediction errors underlie strategic learning.

    PubMed

    Zhu, Lusha; Mathewson, Kyle E; Hsu, Ming

    2012-01-31

    Decision-making in the presence of other competitive intelligent agents is fundamental for social and economic behavior. Such decisions require agents to behave strategically, where in addition to learning about the rewards and punishments available in the environment, they also need to anticipate and respond to actions of others competing for the same rewards. However, whereas we know much about strategic learning at both theoretical and behavioral levels, we know relatively little about the underlying neural mechanisms. Here, we show using a multi-strategy competitive learning paradigm that strategic choices can be characterized by extending the reinforcement learning (RL) framework to incorporate agents' beliefs about the actions of their opponents. Furthermore, using this characterization to generate putative internal values, we used model-based functional magnetic resonance imaging to investigate neural computations underlying strategic learning. We found that the distinct notions of prediction errors derived from our computational model are processed in a partially overlapping but distinct set of brain regions. Specifically, we found that the RL prediction error was correlated with activity in the ventral striatum. In contrast, activity in the ventral striatum, as well as the rostral anterior cingulate (rACC), was correlated with a previously uncharacterized belief-based prediction error. Furthermore, activity in rACC reflected individual differences in degree of engagement in belief learning. These results suggest a model of strategic behavior where learning arises from interaction of dissociable reinforcement and belief-based inputs.

  9. New technology continues to invade healthcare. What are the strategic implications/outcomes?

    PubMed

    Smith, Coy

    2004-01-01

    Healthcare technology continues to advance and be implemented in healthcare organizations. Nurse executives must strategically evaluate the effectiveness of each proposed system or device using a strategic planning process. Clinical information systems, computer-chip-based clinical monitoring devices, advanced Web-based applications with remote, wireless communication devices, clinical decision support software--all compete for capital and registered nurse salary dollars. The concept of clinical transformation is developed with new models of care delivery being supported by technology rather than driving care delivery. Senior nursing leadership's role in clinical transformation and healthcare technology implementation is developed. Proposed standards, expert group action, business and consumer groups, and legislation are reviewed as strategic drivers in the development of an electronic health record and healthcare technology. A matrix of advancing technology and strategic decision-making parameters is outlined.

  10. Stratway: A Modular Approach to Strategic Conflict Resolution

    NASA Technical Reports Server (NTRS)

    Hagen, George E.; Butler, Ricky W.; Maddalon, Jeffrey M.

    2011-01-01

    In this paper we introduce Stratway, a modular approach to finding long-term strategic resolutions to conflicts between aircraft. The modular approach provides both advantages and disadvantages. Our primary concern is to investigate the implications on the verification of safety-critical properties of a strategic resolution algorithm. By partitioning the problem into verifiable modules much stronger verification claims can be established. Since strategic resolution involves searching for solutions over an enormous state space, Stratway, like most similar algorithms, searches these spaces by applying heuristics, which present especially difficult verification challenges. An advantage of a modular approach is that it makes a clear distinction between the resolution function and the trajectory generation function. This allows the resolution computation to be independent of any particular vehicle. The Stratway algorithm was developed in both Java and C++ and is available through an open source license. Additionally, there is a visualization application that is helpful when analyzing and quickly creating conflict scenarios.

  11. Strategic financial analysis: the CFO's role in strategic planning.

    PubMed

    Litos, D M

    1985-03-01

    Strategic financial analysis, the financial information support system for the strategic planning process, provides information vital to maintaining a healthy bottom line. This article, the third in HCSM's series on the organizational components of strategic planning, reviews the role of the chief financial officer in determining which programs and services will best meet the future needs of the institution.

  12. Checkpointing for a hybrid computing node

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cher, Chen-Yong

    2016-03-08

    According to an aspect, a method for checkpointing in a hybrid computing node includes executing a task in a processing accelerator of the hybrid computing node. A checkpoint is created in a local memory of the processing accelerator. The checkpoint includes state data to restart execution of the task in the processing accelerator upon a restart operation. Execution of the task is resumed in the processing accelerator after creating the checkpoint. The state data of the checkpoint are transferred from the processing accelerator to a main processor of the hybrid computing node while the processing accelerator is executing the task.
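
    The sequence described above, create a checkpoint in the accelerator's local memory, resume the task, then move the state data to the main processor while the accelerator keeps working, can be mimicked in miniature with ordinary threads. The fragment below is a toy Python analogue under those stated assumptions; the "accelerator" task, the state dictionary, and the transfer delay are placeholders rather than anything from the record itself.

        import copy
        import threading
        import time

        class AcceleratorTask:
            """Toy stand-in for a task executing on a processing accelerator."""
            def __init__(self):
                self.state = {"iteration": 0, "partial_sum": 0.0}   # restart state data

            def run_some_work(self, n_iters):
                for _ in range(n_iters):
                    self.state["iteration"] += 1
                    self.state["partial_sum"] += self.state["iteration"] ** 0.5

        host_checkpoints = []          # stands in for main-processor memory

        def transfer_to_host(snapshot):
            time.sleep(0.05)           # pretend this is a slow device-to-host copy
            host_checkpoints.append(snapshot)

        task = AcceleratorTask()
        for epoch in range(3):
            task.run_some_work(1000)
            # 1) Create the checkpoint in the accelerator's local memory (a local copy).
            snapshot = copy.deepcopy(task.state)
            # 2) Resume execution immediately, 3) while the checkpoint is transferred
            #    to the host concurrently with the ongoing computation.
            mover = threading.Thread(target=transfer_to_host, args=(snapshot,))
            mover.start()
            task.run_some_work(1000)
            mover.join()

        print(len(host_checkpoints), "checkpoints on the host; latest checkpointed iteration:",
              host_checkpoints[-1]["iteration"])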

  13. Validation of GPU-accelerated superposition-convolution dose computations for the Small Animal Radiation Research Platform.

    PubMed

    Cho, Nathan; Tsiamas, Panagiotis; Velarde, Esteban; Tryggestad, Erik; Jacques, Robert; Berbeco, Ross; McNutt, Todd; Kazanzides, Peter; Wong, John

    2018-05-01

    The Small Animal Radiation Research Platform (SARRP) has been developed for conformal microirradiation with on-board cone beam CT (CBCT) guidance. The graphics processing unit (GPU)-accelerated Superposition-Convolution (SC) method for dose computation has been integrated into the treatment planning system (TPS) for SARRP. This paper describes the validation of the SC method for the kilovoltage energy by comparing with EBT2 film measurements and Monte Carlo (MC) simulations. MC data were simulated by EGSnrc code with 3 × 10⁸ to 1.5 × 10⁹ histories, while 21 photon energy bins were used to model the 220 kVp x-rays in the SC method. Various types of phantoms including plastic water, cork, graphite, and aluminum were used to encompass the range of densities of mouse organs. For the comparison, percentage depth dose (PDD) of SC, MC, and film measurements were analyzed. Cross beam (x,y) dosimetric profiles of SC and film measurements are also presented. Correction factors (CFz) to convert SC to MC dose-to-medium are derived from the SC and MC simulations in homogeneous phantoms of aluminum and graphite to improve the estimation. The SC method produces dose values that are within 5% of film measurements and MC simulations in the flat regions of the profile. The dose is less accurate at the edges, due to factors such as geometric uncertainties of film placement and difference in dose calculation grids. The GPU-accelerated Superposition-Convolution dose computation method was successfully validated with EBT2 film measurements and MC calculations. The SC method offers much faster computation speed than MC and provides calculations of both dose-to-water in medium and dose-to-medium in medium. © 2018 American Association of Physicists in Medicine.

  14. Assessing the Relationships among Cloud Adoption, Strategic Alignment and Information Technology Effectiveness

    ERIC Educational Resources Information Center

    Chebrolu, Shankar Babu

    2010-01-01

    Against the backdrop of new economic realities, one of the larger forces that is affecting businesses worldwide is cloud computing, whose benefits include agility, time to market, time to capability, reduced cost, renewed focus on the core and strategic partnership with the business. Cloud computing can potentially transform a majority of the…

  15. A computational model of prefrontal control in free recall: strategic memory use in the California Verbal Learning Task.

    PubMed

    Becker, Suzanna; Lim, Jean

    2003-08-15

    Several decades of research into the function of the frontal lobes in brain-damaged patients, and more recently in intact individuals using functional brain imaging, have delineated the complex executive functions of the frontal cortex. And yet, the mechanisms by which the brain achieves these functions remain poorly understood. Here, we present a computational model of the role of the prefrontal cortex (PFC) in controlled memory use that may help to shed light on the mechanisms underlying one aspect of frontal control: the development and deployment of recall strategies. The model accounts for interactions between the PFC and medial temporal lobe in strategic memory use. The PFC self-organizes its own mnemonic codes using internally derived performance measures. These mnemonic codes serve as retrieval cues by biasing retrieval in the medial temporal lobe memory system. We present data from three simulation experiments that demonstrate strategic encoding and retrieval in the free recall of categorized lists of words. Experiment 1 compares the performance of the model with two control networks to evaluate the contribution of various components of the model. Experiment 2 compares the performance of normal and frontally lesioned models to data from several studies using frontally intact and frontally lesioned individuals, as well as normal, healthy individuals under conditions of divided attention. Experiment 3 compares the model's performance on the recall of blocked and unblocked categorized lists of words to data from Stuss et al. (1994) for individuals with control and frontal lobe lesions. Overall, our model captures a number of aspects of human performance on free recall tasks: an increase in total words recalled and in semantic clustering scores across trials, superiority on blocked lists of related items compared to unblocked lists of related items, and similar patterns of performance across trials in the normal and frontally lesioned models, with poorer overall

  16. Strategic Planning for Computer-Based Educational Technology.

    ERIC Educational Resources Information Center

    Bozeman, William C.

    1984-01-01

    Offers educational practitioners direction for the development of a master plan for the implementation and application of computer-based educational technology by briefly examining computers in education, discussing organizational change from a theoretical perspective, and presenting an overview of the planning strategy known as the planning and…

  17. Acceleration and Velocity Sensing from Measured Strain

    NASA Technical Reports Server (NTRS)

    Pak, Chan-Gi; Truax, Roger

    2016-01-01

    A simple approach for computing acceleration and velocity of a structure from the strain is proposed in this study. First, deflection and slope of the structure are computed from the strain using a two-step theory. Frequencies of the structure are computed from the time histories of strain using a parameter estimation technique together with an Autoregressive Moving Average model. From deflection, slope, and frequencies of the structure, acceleration and velocity of the structure can be obtained using the proposed approach. Keywords: shape sensing, fiber optic strain sensor, system equivalent reduction and expansion process.
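
    A minimal sketch of the general idea, not the two-step theory itself: once a modal deflection history and its identified frequency are available (the paper obtains these from strain together with an Autoregressive Moving Average fit), velocity and acceleration can be estimated without twice-differentiating noisy strain data. All numbers and function names below are illustrative.

      # Minimal sketch, not the NASA two-step method: given a modal deflection history
      # q(t) and its identified frequency f, a harmonic-motion assumption yields
      # acceleration directly, and velocity follows from a single numerical derivative.
      import numpy as np

      def modal_velocity_acceleration(q, f, dt):
          """Estimate velocity and acceleration for a narrow-band modal response.

          q  : deflection time history of one mode (array)
          f  : identified modal frequency in Hz
          dt : sample interval in seconds
          """
          w = 2.0 * np.pi * f
          accel = -w**2 * q                       # harmonic assumption: q'' = -w^2 q
          vel = np.gradient(q, dt)                # central differences for velocity
          return vel, accel

      # Illustrative single-mode example at 5 Hz with a 10 mm deflection amplitude.
      dt = 0.001
      t = np.arange(0.0, 2.0, dt)
      f = 5.0
      q = 0.01 * np.sin(2.0 * np.pi * f * t)

      vel, accel = modal_velocity_acceleration(q, f, dt)
      print("peak velocity  %.4f m/s" % np.abs(vel).max())      # ~ 2*pi*f*0.01
      print("peak accel     %.4f m/s^2" % np.abs(accel).max())  # ~ (2*pi*f)^2*0.01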

  18. Strategic Plan. Volume 2

    NASA Technical Reports Server (NTRS)

    2002-01-01

    The Mission of the NSBRI will be to lead a National effort for accomplishing the integrated, critical path, biomedical research necessary to support the long term human presence, development, and exploration of space and to enhance life on Earth by applying the resultant advances in human knowledge and technology acquired through living and working in space. To carry out this mission, the NSBRI focuses its activities on three Strategic Programs: Strategic Program 1: Countermeasure Research Strategic Program 2: Education, Training and Outreach Strategic Program 3: Cooperative Research and Development. This document contains the detailed Team Strategic Plans for the 11 research teams focused on Strategic Program 1, and the Education and Outreach Team focused on Strategic Program 2. There is overlap and integration among the Programs and Team Strategic Plans, as described in each of the Plans.

  19. Neural signatures of strategic types in a two-person bargaining game

    PubMed Central

    Bhatt, Meghana A.; Lohrenz, Terry; Camerer, Colin F.; Montague, P. Read

    2010-01-01

    The management and manipulation of our own social image in the minds of others requires difficult and poorly understood computations. One computation useful in social image management is strategic deception: our ability and willingness to manipulate other people's beliefs about ourselves for gain. We used an interpersonal bargaining game to probe the capacity of players to manage their partner's beliefs about them. This probe parsed the group of subjects into three behavioral types according to their revealed level of strategic deception; these types were also distinguished by neural data measured during the game. The most deceptive subjects emitted behavioral signals that mimicked a more benign behavioral type, and their brains showed differential activation in right dorsolateral prefrontal cortex and left Brodmann area 10 at the time of this deception. In addition, strategic types showed a significant correlation between activation in the right temporoparietal junction and expected payoff that was absent in the other groups. The neurobehavioral types identified by the game raise the possibility of identifying quantitative biomarkers for the capacity to manipulate and maintain a social image in another person's mind. PMID:21041646

  20. Economic Modeling as a Component of Academic Strategic Planning.

    ERIC Educational Resources Information Center

    MacKinnon, Joyce; Sothmann, Mark; Johnson, James

    2001-01-01

    Computer-based economic modeling was used to enable a school of allied health to define outcomes, identify associated costs, develop cost and revenue models, and create a financial planning system. As a strategic planning tool, it assisted realistic budgeting and improved efficiency and effectiveness. (Contains 18 references.) (SK)

  1. 11. Strategic planning.

    PubMed

    2014-05-01

    There are several types of planning processes and plans, including strategic, operational, tactical, and contingency. For this document, operational planning includes tactical planning. This chapter examines the strategic planning process and includes an introduction into disaster response plans. "A strategic plan is an outline of steps designed with the goals of the entire organisation as a whole in mind, rather than with the goals of specific divisions or departments". Strategic planning includes all measures taken to provide a broad picture of what must be achieved and in which order, including how to organise a system capable of achieving the overall goals. Strategic planning often is done pre-event, based on previous experience and expertise. The strategic planning for disasters converts needs into a strategic plan of action. Strategic plans detail the goals that must be achieved. The process of converting needs into plans has been deconstructed into its components and includes consideration of: (1) disaster response plans; (2) interventions underway or planned; (3) available resources; (4) current status vs. pre-event status; (5) history and experience of the planners; and (6) access to the affected population. These factors are tempered by the local: (a) geography; (b) climate; (c) culture; (d) safety; and (e) practicality. The planning process consumes resources (costs). All plans must be adapted to the actual conditions--things never happen exactly as planned.

  2. The Strategic Attitude: Integrating Strategic Planning into Daily University Worklife

    ERIC Educational Resources Information Center

    Dickmeyer, Nathan

    2004-01-01

    Chief financial officers in today's universities are so busy with the challenges of day-to-day management that strategic thinking often takes a back seat. Planning for strategic change can go a long way toward streamlining the very daily tasks that obscure the "big picture." Learning how to integrate strategic thinking into day-to-day management…

  3. Journey to the 21st Century. A Summary of OCLC's Strategic Plan.

    ERIC Educational Resources Information Center

    OCLC Online Computer Library Center, Inc., Dublin, OH.

    This report on some of the strategic planning decisions that OCLC has made for the 21st century begins by describing the evolution of OCLC from a pioneer in the computer revolution with its Online Union Catalog and Shared Cataloging System in 1971 to a system that currently has nearly 60 distinct offerings. Corresponding computer and…

  4. Accelerator on a Chip

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    England, Joel

    2014-06-30

    SLAC's Joel England explains how the same fabrication techniques used for silicon computer microchips allowed their team to create the new laser-driven particle accelerator chips. (SLAC Multimedia Communications)

  5. Accelerator on a Chip

    ScienceCinema

    England, Joel

    2018-01-16

    SLAC's Joel England explains how the same fabrication techniques used for silicon computer microchips allowed their team to create the new laser-driven particle accelerator chips. (SLAC Multimedia Communications)

  6. Strategic Leadership Reconsidered

    ERIC Educational Resources Information Center

    Davies, Brent; Davies, Barbara J.

    2005-01-01

    This paper will address the challenge of how strategic leadership can be defined and articulated to provide a framework for developing a strategically focused school drawing on a NCSL research project. The paper is structured into three main parts. Part one outlines the elements that comprise a strategically focused school, develops an…

  7. Accelerated time-of-flight (TOF) PET image reconstruction using TOF bin subsetization and TOF weighting matrix pre-computation.

    PubMed

    Mehranian, Abolfazl; Kotasidis, Fotis; Zaidi, Habib

    2016-02-07

    FDG-PET study also revealed that for the same noise level, a higher contrast recovery can be obtained by increasing the number of TOF subsets. It can be concluded that the proposed TOF weighting matrix pre-computation and subsetization approaches enable to further accelerate and improve the convergence properties of OSEM and MLEM algorithms, thus opening new avenues for accelerated TOF PET image reconstruction.

  8. Accelerated time-of-flight (TOF) PET image reconstruction using TOF bin subsetization and TOF weighting matrix pre-computation

    NASA Astrophysics Data System (ADS)

    Mehranian, Abolfazl; Kotasidis, Fotis; Zaidi, Habib

    2016-02-01

    FDG-PET study also revealed that for the same noise level, a higher contrast recovery can be obtained by increasing the number of TOF subsets. It can be concluded that the proposed TOF weighting matrix pre-computation and subsetization approaches enable to further accelerate and improve the convergence properties of OSEM and MLEM algorithms, thus opening new avenues for accelerated TOF PET image reconstruction.
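
    A toy-dimension sketch of the subsetization idea named in the title, assuming an OSEM-style update in which the subsets are formed over TOF bins. This is an illustration of the concept, not the authors' implementation; the matrices A[b] stand in for the pre-computed TOF weighting/system matrices.

      # Minimal sketch: OSEM with subsets formed over time-of-flight (TOF) bins.
      # A[b] is the system/weighting matrix for TOF bin b, y[b] the counts in that bin.
      import numpy as np

      def osem_tof_bin_subsets(A, y, n_iter=10, n_subsets=2):
          n_bins = len(A)
          n_vox = A[0].shape[1]
          x = np.ones(n_vox)
          subsets = [list(range(s, n_bins, n_subsets)) for s in range(n_subsets)]
          for _ in range(n_iter):
              for subset in subsets:
                  ratio_backproj = np.zeros(n_vox)
                  sens = np.zeros(n_vox)
                  for b in subset:
                      proj = A[b] @ x
                      proj[proj == 0] = 1e-12               # avoid division by zero
                      ratio_backproj += A[b].T @ (y[b] / proj)
                      sens += A[b].sum(axis=0)
                  x *= ratio_backproj / np.maximum(sens, 1e-12)
          return x

      # Tiny synthetic example: 3 TOF bins, 4 detector elements, 5 voxels.
      rng = np.random.default_rng(0)
      A = [rng.random((4, 5)) for _ in range(3)]
      x_true = np.array([1.0, 2.0, 0.5, 3.0, 1.5])
      y = [a @ x_true for a in A]
      print("true:", x_true)
      print("est :", np.round(osem_tof_bin_subsets(A, y, n_iter=50, n_subsets=3), 3))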

  9. Strategic groups, performance, and strategic response in the nursing home industry.

    PubMed

    Zinn, J S; Aaronson, W E; Rosko, M D

    1994-06-01

    This study examines the effect of strategic group membership on nursing home performance and strategic behavior. Data from the 1987 Medicare and Medicaid Automated Certification Survey were combined with data from the 1987 and 1989 Pennsylvania Long Term Care Facility Questionnaire. The sample consisted of 383 Pennsylvania nursing homes. Cluster analysis was used to place the 383 nursing homes into strategic groups on the basis of variables measuring scope and resource deployment. Performance was measured by indicators of the quality of nursing home care (rates of pressure ulcers, catheterization, and restraint usage) and efficiency in services provision. Changes in Medicare participation after passage of the 1988 Medicare Catastrophic Coverage Act (MCCA) measured strategic behavior. MANOVA and Tukey HSD post hoc means tests determined if significant differences were associated with strategic group membership. Cluster analysis produced an optimal seven-group solution. Differences in group means were significant for the clustering, performance, and conduct variables (p < .0001). Strategic groups characterized by facilities providing a continuum of care services had the best patient care outcomes. The most efficient groups were characterized by facilities with high Medicare census. While all strategic groups increased Medicare census following passage of the MCCA, those dominated by for-profits had the greatest increases. Our analysis demonstrates that strategic orientation influences nursing home response to regulatory initiatives, a factor that should be recognized in policy formation directed at nursing home reform.

  10. Strategic groups, performance, and strategic response in the nursing home industry.

    PubMed Central

    Zinn, J S; Aaronson, W E; Rosko, M D

    1994-01-01

    OBJECTIVE. This study examines the effect of strategic group membership on nursing home performance and strategic behavior. DATA SOURCES AND STUDY SETTING. Data from the 1987 Medicare and Medicaid Automated Certification Survey were combined with data from the 1987 and 1989 Pennsylvania Long Term Care Facility Questionnaire. The sample consisted of 383 Pennsylvania nursing homes. STUDY DESIGN. Cluster analysis was used to place the 383 nursing homes into strategic groups on the basis of variables measuring scope and resource deployment. Performance was measured by indicators of the quality of nursing home care (rates of pressure ulcers, catheterization, and restraint usage) and efficiency in services provision. Changes in Medicare participation after passage of the 1988 Medicare Catastrophic Coverage Act (MCCA) measured strategic behavior. MANOVA and Tukey HSD post hoc means tests determined if significant differences were associated with strategic group membership. FINDINGS. Cluster analysis produced an optimal seven-group solution. Differences in group means were significant for the clustering, performance, and conduct variables (p < .0001). Strategic groups characterized by facilities providing a continuum of care services had the best patient care outcomes. The most efficient groups were characterized by facilities with high Medicare census. While all strategic groups increased Medicare census following passage of the MCCA, those dominated by for-profits had the greatest increases. CONCLUSIONS. Our analysis demonstrates that strategic orientation influences nursing home response to regulatory initiatives, a factor that should be recognized in policy formation directed at nursing home reform. PMID:8005789

  11. NASA Strategic Plan

    NASA Technical Reports Server (NTRS)

    1996-01-01

    The aforementioned strategic decisions and the overarching direction for America's aeronautics and space program are addressed in the Strategic Plan. Our Strategic Plan is critical to our ability to meet the challenges of this new era and deliver a vibrant aeronautics and space program that strengthens and inspires the Nation. The Plan is our top-level strategy.

  12. Acceleration and torque feedback for robotic control - Experimental results

    NASA Technical Reports Server (NTRS)

    McInroy, John E.; Saridis, George N.

    1990-01-01

    Gross motion control of robotic manipulators typically requires significant on-line computations to compensate for nonlinear dynamics due to gravity, Coriolis, centripetal, and friction nonlinearities. One controller proposed by Luo and Saridis avoids these computations by feeding back joint acceleration and torque. This study implements the controller on a Puma 600 robotic manipulator. Joint acceleration measurement is obtained by measuring linear accelerations of each joint, and deriving a computationally efficient transformation from the linear measurements to the angular accelerations. Torque feedback is obtained by using the previous torque sent to the joints. The implementation has stability problems on the Puma 600 due to the extremely high gains inherent in the feedback structure. Since these high gains excite frequency modes in the Puma 600, the algorithm is modified to decrease the gain inherent in the feedback structure. The resulting compensator is stable and insensitive to high frequency unmodeled dynamics. Moreover, a second compensator is proposed which uses acceleration and torque feedback, but still allows nonlinear terms to be fed forward. Thus, by feeding the increment in the easily calculated gravity terms forward, improved responses are obtained. Both proposed compensators are implemented, and the real time results are compared to those obtained with the computed torque algorithm.

  13. Healthcare's Future: Strategic Investment in Technology.

    PubMed

    Franklin, Michael A

    2018-01-01

    Recent and rapid advances in the implementation of technology have greatly affected the quality and efficiency of healthcare delivery in the United States. Simultaneously, diverse generational pressures-including the consumerism of millennials and unsustainable growth in the costs of care for baby boomers-have accelerated a revolution in healthcare delivery that was marked in 2010 by the passage of the Affordable Care Act.Against this backdrop, Maryland and the Centers for Medicare & Medicaid Services entered into a partnership in 2014 to modernize the Maryland All-Payer Model. Under this architecture, each Maryland hospital negotiates a global budget revenue agreement with the state's rate-setting agency, limiting the hospital's annual revenue to the budgetary cap established by the state.At Atlantic General Hospital (AGH), leaders had established a disciplined strategic planning process in which the board of trustees, medical staff, and administration annually agree on goals and initiatives to achieve the objectives set forth in its five-year strategic plans. This article describes two initiatives to improve care using technology. In 2006, AGH introduced a service guarantee in the emergency room (ER); the ER 30-Minute Promise assures patients that they will be placed in a bed or receive care within 30 minutes of arrival in the ER. In 2007, several independent hospitals in the state formed Maryland eCare to jointly contract for intensive care unit (ICU) physician coverage via telemedicine. This technology allows clinical staff to continuously monitor ICU patients remotely. The positive results of the ER 30-Minute Promise and Maryland eCare program show that technological advances in an independent, small, rural hospital can make a significant impact on its ability to maintain independence. AGH's strategic investments prepared the organization well for the transition in 2014 to a value-based payment system.

  14. Accelerated Adaptive MGS Phase Retrieval

    NASA Technical Reports Server (NTRS)

    Lam, Raymond K.; Ohara, Catherine M.; Green, Joseph J.; Bikkannavar, Siddarayappa A.; Basinger, Scott A.; Redding, David C.; Shi, Fang

    2011-01-01

    The Modified Gerchberg-Saxton (MGS) algorithm is an image-based wavefront-sensing method that can turn any science instrument focal plane into a wavefront sensor. MGS characterizes optical systems by estimating the wavefront errors in the exit pupil using only intensity images of a star or other point source of light. This innovative implementation of MGS significantly accelerates the MGS phase retrieval algorithm by using stream-processing hardware on conventional graphics cards. Stream processing is a relatively new, yet powerful, paradigm to allow parallel processing of certain applications that apply single instructions to multiple data (SIMD). These stream processors are designed specifically to support large-scale parallel computing on a single graphics chip. Computationally intensive algorithms, such as the Fast Fourier Transform (FFT), are particularly well suited for this computing environment. This high-speed version of MGS exploits commercially available hardware to accomplish the same objective in a fraction of the original time. The exploit involves performing matrix calculations in nVidia graphic cards. The graphical processor unit (GPU) is hardware that is specialized for computationally intensive, highly parallel computation. From the software perspective, a parallel programming model is used, called CUDA, to transparently scale multicore parallelism in hardware. This technology gives computationally intensive applications access to the processing power of the nVidia GPUs through a C/C++ programming interface. The AAMGS (Accelerated Adaptive MGS) software takes advantage of these advanced technologies, to accelerate the optical phase error characterization. With a single PC that contains four nVidia GTX-280 graphic cards, the new implementation can process four images simultaneously to produce a JWST (James Webb Space Telescope) wavefront measurement 60 times faster than the previous code.
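
    For orientation, a minimal sketch of the classic Gerchberg-Saxton iteration that MGS builds on (the published MGS algorithm adds model-based constraints and diversity images not shown here). The FFTs dominate the cost, which is why mapping them to GPU stream processors pays off; with a GPU array library the np.fft calls could be swapped for GPU equivalents. All sizes and the test aberration are illustrative.

      # Minimal sketch of classic Gerchberg-Saxton phase retrieval with numpy FFTs.
      import numpy as np

      def gerchberg_saxton(pupil_amp, focal_amp, n_iter=200):
          """Recover the pupil-plane phase from amplitude constraints in two planes."""
          phase = np.zeros_like(pupil_amp)
          for _ in range(n_iter):
              field_pupil = pupil_amp * np.exp(1j * phase)
              field_focal = np.fft.fft2(field_pupil)
              # Impose the measured focal-plane amplitude, keep the computed phase.
              field_focal = focal_amp * np.exp(1j * np.angle(field_focal))
              field_pupil = np.fft.ifft2(field_focal)
              phase = np.angle(field_pupil)
          return phase

      # Illustrative example: circular pupil with a known low-order aberration.
      n = 64
      y, x = np.mgrid[-1:1:n*1j, -1:1:n*1j]
      pupil = (x**2 + y**2 <= 1.0).astype(float)
      true_phase = 0.5 * (x**2 - y**2) * pupil            # astigmatism-like term
      focal_amp = np.abs(np.fft.fft2(pupil * np.exp(1j * true_phase)))

      est = gerchberg_saxton(pupil, focal_amp)
      recon = np.abs(np.fft.fft2(pupil * np.exp(1j * est)))
      rel_err = np.linalg.norm(recon - focal_amp) / np.linalg.norm(focal_amp)
      print("relative focal-plane amplitude residual: %.3e" % rel_err)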

  15. 78 FR 67131 - Notice of Advisory Committee Closed Meeting; U.S. Strategic Command Strategic Advisory Group

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-11-08

    .... Strategic Command Strategic Advisory Group AGENCY: Department of Defense. ACTION: Notice of Advisory... following Federal Advisory Committee meeting of the U.S. Strategic Command Strategic Advisory Group. DATES... issues to the Commander, U.S. Strategic Command, during the development of the Nation's strategic war...

  16. 77 FR 54615 - Strategic Management Program; Fiscal Year 2013-2016 Strategic Plan

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-09-05

    ... Manager, Strategic Management Program; National Transportation Safety Board, 490 L'Enfant Plaza SW., MD-1... NATIONAL TRANSPORTATION SAFETY BOARD Strategic Management Program; Fiscal Year 2013-2016 Strategic Plan AGENCY: National Transportation Safety Board. ACTION: Notice: Request for comments. SUMMARY: This...

  17. Architectural requirements for the Red Storm computing system.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Camp, William J.; Tomkins, James Lee

    This report is based on the Statement of Work (SOW) describing the various requirements for delivering a new supercomputer system to Sandia National Laboratories (Sandia) as part of the Department of Energy's (DOE) Accelerated Strategic Computing Initiative (ASCI) program. This system is named Red Storm and will be a distributed memory, massively parallel processor (MPP) machine built primarily out of commodity parts. The requirements presented here distill extensive architectural and design experience accumulated over a decade and a half of research, development and production operation of similar machines at Sandia. Red Storm will have an unusually high bandwidth, low latency interconnect, specially designed hardware and software reliability features, a light weight kernel compute node operating system and the ability to rapidly switch major sections of the machine between classified and unclassified computing environments. Particular attention has been paid to architectural balance in the design of Red Storm, and it is therefore expected to achieve an atypically high fraction of its peak speed of 41 TeraOPS on real scientific computing applications. In addition, Red Storm is designed to be upgradeable to many times this initial peak capability while still retaining appropriate balance in key design dimensions. Installation of the Red Storm computer system at Sandia's New Mexico site is planned for 2004, and it is expected that the system will be operated for a minimum of five years following installation.

  18. Scientific Computing Strategic Plan for the Idaho National Laboratory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Whiting, Eric Todd

    Scientific computing is a critical foundation of modern science. Without innovations in the field of computational science, the essential missions of the Department of Energy (DOE) would go unrealized. Taking a leadership role in such innovations is Idaho National Laboratory’s (INL’s) challenge and charge, and is central to INL’s ongoing success. Computing is an essential part of INL’s future. DOE science and technology missions rely firmly on computing capabilities in various forms. Modeling and simulation, fueled by innovations in computational science and validated through experiment, are a critical foundation of science and engineering. Big data analytics from an increasing number of widely varied sources is opening new windows of insight and discovery. Computing is a critical tool in education, science, engineering, and experiments. Advanced computing capabilities in the form of people, tools, computers, and facilities will position INL competitively to deliver results and solutions on important national science and engineering challenges. A computing strategy must include much more than simply computers. The foundational enabling component of computing at many DOE national laboratories is the combination of a showcase-like data center facility coupled with a very capable supercomputer. In addition, network connectivity, disk storage systems, and visualization hardware are critical and generally tightly coupled to the computer system and co-located in the same facility. The existence of these resources in a single data center facility opens the doors to many opportunities that would not otherwise be possible.

  19. 75 FR 18824 - Federal Advisory Committee; U.S. Strategic Command Strategic Advisory Group; Closed Meeting

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-04-13

    ... DEPARTMENT OF DEFENSE Office of the Secretary Federal Advisory Committee; U.S. Strategic Command... 102-3.150, the Department of Defense announces that the U.S. Strategic Command Strategic Advisory... Commander, U.S. Strategic Command, during the development of the Nation's strategic war plans. Agenda Topics...

  20. News | Computing

    Science.gov Websites

    Support News Publications Computing for Experiments Computing for Neutrino and Muon Physics Computing for Collider Experiments Computing for Astrophysics Research and Development Accelerator Modeling ComPASS - Impact of Detector Simulation on Particle Physics Collider Experiments Daniel Elvira's paper "Impact

  1. Proceeding of the 1999 Particle Accelerator Conference. Volume 3

    DTIC Science & Technology

    1999-04-02

    conditioning, a laser pulse was irradiated on a copper cathode and the photo-emitted beam was accelerated up to 2.9 MeV. An effective quantum...dipole magnet and a vacuum Nd:YAG laser pulse irradiation. As a result, the pumping unit. The gun cavity has two s-band cells made maximum energy and the...Optimizing beam intensity in the AGS involves a correctors at strategic locations are pulsed to minimize the compromise between conflicting needs to

  2. GPU Accelerated Prognostics

    NASA Technical Reports Server (NTRS)

    Gorospe, George E., Jr.; Daigle, Matthew J.; Sankararaman, Shankar; Kulkarni, Chetan S.; Ng, Eley

    2017-01-01

    Prognostic methods enable operators and maintainers to predict the future performance for critical systems. However, these methods can be computationally expensive and may need to be performed each time new information about the system becomes available. In light of these computational requirements, we have investigated the application of graphics processing units (GPUs) as a computational platform for real-time prognostics. Recent advances in GPU technology have reduced cost and increased the computational capability of these highly parallel processing units, making them more attractive for the deployment of prognostic software. We present a survey of model-based prognostic algorithms with considerations for leveraging the parallel architecture of the GPU and a case study of GPU-accelerated battery prognostics with computational performance results.

  3. Nonlinear theory of diffusive acceleration of particles by shock waves

    NASA Astrophysics Data System (ADS)

    Malkov, M. A.; Drury, L. O'C.

    2001-04-01

    Among the various acceleration mechanisms which have been suggested as responsible for the nonthermal particle spectra and associated radiation observed in many astrophysical and space physics environments, diffusive shock acceleration appears to be the most successful. We review the current theoretical understanding of this process, from the basic ideas of how a shock energizes a few reactionless particles to the advanced nonlinear approaches treating the shock and accelerated particles as a symbiotic self-organizing system. By means of direct solution of the nonlinear problem we set the limit to the test-particle approximation and demonstrate the fundamental role of nonlinearity in shocks of astrophysical size and lifetime. We study the bifurcation of this system, proceeding from the hydrodynamic to kinetic description under a realistic condition of Bohm diffusivity. We emphasize the importance of collective plasma phenomena for the global flow structure and acceleration efficiency by considering the injection process, an initial stage of acceleration and, the related aspects of the physics of collisionless shocks. We calculate the injection rate for different shock parameters and different species. This, together with differential acceleration resulting from nonlinear large-scale modification, determines the chemical composition of accelerated particles. The review concentrates on theoretical and analytical aspects but our strategic goal is to link the fundamental theoretical ideas with the rapidly growing wealth of observational data.

  4. Advanced induction accelerator designs for ground based and space based FELs

    NASA Astrophysics Data System (ADS)

    Birx, Daniel

    1994-04-01

    The primary goal of this program was to improve the performance of induction accelerators with particular regards to their being used to drive Free Electron Lasers (FEL's). It is hoped that FEL's operating at visible wavelengths might someday be used to beam power from earth to extraterrestrial locations. One application of this technology might be strategic theater defense, but this power source might be used to propel vehicles or supplement solar energized systems. Our path toward achieving this goal was directed first toward optimization of the nonlinear magnetic material used in induction accelerator construction and secondly at the overall design in terms of cost, size and efficiency. We began this research effort with an in-depth study into the properties of various nonlinear magnetic materials. With the data on nonlinear magnetic materials, so important to the optimization of efficiency, in hand, we envisioned a new induction accelerator design where all of the components were packaged together in one container. This induction accelerator module would combine an all-solid-state, nonlinear magnetic driver and the induction accelerator cells all in one convenient package. Each accelerator module (denoted SNOMAD-IVB) would produce 1.0 MeV of acceleration with the exception of the SNOMAD-IV injector module which would produce 0.5 MeV of acceleration for an electron beam current up to 1000 amperes.

  5. Austin Community College Learning Resource Services Strategic Plan, 1992-1997.

    ERIC Educational Resources Information Center

    Austin Community Coll., TX.

    Designed as a planning tool and a statement of philosophy and mission, this five-part strategic planning report provides information on the activities, goals, and review processes of the Learning Resource Services (LRS) at Austin Community College in Austin, Texas. The LRS combines library services, access to computer terminals, and other…

  6. Accelerating population balance-Monte Carlo simulation for coagulation dynamics from the Markov jump model, stochastic algorithm and GPU parallel computing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Xu, Zuwei; Zhao, Haibo, E-mail: klinsmannzhb@163.com; Zheng, Chuguang

    2015-01-15

    This paper proposes a comprehensive framework for accelerating population balance-Monte Carlo (PBMC) simulation of particle coagulation dynamics. By combining Markov jump model, weighted majorant kernel and GPU (graphics processing unit) parallel computing, a significant gain in computational efficiency is achieved. The Markov jump model constructs a coagulation-rule matrix of differentially-weighted simulation particles, so as to capture the time evolution of particle size distribution with low statistical noise over the full size range and as far as possible to reduce the number of time loopings. Here three coagulation rules are highlighted and it is found that constructing an appropriate coagulation rule provides a route to attain the compromise between accuracy and cost of PBMC methods. Further, in order to avoid double looping over all simulation particles when considering the two-particle events (typically, particle coagulation), the weighted majorant kernel is introduced to estimate the maximum coagulation rates being used for acceptance–rejection processes by single-looping over all particles, and meanwhile the mean time-step of coagulation event is estimated by summing the coagulation kernels of rejected and accepted particle pairs. The computational load of these fast differentially-weighted PBMC simulations (based on the Markov jump model) is reduced greatly to be proportional to the number of simulation particles in a zero-dimensional system (single cell). Finally, for a spatially inhomogeneous multi-dimensional (multi-cell) simulation, the proposed fast PBMC is performed in each cell, and multiple cells are parallel processed by multi-cores on a GPU that can implement the massively threaded data-parallel tasks to obtain remarkable speedup ratio (comparing with CPU computation, the speedup ratio of GPU parallel computing is as high as 200 in a case of 100 cells with 10 000 simulation particles per cell). These accelerating approaches of PBMC
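
    A minimal sketch of the acceptance-rejection step described above, assuming a simple sum kernel and an unweighted particle ensemble (the paper uses differentially-weighted particles and GPU parallelism across cells; the kernel, the majorant, and all function names here are illustrative).

      # Minimal sketch: one acceptance-rejection coagulation step using a majorant
      # kernel.  A candidate pair is accepted with probability K(v_i, v_j) / K_maj,
      # which avoids evaluating the kernel for every possible pair.
      import numpy as np

      rng = np.random.default_rng(1)

      def coag_kernel(vi, vj):
          """Example coagulation kernel (sum kernel); the real kernel is problem-specific."""
          return vi + vj

      def majorant(volumes):
          """An upper bound on the kernel over all pairs (here: twice the max volume)."""
          return 2.0 * volumes.max()

      def coagulation_step(volumes, n_candidates=1000):
          v = volumes.copy()
          k_maj = majorant(v)
          for _ in range(n_candidates):
              if len(v) < 2:
                  break
              i, j = rng.choice(len(v), size=2, replace=False)
              if rng.random() < coag_kernel(v[i], v[j]) / k_maj:   # acceptance-rejection
                  v[i] = v[i] + v[j]                               # merge j into i
                  v = np.delete(v, j)
                  k_maj = majorant(v)
          return v

      v0 = rng.uniform(0.5, 1.5, size=10_000)      # initial particle volumes
      v1 = coagulation_step(v0)
      print("particles before/after one step: %d / %d" % (len(v0), len(v1)))
      print("mean volume before/after: %.3f / %.3f" % (v0.mean(), v1.mean()))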

  7. Learning to think strategically.

    PubMed

    1994-01-01

    Strategic thinking focuses on issues that directly affect the ability of a family planning program to attract and retain clients. This issue of "The Family Planning Manager" outlines the five steps of strategic thinking in family planning administration: 1) define the organization's mission and strategic goals; 2) identify opportunities for improving quality, expanding access, and increasing demand; 3) evaluate each option in terms of its compatibility with the organization's goals; 4) select an option; and 5) transform strategies into action. Also included in this issue is a 20-question test designed to permit readers to assess their "strategic thinking quotient" and a list of sample questions to guide a strategic analysis.

  8. Generation of nanosecond neutron pulses in vacuum accelerating tubes

    NASA Astrophysics Data System (ADS)

    Didenko, A. N.; Shikanov, A. E.; Rashchikov, V. I.; Ryzhkov, V. I.; Shatokhin, V. L.

    2014-06-01

    The generation of neutron pulses with a duration of 1-100 ns using small vacuum accelerating tubes is considered. Two physical models of acceleration of short deuteron bunches in pulse neutron generators are described. The dependences of an instantaneous neutron flux in accelerating tubes on the parameters of pulse neutron generators are obtained using computer simulation. The results of experimental investigation of short-pulse neutron generators based on the accelerating tube with a vacuum-arc deuteron source, connected in the circuit with a discharge peaker, and an accelerating tube with a laser deuteron source, connected according to the Arkad'ev-Marx circuit, are given. In the experiments, the neutron yield per pulse reached 10^7 for a pulse duration of 10-100 ns. The resultant experimental data are in satisfactory agreement with the results of computer simulation.

  9. 76 FR 14950 - Closed Meeting of the U.S. Strategic Command Strategic Advisory Group

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-03-18

    ... DEPARTMENT OF DEFENSE Office of the Secretary Closed Meeting of the U.S. Strategic Command.... Strategic Command Strategic Advisory Group. DATES: April 7, 2011, from 8 a.m. to 5 p.m. and April 8, 2011... policy-related issues to the Commander, U.S. Strategic Command, during the development of the Nation's...

  10. An Adiabatic Phase-Matching Accelerator

    DOE PAGES

    Lemery, Francois; Floettmann, Klaus; Piot, Philippe; ...

    2018-05-25

    We present a general concept to accelerate non-relativistic charged particles. Our concept employs an adiabatically-tapered dielectric-lined waveguide which supports accelerating phase velocities for synchronous acceleration. We propose an ansatz for the transient field equations, show it satisfies Maxwell's equations under an adiabatic approximation and find excellent agreement with a finite-difference time-domain computer simulation. The fields were implemented into the particle-tracking program ASTRA and we present beam dynamics results for an accelerating field with a 1-mm wavelength and peak electric field of 100 MV/m. The numerical simulations indicate that a ~200 keV electron beam can be accelerated to an energy of ~10 MeV over ~10 cm. The novel scheme is also found to form electron beams with parameters of interest to a wide range of applications including, e.g., future advanced accelerators, and ultra-fast electron diffraction.

  11. An Adiabatic Phase-Matching Accelerator

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lemery, Francois; Floettmann, Klaus; Piot, Philippe

    2017-12-22

    We present a general concept to accelerate non-relativistic charged particles. Our concept employs an adiabatically-tapered dielectric-lined waveguide which supports accelerating phase velocities for synchronous acceleration. We propose an ansatz for the transient field equations, show it satisfies Maxwell's equations under an adiabatic approximation and find excellent agreement with a finite-difference time-domain computer simulation. The fields were implemented into the particle-tracking program ASTRA and we present beam dynamics results for an accelerating field with a 1-mm wavelength and peak electric field of 100 MV/m. The numerical simulations indicate that a ~200 keV electron beam can be accelerated to an energy of ~10 MeV over ~10 cm. The novel scheme is also found to form electron beams with parameters of interest to a wide range of applications including, e.g., future advanced accelerators, and ultra-fast electron diffraction.
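
    A back-of-the-envelope check of the quoted numbers, assuming a particle that stays synchronous with the field: the kinetic-energy gain is just the integral of the gradient over the structure length, and the computed velocity profile shows why an adiabatic taper of the phase velocity is needed while beta is still well below 1. The script below is illustrative only.

      # Sanity check: 100 MV/m over ~10 cm reproduces the ~10 MeV figure, and the
      # required phase-velocity (beta) profile motivates the adiabatic taper.
      import numpy as np

      E0_MEV = 0.511          # electron rest energy [MeV]
      E_ACC = 100.0           # accelerating gradient [MeV/m]
      L = 0.10                # structure length [m]
      KE_START = 0.2          # injected kinetic energy [MeV]

      z = np.linspace(0.0, L, 1001)
      ke = KE_START + E_ACC * z                   # synchronous energy gain
      gamma = 1.0 + ke / E0_MEV
      beta = np.sqrt(1.0 - 1.0 / gamma**2)        # required phase-velocity profile

      print("final kinetic energy: %.1f MeV" % ke[-1])
      print("beta at entrance / exit: %.3f / %.4f" % (beta[0], beta[-1]))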

  12. Processing of intended and unintended strategic issues and integration into the strategic agenda.

    PubMed

    Ridder, Hans-Gerd; Schrader, Jan Simon

    2017-11-01

    Strategic change is needed in hospitals due to external and internal pressures. However, research on strategic change, as a combination of management and medical expertise in hospitals, remains scarce. We analyze how intended strategic issues are processed into deliberate strategies and how unintended strategic issues are processed into emergent strategies in the management of strategy formation in hospitals. This study empirically investigates the integration of medical and management expertise in strategy formation. The longitudinal character of the case study enabled us to track patterns of intended and unintended strategic issues over 2 years. We triangulated data from interviews, observations, and documents. In accordance with the quality standards of qualitative research procedures, we analyzed the data by pattern matching and provided analytical generalization regarding strategy formation in hospitals. Our findings suggest that strategic issues are particularly successful within the strategy formation process if interest groups are concerned with the strategic issue, prospective profits are estimated, and relevant decision makers are involved early on. Structure and interaction processes require clear criteria and transparent procedures for effective strategy formation. There is systematic neglect of medical expertise in processes of generating strategies. Our study reveals that the decentralized structure of medical centers is an adequate template for both the operationalization of intended strategic issues and the development of unintended strategic issues. However, tasks, roles, responsibility, resources, and administrative support are necessary for effective management of strategy formation. Similarly, criteria, procedures, and decision-making are prerequisites for effective strategy formation.

  13. 77 FR 61581 - Notice of Advisory Committee Closed Meeting; U.S. Strategic Command Strategic Advisory Group

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-10-10

    .... Strategic Command Strategic Advisory Group AGENCY: Department of Defense. ACTION: Notice of Advisory... advisory committee: U.S. Strategic Command Strategic Advisory Group. DATES: November 15, 2012, from 8 a.m... Command, during the development of the Nation's strategic war plans. Agenda: Topics include: Policy Issues...

  14. Strategic Planning for Independent Schools.

    ERIC Educational Resources Information Center

    Stone, Susan C.

    This manual is intended to serve independent schools beginning strategic planning methods. Chapter 1, "The Case for Strategic Planning," suggests replacing the term "long range planning" with the term "strategic planning," which emphasizes change. The strategic planning and policy development process begins with…

  15. Strategic Air Traffic Planning Using Eulerian Route Based Modeling and Optimization

    NASA Astrophysics Data System (ADS)

    Bombelli, Alessandro

    Due to a soaring air travel growth in the last decades, air traffic management has become increasingly challenging. As a consequence, planning tools are being devised to help human decision-makers achieve a better management of air traffic. Planning tools are divided into two categories, strategic and tactical. Strategic planning generally addresses a larger planning domain and is performed days to hours in advance. Tactical planning is more localized and is performed hours to minutes in advance. An aggregate route model for strategic air traffic flow management is presented. It is an Eulerian model, describing the flow between cells of unidirectional point-to-point routes. Aggregate routes are created from flight trajectory data based on similarity measures. Spatial similarity is determined using the Frechet distance. The aggregate routes approximate actual well-traveled traffic patterns. By specifying the model resolution, an appropriate balance between model accuracy and model dimension can be achieved. For a particular planning horizon, during which weather is expected to restrict the flow, a procedure for designing airborne reroutes and augmenting the traffic flow model is developed. The dynamics of the traffic flow on the resulting network take the form of a discrete-time, linear time-invariant system. The traffic flow controls are ground holding, pre-departure rerouting and airborne rerouting. Strategic planning--determining how the controls should be used to modify the future traffic flow when local capacity violations are anticipated--is posed as an integer programming problem of minimizing a weighted sum of flight delays subject to control and capacity constraints. Several tests indicate the effectiveness of the modeling and strategic planning approach. In the final, most challenging, test, strategic planning is demonstrated for the six western-most Centers of the 22-Center national airspace. The planning time horizon is four hours long, and there is
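
    A toy sketch of the Eulerian flow idea described above, assuming unit transit time per cell and pure shift dynamics x[k+1] = A x[k] + B u[k] with ground holding as the only control (the dissertation's model also includes rerouting and poses planning as an integer program; all numbers here are invented).

      # Toy Eulerian route model: aircraft counts in successive route cells evolve as a
      # discrete-time linear system; ground holding withholds departures that no
      # downstream cell could absorb.
      import numpy as np

      N_CELLS = 4
      CAPACITY = np.array([8.0, 8.0, 5.0, 8.0])   # aircraft per cell; cell 2 is weather-limited

      # One-step shift dynamics: each cell's content advances to the next cell and
      # departures u[k] enter the first cell: x[k+1] = A x[k] + B u[k].
      A = np.zeros((N_CELLS, N_CELLS))
      for i in range(1, N_CELLS):
          A[i, i - 1] = 1.0
      B = np.zeros(N_CELLS)
      B[0] = 1.0

      def simulate_with_ground_holding(departure_demand):
          """Admit at most min(CAPACITY) departures per step; hold the rest on the ground."""
          x = np.zeros(N_CELLS)
          held = 0.0
          total_delay = 0.0
          for demand in departure_demand:
              want = demand + held
              u = min(want, CAPACITY.min())     # with pure shift dynamics, every cohort
              held = want - u                   # must fit the tightest downstream cell
              total_delay += held               # one delay unit per held aircraft per step
              x = A @ x + B * u
              assert (x <= CAPACITY + 1e-9).all()
          return total_delay

      print("total ground delay: %.0f aircraft-steps"
            % simulate_with_ground_holding([6, 7, 7, 6, 4, 3, 0, 0, 0, 0]))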

  16. The Talent Development Middle School. An Elective Replacement Approach to Providing Extra Help in Math--The CATAMA Program (Computer- and Team-Assisted Mathematics Acceleration). Report No. 21.

    ERIC Educational Resources Information Center

    Mac Iver, Douglas J.; Balfanz, Robert; Plank, Stephen B.

    In Talent Development Middle Schools, students needing extra help in mathematics participate in the Computer- and Team-Assisted Mathematics Acceleration (CATAMA) course. CATAMA is an innovative combination of computer-assisted instruction and structured cooperative learning that students receive in addition to their regular math course for about…

  17. A Conceptual Model of a Research Design about Congruence between Environmental Turbulence, Strategic Aggressiveness, and General Management Capability in Community Colleges

    ERIC Educational Resources Information Center

    Lewis, Alfred

    2013-01-01

    Numerous studies have examined the determinant strategic elements that affect the performance of organizations. These studies have increasing relevance to community colleges because of the accelerating pace of change in enrollment, resource availability, leadership turnover, and demand for service that these institutions are experiencing. The…

  18. The Concept of Strategic Decisionmaking.

    ERIC Educational Resources Information Center

    Collier, Douglas J.

    Strategic decision-making literature is reviewed, and applications to colleges and universities are made. The key requirement for strategic decision-making is that decisions affect the entire organization. While strategic decision-making can occur at different levels within the organization, the specific strategic decisions available to the…

  19. Strategic Alliance Poker: Demonstrating the Importance of Complementary Resources and Trust in Strategic Alliance Management

    ERIC Educational Resources Information Center

    Reutzel, Christopher R.; Worthington, William J.; Collins, Jamie D.

    2012-01-01

    Strategic Alliance Poker (SAP) provides instructors with an opportunity to integrate the resource based view with their discussion of strategic alliances in undergraduate Strategic Management courses. Specifically, SAP provides Strategic Management instructors with an experiential exercise that can be used to illustrate the value creation…

  20. Multi-Mode Cavity Accelerator Structure

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jiang, Yong; Hirshfield, Jay Leonard

    2016-11-10

    This project aimed to develop a prototype for a novel accelerator structure comprising coupled cavities that are tuned to support modes with harmonically-related eigenfrequencies, with the goal of reaching an acceleration gradient >200 MeV/m and a breakdown rate <10^-7/pulse/meter. Phase I involved computations, design, and preliminary engineering of a prototype multi-harmonic cavity accelerator structure; plus tests of a bimodal cavity. A computational procedure was used to design an optimized profile for a bimodal cavity with high shunt impedance and low surface fields to maximize the reduction in temperature rise ΔT. This cavity supports the TM010 mode and its 2nd harmonic TM011 mode. Its fundamental frequency is at 12 GHz, to benchmark against the empirical criteria proposed within the worldwide High Gradient collaboration for X-band copper structures; namely, a maximum surface electric field E_sur,max < 260 MV/m and maximum pulsed surface heating ΔT_max < 56 K. With optimized geometry, amplitude and relative phase of the two modes, reductions are found in surface pulsed heating, modified Poynting vector, and total RF power, as compared with operation at the same acceleration gradient using only the fundamental mode.

  1. 2011 Computation Directorate Annual Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Crawford, D L

    2012-04-11

    From its founding in 1952 until today, Lawrence Livermore National Laboratory (LLNL) has made significant strategic investments to develop high performance computing (HPC) and its application to national security and basic science. Now, 60 years later, the Computation Directorate and its myriad resources and capabilities have become a key enabler for LLNL programs and an integral part of the effort to support our nation's nuclear deterrent and, more broadly, national security. In addition, the technological innovation HPC makes possible is seen as vital to the nation's economic vitality. LLNL, along with other national laboratories, is working to make supercomputing capabilities and expertise available to industry to boost the nation's global competitiveness. LLNL is on the brink of an exciting milestone with the 2012 deployment of Sequoia, the National Nuclear Security Administration's (NNSA's) 20-petaFLOP/s resource that will apply uncertainty quantification to weapons science. Sequoia will bring LLNL's total computing power to more than 23 petaFLOP/s, all brought to bear on basic science and national security needs. The computing systems at LLNL provide game-changing capabilities. Sequoia and other next-generation platforms will enable predictive simulation in the coming decade and leverage industry trends, such as massively parallel and multicore processors, to run petascale applications. Efficient petascale computing necessitates refining accuracy in materials property data, improving models for known physical processes, identifying and then modeling for missing physics, quantifying uncertainty, and enhancing the performance of complex models and algorithms in macroscale simulation codes. Nearly 15 years ago, NNSA's Accelerated Strategic Computing Initiative (ASCI), now called the Advanced Simulation and Computing (ASC) Program, was the critical element needed to shift from test-based confidence to science-based confidence. Specifically, ASCI

  2. Research and Development of Wires and Cables for High-Field Accelerator Magnets

    DOE PAGES

    Barzi, Emanuela; Zlobin, Alexander V.

    2016-02-18

    The latest strategic plans for High Energy Physics endorse steadfast superconducting magnet technology R&D for future Energy Frontier Facilities. This includes 10 to 16 T Nb3Sn accelerator magnets for the luminosity upgrades of the Large Hadron Collider and eventually for a future 100 TeV scale proton-proton (pp) collider. This paper describes the multi-decade R&D investment in the Nb3Sn superconductor technology, which was crucial to produce the first reproducible 10 to 12 T accelerator-quality dipoles and quadrupoles, as well as their scale-up. We also indicate prospective research areas in superconducting Nb3Sn wires and cables to achieve the next goals for superconducting accelerator magnets. Emphasis is on increasing performance and decreasing costs while pushing the Nb3Sn technology to its limits for future pp colliders.

  3. Strategic agility for nursing leadership.

    PubMed

    Shirey, Maria R

    2015-06-01

    This department highlights change management strategies that may be successful in strategically planning and executing organizational change. In this article, the author discusses strategic agility as an important leadership competency and offers approaches for incorporating strategic agility in healthcare systems. A strategic agility checklist and infrastructure-building approach are presented.

  4. Strategizing Computer-Supported Collaborative Learning toward Knowledge Building

    ERIC Educational Resources Information Center

    Mukama, Evode

    2010-01-01

    The purpose of this paper is to explore how university students can develop knowledge in small task-based groups while acquiring hands-on computer skills. Inspired by the sociocultural perspective, this study presents a theoretical framework on co-construction of knowledge and on computer-supported collaborative learning. The participants were…

  5. Final Report on Institutional Computing Project s15_hilaserion, “Kinetic Modeling of Next-Generation High-Energy, High-Intensity Laser-Ion Accelerators as an Enabling Capability”

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Albright, Brian James; Yin, Lin; Stark, David James

    This proposal sought of order 1M core-hours of Institutional Computing time intended to enable computing by a new LANL Postdoc (David Stark) working under LDRD ER project 20160472ER (PI: Lin Yin) on laser-ion acceleration. The project was “off-cycle,” initiating in June of 2016 with a postdoc hire.

  6. Accelerated Reader. What Works Clearinghouse Intervention Report

    ERIC Educational Resources Information Center

    What Works Clearinghouse, 2009

    2009-01-01

    "Accelerated Reader" is a computer-based reading management system designed to complement an existing classroom literacy program for grades pre-K-12. It is designed to increase the amount of time students spend reading independently. Students choose reading-level appropriate books or short stories for which Accelerated Reader tests are…

  7. A Study on Strategic Planning and Procurement of Medicals in Uganda's Regional Referral Hospitals.

    PubMed

    Masembe, Ishak Kamaradi

    2016-12-31

    This study was an analysis of the effect of strategic planning on procurement of medicals in Uganda's regional referral hospitals (RRH's). Medicals were defined as essential medicines, medical devices and medical equipment. The Ministry of Health (MOH) has been carrying out strategic planning for the last 15 years via the Health Sector Strategic Plans. Their assumption was that strategic planning would translate to strategic procurement and consequently, availability of medicals in the RRH's. However, despite the existence of these plans, there have been many complaints about expired drugs and shortages in RRH's. For this purpose, a third variable was important because it served the role of mediation. A questionnaire was used to obtain information on perceptions of 206 respondents who were selected using simple random sampling. 8 key informant interviews were held, 2 in each RRH. 4 Focus Group Discussions were held, 1 for each RRH, and between 5 and 8 staff took part as discussants for approximately three hours. The findings suggested that strategic planning was affected by funding to approximately 34% while the relationship between funding and procurement was 35%. The direct relationship between strategic planning and procurement was 18%. However when the total causal effect was computed it turned out that strategic planning and the related variable of funding contributed 77% to procurement of medicals under the current hierarchical model where MOH is charged with development of strategic plans for the entire health sector. Since even with this contribution there were complaints, the study proposed a new model called CALF which according to a simulation, if adopted by MOH, strategic planning would contribute 87% to effectiveness in procurement of medicals.

  8. Software Accelerates Computing Time for Complex Math

    NASA Technical Reports Server (NTRS)

    2014-01-01

    Ames Research Center awarded Newark, Delaware-based EM Photonics Inc. SBIR funding to utilize graphic processing unit (GPU) technology- traditionally used for computer video games-to develop high-computing software called CULA. The software gives users the ability to run complex algorithms on personal computers with greater speed. As a result of the NASA collaboration, the number of employees at the company has increased 10 percent.

  9. U.S. Climate Change Science Program. Vision for the Program and Highlights of the Scientific Strategic Plan

    NASA Technical Reports Server (NTRS)

    2003-01-01

    The vision document provides an overview of the Climate Change Science Program (CCSP) long-term strategic plan to enhance scientific understanding of global climate change. This document is a companion to the comprehensive Strategic Plan for the Climate Change Science Program. The report responds to the President's direction that climate change research activities be accelerated to provide the best possible scientific information to support public discussion and decisionmaking on climate-related issues. The plan also responds to Section 104 of the Global Change Research Act of 1990, which mandates the development and periodic updating of a long-term national global change research plan coordinated through the National Science and Technology Council. This is the first comprehensive update of a strategic plan for U.S. global change and climate change research since the original plan for the U.S. Global Change Research Program was adopted at the inception of the program in 1989.

  10. Strategic planning for neuroradiologists.

    PubMed

    Berlin, Jonathan W; Lexa, Frank J

    2012-08-01

    Strategic planning is becoming essential to neuroradiology as the health care environment continues to emphasize cost efficiency, teamwork and collaboration. A strategic plan begins with a mission statement and vision of where the neuroradiology division would like to be in the near future. Formalized strategic planning frameworks, such as the strengths, weaknesses, opportunities and threats (SWOT), and the Balanced Scorecard frameworks, can help neuroradiology divisions determine their current position in the marketplace. Communication, delegation, and accountability in neuroradiology is essential in executing an effective strategic plan. Copyright © 2012 Elsevier Inc. All rights reserved.

  11. Computational thinking and thinking about computing

    PubMed Central

    Wing, Jeannette M.

    2008-01-01

    Computational thinking will influence everyone in every field of endeavour. This vision poses a new educational challenge for our society, especially for our children. In thinking about computing, we need to be attuned to the three drivers of our field: science, technology and society. Accelerating technological advances and monumental societal demands force us to revisit the most basic scientific questions of computing. PMID:18672462

  12. 76 FR 13984 - Cloud Computing Forum & Workshop III

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-03-15

    ... DEPARTMENT OF COMMERCE National Institute of Standards and Technology Cloud Computing Forum... public workshop. SUMMARY: NIST announces the Cloud Computing Forum & Workshop III to be held on April 7... provide information on the NIST strategic and tactical Cloud Computing program, including progress on the...

  13. Strategic Plan. Volume 1

    NASA Technical Reports Server (NTRS)

    2002-01-01

    The purpose of this document is to present the strategic plan and associated organizational structure that the National Space Biomedical Research Institute (NSBRI) will utilize to achieve the defined mission and objectives provided by NASA. Much of the information regarding the background and establishment of the NSBRI by NASA has been provided in other documentation and will not be repeated in this Strategic Plan. This Strategic Plan is presented in two volumes. Volume I (this volume) begins with an Introduction (Section 2) that provides the Institute's NASA-defined mission and objectives, and the organizational structure adopted to implement these through three Strategic Programs: Countermeasure Research; Education, Training and Outreach; and Cooperative Research and Development. These programs are described in Sections 3 to 5. Each program is presented in a similar way, using four subsections: Goals and Objectives; Current Strategies; Gaps and Modifications; and Resource Requirements. Section 6 provides the administrative infrastructure and total budget required to implement the Strategic Programs and assures that they form a single cohesive plan. This plan will ensure continued success of the Institute for the next five years. Volume II of the Strategic Plan provides an in-depth analysis of the current and future strategic programs of the 12 current NSBRI teams, including their goals, objectives, mutual interactions and schedules.

  14. Accelerating Computation of DCM for ERP in MATLAB by External Function Calls to the GPU.

    PubMed

    Wang, Wei-Jen; Hsieh, I-Fan; Chen, Chun-Chuan

    2013-01-01

    This study aims to improve the performance of Dynamic Causal Modelling for Event Related Potentials (DCM for ERP) in MATLAB by using external function calls to a graphics processing unit (GPU). DCM for ERP is an advanced method for studying neuronal effective connectivity. DCM utilizes an iterative procedure, the expectation maximization (EM) algorithm, to find the optimal parameters given a set of observations and the underlying probability model. As the EM algorithm is computationally demanding and the analysis faces possible combinatorial explosion of models to be tested, we propose a parallel computing scheme using the GPU to achieve a fast estimation of DCM for ERP. The computation of DCM for ERP is dynamically partitioned and distributed to threads for parallel processing, according to the DCM model complexity and the hardware constraints. The performance efficiency of this hardware-dependent thread arrangement strategy was evaluated using the synthetic data. The experimental data were used to validate the accuracy of the proposed computing scheme and quantify the time saving in practice. The simulation results show that the proposed scheme can accelerate the computation by a factor of 155 for the parallel part. For experimental data, the speedup factor is about 7 per model on average, depending on the model complexity and the data. This GPU-based implementation of DCM for ERP gives qualitatively the same results as the original MATLAB implementation does at the group level analysis. In conclusion, we believe that the proposed GPU-based implementation is very useful for users as a fast screen tool to select the most likely model and may provide implementation guidance for possible future clinical applications such as online diagnosis.

  15. Accelerating Computation of DCM for ERP in MATLAB by External Function Calls to the GPU

    PubMed Central

    Wang, Wei-Jen; Hsieh, I-Fan; Chen, Chun-Chuan

    2013-01-01

    This study aims to improve the performance of Dynamic Causal Modelling for Event Related Potentials (DCM for ERP) in MATLAB by using external function calls to a graphics processing unit (GPU). DCM for ERP is an advanced method for studying neuronal effective connectivity. DCM utilizes an iterative procedure, the expectation maximization (EM) algorithm, to find the optimal parameters given a set of observations and the underlying probability model. As the EM algorithm is computationally demanding and the analysis faces possible combinatorial explosion of models to be tested, we propose a parallel computing scheme using the GPU to achieve a fast estimation of DCM for ERP. The computation of DCM for ERP is dynamically partitioned and distributed to threads for parallel processing, according to the DCM model complexity and the hardware constraints. The performance efficiency of this hardware-dependent thread arrangement strategy was evaluated using the synthetic data. The experimental data were used to validate the accuracy of the proposed computing scheme and quantify the time saving in practice. The simulation results show that the proposed scheme can accelerate the computation by a factor of 155 for the parallel part. For experimental data, the speedup factor is about 7 per model on average, depending on the model complexity and the data. This GPU-based implementation of DCM for ERP gives qualitatively the same results as the original MATLAB implementation does at the group level analysis. In conclusion, we believe that the proposed GPU-based implementation is very useful for users as a fast screen tool to select the most likely model and may provide implementation guidance for possible future clinical applications such as online diagnosis. PMID:23840507

  16. Optimizations of Human Restraint Systems for Short-Period Acceleration

    NASA Technical Reports Server (NTRS)

    Payne, P. R.

    1963-01-01

    A restraint system's main function is to restrain its occupant when his vehicle is subjected to acceleration. If the restraint system is rigid and well-fitting (to eliminate slack) then it will transmit the vehicle acceleration to its occupant without modifying it in any way. Few present-day restraint systems are stiff enough to give this one-to-one transmission characteristic, and depending upon their dynamic characteristics and the nature of the vehicle's acceleration-time history, they will either magnify or attenuate the acceleration. Obviously an optimum restraint system will give maximum attenuation of an input acceleration. In the general case of an arbitrary acceleration input, a computer must be used to determine the optimum dynamic characteristics for the restraint system. Analytical solutions can be obtained for certain simple cases, however, and these cases are considered in this paper, after the concept of dynamic models of the human body is introduced. The paper concludes with a description of an analog computer specially developed for the Air Force to handle completely general mechanical restraint optimization programs of this type, where the acceleration input may be any arbitrary function of time.

  17. An acceleration framework for synthetic aperture radar algorithms

    NASA Astrophysics Data System (ADS)

    Kim, Youngsoo; Gloster, Clay S.; Alexander, Winser E.

    2017-04-01

    Algorithms for radar signal processing, such as Synthetic Aperture Radar (SAR) are computationally intensive and require considerable execution time on a general purpose processor. Reconfigurable logic can be used to off-load the primary computational kernel onto a custom computing machine in order to reduce execution time by an order of magnitude as compared to kernel execution on a general purpose processor. Specifically, Field Programmable Gate Arrays (FPGAs) can be used to accelerate these kernels using hardware-based custom logic implementations. In this paper, we demonstrate a framework for algorithm acceleration. We used SAR as a case study to illustrate the potential for algorithm acceleration offered by FPGAs. Initially, we profiled the SAR algorithm and implemented a homomorphic filter using a hardware implementation of the natural logarithm. Experimental results show a linear speedup by adding reasonably small processing elements in Field Programmable Gate Array (FPGA) as opposed to using a software implementation running on a typical general purpose processor.
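
    The homomorphic filter mentioned above works by moving the image into the logarithmic domain, separating the slowly varying component from the detail, and exponentiating back; the natural logarithm is the piece the authors implemented in FPGA logic. The sketch below is a generic software version of that idea under assumed parameters (Gaussian smoothing width, gain factors), not the paper's hardware design.

    ```python
    # Illustrative NumPy sketch of homomorphic filtering, the kernel the paper
    # offloads to FPGA hardware via a custom natural-logarithm unit. This is a
    # generic software version, not the authors' implementation; the Gaussian
    # low-pass kernel and its width are assumptions for demonstration only.
    import numpy as np
    from scipy.ndimage import gaussian_filter

    def homomorphic_filter(image, sigma=10.0, low_gain=0.5, high_gain=1.5):
        """Separate multiplicative slow/detail components in log space."""
        log_img = np.log1p(image.astype(np.float64))      # ln(1 + I) avoids log(0)
        low_pass = gaussian_filter(log_img, sigma=sigma)  # slowly varying component
        high_pass = log_img - low_pass                    # detail component
        filtered = low_gain * low_pass + high_gain * high_pass
        return np.expm1(filtered)                         # back to intensity domain

    # Example: suppress a slowly varying gain across a synthetic 512x512 image.
    img = np.random.rand(512, 512) * np.linspace(0.5, 2.0, 512)
    result = homomorphic_filter(img)
    ```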

  18. Development of hardware accelerator for molecular dynamics simulations: a computation board that calculates nonbonded interactions in cooperation with fast multipole method.

    PubMed

    Amisaki, Takashi; Toyoda, Shinjiro; Miyagawa, Hiroh; Kitamura, Kunihiro

    2003-04-15

    Evaluation of long-range Coulombic interactions still represents a bottleneck in the molecular dynamics (MD) simulations of biological macromolecules. Despite the advent of sophisticated fast algorithms, such as the fast multipole method (FMM), accurate simulations still demand a great amount of computation time due to the accuracy/speed trade-off inherently involved in these algorithms. Unless higher order multipole expansions, which are extremely expensive to evaluate, are employed, a large amount of the execution time is still spent in directly calculating particle-particle interactions within the nearby region of each particle. To reduce this execution time for pair interactions, we developed a computation unit (board), called MD-Engine II, that calculates nonbonded pairwise interactions using a specially designed hardware. Four custom arithmetic-processors and a processor for memory manipulation ("particle processor") are mounted on the computation board. The arithmetic processors are responsible for calculation of the pair interactions. The particle processor plays a central role in realizing efficient cooperation with the FMM. The results of a series of 50-ps MD simulations of a protein-water system (50,764 atoms) indicated that a more stringent setting of accuracy in FMM computation, compared with those previously reported, was required for accurate simulations over long time periods. Such a level of accuracy was efficiently achieved using the cooperative calculations of the FMM and MD-Engine II. On an Alpha 21264 PC, the FMM computation at a moderate but tolerable level of accuracy was accelerated by a factor of 16.0 using three boards. At a high level of accuracy, the cooperative calculation achieved a 22.7-fold acceleration over the corresponding conventional FMM calculation. In the cooperative calculations of the FMM and MD-Engine II, it was possible to achieve more accurate computation at a comparable execution time by incorporating larger nearby

  19. An Innovative Method for Evaluating Strategic Goals in a Public Agency: Conservation Leadership in the U.S. Forest Service

    Treesearch

    David N. Bengston; David P. Fan

    1999-01-01

    This article presents an innovative methodology for evaluating strategic planning goals in a public agency. Computer-coded content analysis was used to evaluate attitudes expressed in about 28,000 on-line news media stories about the U.S. Department of Agriculture Forest Service and its strategic goal of conservation leadership. Three dimensions of conservation...

  20. 78 FR 17924 - U.S. Strategic Command Strategic Advisory Group; Notice of Federal Advisory Committee Closed Meeting

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-03-25

    ... DEPARTMENT OF DEFENSE Office of the Secretary U.S. Strategic Command Strategic Advisory Group... following federal advisory committee: U.S. Strategic Command Strategic Advisory Group. DATES: April 18, 2013..., intelligence, and policy-related issues to the Commander, U.S. Strategic Command, during the development of the...

  1. Developing the Strategic Thinking of Instructional Leaders. Occasional Paper No. 13.

    ERIC Educational Resources Information Center

    Hallinger, Philip; McCary, C. E.

    Emerging research on instructional leadership is examined in this paper, with a focus on the new perspective on strategic thinking. The main theme is that research must address the reasoning that underlies the exercise of leadership rather than describe discrete behaviors of effective leaders. A computer simulation designed to facilitate the…

  2. Utilizing GPUs to Accelerate Turbomachinery CFD Codes

    NASA Technical Reports Server (NTRS)

    MacCalla, Weylin; Kulkarni, Sameer

    2016-01-01

    GPU computing has established itself as a way to accelerate parallel codes in the high performance computing world. This work focuses on speeding up APNASA, a legacy CFD code used at NASA Glenn Research Center, while also drawing conclusions about the nature of GPU computing and the requirements to make GPGPU worthwhile on legacy codes. Rewriting and restructuring of the source code was avoided to limit the introduction of new bugs. The code was profiled and investigated for parallelization potential, then OpenACC directives were used to indicate parallel parts of the code. The use of OpenACC directives was not able to reduce the runtime of APNASA on either the NVIDIA Tesla discrete graphics card, or the AMD accelerated processing unit. Additionally, it was found that in order to justify the use of GPGPU, the amount of parallel work being done within a kernel would have to greatly exceed the work being done by any one portion of the APNASA code. It was determined that in order for an application like APNASA to be accelerated on the GPU, it should not be modular in nature, and the parallel portions of the code must contain a large portion of the code's computation time.

  3. Enhancing the Future Strategic Corporal

    DTIC Science & Technology

    2006-01-01

    with greater firepower than ever before, the Strategic Corporal will be charged with greater responsibility than ever before, while the potential for... translation of "Strategic Corporal", and the article responsible for popularizing the term, see General Charles C. Krulak, "The Strategic Corporal"...

  4. Accelerator Facilities for Radiation Research

    NASA Technical Reports Server (NTRS)

    Cucinotta, Francis A.

    1999-01-01

    HSRP goals in accelerator use and development are: 1. The need for a ground-based heavy ion and proton facility to understand space radiation effects, discussed most recently in the NAS/NRC Report (1996). 2. Strategic program goals in facility usage and development: (1) operation of the AGS for approximately 600 beam hours/year; (2) operation of the Loma Linda University (LLU) proton facility for approximately 400 beam hours/year; (3) construction of the BAF facility; and (4) collaborative research at HIMAC in Japan and with other existing or potential international facilities. 3. An MOA with LLU has been established to provide proton beams with energies of 40-250 MeV, important for trapped protons and solar proton events. 4. A limited number of beam hours is available at Brookhaven National Laboratory's (BNL) Alternating Gradient Synchrotron (AGS).

  5. GPU-Accelerated Molecular Modeling Coming Of Age

    PubMed Central

    Stone, John E.; Hardy, David J.; Ufimtsev, Ivan S.

    2010-01-01

    Graphics processing units (GPUs) have traditionally been used in molecular modeling solely for visualization of molecular structures and animation of trajectories resulting from molecular dynamics simulations. Modern GPUs have evolved into fully programmable, massively parallel co-processors that can now be exploited to accelerate many scientific computations, typically providing about one order of magnitude speedup over CPU code and in special cases providing speedups of two orders of magnitude. This paper surveys the development of molecular modeling algorithms that leverage GPU computing, the advances already made and remaining issues to be resolved, and the continuing evolution of GPU technology that promises to become even more useful to molecular modeling. Hardware acceleration with commodity GPUs is expected to benefit the overall computational biology community by bringing teraflops performance to desktop workstations and in some cases potentially changing what were formerly batch-mode computational jobs into interactive tasks. PMID:20675161

  6. GPU Accelerated Vector Median Filter

    NASA Technical Reports Server (NTRS)

    Aras, Rifat; Shen, Yuzhong

    2011-01-01

    Noise reduction is an important step for most image processing tasks. For three-channel color images, a widely used technique is the vector median filter, in which the color values of pixels are treated as 3-component vectors. Vector median filters are computationally expensive; for a window size of n x n, each of the n^2 vectors has to be compared with the other n^2 - 1 vectors in terms of distances. General-purpose computation on graphics processing units (GPUs) is the paradigm of utilizing high-performance many-core GPU architectures for computation tasks that are normally handled by CPUs. In this work, NVIDIA's Compute Unified Device Architecture (CUDA) paradigm is used to accelerate vector median filtering, which has, to the best of our knowledge, never been done before. The performance of the GPU-accelerated vector median filter is compared to that of the CPU and MPI-based versions for different image and window sizes. Initial findings of the study showed a 100x improvement in performance of the vector median filter implementation on GPUs over CPU implementations, and further speed-up is expected after more extensive optimizations of the GPU algorithm.
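
    As a reference for the computation being offloaded, the following is a minimal CPU sketch of the vector median filter described above: within each n x n window, the output pixel is the color vector whose summed distance to the other n^2 - 1 vectors is smallest. It is a plain NumPy illustration, not the CUDA kernel from the study.

    ```python
    # Minimal CPU reference for the vector median filter: within each n x n
    # window the output is the color vector minimizing the sum of distances to
    # all other vectors in the window. Not the GPU implementation.
    import numpy as np

    def vector_median(window):
        """window: (n*n, 3) array of RGB vectors; returns the vector median."""
        diffs = window[:, None, :] - window[None, :, :]     # pairwise differences
        dists = np.linalg.norm(diffs, axis=2).sum(axis=1)   # summed distances
        return window[np.argmin(dists)]

    def filter_image(img, n=3):
        """Apply the filter with an n x n window (image edges left unfiltered)."""
        out = img.copy()
        r = n // 2
        for i in range(r, img.shape[0] - r):
            for j in range(r, img.shape[1] - r):
                win = img[i - r:i + r + 1, j - r:j + r + 1].reshape(-1, 3)
                out[i, j] = vector_median(win)
        return out
    ```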

  7. Strategic Management in the Community College.

    ERIC Educational Resources Information Center

    Myran, Gunder A.

    1983-01-01

    Defines strategic management and discusses its role in community colleges, focusing on the components and methodology of strategic management, strategic and operational management function, management teams, and the need for strategic management. (DMM)

  8. Strategic Investments Overview

    NASA Technical Reports Server (NTRS)

    Comstock, Doug

    2004-01-01

    This viewgraph presentation provides an overview of the organizational hierarchy for strategic management and strategic investments at NASA. The presentation also relates these topics to the budgets it submits to Congress, strategies for space exploration research and development, and systems analysis.

  9. Strategic consequences of emotional misrepresentation in negotiation: The blowback effect.

    PubMed

    Campagna, Rachel L; Mislin, Alexandra A; Kong, Dejun Tony; Bottom, William P

    2016-05-01

    Recent research indicates that expressing anger elicits concession making from negotiating counterparts. When emotions are conveyed either by a computer program or by a confederate, results appear to affirm a long-standing notion that feigning anger is an effective bargaining tactic. We hypothesize this tactic actually jeopardizes postnegotiation deal implementation and subsequent exchange. Four studies directly test both tactical and strategic consequences of emotional misrepresentation. False representations of anger generated little tactical benefit but produced considerable and persistent strategic disadvantage. This disadvantage is because of an effect we call "blowback." A negotiator's misrepresented anger creates an action-reaction cycle that results in genuine anger and diminishes trust in both the negotiator and counterpart. Our findings highlight the importance of considering the strategic implications of emotional misrepresentation for negotiators interested in claiming value. We discuss the benefits of researching reciprocal interdependence between 2 or more negotiating parties and of modeling value creation beyond deal construction to include implementation of terms. (PsycINFO Database Record (c) 2016 APA, all rights reserved).

  10. Analysis of Movement Acceleration of Down's Syndrome Teenagers Playing Computer Games.

    PubMed

    Carrogi-Vianna, Daniela; Lopes, Paulo Batista; Cymrot, Raquel; Hengles Almeida, Jefferson Jesus; Yazaki, Marcos Lomonaco; Blascovi-Assis, Silvana Maria

    2017-12-01

    This study aimed to evaluate movement acceleration characteristics in adolescents with Down syndrome (DS) and typical development (TD), while playing bowling and golf videogames on the Nintendo ® Wii™. The sample comprised 21 adolescents diagnosed with DS and 33 with TD of both sexes, between 10 and 14 years of age. The arm swing accelerations of the dominant upper limb were collected as measures during the bowling and the golf games. The first valid measurement, verified by the software readings, recorded at the start of each of the games, was used in the analysis. In the bowling game, the groups presented significant statistical differences, with the maximum (M) peaks of acceleration for the Male Control Group (MCG) (M = 70.37) and Female Control Group (FCG) (M = 70.51) when compared with Male Down Syndrome Group (MDSG) (M = 45.33) and Female Down Syndrome Group (FDSG) (M = 37.24). In the golf game the groups also presented significant statistical differences, the only difference being that the maximum peaks of acceleration for both male groups were superior compared with the female groups, MCG (M = 74.80) and FCG (M = 56.80), as well as in MDSG (M = 45.12) and in FDSG (M = 30.52). It was possible to use accelerometry to evaluate the movement acceleration characteristics of teenagers diagnosed with DS during virtual bowling and golf games played on the Nintendo Wii console.

  11. Accelerating an Ordered-Subset Low-Dose X-Ray Cone Beam Computed Tomography Image Reconstruction with a Power Factor and Total Variation Minimization.

    PubMed

    Huang, Hsuan-Ming; Hsiao, Ing-Tsung

    2016-01-01

    In recent years, there has been increased interest in low-dose X-ray cone beam computed tomography (CBCT) in many fields, including dentistry, guided radiotherapy and small animal imaging. Despite reducing the radiation dose, low-dose CBCT has not gained widespread acceptance in routine clinical practice. In addition to performing more evaluation studies, developing a fast and high-quality reconstruction algorithm is required. In this work, we propose an iterative reconstruction method that accelerates ordered-subsets (OS) reconstruction using a power factor. Furthermore, we combine it with the total-variation (TV) minimization method. Both simulation and phantom studies were conducted to evaluate the performance of the proposed method. Results show that the proposed method can accelerate conventional OS methods and greatly increase the convergence speed in early iterations. Moreover, applying the TV minimization to the power acceleration scheme can further improve the image quality while preserving the fast convergence rate.

  12. Accelerating an Ordered-Subset Low-Dose X-Ray Cone Beam Computed Tomography Image Reconstruction with a Power Factor and Total Variation Minimization

    PubMed Central

    Huang, Hsuan-Ming; Hsiao, Ing-Tsung

    2016-01-01

    In recent years, there has been increased interest in low-dose X-ray cone beam computed tomography (CBCT) in many fields, including dentistry, guided radiotherapy and small animal imaging. Despite reducing the radiation dose, low-dose CBCT has not gained widespread acceptance in routine clinical practice. In addition to performing more evaluation studies, developing a fast and high-quality reconstruction algorithm is required. In this work, we propose an iterative reconstruction method that accelerates ordered-subsets (OS) reconstruction using a power factor. Furthermore, we combine it with the total-variation (TV) minimization method. Both simulation and phantom studies were conducted to evaluate the performance of the proposed method. Results show that the proposed method can accelerate conventional OS methods and greatly increase the convergence speed in early iterations. Moreover, applying the TV minimization to the power acceleration scheme can further improve the image quality while preserving the fast convergence rate. PMID:27073853
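
    The abstract does not give the exact update rule, so the following sketch only illustrates the overall structure: ordered-subsets passes whose multiplicative correction is raised to a hypothetical power-factor exponent, interleaved with a total-variation smoothing step. The EM-style update form, the power exponent, and the subset operators A_s and b_s are assumptions for demonstration, not the authors' algorithm.

    ```python
    # Schematic sketch of ordered-subsets (OS) updates "accelerated" by raising
    # the multiplicative correction to a power, followed by a total-variation (TV)
    # step. The update form and the `power` exponent are illustrative assumptions.
    import numpy as np

    def os_power_tv(subsets, x, n_iter=10, power=1.5, tv_step=0.02, eps=1e-8):
        for _ in range(n_iter):
            for A_s, b_s in subsets:                        # sweep over ordered subsets
                ratio = A_s.T @ (b_s / (A_s @ x + eps))     # back-projected data ratio
                sens = A_s.T @ np.ones_like(b_s) + eps      # sensitivity normalization
                x = x * (ratio / sens) ** power             # power factor boosts the step
            x = np.maximum(x - tv_step * tv_gradient(x), 0) # TV minimization + positivity
        return x

    def tv_gradient(x):
        """Gradient of a smoothed 1-D total-variation penalty (illustration only)."""
        d = np.diff(x)
        s = d / np.sqrt(d * d + 1e-8)
        g = np.zeros_like(x)
        g[:-1] -= s
        g[1:] += s
        return g
    ```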

  13. Investigations into dual-grating THz-driven accelerators

    NASA Astrophysics Data System (ADS)

    Wei, Y.; Ischebeck, R.; Dehler, M.; Ferrari, E.; Hiller, N.; Jamison, S.; Xia, G.; Hanahoe, K.; Li, Y.; Smith, J. D. A.; Welsch, C. P.

    2018-01-01

    Advanced acceleration technologies are receiving considerable interest in order to miniaturize future particle accelerators. One such technology is the dual-grating dielectric structures, which can support accelerating fields one to two orders of magnitude higher than the metal RF cavities in conventional accelerators. This opens up the possibility of enabling high accelerating gradients of up to several GV/m. This paper investigates numerically a quartz dual-grating structure which is driven by THz pulses to accelerate electrons. Geometry optimizations are carried out to achieve the trade-offs between accelerating gradient and vacuum channel gap. A realistic electron bunch available from the future Compact Linear Accelerator for Research and Applications (CLARA) is loaded into an optimized 100-period dual-grating structure for a detailed wakefield study. A THz pulse is then employed to interact with this CLARA bunch in the optimized structure. The computed beam quality is analyzed in terms of emittance, energy spread and loaded accelerating gradient. The simulations show that an accelerating gradient of 348 ± 12 MV/m with an emittance growth of 3.0% can be obtained.

  14. Acceleration of low order finite element computation with GPUs (Invited)

    NASA Astrophysics Data System (ADS)

    Knepley, M. G.

    2010-12-01

    Considerable effort has been focused on the acceleration using GPUs of high order spectral element methods and discontinuous Galerkin finite element methods. However, these methods are not universally applicable, and much of the existing FEM software base employs low order methods. In this talk, we present a formulation of FEM, using the PETSc framework from ANL, which is amenable to GPU acceleration even at very low order. In addition, using the FEniCS system for FEM, we show that the relevant kernels can be automatically generated and optimized using a symbolic manipulation system.
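
    A reason low-order FEM maps well to GPUs is that the per-element work is small, regular, and independent. The sketch below shows the local stiffness computation for a linear (P1) triangle; each such 3x3 block could be assigned to one GPU thread. It is a generic illustration, not the PETSc- or FEniCS-generated kernels referenced in the talk.

    ```python
    # Per-element kernel for low-order (P1 triangle) FEM stiffness assembly.
    # Each element depends only on its own vertex coordinates, which is what
    # makes the kernel embarrassingly parallel and a candidate for GPU offload.
    import numpy as np

    def p1_element_stiffness(pts):
        """pts: (3, 2) vertex coordinates of one triangle; returns the 3x3 local stiffness."""
        x, y = pts[:, 0], pts[:, 1]
        area = 0.5 * abs((x[1] - x[0]) * (y[2] - y[0]) - (x[2] - x[0]) * (y[1] - y[0]))
        # Gradients of the three linear basis functions (constant over the element).
        b = np.array([y[1] - y[2], y[2] - y[0], y[0] - y[1]]) / (2.0 * area)
        c = np.array([x[2] - x[1], x[0] - x[2], x[1] - x[0]]) / (2.0 * area)
        return area * (np.outer(b, b) + np.outer(c, c))

    # Example: one element of a mesh (the unit right triangle).
    tri = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0]])
    K_local = p1_element_stiffness(tri)
    ```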

  15. Conservation of strategic metals

    NASA Technical Reports Server (NTRS)

    Stephens, J. R.

    1982-01-01

    A long-range program in support of the aerospace industry aimed at reducing the use of strategic materials in gas turbine engines is discussed. The program, which is called COSAM (Conservation of Strategic Aerospace Materials), has three general objectives. The first objective is to contribute basic scientific understanding to the turbine engine technology bank so that our national security is not jeopardized if our strategic material supply lines are disrupted. The second objective is to help reduce the dependence of United States military and civilian gas turbine engines on worldwide supply and price fluctuations in regard to strategic materials. The third objective is, through research, to contribute to the United States position of preeminence in the world gas turbine engine markets by minimizing the acquisition costs and optimizing the performance of gas turbine engines. Three major research thrusts are planned: strategic element substitution; advanced processing concepts; and alternate material identification. Results from research and any required supporting technology will give industry the materials technology options it needs to make tradeoffs in material properties for critical components against the cost and availability impacts related to their strategic metal content.

  16. Strategic Talk in Film.

    PubMed

    Payr, Sabine; Skowron, Marcin; Dobrosovestnova, Anna; Trapp, Martin; Trappl, Robert

    2017-01-01

    Conversational robots and agents are being designed for educational and/or persuasive tasks, e.g., health or fitness coaching. To pursue such tasks over a long time, they will need a complex model of the strategic goal, a variety of strategies to implement it in interaction, and the capability of strategic talk. Strategic talk is incipient ongoing conversation in which at least one participant has the objective of changing the other participant's attitudes or goals. The paper is based on the observation that strategic talk can stretch over considerable periods of time and a number of conversational segments. Film dialogues are taken as a source to develop a model of the strategic talk of mentor characters. A corpus of film mentor utterances is annotated on the basis of the model, and the data are interpreted to arrive at insights into mentor behavior, especially into the realization and sequencing of strategies.

  17. Strategic Planning to Conduct Joint Force Network Operations: A Content Analysis of NETOPS Organizations Strategic Plans

    DTIC Science & Technology

    2007-03-01

    information dominance, Joint Network Operations (NETOPS) organizations need to be strategically aligned, as a result, to enhance the capabilities-based effects of NETOPS and reduce our NETOPS infrastructure's susceptibility to compromise. Once the key organizations were identified, their strategic plans were analyzed using a structured content analysis framework. The results illustrated that the strategic plans were aligned with the community of interests tasking to conduct NETOPS. Further research is required into the strategic alignment beyond the strategic

  18. 75 FR 22561 - Federal Advisory Committee; United States Strategic Command Strategic Advisory Group; Charter...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-04-29

    ... Command Strategic Advisory Group; Charter Renewal AGENCY: Department of Defense (DoD). ACTION: Renewal of... Command Strategic Advisory Group (hereafter referred to as the Group). FOR FURTHER INFORMATION CONTACT... Chairman of the Joint Chiefs of Staff and the Commander of the U.S. Strategic Command independent advice...

  19. A Study on Strategic Planning and Procurement of Medicals in Uganda’s Regional Referral Hospitals

    PubMed Central

    2016-01-01

    This study analysed the effect of strategic planning on procurement of medicals in Uganda's regional referral hospitals (RRHs). Medicals were defined as essential medicines, medical devices and medical equipment. The Ministry of Health (MOH) has been carrying out strategic planning for the last 15 years via the Health Sector Strategic Plans, on the assumption that strategic planning would translate into strategic procurement and, consequently, availability of medicals in the RRHs. However, despite the existence of these plans, there have been many complaints about expired drugs and shortages in RRHs. A third variable, funding, was therefore included to serve a mediating role. A questionnaire was used to obtain the perceptions of 206 respondents selected using simple random sampling. Eight key informant interviews were held, two in each RRH, and four Focus Group Discussions were held, one per RRH, with between five and eight staff taking part as discussants for approximately three hours. The findings suggested that funding influenced strategic planning by approximately 34%, while the relationship between funding and procurement was 35%. The direct relationship between strategic planning and procurement was 18%. However, when the total causal effect was computed, strategic planning and the related variable of funding together contributed 77% to procurement of medicals under the current hierarchical model, in which the MOH is charged with developing strategic plans for the entire health sector. Since complaints persisted even at this level of contribution, the study proposed a new model, called CALF, under which, according to a simulation, strategic planning would contribute 87% to effectiveness in procurement of medicals if adopted by the MOH. PMID:28299158

  20. 76 FR 52642 - Notice of Advisory Committee Closed Meeting; U.S. Strategic Command Strategic Advisory Group

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-08-23

    ... DEPARTMENT OF DEFENSE Notice of Advisory Committee Closed Meeting; U.S. Strategic Command.... Strategic Command Strategic Advisory Group. DATES: November 1, 2011, from 8 a.m. to 5 p.m. and November 2..., intelligence, and policy-related issues to the Commander, U.S. Strategic Command, during the development of the...

  1. Cultivating strategic thinking skills.

    PubMed

    Shirey, Maria R

    2012-06-01

    This department highlights change management strategies that may be successful in strategically planning and executing organizational change initiatives. With the goal of presenting practical approaches helpful to nurse leaders advancing organizational change, content includes evidence-based projects, tools, and resources that mobilize and sustain organizational change initiatives. In this article, the author presents an overview of strategic leadership and offers approaches for cultivating strategic thinking skills.

  2. Start with the End in Mind: Experiences of Accelerated Course Completion by Pre-Service Teachers and Educators

    ERIC Educational Resources Information Center

    Collins, Anita; Hay, Iain; Heiner, Irmgard

    2013-01-01

    In response to changes in government funding and policies over the past five years, the Australian tertiary sector has entered an increasingly competitive climate. This has forced many universities to become more strategic in attracting increased numbers of PSTs. Providing accelerated learning opportunities for PSTs is viewed as one way to gain…

  3. Numerical Nuclear Second Derivatives on a Computing Grid: Enabling and Accelerating Frequency Calculations on Complex Molecular Systems.

    PubMed

    Yang, Tzuhsiung; Berry, John F

    2018-06-04

    The computation of nuclear second derivatives of energy, or the nuclear Hessian, is an essential routine in quantum chemical investigations of ground and transition states, thermodynamic calculations, and molecular vibrations. Analytic nuclear Hessian computations require the resolution of costly coupled-perturbed self-consistent field (CP-SCF) equations, while numerical differentiation of analytic first derivatives has an unfavorable 6N (N = number of atoms) prefactor. Herein, we present a new method in which grid computing is used to accelerate and/or enable the evaluation of the nuclear Hessian via numerical differentiation: NUMFREQ@Grid. Nuclear Hessians were successfully evaluated by NUMFREQ@Grid at the DFT level as well as using RIJCOSX-ZORA-MP2 or RIJCOSX-ZORA-B2PLYP for a set of linear polyacenes with systematically increasing size. For the larger members of this group, NUMFREQ@Grid was found to outperform the wall clock time of analytic Hessian evaluation; at the MP2 or B2PLYP levels, these Hessians cannot even be evaluated analytically. We also evaluated a 156-atom catalytically relevant open-shell transition metal complex and found that NUMFREQ@Grid is faster (7.7 times shorter wall clock time) and less demanding (4.4 times less memory requirement) than an analytic Hessian. Capitalizing on the capabilities of parallel grid computing, NUMFREQ@Grid can outperform analytic methods in terms of wall time, memory requirements, and treatable system size. The NUMFREQ@Grid method presented herein demonstrates how grid computing can be used to facilitate embarrassingly parallel computational procedures and is a pioneer for future implementations.
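
    The scheme being parallelized is ordinary central differencing of analytic gradients: each displaced-geometry gradient is an independent job, which is what makes the procedure embarrassingly parallel on a grid. The sketch below shows only that finite-difference logic; gradient_fn stands in for an external quantum-chemistry gradient call and is an assumption, not part of NUMFREQ@Grid itself.

    ```python
    # Central-difference nuclear Hessian from analytic gradients at displaced
    # geometries. In a grid setting, every gradient evaluation below would be
    # submitted as an independent job; `gradient_fn` is a hypothetical stand-in
    # for the external quantum-chemistry gradient call.
    import numpy as np

    def numerical_hessian(coords, gradient_fn, step=0.005):
        """coords: flat (3N,) array of Cartesian coordinates; returns the (3N, 3N) Hessian."""
        n = coords.size
        hessian = np.zeros((n, n))
        # 2 * 3N displaced gradients: the 6N prefactor mentioned in the abstract.
        for i in range(n):
            plus, minus = coords.copy(), coords.copy()
            plus[i] += step
            minus[i] -= step
            hessian[i, :] = (gradient_fn(plus) - gradient_fn(minus)) / (2.0 * step)
        return 0.5 * (hessian + hessian.T)   # symmetrize against numerical noise
    ```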

  4. Modeling the Value of Strategic Actions in the Superior Colliculus

    PubMed Central

    Thevarajah, Dhushan; Webb, Ryan; Ferrall, Christopher; Dorris, Michael C.

    2009-01-01

    In learning models of strategic game play, an agent constructs a valuation (action value) over possible future choices as a function of past actions and rewards. Choices are then stochastic functions of these action values. Our goal is to uncover a neural signal that correlates with the action value posited by behavioral learning models. We measured activity from neurons in the superior colliculus (SC), a midbrain region involved in planning saccadic eye movements, while monkeys performed two saccade tasks. In the strategic task, monkeys competed against a computer in a saccade version of the mixed-strategy game "matching-pennies". In the instructed task, saccades were elicited through explicit instruction rather than free choices. In both tasks neuronal activity and behavior were shaped by past actions and rewards with more recent events exerting a larger influence. Further, SC activity predicted upcoming choices during the strategic task and upcoming reaction times during the instructed task. Finally, we found that neuronal activity in both tasks correlated with an established learning model, the Experience Weighted Attraction model of action valuation (Camerer and Ho, 1999). Collectively, our results provide evidence that action values hypothesized by learning models are represented in the motor planning regions of the brain in a manner that could be used to select strategic actions. PMID:20161807

  5. A comparison of acceleration methods for solving the neutron transport k-eigenvalue problem

    NASA Astrophysics Data System (ADS)

    Willert, Jeffrey; Park, H.; Knoll, D. A.

    2014-10-01

    Over the past several years a number of papers have been written describing modern techniques for numerically computing the dominant eigenvalue of the neutron transport criticality problem. These methods fall into two distinct categories. The first category of methods rewrite the multi-group k-eigenvalue problem as a nonlinear system of equations and solve the resulting system using either a Jacobian-Free Newton-Krylov (JFNK) method or Nonlinear Krylov Acceleration (NKA), a variant of Anderson Acceleration. These methods are generally successful in significantly reducing the number of transport sweeps required to compute the dominant eigenvalue. The second category of methods utilize Moment-Based Acceleration (or High-Order/Low-Order (HOLO) Acceleration). These methods solve a sequence of modified diffusion eigenvalue problems whose solutions converge to the solution of the original transport eigenvalue problem. This second class of methods is, in our experience, always superior to the first, as most of the computational work is eliminated by the acceleration from the LO diffusion system. In this paper, we review each of these methods. Our computational results support our claim that the choice of which nonlinear solver to use, JFNK or NKA, should be secondary. The primary computational savings result from the implementation of a HOLO algorithm. We display computational results for a series of challenging multi-dimensional test problems.
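
    For orientation, the baseline that both classes of methods accelerate is the classic power iteration for the dominant eigenvalue, in which each step applies one transport sweep to the fission source. The sketch below uses a dense matrix as a stand-in for the sweep operator; it illustrates the iteration structure only and is not a transport solver.

    ```python
    # Unaccelerated power iteration for a dominant eigenvalue, the baseline that
    # JFNK/NKA and HOLO schemes speed up. A dense matrix `sweep` stands in for
    # one transport sweep applied to the fission source.
    import numpy as np

    def power_iteration(sweep, source, tol=1e-8, max_iter=500):
        k = 1.0
        for _ in range(max_iter):
            new_source = sweep @ source / k                  # one "transport sweep"
            k_new = k * new_source.sum() / source.sum()      # eigenvalue update
            if abs(k_new - k) < tol:
                return k_new, new_source
            k, source = k_new, new_source
        return k, source

    # Example with a random positive surrogate operator.
    rng = np.random.default_rng(0)
    A = rng.random((50, 50))
    k_eff, flux = power_iteration(A, np.ones(50))
    ```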

  6. GPU-accelerated molecular modeling coming of age.

    PubMed

    Stone, John E; Hardy, David J; Ufimtsev, Ivan S; Schulten, Klaus

    2010-09-01

    Graphics processing units (GPUs) have traditionally been used in molecular modeling solely for visualization of molecular structures and animation of trajectories resulting from molecular dynamics simulations. Modern GPUs have evolved into fully programmable, massively parallel co-processors that can now be exploited to accelerate many scientific computations, typically providing about one order of magnitude speedup over CPU code and in special cases providing speedups of two orders of magnitude. This paper surveys the development of molecular modeling algorithms that leverage GPU computing, the advances already made and remaining issues to be resolved, and the continuing evolution of GPU technology that promises to become even more useful to molecular modeling. Hardware acceleration with commodity GPUs is expected to benefit the overall computational biology community by bringing teraflops performance to desktop workstations and in some cases potentially changing what were formerly batch-mode computational jobs into interactive tasks. (c) 2010 Elsevier Inc. All rights reserved.

  7. FY17 Strategic Themes.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Leland, Robert W.

    2017-03-01

    I am pleased to present this summary of the FY17 Division 1000 Science and Technology Strategic Plan. As this plan represents a continuation of the work we started last year, the four strategic themes (Mission Engagement, Bold Outcomes, Collaborative Environment, and Safety Imperative) remain the same, along with many of the goals. You will see most of the changes in the actions listed for each goal: We completed some actions, modified others, and added a few new ones. As I've stated previously, this is not a strategy to be pursued in tension with the Laboratory strategic plan. The Division 1000 strategic plan is intended to chart our course as we strive to contribute our very best in service of the greater Laboratory strategy. I welcome your feedback and look forward to our dialogue about these strategic themes. Please join me as we move forward to implement the plan in the coming months.

  8. Strategic Talk in Film

    PubMed Central

    Payr, Sabine; Skowron, Marcin; Dobrosovestnova, Anna; Trapp, Martin; Trappl, Robert

    2017-01-01

    Conversational robots and agents are being designed for educational and/or persuasive tasks, e.g., health or fitness coaching. To pursue such tasks over a long time, they will need a complex model of the strategic goal, a variety of strategies to implement it in interaction, and the capability of strategic talk. Strategic talk is incipient ongoing conversation in which at least one participant has the objective of changing the other participant's attitudes or goals. The paper is based on the observation that strategic talk can stretch over considerable periods of time and a number of conversational segments. Film dialogues are taken as a source to develop a model of the strategic talk of mentor characters. A corpus of film mentor utterances is annotated on the basis of the model, and the data are interpreted to arrive at insights into mentor behavior, especially into the realization and sequencing of strategies. PMID:29375243

  9. Modeling strategic use of human computer interfaces with novel hidden Markov models

    PubMed Central

    Mariano, Laura J.; Poore, Joshua C.; Krum, David M.; Schwartz, Jana L.; Coskren, William D.; Jones, Eric M.

    2015-01-01

    Immersive software tools are virtual environments designed to give their users an augmented view of real-world data and ways of manipulating that data. As virtual environments, every action users make while interacting with these tools can be carefully logged, as can the state of the software and the information it presents to the user, giving these actions context. This data provides a high-resolution lens through which dynamic cognitive and behavioral processes can be viewed. In this report, we describe new methods for the analysis and interpretation of such data, utilizing a novel implementation of the Beta Process Hidden Markov Model (BP-HMM) for analysis of software activity logs. We further report the results of a preliminary study designed to establish the validity of our modeling approach. A group of 20 participants were asked to play a simple computer game, instrumented to log every interaction with the interface. Participants had no previous experience with the game's functionality or rules, so the activity logs collected during their naïve interactions capture patterns of exploratory behavior and skill acquisition as they attempted to learn the rules of the game. Pre- and post-task questionnaires probed for self-reported styles of problem solving, as well as task engagement, difficulty, and workload. We jointly modeled the activity log sequences collected from all participants using the BP-HMM approach, identifying a global library of activity patterns representative of the collective behavior of all the participants. Analyses show systematic relationships between both pre- and post-task questionnaires, self-reported approaches to analytic problem solving, and metrics extracted from the BP-HMM decomposition. Overall, we find that this novel approach to decomposing unstructured behavioral data within software environments provides a sensible means for understanding how users learn to integrate software functionality for strategic task pursuit. PMID

  10. Synchronous acceleration with tapered dielectric-lined waveguides

    NASA Astrophysics Data System (ADS)

    Lemery, F.; Floettmann, K.; Piot, P.; Kärtner, F. X.; Aßmann, R.

    2018-05-01

    We present a general concept to accelerate nonrelativistic charged particles. Our concept employs an adiabatically-tapered dielectric-lined waveguide which supports accelerating phase velocities for synchronous acceleration. We propose an ansatz for the transient field equations, show it satisfies Maxwell's equations under an adiabatic approximation and find excellent agreement with a finite-difference time-domain computer simulation. The fields were implemented into the particle-tracking program astra and we present beam dynamics results for an accelerating field with a 1-mm wavelength and a peak electric field of 100 MV/m. Numerical simulations indicate that a ~200-keV electron beam can be accelerated to an energy of ~10 MeV over ~10 cm with parameters of interest to a wide range of applications including, e.g., future advanced accelerators, and ultra-fast electron diffraction.

  11. The strategic security officer.

    PubMed

    Hodges, Charles

    2014-01-01

    This article discusses the concept of the strategic security officer, and the potential that it brings to the healthcare security operational environment. The author believes that training and development, along with strict hiring practices, can enable a security department to reach a new level of professionalism, proficiency and efficiency. The strategic officer for healthcare security is adapted from the "strategic corporal" concept of US Marine Corps General Charles C. Krulak which focuses on understanding the total force implications of the decisions made by the lowest level leaders within the Corps (Krulak, 1999). This article focuses on the strategic organizational implications of every security officer's decisions in the constantly changing and increasingly volatile operational environment of healthcare security.

  12. 7 CFR 25.202 - Strategic plan.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... Procedure § 25.202 Strategic plan. (a) Principles of strategic plan. The strategic plan included in the application must be developed in accordance with the following four key principles: (1) Strategic vision for... institutions and individual citizens. (3) Economic opportunity, including job creation within the community and...

  13. Strategic Risk Assessment

    NASA Technical Reports Server (NTRS)

    Derleth, Jason; Lobia, Marcus

    2009-01-01

    This slide presentation provides an overview of the attempt to develop and demonstrate a methodology for the comparative assessment of risks across the entire portfolio of NASA projects and assets. It includes information about strategic risk identification, normalizing strategic risks, calculation of relative risk score, and implementation options.

  14. Anderson Acceleration for Fixed-Point Iterations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Walker, Homer F.

    The purpose of this grant was to support research on acceleration methods for fixed-point iterations, with applications to computational frameworks and simulation problems that are of interest to DOE.
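
    A minimal sketch of Anderson acceleration for a fixed-point iteration x = g(x), following the standard least-squares mixing formulation often attributed to Walker and Ni; the depth-m history, the memory truncation, and the example contraction g(x) = cos(x) are illustrative choices, not tied to any specific application from the grant.

    ```python
    # Anderson acceleration for x = g(x): mix the last few iterates using a
    # small least-squares problem on residual differences. Illustrative sketch.
    import numpy as np

    def anderson(g, x0, m=5, tol=1e-10, max_iter=100):
        x = np.atleast_1d(np.asarray(x0, dtype=float))
        gx = g(x)
        X_hist, G_hist = [x], [gx]
        for _ in range(max_iter):
            f = gx - x                                   # fixed-point residual
            if np.linalg.norm(f) < tol:
                return gx
            if len(X_hist) > 1:
                # Differences of residuals and of g-values over the memory window.
                F = np.column_stack([(G_hist[i + 1] - X_hist[i + 1]) - (G_hist[i] - X_hist[i])
                                     for i in range(len(X_hist) - 1)])
                Gd = np.column_stack([G_hist[i + 1] - G_hist[i] for i in range(len(G_hist) - 1)])
                gamma, *_ = np.linalg.lstsq(F, f, rcond=None)
                x = gx - Gd @ gamma                      # Anderson mixing step
            else:
                x = gx                                   # plain Picard step to start
            gx = g(x)
            X_hist.append(x)
            G_hist.append(gx)
            X_hist, G_hist = X_hist[-(m + 1):], G_hist[-(m + 1):]
        return x

    # Usage: accelerate the contraction g(x) = cos(x) toward its fixed point (~0.739).
    sol = anderson(lambda x: np.cos(x), x0=[1.0])
    ```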

  15. Binding and strategic selection in working memory: a lifespan dissociation.

    PubMed

    Sander, Myriam C; Werkle-Bergner, Markus; Lindenberger, Ulman

    2011-09-01

    Working memory (WM) shows a gradual increase during childhood, followed by accelerating decline from adulthood to old age. To examine these lifespan differences more closely, we asked 34 children (10-12 years), 40 younger adults (20-25 years), and 39 older adults (70-75 years) to perform a color change detection task. Load levels and encoding durations were varied for displays including targets only (Experiment 1) or targets plus distracters (Experiment 2, investigating a subsample of Experiment 1). WM performance was lower in older adults and children than in younger adults. Longer presentation times were associated with better performance in all age groups, presumably reflecting increasing effects of strategic selection mechanisms on WM performance. Children outperformed older adults when encoding times were short, and distracter effects were larger in children and older adults than in younger adults. We conclude that strategic selection in WM develops more slowly during childhood than basic binding operations, presumably reflecting the delay in maturation of frontal versus medio-temporal brain networks. In old age, both sets of mechanisms decline, reflecting senescent change in both networks. We discuss similarities to episodic memory development and address open questions for future research.

  16. Developing Oral Proficiency with VoiceThread: Learners' Strategic Uses and Views

    ERIC Educational Resources Information Center

    Dugartsyrenova, Vera A.; Sardegna, Veronica G.

    2017-01-01

    This study explored Russian as a foreign language (RFL) learners' self-reported strategic uses of "VoiceThread" (VT)--a multimodal asynchronous computer-mediated communication tool--in order to gain insights into learner perceived effectiveness of VT for second language (L2) oral skills development and to determine the factors that…

  17. Strategic Partnerships in Higher Education

    ERIC Educational Resources Information Center

    Ortega, Janet L.

    2013-01-01

    The purpose of this study was to investigate the impacts of strategic partnerships between community colleges and key stakeholders; to specifically examine strategic partnerships; leadership decision-making; criteria to evaluate strategic partnerships that added value to the institution, value to the students, faculty, staff, and the local…

  18. Strategic Planning and Financial Management

    ERIC Educational Resources Information Center

    Conneely, James F.

    2010-01-01

    Strong financial management is a strategy for strategic planning success in student affairs. It is crucial that student affairs professionals understand the necessity of linking their strategic planning with their financial management processes. An effective strategic planner needs strong financial management skills to implement the plan over…

  19. The Possibilities of Strategic Finance

    ERIC Educational Resources Information Center

    Chaffee, Ellen

    2010-01-01

    Strategic finance is aligning financial decisions--regarding revenues, creating and maintaining institutional assets, and using those assets--with the institution's mission and strategic plan. The concept known as "strategic finance" increasingly is being seen as a useful perspective for helping boards and presidents develop a sustainable…

  20. An "Elective Replacement" Approach to Providing Extra Help in Math: The Talent Development Middle Schools' Computer- and Team-Assisted Mathematics Acceleration (CATAMA) Program.

    ERIC Educational Resources Information Center

    Mac Iver, Douglas J.; Balfanz, Robert; Plank, Stephan B.

    1999-01-01

    Two studies evaluated the Computer- and Team-Assisted Mathematics Acceleration course (CATAMA) in Talent Development Middle Schools. The first study compared growth in math achievement for 96 seventh-graders (48 of whom participated in CATAMA and 48 of whom did not); the second study gathered data from interviews with, and observations of, CATAMA…

  1. Strategic and non-strategic problem gamblers differ on decision-making under risk and ambiguity.

    PubMed

    Lorains, Felicity K; Dowling, Nicki A; Enticott, Peter G; Bradshaw, John L; Trueblood, Jennifer S; Stout, Julie C

    2014-07-01

    To analyse problem gamblers' decision-making under conditions of risk and ambiguity, investigate underlying psychological factors associated with their choice behaviour and examine whether decision-making differed in strategic (e.g., sports betting) and non-strategic (e.g., electronic gaming machine) problem gamblers. Cross-sectional study. Out-patient treatment centres and university testing facilities in Victoria, Australia. Thirty-nine problem gamblers and 41 age, gender and estimated IQ-matched controls. Decision-making tasks included the Iowa Gambling Task (IGT) and a loss aversion task. The Prospect Valence Learning (PVL) model was used to provide an explanation of cognitive, motivational and response style factors involved in IGT performance. Overall, problem gamblers performed more poorly than controls on both the IGT (P = 0.04) and the loss aversion task (P = 0.01), and their IGT decisions were associated with heightened attention to gains (P = 0.003) and less consistency (P = 0.002). Strategic problem gamblers did not differ from matched controls on either decision-making task, but non-strategic problem gamblers performed worse on both the IGT (P = 0.006) and the loss aversion task (P = 0.02). Furthermore, we found differences in the PVL model parameters underlying strategic and non-strategic problem gamblers' choices on the IGT. Problem gamblers demonstrated poor decision-making under conditions of risk and ambiguity. Strategic (e.g. sports betting, poker) and non-strategic (e.g. electronic gaming machines) problem gamblers differed in decision-making and the underlying psychological processes associated with their decisions. © 2014 Society for the Study of Addiction.

  2. Toward real-time diffuse optical tomography: accelerating light propagation modeling employing parallel computing on GPU and CPU

    NASA Astrophysics Data System (ADS)

    Doulgerakis, Matthaios; Eggebrecht, Adam; Wojtkiewicz, Stanislaw; Culver, Joseph; Dehghani, Hamid

    2017-12-01

    Parameter recovery in diffuse optical tomography is a computationally expensive algorithm, especially when used for large and complex volumes, as in the case of human brain functional imaging. The modeling of light propagation, also known as the forward problem, is the computational bottleneck of the recovery algorithm, whereby the lack of a real-time solution is impeding practical and clinical applications. The objective of this work is the acceleration of the forward model, within a diffusion approximation-based finite-element modeling framework, employing parallelization to expedite the calculation of light propagation in realistic adult head models. The proposed methodology is applicable for modeling both continuous wave and frequency-domain systems with the results demonstrating a 10-fold speed increase when GPU architectures are available, while maintaining high accuracy. It is shown that, for a very high-resolution finite-element model of the adult human head with ˜600,000 nodes, consisting of heterogeneous layers, light propagation can be calculated at ˜0.25 s/excitation source.

  3. Computation of linear acceleration through an internal model in the macaque cerebellum

    PubMed Central

    Laurens, Jean; Meng, Hui; Angelaki, Dora E.

    2013-01-01

    A combination of theory and behavioral findings has supported a role for internal models in the resolution of sensory ambiguities and sensorimotor processing. Although the cerebellum has been proposed as a candidate for implementation of internal models, concrete evidence from neural responses is lacking. Here we exploit un-natural motion stimuli, which induce incorrect self-motion perception and eye movements, to explore the neural correlates of an internal model proposed to compensate for Einstein’s equivalence principle and generate neural estimates of linear acceleration and gravity. We show that caudal cerebellar vermis Purkinje cells and cerebellar nuclei neurons selective for actual linear acceleration also encode erroneous linear acceleration, as expected from the internal model hypothesis, even when no actual linear acceleration occurs. These findings provide strong evidence that the cerebellum might be involved in the implementation of internal models that mimic physical principles to interpret sensory signals, as previously hypothesized by theorists. PMID:24077562

  4. Clinical engineering department strategic graphical dashboard to enhance maintenance planning and asset management.

    PubMed

    Sloane, Elliot; Rosow, Eric; Adam, Joe; Shine, Dave

    2005-01-01

    The Clinical Engineering (a.k.a. Biomedical Engineering) Department has heretofore lagged in adoption of some of the leading-edge information system tools used in other industries. This present application is part of a DOD-funded SBIR grant to improve the overall management of medical technology, and describes the capabilities that Strategic Graphical Dashboards (SGDs) can afford. This SGD is built on top of an Oracle database, and uses custom-written graphic objects like gauges, fuel tanks, and Geographic Information System (GIS) maps to improve and accelerate decision making.

  5. The Strategic Reader.

    ERIC Educational Resources Information Center

    Devine, James T., Ed.; And Others

    1986-01-01

    To add a foundation to the growing excitement among educators about the central role they play in helping learners become strategic readers, the articles in this thematic journal provide insight into current reading theory and practice. Richard Telfer's article reviews research on strategic reading and clarifies what is meant by the phrase…

  6. Ion acceleration in a plasma focus

    NASA Technical Reports Server (NTRS)

    Gary, S. P.

    1974-01-01

    The electric and magnetic fields associated with anomalous diffusion to the axis of a linear plasma discharge are used to compute representative ion trajectories. Substantial axial acceleration of the ions is demonstrated.

  7. FY16 Strategic Themes.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Leland, Robert W.

    2017-03-01

    I am pleased to present this summary of the Division 1000 Science and Technology Strategic Plan. This plan was created with considerable participation from all levels of management in Division 1000, and is intended to chart our course as we strive to contribute our very best in service of the greater Laboratory strategy. The plan is characterized by four strategic themes: Mission Engagement, Bold Outcomes, Collaborative Environment, and the Safety Imperative. Each theme is accompanied by a brief vision statement, several goals, and planned actions to support those goals throughout FY16. I want to be clear that this is not a strategy to be pursued in tension with the Laboratory strategic plan. Rather, it is intended to describe “how” we intend to show up for the “what” described in Sandia’s Strategic Plan. I welcome your feedback and look forward to our dialogue about these strategic themes. Please join me as we move forward to implement the plan in the coming year.

  8. Accelerated Profile HMM Searches

    PubMed Central

    Eddy, Sean R.

    2011-01-01

    Profile hidden Markov models (profile HMMs) and probabilistic inference methods have made important contributions to the theory of sequence database homology search. However, practical use of profile HMM methods has been hindered by the computational expense of existing software implementations. Here I describe an acceleration heuristic for profile HMMs, the “multiple segment Viterbi” (MSV) algorithm. The MSV algorithm computes an optimal sum of multiple ungapped local alignment segments using a striped vector-parallel approach previously described for fast Smith/Waterman alignment. MSV scores follow the same statistical distribution as gapped optimal local alignment scores, allowing rapid evaluation of significance of an MSV score and thus facilitating its use as a heuristic filter. I also describe a 20-fold acceleration of the standard profile HMM Forward/Backward algorithms using a method I call “sparse rescaling”. These methods are assembled in a pipeline in which high-scoring MSV hits are passed on for reanalysis with the full HMM Forward/Backward algorithm. This accelerated pipeline is implemented in the freely available HMMER3 software package. Performance benchmarks show that the use of the heuristic MSV filter sacrifices negligible sensitivity compared to unaccelerated profile HMM searches. HMMER3 is substantially more sensitive and 100- to 1000-fold faster than HMMER2. HMMER3 is now about as fast as BLAST for protein searches. PMID:22039361
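
    The heart of the MSV heuristic is a Viterbi-like dynamic program restricted to match states (no insert or delete states) in which the alignment may re-enter the model, so that several high-scoring ungapped segments can be chained. A much-simplified, unvectorized sketch of such a recurrence is given below; the production HMMER3 code uses striped SIMD vectors, reduced-precision scores and calibrated score statistics, none of which are shown, and the scoring scheme here is purely illustrative.

      def msv_like_score(match, b_score=-3.0):
          """Toy analogue of MSV: best sum of ungapped local diagonal segments.

          match[i][k] is the log-odds score of sequence residue i against
          profile match state k; b_score approximates the cost of re-entering
          the model so that several ungapped segments can be chained.
          """
          n, m = len(match), len(match[0])
          neg_inf = float("-inf")
          prev = [neg_inf] * m
          best = neg_inf                    # best chained score ending before row i
          for i in range(n):
              cur = [0.0] * m
              for k in range(m):
                  diag = prev[k - 1] if k > 0 else neg_inf
                  cur[k] = max(diag, best + b_score, 0.0) + match[i][k]
              best = max(best, max(cur))
              prev = cur
          return best

      # 4-residue sequence against a 3-state profile
      score = msv_like_score([[1.0, -1.0, -1.0],
                              [-1.0, 2.0, -1.0],
                              [-1.0, -1.0, 2.0],
                              [1.5, -1.0, -1.0]])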

  9. Beam breakup in an advanced linear induction accelerator

    DOE PAGES

    Ekdahl, Carl August; Coleman, Joshua Eugene; McCuistian, Brian Trent

    2016-07-01

    Two linear induction accelerators (LIAs) have been in operation for a number of years at the Los Alamos Dual Axis Radiographic Hydrodynamic Test (DARHT) facility. A new multipulse LIA is being developed. We have computationally investigated the beam breakup (BBU) instability in this advanced LIA. In particular, we have explored the consequences of the choice of beam injector energy and the grouping of LIA cells. We find that within the limited range of options presently under consideration for the LIA architecture, there is little adverse effect on the BBU growth. The computational tool that we used for this investigation was the beam dynamics code linear accelerator model for DARHT (LAMDA). In conclusion, to confirm that LAMDA was appropriate for this task, we first validated it through comparisons with the experimental BBU data acquired on the DARHT accelerators.

  10. Computational Science and Innovation

    NASA Astrophysics Data System (ADS)

    Dean, D. J.

    2011-09-01

    Simulations - utilizing computers to solve complicated science and engineering problems - are a key ingredient of modern science. The U.S. Department of Energy (DOE) is a world leader in the development of high-performance computing (HPC), the development of applied math and algorithms that utilize the full potential of HPC platforms, and the application of computing to science and engineering problems. An interesting general question is whether the DOE can strategically utilize its capability in simulations to advance innovation more broadly. In this article, I will argue that this is certainly possible.

  11. Reforming Pentagon Strategic Decisionmaking. Strategic Forum. Number 221, July 2006

    DTIC Science & Technology

    2006-07-01

    capability that would improve Pentagon decisionmaking. Blink and Think: It is commonly assumed that people can and should make decisions as rationally ... rationality,” which not only helps them make decisions but also introduces a range of nonrational psychological factors into their thinking. An otherwise...decisionmaking shortcuts that limit their ability to make rational decisions. Strategic Forum No. 221, July 2006, Institute for National Strategic Studies

  12. 2016 Annual Report - Argonne Leadership Computing Facility

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Collins, Jim; Papka, Michael E.; Cerny, Beth A.

    The Argonne Leadership Computing Facility (ALCF) helps researchers solve some of the world’s largest and most complex problems, while also advancing the nation’s efforts to develop future exascale computing systems. This report presents some of the ALCF’s notable achievements in key strategic areas over the past year.

  13. Advanced Computer Aids in the Planning and Execution of Air Warfare and Ground Strike Operations: Conference Proceedings, Meeting of the Avionics Panels of AGARD (51st) Held in Kongsberg, Norway on 12-16 May 1986

    DTIC Science & Technology

    1986-02-01

    the area of Artificial Intelligence (AI). DARPA’s Strategic Computing Program is developing an AI technology base upon which several applications...technologies with the Strategic Computing Program. In late 1983 the Strategic Computing Program (SCP) was announced. The program was organized to develop...solving a resource allocation problem. The remainder of this paper will discuss the TEMPLAR program as it relates to the Strategic Computing Program

  14. Department of State Strategic Planning Workshop II. Center for Strategic Leadership Issue Paper, Volume 01-02

    DTIC Science & Technology

    2002-04-01

    Strategic Leadership 650 Wright Avenue Carlisle, PA 170l3-5049 OFFICIAL BUSINESS DEPARTMENT OF STATE STRATEGIC PLANNING WORKSHOP II U.S. ARMY WAR COLLEGE CSL 4 ...April 2002 Issues Paper 01-02 Department of State Strategic Planning Workshop II By Colonel Jeffrey C. Reynolds A State Department request, made...at the senior level, asked the Army Chief of Staff if the Army could help State improve its capacity to undertake strategic planning. In April

  15. Applying Strategic Visualization(Registered Trademark) to Lunar and Planetary Mission Design

    NASA Technical Reports Server (NTRS)

    Frassanito, John R.; Cooke, D. R.

    2002-01-01

    NASA teams, such as the NASA Exploration Team (NEXT), utilize advanced computational visualization processes to develop mission designs and architectures for lunar and planetary missions. One such process, Strategic Visualization (trademark), is a tool used extensively to help mission designers visualize various design alternatives and present them to other participants of their team. The participants, which may include NASA, industry, and the academic community, are distributed within a virtual network. Consequently, computer animation and other digital techniques provide an efficient means to communicate top-level technical information among team members. Today, Strategic Visualization (trademark) is used extensively both in the mission design process within the technical community, and to communicate the value of space exploration to the general public. Movies and digital images have been generated and shown on nationally broadcast television and the Internet, as well as in magazines and digital media. In our presentation we will show excerpts of a computer-generated animation depicting the reference Earth/Moon L1 Libration Point Gateway architecture. The Gateway serves as a staging corridor for human expeditions to the lunar poles and other surface locations. Also shown are crew transfer systems and current reference lunar excursion vehicles, as well as the human and robotic construction of an inflatable telescope array for deployment to the Sun/Earth Libration Point.

  16. Being Strategic in HE Management

    ERIC Educational Resources Information Center

    West, Andrew

    2008-01-01

    The call to be strategic--and with it the concept of strategic management--can bring to mind a wide range of definitions, and there is now a huge array of academic literature supporting the different schools of thought. At a basic level, however, strategic thinking is probably most simply about focusing on the whole, rather than the part. In…

  17. Synchronous acceleration with tapered dielectric-lined waveguides

    DOE PAGES

    Lemery, Francois; Floettmann, Klaus; Piot, Philippe; ...

    2018-05-25

    Here, we present a general concept to accelerate non-relativistic charged particles. Our concept employs an adiabatically-tapered dielectric-lined waveguide which supports accelerating phase velocities for synchronous acceleration. We propose an ansatz for the transient field equations, show it satisfies Maxwell's equations under an adiabatic approximation and find excellent agreement with a finite-difference time-domain computer simulation. The fields were implemented into the particle-tracking program ASTRA and we present beam dynamics results for an accelerating field with a 1-mm wavelength and peak electric field of 100 MV/m. The numerical simulations indicate that a ∼200-keV electron beam can be accelerated to an energy of ∼10 MeV over ∼10 cm. The novel scheme is also found to form electron beams with parameters of interest to a wide range of applications including, e.g., future advanced accelerators, and ultra-fast electron diffraction.

  18. Strategic management process in hospitals.

    PubMed

    Zovko, V

    2001-01-01

    Strategic management is concerned with strategic choices and strategic implementation; it provides the means by which organizations meet their objectives. In the case of hospitals it helps executives and all employees to understand the real purpose and long term goals of the hospital. Also, it helps the hospital find its place in the health care service provision chain, and enables the hospital to coordinate its activities with other organizations in the health care system. Strategic management is a tool, rather than a solution, that helps executives to identify root causes of major problems in the hospital.

  19. Developing strategic thinking in senior management.

    PubMed

    Zabriskie, N B; Huellmantel, A B

    1991-12-01

    Chief Executive Officers have recently stated that their greatest staffing challenge for the 1990s is the development of strategic leadership in their senior management. In order to do this, it is necessary to identify the substance of strategic thinking, and the capabilities that must be mastered. Writers on strategy have identified six major elements of strategic thinking and these have been organized to reveal the tasks, questions, decisions, and skills that senior executives must acquire in order to lead their organizations strategically. Finally, the article identifies training programme elements which are used by Directors of Manpower Development to develop strategic leadership ability.

  20. Numerical Nudging: Using an Accelerating Score to Enhance Performance.

    PubMed

    Shen, Luxi; Hsee, Christopher K

    2017-08-01

    People often encounter inherently meaningless numbers, such as scores in health apps or video games, that increase as they take actions. This research explored how the pattern of change in such numbers influences performance. We found that the key factor is acceleration-namely, whether the number increases at an increasing velocity. Six experiments in both the lab and the field showed that people performed better on an ongoing task if they were presented with a number that increased at an increasing velocity than if they were not presented with such a number or if they were presented with a number that increased at a decreasing or constant velocity. This acceleration effect occurred regardless of the absolute magnitude or the absolute velocity of the number, and even when the number was not tied to any specific rewards. This research shows the potential of numerical nudging-using inherently meaningless numbers to strategically alter behaviors-and is especially relevant in the present age of digital devices.

  1. Future of Department of Defense Cloud Computing Amid Cultural Confusion

    DTIC Science & Technology

    2013-03-01

    enterprise cloud-computing environment and transition to a public cloud service provider. Services have started the development of individual cloud-computing environments...endorsing cloud computing. It addresses related issues in matters of service culture changes and how strategic leaders will dictate the future of cloud...through data center consolidation and individual Service-provided cloud computing.

  2. Research | Computational Science | NREL

    Science.gov Websites

    Research: NREL's computational science experts use advanced high-performance computing (HPC) technologies, thereby accelerating the transformation of our nation's energy system. Enabling High-Impact Research: NREL's computational science capabilities enable high-impact research. Some recent examples

  3. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Klitsner, Tom

    The recent Executive Order creating the National Strategic Computing Initiative (NSCI) recognizes the value of high performance computing for economic competitiveness and scientific discovery and commits to accelerate delivery of exascale computing. The HPC programs at Sandia – the NNSA ASC program and Sandia’s Institutional HPC Program – are focused on ensuring that Sandia has the resources necessary to deliver computation in the national interest.

  4. Computing at DESY — current setup, trends and strategic directions

    NASA Astrophysics Data System (ADS)

    Ernst, Michael

    1998-05-01

    Since the HERA experiments H1 and ZEUS started data taking in '92, the computing environment at DESY has changed dramatically. Running a mainframe-centred computing for more than 20 years, DESY switched to a heterogeneous, fully distributed computing environment within only about two years in almost every corner where computing has its applications. The computing strategy was highly influenced by the needs of the user community. The collaborations are usually limited by current technology, and their ever-increasing demands are the driving force for central computing to always move close to the technology edge. While DESY's central computing has a multidecade experience in running Central Data Recording/Central Data Processing for HEP experiments, the most challenging task today is to provide for clear and homogeneous concepts in the desktop area. Given that lowest-level commodity hardware draws more and more attention, combined with the financial constraints we are facing already today, we quickly need concepts for integrated support of a versatile device which has the potential to move into basically any computing area in HEP. Though commercial solutions, especially addressing the PC management/support issues, are expected to come to market in the next 2-3 years, we need to provide for suitable solutions now. Buying PCs at DESY currently at a rate of about 30/month will otherwise absorb any available manpower in central computing and still will leave hundreds of unhappy people alone. Though certainly not the only region, the desktop issue is one of the most important ones where we need HEP-wide collaboration to a large extent, and right now. Taking into account that there is traditionally no room for R&D at DESY, collaboration, meaning sharing experience and development resources within the HEP community, is a predominant factor for us.

  5. Biocellion: accelerating computer simulation of multicellular biological system models

    PubMed Central

    Kang, Seunghwa; Kahan, Simon; McDermott, Jason; Flann, Nicholas; Shmulevich, Ilya

    2014-01-01

    Motivation: Biological system behaviors are often the outcome of complex interactions among a large number of cells and their biotic and abiotic environment. Computational biologists attempt to understand, predict and manipulate biological system behavior through mathematical modeling and computer simulation. Discrete agent-based modeling (in combination with high-resolution grids to model the extracellular environment) is a popular approach for building biological system models. However, the computational complexity of this approach forces computational biologists to resort to coarser resolution approaches to simulate large biological systems. High-performance parallel computers have the potential to address the computing challenge, but writing efficient software for parallel computers is difficult and time-consuming. Results: We have developed Biocellion, a high-performance software framework, to solve this computing challenge using parallel computers. To support a wide range of multicellular biological system models, Biocellion asks users to provide their model specifics by filling the function body of pre-defined model routines. Using Biocellion, modelers without parallel computing expertise can efficiently exploit parallel computers with less effort than writing sequential programs from scratch. We simulate cell sorting, microbial patterning and a bacterial system in soil aggregate as case studies. Availability and implementation: Biocellion runs on x86 compatible systems with the 64 bit Linux operating system and is freely available for academic use. Visit http://biocellion.com for additional information. Contact: seunghwa.kang@pnnl.gov PMID:25064572
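
    Biocellion itself is a C++ framework and its actual callback interface is not reproduced here; the sketch below, with hypothetical names, only illustrates the fill-in-the-routine pattern the abstract describes, in which the framework owns the parallel loop over cells and the modeler supplies the per-cell update rule.

      class ModelRoutines:
          """Pre-defined routine whose body the modeler fills in."""
          def update_cell(self, cell, all_cells, dt):
              raise NotImplementedError

      class SortingModel(ModelRoutines):
          def update_cell(self, cell, all_cells, dt):
              # toy rule: drift toward the mean position of like-typed cells
              like = [c for c in all_cells if c["type"] == cell["type"]]
              target = sum(c["x"] for c in like) / len(like)
              cell["x"] += 0.1 * (target - cell["x"]) * dt
              return cell

      def run(cells, routines, dt=1.0, steps=10):
          for _ in range(steps):
              # in the real framework this loop is distributed across many cores
              cells = [routines.update_cell(c, cells, dt) for c in cells]
          return cells

      cells = run([{"type": "A", "x": 0.0}, {"type": "A", "x": 2.0},
                   {"type": "B", "x": 5.0}], SortingModel())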

  6. Strategic Planning with Critical Success Factors and Future Scenarios: An Integrated Strategic Planning Framework

    DTIC Science & Technology

    2010-11-01

    Implementing the Process by David Fogg [Fogg 1994]. Strategic goals typically reflect the primary goals of an organization or enterprise and imply a...method, such as the one described by Fogg in Team-Based Strategic Planning: A Complete Guide to Structuring, Facilitating, and Implementing the...Process, can provide ready receptors for non-CSF oriented information [Fogg 1994]. If an organization is not adept at strategic planning, it is highly

  7. Strategic planning by independent community pharmacies.

    PubMed

    Harrison, Donald L

    2005-01-01

    (1) To assess the degree and level of use of the strategic planning process (none, partly, fully) by independent community pharmacy owners/managers and (2) to evaluate the relationships between independent community pharmacy owners/managers' level of strategic planning and indicators of pharmacy performance; including new and refill prescriptions filled, gross margin, rated patient care performance, rated dispensing performance, rated non-pharmacy performance, and rated financial performance. Cross-sectional study. United States. Nationwide random sample of 1,250 owners/managers of independent community pharmacies. Mailed survey. Quality of strategic planning conducted; pharmacy performance measures. Only 141 of 527 (26.8%) usable responses indicated use of some (77 pharmacies, 54.6%) or all (64 pharmacies, 45.4%) of the seven steps typical of strategic planning. Significant associations were observed between the level of strategic planning use and all pharmacy performance variables assessed, including indicators such as greater numbers of new and refill prescriptions dispensed, gross margins, patient care performance, dispensing performance, non-pharmacy performance, and financial performance. Greater ratings of pharmacy performance were significantly associated with the level of strategic planning use. Respondents who fully used strategic planning had significantly higher indicators than partial users; respondents who partly used the process had significantly higher ratings than respondents who did not conduct strategic planning.

  8. Educating and Training Accelerator Scientists and Technologists for Tomorrow

    NASA Astrophysics Data System (ADS)

    Barletta, William; Chattopadhyay, Swapan; Seryi, Andrei

    2012-01-01

    Accelerator science and technology is inherently an integrative discipline that combines aspects of physics, computational science, electrical and mechanical engineering. As few universities offer full academic programs, the education of accelerator physicists and engineers for the future has primarily relied on a combination of on-the-job training supplemented with intensive courses at regional accelerator schools. This article describes the approaches being used to satisfy the educational curiosity of a growing number of interested physicists and engineers.

  9. Educating and Training Accelerator Scientists and Technologists for Tomorrow

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Barletta, William A.; Chattopadhyay, Swapan; Seryi, Andrei

    2012-07-01

    Accelerator science and technology is inherently an integrative discipline that combines aspects of physics, computational science, electrical and mechanical engineering. As few universities offer full academic programs, the education of accelerator physicists and engineers for the future has primarily relied on a combination of on-the-job training supplemented with intense courses at regional accelerator schools. This paper describes the approaches being used to satisfy the educational interests of a growing number of interested physicists and engineers.

  10. Biocellion: accelerating computer simulation of multicellular biological system models.

    PubMed

    Kang, Seunghwa; Kahan, Simon; McDermott, Jason; Flann, Nicholas; Shmulevich, Ilya

    2014-11-01

    Biological system behaviors are often the outcome of complex interactions among a large number of cells and their biotic and abiotic environment. Computational biologists attempt to understand, predict and manipulate biological system behavior through mathematical modeling and computer simulation. Discrete agent-based modeling (in combination with high-resolution grids to model the extracellular environment) is a popular approach for building biological system models. However, the computational complexity of this approach forces computational biologists to resort to coarser resolution approaches to simulate large biological systems. High-performance parallel computers have the potential to address the computing challenge, but writing efficient software for parallel computers is difficult and time-consuming. We have developed Biocellion, a high-performance software framework, to solve this computing challenge using parallel computers. To support a wide range of multicellular biological system models, Biocellion asks users to provide their model specifics by filling the function body of pre-defined model routines. Using Biocellion, modelers without parallel computing expertise can efficiently exploit parallel computers with less effort than writing sequential programs from scratch. We simulate cell sorting, microbial patterning and a bacterial system in soil aggregate as case studies. Biocellion runs on x86 compatible systems with the 64 bit Linux operating system and is freely available for academic use. Visit http://biocellion.com for additional information. © The Author 2014. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.

  11. 76 FR 60811 - Notice of Advisory Committee Closed Meeting; U.S. Strategic Command Strategic Advisory Group...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-09-30

    ... DEPARTMENT OF DEFENSE Notice of Advisory Committee Closed Meeting; U.S. Strategic Command Strategic Advisory Group; Correction AGENCY: Department of Defense. ACTION: Notice of Advisory Committee... Command Strategic Advisory Group gave notice of a meeting to be held on November 1, 2011, from 8 a.m. to 5...

  12. The PVCC Strategic Plan.

    ERIC Educational Resources Information Center

    Piedmont Virginia Community Coll., Charlottesville, VA.

    Presents Piedmont Virginia Community College's (PVCC's) strategic plan. Contains the following chapters: (1) introduction; (2) statement of mission; (3) summary of the college's strategic initiatives: funding, organization, faculty and staff, curriculum and instruction, enrollment management, students and student services, facilities, technology,…

  13. Real-time orthorectification by FPGA-based hardware acceleration

    NASA Astrophysics Data System (ADS)

    Kuo, David; Gordon, Don

    2010-10-01

    Orthorectification, which corrects the perspective distortion of remote sensing imagery, providing accurate geolocation and ease of correlation to other images, is a valuable first step in image processing for information extraction. However, the large amount of metadata and the floating-point matrix transformations required to operate on each pixel make this a computation and I/O (Input/Output) intensive process. As a result, much imagery is either left unprocessed or loses time-sensitive value in the long processing cycle. However, the computation on each pixel can be reduced substantially by using computational results of the neighboring pixels and accelerated by one to two orders of magnitude with a special pipelined hardware architecture. A specialized coprocessor that is implemented inside an FPGA (Field Programmable Gate Array) chip and surrounded by vendor-supported hardware IP (Intellectual Property) shares the computation workload with the CPU through a PCI-Express interface. The ultimate speed of one pixel per clock (125 MHz) is achieved by the pipelined systolic array architecture. The optimal partition between software and hardware, the timing profile among image I/O and computation, and the highly automated GUI (Graphical User Interface) that fully exploits this speed increase to maximize overall image production throughput will also be discussed. The software that runs on a workstation with the acceleration hardware orthorectifies 16 Megapixels per second, which is 16 times faster than without the hardware. It turns the production time from months to days. A real-life success story of an imaging satellite company that adopted such workstations for their orthorectified imagery production will be presented. The potential candidacy of other image processing computations that can be accelerated more efficiently by the same approach will also be analyzed.
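
    The point that the per-pixel computation can be reduced by reusing results from neighboring pixels is essentially incremental evaluation: once the ground-to-image mapping for the first pixel of a scanline is known, subsequent pixels can be obtained by adding constant (or slowly varying) increments instead of re-evaluating the full sensor model. A minimal CPU sketch of that idea for a locally affine approximation is shown below; the FPGA systolic pipeline and the rigorous sensor model are not represented, and the coefficients are illustrative.

      import numpy as np

      def scanline_source_coords(affine, row, width):
          """Map every pixel of one output row back to source-image coordinates.

          affine = (a, b, c, d, e, f) with  x_src = a*col + b*row + c
                                            y_src = d*col + e*row + f.
          Only the first pixel uses the full expression; the rest add the
          constant per-column increments (a, d), one addition per coordinate.
          """
          a, b, c, d, e, f = affine
          x, y = b * row + c, e * row + f           # column 0
          coords = np.empty((width, 2))
          for col in range(width):
              coords[col] = (x, y)
              x += a
              y += d
          return coords

      coords = scanline_source_coords((0.98, 0.02, 10.0, -0.01, 1.01, 5.0),
                                      row=100, width=2048)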

  14. Chaotic dynamics in accelerator physics

    NASA Astrophysics Data System (ADS)

    Cary, J. R.

    1992-11-01

    Substantial progress was made in several areas of accelerator dynamics. We have completed a design of an FEL wiggler with adiabatic trapping and detrapping sections to develop an understanding of longitudinal adiabatic dynamics and to create efficiency enhancements for recirculating free-electron lasers. We developed a computer code for analyzing the critical KAM tori that bind the dynamic aperture in circular machines. Studies of modes that arise due to the interaction of coasting beams with a narrow-spectrum impedance have begun. During this research, educational and research ties with the accelerator community at large have been strengthened.

  15. Information Technology: Making It All Fit. Track VIII: Academic Computing Strategy.

    ERIC Educational Resources Information Center

    CAUSE, Boulder, CO.

    Six papers from the 1988 CAUSE conference's Track VIII, Academic Computing Strategy, are presented. They include: "Achieving Institution-Wide Computer Fluency: A Five-Year Retrospective" (Paul J. Plourde); "A Methodology and a Policy for Building and Implementing a Strategic Computer Plan" (Frank B. Thomas); "Aligning…

  16. Ice-sheet modelling accelerated by graphics cards

    NASA Astrophysics Data System (ADS)

    Brædstrup, Christian Fredborg; Damsgaard, Anders; Egholm, David Lundbek

    2014-11-01

    Studies of glaciers and ice sheets have increased the demand for high performance numerical ice flow models over the past decades. When exploring the highly non-linear dynamics of fast flowing glaciers and ice streams, or when coupling multiple flow processes for ice, water, and sediment, researchers are often forced to use super-computing clusters. As an alternative to conventional high-performance computing hardware, the Graphical Processing Unit (GPU) is capable of massively parallel computing while retaining a compact design and low cost. In this study, we present a strategy for accelerating a higher-order ice flow model using a GPU. By applying the newest GPU hardware, we achieve up to 180× speedup compared to a similar but serial CPU implementation. Our results suggest that GPU acceleration is a competitive option for ice-flow modelling when compared to CPU-optimised algorithms parallelised by the OpenMP or Message Passing Interface (MPI) protocols.

  17. Strategic R&D transactions in personalized drug development.

    PubMed

    Makino, Tomohiro; Lim, Yeongjoo; Kodama, Kota

    2018-03-21

    Although external collaboration capability influences the development of personalized medicine, key transactions in the pharmaceutical industry have not been addressed. To explore specific trends in interorganizational transactions and key players, we longitudinally surveyed strategic transactions, comparing them with other advanced medical developments, such as antibody therapy, as controls. We found that the financing deals of start-ups have surged over the past decade, accelerating intellectual property (IP) creation. Our correlation and regression analyses identified determinants of financing deals among alliance deals, acquisition deals, patents, research and development (R&D) licenses, market licenses, and scientific papers. They showed that patents positively correlated with transactions, and that the number of R&D licenses significantly predicted financing deals. This indicates, for the first time, that start-ups and investors lead progress in personalized medicine. Copyright © 2018 Elsevier Ltd. All rights reserved.

  18. Computer Diagnostics.

    ERIC Educational Resources Information Center

    Tondow, Murray

    The report deals with the influence of computer technology on education, particularly guidance. The need for computers is a result of increasing complexity which is defined as: (1) an exponential increase of information; (2) an exponential increase in dissemination capabilities; and (3) an accelerating curve of change. Listed are five functions of…

  19. Toward real-time diffuse optical tomography: accelerating light propagation modeling employing parallel computing on GPU and CPU.

    PubMed

    Doulgerakis, Matthaios; Eggebrecht, Adam; Wojtkiewicz, Stanislaw; Culver, Joseph; Dehghani, Hamid

    2017-12-01

    Parameter recovery in diffuse optical tomography is a computationally expensive algorithm, especially when used for large and complex volumes, as in the case of human brain functional imaging. The modeling of light propagation, also known as the forward problem, is the computational bottleneck of the recovery algorithm, whereby the lack of a real-time solution is impeding practical and clinical applications. The objective of this work is the acceleration of the forward model, within a diffusion approximation-based finite-element modeling framework, employing parallelization to expedite the calculation of light propagation in realistic adult head models. The proposed methodology is applicable for modeling both continuous wave and frequency-domain systems with the results demonstrating a 10-fold speed increase when GPU architectures are available, while maintaining high accuracy. It is shown that, for a very high-resolution finite-element model of the adult human head with ∼600,000 nodes, consisting of heterogeneous layers, light propagation can be calculated at ∼0.25  s/excitation source. (2017) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE).

  20. The paradox of strategic environmental assessment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bidstrup, Morten, E-mail: bidstrup@plan.aau.dk; Hansen, Anne Merrild, E-mail: merrild@plan.aau.dk

    Strategic Environmental Assessment (SEA) is a tool that can facilitate sustainable development and improve decision-making by introducing environmental concern early in planning processes. However, various international studies conclude that current planning practice is not taking full advantage of the tool, and we therefore define the paradox of SEA as the methodological ambiguity of non-strategic SEA. This article explores causality through a three-step case study on aggregates extraction planning in Denmark, which consists of a document analysis, a questionnaire survey and follow-up communication with key planners. Though the environmental reports on one hand largely lack strategic considerations, practitioners express an inherent will for strategy and reveal that their SEAs in fact have been an integrated part of the planning process. Institutional context is found to be the most significant barrier for a strategy, and this suggests that non-strategic planning setups can prove more important than non-strategic planning in SEA practice. Planners may try to execute strategy within the confinements of SEA-restricted planning contexts; however, such efforts can be overlooked if evaluated by a narrow criterion for strategy formation. Consequently, the paradox may also spark from challenged documentation. These findings contribute to the common understanding of SEA quality; however, further research is needed on how to communicate and influence the strategic options which arguably remain inside non-strategic planning realities. - Highlights: • International studies conclude that SEAs are not strategic = the paradox of SEA. • Even on the highest managerial level, some contexts do not leave room for strategy. • Non-strategic SEA can derive from challenged documentation. • Descriptive and emergent strategy formation can, in practice, be deemed non-strategic.

  1. Successful strategic planning: creating clarity.

    PubMed

    Adams, Jim

    2005-01-01

    Most healthcare organizations have a strategic plan of some kind. Many of these organizations also have difficulty translating their strategic plan into specific actions that result in successful performance. In the worst cases, this can jeopardize the viability of the organization. The trouble lies in a lack of clarity in what a strategic plan is and what it should do for the organization. This article will answer key questions such as: What is strategy and how does it fit with other commonly used constructs such as mission, vision, and goals? What criteria can be used to determine if something is truly strategic to the organization? What are the phases of the strategy lifecycle? How do approaches for dealing with uncertainty, such as scenario planning, fit with organizational strategic planning? How can a meaningful IT strategy be developed if the organization strategy is lacking? What principles should guide a good IT planning process?

  2. Factors Influencing the Adoption of Cloud Computing by Decision Making Managers

    ERIC Educational Resources Information Center

    Ross, Virginia Watson

    2010-01-01

    Cloud computing is a growing field, addressing the market need for access to computing resources to meet organizational computing requirements. The purpose of this research is to evaluate the factors that influence an organization in their decision whether to adopt cloud computing as a part of their strategic information technology planning.…

  3. GPU-Accelerated Voxelwise Hepatic Perfusion Quantification

    PubMed Central

    Wang, H; Cao, Y

    2012-01-01

    Voxelwise quantification of hepatic perfusion parameters from dynamic contrast enhanced (DCE) imaging greatly contributes to assessment of liver function in response to radiation therapy. However, the efficiency of the estimation of hepatic perfusion parameters voxel-by-voxel in the whole liver using a dual-input single-compartment model requires substantial improvement for routine clinical applications. In this paper, we utilize the parallel computation power of a graphics processing unit (GPU) to accelerate the computation, while maintaining the same accuracy as the conventional method. Using CUDA-GPU, the hepatic perfusion computations over multiple voxels are run across the GPU blocks concurrently but independently. At each voxel, non-linear least squares fitting of the time series of the liver DCE data to the compartmental model is distributed to multiple threads in a block, and the computations of different time points are performed simultaneously and synchronically. An efficient fast Fourier transform in a block is also developed for the convolution computation in the model. The GPU computations of the voxel-by-voxel hepatic perfusion images are compared with ones by the CPU using the simulated DCE data and the experimental DCE MR images from patients. The computation speed is improved by 30 times using a NVIDIA Tesla C2050 GPU compared to a 2.67 GHz Intel Xeon CPU processor. To obtain liver perfusion maps with 626400 voxels in a patient’s liver, it takes 0.9 min with the GPU-accelerated voxelwise computation, compared to 110 min with the CPU, while both methods result in perfusion parameter differences less than 10^-6. The method will be useful for generating liver perfusion images in clinical settings. PMID:22892645
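
    A dual-input single-compartment model of the kind referred to here is commonly written as dC(t)/dt = k_a C_a(t) + k_p C_p(t) - k_2 C(t), with arterial and portal-venous input functions C_a and C_p, and the per-voxel fit is an independent nonlinear least-squares problem, which is why many voxels can be processed concurrently on a GPU. The sketch below fits a single voxel on the CPU with SciPy; the parameter names, input curves and noise level are illustrative, not the authors' implementation.

      import numpy as np
      from scipy.integrate import odeint
      from scipy.optimize import curve_fit

      t = np.linspace(0, 120, 60)                  # s
      Ca = (t / 10.0) * np.exp(-t / 30.0)          # toy arterial input
      Cp = (t / 20.0) * np.exp(-t / 60.0)          # toy portal-venous input

      def liver_curve(tt, ka, kp, k2):
          def dCdt(C, ti):
              return (ka * np.interp(ti, t, Ca)
                      + kp * np.interp(ti, t, Cp) - k2 * C)
          return odeint(dCdt, 0.0, tt).ravel()

      # Simulate one voxel, then recover its parameters.  Each voxel is an
      # independent fit, hence trivially parallel across the whole liver.
      true_params = (0.02, 0.15, 0.05)
      signal = liver_curve(t, *true_params) + np.random.normal(0, 1e-3, t.size)
      fitted, _ = curve_fit(liver_curve, t, signal, p0=(0.01, 0.1, 0.1))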

  4. Research for the Fluid Field of the Centrifugal Compressor Impeller in Accelerating Startup

    NASA Astrophysics Data System (ADS)

    Li, Xiaozhu; Chen, Gang; Zhu, Changyun; Qin, Guoliang

    2013-03-01

    In order to study the flow field in the impeller during the accelerating start-up process of a centrifugal compressor, the 3-D and 1-D transient accelerated-flow governing equations along a streamline in the impeller are derived in detail, an assumption for the pressure gradient distribution is presented, and a solution method for the 1-D transient accelerating flow field is given based on that assumption. The method is implemented in a computer program and the computed results are obtained. Comparison shows that the computed results agree with the test results, demonstrating the feasibility and effectiveness of the proposed method for solving the accelerating start-up problem of a centrifugal compressor.

  5. Improving Students' Self-Efficacy in Strategic Management: The Relative Impact of Cases and Simulations.

    ERIC Educational Resources Information Center

    Tompson, George H.; Dass, Parshotam

    2000-01-01

    Investigates the relative contribution of computer simulations and case studies for improving undergraduate students' self-efficacy in strategic management courses. Results of pre-and post-test data, regression analysis, and analysis of variance show that simulations result in significantly higher improvement in self-efficacy than case studies.…

  6. Combining Acceleration and Displacement Dependent Modal Frequency Responses Using an MSC/NASTRAN DMAP Alter

    NASA Technical Reports Server (NTRS)

    Barnett, Alan R.; Widrick, Timothy W.; Ludwiczak, Damian R.

    1996-01-01

    Solving for dynamic responses of free-free launch vehicle/spacecraft systems acted upon by buffeting winds is commonly performed throughout the aerospace industry. Due to the unpredictable nature of this wind loading event, these problems are typically solved using frequency response random analysis techniques. To generate dynamic responses for spacecraft with statically-indeterminate interfaces, spacecraft contractors prefer to develop models which have response transformation matrices developed for mode acceleration data recovery. This method transforms spacecraft boundary accelerations and displacements into internal responses. Unfortunately, standard MSC/NASTRAN modal frequency response solution sequences cannot be used to combine acceleration- and displacement-dependent responses required for spacecraft mode acceleration data recovery. External user-written computer codes can be used with MSC/NASTRAN output to perform such combinations, but these methods can be labor and computer resource intensive. Taking advantage of the analytical and computer resource efficiencies inherent within MSC/NASTRAN, a DMAP Alter has been developed to combine acceleration- and displacement-dependent modal frequency responses for performing spacecraft mode acceleration data recovery. The Alter has been used successfully to efficiently solve a common aerospace buffeting wind analysis.
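
    In mode-acceleration data recovery of the kind described here, an internal response is recovered from two transformation matrices applied to the interface motion at each frequency, schematically u_int(w) = [ATM] a_b(w) + [DTM] x_b(w), where a_b and x_b are the boundary accelerations and displacements. The DMAP Alter performs this combination inside MSC/NASTRAN; the short NumPy illustration below shows only the generic matrix algebra, with hypothetical matrix names.

      import numpy as np

      def recover_internal(ATM, DTM, accel_b, disp_b):
          """Combine acceleration- and displacement-dependent responses.

          ATM, DTM          : (n_internal x n_boundary) transformation matrices
          accel_b, disp_b   : (n_boundary x n_freq) complex boundary responses
          returns           : (n_internal x n_freq) internal responses
          """
          return ATM @ accel_b + DTM @ disp_b

      # toy example: 3 internal responses, 2 boundary DOFs, 4 frequencies
      rng = np.random.default_rng(0)
      ATM = rng.standard_normal((3, 2))
      DTM = rng.standard_normal((3, 2))
      freqs = np.array([1.0, 2.0, 3.0, 4.0])                    # Hz
      accel_b = rng.standard_normal((2, 4)) + 1j * rng.standard_normal((2, 4))
      disp_b = -accel_b / (2 * np.pi * freqs) ** 2              # x(w) = -a(w)/w^2
      u_int = recover_internal(ATM, DTM, accel_b, disp_b)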

  7. Strategic Decision Making Paradigms: A Primer for Senior Leaders

    DTIC Science & Technology

    2009-07-01

    decision making. STRATEGIC DECISION MAKING. Strategic Change: There are several strategic...influenced by stakeholders outside of the organization. The Ontology of Strategic Decision Making: Strategic decisions are non-routine and involve...Coates USAWC, July 2009. The Complexity of Strategic Decision Making: Strategic decisions entail “ill-structured,” “messy” or

  8. A Neural Mechanism of Strategic Social Choice under Sanction-Induced Norm Compliance

    PubMed

    Makwana, Aidan; Grön, Georg; Fehr, Ernst; Hare, Todd A

    2015-01-01

    In recent years, much has been learned about the representation of subjective value in simple, nonstrategic choices. However, a large fraction of our daily decisions are embedded in social interactions in which value guided decisions require balancing benefits for self against consequences imposed by others in response to our choices. Yet, despite their ubiquity, much less is known about how value computation takes place in strategic social contexts that include the possibility of retribution for norm violations. Here, we used functional magnetic resonance imaging (fMRI) to show that when human subjects face such a context connectivity increases between the temporoparietal junction (TPJ), implicated in the representation of other peoples' thoughts and intentions, and regions of ventromedial prefrontal cortex (vmPFC) that are associated with value computation. In contrast, we find no increase in connectivity between these regions in social nonstrategic cases where decision-makers are immune from retributive monetary punishments from a human partner. Moreover, there was also no increase in TPJ-vmPFC connectivity when the potential punishment was performed by a computer programmed to punish fairness norm violations in the same manner as a human would. Thus, TPJ-vmPFC connectivity is not simply a function of the social or norm enforcing nature of the decision, but rather occurs specifically in situations where subjects make decisions in a social context and strategically consider putative consequences imposed by others.

  9. Children's strategic theory of mind.

    PubMed

    Sher, Itai; Koenig, Melissa; Rustichini, Aldo

    2014-09-16

    Human strategic interaction requires reasoning about other people's behavior and mental states, combined with an understanding of their incentives. However, the ontogenic development of strategic reasoning is not well understood: At what age do we show a capacity for sophisticated play in social interactions? Several lines of inquiry suggest an important role for recursive thinking (RT) and theory of mind (ToM), but these capacities leave out the strategic element. We posit a strategic theory of mind (SToM) integrating ToM and RT with reasoning about incentives of all players. We investigated SToM in 3- to 9-y-old children and adults in two games that represent prevalent aspects of social interaction. Children anticipate deceptive and competitive moves from the other player and play both games in a strategically sophisticated manner by 7 y of age. One game has a pure strategy Nash equilibrium: In this game, children achieve equilibrium play by the age of 7 y on the first move. In the other game, with a single mixed-strategy equilibrium, children's behavior moved toward the equilibrium with experience. These two results also correspond to two ways in which children's behavior resembles adult behavior in the same games. In both games, children's behavior becomes more strategically sophisticated with age on the first move. Beyond the age of 7 y, children begin to think about strategic interaction not myopically, but in a farsighted way, possibly with a view to cooperating and capitalizing on mutual gains in long-run relationships.

  10. Collaborative Strategic Planning: Myth or Reality?

    ERIC Educational Resources Information Center

    Mbugua, Flora; Rarieya, Jane F. A.

    2014-01-01

    The concept and practice of strategic planning, while entrenched in educational institutions in the West, is just catching on in Kenya. While literature emphasizes the importance of collaborative strategic planning, it does not indicate the challenges presented by collaboratively engaging in strategic planning. This article reports on findings of…

  11. Accelerating Sequences in the Presence of Metal by Exploiting the Spatial Distribution of Off-Resonance

    PubMed Central

    Smith, Matthew R.; Artz, Nathan S.; Koch, Kevin M.; Samsonov, Alexey; Reeder, Scott B.

    2014-01-01

    Purpose To demonstrate feasibility of exploiting the spatial distribution of off-resonance surrounding metallic implants for accelerating multispectral imaging techniques. Theory Multispectral imaging (MSI) techniques perform time-consuming independent 3D acquisitions with varying RF frequency offsets to address the extreme off-resonance from metallic implants. Each off-resonance bin provides a unique spatial sensitivity that is analogous to the sensitivity of a receiver coil, and therefore provides a unique opportunity for acceleration. Methods Fully sampled MSI was performed to demonstrate retrospective acceleration. A uniform sampling pattern across off-resonance bins was compared to several adaptive sampling strategies using a total hip replacement phantom. Monte Carlo simulations were performed to compare noise propagation of two of these strategies. With a total knee replacement phantom, positive and negative off-resonance bins were strategically sampled with respect to the B0 field to minimize aliasing. Reconstructions were performed with a parallel imaging framework to demonstrate retrospective acceleration. Results An adaptive sampling scheme dramatically improved reconstruction quality, which was supported by the noise propagation analysis. Independent acceleration of negative and positive off-resonance bins demonstrated reduced overlapping of aliased signal to improve the reconstruction. Conclusion This work presents the feasibility of acceleration in the presence of metal by exploiting the spatial sensitivities of off-resonance bins. PMID:24431210

  12. Neural mechanisms mediating degrees of strategic uncertainty.

    PubMed

    Nagel, Rosemarie; Brovelli, Andrea; Heinemann, Frank; Coricelli, Giorgio

    2018-01-01

    In social interactions, strategic uncertainty arises when the outcome of one's choice depends on the choices of others. An important question is whether strategic uncertainty can be resolved by assessing subjective probabilities to the counterparts' behavior, as if playing against nature, and thus transforming the strategic interaction into a risky (individual) situation. By means of functional magnetic resonance imaging with human participants we tested the hypothesis that choices under strategic uncertainty are supported by the neural circuits mediating choices under individual risk and deliberation in social settings (i.e. strategic thinking). Participants were confronted with risky lotteries and two types of coordination games requiring different degrees of strategic thinking of the kind 'I think that you think that I think etc.' We found that the brain network mediating risk during lotteries (anterior insula, dorsomedial prefrontal cortex and parietal cortex) is also engaged in the processing of strategic uncertainty in games. In social settings, activity in this network is modulated by the level of strategic thinking that is reflected in the activity of the dorsomedial and dorsolateral prefrontal cortex. These results suggest that strategic uncertainty is resolved by the interplay between the neural circuits mediating risk and higher order beliefs (i.e. beliefs about others' beliefs). © The Author(s) (2017). Published by Oxford University Press.

  13. Neural mechanisms mediating degrees of strategic uncertainty

    PubMed Central

    Nagel, Rosemarie; Brovelli, Andrea; Heinemann, Frank

    2018-01-01

    Abstract In social interactions, strategic uncertainty arises when the outcome of one’s choice depends on the choices of others. An important question is whether strategic uncertainty can be resolved by assessing subjective probabilities to the counterparts’ behavior, as if playing against nature, and thus transforming the strategic interaction into a risky (individual) situation. By means of functional magnetic resonance imaging with human participants we tested the hypothesis that choices under strategic uncertainty are supported by the neural circuits mediating choices under individual risk and deliberation in social settings (i.e. strategic thinking). Participants were confronted with risky lotteries and two types of coordination games requiring different degrees of strategic thinking of the kind ‘I think that you think that I think etc.’ We found that the brain network mediating risk during lotteries (anterior insula, dorsomedial prefrontal cortex and parietal cortex) is also engaged in the processing of strategic uncertainty in games. In social settings, activity in this network is modulated by the level of strategic thinking that is reflected in the activity of the dorsomedial and dorsolateral prefrontal cortex. These results suggest that strategic uncertainty is resolved by the interplay between the neural circuits mediating risk and higher order beliefs (i.e. beliefs about others’ beliefs). PMID:29228378

  14. Manage "Human Capital" Strategically

    ERIC Educational Resources Information Center

    Odden, Allan

    2011-01-01

    To strategically manage human capital in education means restructuring the entire human resource system so that schools not only recruit and retain smart and capable individuals, but also manage them in ways that support the strategic directions of the organization. These management practices must be aligned with a district's education improvement…

  15. NOTE: Acceleration of Monte Carlo-based scatter compensation for cardiac SPECT

    NASA Astrophysics Data System (ADS)

    Sohlberg, A.; Watabe, H.; Iida, H.

    2008-07-01

    Single photon emission computed tomography (SPECT) images are degraded by photon scatter, making scatter compensation essential for accurate reconstruction. Reconstruction-based scatter compensation with Monte Carlo (MC) modelling of scatter shows promise for accurate scatter correction, but it is normally hampered by long computation times. The aim of this work was to accelerate the MC-based scatter compensation using coarse grid and intermittent scatter modelling. The acceleration methods were compared to an un-accelerated implementation using MC-simulated projection data of the mathematical cardiac torso (MCAT) phantom modelling 99mTc uptake and clinical myocardial perfusion studies. The results showed that, when combined, the acceleration methods reduced the reconstruction time for 10 ordered subset expectation maximization (OS-EM) iterations from 56 to 11 min without a significant reduction in image quality, indicating that the coarse grid and intermittent scatter modelling are suitable for MC-based scatter compensation in cardiac SPECT.
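
    The two acceleration devices named in the abstract, a coarse grid and intermittent scatter modelling, both amount to evaluating the expensive Monte Carlo scatter estimate less often and at lower resolution inside the iterative loop. The skeleton below shows where such updates would sit in a generic OS-EM reconstruction; the projector, back-projector and scatter routines are toy placeholders passed in as arguments, not the authors' implementation.

      import numpy as np

      def osem_mc_scatter(proj, forward, back, mc_scatter,
                          n_iter=9, n_sub=3, scatter_every=3):
          """Generic OS-EM with coarse/intermittent MC scatter updates.

          The Monte Carlo scatter estimate (mc_scatter) is refreshed only every
          `scatter_every` iterations; between refreshes the previous (e.g.
          coarse-grid, upsampled) estimate is reused.
          """
          recon = np.ones_like(proj)
          scatter = np.zeros_like(proj)
          for it in range(n_iter):
              if it % scatter_every == 0:
                  scatter = mc_scatter(recon)            # costly, done rarely
              for s in range(n_sub):
                  mask = np.zeros_like(proj)
                  mask[s::n_sub] = 1.0                   # projection subset
                  expected = forward(recon) + scatter
                  ratio = np.where(mask > 0, proj / np.maximum(expected, 1e-8), 0.0)
                  sens = back(mask)
                  update = np.where(sens > 0, back(ratio) / np.maximum(sens, 1e-8), 1.0)
                  recon *= update
          return recon

      # Toy 1-D demo: identity "projector" and a flat 10% scatter background.
      truth = np.array([0.0, 1.0, 3.0, 2.0, 0.5, 0.0])
      proj = truth + 0.1 * truth.mean()
      recon = osem_mc_scatter(proj, forward=lambda x: x, back=lambda y: y,
                              mc_scatter=lambda x: np.full_like(x, 0.1 * x.mean()))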

  16. Acceleration of saddle-point searches with machine learning.

    PubMed

    Peterson, Andrew A

    2016-08-21

    In atomistic simulations, the location of the saddle point on the potential-energy surface (PES) gives important information on transitions between local minima, for example, via transition-state theory. However, the search for saddle points often involves hundreds or thousands of ab initio force calls, which are typically all done at full accuracy. This results in the vast majority of the computational effort being spent calculating the electronic structure of states not important to the researcher, and very little time performing the calculation of the saddle point state itself. In this work, we describe how machine learning (ML) can reduce the number of intermediate ab initio calculations needed to locate saddle points. Since machine-learning models can learn from, and thus mimic, atomistic simulations, the saddle-point search can be conducted rapidly in the machine-learning representation. The saddle-point prediction can then be verified by an ab initio calculation; if it is incorrect, this strategically has identified regions of the PES where the machine-learning representation has insufficient training data. When these training data are used to improve the machine-learning model, the estimates greatly improve. This approach can be systematized, and in two simple example problems we demonstrate a dramatic reduction in the number of ab initio force calls. We expect that this approach and future refinements will greatly accelerate searches for saddle points, as well as other searches on the potential energy surface, as machine-learning methods see greater adoption by the atomistics community.
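
    The workflow described here, searching on a cheap learned surrogate and verifying candidate saddle points with single ab initio calls that are then fed back into the training set, is an active-learning loop. The schematic sketch below reduces the "saddle search" to a 1-D barrier-top search so that it stays self-contained; true_energy stands in for the expensive ab initio call and the polynomial surrogate stands in for the machine-learning model, neither of which matches the actual method (which operates on full potential-energy surfaces).

      import numpy as np

      def true_energy(x):                 # stand-in for an expensive ab initio call
          return np.sin(3 * x) + 0.5 * x ** 2

      def fit_surrogate(xs, es):          # cheap ML stand-in: polynomial fit
          return np.poly1d(np.polyfit(xs, es, deg=min(6, len(xs) - 1)))

      # Active-learning loop: locate the barrier top on [0, 1.5] between two minima.
      grid = np.linspace(0.0, 1.5, 400)
      train_x = list(np.linspace(0.0, 1.5, 4))      # a few initial expensive points
      train_e = [true_energy(x) for x in train_x]
      for _ in range(20):
          model = fit_surrogate(np.array(train_x), np.array(train_e))
          x_top = grid[np.argmax(model(grid))]      # "saddle" search on the surrogate
          e_true = true_energy(x_top)               # one expensive verification
          if abs(model(x_top) - e_true) < 1e-3:     # surrogate agrees: stop
              break
          train_x.append(x_top)                     # otherwise learn from the failure
          train_e.append(e_true)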

  17. galario: Gpu Accelerated Library for Analyzing Radio Interferometer Observations

    NASA Astrophysics Data System (ADS)

    Tazzari, Marco; Beaujean, Frederik; Testi, Leonardo

    2017-10-01

    The galario library exploits the computing power of modern graphic cards (GPUs) to accelerate the comparison of model predictions to radio interferometer observations. It speeds up the computation of the synthetic visibilities given a model image (or an axisymmetric brightness profile) and their comparison to the observations.

  18. Strategic Leadership in Schools

    ERIC Educational Resources Information Center

    Williams, Henry S.; Johnson, Teryl L.

    2013-01-01

    Strategic leadership is built upon traits and actions that encompass the successful execution of all leadership styles. In a world that is rapidly changing, strategic leadership in schools guides school leaders through assuring a constant improvement process by anticipating future trends and planning for them, noting that plans must be flexible to…

  19. GPU accelerated dynamic functional connectivity analysis for functional MRI data.

    PubMed

    Akgün, Devrim; Sakoğlu, Ünal; Esquivel, Johnny; Adinoff, Bryon; Mete, Mutlu

    2015-07-01

    Recent advances in multi-core processors and graphics card based computational technologies have paved the way for an improved and dynamic utilization of parallel computing techniques. Numerous applications have been implemented for the acceleration of computationally-intensive problems in various computational science fields including bioinformatics, in which big data problems are prevalent. In neuroimaging, dynamic functional connectivity (DFC) analysis is a computationally demanding method used to investigate dynamic functional interactions among different brain regions or networks identified with functional magnetic resonance imaging (fMRI) data. In this study, we implemented and analyzed a parallel DFC algorithm based on thread-based and block-based approaches. The thread-based approach was designed to parallelize DFC computations and was implemented in both Open Multi-Processing (OpenMP) and Compute Unified Device Architecture (CUDA) programming platforms. Another approach developed in this study to better utilize CUDA architecture is the block-based approach, where parallelization involves smaller parts of fMRI time-courses obtained by sliding-windows. Experimental results showed that the proposed parallel design solutions enabled by the GPUs significantly reduce the computation time for DFC analysis. Multicore implementation using OpenMP on 8-core processor provides up to 7.7× speed-up. GPU implementation using CUDA yielded substantial accelerations ranging from 18.5× to 157× speed-up once thread-based and block-based approaches were combined in the analysis. Proposed parallel programming solutions showed that multi-core processor and CUDA-supported GPU implementations accelerated the DFC analyses significantly. Developed algorithms make the DFC analyses more practical for multi-subject studies with more dynamic analyses. Copyright © 2015 Elsevier Ltd. All rights reserved.
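
    At its core, sliding-window DFC computes a correlation matrix inside each window position, and the window positions are mutually independent; that independence is the dimension OpenMP threads and CUDA thread blocks exploit in the accelerated implementations. A plain serial NumPy sketch is given below; the window length and step are illustrative, not the values used in the study.

      import numpy as np

      def sliding_window_dfc(ts, win=30, step=1):
          """Dynamic functional connectivity for one subject.

          ts      : (n_timepoints x n_regions) fMRI time courses
          returns : (n_windows x n_regions x n_regions) correlation matrices.
          Each window is independent, so this loop is what gets distributed
          over CPU threads or GPU blocks in an accelerated implementation.
          """
          n_t, _ = ts.shape
          starts = range(0, n_t - win + 1, step)
          return np.stack([np.corrcoef(ts[s:s + win].T) for s in starts])

      # toy data: 200 time points, 5 regions
      dfc = sliding_window_dfc(np.random.default_rng(1).standard_normal((200, 5)))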

  20. Sources and Information: Strategic Management.

    ERIC Educational Resources Information Center

    Palmer, Jim

    1983-01-01

    Provides an annotated bibliography of ERIC documents on strategic management, with emphasis on institutional responses to change, the role of the administrator in strategic management, budgeting and financial management, and institutional planning. (DMM)

  1. Characteristics of Useful and Practical Organizational Strategic Plans

    ERIC Educational Resources Information Center

    Kaufman, Roger

    2014-01-01

    Most organizational strategic plans are not strategic but rather tactical or operational plans masquerading as "strategic." This article identifies the basic elements required in a useful and practical strategic plan and explains why they are important.

  2. 12 CFR 228.27 - Strategic plan.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 12 Banks and Banking 3 2010-01-01 2010-01-01 false Strategic plan. 228.27 Section 228.27 Banks and... REINVESTMENT (REGULATION BB) Standards for Assessing Performance § 228.27 Strategic plan. (a) Alternative...(s) under a strategic plan if: (1) The bank has submitted the plan to the Board as provided for in...

  3. 23 CFR 1335.6 - Strategic plan.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 23 Highways 1 2010-04-01 2010-04-01 false Strategic plan. 1335.6 Section 1335.6 Highways NATIONAL... § 1335.6 Strategic plan. A strategic plan shall— (a) Be a multi-year plan that identifies and prioritizes... performance-based measures by which progress toward those goals will be determined; and (c) Be submitted to...

  4. 13 CFR 313.6 - Strategic Plans.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 13 Business Credit and Assistance 1 2010-01-01 2010-01-01 false Strategic Plans. 313.6 Section 313... § 313.6 Strategic Plans. (a) General. An Impacted Community that intends to apply for a grant for implementation assistance under § 313.7 shall develop and submit a Strategic Plan to EDA for evaluation and...

  5. 13 CFR 313.6 - Strategic Plans.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 13 Business Credit and Assistance 1 2011-01-01 2011-01-01 false Strategic Plans. 313.6 Section 313... § 313.6 Strategic Plans. (a) General. An Impacted Community that intends to apply for a grant for implementation assistance under § 313.7 shall develop and submit a Strategic Plan to EDA for evaluation and...

  6. Strategic planning: today's hot buttons.

    PubMed

    Bohlmann, R C

    1998-01-01

    The first generation of mergers and managed care hasn't slowed down group practices' need for strategic planning. Even groups that already went through one merger are asking about new mergers or ownership possibilities, the future of managed care, performance standards and physician unhappiness. Strategic planning, including consideration of bench-marking, production of ancillary services and physician involvement, can help. Even if only a short, general look at the future, strategic planning shows the proactive leadership needed in today's environment.

  7. Acceleration of discrete stochastic biochemical simulation using GPGPU.

    PubMed

    Sumiyoshi, Kei; Hirata, Kazuki; Hiroi, Noriko; Funahashi, Akira

    2015-01-01

    For systems made up of a small number of molecules, such as a biochemical network in a single cell, a simulation requires a stochastic approach, instead of a deterministic approach. The stochastic simulation algorithm (SSA) simulates the stochastic behavior of a spatially homogeneous system. Since stochastic approaches produce different results each time they are used, multiple runs are required in order to obtain statistical results; this results in a large computational cost. We have implemented a parallel method for using SSA to simulate a stochastic model; the method uses a graphics processing unit (GPU), which enables multiple realizations at the same time, and thus reduces the computational time and cost. During the simulation, for the purpose of analysis, each time course is recorded at each time step. A straightforward implementation of this method on a GPU is about 16 times faster than a sequential simulation on a CPU with hybrid parallelization; each of the multiple simulations is run simultaneously, and the computational tasks within each simulation are parallelized. We also implemented an improvement to the memory access and reduced the memory footprint, in order to optimize the computations on the GPU. We also implemented an asynchronous data transfer scheme to accelerate the time course recording function. To analyze the acceleration of our implementation on various sizes of model, we performed SSA simulations on different model sizes and compared these computation times to those for sequential simulations with a CPU. When used with the improved time course recording function, our method was shown to accelerate the SSA simulation by a factor of up to 130.
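    For orientation, the serial work that each GPU realization performs is essentially the classic Gillespie direct method; the sketch below is a generic textbook version in Python, not the authors' implementation, with the stoichiometry matrix and propensity function supplied by the caller.

    ```python
    import numpy as np

    def gillespie_ssa(x0, stoich, propensity, t_end, rng=None):
        """One realization of the stochastic simulation algorithm (direct method).
        A GPU implementation runs many such realizations in parallel."""
        rng = rng or np.random.default_rng()
        t, x = 0.0, np.array(x0, dtype=float)
        times, states = [t], [x.copy()]
        while t < t_end:
            a = propensity(x)                    # reaction propensities for current state
            a0 = a.sum()
            if a0 <= 0:
                break                            # no reaction can fire
            t += rng.exponential(1.0 / a0)       # waiting time to the next reaction
            j = rng.choice(len(a), p=a / a0)     # which reaction fires
            x += stoich[j]                       # apply its stoichiometry
            times.append(t)
            states.append(x.copy())              # record the time course at each step
        return np.array(times), np.array(states)
    ```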

  8. Acceleration of discrete stochastic biochemical simulation using GPGPU

    PubMed Central

    Sumiyoshi, Kei; Hirata, Kazuki; Hiroi, Noriko; Funahashi, Akira

    2015-01-01

    For systems made up of a small number of molecules, such as a biochemical network in a single cell, a simulation requires a stochastic approach, instead of a deterministic approach. The stochastic simulation algorithm (SSA) simulates the stochastic behavior of a spatially homogeneous system. Since stochastic approaches produce different results each time they are used, multiple runs are required in order to obtain statistical results; this results in a large computational cost. We have implemented a parallel method for using SSA to simulate a stochastic model; the method uses a graphics processing unit (GPU), which enables multiple realizations at the same time, and thus reduces the computational time and cost. During the simulation, for the purpose of analysis, each time course is recorded at each time step. A straightforward implementation of this method on a GPU is about 16 times faster than a sequential simulation on a CPU with hybrid parallelization; each of the multiple simulations is run simultaneously, and the computational tasks within each simulation are parallelized. We also implemented an improvement to the memory access and reduced the memory footprint, in order to optimize the computations on the GPU. We also implemented an asynchronous data transfer scheme to accelerate the time course recording function. To analyze the acceleration of our implementation on various sizes of model, we performed SSA simulations on different model sizes and compared these computation times to those for sequential simulations with a CPU. When used with the improved time course recording function, our method was shown to accelerate the SSA simulation by a factor of up to 130. PMID:25762936

  9. Strategic Planning for Interdisciplinary Science: a Geoscience Success Story

    NASA Astrophysics Data System (ADS)

    Harshvardhan, D.; Harbor, J. M.

    2003-12-01

    The Department of Earth and Atmospheric Sciences at Purdue University has engaged in a continuous strategic planning exercise for several years, including annual retreats since 1997 as an integral part of the process. The daylong Saturday retreat at the beginning of the fall semester has been used to flesh out the faculty hiring plan for the coming year based on the prior years' plans. The finalized strategic plan is built around the choice of three signature areas, two in disciplinary fields, (i) geodynamics and active tectonics, (ii) multi-scale atmospheric interactions and one interdisciplinary area, (iii) atmosphere/surface interactions. Our experience with strategic planning and the inherently interdisciplinary nature of geoscience helped us recently when our School of Science, which consists of seven departments, announced a competition for 60 new faculty positions that would be assigned based on the following criteria, listed in order of priority - (i) scientific merit and potential for societal impact, (ii) multidisciplinary nature of topic - level of participation and leveraging potential, (iii) alignment with Purdue's strategic plan - discovery, learning, engagement, (iv) existence of critical mass at Purdue and availability of faculty and student candidate pools, (v) corporate and federal sponsor interest. Some fifty white papers promoting diverse fields were submitted to the school and seven were chosen after a school-wide retreat. The department fared exceedingly well and we now have significant representation on three of the seven school areas of coalescence - (i) climate change, (ii) computational science and (iii) science education research. We are now in the process of drawing up hiring plans and developing strategies for allocation and reallocation of resources such as laboratory space and faculty startup to accommodate the 20% growth in faculty strength that is expected over the next five years.

  10. 12 CFR 25.27 - Strategic plan.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 12 Banks and Banking 1 2010-01-01 2010-01-01 false Strategic plan. 25.27 Section 25.27 Banks and... DEPOSIT PRODUCTION REGULATIONS Regulations Standards for Assessing Performance § 25.27 Strategic plan. (a... assessment area(s) under a strategic plan if: (1) The bank has submitted the plan to the OCC as provided for...

  11. 12 CFR 345.27 - Strategic plan.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 12 Banks and Banking 4 2010-01-01 2010-01-01 false Strategic plan. 345.27 Section 345.27 Banks and... REINVESTMENT Standards for Assessing Performance § 345.27 Strategic plan. (a) Alternative election. The FDIC... strategic plan if: (1) The bank has submitted the plan to the FDIC as provided for in this section; (2) The...

  12. Atwood's Machine: Experiments in an Accelerating Frame.

    ERIC Educational Resources Information Center

    Chee, Chia Teck; Hong, Chia Yee

    1999-01-01

    Experiments in an accelerating frame are hard to perform. Illustrates how simple computer software allows sufficiently rapid and accurate measurements to be made on an arrangement of weights and pulleys known as Atwood's machine. (Author/CCM)

  13. 160-fold acceleration of the Smith-Waterman algorithm using a field programmable gate array (FPGA)

    PubMed Central

    Li, Isaac TS; Shum, Warren; Truong, Kevin

    2007-01-01

    Background To infer homology and subsequently gene function, the Smith-Waterman (SW) algorithm is used to find the optimal local alignment between two sequences. When searching sequence databases that may contain hundreds of millions of sequences, this algorithm becomes computationally expensive. Results In this paper, we focused on accelerating the Smith-Waterman algorithm by using FPGA-based hardware that implemented a module for computing the score of a single cell of the SW matrix. Then using a grid of this module, the entire SW matrix was computed at the speed of field propagation through the FPGA circuit. These modifications dramatically accelerated the algorithm's computation time by up to 160-fold compared to a pure software implementation running on the same FPGA with an Altera Nios II soft processor. Conclusion This design of FPGA-accelerated hardware offers a promising new direction for improving the computational performance of genomic database searching. PMID:17555593
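    The per-cell recurrence that each hardware module evaluates is simple enough to show directly. Below is a hedged software sketch, using a linear gap penalty and illustrative scoring values rather than the paper's exact parameterization, of a single cell and of the plain matrix fill that the FPGA grid replaces.

    ```python
    def sw_cell(h_diag, h_up, h_left, a_char, b_char, match=2, mismatch=-1, gap=-2):
        """Score of one Smith-Waterman matrix cell (the work of one FPGA module)."""
        sub = match if a_char == b_char else mismatch
        return max(0, h_diag + sub, h_up + gap, h_left + gap)

    def smith_waterman_score(a, b, match=2, mismatch=-1, gap=-2):
        """Plain software fill of the SW matrix -- the baseline the FPGA accelerates."""
        H = [[0] * (len(b) + 1) for _ in range(len(a) + 1)]
        best = 0
        for i in range(1, len(a) + 1):
            for j in range(1, len(b) + 1):
                H[i][j] = sw_cell(H[i - 1][j - 1], H[i - 1][j], H[i][j - 1],
                                  a[i - 1], b[j - 1], match, mismatch, gap)
                best = max(best, H[i][j])
        return best  # optimal local alignment score
    ```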

  14. 160-fold acceleration of the Smith-Waterman algorithm using a field programmable gate array (FPGA).

    PubMed

    Li, Isaac T S; Shum, Warren; Truong, Kevin

    2007-06-07

    To infer homology and subsequently gene function, the Smith-Waterman (SW) algorithm is used to find the optimal local alignment between two sequences. When searching sequence databases that may contain hundreds of millions of sequences, this algorithm becomes computationally expensive. In this paper, we focused on accelerating the Smith-Waterman algorithm by using FPGA-based hardware that implemented a module for computing the score of a single cell of the SW matrix. Then using a grid of this module, the entire SW matrix was computed at the speed of field propagation through the FPGA circuit. These modifications dramatically accelerated the algorithm's computation time by up to 160-fold compared to a pure software implementation running on the same FPGA with an Altera Nios II soft processor. This design of FPGA-accelerated hardware offers a promising new direction for improving the computational performance of genomic database searching.

  15. Creating Strategic Visions

    DTIC Science & Technology

    1990-10-15

    FOREWORD: This futures study presents an analysis and discussion of a program used at the U.S...Operations Research Society, and The Planning Forum. CREATING STRATEGIC VISIONS. Introduction. The United States Army War College (USAWC) prepares its...consideration and time must be given to a program that attempts to help these potential leaders learn how to create strategic visions. In this paper

  16. One Teacher's Role in Promoting Understanding in Mental Computation

    ERIC Educational Resources Information Center

    Heirdsfield, Ann

    2005-01-01

    This paper reports the teacher actions that promoted the development of students' mental computation. A Year 3 teacher engaged her class in developing mental computation strategies over a ten-week period. Two overarching issues that appeared to support learning were establishing connections and encouraging strategic thinking. (Contains 2 figures.)…

  17. NASA strategic plan

    NASA Technical Reports Server (NTRS)

    1994-01-01

    The NASA Strategic Plan is a living document. It provides far-reaching goals and objectives to create stability for NASA's efforts. The Plan presents NASA's top-level strategy: it articulates what NASA does and for whom; it differentiates between ends and means; it states where NASA is going and what NASA intends to do to get there. This Plan is not a budget document, nor does it present priorities for current or future programs. Rather, it establishes a framework for shaping NASA's activities and developing a balanced set of priorities across the Agency. Such priorities will then be reflected in the NASA budget. The document includes vision, mission, and goals; external environment; conceptual framework; strategic enterprises (Mission to Planet Earth, aeronautics, human exploration and development of space, scientific research, space technology, and synergy); strategic functions (transportation to space, space communications, human resources, and physical resources); values and operating principles; implementing strategy; and senior management team concurrence.

  18. Crew and Thermal Systems Strategic Communications Initiatives in Support of NASA's Strategic Goals

    NASA Technical Reports Server (NTRS)

    Paul, Heather L.; Lamberth, Erika Guillory; Jennings, Mallory A.

    2012-01-01

    NASA has defined strategic goals to invest in next-generation technologies and innovations, inspire students to become the future leaders of space exploration, and expand partnerships with industry and academia around the world. The Crew and Thermal Systems Division (CTSD) at the NASA Johnson Space Center actively supports these NASA initiatives. In July 2011, CTSD created a strategic communications team to communicate CTSD capabilities, technologies, and personnel to external technical audiences for business development and collaborative initiatives, and to students, educators, and the general public for education and public outreach efforts. This paper summarizes the CTSD Strategic Communications efforts and metrics through the first half of fiscal year 2012 with projections for end of fiscal year data.

  19. Crew and Thermal Systems Strategic Communications Initiatives in Support of NASA's Strategic Goals

    NASA Technical Reports Server (NTRS)

    Paul, Heather L.

    2012-01-01

    NASA has defined strategic goals to invest in next-generation technologies and innovations, to inspire students to become the future leaders of space exploration, and to expand partnerships with industry and academia around the world. The Crew and Thermal Systems Division (CTSD) at the NASA Johnson Space Center actively supports these NASA initiatives. In July 2011, CTSD created a strategic communications team to communicate CTSD capabilities, technologies, and personnel to internal NASA and external technical audiences for business development and collaborative initiatives, and to students, educators, and the general public for education and public outreach efforts. This paper summarizes the CTSD Strategic Communications efforts and metrics through the first nine months of fiscal year 2012.

  20. A Research Program in Computer Technology. 1986 Annual Technical Report

    DTIC Science & Technology

    1989-08-01

    Annual Technical Report, 1 July 1985 - June 1986: A Research Program in Computer Technology, ISI/SR-87-178, USC Information Sciences...Program in Computer Technology (Unclassified). Personal author(s): ISI Research Staff. Keywords: survivable networks; distributed processing, local networks, personal computers, workstation environment; computer acquisition, Strategic Computing

  1. Minimum time acceleration of aircraft turbofan engines by using an algorithm based on nonlinear programming

    NASA Technical Reports Server (NTRS)

    Teren, F.

    1977-01-01

    Minimum time accelerations of aircraft turbofan engines are presented. The calculation of these accelerations was made by using a piecewise linear engine model, and an algorithm based on nonlinear programming. Use of this model and algorithm allows such trajectories to be readily calculated on a digital computer with a minimal expenditure of computer time.

  2. Using electronic patient records to inform strategic decision making in primary care.

    PubMed

    Mitchell, Elizabeth; Sullivan, Frank; Watt, Graham; Grimshaw, Jeremy M; Donnan, Peter T

    2004-01-01

    Although absolute risk of death associated with raised blood pressure increases with age, the benefits of treatment are greater in elderly patients. Despite this, the 'rule of halves' particularly applies to this group. We conducted a randomised controlled trial to evaluate different levels of feedback designed to improve identification, treatment and control of elderly hypertensives. Fifty-two general practices were randomly allocated to either: Control (n=19), Audit only feedback (n=16) or Audit plus Strategic feedback, prioritising patients by absolute risk (n=17). Feedback was based on electronic data, annually extracted from practice computer systems. Data were collected for 265,572 patients, 30,345 aged 65-79. The proportion of known hypertensives in each group with BP recorded increased over the study period and the numbers of untreated and uncontrolled patients reduced. There was a significant difference in mean systolic pressure between the Audit plus Strategic and Audit only groups and significantly greater control in the Audit plus Strategic group. Providing patient-specific practice feedback can impact on identification and management of hypertension in the elderly and produce a significant increase in control.

  3. What does God know? Supernatural agents' access to socially strategic and non-strategic information.

    PubMed

    Purzycki, Benjamin G; Finkel, Daniel N; Shaver, John; Wales, Nathan; Cohen, Adam B; Sosis, Richard

    2012-07-01

    Current evolutionary and cognitive theories of religion posit that supernatural agent concepts emerge from cognitive systems such as theory of mind and social cognition. Some argue that these concepts evolved to maintain social order by minimizing antisocial behavior. If these theories are correct, then people should process information about supernatural agents' socially strategic knowledge more quickly than non-strategic knowledge. Furthermore, agents' knowledge of immoral and uncooperative social behaviors should be especially accessible to people. To examine these hypotheses, we measured response-times to questions about the knowledge attributed to four different agents--God, Santa Claus, a fictional surveillance government, and omniscient but non-interfering aliens--that vary in their omniscience, moral concern, ability to punish, and how supernatural they are. As anticipated, participants respond more quickly to questions about agents' socially strategic knowledge than non-strategic knowledge, but only when agents are able to punish. Copyright © 2012 Cognitive Science Society, Inc.

  4. A novel electron accelerator for MRI-Linac radiotherapy.

    PubMed

    Whelan, Brendan; Gierman, Stephen; Holloway, Lois; Schmerge, John; Keall, Paul; Fahrig, Rebecca

    2016-03-01

    MRI guided radiotherapy is a rapidly growing field; however, current electron accelerators are not designed to operate in the magnetic fringe fields of MRI scanners. As such, current MRI-Linac systems require magnetic shielding, which can degrade MR image quality and limit system flexibility. The purpose of this work was to develop and test a novel medical electron accelerator concept which is inherently robust to operation within magnetic fields for in-line MRI-Linac systems. Computational simulations were utilized to model the accelerator, including the thermionic emission process, the electromagnetic fields within the accelerating structure, and resulting particle trajectories through these fields. The spatial and energy characteristics of the electron beam were quantified at the accelerator target and compared to published data for conventional accelerators. The model was then coupled to the fields from a simulated 1 T superconducting magnet and solved for cathode to isocenter distances between 1.0 and 2.4 m; the impact on the electron beam was quantified. For the zero field solution, the average current at the target was 146.3 mA, with a median energy of 5.8 MeV (interquartile spread of 0.1 MeV), and a spot size diameter of 1.5 mm full-width-tenth-maximum. Such an electron beam is suitable for therapy, comparing favorably to published data for conventional systems. The simulated accelerator showed increased robustness to operation in in-line magnetic fields, with a maximum current loss of 3% compared to 85% for a conventional system in the same magnetic fields. Computational simulations suggest that replacing conventional DC electron sources with a RF based source could be used to develop medical electron accelerators which are robust to operation in in-line magnetic fields. This would enable the development of MRI-Linac systems with no magnetic shielding around the Linac and reduce the requirements for optimization of magnetic fringe field, simplify design of

  5. A novel electron accelerator for MRI-Linac radiotherapy

    PubMed Central

    Whelan, Brendan; Gierman, Stephen; Holloway, Lois; Schmerge, John; Keall, Paul; Fahrig, Rebecca

    2016-01-01

    Purpose: MRI guided radiotherapy is a rapidly growing field; however, current electron accelerators are not designed to operate in the magnetic fringe fields of MRI scanners. As such, current MRI-Linac systems require magnetic shielding, which can degrade MR image quality and limit system flexibility. The purpose of this work was to develop and test a novel medical electron accelerator concept which is inherently robust to operation within magnetic fields for in-line MRI-Linac systems. Methods: Computational simulations were utilized to model the accelerator, including the thermionic emission process, the electromagnetic fields within the accelerating structure, and resulting particle trajectories through these fields. The spatial and energy characteristics of the electron beam were quantified at the accelerator target and compared to published data for conventional accelerators. The model was then coupled to the fields from a simulated 1 T superconducting magnet and solved for cathode to isocenter distances between 1.0 and 2.4 m; the impact on the electron beam was quantified. Results: For the zero field solution, the average current at the target was 146.3 mA, with a median energy of 5.8 MeV (interquartile spread of 0.1 MeV), and a spot size diameter of 1.5 mm full-width-tenth-maximum. Such an electron beam is suitable for therapy, comparing favorably to published data for conventional systems. The simulated accelerator showed increased robustness to operation in in-line magnetic fields, with a maximum current loss of 3% compared to 85% for a conventional system in the same magnetic fields. Conclusions: Computational simulations suggest that replacing conventional DC electron sources with a RF based source could be used to develop medical electron accelerators which are robust to operation in in-line magnetic fields. This would enable the development of MRI-Linac systems with no magnetic shielding around the Linac and reduce the requirements for optimization of

  6. GPU accelerated manifold correction method for spinning compact binaries

    NASA Astrophysics Data System (ADS)

    Ran, Chong-xi; Liu, Song; Zhong, Shuang-ying

    2018-04-01

    The graphics processing unit (GPU) acceleration of the manifold correction algorithm based on the compute unified device architecture (CUDA) technology is designed to simulate the dynamic evolution of the Post-Newtonian (PN) Hamiltonian formulation of spinning compact binaries. The feasibility and the efficiency of parallel computation on the GPU have been confirmed by various numerical experiments. The numerical comparisons show that the accuracy of the manifold correction method executed on the GPU agrees well with that of the same code executed purely on the central processing unit (CPU). The acceleration achieved when the codes are implemented on the GPU can increase enormously through the use of shared memory and register optimization techniques without additional hardware costs; the speedup is nearly 13 times that of the CPU code for a phase-space scan (including 314 × 314 orbits). In addition, the GPU-accelerated manifold correction method is used to numerically study how the dynamics are affected by the spin-induced quadrupole-monopole interaction for a black hole binary system.

  7. Cloud computing approaches to accelerate drug discovery value chain.

    PubMed

    Garg, Vibhav; Arora, Suchir; Gupta, Chitra

    2011-12-01

    Continued advancements in the area of technology have helped high throughput screening (HTS) evolve from a linear to a parallel approach by performing system level screening. Advanced experimental methods used for HTS at various steps of drug discovery (i.e. target identification, target validation, lead identification and lead validation) can generate data of the order of terabytes. As a consequence, there is a pressing need to store, manage, mine and analyze this data to identify informational tags. This need is again posing challenges to computer scientists to offer the matching hardware and software infrastructure, while managing the varying degree of desired computational power. Therefore, the potential of "On-Demand Hardware" and "Software as a Service (SAAS)" delivery mechanisms cannot be denied. This on-demand computing, largely referred to as Cloud Computing, is now transforming drug discovery research. Also, integration of Cloud computing with parallel computing is certainly expanding its footprint in the life sciences community. The speed, efficiency and cost effectiveness have made cloud computing a 'good-to-have' tool for researchers, providing them significant flexibility, allowing them to focus on the 'what' of science and not the 'how'. Once it reaches maturity, Discovery-Cloud would fit best to manage drug discovery and clinical development data, generated using advanced HTS techniques, hence supporting the vision of personalized medicine.

  8. 2015 Enterprise Strategic Vision

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    2015-08-01

    This document aligns with the Department of Energy Strategic Plan for 2014-2018 and provides a framework for integrating our missions and direction for pursuing DOE’s strategic goals. The vision is a guide to advancing world-class science and engineering, supporting our people, modernizing our infrastructure, and developing a management culture that operates a safe and secure enterprise in an efficient manner.

  9. Enhancing the Strategic Capability of the Army: An Investigation of Strategic Thinking Tasks, Skills, and Development

    DTIC Science & Technology

    2016-02-01

    continually develop their ability to think strategically, they gain the power to explore all options and help “write the rules of the game,” rather than...barriers to streamline communication; convey the position of multiple distinct agencies in writing through strategic use of language to the President

  10. Strategic planning: getting from here to there.

    PubMed

    Kaleba, Richard

    2006-11-01

    Hospitals should develop a strategic plan that defines specific actions in a realistic time frame. Hospitals can follow a five-phase process to develop a strategic plan. The strategic planning process requires a project leader and medical staff buy-in.

  11. 2014 Strategic Sustainability Performance Plan

    DTIC Science & Technology

    2014-06-30

    Strategic Sourcing Initiatives, such as Blanket Purchase Agreements (BPAs) for office products and imaging equipment, which include sustainable...end of FY2014. Use Federal Strategic Sourcing Initiatives, such as Blanket Purchase Agreements (BPAs): Yes. USACE is required to participate in

  12. The Science of Strategic Communication

    EPA Science Inventory

    The field of Strategic Communication involves a focused effort to identify, develop, and present multiple types of communication media on a given subject. A Strategic Communication program recognizes the limitations of the most common communication models (primarily “one s...

  13. Hospital boards and hospital strategic focus: the impact of board involvement in strategic decision making.

    PubMed

    Ford-Eickhoff, Karen; Plowman, Donde Ashmos; McDaniel, Reuben R

    2011-01-01

    Despite pressures to change the role of hospital boards, hospitals have made few changes in board composition or director selection criteria. Hospital boards have often continued to operate in their traditional roles as either "monitors" or "advisors." More attention to the direct involvement of hospital boards in the strategic decision-making process of the organizations they serve, the timing and circumstances under which board involvement occurs, and the board composition that enhances their abilities to participate fully is needed. We investigated the relationship between broader expertise among hospital board members, board involvement in the stages of strategic decision making, and the hospital's strategic focus. We surveyed top management team members of 72 nonacademic hospitals to explore the participation of critical stakeholder groups such as the board of directors in the strategic decision-making process. We used hierarchical regression analysis to explore our hypotheses that there is a relationship between both the nature and involvement of the board and the hospital's strategic orientation. Hospitals with broader expertise on their boards reported an external focus. For some of their externally-oriented goals, hospitals also reported that their boards were involved earlier in the stages of decision making. In light of the complex and dynamic environment of hospitals today, those charged with developing hospital boards should match the variety in the external issues that the hospital faces with more variety in board makeup. By developing a board with greater breadth of expertise, the hospital responds to its complex environment by absorbing that complexity, enabling a greater potential for sensemaking and learning. Rather than acting only as monitors and advisors, boards impact their hospitals' strategic focus through their participation in the strategic decision-making process.

  14. Analysis of ballistic transport in nanoscale devices by using an accelerated finite element contact block reduction approach

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li, H.; Li, G., E-mail: gli@clemson.edu

    2014-08-28

    An accelerated Finite Element Contact Block Reduction (FECBR) approach is presented for computational analysis of ballistic transport in nanoscale electronic devices with arbitrary geometry and unstructured mesh. Finite element formulation is developed for the theoretical CBR/Poisson model. The FECBR approach is accelerated through eigen-pair reduction, lead mode space projection, and component mode synthesis techniques. The accelerated FECBR is applied to perform quantum mechanical ballistic transport analysis of a DG-MOSFET with taper-shaped extensions and a DG-MOSFET with Si/SiO2 interface roughness. The computed electrical transport properties of the devices obtained from the accelerated FECBR approach and associated computational cost as a function of system degrees of freedom are compared with those obtained from the original CBR and direct inversion methods. The performance of the accelerated FECBR in both its accuracy and efficiency is demonstrated.

  15. Accurate and efficient spin integration for particle accelerators

    DOE PAGES

    Abell, Dan T.; Meiser, Dominic; Ranjbar, Vahid H.; ...

    2015-02-01

    Accurate spin tracking is a valuable tool for understanding spin dynamics in particle accelerators and can help improve the performance of an accelerator. In this paper, we present a detailed discussion of the integrators in the spin tracking code GPUSPINTRACK. We have implemented orbital integrators based on drift-kick, bend-kick, and matrix-kick splits. On top of the orbital integrators, we have implemented various integrators for the spin motion. These integrators use quaternions and Romberg quadratures to accelerate both the computation and the convergence of spin rotations. We evaluate their performance and accuracy in quantitative detail for individual elements as well as for the entire RHIC lattice. We exploit the inherently data-parallel nature of spin tracking to accelerate our algorithms on graphics processing units.

  16. Graphics Processing Unit Acceleration of Gyrokinetic Turbulence Simulations

    NASA Astrophysics Data System (ADS)

    Hause, Benjamin; Parker, Scott

    2012-10-01

    We find a substantial increase in on-node performance using Graphics Processing Unit (GPU) acceleration in gyrokinetic delta-f particle-in-cell simulation. Optimization is performed on a two-dimensional slab gyrokinetic particle simulation using the Portland Group Fortran compiler with the GPU accelerator compiler directives. We have implemented the GPU acceleration on a Core i7 gaming PC with an NVIDIA GTX 580 GPU. We find comparable, or better, acceleration relative to the NERSC DIRAC cluster with the NVIDIA Tesla C2050 computing processor. The Tesla C2050 is about 2.6 times more expensive than the GTX 580 gaming GPU. Optimization strategies and comparisons between DIRAC and the gaming PC will be presented. We will also discuss progress on optimizing the comprehensive three-dimensional general geometry GEM code.

  17. Computer modeling of photodegradation

    NASA Technical Reports Server (NTRS)

    Guillet, J.

    1986-01-01

    A computer program to simulate the photodegradation of materials exposed to terrestrial weathering environments is being developed. Input parameters would include the solar spectrum, the daily levels and variations of temperature and relative humidity, and materials such as EVA. A brief description of the program, its operating principles, and how it works is given first. After that, the presentation focuses on the recent work of simulating aging in a normal, terrestrial day-night cycle. This is significant, as almost all accelerated aging schemes maintain a constant light illumination without a dark cycle, and this may be a critical factor not included in accelerated aging schemes. For outdoor aging, the computer model indicates that the night dark cycle has a dramatic influence on the chemistry of photothermal degradation, and hints that a dark cycle may be needed in an accelerated aging scheme.

  18. Accelerated spike resampling for accurate multiple testing controls.

    PubMed

    Harrison, Matthew T

    2013-02-01

    Controlling for multiple hypothesis tests using standard spike resampling techniques often requires prohibitive amounts of computation. Importance sampling techniques can be used to accelerate the computation. The general theory is presented, along with specific examples for testing differences across conditions using permutation tests and for testing pairwise synchrony and precise lagged-correlation between many simultaneously recorded spike trains using interval jitter.
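    For reference, the plain Monte Carlo resampling computation that becomes prohibitive, and that importance sampling is used to accelerate, looks roughly like the following permutation test for a difference in mean firing rate across two conditions; the statistic and the names are illustrative and not taken from the paper.

    ```python
    import numpy as np

    def permutation_pvalue(x, y, n_perm=10000, rng=None):
        """Monte Carlo permutation test for a difference in means between two
        conditions; thousands of shuffles per test make multiple testing costly."""
        rng = rng or np.random.default_rng()
        observed = abs(x.mean() - y.mean())
        pooled = np.concatenate([x, y])
        count = 0
        for _ in range(n_perm):
            rng.shuffle(pooled)                  # resample under the null hypothesis
            stat = abs(pooled[:len(x)].mean() - pooled[len(x):].mean())
            count += stat >= observed
        return (count + 1) / (n_perm + 1)        # add-one Monte Carlo p-value
    ```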

  19. A redshift survey of IRAS galaxies. V - The acceleration on the Local Group

    NASA Technical Reports Server (NTRS)

    Strauss, Michael A.; Yahil, Amos; Davis, Marc; Huchra, John P.; Fisher, Karl

    1992-01-01

    The acceleration on the Local Group is calculated based on a full-sky redshift survey of 5288 galaxies detected by IRAS. A formalism is developed to compute the distribution function of the IRAS acceleration for a given power spectrum of initial perturbations. The computed acceleration on the Local Group points 18-28 deg from the direction of the Local Group peculiar velocity vector. The data suggest that the CMB dipole is indeed due to the motion of the Local Group, that this motion is gravitationally induced, and that the distribution of IRAS galaxies on large scales is related to that of dark matter by a simple linear biasing model.
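    As a loose illustration of the kind of sum involved (not the paper's full estimator, which also weights by the survey selection function and handles redshift-space effects), the gravity vector at the origin due to a set of survey galaxies is an inverse-square sum over their positions; the function and argument names below are assumptions for the sketch.

    ```python
    import numpy as np

    def dipole_acceleration(positions, weights):
        """Illustrative inverse-square sum: acceleration direction at the origin
        (the Local Group) from galaxies at the given Cartesian positions."""
        r = np.linalg.norm(positions, axis=1)
        # Each galaxy contributes weight * r_hat / r^2 = weight * r_vec / r^3
        return np.sum(weights[:, None] * positions / r[:, None] ** 3, axis=0)
    ```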

  20. Correlated histogram representation of Monte Carlo derived medical accelerator photon-output phase space

    DOEpatents

    Schach Von Wittenau, Alexis E.

    2003-01-01

    A method is provided to represent the calculated phase space of photons emanating from medical accelerators used in photon teletherapy. The method reproduces the energy distributions and trajectories of the photons originating in the bremsstrahlung target and of photons scattered by components within the accelerator head. The method reproduces the energy and directional information from sources up to several centimeters in radial extent, so it is expected to generalize well to accelerators made by different manufacturers. The method is computationally both fast and efficient, with an overall sampling efficiency of 80% or higher for most field sizes. The computational cost is independent of the number of beams used in the treatment plan.

  1. Setting Strategic Directions Using Critical Success Factors.

    ERIC Educational Resources Information Center

    Bourne, Bonnie; Gates, Larry; Cofer, James

    2000-01-01

    Describes implementation of a system-level planning model focused on institutional improvement and effectiveness at the University of Missouri. Details implementation of three phases of the strategic planning model (strategic analysis, strategic thinking/decision-making, and campus outreach/systems administration planning); identifies critical…

  2. Strategic Management or Strategic Planning for Defense?

    DTIC Science & Technology

    1989-02-01

    manage at the regional or CinC level with an appreciation of strategic planning and management concepts currently taught at business schools. Military...those not in uniform. Science, engineering, and business schools all suggest that their faculties have experience tours so that they can appreciate

  3. Strategic planning for marketers.

    PubMed

    Wilson, I

    1978-12-01

    The merits of strategic planning as a marketing tool are discussed in this article, which takes the view that although marketers claim to be future-oriented, they focus too little attention on long-term planning and forecasting. Strategic planning, as defined by these authors, usually encompasses periods of between five and twenty-five years and places less emphasis on the past as an absolute predictor of the future. It takes a more probabilistic view of the future than conventional marketing strategy and looks at the corporation as but one component interacting with the total environment. Inputs are examined in terms of environmental, social, political, technological and economic importance. Because of its futuristic orientation, an important tenet of strategic planning is the preparation of several alternative scenarios ranging from most to least likely. By planning for a wide range of future market conditions, a corporation is more able to be flexible by anticipating the course of future events, and is less likely to become a captive reactor--as the authors believe is now the case. An example of strategic planning at General Electric is cited.

  4. Accelerated Compressed Sensing Based CT Image Reconstruction.

    PubMed

    Hashemi, SayedMasoud; Beheshti, Soosan; Gill, Patrick R; Paul, Narinder S; Cobbold, Richard S C

    2015-01-01

    In X-ray computed tomography (CT) an important objective is to reduce the radiation dose without significantly degrading the image quality. Compressed sensing (CS) enables the radiation dose to be reduced by producing diagnostic images from a limited number of projections. However, conventional CS-based algorithms are computationally intensive and time-consuming. We propose a new algorithm that accelerates the CS-based reconstruction by using a fast pseudopolar Fourier based Radon transform and rebinning the diverging fan beams to parallel beams. The reconstruction process is analyzed using a maximum a posteriori approach, which is transformed into a weighted CS problem. The weights involved in the proposed model are calculated based on the statistical characteristics of the reconstruction process, which is formulated in terms of the measurement noise and rebinning interpolation error. Therefore, the proposed method not only accelerates the reconstruction, but also removes the rebinning and interpolation errors. Simulation results are shown for phantoms and a patient. For example, a 512 × 512 Shepp-Logan phantom when reconstructed from 128 rebinned projections using a conventional CS method had 10% error, whereas with the proposed method the reconstruction error was less than 1%. Moreover, computation times of less than 30 sec were obtained using a standard desktop computer without numerical optimization.
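    For context, a generic weighted compressed-sensing reconstruction can be written as an iterative soft-thresholding (ISTA) loop; the sketch below solves a weighted l1-regularized least-squares problem with a dense system matrix A and is only a stand-in for, not a reproduction of, the paper's pseudopolar Fourier and rebinning pipeline.

    ```python
    import numpy as np

    def ista_weighted_cs(A, b, weights, lam=0.1, step=1.0, n_iter=200):
        """Minimize ||Ax - b||^2 + lam * ||diag(weights) x||_1 by iterative
        soft-thresholding; step should be below 1 / ||A||^2 for convergence."""
        x = np.zeros(A.shape[1])
        for _ in range(n_iter):
            grad = A.T @ (A @ x - b)                           # gradient of the data-fit term
            z = x - step * grad
            thr = step * lam * weights
            x = np.sign(z) * np.maximum(np.abs(z) - thr, 0.0)  # weighted soft threshold
        return x
    ```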

  5. Accelerated Compressed Sensing Based CT Image Reconstruction

    PubMed Central

    Hashemi, SayedMasoud; Beheshti, Soosan; Gill, Patrick R.; Paul, Narinder S.; Cobbold, Richard S. C.

    2015-01-01

    In X-ray computed tomography (CT) an important objective is to reduce the radiation dose without significantly degrading the image quality. Compressed sensing (CS) enables the radiation dose to be reduced by producing diagnostic images from a limited number of projections. However, conventional CS-based algorithms are computationally intensive and time-consuming. We propose a new algorithm that accelerates the CS-based reconstruction by using a fast pseudopolar Fourier based Radon transform and rebinning the diverging fan beams to parallel beams. The reconstruction process is analyzed using a maximum a posteriori approach, which is transformed into a weighted CS problem. The weights involved in the proposed model are calculated based on the statistical characteristics of the reconstruction process, which is formulated in terms of the measurement noise and rebinning interpolation error. Therefore, the proposed method not only accelerates the reconstruction, but also removes the rebinning and interpolation errors. Simulation results are shown for phantoms and a patient. For example, a 512 × 512 Shepp-Logan phantom when reconstructed from 128 rebinned projections using a conventional CS method had 10% error, whereas with the proposed method the reconstruction error was less than 1%. Moreover, computation times of less than 30 sec were obtained using a standard desktop computer without numerical optimization. PMID:26167200

  6. Accelerating execution of the integrated TIGER series Monte Carlo radiation transport codes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Smith, L.M.; Hochstedler, R.D.

    1997-02-01

    Execution of the integrated TIGER series (ITS) of coupled electron/photon Monte Carlo radiation transport codes has been accelerated by modifying the FORTRAN source code for more efficient computation. Each member code of ITS was benchmarked and profiled with a specific test case that directed the acceleration effort toward the most computationally intensive subroutines. Techniques for accelerating these subroutines included replacing linear search algorithms with binary versions, replacing the pseudo-random number generator, reducing program memory allocation, and proofing the input files for geometrical redundancies. All techniques produced identical or statistically similar results to the original code. Final benchmark timing of the accelerated code resulted in speed-up factors of 2.00 for TIGER (the one-dimensional slab geometry code), 1.74 for CYLTRAN (the two-dimensional cylindrical geometry code), and 1.90 for ACCEPT (the arbitrary three-dimensional geometry code).
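    One of the named modifications, replacing linear searches with binary versions, can be illustrated with a small, hypothetical grid-lookup example; the actual ITS subroutines are FORTRAN, so the Python below only shows the algorithmic substitution, not the original code.

    ```python
    from bisect import bisect_right

    def locate_bin_linear(grid, value):
        """Linear scan for the interval containing value (the original pattern)."""
        for i in range(len(grid) - 1):
            if grid[i] <= value < grid[i + 1]:
                return i
        return len(grid) - 2

    def locate_bin_binary(grid, value):
        """Equivalent lookup via binary search -- the kind of replacement the
        abstract credits for part of the roughly 2x speed-up."""
        return min(max(bisect_right(grid, value) - 1, 0), len(grid) - 2)
    ```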

  7. Using Accelerated Reader with ESL Students.

    ERIC Educational Resources Information Center

    Hamilton, Betty

    1997-01-01

    Describes the use of Accelerated Reader, a computer program that instantly provides scored tests on a variety of books read by high school ESL (English as a Second Language) students as free voluntary reading. Topics include reading improvement programs, including writing assignments; and changes in students' reading habits. (LRW)

  8. Strategic planning processes and hospital financial performance.

    PubMed

    Kaissi, Amer A; Begun, James W

    2008-01-01

    Many common management practices in healthcare organizations, including the practice of strategic planning, have not been subject to widespread assessment through empirical research. If management practice is to be evidence-based, evaluations of such common practices need to be undertaken. The purpose of this research is to provide evidence on the extent of strategic planning practices and the association between hospital strategic planning processes and financial performance. In 2006, we surveyed a sample of 138 chief executive officers (CEOs) of hospitals in the state of Texas about strategic planning in their organizations and collected financial information on the hospitals for 2003. Among the sample hospitals, 87 percent reported having a strategic plan, and most reported that they followed a variety of common practices recommended for strategic planning: having a comprehensive plan, involving physicians, involving the board, and implementing the plan. About one-half of the hospitals assigned responsibility for the plan to the CEO. We tested the association between these planning characteristics in 2006 and two measures of financial performance for 2003. Three dimensions of the strategic planning process--having a strategic plan, assigning the CEO responsibility for the plan, and involving the board--are positively associated with earlier financial performance. Further longitudinal studies are needed to evaluate the cause-and-effect relationship between planning and performance.

  9. Accelerating Innovation: How Nuclear Physics Benefits Us All

    DOE R&D Accomplishments Database

    2011-01-01

    Innovation has been accelerated by nuclear physics in the areas of improving our health; making the world safer; electricity, environment, archaeology; better computers; contributions to industry; and training the next generation of innovators.

  10. Installation Strategic Planning Guidebook

    DTIC Science & Technology

    2012-05-01

    Installation natural resource concerns (for example, wetlands, number of endangered species, water use restrictions, encroachment on training lands...Koehler Publishing, 1994. 7. Strategy Safari – A Guided Tour Through the Wilds of Strategic Management by Henry Mintzberg, Bruce Ahlstrand, and...T. (1987). NY: Knopf. 36. Shaping Strategic Planning: Frogs, Dragons, Bees and Turkey Tails. Pfeiffer, J. W., Goodstein, L. D. & Nolan, T. M. (1989

  11. A study on strategic provisioning of cloud computing services.

    PubMed

    Whaiduzzaman, Md; Haque, Mohammad Nazmul; Rejaul Karim Chowdhury, Md; Gani, Abdullah

    2014-01-01

    Cloud computing is currently emerging as an ever-changing, growing paradigm that models "everything-as-a-service." Virtualised physical resources, infrastructure, and applications are supplied by service provisioning in the cloud. The evolution in the adoption of cloud computing is driven by clear and distinct promising features for both cloud users and cloud providers. However, the increasing number of cloud providers and the variety of service offerings have made it difficult for the customers to choose the best services. By employing successful service provisioning, the essential services required by customers, such as agility and availability, pricing, security and trust, and user metrics can be guaranteed by service provisioning. Hence, continuous service provisioning that satisfies the user requirements is a mandatory feature for the cloud user and vitally important in cloud computing service offerings. Therefore, we aim to review the state-of-the-art service provisioning objectives, essential services, topologies, user requirements, necessary metrics, and pricing mechanisms. We synthesize and summarize different provision techniques, approaches, and models through a comprehensive literature review. A thematic taxonomy of cloud service provisioning is presented after the systematic review. Finally, future research directions and open research issues are identified.

  12. A Study on Strategic Provisioning of Cloud Computing Services

    PubMed Central

    Rejaul Karim Chowdhury, Md

    2014-01-01

    Cloud computing is currently emerging as an ever-changing, growing paradigm that models “everything-as-a-service.” Virtualised physical resources, infrastructure, and applications are supplied by service provisioning in the cloud. The evolution in the adoption of cloud computing is driven by clear and distinct promising features for both cloud users and cloud providers. However, the increasing number of cloud providers and the variety of service offerings have made it difficult for the customers to choose the best services. By employing successful service provisioning, the essential services required by customers, such as agility and availability, pricing, security and trust, and user metrics can be guaranteed by service provisioning. Hence, continuous service provisioning that satisfies the user requirements is a mandatory feature for the cloud user and vitally important in cloud computing service offerings. Therefore, we aim to review the state-of-the-art service provisioning objectives, essential services, topologies, user requirements, necessary metrics, and pricing mechanisms. We synthesize and summarize different provision techniques, approaches, and models through a comprehensive literature review. A thematic taxonomy of cloud service provisioning is presented after the systematic review. Finally, future research directions and open research issues are identified. PMID:25032243

  13. FPGA-accelerated adaptive optics wavefront control

    NASA Astrophysics Data System (ADS)

    Mauch, S.; Reger, J.; Reinlein, C.; Appelfelder, M.; Goy, M.; Beckert, E.; Tünnermann, A.

    2014-03-01

    The speed of real-time adaptive optical systems is primarily restricted by the data processing hardware and computational aspects. Furthermore, the application of mirror layouts with increasing numbers of actuators reduces the bandwidth (speed) of the system and, thus, the number of applicable control algorithms. This burden turns out to be a key impediment for deformable mirrors with continuous mirror surface and highly coupled actuator influence functions. In this regard, specialized hardware is necessary for high performance real-time control applications. Our approach to overcome this challenge is an adaptive optics system based on a Shack-Hartmann wavefront sensor (SHWFS) with a CameraLink interface. The data processing is based on a high performance Intel Core i7 Quadcore hard real-time Linux system. Employing a Xilinx Kintex-7 FPGA, a custom-developed PCIe card is outlined in order to accelerate the analysis of a Shack-Hartmann wavefront sensor. A recently developed real-time capable spot detection algorithm evaluates the wavefront. The main features of the presented system are the reduction of latency and the acceleration of computation. For example, matrix multiplications, which in general are of complexity O(n³), are accelerated by using the DSP48 slices of the field-programmable gate array (FPGA), as well as by a novel hardware implementation of the SHWFS algorithm. Further benefits are the Streaming SIMD Extensions (SSE), which intensively use the parallelization capability of the processor for further reducing the latency and increasing the bandwidth of the closed loop. Due to this approach, up to 64 actuators of a deformable mirror can be handled and controlled without noticeable restriction from computational burdens.

  14. Alternative World Scenarios for Strategic Planning

    DTIC Science & Technology

    1988-01-20

    STRATEGIC STUDIES INSTITUTE, U.S. ARMY WAR COLLEGE, Carlisle Barracks, Pennsylvania 17013-5050, 20 January 1988, ACN 88001. ALTERNATIVE WORLD...Howard D. Graves. Strategic Studies Institute Director: Colonel Thomas R. Stone. Author: Charles W. Taylor. Editor: Marianne P. Cowling. Secretary: Shirley A...Shearer.

  15. Acceleration modules in linear induction accelerators

    NASA Astrophysics Data System (ADS)

    Wang, Shao-Heng; Deng, Jian-Jun

    2014-05-01

    The Linear Induction Accelerator (LIA) is a unique type of accelerator that is capable of accelerating kilo-Ampere charged particle current to tens of MeV energy. The present development of LIA in MHz bursting mode and the successful application to a synchrotron have broadened LIA's usage scope. Although the transformer model is widely used to explain the acceleration mechanism of LIAs, it is not appropriate to consider the induction electric field as the field which accelerates charged particles for many modern LIAs. We have examined the transition of the magnetic cores' functions during the LIA acceleration modules' evolution, distinguished transformer-type and transmission-line-type LIA acceleration modules, and reconsidered several related issues based on the transmission-line-type LIA acceleration module. This clarified understanding should help in the further development and design of LIA acceleration modules.

  16. Strengthening Deterrence for 21st Century Strategic Conflicts and Competition: Accelerating Adaptation and Integration - Annotated Bibliography

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Roberts, B.; Durkalec, J. J.

    This was the fourth in a series of annual events convened at Livermore to explore the emerging place of the “new domains” in U.S. deterrence strategies. The purposes of the series are to facilitate the emergence of a community of interest that cuts across the policy, military, and technical communities and to inform laboratory strategic planning. U.S. allies have also been drawn into the conversation, as U.S. deterrence strategies are in part about their protection. Discussion in these workshops is on a not-for-attribution basis. It also makes no use of classified information. On this occasion, there were nearly 100 participants from a dozen countries.

  17. Strategic Planning and Information Systems.

    ERIC Educational Resources Information Center

    Shuman, Jack N.

    1982-01-01

    Discusses the functions of business planning systems and analyzes the underlying assumptions of the information systems that support strategic planning efforts within organizations. Development of a system framework, obstacles to the successful creation of strategic planning information systems, and resource allocation in organizations are…

  18. Enabling large-scale viscoelastic calculations via neural network acceleration

    NASA Astrophysics Data System (ADS)

    Robinson DeVries, P.; Thompson, T. B.; Meade, B. J.

    2017-12-01

    One of the most significant challenges involved in efforts to understand the effects of repeated earthquake cycle activity are the computational costs of large-scale viscoelastic earthquake cycle models. Deep artificial neural networks (ANNs) can be used to discover new, compact, and accurate computational representations of viscoelastic physics. Once found, these efficient ANN representations may replace computationally intensive viscoelastic codes and accelerate large-scale viscoelastic calculations by more than 50,000%. This magnitude of acceleration enables the modeling of geometrically complex faults over thousands of earthquake cycles across wider ranges of model parameters and at larger spatial and temporal scales than have been previously possible. Perhaps most interestingly from a scientific perspective, ANN representations of viscoelastic physics may lead to basic advances in the understanding of the underlying model phenomenology. We demonstrate the potential of artificial neural networks to illuminate fundamental physical insights with specific examples.
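
    The essence of the approach is a surrogate model: train a small network on samples of an expensive calculation, then query the cheap network in its place. The sketch below illustrates the idea with scikit-learn on a toy target function; the function, network size, and sample counts are assumptions, not the authors' viscoelastic code.

```python
# Minimal sketch of the surrogate idea (not the authors' viscoelastic model):
# train a small neural network on samples of an expensive function, then use
# the cheap network in place of the expensive evaluation.
import numpy as np
from sklearn.neural_network import MLPRegressor

def expensive_model(x):
    # Stand-in for a costly physics calculation (e.g. a viscoelastic solve)
    return np.sin(3 * x[:, 0]) * np.exp(-x[:, 1] ** 2)

rng = np.random.default_rng(1)
X_train = rng.uniform(-1, 1, size=(2000, 2))
y_train = expensive_model(X_train)

surrogate = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=3000,
                         random_state=0).fit(X_train, y_train)

X_test = rng.uniform(-1, 1, size=(5, 2))
# Compare expensive evaluations against the cheap surrogate predictions
print(np.c_[expensive_model(X_test), surrogate.predict(X_test)])
```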

  19. Strategic planning--the role of the chief executive.

    PubMed

    Daniel, A L

    1992-04-01

    Failure to see strategic planning as a process and ineffective CEO involvement are two reasons for failures in strategic planning. This article outlines the stages in an effective strategic planning process, discusses the appropriate role or roles for the CEO or leader in each stage, and defines the expected results from effective strategic planning.

  20. The Value of Strategic Partnerships

    ScienceCinema

    Gould, Josh; Narayan, Amit; McNutt, Ty

    2018-05-30

    Strong strategic partnerships can be the difference between those technologies that only achieve success in the lab and those that actually break into the marketplace. Two ARPA-E awardees—AutoGrid and APEI—have forged strategic partnerships that have positioned their technologies to achieve major success in the market. This video features remarks from ARPA-E Technology-to-Market Advisor Josh Gould and interviews with technologists at AutoGrid and APEI, who each tell the story of how their company leveraged relationships with strategic partners to broaden their customer base and bring their technology to life.

  1. Rethinking Strategy and Strategic Leadership in Schools.

    ERIC Educational Resources Information Center

    Davies, Brent

    2003-01-01

    Reviews nature of strategy and strategic leadership in schools. Considers how leaders can map and reconceptualize the nature of strategy and develop strategic capabilities for longer-term sustainability. Questions hierarchical models of leadership. Highlights three characteristics of strategically oriented schools; suggests ways to improve art of…

  2. Scientific Computing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fermilab

    2017-09-01

    Scientists, engineers and programmers at Fermilab are tackling today’s most challenging computational problems. Their solutions, motivated by the needs of worldwide research in particle physics and accelerators, help America stay at the forefront of innovation.

  3. Design and simulation of a descent controller for strategic four-dimensional aircraft navigation. M.S. Thesis

    NASA Technical Reports Server (NTRS)

    Lax, F. M.

    1975-01-01

    A time-controlled navigation system applicable to the descent phase of flight for airline transport aircraft was developed and simulated. The design incorporates the linear discrete-time sampled-data version of the linearized continuous-time system describing the aircraft's aerodynamics. Using optimal linear quadratic control techniques, an optimal deterministic control regulator which is implementable on an airborne computer is designed. The navigation controller assists the pilot in complying with assigned times of arrival along a four-dimensional flight path in the presence of wind disturbances. The strategic air traffic control concept is also described, followed by the design of a strategic control descent path. A strategy for determining possible times of arrival at specified waypoints along the descent path and for generating the corresponding route-time profiles that are within the performance capabilities of the aircraft is presented. Using a mathematical model of the Boeing 707-320B aircraft along with a Boeing 707 cockpit simulator interfaced with an Adage AGT-30 digital computer, a real-time simulation of the complete aircraft aerodynamics was achieved. The strategic four-dimensional navigation controller for longitudinal dynamics was tested on the nonlinear aircraft model in the presence of 15, 30, and 45 knot head-winds. The results indicate that the controller preserved the desired accuracy and precision of a time-controlled aircraft navigation system.
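
    The regulator design step described above amounts to a discrete-time linear-quadratic (LQ) problem. The sketch below shows a minimal version of that computation: iterating the discrete Riccati recursion to obtain the state-feedback gain. The toy two-state system matrices and weights are assumptions, not the thesis' Boeing 707 model.

```python
# Minimal sketch of a discrete-time LQ regulator design (not the thesis'
# aircraft model): iterate the discrete Riccati equation to a fixed point
# and form the state-feedback gain. System matrices are illustrative.
import numpy as np

A = np.array([[1.0, 0.1],
              [0.0, 0.98]])        # toy discretized dynamics
B = np.array([[0.0],
              [0.1]])
Q = np.diag([1.0, 0.1])            # state weighting
R = np.array([[0.5]])              # control weighting

P = Q.copy()
for _ in range(500):               # value iteration on the Riccati recursion
    K = np.linalg.solve(R + B.T @ P @ B, B.T @ P @ A)
    P = Q + A.T @ P @ (A - B @ K)

print("feedback gain K =", K)
# Closed-loop control law: u_k = -K @ x_k
```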

  4. Teaching Subtraction and Multiplication with Regrouping Using the Concrete-Representational-Abstract Sequence and Strategic Instruction Model

    ERIC Educational Resources Information Center

    Flores, Margaret M.; Hinton, Vanessa; Strozier, Shaunita D.

    2014-01-01

    Based on Common Core Standards (2010), mathematics interventions should emphasize conceptual understanding of numbers and operations as well as fluency. For students at risk for failure, the concrete-representational-abstract (CRA) sequence and the Strategic Instruction Model (SIM) have been shown effective in teaching computation with an emphasis…

  5. Energy Innovation Acceleration Program

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wolfson, Johanna

    The Energy Innovation Acceleration Program (IAP) – also called U-Launch – has had a significant impact on early stage clean energy companies in the Northeast and on the clean energy economy in the Northeast, not only during program execution (2010-2014), but continuing into the future. Key results include: a leverage ratio of 105:1; $105M in follow-on funding (upon $1M investment by EERE); at least 19 commercial products launched; at least 17 new industry partnerships formed; at least $6.5M in revenue generated; more than 140 jobs created; and 60% of assisted companies receiving follow-on funding within 1 year of program completion. In addition to the direct measurable program results summarized above, two primary lessons emerged from our work executing Energy IAP: validation and demonstration awards have an outsized, ‘tipping-point’ effect for startups looking to secure investments and strategic partnerships; and an ecosystem approach is valuable, but an approach that evaluates the needs of individual companies and then draws from diverse ecosystem resources to fill them is most valuable of all.

  6. Strategic Planning for School Administrators. Fastback 457.

    ERIC Educational Resources Information Center

    Prosise, Roger

    This fastback document examines the strategic-planning process. Intended for school administrators, the booklet offers practical advice on strategic planning, and the importance of such planning in those districts that experience high turnover. When conceptualizing a strategic plan, administrators should begin with an end in mind and then develop…

  7. Choosing order of operations to accelerate strip structure analysis in parameter range

    NASA Astrophysics Data System (ADS)

    Kuksenko, S. P.; Akhunov, R. R.; Gazizov, T. R.

    2018-05-01

    The paper considers the use of iterative methods for solving the sequence of linear algebraic systems obtained in the quasistatic analysis of strip structures with the method of moments. Through the analysis of 4 strip structures, the authors show that additional acceleration (up to 2.21 times) of the iterative process can be obtained when solving linear systems repeatedly by choosing a proper order of operations and a preconditioner. The obtained results can be used to accelerate the computer-aided design of various strip structures. The choice of the order of operations is simple and universal, and could be applied not only to strip structure analysis but also to a wide range of computational problems.
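
    One practical form of this idea, reusing work across a sequence of related linear systems, is to build a preconditioner once and apply it to every right-hand side. The SciPy sketch below illustrates that pattern with ILU-preconditioned GMRES on a toy sparse matrix; it is not the authors' method-of-moments code, and the matrix and loop are assumptions.

```python
# Minimal sketch of reusing a preconditioner across a sequence of related
# linear systems (not the authors' method-of-moments code): factor an ILU
# preconditioner once and apply it to several right-hand sides with GMRES.
import numpy as np
import scipy.sparse as sp
from scipy.sparse.linalg import spilu, LinearOperator, gmres

n = 200
A = sp.diags([-1, 2.2, -1], [-1, 0, 1], shape=(n, n), format="csc")

ilu = spilu(A)                                   # built once, reused below
M = LinearOperator(A.shape, matvec=ilu.solve)

rng = np.random.default_rng(2)
for k in range(4):                               # sequence of right-hand sides
    b = rng.normal(size=n)                       # e.g. a varying design parameter
    x, info = gmres(A, b, M=M)
    print(k, "converged" if info == 0 else f"info={info}",
          "residual =", np.linalg.norm(A @ x - b))
```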

  8. Strategic Planning for Higher Education.

    ERIC Educational Resources Information Center

    Kotler, Philip; Murphy, Patrick E.

    1981-01-01

    The framework necessary for achieving a strategic planning posture in higher education is outlined. The most important benefit of strategic planning for higher education decision makers is that it forces them to undertake a more market-oriented and systematic approach to long- range planning. (Author/MLW)

  9. Strategic Marketing for Educational Systems.

    ERIC Educational Resources Information Center

    Hanson, E. Mark; Henry, Walter

    1992-01-01

    Private-sector strategic marketing processes can significantly benefit schools desiring to develop public confidence and support and establish guidelines for future development. This article defines a strategic marketing model for school systems and articulates the sequence of related research and operational steps comprising it. Although schools…

  10. Combining Diffusive Shock Acceleration with Acceleration by Contracting and Reconnecting Small-scale Flux Ropes at Heliospheric Shocks

    NASA Astrophysics Data System (ADS)

    le Roux, J. A.; Zank, G. P.; Webb, G. M.; Khabarova, O. V.

    2016-08-01

    Computational and observational evidence is accruing that heliospheric shocks, as emitters of vorticity, can produce downstream magnetic flux ropes and filaments. This led Zank et al. to investigate a new paradigm whereby energetic particle acceleration near shocks is a combination of diffusive shock acceleration (DSA) with downstream acceleration by many small-scale contracting and reconnecting (merging) flux ropes. Using a model where flux-rope acceleration involves a first-order Fermi mechanism due to the mean compression of numerous contracting flux ropes, Zank et al. provide theoretical support for observations that power-law spectra of energetic particles downstream of heliospheric shocks can be harder than predicted by DSA theory and that energetic particle intensities should peak behind shocks instead of at shocks as predicted by DSA theory. In this paper, a more extended formalism of kinetic transport theory developed by le Roux et al. is used to further explore this paradigm. We describe how second-order Fermi acceleration, related to the variance in the electromagnetic fields produced by downstream small-scale flux-rope dynamics, modifies the standard DSA model. The results show that (I) this approach can qualitatively reproduce observations of particle intensities peaking behind the shock, thus providing further support for the new paradigm, and (II) stochastic acceleration by compressible flux ropes tends to be more efficient than incompressible flux ropes behind shocks in modifying the DSA spectrum of energetic particles.

  11. Computational Infrastructure for Geodynamics (CIG)

    NASA Astrophysics Data System (ADS)

    Gurnis, M.; Kellogg, L. H.; Bloxham, J.; Hager, B. H.; Spiegelman, M.; Willett, S.; Wysession, M. E.; Aivazis, M.

    2004-12-01

    Solid earth geophysicists have a long tradition of writing scientific software to address a wide range of problems. In particular, computer simulations came into wide use in geophysics during the decade after the plate tectonic revolution. Solution schemes and numerical algorithms that developed in other areas of science, most notably engineering, fluid mechanics, and physics, were adapted with considerable success to geophysics. This software has largely been the product of individual efforts and although this approach has proven successful, its strength for solving problems of interest is now starting to show its limitations as we try to share codes and algorithms or when we want to recombine codes in novel ways to produce new science. With funding from the NSF, the US community has embarked on a Computational Infrastructure for Geodynamics (CIG) that will develop, support, and disseminate community-accessible software for the greater geodynamics community from model developers to end-users. The software is being developed for problems involving mantle and core dynamics, crustal and earthquake dynamics, magma migration, seismology, and other related topics. With a high level of community participation, CIG is leveraging state-of-the-art scientific computing into a suite of open-source tools and codes. The infrastructure that we are now starting to develop will consist of: (a) a coordinated effort to develop reusable, well-documented and open-source geodynamics software; (b) the basic building blocks - an infrastructure layer - of software by which state-of-the-art modeling codes can be quickly assembled; (c) extension of existing software frameworks to interlink multiple codes and data through a superstructure layer; (d) strategic partnerships with the larger world of computational science and geoinformatics; and (e) specialized training and workshops for both the geodynamics and broader Earth science communities. The CIG initiative has already started to

  12. Torque-based optimal acceleration control for electric vehicle

    NASA Astrophysics Data System (ADS)

    Lu, Dongbin; Ouyang, Minggao

    2014-03-01

    Existing research on acceleration control mainly focuses on optimizing the velocity trajectory with respect to a criterion that weights acceleration time and fuel consumption. The minimum-fuel acceleration problem in conventional vehicles has been solved by Pontryagin's maximum principle and by dynamic programming, respectively. Acceleration control with minimum energy consumption for a battery electric vehicle (EV) has not been reported. In this paper, the permanent magnet synchronous motor (PMSM) is controlled by the field oriented control (FOC) method, and the electric drive system of the EV (including the PMSM, the inverter, and the battery) is modeled rather than relying on a detailed consumption map. An analytical algorithm is proposed to analyze the optimal acceleration control, and the optimal torque-versus-speed curve in the acceleration process is obtained. Considering the acceleration time, a penalty function is introduced to realize fast vehicle speed tracking. The optimal acceleration control is also addressed with dynamic programming (DP). This method can solve the optimal acceleration problem with a precise time constraint, but it consumes a large amount of computation time. The EV used in simulation and experiment is a four-wheel hub motor drive electric vehicle. The simulation and experimental results show that the required battery energy differs little between the acceleration control found by the analytical algorithm and that found by DP, and is greatly reduced compared with constant-pedal-opening acceleration. The proposed analytical and DP algorithms can minimize the energy consumption in the EV's acceleration process, and the analytical algorithm is easy to implement in real-time control.
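
    The dynamic-programming formulation can be illustrated with a heavily simplified model: choose a speed trajectory on a grid that reaches a target speed in a fixed time while minimizing a toy energy cost (kinetic-energy change plus drag loss) under a power limit. All vehicle parameters, the cost model, and the grid sizes in the sketch below are assumptions, not the paper's PMSM drive model.

```python
# Toy dynamic-programming sketch of minimum-energy acceleration (not the
# paper's drive model): pick a speed trajectory on a grid that reaches the
# target speed in a fixed time with minimal kinetic-plus-drag energy.
import numpy as np

m, rho_cd_a, dt = 1500.0, 0.75, 0.5          # mass [kg], lumped drag, step [s]
P_MAX = 100e3                                # drive power limit [W]
v_grid = np.linspace(0.0, 25.0, 126)         # speed grid [m/s]
n_steps, v_target = 30, 25.0

def step_energy(v0, v1):
    """Battery energy for one step: kinetic change + drag loss, power-limited."""
    v_avg = 0.5 * (v0 + v1)
    e = 0.5 * m * (v1**2 - v0**2) + rho_cd_a * v_avg**3 * dt
    if e < 0 or e > P_MAX * dt:
        return np.inf                        # no regen credit, no overpower steps
    return e

# cost_to_go[k, i]: minimal energy from speed v_grid[i] at step k to the target
cost_to_go = np.full((n_steps + 1, v_grid.size), np.inf)
cost_to_go[n_steps, np.argmin(abs(v_grid - v_target))] = 0.0
policy = np.zeros((n_steps, v_grid.size), dtype=int)

for k in range(n_steps - 1, -1, -1):         # backward DP recursion
    for i, v0 in enumerate(v_grid):
        costs = [step_energy(v0, v1) + cost_to_go[k + 1, j]
                 for j, v1 in enumerate(v_grid)]
        policy[k, i] = int(np.argmin(costs))
        cost_to_go[k, i] = costs[policy[k, i]]

# Recover the optimal speed trajectory starting from rest
idx, traj = 0, [0.0]
for k in range(n_steps):
    idx = policy[k, idx]
    traj.append(v_grid[idx])
print("minimum energy [J]:", cost_to_go[0, 0])
print("speed trajectory [m/s]:", np.round(traj, 1))
```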

  13. Children’s strategic theory of mind

    PubMed Central

    Sher, Itai; Koenig, Melissa; Rustichini, Aldo

    2014-01-01

    Human strategic interaction requires reasoning about other people’s behavior and mental states, combined with an understanding of their incentives. However, the ontogenic development of strategic reasoning is not well understood: At what age do we show a capacity for sophisticated play in social interactions? Several lines of inquiry suggest an important role for recursive thinking (RT) and theory of mind (ToM), but these capacities leave out the strategic element. We posit a strategic theory of mind (SToM) integrating ToM and RT with reasoning about incentives of all players. We investigated SToM in 3- to 9-y-old children and adults in two games that represent prevalent aspects of social interaction. Children anticipate deceptive and competitive moves from the other player and play both games in a strategically sophisticated manner by 7 y of age. One game has a pure strategy Nash equilibrium: In this game, children achieve equilibrium play by the age of 7 y on the first move. In the other game, with a single mixed-strategy equilibrium, children’s behavior moved toward the equilibrium with experience. These two results also correspond to two ways in which children’s behavior resembles adult behavior in the same games. In both games, children’s behavior becomes more strategically sophisticated with age on the first move. Beyond the age of 7 y, children begin to think about strategic interaction not myopically, but in a farsighted way, possibly with a view to cooperating and capitalizing on mutual gains in long-run relationships. PMID:25197065
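
    For readers unfamiliar with the game-theoretic terms, the mixed-strategy equilibrium of a two-player zero-sum game can be computed as a small linear program (the row player's maximin strategy). The sketch below shows that computation for an illustrative payoff matrix; it is not one of the games actually used in the study.

```python
# Minimal sketch of computing a mixed-strategy equilibrium of a two-player
# zero-sum game via linear programming. The payoff matrix is an assumption,
# not a game from the study.
import numpy as np
from scipy.optimize import linprog

A = np.array([[ 2.0, -1.0],      # row player's payoffs (column player gets -A)
              [-1.0,  1.0]])
m, n = A.shape

# Variables z = (x_1..x_m, v): maximize v s.t. A^T x >= v, sum(x) = 1, x >= 0
c = np.zeros(m + 1); c[-1] = -1.0                       # minimize -v
A_ub = np.hstack([-A.T, np.ones((n, 1))])               # v - (A^T x)_j <= 0
b_ub = np.zeros(n)
A_eq = np.hstack([np.ones((1, m)), np.zeros((1, 1))])   # probabilities sum to 1
b_eq = np.array([1.0])
bounds = [(0, None)] * m + [(None, None)]

res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq, bounds=bounds)
x, v = res.x[:m], res.x[-1]
print("row player's mixed strategy:", np.round(x, 3), " game value:", round(v, 3))
```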

  14. Accelerating Astronomy & Astrophysics in the New Era of Parallel Computing: GPUs, Phi and Cloud Computing

    NASA Astrophysics Data System (ADS)

    Ford, Eric B.; Dindar, Saleh; Peters, Jorg

    2015-08-01

    The realism of astrophysical simulations and statistical analyses of astronomical data are set by the available computational resources. Thus, astronomers and astrophysicists are constantly pushing the limits of computational capabilities. For decades, astronomers benefited from massive improvements in computational power that were driven primarily by increasing clock speeds and required relatively little attention to details of the computational hardware. For nearly a decade, increases in computational capabilities have come primarily from increasing the degree of parallelism, rather than increasing clock speeds. Further increases in computational capabilities will likely be led by many-core architectures such as Graphical Processing Units (GPUs) and Intel Xeon Phi. Successfully harnessing these new architectures requires significantly more understanding of the hardware architecture, cache hierarchy, compiler capabilities and network characteristics. I will provide an astronomer's overview of the opportunities and challenges provided by modern many-core architectures and elastic cloud computing. The primary goal is to help an astronomical audience understand what types of problems are likely to yield more than order-of-magnitude speed-ups and which problems are unlikely to parallelize sufficiently efficiently to be worth the development time and/or costs. I will draw on my experience leading a team in developing the Swarm-NG library for parallel integration of large ensembles of small n-body systems on GPUs, as well as several smaller software projects. I will share lessons learned from collaborating with computer scientists, including both technical and soft skills. Finally, I will discuss the challenges of training the next generation of astronomers to be proficient in this new era of high-performance computing, drawing on experience teaching a graduate class on High-Performance Scientific Computing for Astrophysics and organizing a 2014 advanced summer

  15. Profiles of Motivated Self-Regulation in College Computer Science Courses: Differences in Major versus Required Non-Major Courses

    NASA Astrophysics Data System (ADS)

    Shell, Duane F.; Soh, Leen-Kiat

    2013-12-01

    The goal of the present study was to utilize a profiling approach to understand differences in motivation and strategic self-regulation among post-secondary STEM students in major versus required non-major computer science courses. Participants were 233 students from required introductory computer science courses (194 men; 35 women; 4 unknown) at a large Midwestern state university. Cluster analysis identified five profiles: (1) a strategic profile of a highly motivated by-any-means good strategy user; (2) a knowledge-building profile of an intrinsically motivated autonomous, mastery-oriented student; (3) a surface learning profile of a utility motivated minimally engaged student; (4) an apathetic profile of an amotivational disengaged student; and (5) a learned helpless profile of a motivated but unable to effectively self-regulate student. Among CS majors and students in courses in their major field, the strategic and knowledge-building profiles were the most prevalent. Among non-CS majors and students in required non-major courses, the learned helpless, surface learning, and apathetic profiles were the most prevalent. Students in the strategic and knowledge-building profiles had significantly higher retention of computational thinking knowledge than students in other profiles. Students in the apathetic and surface learning profiles saw little instrumentality of the course for their future academic and career objectives. Findings show that students in STEM fields taking required computer science courses exhibit the same constellation of motivated strategic self-regulation profiles found in other post-secondary and K-12 settings.

  16. USAF Strategic Master Plan

    DTIC Science & Technology

    2015-05-01

    The SMP comprises goals and objectives and four annexes: the Human Capital Annex (HCA), Strategic Posture Annex (SPA), Capabilities Annex (CA), and Science and Technology Annex (STA) (Figure 1: Internal Structure of the SMP). These annexes translate the SMP's comprehensive goals and objectives into tangible actions and priorities; through the HCA, SPA, CA, and STA, the SMP consolidates and transmits strategic direction to staffs.

  17. Strategic Performance Management Evaluation for the Navy’s Splice Local Area Networks.

    DTIC Science & Technology

    1985-04-01

    ...Communications Agency (DCA)/Federal Data Corporation (FDC) literature; an extensive survey of academic and professional book and article literature... An interesting closing note on strategic planning characteristics is that the period during which collapse or disaster develops is of the same order as the... accepted set of standards. In computer performance, such things as paging rates, throughput, input/output channel usage, and turnaround time...

  18. Strategic Activism, Educational Leadership and Social Justice

    ERIC Educational Resources Information Center

    Ryan, James

    2016-01-01

    This article describes the strategic activism of educational leaders who promote social justice. Given the risks, educational leaders need to be strategic about the ways in which they pursue their activism. Citing current research, this article explores the ways in which leaders strategically pursue their social justice agendas within their own…

  19. National Weather Service - Strategic Planning and Policy

    Science.gov Websites

    National Weather Service Strategic Planning and Policy homepage: NWS Strategic Plan (current plan and archive), policy issues (public/private data rights, international data), and presentations/tools.

  20. DeepX: Deep Learning Accelerator for Restricted Boltzmann Machine Artificial Neural Networks.

    PubMed

    Kim, Lok-Won

    2018-05-01

    Although there have been many decades of research and commercial presence on high performance general purpose processors, there are still many applications that require fully customized hardware architectures for further computational acceleration. Recently, deep learning has been successfully used to learn in a wide variety of applications, but its heavy computation demand has considerably limited practical applications. This paper proposes a fully pipelined acceleration architecture to alleviate the high computational demand of a class of artificial neural networks (ANNs), the restricted Boltzmann machine (RBM). The implemented RBM ANN accelerator (integrating network size, using 128 input cases per batch, and running at a 303-MHz clock frequency) integrated in a state-of-the-art field-programmable gate array (FPGA) (Xilinx Virtex 7 XC7V-2000T) provides a computational performance of 301 billion connection-updates-per-second, about 193 times higher than a software solution running on general purpose processors. Most importantly, the architecture enables over 4 times (12 times in batch learning) higher performance compared with a previous work when both are implemented in an FPGA device (XC2VP70).
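
    The computation being pipelined in such an accelerator is the RBM's contrastive-divergence training step, which is dominated by dense matrix products. The NumPy sketch below shows one CD-1 update for a small binary RBM; the layer sizes, batch size, and learning rate are assumptions, and this is an algorithmic illustration rather than the paper's hardware design.

```python
# Minimal NumPy sketch of one contrastive-divergence (CD-1) update for a small
# binary RBM (an algorithmic illustration, not the paper's FPGA design).
import numpy as np

rng = np.random.default_rng(3)
n_visible, n_hidden, batch = 16, 8, 128
W = 0.01 * rng.normal(size=(n_visible, n_hidden))
b_v, b_h = np.zeros(n_visible), np.zeros(n_hidden)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

v0 = (rng.random((batch, n_visible)) < 0.5).astype(float)   # a batch of inputs

# Positive phase: hidden activations given the data
p_h0 = sigmoid(v0 @ W + b_h)
h0 = (rng.random(p_h0.shape) < p_h0).astype(float)

# Negative phase: one step of Gibbs sampling (reconstruction)
p_v1 = sigmoid(h0 @ W.T + b_v)
p_h1 = sigmoid(p_v1 @ W + b_h)

# CD-1 parameter update
lr = 0.1
W += lr * (v0.T @ p_h0 - p_v1.T @ p_h1) / batch
b_v += lr * (v0 - p_v1).mean(axis=0)
b_h += lr * (p_h0 - p_h1).mean(axis=0)
print("reconstruction error:", np.mean((v0 - p_v1) ** 2))
```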

  1. Coquina Elementary students enjoy gift of computers

    NASA Technical Reports Server (NTRS)

    1999-01-01

    Children at Coquina Elementary School, Titusville, Fla., 'practice' using a computer keyboard, part of equipment donated by Kennedy Space Center. Coquina is one of 13 Brevard County schools receiving 81 excess contractor computers thanks to an innovative educational outreach project spearheaded by the NASA K-12 Education Services Office at KSC. The Astronaut Memorial Foundation, a strategic partner in the effort, and several schools in rural Florida and Georgia also received refurbished computers as part of the year-long project. KSC employees put in about 3,300 volunteer hours to transform old, excess computers into upgraded, usable units. A total of $90,000 in upgraded computer equipment is being donated.

  2. Coquina Elementary students enjoy gift of computers

    NASA Technical Reports Server (NTRS)

    1999-01-01

    Children at Coquina Elementary School, Titusville, Fla., look with curiosity at the wrapped computer equipment donated by Kennedy Space Center. Coquina is one of 13 Brevard County schools receiving 81 excess contractor computers thanks to an innovative educational outreach project spearheaded by the NASA K-12 Education Services Office at KSC. The Astronaut Memorial Foundation, a strategic partner in the effort, and several schools in rural Florida and Georgia also received refurbished computers as part of the year-long project. KSC employees put in about 3,300 volunteer hours to transform old, excess computers into upgraded, usable units. A total of $90,000 in upgraded computer equipment is being donated.

  3. Audubon Elementary students enjoy gift of computers

    NASA Technical Reports Server (NTRS)

    1999-01-01

    Children at Audubon Elementary School, Merritt Island, Fla., eagerly unwrap computer equipment donated by Kennedy Space Center. Audubon is one of 13 Brevard County schools receiving 81 excess contractor computers thanks to an innovative educational outreach project spearheaded by the NASA K-12 Education Services Office at KSC. The Astronaut Memorial Foundation, a strategic partner in the effort, and several schools in rural Florida and Georgia also received refurbished computers as part of the year-long project. KSC employees put in about 3,300 volunteer hours to transform old, excess computers into upgraded, usable units. A total of $90,000 in upgraded computer equipment is being donated.

  4. Coquina Elementary students enjoy gift of computers

    NASA Technical Reports Server (NTRS)

    1999-01-01

    Children at Coquina Elementary School, Titusville, Fla., eagerly tear into the wrapped computer equipment donated by Kennedy Space Center. Coquina is one of 13 Brevard County schools receiving 81 excess contractor computers thanks to an innovative educational outreach project spearheaded by the NASA K-12 Education Services Office at KSC. The Astronaut Memorial Foundation, a strategic partner in the effort, and several schools in rural Florida and Georgia also received refurbished computers as part of the year-long project. KSC employees put in about 3,300 volunteer hours to transform old, excess computers into upgraded, usable units. A total of $90,000 in upgraded computer equipment is being donated.

  5. Coquina Elementary students enjoy gift of computers

    NASA Technical Reports Server (NTRS)

    1999-01-01

    Children at Coquina Elementary School, Titusville, Fla., excitedly tear into the wrapped computer equipment donated by Kennedy Space Center. Coquina is one of 13 Brevard County schools receiving 81 excess contractor computers thanks to an innovative educational outreach project spearheaded by the NASA K-12 Education Services Office at KSC. The Astronaut Memorial Foundation, a strategic partner in the effort, and several schools in rural Florida and Georgia also received refurbished computers as part of the year-long project. KSC employees put in about 3,300 volunteer hours to transform old, excess computers into upgraded, usable units. A total of $90,000 in upgraded computer equipment is being donated.

  6. Transformational Assessment: A Simplified Model of Strategic Planning

    ERIC Educational Resources Information Center

    Bardwell, Rebecca

    2008-01-01

    Strategic planning is a way to evaluate a present situation and set a course for the future. While there is no dearth of literature on Strategic Planning, there appears to be reluctance on the part of K-12 educators to engage in strategic planning. Besides the cynicism about change, another roadblock to strategic planning is the time it takes.…

  7. The Ethics of Strategic Ambiguity.

    ERIC Educational Resources Information Center

    Paul, Jim; Strbiak, Christy A.

    1997-01-01

    Examines the concept of strategic ambiguity in communication, and addresses the ethics of strategic ambiguity from an intrapersonal perspective that considers the congruity of communicators' espoused-ethics, ethics-in-use, and behavior, where ethical judgements are based on the congruity between espoused-ethics and actual behavior. Poses questions…

  8. Strategic Human Resource Development. Symposium.

    ERIC Educational Resources Information Center

    2002

    This document contains three papers on strategic human resource (HR) development. "Strategic HR Orientation and Firm Performance in India" (Kuldeep Singh) reports findings from a study of Indian business executives that suggests there is a positive link between HR policies and practices and workforce motivation and loyalty and…

  9. NASA Space Sciences Strategic Planning

    NASA Technical Reports Server (NTRS)

    Crane, Philippe

    2004-01-01

    The purposes of the strategic planning roadmap are to: fulfill the strategic planning requirements; provide a guide to the science community in presenting research requests to NASA; inform and inspire; focus investments in technology and research for future missions; and provide the scientific and technical justification for augmentation requests.

  10. Acceleration of GPU-based Krylov solvers via data transfer reduction

    DOE PAGES

    Anzt, Hartwig; Tomov, Stanimire; Luszczek, Piotr; ...

    2015-04-08

    Krylov subspace iterative solvers are often the method of choice when solving large sparse linear systems. At the same time, hardware accelerators such as graphics processing units continue to offer significant floating point performance gains for matrix and vector computations through easy-to-use libraries of computational kernels. However, as these libraries are usually composed of a well optimized but limited set of linear algebra operations, applications that use them often fail to reduce certain data communications, and hence fail to leverage the full potential of the accelerator. In this study, we target the acceleration of Krylov subspace iterative methods for graphics processing units, and in particular the Biconjugate Gradient Stabilized (BiCGStab) solver. We show that significant improvement can be achieved by reformulating the method to reduce data communications through application-specific kernels instead of using generic BLAS kernels (e.g., as provided by NVIDIA's cuBLAS library), and by designing a GPU-specific sparse matrix-vector product kernel that is able to use the graphics processing unit's computing power more efficiently. Furthermore, we derive a model estimating the performance improvement, and use experimental data to validate the expected runtime savings. Finally, considering that the derived implementation achieves significantly higher performance, we assert that similar optimizations addressing algorithm structure, as well as the sparse matrix-vector product, are crucial for the subsequent development of high-performance GPU-accelerated Krylov subspace iterative methods.
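
    The reformulation argument can be made concrete by writing out the BiCGStab iteration and marking which groups of vector updates a GPU implementation could fuse into single application-specific kernels instead of separate generic BLAS calls. The NumPy sketch below is such an illustration on a toy sparse system; it is not the authors' CUDA code.

```python
# Minimal NumPy sketch of the BiCGStab iteration, annotated with the vector
# updates a GPU implementation could fuse into custom kernels (an illustration
# of the idea, not the authors' CUDA code). The test matrix is an assumption.
import numpy as np
import scipy.sparse as sp

n = 500
A = sp.diags([-1.0, 2.5, -1.0], [-1, 0, 1], shape=(n, n), format="csr")
b = np.ones(n)

x = np.zeros(n)
r = b - A @ x
r_hat = r.copy()
rho = alpha = omega = 1.0
p = np.zeros(n)
v = np.zeros(n)

for k in range(200):
    rho_new = r_hat @ r
    beta = (rho_new / rho) * (alpha / omega)
    # Fusable kernel 1: one pass producing p from r, p, v
    p = r + beta * (p - omega * v)
    v = A @ p                                  # application-specific SpMV
    alpha = rho_new / (r_hat @ v)
    # Fusable kernel 2: s in a single pass over r and v
    s = r - alpha * v
    t = A @ s                                  # application-specific SpMV
    omega = (t @ s) / (t @ t)
    # Fusable kernel 3: update x and r together, reusing s and t
    x = x + alpha * p + omega * s
    r = s - omega * t
    rho = rho_new
    if np.linalg.norm(r) < 1e-8 * np.linalg.norm(b):
        break

print("iterations:", k + 1, " residual:", np.linalg.norm(b - A @ x))
```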

  11. Programming and Runtime Support to Blaze FPGA Accelerator Deployment at Datacenter Scale.

    PubMed

    Huang, Muhuan; Wu, Di; Yu, Cody Hao; Fang, Zhenman; Interlandi, Matteo; Condie, Tyson; Cong, Jason

    2016-10-01

    With the end of CPU core scaling due to dark silicon limitations, customized accelerators on FPGAs have gained increased attention in modern datacenters due to their lower power, high performance and energy efficiency. Evidenced by Microsoft's FPGA deployment in its Bing search engine and Intel's $16.7 billion acquisition of Altera, integrating FPGAs into datacenters is considered one of the most promising approaches to sustain future datacenter growth. However, it is quite challenging for existing big data computing systems, such as Apache Spark and Hadoop, to access the performance and energy benefits of FPGA accelerators. In this paper we design and implement Blaze to provide programming and runtime support for enabling easy and efficient deployments of FPGA accelerators in datacenters. In particular, Blaze abstracts FPGA accelerators as a service (FaaS) and provides a set of clean programming APIs for big data processing applications to easily utilize those accelerators. Our Blaze runtime implements an FaaS framework to efficiently share FPGA accelerators among multiple heterogeneous threads on a single node, and extends Hadoop YARN with accelerator-centric scheduling to efficiently share them among multiple computing tasks in the cluster. Experimental results using four representative big data applications demonstrate that Blaze greatly reduces the programming efforts to access FPGA accelerators in systems like Apache Spark and YARN, and improves the system throughput by 1.7× to 3× (and energy efficiency by 1.5× to 2.7×) compared to a conventional CPU-only cluster.

  12. A Strategic Plan Is Just the Beginning

    ERIC Educational Resources Information Center

    Wilson, E. B.

    2009-01-01

    The process of strategic planning wears out institutions. Strategic planning, pursued with high purpose and energy, is enervating and exhausting. In this article, the author describes the value-added work of trustees in high-performing boards and discusses the role boards of trustees play as the process of continuous strategic planning unfolds.…

  13. What is strategic management?

    PubMed

    Jasper, Melanie; Crossan, Frank

    2012-10-01

    To discuss the theoretical concept of strategic management and explore its relevance for healthcare organisations and nursing management. Despite being a relatively new approach, the growth of strategic management within organisations has been consistently and increasingly promoted. However, comprehensive definitions are scarce and commonalities of interpretation are limited. This paper presents an exploratory discussion of the construct of strategic management, drawing on the literature and questioning its relevance within health-care organisations. Literature relating to strategic management across a number of fields was accessed, drawing primarily on meta-studies within management literature, to identify key concepts and attempt to present a consistent definition. The concept within health care is explored in relation to nursing management. Inconsistency in definitions and utilisation of key concepts within this management approach results in the term being loosely applied in health-care organisations without recourse to foundational principles and a deep understanding of the approach as a theory as opposed to an applied term. Nurse managers are increasingly asked to adopt the 'next-best-thing' in managerial theories, yet caution needs to be taken in nurses agreeing to use systems that lack an evidence base in terms of both efficacy and relevance of context. © 2012 Blackwell Publishing Ltd.

  14. Accelerating Advanced MRI Reconstructions on GPUs

    PubMed Central

    Stone, S.S.; Haldar, J.P.; Tsao, S.C.; Hwu, W.-m.W.; Sutton, B.P.; Liang, Z.-P.

    2008-01-01

    Computational acceleration on graphics processing units (GPUs) can make advanced magnetic resonance imaging (MRI) reconstruction algorithms attractive in clinical settings, thereby improving the quality of MR images across a broad spectrum of applications. This paper describes the acceleration of such an algorithm on NVIDIA’s Quadro FX 5600. The reconstruction of a 3D image with 128³ voxels achieves up to 180 GFLOPS and requires just over one minute on the Quadro, while reconstruction on a quad-core CPU is twenty-one times slower. Furthermore, relative to the true image, the error exhibited by the advanced reconstruction is only 12%, while conventional reconstruction techniques incur error of 42%. PMID:21796230

  15. Accelerating Advanced MRI Reconstructions on GPUs.

    PubMed

    Stone, S S; Haldar, J P; Tsao, S C; Hwu, W-M W; Sutton, B P; Liang, Z-P

    2008-10-01

    Computational acceleration on graphics processing units (GPUs) can make advanced magnetic resonance imaging (MRI) reconstruction algorithms attractive in clinical settings, thereby improving the quality of MR images across a broad spectrum of applications. This paper describes the acceleration of such an algorithm on NVIDIA's Quadro FX 5600. The reconstruction of a 3D image with 128³ voxels achieves up to 180 GFLOPS and requires just over one minute on the Quadro, while reconstruction on a quad-core CPU is twenty-one times slower. Furthermore, relative to the true image, the error exhibited by the advanced reconstruction is only 12%, while conventional reconstruction techniques incur error of 42%.

  16. Strategic management of technostress. The chaining of Prometheus.

    PubMed

    Caro, D H; Sethi, A S

    1985-12-01

    The article proposes the concept of technostress and makes a strong recommendation for conducting research based on key researchable hypotheses. A conceptual framework of technostress is suggested to provide some focus to future research. A number of technostress management strategies are put forward, including strategic technological planning, organization culture development, technostress monitoring systems, and technouser self-development programs. The management of technostress is compared to the chaining of Prometheus, which, left uncontrolled, can create havoc in an organization. The authors believe that organizations have a responsibility to introduce, diffuse, and manage computer technology in such a way that it is congruent with the principles of sound, supportive, and humanistic management.

  17. Object-oriented design for accelerator control

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stok, P.D.V. van der; Berk, F. van den; Deckers, R.

    1994-02-01

    An object-oriented design for the distributed computer control system of the accelerator ring EUTERPE is presented. Because of the experimental nature of the ring, flexibility is of the utmost importance. The object-oriented principles have contributed considerably to the flexibility of the design incorporating multiple views, multi-level access and distributed surveillance.

  18. Strategic Planning and Online Learning

    ERIC Educational Resources Information Center

    McLaughlin-Graham, Karen; Berge, Zane L.

    2005-01-01

    Strategic planning is a critical part of sustaining distance education. Through such planning, the organization can solve business problems that involve training and education in an effective and often cost savings manner compared to in-person training efforts. This paper examines the strategic planning process as it relates to sustaining distance…

  19. Strategic business planning linking strategy with financial reality.

    PubMed

    Bachrodt, Andrew K; Smyth, J Patrick

    2004-11-01

    To succeed in today's complex and often adverse business environment, a healthcare organization's strategic direction must be calculated, focused, and financially sustainable. Strategic business planning is an essential tool to help organizations focus strategic choices within the financial realities of their environment. An effective strategic business planning cycle includes conducting an assessment, identifying business objectives, developing strategy, conducting an impact analysis, and developing an implementation plan.

  20. Structuring Assignments to Improve Understanding and Presentation Skills: Experiential Learning in the Capstone Strategic Management Team Presentation

    ERIC Educational Resources Information Center

    Helms, Marilyn M.; Whitesell, Melissa

    2017-01-01

    In the strategic management course, students select, analyze, and present viable future alternatives based on information provided in cases or computer simulations. Rather than understanding the entire process, the student's focus is on the final presentation. Chickering's (1977) research on active learning suggests students learn more effectively…

  1. Acceleration of saddle-point searches with machine learning

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Peterson, Andrew A., E-mail: andrew-peterson@brown.edu

    In atomistic simulations, the location of the saddle point on the potential-energy surface (PES) gives important information on transitions between local minima, for example, via transition-state theory. However, the search for saddle points often involves hundreds or thousands of ab initio force calls, which are typically all done at full accuracy. This results in the vast majority of the computational effort being spent calculating the electronic structure of states not important to the researcher, and very little time performing the calculation of the saddle point state itself. In this work, we describe how machine learning (ML) can reduce the number of intermediate ab initio calculations needed to locate saddle points. Since machine-learning models can learn from, and thus mimic, atomistic simulations, the saddle-point search can be conducted rapidly in the machine-learning representation. The saddle-point prediction can then be verified by an ab initio calculation; if it is incorrect, this strategically identifies regions of the PES where the machine-learning representation has insufficient training data. When these training data are used to improve the machine-learning model, the estimates greatly improve. This approach can be systematized, and in two simple example problems we demonstrate a dramatic reduction in the number of ab initio force calls. We expect that this approach and future refinements will greatly accelerate searches for saddle points, as well as other searches on the potential energy surface, as machine-learning methods see greater adoption by the atomistics community.
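
    The accelerate-and-verify loop can be sketched in a few lines: search on a cheap surrogate, verify the candidate with the expensive function, and add the verified point to the training set when the surrogate was wrong. The sketch below uses a 1D minimum search and a Gaussian-process surrogate purely for illustration; the paper's saddle-point searches and ML model are more involved, and the target function here is an assumption.

```python
# Minimal sketch of an accelerate-and-verify surrogate loop (a 1D minimum
# search standing in for a saddle-point search; not the paper's method).
import numpy as np
from scipy.optimize import minimize_scalar
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

def expensive_energy(x):
    # Stand-in for an ab initio energy evaluation
    return 0.1 * x**2 + np.sin(2.0 * x)

X = list(np.linspace(-3, 3, 5))                 # initial training data
Y = [expensive_energy(x) for x in X]

for it in range(10):
    gp = GaussianProcessRegressor(kernel=RBF(length_scale=1.0),
                                  normalize_y=True).fit(
        np.array(X).reshape(-1, 1), Y)
    # Search on the cheap surrogate, not on the expensive function
    res = minimize_scalar(lambda x: gp.predict([[x]])[0],
                          bounds=(-3, 3), method="bounded")
    x_cand = res.x
    y_true = expensive_energy(x_cand)           # single verification call
    if abs(y_true - gp.predict([[x_cand]])[0]) < 1e-3:
        break                                   # surrogate agrees: accept
    X.append(x_cand); Y.append(y_true)          # otherwise add data, retrain

print(f"converged after {it + 1} surrogate searches: x = {x_cand:.3f}, "
      f"E = {y_true:.3f}, expensive calls = {len(X)}")
```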

  2. The French Strategic Dilemma.

    DTIC Science & Technology

    1984-03-01

    This paper identifies a basic strategic dilemma for France. On the one hand, French leaders identify the political purpose... are inducing a need for France to provide a more explicit definition of the role of French nuclear weapons in the West European security system. In... defense policy in France. This section concludes with an assessment of the alternative scenarios for the evolution of French defense policy in the 1980s.

  3. Impaired strategic decision making in schizophrenia.

    PubMed

    Kim, Hyojin; Lee, Daeyeol; Shin, Young-Min; Chey, Jeanyung

    2007-11-14

    Adaptive decision making in dynamic social settings requires frequent re-evaluation of choice outcomes and revision of strategies. This requires an array of multiple cognitive abilities, such as working memory and response inhibition. Thus, the disruption of such abilities in schizophrenia can have significant implications for social dysfunctions in affected patients. In the present study, 20 schizophrenia patients and 20 control subjects completed two computerized binary decision-making tasks. In the first task, the participants played a competitive zero-sum game against a computer in which the predictable choice behavior was penalized and the optimal strategy was to choose the two targets stochastically. In the second task, the expected payoffs of the two targets were fixed and unaffected by the subject's choices, so the optimal strategy was to choose the target with the higher expected payoff exclusively. The schizophrenia patients earned significantly less money during the first task, even though their overall choice probabilities were not significantly different from the control subjects. This was mostly because patients were impaired in integrating the outcomes of their previous choices appropriately in order to maintain the optimal strategy. During the second task, the choices of patients and control subjects displayed more similar patterns. This study elucidated the specific components in strategic decision making that are impaired in schizophrenia. The deficit, which can be characterized as strategic stiffness, may have implications for the poor social adjustment in schizophrenia patients.

  4. A Primer on Strategic Financial Assessments.

    ERIC Educational Resources Information Center

    Richman, Naomi; Fitzgerald, Susan

    2003-01-01

    Describes how to perform a strategic financial assessment to enable the board to understand the fundamental internal and external challenges and opportunities confronting the institution when decision making and strategic capital planning. (EV)

  5. 12 CFR 563e.27 - Strategic plan.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    Title 12 (Banks and Banking), § 563e.27 Strategic plan; Standards for Assessing Performance. (a) Alternative election. The OTS will assess a... strategic plan if: (1) The savings association has submitted the plan to the OTS as provided for in this...

  6. Promise or Peril: The Strategic Defense Initiative.

    ERIC Educational Resources Information Center

    Brzezinski, Zbigniew, Ed.; And Others

    The major policy debate touched off by President Reagan's March 1983 speech announcing the Strategic Defense Initiative (SDI) was the reopening of one that had begun 35 years before. Then and now the ultimate question is what kind of strategic posture is most likely to contribute to mutual strategic stability? The answer is central to national…

  7. Production Level CFD Code Acceleration for Hybrid Many-Core Architectures

    NASA Technical Reports Server (NTRS)

    Duffy, Austen C.; Hammond, Dana P.; Nielsen, Eric J.

    2012-01-01

    In this work, a novel graphics processing unit (GPU) distributed sharing model for hybrid many-core architectures is introduced and employed in the acceleration of a production-level computational fluid dynamics (CFD) code. The latest generation graphics hardware allows multiple processor cores to simultaneously share a single GPU through concurrent kernel execution. This feature has allowed the NASA FUN3D code to be accelerated in parallel with up to four processor cores sharing a single GPU. For codes to scale and fully use resources on these and the next generation machines, codes will need to employ some type of GPU sharing model, as presented in this work. Findings include the effects of GPU sharing on overall performance. A discussion of the inherent challenges that parallel unstructured CFD codes face in accelerator-based computing environments is included, with considerations for future generation architectures. This work was completed by the author in August 2010, and reflects the analysis and results of the time.

  8. Accelerating separable footprint (SF) forward and back projection on GPU

    NASA Astrophysics Data System (ADS)

    Xie, Xiaobin; McGaffin, Madison G.; Long, Yong; Fessler, Jeffrey A.; Wen, Minhua; Lin, James

    2017-03-01

    Statistical image reconstruction (SIR) methods for X-ray CT can improve image quality and reduce radiation dosages over conventional reconstruction methods, such as filtered back projection (FBP). However, SIR methods require much longer computation time. The separable footprint (SF) forward and back projection technique simplifies the calculation of intersecting volumes of image voxels and finite-size beams in a way that is both accurate and efficient for parallel implementation. We propose a new method to accelerate the SF forward and back projection on GPU with NVIDIA's CUDA environment. For the forward projection, we parallelize over all detector cells. For the back projection, we parallelize over all 3D image voxels. The simulation results show that the proposed method is faster than the acceleration method of the SF projectors proposed by Wu and Fessler. We further accelerate the proposed method using multiple GPUs. The results show that the computation time is reduced approximately proportional to the number of GPUs.

  9. Computing Services Planning, Downsizing, and Organization at the University of Alberta.

    ERIC Educational Resources Information Center

    Beltrametti, Monica

    1993-01-01

    In a six-month period, the University of Alberta (Canada) campus computing services department formulated a strategic plan, and downsized and reorganized to meet financial constraints and respond to changing technology, especially distributed computing. The new department is organized to react more effectively to trends in technology and user…

  10. A Strategic Culture Assessment of the Transatlantic Divide

    DTIC Science & Technology

    2008-03-01

    ...security divide through the strategic culture lens, taking a comparative case study approach. It analyzes the emergent EU strategic culture by looking... utilize the strategic culture approach in the ensuing case study comparisons. B. WHY THE USE OF STRATEGIC CULTURE? In a study published in 2004... analysis use a comparative cultural approach when a previous comparison of U.S. and EU behavior found these actors' behavior most aligned with realism's

  11. Hospital strategic preparedness planning: the new imperative.

    PubMed

    Ginter, Peter M; Duncan, W Jack; Abdolrasulnia, Maziar

    2007-01-01

    Strategic preparedness planning is an important new imperative for many hospitals. Strategic preparedness planning goes beyond traditional product/market strategic planning by focusing on disaster prevention, containment, and response roles. Hospitals, because of their unique mission, size, complexity, the types of materials they handle, and the types of patients they encounter, are especially vulnerable to natural and human-initiated disasters. In addition, when disasters occur, hospitals must develop well-conceived first responder (receiver) strategies. This paper argues the case for strategic preparedness planning for hospitals and proposes a process for this relatively new and much needed type of planning.

  12. The application of artificial intelligent techniques to accelerator operations at McMaster University

    NASA Astrophysics Data System (ADS)

    Poehlman, W. F. S.; Garland, Wm. J.; Stark, J. W.

    1993-06-01

    In an era of downsizing and a limited pool of skilled accelerator personnel from which to draw replacements for an aging workforce, the impetus to integrate intelligent computer automation into the accelerator operator's repertoire is strong. However, successful deployment of an "Operator's Companion" is not trivial. Both graphical and human factors need to be recognized as critical areas that require extra care when formulating the Companion. These include an interactive graphical user interface that mimics, for the operator, familiar accelerator controls; knowledge acquisition during development that acknowledges the expert's mental model of machine operation; and automated operations that are presented as improvements to the operator's environment rather than threats of ultimate replacement. Experiences with the PACES Accelerator Operator Companion developed at two sites over the past three years are related and graphical examples are given. The scale of the work involves multi-computer control of various start-up/shutdown and tuning procedures for Model FN and KN Van de Graaff accelerators. The response from licensing agencies has been encouraging.

  13. AESS: Accelerated Exact Stochastic Simulation

    NASA Astrophysics Data System (ADS)

    Jenkins, David D.; Peterson, Gregory D.

    2011-12-01

    The Stochastic Simulation Algorithm (SSA) developed by Gillespie provides a powerful mechanism for exploring the behavior of chemical systems with small species populations or with important noise contributions. Gene circuit simulations for systems biology commonly employ the SSA method, as do ecological applications. This algorithm tends to be computationally expensive, so researchers seek an efficient implementation of SSA. In this program package, the Accelerated Exact Stochastic Simulation Algorithm (AESS) contains optimized implementations of Gillespie's SSA that improve the performance of individual simulation runs or ensembles of simulations used for sweeping parameters or to provide statistically significant results. Program summary: Program title: AESS. Catalogue identifier: AEJW_v1_0. Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEJW_v1_0.html. Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland. Licensing provisions: University of Tennessee copyright agreement. No. of lines in distributed program, including test data, etc.: 10 861. No. of bytes in distributed program, including test data, etc.: 394 631. Distribution format: tar.gz. Programming language: C for processors, CUDA for NVIDIA GPUs. Computer: Developed and tested on various x86 computers and NVIDIA C1060 Tesla and GTX 480 Fermi GPUs. The system targets x86 workstations, optionally with multicore processors or NVIDIA GPUs as accelerators. Operating system: Tested under Ubuntu Linux OS and CentOS 5.5 Linux OS. Classification: 3, 16.12. Nature of problem: Simulation of chemical systems, particularly with low species populations, can be accurately performed using Gillespie's method of stochastic simulation. Numerous variations on the original stochastic simulation algorithm have been developed, including approaches that produce results with statistics that exactly match the chemical master equation (CME) as well as other approaches that approximate the CME. Solution
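
    The algorithm that AESS accelerates is Gillespie's direct method: draw the time to the next reaction from an exponential distribution whose rate is the total propensity, then pick which reaction fires in proportion to its propensity. The sketch below applies it to a toy reversible isomerization A <-> B; the rate constants and initial populations are assumptions.

```python
# Minimal NumPy sketch of Gillespie's direct SSA method applied to a toy
# reversible isomerization A <-> B (illustration only; rates are assumptions).
import numpy as np

rng = np.random.default_rng(4)
k_f, k_r = 1.0, 0.5              # rate constants for A -> B and B -> A
state = np.array([100, 0])       # molecule counts [A, B]
stoich = np.array([[-1, +1],     # A -> B
                   [+1, -1]])    # B -> A
t, t_end = 0.0, 10.0
trajectory = [(t, *state)]

while t < t_end:
    propensities = np.array([k_f * state[0], k_r * state[1]])
    a0 = propensities.sum()
    if a0 == 0.0:
        break                                    # no reaction can fire
    t += rng.exponential(1.0 / a0)               # time to next reaction
    reaction = rng.choice(2, p=propensities / a0)   # which reaction fires
    state = state + stoich[reaction]
    trajectory.append((t, *state))

print("steps:", len(trajectory) - 1, " final state (A, B):", tuple(state))
```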

  14. Strategic Planning Is an Oxymoron

    ERIC Educational Resources Information Center

    Bassett, Patrick F.

    2012-01-01

    The thinking on "strategic thinking" has evolved significantly over the years. In the previous century, the independent school strategy was to focus on long-range planning, blithely projecting 10 years into the future. For decades this worked well enough, but in the late 20th century, independent schools shifted to "strategic planning," with its…

  15. 77 FR 25706 - Notice of Advisory Committee Closed Meeting; U.S. Strategic Command Strategic Advisory Group

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-05-01

    ..., Command and Control, Science and Technology, Missile Defense. Meeting Accessibility: Pursuant to 5 U.S.C... DEPARTMENT OF DEFENSE Notice of Advisory Committee Closed Meeting; U.S. Strategic Command Strategic Advisory Group AGENCY: Department of Defense. ACTION: Notice of Advisory Committee closed meeting...

  16. A preliminary design of the collinear dielectric wakefield accelerator

    NASA Astrophysics Data System (ADS)

    Zholents, A.; Gai, W.; Doran, S.; Lindberg, R.; Power, J. G.; Strelnikov, N.; Sun, Y.; Trakhtenberg, E.; Vasserman, I.; Jing, C.; Kanareykin, A.; Li, Y.; Gao, Q.; Shchegolkov, D. Y.; Simakov, E. I.

    2016-09-01

    A preliminary design of the multi-meter long collinear dielectric wakefield accelerator that achieves a highly efficient transfer of the drive bunch energy to the wakefields and to the witness bunch is considered. It is made from 0.5 m long accelerator modules containing a vacuum chamber with dielectric-lined walls, a quadrupole wiggler, an rf coupler, and BPM assembly. The single bunch breakup instability is a major limiting factor for accelerator efficiency, and the BNS damping is applied to obtain the stable multi-meter long propagation of a drive bunch. Numerical simulations using a 6D particle tracking computer code are performed and tolerances to various errors are defined.

  17. USGS Information Technology Strategic Plan: Fiscal Years 2007-2011

    USGS Publications Warehouse

    ,

    2006-01-01

    Introduction: The acquisition, management, communication, and long-term stewardship of natural science data, information, and knowledge are fundamental mission responsibilities of the U.S. Geological Survey (USGS). USGS scientists collect, maintain, and exchange raw scientific data and interpret and analyze it to produce a wide variety of science-based products. Managers throughout the Bureau access, summarize, and analyze administrative or business-related information to budget, plan, evaluate, and report on programs and projects. Information professionals manage the extensive and growing stores of irreplaceable scientific information and knowledge in numerous databases, archives, libraries, and other digital and nondigital holdings. Information is the primary currency of the USGS, and it flows to scientists, managers, partners, and a wide base of customers, including local, State, and Federal agencies, private sector organizations, and individual citizens. Supporting these information flows is an infrastructure of computer systems, telecommunications equipment, software applications, digital and nondigital data stores and archives, technical expertise, and information policies and procedures. This infrastructure has evolved over many years and consists of tools and technologies acquired or built to address the specific requirements of particular projects or programs. Developed independently, the elements of this infrastructure were typically not designed to facilitate the exchange of data and information across programs or disciplines, to allow for sharing of information resources or expertise, or to be combined into a Bureauwide and broader information infrastructure. The challenge to the Bureau is to wisely and effectively use its information resources to create a more Integrated Information Environment that can reduce costs, enhance the discovery and delivery of scientific products, and improve support for science. This Information Technology Strategic Plan

  18. Strategic Purchasing in Practice: Comparing Ten European Countries.

    PubMed

    Klasa, Katarzyna; Greer, Scott L; van Ginneken, Ewout

    2018-02-05

    Strategic purchasing of health care services is widely recommended as a policy instrument. We conducted a review of literature, drawing on material from the European Observatory on Health Systems and Policies Health Systems in Transition series, other European Observatory databases, and selected country-specific literature to augment the comparative analysis by providing the most recent healthcare trends in ten selected countries. There is little evidence of purchasing being strategic according to any of the established definitions. There is little or no literature suggesting that existing purchasing mechanisms in Europe deliver improved population health, citizen empowerment, stronger governance and stewardship, or develop purchaser organization and capacity. Strategic purchasing has not generally been implemented. Policymakers considering adopting strategic purchasing policies should be aware of this systemic implementation problem. Policymakers in systems with strategic purchasing built into policy should not assume that a purchasing system is strategic or that it is delivering any expected objectives. However, there are individual components of strategic purchasing that are worth pursuing and can provide benefits to health systems. Copyright © 2018. Published by Elsevier B.V.

  19. Thinking strategically about capitation.

    PubMed

    Boland, P

    1997-05-01

    All managed care stakeholders--health plan members, employers, providers, community organizations, and government entities--share a common interest in reducing healthcare costs while improving the quality of care health plan members receive. Although capitation is usually thought of primarily as a payment mechanism, it can be a powerful tool that providers and health plans can use to accomplish these strategic objectives and others, such as restoring and maintaining the health of plan members or improving a community's health status. For capitation to work effectively as a strategic tool, its use must be tied to a corporate agenda of partnering with stakeholders to achieve broader strategic goals. Health plans and providers must develop a partnership strategy in which each stakeholder has well-defined roles and responsibilities. The capitation structure must reinforce interdependence, shift the focus from meeting organizational needs to meeting customer needs, and develop risk-driven care strategies.

  20. Design of Linear Accelerator (LINAC) tanks for proton therapy via Particle Swarm Optimization (PSO) and Genetic Algorithm (GA) approaches

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Castellano, T.; De Palma, L.; Laneve, D.

    2015-07-01

    A homemade computer code for designing a Side-Coupled Linear Accelerator (SCL) has been written. It integrates a simplified model of SCL tanks with the Particle Swarm Optimization (PSO) algorithm. The main aim of the computer code is to obtain useful guidelines for the design of Linear Accelerator (LINAC) resonant cavities. The design procedure, assisted by this approach, seems very promising, allowing future improvements towards the optimization of actual accelerating geometries. (authors)
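
    The record does not reproduce the authors' code; the sketch below is only a generic particle swarm optimizer of the kind described, with a toy quadratic objective standing in for the simplified SCL tank model and conventional placeholder coefficients.

      import numpy as np

      def pso(objective, bounds, n_particles=30, iters=200, w=0.7, c1=1.5, c2=1.5, seed=0):
          """Minimal particle swarm optimization over box bounds (stand-in for the
          cavity-design objective used in the paper)."""
          rng = np.random.default_rng(seed)
          lo, hi = bounds
          dim = len(lo)
          x = rng.uniform(lo, hi, size=(n_particles, dim))      # positions
          v = np.zeros_like(x)                                   # velocities
          pbest, pbest_f = x.copy(), np.array([objective(p) for p in x])
          g = pbest[np.argmin(pbest_f)]                          # global best
          for _ in range(iters):
              r1, r2 = rng.random((2, n_particles, dim))
              v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
              x = np.clip(x + v, lo, hi)
              f = np.array([objective(p) for p in x])
              better = f < pbest_f
              pbest[better], pbest_f[better] = x[better], f[better]
              g = pbest[np.argmin(pbest_f)]
          return g, pbest_f.min()

      # Example: minimize a toy quadratic standing in for a resonant-cavity figure of merit.
      best_x, best_f = pso(lambda p: np.sum((p - 0.3) ** 2),
                           bounds=(np.zeros(4), np.ones(4)))
      print(best_x, best_f)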

  1. An Exploration of Strategic Planning Perspectives and Processes within Community Colleges Identified as Being Distinctive in Their Strategic Planning Practices

    ERIC Educational Resources Information Center

    Augustyniak, Lisa J.

    2015-01-01

    Community college leaders face unprecedented change, and some have begun reexamining their institutional strategic planning processes. Yet, studies in higher education strategic planning spend little time examining how community colleges formulate their strategic plans. This mixed-method qualitative study used an expert sampling method to identify…

  2. Accelerator system and method of accelerating particles

    NASA Technical Reports Server (NTRS)

    Wirz, Richard E. (Inventor)

    2010-01-01

    An accelerator system and method that utilize dust as the primary mass flux for generating thrust are provided. The accelerator system can include an accelerator capable of operating in a self-neutralizing mode and having a discharge chamber and at least one ionizer capable of charging dust particles. The system can also include a dust particle feeder that is capable of introducing the dust particles into the accelerator. By applying a pulsed positive and negative charge voltage to the accelerator, the charged dust particles can be accelerated thereby generating thrust and neutralizing the accelerator system.

  3. United States Strategic Plan for International Affairs.

    DTIC Science & Technology

    1998-01-01

    Humanitarian Response 39 Global Issues 41 US Strategic Plan for International Affairs International Affairs Strategic Plan Summary and Introduction...minimize the human costs of conflict and natural disasters. Global Issues : • Secure a sustainable global environment in order to protect the United States...involvement in addressing crises. 40 US Strategic Plan for International Affairs NATIONAL INTEREST: Global Issues The global environment has a

  4. COMBINING DIFFUSIVE SHOCK ACCELERATION WITH ACCELERATION BY CONTRACTING AND RECONNECTING SMALL-SCALE FLUX ROPES AT HELIOSPHERIC SHOCKS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Le Roux, J. A.; Zank, G. P.; Webb, G. M.

    2016-08-10

    Computational and observational evidence is accruing that heliospheric shocks, as emitters of vorticity, can produce downstream magnetic flux ropes and filaments. This led Zank et al. to investigate a new paradigm whereby energetic particle acceleration near shocks is a combination of diffusive shock acceleration (DSA) with downstream acceleration by many small-scale contracting and reconnecting (merging) flux ropes. Using a model where flux-rope acceleration involves a first-order Fermi mechanism due to the mean compression of numerous contracting flux ropes, Zank et al. provide theoretical support for observations that power-law spectra of energetic particles downstream of heliospheric shocks can be harder than predicted by DSA theory and that energetic particle intensities should peak behind shocks instead of at shocks as predicted by DSA theory. In this paper, a more extended formalism of kinetic transport theory developed by le Roux et al. is used to further explore this paradigm. We describe how second-order Fermi acceleration, related to the variance in the electromagnetic fields produced by downstream small-scale flux-rope dynamics, modifies the standard DSA model. The results show that (i) this approach can qualitatively reproduce observations of particle intensities peaking behind the shock, thus providing further support for the new paradigm, and (ii) stochastic acceleration by compressible flux ropes tends to be more efficient than incompressible flux ropes behind shocks in modifying the DSA spectrum of energetic particles.
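
    For reference, the benchmark against which such spectra are judged is the standard test-particle DSA result (a textbook relation restated here, not a result of this paper): a shock of compression ratio r produces a downstream power law

      f(p) \propto p^{-q}, \qquad q = \frac{3r}{r-1},

    so a strong shock (r = 4) gives q = 4; spectra with q < 4 are "harder than DSA" and call for an additional mechanism such as the flux-rope acceleration discussed above.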

  5. Modern Computational Techniques for the HMMER Sequence Analysis

    PubMed Central

    2013-01-01

    This paper focuses on the latest research and critical reviews on modern computing architectures, software and hardware accelerated algorithms for bioinformatics data analysis with an emphasis on one of the most important sequence analysis applications—hidden Markov models (HMM). We show the detailed performance comparison of sequence analysis tools on various computing platforms recently developed in the bioinformatics society. The characteristics of the sequence analysis, such as data and compute-intensive natures, make it very attractive to optimize and parallelize by using both traditional software approach and innovated hardware acceleration technologies. PMID:25937944
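
    As context for why HMM-based sequence analysis is compute-intensive, the sketch below shows the core dynamic-programming recursion (Viterbi decoding) over a toy two-state model; it is illustrative only and is not taken from HMMER or any of the surveyed tools.

      import numpy as np

      def viterbi(obs, log_start, log_trans, log_emit):
          """Most likely hidden-state path for a discrete-emission HMM (log domain)."""
          n_states = log_start.shape[0]
          T = len(obs)
          dp = np.full((T, n_states), -np.inf)   # best log-probability ending in each state
          back = np.zeros((T, n_states), dtype=int)
          dp[0] = log_start + log_emit[:, obs[0]]
          for t in range(1, T):
              for s in range(n_states):
                  scores = dp[t - 1] + log_trans[:, s]
                  back[t, s] = np.argmax(scores)
                  dp[t, s] = scores[back[t, s]] + log_emit[s, obs[t]]
          path = [int(np.argmax(dp[-1]))]
          for t in range(T - 1, 0, -1):          # trace the best path backwards
              path.append(int(back[t, path[-1]]))
          return path[::-1]

      # Toy two-state, two-symbol model with placeholder probabilities.
      log = np.log
      path = viterbi(obs=[0, 1, 1, 0],
                     log_start=log([0.6, 0.4]),
                     log_trans=log([[0.8, 0.2], [0.3, 0.7]]),
                     log_emit=log([[0.9, 0.1], [0.2, 0.8]]))
      print(path)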

  6. Programming and Runtime Support to Blaze FPGA Accelerator Deployment at Datacenter Scale

    PubMed Central

    Huang, Muhuan; Wu, Di; Yu, Cody Hao; Fang, Zhenman; Interlandi, Matteo; Condie, Tyson; Cong, Jason

    2017-01-01

    With the end of CPU core scaling due to dark silicon limitations, customized accelerators on FPGAs have gained increased attention in modern datacenters due to their lower power, high performance and energy efficiency. Evidenced by Microsoft’s FPGA deployment in its Bing search engine and Intel’s $16.7 billion acquisition of Altera, integrating FPGAs into datacenters is considered one of the most promising approaches to sustain future datacenter growth. However, it is quite challenging for existing big data computing systems—like Apache Spark and Hadoop—to access the performance and energy benefits of FPGA accelerators. In this paper we design and implement Blaze to provide programming and runtime support for enabling easy and efficient deployments of FPGA accelerators in datacenters. In particular, Blaze abstracts FPGA accelerators as a service (FaaS) and provides a set of clean programming APIs for big data processing applications to easily utilize those accelerators. Our Blaze runtime implements an FaaS framework to efficiently share FPGA accelerators among multiple heterogeneous threads on a single node, and extends Hadoop YARN with accelerator-centric scheduling to efficiently share them among multiple computing tasks in the cluster. Experimental results using four representative big data applications demonstrate that Blaze greatly reduces the programming efforts to access FPGA accelerators in systems like Apache Spark and YARN, and improves the system throughput by 1.7× to 3× (and energy efficiency by 1.5× to 2.7×) compared to a conventional CPU-only cluster. PMID:28317049

  7. Metacognitive Support Accelerates Computer Assisted Learning for Novice Programmers

    ERIC Educational Resources Information Center

    Rum, Siti Nurulain Mohd; Ismail, Maizatul Akmar

    2017-01-01

    Computer programming is a part of the curriculum in computer science education, and high drop rates for this subject are a universal problem. Development of metacognitive skills, including the conceptual framework provided by socio-cognitive theories that afford reflective thinking, such as actively monitoring, evaluating, and modifying one's…

  8. Effects of online cone-beam computed tomography with active breath control in determining planning target volume during accelerated partial breast irradiation.

    PubMed

    Li, Y; Zhong, R; Wang, X; Ai, P; Henderson, F; Chen, N; Luo, F

    2017-04-01

    To test if active breath control during cone-beam computed tomography (CBCT) could improve planning target volume during accelerated partial breast radiotherapy for breast cancer. Patients who were more than 40 years old, had undergone breast-conserving dissection, were planned for accelerated partial breast irradiation, and had postoperative staging limited to T1-2 N0 M0, or a postoperative T2 lesion no larger than 3 cm with a negative surgical margin greater than 2 mm, were enrolled. Patients with lobular carcinoma or extensive ductal carcinoma in situ were excluded. CBCT images were obtained pre-correction, post-correction and post-treatment. Set-up errors were recorded in the left-right, anterior-posterior and superior-inferior directions. The differences between these CBCT images, as well as calculated radiation doses, were compared between patients with active breath control and patients with free breathing. Forty patients were enrolled, among them 25 had active breath control. A total of 836 CBCT images were obtained for analysis. CBCT significantly reduced the planning target volume. However, active breath control did not show a significant benefit in decreasing the planning target volume margin or the doses to organs at risk when compared to free breathing. CBCT, but not active breath control, could reduce planning target volume during accelerated partial breast irradiation. Copyright © 2017 Société française de radiothérapie oncologique (SFRO). Published by Elsevier SAS. All rights reserved.

  9. Optimization of a Small Scale Linear Reluctance Accelerator

    NASA Astrophysics Data System (ADS)

    Barrera, Thor; Beard, Robby

    2011-11-01

    Reluctance accelerators are a promising future method of transportation. Several problems still plague these devices, most prominently low efficiency. The variables involved in overcoming these efficiency problems are numerous, and it is difficult to determine how each affects the accelerator. This study examined several variables that present challenges in optimizing the efficiency of reluctance accelerators, including coil and projectile design, power supplies, switching, and the elusive gradient inductance problem. Extensive research in these areas, from computational and theoretical work to experiment, has been performed. The findings show that these parameters share significant similarities with transformer design elements; the currently optimized parameters are therefore suggested as a baseline for further research and design. A demonstration of these findings will be offered at the time of presentation.

  10. Microgravity strategic planning exercise

    NASA Technical Reports Server (NTRS)

    Halpern, Richard; Downey, Jim; Harvey, Harold

    1991-01-01

    The Center for Space and Advanced Technology supported a planning exercise for the Microgravity Program management at the Marshall Space Flight Center. The effort focused on the status of microgravity work at MSFC and elsewhere with the objective of preparing a goal-oriented strategic planning document which could be used for informational/brochure purposes. The effort entailed numerous interactions and presentations with Field Center programmatic components and Headquarters personnel. Appropriate material was consolidated in a draft format for a MSFC Strategic Plan.

  11. Timing of birth: Parsimony favors strategic over dysregulated parturition.

    PubMed

    Catalano, Ralph; Goodman, Julia; Margerison-Zilko, Claire; Falconi, April; Gemmill, Alison; Karasek, Deborah; Anderson, Elizabeth

    2016-01-01

    The "dysregulated parturition" narrative posits that the human stress response includes a cascade of hormones that "dysregulates" and accelerates parturition but provides questionable utility as a guide to understand or prevent preterm birth. We offer and test a "strategic parturition" narrative that not only predicts the excess preterm births that dysregulated parturition predicts but also makes testable, sex-specific predictions of the effect of stressful environments on the timing of birth among term pregnancies. We use interrupted time-series modeling of cohorts conceived over 101 months to test for lengthening of early term male gestations in stressed population. We use an event widely reported to have stressed Americans and to have increased the incidence of low birth weight and fetal death across the country-the terrorist attacks of September 2001. We tested the hypothesis that the odds of male infants conceived in December 2000 (i.e., at term in September 2001) being born early as opposed to full term fell below the value expected from those conceived in the 50 prior and 50 following months. We found that term male gestations exposed to the terrorist attacks exhibited 4% lower likelihood of early, as opposed to full or late, term birth. Strategic parturition explains observed data for which the dysregulated parturition narrative offers no prediction-the timing of birth among gestations stressed at term. Our narrative may help explain why findings from studies examining associations between population- and/or individual-level stressors and preterm birth are generally mixed. © 2015 Wiley Periodicals, Inc.

  12. Theater gateway closure: a strategic level barricade

    DTIC Science & Technology

    that at the strategic level the effects are based on the economic and diplomatic elements of the national power, affecting proportionally sustainment...Seven months of detrimental political implications, expensive effects on military operations, and strategic level barricades during 2011 and 2012 in...logistical planners at the strategic level can anticipate or mitigate the effects of a theater gateway closure on military operations. Through two

  13. Mercury BLASTP: Accelerating Protein Sequence Alignment

    PubMed Central

    Jacob, Arpith; Lancaster, Joseph; Buhler, Jeremy; Harris, Brandon; Chamberlain, Roger D.

    2008-01-01

    Large-scale protein sequence comparison is an important but compute-intensive task in molecular biology. BLASTP is the most popular tool for comparative analysis of protein sequences. In recent years, an exponential increase in the size of protein sequence databases has required either exponentially more running time or a cluster of machines to keep pace. To address this problem, we have designed and built a high-performance FPGA-accelerated version of BLASTP, Mercury BLASTP. In this paper, we describe the architecture of the portions of the application that are accelerated in the FPGA, and we also describe the integration of these FPGA-accelerated portions with the existing BLASTP software. We have implemented Mercury BLASTP on a commodity workstation with two Xilinx Virtex-II 6000 FPGAs. We show that the new design runs 11-15 times faster than software BLASTP on a modern CPU while delivering close to 99% identical results. PMID:19492068

  14. Metrics for NASA Aeronautics Research Mission Directorate (ARMD) Strategic Thrust 3B Vertical Lift Strategic Direction

    NASA Technical Reports Server (NTRS)

    Hochstetler, Ronald D.; Salvano, Dan; Gorton, Susan A.

    2017-01-01

    The NASA Aeronautics Research Mission Directorate (ARMD) Strategic Implementation Plan details an ambitious plan for aeronautical research for the next quarter century and beyond. It includes a number of advanced technologies needed to address requirements of the overall aviation community (domestic and international), with an emphasis on safety, efficiency, operational flexibility, and alternative propulsion air transport options. The six ARMD Strategic Thrust Areas (STAs) represent a specific set of multi-decade research agendas for creating the global aviation improvements most in demand by the aviation service consumers and the general public. To provide NASA with a measurement of the preeminent value of these research areas, it was necessary to identify and quantify the measurable benefits to the aviation community from capabilities delivered by the research programs. This paper will describe the processes used and the conclusions reached in defining the principal metrics for ARMD Strategic Thrust Area 3B "Vertical Lift Strategic Direction."

  15. TEACHING PHYSICS: Atwood's machine: experiments in an accelerating frame

    NASA Astrophysics Data System (ADS)

    Teck Chee, Chia; Hong, Chia Yee

    1999-03-01

    Experiments in an accelerating frame are often difficult to perform, but simple computer software allows sufficiently rapid and accurate measurements to be made on an arrangement of weights and pulleys known as Atwood's machine.
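
    For reference, the ideal prediction such measurements are usually compared against (massless string and pulley, no friction) is the textbook relation

      a = \frac{(m_1 - m_2)\,g}{m_1 + m_2}, \qquad T = \frac{2 m_1 m_2\,g}{m_1 + m_2},

    where m_1 and m_2 are the two hanging masses and T is the string tension; if the whole apparatus is itself accelerated vertically, g is replaced by the effective gravitational acceleration in that frame.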

  16. Experiences that develop the ability to think strategically.

    PubMed

    Goldman, Ellen; Cahill, Terrence; Filho, Rubens Pessanha

    2009-01-01

    The ability to think strategically is an admired and a sought-after leadership requirement, yet we know little about how it develops. The purpose of this study is to identify specific experiences that contribute to the development of an individual's ability to think strategically. We identified eight work experiences, including different types of organizational projects, processes, and relationships, that contribute to an individual's strategic thinking ability. We also delineate specific characteristics material to each experience. These characteristics indicate that considerable time and focus are required to develop the ability to think strategically. In addition, the experiences are not all accessed equally: Women are less likely to have nonrelational experiences, while chief executive officers are more likely to have the most challenging ones. In addition, we found differences regarding work-related continuing education activities. Respondents rated nonhealthcare conferences and reading behind all other identified experiences that contribute to strategic thinking ability. Individuals can implement several strategies to improve their strategic thinking ability, including deliberately incorporating the requisite experiences into their development plans, ensuring that the experiences incorporate the required characteristics, and improving the benefit received from attending educational programs in nonhealthcare industries. Organizations can implement several strategies to ensure the experiences are as effective as possible, such as appraising gender differences across the experiences and reviewing the organization's strategic planning processes for the characteristics that best encourage strategic thinking.

  17. A New Type of Plasma Wakefield Accelerator Driven By Magnetowaves

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, Pisin (KIPAC, Menlo Park; Natl. Taiwan U., Taiwan); Chang, Feng-Yin

    2011-09-12

    We present a new concept for a plasma wakefield accelerator driven by magnetowaves (MPWA). This concept was originally proposed as a viable mechanism for the 'cosmic accelerator' that would accelerate cosmic particles to ultra-high energies in the astrophysical setting. Unlike the more familiar plasma wakefield accelerator (PWFA) and the laser wakefield accelerator (LWFA) where the drivers, the charged-particle beam and the laser, are independently existing entities, MPWA invokes the high-frequency and high-speed whistler mode as the driver, which is a medium wave that cannot exist outside of the plasma. Aside from the difference in drivers, the underlying mechanism that excites the plasma wakefield via the ponderomotive potential is common. Our computer simulations show that under appropriate conditions, the plasma wakefield maintains very high coherence and can sustain high-gradient acceleration over many plasma wavelengths. We suggest that in addition to its celestial application, the MPWA concept can also be of terrestrial utility. A proof-of-principle experiment on MPWA would benefit both terrestrial and celestial accelerator concepts.

  18. Strategic Analysis of Terrorism

    NASA Astrophysics Data System (ADS)

    Arce, Daniel G.; Sandler, Todd

    Two areas that are increasingly studied in the game-theoretic literature on terrorism and counterterrorism are collective action and asymmetric information. One contribution of this chapter is a survey and extension of continuous policy models with differentiable payoff functions. In this way, policies can be characterized as strategic substitutes (e.g., proactive measures) or strategic complements (e.g., defensive measures). Mixed substitute-complement models are also introduced. We show that the efficiency of counterterror policy depends upon (i) the strategic substitutes-complements characterization, and (ii) who initiates the action. Surprisingly, in mixed models the dichotomy between individual and collective action may disappear. A second contribution is the consideration of a signaling model where indiscriminate spectacular terrorist attacks may erode terrorists’ support among their constituency, and proactive government responses can create a backlash effect in favor of terrorists. A novel equilibrium of this model reflects the well-documented ineffectiveness of terrorism in achieving its stated goals.

  19. Fluid Physics Under a Stochastic Acceleration Field

    NASA Technical Reports Server (NTRS)

    Vinals, Jorge

    2001-01-01

    The research summarized in this report has involved a combined theoretical and computational study of fluid flow that results from the random acceleration environment present onboard space orbiters, also known as g-jitter. We have focused on a statistical description of the observed g-jitter, on the flows that such an acceleration field can induce in a number of experimental configurations of interest, and on extending previously developed methodology to boundary layer flows. Narrow band noise has been shown to describe many of the features of acceleration data collected during space missions. The scale of baroclinically induced flows when the driving acceleration is random is not given by the Rayleigh number. Spatially uniform g-jitter induces additional hydrodynamic forces among suspended particles in incompressible fluids. Stochastic modulation of the control parameter shifts the location of the onset of an oscillatory instability. Random vibration of solid boundaries leads to separation of boundary layers. Steady streaming ahead of a modulated solid-melt interface enhances solute transport, and modifies the stability boundaries of a planar front.

  20. Strategic Capability Development in the Higher Education Sector

    ERIC Educational Resources Information Center

    Brown, Paul

    2004-01-01

    The research adopts a case study approach (in higher education) to investigate how strategic capabilities might be developed in an organisation through strategic management development (SMD). SMD is defined as "Management development interventions which are intended to enhance the strategic capability and corporate performance of an…

  1. Strategic Leader Development for a 21st Century Army

    DTIC Science & Technology

    2008-04-30

    Fall of Strategic Planning. New York, NY: The Free Press, 1994. Northouse , Peter G. Leadership : Theory and Practice. Thousand Oaks, CA: Sage...TERMS Strategic Leadership ; Strategic Thinking; Contemporary Operational Environment; Adaptability; Self Awareness; Complexity; Officer Education...managing today’s fluid operational environment. The concept of strategic leadership , therefore, must be examined closely in Army doctrine. Social

  2. LHC Computing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lincoln, Don

    The LHC is the world’s highest energy particle accelerator and scientists use it to record an unprecedented amount of data. This data is recorded in electronic format and it requires an enormous computational infrastructure to convert the raw data into conclusions about the fundamental rules that govern matter. In this video, Fermilab’s Dr. Don Lincoln gives us a sense of just how much data is involved and the incredible computer resources that makes it all possible.

  3. Transformation and Change Management for Strategic Leaders

    DTIC Science & Technology

    2002-04-09

    TRANSFORMATION AND CHANGE MANAGEMENT FOR STRATEGIC LEADERS BY MR. KENNETH L. WRIGHT Department of the Army DISTRIBUTION STATEMENT A: Approved for Public...PROJECT TRANSFORMATION AND CHANGE MANAGEMENT FOR STRATEGIC LEADERS BY MR. KENNETH L. WRIGHT DEPARTMENT OF THE ARMY Dr. Robert M. Murphy Project Advisor The...STRATEGIC LEADERS FORMAT: Strategy Research Project DATE: 09 April 2002 PAGES: 33 CLASSIFICATION: Unclassified The objective of this work is to examine

  4. Extracting Time-Accurate Acceleration Vectors From Nontrivial Accelerometer Arrangements.

    PubMed

    Franck, Jennifer A; Blume, Janet; Crisco, Joseph J; Franck, Christian

    2015-09-01

    Sports-related concussions are of significant concern in many impact sports, and their detection relies on accurate measurements of the head kinematics during impact. Among the most prevalent recording technologies are videography, and more recently, the use of single-axis accelerometers mounted in a helmet, such as the HIT system. Successful extraction of the linear and angular impact accelerations depends on an accurate analysis methodology governed by the equations of motion. Current algorithms are able to estimate the magnitude of acceleration and hit location, but make assumptions about the hit orientation and are often limited in the position and/or orientation of the accelerometers. The newly formulated algorithm presented in this manuscript accurately extracts the full linear and rotational acceleration vectors from a broad arrangement of six single-axis accelerometers directly from the governing set of kinematic equations. The new formulation linearizes the nonlinear centripetal acceleration term with a finite-difference approximation and provides a fast and accurate solution for all six components of acceleration over long time periods (>250 ms). The approximation of the nonlinear centripetal acceleration term provides an accurate computation of the rotational velocity as a function of time and allows for reconstruction of a multiple-impact signal. Furthermore, the algorithm determines the impact location and orientation and can distinguish between glancing, high rotational velocity impacts, or direct impacts through the center of mass. Results are shown for ten simulated impact locations on a headform geometry computed with three different accelerometer configurations in varying degrees of signal noise. Since the algorithm does not require simplifications of the actual impacted geometry, the impact vector, or a specific arrangement of accelerometer orientations, it can be easily applied to many impact investigations in which accurate kinematics need
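
    The exact algorithm is specified in the manuscript; the sketch below only illustrates the general linear-system structure such formulations exploit. With the angular velocity omega carried forward from the previous time step (the finite-difference treatment of the centripetal term), each single-axis accelerometer at position r_i with sensing direction n_i contributes one linear equation in the six unknowns (linear and angular acceleration). All geometry and readings here are invented placeholders.

      import numpy as np

      def solve_accels(r, n, meas, omega):
          """Recover linear acceleration a and angular acceleration alpha from six
          single-axis accelerometers, given omega from the previous time step.
          Each sensor measures  m_i = n_i . (a + alpha x r_i + omega x (omega x r_i))."""
          A = np.zeros((len(r), 6))
          b = np.zeros(len(r))
          for i, (ri, ni, mi) in enumerate(zip(r, n, meas)):
              A[i, :3] = ni                      # coefficients of a
              A[i, 3:] = np.cross(ri, ni)        # n . (alpha x r) = alpha . (r x n)
              b[i] = mi - ni @ np.cross(omega, np.cross(omega, ri))
          sol, *_ = np.linalg.lstsq(A, b, rcond=None)
          return sol[:3], sol[3:]                # (a, alpha)

      # Six placeholder sensors on a rigid body (positions, axes, readings are invented).
      rng = np.random.default_rng(1)
      r = rng.normal(size=(6, 3)) * 0.1          # sensor positions [m]
      n = rng.normal(size=(6, 3))
      n /= np.linalg.norm(n, axis=1, keepdims=True)   # unit sensing axes
      a_true, alpha_true, omega = np.array([0., 0., 9.8]), np.array([5., 0., 0.]), np.array([1., 2., 0.])
      meas = np.array([ni @ (a_true + np.cross(alpha_true, ri) + np.cross(omega, np.cross(omega, ri)))
                       for ri, ni in zip(r, n)])
      a_est, alpha_est = solve_accels(r, n, meas, omega)
      print(a_est, alpha_est)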

  5. Strategic Planning for Drought Mitigation Under Climate Change

    NASA Astrophysics Data System (ADS)

    Cai, X.; Zeng, R.; Valocchi, A. J.; Song, J.

    2012-12-01

    Droughts continue to be a major natural hazard and mounting evidence of global warming confronts society with a pressing question: Will climate change aggravate the risk of drought at local scale? It is important to explore what additional risk will be imposed by climate change and what level of strategic measures should be undertaken now to avoid vulnerable situations in the future, given that tactical measures may not avoid large damage. This study addresses the following key questions on strategic planning for drought mitigation under climate change: What combination of strategic and tactical measures will move the societal system response from a vulnerable situation to a resilient one with minimum cost? Are current infrastructures and their operation enough to mitigate the damage of future drought, or do we need in-advance infrastructure expansion for future drought preparedness? To address these questions, this study presents a decision support framework based on a coupled simulation and optimization model. A quasi-physically based watershed model is established for the Frenchman Creek Basin (FCB), part of the Republic River Basin, where groundwater based irrigation plays a significant role in agriculture production and local hydrological cycle. The physical model is used to train a statistical surrogate model, which predicts the watershed responses under future climate conditions. The statistical model replaces the complex physical model in the simulation-optimization framework, which makes the models computationally tractable. Decisions for drought preparedness include traditional short-term tactical measures (e.g. facility operation) and long-term or in-advance strategic measures, which require capital investment. A scenario based three-stage stochastic optimization model assesses the roles of strategic measures and tactical measures in drought preparedness and mitigation. Two benchmark climate prediction horizons, 2040s and 2090s, represent mid-term and

  6. On operator strategic behavior

    NASA Technical Reports Server (NTRS)

    Hancock, P. A.

    1991-01-01

    Deeper and more detailed knowledge as to how human operators such as pilots respond, singly and in groups, to demands on their performance which arise from technical systems will support the manipulation of such systems' design in order to accommodate the foibles of human behavior. Efforts to understand how self-autonomy impacts strategic behavior and such related issues as error generation/recognition/correction are still in their infancy. The present treatment offers both general and aviation-specific definitions of strategic behavior as precursors of prospective investigations.

  7. GASPRNG: GPU accelerated scalable parallel random number generator library

    NASA Astrophysics Data System (ADS)

    Gao, Shuang; Peterson, Gregory D.

    2013-04-01

    Graphics processors represent a promising technology for accelerating computational science applications. Many computational science applications require fast and scalable random number generation with good statistical properties, so they use the Scalable Parallel Random Number Generators library (SPRNG). We present the GPU Accelerated SPRNG library (GASPRNG) to accelerate SPRNG in GPU-based high performance computing systems. GASPRNG includes code for a host CPU and CUDA code for execution on NVIDIA graphics processing units (GPUs) along with a programming interface to support various usage models for pseudorandom numbers and computational science applications executing on the CPU, GPU, or both. This paper describes the implementation approach used to produce high performance and also describes how to use the programming interface. The programming interface allows a user to use GASPRNG the same way as SPRNG on traditional serial or parallel computers as well as to develop tightly coupled programs executing primarily on the GPU. We also describe how to install GASPRNG and use it. To help illustrate linking with GASPRNG, various demonstration codes are included for the different usage models. GASPRNG on a single GPU shows up to 280x speedup over SPRNG on a single CPU core and is able to scale for larger systems in the same manner as SPRNG. Because GASPRNG generates identical streams of pseudorandom numbers as SPRNG, users can be confident about the quality of GASPRNG for scalable computational science applications.
    Catalogue identifier: AEOI_v1_0. Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEOI_v1_0.html. Program obtainable from: CPC Program Library, Queen’s University, Belfast, N. Ireland. Licensing provisions: UTK license. No. of lines in distributed program, including test data, etc.: 167900. No. of bytes in distributed program, including test data, etc.: 1422058. Distribution format: tar.gz. Programming language: C and CUDA. Computer: Any PC or ...

  8. Double-pulse THz radiation bursts from laser-plasma acceleration

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bosch, R. A.

    2006-11-15

    A model is presented for coherent THz radiation produced when an electron bunch undergoes laser-plasma acceleration and then exits axially from a plasma column. Radiation produced when the bunch is accelerated is superimposed with transition radiation from the bunch exiting the plasma. Computations give a double-pulse burst of radiation comparable to recent observations. The duration of each pulse very nearly equals the electron bunch length, while the time separation between pulses is proportional to the distance between the points where the bunch is accelerated and where it exits the plasma. The relative magnitude of the two pulses depends upon the radius of the plasma column. Thus, the radiation bursts may be useful in diagnosing the electron bunch length, the location of the bunch's acceleration, and the plasma radius.

  9. Institute for scientific computing research;fiscal year 1999 annual report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Keyes, D

    2000-03-28

    Large-scale scientific computation, and all of the disciplines that support it and help to validate it, have been placed at the focus of Lawrence Livermore National Laboratory by the Accelerated Strategic Computing Initiative (ASCI). The Laboratory operates the computer with the highest peak performance in the world and has undertaken some of the largest and most compute-intensive simulations ever performed. Computers at the architectural extremes, however, are notoriously difficult to use efficiently. Even such successes as the Laboratory's two Bell Prizes awarded in November 1999 only emphasize the need for much better ways of interacting with the results of large-scale simulations. Advances in scientific computing research have, therefore, never been more vital to the core missions of the Laboratory than at present. Computational science is evolving so rapidly along every one of its research fronts that to remain on the leading edge, the Laboratory must engage researchers at many academic centers of excellence. In FY 1999, the Institute for Scientific Computing Research (ISCR) has expanded the Laboratory's bridge to the academic community in the form of collaborative subcontracts, visiting faculty, student internships, a workshop, and a very active seminar series. ISCR research participants are integrated almost seamlessly with the Laboratory's Center for Applied Scientific Computing (CASC), which, in turn, addresses computational challenges arising throughout the Laboratory. Administratively, the ISCR flourishes under the Laboratory's University Relations Program (URP). Together with the other four Institutes of the URP, it must navigate a course that allows the Laboratory to benefit from academic exchanges while preserving national security. Although FY 1999 brought more than its share of challenges to the operation of an academic-like research enterprise within the context of a national security laboratory, the results declare the challenges well met.

  10. Strategic governance: Addressing neonatal mortality in situations of political instability and weak governance.

    PubMed

    Wise, Paul H; Darmstadt, Gary L

    2015-08-01

    Neonatal mortality is increasingly concentrated globally in situations of conflict and political instability. In 1991, countries with high levels of political instability accounted for approximately 10% of all neonatal deaths worldwide; in 2013, this figure had grown to 31%. This has generated a "grand divergence" between those countries showing progress in neonatal mortality reduction compared to those lagging behind. We present new analyses demonstrating associations of neonatal mortality with political instability (r = 0.55) and poor governance (r = 0.70). However, heterogeneity in these relationships suggests that progress is possible in addressing neonatal mortality even in the midst of political instability and poor governance. In order to address neonatal mortality more effectively in such situations, we must better understand how specific elements of "strategic governance"--the minimal conditions of political stability and governance required for health service implementation--can be leveraged for successful introduction of specific health services. Thus, a more strategic approach to policy and program implementation in situations of conflict and political instability could lead to major accelerations in neonatal mortality reduction globally. However, this will require new cross-disciplinary collaborations among public health professionals, political scientists, and country actors. Copyright © 2015 The Authors. Published by Elsevier Inc. All rights reserved.

  11. Solutions for acceleration measurement in vehicle crash tests

    NASA Astrophysics Data System (ADS)

    Dima, D. S.; Covaciu, D.

    2017-10-01

    Crash tests are useful for validating computer simulations of road traffic accidents. One of the most important parameters measured is the acceleration. The evolution of acceleration versus time during a crash test forms the crash pulse. The correctness of the crash pulse determination depends on the data acquisition system used. Recommendations regarding the instrumentation for impact tests are given in standards, which are focused on the use of accelerometers as impact sensors. The goal of this paper is to present the device and software developed by the authors for data acquisition and processing. The system includes two accelerometers with different input ranges, a processing unit based on a 32-bit microcontroller and a data logging unit with an SD card. Data collected on the card as text files are processed with dedicated software running on personal computers. The processing is based on diagrams and includes the digital filters recommended in standards.
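
    As an illustration of the post-processing step, the sketch below applies a zero-phase low-pass filter to a raw acceleration trace. Impact-test standards such as SAE J211 actually specify channel frequency class (CFC) filters, so the Butterworth filter, cutoff, and synthetic pulse used here are placeholder choices, not the authors' implementation.

      import numpy as np
      from scipy.signal import butter, filtfilt

      def filter_crash_pulse(accel, fs_hz, cutoff_hz=300.0, order=4):
          """Zero-phase Butterworth low-pass as a stand-in for a CFC filter."""
          b, a = butter(order, cutoff_hz / (fs_hz / 2.0))   # normalized cutoff
          return filtfilt(b, a, accel)                      # forward-backward filtering

      # Synthetic example: a half-sine crash pulse plus noise, sampled at 10 kHz.
      fs = 10_000.0
      t = np.arange(0.0, 0.2, 1.0 / fs)
      pulse = 40.0 * np.sin(np.pi * t / 0.1) * (t < 0.1)    # ~40 g, 100 ms half-sine
      raw = pulse + np.random.default_rng(0).normal(0.0, 2.0, t.size)
      smooth = filter_crash_pulse(raw, fs)
      print(smooth.max())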

  12. Strategic Help in User Interfaces for Information Retrieval.

    ERIC Educational Resources Information Center

    Brajnik, Giorgio; Mizzaro, Stefano; Tasso, Carlo; Venuti, Fabio

    2002-01-01

    Discussion of search strategy in information retrieval by end users focuses on the role played by strategic reasoning and design principles for user interfaces. Highlights include strategic help based on collaborative coaching; a conceptual model for strategic help; and a prototype knowledge-based system named FIRE. (Author/LRW)

  13. Robin Newmark - Executive Director for Strategic Initiatives | NREL

    Science.gov Websites

    At NREL, Robin Newmark focuses on the development of size and diversified its support. She previously served as the Director of the Strategic Energy

  14. Generating clock signals for a cycle accurate, cycle reproducible FPGA based hardware accelerator

    DOEpatents

    Asaad, Sameth W.; Kapur, Mohit

    2016-01-05

    A method, system and computer program product are disclosed for generating clock signals for a cycle accurate FPGA based hardware accelerator used to simulate operations of a device-under-test (DUT). In one embodiment, the DUT includes multiple device clocks generating multiple device clock signals at multiple frequencies and at a defined frequency ratio; and the FPGA hardware accelerator includes multiple accelerator clocks generating multiple accelerator clock signals to operate the FPGA hardware accelerator to simulate the operations of the DUT. In one embodiment, operations of the DUT are mapped to the FPGA hardware accelerator, and the accelerator clock signals are generated at multiple frequencies and at the defined frequency ratio of the frequencies of the multiple device clocks, to maintain cycle accuracy between the DUT and the FPGA hardware accelerator. In an embodiment, the FPGA hardware accelerator may be used to control the frequencies of the multiple device clocks.

  15. Managing Uncertainty: Thinking and Planning Strategically.

    ERIC Educational Resources Information Center

    Lorenzo, Albert L.

    1993-01-01

    Argues that rapid change and tight resources demand reality-based planning, rather than planning models that ignore internal and external customers or emphasize process over product. Describes the Strategic Guidance Model (SGM) which provides colleges with strategic visioning, organizational assessment, environmental scanning, quality improvement,…

  16. Particle Accelerator Focus Automation

    NASA Astrophysics Data System (ADS)

    Lopes, José; Rocha, Jorge; Redondo, Luís; Cruz, João

    2017-08-01

    The Laboratório de Aceleradores e Tecnologias de Radiação (LATR) at the Campus Tecnológico e Nuclear of Instituto Superior Técnico (IST) has a horizontal electrostatic particle accelerator based on the Van de Graaff machine which is used for research in the area of material characterization. This machine produces alpha (He+) and proton (H+) beams with currents of a few μA at energies up to 2 MeV/q. Beam focusing is obtained using a cylindrical lens of the Einzel type, assembled near the high voltage terminal. This paper describes the developed system that automatically focuses the ion beam, using a personal computer running the LabVIEW software, a multifunction input/output board and signal conditioning circuits. The focusing procedure consists of a scanning method to find the lens bias voltage which maximizes the beam current measured on a beam stopper target, which is used as feedback for the scanning cycle. This system, as part of a wider start-up and shut-down automation system built for this particle accelerator, brings great advantages to the operation of the accelerator by making it faster and easier to operate, requiring less human presence, and adding the possibility of full remote control under safe conditions.
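
    The published system is implemented in LabVIEW; the Python sketch below only illustrates the scanning logic described (sweep the Einzel lens bias and keep the voltage that maximizes the beam-stopper current). The functions set_lens_voltage and read_beam_current are hypothetical placeholders for the instrument I/O, and the simulated response is an invented model, not real beam optics.

      import numpy as np

      def autofocus(set_lens_voltage, read_beam_current,
                    v_min=0.0, v_max=20_000.0, coarse_steps=50, settle_readings=5):
          """Coarse scan of the Einzel lens bias, returning the voltage that
          maximizes the averaged beam current on the beam stopper."""
          best_v, best_i = v_min, -np.inf
          for v in np.linspace(v_min, v_max, coarse_steps):
              set_lens_voltage(v)                                   # hypothetical DAC output
              i = np.mean([read_beam_current() for _ in range(settle_readings)])
              if i > best_i:
                  best_v, best_i = v, i
          set_lens_voltage(best_v)
          return best_v, best_i

      # Simulated hardware: current peaks near 12 kV (placeholder model).
      truth = lambda v: np.exp(-((v - 12_000.0) / 2_000.0) ** 2)
      state = {"v": 0.0}
      v_opt, i_opt = autofocus(lambda v: state.update(v=v),
                               lambda: truth(state["v"]) + np.random.default_rng().normal(0, 0.01))
      print(v_opt, i_opt)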

  17. Distance Education Strategy: Mental Models and Strategic Choices

    ERIC Educational Resources Information Center

    Adams, John C.; Seagren, Alan T.

    2004-01-01

    What issues do distance education (DE) leaders believe will influence the future of DE? What are their colleges' DE strategies? This qualitative study compares DE strategic thinking and strategic choices at three community colleges. Two propositions are investigated: (1) each college's DE leaders use common strategic mental models (ways of…

  18. Accelerating 3D Elastic Wave Equations on Knights Landing based Intel Xeon Phi processors

    NASA Astrophysics Data System (ADS)

    Sourouri, Mohammed; Birger Raknes, Espen

    2017-04-01

    In advanced imaging methods like reverse-time migration (RTM) and full waveform inversion (FWI) the elastic wave equation (EWE) is numerically solved many times to create the seismic image or the elastic parameter model update. Thus, it is essential to optimize the solution time for solving the EWE as this will have a major impact on the total computational cost in running RTM or FWI. From a computational point of view, applications implementing EWEs are associated with two major challenges. The first challenge is the amount of memory-bound computations involved, while the second challenge is the execution of such computations over very large datasets. So far, multi-core processors have not been able to tackle these two challenges, which eventually led to the adoption of accelerators such as Graphics Processing Units (GPUs). Compared to conventional CPUs, GPUs are densely populated with many floating-point units and fast memory, a type of architecture that has proven to map well to many scientific computations. Despite its architectural advantages, full-scale adoption of accelerators has yet to materialize. First, accelerators require a significant programming effort imposed by programming models such as CUDA or OpenCL. Second, accelerators come with a limited amount of memory, which also requires explicit data transfers between the CPU and the accelerator over the slow PCI bus. The second generation of the Xeon Phi processor, based on the Knights Landing (KNL) architecture, promises the computational capabilities of an accelerator but requires the same programming effort as traditional multi-core processors. The high computational performance is realized through many integrated cores (number of cores, tiles and memory varies with the model) organized in tiles that are connected via a 2D mesh-based interconnect. In contrast to accelerators, KNL is a self-hosted system, meaning explicit data transfers over the PCI bus are no longer required. However, like most

  19. Highly Productive Application Development with ViennaCL for Accelerators

    NASA Astrophysics Data System (ADS)

    Rupp, K.; Weinbub, J.; Rudolf, F.

    2012-12-01

    The use of graphics processing units (GPUs) for the acceleration of general purpose computations has become very attractive over the last years, and accelerators based on many integrated CPU cores are about to hit the market. However, there are discussions about the benefit of GPU computing when comparing the reduction of execution times with the increased development effort [1]. To counter these concerns, our open-source linear algebra library ViennaCL [2,3] uses modern programming techniques such as generic programming in order to provide a convenient access layer for accelerator and GPU computing. Other GPU-accelerated libraries are primarily tuned for performance, but less tailored to productivity and portability: MAGMA [4] provides dense linear algebra operations via a LAPACK-comparable interface, but no dedicated matrix and vector types. Cusp [5] is closest in functionality to ViennaCL for sparse matrices, but is based on CUDA and thus restricted to devices from NVIDIA. However, no convenience layer for dense linear algebra is provided with Cusp. ViennaCL is written in C++ and uses OpenCL to access the resources of accelerators, GPUs and multi-core CPUs in a unified way. On the one hand, the library provides iterative solvers from the family of Krylov methods, including various preconditioners, for the solution of linear systems typically obtained from the discretization of partial differential equations. On the other hand, dense linear algebra operations are supported, including algorithms such as QR factorization and singular value decomposition. The user application interface of ViennaCL is compatible to uBLAS [6], which is part of the peer-reviewed Boost C++ libraries [7]. This allows to port existing applications based on uBLAS with a minimum of effort to ViennaCL. Conversely, the interface compatibility allows to use the iterative solvers from ViennaCL with uBLAS types directly, thus enabling code reuse beyond CPU-GPU boundaries. Out-of-the-box support

  20. Assessment of Homomorphic Analysis for Human Activity Recognition from Acceleration Signals.

    PubMed

    Vanrell, Sebastian Rodrigo; Milone, Diego Humberto; Rufiner, Hugo Leonardo

    2017-07-03

    Unobtrusive activity monitoring can provide valuable information for medical and sports applications. In recent years, human activity recognition has moved to wearable sensors to deal with unconstrained scenarios. Accelerometers are the preferred sensors due to their simplicity and availability. Previous studies have examined several classic techniques for extracting features from acceleration signals, including time-domain, time-frequency, frequency-domain, and other heuristic features. Spectral and temporal features are the preferred ones and they are generally computed from acceleration components, leaving the potential of the acceleration magnitude unexplored. In this study, based on homomorphic analysis, a new type of feature extraction stage is proposed in order to exploit discriminative activity information present in acceleration signals. Homomorphic analysis can isolate the information about whole body dynamics and translate it into a compact representation, called cepstral coefficients. Experiments have explored several configurations of the proposed features, including size of representation, signals to be used, and fusion with other features. Cepstral features computed from acceleration magnitude obtained one of the highest recognition rates. In addition, a beneficial contribution was found when time-domain and moving pace information was included in the feature vector. Overall, the proposed system achieved a recognition rate of 91.21% on the publicly available SCUT-NAA dataset. To the best of our knowledge, this is the highest recognition rate on this dataset.
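
    A minimal version of the feature-extraction idea (the real cepstrum of the acceleration magnitude, truncated to the first few coefficients) can be sketched as follows; the window length, coefficient count, and synthetic signal are placeholder choices, not the paper's settings.

      import numpy as np

      def cepstral_features(acc_xyz, n_coeffs=12, eps=1e-12):
          """Real cepstrum of the acceleration magnitude, truncated to n_coeffs."""
          magnitude = np.linalg.norm(acc_xyz, axis=1)          # whole-body dynamics signal
          spectrum = np.abs(np.fft.rfft(magnitude))
          cepstrum = np.fft.irfft(np.log(spectrum + eps))      # inverse FFT of the log-spectrum
          return cepstrum[:n_coeffs]

      # Synthetic 3-axis window (~2.5 s at 50 Hz) standing in for a sensor recording.
      t = np.arange(128) / 50.0
      acc = np.stack([np.sin(2 * np.pi * 2.0 * t),             # x
                      0.5 * np.sin(2 * np.pi * 4.0 * t),       # y
                      9.8 + 0.1 * np.random.default_rng(0).normal(size=t.size)],  # z
                     axis=1)
      print(cepstral_features(acc))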

  1. The determinants of strategic thinking in preschool children.

    PubMed

    Brocas, Isabelle; Carrillo, Juan D

    2018-01-01

    Strategic thinking is an essential component of rational decision-making. However, little is known about its developmental aspects. Here we show that preschoolers can reason strategically in simple individual decisions that require anticipating a limited number of future decisions. This ability is transferred only partially to solve more complex individual decision problems and to efficiently interact with others. This ability is also more developed among older children in the classroom. Results indicate that while preschoolers potentially have the capacity to think strategically, it does not always translate into the ability to behave strategically.

  2. The determinants of strategic thinking in preschool children

    PubMed Central

    Brocas, Isabelle

    2018-01-01

    Strategic thinking is an essential component of rational decision-making. However, little is known about its developmental aspects. Here we show that preschoolers can reason strategically in simple individual decisions that require anticipating a limited number of future decisions. This ability is transferred only partially to solve more complex individual decision problems and to efficiently interact with others. This ability is also more developed among older children in the classroom. Results indicate that while preschoolers potentially have the capacity to think strategically, it does not always translate into the ability to behave strategically. PMID:29851954

  3. Strategic avionics technology planning

    NASA Technical Reports Server (NTRS)

    Cox, Kenneth J.; Brown, Don C.

    1991-01-01

    NASA experience in the development and insertion of technology into programs has led to a recognition that a Strategic Plan for Avionics is needed for space. In the fall of 1989 an Avionics Technology Symposium was held in Williamsburg, Virginia. In early 1990, as a follow-on, a NASA-wide Strategic Avionics Technology Working Group was chartered by NASA Headquarters. This paper will describe the objectives of this working group, technology bridging, and approaches to incentivize both the federal and commercial sectors to move toward rapidly developed, simple, and reliable systems with low life cycle cost.

  4. Integrating Risk Management and Strategic Planning

    ERIC Educational Resources Information Center

    Achampong, Francis K.

    2010-01-01

    Strategic planning is critical to ensuring that institutions of higher education thoughtfully and systematically position themselves to accomplish their mission, vision, and strategic goals, particularly when these institutions face a myriad of risks that can negatively impact their continued financial viability and compromise their ability to…

  5. Developing a Model for Strategic Leadership in Schools

    ERIC Educational Resources Information Center

    Davies, Barbara J.; Davies, Brent

    2006-01-01

    Strategic leadership is a critical component in the effective development of schools. Currently the educational debate is shifting to focus on how short-term improvements can become strategically sustainable. This article will put forward the view that renewed attention needs to be paid to the strategic dimension of leadership to ensure this…

  6. Anderson acceleration and application to the three-temperature energy equations

    NASA Astrophysics Data System (ADS)

    An, Hengbin; Jia, Xiaowei; Walker, Homer F.

    2017-10-01

    The Anderson acceleration method is an algorithm for accelerating the convergence of fixed-point iterations, including the Picard method. Anderson acceleration was first proposed in 1965 and has for some years been used successfully to accelerate the convergence of self-consistent field iterations in electronic-structure computations. Recently, the method has attracted growing attention in other application areas and among numerical analysts. Compared with a Newton-like method, an advantage of Anderson acceleration is that there is no need to form the Jacobian matrix, so the method is easy to implement. In this paper, an Anderson-accelerated Picard method is employed to solve the three-temperature energy equations, which are a type of strongly nonlinear radiation-diffusion equation. Two strategies are used to improve the robustness of the Anderson acceleration method. One strategy is to adjust the iterates when necessary to satisfy the physical constraint. The other is to monitor and, if necessary, reduce the condition number of the least-squares problem in the Anderson-acceleration implementation so that numerical stability can be guaranteed. Numerical results show that the Anderson-accelerated Picard method can solve the three-temperature energy equations efficiently. Compared with the Picard method without acceleration, Anderson acceleration reduces the number of iterations by at least half. A comparison between a Jacobian-free Newton-Krylov method, the Picard method, and the Anderson-accelerated Picard method is also conducted in this paper.
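
    To illustrate the Jacobian-free character of the method, the following is a minimal Anderson-accelerated fixed-point loop in Python/numpy. It is a generic sketch, not the paper's implementation, and it omits the two robustness safeguards discussed above (adjusting iterates to satisfy physical constraints and controlling the condition number of the least-squares problem):

        import numpy as np

        def anderson_fixed_point(g, x0, m=5, tol=1e-10, max_iter=200):
            """Anderson acceleration for x = g(x): keep the last m+1 evaluations
            of g and residuals f = g(x) - x, pick mixing weights alpha (summing
            to one) that minimise the combined residual, and take the weighted
            combination of the stored g-values as the next iterate."""
            x = np.asarray(x0, dtype=float)
            G, F = [], []                              # histories of g(x) and residuals
            for _ in range(max_iter):
                gx = g(x)
                f = gx - x
                if np.linalg.norm(f) < tol:
                    return x
                G.append(gx); F.append(f)
                if len(F) > m + 1:                     # bounded memory
                    G.pop(0); F.pop(0)
                if len(F) == 1:
                    x = gx                             # plain Picard step to start
                    continue
                # min ||sum_j alpha_j f_j|| with sum_j alpha_j = 1,
                # solved by eliminating alpha_0 = 1 - sum(alpha_1..).
                A = np.column_stack([fj - F[0] for fj in F[1:]])
                coef, *_ = np.linalg.lstsq(A, -F[0], rcond=None)
                alpha = np.concatenate(([1.0 - coef.sum()], coef))
                x = sum(a * gj for a, gj in zip(alpha, G))
            return x

        # Usage: accelerate a contractive map, e.g. componentwise x = cos(x).
        print(anderson_fixed_point(np.cos, np.ones(4)))   # ~0.739085 in each component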

  7. Effects of Spatial Gradients on Electron Runaway Acceleration

    NASA Technical Reports Server (NTRS)

    MacNeice, Peter; Ljepojevic, N. N.

    1996-01-01

    The runaway process is known to accelerate electrons in many laboratory plasmas and has been suggested as an acceleration mechanism in some astrophysical plasmas, including solar flares. Current calculations of the electron velocity distributions resulting from the runaway process are greatly restricted because they impose spatial homogeneity on the distribution. We have computed runaway distributions which include the consistent development of spatial gradients in the energetic tail. Our solution for the electron velocity distribution is presented as a function of distance along a finite-length acceleration region, and is compared with the equivalent distribution for the infinitely long homogeneous system (i.e., no spatial gradients) considered in the existing literature. All these results are for the weak-field regime. We also discuss the severe restrictiveness of this weak-field assumption.

  8. Reflections on a Strategic Vision for Computer Network Operations

    DTIC Science & Technology

    2010-05-25

    either a traditional or an irregular war. It cannot include the disarmament or destruction of enemy forces or the occupation of its geographic territory...Washington, DC: Chairman of the Joint Chiefs of Staff, 15 August 2007), GL-7. 34 Mr. John Mense, Basic Computer Network Operations Planners Course

  9. Treatment Planning for Accelerator-Based Boron Neutron Capture Therapy

    NASA Astrophysics Data System (ADS)

    Herrera, María S.; González, Sara J.; Minsky, Daniel M.; Kreiner, Andrés J.

    2010-08-01

    Glioblastoma multiforme and metastatic melanoma are frequent brain tumors in adults and presently still incurable diseases. Boron Neutron Capture Therapy (BNCT) is a promising alternative for this kind of pathology. Accelerators have been proposed for BNCT as a way to circumvent the problem of siting reactors in hospitals, and for their relative simplicity and lower cost, among other advantages. Considerable effort is going into the development of accelerator-based BNCT neutron sources in Argentina. Epithermal neutron beams will be produced through appropriate proton-induced nuclear reactions and optimized beam shaping assemblies. Using these sources, computational dose distributions were evaluated in a real patient with diagnosed glioblastoma treated with BNCT. The simulated irradiation was delivered so as to optimize dose to the tumors within the normal-tissue constraints. Using Monte Carlo radiation transport calculations, dose distributions were generated for brain, skin, and tumor. The dosimetry was also studied by computing cumulative dose-volume histograms for the volumes of interest. The results suggest an acceptable average skin dose and a significant dose delivered to the tumor, with a low average whole-brain dose, for irradiation times of less than 60 minutes, indicating good performance of an accelerator-based BNCT treatment.

  10. Treatment Planning for Accelerator-Based Boron Neutron Capture Therapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Herrera, Maria S.; Gonzalez, Sara J.; Minsky, Daniel M.

    2010-08-04

    Glioblastoma multiforme and metastatic melanoma are frequent brain tumors in adults and presently still incurable diseases. Boron Neutron Capture Therapy (BNCT) is a promising alternative for this kind of pathology. Accelerators have been proposed for BNCT as a way to circumvent the problem of siting reactors in hospitals, and for their relative simplicity and lower cost, among other advantages. Considerable effort is going into the development of accelerator-based BNCT neutron sources in Argentina. Epithermal neutron beams will be produced through appropriate proton-induced nuclear reactions and optimized beam shaping assemblies. Using these sources, computational dose distributions were evaluated in a real patient with diagnosed glioblastoma treated with BNCT. The simulated irradiation was delivered so as to optimize dose to the tumors within the normal-tissue constraints. Using Monte Carlo radiation transport calculations, dose distributions were generated for brain, skin, and tumor. The dosimetry was also studied by computing cumulative dose-volume histograms for the volumes of interest. The results suggest an acceptable average skin dose and a significant dose delivered to the tumor, with a low average whole-brain dose, for irradiation times of less than 60 minutes, indicating good performance of an accelerator-based BNCT treatment.
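
    As a small illustration of the cumulative dose-volume histograms mentioned in these records, the numpy sketch below computes one for a toy dose grid and volume of interest; the grid size, dose values, and spherical mask are invented for the example:

        import numpy as np

        def cumulative_dvh(dose, mask, bins=100):
            """Cumulative DVH for one volume of interest: for each dose level D,
            the fraction of the structure receiving at least D."""
            voxel_doses = dose[mask]
            levels = np.linspace(0.0, voxel_doses.max(), bins)
            volume_fraction = np.array([(voxel_doses >= d).mean() for d in levels])
            return levels, volume_fraction

        # Toy 3D dose grid with a spherical "tumor" mask.
        dose = np.random.gamma(shape=2.0, scale=5.0, size=(64, 64, 64))
        z, y, x = np.ogrid[:64, :64, :64]
        tumor = (x - 32)**2 + (y - 32)**2 + (z - 32)**2 < 10**2
        levels, vol = cumulative_dvh(dose, tumor)
        print(levels[vol >= 0.95].max())   # D95: highest dose covering 95% of the volume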

  11. Learn-as-you-go acceleration of cosmological parameter estimates

    NASA Astrophysics Data System (ADS)

    Aslanyan, Grigor; Easther, Richard; Price, Layne C.

    2015-09-01

    Cosmological analyses can be accelerated by approximating slow calculations using a training set, which is either precomputed or generated dynamically. However, this approach is only safe if the approximations are well understood and controlled. This paper surveys issues associated with the use of machine-learning based emulation strategies for accelerating cosmological parameter estimation. We describe a learn-as-you-go algorithm that is implemented in the Cosmo++ code and (1) trains the emulator while simultaneously estimating posterior probabilities; (2) identifies unreliable estimates, computing the exact numerical likelihoods if necessary; and (3) progressively learns and updates the error model as the calculation progresses. We explicitly describe and model the emulation error and show how this can be propagated into the posterior probabilities. We apply these techniques to the Planck likelihood and the calculation of ΛCDM posterior probabilities. The computation is significantly accelerated without a pre-defined training set and uncertainties in the posterior probabilities are subdominant to statistical fluctuations. We have obtained a speedup factor of 6.5 for Metropolis-Hastings and 3.5 for nested sampling. Finally, we discuss the general requirements for a credible error model and show how to update them on-the-fly.
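
    The toy class below sketches the learn-as-you-go idea under strong simplifying assumptions: a nearest-neighbour trust radius and inverse-distance interpolation stand in for the paper's emulator and error model, and exact evaluations are made whenever the estimate is deemed unreliable. It is not the Cosmo++ implementation:

        import numpy as np

        class LearnAsYouGoEmulator:
            """Emulate an expensive log-likelihood, falling back to (and learning
            from) the exact calculation whenever no nearby training point exists."""
            def __init__(self, exact_loglike, trust_radius=0.1):
                self.exact = exact_loglike
                self.trust_radius = trust_radius
                self.params, self.values = [], []

            def __call__(self, theta):
                theta = np.asarray(theta, dtype=float)
                if self.params:
                    pts = np.array(self.params)
                    d = np.linalg.norm(pts - theta, axis=1)
                    if d.min() < self.trust_radius:            # reliable: cheap estimate
                        w = 1.0 / (d + 1e-12)
                        return float(np.dot(w, self.values) / w.sum())
                value = self.exact(theta)                      # unreliable: exact call
                self.params.append(theta)                      # ...and grow the training set
                self.values.append(value)
                return value

        # Usage with a stand-in "expensive" Gaussian log-likelihood.
        emulator = LearnAsYouGoEmulator(lambda t: -0.5 * np.sum(t**2))
        for _ in range(1000):
            emulator(np.random.uniform(-1, 1, size=2))
        print(len(emulator.params), "exact evaluations out of 1000 queries")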

  12. Creating a nursing strategic planning framework based on evidence.

    PubMed

    Shoemaker, Lorie K; Fischer, Brenda

    2011-03-01

    This article describes an evidence-informed strategic planning process and framework used by a Magnet-recognized public health system in California. This article includes (1) an overview of the organization and its strategic planning process, (2) the structure created within nursing for collaborative strategic planning and decision making, (3) the strategic planning framework developed based on the organization's balanced scorecard domains and the new Magnet model, and (4) the process undertaken to develop the nursing strategic priorities. Outcomes associated with the structure, process, and key initiatives are discussed throughout the article. Copyright © 2011 Elsevier Inc. All rights reserved.

  13. The Development of a Strategic Prioritisation Method for Green Supply Chain Initiatives.

    PubMed

    Masoumik, S Maryam; Abdul-Rashid, Salwa Hanim; Olugu, Ezutah Udoncy

    2015-01-01

    To maintain a competitive position, companies are increasingly required to integrate their proactive environmental strategies into their business strategies. The shift from reactive and compliance-based to proactive and strategic environmental management has driven companies to consider the strategic factors while identifying the areas in which they should focus their green initiatives. In previous studies little attention was given to providing the managers with a basis from which they could strategically prioritise these green initiatives across their companies' supply chains. Considering this lacuna in the literature, we present a decision-making method for prioritising green supply chain initiatives aligned with the preferred green strategies alternatives for the manufacturing companies. To develop this method, the study considered a position between determinism and the voluntarism orientation of environmental management involving both external pressures and internal competitive drivers and key resources as decision factors. This decision-making method was developed using the analytic network process (ANP) technique. The elements of the decision model were derived from the literature. The causal relationships among the multiple decision variables were validated based on the results of structural equation modelling (SEM) using a dataset collected from a survey of the ISO 14001-certified manufacturers in Malaysia. A portion of the relative weights required for computation in ANP was also calculated using the SEM results. A case study is presented to demonstrate the applicability of the method.
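
    As a hedged illustration of the priority-weight computation that underlies ANP (and AHP), the sketch below extracts weights from a single pairwise-comparison matrix via its principal eigenvector; the matrix entries and criteria are invented, and the paper additionally derives part of its weights from the SEM results:

        import numpy as np

        def priority_weights(pairwise):
            """Principal-eigenvector priority weights of a pairwise-comparison
            matrix, normalised to sum to one."""
            pairwise = np.asarray(pairwise, dtype=float)
            eigvals, eigvecs = np.linalg.eig(pairwise)
            principal = np.real(eigvecs[:, np.argmax(np.real(eigvals))])
            weights = np.abs(principal)
            return weights / weights.sum()

        # Hypothetical comparison of three green-initiative criteria.
        A = [[1.0, 3.0, 5.0],
             [1/3, 1.0, 2.0],
             [1/5, 1/2, 1.0]]
        print(priority_weights(A))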

  14. The Development of a Strategic Prioritisation Method for Green Supply Chain Initiatives

    PubMed Central

    Masoumik, S. Maryam; Abdul-Rashid, Salwa Hanim; Olugu, Ezutah Udoncy

    2015-01-01

    To maintain a competitive position, companies are increasingly required to integrate their proactive environmental strategies into their business strategies. The shift from reactive and compliance-based to proactive and strategic environmental management has driven companies to consider the strategic factors while identifying the areas in which they should focus their green initiatives. In previous studies little attention was given to providing the managers with a basis from which they could strategically prioritise these green initiatives across their companies’ supply chains. Considering this lacuna in the literature, we present a decision-making method for prioritising green supply chain initiatives aligned with the preferred green strategies alternatives for the manufacturing companies. To develop this method, the study considered a position between determinism and the voluntarism orientation of environmental management involving both external pressures and internal competitive drivers and key resources as decision factors. This decision-making method was developed using the analytic network process (ANP) technique. The elements of the decision model were derived from the literature. The causal relationships among the multiple decision variables were validated based on the results of structural equation modelling (SEM) using a dataset collected from a survey of the ISO 14001-certified manufacturers in Malaysia. A portion of the relative weights required for computation in ANP was also calculated using the SEM results. A case study is presented to demonstrate the applicability of the method. PMID:26618353

  15. FPGA acceleration of rigid-molecule docking codes

    PubMed Central

    Sukhwani, B.; Herbordt, M.C.

    2011-01-01

    Modelling the interactions of biological molecules, or docking, is critical both to understanding basic life processes and to designing new drugs. The field programmable gate array (FPGA) based acceleration of a recently developed, complex, production docking code is described. The authors found that it is necessary to extend their previous three-dimensional (3D) correlation structure in several ways, most significantly to support simultaneous computation of several correlation functions. The result for small-molecule docking is a 100-fold speed-up of a section of the code that represents over 95% of the original run-time. An additional 2% is accelerated through a previously described method, yielding a total acceleration of 36× over a single core and 10× over a quad-core. This approach is found to be an ideal complement to graphics processing unit (GPU) based docking, which excels in the protein–protein domain. PMID:21857870
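
    The correlation functions at the heart of such docking codes can be evaluated with FFTs. The numpy sketch below scores every translation of a toy ligand grid against a receptor grid in one pass; rotations, the multiple simultaneous correlation functions mentioned above, and the FPGA mapping itself are omitted:

        import numpy as np

        def correlation_scores(receptor_grid, ligand_grid):
            """FFT-based 3D cross-correlation: score of the ligand at every
            (circular) translation relative to the receptor grid."""
            R = np.fft.fftn(receptor_grid)
            L = np.fft.fftn(ligand_grid, s=receptor_grid.shape)
            return np.real(np.fft.ifftn(np.conj(L) * R))   # correlation theorem

        # Toy 32^3 grids; the highest-scoring voxel is the best translation.
        receptor = np.random.rand(32, 32, 32)
        ligand = np.zeros((32, 32, 32)); ligand[:8, :8, :8] = 1.0
        scores = correlation_scores(receptor, ligand)
        print(np.unravel_index(scores.argmax(), scores.shape))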

  16. Transforming Student Affairs Strategic Planning into Tangible Results

    ERIC Educational Resources Information Center

    Taylor, Simone Himbeault; Matney, Malinda M.

    2007-01-01

    The Division of Student Affairs at the University of Michigan has engaged in an iterative strategic process to create and implement a set of long-range goals. This strategic journey continues to evolve, bringing together the guiding framework of strategic planning steps, a reflective process with an assessment component within each step, and a…

  17. Superconducting Magnets for Accelerators

    NASA Astrophysics Data System (ADS)

    Brianti, G.; Tortschanoff, T.

    1993-03-01

    This chapter describes the main features of superconducting magnets for high-energy synchrotrons and colliders. It covers magnets presently in use and under development for the most advanced accelerator projects, both recently constructed and in the preparatory phase. These magnets, based mainly on NbTi conductor technology, are described in terms of design, materials, construction, and performance. The trend toward higher performance can be gauged from the doubling of the design field in less than a decade, from about 4 T for the Tevatron to 10 T for the LHC. Special aspects of superconducting accelerator magnets are described, including their general layout, the need for extensive computational treatment, the performance limits inherent to the available conductors, and the requirements on the structural design. The contribution is completed by elaborating on persistent-current effects, quench protection, and cryostat design. As examples, the main magnets for HERA and the SSC, as well as the twin-aperture magnets for the LHC, are presented.

  18. Collaborative Strategic Planning in Higher Education

    ERIC Educational Resources Information Center

    Sanaghan, Patrick

    2009-01-01

    This book outlines a simple, five-phase collaborative approach to strategic planning that has worked effectively on many campuses. Specifically, Collaborative Strategic Planning (CSP) refers to the disciplined and thoughtful process of meaningfully engaging relevant stakeholders in creating a shared future vision and goals for their institution.…

  19. 24 CFR 91.315 - Strategic plan.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 24 Housing and Urban Development 1 2011-04-01 2011-04-01 false Strategic plan. 91.315 Section 91... CONSOLIDATED SUBMISSIONS FOR COMMUNITY PLANNING AND DEVELOPMENT PROGRAMS State Governments; Contents of Consolidated Plan § 91.315 Strategic plan. (a) General. For the categories described in paragraphs (b), (c), (d...

  20. Recent Advances in Understanding Particle Acceleration Processes in Solar Flares

    NASA Astrophysics Data System (ADS)

    Zharkova, V. V.; Arzner, K.; Benz, A. O.; Browning, P.; Dauphin, C.; Emslie, A. G.; Fletcher, L.; Kontar, E. P.; Mann, G.; Onofri, M.; Petrosian, V.; Turkmani, R.; Vilmer, N.; Vlahos, L.

    2011-09-01

    We review basic theoretical concepts in particle acceleration, with particular emphasis on processes likely to occur in regions of magnetic reconnection. Several new developments are discussed, including detailed studies of reconnection in three-dimensional magnetic field configurations (e.g., current sheets, collapsing traps, separatrix regions) and stochastic acceleration in a turbulent environment. Fluid, test-particle, and particle-in-cell approaches are used and results compared. While these studies show considerable promise in accounting for the various observational manifestations of solar flares, they are limited by a number of factors, mostly relating to available computational power. Not the least of these issues is the need to explicitly incorporate the electrodynamic feedback of the accelerated particles themselves on the environment in which they are accelerated. A brief prognosis for future advancement is offered.

  1. Achieving competitive advantage through strategic human resource management.

    PubMed

    Fottler, M D; Phillips, R L; Blair, J D; Duran, C A

    1990-01-01

    The framework presented here challenges health care executives to manage human resources strategically as an integral part of the strategic planning process. Health care executives should consciously formulate human resource strategies and practices that are linked to and reinforce the broader strategic posture of the organization. This article provides a framework for (1) determining and focusing on desired strategic outcomes, (2) identifying and implementing essential human resource management actions, and (3) maintaining or enhancing competitive advantage. The strategic approach to human resource management includes assessing the organization's environment and mission; formulating the organization's business strategy; assessing the human resources requirements based on the intended strategy; comparing the current inventory of human resources in terms of numbers, characteristics, and human resource management practices with respect to the strategic requirements of the organization and its services or product lines; formulating the human resource strategy based on the differences between the assessed requirements and the current inventory; and implementing the appropriate human resource practices to reinforce the strategy and attain competitive advantage.

  2. Cambridge Elementary students enjoy gift of computers

    NASA Technical Reports Server (NTRS)

    1999-01-01

    Children at Cambridge Elementary School, Cocoa, Fla., eagerly unwrap computer equipment donated by Kennedy Space Center. Cambridge is one of 13 Brevard County schools receiving 81 excess contractor computers thanks to an innovative educational outreach project spearheaded by the NASA K-12 Education Services Office at KSC. Behind the children is Jim Thurston, a school volunteer and retired employee of USBI, who shared in the project. The Astronaut Memorial Foundation, a strategic partner in the effort, and several schools in rural Florida and Georgia also received refurbished computers as part of the year-long project. KSC employees put in about 3,300 volunteer hours to transform old, excess computers into upgraded, usable units. A total of $90,000 in upgraded computer equipment is being donated.

  3. An exploratory study of healthcare strategic planning in two metropolitan areas.

    PubMed

    Begun, James W; Kaissi, Amer A

    2005-01-01

    Little is known about empirical variation in the extent to which healthcare organizations conduct formal strategic planning or the extent to which strategic planning affects performance. Structural contingency and complexity science theory offer differing interpretations of the value of strategic planning. Structural contingency theory emphasizes adaptation to achieve organizational fit with a changing environment and views strategic planning as a way to chart the organization's path. Complexity science argues that planning is largely futile in changing environments. Interviews of leaders in 20 healthcare organizations in the metropolitan areas of Minneapolis/St. Paul, Minnesota, and San Antonio, Texas, reveal that strategic planning is a common and valued function in healthcare organizations. Respondents emphasized the need to continuously update strategic plans, involve physicians and the governing board, and integrate strategic plans with other organizational plans. Most leaders expressed that strategic planning contributes to organizational focus, fosters stakeholder participation and commitment, and leads to achievement of strategic goals. Because the widespread belief in strategic planning is based largely on experience, intuition, and faith, we present recommendations for developing an evidence base for healthcare strategic planning.

  4. Strategic Sealift Supporting Army Deployments

    DTIC Science & Technology

    2016-06-10

    Mobility is a key element of the US Army’s ability to...together. There are three means the Army relies on for strategic mobility: airlift, sealift, and pre-positioning. Each of these modes has advantages...important than ever. Strategic mobility by either airlift or sealift is among the largest of the force’s routine expenditures and as such demands

  5. Strategic Sourcing in the Army

    DTIC Science & Technology

    2013-09-01

    The objective of this project is to examine how the Army is utilizing strategic sourcing as an effective process for getting the best...redundancy in the acquisition process. The discussions will also look at how important internal customer requirements and external marketplace ...to examine how the Army is utilizing strategic sourcing as an effective process for getting the best overall value for acquiring goods and services

  6. Strategic Planning towards a World-Class University

    NASA Astrophysics Data System (ADS)

    Usoh, E. J.; Ratu, D.; Manongko, A.; Taroreh, J.; Preston, G.

    2018-02-01

    Strategic planning with a focus on world-class university status is an option that universities today cannot avoid if they are to survive and succeed in competition as providers of higher education. The objective of this research is to obtain exploratory results on the strategic plans that universities prepare in pursuit of world-class university status. This research used an exploratory qualitative method, and data were collected through in-depth interviews. Interview transcripts were analysed by thematic content analysis using NVivo software and manual coding. The main finding from the interviews is that most interviewees agreed that UNIMA has been engaged in strategic planning. Contributions from faculties and schools are acknowledged and inform the planning process. However, a new model of strategic planning should be adopted by UNIMA due to the shift towards a “corporate university”. The findings from documents, the literature review, and the interviews suggest adding world-class university characteristics and features to UNIMA's current strategic planning, and upgrading the plan by using these characteristics and features to move towards world-class university status.

  7. Computational Approach for Securing Radiology-Diagnostic Data in Connected Health Network using High-Performance GPU-Accelerated AES.

    PubMed

    Adeshina, A M; Hashim, R

    2017-03-01

    Diagnostic radiology is a core and integral part of modern medicine, paving the way for primary care physicians in disease diagnosis, treatment, and therapy management. Obviously, all recent standard healthcare procedures have benefitted immensely from contemporary information technology, which has revolutionized the approaches to acquiring, storing, and sharing diagnostic data for efficient and timely diagnosis of disease. The connected health network was introduced as an alternative to the ageing traditional concept of the healthcare system, improving hospital-physician connectivity and clinical collaboration. Undoubtedly, this modern approach has drastically improved healthcare, but at the expense of high computational cost and possible breaches of diagnostic privacy. Consequently, a number of cryptographic techniques have recently been applied to clinical applications, but the challenge of successfully encrypting both image and textual data persists. Furthermore, the processing time of encrypting and decrypting medical datasets at a reasonably low computational cost, without jeopardizing the required security strength of the encryption algorithm, remains an outstanding issue. This study proposes a secured radiology-diagnostic data framework for a connected health network using high-performance GPU-accelerated Advanced Encryption Standard (AES). The study was evaluated with radiology image datasets consisting of brain MR and CT datasets obtained from the Department of Surgery, University of North Carolina, USA, and the Swedish National Infrastructure for Computing. Sample patients' notes from the University of North Carolina School of Medicine at Chapel Hill were also used to evaluate the framework's strength in encrypting and decrypting textual data in the form of medical reports. Significantly, the framework is not only able to accurately encrypt and decrypt medical image datasets, but it also
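
    For orientation only, the snippet below shows standard AES-GCM encryption and decryption of both an image buffer and a textual report using the Python cryptography package; it is a plain CPU-side sketch and does not reproduce the GPU-accelerated AES that the study proposes:

        import os
        from cryptography.hazmat.primitives.ciphers.aead import AESGCM

        key = AESGCM.generate_key(bit_length=256)
        aesgcm = AESGCM(key)

        def encrypt_record(payload: bytes, label: bytes):
            """Encrypt any diagnostic payload (pixel buffer or report text)."""
            nonce = os.urandom(12)                       # unique nonce per message
            return nonce, aesgcm.encrypt(nonce, payload, label)

        def decrypt_record(nonce: bytes, ciphertext: bytes, label: bytes) -> bytes:
            return aesgcm.decrypt(nonce, ciphertext, label)

        # The same routine covers image bytes and textual reports.
        image_bytes = os.urandom(512 * 512 * 2)          # stand-in for one MR slice
        report_text = b"Patient note: no acute intracranial abnormality."
        for payload, label in [(image_bytes, b"image"), (report_text, b"report")]:
            nonce, ct = encrypt_record(payload, label)
            assert decrypt_record(nonce, ct, label) == payload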

  8. The U.S. Geological Survey Strategic Plan 1999-2009

    USGS Publications Warehouse

    ,

    1999-01-01

    This new version of the USGS Strategic Plan builds on our first strategic plan, which was developed in 1996, and focuses specifically on strategic goals in four areas: customers, programs, people, and operations of the USGS.

  9. Strategic group stability: evidence from the health care industry.

    PubMed

    Churchman, Richard L; Woodard, Beth

    2004-01-01

    To better understand strategic group stability and the associated mobility barriers concept, we surveyed health care administrators on their reasons for remaining in their current strategic group. We offer administrators' responses to the strategic group stability (mobility barrier) question. Decision-makers may be unaware of these cognitive biases (e.g., group-level world-view and resource similarity) and may not recognize the extent to which they are reducing their strategic alternatives.

  10. GPU-accelerated phase-field simulation of dendritic solidification in a binary alloy

    NASA Astrophysics Data System (ADS)

    Yamanaka, Akinori; Aoki, Takayuki; Ogawa, Satoi; Takaki, Tomohiro

    2011-03-01

    The phase-field simulation of dendritic solidification of a binary alloy has been accelerated by using a graphics processing unit (GPU). To perform the phase-field simulation of alloy solidification on the GPU, a program code was developed with the compute unified device architecture (CUDA). In this paper, the implementation technique of the phase-field model on the GPU is presented. We also evaluated the acceleration performance of the three-dimensional solidification simulation using a single NVIDIA TESLA C1060 GPU and the developed program code. The results showed that the GPU calculation for 576³ computational grid points achieved a performance of 170 GFLOPS by utilizing shared memory as a software-managed cache. Furthermore, the computation on the GPU is demonstrated to be 100 times faster than that on a single CPU core. From the obtained results, we confirmed the feasibility of a real-time, fully three-dimensional phase-field simulation of microstructure evolution on a personal desktop computer.
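
    The numpy sketch below gives a CPU reference for the kind of explicit stencil update such a GPU kernel performs at every grid point; it uses a simplified Allen-Cahn style phase-field step rather than the paper's binary-alloy model, and all coefficients are placeholders (on the GPU, the neighbourhood reads are the part cached in shared memory):

        import numpy as np

        def phase_field_step(phi, dt=0.01, dx=1.0, eps2=1.0, w=1.0):
            """One explicit Allen-Cahn style update: 7-point Laplacian stencil
            plus a double-well term driving phi toward 0 (liquid) or 1 (solid)."""
            lap = (np.roll(phi, 1, 0) + np.roll(phi, -1, 0) +
                   np.roll(phi, 1, 1) + np.roll(phi, -1, 1) +
                   np.roll(phi, 1, 2) + np.roll(phi, -1, 2) - 6.0 * phi) / dx**2
            dphi = eps2 * lap - w * phi * (1.0 - phi) * (1.0 - 2.0 * phi)
            return phi + dt * dphi

        # Small 64^3 test with a spherical solid seed (phi = 1 inside).
        phi = np.zeros((64, 64, 64))
        z, y, x = np.ogrid[:64, :64, :64]
        phi[(x - 32)**2 + (y - 32)**2 + (z - 32)**2 < 6**2] = 1.0
        for _ in range(100):
            phi = phase_field_step(phi)
        print(phi.min(), phi.max())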

  11. A strategic informatics approach to autoverification.

    PubMed

    Jones, Jay B

    2013-03-01

    Autoverification is rapidly expanding with increased functionality provided by middleware tools. It is imperative that autoverification of laboratory test results be viewed as a process evolving into a broader, more sophisticated form of decision support, which will require strategic planning to form a foundational tool set for the laboratory. One must strategically plan to expand autoverification in the future to include a vision of instrument-generated order interfaces, reflexive testing, and interoperability with other information systems. It is hoped that the observations, examples, and opinions expressed in this article will stimulate such short-term and long-term strategic planning. Copyright © 2013 Elsevier Inc. All rights reserved.

  12. Strategic orientations of small multihospital systems.

    PubMed Central

    Luke, R D; Begun, J W

    1988-01-01

    Strategic behaviors of organizations can be classified along two dimensions--growth orientations, or patterns of evolution over time, and action orientations, or strategic aggressiveness in undertaking a particular growth orientation. We create measures of growth and action orientations for small multihospital systems and test the validity of the growth and action orientation typologies, using data from a sample of small multihospital systems. Growth and action orientations do appear to exist independently of each other, and they are related to the ownership status of the systems. Not-for-profit and church-other systems exhibit similar strategic orientations, unlike those of Catholic and investor-owned systems. PMID:3060448

  13. Evolution in strategic forces and doctrine

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Canavan, G.H.

    The era of deterrence through the threat of retaliation is ending. Strategic defense opens new options for deep reductions without loss of stability, which could guide the shift from the residual forces of the offensive era to those appropriate for a multipolar world. There are strong arguments for retiring missiles under the cover of missile defenses and returning to fewer but more capable aircraft for strategic roles. Developing the technologies for theater and strategic defenses could largely eliminate the incentive for the development of missiles by the third world and shift their efforts into more stabilizing areas. 20 refs.

  14. 75 FR 47346 - Draft Strategic Plan for FY 2010-2015

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-08-05

    ...://www.treas.gov/offices/management/budget/strategic-plan/2007-2012/strategic-plan2007-2012.pdf . The....ustreas.gov/offices/management/budget/strategic-plan/ and by clicking on the comment link. Comments may... DEPARTMENT OF THE TREASURY Draft Strategic Plan for FY 2010-2015 AGENCY: United States Department...

  15. Advancing the state of the art in healthcare strategic planning.

    PubMed

    Zuckerman, Alan M

    2006-01-01

    A recent survey of the state of strategic planning among healthcare organizations indicates that planners and executives believe that healthcare strategic planning practices are effective and provide the appropriate focus and direction for their organizations. When compared to strategic planning practices employed outside of the healthcare field, however, most healthcare strategic planning processes have not evolved to the more advanced, state-of-the-art levels of planning being used successfully outside of healthcare. While organizations that operate in stable markets may be able to survive using basic strategic planning practices, the volatile healthcare market demands that providers be nimble competitors with advanced, ongoing planning processes that drive growth and organizational effectiveness. What should healthcare organizations do to increase the rigor and sophistication of their strategic planning practices? This article identifies ten current healthcare strategic planning best practices and recommends five additional innovative approaches from pathbreaking companies outside of healthcare that have used advanced strategic planning practices to attain high levels of organizational success.

  16. Using Appropriate Tools Strategically for Instruction

    ERIC Educational Resources Information Center

    Sherman, Milan; Cayton, Charity

    2015-01-01

    Students' ability to use appropriate tools strategically is an important skill of mathematically proficient students (SMP 5, CCSSI 2010, p. 7). A parallel practice for teachers is using appropriate tools strategically for mathematics instruction. An important element of this practice is that the use of technology depends on the goals of…

  17. Accelerated GPU based SPECT Monte Carlo simulations.

    PubMed

    Garcia, Marie-Paule; Bert, Julien; Benoit, Didier; Bardiès, Manuel; Visvikis, Dimitris

    2016-06-07

    Monte Carlo (MC) modelling is widely used in the field of single photon emission computed tomography (SPECT) as it is a reliable technique to simulate very high quality scans. This technique provides very accurate modelling of the radiation transport and particle interactions in a heterogeneous medium. Various MC codes exist for nuclear medicine imaging simulations. Recently, new strategies exploiting the computing capabilities of graphics processing units (GPU) have been proposed. This work aims at evaluating the accuracy of such GPU implementation strategies in comparison to standard MC codes in the context of SPECT imaging. GATE was considered the reference MC toolkit and used to evaluate the performance of newly developed GPU Geant4-based Monte Carlo simulation (GGEMS) modules for SPECT imaging. Radioisotopes with different photon energies were used with these various CPU and GPU Geant4-based MC codes in order to assess the best strategy for each configuration. Three different isotopes were considered: (99m)Tc, (111)In and (131)I, using a low energy high resolution (LEHR) collimator, a medium energy general purpose (MEGP) collimator and a high energy general purpose (HEGP) collimator, respectively. Point source, uniform source, cylindrical phantom and anthropomorphic phantom acquisitions were simulated using a model of the GE Infinia II 3/8" gamma camera. Both simulation platforms yielded a similar system sensitivity and image statistical quality for the various combinations. The overall acceleration factor between the GATE and GGEMS platforms derived from the same cylindrical phantom acquisition was between 18 and 27 for the different radioisotopes. In addition, a full MC simulation using an anthropomorphic phantom showed the full potential of the GGEMS platform, with a resulting acceleration factor of up to 71. The good agreement with reference codes and the acceleration factors obtained support the use of GPU implementation strategies for improving computational
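
    As a toy illustration of the underlying Monte Carlo photon-transport principle (not of GATE or GGEMS themselves), the snippet below estimates narrow-beam transmission through a uniform slab by sampling exponential free paths; the attenuation coefficient is an approximate textbook value:

        import numpy as np

        rng = np.random.default_rng(0)

        def fraction_transmitted(mu, thickness, n_photons=100_000):
            """Fraction of photons crossing a uniform slab without interacting:
            sample exponential free paths with linear attenuation coefficient mu
            (1/cm) and count those longer than the slab thickness (cm). Real SPECT
            codes additionally track scatter, energy and detector response."""
            free_paths = rng.exponential(scale=1.0 / mu, size=n_photons)
            return float(np.mean(free_paths > thickness))

        # 140 keV (99mTc-like) photons in water: mu is roughly 0.15 1/cm.
        print(fraction_transmitted(mu=0.15, thickness=10.0))   # ~exp(-1.5) ≈ 0.22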

  18. Introduction of International Microgravity Strategic Planning Group

    NASA Technical Reports Server (NTRS)

    Rhome, Robert

    1998-01-01

    Established on May 6, 1995, the International Strategic Planning Group for Microgravity Science and Applications Research has the purpose of developing and updating, at least on a biennial basis, an International Strategic Plan for Microgravity Science and Applications Research. The member space agencies have agreed to contribute to the development of this Strategic Plan and to seek the implementation of the cooperative programs defined in it. The emphasis of the plan is the coordination of hardware construction and utilization within the various areas of research, including biotechnology, combustion science, fluid physics, materials science and other special topics in the physical sciences. The Microgravity Science and Applications International Strategic Plan is a joint effort by the present members - ASI, CNES, CSA, DLR, ESA, NASA, and NASDA. It represents the consensus from a series of discussions held within the International Microgravity Strategic Planning Group (IMSPG). In 1996 several space agencies initiated multilateral discussions on how to improve the effectiveness of international microgravity research during the upcoming Space Station era. These discussions led to a recognition of the need for a comprehensive strategic plan for international microgravity research that would provide a framework for cooperation between international agencies. The Strategic Plan is intended to provide a basis for inter-agency coordination and cooperation in microgravity research in the International Space Station (ISS) era. This will be accomplished through analysis of the interests and goals of each participating agency and identification of mutual interests and program compatibilities. The Plan provides a framework for maximizing the productivity of space-based research for the benefit of our societies.

  19. Implementing successful strategic plans: a simple formula.

    PubMed

    Blondeau, Whitney; Blondeau, Benoit

    2015-01-01

    Strategic planning is a process. One way to think of strategic planning is to envision its development and design as a framework that will help your hospital navigate through internal and external changing environments over time. Although the process of strategic planning can feel daunting, following a simple formula involving five steps using the mnemonic B.E.G.I.N. (Begin, Evaluate, Goals & Objectives, Integration, and Next steps) will help the planning process feel more manageable, and lead you to greater success.

  20. Measuring strategic control in implicit learning: how and why?

    PubMed

    Norman, Elisabeth

    2015-01-01

    Several methods have been developed for measuring the extent to which implicitly learned knowledge can be applied in a strategic, flexible manner. Examples include generation exclusion tasks in Serial Reaction Time (SRT) learning (Goschke, 1998; Destrebecqz and Cleeremans, 2001) and 2-grammar classification tasks in Artificial Grammar Learning (AGL; Dienes et al., 1995; Norman et al., 2011). Strategic control has traditionally been used as a criterion for determining whether acquired knowledge is conscious or unconscious, or which properties of knowledge are consciously available. In this paper I first summarize existing methods that have been developed for measuring strategic control in the SRT and AGL tasks. I then address some methodological and theoretical questions. Methodological questions concern choice of task, whether the measurement reflects inhibitory control or task switching, and whether or not strategic control should be measured on a trial-by-trial basis. Theoretical questions concern the rationale for including measurement of strategic control, what form of knowledge is strategically controlled, and how strategic control can be combined with subjective awareness measures.