Sample records for computer evolution project

  1. The Evolution of Networked Computing in the Teaching of Japanese as a Foreign Language.

    ERIC Educational Resources Information Center

    Harrison, Richard

    1998-01-01

    Reviews the evolution of Internet-based projects in Japanese computer-assisted language learning and suggests future directions in which the field may develop, based on emerging network technology and learning theory. (Author/VWL)

  2. Dynamic computer model for the metallogenesis and tectonics of the Circum-North Pacific

    USGS Publications Warehouse

    Scotese, Christopher R.; Nokleberg, Warren J.; Monger, James W.H.; Norton, Ian O.; Parfenov, Leonid M.; Khanchuk, Alexander I.; Bundtzen, Thomas K.; Dawson, Kenneth M.; Eremin, Roman A.; Frolov, Yuri F.; Fujita, Kazuya; Goryachev, Nikolai A.; Pozdeev, Anany I.; Ratkin, Vladimir V.; Rodinov, Sergey M.; Rozenblum, Ilya S.; Scholl, David W.; Shpikerman, Vladimir I.; Sidorov, Anatoly A.; Stone, David B.

    2001-01-01

    The digital files on this report consist of a dynamic computer model of the metallogenesis and tectonics of the Circum-North Pacific, and background articles, figures, and maps. The tectonic part of the dynamic computer model is derived from a major analysis of the tectonic evolution of the Circum-North Pacific which is also contained in directory tectevol. The dynamic computer model and associated materials on this CD-ROM are part of a project on the major mineral deposits, metallogenesis, and tectonics of the Russian Far East, Alaska, and the Canadian Cordillera. The project provides critical information on bedrock geology and geophysics, tectonics, major metalliferous mineral resources, metallogenic patterns, and crustal origin and evolution of mineralizing systems for this region. The major scientific goals and benefits of the project are to: (1) provide a comprehensive international data base on the mineral resources of the region that is the first, extensive knowledge available in English; (2) provide major new interpretations of the origin and crustal evolution of mineralizing systems and their host rocks, thereby enabling enhanced, broad-scale tectonic reconstructions and interpretations; and (3) promote trade and scientific and technical exchanges between North America and Eastern Asia.

  3. The ChemViz Project: Using a Supercomputer To Illustrate Abstract Concepts in Chemistry.

    ERIC Educational Resources Information Center

    Beckwith, E. Kenneth; Nelson, Christopher

    1998-01-01

    Describes the Chemistry Visualization (ChemViz) Project, a Web venture maintained by the University of Illinois National Center for Supercomputing Applications (NCSA) that enables high school students to use computational chemistry as a technique for understanding abstract concepts. Discusses the evolution of computational chemistry and provides a…

  4. Research in Computational Astrobiology

    NASA Technical Reports Server (NTRS)

    Chaban, Galina; Colombano, Silvano; Scargle, Jeff; New, Michael H.; Pohorille, Andrew; Wilson, Michael A.

    2003-01-01

    We report on several projects in the field of computational astrobiology, which is devoted to advancing our understanding of the origin, evolution and distribution of life in the Universe using theoretical and computational tools. Research projects included modifying existing computer simulation codes to use efficient, multiple time step algorithms, statistical methods for analysis of astrophysical data via optimal partitioning methods, electronic structure calculations on water-nucleic acid complexes, incorporation of structural information into genomic sequence analysis methods and calculations of shock-induced formation of polycyclic aromatic hydrocarbon compounds.

  5. Establishing the Basis for a CIS (Computer Information Systems) Undergraduate Program: On Seeking the Body of Knowledge

    ERIC Educational Resources Information Center

    Longenecker, Herbert E., Jr.; Babb, Jeffry; Waguespack, Leslie J.; Janicki, Thomas N.; Feinstein, David

    2015-01-01

    The evolution of computing education spans a spectrum from "computer science" ("CS") grounded in the theory of computing, to "information systems" ("IS"), grounded in the organizational application of data processing. This paper reports on a project focusing on a particular slice of that spectrum commonly…

  6. Simulating Halos with the Caterpillar Project

    NASA Astrophysics Data System (ADS)

    Kohler, Susanna

    2016-04-01

    The Caterpillar Project is a beautiful series of high-resolution cosmological simulations. The goal of this project is to examine the evolution of dark-matter halos like the Milky Way's, to learn about how galaxies like ours formed. This immense computational project is still in progress, but the Caterpillar team is already providing a look at some of its first results.

    Lessons from Dark-Matter Halos. Why simulate the dark-matter halos of galaxies? Observationally, the formation history of our galaxy is encoded in galactic fossil record clues, like the tidal debris from disrupted satellite galaxies in the outer reaches of our galaxy, or chemical abundance patterns throughout our galactic disk and stellar halo. But to interpret this information in a way that lets us learn about our galaxy's history, we need to first test galaxy formation and evolution scenarios via cosmological simulations. Then we can compare the end result of these simulations to what we observe today. [Figure caption: This figure illustrates the difference that mass resolution makes. In the left panel, the mass resolution is 1.5*10^7 solar masses per particle. In the right panel, the mass resolution is 3*10^4 solar masses per particle. Griffen et al. 2016]

    A Computational Challenge. Due to how computationally expensive such simulations are, previous N-body simulations of the growth of Milky-Way-like halos have consisted of only one or a few halos each. But in order to establish a statistical understanding of how galaxy halos form (and find out whether the Milky Way's halo is typical or unusual!), it is necessary to simulate a larger number of halos. In addition, in order to accurately follow the formation and evolution of substructure within the dark-matter halos, these simulations must be able to resolve the smallest dwarf galaxies, which are around a million solar masses. This requires an extremely high mass resolution, which adds to the computational expense of the simulation.

    First Outcomes. These are the challenges faced by the Caterpillar Project, detailed in a recently published paper led by Brendan Griffen (Massachusetts Institute of Technology). The Caterpillar Project was designed to simulate 70 Milky-Way-size halos (quadrupling the total number of halos that have been simulated in the past!) at a high mass resolution (10,000 solar masses per particle) and time resolution (5 Myr per snapshot). The project is extremely computationally intense, requiring 14 million CPU hours and 700 TB of data storage! [Figure caption: Mass evolution of the first 24 Caterpillar halos (selected to be Milky-Way-size at z=0). The inset panel shows the mass evolution normalized by the halo mass at z=0, demonstrating the highly varied evolution these different halos undergo. Griffen et al. 2016] In this first study, Griffen and collaborators show the end states for the first 24 halos of the project, evolved from a large redshift to today (z=0). They use these initial results to demonstrate the integrity of their data and the utility of their methods, which include new halo-finding techniques that recover more substructure within each halo. The first results from the Caterpillar Project are already enough to show clear general trends, such as the highly variable paths the different halos take as they merge, accrete, and evolve, as well as how different their end states can be. Statistically examining the evolution of these halos is an important next step in providing insight into the origin and evolution of the Milky Way, and helping us to understand how our galaxy differs from other galaxies of similar mass. Keep an eye out for future results from this project!

    Bonus: Check out this video (make sure to watch in HD!) of how the first 24 Milky-Way-like halos from the Caterpillar simulations form. Seeing these halos evolve simultaneously is an awesome way to identify the similarities and differences between them.

    Citation: Brendan F. Griffen et al 2016 ApJ 818 10. doi:10.3847/0004-637X/818/1/10

  7. Computational methods for coupling microstructural and micromechanical materials response simulations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    HOLM,ELIZABETH A.; BATTAILE,CORBETT C.; BUCHHEIT,THOMAS E.

    2000-04-01

    Computational materials simulations have traditionally focused on individual phenomena: grain growth, crack propagation, plastic flow, etc. However, real materials behavior results from a complex interplay between phenomena. In this project, the authors explored methods for coupling mesoscale simulations of microstructural evolution and micromechanical response. In one case, massively parallel (MP) simulations for grain evolution and microcracking in alumina stronglink materials were dynamically coupled. In the other, codes for domain coarsening and plastic deformation in CuSi braze alloys were iteratively linked. This program provided the first comparison of two promising ways to integrate mesoscale computer codes. Coupled microstructural/micromechanical codes were applied to experimentally observed microstructures for the first time. In addition to the coupled codes, this project developed a suite of new computational capabilities (PARGRAIN, GLAD, OOF, MPM, polycrystal plasticity, front tracking). The problem of plasticity length scale in continuum calculations was recognized and a solution strategy was developed. The simulations were experimentally validated on stockpile materials.

  8. Volunteer Computing Experience with ATLAS@Home

    NASA Astrophysics Data System (ADS)

    Adam-Bourdarios, C.; Bianchi, R.; Cameron, D.; Filipčič, A.; Isacchini, G.; Lançon, E.; Wu, W.; ATLAS Collaboration

    2017-10-01

    ATLAS@Home is a volunteer computing project which allows the public to contribute to computing for the ATLAS experiment through their home or office computers. The project has grown continuously since its creation in mid-2014 and now counts almost 100,000 volunteers. The combined volunteers’ resources make up a sizeable fraction of overall resources for ATLAS simulation. This paper takes stock of the experience gained so far and describes the next steps in the evolution of the project. These improvements include running natively on Linux to ease the deployment on for example university clusters, using multiple cores inside one task to reduce the memory requirements and running different types of workload such as event generation. In addition to technical details the success of ATLAS@Home as an outreach tool is evaluated.

  9. Design Principles for "Thriving in Our Digital World": A High School Computer Science Course

    ERIC Educational Resources Information Center

    Veletsianos, George; Beth, Bradley; Lin, Calvin; Russell, Gregory

    2016-01-01

    "Thriving in Our Digital World" is a technology-enhanced dual enrollment course introducing high school students to computer science through project- and problem-based learning. This article describes the evolution of the course and five lessons learned during the design, development, implementation, and iteration of the course from its…

  10. The Physics of Open Ended Evolution

    NASA Astrophysics Data System (ADS)

    Adams, Alyssa M.

    What makes living systems different than non-living ones? Unfortunately this question is impossible to answer, at least currently. Instead, we must face computationally tangible questions based on our current understanding of physics, computation, information, and biology. Yet we have few insights into how living systems might quantifiably differ from their non-living counterparts, as in a mathematical foundation to explain away our observations of biological evolution, emergence, innovation, and organization. The development of a theory of living systems, if at all possible, demands a mathematical understanding of how data generated by complex biological systems changes over time. In addition, this theory ought to be broad enough as to not be constrained to an Earth-based biochemistry. In this dissertation, the philosophy of studying living systems from the perspective of traditional physics is first explored as a motivating discussion for subsequent research. Traditionally, we have often thought of the physical world from a bottom-up approach: things happening on a smaller scale aggregate into things happening on a larger scale. In addition, the laws of physics are generally considered static over time. Research suggests that biological evolution may follow dynamic laws that (at least in part) change as a function of the state of the system. Of the three featured research projects, cellular automata (CA) are used as a model to study certain aspects of living systems in two of them. These aspects include self-reference, open-ended evolution, local physical universality, subjectivity, and information processing. Open-ended evolution and local physical universality are attributed to the vast amount of innovation observed throughout biological evolution. Biological systems may distinguish themselves in terms of information processing and storage, not outside the theory of computation. The final research project concretely explores a real-world phenomenon by means of mapping dominance hierarchies in the evolution of video game strategies. Though the main question of how life differs from non-life remains unanswered, the mechanisms behind open-ended evolution and physical universality are revealed.
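
    Since cellular automata serve as the model system in two of the research projects above, a toy example may help readers unfamiliar with them. The sketch below is a minimal elementary (one-dimensional, binary, radius-1) cellular automaton in Python; the rule number, lattice width, and step count are arbitrary illustrative choices and not the models used in the dissertation.

      import numpy as np

      def step(state, rule=110):
          """One update of an elementary cellular automaton (Wolfram rule number)."""
          # Build the 8-entry lookup table from the rule number's bits.
          table = [(rule >> i) & 1 for i in range(8)]
          left = np.roll(state, 1)
          right = np.roll(state, -1)
          # Neighborhood code 0..7 = 4*left + 2*center + 1*right (periodic boundary).
          idx = 4 * left + 2 * state + 1 * right
          return np.array([table[i] for i in idx], dtype=np.uint8)

      if __name__ == "__main__":
          width, steps = 64, 32          # arbitrary demo parameters
          state = np.zeros(width, dtype=np.uint8)
          state[width // 2] = 1          # single seed cell
          for _ in range(steps):
              print("".join("#" if c else "." for c in state))
              state = step(state)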

  11. Multicore: Fallout from a Computing Evolution

    ScienceCinema

    Yelick, Kathy [Director, NERSC]

    2017-12-09

    July 22, 2008 Berkeley Lab lecture: Parallel computing used to be reserved for big science and engineering projects, but in two years that's all changed. Even laptops and hand-helds use parallel processors. Unfortunately, the software hasn't kept pace. Kathy Yelick, Director of the National Energy Research Scientific Computing Center at Berkeley Lab, describes the resulting chaos and the computing community's efforts to develop exciting applications that take advantage of tens or hundreds of processors on a single chip.

  12. The Role of Scale in the Development and Evolution of Stratified Shear Turbulence, Entrainment and Mixing

    DTIC Science & Technology

    2015-09-30

    plan for the DNS runs of this project, which will be performed by using NGA (Desjardins et al., 2008), an in-house flow solver for variable density low...examined at geophysically relevant Reynolds numbers. In this project, the LES capabilities of the NGA computational framework are used.

  13. The structure of common-envelope remnants

    NASA Astrophysics Data System (ADS)

    Hall, Philip D.

    2015-05-01

    We investigate the structure and evolution of the remnants of common-envelope evolution in binary star systems. In a common-envelope phase, two stars become engulfed in a gaseous envelope and, under the influence of drag forces, spiral to smaller separations. They may merge to form a single star or the envelope may be ejected to leave the stars in a shorter period orbit. This process explains the short orbital periods of many observed binary systems, such as cataclysmic variables and low-mass X-ray binary systems. Despite the importance of these systems, and of common-envelope evolution to their formation, it remains poorly understood. Specifically, we are unable to confidently predict the outcome of a common-envelope phase from the properties at its onset. After presenting a review of work on stellar evolution, binary systems, common-envelope evolution and the computer programs used, we describe the results of three computational projects on common-envelope evolution. Our work specifically relates to the methods and prescriptions which are used for predicting the outcome. We use the Cambridge stellar-evolution code STARS to produce detailed models of the structure and evolution of remnants of common-envelope evolution. We compare different assumptions about the uncertain end-of-common envelope structure and envelope mass of remnants which successfully eject their common envelopes. In the first project, we use detailed remnant models to investigate whether planetary nebulae are predicted after common-envelope phases initiated by low-mass red giants. We focus on the requirement that a remnant evolves rapidly enough to photoionize the nebula and compare the predictions for different ideas about the structure at the end of a common-envelope phase. We find that planetary nebulae are possible for some prescriptions for the end-of-common envelope structure. In our second contribution, we compute a large set of single-star models and fit new formulae to the core radii of evolved stars. These formulae can be used to better compute the outcome of common-envelope evolution with rapid evolution codes. We find that the new formulae are necessary for accurate predictions of the properties of post-common envelope systems. Finally, we use detailed remnant models of massive stars to investigate whether hydrogen may be retained after a common-envelope phase to the point of core-collapse and so be observable in supernovae. We find that this is possible and thus common-envelope evolution may contribute to the formation of Type IIb supernovae.
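
    For readers unfamiliar with the prescriptions mentioned above, the widely used energy (alpha) formalism gives a compact illustration of how a common-envelope outcome is predicted. This is the standard textbook form, not necessarily the exact prescription adopted in the thesis: the envelope binding energy is balanced against the change in orbital energy with an efficiency $\alpha_{\rm CE}$,

      \frac{G M_1 M_{\rm env}}{\lambda R_1} = \alpha_{\rm CE} \left( \frac{G M_{\rm c} M_2}{2 a_{\rm f}} - \frac{G M_1 M_2}{2 a_{\rm i}} \right),

    where $M_1$ and $M_{\rm env}$ are the donor's total and envelope masses, $M_{\rm c}$ its core mass, $M_2$ the companion mass, $R_1$ the donor radius, $\lambda$ an envelope-structure parameter, and $a_{\rm i}$, $a_{\rm f}$ the orbital separations at the start and end of the phase.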

  14. Multicore: Fallout From a Computing Evolution (LBNL Summer Lecture Series)

    ScienceCinema

    Yelick, Kathy [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States). National Energy Research Scientific Computing Center (NERSC)]

    2018-05-07

    Summer Lecture Series 2008: Parallel computing used to be reserved for big science and engineering projects, but in two years that's all changed. Even laptops and hand-helds use parallel processors. Unfortunately, the software hasn't kept pace. Kathy Yelick, Director of the National Energy Research Scientific Computing Center at Berkeley Lab, describes the resulting chaos and the computing community's efforts to develop exciting applications that take advantage of tens or hundreds of processors on a single chip.

  15. Bioinformatics in high school biology curricula: a study of state science standards.

    PubMed

    Wefer, Stephen H; Sheppard, Keith

    2008-01-01

    The proliferation of bioinformatics in modern biology marks a modern revolution in science that promises to influence science education at all levels. This study analyzed secondary school science standards of 49 U.S. states (Iowa has no science framework) and the District of Columbia for content related to bioinformatics. The bioinformatics content of each state's biology standards was analyzed and categorized into nine areas: Human Genome Project/genomics, forensics, evolution, classification, nucleotide variations, medicine, computer use, agriculture/food technology, and science technology and society/socioscientific issues. Findings indicated a generally low representation of bioinformatics-related content, which varied substantially across the different areas, with Human Genome Project/genomics and computer use being the lowest (8%), and evolution being the highest (64%) among states' science frameworks. This essay concludes with recommendations for reworking/rewording existing standards to facilitate the goal of promoting science literacy among secondary school students.

  16. Bioinformatics in High School Biology Curricula: A Study of State Science Standards

    PubMed Central

    Sheppard, Keith

    2008-01-01

    The proliferation of bioinformatics in modern biology marks a modern revolution in science that promises to influence science education at all levels. This study analyzed secondary school science standards of 49 U.S. states (Iowa has no science framework) and the District of Columbia for content related to bioinformatics. The bioinformatics content of each state's biology standards was analyzed and categorized into nine areas: Human Genome Project/genomics, forensics, evolution, classification, nucleotide variations, medicine, computer use, agriculture/food technology, and science technology and society/socioscientific issues. Findings indicated a generally low representation of bioinformatics-related content, which varied substantially across the different areas, with Human Genome Project/genomics and computer use being the lowest (8%), and evolution being the highest (64%) among states' science frameworks. This essay concludes with recommendations for reworking/rewording existing standards to facilitate the goal of promoting science literacy among secondary school students. PMID:18316818

  17. Computers in Education: An Overview. Publication Number One. Software Engineering/Education Cooperative Project.

    ERIC Educational Resources Information Center

    Collis, Betty; Muir, Walter

    The first of four major sections in this report presents an overview of the background and evolution of computer applications to learning and teaching. It begins with the early attempts toward "automated teaching" of the 1920s, and the "teaching machines" of B. F. Skinner of the 1940s through the 1960s. It then traces the…

  18. Reviews.

    ERIC Educational Resources Information Center

    Science Teacher, 1988

    1988-01-01

    Presents information and concerns regarding computer courseware, books, and audiovisual materials reviewed by teachers. Covers a variety of topics including dissection of common classroom specimens, medicine, acid rain projects, molecules, the water cycle, erosion, plankton, and evolution. Notes on availability, price, and needed equipment, where…

  19. Crustal dynamics project data analysis, 1991: VLBI geodetic results, 1979 - 1990

    NASA Technical Reports Server (NTRS)

    Ma, C.; Ryan, J. W.; Caprette, D. S.

    1992-01-01

    The Goddard VLBI group reports the results of analyzing 1412 Mark II data sets acquired from fixed and mobile observing sites through the end of 1990 and available to the Crustal Dynamics Project. Three large solutions were used to obtain Earth rotation parameters, nutation offsets, global source positions, site velocities, and baseline evolution. Site positions are tabulated on a yearly basis from 1979 through 1992. Site velocities are presented in both geocentric Cartesian coordinates and topocentric coordinates. Baseline evolution is plotted for 175 baselines. Rates are computed for earth rotation and nutation parameters. Included are 104 sources, 88 fixed stations and mobile sites, and 688 baselines.
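
    As a concrete illustration of how a rate can be extracted from yearly values such as the site positions and baseline lengths tabulated in this report, the sketch below fits a straight line to a synthetic baseline-length time series. The numbers are invented purely for demonstration and are not Crustal Dynamics Project results.

      import numpy as np

      rng = np.random.default_rng(0)

      # Hypothetical yearly baseline lengths (metres) for 1979-1990; the values are
      # invented to illustrate the rate estimate, not real VLBI data.
      years = np.arange(1979, 1991, dtype=float)
      lengths = 3_928_881.000 + 0.017 * (years - 1979) + rng.normal(0.0, 0.005, years.size)

      # Least-squares linear fit: the slope is the baseline rate in metres per year.
      rate_m_per_yr, intercept = np.polyfit(years, lengths, 1)
      print(f"estimated baseline rate: {rate_m_per_yr * 1000:.1f} mm/yr")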

  20. NeuroManager: a workflow analysis based simulation management engine for computational neuroscience

    PubMed Central

    Stockton, David B.; Santamaria, Fidel

    2015-01-01

    We developed NeuroManager, an object-oriented simulation management software engine for computational neuroscience. NeuroManager automates the workflow of simulation job submissions when using heterogeneous computational resources, simulators, and simulation tasks. The object-oriented approach (1) provides flexibility to adapt to a variety of neuroscience simulators, (2) simplifies the use of heterogeneous computational resources, from desktops to super computer clusters, and (3) improves tracking of simulator/simulation evolution. We implemented NeuroManager in MATLAB, a widely used engineering and scientific language, for its signal and image processing tools, prevalence in electrophysiology analysis, and increasing use in college Biology education. To design and develop NeuroManager we analyzed the workflow of simulation submission for a variety of simulators, operating systems, and computational resources, including the handling of input parameters, data, models, results, and analyses. This resulted in 22 stages of simulation submission workflow. The software incorporates progress notification, automatic organization, labeling, and time-stamping of data and results, and integrated access to MATLAB's analysis and visualization tools. NeuroManager provides users with the tools to automate daily tasks, and assists principal investigators in tracking and recreating the evolution of research projects performed by multiple people. Overall, NeuroManager provides the infrastructure needed to improve workflow, manage multiple simultaneous simulations, and maintain provenance of the potentially large amounts of data produced during the course of a research project. PMID:26528175

  1. NeuroManager: a workflow analysis based simulation management engine for computational neuroscience.

    PubMed

    Stockton, David B; Santamaria, Fidel

    2015-01-01

    We developed NeuroManager, an object-oriented simulation management software engine for computational neuroscience. NeuroManager automates the workflow of simulation job submissions when using heterogeneous computational resources, simulators, and simulation tasks. The object-oriented approach (1) provides flexibility to adapt to a variety of neuroscience simulators, (2) simplifies the use of heterogeneous computational resources, from desktops to super computer clusters, and (3) improves tracking of simulator/simulation evolution. We implemented NeuroManager in MATLAB, a widely used engineering and scientific language, for its signal and image processing tools, prevalence in electrophysiology analysis, and increasing use in college Biology education. To design and develop NeuroManager we analyzed the workflow of simulation submission for a variety of simulators, operating systems, and computational resources, including the handling of input parameters, data, models, results, and analyses. This resulted in 22 stages of simulation submission workflow. The software incorporates progress notification, automatic organization, labeling, and time-stamping of data and results, and integrated access to MATLAB's analysis and visualization tools. NeuroManager provides users with the tools to automate daily tasks, and assists principal investigators in tracking and recreating the evolution of research projects performed by multiple people. Overall, NeuroManager provides the infrastructure needed to improve workflow, manage multiple simultaneous simulations, and maintain provenance of the potentially large amounts of data produced during the course of a research project.

  2. Computing Finite-Time Lyapunov Exponents with Optimally Time Dependent Reduction

    NASA Astrophysics Data System (ADS)

    Babaee, Hessam; Farazmand, Mohammad; Sapsis, Themis; Haller, George

    2016-11-01

    We present a method to compute Finite-Time Lyapunov Exponents (FTLE) of a dynamical system using Optimally Time-Dependent (OTD) reduction recently introduced by H. Babaee and T. P. Sapsis. The OTD modes are a set of finite-dimensional, time-dependent, orthonormal basis elements $\{u_i(x,t)\}_{i=1}^{N}$ that capture the directions associated with transient instabilities. The evolution equation of the OTD modes is derived from a minimization principle that optimally approximates the most unstable directions over finite times. To compute the FTLE, we evolve a single OTD mode along with the nonlinear dynamics. We approximate the FTLE from the reduced system obtained from projecting the instantaneous linearized dynamics onto the OTD mode. This results in a significant reduction in the computational cost compared to conventional methods for computing FTLE. We demonstrate the efficiency of our method for double gyre and ABC flows. ARO project 66710-EG-YIP.
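
    For reference, the quantity being approximated can be written explicitly. With the flow map $\Phi_{t_0}^{t_0+T}$ advancing initial conditions over the horizon $T$, the finite-time Lyapunov exponent is

      \sigma_{t_0}^{T}(x) = \frac{1}{|T|} \ln \sqrt{\lambda_{\max}\!\left(C_{t_0}^{t_0+T}(x)\right)}, \qquad C_{t_0}^{t_0+T}(x) = \left(\nabla \Phi_{t_0}^{t_0+T}(x)\right)^{\!\top} \nabla \Phi_{t_0}^{t_0+T}(x).

    In the OTD-reduced approach described above, the growth of a single evolving mode stands in for the dominant singular value of $\nabla \Phi$, which is what avoids differentiating the full flow map.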

  3. Development and Evaluation of Sterographic Display for Lung Cancer Screening

    DTIC Science & Technology

    2008-12-01

    burden. Application of GPUs – With the evolution of commodity graphics processing units (GPUs) for accelerating games on personal computers, over the...units, which are designed for rendering computer games, are readily available and can be programmed to perform the kinds of real-time calculations...

  4. The Evolution of a Connectionist Model of Situated Human Language Understanding

    NASA Astrophysics Data System (ADS)

    Mayberry, Marshall R.; Crocker, Matthew W.

    The Adaptive Mechanisms in Human Language Processing (ALPHA) project features both experimental and computational tracks designed to complement each other in the investigation of the cognitive mechanisms that underlie situated human utterance processing. The models developed in the computational track replicate results obtained in the experimental track and, in turn, suggest further experiments by virtue of behavior that arises as a by-product of their operation.

  5. An alternative approach for neural network evolution with a genetic algorithm: crossover by combinatorial optimization.

    PubMed

    García-Pedrajas, Nicolás; Ortiz-Boyer, Domingo; Hervás-Martínez, César

    2006-05-01

    In this work we present a new approach to the crossover operator in the genetic evolution of neural networks. The most widely used evolutionary computation paradigm for neural network evolution is evolutionary programming. This paradigm is usually preferred due to the problems caused by the application of crossover to neural network evolution. However, crossover is the most innovative operator within the field of evolutionary computation. One of the most notorious problems with the application of crossover to neural networks is known as the permutation problem. This problem occurs due to the fact that the same network can be represented in a genetic coding by many different codifications. Our approach modifies the standard crossover operator taking into account the special features of the individuals to be mated. We present a new model for mating individuals that considers the structure of the hidden layer and redefines the crossover operator. As each hidden node represents a non-linear projection of the input variables, we approach the crossover as a problem of combinatorial optimization. We can formulate the problem as the extraction of a subset of near-optimal projections to create the hidden layer of the new network. This new approach is compared to a classical crossover in 25 real-world problems with excellent performance. Moreover, the networks obtained are much smaller than those obtained with the classical crossover operator.
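
    A minimal sketch of the idea, offered as an illustration rather than the authors' exact algorithm: each hidden unit of a single-hidden-layer network is treated as a candidate "projection", the units of both parents are pooled, and a near-optimal subset is chosen greedily by how well the resulting offspring fits a small validation set. The greedy scoring, network sizes, and toy data below are assumptions made for the demo.

      import numpy as np

      rng = np.random.default_rng(0)

      def forward(W_in, w_out, X):
          """Single-hidden-layer network: tanh hidden units, linear output."""
          return np.tanh(X @ W_in) @ w_out

      def fit_output(W_in, X, y):
          """Refit the linear output weights for a fixed hidden layer (least squares)."""
          H = np.tanh(X @ W_in)
          w_out, *_ = np.linalg.lstsq(H, y, rcond=None)
          return w_out

      def crossover(parent_a, parent_b, X_val, y_val, n_hidden):
          """Build an offspring hidden layer by greedily selecting hidden units
          (columns of the parents' input-weight matrices) from the pooled parents."""
          pool = np.hstack([parent_a, parent_b])       # candidate hidden units
          chosen = []
          for _ in range(n_hidden):
              best_err, best_j = np.inf, None
              for j in range(pool.shape[1]):
                  if j in chosen:
                      continue
                  W_try = pool[:, chosen + [j]]
                  w_out = fit_output(W_try, X_val, y_val)
                  err = np.mean((forward(W_try, w_out, X_val) - y_val) ** 2)
                  if err < best_err:
                      best_err, best_j = err, j
              chosen.append(best_j)
          return pool[:, chosen]

      # Tiny demo: two random parents, offspring selected against a toy target.
      X_val = rng.normal(size=(50, 3))
      y_val = np.sin(X_val[:, 0]) + 0.5 * X_val[:, 1]
      parent_a = rng.normal(size=(3, 5))
      parent_b = rng.normal(size=(3, 5))
      child = crossover(parent_a, parent_b, X_val, y_val, n_hidden=5)
      print("offspring hidden-layer shape:", child.shape)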

  6. Trinity to Trinity 1945-2015

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Moniz, Ernest; Carr, Alan; Bethe, Hans

    The Trinity Test of July 16, 1945 was the first full-scale, real-world test of a nuclear weapon; with the new Trinity supercomputer Los Alamos National Laboratory's goal is to do this virtually, in 3D. Trinity was the culmination of a fantastic effort of groundbreaking science and engineering by hundreds of men and women at Los Alamos and other Manhattan Project sites. It took them less than two years to change the world. The Laboratory is marking the 70th anniversary of the Trinity Test because it not only ushered in the Nuclear Age, but with it the origin of today’s advanced supercomputing. We live in the Age of Supercomputers due in large part to nuclear weapons science here at Los Alamos. National security science, and nuclear weapons science in particular, at Los Alamos National Laboratory have provided a key motivation for the evolution of large-scale scientific computing. Beginning with the Manhattan Project there has been a constant stream of increasingly significant, complex problems in nuclear weapons science whose timely solutions demand larger and faster computers. The relationship between national security science at Los Alamos and the evolution of computing is one of interdependence.

  7. Trinity to Trinity 1945-2015

    ScienceCinema

    Moniz, Ernest; Carr, Alan; Bethe, Hans; Morrison, Phillip; Ramsay, Norman; Teller, Edward; Brixner, Berlyn; Archer, Bill; Agnew, Harold; Morrison, John

    2018-01-16

    The Trinity Test of July 16, 1945 was the first full-scale, real-world test of a nuclear weapon; with the new Trinity supercomputer Los Alamos National Laboratory's goal is to do this virtually, in 3D. Trinity was the culmination of a fantastic effort of groundbreaking science and engineering by hundreds of men and women at Los Alamos and other Manhattan Project sites. It took them less than two years to change the world. The Laboratory is marking the 70th anniversary of the Trinity Test because it not only ushered in the Nuclear Age, but with it the origin of today’s advanced supercomputing. We live in the Age of Supercomputers due in large part to nuclear weapons science here at Los Alamos. National security science, and nuclear weapons science in particular, at Los Alamos National Laboratory have provided a key motivation for the evolution of large-scale scientific computing. Beginning with the Manhattan Project there has been a constant stream of increasingly significant, complex problems in nuclear weapons science whose timely solutions demand larger and faster computers. The relationship between national security science at Los Alamos and the evolution of computing is one of interdependence.

  8. Preparing for the Integration of Emerging Technologies.

    ERIC Educational Resources Information Center

    Dyrli, Odvard Egil; Kinnaman, Daniel E.

    1994-01-01

    Discussion of the process of integrating new technologies into schools considers the evolution of technology, including personal computers, CD-ROMs, hypermedia, and networking/communications; the transition from Industrial-Age to Information-Age schools; and the logical steps of transition. Sidebars discuss a networked multimedia pilot project and…

  9. The Technological Evolution in Schools: Reflections and Projections.

    ERIC Educational Resources Information Center

    Higgins, James E.

    1991-01-01

    Presents a first-person account of one teacher's experiences with computer hardware and software. The article discusses various programs and applications, such as integrated learning systems, database searching via CD-ROM, desktop publishing, authoring programs, and indicates future changes in instruction with increasing use of technology. (SM)

  10. The Evolution of Untethered Communications.

    ERIC Educational Resources Information Center

    National Academy of Sciences - National Research Council, Washington, DC.

    In response to a request from the Defense Advanced Research Projects Agency (DARPA), the Computer Science and Telecommunications Board (CSTB) of the National Research Council initiated a one-year study on untethered communications in July 1996. To carry out the study, the CSTB appointed a committee of 15 wireless-technology experts, including…

  11. CF Metadata Conventions: Founding Principles, Governance, and Future Directions

    NASA Astrophysics Data System (ADS)

    Taylor, K. E.

    2016-12-01

    The CF Metadata Conventions define attributes that promote sharing of climate and forecasting data and facilitate automated processing by computers. The development, maintenance, and evolution of the conventions have mainly been provided by voluntary community contributions. Nevertheless, an organizational framework has been established, which relies on established rules and web-based discussion to ensure smooth (but relatively efficient) evolution of the standard to accommodate new types of data. The CF standard has been essential to the success of high-profile internationally-coordinated modeling activities (e.g, the Coupled Model Intercomparison Project). A summary of CF's founding principles and the prospects for its future evolution will be discussed.
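
    As a concrete illustration of what the conventions look like in practice, the sketch below writes a tiny CF-style file with the netCDF4 Python package. The variable, file name, and values are made up for the example, but `units`, `standard_name`, and the global `Conventions` attribute are the kinds of metadata the CF conventions define.

      import numpy as np
      from netCDF4 import Dataset

      ds = Dataset("example_cf.nc", "w")   # create a small example file
      ds.Conventions = "CF-1.8"            # declare the convention version (illustrative)
      ds.title = "Toy CF example"

      ds.createDimension("time", None)     # unlimited time dimension
      time = ds.createVariable("time", "f8", ("time",))
      time.units = "days since 2000-01-01 00:00:00"
      time.calendar = "standard"

      tas = ds.createVariable("tas", "f4", ("time",))
      tas.standard_name = "air_temperature"    # a name from the CF standard-name table
      tas.units = "K"
      tas.cell_methods = "time: mean"

      time[:] = np.arange(3)
      tas[:] = [288.1, 288.4, 287.9]
      ds.close()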

  12. United States Air Force Computer-Aided Acquisition and Logistics Support (CALS) Evolution of Computer Integrated Manufacturing (CIM) Technologies

    DTIC Science & Technology

    1988-11-01

    [Figure-list fragment: ...Manufacturing System; 4. Similar Parts Based Shape or Manufacturing Process; 5. Projected Annual Unit Robot Sales and Installed Base Through 1992; 6. U.S...] ...effort needed to perform the personnel, product design, marketing and advertising, and finance tasks of the firm. Level III controls the resource...planning and accounting functions of the firm. Systems at this level support purchasing, accounts payable, accounts receivable, master scheduling and sales

  13. Developing the Cyber Defenders of Tomorrow with Regional Collegiate Cyber Defense Competitions (CCDC)

    ERIC Educational Resources Information Center

    Carlin, Anna; Manson, Daniel P.; Zhu, Jake

    2010-01-01

    With the projected higher demand for Network Systems Analysts and increasing computer crime, network security specialists are an organization's first line of defense. The principal function of this paper is to provide the evolution of Collegiate Cyber Defense Competitions (CCDC), event planning required, soliciting sponsors, recruiting personnel…

  14. Gradient optimization of finite projected entangled pair states

    NASA Astrophysics Data System (ADS)

    Liu, Wen-Yuan; Dong, Shao-Jun; Han, Yong-Jian; Guo, Guang-Can; He, Lixin

    2017-05-01

    Projected entangled pair states (PEPS) methods have been proven to be powerful tools to solve strongly correlated quantum many-body problems in two dimensions. However, due to the high computational scaling with the virtual bond dimension D , in a practical application, PEPS are often limited to rather small bond dimensions, which may not be large enough for some highly entangled systems, for instance, frustrated systems. Optimization of the ground state using the imaginary time evolution method with a simple update scheme may go to a larger bond dimension. However, the accuracy of the rough approximation to the environment of the local tensors is questionable. Here, we demonstrate that by combining the imaginary time evolution method with a simple update, Monte Carlo sampling techniques and gradient optimization will offer an efficient method to calculate the PEPS ground state. By taking advantage of massive parallel computing, we can study quantum systems with larger bond dimensions up to D =10 without resorting to any symmetry. Benchmark tests of the method on the J1-J2 model give impressive accuracy compared with exact results.
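
    The gradient that such an optimization needs can be estimated by Monte Carlo sampling. As a hedged summary, the following is the generic variational Monte Carlo expression (not a formula quoted from the paper):

      E(\theta) = \frac{\langle \Psi(\theta) | H | \Psi(\theta) \rangle}{\langle \Psi(\theta) | \Psi(\theta) \rangle}, \qquad \partial_{\theta_k} E = 2\,\mathrm{Re}\left[ \langle E_{\rm loc}\, O_k^{*} \rangle - \langle E_{\rm loc} \rangle \langle O_k^{*} \rangle \right],

    with local energy $E_{\rm loc}(s) = \sum_{s'} H_{s s'} \Psi(s')/\Psi(s)$ and logarithmic derivative $O_k(s) = \partial_{\theta_k} \ln \Psi(s)$, the averages taken over configurations $s$ sampled from $|\Psi(s)|^2$.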

  15. Models of Protocellular Structure, Function and Evolution

    NASA Technical Reports Server (NTRS)

    New, Michael H.; Pohorille, Andrew; Szostak, Jack W.; Keefe, Tony; Lanyi, Janos K.; DeVincenzi, Donald L. (Technical Monitor)

    2001-01-01

    In the absence of any record of protocells, the most direct way to test our understanding of the origin of cellular life is to construct laboratory models that capture important features of protocellular systems. Such efforts are currently underway in a collaborative project between NASA-Ames, Harvard Medical School and University of California. They are accompanied by computational studies aimed at explaining self-organization of simple molecules into ordered structures. The centerpiece of this project is a method for the in vitro evolution of protein enzymes toward arbitrary catalytic targets. A similar approach has already been developed for nucleic acids in which a small number of functional molecules are selected from a large, random population of candidates. The selected molecules are next vastly multiplied using the polymerase chain reaction.

  16. Time-Accurate Solutions of Incompressible Navier-Stokes Equations for Potential Turbopump Applications

    NASA Technical Reports Server (NTRS)

    Kiris, Cetin; Kwak, Dochan

    2001-01-01

    Two numerical procedures, one based on the artificial compressibility method and the other on the pressure projection method, are outlined for obtaining time-accurate solutions of the incompressible Navier-Stokes equations. The performance of the two methods is compared by obtaining unsteady solutions for the evolution of twin vortices behind a flat plate. Calculated results are compared with experimental and other numerical results. For an unsteady flow which requires a small physical time step, the pressure projection method was found to be computationally efficient since it does not require any subiteration procedure. It was observed that the artificial compressibility method requires a fast convergence scheme at each physical time step in order to satisfy the incompressibility condition. This was obtained by using a GMRES-ILU(0) solver in our computations. When a line-relaxation scheme was used, the time accuracy was degraded and time-accurate computations became very expensive.
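
    The two formulations compared above can be summarized compactly in standard notation (a sketch, not the exact discretization of the paper). Artificial compressibility appends a pseudo-time pressure equation,

      \frac{\partial p}{\partial \tau} + \beta\, \nabla \cdot \mathbf{u} = 0,

    which must be subiterated to convergence at every physical time step, whereas the pressure-projection method enforces incompressibility through one Poisson solve per step,

      \nabla^2 \phi = \frac{\nabla \cdot \mathbf{u}^{*}}{\Delta t}, \qquad \mathbf{u}^{n+1} = \mathbf{u}^{*} - \Delta t\, \nabla \phi,

    where $\mathbf{u}^{*}$ is the provisional velocity from the momentum equation; this is why the projection approach avoids the subiteration cost noted above.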

  17. Partnerships for Change. Research.

    ERIC Educational Resources Information Center

    David, Jane L.

    Based on visits to four Apple Classrooms of Tomorrow (ACOT) sites in the spring of 1990 and interviews with Apple ACOT staff, this paper represents the first round of a 3-year study for Apple Computer, Inc., about the role of ACOT in educational restructuring. An overview of the ACOT project is presented, and the evolution and structure of ACOT…

  18. Climate change-driven cliff and beach evolution at decadal to centennial time scales

    USGS Publications Warehouse

    Erikson, Li; O'Neill, Andrea; Barnard, Patrick; Vitousek, Sean; Limber, Patrick

    2017-01-01

    Here we develop a computationally efficient method that evolves cross-shore profiles of sand beaches with or without cliffs along natural and urban coastal environments and across expansive geographic areas at decadal to centennial time-scales driven by 21st century climate change projections. The model requires projected sea level rise rates, extrema of nearshore wave conditions, bluff recession and shoreline change rates, and cross-shore profiles representing present-day conditions. The model is applied to the ~470-km long coast of the Southern California Bight, USA, using recently available projected nearshore waves and bluff recession and shoreline change rates. The results indicate that eroded cliff material, from unarmored cliffs, contribute 11% to 26% to the total sediment budget. Historical beach nourishment rates will need to increase by more than 30% for a 0.25 m sea level rise (~2044) and by at least 75% by the year 2100 for a 1 m sea level rise, if evolution of the shoreline is to keep pace with rising sea levels.

  19. Helios: Understanding Solar Evolution Through Text Analytics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Randazzese, Lucien

    This proof-of-concept project focused on developing, testing, and validating a range of bibliometric, text analytic, and machine-learning based methods to explore the evolution of three photovoltaic (PV) technologies: Cadmium Telluride (CdTe), Dye-Sensitized solar cells (DSSC), and Multi-junction solar cells. The analytical approach to the work was inspired by previous work by the same team to measure and predict the scientific prominence of terms and entities within specific research domains. The goal was to create tools that could assist domain-knowledgeable analysts in investigating the history and path of technological developments in general, with a focus on analyzing step-function changes in performance, or “breakthroughs,” in particular. The text-analytics platform developed during this project was dubbed Helios. The project relied on computational methods for analyzing large corpora of technical documents. For this project we ingested technical documents from the following sources into Helios: Thomson Scientific Web of Science (papers), the U.S. Patent & Trademark Office (patents), the U.S. Department of Energy (technical documents), the U.S. National Science Foundation (project funding summaries), and a hand curated set of full-text documents from Thomson Scientific and other sources.

  20. Acceleration of incremental-pressure-correction incompressible flow computations using a coarse-grid projection method

    NASA Astrophysics Data System (ADS)

    Kashefi, Ali; Staples, Anne

    2016-11-01

    Coarse grid projection (CGP) methodology is a novel multigrid method for systems involving decoupled nonlinear evolution equations and linear elliptic equations. The nonlinear equations are solved on a fine grid and the linear equations are solved on a corresponding coarsened grid. Mapping functions transfer data between the two grids. Here we propose a version of CGP for incompressible flow computations using incremental pressure correction methods, called IFEi-CGP (implicit-time-integration, finite-element, incremental coarse grid projection). Incremental pressure correction schemes solve Poisson's equation for an intermediate variable and not the pressure itself. This fact contributes to IFEi-CGP's efficiency in two ways. First, IFEi-CGP preserves the velocity field accuracy even for a high level of pressure field grid coarsening and thus significant speedup is achieved. Second, because incremental schemes reduce the errors that arise from boundaries with artificial homogeneous Neumann conditions, CGP generates undamped flows for simulations with velocity Dirichlet boundary conditions. Comparisons of the data accuracy and CPU times for the incremental-CGP versus non-incremental-CGP computations are presented.
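
    A schematic of the incremental coarse-grid step described above, written in generic notation rather than the paper's exact finite-element formulation: the Poisson problem is posed for the pressure increment $\phi$ and solved on a coarsened grid,

      \nabla^2 \phi_c = \frac{\rho}{\Delta t}\, \mathcal{R}\!\left(\nabla \cdot \mathbf{u}^{*}\right), \qquad \phi = \mathcal{P}(\phi_c), \qquad p^{n+1} = p^{n} + \phi, \qquad \mathbf{u}^{n+1} = \mathbf{u}^{*} - \frac{\Delta t}{\rho}\, \nabla \phi,

    where $\mathcal{R}$ and $\mathcal{P}$ are the restriction and prolongation (mapping) operators between the fine velocity grid and the coarse pressure-increment grid, and $\mathbf{u}^{*}$ is the provisional velocity computed on the fine grid.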

  1. Computational fluid dynamics applications at McDonnell Douglas

    NASA Technical Reports Server (NTRS)

    Hakkinen, R. J.

    1987-01-01

    Representative examples are presented of applications and development of advanced Computational Fluid Dynamics (CFD) codes for aerodynamic design at the McDonnell Douglas Corporation (MDC). Transonic potential and Euler codes, interactively coupled with boundary layer computation, and solutions of slender-layer Navier-Stokes approximation are applied to aircraft wing/body calculations. An optimization procedure using evolution theory is described in the context of transonic wing design. Euler methods are presented for analysis of hypersonic configurations, and helicopter rotors in hover and forward flight. Several of these projects were accepted for access to the Numerical Aerodynamic Simulation (NAS) facility at the NASA-Ames Research Center.

  2. The beginning of the space age: information and mathematical aspect. To the 60th anniversary of the launch of the first sputnik

    NASA Astrophysics Data System (ADS)

    Sushkevich, T. A.

    2017-11-01

    60 years ago, on 4 October 1957, the USSR successfully launched the FIRST SPUTNIK (artificial Earth satellite) into space. The space age is counted from this date. Information and mathematical software is an integral component of any space project. The paper discusses the history and future of space exploration and the role of mathematics and computers, and presents a large list of publications for illustration. It is important to pay attention to the role of mathematics and computer science in space projects and research, in remote sensing problems, and in the evolution of the Earth's environment and climate, where the theory of radiation transfer plays a key role, as well as to the achievements of Russian scientists at the dawn of the space age.

  3. Evolution of the ATLAS distributed computing system during the LHC long shutdown

    NASA Astrophysics Data System (ADS)

    Campana, S.; Atlas Collaboration

    2014-06-01

    The ATLAS Distributed Computing project (ADC) was established in 2007 to develop and operate a framework, following the ATLAS computing model, to enable data storage, processing and bookkeeping on top of the Worldwide LHC Computing Grid (WLCG) distributed infrastructure. ADC development has always been driven by operations and this contributed to its success. The system has fulfilled the demanding requirements of ATLAS, daily consolidating worldwide up to 1 PB of data and running more than 1.5 million payloads distributed globally, supporting almost one thousand concurrent distributed analysis users. Comprehensive automation and monitoring minimized the operational manpower required. The flexibility of the system to adjust to operational needs has been important to the success of the ATLAS physics program. The LHC shutdown in 2013-2015 affords an opportunity to improve the system in light of operational experience and scale it to cope with the demanding requirements of 2015 and beyond, most notably a much higher trigger rate and event pileup. We will describe the evolution of the ADC software foreseen during this period. This includes consolidating the existing Production and Distributed Analysis framework (PanDA) and ATLAS Grid Information System (AGIS), together with the development and commissioning of next generation systems for distributed data management (DDM/Rucio) and production (Prodsys-2). We will explain how new technologies such as Cloud Computing and NoSQL databases, which ATLAS investigated as R&D projects in past years, will be integrated in production. Finally, we will describe more fundamental developments such as breaking job-to-data locality by exploiting storage federations and caches, and event level (rather than file or dataset level) workload engines.

  4. Lessons learned in creating spacecraft computer systems: Implications for using Ada (R) for the space station

    NASA Technical Reports Server (NTRS)

    Tomayko, James E.

    1986-01-01

    Twenty-five years of spacecraft onboard computer development have resulted in a better understanding of the requirements for effective, efficient, and fault tolerant flight computer systems. Lessons from eight flight programs (Gemini, Apollo, Skylab, Shuttle, Mariner, Voyager, and Galileo) and three research programs (digital fly-by-wire, STAR, and the Unified Data System) are useful in projecting the computer hardware configuration of the Space Station and the ways in which the Ada programming language will enhance the development of the necessary software. The evolution of hardware technology, fault protection methods, and software architectures used in space flight is reviewed in order to provide insight into the pending development of such items for the Space Station.

  5. Computer simulations for the Mars Atmospheric and Volatile EvolutioN (MAVEN) mission through NASA's "Project Spectra!"

    NASA Astrophysics Data System (ADS)

    Christofferson, R.; Wood, E. L.; Euler, G.

    2012-12-01

    "Project Spectra!" is a standards-based light science and engineering program on solar system exploration that includes both hands-on paper and pencil activities as well as Flash-based computer games that help students solidify understanding of high-level planetary and solar physics. Using computer interactive games where students experience and manipulate the information makes abstract concepts accessible. Visualizing lessons with multi-media tools solidifies understanding and retention of knowledge. Since students can choose what to watch and explore, the interactives accommodate a broad range of learning styles. Students can go back and forth through the interactives if they've missed a concept or wish to view something again. In the end, students are asked critical thinking questions and conduct web-based research. As a part of the Mars Atmospheric and Volatile EvolutioN (MAVEN) mission education programming, we've developed two new "Project Spectra!" interactives that go hand-in-hand with a paper and pencil activity. The MAVEN mission will study volatiles in the upper atmosphere to help piece together Mars' climate history. In the first interactive, students explore black body radiation, albedo, and a simplified greenhouse effect to establish what factors contribute to overall planetary temperature and how they contribute. Students are asked to create a scenario in which a planet they build and design is able to maintain liquid water on the surface. In the second interactive, students are asked to consider Mars and the conditions needed for Mars to support water on the surface, keeping some variables fixed. Ideally, students will walk away with the very basic and critical elements required for climate studies, which has far-reaching implications beyond the study of Mars. These interactives are currently being pilot tested at Arvada High School in Colorado.

  6. Computer simulations for the Mars Atmospheric and Volatile EvolutioN (MAVEN) mission through NASA's 'Project Spectra!'

    NASA Astrophysics Data System (ADS)

    Wood, E. L.

    2013-12-01

    'Project Spectra!' is a standards-based light science and engineering program on solar system exploration that includes both hands-on paper and pencil activities as well as Flash-based computer games that help students solidify understanding of high-level planetary and solar physics. Using computer interactive games where students experience and manipulate the information makes abstract concepts accessible. Visualizing lessons with multi-media tools solidifies understanding and retention of knowledge. Since students can choose what to watch and explore, the interactives accommodate a broad range of learning styles. Students can go back and forth through the interactives if they've missed a concept or wish to view something again. In the end, students are asked critical thinking questions and conduct web-based research. As a part of the Mars Atmospheric and Volatile EvolutioN (MAVEN) mission education programming, we've developed two new 'Project Spectra!' interactives that go hand-in-hand with a paper and pencil activity. The MAVEN mission will study volatiles in the upper atmosphere to help piece together Mars' climate history. In the first interactive, students explore black body radiation, albedo, and a simplified greenhouse effect to establish what factors contribute to overall planetary temperature and how they contribute. Students are asked to create a scenario in which a planet they build and design is able to maintain liquid water on the surface. In the second interactive, students are asked to consider Mars and the conditions needed for Mars to support water on the surface, keeping some variables fixed. Ideally, students will walk away with the very basic and critical elements required for climate studies, which has far-reaching implications beyond the study of Mars. These interactives were pilot tested at Arvada High School in Colorado.
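
    The physics behind the first interactive reduces to a familiar energy balance. As a hedged summary (the classroom interactive may parameterize things differently), the equilibrium temperature of a planet with Bond albedo $A$ receiving stellar flux $S$ is

      T_{\rm eq} = \left[ \frac{S (1 - A)}{4 \sigma} \right]^{1/4},

    and a single perfectly absorbing greenhouse layer raises the surface temperature to $T_{\rm s} = 2^{1/4}\, T_{\rm eq}$. For Earth-like values ($S \approx 1361\ \mathrm{W\,m^{-2}}$, $A \approx 0.3$) this gives $T_{\rm eq} \approx 255\ \mathrm{K}$, which is the kind of calculation students manipulate when deciding whether their designed planet can keep liquid water on its surface.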

  7. Soft evolution of multi-jet final states

    DOE PAGES

    Gerwick, Erik; Schumann, Steffen; Höche, Stefan; ...

    2015-02-16

    We present a new framework for computing resummed and matched distributions in processes with many hard QCD jets. The intricate color structure of soft gluon emission at large angles renders resummed calculations highly non-trivial in this case. We automate all ingredients necessary for the color evolution of the soft function at next-to-leading-logarithmic accuracy, namely the selection of the color bases and the projections of color operators and Born amplitudes onto those bases. Explicit results for all QCD processes with up to 2 → 5 partons are given. We also devise a new tree-level matching scheme for resummed calculations which exploits a quasi-local subtraction based on the Catani-Seymour dipole formalism. We implement both resummation and matching in the Sherpa event generator. As a proof of concept, we compute the resummed and matched transverse-thrust distribution for hadronic collisions.
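
    Schematically (a standard NLL structure, not a formula quoted from the paper), the Born amplitude is decomposed on a color basis $\{|c_i\rangle\}$ and the soft function evolves between scales through a color-space anomalous dimension:

      |\mathcal{M}\rangle = \sum_i M_i\, |c_i\rangle, \qquad \mathbf{S}(\mu) = \mathbf{V}^{\dagger}(\mu, \mu_0)\, \mathbf{S}(\mu_0)\, \mathbf{V}(\mu, \mu_0), \qquad \mathbf{V}(\mu, \mu_0) = \mathcal{P} \exp\!\left[ - \int_{\mu_0}^{\mu} \frac{d\mu'}{\mu'}\, \mathbf{\Gamma}(\mu') \right],

    where $\mathbf{\Gamma}$ is the soft anomalous dimension matrix in the chosen basis; the "projections of color operators and Born amplitudes onto those bases" referred to above are the matrix elements needed to build $\mathbf{\Gamma}$ and $\mathbf{S}$ in each process.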

  8. The Philosophy of the Starship -- Revisited

    NASA Astrophysics Data System (ADS)

    Parkinson, B.

    2014-06-01

    The concept of a long-term, open ended project like the starship raises questions about why the human race should engage in such an endeavour. Thinking about such a project inevitably means considering the effects of change ­ not only in our technology, but also in our understanding of the world and even to ourselves. Project Daedalus was carried out in a time before the "Modern Age" with its instant communication, availability of information and massive computing resources, and the change is continuing. Considerations of possible encounters with extraterrestrials and even our evolution into "post-human" societies constrain us not to think about the future of an interstellar humanity but the future of creative consciousness.

  9. 1993 Annual report on scientific programs: A broad research program on the sciences of complexity

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    1993-12-31

    This report provides a summary of many of the research projects completed by the Santa Fe Institute (SFI) during 1993. These research efforts continue to focus on two general areas: the study of, and search for, underlying scientific principles governing complex adaptive systems, and the exploration of new theories of computation that incorporate natural mechanisms of adaptation (mutation, genetics, evolution).

  10. Low latency network and distributed storage for next generation HPC systems: the ExaNeSt project

    NASA Astrophysics Data System (ADS)

    Ammendola, R.; Biagioni, A.; Cretaro, P.; Frezza, O.; Lo Cicero, F.; Lonardo, A.; Martinelli, M.; Paolucci, P. S.; Pastorelli, E.; Pisani, F.; Simula, F.; Vicini, P.; Navaridas, J.; Chaix, F.; Chrysos, N.; Katevenis, M.; Papaeustathiou, V.

    2017-10-01

    With processor architecture evolution, the HPC market has undergone a paradigm shift. The adoption of low-cost, Linux-based clusters extended the reach of HPC from its roots in modelling and simulation of complex physical systems to a broader range of industries, from biotechnology, cloud computing, computer analytics and big data challenges to manufacturing sectors. From this perspective, near-future HPC systems can be envisioned as composed of millions of low-power computing cores, densely packed (and therefore requiring appropriate cooling technology), with a tightly interconnected, low-latency and high-performance network, and equipped with a distributed storage architecture. Each of these features (dense packing, distributed storage and high-performance interconnect) represents a challenge, made all the harder by the need to solve them at the same time. These challenges lie as stumbling blocks along the road towards Exascale-class systems; the ExaNeSt project acknowledges them and tasks itself with investigating ways around them.

  11. The open science grid

    NASA Astrophysics Data System (ADS)

    Pordes, Ruth; OSG Consortium; Petravick, Don; Kramer, Bill; Olson, Doug; Livny, Miron; Roy, Alain; Avery, Paul; Blackburn, Kent; Wenaus, Torre; Würthwein, Frank; Foster, Ian; Gardner, Rob; Wilde, Mike; Blatecky, Alan; McGee, John; Quick, Rob

    2007-07-01

    The Open Science Grid (OSG) provides a distributed facility where the Consortium members provide guaranteed and opportunistic access to shared computing and storage resources. OSG provides support for and evolution of the infrastructure through activities that cover operations, security, software, troubleshooting, addition of new capabilities, and support for existing and engagement with new communities. The OSG SciDAC-2 project provides specific activities to manage and evolve the distributed infrastructure and support its use. The innovative aspects of the project are the maintenance and performance of a collaborative (shared & common) petascale national facility over tens of autonomous computing sites, for many hundreds of users, transferring terabytes of data a day, executing tens of thousands of jobs a day, and providing robust and usable resources for scientific groups of all types and sizes. More information can be found at the OSG web site: www.opensciencegrid.org.

  12. Monochromatic-beam-based dynamic X-ray microtomography based on OSEM-TV algorithm.

    PubMed

    Xu, Liang; Chen, Rongchang; Yang, Yiming; Deng, Biao; Du, Guohao; Xie, Honglan; Xiao, Tiqiao

    2017-01-01

    Monochromatic-beam-based dynamic X-ray computed microtomography (CT) was developed to observe the evolution of microstructure inside samples. However, the low flux density results in low efficiency in data collection. To increase efficiency, reducing the number of projections should be a practical solution; however, it degrades image reconstruction quality when the traditional filtered back projection (FBP) algorithm is used. In this study, an iterative reconstruction method using an ordered subset expectation maximization-total variation (OSEM-TV) algorithm was employed to address this problem. The simulated results demonstrated that the normalized mean square error of the image slices reconstructed by the OSEM-TV algorithm was about 1/4 of that by FBP. Experimental results also demonstrated that the density resolution of OSEM-TV was high enough to resolve different materials with fewer than 100 projections. As a result, with the introduction of OSEM-TV, monochromatic-beam-based dynamic X-ray microtomography is potentially practicable for quantitative and non-destructive analysis of the evolution of microstructure, with acceptable efficiency in data collection and reconstructed image quality.
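
    As a rough illustration of the kind of ordered-subset EM update with total-variation regularization described above, the Python sketch below reconstructs a 1D toy "image" from a random system matrix. The system matrix, phantom, subset count and TV step are illustrative assumptions, not the authors' implementation.

      import numpy as np

      # Minimal OSEM-style reconstruction with a crude TV smoothing step (illustrative only).
      rng = np.random.default_rng(0)

      n_pix, n_proj = 64, 48                      # small 1D "image" and few projections (assumed sizes)
      A = rng.random((n_proj, n_pix))             # stand-in system matrix (real CT would use a projector)
      x_true = np.zeros(n_pix); x_true[20:40] = 1.0
      b = A @ x_true                              # noiseless projection data

      def tv_smooth(x, weight=0.1, iters=5):
          """A few explicit gradient steps on a smoothed 1D total-variation penalty."""
          eps = 1e-6
          for _ in range(iters):
              d = np.diff(x)
              t = d / np.sqrt(d * d + eps)
              g = np.zeros_like(x)
              g[:-1] -= t                         # gradient contribution from the forward difference
              g[1:] += t                          # gradient contribution from the backward difference
              x = np.maximum(x - weight * g, 0.0)
          return x

      n_subsets = 4
      subsets = np.array_split(np.arange(n_proj), n_subsets)
      x = np.ones(n_pix)                          # positive initial guess
      for it in range(20):
          for s in subsets:                       # ordered-subset EM (multiplicative) update
              As = A[s]
              ratio = b[s] / np.maximum(As @ x, 1e-12)
              x *= (As.T @ ratio) / np.maximum(As.T @ np.ones(len(s)), 1e-12)
          x = tv_smooth(x)                        # TV regularization between OSEM passes

      print("relative error:", np.linalg.norm(x - x_true) / np.linalg.norm(x_true))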

  13. Simulation of Quantum Many-Body Dynamics for Generic Strongly-Interacting Systems

    NASA Astrophysics Data System (ADS)

    Meyer, Gregory; Machado, Francisco; Yao, Norman

    2017-04-01

    Recent experimental advances have enabled the bottom-up assembly of complex, strongly interacting quantum many-body systems from individual atoms, ions, molecules and photons. These advances open the door to studying dynamics in isolated quantum systems as well as the possibility of realizing novel out-of-equilibrium phases of matter. Numerical studies provide insight into these systems; however, computational time and memory usage limit common numerical methods such as exact diagonalization to relatively small Hilbert spaces of dimension 2^15. Here we present progress toward a new software package for dynamical time evolution of large generic quantum systems on massively parallel computing architectures. By projecting large sparse Hamiltonians into a much smaller Krylov subspace, we are able to compute the evolution of strongly interacting systems with Hilbert space dimension nearing 2^30. We discuss and benchmark different design implementations, such as matrix-free methods and GPU based calculations, using both pre-thermal time crystals and the Sachdev-Ye-Kitaev model as examples. We also include a simple symbolic language to describe generic Hamiltonians, allowing simulation of diverse quantum systems without any modification of the underlying C and Fortran code.
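
    The Krylov-subspace idea mentioned above can be illustrated with a short Lanczos propagation sketch in Python; the random sparse "Hamiltonian", its size and the subspace dimension are assumptions chosen for illustration rather than details of the software package described in the abstract.

      import numpy as np
      from scipy.linalg import expm
      from scipy.sparse import random as sparse_random

      def krylov_expm(H, psi, dt, m=30):
          """Approximate exp(-i*H*dt) @ psi by projecting H into an m-dimensional Krylov (Lanczos) subspace."""
          n = psi.shape[0]
          V = np.zeros((n, m), dtype=complex)
          alpha = np.zeros(m)
          beta = np.zeros(m - 1)
          V[:, 0] = psi / np.linalg.norm(psi)
          for j in range(m):
              w = H @ V[:, j]
              alpha[j] = np.vdot(V[:, j], w).real
              w -= alpha[j] * V[:, j]
              if j > 0:
                  w -= beta[j - 1] * V[:, j - 1]
              if j < m - 1:
                  beta[j] = np.linalg.norm(w)
                  if beta[j] < 1e-12:             # Krylov space exhausted early
                      m = j + 1
                      break
                  V[:, j + 1] = w / beta[j]
          T = np.diag(alpha[:m]) + np.diag(beta[:m - 1], 1) + np.diag(beta[:m - 1], -1)
          small = expm(-1j * dt * T)[:, 0]        # evolve the first Krylov basis vector exactly
          return np.linalg.norm(psi) * (V[:, :m] @ small)

      # Toy example: a random sparse Hermitian "Hamiltonian" on 2**12 states (assumed size).
      n = 2 ** 12
      A = sparse_random(n, n, density=1e-3, random_state=1)
      H = ((A + A.T) * 0.5).tocsr().astype(complex)
      psi0 = np.zeros(n, dtype=complex); psi0[0] = 1.0
      psi_t = krylov_expm(H, psi0, dt=0.1)
      print("norm after evolution (should stay ~1):", np.linalg.norm(psi_t))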

  14. From cosmos to connectomes: the evolution of data-intensive science.

    PubMed

    Burns, Randal; Vogelstein, Joshua T; Szalay, Alexander S

    2014-09-17

    The analysis of data requires computation: originally by hand and more recently by computers. Different models of computing are designed and optimized for different kinds of data. In data-intensive science, the scale and complexity of data exceeds the comfort zone of local data stores on scientific workstations. Thus, cloud computing emerges as the preeminent model, utilizing data centers and high-performance clusters, enabling remote users to access and query subsets of the data efficiently. We examine how data-intensive computational systems originally built for cosmology, the Sloan Digital Sky Survey (SDSS), are now being used in connectomics, at the Open Connectome Project. We list lessons learned and outline the top challenges we expect to face. Success in computational connectomics would drastically reduce the time between idea and discovery, as SDSS did in cosmology. Copyright © 2014 Elsevier Inc. All rights reserved.

  15. Consolidating WLCG topology and configuration in the Computing Resource Information Catalogue

    DOE PAGES

    Alandes, Maria; Andreeva, Julia; Anisenkov, Alexey; ...

    2017-10-01

    Here, the Worldwide LHC Computing Grid infrastructure links about 200 participating computing centres affiliated with several partner projects. It is built by integrating heterogeneous computer and storage resources in diverse data centres all over the world and provides CPU and storage capacity to the LHC experiments to perform data processing and physics analysis. In order to be used by the experiments, these distributed resources should be well described, which implies easy service discovery and detailed description of service configuration. Currently this information is scattered over multiple generic information sources like GOCDB, OIM, BDII and experiment-specific information systems. Such a model does not allow topology and configuration information to be validated easily. Moreover, information in various sources is not always consistent. Finally, the evolution of computing technologies introduces new challenges. Experiments are more and more relying on opportunistic resources, which by their nature are more dynamic and should also be well described in the WLCG information system. This contribution describes the new WLCG configuration service CRIC (Computing Resource Information Catalogue) which collects information from various information providers, performs validation and provides a consistent set of UIs and APIs to the LHC VOs for service discovery and usage configuration. The main requirements for CRIC are simplicity, agility and robustness. CRIC should be able to be quickly adapted to new types of computing resources, new information sources, and allow for new data structures to be implemented easily following the evolution of the computing models and operations of the experiments.

  16. Consolidating WLCG topology and configuration in the Computing Resource Information Catalogue

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Alandes, Maria; Andreeva, Julia; Anisenkov, Alexey

    Here, the Worldwide LHC Computing Grid infrastructure links about 200 participating computing centres affiliated with several partner projects. It is built by integrating heterogeneous computer and storage resources in diverse data centres all over the world and provides CPU and storage capacity to the LHC experiments to perform data processing and physics analysis. In order to be used by the experiments, these distributed resources should be well described, which implies easy service discovery and detailed description of service configuration. Currently this information is scattered over multiple generic information sources like GOCDB, OIM, BDII and experiment-specific information systems. Such a model does not allow topology and configuration information to be validated easily. Moreover, information in various sources is not always consistent. Finally, the evolution of computing technologies introduces new challenges. Experiments are more and more relying on opportunistic resources, which by their nature are more dynamic and should also be well described in the WLCG information system. This contribution describes the new WLCG configuration service CRIC (Computing Resource Information Catalogue) which collects information from various information providers, performs validation and provides a consistent set of UIs and APIs to the LHC VOs for service discovery and usage configuration. The main requirements for CRIC are simplicity, agility and robustness. CRIC should be able to be quickly adapted to new types of computing resources, new information sources, and allow for new data structures to be implemented easily following the evolution of the computing models and operations of the experiments.

  17. Consolidating WLCG topology and configuration in the Computing Resource Information Catalogue

    NASA Astrophysics Data System (ADS)

    Alandes, Maria; Andreeva, Julia; Anisenkov, Alexey; Bagliesi, Giuseppe; Belforte, Stephano; Campana, Simone; Dimou, Maria; Flix, Jose; Forti, Alessandra; di Girolamo, A.; Karavakis, Edward; Lammel, Stephan; Litmaath, Maarten; Sciaba, Andrea; Valassi, Andrea

    2017-10-01

    The Worldwide LHC Computing Grid infrastructure links about 200 participating computing centres affiliated with several partner projects. It is built by integrating heterogeneous computer and storage resources in diverse data centres all over the world and provides CPU and storage capacity to the LHC experiments to perform data processing and physics analysis. In order to be used by the experiments, these distributed resources should be well described, which implies easy service discovery and detailed description of service configuration. Currently this information is scattered over multiple generic information sources like GOCDB, OIM, BDII and experiment-specific information systems. Such a model does not allow topology and configuration information to be validated easily. Moreover, information in various sources is not always consistent. Finally, the evolution of computing technologies introduces new challenges. Experiments are more and more relying on opportunistic resources, which by their nature are more dynamic and should also be well described in the WLCG information system. This contribution describes the new WLCG configuration service CRIC (Computing Resource Information Catalogue) which collects information from various information providers, performs validation and provides a consistent set of UIs and APIs to the LHC VOs for service discovery and usage configuration. The main requirements for CRIC are simplicity, agility and robustness. CRIC should be able to be quickly adapted to new types of computing resources, new information sources, and allow for new data structures to be implemented easily following the evolution of the computing models and operations of the experiments.

  18. Computer Interactives for the Mars Atmospheric and Volatile Evolution (MAVEN) Mission through NASA's "Project Spectra!"

    NASA Astrophysics Data System (ADS)

    Wood, E. L.

    2014-12-01

    "Project Spectra!" is a standards-based E-M spectrum and engineering program that includes paper and pencil activities as well as Flash-based computer games that help students solidify understanding of high-level planetary and solar physics. Using computer interactive games, students experience and manipulate information making abstract concepts accessible, solidifying understanding and enhancing retention of knowledge. Since students can choose what to watch and explore, the interactives accommodate a broad range of learning styles. Students can go back and forth through the interactives if they've missed a concept or wish to view something again. In the end, students are asked critical thinking questions and conduct web-based research. As part of the Mars Atmospheric and Volatile EvolutioN (MAVEN) mission education programming, we've developed two new interactives. The MAVEN mission will study volatiles in the upper atmosphere to help piece together Mars' climate history. In the first interactive, students explore black body radiation, albedo, and a simplified greenhouse effect to establish what factors contribute to overall planetary temperature. Students design a planet that is able to maintain liquid water on the surface. In the second interactive, students are asked to consider conditions needed for Mars to support water on the surface, keeping some variables fixed. Ideally, students will walk away with the very basic and critical elements required for climate studies, which has far-reaching implications beyond the study of Mars. These interactives were pilot tested at Arvada High School in Colorado.

  19. Efficient coarse simulation of a growing avascular tumor

    PubMed Central

    Kavousanakis, Michail E.; Liu, Ping; Boudouvis, Andreas G.; Lowengrub, John; Kevrekidis, Ioannis G.

    2013-01-01

    The subject of this work is the development and implementation of algorithms which accelerate the simulation of early stage tumor growth models. Among the different computational approaches used for the simulation of tumor progression, discrete stochastic models (e.g., cellular automata) have been widely used to describe processes occurring at the cell and subcell scales (e.g., cell-cell interactions and signaling processes). To describe macroscopic characteristics (e.g., morphology) of growing tumors, large numbers of interacting cells must be simulated. However, the high computational demands of stochastic models make the simulation of large-scale systems impractical. Alternatively, continuum models, which can describe behavior at the tumor scale, often rely on phenomenological assumptions in place of rigorous upscaling of microscopic models. This limits their predictive power. In this work, we circumvent the derivation of closed macroscopic equations for the growing cancer cell populations; instead, we construct, based on the so-called “equation-free” framework, a computational superstructure, which wraps around the individual-based cell-level simulator and accelerates the computations required for the study of the long-time behavior of systems involving many interacting cells. The microscopic model, e.g., a cellular automaton, which simulates the evolution of cancer cell populations, is executed for relatively short time intervals, at the end of which coarse-scale information is obtained. These coarse variables evolve on slower time scales than each individual cell in the population, enabling the application of forward projection schemes, which extrapolate their values at later times. This technique is referred to as coarse projective integration. Increasing the ratio of projection times to microscopic simulator execution times enhances the computational savings. Crucial accuracy issues arising for growing tumors with radial symmetry are addressed by applying the coarse projective integration scheme in a cotraveling (cogrowing) frame. As a proof of principle, we demonstrate that the application of this scheme yields highly accurate solutions, while preserving the computational savings of coarse projective integration. PMID:22587128
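
    The coarse projective integration loop described above can be sketched in a few lines of Python; the toy birth-death micro-simulator, burst lengths and projection horizon below are illustrative assumptions, not the authors' tumor model.

      import numpy as np

      # Coarse projective integration sketch: run a stochastic micro-simulator for short bursts,
      # estimate the time derivative of a coarse variable, then extrapolate ("project") forward.
      rng = np.random.default_rng(2)

      def micro_step(cells, birth=0.02, death=0.01, dt=1.0):
          """Toy individual-based step: each cell divides or dies with small probabilities."""
          births = rng.binomial(cells, birth * dt)
          deaths = rng.binomial(cells, death * dt)
          return cells + births - deaths

      def coarse_projective_integration(n0, burst_steps=20, project_steps=80, n_bursts=30):
          t, n = 0.0, n0
          history = [(t, n)]
          for _ in range(n_bursts):
              # 1) short burst of microscopic simulation
              traj = [n]
              for _ in range(burst_steps):
                  n = micro_step(n)
                  traj.append(n)
              t += burst_steps
              # 2) estimate the coarse time derivative from the burst (least-squares slope)
              slope = np.polyfit(np.arange(len(traj)), traj, 1)[0]
              # 3) projective (forward Euler) extrapolation over a much longer horizon
              n = max(int(round(n + slope * project_steps)), 0)
              t += project_steps
              history.append((t, n))
          return history

      for t, n in coarse_projective_integration(1000)[::10]:
          print(f"t = {t:6.0f}   population ~ {n}")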

  20. The space program's impact on society

    NASA Astrophysics Data System (ADS)

    Toffler, Alvin

    In terms of human evolution, when viewed from 500 or 1000 years from now, today's primitive, still faltering steps beyond the Earth will be recognized as the most important human project of our era, matched only by what is going on in computers and biology. In this paper the social effects of space activity are addressed at three different levels: key social institutions, key social groups, and key social processes.

  1. Configuration Tool for the Trusted Computing Exemplar Project

    DTIC Science & Technology

    2009-12-01

    languages were examined: Microsoft .NET [8], Apple Cocoa (Objective-C) [9], wxPython [10], and Java [11]. Since every language has its pros and...languages using the criteria described above. Based on the developer’s limited experience and knowledge of Microsoft .NET and Apple Cocoa (Objective...became a tabbed panel within a separate window panel. Figure 9 depicts this evolution of the conceptual design. In Figure 9, the table column

  2. Research in Computational Astrobiology

    NASA Technical Reports Server (NTRS)

    Chaban, Galina; Jaffe, Richard; Liang, Shoudan; New, Michael H.; Pohorille, Andrew; Wilson, Michael A.

    2002-01-01

    We present results from several projects in the new field of computational astrobiology, which is devoted to advancing our understanding of the origin, evolution and distribution of life in the Universe using theoretical and computational tools. We have developed a procedure for calculating long-range effects in molecular dynamics using a plane wave expansion of the electrostatic potential. This method is expected to be highly efficient for simulating biological systems on massively parallel supercomputers. We have performed genomics analysis on a family of actin binding proteins. We have performed quantum mechanical calculations on carbon nanotubes and nucleic acids; these simulations will allow us to investigate possible sources of organic material on the early Earth. Finally, we have developed a model of protobiological chemistry using neural networks.

  3. ABrIL - Advanced Brain Imaging Lab : a cloud based computation environment for cooperative neuroimaging projects.

    PubMed

    Neves Tafula, Sérgio M; Moreira da Silva, Nádia; Rozanski, Verena E; Silva Cunha, João Paulo

    2014-01-01

    Neuroscience is an increasingly multidisciplinary and highly cooperative field where neuroimaging plays an important role. The rapid evolution of neuroimaging demands a growing number of computing resources and skills that need to be put in place at every lab. Typically each group tries to set up their own servers and workstations to support their neuroimaging needs, having to learn everything from operating system management to the details of specific neuroscience software tools before any results can be obtained from each setup. This setup and learning process is replicated in every lab, even if a strong collaboration among several groups is going on. In this paper we present a new cloud service model - Brain Imaging Application as a Service (BiAaaS) - and one of its implementations - the Advanced Brain Imaging Lab (ABrIL) - in the form of a ubiquitous virtual desktop remote infrastructure that offers a set of neuroimaging computational services in an interactive, neuroscientist-friendly graphical user interface (GUI). This remote desktop has been used for several multi-institution cooperative projects with different neuroscience objectives that have already achieved important results, such as the contribution to a high-impact paper published in the January issue of the NeuroImage journal. The ABrIL system has shown its applicability in several neuroscience projects at a relatively low cost, promoting truly collaborative actions and speeding up project results and their clinical applicability.

  4. Design, Results, Evolution and Status of the ATLAS Simulation at Point1 Project

    NASA Astrophysics Data System (ADS)

    Ballestrero, S.; Batraneanu, S. M.; Brasolin, F.; Contescu, C.; Fazio, D.; Di Girolamo, A.; Lee, C. J.; Pozo Astigarraga, M. E.; Scannicchio, D. A.; Sedov, A.; Twomey, M. S.; Wang, F.; Zaytsev, A.

    2015-12-01

    During the LHC Long Shutdown 1 (LS1) period, which started in 2013, the Simulation at Point1 (Sim@P1) project takes advantage, in an opportunistic way, of the TDAQ (Trigger and Data Acquisition) HLT (High-Level Trigger) farm of the ATLAS experiment. This farm provides more than 1300 compute nodes, which are particularly suited for running event generation and Monte Carlo production jobs that are mostly CPU and not I/O bound. It is capable of running up to 2700 Virtual Machines (VMs) each with 8 CPU cores, for a total of up to 22000 parallel jobs. This contribution gives a review of the design, the results, and the evolution of the Sim@P1 project, operating a large scale OpenStack based virtualized platform deployed on top of the ATLAS TDAQ HLT farm computing resources. During LS1, Sim@P1 was one of the most productive ATLAS sites: it delivered more than 33 million CPU-hours and it generated more than 1.1 billion Monte Carlo events. The design aspects are presented: the virtualization platform exploited by Sim@P1 avoids interferences with TDAQ operations and it guarantees the security and the usability of the ATLAS private network. The cloud mechanism allows the separation of the needed support on both infrastructural (hardware, virtualization layer) and logical (Grid site support) levels. This paper focuses on the operational aspects of such a large system during the upcoming LHC Run 2 period: simple, reliable, and efficient tools are needed to quickly switch from Sim@P1 to TDAQ mode and back, to exploit the resources when they are not used for the data acquisition, even for short periods. The evolution of the central OpenStack infrastructure is described, as it was upgraded from Folsom to the Icehouse release, including the scalability issues addressed.

  5. Statistical Emulation of Climate Model Projections Based on Precomputed GCM Runs*

    DOE PAGES

    Castruccio, Stefano; McInerney, David J.; Stein, Michael L.; ...

    2014-02-24

    The authors describe a new approach for emulating the output of a fully coupled climate model under arbitrary forcing scenarios that is based on a small set of precomputed runs from the model. Temperature and precipitation are expressed as simple functions of the past trajectory of atmospheric CO2 concentrations, and a statistical model is fit using a limited set of training runs. The approach is demonstrated to be a useful and computationally efficient alternative to pattern scaling and captures the nonlinear evolution of spatial patterns of climate anomalies inherent in transient climates. The approach does as well as pattern scaling in all circumstances and substantially better in many; it is not computationally demanding; and, once the statistical model is fit, it produces emulated climate output effectively instantaneously. In conclusion, it may therefore find wide application in climate impacts assessments and other policy analyses requiring rapid climate projections.
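
    A minimal sketch of this kind of emulation, assuming a synthetic "GCM" response and a simple distributed-lag regression on log CO2 (not the authors' statistical model), is given below.

      import numpy as np

      # Emulator sketch: fit temperature anomaly as a simple function of the past CO2 trajectory
      # (a distributed-lag regression), then emulate a new forcing scenario instantly.
      rng = np.random.default_rng(3)
      years = np.arange(2000, 2101)
      n_lags = 30                                   # how many past years of CO2 enter the regression (assumed)

      def lag_matrix(co2, n_lags):
          """Design matrix whose columns are log-CO2 at lags 0..n_lags-1 (plus an intercept)."""
          x = np.log(co2)
          cols = [np.concatenate([np.full(k, x[0]), x[: len(x) - k]]) for k in range(n_lags)]
          return np.column_stack([np.ones_like(x)] + cols)

      def synthetic_gcm(co2):
          """Stand-in 'training run': lagged logarithmic CO2 forcing response plus noise."""
          forcing = 5.35 * np.log(co2 / 280.0)
          temp = np.convolve(forcing, np.exp(-np.arange(40) / 15.0), mode="full")[: len(co2)] / 15.0
          return temp + 0.05 * rng.standard_normal(len(co2))

      scenarios = [280 * 1.01 ** np.arange(len(years)),            # ~1%/yr CO2 growth
                   280 * (1 + 0.004 * np.arange(len(years)))]      # slower linear growth
      X = np.vstack([lag_matrix(c, n_lags) for c in scenarios])
      y = np.concatenate([synthetic_gcm(c) for c in scenarios])
      beta, *_ = np.linalg.lstsq(X, y, rcond=None)

      # Emulate an unseen scenario without rerunning the "GCM".
      new_co2 = 280 * 1.007 ** np.arange(len(years))
      emulated = lag_matrix(new_co2, n_lags) @ beta
      print("emulated warming by 2100: %.2f (arbitrary units)" % (emulated[-1] - emulated[0]))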

  6. OBSERVING LYAPUNOV EXPONENTS OF INFINITE-DIMENSIONAL DYNAMICAL SYSTEMS

    PubMed Central

    OTT, WILLIAM; RIVAS, MAURICIO A.; WEST, JAMES

    2016-01-01

    Can Lyapunov exponents of infinite-dimensional dynamical systems be observed by projecting the dynamics into ℝ^N using a ‘typical’ nonlinear projection map? We answer this question affirmatively by developing embedding theorems for compact invariant sets associated with C^1 maps on Hilbert spaces. Examples of such discrete-time dynamical systems include time-T maps and Poincaré return maps generated by the solution semigroups of evolution partial differential equations. We make every effort to place hypotheses on the projected dynamics rather than on the underlying infinite-dimensional dynamical system. In so doing, we adopt an empirical approach and formulate checkable conditions under which a Lyapunov exponent computed from experimental data will be a Lyapunov exponent of the infinite-dimensional dynamical system under study (provided the nonlinear projection map producing the data is typical in the sense of prevalence). PMID:28066028

  7. OBSERVING LYAPUNOV EXPONENTS OF INFINITE-DIMENSIONAL DYNAMICAL SYSTEMS.

    PubMed

    Ott, William; Rivas, Mauricio A; West, James

    2015-12-01

    Can Lyapunov exponents of infinite-dimensional dynamical systems be observed by projecting the dynamics into ℝ^N using a 'typical' nonlinear projection map? We answer this question affirmatively by developing embedding theorems for compact invariant sets associated with C^1 maps on Hilbert spaces. Examples of such discrete-time dynamical systems include time-T maps and Poincaré return maps generated by the solution semigroups of evolution partial differential equations. We make every effort to place hypotheses on the projected dynamics rather than on the underlying infinite-dimensional dynamical system. In so doing, we adopt an empirical approach and formulate checkable conditions under which a Lyapunov exponent computed from experimental data will be a Lyapunov exponent of the infinite-dimensional dynamical system under study (provided the nonlinear projection map producing the data is typical in the sense of prevalence).
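
    A rough sketch of estimating a largest Lyapunov exponent from scalar data via delay embedding and neighbour divergence (in the spirit of, but far simpler than, the setting of this paper) is given below; the logistic-map observable, embedding parameters and fitting horizon are illustrative assumptions.

      import numpy as np

      # Simplified Rosenstein-type estimate: delay-embed a scalar series and track how fast
      # initially close neighbours separate. For the logistic map at r = 4 the largest
      # exponent is ln 2 ~ 0.693 per step, so the estimated slope should be roughly that.
      rng = np.random.default_rng(4)

      N = 2000
      x = np.empty(N); x[0] = rng.uniform(0.1, 0.9)
      for i in range(N - 1):
          x[i + 1] = 4.0 * x[i] * (1.0 - x[i])

      m, tau = 2, 1                                   # embedding dimension and delay (assumed)
      M = N - (m - 1) * tau
      Y = np.column_stack([x[i * tau: i * tau + M] for i in range(m)])

      # Nearest neighbour of each point, excluding temporally close points.
      horizon, window = 5, 10
      D = np.linalg.norm(Y[:, None, :] - Y[None, :, :], axis=-1)
      np.fill_diagonal(D, np.inf)
      for i in range(M):
          lo, hi = max(0, i - window), min(M, i + window + 1)
          D[i, lo:hi] = np.inf
      nn = np.argmin(D[: M - horizon, : M - horizon], axis=1)

      # Mean log-divergence of the neighbour pairs over a few steps; its slope estimates lambda_max.
      logdiv = np.zeros(horizon + 1)
      idx = np.arange(M - horizon)
      for k in range(horizon + 1):
          d = np.linalg.norm(Y[idx + k] - Y[nn + k], axis=1)
          logdiv[k] = np.mean(np.log(d + 1e-12))
      slope = np.polyfit(np.arange(horizon + 1), logdiv, 1)[0]
      print("estimated largest Lyapunov exponent per step:", round(slope, 3))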

  8. Symbolic Computation Using Cellular Automata-Based Hyperdimensional Computing.

    PubMed

    Yilmaz, Ozgur

    2015-12-01

    This letter introduces a novel framework of reservoir computing that is capable of both connectionist machine intelligence and symbolic computation. A cellular automaton is used as the reservoir of dynamical systems. Input is randomly projected onto the initial conditions of automaton cells, and nonlinear computation is performed on the input via application of a rule in the automaton for a period of time. The evolution of the automaton creates a space-time volume of the automaton state space, and it is used as the reservoir. The proposed framework is shown to be capable of long-term memory, and it requires orders of magnitude less computation compared to echo state networks. As the focus of the letter, we suggest that binary reservoir feature vectors can be combined using Boolean operations as in hyperdimensional computing, paving a direct way for concept building and symbolic processing. To demonstrate the capability of the proposed system, we make analogies directly on image data by asking, What is the automobile of air?
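
    The pipeline described above (random projection of the input onto cellular-automaton cells, iteration of an elementary rule, and XOR binding of the resulting binary feature vectors) can be sketched as follows; the rule number, sizes and toy binding task are assumptions for illustration, not the paper's experiments.

      import numpy as np

      rng = np.random.default_rng(5)
      RULE = 90                                     # elementary CA rule (assumed choice)
      WIDTH, STEPS, IN_BITS = 256, 16, 32
      rule_bits = np.array([(RULE >> i) & 1 for i in range(8)], dtype=np.uint8)
      PROJ = rng.permutation(WIDTH)[:IN_BITS]       # fixed random projection of input bits onto cells

      def ca_reservoir(x):
          """Project a binary input onto the CA initial state and return the space-time volume."""
          state = np.zeros(WIDTH, dtype=np.uint8)
          state[PROJ] = x
          volume = [state]
          for _ in range(STEPS):
              left, right = np.roll(state, 1), np.roll(state, -1)
              state = rule_bits[4 * left + 2 * state + right]   # rule lookup on each 3-cell neighbourhood
              volume.append(state)
          return np.concatenate(volume)             # flattened space-time volume = binary feature vector

      a = rng.integers(0, 2, IN_BITS, dtype=np.uint8)
      b = rng.integers(0, 2, IN_BITS, dtype=np.uint8)
      fa, fb = ca_reservoir(a), ca_reservoir(b)

      bound = np.bitwise_xor(fa, fb)                # hyperdimensional-style binding of two feature vectors
      recovered = np.bitwise_xor(bound, fa)         # XOR binding is its own inverse
      print("recovered fb exactly:", np.array_equal(recovered, fb))
      print("feature vector length:", fa.size)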

  9. Achieving reliability - The evolution of redundancy in American manned spacecraft computers

    NASA Technical Reports Server (NTRS)

    Tomayko, J. E.

    1985-01-01

    The Shuttle is the first launch system deployed by NASA with full redundancy in the on-board computer systems. Fault-tolerance, i.e., restoring to a backup with fewer capabilities, was the method selected for Apollo. The Gemini capsule was the first to carry a computer, which also served as backup for Titan launch vehicle guidance. Failure of the Gemini computer resulted in manual control of the spacecraft. The Apollo system served vehicle flight control and navigation functions. The redundant computer on Skylab provided attitude control only in support of solar telescope pointing. The STS digital, fly-by-wire avionics system requires 100 percent reliability. The Orbiter carries five general purpose computers, four being fully redundant and the fifth being solely an ascent-descent tool. The computers are synchronized at input and output points at a rate of about six times a second. The system is projected to cause a loss of an Orbiter only four times in a billion flights.

  10. Configuration Tool Prototype for the Trusted Computing Exemplar Project

    DTIC Science & Technology

    2009-12-01

    languages were examined: Microsoft .NET [8], Apple Cocoa (Objective-C) [9], wxPython [10], and Java [11]. Since every language has its pros and...languages using the criteria described above. Based on the developer’s limited experience and knowledge of Microsoft .NET and Apple Cocoa (Objective...a tabbed panel within a separate window panel. Figure 9 depicts this evolution of the conceptual design. In Figure 9, the table column headers are

  11. Nonlinear simulations with and computational issues for NIMROD

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sovinec, C R

    The NIMROD (Non-Ideal Magnetohydrodynamics with Rotation, Open Discussion) code development project was commissioned by the US Department of Energy in February, 1996 to provide the fusion research community with a computational tool for studying low-frequency behavior in experiments. Specific problems of interest include the neoclassical evolution of magnetic islands and the nonlinear behavior of tearing modes in the presence of rotation and nonideal walls in tokamaks; they also include topics relevant to innovative confinement concepts such as magnetic turbulence. Besides having physics models appropriate for these phenomena, an additional requirement is the ability to perform the computations in realistic geometries. The NIMROD Team is using contemporary management and computational methods to develop a computational tool for investigating low-frequency behavior in plasma fusion experiments. The authors intend to make the code freely available, and are taking steps to make it as easy to learn and use as possible. An example application for NIMROD is the nonlinear toroidal RFP simulation--the first in a series to investigate how toroidal geometry affects MHD activity in RFPs. Finally, the most important issue facing the project is execution time, and they are exploring better matrix solvers and a better parallel decomposition to address this.

  12. Computational Morphometry for Detecting Changes in Brain Structure Due to Development, Aging, Learning, Disease and Evolution

    PubMed Central

    Mietchen, Daniel; Gaser, Christian

    2009-01-01

    The brain, like any living tissue, is constantly changing in response to genetic and environmental cues and their interaction, leading to changes in brain function and structure, many of which are now in reach of neuroimaging techniques. Computational morphometry on the basis of Magnetic Resonance (MR) images has become the method of choice for studying macroscopic changes of brain structure across time scales. Thanks to computational advances and sophisticated study designs, both the minimal extent of change necessary for detection and, consequently, the minimal periods over which such changes can be detected have been reduced considerably during the last few years. On the other hand, the growing availability of MR images of more and more diverse brain populations also allows more detailed inferences about brain changes that occur over larger time scales, way beyond the duration of an average research project. On this basis, a whole range of issues concerning the structures and functions of the brain are now becoming addressable, thereby providing ample challenges and opportunities for further contributions from neuroinformatics to our understanding of the brain and how it changes over a lifetime and in the course of evolution. PMID:19707517

  13. Quantum population and entanglement evolution in photosynthetic process

    NASA Astrophysics Data System (ADS)

    Zhu, Jing

    Applications of the concepts of quantum information theory are usually related to the powerful and counter-intuitive quantum mechanical effects of superposition, interference and entanglement. In this thesis, I examine the role of coherence and entanglement in complex chemical systems. The research has focused mainly on two related projects: the first develops a theoretical model to explain recent ultrafast experiments on excitonic migration in photosynthetic complexes that show long-lived coherence on the order of hundreds of femtoseconds, and the second develops the Grover algorithm for global optimization of complex systems. The first part can be divided into two sections. The first section investigates the theoretical framework for the transfer of electronic excitation energy through the Fenna-Matthews-Olson (FMO) pigment-protein complex. The newly developed modified scaled hierarchical equation of motion (HEOM) approach is employed for simulating the open quantum system. The second section investigates the evolution of entanglement in the FMO complex based on the simulation results from the scaled HEOM approach. We examine the role of multipartite entanglement in the FMO complex by direct computation of the convex roof optimization for a number of different measures, including pairwise, triplet, quadruple and quintuple site entanglement. Our results support the hypothesis that multipartite entanglement is maximal primarily along the two distinct electronic energy transfer pathways. The second part of this thesis can also be separated into two sections. The first section demonstrates that a modified Grover's quantum algorithm can be applied to real problems of finding a global minimum using modest numbers of quantum bits. Calculations of the global minimum of simple test functions and Lennard-Jones clusters have been carried out on a quantum computer simulator using a modified Grover's algorithm. The second section implements the basic quantum logic gates on arrays of trapped ultracold polar molecules as qubits for a quantum computer. Multi-Target Optimal Control Theory (MTOCT) is utilized as a means of manipulating the initial-to-target transition probability via an external laser field. The detailed calculation is applied to the SrO molecule, an ideal candidate for proposed quantum computers using arrays of trapped ultracold polar molecules.

  14. Computational and Experimental Studies of Microstructure-Scale Porosity in Metallic Fuels for Improved Gas Swelling Behavior

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Millett, Paul; McDeavitt, Sean; Deo, Chaitanya

    This proposal will investigate the stability of bimodal pore size distributions in metallic uranium and uranium-zirconium alloys during sintering and re-sintering annealing treatments. The project will utilize both computational and experimental approaches. The computational approach includes both Molecular Dynamics simulations to determine the self-diffusion coefficients in pure U and U-Zr alloys in single crystals, grain boundaries, and free surfaces, as well as calculations of grain boundary and free surface interfacial energies. Phase-field simulations using MOOSE will be conducted to study pore and grain structure evolution in microstructures with bimodal pore size distributions. Experiments will also be performed to validate the simulations, and measure the time-dependent densification of bimodal porous compacts.

  15. Evolution of the intelligent telecommunications network

    NASA Astrophysics Data System (ADS)

    Mayo, J. S.

    1982-02-01

    The development of the U.S. telecommunications network is described and traced from the invention of the telephone by Bell in 1876 to the use of integrated circuits and the UNIX system for interactive computers. The dialing system was introduced in the 19th century, and amplifiers were invented to permit coast to coast communication by 1914. Hierarchical switching was installed in the 1930s, along with telephoto and teletype services. PCM was invented in the 1930s, but was limited to military applications until the transistorized computer was fabricated in 1958, which coincided with spaceflight and the Telstar satellite in 1962. Fiber optics systems with laser pulse transmission are now entering widespread application, following the 1976 introduction of superfast digital switches controlled by a computer and capable of handling 1/2 million calls per hour. Projected advances are in increased teleconferencing, electronic mail, and full computer terminal services.

  16. From cancer genomes to cancer models: bridging the gaps

    PubMed Central

    Baudot, Anaïs; Real, Francisco X.; Izarzugaza, José M. G.; Valencia, Alfonso

    2009-01-01

    Cancer genome projects are now being expanded in an attempt to provide complete landscapes of the mutations that exist in tumours. Although the importance of cataloguing genome variations is well recognized, there are obvious difficulties in bridging the gaps between high-throughput resequencing information and the molecular mechanisms of cancer evolution. Here, we describe the current status of the high-throughput genomic technologies, and the current limitations of the associated computational analysis and experimental validation of cancer genetic variants. We emphasize how the current cancer-evolution models will be influenced by the high-throughput approaches, in particular through efforts devoted to monitoring tumour progression, and how, in turn, the integration of data and models will be translated into mechanistic knowledge and clinical applications. PMID:19305388

  17. Microstructural Quantification, Property Prediction, and Stochastic Reconstruction of Heterogeneous Materials Using Limited X-Ray Tomography Data

    NASA Astrophysics Data System (ADS)

    Li, Hechao

    An accurate knowledge of the complex microstructure of a heterogeneous material is crucial for establishing quantitative structure-property relations and for predicting and optimizing its performance. X-ray tomography has provided a non-destructive means for microstructure characterization in both 3D and 4D (i.e., structural evolution over time). Traditional reconstruction algorithms like the filtered-back-projection (FBP) method or algebraic reconstruction techniques (ART) require a huge number of tomographic projections and a segmentation process before microstructural quantification can be conducted. This can be quite time consuming and computationally intensive. In this thesis, a novel procedure is first presented that allows one to directly extract key structural information in the form of spatial correlation functions from limited x-ray tomography data. The key component of the procedure is the computation of a "probability map", which provides the probability of an arbitrary point in the material system belonging to a specific phase. The correlation functions of interest are then readily computed from the probability map. Using effective medium theory, accurate predictions of physical properties (e.g., elastic moduli) can be obtained. Secondly, a stochastic optimization procedure that enables one to accurately reconstruct material microstructure from a small number of x-ray tomographic projections (e.g., 20 - 40) is presented. Moreover, a stochastic procedure for multi-modal data fusion is proposed, where both X-ray projections and correlation functions computed from limited 2D optical images are fused to accurately reconstruct complex heterogeneous materials in 3D. This multi-modal reconstruction algorithm is shown to integrate the complementary data in an effective optimization procedure, which indicates its high efficiency in using limited structural information. Finally, the accuracy of the stochastic reconstruction procedure using limited X-ray projection data is ascertained by analyzing the microstructural degeneracy and the roughness of the energy landscape associated with different numbers of projections. The ground-state degeneracy of a microstructure is found to decrease with an increasing number of projections, which indicates a higher probability that the reconstructed configurations match the actual microstructure. The roughness of the energy landscape can also provide information about the complexity and convergence behavior of the reconstruction for given microstructures and projection numbers.
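
    The step from a probability map to spatial correlation functions can be illustrated with the short FFT-based sketch below; the synthetic probability map and the periodic-boundary radial averaging are assumptions for illustration, not the thesis' actual data or code.

      import numpy as np

      # Sketch: compute a two-point correlation function S2(r) from a per-pixel "probability map"
      # (probability of belonging to a phase) using an FFT-based autocorrelation.
      rng = np.random.default_rng(6)

      n = 128
      field = rng.standard_normal((n, n))
      # Smooth the noise to create correlated "microstructure", then squash to [0, 1] probabilities.
      k = np.fft.fftfreq(n)
      kx, ky = np.meshgrid(k, k, indexing="ij")
      smooth = np.fft.ifft2(np.fft.fft2(field) * np.exp(-(kx**2 + ky**2) * 400)).real
      prob_map = 1.0 / (1.0 + np.exp(-5.0 * smooth / smooth.std()))

      # Autocorrelation via the Wiener-Khinchin theorem (periodic boundaries assumed).
      F = np.fft.fft2(prob_map)
      auto = np.fft.ifft2(F * np.conj(F)).real / prob_map.size

      # Radially average to obtain S2 as a function of separation r (periodic distances).
      yy, xx = np.meshgrid(np.arange(n), np.arange(n), indexing="ij")
      dy, dx = np.minimum(yy, n - yy), np.minimum(xx, n - xx)
      r = np.sqrt(dx**2 + dy**2).astype(int)
      s2 = np.bincount(r.ravel(), weights=auto.ravel()) / np.bincount(r.ravel())

      print("S2 at r=0 (mean of p^2; the volume fraction for a binary map):", round(s2[0], 3))
      print("S2 at r = 1, 5, 20:", np.round(s2[[1, 5, 20]], 3))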

  18. Simulation of wave interactions with MHD

    NASA Astrophysics Data System (ADS)

    Batchelor, D.; Alba, C.; Bateman, G.; Bernholdt, D.; Berry, L.; Bonoli, P.; Bramley, R.; Breslau, J.; Chance, M.; Chen, J.; Choi, M.; Elwasif, W.; Fu, G.; Harvey, R.; Jaeger, E.; Jardin, S.; Jenkins, T.; Keyes, D.; Klasky, S.; Kruger, S.; Ku, L.; Lynch, V.; McCune, D.; Ramos, J.; Schissel, D.; Schnack, D.; Wright, J.

    2008-07-01

    The broad scientific objectives of the SWIM (Simulation of Wave Interaction with MHD) project are twofold: (1) improve our understanding of interactions that both radio frequency (RF) wave and particle sources have on extended-MHD phenomena, and substantially improve our capability for predicting and optimizing the performance of burning plasmas in devices such as ITER; and (2) develop an integrated computational system for treating multiphysics phenomena with the required flexibility and extensibility to serve as a prototype for the Fusion Simulation Project. The Integrated Plasma Simulator (IPS) has been implemented. Presented here are initial physics results on RF effects on MHD instabilities in tokamaks as well as simulation results for tokamak discharge evolution using the IPS.

  19. Computer Vision in the Temples of Karnak: Past, Present & Future

    NASA Astrophysics Data System (ADS)

    Tournadre, V.; Labarta, C.; Megard, P.; Garric, A.; Saubestre, E.; Durand, B.

    2017-05-01

    CFEETK, the French-Egyptian Center for the Study of the Temples of Karnak, is celebrating this year the 50th anniversary of its foundation. As a multicultural and transdisciplinary research center, it has always been a playground for testing emerging technologies applied to various fields. The rise of automatic computer vision algorithms is an interesting topic, as it allows non-experts to produce high-value results. This article presents the evolution of measurement experiments over the past 50 years, and it describes how cameras are used today. Ultimately, it aims to set the trends of the upcoming projects and it discusses how image processing could contribute further to the study and the conservation of the cultural heritage.

  20. From evolutionary computation to the evolution of things.

    PubMed

    Eiben, Agoston E; Smith, Jim

    2015-05-28

    Evolution has provided a source of inspiration for algorithm designers since the birth of computers. The resulting field, evolutionary computation, has been successful in solving engineering tasks ranging in outlook from the molecular to the astronomical. Today, the field is entering a new phase as evolutionary algorithms that take place in hardware are developed, opening up new avenues towards autonomous machines that can adapt to their environment. We discuss how evolutionary computation compares with natural evolution and what its benefits are relative to other computing approaches, and we introduce the emerging area of artificial evolution in physical systems.

  1. Evolutionary computation in zoology and ecology.

    PubMed

    Boone, Randall B

    2017-12-01

    Evolutionary computational methods have adopted attributes of natural selection and evolution to solve problems in computer science, engineering, and other fields. The method is growing in use in zoology and ecology. Evolutionary principles may be merged with an agent-based modeling perspective to have individual animals or other agents compete. Four main categories are discussed: genetic algorithms, evolutionary programming, genetic programming, and evolutionary strategies. In evolutionary computation, a population is represented in a way that allows for an objective function to be assessed that is relevant to the problem of interest. The poorest performing members are removed from the population, and remaining members reproduce and may be mutated. The fitness of the members is again assessed, and the cycle continues until a stopping condition is met. Case studies include optimizing: egg shape given different clutch sizes, mate selection, migration of wildebeest, birds, and elk, vulture foraging behavior, algal bloom prediction, and species richness given energy constraints. Other case studies simulate the evolution of species and a means to project shifts in species ranges in response to a changing climate that includes competition and phenotypic plasticity. This introduction concludes by citing other uses of evolutionary computation and a review of the flexibility of the methods. For example, representing species' niche spaces subject to selective pressure allows studies on cladistics, the taxon cycle, neutral versus niche paradigms, fundamental versus realized niches, community structure and order of colonization, invasiveness, and responses to a changing climate.
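
    The evolutionary-computation cycle summarized above (assess fitness, remove the poorest members, let survivors reproduce with mutation, repeat until a stopping condition) can be sketched as follows; the toy two-parameter objective is an illustrative assumption, not one of the case studies.

      import numpy as np

      rng = np.random.default_rng(7)

      def fitness(pop):
          # Toy objective with a maximum at (3, -2); stands in for e.g. an egg-shape or foraging score.
          x, y = pop[:, 0], pop[:, 1]
          return -((x - 3.0) ** 2 + (y + 2.0) ** 2)

      pop_size, n_keep, sigma = 40, 20, 0.5
      pop = rng.uniform(-10, 10, size=(pop_size, 2))

      for generation in range(200):
          scores = fitness(pop)
          if scores.max() > -1e-4:                          # stopping condition
              break
          survivors = pop[np.argsort(scores)[-n_keep:]]     # remove the poorest performing members
          parents = survivors[rng.integers(0, n_keep, size=pop_size - n_keep)]
          offspring = parents + sigma * rng.standard_normal((pop_size - n_keep, 2))  # reproduce and mutate
          pop = np.vstack([survivors, offspring])

      best = pop[np.argmax(fitness(pop))]
      print(f"stopped at generation {generation}; best individual ~ {np.round(best, 3)}")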

  2. Evolutionary computation in zoology and ecology

    PubMed Central

    2017-01-01

    Abstract Evolutionary computational methods have adopted attributes of natural selection and evolution to solve problems in computer science, engineering, and other fields. The method is growing in use in zoology and ecology. Evolutionary principles may be merged with an agent-based modeling perspective to have individual animals or other agents compete. Four main categories are discussed: genetic algorithms, evolutionary programming, genetic programming, and evolutionary strategies. In evolutionary computation, a population is represented in a way that allows for an objective function to be assessed that is relevant to the problem of interest. The poorest performing members are removed from the population, and remaining members reproduce and may be mutated. The fitness of the members is again assessed, and the cycle continues until a stopping condition is met. Case studies include optimizing: egg shape given different clutch sizes, mate selection, migration of wildebeest, birds, and elk, vulture foraging behavior, algal bloom prediction, and species richness given energy constraints. Other case studies simulate the evolution of species and a means to project shifts in species ranges in response to a changing climate that includes competition and phenotypic plasticity. This introduction concludes by citing other uses of evolutionary computation and a review of the flexibility of the methods. For example, representing species’ niche spaces subject to selective pressure allows studies on cladistics, the taxon cycle, neutral versus niche paradigms, fundamental versus realized niches, community structure and order of colonization, invasiveness, and responses to a changing climate. PMID:29492029

  3. Computational Design of DNA-Binding Proteins.

    PubMed

    Thyme, Summer; Song, Yifan

    2016-01-01

    Predicting the outcome of engineered and naturally occurring sequence perturbations to protein-DNA interfaces requires accurate computational modeling technologies. It has been well established that computational design to accommodate small numbers of DNA target site substitutions is possible. This chapter details the basic method of design used in the Rosetta macromolecular modeling program that has been successfully used to modulate the specificity of DNA-binding proteins. More recently, combining computational design and directed evolution has become a common approach for increasing the success rate of protein engineering projects. The power of such high-throughput screening depends on computational methods producing multiple potential solutions. Therefore, this chapter describes several protocols for increasing the diversity of designed output. Lastly, we describe an approach for building comparative models of protein-DNA complexes in order to utilize information from homologous sequences. These models can be used to explore how nature modulates specificity of protein-DNA interfaces and potentially can even be used as starting templates for further engineering.

  4. Fiber specklegram sensors sensitivities at high temperatures

    NASA Astrophysics Data System (ADS)

    Rodriguez-Cobo, L.; Lomer, M.; Lopez-Higuera, J. M.

    2015-09-01

    In this work, the sensitivity of Fiber Specklegram Sensors (FSS) to high temperatures (up to 800 °C) has been studied. Two multimode silica fibers were introduced into a tubular furnace while light from a HeNe laser source was launched into one fiber end, projecting speckle patterns onto a commercial webcam. A computer generated different heating and cooling sweeps while the specklegram evolution was recorded. The achieved results exhibit a remarkable linearity in the FSS's sensitivity for temperatures under 800 °C, following the thermal expansion of fused silica.

  5. Effective Strategies for Teaching Evolution: The Primary Evolution Project

    ERIC Educational Resources Information Center

    Hatcher, Chris

    2015-01-01

    When Chris Hatcher joined the Primary Evolution Project team at the University of Reading, his goal was to find effective strategies to teach evolution in a way that keeps children engaged and enthused. Hatcher has collaborated with colleagues at the University's Institute of Education to break the evolution unit down into distinct topics and…

  6. Climatic variability effects on summer cropping systems of the Iberian Peninsula

    NASA Astrophysics Data System (ADS)

    Capa-Morocho, M.; Rodríguez-Fonseca, B.; Ruiz-Ramos, M.

    2012-04-01

    Climate variability and changes in the frequency of extreme events have a direct impact on crop yield and damages. Projections of climate anomalies at monthly and yearly timescales allow us to adapt a cropping system (crops, varieties and management) to take advantage of favorable conditions or to reduce the effect of adverse conditions. The objective of this work is to develop indices to evaluate the effect of climatic variability on summer cropping systems of the Iberian Peninsula, in an attempt to relate yield variability to climate variability, extending the work of Rodríguez-Puebla (2004). This paper analyses the evolution of the yield anomalies of irrigated maize in several representative agricultural locations in Spain with contrasting temperature and precipitation regimes and compares it to the evolution of different patterns of climate variability, extending the methodology of Porter and Semenov (2005). To simulate maize yields, observed daily data of radiation, maximum and minimum temperature and precipitation were used. These data were obtained from the State Meteorological Agency of Spain (AEMET). Time series of simulated maize yields were computed with the CERES-maize model for periods ranging from 22 to 49 years, depending on the observed climate data available for each location. The computed standardized yield anomalies were projected onto different oceanic and atmospheric anomaly fields and the resulting patterns were compared with a set of documented patterns from the National Oceanic and Atmospheric Administration (NOAA). The results can also be useful for climate change impact assessment, providing a scientific basis for the selection of climate change scenarios where combined natural and forced variability represents a hazard for agricultural production. Interpretation of impact projections would also be enhanced.

  7. Adopting best practices: "Agility" moves from software development to healthcare project management.

    PubMed

    Kitzmiller, Rebecca; Hunt, Eleanor; Sproat, Sara Breckenridge

    2006-01-01

    It is time for a change in mindset in how nurses operationalize system implementations and manage projects. Computers and systems have evolved over time from unwieldy mysterious machines of the past to ubiquitous computer use in every aspect of daily lives and work sites. Yet, disconcertingly, the process used to implement these systems has not evolved. Technology implementation does not need to be a struggle. It is time to adapt traditional plan-driven implementation methods to incorporate agile techniques. Agility is a concept borrowed from software development and is presented here because it encourages flexibility, adaptation, and continuous learning as part of the implementation process. Agility values communication and harnesses change to an advantage, which facilitates the natural evolution of an adaptable implementation process. Specific examples of agility in an implementation are described, and plan-driven implementation stages are adapted to incorporate relevant agile techniques. This comparison demonstrates how an agile approach enhances traditional implementation techniques to meet the demands of today's complex healthcare environments.

  8. Phase Field Modeling of Microstructure Development in Microgravity

    NASA Technical Reports Server (NTRS)

    Dantzig, Jonathan A.; Goldenfeld, Nigel

    2001-01-01

    This newly funded project seeks to extend our NASA-sponsored project on modeling of dendritic microstructures to facilitate collaboration between our research group and those of other NASA investigators. In our ongoing program, we have applied advanced computational techniques to study microstructural evolution in dendritic solidification, for both pure isolated dendrites and directionally solidified alloys. This work has enabled us to compute dendritic microstructures using both realistic material parameters and experimentally relevant processing conditions, thus allowing for the first time direct comparison of phase field computations with laboratory observations. This work has been well received by the materials science and physics communities, and has led to several opportunities for collaboration with scientists working on experimental investigations of pattern selection and segregation in solidification. While we have been able to pursue these collaborations to a limited extent, with some important findings, this project focuses specifically on those collaborations. We have two target collaborations: with Prof. Glicksman's group working on the Isothermal Dendritic Growth Experiment (IDGE), and with Prof. Poirier's group studying directional solidification in Pb-Sb alloys. These two space experiments match well with our two thrusts in modeling, one for pure materials, as in the IDGE, and the other directional solidification. Such collaboration will benefit all of the research groups involved, and will provide for rapid dissemination of the results of our work where it will have significant impact.

  9. The Seismic Tool-Kit (STK): an open source software for seismology and signal processing.

    NASA Astrophysics Data System (ADS)

    Reymond, Dominique

    2016-04-01

    We present an open source software project (GNU public license), named STK: Seismic ToolKit, that is dedicated mainly to seismology and signal processing. The STK project, which started in 2007, is hosted by SourceForge.net and counts more than 19,500 downloads at the date of writing. The STK project is composed of two main branches. First, a graphical interface dedicated to signal processing in the SAC format (SAC_ASCII and SAC_BIN), where the signal can be plotted, zoomed, filtered, integrated, differentiated, etc. (a large variety of IIR and FIR filters is proposed). The estimation of the spectral density of the signal is performed via the Fourier transform, with visualization of the Power Spectral Density (PSD) in linear or log scale, and also of the evolutive time-frequency representation (or sonogram). The 3-component signals can also be processed to estimate their polarization properties, either for a given window or for evolutive windows along the time axis. This polarization analysis is useful for extracting polarized noise and for differentiating P waves, Rayleigh waves, Love waves, etc. Secondly, a panel of utility programs is proposed for working in terminal mode, with basic programs for computing azimuth and distance in spherical geometry, inter/auto-correlation, spectral density, time-frequency analysis for an entire directory of signals, focal planes and main component axes, the radiation pattern of P waves, polarization analysis of different waves (including noise), under/over-sampling of the signals, cubic-spline smoothing, and linear/non-linear regression analysis of data sets. A MINimum library of Linear AlGebra (MIN-LINAG) is also provided for computing the main matrix operations, such as QR/QL decomposition, Cholesky solution of linear systems, finding eigenvalues/eigenvectors, and QR-solve/eigen-solve of linear equation systems. STK is developed in C/C++, mainly under Linux OS, and it has also been partially implemented under MS-Windows. Useful links: http://sourceforge.net/projects/seismic-toolkit/ http://sourceforge.net/p/seismic-toolkit/wiki/browse_pages/
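
    As a small illustration of the kind of spectral-density estimate such a toolkit exposes, the generic Python/scipy sketch below computes a Welch PSD of a synthetic trace; it is not STK code, and the sampling rate and signal content are assumptions.

      import numpy as np
      from scipy.signal import welch

      fs = 100.0                                    # sampling rate in Hz (assumed)
      t = np.arange(0, 600, 1 / fs)                 # ten minutes of synthetic "seismic" signal
      rng = np.random.default_rng(8)
      trace = (np.sin(2 * np.pi * 0.2 * t)          # a 5 s microseism-like component
               + 0.5 * np.sin(2 * np.pi * 1.5 * t)  # a higher-frequency arrival
               + 0.3 * rng.standard_normal(t.size)) # background noise

      freqs, psd = welch(trace, fs=fs, nperseg=4096)
      for f in (0.2, 1.5):
          i = np.argmin(np.abs(freqs - f))
          print(f"PSD near {f} Hz: {psd[i]:.3f} (units^2/Hz)")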

  10. Cloud computing task scheduling strategy based on improved differential evolution algorithm

    NASA Astrophysics Data System (ADS)

    Ge, Junwei; He, Qian; Fang, Yiqiu

    2017-04-01

    In order to optimize the cloud computing task scheduling scheme, an improved differential evolution algorithm for cloud computing task scheduling is proposed. First, a cloud computing task scheduling model is built and a fitness function is defined from it; the fitness function is then optimized with the improved differential evolution algorithm, which uses a selection strategy that adapts dynamically with the evolution generation together with a dynamic mutation strategy to ensure both global and local search ability. Performance tests were carried out on the CloudSim simulation platform, and the experimental results show that the improved differential evolution algorithm reduces task execution time and user cost, achieving a good implementation of optimal scheduling of cloud computing tasks.
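
    The abstract does not spell out the exact fitness function or the dynamic mutation/selection rules, so the sketch below only illustrates the generic differential evolution loop (DE/rand/1/bin) applied to a toy task-to-VM assignment problem; the makespan fitness and all parameter values are assumptions, not the authors' improved algorithm:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    task_len = rng.uniform(5, 50, size=20)      # toy task lengths (arbitrary units)
    n_vms, pop_size, F, CR, gens = 4, 30, 0.6, 0.9, 200

    def makespan(x):
        # Map a continuous vector to VM indices and return the latest finish time.
        assign = np.clip(np.floor(x), 0, n_vms - 1).astype(int)
        return max(task_len[assign == v].sum() for v in range(n_vms))

    pop = rng.uniform(0, n_vms, size=(pop_size, task_len.size))
    fit = np.array([makespan(p) for p in pop])

    for _ in range(gens):
        for i in range(pop_size):
            r1, r2, r3 = rng.choice([j for j in range(pop_size) if j != i], 3, replace=False)
            mutant = pop[r1] + F * (pop[r2] - pop[r3])       # differential mutation
            cross = rng.random(task_len.size) < CR
            cross[rng.integers(task_len.size)] = True        # keep at least one mutant gene
            trial = np.where(cross, mutant, pop[i])
            f_trial = makespan(trial)
            if f_trial <= fit[i]:                            # greedy selection
                pop[i], fit[i] = trial, f_trial

    print("best makespan:", fit.min())
    ```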

  11. High Performance GPU-Based Fourier Volume Rendering.

    PubMed

    Abdellah, Marwan; Eldeib, Ayman; Sharawi, Amr

    2015-01-01

    Fourier volume rendering (FVR) is a significant visualization technique that has been used widely in digital radiography. As a result of its O(N² log N) time complexity, it provides a faster alternative to spatial domain volume rendering algorithms, which are O(N³) in computational complexity. Relying on the Fourier projection-slice theorem, this technique operates on the spectral representation of a 3D volume instead of processing its spatial representation, generating attenuation-only projections that look like X-ray radiographs. Due to the rapid evolution of its underlying architecture, the graphics processing unit (GPU) has become an attractive, competent platform that can deliver far more raw computational power than the central processing unit (CPU) on a per-dollar basis. The introduction of the compute unified device architecture (CUDA) enables embarrassingly parallel algorithms to run efficiently on CUDA-capable GPU architectures. In this work, a high-performance GPU-accelerated implementation of the FVR pipeline on CUDA-enabled GPUs is presented. By executing the rendering pipeline entirely on recent GPU architectures, the proposed implementation achieves a speed-up of 117x compared to a single-threaded hybrid implementation that uses the CPU and GPU together.
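
    The projection-slice theorem mentioned above can be checked numerically in a few lines (this is not the paper's CUDA pipeline; the array size and axis convention are ours): after a single 3D FFT of the volume, each X-ray-like projection costs only a slice extraction plus a 2D inverse FFT, which is where the O(N² log N) per-view cost comes from.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    vol = rng.random((64, 64, 64))              # toy 3D volume

    # Direct spatial-domain projection: sum along z (O(N^3) per view).
    proj_spatial = vol.sum(axis=2)

    # Fourier-domain route: one 3D FFT up front, then each view needs only
    # a central slice extraction plus a 2D inverse FFT (O(N^2 log N)).
    spectrum = np.fft.fftn(vol)
    central_slice = spectrum[:, :, 0]           # k_z = 0 plane of the spectrum
    proj_fourier = np.fft.ifft2(central_slice).real

    print(np.allclose(proj_spatial, proj_fourier))   # True, up to round-off
    ```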

  12. LAMPS software and mesoscale prediction studies

    NASA Technical Reports Server (NTRS)

    Perkey, D. J.

    1985-01-01

    The full-physics version of the LAMPS model has been implemented on the Perkin-Elmer computer. In addition, the LAMPS graphics processors have been rewritten to run on the Perkin-Elmer and are currently undergoing final testing. Numerical experiments investigating the impact of convective parameterized latent heat release on the evolution of a precipitating storm have been performed, and the results are currently being evaluated. Current efforts include the continued evaluation of the impact of initial conditions on LAMPS model results. This work will help define measurement requirements for future research field projects as well as for observations in support of operational forecasts. Work on the impact of parameterized latent heat release on the evolution of precipitating systems is also continuing. This research is in support of NASA's proposed Earth Observation Mission (EOM).

  13. Factors Influencing Junior High School Teachers' Computer-Based Instructional Practices Regarding Their Instructional Evolution Stages

    ERIC Educational Resources Information Center

    Hsu, Ying-Shao; Wu, Hsin-Kai; Hwang, Fu-Kwun

    2007-01-01

    Sandholtz, Ringstaff, & Dwyer (1996) list five stages in the "evolution" of a teacher's capacity for computer-based instruction--entry, adoption, adaptation, appropriation and invention--which hereafter will be called the teacher's computer-based instructional evolution. In this study of approximately six hundred junior high school…

  14. Studying the laws of software evolution in a long-lived FLOSS project.

    PubMed

    Gonzalez-Barahona, Jesus M; Robles, Gregorio; Herraiz, Israel; Ortega, Felipe

    2014-07-01

    Some free, open-source software projects have been around for quite a long time, the longest-living ones dating from the early 1980s. For some of them, detailed information about their evolution is available in source code management systems tracking all their code changes for periods of more than 15 years. This paper examines in detail the evolution of one such project, glibc, with the main aim of understanding how it evolved and how it matched Lehman's laws of software evolution. As a result, we have developed a methodology for studying the evolution of such long-lived projects based on the information in their source code management repository, described in detail several aspects of the history of glibc, including some activity and size metrics, and found that some of the laws of software evolution may not hold in this case. © 2013 The Authors. Journal of Software: Evolution and Process published by John Wiley & Sons Ltd.

  15. Studying the laws of software evolution in a long-lived FLOSS project

    PubMed Central

    Gonzalez-Barahona, Jesus M; Robles, Gregorio; Herraiz, Israel; Ortega, Felipe

    2014-01-01

    Some free, open-source software projects have been around for quite a long time, the longest-living ones dating from the early 1980s. For some of them, detailed information about their evolution is available in source code management systems tracking all their code changes for periods of more than 15 years. This paper examines in detail the evolution of one such project, glibc, with the main aim of understanding how it evolved and how it matched Lehman's laws of software evolution. As a result, we have developed a methodology for studying the evolution of such long-lived projects based on the information in their source code management repository, described in detail several aspects of the history of glibc, including some activity and size metrics, and found that some of the laws of software evolution may not hold in this case. © 2013 The Authors. Journal of Software: Evolution and Process published by John Wiley & Sons Ltd. PMID:25893093

  16. Preparing Biology Teachers to Teach Evolution in a Project-Based Approach

    ERIC Educational Resources Information Center

    Cook, Kristin; Buck, Gayle; Park Rogers, Meredith

    2012-01-01

    This study investigates a project-based learning (PBL) approach to teaching evolution to inform efforts in teacher preparation. Data analysis of a secondary biology educator teaching evolution through a PBL approach illuminated: (1) active student voice, which allowed students to reflect on their positioning on evolution and consider multiple…

  17. The VASIMR[registered trademark] VF-200-1 ISS Experiment as a Laboratory for Astrophysics

    NASA Technical Reports Server (NTRS)

    Glover, Tim W.; Squire, Jared P.; Longmier, Benjamin; Cassady, Leonard; Ilin, Andrew; Carter, Mark; Olsen, Chris S.; McCaskill, Greg; Diaz, Franklin Chang; Girimaji, Sharath

    2010-01-01

    The VASIMR[R] Flight Experiment (VF-200-1) will be tested in space aboard the International Space Station (ISS) in about four years. It will consist of two 100 kW parallel plasma engines with opposite magnetic dipoles, resulting in a near zero-torque magnetic system. Electrical energy will come from ISS at low power level, be stored in batteries and used to fire the engine at 200 kW. The VF-200-1 project will provide a unique opportunity on the ISS National Laboratory for astrophysicists and space physicists to study the dynamic evolution of an expanding and reconnecting plasma loop. Here, we review the status of the project and discuss our current plans for computational modeling and in situ observation of a dynamic plasma loop on an experimental platform in low-Earth orbit. The VF-200-1 project is still in the early stages of development and we welcome new collaborators.

  18. A Web-Based Monitoring System for Multidisciplinary Design Projects

    NASA Technical Reports Server (NTRS)

    Rogers, James L.; Salas, Andrea O.; Weston, Robert P.

    1998-01-01

    In today's competitive environment, both industry and government agencies are under pressure to reduce the time and cost of multidisciplinary design projects. New tools have been introduced to assist in this process by facilitating the integration of and communication among diverse disciplinary codes. One such tool, a framework for multidisciplinary computational environments, is defined as a hardware and software architecture that enables integration, execution, and communication among diverse disciplinary processes. An examination of current frameworks reveals weaknesses in various areas, such as sequencing, displaying, monitoring, and controlling the design process. The objective of this research is to explore how Web technology, integrated with an existing framework, can improve these areas of weakness. This paper describes a Web-based system that optimizes and controls the execution sequence of design processes, and monitors the project status and results. The three-stage evolution of the system with increasingly complex problems demonstrates the feasibility of this approach.

  19. The VASIMR® VF-200-1 ISS Experiment as a Laboratory for Astrophysics

    NASA Astrophysics Data System (ADS)

    Glover, T.; Squire, J. P.; Longmier, B. W.; Carter, M. D.; Ilin, A. V.; Cassady, L. D.; Olsen, C. S.; Chang Díaz, F.; McCaskill, G. E.; Bering, E. A.; Garrison, D.; Girimaji, S.; Araya, D.; Morin, L.; Shebalin, J. V.

    2010-12-01

    The VASIMR® Flight Experiment (VF-200-1) will be tested in space aboard the International Space Station (ISS) in about four years. It will consist of two 100 kW parallel plasma engines with opposite magnetic dipoles, resulting in a near zero-torque magnetic system. Electrical energy will come from ISS at low power level, be stored in batteries and used to fire the engine at 200 kW. The VF-200-1 project will provide a unique opportunity on the ISS National Laboratory for astrophysicists and space physicists to study the dynamic evolution of an expanding and reconnecting plasma loop. Here, we review the status of the project and discuss our current plans for computational modeling and in situ observation of a dynamic plasma loop on an experimental platform in low-Earth orbit. The VF-200-1 project is still in the early stages of development and we welcome new collaborators.

  20. Virtual workstations and telepresence interfaces: Design accommodations and prototypes for Space Station Freedom evolution

    NASA Technical Reports Server (NTRS)

    Mcgreevy, Michael W.

    1990-01-01

    An advanced human-system interface is being developed for evolutionary Space Station Freedom as part of the NASA Office of Space Station (OSS) Advanced Development Program. The human-system interface is based on body-pointed display and control devices. The project will identify and document the design accommodations ('hooks and scars') required to support virtual workstations and telepresence interfaces, and prototype interface systems will be built, evaluated, and refined. The project is a joint enterprise of Marquette University, Astronautics Corporation of America (ACA), and NASA's ARC. The project team is working with NASA's JSC and McDonnell Douglas Astronautics Company (the Work Package contractor) to ensure that the project is consistent with space station user requirements and program constraints. Documentation describing design accommodations and tradeoffs will be provided to OSS, JSC, and McDonnell Douglas, and prototype interface devices will be delivered to ARC and JSC. ACA intends to commercialize derivatives of the interface for use with computer systems developed for scientific visualization and system simulation.

  1. Rapid execution of fan beam image reconstruction algorithms using efficient computational techniques and special-purpose processors

    NASA Astrophysics Data System (ADS)

    Gilbert, B. K.; Robb, R. A.; Chu, A.; Kenue, S. K.; Lent, A. H.; Swartzlander, E. E., Jr.

    1981-02-01

    Rapid advances during the past ten years of several forms of computer-assisted tomography (CT) have resulted in the development of numerous algorithms to convert raw projection data into cross-sectional images. These reconstruction algorithms are either 'iterative,' in which a large matrix algebraic equation is solved by successive approximation techniques; or 'closed form'. Continuing evolution of the closed form algorithms has allowed the newest versions to produce excellent reconstructed images in most applications. This paper will review several computer software and special-purpose digital hardware implementations of closed form algorithms, either proposed during the past several years by a number of workers or actually implemented in commercial or research CT scanners. The discussion will also cover a number of recently investigated algorithmic modifications which reduce the amount of computation required to execute the reconstruction process, as well as several new special-purpose digital hardware implementations under development in laboratories at the Mayo Clinic.
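
    As a toy illustration of the 'iterative' family mentioned above, in which a large linear system is solved by successive approximation, the sketch below applies a Kaczmarz/ART-style sweep to a tiny consistent system A·x = b; the matrix, measurements, and sweep count are invented for illustration and do not correspond to any particular scanner geometry:

    ```python
    import numpy as np

    # Tiny stand-in for the projection system: rows of A are "ray sums", b the measurements.
    A = np.array([[1.0, 1.0, 0.0],
                  [0.0, 1.0, 1.0],
                  [1.0, 0.0, 1.0],
                  [1.0, 1.0, 1.0]])
    x_true = np.array([2.0, -1.0, 3.0])
    b = A @ x_true

    x = np.zeros(3)                      # initial image estimate
    for sweep in range(50):              # successive approximation (Kaczmarz / ART)
        for a_i, b_i in zip(A, b):
            x += (b_i - a_i @ x) / (a_i @ a_i) * a_i   # project onto the i-th hyperplane

    print(x)   # converges toward x_true for this consistent system
    ```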

  2. Millstone: software for multiplex microbial genome analysis and engineering

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Goodman, Daniel B.; Kuznetsov, Gleb; Lajoie, Marc J.

    Inexpensive DNA sequencing and advances in genome editing have made computational analysis a major rate-limiting step in adaptive laboratory evolution and microbial genome engineering. Here, we describe Millstone, a web-based platform that automates genotype comparison and visualization for projects with up to hundreds of genomic samples. To enable iterative genome engineering, Millstone allows users to design oligonucleotide libraries and create successive versions of reference genomes. Millstone is open source and easily deployable to a cloud platform, local cluster, or desktop, making it a scalable solution for any lab.

  3. The Effects of the Uncertainty of Thermodynamic and Kinetic Properties on Nucleation and Evolution Kinetics of Cr-Rich Phase in Fe-Cr Alloys

    DTIC Science & Technology

    2012-12-01

    [Record text garbled in extraction: only report-documentation fragments are legible. Authors: Mark Tschopp, Fei Gao, and Xin Sun; performing organization: U.S. Army Research Laboratory, RDRL-WMM-F, Aberdeen Proving Ground. Legible citation fragments include Scripta Materialia 2011, 64, 908, and Plimpton, S., Journal of Computational Physics.]

  4. Millstone: software for multiplex microbial genome analysis and engineering.

    PubMed

    Goodman, Daniel B; Kuznetsov, Gleb; Lajoie, Marc J; Ahern, Brian W; Napolitano, Michael G; Chen, Kevin Y; Chen, Changping; Church, George M

    2017-05-25

    Inexpensive DNA sequencing and advances in genome editing have made computational analysis a major rate-limiting step in adaptive laboratory evolution and microbial genome engineering. We describe Millstone, a web-based platform that automates genotype comparison and visualization for projects with up to hundreds of genomic samples. To enable iterative genome engineering, Millstone allows users to design oligonucleotide libraries and create successive versions of reference genomes. Millstone is open source and easily deployable to a cloud platform, local cluster, or desktop, making it a scalable solution for any lab.

  5. Millstone: software for multiplex microbial genome analysis and engineering

    DOE PAGES

    Goodman, Daniel B.; Kuznetsov, Gleb; Lajoie, Marc J.; ...

    2017-05-25

    Inexpensive DNA sequencing and advances in genome editing have made computational analysis a major rate-limiting step in adaptive laboratory evolution and microbial genome engineering. Here, we describe Millstone, a web-based platform that automates genotype comparison and visualization for projects with up to hundreds of genomic samples. To enable iterative genome engineering, Millstone allows users to design oligonucleotide libraries and create successive versions of reference genomes. Millstone is open source and easily deployable to a cloud platform, local cluster, or desktop, making it a scalable solution for any lab.

  6. Grids, virtualization, and clouds at Fermilab

    DOE PAGES

    Timm, S.; Chadwick, K.; Garzoglio, G.; ...

    2014-06-11

    Fermilab supports a scientific program that includes experiments and scientists located across the globe. To better serve this community, in 2004, the (then) Computing Division undertook the strategy of placing all of the High Throughput Computing (HTC) resources in a Campus Grid known as FermiGrid, supported by common shared services. In 2007, the FermiGrid Services group deployed a service infrastructure that utilized Xen virtualization, LVS network routing and MySQL circular replication to deliver highly available services that offered significant performance, reliability and serviceability improvements. This deployment was further enhanced through the deployment of a distributed redundant network core architecture and the physical distribution of the systems that host the virtual machines across multiple buildings on the Fermilab Campus. In 2010, building on the experience pioneered by FermiGrid in delivering production services in a virtual infrastructure, the Computing Sector commissioned the FermiCloud, General Physics Computing Facility and Virtual Services projects to serve as platforms for support of scientific computing (FermiCloud 6 GPCF) and core computing (Virtual Services). Lastly, this work will present the evolution of the Fermilab Campus Grid, Virtualization and Cloud Computing infrastructure together with plans for the future.

  7. Grids, virtualization, and clouds at Fermilab

    NASA Astrophysics Data System (ADS)

    Timm, S.; Chadwick, K.; Garzoglio, G.; Noh, S.

    2014-06-01

    Fermilab supports a scientific program that includes experiments and scientists located across the globe. To better serve this community, in 2004, the (then) Computing Division undertook the strategy of placing all of the High Throughput Computing (HTC) resources in a Campus Grid known as FermiGrid, supported by common shared services. In 2007, the FermiGrid Services group deployed a service infrastructure that utilized Xen virtualization, LVS network routing and MySQL circular replication to deliver highly available services that offered significant performance, reliability and serviceability improvements. This deployment was further enhanced through the deployment of a distributed redundant network core architecture and the physical distribution of the systems that host the virtual machines across multiple buildings on the Fermilab Campus. In 2010, building on the experience pioneered by FermiGrid in delivering production services in a virtual infrastructure, the Computing Sector commissioned the FermiCloud, General Physics Computing Facility and Virtual Services projects to serve as platforms for support of scientific computing (FermiCloud 6 GPCF) and core computing (Virtual Services). This work will present the evolution of the Fermilab Campus Grid, Virtualization and Cloud Computing infrastructure together with plans for the future.

  8. Problem-Based Service Learning: The Evolution of a Team Project

    ERIC Educational Resources Information Center

    Connor-Greene, Patricia A.

    2002-01-01

    In this article, I describe the evolution of a problem-based service learning project in an undergraduate Abnormal Psychology course. Students worked in teams on a semester-long project to locate and evaluate information and treatment for specific psychiatric disorders. As part of the project, each team selected relevant bibliographic materials,…

  9. Cross-verification of the GENE and XGC codes in preparation for their coupling

    NASA Astrophysics Data System (ADS)

    Jenko, Frank; Merlo, Gabriele; Bhattacharjee, Amitava; Chang, Cs; Dominski, Julien; Ku, Seunghoe; Parker, Scott; Lanti, Emmanuel

    2017-10-01

    A high-fidelity Whole Device Model (WDM) of a magnetically confined plasma is a crucial tool for planning and optimizing the design of future fusion reactors, including ITER. Aiming at building such a tool, in the framework of the Exascale Computing Project (ECP) the two existing gyrokinetic codes GENE (Eulerian delta-f) and XGC (PIC full-f) will be coupled, thus enabling first-principles kinetic WDM simulations to be carried out. In preparation for this ultimate goal, a benchmark between the two codes is carried out looking at ITG modes in the adiabatic electron limit. This verification exercise is also joined by the global Lagrangian PIC code ORB5. Linear and nonlinear comparisons have been carried out, neglecting for simplicity collisions and sources. A very good agreement is recovered on frequency, growth rate and mode structure of linear modes. A similarly excellent agreement is also observed comparing the evolution of the heat flux and of the background temperature profile during nonlinear simulations. Work supported by the US DOE under the Exascale Computing Project (17-SC-20-SC).

  10. Constitutive Modeling of the Thermomechanical Behavior of Rock Salt

    NASA Astrophysics Data System (ADS)

    Hampel, A.

    2016-12-01

    For the safe disposal of heat-generating high-level radioactive waste in rock salt formations, highly reliable numerical simulations of the thermomechanical and hydraulic behavior of the host rock have to be performed. Today, the huge progress in computer technology has enabled experts to calculate large and detailed computer models of underground repositories. However, the big advances in computer technology are only beneficial when the applied material models and modeling procedures also meet very high demands. They result from the fact that the evaluation of the long-term integrity of the geological barrier requires an extrapolation of a highly nonlinear deformation behavior to up to 1 million years, while the underlying experimental investigations in the laboratory or in situ have a duration of only days, weeks or at most some years. Several advanced constitutive models were developed and continuously improved to describe the dependences of various deformation phenomena in rock salt on in-situ relevant boundary conditions: transient and steady-state creep, evolution of damage and dilatancy in the DRZ, failure, post-failure behavior, residual strength, damage and dilatancy reduction, and healing. In a joint project series between 2004 and 2016, fundamental features of the advanced models were investigated and compared in detail with benchmark calculations. The study included procedures for the determination of characteristic salt-type-specific model parameter values and for the performance of numerical calculations of underground structures. Based on the results of this work and on specific laboratory investigations, the rock mechanical modeling is currently developed further in a common research project of experts from Germany and the United States. In this presentation, an overview about the work and results of the project series is given and the current joint research project WEIMOS is introduced.

  11. a Study on Mental Representations for Realistic Visualization the Particular Case of Ski Trail Mapping

    NASA Astrophysics Data System (ADS)

    Balzarini, R.; Dalmasso, A.; Murat, M.

    2015-08-01

    This article presents preliminary results from a research project in progress that brings together geographers, cognitive scientists, historians and computer scientists. The project investigates the evolution of a particular territorial model: ski trail maps. Ski resorts, tourist and sporting innovations for mountain economies since the 1930s, have needed cartographic representations corresponding to new practices of space. Painter artists have been involved in producing ski maps with painting techniques and panoramic views, which are by far the most common type of map because they allow the resorts to look impressive to potential visitors. These techniques have evolved throughout the mutations of the ski resorts. Paper ski maps no longer meet the needs of a large part of the customers; the question now arises of their adaptation to digital media. From a computerized-process perspective, the early stage of the project aims to identify the artist representations, based on conceptual and technical rules, which are handled by user-skiers to perform a task (location, wayfinding, decision-making) and can be transferred to a computer system. This article presents the experimental phase that analyzes the artist and user mental representations at stake during the making and the reading of a paper ski map. It particularly focuses on how the invention of the artist influences map reading.

  12. Using Decision Structures for Policy Analysis in Software Product-line Evolution - A Case Study

    NASA Astrophysics Data System (ADS)

    Sarang, Nita; Sanglikar, Mukund A.

    Project management decisions are the primary basis for project success (or failure). Mostly, such decisions are based on an intuitive understanding of the underlying software engineering and management process and are therefore liable to be misjudged. Our problem domain is product-line evolution. We model the dynamics of the process by incorporating feedback loops appropriate to two decision structures: staffing policy, and the forces of growth associated with long-term software evolution. The model is executable and supports project managers in assessing the long-term effects of possible actions. Our work also corroborates results from earlier studies of E-type systems, in particular the FEAST project and the rules for software evolution, planning and management.

  13. Artificial neural networks: fundamentals, computing, design, and application.

    PubMed

    Basheer, I A; Hajmeer, M

    2000-12-01

    Artificial neural networks (ANNs) are relatively new computational tools that have found extensive utilization in solving many complex real-world problems. The attractiveness of ANNs comes from their remarkable information processing characteristics pertinent mainly to nonlinearity, high parallelism, fault and noise tolerance, and learning and generalization capabilities. This paper aims to familiarize the reader with ANN-based computing (neurocomputing) and to serve as a useful companion practical guide and toolkit for the ANNs modeler along the course of ANN project development. The history of the evolution of neurocomputing and its relation to the field of neurobiology is briefly discussed. ANNs are compared to both expert systems and statistical regression and their advantages and limitations are outlined. A bird's eye review of the various types of ANNs and the related learning rules is presented, with special emphasis on backpropagation (BP) ANNs theory and design. A generalized methodology for developing successful ANNs projects from conceptualization, to design, to implementation, is described. The most common problems that BPANNs developers face during training are summarized in conjunction with possible causes and remedies. Finally, as a practical application, BPANNs were used to model the microbial growth curves of S. flexneri. The developed model was reasonably accurate in simulating both training and test time-dependent growth curves as affected by temperature and pH.
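
    As a minimal, hedged sketch of the kind of backpropagation network discussed above (the single-hidden-layer architecture, learning rate, and synthetic logistic "growth curve" are our own choices, not the authors' S. flexneri model):

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Synthetic "growth curve": a logistic shape with a little noise (toy data only).
    t = np.linspace(0.0, 10.0, 100)[:, None]
    y = 1.0 / (1.0 + np.exp(-(t - 5.0))) + 0.01 * rng.standard_normal(t.shape)
    x = (t - 5.0) / 5.0                     # scale the input to roughly [-1, 1]

    # One hidden tanh layer and a linear output, trained by plain backpropagation.
    n_hidden, lr = 8, 0.1
    W1 = 0.5 * rng.standard_normal((1, n_hidden)); b1 = np.zeros(n_hidden)
    W2 = 0.5 * rng.standard_normal((n_hidden, 1)); b2 = np.zeros(1)

    for epoch in range(20000):
        h = np.tanh(x @ W1 + b1)            # forward pass
        pred = h @ W2 + b2
        err = pred - y                      # gradient of 0.5 * mean squared error
        gW2 = h.T @ err / len(x); gb2 = err.mean(axis=0)
        dh = (err @ W2.T) * (1.0 - h ** 2)  # backpropagate through tanh
        gW1 = x.T @ dh / len(x); gb1 = dh.mean(axis=0)
        W1 -= lr * gW1; b1 -= lr * gb1; W2 -= lr * gW2; b2 -= lr * gb2

    print("final MSE:", float((err ** 2).mean()))
    ```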

  14. Computational evolution: taking liberties.

    PubMed

    Correia, Luís

    2010-09-01

    Evolution has, for a long time, inspired computer scientists to produce computer models mimicking its behavior. Evolutionary algorithm (EA) is one of the areas where this approach has flourished. EAs have been used to model and study evolution, but they have been especially developed for their aptitude as optimization tools for engineering. Developed models are quite simple in comparison with their natural sources of inspiration. However, since EAs run on computers, we have the freedom, especially in optimization models, to test approaches both realistic and outright speculative, from the biological point of view. In this article, we discuss different common evolutionary algorithm models, and then present some alternatives of interest. These include biologically inspired models, such as co-evolution and, in particular, symbiogenetics and outright artificial operators and representations. In each case, the advantages of the modifications to the standard model are identified. The other area of computational evolution, which has allowed us to study basic principles of evolution and ecology dynamics, is the development of artificial life platforms for open-ended evolution of artificial organisms. With these platforms, biologists can test theories by directly manipulating individuals and operators, observing the resulting effects in a realistic way. An overview of the most prominent of such environments is also presented. If instead of artificial platforms we use the real world for evolving artificial life, then we are dealing with evolutionary robotics (ERs). A brief description of this area is presented, analyzing its relations to biology. Finally, we present the conclusions and identify future research avenues in the frontier of computation and biology. Hopefully, this will help to draw the attention of more biologists and computer scientists to the benefits of such interdisciplinary research.

  15. A Monte Carlo model for 3D grain evolution during welding

    NASA Astrophysics Data System (ADS)

    Rodgers, Theron M.; Mitchell, John A.; Tikare, Veena

    2017-09-01

    Welding is one of the most widespread processes used in metal joining. However, there are currently no open-source software implementations for the simulation of microstructural evolution during a weld pass. Here we describe a Potts Monte Carlo based model implemented in the SPPARKS kinetic Monte Carlo computational framework. The model simulates melting, solidification and solid-state microstructural evolution of material in the fusion and heat-affected zones of a weld. The model does not simulate thermal behavior, but rather utilizes user input parameters to specify weld pool and heat-affected zone properties. Weld pool shapes are specified by Bézier curves, which allow for the specification of a wide range of pool shapes. Pool shapes can range from narrow and deep to wide and shallow, representing different fluid flow conditions within the pool. Surrounding temperature gradients are calculated with the aid of a closest point projection algorithm. The model also allows simulation of pulsed power welding through time-dependent variation of the weld pool size. Example simulation results and comparisons with laboratory weld observations demonstrate microstructural variation with weld speed, pool shape, and pulsed power.
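
    A minimal 2D Potts Monte Carlo sketch of grain coarsening, the basic mechanism such a model builds on (the weld-pool geometry, Bézier-curve shapes, temperature gradients, and pulsed-power features of the actual SPPARKS model are omitted; the lattice size, number of grain IDs, and temperature are arbitrary):

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    N, q, kT, steps = 64, 20, 0.3, 200_000
    spins = rng.integers(0, q, size=(N, N))          # random initial grain IDs
    moves = [(-1, 0), (1, 0), (0, -1), (0, 1)]

    def unlike_neighbours(s, i, j, value):
        # Site energy = number of unlike nearest neighbours (periodic boundaries).
        return sum(value != s[(i + di) % N, (j + dj) % N] for di, dj in moves)

    for _ in range(steps):
        i, j = rng.integers(N), rng.integers(N)
        di, dj = moves[rng.integers(4)]
        new = spins[(i + di) % N, (j + dj) % N]       # propose a neighbour's grain ID
        dE = unlike_neighbours(spins, i, j, new) - unlike_neighbours(spins, i, j, spins[i, j])
        if dE <= 0 or rng.random() < np.exp(-dE / kT):
            spins[i, j] = new                         # Metropolis acceptance

    print("grains remaining:", len(np.unique(spins)))
    ```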

  16. Crustal structure of mountain belts and basins: Industry and academic collaboration at Cornell

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Allmendinger, R.; Barazangi, M.; Brown, L.

    1995-08-01

    Interdisciplinary investigations of the large-scale structure and evolution of key basins and orogenic belts around the world are the focal point of academic-industry interaction at Cornell. Ongoing and new initiatives with significant industry involvement include: Project INDEPTH (Interdisciplinary Deep Profiling of Tibet and the Himalayas), a multinational effort to delineate deep structure across the type example of active continent-continent collision. 300 km of deep reflection profiling was collected across the Himalaya and southern Tibet Plateau in 1992 and 1994. CAP (Cornell Andes Project), a long-standing interdisciplinary effort to understand the structure and evolution of the Andes, with a focus on Argentina, Chile and Bolivia. A deep reflection profile is tentatively planned for 1997. Intra-plate Orogeny in the Middle East and North Africa is the focus of multidisciplinary regional syntheses of existing seismic reflection and other databases in Syria (Palmyrides) and Morocco (Atlas), with an emphasis on reactivation and inversion tectonics. Project URSEIS (Urals Reflection Seismic Experiment and Integrated Studies) is a collaboration with EUROPROBE to collect 500 km of vibroseis and dynamite deep reflection profiling across the southern Urals in 1995. Project CRATON, an element in COCORP's systematic exploration of the continental US, is a nascent multi-disciplinary effort to understand the buried craton of the central US and the basins built upon it. Global Basins Research Network (GBRN) is a diversified observational and computational effort to image and model the movement of pore fluids in detail and on a regional scale for a producing oil structure in the Gulf of Mexico.

  17. Computational model for simulation small testing launcher, technical solution

    NASA Astrophysics Data System (ADS)

    Chelaru, Teodor-Viorel; Cristian, Barbu; Chelaru, Adrian

    2014-12-01

    The purpose of this paper is to present some aspects of the computational model and technical solutions for a multistage suborbital launcher for testing (SLT), used to test space equipment and carry out scientific measurements. The computational model consists of a numerical simulation of the SLT evolution for different launch conditions. The launcher model has six degrees of freedom (6DOF) and variable mass. The results analysed are the flight parameters and ballistic performance. The discussion focuses on the technical possibility of realizing a small multi-stage launcher by recycling military rocket motors. From a technical point of view, the paper is focused on the national project "Suborbital Launcher for Testing" (SLT), which is based on hybrid propulsion and control systems obtained through an original design. While classical suborbital sounding rockets are unguided, using solid-fuel motors and following an uncontrolled ballistic flight, the SLT project introduces a different approach by proposing a guided suborbital launcher, which is basically a satellite launcher at a smaller scale containing its main subsystems. The project itself can therefore be considered an intermediate step in the development of a wider range of launch systems based on hybrid propulsion technology, which may have a major impact on future European launcher programs. The SLT project, as its title suggests, has two major objectives: a short-term objective, which consists of obtaining a suborbital launch system able to go into service within a predictable period of time, and a long-term objective, which consists of the development and testing of some unconventional subsystems to be integrated later into the satellite launcher as part of the European space program. The technical content of the project must therefore be carried out beyond the range of existing suborbital vehicle programs, towards the current technological necessities of the space field, especially the European one.

  18. Computational model for simulation small testing launcher, technical solution

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chelaru, Teodor-Viorel, E-mail: teodor.chelaru@upb.ro; Cristian, Barbu, E-mail: barbucr@mta.ro; Chelaru, Adrian, E-mail: achelaru@incas.ro

    The purpose of this paper is to present some aspects of the computational model and technical solutions for a multistage suborbital launcher for testing (SLT), used to test space equipment and carry out scientific measurements. The computational model consists of a numerical simulation of the SLT evolution for different launch conditions. The launcher model has six degrees of freedom (6DOF) and variable mass. The results analysed are the flight parameters and ballistic performance. The discussion focuses on the technical possibility of realizing a small multi-stage launcher by recycling military rocket motors. From a technical point of view, the paper is focused on the national project 'Suborbital Launcher for Testing' (SLT), which is based on hybrid propulsion and control systems obtained through an original design. While classical suborbital sounding rockets are unguided, using solid-fuel motors and following an uncontrolled ballistic flight, the SLT project introduces a different approach by proposing a guided suborbital launcher, which is basically a satellite launcher at a smaller scale containing its main subsystems. The project itself can therefore be considered an intermediate step in the development of a wider range of launch systems based on hybrid propulsion technology, which may have a major impact on future European launcher programs. The SLT project, as its title suggests, has two major objectives: a short-term objective, which consists of obtaining a suborbital launch system able to go into service within a predictable period of time, and a long-term objective, which consists of the development and testing of some unconventional subsystems to be integrated later into the satellite launcher as part of the European space program. The technical content of the project must therefore be carried out beyond the range of existing suborbital vehicle programs, towards the current technological necessities of the space field, especially the European one.

  19. An approach to computing discrete adjoints for MPI-parallelized models applied to Ice Sheet System Model 4.11

    NASA Astrophysics Data System (ADS)

    Larour, Eric; Utke, Jean; Bovin, Anton; Morlighem, Mathieu; Perez, Gilberto

    2016-11-01

    Within the framework of sea-level rise projections, there is a strong need for hindcast validation of the evolution of polar ice sheets in a way that tightly matches observational records (from radar, gravity, and altimetry observations mainly). However, the computational requirements for making hindcast reconstructions possible are severe and rely mainly on the evaluation of the adjoint state of transient ice-flow models. Here, we look at the computation of adjoints in the context of the NASA/JPL/UCI Ice Sheet System Model (ISSM), written in C++ and designed for parallel execution with MPI. We present the adaptations required in the way the software is designed and written, but also generic adaptations in the tools facilitating the adjoint computations. We concentrate on the use of operator overloading coupled with the AdjoinableMPI library to achieve the adjoint computation of the ISSM. We present a comprehensive approach to (1) carry out type changing through the ISSM, hence facilitating operator overloading, (2) bind to external solvers such as MUMPS and GSL-LU, and (3) handle MPI-based parallelism to scale the capability. We demonstrate the success of the approach by computing sensitivities of hindcast metrics such as the misfit to observed records of surface altimetry on the northeastern Greenland Ice Stream, or the misfit to observed records of surface velocities on Upernavik Glacier, central West Greenland. We also provide metrics for the scalability of the approach, and the expected performance. This approach has the potential to enable a new generation of hindcast-validated projections that make full use of the wealth of datasets currently being collected, or already collected, in Greenland and Antarctica.
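
    The operator-overloading idea at the heart of this approach can be conveyed with a generic reverse-mode sketch (this is plain Python, not ISSM's C++/AdjoinableMPI machinery): overloaded arithmetic records each operation and its local partial derivatives so that adjoints can be propagated backwards from the output.

    ```python
    class Var:
        """Minimal reverse-mode AD variable: overloads + and * and records the tape."""
        def __init__(self, value, parents=()):
            self.value, self.parents, self.adjoint = value, parents, 0.0

        def __add__(self, other):
            return Var(self.value + other.value, [(self, 1.0), (other, 1.0)])

        def __mul__(self, other):
            return Var(self.value * other.value, [(self, other.value), (other, self.value)])

        def backward(self, seed=1.0):
            # Accumulate the adjoint and push it to the parents, scaled by local gradients.
            self.adjoint += seed
            for parent, local_grad in self.parents:
                parent.backward(seed * local_grad)

    # f(x, y) = x*y + x  ->  df/dx = y + 1, df/dy = x
    x, y = Var(3.0), Var(4.0)
    f = x * y + x
    f.backward()
    print(f.value, x.adjoint, y.adjoint)   # 15.0 5.0 3.0
    ```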

  20. An Approach to Computing Discrete Adjoints for MPI-Parallelized Models Applied to the Ice Sheet System Model

    NASA Astrophysics Data System (ADS)

    Perez, G. L.; Larour, E. Y.; Morlighem, M.

    2016-12-01

    Within the framework of sea-level rise projections, there is a strong need for hindcast validation of the evolution of polar ice sheets in a way that tightly matches observational records (from radar and altimetry observations mainly). However, the computational requirements for making hindcast reconstructions possible are severe and rely mainly on the evaluation of the adjoint state of transient ice-flow models. Here, we look at the computation of adjoints in the context of the NASA/JPL/UCI Ice Sheet System Model, written in C++ and designed for parallel execution with MPI. We present the adaptations required in the way the software is designed and written but also generic adaptations in the tools facilitating the adjoint computations. We concentrate on the use of operator overloading coupled with the AdjoinableMPI library to achieve the adjoint computation of ISSM. We present a comprehensive approach to 1) carry out type changing through ISSM, hence facilitating operator overloading, 2) bind to external solvers such as MUMPS and GSL-LU and 3) handle MPI-based parallelism to scale the capability. We demonstrate the success of the approach by computing sensitivities of hindcast metrics such as the misfit to observed records of surface altimetry on the North-East Greenland Ice Stream, or the misfit to observed records of surface velocities on Upernavik Glacier, Central West Greenland. We also provide metrics for the scalability of the approach, and the expected performance. This approach has the potential to enable a new generation of hindcast-validated projections that make full use of the wealth of datasets currently being collected, or already collected, in Greenland and Antarctica, such as surface altimetry, surface velocities, and/or gravity measurements.

  1. An open source platform for multi-scale spatially distributed simulations of microbial ecosystems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Segre, Daniel

    2014-08-14

    The goal of this project was to develop a tool for facilitating simulation, validation and discovery of multiscale dynamical processes in microbial ecosystems. This led to the development of an open-source software platform for Computation Of Microbial Ecosystems in Time and Space (COMETS). COMETS performs spatially distributed time-dependent flux balance based simulations of microbial metabolism. Our plan involved building the software platform itself, calibrating and testing it through comparison with experimental data, and integrating simulations and experiments to address important open questions on the evolution and dynamics of cross-feeding interactions between microbial species.
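
    At its core, the flux balance step that a platform of this kind evaluates at each location and time step is a linear program: maximize a growth objective subject to steady-state mass balance S·v = 0 and flux bounds. A toy, hedged sketch follows (the three-reaction network, bounds, and objective are invented and are not COMETS code, which additionally couples many such problems to spatial diffusion):

    ```python
    import numpy as np
    from scipy.optimize import linprog

    # Toy network: R1 takes up metabolite A, R2 converts A -> B, R3 exports B (the "objective").
    # Stoichiometric matrix S (rows: metabolites A, B; columns: reactions R1..R3).
    S = np.array([[ 1.0, -1.0,  0.0],
                  [ 0.0,  1.0, -1.0]])
    bounds = [(0, 10), (0, 1000), (0, 1000)]     # uptake limited to 10 flux units

    # Maximize v3  <=>  minimize -v3, subject to S v = 0 (steady state) and the bounds.
    c = np.array([0.0, 0.0, -1.0])
    res = linprog(c, A_eq=S, b_eq=np.zeros(2), bounds=bounds, method="highs")
    print("optimal fluxes:", res.x)              # expect [10, 10, 10]
    ```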

  2. An integrated communications demand model

    NASA Astrophysics Data System (ADS)

    Doubleday, C. F.

    1980-11-01

    A computer model of communications demand is being developed to permit dynamic simulations of the long-term evolution of demand for communications media in the U.K. to be made under alternative assumptions about social, economic and technological trends in British Telecom's business environment. The context and objectives of the project and the potential uses of the model are reviewed, and four key concepts in the demand for communications media, around which the model is being structured are discussed: (1) the generation of communications demand; (2) substitution between media; (3) technological convergence; and (4) competition. Two outline perspectives on the model itself are given.

  3. Interannual evolutions of (sub)mesoscale dynamics in the Bay of Biscay and the English Channel

    NASA Astrophysics Data System (ADS)

    Charria, G.; Vandermeirsch, F.; Theetten, S.; Yelekçi, Ö.; Assassi, C.; Audiffren, N. J.

    2016-02-01

    In a context of global change, ocean regions such as the Bay of Biscay and the English Channel represent key domains for estimating the local coastal impact of interannual evolutions. Indeed, the coastal (considered in this project as the regions above the continental shelf) and regional (including the continental slope and the abyssal plain) environments are sensitive to long-term fluctuations driven by the open ocean, the atmosphere and the watersheds. These evolutions can have impacts on the whole ecosystem. To understand and, by extension, forecast the evolution of these ecosystems, we need to go further in describing and analysing past interannual variability over decadal to multi-decadal periods. This variability can be described at spatial scales ranging from small (< 1 km) to basin scales (> 100 km). With a focus on the smaller scales, the modelled dynamics, obtained with a Coastal Circulation Model run on national computing resources (GENCI/CINES), is discussed from interannual simulations (10 to 53 years) at different horizontal (4 km to 1 km) and vertical (40 to 100 sigma levels) resolutions and compared with available in situ observations. Exploring vorticity- and kinetic-energy-based diagnostics, dynamical patterns are described, including the vertical distribution of the mesoscale activity. Despite the lack of deep and spatially distributed observations, the present numerical experiments draw a first picture of the 3D mesoscale distribution and its evolution at interannual time scales.

  4. Recent Evolution of the Introductory Curriculum in Computing.

    ERIC Educational Resources Information Center

    Tucker, Allen B.; Garnick, David K.

    1991-01-01

    Traces the evolution of introductory computing courses for undergraduates based on the Association for Computing Machinery (ACM) guidelines published in "Curriculum 78." Changes in the curricula are described, including the role of discrete mathematics and theory; and the need for a broader model for designing introductory courses is…

  5. Language evolution and human-computer interaction

    NASA Technical Reports Server (NTRS)

    Grudin, Jonathan; Norman, Donald A.

    1991-01-01

    Many of the issues that confront designers of interactive computer systems also appear in natural language evolution. Natural languages and human-computer interfaces share as their primary mission the support of extended 'dialogues' between responsive entities. Because in each case one participant is a human being, some of the pressures operating on natural languages, causing them to evolve in order to better support such dialogue, also operate on human-computer 'languages' or interfaces. This does not necessarily push interfaces in the direction of natural language - since one entity in this dialogue is not a human, this is not to be expected. Nonetheless, by discerning where the pressures that guide natural language evolution also appear in human-computer interaction, we can contribute to the design of computer systems and obtain a new perspective on natural languages.

  6. Duality quantum algorithm efficiently simulates open quantum systems

    PubMed Central

    Wei, Shi-Jie; Ruan, Dong; Long, Gui-Lu

    2016-01-01

    Because of inevitable coupling with the environment, nearly all practical quantum systems are open systems, where the evolution is not necessarily unitary. In this paper, we propose a duality quantum algorithm for simulating Hamiltonian evolution of an open quantum system. In contrast to unitary evolution in a usual quantum computer, the evolution operator in a duality quantum computer is a linear combination of unitary operators. In this duality quantum algorithm, the time evolution of the open quantum system is realized by using Kraus operators, which are naturally implemented in a duality quantum computer. This duality quantum algorithm has two distinct advantages compared with existing quantum simulation algorithms based on unitary evolution operations. Firstly, the query complexity of the algorithm is O(d³), in contrast to O(d⁴) for existing unitary simulation algorithms, where d is the dimension of the open quantum system. Secondly, by using a truncated Taylor series of the evolution operators, this duality quantum algorithm provides an exponential improvement in precision compared with previous unitary simulation algorithms. PMID:27464855
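
    The Kraus-operator evolution referred to above is ρ' = Σ_k K_k ρ K_k†, with Σ_k K_k† K_k = I. A small numerical sketch for a single qubit under amplitude damping (the channel choice and damping rate are ours, purely for illustration; this is not the duality algorithm itself):

    ```python
    import numpy as np

    gamma = 0.3                                    # damping probability (arbitrary)
    K0 = np.array([[1.0, 0.0], [0.0, np.sqrt(1 - gamma)]])
    K1 = np.array([[0.0, np.sqrt(gamma)], [0.0, 0.0]])

    # Completeness check: sum_k K_k^dagger K_k = I
    assert np.allclose(K0.conj().T @ K0 + K1.conj().T @ K1, np.eye(2))

    rho = np.array([[0.5, 0.5], [0.5, 0.5]])       # |+><+| input state
    rho_out = K0 @ rho @ K0.conj().T + K1 @ rho @ K1.conj().T

    print(rho_out)
    print("trace preserved:", np.isclose(np.trace(rho_out), 1.0))
    ```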

  7. The advanced software development workstation project

    NASA Technical Reports Server (NTRS)

    Fridge, Ernest M., III; Pitman, Charles L.

    1991-01-01

    The Advanced Software Development Workstation (ASDW) task is researching and developing the technologies required to support Computer Aided Software Engineering (CASE) with the emphasis on those advanced methods, tools, and processes that will be of benefit to support all NASA programs. Immediate goals are to provide research and prototype tools that will increase productivity, in the near term, in projects such as the Software Support Environment (SSE), the Space Station Control Center (SSCC), and the Flight Analysis and Design System (FADS) which will be used to support the Space Shuttle and Space Station Freedom. Goals also include providing technology for development, evolution, maintenance, and operations. The technologies under research and development in the ASDW project are targeted to provide productivity enhancements during the software life cycle phase of enterprise and information system modeling, requirements generation and analysis, system design and coding, and system use and maintenance. On-line user's guides will assist users in operating the developed information system with knowledge base expert assistance.

  8. Observations give us CLUES to Cosmic Flows' origins

    NASA Astrophysics Data System (ADS)

    Sorce, Jenny; Courtois, H.; Gottloeber, S.; Hoffman, Y.; Pomarede, D.; Tully, R. B.; Flows, Cosmic; CLUES

    2014-01-01

    In an era where the wealth of telescope data and the development of computer superclusters keep increasing, the knowledge of Large Scale Structures' formation and evolution constitutes a tremendous challenge. Within this context the project Cosmic Flows has recently produced a catalog of peculiar velocities up to 150 Mpc. These velocities, obtained from direct distance measurements, are ideal markers of the underlying gravitational potential. They form a fantastic input to perform constrained simulations of the Local Universe within the CLUES project. A new method has recently been elaborated to achieve these simulations, which prove to be excellent replicas of our neighborhood. The Wiener-Filter, the Reverse Zel'dovich Approximation and the Constrained Realization techniques are combined to build Initial Conditions. The resulting second generation of constrained simulations presents us with the formidable history of the Great Attractor's and nearby superclusters' formation.

  9. Cosmic reionization on computers. Ultraviolet continuum slopes and dust opacities in high redshift galaxies

    DOE PAGES

    Khakhaleva-Li, Zimu; Gnedin, Nickolay Y.

    2016-03-30

    In this study, we compare the properties of stellar populations of model galaxies from the Cosmic Reionization On Computers (CROC) project with the existing UV and IR data. Since CROC simulations do not follow cosmic dust directly, we adopt two variants of the dust-follows-metals ansatz to populate model galaxies with dust. Using the dust radiative transfer code Hyperion, we compute synthetic stellar spectra, UV continuum slopes, and IR fluxes for simulated galaxies. We find that the simulation results generally match observational measurements, but, perhaps, not in full detail. The differences seem to indicate that our adopted dust-follows-metals ansatzes are not fully sufficient. While the discrepancies with the existing data are marginal, the future JWST data will be of much higher precision, rendering highly significant any tentative difference between theory and observations. It is, therefore, likely that in order to fully utilize the precision of JWST observations, fully dynamical modeling of dust formation, evolution, and destruction may be required.

  10. Cosmic reionization on computers. Ultraviolet continuum slopes and dust opacities in high redshift galaxies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Khakhaleva-Li, Zimu; Gnedin, Nickolay Y.

    In this study, we compare the properties of stellar populations of model galaxies from the Cosmic Reionization On Computers (CROC) project with the existing UV and IR data. Since CROC simulations do not follow cosmic dust directly, we adopt two variants of the dust-follows-metals ansatz to populate model galaxies with dust. Using the dust radiative transfer code Hyperion, we compute synthetic stellar spectra, UV continuum slopes, and IR fluxes for simulated galaxies. We find that the simulation results generally match observational measurements, but, perhaps, not in full detail. The differences seem to indicate that our adopted dust-follows-metals ansatzes are not fully sufficient. While the discrepancies with the existing data are marginal, the future JWST data will be of much higher precision, rendering highly significant any tentative difference between theory and observations. It is, therefore, likely that in order to fully utilize the precision of JWST observations, fully dynamical modeling of dust formation, evolution, and destruction may be required.

  11. Evolution of E-Learning Projects: A Creative Experience?

    ERIC Educational Resources Information Center

    Wakeford, Carol

    2011-01-01

    e-Learning Projects involve the construction by final-year students of e-learning resources in project work. Students are supported in a blended training course in which they acquire appropriate skills and critically review the e-resources of their peers. This paper describes innovations in course design that have led to the evolution of e-resources…

  12. Learning Evolution and the Nature of Science Using Evolutionary Computing and Artificial Life

    ERIC Educational Resources Information Center

    Pennock, Robert T.

    2007-01-01

    Because evolution in natural systems happens so slowly, it is difficult to design inquiry-based labs where students can experiment and observe evolution in the way they can when studying other phenomena. New research in evolutionary computation and artificial life provides a solution to this problem. This paper describes a new A-Life software…

  13. Progress in Computational Simulation of Earthquakes

    NASA Technical Reports Server (NTRS)

    Donnellan, Andrea; Parker, Jay; Lyzenga, Gregory; Judd, Michele; Li, P. Peggy; Norton, Charles; Tisdale, Edwin; Granat, Robert

    2006-01-01

    GeoFEST(P) is a computer program written for use in the QuakeSim project, which is devoted to the development and improvement of means of computational simulation of earthquakes. GeoFEST(P) models interacting earthquake fault systems from the fault-nucleation scale to the tectonic scale. The development of GeoFEST(P) has involved the coupling of two programs: GeoFEST and the Pyramid Adaptive Mesh Refinement Library. GeoFEST is a message-passing-interface-parallel code that utilizes a finite-element technique to simulate the evolution of stress, fault slip, and plastic/elastic deformation in realistic materials like those of faulted regions of the crust of the Earth. The products of such simulations are synthetic observable time-dependent surface deformations on time scales from days to decades. The Pyramid Adaptive Mesh Refinement Library is a software library that facilitates the generation of computational meshes for solving physical problems. In an application of GeoFEST(P), a computational grid can be dynamically adapted as stress grows on a fault. Simulations on workstations using a few tens of thousands of stress and displacement finite elements can now be expanded to multiple millions of elements with greater than 98-percent scaled efficiency on many hundreds of parallel processors.

  14. The insignificant evolution of the richness-mass relation of galaxy clusters

    NASA Astrophysics Data System (ADS)

    Andreon, S.; Congdon, P.

    2014-08-01

    We analysed the richness-mass scaling of 23 very massive clusters at 0.15 < z < 0.55 with homogeneously measured weak-lensing masses and richnesses within a fixed aperture of 0.5 Mpc radius. We found that the richness-mass scaling is very tight (the scatter is <0.09 dex with 90% probability) and independent of cluster evolutionary status and morphology. This implies a close association between infall and evolution of dark matter and galaxies in the central region of clusters. We also found that the evolution of the richness-mass intercept is minor at most, and, given the minor mass evolution across the studied redshift range, the richness evolution of individual massive clusters also turns out to be very small. Finally, it was paramount to account for the cluster mass function and the selection function. Ignoring them would lead to larger biases than the (otherwise quoted) errors. Our study benefits from: a) weak-lensing masses instead of proxy-based masses thereby removing the ambiguity between a real trend and one induced by an accounted evolution of the used mass proxy; b) the use of projected masses that simplify the statistical analysis thereby not requiring consideration of the unknown covariance induced by the cluster orientation/triaxiality; c) the use of aperture masses as they are free of the pseudo-evolution of mass definitions anchored to the evolving density of the Universe; d) a proper accounting of the sample selection function and of the Malmquist-like effect induced by the cluster mass function; e) cosmological simulations for the computation of the cluster mass function, its evolution, and the mass growth of each individual cluster.

  15. Crustal Movement: A Major Force in Evolution. Crustal Evolution Education Project. Teacher's Guide [and] Student Investigation.

    ERIC Educational Resources Information Center

    Stoever, Edward C., Jr.

    Crustal Evolution Education Project (CEEP) modules were designed to: (1) provide students with the methods and results of continuing investigations into the composition, history, and processes of the earth's crust and the application of this knowledge to man's activities and (2) be used by teachers with little or no previous background in the…

  16. Undecidability and Irreducibility Conditions for Open-Ended Evolution and Emergence.

    PubMed

    Hernández-Orozco, Santiago; Hernández-Quiroz, Francisco; Zenil, Hector

    2018-01-01

    Is undecidability a requirement for open-ended evolution (OEE)? Using methods derived from algorithmic complexity theory, we propose robust computational definitions of open-ended evolution and the adaptability of computable dynamical systems. Within this framework, we show that decidability imposes absolute limits on the stable growth of complexity in computable dynamical systems. Conversely, systems that exhibit (strong) open-ended evolution must be undecidable, establishing undecidability as a requirement for such systems. Complexity is assessed in terms of three measures: sophistication, coarse sophistication, and busy beaver logical depth. These three complexity measures assign low complexity values to random (incompressible) objects. As time grows, the stated complexity measures allow for the existence of complex states during the evolution of a computable dynamical system. We show, however, that finding these states involves undecidable computations. We conjecture that for similar complexity measures that assign low complexity values, decidability imposes comparable limits on the stable growth of complexity, and that such behavior is necessary for nontrivial evolutionary systems. We show that the undecidability of adapted states imposes novel and unpredictable behavior on the individuals or populations being modeled. Such behavior is irreducible. Finally, we offer an example of a system, first proposed by Chaitin, that exhibits strong OEE.

  17. The ECLSS Advanced Automation Project Evolution and Technology Assessment

    NASA Technical Reports Server (NTRS)

    Dewberry, Brandon S.; Carnes, James R.; Lukefahr, Brenda D.; Rogers, John S.; Rochowiak, Daniel M.; Mckee, James W.; Benson, Brian L.

    1990-01-01

    Viewgraphs on Environmental Control and Life Support System (ECLSS) advanced automation project evolution and technology assessment are presented. Topics covered include: the ECLSS advanced automation project; automatic fault diagnosis of ECLSS subsystems descriptions; in-line, real-time chemical and microbial fluid analysis; and object-oriented, distributed chemical and microbial modeling of regenerative environmental control systems description.

  18. Embedded Empiricisms in Soft Soil Technology

    NASA Astrophysics Data System (ADS)

    Wijeyesekera, D. C.; John, L. M. S. Alvin; Adnan, Z.

    2016-07-01

    Civil engineers today are continuously challenged by innovative projects that push the boundaries of knowledge with conceptual and/or ingenious solutions, leading to the realization of what was once considered impossible in the realms of geotechnology. Some of these forward developments rely on empirical methods embedded within soft soil technology and across the broader realms of engineering. Empiricisms, unlike folklore, are not always shrouded in mysticism; scientific reasoning can often be found to justify their adoption in design and tangible construction projects. This lecture is therefore an outline exposition of how empiricism has been integrally embedded in the evolution of soft soil technology, from its wholly empirical beginnings in the Renaissance, through the development of soil mechanics in the 19th century, which in turn paved the way for the rise of computational soil mechanics. Developments in computational soil mechanics have always embraced, and are founded on, a wide backdrop of empirical geoenvironment simulations. It remains imperative that a competent geotechnical engineer combine postgraduate training with empiricism based on years of well-winnowed practical experience in order to fathom the diverseness and complexity of nature. However, regarding experience more highly than expertise can, perhaps inadvertently, inhibit development and innovation.

  19. Three-dimensional deformable-model-based localization and recognition of road vehicles.

    PubMed

    Zhang, Zhaoxiang; Tan, Tieniu; Huang, Kaiqi; Wang, Yunhong

    2012-01-01

    We address the problem of model-based object recognition. Our aim is to localize and recognize road vehicles from monocular images or videos in calibrated traffic scenes. A 3-D deformable vehicle model with 12 shape parameters is set up as prior information, and its pose is determined by three parameters, namely its position on the ground plane and its orientation about the vertical axis under ground-plane constraints. An efficient local gradient-based method is proposed to evaluate the fitness between the projection of the vehicle model and the image data, and it is combined into a novel evolutionary computing framework to estimate the 12 shape parameters and three pose parameters by iterative evolution. The recovery of the pose parameters achieves vehicle localization, whereas the shape parameters are used for vehicle recognition. Numerous experiments are conducted to demonstrate the performance of our approach. It is shown that the local gradient-based method can evaluate accurately and efficiently the fitness between the projection of the vehicle model and the image data. The evolutionary computing framework is effective for vehicles of different types and poses and is robust to various kinds of occlusion.

  20. Undergraduate computational physics projects on quantum computing

    NASA Astrophysics Data System (ADS)

    Candela, D.

    2015-08-01

    Computational projects on quantum computing suitable for students in a junior-level quantum mechanics course are described. In these projects students write their own programs to simulate quantum computers. Knowledge is assumed of introductory quantum mechanics through the properties of spin 1/2. Initial, more easily programmed projects treat the basics of quantum computation, quantum gates, and Grover's quantum search algorithm. These are followed by more advanced projects to increase the number of qubits and implement Shor's quantum factoring algorithm. The projects can be run on a typical laptop or desktop computer, using most programming languages. Supplementing resources available elsewhere, the projects are presented here in a self-contained format especially suitable for a short computational module for physics students.
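
    A minimal state-vector sketch of the kind of program such a project might involve is given below (illustrative only, not taken from the article): it prepares the uniform superposition over 3 qubits, applies Grover's oracle and diffusion operators as plain NumPy matrices, and prints the probability of the marked item. The qubit count and target index are assumed for the example.

```python
import numpy as np

# Minimal state-vector sketch of Grover's search on n qubits (illustrative only).
n = 3                          # number of qubits
N = 2 ** n                     # size of the search space
target = 5                     # index of the marked item (assumed for this example)

# Start in the uniform superposition |s> = H^{(x)n} |0...0>.
state = np.full(N, 1.0 / np.sqrt(N))

# Oracle: flip the sign of the marked amplitude.
oracle = np.eye(N)
oracle[target, target] = -1.0

# Diffusion operator: 2|s><s| - I (inversion about the mean).
s = np.full((N, 1), 1.0 / np.sqrt(N))
diffusion = 2.0 * (s @ s.T) - np.eye(N)

# Roughly (pi/4) * sqrt(N) iterations maximize the success probability.
iterations = int(round(np.pi / 4.0 * np.sqrt(N)))
for _ in range(iterations):
    state = diffusion @ (oracle @ state)

probabilities = np.abs(state) ** 2
print("P(target) after", iterations, "iterations:", probabilities[target])
```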

  1. Quantum gates by periodic driving.

    PubMed

    Shi, Z C; Wang, W; Yi, X X

    2016-02-25

    Topological quantum computation has been extensively studied in the past decades due to its robustness against decoherence. One way to realize topological quantum computation is by adiabatic evolution, which requires a relatively long time to complete a gate, so the speed of quantum computation slows down. In this work, we present a method to realize single-qubit quantum gates by periodic driving. Compared with adiabatic evolution, the single-qubit gates can be realized in a fixed time much shorter than that required by adiabatic evolution. The driving fields can be sinusoidal or square-well fields. With the sinusoidal driving field, we derive an expression for the total operation time in the high-frequency limit, and for the square-well driving an exact analytical expression for the evolution operator is given without any approximations. This study suggests that periodic driving could provide a new direction for regulating the operation time in topological quantum computation.
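
    As a rough numerical illustration of evolution under a periodic drive (not the authors' derivation; the two-level Hamiltonian, amplitudes, and frequency below are assumed), the evolution operator can be built as a time-ordered product of short-time propagators and checked for unitarity:

```python
import numpy as np
from scipy.linalg import expm

# Toy periodically driven two-level system (illustrative; the Hamiltonian below
# is a generic choice, not the one used in the paper).
sx = np.array([[0, 1], [1, 0]], dtype=complex)
sz = np.array([[1, 0], [0, -1]], dtype=complex)

omega, amplitude, delta = 10.0, 2.0, 1.0   # driving frequency, strength, detuning (assumed)

def hamiltonian(t):
    # H(t) = delta*sz/2 + amplitude*cos(omega*t)*sx/2   (hbar = 1)
    return 0.5 * delta * sz + 0.5 * amplitude * np.cos(omega * t) * sx

def evolution_operator(t_final, steps=4000):
    # Time-ordered product of short-time propagators: U = prod_k exp(-i H(t_k) dt).
    dt = t_final / steps
    U = np.eye(2, dtype=complex)
    for k in range(steps):
        U = expm(-1j * hamiltonian((k + 0.5) * dt) * dt) @ U
    return U

U = evolution_operator(t_final=2 * np.pi / omega)   # one driving period
print("Unitarity check:", np.allclose(U.conj().T @ U, np.eye(2)))
print("U after one period:\n", np.round(U, 4))
```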

  2. Models of Protocellular Structure, Function and Evolution

    NASA Technical Reports Server (NTRS)

    New, Michael H.; Pohorille, Andrew; Szostak, Jack W.; Keefe, Tony; Lanyi, Janos K.

    2001-01-01

    In the absence of any record of protocells, the most direct way to test our understanding of the origin of cellular life is to construct laboratory models that capture important features of protocellular systems. Such efforts are currently underway in a collaborative project between NASA-Ames, Harvard Medical School and University of California. They are accompanied by computational studies aimed at explaining self-organization of simple molecules into ordered structures. The centerpiece of this project is a method for the in vitro evolution of protein enzymes toward arbitrary catalytic targets. A similar approach has already been developed for nucleic acids in which a small number of functional molecules are selected from a large, random population of candidates. The selected molecules are next vastly multiplied using the polymerase chain reaction. A mutagenic approach, in which the sequences of selected molecules are randomly altered, can yield further improvements in performance or alterations of specificities. Unfortunately, the catalytic potential of nucleic acids is rather limited. Proteins are more catalytically capable but cannot be directly amplified. In the new technique, this problem is circumvented by covalently linking each protein of the initial, diverse, pool to the RNA sequence that codes for it. Then, selection is performed on the proteins, but the nucleic acids are replicated. Additional information is contained in the original extended abstract.

  3. The multiple roles of computational chemistry in fragment-based drug design

    NASA Astrophysics Data System (ADS)

    Law, Richard; Barker, Oliver; Barker, John J.; Hesterkamp, Thomas; Godemann, Robert; Andersen, Ole; Fryatt, Tara; Courtney, Steve; Hallett, Dave; Whittaker, Mark

    2009-08-01

    Fragment-based drug discovery (FBDD) represents a change in strategy from the screening of molecules with higher molecular weights and physical properties more akin to fully drug-like compounds, to the screening of smaller, less complex molecules. This is because it has been recognised that fragment hit molecules can be efficiently grown and optimised into leads, particularly after the binding mode to the target protein has been first determined by 3D structural elucidation, e.g. by NMR or X-ray crystallography. Several studies have shown that medicinal chemistry optimisation of an already drug-like hit or lead compound can result in a final compound with too high molecular weight and lipophilicity. The evolution of a lower molecular weight fragment hit therefore represents an attractive alternative approach to optimisation as it allows better control of compound properties. Computational chemistry can play an important role both prior to a fragment screen, in producing a target focussed fragment library, and post-screening in the evolution of a drug-like molecule from a fragment hit, both with and without the available fragment-target co-complex structure. We will review many of the current developments in the area and illustrate with some recent examples from successful FBDD discovery projects that we have conducted.

  4. Methods for Prediction of High-Speed Reacting Flows in Aerospace Propulsion

    NASA Technical Reports Server (NTRS)

    Drummond, J. Philip

    2014-01-01

    Research to develop high-speed airbreathing aerospace propulsion systems was underway in the late 1950s. A major part of the effort involved the supersonic combustion ramjet, or scramjet, engine. Work had also begun to develop computational techniques for solving the equations governing the flow through a scramjet engine. However, scramjet technology and the computational methods to assist in its evolution would remain apart for another decade. The principal barrier was that the computational methods needed for engine evolution lacked the computer technology required for solving the discrete equations resulting from the numerical methods. Even today, computer resources remain a major pacing item in overcoming this barrier. Significant advances have been made over the past 35 years, however, in modeling the supersonic chemically reacting flow in a scramjet combustor. To see how scramjet development and the required computational tools finally merged, we briefly trace the evolution of the technology in both areas.

  5. Revisiting the School of the Future: The Evolution of a School-Based Services Project.

    ERIC Educational Resources Information Center

    Iscoe, Louise K.; Keir, Scott S.

    The Hogg Foundation for Mental Health created the School of the Future (SoF) project to enable selected Texas schools to coordinate and implement school-based social and health services on their campuses and to demonstrate the effectiveness of this method of service delivery by evaluating the project. This report documents the evolution of the SoF…

  6. Evolution of Computational Toxicology-from Primitive ...

    EPA Pesticide Factsheets

    Presentation at the Health Canada seminar in Ottawa, ON, Canada, on Nov. 15, 2016, on the Evolution of Computational Toxicology from Primitive Beginnings to Sophisticated Application.

  7. Scientific Results of the Nasa-sponsored Study Project on Mars: Evolution of Its Climate and Atmosphere

    NASA Technical Reports Server (NTRS)

    Clifford, Stephen M.; Greeley, Ronald; Haberle, Robert M.

    1988-01-01

    The scientific highlights of the Mars: Evolution of its Climate and Atmosphere (MECA) study project are reviewed and some of the important issues in Martian climate research that remain unresolved are discussed.

  8. Evolution of the Virtualized HPC Infrastructure of Novosibirsk Scientific Center

    NASA Astrophysics Data System (ADS)

    Adakin, A.; Anisenkov, A.; Belov, S.; Chubarov, D.; Kalyuzhny, V.; Kaplin, V.; Korol, A.; Kuchin, N.; Lomakin, S.; Nikultsev, V.; Skovpen, K.; Sukharev, A.; Zaytsev, A.

    2012-12-01

    Novosibirsk Scientific Center (NSC), also known worldwide as Akademgorodok, is one of the largest Russian scientific centers hosting Novosibirsk State University (NSU) and more than 35 research organizations of the Siberian Branch of Russian Academy of Sciences including Budker Institute of Nuclear Physics (BINP), Institute of Computational Technologies, and Institute of Computational Mathematics and Mathematical Geophysics (ICM&MG). Since each institute has specific requirements on the architecture of the computing farms involved in its research field, several computing facilities are currently hosted by NSC institutes, each optimized for a particular set of tasks; the largest are the NSU Supercomputer Center, the Siberian Supercomputer Center (ICM&MG), and the Grid Computing Facility of BINP. A dedicated optical network with the initial bandwidth of 10 Gb/s connecting these three facilities was built in order to make it possible to share the computing resources among the research communities, thus increasing the efficiency of operating the existing computing facilities and offering a common platform for building the computing infrastructure for future scientific projects. Unification of the computing infrastructure is achieved by extensive use of virtualization technology based on XEN and KVM platforms. This contribution gives a thorough review of the present status and future development prospects for the NSC virtualized computing infrastructure and the experience gained while using it for running production data analysis jobs related to HEP experiments being carried out at BINP, especially the KEDR detector experiment at the VEPP-4M electron-positron collider.

  9. The systematic evolution of a NASA software technology, Appendix C

    NASA Technical Reports Server (NTRS)

    Deregt, M. P.; Dulfer, J. E.

    1972-01-01

    A long-range program is described whose ultimate purpose is to make possible the production of software in NASA within predictable schedule and budget constraints and with major characteristics such as size, run time, and correctness predictable within reasonable tolerances. As part of the program, a pilot NASA computer center will be chosen to apply software development and management techniques systematically and determine a set which is effective. The techniques will be developed by a Technology Group, which will guide the pilot project and be responsible for its success. The application of the technology will involve a sequence of NASA programming tasks graduated from simpler ones at first to complex systems in later phases of the project. The evaluation of the technology will be made by monitoring the operation of the software at the users' installations. In this way a coherent discipline for software design, production, maintenance, and management will be evolved.

  10. Distributed Computing Environment for Mine Warfare Command

    DTIC Science & Technology

    1993-06-01

    based system to a decentralized network of personal computers over the past several years. This thesis analyzes the progress of the evolution as of May 1992. The building blocks of a...

  11. Land use, water and Mediterranean landscapes: modelling long-term dynamics of complex socio-ecological systems.

    PubMed

    Barton, C Michael; Ullah, Isaac I; Bergin, Sean

    2010-11-28

    The evolution of Mediterranean landscapes during the Holocene has been increasingly governed by the complex interactions of water and human land use. Different land-use practices change the amount of water flowing across the surface and infiltrating the soil, and change water's ability to move surface sediments. Conversely, water amplifies the impacts of human land use and extends the ecological footprint of human activities far beyond the borders of towns and fields. Advances in computational modelling offer new tools to study the complex feedbacks between land use, land cover, topography and surface water. The Mediterranean Landscape Dynamics project (MedLand) is building a modelling laboratory where experiments can be carried out on the long-term impacts of agropastoral land use, and whose results can be tested against the archaeological record. These computational experiments are providing new insights into the socio-ecological consequences of human decisions at varying temporal and spatial scales.

  12. Superior colliculus neurons encode a visual saliency map during free viewing of natural dynamic video

    NASA Astrophysics Data System (ADS)

    White, Brian J.; Berg, David J.; Kan, Janis Y.; Marino, Robert A.; Itti, Laurent; Munoz, Douglas P.

    2017-01-01

    Models of visual attention postulate the existence of a saliency map whose function is to guide attention and gaze to the most conspicuous regions in a visual scene. Although cortical representations of saliency have been reported, there is mounting evidence for a subcortical saliency mechanism, which pre-dates the evolution of neocortex. Here, we conduct a strong test of the saliency hypothesis by comparing the output of a well-established computational saliency model with the activation of neurons in the primate superior colliculus (SC), a midbrain structure associated with attention and gaze, while monkeys watched video of natural scenes. We find that the activity of SC superficial visual-layer neurons (SCs), specifically, is well-predicted by the model. This saliency representation is unlikely to be inherited from fronto-parietal cortices, which do not project to SCs, but may be computed in SCs and relayed to other areas via tectothalamic pathways.

  13. Experience Paper: Software Engineering and Community Codes Track in ATPESC

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dubey, Anshu; Riley, Katherine M.

    Argonne Training Program in Extreme Scale Computing (ATPESC) was started by Argonne National Laboratory with the objective of expanding the ranks of better prepared users of high performance computing (HPC) machines. One of the unique aspects of the program was the inclusion of a software engineering and community codes track. The inclusion was motivated by the observation that projects with a good scientific and software process were better able to meet their scientific goals. In this paper we present our experience of running the software track from the beginning of the program until now. We discuss the motivations, the reception, and the evolution of the track over the years. We welcome discussion and input from the community to enhance the track in ATPESC, and also to facilitate the inclusion of similar tracks in other HPC-oriented training programs.

  14. FY09 Final Report for LDRD Project: Understanding Viral Quasispecies Evolution through Computation and Experiment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhou, C

    2009-11-12

    In FY09 they will (1) complete the implementation, verification, calibration, and sensitivity and scalability analysis of the in-cell virus replication model; (2) complete the design of the cell culture (cell-to-cell infection) model; (3) continue the research, design, and development of their bioinformatics tools: the Web-based structure-alignment-based sequence variability tool and the functional annotation of the genome database; (4) collaborate with the University of California at San Francisco on areas of common interest; and (5) submit journal articles that describe the in-cell model with simulations and the bioinformatics approaches to evaluation of genome variability and fitness.

  15. A Monte Carlo model for 3D grain evolution during welding

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rodgers, Theron M.; Mitchell, John A.; Tikare, Veena

    Welding is one of the most wide-spread processes used in metal joining. However, there are currently no open-source software implementations for the simulation of microstructural evolution during a weld pass. Here we describe a Potts Monte Carlo based model implemented in the SPPARKS kinetic Monte Carlo computational framework. The model simulates melting, solidification and solid-state microstructural evolution of material in the fusion and heat-affected zones of a weld. The model does not simulate thermal behavior, but rather utilizes user input parameters to specify weld pool and heat-affected zone properties. Weld pool shapes are specified by Bezier curves, which allow for the specification of a wide range of pool shapes. Pool shapes can range from narrow and deep to wide and shallow, representing different fluid flow conditions within the pool. Surrounding temperature gradients are calculated with the aid of a closest point projection algorithm. Furthermore, the model also allows simulation of pulsed-power welding through time-dependent variation of the weld pool size. Example simulation results and comparisons with laboratory weld observations demonstrate microstructural variation with weld speed, pool shape, and pulsed power.

  16. A Monte Carlo model for 3D grain evolution during welding

    DOE PAGES

    Rodgers, Theron M.; Mitchell, John A.; Tikare, Veena

    2017-08-04

    Welding is one of the most wide-spread processes used in metal joining. However, there are currently no open-source software implementations for the simulation of microstructural evolution during a weld pass. Here we describe a Potts Monte Carlo based model implemented in the SPPARKS kinetic Monte Carlo computational framework. The model simulates melting, solidification and solid-state microstructural evolution of material in the fusion and heat-affected zones of a weld. The model does not simulate thermal behavior, but rather utilizes user input parameters to specify weld pool and heat-affected zone properties. Weld pool shapes are specified by Bezier curves, which allow for the specification of a wide range of pool shapes. Pool shapes can range from narrow and deep to wide and shallow, representing different fluid flow conditions within the pool. Surrounding temperature gradients are calculated with the aid of a closest point projection algorithm. Furthermore, the model also allows simulation of pulsed-power welding through time-dependent variation of the weld pool size. Example simulation results and comparisons with laboratory weld observations demonstrate microstructural variation with weld speed, pool shape, and pulsed power.
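
    For orientation, a toy 2-D Potts grain-growth sweep is sketched below. It is only a minimal illustration of the kinetic Monte Carlo ingredient of the weld model described above and omits everything weld-specific in the SPPARKS implementation (melting, Bezier-curve pool shapes, temperature gradients, pulsed power); the lattice size, spin count, and temperature are assumed values.

```python
import numpy as np

# Toy 2-D Potts model of grain growth (illustrative only; the actual weld model
# runs in SPPARKS and adds melting, pool geometry, and temperature gradients).
rng = np.random.default_rng(0)
L, n_spins, kT = 64, 32, 0.5
lattice = rng.integers(n_spins, size=(L, L))    # each site holds a grain ID

def site_energy(lat, i, j, spin):
    # Energy = number of unlike nearest neighbours (periodic boundaries).
    neighbours = [lat[(i + 1) % L, j], lat[(i - 1) % L, j],
                  lat[i, (j + 1) % L], lat[i, (j - 1) % L]]
    return sum(spin != nb for nb in neighbours)

def monte_carlo_sweep(lat):
    for _ in range(L * L):
        i, j = rng.integers(L), rng.integers(L)
        old_spin = lat[i, j]
        new_spin = rng.integers(n_spins)
        dE = site_energy(lat, i, j, new_spin) - site_energy(lat, i, j, old_spin)
        # Metropolis acceptance: downhill moves always, uphill with Boltzmann weight.
        if dE <= 0 or rng.random() < np.exp(-dE / kT):
            lat[i, j] = new_spin

for sweep in range(50):
    monte_carlo_sweep(lattice)
print("grains remaining:", len(np.unique(lattice)))
```

    Repeated sweeps coarsen the grain structure (the number of distinct grain IDs drops), which is the basic mechanism the weld model exercises inside the heat-affected zone.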

  17. THE EVOLUTION OF SCHOOL MATHEMATICS.

    ERIC Educational Resources Information Center

    DAVIS, ROBERT B.

    ACTION AND PLANS OF THE MADISON PROJECT TO GIVE RATIONAL GUIDANCE TO EVOLUTIONARY CHANGES IN SCHOOL MATHEMATICS ARE DESCRIBED. THE PROJECT ATTEMPTS TO CONTRIBUTE TO RATIONAL GUIDANCE OF EDUCATIONAL EVOLUTION BY MAKING "THRUSTS AND PROBES" INTO THE UNKNOWN POTENTIAL OF MATHEMATICS LEARNING. EXAMPLES ARE--INTRODUCING THE ARITHMETIC OF…

  18. Evolution of computational chemistry, the "launch pad" to scientific computational models: The early days from a personal account, the present status from the TACC-2012 congress, and eventual future applications from the global simulation approach

    NASA Astrophysics Data System (ADS)

    Clementi, Enrico

    2012-06-01

    This is the introductory chapter to the AIP Proceedings volume "Theory and Applications of Computational Chemistry: The First Decade of the Second Millennium", in which we discuss the evolution of "computational chemistry". Very early variational computational chemistry developments are reported in Sections 1 to 7, 11, and 12 by recalling some of the computational chemistry contributions of the author and his collaborators (from the late 1950s to the mid 1990s); perturbation techniques are not considered in this already extended work. Present-day computational chemistry is partly considered in Sections 8 to 10, where more recent studies by the author and his collaborators are discussed, including the Hartree-Fock-Heitler-London method; a more general discussion of present-day computational chemistry is presented in Section 14. The following chapters of this AIP volume provide a view of modern computational chemistry. Future computational chemistry developments can be extrapolated from the chapters of this AIP volume; further, Sections 13 and 15 present an overall analysis of computational chemistry, obtained from the Global Simulation approach, by considering the evolution of scientific knowledge confronted with the opportunities offered by modern computers.

  19. Kinetic Modeling of Radiative Turbulence in Relativistic Astrophysical Plasmas: Particle Acceleration and High-Energy Flares

    NASA Astrophysics Data System (ADS)

    Wise, John

    In the near future, next-generation telescopes, covering most of the electromagnetic spectrum, will provide a view into the very earliest stages of galaxy formation. To accurately interpret these future observations, accurate and high-resolution simulations of the first stars and galaxies are vital. This proposal is centered on the formation of the first galaxies in the Universe and their observational signatures in preparation for these future observatories. This proposal has two overall goals: 1. To simulate the formation and evolution of a statistically significant sample of galaxies during the first billion years of the Universe, including all relevant astrophysics while resolving individual molecular clouds, in various cosmological environments. These simulations will utilize a sophisticated physical model of star and black hole formation and feedback, including radiation transport and magnetic fields, which will lead to the most realistic and resolved predictions for the early universe; 2. To predict the observational features of the first galaxies throughout the electromagnetic spectrum, allowing for optimal extraction of galaxy and dark matter halo properties from their photometry, imaging, and spectra; The proposed research plan addresses a timely and relevant issue to theoretically prepare for the interpretation of future observations of the first galaxies in the Universe. A suite of adaptive mesh refinement simulations will be used to follow the formation and evolution of thousands of galaxies observable with the James Webb Space Telescope (JWST) that will be launched during the second year of this project. The simulations will have also tracked the formation and death of over 100,000 massive metal-free stars. Currently, there is a gap of two orders of magnitude in stellar mass between the smallest observed z > 6 galaxy and the largest simulated galaxy from "first principles", capturing its entire star formation history. This project will eliminate this gap between simulations and observations of the first galaxies, providing predictions for next-generation observations coming online throughout the next decade. The proposed activities present the graduate students involved in the project with opportunities to gain expertise in numerical algorithms, high performance computing, and software engineering. With this experience, the students will be in a powerful position to face the challenging job market. The computational tools produced by this project will be made freely available and incorporated into their respective frameworks to preserve their sustainability.

  20. Efficient solution of parabolic equations by Krylov approximation methods

    NASA Technical Reports Server (NTRS)

    Gallopoulos, E.; Saad, Y.

    1990-01-01

    Numerical techniques for solving parabolic equations by the method of lines are addressed. The main motivation for the proposed approach is the possibility of exploiting a high degree of parallelism in a simple manner. The basic idea of the method is to approximate the action of the evolution operator on a given state vector by means of a projection process onto a Krylov subspace. Thus, the resulting approximation consists of applying an evolution operator of very small dimension to a known vector, which is, in turn, computed accurately by exploiting well-known rational approximations to the exponential. Because the rational approximation is only applied to a small matrix, the only operations required with the original large matrix are matrix-by-vector multiplications, and as a result the algorithm can easily be parallelized and vectorized. Some relevant approximation and stability issues are discussed. We present some numerical experiments with the method and compare its performance with a few explicit and implicit algorithms.
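
    The core idea, approximating the action of exp(tA) on a vector through a small Krylov subspace so that the large matrix is touched only via matrix-vector products, can be sketched as follows. This is a minimal Arnoldi-based version with assumed parameters and a dense small-matrix exponential standing in for the paper's rational approximations; the parallel implementation is not reproduced.

```python
import numpy as np
from scipy.linalg import expm

def krylov_expm_action(A, v, t, m=30):
    """Approximate exp(t*A) @ v using an m-dimensional Krylov subspace (Arnoldi)."""
    n = len(v)
    V = np.zeros((n, m + 1))
    H = np.zeros((m + 1, m))
    beta = np.linalg.norm(v)
    V[:, 0] = v / beta
    for j in range(m):
        w = A @ V[:, j]
        for i in range(j + 1):                 # modified Gram-Schmidt
            H[i, j] = V[:, i] @ w
            w -= H[i, j] * V[:, i]
        H[j + 1, j] = np.linalg.norm(w)
        if H[j + 1, j] < 1e-12:                # happy breakdown
            m = j + 1
            break
        V[:, j + 1] = w / H[j + 1, j]
    # Only a small (m x m) exponential is formed; the large matrix A is touched
    # exclusively through matrix-vector products.
    e1 = np.zeros(m); e1[0] = 1.0
    return beta * V[:, :m] @ (expm(t * H[:m, :m]) @ e1)

# Example: 1-D heat equation u_t = u_xx with A the finite-difference Laplacian.
n = 200
A = (np.diag(-2 * np.ones(n)) + np.diag(np.ones(n - 1), 1)
     + np.diag(np.ones(n - 1), -1)) * (n + 1) ** 2
x = np.linspace(0, 1, n + 2)[1:-1]
u0 = np.exp(-100 * (x - 0.5) ** 2)             # localized initial temperature bump
u = krylov_expm_action(A, u0, t=1e-3)
print("max error vs dense expm:", np.abs(u - expm(1e-3 * A) @ u0).max())
```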

  1. Wetting and free surface flow modeling for potting and encapsulation.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brooks, Carlton, F.; Brooks, Michael J.; Graham, Alan Lyman

    As part of an effort to reduce costs and improve quality control in encapsulation and potting processes, the Technology Initiative Project "Defect Free Manufacturing and Assembly" has completed a computational modeling study of flows representative of those seen in these processes. Flow solutions are obtained using a coupled, finite-element-based numerical method based on the GOMA/ARIA suite of Sandia flow solvers. The evolution of the free surface is solved with an advanced level set algorithm. This approach incorporates novel methods for representing surface tension and wetting forces that affect the evolution of the free surface. In addition, two commercially available codes, ProCAST and MOLDFLOW, are also used on geometries representing encapsulation processes at the Kansas City Plant. Visual observations of the flow in several geometries are recorded in the laboratory and compared to the models. Wetting properties for the materials in these experiments are measured using a unique flow-through goniometer.

  2. Quantum adiabatic computation with a constant gap is not useful in one dimension.

    PubMed

    Hastings, M B

    2009-07-31

    We show that it is possible to use a classical computer to efficiently simulate the adiabatic evolution of a quantum system in one dimension with a constant spectral gap, starting the adiabatic evolution from a known initial product state. The proof relies on a recently proven area law for such systems, implying the existence of a good matrix product representation of the ground state, combined with an appropriate algorithm to update the matrix product state as the Hamiltonian is changed. This implies that adiabatic evolution with such Hamiltonians is not useful for universal quantum computation. Therefore, adiabatic algorithms which are useful for universal quantum computation either require a spectral gap tending to zero or need to be implemented in more than one dimension (we leave open the question of the computational power of adiabatic simulation with a constant gap in more than one dimension).
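
    To fix ideas about what "simulating an adiabatic evolution from a known initial product state" means, here is a brute-force, dense-vector toy for a few spins, assuming a transverse-field Ising interpolation and sweep time chosen for illustration. Note that the paper's efficiency claim rests on a matrix product state representation of the evolving state; this sketch deliberately does not use one and scales exponentially with the number of spins.

```python
import numpy as np
from scipy.linalg import expm
from functools import reduce

# Dense-vector toy of adiabatic evolution on a few spins (illustrative only).
sx = np.array([[0, 1], [1, 0]], dtype=complex)
sz = np.array([[1, 0], [0, -1]], dtype=complex)
I2 = np.eye(2, dtype=complex)

def op_on(site_ops, n):
    """Tensor a dict {site: 2x2 op} up to an n-qubit operator."""
    return reduce(np.kron, [site_ops.get(k, I2) for k in range(n)])

n = 6
H0 = -sum(op_on({k: sx}, n) for k in range(n))                   # transverse field
H1 = -sum(op_on({k: sz, k + 1: sz}, n) for k in range(n - 1))    # Ising coupling

def interpolated_H(s):
    return (1 - s) * H0 + s * H1

# Start in the ground state of H0 (all spins along +x) and sweep s from 0 to 1.
psi = np.full(2 ** n, 1 / np.sqrt(2 ** n), dtype=complex)
T, steps = 50.0, 2000            # total sweep time and time steps (assumed values)
dt = T / steps
for step in range(steps):
    s = (step + 0.5) / steps
    psi = expm(-1j * interpolated_H(s) * dt) @ psi

# Projection onto the (two-fold degenerate) ground space of H1 measures adiabaticity.
evals, evecs = np.linalg.eigh(interpolated_H(1.0))
ground_space = evecs[:, :2]
print("ground-space fidelity:", np.linalg.norm(ground_space.conj().T @ psi) ** 2)
```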

  3. The View from a Few Hundred Feet : A New Transparent and Integrated Workflow for UAV-collected Data

    NASA Astrophysics Data System (ADS)

    Peterson, F. S.; Barbieri, L.; Wyngaard, J.

    2015-12-01

    Unmanned Aerial Vehicles (UAVs) allow scientists and civilians to monitor earth and atmospheric conditions in remote locations. To keep up with the rapid evolution of UAV technology, data workflows must also be flexible, integrated, and introspective. Here, we present our data workflow for a project to assess the feasibility of detecting threshold levels of methane, carbon-dioxide, and other aerosols by mounting consumer-grade gas analysis sensors on UAV's. Particularly, we highlight our use of Project Jupyter, a set of open-source software tools and documentation designed for developing "collaborative narratives" around scientific workflows. By embracing the GitHub-backed, multi-language systems available in Project Jupyter, we enable interaction and exploratory computation while simultaneously embracing distributed version control. Additionally, the transparency of this method builds trust with civilians and decision-makers and leverages collaboration and communication to resolve problems. The goal of this presentation is to provide a generic data workflow for scientific inquiries involving UAVs and to invite the participation of the AGU community in its improvement and curation.

  4. Computer simulation of solder joint failure

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Burchett, S.N.; Frear, D.R.; Rashid, M.M.

    The thermomechanical fatigue failure of solder joints is increasingly becoming an important reliability issue for electronic packages. The purpose of this Laboratory Directed Research and Development (LDRD) project was to develop computational tools for simulating the behavior of solder joints under strain and temperature cycling, taking into account the microstructural heterogeneities that exist in as-solidified near eutectic Sn-Pb joints, as well as subsequent microstructural evolution. The authors present two computational constitutive models, a two-phase model and a single-phase model, that were developed to predict the behavior of near eutectic Sn-Pb solder joints under fatigue conditions. Unique metallurgical tests provide the fundamental input for the constitutive relations. The two-phase model mathematically predicts the heterogeneous coarsening behavior of near eutectic Sn-Pb solder. The finite element simulations with this model agree qualitatively with experimental thermomechanical fatigue tests. The simulations show that the presence of an initial heterogeneity in the solder microstructure could significantly degrade the fatigue lifetime. The single-phase model was developed to predict solder joint behavior using materials data for constitutive relation constants that could be determined through straightforward metallurgical experiments. Special thermomechanical fatigue tests were developed to give fundamental materials input to the models, and an in situ SEM thermomechanical fatigue test system was developed to characterize microstructural evolution and the mechanical behavior of solder joints during the test. A shear/torsion test sample was developed to impose strain in two different orientations. Materials constants were derived from these tests. The simulation results from the two-phase model showed good fit to the experimental test results.

  5. Scientific results of the NASA-sponsored study project on Mars: Evolution of volcanism, tectonics, and volatiles

    NASA Technical Reports Server (NTRS)

    Solomon, Sean C. (Editor); Sharpton, Virgil L. (Editor); Zimbelman, James R. (Editor)

    1990-01-01

    The objectives of the Mars: Evolution of Volcanism, Tectonics, and Volatiles (MEVTV) project are to outline the volcanic and tectonic history of Mars; to determine the influence of volatiles on Martian volcanic and tectonic processes; and to attempt to determine the compositional, thermal, and volatile history of Mars from its volcanic and tectonic evolution. Available data sets were used to test general models of the volcanic and tectonic history of Mars.

  6. LEAMram (Trademark): Land Use Evolution and Impact Assessment Model Residential Attractiveness Model

    DTIC Science & Technology

    2006-09-01

    MEPLAN are popular in both the United States and overseas, and focus on identifying growth by income and housing costs. These and other models focus...CUF-2), SLEUTH, Land use Evolution and Impact Assessment Model (LEAM™), Smart Places, and What If?: CUF-2 uses a set of econometric models to project...

  7. COSMIC REIONIZATION ON COMPUTERS. ULTRAVIOLET CONTINUUM SLOPES AND DUST OPACITIES IN HIGH REDSHIFT GALAXIES

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Khakhaleva-Li, Zimu; Gnedin, Nickolay Y., E-mail: zimu@uchicago.edu, E-mail: gnedin@fnal.gov

    We compare the properties of the stellar populations of model galaxies from the Cosmic Reionization On Computers (CROC) project with the existing ultraviolet (UV) and IR data. Since CROC simulations do not follow cosmic dust directly, we adopt two variants of the dust-follows-metals ansatz to populate model galaxies with dust. Using the dust radiative transfer code Hyperion, we compute synthetic stellar spectra, UV continuum slopes, and IR fluxes for simulated galaxies. We find that the simulation results generally match observational measurements, but, perhaps, not in full detail. The differences seem to indicate that our adopted dust-follows-metals ansatzes are not fully sufficient. While the discrepancies with the existing data are marginal, the future James Webb Space Telescope (JWST) data will be of much higher precision, rendering highly significant any tentative difference between theory and observations. It is therefore likely that, in order to fully utilize the precision of JWST observations, fully dynamical modeling of dust formation, evolution, and destruction may be required.

  8. Reflexions sur L'Evolution de L'Enseignement de la Biologie en Afrique.

    ERIC Educational Resources Information Center

    Sasson, A.

    1980-01-01

    Reviews the main stages of the evolution of biology teaching in Africa. Discusses the UNESCO pilot project, the teaching aids produced for English- and French-speaking countries of Africa, and the "Teacher's Guide on the Biology of Human Populations" and projects needs for further assistance. (CS)

  9. Man as a Species.

    ERIC Educational Resources Information Center

    Solem, Alan; And Others

    Written in 1964, the document represents experimental material of the Anthropology Curriculum Study Project. The objectives of the project were to discuss the evolution of man as distinguished from the evolution of other species and as related to culture, and to emphasize human diversity. Three brief essays are presented. The first, "The…

  10. Differential-Evolution Control Parameter Optimization for Unmanned Aerial Vehicle Path Planning

    PubMed Central

    Kok, Kai Yit; Rajendran, Parvathy

    2016-01-01

    The differential evolution algorithm has been widely applied to unmanned aerial vehicle (UAV) path planning. At present, four tuning parameters must be chosen for the differential evolution algorithm, namely, population size, differential weight, crossover rate, and generation number. These tuning parameters are required, together with a user-defined weighting between path cost and computational cost. However, the optimum settings of these tuning parameters vary with the application. Instead of trial and error, this paper presents a method for optimizing the tuning parameters of the differential evolution algorithm for UAV path planning. The parameters this research focuses on are population size, differential weight, crossover rate, and generation number. The developed algorithm enables the user to simply define the desired weighting between path cost and computational cost and to converge within the minimum number of generations required for that setting. In conclusion, the proposed optimization of the tuning parameters of the differential evolution algorithm for UAV path planning expedites convergence and improves the final output path and computational cost. PMID:26943630
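
    For reference, a generic DE/rand/1/bin loop is sketched below so that the four tuning parameters named above (population size, differential weight, crossover rate, generation number) are explicit. The stand-in waypoint cost function is assumed for illustration and is not the paper's path-planning objective.

```python
import numpy as np

def differential_evolution(cost, bounds, pop_size=30, weight=0.8,
                           crossover=0.9, generations=200, seed=0):
    """Generic DE/rand/1/bin optimizer (illustrative; not the paper's tuned variant)."""
    rng = np.random.default_rng(seed)
    dim = len(bounds)
    lo, hi = np.array(bounds, dtype=float).T
    pop = rng.uniform(lo, hi, size=(pop_size, dim))
    costs = np.array([cost(x) for x in pop])
    for _ in range(generations):
        for i in range(pop_size):
            # Mutation: base vector plus weighted difference of two others.
            a, b, c = pop[rng.choice([k for k in range(pop_size) if k != i],
                                     size=3, replace=False)]
            mutant = np.clip(a + weight * (b - c), lo, hi)
            # Binomial crossover with at least one gene taken from the mutant.
            cross = rng.random(dim) < crossover
            cross[rng.integers(dim)] = True
            trial = np.where(cross, mutant, pop[i])
            trial_cost = cost(trial)
            if trial_cost <= costs[i]:           # greedy selection
                pop[i], costs[i] = trial, trial_cost
    best = np.argmin(costs)
    return pop[best], costs[best]

# Stand-in cost: a 2-D waypoint pulled toward a goal while penalized near an obstacle.
goal, obstacle = np.array([8.0, 8.0]), np.array([4.0, 4.0])
cost = lambda p: np.linalg.norm(p - goal) + 5.0 * np.exp(-np.linalg.norm(p - obstacle))
print(differential_evolution(cost, bounds=[(0, 10), (0, 10)]))
```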

  11. Evolving binary classifiers through parallel computation of multiple fitness cases.

    PubMed

    Cagnoni, Stefano; Bergenti, Federico; Mordonini, Monica; Adorni, Giovanni

    2005-06-01

    This paper describes two versions of a novel approach to developing binary classifiers, based on two evolutionary computation paradigms: cellular programming and genetic programming. Such an approach achieves high computation efficiency both during evolution and at runtime. Evolution speed is optimized by allowing multiple solutions to be computed in parallel. Runtime performance is optimized explicitly using parallel computation in the case of cellular programming or implicitly taking advantage of the intrinsic parallelism of bitwise operators on standard sequential architectures in the case of genetic programming. The approach was tested on a digit recognition problem and compared with a reference classifier.
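
    The "intrinsic parallelism of bitwise operators" can be made concrete with a small sketch (an assumed toy Boolean concept, not the paper's digit-recognition task): by packing one fitness case per bit, a handful of integer AND/XOR operations evaluates a candidate classifier on 64 cases at once.

```python
import random

# Toy illustration of bitwise fitness-case parallelism: each bit position is one
# fitness case, so a single AND/OR/XOR evaluates a candidate on 64 cases at once.
random.seed(1)
WORD = 64
MASK = (1 << WORD) - 1

# Target concept on three binary inputs: y = (a AND b) XOR c.
a = random.getrandbits(WORD)
b = random.getrandbits(WORD)
c = random.getrandbits(WORD)
target = ((a & b) ^ c) & MASK

def fitness(candidate):
    """Number of fitness cases (bit positions) the candidate gets right."""
    predictions = candidate(a, b, c) & MASK
    return WORD - bin(predictions ^ target).count("1")

# Two hand-written candidates standing in for evolved expressions.
print("exact expression:", fitness(lambda a, b, c: (a & b) ^ c))   # 64/64
print("wrong expression:", fitness(lambda a, b, c: a | c))         # typically fewer
```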

  12. Computing Nash equilibria through computational intelligence methods

    NASA Astrophysics Data System (ADS)

    Pavlidis, N. G.; Parsopoulos, K. E.; Vrahatis, M. N.

    2005-03-01

    Nash equilibrium constitutes a central solution concept in game theory. The task of detecting the Nash equilibria of a finite strategic game remains a challenging problem to date. This paper investigates the effectiveness of three computational intelligence techniques, namely covariance matrix adaptation evolution strategies, particle swarm optimization, and differential evolution, to compute Nash equilibria of finite strategic games as global minima of a real-valued, non-negative function. An issue of particular interest is detecting more than one Nash equilibrium of a game. The performance of the considered computational intelligence methods on this problem is investigated using multistart and deflection.
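
    One common way to pose the problem as minimization of a real-valued, non-negative function, in the spirit described above though not necessarily the exact formulation used by the authors, is to sum the squared positive gains each player could obtain by deviating to a pure strategy. The sketch below does this for Matching Pennies (an assumed example game) and minimizes it with an off-the-shelf differential evolution routine.

```python
import numpy as np
from scipy.optimize import differential_evolution

# Matching Pennies payoff matrices (row player A, column player B); its unique
# Nash equilibrium is the mixed strategy (1/2, 1/2) for both players.
A = np.array([[1.0, -1.0], [-1.0, 1.0]])
B = -A

def nash_error(z):
    """Non-negative function that vanishes exactly at a Nash equilibrium."""
    x = z[:2] / z[:2].sum()          # row player's mixed strategy (normalized)
    y = z[2:] / z[2:].sum()          # column player's mixed strategy
    ux, uy = x @ A @ y, x @ B @ y    # expected payoffs at (x, y)
    # Positive part of each pure-strategy deviation gain, squared and summed.
    gain_x = np.maximum(A @ y - ux, 0.0)
    gain_y = np.maximum(B.T @ x - uy, 0.0)
    return np.sum(gain_x ** 2) + np.sum(gain_y ** 2)

result = differential_evolution(nash_error, bounds=[(1e-6, 1.0)] * 4, seed=0, tol=1e-12)
x = result.x[:2] / result.x[:2].sum()
y = result.x[2:] / result.x[2:].sum()
print("row strategy:", np.round(x, 3), "column strategy:", np.round(y, 3),
      "residual:", result.fun)
```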

  13. Aerodynamic shape optimization directed toward a supersonic transport using sensitivity analysis

    NASA Technical Reports Server (NTRS)

    Baysal, Oktay

    1995-01-01

    This investigation was conducted from March 1994 to August 1995, primarily to extend and implement the previously developed aerodynamic design optimization methodologies for problems related to a supersonic transport design. These methods had demonstrated promise to improve the designs (more specifically, the shape) of aerodynamic surfaces by coupling optimization algorithms (OA) with Computational Fluid Dynamics (CFD) algorithms via sensitivity analyses (SA), together with surface definition methods from Computer Aided Design (CAD). The present extensions of this method and their supersonic implementations have produced wing section designs, delta wing designs, cranked-delta wing designs, and nacelle designs, all of which have been reported in the open literature. Although these configurations were too highly simplified to be of practical or commercial use, they served the algorithmic and proof-of-concept objectives of the study very well. The primary cause of the configurational simplifications, beyond the usual simplify-to-study-the-fundamentals reason, was the premature closing of the project: after only the first year of the originally intended three-year term, both the funds and the computer resources supporting the project were abruptly cut due to severe shortages at the funding agency. Nonetheless, it was shown that the extended methodologies could be viable options in optimizing the design of not only an isolated single-component configuration, but also a multiple-component configuration in supersonic and viscous flow. This allowed the mutual interference of the components to be treated as one of the constraints throughout the evolution of the shapes.

  14. First-principles data-driven discovery of transition metal oxides for artificial photosynthesis

    NASA Astrophysics Data System (ADS)

    Yan, Qimin

    We develop a first-principles data-driven approach for rapid identification of transition metal oxide (TMO) light absorbers and photocatalysts for artificial photosynthesis using the Materials Project. Initially focusing on Cr, V, and Mn-based ternary TMOs in the database, we design a broadly-applicable multiple-layer screening workflow automating density functional theory (DFT) and hybrid functional calculations of bulk and surface electronic and magnetic structures. We further assess the electrochemical stability of TMOs in aqueous environments from computed Pourbaix diagrams. Several promising earth-abundant low band-gap TMO compounds with desirable band edge energies and electrochemical stability are identified by our computational efforts and then synergistically evaluated using high-throughput synthesis and photoelectrochemical screening techniques by our experimental collaborators at Caltech. Our joint theory-experiment effort has successfully identified new earth-abundant copper and manganese vanadate complex oxides that meet highly demanding requirements for photoanodes, substantially expanding the known space of such materials. By integrating theory and experiment, we validate our approach and develop important new insights into structure-property relationships for TMOs for oxygen evolution photocatalysts, paving the way for use of first-principles data-driven techniques in future applications. This work is supported by the Materials Project Predictive Modeling Center and the Joint Center for Artificial Photosynthesis through the U.S. Department of Energy, Office of Basic Energy Sciences, Materials Sciences and Engineering Division, under Contract No. DE-AC02-05CH11231. Computational resources also provided by the Department of Energy through the National Energy Supercomputing Center.

  15. A rapid parallelization of cone-beam projection and back-projection operator based on texture fetching interpolation

    NASA Astrophysics Data System (ADS)

    Xie, Lizhe; Hu, Yining; Chen, Yang; Shi, Luyao

    2015-03-01

    Projection and back-projection are the most computationally intensive parts of Computed Tomography (CT) reconstruction. Parallelization strategies using GPU computing techniques have been introduced. In this paper we present a new parallelization scheme for both projection and back-projection. The proposed method is based on the CUDA technology provided by NVIDIA Corporation. Instead of building a complex model, we aimed at optimizing the existing algorithm and making it suitable for CUDA implementation so as to gain fast computation speed. Besides making use of the texture fetching operation, which provides faster interpolation, we fixed the number of samples in the computation of each projection ray to ensure the synchronization of blocks and threads, thus preventing the latency caused by inconsistent computational complexity. Experimental results demonstrate the computational efficiency and imaging quality of the proposed method.
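
    A CPU-side sketch of the ray-driven projection idea is shown below (illustrative only; simplified parallel-beam geometry, with NumPy bilinear interpolation standing in for the GPU texture fetch, and it is not the authors' CUDA kernel). The fixed number of samples per ray mirrors the synchronization argument made above: every ray performs the same amount of work.

```python
import numpy as np

def forward_project(image, angles, n_detectors=128, n_samples=256):
    """Ray-driven parallel-beam projection with a fixed sample count per ray."""
    size = image.shape[0]
    center = (size - 1) / 2.0
    det = np.linspace(-center, center, n_detectors)    # detector coordinates
    steps = np.linspace(-center, center, n_samples)    # equally spaced ray samples
    sinogram = np.zeros((len(angles), n_detectors))
    for k, theta in enumerate(angles):
        cos_t, sin_t = np.cos(theta), np.sin(theta)
        for d, u in enumerate(det):
            # Sample points: offset u along the detector, steps along the ray direction.
            xs = center + u * cos_t - steps * sin_t
            ys = center + u * sin_t + steps * cos_t
            # Bilinear interpolation (stand-in for the GPU texture fetch).
            x0, y0 = np.floor(xs).astype(int), np.floor(ys).astype(int)
            fx, fy = xs - x0, ys - y0
            valid = (x0 >= 0) & (x0 < size - 1) & (y0 >= 0) & (y0 < size - 1)
            x0, y0, fx, fy = x0[valid], y0[valid], fx[valid], fy[valid]
            values = ((1 - fx) * (1 - fy) * image[y0, x0]
                      + fx * (1 - fy) * image[y0, x0 + 1]
                      + (1 - fx) * fy * image[y0 + 1, x0]
                      + fx * fy * image[y0 + 1, x0 + 1])
            sinogram[k, d] = values.sum() * (steps[1] - steps[0])
    return sinogram

# Example: project a small disc phantom over 180 degrees.
size = 128
yy, xx = np.mgrid[:size, :size]
phantom = ((xx - size / 2) ** 2 + (yy - size / 2) ** 2 < (size / 4) ** 2).astype(float)
sino = forward_project(phantom, np.linspace(0, np.pi, 90, endpoint=False))
print("sinogram shape:", sino.shape, "max value:", sino.max())
```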

  16. Discovery and dynamical characterization of the Amor-class asteroid 2012 XH16

    NASA Astrophysics Data System (ADS)

    Wlodarczyk, I.; Cernis, K.; Boyle, R. P.; Laugalys, V.

    2014-03-01

    The near-Earth asteroid belt is continuously replenished with material originally moving in Amor-class orbits. Here, the orbit of the dynamically interesting Amor-class asteroid 2012 XH16 is analysed. This asteroid was discovered with the Vatican Advanced Technology Telescope (VATT) at the Mt Graham International Observatory as part of an ongoing asteroid survey focused on astrometry and photometry. The orbit of the asteroid was computed using 66 observations (57 obtained with VATT and 9 from the Lunar and Planetary Laboratory-Spacewatch II project) to give a = 1.63 au, e = 0.36, i = 3.76°. The absolute magnitude of the asteroid is 22.3 which translates into a diameter in the range 104-231 m, assuming the average albedos of S-type and C-type asteroids, respectively. We have used the current orbit to study the future dynamical evolution of the asteroid under the perturbations of the planets and the Moon, relativistic effects, and the Yarkovsky force. Asteroid 2012 XH16 is locked close to the strong 1:2 mean motion resonance with the Earth. The object shows stable evolution and could survive in near-resonance for a relatively long period of time despite experiencing frequent close encounters with Mars. Moreover, results of our computations show that the asteroid 2012 XH16 can survive in the Amor region at most for about 200-400 Myr. The evolution is highly chaotic with a characteristic Lyapunov time of 245 yr. Jupiter is the main perturber but the effects of Saturn, Mars and the Earth-Moon system are also important. In particular, secular resonances with Saturn are significant.

  17. High Resolution Topography of Polar Regions from Commercial Satellite Imagery, Petascale Computing and Open Source Software

    NASA Astrophysics Data System (ADS)

    Morin, Paul; Porter, Claire; Cloutier, Michael; Howat, Ian; Noh, Myoung-Jong; Willis, Michael; Kramer, WIlliam; Bauer, Greg; Bates, Brian; Williamson, Cathleen

    2017-04-01

    Surface topography is among the most fundamental data sets for the geosciences, essential for disciplines ranging from glaciology to geodynamics. Two new projects are using sub-meter commercial imagery licensed by the National Geospatial-Intelligence Agency and open-source photogrammetry software to produce a time-tagged 2 m posting elevation model of the Arctic and an 8 m posting reference elevation model for the Antarctic. When complete, these publicly available data will be at higher resolution than any elevation model that covers the entirety of the Western United States. These two polar projects are made possible by three equally important factors: 1) open-source photogrammetry software, 2) petascale computing, and 3) sub-meter imagery licensed to the United States Government. Our talk will detail the technical challenges of using automated photogrammetry software; the rapid workflow evolution to allow DEM production; the task of deploying the workflow on one of the world's largest supercomputers; the trials of moving massive amounts of data; and the management strategies the team needed in order to meet deadlines. Finally, we will discuss the implications of this type of collaboration for future multi-team use of leadership-class systems such as Blue Waters, and for further elevation mapping.

  18. Evolution of Project-Based Learning in Small Groups in Environmental Engineering Courses

    ERIC Educational Resources Information Center

    Requies, Jesús M.; Agirre, Ion; Barrio, V. Laura; Graells, Moisès

    2018-01-01

    This work presents the assessment of the development and evolution of an active methodology (Project-Based Learning--PBL) implemented on the course "Unit Operations in Environmental Engineering", within the bachelor's degree in Environmental Engineering, with the purpose of decreasing the dropout rate in this course. After the initial…

  19. SAChES: Scalable Adaptive Chain-Ensemble Sampling.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Swiler, Laura Painton; Ray, Jaideep; Ebeida, Mohamed Salah

    We present the development of a parallel Markov Chain Monte Carlo (MCMC) method called SAChES, Scalable Adaptive Chain-Ensemble Sampling. This capability is targeted at Bayesian calibration of computationally expensive simulation models. SAChES involves a hybrid of two methods: Differential Evolution Monte Carlo followed by Adaptive Metropolis. Both methods involve parallel chains. Differential evolution allows one to explore high-dimensional parameter spaces using loosely coupled (i.e., largely asynchronous) chains. Loose coupling allows the use of large chain ensembles, with far more chains than the number of parameters to explore. This reduces the per-chain sampling burden and enables high-dimensional inversions and the use of computationally expensive forward models. The large number of chains can also ameliorate the impact of silent errors, which may affect only a few chains. The chain ensemble can also be sampled to provide an initial condition when an aberrant chain is re-spawned. Adaptive Metropolis takes the best points from the differential evolution phase and efficiently homes in on the posterior density. The multitude of chains in SAChES is leveraged to (1) enable efficient exploration of the parameter space and (2) ensure robustness to silent errors, which may be unavoidable in the extreme-scale computational platforms of the future. This report outlines SAChES, describes four papers that are the result of the project, and discusses some additional results.
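
    As a minimal serial illustration of the Differential Evolution Monte Carlo ingredient, where proposals are formed from the scaled difference of two other chains, consider the sketch below on an assumed two-dimensional Gaussian target; the parallel chain ensemble and the Adaptive Metropolis phase of SAChES are not reproduced.

```python
import numpy as np

# Minimal serial sketch of Differential Evolution Monte Carlo (proposals built
# from the difference of two other chains). Illustrative only: SAChES runs many
# loosely coupled parallel chains and follows this with Adaptive Metropolis.
rng = np.random.default_rng(0)
dim, n_chains, n_steps = 2, 10, 5000
gamma = 2.38 / np.sqrt(2 * dim)          # standard DE-MC scaling factor

def log_posterior(x):                    # assumed toy target: correlated Gaussian
    cov = np.array([[1.0, 0.8], [0.8, 1.0]])
    return -0.5 * x @ np.linalg.solve(cov, x)

chains = rng.normal(size=(n_chains, dim))
log_p = np.array([log_posterior(x) for x in chains])
samples = []
for step in range(n_steps):
    for i in range(n_chains):
        # Proposal: current state plus a scaled difference of two other chains.
        a, b = rng.choice([k for k in range(n_chains) if k != i], size=2, replace=False)
        proposal = chains[i] + gamma * (chains[a] - chains[b]) + 1e-4 * rng.normal(size=dim)
        log_p_prop = log_posterior(proposal)
        if np.log(rng.random()) < log_p_prop - log_p[i]:   # Metropolis acceptance
            chains[i], log_p[i] = proposal, log_p_prop
    samples.append(chains.copy())

samples = np.concatenate(samples[n_steps // 2:])            # discard burn-in
print("posterior mean:", samples.mean(axis=0),
      "correlation:", np.corrcoef(samples.T)[0, 1])
```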

  20. Efficient receiver tuning using differential evolution strategies

    NASA Astrophysics Data System (ADS)

    Wheeler, Caleb H.; Toland, Trevor G.

    2016-08-01

    Differential evolution (DE) is a powerful and computationally inexpensive optimization strategy that can be used to search an entire parameter space or to converge quickly on a solution. The Kilopixel Array Pathfinder Project (KAPPa) is a heterodyne receiver system delivering 5 GHz of instantaneous bandwidth in the tuning range of 645-695 GHz. The fully automated KAPPa receiver test system finds optimal receiver tuning using performance feedback and DE. We present an adaptation of DE for use in rapid receiver characterization. The KAPPa DE algorithm is written in Python 2.7 and is fully integrated with the KAPPa instrument control, data processing, and visualization code. KAPPa develops the technologies needed to realize heterodyne focal plane arrays containing 1000 pixels. Finding optimal receiver tuning by investigating large parameter spaces is one of many challenges facing the characterization phase of KAPPa, and it is a difficult task to carry out by hand. Characterizing or tuning in an automated fashion, without the need for human intervention, is desirable for future large-scale arrays. While many optimization strategies exist, DE is ideal under time and performance constraints because it can be set to converge to a solution rapidly with minimal computational overhead. We discuss how DE is utilized in the KAPPa system, assess its performance, and look toward the future of 1000-pixel array receivers, considering how the KAPPa DE system might be applied.

  1. BIOINFORMATICS IN THE K-8 CLASSROOM: DESIGNING INNOVATIVE ACTIVITIES FOR TEACHER IMPLEMENTATION

    PubMed Central

    Shuster, Michele; Claussen, Kira; Locke, Melly; Glazewski, Krista

    2016-01-01

    At the intersection of biology and computer science is the growing field of bioinformatics—the analysis of complex datasets of biological relevance. Despite the increasing importance of bioinformatics and associated practical applications, these are not standard topics in elementary and middle school classrooms. We report on a pilot project and its evolution to support implementation of bioinformatics-based activities in elementary and middle school classrooms. Specifically, we ultimately designed a multi-day summer teacher professional development workshop, in which teachers design innovative classroom activities. By focusing on teachers, our design leverages enhanced teacher knowledge and confidence to integrate innovative instructional materials into K-8 classrooms and contributes to capacity building in STEM instruction. PMID:27429860

  2. A History of the Andrew File System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brashear, Derrick

    2011-02-22

    Derrick Brashear and Jeffrey Altman will present a technical history of the evolution of the Andrew File System, starting with the early days of the Andrew Project at Carnegie Mellon, through the commercialization by Transarc Corporation and IBM, and a decade of OpenAFS. The talk will be technical with a focus on the various decisions and implementation trade-offs that were made over the course of AFS versions 1 through 4, the development of the Distributed Computing Environment Distributed File System (DCE DFS), and the course of the OpenAFS development community. The speakers will also discuss the various AFS branches developed at the University of Michigan, the Massachusetts Institute of Technology, and Carnegie Mellon University.

  3. Community Petascale Project for Accelerator Science and Simulation: Advancing Computational Science for Future Accelerators and Accelerator Technologies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Spentzouris, P. (Fermilab); Cary, J.

    The design and performance optimization of particle accelerators are essential for the success of the DOE scientific program in the next decade. Particle accelerators are very complex systems whose accurate description involves a large number of degrees of freedom and requires the inclusion of many physics processes. Building on the success of the SciDAC-1 Accelerator Science and Technology project, the SciDAC-2 Community Petascale Project for Accelerator Science and Simulation (ComPASS) is developing a comprehensive set of interoperable components for beam dynamics, electromagnetics, electron cooling, and laser/plasma acceleration modeling. ComPASS is providing accelerator scientists the tools required to enable the necessary accelerator simulation paradigm shift from high-fidelity single physics process modeling (covered under SciDAC-1) to high-fidelity multiphysics modeling. Our computational frameworks have been used to model the behavior of a large number of accelerators and accelerator R&D experiments, assisting both their design and performance optimization. As parallel computational applications, the ComPASS codes have been shown to make effective use of thousands of processors. ComPASS is in the first year of executing its plan to develop the next-generation HPC accelerator modeling tools. ComPASS aims to develop an integrated simulation environment that will utilize existing and new accelerator physics modules with petascale capabilities, by employing modern computing and solver technologies. The ComPASS vision is to deliver to accelerator scientists a virtual accelerator and virtual prototyping modeling environment, with the necessary multiphysics, multiscale capabilities. The plan for this development includes delivering accelerator modeling applications appropriate for each stage of the ComPASS software evolution. Such applications are already being used to address challenging problems in accelerator design and optimization. The ComPASS organization for software development and applications accounts for the natural domain areas (beam dynamics, electromagnetics, and advanced acceleration), and all areas depend on the enabling technologies activities, such as solvers and component technology, to deliver the desired performance and integrated simulation environment. The ComPASS applications focus on computationally challenging problems important for design or performance optimization at all major HEP, NP, and BES accelerator facilities. With the cost and complexity of particle accelerators rising, the use of computation to optimize their designs and find improved operating regimes becomes essential, potentially leading to significant cost savings with modest investment.

  4. Computing element evolution towards Exascale and its impact on legacy simulation codes

    NASA Astrophysics Data System (ADS)

    Colin de Verdière, Guillaume J. L.

    2015-12-01

    In light of the current race towards the Exascale, this article highlights the main features of the forthcoming computing elements that will be at the core of the next generations of supercomputers. The market analysis underlying this work shows that computers are facing a major evolution in terms of architecture. As a consequence, it is important to understand the impacts of those evolutions on legacy codes and programming methods. The problems of power dissipation and memory access are discussed and lead to a vision of what an exascale system should be. To survive, programming languages have had to respond to the hardware evolutions, either by evolving or through the creation of new ones. From these elements, we explain why vectorization, multithreading, data-locality awareness and hybrid programming will be the keys to reaching the exascale, implying that it is time to start rewriting codes.
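    As an illustration only (not drawn from the article), the kind of rewrite being argued for can be shown in a few lines of Python/NumPy, replacing a scalar loop with a single vectorized, contiguous-memory expression:

        import numpy as np

        def saxpy_loop(a, x, y):
            """Scalar loop: a poor fit for SIMD units and cache streaming."""
            out = np.empty_like(y)
            for i in range(len(y)):
                out[i] = a * x[i] + y[i]
            return out

        def saxpy_vectorized(a, x, y):
            """Single array expression: maps onto vector units and keeps the
            memory access contiguous, the pattern exascale hardware rewards."""
            return a * x + y

        x = np.arange(100_000, dtype=float)
        y = np.ones_like(x)
        assert np.allclose(saxpy_loop(2.0, x, y), saxpy_vectorized(2.0, x, y))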

  5. Software design studies emphasizing Project LOGOS

    NASA Technical Reports Server (NTRS)

    1972-01-01

    The results of a research project on the development of computer software are presented. Research funds of $200,000 were expended over a three-year period for software design and projects in connection with Project LOGOS (computer-aided design and certification of computing systems). Abstracts of theses prepared during the project are provided.

  6. Evolutionary Biology Digital Dissection Project: Web-Based Laboratory Learning Opportunities for Students

    ERIC Educational Resources Information Center

    Fabian, Carole Ann

    2004-01-01

    A university in Buffalo introduced its students to evolution by providing them with information on evidence of evolution, mechanisms for evolution, principles of genetics, selection, adaptation, evolution and sociobiology. This method of teaching with technology enabled students to improve and expand their learning opportunities.

  7. Expanding the Understanding of Evolution

    ERIC Educational Resources Information Center

    Musante, Susan

    2011-01-01

    Originally designed for K-12 teachers, the Understanding Evolution (UE) Web site ("www.understandingevolution.org") is a one-stop shop for all of a teacher's evolution education needs, with lesson plans, teaching tips, lists of common evolution misconceptions, and much more. However, during the past five years, the UE project team learned that…

  8. Desktop Computing Integration Project

    NASA Technical Reports Server (NTRS)

    Tureman, Robert L., Jr.

    1992-01-01

    The Desktop Computing Integration Project for the Human Resources Management Division (HRMD) of LaRC was designed to help division personnel use personal computing resources to perform job tasks. The three goals of the project were to involve HRMD personnel in desktop computing, link mainframe data to desktop capabilities, and to estimate training needs for the division. The project resulted in increased usage of personal computers by Awards specialists, an increased awareness of LaRC resources to help perform tasks, and personal computer output that was used in presentation of information to center personnel. In addition, the necessary skills for HRMD personal computer users were identified. The Awards Office was chosen for the project because of the consistency of their data requests and the desire of employees in that area to use the personal computer.

  9. Understanding the Irradiation Behavior of Zirconium Carbide

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Motta, Arthur; Sridharan, Kumar; Morgan, Dane

    2013-10-11

    Zirconium carbide (ZrC) is being considered for utilization in high-temperature gas-cooled reactor fuels in deep-burn TRISO fuel. Zirconium carbide possesses a cubic B1-type crystal structure with a high melting point, exceptional hardness, and good thermal and electrical conductivities. The use of ZrC as part of the TRISO fuel requires a thorough understanding of its irradiation response. However, the radiation effects on ZrC are still poorly understood. The majority of the existing research is focused on the radiation damage phenomena at higher temperatures (>450 °C) where many fundamental aspects of defect production and kinetics cannot be easily distinguished. Little is known about basic defect formation, clustering, and evolution of ZrC under irradiation, although some atomistic simulation and phenomenological studies have been performed. Such detailed information is needed to construct a model describing the microstructural evolution in fast-neutron irradiated materials that will be of great technological importance for the development of ZrC-based fuel. The goal of the proposed project is to gain fundamental understanding of the radiation-induced defect formation in zirconium carbide and irradiation response by using a combination of state-of-the-art experimental methods and atomistic modeling. This project will combine (1) in situ ion irradiation at a specialized facility at a national laboratory, (2) controlled temperature proton irradiation on bulk samples, and (3) atomistic modeling to gain a fundamental understanding of defect formation in ZrC. The proposed project will cover irradiation temperatures from cryogenic temperature to as high as 800 °C, and dose ranges from 0.1 to 100 dpa. The examination of this wide range of temperatures and doses allows us to obtain an experimental data set that can be effectively used to exercise and benchmark the computer calculations of defect properties. Combining the examination of radiation-induced microstructures mapped spatially and temporally, microstructural evolution during post-irradiation annealing, and atomistic modeling of defect formation and transport energetics will provide new, critical understanding about property changes in ZrC. The behavior of materials under irradiation is determined by the balance between damage production, defect clustering, and lattice response. In order to predict those effects at high temperatures so targeted testing can be expanded and extrapolated beyond the known database, it is necessary to determine the defect energetics and mobilities as these control damage accumulation and annealing. In particular, low-temperature irradiations are invaluable for determining the regions of defect mobility. Computer simulation techniques are particularly useful for identifying basic defect properties, especially if closely coupled with a well-constructed and complete experimental database. The close coupling of calculation and experiment in this project will provide mutual benchmarking and allow us to glean a deeper understanding of the irradiation response of ZrC, which can then be applied to the prediction of its behavior in reactor conditions.

  10. Use of non-adiabatic geometric phase for quantum computing by NMR.

    PubMed

    Das, Ranabir; Kumar, S K Karthick; Kumar, Anil

    2005-12-01

    Geometric phases have stimulated researchers because of their potential applications in many areas of science, one of which is fault-tolerant quantum computation. A preliminary requisite of quantum computation is the implementation of controlled dynamics of qubits. In controlled dynamics, one qubit undergoes coherent evolution and acquires an appropriate phase, depending on the state of the other qubits. If the evolution is geometric, then the phase acquired depends only on the geometry of the path executed, and is robust against certain types of error. This phenomenon leads to an inherently fault-tolerant quantum computation. Here we suggest a technique of using non-adiabatic geometric phase for quantum computation, using selective excitation. In a two-qubit system, we selectively evolve a suitable subsystem, where the control qubit is in state |1⟩, through a closed circuit. By this evolution, the target qubit gains a phase controlled by the state of the control qubit. Using the non-adiabatic geometric phase we demonstrate implementation of the Deutsch-Jozsa algorithm and Grover's search algorithm in a two-qubit system.
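    As an illustrative sketch only (this shows the target gate form, a conditional phase, not the NMR pulse sequence or the geometric loop itself; the input state and phase below are assumptions), the net effect described above is a controlled-phase gate:

        import numpy as np

        def controlled_phase(phi):
            """Diagonal two-qubit gate: only the |11> amplitude acquires phase phi.

            Illustrative model of the end result of the conditional evolution:
            the control-|0> subspace is left untouched, while the evolved
            subspace picks up the phase.
            """
            return np.diag([1.0, 1.0, 1.0, np.exp(1j * phi)])

        # phi = pi gives the controlled-Z gate used, for example, as the phase
        # kick in two-qubit Deutsch-Jozsa and Grover circuits.
        CZ = controlled_phase(np.pi)
        plus_plus = np.array([0.5, 0.5, 0.5, 0.5], dtype=complex)   # |++>
        print(np.round(CZ @ plus_plus, 3))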

  11. Jupiter and Planet Earth. [planetary and biological evolution and natural satellites

    NASA Technical Reports Server (NTRS)

    1975-01-01

    The evolution of Jupiter and Earth is discussed, along with their atmospheres, the radiation belts around both planets, natural satellites, the evolution of life, and the Pioneer 10 mission. Educational study projects are also included.

  12. Observational Tracers of Hot and Cold Gas in Isolated Galaxy Simulations

    NASA Astrophysics Data System (ADS)

    Brzycki, Bryan; Silvia, Devin

    2018-01-01

    We present results from an analysis comparing simulations of isolated spiral galaxies with recent observations of the circumgalactic medium (CGM). As the interface containing inflows and outflows between the interstellar and intergalactic media, the CGM plays an important role in the composition and evolution of galaxies. Using a set of isolated galaxy simulations over different initial conditions and star formation and feedback parameters, we investigate the evolution of CGM gas. Specifically, in light of recent observational studies, we compute the radial column density profiles and covering fractions of various observable ion species (H I, C IV, O VI, Mg II, Si III) for each simulated galaxy. Taking uniformly random sightlines through the CGM of each simulated galaxy, we find the abundance of gas absorbers and analyze their contribution to the overall column density along each sightline. By identifying the prevalence of high column density absorbers, we seek to characterize the distribution and evolution of observable ion species in the CGM. We also highlight a subset of our isolated galaxy simulations that produce and maintain a stable precipitating CGM that fuels high rates of sustained star formation. This project was supported in part by the NSF REU grant AST-1358980 and by the Nantucket Maria Mitchell Association.
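    As a hedged sketch of the kind of measurement described (the function names and the density-model interface are placeholders, not the analysis pipeline used on these simulations), a column density along one sightline and a covering fraction over many sightlines can be computed as:

        import numpy as np

        def column_density(number_density, entry, direction, length, n_steps=2000):
            """Integrate a 3D number-density field n(x, y, z) [cm^-3] along a
            straight sightline to obtain a column density N [cm^-2].

            number_density : callable taking an (n_steps, 3) array of positions [cm]
            entry, direction : sightline start point and unit vector (arrays)
            length : path length through the halo [cm]
            """
            s = np.linspace(0.0, length, n_steps)
            positions = entry[None, :] + s[:, None] * direction[None, :]
            return np.trapz(number_density(positions), s)

        def covering_fraction(columns, threshold):
            """Fraction of sightlines above a detection threshold, the quantity
            compared against absorption-line surveys of the CGM."""
            return np.mean(np.asarray(columns) > threshold)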

  13. Evolution of a Planetary System. SETI Academy Planet Project.

    ERIC Educational Resources Information Center

    Search for Extraterrestrial Intelligence Inst., Mountain View, CA.

    The SETI Academy Planet Project provides an exciting, informative, and creative series of activities for elementary students (grades 5-6). In these activities, each student plays the role of a cadet at the SETI Academy, a fictitious institution. This unit examines the evolution of stars and planets, which is an important aspect of the search for…

  14. Volcanoes: Where and Why? Crustal Evolution Education Project. Teacher's Guide [and] Student Investigation.

    ERIC Educational Resources Information Center

    Stoever, Edward C., Jr.

    Crustal Evolution Education Project (CEEP) modules were designed to: (1) provide students with the methods and results of continuing investigations into the composition, history, and processes of the earth's crust and the application of this knowledge to man's activities and (2) to be used by teachers with little or no previous background in the…

  15. Reedy Creek Cleanup: The Evolution of a University Geography Service-Learning Project

    ERIC Educational Resources Information Center

    Parece, Tammy E.; Aspaas, Helen Ruth

    2007-01-01

    Service-learning courses within a university setting help students to better understand their roles as members of civil society. This article examines the evolution of an urban stream cleanup project that has been part of a world regions geography course for six years. After connecting course goals with the current best practice literature on…

  16. Which Way is North? Crustal Evolution Education Project. Teacher's Guide [and] Student Investigation.

    ERIC Educational Resources Information Center

    Stoever, Edward C., Jr.

    Crustal Evolution Education Project (CEEP) modules were designed to: (1) provide students with the methods and results of continuing investigations into the composition, history, and processes of the earth's crust and the application of this knowledge to man's activities and (2) to be used by teachers with little or no previous background in the…

  17. Quake Estate (board game). Crustal Evolution Education Project. Teacher's Guide [and] Student Investigation.

    ERIC Educational Resources Information Center

    Stoever, Edward C., Jr.

    Crustal Evolution Education Project (CEEP) modules were designed to: (1) provide students with the methods and results of continuing investigations into the composition, history, and processes of the earth's crust and the application of this knowledge to man's activities and (2) to be used by teachers with little or no previous background in the…

  18. PAL: an object-oriented programming library for molecular evolution and phylogenetics.

    PubMed

    Drummond, A; Strimmer, K

    2001-07-01

    Phylogenetic Analysis Library (PAL) is a collection of Java classes for use in molecular evolution and phylogenetics. PAL provides a modular environment for the rapid construction of both special-purpose and general analysis programs. PAL version 1.1 consists of 145 public classes or interfaces in 13 packages, including classes for models of character evolution, maximum-likelihood estimation, and the coalescent, with a total of more than 27000 lines of code. The PAL project is set up as a collaborative project to facilitate contributions from other researchers. AVAILABILITY: The program is free and is available at http://www.pal-project.org. It requires Java 1.1 or later. PAL is licensed under the GNU General Public License.

  19. Evolution of synchronization and desynchronization in digital organisms.

    PubMed

    Knoester, David B; McKinley, Philip K

    2011-01-01

    We present a study in the evolution of temporal behavior, specifically synchronization and desynchronization, through digital evolution and group selection. In digital evolution, a population of self-replicating computer programs exists in a user-defined computational environment and is subject to instruction-level mutations and natural selection. Group selection links the survival of the individual to the survival of its group, thus encouraging cooperation. Previous approaches to engineering synchronization and desynchronization algorithms have taken inspiration from nature: In the well-known firefly model, the only form of communication between agents is in the form of flash messages among neighbors. Here we demonstrate that populations of digital organisms, provided with a similar mechanism and minimal information about their environment, are capable of evolving algorithms for synchronization and desynchronization, and that the evolved behaviors are robust to message loss. We further describe how the evolved behavior for synchronization mimics that of the well-known Ermentrout model for firefly synchronization in biology. In addition to discovering self-organizing behaviors for distributed computing systems, this result indicates that digital evolution may be used to further our understanding of synchronization in biology.

  20. Quantum simulation from the bottom up: the case of rebits

    NASA Astrophysics Data System (ADS)

    Enshan Koh, Dax; Yuezhen Niu, Murphy; Yoder, Theodore J.

    2018-05-01

    Typically, quantum mechanics is thought of as a linear theory with unitary evolution governed by the Schrödinger equation. While this is technically true and useful for a physicist, with regards to computation it is an unfortunately narrow point of view. Just as a classical computer can simulate highly nonlinear functions of classical states, so too can the more general quantum computer simulate nonlinear evolutions of quantum states. We detail one particular simulation of nonlinearity on a quantum computer, showing how the entire class of -unitary evolutions (on n qubits) can be simulated using a unitary, real-amplitude quantum computer (consisting of n + 1 qubits in total). These operators can be represented as the sum of a linear and antilinear operator, and add an intriguing new set of nonlinear quantum gates to the toolbox of the quantum algorithm designer. Furthermore, a subgroup of these nonlinear evolutions, called the -Cliffords, can be efficiently classically simulated, by making use of the fact that Clifford operators can simulate non-Clifford (in fact, non-linear) operators. This perspective of using the physical operators that we have to simulate non-physical ones that we do not is what we call bottom-up simulation, and we give some examples of its broader implications.
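    For concreteness, the standard way to trade one extra qubit for real amplitudes is to stack real and imaginary parts; a minimal NumPy sketch of that encoding (illustrative only, and not the paper's notation for the full class of evolutions it simulates):

        import numpy as np

        def encode_rebit(psi):
            """Map a complex state vector of dimension 2**n to a real vector of
            dimension 2**(n+1): the extra qubit indexes (real part, imaginary part)."""
            return np.concatenate([psi.real, psi.imag])

        def encode_unitary(U):
            """Map a complex unitary U to the real orthogonal block matrix
            [[Re U, -Im U], [Im U, Re U]] acting on the encoded vector."""
            return np.block([[U.real, -U.imag], [U.imag, U.real]])

        # Consistency check: encoding commutes with the evolution.
        rng = np.random.default_rng(0)
        psi = rng.normal(size=4) + 1j * rng.normal(size=4)
        psi /= np.linalg.norm(psi)
        U, _ = np.linalg.qr(rng.normal(size=(4, 4)) + 1j * rng.normal(size=(4, 4)))
        assert np.allclose(encode_unitary(U) @ encode_rebit(psi), encode_rebit(U @ psi))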

  1. Lower bound on the time complexity of local adiabatic evolution

    NASA Astrophysics Data System (ADS)

    Chen, Zhenghao; Koh, Pang Wei; Zhao, Yan

    2006-11-01

    The adiabatic theorem of quantum physics has been, in recent times, utilized in the design of local search quantum algorithms, and has been proven to be equivalent to standard quantum computation, that is, the use of unitary operators [D. Aharonov in Proceedings of the 45th Annual Symposium on the Foundations of Computer Science, 2004, Rome, Italy (IEEE Computer Society Press, New York, 2004), pp. 42-51]. Hence, the study of the time complexity of adiabatic evolution algorithms gives insight into the computational power of quantum algorithms. In this paper, we present two different approaches of evaluating the time complexity for local adiabatic evolution using time-independent parameters, thus providing effective tests (not requiring the evaluation of the entire time-dependent gap function) for the time complexity of newly developed algorithms. We further illustrate our tests by displaying results from the numerical simulation of some problems, viz. specially modified instances of the Hamming weight problem.
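    For reference, a textbook form of the adiabatic run-time criterion from which such complexity analyses start (this is the standard condition, not the paper's specific time-independent tests); here H(s) is the interpolating Hamiltonian, |E_0(s)> and |E_1(s)> its two lowest eigenstates, and g(s) the spectral gap:

        T \;\gg\; \frac{\max_{0 \le s \le 1} \left| \langle E_1(s) |\, \tfrac{dH}{ds} \,| E_0(s) \rangle \right|}{\min_{0 \le s \le 1} g(s)^{2}},
        \qquad g(s) = E_1(s) - E_0(s).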

  2. Quantum gates by periodic driving

    PubMed Central

    Shi, Z. C.; Wang, W.; Yi, X. X.

    2016-01-01

    Topological quantum computation has been extensively studied in the past decades due to its robustness against decoherence. One way to realize topological quantum computation is by adiabatic evolutions, which require relatively long times to complete a gate, so the speed of quantum computation slows down. In this work, we present a method to realize single-qubit quantum gates by periodic driving. Compared to adiabatic evolution, the single-qubit gates can be realized at a fixed time much shorter than that required by adiabatic evolution. The driving field can be sinusoidal or a square-well field. With the sinusoidal driving field, we derive an expression for the total operation time in the high-frequency limit, and for square-well driving an exact analytical expression for the evolution operator is given without any approximations. This study suggests that periodic driving could provide a new direction for regulating the operation time in topological quantum computation. PMID:26911900

  3. A Comprehensive Toolset for General-Purpose Private Computing and Outsourcing

    DTIC Science & Technology

    2016-12-08

    project and scientific advances made toward each of the research thrusts throughout the project duration. 1 Project Objectives. Cloud computing enables... possibilities that the cloud enables is computation outsourcing, when the client can utilize any necessary computing resources for its computational task... Security considerations, however, stand in the way of harnessing the benefits of cloud computing to the fullest extent and prevent clients from

  4. Computers in Education: The Shape of Things to Come = L'informatique en education: Quelles Evolutions?

    ERIC Educational Resources Information Center

    Baron, Georges-Louis

    1989-01-01

    Presents an overview of the evolution of ideas and achievements in the sphere of general secondary education. Emphasis is on the social aspects of the implementation of the various stages which were involved in the introduction and development of computer technology in educational systems in different countries. Both French and English versions…

  5. Variety and evolution of American endoscopic image management and recording systems.

    PubMed

    Korman, L Y

    1996-03-01

    The rapid evolution of computing technology has and will continue to alter the practice of gastroenterology and gastrointestinal endoscopy. Development of communication standards for text, images, and security systems will be necessary for medicine to take advantage of high-speed computing and communications. Professional societies can have an important role in guiding the development process.

  6. Programmed Instruction to Computer-Based Instruction: The Evolution of an Instructional Technology.

    ERIC Educational Resources Information Center

    Lamos, Joseph P.

    This review of the evolution of programmed instruction from Pressey and Skinner to the present suggests that current computer technology will be able to free the learner from the limitations of time and place as Pressey originally proposed. It is noted that Skinner provided the necessary foundation for treating the learning process on an…

  7. Global Snow-Cover Evolution from Twenty Years of Satellite Passive Microwave Data

    USGS Publications Warehouse

    Mognard, N.M.; Kouraev, A.V.; Josberger, E.G.

    2003-01-01

    Starting in 1979 with the SMMR (Scanning Multichannel Microwave Radiometer) instrument onboard the satellite NIMBUS-7 and continuing since 1987 with the SSMI (Special Sensor Microwave Imager) instrument on board the DMSP (Defense Meteorological Satellite Program) series, more than twenty years of satellite passive microwave data are now available. This dataset has been processed to analyse the evolution of the global snow cover. This work is part of the AICSEX project from the 5th Framework Programme of the European Community. The spatio-temporal evolution of the satellite-derived yearly snow maximum extent and the timing of the spring snow melt were estimated and analysed over the Northern Hemisphere. Significant differences between the evolution of the yearly maximum snow extent in Eurasia and in North America were found. A positive correlation between the maximum yearly snow cover extent and the ENSO index was obtained. High interannual spatio-temporal variability characterises the timing of snow melt in the spring. Twenty-year trends in the timing of spring snow melt have been computed and compared with spring air temperature trends for the same period and the same area. In most parts of Eurasia and in the central and western parts of North America the tendency has been for earlier snow melt. In northeastern Canada, a large area of positive trends, where snow melt now starts later than in the early 1980s, corresponds to a region of positive trends of spring air temperature observed over the same period.
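    As a hedged sketch (array names, shapes, and units are assumptions, not the AICSEX processing chain), a per-grid-cell linear trend of the kind described can be computed as:

        import numpy as np

        def per_pixel_trend(years, annual_field):
            """Least-squares linear trend for each grid cell.

            years        : (n_years,) array, e.g. 1979..1999
            annual_field : (n_years, n_lat, n_lon) array, e.g. the day of year of
                           spring snow melt, or the spring air temperature
            returns      : (n_lat, n_lon) array of slopes in units per year;
                           a negative melt-timing slope means earlier melt.
            """
            n_years, n_lat, n_lon = annual_field.shape
            flat = annual_field.reshape(n_years, -1)
            slopes = np.polyfit(years, flat, deg=1)[0]    # first row holds the slopes
            return slopes.reshape(n_lat, n_lon)

        # The melt-timing and air-temperature trends can then be compared cell by
        # cell, e.g. by checking where their signs agree.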

  8. The future of climate science analysis in a coming era of exascale computing

    NASA Astrophysics Data System (ADS)

    Bates, S. C.; Strand, G.

    2013-12-01

    Projections of Community Earth System Model (CESM) output based on the growth of data archived over 2000-2012 at all of our computing sites (NCAR, NERSC, ORNL) show that we can expect to reach 1,000 PB (1 EB) sometime in the next decade or so. The current paradigms of using site-based archival systems to hold these data that are then accessed via portals or gateways, downloading the data to a local system, and then processing/analyzing the data will be irretrievably broken before then. From a climate modeling perspective, the expertise involved in making climate models themselves efficient on HPC systems will need to be applied to the data as well - providing fast parallel analysis tools co-resident in memory with the data, because the disk I/O bandwidth simply will not keep up with the expected arrival of exaflop systems. The ability of scientists, analysts, stakeholders and others to use climate model output to turn these data into understanding and knowledge will require significant advances in current analysis tools and packages to enable these processes for such vast volumes of data. Allowing data users to run their own analyses on the model output is virtually a requirement as well - climate modelers cannot anticipate all the analyses that users may want to do. In addition, the expertise of data scientists, and their knowledge of the model output and of best practices in data management (metadata, curation, provenance and so on), will need to be rewarded and exploited to gain the most understanding possible from these volumes of data. In response to growing data size, demand, and future projections, the CESM output has undergone a structural evolution and the data management plan has been reevaluated and updated. The major evolution of the CESM data structure is presented here, along with the CESM experience and role within CMIP3/CMIP5.

  9. Peridynamic Theory as a New Paradigm for Multiscale Modeling of Sintering

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Silling, Stewart A.; Abdeljawad, Fadi; Ford, Kurtis Ross

    2017-09-01

    Sintering is a component fabrication process in which powder is compacted by pressing or some other means and then held at elevated temperature for a period of hours. The powder grains bond with each other, leading to the formation of a solid component with much lower porosity, and therefore higher density and higher strength, than the original powder compact. In this project, we investigated a new way of computationally modeling sintering at the length scale of grains. The model uses a high-fidelity, three-dimensional representation with a few hundred nodes per grain. The numerical model solves the peridynamic equations, in whichmore » nonlocal forces allow representation of the attraction, adhesion, and mass diffusion between grains. The deformation of the grains is represented through a viscoelastic material model. The project successfully demonstrated the use of this method to reproduce experimentally observed features of material behavior in sintering, including densification, the evolution of microstructure, and the occurrence of random defects in the sintered solid.« less

  10. Cranial base topology and basic trends in the facial evolution of Homo.

    PubMed

    Bastir, Markus; Rosas, Antonio

    2016-02-01

    Facial prognathism and projection are important characteristics in human evolution but their three-dimensional (3D) architectonic relationships to basicranial morphology are not clear. We used geometric morphometrics and measured 51 3D-landmarks in a comparative sample of modern humans (N = 78) and fossil Pleistocene hominins (N = 10) to investigate the spatial features of covariation between basicranial and facial elements. The study reveals complex morphological integration patterns in craniofacial evolution of Middle and Late Pleistocene hominins. A downwards-orientated cranial base correlates with alveolar maxillary prognathism, relatively larger faces, and relatively larger distances between the anterior cranial base and the frontal bone (projection). This upper facial projection correlates with increased overall relative size of the maxillary alveolar process. Vertical facial height is associated with tall nasal cavities and is accommodated by an elevated anterior cranial base, possibly because of relations between the cribriform and the nasal cavity in relation to body size and energetics. Variation in upper- and mid-facial projection can further be produced by basicranial topology in which the midline base and nasal cavity are shifted anteriorly relative to retracted lateral parts of the base and the face. The zygomatics and the middle cranial fossae act together as bilateral vertical systems that are either projected or retracted relative to the midline facial elements, causing either midfacial flatness or midfacial projection correspondingly. We propose that facial flatness and facial projection reflect classical principles of craniofacial growth counterparts, while facial orientation relative to the basicranium as well as facial proportions reflect the complex interplay of head-body integration in the light of encephalization and body size decrease in Middle to Late Pleistocene hominin evolution. Developmental and evolutionary patterns of integration may only partially overlap morphologically, and traditional concepts taken from research on two-dimensional (2D) lateral X-rays and sections have led to oversimplified and overly mechanistic models of basicranial evolution. Copyright © 2015 Elsevier Ltd. All rights reserved.

  11. A Vision on the Status and Evolution of HEP Physics Software Tools

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Canal, P.; Elvira, D.; Hatcher, R.

    2013-07-28

    This paper represents the vision of the members of the Fermilab Scientific Computing Division's Computational Physics Department (SCD-CPD) on the status and the evolution of various HEP software tools such as the Geant4 detector simulation toolkit, the Pythia and GENIE physics generators, and the ROOT data analysis framework. The goal of this paper is to contribute ideas to the Snowmass 2013 process toward the composition of a unified document on the current status and potential evolution of the physics software tools which are essential to HEP.

  12. Gaudi Evolution for Future Challenges

    NASA Astrophysics Data System (ADS)

    Clemencic, M.; Hegner, B.; Leggett, C.

    2017-10-01

    The LHCb Software Framework Gaudi was initially designed and developed almost twenty years ago, when computing was very different from today. It has also been used by a variety of other experiments, including ATLAS, Daya Bay, GLAST, HARP, LZ, and MINERVA. Although it has been actively developed throughout these years, stability and backward compatibility have been favoured, reducing the possibilities of adopting new techniques, like multithreaded processing. R&D efforts like GaudiHive have, however, shown its potential to cope with the new challenges. With the LHC's second Long Shutdown approaching, and to prepare for the computing challenges of the upgrade of the collider and the detectors, now is a perfect moment to review the design of Gaudi and plan future developments of the project. To do this, LHCb, ATLAS and the Future Circular Collider community have joined efforts to bring Gaudi forward and prepare it for the upcoming needs of the experiments. We present here how Gaudi will evolve in the next years and the long-term development plans.

  13. JINR cloud infrastructure evolution

    NASA Astrophysics Data System (ADS)

    Baranov, A. V.; Balashov, N. A.; Kutovskiy, N. A.; Semenov, R. N.

    2016-09-01

    To fulfil JINR commitments in different national and international projects related to the use of modern information technologies, such as cloud and grid computing, and to provide a modern tool for JINR users for their scientific research, a cloud infrastructure was deployed at the Laboratory of Information Technologies of the Joint Institute for Nuclear Research. OpenNebula software was chosen as the cloud platform. Initially it was set up in a simple configuration with a single front-end host and a few cloud nodes. Some custom development was done to tune the JINR cloud installation to fit local needs: a web form in the cloud web-interface for resource requests, a menu item with cloud utilization statistics, user authentication via Kerberos, and a custom driver for OpenVZ containers. Because of high demand for the cloud service and over-utilization of its resources, it was re-designed to cover increasing users' needs in capacity, availability and reliability. Recently a new cloud instance has been deployed in a high-availability configuration with a distributed network file system and additional computing power.

  14. The Earth System Model

    NASA Technical Reports Server (NTRS)

    Schoeberl, Mark; Rood, Richard B.; Hildebrand, Peter; Raymond, Carol

    2003-01-01

    The Earth System Model is the natural evolution of current climate models and will be the ultimate embodiment of our geophysical understanding of the planet. These models are constructed from components - atmosphere, ocean, ice, land, chemistry, solid earth, etc. - and merged together through a coupling program which is responsible for the exchange of data between the components. Climate models and future earth system models will have standardized modules, and these standards are now being developed by the ESMF project funded by NASA. The Earth System Model will have a variety of uses beyond climate prediction. The model can be used to build climate data records, making it the core of an assimilation system, and it can be used in OSSE experiments to evaluate proposed observing systems. The computing and storage requirements for the ESM appear to be daunting. However, the Japanese Earth Simulator's theoretical computing capability is already within 20% of the minimum requirements needed for some 2010 climate model applications. Thus it seems very possible that a focused effort to build an Earth System Model will achieve success.

  15. DIAPHANE: A portable radiation transport library for astrophysical applications

    NASA Astrophysics Data System (ADS)

    Reed, Darren S.; Dykes, Tim; Cabezón, Rubén; Gheller, Claudio; Mayer, Lucio

    2018-05-01

    One of the most computationally demanding aspects of the hydrodynamical modeling of astrophysical phenomena is the transport of energy by radiation or relativistic particles. Physical processes involving energy transport are ubiquitous and of capital importance in many scenarios ranging from planet formation to cosmic structure evolution, including explosive events like core-collapse supernovae or gamma-ray bursts. Moreover, the ability to model and hence understand these processes has often been limited by the approximations and incompleteness in the treatment of radiation and relativistic particles. The DIAPHANE project has focused on developing a portable and scalable library that handles the transport of radiation and particles (in particular neutrinos) independently of the underlying hydrodynamic code. In this work, we present the computational framework and the functionalities of the first version of the DIAPHANE library, which has been successfully ported to three different smoothed-particle hydrodynamics codes: GADGET2, GASOLINE and SPHYNX. We also present validation of different modules solving the equations of radiation and neutrino transport using different numerical schemes.

  16. Geo-spatial Service and Application based on National E-government Network Platform and Cloud

    NASA Astrophysics Data System (ADS)

    Meng, X.; Deng, Y.; Li, H.; Yao, L.; Shi, J.

    2014-04-01

    With the acceleration of China's informatization process, the party and government have taken substantive strides in advancing the development and application of digital technology, which promotes the evolution of e-government and its informatization. Meanwhile, as a service model based on pooled resources, cloud computing connects huge resource pools to provide a variety of IT services, and has become a relatively mature technical pattern backed by further studies and massive practical applications. Based on cloud computing technology and the national e-government network platform, the "National Natural Resources and Geospatial Database (NRGD)" project integrated and transformed natural-resource and geospatial information dispersed across various sectors and regions, established a logically unified but physically distributed fundamental database, and developed a national integrated information database system supporting major e-government applications. Cross-sector e-government applications and services are realized to provide long-term, stable and standardized natural-resource and geospatial fundamental information products and services for national e-government and public users.

  17. A Volunteer Computing Project for Solving Geoacoustic Inversion Problems

    NASA Astrophysics Data System (ADS)

    Zaikin, Oleg; Petrov, Pavel; Posypkin, Mikhail; Bulavintsev, Vadim; Kurochkin, Ilya

    2017-12-01

    A volunteer computing project aimed at solving computationally hard inverse problems in underwater acoustics is described. This project was used to study the possibilities of sound speed profile reconstruction in a shallow-water waveguide using a dispersion-based geoacoustic inversion scheme. The computational capabilities provided by the project allowed us to investigate the accuracy of the inversion for different mesh sizes of the sound speed profile discretization grid. This problem is well suited to volunteer computing because it can be easily decomposed into independent, simpler subproblems.
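    As an illustrative sketch of such a decomposition (the parameter names and grid values below are placeholders, not the actual inversion parameters), independent work units can be generated as the Cartesian product of the discretized parameter grids:

        import itertools

        def make_workunits(parameter_grids):
            """Enumerate independent subproblems for a volunteer-computing project.

            parameter_grids : dict mapping each sound-speed-profile parameter name
                              to the list of grid values to try.
            Each work unit is a complete parameter assignment and can be evaluated
            on a volunteer host with no communication with other units.
            """
            names = sorted(parameter_grids)
            for values in itertools.product(*(parameter_grids[n] for n in names)):
                yield dict(zip(names, values))

        # Example: a coarse three-parameter discretization -> 5 * 5 * 4 = 100 work units.
        grids = {"c_surface": [1480 + 2 * i for i in range(5)],
                 "c_bottom":  [1500 + 5 * i for i in range(5)],
                 "depth":     [30, 35, 40, 45]}
        units = list(make_workunits(grids))
        print(len(units), units[0])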

  18. Programmed Evolution for Optimization of Orthogonal Metabolic Output in Bacteria

    PubMed Central

    Eckdahl, Todd T.; Campbell, A. Malcolm; Heyer, Laurie J.; Poet, Jeffrey L.; Blauch, David N.; Snyder, Nicole L.; Atchley, Dustin T.; Baker, Erich J.; Brown, Micah; Brunner, Elizabeth C.; Callen, Sean A.; Campbell, Jesse S.; Carr, Caleb J.; Carr, David R.; Chadinha, Spencer A.; Chester, Grace I.; Chester, Josh; Clarkson, Ben R.; Cochran, Kelly E.; Doherty, Shannon E.; Doyle, Catherine; Dwyer, Sarah; Edlin, Linnea M.; Evans, Rebecca A.; Fluharty, Taylor; Frederick, Janna; Galeota-Sprung, Jonah; Gammon, Betsy L.; Grieshaber, Brandon; Gronniger, Jessica; Gutteridge, Katelyn; Henningsen, Joel; Isom, Bradley; Itell, Hannah L.; Keffeler, Erica C.; Lantz, Andrew J.; Lim, Jonathan N.; McGuire, Erin P.; Moore, Alexander K.; Morton, Jerrad; Nakano, Meredith; Pearson, Sara A.; Perkins, Virginia; Parrish, Phoebe; Pierson, Claire E.; Polpityaarachchige, Sachith; Quaney, Michael J.; Slattery, Abagael; Smith, Kathryn E.; Spell, Jackson; Spencer, Morgan; Taye, Telavive; Trueblood, Kamay; Vrana, Caroline J.; Whitesides, E. Tucker

    2015-01-01

    Current use of microbes for metabolic engineering suffers from loss of metabolic output due to natural selection. Rather than combat the evolution of bacterial populations, we chose to embrace what makes biological engineering unique among engineering fields – evolving materials. We harnessed bacteria to compute solutions to the biological problem of metabolic pathway optimization. Our approach is called Programmed Evolution to capture two concepts. First, a population of cells is programmed with DNA code to enable it to compute solutions to a chosen optimization problem. As analog computers, bacteria process known and unknown inputs and direct the output of their biochemical hardware. Second, the system employs the evolution of bacteria toward an optimal metabolic solution by imposing fitness defined by metabolic output. The current study is a proof-of-concept for Programmed Evolution applied to the optimization of a metabolic pathway for the conversion of caffeine to theophylline in E. coli. Introduced genotype variations included strength of the promoter and ribosome binding site, plasmid copy number, and chaperone proteins. We constructed 24 strains using all combinations of the genetic variables. We used a theophylline riboswitch and a tetracycline resistance gene to link theophylline production to fitness. After subjecting the mixed population to selection, we measured a change in the distribution of genotypes in the population and an increased conversion of caffeine to theophylline among the most fit strains, demonstrating Programmed Evolution. Programmed Evolution inverts the standard paradigm in metabolic engineering by harnessing evolution instead of fighting it. Our modular system enables researchers to program bacteria and use evolution to determine the combination of genetic control elements that optimizes catabolic or anabolic output and to maintain it in a population of cells. Programmed Evolution could be used for applications in energy, pharmaceuticals, chemical commodities, biomining, and bioremediation. PMID:25714374
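    As an illustrative sketch of the combinatorial construction (the factor levels below are placeholders chosen only so that the full factorial yields 24 combinations, matching the strain count; the real promoter, RBS, copy-number and chaperone options are those described in the paper):

        import itertools

        # Placeholder factor levels: 2 * 2 * 3 * 2 = 24 combinations.
        factors = {
            "promoter":     ["strong", "weak"],
            "rbs":          ["strong", "weak"],
            "plasmid_copy": ["high", "medium", "low"],
            "chaperones":   ["with", "without"],
        }

        # One dict per strain, covering all combinations of the genetic variables.
        strains = [dict(zip(factors, combo))
                   for combo in itertools.product(*factors.values())]
        print(len(strains))   # 24
        print(strains[0])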

  19. Programmed evolution for optimization of orthogonal metabolic output in bacteria.

    PubMed

    Eckdahl, Todd T; Campbell, A Malcolm; Heyer, Laurie J; Poet, Jeffrey L; Blauch, David N; Snyder, Nicole L; Atchley, Dustin T; Baker, Erich J; Brown, Micah; Brunner, Elizabeth C; Callen, Sean A; Campbell, Jesse S; Carr, Caleb J; Carr, David R; Chadinha, Spencer A; Chester, Grace I; Chester, Josh; Clarkson, Ben R; Cochran, Kelly E; Doherty, Shannon E; Doyle, Catherine; Dwyer, Sarah; Edlin, Linnea M; Evans, Rebecca A; Fluharty, Taylor; Frederick, Janna; Galeota-Sprung, Jonah; Gammon, Betsy L; Grieshaber, Brandon; Gronniger, Jessica; Gutteridge, Katelyn; Henningsen, Joel; Isom, Bradley; Itell, Hannah L; Keffeler, Erica C; Lantz, Andrew J; Lim, Jonathan N; McGuire, Erin P; Moore, Alexander K; Morton, Jerrad; Nakano, Meredith; Pearson, Sara A; Perkins, Virginia; Parrish, Phoebe; Pierson, Claire E; Polpityaarachchige, Sachith; Quaney, Michael J; Slattery, Abagael; Smith, Kathryn E; Spell, Jackson; Spencer, Morgan; Taye, Telavive; Trueblood, Kamay; Vrana, Caroline J; Whitesides, E Tucker

    2015-01-01

    Current use of microbes for metabolic engineering suffers from loss of metabolic output due to natural selection. Rather than combat the evolution of bacterial populations, we chose to embrace what makes biological engineering unique among engineering fields - evolving materials. We harnessed bacteria to compute solutions to the biological problem of metabolic pathway optimization. Our approach is called Programmed Evolution to capture two concepts. First, a population of cells is programmed with DNA code to enable it to compute solutions to a chosen optimization problem. As analog computers, bacteria process known and unknown inputs and direct the output of their biochemical hardware. Second, the system employs the evolution of bacteria toward an optimal metabolic solution by imposing fitness defined by metabolic output. The current study is a proof-of-concept for Programmed Evolution applied to the optimization of a metabolic pathway for the conversion of caffeine to theophylline in E. coli. Introduced genotype variations included strength of the promoter and ribosome binding site, plasmid copy number, and chaperone proteins. We constructed 24 strains using all combinations of the genetic variables. We used a theophylline riboswitch and a tetracycline resistance gene to link theophylline production to fitness. After subjecting the mixed population to selection, we measured a change in the distribution of genotypes in the population and an increased conversion of caffeine to theophylline among the most fit strains, demonstrating Programmed Evolution. Programmed Evolution inverts the standard paradigm in metabolic engineering by harnessing evolution instead of fighting it. Our modular system enables researchers to program bacteria and use evolution to determine the combination of genetic control elements that optimizes catabolic or anabolic output and to maintain it in a population of cells. Programmed Evolution could be used for applications in energy, pharmaceuticals, chemical commodities, biomining, and bioremediation.

  20. The World Spatiotemporal Analytics and Mapping Project (WSTAMP): Further Progress in Discovering, Exploring, and Mapping Spatiotemporal Patterns Across the World's Largest Open Source Data Sets

    NASA Astrophysics Data System (ADS)

    Piburn, J.; Stewart, R.; Myers, A.; Sorokine, A.; Axley, E.; Anderson, D.; Burdette, J.; Biddle, C.; Hohl, A.; Eberle, R.; Kaufman, J.; Morton, A.

    2017-10-01

    Spatiotemporal (ST) analytics applied to major data sources such as the World Bank and World Health Organization has shown tremendous value in shedding light on the evolution of cultural, health, economic, and geopolitical landscapes on a global level. WSTAMP engages this opportunity by situating analysts, data, and analytics together within a visually rich and computationally rigorous online analysis environment. Since we introduced WSTAMP at the First International Workshop on Spatiotemporal Computing, several transformative advances have occurred. Collaboration with human-computer interaction experts led to a complete interface redesign that deeply immerses the analyst within a ST context, significantly increases visual and textual content, provides navigational crosswalks for attribute discovery, substantially reduces mouse and keyboard actions, and supports user data uploads. Secondly, the database has been expanded to include over 16,000 attributes, 50 years of time, and 200+ nation states, and has been redesigned to support non-annual, non-national, city, and interaction data. Finally, two new analytics are implemented for analyzing large portfolios of multi-attribute data and for measuring the behavioral stability of regions along different dimensions. These advances required substantial new approaches in design, algorithmic innovations, and increased computational efficiency. We report on these advances and describe how others may freely access the tool.

  1. Computational design of materials for solar hydrogen generation

    NASA Astrophysics Data System (ADS)

    Umezawa, Naoto

    Photocatalysis has great potential for the production of hydrogen from aqueous solution under solar light. In this talk, two different approaches toward the computational design of materials for solar hydrogen generation will be presented. Tin (Sn), which has two major oxidation states, Sn2+ and Sn4+, is abundant in the earth's crust. Recently, a visible-light-responsive photocatalytic H2 evolution reaction was identified over a mixed-valence tin oxide, Sn3O4. We have carried out crystal structure prediction for mixed-valence tin oxides with different atomic compositions under ambient pressure conditions using advanced computational methods based on evolutionary crystal-structure search and density-functional theory. The predicted novel crystal structures realize the desirable band gaps and band edge positions for H2 evolution under visible-light irradiation. It is concluded that multivalent tin oxides have great potential as abundant, cheap and environmentally benign photofunctional materials for solar-energy conversion. Transition metal doping is effective for sensitizing SrTiO3 under visible light. We have theoretically investigated the roles of the doped Cr in STO based on hybrid density-functional calculations. Cr atoms preferably substitute for Ti under any equilibrium growth conditions. The lower oxidation state Cr3+, which is stabilized under an n-type condition of STO, is found to be advantageous for the photocatalytic performance. It is further predicted that lanthanum is the best codopant for stabilizing the favorable oxidation state, Cr3+. The prediction was validated by our experiments, in which La and Cr co-doped STO shows the best performance among the examined samples. This work was supported by the Japan Science and Technology Agency (JST) Precursory Research for Embryonic Science and Technology (PRESTO) and the International Research Fellow program of the Japan Society for the Promotion of Science (JSPS) through project P14207.

  2. Non-rigid CT/CBCT to CBCT registration for online external beam radiotherapy guidance

    NASA Astrophysics Data System (ADS)

    Zachiu, Cornel; de Senneville, Baudouin Denis; Tijssen, Rob H. N.; Kotte, Alexis N. T. J.; Houweling, Antonetta C.; Kerkmeijer, Linda G. W.; Lagendijk, Jan J. W.; Moonen, Chrit T. W.; Ries, Mario

    2018-01-01

    Image-guided external beam radiotherapy (EBRT) allows radiation dose deposition with a high degree of accuracy and precision. Guidance is usually achieved by estimating the displacements, via image registration, between cone beam computed tomography (CBCT) and computed tomography (CT) images acquired at different stages of the therapy. The resulting displacements are then used to reposition the patient such that the location of the tumor at the time of treatment matches its position during planning. Moreover, ongoing research aims to use CBCT-CT image registration for online plan adaptation. However, CBCT images are usually acquired using a small number of x-ray projections and/or low beam intensities. This often leads to images with low contrast, a low signal-to-noise ratio and artifacts, which ends up hampering the image registration process. Previous studies addressed this by integrating additional image processing steps into the registration procedure. However, these steps are usually designed for particular image acquisition schemes, which limits their use to a case-by-case basis. In the current study we address CT to CBCT and CBCT to CBCT registration by means of the recently proposed EVolution registration algorithm. Contrary to previous approaches, EVolution does not require the integration of additional image processing steps in the registration scheme. Moreover, the algorithm requires a small number of input parameters, is easily parallelizable and provides an elastic deformation on a point-by-point basis. Results show that, relative to a pure CT-based registration, the intrinsic artifacts present in typical CBCT images have only a sub-millimeter impact on the accuracy and precision of the estimated deformation. In addition, the algorithm has low computational requirements, which are compatible with online image-based guidance of EBRT treatments.
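    As a hedged sketch (not the EVolution algorithm itself, whose optimization is described in the paper), the following shows how a dense point-by-point deformation field, once estimated, can be applied with SciPy to resample a 3D image, for example to map a planning CT into the daily CBCT frame:

        import numpy as np
        from scipy.ndimage import map_coordinates

        def warp_image(moving, deformation):
            """Resample a 3D image with a dense deformation field.

            moving      : (nz, ny, nx) image to warp
            deformation : (3, nz, ny, nx) per-voxel displacement, in voxels
            The warped value at voxel x is taken from `moving` at x + u(x),
            using trilinear interpolation.
            """
            identity = np.indices(moving.shape).astype(float)
            coords = identity + deformation
            return map_coordinates(moving, coords, order=1, mode="nearest")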

  3. Combining cell-based hydrodynamics with hybrid particle-field simulations: efficient and realistic simulation of structuring dynamics.

    PubMed

    Sevink, G J A; Schmid, F; Kawakatsu, T; Milano, G

    2017-02-22

    We have extended an existing hybrid MD-SCF simulation technique that employs a coarsening step to enhance the computational efficiency of evaluating non-bonded particle interactions. This technique is conceptually equivalent to the single chain in mean-field (SCMF) method in polymer physics, in the sense that non-bonded interactions are derived from the non-ideal chemical potential in self-consistent field (SCF) theory, after a particle-to-field projection. In contrast to SCMF, however, MD-SCF evolves particle coordinates by the usual Newton's equation of motion. Since collisions are seriously affected by the softening of non-bonded interactions that originates from their evaluation at the coarser continuum level, we have devised a way to reinsert the effect of collisions on the structural evolution. Merging MD-SCF with multi-particle collision dynamics (MPCD), we mimic particle collisions at the level of computational cells and at the same time properly account for the momentum transfer that is important for a realistic system evolution. The resulting hybrid MD-SCF/MPCD method was validated for a particular coarse-grained model of phospholipids in aqueous solution, against reference full-particle simulations and the original MD-SCF model. We additionally implemented and tested an alternative and more isotropic finite difference gradient. Our results show that efficiency is improved by merging MD-SCF with MPCD, as properly accounting for hydrodynamic interactions considerably speeds up the phase separation dynamics, with negligible additional computational costs compared to efficient MD-SCF. This new method enables realistic simulations of large-scale systems that are needed to investigate the applications of self-assembled structures of lipids in nanotechnologies.
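    As a hedged, one-dimensional sketch of the particle-to-field projection step on which hybrid particle-field schemes of this kind rely (the grid size and the cloud-in-cell assignment below are illustrative assumptions, not the MD-SCF implementation):

        import numpy as np

        def particles_to_density(positions, box_length, n_cells):
            """Project particle coordinates onto a 1D density grid (cloud-in-cell).

            Each particle deposits its weight onto the two nearest cells in
            proportion to its distance from them; the resulting density field is
            what the self-consistent-field part of a hybrid particle-field scheme
            uses to evaluate non-bonded interactions.
            """
            dx = box_length / n_cells
            x = np.asarray(positions) / dx                 # positions in cell units
            left = np.floor(x).astype(int)
            frac = x - left
            density = np.zeros(n_cells)
            np.add.at(density, left % n_cells, 1.0 - frac)        # periodic box
            np.add.at(density, (left + 1) % n_cells, frac)
            return density / dx                            # number density per unit length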

  4. Manufacturing Methods and Technology Project Summary Reports

    DTIC Science & Technology

    1985-06-01

    Computer-Aided Design (CAD)/Computer-Aided Manufacturing (CAM) Process for the Production of Cold Forged Gears; Project 483 6121 - Robotic Welding and... Caliber Projectile Bodies; Project 682 8370 - Automatic Inspection and Process Control of Weapons Parts Manufacturing; METALS: Project 181 7285 - Cast... designed for use on each project. Experience suggested that a general purpose computer interface might be designed that could be used on any project

  5. Continents and Ocean Basins: Floaters and Sinkers. Crustal Evolution Education Project. Teacher's Guide [and] Student Investigation.

    ERIC Educational Resources Information Center

    Stoever, Edward C., Jr.

    Crustal Evolution Education Project (CEEP) modules were designed to: (1) provide students with the methods and results of continuing investigations into the composition, history, and processes of the earth's crust and the application of this knowledge to man's activities and (2) to be used by teachers with little or no previous background in the…

  6. Factors Shaping the Evolution of Electronic Documentation Systems. Research Activity No. IM.4.

    ERIC Educational Resources Information Center

    Dede, C. J.; And Others

    The first of 10 sections in this report focuses on factors that will affect the evolution of Space Station Project (SSP) documentation systems. The goal of this project is to prepare the space station technical and managerial structure for likely changes in the creation, capture, transfer, and utilization of knowledge about the space station which…

  7. Deep Sea Trenches and Radioactive Water. Crustal Evolution Education Project. Teacher's Guide [and] Student Investigation.

    ERIC Educational Resources Information Center

    Stoever, Edward C., Jr.

    Crustal Evolution Education Project (CEEP) modules were designed to: (1) provide students with the methods and results of continuing investigations into the composition, history, and processes of the earth's crust and the application of this knowledge to man's activities and (2) to be used by teachers with little or no previous background in the…

  8. When a Piece of a Continent Breaks Off. Crustal Evolution Education Project. Teacher's Guide [and] Student Investigation.

    ERIC Educational Resources Information Center

    Stoever, Edward C., Jr.

    Crustal Evolution Education Project (CEEP) modules were designed to: (1) provide students with the methods and results of continuing investigations into the composition, history, and processes of the earth's crust and the application of this knowledge to man's activities and (2) to be used by teachers with little or no previous background in the…

  9. Measuring Continental Drift: The Laser Ranging Experiment. Crustal Evolution Education Project. Teacher's Guide [and] Student Investigation.

    ERIC Educational Resources Information Center

    Stoever, Edward C., Jr.

    Crustal Evolution Education Project (CEEP) modules were designed to: (1) provide students with the methods and results of continuing investigations into the composition, history, and processes of the earth's crust and the application of this knowledge to man's activities and (2) to be used by teachers with little or no previous background in the…

  10. Drifting Continents and Wandering Poles. Crustal Evolution Education Project. Teacher's Guide [and] Student Investigation.

    ERIC Educational Resources Information Center

    Stoever, Edward C., Jr.

    Crustal Evolution Education Project (CEEP) modules were designed to: (1) provide students with the methods and results of continuing investigations into the composition, history, and processes of the earth's crust and the application of this knowledge to man's activities and (2) to be used by teachers with little or no previous background in the…

  11. A Sea-Floor Mystery: Mapping Polarity Reversals. Crustal Evolution Education Project. Teacher's Guide [and] Student Investigation.

    ERIC Educational Resources Information Center

    Stoever, Edward C., Jr.

    Crustal Evolution Education Project (CEEP) modules were designed to: (1) provide students with the methods and results of continuing investigations into the composition, history, and processes of the earth's crust and the application of this knowledge to man's activities and (2) to be used by teachers with little or no previous background in the…

  12. Movement of the Pacific Ocean Floor. Crustal Evolution Education Project. Teacher's Guide [and] Student Investigation.

    ERIC Educational Resources Information Center

    Stoever, Edward C., Jr.

    Crustal Evolution Education Project (CEEP) modules were designed to: (1) provide students with the methods and results of continuing investigations into the composition, history, and processes of the earth's crust and the application of this knowledge to man's activities and (2) to be used by teachers with little or no previous background in the…

  13. Hot Spots in the Earth's Crust. Crustal Evolution Education Project. Teacher's Guide [and] Student Investigation.

    ERIC Educational Resources Information Center

    Stoever, Edward C., Jr.

    Crustal Evolution Education Project (CEEP) modules were designed to: (1) provide students with the methods and results of continuing investigations into the composition, history, and processes of the earth's crust and the application of this knowledge to man's activities and (2) to be used by teachers with little or no previous background in the…

  14. Iceland: The Case of the Splitting Personality. Crustal Evolution Education Project. Teacher's Guide [and] Student Investigation.

    ERIC Educational Resources Information Center

    Stoever, Edward C., Jr.

    Crustal Evolution Education Project (CEEP) modules were designed to: (1) provide students with the methods and results of continuing investigations into the composition, history, and processes of the earth's crust and the application of this knowledge to man's activities and (2) to be used by teachers with little or no previous background in the…

  15. Plate Boundaries and Earthquake Prediction. Crustal Evolution Education Project. Teacher's Guide [and] Student Investigation.

    ERIC Educational Resources Information Center

    Stoever, Edward C., Jr.

    Crustal Evolution Education Project (CEEP) modules were designed to: (1) provide students with the methods and results of continuing investigations into the composition, history, and processes of the earth's crust and the application of this knowledge to man's activities and (2) to be used by teachers with little or no previous background in the…

  16. Plotting the Shape of the Ocean Floor. Crustal Evolution Education Project. Teacher's Guide [and] Student Investigation.

    ERIC Educational Resources Information Center

    Stoever, Edward C., Jr.

    Crustal Evolution Education Project (CEEP) modules were designed to: (1) provide students with the methods and results of continuing investigations into the composition, history, and processes of the earth's crust and the application of this knowledge to man's activities and (2) to be used by teachers with little or no previous background in the…

  17. Drifting Continents and Magnetic Fields. Crustal Evolution Education Project. Teacher's Guide [and] Student Investigation.

    ERIC Educational Resources Information Center

    Stoever, Edward C., Jr.

    Crustal Evolution Education Project (CEEP) modules were designed to: (1) provide students with the methods and results of continuing investigations into the composition, history, and processes of the earth's crust and the application of this knowledge to man's activities and (2) to be used by teachers with little or no previous background in the…

  18. Why Does Sea Level Change? Crustal Evolution Education Project. Teacher's Guide [and] Student Investigation.

    ERIC Educational Resources Information Center

    Stoever, Edward C., Jr.

    Crustal Evolution Education Project (CEEP) modules were designed to: (1) provide students with the methods and results of continuing investigations into the composition, history, and processes of the earth's crust and the application of this knowledge to man's activities and (2) to be used by teachers with little or no previous background in the…

  19. Evolution of the heteroharmonic strategy for target-range computation in the echolocation of Mormoopidae

    PubMed Central

    Mora, Emanuel C.; Macías, Silvio; Hechavarría, Julio; Vater, Marianne; Kössl, Manfred

    2013-01-01

    Echolocating bats use the time elapsed from biosonar pulse emission to the arrival of the echo (defined as the echo delay) to assess target distance. Target distance is represented in the brain by delay-tuned neurons that are classified as either “heteroharmonic” or “homoharmonic.” Heteroharmonic neurons respond more strongly to pulse-echo pairs in which the timing of the pulse is given by the fundamental biosonar harmonic while the timing of echoes is provided by one (or several) of the higher order harmonics. On the other hand, homoharmonic neurons are tuned to the echo delay between similar harmonics in the emitted pulse and echo. It is generally accepted that heteroharmonic computations are advantageous over homoharmonic computations; i.e., heteroharmonic neurons receive information from call and echo in different frequency bands, which helps to avoid jamming between pulse and echo signals. Heteroharmonic neurons have been found in two species of the family Mormoopidae (Pteronotus parnellii and Pteronotus quadridens) and in Rhinolophus rouxi. Recently, it was proposed that heteroharmonic target-range computations are a primitive feature of the genus Pteronotus that was preserved in the evolution of the genus. Here, we review recent findings on the evolution of echolocation in Mormoopidae, and try to link those findings to the evolution of the heteroharmonic computation strategy (HtHCS). We stress the hypothesis that the ability to perform heteroharmonic computations evolved separately from the ability to use long constant-frequency echolocation calls, high duty cycle echolocation, and Doppler Shift Compensation. Also, we present the idea that heteroharmonic computations might have been advantageous for categorizing prey size, hunting eared insects, and living in large conspecific colonies. We make five testable predictions that might help future investigations clarify the evolution of heteroharmonic echolocation in Mormoopidae and other families. PMID:23781209
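
    For orientation, the target-range computation itself is simple: range is half the pulse-echo delay multiplied by the speed of sound. A minimal illustration follows; the 340 m/s figure (speed of sound in air) and the 6 ms delay are assumed example values.

        # Range from echo delay: half the round-trip time multiplied by the speed of sound.
        def target_range(echo_delay_s, speed_of_sound=340.0):
            return speed_of_sound * echo_delay_s / 2.0

        print(target_range(0.006))   # a 6 ms echo delay corresponds to roughly 1 m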

  20. Computer Series, 101: Accurate Equations of State in Computational Chemistry Projects.

    ERIC Educational Resources Information Center

    Albee, David; Jones, Edward

    1989-01-01

    Discusses the use of computers in chemistry courses at the United States Military Academy. Provides two examples of computer projects: (1) equations of state, and (2) solving for molar volume. Presents BASIC and PASCAL listings for the second project. Lists 10 applications for physical chemistry. (MVL)

  1. A general-purpose development environment for intelligent computer-aided training systems

    NASA Technical Reports Server (NTRS)

    Savely, Robert T.

    1990-01-01

    Space station training will be a major task, requiring the creation of large numbers of simulation-based training systems for crew, flight controllers, and ground-based support personnel. Given the long duration of space station missions and the large number of activities supported by the space station, the extension of space shuttle training methods to space station training may prove to be impractical. The application of artificial intelligence technology to simulation training can provide the ability to deliver individualized training to large numbers of personnel in a distributed workstation environment. The principal objective of this project is the creation of a software development environment which can be used to build intelligent training systems for procedural tasks associated with the operation of the space station. Current NASA Johnson Space Center projects and joint projects with other NASA operational centers will result in specific training systems for existing space shuttle crew, ground support personnel, and flight controller tasks. Concurrently with the creation of these systems, a general-purpose development environment for intelligent computer-aided training systems will be built. Such an environment would permit the rapid production, delivery, and evolution of training systems for space station crew, flight controllers, and other support personnel. The widespread use of such systems will serve to preserve task and training expertise, support the training of many personnel in a distributed manner, and ensure the uniformity and verifiability of training experiences. As a result, significant reductions in training costs can be realized while safety and the probability of mission success can be enhanced.

  2. A Comprehensive Infrastructure for Big Data in Cancer Research: Accelerating Cancer Research and Precision Medicine

    PubMed Central

    Hinkson, Izumi V.; Davidsen, Tanja M.; Klemm, Juli D.; Chandramouliswaran, Ishwar; Kerlavage, Anthony R.; Kibbe, Warren A.

    2017-01-01

    Advancements in next-generation sequencing and other -omics technologies are accelerating the detailed molecular characterization of individual patient tumors, and driving the evolution of precision medicine. Cancer is no longer considered a single disease, but rather, a diverse array of diseases wherein each patient has a unique collection of germline variants and somatic mutations. Molecular profiling of patient-derived samples has led to a data explosion that could help us understand the contributions of environment and germline to risk, therapeutic response, and outcome. To maximize the value of these data, an interdisciplinary approach is paramount. The National Cancer Institute (NCI) has initiated multiple projects to characterize tumor samples using multi-omic approaches. These projects harness the expertise of clinicians, biologists, computer scientists, and software engineers to investigate cancer biology and therapeutic response in multidisciplinary teams. Petabytes of cancer genomic, transcriptomic, epigenomic, proteomic, and imaging data have been generated by these projects. To address the data analysis challenges associated with these large datasets, the NCI has sponsored the development of the Genomic Data Commons (GDC) and three Cloud Resources. The GDC ensures data and metadata quality, ingests and harmonizes genomic data, and securely redistributes the data. During its pilot phase, the Cloud Resources tested multiple cloud-based approaches for enhancing data access, collaboration, computational scalability, resource democratization, and reproducibility. These NCI-led efforts are continuously being refined to better support open data practices and precision oncology, and to serve as building blocks of the NCI Cancer Research Data Commons. PMID:28983483

  3. G-Anchor: a novel approach for whole-genome comparative mapping utilizing evolutionary conserved DNA sequences.

    PubMed

    Lenis, Vasileios Panagiotis E; Swain, Martin; Larkin, Denis M

    2018-05-01

    Cross-species whole-genome sequence alignment is a critical first step for genome comparative analyses, ranging from the detection of sequence variants to studies of chromosome evolution. Animal genomes are large and complex, and whole-genome alignment is a computationally intensive process, requiring expensive high-performance computing systems due to the need to explore extensive local alignments. With hundreds of sequenced animal genomes available from multiple projects, there is an increasing demand for genome comparative analyses. Here, we introduce G-Anchor, a new, fast, and efficient pipeline that uses a strictly limited but highly effective set of local sequence alignments to anchor (or map) an animal genome to another species' reference genome. G-Anchor makes novel use of a databank of highly conserved DNA sequence elements. We demonstrate how these elements may be aligned to a pair of genomes, creating anchors. These anchors enable the rapid mapping of scaffolds from a de novo assembled genome to chromosome assemblies of a reference species. Our results demonstrate that G-Anchor can successfully anchor a vertebrate genome onto a phylogenetically related reference species genome using a desktop or laptop computer within a few hours and with comparable accuracy to that achieved by a highly accurate whole-genome alignment tool such as LASTZ. G-Anchor thus makes whole-genome comparisons accessible to researchers with limited computational resources. G-Anchor is a ready-to-use tool for anchoring a pair of vertebrate genomes. It may be used with large genomes that contain a significant fraction of evolutionarily conserved DNA sequences and that are not highly repetitive, polyploid, or excessively fragmented. G-Anchor is not a substitute for whole-genome alignment software but can be used for fast and accurate initial genome comparisons. G-Anchor is freely available for the pairwise comparison of two genomes.
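
    As a toy illustration of the anchoring idea (and not of the G-Anchor pipeline itself), the sketch below assigns each scaffold of a de novo assembly to the reference chromosome that collects the majority of its conserved-element anchor hits; the input format and all names are assumed.

        # Toy scaffold-to-chromosome assignment by majority vote over anchor hits.
        from collections import Counter, defaultdict

        def assign_scaffolds(anchor_hits):
            """anchor_hits: iterable of (anchor_id, scaffold, reference_chromosome) tuples."""
            votes = defaultdict(Counter)
            for _, scaffold, chrom in anchor_hits:
                votes[scaffold][chrom] += 1
            return {scaffold: counts.most_common(1)[0][0] for scaffold, counts in votes.items()}

        hits = [("cne1", "scaffold_12", "chr4"), ("cne2", "scaffold_12", "chr4"),
                ("cne3", "scaffold_12", "chr9"), ("cne4", "scaffold_7", "chr1")]
        print(assign_scaffolds(hits))   # {'scaffold_12': 'chr4', 'scaffold_7': 'chr1'}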

  4. The Open Science Grid - Support for Multi-Disciplinary Team Science - the Adolescent Years

    NASA Astrophysics Data System (ADS)

    Bauerdick, Lothar; Ernst, Michael; Fraser, Dan; Livny, Miron; Pordes, Ruth; Sehgal, Chander; Würthwein, Frank; Open Science Grid

    2012-12-01

    As it enters adolescence the Open Science Grid (OSG) is bringing a maturing fabric of Distributed High Throughput Computing (DHTC) services that supports an expanding HEP community to an increasingly diverse spectrum of domain scientists. Working closely with researchers on campuses throughout the US and in collaboration with national cyberinfrastructure initiatives, we transform their computing environment through new concepts, advanced tools and deep experience. We discuss examples of these including: the pilot-job overlay concepts and technologies now in use throughout OSG and delivering 1.4 Million CPU hours/day; the role of campus infrastructures- built out from concepts of sharing across multiple local faculty clusters (made good use of already by many of the HEP Tier-2 sites in the US); the work towards the use of clouds and access to high throughput parallel (multi-core and GPU) compute resources; and the progress we are making towards meeting the data management and access needs of non-HEP communities with general tools derived from the experience of the parochial tools in HEP (integration of Globus Online, prototyping with IRODS, investigations into Wide Area Lustre). We will also review our activities and experiences as HTC Service Provider to the recently awarded NSF XD XSEDE project, the evolution of the US NSF TeraGrid project, and how we are extending the reach of HTC through this activity to the increasingly broad national cyberinfrastructure. We believe that a coordinated view of the HPC and HTC resources in the US will further expand their impact on scientific discovery.

  5. Using evolutionary computations to understand the design and evolution of gene and cell regulatory networks.

    PubMed

    Spirov, Alexander; Holloway, David

    2013-07-15

    This paper surveys modeling approaches for studying the evolution of gene regulatory networks (GRNs). Modeling of the design or 'wiring' of GRNs has become increasingly common in developmental and medical biology, as a means of quantifying gene-gene interactions, the response to perturbations, and the overall dynamic motifs of networks. Drawing from developments in GRN 'design' modeling, a number of groups are now using simulations to study how GRNs evolve, both for comparative genomics and to uncover general principles of evolutionary processes. Such work can generally be termed evolution in silico. Complementary to these biologically-focused approaches, a now well-established field of computer science is Evolutionary Computations (ECs), in which highly efficient optimization techniques are inspired from evolutionary principles. In surveying biological simulation approaches, we discuss the considerations that must be taken with respect to: (a) the precision and completeness of the data (e.g. are the simulations for very close matches to anatomical data, or are they for more general exploration of evolutionary principles); (b) the level of detail to model (we proceed from 'coarse-grained' evolution of simple gene-gene interactions to 'fine-grained' evolution at the DNA sequence level); (c) to what degree is it important to include the genome's cellular context; and (d) the efficiency of computation. With respect to the latter, we argue that developments in computer science EC offer the means to perform more complete simulation searches, and will lead to more comprehensive biological predictions. Copyright © 2013 Elsevier Inc. All rights reserved.
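
    As a minimal, coarse-grained example of the evolution-in-silico approaches surveyed above, the sketch below evolves a population of gene-gene interaction matrices so that simple sigmoidal network dynamics reproduce a target expression pattern. Network size, dynamics, mutation rate and selection scheme are all illustrative assumptions rather than any particular published model.

        # A coarse-grained evolution-in-silico toy for gene regulatory networks.
        import numpy as np

        rng = np.random.default_rng(0)
        N_GENES, POP, GENERATIONS = 4, 40, 200
        target = np.array([1.0, 0.0, 1.0, 0.0])           # desired steady-state expression

        def expression(W, steps=30):
            x = np.full(N_GENES, 0.5)
            for _ in range(steps):                         # iterate the sigmoidal GRN dynamics
                x = 1.0 / (1.0 + np.exp(-W @ x))
            return x

        def fitness(W):
            return -np.sum((expression(W) - target) ** 2)  # closer to the target = fitter

        population = [rng.normal(0.0, 1.0, (N_GENES, N_GENES)) for _ in range(POP)]
        for _ in range(GENERATIONS):
            parents = sorted(population, key=fitness, reverse=True)[:POP // 4]   # truncation selection
            population = [p + rng.normal(0.0, 0.1, p.shape) for p in parents for _ in range(4)]

        print(np.round(expression(max(population, key=fitness)), 2))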

  6. The fast debris evolution model

    NASA Astrophysics Data System (ADS)

    Lewis, H. G.; Swinerd, G. G.; Newland, R. J.; Saunders, A.

    2009-09-01

    The 'particles-in-a-box' (PIB) model introduced by Talent [Talent, D.L. Analytic model for orbital debris environmental management. J. Spacecraft Rocket, 29 (4), 508-513, 1992.] removed the need for computer-intensive Monte Carlo simulation to predict the gross characteristics of an evolving debris environment. The PIB model was described using a differential equation that allows the stability of the low Earth orbit (LEO) environment to be tested by a straightforward analysis of the equation's coefficients. As part of an ongoing research effort to investigate more efficient approaches to evolutionary modelling and to develop a suite of educational tools, a new PIB model has been developed. The model, entitled Fast Debris Evolution (FADE), employs a first-order differential equation to describe the rate at which new objects ⩾10 cm are added to and removed from the environment. Whilst Talent [Talent, D.L. Analytic model for orbital debris environmental management. J. Spacecraft Rocket, 29 (4), 508-513, 1992.] based the collision theory for the PIB approach on collisions between gas particles and adopted specific values for the parameters of the model from a number of references, the form and coefficients of the FADE model equations can be inferred from the outputs of future projections produced by high-fidelity models, such as the DAMAGE model. The FADE model has been implemented as a client-side, web-based service using JavaScript embedded within an HTML document. Due to the simple nature of the algorithm, FADE can deliver the results of future projections immediately in a graphical format, with complete user control over key simulation parameters. Historical and future projections for the ⩾10 cm LEO debris environment under a variety of different scenarios are possible, including business as usual, no future launches, post-mission disposal and remediation. A selection of results is presented, with comparisons against predictions made using the DAMAGE environment model. The results demonstrate that the FADE model is able to reproduce time series of collisions and object numbers comparable to those predicted by DAMAGE in several scenarios. Further, and perhaps more importantly, its speed and flexibility allow the user to explore and understand the evolution of the space debris environment.
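
    To make the structure of such a model concrete, the toy integration below advances a particles-in-a-box style equation of the form dN/dt = A + B*N + C*N^2, in which A represents net deposition (launches), B*N removal processes such as drag and post-mission disposal, and C*N^2 collision-generated debris. The coefficient values are purely illustrative and are not the FADE or DAMAGE parameters.

        # Toy 'particles-in-a-box' style projection of the >=10 cm population.
        def project_population(N0=12000.0, A=300.0, B=-0.02, C=1.0e-7, years=100.0, dt=0.1):
            N, t = N0, 0.0
            while t < years:
                dNdt = A + B * N + C * N * N
                N += dNdt * dt                 # forward-Euler step
                t += dt
            return N

        print(project_population())            # population after 100 years under these assumed coefficients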

  7. Teaching and Training in Geoinformatics: Experiences from the Cyberinfrastructure Summer Institute for Geoscientists (CSIG)

    NASA Astrophysics Data System (ADS)

    Smeekens, M.; Baru, C.; Keller, G. R.; Arrowsmith, R.; Crosby, C. J.

    2009-12-01

    The Cyberinfrastructure Summer Institute for Geoscientists (CSIG) has been conducted each year since 2004 under sponsorship of the GEON project that is funded by the NSF. The goal of the institute, which is broadly advertised to the Geoscience community, is to introduce geoscientists to Computer Science concepts and commonly-used as well as emergent information technology tools. The week-long program originally covered topics ranging from Data Modeling, Web Services, and Geographic Information Systems, to brief introductions to key concepts in Grid Computing, Parallel Programming, and Scientific Workflows. However, the program as well as the composition and expectations of the audience have evolved over time. Detailed course and instructor evaluations provide valuable feedback on course content and presentation approaches, and are used to plan the future CSIG curriculum. From an initial emphasis on Geoscience graduate students and postdocs, the selection process has evolved to encourage participation by individuals with backgrounds in Geoscience as well as Computer Science from academia, government agencies, and industry. More recently, there has been an emphasis on selecting junior faculty and those interested in teaching Geoinformatics courses. While the initial objective of CSIG was to provide an overview of information technology topics via lectures and demonstrations, over time attendees have become more interested in specific instruction in how informatics and cyberinfrastructure (CI) capabilities could be utilized to address issues in Earth Science research and education. There have been requests over the years for more in-depth coverage on some topics and hands-on exercises. The program has now evolved to include a “Build Track”, focused on IT issues related to the development and implementation of Geoinformatics systems, and an “Education Track”, focused on use of Geoinformatics resources in education. With increasing awareness of CI projects, the audience is also becoming more interested in an introduction to the broader landscape of CI activities in the Geosciences and related areas. In the future, we plan a “demo” session to showcase various CI projects. Attendees will not only hear about such projects but will be able to use and experience the cyber-environments and tools in a hands-on session. The evolution of the CSIG program reflects major changes in the IT landscape since 2004. Where we once discussed Grid Computing, students are now learning about Cloud Computing and related concepts. An institute like CSIG plays an important role in providing “cross-training” such that geoscientists gain insight into IT issues and solution approaches, while computer scientists gain a better appreciation of the needs and requirements of geoscience applications. In this presentation, we will summarize and analyze the trends over the years in the program as well as in audience composition; discuss lessons learnt over the years; and present our plan for future CSIG offerings.

  8. Has computational creativity successfully made it "Beyond the Fence" in musical theatre?

    NASA Astrophysics Data System (ADS)

    Jordanous, Anna

    2017-10-01

    A significant test for software is to task it with replicating human performance, as done recently with creative software and the commercial project Beyond the Fence (undertaken for a television documentary Computer Says Show). The remit of this project was to use computer software as much as possible to produce "the world's first computer-generated musical". Several creative systems were used to generate this musical, which was performed in London's West End in 2016. This paper considers the challenge of evaluating this project. Current computational creativity evaluation methods are ill-suited to evaluating projects that involve creative input from multiple systems and people. Following recent inspiration within computational creativity research from interaction design, here the DECIDE evaluation framework is applied to evaluate the Beyond the Fence project. Evaluation finds that the project was reasonably successful at achieving the task of using computational generation to produce a credible musical. Lessons have been learned for future computational creativity projects though, particularly for affording creative software more agency and enabling software to interact with other creative partners. Upon reflection, the DECIDE framework emerges as a useful evaluation "checklist" (if not a tangible operational methodology) for evaluating multiple creative systems participating in a creative task.

  9. AN OVERVIEW OF REDUCED ORDER MODELING TECHNIQUES FOR SAFETY APPLICATIONS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mandelli, D.; Alfonsi, A.; Talbot, P.

    2016-10-01

    The RISMC project is developing new advanced simulation-based tools to perform Computational Risk Analysis (CRA) for the existing fleet of U.S. nuclear power plants (NPPs). These tools numerically model not only the thermal-hydraulic behavior of the reactor's primary and secondary systems, but also external event temporal evolution and component/system ageing. Thus, this is not only a multi-physics problem but also a multi-scale one (both spatial, µm-mm-m, and temporal, seconds-hours-years). As part of the RISMC CRA approach, a large number of computationally expensive simulation runs may be required. An important aspect is that even though computational power is growing, the overall computational cost of a RISMC analysis using brute-force methods may not be viable in certain cases. A solution being evaluated to address this computational issue is the use of reduced order modeling techniques. During FY2015, we investigated and applied reduced order modeling techniques to decrease the RISMC analysis computational cost by decreasing the number of simulation runs; for this improvement we used surrogate models instead of the actual simulation codes. This article focuses on the use of reduced order modeling techniques that can be applied to RISMC analyses in order to generate, analyze, and visualize data. In particular, we focus on surrogate models that approximate the simulation results but in a much shorter time (microseconds instead of hours/days).
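
    The sketch below illustrates the surrogate-model idea in its simplest form: a handful of expensive simulation runs are used to fit a cheap approximation, which is then evaluated thousands of times in place of the full code. The stand-in "expensive" function, the polynomial form and all parameter choices are illustrative assumptions, not elements of the RISMC toolset.

        # Surrogate-model sketch: fit a cheap approximation to a few expensive runs, then query it.
        import numpy as np

        def expensive_simulation(x):                    # placeholder for an hours-long physics run
            return np.sin(3.0 * x) + 0.5 * x

        train_x = np.linspace(0.0, 2.0, 8)              # a small number of training runs
        train_y = expensive_simulation(train_x)

        surrogate = np.poly1d(np.polyfit(train_x, train_y, deg=5))   # cheap polynomial surrogate

        query = np.linspace(0.0, 2.0, 10000)            # evaluated in microseconds, not hours/days
        print(np.max(np.abs(surrogate(query) - expensive_simulation(query))))   # worst-case surrogate error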

  10. Geometry of Quantum Computation with Qudits

    PubMed Central

    Luo, Ming-Xing; Chen, Xiu-Bo; Yang, Yi-Xian; Wang, Xiaojun

    2014-01-01

    The circuit complexity of quantum qubit system evolution, a primitive problem in quantum computation, has been discussed widely. We investigate this problem for qudit systems. Using Riemannian geometry, we show that the optimal quantum circuits are equivalent to geodesic evolutions in a specially curved parametrization of SU(d^n), and that the quantum circuit complexity depends explicitly on a controllable approximation error bound. PMID:24509710

  11. Numerical study of the vortex tube reconnection using vortex particle method on many graphics cards

    NASA Astrophysics Data System (ADS)

    Kudela, Henryk; Kosior, Andrzej

    2014-08-01

    Vortex Particle Methods are among the most convenient ways of tracking the evolution of vorticity. In this article we present a numerical recreation of a real-life experiment concerning the head-on collision of two vortex rings. In the experiment, the evolution and reconnection of the vortex structures are tracked with passive markers (paint particles), which in a viscous fluid do not follow the evolution of the vorticity field. In the numerical computations we show the difference between the vorticity evolution and the movement of the passive markers; the agreement with the experiment was very good. Because computation times on a single processor were very long, the Vortex-in-Cell (VIC) method was implemented on the multicore architecture of graphics cards (GPUs). Vortex Particle Methods are very well suited to parallel computation: since there are myriads of particles in the flow and the same equations of motion have to be solved for each of them, the SIMD architecture used in GPUs is a natural fit. The main disadvantage is the limited amount of RAM, and to overcome this we created a multi-GPU implementation of the VIC method. Some remarks on parallel computing are given in the article.
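
    For reference, vortex particle methods follow the vorticity field ω = ∇ × u, whose evolution for an incompressible viscous flow is governed by the standard vorticity transport equation (quoted here for context, not taken from the article):

        \frac{\partial \boldsymbol{\omega}}{\partial t} + (\mathbf{u}\cdot\nabla)\,\boldsymbol{\omega} = (\boldsymbol{\omega}\cdot\nabla)\,\mathbf{u} + \nu\,\nabla^{2}\boldsymbol{\omega}

    where the stretching/tilting term (ω·∇)u drives the reconnection dynamics discussed above and ν is the kinematic viscosity.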

  12. Development of efficient time-evolution method based on three-term recurrence relation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Akama, Tomoko, E-mail: a.tomo---s-b-l-r@suou.waseda.jp; Kobayashi, Osamu; Nanbu, Shinkoh, E-mail: shinkoh.nanbu@sophia.ac.jp

    The advantage of the real-time (RT) propagation method is a direct solution of the time-dependent Schrödinger equation, which describes frequency properties as well as the full dynamics of a molecular system composed of electrons and nuclei in quantum physics and chemistry. Its applications have been limited by computational feasibility, as the evaluation of the time-evolution operator is computationally demanding. In this article, a new efficient time-evolution method based on the three-term recurrence relation (3TRR) was proposed to reduce the time-consuming numerical procedure. The basic formula of this approach was derived by introducing a transformation of the operator using the arcsine function. Since this operator transformation also transforms time, we derived the relation between the original and transformed time variables. The formula was applied to the RT time-dependent Hartree-Fock (RT-TDHF) method and to time-dependent density functional theory to assess its performance. Compared to the commonly used fourth-order Runge-Kutta method, our new approach decreased the computational time of the RT-TDHF calculation by about a factor of four, showing the 3TRR formula to be an efficient time-evolution method for reducing computational cost.
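
    For context, the best-known time propagator built on a three-term recurrence is the Chebyshev expansion of the evolution operator (shown in atomic units); it is quoted only to illustrate the general structure, since the 3TRR formula of the article is derived from a different, arcsine-based operator transformation:

        e^{-i\hat{H}t}\,\psi(0) \approx e^{-i\left(E_{\min}+\Delta E/2\right)t} \sum_{k=0}^{K} (2-\delta_{k0})\,(-i)^{k} J_{k}\!\left(\tfrac{\Delta E\,t}{2}\right) T_{k}\big(\hat{H}_{\mathrm{norm}}\big)\,\psi(0),
        \qquad \hat{H}_{\mathrm{norm}} = \frac{2\hat{H} - (E_{\max}+E_{\min})\,\hat{1}}{\Delta E}, \quad \Delta E = E_{\max}-E_{\min},

    where J_k are Bessel functions and the terms are generated by the three-term recurrence T_{k+1}(\hat{H}_{\mathrm{norm}})\psi = 2\hat{H}_{\mathrm{norm}} T_{k}(\hat{H}_{\mathrm{norm}})\psi - T_{k-1}(\hat{H}_{\mathrm{norm}})\psi, so each additional order costs a single matrix-vector product.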

  13. An Equation-Free Reduced-Order Modeling Approach to Tropical Pacific Simulation

    NASA Astrophysics Data System (ADS)

    Wang, Ruiwen; Zhu, Jiang; Luo, Zhendong; Navon, I. M.

    2009-03-01

    The “equation-free” (EF) method is often used for complex, multi-scale problems. In such cases it would be necessary to know the closed form of the evolution equations for the macroscopic variables of interest; conceptually such equations exist, but they are not available in closed form. The EF method bypasses this difficulty by obtaining macroscopic information from models run at the microscopic level. Given an initial macroscopic variable, lifting provides an associated microscopic state, which may be evolved using Direct Numerical Simulations (DNS); restriction then recovers the required macroscopic information, and projective integration yields the desired quantities. In this paper we apply the EF POD-assisted method to the reduced modeling of a large-scale upper-ocean circulation in the tropical Pacific domain. The computational cost is reduced dramatically. Compared with the POD method, the method provided more accurate results and did not require the availability of any explicit equations or the right-hand side (RHS) of the evolution equation.
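
    The lift-evolve-restrict-project cycle can be sketched schematically as below; the "microscopic" model here is a trivial stand-in ensemble, whereas in the paper it would be the full ocean model and the macroscopic variables would be the POD coefficients. All numbers are illustrative.

        # Schematic lift-evolve-restrict-project (projective integration) cycle.
        import numpy as np

        rng = np.random.default_rng(1)

        def lift(macro, n_micro=500):
            return macro + 0.1 * rng.normal(size=n_micro)     # microscopic ensemble consistent with macro state

        def micro_step(ensemble, dt=0.01):
            return ensemble - 0.5 * ensemble * dt             # stand-in microscopic dynamics

        def restrict(ensemble):
            return ensemble.mean()                            # macroscopic observable

        def projective_step(macro, burst_steps=10, dt=0.01, big_dt=0.5):
            ensemble = lift(macro)
            a = restrict(ensemble)
            for _ in range(burst_steps):                      # short microscopic burst
                ensemble = micro_step(ensemble, dt)
            b = restrict(ensemble)
            slope = (b - a) / (burst_steps * dt)              # estimated macroscopic time derivative
            return b + slope * big_dt                         # large projective (extrapolation) step

        macro = 1.0
        for _ in range(20):
            macro = projective_step(macro)
        print(macro)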

  14. Factors shaping the evolution of electronic documentation systems

    NASA Technical Reports Server (NTRS)

    Dede, Christopher J.; Sullivan, Tim R.; Scace, Jacque R.

    1990-01-01

    The main goal is to prepare the space station technical and managerial structure for likely changes in the creation, capture, transfer, and utilization of knowledge. By anticipating advances, the design of Space Station Project (SSP) information systems can be tailored to facilitate a progression of increasingly sophisticated strategies as the space station evolves. Future generations of advanced information systems will use increases in power to deliver environmentally meaningful, contextually targeted, interconnected data (knowledge). The concept of a Knowledge Base Management System is emerging when the problem is focused on how information systems can perform such a conversion of raw data. Such a system would include traditional management functions for large space databases. Added artificial intelligence features might encompass co-existing knowledge representation schemes; effective control structures for deductive, plausible, and inductive reasoning; means for knowledge acquisition, refinement, and validation; explanation facilities; and dynamic human intervention. The major areas covered include: alternative knowledge representation approaches; advanced user interface capabilities; computer-supported cooperative work; the evolution of information system hardware; standardization, compatibility, and connectivity; and organizational impacts of information intensive environments.

  15. Spheroidal Populated Star Systems

    NASA Astrophysics Data System (ADS)

    Angeletti, Lucio; Giannone, Pietro

    2008-10-01

    Globular clusters and low-ellipticity early-type galaxies can be treated as systems populated by a large number of stars whose structures can be schematized as spherically symmetric. Studies of these systems profit from the synthesis of stellar populations. The computation of synthetic models makes use of various contributions from stellar evolution and stellar dynamics. In the first sections of the paper we present a short review of our results on the occurrence of galactic winds in star systems ranging from globular clusters to elliptical galaxies, and on the dynamical evolution of a typical massive globular cluster. In the subsequent sections we describe our approach to the problem of the stellar populations in elliptical galaxies. The projected radial behaviours of spectro-photometric indices for a sample of eleven galaxies are compared with preliminary model results. The best agreement between observation and theory shows that the sample galaxies exhibit a certain degree of heterogeneity: the gas energy dissipation varies from moderate to large, the metal yield ranges from solar to significantly oversolar, and the velocity dispersion is isotropic in most cases and anisotropic in the remaining instances.

  16. NASA's Advanced Multimission Operations System: A Case Study in Formalizing Software Architecture Evolution

    NASA Technical Reports Server (NTRS)

    Barnes, Jeffrey M.

    2011-01-01

    All software systems of significant size and longevity eventually undergo changes to their basic architectural structure. Such changes may be prompted by evolving requirements, changing technology, or other reasons. Whatever the cause, software architecture evolution is commonplace in real world software projects. Recently, software architecture researchers have begun to study this phenomenon in depth. However, this work has suffered from problems of validation; research in this area has tended to make heavy use of toy examples and hypothetical scenarios and has not been well supported by real world examples. To help address this problem, I describe an ongoing effort at the Jet Propulsion Laboratory to re-architect the Advanced Multimission Operations System (AMMOS), which is used to operate NASA's deep-space and astrophysics missions. Based on examination of project documents and interviews with project personnel, I describe the goals and approach of this evolution effort and then present models that capture some of the key architectural changes. Finally, I demonstrate how approaches and formal methods from my previous research in architecture evolution may be applied to this evolution, while using languages and tools already in place at the Jet Propulsion Laboratory.

  17. Hybrid RANS-LES using high order numerical methods

    NASA Astrophysics Data System (ADS)

    Henry de Frahan, Marc; Yellapantula, Shashank; Vijayakumar, Ganesh; Knaus, Robert; Sprague, Michael

    2017-11-01

    Understanding the impact of wind turbine wake dynamics on downstream turbines is particularly important for the design of efficient wind farms. Due to their tractable computational cost, hybrid RANS/LES models are an attractive framework for simulating separation flows such as the wake dynamics behind a wind turbine. High-order numerical methods can be computationally efficient and provide increased accuracy in simulating complex flows. In the context of LES, high-order numerical methods have shown some success in predictions of turbulent flows. However, the specifics of hybrid RANS-LES models, including the transition region between both modeling frameworks, pose unique challenges for high-order numerical methods. In this work, we study the effect of increasing the order of accuracy of the numerical scheme in simulations of canonical turbulent flows using RANS, LES, and hybrid RANS-LES models. We describe the interactions between filtering, model transition, and order of accuracy and their effect on turbulence quantities such as kinetic energy spectra, boundary layer evolution, and dissipation rate. This work was funded by the U.S. Department of Energy, Exascale Computing Project, under Contract No. DE-AC36-08-GO28308 with the National Renewable Energy Laboratory.

  18. OPserver: opacities and radiative accelerations on demand

    NASA Astrophysics Data System (ADS)

    Mendoza, C.; González, J.; Seaton, M. J.; Buerger, P.; Bellorín, A.; Meléndez, M.; Rodríguez, L. S.; Delahaye, F.; Zeippen, C. J.; Palacios, E.; Pradhan, A. K.

    2009-05-01

    We report on developments carried out within the Opacity Project (OP) to upgrade atomic database services to comply with e-infrastructure requirements. We give a detailed description of an interactive, online server for astrophysical opacities, referred to as OPserver, to be used in sophisticated stellar modelling where Rosseland mean opacities and radiative accelerations are computed at every depth point and each evolution cycle. This is crucial, for instance, in chemically peculiar stars and in the exploitation of the new asteroseismological data. OPserver, downloadable with the new OPCD_3.0 release from the Centre de Données Astronomiques de Strasbourg, France, computes mean opacities and radiative data for arbitrary chemical mixtures from the OP monochromatic opacities. It is essentially a client-server network restructuring and optimization of the suite of codes included in the earlier OPCD_2.0 release. The server can be installed locally or, alternatively, accessed remotely from the Ohio Supercomputer Center, Columbus, Ohio, USA. The client is an interactive web page or a subroutine library that can be linked to the user code. The suitability of this scheme in grid computing environments is emphasized, and its extension to other atomic database services for astrophysical purposes is discussed.

  19. The Evolution of a Science Project: A Preliminary System Dynamics Model of a Recurring Software-Reliant Acquisition Behavior

    DTIC Science & Technology

    2012-07-01

    Excerpted index entries from the report: 3.3.4 User Community Management; 3.3.5 Uncontrolled Prototype Growth; 3.3.6 Project Manager Decisions; 3.3.7 The 90% Syndrome; 3.3.8 Re...; Figure 3: 90% Syndrome Due to Rippling Rework in the Production Development; Figure 4: Causal Loop Diagram of "The Evolution of a Science Project"...; Unintended Burnout Due to Overtime (CMU/SEI-2012-TR-001).

  20. Is the addition of an assisted driving Hamiltonian always useful for adiabatic evolution?

    NASA Astrophysics Data System (ADS)

    Sun, Jie; Lu, Songfeng; Li, Li

    2017-04-01

    It is known that when an assisted driving term is added to the main system Hamiltonian, the efficiency of the resulting adiabatic evolution can be significantly improved. In some special cases, only by adding an assisted driving Hamiltonian can the resulting adiabatic evolution be prevented from failing. The additional driving Hamiltonian thus plays an important role in adiabatic computing. In this paper, we show that if the driving Hamiltonian is chosen inappropriately, the adiabatic computation may still fail. More importantly, we find that the adiabatic computation can only succeed if the assisted driving Hamiltonian has a relatively fixed form. This may help explain why, in the related literature, all of the driving Hamiltonians used share the same form.
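
    For orientation, assisted adiabatic evolutions in the literature commonly interpolate between a beginning Hamiltonian H_B and the problem Hamiltonian H_P while switching an extra driving term on and off at intermediate times, for example

        H(s) = (1-s)\,H_{B} + s\,H_{P} + s(1-s)\,H_{D}, \qquad s = t/T \in [0,1],

    so that H_D vanishes at the start and end of the evolution. This generic form is quoted only for context and is not claimed to be the specific form whose necessity is established in the article.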

  1. Projects Using a Computer Algebra System in First-Year Undergraduate Mathematics

    ERIC Educational Resources Information Center

    Rosenzweig, Martin

    2007-01-01

    This paper illustrates the use of computer-based projects in two one-semester first-year undergraduate mathematics classes. Developed over a period of years, the approach is one in which the classes are organised into work-groups, with computer-based projects being undertaken periodically to illustrate the class material. These projects are…

  2. ComputerTown. A Do-It-Yourself Community Computer Project.

    ERIC Educational Resources Information Center

    Loop, Liza; And Others

    This manual based on Menlo Park's experiences in participating in a nationwide experimental computer literacy project provides guidelines for the development and management of a ComputerTown. This project, which was begun in September 1981 in the Menlo Park Public Library, concentrates on providing public access to microcomputers through hands-on…

  3. The diversity and evolution of ecological and environmental citizen science.

    PubMed

    Pocock, Michael J O; Tweddle, John C; Savage, Joanna; Robinson, Lucy D; Roy, Helen E

    2017-01-01

    Citizen science (the involvement of volunteers in data collection, analysis and interpretation) simultaneously supports research and public engagement with science, and its profile is rapidly rising. Citizen science represents a diverse range of approaches, but until now this diversity has not been quantitatively explored. We conducted a systematic internet search and discovered 509 environmental and ecological citizen science projects. We scored each project for 32 attributes based on publicly obtainable information and used multiple factor analysis to summarise this variation and assess citizen science approaches. We found that projects varied according to their methodological approach from 'mass participation' (e.g. easy participation by anyone anywhere) to 'systematic monitoring' (e.g. trained volunteers repeatedly sampling at specific locations). They also varied in complexity from approaches that are 'simple' to those that are 'elaborate' (e.g. providing lots of support to gather rich, detailed datasets). There was a separate cluster of entirely computer-based projects but, in general, we found that the range of citizen science projects in ecology and the environment showed continuous variation and cannot be neatly categorised into distinct types of activity. While the diversity of projects begun in each time period (pre 1990, 1990-99, 2000-09 and 2010-13) has not increased, we found that projects tended to become increasingly different from each other as time progressed (possibly due to changing opportunities, including technological innovation). Most projects were still active, and consequently the overall diversity of active projects (available for participation) increased as time progressed. Overall, understanding the landscape of citizen science in ecology and the environment (and its change over time) is valuable because it informs the comparative evaluation of the 'success' of different citizen science approaches. Comparative evaluation provides an evidence base to inform the future development of citizen science activities.

  4. Using palaeoclimate data to improve models of the Antarctic Ice Sheet

    NASA Astrophysics Data System (ADS)

    Phipps, Steven; King, Matt; Roberts, Jason; White, Duanne

    2017-04-01

    Ice sheet models are the most descriptive tools available to simulate the future evolution of the Antarctic Ice Sheet (AIS), including its contribution towards changes in global sea level. However, our knowledge of the dynamics of the coupled ice-ocean-lithosphere system is inevitably limited, in part due to a lack of observations. Furthermore, to build computationally efficient models that can be run for multiple millennia, it is necessary to use simplified descriptions of ice dynamics. Ice sheet modelling is therefore an inherently uncertain exercise. The past evolution of the AIS provides an opportunity to constrain the description of physical processes within ice sheet models and, therefore, to constrain our understanding of the role of the AIS in driving changes in global sea level. We use the Parallel Ice Sheet Model (PISM) to demonstrate how palaeoclimate data can improve our ability to predict the future evolution of the AIS. A 50-member perturbed-physics ensemble is generated, spanning uncertainty in the parameterisations of three key physical processes within the model: (i) the stress balance within the ice sheet, (ii) basal sliding and (iii) calving of ice shelves. A Latin hypercube approach is used to optimally sample the range of uncertainty in parameter values. This perturbed-physics ensemble is used to simulate the evolution of the AIS from the Last Glacial Maximum (approximately 21,000 years ago) to present. Palaeoclimate records are then used to determine which ensemble members are the most realistic. This allows us to use data on past climates to directly constrain our understanding of the past contribution of the AIS towards changes in global sea level. Critically, it also allows us to determine which ensemble members are likely to generate the most realistic projections of the future evolution of the AIS.
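
    The sketch below shows the Latin hypercube idea used to build such an ensemble: each parameter range is split into n equal-probability strata, one sample is drawn per stratum, and the strata are randomly permuted across parameters so that every one-dimensional margin is covered evenly. The parameter names and ranges are illustrative placeholders, not the PISM settings used in the study.

        # Latin hypercube sampling sketch for a perturbed-physics ensemble.
        import numpy as np

        def latin_hypercube(n_samples, bounds, seed=42):
            """bounds: dict name -> (low, high); returns an (n_samples, n_params) array."""
            rng = np.random.default_rng(seed)
            k = len(bounds)
            strata = np.tile(np.arange(n_samples), (k, 1))                 # stratum indices per dimension
            u = (rng.permuted(strata, axis=1).T + rng.uniform(size=(n_samples, k))) / n_samples
            lows = np.array([lo for lo, _ in bounds.values()])
            highs = np.array([hi for _, hi in bounds.values()])
            return lows + u * (highs - lows)

        params = {"enhancement_factor": (1.0, 5.0),     # stress-balance enhancement (assumed range)
                  "sliding_exponent": (1.0, 3.0),       # basal sliding law exponent (assumed range)
                  "calving_constant": (1e16, 1e18)}     # calving parameterisation constant (assumed range)
        ensemble = latin_hypercube(50, params)
        print(ensemble.shape)                            # (50, 3): fifty parameter sets for fifty model runs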

  5. Using paleoclimate data to improve models of the Antarctic Ice Sheet

    NASA Astrophysics Data System (ADS)

    King, M. A.; Phipps, S. J.; Roberts, J. L.; White, D.

    2016-12-01

    Ice sheet models are the most descriptive tools available to simulate the future evolution of the Antarctic Ice Sheet (AIS), including its contribution towards changes in global sea level. However, our knowledge of the dynamics of the coupled ice-ocean-lithosphere system is inevitably limited, in part due to a lack of observations. Furthermore, to build computationally efficient models that can be run for multiple millennia, it is necessary to use simplified descriptions of ice dynamics. Ice sheet modeling is therefore an inherently uncertain exercise. The past evolution of the AIS provides an opportunity to constrain the description of physical processes within ice sheet models and, therefore, to constrain our understanding of the role of the AIS in driving changes in global sea level. We use the Parallel Ice Sheet Model (PISM) to demonstrate how paleoclimate data can improve our ability to predict the future evolution of the AIS. A large, perturbed-physics ensemble is generated, spanning uncertainty in the parameterizations of four key physical processes within ice sheet models: ice rheology, ice shelf calving, and the stress balances within ice sheets and ice shelves. A Latin hypercube approach is used to optimally sample the range of uncertainty in parameter values. This perturbed-physics ensemble is used to simulate the evolution of the AIS from the Last Glacial Maximum (approximately 21,000 years ago) to present. Paleoclimate records are then used to determine which ensemble members are the most realistic. This allows us to use data on past climates to directly constrain our understanding of the past contribution of the AIS towards changes in global sea level. Critically, it also allows us to determine which ensemble members are likely to generate the most realistic projections of the future evolution of the AIS.

  6. Classical simulation of infinite-size quantum lattice systems in two spatial dimensions.

    PubMed

    Jordan, J; Orús, R; Vidal, G; Verstraete, F; Cirac, J I

    2008-12-19

    We present an algorithm to simulate two-dimensional quantum lattice systems in the thermodynamic limit. Our approach builds on the projected entangled-pair state algorithm for finite lattice systems [F. Verstraete and J. I. Cirac, arxiv:cond-mat/0407066] and the infinite time-evolving block decimation algorithm for infinite one-dimensional lattice systems [G. Vidal, Phys. Rev. Lett. 98, 070201 (2007), doi:10.1103/PhysRevLett.98.070201]. The present algorithm allows for the computation of the ground state and the simulation of time evolution in infinite two-dimensional systems that are invariant under translations. We demonstrate its performance by obtaining the ground state of the quantum Ising model and analyzing its second-order quantum phase transition.

  7. CONTACT: An Air Force technical report on military satellite control technology

    NASA Astrophysics Data System (ADS)

    Weakley, Christopher K.

    1993-07-01

    This technical report focuses on Military Satellite Control Technologies and their application to the Air Force Satellite Control Network (AFSCN). This report is a compilation of articles that provide an overview of the AFSCN and the Advanced Technology Program, and discusses relevant technical issues and developments applicable to the AFSCN. Among the topics covered are articles on Future Technology Projections; Future AFSCN Topologies; Modeling of the AFSCN; Wide Area Communications Technology Evolution; Automating AFSCN Resource Scheduling; Health & Status Monitoring at Remote Tracking Stations; Software Metrics and Tools for Measuring AFSCN Software Performance; Human-Computer Interface Working Group; Trusted Systems Workshop; and the University Technical Interaction Program. In addition, Key Technology Area points of contact are listed in the report.

  8. Ejecta evolutions and fates from the AIDA impact on the secondary of the binary asteroid Didymos: a NEOShield-2 project contribution

    NASA Astrophysics Data System (ADS)

    Michel, P.; Yu, Y.

    2017-09-01

    We simulated the evolutions and fates of ejecta produced by the impact of a projectile on the secondary of the binary asteroid Didymos, in the framework of the AIDA space mission project. Our results show how these evolutions and fates depend on the impact location on the secondary and on the ejection speeds of the ejecta. This information can be used to define safe positions for an observing spacecraft and to better understand the outcome of an impact in the environment of a binary asteroid.

  9. Modeling the Evolution of a Science Project in Software-Reliant System Acquisition Programs

    DTIC Science & Technology

    2013-07-24

    Excerpted briefing fragments (Software Technology Conference, April 10, 2013, Carnegie Mellon University): ...might limit worker burnout and perform better regarding schedule; The Evolution of a Science Project, Key Preliminary Findings: the tipping point contributes to the "90% Done" Syndrome; further fragments concern worker burnout, user satisfaction, and a moderating-user-satisfaction feedback loop (B3).

  10. The Rise and Fall of the Bering Land Bridge. Crustal Evolution Education Project. Teacher's Guide [and] Student Investigation.

    ERIC Educational Resources Information Center

    Stoever, Edward C., Jr.

    Crustal Evolution Education Project (CEEP) modules were designed to: (1) provide students with the methods and results of continuing investigations into the composition, history, and processes of the earth's crust and the application of this knowledge to man's activities and (2) to be used by teachers with little or no previous background in the…

  11. Education as an Agent of Social Evolution: The Educational Projects of Patrick Geddes in Late-Victorian Scotland

    ERIC Educational Resources Information Center

    Sutherland, Douglas

    2009-01-01

    This paper examines the educational projects of Patrick Geddes in late-Victorian Scotland. Initially a natural scientist, Geddes drew on an eclectic mix of social theory to develop his own ideas on social evolution. For him education was a vital agent of social change which, he believed, had the potential to develop active citizens whose…

  12. Extinction events can accelerate evolution.

    PubMed

    Lehman, Joel; Miikkulainen, Risto

    2015-01-01

    Extinction events impact the trajectory of biological evolution significantly. They are often viewed as upheavals to the evolutionary process. In contrast, this paper supports the hypothesis that although they are unpredictably destructive, extinction events may in the long term accelerate evolution by increasing evolvability. In particular, if extinction events extinguish indiscriminately many ways of life, indirectly they may select for the ability to expand rapidly through vacated niches. Lineages with such an ability are more likely to persist through multiple extinctions. Lending computational support for this hypothesis, this paper shows how increased evolvability will result from simulated extinction events in two computational models of evolved behavior. The conclusion is that although they are destructive in the short term, extinction events may make evolution more prolific in the long term.

  13. High-Tech Opens Doors.

    ERIC Educational Resources Information Center

    Eichleay, Kristen; Pressman, Harvey

    1987-01-01

    Exemplary projects which help disabled people use technology (particularly computers) expand their employment opportunities include: Project Entry (Seattle); Georgia Computer Programmer Project (Atlanta); Perkins Project with Industry (Watertown, Massachusetts); Project Byte (Newton Massachusetts); Technology Relevant to You (St. Louis); Special…

  14. Effect of local minima on adiabatic quantum optimization.

    PubMed

    Amin, M H S

    2008-04-04

    We present a perturbative method to estimate the spectral gap for adiabatic quantum optimization, based on the structure of the energy levels in the problem Hamiltonian. We show that, for problems that have an exponentially large number of local minima close to the global minimum, the gap becomes exponentially small making the computation time exponentially long. The quantum advantage of adiabatic quantum computation may then be accessed only via the local adiabatic evolution, which requires phase coherence throughout the evolution and knowledge of the spectrum. Such problems, therefore, are not suitable for adiabatic quantum computation.
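
    For context, the reasoning rests on the standard (heuristic) adiabatic condition relating the run time T to the minimum spectral gap g_min along the interpolation H(s):

        T \;\gg\; \frac{\max_{s\in[0,1]}\bigl|\langle 1(s)\,|\,\mathrm{d}H/\mathrm{d}s\,|\,0(s)\rangle\bigr|}{g_{\min}^{2}},

    so a gap that closes exponentially with problem size forces an exponentially long evolution time; the article's own perturbative gap estimate is not reproduced here.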

  15. Launching "the evolution of cooperation".

    PubMed

    Axelrod, Robert

    2012-04-21

    This article describes three aspects of the author's early work on the evolution of cooperation. First, it explains how the idea for a computer tournament for the iterated Prisoner's Dilemma was inspired by artificial intelligence research on computer checkers and computer chess. Second, it shows how the vulnerability of simple reciprocity to misunderstanding or misimplementation can be eliminated with the addition of some degree of generosity or contrition. Third, it recounts the unusual collaboration between the author, a political scientist, and William D. Hamilton, an evolutionary biologist. Copyright © 2011 Elsevier Ltd. All rights reserved.
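
    The second point, that generosity rescues simple reciprocity from noise, is easy to reproduce in a few lines. The sketch below is a minimal iterated Prisoner's Dilemma with standard payoffs and a small probability of misimplemented moves; the strategies, noise level and generosity parameter are illustrative choices, not Axelrod's tournament entries.

```python
import random

PAYOFF = {('C', 'C'): (3, 3), ('C', 'D'): (0, 5),
          ('D', 'C'): (5, 0), ('D', 'D'): (1, 1)}

def tit_for_tat(my_history, their_history):
    return their_history[-1] if their_history else 'C'

def generous_tit_for_tat(my_history, their_history, generosity=0.3):
    # Like tit-for-tat, but forgives a defection with some probability.
    if their_history and their_history[-1] == 'D':
        return 'C' if random.random() < generosity else 'D'
    return 'C'

def play(strategy_a, strategy_b, rounds=200, noise=0.05):
    hist_a, hist_b, score_a, score_b = [], [], 0, 0
    for _ in range(rounds):
        move_a = strategy_a(hist_a, hist_b)
        move_b = strategy_b(hist_b, hist_a)
        # Noise models occasional misimplementation of the intended move.
        if random.random() < noise:
            move_a = 'D' if move_a == 'C' else 'C'
        if random.random() < noise:
            move_b = 'D' if move_b == 'C' else 'C'
        pa, pb = PAYOFF[(move_a, move_b)]
        score_a += pa
        score_b += pb
        hist_a.append(move_a)
        hist_b.append(move_b)
    return score_a, score_b

random.seed(1)
print("TFT vs TFT with noise:       ", play(tit_for_tat, tit_for_tat))
print("generous TFT vs generous TFT:", play(generous_tit_for_tat, generous_tit_for_tat))
```

    With noise, two plain tit-for-tat players fall into long mutual-retaliation echoes after a single error, while the generous pair typically recovers sooner and scores closer to the mutual-cooperation payoff.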

  16. The influence of a game-making project on male and female learners' attitudes to computing

    NASA Astrophysics Data System (ADS)

    Robertson, Judy

    2013-03-01

    There is a pressing need for gender inclusive approaches to engage young people in computer science. A recent popular approach has been to harness learners' enthusiasm for computer games to motivate them to learn computer science concepts through game authoring. This article describes a study in which 992 learners across 13 schools took part in a game-making project. It provides evidence from 225 pre-test and post-test questionnaires on how learners' attitudes to computing changed during the project, as well as qualitative reflections from the class teachers on how the project affected their learners. Results indicate that girls did not enjoy the experience as much as boys, and that in fact, the project may make pupils less inclined to study computing in the future. This has important implications for future efforts to engage young people in computing.

  17. Yes, I Can: Action Projects To Resolve Equity Issues in Educational Computing. A Project of ECCO, the Educational Computer Consortium of Ohio.

    ERIC Educational Resources Information Center

    Fredman, Alice, Ed.

    This book presents reports on selected "local action" projects that were developed as part of the Equity in Technology Project, which was inaugurated in 1985 by the Educational Computer Consortium of Ohio (ECCO). The book is organized into three sections, one for each of the populations targeted by the project. An introduction by Alice Fredman…

  18. Research infrastructure support to address ecosystem dynamics

    NASA Astrophysics Data System (ADS)

    Los, Wouter

    2014-05-01

    Predicting the evolution of ecosystems under climate change or human pressures is a challenge. Even understanding past or current processes is complicated, as a result of the many interactions and feedbacks that occur within and between components of the system. This talk will present an example of current research on changes in landscape evolution, hydrology, soil biogeochemical processes, zoological food webs, and plant community succession, and how these affect feedbacks to components of the systems, including the climate system. Multiple observations, experiments, and simulations provide a wealth of data, but not necessarily understanding. Model development for the coupled processes on different spatial and temporal scales is sensitive to variations in the data and to parameter changes. Fast high performance computing may help to visualize the effect of these changes and the potential stability (and reliability) of the models. This may then allow for iteration between data production and models, leading towards stable models that reduce uncertainty and improve the prediction of change. The role of research infrastructures becomes crucial in overcoming barriers to such research. Environmental infrastructures cover physical site facilities, dedicated instrumentation and e-infrastructure. The LifeWatch infrastructure for biodiversity and ecosystem research will provide services for data integration, analysis and modeling. But it has to cooperate intensively with the other kinds of infrastructures in order to support the iteration between data production and model computation. The cooperation in the ENVRI project (Common operations of environmental research infrastructures) is one of the initiatives to foster such multidisciplinary research.

  19. Proposal for future diagnosis and management of vascular tumors by using automatic software for image processing and statistic prediction.

    PubMed

    Popescu, M D; Draghici, L; Secheli, I; Secheli, M; Codrescu, M; Draghici, I

    2015-01-01

    Infantile Hemangiomas (IH) are the most frequent tumors of vascular origin, and the differential diagnosis from vascular malformations is difficult to establish. Specific types of IH, due to their location, dimensions and fast evolution, can cause important functional and esthetic sequelae. To avoid these unfortunate consequences it is necessary to establish the appropriate moment to begin treatment and to decide which therapeutic procedure is most adequate. Based on clinical data collected through serial clinical observations, correlated with imaging data and processed by a computer-aided diagnosis (CAD) system, the study intended to develop a treatment algorithm that accurately predicts the best final result, from the esthetic and functional point of view, for a given type of lesion. The preliminary database was composed of 75 patients divided into 4 groups according to the treatment management they received: medical therapy, sclerotherapy, surgical excision and no treatment. The serial clinical observations were performed each month and all the data were processed using the CAD system. The project goal was to create software that incorporates advanced methods to accurately measure the specific IH lesions and that integrates medical information, statistical methods and computational methods to correlate this information with that obtained from the processing of images. Based on these correlations, a mechanism for predicting the evolution of a hemangioma was established, which helped determine the best method of therapeutic intervention to minimize further complications.

  20. Cyberdyn supercomputer - a tool for imaging geodynamic processes

    NASA Astrophysics Data System (ADS)

    Pomeran, Mihai; Manea, Vlad; Besutiu, Lucian; Zlagnean, Luminita

    2014-05-01

    More and more physical processes that develop within the deep interior of our planet, yet have a significant impact on the Earth's shape and structure, are becoming subject to numerical modelling using high performance computing facilities. Nowadays, an increasing number of research centers worldwide decide to make use of such powerful and fast computers for simulating complex phenomena involving fluid dynamics and to gain deeper insight into intricate problems of the Earth's evolution. With the CYBERDYN cybernetic infrastructure (CCI), the Solid Earth Dynamics Department in the Institute of Geodynamics of the Romanian Academy boldly steps into the 21st century by entering the research area of computational geodynamics. The project that made this advancement possible was jointly supported by the EU and the Romanian Government through the Structural and Cohesion Funds. It lasted for about three years, ending in October 2013. CCI is basically a modern high performance Beowulf-type supercomputer (HPCC), combined with a high performance visualization cluster (HPVC) and a GeoWall. The infrastructure is mainly structured around 1344 cores and 3 TB of RAM. The high speed interconnect is provided by a Qlogic InfiniBand switch, able to transfer up to 40 Gbps. The CCI storage component is a 40 TB Panasas NAS. The operating system is Linux (CentOS). For control and maintenance, the Bright Cluster Manager package is used. The SGE job scheduler manages the job queues. CCI has been designed for a theoretical peak performance of up to 11.2 TFlops. Speed tests showed that a high resolution numerical model (256 × 256 × 128 FEM elements) could be run at a mean computational speed of one time step per 30 seconds, while employing only a fraction (20%) of the computing power. After passing the mandatory tests, the CCI has been involved in numerical modelling of various scenarios related to the tectonic and geodynamic evolution of the East Carpathians, including the Neogene magmatic activity and the intriguing intermediate-depth seismicity within the so-called Vrancea zone. The CFD code for numerical modelling is CitcomS, a widely employed open source package specifically developed for earth sciences. Several preliminary 3D geodynamic models for simulating an assumed subduction or the effect of a mantle plume will be presented and discussed.

  1. In silico evolution of biochemical networks

    NASA Astrophysics Data System (ADS)

    Francois, Paul

    2010-03-01

    We use computational evolution to select models of genetic networks that can be built from a predefined set of parts to achieve a certain behavior. Selection is made with the help of a fitness function that defines the biological function in a quantitative way. This fitness function has to be specific to a process, but general enough to find processes common to many species. Computational evolution favors models that can be built by incremental improvements in fitness rather than via multiple neutral steps or transitions through less fit intermediates. With the help of these simulations, we propose a kinetic view of evolution, where networks are rapidly selected along a fitness gradient. This mathematics recapitulates Darwin's original insight that small changes in fitness can rapidly lead to the evolution of complex structures such as the eye, and explains the phenomenon of convergent/parallel evolution of similar structures in independent lineages. We will illustrate these ideas with networks implicated in embryonic development and patterning of vertebrates and primitive insects.
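
    A minimal caricature of this selection scheme is sketched below, with an invented two-parameter "network" and fitness function; it keeps only mutations that increase fitness, which is the incremental, gradient-climbing dynamics the abstract emphasizes. The actual work evolves full network topologies and kinetic parameters, so this is an illustration of the principle only.

```python
import random

def fitness(params):
    # Hypothetical quantitative fitness: how well a two-parameter "network"
    # matches a target steady-state response (higher is better).
    target = 0.8
    response = params[0] / (params[0] + params[1] + 1e-9)
    return -abs(response - target)

def evolve(generations=200, mutation_scale=0.05):
    individual = [random.random(), random.random()]
    best = fitness(individual)
    trajectory = [best]
    for _ in range(generations):
        mutant = [max(0.0, p + random.gauss(0.0, mutation_scale))
                  for p in individual]
        f = fitness(mutant)
        # Only incremental improvements are kept: the search climbs the
        # fitness gradient rather than crossing less fit intermediates.
        if f > best:
            individual, best = mutant, f
        trajectory.append(best)
    return individual, trajectory

random.seed(0)
winner, history = evolve()
print("final parameters:", [round(p, 3) for p in winner])
print("final fitness:   ", round(history[-1], 4))
```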

  2. System Safety in Early Manned Space Program: A Case Study of NASA and Project Mercury

    NASA Technical Reports Server (NTRS)

    Hansen, Frederick D.; Pitts, Donald

    2005-01-01

    This case study provides a review of the National Aeronautics and Space Administration's (NASA's) involvement in system safety during the research and evolution from air-breathing to exo-atmospheric capable flight systems, culminating in the successful Project Mercury. Although NASA has been philosophically committed to the principles of system safety, this case study points out that budget and manpower constraints, as well as a variety of internal and external pressures, can jeopardize even a well-designed system safety program. This study begins with a review of the evolution and early years of NASA's rise as a project lead agency and ends with the lessons learned from Project Mercury.

  3. Hybrid Quantum-Classical Approach to Quantum Optimal Control.

    PubMed

    Li, Jun; Yang, Xiaodong; Peng, Xinhua; Sun, Chang-Pu

    2017-04-14

    A central challenge in quantum computing is to identify more computational problems for which utilization of quantum resources can offer significant speedup. Here, we propose a hybrid quantum-classical scheme to tackle the quantum optimal control problem. We show that the most computationally demanding part of gradient-based algorithms, namely, computing the fitness function and its gradient for a control input, can be accomplished by the process of evolution and measurement on a quantum simulator. By posing queries to and receiving answers from the quantum simulator, classical computing devices update the control parameters until an optimal control solution is found. To demonstrate the quantum-classical scheme in experiment, we use a seven-qubit nuclear magnetic resonance system, on which we have succeeded in optimizing state preparation without involving classical computation of the large Hilbert space evolution.
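
    The division of labor can be sketched as a classical loop that only ever queries a black-box evaluator for the fidelity of a parameterized control sequence, here emulated with a two-parameter single-qubit rotation in NumPy. The real scheme replaces `simulator_fidelity` with measurements on the quantum processor and uses measured gradients rather than the finite differences assumed in this sketch.

```python
import numpy as np

# Pauli matrices for a single qubit (the "quantum simulator" is emulated here).
SX = np.array([[0, 1], [1, 0]], dtype=complex)
SY = np.array([[0, -1j], [1j, 0]], dtype=complex)

def simulator_fidelity(thetas, target):
    """Stands in for the quantum processor: evolve |0> under parameterized
    rotations and return the overlap with the target state."""
    state = np.array([1.0, 0.0], dtype=complex)
    for theta, pauli in zip(thetas, (SX, SY)):
        u = np.cos(theta / 2) * np.eye(2) - 1j * np.sin(theta / 2) * pauli
        state = u @ state
    return abs(np.vdot(target, state)) ** 2

def hybrid_optimize(target, steps=200, lr=0.5, eps=1e-3):
    thetas = np.zeros(2)
    for _ in range(steps):
        grad = np.zeros_like(thetas)
        for k in range(len(thetas)):
            shifted = thetas.copy()
            shifted[k] += eps
            # Each gradient component costs extra queries to the simulator.
            grad[k] = (simulator_fidelity(shifted, target)
                       - simulator_fidelity(thetas, target)) / eps
        thetas += lr * grad          # classical update of the control parameters
    return thetas, simulator_fidelity(thetas, target)

target = np.array([np.cos(0.3), np.exp(1j * 0.7) * np.sin(0.3)])
params, fid = hybrid_optimize(target)
print("achieved fidelity:", round(fid, 4))
```

    Evaluating the fitness function and its gradient is the computationally demanding part that the abstract proposes to off-load to the quantum simulator; the classical device only performs the cheap parameter updates.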

  4. Strategic Computing Computer Vision: Taking Image Understanding To The Next Plateau

    NASA Astrophysics Data System (ADS)

    Simpson, R. L., Jr.

    1987-06-01

    The overall objective of the Strategic Computing (SC) Program of the Defense Advanced Research Projects Agency (DARPA) is to develop and demonstrate a new generation of machine intelligence technology which can form the basis for more capable military systems in the future and also maintain a position of world leadership for the US in computer technology. Begun in 1983, SC represents a focused research strategy for accelerating the evolution of new technology and its rapid prototyping in realistic military contexts. Among the very ambitious demonstration prototypes being developed within the SC Program are: 1) the Pilot's Associate, which will aid the pilot in route planning, aerial target prioritization, evasion of missile threats, and aircraft emergency safety procedures during flight; 2) two battle management projects, the first of which, just getting started for the Army, is the AirLand Battle Management (ALBM) program, which will use knowledge-based systems technology to assist in the generation and evaluation of tactical options and plans at the Corps level; 3) the other, more established battle management program, for the Navy, the Fleet Command Center Battle Management Program (FCCBMP) at Pearl Harbor. The FCCBMP is employing knowledge-based systems and natural language technology in an evolutionary testbed situated in an operational command center to demonstrate and evaluate intelligent decision aids which can assist in the evaluation of fleet readiness and explore alternatives during contingencies; and 4) the Autonomous Land Vehicle (ALV), which integrates in a major robotic testbed the technologies for dynamic image understanding and knowledge-based route planning with replanning during execution, hosted on new advanced parallel architectures. The goal of the Strategic Computing computer vision technology base (SCVision) is to develop generic technology that will enable the construction of complete, robust, high performance image understanding systems to support a wide range of DoD applications. Possible applications include autonomous vehicle navigation, photointerpretation, smart weapons, and robotic manipulation. This paper provides an overview of the technical and program management plans being used in evolving this critical national technology.

  5. [Earth Science Technology Office's Computational Technologies Project

    NASA Technical Reports Server (NTRS)

    Fischer, James (Technical Monitor); Merkey, Phillip

    2005-01-01

    This grant supported the effort to characterize the problem domain of the Earth Science Technology Office's Computational Technologies (CT) Project and to engage the Beowulf Cluster Computing Community as well as the High Performance Computing Research Community, so that we can predict the applicability of said technologies to the scientific community represented by the CT project and formulate long-term strategies to provide the computational resources necessary to attain the anticipated scientific objectives of the CT project. Specifically, the goal of the evaluation effort is to use the information gathered over the course of the Round-3 investigations to quantify the trends in scientific expectations, the algorithmic requirements, and the capabilities of high-performance computers to satisfy this anticipated need.

  6. Final Report for Project "A high-throughput pipeline for mapping inter-species interactions and metabolic synergy relevant to next-generation biofuel production"

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Segre, Daniel; Marx, Christopher J.; Northen, Trent

    The goal of our project was to implement a pipeline for the systematic, computationally-driven study and optimization of microbial interactions and their effect on lignocellulose degradation and biofuel production. We specifically sought to design and construct artificial microbial consortia that could collectively degrade lignocellulose from plant biomass, and produce precursors of energy-rich biofuels. This project fits into the bigger picture goal of helping identify a sustainable strategy for the production of energy-rich biofuels that would satisfy the existing energy constraints and demand of our society. Based on the observation that complex natural microbial communities tend to be metabolically efficient and ecologically robust, we pursued the study of a microbial system in which the desired engineering function is achieved through division of labor across multiple microbial species. Our approach was aimed at bypassing the complexity of natural communities by establishing a rational approach to design small synthetic microbial consortia. Towards this goal, we combined multiple approaches, including computer modeling of ecosystem-level microbial metabolism, mass spectrometry of metabolites, genetic engineering, and experimental evolution. The microbial production of biofuels from lignocellulose is a complex, multi-step process. Microbial consortia are an ideal approach to consolidated bioprocessing: a community of microorganisms performs a wide variety of functions more efficiently and is more resilient to environmental perturbations than a microbial monoculture. Each organism we chose for this project addresses a specific challenge: lignin degradation (Pseudomonas putida); (hemi)cellulose degradation (Cellulomonas fimi); lignin degradation product demethoxylation (Methylobacterium spp); generation of biofuel lipid precursors (Yarrowia lipolytica). These organisms are genetically tractable, aerobic, and have been used in biotechnological applications. Throughout the project, we have used mass spectrometry to characterize and measure the metabolic inputs and outputs of each of these consortium members, providing valuable information for model refinement, and enabling the establishment of metabolism-mediated interactions. In addition to lignocellulose degradation, we have started addressing the challenge of removing metabolites (e.g. formaldehyde) produced by the demethoxylation of lignin monomers, which can otherwise inhibit microbial growth due to their toxicity. On the computational side, we have implemented genome-scale models for all consortium members, based on KBase reconstructions and literature curation, and we studied small consortia and their properties. Overall, our project has identified a complex landscape of interaction types and metabolic processes relevant to community-level functions, illustrating the challenges and opportunities of microbial community engineering for the transformation of biomass into bioproducts.

  7. A Set of Computer Projects for an Electromagnetic Fields Class.

    ERIC Educational Resources Information Center

    Gleeson, Ronald F.

    1989-01-01

    Presented are three computer projects: vector analysis, electric field intensities at various distances, and the Biot-Savart law. Programming suggestions and project results are provided. One month is suggested for each project. (MVL)

  8. Extinction Events Can Accelerate Evolution

    PubMed Central

    Lehman, Joel; Miikkulainen, Risto

    2015-01-01

    Extinction events impact the trajectory of biological evolution significantly. They are often viewed as upheavals to the evolutionary process. In contrast, this paper supports the hypothesis that although they are unpredictably destructive, extinction events may in the long term accelerate evolution by increasing evolvability. In particular, if extinction events indiscriminately extinguish many ways of life, they may indirectly select for the ability to expand rapidly through vacated niches. Lineages with such an ability are more likely to persist through multiple extinctions. Lending computational support for this hypothesis, this paper shows how increased evolvability will result from simulated extinction events in two computational models of evolved behavior. The conclusion is that although they are destructive in the short term, extinction events may make evolution more prolific in the long term. PMID:26266804

  9. A three-dimensional FEM-DEM technique for predicting the evolution of fracture in geomaterials and concrete

    NASA Astrophysics Data System (ADS)

    Zárate, Francisco; Cornejo, Alejandro; Oñate, Eugenio

    2018-07-01

    This paper extends to three dimensions (3D) the computational technique developed by the authors in 2D for predicting the onset and evolution of fracture in a finite element mesh in a simple manner, based on combining the finite element method and the discrete element method (DEM) approach (Zárate and Oñate in Comput Part Mech 2(3):301-314, 2015). Once a crack is detected at an element edge, discrete elements are generated at the adjacent element vertexes and a simple DEM mechanism is considered in order to follow the evolution of the crack. The combination of the DEM with simple four-noded linear tetrahedron elements correctly captures the onset of fracture and its evolution, as shown in several 3D examples of application.

  10. Position specific variation in the rate of evolution in transcription factor binding sites

    PubMed Central

    Moses, Alan M; Chiang, Derek Y; Kellis, Manolis; Lander, Eric S; Eisen, Michael B

    2003-01-01

    Background The binding sites of sequence specific transcription factors are an important and relatively well-understood class of functional non-coding DNAs. Although a wide variety of experimental and computational methods have been developed to characterize transcription factor binding sites, they remain difficult to identify. Comparison of non-coding DNA from related species has shown considerable promise in identifying these functional non-coding sequences, even though relatively little is known about their evolution. Results Here we analyse the genome sequences of the budding yeasts Saccharomyces cerevisiae, S. bayanus, S. paradoxus and S. mikatae to study the evolution of transcription factor binding sites. As expected, we find that both experimentally characterized and computationally predicted binding sites evolve slower than surrounding sequence, consistent with the hypothesis that they are under purifying selection. We also observe position-specific variation in the rate of evolution within binding sites. We find that the position-specific rate of evolution is positively correlated with degeneracy among binding sites within S. cerevisiae. We test theoretical predictions for the rate of evolution at positions where the base frequencies deviate from background due to purifying selection and find reasonable agreement with the observed rates of evolution. Finally, we show how the evolutionary characteristics of real binding motifs can be used to distinguish them from artefacts of computational motif finding algorithms. Conclusion As has been observed for protein sequences, the rate of evolution in transcription factor binding sites varies with position, suggesting that some regions are under stronger functional constraint than others. This variation likely reflects the varying importance of different positions in the formation of the protein-DNA complex. The characterization of the pattern of evolution in known binding sites will likely contribute to the effective use of comparative sequence data in the identification of transcription factor binding sites and is an important step toward understanding the evolution of functional non-coding DNA. PMID:12946282
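
    The position-specific rate estimate at the heart of the analysis can be illustrated with a crude count over aligned orthologous sites. The sequences, the two-species setup and the equal weighting of pairs below are invented for illustration; the study itself uses four Saccharomyces genomes and proper evolutionary models.

```python
from collections import Counter

# Hypothetical aligned binding-site instances from two yeast species
# (each pair is one orthologous site; columns are motif positions).
alignments = [
    ("TGACTCA", "TGACTCA"),
    ("TGACTAA", "TGACTCA"),
    ("TGAGTCA", "TGACTCA"),
    ("TGACTCT", "TGACTCA"),
]

motif_length = len(alignments[0][0])

# Fraction of orthologous pairs that differ at each motif position:
# a crude per-position rate of evolution.
rates = []
for pos in range(motif_length):
    changed = sum(1 for a, b in alignments if a[pos] != b[pos])
    rates.append(changed / len(alignments))

# Per-position degeneracy within one species, measured as the number of
# distinct bases observed; compare with the rates above.
degeneracy = [len(Counter(a[pos] for a, _ in alignments))
              for pos in range(motif_length)]

for pos, (r, d) in enumerate(zip(rates, degeneracy), start=1):
    print(f"position {pos}: substitution fraction {r:.2f}, distinct bases {d}")
```

    Positions where few pairs differ and few bases are tolerated correspond to the strongly constrained, information-rich motif positions; the paper's observation is that the per-position rate of evolution correlates positively with degeneracy.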

  11. AGIS: Evolution of Distributed Computing information system for ATLAS

    NASA Astrophysics Data System (ADS)

    Anisenkov, A.; Di Girolamo, A.; Alandes, M.; Karavakis, E.

    2015-12-01

    ATLAS, a particle physics experiment at the Large Hadron Collider at CERN, produces petabytes of data annually through simulation production and tens of petabytes of data per year from the detector itself. The ATLAS computing model embraces the Grid paradigm and a high degree of decentralization of computing resources in order to meet the ATLAS requirements of petabyte-scale data operations. The model has evolved since the first period of LHC data taking (Run-1) in order to cope with the new challenges of the upcoming Run-2. In this paper we describe the evolution and recent developments of the ATLAS Grid Information System (AGIS), developed in order to integrate configuration and status information about resources, services and topology of the computing infrastructure used by the ATLAS Distributed Computing applications and services.

  12. Constrained evolution in numerical relativity

    NASA Astrophysics Data System (ADS)

    Anderson, Matthew William

    The strongest potential source of gravitational radiation for current and future detectors is the merger of binary black holes. Full numerical simulation of such mergers can provide realistic signal predictions and enhance the probability of detection. Numerical simulation of the Einstein equations, however, is fraught with difficulty. Stability even in static test cases of single black holes has proven elusive. Common to unstable simulations is the growth of constraint violations. This work examines the effect of controlling the growth of constraint violations by solving the constraints periodically during a simulation, an approach called constrained evolution. The effects of constrained evolution are contrasted with the results of unconstrained evolution, evolution where the constraints are not solved during the course of a simulation. Two different formulations of the Einstein equations are examined: the standard ADM formulation and the generalized Frittelli-Reula formulation. In most cases constrained evolution vastly improves the stability of a simulation at minimal computational cost when compared with unconstrained evolution. However, in the more demanding test cases examined, constrained evolution fails to produce simulations with long-term stability in spite of producing improvements in simulation lifetime when compared with unconstrained evolution. Constrained evolution is also examined in conjunction with a wide variety of promising numerical techniques, including mesh refinement and overlapping Cartesian and spherical computational grids. Constrained evolution in boosted black hole spacetimes is investigated using overlapping grids. Constrained evolution proves to be central to the host of innovations required in carrying out such intensive simulations.

  13. Group Projects and the Computer Science Curriculum

    ERIC Educational Resources Information Center

    Joy, Mike

    2005-01-01

    Group projects in computer science are normally delivered with reference to good software engineering practice. The discipline of software engineering is rapidly evolving, and the application of the latest 'agile techniques' to group projects causes a potential conflict with constraints imposed by regulating bodies on the computer science…

  14. "The Evolution of Photosynthesis and the Transition from an Anaerobic to an Aerobic World"

    NASA Technical Reports Server (NTRS)

    Blankenship, Robert E.

    2005-01-01

    This project was focused on elucidating the evolution of photosynthesis, in particular the evolutionary developments that preceded and accompanied the transition from anoxygenic to oxygenic photosynthesis. Development of this process has clearly been of central importance to evolution of life on Earth. Photosynthesis is the mechanism that ultimately provides for the energy needs of most surface-dwelling organisms. Eukaryotic organisms are absolutely dependent on the molecular oxygen that has been produced by oxygenic photosynthesis. In this project we have employed a multidisciplinary approach to understand some of the processes that took place during the evolution of photosynthesis. In this project, we made excellent progress in the overall area of understanding the origin and evolution of photosynthesis. Particular progress has been made on several more specific research questions, including the molecular evolutionary analysis of photosynthetic components and biosynthetic pathways (2, 3, 5, 7, 10), as well as biochemical characterization of electron transfer proteins related to photosynthesis and active oxygen protection (4, 6, 9). Finally, several review and commentary papers have been published (1, 8, 11). A total of twelve publications arose out of this grant, references to which are given below. Some specific areas of progress are highlighted and discussed in more detail.

  15. French Plans for Fifth Generation Computer Systems.

    DTIC Science & Technology

    1984-12-07

    centrally managed project in France that covers all facets of the... French industry in electronics, computers, software, and services and to make the... Centre National de Recherche Scientifique (CNRS) Cooperative Research... of Japan's Fifth Generation Project, the French scientific and industrial com... The National Projects; The French Ministry of Research and... systems, man-computer interaction, novel computer structures, knowledge-based computer systems

  16. Japanese supercomputer technology.

    PubMed

    Buzbee, B L; Ewald, R H; Worlton, W J

    1982-12-17

    Under the auspices of the Ministry for International Trade and Industry the Japanese have launched a National Superspeed Computer Project intended to produce high-performance computers for scientific computation and a Fifth-Generation Computer Project intended to incorporate and exploit concepts of artificial intelligence. If these projects are successful, which appears likely, advanced economic and military research in the United States may become dependent on access to supercomputers of foreign manufacture.

  17. Mobile Computing and Ubiquitous Networking: Concepts, Technologies and Challenges.

    ERIC Educational Resources Information Center

    Pierre, Samuel

    2001-01-01

    Analyzes concepts, technologies and challenges related to mobile computing and networking. Defines basic concepts of cellular systems. Describes the evolution of wireless technologies that constitute the foundations of mobile computing and ubiquitous networking. Presents characterization and issues of mobile computing. Analyzes economical and…

  18. Leveraging Human Insights by Combining Multi-Objective Optimization with Interactive Evolution

    DTIC Science & Technology

    2015-03-26

    application, a program that used human selections to guide the evolution of insect-like images. He was able to demonstrate that humans provide key insights... LEVERAGING HUMAN INSIGHTS BY COMBINING MULTI-OBJECTIVE OPTIMIZATION WITH INTERACTIVE EVOLUTION. Thesis, Joshua R. Christman, Second Lieutenant, USAF, presented to the Faculty, Department of Electrical and Computer Engineering

  19. Irreconcilable difference between quantum walks and adiabatic quantum computing

    NASA Astrophysics Data System (ADS)

    Wong, Thomas G.; Meyer, David A.

    2016-06-01

    Continuous-time quantum walks and adiabatic quantum evolution are two general techniques for quantum computing, both of which are described by Hamiltonians that govern their evolutions by Schrödinger's equation. In the former, the Hamiltonian is fixed, while in the latter, the Hamiltonian varies with time. As a result, their formulations of Grover's algorithm evolve differently through Hilbert space. We show that this difference is fundamental; they cannot be made to evolve along each other's path without introducing structure more powerful than the standard oracle for unstructured search. For an adiabatic quantum evolution to evolve like the quantum walk search algorithm, it must interpolate between three fixed Hamiltonians, one of which is complex and introduces structure that is stronger than the oracle for unstructured search. Conversely, for a quantum walk to evolve along the path of the adiabatic search algorithm, it must be a chiral quantum walk on a weighted, directed star graph with structure that is also stronger than the oracle for unstructured search. Thus, the two techniques, although similar in being described by Hamiltonians that govern their evolution, compute by fundamentally irreconcilable means.
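
    The contrast is between a fixed and a time-dependent Hamiltonian. In one common convention for the unstructured-search setting (assumed here, not quoted from the paper), the two Hamiltonians read:

```latex
\begin{align*}
  H_{\mathrm{QW}} &= -\gamma A - \lvert w\rangle\langle w\rvert
      && \text{(quantum walk: fixed)}\\
  H_{\mathrm{AQC}}(s) &= (1-s)\bigl(I - \lvert\psi_0\rangle\langle\psi_0\rvert\bigr)
      + s\,\bigl(I - \lvert w\rangle\langle w\rvert\bigr), \quad s = t/T
      && \text{(adiabatic: time dependent)}
\end{align*}
```

    Here A is the adjacency matrix of the underlying graph, γ the hopping rate, |w⟩ the marked state and |ψ₀⟩ the uniform superposition. The paper's result is that neither evolution can be made to follow the other's path through Hilbert space without adding structure stronger than the standard oracle for unstructured search.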

  20. New Frontiers in Language Evolution and Development.

    PubMed

    Oller, D Kimbrough; Dale, Rick; Griebel, Ulrike

    2016-04-01

    This article introduces the Special Issue and its focus on research in language evolution with emphasis on theory as well as computational and robotic modeling. A key theme is based on the growth of evolutionary developmental biology or evo-devo. The Special Issue consists of 13 articles organized in two sections: A) Theoretical foundations and B) Modeling and simulation studies. All the papers are interdisciplinary in nature, encompassing work in biological and linguistic foundations for the study of language evolution as well as a variety of computational and robotic modeling efforts shedding light on how language may be developed and may have evolved. Copyright © 2016 Cognitive Science Society, Inc.

  1. Computational optimization and biological evolution.

    PubMed

    Goryanin, Igor

    2010-10-01

    Modelling and optimization principles have become a key concept in many biological areas, especially in biochemistry. Definitions of objective function, fitness and co-evolution, although they differ between biology and mathematics, are similar in a general sense. Although successful in fitting models to experimental data and in making some biochemical predictions, optimization and evolutionary computation should be developed further to make more accurate real-life predictions, and to deal not only with one organism in isolation, but also with communities of symbiotic and competing organisms. One of the future goals will be to explain and predict evolution not only for organisms in shake flasks or fermenters, but for real competitive multispecies environments.

  2. Computer, Video, and Rapid-Cycling Plant Projects in an Undergraduate Plant Breeding Course.

    ERIC Educational Resources Information Center

    Michaels, T. E.

    1993-01-01

    Studies the perceived effectiveness of four student projects involving videotape production, computer conferencing, microcomputer simulation, and rapid-cycling Brassica breeding for undergraduate plant breeding students in two course offerings in consecutive years. Linking of the computer conferencing and video projects improved the rating of the…

  3. Networked Microcomputers--The Next Generation in College Computing.

    ERIC Educational Resources Information Center

    Harris, Albert L.

    The evolution of computer hardware for college computing has mirrored the industry's growth. When computers were introduced into the educational environment, they had limited capacity and served one user at a time. Then came large mainframes with many terminals sharing the resource. Next, the use of computers in office automation emerged. As…

  4. Leveraging Cloud Technology to Provide a Responsive, Reliable and Scalable Backend for the Virtual Ice Sheet Laboratory Using the Ice Sheet System Model and Amazon's Elastic Compute Cloud

    NASA Astrophysics Data System (ADS)

    Perez, G. L.; Larour, E. Y.; Halkides, D. J.; Cheng, D. L. C.

    2015-12-01

    The Virtual Ice Sheet Laboratory (VISL) is a Cryosphere outreach effort by scientists at the Jet Propulsion Laboratory (JPL) in Pasadena, CA, Earth and Space Research (ESR) in Seattle, WA, and the University of California at Irvine (UCI), with the goal of providing interactive lessons for K-12 and college level students, while conforming to STEM guidelines. At the core of VISL is the Ice Sheet System Model (ISSM), an open-source project developed jointly at JPL and UCI whose main purpose is to model the evolution of the polar ice caps in Greenland and Antarctica. By using ISSM, VISL students have access to state-of-the-art modeling software that is being used to conduct scientific research by users all over the world. However, providing this functionality is by no means simple. The modeling of ice sheets in response to sea and atmospheric temperatures, among many other possible parameters, requires significant computational resources. Furthermore, this service needs to be responsive and capable of handling burst requests produced by classrooms of students. Cloud computing providers represent a burgeoning industry. With major investments by tech giants like Amazon, Google and Microsoft, it has never been easier or more affordable to deploy computational elements on-demand. This is exactly what VISL needs and ISSM is capable of. Moreover, this is a promising alternative to investing in expensive and rapidly devaluing hardware.
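
    As a rough illustration of the on-demand model (not VISL's actual deployment code), the snippet below uses boto3 to start a single worker instance when a burst of classroom requests arrives; the region, AMI ID, instance type and key name are placeholders, and valid AWS credentials are required for the call to succeed.

```python
import boto3

# All identifiers below (region, AMI ID, instance type, key name) are
# placeholders for illustration, not values used by the VISL project.
ec2 = boto3.client("ec2", region_name="us-west-2")

def launch_issm_worker(ami_id="ami-0123456789abcdef0",
                       instance_type="c5.xlarge",
                       key_name="visl-demo-key"):
    """Start one on-demand instance to serve a burst of model runs."""
    response = ec2.run_instances(
        ImageId=ami_id,
        InstanceType=instance_type,
        KeyName=key_name,
        MinCount=1,
        MaxCount=1,
    )
    return response["Instances"][0]["InstanceId"]

if __name__ == "__main__":
    print("launched worker:", launch_issm_worker())
```

    Workers started this way can be terminated as soon as the burst subsides, which is the cost argument the abstract makes against investing in dedicated, rapidly devaluing hardware.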

  5. ATLAS Distributed Computing Monitoring tools during the LHC Run I

    NASA Astrophysics Data System (ADS)

    Schovancová, J.; Campana, S.; Di Girolamo, A.; Jézéquel, S.; Ueda, I.; Wenaus, T.; Atlas Collaboration

    2014-06-01

    This contribution summarizes the evolution of the ATLAS Distributed Computing (ADC) Monitoring project during the LHC Run I. The ADC Monitoring targets three groups of customers: the ADC Operations team, to early identify malfunctions and escalate issues to an activity or a service expert; ATLAS national contacts and sites, for the real-time monitoring and long-term measurement of the performance of the provided computing resources; and the ATLAS Management, for long-term trends and accounting information about the ATLAS Distributed Computing resources. During the LHC Run I a significant development effort has been invested in standardization of the monitoring and accounting applications in order to provide an extensive monitoring and accounting suite. The ADC Monitoring applications separate the data layer and the visualization layer. The data layer exposes data in a predefined format. The visualization layer is designed bearing in mind the visual identity of the provided graphical elements and the re-usability of the visualization bits across the different tools. A rich family of filtering and searching options enhancing the available user interfaces comes naturally with the separation of the data and visualization layers. With a variety of reliable monitoring data accessible through standardized interfaces, the possibility of automating actions under well-defined conditions correlating multiple data sources has become feasible. In this contribution we also discuss the automated exclusion of degraded resources and their automated recovery in various activities.

  6. Bifurcation-based adiabatic quantum computation with a nonlinear oscillator network.

    PubMed

    Goto, Hayato

    2016-02-22

    The dynamics of nonlinear systems qualitatively change depending on their parameters, which is called bifurcation. A quantum-mechanical nonlinear oscillator can yield a quantum superposition of two oscillation states, known as a Schrödinger cat state, via quantum adiabatic evolution through its bifurcation point. Here we propose a quantum computer comprising such quantum nonlinear oscillators, instead of quantum bits, to solve hard combinatorial optimization problems. The nonlinear oscillator network finds optimal solutions via quantum adiabatic evolution, where nonlinear terms are increased slowly, in contrast to conventional adiabatic quantum computation or quantum annealing, where quantum fluctuation terms are decreased slowly. As a result of numerical simulations, it is concluded that quantum superposition and quantum fluctuation work effectively to find optimal solutions. It is also notable that the present computer is analogous to neural computers, which are also networks of nonlinear components. Thus, the present scheme will open new possibilities for quantum computation, nonlinear science, and artificial intelligence.

  7. Bifurcation-based adiabatic quantum computation with a nonlinear oscillator network

    NASA Astrophysics Data System (ADS)

    Goto, Hayato

    2016-02-01

    The dynamics of nonlinear systems qualitatively change depending on their parameters, which is called bifurcation. A quantum-mechanical nonlinear oscillator can yield a quantum superposition of two oscillation states, known as a Schrödinger cat state, via quantum adiabatic evolution through its bifurcation point. Here we propose a quantum computer comprising such quantum nonlinear oscillators, instead of quantum bits, to solve hard combinatorial optimization problems. The nonlinear oscillator network finds optimal solutions via quantum adiabatic evolution, where nonlinear terms are increased slowly, in contrast to conventional adiabatic quantum computation or quantum annealing, where quantum fluctuation terms are decreased slowly. As a result of numerical simulations, it is concluded that quantum superposition and quantum fluctuation work effectively to find optimal solutions. It is also notable that the present computer is analogous to neural computers, which are also networks of nonlinear components. Thus, the present scheme will open new possibilities for quantum computation, nonlinear science, and artificial intelligence.

  8. Flight Planning for the International Space Station-Levitation Observation of Dendrite Evolution in Steel Ternary Alloy Rapid Solidification

    NASA Technical Reports Server (NTRS)

    Flemings, M. C.; Matson, D. M.; Loser, W.; Hyers, R. W.; Rogers, J. R.; Curreri, Peter A. (Technical Monitor)

    2002-01-01

    The paper is an overview of the status and science for the LODESTARS (Levitation Observation of Dendrite Evolution in Steel Ternary Alloy Rapid Solidification) research project. The program is aimed at understanding how melt convection influences phase selection and the evolution of rapid solidification microstructures.

  9. Generalized approximate spin projection calculations of effective exchange integrals of the CaMn4O5 cluster in the S1 and S3 states of the oxygen evolving complex of photosystem II.

    PubMed

    Isobe, H; Shoji, M; Yamanaka, S; Mino, H; Umena, Y; Kawakami, K; Kamiya, N; Shen, J-R; Yamaguchi, K

    2014-06-28

    Full geometry optimizations followed by the vibrational analysis were performed for eight spin configurations of the CaMn4O4X(H2O)3Y (X = O, OH; Y = H2O, OH) cluster in the S1 and S3 states of the oxygen evolution complex (OEC) of photosystem II (PSII). The energy gaps among these configurations obtained by vertical, adiabatic and adiabatic plus zero-point-energy (ZPE) correction procedures have been used for computation of the effective exchange integrals (J) in the spin Hamiltonian model. The J values are calculated by the (1) analytical method and the (2) generalized approximate spin projection (AP) method that eliminates the spin contamination errors of UB3LYP solutions. Using J values derived from these methods, exact diagonalization of the spin Hamiltonian matrix was carried out, yielding excitation energies and spin densities of the ground and lower-excited states of the cluster. The obtained results for the right (R)- and left (L)-opened structures in the S1 and S3 states are found to be consistent with available optical and magnetic experimental results. Implications of the computational results are discussed in relation to (a) the necessity of the exact diagonalization for computations of reliable energy levels, (b) magneto-structural correlations in the CaMn4O5 cluster of the OEC of PSII, (c) structural symmetry breaking in the S1 and S3 states, and (d) the right- and left-handed scenarios for the O-O bond formation for water oxidation.
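
    For readers unfamiliar with the AP scheme, the working equations have the following general form (a standard convention; the paper's signs and prefactors may differ): the exchange couplings enter a Heisenberg spin Hamiltonian, and each pairwise J is extracted from the energies and ⟨S²⟩ expectation values of high-spin (HS) and broken-symmetry low-spin (LS) solutions, which removes the spin-contamination error of the unprojected UB3LYP energy differences.

```latex
\begin{align*}
  \hat{H}_{\mathrm{spin}} &= -2 \sum_{a<b} J_{ab}\, \hat{\mathbf{S}}_a \cdot \hat{\mathbf{S}}_b, \\
  J_{ab} &\approx \frac{E_{\mathrm{LS}} - E_{\mathrm{HS}}}
    {\langle \hat{S}^2 \rangle_{\mathrm{HS}} - \langle \hat{S}^2 \rangle_{\mathrm{LS}}}.
\end{align*}
```

    Diagonalizing the resulting spin Hamiltonian then yields the ladder of low-lying spin states that is compared with the optical and magnetic data.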

  10. Simulations of solid-fluid coupling with application to crystal entrainment in vigorous convection

    NASA Astrophysics Data System (ADS)

    Suckale, J.; Elkins-Tanton, L. T.; Sethian, J.; Yu, J.

    2009-12-01

    Many problems in computational geophysics require the accurate coupling of a solid body to viscous flow. Examples range from understanding the role of highly crystalline magma for the dynamics of volcanic eruptions to crystal entrainment in magmatic flow and the emplacement of xenoliths. In this paper, we present and validate a numerical method for solid-fluid coupling. The algorithm relies on a two-step projection scheme: In the first step, we solve the multiple-phase Navier-Stokes or Stokes equation in both domains. In the second step, we project the velocity field in the solid domain onto a rigid-body motion by enforcing that the deformation tensor in the respective domain is zero. This procedure is also used to enforce the no-slip boundary condition on the solid-fluid interface. We perform several benchmark computations to validate our computations. More precisely, we investigate the formation of a wake behind both fixed and mobile cylinders and cuboids with and without imposed velocity fields in the fluid. These preliminary tests indicate that our code is able to simulate solid-fluid coupling for Reynolds numbers of up to 1000. Finally, we apply our method to the problem of crystal entrainment in vigorous convection. The interplay between sedimentation and re-entrainment of crystals in convective flow is of fundamental importance for understanding the compositional evolution of magmatic reservoirs of various sizes from small lava ponds to magma oceans at the planetary scale. Previous studies of this problem have focused primarily on laboratory experiments, often with conflicting conclusions. Our work is complementary to these prior studies as we model the competing processes of gravitational sedimentation and entrainment of crystals at the length scale of the size of the crystals.
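
    The second, rigidity step admits a compact statement. One common way to realize it (a sketch of the general idea, not necessarily the authors' exact discretization) is to overwrite the provisional velocity field u inside the solid domain Ω_s with the rigid-body field that conserves its linear and angular momentum:

```latex
\begin{align*}
  \mathbf{U} &= \frac{1}{M}\int_{\Omega_s} \rho\,\mathbf{u}\,\mathrm{d}V, \qquad
  \boldsymbol{\omega} = \mathbf{I}^{-1}\!\int_{\Omega_s} \rho\,\mathbf{r}\times\mathbf{u}\,\mathrm{d}V, \\
  \mathbf{u}\big|_{\Omega_s} &\leftarrow \mathbf{U} + \boldsymbol{\omega}\times\mathbf{r},
\end{align*}
```

    with M the solid mass, I its inertia tensor and r the position relative to the center of mass. The replacement makes the deformation-rate tensor vanish in Ω_s and simultaneously provides the no-slip condition on the solid-fluid interface.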

  11. Progress on the Fabric for Frontier Experiments Project at Fermilab

    NASA Astrophysics Data System (ADS)

    Box, Dennis; Boyd, Joseph; Dykstra, Dave; Garzoglio, Gabriele; Herner, Kenneth; Kirby, Michael; Kreymer, Arthur; Levshina, Tanya; Mhashilkar, Parag; Sharma, Neha

    2015-12-01

    The FabrIc for Frontier Experiments (FIFE) project is an ambitious, major-impact initiative within the Fermilab Scientific Computing Division designed to lead the computing model for Fermilab experiments. FIFE is a collaborative effort between experimenters and computing professionals to design and develop integrated computing models for experiments of varying needs and infrastructure. The major focus of the FIFE project is the development, deployment, and integration of Open Science Grid solutions for high throughput computing, data management, database access and collaboration within experiments. To accomplish this goal, FIFE has developed workflows that utilize Open Science Grid sites along with dedicated and commercial cloud resources. The FIFE project has made significant progress integrating into experiment computing operations several services including new job submission services, software and reference data distribution through CVMFS repositories, a flexible data transfer client, and access to opportunistic resources on the Open Science Grid. The progress with current experiments and plans for expansion with additional projects will be discussed. FIFE has taken a leading role in the definition of the computing model for Fermilab experiments, aided in the design of computing for experiments beyond Fermilab, and will continue to define the future direction of high throughput computing for future physics experiments worldwide.

  12. Special Education Teacher Computer Literacy Training. Project STEEL. A Special Project To Develop and Implement a Computer-Based Special Teacher Education and Evaluation Laboratory. Volume II. Final Report.

    ERIC Educational Resources Information Center

    Frick, Theodore W.; And Others

    The document is part of the final report on Project STEEL (Special Teacher Education and Evaluation Laboratory) intended to extend the utilization of technology in the training of preservice special education teachers. This volume focuses on the second of four project objectives, the development of a special education teacher computer literacy…

  13. Measurement-based quantum computation on two-body interacting qubits with adiabatic evolution.

    PubMed

    Kyaw, Thi Ha; Li, Ying; Kwek, Leong-Chuan

    2014-10-31

    A cluster state cannot be a unique ground state of a two-body interacting Hamiltonian. Here, we propose the creation of a cluster state of logical qubits encoded in spin-1/2 particles by adiabatically weakening two-body interactions. The proposal is valid for any spatial dimensional cluster states. Errors induced by thermal fluctuations and adiabatic evolution within finite time can be eliminated ensuring fault-tolerant quantum computing schemes.

  14. Gravitational field calculations on a dynamic lattice by distributed computing.

    NASA Astrophysics Data System (ADS)

    Mähönen, P.; Punkka, V.

    A new method of numerically calculating the time evolution of a gravitational field in general relativity is introduced. Vierbein (tetrad) formalism, a dynamic lattice and massively parallelized computation are suggested, as they are expected to speed up the calculations considerably and facilitate the solution of problems previously considered too hard to be solved, such as the time evolution of a system consisting of two or more black holes or the structure of worm holes.

  15. Gravitation Field Calculations on a Dynamic Lattice by Distributed Computing

    NASA Astrophysics Data System (ADS)

    Mähönen, Petri; Punkka, Veikko

    A new method of numerically calculating the time evolution of a gravitational field in General Relativity is introduced. Vierbein (tetrad) formalism, a dynamic lattice and massively parallelized computation are suggested, as they are expected to speed up the calculations considerably and facilitate the solution of problems previously considered too hard to be solved, such as the time evolution of a system consisting of two or more black holes or the structure of worm holes.

  16. Equation-free multiscale computation: algorithms and applications.

    PubMed

    Kevrekidis, Ioannis G; Samaey, Giovanni

    2009-01-01

    In traditional physicochemical modeling, one derives evolution equations at the (macroscopic, coarse) scale of interest; these are used to perform a variety of tasks (simulation, bifurcation analysis, optimization) using an arsenal of analytical and numerical techniques. For many complex systems, however, although one observes evolution at a macroscopic scale of interest, accurate models are only given at a more detailed (fine-scale, microscopic) level of description (e.g., lattice Boltzmann, kinetic Monte Carlo, molecular dynamics). Here, we review a framework for computer-aided multiscale analysis, which enables macroscopic computational tasks (over extended spatiotemporal scales) using only appropriately initialized microscopic simulation on short time and length scales. The methodology bypasses the derivation of macroscopic evolution equations when these equations conceptually exist but are not available in closed form; hence the term equation-free. We selectively discuss basic algorithms and underlying principles and illustrate the approach through representative applications. We also discuss potential difficulties and outline areas for future research.
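
    The core idea, coarse projective integration, is simple to sketch. In the toy below (written for illustration; the review covers far more general lift/restrict operators), the "microscopic" simulator is an ensemble of noisy particles whose mean obeys a hidden coarse law roughly of the form dX/dt = -X. The coarse time derivative is estimated from a short burst of fine-scale simulation and then used to project the coarse state far forward, without ever writing the macroscopic equation down.

```python
import random

def microscopic_burst(coarse_state, dt_fine, n_steps, n_particles=5000):
    """Stand-in fine-scale simulator: an ensemble of noisy particles whose
    mean obeys (roughly) the hidden coarse law dX/dt = -X.
    Returns the restricted (ensemble-averaged) trajectory."""
    particles = [coarse_state + random.gauss(0.0, 0.05)      # lifting
                 for _ in range(n_particles)]
    averages = [sum(particles) / n_particles]
    for _ in range(n_steps):
        particles = [x - x * dt_fine + random.gauss(0.0, 0.02) * dt_fine ** 0.5
                     for x in particles]
        averages.append(sum(particles) / n_particles)        # restriction
    return averages

def projective_step(coarse_state, dt_fine=0.01, burst_steps=20, dt_coarse=0.5):
    """One equation-free step: lift, run a short microscopic burst, estimate
    the coarse time derivative, then project the coarse state far forward."""
    avg = microscopic_burst(coarse_state, dt_fine, burst_steps)
    slope = (avg[-1] - avg[-5]) / (4 * dt_fine)   # after transients have healed
    return avg[-1] + dt_coarse * slope

random.seed(2)
x = 1.0
for step in range(1, 11):
    x = projective_step(x)
    print(f"coarse time {0.7 * step:.1f}: X = {x:.3f}")
```

    The accuracy hinges on the burst being long enough for the fast microscopic transients to heal, yet much shorter than the coarse projection step.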

  17. Enhancing Student Explanations of Evolution: Comparing Elaborating and Competing Theory Prompts

    ERIC Educational Resources Information Center

    Donnelly, Dermot F.; Namdar, Bahadir; Vitale, Jonathan M.; Lai, Kevin; Linn, Marcia C.

    2016-01-01

    In this study, we explore how two different prompt types within an online computer-based inquiry learning environment enhance 392 7th grade students' explanations of evolution with three teachers. In the "elaborating" prompt condition, students are prompted to write explanations that support the accepted theory of evolution. In the…

  18. Historical and contingent factors affect re-evolution of a complex feature lost during mass extinction in communities of digital organisms.

    PubMed

    Yedid, G; Ofria, C A; Lenski, R E

    2008-09-01

    Re-evolution of complex biological features following the extinction of taxa bearing them remains one of evolution's most interesting phenomena, but is not amenable to study in fossil taxa. We used communities of digital organisms (computer programs that self-replicate, mutate and evolve), subjected to periods of low resource availability, to study the evolution, loss and re-evolution of a complex computational trait, the function EQU (bit-wise logical equals). We focused our analysis on cases where the pre-extinction EQU clade had surviving descendents at the end of the extinction episode. To see if these clades retained the capacity to re-evolve EQU, we seeded one set of multiple subreplicate 'replay' populations using the most abundant survivor of the pre-extinction EQU clade, and another set with the actual end-extinction ancestor of the organism in which EQU re-evolved following the extinction episode. Our results demonstrate that stochastic, historical, genomic and ecological factors can lead to constraints on further adaptation, and facilitate or hinder re-evolution of a complex feature.

  19. Experimental implementation of local adiabatic evolution algorithms by an NMR quantum information processor.

    PubMed

    Mitra, Avik; Ghosh, Arindam; Das, Ranabir; Patel, Apoorva; Kumar, Anil

    2005-12-01

    The quantum adiabatic algorithm is a method of solving computational problems by evolving the ground state of a slowly varying Hamiltonian. The technique uses the evolution of the ground state of a slowly varying Hamiltonian to reach the required output state. In some cases, such as the adiabatic versions of Grover's search algorithm and the Deutsch-Jozsa algorithm, applying the global adiabatic evolution yields a complexity similar to that of their classical algorithms. However, using the local adiabatic evolution, the algorithms given by J. Roland and N.J. Cerf for Grover's search [J. Roland, N.J. Cerf, Quantum search by local adiabatic evolution, Phys. Rev. A 65 (2002) 042308] and by Saurya Das, Randy Kobes, and Gabor Kunstatter for the Deutsch-Jozsa algorithm [S. Das, R. Kobes, G. Kunstatter, Adiabatic quantum computation and Deutsch's algorithm, Phys. Rev. A 65 (2002) 062301] yield a complexity of order √N (where N = 2^n and n is the number of qubits). In this paper, we report the experimental implementation of these local adiabatic evolution algorithms on a 2-qubit quantum information processor, by nuclear magnetic resonance.
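
    For the Grover case, the local schedule adapts the sweep rate to the instantaneous gap rather than using a linear ramp. In the Roland-Cerf convention (assumed here), the schedule and resulting run time are:

```latex
\begin{align*}
  \left|\frac{\mathrm{d}s}{\mathrm{d}t}\right| &= \varepsilon\, g(s)^2, \qquad
  g(s) = \sqrt{\,1 - 4\,\tfrac{N-1}{N}\, s(1-s)\,}, \\
  T &= \frac{1}{\varepsilon}\int_0^1 \frac{\mathrm{d}s}{g(s)^2}
     \;\approx\; \frac{\pi}{2\varepsilon}\sqrt{N} \quad (N \gg 1),
\end{align*}
```

    so the evolution slows down only near s = 1/2, where the gap reaches its minimum 1/√N, recovering the quadratic speedup; a uniform (global) sweep must instead be slow everywhere and takes a time of order N/ε.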

  20. The path toward HEP High Performance Computing

    NASA Astrophysics Data System (ADS)

    Apostolakis, John; Brun, René; Carminati, Federico; Gheata, Andrei; Wenzel, Sandro

    2014-06-01

    High Energy Physics code has been known for making poor use of high performance computing architectures. Efforts in optimising HEP code on vector and RISC architectures have yielded limited results, and recent studies have shown that, on modern architectures, it achieves a performance between 10% and 50% of the peak one. Although several successful attempts have been made to port selected codes on GPUs, no major HEP code suite has a "High Performance" implementation. With LHC undergoing a major upgrade and a number of challenging experiments on the drawing board, HEP can no longer neglect the less-than-optimal performance of its code and it has to try making the best use of the hardware. This activity is one of the foci of the SFT group at CERN, which hosts, among others, the ROOT and Geant4 projects. The activity of the experiments is shared and coordinated via a Concurrency Forum, where the experience in optimising HEP code is presented and discussed. Another activity is the Geant-V project, centred on the development of a high-performance prototype for particle transport. Achieving a good concurrency level on the emerging parallel architectures without a complete redesign of the framework can only be done by parallelizing at event level, or with a much larger effort at track level. Apart from the shareable data structures, this typically implies a multiplication factor in terms of memory consumption compared to the single-threaded version, together with sub-optimal handling of event processing tails. Besides this, the low-level instruction pipelining of modern processors cannot be used efficiently to speed up the program. We have implemented a framework that allows scheduling vectors of particles to an arbitrary number of computing resources in a fine-grain parallel approach. The talk will review the current optimisation activities within the SFT group with a particular emphasis on the development perspectives towards a simulation framework able to profit best from the recent technology evolution in computing.

  1. Space missions for automation and robotics technologies (SMART) program

    NASA Technical Reports Server (NTRS)

    Ciffone, D. L.; Lum, H., Jr.

    1985-01-01

    The motivations, features and expected benefits and applications of the NASA SMART program are summarized. SMART is intended to push the state of the art in automation and robotics, a goal that Public Law 98-371 mandated be an inherent part of the Space Station program. The effort would first require tests of sensors, manipulators, computers and other subsystems as seeds for the evolution of flight-qualified subsystems. Consideration is currently being given to robotics systems as add-ons to the RMS, MMU and OMV and a self-contained automation and robotics module which would be tended by astronaut visits. Probable experimentation and development paths that would be pursued with the equipment are discussed, along with the management structure and procedures for the program. The first hardware flight is projected for 1989.

  2. eTRIKS platform: Conception and operation of a highly scalable cloud-based platform for translational research and applications development.

    PubMed

    Bussery, Justin; Denis, Leslie-Alexandre; Guillon, Benjamin; Liu, Pengfeï; Marchetti, Gino; Rahal, Ghita

    2018-04-01

    We describe the genesis, design and evolution of a computing platform built to improve the success rate of biomedical translational research. The eTRIKS project platform was developed with the aim of securely hosting heterogeneous types of data and providing an optimal environment to run tranSMART analytical applications. Many types of data can now be hosted, including multi-OMICS data, preclinical laboratory data and clinical information, including longitudinal data sets. During the last two years, the platform has matured into a robust translational research knowledge management system that is able to host other data mining applications and support the development of new analytical tools. Copyright © 2018 Elsevier Ltd. All rights reserved.

  3. Computing generalized Langevin equations and generalized Fokker-Planck equations.

    PubMed

    Darve, Eric; Solomon, Jose; Kia, Amirali

    2009-07-07

    The Mori-Zwanzig formalism is an effective tool to derive differential equations describing the evolution of a small number of resolved variables. In this paper we present its application to the derivation of generalized Langevin equations and generalized non-Markovian Fokker-Planck equations. We show how long-time-scale rates and metastable basins can be extracted from these equations. Numerical algorithms are proposed to discretize these equations. An important aspect is the numerical solution of the orthogonal dynamics equation, which is a partial differential equation in a high-dimensional space. We propose efficient numerical methods to solve this orthogonal dynamics equation. In addition, we present a projection formalism of the Mori-Zwanzig type that is applicable to discrete maps. Numerical applications are presented from the field of Hamiltonian systems.
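    For orientation, a generic form of the generalized Langevin equation obtained from a Mori-Zwanzig projection is written out below; the notation (potential of mean force W, memory kernel K, orthogonal force F⁺) is standard and is not taken from the paper itself.

    ```latex
    M\,\frac{\mathrm{d}v(t)}{\mathrm{d}t}
      = -\,\nabla W\!\big(x(t)\big)
        - \int_{0}^{t} K(t-s)\, v(s)\,\mathrm{d}s
        + F^{+}(t),
    \qquad
    K(t) \;\propto\; \big\langle F^{+}(0)\, F^{+}(t) \big\rangle ,
    ```

    where the second relation is the fluctuation-dissipation statement linking the memory kernel to the autocorrelation of the orthogonal (fluctuating) force.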

  4. 10 CFR 961.11 - Text of the contract.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... program including information on cost projections, project plans and progress reports. 5. (a) Beginning on...-type documents or computer software (including computer programs, computer software data bases, and computer software documentation). Examples of technical data include research and engineering data...

  5. 10 CFR 961.11 - Text of the contract.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... program including information on cost projections, project plans and progress reports. 5. (a) Beginning on...-type documents or computer software (including computer programs, computer software data bases, and computer software documentation). Examples of technical data include research and engineering data...

  6. EON: software for long time simulations of atomic scale systems

    NASA Astrophysics Data System (ADS)

    Chill, Samuel T.; Welborn, Matthew; Terrell, Rye; Zhang, Liang; Berthet, Jean-Claude; Pedersen, Andreas; Jónsson, Hannes; Henkelman, Graeme

    2014-07-01

    The EON software is designed for simulations of the state-to-state evolution of atomic scale systems over timescales greatly exceeding that of direct classical dynamics. States are defined as collections of atomic configurations from which a minimization of the potential energy gives the same inherent structure. The time evolution is assumed to be governed by rare events, where transitions between states are uncorrelated and infrequent compared with the timescale of atomic vibrations. Several methods for calculating the state-to-state evolution have been implemented in EON, including parallel replica dynamics, hyperdynamics and adaptive kinetic Monte Carlo. Global optimization methods, including simulated annealing, basin hopping and minima hopping are also implemented. The software has a client/server architecture where the computationally intensive evaluations of the interatomic interactions are calculated on the client-side and the state-to-state evolution is managed by the server. The client supports optimization for different computer architectures to maximize computational efficiency. The server is written in Python so that developers have access to the high-level functionality without delving into the computationally intensive components. Communication between the server and clients is abstracted so that calculations can be deployed on a single machine, clusters using a queuing system, large parallel computers using a message passing interface, or within a distributed computing environment. A generic interface to the evaluation of the interatomic interactions is defined so that empirical potentials, such as in LAMMPS, and density functional theory as implemented in VASP and GPAW can be used interchangeably. Examples are given to demonstrate the range of systems that can be modeled, including surface diffusion and island ripening of adsorbed atoms on metal surfaces, molecular diffusion on the surface of ice and global structural optimization of nanoparticles.
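    The rare-event bookkeeping at the heart of kinetic Monte Carlo is compact; the sketch below is a generic illustration, not EON source code, and assumes the escape rates for the current state have already been computed (for example from harmonic transition state theory using barriers found by saddle searches).

    ```python
    # Generic kinetic Monte Carlo step (illustrative; not EON source code).
    import math
    import random

    def kmc_step(rates):
        """Return (index of chosen event, elapsed time) for one KMC step."""
        total = sum(rates)
        r = random.random() * total
        acc = 0.0
        chosen = len(rates) - 1
        for i, k in enumerate(rates):
            acc += k
            if r <= acc:
                chosen = i            # event i chosen with probability k/total
                break
        # residence time is exponentially distributed with rate `total`
        dt = -math.log(1.0 - random.random()) / total
        return chosen, dt

    event, dt = kmc_step([1.0e3, 2.5e2, 7.0e1])   # three escape processes, in 1/s
    print(f"took event {event} after {dt:.2e} s")
    ```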

  7. Using Microcomputers for Communication. Summary Report: Sociology 110 Distance Education Pilot Project.

    ERIC Educational Resources Information Center

    Misanchuk, Earl R.

    A pilot project involved off-campus (distance education) students creating their assignments on Macintosh computers and "mailing" them electronically to a campus mainframe computer. The goal of the project was to determine what is necessary to implement and to evaluate the potential of computer communications for university-level…

  8. Psychology on Computers: Simulations, Experiments and Projects.

    ERIC Educational Resources Information Center

    Belcher, Duane M.; Smith, Stephen D.

    PSYCOM is a unique mixed media package which combines high interest projects on the computer with a written text of expository material. It goes beyond most computer-assisted instruction which emphasizes drill and practice and testing of knowledge. A project might consist of a simulation or an actual experiment, or it might be a demonstration, a…

  9. CBSS Outreach Project: Computer-Based Study Strategies for Students with Learning Disabilities. Final Report.

    ERIC Educational Resources Information Center

    Anderson-Inman, Lynne; Ditson, Mary

    This final report describes activities and accomplishments of the four-year Computer-Based Study Strategies (CBSS) Outreach Project at the University of Oregon. This project disseminated information about using computer-based study strategies as an intervention for students with learning disabilities and provided teachers in participating outreach…

  10. Music and Computers: Symbiotic Learning.

    ERIC Educational Resources Information Center

    Crenshaw, John H.

    Many individuals in middle school, high school, and university settings have an interest in both music and computers. This paper seeks to direct that interest by presenting a series of computer programming projects. The 53 projects fall under two categories: musical scales and musical sound production. Each group of projects is preceded by a short…

  11. [Earth and Space Sciences Project Services for NASA HPCC]

    NASA Technical Reports Server (NTRS)

    Merkey, Phillip

    2002-01-01

    This grant supported the effort to characterize the problem domain of the Earth Science Technology Office's Computational Technologies (CT) Project and to engage the Beowulf cluster computing community as well as the high performance computing research community, so that we can predict the applicability of these technologies to the scientific community represented by the CT project and formulate long-term strategies to provide the computational resources necessary to attain the anticipated scientific objectives of the CT project. Specifically, the goal of the evaluation effort is to use the information gathered over the course of the Round-3 investigations to quantify trends in scientific expectations, in algorithmic requirements, and in the capabilities of high-performance computers to satisfy this anticipated need.

  12. Bridging Social and Semantic Computing - Design and Evaluation of User Interfaces for Hybrid Systems

    ERIC Educational Resources Information Center

    Bostandjiev, Svetlin Alex I.

    2012-01-01

    The evolution of the Web brought new interesting problems to computer scientists that we loosely classify in the fields of social and semantic computing. Social computing is related to two major paradigms: computations carried out by a large amount of people in a collective intelligence fashion (i.e. wikis), and performing computations on social…

  13. Fitting models of continuous trait evolution to incompletely sampled comparative data using approximate Bayesian computation.

    PubMed

    Slater, Graham J; Harmon, Luke J; Wegmann, Daniel; Joyce, Paul; Revell, Liam J; Alfaro, Michael E

    2012-03-01

    In recent years, a suite of methods has been developed to fit multiple rate models to phylogenetic comparative data. However, most methods have limited utility at broad phylogenetic scales because they typically require complete sampling of both the tree and the associated phenotypic data. Here, we develop and implement a new, tree-based method called MECCA (Modeling Evolution of Continuous Characters using ABC) that uses a hybrid likelihood/approximate Bayesian computation (ABC)-Markov-Chain Monte Carlo approach to simultaneously infer rates of diversification and trait evolution from incompletely sampled phylogenies and trait data. We demonstrate via simulation that MECCA has considerable power to choose among single versus multiple evolutionary rate models, and thus can be used to test hypotheses about changes in the rate of trait evolution across an incomplete tree of life. We finally apply MECCA to an empirical example of body size evolution in carnivores, and show that there is no evidence for an elevated rate of body size evolution in the pinnipeds relative to terrestrial carnivores. ABC approaches can provide a useful alternative set of tools for future macroevolutionary studies where likelihood-dependent approaches are lacking. © 2011 The Author(s). Evolution© 2011 The Society for the Study of Evolution.
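    As a toy illustration of the ABC idea underlying MECCA (this is not MECCA itself), the sketch below uses rejection sampling to recover a Brownian-motion rate parameter from a single summary statistic; the prior range, tolerance and choice of statistic are arbitrary assumptions made for brevity.

    ```python
    # Toy ABC rejection sampler (illustrative only; MECCA uses a hybrid
    # likelihood/ABC-MCMC scheme on phylogenies, which is far richer than this).
    import numpy as np

    rng = np.random.default_rng(0)

    def simulate_tip_variance(sigma2, n_tips=50, t=1.0):
        """Variance of independent Brownian tip values after time t:
        a crude summary statistic standing in for real comparative data."""
        tips = rng.normal(0.0, np.sqrt(sigma2 * t), size=n_tips)
        return tips.var()

    observed = 0.8            # hypothetical observed summary statistic
    tolerance = 0.05
    accepted = []
    for _ in range(20000):
        sigma2 = rng.uniform(0.0, 5.0)                    # draw from the prior
        if abs(simulate_tip_variance(sigma2) - observed) < tolerance:
            accepted.append(sigma2)                       # keep close simulations

    print(len(accepted), float(np.mean(accepted)))        # approximate posterior mean
    ```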

  14. Mori-Zwanzig theory for dissipative forces in coarse-grained dynamics in the Markov limit

    NASA Astrophysics Data System (ADS)

    Izvekov, Sergei

    2017-01-01

    We derive alternative Markov approximations for the projected (stochastic) force and memory function in the coarse-grained (CG) generalized Langevin equation, which describes the time evolution of the center-of-mass coordinates of clusters of particles in the microscopic ensemble. This is done with the aid of the Mori-Zwanzig projection operator method based on the recently introduced projection operator [S. Izvekov, J. Chem. Phys. 138, 134106 (2013), 10.1063/1.4795091]. The derivation exploits the "generalized additive fluctuating force" representation to which the projected force reduces in the adopted projection operator formalism. For the projected force, we present a first-order time expansion which correctly extends the static fluctuating force ansatz with the terms necessary to maintain the required orthogonality of the projected dynamics in the Markov limit to the space of CG phase variables. The approximant of the memory function correctly accounts for the momentum dependence in the lowest (second) order and indicates that such a dependence may be important in the CG dynamics approaching the Markov limit. In the case of CG dynamics with a weak dependence of the memory effects on the particle momenta, the expression for the memory function presented in this work is applicable to non-Markov systems. The approximations are formulated in a propagator-free form allowing their efficient evaluation from the microscopic data sampled by standard molecular dynamics simulations. A numerical application is presented for a molecular liquid (nitromethane). With our formalism we do not observe the "plateau-value problem" if the friction tensors for dissipative particle dynamics (DPD) are computed using the Green-Kubo relation. Our formalism provides a consistent bottom-up route for hierarchical parametrization of DPD models from atomistic simulations.
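    The Green-Kubo route mentioned at the end is easy to state in code: a Markovian friction coefficient is the time integral of the fluctuating-force autocorrelation divided by kBT. The sketch below is a generic estimator for a single scalar force component using synthetic data; it does not implement the paper's propagator-free projected-force machinery or friction tensors and is offered only to fix ideas.

    ```python
    # Generic Green-Kubo friction estimate from a sampled fluctuating force.
    import numpy as np

    def friction_green_kubo(forces, dt, kBT):
        """gamma = (1/kBT) * integral of <dF(0) dF(t)> dt, via FFT autocorrelation."""
        df = forces - forces.mean()
        n = len(df)
        spec = np.fft.rfft(df, 2 * n)                      # zero-padded FFT
        acf = np.fft.irfft(spec * np.conj(spec))[: n // 2]
        acf /= np.arange(n, n - n // 2, -1)                # unbiased normalization
        # trapezoidal time integral of the autocorrelation function
        integral = dt * (0.5 * acf[0] + acf[1:].sum())
        return integral / kBT

    rng = np.random.default_rng(1)
    forces = rng.normal(size=100_000)                      # stand-in for MD samples
    print(friction_green_kubo(forces, dt=0.002, kBT=2.494))  # kBT in kJ/mol at 300 K
    ```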

  15. Can An Evolutionary Process Create English Text?

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bailey, David H.

    Critics of the conventional theory of biological evolution have asserted that while natural processes might result in some limited diversity, nothing fundamentally new can arise from 'random' evolution. In response, biologists such as Richard Dawkins have demonstrated that a computer program can generate a specific short phrase via evolution-like iterations starting with random gibberish. While such demonstrations are intriguing, they are flawed in that they have a fixed, pre-specified future target, whereas in real biological evolution there is no fixed future target, but only a complicated 'fitness landscape'. In this study, a significantly more sophisticated evolutionary scheme is employed to produce text segments reminiscent of a Charles Dickens novel. The aggregate size of these segments is larger than the computer program and the input Dickens text, even when comparing compressed data (as a measure of information content).

  16. ComputerTown: A Do-It-Yourself Community Computer Project. [Computer Town, USA and Other Microcomputer Based Alternatives to Traditional Learning Environments].

    ERIC Educational Resources Information Center

    Zamora, Ramon M.

    Alternative learning environments offering computer-related instruction are developing around the world. Storefront learning centers, museum-based computer facilities, and special theme parks are some of the new concepts. ComputerTown, USA! is a public access computer literacy project begun in 1979 to serve both adults and children in Menlo Park…

  17. [Orthogonal Vector Projection Algorithm for Spectral Unmixing].

    PubMed

    Song, Mei-ping; Xu, Xing-wei; Chang, Chein-I; An, Ju-bai; Yao, Li

    2015-12-01

    Spectral unmixing is an important part of hyperspectral technologies and is essential for material quantity analysis in hyperspectral imagery. Most linear unmixing algorithms require matrix multiplication and matrix inversion or determinant computation. These are difficult to program and especially hard to realize in hardware. At the same time, the computational cost of these algorithms increases significantly as the number of endmembers grows. Here, based on the traditional Orthogonal Subspace Projection algorithm, a new method called Orthogonal Vector Projection is proposed using the orthogonality principle. It simplifies the process by avoiding matrix multiplication and inversion. It first computes, via the Gram-Schmidt process, the final orthogonal vector for each endmember spectrum. These orthogonal vectors are then used as projection vectors for the pixel signature. The unconstrained abundance is obtained directly by projecting the signature onto the projection vectors and computing the ratio of the projected vector length to the orthogonal vector length. Compared to the Orthogonal Subspace Projection and Least Squares Error algorithms, this method does not need matrix inversion, which is computationally costly and hard to implement in hardware. It completes the orthogonalization process through repeated vector operations, making it easy to apply both in parallel computation and in hardware. The soundness of the algorithm is proved by its relationship with the Orthogonal Subspace Projection and Least Squares Error algorithms, and its computational complexity, the lowest of the three, is also compared with that of the other two algorithms. Finally, experimental results on synthetic and real images are provided, giving further evidence of the effectiveness of the method.
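    The projection-ratio idea described above can be written in a few lines; the sketch below is my own rendering for a single pixel with synthetic data and classical Gram-Schmidt, not the authors' implementation.

    ```python
    # Illustrative orthogonal-vector-projection unmixing for one pixel
    # (not the authors' code): for each endmember, Gram-Schmidt out the other
    # endmembers, then take the ratio of the pixel's projection onto that
    # residual vector to the endmember's own projection onto it.
    import numpy as np

    def ovp_abundances(pixel, endmembers):
        """pixel: (bands,), endmembers: (bands, p). Returns (p,) abundances."""
        _, p = endmembers.shape
        abundances = np.empty(p)
        for i in range(p):
            others = [endmembers[:, j].astype(float) for j in range(p) if j != i]
            basis = []                       # orthonormal basis of the other endmembers
            for v in others:
                u = v.copy()
                for b in basis:
                    u -= (u @ b) * b
                norm = np.linalg.norm(u)
                if norm > 1e-12:
                    basis.append(u / norm)
            w = endmembers[:, i].astype(float)   # residual orthogonal to the others
            for b in basis:
                w = w - (w @ b) * b
            abundances[i] = (pixel @ w) / (endmembers[:, i] @ w)
        return abundances

    rng = np.random.default_rng(0)
    E = rng.random((20, 3))                  # 20 bands, 3 synthetic endmembers
    true = np.array([0.5, 0.3, 0.2])
    r = E @ true + 0.001 * rng.normal(size=20)
    print(ovp_abundances(r, E))              # close to [0.5, 0.3, 0.2]
    ```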

  18. Specification of Computer Systems by Objectives.

    ERIC Educational Resources Information Center

    Eltoft, Douglas

    1989-01-01

    Discusses the evolution of mainframe and personal computers, and presents a case study of a network developed at the University of Iowa called the Iowa Computer-Aided Engineering Network (ICAEN) that combines Macintosh personal computers with Apollo workstations. Functional objectives are stressed as the best measure of system performance. (LRW)

  19. Simulation of multi-pulse coaxial helicity injection in the Sustained Spheromak Physics Experiment

    NASA Astrophysics Data System (ADS)

    O'Bryan, J. B.; Romero-Talamás, C. A.; Woodruff, S.

    2018-03-01

    Nonlinear, numerical computation with the NIMROD code is used to explore magnetic self-organization during multi-pulse coaxial helicity injection in the Sustained Spheromak Physics eXperiment. We describe multiple distinct phases of spheromak evolution, starting from vacuum magnetic fields and the formation of the initial magnetic flux bubble through multiple refluxing pulses and the eventual onset of the column mode instability. Experimental and computational magnetic diagnostics agree on the onset of the column mode instability, which first occurs during the second refluxing pulse of the simulated discharge. Our computations also reproduce the injector voltage traces, despite only specifying the injector current and not explicitly modeling the external capacitor bank circuit. The computations demonstrate that global magnetic evolution is fairly robust to different transport models and, therefore, that a single fluid-temperature model is sufficient for a broader, qualitative assessment of spheromak performance. Although discharges with similar traces of normalized injector current produce similar global spheromak evolution, details of the current distribution during the column mode instability impact the relative degree of poloidal flux amplification and magnetic helicity content.

  20. Convection in containerless processing.

    PubMed

    Hyers, Robert W; Matson, Douglas M; Kelton, Kenneth F; Rogers, Jan R

    2004-11-01

    Different containerless processing techniques have different strengths and weaknesses. Applying more than one technique allows various parts of a problem to be solved separately. For two research projects, one on phase selection in steels and the other on nucleation and growth of quasicrystals, a combination of experiments using electrostatic levitation (ESL) and electromagnetic levitation (EML) is appropriate. In both experiments, convection is an important variable. The convective conditions achievable with each method are compared for two very different materials: a low-viscosity, high-temperature stainless steel, and a high-viscosity, low-temperature quasicrystal-forming alloy. It is clear that the techniques are complementary when convection is a parameter to be explored in the experiments. For a number of reasons, including the sample size, temperature, and reactivity, direct measurement of the convective velocity is not feasible. Therefore, we must rely on computation techniques to estimate convection in these experiments. These models are an essential part of almost any microgravity investigation. The methods employed and results obtained for the projects levitation observation of dendrite evolution in steel ternary alloy rapid solidification (LODESTARS) and quasicrystalline undercooled alloys for space investigation (QUASI) are explained.

  1. Reconfigurable modular computer networks for spacecraft on-board processing

    NASA Technical Reports Server (NTRS)

    Rennels, D. A.

    1978-01-01

    The core electronics subsystems on unmanned spacecraft, which have been sent over the last 20 years to investigate the moon, Mars, Venus, and Mercury, have progressed through an evolution from simple fixed controllers and analog computers in the 1960's to general-purpose digital computers in current designs. This evolution is now moving in the direction of distributed computer networks. Current Voyager spacecraft already use three on-board computers. One is used to store commands and provide overall spacecraft management. Another is used for instrument control and telemetry collection, and the third computer is used for attitude control and scientific instrument pointing. An examination of the control logic in the instruments shows that, for many, it is cost-effective to replace the sequencing logic with a microcomputer. The Unified Data System architecture considered consists of a set of standard microcomputers connected by several redundant buses. A typical self-checking computer module will contain 23 RAMs, two microprocessors, one memory interface, three bus interfaces, and one core building block.

  2. Wide-angle display developments by computer graphics

    NASA Technical Reports Server (NTRS)

    Fetter, William A.

    1989-01-01

    Computer graphics can now expand its new subset, wide-angle projection, to be as significant a generic capability as computer graphics itself. Some prior work in computer graphics is presented which leads to an attractive further subset of wide-angle projection, called hemispheric projection, as a major communication medium. Hemispheric film systems have long been present, and such computer graphics systems are in use in simulators. This is the leading edge of capabilities which should ultimately be as ubiquitous as CRTs (cathode-ray tubes). These assertions derive not from degrees in science, nor solely from a degree in graphic design, but from a history of computer graphics innovations that laid groundwork by demonstration. The author believes that it is timely to look at several development strategies, since hemispheric projection is now at a point comparable to the early stages of computer graphics, requiring similar patterns of development again.

  3. Dynamic Mesh Adaptation for Front Evolution Using Discontinuous Galerkin Based Weighted Condition Number Mesh Relaxation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Greene, Patrick T.; Schofield, Samuel P.; Nourgaliev, Robert

    2016-06-21

    A new mesh smoothing method designed to cluster mesh cells near a dynamically evolving interface is presented. The method is based on weighted condition number mesh relaxation with the weight function being computed from a level set representation of the interface. The weight function is expressed as a Taylor series based discontinuous Galerkin projection, which makes the computation of the derivatives of the weight function needed during the condition number optimization process a trivial matter. For cases when a level set is not available, a fast method for generating a low-order level set from discrete cell-centered fields, such as a volume fraction or index function, is provided. Results show that the low-order level set works equally well for the weight function as the actual level set. Meshes generated for a number of interface geometries are presented, including cases with multiple level sets. Dynamic cases for moving interfaces are presented to demonstrate the method's potential usefulness to arbitrary Lagrangian Eulerian (ALE) methods.

  4. Temperature specification in atomistic molecular dynamics and its impact on simulation efficacy

    NASA Astrophysics Data System (ADS)

    Ocaya, R. O.; Terblans, J. J.

    2017-10-01

    Temperature is a vital thermodynamical function for physical systems. Knowledge of the system temperature permits assessment of system ergodicity, entropy, state and stability. Rapid theoretical and computational developments in the fields of condensed matter physics, chemistry, materials science, molecular biology, nanotechnology and others necessitate clarity in the temperature specification. Temperature-based materials simulations, both standalone and distributed, are projected to grow in prominence over diverse research fields. In this article we discuss the apparent variability of temperature modeling formalisms currently used in atomistic molecular dynamics simulations, with respect to system energetics, dynamics and structural evolution. Commercial simulation programs, which by nature are heuristic, do not openly discuss this fundamental question. We address temperature specification in the context of atomistic molecular dynamics. We define a thermostat at 400 K relative to a heat bath at 300 K, first using a modified ab-initio Newtonian method and second using a Monte-Carlo method. The thermostatic vacancy formation and cohesion energies, and the equilibrium lattice constant for FCC copper, are then calculated. Finally, we compare and contrast the results.
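    As a point of reference for the discussion above, the sketch below computes the kinetic (equipartition) temperature of a set of velocities and rescales them toward a 400 K target. It is a generic illustration, not the modified ab-initio Newtonian or Monte-Carlo thermostat of the paper, and the copper-like masses and velocity spread are example values only.

    ```python
    # Kinetic temperature from equipartition, plus a simple velocity-rescaling
    # step toward a target thermostat temperature (generic illustration).
    import numpy as np

    KB = 1.380649e-23          # Boltzmann constant, J/K

    def kinetic_temperature(v, m):
        """v: (N, 3) velocities in m/s, m: (N,) masses in kg."""
        kinetic = 0.5 * np.sum(m[:, None] * v ** 2)
        dof = 3 * len(m)                       # ignoring constraints/COM removal
        return 2.0 * kinetic / (dof * KB)

    def rescale_to(v, m, target=400.0):
        t = kinetic_temperature(v, m)
        return v * np.sqrt(target / t)

    rng = np.random.default_rng(0)
    m = np.full(500, 1.0545e-25)               # ~63.5 u (copper-like), in kg
    v = rng.normal(scale=300.0, size=(500, 3))
    print(kinetic_temperature(v, m), kinetic_temperature(rescale_to(v, m), m))
    ```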

  5. The ground state tunneling splitting and the zero point energy of malonaldehyde: a quantum Monte Carlo determination.

    PubMed

    Viel, Alexandra; Coutinho-Neto, Maurício D; Manthe, Uwe

    2007-01-14

    Quantum dynamics calculations of the ground state tunneling splitting and of the zero point energy of malonaldehyde on the full dimensional potential energy surface proposed by Yagi et al. [J. Chem. Phys. 115, 10647 (2001)] are reported. The exact diffusion Monte Carlo and the projection operator imaginary time spectral evolution methods are used to compute accurate benchmark results for this 21-dimensional ab initio potential energy surface. A tunneling splitting of 25.7+/-0.3 cm-1 is obtained, and the vibrational ground state energy is found to be 15 122+/-4 cm-1. Isotopic substitution of the tunneling hydrogen modifies the tunneling splitting down to 3.21+/-0.09 cm-1 and the vibrational ground state energy to 14 385+/-2 cm-1. The computed tunneling splittings are slightly higher than the experimental values, as expected from the potential energy surface, which slightly underestimates the barrier height, and they are slightly lower than the results from instanton theory obtained using the same potential energy surface.
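    For readers unfamiliar with diffusion Monte Carlo, the sketch below shows the bare branching-random-walk version of the method on a 1D harmonic oscillator in reduced units (exact ground state energy 0.5). The 21-dimensional malonaldehyde calculations reported above are far more sophisticated, so this is only a conceptual illustration; the time step, population-control rule and walker count are arbitrary choices.

    ```python
    # Bare diffusion Monte Carlo for a 1D harmonic oscillator (reduced units,
    # exact E0 = 0.5). Conceptual illustration only.
    import numpy as np

    rng = np.random.default_rng(0)

    def potential(x):
        return 0.5 * x ** 2

    def dmc(n_walkers=2000, dt=0.01, n_steps=5000, n_equil=1000):
        x = rng.normal(size=n_walkers)
        e_ref = float(potential(x).mean())
        samples = []
        for step in range(n_steps):
            x = x + rng.normal(scale=np.sqrt(dt), size=len(x))       # diffusion
            w = np.exp(-dt * (potential(x) - e_ref))                 # branching weights
            copies = np.minimum((w + rng.random(len(x))).astype(int), 3)
            x = np.repeat(x, copies)                                 # birth/death
            # population-control feedback on the reference (growth) energy
            e_ref = float(potential(x).mean()) + (1.0 - len(x) / n_walkers) / dt
            if step >= n_equil:
                samples.append(e_ref)
        return float(np.mean(samples))

    print(dmc())     # should land near 0.5, up to time-step and population bias
    ```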

  6. SOAP. A tool for the fast computation of photometry and radial velocity induced by stellar spots

    NASA Astrophysics Data System (ADS)

    Boisse, I.; Bonfils, X.; Santos, N. C.

    2012-09-01

    We present SOAP, Spot Oscillation And Planet, a software tool that simulates the effect of stellar spots and plages on radial velocimetry and photometry, and make it available to the community. This paper describes the tool release and provides instructions for its use. We present detailed tests against previous computations and real data to assess the code's performance and to validate its suitability. We characterize the variations of the radial velocity, line bisector, and photometric amplitude as a function of the main variables: projected stellar rotational velocity, filling factor of the spot, resolution of the spectrograph, linear limb-darkening coefficient, latitude of the spot, and inclination of the star. Finally, we model the spot distributions on the active stars HD 166435, TW Hya and HD 189733, which reproduce the observations. We show that the software is remarkably fast, allowing several extensions of its capabilities to address the next challenges in the exoplanetary field connected with stellar variability. The tool is available at http://www.astro.up.pt/soap

  7. International Conference on Large Meteorite Impacts and Planetary Evolution

    NASA Technical Reports Server (NTRS)

    1992-01-01

    The papers that were accepted for the International Conference on Large Meteorite Impacts and Planetary Evolution, 31 Aug. - 2 Sep. 1992, are presented. One of the major paper topics was the Sudbury project.

  8. A Synthesis and Survey of Critical Success Factors for Computer Technology Projects

    ERIC Educational Resources Information Center

    Baker, Ross A.

    2012-01-01

    The author investigated the existence of critical success factors for computer technology projects. Current research literature and a survey of experienced project managers indicate that there are 23 critical success factors (CSFs) that correlate with project success. The survey gathered an assessment of project success and the degree to which…

  9. Using Raspberry Pi to Teach Computing "Inside Out"

    ERIC Educational Resources Information Center

    Jaokar, Ajit

    2013-01-01

    This article discusses the evolution of computing education in preparing for the next wave of computing. With the proliferation of mobile devices, most agree that we are living in a "post-PC" world. Using the Raspberry Pi computer platform, based in the UK, as an example, the author discusses computing education in a world where the…

  10. Bibliography. Computer-Oriented Projects, 1987.

    ERIC Educational Resources Information Center

    Smith, Richard L., Comp.

    1988-01-01

    Provides an annotated list of references on computer-oriented projects. Includes information on computers; hands-on versus simulations; games; instruction; students' attitudes and learning styles; artificial intelligence; tutoring; and application of spreadsheets. (RT)

  11. Institutional computing (IC) information session

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Koch, Kenneth R; Lally, Bryan R

    2011-01-19

    The LANL Institutional Computing Program (IC) will host an information session about the current state of unclassified Institutional Computing at Los Alamos, exciting plans for the future, and the current call for proposals for science and engineering projects requiring computing. Program representatives will give short presentations and field questions about the call for proposals and future planned machines, and discuss technical support available to existing and future projects. Los Alamos has started making a serious institutional investment in open computing for its science projects, and that investment is expected to increase further.

  12. QMC Goes BOINC: Using Public Resource Computing to Perform Quantum Monte Carlo Calculations

    NASA Astrophysics Data System (ADS)

    Rainey, Cameron; Engelhardt, Larry; Schröder, Christian; Hilbig, Thomas

    2008-10-01

    Theoretical modeling of magnetic molecules traditionally involves the diagonalization of quantum Hamiltonian matrices. However, as the complexity of these molecules increases, the matrices become so large that this process becomes unusable. An additional challenge to this modeling is that many repetitive calculations must be performed, further increasing the need for computing power. Both of these obstacles can be overcome by using a quantum Monte Carlo (QMC) method and a distributed computing project. We have recently implemented a QMC method within the Spinhenge@home project, which is a Public Resource Computing (PRC) project where private citizens allow part-time usage of their PCs for scientific computing. The use of PRC for scientific computing will be described in detail, as well as how you can contribute to the project. See, e.g., L. Engelhardt, et al., Angew. Chem. Int. Ed. 47, 924 (2008). C. Schröder, in Distributed & Grid Computing - Science Made Transparent for Everyone. Principles, Applications and Supporting Communities. (Weber, M.H.W., ed., 2008). Project URL: http://spin.fh-bielefeld.de

  13. Phenotypic models of evolution and development: geometry as destiny.

    PubMed

    François, Paul; Siggia, Eric D

    2012-12-01

    Quantitative models of development that consider all relevant genes typically are difficult to fit to embryonic data alone and have many redundant parameters. Computational evolution supplies models of phenotype with relatively few variables and parameters that allows the patterning dynamics to be reduced to a geometrical picture for how the state of a cell moves. The clock and wavefront model, that defines the phenotype of somitogenesis, can be represented as a sequence of two discrete dynamical transitions (bifurcations). The expression-time to space map for Hox genes and the posterior dominance rule are phenotypes that naturally follow from computational evolution without considering the genetics of Hox regulation. Copyright © 2012 Elsevier Ltd. All rights reserved.

  14. Nonsequential Computation and Laws of Nature.

    DTIC Science & Technology

    1986-05-01

    computing engines arose as a byproduct of the Manhattan Project in World War II. Broadly speaking, their purpose was to compute numerical solutions to...nature, and to representing algorithms in structures of space and time. After the Manhattan Project had been fulfilled, computer designers quickly pro

  15. A Three Cohort Study of Role-Play Instruction for Agile Project Management

    ERIC Educational Resources Information Center

    Schmitz, Kurt

    2018-01-01

    Agile Project Management methods and processes that emphasize action and feedback over planning continue to gain prominence for Information Systems projects. This topic is an ideal candidate to lead the evolution of project management instruction from teaching "about" to learning "how to." This paper describes a role-play…

  16. Blessing and curse of chaos in numerical turbulence simulations

    NASA Astrophysics Data System (ADS)

    Lee, Jon

    1994-03-01

    Because of the trajectory instability, time reversal is not possible beyond a certain evolution time and hence the time irreversibility prevails under the finite-accuracy trajectory computation. This therefore provides a practical reconciliation of the dynamic reversibility and macroscopic irreversibility (blessing of chaos). On the other hand, the trajectory instability is also responsible for a limited evolution time, so that finite-accuracy computation would yield a pseudo-orbit which is totally unrelated to the true trajectory (curse of chaos). For the inviscid 2D flow, however, we can accurately compute the long-time average of flow quantities with a pseudo-orbit by invoking the ergodic theorem.
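    The point about finite-accuracy irreversibility can be demonstrated in a few lines: the sketch below integrates the chaotic Lorenz system forward with fixed-step RK4 and then backward, and the error in recovering the initial state grows with the forward evolution time. The Lorenz system stands in for the 2D inviscid flow discussed in the paper, an assumption made purely for brevity.

    ```python
    # Finite-precision time reversal of a chaotic system (illustrative):
    # forward RK4 integration followed by backward integration fails to
    # recover the initial state once the evolution time is long enough.
    import numpy as np

    def lorenz(state, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
        x, y, z = state
        return np.array([sigma * (y - x), x * (rho - z) - y, x * y - beta * z])

    def rk4(f, state, dt, n):
        for _ in range(n):
            k1 = f(state)
            k2 = f(state + 0.5 * dt * k1)
            k3 = f(state + 0.5 * dt * k2)
            k4 = f(state + dt * k3)
            state = state + dt / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)
        return state

    start = np.array([1.0, 1.0, 1.0])
    for t in (5.0, 20.0, 40.0):
        n = int(t / 1e-3)
        forward = rk4(lorenz, start, 1e-3, n)
        back = rk4(lorenz, forward, -1e-3, n)
        print(t, np.linalg.norm(back - start))   # error grows with evolution time
    ```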

  17. Cosmological applications of singular hypersurfaces in general relativity

    NASA Astrophysics Data System (ADS)

    Laguna-Castillo, Pablo

    Three applications to cosmology of surface layers, based on Israel's formalism of singular hypersurfaces and thin shells in general relativity, are presented. Einstein's field equations are analyzed in the presence of a bubble nucleated in vacuum phase transitions within the context of the old inflationary universe scenario. The evolution of a bubble with vanishing surface energy density is studied. It is found that such bubbles lead to a worm-hole matching. Next, the observable four-dimensional universe is considered as a singular hypersurface of discontinuity embedded in a five-dimensional Kaluza-Klein cosmology. It is possible to rewrite the projected five-dimensional Einstein equations on the surface layer in a similar way to the four-dimensional Robertson-Walker cosmology equations. Next, a model is described for an infinite-length, straight U(1) cosmic string as a cylindrical, singular shell enclosing a region of false vacuum. A set of equations is introduced which are required to develop a three-dimensional computer code whose purpose is to study the process of intercommuting cosmic strings with the inclusion of gravitational effects. The outcome is evolution and constraint equations for the gravitational, scalar and gauge field of two initially separated, perpendicular, cosmic strings.

  18. Comparison of human cell signaling pathway databases—evolution, drawbacks and challenges

    PubMed Central

    Chowdhury, Saikat; Sarkar, Ram Rup

    2015-01-01

    Elucidating the complexities of cell signaling pathways is of immense importance for understanding various biological phenomena, such as the dynamics of gene/protein expression regulation, cell fate determination, embryogenesis and disease progression. The successful completion of the human genome project has also helped experimental and theoretical biologists to analyze various important pathways. To advance this study, during the past two decades, systematic collections of pathway data from experimental studies have been compiled and distributed freely by several databases, which also integrate various computational tools for further analysis. Despite significant advancements, there exist several drawbacks and challenges, such as pathway data heterogeneity, annotation, regular updating and automated image reconstruction, which motivated us to perform a thorough review of 24 popular and actively maintained cell signaling databases. Based on two major characteristics, pathway information and technical details, freely accessible data from commercial and academic databases are examined to understand their evolution and enrichment. This review not only helps to identify some novel and useful features that are not yet included in any of the databases, but also highlights their current limitations and proposes reasonable solutions for future database development, which could be useful to the whole scientific community. PMID:25632107

  19. EUV laser produced and induced plasmas for nanolithography

    NASA Astrophysics Data System (ADS)

    Sizyuk, Tatyana; Hassanein, Ahmed

    2017-10-01

    EUV laser-produced plasma sources are being extensively studied for the development of new technology for computer chip production. Challenging tasks include optimizing EUV source efficiency, producing a powerful source within the 2% bandwidth around 13.5 nm for high volume manufacturing (HVM), and increasing the lifetime of the collecting optics. Mass-limited targets, such as small droplets, reduce contamination of the chamber environment and damage to the mirror surface. However, reducing droplet size limits EUV power output. Our analysis showed the target parameters and chamber conditions required to achieve 500 W EUV output for HVM. The HEIGHTS package was used to simulate laser-produced plasma evolution, starting from laser interaction with the solid target through the development and expansion of the vapor/plasma plume, with accurate optical data calculation, especially in the narrow EUV region. Detailed 3D modeling of the mixed environment, including the evolution and interplay of the plasma produced by lasers from the Sn target and the plasma produced by in-band and out-of-band EUV radiation in the ambient gas used for collecting-optics protection and cleaning, allowed prediction of conditions in the entire LPP system. The effect of these conditions on EUV photon absorption and collection was analyzed. This work is supported by the National Science Foundation, PIRE project.

  20. Theoretical Near-IR Spectra for Surface Abundance Studies of Massive Stars

    NASA Technical Reports Server (NTRS)

    Sonneborn, George; Bouret, J.

    2011-01-01

    We present initial results of a study of abundance and mass loss properties of O-type stars based on theoretical near-IR spectra computed with state-of-the-art stellar atmosphere models. The James Webb Space Telescope (JWST) will be a powerful tool to obtain high signal-to-noise ratio near-IR (1-5 micron) spectra of massive stars in different environments of local galaxies. Our goal is to analyze model near-IR spectra corresponding to those expected from NIRSpec on JWST in order to map the wind properties and surface composition across the parameter range of O stars and to determine projected rotational velocities. As a massive star evolves, internal coupling, related mixing, and mass loss impact its intrinsic rotation rate. These three parameters form an intricate loop, where enhanced rotation leads to more mixing, which in turn changes the mass loss rate, the latter thus affecting the rotation rate. Since the effects of rotation are expected to be much more pronounced at low metallicity, we pay special attention to models for massive stars in the Small Magellanic Cloud. This galaxy provides a unique opportunity to probe stellar evolution, and the feedback of massive stars on galactic evolution, in conditions similar to the epoch of maximal star formation. Plain-Language Abstract: We present initial results of a study of abundance and mass loss properties of massive stars based on theoretical near-infrared (1-5 micron) spectra computed with state-of-the-art stellar atmosphere models. This study is to prepare for observations by the James Webb Space Telescope.

  1. Progress on the FabrIc for Frontier Experiments project at Fermilab

    DOE PAGES

    Box, Dennis; Boyd, Joseph; Dykstra, Dave; ...

    2015-12-23

    The FabrIc for Frontier Experiments (FIFE) project is an ambitious, major-impact initiative within the Fermilab Scientific Computing Division designed to lead the computing model for Fermilab experiments. FIFE is a collaborative effort between experimenters and computing professionals to design and develop integrated computing models for experiments of varying needs and infrastructure. The major focus of the FIFE project is the development, deployment, and integration of Open Science Grid solutions for high throughput computing, data management, database access and collaboration within an experiment. To accomplish this goal, FIFE has developed workflows that utilize Open Science Grid sites along with dedicated and commercial cloud resources. The FIFE project has made significant progress integrating into experiment computing operations several services, including new job submission services, software and reference data distribution through CVMFS repositories, a flexible data transfer client, and access to opportunistic resources on the Open Science Grid. The progress with current experiments and plans for expansion with additional projects will be discussed. FIFE has taken a leading role in the definition of the computing model for Fermilab experiments, aided in the design of computing for experiments beyond Fermilab, and will continue to define the future direction of high throughput computing for future physics experiments worldwide.

  2. The Magellan Final Report on Cloud Computing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    ,; Coghlan, Susan; Yelick, Katherine

    The goal of Magellan, a project funded through the U.S. Department of Energy (DOE) Office of Advanced Scientific Computing Research (ASCR), was to investigate the potential role of cloud computing in addressing the computing needs for the DOE Office of Science (SC), particularly related to serving the needs of mid-range computing and future data-intensive computing workloads. A set of research questions was formed to probe various aspects of cloud computing from performance, usability, and cost. To address these questions, a distributed testbed infrastructure was deployed at the Argonne Leadership Computing Facility (ALCF) and the National Energy Research Scientific Computing Center (NERSC). The testbed was designed to be flexible and capable enough to explore a variety of computing models and hardware design points in order to understand the impact for various scientific applications. During the project, the testbed also served as a valuable resource to application scientists. Applications from a diverse set of projects such as MG-RAST (a metagenomics analysis server), the Joint Genome Institute, the STAR experiment at the Relativistic Heavy Ion Collider, and the Laser Interferometer Gravitational Wave Observatory (LIGO), were used by the Magellan project for benchmarking within the cloud, but the project teams were also able to accomplish important production science utilizing the Magellan cloud resources.

  3. The FIFE Project at Fermilab

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Box, D.; Boyd, J.; Di Benedetto, V.

    2016-01-01

    The FabrIc for Frontier Experiments (FIFE) project is an initiative within the Fermilab Scientific Computing Division designed to steer the computing model for non-LHC Fermilab experiments across multiple physics areas. FIFE is a collaborative effort between experimenters and computing professionals to design and develop integrated computing models for experiments of varying size, needs, and infrastructure. The major focus of the FIFE project is the development, deployment, and integration of solutions for high throughput computing, data management, database access and collaboration management within an experiment. To accomplish this goal, FIFE has developed workflows that utilize Open Science Grid compute sites along with dedicated and commercial cloud resources. The FIFE project has made significant progress integrating into experiment computing operations several services including a common job submission service, software and reference data distribution through CVMFS repositories, flexible and robust data transfer clients, and access to opportunistic resources on the Open Science Grid. The progress with current experiments and plans for expansion with additional projects will be discussed. FIFE has taken the leading role in defining the computing model for Fermilab experiments, aided in the design of experiments beyond those hosted at Fermilab, and will continue to define the future direction of high throughput computing for future physics experiments worldwide.

  4. Thermodynamic characterization of networks using graph polynomials

    NASA Astrophysics Data System (ADS)

    Ye, Cheng; Comin, César H.; Peron, Thomas K. DM.; Silva, Filipi N.; Rodrigues, Francisco A.; Costa, Luciano da F.; Torsello, Andrea; Hancock, Edwin R.

    2015-09-01

    In this paper, we present a method for characterizing the evolution of time-varying complex networks by adopting a thermodynamic representation of network structure computed from a polynomial (or algebraic) characterization of graph structure. Commencing from a representation of graph structure based on a characteristic polynomial computed from the normalized Laplacian matrix, we show how the polynomial is linked to the Boltzmann partition function of a network. This allows us to compute a number of thermodynamic quantities for the network, including the average energy and entropy. Assuming that the system does not change volume, we can also compute the temperature, defined as the rate of change of entropy with energy. All three thermodynamic variables can be approximated using low-order Taylor series that can be computed using the traces of powers of the Laplacian matrix, avoiding explicit computation of the normalized Laplacian spectrum. These polynomial approximations allow a smoothed representation of the evolution of networks to be constructed in the thermodynamic space spanned by entropy, energy, and temperature. We show how these thermodynamic variables can be computed in terms of simple network characteristics, e.g., the total number of nodes and node degree statistics for nodes connected by edges. We apply the resulting thermodynamic characterization to real-world time-varying networks representing complex systems in the financial and biological domains. The study demonstrates that the method provides an efficient tool for detecting abrupt changes and characterizing different stages in network evolution.
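    The sketch below computes the partition function, average energy and entropy directly from the normalized-Laplacian spectrum of a toy evolving network. It bypasses the trace-based Taylor approximations developed in the paper and treats the inverse temperature β as a free parameter, so it is only an illustration of the quantities involved.

    ```python
    # Thermodynamic-style characterization of a graph from its normalized
    # Laplacian spectrum (illustrative; the paper uses trace-based approximations).
    import numpy as np

    def normalized_laplacian(adjacency):
        d = adjacency.sum(axis=1)
        d_inv_sqrt = np.diag(1.0 / np.sqrt(np.where(d > 0, d, 1.0)))
        return np.eye(len(adjacency)) - d_inv_sqrt @ adjacency @ d_inv_sqrt

    def thermodynamics(adjacency, beta=1.0):
        lam = np.linalg.eigvalsh(normalized_laplacian(adjacency))
        weights = np.exp(-beta * lam)
        z = weights.sum()                         # partition function
        p = weights / z                           # Boltzmann occupation probabilities
        energy = float(np.sum(p * lam))           # average energy
        entropy = float(-np.sum(p * np.log(p)))   # Gibbs entropy
        return z, energy, entropy

    # toy time-varying network: a ring that gains one random edge per snapshot
    rng = np.random.default_rng(0)
    n = 12
    a = np.zeros((n, n))
    for i in range(n):
        a[i, (i + 1) % n] = a[(i + 1) % n, i] = 1
    for snapshot in range(3):
        i, j = rng.choice(n, size=2, replace=False)
        a[i, j] = a[j, i] = 1
        print(snapshot, thermodynamics(a))
    ```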

  5. Computer-based communication in support of scientific and technical work. [conferences on management information systems used by scientists of NASA programs

    NASA Technical Reports Server (NTRS)

    Vallee, J.; Wilson, T.

    1976-01-01

    Results are reported of the first experiments for a computer conference management information system at the National Aeronautics and Space Administration. Between August 1975 and March 1976, two NASA projects with geographically separated participants (NASA scientists) used the PLANET computer conferencing system for portions of their work. The first project was a technology assessment of future transportation systems. The second project involved experiments with the Communication Technology Satellite. As part of this project, pre- and postlaunch operations were discussed in a computer conference. These conferences also provided the context for an analysis of the cost of computer conferencing. In particular, six cost components were identified: (1) terminal equipment, (2) communication with a network port, (3) network connection, (4) computer utilization, (5) data storage and (6) administrative overhead.

  6. Rule-based programming paradigm: a formal basis for biological, chemical and physical computation.

    PubMed

    Krishnamurthy, V; Krishnamurthy, E V

    1999-03-01

    A rule-based programming paradigm is described as a formal basis for biological, chemical and physical computations. In this paradigm, the computations are interpreted as the outcome arising from the interaction of elements in an object space. The interactions can create new elements (or the same elements with modified attributes) or annihilate old elements according to specific rules. Since the interaction rules are inherently parallel, any number of actions can be performed cooperatively or competitively among the subsets of elements, so that the elements evolve toward an equilibrium, unstable, or chaotic state. Such an evolution may retain certain invariant properties of the attributes of the elements. The object space resembles a Gibbsian ensemble that corresponds to a distribution of points in the space of positions and momenta (called phase space). It permits the introduction of probabilities in rule applications. As each element of the ensemble changes over time, its phase point is carried into a new phase point. The evolution of this probability cloud in phase space corresponds to a distributed probabilistic computation. Thus, this paradigm can handle deterministic exact computation when the initial conditions are exactly specified and the trajectory of evolution is deterministic. Also, it can handle a probabilistic mode of computation if we want to derive macroscopic or bulk properties of matter. We also explain how to support this rule-based paradigm using relational-database-like query processing and transactions.

  7. Spiral and Project-Based Learning with Peer Assessment in a Computer Science Project Management Course

    ERIC Educational Resources Information Center

    Jaime, Arturo; Blanco, José Miguel; Domínguez, César; Sánchez, Ana; Heras, Jónathan; Usandizaga, Imanol

    2016-01-01

    Different learning methods such as project-based learning, spiral learning and peer assessment have been implemented in science disciplines with different outcomes. This paper presents a proposal for a project management course in the context of a computer science degree. Our proposal combines three well-known methods: project-based learning,…

  8. Spatiotemporal characterization of current and future droughts in the High Atlas basins (Morocco)

    NASA Astrophysics Data System (ADS)

    Zkhiri, Wiam; Tramblay, Yves; Hanich, Lahoucine; Jarlan, Lionel; Ruelland, Denis

    2018-02-01

    Over the past decades, drought has become a major concern in Morocco due to the importance of agriculture in the economy of the country. In the present work, the standardized precipitation index (SPI) is used to monitor the evolution, frequency, and severity of droughts in the High Atlas basins (N'Fis, Ourika, Rhéraya, Zat, and R'dat), located south of Marrakech city. The spatiotemporal characterization of drought in these basins is performed by computing the SPI with precipitation spatially interpolated over the catchments. The Haouz plain, located downstream of these basins, is strongly dependent on water provided by the mountain ranges, as shown by the positive correlations between the normalized difference vegetation index (NDVI) in the plain and the 3-, 6-, and 12-month SPI in the High Atlas catchments. In contrast, no significant correlations are found with piezometric levels of the Haouz groundwater, due to intensified pumping for irrigation in recent decades. A relative SPI index was computed to evaluate climate change impacts on drought occurrence, based on projected precipitation (2006-2100) from five high-resolution CORDEX regional climate simulations under two emission scenarios (RCP 4.5 and RCP 8.5). These models show a decrease in precipitation towards the future of up to -65% compared to the historical period. In terms of drought events, the future projections indicate a strong increase in the frequency of SPI events below -2, considered a severe drought condition.
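    For concreteness, the sketch below computes a simple k-month SPI from a hypothetical monthly precipitation series by fitting a gamma distribution to the aggregated totals and mapping the cumulative probabilities onto a standard normal deviate. Per-calendar-month fitting and zero-precipitation handling, which operational SPI implementations include, are omitted here.

    ```python
    # Simplified standardized precipitation index (SPI) on synthetic data.
    import numpy as np
    from scipy import stats

    def spi(precip, scale=3):
        precip = np.asarray(precip, dtype=float)
        # k-month running accumulation
        agg = np.convolve(precip, np.ones(scale), mode="valid")
        # fit a gamma distribution to the accumulated totals (location fixed at 0)
        shape, loc, scl = stats.gamma.fit(agg, floc=0)
        cdf = stats.gamma.cdf(agg, shape, loc=loc, scale=scl)
        return stats.norm.ppf(cdf)               # standardized index

    rng = np.random.default_rng(0)
    monthly = rng.gamma(shape=2.0, scale=20.0, size=240)   # 20 years, mm/month
    index = spi(monthly, scale=3)
    print((index < -2).sum(), "severely dry 3-month periods out of", len(index))
    ```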

  9. Click! 101 Computer Activities and Art Projects for Kids and Grown-Ups.

    ERIC Educational Resources Information Center

    Bundesen, Lynne; And Others

    This book presents 101 computer activities and projects geared toward children and adults. The activities for both personal computers (PCs) and Macintosh were developed on the Windows 95 computer operating system, but they are adaptable to non-Windows personal computers as well. The book is divided into two parts. The first part provides an…

  10. Predicting knee replacement damage in a simulator machine using a computational model with a consistent wear factor.

    PubMed

    Zhao, Dong; Sakoda, Hideyuki; Sawyer, W Gregory; Banks, Scott A; Fregly, Benjamin J

    2008-02-01

    Wear of ultrahigh molecular weight polyethylene remains a primary factor limiting the longevity of total knee replacements (TKRs). However, wear testing on a simulator machine is time consuming and expensive, making it impractical for iterative design purposes. The objectives of this paper were first, to evaluate whether a computational model using a wear factor consistent with the TKR material pair can predict accurate TKR damage measured in a simulator machine, and second, to investigate how choice of surface evolution method (fixed or variable step) and material model (linear or nonlinear) affect the prediction. An iterative computational damage model was constructed for a commercial knee implant in an AMTI simulator machine. The damage model combined a dynamic contact model with a surface evolution model to predict how wear plus creep progressively alter tibial insert geometry over multiple simulations. The computational framework was validated by predicting wear in a cylinder-on-plate system for which an analytical solution was derived. The implant damage model was evaluated for 5 million cycles of simulated gait using damage measurements made on the same implant in an AMTI machine. Using a pin-on-plate wear factor for the same material pair as the implant, the model predicted tibial insert wear volume to within 2% error and damage depths and areas to within 18% and 10% error, respectively. Choice of material model had little influence, while inclusion of surface evolution affected damage depth and area but not wear volume predictions. Surface evolution method was important only during the initial cycles, where variable step was needed to capture rapid geometry changes due to the creep. Overall, our results indicate that accurate TKR damage predictions can be made with a computational model using a constant wear factor obtained from pin-on-plate tests for the same material pair, and furthermore, that surface evolution method matters only during the initial "break in" period of the simulation.
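    The per-cycle damage bookkeeping behind such models can be summarized with an Archard-type update, sketched below on a toy contact grid. The wear factor, pressures and sliding distances are illustrative assumptions, and the sketch ignores creep and the dynamic contact-model coupling used by the authors.

    ```python
    # Archard-type fixed-step surface damage update on a toy contact grid
    # (illustrative; not the authors' iterative contact/creep model).
    import numpy as np

    def update_insert_depth(depth, pressure, sliding, wear_factor=1.0e-7,
                            cycles_per_update=250_000):
        """depth (mm), pressure (MPa), sliding (m per cycle) on a surface grid;
        wear_factor in mm^3/(N*m). Depth change = k * p * s per cycle."""
        wear = wear_factor * pressure * sliding * cycles_per_update   # mm of depth
        return depth + wear

    # toy surface grid: pressure and sliding distance from one gait analysis
    pressure = np.full((10, 10), 8.0)         # MPa
    sliding = np.full((10, 10), 0.02)         # m of sliding per gait cycle
    depth = np.zeros((10, 10))
    for block in range(20):                   # 20 x 250k = 5 million cycles
        depth = update_insert_depth(depth, pressure, sliding)
    print(depth.max(), "mm of linear wear after 5 million cycles")
    ```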

  11. The Evolution of Computer Based Learning Software Design: Computer Assisted Teaching Unit Experience.

    ERIC Educational Resources Information Center

    Blandford, A. E.; Smith, P. R.

    1986-01-01

    Describes the style of design of computer simulations developed by Computer Assisted Teaching Unit at Queen Mary College with reference to user interface, input and initialization, input data vetting, effective display screen use, graphical results presentation, and need for hard copy. Procedures and problems relating to academic involvement are…

  12. Use of Failure in IS Development Statistics: Lessons for IS Curriculum Design

    ERIC Educational Resources Information Center

    Longenecker, Herbert H., Jr.; Babb, Jeffry; Waguespack, Leslie; Tastle, William; Landry, Jeff

    2016-01-01

    The evolution of computing education reflects the history of the professional practice of computing. Keeping computing education current has been a major challenge due to the explosive advances in technologies. Academic programs in Information Systems, a long-standing computing discipline, develop and refine the theory and practice of computing…

  13. The Evolution of Instructional Design Principles for Intelligent Computer-Assisted Instruction.

    ERIC Educational Resources Information Center

    Dede, Christopher; Swigger, Kathleen

    1988-01-01

    Discusses and compares the design and development of computer assisted instruction (CAI) and intelligent computer assisted instruction (ICAI). Topics discussed include instructional systems design (ISD), artificial intelligence, authoring languages, intelligent tutoring systems (ITS), qualitative models, and emerging issues in instructional…

  14. Computational Modeling of Fluctuations in Energy and Metabolic Pathways of Methanogenic Archaea

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Luthey-Schulten, Zaida

    The methanogenic archaea, anaerobic microbes that convert CO2 and H2 and/or other small organic fermentation products into methane, play an unusually large role in the global carbon cycle. As they perform the final step in the anaerobic breakdown of biomass, methanogens are a biogenic source of an estimated one billion tons of methane each year. Depending on the location, the produced methane can be considered either a greenhouse gas (agricultural byproduct), sequestered carbon storage (methane hydrate deposits), or a potential energy source (organic wastewater treatment). These microbes therefore represent an important target for biotechnology applications. Computational models of methanogens with predictive power are useful aids in the adaptation of methanogenic systems, but need to connect processes of wide-ranging time and length scales. In this project, we developed several computational methodologies for modeling the dynamic behavior of entire cells that connect stochastic reaction-diffusion dynamics of individual biochemical pathways with genome-scale modeling of metabolic networks. While each of these techniques was in the realm of well-defined computational methods, here we integrated them to develop several entirely new approaches to systems biology. The first scientific aim of the project was to model how noise in a biochemical pathway propagates into cellular phenotypes. Genetic circuits have been optimized by evolution to regulate molecular processes despite stochastic noise, but the effect of such noise on cellular biochemical networks is currently unknown. An integrated stochastic/systems model of Escherichia coli was created to analyze how noise in protein expression, and therefore noise in metabolic fluxes, gives rise to multiple cellular phenotypes in an isogenic population. After the initial work developing and validating methods that allow characterization of the heterogeneity in the model organism E. coli, the project shifted toward investigations of the methanogen Methanosarcina acetivorans. By integrating an unprecedented transcriptomics dataset for growth of the methanogen on many substrates with an in silico model, heterogeneity in metabolic pathway usage and methane production were examined. This lent insight into the physiological requirements of the organism under different environmental conditions and uncovered the unique regulatory role that mRNA half-life has in shaping metabolic flux distributions in this organism.

  15. A membrane computing simulator of trans-hierarchical antibiotic resistance evolution dynamics in nested ecological compartments (ARES).

    PubMed

    Campos, Marcelino; Llorens, Carlos; Sempere, José M; Futami, Ricardo; Rodriguez, Irene; Carrasco, Purificación; Capilla, Rafael; Latorre, Amparo; Coque, Teresa M; Moya, Andres; Baquero, Fernando

    2015-08-05

    Antibiotic resistance is a major biomedical problem for which public health systems demand solutions to understand the dynamics and epidemiological risk of resistant bacteria in anthropogenically altered environments. The implementation of computable models with reciprocity within and between levels of biological organization (i.e. essential nesting) is central for studying antibiotic resistances. Antibiotic resistance is not just the result of antibiotic-driven selection but more properly the consequence of a complex hierarchy of processes shaping the ecology and evolution of the distinct subcellular, cellular and supra-cellular vehicles involved in the dissemination of resistance genes. Such a complex background motivated us to explore the P-system standards of membrane computing, an innovative natural computing formalism that abstracts the notion of movement across membranes, to simulate antibiotic resistance evolution processes across nested levels of micro- and macro-environmental organization in a given ecosystem. In this article, we introduce ARES (Antibiotic Resistance Evolution Simulator), a software device that simulates P-system model scenarios with five types of nested computing membranes oriented to emulate a hierarchy of eco-biological compartments, i.e. (a) peripheral ecosystem, (b) local environment, (c) reservoir of supplies, (d) animal host, and (e) the host's associated bacterial organisms (microbiome). Computational objects emulating molecular entities such as plasmids, antibiotic resistance genes, antimicrobials, and/or other substances can be introduced into this framework and may interact and evolve together with the membranes, according to a set of pre-established rules and specifications. ARES has been implemented as an online server and offers additional tools for storage, model editing and downstream analysis. The stochastic nature of the P-system model implemented in ARES explicitly links within- and between-host dynamics into a simulation, with feedback reciprocity among the different units of selection influenced by antibiotic exposure at various ecological levels. ARES offers the possibility of modeling predictive multilevel scenarios of antibiotic resistance evolution that can be interrogated, edited and re-simulated, if necessary with different parameters, until a correct model description of the process in the real world is convincingly approached. ARES can be accessed at http://gydb.org/ares.
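
    The membrane-computing idea behind ARES is that objects move between nested compartments under stochastic rewriting rules. The toy sketch below is not ARES and uses none of its rule set; it only illustrates the nesting idea with two hypothetical compartments and a single stochastic transfer rule for a resistance-carrying object.

    ```python
    # Toy illustration of nested compartments with one stochastic transfer rule,
    # loosely inspired by the P-system idea; this is NOT the ARES implementation.
    import random

    compartments = {
        "host":        {"parent": "environment", "objects": {"resistant_bacterium": 5}},
        "environment": {"parent": None,          "objects": {"antibiotic": 100}},
    }

    def step(p_transfer=0.1):
        # Rule: each resistant bacterium in the host may be shed into the parent compartment.
        host = compartments["host"]
        shed = sum(random.random() < p_transfer
                   for _ in range(host["objects"]["resistant_bacterium"]))
        host["objects"]["resistant_bacterium"] -= shed
        parent = compartments[host["parent"]]["objects"]
        parent["resistant_bacterium"] = parent.get("resistant_bacterium", 0) + shed

    random.seed(1)
    for _ in range(10):
        step()
    print(compartments)
    ```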

  16. The Seismic Tool-Kit (STK): An Open Source Software For Learning the Basis of Signal Processing and Seismology.

    NASA Astrophysics Data System (ADS)

    Reymond, D.

    2016-12-01

    We present an open source software project (GNU public license), named STK: Seismic Tool-Kit, that is dedicated mainly to learning signal processing and seismology. The STK project, which started in 2007, is hosted by SourceForge.net and counts more than 20,000 downloads at the date of writing. The STK project is composed of two main branches. First, a graphical interface dedicated to signal processing in the SAC format (SAC_ASCII and SAC_BIN), where the signal can be plotted, zoomed, filtered, integrated, differentiated, etc. (a large variety of IIR and FIR filters is proposed). The passage to the frequency domain via the Fourier transform is used to introduce the estimation of the spectral density of the signal, with visualization of the power spectral density (PSD) in linear or log scale, and also the evolutive time-frequency representation (or sonagram). Three-component signals can also be processed to estimate their polarization properties, either for a given window or for moving windows along the time axis. This polarization analysis is useful for extracting polarized noise and for differentiating P waves, Rayleigh waves, Love waves, etc. Second, a panel of utility programs is proposed for working in terminal mode, with basic programs for computing azimuth and distance in spherical geometry, inter/auto-correlation, spectral density, time-frequency analysis for an entire directory of signals, focal planes and main component axes, radiation patterns of P waves, polarization analysis of different waves (including noise), under/over-sampling of signals, cubic-spline smoothing, and linear/non-linear regression analysis of data sets. STK is developed in C/C++, mainly under Linux, and it has also been partially implemented under MS-Windows. STK has been used in some schools for viewing and plotting seismic records provided by IRIS, and it has been used as a practical support for teaching the basics of signal processing. Useful links: http://sourceforge.net/projects/seismic-toolkit/ and http://sourceforge.net/p/seismic-toolkit/wiki/browse_pages/
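
    As a small, language-independent illustration of the PSD estimation mentioned above (shown here with Python/SciPy rather than the C/C++ STK code itself, and on a synthetic trace), Welch's method gives the kind of spectral view the toolkit provides.

    ```python
    # Estimate the power spectral density of a synthetic "seismic" trace with Welch's method.
    # Illustration only; sampling rate and signal are made up.
    import numpy as np
    from scipy.signal import welch

    fs = 100.0                                    # sampling rate, Hz (hypothetical)
    t = np.arange(0, 600, 1 / fs)                 # ten minutes of samples
    trace = np.sin(2 * np.pi * 1.5 * t) + 0.5 * np.random.default_rng(0).normal(size=t.size)

    freqs, psd = welch(trace, fs=fs, nperseg=4096)
    print(freqs[np.argmax(psd)])                  # dominant frequency, close to 1.5 Hz
    ```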

  17. High-order hydrodynamic algorithms for exascale computing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Morgan, Nathaniel Ray

    Hydrodynamic algorithms are at the core of many laboratory missions ranging from simulating ICF implosions to climate modeling. The hydrodynamic algorithms commonly employed at the laboratory and in industry (1) typically lack requisite accuracy for complex multi-material vortical flows and (2) are not well suited for exascale computing due to poor data locality and poor FLOP/memory ratios. Exascale computing requires advances in both computer science and numerical algorithms. We propose to research the second requirement and create a new high-order hydrodynamic algorithm that has superior accuracy, excellent data locality, and excellent FLOP/memory ratios. This proposal will impact a broad range of research areas including numerical theory, discrete mathematics, vorticity evolution, gas dynamics, interface instability evolution, turbulent flows, fluid dynamics and shock driven flows. If successful, the proposed research has the potential to radically transform simulation capabilities and help position the laboratory for computing at the exascale.

  18. Fast hydrological model calibration based on the heterogeneous parallel computing accelerated shuffled complex evolution method

    NASA Astrophysics Data System (ADS)

    Kan, Guangyuan; He, Xiaoyan; Ding, Liuqian; Li, Jiren; Hong, Yang; Zuo, Depeng; Ren, Minglei; Lei, Tianjie; Liang, Ke

    2018-01-01

    Hydrological model calibration has been a hot issue for decades. The shuffled complex evolution method developed at the University of Arizona (SCE-UA) has been proved to be an effective and robust optimization approach. However, its computational efficiency deteriorates significantly when the amount of hydrometeorological data increases. In recent years, the rise of heterogeneous parallel computing has brought hope for the acceleration of hydrological model calibration. This study proposed a parallel SCE-UA method and applied it to the calibration of a watershed rainfall-runoff model, the Xinanjiang model. The parallel method was implemented on heterogeneous computing systems using OpenMP and CUDA. Performance testing and sensitivity analysis were carried out to verify its correctness and efficiency. Comparison results indicated that heterogeneous parallel computing-accelerated SCE-UA converged much more quickly than the original serial version and possessed satisfactory accuracy and stability for the task of fast hydrological model calibration.
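
    Much of the speed-up in a parallel SCE-UA comes from evaluating many candidate parameter sets concurrently. The study used OpenMP and CUDA; the sketch below shows the same embarrassingly parallel step in Python with a placeholder objective function, not the Xinanjiang model or the authors' implementation.

    ```python
    # Sketch of the parallel part of a population-based calibration: evaluating many
    # candidate parameter sets at once. The objective is a placeholder, not a
    # rainfall-runoff simulation.
    import numpy as np
    from multiprocessing import Pool

    def objective(params):
        # Placeholder for "run the hydrological model and compute an error metric"
        return float(np.sum((params - 0.3) ** 2))

    if __name__ == "__main__":
        rng = np.random.default_rng(42)
        population = [rng.uniform(0, 1, size=8) for _ in range(64)]   # 64 candidate parameter sets
        with Pool(processes=4) as pool:
            scores = pool.map(objective, population)                  # evaluated concurrently
        print(min(scores))
    ```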

  19. Ten quick tips for machine learning in computational biology.

    PubMed

    Chicco, Davide

    2017-01-01

    Machine learning has become a pivotal tool for many projects in computational biology, bioinformatics, and health informatics. Nevertheless, beginners and biomedical researchers often do not have enough experience to run a data mining project effectively, and therefore can follow incorrect practices that may lead to common mistakes or over-optimistic results. With this review, we present ten quick tips to take advantage of machine learning in any computational biology context, by avoiding some common errors that we observed hundreds of times in multiple bioinformatics projects. We believe our ten suggestions can strongly help any machine learning practitioner to carry out a successful project in computational biology and related sciences.

  20. Advances in computer simulation of genome evolution: toward more realistic evolutionary genomics analysis by approximate bayesian computation.

    PubMed

    Arenas, Miguel

    2015-04-01

    NGS technologies enable fast and cheap generation of genomic data. Nevertheless, ancestral genome inference is not so straightforward due to complex evolutionary processes acting on this material, such as inversions, translocations, and other genome rearrangements that, in addition to their implicit complexity, can co-occur and confound ancestral inferences. Recently, models of genome evolution that accommodate such complex genomic events have been emerging. This letter explores these novel evolutionary models and proposes their incorporation into robust statistical approaches based on computer simulations, such as approximate Bayesian computation, that may produce a more realistic evolutionary analysis of genomic data. Advantages and pitfalls of these analytical methods are discussed. Potential applications of these ancestral genomic inferences are also pointed out.
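
    The approximate Bayesian computation idea referred to above can be shown in its simplest rejection form: simulate under parameters drawn from the prior and keep those whose summary statistics come close to the observed data. The toy model, tolerance, and numbers below are illustrative only; a genome-evolution study would replace the simulate() step with a rearrangement simulator and richer summaries.

    ```python
    # Minimal ABC rejection sampler on a toy model: infer the rate of a Poisson process
    # from an observed summary statistic. Illustrative stand-in for genome-scale simulators.
    import numpy as np

    rng = np.random.default_rng(0)
    observed_mean = 4.2                    # hypothetical observed summary statistic

    def simulate(rate, n=100):
        return rng.poisson(rate, size=n).mean()

    accepted = []
    for _ in range(20_000):
        rate = rng.uniform(0, 10)                          # draw from the prior
        if abs(simulate(rate) - observed_mean) < 0.2:      # keep if simulation matches the data
            accepted.append(rate)

    print(np.mean(accepted), np.std(accepted))             # approximate posterior mean and spread
    ```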

  1. ENES the European Network for Earth System modelling and its infrastructure projects IS-ENES

    NASA Astrophysics Data System (ADS)

    Guglielmo, Francesca; Joussaume, Sylvie; Parinet, Marie

    2016-04-01

    The scientific community working on climate modelling is organized within the European Network for Earth System modelling (ENES). Over the past decade, several European university departments, research centres, meteorological services, computer centres, and industrial partners engaged in the creation of ENES by signing a Memorandum of Understanding, with the purpose of working together and cooperating towards the further development of the network. As of 2015, the consortium counts 47 partners. The climate modelling community, and thus ENES, faces challenges which are both science-driven, i.e. analysing the full complexity of the Earth System to improve our understanding and prediction of climate changes, and which have multi-faceted societal implications, as a better representation of climate change on regional scales leads to improved understanding and prediction of impacts and to the development and provision of climate services. ENES, promoting and endorsing projects and initiatives, helps develop and evaluate state-of-the-art climate and Earth system models, facilitates model inter-comparison studies, encourages exchanges of software and model results, and fosters the use of high performance computing facilities dedicated to high-resolution multi-model experiments. ENES brings together public and private partners, integrates countries underrepresented in climate modelling studies, and reaches out to different user communities, thus enhancing European expertise and competitiveness. Because this effort requires sophisticated models, world-class high-performance computers, and state-of-the-art software solutions to make efficient use of models, data and hardware, a key role is played by the constitution and maintenance of a solid infrastructure that develops and provides services to the different user communities. ENES has investigated the infrastructural needs and has received funding from the EU FP7 programme for the IS-ENES (InfraStructure for ENES) phase I and II projects. We present here the case study of an existing network of institutions brought together toward common goals by a non-binding agreement, ENES, and of its two IS-ENES projects. The latter are discussed in their double role: as a means to provide and/or maintain the actual infrastructure (hardware, software, skilled human resources, services) needed to achieve ENES scientific goals, fulfilling the aims set in a strategy document, and as a way to give the network a structured way of working and of interacting with the extended community. The genesis and evolution of the network and the interaction between network and projects are also analysed in terms of long-term sustainability.

  2. Exploiting CMS data popularity to model the evolution of data management for Run-2 and beyond

    NASA Astrophysics Data System (ADS)

    Bonacorsi, D.; Boccali, T.; Giordano, D.; Girone, M.; Neri, M.; Magini, N.; Kuznetsov, V.; Wildish, T.

    2015-12-01

    During the LHC Run-1 data taking, all experiments collected large data volumes from proton-proton and heavy-ion collisions. The collision data, together with massive volumes of simulated data, were replicated in multiple copies, transferred among various Tier levels, and transformed/slimmed in format and content. These data were then accessed (both locally and remotely) by large groups of distributed analysis communities exploiting the Worldwide LHC Computing Grid infrastructure and services. While efficient data placement strategies - together with optimal data redistribution and deletions on demand - have become the core of static versus dynamic data management projects, little effort has so far been invested in understanding the detailed data-access patterns which surfaced in Run-1. These patterns, if understood, can be used as input to simulations of computing models at the LHC, to optimise existing systems by tuning their behaviour, and to explore next-generation CPU/storage/network co-scheduling solutions. This is of great importance, given that the scale of the computing problem will increase far faster than the resources available to the experiments, for Run-2 and beyond. Studying data-access patterns involves validating the quality of the monitoring data collected on the "popularity" of each dataset, analysing the frequency and pattern of accesses to different datasets by analysis end-users, exploring different views of the popularity data (by physics activity, by region, by data type), studying the evolution of Run-1 data exploitation over time, and evaluating the impact of different data placement and distribution choices on the available network and storage resources and on computing operations. This work presents some insights from studies on the popularity data from the CMS experiment. We present the properties of a range of physics analysis activities as seen by the data popularity, and make recommendations for how to tune the initial distribution of data in anticipation of how it will be used in Run-2 and beyond.

  3. Parametric Estimation of Load for Air Force Data Centers

    DTIC Science & Technology

    2015-03-27

    R. Nelson, L. Orsenigo and S. Winter, "'History-friendly' models of industry evolution: the computer industry," Industrial and Corporate Change, vol. 8, no. 1, pp. 3-40, 1999.

  4. Computational nuclear quantum many-body problem: The UNEDF project

    NASA Astrophysics Data System (ADS)

    Bogner, S.; Bulgac, A.; Carlson, J.; Engel, J.; Fann, G.; Furnstahl, R. J.; Gandolfi, S.; Hagen, G.; Horoi, M.; Johnson, C.; Kortelainen, M.; Lusk, E.; Maris, P.; Nam, H.; Navratil, P.; Nazarewicz, W.; Ng, E.; Nobre, G. P. A.; Ormand, E.; Papenbrock, T.; Pei, J.; Pieper, S. C.; Quaglioni, S.; Roche, K. J.; Sarich, J.; Schunck, N.; Sosonkina, M.; Terasaki, J.; Thompson, I.; Vary, J. P.; Wild, S. M.

    2013-10-01

    The UNEDF project was a large-scale collaborative effort that applied high-performance computing to the nuclear quantum many-body problem. The primary focus of the project was on constructing, validating, and applying an optimized nuclear energy density functional, which entailed a wide range of pioneering developments in microscopic nuclear structure and reactions, algorithms, high-performance computing, and uncertainty quantification. UNEDF demonstrated that close associations among nuclear physicists, mathematicians, and computer scientists can lead to novel physics outcomes built on algorithmic innovations and computational developments. This review showcases a wide range of UNEDF science results to illustrate this interplay.

  5. Evolution of 3-D geologic framework modeling and its application to groundwater flow studies

    USGS Publications Warehouse

    Blome, Charles D.; Smith, David V.

    2012-01-01

    In this Fact Sheet, the authors discuss the evolution of project 3-D subsurface framework modeling, research in hydrostratigraphy and airborne geophysics, and methodologies used to link geologic and groundwater flow models.

  6. A continuous stochastic model for non-equilibrium dense gases

    NASA Astrophysics Data System (ADS)

    Sadr, M.; Gorji, M. H.

    2017-12-01

    While accurate simulations of dense gas flows far from equilibrium can be achieved by direct simulation adapted to the Enskog equation, the significant computational demand required for collisions appears as a major constraint. In order to cope with that, an efficient yet accurate solution algorithm based on the Fokker-Planck approximation of the Enskog equation is devised in this paper; the approximation is closely associated with the Fokker-Planck model derived from the Boltzmann equation by Jenny et al. ["A solution algorithm for the fluid dynamic equations based on a stochastic model for molecular motion," J. Comput. Phys. 229, 1077-1098 (2010)] and Gorji et al. ["Fokker-Planck model for computational studies of monatomic rarefied gas flows," J. Fluid Mech. 680, 574-601 (2011)]. The idea behind these Fokker-Planck descriptions is to project the dynamics of discrete collisions implied by the molecular encounters into a set of continuous Markovian processes subject to drift and diffusion. Thereby, the evolution of particles representing the governing stochastic process becomes independent from each other and thus very efficient numerical schemes can be constructed. By close inspection of the Enskog operator, it is observed that the dense gas effects contribute further to the advection of molecular quantities. That motivates a modelling approach where the dense gas corrections can be cast in the extra advection of particles. Therefore, the corresponding Fokker-Planck approximation is derived such that the evolution in the physical space accounts for the dense effects present in the pressure, stress tensor, and heat fluxes. Hence the consistency between the devised Fokker-Planck approximation and the Enskog operator is shown for the velocity moments up to the heat fluxes. For validation studies, a homogeneous gas inside a box is considered, in addition to Fourier, Couette, and lid-driven cavity flow setups. The results based on the Fokker-Planck model are compared with benchmark simulations, and good agreement is found for the flow field along with the transport properties.
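
    The "continuous Markovian processes subject to drift and diffusion" mentioned above are, in their simplest form, Langevin-type velocity updates. The sketch below integrates such a process with an Euler-Maruyama step: a linear drift toward the local bulk velocity plus Gaussian diffusion. The relaxation time and coefficients are hypothetical, and this is the generic linear closure, not the Enskog-consistent model derived in the paper.

    ```python
    # Euler-Maruyama update for a Langevin-type molecular velocity process:
    # linear drift toward the bulk velocity plus Gaussian diffusion (nondimensional units).
    import numpy as np

    rng = np.random.default_rng(0)
    n, dt, tau = 10_000, 1e-3, 0.05          # particles, time step, relaxation time (hypothetical)
    kT_over_m = 1.0                          # temperature scale (hypothetical)
    u_bulk = np.array([1.0, 0.0, 0.0])       # local bulk velocity

    v = rng.normal(0.0, np.sqrt(kT_over_m), size=(n, 3)) + u_bulk
    for _ in range(1000):
        drift = -(v - u_bulk) / tau * dt
        diffusion = np.sqrt(2 * kT_over_m / tau * dt) * rng.normal(size=(n, 3))
        v += drift + diffusion

    print(v.mean(axis=0), v.var(axis=0))     # stays near u_bulk with variance near kT/m
    ```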

  7. The Bibliotherapy Education Project: Alive and Well--and Perpetually "Under Construction"

    ERIC Educational Resources Information Center

    McMillen, Paula S.

    2008-01-01

    The Bibliotherapy Education Project began as a teaching collaboration between faculty at Oregon State University's Libraries and School of Education. The project's evolution from 1999 to 2004 was previously described in this journal (McMillen 2005). The core of the project is a book evaluation tool, which builds counselor skill and knowledge in…

  8. Graphics supercomputer for computational fluid dynamics research

    NASA Astrophysics Data System (ADS)

    Liaw, Goang S.

    1994-11-01

    The objective of this project is to purchase a state-of-the-art graphics supercomputer to improve the Computational Fluid Dynamics (CFD) research capability at Alabama A & M University (AAMU) and to support Air Force research projects. A cutting-edge graphics supercomputer system, Onyx VTX, from Silicon Graphics Computer Systems (SGI), was purchased and installed. Other equipment, including a desktop personal computer (a PC-486 DX2 with a built-in 10-BaseT Ethernet card), a 10-BaseT hub, an Apple Laser Printer Select 360, and a notebook computer from Zenith, was also purchased. A reading room was converted into a research computer lab by adding furniture and an air conditioning unit in order to provide an appropriate working environment for researchers and the purchased equipment. All the purchased equipment was successfully installed and is fully functional. Several research projects, including two existing Air Force projects, are being performed using these facilities.

  9. Mapping quantum-classical Liouville equation: projectors and trajectories.

    PubMed

    Kelly, Aaron; van Zon, Ramses; Schofield, Jeremy; Kapral, Raymond

    2012-02-28

    The evolution of a mixed quantum-classical system is expressed in the mapping formalism where discrete quantum states are mapped onto oscillator states, resulting in a phase space description of the quantum degrees of freedom. By defining projection operators onto the mapping states corresponding to the physical quantum states, it is shown that the mapping quantum-classical Liouville operator commutes with the projection operator so that the dynamics is confined to the physical space. It is also shown that a trajectory-based solution of this equation can be constructed that requires the simulation of an ensemble of entangled trajectories. An approximation to this evolution equation which retains only the Poisson bracket contribution to the evolution operator does admit a solution in an ensemble of independent trajectories but it is shown that this operator does not commute with the projection operators and the dynamics may take the system outside the physical space. The dynamical instabilities, utility, and domain of validity of this approximate dynamics are discussed. The effects are illustrated by simulations on several quantum systems.

  10. Overview 1993: Computational applications

    NASA Technical Reports Server (NTRS)

    Benek, John A.

    1993-01-01

    Computational applications include projects that apply or develop computationally intensive computer programs. Such programs typically require supercomputers to obtain solutions in a timely fashion. This report describes two CSTAR projects involving Computational Fluid Dynamics (CFD) technology. The first, the Parallel Processing Initiative, is a joint development effort and the second, the Chimera Technology Development, is a transfer of government developed technology to American industry.

  11. A Physically Based Coupled Chemical and Physical Weathering Model for Simulating Soilscape Evolution

    NASA Astrophysics Data System (ADS)

    Willgoose, G. R.; Welivitiya, D.; Hancock, G. R.

    2015-12-01

    A critical missing link in existing landscape evolution models is a dynamic soil evolution model in which soils co-evolve with the landform. Work by the authors over the last decade has demonstrated a computationally manageable model for soil profile evolution (soilscape evolution) based on physical weathering. For chemical weathering, it is clear that full geochemistry models such as CrunchFlow and PHREEQC are too computationally intensive to be coupled to existing soilscape and landscape evolution models. This paper presents a simplification of CrunchFlow chemistry and physics that makes the task feasible and generalises it for hillslope geomorphology applications. Results from this simplified model will be compared with field data for soil pedogenesis. Other researchers have previously proposed a number of very simple weathering functions (e.g. exponential, humped, reverse exponential) as conceptual models of the in-profile weathering process, as sketched below. The paper will show that all of these functions are possible for specific combinations of in-soil environmental, geochemical and geologic conditions, and the presentation will outline the key variables controlling which of these conceptual models can be realistic models of in-profile processes and under what conditions. The presentation will finish by discussing the coupling of this model with a physical weathering model, and will show sample results from our SSSPAM soilscape evolution model to illustrate the implications of including chemical weathering in the soilscape evolution model.
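
    For orientation, the three conceptual weathering-rate shapes named in the abstract can be written as simple functions of depth below the surface. The parameters below are made up purely to show the shapes being compared; they do not come from the SSSPAM formulation.

    ```python
    # Illustrative depth-dependent weathering-rate shapes: exponential, "humped",
    # and reverse exponential. Parameters are arbitrary and for illustration only.
    import numpy as np

    z = np.linspace(0.0, 2.0, 50)                         # depth below the soil surface, m

    exponential = np.exp(-z / 0.5)                        # fastest weathering at the surface
    humped = (z / 0.3) * np.exp(-z / 0.3)                 # peak weathering at shallow depth
    reverse_exponential = 1.0 - np.exp(-z / 0.5)          # slowest weathering at the surface

    print(z[np.argmax(humped)])                           # depth of the "hump"
    ```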

  12. Bifurcation-based adiabatic quantum computation with a nonlinear oscillator network

    PubMed Central

    Goto, Hayato

    2016-01-01

    The dynamics of nonlinear systems qualitatively change depending on their parameters, which is called bifurcation. A quantum-mechanical nonlinear oscillator can yield a quantum superposition of two oscillation states, known as a Schrödinger cat state, via quantum adiabatic evolution through its bifurcation point. Here we propose a quantum computer comprising such quantum nonlinear oscillators, instead of quantum bits, to solve hard combinatorial optimization problems. The nonlinear oscillator network finds optimal solutions via quantum adiabatic evolution, where nonlinear terms are increased slowly, in contrast to conventional adiabatic quantum computation or quantum annealing, where quantum fluctuation terms are decreased slowly. As a result of numerical simulations, it is concluded that quantum superposition and quantum fluctuation work effectively to find optimal solutions. It is also notable that the present computer is analogous to neural computers, which are also networks of nonlinear components. Thus, the present scheme will open new possibilities for quantum computation, nonlinear science, and artificial intelligence. PMID:26899997

  13. Computational complexity of the landscape II-Cosmological considerations

    NASA Astrophysics Data System (ADS)

    Denef, Frederik; Douglas, Michael R.; Greene, Brian; Zukowski, Claire

    2018-05-01

    We propose a new approach for multiverse analysis based on computational complexity, which leads to a new family of "computational" measure factors. By defining a cosmology as a space-time containing a vacuum with specified properties (for example small cosmological constant) together with rules for how time evolution will produce the vacuum, we can associate global time in a multiverse with clock time on a supercomputer which simulates it. We argue for a principle of "limited computational complexity" governing early universe dynamics as simulated by this supercomputer, which translates to a global measure for regulating the infinities of eternal inflation. The rules for time evolution can be thought of as a search algorithm, whose details should be constrained by a stronger principle of "minimal computational complexity". Unlike previously studied global measures, ours avoids standard equilibrium considerations and the well-known problems of Boltzmann Brains and the youngness paradox. We also give various definitions of the computational complexity of a cosmology, and argue that there are only a few natural complexity classes.

  14. Project Chrysalis: The Evolution of a Community School.

    ERIC Educational Resources Information Center

    Garrett, K.

    1996-01-01

    Describes the creation and operation of Project Chrysalis, a community, service-learning school transformed from row houses, where children can learn, work, and gain inspiration from artists and social entrepreneurs involved with Houston's Project Row Houses. Personal narratives of two teachers highlight the school's and students' accomplishments…

  15. The Big Build

    ERIC Educational Resources Information Center

    Haigh, Sarah; Bell, Christopher; Ruta, Chris

    2017-01-01

    This article provides details of a successful educational engineering project run in partnership between a group of ten schools and an international engineering, construction and technical services company. It covers the history and evolution of the project and highlights how the project has significant impact not only on the students involved but…

  16. The diversity and evolution of ecological and environmental citizen science

    PubMed Central

    Tweddle, John C.; Savage, Joanna; Robinson, Lucy D.; Roy, Helen E.

    2017-01-01

    Citizen science—the involvement of volunteers in data collection, analysis and interpretation—simultaneously supports research and public engagement with science, and its profile is rapidly rising. Citizen science represents a diverse range of approaches, but until now this diversity has not been quantitatively explored. We conducted a systematic internet search and discovered 509 environmental and ecological citizen science projects. We scored each project for 32 attributes based on publicly obtainable information and used multiple factor analysis to summarise this variation to assess citizen science approaches. We found that projects varied according to their methodological approach from ‘mass participation’ (e.g. easy participation by anyone anywhere) to ‘systematic monitoring’ (e.g. trained volunteers repeatedly sampling at specific locations). They also varied in complexity from approaches that are ‘simple’ to those that are ‘elaborate’ (e.g. provide lots of support to gather rich, detailed datasets). There was a separate cluster of entirely computer-based projects but, in general, we found that the range of citizen science projects in ecology and the environment showed continuous variation and cannot be neatly categorised into distinct types of activity. While the diversity of projects begun in each time period (pre 1990, 1990–99, 2000–09 and 2010–13) has not increased, we found that projects tended to have become increasingly different from each other as time progressed (possibly due to changing opportunities, including technological innovation). Most projects were still active so consequently we found that the overall diversity of active projects (available for participation) increased as time progressed. Overall, understanding the landscape of citizen science in ecology and the environment (and its change over time) is valuable because it informs the comparative evaluation of the ‘success’ of different citizen science approaches. Comparative evaluation provides an evidence-base to inform the future development of citizen science activities. PMID:28369087

  17. Maximum projection designs for computer experiments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Joseph, V. Roshan; Gul, Evren; Ba, Shan

    Space-filling properties are important in designing computer experiments. The traditional maximin and minimax distance designs only consider space-filling in the full dimensional space. This can result in poor projections onto lower dimensional spaces, which is undesirable when only a few factors are active. Restricting maximin distance design to the class of Latin hypercubes can improve one-dimensional projections, but cannot guarantee good space-filling properties in larger subspaces. We propose designs that maximize space-filling properties on projections to all subsets of factors. We call our designs maximum projection designs. As a result, our design criterion can be computed at a cost no more than that of a design criterion that ignores projection properties.

  18. Maximum projection designs for computer experiments

    DOE PAGES

    Joseph, V. Roshan; Gul, Evren; Ba, Shan

    2015-03-18

    Space-filling properties are important in designing computer experiments. The traditional maximin and minimax distance designs only consider space-filling in the full dimensional space. This can result in poor projections onto lower dimensional spaces, which is undesirable when only a few factors are active. Restricting maximin distance design to the class of Latin hypercubes can improve one-dimensional projections, but cannot guarantee good space-filling properties in larger subspaces. We propose designs that maximize space-filling properties on projections to all subsets of factors. We call our designs maximum projection designs. As a result, our design criterion can be computed at a cost no more than that of a design criterion that ignores projection properties.
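
    The maximum projection criterion penalizes designs in which any pair of points nearly coincides in some one-dimensional projection. The sketch below evaluates the criterion in the form given by Joseph, Gul, and Ba (2015), averaging over point pairs the reciprocal product of squared coordinate-wise differences (smaller is better); the random candidate design is for illustration only, and an actual maximum projection design would be found by minimizing this quantity.

    ```python
    # Evaluate the MaxPro-style criterion for a candidate design: average over all pairs of
    # the reciprocal product of squared coordinate-wise differences, raised to 1/p.
    import numpy as np
    from itertools import combinations

    def maxpro_criterion(design):
        n, p = design.shape
        total = sum(1.0 / np.prod((design[i] - design[j]) ** 2)
                    for i, j in combinations(range(n), 2))
        return (2.0 * total / (n * (n - 1))) ** (1.0 / p)

    rng = np.random.default_rng(0)
    candidate = rng.uniform(size=(20, 4))     # 20 runs, 4 factors (illustrative)
    print(maxpro_criterion(candidate))
    ```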

  19. Use of Microcomputers and Personal Computers in Pacing

    PubMed Central

    Sasmor, L.; Tarjan, P.; Mumford, V.; Smith, E.

    1983-01-01

    This paper describes the evolution from the early discrete circuit pacemaker of the past to the sophisticated microprocessor based pacemakers of today. The necessary computerized supporting instrumentation is also described. Technological and economical reasons for this evolution are discussed.

  20. Impacts of Climate Change and of Anthropisation on Water Resources: from the Risk Assessment to Adaptation, the Case of the Seine Basin (including Paris, France)

    NASA Astrophysics Data System (ADS)

    Habets, F.; Viennot, P.; Thierion, C.; Vergnes, J. P.; Ait Kaci, A.; Caballero, Y.

    2015-12-01

    The Seine river, located in the temperate climate of northern France and flowing over a large sedimentary basin that hosts multilayer aquifers, is characterized by small temporal variations of its discharge. However, the presence of a megacity (Paris) and a wide area of intensive agriculture, combined with climate change, puts pressure on the water resources both in terms of quality and quantity. Previous research projects have estimated the impact of climate change on the water resource of the Seine basin, together with the uncertainties associated with climate projections, hydrological models and downscaling methods. The water resource was projected to decrease by 14% ± 10% by 2050 and by 28% ± 16% by 2100. This led to new studies that focus on the combined impact of climate change and adaptations. The tested adaptations are: a reduction of groundwater abstractions, evolution of land use, development of small dams to "harvest water", and artificial recharge of aquifers. The communication of the results of these projects to stakeholders has led to the development of new indicators that better express the risk to water resource management, especially for groundwater. For instance, maps of the evolution of piezometric head are difficult to interpret. To better express the evolution of risk, a new indicator was defined: the evolution of the groundwater crisis duration, i.e., the period when the head of the aquifer is below the crisis piezometric level defined by the stakeholders. Such crisis piezometric levels are used to help define the periods when groundwater abstraction should be reduced. Such maps are more efficient for communicating with water resource managers. This communication will focus on the results from the MEDDE Explore 2070 and ANR Oracle projects.

  1. Funding for Computer-Assisted Instruction Projects.

    ERIC Educational Resources Information Center

    Corn, Milton

    1994-01-01

    An informal survey of individuals and a search of MEDLINE literature sought information on funding sources for computer-assisted instruction projects in dental, medical, and nursing education. General patterns are outlined, and suggestions are made for locating project funding. (MSE)

  2. Quantifying the rapid evolution of a nourishment project with video imagery

    USGS Publications Warehouse

    Elko, N.A.; Holman, R.A.; Gelfenbaum, G.

    2005-01-01

    Spatially and temporally high-resolution video imagery was combined with traditional surveyed beach profiles to investigate the evolution of a rapidly eroding beach nourishment project. Upham Beach is a 0.6-km beach located downdrift of a structured inlet on the west coast of Florida. The beach was stabilized in a seaward-advanced position during the 1960s and has been nourished every 4-5 years since 1975. During the 1996 nourishment project, 193,000 m3 of sediment advanced the shoreline as much as 175 m. Video images were collected concurrently with traditional surveys during the 1996 nourishment project to test video imaging as a nourishment monitoring technique. Video imagery illustrated morphologic changes that were not apparent in the survey data. Increased storminess during the second (El Niño) winter after the 1996 project resulted in increased erosion rates of 0.4 m/d (135.0 m/y), compared with 0.2 m/d (69.4 m/y) during the first winter. The measured half-life, the time at which 50% of the nourished material remains, of the nourishment project was 0.94 years. A simple analytical equation shows reasonable agreement with the measured values, suggesting that project evolution follows a predictable pattern of exponential decay. Longshore planform equilibration does not occur on Upham Beach; rather, sediment diffuses downdrift until 100% of the nourished material erodes. The wide nourished beach erodes rapidly due to the lack of sediment bypassing from the north and the stabilized headland at Upham Beach that is exposed to wave energy.
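
    The exponential-decay description can be made concrete with a short calculation; the rate constant below is simply back-computed from the 0.94-year half-life quoted in the abstract, assuming the simple exponential form holds throughout.

    ```latex
    V(t) = V_0\, e^{-kt}, \qquad t_{1/2} = \frac{\ln 2}{k}
    \quad\Rightarrow\quad k = \frac{\ln 2}{0.94\ \text{yr}} \approx 0.74\ \text{yr}^{-1}.
    ```

    Under that idealized form, roughly e^{-0.74 x 2} ≈ 23% of the fill would remain after two years; the accelerated erosion observed during the El Niño winter is one reason the actual evolution departs from a single constant rate.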

  3. Using concepts from biology to improve problem-solving methods

    NASA Astrophysics Data System (ADS)

    Goodman, Erik D.; Rothwell, Edward J.; Averill, Ronald C.

    2011-06-01

    Observing nature has been a cornerstone of engineering design. Today, engineers look not only at finished products, but imitate the evolutionary process by which highly optimized artifacts have appeared in nature. Evolutionary computation began by capturing only the simplest ideas of evolution, but today, researchers study natural evolution and incorporate an increasing number of concepts in order to evolve solutions to complex engineering problems. At the new BEACON Center for the Study of Evolution in Action, studies in the lab and field and in silico are laying the groundwork for new tools for evolutionary engineering design. This paper, which accompanies a keynote address, describes various steps in development and application of evolutionary computation, particularly as regards sensor design, and sets the stage for future advances.

  4. Evolution of Advection Upstream Splitting Method Schemes

    NASA Technical Reports Server (NTRS)

    Liou, Meng-Sing

    2010-01-01

    This paper focuses on the evolution of advection upstream splitting method (AUSM) schemes. The main ingredients that have led to the development of modern computational fluid dynamics (CFD) methods are reviewed, along with the ideas behind AUSM. First and foremost is the concept of upwinding. Second is the use of the Riemann problem in constructing the numerical flux in the finite-volume setting. Third is the necessity of including all physical processes, as characterised by the linear (convection) and nonlinear (acoustic) fields. Fourth is the realisation that the flux can be separated into convection and pressure fluxes. The rest of this review briefly outlines the technical evolution of AUSM; more details can be found in the cited references. Keywords: computational fluid dynamics methods, hyperbolic systems, advection upstream splitting method, conservation laws, upwinding, CFD
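
    For orientation, the split Mach numbers that define the original AUSM interface flux are commonly written as below; this is the standard textbook form of the splitting, quoted for context rather than taken from the review itself.

    ```latex
    \mathcal{M}^{\pm}(M) =
    \begin{cases}
      \pm\tfrac{1}{4}\,(M \pm 1)^2, & |M| \le 1,\\[4pt]
      \tfrac{1}{2}\,\bigl(M \pm |M|\bigr), & |M| > 1,
    \end{cases}
    \qquad
    M_{1/2} = \mathcal{M}^{+}(M_L) + \mathcal{M}^{-}(M_R).
    ```

    The convective flux is then upwinded according to the sign of the interface Mach number M_{1/2}, while the pressure contribution is split separately, reflecting the separation of convection and pressure fluxes highlighted in the review.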

  5. Optimized temporal pattern of brain stimulation designed by computational evolution

    PubMed Central

    Brocker, David T.; Swan, Brandon D.; So, Rosa Q.; Turner, Dennis A.; Gross, Robert E.; Grill, Warren M.

    2017-01-01

    Brain stimulation is a promising therapy for several neurological disorders, including Parkinson’s disease. Stimulation parameters are selected empirically and are limited to the frequency and intensity of stimulation. We used the temporal pattern of stimulation as a novel parameter of deep brain stimulation to ameliorate symptoms in a parkinsonian animal model and in humans with Parkinson’s disease. We used model-based computational evolution to optimize the stimulation pattern. The optimized pattern produced symptom relief comparable to that from standard high-frequency stimulation (a constant rate of 130 or 185 Hz) and outperformed frequency-matched standard stimulation in the parkinsonian rat and in patients. Both optimized and standard stimulation suppressed abnormal oscillatory activity in the basal ganglia of rats and humans. The results illustrate the utility of model-based computational evolution to design temporal pattern of stimulation to increase the efficiency of brain stimulation in Parkinson’s disease, thereby requiring substantially less energy than traditional brain stimulation. PMID:28053151

  6. Advantages of formulating an evolution equation directly for elastic distortional deformation in finite deformation plasticity

    NASA Astrophysics Data System (ADS)

    Rubin, M. B.; Cardiff, P.

    2017-11-01

    Simo (Comput Methods Appl Mech Eng 66:199-219, 1988) proposed an evolution equation for elastic deformation together with a constitutive equation for inelastic deformation rate in plasticity. The numerical algorithm (Simo in Comput Methods Appl Mech Eng 68:1-31, 1988) for determining elastic distortional deformation was simple. However, the proposed inelastic deformation rate caused plastic compaction. The corrected formulation (Simo in Comput Methods Appl Mech Eng 99:61-112, 1992) preserves isochoric plasticity but the numerical integration algorithm is complicated and needs special methods for calculation of the exponential map of a tensor. Alternatively, an evolution equation for elastic distortional deformation can be proposed directly with a simplified constitutive equation for inelastic distortional deformation rate. This has the advantage that the physics of inelastic distortional deformation is separated from that of dilatation. The example of finite deformation J2 plasticity with linear isotropic hardening is used to demonstrate the simplicity of the numerical algorithm.

  7. Shared Storage Usage Policy | High-Performance Computing | NREL

    Science.gov Websites

    Shared Storage Usage Policy: To use NREL's high-performance computing (HPC) systems, you must abide by the Shared Storage Usage Policy. NREL HPC allocations include storage space in the /projects filesystem. However, /projects is a shared resource and project…

  8. Software Architecture Evolution

    ERIC Educational Resources Information Center

    Barnes, Jeffrey M.

    2013-01-01

    Many software systems eventually undergo changes to their basic architectural structure. Such changes may be prompted by new feature requests, new quality attribute requirements, changing technology, or other reasons. Whatever the causes, architecture evolution is commonplace in real-world software projects. Today's software architects, however,…

  9. Evolution of Curriculum...Before, Into, and Beyond Computer Literacy.

    ERIC Educational Resources Information Center

    Bonja, Robert P.; Rodgers, Robert J.

    A school district's 15 years of involvement with the computer are recounted in this paper, from the first computer literacy course to the present slackening of interest in the subject. The conclusion, however, is that computer technology will not be shelved, but will continue to change the way society lives and maximize student development. As an…

  10. An Introduction to Programming for Bioscientists: A Python-Based Primer

    PubMed Central

    Mura, Cameron

    2016-01-01

    Computing has revolutionized the biological sciences over the past several decades, such that virtually all contemporary research in molecular biology, biochemistry, and other biosciences utilizes computer programs. The computational advances have come on many fronts, spurred by fundamental developments in hardware, software, and algorithms. These advances have influenced, and even engendered, a phenomenal array of bioscience fields, including molecular evolution and bioinformatics; genome-, proteome-, transcriptome- and metabolome-wide experimental studies; structural genomics; and atomistic simulations of cellular-scale molecular assemblies as large as ribosomes and intact viruses. In short, much of post-genomic biology is increasingly becoming a form of computational biology. The ability to design and write computer programs is among the most indispensable skills that a modern researcher can cultivate. Python has become a popular programming language in the biosciences, largely because (i) its straightforward semantics and clean syntax make it a readily accessible first language; (ii) it is expressive and well-suited to object-oriented programming, as well as other modern paradigms; and (iii) the many available libraries and third-party toolkits extend the functionality of the core language into virtually every biological domain (sequence and structure analyses, phylogenomics, workflow management systems, etc.). This primer offers a basic introduction to coding, via Python, and it includes concrete examples and exercises to illustrate the language’s usage and capabilities; the main text culminates with a final project in structural bioinformatics. A suite of Supplemental Chapters is also provided. Starting with basic concepts, such as that of a “variable,” the Chapters methodically advance the reader to the point of writing a graphical user interface to compute the Hamming distance between two DNA sequences. PMID:27271528

  11. An Introduction to Programming for Bioscientists: A Python-Based Primer.

    PubMed

    Ekmekci, Berk; McAnany, Charles E; Mura, Cameron

    2016-06-01

    Computing has revolutionized the biological sciences over the past several decades, such that virtually all contemporary research in molecular biology, biochemistry, and other biosciences utilizes computer programs. The computational advances have come on many fronts, spurred by fundamental developments in hardware, software, and algorithms. These advances have influenced, and even engendered, a phenomenal array of bioscience fields, including molecular evolution and bioinformatics; genome-, proteome-, transcriptome- and metabolome-wide experimental studies; structural genomics; and atomistic simulations of cellular-scale molecular assemblies as large as ribosomes and intact viruses. In short, much of post-genomic biology is increasingly becoming a form of computational biology. The ability to design and write computer programs is among the most indispensable skills that a modern researcher can cultivate. Python has become a popular programming language in the biosciences, largely because (i) its straightforward semantics and clean syntax make it a readily accessible first language; (ii) it is expressive and well-suited to object-oriented programming, as well as other modern paradigms; and (iii) the many available libraries and third-party toolkits extend the functionality of the core language into virtually every biological domain (sequence and structure analyses, phylogenomics, workflow management systems, etc.). This primer offers a basic introduction to coding, via Python, and it includes concrete examples and exercises to illustrate the language's usage and capabilities; the main text culminates with a final project in structural bioinformatics. A suite of Supplemental Chapters is also provided. Starting with basic concepts, such as that of a "variable," the Chapters methodically advance the reader to the point of writing a graphical user interface to compute the Hamming distance between two DNA sequences.
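
    The primer's final project culminates in computing the Hamming distance between two DNA sequences. Independently of the primer's own code (which is not reproduced here), a minimal version of that computation looks like the sketch below; the example sequences are made up.

    ```python
    # Minimal Hamming distance between two equal-length DNA sequences.
    def hamming_distance(seq_a: str, seq_b: str) -> int:
        if len(seq_a) != len(seq_b):
            raise ValueError("Hamming distance is defined only for equal-length sequences")
        return sum(a != b for a, b in zip(seq_a, seq_b))

    print(hamming_distance("GATTACA", "GACTATA"))   # -> 2
    ```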

  12. On computational methods for crashworthiness

    NASA Technical Reports Server (NTRS)

    Belytschko, T.

    1992-01-01

    The evolution of computational methods for crashworthiness and related fields is described and linked with the decreasing cost of computational resources and with improvements in computational methodologies. The latter include more effective time integration procedures and more efficient elements. Some recent developments in methodologies and future trends are also summarized. These include multi-time-step integration (or subcycling), further improvements in elements, adaptive meshes, and the exploitation of parallel computers.

  13. Application of Computer Technology to Educational Administration in the United States.

    ERIC Educational Resources Information Center

    Bozeman, William C.; And Others

    1991-01-01

    Description of evolution of computer applications in U.S. educational administration is followed by an overview of the structure and governance of public education and Visscher's developmental framework. Typical administrative computer applications in education are discussed, including student records, personnel management, budgeting, library…

  14. Switching from Computer to Microcomputer Architecture Education

    ERIC Educational Resources Information Center

    Bolanakis, Dimosthenis E.; Kotsis, Konstantinos T.; Laopoulos, Theodore

    2010-01-01

    In the last decades, the technological and scientific evolution of the computing discipline has been widely affecting research in software engineering education, which nowadays advocates more enlightened and liberal ideas. This article reviews cross-disciplinary research on a computer architecture class in consideration of its switching to…

  15. Students Upgrading through Computer and Career Education System Services (Project SUCCESS). Final Evaluation Report 1993-94. OER Report.

    ERIC Educational Resources Information Center

    Greene, Judy

    Students Upgrading through Computer and Career Education System Services (Project SUCCESS) was an Elementary and Secondary Education Act Title VII-funded project in its fourth year of operation. The project operated at two high schools in Brooklyn and one in Manhattan (New York). In the 1993-94 school year, the project served 393 students of…

  16. Computational Protein Engineering: Bridging the Gap between Rational Design and Laboratory Evolution

    PubMed Central

    Barrozo, Alexandre; Borstnar, Rok; Marloie, Gaël; Kamerlin, Shina Caroline Lynn

    2012-01-01

    Enzymes are tremendously proficient catalysts, which can be used as extracellular catalysts for a whole host of processes, from chemical synthesis to the generation of novel biofuels. For them to be more amenable to the needs of biotechnology, however, it is often necessary to be able to manipulate their physico-chemical properties in an efficient and streamlined manner, and, ideally, to be able to train them to catalyze completely new reactions. Recent years have seen an explosion of interest in different approaches to achieve this, both in the laboratory, and in silico. There remains, however, a gap between current approaches to computational enzyme design, which have primarily focused on the early stages of the design process, and laboratory evolution, which is an extremely powerful tool for enzyme redesign, but will always be limited by the vastness of sequence space combined with the low frequency for desirable mutations. This review discusses different approaches towards computational enzyme design and demonstrates how combining newly developed screening approaches that can rapidly predict potential mutation “hotspots” with approaches that can quantitatively and reliably dissect the catalytic step can bridge the gap that currently exists between computational enzyme design and laboratory evolution studies. PMID:23202907

  17. Enabling Predictive Simulation and UQ of Complex Multiphysics PDE Systems by the Development of Goal-Oriented Variational Sensitivity Analysis and a-Posteriori Error Estimation Methods

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Estep, Donald

    2015-11-30

    This project addressed the challenge of predictive computational analysis of strongly coupled, highly nonlinear multiphysics systems characterized by multiple physical phenomena that span a large range of length- and time-scales. Specifically, the project was focused on computational estimation of numerical error and sensitivity analysis of computational solutions with respect to variations in parameters and data. In addition, the project investigated the use of accurate computational estimates to guide efficient adaptive discretization. The project developed, analyzed and evaluated new variational adjoint-based techniques for integration, model, and data error estimation/control and sensitivity analysis, in evolutionary multiphysics multiscale simulations.
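
    Goal-oriented (adjoint-based) error estimates of the kind developed in this project are typically written in dual-weighted-residual form; the expression below is the generic statement of that form, quoted for context rather than as a project-specific result.

    ```latex
    J(u) - J(u_h) \;\approx\; \rho(u_h)(z),
    ```

    where J is the quantity of interest, u_h the computed solution, rho(u_h)(.) the weak residual evaluated at u_h, and z the solution of the adjoint (dual) problem associated with J. Localizing the right-hand side element by element yields the indicators used to drive the adaptive discretization mentioned above.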

  18. Contemporary Primary Science Curricula in the United Kingdom

    ERIC Educational Resources Information Center

    Henry, John A.

    1976-01-01

    Following a review of the impact of Piaget's theories of cognitive development on science curriculum design, the evolution and development of the Science 5/13 Project, designed to extend the work of Nuffield Junior Science Project, is described. Evaluation studies assessing the aims and objectives of this project are detailed. (BT)

  19. Luminous Binary Supersoft X-Ray Sources

    NASA Technical Reports Server (NTRS)

    DiStefano, Rosanne; Oliversen, Ronald J. (Technical Monitor)

    2002-01-01

    This grant was for the study of Luminous Supersoft X-Ray Sources (SSSs). During the first year a number of projects were completed and new projects were started. The projects include: 1) Time variability of SSSs; 2) SSSs in M31; 3) Binary evolution scenarios; and 4) Acquiring new data.

  20. A numerical scheme for the identification of hybrid systems describing the vibration of flexible beams with tip bodies

    NASA Technical Reports Server (NTRS)

    Rosen, I. G.

    1984-01-01

    A cubic spline based Galerkin-like method is developed for the identification of a class of hybrid systems which describe the transverse vibration of flexible beams with attached tip bodies. The identification problem is formulated as a least squares fit to data subject to the system dynamics given by a coupled system of ordinary and partial differential equations recast as an abstract evolution equation (AEE) in an appropriate infinite-dimensional Hilbert space. Projecting the AEE into spline-based subspaces leads naturally to a sequence of approximating finite-dimensional identification problems. The solutions to these problems are shown to exist, are relatively easily computed, and are shown to converge, in some sense, to solutions of the original identification problem. Numerical results for a variety of examples are discussed.

  1. Information system evolution at the French National Network of Seismic Survey (BCSF-RENASS)

    NASA Astrophysics Data System (ADS)

    Engels, F.; Grunberg, M.

    2013-12-01

    The aging information system of the French National Network of Seismic Survey (BCSF-RENASS), located in Strasbourg (EOST), needed to be updated to keep up with current practices in computer science. This meant evolving the system at several levels: development methods, data-mining solutions, and system administration. The new system also had to provide more agility for incoming projects. The main difficulty was maintaining the old system and the new one in parallel, with a small team, for the time needed to validate the new solutions. The solutions adopted come from standards used by the seismological community and are inspired by the state of the art of the devops community. The new system is easier to maintain and benefits from the support of large user communities. This poster introduces the new system and the chosen solutions, such as Puppet, Fabric, MongoDB and the FDSN web services.
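
    To make concrete what adopting the standard FDSN web services buys in practice, the sketch below queries a generic fdsnws-event endpoint using parameters defined by the FDSN specification; the base URL is a placeholder, not the actual BCSF-RENASS address, and the dates and magnitude threshold are arbitrary.

```python
# Minimal sketch of a client for a standard FDSN event web service.
# The endpoint below is a placeholder; substitute the real service URL.
import requests

BASE_URL = "https://example.org/fdsnws/event/1/query"  # hypothetical endpoint

params = {
    "starttime": "2013-01-01",
    "endtime": "2013-06-30",
    "minmagnitude": 3.0,
    "format": "text",        # pipe-separated text defined by the FDSN spec
}

response = requests.get(BASE_URL, params=params, timeout=30)
response.raise_for_status()

# Each non-comment line describes one event (id, time, lat, lon, depth, ...).
for line in response.text.splitlines():
    if line and not line.startswith("#"):
        print(line.split("|")[:6])
```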

  2. The evolution of spinal instrumentation for the management of occipital cervical and cervicothoracic junctional injuries.

    PubMed

    Smucker, Joseph D; Sasso, Rick C

    2006-05-15

    Independent computer-based literature review of articles pertaining to instrumentation and fusion of junctional injuries of the cervical spine. To review and discuss the evolution of instrumentation techniques and systems used in the treatment of cervical spine junctional injuries. Instrumentation of junctional injuries of the cervical spine has been limited historically by failure to achieve rigid internal fixation in multiple planes. The evolution of these techniques has required increased insight into the morphology and unique biomechanics of the structures to be instrumented. Computer-based literature search of Ovid and PubMed databases. Extensive literature search yielded insights into the evolution of systems initially based on onlay bone graft combined with wiring techniques. Such techniques have come to include systems incorporating rigid, longitudinal struts that accommodate multiplanar screws placed in the lateral masses, pedicles, transarticular regions, and occipital bone. Despite a rapid evolution of techniques and instrumentation technologies, it remains incumbent on the physician to provide the patient with a surgical procedure that balances the likelihood of a favorable outcome with the risk inherent in the implementation of the procedure.

  3. PLANET TOPERS: Planets, Tracing the Transfer, Origin, Preservation, and Evolution of their ReservoirS.

    PubMed

    Dehant, V; Asael, D; Baland, R M; Baludikay, B K; Beghin, J; Belza, J; Beuthe, M; Breuer, D; Chernonozhkin, S; Claeys, Ph; Cornet, Y; Cornet, L; Coyette, A; Debaille, V; Delvigne, C; Deproost, M H; De WInter, N; Duchemin, C; El Atrassi, F; François, C; De Keyser, J; Gillmann, C; Gloesener, E; Goderis, S; Hidaka, Y; Höning, D; Huber, M; Hublet, G; Javaux, E J; Karatekin, Ö; Kodolanyi, J; Revilla, L Lobo; Maes, L; Maggiolo, R; Mattielli, N; Maurice, M; McKibbin, S; Morschhauser, A; Neumann, W; Noack, L; Pham, L B S; Pittarello, L; Plesa, A C; Rivoldini, A; Robert, S; Rosenblatt, P; Spohn, T; Storme, J -Y; Tosi, N; Trinh, A; Valdes, M; Vandaele, A C; Vanhaecke, F; Van Hoolst, T; Van Roosbroek, N; Wilquet, V; Yseboodt, M

    2016-11-01

    The Interuniversity Attraction Pole (IAP) 'PLANET TOPERS' (Planets: Tracing the Transfer, Origin, Preservation, and Evolution of their Reservoirs) addresses the fundamental understanding of the thermal and compositional evolution of the different reservoirs of planetary bodies (core, mantle, crust, atmosphere, hydrosphere, cryosphere, and space) considering interactions and feedback mechanisms. Here we present the first results after 2 years of project work.

  4. Transverse momentum dependent parton distributions at small- x

    DOE PAGES

    Xiao, Bo-Wen; Yuan, Feng; Zhou, Jian

    2017-05-23

    We study the transverse momentum dependent (TMD) parton distributions at small-x in a consistent framework that takes into account the TMD evolution and small-x evolution simultaneously. The small-x evolution effects are included by computing the TMDs at appropriate scales in terms of the dipole scattering amplitudes, which obey the relevant Balitsky–Kovchegov equation. Meanwhile, the TMD evolution is obtained by resumming the Collins–Soper-type large logarithms that emerge from the calculations in the small-x formalism into Sudakov factors.
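
    For readers unfamiliar with the two ingredients named above, the leading-order Balitsky–Kovchegov equation for the dipole amplitude and the generic structure of a Collins–Soper-type Sudakov factor are reproduced schematically below; the coefficients and scale choices actually used in the paper are not reproduced here.

```latex
% Leading-order Balitsky--Kovchegov equation for the dipole amplitude N
% (schematic; \bar\alpha_s = \alpha_s N_c/\pi, transverse coordinates x_{ij}):
\[
  \frac{\partial N(x_{01},Y)}{\partial Y}
  = \frac{\bar\alpha_s}{2\pi}\int d^2x_2\,
    \frac{x_{01}^2}{x_{02}^2\,x_{12}^2}
    \Big[ N(x_{02},Y) + N(x_{12},Y) - N(x_{01},Y) - N(x_{02},Y)\,N(x_{12},Y) \Big].
\]
% Generic Collins--Soper-type Sudakov factor resumming the large logarithms:
\[
  S(Q,b) = \exp\!\left\{ -\int_{\mu_b^2}^{Q^2} \frac{d\mu^2}{\mu^2}
  \left[ A\!\left(\alpha_s(\mu)\right)\ln\frac{Q^2}{\mu^2} + B\!\left(\alpha_s(\mu)\right) \right] \right\}.
\]
```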

  5. Galactic evolution. I - Single-zone models. [encompassing stellar evolution and gas-star dynamic theories

    NASA Technical Reports Server (NTRS)

    Thuan, T. X.; Hart, M. H.; Ostriker, J. P.

    1975-01-01

    The two basic approaches of physical theory required to calculate the evolution of a galactic system are considered, taking into account stellar evolution theory and the dynamics of a gas-star system. Attention is given to intrinsic (stellar) physics, extrinsic (dynamical) physics, and computations concerning the fractionation of an initial mass of gas into stars. The characteristics of a 'standard' model and its variants are discussed along with the results obtained with the aid of these models.
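
    As an illustration of what a single-zone calculation involves, the sketch below integrates only the simplest closed-box variant with instantaneous recycling (linear Schmidt law, constant yield and return fraction, all values illustrative); the models of the paper are considerably richer, since they couple full stellar evolution and gas-star dynamics.

```python
# Minimal closed-box, single-zone chemical-evolution sketch with instantaneous
# recycling (illustrative assumptions: linear Schmidt law, constant yield y,
# return fraction R); not the specific model of the paper.
import numpy as np

eps, R, y = 0.3, 0.3, 0.02       # SFR efficiency [1/Gyr], return fraction, yield
dt, t_end = 0.001, 13.0          # time step and total time [Gyr]

M_gas, M_stars, Z = 1.0, 0.0, 0.0
for _ in np.arange(0.0, t_end, dt):
    psi = eps * M_gas                       # star formation rate
    dM = (1.0 - R) * psi * dt               # gas locked into long-lived stars
    Z += y * (1.0 - R) * psi / M_gas * dt   # enrichment of the remaining gas
    M_gas -= dM
    M_stars += dM

print(f"gas fraction = {M_gas:.3f}, stellar mass = {M_stars:.3f}, Z = {Z:.4f}")
print(f"analytic check: Z = y*ln(Mg0/Mg) = {y * np.log(1.0 / M_gas):.4f}")
```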

  6. Transverse momentum dependent parton distributions at small-x

    NASA Astrophysics Data System (ADS)

    Xiao, Bo-Wen; Yuan, Feng; Zhou, Jian

    2017-08-01

    We study the transverse momentum dependent (TMD) parton distributions at small-x in a consistent framework that takes into account the TMD evolution and small-x evolution simultaneously. The small-x evolution effects are included by computing the TMDs at appropriate scales in terms of the dipole scattering amplitudes, which obey the relevant Balitsky-Kovchegov equation. Meanwhile, the TMD evolution is obtained by resumming the Collins-Soper-type large logarithms that emerge from the calculations in the small-x formalism into Sudakov factors.

  7. Transverse momentum dependent parton distributions at small- x

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Xiao, Bo-Wen; Yuan, Feng; Zhou, Jian

    We study the transverse momentum dependent (TMD) parton distributions at small-x in a consistent framework that takes into account the TMD evolution and small-x evolution simultaneously. The small-x evolution effects are included by computing the TMDs at appropriate scales in terms of the dipole scattering amplitudes, which obey the relevant Balitsky–Kovchegov equation. Meanwhile, the TMD evolution is obtained by resumming the Collins–Soper-type large logarithms that emerge from the calculations in the small-x formalism into Sudakov factors.

  8. Tracking the NGS revolution: managing life science research on shared high-performance computing clusters.

    PubMed

    Dahlö, Martin; Scofield, Douglas G; Schaal, Wesley; Spjuth, Ola

    2018-05-01

    Next-generation sequencing (NGS) has transformed the life sciences, and many research groups are newly dependent upon computer clusters to store and analyze large datasets. This creates challenges for e-infrastructures accustomed to hosting computationally mature research in other sciences. Using data gathered from our own clusters at UPPMAX computing center at Uppsala University, Sweden, where core hour usage of ∼800 NGS and ∼200 non-NGS projects is now similar, we compare and contrast the growth, administrative burden, and cluster usage of NGS projects with projects from other sciences. The number of NGS projects has grown rapidly since 2010, with growth driven by entry of new research groups. Storage used by NGS projects has grown more rapidly since 2013 and is now limited by disk capacity. NGS users submit nearly twice as many support tickets per user, and 11 more tools are installed each month for NGS projects than for non-NGS projects. We developed usage and efficiency metrics and show that computing jobs for NGS projects use more RAM than non-NGS projects, are more variable in core usage, and rarely span multiple nodes. NGS jobs use booked resources less efficiently for a variety of reasons. Active monitoring can improve this somewhat. Hosting NGS projects imposes a large administrative burden at UPPMAX due to large numbers of inexperienced users and diverse and rapidly evolving research areas. We provide a set of recommendations for e-infrastructures that host NGS research projects. We provide anonymized versions of our storage, job, and efficiency databases.

  9. Tracking the NGS revolution: managing life science research on shared high-performance computing clusters

    PubMed Central

    2018-01-01

    Background: Next-generation sequencing (NGS) has transformed the life sciences, and many research groups are newly dependent upon computer clusters to store and analyze large datasets. This creates challenges for e-infrastructures accustomed to hosting computationally mature research in other sciences. Using data gathered from our own clusters at UPPMAX computing center at Uppsala University, Sweden, where core hour usage of ∼800 NGS and ∼200 non-NGS projects is now similar, we compare and contrast the growth, administrative burden, and cluster usage of NGS projects with projects from other sciences. Results: The number of NGS projects has grown rapidly since 2010, with growth driven by entry of new research groups. Storage used by NGS projects has grown more rapidly since 2013 and is now limited by disk capacity. NGS users submit nearly twice as many support tickets per user, and 11 more tools are installed each month for NGS projects than for non-NGS projects. We developed usage and efficiency metrics and show that computing jobs for NGS projects use more RAM than non-NGS projects, are more variable in core usage, and rarely span multiple nodes. NGS jobs use booked resources less efficiently for a variety of reasons. Active monitoring can improve this somewhat. Conclusions: Hosting NGS projects imposes a large administrative burden at UPPMAX due to large numbers of inexperienced users and diverse and rapidly evolving research areas. We provide a set of recommendations for e-infrastructures that host NGS research projects. We provide anonymized versions of our storage, job, and efficiency databases. PMID:29659792

  10. A Projection Quality-Driven Tube Current Modulation Method in Cone-Beam CT for IGRT: Proof of Concept.

    PubMed

    Men, Kuo; Dai, Jianrong

    2017-12-01

    To develop a projection quality-driven tube current modulation method in cone-beam computed tomography for image-guided radiotherapy, based on the prior attenuation information obtained from the planning computed tomography, and to evaluate its effect on reducing the imaging dose. The QCKV-1 phantom with different thicknesses (0-400 mm) of solid water upon it was used to simulate different attenuation (μ). Projections were acquired with a series of tube current-exposure time product (mAs) settings, and a 2-dimensional contrast-to-noise ratio was analyzed for each projection to create a lookup table of mAs versus 2-dimensional contrast-to-noise ratio and μ. Before a patient underwent cone-beam computed tomography, the maximum attenuation μmax(θ) within the 95% range of each projection angle (θ) was estimated from the planning computed tomography images. Then, a desired 2-dimensional contrast-to-noise ratio value was selected, and the mAs setting at θ was calculated with the lookup table of mAs versus 2-dimensional contrast-to-noise ratio and μmax(θ). Three-dimensional cone-beam computed tomography images were reconstructed using the projections acquired with the selected mAs. The imaging dose was evaluated with a polymethyl methacrylate dosimetry phantom in terms of volume computed tomography dose index. Image quality was analyzed using a Catphan 503 phantom with an oval body annulus and a pelvis phantom. For the Catphan 503 phantom, the cone-beam computed tomography image obtained by the projection quality-driven tube current modulation method had a similar quality to that of conventional cone-beam computed tomography. However, the proposed method could reduce the imaging dose by 16% to 33% while achieving an equivalent contrast-to-noise ratio value. For the pelvis phantom, the structural similarity index was 0.992 with a dose reduction of 39.7% for the projection quality-driven tube current modulation method. The proposed method could reduce the additional dose to the patient while not degrading the image quality for cone-beam computed tomography. The projection quality-driven tube current modulation method could be especially beneficial to patients who undergo cone-beam computed tomography frequently during a treatment course.
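
    The central lookup step of the method can be sketched as follows: a table of measured 2-dimensional contrast-to-noise ratio as a function of mAs and attenuation is inverted to give, for each projection angle, the smallest mAs that reaches a desired contrast-to-noise ratio at the attenuation estimated from the planning computed tomography. All numbers below are placeholders, not the phantom measurements of the study.

```python
# Illustrative sketch of the lookup step in projection-quality-driven tube
# current modulation: given a desired 2D CNR and the estimated maximum
# attenuation for each projection angle, pick the smallest mAs that reaches
# the target.  The table values below are placeholders, not measured data.
import numpy as np

mAs_levels = np.array([0.1, 0.2, 0.4, 0.8, 1.6])     # candidate mAs settings
mu_levels = np.array([0.0, 1.0, 2.0, 3.0, 4.0])      # attenuation (mu * thickness)
# cnr_table[i, j] = measured 2D CNR at mAs_levels[i] and mu_levels[j]
cnr_table = np.array([[8, 5, 3, 2, 1],
                      [11, 7, 4, 3, 2],
                      [16, 10, 6, 4, 3],
                      [22, 14, 9, 6, 4],
                      [30, 20, 12, 8, 6]], dtype=float)

def select_mAs(mu_max, target_cnr):
    """Smallest tabulated mAs whose interpolated CNR at mu_max meets the target."""
    for i, mAs in enumerate(mAs_levels):
        cnr = np.interp(mu_max, mu_levels, cnr_table[i])
        if cnr >= target_cnr:
            return mAs
    return mAs_levels[-1]   # fall back to the maximum available setting

# Example: modulate over projection angles using per-angle attenuation estimates.
angles = np.linspace(0, 360, 8, endpoint=False)
mu_max_per_angle = 2.0 + 1.5 * np.abs(np.sin(np.radians(angles)))   # placeholder
plan = {theta: select_mAs(mu, target_cnr=5.0) for theta, mu in zip(angles, mu_max_per_angle)}
print(plan)
```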

  11. Lattice QCD Application Development within the US DOE Exascale Computing Project

    NASA Astrophysics Data System (ADS)

    Brower, Richard; Christ, Norman; DeTar, Carleton; Edwards, Robert; Mackenzie, Paul

    2018-03-01

    In October 2016, the US Department of Energy launched the Exascale Computing Project, which aims to deploy exascale computing resources for science and engineering in the early 2020s. The project brings together application teams, software developers, and hardware vendors in order to realize this goal. Lattice QCD is one of the applications. Members of the US lattice gauge theory community with significant collaborators abroad are developing algorithms and software for exascale lattice QCD calculations. We give a short description of the project, our activities, and our plans.

  12. A European Flagship Programme on Extreme Computing and Climate

    NASA Astrophysics Data System (ADS)

    Palmer, Tim

    2017-04-01

    In 2016, an outline proposal for a (c. 1 billion euro) flagship project on exascale computing and high-resolution global climate modelling, co-authored by a number of leading climate modelling scientists from around Europe, was sent to the EU via its Future and Emerging Technologies (FET) Flagships programme. The project is formally entitled "A Flagship European Programme on Extreme Computing and Climate (EPECC)". In this talk I will outline the reasons why I believe such a project is needed and describe the current status of the project. I will leave time for some discussion.

  13. Lattice QCD Application Development within the US DOE Exascale Computing Project

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brower, Richard; Christ, Norman; DeTar, Carleton

    In October 2016, the US Department of Energy launched the Exascale Computing Project, which aims to deploy exascale computing resources for science and engineering in the early 2020s. The project brings together application teams, software developers, and hardware vendors in order to realize this goal. Lattice QCD is one of the applications. Members of the US lattice gauge theory community with significant collaborators abroad are developing algorithms and software for exascale lattice QCD calculations. We give a short description of the project, our activities, and our plans.

  14. The CP-PACS Project and Lattice QCD Results

    NASA Astrophysics Data System (ADS)

    Iwasaki, Y.

    The aim of the CP-PACS project was to develop a massively parallel computer for performing numerical research in computational physics with primary emphasis on lattice QCD. The CP-PACS computer, with a peak speed of 614 GFLOPS on 2048 processors, was completed in September 1996 and has been in full operation since October 1996. We present an overview of the CP-PACS project and describe characteristics of the CP-PACS computer. The CP-PACS has been mainly used for hadron spectroscopy studies in lattice QCD. Main results in lattice QCD simulations are given.

  15. The hills are alive: Earth surface dynamics in the University of Arizona Landscape Evolution Observatory

    NASA Astrophysics Data System (ADS)

    DeLong, S.; Troch, P. A.; Barron-Gafford, G. A.; Huxman, T. E.; Pelletier, J. D.; Dontsova, K.; Niu, G.; Chorover, J.; Zeng, X.

    2012-12-01

    To meet the challenge of predicting landscape-scale changes in Earth system behavior, the University of Arizona has designed and constructed a new large-scale and community-oriented scientific facility - the Landscape Evolution Observatory (LEO). The primary scientific objectives are to quantify interactions among hydrologic partitioning, geochemical weathering, ecology, microbiology, atmospheric processes, and geomorphic change associated with incipient hillslope development. LEO consists of three identical, sloping, 333 m² convergent landscapes inside a 5,000 m² environmentally controlled facility. These engineered landscapes contain 1 meter of basaltic tephra ground to homogeneous loamy sand and contain a spatially dense sensor and sampler network capable of resolving meter-scale lateral heterogeneity and sub-meter-scale vertical heterogeneity in moisture, energy and carbon states and fluxes. Each ~1000 metric ton landscape has load cells embedded into the structure to measure changes in total system mass with 0.05% full-scale repeatability (equivalent to less than 1 cm of precipitation), to facilitate better quantification of evapotranspiration. Each landscape has an engineered rain system that allows application of precipitation at rates between 3 and 45 mm/hr. These landscapes are being studied in replicate as "bare soil" for an initial period of several years. After this initial phase, heat- and drought-tolerant vascular plant communities will be introduced. Introduction of vascular plants is expected to change how water, carbon, and energy cycle through the landscapes, with potentially dramatic effects on co-evolution of the physical and biological systems. LEO also provides a physical comparison to computer models that are designed to predict interactions among hydrological, geochemical, atmospheric, ecological and geomorphic processes in changing climates. These computer models will be improved by comparing their predictions to physical measurements made in LEO. The main focus of our iterative modeling and measurement discovery cycle is to use rapid data assimilation to facilitate validation of newly coupled open-source Earth systems models. LEO will be a community resource for Earth system science research, education, and outreach. The LEO project operational philosophy includes 1) open and real-time availability of sensor network data, 2) a framework for community collaboration and facility access that includes integration of new or comparative measurement capabilities into existing facility cyberinfrastructure, 3) community-guided science planning, and 4) development of novel education and outreach programs. [Figure: artistic rendering of the University of Arizona Landscape Evolution Observatory.]

  16. Evaluation and modelling of the size fractionated aerosol particle number concentration measurements nearby a major road in Helsinki - Part I: Modelling results within the LIPIKA project

    NASA Astrophysics Data System (ADS)

    Pohjola, M. A.; Pirjola, L.; Karppinen, A.; Härkönen, J.; Korhonen, H.; Hussein, T.; Ketzel, M.; Kukkonen, J.

    2007-08-01

    A field measurement campaign was conducted near a major road, "Itäväylä", in an urban area of Helsinki on 17-20 February 2003. Aerosol measurements were conducted using a mobile laboratory, "Sniffer", at various distances from the road and at an urban background location. Measurements included particle size distribution in the size range of 7 nm-10 μm (aerodynamic diameter) by the Electrical Low Pressure Impactor (ELPI) and in the size range of 3-50 nm (mobility diameter) by a Scanning Mobility Particle Sizer (SMPS), total number concentration of particles larger than 3 nm detected by an ultrafine condensation particle counter (UCPC), temperature, relative humidity, wind speed and direction, the driving route of the mobile laboratory, and traffic density on the studied road. In this study, we have compared measured concentration data with the predictions of the road network dispersion model CAR-FMI used in combination with an aerosol process model, MONO32. For model comparison purposes, one of the cases was additionally computed using the aerosol process model UHMA, combined with the CAR-FMI model. The vehicular exhaust emissions and the atmospheric dispersion and transformation of fine and ultrafine particles were evaluated within a distance scale of 200 m (corresponding to a time scale of a couple of minutes). We computed the temporal evolution of the number concentrations, size distributions and chemical compositions of various particle size classes. The atmospheric dilution rate of particles is obtained from the roadside dispersion model CAR-FMI. Considering the evolution of total number concentration, dilution was shown to be the most important process. The influence of coagulation and condensation on the number concentrations of particle size modes was found to be negligible on this distance scale. Condensation was found to affect the evolution of particle diameter in the two smallest particle modes. The assumed concentration of condensable organic vapour of 10¹² molecules cm⁻³ was shown to be in disagreement with the measured particle size evolution, while the modelling runs with concentrations of condensable organic vapour of 10⁹-10¹⁰ molecules cm⁻³ resulted in particle sizes that were closest to the measured values.

  17. Use of Computer Instruction in Rural Schools to Increase Curriculum Opportunities for the College Bound Student. ESEA Title IV-C Project Report.

    ERIC Educational Resources Information Center

    Ware, Ronnie J.

    In an effort to increase curriculum opportunities in a rural school district, a computer project was implemented involving grade 9-12 students chosen on the basis of national percentile scores, IQ, and desire to attend college. The project offered, through programmed computer instruction, physics, French I and II, and German I. One proctor was…

  18. Mindmodeling@Home. . . and Anywhere Else You Have Idle Processors

    DTIC Science & Technology

    2009-07-01

    the continuous growth rate of end-user processing capability around the world. The first volunteer computing project was SETI@Home. It was... SETI@Home remains the longest running and one of the most popular volunteer computing projects in the world. This actually is an impressive feat... volunteer computing projects available to those interested in donating their idle processor time to scientific pursuits. Most of them, including SETI

  19. A generalized public goods game with coupling of individual ability and project benefit

    NASA Astrophysics Data System (ADS)

    Zhong, Li-Xin; Xu, Wen-Juan; He, Yun-Xin; Zhong, Chen-Yang; Chen, Rong-Da; Qiu, Tian; Shi, Yong-Dong; Ren, Fei

    2017-08-01

    Facing a heavy task, any single person can only make a limited contribution and team cooperation is needed. As one enjoys the benefit of the public goods, the potential benefits of the project are not always maximized and may be partly wasted. By incorporating individual ability and project benefit into the original public goods game, we study the coupling effect of the four parameters, the upper limit of individual contribution, the upper limit of individual benefit, the needed project cost and the upper limit of project benefit on the evolution of cooperation. Coevolving with the individual-level group size preferences, an increase in the upper limit of individual benefit promotes cooperation while an increase in the upper limit of individual contribution inhibits cooperation. The coupling of the upper limit of individual contribution and the needed project cost determines the critical point of the upper limit of project benefit, where the equilibrium frequency of cooperators reaches its highest level. Above the critical point, an increase in the upper limit of project benefit inhibits cooperation. The evolution of cooperation is closely related to the preferred group-size distribution. A functional relation between the frequency of cooperators and the dominant group size is found.
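
    The exact payoff rules are specified in the paper; the sketch below encodes only one plausible reading of the coupling, for illustration: contributions are capped by individual ability, the project fails if the pooled contributions do not cover the needed cost, the realized benefit is capped by the project-benefit ceiling and shared equally, and each member's share is capped by the individual-benefit ceiling.

```python
# Illustrative payoff rule for a generalized public goods game in which
# individual ability (contribution cap), individual benefit cap, the needed
# project cost and the project benefit cap are all coupled.  This is one
# plausible reading of the model, written for illustration only.
import numpy as np

def group_payoffs(contribs, c_max, b_ind_max, cost_needed, b_proj_max, r=3.0):
    """Return each member's payoff given their (capped) contributions."""
    contribs = np.minimum(contribs, c_max)          # ability limits contribution
    pot = contribs.sum()
    if pot < cost_needed:                           # project fails: contributions lost
        return -contribs
    benefit = min(r * pot, b_proj_max)              # project benefit is capped
    share = np.minimum(benefit / len(contribs), b_ind_max)   # individual benefit cap
    return share - contribs

# Example: five players, two of whom free-ride.
contribs = np.array([1.0, 1.0, 1.0, 0.0, 0.0])
print(group_payoffs(contribs, c_max=1.0, b_ind_max=2.0,
                    cost_needed=2.5, b_proj_max=8.0))
```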

  20. An Interdisciplinary Team Project: Psychology and Computer Science Students Create Online Cognitive Tasks

    ERIC Educational Resources Information Center

    Flannery, Kathleen A.; Malita, Mihaela

    2014-01-01

    We present our case study of an interdisciplinary team project for students taking either a psychology or computer science (CS) course. The project required psychology and CS students to combine their knowledge and skills to create an online cognitive task. Each interdisciplinary project team included two psychology students who conducted library…

  1. Using the Computer in Evolution Studies

    ERIC Educational Resources Information Center

    Mariner, James L.

    1973-01-01

    Describes a high school biology exercise in which a computer greatly reduces time spent on calculations. Genetic equilibrium demonstrated by the Hardy-Weinberg principle and the subsequent effects of violating any of its premises are more readily understood when frequencies of alleles through many generations are calculated by the computer. (JR)
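
    The original program is not preserved in the record, but the kind of calculation the exercise automates is easy to reproduce: iterate allele frequencies over many generations, recovering Hardy-Weinberg equilibrium when all genotypes are equally fit and visible departures when a premise (here, no selection) is violated. Fitness values below are illustrative.

```python
# Track allele frequencies over many generations.  With equal fitnesses the
# frequencies stay at Hardy-Weinberg equilibrium; unequal fitnesses (violating
# one HW premise) shift them generation by generation.
def next_allele_freq(p, w_AA=1.0, w_Aa=1.0, w_aa=1.0):
    q = 1.0 - p
    w_bar = p*p*w_AA + 2*p*q*w_Aa + q*q*w_aa        # mean fitness
    return (p*p*w_AA + p*q*w_Aa) / w_bar            # frequency of A after selection

p = 0.5
for gen in range(1, 21):
    p = next_allele_freq(p, w_AA=1.0, w_Aa=1.0, w_aa=0.8)   # selection against aa
    if gen % 5 == 0:
        print(f"generation {gen:2d}: p(A) = {p:.3f}, genotypes = "
              f"{p*p:.3f} / {2*p*(1-p):.3f} / {(1-p)*(1-p):.3f}")
```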

  2. A Foothold for Handhelds.

    ERIC Educational Resources Information Center

    Joyner, Amy

    2003-01-01

    Handheld computers provide students tremendous computing and learning power at about a tenth of the cost of a regular computer. Describes the evolution of handhelds; provides some examples of their uses; and cites research indicating they are effective classroom tools that can improve efficiency and instruction. A sidebar lists handheld resources.…

  3. Computer-Based Education (CBE): Tomorrow's Traditional System.

    ERIC Educational Resources Information Center

    Rizza, Peter J., Jr.

    1981-01-01

    Examines the role of computer technology in education; discusses reasons for the slow evolution of Computer-Based Education (CBE); explores educational areas in which CBE can be used; presents barriers to widespread use of CBE; and describes the responsibilities of education, government, and business in supporting technology-oriented education.…

  4. Fundamentals of Library Automation and Technology. Participant Workbook.

    ERIC Educational Resources Information Center

    Bridge, Frank; Walton, Robert

    This workbook presents outlines of topics to be covered during a two-day workshop on the fundamentals for library automation. Topics for the first day include: (1) Introduction; (2) Computer Technology--A Historical Overview; (3) Evolution of Library Automation; (4) Computer Hardware Technology--An Introduction; (5) Computer Software…

  5. Elders, Students & Computers--Background Information. Illinois Series on Educational Technology of Computers. Number 8.

    ERIC Educational Resources Information Center

    Jaycox, Kathy; Hicks, Bruce

    This report reviews the literature relating to computer uses for elders. Topics include: (1) variables affecting computer use by elders; (2) organizations and programs serving elders in Champaign County, Illinois; (3) University of Illinois workshops on problems of older people; (4) The Senior Citizens Project of Volunteer Illini Projects; (5)…

  6. Cambridge Elementary students enjoy gift of computers

    NASA Technical Reports Server (NTRS)

    1999-01-01

    Children at Cambridge Elementary School, Cocoa, Fla., eagerly unwrap computer equipment donated by Kennedy Space Center. Cambridge is one of 13 Brevard County schools receiving 81 excess contractor computers thanks to an innovative educational outreach project spearheaded by the NASA K-12 Education Services Office at KSC. Behind the children is Jim Thurston, a school volunteer and retired employee of USBI, who shared in the project. The Astronaut Memorial Foundation, a strategic partner in the effort, and several schools in rural Florida and Georgia also received refurbished computers as part of the year-long project. KSC employees put in about 3,300 volunteer hours to transform old, excess computers into upgraded, usable units. A total of $90,000 in upgraded computer equipment is being donated.

  7. Fracture Evolution Following a Hydraulic Stimulation within an EGS Reservoir

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mella, Michael

    The objective of this project was to develop and demonstrate an approach for tracking the evolution of circulation immediately following a hydraulic stimulation in an EGS reservoir. Series of high-resolution tracer tests using conservative and thermally reactive tracers were designed at recently created EGS reservoirs in order to track changes in fluid flow parameters such as reservoir pore volume, flow capacity, and effective reservoir temperature over time. Data obtained from the project would be available for the calibration of reservoir models that could serve to predict EGS performance following a hydraulic stimulation.

  8. Radio Synthesis Imaging - A High Performance Computing and Communications Project

    NASA Astrophysics Data System (ADS)

    Crutcher, Richard M.

    The National Science Foundation has funded a five-year High Performance Computing and Communications project at the National Center for Supercomputing Applications (NCSA) for the direct implementation of several of the computing recommendations of the Astronomy and Astrophysics Survey Committee (the "Bahcall report"). This paper is a summary of the project goals and a progress report. The project will implement a prototype of the next generation of astronomical telescope systems - remotely located telescopes connected by high-speed networks to very high performance, scalable architecture computers and on-line data archives, which are accessed by astronomers over Gbit/sec networks. Specifically, a data link has been installed between the BIMA millimeter-wave synthesis array at Hat Creek, California and NCSA at Urbana, Illinois for real-time transmission of data to NCSA. Data are automatically archived, and may be browsed and retrieved by astronomers using the NCSA Mosaic software. In addition, an on-line digital library of processed images will be established. BIMA data will be processed on a very high performance distributed computing system, with I/O, user interface, and most of the software system running on the NCSA Convex C3880 supercomputer or Silicon Graphics Onyx workstations connected by HiPPI to the high performance, massively parallel Thinking Machines Corporation CM-5. The very computationally intensive algorithms for calibration and imaging of radio synthesis array observations will be optimized for the CM-5 and new algorithms which utilize the massively parallel architecture will be developed. Code running simultaneously on the distributed computers will communicate using the Data Transport Mechanism developed by NCSA. The project will also use the BLANCA Gbit/s testbed network between Urbana and Madison, Wisconsin to connect an Onyx workstation in the University of Wisconsin Astronomy Department to the NCSA CM-5, for development of long-distance distributed computing. Finally, the project is developing 2D and 3D visualization software as part of the international AIPS++ project. This research and development project is being carried out by a team of experts in radio astronomy, algorithm development for massively parallel architectures, high-speed networking, database management, and Thinking Machines Corporation personnel. The development of this complete software, distributed computing, and data archive and library solution to the radio astronomy computing problem will advance our expertise in high performance computing and communications technology and the application of these techniques to astronomical data processing.

  9. From Many-to-One to One-to-Many: The Evolution of Ubiquitous Computing in Education

    ERIC Educational Resources Information Center

    Chen, Wenli; Lim, Carolyn; Tan, Ashley

    2011-01-01

    Personal, Internet-connected technologies are becoming ubiquitous in the lives of students, and ubiquitous computing initiatives are already expanding in educational contexts. Historically in the field of education, the terms one-to-one (1:1) computing and ubiquitous computing have been interpreted in a number of ways and have at times been used…

  10. MIGHTEE: The MeerKAT International GHz Tiered Extragalactic Exploration

    NASA Astrophysics Data System (ADS)

    Taylor, A. Russ; Jarvis, Matt

    2017-05-01

    The MeerKAT telescope is the precursor of the Square Kilometre Array mid-frequency dish array to be deployed later this decade on the African continent. MIGHTEE is one of the MeerKAT large survey projects designed to pathfind SKA key science in cosmology and galaxy evolution. Through a tiered radio continuum deep imaging project including several fields totaling 20 square degrees to microJy sensitivities and an ultra-deep image of a single 1 square degree field of view, MIGHTEE will explore dark matter and large scale structure, the evolution of galaxies, including AGN activity and star formation as a function of cosmic time and environment, the emergence and evolution of magnetic fields in galaxies, and the magnetic counterpart to the large scale structure of the universe.

  11. SIMBA: a web tool for managing bacterial genome assembly generated by Ion PGM sequencing technology.

    PubMed

    Mariano, Diego C B; Pereira, Felipe L; Aguiar, Edgar L; Oliveira, Letícia C; Benevides, Leandro; Guimarães, Luís C; Folador, Edson L; Sousa, Thiago J; Ghosh, Preetam; Barh, Debmalya; Figueiredo, Henrique C P; Silva, Artur; Ramos, Rommel T J; Azevedo, Vasco A C

    2016-12-15

    The evolution of Next-Generation Sequencing (NGS) has considerably reduced the cost per sequenced base, allowing a significant rise in the number of sequencing projects, mainly in prokaryotes. However, the range of available NGS platforms requires different strategies and software to correctly assemble genomes. Properly completing an assembly project also involves installing or modifying various software packages, which demands significant expertise with these tools and command-line scripting experience on Unix platforms, in addition to a basic grounding in genome assembly methodologies and techniques. These difficulties often delay complete genome assembly projects. In order to overcome this, we developed SIMBA (SImple Manager for Bacterial Assemblies), a freely available web tool that integrates several component tools for assembling and finishing bacterial genomes. SIMBA provides a friendly and intuitive user interface so that bioinformaticians, even those with little computational expertise, can work under a centralized administrative control system of assemblies managed by the assembly center head. SIMBA guides users through the assembly process with simple and interactive pages. The SIMBA workflow is divided into three modules: (i) projects, which gives a general view of genome sequencing projects, in addition to data quality analysis and data format conversions; (ii) assemblies, which allows de novo assemblies with the software Mira, Minia, Newbler and SPAdes, as well as assembly quality validation using the QUAST software; and (iii) curation, which provides methods for finishing assemblies through tools for scaffolding contigs and closing gaps. We also present a case study that validated the efficacy of SIMBA in managing bacterial assembly projects sequenced using the Ion Torrent PGM. Besides being a web tool for genome assembly, SIMBA is a complete genome assembly project management system, which can be useful for managing several projects in laboratories. SIMBA source code is available for download and installation on local web servers at http://ufmg-simba.sourceforge.net.

  12. New methodology for fast prediction of wheel wear evolution

    NASA Astrophysics Data System (ADS)

    Apezetxea, I. S.; Perez, X.; Casanueva, C.; Alonso, A.

    2017-07-01

    In railway applications, wear prediction at the wheel-rail interface is a fundamental matter for studying problems such as wheel lifespan and the evolution of vehicle dynamic characteristics over time. However, one of the principal drawbacks of the existing methodologies for calculating wear evolution is their computational cost. This paper proposes a new wear prediction methodology with a reduced computational cost. The methodology is based on two main steps: the first is to replace calculations over the whole network with the calculation of the contact conditions at certain characteristic points, from whose results the wheel wear evolution can be inferred. The second is to replace the dynamic calculation (time integration) with a quasi-static calculation (the solution of the quasi-static state of the vehicle at a certain point, which is equivalent to neglecting the acceleration terms in the dynamic equations). These simplifications allow a significant reduction in computational cost while maintaining an acceptable level of accuracy (errors of the order of 5-10%). Several case studies are analysed in the paper with the objective of assessing the proposed methodology. The results obtained in the case studies allow us to conclude that the proposed methodology is valid for an arbitrary vehicle running through an arbitrary track layout.
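
    The paper's contact model and wear law are not reproduced in the abstract; the sketch below only illustrates the structure of the proposed simplification, under stated assumptions: quasi-static contact quantities evaluated at a few characteristic points of the track, each weighted by its share of the mileage, feed a generic Archard-like wear accumulation instead of a full time-domain dynamic simulation. All numbers are invented for illustration.

```python
# Structural sketch of the simplification (illustrative numbers and a generic
# Archard-like wear law; the paper's actual contact model and wear law are not
# reproduced here): quasi-static contact results at a few characteristic points
# stand in for full dynamic simulations over the whole network.
import numpy as np

# Characteristic points: (normal force N [kN], creepage [-], share of mileage [-])
characteristic_points = [
    ("tangent track", (80.0, 0.001, 0.55)),
    ("wide curve",    (95.0, 0.004, 0.30)),
    ("tight curve",   (110.0, 0.010, 0.15)),
]

k_wear = 1.0e-4        # illustrative wear coefficient [mm per kN per unit creepage per km]
total_km = 50_000.0    # mileage between wheel reprofilings

worn_depth = 0.0
for name, (N, creepage, share) in characteristic_points:
    # quasi-static contact -> frictional work proxy -> wear contribution
    worn_depth += k_wear * N * creepage * share * total_km

print(f"estimated uniform wear depth after {total_km:.0f} km: {worn_depth:.2f} mm")
```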

  13. The Interactive Computer: Authors and Readers Online.

    ERIC Educational Resources Information Center

    Saccardi, Marianne

    1991-01-01

    Describes a computer-literature project for middle school and high school students that was developed through the Fairfield-Westchester Children's Reading Project (CT) to promote online discussions between students and authors. Classroom activities are described, project financing is discussed, and teacher responses that indicate positive effects…

  14. The Evolution of Biological Complexity in Digital Organisms

    NASA Astrophysics Data System (ADS)

    Ofria, Charles

    2013-03-01

    When Darwin first proposed his theory of evolution by natural selection, he realized that it had a problem explaining the origins of traits of "extreme perfection and complication" such as the vertebrate eye. Critics of Darwin's theory have latched onto this perceived flaw as a proof that Darwinian evolution is impossible. In anticipation of this issue, Darwin described the perfect data needed to understand this process, but lamented that such data are "scarcely ever possible" to obtain. In this talk, I will discuss research where we use populations of digital organisms (self-replicating and evolving computer programs) to elucidate the genetic and evolutionary processes by which new, highly complex traits arise, drawing inspiration directly from Darwin's wistful thinking and hypotheses. During the process of evolution in these fully transparent computational environments we can measure the incorporation of new information into the genome, a process akin to a natural Maxwell's Demon, and identify the original source of any such information. We show that, as Darwin predicted, much of the information used to encode a complex trait was already in the genome as part of simpler evolved traits, and that many routes must be possible for a new complex trait to have a high probability of successfully evolving. In even more extreme examples of the evolution of complexity, we are now using these same principles to examine the evolutionary dynamics that drive major transitions in evolution, that is, transitions to higher levels of organization, which are some of the most complex evolutionary events to occur in nature. Finally, I will explore some of the implications of this research for other aspects of evolutionary biology, as well as ways that these evolutionary principles can be applied toward solving computational and engineering problems.

  15. Argonne's Magellan Cloud Computing Research Project

    ScienceCinema

    Beckman, Pete

    2017-12-11

    Pete Beckman, head of Argonne's Leadership Computing Facility (ALCF), discusses the Department of Energy's new $32-million Magellan project, which is designed to test how cloud computing can be used for scientific research. More information: http://www.anl.gov/Media_Center/News/2009/news091014a.html

  16. Argonne's Magellan Cloud Computing Research Project

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Beckman, Pete

    Pete Beckman, head of Argonne's Leadership Computing Facility (ALCF), discusses the Department of Energy's new $32-million Magellan project, which is designed to test how cloud computing can be used for scientific research. More information: http://www.anl.gov/Media_Center/News/2009/news091014a.html

  17. Final Report on Institutional Computing Project s15_hilaserion, “Kinetic Modeling of Next-Generation High-Energy, High-Intensity Laser-Ion Accelerators as an Enabling Capability”

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Albright, Brian James; Yin, Lin; Stark, David James

    This proposal sought on the order of 1M core-hours of Institutional Computing time intended to enable computing by a new LANL postdoc (David Stark) working under LDRD ER project 20160472ER (PI: Lin Yin) on laser-ion acceleration. The project was "off-cycle," initiating in June of 2016 with a postdoc hire.

  18. Flight Planning for the International Space Station-Levitation Observation of Dendrite Evolution in Steel Ternary Alloy Rapid Solidification

    NASA Technical Reports Server (NTRS)

    Flemings, M. C.; Matson, D. M.; Loser, W.; Hyers, R. W.; Rogers, J. R.; Curreri, Peter A. (Technical Monitor)

    2002-01-01

    The paper is an overview of the status and science of the LODESTARS research project. The program is aimed at understanding how melt convection influences phase selection and the evolution of rapid solidification microstructures.

  19. Evolution: A Programmed Text [And] Pronunciation Guide for Evolution. Publications No. 71-8 and 71-1.

    ERIC Educational Resources Information Center

    Thomas, Georgelle; Fishburne, Robert P.

    Part of the Anthropology Curriculum Project, the document contains a programmed text on evolution and a vocabulary pronunciation guide. The unit is intended for use by students in social studies and science courses in the 5th, 6th, and 7th grades. The bulk of the document, the programmed text, is organized in a question answer format. Students are…

  20. The dynamics and evolution of clusters of galaxies

    NASA Technical Reports Server (NTRS)

    Geller, Margaret; Huchra, John P.

    1987-01-01

    Research was undertaken to produce a coherent picture of the formation and evolution of large-scale structures in the universe. The program is divided into projects which examine four areas: the relationship between individual galaxies and their environment; the structure and evolution of individual rich clusters of galaxies; the nature of superclusters; and the large-scale distribution of individual galaxies. A brief review of results in each area is provided.

  1. Campus and Community Connections: The Evolving IUPUI Common Theme Project

    ERIC Educational Resources Information Center

    Hanna, Kathleen A.

    2013-01-01

    In 2009, IUPUI launched the Common Theme Project, designed to "promote campus unity, conversation, and collaboration on timely issues that connect IUPUI to central Indiana and the world." This paper briefly discusses the evolution of the Common Theme Project, from its roots as a freshman common reader to the current campus focus on…

  2. Boeing--A Case Study Example of Enterprise Project Management from a Learning Organization Perspective.

    ERIC Educational Resources Information Center

    Szymczak, Conrad C.; Walker, Derek H. T.

    2003-01-01

    The evolution of the Boeing Company illustrates how to achieve an enterprise project management culture through organizational learning. Project management can be a survival technique for adapting to change as well as a proactive mechanism. An organizational culture that supports commitment and enthusiasm and a knowledge management infrastructure…

  3. The Charlotte Action Research Project: A Model for Direct and Mutually Beneficial Community-University Engagement

    ERIC Educational Resources Information Center

    Morrell, Elizabeth; Sorensen, Janni; Howarth, Joe

    2015-01-01

    This article describes the evolution of the Charlotte Action Research Project (CHARP), a community-university partnership founded in 2008 at the University of North Carolina at Charlotte, and focuses particularly on the program's unique organizational structure. Research findings of a project evaluation suggest that the CHARP model's unique…

  4. The AECT HistoryMakers Project: Conversations with Leaders in Educational Technology

    ERIC Educational Resources Information Center

    Lockee, Barbara B.; Song, Kibong; Li, Wei

    2014-01-01

    The early beginnings and evolution of the field of educational technology (ET) have been documented by various scholars in the field. Recently, another form of historical documentation has been undertaken through a project of the Association for Educational Communications and Technology (AECT). The AECT HistoryMakers Project is a collaborative…

  5. Two case studies in river naturalization: planform migration and bank erosion control

    NASA Astrophysics Data System (ADS)

    Abad, J. D.; Guneralp, I.; Rhoads, B. L.; Garcia, M. H.

    2005-05-01

    A sound understanding of river planform evolution and bank erosion control, along with integration of expertise from several disciplines, is required for the development of predictive models for river naturalization. Over the last few years, several methodologies have been presented for naturalization projects, from purely heuristic to more advanced methods. Since the time and space scales of concern in naturalization vary widely, there is a need for appropriate tools at a variety of time and space scales. This study presents two case studies at different scales. The first case study describes the prediction of river planform evolution for a remeandering project based on a simplified two-dimensional hydrodynamic model. The second case study describes the applicability of a Computational Fluid Dynamics (CFD) model for evaluating the effectiveness of bank-erosion control structures in individual meander bends. Understanding the hydrodynamic influence of control structures on flow through bends allows accurate prediction of depositional and erosional distribution patterns, resulting in better assessment of river planform stability, especially for the case of natural complex systems. The first case study introduces a mathematical model for evolution of meandering rivers that can be used in remeandering projects. In the United States in particular, several rivers have been channelized in the past, causing environmental and ecological problems. Following Newton's third law, "for every action, there is a reaction", naturalization techniques evolve as natural reactive solutions to channelization. This model (herein referred to as RVR Meander) can be used as a stand-alone Windows application or as a module in a Geographic Information System. The model was applied to the Poplar Creek re-meandering project and used to evaluate re-meandering alternatives for an approximately 800-meter-long reach of Poplar Creek that was straightened in 1938. The second case study describes a streambank protection project using bendway weirs. In the State of Illinois, bendway weirs constructed of rock have been installed at hundreds of sites, especially on small streams, to control streambank erosion. Bendway weirs are low hard structures installed in the concave bank of a meander bend. Design criteria for these weirs are approximate and have not been rigorously evaluated for overall effectiveness at low, medium, and high flows. This initial step of the study attempted to describe the hydrodynamics around the weirs and the influence of the hydrodynamic patterns on sediment transport (near-field and far-field). To do that, a state-of-the-art three-dimensional CFD model was used to simulate flow through meander bends where 3D velocity measurements have been obtained to validate model predictions at low stages. Results indicate that the weirs produce highly complex patterns of flow around the weirs, which in some cases may actually increase erosional potential near the outer bank. These two case studies represent components of an emerging initiative to develop predictive tools for naturalization over a range of spatial and temporal scales.

  6. Advanced processing for high-bandwidth sensor systems

    NASA Astrophysics Data System (ADS)

    Szymanski, John J.; Blain, Phil C.; Bloch, Jeffrey J.; Brislawn, Christopher M.; Brumby, Steven P.; Cafferty, Maureen M.; Dunham, Mark E.; Frigo, Janette R.; Gokhale, Maya; Harvey, Neal R.; Kenyon, Garrett; Kim, Won-Ha; Layne, J.; Lavenier, Dominique D.; McCabe, Kevin P.; Mitchell, Melanie; Moore, Kurt R.; Perkins, Simon J.; Porter, Reid B.; Robinson, S.; Salazar, Alfonso; Theiler, James P.; Young, Aaron C.

    2000-11-01

    Compute performance and algorithm design are key problems of image processing and scientific computing in general. For example, imaging spectrometers are capable of producing data in hundreds of spectral bands with millions of pixels. These data sets show great promise for remote sensing applications, but require new and computationally intensive processing. The goal of the Deployable Adaptive Processing Systems (DAPS) project at Los Alamos National Laboratory is to develop advanced processing hardware and algorithms for high-bandwidth sensor applications. The project has produced electronics for processing multi- and hyper-spectral sensor data, as well as LIDAR data, while employing processing elements using a variety of technologies. The project team is currently working on reconfigurable computing technology and advanced feature extraction techniques, with an emphasis on their application to image and RF signal processing. This paper presents reconfigurable computing technology and advanced feature extraction algorithm work and their application to multi- and hyperspectral image processing. Related projects on genetic algorithms as applied to image processing will be introduced, as will the collaboration between the DAPS project and the DARPA Adaptive Computing Systems program. Further details are presented in other talks during this conference and in other conferences taking place during this symposium.

  7. CAVIAR: a serial ECG processing system for the comparative analysis of VCGs and their interpretation with auto-reference to the patient.

    PubMed

    Fayn, J; Rubel, P

    1988-01-01

    The authors present a new computer program for serial ECG analysis that allows a direct comparison of any couple of three-dimensional ECGs and quantitatively assesses the degree of evolution of the spatial loops as well as of their initial, central, or terminal sectors. Loops and sectors are superposed as well as possible, with the aim of overcoming tracing variability of nonpathological origin. As a result, optimal measures of evolution are computed and a tabular summary of measurements is dynamically configured with respect to the patient's history and is then printed. A multivariate classifier assigns each couple of tracings to one of four classes of evolution. Color graphic displays corresponding to several modes of representation may also be plotted.
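
    The abstract does not spell out how the loops are superposed; one standard way to align two 3-D loops "as well as possible" in the least-squares sense is the Kabsch/Procrustes rigid alignment sketched below, after which the residual serves as a crude measure of evolution. This is offered as an illustration, not as the procedure actually implemented in CAVIAR.

```python
# Least-squares rigid superposition of two 3-D loops (Kabsch algorithm);
# an illustration of the alignment step, not CAVIAR's specific procedure.
import numpy as np

def superpose(loop_a, loop_b):
    """Rigidly align loop_b onto loop_a (both (n, 3) arrays of XYZ samples)."""
    ca, cb = loop_a.mean(axis=0), loop_b.mean(axis=0)
    A, B = loop_a - ca, loop_b - cb
    U, _, Vt = np.linalg.svd(B.T @ A)
    d = np.sign(np.linalg.det(U @ Vt))              # avoid improper rotation
    R = U @ np.diag([1.0, 1.0, d]) @ Vt
    aligned = B @ R + ca
    rms_evolution = np.sqrt(((aligned - loop_a) ** 2).sum(axis=1).mean())
    return aligned, rms_evolution

# Example with synthetic loops sampled at the same instants of the cardiac cycle.
t = np.linspace(0, 2 * np.pi, 100)
loop1 = np.c_[np.cos(t), np.sin(t), 0.2 * np.sin(2 * t)]
c, s = np.cos(0.2), np.sin(0.2)
loop2 = loop1 @ np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]]) + 0.05
_, score = superpose(loop1, loop2)
print(f"residual after optimal superposition: {score:.4f}")
```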

  8. Students Upgrading through Computer and Career Education System Services (Project SUCCESS). Final Evaluation Report 1992-93. OER Report.

    ERIC Educational Resources Information Center

    New York City Board of Education, Brooklyn, NY. Office of Educational Research.

    Student Upgrading through Computer and Career Education System Services (Project SUCCESS) was an Elementary and Secondary Education Act Title VII-funded project in its third year of operation. Project SUCCESS served 460 students of limited English proficiency at two high schools in Brooklyn and one high school in Manhattan (New York City).…

  9. Snatching Defeat from the Jaws of Victory: When Good Projects Go Bad. Girls and Computer Science.

    ERIC Educational Resources Information Center

    Sanders, Jo

    In week-long sessions in the summers of 1997, 1998, and 1999, the 6APT (Summer Institute in Computer Science for Advanced Placement Teachers) project taught 240 high school teachers of Advanced Placement Computer Science (APCS) about gender equity in computers. Teachers were then followed through 2000. Results indicated that while teachers did…

  10. MAPPER: A personal computer map projection tool

    NASA Technical Reports Server (NTRS)

    Bailey, Steven A.

    1993-01-01

    MAPPER is a set of software tools designed to let users create and manipulate map projections on a personal computer (PC). The capability exists to generate five popular map projections. These include azimuthal, cylindrical, Mercator, Lambert, and sinusoidal projections. Data for projections are contained in five coordinate databases at various resolutions. MAPPER is managed by a system of pull-down windows. This interface allows the user to intuitively create, view and export maps to other platforms.
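
    MAPPER's own source code is not shown here, but the forward equations for two of the projections it supports are simple enough to state directly; the functions below assume a spherical Earth and are purely illustrative.

```python
# Forward equations for two of the projections MAPPER supports, written out as
# plain functions (spherical Earth, radius in arbitrary units); illustrative,
# not MAPPER's actual code.
import math

def sinusoidal(lat_deg, lon_deg, lon0_deg=0.0, R=1.0):
    lat, dlon = math.radians(lat_deg), math.radians(lon_deg - lon0_deg)
    return R * dlon * math.cos(lat), R * lat

def mercator(lat_deg, lon_deg, lon0_deg=0.0, R=1.0):
    lat, dlon = math.radians(lat_deg), math.radians(lon_deg - lon0_deg)
    return R * dlon, R * math.log(math.tan(math.pi / 4 + lat / 2))

for city, (lat, lon) in {"Greenbelt": (39.0, -76.9), "Quito": (-0.2, -78.5)}.items():
    print(city, sinusoidal(lat, lon), mercator(lat, lon))
```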

  11. Feature Selection in Classification of Eye Movements Using Electrooculography for Activity Recognition

    PubMed Central

    Mala, S.; Latha, K.

    2014-01-01

    Activity recognition is needed in a variety of applications, for example, reconnaissance systems, patient monitoring, and human-computer interfaces. Feature selection plays an important role in activity recognition, data mining, and machine learning. To select a subset of features, Differential Evolution (DE), a very efficient evolutionary optimizer, is used to find informative features from eye movements recorded using electrooculography (EOG). Many researchers use EOG signals in human-computer interaction together with various computational intelligence methods to analyze eye movements. The proposed system involves analysis of EOG signals using clearness-based features, minimum redundancy maximum relevance features, and Differential Evolution based features. This work concentrates on the DE-based feature selection algorithm in order to improve classification for faultless activity recognition. PMID:25574185
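
    The paper's exact DE variant, feature set, and classifier are not reproduced here; the sketch below shows the generic pattern of DE-based feature selection under illustrative assumptions: a real-valued vector is thresholded into a feature mask, and the objective is the resubstitution error of a simple nearest-centroid classifier on synthetic data, with a small penalty that favours compact subsets.

```python
# Generic pattern of DE-based feature selection (not the paper's exact setup).
import numpy as np
from scipy.optimize import differential_evolution

rng = np.random.default_rng(0)
n, d, d_informative = 200, 10, 3
y = rng.integers(0, 2, n)
X = rng.normal(size=(n, d))
X[:, :d_informative] += y[:, None] * 1.5          # only the first 3 features carry signal

def error_rate(weights):
    mask = weights > 0.5                           # threshold into a feature mask
    if not mask.any():
        return 1.0                                 # selecting nothing is worst case
    Xm = X[:, mask]
    c0, c1 = Xm[y == 0].mean(axis=0), Xm[y == 1].mean(axis=0)
    pred = (np.linalg.norm(Xm - c1, axis=1) < np.linalg.norm(Xm - c0, axis=1)).astype(int)
    # penalize larger subsets slightly so DE prefers compact feature sets
    return np.mean(pred != y) + 0.01 * mask.sum()

result = differential_evolution(error_rate, bounds=[(0, 1)] * d, seed=1, maxiter=50)
print("selected features:", np.flatnonzero(result.x > 0.5))
```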

  12. Feature selection in classification of eye movements using electrooculography for activity recognition.

    PubMed

    Mala, S; Latha, K

    2014-01-01

    Activity recognition is needed in a variety of applications, for example, reconnaissance systems, patient monitoring, and human-computer interfaces. Feature selection plays an important role in activity recognition, data mining, and machine learning. To select a subset of features, Differential Evolution (DE), a very efficient evolutionary optimizer, is used to find informative features from eye movements recorded using electrooculography (EOG). Many researchers use EOG signals in human-computer interaction together with various computational intelligence methods to analyze eye movements. The proposed system involves analysis of EOG signals using clearness-based features, minimum redundancy maximum relevance features, and Differential Evolution based features. This work concentrates on the DE-based feature selection algorithm in order to improve classification for faultless activity recognition.

  13. Commercial Drivers License Workplace Literacy Project. Computer Training.

    ERIC Educational Resources Information Center

    Minnesota Teamsters Service Bureau, Minneapolis.

    These course outlines and instructor's guides were developed for a workplace literacy project conducted cooperatively through the Minnesota Teamsters Service Bureau and Northeast Metro Technical College. They are part of the job-specific curriculum for commercial truck drivers developed during the project. The beginning computer course introduces…

  14. Alberta Education's Computer Technology Project.

    ERIC Educational Resources Information Center

    Thiessen, Jim

    This description of activities initiated through the Computer Technology Project of the provincial education ministry in Alberta, Canada, covers the 2-year period beginning with establishment of the project by the Alberta Department of Education in October 1981. Activities described include: (1) the establishment of the Office of Educational…

  15. Recent Evolution of the CDS Services - SIMBAD, VizieR and Aladin

    NASA Astrophysics Data System (ADS)

    Genova, F.; Allen, M. G.; Bienayme, O.; Boch, T.; Bonnarel, F.; Cambresy, L.; Derriere, S.; Dubois, P.; Fernique, P.; Lesteven, S.; Loup, C.; Ochsenbein, F.; Schaaff, A.; Vollmer, B.; Wenger, M.; Louys, M.; Jasniewicz, G.; Davoust, E.

    2005-12-01

    The Centre de Donnees astronomiques de Strasbourg (CDS) maintains several widely used databases and services. Among the significant recent evolutions:
    - A new version of SIMBAD (SIMBAD 4), based on the PostgreSQL database system, has been developed to replace the current version, which has been operational since 1990. It allows new query and sampling possibilities. For accessing SIMBAD from other applications, a full Web Service will be made available in addition to the client-server program that is presently used as a name resolver by many services.
    - VizieR, which gives access to major surveys, observation logs and tables published in journals, is continuously updated in collaboration with journals and ground- and space-based observatories. The diversity of information in VizieR makes it an excellent test-bed for the Virtual Observatory, in particular for the definition of astronomy semantics and of a query language, and for the implementation of registries.
    - A major update of Aladin (Aladin V3 Multiview) was released in April 2005. It integrates in particular a multiview display, image resampling, blinking, access to real pixel values (not only 8 bits), compatibility with common image formats such as GIF, JPEG and PNG, scaling functions for better pixel contrast, a 'Region of Interest Generator' which automatically builds small views around catalog objects, a cross-match function, the possibility to compute new catalog columns via algebraic expressions, extended script commands for batch-mode use, and access to additional data such as SDSS. Aladin is routinely used as a portal to the Virtual Observatory.
    Many of the new functions have been prototyped in the frame of the European Astrophysical Virtual Observatory project, and others are being tested for the VO-TECH project.

  16. Modelling of anisotropic growth in biological tissues. A new approach and computational aspects.

    PubMed

    Menzel, A

    2005-03-01

    In this contribution, we develop a theoretical and computational framework for anisotropic growth phenomena. As a key idea of the proposed phenomenological approach, a fibre or structural tensor is introduced, which allows the description of transversely isotropic material behaviour. Based on this additional argument, anisotropic growth is modelled via appropriate evolution equations for the fibre, while volumetric remodelling is realised by an evolution of the referential density. Both the fibre strength and the density follow Wolff-type laws. We elaborate, however, on two different approaches for the evolution of the fibre direction, namely alignment with respect to strain or with respect to stress. One of the main benefits of the developed framework is therefore the opportunity to address the evolution of the fibre strength and of the fibre direction separately. It is then straightforward to set up appropriate integration algorithms such that the developed framework fits nicely into common finite element schemes. Finally, several numerical examples underline the applicability of the proposed formulation.
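
    The sketch below illustrates, in a deliberately simplified form, how such evolution equations can be integrated in time: a hypothetical alignment law rotates the fibre direction towards the dominant principal strain direction, and a saturating law drives the referential density towards a stimulus-dependent target. The specific laws, rate constants, and loading are invented for illustration and are not the paper's formulation.

    ```python
    import numpy as np

    def principal_strain_direction(eps):
        """Unit eigenvector of the (symmetric) strain tensor with the largest eigenvalue."""
        w, v = np.linalg.eigh(eps)
        return v[:, np.argmax(w)]

    def update_fibre(a, eps, dt, k_align=1.0):
        """Explicit step of a hypothetical alignment law  da/dt = k (I - a⊗a) n,
        which rotates the fibre a towards the principal strain direction n."""
        n = principal_strain_direction(eps)
        if np.dot(n, a) < 0:                # resolve the ±n ambiguity
            n = -n
        a_new = a + dt * k_align * (n - np.dot(a, n) * a)
        return a_new / np.linalg.norm(a_new)

    def update_density(rho, stimulus, dt, rho_star=1.0, k_rho=0.5):
        """Hypothetical Wolff-type law: density relaxes towards a stimulus-driven target."""
        return rho + dt * k_rho * (stimulus * rho_star - rho)

    # toy loading: uniaxial strain along x, fibre initially along y
    eps = np.diag([0.1, -0.03, -0.03])
    a, rho = np.array([0.0, 1.0, 0.0]), 1.0
    for _ in range(200):
        a = update_fibre(a, eps, dt=0.05)
        rho = update_density(rho, stimulus=1.2, dt=0.05)
    print(a, rho)                           # fibre ends up (nearly) aligned with x
    ```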

  17. Modelling evolution of asteroid's rotation due to the YORP effect

    NASA Astrophysics Data System (ADS)

    Golubov, Oleksiy; Lipatova, Veronika; Scheeres, Daniel J.

    2016-05-01

    The Yarkovsky--O'Keefe--Radzievskii--Paddack (or YORP) effect is the influence of light pressure on the rotation of asteroids. It is the most important factor in the evolution of the rotation state of small asteroids and can drastically alter their rotation rate and obliquity over cosmological timescales. In the poster we present our program, which calculates the evolution of the rotation state of small asteroids subject to the YORP effect. The program accounts for both the axial and obliquity components of YORP, the thermal inertia of the asteroid's soil, and the tangential YORP. The axial component of YORP is computed using the model by Steinberg and Sari (AJ, 141, 55). The thermal inertia is accounted for in the framework of Golubov et al. 2016 (MNRAS, stw540). Computation of the tangential YORP is based on a simple analytical model, whose applicability is verified via comparison to exact numerical simulations. We apply the program to different shape models of asteroids and study the coupled evolution of their rotation rate and obliquity.
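
    A minimal sketch of this kind of coupled spin-state integration is given below. The torque components are arbitrary placeholder functions of obliquity, not the models cited in the abstract, and all units are purely illustrative.

    ```python
    import numpy as np
    from scipy.integrate import solve_ivp

    # Illustrative YORP torque components (placeholders, not the authors' model):
    # the axial component spins the body up or down, the obliquity component
    # pushes the spin axis; in this toy model both depend on obliquity only.
    def axial_torque(eps):
        return 1e-3 * np.cos(2 * eps)            # arbitrary units

    def obliquity_torque(eps):
        return 5e-4 * np.sin(2 * eps)

    def rhs(t, y):
        omega, eps = y
        domega = axial_torque(eps)                         # d(omega)/dt
        deps = obliquity_torque(eps) / max(omega, 1e-6)    # d(eps)/dt ~ torque / omega
        return [domega, deps]

    # start at spin rate 1 (arbitrary units) and 30 degrees obliquity
    sol = solve_ivp(rhs, (0.0, 5e3), [1.0, np.radians(30.0)])
    omega_end, eps_end = sol.y[:, -1]
    print(f"final spin rate {omega_end:.3f}, obliquity {np.degrees(eps_end):.1f} deg")
    ```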

  18. Core Collapse: The Race Between Stellar Evolution and Binary Heating

    NASA Astrophysics Data System (ADS)

    Converse, Joseph M.; Chandar, R.

    2012-01-01

    The dynamical formation of binary stars can dramatically affect the evolution of their host star clusters. In relatively small clusters (M < 6000 Msun) the most massive stars rapidly form binaries, heating the cluster and preventing any significant contraction of the core. The situation in much larger globular clusters (M ~ 10^5 Msun) is quite different, with many showing collapsed cores, implying that binary formation did not affect them as severely as lower mass clusters. More massive clusters, however, should take longer to form their binaries, allowing stellar evolution more time to prevent the heating by causing the larger stars to die off. Here, we simulate the evolution of clusters with masses between those of open and globular clusters in order to find at what size a star cluster is able to experience true core collapse. Our simulations make use of a new GPU-based computing cluster recently purchased at the University of Toledo. We also present some benchmarks of this new computational resource.

  19. Computer Technology Resources for Literacy Projects.

    ERIC Educational Resources Information Center

    Florida State Council on Aging, Tallahassee.

    This resource booklet was prepared to assist literacy projects and community adult education programs in determining the technology they need to serve more older persons. Section 1 contains the following reprinted articles: "The Human Touch in the Computer Age: Seniors Learn Computer Skills from Schoolkids" (Suzanne Kashuba);…

  20. Scientific Discovery through Advanced Computing (SciDAC-3) Partnership Project Annual Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hoffman, Forest M.; Bochev, Pavel B.; Cameron-Smith, Philip J..

    The Applying Computationally Efficient Schemes for BioGeochemical Cycles (ACES4BGC) project is advancing the predictive capabilities of Earth System Models (ESMs) by reducing two of the largest sources of uncertainty, aerosols and biospheric feedbacks, with a highly efficient computational approach. In particular, this project is implementing and optimizing new computationally efficient tracer advection algorithms for large numbers of tracer species; adding important biogeochemical interactions between the atmosphere, land, and ocean models; and applying uncertainty quantification (UQ) techniques to constrain process parameters and evaluate uncertainties in feedbacks between biogeochemical cycles and the climate system.

  1. New European Training Network to Improve Young Scientists' Capabilities in Computational Wave Propagation

    NASA Astrophysics Data System (ADS)

    Igel, Heiner

    2004-07-01

    The European Commission recently funded a Marie-Curie Research Training Network (MCRTN) in the field of computational seismology within the 6th Framework Program. SPICE (Seismic wave Propagation and Imaging in Complex media: a European network) is coordinated by the computational seismology group of the Ludwig-Maximilians-Universität in Munich linking 14 European research institutions in total. The 4-year project will provide funding for 14 Ph.D. students (3-year projects) and 14 postdoctoral positions (2-year projects) within the various fields of computational seismology. These positions have been advertised and are currently being filled.

  2. The Evolution of DEOMI

    DTIC Science & Technology

    2017-09-15

    technology opens the world to information in the computer database to all learners without the use of a human teacher other than the controller or manager… The Evolution of DEOMI, Defense Equal Opportunity Management Institute, Research Directorate. Written by William Gary McGuire, PhD

  3. A program to compute the two-step excitation of mesospheric sodium atoms for the Polychromatic Laser Guide Star Project

    NASA Astrophysics Data System (ADS)

    Bellanger, Véronique; Courcelle, Arnaud; Petit, Alain

    2004-09-01

    A program to compute the two-step excitation of sodium atoms (3S→3P→4D) using the density-matrix formalism is presented. The BEACON program calculates population evolution and the number of photons emitted by fluorescence from the 3P, 4D, 4P and 4S levels.
    Program summary
    Title of program: BEACON
    Catalogue identifier: ADSX
    Program Summary URL: http://cpc.cs.qub.ac.uk/cpc/summaries/ADSX
    Program obtainable from: CPC Program Library, Queen's University of Belfast, N. Ireland
    Licensing provisions: none
    Operating systems under which the program has been tested: Win; Unix
    Programming language used: FORTRAN 77
    Memory required to execute with typical data: 1 Mw
    Number of bits in a word: 32
    Number of processors used: 1 (a parallel version of this code is also available and can be obtained on request)
    Number of lines in distributed program, including test data, etc.: 29 287
    Number of bytes in distributed program, including test data, etc.: 830 331
    Distribution format: tar.gz
    CPC Program Library subprograms used: none
    Nature of physical problem: Resolution of the Bloch equations in the case of the two-step laser excitation of sodium atoms.
    Method of solution: The program BEACON calculates the evolution of level populations versus time using the density-matrix formalism. The number of photons emitted from the 3P, 4D and 4P levels is calculated using the branching ratios and the level lifetimes.
    Restriction on the complexity of the problem: Since the backscatter emission is calculated after the excitation process, excitation with laser pulse durations longer than the 4D level lifetime cannot be rigorously treated. In particular, cw laser excitation cannot be calculated with this code.
    Typical running time: 12 h
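
    The sketch below shows the density-matrix (optical Bloch) approach for a single resonantly driven two-level transition with spontaneous decay, written in the standard (u, v, w) Bloch-vector form; the excited-state population and a rough fluorescence photon yield are then read off. It is only a schematic stand-in: BEACON treats the full two-step 3S→3P→4D ladder with realistic branching ratios and lifetimes.

    ```python
    import numpy as np
    from scipy.integrate import solve_ivp

    GAMMA = 2 * np.pi * 9.8e6      # approx. Na 3P decay rate [1/s]
    OMEGA = 2 * np.pi * 20e6       # assumed Rabi frequency of the driving laser [1/s]

    def bloch(t, y):
        """Optical Bloch equations on resonance in (u, v, w) Bloch-vector form."""
        u, v, w = y
        du = -(GAMMA / 2) * u
        dv = OMEGA * w - (GAMMA / 2) * v
        dw = -OMEGA * v - GAMMA * (w + 1.0)
        return [du, dv, dw]

    # start in the ground state: w = rho_ee - rho_gg = -1
    sol = solve_ivp(bloch, (0.0, 1e-6), [0.0, 0.0, -1.0], max_step=1e-9)
    rho_ee = (sol.y[2] + 1.0) / 2.0                        # excited-state population
    photons = float(np.sum(GAMMA * rho_ee[:-1] * np.diff(sol.t)))  # fluorescence yield
    print(f"final rho_ee = {rho_ee[-1]:.3f}, photons per atom ~ {photons:.1f}")
    ```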

  4. Predictive characterization of aging and degradation of reactor materials in extreme environments. Final report, December 20, 2013 - September 20, 2017

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Qu, Jianmin

    Understanding of reactor material behavior in extreme environments is vital not only to the development of new materials for the next generation of nuclear reactors, but also to the extension of the operating lifetimes of the current fleet of nuclear reactors. To this end, this project developed a suite of unique experimental techniques, augmented by a mesoscale computational framework, to understand and predict the long-term effects of irradiation, temperature, and stress on material microstructures and their macroscopic behavior. The experimental techniques and computational tools were demonstrated on two distinctive types of reactor materials, namely, Zr alloys and high-Cr martensitic steels. These materials were chosen as the test beds because they are the archetypes of high-performance reactor materials (cladding, wrappers, ducts, pressure vessel, piping, etc.). To fill the knowledge gaps, and to meet the technology needs, a suite of innovative in situ transmission electron microscopy (TEM) characterization techniques (heating, heavy ion irradiation, He implantation, quantitative small-scale mechanical testing, and various combinations thereof) were developed and used to elucidate and map the fundamental mechanisms of microstructure evolution in both Zr and Cr alloys for a wide range of environmental boundary conditions in the thermal-mechanical-irradiation input space. Knowledge gained from the experimental observations of the active mechanisms and the role of local microstructural defects in the response of the material has been incorporated into a mathematically rigorous and comprehensive three-dimensional mesoscale framework capable of accounting for the compositional variation, microstructural evolution and localized deformation (radiation damage) to predict aging and degradation of key reactor materials operating in extreme environments. Predictions from this mesoscale framework were compared with the in situ TEM observations to validate the model.

  5. Dealloying, Microstructure and the Corrosion/Protection of Cast Magnesium Alloys

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sieradzki, Karl; Aiello, Ashlee; McCue, Ian

    The purpose of this project was to develop a greater understanding of micro-galvanic corrosion effects in cast magnesium alloys using both experimental and computational methods. Experimental accomplishments have been made in the following areas of interest: characterization, aqueous free-corrosion, atmospheric corrosion, ionic liquid dissolution, rate kinetics of oxide dissolution, and coating investigation. Commercial alloys (AZ91D, AM60, and AZ31B), binary-phase alloys (αMg-2at.%Al, αMg-5at.%Al, and Mg-8at.%Al), and component phases (Mg, Al, β-Mg, β-1%Zn, MnAl3) were obtained and characterized using energy dispersive spectroscopy (EDS), X-ray diffraction (XRD), and scanning electron microscopy (SEM). Full immersion in aqueous chloride was used to characterize the corrosion behavior of alloys. Rotating disc electrodes (RDEs) were used to observe accelerated long-term corrosion behavior. Al surface redistribution for freely corroded samples was analyzed using SEM, EDS, and lithium underpotential deposition (Li UPD). Atmospheric corrosion was observed using contact angle evolution, overnight pH monitoring, and surface pH evolution studies. Ionic liquid corrosion characterization was performed using linear sweep voltammetry and potentiostatic dissolution in 150° choline chloride-urea (cc-urea). Two surface coatings were investigated: (1) Li-carbonate and (2) cc-urea. Li-carbonate coatings were characterized using X-ray photoelectron spectroscopy (XPS), SEM, and aqueous free corrosion potential monitoring. Hydrophobic cc-urea coatings were characterized using contact angle measurements and electrochemical impedance spectroscopy. Oxide dissolution rate kinetics were studied using inductively coupled plasma mass spectroscopy (ICP-MS). Computational accomplishments have been made through the development of Kinetic Monte Carlo (KMC) simulations which model time- and composition-dependent effects on the microstructure due to spatial redistribution of alloying elements during corrosion.
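
    As a generic illustration of the kinetic Monte Carlo technique mentioned above (not the project's model), the sketch below performs rejection-free (Gillespie-type) dissolution steps on a toy two-component lattice with invented, composition-dependent rates.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    # Toy alloy lattice: 0 = dissolved, 1 = Mg, 2 = Al (random dilute-Al composition)
    L = 30
    lattice = rng.choice([1, 2], size=(L, L), p=[0.95, 0.05])

    RATE = {1: 1.0, 2: 0.01}    # hypothetical dissolution rates: Mg >> Al

    def surface_sites(lat):
        """Occupied sites exposed to the environment (top row or next to a dissolved site)."""
        sites = []
        for i in range(L):
            for j in range(L):
                if lat[i, j] == 0:
                    continue
                if i == 0 or 0 in (lat[max(i - 1, 0), j], lat[min(i + 1, L - 1), j],
                                   lat[i, max(j - 1, 0)], lat[i, min(j + 1, L - 1)]):
                    sites.append((i, j))
        return sites

    t = 0.0
    for _ in range(500):                       # rejection-free (Gillespie) KMC steps
        sites = surface_sites(lattice)
        rates = np.array([RATE[lattice[s]] for s in sites])
        total = rates.sum()
        pick = sites[rng.choice(len(sites), p=rates / total)]
        lattice[pick] = 0                      # dissolve the chosen atom
        t += rng.exponential(1.0 / total)      # exponential waiting time between events

    print(f"simulated time {t:.2f}, atoms dissolved {np.count_nonzero(lattice == 0)}")
    ```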

  6. Dynamic sensitivity analysis of long running landslide models through basis set expansion and meta-modelling

    NASA Astrophysics Data System (ADS)

    Rohmer, Jeremy

    2016-04-01

    Predicting the temporal evolution of landslides is typically supported by numerical modelling. Dynamic sensitivity analysis aims at assessing the influence of the landslide properties on the time-dependent predictions (e.g., time series of landslide displacements). Yet two major difficulties arise: 1. global sensitivity analysis requires running the landslide model a large number of times (> 1000), which may become impracticable when the landslide model has a high computational cost (> several hours); 2. landslide model outputs are not scalar but functions of time, i.e. they are n-dimensional vectors with n usually ranging from 100 to 1000. In this article, I explore the use of a basis set expansion, such as principal component analysis, to reduce the output dimensionality to a few components, each of them being interpreted as a dominant mode of variation in the overall structure of the temporal evolution. The computationally intensive calculation of the Sobol' indices for each of these components is then achieved through meta-modelling, i.e. by replacing the landslide model by a "costless-to-evaluate" approximation (e.g., a projection pursuit regression model). The methodology combining "basis set expansion - meta-model - Sobol' indices" is then applied to the La Frasse landslide to investigate the dynamic sensitivity of the surface horizontal displacements to the slip surface properties during pore pressure changes. I show how to extract information on the sensitivity of each main mode of temporal behaviour using a limited number (a few tens) of long-running simulations. In particular, I identify the parameters which trigger the occurrence of a turning point marking a shift between a regime of low landslide displacements and one of high displacements.
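
    The sketch below illustrates the first two ingredients of this workflow on a cheap analytic stand-in for the landslide model: the time-series outputs are reduced to two dominant modes by a PCA-style basis set expansion, and first-order Sobol' indices of each mode score are then estimated with a pick-freeze (Saltelli-type) estimator rather than a fitted meta-model. Model, inputs, and sample sizes are invented for illustration.

    ```python
    import numpy as np

    rng = np.random.default_rng(2)
    t = np.linspace(0.0, 10.0, 200)              # "time steps" of the model output

    def model(x):
        """Toy time-series model standing in for a long-running landslide simulation:
        x[:, 0] scales a seasonal mode, x[:, 1] a drift mode, x[:, 2] a small offset."""
        return x[:, [0]] * np.sin(t) + x[:, [1]] * (t / 10.0) + 0.1 * x[:, [2]]

    d, N = 3, 2000
    A, B = rng.random((N, d)), rng.random((N, d))     # two independent input samples

    # Basis set expansion: PCA of the output time series via SVD of the centred runs.
    YA = model(A)
    mean_y = YA.mean(axis=0)
    U, s, Vt = np.linalg.svd(YA - mean_y, full_matrices=False)
    modes = Vt[:2]                                     # two dominant temporal modes

    def scores(X):
        """Project model runs onto the dominant modes (component scores)."""
        return (model(X) - mean_y) @ modes.T

    fA, fB = scores(A), scores(B)
    var = np.var(np.vstack([fA, fB]), axis=0)
    for i in range(d):                                 # first-order Sobol' indices
        ABi = A.copy()
        ABi[:, i] = B[:, i]
        fABi = scores(ABi)
        Si = (fB * (fABi - fA)).mean(axis=0) / var     # Saltelli (2010) estimator
        print(f"x{i + 1}: first-order index per mode = {np.round(Si, 2)}")
    ```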

  7. Orion Exploration Flight Test-1 Contingency Drogue Deploy Velocity Trigger

    NASA Technical Reports Server (NTRS)

    Gay, Robert S.; Stochowiak, Susan; Smith, Kelly

    2013-01-01

    As a backup to the GPS-aided Kalman filter and the barometric altimeter, an "adjusted" velocity trigger is used during entry to trigger the chain of events that leads to drogue chute deploy for the Orion Multi-Purpose Crew Vehicle (MPCV) Exploration Flight Test-1 (EFT-1). Even though this scenario is multiple failures deep, the Orion Guidance, Navigation, and Control (GN&C) software makes use of a clever technique that was taken from the Mars Science Laboratory (MSL) program, which recently successfully landed the Curiosity rover on Mars. MSL used this technique to jettison the heat shield at the proper time during descent. Originally, Orion used the un-adjusted navigated velocity, but the removal of the Star Tracker to save costs for EFT-1 increased attitude errors, which increased inertial propagation errors to the point where the un-adjusted velocity caused altitude dispersions at drogue deploy to be too large. Thus, to reduce dispersions, the velocity vector is projected onto a "reference" vector that represents the nominal "truth" vector at the desired point in the trajectory. Because the navigation errors are largely perpendicular to the truth vector, this projection significantly reduces dispersions in the velocity magnitude. This paper will detail the evolution of this trigger method for the Orion project and cover the various methods tested to determine the reference "truth" vector, and at what point in the trajectory it should be computed.
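
    The core of the adjustment is a simple vector projection, sketched below with made-up numbers: projecting the navigated velocity onto the reference "truth" direction removes error components perpendicular to that direction, which is the dominant error source described above.

    ```python
    import numpy as np

    def adjusted_speed(v_nav, v_ref):
        """Project the navigated velocity onto the reference ('truth') direction;
        navigation-error components perpendicular to v_ref drop out."""
        u = v_ref / np.linalg.norm(v_ref)
        return float(np.dot(v_nav, u))

    # Made-up numbers for illustration only. The error is chosen perpendicular to
    # the reference direction, the case this projection is designed to suppress.
    v_ref = np.array([-120.0, 30.0, -85.0])        # reference "truth" velocity [m/s]
    v_nav = v_ref + np.array([6.0, 24.0, 0.0])     # navigated velocity with nav error

    true_speed = np.linalg.norm(v_ref)
    print("un-adjusted speed error:", np.linalg.norm(v_nav) - true_speed)   # ~ +2 m/s
    print("adjusted speed error:   ", adjusted_speed(v_nav, v_ref) - true_speed)  # ~ 0
    ```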

  8. MDCT distinguishing features of focal aortic projections (FAP) in acute clinical settings.

    PubMed

    Valente, Tullio; Rossi, Giovanni; Lassandro, Francesco; Rea, Gaetano; Marino, Maurizio; Urciuolo, Salvatore; Tortora, Giovanni; Muto, Maurizio

    2015-01-01

    Focal aortic projections (FAP) are protrusion images of the contrast medium (focal contour irregularity, breaks in the intimal contour, outward lumen bulging, or a localized blood-filled outpouching) projecting beyond the aortic lumen into the aortic wall and are commonly seen on multidetector computed tomography (MDCT) scans of the chest and abdomen. FAP encompass several common and uncommon etiologies and can be demonstrated both in the native aorta, mainly in acute aortic syndromes, and in the post-surgical aorta or after endovascular therapy. They are also found in some types of post-traumatic injuries and in impending rupture of aneurysms. The expanding, routine use of millimetric or submillimetric collimation on current state-of-the-art MDCT scanners (16 rows and higher) now allows the identification and characterization of these small ulcer-like lesions or irregularities throughout the entire aorta, as either an incidental or an expected finding, and provides detailed three-dimensional pictures of these pathologic findings. In this pictorial review, we illustrate the possible significance of FAP and the discriminating MDCT features that help to distinguish among different types of aortic protrusions and their possible evolution. Awareness of some related and distinctive radiologic features of FAP may improve our understanding of aortic diseases, provide further insight into the pathophysiology and natural history, and guide the appropriate management of these lesions.

  9. Publications of the exobiology program for 1989: A special bibliography

    NASA Technical Reports Server (NTRS)

    1991-01-01

    A listing of 1989 publications resulting from research supported by the Exobiology Program is presented. Research supported by the Exobiology Program is explored in the following areas: (1) cosmic evolution of biogenic compounds; (2) prebiotic evolution; (3) early evolution of life; and (4) evolution of advanced life. Pre-mission and pre-project activities supporting these areas fall under solar system exploration and the search for extraterrestrial intelligence. The planetary protection subject area is included here because of its direct relevance to the Exobiology Program.

  10. Historical evolution of disease mapping in general and specifically of cancer mapping.

    PubMed

    Howe, G M

    1989-01-01

    The presentation of areal data in epidemiology is illustrated by such mapping techniques as dots (spots), shading (choropleth, thematic) and isolines (isopleths). Examples are also given of computer-assisted cartography (computer graphics) which employs hardware and software components of digital computers, together with the use of geographical and demographic base maps.

  11. Evolution of an Intelligent Deductive Logic Tutor Using Data-Driven Elements

    ERIC Educational Resources Information Center

    Mostafavi, Behrooz; Barnes, Tiffany

    2017-01-01

    Deductive logic is essential to a complete understanding of computer science concepts, and is thus fundamental to computer science education. Intelligent tutoring systems with individualized instruction have been shown to increase learning gains. We seek to improve the way deductive logic is taught in computer science by developing an intelligent,…

  12. Understanding Students' Perceptions and Challenges of Computer-Based Assessments: A Case of UKZN

    ERIC Educational Resources Information Center

    Faniran, Victor Temitayo; Ajayi, Nurudeen A.

    2018-01-01

    Assessments are important to academic institutions because they help in evaluating students' knowledge. The conduct of assessments nowadays has been influenced by the continuous evolution of information technology. Hence, academic institutions now use computers for assessments, often known as Computer-Based Assessments (CBAs), in tandem with…

  13. Values and Objectives in Computing Education Research

    ERIC Educational Resources Information Center

    Pears, Arnold; Malmi, Lauri

    2009-01-01

    What is Computing Education Research (CER), why are we doing this type of research, and what should the community achieve? As associate editors of this special edition we provide our perspectives and discuss how they have influenced the evolution of the Koli Calling International Conference on Computing Education Research over the last nine years.…

  14. Impact of Personal Computing on Education.

    ERIC Educational Resources Information Center

    McIsaac, Donald N.

    1979-01-01

    Describes microcomputers, outlines lessons learned from the evolution of other technologies as they apply to the development of the microcomputer, discusses computer literacy as a problem-solving tool, and speculates about microcomputer use in instruction and administration. (IRT)

  15. WE-AB-303-09: Rapid Projection Computations for On-Board Digital Tomosynthesis in Radiation Therapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Iliopoulos, AS; Sun, X; Pitsianis, N

    2015-06-15

    Purpose: To facilitate fast and accurate iterative volumetric image reconstruction from limited-angle on-board projections. Methods: Intrafraction motion hinders the clinical applicability of modern radiotherapy techniques, such as lung stereotactic body radiation therapy (SBRT). The LIVE system may impact clinical practice by recovering volumetric information via Digital Tomosynthesis (DTS), thus entailing low time and radiation dose for image acquisition during treatment. The DTS is estimated as a deformation of prior CT via iterative registration with on-board images; this shifts the challenge to the computational domain, owing largely to repeated projection computations across iterations. We address this issue by composing efficient digital projection operators from their constituent parts. This allows us to separate the static (projection geometry) and dynamic (volume/image data) parts of projection operations by means of pre-computations, enabling fast on-board processing, while also relaxing constraints on underlying numerical models (e.g. regridding interpolation kernels). Further decoupling the projectors into simpler ones ensures the incurred memory overhead remains low, within the capacity of a single GPU. These operators depend only on the treatment plan and may be reused across iterations and patients. The dynamic processing load is kept to a minimum and maps well to the GPU computational model. Results: We have integrated efficient, pre-computable modules for volumetric ray-casting and FDK-based back-projection with the LIVE processing pipeline. Our results show a 60x acceleration of the DTS computations, compared to the previous version, using a single GPU; presently, reconstruction is attained within a couple of minutes. The present implementation allows for significant flexibility in terms of the numerical and operational projection model; we are investigating the benefit of further optimizations and accurate digital projection sub-kernels. Conclusion: Composable projection operators constitute a versatile research tool which can greatly accelerate iterative registration algorithms and may be conducive to the clinical applicability of LIVE. National Institutes of Health Grant No. R01-CA184173; GPU donation by NVIDIA Corporation.
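
    A toy version of the static/dynamic split described above is sketched below: a sparse projection matrix is precomputed once from a limited-angle parallel-beam geometry, and forward and back projections of any volume then reduce to sparse matrix-vector products. This is a generic 2D illustration, not the LIVE/GPU implementation or its ray models.

    ```python
    import numpy as np
    from scipy.sparse import lil_matrix

    # Toy 2D parallel-beam geometry: precompute a sparse projection operator once
    # (the "static" part), then reuse it for many volumes (the "dynamic" part).
    N = 64                                        # image is N x N, detector has N bins
    angles = np.deg2rad(np.arange(-20, 21, 5))    # limited-angle sweep, DTS-like

    def build_projector(n, thetas):
        """Nearest-bin ray-driven system matrix A with sinogram = A @ image.ravel()."""
        A = lil_matrix((len(thetas) * n, n * n))
        xs = np.arange(n) - n / 2 + 0.5
        for k, th in enumerate(thetas):
            c, s = np.cos(th), np.sin(th)
            for i in range(n):
                for j in range(n):
                    bin_idx = int(round(xs[j] * c - xs[i] * s + n / 2 - 0.5))
                    if 0 <= bin_idx < n:
                        A[k * n + bin_idx, i * n + j] += 1.0
        return A.tocsr()

    A = build_projector(N, angles)                # precomputed once per treatment plan

    phantom = np.zeros((N, N))
    phantom[20:44, 28:36] = 1.0
    sino = A @ phantom.ravel()                    # fast forward projection
    backproj = (A.T @ sino).reshape(N, N)         # fast (unfiltered) back projection
    print(sino.shape, backproj.max())
    ```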

  16. Army Corps of Engineers: Water Resource Authorizations, Appropriations, and Activities

    DTIC Science & Technology

    2014-07-01

    …projects (e.g., municipal water and wastewater treatment systems) and other nontraditional activities. The Appendix provides more on the evolution of…feasibility by the Secretary of the Army.

  17. BIOINSPIRED DESIGN AND DIRECTED EVOLUTION OF IRON CONTAINING ENZYMES FOR GREEN SYNTHETIC PROCESSES AND BIOREMEDIATION

    EPA Science Inventory

    SU833912
    Title: Bioinspired Design and Directed Evolution of Iron Containing Enzymes for Green Synthetic Processes and Bioremediation. Edward I. Solomon, Shaun D. Wong, Lei Liu, Caleb B. Bell, III; Cynthia Nolt-Helms
    Project Period: August 15, 2008 - August 14,...

  18. Generative Representations for Computer-Automated Evolutionary Design

    NASA Technical Reports Server (NTRS)

    Hornby, Gregory S.

    2006-01-01

    With the increasing computational power of computers, software design systems are progressing from being tools for architects and designers to express their ideas to tools capable of creating designs under human guidance. One of the main limitations for these computer-automated design systems is the representation with which they encode designs. If the representation cannot encode a certain design, then the design system cannot produce it. To be able to produce new types of designs, and not just optimize pre-defined parameterizations, evolutionary design systems must use generative representations. Generative representations are assembly procedures, or algorithms, for constructing a design thereby allowing for truly novel design solutions to be encoded. In addition, by enabling modularity, regularity and hierarchy, the level of sophistication that can be evolved is increased. We demonstrate the advantages of generative representations on two different design domains: the evolution of spacecraft antennas and the evolution of 3D objects.
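
    A minimal, hypothetical example of a generative representation is sketched below: the genome is an assembly procedure of turtle-style commands with a repetition operator, so a very short genome develops into a regular multi-segment structure. This illustrates the concept only and is not the encoding used in the cited work.

    ```python
    # A toy generative representation: the genome is an assembly procedure made of
    # turtle-style commands, with ("REP", n, sub) providing reuse and regularity.
    import math

    def develop(genome, pos=(0.0, 0.0), heading=0.0):
        """Execute the assembly procedure and return the list of placed segments."""
        segments = []
        for op in genome:
            if op[0] == "REP":                      # ("REP", n, sub_procedure)
                for _ in range(op[1]):
                    segs, pos, heading = develop(op[2], pos, heading)
                    segments.extend(segs)
            elif op[0] == "FWD":                    # ("FWD", length): place a segment
                x, y = pos
                nx = x + op[1] * math.cos(heading)
                ny = y + op[1] * math.sin(heading)
                segments.append(((x, y), (nx, ny)))
                pos = (nx, ny)
            elif op[0] == "TURN":                   # ("TURN", degrees)
                heading += math.radians(op[1])
        return segments, pos, heading

    # an octagonal arm: repeat (forward, turn 45 degrees) eight times via one REP gene
    genome = [("REP", 8, [("FWD", 1.0), ("TURN", 45.0)])]
    segments, _, _ = develop(genome)
    print(f"{len(segments)} segments developed from {len(genome)} top-level gene(s)")
    ```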

  19. The computational challenges of Earth-system science.

    PubMed

    O'Neill, Alan; Steenman-Clark, Lois

    2002-06-15

    The Earth system--comprising atmosphere, ocean, land, cryosphere and biosphere--is an immensely complex system, involving processes and interactions on a wide range of space- and time-scales. To understand and predict the evolution of the Earth system is one of the greatest challenges of modern science, with success likely to bring enormous societal benefits. High-performance computing, along with the wealth of new observational data, is revolutionizing our ability to simulate the Earth system with computer models that link the different components of the system together. There are, however, considerable scientific and technical challenges to be overcome. This paper will consider four of them: complexity, spatial resolution, inherent uncertainty and time-scales. Meeting these challenges requires a significant increase in the power of high-performance computers. The benefits of being able to make reliable predictions about the evolution of the Earth system should, on their own, amply repay this investment.

  20. Stochastic optimization of GeantV code by use of genetic algorithms

    DOE PAGES

    Amadio, G.; Apostolakis, J.; Bandieramonte, M.; ...

    2017-10-01

    GeantV is a complex system based on the interaction of different modules needed for detector simulation, which include transport of particles in fields, physics models simulating their interactions with matter, and a geometrical modeler library for describing the detector, locating the particles, and computing the path length to the current volume boundary. The GeantV project is recasting the classical simulation approach to get maximum benefit from SIMD/MIMD computational architectures and highly massive parallel systems. This involves finding the appropriate balance between several aspects influencing computational performance (floating-point performance, usage of off-chip memory bandwidth, specification of cache hierarchy, etc.) and handling a large number of program parameters that have to be optimized to achieve the best simulation throughput. This optimization task can be treated as a black-box optimization problem, which requires searching for the optimum set of parameters using only point-wise function evaluations. The goal of this study is to provide a mechanism for optimizing complex systems (high energy physics particle transport simulations) with the help of genetic algorithms and evolution strategies as tuning procedures for massive parallel simulations. One of the described approaches is based on introducing a specific multivariate analysis operator that could be used in the case of resource-expensive or time-consuming evaluations of fitness functions, in order to speed up the convergence of the black-box optimization problem.
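
    As a schematic of such black-box tuning, the sketch below runs a small (mu + lambda) evolution strategy over three invented parameters, with an analytic toy function standing in for the expensive throughput measurement; in a real setting the fitness would launch the simulation and measure events per second.

    ```python
    import numpy as np

    rng = np.random.default_rng(3)

    # Hypothetical tunables (names invented): [basket_size, n_threads, cache_blocks]
    LO = np.array([16.0, 1.0, 1.0])
    HI = np.array([1024.0, 64.0, 256.0])

    def throughput(p):
        """Stand-in for an expensive simulation benchmark (higher is better).
        A real setup would run the simulation and measure events per second."""
        x = (p - LO) / (HI - LO)
        return -np.sum((x - np.array([0.7, 0.4, 0.55])) ** 2)   # analytic toy optimum

    # (mu + lambda) evolution strategy with a fixed mutation step per parameter
    MU, LAM, GENS = 5, 20, 40
    sigma = 0.1 * (HI - LO)
    pop = rng.uniform(LO, HI, size=(MU, 3))
    for _ in range(GENS):
        parents = pop[rng.integers(0, MU, size=LAM)]
        children = np.clip(parents + rng.normal(0.0, sigma, size=(LAM, 3)), LO, HI)
        everyone = np.vstack([pop, children])
        scores = np.array([throughput(p) for p in everyone])
        pop = everyone[np.argsort(scores)[-MU:]]                # keep the best mu

    best = pop[-1]
    print("best parameters:", np.round(best, 1), "score:", round(throughput(best), 4))
    ```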
