Sample records for extreme-scale computing workshop

  1. ASCR Cybersecurity for Scientific Computing Integrity - Research Pathways and Ideas Workshop

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Peisert, Sean; Potok, Thomas E.; Jones, Todd

    At the request of the U.S. Department of Energy's (DOE) Office of Science (SC) Advanced Scientific Computing Research (ASCR) program office, a workshop was held June 2-3, 2015, in Gaithersburg, MD, to identify potential long-term (10 to 20+ year) fundamental cybersecurity research and development challenges, strategies, and roadmaps facing future high performance computing (HPC), networks, data centers, and extreme-scale scientific user facilities. This workshop was a follow-on to the workshop held January 7-9, 2015, in Rockville, MD, that examined higher-level ideas about scientific computing integrity specific to the mission of the DOE Office of Science. Issues included research computation and simulation that takes place on ASCR computing facilities and networks, as well as network-connected scientific instruments, such as those run by various DOE Office of Science programs. Workshop participants included researchers and operational staff from DOE national laboratories, as well as academic researchers and industry experts. Participants were selected based on the submission of abstracts relating to the topics discussed in the previous workshop report [1] and also from other ASCR reports, including "Abstract Machine Models and Proxy Architectures for Exascale Computing" [27], the DOE "Preliminary Conceptual Design for an Exascale Computing Initiative" [28], and the January 2015 machine learning workshop [29]. The workshop was also attended by several observers from DOE and other government agencies. The workshop was divided into three topic areas: (1) Trustworthy Supercomputing, (2) Extreme-Scale Data, Knowledge, and Analytics for Understanding and Improving Cybersecurity, and (3) Trust within High-end Networking and Data Centers. Participants were divided into three corresponding teams based on the category of their abstracts. The workshop began with a series of talks from the program manager and workshop chair, followed by the leaders for each of the three topics and a representative of each of the four major DOE Office of Science Advanced Scientific Computing Research Facilities: the Argonne Leadership Computing Facility (ALCF), the Energy Sciences Network (ESnet), the National Energy Research Scientific Computing Center (NERSC), and the Oak Ridge Leadership Computing Facility (OLCF). The rest of the workshop consisted of topical breakout discussions and focused writing periods that produced much of this report.

  2. The future of scientific workflows

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Deelman, Ewa; Peterka, Tom; Altintas, Ilkay

    Today’s computational, experimental, and observational sciences rely on computations that involve many related tasks. The success of a scientific mission often hinges on the computer automation of these workflows. In April 2015, the US Department of Energy (DOE) invited a diverse group of domain and computer scientists from national laboratories supported by the Office of Science and the National Nuclear Security Administration, from industry, and from academia to review the workflow requirements of DOE’s science and national security missions, to assess the current state of the art in science workflows, to understand the impact of emerging extreme-scale computing systems on those workflows, and to develop requirements for automated workflow management in future and existing environments. This article is a summary of the opinions of over 50 leading researchers attending this workshop. We highlight use cases, computing systems, and workflow needs, and conclude by summarizing the remaining challenges this community sees as inhibiting large-scale scientific workflows from becoming a mainstream tool for extreme-scale science.

  3. 2009 fault tolerance for extreme-scale computing workshop, Albuquerque, NM - March 19-20, 2009.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Katz, D. S.; Daly, J.; DeBardeleben, N.

    2009-02-01

    This is a report on the third in a series of petascale workshops co-sponsored by Blue Waters and TeraGrid to address challenges and opportunities for making effective use of emerging extreme-scale computing. This workshop was held to discuss fault tolerance on large systems for running large, possibly long-running applications. The main point of the workshop was to have systems people, middleware people (including fault-tolerance experts), and applications people talk about the issues and figure out what needs to be done, mostly at the middleware and application levels, to run such applications on the emerging petascale systems without having faults cause large numbers of application failures. The workshop found that there is considerable interest in fault tolerance, resilience, and reliability of high-performance computing (HPC) systems in general, at all levels of HPC. The only way to recover from faults is through the use of some redundancy, either in space or in time. Redundancy in time, in the form of writing checkpoints to disk and restarting at the most recent checkpoint after a fault that causes an application to crash or halt, is the most common tool used in applications today, but there are questions about how long this can continue to be a good solution as systems and memories grow faster than I/O bandwidth to disk. There is interest both in modifications to this, such as checkpoints to memory, partial checkpoints, and message logging, and in alternative ideas, such as in-memory recovery using residues. We believe that systematic exploration of these ideas holds the most promise for the scientific applications community. Fault tolerance has been an issue of discussion in the HPC community for at least the past 10 years, but much like other issues, the community has managed to put off addressing it during this period. There is a growing recognition that as systems continue to grow to petascale and beyond, the field is approaching the point where we don't have any choice but to address this through R&D efforts.
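
    A minimal sketch of the checkpoint/restart pattern the report calls "redundancy in time" (in Python; the file name, state layout, and checkpoint interval are hypothetical, not any particular middleware's implementation): state is periodically persisted, and a restarted run resumes from the most recent checkpoint rather than from the beginning.

      import os
      import pickle

      CHECKPOINT = "state.ckpt"  # hypothetical checkpoint file

      def save_checkpoint(step, state):
          """Write application state to disk: redundancy in time."""
          tmp = CHECKPOINT + ".tmp"
          with open(tmp, "wb") as f:
              pickle.dump({"step": step, "state": state}, f)
          os.replace(tmp, CHECKPOINT)  # atomic rename: a crash cannot leave a torn checkpoint

      def load_checkpoint():
          """Resume from the most recent checkpoint if one exists, else start fresh."""
          if os.path.exists(CHECKPOINT):
              with open(CHECKPOINT, "rb") as f:
                  ckpt = pickle.load(f)
              return ckpt["step"], ckpt["state"]
          return 0, {"x": 0.0}

      step, state = load_checkpoint()
      for step in range(step, 1000):
          state["x"] += 1.0          # stand-in for one unit of real computation
          if (step + 1) % 100 == 0:  # checkpoint interval trades I/O cost against lost work
              save_checkpoint(step + 1, state)

    The report's open question, whether disk checkpointing stays viable as memories grow faster than I/O bandwidth, is a question about how expensive save_checkpoint becomes relative to the computation done between calls.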

  4. Workshop report on large-scale matrix diagonalization methods in chemistry theory institute

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bischof, C.H.; Shepard, R.L.; Huss-Lederman, S.

    The Large-Scale Matrix Diagonalization Methods in Chemistry theory institute brought together 41 computational chemists and numerical analysts. The goal was to understand the needs of the computational chemistry community in problems that utilize matrix diagonalization techniques. This was accomplished by reviewing the current state of the art and looking toward future directions in matrix diagonalization techniques. This institute occurred about 20 years after a related meeting of similar size. During those 20 years the Davidson method continued to dominate the problem of finding a few extremal eigenvalues for many computational chemistry problems. Work on non-diagonally dominant and non-Hermitian problems as well as parallel computing has also brought new methods to bear. The changes and similarities in problems and methods over the past two decades offered an interesting perspective on the successes in this area. One important area covered by the talks was overviews of the source and nature of the chemistry problems. The numerical analysts were uniformly grateful for the efforts to convey a better understanding of the problems and issues faced in computational chemistry. An important outcome was an understanding of the wide range of eigenproblems encountered in computational chemistry. The workshop covered problems involving self-consistent-field (SCF), configuration interaction (CI), intramolecular vibrational relaxation (IVR), and scattering problems. In atomic structure calculations using the Hartree-Fock method (SCF), the symmetric matrices can range from order hundreds to thousands. These matrices often include large clusters of eigenvalues which can be as much as 25% of the spectrum. However, if CI methods are also used, the matrix size can be between 10^4 and 10^9, where only one or a few extremal eigenvalues and eigenvectors are needed. Working with very large matrices has led to the development of …
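
    The regime described above, a few extremal eigenpairs of a matrix far too large to densify, is where iterative projection methods such as Davidson (and the related Lanczos method) are used. As a hedged illustration rather than the Davidson algorithm itself, this sketch applies SciPy's Lanczos-based eigsh to a synthetic sparse symmetric matrix:

      import numpy as np
      import scipy.sparse as sp
      from scipy.sparse.linalg import eigsh

      # Synthetic sparse symmetric matrix standing in for a large Hamiltonian
      # (diagonally dominant, the favorable case for Davidson-type iterations).
      n = 100_000
      diag = np.arange(1, n + 1, dtype=float)
      off = 0.1 * np.ones(n - 1)
      H = sp.diags([off, diag, off], offsets=[-1, 0, 1], format="csr")

      # Request only the 3 algebraically smallest eigenpairs; the matrix is never
      # formed densely, which is what makes orders of 10^4 to 10^9 tractable.
      vals, vecs = eigsh(H, k=3, which="SA")
      print(vals)  # three smallest eigenvalues, close to 1, 2, 3 for this matrix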

  5. 2016 Final Reports from the Los Alamos National Laboratory Computational Physics Student Summer Workshop

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Runnels, Scott Robert; Bachrach, Harrison Ian; Carlson, Nils

    The two primary purposes of LANL’s Computational Physics Student Summer Workshop are (1) to educate graduate and exceptional undergraduate students in the challenges and applications of computational physics of interest to LANL, and (2) to entice their interest toward those challenges. Computational physics is emerging as a discipline in its own right, combining expertise in mathematics, physics, and computer science. The mathematical aspects focus on numerical methods for solving equations on the computer as well as developing test problems with analytical solutions. The physics aspects are very broad, ranging from low-temperature material modeling to extremely high temperature plasma physics, radiation transport, and neutron transport. The computer science issues are concerned with matching numerical algorithms to emerging architectures and maintaining the quality of extremely large codes built to perform multi-physics calculations. Although graduate programs associated with computational physics are emerging, it is apparent that the pool of U.S. citizens in this multi-disciplinary field is relatively small and is typically not focused on the aspects that are of primary interest to LANL. Furthermore, more structured foundations for LANL interaction with universities in computational physics are needed; historically, interactions have relied heavily on individuals’ personalities and personal contacts. Thus a tertiary purpose of the Summer Workshop is to build an educational network of LANL researchers, university professors, and emerging students to advance the field and LANL’s involvement in it.

  6. Current status and future perspectives of electron interactions with molecules, clusters, surfaces, and interfaces [Workshop on Fundamental challenges in electron-driven chemistry; Workshop on Electron-driven processes: Scientific challenges and technological opportunities]

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Becker, Kurt H.; McCurdy, C. William; Orlando, Thomas M.

    2000-09-01

    This report is based largely on presentations and discussions at two workshops and contributions from workshop participants. The workshop on Fundamental Challenges in Electron-Driven Chemistry was held in Berkeley, October 9-10, 1998, and addressed questions regarding theory, computation, and simulation. The workshop on Electron-Driven Processes: Scientific Challenges and Technological Opportunities was held at Stevens Institute of Technology, March 16-17, 2000, and focused largely on experiments. Electron-molecule and electron-atom collisions initiate and drive almost all the relevant chemical processes associated with radiation chemistry, environmental chemistry, stability of waste repositories, plasma-enhanced chemical vapor deposition, plasma processing of materials for microelectronic devices and other applications, and novel light sources for research purposes (e.g. excimer lamps in the extreme ultraviolet) and in everyday lighting applications. The life sciences are a rapidly advancing field where the important role of electron-driven processes is only now beginning to be recognized. Many of the applications of electron-initiated chemical processes require results in the near term. A large-scale, multidisciplinary and collaborative effort should be mounted to solve these problems in a timely way so that their solution will have the needed impact on the urgent questions of understanding the physico-chemical processes initiated and driven by electron interactions.

  7. Extreme Weather and Climate: Workshop Report

    NASA Technical Reports Server (NTRS)

    Sobel, Adam; Camargo, Suzana; Debucquoy, Wim; Deodatis, George; Gerrard, Michael; Hall, Timothy; Hallman, Robert; Keenan, Jesse; Lall, Upmanu; Levy, Marc; et al.

    2016-01-01

    Extreme events are the aspects of climate to which human society is most sensitive. Due to both their severity and their rarity, extreme events can challenge the capacity of physical, social, economic and political infrastructures, turning natural events into human disasters. Yet, because they are low frequency events, the science of extreme events is very challenging. Among the challenges is the difficulty of connecting extreme events to longer-term, large-scale variability and trends in the climate system, including anthropogenic climate change. How can we best quantify the risks posed by extreme weather events, both in the current climate and in the warmer and different climates to come? How can we better predict them? What can we do to reduce the harm done by such events? In response to these questions, the Initiative on Extreme Weather and Climate has been created at Columbia University in New York City (extremeweather.columbia.edu). This Initiative is a University-wide activity focused on understanding the risks to human life, property, infrastructure, communities, institutions, ecosystems, and landscapes from extreme weather events, both in the present and future climates, and on developing solutions to mitigate those risks. In May 2015, the Initiative held its first science workshop, entitled Extreme Weather and Climate: Hazards, Impacts, Actions. The purpose of the workshop was to define the scope of the Initiative; the tremendously broad intellectual footprint of the topic is indicated by the titles of the presentations (see Table 1). The intent of the workshop was to stimulate thought across disciplinary lines by juxtaposing talks whose subjects differed dramatically. Each session concluded with a question-and-answer panel. Approximately 150 people were in attendance throughout the day. Below is a brief synopsis of each presentation. The synopses collectively reflect the variety and richness of the emerging extreme event research agenda.

  8. 2015 Final Reports from the Los Alamos National Laboratory Computational Physics Student Summer Workshop

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Runnels, Scott Robert; Caldwell, Wendy; Brown, Barton Jed

    The two primary purposes of LANL’s Computational Physics Student Summer Workshop are (1) to educate graduate and exceptional undergraduate students in the challenges and applications of computational physics of interest to LANL, and (2) to entice their interest toward those challenges. Computational physics is emerging as a discipline in its own right, combining expertise in mathematics, physics, and computer science. The mathematical aspects focus on numerical methods for solving equations on the computer as well as developing test problems with analytical solutions. The physics aspects are very broad, ranging from low-temperature material modeling to extremely high temperature plasma physics, radiation transport, and neutron transport. The computer science issues are concerned with matching numerical algorithms to emerging architectures and maintaining the quality of extremely large codes built to perform multi-physics calculations. Although graduate programs associated with computational physics are emerging, it is apparent that the pool of U.S. citizens in this multi-disciplinary field is relatively small and is typically not focused on the aspects that are of primary interest to LANL. Furthermore, more structured foundations for LANL interaction with universities in computational physics are needed; historically, interactions have relied heavily on individuals’ personalities and personal contacts. Thus a tertiary purpose of the Summer Workshop is to build an educational network of LANL researchers, university professors, and emerging students to advance the field and LANL’s involvement in it. This report includes both the background for the program and the reports from the students.

  9. Short-Pulse Laser-Matter Computational Workshop Proceedings

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Town, R; Tabak, M

    For three days at the end of August 2004, 55 plasma scientists met at the Four Points by Sheraton in Pleasanton to discuss some of the critical issues associated with the computational aspects of the interaction of short-pulse high-intensity lasers with matter. The workshop was organized around the following six key areas: (1) Laser propagation/interaction through various density plasmas: micro scale; (2) Anomalous electron transport effects: From micro to meso scale; (3) Electron transport through plasmas: From meso to macro scale; (4) Ion beam generation, transport, and focusing; (5) ''Atomic-scale'' electron and proton stopping powers; and (6) Kα diagnostics.

  10. Opening Comments: SciDAC 2009

    NASA Astrophysics Data System (ADS)

    Strayer, Michael

    2009-07-01

    Welcome to San Diego and the 2009 SciDAC conference. Over the next four days, I would like to present an assessment of the SciDAC program. We will look at where we've been, how we got to where we are and where we are going in the future. Our vision is to be first in computational science, to be best in class in modeling and simulation. When Ray Orbach asked me what I would do, in my job interview for the SciDAC Director position, I said we would achieve that vision. And with our collective dedicated efforts, we have managed to achieve this vision. In the last year, we have now the most powerful supercomputer for open science, Jaguar, the Cray XT system at the Oak Ridge Leadership Computing Facility (OLCF). We also have NERSC, probably the best-in-the-world program for productivity in science that the Office of Science so depends on. And the Argonne Leadership Computing Facility offers architectural diversity with its IBM Blue Gene/P system as a counterbalance to Oak Ridge. There is also ESnet, which is often understated—the 40 gigabit per second dual backbone ring that connects all the labs and many DOE sites. In the President's Recovery Act funding, there is exciting news that ESnet is going to build out to a 100 gigabit per second network using new optical technologies. This is very exciting news for simulations and large-scale scientific facilities. But as one noted SciDAC luminary said, it's not all about the computers—it's also about the science—and we are also achieving our vision in this area. Together with having the fastest supercomputer for science, at the SC08 conference, SciDAC researchers won two ACM Gordon Bell Prizes for the outstanding performance of their applications. The DCA++ code, which solves some very interesting problems in materials, achieved a sustained performance of 1.3 petaflops, an astounding result and a mark I suspect will last for some time. The LS3DF application for studying nanomaterials also required the development of a new and novel algorithm to produce results up to 400 times faster than a similar application, and was recognized with a prize for algorithm innovation—a remarkable achievement. Day one of our conference will include examples of petascale science enabled at the OLCF. Although Jaguar has not been officially commissioned, it has gone through its acceptance tests, and during its shakedown phase there have been pioneer applications used for the acceptance tests, and they are running at scale. These include applications in the areas of astrophysics, biology, chemistry, combustion, fusion, geosciences, materials science, nuclear energy and nuclear physics. We also have a whole compendium of science we do at our facilities; these have been documented and reviewed at our last SciDAC conference. Many of these were highlighted in our Breakthroughs Report. One session at this week's conference will feature a cross-section of these breakthroughs. In the area of scalable electromagnetic simulations, the Auxiliary-space Maxwell Solver (AMS) uses specialized finite element discretizations and multigrid-based techniques, which decompose the original problem into easier-to-solve subproblems. Congratulations to the mathematicians on this. Another application on the list of breakthroughs was the authentication of PETSc, which provides scalable solvers used in many DOE applications and has solved problems with over 3 billion unknowns and scaled to over 16,000 processors on DOE leadership-class computers. 
This is becoming a very versatile and useful toolkit to achieve performance at scale. With the announcement of SIAM's first class of Fellows, we are remarkably well represented. Of the group of 191, more than 40 of these Fellows are in the 'DOE space.' We are so delighted that SIAM has recognized them for their many achievements. In the coming months, we will illustrate our leadership in applied math and computer science by looking at our contributions in the areas of programming models, development and performance tools, math libraries, system software, collaboration, and visualization and data analytics. This is a large and diverse list of libraries. We have asked for two panels, one chaired by David Keyes and composed of many of the nation's leading mathematicians, to produce a report on the most significant accomplishments in applied mathematics over the last eight years, taking us back to the start of the SciDAC program. In addition, we have a similar panel in computer science to be chaired by Kathy Yelick. They are going to identify the computer science accomplishments of the past eight years. These accomplishments are difficult to get a handle on, and I'm looking forward to this report. We will also have a follow-on to our report on breakthroughs in computational science and this will also go back eight years, looking at the many accomplishments under the SciDAC and INCITE programs. This will be chaired by Tony Mezzacappa. So, where are we going in the SciDAC program? It might help to take a look at computational science and how it got started. I go back to Ken Wilson, who made the model and has written on computational science and computational science education. His model was thus: The computational scientist plays the role of the experimentalist, and the math and CS researchers play the role of theorists, and the computers themselves are the experimental apparatus. And that in simulation science, we are carrying out numerical experiments as to the nature of physical and biological sciences. Peter Lax, in the same time frame, developed a report on large-scale computing in science and engineering. Peter remarked, 'Perhaps the most important applications of scientific computing come not in the solution of old problems, but in the discovery of new phenomena through numerical experimentation.' And in the early years, I think the person who provided the most guidance, the most innovation and the most vision for where the future might lie was Ed Oliver. Ed Oliver died last year. Ed did a number of things in science. He had this personality where he knew exactly what to do, but he preferred to stay out of the limelight so that others could enjoy the fruits of his vision. We in the SciDAC program and ASCR Facilities are still enjoying the benefits of his vision. We will miss him. Twenty years after Ken Wilson, Ray Orbach laid out the fundamental premise for SciDAC in an interview that appeared in SciDAC Review: 'SciDAC is unique in the world. There isn't any other program like it anywhere else, and it has the remarkable ability to do science by bringing together physical scientists, mathematicians, applied mathematicians, and computer scientists who recognize that computation is not something you do at the end, but rather it needs to be built into the solution of the very problem that one is addressing. 
' As you look at the Lax report from 1982, it talks about how 'Future significant improvements may have to come from architectures embodying parallel processing elements—perhaps several thousands of processors.' And it continues, 'Research in languages, algorithms and numerical analysis will be crucial in learning to exploit these new architectures fully.' In the early '90s, Sterling, Messina and Smith developed a workshop report on petascale computing and concluded, 'A petaflops computer system will be feasible in two decades, or less, and rely in part on the continual advancement of the semiconductor industry both in speed enhancement and cost reduction through improved fabrication processes.' So they were not wrong, and today we are embarking on a forward look that is at a different scale, the exascale, going to 10^18 flops. In 2007, Stevens, Simon and Zacharia chaired a series of town hall meetings looking at exascale computing, and in their report wrote, 'Exascale computer systems are expected to be technologically feasible within the next 15 years, or perhaps sooner. These systems will push the envelope in a number of important technologies: processor architecture, scale of multicore integration, power management and packaging.' The concept of computing on the Jaguar computer involves hundreds of thousands of cores, as do the IBM systems that are currently out there. So the scale of computing with systems with billions of processors is staggering to me, and I don't know how the software and math folks feel about it. We have now embarked on a road toward extreme scale computing. We have created a series of town hall meetings and we are now in the process of holding workshops that address what I call within the DOE speak 'the mission need,' or what is the scientific justification for computing at that scale. We are going to have a total of 13 workshops. The workshops on climate, high energy physics, nuclear physics, fusion, and nuclear energy have been held. The report from the workshop on climate is actually out and available, and the other reports are being completed. The upcoming workshops are on biology, materials, and chemistry; and workshops that engage science for nuclear security are a partnership between NNSA and ASCR. There are additional workshops on applied math, computer science, and architecture that are needed for computing at the exascale. These extreme scale workshops will provide the foundation in our office, the Office of Science, the NNSA and DOE, and we will engage the National Science Foundation and the Department of Defense as partners. We envision a 10-year program for an exascale initiative. It will be an integrated R&D program initially—you can think about five years for research and development—that would be in hardware, operating systems, file systems, networking and so on, as well as software for applications. Application software and the operating system and the hardware all need to be bundled in this period so that at the end the system will execute the science applications at scale. We also believe that this process will have to have considerable investment from the manufacturers and vendors to be successful. We have formed laboratory, university and industry working groups to start this process and formed a panel to look at where SciDAC needs to go to compute at the extreme scale, and we have formed an executive committee within the Office of Science and the NNSA to focus on these activities. We will have outreach to DoD in the next few months. 
We are anticipating a solicitation within the next two years in which we will compete this bundled R&D process. We don't know how we will incorporate SciDAC into extreme scale computing, but we do know there will be many challenges. And as we have shown over the years, we have the expertise and determination to surmount these challenges.

  11. ARL Collaborative Research Alliance Materials in Extreme Dynamic Environments (MEDE)

    DTIC Science & Technology

    2010-11-19

    Program Internal to the CRA: Staff Rotation; Lectures, Workshops, and Research Reviews; Education Opportunities for Government Personnel; Student ... Engagement with ARL Research Environment; Industry Partnership + Collaboration; Other Collaboration Opportunities; High Performance Computing; DoD

  12. Biomanufacturing: a US-China National Science Foundation-sponsored workshop.

    PubMed

    Sun, Wei; Yan, Yongnian; Lin, Feng; Spector, Myron

    2006-05-01

    A recent US-China National Science Foundation-sponsored workshop on biomanufacturing reviewed the state-of-the-art of an array of new technologies for producing scaffolds for tissue engineering, providing precision multi-scale control of material, architecture, and cells. One broad category of such techniques has been termed solid freeform fabrication. The techniques in this category include: stereolithography, selective laser sintering, single- and multiple-nozzle deposition and fused deposition modeling, and three-dimensional printing. The precise and repetitive placement of material and cells in a three-dimensional construct at the micrometer length scale demands computer control. These novel computer-controlled scaffold production techniques, when coupled with computer-based imaging and structural modeling methods for the production of the templates for the scaffolds, define an emerging field of computer-aided tissue engineering. In formulating the questions that remain to be answered and discussing the knowledge required to further advance the field, the Workshop provided a basis for recommendations for future work.

  13. Preface: phys. stat. sol. (b) 243/5

    NASA Astrophysics Data System (ADS)

    Artacho, Emilio; Beck, Thomas L.; Hernández, Eduardo

    Between 20 and 24 June 2005 the Centre Européen de Calcul Atomique et Moléculaire - or CECAM, as it is more widely known - hosted a workshop entitled State-of-the-art, developments and perspectives of real-space electronic structure methods in condensed-matter and chemical physics, organized with the support of CECAM itself and the Ψk network. The workshop was attended by some forty participants coming from fifteen countries, and about thirty presentations were given. The workshop provided a lively forum for the discussion of recent methodological developments in electronic structure calculations, ranging from linear-scaling methods and mesh techniques to time-dependent density functional methods and much else besides, which had been our ultimate objective when undertaking its organization. The first-principles simulation of solids, liquids and complex matter in general has jumped in the last few years from the relatively confined niches in condensed matter and materials physics and in quantum chemistry, to cover most of the sciences, including nano, bio, geo, environmental sciences and engineering. This has been made possible by the ability of simulation techniques to deal with an ever larger degree of complexity. Although this is partially to be attributed to the steady increase in computer power, the main factor behind this change has been the coming of age of the main theoretical framework for most of the simulations performed today, together with an extremely active development of the basic algorithms for its computer implementation. It is this latter aspect that is the topic of this special issue of physica status solidi. There is a relentless effort in the scientific community seeking to achieve not only higher accuracy, but also more efficient, cost-effective and if possible simpler computational methods in electronic structure calculations [1]. From the early 1990s onwards there has been a keen interest in the computational condensed matter and chemical physics communities in methods that had the potential to overcome the unfavourable scaling of the computational cost with the system size, implicit in the momentum-space formalism familiar to solid-state physicists and the quantum chemistry approaches more common in chemical physics and physical chemistry. This interest was sparked by the famous paper in which Weitao Yang [2] introduced the Divide and Conquer method. Soon afterwards several practical schemes aiming to achieve linear-scaling calculations, by exploiting what Walter Kohn most aptly called the near-sightedness of quantum mechanics [3], were proposed and explored (for a review on linear-scaling methods, see [4]). This search for novel, more efficient and better scaling algorithms proved to be fruitful in more than one way. Not only was it the start of several packages which are well-known today (such as Siesta, Conquest, etc.), but it also led to new ways of representing electronic states and orbitals, such as grids [5, 6], wavelets [7], finite elements, etc. Also, the drive to exploit near-sightedness attracted computational solid state physicists to the type of atomic-like basis functions traditionally used in the quantum chemistry community. 
At the same time computational chemists learnt about plane waves and density functional theory, and thus a fruitful dialogue was started between two communities that hitherto had not had much contact. Another interesting development that has begun to take place over the last decade or so is the convergence of several branches of science, notably physics, chemistry and biology, at the nanoscale. Experimentalists in all these different fields are now performing highly sophisticated measurements on systems of nanometer size, the kind of systems that we theoreticians can address with our computational methods, and this convergence of experiment and theory at this scale has also been very fruitful, particularly in the fields of electronic transport and STM image simulation. It is now quite common to find papers at the cutting edge of nanoscience and nanotechnology co-authored by experimentalists and theorists, and it can only be expected that this fruitful interplay between theory and experiment will increase in the future. It was considerations such as these that moved us to propose to CECAM and Ψk the celebration of a workshop devoted to the discussion of recent developments in electronic structure techniques, a proposal that was enthusiastically received, not just by CECAM and Ψk, but also by our invited speakers and participants. Interest in novel electronic structure methods is now as high as ever, and we are therefore very happy that physica status solidi has given us the opportunity to devote a special issue to the topics covered in the workshop. This special issue of physica status solidi gathers invited contributions from several attendees of the workshop, contributions that are representative of the range of topics and issues discussed then, including progress in linear scaling methods, electronic transport, simulation of STM images, time-dependent DFT methods, etc. It remains for us to thank all the contributors to this special issue for their efforts, CECAM and Ψk for funding the workshop, physica status solidi for agreeing to devote this special issue to the workshop, and last but not least Emmanuelle and Emilie, the CECAM secretaries, for their invaluable practical help in putting this workshop together.

  14. 76 FR 38360 - Workshop-Monitoring Changes in Extreme Storm Statistics: State of Knowledge; Notice of Open...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-06-30

    ... Changes in Extreme Storm Statistics: State of Knowledge; Notice of Open Public Workshop AGENCY: National... .) SUPPLEMENTARY INFORMATION: This workshop will provide an update to the climate science surrounding extreme... storms. Specific topics include: Severe Thunderstorms (and associated hail and winds), tornadoes, extreme...

  15. Final Report. Institute for Ultrascale Visualization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ma, Kwan-Liu; Galli, Giulia; Gygi, Francois

    The SciDAC Institute for Ultrascale Visualization brought together leading experts from visualization, high-performance computing, and science application areas to make advanced visualization solutions for SciDAC scientists and the broader community. Over the five-year project, the Institute introduced many new enabling visualization techniques, which have significantly enhanced scientists’ ability to validate their simulations, interpret their data, and communicate with others about their work and findings. This Institute project involved a large number of junior and student researchers, who received the opportunities to work on some of the most challenging science applications and gain access to the most powerful high-performance computing facilities in the world. They were readily trained and prepared for facing the greater challenges presented by extreme-scale computing. The Institute’s outreach efforts, through publications, workshops and tutorials, successfully disseminated the new knowledge and technologies to the SciDAC and the broader scientific communities. The scientific findings and experience of the Institute team helped plan the SciDAC3 program.

  16. Simple Assessment Techniques for Soil and Water. Environmental Factors in Small Scale Development Projects. Workshops.

    ERIC Educational Resources Information Center

    Coordination in Development, New York, NY.

    This booklet was produced in response to the growing need for reliable environmental assessment techniques that can be applied to small-scale development projects. The suggested techniques emphasize low-technology environmental analysis. Although these techniques may lack precision, they can be extremely valuable in helping to assure the success…

  17. Trends in Middle East climate extreme indices from 1950 to 2003

    NASA Astrophysics Data System (ADS)

    Zhang, Xuebin; Aguilar, Enric; Sensoy, Serhat; Melkonyan, Hamlet; Tagiyeva, Umayra; Ahmed, Nader; Kutaladze, Nato; Rahimzadeh, Fatemeh; Taghipour, Afsaneh; Hantosh, T. H.; Albert, Pinhas; Semawi, Mohammed; Karam Ali, Mohammad; Said Al-Shabibi, Mansoor Halal; Al-Oulan, Zaid; Zatari, Taha; Al Dean Khelet, Imad; Hamoud, Saleh; Sagir, Ramazan; Demircan, Mesut; Eken, Mehmet; Adiguzel, Mustafa; Alexander, Lisa; Peterson, Thomas C.; Wallis, Trevor

    2005-11-01

    A climate change workshop for the Middle East brought together scientists and data for the region to produce the first area-wide analysis of climate extremes for the region. This paper reports trends in extreme precipitation and temperature indices that were computed during the workshop and additional indices data that became available after the workshop. Trends in these indices were examined for 1950-2003 at 52 stations covering 15 countries, including Armenia, Azerbaijan, Bahrain, Cyprus, Georgia, Iran, Iraq, Israel, Jordan, Kuwait, Oman, Qatar, Saudi Arabia, Syria, and Turkey. Results indicate that there have been statistically significant, spatially coherent trends in temperature indices that are related to temperature increases in the region. Significant, increasing trends have been found in the annual maximum of daily maximum and minimum temperature, the annual minimum of daily maximum and minimum temperature, the number of summer nights, and the number of days where daily temperature has exceeded its 90th percentile. Significant negative trends have been found in the number of days when daily temperature is below its 10th percentile and daily temperature range. Trends in precipitation indices, including the number of days with precipitation, the average precipitation intensity, and maximum daily precipitation events, are weak in general and do not show spatial coherence. The workshop attendees have generously made the indices data available for the international research community.
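
    As a concrete illustration of the percentile-based indices reported above, this sketch computes a TX90p-like quantity, the percentage of days per year on which daily maximum temperature exceeds its 90th percentile, on synthetic station data. It is simplified relative to the ETCCDI definitions used in such workshops, which take a calendar-day percentile over a fixed base period with a 5-day window:

      import numpy as np
      import pandas as pd

      # Hypothetical daily maximum temperatures for 1950-2003 at one station.
      rng = np.random.default_rng(0)
      dates = pd.date_range("1950-01-01", "2003-12-31", freq="D")
      tmax = pd.Series(
          25 + 10 * np.sin(2 * np.pi * dates.dayofyear / 365.25)
          + rng.normal(0, 3, len(dates)),
          index=dates,
      )

      # Threshold: 90th percentile over a 1961-1990 base period (simplified).
      threshold = tmax.loc["1961":"1990"].quantile(0.90)

      # Percentage of days in each year exceeding the threshold.
      tx90p = (tmax > threshold).groupby(tmax.index.year).mean() * 100
      print(tx90p.tail())  # an upward trend here would mirror the paper's finding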

  18. Large Scale Computing and Storage Requirements for High Energy Physics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gerber, Richard A.; Wasserman, Harvey

    2010-11-24

    The National Energy Research Scientific Computing Center (NERSC) is the leading scientific computing facility for the Department of Energy's Office of Science, providing high-performance computing (HPC) resources to more than 3,000 researchers working on about 400 projects. NERSC provides large-scale computing resources and, crucially, the support and expertise needed for scientists to make effective use of them. In November 2009, NERSC, DOE's Office of Advanced Scientific Computing Research (ASCR), and DOE's Office of High Energy Physics (HEP) held a workshop to characterize the HPC resources needed at NERSC to support HEP research through the next three to five years. The effort is part of NERSC's legacy of anticipating users' needs and deploying resources to meet those demands. The workshop revealed several key points, in addition to achieving its goal of collecting and characterizing computing requirements. The chief findings: (1) Science teams need access to a significant increase in computational resources to meet their research goals; (2) Research teams need to be able to read, write, transfer, store online, archive, analyze, and share huge volumes of data; (3) Science teams need guidance and support to implement their codes on future architectures; and (4) Projects need predictable, rapid turnaround of their computational jobs to meet mission-critical time constraints. This report expands upon these key points and includes others. It also presents a number of case studies as representative of the research conducted within HEP. Workshop participants were asked to codify their requirements in this case study format, summarizing their science goals, methods of solution, current and three-to-five year computing requirements, and software and support needs. Participants were also asked to describe their strategy for computing in the highly parallel, multi-core environment that is expected to dominate HPC architectures over the next few years. The report includes a section that describes efforts already underway or planned at NERSC that address requirements collected at the workshop. NERSC has many initiatives in progress that address key workshop findings and are aligned with NERSC's strategic plans.

  19. Grid Quality and Resolution Issues from the Drag Prediction Workshop Series

    NASA Technical Reports Server (NTRS)

    Mavriplis, Dimitri J.; Vassberg, John C.; Tinoco, Edward N.; Mani, Mori; Brodersen, Olaf P.; Eisfeld, Bernhard; Wahls, Richard A.; Morrison, Joseph H.; Zickuhr, Tom; Levy, David; et al.

    2008-01-01

    The drag prediction workshop series (DPW), held over the last six years and sponsored by the AIAA Applied Aerodynamics Committee, has been extremely useful in providing an assessment of the state-of-the-art in computationally based aerodynamic drag prediction. An emerging consensus from the three workshops in the series has been the identification of spatial discretization errors as a dominant error source in absolute as well as incremental drag prediction. This paper provides an overview of the collective experience from the workshop series regarding the effect of grid-related issues on overall drag prediction accuracy. Examples based on workshop results are used to illustrate the effect of grid resolution and grid quality on drag prediction, and grid convergence behavior is examined in detail. For fully attached flows, various accurate and successful workshop results are demonstrated, while anomalous behavior is identified for a number of cases involving substantial regions of separated flow. Based on collective workshop experiences, recommendations are given for improvements in mesh generation technology which have the potential to advance the state-of-the-art of aerodynamic drag prediction.
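
    The grid convergence behavior discussed above is commonly quantified with Richardson extrapolation. A small sketch, with made-up drag values and refinement ratio, recovering the observed order of convergence from results on three systematically refined grids:

      import math

      def observed_order(f_coarse, f_medium, f_fine, r):
          """Observed order of convergence p from three grids refined by ratio r."""
          return math.log((f_coarse - f_medium) / (f_medium - f_fine)) / math.log(r)

      def richardson_estimate(f_medium, f_fine, r, p):
          """Extrapolated (grid-converged) estimate of the quantity of interest."""
          return f_fine + (f_fine - f_medium) / (r**p - 1)

      # Hypothetical drag coefficients on coarse, medium, and fine grids:
      cd = (0.02850, 0.02790, 0.02760)
      r = 2 ** 0.5  # a common grid refinement ratio
      p = observed_order(*cd, r)
      print(p, richardson_estimate(cd[1], cd[2], r, p))  # p = 2.0, extrapolated cd = 0.0273

    Monotone convergence of this kind is what the workshops observed for fully attached flows; the separated-flow cases are precisely those where such extrapolation breaks down.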

  20. Network-based approaches to climate knowledge discovery

    NASA Astrophysics Data System (ADS)

    Budich, Reinhard; Nyberg, Per; Weigel, Tobias

    2011-11-01

    Climate Knowledge Discovery Workshop; Hamburg, Germany, 30 March to 1 April 2011 Do complex networks combined with semantic Web technologies offer the next generation of solutions in climate science? To address this question, a first Climate Knowledge Discovery (CKD) Workshop, hosted by the German Climate Computing Center (Deutsches Klimarechenzentrum (DKRZ)), brought together climate and computer scientists from major American and European laboratories, data centers, and universities, as well as representatives from industry, the broader academic community, and the semantic Web communities. The participants, representing six countries, were concerned with large-scale Earth system modeling and computational data analysis. The motivation for the meeting was the growing problem that climate scientists generate data faster than it can be interpreted and the need to prepare for further exponential data increases. Current analysis approaches are focused primarily on traditional methods, which are best suited for large-scale phenomena and coarse-resolution data sets. The workshop focused on the open discussion of ideas and technologies to provide the next generation of solutions to cope with the increasing data volumes in climate science.

  1. A characterization of workflow management systems for extreme-scale applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ferreira da Silva, Rafael; Filgueira, Rosa; Pietri, Ilia

    The automation of the execution of computational tasks is at the heart of improving scientific productivity. In recent years, scientific workflows have been established as an important abstraction that captures the data processing and computation of large and complex scientific applications. By allowing scientists to model and express entire data processing steps and their dependencies, workflow management systems relieve scientists from the details of an application and manage its execution on a computational infrastructure. As the resource requirements of today’s computational and data science applications that process vast amounts of data keep increasing, there is a compelling case for a new generation of advances in high-performance computing, commonly termed extreme-scale computing, which will bring forth multiple challenges for the design of workflow applications and management systems. This paper presents a novel characterization of workflow management systems using features commonly associated with extreme-scale computing applications. We classify 15 popular workflow management systems in terms of workflow execution models, heterogeneous computing environments, and data access methods. Finally, the paper also surveys workflow applications and identifies gaps for future research on the road to extreme-scale workflows and management systems.
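
    The abstraction being characterized, tasks plus dependencies forming a DAG that is executed in dependency order, can be shown in miniature. This toy sketch uses hypothetical task names; real workflow management systems layer scheduling, data staging, provenance, and fault tolerance on top of this core:

      from graphlib import TopologicalSorter  # Python 3.9+

      def fetch():    print("fetch raw data")
      def clean():    print("clean data")
      def simulate(): print("run simulation")
      def analyze():  print("analyze outputs")

      funcs = {"fetch": fetch, "clean": clean, "simulate": simulate, "analyze": analyze}

      # Each task maps to the set of tasks it depends on, forming a DAG.
      deps = {"clean": {"fetch"}, "simulate": {"clean"}, "analyze": {"simulate", "clean"}}

      # Run every dependency before its dependents.
      for name in TopologicalSorter(deps).static_order():
          funcs[name]()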

  2. A characterization of workflow management systems for extreme-scale applications

    DOE PAGES

    Ferreira da Silva, Rafael; Filgueira, Rosa; Pietri, Ilia; ...

    2017-02-16

    The automation of the execution of computational tasks is at the heart of improving scientific productivity. In recent years, scientific workflows have been established as an important abstraction that captures the data processing and computation of large and complex scientific applications. By allowing scientists to model and express entire data processing steps and their dependencies, workflow management systems relieve scientists from the details of an application and manage its execution on a computational infrastructure. As the resource requirements of today’s computational and data science applications that process vast amounts of data keep increasing, there is a compelling case for a new generation of advances in high-performance computing, commonly termed extreme-scale computing, which will bring forth multiple challenges for the design of workflow applications and management systems. This paper presents a novel characterization of workflow management systems using features commonly associated with extreme-scale computing applications. We classify 15 popular workflow management systems in terms of workflow execution models, heterogeneous computing environments, and data access methods. Finally, the paper also surveys workflow applications and identifies gaps for future research on the road to extreme-scale workflows and management systems.

  3. Nonstationarity RC Workshop Report: Nonstationary Weather Patterns and Extreme Events Informing Design and Planning for Long-Lived Infrastructure

    DTIC Science & Technology

    2017-11-01

    magnitude, intensity, and seasonality of climate. For infrastructure projects, relevant design life often exceeds 30 years—a period of time of ... uncertainty about future statistical properties of climate at time and spatial scales required for planning and design purposes. Information ... about future statistical properties of climate at time and spatial scales required for planning and design, and for assessing future operational

  4. Language and Discourse Analysis with Coh-Metrix: Applications from Educational Material to Learning Environments at Scale

    ERIC Educational Resources Information Center

    Dowell, Nia M. M.; Graesser, Arthur C.; Cai, Zhiqiang

    2016-01-01

    The goal of this article is to preserve and distribute the information presented at the LASI (2014) workshop on Coh-Metrix, a theoretically grounded, computational linguistics facility that analyzes texts on multiple levels of language and discourse. The workshop focused on the utility of Coh-Metrix in discourse theory and educational practice. We…

  5. Community Science Workshops: A Powerful and Feasible Model for Serving Underserved Youth. An Evaluation Brief

    ERIC Educational Resources Information Center

    Inverness Research Associates, 2007

    2007-01-01

    The people at Inverness Research Associates spent 12 years studying Community Science Workshops (CSW) in California and in six other states. They gathered statistics on the scale, scope, and cost-efficiency of CSW services to youth. They observed youth at work in the shops--taking apart computers, repairing bikes, growing plants, and so on--and…

  6. Perspectives from the NSF-sponsored workshop on Grand Challenges in Nanomaterials

    NASA Astrophysics Data System (ADS)

    Hull, Robert

    2004-03-01

    At an NSF-sponsored workshop in June 2003, about seventy research leaders in the field of nanomaterials met to discuss, explore and identify future new directions and critical needs ("Grand Challenges") for the next decade and beyond. The key pervasive theme that was identified was the need to develop techniques for assembly of nanoscaled materials over multiple length scales, at the levels of efficiency, economy, and precision necessary to realize broad new classes of applications in such diverse technologies as electronics, computation, telecommunications, data storage, energy storage / transmission / generation, health care, transportation, civil infrastructure, military applications, national security, and the environment. Elements of this strategy include development of new self-assembly and lithographic techniques; biologically-mediated synthesis; three-dimensional atomic-scale measurement of structure, properties and chemistry; harnessing of the sub-atomic properties of materials such as electron spin and quantum interactions; new computational methods that span all relevant length- and time- scales; a fundamental understanding of acceptable / achievable "fault tolerance" at the nanoscale; and methods for real-time and distributed sensing of nanoscale assembly. A parallel theme was the need to provide education concerning the potential, applications, and benefits of nanomaterials to all components of society and all levels of the educational spectrum. This talk will summarize the conclusions and recommendations from this workshop, and illustrate the future potential of this field through presentation of selected breakthrough results provided by workshop participants.

  7. Physics of the 1 Teraflop RIKEN-BNL-Columbia QCD project. Proceedings of RIKEN BNL Research Center workshop: Volume 13

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    1998-10-16

    A workshop was held at the RIKEN-BNL Research Center on October 16, 1998, as part of the first anniversary celebration for the center. This meeting brought together the physicists from RIKEN-BNL, BNL and Columbia who are using the QCDSP (Quantum Chromodynamics on Digital Signal Processors) computer at the RIKEN-BNL Research Center for studies of QCD. Many of the talks in the workshop were devoted to domain wall fermions, a discretization of the continuum description of fermions which preserves the global symmetries of the continuum, even at finite lattice spacing. This formulation has been the subject of analytic investigation for some time and has reached the stage where large-scale simulations in QCD seem very promising. With the computational power available from the QCDSP computers, scientists are looking forward to an exciting time for numerical simulations of QCD.

  8. Extreme Environments Development of Decision Processes and Training Programs for Medical Policy Formulation

    NASA Technical Reports Server (NTRS)

    Stough, Roger

    2004-01-01

    The purpose of this workshop was to survey existing health and safety policies as well as processes and practices for various extreme environments; to identify strengths and shortcomings of these processes; and to recommend parameters for inclusion in a generic approach to policy formulation, applicable to the broadest categories of extreme environments. It was anticipated that two additional workshops would follow. The November 7, 2003 workshop would be devoted to the evaluation of different model(s) and a concluding expert evaluation of the usefulness of the model using a policy formulation example. The final workshop was planned for March 2004.

  9. Extreme Conditions Modeling Workshop Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Coe, R. G.; Neary, V. S.; Lawson, M. J.

    2014-07-01

    Sandia National Laboratories (SNL) and the National Renewable Energy Laboratory (NREL) hosted the Wave Energy Converter (WEC) Extreme Conditions Modeling (ECM) Workshop in Albuquerque, NM on May 13th-14th, 2014. The objective of the workshop was to review the current state of knowledge on how to model WECs in extreme conditions (e.g. hurricanes and other large storms) and to suggest how U.S. Department of Energy (DOE) and national laboratory resources could be used to improve ECM methods for the benefit of the wave energy industry.

  10. PREFACE: Joint IPPP Durham/Cockcroft Institute/ICFA Workshop on Advanced QED methods for Future Accelerators

    NASA Astrophysics Data System (ADS)

    Bailey, I. R.; Barber, D. P.; Chattopadhyay, S.; Hartin, A.; Heinzl, T.; Hesselbach, S.; Moortgat-Pick, G. A.

    2009-11-01

    The joint IPPP Durham/Cockcroft Institute/ICFA workshop on advanced QED methods for future accelerators took place at the Cockcroft Institute in early March 2009. The motivation for the workshop was the need for a detailed consideration of the physics processes associated with beam-beam effects at the interaction points of future high-energy electron-positron colliders. There is a broad consensus within the particle physics community that the next international facility for experimental high-energy physics research beyond the Large Hadron Collider at CERN should be a high-luminosity electron-positron collider working at the TeV energy scale. One important feature of such a collider will be its ability to deliver polarised beams to the interaction point and to provide accurate measurements of the polarisation state during physics collisions. The physics collisions take place in very dense charge bunches in the presence of extremely strong electromagnetic fields, of order the Schwinger critical field strength of 4.4×10^13 Gauss. These intense fields lead to depolarisation processes which need to be thoroughly understood in order to reduce uncertainty in the polarisation state at collision. To that end, this workshop reviewed the formalisms for describing radiative processes and the methods of calculation in the future strong-field environments. These calculations are based on the Furry picture of organising the interaction term of the Lagrangian. The means of deriving the transition probability of the most important of the beam-beam processes - Beamstrahlung - was reviewed. The workshop was honoured by presentations from V N Baier, one of the founders of the 'Operator method', one means of performing these calculations. Other theoretical methods of performing calculations in the Furry picture, namely those due to A I Nikishov, V I Ritus et al, were reviewed, and intense field quantum processes in fields of different form - namely those present in intense lasers - were also presented. Within the Furry picture the lowest order physics processes are represented by one-vertex Feynman diagrams. Additionally, higher order processes in the Furry picture are thought to be important and are still not fully studied. The Advanced QED methods workshop also benefited greatly from reports on ongoing and planned experimental work on quantum processes in intense external fields. Some of the experiments reviewed were the NA43 and NA63 experiments using the interatomic fields in aligned crystals at CERN. In the past, evidence has been obtained from successful experiments using an intense laser at the SLAC experiment E144. The possibility now exists for new experiments with intense laser light with the planned XFEL at DESY and the European Extreme Light Infrastructure. For upcoming accelerator projects, computer simulations of the first order processes in the Furry picture during the bunch-bunch collision are being performed using the programs CAIN and Guinea-Pig++. The implementation of spin dynamics in these simulation programs was reported on at the workshop. This relatively small workshop generated a very productive intermix of theoretical, experimental and computational developments covering this important field of physics. Fruitful discussions took place covering improvements to the models, estimations of the remaining theoretical uncertainties and future updates to the existing simulations. 
It was felt that ongoing workshops in the same field would be of benefit to all those involved. The organisers would like to express their sincere thanks to all of the attendees for their contributions, to the staff of the Cockcroft Institute for hosting the workshop, to the IPPP at Durham for providing substantial funding and administrative support, and to ICFA for their sponsorship. We would also like to thank IOP Publishing for their assistance in publishing our proceedings in the Journal of Physics: Conference Series.
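
    For reference, the Schwinger critical field quoted in this abstract is set by the electron mass and charge alone; a standard way of writing it in Gaussian units, together with one common form of the quantum (beamstrahlung) parameter for a particle of Lorentz factor gamma in a transverse field B, is sketched below in LaTeX.

        % Schwinger critical field (Gaussian units) and the quantum
        % (beamstrahlung) parameter; standard textbook expressions.
        \[
          B_{c} = \frac{m_e^{2} c^{3}}{e\hbar} \approx 4.41 \times 10^{13}\ \mathrm{G},
          \qquad
          \Upsilon = \gamma\,\frac{B}{B_{c}}.
        \]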

  11. Machine Learning, deep learning and optimization in computer vision

    NASA Astrophysics Data System (ADS)

    Canu, Stéphane

    2017-03-01

    As quoted in the Large Scale Computer Vision Systems NIPS workshop, computer vision is a mature field with a long tradition of research, but recent advances in machine learning, deep learning, representation learning and optimization have provided models with new capabilities to better understand visual content. The presentation will go through these new developments in machine learning, covering basic motivations, ideas, models and optimization in deep learning for computer vision, and identifying challenges and opportunities. It will focus on issues related to large-scale learning, that is: high-dimensional features, a large variety of visual classes, and a large number of examples.
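
    As a concrete illustration of the kind of model the presentation covers, the hedged sketch below runs one optimization step of a tiny convolutional classifier; the architecture, sizes, and hyperparameters are illustrative assumptions (PyTorch assumed available), not material from the talk.

        import torch
        import torch.nn as nn

        # Minimal convolutional classifier; layer sizes are illustrative only.
        class TinyConvNet(nn.Module):
            def __init__(self, num_classes=10):
                super().__init__()
                self.features = nn.Sequential(
                    nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(),
                    nn.MaxPool2d(2),
                    nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
                    nn.AdaptiveAvgPool2d(1),
                )
                self.classifier = nn.Linear(32, num_classes)

            def forward(self, x):
                return self.classifier(self.features(x).flatten(1))

        model = TinyConvNet()
        opt = torch.optim.SGD(model.parameters(), lr=0.1, momentum=0.9)
        x = torch.randn(8, 3, 32, 32)              # a toy batch of images
        y = torch.randint(0, 10, (8,))             # toy labels
        opt.zero_grad()
        loss = nn.functional.cross_entropy(model(x), y)
        loss.backward()                            # backpropagation
        opt.step()                                 # one SGD update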

  12. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jon Eisenberg, Director, CSTB

    The Computer Science and Telecommunications Board of the National Research Council considers technical and policy issues pertaining to computer science (CS), telecommunications, and information technology (IT). The functions of the board include: (1) monitoring and promoting the health of the CS, IT, and telecommunications fields, including attention as appropriate to issues of human resources and funding levels and program structures for research; (2) initiating studies involving CS, IT, and telecommunications as critical resources and sources of national economic strength; (3) responding to requests from the government, non-profit organizations, and private industry for expert advice on CS, IT, and telecommunications issues; and to requests from the government for expert advice on computer and telecommunications systems planning, utilization, and modernization; (4) fostering interaction among CS, IT, and telecommunications researchers and practitioners, and with other disciplines; and providing a base of expertise in the National Research Council in the areas of CS, IT, and telecommunications. This award has supported the overall operation of CSTB. Reports resulting from the Board's efforts have been widely disseminated in both electronic and print form, and all CSTB reports are available at its World Wide Web home page at cstb.org. The following reports, resulting from projects that were separately funded by a wide array of sponsors, were completed and released during the award period: 2007: * Summary of a Workshop on Software-Intensive Systems and Uncertainty at Scale * Social Security Administration Electronic Service Provision: A Strategic Assessment * Toward a Safer and More Secure Cyberspace * Software for Dependable Systems: Sufficient Evidence? * Engaging Privacy and Information Technology in a Digital Age * Improving Disaster Management: The Role of IT in Mitigation, Preparedness, Response, and Recovery 2006: * Renewing U.S. Telecommunications Research * Letter Report on Electronic Voting * Summary of a Workshop on the Technology, Policy, and Cultural Dimensions of Biometric Systems 2005: * Catalyzing Inquiry at the Interface of Computing and Biology * Summary of a Workshop on Using IT to Enhance Disaster Management * Asking the Right Questions About Electronic Voting * Building an Electronic Records Archive at NARA: Recommendations for a Long-Term Strategy * Signposts in Cyberspace: The Domain Name System and Internet Navigation 2004: * ITCP: Information Technology and Creative Practices (brochure) * Radio Frequency Identification (RFID) Technologies: A Workshop Summary * Getting up to Speed: The Future of Supercomputing * Summary of a Workshop on Software Certification and Dependability * Computer Science: Reflections on the Field, Reflections from the Field CSTB conducted numerous briefings of these reports and transmitted copies of these reports to researchers and key decision makers in the public and private sectors. It developed articles for journals based on several of these reports. As requested, and in fulfillment of its congressional charter to act as an independent advisor to the federal government, it arranged for congressional testimony on several of these reports. CSTB also convenes a number of workshops and other events, either as part of studies or in conjunction with meetings of the CSTB members. 
These events have included the following: two 2007 workshops explored issues and challenges related to state voter registration databases, record matching, and database interoperability. A Sept. 2007 workshop, Trends in Computing Performance, explored fundamental trends in areas such as power, storage, programming, and applications. An Oct. 2007 workshop presented highlights of CSTB's May 2007 report, Software for Dependable Systems: Sufficient Evidence?, along with several panels discussing the report's conclusions and their implications. A Jan. 2007 workshop, Uncertainty at Scale, explored engineering uncertainty, system complexity, and scale issues in developing large software systems. A Feb. 2007 workshop explored China's and India's roles in the IT R&D ecosystem; observations about the ecosystem over the long term; perspectives from serial entrepreneurs about the evolution of the ecosystem; and a cross-industry, global view of the R&D ecosystem. A Nov. 2006 event brought together participants from government, industry, and academia to share their perspectives on the health of the ecosystem, patterns of funding and investment, and the Potomac-area IT startup environment. A symposium entitled 2016, held in Oct. 2006, featured a number of distinguished speakers who shared their views on how computer science and telecommunications will look in 10 years. This well-attended event was also the subject of an Oct. 31, 2006, feature essay in the New York Times, "Computing, 2016: What Won't Be Possible?"

  13. Final Technical Report: Quantification of Uncertainty in Extreme Scale Computations (QUEST)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Knio, Omar M.

    QUEST is a SciDAC Institute comprising Sandia National Laboratories, Los Alamos National Laboratory, University of Southern California, Massachusetts Institute of Technology, University of Texas at Austin, and Duke University. The mission of QUEST is to: (1) develop a broad class of uncertainty quantification (UQ) methods/tools, and (2) provide UQ expertise and software to other SciDAC projects, thereby enabling/guiding their UQ activities. The Duke effort focused on the development of algorithms and utility software for non-intrusive sparse UQ representations, and on participation in the organization of annual workshops and tutorials to disseminate UQ tools to the community, and to gather input in order to adapt approaches to the needs of SciDAC customers. In particular, fundamental developments were made in (a) multiscale stochastic preconditioners, (b) gradient-based approaches to inverse problems, (c) adaptive pseudo-spectral approximations, (d) stochastic limit cycles, and (e) sensitivity analysis tools for noisy systems. In addition, large-scale demonstrations were performed, namely in the context of ocean general circulation models.
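
    As a hedged illustration of the non-intrusive pseudo-spectral idea named in this abstract, the sketch below projects a toy model onto a probabilists' Hermite basis using Gauss-Hermite quadrature; the model f, the polynomial order, and the quadrature size are assumptions for illustration, not QUEST software.

        import numpy as np
        from math import factorial
        from numpy.polynomial.hermite_e import hermegauss, hermeval

        def f(x):
            return np.exp(0.3 * x)        # toy stochastic response, X ~ N(0, 1)

        order, nquad = 4, 12
        x, w = hermegauss(nquad)          # nodes/weights for weight e^{-x^2/2}
        w = w / np.sqrt(2.0 * np.pi)      # normalize to the standard Gaussian

        # Non-intrusive projection: c_k = E[f(X) He_k(X)] / k!
        coeffs = []
        for k in range(order + 1):
            e_k = np.zeros(k + 1)
            e_k[k] = 1.0                  # select the k-th Hermite polynomial
            coeffs.append(np.sum(w * f(x) * hermeval(x, e_k)) / factorial(k))

        # c_0 is the mean; for this f it should match exp(0.3**2 / 2).
        print(coeffs[0], np.exp(0.3**2 / 2))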

  14. Extreme Conditions Modeling Workshop Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Coe, Ryan Geoffrey; Neary, Vincent Sinclair; Lawson, Michael J.

    2014-07-01

    Sandia National Laboratories (SNL) and the National Renewable Energy Laboratory (NREL) hosted the Wave Energy Converter (WEC) Extreme Conditions Modeling (ECM) Workshop in Albuquerque, New Mexico on May 13–14, 2014. The objective of the workshop was to review the current state of knowledge on how to numerically and experimentally model WECs in extreme conditions (e.g. large ocean storms) and to suggest how national laboratory resources could be used to improve ECM methods for the benefit of the wave energy industry. More than 30 U.S. and European WEC experts from industry, academia, and national research institutes attended the workshop, which consisted of presentations from WEC developers, invited keynote presentations from subject matter experts, breakout sessions, and a final plenary session.

  15. 77 FR 26509 - Notice of Public Meeting-Cloud Computing Forum & Workshop V

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-05-04

    ...--Cloud Computing Forum & Workshop V AGENCY: National Institute of Standards & Technology (NIST), Commerce. ACTION: Notice. SUMMARY: NIST announces the Cloud Computing Forum & Workshop V to be held on Tuesday... workshop. This workshop will provide information on the U.S. Government (USG) Cloud Computing Technology...

  16. PREFACE: Proceedings of the International Workshop on Current Challenges in Liquid and Glass Science (The Cosener's House, Abingdon, 10–12 January 2007)

    NASA Astrophysics Data System (ADS)

    Hannon, Alex C.; Salmon, Philip S.; Soper, Alan K.

    2007-10-01

    The workshop was held to discuss current experimental and theoretical challenges in liquid and glass science and to honour the contribution made by Spencer Howells (ISIS, UK) to the field of neutron scattering from liquids and glasses. The meeting was attended by 70 experimentalists, theorists and computer simulators from Europe, Japan and North America and comprised 34 oral presentations together with two lively poster sessions. Three major themes were discussed, namely (i) the glass transition and properties of liquids and glasses under extreme conditions; (ii) the complementarity of neutron and x-ray scattering techniques with other experimental methods; and (iii) the modelling of liquid and glass structure. These themes served to highlight (a) recent advances in neutron and x-ray instrumentation used to investigate liquid and glassy materials under extreme conditions; (b) the relationship between the results obtained from different experimental and theoretical/computational methods; and (c) the modern methods used to interpret experimental results. The presentations ranged from polyamorphism in liquids and glasses to protein folding in aqueous solution and included the dynamics of fresh and freeze-dried strawberries and red onions. The properties of liquid phosphorus were also memorably demonstrated! The formal highlight was the 'Spencerfest' dinner where Neil Cowlam (Sheffield, UK) gave an excellent after dinner speech. The organisation of the workshop benefited tremendously from the secretarial skills of Carole Denning (ISIS, UK). The financial support of the Council for the Central Laboratory of the Research Councils (CCLRC), the Liquids and Complex Fluids Group of the Institute of Physics, The ISIS Disordered Materials Group, the CCLRC Centre for Materials Physics and Chemistry and the CCLRC Centre for Molecular Structure and Dynamics is gratefully acknowledged. Finally, it is a pleasure to thank all the workshop participants whose lively contributions led to the success of the meeting. The present special issue stems from the interest of many of those present to collect their work into a single volume.

  17. Extreme-Scale Computing Project Aims to Advance Precision Oncology | FNLCR Staging

    Cancer.gov

    Two government agencies and five national laboratories are collaborating to develop extremely high-performance computing capabilities that will analyze mountains of research and clinical data to improve scientific understanding of cancer, predict drug response, and improve treatments for patients.

  18. Extreme Events and Disaster Risk Reduction - a Future Earth KAN initiative

    NASA Astrophysics Data System (ADS)

    Frank, Dorothea; Reichstein, Markus

    2017-04-01

    The topic of extreme events in the context of global environmental change is both scientifically challenging and exciting, and of very high societal relevance. In 2016 the Future Earth Cluster initiative E3S organized a cross-community/co-design workshop on Extreme Events and Environments from Climate to Society (http://www.e3s-future-earth.eu/index.php/ConferencesEvents/ConferencesAmpEvents). Building on the workshop's results, co-designed research strategies and established network, as well as on previous activities, E3S is striving to establish the basis for a longer-term research effort under the umbrella of Future Earth. This led to an initiative for a Future Earth Knowledge Action Network on Extreme Events and Disaster Risk Reduction. Initial key questions in this context include: What are meaningful indices to describe and quantify impact-relevant (e.g. climate) extremes? Which system properties yield resistance and resilience to extreme conditions? What are the key interactions between global urbanization processes, extreme events, and social and infrastructure vulnerability and resilience? The long-term goal of this KAN is to contribute to enhancing the resistance, resilience, and adaptive capacity of socio-ecological systems across spatial, temporal and institutional scales, in particular in the light of hazards affected by ongoing environmental change (e.g. climate change, global urbanization and land use/land cover change). This can be achieved by enhanced understanding, prediction, improved and open data and knowledge bases for detection and early-warning decision making, and by new insights on natural and societal conditions and governance for resilience and adaptive capacity.

  19. Multiscale Computation. Needs and Opportunities for BER Science

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Scheibe, Timothy D.; Smith, Jeremy C.

    2015-01-01

    The Environmental Molecular Sciences Laboratory (EMSL), a scientific user facility managed by Pacific Northwest National Laboratory for the U.S. Department of Energy, Office of Biological and Environmental Research (BER), conducted a one-day workshop on August 26, 2014 on the topic of “Multiscale Computation: Needs and Opportunities for BER Science.” Twenty invited participants, from various computational disciplines within the BER program research areas, were charged with the following objectives: identify BER-relevant models and their potential cross-scale linkages that could be exploited to better connect molecular-scale research to BER research at larger scales; and identify critical science directions that will motivate EMSL decisions regarding future computational (hardware and software) architectures.

  20. Advanced Dynamically Adaptive Algorithms for Stochastic Simulations on Extreme Scales

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Xiu, Dongbin

    2017-03-03

    The focus of the project is the development of mathematical methods and high-performance computational tools for stochastic simulations, with a particular emphasis on computations at extreme scales. The core of the project revolves around the design of highly efficient and scalable numerical algorithms that can adaptively and accurately resolve, in high-dimensional spaces, stochastic problems with limited smoothness, including those containing discontinuities.
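
    A hedged one-dimensional sketch of the adaptive idea described above: refine the sampling grid only where the response changes abruptly, so a discontinuity is resolved without uniformly fine sampling everywhere. The response g, the tolerance, and the depth limit are illustrative assumptions, not the project's algorithms.

        import math

        def g(x):
            # Toy stochastic response with a jump discontinuity at x = 0.3.
            return math.sin(x) if x < 0.3 else 2.0 + math.sin(x)

        def refine(a, b, depth=0, tol=0.5, max_depth=8):
            # Split a cell whenever the response varies too much across it.
            if depth >= max_depth or abs(g(b) - g(a)) < tol:
                return [(a, b)]
            m = 0.5 * (a + b)
            return (refine(a, m, depth + 1, tol, max_depth)
                    + refine(m, b, depth + 1, tol, max_depth))

        cells = refine(0.0, 1.0)
        print(len(cells), "cells; smallest:", min(b - a for a, b in cells))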

  1. Climate Change Extreme Events: Meeting the Information Needs of Water Resource Managers

    NASA Astrophysics Data System (ADS)

    Quay, R.; Garfin, G. M.; Dominguez, F.; Hirschboeck, K. K.; Woodhouse, C. A.; Guido, Z.; White, D. D.

    2013-12-01

    Information about climate has long been used by water managers to develop short-term and long-term plans and strategies for regional and local water resources. Inherent within longer-term forecasts is an element of uncertainty, which is particularly evident in global climate model (GCM) results for precipitation. For example, in the southwest, estimates of the flow of the Colorado River based on GCM results indicate changes from 120% of current flow to 60%. Many water resource managers are now using downscaled global climate model estimates as indications of potential climate change as part of that planning. They are addressing the uncertainty within these estimates by using an anticipatory planning approach that looks at a range of possible futures. One aspect of climate that is important for such planning is estimates of future extreme storm (short-term) and drought (long-term) events. However, the climate science of possible future changes in extreme events is less mature than general climate change science. At a recent workshop among climate scientists and water managers in the southwest, it was concluded that the science of climate change extreme events is at least a decade away from being robust enough to be useful for water managers in their water resource management activities. However, it was proposed that there are existing estimates and records of past flooding and drought events that could be combined with general climate change science to create possible future events. These derived events could be of sufficient detail to be used by water resource managers until such time as the science of extreme events is able to provide more detailed estimates. Based on the results of this workshop and other work being done by the Decision Center for a Desert City at Arizona State University and the Climate Assessment for the Southwest center at the University of Arizona, this article will 1) review the extreme event data needs of water resource managers in the southwest, 2) review the current state of extreme event climate science, 3) review what information is available about past extreme events in the southwest, 4) report the results of the 2012 workshop on climate change and extreme events, and 5) propose a method for combining this past information with current climate science information to produce estimates of possible future extreme events in sufficient detail to be useful to water resource managers.

  2. Extremely Scalable Spiking Neuronal Network Simulation Code: From Laptops to Exascale Computers.

    PubMed

    Jordan, Jakob; Ippen, Tammo; Helias, Moritz; Kitayama, Itaru; Sato, Mitsuhisa; Igarashi, Jun; Diesmann, Markus; Kunkel, Susanne

    2018-01-01

    State-of-the-art software tools for neuronal network simulations scale to the largest computing systems available today and enable investigations of large-scale networks of up to 10% of the human cortex at a resolution of individual neurons and synapses. Due to an upper limit on the number of incoming connections of a single neuron, network connectivity becomes extremely sparse at this scale. To manage computational costs, simulation software ultimately targeting the brain scale needs to fully exploit this sparsity. Here we present a two-tier connection infrastructure and a framework for directed communication among compute nodes accounting for the sparsity of brain-scale networks. We demonstrate the feasibility of this approach by implementing the technology in the NEST simulation code and we investigate its performance in different scaling scenarios of typical network simulations. Our results show that the new data structures and communication scheme prepare the simulation kernel for post-petascale high-performance computing facilities without sacrificing performance in smaller systems.
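
    The sparsity argument can be made concrete with a small, hedged sketch: store each neuron's outgoing connections in compressed (CSR-style) arrays rather than a dense adjacency matrix. Sizes here are toy values and the layout is only illustrative; NEST's actual two-tier infrastructure is considerably more involved.

        import numpy as np

        rng = np.random.default_rng(0)
        n_neurons, out_degree = 1000, 10      # toy sizes, far from brain scale

        # Per-source target lists, flattened into CSR-style index arrays.
        targets = [rng.choice(n_neurons, out_degree, replace=False)
                   for _ in range(n_neurons)]
        indptr = np.zeros(n_neurons + 1, dtype=np.int64)
        indptr[1:] = np.cumsum([len(t) for t in targets])
        indices = np.concatenate(targets)

        out42 = indices[indptr[42]:indptr[43]]   # targets of neuron 42
        print(out42)

        dense_bytes = n_neurons * n_neurons      # 1 byte per possible edge
        sparse_bytes = indices.nbytes + indptr.nbytes
        print(f"dense {dense_bytes} B vs sparse {sparse_bytes} B")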

  3. Extremely Scalable Spiking Neuronal Network Simulation Code: From Laptops to Exascale Computers

    PubMed Central

    Jordan, Jakob; Ippen, Tammo; Helias, Moritz; Kitayama, Itaru; Sato, Mitsuhisa; Igarashi, Jun; Diesmann, Markus; Kunkel, Susanne

    2018-01-01

    State-of-the-art software tools for neuronal network simulations scale to the largest computing systems available today and enable investigations of large-scale networks of up to 10% of the human cortex at a resolution of individual neurons and synapses. Due to an upper limit on the number of incoming connections of a single neuron, network connectivity becomes extremely sparse at this scale. To manage computational costs, simulation software ultimately targeting the brain scale needs to fully exploit this sparsity. Here we present a two-tier connection infrastructure and a framework for directed communication among compute nodes accounting for the sparsity of brain-scale networks. We demonstrate the feasibility of this approach by implementing the technology in the NEST simulation code and we investigate its performance in different scaling scenarios of typical network simulations. Our results show that the new data structures and communication scheme prepare the simulation kernel for post-petascale high-performance computing facilities without sacrificing performance in smaller systems. PMID:29503613

  4. 77 FR 74829 - Notice of Public Meeting-Cloud Computing and Big Data Forum and Workshop

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-12-18

    ...--Cloud Computing and Big Data Forum and Workshop AGENCY: National Institute of Standards and Technology... Standards and Technology (NIST) announces a Cloud Computing and Big Data Forum and Workshop to be held on... followed by a one-day hands-on workshop. The NIST Cloud Computing and Big Data Forum and Workshop will...

  5. A Workshop on the Gathering of Information for Problem Formulation

    DTIC Science & Technology

    1991-06-01

    the AI specialists is to design "artificially intelligent" computer environments that tutor students in much the same way that a human teacher might...tuning the interface between student and machine, and are using a technique of in situ development to tune the system toward realistic user needs....of transferability to new domains, while the latter suffers from extreme fragility: the inability to cope with any input not strictly conforming with

  6. Extreme-Scale Computing Project Aims to Advance Precision Oncology | Frederick National Laboratory for Cancer Research

    Cancer.gov

    Two government agencies and five national laboratories are collaborating to develop extremely high-performance computing capabilities that will analyze mountains of research and clinical data to improve scientific understanding of cancer, predict drug response, and improve treatments for patients.

  7. Extreme-Scale Computing Project Aims to Advance Precision Oncology | Poster

    Cancer.gov

    Two government agencies and five national laboratories are collaborating to develop extremely high-performance computing capabilities that will analyze mountains of research and clinical data to improve scientific understanding of cancer, predict drug response, and improve treatments for patients.

  8. Computational Materials Science and Chemistry: Accelerating Discovery and Innovation through Simulation-Based Engineering and Science

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Crabtree, George; Glotzer, Sharon; McCurdy, Bill

    This report is based on an SC Workshop on Computational Materials Science and Chemistry for Innovation, held July 26-27, 2010, to assess the potential of state-of-the-art computer simulations to accelerate understanding and discovery in materials science and chemistry, with a focus on potential impacts in energy technologies and innovation. The urgent demand for new energy technologies has greatly exceeded the capabilities of today's materials and chemical processes. To convert sunlight to fuel, efficiently store energy, or enable a new generation of energy production and utilization technologies requires the development of new materials and processes of unprecedented functionality and performance. New materials and processes are critical pacing elements for progress in advanced energy systems and virtually all industrial technologies. Over the past two decades, the United States has developed and deployed the world's most powerful collection of tools for the synthesis, processing, characterization, and simulation and modeling of materials and chemical systems at the nanoscale, dimensions of a few atoms to a few hundred atoms across. These tools, which include world-leading x-ray and neutron sources, nanoscale science facilities, and high-performance computers, provide an unprecedented view of the atomic-scale structure and dynamics of materials and the molecular-scale basis of chemical processes. For the first time in history, we are able to synthesize, characterize, and model materials and chemical behavior at the length scale where this behavior is controlled. This ability is transformational for the discovery process and, as a result, confers a significant competitive advantage. Perhaps the most spectacular increase in capability has been demonstrated in high performance computing. Over the past decade, computational power has increased by a factor of a million due to advances in hardware and software. This rate of improvement, which shows no sign of abating, has enabled the development of computer simulations and models of unprecedented fidelity. We are at the threshold of a new era where the integrated synthesis, characterization, and modeling of complex materials and chemical processes will transform our ability to understand and design new materials and chemistries with predictive power. In turn, this predictive capability will transform technological innovation by accelerating the development and deployment of new materials and processes in products and manufacturing. Harnessing the potential of computational science and engineering for the discovery and development of materials and chemical processes is essential to maintaining leadership in these foundational fields that underpin energy technologies and industrial competitiveness. Capitalizing on the opportunities presented by simulation-based engineering and science in materials and chemistry will require an integration of experimental capabilities with theoretical and computational modeling; the development of a robust and sustainable infrastructure to support the development and deployment of advanced computational models; and the assembly of a community of scientists and engineers to implement this integration and infrastructure. This community must extend to industry, where incorporating predictive materials science and chemistry into design tools can accelerate the product development cycle and drive economic competitiveness. 
The confluence of new theories, new materials synthesis capabilities, and new computer platforms has created an unprecedented opportunity to implement a "materials-by-design" paradigm with wide-ranging benefits in technological innovation and scientific discovery. The Workshop on Computational Materials Science and Chemistry for Innovation was convened in Bethesda, Maryland, on July 26-27, 2010. Sponsored by the Department of Energy (DOE) Offices of Advanced Scientific Computing Research and Basic Energy Sciences, the workshop brought together 160 experts in materials science, chemistry, and computational science representing more than 65 universities, laboratories, and industries, and four agencies. The workshop examined seven foundational challenge areas in materials science and chemistry: materials for extreme conditions, self-assembly, light harvesting, chemical reactions, designer fluids, thin films and interfaces, and electronic structure. Each of these challenge areas is critical to the development of advanced energy systems, and each can be accelerated by the integrated application of predictive capability with theory and experiment. The workshop concluded that emerging capabilities in predictive modeling and simulation have the potential to revolutionize the development of new materials and chemical processes. Coupled with world-leading materials characterization and nanoscale science facilities, this predictive capability provides the foundation for an innovation ecosystem that can accelerate the discovery, development, and deployment of new technologies, including advanced energy systems. Delivering on the promise of this innovation ecosystem requires the following: Integration of synthesis, processing, characterization, theory, and simulation and modeling. Many of the newly established Energy Frontier Research Centers and Energy Hubs are exploiting this integration. Achieving/strengthening predictive capability in foundational challenge areas. Predictive capability in the seven foundational challenge areas described in this report is critical to the development of advanced energy technologies. Developing validated computational approaches that span vast differences in time and length scales. This fundamental computational challenge crosscuts all of the foundational challenge areas. Similarly challenging is coupling of analytical data from multiple instruments and techniques that are required to link these length and time scales. Experimental validation and quantification of uncertainty in simulation and modeling. Uncertainty quantification becomes increasingly challenging as simulations become more complex. Robust and sustainable computational infrastructure, including software and applications. For modeling and simulation, software equals infrastructure. To validate the computational tools, software is critical infrastructure that effectively translates huge arrays of experimental data into useful scientific understanding. An integrated approach for managing this infrastructure is essential. Efficient transfer and incorporation of simulation-based engineering and science in industry. Strategies for bridging the gap between research and industrial applications and for widespread industry adoption of integrated computational materials engineering are needed.

  9. 75 FR 64258 - Cloud Computing Forum & Workshop II

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-10-19

    ... DEPARTMENT OF COMMERCE National Institute of Standards and Technology Cloud Computing Forum... workshop. SUMMARY: NIST announces the Cloud Computing Forum & Workshop II to be held on November 4 and 5, 2010. This workshop will provide information on a Cloud Computing Roadmap Strategy as well as provide...

  10. Gravitational Waves From the Kerr/CFT Correspondence

    NASA Astrophysics Data System (ADS)

    Porfyriadis, Achilleas

    Astronomical observation suggests the existence of near-extreme Kerr black holes in the sky. Properties of diffeomorphisms imply that dynamics of the near-horizon region of near-extreme Kerr are governed by an infinite-dimensional conformal symmetry. This symmetry may be exploited to analytically, rather than numerically, compute a variety of potentially observable processes. In this thesis we compute the gravitational radiation emitted by a small compact object that orbits in the near-horizon region and plunges into the horizon of a large rapidly rotating black hole. We study the holographically dual processes in the context of the Kerr/CFT correspondence and find our conformal field theory (CFT) computations in perfect agreement with the gravity results. We compute the radiation emitted by a particle on the innermost stable circular orbit (ISCO) of a rapidly spinning black hole. We confirm previous estimates of the overall scaling of the power radiated, but show that there are also small oscillations all the way to extremality. Furthermore, we reveal an intricate mode-by-mode structure in the flux to infinity, with only certain modes having the dominant scaling. The scaling of each mode is controlled by its conformal weight. Massive objects in adiabatic quasi-circular inspiral towards a near-extreme Kerr black hole quickly plunge into the horizon after passing the ISCO. The post-ISCO plunge trajectory is shown to be related by a conformal map to a circular orbit. Conformal symmetry of the near-horizon region is then used to compute analytically the gravitational radiation produced during the plunge phase. Most extreme-mass-ratio-inspirals of small compact objects into supermassive black holes end with a fast plunge from an eccentric last stable orbit. We use conformal transformations to analytically solve for the radiation emitted from various fast plunges into extreme and near-extreme Kerr black holes.

  11. Opportunities for nonvolatile memory systems in extreme-scale high-performance computing

    DOE PAGES

    Vetter, Jeffrey S.; Mittal, Sparsh

    2015-01-12

    For extreme-scale high-performance computing systems, system-wide power consumption has been identified as one of the key constraints moving forward, where DRAM main memory systems account for about 30 to 50 percent of a node's overall power consumption. As the benefits of device scaling for DRAM memory slow, it will become increasingly difficult to keep memory capacities balanced with increasing computational rates offered by next-generation processors. However, several emerging memory technologies related to nonvolatile memory (NVM) devices are being investigated as alternatives to DRAM. Moving forward, NVM devices could offer solutions for HPC architectures. Researchers are investigating how to integrate these emerging technologies into future extreme-scale HPC systems and how to expose these capabilities in the software stack and applications. In addition, current results show several of these strategies could offer high-bandwidth I/O, larger main memory capacities, persistent data structures, and new approaches for application resilience and output postprocessing, such as transaction-based incremental checkpointing and in situ visualization, respectively.
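
    One of the resilience ideas mentioned above, transaction-based incremental checkpointing, can be sketched in a few lines: persist only the blocks whose content changed since the last checkpoint. This is a hedged toy, with an in-memory dict standing in for an NVM store and an assumed block size and hashing scheme, not the mechanism of any particular HPC system.

        import hashlib
        import os

        BLOCK = 4096  # assumed checkpoint block size

        def checkpoint(data, prev_hashes, store):
            # Write only blocks whose hash differs from the last checkpoint.
            new_hashes, written = [], 0
            for i in range(0, len(data), BLOCK):
                blk = data[i:i + BLOCK]
                h = hashlib.sha256(blk).hexdigest()
                new_hashes.append(h)
                idx = i // BLOCK
                if idx >= len(prev_hashes) or prev_hashes[idx] != h:
                    store[idx] = blk      # stand-in for a write to NVM
                    written += 1
            return new_hashes, written

        store = {}
        state = bytearray(os.urandom(16 * BLOCK))
        hashes, n = checkpoint(bytes(state), [], store)
        print("first checkpoint wrote", n, "blocks")
        state[0:4] = b"\x00\x00\x00\x00"  # mutate a single block
        hashes, n = checkpoint(bytes(state), hashes, store)
        print("incremental checkpoint wrote", n, "block(s)")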

  12. New directions in mechanics

    DOE PAGES

    Kassner, Michael E.; Nemat-Nasser, Sia; Suo, Zhigang; ...

    2004-09-15

    The Division of Materials Sciences and Engineering of the US Department of Energy (DOE) sponsored a workshop to identify cutting-edge research needs and opportunities, enabled by the application of theoretical and applied mechanics. The workshop also included input from biochemical, surface science, and computational disciplines, on approaching scientific issues at the nanoscale, and the linkage of atomistic-scale with nano-, meso-, and continuum-scale mechanics. This paper is a summary of the outcome of the workshop, consisting of three main sections, each put together by a team of workshop participants. Section 1 addresses research opportunities that can be realized by the application of mechanics fundamentals to the general area of self-assembly, directed self-assembly, and fluidics. Section 2 examines the role of mechanics in biological, bioinspired, and biohybrid material systems, closely relating to and complementing the material covered in Section 1. In this manner, it was made clear that mechanics plays a fundamental role in understanding the biological functions at all scales, in seeking to utilize biology and biological techniques to develop new materials and devices, and in the general area of bionanotechnology. While direct observational investigations are an essential ingredient of new discoveries and will continue to open new exciting research doors, it is the basic need for controlled experimentation and fundamentally based modeling and computational simulations that will be truly empowered by a systematic use of the fundamentals of mechanics. Section 3 brings into focus new challenging issues in inelastic deformation and fracturing of materials that have emerged as a result of the development of nanodevices, biopolymers, and hybrid bio–abio systems. As a result, each section begins with some introductory overview comments, and then provides illustrative examples that were presented at the workshop and which are believed to highlight the enabling research areas and, particularly, the impact that mechanics can make in enhancing the fundamental understanding that can lead to new technologies.

  13. Improving Data Mobility & Management for International Cosmology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Borrill, Julian; Dart, Eli; Gore, Brooklin

    In February 2015 the third workshop in the CrossConnects series, with a focus on Improving Data Mobility & Management for International Cosmology, was held at Lawrence Berkeley National Laboratory. Scientists from fields including astrophysics, cosmology, and astronomy collaborated with experts in computing and networking to outline strategic opportunities for enhancing scientific productivity and effectively managing the ever-increasing scale of scientific data. While each field has unique details which depend on the instruments employed, the type and scale of the data, and the structure of scientific collaborations, several important themes emerged from the workshop discussions. Findings, as well as a set of recommendations, are contained in their respective sections in this report.

  14. Recovery Act - CAREER: Sustainable Silicon -- Energy-Efficient VLSI Interconnect for Extreme-Scale Computing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chiang, Patrick

    2014-01-31

    The research goal of this CAREER proposal is to develop energy-efficient VLSI interconnect circuits and systems that will facilitate future massively-parallel, high-performance computing. Extreme-scale computing will exhibit massive parallelism on multiple vertical levels, from thousands of computational units on a single processor to thousands of processors in a single data center. Unfortunately, the energy required to communicate between these units at every level (on-chip, off-chip, off-rack) will be the critical limitation to energy efficiency. Therefore, the PI's career goal is to become a leading researcher in the design of energy-efficient VLSI interconnect for future computing systems.

  15. Everyday Banality in a Documentary by Teenage Women: Between the Trivial and the Extreme. Schooling and Desiring in Contexts of Extreme Urban Poverty

    ERIC Educational Resources Information Center

    Grinberg, Silvia

    2010-01-01

    In this article, I offer some reflections on a video documentary workshop for students in the first year of middle school. The workshop, which was held in 2008, took place in a school in an area of extreme urban poverty in the metropolitan area of Buenos Aires, Argentina, specifically in one of the more and more common spaces usually called…

  16. XVis: Visualization for the Extreme-Scale Scientific-Computation Ecosystem: Mid-year report FY17 Q2

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Moreland, Kenneth D.; Pugmire, David; Rogers, David

    The XVis project brings together the key elements of research to enable scientific discovery at extreme scale. Scientific computing will no longer be purely about how fast computations can be performed. Energy constraints, processor changes, and I/O limitations necessitate significant changes in both the software applications used in scientific computation and the ways in which scientists use them. Components for modeling, simulation, analysis, and visualization must work together in a computational ecosystem, rather than working independently as they have in the past. This project provides the necessary research and infrastructure for scientific discovery in this new computational ecosystem by addressing four interlocking challenges: emerging processor technology, in situ integration, usability, and proxy analysis.

  17. XVis: Visualization for the Extreme-Scale Scientific-Computation Ecosystem: Year-end report FY17.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Moreland, Kenneth D.; Pugmire, David; Rogers, David

    The XVis project brings together the key elements of research to enable scientific discovery at extreme scale. Scientific computing will no longer be purely about how fast computations can be performed. Energy constraints, processor changes, and I/O limitations necessitate significant changes in both the software applications used in scientific computation and the ways in which scientists use them. Components for modeling, simulation, analysis, and visualization must work together in a computational ecosystem, rather than working independently as they have in the past. This project provides the necessary research and infrastructure for scientific discovery in this new computational ecosystem by addressing four interlocking challenges: emerging processor technology, in situ integration, usability, and proxy analysis.

  18. XVis: Visualization for the Extreme-Scale Scientific-Computation Ecosystem. Mid-year report FY16 Q2

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Moreland, Kenneth D.; Sewell, Christopher; Childs, Hank

    The XVis project brings together the key elements of research to enable scientific discovery at extreme scale. Scientific computing will no longer be purely about how fast computations can be performed. Energy constraints, processor changes, and I/O limitations necessitate significant changes in both the software applications used in scientific computation and the ways in which scientists use them. Components for modeling, simulation, analysis, and visualization must work together in a computational ecosystem, rather than working independently as they have in the past. This project provides the necessary research and infrastructure for scientific discovery in this new computational ecosystem by addressing four interlocking challenges: emerging processor technology, in situ integration, usability, and proxy analysis.

  19. XVis: Visualization for the Extreme-Scale Scientific-Computation Ecosystem: Year-end report FY15 Q4.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Moreland, Kenneth D.; Sewell, Christopher; Childs, Hank

    The XVis project brings together the key elements of research to enable scientific discovery at extreme scale. Scientific computing will no longer be purely about how fast computations can be performed. Energy constraints, processor changes, and I/O limitations necessitate significant changes in both the software applications used in scientific computation and the ways in which scientists use them. Components for modeling, simulation, analysis, and visualization must work together in a computational ecosystem, rather than working independently as they have in the past. This project provides the necessary research and infrastructure for scientific discovery in this new computational ecosystem by addressing four interlocking challenges: emerging processor technology, in situ integration, usability, and proxy analysis.

  20. Proceedings of the Fifth NASA/NSF/DOD Workshop on Aerospace Computational Control

    NASA Technical Reports Server (NTRS)

    Wette, M. (Editor); Man, G. K. (Editor)

    1993-01-01

    The Fifth Annual Workshop on Aerospace Computational Control was one in a series of workshops sponsored by NASA, NSF, and the DOD. The purpose of these workshops is to address computational issues in the analysis, design, and testing of flexible multibody control systems for aerospace applications. The intention in holding these workshops is to bring together users, researchers, and developers of computational tools in aerospace systems (spacecraft, space robotics, aerospace transportation vehicles, etc.) for the purpose of exchanging ideas on the state of the art in computational tools and techniques.

  1. The science of visual analysis at extreme scale

    NASA Astrophysics Data System (ADS)

    Nowell, Lucy T.

    2011-01-01

    Driven by market forces and spanning the full spectrum of computational devices, computer architectures are changing in ways that present tremendous opportunities and challenges for data analysis and visual analytic technologies. Leadership-class high-performance computing systems will have as many as a million cores by 2020 and support 10-billion-way concurrency, while laptop computers are expected to have as many as 1,000 cores by 2015. At the same time, data of all types are increasing exponentially, and automated analytic methods are essential for all disciplines. Many existing analytic technologies do not scale to make full use of current platforms, and fewer still are likely to scale to the systems that will be operational by the end of this decade. Furthermore, on the new architectures and for data at extreme scales, validating the accuracy and effectiveness of analytic methods, including visual analysis, will be increasingly important.

  2. A Portable Computer Security Workshop

    ERIC Educational Resources Information Center

    Wagner, Paul J.; Phillips, Andrew T.

    2006-01-01

    We have developed a computer security workshop designed to instruct post-secondary instructors who want to start a course or laboratory exercise sequence in computer security. This workshop has also been used to provide computer security education to IT professionals and students. It is effective in communicating basic computer security principles…

  3. Emerging and Future Computing Paradigms and Their Impact on the Research, Training, and Design Environments of the Aerospace Workforce

    NASA Technical Reports Server (NTRS)

    Noor, Ahmed K. (Compiler)

    2003-01-01

    The document contains the proceedings of the training workshop on Emerging and Future Computing Paradigms and their impact on the Research, Training and Design Environments of the Aerospace Workforce. The workshop was held at NASA Langley Research Center, Hampton, Virginia, March 18 and 19, 2003. The workshop was jointly sponsored by Old Dominion University and NASA. Workshop attendees came from NASA, other government agencies, industry and universities. The objectives of the workshop were to a) provide broad overviews of the diverse activities related to new computing paradigms, including grid computing, pervasive computing, high-productivity computing, and IBM-led autonomic computing; and b) identify future directions for research that have high potential for future aerospace workforce environments. The format of the workshop included twenty-one half-hour overview-type presentations and three exhibits by vendors.

  4. XVIS: Visualization for the Extreme-Scale Scientific-Computation Ecosystem Final Scientific/Technical Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Geveci, Berk; Maynard, Robert

    The XVis project brings together the key elements of research to enable scientific discovery at extreme scale. Scientific computing will no longer be purely about how fast computations can be performed. Energy constraints, processor changes, and I/O limitations necessitate significant changes in both the software applications used in scientific computation and the ways in which scientists use them. Components for modeling, simulation, analysis, and visualization must work together in a computational ecosystem, rather than working independently as they have in the past. The XVis project brought together collaborators from predominant DOE projects for visualization on accelerators and combined their respective features into a new visualization toolkit called VTK-m.

  5. EDITORIAL: TaCoNa-Photonics 2008 TaCoNa-Photonics 2008

    NASA Astrophysics Data System (ADS)

    Chigrin, Dmitry N.; Busch, Kurt; Lavrinenko, Andrei V.

    2009-11-01

    This special section on theoretical and computational nano-photonics features papers presented at the first International Workshop on Theoretical and Computational Nano-Photonics (TaCoNa-Photonics 2008) held in Bad Honnef, Germany, 3-5 December 2008. The workshop covered a broad range of topics related to current developments and achievements in this interdisciplinary area of research. Since the late 1960s, the word `photonics' has been understood as the science of generating, controlling, and detecting light. Nowadays, a routine fabrication of complex structures with micro- and nano-scale dimensions opens up many new and exciting possibilities in photonics. The science of generating, routing and detecting light in micro- and nano-structured matter, `nano-photonics', is becoming more important both in research and technology and offers many promising applications. The inherently sub-wavelength character of the structures that nano-photonics deals with challenges modern theoretical and computational physics and engineering with many nontrivial questions: Up to what length-scale can one use a macroscopic phenomenological description of matter? Where is the interface between the classical and quantum description of light in nano-scale structures? How can one combine different physical systems, different time- and length-scales in a single computational model? How can one engineer nano-structured materials in order to achieve the desired optical properties for particular applications? Any attempt at answering these kinds of questions is impossible without the joint efforts of physicists, engineers, applied mathematicians and programmers. This is the reason why the major goal of the TaCoNa-Photonics workshops is to provide a forum where theoreticians and specialists in numerical methods from all branches of physics, engineering sciences and mathematics can compare their results, report on novel results and breakthroughs, and discuss new challenges ahead. In order to intensify theoretical discussions and to put them on `solid' ground it was decided to invite world-leading experts in experimental photonics for plenary talks. Over three days, the workshop brought together more than 70 specialists in theoretical and computational nano-photonics. The workshop took place in the historical `Physikzentrum Bad Honnef', whose unique atmosphere supported a multitude of highly interesting debates and discussions that often lasted until midnight and beyond. Different theoretical and numerical aspects of light generation, control and detection in general inhomogeneous media, photonic crystals, plasmonic structures, metamaterials and integrated optical systems were covered in 15 invited talks and 52 contributed oral and poster presentations. The plenary talks were given by Professor M Wegener (metamaterials) and Professor W Barnes (plasmonics). This special section is a cross-sectional selection of papers which were submitted by the authors of invited and contributed oral presentations. It also includes two papers of the winners of the Best Poster Awards. We hope that these papers will enhance the interest of the scientific community regarding nano-photonics in general and regarding the TaCoNa-Photonics workshop series in particular. It is our distinct pleasure to acknowledge the generous financial support of our sponsors: Karlsruhe School of Optics & Photonics (KSOP) (Germany), U.S. Army International Technology Center-Atlantic, Research Division (USA), and the Office of Naval Research Global (USA). 
Without the organizational assistance from the International Department of the Universität Karlsruhe GmbH (Germany) this event would simply have been impossible.

  6. High Tech/High Touch: A Computer Education Leadership Development Workshop. Second Edition.

    ERIC Educational Resources Information Center

    Moursund, David

    This document contains materials and suggested activities for use in a 5-day workshop on leadership development for instructional computer coordinators, computer education teachers, workshop leaders, teachers of teachers, and other people who play a leadership role in the workshop format in small group discussions, together with sharing and…

  7. Upper extremity pain and computer use among engineering graduate students.

    PubMed

    Schlossberg, Eric B; Morrow, Sandra; Llosa, Augusto E; Mamary, Edward; Dietrich, Peter; Rempel, David M

    2004-09-01

    The objective of this study was to investigate risk factors associated with persistent or recurrent upper extremity and neck pain among engineering graduate students. A random sample of 206 Electrical Engineering and Computer Science (EECS) graduate students at a large public university completed an online questionnaire. Approximately 60% of respondents reported upper extremity or neck pain attributed to computer use and reported a mean pain severity score of 4.5 (±2.2; scale 0-10). In a final logistic regression model, female gender, years of computer use, and hours of computer use per week were significantly associated with pain. The high prevalence of upper extremity pain reported by graduate students suggests a public health need to identify interventions that will reduce symptom severity and prevent impairment.
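
    To make the modeling step concrete, the hedged sketch below fits the kind of logistic regression the abstract describes, but on synthetic data; the effect sizes are invented for illustration (scikit-learn assumed available) and do not reproduce the study's estimates.

        import numpy as np
        from sklearn.linear_model import LogisticRegression

        rng = np.random.default_rng(1)
        n = 206                                  # sample size from the abstract
        female = rng.integers(0, 2, n)
        years = rng.uniform(1, 15, n)            # years of computer use
        hours = rng.uniform(5, 60, n)            # hours of computer use per week

        # Assumed coefficients, used only to generate plausible outcomes.
        logit = -3.0 + 0.8 * female + 0.10 * years + 0.04 * hours
        pain = rng.random(n) < 1.0 / (1.0 + np.exp(-logit))

        X = np.column_stack([female, years, hours])
        model = LogisticRegression().fit(X, pain)  # default L2 regularization
        print("odds ratios:", np.exp(model.coef_).round(2))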

  8. 76 FR 13984 - Cloud Computing Forum & Workshop III

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-03-15

    ... DEPARTMENT OF COMMERCE National Institute of Standards and Technology Cloud Computing Forum... public workshop. SUMMARY: NIST announces the Cloud Computing Forum & Workshop III to be held on April 7... provide information on the NIST strategic and tactical Cloud Computing program, including progress on the...

  9. Developing a Computer Workshop To Facilitate Computer Skills and Minimize Anxiety for Early Childhood Educators.

    ERIC Educational Resources Information Center

    Wood, Eileen; Willoughby, Teena; Specht, Jacqueline; Stern-Cavalcante, Wilma; Child, Carol

    2002-01-01

    Early childhood educators were assigned to one of three instructional conditions to assess the impact of computer workshops on their level of computer anxiety, knowledge, and comfort with technology. Overall, workshops provided gains that could translate into more effective and efficient computer use in the classroom. (Author)

  10. Large Eddy Simulation ... Where Do We Stand? International Workshop Held in St. Petersburg Beach, Florida on 19-21 December 1990.

    DTIC Science & Technology

    1990-01-01

    S. Orszag, Chairman 1. P. Moin Some Issues in Computation of Turbulent Flows. 2. M. Lesieur, P. Comte, X. Normand, O. Metais and A. Silveira Spectral...Richtmyer's computational experience with one-dimensional shock waves (1950) indicated the value of a non-linear artificial viscosity. Charney and... computer architecture and the advantages of semi-Lagrangian advective schemes may lure large-scale atmospheric modelers back to finite-difference

  11. The Role of Subtropical Intrusion in the Development of Typhoon Usagi (5W) 2007

    DTIC Science & Technology

    2008-03-01

    THE MARSUPIAL PARADIGM Summarizing the DMW08 theory, the relevant theoretical paradigm is the formation of a closed proto-vortex or "embryo"...and begins to control its own destiny. By correlating the marsupial analogy for TC formation in this study, we can verify in theory that the remnants...The hurricane embryo. Talk presented at short program workshop entitled Small scale and extreme events: The Hurricane, NSF Institute for Pure and

  12. PANORAMA: An approach to performance modeling and diagnosis of extreme-scale workflows

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Deelman, Ewa; Carothers, Christopher; Mandal, Anirban

    Here we report that computational science is well established as the third pillar of scientific discovery and is on par with experimentation and theory. However, as we move closer toward the ability to execute exascale calculations and process the ensuing extreme-scale amounts of data produced by both experiments and computations alike, the complexity of managing the compute and data analysis tasks has grown beyond the capabilities of domain scientists. Therefore, workflow management systems are absolutely necessary to ensure current and future scientific discoveries. A key research question for these workflow management systems concerns the performance optimization of complex calculation and data analysis tasks. The central contribution of this article is a description of the PANORAMA approach for modeling and diagnosing the run-time performance of complex scientific workflows. This approach integrates extreme-scale systems testbed experimentation, structured analytical modeling, and parallel systems simulation into a comprehensive workflow framework called Pegasus for understanding and improving the overall performance of complex scientific workflows.
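
    As a hedged illustration of the structured analytical modeling ingredient, the sketch below estimates a toy workflow's makespan as the critical path over per-task runtimes; the task graph and the numbers are invented for illustration and are not drawn from Pegasus or PANORAMA.

        # Per-task runtimes (seconds) and dependencies for a toy pipeline.
        runtime = {"stage_in": 30, "simulate": 600, "analyze": 120, "stage_out": 45}
        deps = {"stage_in": [], "simulate": ["stage_in"],
                "analyze": ["simulate"], "stage_out": ["analyze"]}

        def finish_time(task, memo):
            # Earliest finish = latest dependency finish + own runtime.
            if task not in memo:
                start = max((finish_time(d, memo) for d in deps[task]), default=0)
                memo[task] = start + runtime[task]
            return memo[task]

        memo = {}
        print("estimated makespan:",
              max(finish_time(t, memo) for t in runtime), "s")  # 795 s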

  13. PANORAMA: An approach to performance modeling and diagnosis of extreme-scale workflows

    DOE PAGES

    Deelman, Ewa; Carothers, Christopher; Mandal, Anirban; ...

    2015-07-14

    Here we report that computational science is well established as the third pillar of scientific discovery and is on par with experimentation and theory. However, as we move closer toward the ability to execute exascale calculations and process the ensuing extreme-scale amounts of data produced by both experiments and computations alike, the complexity of managing the compute and data analysis tasks has grown beyond the capabilities of domain scientists. Therefore, workflow management systems are absolutely necessary to ensure current and future scientific discoveries. A key research question for these workflow management systems concerns the performance optimization of complex calculation and data analysis tasks. The central contribution of this article is a description of the PANORAMA approach for modeling and diagnosing the run-time performance of complex scientific workflows. This approach integrates extreme-scale systems testbed experimentation, structured analytical modeling, and parallel systems simulation into a comprehensive workflow framework called Pegasus for understanding and improving the overall performance of complex scientific workflows.

  14. 76 FR 62373 - Notice of Public Meeting-Cloud Computing Forum & Workshop IV

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-10-07

    ...--Cloud Computing Forum & Workshop IV AGENCY: National Institute of Standards and Technology (NIST), Commerce. ACTION: Notice. SUMMARY: NIST announces the Cloud Computing Forum & Workshop IV to be held on... to help develop open standards in interoperability, portability and security in cloud computing. This...

  15. Extreme-scale Algorithms and Solver Resilience

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dongarra, Jack

    A widening gap exists between the peak performance of high-performance computers and the performance achieved by complex applications running on these platforms. Over the next decade, extreme-scale systems will present major new challenges to algorithm development that could amplify this mismatch to the point that it prevents the productive use of future DOE Leadership computers, due to the following: extreme levels of parallelism due to multicore processors; an increase in system fault rates, requiring algorithms to be resilient beyond just checkpoint/restart; complex memory hierarchies and costly data movement, in both energy and performance; heterogeneous system architectures (mixing CPUs, GPUs, etc.); and conflicting goals of performance, resilience, and power requirements.
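
    Checkpoint/restart, the baseline that the report says resilient algorithms must go beyond, is straightforward to illustrate. Below is a minimal single-process sketch; the file name, state layout, and checkpoint interval are illustrative assumptions rather than anything prescribed by the report.

    ```python
    import os
    import pickle

    CHECKPOINT = "solver_state.pkl"  # illustrative path

    def save_checkpoint(step, state):
        # Write atomically so a crash mid-write cannot corrupt the checkpoint.
        tmp = CHECKPOINT + ".tmp"
        with open(tmp, "wb") as f:
            pickle.dump({"step": step, "state": state}, f)
        os.replace(tmp, CHECKPOINT)

    def load_checkpoint():
        # Resume from the last saved step, or start fresh if no checkpoint exists.
        if os.path.exists(CHECKPOINT):
            with open(CHECKPOINT, "rb") as f:
                ckpt = pickle.load(f)
            return ckpt["step"], ckpt["state"]
        return 0, {"x": 0.0}

    step, state = load_checkpoint()
    while step < 1000:
        state["x"] += 1.0     # stand-in for one solver iteration
        step += 1
        if step % 100 == 0:   # interval trades I/O cost against lost work on failure
            save_checkpoint(step, state)
    ```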

  16. Developing a Framework for Seamless Prediction of Sub-Seasonal to Seasonal Extreme Precipitation Events in the United States.

    NASA Astrophysics Data System (ADS)

    Rosendahl, D. H.; Ćwik, P.; Martin, E. R.; Basara, J. B.; Brooks, H. E.; Furtado, J. C.; Homeyer, C. R.; Lazrus, H.; Mcpherson, R. A.; Mullens, E.; Richman, M. B.; Robinson-Cook, A.

    2017-12-01

    Extreme precipitation events cause significant damage to homes, businesses, infrastructure, and agriculture, as well as many injuries and fatalities as a result of fast-moving water or waterborne diseases. In the USA, these natural hazard events claimed the lives of more than 300 people during 2015-2016 alone, with total damage reaching $24.4 billion. Prior studies of extreme precipitation events have focused on the sub-daily to sub-weekly timeframes. However, many decisions for planning, preparing, and resilience-building require sub-seasonal to seasonal timeframes (S2S; 14 to 90 days), but adequate forecasting tools for prediction do not exist. Therefore, the goal of this newly funded project is an enhanced understanding of the large-scale forcing and dynamics of S2S extreme precipitation events in the United States, and improved capability for modeling and predicting such events. Here, we describe the project goals, objectives, and research activities that will take place over the next 5 years. In this project, a unique team of scientists and stakeholders will identify and understand weather and climate processes connected with the prediction of S2S extreme precipitation events by answering these research questions: 1) What are the synoptic patterns associated with, and characteristic of, S2S extreme precipitation events in the contiguous U.S.? 2) What role, if any, do large-scale modes of climate variability play in modulating these events? 3) How predictable are S2S extreme precipitation events across temporal scales? 4) How do we create an informative prediction of S2S extreme precipitation events for policymaking and planning? This project will use observational data, high-resolution radar composites, dynamical climate models, and workshops that engage stakeholders (water resource managers, emergency managers, and tribal environmental professionals) in co-production of knowledge. The overarching result of this project will be predictive models to reduce the societal and economic impacts of extreme precipitation events. Another outcome will include statistical and co-production frameworks, which could be applied across other meteorological extremes, all time scales, and in other parts of the world to increase resilience to extreme meteorological events.

  17. The 3d International Workshop on Computational Electronics

    NASA Astrophysics Data System (ADS)

    Goodnick, Stephen M.

    1994-09-01

    The Third International Workshop on Computational Electronics (IWCE) was held at the Benson Hotel in downtown Portland, Oregon, on May 18, 19, and 20, 1994. The workshop was devoted to a broad range of topics in computational electronics related to the simulation of electronic transport in semiconductors and semiconductor devices, particularly those which use large computational resources. The workshop was supported by the National Science Foundation (NSF), the Office of Naval Research and the Army Research Office, as well as local support from the Oregon Joint Graduate Schools of Engineering and the Oregon Center for Advanced Technology Education. There were over 100 participants in the Portland workshop, of which more than one quarter represented research groups outside of the United States from Austria, Canada, France, Germany, Italy, Japan, Switzerland, and the United Kingdom. There were a total of 81 papers presented at the workshop: 9 invited talks, 26 oral presentations, and 46 poster presentations. The emphasis of the contributions reflected the interdisciplinary nature of computational electronics, with researchers from the Chemistry, Computer Science, Mathematics, Engineering, and Physics communities participating in the workshop.

  18. Report of a Workshop on the Pedagogical Aspects of Computational Thinking

    ERIC Educational Resources Information Center

    National Academies Press, 2011

    2011-01-01

    In 2008, the Computer and Information Science and Engineering Directorate of the National Science Foundation asked the National Research Council (NRC) to conduct two workshops to explore the nature of computational thinking and its cognitive and educational implications. The first workshop focused on the scope and nature of computational thinking…

  19. Extreme, Collaborative Curriculum Invention in Hawai'i

    ERIC Educational Resources Information Center

    Kelin, Daniel A., II; Jaffe, Nick; Bangerter, Neida; Wong, Randy; Kealoha; Penney-Rohner, Vicki

    2013-01-01

    This article describes ideas that came out of two workshops from a statewide Institute in Hawaii, comprised of sixty-five teaching artists, that focused on analyzing best practices. These were collaborative curriculum design workshops that yielded provocative and inspiring theoretical and practical ideas. In the first workshop, small groups of…

  20. dV/dt - Accelerating the Rate of Progress towards Extreme Scale Collaborative Science

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Livny, Miron

    This report introduces publications that present the results of a project that aimed to design a computational framework enabling computational experimentation at scale while supporting the model of “submit locally, compute globally”. The project focused on estimating application resource needs, finding the appropriate computing resources, acquiring those resources, deploying the applications and data on the resources, and managing applications and resources during runs.

  1. Workshop Report On Sustainable Urban Development

    NASA Technical Reports Server (NTRS)

    Langhoff, Stephanie; Martin, Gary; Barone, Larry; Wagener, Wolfgang

    2010-01-01

    The key workshop goal was to explore and document how NASA technologies, such as remote sensing, climate modeling, and high-end computing and visualization along with NASA assets such as Earth Observing Satellites (EOS) and Unmanned Aerial Vehicles (UAVs) can contribute to creating and managing a sustainable urban environment. The focus was on the greater Bay Area, but many aspects of the workshop were applicable to urban management at the local, regional and global scales. A secondary goal was to help NASA better understand the problems facing urban managers and to make city leaders in the Bay Area more aware of NASA's capabilities. By bringing members of these two groups together we hope to see the beginnings of new collaborations between NASA and those faced with instituting sustainable urban management in Bay Area cities.

  2. HPCCP/CAS Workshop Proceedings 1998

    NASA Technical Reports Server (NTRS)

    Schulbach, Catherine; Mata, Ellen (Editor); Schulbach, Catherine (Editor)

    1999-01-01

    This publication is a collection of extended abstracts of presentations given at the HPCCP/CAS (High Performance Computing and Communications Program/Computational Aerosciences Project) Workshop held on August 24-26, 1998, at NASA Ames Research Center, Moffett Field, California. The objective of the Workshop was to bring together the aerospace high performance computing community, consisting of airframe and propulsion companies, independent software vendors, university researchers, and government scientists and engineers. The Workshop was sponsored by the HPCCP Office at NASA Ames Research Center. The Workshop consisted of over 40 presentations, including an overview of NASA's High Performance Computing and Communications Program and the Computational Aerosciences Project; ten sessions of papers representative of the high performance computing research conducted within the Program by the aerospace industry, academia, NASA, and other government laboratories; two panel sessions; and a special presentation by Mr. James Bailey.

  3. Multidisciplinary and participatory workshops with stakeholders in a community of extreme poverty in the Peruvian Amazon: Development of priority concerns and potential health, nutrition and education interventions

    PubMed Central

    Casapia, Martin; Joseph, Serene A; Gyorkos, Theresa W

    2007-01-01

    Background: Communities of extreme poverty suffer disproportionately from a wide range of adverse outcomes, but are often neglected or underserved by organized services and research attention. In order to target the first Millennium Development Goal of eradicating extreme poverty, thereby reducing health inequalities, participatory research in these communities is needed. Therefore, the purpose of this study was to determine the priority problems and respective potential cost-effective interventions in Belen, a community of extreme poverty in the Peruvian Amazon, using a multidisciplinary and participatory focus. Methods: Two multidisciplinary and participatory workshops were conducted with important stakeholders from government, non-government and community organizations, national institutes and academic institutions. In Workshop 1, participants prioritized the main health and health-related problems in the community of Belen. Problem trees were developed to show perceived causes and effects for the top six problems. In Workshop 2, following presentations describing data from recently completed field research in school and household populations of Belen, participants listed potential interventions for the priority problems, including associated barriers, enabling factors, costs and benefits. Results: The top ten priority problems in Belen were identified as: 1) infant malnutrition; 2) adolescent pregnancy; 3) diarrhoea; 4) anaemia; 5) parasites; 6) lack of basic sanitation; 7) low level of education; 8) sexually transmitted diseases; 9) domestic violence; and 10) delayed school entry. Causes and effects for the top six problems, proposed interventions, and factors relating to the implementation of interventions were multidisciplinary in nature and included health, nutrition, education, social and environmental issues. Conclusion: The two workshops provided valuable insight into the main health and health-related problems facing the community of Belen. The participatory focus of the workshops ensured the active involvement of important stakeholders from Belen. Based on the results of the workshops, effective and essential interventions are now being planned which will contribute to reducing health inequalities in the community. PMID:17623093

  4. Performance Analysis, Design Considerations, and Applications of Extreme-Scale In Situ Infrastructures

    DOE PAGES

    Ayachit, Utkarsh; Bauer, Andrew; Duque, Earl P. N.; ...

    2016-11-01

    A key trend facing extreme-scale computational science is the widening gap between computational and I/O rates, and the challenge that follows is how to best gain insight from simulation data when it is increasingly impractical to save it to persistent storage for subsequent visual exploration and analysis. One approach to this challenge is centered around the idea of in situ processing, where visualization and analysis processing is performed while data is still resident in memory. Our paper examines several key design and performance issues related to the idea of in situ processing at extreme scale on modern platforms: scalability, overhead, performance measurement and analysis, comparison and contrast with a traditional post hoc approach, and interfacing with simulation codes. We illustrate these principles in practice with studies, conducted on large-scale HPC platforms, that include a miniapplication and multiple science application codes, one of which demonstrates in situ methods in use at greater than 1M-way concurrency.
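
    The in situ pattern the paper examines can be sketched in a few lines: an analysis routine runs on data while it is still resident in memory each timestep, so only small summaries ever reach persistent storage. The names and the statistics computed below are illustrative assumptions, not the paper's application codes.

    ```python
    import numpy as np

    def in_situ_analysis(step, field):
        # Reduce the full field to a few scalars while it is still in memory,
        # avoiding a dump of the raw state to persistent storage.
        return {"step": step, "min": float(field.min()),
                "max": float(field.max()), "mean": float(field.mean())}

    def run_simulation(steps=100, n=1_000_000, analyze_every=10):
        rng = np.random.default_rng(0)
        field = rng.standard_normal(n)              # stand-in for simulation state
        summaries = []
        for step in range(steps):
            field += 0.01 * rng.standard_normal(n)  # stand-in for a solver update
            if step % analyze_every == 0:
                summaries.append(in_situ_analysis(step, field))
        return summaries

    print(run_simulation()[:3])
    ```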

  5. Norway and Cuba Continue Collaborating to Build Capacity to Improve Weather Forecasting

    NASA Astrophysics Data System (ADS)

    Antuña, Juan Carlos; Kalnay, Eugenia; Mesquita, Michel D. S.

    2014-06-01

    The Future of Climate Extremes in the Caribbean Extreme Cuban Climate (XCUBE) project, which is funded by the Norwegian Directorate for Civil Protection as part of an assignment for the Norwegian Ministry of Foreign Affairs to support scientific cooperation between Norway and Cuba, carried out a training workshop on seasonal forecasting, reanalysis data, and the Weather Research and Forecasting (WRF) model. The workshop was a follow-up to the XCUBE workshop conducted in Havana in 2013 and provided Cuban scientists with access to expertise on seasonal forecasting, the WRF model developed by the National Center for Atmospheric Research (NCAR) and the community, data assimilation, and reanalysis.

  6. Tenth Workshop for Computational Fluid Dynamic Applications in Rocket Propulsion, part 2

    NASA Technical Reports Server (NTRS)

    Williams, R. W. (Compiler)

    1992-01-01

    Presented here are 59 abstracts and presentations and three invited presentations given at the Tenth Workshop for Computational Fluid Dynamic Applications in Rocket Propulsion held at the George C. Marshall Space Flight Center, April 28-30, 1992. The purpose of the workshop was to discuss experimental and computational fluid dynamic activities in rocket propulsion. The workshop was an open meeting for government, industry, and academia. A broad number of topics were discussed, including computational fluid dynamic methodology, liquid and solid rocket propulsion, turbomachinery, combustion, heat transfer, and grid generation.

  7. Eleventh Workshop for Computational Fluid Dynamic Applications in Rocket Propulsion

    NASA Technical Reports Server (NTRS)

    Williams, R. W. (Compiler)

    1993-01-01

    Conference publication includes 79 abstracts and presentations and 3 invited presentations given at the Eleventh Workshop for Computational Fluid Dynamic Applications in Rocket Propulsion held at George C. Marshall Space Flight Center, April 20-22, 1993. The purpose of the workshop is to discuss experimental and computational fluid dynamic activities in rocket propulsion. The workshop is an open meeting for government, industry, and academia. A broad number of topics are discussed including computational fluid dynamic methodology, liquid and solid rocket propulsion, turbomachinery, combustion, heat transfer, and grid generation.

  8. Eleventh Workshop for Computational Fluid Dynamic Applications in Rocket Propulsion, Part 1

    NASA Technical Reports Server (NTRS)

    Williams, Robert W. (Compiler)

    1993-01-01

    Conference publication includes 79 abstracts and presentations given at the Eleventh Workshop for Computational Fluid Dynamic Applications in Rocket Propulsion held at the George C. Marshall Space Flight Center, April 20-22, 1993. The purpose of this workshop is to discuss experimental and computational fluid dynamic activities in rocket propulsion. The workshop is an open meeting for government, industry, and academia. A broad number of topics are discussed including computational fluid dynamic methodology, liquid and solid rocket propulsion, turbomachinery, combustion, heat transfer, and grid generation.

  9. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Taylor, Valerie

    Given the significant impact of computing on society, it is important that all cultures, especially underrepresented cultures, are fully engaged in the field of computing to ensure that everyone benefits from the advances in computing. This proposal is focused on the field of high performance computing. The lack of cultural diversity in computing, in particular high performance computing, is especially evident with respect to the following ethnic groups – African Americans, Hispanics, and Native Americans – as well as People with Disabilities. The goal of this proposal is to organize and coordinate a National Laboratory Career Development Workshop focused on underrepresented cultures (ethnic cultures and disability cultures) in high performance computing. It is expected that the proposed workshop will increase the engagement of underrepresented cultures in HPC through increased exposure to the excellent work at the national laboratories. The National Laboratory Workshops are focused on the recruitment of senior graduate students and the retention of junior lab staff through the various panels and discussions at the workshop. Further, the workshop will include a community building component that extends beyond the workshop. The workshop was held at the Lawrence Livermore National Laboratory campus in Livermore, CA, from June 14-15, 2012. The grant provided funding for 25 participants from underrepresented groups. The workshop also included another 25 local participants in the summer programs at Lawrence Livermore National Laboratory. Below are some key results from the assessment of the workshop: 86% of the participants indicated strongly agree or agree to the statement "I am more likely to consider/continue a career at a national laboratory as a result of participating in this workshop." 77% indicated strongly agree or agree to the statement "I plan to pursue a summer internship at a national laboratory." 100% of the participants indicated strongly agree or agree to the statement "The CMD-IT NLPDEV workshop was a valuable experience."

  10. Computational Modeling of Tires

    NASA Technical Reports Server (NTRS)

    Noor, Ahmed K. (Compiler); Tanner, John A. (Compiler)

    1995-01-01

    This document contains presentations and discussions from the joint UVA/NASA Workshop on Computational Modeling of Tires. The workshop attendees represented NASA, the Army and Air Force, tire companies, commercial software developers, and academia. The workshop objectives were to assess the state of technology in the computational modeling of tires and to provide guidelines for future research.

  11. Outcomes from the DOE Workshop on Turbulent Flow Simulation at the Exascale

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sprague, Michael; Boldyrev, Stanislav; Chang, Choong-Seock

    This paper summarizes the outcomes from the Turbulent Flow Simulation at the Exascale: Opportunities and Challenges Workshop, which was held 4-5 August 2015, and was sponsored by the U.S. Department of Energy Office of Advanced Scientific Computing Research. The workshop objective was to define and describe the challenges and opportunities that computing at the exascale will bring to turbulent-flow simulations in applied science and technology. The need for accurate simulation of turbulent flows is evident across the U.S. Department of Energy applied-science and engineering portfolios, including combustion, plasma physics, nuclear-reactor physics, wind energy, and atmospheric science. The workshop brought together experts in turbulent-flow simulation, computational mathematics, and high-performance computing. Building upon previous ASCR workshops on exascale computing, participants defined a research agenda and path forward that will enable scientists and engineers to continually leverage, engage, and direct advances in computational systems on the path to exascale computing.

  12. Facilitating Co-Design for Extreme-Scale Systems Through Lightweight Simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Engelmann, Christian; Lauer, Frank

    This work focuses on tools for investigating algorithm performance at extreme scale with millions of concurrent threads and for evaluating the impact of future architecture choices to facilitate the co-design of high-performance computing (HPC) architectures and applications. The approach focuses on lightweight simulation of extreme-scale HPC systems with the needed amount of accuracy. The prototype presented in this paper is able to provide this capability using a parallel discrete event simulation (PDES), such that a Message Passing Interface (MPI) application can be executed at extreme scale, and its performance properties can be evaluated. The results of an initial prototype are encouraging as a simple 'hello world' MPI program could be scaled up to 1,048,576 virtual MPI processes on a four-node cluster, and the performance properties of two MPI programs could be evaluated at up to 16,384 virtual MPI processes on the same system.
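
    The prototype's approach, running virtual MPI processes over a discrete-event core, can be suggested with a toy sketch. The code below is a sequential event loop rather than a true parallel discrete-event simulation, and every name and parameter in it is an illustrative assumption.

    ```python
    import heapq

    class VirtualProcessSim:
        """Toy sequential event core: virtual ranks exchange timestamped messages."""
        def __init__(self, num_ranks, latency=1e-6):
            self.num_ranks = num_ranks
            self.latency = latency          # modeled per-message network latency (s)
            self.clock = [0.0] * num_ranks  # virtual time of each simulated rank
            self.events = []                # min-heap of (delivery_time, dest, payload)

        def send(self, src, dest, payload):
            # Deliver one modeled latency after the sender's current virtual time.
            heapq.heappush(self.events, (self.clock[src] + self.latency, dest, payload))

        def run(self):
            # Pop events in timestamp order and advance each receiver's clock;
            # a real simulator would hand the payload to the virtual MPI process here.
            while self.events:
                t, dest, _payload = heapq.heappop(self.events)
                self.clock[dest] = max(self.clock[dest], t)
            return max(self.clock)

    # Echo the prototype's scale: every virtual rank greets rank 0.
    sim = VirtualProcessSim(num_ranks=1_048_576)
    for rank in range(1, sim.num_ranks):
        sim.send(rank, 0, "hello")
    print(f"virtual completion time: {sim.run():.6f} s")
    ```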

  13. Thirteenth Workshop for Computational Fluid Dynamic Applications in Rocket Propulsion and Launch Vehicle Technology. Volume 2

    NASA Technical Reports Server (NTRS)

    Williams, R. W. (Compiler)

    1996-01-01

    This conference publication includes various abstracts and presentations given at the 13th Workshop for Computational Fluid Dynamic Applications in Rocket Propulsion and Launch Vehicle Technology held at the George C. Marshall Space Flight Center, April 25-27, 1995. The purpose of the workshop was to discuss experimental and computational fluid dynamic activities in rocket propulsion and launch vehicles. The workshop was an open meeting for government, industry, and academia. A broad number of topics were discussed including computational fluid dynamic methodology, liquid and solid rocket propulsion, turbomachinery, combustion, heat transfer, and grid generation.

  14. In Search of Gender Free Paradigms for Computer Science Education. [Proceedings of a Preconference Research Workshop at the National Educational Computing Conference (Nashville, Tennessee, June 24, 1990).

    ERIC Educational Resources Information Center

    Martin, C. Dianne, Ed.; Murchie-Beyma, Eric, Ed.

    This monograph includes nine papers delivered at a National Educational Computing Conference (NECC) preconference workshop, and a previously unpublished paper on gender and attitudes. The papers, which are presented in four categories, are: (1) "Report on the Workshop: In Search of Gender Free Paradigms for Computer Science Education"…

  15. Extreme-Scale De Novo Genome Assembly

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Georganas, Evangelos; Hofmeyr, Steven; Egan, Rob

    De novo whole genome assembly reconstructs genomic sequence from short, overlapping, and potentially erroneous DNA segments and is one of the most important computations in modern genomics. This work presents HipMer, a high-quality end-to-end de novo assembler designed for extreme-scale analysis, via efficient parallelization of the Meraculous code. Genome assembly software has many components, each of which stresses different components of a computer system. This chapter explains the computational challenges involved in each step of the HipMer pipeline, the key distributed data structures, and communication costs in detail. We present performance results of assembling the human genome and the large hexaploid wheat genome on large supercomputers up to tens of thousands of cores.
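
    The first stage of this style of assembly pipeline is k-mer analysis. Below is a minimal single-node sketch of k-mer counting and error filtering; HipMer itself shards this table across nodes with distributed hash tables, and the reads and parameters here are illustrative assumptions.

    ```python
    from collections import Counter

    def count_kmers(reads, k=21):
        # Tally every length-k substring across all reads; a distributed
        # assembler shards this table over a distributed hash table.
        counts = Counter()
        for read in reads:
            for i in range(len(read) - k + 1):
                counts[read[i:i + k]] += 1
        return counts

    def solid_kmers(counts, min_count=2):
        # Discard k-mers seen fewer than min_count times, which are
        # likely to contain sequencing errors.
        return {kmer for kmer, c in counts.items() if c >= min_count}

    reads = ["ACGTACGTACGTACGTACGTACGT",
             "CGTACGTACGTACGTACGTACGTA"]
    counts = count_kmers(reads, k=21)
    print(len(solid_kmers(counts)), "solid 21-mers")
    ```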

  16. Remediation of Groundwater Contaminated by Nuclear Waste

    NASA Astrophysics Data System (ADS)

    Parker, Jack; Palumbo, Anthony

    2008-07-01

    A Workshop on Accelerating Development of Practical Field-Scale Bioremediation Models; An Online Meeting, 23 January to 20 February 2008. A Web-based workshop sponsored by the U.S. Department of Energy Environmental Remediation Sciences Program (DOE/ERSP) was organized in early 2008 to assess the state of the science and knowledge gaps associated with the use of computer models to facilitate remediation of groundwater contaminated by wastes from Cold War era nuclear weapons development and production. Microbially mediated biological reactions offer a potentially efficient means to treat these sites, but considerable uncertainty exists in the coupled biological, chemical, and physical processes and their mathematical representation.

  17. Next Generation Analytic Tools for Large Scale Genetic Epidemiology Studies of Complex Diseases

    PubMed Central

    Mechanic, Leah E.; Chen, Huann-Sheng; Amos, Christopher I.; Chatterjee, Nilanjan; Cox, Nancy J.; Divi, Rao L.; Fan, Ruzong; Harris, Emily L.; Jacobs, Kevin; Kraft, Peter; Leal, Suzanne M.; McAllister, Kimberly; Moore, Jason H.; Paltoo, Dina N.; Province, Michael A.; Ramos, Erin M.; Ritchie, Marylyn D.; Roeder, Kathryn; Schaid, Daniel J.; Stephens, Matthew; Thomas, Duncan C.; Weinberg, Clarice R.; Witte, John S.; Zhang, Shunpu; Zöllner, Sebastian; Feuer, Eric J.; Gillanders, Elizabeth M.

    2012-01-01

    Over the past several years, genome-wide association studies (GWAS) have succeeded in identifying hundreds of genetic markers associated with common diseases. However, most of these markers confer relatively small increments of risk and explain only a small proportion of familial clustering. To identify obstacles to future progress in genetic epidemiology research and provide recommendations to NIH for overcoming these barriers, the National Cancer Institute sponsored a workshop entitled “Next Generation Analytic Tools for Large-Scale Genetic Epidemiology Studies of Complex Diseases” on September 15–16, 2010. The goal of the workshop was to facilitate discussions on (1) statistical strategies and methods to efficiently identify genetic and environmental factors contributing to the risk of complex disease; and (2) how to develop, apply, and evaluate these strategies for the design, analysis, and interpretation of large-scale complex disease association studies in order to guide NIH in setting the future agenda in this area of research. The workshop was organized as a series of short presentations covering scientific (gene-gene and gene-environment interaction, complex phenotypes, and rare variants and next generation sequencing) and methodological (simulation modeling and computational resources and data management) topic areas. Specific needs to advance the field were identified during each session and are summarized. PMID:22147673

  18. Transfer of computer software technology through workshops: The case of fish bioenergetics modeling

    USGS Publications Warehouse

    Johnson, B.L.

    1992-01-01

    A three-part program is proposed to promote the availability and use of computer software packages to fishery managers and researchers. The approach consists of journal articles that announce new technologies, technical reports that serve as user's guides, and hands-on workshops that provide direct instruction to new users. Workshops, which allow experienced users to directly instruct novices in software operation and application, are important but often neglected. The author's experience with organizing and conducting bioenergetics modeling workshops suggests the optimal workshop would take 2 days, have 10-15 participants, one computer for every two users, and one instructor for every 5-6 people.

  19. A Summary of Data and Findings from the First Aeroelastic Prediction Workshop

    NASA Technical Reports Server (NTRS)

    Schuster, David M.; Chwalowski, Pawel.; Heeg, Jennifer; Wieseman, Carol D.

    2012-01-01

    This paper summarizes data and findings from the first Aeroelastic Prediction Workshop (AePW) held in April 2012. The workshop was designed as a series of technical interchange meetings to assess the state of the art of computational methods for predicting unsteady flowfields and static and dynamic aeroelastic response. The goals are to provide an impartial forum to evaluate the effectiveness of existing computer codes and modeling techniques to simulate aeroelastic problems, and to identify computational and experimental areas needing additional research and development. For this initial workshop, three subject configurations were chosen from existing wind tunnel data sets where pertinent experimental data were available for comparison. Participant researchers analyzed one or more of the subject configurations and results from all of these computations were compared at the workshop. Keywords: Unsteady Aerodynamics, Aeroelasticity, Computational Fluid Dynamics, Transonic Flow, Separated Flow.

  20. PREFACE: 14th International Workshop on Advanced Computing and Analysis Techniques in Physics Research (ACAT 2011)

    NASA Astrophysics Data System (ADS)

    Teodorescu, Liliana; Britton, David; Glover, Nigel; Heinrich, Gudrun; Lauret, Jérôme; Naumann, Axel; Speer, Thomas; Teixeira-Dias, Pedro

    2012-06-01

    This volume of Journal of Physics: Conference Series is dedicated to scientific contributions presented at the 14th International Workshop on Advanced Computing and Analysis Techniques in Physics Research (ACAT 2011) which took place on 5-7 September 2011 at Brunel University, UK. The workshop series, which began in 1990 in Lyon, France, brings together computer science researchers and practitioners, and researchers from particle physics and related fields in order to explore and confront the boundaries of computing and of automatic data analysis and theoretical calculation techniques. It is a forum for the exchange of ideas among the fields, exploring and promoting cutting-edge computing, data analysis and theoretical calculation techniques in fundamental physics research. This year's edition of the workshop brought together over 100 participants from all over the world. 14 invited speakers presented key topics on computing ecosystems, cloud computing, multivariate data analysis, symbolic and automatic theoretical calculations as well as computing and data analysis challenges in astrophysics, bioinformatics and musicology. Over 80 other talks and posters presented state-of-the-art developments in the areas of the workshop's three tracks: Computing Technologies, Data Analysis Algorithms and Tools, and Computational Techniques in Theoretical Physics. Panel and round table discussions on data management and multivariate data analysis uncovered new ideas and collaboration opportunities in the respective areas. This edition of ACAT was generously sponsored by the Science and Technology Facility Council (STFC), the Institute for Particle Physics Phenomenology (IPPP) at Durham University, Brookhaven National Laboratory in the USA and Dell. We would like to thank all the participants of the workshop for the high level of their scientific contributions and for the enthusiastic participation in all its activities which were, ultimately, the key factors in the success of the workshop. Further information on ACAT 2011 can be found at http://acat2011.cern.ch. Dr Liliana Teodorescu, Brunel University. The PDF also contains details of the workshop's committees and sponsors.

  1. ASCR Workshop on Quantum Computing for Science

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Aspuru-Guzik, Alan; Van Dam, Wim; Farhi, Edward

    This report details the findings of the DOE ASCR Workshop on Quantum Computing for Science that was organized to assess the viability of quantum computing technologies to meet the computational requirements of the DOE’s science and energy mission, and to identify the potential impact of quantum technologies. The workshop was held on February 17-18, 2015, in Bethesda, MD, to solicit input from members of the quantum computing community. The workshop considered models of quantum computation and programming environments, physical science applications relevant to DOE's science mission as well as quantum simulation, and applied mathematics topics including potential quantum algorithms for linear algebra, graph theory, and machine learning. This report summarizes these perspectives into an outlook on the opportunities for quantum computing to impact problems relevant to the DOE’s mission as well as the additional research required to bring quantum computing to the point where it can have such impact.

  2. Analysis of Test Case Computations and Experiments for the First Aeroelastic Prediction Workshop

    NASA Technical Reports Server (NTRS)

    Schuster, David M.; Heeg, Jennifer; Wieseman, Carol D.; Chwalowski, Pawel

    2013-01-01

    This paper compares computational and experimental data from the Aeroelastic Prediction Workshop (AePW) held in April 2012. This workshop was designed as a series of technical interchange meetings to assess the state of the art of computational methods for predicting unsteady flowfields and static and dynamic aeroelastic response. The goals are to provide an impartial forum to evaluate the effectiveness of existing computer codes and modeling techniques to simulate aeroelastic problems and to identify computational and experimental areas needing additional research and development. Three subject configurations were chosen from existing wind-tunnel data sets where there is pertinent experimental data available for comparison. Participant researchers analyzed one or more of the subject configurations, and results from all of these computations were compared at the workshop.

  3. Proceedings of the International Workshop on Computational Electronics (3rd), Held in Portland, Oregon on May 18-20, 1994

    DTIC Science & Technology

    1994-05-20

    ...bias (power supply) and channel length were chosen... the relaxation time, field dependence of the average energy and velocity, and amount of heat flux... power supply voltage has been scaled less aggressively than device geometries. In deep submicron MOSFETs, the number of hot carriers is expected to... special attention given to issues related to power supply scaling... Each of the HD models is cast into a generalized form allowing...

  4. Coral disease and health workshop: Coral Histopathology II, July 12-14, 2005

    USGS Publications Warehouse

    Galloway, S.B.; Woodley, Cheryl M.; McLaughlin, S.M.; Work, Thierry M.; Bochsler, V.S.; Meteyer, Carol U.; Sileo, Louis; Peters, E.C.; Kramarsky-Winters, E.; Morado, J. Frank; Parnell, P.G.; Rotstein, D.S.; Harely, R.A.; Reynolds, T.L.

    2005-01-01

    An exciting highlight of this meeting was provided by Professor Robert Ogilvie (MUSC Department of Cell Biology and Anatomy) when he introduced participants to a new digital technology that is revolutionizing histology and histopathology in the medical field. The Virtual Slide technology creates digital images of histological tissue sections by computer scanning actual slides in high definition and storing the images for retrieval and viewing. Virtual slides now allow any investigator with access to a computer and the web to view, search, annotate and comment on the same tissue sections in real time. Medical and veterinary slide libraries across the country are being converted into virtual slides to enhance biomedical education, research and diagnosis. The coral health and disease researchers at this workshop deem virtual slides as a significant way to increase capabilities in coral histology and a means for pathology consultations on coral disease cases on a global scale. 

  5. Thirteenth Workshop for Computational Fluid Dynamic Applications in Rocket Propulsion and Launch Vehicle Technology. Volume 1

    NASA Technical Reports Server (NTRS)

    Williams, R. W. (Compiler)

    1996-01-01

    The purpose of the workshop was to discuss experimental and computational fluid dynamic activities in rocket propulsion and launch vehicles. The workshop was an open meeting for government, industry, and academia. A broad number of topics were discussed including computational fluid dynamic methodology, liquid and solid rocket propulsion, turbomachinery, combustion, heat transfer, and grid generation.

  6. Multi-Functional UV-Visible-IR Nanosensors Devices and Structures

    DTIC Science & Technology

    2015-04-29

    Dual-Gate MOSFET System, Proceedings of the International Workshop on Computational Electronics, Nara, Japan, Society of Micro- and Nanoelectronics, 216-217 (2013); ISBN 978-3-901578-26-7... International Workshop on Computational Electronics, Nara, Japan, Society of Micro- and Nanoelectronics, 216-217 (2013); ISBN 978-3-901578-26-7 M. S... Raman Spectroscopy, Proceedings of the International Workshop on Computational Electronics, Nara, Japan, Society of Micro- and Nanoelectronics, 198

  7. Overview of the Aeroelastic Prediction Workshop

    NASA Technical Reports Server (NTRS)

    Heeg, Jennifer; Chwalowski, Pawel; Schuster, David M.; Dalenbring, Mats

    2013-01-01

    The AIAA Aeroelastic Prediction Workshop (AePW) was held in April 2012, bringing together communities of aeroelasticians and computational fluid dynamicists. The objective in conducting this workshop on aeroelastic prediction was to assess state-of-the-art computational aeroelasticity methods as practical tools for the prediction of static and dynamic aeroelastic phenomena. No comprehensive aeroelastic benchmarking validation standard currently exists, greatly hindering validation and state-of-the-art assessment objectives. The workshop was a step towards assessing the state of the art in computational aeroelasticity. This was an opportunity to discuss and evaluate the effectiveness of existing computer codes and modeling techniques for unsteady flow, and to identify computational and experimental areas needing additional research and development. Three configurations served as the basis for the workshop, providing different levels of geometric and flow field complexity. All cases considered involved supercritical airfoils at transonic conditions. The flow fields contained oscillating shocks and, in some cases, regions of separation. The computational tools principally employed Reynolds-Averaged Navier-Stokes solutions. The successes and failures of the computations and the experiments are examined in this paper.

  8. Improving Nigerian health policymakers' capacity to access and utilize policy relevant evidence: outcome of information and communication technology training workshop.

    PubMed

    Uneke, Chigozie Jesse; Ezeoha, Abel Ebeh; Uro-Chukwu, Henry; Ezeonu, Chinonyelum Thecla; Ogbu, Ogbonnaya; Onwe, Friday; Edoga, Chima

    2015-01-01

    Information and communication technology (ICT) tools are known to facilitate communication, processing of information, and sharing of knowledge by electronic means. In Nigeria, the lack of adequate capacity in the use of ICT by health sector policymakers constitutes a major impediment to the uptake of research evidence into the policymaking process. The objective of this study was to improve the knowledge and capacity of policymakers to access and utilize policy-relevant evidence. A modified "before and after" intervention study design was used, in which outcomes were measured on the target participants both before and after the intervention was implemented. A 4-point Likert scale of adequacy (1 = grossly inadequate, 4 = very adequate) was employed. This study was conducted in Ebonyi State, south-eastern Nigeria, and the participants were career health policymakers. A two-day intensive ICT training workshop was organized for policymakers, with 52 participants in attendance. Topics covered included: (i) intersectoral partnership/collaboration; (ii) engaging ICT in evidence-informed policymaking; (iii) use of ICT for evidence synthesis; and (iv) capacity development on the use of computers, the internet, and other ICT. The pre-workshop means of knowledge and capacity for use of ICT ranged from 2.19 to 3.05, while the post-workshop means ranged from 2.67 to 3.67 on the 4-point scale. The percentage increase in mean knowledge and capacity at the end of the workshop ranged from 8.3% to 39.1%. Findings of this study suggest that policymakers' ICT competence relevant to evidence-informed policymaking can be enhanced through a training workshop.
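
    The reported gains are relative changes of post-workshop means over pre-workshop means. A quick worked check of that arithmetic follows, using one hypothetical pre/post pair, since the abstract reports only the ranges rather than per-item values.

    ```python
    def pct_increase(pre, post):
        # Relative change of the post-workshop mean over the pre-workshop mean,
        # the statistic behind the reported 8.3%-39.1% range.
        return (post - pre) / pre * 100.0

    # Hypothetical per-item pair taken from the reported range endpoints:
    print(f"{pct_increase(2.19, 2.67):.1f}%")  # -> 21.9%
    ```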

  9. Improving Nigerian health policymakers’ capacity to access and utilize policy relevant evidence: outcome of information and communication technology training workshop

    PubMed Central

    Uneke, Chigozie Jesse; Ezeoha, Abel Ebeh; Uro-Chukwu, Henry; Ezeonu, Chinonyelum Thecla; Ogbu, Ogbonnaya; Onwe, Friday; Edoga, Chima

    2015-01-01

    Information and communication technology (ICT) tools are known to facilitate communication, processing of information, and sharing of knowledge by electronic means. In Nigeria, the lack of adequate capacity in the use of ICT by health sector policymakers constitutes a major impediment to the uptake of research evidence into the policymaking process. The objective of this study was to improve the knowledge and capacity of policymakers to access and utilize policy-relevant evidence. A modified “before and after” intervention study design was used, in which outcomes were measured on the target participants both before and after the intervention was implemented. A 4-point Likert scale of adequacy (1 = grossly inadequate, 4 = very adequate) was employed. This study was conducted in Ebonyi State, south-eastern Nigeria, and the participants were career health policymakers. A two-day intensive ICT training workshop was organized for policymakers, with 52 participants in attendance. Topics covered included: (i) intersectoral partnership/collaboration; (ii) engaging ICT in evidence-informed policymaking; (iii) use of ICT for evidence synthesis; and (iv) capacity development on the use of computers, the internet, and other ICT. The pre-workshop means of knowledge and capacity for use of ICT ranged from 2.19 to 3.05, while the post-workshop means ranged from 2.67 to 3.67 on the 4-point scale. The percentage increase in mean knowledge and capacity at the end of the workshop ranged from 8.3% to 39.1%. Findings of this study suggest that policymakers’ ICT competence relevant to evidence-informed policymaking can be enhanced through a training workshop. PMID:26448807

  10. Third Computational Aeroacoustics (CAA) Workshop on Benchmark Problems

    NASA Technical Reports Server (NTRS)

    Dahl, Milo D. (Editor)

    2000-01-01

    The proceedings of the Third Computational Aeroacoustics (CAA) Workshop on Benchmark Problems cosponsored by the Ohio Aerospace Institute and the NASA Glenn Research Center are the subject of this report. Fan noise was the chosen theme for this workshop, with representative problems encompassing four of the six benchmark problem categories. The other two categories were related to jet noise and cavity noise. For the first time in this series of workshops, the computational results for the cavity noise problem were compared to experimental data. All the other problems had exact solutions, which are included in this report. The Workshop included a panel discussion by representatives of industry. The participants gave their views on the status of applying computational aeroacoustics to solve practical industry-related problems and what issues need to be addressed to make CAA a robust design tool.

  11. The Pilot Land Data System: Report of the Program Planning Workshops

    NASA Technical Reports Server (NTRS)

    1984-01-01

    An advisory report to be used by NASA in developing a program plan for a Pilot Land Data System (PLDS) was developed. The purpose of the PLDS is to improve the ability of NASA and NASA-sponsored researchers to conduct land-related research. The goal of the planning workshops was to provide and coordinate planning and concept development between the land-related science and computer science disciplines, and to discuss the architecture of the PLDS, requirements for information science technology, and system evaluation. The findings and recommendations of the Working Group are presented. The pilot program establishes a limited-scale distributed information system to explore scientific, technical, and management approaches to satisfying the needs of the land science community. The PLDS paves the way for a land data system to improve data access, processing, transfer, and analysis, so that land sciences information synthesis can occur on a scale not previously permitted because of limits to data assembly and access.

  12. Does one workshop on respecting cultural differences increase health professionals' confidence to improve the care of Australian Aboriginal patients with cancer? An evaluation.

    PubMed

    Durey, Angela; Halkett, Georgia; Berg, Melissa; Lester, Leanne; Kickett, Marion

    2017-09-15

    Aboriginal Australians have worse cancer survival rates than other Australians. Reasons include fear of a cancer diagnosis, reluctance to attend mainstream health services and discrimination from health professionals. Offering health professionals education in care focusing on Aboriginal patients' needs is important. The aim of this paper was to evaluate whether participating in a workshop improved the confidence of radiation oncology health professionals in their knowledge, communication and ability to offer culturally safe healthcare to Aboriginal Australians with cancer. Mixed methods were used, with pre- and post-workshop online surveys and another survey delivered 2 months later. Statistical analysis determined the relative proportion of participants who changed from not at all/a little confident at baseline to fairly/extremely confident immediately and 2 months after the workshop. Factor analysis identified underlying dimensions in the items and nonparametric tests recorded changes in mean dimension scores over and between times. Qualitative data were analysed for emerging themes. Fifty-nine participants attended the workshops, 39 (66% response rate) completed pre-workshop surveys, 32 (82% of study participants) completed post-workshop surveys and 25 (64% of study participants) completed surveys 2 months later. A significant increase in the proportion of attendees who reported fair/extreme confidence within 2 days of the workshop was found in nine of 14 items, which was sustained for all but one item 2 months later. Two additional items had a significant increase in the proportion of fair/extremely confident attendees 2 months post workshop compared to baseline. An exploratory factor analysis identified three dimensions: communication; relationships; and awareness. All dimensions' mean scores significantly improved within 2 days (p < 0.005) and persisted to 2 months. The workshop raised awareness about barriers and enablers to delivering services respectful of cultural differences, and led to a willingness to reflect on pre-existing beliefs and assumptions about Aboriginal Australians that in some cases resulted in improved care. Single workshops co-delivered by an Aboriginal and non-Aboriginal presenter can be effective in building health professionals' confidence and translating into practice knowledge of respectful care of Aboriginal patients with cancer. Sustaining improvements may require integrating this approach into ongoing professional development.

  13. Improving Data Mobility & Management for International Cosmology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Borrill, Julian; Dart, Eli; Gore, Brooklin

    In February 2015 the third workshop in the CrossConnects series, with a focus on Improving Data Mobility & Management for International Cosmology, was held at Lawrence Berkeley National Laboratory. Scientists from fields including astrophysics, cosmology, and astronomy collaborated with experts in computing and networking to outline strategic opportunities for enhancing scientific productivity and effectively managing the ever-increasing scale of scientific data.

  14. Second Computational Aeroacoustics (CAA) Workshop on Benchmark Problems

    NASA Technical Reports Server (NTRS)

    Tam, C. K. W. (Editor); Hardin, J. C. (Editor)

    1997-01-01

    The proceedings of the Second Computational Aeroacoustics (CAA) Workshop on Benchmark Problems held at Florida State University are the subject of this report. For this workshop, problems arising in typical industrial applications of CAA were chosen. Comparisons between numerical solutions and exact solutions are presented where possible.

  15. Improving the Accuracy of Estimation of Climate Extremes

    NASA Astrophysics Data System (ADS)

    Zolina, Olga; Detemmerman, Valery; Trenberth, Kevin E.

    2010-12-01

    Workshop on Metrics and Methodologies of Estimation of Extreme Climate Events; Paris, France, 27-29 September 2010. Climate projections point toward more frequent and intense weather and climate extremes, such as heat waves, droughts, and floods, in a warmer climate. These projections, together with recent extreme climate events, including flooding in Pakistan and the heat wave and wildfires in Russia, highlight the need for improved risk assessments to help decision makers and the public. But accurate analysis and prediction of risk of extreme climate events require new methodologies and information from diverse disciplines. A recent workshop sponsored by the World Climate Research Programme (WCRP) and hosted at United Nations Educational, Scientific and Cultural Organization (UNESCO) headquarters in France brought together, for the first time, a unique mix of climatologists, statisticians, meteorologists, oceanographers, social scientists, and risk managers (such as those from insurance companies) who sought ways to improve scientists' ability to characterize and predict climate extremes in a changing climate.

  16. Workshop in computational molecular biology, April 15, 1991--April 14, 1994

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tavare, S.

    Funds from this award were used to support the Workshop in Computational Molecular Biology at the '91 Symposium entitled Interface: Computing Science and Statistics, Seattle, Washington, April 21, 1991; the Workshop in Statistical Issues in Molecular Biology held at Stanford, California, August 8, 1993; and the Session on Population Genetics, a part of the 56th Annual Meeting, Institute of Mathematical Statistics, San Francisco, California, August 9, 1993.

  17. Web Based Parallel Programming Workshop for Undergraduate Education.

    ERIC Educational Resources Information Center

    Marcus, Robert L.; Robertson, Douglass

    Central State University (Ohio), under a contract with Nichols Research Corporation, has developed a World Wide Web-based workshop on high performance computing entitled "IBM SP2 Parallel Programming Workshop." The research is part of the DoD (Department of Defense) High Performance Computing Modernization Program. The research…

  18. Welcome to the NASA High Performance Computing and Communications Computational Aerosciences (CAS) Workshop 2000

    NASA Technical Reports Server (NTRS)

    Schulbach, Catherine H. (Editor)

    2000-01-01

    The purpose of the CAS workshop is to bring together NASA's scientists and engineers and their counterparts in industry, other government agencies, and academia working in the Computational Aerosciences and related fields. This workshop is part of the technology transfer plan of the NASA High Performance Computing and Communications (HPCC) Program. Specific objectives of the CAS workshop are to: (1) communicate the goals and objectives of HPCC and CAS, (2) promote and disseminate CAS technology within the appropriate technical communities, including NASA, industry, academia, and other government labs, (3) help promote synergy among CAS and other HPCC scientists, and (4) permit feedback from peer researchers on issues facing High Performance Computing in general and the CAS project in particular. This year we had a number of exciting presentations in the traditional aeronautics, aerospace sciences, and high-end computing areas and in the less familiar (to many of us affiliated with CAS) earth science, space science, and revolutionary computing areas. Presentations of more than 40 high quality papers were organized into ten sessions and presented over the three-day workshop. The proceedings are organized here for easy access: by author, title and topic.

  19. 'Cloud computing' and clinical trials: report from an ECRIN workshop.

    PubMed

    Ohmann, Christian; Canham, Steve; Danielyan, Edgar; Robertshaw, Steve; Legré, Yannick; Clivio, Luca; Demotes, Jacques

    2015-07-29

    Growing use of cloud computing in clinical trials prompted the European Clinical Research Infrastructures Network, a European non-profit organisation established to support multinational clinical research, to organise a one-day workshop on the topic to clarify potential benefits and risks. The issues that arose in that workshop are summarised and include the following: the nature of cloud computing and the cloud computing industry; the risks in using cloud computing services now; the lack of explicit guidance on this subject, both generally and with reference to clinical trials; and some possible ways of reducing risks. There was particular interest in developing and using a European 'community cloud' specifically for academic clinical trial data. It was recognised that the day-long workshop was only the start of an ongoing process. Future discussion needs to include clarification of trial-specific regulatory requirements for cloud computing and involve representatives from the relevant regulatory bodies.

  20. Computational Materials Research

    NASA Technical Reports Server (NTRS)

    Hinkley, Jeffrey A. (Editor); Gates, Thomas S. (Editor)

    1996-01-01

    Computational Materials aims to model and predict thermodynamic, mechanical, and transport properties of polymer matrix composites. This workshop, the second coordinated by NASA Langley, reports progress in measurements and modeling at a number of length scales: atomic, molecular, nano, and continuum. Assembled here are presentations on quantum calculations for force field development, molecular mechanics of interfaces, molecular weight effects on mechanical properties, molecular dynamics applied to poling of polymers for electrets, Monte Carlo simulation of aromatic thermoplastics, thermal pressure coefficients of liquids, ultrasonic elastic constants, group additivity predictions, bulk constitutive models, and viscoplasticity characterization.

  1. Grand Challenges of Advanced Computing for Energy Innovation Report from the Workshop Held July 31-August 2, 2012

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Larzelere, Alex R.; Ashby, Steven F.; Christensen, Dana C.

    2013-03-06

    On July 31-August 2 of 2012, the U.S. Department of Energy (DOE) held a workshop entitled Grand Challenges of Advanced Computing for Energy Innovation. This workshop built on three earlier workshops that clearly identified the potential for the Department and its national laboratories to enable energy innovation. The specific goal of the workshop was to identify the key challenges that the nation must overcome to apply the full benefit of taxpayer-funded advanced computing technologies to U.S. energy innovation in the ways that the country produces, moves, stores, and uses energy. Perhaps more importantly, the workshop also developed a set of recommendations to help the Department overcome those challenges. These recommendations provide an action plan for what the Department can do in the coming years to improve the nation's energy future.

  2. Effects of ergonomic intervention on work-related upper extremity musculoskeletal disorders among computer workers: a randomized controlled trial.

    PubMed

    Esmaeilzadeh, Sina; Ozcan, Emel; Capan, Nalan

    2014-01-01

    The aim of the study was to determine the effects of ergonomic intervention on work-related upper extremity musculoskeletal disorders (WUEMSDs) among computer workers. Four hundred computer workers answered a questionnaire on work-related upper extremity musculoskeletal symptoms (WUEMSS). Ninety-four subjects with WUEMSS using computers at least 3 h a day participated in a prospective, randomized controlled 6-month intervention. Body posture and workstation layouts were assessed by the Ergonomic Questionnaire. We used the Visual Analogue Scale to assess the intensity of WUEMSS. The Upper Extremity Function Scale was used to evaluate functional limitations at the neck and upper extremities. Health-related quality of life was assessed with the Short Form-36. After baseline assessment, those in the intervention group participated in a multicomponent ergonomic intervention program including comprehensive ergonomic training consisting of two interactive sessions, an ergonomic training brochure, and workplace visits with workstation adjustments. Follow-up assessment was conducted after 6 months. In the intervention group, body posture (p < 0.001) and workstation layout (p = 0.002) improved over 6 months; furthermore, the intensity (p < 0.001), duration (p < 0.001), and frequency (p = 0.009) of WUEMSS decreased significantly in the intervention group compared with the control group. Additionally, functional status (p = 0.001) and physical (p < 0.001) and mental (p = 0.035) health-related quality of life improved significantly compared with the controls. There was no improvement in work-day loss due to WUEMSS (p > 0.05). Ergonomic intervention programs may be effective in reducing ergonomic risk factors among computer workers and consequently in the secondary prevention of WUEMSDs.

  3. Computational Intelligence and Its Impact on Future High-Performance Engineering Systems

    NASA Technical Reports Server (NTRS)

    Noor, Ahmed K. (Compiler)

    1996-01-01

    This document contains presentations from the joint UVA/NASA Workshop on Computational Intelligence held at the Virginia Consortium of Engineering and Science Universities, Hampton, Virginia, June 27-28, 1995. The presentations addressed activities in the areas of fuzzy logic, neural networks, and evolutionary computation. Workshop attendees represented NASA, the National Science Foundation, the Department of Energy, the National Institute of Standards and Technology (NIST), the Jet Propulsion Laboratory, industry, and academia. The workshop objectives were to assess the state of technology in the computational intelligence area and to provide guidelines for future research.

  4. Applied Information Systems Research Program (AISRP). Workshop 2: Meeting Proceedings

    NASA Technical Reports Server (NTRS)

    1992-01-01

    The Earth and space science participants were able to see where the current research can be applied in their disciplines and computer science participants could see potential areas for future application of computer and information systems research. The Earth and Space Science research proposals for the High Performance Computing and Communications (HPCC) program were under evaluation. Therefore, this effort was not discussed at the AISRP Workshop. OSSA's other high priority area in computer science is scientific visualization, with the entire second day of the workshop devoted to it.

  5. Quaternary of Himalaya

    NASA Astrophysics Data System (ADS)

    Srivastava, Pradeep; Singh, Vimal

    2017-05-01

    The tectonically active Himalayan mountains evolve via feedbacks between deep-earth and surface processes; the complex interaction of these processes results in a landscape that is dynamic at both long and short time scales. Extreme hydrological events, possibly riding over a long-term climate cycle, bring about the landscape changes that most directly affect human societies; in the Himalaya these events frequently cause great damage to economies and human lives. The geological community, under the umbrella of the 30th Himalaya-Karakorum-Tibet (HKT) workshop, convened a special session and deliberated on the subject. This special issue, "Quaternary of Himalaya", is an outcome of the papers presented and discussions held during that session; it consists of 18 papers in three sub-themes: (i) Extreme Events in Himalaya, (ii) Paleoglaciation in Himalaya, and (iii) Expressions of Climate and Neotectonics in Himalaya.

  6. 78 FR 54453 - Notice of Public Meeting-Intersection of Cloud Computing and Mobility Forum and Workshop

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-09-04

    ...--Intersection of Cloud Computing and Mobility Forum and Workshop AGENCY: National Institute of Standards and.../intersection-of-cloud-and-mobility.cfm . SUPPLEMENTARY INFORMATION: NIST hosted six prior Cloud Computing Forum... interoperability, portability, and security, discuss the Federal Government's experience with cloud computing...

  7. The changing landscape of astrostatistics and astroinformatics

    NASA Astrophysics Data System (ADS)

    Feigelson, Eric D.

    2017-06-01

    The history and current status of the cross-disciplinary fields of astrostatistics and astroinformatics are reviewed. Astronomers need a wide range of statistical methods for both data reduction and science analysis. With the proliferation of high-throughput telescopes, efficient large scale computational methods are also becoming essential. However, astronomers receive only weak training in these fields during their formal education. Interest in the fields is rapidly growing with conferences organized by scholarly societies, textbooks and tutorial workshops, and research studies pushing the frontiers of methodology. R, the premier language of statistical computing, can provide an important software environment for the incorporation of advanced statistical and computational methodology into the astronomical community.

  8. H2@Scale Workshop Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pivovar, Bryan

    2017-03-31

    Final report from the H2@Scale Workshop held November 16-17, 2016, at the National Renewable Energy Laboratory in Golden, Colorado. The U.S. Department of Energy's National Renewable Energy Laboratory hosted a technology workshop to identify the current barriers and research needs of the H2@Scale concept. H2@Scale is a concept regarding the potential for wide-scale impact of hydrogen produced from diverse domestic resources to enhance U.S. energy security and enable growth of innovative technologies and domestic industries. Feedback received from a diverse set of stakeholders at the workshop will guide the development of an H2@Scale roadmap for research, development, and early-stage demonstration activities that can enable hydrogen as an energy carrier at a national scale.

  9. Predictive Anomaly Management for Resilient Virtualized Computing Infrastructures

    DTIC Science & Technology

    2015-05-27

    PREC: Practical Root Exploit Containment for Android Devices, ACM Conference on Data and Application Security and Privacy (CODASPY), 03-MAR-14. ... 05-OCT-11: Hiep Nguyen, Yongmin Tan, Xiaohui Gu. Propagation-aware Anomaly Localization for Cloud Hosted Distributed Applications, ACM ... Workshop on Managing Large-Scale Systems via the Analysis of System Logs and the Application of Machine Learning Techniques (SLAML), in conjunction with SOSP

  10. Wafer level reliability testing: An idea whose time has come

    NASA Technical Reports Server (NTRS)

    Trapp, O. D.

    1987-01-01

    Wafer level reliability testing has been nurtured in the DARPA-supported workshops held each autumn since 1982. The seeds planted in 1982 have produced an active crop of very-large-scale integration manufacturers applying wafer-level reliability test methods. Computer Aided Reliability (CAR) is a new seed being nurtured. Users are now awakening to the huge economic value of wafer-level reliability testing technology.

  11. Computer Science Research at Langley

    NASA Technical Reports Server (NTRS)

    Voigt, S. J. (Editor)

    1982-01-01

    A workshop was held at Langley Research Center, November 2-5, 1981, to highlight ongoing computer science research at Langley and to identify additional areas of research based upon the computer user requirements. A panel discussion was held in each of nine application areas, and these are summarized in the proceedings. Slides presented by the invited speakers are also included. A survey of scientific, business, data reduction, and microprocessor computer users helped identify areas of focus for the workshop. Several areas of computer science which are of most concern to the Langley computer users were identified during the workshop discussions. These include graphics, distributed processing, programmer support systems and tools, database management, and numerical methods.

  12. Turbulent Flow Simulation at the Exascale: Opportunities and Challenges Workshop: August 4-5, 2015, Washington, D.C.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sprague, Michael A.; Boldyrev, Stanislav; Fischer, Paul

    This report details the impact exascale will bring to turbulent-flow simulations in applied science and technology. The need for accurate simulation of turbulent flows is evident across the DOE applied-science and engineering portfolios, including combustion, plasma physics, nuclear-reactor physics, wind energy, and atmospheric science. The workshop brought together experts in turbulent-flow simulation, computational mathematics, and high-performance computing. Building upon previous ASCR workshops on exascale computing, participants defined a research agenda and path forward that will enable scientists and engineers to continually leverage, engage, and direct advances in computational systems on the path to exascale computing.

  13. Industry-Wide Workshop on Computational Turbulence Modeling

    NASA Technical Reports Server (NTRS)

    Shabbir, Aamir (Compiler)

    1995-01-01

    This publication contains the presentations made at the Industry-Wide Workshop on Computational Turbulence Modeling which took place on October 6-7, 1994. The purpose of the workshop was to initiate the transfer of technology developed at Lewis Research Center to industry and to discuss the current status and the future needs of turbulence models in industrial CFD.

  14. Simulation research on the process of large scale ship plane segmentation intelligent workshop

    NASA Astrophysics Data System (ADS)

    Xu, Peng; Liao, Liangchuang; Zhou, Chao; Xue, Rui; Fu, Wei

    2017-04-01

    The large-scale ship plane-segmentation intelligent workshop is a new development, with no prior research in related fields at home or abroad. The mode of production must be transformed from the existing Industry 2.0 (or partial Industry 3.0) practice of "human-brain analysis and judgment + machine manufacturing" to "machine analysis and judgment + machine manufacturing". This transformation raises a great many questions of management and technology, such as the evolution of workshop structure, the development of intelligent equipment, and changes in the business model; together these amount to a reformation of the whole workshop. The process simulation in this project verifies the general layout and process flow of the large-scale ship plane-segmentation intelligent workshop and analyzes its working efficiency, which is significant for the next step of the transformation.

  15. Plans and Example Results for the 2nd AIAA Aeroelastic Prediction Workshop

    NASA Technical Reports Server (NTRS)

    Heeg, Jennifer; Chwalowski, Pawel; Schuster, David M.; Raveh, Daniella; Jirasek, Adam; Dalenbring, Mats

    2015-01-01

    This paper summarizes the plans for the second AIAA Aeroelastic Prediction Workshop. The workshop is designed to assess the state-of-the-art of computational methods for predicting unsteady flow fields and aeroelastic response. The goals are to provide an impartial forum to evaluate the effectiveness of existing computer codes and modeling techniques, and to identify computational and experimental areas needing additional research and development. This paper provides guidelines and instructions for participants including the computational aerodynamic model, the structural dynamic properties, the experimental comparison data and the expected output data from simulations. The Benchmark Supercritical Wing (BSCW) has been chosen as the configuration for this workshop. The analyses to be performed will include aeroelastic flutter solutions of the wing mounted on a pitch-and-plunge apparatus.

  16. xSDK Foundations: Toward an Extreme-scale Scientific Software Development Kit

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Heroux, Michael A.; Bartlett, Roscoe; Demeshko, Irina

    2017-03-01

    Extreme-scale computational science increasingly demands multiscale and multiphysics formulations. Combining software developed by independent groups is imperative: no single team has resources for all predictive science and decision support capabilities. Scientific libraries provide high-quality, reusable software components for constructing applications with improved robustness and portability. However, without coordination, many libraries cannot be easily composed. Namespace collisions, inconsistent arguments, lack of third-party software versioning, and additional difficulties make composition costly. The Extreme-scale Scientific Software Development Kit (xSDK) defines community policies to improve code quality and compatibility across independently developed packages (hypre, PETSc, SuperLU, Trilinos, and Alquimia) and provides a foundation for addressing broader issues in software interoperability, performance portability, and sustainability. The xSDK provides turnkey installation of member software and seamless combination of aggregate capabilities, and it marks the first steps toward extreme-scale scientific software ecosystems from which future applications can be composed rapidly with assured quality and scalability.

  17. Data mining and computationally intensive methods: summary of Group 7 contributions to Genetic Analysis Workshop 13.

    PubMed

    Costello, Tracy J; Falk, Catherine T; Ye, Kenny Q

    2003-01-01

    The Framingham Heart Study data, as well as a related simulated data set, were generously provided to the participants of the Genetic Analysis Workshop 13 in order that newly developed and emerging statistical methodologies could be tested on that well-characterized data set. The impetus driving the development of novel methods is to elucidate the contributions of genes, environment, and interactions between and among them, as well as to allow comparison between and validation of methods. The seven papers that comprise this group used data-mining methodologies (tree-based methods, neural networks, discriminant analysis, and Bayesian variable selection) in an attempt to identify the underlying genetics of cardiovascular disease and related traits in the presence of environmental and genetic covariates. Data-mining strategies are gaining popularity because they are extremely flexible and may have greater efficiency and potential in identifying the factors involved in complex disorders. While the methods grouped together here constitute a diverse collection, some papers asked similar questions with very different methods, while others used the same underlying methodology to ask very different questions. This paper briefly describes the data-mining methodologies applied to the Genetic Analysis Workshop 13 data sets and the results of those investigations. Copyright 2003 Wiley-Liss, Inc.

  18. Workshop on Scaling Effects in Composite Materials and Structures

    NASA Technical Reports Server (NTRS)

    Jackson, Karen E. (Compiler)

    1994-01-01

    This document contains presentations and abstracts from the Workshop on Scaling Effects in Composite Materials and Structures jointly sponsored by NASA Langley Research Center, Virginia Tech, and the Institute for Mechanics and Materials at the University of California, San Diego, and held at NASA Langley on November 15-16, 1993. Workshop attendees represented NASA, other government research labs, the aircraft/rotorcraft industry, and academia. The workshop objectives were to assess the state-of-technology in scaling effects in composite materials and to provide guidelines for future research.

  19. ICASE/LaRC Workshop on Benchmark Problems in Computational Aeroacoustics (CAA)

    NASA Technical Reports Server (NTRS)

    Hardin, Jay C. (Editor); Ristorcelli, J. Ray (Editor); Tam, Christopher K. W. (Editor)

    1995-01-01

    The proceedings of the Benchmark Problems in Computational Aeroacoustics Workshop held at NASA Langley Research Center are the subject of this report. The purpose of the Workshop was to assess the utility of a number of numerical schemes in the context of the unusual requirements of aeroacoustic calculations. The schemes were assessed from the viewpoint of dispersion and dissipation -- issues important to long-time integration and long-distance propagation in aeroacoustics. Also investigated were the effects of implementing different boundary conditions. The Workshop included a forum in which practical engineering problems related to computational aeroacoustics were discussed. This discussion took the form of a dialogue between an industrial panel and the workshop participants and was an effort to suggest the direction of evolution of this field in the context of current engineering needs.

  1. NERSC News

    Science.gov Websites

    Deep Learning at 15 PFlops Enables Training for Extreme Weather Identification at Scale, March 29, 2018.

  2. Self-directed student research through analysis of microarray datasets: a computer-based functional genomics practical class for masters-level students.

    PubMed

    Grenville-Briggs, Laura J; Stansfield, Ian

    2011-01-01

    This report describes a linked series of Masters-level computer practical workshops. They comprise an advanced functional genomics investigation, based upon analysis of a microarray dataset probing yeast DNA damage responses. The workshops require the students to analyse highly complex transcriptomics datasets, and were designed to stimulate active learning through experience of current research methods in bioinformatics and functional genomics. They seek to closely mimic a realistic research environment, and require the students first to propose research hypotheses, then test those hypotheses using specific sections of the microarray dataset. The complexity of the microarray data provides students with the freedom to propose their own unique hypotheses, tested using appropriate sections of the microarray data. This research latitude was highly regarded by students and is a strength of this practical. In addition, the focus on DNA damage by radiation and mutagenic chemicals allows them to place their results in a human medical context, and successfully sparks broad interest in the subject material. In evaluation, 79% of students scored the practical workshops on a five-point scale as 4 or 5 (totally effective) for student learning. More broadly, the general use of microarray data as a "student research playground" is also discussed. Copyright © 2011 Wiley Periodicals, Inc.

  3. Designing and Implementing a Faculty Internet Workshop: A Collaborative Effort of Academic Computing Services and the University Library.

    ERIC Educational Resources Information Center

    Bradford, Jane T.; And Others

    1996-01-01

    Academic Computing Services staff and University librarians at Stetson University (DeLand, Florida) designed and implemented a three-day Internet workshop for interested faculty. The workshop included both hands-on lab sessions and discussions covering e-mail, telnet, ftp, Gopher, and World Wide Web. The planning, preparation of the lab and…

  4. 76 FR 52353 - Assumption Buster Workshop: “Current Implementations of Cloud Computing Indicate a New Approach...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-08-22

    ... explored in this series is cloud computing. The workshop on this topic will be held in Gaithersburg, MD on October 21, 2011. Assertion: ``Current implementations of cloud computing indicate a new approach to security'' Implementations of cloud computing have provided new ways of thinking about how to secure data...

  5. Computer Academy. Western Michigan University: Summer 1985-Present.

    ERIC Educational Resources Information Center

    Kramer, Jane E.

    The Computer Academy at Western Michigan University (Kalamazoo) is a series of intensive, one-credit-hour workshops to assist professionals in increasing their level of computer competence. At the time they were initiated, in 1985, the workshops targeted elementary and secondary school teachers and administrators, were offered on Apple IIe…

  6. Tenth Workshop for Computational Fluid Dynamic Applications in Rocket Propulsion, part 1

    NASA Technical Reports Server (NTRS)

    Williams, R. W. (Compiler)

    1992-01-01

    Experimental and computational fluid dynamic activities in rocket propulsion were discussed. The workshop was an open meeting of government, industry, and academia. A broad number of topics were discussed including computational fluid dynamic methodology, liquid and solid rocket propulsion, turbomachinery, combustion, heat transfer, and grid generation.

  7. Information technology challenges of biodiversity and ecosystems informatics

    USGS Publications Warehouse

    Schnase, J.L.; Cushing, J.; Frame, M.; Frondorf, A.; Landis, E.; Maier, D.; Silberschatz, A.

    2003-01-01

    Computer scientists, biologists, and natural resource managers recently met to examine the prospects for advancing computer science and information technology research by focusing on the complex and often-unique challenges found in the biodiversity and ecosystem domain. The workshop and its final report reveal that the biodiversity and ecosystem sciences are fundamentally information sciences and often address problems having distinctive attributes of scale and socio-technical complexity. The paper provides an overview of the emerging field of biodiversity and ecosystem informatics and demonstrates how the demands of biodiversity and ecosystem research can advance our understanding and use of information technologies.

  8. Beyond Moore computing research challenge workshop report.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Huey, Mark C.; Aidun, John Bahram

    2013-10-01

    We summarize the presentations and breakout session discussions from the in-house workshop that was held on 11 July 2013 to acquaint a wider group of Sandians with the Beyond Moore Computing research challenge.

  9. A Qualitative Case Study Comparing a Computer-Mediated Delivery System to a Face-to-Face Mediated Delivery System for Teaching Creative Writing Fiction Workshops

    ERIC Educational Resources Information Center

    Daniels, Mindy A.

    2012-01-01

    The purpose of this case study was to compare the pedagogical and affective efficiency and efficacy of creative prose fiction writing workshops taught via asynchronous computer-mediated online distance education with creative prose fiction writing workshops taught face-to-face in order to better understand their operational pedagogy and…

  10. Extremely Severe Space Weather and Geomagnetically Induced Currents in Regions with Locally Heterogeneous Ground Resistivity

    NASA Technical Reports Server (NTRS)

    Fujita, Shigeru; Kataoka, Ryuho; Pulkkinen, Antti; Watari, Shinichi

    2016-01-01

    Large geomagnetically induced currents (GICs) triggered by extreme space weather events are now regarded as one of the serious natural threats to the modern electrified society. The risk is described in detail in High-Impact, Low-Frequency Event Risk, a jointly commissioned summary report of the North American Electric Reliability Corporation and the US Department of Energy's November 2009 workshop, June 2010. For example, the March 13-14, 1989 storm caused a large-scale blackout affecting about 6 million people in Quebec, Canada, and resulting in substantial economic losses in Canada and the USA (Bolduc 2002). Therefore, European and North American nations have invested in GIC research such as the Solar Shield project in the USA (Pulkkinen et al. 2009, 2015a). In 2015, the Japanese government (Ministry of Economy, Trade and Industry, METI) acknowledged the importance of GIC research in Japan. After reviewing the serious damage caused by the 2011 Tohoku-Oki earthquake, METI recognized the potential risk to the electric power grid posed by extreme space weather. During extreme events, GICs can be concerning even in mid- and low-latitude countries and have become a global issue.

  11. Cosmological neutrino simulations at extreme scale

    DOE PAGES

    Emberson, J. D.; Yu, Hao-Ran; Inman, Derek; ...

    2017-08-01

    Constraining neutrino mass remains an elusive challenge in modern physics. Precision measurements are expected from several upcoming cosmological probes of large-scale structure. Achieving this goal relies on an equal level of precision from theoretical predictions of neutrino clustering. Numerical simulations of the non-linear evolution of cold dark matter and neutrinos play a pivotal role in this process. We incorporate neutrinos into the cosmological N-body code CUBEP3M and discuss the challenges associated with pushing to the extreme scales demanded by the neutrino problem. We highlight code optimizations made to exploit modern high performance computing architectures and present a novel method of data compression that reduces the phase-space particle footprint from 24 bytes in single precision to roughly 9 bytes. We scale the neutrino problem to the Tianhe-2 supercomputer and provide details of our production run, named TianNu, which uses 86% of the machine (13,824 compute nodes). With a total of 2.97 trillion particles, TianNu is currently the world's largest cosmological N-body simulation and improves upon previous neutrino simulations by two orders of magnitude in scale. We finish with a discussion of the unanticipated computational challenges that were encountered during the TianNu runtime.
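
    To illustrate the kind of footprint reduction the record describes, the sketch below quantizes a 24-byte particle record (three float32 positions and three float32 velocities) into 9 bytes. The record does not spell out the actual encoding, so the cell width, velocity clamp, and bit allocations here are illustrative assumptions, not the authors' scheme (Python):

      import numpy as np

      # Hypothetical fixed-point quantization: 16 bits per position
      # offset within a local cell, 8 bits per clamped velocity.
      CELL = 1.0       # assumed local cell width in simulation units
      VMAX = 3000.0    # assumed velocity clamp in simulation units

      def compress(p):
          """Pack one particle (6 x float32 = 24 bytes) into 9 bytes."""
          pos = np.asarray(p[:3], dtype=np.float64)
          vel = np.asarray(p[3:], dtype=np.float64)
          # position: offset inside the cell, stored with 16-bit precision
          q_pos = np.round((pos % CELL) / CELL * 65535).astype(np.uint16)
          # velocity: clamped and stored with 8-bit precision
          v = np.clip(vel, -VMAX, VMAX)
          q_vel = np.round((v + VMAX) / (2 * VMAX) * 255).astype(np.uint8)
          return q_pos.tobytes() + q_vel.tobytes()   # 6 + 3 = 9 bytes

      def decompress(buf, cell_origin=np.zeros(3)):
          q_pos = np.frombuffer(buf[:6], dtype=np.uint16).astype(np.float64)
          q_vel = np.frombuffer(buf[6:], dtype=np.uint8).astype(np.float64)
          pos = cell_origin + q_pos / 65535 * CELL
          vel = q_vel / 255 * (2 * VMAX) - VMAX
          return np.concatenate([pos, vel])

      particle = [0.37, 0.91, 0.12, -250.0, 1300.0, 40.0]
      buf = compress(particle)
      print(len(buf), decompress(buf))   # 9 bytes, approximate round-trip

    The trade is precision for memory: positions retain roughly 16 bits of accuracy relative to a local cell, which is why schemes of this kind store offsets against a mesh rather than absolute coordinates.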

  12. PREFACE: 15th International Workshop on Advanced Computing and Analysis Techniques in Physics Research (ACAT2013)

    NASA Astrophysics Data System (ADS)

    Wang, Jianxiong

    2014-06-01

    This volume of Journal of Physics: Conference Series is dedicated to scientific contributions presented at the 15th International Workshop on Advanced Computing and Analysis Techniques in Physics Research (ACAT 2013), which took place on 16-21 May 2013 at the Institute of High Energy Physics, Chinese Academy of Sciences, Beijing, China. The workshop series brings together computer science researchers and practitioners, and researchers from particle physics and related fields, to explore and confront the boundaries of computing, automatic data analysis, and theoretical calculation techniques. This year's edition of the workshop brought together over 120 participants from all over the world. Eighteen invited speakers presented key topics on the universe in the computer, computing in the Earth sciences, multivariate data analysis, automated computation in quantum field theory, and computing and data analysis challenges in many fields. Over 70 other talks and posters presented state-of-the-art developments in the areas of the workshop's three tracks: Computing Technologies, Data Analysis Algorithms and Tools, and Computational Techniques in Theoretical Physics. The round-table discussions on open source, knowledge sharing, and scientific collaboration stimulated reflection on these issues in the respective areas. ACAT 2013 was generously sponsored by the Chinese Academy of Sciences (CAS), the National Natural Science Foundation of China (NSFC), Brookhaven National Laboratory in the USA (BNL), Peking University (PKU), the Theoretical Physics Center for Science Facilities of CAS (TPCSF-CAS), and Sugon. We would like to thank all the participants for their scientific contributions and for their enthusiastic participation in all of the workshop's activities. Further information on ACAT 2013 can be found at http://acat2013.ihep.ac.cn. Professor Jianxiong Wang, Institute of High Energy Physics, Chinese Academy of Sciences. Details of committees and sponsors are available in the PDF.

  13. Fourth Computational Aeroacoustics (CAA) Workshop on Benchmark Problems

    NASA Technical Reports Server (NTRS)

    Dahl, Milo D. (Editor)

    2004-01-01

    This publication contains the proceedings of the Fourth Computational Aeroacoustics (CAA) Workshop on Benchmark Problems. In this workshop, as in previous workshops, the problems were devised to gauge the technological advancement of computational techniques to calculate all aspects of sound generation and propagation in air directly from the fundamental governing equations. A variety of benchmark problems have been previously solved, ranging from simple geometries with idealized acoustic conditions to test the accuracy and effectiveness of computational algorithms and numerical boundary conditions; to sound radiation from a duct; to gust interaction with a cascade of airfoils; to the sound generated by a separating, turbulent viscous flow. By solving these and similar problems, workshop participants have shown the technical progress from the basic challenges to accurate CAA calculations to the solution of CAA problems of increasing complexity and difficulty. The fourth CAA workshop emphasized the application of CAA methods to the solution of realistic problems. The workshop was held at the Ohio Aerospace Institute in Cleveland, Ohio, on October 20 to 22, 2003. At that time, workshop participants presented their solutions to problems in one or more of five categories. Their solutions are presented in this proceedings along with comparisons to the benchmark solutions or experimental data. The five categories for the benchmark problems were as follows. Category 1: Basic Methods. The numerical computation of sound is affected by, among other issues, the choice of grid used and by the boundary conditions. Category 2: Complex Geometry. The ability to compute the sound in the presence of complex geometric surfaces is important in practical applications of CAA. Category 3: Sound Generation by Interacting With a Gust. The practical application of CAA for computing noise generated by turbomachinery involves the modeling of the noise source mechanism as a vortical gust interacting with an airfoil. Category 4: Sound Transmission and Radiation. Category 5: Sound Generation in Viscous Problems. Sound is generated under certain conditions by a viscous flow as the flow passes an object or a cavity.

  14. Connecting Performance Analysis and Visualization to Advance Extreme Scale Computing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bremer, Peer-Timo; Mohr, Bernd; Schulz, Martin

    2015-07-29

    The characterization, modeling, analysis, and tuning of software performance has been a central topic in High Performance Computing (HPC) since its early beginnings. The overall goal is to make HPC software run faster on particular hardware, either through better scheduling, on-node resource utilization, or more efficient distributed communication.

  15. Analytical Cost Metrics : Days of Future Past

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Prajapati, Nirmal; Rajopadhye, Sanjay; Djidjev, Hristo Nikolov

    As we move towards the exascale era, new architectures must be capable of running massive computational problems efficiently. Scientists and researchers are continuously investing in tuning the performance of extreme-scale computational problems. These problems arise in almost all areas of computing, ranging from big data analytics, artificial intelligence, search, machine learning, virtual/augmented reality, computer vision, and image/signal processing to computational science and bioinformatics. With Moore's law driving the evolution of hardware platforms towards exascale, the dominant performance metric (time efficiency) has now expanded to also incorporate power/energy efficiency. Therefore the major challenge that we face in computing systems research is: "how to solve massive-scale computational problems in the most time/power/energy efficient manner?"

  16. Summary of SMIRT20 Preconference Topical Workshop – Identifying Structural Issues in Advanced Reactors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    William Richins; Stephen Novascone; Cheryl O'Brien

    The Idaho National Laboratory (INL, USA) and IASMiRT sponsored an international forum on November 5-6, 2008 in Porvoo, Finland for nuclear industry, academic, and regulatory representatives to identify structural issues in current and future advanced reactor design, especially for extreme conditions and external threats. The purpose of this topical workshop was to articulate research, engineering, and regulatory code-development needs. The topics addressed by the workshop were selected to address critical industry needs specific to advanced reactor structures that have long lead times and can be the subject of future SMiRT technical sessions. The topics were: 1) structural/materials needs for extreme conditions and external threats in contemporary (Gen. III) and future (Gen. IV and NGNP) advanced reactors, and 2) calibrating simulation software and methods that address topic 1. The workshop discussions and the research needs identified are presented. The workshop successfully produced interactive discussion on the two topics, resulting in a list of research and technology needs. It is recommended that IASMiRT communicate the results of the discussion to industry and researchers to encourage new ideas and projects. In addition, opportunities exist to retrieve research reports and information that currently exist, and to encourage more international cooperation and collaboration. It is also recommended that IASMiRT continue with an off-year workshop series on select topics.

  17. Approaching the exa-scale: a real-world evaluation of rendering extremely large data sets

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Patchett, John M; Ahrens, James P; Lo, Li - Ta

    2010-10-15

    Extremely large scale analysis is becoming increasingly important as supercomputers and their simulations move from petascale to exascale. The lack of dedicated hardware acceleration for rendering on today's supercomputing platforms motivates our detailed evaluation of the possibility of interactive rendering on the supercomputer. In order to facilitate our understanding of rendering on the supercomputing platform, we focus on scalability of rendering algorithms and architecture envisioned for exascale datasets. To understand tradeoffs for dealing with extremely large datasets, we compare three different rendering algorithms for large polygonal data: software-based ray tracing, software-based rasterization, and hardware-accelerated rasterization. We present a case study of strong and weak scaling of rendering extremely large data on both GPU- and CPU-based parallel supercomputers using ParaView, a parallel visualization tool. We use three different data sets: two synthetic and one from a scientific application. At an extreme scale, algorithmic rendering choices make a difference and should be considered while approaching exascale computing, visualization, and analysis. We find software-based ray tracing offers a viable approach for scalable rendering of the projected future massive data sizes.
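
    As a reminder of the two scaling metrics measured in case studies of this kind, the snippet below computes strong-scaling efficiency (fixed total problem) and weak-scaling efficiency (fixed work per node); the timings used are hypothetical and are not taken from the study (Python):

      # Strong scaling: same total problem on n nodes; ideal time is t1 / n.
      def strong_efficiency(t1, tn, n):
          return t1 / (n * tn)

      # Weak scaling: problem grows with n, fixed work per node; ideal time is t1.
      def weak_efficiency(t1, tn):
          return t1 / tn

      t1, t256 = 120.0, 0.75   # hypothetical seconds on 1 vs 256 nodes
      print(f"strong: {strong_efficiency(t1, t256, 256):.2f}")  # 0.62
      print(f"weak:   {weak_efficiency(120.0, 150.0):.2f}")     # 0.80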

  18. Reconstruction of Past Mediterranean Climate

    NASA Astrophysics Data System (ADS)

    García-Herrera, Ricardo; Luterbacher, Jürg; Lionello, Piero; Gonzáles-Rouco, Fidel; Ribera, Pedro; Rodó, Xavier; Kull, Christoph; Zerefos, Christos

    2007-02-01

    First MEDCLIVAR Workshop on Reconstruction of Past Mediterranean Climate; Pablo de Olavide University, Carmona, Spain, 8-11 November 2006; Mediterranean Climate Variability and Predictability (MEDCLIVAR; http://www.medclivar.eu) is a program that coordinates and promotes research on different aspects of Mediterranean climate. The main MEDCLIVAR goals include the reconstruction of past climate, describing patterns and mechanisms characterizing climate space-time variability, extremes at different time and space scales, coupled climate model/empirical reconstruction comparisons, seasonal forecasting, and the identification of the forcings responsible for the observed changes. The program has been endorsed by CLIVAR (Climate Variability and Predictability project) and is funded by the European Science Foundation.

  19. ES12; The 24th Annual Workshop on Recent Developments in Electronic Structure Theory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Holzwarth, Natalie; Thonhauser, Timo; Salam, Akbar

    ES12: The 24th Annual Workshop on Recent Developments in Electronic Structure Theory was held June 5-8, 2012 at Wake Forest University in Winston-Salem, NC 27109. The program consisted of 24 oral presentations, 70 posters, and 2 panel discussions. The attendance of the Workshop was comparable to or larger than previous workshops, and participation was impressively diverse. The 136 participants came from all over the world and included undergraduate students, graduate students, postdoctoral researchers, and senior scientists. The general assessment of the Workshop was extremely positive in terms of the high level of scientific presentations and discussions, and in terms of the schedule, accommodations, and affordability of the meeting.

  20. Recent Naval Postgraduate School Publications.

    DTIC Science & Technology

    1982-04-01

    477 p. Haney, R L; et al.; eds. Ocean models for climate research: A workshop sponsored by the U.S. Committee for the Global Atmos. Res. Program. Nat... climate variability. Oceanus, vol. 21, no. 4, p. 33-39, (1978). Williams, R T. A review of theoretical models of atmospheric frontogenesis. Chapman Conf...structure in large-scale optimization models. Symp. on Computer-Assisted Analysis and Model Simplification, Boulder, Colo., Mar. 24, 1980. Brown, G G

  1. Multilinear Computing and Multilinear Algebraic Geometry

    DTIC Science & Technology

    2016-08-10

    landmark paper titled "Most tensor problems are NP-hard" (see [14] in Section 3) in the Journal of the ACM, the premier journal in Computer Science. ..."Higher-order cone programming," Machine Learning Thematic Trimester, International Centre for Mathematics and Computer Science, Toulouse, France. ...geometry-and-data-analysis • 2014 SIMONS INSTITUTE WORKSHOP: Workshop on Tensors in Computer Science and Geometry, University of California, Berkeley, CA

  2. Fourth NASA Workshop on Computational Control of Flexible Aerospace Systems, part 2

    NASA Technical Reports Server (NTRS)

    Taylor, Lawrence W., Jr. (Compiler)

    1991-01-01

    A collection of papers presented at the Fourth NASA Workshop on Computational Control of Flexible Aerospace Systems is given. The papers address modeling, systems identification, and control of flexible aircraft, spacecraft and robotic systems.

  3. Seventh International Workshop on the Fragile X and X-linked Mental Retardation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tranebjaerg, L.; Lubs, H.A.; Borghgraef, M.

    The Seventh International Workshop on the Fragile X and X-linked Mental Retardation was held at the University of Tromso in Norway on August 2-5, 1995. Approximately 120 participants from 20 countries attended the Workshop. By special invitation Dr. Felix de la Cruz, who initiated the first international Workshop on fragile X, attended this Workshop. For the first time, the workshop took place in Scandinavia and was hosted by Lisbeth Tranebjaerg and Herbert Lubs. For most participants this Workshop, held at the northernmost university in the world, presented a unique opportunity to visit this exotic place. Between sessions, the participants had a chance to experience 24 hours of daylight, cod fishing, and extreme weather, with excessive amounts of rain as well as spectacular changes in the light and rainbows. The format of the Workshop was a combination of platform presentations and poster presentations. In contrast to previous meetings, the Workshop opened with syndromal and non-syndromal X-linked mental retardation in order to allow time for discussion. 34 refs., 1 fig.

  4. Epidemic failure detection and consensus for extreme parallelism

    DOE PAGES

    Katti, Amogh; Di Fatta, Giuseppe; Naughton, Thomas; ...

    2017-02-01

    Future extreme-scale high-performance computing systems will be required to work under frequent component failures. The MPI Forum's User Level Failure Mitigation proposal has introduced an operation, MPI_Comm_shrink, to synchronize the alive processes on the list of failed processes, so that applications can continue to execute even in the presence of failures by adopting algorithm-based fault tolerance techniques. This MPI_Comm_shrink operation requires a failure detection and consensus algorithm. This paper presents three novel failure detection and consensus algorithms using Gossiping. The proposed algorithms were implemented and tested using the Extreme-scale Simulator. The results show that in all algorithms the number of Gossip cycles to achieve global consensus scales logarithmically with system size. The second algorithm also shows better scalability in terms of memory and network bandwidth usage and perfect synchronization in achieving global consensus. The third approach is a three-phase distributed failure detection and consensus algorithm that provides consistency guarantees even in very large and extreme-scale systems while remaining memory and bandwidth efficient.
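
    A minimal push-gossip simulation illustrates why the number of cycles scales logarithmically with system size; this is a generic dissemination sketch, not the paper's failure detection or consensus algorithms (Python):

      import math, random

      def gossip_cycles(n, seed=0):
          """Cycles until news of a failure reaches all n processes when
          every informed process pushes to one random peer per cycle."""
          rng = random.Random(seed)
          informed = {0}              # process 0 detects the failure
          cycles = 0
          while len(informed) < n:
              for p in list(informed):
                  informed.add(rng.randrange(n))
              cycles += 1
          return cycles

      for n in (64, 1024, 16384):
          print(n, gossip_cycles(n), math.ceil(math.log2(n)))

    Because the informed set roughly doubles each cycle, full dissemination takes O(log n) cycles, which matches the scaling behaviour the abstract reports for global consensus.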

  5. Workshop on Using NASA Data for Time-Sensitive Applications

    NASA Technical Reports Server (NTRS)

    Davies, Diane K.; Brown, Molly E.; Murphy, Kevin J.; Michael, Karen A.; Zavodsky, Bradley T.; Stavros, E. Natasha; Carroll, Mark L.

    2017-01-01

    Over the past decade, there has been an increase in the use of NASA's Earth Observing System (EOS) data and imagery for time-sensitive applications such as monitoring wildfires, floods, and extreme weather events. In September 2016, NASA sponsored a workshop for data users, producers, and scientists to discuss the needs of time-sensitive science applications.

  6. Creating an Effective Network: The GRACEnet Example

    NASA Astrophysics Data System (ADS)

    Follett, R. F.; Del Grosso, S.

    2008-12-01

    Networking activities require time, work, and nurturing. The objective of this presentation is to share the experience gained from the Greenhouse gas Reduction through Agricultural Carbon Enhancement network (GRACEnet). GRACEnet, formally established in 2005 by ARS/USDA, resulted from workshops, teleconferences, and other activities beginning in at least 2002. A critical factor in its formation was developing and formalizing a common vision, goals, and objectives, which was accomplished in a 2005 workshop. The 4-person steering committee (now 5) was charged with coordinating the part-time (0.05 to 0.5 SY/location) efforts across 30 ARS locations to develop four products: (1) a national database, (2) regional/national guidelines of management practices, (3) computer models, and (4) "state-of-knowledge" summary publications. All locations are asked to contribute to the database from their field studies. Communication with everyone and periodic meetings are extremely important. Populating the database requires a shared commitment to a common format, to data sharing, and to trust. Based upon the e-mail list, GRACEnet has expanded from about 30 to nearly 70 participants. Annual reports and a new website help facilitate this activity.

  7. Plans for Aeroelastic Prediction Workshop

    NASA Technical Reports Server (NTRS)

    Heeg, Jennifer; Ballmann, Josef; Bhatia, Kumar; Blades, Eric; Boucke, Alexander; Chwalowski, Pawel; Dietz, Guido; Dowell, Earl; Florance, Jennifer P.; Hansen, Thorsten; ...

    2011-01-01

    This paper summarizes the plans for the first Aeroelastic Prediction Workshop. The workshop is designed to assess the state of the art of computational methods for predicting unsteady flow fields and aeroelastic response. The goals are to provide an impartial forum to evaluate the effectiveness of existing computer codes and modeling techniques, and to identify computational and experimental areas needing additional research and development. Three subject configurations have been chosen from existing wind tunnel data sets for which pertinent experimental data are available for comparison. For each case chosen, the wind tunnel testing was conducted using forced oscillation of the model at specified frequencies.

  8. The Role of Computational Fluid Dynamics in the Management of Unruptured Intracranial Aneurysms: A Clinicians' View

    PubMed Central

    Singh, Pankaj K.; Marzo, Alberto; Coley, Stuart C.; Berti, Guntram; Bijlenga, Philippe; Lawford, Patricia V.; Villa-Uriol, Mari-Cruz; Rufenacht, Daniel A.; McCormack, Keith M.; Frangi, Alejandro; Patel, Umang J.; Hose, D. Rodney

    2009-01-01

    Objective. The importance of hemodynamics in the etiopathogenesis of intracranial aneurysms (IAs) is widely accepted. Computational fluid dynamics (CFD) is being used increasingly for hemodynamic predictions. However, along with the continuing development and validation of these tools, it is imperative to collect the opinion of the clinicians. Methods. A workshop on CFD was conducted during the European Society of Minimally Invasive Neurological Therapy (ESMINT) Teaching Course, Lisbon, Portugal. 36 delegates, mostly clinicians, performed supervised CFD analysis for an IA, using the @neuFuse software developed within the European project @neurIST. Feedback on the workshop was collected and analyzed. Performance was assessed on a scale of 1 to 4 and compared with experts' performance. Results. Current dilemmas in the management of unruptured IAs remained the most important motivating factor to attend the workshop, and the majority of participants showed interest in participating in a multicentric trial. The participants achieved an average score of 2.52 (range 0-4), which was 63% (range 0-100%) of an expert user's performance. Conclusions. Although participants showed a manifest interest in CFD, there was a clear lack of awareness concerning the role of hemodynamics in the etiopathogenesis of IAs and the use of CFD in this context. More effort is therefore required to enhance clinicians' understanding of the subject. PMID:19696903

  9. NCI Workshop Report: Clinical and Computational Requirements for Correlating Imaging Phenotypes with Genomics Signatures.

    PubMed

    Colen, Rivka; Foster, Ian; Gatenby, Robert; Giger, Mary Ellen; Gillies, Robert; Gutman, David; Heller, Matthew; Jain, Rajan; Madabhushi, Anant; Madhavan, Subha; Napel, Sandy; Rao, Arvind; Saltz, Joel; Tatum, James; Verhaak, Roeland; Whitman, Gary

    2014-10-01

    The National Cancer Institute (NCI) Cancer Imaging Program organized two related workshops on June 26-27, 2013, entitled "Correlating Imaging Phenotypes with Genomics Signatures Research" and "Scalable Computational Resources as Required for Imaging-Genomics Decision Support Systems." The first workshop focused on clinical and scientific requirements, exploring our knowledge of phenotypic characteristics of cancer biological properties to determine whether the field is sufficiently advanced to correlate with imaging phenotypes that underpin genomics and clinical outcomes, and exploring new scientific methods to extract phenotypic features from medical images and relate them to genomics analyses. The second workshop focused on computational methods that explore informatics and computational requirements to extract phenotypic features from medical images, relate them to genomics analyses, and improve the accessibility and speed of dissemination of existing NIH resources. These workshops linked clinical and scientific requirements of currently known phenotypic and genotypic cancer biology characteristics with imaging phenotypes that underpin genomics and clinical outcomes. The group generated a set of recommendations to NCI leadership and the research community that encourage and support development of the emerging radiogenomics research field to address short- and longer-term goals in cancer research.

  10. Quantum Testbeds Stakeholder Workshop (QTSW) Report meeting purpose and agenda.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hebner, Gregory A.

    Quantum computing (QC) is a promising early-stage technology with the potential to provide scientific computing capabilities far beyond what is possible with even an exascale computer in specific problems of relevance to the Office of Science. These include (but are not limited to) materials modeling, molecular dynamics, and quantum chromodynamics. However, commercial QC systems are not yet available, and the technical maturity of current QC hardware, software, algorithms, and systems integration is woefully incomplete. Thus, there is a significant opportunity for DOE to define the technology building blocks and solve the system integration issues to enable a revolutionary tool. Once realized, QC will have world-changing impact on economic competitiveness, the scientific enterprise, and citizen well-being. Prior to this workshop, the DOE Office of Advanced Scientific Computing Research (ASCR) hosted a workshop in 2015 to explore QC scientific applications. The goal of that workshop was to assess the viability of QC technologies to meet the computational requirements in support of DOE's science and energy mission and to identify the potential impact of these technologies.

  11. Computational Methods for Crashworthiness

    NASA Technical Reports Server (NTRS)

    Noor, Ahmed K. (Compiler); Carden, Huey D. (Compiler)

    1993-01-01

    Presentations and discussions from the joint UVA/NASA Workshop on Computational Methods for Crashworthiness held at Langley Research Center on 2-3 Sep. 1992 are included. The presentations addressed activities in the area of impact dynamics. Workshop attendees represented NASA, the Army and Air Force, the Lawrence Livermore and Sandia National Laboratories, the aircraft and automotive industries, and academia. The workshop objectives were to assess the state-of-technology in the numerical simulation of crash and to provide guidelines for future research.

  12. HAlign-II: efficient ultra-large multiple sequence alignment and phylogenetic tree reconstruction with distributed and parallel computing.

    PubMed

    Wan, Shixiang; Zou, Quan

    2017-01-01

    Multiple sequence alignment (MSA) plays a key role in biological sequence analyses, especially in phylogenetic tree construction. The extreme increase in next-generation sequencing output has created a shortage of efficient ultra-large-scale alignment approaches that can cope with different sequence types. Distributed and parallel computing represents a crucial technique for accelerating ultra-large (e.g., files of more than 1 GB) sequence analyses. Based on HAlign and the Spark distributed computing system, we implement a highly cost-efficient and time-efficient HAlign-II tool to address ultra-large multiple biological sequence alignment and phylogenetic tree construction. Experiments on large-scale DNA and protein data sets, with files of more than 1 GB, showed that HAlign-II saves time and space and outperforms current software tools. HAlign-II can efficiently carry out MSA and construct phylogenetic trees with ultra-large numbers of biological sequences; it shows extremely high memory efficiency and scales well with increases in computing resources. HAlign-II provides a user-friendly web server based on our distributed computing infrastructure. HAlign-II, with open-source code and datasets, was established at http://lab.malab.cn/soft/halign.
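
    The general pattern, distributing per-sequence alignment work across a Spark cluster, can be sketched as below. This is not HAlign-II's implementation (its center-star strategy and optimizations are more involved), and the center sequence, inputs, and scoring parameters are placeholders (Python, assuming pyspark is installed):

      from pyspark.sql import SparkSession

      def nw_score(a, b, match=1, mismatch=-1, gap=-2):
          """Needleman-Wunsch global alignment score, linear memory."""
          prev = [j * gap for j in range(len(b) + 1)]
          for i in range(1, len(a) + 1):
              curr = [i * gap]
              for j in range(1, len(b) + 1):
                  diag = prev[j - 1] + (match if a[i - 1] == b[j - 1] else mismatch)
                  curr.append(max(diag, prev[j] + gap, curr[j - 1] + gap))
              prev = curr
          return prev[-1]

      spark = SparkSession.builder.appName("msa-sketch").getOrCreate()
      center = "ACGTACGT"                                # placeholder center sequence
      seqs = ["ACGTTCGT", "ACGAACGT", "AGGTACGT"]        # placeholder inputs
      # Score every sequence against the center in parallel across the cluster.
      scores = (spark.sparkContext
                     .parallelize(seqs)
                     .map(lambda s: (s, nw_score(center, s)))
                     .collect())
      print(scores)
      spark.stop()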

  13. Use of computer games as an intervention for stroke.

    PubMed

    Proffitt, Rachel M; Alankus, Gazihan; Kelleher, Caitlin L; Engsberg, Jack R

    2011-01-01

    Current rehabilitation for persons with hemiparesis after stroke requires high numbers of repetitions to be in accordance with contemporary motor learning principles. The motivational characteristics of computer games can be harnessed to create engaging interventions for persons with hemiparesis after stroke that incorporate this high number of repetitions. The purpose of this case report was to test the feasibility of using computer games as a 6-week home therapy intervention to improve upper extremity function for a person with stroke. One person with left upper extremity hemiparesis after stroke participated in a 6-week home therapy computer game intervention. The games were customized to her preferences and abilities and modified weekly. Her performance was tracked and analyzed. Data from pre-, mid-, and postintervention testing using standard upper extremity measures and the Reaching Performance Scale (RPS) were analyzed. After 3 weeks, the participant demonstrated increased upper extremity range of motion at the shoulder and decreased compensatory trunk movements during reaching tasks. After 6 weeks, she showed functional gains in activities of daily living (ADLs) and instrumental ADLs despite no further improvements on the RPS. Results indicate that computer games have the potential to be a useful intervention for people with stroke. Future work will add additional support to quantify the effectiveness of the games as a home therapy intervention for persons with stroke.

  14. Workshop on Computational Turbulence Modeling

    NASA Technical Reports Server (NTRS)

    1993-01-01

    This document contains presentations given at Workshop on Computational Turbulence Modeling held 15-16 Sep. 1993. The purpose of the meeting was to discuss the current status and future development of turbulence modeling in computational fluid dynamics for aerospace propulsion systems. Papers cover the following topics: turbulence modeling activities at the Center for Modeling of Turbulence and Transition (CMOTT); heat transfer and turbomachinery flow physics; aerothermochemistry and computational methods for space systems; computational fluid dynamics and the k-epsilon turbulence model; propulsion systems; and inlet, duct, and nozzle flow.

  15. Integrating Multimedia Techniques into CS Pedagogy.

    ERIC Educational Resources Information Center

    Adams, Sandra Honda; Jou, Richard; Nasri, Ahmad; Radimsky, Anne-Louise; Sy, Bon K.

    Through its grants, the National Science Foundation sponsors workshops that inform faculty of current topics in computer science. Such a workshop, entitled, "Developing Multimedia-based Interactive Laboratory Modules for Computer Science," was given July 27-August 6, 1998, at Illinois State University at Normal. Each participant was…

  16. Army Maneuver Center of Excellence

    DTIC Science & Technology

    2012-10-18

    Briefing excerpts address partnership agreements throughout DoD (DARPA, JIEDDO, DHS, FAA, DoE, NSA, NASA, SMDC, etc.); strategic partnerships that benefit the Army materiel enterprise; emerging-science thrusts including neuroscience, network sciences, hierarchical computing, extreme energy science, autonomous systems technology, and meso-scale (grain-scale) science; and improvements in Soldier-system overall performance sought through operational neuroscience and advanced simulation and training technologies.

  17. Computational Tools and Facilities for the Next-Generation Analysis and Design Environment

    NASA Technical Reports Server (NTRS)

    Noor, Ahmed K. (Compiler); Malone, John B. (Compiler)

    1997-01-01

    This document contains presentations from the joint UVA/NASA Workshop on Computational Tools and Facilities for the Next-Generation Analysis and Design Environment held at the Virginia Consortium of Engineering and Science Universities in Hampton, Virginia on September 17-18, 1996. The presentations focused on the computational tools and facilities for analysis and design of engineering systems, including real-time simulations, immersive systems, collaborative engineering environments, Web-based tools, and interactive media for technical training. Workshop attendees represented NASA, commercial software developers, the aerospace industry, government labs, and academia. The workshop objectives were to assess the level of maturity of a number of computational tools and facilities and their potential for application to the next-generation integrated design environment.

  18. The lichens of Fakahatchee Strand Preserve State Park, Florida: Proceedings from the 18th Tuckerman Workshop

    Treesearch

    Robert Lucking; Frederick Seavey; Ralph S. Common; Sean Q. Beeching; Othmar Breuss; William R. Buck; Lee Crane; Malcolm Hodges; Brendan P. Hodkinson; Elisabeth Lay; James C. Lendemer; R. Troy McMullin; Joel Mercado

    2011-01-01

    Fakahatchee Strand Preserve State Park is located in Collier County at the extreme southwestern corner of Florida, close to Everglades National Park and Big Cypress National Preserve. The 18th Tuckerman Workshop, an annual gathering of professional and amateur lichenologists and mycologists from the United States and Canada, this time with additional participants from...

  19. TNEEL workshop. Interactive methods for teaching end-of-life care.

    PubMed

    Wilkie, Diana J; Lin, Yu-Chuan; Judge, M Kay M; Shannon, Sarah E; Corless, Inge B; Farber, Stuart J; Brown, Marie-Annette

    2004-01-01

    Nurse educators have identified lack of end-of-life content as a serious deficit in undergraduate nursing education. TNEEL, a new software program with tools for teaching end-of-life topics, was created to help educators overcome this problem. The authors implemented an experiential workshop to help educators learn how to use TNEEL's wide variety of educational tools. Trainers provided information about TNEEL and coached participants (N = 94) as they practiced using laptop computers to increase their familiarity and comfort in using the toolkit. Workshop participants completed pre- and posttest evaluations addressing their opinions and beliefs about using this computer tool. Findings support the workshop as an effective way to facilitate adoption of this innovative educational resource and support the development of a nation-wide training plan for TNEEL with experiential workshops.

  20. A stress management workshop improves residents' coping skills.

    PubMed

    McCue, J D; Sachs, C L

    1991-11-01

    We describe the effectiveness of a stress management workshop designed for physicians. Of the 64 medicine, pediatrics, and medicine-pediatrics residents who agreed to participate in the workshop, the 43 who could be freed from clinical responsibilities constituted the intervention group; the 21 residents who could not be freed from clinical responsibilities were asked to serve as the nonintervention group. The ESSI Stress Systems Instrument and Maslach Burnout Inventory were administered to control subjects and workshop participants 2 weeks before and 6 weeks after the workshop. The half-day workshops taught management of the stresses of medical practice through: (1) learning and practicing interpersonal skills that increase the availability of social support; (2) prioritization of personal, work, and educational demands; (3) techniques to increase stamina and attend to self-care needs; (4) recognition and avoidance of maladaptive responses; and (5) positive outlook skills. Overall, the ESSI Stress Systems Instrument test scores for the workshop participants improved (+1.27), while the nonintervention group's mean scores declined (-0.65). All 21 individual ESSI Stress Systems Instrument scale items improved for the workshop group, compared with eight of 21 items for the nonintervention group. The workshop group improved on the Maslach Burnout Inventory emotional exhaustion scale and deteriorated less than the nonintervention group on the depersonalization scale. We conclude that a modest, inexpensive stress management workshop was received positively and can lead to significant short-term improvement in stress and burnout test scores for medicine and pediatrics residents.

  1. Lessons Learned and Future Goals of the High Lift Prediction Workshops

    NASA Technical Reports Server (NTRS)

    Rumsey, Christopher L.; Lee-Rausch, Elizabeth; Slotnick, Jeffrey P.

    2016-01-01

    The American Institute of Aeronautics and Astronautics (AIAA) High Lift Prediction Workshop series is described. Two workshops have been held to date. Major conclusions are summarized, and plans for future workshops are outlined. A compilation of lessons learned from the first two workshops is provided. This compilation includes a summary of needs for future high-lift experiments that are intended for computational fluid dynamics (CFD) validation.

  2. Nearfield Summary and Statistical Analysis of the Second AIAA Sonic Boom Prediction Workshop

    NASA Technical Reports Server (NTRS)

    Park, Michael A.; Nemec, Marian

    2017-01-01

    A summary is provided for the Second AIAA Sonic Boom Workshop held 8-9 January 2017 in conjunction with AIAA SciTech 2017. The workshop used three required models of increasing complexity: an axisymmetric body, a wing body, and a complete configuration with flow-through nacelle. An optional complete configuration with propulsion boundary conditions was also provided. These models are designed with similar nearfield signatures to isolate geometry and shock/expansion interaction effects. Eleven international participant groups submitted nearfield signatures with forces, pitching moment, and iterative convergence norms. Statistics and grid convergence of these nearfield signatures are presented. These submissions are propagated to the ground, and noise levels are computed, which allows the grid convergence and the statistical distribution of each noise level to be examined. While progress since the first workshop is documented, improvements to the analysis methods for a possible subsequent workshop are also provided. The complete configuration with flow-through nacelle showed the most dramatic improvement between the two workshops. The current workshop cases are more relevant to vehicles with lower loudness and have the potential for lower annoyance than the first workshop cases. The models for this workshop, with quieter ground noise levels than the first workshop's, exposed weaknesses in analysis, particularly in convective discretization.

  3. Opening Remarks: SciDAC 2007

    NASA Astrophysics Data System (ADS)

    Strayer, Michael

    2007-09-01

    Good morning. Welcome to Boston, the home of the Red Sox, Celtics and Bruins, baked beans, tea parties, Robert Parker, and SciDAC 2007. A year ago I stood before you to share the legacy of the first SciDAC program and identify the challenges that we must address on the road to petascale computing—a road E. E. Cummings described as `. . . never traveled, gladly beyond any experience.' Today, I want to explore the preparations for the rapidly approaching extreme scale (X-scale) generation. These preparations are the first step propelling us along the road of burgeoning scientific discovery enabled by the application of X-scale computing. We look to petascale computing and beyond to open up a world of discovery that cuts across scientific fields and leads us to a greater understanding of not only our world, but our universe. As part of the President's American Competitiveness Initiative, the ASCR Office has been preparing a ten year vision for computing. As part of this planning, LBNL together with ORNL and ANL hosted three town hall meetings on Simulation and Modeling at the Exascale for Energy, Ecological Sustainability and Global Security (E3). The proposed E3 initiative is organized around four programmatic themes: engaging our top scientists, engineers, computer scientists and applied mathematicians; investing in pioneering large-scale science; developing scalable analysis algorithms and storage architectures to accelerate discovery; and accelerating the build-out and future development of the DOE open computing facilities. It is clear that we have only just started down the path to extreme scale computing. Plan to attend Thursday's session on the out-briefing and discussion of these meetings. The road to the petascale has been at best rocky. In FY07, the continuing resolution provided 12% less money for Advanced Scientific Computing than either the President, the Senate, or the House had proposed. As a consequence, many of you had to absorb a no-cost extension for your SciDAC work. I am pleased that the President's FY08 budget restores the funding for SciDAC. Quoting from the Advanced Scientific Computing Research description in the House Energy and Water Development Appropriations Bill for FY08, "Perhaps no other area of research at the Department is so critical to sustaining U.S. leadership in science and technology, revolutionizing the way science is done and improving research productivity." As a society we need to revolutionize our approaches to energy, environmental and global security challenges. As we go forward along the road to the X-scale generation, the use of computation will continue to be a critical tool, along with theory and experiment, in understanding the behavior of the fundamental components of nature as well as for fundamental discovery and exploration of the behavior of complex systems. The foundation to overcome these societal challenges will build from the experiences and knowledge gained as you, members of our SciDAC research teams, work together to attack problems at the tera- and peta-scale. If SciDAC is viewed as an experiment for revolutionizing scientific methodology, then a strategic goal of the ASCR program must be to broaden the intellectual base prepared to address the challenges of the new X-scale generation of computing. We must focus our computational science experiences gained over the past five years on the opportunities introduced with extreme scale computing. Our facilities are on a path to provide the resources needed to undertake the first part of our journey.
Using the newly upgraded 119 teraflop Cray XT system at the Leadership Computing Facility, SciDAC research teams have in three days performed a 100-year study of the time evolution of the atmospheric CO2 concentration originating from the land surface. The simulation of the El Nino/Southern Oscillation that was part of this study has been characterized as `the most impressive new result in ten years.' Researchers also gained new insight into the behavior of superheated ionic gas in the ITER reactor as a result of an AORSA run on 22,500 processors that achieved over 87 trillion calculations per second (87 teraflops), 74% of the system's theoretical peak. Tomorrow, Argonne and IBM will announce that the first IBM Blue Gene/P, a 100 teraflop system, will be shipped to the Argonne Leadership Computing Facility later this fiscal year. By the end of FY2007, ASCR high performance and leadership computing resources will include the 114 teraflop IBM Blue Gene/P; a 102 teraflop Cray XT4 at NERSC; and a 119 teraflop Cray XT system at Oak Ridge. Before ringing in the New Year, Oak Ridge will upgrade to 250 teraflops with the replacement of its dual core processors with quad core processors, Argonne will upgrade to between 250 and 500 teraflops, and next year a petascale Cray Baker system is scheduled for delivery at Oak Ridge. The multidisciplinary teams in our SciDAC Centers for Enabling Technologies and our SciDAC Institutes must continue to work with our Scientific Application teams to overcome the barriers that prevent effective use of these new systems. These challenges include: the need for new algorithms as well as operating system and runtime software and tools that scale to parallel systems composed of hundreds of thousands of processors; program development environments and tools that scale effectively and provide ease of use for developers and scientific end users; and visualization and data management systems that support moving, storing, analyzing, manipulating and visualizing multi-petabytes of scientific data and objects. The SciDAC Centers, located primarily at our DOE national laboratories, will take the lead in ensuring that critical computer science and applied mathematics issues are addressed in a timely and comprehensive fashion and will address issues associated with the research software lifecycle. In contrast, the SciDAC Institutes, which are university-led centers of excellence, will have more flexibility to pursue new research topics through a range of research collaborations. The Institutes will also work to broaden the intellectual and researcher base—conducting short courses and summer schools to take advantage of new high performance computing capabilities. The SciDAC Outreach Center at Lawrence Berkeley National Laboratory complements the outreach efforts of the SciDAC Institutes. The Outreach Center is our clearinghouse for SciDAC activities and resources and will communicate with the high performance computing community, in part to understand their needs for workshops, summer schools and institutes. SciDAC is not ASCR's only effort to broaden the computational science community needed to meet the challenges of the new X-scale generation. I hope that you were able to attend the Computational Science Graduate Fellowship poster session last night. ASCR developed the fellowship in 1991 to meet the nation's growing need for scientists and technology professionals with advanced computer skills. CSGF, now jointly funded between ASCR and NNSA, is more than a traditional academic fellowship.
It has provided more than 200 of the best and brightest graduate students with guidance, support and community in preparing them as computational scientists. Today CSGF alumni are bringing their diverse top-level skills and knowledge to research teams at DOE laboratories and in industries such as Procter & Gamble, Lockheed Martin and Intel. At universities they are working to train the next generation of computational scientists. To build on this success, we intend to develop a wholly new Early Career Principal Investigator (ECPI) program. Our objective is to stimulate academic research in scientific areas within ASCR's purview, especially among faculty in the early stages of their academic careers. Last February, we lost Ken Kennedy, one of the leading lights of our community. As we move forward into the extreme computing generation, his vision and insight will be greatly missed. In memory of Ken Kennedy, we shall designate the ECPI grants to beginning faculty in Computer Science as the Ken Kennedy Fellowship. Watch the ASCR website for more information about ECPI and other early career programs in the computational sciences. We look to you, our scientists, researchers, and visionaries to take X-scale computing and use it to explode scientific discovery in your fields. We at SciDAC will work to ensure that this tool is the sharpest, most precise and most efficient instrument to carve away the unknown and reveal the most exciting secrets and stimulating scientific discoveries of our time. The partnership between research and computing is the marriage that will spur greater discovery, and as Spenser said to Susan in Robert Parker's novel `Sudden Mischief', `We stick together long enough, and we may get as smart as hell'. Michael Strayer

  4. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Katti, Amogh; Di Fatta, Giuseppe; Naughton, Thomas

    Future extreme-scale high-performance computing systems will be required to work under frequent component failures. The MPI Forum's User Level Failure Mitigation proposal has introduced an operation, MPI_Comm_shrink, to synchronize the alive processes on the list of failed processes, so that applications can continue to execute even in the presence of failures by adopting algorithm-based fault tolerance techniques. This MPI_Comm_shrink operation requires a failure detection and consensus algorithm. This paper presents three novel failure detection and consensus algorithms using gossiping. The proposed algorithms were implemented and tested using the Extreme-scale Simulator. The results show that in all algorithms the number of gossip cycles to achieve global consensus scales logarithmically with system size. The second algorithm also shows better scalability in terms of memory and network bandwidth usage and perfect synchronization in achieving global consensus. The third approach is a three-phase distributed failure detection and consensus algorithm that provides consistency guarantees even in very large and extreme-scale systems while remaining memory and bandwidth efficient.
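
    The paper's algorithms are not reproduced in the abstract; as a rough sketch only, this toy Python simulation (our construction, with hypothetical names such as gossip_round and cycles_to_consensus) illustrates the basic push-gossip mechanism and why the number of cycles needed to agree on a failed set tends to grow logarithmically with system size.

    ```python
    import random

    def gossip_round(alive, views, fanout=1):
        """One gossip cycle: every alive rank pushes its current failure view
        (a set of suspected-failed ranks) to `fanout` random peers, which
        merge it into their own view."""
        for p in list(alive):
            for _ in range(fanout):
                q = random.choice([r for r in alive if r != p])
                views[q] |= views[p]

    def cycles_to_consensus(n, failed):
        """Count gossip cycles until every survivor's view equals the failed
        set, starting from a single rank that detected the failures locally."""
        alive = [r for r in range(n) if r not in failed]
        views = {r: (set(failed) if r == alive[0] else set()) for r in alive}
        cycles = 0
        while any(views[r] != set(failed) for r in alive):
            gossip_round(alive, views)
            cycles += 1
        return cycles

    # Doubling the system size should add roughly a constant number of cycles.
    for n in (64, 128, 256, 512):
        print(n, cycles_to_consensus(n, failed={0, 1}))
    ```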

  5. New Challenges of the Computation of Multiple Sequence Alignments in the High-Throughput Era (2010 JGI/ANL HPC Workshop)

    ScienceCinema

    Notredame, Cedric

    2018-05-02

    Cedric Notredame from the Centre for Genomic Regulation gives a presentation on New Challenges of the Computation of Multiple Sequence Alignments in the High-Throughput Era at the JGI/Argonne HPC Workshop on January 26, 2010.

  6. Preparing the Next Generation of Environmental Scientists to Work at the Frontier of Data-Intensive Research

    NASA Astrophysics Data System (ADS)

    Hampton, S. E.

    2015-12-01

    The science necessary to unravel complex environmental problems confronts severe computational challenges - coping with huge volumes of heterogeneous data, spanning vast spatial scales at high resolution, and requiring integration of disparate measurements from multiple disciplines. But as cyberinfrastructure advances to support such work, scientists in many fields lack sufficient computational skills to participate in interdisciplinary, data-intensive research. In response, we developed innovative training workshops for early-career scientists, in order to explore both the needs and solutions for training next-generation scientists in skills for data-intensive environmental research. In 2013 and 2014 we ran intensive 3-week training workshops for early-career researchers. One of the workshops was run concurrently in California and North Carolina, connected by virtual technologies and coordinated schedules. We attracted applicants to the workshop with the opportunity to pursue data-intensive small-group research projects that they proposed. This approach presented a realistic possibility that publishable products could result from 3 weeks of focused hands-on classroom instruction combined with self-directed group research in which instructors were present to assist trainees. Instruction addressed: (1) collaboration modes and technologies; (2) data management, preservation, and sharing; (3) preparing data for analysis using scripting; (4) reproducible research; (5) sustainable software practices; (6) data analysis and modeling; and (7) communicating results to broad communities. The most dramatic improvements in technical skills were in data management, version control, and working with spatial data outside of proprietary software. In addition, participants built strong networks and collaborative skills that later resulted in a successful student-led grant proposal and published manuscripts, and participants reported that the training was a highly influential experience.

  7. Report on First International Workshop on Robotic Surgery in Thoracic Oncology.

    PubMed

    Veronesi, Giulia; Cerfolio, Robert; Cingolani, Roberto; Rueckert, Jens C; Soler, Luc; Toker, Alper; Cariboni, Umberto; Bottoni, Edoardo; Fumagalli, Uberto; Melfi, Franca; Milli, Carlo; Novellis, Pierluigi; Voulaz, Emanuele; Alloisio, Marco

    2016-01-01

    A workshop of experts from France, Germany, Italy, and the United States took place at Humanitas Research Hospital, Milan, Italy, on February 10 and 11, 2016, to examine techniques for and applications of robotic surgery to thoracic oncology. The main topics of presentation and discussion were robotic surgery for lung resection; robot-assisted thymectomy; minimally invasive surgery for esophageal cancer; new developments in computer-assisted surgery and medical applications of robots; the challenge of costs; and future clinical research in robotic thoracic surgery. The following article summarizes the main contributions to the workshop. The workshop consensus was that, since video-assisted thoracoscopic surgery (VATS) is becoming the mainstream approach to resectable lung cancer in North America and Europe, robotic surgery for thoracic oncology is likely to be embraced by an increasing number of thoracic surgeons, because it has technical advantages over VATS, including intuitive movements, tremor filtration, more degrees of manipulative freedom, motion scaling, and high-definition stereoscopic vision. These advantages may make robotic surgery more accessible than VATS to trainees and experienced surgeons and also lead to expanded indications. However, the high costs of robotic surgery and absence of tactile feedback remain obstacles to widespread dissemination. A prospective multicentric randomized trial (NCT02804893) to compare robotic and VATS approaches to stages I and II lung cancer will start shortly.

  8. Computational toxicity in 21st century safety sciences (China ...

    EPA Pesticide Factsheets

    Presentation at the Joint Meeting of Analytical Toxicology and Computational Toxicology Committee (Chinese Society of Toxicology) International Workshop on Advanced Chemical Safety Assessment Technologies on 11 May 2016, Fuzhou University, Fuzhou, China.

  9. Combinatorial Algorithms to Enable Computational Science and Engineering: Work from the CSCAPES Institute

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Boman, Erik G.; Catalyurek, Umit V.; Chevalier, Cedric

    2015-01-16

    This final progress report summarizes the work accomplished at the Combinatorial Scientific Computing and Petascale Simulations (CSCAPES) Institute. We developed Zoltan, a parallel mesh partitioning library that made use of accurate hypergraph models to provide load balancing in mesh-based computations. We developed several graph coloring algorithms for computing Jacobian and Hessian matrices and organized them into a software package called ColPack. We developed parallel algorithms for graph coloring and graph matching problems, and also designed multi-scale graph algorithms. Three PhD students graduated, six more are continuing their PhD studies, and four postdoctoral scholars were advised. Six of these students and Fellows have joined DOE labs (Sandia, Berkeley) as staff scientists or as postdoctoral scientists. We also organized the SIAM Workshop on Combinatorial Scientific Computing (CSC) in 2007, 2009, and 2011 to continue to foster the CSC community.
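
    As a brief illustration of the coloring idea mentioned above (a toy sketch of ours, not ColPack's implementation), greedy coloring of a column-intersection graph groups structurally orthogonal Jacobian columns, which can then be evaluated together in a single differencing pass.

    ```python
    def greedy_coloring(adj):
        """Greedy coloring: visit vertices in order of decreasing degree and
        give each the smallest color unused by an already-colored neighbor.
        In the column-intersection graph of a Jacobian, columns that share no
        row are non-adjacent, so same-colored columns can be probed together."""
        colors = {}
        for v in sorted(adj, key=lambda u: -len(adj[u])):
            used = {colors[u] for u in adj[v] if u in colors}
            c = 0
            while c in used:
                c += 1
            colors[v] = c
        return colors

    # Columns 0 and 2 never share a row, so they receive the same color.
    adj = {0: [1, 3], 1: [0, 2, 3], 2: [1, 3], 3: [0, 1, 2]}
    print(greedy_coloring(adj))  # {1: 0, 3: 1, 0: 2, 2: 2}
    ```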

  10. Addressing socioeconomic and political challenges posed by climate change

    NASA Astrophysics Data System (ADS)

    Fernando, Harindra Joseph; Klaic, Zvjezdana Bencetic

    2011-08-01

    NATO Advanced Research Workshop: Climate Change, Human Health and National Security; Dubrovnik, Croatia, 28-30 April 2011; Climate change has been identified as one of the most serious threats to humanity. It not only causes sea level rise, drought, crop failure, vector-borne diseases, extreme events, degradation of water and air quality, heat waves, and other phenomena, but it is also a threat multiplier wherein the concatenation of multiple events may lead to frequent human catastrophes and intranational and international conflicts. In particular, urban areas may bear the brunt of climate change because of the amplification of climate effects that cascade down from global to urban scales, but current modeling and downscaling capabilities are unable to predict these effects with confidence. These were the main conclusions of a NATO Advanced Research Workshop (ARW) sponsored by the NATO Science for Peace and Security program. Thirty-two invitees from 17 countries, including leading modelers; natural, political, and social scientists; engineers; politicians; military experts; urban planners; industry analysts; epidemiologists; and health care professionals, parsed the topic on a common platform.

  11. EMERGING MOLECULAR AND COMPUTATIONAL APPROACHES FOR CROSS-SPECIES EXTRAPOLATIONS: A WORKSHOP SUMMARY REPORT

    EPA Science Inventory

    Benson, W.H., R.T. Di Giulio, J.C. Cook, J. Freedman, R.L. Malek, C. Thompson and D. Versteeg. In press. Emerging Molecular and Computational Approaches for Cross-Species Extrapolations: A Workshop Summary Report (Abstract). To be presented at the SETAC Fourth World Congress, 14-...

  12. Predictive Toxicology: Current Status and Future Outlook (EBI ...

    EPA Pesticide Factsheets

    Slide presentation at the EBI-EMBL Industry Programme Workshop on Predictive Toxicology and the current status of Computational Toxicology activities at the US EPA.

  13. 75 FR 18849 - Food and Drug Administration/National Heart Lung and Blood Institute/National Science Foundation...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-04-13

    ... cardiovascular diseases and therapies; Patient-specific modeling, including virtual surgical planning and... Workshop on Computer Methods for Cardiovascular Devices: The Integration of Nonclinical and Clinical Models...

  14. Instructional Styles, Attitudes and Experiences of Seniors in Computer Workshops

    ERIC Educational Resources Information Center

    Wood, Eileen; Lanuza, Catherine; Baciu, Iuliana; MacKenzie, Meagan; Nosko, Amanda

    2010-01-01

    Sixty-four seniors were introduced to computers through a series of five weekly workshops. Participants were given instruction followed by hands-on experience for topics related to social communication, information seeking, games, and word processing and were observed to determine their preferences for instructional support. Observations of…

  15. The Writing Workshop as an Inservice Tool: A Case Study.

    ERIC Educational Resources Information Center

    Pollock, Jeri

    1994-01-01

    Presents a case study of an inservice writing workshop (at Our Lady of Mercy School in Rio de Janeiro, Brazil) designed to give teachers hands-on experience in applying computer writing to their individual subjects. Describes how a computer culture was developed at the school. (RS)

  16. A recent history of science cases for optical interferometry

    NASA Astrophysics Data System (ADS)

    Defrère, Denis; Aerts, Conny; Kishimoto, Makoto; Léna, Pierre

    2018-04-01

    Optical long-baseline interferometry is a unique and powerful technique for astronomical research. Since the 1980's (with I2T, GI2T, Mark I to III, SUSI, ...), optical interferometers have produced an increasing number of scientific papers covering various fields of astrophysics. As current interferometric facilities are reaching their maturity, we take the opportunity in this paper to summarize the conclusions of a few key meetings, workshops, and conferences dedicated to interferometry. We present the most persistent recommendations related to science cases and discuss some key technological developments required to address them. In the era of extremely large telescopes, optical long-baseline interferometers will remain crucial to probe the smallest spatial scales and make breakthrough discoveries.

  17. Using Discrete Event Simulation for Programming Model Exploration at Extreme-Scale: Macroscale Components for the Structural Simulation Toolkit (SST).

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wilke, Jeremiah J; Kenny, Joseph P.

    2015-02-01

    Discrete event simulation provides a powerful mechanism for designing and testing new extreme-scale programming models for high-performance computing. Rather than debug, run, and wait for results on an actual system, designers can first iterate through a simulator. This is particularly useful when test beds cannot be used, e.g. to explore hardware or scales that do not yet exist or are inaccessible. Here we detail the macroscale components of the Structural Simulation Toolkit (SST). Instead of depending on trace replay or state machines, the simulator is architected to execute real code on real software stacks. Our particular user-space threading framework allows massive scales to be simulated even on small clusters. The link between the discrete event core and the threading framework allows interesting performance metrics, like call graphs, to be collected from a simulated run. Performance analysis via simulation can thus become an important phase in extreme-scale programming model and runtime system design via the SST macroscale components.
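
    For readers unfamiliar with the underlying mechanism, the following minimal Python sketch (our illustration, unrelated to SST's actual C++ implementation) shows the heart of any discrete event core: a time-ordered event queue whose handlers may schedule further events, so simulated time jumps from event to event regardless of how large the modeled machine is.

    ```python
    import heapq

    def run(initial_events, until=float("inf")):
        """Minimal discrete-event core: pop the earliest (time, seq, handler)
        entry, advance simulated time to it, and let the handler schedule
        more events. The seq counter breaks ties between simultaneous events."""
        heap, seq = [], 0
        def schedule(t, handler):
            nonlocal seq
            heapq.heappush(heap, (t, seq, handler))
            seq += 1
        for t, handler in initial_events:
            schedule(t, handler)
        now = 0.0
        while heap and heap[0][0] <= until:
            now, _, handler = heapq.heappop(heap)
            handler(now, schedule)
        return now

    # A 'message' arrives at t=1.0 and triggers a reply 0.5 time units later.
    def on_message(now, schedule):
        schedule(now + 0.5, lambda t, s: print(f"reply handled at t={t}"))

    print("finished at t =", run([(1.0, on_message)]))
    ```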

  18. Second user workshop on high-power lasers at the Linac Coherent Light Source

    DOE PAGES

    Heimann, Phil; Glenzer, Siegfried

    2015-05-28

    The second international workshop on the physics enabled by the unique combination of high-power lasers with the world-class Linac Coherent Light Source (LCLS) free-electron X-ray laser beam was held in Stanford, CA, on October 7–8, 2014. The workshop was co-organized by UC Berkeley, Lawrence Berkeley, Lawrence Livermore, and SLAC National Accelerator Laboratories. More than 120 scientists, including 40 students and postdoctoral scientists who are working in high-intensity laser-matter interactions, fusion research, and dynamic high-pressure science came together from North America, Europe, and Asia. The focus of the second workshop was on scientific highlights and the lessons learned from 16 new experiments that were performed on the Matter in Extreme Conditions (MEC) instrument since the first workshop was held one year ago.

  19. Final Report National Laboratory Professional Development Workshop for Underrepresented Participants

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Taylor, Valerie

    The 2013 CMD-IT National Laboratories Professional Development Workshop for Underrepresented Participants (CMD-IT NLPDev 2013) was held at the Oak Ridge National Laboratory campus in Oak Ridge, TN, from June 13 - 14, 2013. Sponsored by the Department of Energy (DOE) Advanced Scientific Computing Research Program, the primary goal of these workshops is to provide information about career opportunities in computational science at the various national laboratories and to mentor the underrepresented participants through community building and expert presentations focused on career success. This second annual workshop offered sessions to facilitate career advancement and, in particular, the strategies and resources needed to be successful at the national laboratories.

  20. Reactive multiphase flow simulation workshop summary

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    VanderHeyden, W.B.

    1995-09-01

    A workshop on computer simulation of reactive multiphase flow was held on May 18 and 19, 1995 in the Computational Testbed for Industry at Los Alamos National Laboratory (LANL), Los Alamos, New Mexico. Approximately 35 to 40 people attended the workshop. This included 21 participants from 12 companies representing the petroleum, chemical, environmental and consumer products industries, two representatives from the DOE Office of Industrial Technologies and several from Los Alamos. The dialog at the meeting suggested that reactive multiphase flow simulation represents an excellent candidate for government/industry/academia collaborative research. A white paper on a potential consortium for reactive multiphase flow with input from workshop participants will be issued separately.

  1. Workshop on Advances in Scientific Computation and Differential Equations (SCADE)

    DTIC Science & Technology

    1994-07-18

    Approved for public release; distribution unlimited. ...called differential algebraic ODEs (DAEs). (Some important early research on this topic was by L. Petzold.) Both theoretically and in terms of...completely specify the solution. In many physical systems, especially those in biology, or other large-scale slowly responding systems, the inclusion of some

  2. Unsteady Aerodynamic Validation Experiences From the Aeroelastic Prediction Workshop

    NASA Technical Reports Server (NTRS)

    Heeg, Jennifer; Chwalowski, Pawel

    2014-01-01

    The AIAA Aeroelastic Prediction Workshop (AePW) was held in April 2012, bringing together communities of aeroelasticians, computational fluid dynamicists and experimentalists. The extended objective was to assess the state of the art in computational aeroelastic methods as practical tools for the prediction of static and dynamic aeroelastic phenomena. As a step in this process, workshop participants analyzed unsteady aerodynamic and weakly-coupled aeroelastic cases. Forced oscillation and unforced system experiments and computations have been compared for three configurations. This paper emphasizes interpretation of the experimental data, computational results and their comparisons from the perspective of validation of unsteady system predictions. The issues examined in detail are variability introduced by input choices for the computations, post-processing, and static aeroelastic modeling. The final issue addressed is interpreting unsteady information that is present in experimental data that is assumed to be steady, and the resulting consequences on the comparison data sets.

  3. ICASE Workshop on Programming Computational Grids

    DTIC Science & Technology

    2001-09-01

    ICASE Workshop on Programming Computational Grids, Thomas M. Eidson and Merrell L. Patrick, ICASE, NASA Langley Research Center, Hampton, Virginia. ...clear that neither group fully understood the ideas and problems of the other. It was also clear that neither group is given the time and support to

  4. Human-Computer Interaction and Virtual Environments

    NASA Technical Reports Server (NTRS)

    Noor, Ahmed K. (Compiler)

    1995-01-01

    The proceedings of the Workshop on Human-Computer Interaction and Virtual Environments are presented along with a list of attendees. The objectives of the workshop were to assess the state-of-technology and level of maturity of several areas in human-computer interaction and to provide guidelines for focused future research leading to effective use of these facilities in the design/fabrication and operation of future high-performance engineering systems.

  5. Proceedings of the international workshop on measurement and computation of turbulent nonpremixed flames

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Barlow, R.S.

    This report documents the proceedings of the International Workshop on Measurement and Computation of Turbulent Nonpremixed Flames, held in Naples, Italy on July 26--27, 1996. Contents include materials that were distributed to participants at the beginning of the workshop, as well as a Summary of Workshop Accomplishments that was generated at the close of the Naples meeting. The Naples workshop involved sixty-one people from eleven countries. The primary objectives were: (1) to select a set of well-documented and relatively simple flames that would be appropriate for collaborative comparisons of model predictions; and (2) to specify common submodels to be used in these predictions, such that models for the coupling of turbulence and chemistry might be isolated and better understood. Studies involve hydrogen and natural gas fuels. These proceedings are also published on the Web; those interested in the ongoing process of data selection and model comparison should consult the workshop page for the most recent and complete information on these collaborative research efforts. The URL is: http://www.ca.sandia.gov/tdf/Workshop.html.

  6. Workshop Report on Additive Manufacturing for Large-Scale Metal Components - Development and Deployment of Metal Big-Area-Additive-Manufacturing (Large-Scale Metals AM) System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Babu, Sudarsanam Suresh; Love, Lonnie J.; Peter, William H.

    Additive manufacturing (AM) is considered an emerging technology that is expected to transform the way industry can make low-volume, high-value complex structures. This disruptive technology promises to replace legacy manufacturing methods for the fabrication of existing components in addition to bringing new innovation for new components with increased functional and mechanical properties. This report outlines the outcome of a workshop on large-scale metal additive manufacturing held at Oak Ridge National Laboratory (ORNL) on March 11, 2016. The charter for the workshop was outlined by the Department of Energy (DOE) Advanced Manufacturing Office program manager. The status and impact of the Big Area Additive Manufacturing (BAAM) system for polymer matrix composites was presented as the background motivation for the workshop. Following this, the extension of the underlying technology to low-cost metals was proposed with the following goals: (i) high deposition rates (approaching 100 lbs/h); (ii) low cost (<$10/lb) for steel, iron, aluminum, and nickel, as well as higher-cost titanium; (iii) large components (major axis greater than 6 ft); and (iv) compliance with property requirements. The above concept was discussed in depth by representatives from different industrial sectors including welding, metal fabrication machinery, energy, construction, aerospace and heavy manufacturing. In addition, DOE's newly launched High Performance Computing for Manufacturing (HPC4MFG) program was reviewed. This program will apply thermo-mechanical models to elucidate deeper understanding of the interactions between design, process, and materials during additive manufacturing. Following these presentations, all the attendees took part in a brainstorming session where everyone identified the top 10 challenges in large-scale metal AM from their own perspective. The feedback was analyzed and grouped in different categories including: (i) CAD-to-part software; (ii) selection of energy source; (iii) systems development; (iv) material feedstock; (v) process planning; (vi) residual stress and distortion; (vii) post-processing; (viii) qualification of parts; (ix) supply chain; and (x) business case. Furthermore, an open innovation network methodology was proposed to accelerate the development and deployment of new large-scale metal additive manufacturing technology with the goal of creating a new generation of high deposition rate equipment, affordable feedstocks, and large metallic components to enhance America's economic competitiveness.

  7. Can Tablet Computers Enhance Faculty Teaching?

    PubMed

    Narayan, Aditee P; Whicker, Shari A; Benjamin, Robert W; Hawley, Jeffrey; McGann, Kathleen A

    2015-06-01

    Learner benefits of tablet computer use have been demonstrated, yet there is little evidence regarding faculty tablet use for teaching. Our study sought to determine if supplying faculty with tablet computers and peer mentoring provided benefits to learners and faculty beyond that of non-tablet-based teaching modalities. We provided faculty with tablet computers and three 2-hour peer-mentoring workshops on tablet-based teaching. Faculty used tablets to teach, in addition to their current, non-tablet-based methods. Presurveys, postsurveys, and monthly faculty surveys assessed feasibility, utilization, and comparisons to current modalities. Learner surveys assessed perceived effectiveness and comparisons to current modalities. All feedback received from open-ended questions was reviewed by the authors and organized into categories. Of 15 eligible faculty, 14 participated. Each participant attended at least 2 of the 3 workshops, with 10 to 12 participants at each workshop. All participants found the workshops useful, and reported that the new tablet-based teaching modality added value beyond that of current teaching methods. Respondents developed the following tablet-based outputs: presentations, photo galleries, evaluation tools, and online modules. Of the outputs, 60% were used in the ambulatory clinics, 33% in intensive care unit bedside teaching rounds, and 7% in inpatient medical unit bedside teaching rounds. Learners reported that common benefits of tablet computers were: improved access/convenience (41%), improved interactive learning (38%), and improved bedside teaching and patient care (13%). A common barrier faculty identified was inconsistent wireless access (14%), while no barriers were identified by the majority of learners. Providing faculty with tablet computers and having peer-mentoring workshops to discuss their use was feasible and added value.

  8. Comprehensive Materials and Morphologies Study of Ion Traps (COMMIT) for Scalable Quantum Computation

    DTIC Science & Technology

    2012-04-21

    ...the photoelectric effect. The typical shortest wavelengths needed for ion traps range from 194 nm for Hg+ to 493 nm for Ba+, corresponding to 6.4-2.5... Trapped ion systems are extremely promising for large-scale quantum computation, but face a vexing problem, with motional quantum

  9. (U) Status of Trinity and Crossroads Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Archer, Billy Joe; Lujan, James Westley; Hemmert, K. S.

    2017-01-10

    (U) This paper provides a general overview of current and future plans for the Advanced Simulation and Computing (ASC) Advanced Technology (AT) systems fielded by the New Mexico Alliance for Computing at Extreme Scale (ACES), a collaboration between Los Alamos National Laboratory and Sandia National Laboratories. Additionally, this paper touches on research into technology beyond traditional CMOS. The status of Trinity, ASC's first AT system, and of Crossroads, anticipated to succeed Trinity as the third AT system in 2020, will be presented, along with initial performance studies of the Intel Knights Landing Xeon Phi processors introduced on Trinity. The challenges and opportunities for our production simulation codes on AT systems will also be discussed. Trinity and Crossroads are a joint procurement by ACES and Lawrence Berkeley National Laboratory as part of the Alliance for application Performance at EXtreme scale (APEX), http://apex.lanl.gov.

  10. Some issues related to the novel spectral acceleration method for the fast computation of radiation/scattering from one-dimensional extremely large scale quasi-planar structures

    NASA Astrophysics Data System (ADS)

    Torrungrueng, Danai; Johnson, Joel T.; Chou, Hsi-Tseng

    2002-03-01

    The novel spectral acceleration (NSA) algorithm has been shown to produce an O(Ntot) efficient iterative method of moments for the computation of radiation/scattering from both one-dimensional (1-D) and two-dimensional large-scale quasi-planar structures, where Ntot is the total number of unknowns to be solved. This method accelerates the matrix-vector multiplication in an iterative method of moments solution and divides contributions between points into ``strong'' (exact matrix elements) and ``weak'' (NSA algorithm) regions. The NSA method is based on a spectral representation of the electromagnetic Green's function and appropriate contour deformation, resulting in a fast multipole-like formulation in which contributions from large numbers of points to a single point are evaluated simultaneously. In the standard NSA algorithm, the NSA parameters are derived on the basis of the assumption that the outermost possible saddle point, φs,max, along the real axis in the complex angular domain is small. For given height variations of quasi-planar structures, this assumption can be satisfied by adjusting the size of the strong region Ls. However, for quasi-planar structures with large height variations, the adjusted size of the strong region is typically large, which significantly increases the computational time for the strong-region contribution and degrades the overall efficiency of the NSA algorithm. In addition, for the case of extremely large scale structures, studies based on the physical optics approximation and a flat surface assumption show that the given NSA parameters in the standard NSA algorithm may yield inaccurate results. In this paper, analytical formulas associated with the NSA parameters for an arbitrary value of φs,max are presented, resulting in more flexibility in selecting Ls to compromise between the computation of the contributions of the strong and weak regions. In addition, a ``multilevel'' algorithm, which decomposes 1-D extremely large scale quasi-planar structures into more than one weak region and appropriately chooses the NSA parameters for each weak region, is incorporated into the original NSA method to improve its accuracy.
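
    To make the strong/weak decomposition concrete, here is a toy Python sketch (our illustration only; the actual NSA method evaluates the weak term through a spectral saddle-point representation of the Green's function rather than explicitly): contributions within a band of half-width Ls around each observation point use exact matrix elements, and the remainder is the far-field term that the acceleration scheme replaces.

    ```python
    import numpy as np

    def split_matvec(A, x, Ls, weak_apply=None):
        """Compute y = A @ x as a 'strong' banded part (exact elements within
        Ls of the diagonal) plus a 'weak' far-field remainder. Here the weak
        part is formed explicitly just to show the decomposition; an NSA-style
        solver would supply weak_apply, a fast spectral evaluation, to reach
        O(Ntot) cost per iteration."""
        n = len(x)
        y_strong = np.zeros(n, dtype=np.result_type(A, x))
        A_weak = A.copy()
        for i in range(n):
            lo, hi = max(0, i - Ls), min(n, i + Ls + 1)
            y_strong[i] = A[i, lo:hi] @ x[lo:hi]
            A_weak[i, lo:hi] = 0.0  # remove the strong band from the weak operator
        y_weak = A_weak @ x if weak_apply is None else weak_apply(x)
        return y_strong + y_weak

    # Sanity check: the split reproduces the full product exactly.
    A = np.random.rand(8, 8); x = np.random.rand(8)
    assert np.allclose(split_matvec(A, x, Ls=2), A @ x)
    ```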

  11. Institute for scientific computing research;fiscal year 1999 annual report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Keyes, D

    2000-03-28

    Large-scale scientific computation, and all of the disciplines that support it and help to validate it, have been placed at the focus of Lawrence Livermore National Laboratory by the Accelerated Strategic Computing Initiative (ASCI). The Laboratory operates the computer with the highest peak performance in the world and has undertaken some of the largest and most compute-intensive simulations ever performed. Computers at the architectural extremes, however, are notoriously difficult to use efficiently. Even such successes as the Laboratory's two Bell Prizes awarded in November 1999 only emphasize the need for much better ways of interacting with the results of large-scale simulations. Advances in scientific computing research have, therefore, never been more vital to the core missions of the Laboratory than at present. Computational science is evolving so rapidly along every one of its research fronts that to remain on the leading edge, the Laboratory must engage researchers at many academic centers of excellence. In FY 1999, the Institute for Scientific Computing Research (ISCR) expanded the Laboratory's bridge to the academic community in the form of collaborative subcontracts, visiting faculty, student internships, a workshop, and a very active seminar series. ISCR research participants are integrated almost seamlessly with the Laboratory's Center for Applied Scientific Computing (CASC), which, in turn, addresses computational challenges arising throughout the Laboratory. Administratively, the ISCR flourishes under the Laboratory's University Relations Program (URP). Together with the other four Institutes of the URP, it must navigate a course that allows the Laboratory to benefit from academic exchanges while preserving national security. Although FY 1999 brought more than its share of challenges to the operation of an academic-like research enterprise within the context of a national security laboratory, the results declare the challenges well met and well worth the continued effort. A change of administration for the ISCR occurred during FY 1999. Acting Director John Fitzgerald retired from LLNL in August after 35 years of service, including the last two at the helm of the ISCR. David Keyes, who has been a regular visitor in conjunction with ASCI scalable algorithms research since October 1997, overlapped with John for three months and serves half-time as the new Acting Director.

  12. An Undergraduate Research Experience Studying Ras and Ras Mutants

    ERIC Educational Resources Information Center

    Griffeth, Nancy; Batista, Naralys; Grosso, Terri; Arianna, Gianluca; Bhatia, Ravnit; Boukerche, Faiza; Crispi, Nicholas; Fuller, Neno; Gauza, Piotr; Kingsbury, Lyle; Krynski, Kamil; Levine, Alina; Ma, Rui Yan; Nam, Jennifer; Pearl, Eitan; Rosa, Alessandro; Salarbux, Stephanie; Sun, Dylan

    2016-01-01

    Each January from 2010 to 2014, an undergraduate workshop on modeling biological systems was held at Lehman College of the City University of New York. The workshops were funded by a National Science Foundation (NSF) Expedition in Computing, "Computational Modeling and Analysis of Complex Systems (CMACS)." The primary goal was to…

  13. DOE Office of Scientific and Technical Information (OSTI.GOV)

    McCaskey, Alexander J.

    Hybrid programming models for beyond-CMOS technologies will prove critical for integrating new computing technologies alongside our existing infrastructure. Unfortunately, the software infrastructure required to enable this is largely lacking or unavailable. XACC is a programming framework for extreme-scale, post-exascale accelerator architectures that integrates with existing conventional applications. It is a pluggable framework for programming languages developed for next-generation computing hardware architectures such as quantum and neuromorphic computing. It lets computational scientists efficiently off-load classically intractable work to attached accelerators through user-friendly kernel definitions. XACC makes post-exascale hybrid programming approachable for domain computational scientists.
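
    The record does not show XACC's API (which is a C++ framework); purely to illustrate the kernel-offload pattern it describes, the following hypothetical Python sketch (all names ours, not XACC's) registers pluggable backends and routes decorated kernels to whichever backend is requested.

    ```python
    BACKENDS = {}

    def backend(name):
        """Register a pluggable execution backend under a name."""
        def register(cls):
            BACKENDS[name] = cls()
            return cls
        return register

    @backend("simulator")
    class Simulator:
        """Classical fallback backend; a real framework would dispatch to a
        quantum or neuromorphic device driver here."""
        def execute(self, kernel, *args):
            return kernel(*args)

    def offload(target):
        """Mark a function as an accelerator kernel bound to a backend."""
        def wrap(kernel):
            def run(*args):
                return BACKENDS[target].execute(kernel, *args)
            return run
        return wrap

    @offload("simulator")
    def dot(a, b):
        # Stand-in kernel; on real hardware this body would be translated
        # to the accelerator's native programming model.
        return sum(x * y for x, y in zip(a, b))

    print(dot([1, 2, 3], [4, 5, 6]))  # 32
    ```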

  14. LAVA Simulations for the AIAA Sonic Boom Prediction Workshop

    NASA Technical Reports Server (NTRS)

    Housman, Jeffrey A.; Sozer, Emre; Moini-Yekta , Shayan; Kiris, Cetin C.

    2014-01-01

    Computational simulations using the Launch Ascent and Vehicle Aerodynamics (LAVA) framework are presented for the First AIAA Sonic Boom Prediction Workshop test cases. The framework is utilized with both structured overset and unstructured meshing approaches. The three workshop test cases include an axisymmetric body, a Delta Wing-Body model, and a complete low-boom supersonic transport concept. Solution sensitivity to mesh type and sizing, and several numerical convective flux discretization choices, is presented and discussed. Favorable comparisons between the computational simulations and experimental data of near- and mid-field pressure signatures were obtained.

  15. The Role of Computers in Research and Development at Langley Research Center

    NASA Technical Reports Server (NTRS)

    Wieseman, Carol D. (Compiler)

    1994-01-01

    This document is a compilation of presentations given at a workshop on the role of computers in research and development at the Langley Research Center. The objectives of the workshop were to inform the Langley Research Center community of the current software systems and software practices in use at Langley. The workshop was organized in 10 sessions: Software Engineering; Software Engineering Standards, Methods, and CASE Tools; Solutions of Equations; Automatic Differentiation; Mosaic and the World Wide Web; Graphics and Image Processing; System Design Integration; CAE Tools; Languages; and Advanced Topics.

  16. PREFACE: 16th International workshop on Advanced Computing and Analysis Techniques in physics research (ACAT2014)

    NASA Astrophysics Data System (ADS)

    Fiala, L.; Lokajicek, M.; Tumova, N.

    2015-05-01

    This volume of the IOP Conference Series is dedicated to scientific contributions presented at the 16th International Workshop on Advanced Computing and Analysis Techniques in Physics Research (ACAT 2014), whose motto this year was ''bridging disciplines''. The conference took place on September 1-5, 2014, at the Faculty of Civil Engineering, Czech Technical University in Prague, Czech Republic. The 16th edition of ACAT explored the boundaries of computing system architectures, data analysis algorithmics, automatic calculations, and theoretical calculation technologies. It provided a forum for confronting and exchanging ideas among these fields, where new approaches in computing technologies for scientific research were explored and promoted. This year's edition of the workshop brought together over 140 participants from all over the world. The workshop's 16 invited speakers presented key topics on advanced computing and analysis techniques in physics. During the workshop, 60 talks and 40 posters were presented in three tracks: Computing Technology for Physics Research; Data Analysis - Algorithms and Tools; and Computations in Theoretical Physics: Techniques and Methods. The round table enabled discussions on expanding software, knowledge sharing and scientific collaboration in the respective areas. ACAT 2014 was generously sponsored by Western Digital, Brookhaven National Laboratory, Hewlett Packard, DataDirect Networks, M Computers, Bright Computing, Huawei and PDV-Systemhaus. Special appreciation goes to the track liaisons Lorenzo Moneta, Axel Naumann and Grigory Rubtsov for their work on the scientific program and the publication preparation. ACAT's IACC would also like to express its gratitude to all referees for their work on making sure the contributions are published in the proceedings. Our thanks extend to the conference liaisons Andrei Kataev and Jerome Lauret, who worked with the local contacts and made this conference possible, as well as to the program coordinator Federico Carminati and the conference chair Denis Perret-Gallix for their global supervision. Further information on ACAT 2014 can be found at http://www.particle.cz/acat2014

  17. Aeronautical technology 2000: A projection of advanced vehicle concepts

    NASA Technical Reports Server (NTRS)

    1985-01-01

    The Aeronautics and Space Engineering Board (ASEB) of the National Research Council conducted a Workshop on Aeronautical Technology: a Projection to the Year 2000 (Aerotech 2000 Workshop). The panels were asked to project advances in aeronautical technologies that could be available by the year 2000. As the workshop was drawing to a close, it became evident that a more comprehensive investigation of advanced air vehicle concepts than was possible in the limited time available at the workshop would be valuable. Thus, a special panel on vehicle applications was organized. In the course of two meetings, the panel identified and described representative types of aircraft judged possible with the workshop's technology projections. These representative aircraft types include: military aircraft; transport aircraft; rotorcraft; extremely high altitude aircraft; and transatmospheric aircraft. Improvements in performance, efficiency, and operational characteristics possible through the application of the workshop's year 2000 technology projections were discussed. The subgroups also identified the technologies considered essential, enhancing, or supporting for achieving the projected aircraft improvements.

  18. The Growth Edge: Creative Use of Computers for Facilitating Learning and Enhancing Personal Development. Papers from the Workshop (Ann Arbor, Michigan, June 27-30, 1986).

    ERIC Educational Resources Information Center

    Walz, Garry R., Ed.; Bleuer, Jeanne C., Ed.

    This document is the fourth publication in a series devoted to the use of computers in counseling. The outgrowth of the 1986 ERIC/CAPS workshop, it contains four of the major presentations made at the conference. "The Impact of Computers on the Future of Counseling: Boom or Boomerang" (Edwin L. Herr) examines the effect of technology…

  19. Intelligent Computational Systems. Opening Remarks: CFD Application Process Workshop

    NASA Technical Reports Server (NTRS)

    VanDalsem, William R.

    1994-01-01

    This discussion will include a short review of the challenges that must be overcome if computational physics technology is to have a larger impact on the design cycles of U.S. aerospace companies. Some of the potential solutions to these challenges may come from the information sciences fields. A few examples of potential computational physics/information sciences synergy will be presented, as motivation and inspiration for the Improving The CFD Applications Process Workshop.

  20. Proceedings of the 2004 Workshop on CFD Validation of Synthetic Jets and Turbulent Separation Control

    NASA Technical Reports Server (NTRS)

    Rumsey, Christopher L. (Compiler)

    2007-01-01

    The papers presented here are from the Langley Research Center Workshop on Computational Fluid Dynamics (CFD) Validation of Synthetic Jets and Turbulent Separation Control (nicknamed "CFDVAL2004"), held March 2004 in Williamsburg, Virginia. The goal of the workshop was to bring together an international group of CFD practitioners to assess the current capabilities of different classes of turbulent flow solution methodologies to predict flow fields induced by synthetic jets and separation control geometries. The workshop consisted of three flow-control test cases of varying complexity, and participants could contribute to any number of the cases. Along with their workshop submissions, each participant included a short write-up describing their method for computing the particular case(s). These write-ups are presented as received from the authors with no editing. Descriptions of each of the test cases and experiments are also included.

  1. Computational data sciences for assessment and prediction of climate extremes

    NASA Astrophysics Data System (ADS)

    Ganguly, A. R.

    2011-12-01

    Climate extremes may be defined inclusively as severe weather events or large shifts in global or regional weather patterns which may be caused or exacerbated by natural climate variability or climate change. This area of research arguably represents one of the largest knowledge gaps in climate science relevant for informing resource managers and policy makers. While physics-based climate models are essential in view of non-stationary and nonlinear dynamical processes, their current pace of uncertainty reduction may not be adequate for urgent stakeholder needs. The structure of the models may in some cases preclude reduction of uncertainty for critical processes at the scales or for the extremes of interest. On the other hand, methods based on complex networks, extreme value statistics, machine learning, and space-time data mining have demonstrated significant promise to improve scientific understanding and generate enhanced predictions. When combined with conceptual process understanding at multiple spatiotemporal scales and designed to handle massive data, interdisciplinary data science methods and algorithms may complement or supplement physics-based models. Specific examples from the prior literature and our ongoing work suggest how data-guided improvements may be possible, for example, in the context of ocean meteorology, climate oscillators, teleconnections, and atmospheric process understanding, which in turn can improve projections of regional climate, precipitation extremes, and tropical cyclones in a useful and interpretable fashion. This motivates a community-wide effort to develop and adapt computational data science tools for translating climate model simulations into information relevant for adaptation and policy, as well as for improving our scientific understanding of climate extremes from both observed and model-simulated data.
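
    As an illustrative aside (not part of the record above): the complex-network methods mentioned here are often built from correlations between gridded anomaly series, with grid cells as nodes and strongly correlated pairs as edges. A minimal sketch with synthetic data and a hypothetical edge threshold:

```python
# Minimal sketch of a correlation-based "climate network": grid cells become
# nodes, strongly correlated anomaly series become edges. Synthetic data;
# the 0.5 edge threshold and the injected shared mode are hypothetical.
import numpy as np

rng = np.random.default_rng(42)
n_nodes, n_months = 50, 240
anomalies = rng.standard_normal((n_nodes, n_months))  # stand-in for SST anomalies
shared_mode = rng.standard_normal(n_months)           # a common "oscillation"
anomalies[:10] += 1.5 * shared_mode                   # first 10 cells share it

corr = np.corrcoef(anomalies)             # node-by-node correlation matrix
threshold = 0.5                           # hypothetical edge threshold
adjacency = (np.abs(corr) > threshold) & ~np.eye(n_nodes, dtype=bool)

degree = adjacency.sum(axis=1)            # highly connected nodes can flag
print("max node degree:", degree.max())   # candidate teleconnection hubs
```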

  2. A Fault Oblivious Extreme-Scale Execution Environment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McKie, Jim

    The FOX project, funded under the ASCR X-stack I program, developed systems software and runtime libraries for a new approach to the data and work distribution for massively parallel, fault oblivious application execution. Our work was motivated by the premise that exascale computing systems will provide a thousand-fold increase in parallelism and a proportional increase in failure rate relative to today’s machines. To deliver the capability of exascale hardware, the systems software must provide the infrastructure to support existing applications while simultaneously enabling efficient execution of new programming models that naturally express dynamic, adaptive, irregular computation; coupled simulations; and massive data analysis in a highly unreliable hardware environment with billions of threads of execution. Our OS research has prototyped new methods to provide efficient resource sharing, synchronization, and protection in a many-core compute node. We have experimented with alternative task/dataflow programming models and shown scalability in some cases to hundreds of thousands of cores. Much of our software is in active development through open source projects. Concepts from FOX are being pursued in next generation exascale operating systems. Our OS work focused on adaptive, application-tailored OS services optimized for multi- to many-core processors. We developed a new operating system, NIX, that supports role-based allocation of cores to processes and was released to open source. We contributed to the IBM FusedOS project, which promoted the concept of latency-optimized and throughput-optimized cores. We built a task queue library based on a distributed, fault-tolerant key-value store and identified scaling issues. A second fault-tolerant task-parallel library was developed, based on the Linda tuple space model, that used low-level interconnect primitives for optimized communication. We designed fault tolerance mechanisms for task-parallel computations employing work stealing for load balancing that scaled to the largest existing supercomputers. Finally, we implemented the Elastic Building Blocks runtime, a library to manage object-oriented distributed software components. To support the research, we won two INCITE awards for time on Intrepid (BG/P) and Mira (BG/Q). Much of our work has had impact in the OS and runtime community through the ASCR Exascale OS/R workshop and report, leading to the research agenda of the Exascale OS/R program. Our project was, however, also affected by attrition of multiple PIs. While the PIs continued to participate and offer guidance as time permitted, losing these key individuals was unfortunate both for the project and for the DOE HPC community.
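
    As an illustrative aside: the report cites work stealing for load balancing in task-parallel computations. The toy sketch below shows the core of that idea (idle workers steal from the tail of a victim's task deque); it is a generic illustration, not the FOX runtime itself.

```python
# Toy illustration of work stealing: each worker pops tasks from the head of
# its own deque and, when idle, steals from the tail of another worker's
# deque. A generic sketch of the idea, not the FOX runtime.
import threading
from collections import deque

NUM_WORKERS = 4
# All 100 tasks start on worker 0, so the others must steal to stay busy.
queues = [deque(range(100))] + [deque() for _ in range(NUM_WORKERS - 1)]
locks = [threading.Lock() for _ in range(NUM_WORKERS)]
done = [0] * NUM_WORKERS   # tasks completed per worker (each writes its own slot)

def worker(wid: int) -> None:
    while True:
        task = None
        with locks[wid]:
            if queues[wid]:
                task = queues[wid].popleft()         # local work: pop from head
        if task is None:
            for victim in range(NUM_WORKERS):        # idle: scan for a victim
                with locks[victim]:
                    if queues[victim]:
                        task = queues[victim].pop()  # steal from the tail
                        break
        if task is None:
            return                                   # every queue empty: all done
        done[wid] += 1                               # "execute" the task

threads = [threading.Thread(target=worker, args=(i,)) for i in range(NUM_WORKERS)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print("tasks executed per worker:", done)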

  3. The Astronomy Workshop: Scientific Notation and Solar System Visualizer

    NASA Astrophysics Data System (ADS)

    Deming, Grace; Hamilton, D.; Hayes-Gehrke, M.

    2008-09-01

    The Astronomy Workshop (http://janus.astro.umd.edu) is a collection of interactive World Wide Web tools that were developed under the direction of Doug Hamilton for use in undergraduate classes and by the general public. The philosophy of the site is to foster student interest in astronomy by exploiting their fascination with computers and the internet. We have expanded the "Scientific Notation" tool from simply converting decimal numbers into and out of scientific notation to adding, subtracting, multiplying, and dividing numbers expressed in scientific notation. Students practice these skills and, when confident, may complete a quiz. In addition, there are suggestions on how instructors may use the site to encourage students to practice these basic skills. The Solar System Visualizer animates orbits of planets, moons, and rings to scale. Extrasolar planetary systems are also featured. This research was sponsored by NASA EPO grant NNG06GGF99G.
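
    As an illustrative aside, the arithmetic the expanded Scientific Notation tool drills (convert a number, then combine mantissas and exponents) can be sketched in a few lines; this is a generic illustration, not the site's actual code.

```python
# Sketch of the kind of drill the "Scientific Notation" tool performs:
# convert a decimal number to scientific notation, and multiply two such
# numbers by hand (multiply mantissas, add exponents), then renormalize.
import math

def to_sci(x: float) -> tuple[float, int]:
    """Return (mantissa, exponent) with 1 <= |mantissa| < 10; x must be nonzero."""
    exp = math.floor(math.log10(abs(x)))
    return x / 10**exp, exp

def multiply_sci(a: tuple[float, int], b: tuple[float, int]) -> tuple[float, int]:
    mant, exp = a[0] * b[0], a[1] + b[1]
    if abs(mant) >= 10:            # renormalize, e.g. 12.0e10 -> 1.2e11
        mant /= 10
        exp += 1
    return mant, exp

print(to_sci(149_600_000_000))           # ~(1.496, 11): Earth-Sun distance in m
print(multiply_sci((3.0, 8), (4.0, 2)))  # (1.2, 11)
```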

  4. Self-Directed Student Research through Analysis of Microarray Datasets: A Computer-Based Functional Genomics Practical Class for Masters-Level Students

    ERIC Educational Resources Information Center

    Grenville-Briggs, Laura J.; Stansfield, Ian

    2011-01-01

    This report describes a linked series of Masters-level computer practical workshops. They comprise an advanced functional genomics investigation, based upon analysis of a microarray dataset probing yeast DNA damage responses. The workshops require the students to analyse highly complex transcriptomics datasets, and were designed to stimulate…

  5. 77 FR 31026 - Use of Computer Simulation of the United States Blood Supply in Support of Planning for Emergency...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-05-24

    ... enhancements to extend the model predictions from red blood cell units to other blood components, such as...] Use of Computer Simulation of the United States Blood Supply in Support of Planning for Emergency...: Notice of public workshop. The Food and Drug Administration (FDA) is announcing a public workshop...

  6. Pedagogy and Processes for a Computer Programming Outreach Workshop--The Bridge to College Model

    ERIC Educational Resources Information Center

    Tangney, Brendan; Oldham, Elizabeth; Conneely, Claire; Barrett, Stephen; Lawlor, John

    2010-01-01

    This paper describes a model for computer programming outreach workshops aimed at second-level students (ages 15-16). Participants engage in a series of programming activities based on the Scratch visual programming language, and a very strong group-based pedagogy is followed. Participants are not required to have any prior programming experience.…

  7. Looking for Life in Extreme Environments on Earth and Beyond: Professional Development Workshop for Educators

    NASA Astrophysics Data System (ADS)

    Droppo, R.; Pratt, L.; Suchecki, P. C.

    2010-08-01

    The Looking for Life in Extreme Environments workshop held at Indiana University Bloomington in July of 2009 was the first in a series of workshops for high-school teachers that are currently in development. The workshops' modules are based on the research of faculty members in the Departments of Geological Sciences, Biology, and Astronomy, the School of Education, and the School of Public and Environmental Affairs at Indiana University Bloomington; the modules use lessons from Exploring Deep-Subsurface Life. Earth Analogues for Possible Life on Mars: Lessons and Activities, curricular materials that were produced and edited by Lisa Pratt and Ruth Droppo and published by NASA in 2008. Exploring Deep-Subsurface Life comprises a workbook, a DVD (with closed-captioning), and a CD with the lessons in digital text format for adaptation to classroom needs and printing. Each lesson includes the National Education Standards that apply to the materials. The workbook's lessons are written with three considerations: Life Domains, Cellular Metabolism, and Extreme Environments and Microbes. Students are challenged to build, draw, measure, discuss, and participate in laboratory processes and experiments that help them understand and describe microbes and their environments. In the Capstone, the students write a grant proposal based on the three lessons' analogues. The DVD is a collection of videotaped interviews with scientists in laboratories at Michigan State, Princeton, and Indiana University, who are working on water and gas samples they collected from deep gold mines in South Africa and the Canadian Arctic. The interview materials and some animated graphics are compiled into four video pieces that support and complement the accompanying workbook lessons and activities, and offer students insight into the excitement of scientific discovery.

  8. Progress in fast, accurate multi-scale climate simulations

    DOE PAGES

    Collins, W. D.; Johansen, H.; Evans, K. J.; ...

    2015-06-01

    We present a survey of physical and computational techniques that have the potential to contribute to the next generation of high-fidelity, multi-scale climate simulations. Examples of the climate science problems that can be investigated in greater depth with these computational improvements include the capture of remote forcings of localized hydrological extreme events, an accurate representation of cloud features over a range of spatial and temporal scales, and parallel, large ensembles of simulations to more effectively explore model sensitivities and uncertainties. Numerical techniques, such as adaptive mesh refinement, implicit time integration, and separate treatment of fast physical time scales, are enabling improved accuracy and fidelity in simulation of dynamics and allowing more complete representations of climate features at the global scale. At the same time, partnerships with computer science teams have focused on taking advantage of evolving computer architectures such as many-core processors and GPUs. As a result, approaches which were previously considered prohibitively costly have become both more efficient and scalable. In combination, progress in these three critical areas is poised to transform climate modeling in the coming decades.
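
    As an illustrative aside, implicit time integration (one of the surveyed techniques) tolerates far larger time steps on stiff problems than explicit stepping. A toy comparison on a stiff linear ODE, not taken from the paper:

```python
# Toy comparison of explicit vs. implicit (backward) Euler on the stiff ODE
# y' = -1000*(y - cos(t)). At dt = 0.01 the explicit update has amplification
# factor 1 - 1000*dt = -9 and blows up; the implicit update stays stable.
import numpy as np

lam, dt, steps = 1000.0, 0.01, 100
t, y_exp, y_imp = 0.0, 1.0, 1.0

for _ in range(steps):
    y_exp = y_exp + dt * (-lam * (y_exp - np.cos(t)))           # explicit Euler
    # backward Euler: solve y_new = y + dt * (-lam * (y_new - cos(t + dt)))
    y_imp = (y_imp + dt * lam * np.cos(t + dt)) / (1.0 + dt * lam)
    t += dt

print(f"explicit Euler: {y_exp:.3e} (unstable), backward Euler: {y_imp:.3f}")
```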

  9. Nanobiotechnology

    NASA Technical Reports Server (NTRS)

    Noor, Ahmed K. (Compiler)

    2000-01-01

    This document contains the proceedings of the Training Workshop on Nanobiotechnology held at NASA Langley Research Center, Hampton, Virginia, June 14-15, 2000. The workshop was jointly sponsored by the University of Virginia's Center for Advanced Computational Technology and NASA. Workshop attendees were from NASA, other government agencies, industry and universities. The objectives of the workshop were to give overviews of the diverse activities in nanobiotechnology and to identify their potential for future aerospace systems.

  10. Can Tablet Computers Enhance Faculty Teaching?

    PubMed Central

    Narayan, Aditee P.; Whicker, Shari A.; Benjamin, Robert W.; Hawley, Jeffrey; McGann, Kathleen A.

    2015-01-01

    Background Learner benefits of tablet computer use have been demonstrated, yet there is little evidence regarding faculty tablet use for teaching. Objective Our study sought to determine if supplying faculty with tablet computers and peer mentoring provided benefits to learners and faculty beyond that of non–tablet-based teaching modalities. Methods We provided faculty with tablet computers and three 2-hour peer-mentoring workshops on tablet-based teaching. Faculty used tablets to teach, in addition to their current, non–tablet-based methods. Presurveys, postsurveys, and monthly faculty surveys assessed feasibility, utilization, and comparisons to current modalities. Learner surveys assessed perceived effectiveness and comparisons to current modalities. All feedback received from open-ended questions was reviewed by the authors and organized into categories. Results Of 15 eligible faculty, 14 participated. Each participant attended at least 2 of the 3 workshops, with 10 to 12 participants at each workshop. All participants found the workshops useful, and reported that the new tablet-based teaching modality added value beyond that of current teaching methods. Respondents developed the following tablet-based outputs: presentations, photo galleries, evaluation tools, and online modules. Of the outputs, 60% were used in the ambulatory clinics, 33% in intensive care unit bedside teaching rounds, and 7% in inpatient medical unit bedside teaching rounds. Learners reported that common benefits of tablet computers were: improved access/convenience (41%), improved interactive learning (38%), and improved bedside teaching and patient care (13%). A common barrier faculty identified was inconsistent wireless access (14%), while no barriers were identified by the majority of learners. Conclusions Providing faculty with tablet computers and having peer-mentoring workshops to discuss their use was feasible and added value. PMID:26221443

  11. Asia-Pacific POPIN workshop on Internet.

    PubMed

    1996-01-01

    This brief article announces the accomplishments of the ESCAP Population Division of the Department of Economic and Social Information and Policy Analysis (DESIPA) in conjunction with the Asia-Pacific POPIN Internet (Information Superhighway) Training Workshop in popularizing useful new computer information technologies. A successful workshop was held in Bangkok in November 1996 for 18 people from 8 countries in the Asian and Pacific region, many of whom were from population information centers. Participants were taught techniques for disseminating population data and information through use of the Internet. Participants learned 1) how to use Windows software in the ESCAP local area network (LAN), 2) concepts such as HTML (hypertext mark-up language), and 3) detailed information about computer languages. Computer practice involved "surfing the Net (Internet)" and linking with the global POPIN site on the Internet. Participants learned about computer programs for information handling, how to prepare documents using HTML, how to mount information on the World Wide Web (WWW), how to convert existing documents into "HTML-style" files, and how to scan graphics, such as logos, photographs, and maps, for visual display on the Internet. The Workshop and the three training modules were funded by the UN Population Fund (UNFPA). The POPIN Coordinator was pleased that competency was achieved in such a short period of time.

  12. The use of wireless laptop computers for computer-assisted learning in pharmacokinetics.

    PubMed

    Munar, Myrna Y; Singh, Harleen; Belle, Donna; Brackett, Carolyn C; Earle, Sandra B

    2006-02-15

    To implement computer-assisted learning workshops into pharmacokinetics courses in a doctor of pharmacy (PharmD) program. Workshops were designed for students to utilize computer software programs on laptop computers to build pharmacokinetic models to predict drug concentrations resulting from various dosage regimens. In addition, students were able to visualize through graphing programs how altering different parameters changed drug concentration-time curves. Surveys were conducted to measure students' attitudes toward computer technology before and after implementation. Finally, traditional examinations were used to evaluate student learning. Doctor of pharmacy students responded favorably to the use of wireless laptop computers in problem-based pharmacokinetic workshops. Eighty-eight percent (n = 61/69) and 82% (n = 55/67) of PharmD students completed surveys before and after computer implementation, respectively. Prior to implementation, 95% of students agreed that computers would enhance learning in pharmacokinetics. After implementation, 98% of students strongly agreed (p < 0.05) that computers enhanced learning. Examination results were significantly higher after computer implementation (89% with computers vs. 84% without computers; p = 0.01). Implementation of wireless laptop computers in a pharmacokinetic course enabled students to construct their own pharmacokinetic models that could respond to changing parameters. Students had greater comprehension and were better able to interpret results and provide appropriate recommendations. Computer-assisted pharmacokinetic techniques can be powerful tools when making decisions about drug therapy.
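
    As an illustrative aside, the kind of model the students built can be sketched as a one-compartment, first-order-elimination model under repeated IV bolus dosing; the parameter values below are hypothetical, not from the course materials.

```python
# Sketch of a one-compartment pharmacokinetic model with first-order
# elimination and repeated IV bolus dosing -- the kind of model the workshop
# students constructed. All parameter values are hypothetical.
import numpy as np

dose_mg, V_L = 500.0, 40.0          # dose and volume of distribution
k_el = 0.2                          # elimination rate constant (1/h)
tau_h, n_doses = 8.0, 6             # dosing interval (h) and number of doses

t = np.linspace(0, n_doses * tau_h, 500)
conc = np.zeros_like(t)
for i in range(n_doses):            # superpose each dose's decaying curve
    t_dose = i * tau_h
    conc += np.where(t >= t_dose,
                     (dose_mg / V_L) * np.exp(-k_el * (t - t_dose)),
                     0.0)

print(f"peak ~{conc.max():.1f} mg/L, trough ~{conc[-1]:.1f} mg/L")
```

    Altering k_el, V_L, or tau_h in the sketch immediately reshapes the concentration-time curve, which is the kind of parameter experiment the workshops describe.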

  13. The Use of Wireless Laptop Computers for Computer-Assisted Learning in Pharmacokinetics

    PubMed Central

    Munar, Myrna Y.; Singh, Harleen; Belle, Donna; Brackett, Carolyn C.; Earle, Sandra B.

    2006-01-01

    Objective To implement computer-assisted learning workshops into pharmacokinetics courses in a doctor of pharmacy (PharmD) program. Design Workshops were designed for students to utilize computer software programs on laptop computers to build pharmacokinetic models to predict drug concentrations resulting from various dosage regimens. In addition, students were able to visualize through graphing programs how altering different parameters changed drug concentration-time curves. Surveys were conducted to measure students’ attitudes toward computer technology before and after implementation. Finally, traditional examinations were used to evaluate student learning. Assessment Doctor of pharmacy students responded favorably to the use of wireless laptop computers in problem-based pharmacokinetic workshops. Eighty-eight percent (n = 61/69) and 82% (n = 55/67) of PharmD students completed surveys before and after computer implementation, respectively. Prior to implementation, 95% of students agreed that computers would enhance learning in pharmacokinetics. After implementation, 98% of students strongly agreed (p < 0.05) that computers enhanced learning. Examination results were significantly higher after computer implementation (89% with computers vs. 84% without computers; p = 0.01). Conclusion Implementation of wireless laptop computers in a pharmacokinetic course enabled students to construct their own pharmacokinetic models that could respond to changing parameters. Students had greater comprehension and were better able to interpret results and provide appropriate recommendations. Computer-assisted pharmacokinetic techniques can be powerful tools when making decisions about drug therapy. PMID:17136147

  14. A Large-Scale Multi-Hop Localization Algorithm Based on Regularized Extreme Learning for Wireless Networks.

    PubMed

    Zheng, Wei; Yan, Xiaoyong; Zhao, Wei; Qian, Chengshan

    2017-12-20

    A novel large-scale multi-hop localization algorithm based on regularized extreme learning is proposed in this paper. The large-scale multi-hop localization problem is formulated as a learning problem. Unlike other similar localization algorithms, the proposed algorithm overcomes the shortcoming of traditional algorithms, which are applicable only to isotropic networks, and therefore adapts well to complex deployment environments. The proposed algorithm is composed of three stages: data acquisition, modeling, and location estimation. In the data acquisition stage, the training information between nodes of the given network is collected. In the modeling stage, the model relating hop counts to the physical distances between nodes is constructed using regularized extreme learning. In the location estimation stage, each node finds its specific location in a distributed manner. Theoretical analysis and several experiments show that the proposed algorithm can adapt to different topological environments with low computational cost. Furthermore, high accuracy can be achieved by this method without setting complex parameters.
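
    As an illustrative aside, regularized extreme learning in this setting amounts to a random, untrained hidden layer followed by a ridge-regularized least-squares output layer that maps hop-count vectors to physical distances. A minimal sketch on synthetic data, not the paper's exact formulation:

```python
# Minimal regularized extreme-learning-machine (ELM) regression sketch: a
# random, fixed hidden layer plus an L2-regularized least-squares output
# layer, here mapping hop-count vectors to distances. Synthetic data.
import numpy as np

rng = np.random.default_rng(1)
n_samples, n_anchors, n_hidden = 200, 8, 64

X = rng.integers(1, 15, size=(n_samples, n_anchors)).astype(float)  # hop counts
true_w = rng.uniform(5.0, 12.0, n_anchors)       # hypothetical meters-per-hop
y = X @ true_w + rng.normal(0, 2.0, n_samples)   # noisy physical distances

W = rng.standard_normal((n_anchors, n_hidden))   # random input weights (not trained)
b = rng.standard_normal(n_hidden)
H = np.tanh(X @ W + b)                           # hidden-layer activations

lam = 1e-2                                       # regularization strength
beta = np.linalg.solve(H.T @ H + lam * np.eye(n_hidden), H.T @ y)

pred = H @ beta
print("RMSE (m):", np.sqrt(np.mean((pred - y) ** 2)))
```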

  15. Examining Extreme Events Using Dynamically Downscaled 12-km WRF Simulations

    EPA Science Inventory

    Continued improvements in the speed and availability of computational resources have allowed dynamical downscaling of global climate model (GCM) projections to be conducted at increasingly finer grid scales and over extended time periods. The implementation of dynamical downscal...

  16. Combinatorial Algebraic Topology: software, applications & algorithms workshop (CAT Workshop)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bennett, Janine Camille; Day, David Minot; Mitchell, Scott A.

    This report summarizes the Combinatorial Algebraic Topology: software, applications & algorithms workshop (CAT Workshop). The workshop was sponsored by the Computer Science Research Institute of Sandia National Laboratories. It was organized by CSRI staff members Scott Mitchell and Shawn Martin. It was held in Santa Fe, New Mexico, August 29-30. The CAT Workshop website has links to some of the talk slides and other information, http://www.cs.sandia.gov/CSRI/Workshops/2009/CAT/index.html. The purpose of the report is to summarize the discussions and recap the sessions. There is a special emphasis on technical areas that are ripe for further exploration, and the plans for follow-up amongst the workshop participants. The intended audiences are the workshop participants, other researchers in the area, and the workshop sponsors.

  17. 3rd IAGA/ICMA Workshop on Vertical Coupling in the Atmosphere/Ionosphere System/ Abstract

    DTIC Science & Technology

    2007-01-10

    …energy and momentum from the lower atmosphere to the upper atmosphere and ionosphere and vice versa. The programme focussed on various aspects of… the influence of global dependence of gravity wave energy in the troposphere… transport during the polar night of thermospheric odd nitrogen produced by lower-energy electron precipitation and solar extreme UV fluxes. However, at low…

  18. Astrobiology as a tool for getting high school students interested in science

    NASA Astrophysics Data System (ADS)

    Van der Meer, B. W.; Alletto, James J.; Bryant, Dudley; Carini, Mike; Elliott, Larry; Gelderman, Richard; Mason, Wayne; McDaniel, Kerrie; McGruder, Charles H.; Rinehart, Claire; Tyler, Rico; Walker, Linda

    2000-12-01

    A workshop was held (10/99) for high school students and teachers on astrobiology. NASA provided support through an IDEAS grant. Out of 63 qualified applicants, 29 were accepted: 22 students (11 minorities) and 7 teachers. The workshop was held on 2 successive weekends. Activities included: culturing microbes from human skin, discussing 'what is life?', building and using a 2-inch refractive telescope and a van Leeuwenhoek-type microscope (each participant built and kept their own), and listening to lectures by Dr. Richard Gelderman on detecting extrasolar planets and by Dr. Richard Hoover on life in extreme environments. Other activities included: collecting samples and isolating micro-organisms from Lost River Cave, studying microbial life from extreme environments in the laboratory, using the internet as a research tool, and debating the logistics and feasibility of a lunar colony. Written evaluations of the workshop led to the following conclusions: 48% of the students considered a possible career in the biological and/or astrophysical sciences, and half of these stated they were spurred on by the workshop itself.

  19. Preface.

    PubMed

    Ditlevsen, Susanne; Lansky, Petr

    2016-06-01

    This Special Issue of Mathematical Biosciences and Engineering contains 11 selected papers presented at the Neural Coding 2014 workshop. The workshop was held in the royal city of Versailles in France, October 6-10, 2014. This was the 11th of a series of international workshops on this subject, the first held in Prague (1995), then Versailles (1997), Osaka (1999), Plymouth (2001), Aulla (2003), Marburg (2005), Montevideo (2007), Tainan (2009), Limassol (2010), and again in Prague (2012). Selected papers from Prague were also published as a special issue of Mathematical Biosciences and Engineering, and in this way a tradition was started. Similarly to the previous workshops, this was a single-track multidisciplinary event bringing together experimental and computational neuroscientists. The Neural Coding Workshops are traditionally biennial symposia. They are relatively small in size and interdisciplinary, with major emphasis on the search for common principles in neural coding. The workshop was conceived to bring together scientists from different disciplines for an in-depth discussion of mathematical model-building and computational strategies. Further information on the meeting can be found at the NC2014 website at https://colloque6.inra.fr/neural_coding_2014. The meeting was supported by the French National Institute for Agricultural Research, the world's leading institution in this field. Understanding how the brain processes information is one of the most challenging subjects in neuroscience. The papers presented in this special issue show a small corner of the huge diversity of this field, and illustrate how scientists with different backgrounds approach this vast subject. The diversity of disciplines engaged in these investigations is remarkable: biologists, mathematicians, physicists, psychologists, computer scientists, and statisticians all have original tools and ideas by which to try to elucidate the underlying mechanisms. In this issue, emphasis is put on mathematical modeling of single neurons. A variety of problems in computational neuroscience, accompanied by a rich diversity of mathematical tools and approaches, are presented. We hope it will inspire and challenge the readers in their own research. We would like to thank the authors for their valuable contributions and the referees for their priceless effort of reviewing the manuscripts. Finally, we would like to thank Yang Kuang for supporting us and making this publication possible.

  20. Computer Applications in Information Systems. Proceedings of a Workshop (Cape Town, South Africa, November 26-27, 1985). Continuing Education Series Number 1.

    ERIC Educational Resources Information Center

    Bleimschein, Sue, Ed.

    Sixteen papers from a workshop on computer applications sponsored by the University of Cape Town (South Africa) School of Librarianship are presented in this volume: (1) "Introduction to the Use of Information Technology" (Sue Bleimschein); (2) "Searching Remote Databases" (Steve Rossouw); (3) "SABINET [South African Bibliographic and Information…

  1. IFCPT S-Duct Grid-Adapted FUN3D Computations for the Third Propulsion Aerodynamics Workshop

    NASA Technical Reports Server (NTRS)

    Davis, Zach S.; Park, M. A.

    2017-01-01

    Contributions of the unstructured Reynolds-averaged Navier-Stokes code, FUN3D, to the 3rd AIAA Propulsion Aerodynamics Workshop are described for the diffusing IFCPT S-Duct. Using workshop-supplied grids, results for the baseline S-Duct, the baseline S-Duct with Aerodynamic Interface Plane (AIP) rake hardware, and the baseline S-Duct with flow control devices are compared with experimental data and with results computed with output-based, off-body grid adaptation in FUN3D. Due to the absence of influential geometry components, total pressure recovery is overpredicted on the baseline S-Duct and the S-Duct with flow control vanes when compared to experimental values. An estimate for the exact value of total pressure recovery is derived for these cases given an infinitely refined mesh. When results from output-based mesh adaptation are compared with those computed on workshop-supplied grids, a considerable improvement in predicting total pressure recovery is observed. By including more representative geometry, output-based mesh adaptation compares very favorably with experimental data in terms of predicting the total pressure recovery cost function, whereas results computed using the workshop-supplied grids underpredict it.
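
    As an illustrative aside, an estimate "given an infinitely refined mesh" is typically obtained by Richardson-style extrapolation over a grid sequence. The sketch below is generic, with made-up spacings and recovery values, and is not necessarily the authors' exact procedure.

```python
# Generic Richardson-style extrapolation sketch: estimate the "zero spacing"
# value of a quantity from a uniformly refined grid sequence (ratio 2).
# The grid spacings and recovery values below are made-up placeholders.
import numpy as np

h = np.array([1.0, 0.5, 0.25])            # representative grid spacings
q = np.array([0.9712, 0.9745, 0.9757])    # total pressure recovery per grid

# Assume q(h) ~ q0 + c*h**p; solve for observed order p and continuum value q0.
p = np.log((q[0] - q[1]) / (q[1] - q[2])) / np.log(2.0)
q0 = q[2] + (q[2] - q[1]) / (2.0**p - 1.0)
print(f"observed order ~{p:.2f}, extrapolated recovery ~{q0:.5f}")
```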

  2. Summary and Statistical Analysis of the First AIAA Sonic Boom Prediction Workshop

    NASA Technical Reports Server (NTRS)

    Park, Michael A.; Morgenstern, John M.

    2014-01-01

    A summary is provided for the First AIAA Sonic Boom Prediction Workshop held 11 January 2014 in conjunction with AIAA SciTech 2014. Near-field pressure signatures extracted from computational fluid dynamics solutions are gathered from nineteen participants representing three countries for the two required cases, an axisymmetric body and a simple delta wing body. Structured multiblock, unstructured mixed-element, unstructured tetrahedral, overset, and Cartesian cut-cell methods are used by the participants. Participants provided signatures computed on participant-generated and solution-adapted grids. Signatures are also provided for a series of uniformly refined workshop-provided grids. These submissions are propagated to the ground and loudness measures are computed. This allows the grid convergence of a loudness measure and a validation metric (a difference norm between computed and wind-tunnel-measured near-field signatures) to be studied for the first time. Statistical analysis is also presented for these measures. An optional configuration includes fuselage, wing, tail, flow-through nacelles, and blade sting. This full configuration exhibits more variation in eleven submissions than the sixty submissions provided for each required case. Recommendations are provided for potential improvements to the analysis methods and a possible subsequent workshop.
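
    As an illustrative aside, the validation metric described (a difference norm between computed and measured near-field signatures) can be sketched with synthetic placeholder data:

```python
# Sketch of the kind of validation metric described above: interpolate a
# computed near-field pressure signature onto the measurement stations and
# take an RMS difference norm. All data arrays are synthetic placeholders.
import numpy as np

rng = np.random.default_rng(5)
x_cfd = np.linspace(0.0, 1.0, 400)              # CFD axial stations
p_cfd = 0.02 * np.sin(8 * np.pi * x_cfd)        # computed dp/p (synthetic)
x_exp = np.linspace(0.0, 1.0, 60)               # wind-tunnel probe stations
p_exp = 0.02 * np.sin(8 * np.pi * x_exp) + 0.001 * rng.standard_normal(60)

p_interp = np.interp(x_exp, x_cfd, p_cfd)       # CFD at measured stations
diff_norm = np.sqrt(np.mean((p_interp - p_exp) ** 2))
print(f"signature difference norm: {diff_norm:.2e}")
```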

  3. Advanced Training Technologies and Learning Environments

    NASA Technical Reports Server (NTRS)

    Noor, Ahmed K. (Compiler); Malone, John B. (Compiler)

    1999-01-01

    This document contains the proceedings of the Workshop on Advanced Training Technologies and Learning Environments held at NASA Langley Research Center, Hampton, Virginia, March 9-10, 1999. The workshop was jointly sponsored by the University of Virginia's Center for Advanced Computational Technology and NASA. Workshop attendees were from NASA, other government agencies, industry, and universities. The objective of the workshop was to assess the status and effectiveness of different advanced training technologies and learning environments.

  4. Proceedings of Image Understanding Workshop Held in Los Angeles, California on 23-25 February 1987. Volume 1

    DTIC Science & Technology

    1987-02-01

    …landmark set, and for computing a plan as an ordered list of recursively executable sub-goals. The key to the search is to use the landmark database… Directed Object Extraction Using a Combined Region and Line Representation, Proc. of the Workshop on Computer Vision: Representation and Con… computational capability as well, such as the floating point calculations required in this application. One such PE design which made an effort to meet these…

  5. Highly Scalable Asynchronous Computing Method for Partial Differential Equations: A Path Towards Exascale

    NASA Astrophysics Data System (ADS)

    Konduri, Aditya

    Many natural and engineering systems are governed by nonlinear partial differential equations (PDEs) which result in multiscale phenomena, e.g. turbulent flows. Numerical simulations of these problems are computationally very expensive and demand extreme levels of parallelism. At realistic conditions, simulations are being carried out on massively parallel computers with hundreds of thousands of processing elements (PEs). It has been observed that communication between PEs as well as their synchronization at these extreme scales take up a significant portion of the total simulation time and result in poor scalability of codes. This issue is likely to pose a bottleneck in the scalability of codes on future Exascale systems. In this work, we propose an asynchronous computing algorithm based on widely used finite difference methods to solve PDEs in which synchronization between PEs due to communication is relaxed at a mathematical level. We show that while stability is conserved when schemes are used asynchronously, accuracy is greatly degraded. Since message arrivals at PEs are random processes, so is the behavior of the error. We propose a new statistical framework in which we show that average errors always drop to first order regardless of the original scheme. We propose new asynchrony-tolerant schemes that maintain accuracy when synchronization is relaxed. The quality of the solution is shown to depend not only on the physical phenomena and numerical schemes, but also on the characteristics of the computing machine. A novel algorithm using remote memory access communications has been developed to demonstrate excellent scalability of the method for large-scale computing. Finally, we present a path to extend this method to solving complex multi-scale problems on Exascale machines.
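
    As an illustrative aside, the effect of relaxed synchronization can be mimicked in a toy 1D heat-equation solver in which each 'PE' reads its neighbor's interface value with a random delay. This is a generic illustration of the idea, not the author's asynchrony-tolerant schemes.

```python
# Toy illustration of asynchronous finite differences: two "PEs" update the
# two halves of a 1D heat equation, but each reads its neighbor's interface
# value with a random delay (stale data), mimicking relaxed synchronization.
import numpy as np

rng = np.random.default_rng(3)
N, alpha, steps = 64, 0.25, 200            # grid points, diffusion number, steps
u = np.sin(np.linspace(0, np.pi, N))       # initial condition, u = 0 at both ends
history = [[u[N // 2 - 1], u[N // 2]]]     # exchanged interface values per step

def step(u_half, left_ghost, right_ghost):
    padded = np.concatenate([[left_ghost], u_half, [right_ghost]])
    return u_half + alpha * (padded[2:] - 2 * u_half + padded[:-2])

for _ in range(steps):
    delay = rng.integers(0, min(3, len(history)))  # staleness of 0-2 steps
    left_edge, right_edge = history[-1 - delay]    # possibly outdated halo values
    new_left = step(u[:N // 2], 0.0, right_edge)   # PE 0 sees a stale right halo
    new_right = step(u[N // 2:], left_edge, 0.0)   # PE 1 sees a stale left halo
    u = np.concatenate([new_left, new_right])
    history.append([u[N // 2 - 1], u[N // 2]])

print(f"max |u| after asynchronous integration: {np.abs(u).max():.4f}")
```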

  6. Multigrid preconditioned conjugate-gradient method for large-scale wave-front reconstruction.

    PubMed

    Gilles, Luc; Vogel, Curtis R; Ellerbroek, Brent L

    2002-09-01

    We introduce a multigrid preconditioned conjugate-gradient (MGCG) iterative scheme for computing open-loop wave-front reconstructors for extreme adaptive optics systems. We present numerical simulations for a 17-m class telescope with n = 48756 sensor measurement grid points within the aperture, which indicate that our MGCG method has a rapid convergence rate for a wide range of subaperture average slope measurement signal-to-noise ratios. The total computational cost is of order n log n. Hence our scheme provides for fast wave-front simulation and control in large-scale adaptive optics systems.
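
    As an illustrative aside, a preconditioned conjugate-gradient iteration has the same structure whatever the preconditioner. The sketch below uses a simple Jacobi (diagonal) preconditioner as a stand-in for the paper's multigrid preconditioner, whose construction is beyond a short example.

```python
# Generic preconditioned conjugate-gradient (PCG) sketch. The paper uses a
# multigrid preconditioner; a Jacobi (diagonal) preconditioner stands in here,
# since the interface -- apply M^{-1} to the residual -- is the same.
import numpy as np

def pcg(A, b, M_inv, tol=1e-8, max_iter=500):
    x = np.zeros_like(b)
    r = b - A @ x
    z = M_inv(r)                       # preconditioner application
    p = z.copy()
    rz = r @ z
    for _ in range(max_iter):
        Ap = A @ p
        alpha = rz / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        if np.linalg.norm(r) < tol:
            break
        z = M_inv(r)
        rz_new = r @ z
        p = z + (rz_new / rz) * p
        rz = rz_new
    return x

n = 200                                # toy SPD system: 1D Laplacian
A = (np.diag(2.0 * np.ones(n)) + np.diag(-np.ones(n - 1), 1)
     + np.diag(-np.ones(n - 1), -1))
b = np.ones(n)
x = pcg(A, b, M_inv=lambda r: r / np.diag(A))  # Jacobi stand-in for multigrid
print("residual norm:", np.linalg.norm(b - A @ x))
```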

  7. Addressing Failures in Exascale Computing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Snir, Marc; Wisniewski, Robert; Abraham, Jacob

    2014-01-01

    We present here a report produced by a workshop on 'Addressing Failures in Exascale Computing' held in Park City, Utah, 4-11 August 2012. The charter of this workshop was to establish a common taxonomy about resilience across all the levels in a computing system, discuss existing knowledge on resilience across the various hardware and software layers of an exascale system, and build on those results, examining potential solutions from both a hardware and software perspective and focusing on a combined approach. The workshop brought together participants with expertise in applications, system software, and hardware; they came from industry, government, and academia, and their interests ranged from theory to implementation. The combination allowed broad and comprehensive discussions and led to this document, which summarizes and builds on those discussions.

  8. Workshops of the Fifth International Brain-Computer Interface Meeting: Defining the Future.

    PubMed

    Huggins, Jane E; Guger, Christoph; Allison, Brendan; Anderson, Charles W; Batista, Aaron; Brouwer, Anne-Marie A-M; Brunner, Clemens; Chavarriaga, Ricardo; Fried-Oken, Melanie; Gunduz, Aysegul; Gupta, Disha; Kübler, Andrea; Leeb, Robert; Lotte, Fabien; Miller, Lee E; Müller-Putz, Gernot; Rutkowski, Tomasz; Tangermann, Michael; Thompson, David Edward

    2014-01-01

    The Fifth International Brain-Computer Interface (BCI) Meeting convened June 3-7, 2013, at the Asilomar Conference Grounds, Pacific Grove, California. The conference included 19 workshops covering topics in brain-computer interface and brain-machine interface research. Topics included translation of BCIs into clinical use, standardization and certification, types of brain activity to use for BCI, recording methods, the effects of plasticity, special-interest topics in BCI applications, and future BCI directions. BCI research is well established and transitioning to practical use to benefit people with physical impairments. At the same time, new applications are being explored, both for people with physical impairments and beyond. Here we provide summaries of each workshop, illustrating the breadth and depth of BCI research and highlighting important issues for future research and development.

  9. Proceedings of the 1st Army Installation Waste to Energy Workshop

    DTIC Science & Technology

    2008-08-01

    …Center, 2902 Newmark Dr., Champaign, IL 61824; René S. Parker, Select Engineering Services (SES), 1544 Woodland Park Ave., Suite 310, Layton, UT 84041… gasification technologies at different scales (Source: Larson, Eric D., “Small-Scale Gasification-Based Biomass Power Generation,” January 1998)… Engineering Research Laboratory. Larson, Eric D. 1998. Small-scale gasification-based biomass power generation. Prepared for the Biomass Workshop…

  10. Proceedings of USC (University of Southern California) Workshop on VLSI (Very Large Scale Integration) & Modern Signal Processing, held at Los Angeles, California on 1-3 November 1982

    DTIC Science & Technology

    1983-11-15

    Concurrent Algorithms", A. Cremers , Dortmund University, West Germany, and T. Hibbard, JPL, Pasadena, CA 64 "An Overview of Signal Representations in...n O f\\ n O P- A -> Problem-oriented specification of concurrent algorithms Armin B. Cremers and Thomas N. Hibbard Preliminary version September...1982 s* Armin B. Cremers Computer Science Department University of Dortmund P.O. Box 50 05 00 D-4600 Dortmund 50 Fed. Rep. Germany

  11. Predicting Benefit from a Gestalt Therapy Marathon Workshop.

    ERIC Educational Resources Information Center

    Healy, James; Dowd, E. Thomas

    1981-01-01

    Tested the utility of the Personal Orientation Inventory (POI), the Myers-Briggs Type Indicator, and the Girona Affect Scale in predicting the outcomes of a marathon Gestalt therapy workshop. Significant predictive equations were generated that use the POI to predict gains on the Girona Affect Scale. (Author/RC)

  12. PREFACE: 1st International Workshop on Theoretical and Computational Physics: Condensed Matter, Soft Matter and Materials Physics & 38th National Conference on Theoretical Physics

    NASA Astrophysics Data System (ADS)

    2014-09-01

    This volume contains selected papers presented at the 38th National Conference on Theoretical Physics (NCTP-38) and the 1st International Workshop on Theoretical and Computational Physics: Condensed Matter, Soft Matter and Materials Physics (IWTCP-1). Both the conference and the workshop were held from 29 July to 1 August 2013 at the Pullman hotel, Da Nang, Vietnam. The IWTCP-1 was a new activity of the Vietnamese Theoretical Physics Society (VTPS), organized in association with the 38th National Conference on Theoretical Physics (NCTP-38), the best-known annual scientific forum dedicated to the dissemination of the latest developments in the field of theoretical physics within the country. The IWTCP-1 was also an External Activity of the Asia Pacific Center for Theoretical Physics (APCTP). The overriding goal of the IWTCP is to provide an international forum for scientists and engineers from academia to share ideas, problems, and solutions relating to recent advances in theoretical physics as well as in computational physics. The main IWTCP motivation is to foster scientific exchanges between the Vietnamese theoretical and computational physics community and worldwide scientists, as well as to promote high-standard research and education activities for young physicists in the country. About 110 participants from 10 countries took part in the conference and the workshop. 4 invited talks, 18 oral contributions, and 46 posters were presented at the conference. In the workshop we had one keynote lecture and 9 invited talks presented by international experts in the fields of theoretical and computational physics, together with 14 oral and 33 poster contributions. The proceedings were edited by Nguyen Tri Lan, Trinh Xuan Hoang, and Nguyen Ai Viet. We would like to thank all invited speakers, participants, and sponsors for making the conference and the workshop successful. Nguyen Ai Viet, Chair of NCTP-38 and IWTCP-1

  13. The ECCO Logo Project: Materials for Classroom Teachers and Teacher Trainers.

    ERIC Educational Resources Information Center

    Tempel, Michael; And Others

    In the fall of 1985, the Educational Computer Consortium of Ohio (ECCO) presented an extensive series of workshops on Logo. The workshops were divided into two categories: those for teacher-trainers and those for classroom teachers. This booklet presents materials developed by a core of five participants in the workshops for trainers using Logo…

  14. Documentary Linguistics and Computational Linguistics: A Response to Brooks

    ERIC Educational Resources Information Center

    Bird, Steven; Chiang, David; Frowein, Friedel; Hanke, Florian; Vaswani, Ashish

    2015-01-01

    In mid-2012, the authors organized a two-week workshop in Papua New Guinea to provide training in basic techniques and technologies for language documentation, and to gain understanding of how these technologies might be improved in the future. An assessment of the workshop was conducted by Brooks with the central idea that the workshop's…

  15. A Framework for Medical Information Science

    PubMed Central

    Blum, Bruce

    1983-01-01

    The Seventh Annual Symposium on Computer Applications in Medical Care sponsored a one-day, limited-attendance workshop to discuss the topic A Framework for Medical Information Science. Participation was limited to approximately fifty people. Each attendee prepared either a paper or a working statement before the workshop; these documents will be revised following the workshop for publication. This session will contain a review of the workshop by some of its participants. An extract from the call for participation follows.

  16. The spatial return level of aggregated hourly extreme rainfall in Peninsular Malaysia

    NASA Astrophysics Data System (ADS)

    Shaffie, Mardhiyyah; Eli, Annazirin; Wan Zin, Wan Zawiah; Jemain, Abdul Aziz

    2015-07-01

    This paper is intended to ascertain the spatial pattern of extreme rainfall distribution in Peninsular Malaysia at several short time intervals, i.e., on an hourly basis. This research is motivated by historical records of extreme rainfall in Peninsular Malaysia, whereby many hydrological disasters in this region occur within a short time period. The hourly aggregation periods considered are 1, 2, 3, 6, 12, and 24 h. Many previous hydrological studies dealt with daily rainfall data; this study therefore enables comparisons between daily and hourly rainfall analyses so as to identify the impact of extreme rainfall at a shorter time scale. Return levels based on the time aggregates considered are also computed. Parameter estimation using the L-moment method was conducted for four probability distributions, namely, the generalized extreme value (GEV), generalized logistic (GLO), generalized Pareto (GPA), and Pearson type III (PE3) distributions. Aided by the L-moment diagram test and the mean square error (MSE) test, GLO was found to be the most appropriate distribution to represent the extreme rainfall data. At most return periods (10, 50, and 100 years), the spatial patterns revealed that the rainfall distribution across the peninsula differs between 1- and 24-h extreme rainfalls. The outcomes of this study provide additional information regarding patterns of extreme rainfall in Malaysia which may not be detected when considering only a longer time scale such as daily; thus, appropriate measures for shorter time scales of extreme rainfall can be planned. The implementation of such measures would help the authorities reduce the impact of disastrous natural events.
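
    As an illustrative aside, once the GLO parameters are estimated, the T-year return level follows by inverting the GLO distribution function. A sketch in the Hosking parameterization, with hypothetical parameter values rather than the paper's estimates:

```python
# Return-level sketch for the generalized logistic (GLO) distribution in the
# Hosking parameterization: x(F) = xi + (alpha/kappa) * (1 - ((1-F)/F)**kappa).
# Parameter values below are hypothetical, not the paper's L-moment estimates.
def glo_return_level(T, xi, alpha, kappa):
    F = 1.0 - 1.0 / T                   # non-exceedance probability of T-year event
    return xi + (alpha / kappa) * (1.0 - ((1.0 - F) / F) ** kappa)

xi, alpha, kappa = 60.0, 15.0, -0.15    # location, scale, shape (1-h rainfall, mm)
for T in (10, 50, 100):
    print(f"{T:>4}-year 1-hour rainfall: {glo_return_level(T, xi, alpha, kappa):.1f} mm")
```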

  17. Computational Challenges in the Analysis of Petrophysics Using Microtomography and Upscaling

    NASA Astrophysics Data System (ADS)

    Liu, J.; Pereira, G.; Freij-Ayoub, R.; Regenauer-Lieb, K.

    2014-12-01

    Microtomography provides detailed 3D internal structures of rocks at micro- to tens-of-nanometer resolution and is quickly turning into a new technology for studying petrophysical properties of materials. An important step is the upscaling of these properties, as imaging at micron or sub-micron resolution is only feasible for samples on the scale of a millimeter or less. We present here a recently developed computational workflow for the analysis of microstructures, including the upscaling of material properties. Computations of properties are first performed using conventional material science simulations at the micro- to nano-scale. The subsequent upscaling of these properties is done by a novel renormalization procedure based on percolation theory. We have tested the workflow using different rock samples, as well as biological and food science materials. We have also applied the technique to high-resolution time-lapse synchrotron CT scans. In this contribution we focus on the computational challenges that arise from the big-data problem of analyzing petrophysical properties and their subsequent upscaling. We discuss the following challenges: 1) characterization of microtomography for extremely large data sets - our current capability; 2) computational fluid dynamics simulations at the pore scale for permeability estimation - methods, computing cost, and accuracy; 3) solid mechanical computations at the pore scale for estimating elasto-plastic properties - computational stability, cost, and efficiency; 4) extracting critical exponents from derivative models for scaling laws - models, finite element meshing, and accuracy. Significant progress in each of these challenges is necessary to transform microtomography from a research problem into a robust computational big-data tool for multi-scale scientific and engineering problems.
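
    As an illustrative aside, the flavor of one upscaling step can be shown on a binary pore/solid voxel grid coarse-grained by 2x2x2 blocks; the majority rule below is a generic stand-in, not the authors' percolation-based renormalization.

```python
# Generic flavor of a renormalization upscaling step: repeatedly coarse-grain
# a binary pore (1) / solid (0) voxel grid by 2x2x2 blocks using a majority
# rule. A stand-in illustration, not the authors' percolation-based scheme.
import numpy as np

rng = np.random.default_rng(7)
grid = (rng.random((64, 64, 64)) < 0.3).astype(int)   # ~30% porosity sample

def coarsen(g: np.ndarray) -> np.ndarray:
    n = g.shape[0] // 2
    blocks = g.reshape(n, 2, n, 2, n, 2).sum(axis=(1, 3, 5))  # voxels per block
    return (blocks >= 4).astype(int)   # block is 'pore' if at least half its voxels are

level = grid
while level.shape[0] > 4:
    level = coarsen(level)
    print(f"{level.shape[0]}^3 grid, porosity {level.mean():.3f}")
```

    Tracking how a quantity such as porosity (or, in the authors' case, a percolation-related property) changes across levels is what feeds the scaling-law analysis described above.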

  18. Opportunities and challenges for the life sciences community.

    PubMed

    Kolker, Eugene; Stewart, Elizabeth; Ozdemir, Vural

    2012-03-01

    Twenty-first century life sciences have transformed into data-enabled (also called data-intensive, data-driven, or big data) sciences. They principally depend on data-, computation-, and instrumentation-intensive approaches to seek comprehensive understanding of complex biological processes and systems (e.g., ecosystems, complex diseases, environmental, and health challenges). Federal agencies including the National Science Foundation (NSF) have played and continue to play an exceptional leadership role by innovatively addressing the challenges of data-enabled life sciences. Yet even more is required not only to keep up with the current developments, but also to pro-actively enable future research needs. Straightforward access to data, computing, and analysis resources will enable true democratization of research competitions; thus investigators will compete based on the merits and broader impact of their ideas and approaches rather than on the scale of their institutional resources. This is the Final Report for Data-Intensive Science Workshops DISW1 and DISW2. The first NSF-funded Data Intensive Science Workshop (DISW1, Seattle, WA, September 19-20, 2010) overviewed the status of the data-enabled life sciences and identified their challenges and opportunities. This served as a baseline for the second NSF-funded DIS workshop (DISW2, Washington, DC, May 16-17, 2011). Based on the findings of DISW2 the following overarching recommendation to the NSF was proposed: establish a community alliance to be the voice and framework of the data-enabled life sciences. After this Final Report was finished, Data-Enabled Life Sciences Alliance (DELSA, www.delsall.org ) was formed to become a Digital Commons for the life sciences community.

  19. Opportunities and Challenges for the Life Sciences Community

    PubMed Central

    Stewart, Elizabeth; Ozdemir, Vural

    2012-01-01

    Abstract Twenty-first century life sciences have transformed into data-enabled (also called data-intensive, data-driven, or big data) sciences. They principally depend on data-, computation-, and instrumentation-intensive approaches to seek comprehensive understanding of complex biological processes and systems (e.g., ecosystems, complex diseases, environmental, and health challenges). Federal agencies including the National Science Foundation (NSF) have played and continue to play an exceptional leadership role by innovatively addressing the challenges of data-enabled life sciences. Yet even more is required not only to keep up with the current developments, but also to pro-actively enable future research needs. Straightforward access to data, computing, and analysis resources will enable true democratization of research competitions; thus investigators will compete based on the merits and broader impact of their ideas and approaches rather than on the scale of their institutional resources. This is the Final Report for Data-Intensive Science Workshops DISW1 and DISW2. The first NSF-funded Data Intensive Science Workshop (DISW1, Seattle, WA, September 19–20, 2010) overviewed the status of the data-enabled life sciences and identified their challenges and opportunities. This served as a baseline for the second NSF-funded DIS workshop (DISW2, Washington, DC, May 16–17, 2011). Based on the findings of DISW2 the following overarching recommendation to the NSF was proposed: establish a community alliance to be the voice and framework of the data-enabled life sciences. After this Final Report was finished, Data-Enabled Life Sciences Alliance (DELSA, www.delsall.org) was formed to become a Digital Commons for the life sciences community. PMID:22401659

  20. Statistical Analysis of CFD Solutions From the Fifth AIAA Drag Prediction Workshop

    NASA Technical Reports Server (NTRS)

    Morrison, Joseph H.

    2013-01-01

    A graphical framework is used for statistical analysis of the results from an extensive N-version test of a collection of Reynolds-averaged Navier-Stokes computational fluid dynamics codes. The solutions were obtained by code developers and users from North America, Europe, Asia, and South America using a common grid sequence and multiple turbulence models for the June 2012 fifth Drag Prediction Workshop sponsored by the AIAA Applied Aerodynamics Technical Committee. The aerodynamic configuration for this workshop was the Common Research Model subsonic transport wing-body previously used for the 4th Drag Prediction Workshop. This work continues the statistical analysis begun in the earlier workshops and compares the results from the grid convergence study of the most recent workshop with previous workshops.

  1. Electronic Communications: Education Via a Virtual Workshop.

    ERIC Educational Resources Information Center

    Leibensperger, Roslyn; Mehringer, Susan; Trefethen, Anne; Kalos, Malvin

    1997-01-01

    Describes a virtual workshop where participants across the United States learn by interacting with their own computers. Highlights the program's goals, audience activity, goals versus accomplishments, CPU usage, consulting, and effectiveness. (Author/VWL)

  2. 33 CFR 157.12f - Workshop functional test requirements.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... CARRYING OIL IN BULK Design, Equipment, and Installation § 157.12f Workshop functional test requirements... the specific design of equipment. A completed workshop certificate including the delivery test... several ppm values on all measurement scales when operated on an oil appropriate for the application of...

  3. 33 CFR 157.12f - Workshop functional test requirements.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... CARRYING OIL IN BULK Design, Equipment, and Installation § 157.12f Workshop functional test requirements... the specific design of equipment. A completed workshop certificate including the delivery test... several ppm values on all measurement scales when operated on an oil appropriate for the application of...

  4. Data and Communications in Basic Energy Sciences: Creating a Pathway for Scientific Discovery

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nugent, Peter E.; Simonson, J. Michael

    2011-10-24

    This report is based on the Department of Energy (DOE) Workshop on “Data and Communications in Basic Energy Sciences: Creating a Pathway for Scientific Discovery,” held at the Bethesda Marriott in Maryland on October 24-25, 2011. The workshop brought together leading researchers from the Basic Energy Sciences (BES) facilities and Advanced Scientific Computing Research (ASCR). It was co-sponsored by these two Offices to identify opportunities and needs for data analysis, ownership, storage, mining, provenance, and data transfer at light sources, neutron sources, microscopy centers, and other facilities. Their charge was to identify current and anticipated issues in the acquisition, analysis, communication, and storage of experimental data that could impact the progress of scientific discovery; to ascertain what knowledge, methods, and tools are needed to mitigate present and projected shortcomings; and to create the foundation for information exchanges and collaboration between ASCR- and BES-supported researchers and facilities. The workshop was organized in the context of the impending data tsunami that will be produced by DOE’s BES facilities. Current facilities, like SLAC National Accelerator Laboratory’s Linac Coherent Light Source, can produce up to 18 terabytes (TB) per day, while upgraded detectors at Lawrence Berkeley National Laboratory’s Advanced Light Source will generate ~10 TB per hour. The expectation is that these rates will increase by over an order of magnitude in the coming decade. The urgency of developing new strategies and methods to stay ahead of this deluge and extract the most science from these facilities was recognized by all. The four focus areas addressed in this workshop were: Workflow Management - Experiment to Science: identifying and managing the data path from experiment to publication. Theory and Algorithms: recognizing the need for new tools for computation at scale, supporting large data sets and realistic theoretical models. Visualization and Analysis: supporting near-real-time feedback for experiment optimization and new ways to extract and communicate critical information from large data sets. Data Processing and Management: outlining the computational and communication approaches and infrastructure needed to handle unprecedented data volume and information content. Almost all participants recognized that there were unlikely to be any turn-key solutions available, due to the unique, diverse nature of the BES community, where research at adjacent beamlines at a given light source facility often spans everything from biology to materials science to chemistry using scattering, imaging, and/or spectroscopy. However, it was also noted that advances supported by other programs in data research, methodologies, and tool development could be implemented on reasonable time scales with modest effort. Adapting available standard file formats, robust workflows, and in-situ analysis tools for user facility needs could pay long-term dividends. Workshop participants assessed current requirements as well as future challenges and made the following recommendations toward the ultimate goal of enabling transformative science at current and future BES facilities: integrate theory and analysis components seamlessly within the experimental workflow; develop new algorithms for data analysis based on common data formats and toolsets; move the analysis closer to the experiment to enable real-time (in-situ) streaming capabilities, live visualization of the experiment, and an increase in overall experimental efficiency; match data management access and capabilities with advancements in detectors and sources; and remove bottlenecks, provide interoperability across different facilities/beamlines, and apply forefront mathematical techniques to more efficiently extract science from the experiments. This workshop report examines and reviews the status of several BES facilities and highlights the successes and shortcomings of the current data and communication pathways for scientific discovery. It then ascertains what methods and tools are needed to mitigate present and projected data bottlenecks to science over the next 10 years. The goal of this report is to create the foundation for information exchanges and collaborations among ASCR- and BES-supported researchers, the BES scientific user facilities, and ASCR computing and networking facilities. To jumpstart these activities, there was a strong desire to see a joint effort between ASCR and BES along the lines of the highly successful Scientific Discovery through Advanced Computing (SciDAC) program, in which integrated teams of engineers, scientists, and computer scientists would be engaged to tackle a complete end-to-end workflow solution at one or more beamlines and to ascertain what challenges will need to be addressed to handle future increases in data.
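
    For a back-of-envelope sense of the rates quoted above, the sketch below converts them into sustained network bandwidth; the decimal-terabyte convention and the tenfold growth factor are assumptions of the example.

      # Convert the quoted BES data rates into sustained bandwidth.
      # Assumes decimal terabytes (1 TB = 1e12 bytes).
      TB = 1e12  # bytes

      lcls_bytes_per_day = 18 * TB    # "up to 18 TB per day"
      als_bytes_per_hour = 10 * TB    # "~10 TB per hour" after detector upgrades

      lcls_gbps = lcls_bytes_per_day * 8 / 86400 / 1e9
      als_gbps = als_bytes_per_hour * 8 / 3600 / 1e9

      print(f"LCLS:         {lcls_gbps:5.1f} Gb/s sustained")      # ~1.7 Gb/s
      print(f"ALS upgraded: {als_gbps:5.1f} Gb/s sustained")       # ~22 Gb/s
      # "over an order of magnitude" growth in the coming decade:
      print(f"10x ALS:      {10 * als_gbps:5.0f} Gb/s sustained")  # ~220 Gb/s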

  5. PREFACE: 2nd International Workshop on Theoretical and Computational Physics (IWTCP-2): Modern Methods and Latest Results in Particle Physics, Nuclear Physics and Astrophysics and the 39th National Conference on Theoretical Physics (NCTP-39)

    NASA Astrophysics Data System (ADS)

    Hoang, Trinh Xuan; Ky, Nguyen Anh; Lan, Nguyen Tri; Viet, Nguyen Ai

    2015-06-01

    This volume contains selected papers presented at the 2nd International Workshop on Theoretical and Computational Physics (IWTCP-2): Modern Methods and Latest Results in Particle Physics, Nuclear Physics and Astrophysics and the 39th National Conference on Theoretical Physics (NCTP-39). Both the workshop and the conference were held from 28-31 July 2014 at the Dakruco Hotel, Buon Ma Thuot, Dak Lak, Vietnam. The NCTP-39 and the IWTCP-2 were organized with the support of the Vietnamese Theoretical Physics Society, with the motivation of fostering scientific exchanges between theoretical and computational physicists in Vietnam and worldwide, as well as promoting a high standard of research and education activities for young physicists in the country. The IWTCP-2 was also an External Activity of the Asia Pacific Center for Theoretical Physics (APCTP). About 100 participants from nine countries attended the workshop and the conference. At the IWTCP-2 workshop, 16 invited talks were presented by international experts, together with eight oral and ten poster contributions. At the NCTP-39, three invited talks, 15 oral contributions, and 39 posters were presented. We would like to thank all invited speakers, participants, and sponsors for making the workshop and the conference successful. Trinh Xuan Hoang, Nguyen Anh Ky, Nguyen Tri Lan and Nguyen Ai Viet

  6. Learning to consult with computers.

    PubMed

    Liaw, S T; Marty, J J

    2001-07-01

    To develop and evaluate a strategy for teaching the skills and issues associated with computers in the consultation. The intervention comprised an overview lecture plus one workshop before and one workshop after practice placements, during the 10-week general practice (GP) term in the 5th year of the University of Melbourne medical course. Pre- and post-intervention study using a mix of qualitative and quantitative methods within a strategic evaluation framework. Outcomes were self-reported attitudes and skills with clinical applications before, during, and after the intervention. Most students had significant general computer experience but little in the medical area. They found the workshops relevant, interesting, and easy to follow. The role-play approach facilitated students' learning of relevant communication and consulting skills and an appreciation of the issues associated with using information technology tools in simulated clinical situations to augment and complement their consulting skills. The workshops and exposure to GP systems were associated with an increase in the use of clinical software, more realistic expectations of existing clinical and medical record software, and an understanding of the barriers to the use of computers in the consultation. The educational intervention helped students develop and express an understanding of the importance of consulting and communication skills in teaching and learning about medical informatics tools, hardware and software design, workplace issues, and the impact of clinical computer systems on the consultation and patient care.

  7. Field Scale Monitoring and Modeling of Water and Chemical Transfer in the Vadose Zone

    USDA-ARS?s Scientific Manuscript database

    Natural resource systems involve highly complex interactions of soil-plant-atmosphere-management components that are extremely difficult to quantitatively describe. Computer simulations for prediction and management of watersheds, water supply areas, and agricultural fields and farms have become inc...

  8. Introduction to the 8th International Workshop on Smoltification and a synthesis of major findings

    USGS Publications Warehouse

    Schreck, Carl B.; McCormick, Steven D.; Björnsson, Björn Thrandur; Stefansson, Sigurd O.; Ueda, Hiroshi

    2012-01-01

    The early life history of anadromous salmonid fishes, be they Atlantic (Salmo salar) or Pacific salmon (Oncorhynchus spp.), trout of those genera, or charrs (Salvelinus spp.), appears much more complex than previously thought. The seaward movement or migration is extremely polymorphic among and within species. To help provide understanding of the processes involved, and of the implications for conservation, management, and husbandry, the 8th International Workshop on Smoltification was held on September 20–24, 2009, with participants from 9 different countries. Because the native distribution of these fishes is in northern latitudes, more or less circumglobally, similar workshops have been held roughly every four years in various countries, starting in La Jolla, California; and subsequently in Stirling, Scotland; Trondheim, Norway; St. Andrews, Canada; Muonio, Finland; Westport, Ireland; and Tono, Japan. Papers emanating from these previous workshops can be found in earlier Special Issues of Aquaculture, while those from the 2009 workshop are presented here.

  9. Computer adaptive test approach to the assessment of children and youth with brachial plexus birth palsy.

    PubMed

    Mulcahey, M J; Merenda, Lisa; Tian, Feng; Kozin, Scott; James, Michelle; Gogola, Gloria; Ni, Pengsheng

    2013-01-01

    This study examined the psychometric properties of item pools relevant to upper-extremity function and activity performance and evaluated simulated 5-, 10-, and 15-item computer adaptive tests (CATs). In a multicenter, cross-sectional study of 200 children and youth with brachial plexus birth palsy (BPBP), parents responded to upper-extremity (n = 52) and activity (n = 34) items using a 5-point response scale. We used confirmatory and exploratory factor analysis, ordinal logistic regression, item maps, and standard errors to evaluate the psychometric properties of the item banks. Validity was evaluated using analysis of variance and Pearson correlation coefficients. Results show that the two item pools had acceptable model fit, scaled well for children and youth with BPBP, and had good validity, content range, and precision. Simulated CATs performed comparably to the full item banks, suggesting that a reduced number of items provides information similar to that of the entire item set. Copyright © 2013 by the American Occupational Therapy Association, Inc.
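
    To make the adaptive-test mechanics concrete, here is a minimal sketch of maximum-information item selection under a two-parameter logistic (2PL) IRT model, a standard way CATs choose the next item. The item parameters, names, and ability estimate are invented; the study's actual item banks and scoring model are not reproduced.

      # One CAT step under a 2PL model: administer the unused item with maximum
      # Fisher information at the current ability estimate theta. Invented items.
      import math

      items = {                  # item id -> (discrimination a, difficulty b)
          "reach": (1.2, -0.5),
          "grasp": (0.8, 0.0),
          "dress": (1.5, 0.7),
          "throw": (1.0, 1.2),
      }

      def information(theta, a, b):
          """Fisher information of a 2PL item at ability theta."""
          p = 1.0 / (1.0 + math.exp(-a * (theta - b)))   # P(positive response)
          return a * a * p * (1.0 - p)

      theta_hat = 0.4            # current ability estimate (assumed)
      administered = {"reach"}
      next_item = max((i for i in items if i not in administered),
                      key=lambda i: information(theta_hat, *items[i]))
      print("next item:", next_item)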

  10. Assessment of Slat Noise Predictions for 30P30N High-Lift Configuration From BANC-III Workshop

    NASA Technical Reports Server (NTRS)

    Choudhari, Meelan; Lockard, David P.

    2015-01-01

    This paper presents a summary of the computational predictions and measurement data contributed to Category 7 of the 3rd AIAA Workshop on Benchmark Problems for Airframe Noise Computations (BANC-III), which was held in Atlanta, GA, on June 14-15, 2014. Category 7 represents the first slat-noise configuration to be investigated under the BANC series of workshops, namely, the 30P30N two-dimensional high-lift model (with a slat contour that was slightly modified to enable unsteady pressure measurements) at an angle of attack that is relevant to approach conditions. Originally developed for a CFD challenge workshop to assess computational fluid dynamics techniques for steady high-lift predictions, the 30P30N configuration has provided a valuable opportunity for the airframe noise community to collectively assess and advance the computational and experimental techniques for slat noise. The contributed solutions are compared with each other as well as with the initial measurements that became available just prior to the BANC-III workshop. Specific features of a number of computational solutions on the finer grids compare reasonably well with the initial measurements from the FSU and JAXA facilities and/or with each other. However, no single solution (or subset of solutions) could be identified as clearly superior to the rest. Grid sensitivity studies presented by multiple BANC-III participants demonstrated a relatively consistent trend of reduced surface pressure fluctuations, higher levels of turbulent kinetic energy in the flow, and lower levels of both the narrow-band peaks and the broadband component of the unsteady pressure spectra in the nearfield and farfield. The lessons learned from the BANC-III contributions have been used to identify improvements to the problem statement for future Category-7 investigations.

  11. DOE JGI Quality Metrics; Approaches to Scaling and Improving Metagenome Assembly (Metagenomics Informatics Challenges Workshop: 10K Genomes at a Time)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Copeland, Alex; Brown, C. Titus

    2011-10-13

    DOE JGI's Alex Copeland on "DOE JGI Quality Metrics" and Michigan State University's C. Titus Brown on "Approaches to Scaling and Improving Metagenome Assembly" at the Metagenomics Informatics Challenges Workshop held at the DOE JGI on October 12-13, 2011.

  12. DOE JGI Quality Metrics; Approaches to Scaling and Improving Metagenome Assembly (Metagenomics Informatics Challenges Workshop: 10K Genomes at a Time)

    ScienceCinema

    Copeland, Alex; Brown, C. Titus

    2018-04-27

    DOE JGI's Alex Copeland on "DOE JGI Quality Metrics" and Michigan State University's C. Titus Brown on "Approaches to Scaling and Improving Metagenome Assembly" at the Metagenomics Informatics Challenges Workshop held at the DOE JGI on October 12-13, 2011.

  13. [Computer mediated discussion and attitude polarization].

    PubMed

    Shiraishi, Takashi; Endo, Kimihisa; Yoshida, Fujio

    2002-10-01

    This study examined the hypothesis that computer-mediated discussions lead to more extreme decisions than face-to-face (FTF) meetings. Kiesler, Siegel, & McGuire (1984) claimed that computer-mediated communication (CMC) tended to be relatively uninhibited, as seen in 'flaming', and that group decisions under CMC using the Choice Dilemma Questionnaire tended to be more extreme and riskier than in FTF meetings. However, for the same reason, CMC discussions of controversial social issues, on which participants initially hold strongly opposing views, might be less likely to reach a consensus, and no polarization should occur. Fifteen 4-member groups discussed a controversial social issue under one of three conditions: FTF, CMC, and partition. After discussion, participants rated their position as a group on a 9-point bipolar scale ranging from strong disagreement to strong agreement. A stronger polarization effect was observed for FTF groups than for those whose members were separated by partitions. However, no extreme shift from their original, individual positions was found for CMC participants. These results were discussed in terms of 'expertise and status equalization' and the 'absence of social context cues' under CMC.

  14. When Is a Protocol Broken? (Transcript of Discussion)

    NASA Astrophysics Data System (ADS)

    Christianson, Bruce

    Hello everyone, and welcome to the 15th International Security Protocols Workshop. I know that we're used to having these workshops in Cambridge, but we have a tradition that once every ten years we go to some far-off exotic place instead. The 5th Protocols Workshop was held in Paris, and now this being the 15th one, here we are in sunny Brno. The only credit I can claim for the excellence of the venue and all the arrangements, is that I made the decision to delegate all this to Vashek [Vaclav Matyas]. He and his team have worked extremely hard to do all the local arrangements, and I'm pathetically grateful to them.

  15. Processing of the WLCG monitoring data using NoSQL

    NASA Astrophysics Data System (ADS)

    Andreeva, J.; Beche, A.; Belov, S.; Dzhunov, I.; Kadochnikov, I.; Karavakis, E.; Saiz, P.; Schovancova, J.; Tuckett, D.

    2014-06-01

    The Worldwide LHC Computing Grid (WLCG) today includes more than 150 computing centres where more than 2 million jobs are being executed daily and petabytes of data are transferred between sites. Monitoring the computing activities of the LHC experiments, over such a huge heterogeneous infrastructure, is extremely demanding in terms of computation, performance and reliability. Furthermore, the generated monitoring flow is constantly increasing, which represents another challenge for the monitoring systems. While existing solutions are traditionally based on Oracle for data storage and processing, recent developments evaluate NoSQL for processing large-scale monitoring datasets. NoSQL databases are getting increasingly popular for processing datasets at the terabyte and petabyte scale using commodity hardware. In this contribution, the integration of NoSQL data processing in the Experiment Dashboard framework is described along with first experiences of using this technology for monitoring the LHC computing activities.

  16. Critical exponents of extremal Kerr perturbations

    NASA Astrophysics Data System (ADS)

    Gralla, Samuel E.; Zimmerman, Peter

    2018-05-01

    We show that scalar, electromagnetic, and gravitational perturbations of extremal Kerr black holes are asymptotically self-similar under the near-horizon, late-time scaling symmetry of the background metric. This accounts for the Aretakis instability (growth of transverse derivatives) as a critical phenomenon associated with the emergent symmetry. We compute the critical exponent of each mode, which is equivalent to its decay rate. It follows from symmetry arguments that, despite the growth of transverse derivatives, all generally covariant scalar quantities decay to zero.

  17. Workshops of the Fifth International Brain-Computer Interface Meeting: Defining the Future

    PubMed Central

    Huggins, Jane E.; Guger, Christoph; Allison, Brendan; Anderson, Charles W.; Batista, Aaron; Brouwer, Anne-Marie (A.-M.); Brunner, Clemens; Chavarriaga, Ricardo; Fried-Oken, Melanie; Gunduz, Aysegul; Gupta, Disha; Kübler, Andrea; Leeb, Robert; Lotte, Fabien; Miller, Lee E.; Müller-Putz, Gernot; Rutkowski, Tomasz; Tangermann, Michael; Thompson, David Edward

    2014-01-01

    The Fifth International Brain-Computer Interface (BCI) Meeting was held June 3–7, 2013 at the Asilomar Conference Grounds, Pacific Grove, California. The conference included 19 workshops covering topics in brain-computer interface and brain-machine interface research. Topics included translation of BCIs into clinical use, standardization and certification, types of brain activity to use for BCI, recording methods, the effects of plasticity, special interest topics in BCI applications, and future BCI directions. BCI research is well established and transitioning to practical use to benefit people with physical impairments. At the same time, new applications are being explored, both for people with physical impairments and beyond. Here we provide summaries of each workshop, illustrating the breadth and depth of BCI research and highlighting important issues for future research and development. PMID:25485284

  18. Pre-training to improve workshop performance in supervisor skills: an exploratory study of Latino agricultural workers.

    PubMed

    Austin, J; Alvero, A M; Fuchs, M M; Patterson, L; Anger, W K

    2009-07-01

    Employees with limited education may be excluded from advanced training due to assumptions that they might not learn rapidly. However, preparatory training may be able to overcome missing experience in education. The purpose of this study was to test the hypothesis that computer-based training (CBT) in supervisor skills of Latino agricultural workers would improve subsequent performance in a workshop designed to teach supervisor skills. Ten men born and educated in Mexico participated in the study; all spoke Spanish, the language of the training. Five participants (mean 6.4 years of education) completed supervisor skills CBT, and five participants (mean 8.2 years of education) completed hazard communication (HazCom) CBT as a control condition. Following the CBT, all participants completed a two-day face-to-face workshop on supervisory skills conducted by an experienced behavior management consultant. Although the groups did not differ in their knowledge scores on a multiple-choice test before the face-to-face workshop, after the workshop the HazCom group had a mean test score of 51.2% (SD = 8.7) while the supervisor group had a higher mean test score of 65.2% (SD = 14.3). The difference was marginally significant by a t-test (p = 0.052), and the effect size was large (d = 1.16). The results suggest that computer-based training in supervisor skills can be effective in preparing participants with limited education to learn supervisor skills from a face-to-face workshop. This result suggests that limited educational attainment is not a barrier to learning the complex knowledge required to supervise employees, that pre-training may improve learning in a workshop format, and that training may be presented effectively in a computer-based format to employees with limited education.
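
    The reported effect size can be reproduced from the summary statistics alone; the sketch below does so with SciPy (not a tool mentioned in the study), and its one-tailed reading of the p-value is an interpretation, not something the abstract states.

      # Cohen's d and the two-sample t-test from the reported summary statistics
      # (supervisor group: mean 65.2, SD 14.3; HazCom control: mean 51.2, SD 8.7;
      # n = 5 per group).
      import math
      from scipy import stats

      m1, s1, n1 = 65.2, 14.3, 5
      m2, s2, n2 = 51.2, 8.7, 5

      sp = math.sqrt(((n1 - 1) * s1**2 + (n2 - 1) * s2**2) / (n1 + n2 - 2))
      print(f"Cohen's d = {(m1 - m2) / sp:.2f}")   # ~1.18, near the reported 1.16

      t, p_two = stats.ttest_ind_from_stats(m1, s1, n1, m2, s2, n2)
      # The reported p = 0.052 is close to the one-tailed value computed here:
      print(f"t = {t:.2f}, one-tailed p = {p_two / 2:.3f}")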

  19. Excellence in Physics Education Award Talk: Revitalizing Introductory Physics at Community Colleges and More

    NASA Astrophysics Data System (ADS)

    Hieggelke, Curtis

    2009-05-01

    This project started because many community college physics instructors wanted to improve the learning and understanding of their students in physics. However, these teachers, at that time, were isolated from many of the emerging developments in physics education research and computer technology such as MBL (microcomputer based laboratories). While there were some opportunities within the American Association of Physics Teachers to learn about recent educational developments, there was nothing targeted directly to the unique needs of the two-year college physics community; nor did many of the curriculum developers have much knowledge about this group. The initial goal of this project was to design and provide hands-on workshops to introduce new computer technology, software, curricular materials and approaches arising from physics education research to community college physics teachers. They would then have the background to decide if these new ideas were worthy of adoption and feasible at their institutions. NSF's Division of Undergraduate Education supported these workshop efforts by funding seven different grants from three different programs. These grants have led to 61 workshops with 52 workshop leaders, which were held at 23 community colleges in 14 states for over 1300 participants. This presentation will provide more details about these workshops, and about the subsequent development of the Conceptual Survey on Electricity and Magnetism, and a book on Ranking Tasks edited by us, but written by many participants in the early workshops. In addition, grants were received from NSF for the acquisition and development of computer lab technology that was later featured in some of the workshops. Finally, three NSF grants were received for the development of new educational materials called TIPERs (Tasks Inspired by Physics Education Research) that will be described.

  20. Meeting Stakeholder Energy Technology Education Needs Using a Mobile Demonstration

    ERIC Educational Resources Information Center

    de Koff, Jason P.; Ricketts, John C.; Robbins, Chris; Illukpitiya, Prabodh; Wade, Alvin

    2017-01-01

    Understanding the impact of workshops that include mobile demonstrations for describing technical applications can be useful when planning an Extension program on new energy technologies. We used a mobile demonstration in a workshop that provided information on small-scale on-farm biodiesel production. Evaluation of the workshop outcomes…

  1. Game-Coding Workshops in New Zealand Public Libraries: Evaluation of a Pilot Project

    ERIC Educational Resources Information Center

    Bolstad, Rachel

    2016-01-01

    This report evaluates a game-coding workshop offered to young people and adults in seven public libraries around New Zealand. Participants were taken step by step through the process of creating their own simple 2D videogame, learning the basics of coding, computational thinking, and digital game design. The workshops were free and drew 426 people…

  2. Teaching evidence-based medicine: Impact on students' literature use and inpatient clinical documentation.

    PubMed

    Sastre, Elizabeth Ann; Denny, Joshua C; McCoy, Jacob A; McCoy, Allison B; Spickard, Anderson

    2011-01-01

    Effective teaching of evidence-based medicine (EBM) to medical students is important for lifelong self-directed learning. We implemented a brief workshop designed to teach literature-searching skills to third-year medical students. We assessed its impact on students' utilization of EBM resources during their clinical rotation and on the quality of EBM integration in inpatient notes. We developed a physician-led, hands-on workshop to introduce EBM resources to all internal medicine clerks. Pre- and post-workshop measures included students' attitudes to EBM, citations of EBM resources in their clinical notes, and the quality of the EBM component of the discussion in the note. Computer log analysis recorded students' online search attempts. After the workshop, students reported improved comfort using EBM and increased utilization of EBM resources. EBM integration into the discussion component of the notes also showed significant improvement. Computer log analysis of students' searches demonstrated increased utilization of EBM resources following the workshop. We describe the successful implementation of a workshop designed to teach third-year medical students how to perform an efficient EBM literature search. We demonstrated improvements in students' confidence regarding EBM, increased utilization of EBM resources, and improved integration of EBM into inpatient notes.

  3. Consolidation of cloud computing in ATLAS

    NASA Astrophysics Data System (ADS)

    Taylor, Ryan P.; Domingues Cordeiro, Cristovao Jose; Giordano, Domenico; Hover, John; Kouba, Tomas; Love, Peter; McNab, Andrew; Schovancova, Jaroslava; Sobie, Randall; ATLAS Collaboration

    2017-10-01

    Throughout the first half of LHC Run 2, ATLAS cloud computing has undergone a period of consolidation, characterized by building upon previously established systems, with the aim of reducing operational effort, improving robustness, and reaching higher scale. This paper describes the current state of ATLAS cloud computing. Cloud activities are converging on a common contextualization approach for virtual machines, and cloud resources are sharing monitoring and service discovery components. We describe the integration of Vacuum resources, streamlined usage of the Simulation at Point 1 cloud for offline processing, extreme scaling on Amazon compute resources, and procurement of commercial cloud capacity in Europe. Finally, building on the previously established monitoring infrastructure, we have deployed a real-time monitoring and alerting platform which coalesces data from multiple sources, provides flexible visualization via customizable dashboards, and issues alerts and carries out corrective actions in response to problems.

  4. Workshop Report: Systems Biology for Organotypic Cell Cultures

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Grego, Sonia; Dougherty, Edward R.; Alexander, Francis Joseph

    Translating in vitro biological data into actionable information related to human health holds the potential to improve disease treatment and risk assessment of chemical exposures. While genomics has identified regulatory pathways at the cellular level, translation to the organism level requires a multiscale approach accounting for intra-cellular regulation, inter-cellular interaction, and tissue/organ-level effects. Tissue-level effects can now be probed in vitro thanks to recently developed systems of three-dimensional (3D), multicellular, “organotypic” cell cultures, which mimic functional responses of living tissue. However, there remains a knowledge gap regarding interactions across different biological scales, complicating accurate prediction of health outcomes from molecular/genomic data and tissue responses. Systems biology aims at mathematical modeling of complex, non-linear biological systems. We propose to apply a systems biology approach to achieve a computational representation of tissue-level physiological responses by integrating empirical data derived from organotypic culture systems with computational models of intracellular pathways to better predict human responses. Successful implementation of this integrated approach will provide a powerful tool for faster, more accurate and cost-effective screening of potential toxicants and therapeutics. On September 11, 2015, an interdisciplinary group of scientists, engineers, and clinicians gathered for a workshop in Research Triangle Park, North Carolina, to discuss this ambitious goal. Participants represented laboratory-based and computational modeling approaches to pharmacology and toxicology, as well as the pharmaceutical industry, government, non-profits, and academia. Discussions focused on identifying critical system perturbations to model, the computational tools required, and the experimental approaches best suited to generating key data.

  5. Workshop Report: Systems Biology for Organotypic Cell Cultures

    DOE PAGES

    Grego, Sonia; Dougherty, Edward R.; Alexander, Francis Joseph; ...

    2016-11-14

    Translating in vitro biological data into actionable information related to human health holds the potential to improve disease treatment and risk assessment of chemical exposures. While genomics has identified regulatory pathways at the cellular level, translation to the organism level requires a multiscale approach accounting for intra-cellular regulation, inter-cellular interaction, and tissue/organ-level effects. Tissue-level effects can now be probed in vitro thanks to recently developed systems of three-dimensional (3D), multicellular, “organotypic” cell cultures, which mimic functional responses of living tissue. However, there remains a knowledge gap regarding interactions across different biological scales, complicating accurate prediction of health outcomes from molecular/genomic data and tissue responses. Systems biology aims at mathematical modeling of complex, non-linear biological systems. We propose to apply a systems biology approach to achieve a computational representation of tissue-level physiological responses by integrating empirical data derived from organotypic culture systems with computational models of intracellular pathways to better predict human responses. Successful implementation of this integrated approach will provide a powerful tool for faster, more accurate and cost-effective screening of potential toxicants and therapeutics. On September 11, 2015, an interdisciplinary group of scientists, engineers, and clinicians gathered for a workshop in Research Triangle Park, North Carolina, to discuss this ambitious goal. Participants represented laboratory-based and computational modeling approaches to pharmacology and toxicology, as well as the pharmaceutical industry, government, non-profits, and academia. Discussions focused on identifying critical system perturbations to model, the computational tools required, and the experimental approaches best suited to generating key data.

  6. Test and Evaluation of Architecture-Aware Compiler Environment

    DTIC Science & Technology

    2011-11-01

    ...biology, medicine, social sciences, and security applications. Challenges include extremely large graphs (the Facebook friend network has over...). The report evaluates five challenge problems empirically, exploring their scaling properties, computation and datatype needs, memory behavior, and temporal behavior.

  7. The Need for Optical Means as an Alternative for Electronic Computing

    NASA Technical Reports Server (NTRS)

    Adbeldayem, Hossin; Frazier, Donald; Witherow, William; Paley, Steve; Penn, Benjamin; Bank, Curtis; Whitaker, Ann F. (Technical Monitor)

    2001-01-01

    Demand for faster computers is growing rapidly to keep pace with the rapid growth of the Internet, space communication, and the robotics industry. Unfortunately, very-large-scale integration (VLSI) technology is approaching fundamental limits beyond which devices become unreliable. Optical interconnections and optical integrated circuits are strongly believed to provide a way out of the severe limitations imposed by conventional electronics on the speed and complexity of present-day computing. This paper demonstrates two ultra-fast, all-optical logic gates and a high-density storage medium, which are essential components for building a future optical computer.

  8. An orientation to wellness for new faculty of medicine members: meeting a need in faculty development.

    PubMed

    Brown, Garielle E; Bharwani, Aleem; Patel, Kamala D; Lemaire, Jane B

    2016-08-04

    To evaluate the format, content, and effectiveness of a newly developed orientation to wellness workshop, and to explore participants' overall perceptions. This was a mixed methods study. Participants consisted of 47 new faculty of medicine members who attended one of the four workshops held between 2011 and 2013. Questionnaires were used to evaluate workshop characteristics (10 survey items; response scale 1=unacceptable to 7=outstanding), intention to change behavior (yes/no), and retrospective pre/post workshop self-efficacy (4 survey items; response scale 1=no confidence to 6=absolute confidence). Mean scores and standard deviations were calculated for the workshop characteristics. Pre/post workshop self-efficacy scores were compared using a Wilcoxon signed-rank test. Participants' written qualitative feedback was coded using an inductive strategy to identify themes. There was strong support for the workshop characteristics with mean scores entirely above 6.00 (N=42). Thirty-one of 34 respondents (91%) expressed intention to change their behavior as a result of participating in the workshop. The post workshop self-efficacy scores (N=38 respondents) increased significantly for all four items (p<0.0001) compared to pre workshop ratings. Participants perceived the key workshop elements as the evidence-based content relevant to academic physicians, incorporation of practical tips and strategies, and an atmosphere conducive to discussion and experience sharing. Participants welcomed wellness as a focus of faculty development. Enhancing instruction around wellness has the potential to contribute positively to the professional competency and overall functioning of faculty of medicine members.
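
    For readers unfamiliar with the paired test used here, a minimal sketch of a Wilcoxon signed-rank comparison on invented pre/post self-efficacy ratings (on the 6-point scale above) follows; the values are not the study's data.

      # Wilcoxon signed-rank test on paired pre/post workshop self-efficacy
      # ratings (1 = no confidence ... 6 = absolute confidence). Invented data.
      from scipy import stats

      pre = [2, 3, 2, 4, 3, 2, 3, 4, 2, 3]
      post = [4, 4, 3, 5, 5, 4, 4, 5, 3, 5]

      result = stats.wilcoxon(pre, post)
      print(f"W = {result.statistic}, p = {result.pvalue:.4f}")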

  9. An orientation to wellness for new faculty of medicine members: meeting a need in faculty development

    PubMed Central

    Brown, Garielle E.; Bharwani, Aleem; Patel, Kamala D.

    2016-01-01

    Objectives To evaluate the format, content, and effectiveness of a newly developed orientation to wellness workshop, and to explore participants’ overall perceptions. Methods This was a mixed methods study. Participants consisted of 47 new faculty of medicine members who attended one of the four workshops held between 2011 and 2013. Questionnaires were used to evaluate workshop characteristics (10 survey items; response scale 1=unacceptable to 7=outstanding), intention to change behavior (yes/no), and retrospective pre/post workshop self-efficacy (4 survey items; response scale 1=no confidence to 6=absolute confidence). Mean scores and standard deviations were calculated for the workshop characteristics. Pre/post workshop self-efficacy scores were compared using a Wilcoxon signed-rank test. Participants’ written qualitative feedback was coded using an inductive strategy to identify themes. Results There was strong support for the workshop characteristics with mean scores entirely above 6.00 (N=42). Thirty-one of 34 respondents (91%) expressed intention to change their behavior as a result of participating in the workshop. The post workshop self-efficacy scores (N=38 respondents) increased significantly for all four items (p<0.0001) compared to pre workshop ratings. Participants perceived the key workshop elements as the evidence-based content relevant to academic physicians, incorporation of practical tips and strategies, and an atmosphere conducive to discussion and experience sharing. Conclusions Participants welcomed wellness as a focus of faculty development. Enhancing instruction around wellness has the potential to contribute positively to the professional competency and overall functioning of faculty of medicine members. PMID:27494833

  10. STREAM2016: Streaming Requirements, Experience, Applications and Middleware Workshop

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fox, Geoffrey; Jha, Shantenu; Ramakrishnan, Lavanya

    The Department of Energy (DOE) Office of Science (SC) facilities, including accelerators, light sources, neutron sources, and sensors that study the environment and the atmosphere, are producing streaming data that needs to be analyzed for next-generation scientific discoveries. There has been an explosion of new research and technologies for stream analytics arising from the academic and private sectors. However, there has been no corresponding effort in either documenting the critical research opportunities or building a community that can create and foster productive collaborations. The two-part workshop series STREAM: Streaming Requirements, Experience, Applications and Middleware Workshop (STREAM2015 and STREAM2016) was conducted to bring the community together and identify gaps and future efforts needed by both NSF and DOE. This report describes the discussions, outcomes, and conclusions from STREAM2016: Streaming Requirements, Experience, Applications and Middleware Workshop, the second of these workshops, held on March 22-23, 2016, in Tysons, VA. STREAM2016 focused on Department of Energy (DOE) applications, computational and experimental facilities, as well as software systems. Thus, the role of “streaming and steering” as a critical mode of connecting the experimental and computing facilities was pervasive throughout the workshop. Given the overlap in interests and challenges with industry, the workshop had significant presence from several innovative companies and major contributors. The requirements that drive the proposed research directions identified in this report show an important opportunity for building a competitive research and development program around streaming data. These findings and recommendations are consistent with the vision outlined in the NRC Frontiers of Data report and the National Strategic Computing Initiative (NSCI) [1, 2]. The discussions from the workshop are captured as the topic areas covered in this report's sections. The report discusses four research directions driven by current and future application requirements, reflecting the areas identified as important by STREAM2016: (i) Algorithms; (ii) Programming Models, Languages, and Runtime Systems; (iii) Human-in-the-Loop and Steering in Scientific Workflows; and (iv) Facilities.

  11. Science & Technology Review: September 2016

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vogt, Ramona L.; Meissner, Caryn N.; Chinn, Ken B.

    2016-09-30

    This is the September issue of the Lawrence Livermore National Laboratory's Science & Technology Review, which communicates, to a broad audience, the Laboratory’s scientific and technological accomplishments in fulfilling its primary missions. This month, there are features on "Laboratory Investments Drive Computational Advances" and "Laying the Groundwork for Extreme-Scale Computing." Research highlights include "Nuclear Data Moves into the 21st Century", "Peering into the Future of Lick Observatory", and "Facility Drives Hydrogen Vehicle Innovations."

  12. FOREWORD: 5th International Workshop on New Computational Methods for Inverse Problems

    NASA Astrophysics Data System (ADS)

    Vourc'h, Eric; Rodet, Thomas

    2015-11-01

    This volume of Journal of Physics: Conference Series is dedicated to the scientific research presented during the 5th International Workshop on New Computational Methods for Inverse Problems, NCMIP 2015 (http://complement.farman.ens-cachan.fr/NCMIP_2015.html). This workshop took place at Ecole Normale Supérieure de Cachan, on May 29, 2015. The prior editions of NCMIP also took place in Cachan, France, firstly within the scope of ValueTools Conference, in May 2011, and secondly at the initiative of Institut Farman, in May 2012, May 2013 and May 2014. The New Computational Methods for Inverse Problems (NCMIP) workshop focused on recent advances in the resolution of inverse problems. Indeed, inverse problems appear in numerous scientific areas such as geophysics, biological and medical imaging, material and structure characterization, electrical, mechanical and civil engineering, and finances. The resolution of inverse problems consists of estimating the parameters of the observed system or structure from data collected by an instrumental sensing or imaging device. Its success firstly requires the collection of relevant observation data. It also requires accurate models describing the physical interactions between the instrumental device and the observed system, as well as the intrinsic properties of the solution itself. Finally, it requires the design of robust, accurate and efficient inversion algorithms. Advanced sensor arrays and imaging devices provide high rate and high volume data; in this context, the efficient resolution of the inverse problem requires the joint development of new models and inversion methods, taking computational and implementation aspects into account. During this one-day workshop, researchers had the opportunity to bring to light and share new techniques and results in the field of inverse problems. The topics of the workshop were: algorithms and computational aspects of inversion, Bayesian estimation, Kernel methods, learning methods, convex optimization, free discontinuity problems, metamodels, proper orthogonal decomposition, reduced models for the inversion, non-linear inverse scattering, image reconstruction and restoration, and applications (bio-medical imaging, non-destructive evaluation...). NCMIP 2015 was a one-day workshop held in May 2015 which attracted around 70 attendees. Each of the submitted papers has been reviewed by two reviewers. There have been 15 accepted papers. In addition, three international speakers were invited to present a longer talk. The workshop was supported by Institut Farman (ENS Cachan, CNRS) and endorsed by the following French research networks: GDR ISIS, GDR MIA, GDR MOA and GDR Ondes. The program committee acknowledges the following research laboratories: CMLA, LMT, LURPA and SATIE.
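
    As a concrete instance of the inversion algorithms this workshop covers, the sketch below solves a linear inverse problem with Tikhonov (ridge) regularization, one of the convex-optimization approaches listed above; the forward operator, noise level, and regularization weight are invented for illustration.

      # Tikhonov-regularized solution of a linear inverse problem y = A x + noise:
      #   x_hat = argmin ||A x - y||^2 + lam ||x||^2 = (A^T A + lam I)^{-1} A^T y
      import numpy as np

      rng = np.random.default_rng(0)
      A = rng.normal(size=(40, 20))            # forward operator (assumed known)
      x_true = np.zeros(20)
      x_true[[3, 7, 12]] = [1.0, -2.0, 0.5]    # sparse "ground truth"
      y = A @ x_true + 0.05 * rng.normal(size=40)

      lam = 0.1                                # regularization weight (tuned by hand)
      x_hat = np.linalg.solve(A.T @ A + lam * np.eye(20), A.T @ y)

      rel_err = np.linalg.norm(x_hat - x_true) / np.linalg.norm(x_true)
      print(f"relative reconstruction error: {rel_err:.3f}")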

  13. Spatial variability of extreme rainfall at radar subpixel scale

    NASA Astrophysics Data System (ADS)

    Peleg, Nadav; Marra, Francesco; Fatichi, Simone; Paschalis, Athanasios; Molnar, Peter; Burlando, Paolo

    2018-01-01

    Extreme rainfall is quantified in engineering practice using Intensity-Duration-Frequency (IDF) curves that are traditionally derived from rain gauges and, more recently, also from remote sensing instruments such as weather radars. These instruments measure rainfall at different spatial scales: a rain gauge samples rainfall at the point scale, while a weather radar averages precipitation over a relatively large area, generally around 1 km2. As such, a radar-derived IDF curve is representative of the mean areal rainfall over a given radar pixel and neglects the within-pixel rainfall variability. In this study, we quantify the subpixel variability of extreme rainfall by using a novel space-time rainfall generator (the STREAP model) that downscales in space the rainfall within a given radar pixel. The study was conducted using a unique radar data record (23 years) and a very dense rain-gauge network in the Eastern Mediterranean area (northern Israel). Radar IDF curves, together with an ensemble of point-based IDF curves representing the radar subpixel extreme rainfall variability, were developed by fitting Generalized Extreme Value (GEV) distributions to annual rainfall maxima. It was found that the mean areal extreme rainfall derived from the radar underestimates most of the extreme values computed for point locations within the radar pixel (on average, ∼70%). The subpixel variability of extreme rainfall was found to increase with longer return periods and shorter durations (e.g., from a maximum variability of 10% for a return period of 2 years and a duration of 4 h to 30% for a 50-year return period and a 20-min duration). For the longer return periods, a considerable enhancement of extreme rainfall variability was found when stochastic (natural) climate variability was taken into account. Bounding the range of subpixel extreme rainfall derived from radar IDF curves can be of major importance for applications that require very local estimates of rainfall extremes.
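
    A minimal sketch of the GEV step described above, fitting annual maxima and reading off a return level, follows; the annual-maximum series is synthetic, not the radar or gauge record used in the study.

      # Fit a Generalized Extreme Value (GEV) distribution to annual rainfall
      # maxima and compute a T-year return level. Synthetic 23-"year" record.
      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(1)
      annual_max_mm = stats.genextreme.rvs(c=-0.1, loc=40.0, scale=12.0,
                                           size=23, random_state=rng)

      shape, loc, scale = stats.genextreme.fit(annual_max_mm)

      T = 50  # return period in years
      level = stats.genextreme.ppf(1.0 - 1.0 / T, shape, loc=loc, scale=scale)
      print(f"{T}-year return level: {level:.1f} mm")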

  14. Are X-rays the key to integrated computational materials engineering?

    DOE PAGES

    Ice, Gene E.

    2015-11-01

    The ultimate dream of materials science is to predict materials behavior from composition and processing history. Owing to the growing power of computers, this long-time dream has recently found expression through worldwide excitement in a number of computation-based thrusts: integrated computational materials engineering, materials by design, computational materials design, three-dimensional materials physics, and mesoscale physics. However, real materials have important crystallographic structures at multiple length scales, which evolve during processing and in service. Moreover, real materials properties can depend on the extreme tails of their structural and chemical distributions. This makes it critical to map structural distributions with sufficient resolution to resolve small structures and with sufficient statistics to capture the tails of distributions. For two-dimensional materials, there are high-resolution nondestructive probes of surface and near-surface structures with atomic or near-atomic resolution that can provide detailed structural, chemical, and functional distributions over important length scales. However, there are no nondestructive three-dimensional probes with atomic resolution over the multiple length scales needed to understand most materials.

  15. Fourth User Workshop on High-Power Lasers at the Linac Coherent Light Source

    DOE PAGES

    Bolme, Cindy Anne; Mackinnon, Andy; Glenzer, Siegfried

    2017-05-30

    The fourth international user workshop focusing on high-power lasers at the Linac Coherent Light Source (LCLS) was held in Menlo Park, CA, USA, on October 3–4, 2016. The workshop was co-organized by Los Alamos National Laboratory and SLAC National Accelerator Laboratory (SLAC), and garnered the attendance of more than 110 scientists. Participants discussed the warm dense matter and high-pressure science that is being conducted using high-power lasers at the LCLS Matter in Extreme Conditions (MEC) endstation. During the past year, there have been seven journal articles published from research at the MEC instrument. Here, the specific topics discussed at this workshop were experimental highlights from the past year, current status and future commissioning of MEC capabilities, and future facility upgrades that will enable the expanded science reach of the facility.

  16. Fourth User Workshop on High-Power Lasers at the Linac Coherent Light Source

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bolme, Cindy Anne; Mackinnon, Andy; Glenzer, Siegfried

    The fourth international user workshop focusing on high-power lasers at the Linac Coherent Light Source (LCLS) was held in Menlo Park, CA, USA, on October 3–4, 2016. The workshop was co-organized by Los Alamos National Laboratory and SLAC National Accelerator Laboratory (SLAC), and garnered the attendance of more than 110 scientists. Participants discussed the warm dense matter and high-pressure science that is being conducted using high-power lasers at the LCLS Matter in Extreme Conditions (MEC) endstation. During the past year, there have been seven journal articles published from research at the MEC instrument. Here, the specific topics discussed at this workshop were experimental highlights from the past year, current status and future commissioning of MEC capabilities, and future facility upgrades that will enable the expanded science reach of the facility.

  17. Multiple multicontrol unitary operations: Implementation and applications

    NASA Astrophysics Data System (ADS)

    Lin, Qing

    2018-04-01

    The efficient implementation of computational tasks is critical to quantum computations. In quantum circuits, multicontrol unitary operations are important components. Here, we present an extremely efficient and direct approach to multiple multicontrol unitary operations without decomposition into CNOT and single-photon gates. With the proposed approach, the necessary two-photon operations could be reduced from O(n^3) with the traditional decomposition approach to O(n), which will greatly relax the requirements and make large-scale quantum computation feasible. Moreover, we propose the potential application to the (n-k)-uniform hypergraph state.
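
    To illustrate why multicontrol gate cost matters in the circuit model (this is not the photonic construction proposed in the paper), the sketch below builds an n-controlled X in Qiskit and counts the CNOTs produced by a standard decomposition; Qiskit and its default synthesis are assumptions of the example.

      # Count two-qubit gates left after decomposing an n-controlled X into a
      # CX + single-qubit basis; gate counts depend on the synthesis method.
      from qiskit import QuantumCircuit, transpile

      for n in (2, 4, 6, 8):
          qc = QuantumCircuit(n + 1)
          qc.mcx(list(range(n)), n)      # n control qubits, target on the last
          decomposed = transpile(qc, basis_gates=["cx", "u"], optimization_level=1)
          print(n, "controls ->", decomposed.count_ops().get("cx", 0), "CX gates")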

  18. RICIS research

    NASA Technical Reports Server (NTRS)

    Mckay, Charles W.; Feagin, Terry; Bishop, Peter C.; Hallum, Cecil R.; Freedman, Glenn B.

    1987-01-01

    The principal focus of one of the RICIS (Research Institute for Computing and Information Systems) components is computer systems and software engineering in-the-large of the lifecycle of large, complex, distributed systems which: (1) evolve incrementally over a long time; (2) contain non-stop components; and (3) must simultaneously satisfy a prioritized balance of mission- and safety-critical requirements at run time. This focus is extremely important because of the contribution of the scaling-direction problem to the current software crisis. The Computer Systems and Software Engineering (CSSE) component addresses the lifecycle issues of three environments: host, integration, and target.

  19. Workshops of the Sixth International Brain–Computer Interface Meeting: brain–computer interfaces past, present, and future

    PubMed Central

    Huggins, Jane E.; Guger, Christoph; Ziat, Mounia; Zander, Thorsten O.; Taylor, Denise; Tangermann, Michael; Soria-Frisch, Aureli; Simeral, John; Scherer, Reinhold; Rupp, Rüdiger; Ruffini, Giulio; Robinson, Douglas K. R.; Ramsey, Nick F.; Nijholt, Anton; Müller-Putz, Gernot; McFarland, Dennis J.; Mattia, Donatella; Lance, Brent J.; Kindermans, Pieter-Jan; Iturrate, Iñaki; Herff, Christian; Gupta, Disha; Do, An H.; Collinger, Jennifer L.; Chavarriaga, Ricardo; Chase, Steven M.; Bleichner, Martin G.; Batista, Aaron; Anderson, Charles W.; Aarnoutse, Erik J.

    2017-01-01

    The Sixth International Brain–Computer Interface (BCI) Meeting was held 30 May–3 June 2016 at the Asilomar Conference Grounds, Pacific Grove, California, USA. The conference included 28 workshops covering topics in BCI and brain–machine interface research. Topics included BCI for specific populations or applications, advancing BCI research through use of specific signals or technological advances, and translational and commercial issues to bring both implanted and non-invasive BCIs to market. BCI research is growing and expanding in the breadth of its applications, the depth of knowledge it can produce, and the practical benefit it can provide both for those with physical impairments and the general public. Here we provide summaries of each workshop, illustrating the breadth and depth of BCI research and highlighting important issues and calls for action to support future research and development. PMID:29152523

  20. Workshops of the Sixth International Brain-Computer Interface Meeting: brain-computer interfaces past, present, and future.

    PubMed

    Huggins, Jane E; Guger, Christoph; Ziat, Mounia; Zander, Thorsten O; Taylor, Denise; Tangermann, Michael; Soria-Frisch, Aureli; Simeral, John; Scherer, Reinhold; Rupp, Rüdiger; Ruffini, Giulio; Robinson, Douglas K R; Ramsey, Nick F; Nijholt, Anton; Müller-Putz, Gernot; McFarland, Dennis J; Mattia, Donatella; Lance, Brent J; Kindermans, Pieter-Jan; Iturrate, Iñaki; Herff, Christian; Gupta, Disha; Do, An H; Collinger, Jennifer L; Chavarriaga, Ricardo; Chase, Steven M; Bleichner, Martin G; Batista, Aaron; Anderson, Charles W; Aarnoutse, Erik J

    2017-01-01

    The Sixth International Brain-Computer Interface (BCI) Meeting was held 30 May-3 June 2016 at the Asilomar Conference Grounds, Pacific Grove, California, USA. The conference included 28 workshops covering topics in BCI and brain-machine interface research. Topics included BCI for specific populations or applications, advancing BCI research through use of specific signals or technological advances, and translational and commercial issues to bring both implanted and non-invasive BCIs to market. BCI research is growing and expanding in the breadth of its applications, the depth of knowledge it can produce, and the practical benefit it can provide both for those with physical impairments and the general public. Here we provide summaries of each workshop, illustrating the breadth and depth of BCI research and highlighting important issues and calls for action to support future research and development.

  1. Extreme Scale Computing Studies

    DTIC Science & Technology

    2010-12-01

    Approved for publication in accordance with assigned distribution statement. Contributors listed in the report include William Carlson (Institute for Defense Analyses), William Dally (Stanford University), Monty Denneau (IBM T. J. Watson Research Laboratories), and Paul Franzon (North Carolina State...).

  2. ExScal Backbone Network Architecture

    DTIC Science & Technology

    2005-01-01

    ...802.11 battery-powered nodes was laid over the sensor network. We adopted the Stargate platform for the backbone tier to serve as the basis for...its head. XSS Hardware and Network: XSS stands for eXtreme Scaling Stargate. A Stargate is a Linux-based single-board computer. It has a 400 MHz...

  3. Computer Simulation for Emergency Incident Management

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brown, D L

    2004-12-03

    This report describes the findings and recommendations resulting from the Department of Homeland Security (DHS) Incident Management Simulation Workshop held by the DHS Advanced Scientific Computing Program in May 2004. This workshop brought senior representatives of the emergency response and incident-management communities together with modeling and simulation technologists from Department of Energy laboratories. The workshop provided an opportunity for incident responders to describe the nature and substance of the primary personnel roles in an incident response, to identify current and anticipated roles of modeling and simulation in support of incident response, and to begin a dialog between the incident response and simulation technology communities that will guide and inform planned modeling and simulation development for incident response. This report provides a summary of the discussions at the workshop as well as a summary of simulation capabilities that are relevant to incident-management training, and recommendations for the use of simulation in both incident management and in incident management training, based on the discussions at the workshop. In addition, the report discusses areas where further research and development will be required to support future needs in this area.

  4. Target matching based on multi-view tracking

    NASA Astrophysics Data System (ADS)

    Liu, Yahui; Zhou, Changsheng

    2011-01-01

    A feature-matching method based on Maximally Stable Extremal Regions (MSER) and the Scale Invariant Feature Transform (SIFT) is proposed to solve the problem of matching the same target across multiple cameras. The target foreground is extracted using two frame differences, and a bounding box regarded as the target region is computed. Extremal regions are obtained with MSER; after being fitted to ellipses, these regions are normalized to unit circles and represented with SIFT descriptors. Initial matches are accepted where the ratio of the closest to the second-closest descriptor distance falls below a threshold, and outliers are eliminated with RANSAC. Experimental results indicate that the method effectively reduces computational complexity and is robust to affine transformation, rotation, scale, and illumination changes.
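
    As an illustration of the ratio-test and RANSAC stage described above, here is a minimal OpenCV sketch (a sketch only: the file names are hypothetical and the MSER ellipse-fitting and normalization steps are omitted):

        import cv2
        import numpy as np

        img1 = cv2.imread("view_a.png", cv2.IMREAD_GRAYSCALE)  # hypothetical inputs
        img2 = cv2.imread("view_b.png", cv2.IMREAD_GRAYSCALE)

        sift = cv2.SIFT_create()
        kp1, des1 = sift.detectAndCompute(img1, None)
        kp2, des2 = sift.detectAndCompute(img2, None)

        # Ratio test: keep a match only if the closest descriptor is clearly
        # better than the second closest.
        matcher = cv2.BFMatcher(cv2.NORM_L2)
        good = [m for m, n in matcher.knnMatch(des1, des2, k=2)
                if m.distance < 0.75 * n.distance]

        # RANSAC on a homography discards geometrically inconsistent outliers.
        src = np.float32([kp1[m.queryIdx].pt for m in good]).reshape(-1, 1, 2)
        dst = np.float32([kp2[m.trainIdx].pt for m in good]).reshape(-1, 1, 2)
        H, mask = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)
        inliers = [m for m, keep in zip(good, mask.ravel()) if keep]
        print(f"{len(inliers)} inlier matches out of {len(good)}")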

  5. Computation of Flow Over a Drag Prediction Workshop Wing/Body Transport Configuration Using CFL3D

    NASA Technical Reports Server (NTRS)

    Rumsey, Christopher L.; Biedron, Robert T.

    2001-01-01

    A Drag Prediction Workshop was held in conjunction with the 19th AIAA Applied Aerodynamics Conference in June 2001. The purpose of the workshop was to assess the prediction of drag by computational methods for a wing/body configuration (DLR-F4) representative of subsonic transport aircraft. This report details computed results submitted to this workshop using the Reynolds-averaged Navier-Stokes code CFL3D. Two supplied grids were used: a point-matched 1-to-1 multi-block grid, and an overset multi-block grid. The 1-to-1 grid, generally of much poorer quality and with less streamwise resolution than the overset grid, is found to be too coarse to adequately resolve the surface pressures. However, the global forces and moments are nonetheless similar to those computed using the overset grid. The effect of three different turbulence models is assessed using the 1-to-1 grid. Surface pressures are very similar overall, and the drag variation due to turbulence model is 18 drag counts. Most of this drag variation is in the friction component, and is attributed in part to insufficient grid resolution of the 1-to-1 grid. The misnomer of 'fully turbulent' computations is discussed; comparisons are made using different transition locations and their effects on the global forces and moments are quantified. Finally, the effect of two different versions of a widely used one-equation turbulence model is explored.

  6. Characterization and prediction of extreme events in turbulence

    NASA Astrophysics Data System (ADS)

    Fonda, Enrico; Iyer, Kartik P.; Sreenivasan, Katepalli R.

    2017-11-01

    Extreme events in Nature, such as tornadoes, large floods and strong earthquakes, are rare but can have devastating consequences, and their predictability is very limited at present. Extreme events in turbulence are the very large, intermittent excursions that occur at small scales. We examine events in the energy dissipation rate and enstrophy that reach several tens to thousands of times the mean value. To this end we use our DNS database of homogeneous and isotropic turbulence with Taylor Reynolds numbers spanning a decade, computed with different small-scale resolutions and different box sizes, and study the predictability of these events using machine learning. We start with aggressive data augmentation to virtually increase the number of these rare events by two orders of magnitude and train a deep convolutional neural network to predict their occurrence in an independent data set. The goal of the work is to explore whether extreme events can be predicted with greater assurance than by conventional methods (e.g., D.A. Donzis & K.R. Sreenivasan, J. Fluid Mech. 647, 13-26, 2010).
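
    A toy sketch of the augmentation-plus-CNN pipeline the abstract describes, assuming 2D scalar-field slices labeled by whether an extreme event occurs; the shapes, threshold, architecture, and augmentation factor are illustrative, not the authors':

        import torch
        import torch.nn as nn

        class ExtremeEventCNN(nn.Module):
            """Small CNN that maps a field snapshot to one 'extreme event' logit."""
            def __init__(self):
                super().__init__()
                self.net = nn.Sequential(
                    nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
                    nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(),
                    nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(32, 1),
                )
            def forward(self, x):
                return self.net(x)

        def augment(fields, labels, factor=100):
            """Oversample the rare positive class with rotations and flips,
            which preserve the statistics of isotropic turbulence."""
            pos = fields[labels == 1]
            aug = [torch.rot90(pos, k, dims=(-2, -1)) for k in range(4)]
            aug += [torch.flip(a, dims=(-1,)) for a in aug]
            aug = torch.cat(aug * max(1, factor // len(aug)))
            return (torch.cat([fields, aug]),
                    torch.cat([labels, torch.ones(len(aug))]))

        fields = torch.randn(256, 1, 64, 64)                          # stand-in snapshots
        labels = (fields.amax(dim=(-2, -1)).squeeze(1) > 4.5).float() # rare events
        x, y = augment(fields, labels)
        loss = nn.BCEWithLogitsLoss()(ExtremeEventCNN()(x).squeeze(1), y)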

  7. GLAD: a system for developing and deploying large-scale bioinformatics grid.

    PubMed

    Teo, Yong-Meng; Wang, Xianbing; Ng, Yew-Kwong

    2005-03-01

    Grid computing is used to solve large-scale bioinformatics problems involving gigabyte-scale databases by distributing the computation across multiple platforms. In developing bioinformatics grid applications, it has been extremely tedious to design and implement the component algorithms and parallelization techniques for different classes of problems, and to access remotely located sequence database files of varying formats across the grid. In this study, we propose a grid programming toolkit, GLAD (Grid Life sciences Applications Developer), which facilitates the development and deployment of bioinformatics applications on a grid. GLAD has been developed using ALiCE (Adaptive scaLable Internet-based Computing Engine), a Java-based grid middleware that exploits task-based parallelism. Two benchmark bioinformatics applications, distributed sequence comparison and distributed progressive multiple sequence alignment, have been developed using GLAD.
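
    GLAD's own API is not given in the abstract, but the task-based parallelism it exploits can be sketched generically: a master farms out independent pairwise comparisons to workers. A minimal stand-in using Python's multiprocessing (the similarity kernel is a toy placeholder for a real alignment algorithm):

        from itertools import combinations
        from multiprocessing import Pool

        def similarity(pair):
            """Toy score: fraction of matching positions (a placeholder for a
            real kernel such as Smith-Waterman alignment)."""
            a, b = pair
            return sum(x == y for x, y in zip(a, b)) / min(len(a), len(b))

        if __name__ == "__main__":
            seqs = ["ACGTACGT", "ACGTTCGT", "TTGTACGA", "ACGAACGT"]
            with Pool() as pool:  # each pairwise comparison is an independent task
                scores = pool.map(similarity, combinations(seqs, 2))
            print(scores)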

  8. EDITORIAL: Selected papers from the 9th International Workshop on Micro and Nanotechnology for Power Generation and Energy Conversion Applications (PowerMEMS 2009) Selected papers from the 9th International Workshop on Micro and Nanotechnology for Power Generation and Energy Conversion Applications (PowerMEMS 2009)

    NASA Astrophysics Data System (ADS)

    Ghodssi, Reza; Livermore, Carol; Arnold, David

    2010-10-01

    This special section of the Journal of Micromechanics and Microengineering presents papers selected from the 9th International Workshop on Micro and Nanotechnology for Power Generation and Energy Conversion Applications (PowerMEMS 2009), which was held in Washington DC, USA from 1-4 December 2009. Since it was first held in Sendai, Japan in 2000, the PowerMEMS workshop has focused on small-scale systems that process, convert, or generate macroscopically significant amounts of power, typically with high power density or high energy density. In the workshop's early years, much of the research presented was on small-scale fueled systems, such as micro heat engines and micro fuel cells. The past nine years have seen a dramatic expansion in the range of technologies that are brought to bear on the challenge of high-power, small-scale systems, as well as an increase in the applications for such technologies. At this year's workshop, 158 contributed papers were presented, along with invited and plenary presentations. The papers focused on applications from micro heat engines and fuel cells, to energy harvesting and its enabling electronics, to thermal management and propulsion. Also presented were the technologies that enable these applications, such as the structuring of microscale, nanoscale and biological systems for power applications, as well as combustion and catalysis at small scales. This special section includes a selection of 12 expanded papers representing energy harvesting, chemical and fueled systems, and elastic energy storage at small scales. We would like to express our appreciation to the members of the International Steering Committee, the Technical Program Committee, the Local Organizing Committee, and to the workshop's financial supporters. We are grateful to the referees for their contributions to the review process. Finally, we would like to thank Dr Ian Forbes, the editorial staff of the Journal of Micromechanics and Microengineering, and the staff of IOP Publishing for making this special section possible.

  9. Crossing disciplines and scales to understand the critical zone

    USGS Publications Warehouse

    Brantley, S.L.; Goldhaber, M.B.; Vala, Ragnarsdottir K.

    2007-01-01

    The Critical Zone (CZ) is the system of coupled chemical, biological, physical, and geological processes operating together to support life at the Earth's surface. While our understanding of this zone has increased over the last hundred years, further advance requires scientists to cross disciplines and scales to integrate understanding of processes in the CZ, ranging in scale from the mineral-water interface to the globe. Despite the extreme heterogeneities manifest in the CZ, patterns are observed at all scales. Explanations require the use of new computational and analytical tools, inventive interdisciplinary approaches, and growing networks of sites and people.

  10. Employability and Technical Skill Required to Establish a Small Scale Automobile Workshop

    ERIC Educational Resources Information Center

    Olaitan, Olawale O.; Ikeh, Joshua O.

    2015-01-01

    The study focused on identifying the employability and technical skills needed to establish small-scale automobile workshop in Nsukka Urban of Enugu State. Five purposes of the study were stated to guide the study. Five research questions were stated and answered in line with the purpose of the study. The population for the study is 1,500…

  11. Statistical Analysis of CFD Solutions from the 6th AIAA CFD Drag Prediction Workshop

    NASA Technical Reports Server (NTRS)

    Derlaga, Joseph M.; Morrison, Joseph H.

    2017-01-01

    A graphical framework is used for statistical analysis of the results from an extensive N-version test of a collection of Reynolds-averaged Navier-Stokes computational fluid dynamics codes. The solutions were obtained by code developers and users from North America, Europe, Asia, and South America using both common and custom grid sequences as well as multiple turbulence models for the June 2016 6th AIAA CFD Drag Prediction Workshop sponsored by the AIAA Applied Aerodynamics Technical Committee. The aerodynamic configuration for this workshop was the Common Research Model subsonic transport wing-body previously used for both the 4th and 5th Drag Prediction Workshops. This work continues the statistical analysis begun in the earlier workshops and compares the results from the grid convergence study of the most recent workshop with those of previous workshops.

  12. Energy Systems Integration Facility (ESIF) External Stakeholders Workshop: Workshop Proceedings, 9 October 2008, Golden, Colorado

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Komomua, C.; Kroposki, B.; Mooney, D.

    2009-01-01

    On October 9, 2008, NREL hosted a workshop to provide an opportunity for external stakeholders to offer insights and recommendations on the design and functionality of DOE's planned Energy Systems Integration Facility (ESIF). The goal was to ensure that the planning for the ESIF effectively addresses the most critical barriers to large-scale energy efficiency (EE) and renewable energy (RE) deployment. This technical report documents the ESIF workshop proceedings.

  13. Computation in Physics: Resources and Support

    NASA Astrophysics Data System (ADS)

    Engelhardt, Larry; Caballero, Marcos; Chonacky, Norman; Hilborn, Robert; Lopez Del Puerto, Marie; Roos, Kelly

    We will describe exciting new resources and support opportunities that have been developed by "PICUP" to help faculty integrate computation into their physics courses ("PICUP" is the "Partnership for Integration of Computation into Undergraduate Physics"). These resources include editable curricular materials that can be downloaded from the PICUP Collection of the ComPADRE Digital Library: www.compadre.org/PICUP. Support opportunities include week-long workshops during the summer and single-day workshops at national AAPT and APS meetings. This project is funded by the National Science Foundation under DUE IUSE Grants 1524128, 1524493, 1524963, 1525062, and 1525525.

  14. Computer systems

    NASA Technical Reports Server (NTRS)

    Olsen, Lola

    1992-01-01

    In addition to the discussions, Ocean Climate Data Workshop hosts gave participants an opportunity to hear about, see, and test for themselves some of the latest computer tools now available for those studying climate change and the oceans. Six speakers described computer systems and their functions. The introductory talks were followed by demonstrations to small groups of participants and some opportunities for participants to get hands-on experience. After this familiarization period, attendees were invited to return during the course of the Workshop and have one-on-one discussions and further hands-on experience with these systems. Brief summaries or abstracts of introductory presentations are addressed.

  15. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chow, Edmond

    Solving sparse problems is at the core of many DOE computational science applications. We focus on the challenge of developing sparse algorithms that can fully exploit the parallelism in extreme-scale computing systems, in particular systems with massive numbers of cores per node. Our approach is to express a sparse matrix factorization as a large number of bilinear constraint equations and then to solve these equations via an asynchronous iterative method. The unknowns in these equations are the entries of the desired factorization.
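
    A minimal dense, sequential sketch of this formulation for an ILU(0)-style factorization: each nonzero entry of L and U is an unknown in a bilinear equation (LU)_ij = A_ij restricted to the sparsity pattern of A, and repeated sweeps over the entries (which may run asynchronously in parallel in the real setting) drive them toward a fixed point. Convergence is assumed here, e.g. for diagonally dominant A:

        import numpy as np

        def fine_grained_ilu0(A, sweeps=5):
            A = np.asarray(A, dtype=float)
            n = A.shape[0]
            pattern = [(i, j) for i in range(n) for j in range(n) if A[i, j] != 0]
            U = np.triu(A)                                # initial guess for U
            L = np.eye(n) + np.tril(A, -1) / np.diag(A)   # unit-diagonal guess for L
            for _ in range(sweeps):
                for i, j in pattern:      # sweeps may visit entries in any order
                    if i > j:             # unknown L[i, j]
                        L[i, j] = (A[i, j] - L[i, :j] @ U[:j, j]) / U[j, j]
                    else:                 # unknown U[i, j]
                        U[i, j] = A[i, j] - L[i, :i] @ U[:i, j]
            return L, U

        A = np.array([[4., 1, 0], [1, 4, 1], [0, 1, 4]])
        L, U = fine_grained_ilu0(A)
        print(np.round(L @ U - A, 6))  # ~0: a tridiagonal pattern has no fill-in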

  16. ExM:System Support for Extreme-Scale, Many-Task Applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Katz, Daniel S

    The ever-increasing power of supercomputer systems is both driving and enabling the emergence of new problem-solving methods that require the efficient execution of many concurrent and interacting tasks. Methodologies such as rational design (e.g., in materials science), uncertainty quantification (e.g., in engineering), parameter estimation (e.g., for chemical and nuclear potential functions, and in economic energy systems modeling), massive dynamic graph pruning (e.g., in phylogenetic searches), Monte-Carlo-based iterative fixing (e.g., in protein structure prediction), and inverse modeling (e.g., in reservoir simulation) all have these requirements. These many-task applications frequently have aggregate computing needs that demand the fastest computers. For example, proposed next-generation climate model ensemble studies will involve 1,000 or more runs, each requiring 10,000 cores for a week, to characterize model sensitivity to initial condition and parameter uncertainty. The goal of the ExM project is to achieve the technical advances required to execute such many-task applications efficiently, reliably, and easily on petascale and exascale computers. In this way, we will open up extreme-scale computing to new problem-solving methods and application classes. In this document, we report on the combined technical progress of the collaborative ExM project, and the institutional financial status of the portion of the project at the University of Chicago, over the first 8 months (through April 30, 2011).

  17. Building shared experience to advance practical application of pathway-based toxicology: liver toxicity mode-of-action.

    PubMed

    Willett, Catherine; Caverly Rae, Jessica; Goyak, Katy O; Minsavage, Gary; Westmoreland, Carl; Andersen, Melvin; Avigan, Mark; Duché, Daniel; Harris, Georgina; Hartung, Thomas; Jaeschke, Hartmut; Kleensang, Andre; Landesmann, Brigitte; Martos, Suzanne; Matevia, Marilyn; Toole, Colleen; Rowan, Andrew; Schultz, Terry; Seed, Jennifer; Senior, John; Shah, Imran; Subramanian, Kalyanasundaram; Vinken, Mathieu; Watkins, Paul

    2014-01-01

    A workshop sponsored by the Human Toxicology Project Consortium (HTPC), "Building Shared Experience to Advance Practical Application of Pathway-Based Toxicology: Liver Toxicity Mode-of-Action," brought together experts from a wide range of perspectives to inform the process of pathway development and to advance two prototype pathways initially developed by the European Commission Joint Research Center (JRC): liver-specific fibrosis and steatosis. The first half of the workshop focused on the theory and practice of pathway development; the second on liver disease and the two prototype pathways. Participants agreed that pathway development is extremely useful for organizing information and that focusing the theoretical discussion on a specific adverse outcome pathway (AOP) is very helpful. In addition, it is important to include several perspectives during pathway development, including information specialists, pathologists, human health and environmental risk assessors, and chemical and product manufacturers, to ensure the biology is well captured and end use is considered.

  18. Brain-computer interface devices for patients with paralysis and amputation: a meeting report

    NASA Astrophysics Data System (ADS)

    Bowsher, K.; Civillico, E. F.; Coburn, J.; Collinger, J.; Contreras-Vidal, J. L.; Denison, T.; Donoghue, J.; French, J.; Getzoff, N.; Hochberg, L. R.; Hoffmann, M.; Judy, J.; Kleitman, N.; Knaack, G.; Krauthamer, V.; Ludwig, K.; Moynahan, M.; Pancrazio, J. J.; Peckham, P. H.; Pena, C.; Pinto, V.; Ryan, T.; Saha, D.; Scharen, H.; Shermer, S.; Skodacek, K.; Takmakov, P.; Tyler, D.; Vasudevan, S.; Wachrathit, K.; Weber, D.; Welle, C. G.; Ye, M.

    2016-04-01

    Objective. The Food and Drug Administration’s (FDA) Center for Devices and Radiological Health (CDRH) believes it is important to help stakeholders (e.g., manufacturers, health-care professionals, patients, patient advocates, academia, and other government agencies) navigate the regulatory landscape for medical devices. For innovative devices involving brain-computer interfaces, this is particularly important. Approach. Towards this goal, on 21 November, 2014, CDRH held an open public workshop on its White Oak, MD campus with the aim of fostering an open discussion on the scientific and clinical considerations associated with the development of brain-computer interface (BCI) devices, defined for the purposes of this workshop as neuroprostheses that interface with the central or peripheral nervous system to restore lost motor or sensory capabilities. Main results. This paper summarizes the presentations and discussions from that workshop. Significance. CDRH plans to use this information to develop regulatory considerations that will promote innovation while maintaining appropriate patient protections. FDA plans to build on advances in regulatory science and input provided in this workshop to develop guidance that provides recommendations for premarket submissions for BCI devices. These proceedings will be a resource for the BCI community during the development of medical devices for consumers.

  19. Simulations & Measurements of Airframe Noise: A BANC Workshops Perspective

    NASA Technical Reports Server (NTRS)

    Choudhari, Meelan; Lockard, David

    2016-01-01

    Airframe noise corresponds to the acoustic radiation due to turbulent flow in the vicinity of airframe components such as high-lift devices and landing gears. Since 2010, the American Institute of Aeronautics and Astronautics has organized an ongoing series of workshops devoted to Benchmark Problems for Airframe Noise Computations (BANC). The BANC workshops are aimed at enabling systematic progress in the understanding and high-fidelity prediction of airframe noise via collaborative investigations that integrate computational fluid dynamics, computational aeroacoustics, and in-depth measurements targeting a selected set of canonical yet realistic configurations that advance the current state of the art in multiple respects. Unique features of the BANC workshops include an intrinsically multi-disciplinary focus involving both fluid dynamics and aeroacoustics; a holistic rather than purely predictive emphasis; the concurrent, long-term evolution of experiments and simulations, with a powerful interplay between the two; and a strongly integrative nature by virtue of multi-team, multi-facility, multiple-entry measurements. This paper illustrates these features in the context of the BANC problem categories and outlines some of the challenges involved and how they were addressed. A brief summary of the BANC effort, including its technical objectives, strategy, and selected outcomes thus far, is also included.

  20. Brain-computer interface devices for patients with paralysis and amputation: a meeting report.

    PubMed

    Bowsher, K; Civillico, E F; Coburn, J; Collinger, J; Contreras-Vidal, J L; Denison, T; Donoghue, J; French, J; Getzoff, N; Hochberg, L R; Hoffmann, M; Judy, J; Kleitman, N; Knaack, G; Krauthamer, V; Ludwig, K; Moynahan, M; Pancrazio, J J; Peckham, P H; Pena, C; Pinto, V; Ryan, T; Saha, D; Scharen, H; Shermer, S; Skodacek, K; Takmakov, P; Tyler, D; Vasudevan, S; Wachrathit, K; Weber, D; Welle, C G; Ye, M

    2016-04-01

    The Food and Drug Administration's (FDA) Center for Devices and Radiological Health (CDRH) believes it is important to help stakeholders (e.g., manufacturers, health-care professionals, patients, patient advocates, academia, and other government agencies) navigate the regulatory landscape for medical devices. For innovative devices involving brain-computer interfaces, this is particularly important. Towards this goal, on 21 November, 2014, CDRH held an open public workshop on its White Oak, MD campus with the aim of fostering an open discussion on the scientific and clinical considerations associated with the development of brain-computer interface (BCI) devices, defined for the purposes of this workshop as neuroprostheses that interface with the central or peripheral nervous system to restore lost motor or sensory capabilities. This paper summarizes the presentations and discussions from that workshop. CDRH plans to use this information to develop regulatory considerations that will promote innovation while maintaining appropriate patient protections. FDA plans to build on advances in regulatory science and input provided in this workshop to develop guidance that provides recommendations for premarket submissions for BCI devices. These proceedings will be a resource for the BCI community during the development of medical devices for consumers.

  1. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Katti, Amogh; Di Fatta, Giuseppe; Naughton III, Thomas J

    Future extreme-scale high-performance computing systems will be required to work under frequent component failures. The MPI Forum's User Level Failure Mitigation proposal has introduced an operation, MPI_Comm_shrink, to synchronize the alive processes on the list of failed processes, so that applications can continue to execute even in the presence of failures by adopting algorithm-based fault tolerance techniques. This MPI_Comm_shrink operation requires a fault tolerant failure detection and consensus algorithm. This paper presents and compares two novel failure detection and consensus algorithms. The proposed algorithms are based on Gossip protocols and are inherently fault-tolerant and scalable. The proposed algorithms were implemented and tested using the Extreme-scale Simulator. The results show that in both algorithms the number of Gossip cycles to achieve global consensus scales logarithmically with system size. The second algorithm also shows better scalability in terms of memory and network bandwidth usage and a perfect synchronization in achieving global consensus.
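
    The paper's two algorithms are not reproduced in the abstract; the following generic push-gossip simulation only illustrates why the number of cycles needed for all alive processes to agree on the failed set grows roughly logarithmically with system size (all parameters are illustrative):

        import random

        def gossip_consensus(n_procs, failed, seed=0):
            """Cycles of push gossip until every alive process knows `failed`."""
            rng = random.Random(seed)
            alive = [p for p in range(n_procs) if p not in failed]
            view = {p: set() for p in alive}   # each process's failure list
            view[alive[0]] = set(failed)       # one process detects the failures
            cycles = 0
            while any(view[p] != set(failed) for p in alive):
                cycles += 1
                for p in alive:                # push to one random partner
                    view[rng.choice(alive)] |= view[p]
            return cycles

        for n in (64, 256, 1024, 4096):
            print(n, gossip_consensus(n, failed={1, 2, 3}))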

  2. Towards large scale multi-target tracking

    NASA Astrophysics Data System (ADS)

    Vo, Ba-Ngu; Vo, Ba-Tuong; Reuter, Stephan; Lam, Quang; Dietmayer, Klaus

    2014-06-01

    Multi-target tracking is intrinsically an NP-hard problem, and the complexity of multi-target tracking solutions usually does not scale gracefully with problem size. Multi-target tracking for on-line applications involving a large number of targets is extremely challenging. This article demonstrates the capability of the random finite set approach to provide large-scale multi-target tracking algorithms. In particular, it is shown that an approximate filter known as the labeled multi-Bernoulli filter can simultaneously track one thousand five hundred targets in clutter on a standard laptop computer.

  3. Proceedings of the Thirteenth Annual Software Engineering Workshop

    NASA Technical Reports Server (NTRS)

    1988-01-01

    Topics covered in the workshop included studies and experiments conducted in the Software Engineering Laboratory (SEL), a cooperative effort of NASA Goddard Space Flight Center, the University of Maryland, and Computer Sciences Corporation; software models; software products; and software tools.

  4. [Clinical and communication simulation workshop for fellows in gastroenterology: the trainees' perspective].

    PubMed

    Lang, Alon; Melzer, Ehud; Bar-Meir, Simon; Eliakim, Rami; Ziv, Amitai

    2006-11-01

    The continuing development of computer-based medical simulators provides an ideal platform for simulator-assisted training programs for medical trainees. Computer-based endoscopic simulators provide a virtual reality environment for training endoscopic procedures. This study illustrates the use of a comprehensive training model combining endoscopic simulators with simulated (actor) patients (SP), and aims to evaluate the effectiveness of a comprehensive simulation workshop from the trainee perspective. Four case studies were developed with emphasis on communication skills. Three workshops, with 10 fellows each, were conducted. During each workshop the trainees spent half of the time in SP case studies and the remaining half working with computerized endoscopic simulators under the continuous guidance of an expert endoscopist. Questionnaires were completed by the fellows at the end of the workshop. Seventy percent of the fellows felt that the endoscopic simulator was close or very close to reality for gastroscopy, and 63% for colonoscopy. Eighty-eight percent thought the close guidance was important for the learning process with the simulator. Eighty percent felt that the case studies were an important learning experience for risk management. Further evaluation of multi-modality simulation workshops in gastroenterology training is needed to identify how best to incorporate this form of instruction into the training of gastroenterologists.

  5. FUN3D Analyses in Support of the Second Aeroelastic Prediction Workshop

    NASA Technical Reports Server (NTRS)

    Chwalowski, Pawel; Heeg, Jennifer

    2016-01-01

    This paper presents the computational aeroelastic results generated in support of the second Aeroelastic Prediction Workshop for the Benchmark Supercritical Wing (BSCW) configurations and compares them to the experimental data. The computational results are obtained using FUN3D, an unstructured grid Reynolds-averaged Navier-Stokes solver developed at NASA Langley Research Center. The analysis results include aerodynamic coefficients and surface pressures obtained for steady-state, static aeroelastic equilibrium, and unsteady flow due to a pitching wing or flutter prediction. Frequency response functions of the pressure coefficients with respect to the angular displacement are computed and compared with the experimental data. The effects of spatial and temporal convergence on the computational results are examined.
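
    Such frequency response functions are commonly estimated from time histories with the H1 estimator, the ratio of the input-output cross-spectrum to the input auto-spectrum; a minimal sketch with SciPy follows (the signal names, filter, and sampling rate are hypothetical, not the paper's setup):

        import numpy as np
        from scipy.signal import csd, welch

        def frf(theta, cp, fs, nperseg=1024):
            """H1 estimate of the FRF from pitch angle theta(t) to a pressure
            coefficient cp(t): H(f) = S_theta,cp(f) / S_theta,theta(f)."""
            f, Sxy = csd(theta, cp, fs=fs, nperseg=nperseg)
            _, Sxx = welch(theta, fs=fs, nperseg=nperseg)
            return f, Sxy / Sxx

        fs = 1000.0
        t = np.arange(0, 20, 1 / fs)
        theta = np.sin(2 * np.pi * 5 * t) + 0.1 * np.random.randn(t.size)
        cp = np.convolve(theta, np.exp(-np.arange(50) / 10.0), mode="same")
        f, H = frf(theta, cp, fs)   # inspect |H| and phase versus frequency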

  6. PREFACE: Workshop on 'Buried' Interface Science with X-rays and Neutrons

    NASA Astrophysics Data System (ADS)

    Sakurai, Kenji

    2007-06-01

    The 2007 workshop on `buried' interface science with X-rays and neutrons was held at the Institute of Materials Research, Tohoku University, in Sendai, Japan, on July 22-24, 2007. The workshop was the latest in a series held since 2001: Tsukuba (December 2001), Niigata (September 2002), Nagoya (July 2003), Tsukuba (July 2004), Saitama (March 2005), Yokohama (July 2006), Kusatsu (August 2006) and Tokyo (December 2006). The 2007 workshop had 64 participants and 34 presentations.

    There are increasing demands for sophisticated metrology in order to observe multilayered materials with nano-structures (dots, wires, etc.), which are finding applications in electronic, magnetic, optical and other devices. Unlike many other surface-sensitive methods, X-ray and neutron analysis is known for its ability to see even `buried' functional interfaces as well as the surface. It is highly reliable in practice, because the information, which ranges from the atomic to the mesoscopic scale, is quantitative and reproducible. The non-destructive nature of this type of analysis ensures that the same specimen can be measured by other techniques. However, we now realize that the method should be upgraded further to cope with more realistic problems in nanoscience and nanotechnology. For the reflectivity technique and other related methods, which have been the main topics in our workshops over the past seven years, there are three important directions, as illustrated in the figure. Current X-ray methods can give atomic-scale information for quite a large area, on a scale of mm2-cm2. These methods deliver good statistics for an average, but sometimes we need to see a specific part at the nanoscale rather than an average structure. In addition, there is a need to see unstable, changing structures and related phenomena in order to understand more about the mechanisms by which nanomaterials function; quick measurements are therefore important. Furthermore, in order to apply the method to more realistic and complex systems, we need some visual understanding to discuss the relationship among the different structures present in the same view; 2D/3D real-space imaging is therefore important. Interpretation of roughness is another significant subject, while combination with grazing-incidence small angle scattering (GISAS) will become much more widespread than before. The use of coherent beams and several other new approaches would also be significant.

    The first day of the workshop was devoted to discussions on problems concerning the theory and software for data analysis using the X-ray and neutron reflectivity technique. Until now, an ordinary fitting procedure based on Parratt's formula has been widely employed for the analysis of reflectivity data. This appears to work well when a good model is given. However, for applications involving quite new and rather complicated materials, choosing a model is difficult; even arriving at an assumption about the number of layers is not easy. Sometimes good fits can be obtained, but this does not prove the validity of the model. Therefore, developing an analytical procedure that does not depend on the model is extremely important. We invited leading senior academics in this field as commentators: Professors J. Harada (Nagoya University and Rigaku Corporation), S. Kikuta (The University of Tokyo and JASRI) and J. Mizuki (JAEA). The three of them gave very encouraging and valuable talks during the workshop. All of us wish to thank them very much for their attendance and their useful advice on a wide variety of subjects. We also thank our invited speakers from Tohoku University in Sendai, the workshop site: Professors K. Takanashi, M. Kawasaki and M. Yanagihara. They talked about the hot topic of spintronics and the control of 'buried' magnetic interfaces. As discussed at the workshop, the use of techniques sensitive to specific interfaces is crucial in analyzing many unsolved problems in this field. Their talks were really stimulating, and encouraged us to make X-ray and neutron reflectivity techniques more widely applicable.

    The workshop was jointly organized by the Institute of Materials Research, Tohoku University and the X-ray and Neutron Analysis of Buried Interfaces Group, the Japan Society for Applied Physics. We are indebted to The Chemistry Society of Japan, The Japan Society for Analytical Chemistry, The Institute of Electrical Engineers of Japan, The Iron and Steel Institute of Japan, The Spectroscopical Society of Japan, The Society of Materials Science Japan, The Japanese XAFS Society, The Japan Society for Synchrotron Radiation Research and The Japan Society for Neutron Science, all of which cooperated on various occasions, for example by carrying advertisements in their journals.

    Kenji Sakurai, Chairman of the workshop (National Institute for Materials Science)

    The PDF also contains the workshop program, the list of participants and the conference photographs.

  7. Report for MaRIE Drivers Workshop on needs for energetic material's studies.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Specht, Paul Elliott

    Energetic materials (i.e. explosives, propellants, and pyrotechnics) have complex mesoscale features that influence their dynamic response. Direct measurement of the complex mechanical, thermal, and chemical response of energetic materials is critical for improving computational models and enabling predictive capabilities. Many of the physical phenomena of interest in energetic materials cover time and length scales spanning several orders of magnitude. Examples include chemical interactions in the reaction zone, the distribution and evolution of temperature fields, mesoscale deformation in heterogeneous systems, and phase transitions. This is particularly true for spontaneous phenomena, like thermal cook-off. The ability for MaRIE to capture multiple length scales and stochastic phenomena can significantly advance our understanding of energetic materials and yield more realistic, predictive models.

  8. Extreme-Scale Bayesian Inference for Uncertainty Quantification of Complex Simulations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Biros, George

    Uncertainty quantification (UQ), that is, quantifying uncertainties in complex mathematical models and their large-scale computational implementations, is widely viewed as one of the outstanding challenges facing the field of CS&E over the coming decade. The EUREKA project set out to address the most difficult class of UQ problems: those for which both the underlying PDE model and the uncertain parameters are of extreme scale. In the project we worked on these extreme-scale challenges in the following four areas: 1. Scalable parallel algorithms for sampling and characterizing the posterior distribution that exploit the structure of the underlying PDEs and parameter-to-observable map. These include structure-exploiting versions of the randomized maximum likelihood method, which aims to overcome the intractability of employing conventional MCMC methods for solving extreme-scale Bayesian inversion problems by appealing to and adapting ideas from large-scale PDE-constrained optimization, which have been very successful at exploring high-dimensional spaces. 2. Scalable parallel algorithms for construction of prior and likelihood functions based on learning methods and non-parametric density estimation. Constructing problem-specific priors remains a critical challenge in Bayesian inference, and more so in high dimensions. Another challenge is the construction of likelihood functions that capture unmodeled couplings between observations and parameters. We created parallel algorithms for non-parametric density estimation using high-dimensional N-body methods and combined them with supervised learning techniques for the construction of priors and likelihood functions. 3. Bayesian inadequacy models, which augment physics models with stochastic models that represent their imperfections. The success of the Bayesian inference framework depends on the ability to represent the uncertainty due to imperfections of the mathematical model of the phenomena of interest. This is a central challenge in UQ, especially for large-scale models; we set out to develop the mathematical tools to address it in the context of extreme-scale problems. 4. Parallel scalable algorithms for Bayesian optimal experimental design (OED). Bayesian inversion yields quantified uncertainties in the model parameters, which can be propagated forward through the model to yield uncertainty in outputs of interest. This opens the way for designing new experiments to reduce the uncertainties in the model parameters and model predictions. Such experimental design problems have been intractable for large-scale problems using conventional methods; we created OED algorithms that exploit the structure of the PDE model and the parameter-to-output map to overcome these challenges. Parallel algorithms for these four problems were created, analyzed, prototyped, implemented, tuned, and scaled up for leading-edge supercomputers, including UT-Austin's own 10-petaflops Stampede system, ANL's Mira system, and ORNL's Titan system. While our focus was on fundamental mathematical/computational methods and algorithms, we assessed our methods on model problems derived from several DOE mission applications, including multiscale mechanics and ice sheet dynamics.
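
    As a concrete instance of item 1, here is a minimal sketch of the randomized maximum likelihood idea for a linear-Gaussian inverse problem: each sample solves a least-squares problem with perturbed data and prior, which for a linear map draws exact posterior samples (the project's nonlinear, PDE-constrained setting replaces the direct solve below with large-scale optimization):

        import numpy as np

        def rml_samples(F, d, noise_cov, prior_mean, prior_cov, n_samples, seed=0):
            """Each sample minimizes
            ||F m - (d + eps)||^2_noise + ||m - (m0 + delta)||^2_prior
            with eps ~ N(0, noise_cov) and delta ~ N(0, prior_cov)."""
            rng = np.random.default_rng(seed)
            Sinv, Ginv = np.linalg.inv(noise_cov), np.linalg.inv(prior_cov)
            H = F.T @ Sinv @ F + Ginv            # posterior precision (Hessian)
            out = []
            for _ in range(n_samples):
                eps = rng.multivariate_normal(np.zeros(len(d)), noise_cov)
                delta = rng.multivariate_normal(np.zeros(len(prior_mean)), prior_cov)
                rhs = F.T @ Sinv @ (d + eps) + Ginv @ (prior_mean + delta)
                out.append(np.linalg.solve(H, rhs))  # optimality condition H m = rhs
            return np.array(out)

        F = np.eye(5) + 0.5 * np.eye(5, k=1)      # toy linear forward map
        d = F @ np.ones(5)                        # noiseless synthetic data
        S = rml_samples(F, d, 0.01 * np.eye(5), np.zeros(5), np.eye(5), 200)
        print(S.mean(axis=0))                     # close to the true parameters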

  9. An evaluation of the impact of social interaction skills training for facially disfigured people.

    PubMed

    Robinson, E; Rumsey, N; Partridge, J

    1996-07-01

    Facially disfigured people can experience significant psychological problems, commonly relating to difficulties in social interaction. The effect of social interaction skills workshops on the psychological well-being of 64 facially disfigured participants is described. Participants completed the Hospital Anxiety and Depression Scale (HAD), the Social Avoidance and Distress Scale (SAD) and an open-ended questionnaire before a workshop and at 6 weeks and 6 months follow-up. The high levels of anxiety evident prior to the workshop fell significantly 6 weeks post-workshop (HSD = 1.297, P < 0.01) and remained significantly lower at the 6-month follow-up (HSD = 1.563, P < 0.01). Similarly, SAD scores fell significantly at 6 weeks (HSD = 1.89, P < 0.05) and again at the 6-month follow-up (HSD = 2.26, P < 0.01). Six weeks post-workshop, participants reported feeling more confident in the company of strangers (HSD = -1.266, P < 0.01) and about meeting new people (HSD = -1.159, P < 0.01). This increase in confidence was maintained at 6 months (HSD = -1.068 and -1.042 respectively, P < 0.01 for both). Sixty-one percent of those who experienced problems before the workshop reported a positive change in these situations. The potential of these workshops as an addition to surgical intervention is discussed.

  10. NASA Workshop on Computational Structural Mechanics 1987, part 1

    NASA Technical Reports Server (NTRS)

    Sykes, Nancy P. (Editor)

    1989-01-01

    Topics in Computational Structural Mechanics (CSM) are reviewed. CSM parallel structural methods, a transputer finite element solver, architectures for multiprocessor computers, and parallel eigenvalue extraction are among the topics discussed.

  11. NETTAB 2012 on "Integrated Bio-Search"

    PubMed Central

    2014-01-01

    The NETTAB 2012 workshop, held in Como on November 14-16, 2012, was devoted to "Integrated Bio-Search", that is, to technologies, methods, architectures, systems and applications for searching, retrieving, integrating and analyzing data, information, and knowledge, with the aim of answering complex bio-medical-molecular questions: some of the most challenging issues in bioinformatics today. It brought together about 80 researchers working in the fields of bioinformatics, computational biology, biology, computer science and engineering. More than 50 scientific contributions, including keynote and tutorial talks, oral communications, posters and software demonstrations, were presented at the workshop. This preface provides a brief overview of the workshop and introduces the peer-reviewed manuscripts that were accepted for publication in this Supplement. PMID:24564635

  12. Surface Modeling, Grid Generation, and Related Issues in Computational Fluid Dynamic (CFD) Solutions

    NASA Technical Reports Server (NTRS)

    Choo, Yung K. (Compiler)

    1995-01-01

    The NASA Steering Committee for Surface Modeling and Grid Generation (SMAGG) sponsored a workshop on surface modeling, grid generation, and related issues in Computational Fluid Dynamics (CFD) solutions at Lewis Research Center, Cleveland, Ohio, May 9-11, 1995. The workshop provided a forum to identify industry needs, strengths, and weaknesses of the five grid technologies (patched structured, overset structured, Cartesian, unstructured, and hybrid), and to exchange thoughts about where each technology will be in 2 to 5 years. The workshop also provided opportunities for engineers and scientists to present new methods, approaches, and applications in SMAGG for CFD. This Conference Publication (CP) consists of papers on industry overview, NASA overview, five grid technologies, new methods/approaches/applications, and software systems.

  13. Earth Exploration Toolbook Workshops: Helping Teachers and Students Analyze Web-based Scientific Data

    NASA Astrophysics Data System (ADS)

    McAuliffe, C.; Ledley, T.; Dahlman, L.; Haddad, N.

    2007-12-01

    One of the challenges faced by Earth science teachers, particularly in K-12 settings, is that of connecting scientific research to classroom experiences. Helping teachers and students analyze Web-based scientific data is one way to bring scientific research to the classroom. The Earth Exploration Toolbook (EET) was developed as an online resource to accomplish precisely that. The EET consists of chapters containing step-by-step instructions for accessing Web-based scientific data and for using a software analysis tool to explore issues or concepts in science, technology, and mathematics. For example, in one EET chapter, users download Earthquake data from the USGS and bring it into a geographic information system (GIS), analyzing factors affecting the distribution of earthquakes. The goal of the EET Workshops project is to provide professional development that enables teachers to incorporate Web-based scientific data and analysis tools in ways that meet their curricular needs. In the EET Workshops project, Earth science teachers participate in a pair of workshops that are conducted in a combined teleconference and Web-conference format. In the first workshop, the EET Data Analysis Workshop, participants are introduced to the National Science Digital Library (NSDL) and the Digital Library for Earth System Education (DLESE). They also walk through an Earth Exploration Toolbook (EET) chapter and discuss ways to use Earth science datasets and tools with their students. In a follow-up second workshop, the EET Implementation Workshop, teachers share how they used these materials in the classroom by describing the projects and activities that they carried out with students. The EET Workshops project offers unique and effective professional development. Participants work at their own Internet-connected computers, and dial into a toll-free group teleconference for step-by-step facilitation and interaction. They also receive support via Elluminate, a Web-conferencing software program. The software allows participants to see the facilitator's computer as the analysis techniques of an EET chapter are demonstrated. If needed, the facilitator can also view individual participant's computers, assisting with technical difficulties. In addition, it enables a large number of end users, often widely distributed, to engage in interactive, real-time instruction. In this presentation, we will describe the elements of an EET Workshop pair, highlighting the capabilities and use of Elluminate. We will share lessons learned through several years of conducting this type of professional development. We will also share findings from survey data gathered from teachers who have participated in our workshops.

  14. Great IDEAS: Telescopes, Computers, and Education

    NASA Astrophysics Data System (ADS)

    Nook, M. A.; Williams, D. L.

    1999-05-01

    Two workshops were developed for K-12 teachers that take advantage of the excitement students experience when viewing objects through a telescope for the first time and of the growth in educational opportunities that the internet has generated. The observational astronomy workshop focused on teaching educators a few basics about naked-eye, binocular, and small-telescope observing, while the computers in astronomy education workshop taught teachers to develop simple web sites and permitted them to test astronomy software. The observational astronomy workshop met for three days on the SCSU campus to teach basic celestial motions, several constellations, and the basic operation of small telescopes. The next four nights were spent at Camden State Park in southwest Minnesota learning to locate deep-sky objects and preparing public presentations. On the final two nights the teachers presented public observing programs at three state parks. Fifty percent of the teachers implemented night observing into their curriculum this past year, and one teacher purchased her own telescope to use with students and to help other teachers in the district. The computers in astronomy workshop introduced the teachers to several commercially available astronomy software packages and taught them the fundamentals of constructing simple web pages. The participants were required to develop astronomy lessons based on one of the software packages or a web site that they developed. Each participant then constructed a web-based lesson plan, student lesson, and teacher's guide for their lesson. These lessons are available at http://enstein.stcloudstate.edu/nook/IDEAS/computers/. Support for this work was provided by NASA through grant numbers ED-90156.01-97A and ED-90157.01-97A from the Space Telescope Science Institute, which is operated by the Association of Universities for Research in Astronomy, Incorporated, under NASA contract NAS5-26555.

  15. Biomedical Polar Research Workshop Minutes

    NASA Technical Reports Server (NTRS)

    1990-01-01

    This workshop was conducted to provide a background of NASA and National Science Foundation goals, an overview of previous and current biomedical research, and a discussion about areas of potential future joint activities. The objectives of the joint research were: (1) to develop an understanding of the physiological, psychological, and behavioral alterations and adaptations to extreme environments of the polar regions; (2) to ensure the health, well-being, and performance of humans in these environments; and (3) to promote the application of biomedical research to improve the quality of life in all environments.

  16. 76 FR 64330 - Advanced Scientific Computing Advisory Committee

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-10-18

    ... talks on HPC Reliability, Diffusion on Complex Networks, and Reversible Software Execution Systems Report from Applied Math Workshop on Mathematics for the Analysis, Simulation, and Optimization of Complex Systems Report from ASCR-BES Workshop on Data Challenges from Next Generation Facilities Public...

  17. The Astronomy Workshop

    NASA Astrophysics Data System (ADS)

    Hamilton, D. P.; Asbury, M. L.

    1999-12-01

    The Astronomy Workshop (http://janus.astro.umd.edu) is an interactive online astronomy resource developed and maintained at the University of Maryland for use by students, educators and the general public. The Astronomy Workshop has been extensively tested and used successfully at many different levels, including High School and Junior High School science classes, University introductory astronomy courses, and University intermediate and advanced astronomy courses. Some topics currently covered in the Astronomy Workshop are: Animated Orbits of Planets and Moons: The orbits of the nine planets and 63 known planetary satellites are shown in animated, to-scale drawings. The orbiting bodies move at their correct relative speeds about their parent, which is rendered as an attractive, to-scale gif image. Solar System Collisions: This most popular of our applications shows what happens when an asteroid or comet with user-defined size and speed impacts a given planet. The program calculates many effects, including the country impacted (if Earth is the target), energy of explosion, crater size, and magnitude of the ``planetquake'' generated. It also displays a relevant image (e.g. terrestrial crater, lunar crater, etc.). Scale of the Universe: Travel away from the Earth at a chosen speed and see how long it takes to reach other planets, stars and galaxies. This tool helps students visualize astronomical distances in an intuitive way. Scientific Notation: Students are interactively guided through conversions between scientific notation and regular numbers. Orbital Simulations: These tools allow the student to investigate different aspects of the three-body problem of celestial mechanics. Astronomy Workshop Bulletin Board: Get innovative teaching ideas and read about in-class experiences with the Astronomy Workshop. Share your ideas with other educators by posting on the Bulletin Board. Funding for the Astronomy Workshop is provided by NSF.
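
    For the impact calculator described above, the first quantity needed is the impactor's kinetic energy; a back-of-the-envelope sketch (the sizes and densities are illustrative, not the tool's internals):

        import math

        MT_TNT = 4.184e15  # joules per megaton of TNT

        def impact_energy(diameter_m, density_kg_m3, speed_m_s):
            """Kinetic energy of a spherical impactor, in joules and megatons."""
            mass = density_kg_m3 * (4 / 3) * math.pi * (diameter_m / 2) ** 3
            e = 0.5 * mass * speed_m_s ** 2
            return e, e / MT_TNT

        # e.g. a 100 m stony asteroid (3000 kg/m^3) arriving at 20 km/s
        print(impact_energy(100.0, 3000.0, 20000.0))  # roughly 75 Mt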

  18. The Astronomy Workshop

    NASA Astrophysics Data System (ADS)

    Hamilton, D. P.; Asbury, M. L.

    2000-05-01

    The Astronomy Workshop (http://janus.astro.umd.edu) is an interactive online astronomy resource developed and maintained at the University of Maryland for use by students, educators and the general public. The Astronomy Workshop has been extensively tested and used successfully at many different levels, including High School and Junior High School science classes, University introductory astronomy courses, and University intermediate and advanced astronomy courses. Some topics currently covered in the Astronomy Workshop are: ANIMATED ORBITS OF PLANETS AND MOONS: The orbits of the nine planets and 63 known planetary satellites are shown in animated, to-scale drawings. The orbiting bodies move at their correct relative speeds about their parent, which is rendered as an attractive, to-scale gif image. SOLAR SYSTEM COLLISIONS: This most popular of our applications shows what happens when an asteroid or comet with user-defined size and speed impacts a given planet. The program calculates many effects, including the country impacted (if Earth is the target), energy of explosion, crater size, and magnitude of the ``planetquake'' generated. It also displays a relevant image (e.g. terrestrial crater, lunar crater, etc.). SCALE OF THE UNIVERSE: Travel away from the Earth at a chosen speed and see how long it takes to reach other planets, stars and galaxies. This tool helps students visualize astronomical distances in an intuitive way. SCIENTIFIC NOTATION: Students are interactively guided through conversions between scientific notation and regular numbers. ORBITAL SIMULATIONS: These tools allow the student to investigate different aspects of the three-body problem of celestial mechanics. ASTRONOMY WORKSHOP BULLETIN BOARD: Get innovative teaching ideas and read about in-class experiences with the Astronomy Workshop. Share your ideas with other educators by posting on the Bulletin Board. Funding for the Astronomy Workshop is provided by NSF.

  19. The Astronomy Workshop

    NASA Astrophysics Data System (ADS)

    Hamilton, D. P.; Asbury, M. L.

    1999-09-01

    The Astronomy Workshop (http://janus.astro.umd.edu) is an interactive online astronomy resource developed and maintained at the University of Maryland for use by students, educators and the general public. The Astronomy Workshop has been extensively tested and used successfully at many different levels, including High School and Junior High School science classes, University introductory astronomy courses, and University intermediate and advanced astronomy courses. Some topics currently covered in the Astronomy Workshop are: Animated Orbits of Planets and Moons: The orbits of the nine planets and 63 known planetary satellites are shown in animated, to-scale drawings. The orbiting bodies move at their correct relative speeds about their parent, which is rendered as an attractive, to-scale gif image. Solar System Collisions: This most popular of our applications shows what happens when an asteroid or comet with user-defined size and speed impacts a given planet. The program calculates many effects, including the country impacted (if Earth is the target), energy of explosion, crater size, and magnitude of the ``planetquake'' generated. It also displays a relevant image (e.g. terrestrial crater, lunar crater, etc.). Scale of the Universe: Travel away from the Earth at a chosen speed and see how long it takes to reach other planets, stars and galaxies. This tool helps students visualize astronomical distances in an intuitive way. Scientific Notation: Students are interactively guided through conversions between scientific notation and regular numbers. Orbital Simulations: These tools allow the student to investigate different aspects of the three-body problem of celestial mechanics. Astronomy Workshop Bulletin Board: Get innovative teaching ideas and read about in-class experiences with the Astronomy Workshop. Share your ideas with other educators by posting on the Bulletin Board. Funding for the Astronomy Workshop is provided by NSF.

  20. Rapid, high-resolution measurement of leaf area and leaf orientation using terrestrial LiDAR scanning data

    USDA-ARS?s Scientific Manuscript database

    The rapid evolution of high performance computing technology has allowed for the development of extremely detailed models of the urban and natural environment. Although models can now represent sub-meter-scale variability in environmental geometry, model users are often unable to specify the geometr...

  1. FOREWORD: 4th International Workshop on New Computational Methods for Inverse Problems (NCMIP2014)

    NASA Astrophysics Data System (ADS)

    2014-10-01

    This volume of Journal of Physics: Conference Series is dedicated to the scientific contributions presented during the 4th International Workshop on New Computational Methods for Inverse Problems, NCMIP 2014 (http://www.farman.ens-cachan.fr/NCMIP_2014.html). This workshop took place at Ecole Normale Supérieure de Cachan on May 23, 2014. The prior editions of NCMIP also took place in Cachan, France, first within the scope of the ValueTools Conference in May 2011 (http://www.ncmip.org/2011/), and then at the initiative of Institut Farman in May 2012 and May 2013 (http://www.farman.ens-cachan.fr/NCMIP_2012.html), (http://www.farman.ens-cachan.fr/NCMIP_2013.html). The New Computational Methods for Inverse Problems (NCMIP) workshop focuses on recent advances in the resolution of inverse problems. Inverse problems appear in numerous scientific areas such as geophysics, biological and medical imaging, material and structure characterization, electrical, mechanical and civil engineering, and finance. The resolution of an inverse problem consists of estimating the parameters of the observed system or structure from data collected by an instrumental sensing or imaging device. Its success firstly requires the collection of relevant observation data. It also requires accurate models describing the physical interactions between the instrumental device and the observed system, as well as the intrinsic properties of the solution itself. Finally, it requires the design of robust, accurate and efficient inversion algorithms. Advanced sensor arrays and imaging devices provide high-rate and high-volume data; in this context, the efficient resolution of the inverse problem requires the joint development of new models and inversion methods, taking computational and implementation aspects into account. During this one-day workshop, researchers had the opportunity to bring to light and share new techniques and results in the field of inverse problems. The topics of the workshop were: algorithms and computational aspects of inversion, Bayesian estimation, kernel methods, learning methods, convex optimization, free discontinuity problems, metamodels, proper orthogonal decomposition, reduced models for inversion, non-linear inverse scattering, image reconstruction and restoration, and applications (bio-medical imaging, non-destructive evaluation...). NCMIP 2014 attracted around sixty attendees. Each submitted paper was reviewed by two reviewers, and nine papers were accepted. In addition, three international speakers were invited to present longer talks. The workshop was supported by Institut Farman (ENS Cachan, CNRS) and endorsed by the following French research networks: GDR ISIS, GDR MIA, GDR MOA, GDR Ondes. The program committee acknowledges the following research laboratories: CMLA, LMT, LURPA, SATIE. Eric Vourc'h and Thomas Rodet

  2. New Frontiers in Analyzing Dynamic Group Interactions: Bridging Social and Computer Science

    PubMed Central

    Lehmann-Willenbrock, Nale; Hung, Hayley; Keyton, Joann

    2017-01-01

    This special issue on advancing interdisciplinary collaboration between computer scientists and social scientists documents the joint results of the international Lorentz workshop, “Interdisciplinary Insights into Group and Team Dynamics,” which took place in Leiden, The Netherlands, July 2016. An equal number of scholars from social and computer science participated in the workshop and contributed to the papers included in this special issue. In this introduction, we first identify interaction dynamics as the core of group and team models and review how scholars in social and computer science have typically approached behavioral interactions in groups and teams. Next, we identify key challenges for interdisciplinary collaboration between social and computer scientists, and we provide an overview of the different articles in this special issue aimed at addressing these challenges. PMID:29249891

  3. New Frontiers in Analyzing Dynamic Group Interactions: Bridging Social and Computer Science.

    PubMed

    Lehmann-Willenbrock, Nale; Hung, Hayley; Keyton, Joann

    2017-10-01

    This special issue on advancing interdisciplinary collaboration between computer scientists and social scientists documents the joint results of the international Lorentz workshop, "Interdisciplinary Insights into Group and Team Dynamics," which took place in Leiden, The Netherlands, July 2016. An equal number of scholars from social and computer science participated in the workshop and contributed to the papers included in this special issue. In this introduction, we first identify interaction dynamics as the core of group and team models and review how scholars in social and computer science have typically approached behavioral interactions in groups and teams. Next, we identify key challenges for interdisciplinary collaboration between social and computer scientists, and we provide an overview of the different articles in this special issue aimed at addressing these challenges.

  4. Computer Numerical Control: Instructional Manual. The North Dakota High Technology Mobile Laboratory Project.

    ERIC Educational Resources Information Center

    Sinn, John W.

    This instructional manual contains five learning activity packets for use in a workshop on computer numerical control for computer-aided manufacturing. The lessons cover the following topics: introduction to computer-aided manufacturing, understanding the lathe, using the computer, computer numerically controlled part programming, and executing a…

  5. Data Base Directions: Information Resource Management - Strategies and Tools. Proceedings of the Workshop of the National Bureau of Standards and the Association for Computing Machinery (Ft. Lauderdale, Florida, October 20-22, 1980).

    ERIC Educational Resources Information Center

    Goldfine, Alan H., Ed.

    This workshop investigated how managers can evaluate, select, and effectively use information resource management (IRM) tools, especially data dictionary systems (DDS). An executive summary, which provides a definition of IRM as developed by workshop participants, precedes the keynote address, "Data: The Raw Material of a Paper Factory,"…

  6. GROMACS 4:  Algorithms for Highly Efficient, Load-Balanced, and Scalable Molecular Simulation.

    PubMed

    Hess, Berk; Kutzner, Carsten; van der Spoel, David; Lindahl, Erik

    2008-03-01

    Molecular simulation is an extremely useful, but computationally very expensive, tool for studies of chemical and biomolecular systems. Here, we present a new implementation of our molecular simulation toolkit GROMACS which now achieves extremely high performance on single processors, through algorithmic optimizations and hand-coded routines, while simultaneously scaling very well on parallel machines. The code encompasses a minimal-communication domain decomposition algorithm, full dynamic load balancing, a state-of-the-art parallel constraint solver, and efficient virtual site algorithms that allow removal of hydrogen atom degrees of freedom to enable integration time steps up to 5 fs for atomistic simulations, even in parallel. To improve the scaling properties of the common particle mesh Ewald electrostatics algorithms, we have in addition used a Multiple-Program, Multiple-Data approach, with separate node domains responsible for direct and reciprocal space interactions. Not only does this combination of algorithms enable extremely long simulations of large systems, but it also delivers high simulation performance on quite modest numbers of standard cluster nodes.
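
    The Multiple-Program, Multiple-Data split described above can be illustrated with a minimal sketch (not GROMACS code; the rank partition and the one-quarter PME fraction are assumptions for illustration only) that divides MPI ranks into direct-space and reciprocal-space groups using mpi4py:

```python
from mpi4py import MPI

# Minimal MPMD sketch: dedicate a subset of MPI ranks to reciprocal-space
# (PME) work while the remaining ranks handle direct-space interactions.
comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()

n_pme = max(1, size // 4)        # assumption: roughly 1/4 of ranks do PME
is_pme = rank >= size - n_pme    # the last n_pme ranks form the PME group

# split the world communicator into two independent sub-communicators
subcomm = comm.Split(color=1 if is_pme else 0, key=rank)

if is_pme:
    pass  # ...solve the reciprocal-space (PME) part collectively on subcomm...
else:
    pass  # ...compute direct-space interactions and integrate on subcomm...
```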

  7. Finding your inner modeler: An NSF-sponsored workshop to introduce cell biologists to modeling/computational approaches.

    PubMed

    Stone, David E; Haswell, Elizabeth S; Sztul, Elizabeth

    2017-01-01

    In classical Cell Biology, fundamental cellular processes are revealed empirically, one experiment at a time. While this approach has been enormously fruitful, our understanding of cells is far from complete. In fact, the more we know, the more keenly we perceive our ignorance of the profoundly complex and dynamic molecular systems that underlie cell structure and function. Thus, it has become apparent to many cell biologists that experimentation alone is unlikely to yield major new paradigms, and that empiricism must be combined with theory and computational approaches to yield major new discoveries. To facilitate those discoveries, three workshops will convene annually for one day in three successive summers (2017-2019) to promote the use of computational modeling by cell biologists currently unconvinced of its utility or unsure how to apply it. The first of these workshops was held at the University of Illinois, Chicago in July 2017. Organized to facilitate interactions between traditional cell biologists and computational modelers, it provided a unique educational opportunity: a primer on how cell biologists with little or no relevant experience can incorporate computational modeling into their research. Here, we report on the workshop and describe how it addressed key issues that cell biologists face when considering modeling including: (1) Is my project appropriate for modeling? (2) What kind of data do I need to model my process? (3) How do I find a modeler to help me in integrating modeling approaches into my work? And, perhaps most importantly, (4) why should I bother?

  8. Achieving benefit for patients in primary care informatics: the report of an international consensus workshop at Medinfo 2007.

    PubMed

    de Lusignan, Simon; Teasdale, Sheila

    2007-01-01

    Landmark reports suggest that sharing health data between clinical computer systems should improve patient safety and the quality of care. Enhancing the use of informatics in primary care is usually a key part of these strategies. The objective was to synthesise the learning from the international use of informatics in primary care. The workshop was attended by 21 delegates drawn from all continents. There were presentations from the USA, the UK, and the Netherlands, and informal updates from Australia, Argentina, Sweden, and the Nordic countries. These presentations were discussed in a workshop setting to identify common issues. Key principles were synthesised through a post-workshop analysis and then sorted into themes. Themes emerged about the deployment of informatics that can be applied at the health service, practice, and individual clinical consultation levels: (1) at the health service or provider level, success appeared proportional to the extent of collaboration between a broad range of stakeholders and the identification of leaders; (2) within the practice, much is currently being achieved with legacy computer systems and apparently outdated coding systems, including prescribing safety alerts, clinical audit, and promoting computer data recording and quality; (3) in the consultation, the computer is a 'big player' and may make traditional models of the consultation redundant. We should make more efforts to share learning; develop clear, internationally acceptable definitions; highlight gaps between pockets of excellence and real-world practice; and, most importantly, suggest how they might be bridged. Knowledge synthesis from different health systems may provide a greater understanding of how the third actor (the computer) is best used in primary care.

  9. European Workshop on Industrial Computer Systems approach to design for safety

    NASA Technical Reports Server (NTRS)

    Zalewski, Janusz

    1992-01-01

    This paper presents guidelines on designing systems for safety, developed by the Technical Committee 7 on Reliability and Safety of the European Workshop on Industrial Computer Systems. The focus is on complementing the traditional development process by adding the following four steps: (1) overall safety analysis; (2) analysis of the functional specifications; (3) designing for safety; (4) validation of design. Quantitative assessment of safety is possible by means of a modular questionnaire covering various aspects of the major stages of system development.

  10. Evaluating the fidelity of CMIP5 models in producing large-scale meteorological patterns over the Northwestern United States

    NASA Astrophysics Data System (ADS)

    Lintner, B. R.; Loikith, P. C.; Pike, M.; Aragon, C.

    2017-12-01

    Climate change information is increasingly required at impact-relevant scales. However, most state-of-the-art climate models are not of sufficiently high spatial resolution to resolve features explicitly at such scales. This challenge is particularly acute in regions of complex topography, such as the Pacific Northwest of the United States. To address this scale mismatch problem, we consider large-scale meteorological patterns (LSMPs), which can be resolved by climate models and related to the occurrence of local-scale climate and climate extremes. In prior work, using self-organizing maps (SOMs), we computed LSMPs over the northwestern United States (NWUS) from daily reanalysis circulation fields and further related these to the occurrence of observed extreme temperatures and precipitation: SOMs were used to group LSMPs into 12 nodes, or clusters, spanning the continuum of synoptic variability over the region. Here this observational foundation is utilized as an evaluation target for a suite of global climate models from the Fifth Phase of the Coupled Model Intercomparison Project (CMIP5). Evaluation is performed in two primary ways. First, daily model circulation fields are assigned to one of the 12 reanalysis nodes based on minimization of the mean square error. From this, a bulk model skill score is computed measuring the similarity between the model and reanalysis nodes. Next, SOMs are applied directly to the model output and compared to the nodes obtained from reanalysis. Results reveal that many of the models have LSMPs analogous to the reanalysis, suggesting that the models reasonably capture observed daily synoptic states.
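
    The node-assignment step described above admits a compact sketch (array names, shapes, and the skill-score normalization are assumptions for illustration; this is not the authors' code): each model day is assigned to the reanalysis SOM node that minimizes the mean square error, and a bulk score summarizes the fit.

```python
import numpy as np

def assign_to_nodes(model_days, nodes):
    """model_days: (ndays, npoints); nodes: (nnodes, npoints)."""
    # mean square distance of every model day to every reanalysis node
    d2 = ((model_days[:, None, :] - nodes[None, :, :]) ** 2).mean(axis=2)
    best = d2.argmin(axis=1)                  # winning node index per day
    return best, d2[np.arange(len(best)), best]

# toy data standing in for reanalysis nodes and daily model fields
rng = np.random.default_rng(0)
nodes = rng.normal(size=(12, 500))            # 12 SOM nodes, 500 grid points
model = nodes[rng.integers(0, 12, 1000)] \
        + rng.normal(scale=0.3, size=(1000, 500))
best, mse = assign_to_nodes(model, nodes)
skill = 1.0 - mse.mean() / model.var()        # assumed normalization
print(f"node frequencies: {np.bincount(best, minlength=12)}")
print(f"bulk skill score: {skill:.3f}")
```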

  11. Proceedings of the 2004 NASA/ONR Circulation Control Workshop, Part 2

    NASA Technical Reports Server (NTRS)

    Jones, Gregory S. (Editor); Joslin, Ronald D. (Editor)

    2005-01-01

    This conference proceeding comprises papers that were presented at the NASA/ONR Circulation Control Workshop held 16-17 March 2004 at the Radisson-Hampton in Hampton, VA. Over two full days, 30 papers and 4 posters were presented, with 110 scientists and engineers in attendance representing 3 countries. As technological advances influence the efficiency and effectiveness of aerodynamic and hydrodynamic applications, designs, and operations, this workshop was intended to address the technologies, systems, challenges, and successes specific to Coanda-driven circulation control in aerodynamics and hydrodynamics. A major goal of this workshop was to determine the state of the art in circulation control and to assess the future directions and applications for circulation control. The 2004 workshop addressed applications, experiments, computations, and theories related to circulation control, emphasizing fundamental physics, systems analysis, and applied research. The workshop consisted of single-session oral presentations, posters, and written papers that are documented in this unclassified conference proceeding. The format of this written proceeding follows the agenda of the workshop. Each paper is followed by the presentation given at the workshop. The editors compiled brief summaries for each effort; these appear at the end of this proceeding. The summaries include the paper, the oral presentation, and questions or comments that occurred during the workshop. The 2004 Circulation Control Workshop focused on applications including naval vehicles (surface and underwater vehicles); fixed-wing aviation (general aviation, commercial, cargo, and business aircraft); V/STOL platforms (helicopters, military aircraft, tilt rotors); propulsion systems (propellers, jet engines, gas turbines); ground vehicles (automotive, trucks, and other); wind turbines; and other nontraditional applications (e.g., vacuum cleaners, ceiling fans). As part of the CFD focus area of the 2004 CC Workshop, CFD practitioners were invited to compute a two-dimensional benchmark problem for which geometry, flow conditions, grids, and experimental data were available before the workshop. The purpose was to accumulate a database of simulations for a single problem using a range of CFD codes, turbulence models, and grid strategies so as to expand knowledge of model performance and requirements and to guide simulation of practical CC configurations.

  12. Advanced Group Support Systems and Facilities

    NASA Technical Reports Server (NTRS)

    Noor, Ahmed K. (Compiler); Malone, John B. (Compiler)

    1999-01-01

    The document contains the proceedings of the Workshop on Advanced Group Support Systems and Facilities held at NASA Langley Research Center, Hampton, Virginia, July 19-20, 1999. The workshop was jointly sponsored by the University of Virginia Center for Advanced Computational Technology and NASA. Workshop attendees came from NASA, other government agencies, industry, and universities. The objectives of the workshop were to assess the status of advanced group support systems and to identify the potential of these systems for use in future collaborative distributed design and synthesis environments. The presentations covered the current status and effectiveness of different group support systems.

  13. Data Comparisons and Summary of the Second Aeroelastic Prediction Workshop

    NASA Technical Reports Server (NTRS)

    Heeg, Jennifer; Wieseman, Carol D.; Chwalowski, Pawel

    2016-01-01

    This paper presents the computational results generated by participating teams of the second Aeroelastic Prediction Workshop and compares them with experimental data. Aeroelastic and rigid configurations of the Benchmark Supercritical Wing (BSCW) wind tunnel model served as the focus for the workshop. The comparison data sets include unforced ("steady") system responses, forced pitch oscillations, and coupled fluid-structure responses. Integrated coefficients, frequency response functions, and flutter onset conditions are compared. The flow conditions studied were in the transonic range, including both attached and separated flow conditions. Some of the technical discussions that took place at the workshop are summarized.
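
    As one example of the comparison quantities mentioned above, a frequency response function can be estimated from time series via the standard H1 estimator, FRF(f) = S_xy(f)/S_xx(f). The sketch below is illustrative only, using a toy input/response pair rather than workshop data, with Welch spectral estimates from SciPy:

```python
import numpy as np
from scipy import signal

fs = 1000.0                                   # sample rate, Hz
t = np.arange(0, 20, 1 / fs)
rng = np.random.default_rng(0)
x = np.sin(2*np.pi*10*t) + 0.1*rng.normal(size=t.size)        # "pitch" input
y = np.convolve(x, np.exp(-np.arange(100)/20.0), mode="same") # toy response

f, Sxx = signal.welch(x, fs=fs, nperseg=1024)   # input auto-spectrum
_, Sxy = signal.csd(x, y, fs=fs, nperseg=1024)  # input-output cross-spectrum
frf = Sxy / Sxx                                  # H1 estimate of the FRF

idx = np.argmin(np.abs(f - 10.0))
print(f"|FRF| at ~10 Hz: {np.abs(frf[idx]):.2f}")
```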

  14. An Opening Chapter of the First Generation of Artificial Intelligence in Medicine: The First Rutgers AIM Workshop, June 1975

    PubMed Central

    2015-01-01

    Summary The first generation of Artificial Intelligence (AI) in Medicine methods were developed in the early 1970’s drawing on insights about problem solving in AI. They developed new ways of representing structured expert knowledge about clinical and biomedical problems using causal, taxonomic, associational, rule, and frame-based models. By 1975, several prototype systems had been developed and clinically tested, and the Rutgers Research Resource on Computers in Biomedicine hosted the first in a series of workshops on AI in Medicine that helped researchers and clinicians share their ideas, demonstrate their models, and comment on the prospects for the field. These developments and the workshops themselves benefited considerably from Stanford’s SUMEX-AIM pioneering experiment in biomedical computer networking. This paper focuses on discussions about issues at the intersection of medicine and artificial intelligence that took place during the presentations and panels at the First Rutgers AIM Workshop in New Brunswick, New Jersey from June 14 to 17, 1975. PMID:26123911

  15. An Opening Chapter of the First Generation of Artificial Intelligence in Medicine: The First Rutgers AIM Workshop, June 1975.

    PubMed

    Kulikowski, C A

    2015-08-13

    The first generation of Artificial Intelligence (AI) in Medicine methods were developed in the early 1970's drawing on insights about problem solving in AI. They developed new ways of representing structured expert knowledge about clinical and biomedical problems using causal, taxonomic, associational, rule, and frame-based models. By 1975, several prototype systems had been developed and clinically tested, and the Rutgers Research Resource on Computers in Biomedicine hosted the first in a series of workshops on AI in Medicine that helped researchers and clinicians share their ideas, demonstrate their models, and comment on the prospects for the field. These developments and the workshops themselves benefited considerably from Stanford's SUMEX-AIM pioneering experiment in biomedical computer networking. This paper focuses on discussions about issues at the intersection of medicine and artificial intelligence that took place during the presentations and panels at the First Rutgers AIM Workshop in New Brunswick, New Jersey from June 14 to 17, 1975.

  16. Lattice Boltzmann for Airframe Noise Predictions

    NASA Technical Reports Server (NTRS)

    Barad, Michael; Kocheemoolayil, Joseph; Kiris, Cetin

    2017-01-01

    The goal is to increase the predictive use of high-fidelity Computational Aero-Acoustics (CAA) capabilities for NASA's next-generation aviation concepts. CFD has been utilized substantially in analysis and design for steady-state problems (RANS), but computational resources are extremely challenged by high-fidelity unsteady problems (e.g., unsteady loads, buffet boundary, jet and installation noise, fan noise, active flow control, and airframe noise). Novel techniques are needed to reduce the computational resources consumed by current high-fidelity CAA, to enable routine acoustic analysis of aircraft components at full-scale Reynolds number from first principles, and to achieve an order-of-magnitude reduction in wall time to solution.
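
    For readers unfamiliar with the method named in the title, the following is a generic textbook lattice Boltzmann update (D2Q9 lattice, BGK collision, periodic domain), not NASA's solver; all parameters are illustrative:

```python
import numpy as np

nx, ny, tau = 64, 64, 0.8            # grid size and relaxation time
# D2Q9 lattice velocities and weights
c = np.array([[0,0],[1,0],[0,1],[-1,0],[0,-1],[1,1],[-1,1],[-1,-1],[1,-1]])
w = np.array([4/9] + [1/9]*4 + [1/36]*4)

def equilibrium(rho, ux, uy):
    # standard second-order D2Q9 equilibrium distribution
    cu = c[:, 0, None, None]*ux + c[:, 1, None, None]*uy
    usq = ux**2 + uy**2
    return w[:, None, None]*rho*(1 + 3*cu + 4.5*cu**2 - 1.5*usq)

rho = np.ones((nx, ny))
ux = 0.05*np.sin(2*np.pi*np.arange(nx)/nx)[:, None]*np.ones((1, ny))
uy = np.zeros((nx, ny))
f = equilibrium(rho, ux, uy)

for step in range(100):
    # streaming: shift each population along its lattice velocity
    for i in range(9):
        f[i] = np.roll(np.roll(f[i], c[i, 0], axis=0), c[i, 1], axis=1)
    # macroscopic moments
    rho = f.sum(axis=0)
    ux = (c[:, 0, None, None]*f).sum(axis=0) / rho
    uy = (c[:, 1, None, None]*f).sum(axis=0) / rho
    # BGK collision: relax toward local equilibrium
    f += (equilibrium(rho, ux, uy) - f) / tau

print("mean density:", rho.mean())
```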

  17. Research Frontiers in Bioinspired Energy: Molecular-Level Learning from Natural Systems: A Workshop

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zolandz, Dorothy

    An interactive, multidisciplinary, public workshop, organized by a group of experts in biochemistry, biophysics, chemical and biomolecular engineering, chemistry, microbial metabolism, and protein structure and function, was held on January 6-7, 2011 in Washington, DC. Speakers provided fundamental insights into biological energy capture, storage, and transformation processes; topics such as microbes living in extreme environments like hydrothermal vents or caustic soda lakes (extremophiles) provided a fascinating basis for discussing the exploration and development of new energy systems. Breakout sessions and extended discussions among the multidisciplinary groups of participants in the workshop fostered information sharing and possible collaborations on future bioinspired research. Printed and web-based materials that summarize the committee's assessment of what transpired at the workshop were prepared to advance further understanding of fundamental chemical properties of biological systems within and between the disciplines. In addition, web-based materials (including two animated videos) were developed to make the workshop content more accessible to a broad audience of students and researchers working across disciplinary boundaries. Key workshop discussion topics included: exploring and identifying novel organisms; identifying patterns and conserved biological structures in nature; exploring and identifying fundamental properties and mechanisms of known biological systems; supporting current, and creating new, opportunities for interdisciplinary education, training, and outreach; and applying knowledge from biology to create new devices and sustainable technology.

  18. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Collins, William D; Johansen, Hans; Evans, Katherine J

    We present a survey of physical and computational techniques that have the potential to contribute to the next generation of high-fidelity, multi-scale climate simulations. Examples of the climate science problems that can be investigated with more depth include the capture of remote forcings of localized hydrological extreme events, an accurate representation of cloud features over a range of spatial and temporal scales, and parallel, large ensembles of simulations to more effectively explore model sensitivities and uncertainties. Numerical techniques, such as adaptive mesh refinement, implicit time integration, and separate treatment of fast physical time scales, are enabling improved accuracy and fidelity in simulation of dynamics and allow more complete representations of climate features at the global scale. At the same time, partnerships with computer science teams have focused on taking advantage of evolving computer architectures, such as many-core processors and GPUs, so that these approaches which were previously considered prohibitively costly have become both more efficient and scalable. In combination, progress in these three critical areas is poised to transform climate modeling in the coming decades.
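
    As a concrete illustration of the "separate treatment of fast physical time scales" mentioned above, the sketch below (a toy scalar problem, not a climate model) advances a stiff fast term implicitly via backward Euler while stepping a slow forcing explicitly, remaining stable at a time step far exceeding the fast scale:

```python
import math

# toy stiff problem: du/dt = -k_fast * u + cos(t), with k_fast * dt = 10
k_fast, dt, nsteps = 1e3, 1e-2, 100
u, t = 1.0, 0.0
for _ in range(nsteps):
    u = u + dt * math.cos(t)        # explicit step for the slow forcing
    u = u / (1.0 + dt * k_fast)     # implicit (backward Euler) fast decay
    t += dt
print(f"u({t:.2f}) = {u:.6f}")      # stable despite the stiff fast scale
```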

  19. Third user workshop on high-power lasers at the Linac Coherent Light Source

    DOE PAGES

    Bolme, Cynthia Anne; Glenzer, Sigfried; Fry, Alan

    2016-03-24

    On October 5–6, 2015, the third international user workshop focusing on high-power lasers at the Linac Coherent Light Source (LCLS) was held in Menlo Park, CA, USA [1: R. Falcone, S. Glenzer, and S. Hau-Riege, Synchrotron Radiation News 27(2), 56–58 (2014); 2: P. Heimann and S. Glenzer, Synchrotron Radiation News 28(3), 54–56 (2015)]. The workshop was co-organized by Los Alamos National Laboratory and SLAC National Accelerator Laboratory. More than 110 scientists attended from North America, Europe, and Asia to discuss high-energy-density (HED) science that is enabled by the unique combination of high-power lasers with the LCLS X-rays at the LCLS Matter in Extreme Conditions (MEC) endstation.

  20. Ninth Thermal and Fluids Analysis Workshop Proceedings

    NASA Technical Reports Server (NTRS)

    Sakowski, Barbara (Compiler)

    1999-01-01

    The Ninth Thermal and Fluids Analysis Workshop (TFAWS 98) was held at the Ohio Aerospace Institute in Cleveland, Ohio from August 31 to September 4, 1998. The theme for the hands-on training workshop and conference was "Integrating Computational Fluid Dynamics and Heat Transfer into the Design Process." Highlights of the workshop (in addition to the papers published herein) included an address by the NASA Chief Engineer, Dr. Daniel Mulville; a CFD short course by Dr. John D. Anderson of the University of Maryland; and a short course by Dr. Robert Cochran of Sandia National Laboratories. In addition, lectures and hands-on training were offered in the use of several cutting-edge engineering design and analysis-oriented CFD and Heat Transfer tools. The workshop resulted in international participation of over 125 persons representing aerospace and automotive industries, academia, software providers, government agencies, and private corporations. The papers published herein address issues and solutions related to the integration of computational fluid dynamics and heat transfer into the engineering design process. Although the primary focus is aerospace, the topics and ideas presented are applicable to many other areas where these and other disciplines are interdependent.

  1. Summary of the Tandem Cylinder Solutions from the Benchmark Problems for Airframe Noise Computations-I Workshop

    NASA Technical Reports Server (NTRS)

    Lockard, David P.

    2011-01-01

    Fifteen submissions in the tandem cylinders category of the First Workshop on Benchmark Problems for Airframe Noise Computations are summarized. Although the geometry is relatively simple, the problem involves complex physics. Researchers employed various block-structured, overset, unstructured, and embedded Cartesian grid techniques and considerable computational resources to simulate the flow. The solutions are compared against each other and against experimental data from two facilities. Overall, the simulations captured the gross features of the flow, but resolving all the details that would be necessary to compute the noise remains challenging. In particular, how best to simulate the effects of the experimental transition strip, and the associated high-Reynolds-number effects, was unclear. Furthermore, capturing the spanwise variation proved difficult.

  2. Uncertainty and Risk Assessment in the Design Process for Wind

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Damiani, Rick R.

    This report summarizes the concepts and opinions that emerged from an initial study on the subject of uncertainty in wind design, which included expert elicitation during a workshop held at the National Wind Technology Center at the National Renewable Energy Laboratory, July 12-13, 2016. In this paper, five major categories of uncertainty are identified. The first category is associated with direct impacts on turbine loads (i.e., the inflow including extreme events, the aero-hydro-servo-elastic response, soil-structure interaction, and load extrapolation). The second category encompasses material behavior and strength. Site suitability and due-diligence aspects pertain to the third category. Calibration of partial safety factors and optimal reliability levels make up the fourth. Last but not least is the category associated with uncertainties in computational modeling. The main sections of this paper follow this organization.

  3. Belowground Carbon Cycling Processes at the Molecular Scale: An EMSL Science Theme Advisory Panel Workshop

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hess, Nancy J.; Brown, Gordon E.; Plata, Charity

    2014-02-21

    At the Belowground Carbon Cycling Processes at the Molecular Scale workshop, an EMSL Science Theme Advisory Panel meeting held in February 2013, attendees discussed critical biogeochemical processes that regulate carbon cycling in soil. The meeting attendees determined that, as a national scientific user facility, EMSL can provide the tools and expertise needed to elucidate the molecular foundation that underlies mechanistic descriptions of biogeochemical processes that control carbon allocation and fluxes at the terrestrial/atmospheric interface in landscape and regional climate models. Consequently, the workshop's goal was to identify the science gaps that hinder either the development of mechanistic descriptions of critical processes or their accurate representation in climate models. In part, this report offers recommendations for future EMSL activities in this research area. The workshop was co-chaired by Dr. Nancy Hess (EMSL) and Dr. Gordon Brown (Stanford University).

  4. A flexible hydrological warning system in Denmark for real-time surface water and groundwater simulations

    NASA Astrophysics Data System (ADS)

    He, Xin; Stisen, Simon; Wiese, Marianne B.; Jørgen Henriksen, Hans

    2015-04-01

    In Denmark, increasing focus on extreme weather events has created considerable demand for short-term forecasts and early warnings in relation to groundwater and surface water flooding. The Geological Survey of Denmark and Greenland (GEUS) has set up, calibrated, and applied a nationwide water resources model, the DK-Model, primarily for simulating groundwater and surface water flows and groundwater levels during the past 20 years. So far, the DK-Model has only been used in offline historical and future-scenario simulations. Challenges therefore arise in operating such a model for online forecasts and early warnings, which requires access to continuously updated observed climate input data and to forecast data for precipitation, temperature, and global radiation for the next 48 hours or longer. GEUS collaborates closely with the Danish Meteorological Institute to test and enable this data input for the DK-Model. Owing to the comprehensive physical descriptions in the DK-Model, the simulation results can potentially include any component of the hydrological cycle within the model's domain. It is therefore important to identify which results need to be updated and saved in real-time mode, since it is not computationally economical to save every result given the heavy data load. GEUS has worked closely with end-users and interest groups, such as water planners and emergency managers from the municipalities, water supply and wastewater companies, consulting companies, and farmer organizations, to understand their needs for real-time simulation and monitoring of the nationwide water cycle. This participatory process has been supported by a web-based questionnaire survey and a workshop that connected the model developers and the users. To qualify the stakeholder engagement, GEUS selected a representative catchment (the Skjern River) for testing and demonstrating a prototype of the web-based hydrological warning system at the workshop, illustrating simulated groundwater levels, streamflow, and water content in the root zone. The web pages can be tailor-made to meet the requirements of the end-users and are flexible enough to extend as users' demands change. The active involvement of stakeholders in the workshop provided valuable insights and feedback for GEUS, relevant to the future development of the nationwide real-time modeling and water-cycle monitoring system for Denmark, including possible links to early warning and real-time forecasting systems operating at the local scale.

  5. Protecting Information: The Role of Community Colleges in Cybersecurity Education. A Report from a Workshop Sponsored by the National Science Foundation and the American Association of Community Colleges (Washington, DC, June 26-28, 2002).

    ERIC Educational Resources Information Center

    American Association of Community Colleges, Washington, DC.

    The education and training of the cybersecurity workforce is an essential element in protecting the nation's computer and information systems. On June 26-28, 2002, the National Science Foundation supported a cybersecurity education workshop hosted by the American Association of Community Colleges. The goals of the workshop were to map out the role…

  6. Fundamental heat transfer research for gas turbine engines

    NASA Technical Reports Server (NTRS)

    Metzger, D. E. (Editor)

    1980-01-01

    Thirty-seven experts from industry and the universities joined 24 NASA Lewis staff members in an exchange of ideas on trends in aeropropulsion research and technology, basic analyses, computational analyses, basic experiments, near-engine environment experiments, fundamental fluid mechanics and heat transfer, and hot technology as related to gas turbine engines. The workshop proceedings described here include pre-workshop input from participants, presentations of current activity by the Lewis staff, reports of the four working groups, and a workshop summary.

  7. Spatial and temporal accuracy of asynchrony-tolerant finite difference schemes for partial differential equations at extreme scales

    NASA Astrophysics Data System (ADS)

    Kumari, Komal; Donzis, Diego

    2017-11-01

    Highly resolved computational simulations on massively parallel machines are critical to understanding the physics of a vast number of complex phenomena in nature governed by partial differential equations. Simulations at extreme levels of parallelism present many challenges, with communication between processing elements (PEs) being a major bottleneck. In order to fully exploit the computational power of exascale machines, one needs to devise numerical schemes that relax global synchronizations across PEs. These asynchronous computations, however, have a degrading effect on the accuracy of standard numerical schemes. We have developed asynchrony-tolerant (AT) schemes that maintain their order of accuracy despite relaxed communications. We show, analytically and numerically, that these schemes retain their numerical properties with multi-step, higher-order temporal Runge-Kutta schemes. We also show that, for a range of optimized parameters, the computation time and error for AT schemes are less than for their synchronous counterparts. The stability of the AT schemes, which depends on the history and the random nature of the delays, is also discussed. Support from NSF is gratefully acknowledged.
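
    The setting the abstract describes can be sketched as follows (a toy 1D heat equation with one randomly delayed "PE boundary"; this illustrates the problem AT schemes address, not the AT correction itself, which would combine several delayed time levels with coefficients chosen to cancel the delay-induced error):

```python
import numpy as np

nx, nu, dt, nsteps = 64, 0.1, 2e-5, 500
dx = 1.0 / nx
x = np.arange(nx) * dx
history = [np.sin(2 * np.pi * x)]     # buffer of past time levels
rng = np.random.default_rng(0)

for n in range(nsteps):
    un = history[-1]
    delay = rng.integers(0, min(3, len(history)))  # random message delay
    stale = history[-1 - delay]       # neighbor data from a delayed step
    new = un.copy()
    # interior points: synchronous second-order central difference
    new[1:-1] = un[1:-1] + nu*dt/dx**2 * (un[2:] - 2*un[1:-1] + un[:-2])
    # periodic "PE boundaries": the wrapped neighbor value may be stale
    new[0] = un[0] + nu*dt/dx**2 * (un[1] - 2*un[0] + stale[-1])
    new[-1] = un[-1] + nu*dt/dx**2 * (stale[0] - 2*un[-1] + un[-2])
    history.append(new)

exact = np.exp(-4*np.pi**2*nu*nsteps*dt) * np.sin(2*np.pi*x)
err = np.linalg.norm(history[-1] - exact) * np.sqrt(dx)
print("L2 error with random delays:", err)
```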

  8. Addressing failures in exascale computing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Snir, Marc; Wisniewski, Robert W.; Abraham, Jacob A.

    2014-05-01

    We present here a report produced by a workshop on "Addressing Failures in Exascale Computing" held in Park City, Utah, August 4–11, 2012. The charter of this workshop was to establish a common taxonomy about resilience across all the levels in a computing system; discuss existing knowledge on resilience across the various hardware and software layers of an exascale system; and build on those results, examining potential solutions from both a hardware and software perspective and focusing on a combined approach. The workshop brought together participants with expertise in applications, system software, and hardware; they came from industry, government, and academia; and their interests ranged from theory to implementation. The combination allowed broad and comprehensive discussions and led to this document, which summarizes and builds on those discussions.

  9. Lung Cancer Workshop XI: Tobacco-Induced Disease: Advances in Policy, Early Detection and Management.

    PubMed

    Mulshine, James L; Avila, Rick; Yankelevitz, David; Baer, Thomas M; Estépar, Raul San Jose; Ambrose, Laurie Fenton; Aldigé, Carolyn R

    2015-05-01

    The Prevent Cancer Foundation Lung Cancer Workshop XI: Tobacco-Induced Disease: Advances in Policy, Early Detection and Management was held in New York, NY on May 16 and 17, 2014. The two goals of the Workshop were to define strategies to drive innovation in precompetitive quantitative research on the use of imaging to assess new therapies for management of early lung cancer and to discuss a process to implement a national program to provide high quality computed tomography imaging for lung cancer and other tobacco-induced disease. With the central importance of computed tomography imaging for both early detection and volumetric lung cancer assessment, strategic issues around the development of imaging and ensuring its quality are critical to ensure continued progress against this most lethal cancer.

  10. Successes and Challenges for Flow Control Simulations

    NASA Technical Reports Server (NTRS)

    Rumsey, Christopher L.

    2008-01-01

    A survey is made of recent computations published for synthetic jet flow control cases from a CFD workshop held in 2004. The three workshop cases were originally chosen to represent different aspects of flow control physics: a nominally 2-D synthetic jet into quiescent air, a 3-D circular synthetic jet into a turbulent boundary-layer crossflow, and nominally 2-D flow control (both steady suction and oscillatory zero-net-mass-flow) for separation control on a simple wall-mounted aerodynamic hump shape. The purpose of this survey is to summarize the progress as related to these workshop cases, particularly noting successes and remaining challenges for computational methods. It is hoped that this summary will also, by extension, serve as an overview of the state of the art of CFD for these types of flow-controlled flow fields in general.

  11. Student Workshops for Severe Weather Warning Decision Making using AWIPS-2 at the University of Oklahoma

    NASA Astrophysics Data System (ADS)

    Zwink, A. B.; Morris, D.; Ware, P. J.; Ernst, S.; Holcomb, B.; Riley, S.; Hardy, J.; Mullens, S.; Bowlan, M.; Payne, C.; Bates, A.; Williams, B.

    2016-12-01

    For several years, employees at the Cooperative Institute for Mesoscale Meteorological Studies (CIMMS) at the University of Oklahoma (OU) who are affiliated with the Warning Decision Training Division (WDTD) of the National Weather Service (NWS) have provided training simulations to students from OU's School of Meteorology (SoM). These simulations focused on warning decision making using Dual-Pol radar data products in an AWIPS-1 environment. Building on these previous experiences, CIMMS/WDTD recently continued the collaboration with the SoM Oklahoma Weather Lab (OWL) by holding a warning decision workshop simulating a NWS Weather Forecast Office (WFO) experience. The workshop took place in the WDTD AWIPS-2 computer laboratory, with 25 AWIPS-2 workstations and the WES-2 Bridge (Weather Event Simulator) software, which replayed AWIPS-2 data. Using the WES-2 Bridge and the WESSL-2 (WES Scripting Language) event display, this computer lab has the state-of-the-art ability to simulate severe weather events and recreate WFO warning operations. OWL student forecasters attending the workshop worked in teams in a multi-player simulation of the Hastings, Nebraska WFO on May 6, 2015, when thunderstorms across the service area produced large hail, damaging winds, and multiple tornadoes. This paper discusses the design and goals of the WDTD/OWL workshop, as well as plans for holding similar workshops in the future.

  12. DOE's Computer Incident Advisory Capability (CIAC)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schultz, E.

    1990-09-01

    Computer security is essential in maintaining quality in the computing environment. Computer security incidents, however, are becoming more sophisticated. The DOE Computer Incident Advisory Capability (CIAC) team was formed primarily to assist DOE sites in responding to computer security incidents. Among CIAC's other responsibilities are gathering and distributing information to DOE sites, providing training workshops, coordinating with other agencies, response teams, and vendors, creating guidelines for incident handling, and developing software tools. CIAC has already provided considerable assistance to DOE sites faced with virus infections and worm and hacker attacks, has issued over 40 information bulletins, and has developed and presented a workshop on incident handling. CIAC's experience in helping sites has produced several lessons learned, including the need to follow effective procedures to avoid virus infections in small systems and the need for sound password management and system administration in networked systems. CIAC's activity and scope will expand in the future. 4 refs.

  13. Collaborative Problem-Solving Environments; Proceedings for the Workshop CPSEs for Scientific Research, San Diego, California, June 20 to July 1, 1999

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chin, George

    1999-01-11

    A workshop on collaborative problem-solving environments (CPSEs) was held June 29 through July 1, 1999, in San Diego, California. The workshop was sponsored by the U.S. Department of Energy and the High Performance Network Applications Team of the Large Scale Networking Working Group. The workshop brought together researchers and developers from industry, academia, and government to identify, define, and discuss future directions in collaboration and problem-solving technologies in support of scientific research.

  14. FOREWORD: International Workshop on Theoretical Plasma Physics: Modern Plasma Science. Sponsored by the Abdus Salam ICTP, Trieste, Italy

    NASA Astrophysics Data System (ADS)

    Shukla, P. K.; Stenflo, L.

    2005-01-01

    The "International Workshop on Theoretical Plasma Physics: Modern Plasma Science was held at the Abdus Salam International Centre for Theoretical Physics (Abdus Salam ICTP), Trieste, Italy during the period 5 16 July 2004. The workshop was organized by P K Shukla, R Bingham, S M Mahajan, J T Mendonça, L Stenflo, and others. The workshop enters into a series of previous biennial activities that we have held at the Abdus Salam ICTP since 1989. The scientific program of the workshop was split into two parts. In the first week, most of the lectures dealt with problems concerning astrophysical plasmas, while in the second week, diversity was introduced in order to address the important role of plasma physics in modern areas of science and technology. Here, attention was focused on cross-disciplinary topics including Schrödinger-like models, which are common in plasma physics, nonlinear optics, quantum engineering (Bose-Einstein condensates), and nonlinear fluid mechanics, as well as emerging topics in fundamental theoretical and computational plasma physics, space and dusty plasma physics, laser-plasma interactions, etc. The workshop was attended by approximately hundred-twenty participants from the developing countries, Europe, USA, and Japan. A large number of participants were young researchers from both the developing and industrial countries, as the directors of the workshop tried to keep a good balance in inviting senior and younger generations of theoretical, computational and experimental plasma physicists to our Trieste activities. In the first week, there were extensive discussions on the physics of electromagnetic wave emissions from pulsar magnetospheres, relativistic magnetohydrodynamics of astrophysical objects, different scale sizes turbulence and structures in astrophysics. The scientific program of the second week included five review talks (60 minutes) and about thirty invited topical lectures (30 minutes). In addition, during the two weeks, there were more than seventy poster papers in three sessions. The latter provided opportunities for younger physicists to display the results of their recent work and to obtain comments from the other participants. During the period 11 16 July 2004 at the Abdus Salam ICTP, we focused on nonlinear effects that are common in plasmas, fluids, nonlinear optics, and condensed matter physics. In addition, we concentrated on collective processes in space and dusty plasmas, as well as in astrophysics and intense laser-plasma interactions. Also presented were modern topics of nonlinear neutrino-plasma interactions, nonlinear quantum electrodynamics, quark-gluon plasmas, and high-energy astrophysics. This reflects that plasma physics is a truly cross-disciplinary and very fascinating science with many potential applications. The workshop was attended by several distinguished invited speakers. Most of the contributions from the second week of our Trieste workshop appear in this Topical Issue of Physica Scripta, which will be distributed to all the participants. The organizers are grateful to Professor Katepalli Raju Sreenivasan, the director of the Abdus Salam ICTP, for his generous support and warm hospitality in Trieste. The Editors appreciate their colleagues and co-organizers for their constant and wholehearted support in our endeavours of publishing this Topical Issue of Physica Scripta. We highly value the excellent work of Mrs Ave Lusenti and Dr. Brian Stewart at the Abdus Salam ICTP. 
Thanks are also due to the European Commission for supporting our activity through the Research Training Networks entitled "Complex Plasmas" and "Turbulent Boundary Layers". Finally, we would like to express our gratitude to the Abdus Salam ICTP for providing financial support to our workshop in Trieste. Besides, the workshop directors thank the speakers and the attendees for their contributions which resulted in the success of our Trieste workshop 2004. Specifically, we appreciate the speakers for delivering excellent talks, supplying well prepared manuscripts for publication, and enhancing the plasma physics activity at the Abdus Salam ICTP.

  15. PHOTOVOLTAICS AND THE ENVIRONMENT 1998. REPORT ON THE WORKSHOP PHOTOVOLTAICS AND THE ENVIRONMENT 1999

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    FTHENAKIS,V.; ZWEIBEL,K.; MOSKOWITZ,P.

    1999-02-01

    The objective of the workshop "Photovoltaics and the Environment" was to bring together PV manufacturers and industry analysts to define environment, health, and safety (EH&S) issues related to the large-scale commercialization of PV technologies.

  16. 77 FR 48107 - Workshop on Performance Assessments of Near-Surface Disposal Facilities: FEPs Analysis, Scenario...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-08-13

    ...) disposal facilities. The workshop has been developed to facilitate communication among Federal and State... and conceptual models, and (3) the selection of computer codes. Information gathered from invited.... NRC Public Meeting The purpose of this public meeting is to facilitate communication and gather...

  17. A DIY Ultrasonic Signal Generator for Sound Experiments

    ERIC Educational Resources Information Center

    Riad, Ihab F.

    2018-01-01

    Many physics departments around the world have electronic and mechanical workshops attached to them that can help build experimental setups and instruments for research and the training of undergraduate students. The workshops are usually run by experienced technicians and equipped with expensive lathing, computer numerical control (CNC) machines,…

  18. Applied Information Systems Research Program Workshop

    NASA Technical Reports Server (NTRS)

    Bredekamp, Joe

    1991-01-01

    Viewgraphs on Applied Information Systems Research Program Workshop are presented. Topics covered include: the Earth Observing System Data and Information System; the planetary data system; Astrophysics Data System project review; OAET Computer Science and Data Systems Programs; the Center of Excellence in Space Data and Information Sciences; and CASIS background.

  19. Assessing Knowledge Change in Computer Science

    ERIC Educational Resources Information Center

    Nash, Jane Gradwohl; Bravaco, Ralph J.; Simonson, Shai

    2006-01-01

    The purpose of this study was to assess structural knowledge change across a two-week workshop designed to provide high-school teachers with training in Java and Object Oriented Programming. Both before and after the workshop, teachers assigned relatedness ratings to pairs of key concepts regarding Java and Object Oriented Programming. Their…

  20. Rotordynamic Instability Problems in High-Performance Turbomachinery, 1990

    NASA Technical Reports Server (NTRS)

    1991-01-01

    The present workshop continues to report field experience and experimental results, and it expands the use of computational and control techniques with the integration of damper, bearing, and eccentric seal operation results. The intent of the workshop was to provide a continuing impetus for an understanding and resolution of these problems.

  1. Microwave Workshop for Windows.

    ERIC Educational Resources Information Center

    White, Colin

    1998-01-01

    "Microwave Workshop for Windows" consists of three programs that act as teaching aid and provide a circuit design utility within the field of microwave engineering. The first program is a computer representation of a graphical design tool; the second is an accurate visual and analytical representation of a microwave test bench; the third…

  2. Vibration Damping Workshop Proceedings Held at Long Beach, California on 27-29 February 1984.

    DTIC Science & Technology

    1984-11-11

    control system with a sensing accelerometer plus a differentiating network is an extremely effective damping system, if the magnitude of the... development/operating cost by 340M...

  3. Simulation Methods for Optics and Electromagnetics in Complex Geometries and Extreme Nonlinear Regimes with Disparate Scales

    DTIC Science & Technology

    2014-09-30

    software developed with this project support. S1 Cork School 2013: I. UPPEcore Simulator design and usage, simulation examples; II. Nonlinear pulse... pulse propagation, 08/28/13 - 08/02/13, University College Cork, Ireland. S2 ACMS MURI School 2012: Computational Methods for Nonlinear PDEs describing

  4. Using equitable impact sensitive tool (EQUIST) and knowledge translation to promote evidence to policy link in maternal and child health: report of first EQUIST training workshop in Nigeria.

    PubMed

    Uneke, Chigozie Jesse; Sombie, Issiaka; Uro-Chukwu, Henry Chukwuemeka; Johnson, Ermel; Okonofua, Friday

    2017-01-01

    The Equitable Impact Sensitive Tool (EQUIST), designed by UNICEF, and knowledge translation (KT) are important strategies that can help policymakers improve equity and evidence-informed policy making in maternal, newborn, and child health (MNCH). The purpose of this study was to improve the knowledge and capacity of an MNCH implementation research team (IRT) and policy makers to use EQUIST and KT. A modified "before and after" intervention study design was used, in which outcomes were measured on the target participants both before and after the intervention (workshop) was implemented. A 5-point Likert scale according to the degree of adequacy was employed. A three-day intensive EQUIST and KT training workshop was organized in Edo State, Nigeria, with 45 participants in attendance. Topics covered included: (i) knowledge translation models, measures, and tools; (ii) policy review, analysis, and contextualization; (iii) the policy formulation and legislation process; (iv) EQUIST overview and theory of change; and (v) EQUIST's situation analysis, scenario analysis, and scenario comparison. The pre-workshop mean of understanding of the use of KT ranged from 2.02-3.41, while the post-workshop mean ranged from 3.24-4.30. The pre-workshop mean of understanding of the use of EQUIST ranged from 1.66-2.41, while the post-workshop mean ranged from 3.56-4.54 on the 5-point scale. The percentage increases in the means of KT and EQUIST understanding at the end of the workshop ranged from 8.0%-88.1% and 65.6%-158.4%, respectively. The findings of this study suggest that policymakers' and researchers' competence in using KT and EQUIST for evidence-informed policymaking can be enhanced through a training workshop.
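
    For clarity, the reported percentage increases follow the usual formula (post - pre)/pre x 100; the pairing below is hypothetical, chosen only to land inside the reported EQUIST range:

```python
# worked check of the percentage-increase arithmetic (hypothetical pairing
# of pre/post means taken from the reported ranges, not an actual item)
pre, post = 1.66, 3.56
pct_increase = (post - pre) / pre * 100.0
print(f"{pct_increase:.1f}% increase")   # ~114.5%, within 65.6%-158.4%
```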

  5. Using equitable impact sensitive tool (EQUIST) and knowledge translation to promote evidence to policy link in maternal and child health: report of first EQUIST training workshop in Nigeria

    PubMed Central

    Uneke, Chigozie Jesse; Sombie, Issiaka; Uro-Chukwu, Henry Chukwuemeka; Johnson, Ermel; Okonofua, Friday

    2017-01-01

    The Equitable Impact Sensitive Tool (EQUIST), designed by UNICEF, and knowledge translation (KT) are important strategies that can help policymakers improve equity and evidence-informed policy making in maternal, newborn, and child health (MNCH). The purpose of this study was to improve the knowledge and capacity of an MNCH implementation research team (IRT) and policy makers to use EQUIST and KT. A modified "before and after" intervention study design was used, in which outcomes were measured on the target participants both before and after the intervention (workshop) was implemented. A 5-point Likert scale according to the degree of adequacy was employed. A three-day intensive EQUIST and KT training workshop was organized in Edo State, Nigeria, with 45 participants in attendance. Topics covered included: (i) knowledge translation models, measures, and tools; (ii) policy review, analysis, and contextualization; (iii) the policy formulation and legislation process; (iv) EQUIST overview and theory of change; and (v) EQUIST's situation analysis, scenario analysis, and scenario comparison. The pre-workshop mean of understanding of the use of KT ranged from 2.02-3.41, while the post-workshop mean ranged from 3.24-4.30. The pre-workshop mean of understanding of the use of EQUIST ranged from 1.66-2.41, while the post-workshop mean ranged from 3.56-4.54 on the 5-point scale. The percentage increases in the means of KT and EQUIST understanding at the end of the workshop ranged from 8.0%-88.1% and 65.6%-158.4%, respectively. The findings of this study suggest that policymakers' and researchers' competence in using KT and EQUIST for evidence-informed policymaking can be enhanced through a training workshop. PMID:29158860

  6. Computers-in-the-Curriculum Workshop.

    ERIC Educational Resources Information Center

    Casella, Vicki

    1987-01-01

    Computer software to build skills and encourage family computer time over the summer is recommended for teachers to send home to parents. Programs include games based on classic adventure stories, a shopping mall game to encourage math skills, and keyboarding programs. (MT)

  7. Workshop Physics Activity Guide, Module 4: Electricity and Magnetism

    NASA Astrophysics Data System (ADS)

    Laws, Priscilla W.

    2004-05-01

    The Workshop Physics Activity Guide is a set of student workbooks designed to serve as the foundation for a two-semester calculus-based introductory physics course. It consists of 28 units that interweave text materials with activities that include prediction, qualitative observation, explanation, equation derivation, mathematical modeling, quantitative experiments, and problem solving. Students use a powerful set of computer tools to record, display, and analyze data, as well as to develop mathematical models of physical phenomena. The design of many of the activities is based on the outcomes of physics education research. The Workshop Physics Activity Guide is supported by an Instructor's Website that: (1) describes the history and philosophy of the Workshop Physics Project; (2) provides advice on how to integrate the Guide into a variety of educational settings; (3) provides information on computer tools (hardware and software) and apparatus; and (4) includes suggested homework assignments for each unit. Log on to the Workshop Physics Project website at http://physics.dickinson.edu/. Workshop Physics is a component of the Physics Suite, a collection of materials created by a group of educational reformers known as the Activity Based Physics Group. The Physics Suite contains a broad array of curricular materials that are based on physics education research, including:

      Understanding Physics, by Cummings, Laws, Redish, and Cooney (an introductory textbook based on the best-selling text by Halliday/Resnick/Walker); RealTime Physics Laboratory Modules; Physics by Inquiry (intended for use in a workshop setting); Interactive Lecture Demonstrations; Tutorials in Introductory Physics; and Activity Based Tutorials (designed primarily for use in recitations).

  8. Workshop on Grid Generation and Related Areas

      NASA Technical Reports Server (NTRS)

      1992-01-01

      A collection of papers given at the Workshop on Grid Generation and Related Areas is presented. The purpose of this workshop was to assemble engineers and scientists who are currently working on grid generation for computational fluid dynamics (CFD), surface modeling, and related areas. The objectives were to provide an informal forum on grid generation and related topics, to assess user experience, to identify needs, and to help promote synergy among engineers and scientists working in this area. The workshop consisted of four sessions representative of grid generation and surface modeling research and application within NASA LeRC. Each session contained presentations and an open discussion period.

  9. Theoretical and Computational Modeling of Magnetically Ordered Molecules & Electronic Nano-Transport of Spins: State of the Art and Unanswered Questions: Workshop

      DOE Office of Scientific and Technical Information (OSTI.GOV)

      Baruah, Tunna

      2015-02-06

      As the culmination of a five-year Nordforsk Network project entitled “Nanospintronics: Theory and Simulations”, Professor Carlo Canali (Linnaeus University, Sweden), members of the network, and Vincenza Benza (local organizer, Milan) organized a summer workshop in conjunction with the A. Volta Scientific Cultural Exchange program. The workshop took place 24-30 August 2013. Several Basic Energy Sciences scientists from the US gave lectures and provided hands-on tutorials to early-career materials and chemistry scientists. We received a total of $10,000 in support to offset the travel expenses of US-based participants in this workshop.

    • The Relationships Between the Trends of Mean and Extreme Precipitation

      NASA Technical Reports Server (NTRS)

      Zhou, Yaping; Lau, William K.-M.

      2017-01-01

      This study provides a better understanding of the relationships between the trends of mean and extreme precipitation in two observed precipitation data sets: the Climate Prediction Center Unified daily precipitation data set and the Global Precipitation Climatology Project (GPCP) pentad data set. The study employs three definitions of extreme precipitation based on local statistics: (1) percentile thresholds, (2) standard deviations, and (3) generalized extreme value (GEV) distribution analysis of extreme events. The relationship between trends in mean and extreme precipitation is quantified with a novel metric, the area-aggregated matching ratio (AAMR), computed on regional and global scales. Generally, more (fewer) extreme events are likely to occur in regions with a positive (negative) mean trend. The match between the mean and extreme trends deteriorates for increasingly heavy precipitation events. The AAMR is higher in regions with negative mean trends than in regions with positive mean trends, suggesting a higher likelihood of severe dry events than of heavy rain events in a warming climate. The AAMR is also higher in the tropics and over the oceans than in the extratropics and over land, reflecting a higher degree of randomness in the latter regions and the greater importance there of dynamical, rather than thermodynamical, contributions to extreme events.
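
      The abstract does not spell out the AAMR formula, so the following minimal sketch encodes one plausible reading purely for illustration: the area-weighted fraction of grid cells where the sign of the extreme-precipitation trend matches the sign of the mean trend. The grid, the synthetic trend fields, and the sign-matching rule are assumptions, not the paper's definition.

          import numpy as np

          def aamr(mean_trend, extreme_trend, cell_area):
              """Area-aggregated matching ratio (assumed definition): fraction of
              regional area where the extreme-precipitation trend has the same
              sign as the mean-precipitation trend."""
              match = np.sign(mean_trend) == np.sign(extreme_trend)
              return (cell_area * match).sum() / cell_area.sum()

          # Toy 2-degree global grid; area weights proportional to cos(latitude).
          rng = np.random.default_rng(0)
          lat = np.deg2rad(np.arange(-89.0, 90.0, 2.0))
          area = np.cos(lat)[:, None] * np.ones((90, 180))
          mean_tr = rng.normal(size=(90, 180))                       # mean trends
          extr_tr = mean_tr + rng.normal(scale=0.8, size=(90, 180))  # noisier extremes
          print(f"AAMR = {aamr(mean_tr, extr_tr, area):.2f}")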

    • Advances in Cross-Cutting Ideas for Computational Climate Science

      DOE Office of Scientific and Technical Information (OSTI.GOV)

      Ng, Esmond; Evans, Katherine J.; Caldwell, Peter

      This report presents results from the DOE-sponsored workshop titled "Advancing X-Cutting Ideas for Computational Climate Science," known as AXICCS, held on September 12-13, 2016 in Rockville, MD. The workshop brought together experts in climate science, computational climate science, computer science, and mathematics to discuss interesting but unsolved science questions regarding climate modeling and simulation, promoted collaboration among the diverse scientists in attendance, and brainstormed about possible tools and capabilities that could be developed to help address them. Several research opportunities that the group felt could significantly advance climate science emerged from the workshop discussions. These include (1) process-resolving models to provide insight into important processes and features of interest and inform the development of advanced physical parameterizations, (2) a community effort to develop and provide integrated model credibility, (3) including, organizing, and managing increasingly connected model components that increase model fidelity yet also complexity, and (4) treating Earth system models as one interconnected organism without numerical or data-based boundaries that limit interactions. The group also identified several cross-cutting advances in mathematics, computer science, and computational science that would be needed to enable one or more of these big ideas. It is critical to address the need for organized, verified, and optimized software, which enables the models to grow and continue to provide solutions in which the community can have confidence. Effectively utilizing the newest computer hardware enables simulation efficiency and the ability to handle output from increasingly complex and detailed models. This will be accomplished through hierarchical multiscale algorithms in tandem with new strategies for data handling, analysis, and storage. These big ideas and cross-cutting technologies for enabling breakthrough climate simulation advancements also need the "glue" of outreach and learning across the scientific domains to be successful. The workshop identified several strategies to allow productive, continuous engagement among those who have a broad knowledge of the various angles of the problem. Specific ideas to foster education and tools to make material progress were discussed. Examples include follow-on cross-cutting meetings that enable unstructured discussions of the type this workshop fostered. A concerted effort is needed to recruit undergraduate and graduate students from all relevant domains and provide them with experience, training, and networking beyond their immediate expertise. This will broaden their exposure to future needs and solutions and provide a pipeline of scientists with a diversity of knowledge and know-how. Providing real-world experience with subject-matter experts from multiple angles may also motivate students to attack these problems and even come up with the missing solutions.

    • Proceedings of the Second Joint Technology Workshop on Neural Networks and Fuzzy Logic, volume 1

      NASA Technical Reports Server (NTRS)

      Lea, Robert N. (Editor); Villarreal, James (Editor)

      1991-01-01

      Documented here are papers presented at the Neural Networks and Fuzzy Logic Workshop sponsored by NASA and the University of Houston-Clear Lake. The workshop was held April 11 to 13 at the Johnson Space Center. Technical topics addressed included adaptive systems, learning algorithms, network architectures, vision, robotics, neurobiological connections, speech recognition and synthesis, fuzzy set theory and application, control and dynamics processing, space applications, fuzzy logic and neural network computers, approximate reasoning, and multiobjective decision making.

    • Proceedings of the Seventh International Workshop on Advances in Electrocorticography

      PubMed Central

      Ritaccio, Anthony; Matsumoto, Riki; Morrell, Martha; Kamada, Kyousuke; Koubeissi, Mohamad; Poeppel, David; Lachaux, Jean-Philippe; Yanagisawa, Yakufumi; Hirata, Masayuki; Guger, Christoph; Schalk, Gerwin

      2015-01-01

      The Seventh International Workshop on Advances in Electrocorticography (ECoG) convened in Washington, DC, on November 13–14, 2014. Electrocorticography-based research continues to proliferate widely across basic science and clinical disciplines. The 2014 workshop highlighted advances in neurolinguistics, brain-computer interface, functional mapping, and seizure termination facilitated by advances in the recording and analysis of the ECoG signal. The following proceedings document is an attempt at summarizing the content of this past year’s successful multidisciplinary gathering. PMID:26322594

    • DOE Office of Scientific and Technical Information (OSTI.GOV)

      Koniges, A.E.; Craddock, G.G.; Schnack, D.D.

      The purpose of the workshop was to assemble workers, both within and outside of the fusion-related computation areas, for discussion of the issues of dynamically adaptive gridding. There were three invited talks on adaptive-gridding application experience in related fields of computational fluid dynamics (CFD), and nine short talks reporting on the progress of adaptive techniques in the specific areas of scrape-off-layer (SOL) modeling and magnetohydrodynamic (MHD) stability. Adaptive mesh methods have been successful in a number of diverse fields of CFD for over a decade. The method involves dynamic refinement of computed field profiles in a way that uniformly disperses the numerical errors associated with discrete approximations. Because the process optimizes computational effort, adaptive mesh methods can be used to study otherwise intractable physical problems that involve complex boundary shapes or multiple spatial/temporal scales. Recent results indicate that these adaptive techniques will be required for tokamak fluid-based simulations involving diverted-tokamak SOL modeling and MHD simulation problems related to the highest-priority ITER-relevant issues. Individual papers are indexed separately on the energy databases.

    • Proceedings of the Workshop on Computational Aspects in the Control of Flexible Systems, part 1

      NASA Technical Reports Server (NTRS)

      Taylor, Lawrence W., Jr. (Compiler)

      1989-01-01

      Control/Structures Integration program software needs, computer aided control engineering for flexible spacecraft, computer aided design, computational efficiency and capability, modeling and parameter estimation, and control synthesis and optimization software for flexible structures and robots are among the topics discussed.

    • Overview and Summary of the Second AIAA High Lift Prediction Workshop

      NASA Technical Reports Server (NTRS)

      Rumsey, Christopher L.; Slotnick, Jeffrey P.

      2014-01-01

      The second AIAA CFD High-Lift Prediction Workshop was held in San Diego, California, in June 2013. The goals of the workshop continued in the tradition of the first high-lift workshop: to assess the numerical prediction capability of current-generation computational fluid dynamics (CFD) technology for swept, medium/high-aspect-ratio wings in landing/takeoff (high-lift) configurations. This workshop analyzed the flow over the DLR-F11 model in landing configuration at two different Reynolds numbers. Twenty-six participants submitted a total of 48 data sets of CFD results. A variety of grid systems (both structured and unstructured) were used. Trends due to grid density and Reynolds number were analyzed, and effects of support brackets were also included. This paper analyzes the combined results from all workshop participants. Comparisons with experimental data are made. A statistical summary of the CFD results is also included.

    • Report of the workshop on Aviation Safety/Automation Program

      NASA Technical Reports Server (NTRS)

      Morello, Samuel A. (Editor)

      1990-01-01

      As part of NASA's responsibility to encourage and facilitate active exchange of information and ideas among members of the aviation community, an Aviation Safety/Automation workshop was organized and sponsored by the Flight Management Division of NASA Langley Research Center. The one-day workshop was held on October 10, 1989, at the Sheraton Beach Inn and Conference Center in Virginia Beach, Virginia. Participants were invited from industry, government, and universities to discuss critical questions and issues concerning the rapid introduction and utilization of advanced computer-based technology into the flight deck and air traffic controller workstation environments. The workshop was attended by approximately 30 discipline experts, automation and human factors researchers, and research and development managers. The goal of the workshop was to address major issues identified by the NASA Aviation Safety/Automation Program. Here, the results of the workshop are documented. The ideas, thoughts, and concepts were developed by the workshop participants. The findings, however, have been synthesized into a final report primarily by the NASA researchers.

    • Statistical Analysis of CFD Solutions from the Fourth AIAA Drag Prediction Workshop

      NASA Technical Reports Server (NTRS)

      Morrison, Joseph H.

      2010-01-01

      A graphical framework is used for statistical analysis of the results from an extensive N-version test of a collection of Reynolds-averaged Navier-Stokes computational fluid dynamics codes. The solutions were obtained by code developers and users from the U.S., Europe, Asia, and Russia using a variety of grid systems and turbulence models for the June 2009 4th Drag Prediction Workshop sponsored by the AIAA Applied Aerodynamics Technical Committee. The aerodynamic configuration for this workshop was a new subsonic transport model, the Common Research Model, designed using a modern approach for the wing and included a horizontal tail. The fourth workshop focused on the prediction of both absolute and incremental drag levels for wing-body and wing-body-horizontal tail configurations. This work continues the statistical analysis begun in the earlier workshops and compares the results from the grid convergence study of the most recent workshop with earlier workshops using the statistical framework.
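
      The statistical framework itself is detailed in the workshop papers; as a rough, hypothetical illustration of the kind of N-version summary involved, the sketch below computes the sample mean and scatter of made-up drag-coefficient submissions and flags entries outside an illustrative 2-sigma band. The values and the cutoff are assumptions, not workshop data.

          import numpy as np

          # Hypothetical drag-coefficient submissions from N codes/grid families.
          cd = np.array([0.02510, 0.02542, 0.02498, 0.02531, 0.02755, 0.02519, 0.02507])

          mean, sigma = cd.mean(), cd.std(ddof=1)
          k = 2.0                                   # illustrative outlier band
          outliers = np.abs(cd - mean) > k * sigma  # flag far-from-consensus entries
          print(f"mean = {mean:.5f}, sigma = {sigma:.5f}")
          print("flagged submissions:", np.where(outliers)[0].tolist())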

    • Wall Modeled Large Eddy Simulation of Airfoil Trailing Edge Noise

      NASA Astrophysics Data System (ADS)

      Kocheemoolayil, Joseph; Lele, Sanjiva

      2014-11-01

      Large eddy simulation (LES) of airfoil trailing edge noise has largely been restricted to low Reynolds numbers due to prohibitive computational cost. Wall modeled LES (WMLES) is a computationally cheaper alternative that makes full-scale Reynolds numbers relevant to large wind turbines accessible. A systematic investigation of trailing edge noise prediction using WMLES is conducted. Detailed comparisons are made with experimental data. The stress boundary condition from a wall model does not constrain the fluctuating velocity to vanish at the wall. This limitation has profound implications for trailing edge noise prediction. The simulation over-predicts the intensity of fluctuating wall pressure and far-field noise. An improved wall model formulation that minimizes the over-prediction of fluctuating wall pressure is proposed and carefully validated. The flow configurations chosen for the study are from the workshop on benchmark problems for airframe noise computations. The large eddy simulation database is used to examine the adequacy of scaling laws that quantify the dependence of trailing edge noise on Mach number, Reynolds number and angle of attack. Simplifying assumptions invoked in engineering approaches towards predicting trailing edge noise are critically evaluated. We gratefully acknowledge financial support from GE Global Research and thank Cascade Technologies Inc. for providing access to their massively-parallel large eddy simulation framework.

  1. Workshop on Aircraft Surface Representation for Aerodynamic Computation

    NASA Technical Reports Server (NTRS)

    Gregory, T. J. (Editor); Ashbaugh, J. (Editor)

    1980-01-01

    Papers and discussions on surface representation and its integration with aerodynamics, computers, graphics, wind tunnel model fabrication, and flow field grid generation are presented. Surface definition is emphasized.

  2. Multi-GPU hybrid programming accelerated three-dimensional phase-field model in binary alloy

    NASA Astrophysics Data System (ADS)

    Zhu, Changsheng; Liu, Jieqiong; Zhu, Mingfang; Feng, Li

    2018-03-01

    In simulations of dendritic growth, computational efficiency and attainable problem scale strongly determine the usefulness of the three-dimensional phase-field model. Seeking high-performance computing methods that improve efficiency and expand the attainable problem scale is therefore of great significance for research on material microstructure. A high-performance computing method based on an MPI+CUDA hybrid programming model is introduced. Multiple GPUs are used to carry out quantitative numerical simulations of a three-dimensional phase-field model of a binary alloy under coupled multi-physical processes. The acceleration achieved by different numbers of GPU nodes at different problem scales is explored. On the foundation of the basic multi-GPU model, two optimization schemes are proposed: non-blocking communication, and overlap of MPI communication with GPU computation. The results of the two optimization schemes and of the basic multi-GPU model are compared. The results show that the multi-GPU model clearly improves the computational efficiency of the three-dimensional phase-field model, achieving a 13x speedup over a single GPU, with the problem scale expanded to 8193. The feasibility of the two optimization schemes is demonstrated; the communication/computation overlap scheme performs best, delivering a further 1.7x speedup over the basic multi-GPU model when 21 GPUs are used.
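
    A minimal sketch of the communication/computation overlap described above, written with mpi4py and a NumPy stand-in for the CUDA phase-field kernel (the paper's implementation is MPI+CUDA, not Python): interior cells, which need no ghost data, are updated while non-blocking halo messages are in flight, and the edge cells are updated only after the waits complete. The 1-D decomposition, message tags, and diffusion-style update are illustrative assumptions.

        import numpy as np
        from mpi4py import MPI

        comm = MPI.COMM_WORLD
        rank, size = comm.Get_rank(), comm.Get_size()
        nx = 64                        # local cells per rank (1-D split for brevity)
        phi = np.random.rand(nx + 2)   # local field with one ghost cell per side

        def step(idx):
            # Stand-in for the CUDA phase-field kernel: explicit diffusion update.
            return phi[idx] + 0.1 * (phi[idx - 1] - 2 * phi[idx] + phi[idx + 1])

        left, right = (rank - 1) % size, (rank + 1) % size
        for _ in range(10):
            # Post the non-blocking ghost-cell exchange ...
            reqs = [comm.Isend(phi[1:2], dest=left, tag=0),
                    comm.Isend(phi[-2:-1], dest=right, tag=1),
                    comm.Irecv(phi[-1:], source=right, tag=0),
                    comm.Irecv(phi[0:1], source=left, tag=1)]
            # ... and update interior cells (no ghost data needed) while messages fly.
            interior = np.arange(2, nx)
            new_interior = step(interior)
            MPI.Request.Waitall(reqs)              # ghost cells now valid
            new_edges = step(np.array([1, nx]))    # edge cells use fresh ghosts
            phi[interior], phi[[1, nx]] = new_interior, new_edges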

  3. 4th Penn State Bioinorganic Workshop

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Krebs, Carsten

    The research area of biological inorganic chemistry encompasses a wide variety of subfields, including molecular biology, biochemistry, biophysics, inorganic chemistry, analytical chemistry, physical chemistry, and theoretical chemistry, as well as many different methods, such as biochemical characterization of enzymes, reaction kinetics, a plethora of spectroscopic techniques, and computational methods. These methods are combined to understand the formation, function, and regulation of the many metallo-cofactors found in Nature, as well as to identify novel metallo-cofactors. Many metalloenzyme-catalyzed reactions are extremely complex but of fundamental importance to science and society. Examples include (i) the reduction of the chemically inert molecule dinitrogen to ammonia by the enzyme nitrogenase (a reaction fundamental to the production of nitrogen fertilizers); (ii) the oxidation of water to dioxygen by the Mn4Ca cluster found in photosystem II; and (iii) myriad reactions in which aliphatic, inert C-H bonds are cleaved for subsequent functionalization of the carbon atoms (reactions important in the biosynthesis of many natural products). Because of the broad range of areas and techniques employed in this field, research in bioinorganic chemistry is typically carried out collaboratively between two or more research groups. It is of paramount importance that researchers working in this field have a good, basic, working knowledge of the many methods and approaches employed in the field, in order to design and discuss experiments with collaborators. Therefore, the training of students working in bioinorganic chemistry is an important aspect of this field. Hugely successful “bioinorganic workshops” were offered in the 1990s at The University of Georgia. These workshops laid the foundation for many of the extant collaborative research efforts in this area today. The large and diverse group of bioinorganic chemists at The Pennsylvania State University and our unique laboratory space are well suited for the continuation of such training workshops. The co-principal investigators of this award lead these efforts. After a smaller “trial workshop” in 2010, the Penn State bioinorganic group, led by the co-PIs, has offered these workshops biennially. The 2012, 2014, and 2016 workshops provided training to 123, 162, and 153 participants, respectively, by offering (i) a series of lectures given by faculty experts on a given topic, (ii) hands-on training in small groups by experts in the various methods, and (iii) sharing of the participants' research results through oral and poster presentations. The centerpiece of the workshops is the hands-on training, in which approximately half of the participants, drawn from all ranks (undergraduate students to faculty), served as teachers. In these sessions, the traditional roles of teachers and students were sometimes reversed, to the extent that undergraduate students taught faculty in the students' areas of specialty. We anticipate that these workshops will facilitate research in bioinorganic chemistry and will help establish future collaborations among “workshop alumni” to carry out cutting-edge research in bioinorganic chemistry that will address many important topics relevant to our society.

  4. Evaluating Imaging and Computer-aided Detection and Diagnosis Devices at the FDA

    PubMed Central

    Gallas, Brandon D.; Chan, Heang-Ping; D’Orsi, Carl J.; Dodd, Lori E.; Giger, Maryellen L.; Gur, David; Krupinski, Elizabeth A.; Metz, Charles E.; Myers, Kyle J.; Obuchowski, Nancy A.; Sahiner, Berkman; Toledano, Alicia Y.; Zuley, Margarita L.

    2017-01-01

    This report summarizes the Joint FDA-MIPS Workshop on Methods for the Evaluation of Imaging and Computer-Assist Devices. The purpose of the workshop was to gather information on the current state of the science and to facilitate consensus development on statistical methods and study designs for the evaluation of imaging devices to support US Food and Drug Administration submissions. Additionally, participants were expected to identify gaps in knowledge and unmet needs that should be addressed in future research. This summary is intended to document the topics that were discussed at the meeting and to disseminate the lessons that have been learned through past studies of imaging and computer-aided detection and diagnosis device performance. PMID:22306064

  5. A Pervasive Parallel Processing Framework for Data Visualization and Analysis at Extreme Scale

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Moreland, Kenneth; Geveci, Berk

    2014-11-01

    The evolution of the computing world from teraflop to petaflop has been relatively effortless, with several of the existing programming models scaling effectively to the petascale. The migration to exascale, however, poses considerable challenges. All industry trends indicate that the exascale machine will be built from processors containing hundreds to thousands of cores per chip. It follows that efficient concurrency on exascale machines will require a massive number of concurrent threads, each performing many operations on a localized piece of data. Currently, visualization libraries and applications are based on what is known as the visualization pipeline. In the pipeline model, algorithms are encapsulated as filters with inputs and outputs. These filters are connected by setting the output of one component to the input of another. Parallelism in the visualization pipeline is achieved by replicating the pipeline for each processing thread. This works well for today's distributed-memory parallel computers but cannot be sustained when operating on processors with thousands of cores. Our project investigates a new visualization framework designed to exhibit the pervasive parallelism necessary for extreme-scale machines. Our framework achieves this by defining algorithms in terms of worklets, which are localized, stateless operations. Worklets are atomic operations that execute when invoked, unlike filters, which execute when a pipeline request occurs. The worklet design allows execution on a massive number of lightweight threads with minimal overhead. Only with such fine-grained parallelism can we hope to fill the billions of threads we expect will be necessary for efficient computation on an exascale machine.
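
    A worklet, as described above, is a stateless per-element operation that a scheduler can fan out across a massive number of threads. The toy Python sketch below illustrates the concept only and is not the project's actual API; a production framework would compile the worklet for fine-grained GPU execution rather than loop over elements.

        import numpy as np

        def magnitude_worklet(vec):
            """Stateless worklet: maps one input vector to one scalar. It keeps
            no state and touches only its own element, so a scheduler is free
            to run millions of instances concurrently."""
            return np.sqrt(np.dot(vec, vec))

        def dispatch(worklet, field):
            """Toy scheduler: invokes the worklet once per element. A real
            framework would launch these as lightweight GPU threads."""
            return np.array([worklet(x) for x in field])

        velocity = np.random.rand(100000, 3)     # e.g., a vector field
        speed = dispatch(magnitude_worklet, velocity)
        print(speed[:5])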

  6. Proceedings of the Sixth NASA Langley Formal Methods (LFM) Workshop

    NASA Technical Reports Server (NTRS)

    Rozier, Kristin Yvonne (Editor)

    2008-01-01

    Today's verification techniques are hard-pressed to scale with the ever-increasing complexity of safety critical systems. Within the field of aeronautics alone, we find the need for verification of algorithms for separation assurance, air traffic control, auto-pilot, Unmanned Aerial Vehicles (UAVs), adaptive avionics, automated decision authority, and much more. Recent advances in formal methods have made verifying more of these problems realistic. Thus we need to continually re-assess what we can solve now and identify the next barriers to overcome. Only through an exchange of ideas between theoreticians and practitioners from academia to industry can we extend formal methods for the verification of ever more challenging problem domains. This volume contains the extended abstracts of the talks presented at LFM 2008: The Sixth NASA Langley Formal Methods Workshop held on April 30 - May 2, 2008 in Newport News, Virginia, USA. The topics of interest that were listed in the call for abstracts were: advances in formal verification techniques; formal models of distributed computing; planning and scheduling; automated air traffic management; fault tolerance; hybrid systems/hybrid automata; embedded systems; safety critical applications; safety cases; accident/safety analysis.

  7. Successes and Challenges for Flow Control Simulations

    NASA Technical Reports Server (NTRS)

    Rumsey, Christopher L.

    2008-01-01

    A survey is made of recent computations published for synthetic jet flow control cases from a CFD workshop held in 2004. The three workshop cases were originally chosen to represent different aspects of flow control physics: nominally 2-D synthetic jet into quiescent air, 3-D circular synthetic jet into turbulent boundary-layer crossflow, and nominally 2-D flow-control (both steady suction and oscillatory zero-net-mass-flow) for separation control on a simple wall-mounted aerodynamic hump shape. The purpose of this survey is to summarize the progress as related to these workshop cases, particularly noting successes and remaining challenges for computational methods. It is hoped that this summary will also by extension serve as an overview of the state-of-the-art of CFD for these types of flow-controlled flow fields in general.

  8. Workshop on Wildlife Crime: An Interdisciplinary Perspective

    DTIC Science & Technology

    2015-01-01

    Keywords: wildlife crime, computation, conservation, criminology, conservation biology, risk, poaching. Fragmentary notes reference action items, a proposed conference on "Conservation, Computation, Criminology" (C^3), and technology transfer.

  9. ISCR Annual Report: Fiscal Year 2004

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McGraw, J R

    2005-03-03

    Large-scale scientific computation and all of the disciplines that support and help to validate it have been placed at the focus of Lawrence Livermore National Laboratory (LLNL) by the Advanced Simulation and Computing (ASC) program of the National Nuclear Security Administration (NNSA) and the Scientific Discovery through Advanced Computing (SciDAC) initiative of the Office of Science of the Department of Energy (DOE). The maturation of computational simulation as a tool of scientific and engineering research is underscored in the November 2004 statement of the Secretary of Energy that "high performance computing is the backbone of the nation's science and technology enterprise". LLNL operates several of the world's most powerful computers--including today's single most powerful--and has undertaken some of the largest and most compute-intensive simulations ever performed. Ultrascale simulation has been identified as one of the highest priorities in DOE's facilities planning for the next two decades. However, computers at architectural extremes are notoriously difficult to use efficiently. Furthermore, each successful terascale simulation only points out the need for much better ways of interacting with the resulting avalanche of data. Advances in scientific computing research have, therefore, never been more vital to LLNL's core missions than at present. Computational science is evolving so rapidly along every one of its research fronts that to remain on the leading edge, LLNL must engage researchers at many academic centers of excellence. In Fiscal Year 2004, the Institute for Scientific Computing Research (ISCR) served as one of LLNL's main bridges to the academic community with a program of collaborative subcontracts, visiting faculty, student internships, workshops, and an active seminar series. The ISCR identifies researchers from the academic community for computer science and computational science collaborations with LLNL and hosts them for short- and long-term visits, with the aim of encouraging long-term academic research agendas that address LLNL's research priorities. Through such collaborations, ideas and software flow in both directions, and LLNL cultivates its future workforce. The Institute strives to be LLNL's "eyes and ears" in the computer and information sciences, keeping the Laboratory aware of and connected to important external advances. It also attempts to be the "feet and hands" that carry those advances into the Laboratory and incorporate them into practice. ISCR research participants are integrated into LLNL's Computing and Applied Research (CAR) Department, especially into its Center for Applied Scientific Computing (CASC). In turn, these organizations address computational challenges arising throughout the rest of the Laboratory. Administratively, the ISCR flourishes under LLNL's University Relations Program (URP). Together with the other five institutes of the URP, it navigates a course that allows LLNL to benefit from academic exchanges while preserving national security. While it is difficult to operate an academic-like research enterprise within the context of a national security laboratory, the results declare the challenges well met and worth the continued effort.

  10. Force Field Accelerated Density Functional Theory Molecular Dynamics for Simulation of Reactive Systems at Extreme Conditions

    NASA Astrophysics Data System (ADS)

    Lindsey, Rebecca; Goldman, Nir; Fried, Laurence

    2017-06-01

    Atomistic modeling of chemistry at extreme conditions remains a challenge, despite continuing advances in computing resources and simulation tools. While first-principles methods provide a powerful predictive tool, the time and length scales associated with chemistry at extreme conditions (ns and μm, respectively) largely preclude extension of such models to molecular dynamics. In this work, we develop a simulation approach that retains the accuracy of density functional theory (DFT) while decreasing computational effort by several orders of magnitude. We generate n-body descriptions of atomic interactions by mapping forces arising from short DFT trajectories onto simple Chebyshev polynomial series. We examine the importance of including interactions beyond two-body terms, examine model transferability to different state points, and discuss approaches to ensure smooth and reasonable model shape outside of the distance domain sampled by the DFT training set. This work was performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344.
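
    A minimal sketch of the core fitting idea, regressing reference forces onto a Chebyshev basis by linear least squares, is shown below. Synthetic Lennard-Jones pair forces stand in for the DFT training data, and the cutoffs and polynomial order are illustrative assumptions, not the parameters used in this work.

        import numpy as np
        from numpy.polynomial import chebyshev as C

        # Synthetic "DFT" training data: noisy pair forces on r in [r_min, r_max].
        rng = np.random.default_rng(1)
        r_min, r_max, order = 2.0, 6.0, 8
        r = rng.uniform(r_min, r_max, 500)
        f_ref = 24 * (2 / r**13 - 1 / r**7) + rng.normal(scale=1e-3, size=r.size)

        # Map distances onto [-1, 1] (the Chebyshev domain), then solve the
        # linear least-squares problem for the expansion coefficients.
        s = 2 * (r - r_min) / (r_max - r_min) - 1
        A = C.chebvander(s, order)                  # columns are T_0(s) ... T_8(s)
        coef, *_ = np.linalg.lstsq(A, f_ref, rcond=None)

        # The fitted 2-body force model can now be evaluated anywhere in range.
        f_fit = C.chebval(s, coef)
        print("max |residual| on training set:", np.abs(f_fit - f_ref).max())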

  11. Fiscal Officer Training, 1999-2000. Participant's Guide.

    ERIC Educational Resources Information Center

    Department of Education, Washington, DC.

    This guide is intended for use by participants (college fiscal officers, business officers, bursars, loan managers, etc.) in a two-day workshop on Title IV of the reauthorized Higher Education Act. The guide includes copies of the visual displays used in the workshop, space for individual notes, sample forms, sample computer screens, quizzes, and…

  12. Future networking and cooperation summary of discussion

    Treesearch

    Roger R. Bay

    1993-01-01

    At the end of the workshop, I led a lightly structured and informal discussion concerning methods of continuing and improving communications and cooperation among workshop participants. The group specifically addressed three areas: maintaining informal one-on-one direct contacts, improving the use of the ADAP computer system for mail, and the desirability of starting...

  13. Derive Workshop Matrix Algebra and Linear Algebra.

    ERIC Educational Resources Information Center

    Townsley Kulich, Lisa; Victor, Barbara

    This document presents the course content for a workshop that integrates the use of the computer algebra system Derive with topics in matrix and linear algebra. The first section is a guide to using Derive that provides information on how to write algebraic expressions, make graphs, save files, edit, define functions, differentiate expressions,…

  14. Comparative analysis of marine ecosystems: workshop on predator-prey interactions.

    PubMed

    Bailey, Kevin M; Ciannelli, Lorenzo; Hunsicker, Mary; Rindorf, Anna; Neuenfeldt, Stefan; Möllmann, Christian; Guichard, Frederic; Huse, Geir

    2010-10-23

    Climate and human influences on marine ecosystems are largely manifested by changes in predator-prey interactions. It follows that ecosystem-based management of the world's oceans requires a better understanding of food web relationships. An international workshop on predator-prey interactions in marine ecosystems was held at the Oregon State University, Corvallis, OR, USA on 16-18 March 2010. The meeting brought together scientists from diverse fields of expertise including theoretical ecology, animal behaviour, fish and seabird ecology, statistics, fisheries science and ecosystem modelling. The goals of the workshop were to critically examine the methods of scaling-up predator-prey interactions from local observations to systems, the role of shifting ecological processes with scale changes, and the complexity and organizational structure in trophic interactions.

  15. University teaching - where next?

    NASA Astrophysics Data System (ADS)

    1999-03-01

    A one-day workshop will take place on 23 April 1999 at the University of Edinburgh's Conference and Training Centre to consider the topic `The future of university teaching? Multimedia, web and new technologies'. The workshop is being organized by Edinburgh Parallel Computing Centre and will be attended by experts in distance learning from various institutions, including the Clyde Virtual University and the Open University, plus a speaker from the USA. They will present case studies of the opportunities new technologies provide for higher education, covering all aspects from the development of electronic courses through delivery mechanisms to user feedback. There is certainly an increasing need for quality teaching materials and new ways of learning. The workshop will aim to discuss how those involved in university teaching can benefit from new developments such as multimedia, the Internet, and new computing and networking technologies. Participation is free, with lunch and refreshments provided. More information and registration details can be found at http://www.epcc.ed.ac.uk/epcc-tec/JTAP/workshop/ or by e-mail to epcc-tec@epcc.ed.ac.uk.

  16. "See One, Sim One, Do One"- A National Pre-Internship Boot-Camp to Ensure a Safer "Student to Doctor" Transition.

    PubMed

    Minha, Sa'ar; Shefet, Daphna; Sagi, Doron; Berkenstadt, Haim; Ziv, Amitai

    2016-01-01

    The transition from being a medical student to a fully functioning intern is accompanied by considerable stress and a sense of unpreparedness. Simulation-based workshops have previously been reported to be effective in improving the readiness of interns and residents for their daily needed skills, but only a few programs have been implemented on a large scale. A nationally endorsed and mandated pre-internship simulation-based workshop is reported. We hypothesized that this intervention would have a meaningful and sustained impact on trainees' perception of their readiness for internship with regard to patient-safety and quality-of-care skills. The main outcome measure was the workshop's contribution to professional training in general, and to critical skills and error prevention in particular, as perceived by participants. Between 2004 and 2011, 85 workshops were conducted for a total of 4,172 trainees. Eight hundred and six of the 2,700 participants approached by e-mail returned feedback evaluation forms, which were analyzed. Eighty-five percent of trainees perceived the workshop as an essential component of their professional training, and 87% agreed it should be mandatory. These ratings peaked during internship and were generally sustained 3 years after the workshop. The contribution to emergency care skills was ranked especially highly (83%). Implementation of a mandatory, simulation-based, pre-internship workshop on a national scale made a significant perceived impact on interns and residents. The sustained impact should encourage adoption of this approach to facilitate the student-to-doctor transition.

  17. An evaluation of the state of time synchronization on leadership class supercomputers

    DOE PAGES

    Jones, Terry; Ostrouchov, George; Koenig, Gregory A.; ...

    2017-10-09

    We present a detailed examination of time agreement characteristics for nodes within extreme-scale parallel computers. Using a software tool we introduce in this paper, we quantify attributes of clock skew among nodes in three representative high-performance computers sited at three national laboratories. Our measurements detail the statistical properties of time agreement among nodes and how time agreement drifts over typical application execution durations. We discuss the implications of our measurements, why the current state of the field is inadequate, and propose strategies to address observed shortcomings.
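
    The measurement tool itself is not described in this abstract; the sketch below shows one standard, NTP-style way to estimate per-node wall-clock offsets with round-trip probes over MPI (via mpi4py). The tag, trial count, and lowest-latency selection rule are illustrative assumptions, not the authors' method.

        import time
        from mpi4py import MPI

        comm = MPI.COMM_WORLD
        rank = comm.Get_rank()
        TRIALS = 100   # probes per peer; illustrative

        def probe_offset(peer):
            """Estimate peer's wall-clock offset relative to this rank:
            offset ~ t_peer - (t_send + rtt / 2); the sample with the
            smallest round-trip time is trusted most."""
            best = (float("inf"), 0.0)
            for _ in range(TRIALS):
                t0 = time.time()
                comm.send(None, dest=peer, tag=7)
                t_peer = comm.recv(source=peer, tag=7)
                rtt = time.time() - t0
                if rtt < best[0]:
                    best = (rtt, t_peer - (t0 + rtt / 2))
            return best[1]

        if rank == 0:
            for peer in range(1, comm.Get_size()):
                print(f"rank {peer}: clock offset ~ {probe_offset(peer):+.6e} s")
        else:
            for _ in range(TRIALS):
                comm.recv(source=0, tag=7)               # wait for a probe ...
                comm.send(time.time(), dest=0, tag=7)    # ... reply with local time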

  18. Diving into the analysis of time-depth recorder and behavioural data records: A workshop summary

    NASA Astrophysics Data System (ADS)

    Womble, Jamie N.; Horning, Markus; Lea, Mary-Anne; Rehberg, Michael J.

    2013-04-01

    Directly observing the foraging behavior of animals in the marine environment can be extremely challenging, if not impossible, as such behavior often takes place beneath the surface of the ocean and in extremely remote areas. In lieu of direct observation, data from time-depth recorders and other behavioral data-recording devices are commonly used to describe and quantify the behavior of fish, squid, seabirds, sea turtles, pinnipeds, and cetaceans. Definitions of behavioral units and analytical approaches often vary substantially, which may influence results and limit our ability to compare behaviors of interest across taxonomic groups and geographic regions. A workshop was convened in association with the Fourth International Symposium on Bio-logging in Hobart, Tasmania on 8 March 2011, with the goal of providing a forum for the presentation, review, and discussion of the various methods and approaches used to describe and analyze time-depth recorder and associated behavioral data records. The international meeting brought together 36 participants from 14 countries with a diversity of backgrounds, including scientists from academia and government, graduate students, post-doctoral fellows, and developers of electronic tagging technology and analysis software. The specific objectives of the workshop were to host a series of invited presentations followed by discussion sessions focused on (1) identifying behavioral units and metrics that are suitable for empirical studies, (2) reviewing analytical approaches and techniques that can be used to objectively classify behavior, and (3) identifying cases in which temporal autocorrelation structure is useful for identifying behaviors of interest. Outcomes of the workshop included highlighting the need to better define behavioral units and to devise more standardized processing and analytical techniques, in order to ensure that results are comparable across studies and taxonomic groups.

  19. Nonlinear power spectrum from resummed perturbation theory: a leap beyond the BAO scale

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Anselmi, Stefano; Pietroni, Massimo, E-mail: anselmi@ieec.uab.es, E-mail: massimo.pietroni@pd.infn.it

    2012-12-01

    A new computational scheme for the nonlinear cosmological matter power spectrum (PS) is presented. Our method is based on evolution equations in time, which can be cast in a form extremely convenient for fast numerical evaluation. A nonlinear PS is obtained in a time comparable to that needed for a simple 1-loop computation, and the numerical implementation is very simple. Our results agree with N-body simulations at the percent level in the BAO range of scales, and at the few-percent level up to k ≅ 1 h/Mpc at z ≳ 0.5, thereby opening the possibility of applying this tool to scales interesting for weak lensing. We clarify the approximations inherent to this approach as well as its relations to previous ones, such as the Time Renormalization Group and the multi-point propagator expansion. We discuss possible lines of improvement of the method and its intrinsic limitations from multi-streaming at small scales and low redshifts.

  1. Native Peoples-Native Homelands Climate Change Workshop: Lessons Learned

    NASA Technical Reports Server (NTRS)

    Maynard, Nancy G.

    2003-01-01

    The Native Peoples-Native Homelands Climate Change Workshop was held from October 28 through November 1, 1998, as part of a series of workshops being held around the U.S. to improve understanding of the potential consequences of climate variability and change for the Nation. This workshop was specifically designed by Native Peoples to examine the impacts of climate change and extreme weather variability on Native Peoples and Native Homelands from an indigenous cultural and spiritual perspective, and to develop recommendations as well as identify potential response actions. The workshop brought together interested Native Peoples, representatives of Tribal governments, traditional elders, Tribal leaders, natural resource managers, Tribal college faculty and students, and climate scientists from government agencies and universities. It is clear that Tribal colleges and universities play a unique and critical role in the success of these emerging partnerships for decision-making, in addition to their important educational function for both Native and non-Native communities, such as serving as a culturally appropriate vehicle for the access, analysis, control, and protection of indigenous cultural and intellectual property. During the discussions between scientists and policymakers from both Native and non-Native communities, a number of important lessons emerged that are key to building more effective partnerships between Native and non-Native communities for collaboration and decision-making for a more sustainable future. This talk summarizes the key issues, recommendations, and lessons learned during this workshop.

  2. Temporal and Spatio-Temporal Dynamic Instabilities: Novel Computational and Experimental approaches

    NASA Astrophysics Data System (ADS)

    Doedel, Eusebius J.; Panayotaros, Panayotis; Lambruschini, Carlos L. Pando

    2016-11-01

    This special issue contains a concise account of significant research results presented at the international workshop on Advanced Computational and Experimental Techniques in Nonlinear Dynamics, which was held in Cusco, Peru in August 2015. The meeting gathered leading experts, as well as new researchers, who have contributed to different aspects of Nonlinear Dynamics. Particularly significant was the presence of many active scientists from Latin America. The topics covered in this special issue range from advanced numerical techniques to novel physical experiments, and reflect the present state of the art in several areas of Nonlinear Dynamics. It contains seven review articles, followed by twenty-one regular papers that are organized in five categories, namely (1) Nonlinear Evolution Equations and Applications, (2) Numerical Continuation in Self-sustained Oscillators, (3) Synchronization, Control and Data Analysis, (4) Hamiltonian Systems, and (5) Scaling Properties in Maps.

  3. PREFACE: Special section featuring selected papers from the 3rd International Workshop on Numerical Modelling of High Temperature Superconductors

    NASA Astrophysics Data System (ADS)

    Granados, Xavier; Sánchez, Àlvar; López-López, Josep

    2012-10-01

    The development of superconducting applications and superconducting engineering requires the support of consistent tools that provide models for understanding the behaviour of these systems and predicting novel features. These models aim to compute the behaviour of superconducting systems, support the design of superconducting devices, and allow the behaviour of the superconducting parts to be understood and tested. Fifty years ago, in 1962, Charles Bean provided the superconducting community with a model efficient enough to allow the response of a superconductor to external magnetic fields and transport currents to be computed in an understandable way: the so-called critical-state model. Since then, in addition to the pioneering critical-state approach, other tools have been devised for designing operative superconducting systems, allowing integration of superconducting design into nearly standard electromagnetic computer-aided design systems by modelling the superconducting parts with consideration of time-dependent processes. In April 2012, Barcelona hosted the 3rd International Workshop on Numerical Modelling of High Temperature Superconductors (HTS), the third in a series of workshops started in Lausanne in 2010 and followed by Cambridge in 2011. The workshop reflected the state of the art and the new initiatives of HTS modelling, considering mathematical, physical and technological aspects within a wide and interdisciplinary scope. Superconductor Science and Technology is now publishing a selection of papers from the workshop, chosen for their high quality. The selection comprises seven papers covering mathematical, physical and technological topics which contribute to improved procedures, better understanding of phenomena, and the development of applications. We hope that they provide a perspective on the relevance and growth that the modelling of HTS superconductors has achieved over the past 25 years.

  4. A Federal Vision for Future Computing: A Nanotechnology-Inspired Grand Challenge

    DTIC Science & Technology

    2016-07-29

    Agencies involved: National Science Foundation (NSF), Department of Defense (DOD), National Institute of Standards and Technology (NIST), Intelligence Community (IC). Fragmentary text from the introduction lists capabilities sought across multiple Federal agencies, including intelligent big data sensors that act autonomously and are programmable via the network for increased flexibility, and intelligence for scientific discovery enabled by rapid extreme-scale data analysis, capable of understanding and making sense of results.

  5. PuLP/XtraPuLP : Partitioning Tools for Extreme-Scale Graphs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Slota, George M; Rajamanickam, Sivasankaran; Madduri, Kamesh

    2017-09-21

    PuLP/XtraPuLP is software for partitioning graphs drawn from several real-world problems. Graphs occur in many real-world settings, from road networks to social networks and scientific simulations. For efficient parallel processing, these graphs have to be partitioned (split) with respect to metrics such as computation and communication costs. Our software enables such partitioning for massive graphs.
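
    PuLP's name refers to partitioning using label propagation. The toy sketch below illustrates that general idea (vertices repeatedly adopt the most common part label among their neighbors, subject to a balance bound) and is not PuLP's actual multi-objective, multi-constraint algorithm; the capacity rule and iteration count are assumptions.

        import numpy as np

        def label_prop_partition(adj, nparts, max_cap, iters=20):
            """Toy label-propagation partitioner: each vertex adopts the most
            common part among its neighbors if the target part has room."""
            n = len(adj)
            part = np.arange(n) % nparts                 # arbitrary initial parts
            sizes = np.bincount(part, minlength=nparts)
            for _ in range(iters):
                moved = 0
                for v in range(n):
                    votes = np.bincount([part[u] for u in adj[v]], minlength=nparts)
                    target = int(votes.argmax())
                    if target != part[v] and sizes[target] < max_cap:
                        sizes[part[v]] -= 1
                        sizes[target] += 1
                        part[v] = target
                        moved += 1
                if moved == 0:                           # converged
                    break
            return part

        # Tiny example: a ring of 8 vertices split into 2 bounded parts.
        ring = {v: [(v - 1) % 8, (v + 1) % 8] for v in range(8)}
        print(label_prop_partition(ring, nparts=2, max_cap=5))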

  6. Towards the Goal of Modular Climate Data Services: An Overview of NCPP Applications and Software

    NASA Astrophysics Data System (ADS)

    Koziol, B. W.; Cinquini, L.; Treshansky, A.; Murphy, S.; DeLuca, C.

    2013-12-01

    In August 2013, the National Climate Predictions and Projections Platform (NCPP) organized a workshop focusing on the quantitative evaluation of downscaled climate data products (QED-2013). The QED-2013 workshop focused on real-world application problems drawn from several sectors (e.g. hydrology, ecology, environmental health, agriculture), and required that downscaled data products be dynamically accessed, generated, manipulated, annotated, and evaluated. The cyberinfrastructure elements that were integrated to support the workshop included (1) a wiki-based project hosting environment (Earth System CoG) with an interface to data services provided by an Earth System Grid Federation (ESGF) data node; (2) metadata tools provided by the Earth System Documentation (ES-DOC) collaboration; and (3) OpenClimateGIS (OCGIS), a Python-based library for subsetting and converting NetCDF-based climate data to GIS and tabular formats. Collectively, this toolset represents a first deployment of a 'ClimateTranslator' that enables users to access, interpret, and apply climate information at local and regional scales. This presentation will provide an overview of the components above, how they were used in the workshop, and discussion of current and potential integration. The long-term strategy for this software stack is to offer the suite of services described on a customizable, per-project basis. Additional detail on the three components is below. (1) Earth System CoG is a web-based collaboration environment that integrates data discovery and access services with tools for supporting governance and the organization of information. QED-2013 utilized these capabilities to share with workshop participants a suite of downscaled datasets, associated images derived from those datasets, and metadata files describing the downscaling techniques involved. The collaboration side of CoG was used for workshop organization, discussion, and results. (2) The ES-DOC Questionnaire, Viewer, and Comparator are web-based tools for the creation and use of model and experiment documentation. Workshop participants used the Questionnaire to generate metadata on regional downscaling models and statistical downscaling methods, and the Viewer to display the results. A prototype Comparator was available to compare properties across dynamically downscaled models. (3) OCGIS is a Python (v2.7) package designed for geospatial manipulation, subsetting, computation, and translation of Climate and Forecast (CF)-compliant climate datasets, either stored in local NetCDF files or served through THREDDS data servers.
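
    For the OCGIS component, typical usage (as I understand the RequestDataset/OcgOperations interface in the OCGIS documentation) looks roughly like the sketch below; the file name, variable, and geometry selection are placeholders, and the exact keyword names should be checked against the current OCGIS docs.

        import ocgis

        # Point at a CF-compliant NetCDF file (path and variable are placeholders).
        rd = ocgis.RequestDataset(uri='tas_downscaled.nc', variable='tas')

        # Subset by a named geometry and convert: NetCDF in, tabular CSV out.
        ops = ocgis.OcgOperations(dataset=rd,
                                  geom='state_boundaries',  # bundled geometry set
                                  select_ugid=[25],         # one polygon from it
                                  output_format='csv',
                                  prefix='tas_subset')
        print('wrote', ops.execute())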

  7. A Unified Data-Driven Approach for Programming In Situ Analysis and Visualization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Aiken, Alex

    The placement and movement of data is becoming the key limiting factor on both the performance and the energy efficiency of high performance computations. As systems generate more data, it is becoming increasingly difficult to actually move that data elsewhere for post-processing, as the rate of improvement in supporting I/O infrastructure is not keeping pace. Together, these trends are creating a shift in how we think about exascale computations, from a viewpoint that focuses on FLOPS to one that focuses on data and data-centric operations as fundamental to the reasoning about, and optimization of, scientific workflows on extreme-scale architectures. The overarching goal of our effort was the study of a unified data-driven approach for programming applications and in situ analysis and visualization. Our work was to understand the interplay between data-centric programming model requirements at extreme scale and the overall impact of those requirements on the design, capabilities, flexibility, and implementation details of both applications and the supporting in situ infrastructure. In this context, we made many improvements to the Legion programming system (one of the leading data-centric models today) and demonstrated in situ analyses on real application codes using these improvements.

  8. A multiply-add engine with monolithically integrated 3D memristor crossbar/CMOS hybrid circuit.

    PubMed

    Chakrabarti, B; Lastras-Montaño, M A; Adam, G; Prezioso, M; Hoskins, B; Payvand, M; Madhavan, A; Ghofrani, A; Theogarajan, L; Cheng, K-T; Strukov, D B

    2017-02-14

    Silicon (Si) based complementary metal-oxide semiconductor (CMOS) technology has been the driving force of the information-technology revolution. However, scaling of CMOS technology as per Moore's law has reached a serious bottleneck. Among emerging technologies, memristive devices are promising for both memory and computing applications. Hybrid CMOS/memristor circuits with the CMOL (CMOS + "Molecular") architecture have been proposed to combine the extremely high density of memristive devices with the robustness of CMOS technology, leading to terabit-scale memory and an extremely efficient computing paradigm. In this work, we demonstrate a hybrid 3D CMOL circuit with two layers of memristive crossbars monolithically integrated on a pre-fabricated CMOS substrate. The integrated crossbars can be fully operated through the underlying CMOS circuitry. The memristive devices in both layers exhibit analog switching behavior with controlled tunability and stable multi-level operation. We perform dot-product operations with the 2D and 3D memristive crossbars to demonstrate the applicability of such 3D CMOL hybrid circuits as a multiply-add engine. To the best of our knowledge, this is the first demonstration of a functional 3D CMOL hybrid circuit.
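
    The dot-product operation maps naturally onto a crossbar because physics does the arithmetic: row voltages multiply column conductances (Ohm's law) and the products sum on each column wire (Kirchhoff's current law). The sketch below is an idealized numerical model of that mapping, not the authors' measurement code; the conductance range and level count are illustrative assumptions.

        import numpy as np

        rng = np.random.default_rng(2)
        levels = 8                              # multi-level analog conductances
        g_min, g_max = 1e-6, 1e-4               # siemens; illustrative range
        W = rng.random((16, 4))                 # target weights in [0, 1]

        # Quantize the weights onto the available conductance levels.
        G = g_min + np.round(W * (levels - 1)) / (levels - 1) * (g_max - g_min)

        V = 0.2 * rng.random(16)                # row read voltages (volts)
        I = V @ G                               # column currents = dot products
        print("column currents (A):", I)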

  9. A multiply-add engine with monolithically integrated 3D memristor crossbar/CMOS hybrid circuit

    PubMed Central

    Chakrabarti, B.; Lastras-Montaño, M. A.; Adam, G.; Prezioso, M.; Hoskins, B.; Cheng, K.-T.; Strukov, D. B.

    2017-01-01

    Silicon (Si) based complementary metal-oxide semiconductor (CMOS) technology has been the driving force of the information-technology revolution. However, scaling of CMOS technology as per Moore’s law has reached a serious bottleneck. Among the emerging technologies, memristive devices are promising for both memory and computing applications. Hybrid CMOS/memristor circuits with the CMOL (CMOS + “Molecular”) architecture have been proposed to combine the extremely high density of memristive devices with the robustness of CMOS technology, leading to terabit-scale memory and an extremely efficient computing paradigm. In this work, we demonstrate a hybrid 3D CMOL circuit with two layers of memristive crossbars monolithically integrated on a pre-fabricated CMOS substrate. The integrated crossbars can be fully operated through the underlying CMOS circuitry. The memristive devices in both layers exhibit analog switching behavior with controlled tunability and stable multi-level operation. We perform dot-product operations with the 2D and 3D memristive crossbars to demonstrate the applicability of such 3D CMOL hybrid circuits as a multiply-add engine. To the best of our knowledge, this is the first demonstration of a functional 3D CMOL hybrid circuit. PMID:28195239

  10. Planetary dune workshop expands to include subaqueous processes

    USGS Publications Warehouse

    Titus, Timothy N.; Bryant, Gerald; Rubin, David M.

    2018-01-01

    Dune-like structures appear in the depths of Earth’s oceans, across its landscapes, and in the extremities of the solar system beyond. Dunes rise up under the thick, dense atmosphere of Venus, and they have been found under the almost unimaginably ephemeral atmosphere of a comet.

  11. MN GIS/LIS Consortium Annual Conference and Workshops, Rochester, MN, October 1-3, 2014

    EPA Science Inventory

    We mapped the distribution of multiple ecosystem services in the Saint Louis River Area of Concern (SLR AOC) under current and reported extreme lake levels. Services were mapped using measured or modeled natural features (i.e., bathymetry, vegetation, fetch, habitat, contaminated...

  12. Proceedings of the workshop on Compilation of (Symbolic) Languages for Parallel Computers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Foster, I.; Tick, E.

    1991-11-01

    This report comprises the abstracts and papers for the talks presented at the Workshop on Compilation of (Symbolic) Languages for Parallel Computers, held October 31-November 1, 1991, in San Diego. These unrefereed contributions were provided by the participants for the purpose of this workshop; many of them will be published elsewhere in peer-reviewed conferences and publications. Our goal in planning this workshop was to bring together researchers from different disciplines with common problems in compilation. In particular, we wished to encourage interaction between researchers working on compilation of symbolic languages and those working on compilation of conventional, imperative languages. The fundamental problems facing researchers interested in compilation of logic, functional, and procedural programming languages for parallel computers are essentially the same. However, differences in the basic programming paradigms have led different communities to emphasize different species of the parallel compilation problem. For example, parallel logic and functional languages provide dataflow-like formalisms in which control dependencies are unimportant. Hence, a major focus of research in compilation has been on techniques that try to infer when sequential control flow can safely be imposed. Granularity analysis for scheduling is a related problem. The single-assignment property leads to a need for analysis of memory use in order to detect opportunities for reuse. Much of the work in each of these areas relies on the use of abstract interpretation techniques.
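
    As a toy example of the abstract interpretation techniques mentioned at the end of the abstract (an illustration of the style of analysis, not any system from the workshop), here is a sign analysis for small arithmetic expressions -- the kind of dataflow fact a parallelizing compiler might propagate:

    ```python
    NEG, ZERO, POS, TOP = 'neg', 'zero', 'pos', 'unknown'

    def abs_add(a, b):
        if ZERO in (a, b):                  # x + 0 keeps the sign of x
            return b if a == ZERO else a
        return a if a == b and a != TOP else TOP

    def abs_mul(a, b):
        if ZERO in (a, b):
            return ZERO
        if TOP in (a, b):
            return TOP
        return POS if a == b else NEG       # sign rule for products

    def analyze(expr, env):
        """expr: int literal, variable name, or ('add'|'mul', e1, e2)."""
        if isinstance(expr, int):
            return NEG if expr < 0 else ZERO if expr == 0 else POS
        if isinstance(expr, str):
            return env.get(expr, TOP)
        op, e1, e2 = expr
        f = abs_add if op == 'add' else abs_mul
        return f(analyze(e1, env), analyze(e2, env))

    # Knowing only that x is negative, x*x + 1 is provably positive:
    print(analyze(('add', ('mul', 'x', 'x'), 1), {'x': NEG}))  # 'pos'
    ```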

  13. A Partnership between English Language Learners and a Team of Rocket Scientists: EPO for the NASA SDO Extreme Ultraviolet Variability Experiment (EVE)

    NASA Astrophysics Data System (ADS)

    Buhr, S. M.; McCaffrey, M. S.; Eparvier, F.; Murillo, M.

    2008-05-01

    Recent immigrant high school students were successfully engaged in learning about Sun-Earth connections through a partnership with the NASA Solar Dynamics Observatory Extreme Ultraviolet Variability Experiment (EVE) project. The students were enrolled in a pilot course as part of the Math, Engineering and Science Achievement (MESA) program. The English Language Learner (ELL) students doubled their achievement on a pre- and post-assessment of the course content. Students learned scientific content and vocabulary in English with support in Spanish, attended field trips, hosted scientist speakers, built antennas and deployed space weather monitors as part of the Stanford SOLAR project, and gave final presentations in English, showcasing their new computer skills. Teachers who taught the students in other courses noted gains in the students' willingness to use English in class and in their math skills. The course has been broken into modules for use in shorter after-school environments, or for use by EVE scientists who are outside of the Boulder area. Video footage of "The Making of a Satellite" and "All About EVE" is completed for use in the kits. Other EVE EPO includes upcoming professional development for teachers and content workshops for journalists.

  14. Overview of the Aeroelastic Prediction Workshop

    NASA Technical Reports Server (NTRS)

    Heeg, Jennifer; Chwalowski, Pawel; Florance, Jennifer P.; Wieseman, Carol D.; Schuster, David M.; Perry, Raleigh B.

    2013-01-01

    The Aeroelastic Prediction Workshop brought together an international community of computational fluid dynamicists as a step in defining the state of the art in computational aeroelasticity. The workshop's technical focus was the prediction of unsteady pressure distributions resulting from forced motion, benchmarking the results first using unforced system data. The most challenging aspects of the physics were identified as capturing oscillatory shock behavior, dynamic shock-induced separated flow, and tunnel wall boundary layer influences. The majority of the participants used unsteady Reynolds-averaged Navier-Stokes codes. These codes were exercised at transonic Mach numbers for three configurations, and comparisons were made with existing experimental data. Substantial variations were observed among the computational solutions, as well as differences relative to the experimental data. Issues contributing to these differences include wall effects and wall modeling, non-standardized convergence criteria, inclusion of static aeroelastic deflection, methodology for oscillatory solutions, and post-processing methods. Issues pertaining principally to the experimental data sets include the position of the model relative to the tunnel wall, splitter plate size, wind tunnel expansion slot configuration, spacing and location of pressure instrumentation, and data processing methods.

  15. 1994 Science Information Management and Data Compression Workshop

    NASA Technical Reports Server (NTRS)

    Tilton, James C. (Editor)

    1994-01-01

    This document is the proceedings from the 'Science Information Management and Data Compression Workshop,' which was held on September 26-27, 1994, at the NASA Goddard Space Flight Center, Greenbelt, Maryland. The Workshop explored promising computational approaches for handling the collection, ingestion, archival and retrieval of large quantities of data in future Earth and space science missions. It consisted of eleven presentations covering a range of information management and data compression approaches that are being or have been integrated into actual or prototypical Earth or space science data information systems, or that hold promise for such an application. The workshop was organized by James C. Tilton and Robert F. Cromp of the NASA Goddard Space Flight Center.

  16. The 1995 Science Information Management and Data Compression Workshop

    NASA Technical Reports Server (NTRS)

    Tilton, James C. (Editor)

    1995-01-01

    This document is the proceedings from the 'Science Information Management and Data Compression Workshop,' which was held on October 26-27, 1995, at the NASA Goddard Space Flight Center, Greenbelt, Maryland. The Workshop explored promising computational approaches for handling the collection, ingestion, archival, and retrieval of large quantities of data in future Earth and space science missions. It consisted of fourteen presentations covering a range of information management and data compression approaches that are being or have been integrated into actual or prototypical Earth or space science data information systems, or that hold promise for such an application. The Workshop was organized by James C. Tilton and Robert F. Cromp of the NASA Goddard Space Flight Center.

  17. Regression and Data Mining Methods for Analyses of Multiple Rare Variants in the Genetic Analysis Workshop 17 Mini-Exome Data

    PubMed Central

    Bailey-Wilson, Joan E.; Brennan, Jennifer S.; Bull, Shelley B; Culverhouse, Robert; Kim, Yoonhee; Jiang, Yuan; Jung, Jeesun; Li, Qing; Lamina, Claudia; Liu, Ying; Mägi, Reedik; Niu, Yue S.; Simpson, Claire L.; Wang, Libo; Yilmaz, Yildiz E.; Zhang, Heping; Zhang, Zhaogong

    2012-01-01

    Group 14 of Genetic Analysis Workshop 17 examined several issues related to analysis of complex traits using DNA sequence data. These issues included novel methods for analyzing rare genetic variants in an aggregated manner (often termed collapsing rare variants), evaluation of various study designs to increase power to detect effects of rare variants, and the use of machine learning approaches to model highly complex heterogeneous traits. Various published and novel methods for analyzing traits with extreme locus and allelic heterogeneity were applied to the simulated quantitative and disease phenotypes. Overall, we conclude that power depends (as expected) on locus-specific heritability or contribution to disease risk; that large samples will be required to detect rare causal variants with small effect sizes; that extreme-phenotype sampling designs may increase power at lower laboratory cost; that methods allowing joint analysis of multiple variants per gene or pathway are generally more powerful than analyses of individual rare variants; that population-specific analyses can be optimal when different subpopulations harbor private causal mutations; and that machine learning methods may be useful for selecting subsets of predictors for follow-up in the presence of extreme locus heterogeneity and large numbers of potential predictors. PMID:22128066
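
    As a hypothetical sketch of the "collapsing" idea (a simple burden test on simulated data, illustrative of the aggregation strategy rather than any specific GAW17 method):

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)
    n, m = 1000, 20                          # subjects, rare variants in one gene
    maf = rng.uniform(0.001, 0.01, m)        # rare minor-allele frequencies
    G = rng.binomial(2, maf, size=(n, m))    # genotype matrix (0/1/2 copies)

    burden = G.sum(axis=1)                   # collapse: rare-allele count per subject
    y = 0.5 * burden + rng.normal(size=n)    # quantitative trait with a burden effect

    res = stats.linregress(burden, y)        # one test for the whole gene
    print(f"beta = {res.slope:.2f}, p = {res.pvalue:.2e}")
    ```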

  18. Contributions of Dynamic and Thermodynamic Scaling in Subdaily Precipitation Extremes in India

    NASA Astrophysics Data System (ADS)

    Ali, Haider; Mishra, Vimal

    2018-03-01

    Despite the importance of subdaily precipitation extremes for urban areas, the role of dynamic and thermodynamic scaling in changes in precipitation extremes in India remains poorly constrained. Here we estimate contributions from thermodynamic and dynamic scaling to changes in subdaily precipitation extremes for 23 urban locations in India. Subdaily precipitation extremes have become more intense during the last few decades. Moreover, we find a twofold rise in the frequency of subdaily precipitation extremes during 1979-2015, which is faster than the increase in daily precipitation extremes. The contribution of dynamic scaling to this rise in the frequency and intensity of subdaily precipitation extremes is higher than that of thermodynamic scaling. Moreover, half-hourly precipitation extremes show higher contributions from both the thermodynamic (~10%/K) and dynamic (~15%/K) scaling than daily extremes (6%/K and 9%/K, respectively), indicating the role of warming in the rise in subdaily precipitation extremes in India. Our findings have implications for better understanding the dynamic response of precipitation extremes under the warming climate over India.
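
    As a hypothetical sketch of how a scaling rate in %/K can be estimated (a common binning-and-exponential-fit approach on synthetic data, not necessarily the paper's exact procedure):

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    T = rng.uniform(15.0, 30.0, 20000)                 # coincident temperature (deg C)
    P = rng.gamma(2.0, 1.0, 20000) * np.exp(0.07 * T)  # synthetic rain, 7%/K built in

    bins = np.arange(15.0, 31.0, 1.0)
    idx = np.digitize(T, bins)
    tc, p99 = [], []
    for i in range(1, len(bins)):
        sel = idx == i
        if sel.sum() > 100:                            # skip sparse bins
            tc.append(0.5 * (bins[i - 1] + bins[i]))
            p99.append(np.quantile(P[sel], 0.99))      # extreme intensity per bin

    slope = np.polyfit(tc, np.log(p99), 1)[0]          # exponential fit
    print(f"scaling rate: {100 * (np.exp(slope) - 1):.1f} %/K")  # ~7 %/K
    ```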

  19. Workshop Physics Activity Guide, Module 2: Mechanics II, Momentum, Energy, Rotational and Harmonic Motion, and Chaos (Units 8 - 15)

    NASA Astrophysics Data System (ADS)

    Laws, Priscilla W.

    2004-05-01

    The Workshop Physics Activity Guide is a set of student workbooks designed to serve as the foundation for a two-semester calculus-based introductory physics course. It consists of 28 units that interweave text materials with activities that include prediction, qualitative observation, explanation, equation derivation, mathematical modeling, quantitative experiments, and problem solving. Students use a powerful set of computer tools to record, display, and analyze data, as well as to develop mathematical models of physical phenomena. The design of many of the activities is based on the outcomes of physics education research. The Workshop Physics Activity Guide is supported by an Instructor's Website that: (1) describes the history and philosophy of the Workshop Physics Project; (2) provides advice on how to integrate the Guide into a variety of educational settings; (3) provides information on computer tools (hardware and software) and apparatus; and (4) includes suggested homework assignments for each unit. Log on to the Workshop Physics Project website at http://physics.dickinson.edu/. Workshop Physics is a component of the Physics Suite--a collection of materials created by a group of educational reformers known as the Activity Based Physics Group. The Physics Suite contains a broad array of curricular materials that are based on physics education research, including:

      Understanding Physics, by Cummings, Laws, Redish and Cooney (an introductory textbook based on the best-selling text by Halliday/Resnick/Walker)
      RealTime Physics Laboratory Modules
      Physics by Inquiry (intended for use in a workshop setting)
      Interactive Lecture Demonstrations
      Tutorials in Introductory Physics
      Activity Based Tutorials (designed primarily for use in recitations)

    • [Upper extremities, neck and back symptoms in office employees working at computer stations].

      PubMed

      Zejda, Jan E; Bugajska, Joanna; Kowalska, Małgorzata; Krzych, Lukasz; Mieszkowska, Marzena; Brozek, Grzegorz; Braczkowska, Bogumiła

      2009-01-01

      To obtain current data on the occurrence of work-related symptoms among office computer users in Poland, we implemented a questionnaire survey. Its goal was to assess the prevalence and intensity of symptoms of the upper extremities, neck and back in office workers who use computers on a regular basis, and to find out whether the occurrence of symptoms depends on the duration of computer use and other work-related factors. Office workers in two towns (Warszawa and Katowice), employed in large social services companies, were invited to fill in the Polish version of the Nordic Questionnaire. The questions covered work history and the history of last-week pain symptoms of the hand/wrist, elbow, arm, neck, and upper and lower back (occurrence and intensity measured by visual scale). Altogether 477 men and women returned completed questionnaires. Between-group symptom differences (chi-square test) were verified by multivariate analysis (GLM). The prevalence of symptoms in individual body parts was as follows: neck, 55.6%; arm, 26.9%; elbow, 13.3%; wrist/hand, 29.9%; upper back, 49.6%; and lower back, 50.1%. Multivariate analysis confirmed the effect of gender, age and years of computer use on the occurrence of symptoms. Among other determinants, forearm support explained wrist/hand pain, wrist support explained elbow pain, and chair adjustment explained arm pain. An association was also found between low back pain and chair adjustment and keyboard position. The findings revealed a frequent occurrence of pain symptoms in the upper extremities and neck in office workers who use computers on a regular basis. Seating position could also contribute to the frequent occurrence of back pain in the examined population.

  1. NASA Workshop on Computational Structural Mechanics 1987, part 2

    NASA Technical Reports Server (NTRS)

    Sykes, Nancy P. (Editor)

    1989-01-01

    Advanced methods and testbed/simulator development topics are discussed. Computational Structural Mechanics (CSM) testbed architecture, engine structures simulation, applications to laminate structures, and a generic element processor are among the topics covered.

  2. Excellence in Computational Biology and Informatics — EDRN Public Portal

    Cancer.gov

    9th Early Detection Research Network (EDRN) Scientific Workshop. Excellence in Computational Biology and Informatics: Sponsored by the EDRN Data Sharing Subcommittee Moderator: Daniel Crichton, M.S., NASA Jet Propulsion Laboratory

  3. Final report for Conference Support Grant "From Computational Biophysics to Systems Biology - CBSB12"

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hansmann, Ulrich H.E.

    2012-07-02

    This report summarizes the outcome of the international workshop From Computational Biophysics to Systems Biology (CBSB12), which was held June 3-5, 2012, at the University of Tennessee Conference Center in Knoxville, TN, and supported by DOE through Conference Support Grant 120174. The purpose of CBSB12 was to provide a forum for interaction between a data-mining-oriented systems biology community and a simulation- and first-principles-oriented computational biophysics/biochemistry community. CBSB12 was the sixth in a series of workshops of the same name organized in recent years, and the second held in the USA. As in previous years, it gave researchers from physics, biology, and computer science an opportunity to acquaint one another with current trends in computational biophysics and systems biology, to explore venues of cooperation, and to establish together a detailed understanding of cells at a molecular level. The conference grant of $10,000 was used to cover registration fees and provide travel fellowships to selected students and postdoctoral scientists. By educating graduate students and providing a forum for young scientists to perform research into the working of cells at a molecular level, the workshop added to DOE's mission of paving the way to exploit the abilities of living systems to capture, store and utilize energy.

  4. 77 FR 12823 - Advanced Scientific Computing Advisory Committee

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-03-02

    ... Exascale ARRA projects--Magellan final report, Advanced Networking update Status from Computer Science COV Early Career technical talks Summary of Applied Math and Computer Science Workshops ASCR's new SBIR..., Office of Science. ACTION: Notice of Open Meeting. SUMMARY: This notice announces a meeting of the...

  5. Teaching Computer Literacy with Freeware and Shareware.

    ERIC Educational Resources Information Center

    Hobart, R. Dale; And Others

    1988-01-01

    Describes workshops given at Ferris State University for faculty and staff who want to acquire computer skills. Considered are a computer literacy workshop and a software toolkit, assembled from public domain/shareware resources, that was distributed to participants. Stresses the benefits of shareware as an educational resource. (CW)

  6. Large Scale Meteorological Pattern of Extreme Rainfall in Indonesia

    NASA Astrophysics Data System (ADS)

    Kuswanto, Heri; Grotjahn, Richard; Rachmi, Arinda; Suhermi, Novri; Oktania, Erma; Wijaya, Yosep

    2014-05-01

    Extreme Weather Events (EWEs) cause negative social, economic, and environmental impacts. Considering these facts, forecasting EWEs is crucial work. Indonesia has been identified as being among the countries most vulnerable to the risk of natural disasters, such as floods, heat waves, and droughts. Current forecasting of extreme events in Indonesia is carried out by interpreting synoptic maps for several fields without taking into account the link between the observed events in the 'target' area and remote conditions. This situation may cause misidentification of the event, leading to an inaccurate prediction. Grotjahn and Faure (2008) compute composite maps from extreme events (including heat waves and intense rainfall) to help forecasters identify such events in model output. The composite maps show large-scale meteorological patterns (LSMPs) that occurred during historical EWEs. Vital information about the EWEs can be acquired from studying such maps, in addition to their providing forecaster guidance. Such maps have robust mid-latitude meteorological patterns (for Sacramento and California Central Valley, USA, EWEs). We study the performance of the composite approach for tropical weather conditions such as Indonesia's. Initially, the composite maps are developed to identify and forecast extreme weather events in Indramayu district, West Java, the main rice-producing area in Indonesia, which contributes about 60% of the national total rice production. Studying extreme weather events happening in Indramayu is important since EWEs there affect national agricultural and fisheries activities. During a recent EWE, more than a thousand houses in Indramayu suffered serious flooding, with each home more than one meter underwater. The flood also destroyed a thousand hectares of rice plantings in 5 regencies. Identifying the dates of extreme events is one of the most important steps and has to be carried out carefully. An approach involving observations from multiple sites (rain gauges) has been applied to identify the dates. The approach combines POT (Peaks Over Threshold) with 'declustering' of the data to approximate independence, based on the autocorrelation structure of each rainfall series. The cross correlation among sites is also considered in developing the event criteria, yielding a rational choice of the extreme dates given the 'spotty' nature of the intense convection. Based on the identified dates, we are developing a supporting tool for forecasting extreme rainfall based on the corresponding large-scale meteorological patterns (LSMPs). The LSMP methodology focuses on the larger-scale patterns that models are better able to forecast, as those larger-scale patterns create the conditions fostering the local EWE. A bootstrap resampling method is applied to highlight the key features that are statistically significant with respect to the extreme events. Grotjahn, R., and G. Faure, 2008: Composite Predictor Maps of Extraordinary Weather Events in the Sacramento, California Region. Weather and Forecasting, 23, 313-335.
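
    The POT-plus-declustering step can be made concrete with a small sketch (simulated rainfall; a fixed run gap stands in for the autocorrelation-based separation the abstract describes):

    ```python
    import numpy as np

    rng = np.random.default_rng(42)
    rain = rng.gamma(0.4, 8.0, 3650)        # 10 years of synthetic daily rainfall

    threshold = np.quantile(rain, 0.98)     # POT threshold (98th percentile)
    run_gap = 3                             # days of separation implying independence

    exceed = np.flatnonzero(rain > threshold)
    events, cluster = [], [exceed[0]]
    for day in exceed[1:]:
        if day - cluster[-1] <= run_gap:
            cluster.append(day)             # same cluster of dependent wet days
        else:
            events.append(cluster[np.argmax(rain[cluster])])  # keep cluster peak
            cluster = [day]
    events.append(cluster[np.argmax(rain[cluster])])

    print(len(exceed), "exceedances ->", len(events), "declustered event dates")
    ```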

  7. Introducing Hospital Staff to Computer Concepts: An Educational Program

    PubMed Central

    Kaplan, Bonnie

    1981-01-01

    An in-house computer education program for hospital staff ran for two years at a large, metropolitan hospital. The program drew physicians, administrators, department heads, secretaries, technicians, and data managers to courses, seminars, and workshops on medical computing. Two courses, an introduction to computer concepts and a programming course, are described and evaluated.

  8. Computer Aided Design: Instructional Manual. The North Dakota High Technology Mobile Laboratory Project.

    ERIC Educational Resources Information Center

    Cheng, Wan-Lee

    This instructional manual contains 12 learning activity packets for use in a workshop in computer-aided design and drafting (CADD). The lessons cover the following topics: introduction to computer graphics and computer-aided design/drafting; coordinate systems; advanced space graphics hardware configuration and basic features of the IBM PC…

  9. Outcome of a Workshop on Applications of Protein Models in Biomedical Research

    PubMed Central

    Schwede, Torsten; Sali, Andrej; Honig, Barry; Levitt, Michael; Berman, Helen M.; Jones, David; Brenner, Steven E.; Burley, Stephen K.; Das, Rhiju; Dokholyan, Nikolay V.; Dunbrack, Roland L.; Fidelis, Krzysztof; Fiser, Andras; Godzik, Adam; Huang, Yuanpeng Janet; Humblet, Christine; Jacobson, Matthew P.; Joachimiak, Andrzej; Krystek, Stanley R.; Kortemme, Tanja; Kryshtafovych, Andriy; Montelione, Gaetano T.; Moult, John; Murray, Diana; Sanchez, Roberto; Sosnick, Tobin R.; Standley, Daron M.; Stouch, Terry; Vajda, Sandor; Vasquez, Max; Westbrook, John D.; Wilson, Ian A.

    2009-01-01

    We describe the proceedings and conclusions from a “Workshop on Applications of Protein Models in Biomedical Research” that was held at the University of California, San Francisco, on 11 and 12 July 2008. At the workshop, international scientists involved with structure modeling explored (i) how models are currently used in biomedical research, (ii) what the requirements and challenges for different applications are, and (iii) how the interaction between the computational and experimental research communities could be strengthened to advance the field. PMID:19217386

  10. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Watts, Bernadette M.; Collis, Samuel Scott; Ceballos, Deanna Rose

    This report summarizes the activities of the Computer Science Research Institute (CSRI) at Sandia National Laboratories during the period January 1, 2005 to December 31, 2005. During this period, the CSRI hosted 182 visitors representing 83 universities, companies and laboratories. Of these, 60 were summer students or faculty. The CSRI partially sponsored 2 workshops and also organized and was the primary host for 3 workshops. These 3 CSRI sponsored workshops had 105 participants, 78 from universities, companies and laboratories, and 27 from Sandia. Finally, the CSRI sponsored 12 long-term collaborative research projects and 3 Sabbaticals.

  11. The Fourth Annual Thermal and Fluids Analysis Workshop

    NASA Technical Reports Server (NTRS)

    1992-01-01

    The Fourth Annual Thermal and Fluids Analysis Workshop was held August 17-21, 1992, at NASA Lewis Research Center. The workshop consisted of classes, vendor demonstrations, and paper sessions. The classes and vendor demonstrations provided participants with information on widely used tools for thermal and fluids analysis. The paper sessions provided a forum for the exchange of information and ideas among thermal and fluids analysts. Paper topics included advances in and uses of established thermal and fluids computer codes (such as SINDA and TRASYS) as well as unique modeling techniques and applications.

  12. Teaching Students How to Study: A Workshop on Information Processing and Self-Testing Helps Students Learn

    PubMed Central

    Stanger-Hall, Kathrin F.; Shockley, Floyd W.; Wilson, Rachel E.

    2011-01-01

    We implemented a “how to study” workshop for small groups of students (6–12) for N = 93 consenting students, randomly assigned from a large introductory biology class. The goal of this workshop was to teach students self-regulating techniques with visualization-based exercises as a foundation for learning and critical thinking in two areas: information processing and self-testing. During the workshop, students worked individually or in groups and received immediate feedback on their progress. Here, we describe two individual workshop exercises, report their immediate results, describe students’ reactions (based on the workshop instructors’ experience and student feedback), and report student performance on workshop-related questions on the final exam. Students rated the workshop activities highly and performed significantly better on workshop-related final exam questions than the control groups. This was the case for both lower- and higher-order thinking questions. Student achievement (i.e., grade point average) was significantly correlated with overall final exam performance but not with workshop outcomes. This long-term (10 wk) retention of a self-testing effect across question levels and student achievement is a promising endorsement for future large-scale implementation and further evaluation of this “how to study” workshop as a study support for introductory biology (and other science) students. PMID:21633067

  13. A Decade of Neural Networks: Practical Applications and Prospects

    NASA Technical Reports Server (NTRS)

    Kemeny, Sabrina E.

    1994-01-01

    The Jet Propulsion Laboratory Neural Network Workshop, sponsored by NASA and DOD, brings together sponsoring agencies, active researchers, and the user community to formulate a vision for the next decade of neural network research and application prospects. While the speed and computing power of microprocessors continue to grow at an ever-increasing pace, the demand to intelligently and adaptively deal with the complex, fuzzy, and often ill-defined world around us remains to a large extent unaddressed. Powerful, highly parallel computing paradigms such as neural networks promise to have a major impact in addressing these needs. Papers in the workshop proceedings highlight benefits of neural networks in real-world applications compared to conventional computing techniques. Topics include fault diagnosis, pattern recognition, and multiparameter optimization.

  14. Applications of tethers in space: A review of workshop recommendations

    NASA Technical Reports Server (NTRS)

    Vontiesenhausen, G. (Editor)

    1986-01-01

    Well-organized and structured efforts of considerable magnitude involving NASA, industry, and academia have explored and defined the engineering and technological requirements of the use of tethers in space and have discovered their broad range of operational and economic benefits. These efforts have produced a family of extremely promising candidate applications. The extensive efforts now in progress are gaining momentum, and a series of flight demonstrations is being planned and can be expected to take place within a few years. This report provides an analysis and review of NASA's second major workshop on Applications of Tethers in Space, held October 15-17, 1985, in Venice, Italy. It provides a summary of an up-to-date assessment and recommendations by the NASA Tether Applications in Space Program Planning Group, consisting of representatives of seven NASA Centers and responsible for tether applications program planning and implementation as recommended by the workshop panels.

  15. Learning from the Periphery in a Collaborative Robotics Workshop for Girls

    ERIC Educational Resources Information Center

    Sullivan, Florence R.; Keith, Kevin; Wilson, Nicholas C.

    2016-01-01

    This study investigates how students who are peripherally positioned in computer science-based, collaborative group work meaningfully engage with the group activity in order to learn. Our research took place in the context of a one-day, all-girl robotics workshop, in which the participants were learning to program robotic devices. A total of 17…

  16. The Effect of a Classroom-Based Intensive Robotics and Programming Workshop on Sequencing Ability in Early Childhood

    ERIC Educational Resources Information Center

    Kazakoff, Elizabeth R.; Sullivan, Amanda; Bers, Marina U.

    2013-01-01

    This paper examines the impact of programming robots on sequencing ability during a 1-week intensive robotics workshop at an early childhood STEM magnet school in the Harlem area of New York City. Children participated in computer programming activities using a developmentally appropriate tangible programming language CHERP, specifically designed…

  17. Exploiting HPC Platforms for Metagenomics: Challenges and Opportunities (MICW - Metagenomics Informatics Challenges Workshop: 10K Genomes at a Time)

    ScienceCinema

    Canon, Shane

    2018-01-24

    DOE JGI's Zhong Wang, chair of the High-performance Computing session, gives a brief introduction before Berkeley Lab's Shane Canon talks about "Exploiting HPC Platforms for Metagenomics: Challenges and Opportunities" at the Metagenomics Informatics Challenges Workshop held at the DOE JGI on October 12-13, 2011.

  18. Proceedings of the Annual Southwest Park and Recreation Training Institute. (30th, Kingston, Oklahoma, February 3-6, 1985).

    ERIC Educational Resources Information Center

    Texas Tech Univ., Lubbock. Dept. of Park Administration and Landscape Architecture.

    This monograph presents brief summaries of workshops held at this conference. Workshop topics were: (1) Public Relations--Ideally Cost Effective; (2) Fund Raising through Park Operations; (3) Recent Computer Applications in Colorado and Texas; (4) Enterprise Funds: Funding for Enterprising Departments; (5) Coping with Change; (6) Implementing…

  19. Insights and Perspectives on Emerging Inputs to Weight of Evidence Determinations for Food Safety: Workshop Proceedings

    PubMed Central

    Bialk, Heidi; Llewellyn, Craig; Kretser, Alison; Canady, Richard; Lane, Richard; Barach, Jeffrey

    2013-01-01

    This workshop aimed to elucidate the contribution of computational and emerging in vitro methods to the weight of evidence used by risk assessors in food safety assessments. The following issues were discussed: using in silico and high-throughput screening (HTS) data to confirm the safety of approved food ingredients, applying in silico and HTS data in the process of assessing the safety of a new food ingredient, and utilizing in silico and HTS data in communicating the safety of food ingredients while enhancing the public’s trust in the food supply. Perspectives on integrating computational modeling and HTS assays as well as recommendations for optimizing predictive methods for risk assessment were also provided. Given the need to act quickly or proceed cautiously as new data emerge, this workshop also focused on effectively identifying a path forward in communicating in silico and in vitro data. PMID:24296863

  20. Tools for Embedded Computing Systems Software

    NASA Technical Reports Server (NTRS)

    1978-01-01

    A workshop was held to assess the state of tools for embedded systems software and to determine directions for tool development. A synopsis of the talk and key figures from each workshop presentation, together with the chairmen's summaries, are presented. The presentations covered four major areas: (1) tools and the software environment (development and testing); (2) tools and software requirements, design, and specification; (3) tools and language processors; and (4) tools and verification and validation (analysis and testing). The utility and contribution of existing tools and research results for the development and testing of embedded computing systems software are described and assessed.

  1. Proceedings of the Association for Computing Machinery Special Interest Group for Ada Artificial Intelligence Working Group, 1992 Summer Workshop Held in Seattle, Washington on June 24-27, 1992

    DTIC Science & Technology

    1993-06-01

    June 1993. Special Interest Group for Ada Artificial Intelligence Working Group, 1992 Summer Workshop. Janet Faye Johns. Approved for public release; distribution unlimited. MITRE, Bedford, Massachusetts.

  2. The active movement scale: an evaluative tool for infants with obstetrical brachial plexus palsy.

    PubMed

    Curtis, Christine; Stephens, Derek; Clarke, Howard M; Andrews, David

    2002-05-01

    Newborns with peripheral nerve lesions involving the upper extremity are difficult to evaluate. The reliability of the Active Movement Scale (AMS), a tool for assessing motor function in infants with obstetrical brachial plexus palsy (OBPP), was examined in 2 complementary studies. Part A was an interrater reliability study in which 63 infants younger than 1 year with OBPP were independently evaluated by 2 physical therapists using the AMS. The scores were compared for reliability and controlled for chance agreement by using kappa statistics. Overall kappa analysis of the 15 tested movements showed a moderate strength of score agreement (kappa = 0.51). Quadratic-weighted kappa (kappa(quad)) statistics showed that 8 of the 15 movements tested were in the highest strength of agreement category (kappa(quad) = 0.81-1.00). Five movements showed substantial agreement (kappa(quad) = 0.61-0.80), and 2 movements had moderate agreement (kappa(quad) = 0.41-0.60). The overall kappa(quad) was 0.89. Part B was a variability study designed to examine the dispersion of scores when infants with OBPP were evaluated with the AMS by multiple raters. Ten pediatric physical therapists with varying degrees of experience using the scale attended a 1.5-hour instructional workshop on administration of the tool for infants with OBPP. A chain-block study design was used to obtain 30 assessments of 10 infants by 10 raters. A 2-way analysis of variance indicated that the variability of scores due to rater factors was low compared with the variability due to patient factors and that variation in scores due to rater experience was minimal. The results of part A indicate that the AMS is a reliable tool for the assessment of infants with OBPP when raters familiar with the scale are compared. The results of part B suggest that, with minimal training, raters with a range of experience using the AMS are able to reliably evaluate infants with upper-extremity paralysis.
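
    For readers unfamiliar with the agreement statistics used here, a minimal illustration with hypothetical scores (not the study's data), using scikit-learn's implementation of unweighted and quadratic-weighted kappa:

    ```python
    from sklearn.metrics import cohen_kappa_score

    # Two raters score one AMS movement (0-7) for ten infants (invented data).
    rater_a = [7, 6, 5, 7, 3, 4, 6, 7, 2, 5]
    rater_b = [7, 5, 5, 7, 4, 4, 6, 6, 2, 5]

    print(cohen_kappa_score(rater_a, rater_b))                       # kappa
    print(cohen_kappa_score(rater_a, rater_b, weights='quadratic'))  # kappa(quad)
    ```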

  3. Coupling lattice Boltzmann and continuum equations for flow and reactive transport in porous media.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Coon, Ethan; Porter, Mark L.; Kang, Qinjun

    2012-06-18

    In spatially and temporally localized instances, capturing sub-reservoir-scale information is necessary; capturing it everywhere is neither necessary nor computationally possible. The lattice Boltzmann method (LBM) is used for solving pore-scale systems: at the pore scale, LBM provides an extremely scalable, efficient way of solving the Navier-Stokes equations on complex geometries. Pore-scale and continuum-scale systems are coupled via domain decomposition: by leveraging the interpolations implied by the pore-scale and continuum-scale discretizations, overlapping Schwarz domain decomposition is used to ensure continuity of pressure and flux. This approach is demonstrated on a fractured medium, in which the Navier-Stokes equations are solved within the fracture while Darcy's equation is solved away from the fracture. Coupling reactive transport to pore-scale flow simulators allows hybrid approaches to be extended to solve multi-scale reactive transport.
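
    The coupling mechanism can be illustrated in one dimension. The sketch below is a minimal overlapping Schwarz iteration for -u'' = 1 on [0, 1] with two overlapping subdomains; the PDE and grids are illustrative only, not the authors' LBM/Darcy setup, but the exchange of interface values is the same idea used to enforce continuity of pressure (and, at convergence, flux):

    ```python
    import numpy as np

    def solve_dirichlet(f, a, b, ua, ub, n):
        """Solve -u'' = f on [a, b] with u(a)=ua, u(b)=ub (n interior points)."""
        h = (b - a) / (n + 1)
        x = np.linspace(a, b, n + 2)
        A = (np.diag(2.0 * np.ones(n)) - np.diag(np.ones(n - 1), 1)
             - np.diag(np.ones(n - 1), -1)) / h**2
        rhs = f(x[1:-1]).astype(float)
        rhs[0] += ua / h**2                      # fold boundary values into RHS
        rhs[-1] += ub / h**2
        u = np.empty(n + 2)
        u[0], u[-1] = ua, ub
        u[1:-1] = np.linalg.solve(A, rhs)
        return x, u

    f = lambda x: np.ones_like(x)                # exact solution: u = x(1-x)/2
    g_left = g_right = 0.0                       # interface guesses

    for it in range(50):
        xL, uL = solve_dirichlet(f, 0.0, 0.6, 0.0, g_left, 59)
        g_right_new = np.interp(0.4, xL, uL)     # pass value at x=0.4 rightward
        xR, uR = solve_dirichlet(f, 0.4, 1.0, g_right_new, 0.0, 59)
        g_left_new = np.interp(0.6, xR, uR)      # pass value at x=0.6 leftward
        converged = (abs(g_left_new - g_left) < 1e-12
                     and abs(g_right_new - g_right) < 1e-12)
        g_left, g_right = g_left_new, g_right_new
        if converged:
            break

    print(it, g_left, 0.6 * 0.4 / 2)             # interface value vs exact u(0.6)
    ```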

  4. Upper Limb Absence: Predictors of Work Participation and Work Productivity.

    PubMed

    Postema, Sietke G; Bongers, Raoul M; Brouwers, Michael A; Burger, Helena; Norling-Hermansson, Liselotte M; Reneman, Michiel F; Dijkstra, Pieter U; van der Sluis, Corry K

    2016-06-01

    To analyze work participation, work productivity, contributing factors, and physical work demands of individuals with upper limb absence (ULA). Cross-sectional study: postal survey (response rate, 45%). Twelve rehabilitation centers and orthopedic workshops. Individuals (n=207) with unilateral transverse upper limb reduction deficiency (RD) or acquired amputation (AA), at or proximal to the carpal level, between the ages of 18 and 65 years, and a convenience sample of control subjects (n=90) matched on age and sex. Not applicable. Employment status, self-reported work productivity measured with the Quality-Quantity method, and self-reported upper extremity work demands measured with the Upper Extremity Work Demands scale. Seventy-four percent of the individuals with RD and 57% of the individuals with AA were employed (vs 82% of the control group and 66% of the general population). Male sex, younger age, a medium or higher level of education, prosthesis use, and good general health were predictors of work participation. Work productivity was similar to that of the control group. Higher work productivity was inversely related to musculoskeletal complaint-related pain. When having predominantly mentally demanding work, individuals with ULA perceived higher upper extremity work demands compared with controls. Work participation of individuals with RD was slightly higher compared with that of the general population, whereas employment rates of individuals with AA were slightly lower. Furthermore, work productivity did not differ between individuals with RD, AA, and controls.

  5. Applying Andragogy Theory in Photoshop Training Programs

    ERIC Educational Resources Information Center

    Alajlan, Abdulrahman Saad

    2015-01-01

    Andragogy is a strategy for teaching adults that can be applied to Photoshop training. Photoshop workshops are frequented by adult learners, and thus andragogical models for instruction would be extremely helpful for prospective trainers looking to improve their classroom designs. Adult learners are much different than child learners, given the…

  6. EPA's Sustainable Port Communities: Anticipating Changes in Exposures and a Proposal to Facilitate Resilience through Knowledge Flows (MTS Workshop May 2017)

    EPA Science Inventory

    Port Communities Face Many Challenges: • Climate change – Sea Level Rise, Extreme Events: “Assets” become Vulnerabilities; Nuisance flooding; Changes in waste water and stormwater capacity; Changes in near-shore ecology and water quality • Port Exp...

  7. MFGA-IDT2 workshop: Astrophysical and geophysical fluid mechanics: the impact of data on turbulence theories

    NASA Astrophysics Data System (ADS)

    Schertzer, D.; Falgarone, E.

    1 Facts about the Workshop This workshop was convened on November 13-15, 1995, by E. Falgarone and D. Schertzer within the framework of the Groupe de Recherche Mecanique des Fluides Geophysiques et Astrophysiques (GdR MFGA, Research Group of Geophysical and Astrophysical Fluid Mechanics) of the Centre National de la Recherche Scientifique (CNRS, the French National Center for Scientific Research). This Research Group is chaired by A. Babiano, and the meeting was held at the Ecole Normale Superieure, Paris, by courtesy of its Director, E. Guyon. More than sixty attendees participated in this workshop; they came from a large number of institutions and countries across Europe, Canada and the USA. There were twenty-five oral presentations as well as a dozen posters. A copy of the corresponding book of abstracts can be requested from the conveners. The theme of this meeting is somewhat related to the series of Nonlinear Variability in Geophysics conferences (NVAG1, Montreal, Aug. 1986; NVAG2, Paris, June 1988; NVAG3, Cargese (Corsica), September 1993), as well as to seven consecutive annual sessions at EGS general assemblies and two consecutive spring AGU meeting sessions devoted to similar topics. One may note that NVAG3 was a joint American Geophysical Union Chapman and European Geophysical Society Richardson Memorial conference, the first topical conference jointly sponsored by the two organizations. The corresponding proceedings were published in a special NPG issue (Nonlinear Processes in Geophysics 1, 2/3, 1994). In comparison with these previous meetings, MFGA-IDT2 is at once more specialized, in fluid turbulence and its intermittency, and an extension into the fields of astrophysics. Let us add that Nonlinear Processes in Geophysics was readily chosen as the appropriate journal for publication of these proceedings, since this journal was founded in order to develop interdisciplinary fundamental research and corresponding innovative nonlinear methodologies in Geophysics. It had an appropriate editorial structure, in particular a large number of editors covering a wide range of methodologies, expertise and schools. At least two of its sections (Scaling and Multifractals, Turbulence and Diffusion) were directly related to the topics of the workshop; in any case, contributors were invited to choose their editor freely. 2 Goals of the Workshop The objective of this meeting was to enhance the confrontation between turbulence theories and empirical data from geophysical and astrophysical fluids with very high Reynolds numbers. The importance of these data for the evaluation of theories of fully developed turbulence seems to have often been underestimated, presumably because turbulence there does not appear as pure as in laboratory experiments. However, such data have the great advantage of giving access not only to very high Reynolds numbers (e.g. 10^12 for atmospheric data), but also to very large data sets. It was intended to: (i) provide an overview of the diversity of potentially available data, as well as the necessary theoretical and statistical developments for a better use of these data (e.g. treatment of anisotropy, role of processes which induce other nonlinearities such as thermal instability, effect of magnetic field and compressibility ... ), (ii) evaluate the means of discriminating between different theories (e.g. multifractal intermittency models) or to better appreciate the relevance of different notions (e.g. Self-Organized Criticality) or phenomenology (e.g.
filaments, structures), (iii) emphasise the different obstacles, such as the ubiquity of catastrophic events, which could be overcome in the various disciplines concerned, thanks to the theoretical advances achieved. 3 Outlines of the Workshop During the two days of the workshop, the series of presentations covered many manifestations of turbulence in geophysics, including oceans, the troposphere, the stratosphere, the very high atmosphere, the solar wind, giant planets, and interstellar clouds... up to the very large scale of the Universe. The presentations and the round table at the end of the workshop pointed out the following: - the necessity of this type of confrontation, which brings together numerical simulations, laboratory experiments, and phenomenology, as well as a very large diversity of geophysical and astrophysical data; - presumably a relative need for new geophysical data, whereas there have been recent astrophysical experiments which yield interesting data and exciting questions; - the need to develop a closer intercomparison between various intermittency models (in particular Log-Poisson/Log-Levy models). Two main questions were underlined, in particular during the round table: - the behaviour of the extremes of intermittency, in particular the question of divergence or convergence of the highest statistical moments (equivalently, do the probability distributions have algebraic or more rapid falloffs?); - the extension of scaling ranges; in other words, do we need to divide geophysics and astrophysics into many small (nearly) isotropic subranges, or is it sufficient to use anisotropic scaling notions over wider ranges? 4 The contributions in this special issue Recalling that some of the most useful insights into the nature of turbulence in fluids have come from observations of geophysical flows, Van Atta gives a review of the impact of geophysical turbulence data on theories. His paper starts from Taylor's inference of the near isotropy of atmospheric turbulence and the corresponding elegant development by von Karman of the theory of isotropic turbulence, and proceeds to underline the fact that the observed extremely large intermittency in geophysical turbulence also raised new fundamental questions for turbulence theory. The paper discusses the potential contribution to theoretical development from available geophysical turbulence measurements and those currently being made, as well as from some recent laboratory measurements and direct numerical simulations of stably stratified turbulent shear flows. Seuront et al. consider scaling and multiscaling properties of scalar fields (temperature and phytoplankton concentration) advected by oceanic turbulence in both Eulerian and Lagrangian frameworks. Despite the apparent complexity linked to a multifractal background, the temperature and fluorescence (i.e. phytoplankton biomass surrogate) fields are expressed over a wide range of scales by only three universal multifractal parameters, H, α and C_1. On scales smaller than the characteristic scale of the ship, sampling is rather Eulerian. On larger scales, the drifting platform being advected by turbulent motions, sampling may rather be considered Lagrangian. The observed Eulerian and Lagrangian universal multifractal properties of the physical and biological fields are discussed. Whereas theoretical models provide different scaling laws for fluid and MHD turbulent flows, no attempt had been made until now to find experimental evidence for these differences. Carbone et al.
use measurements from solar wind turbulence and from turbulence in ordinary fluid flows in order to assess these differences. They show that the so-called Extended Self-Similarity (ESS) is evident in solar wind turbulence up to a certain scale. Furthermore, up to a given order of the velocity structure functions, the scaling laws of MHD and fluid flows are experimentally indistinguishable. However, differences can be observed for higher orders, and the authors speculate on their origin. Dudok de Wit and Krasnosel'skikh present an analysis of strong plasma turbulence in the vicinity of the Earth's bow shock with the help of magnetometer data from the AMPTE UKS satellite. They demonstrate that there is a departure from Gaussianity which could be a signature of multifractality. However, they point out that the complexity of plasma turbulence precludes a more quantitative understanding. Finally, the authors emphasise the fact that the duration of the records prevents obtaining any reliable estimate of structure functions beyond the fourth order. Sylos Labini and Pietronero discuss the problem of galaxy correlations. They conclude from all the recently available three-dimensional catalogues that the distribution of galaxies and clusters is fractal with dimension D ~ 2 up to the present observational limits, without any tendency towards homogenization. This result is discussed in contrast to angular data analysis. Furthermore, they point out that the galaxy-cluster mismatch disappears when considering a multifractal distribution of matter. They emphasise that a new picture emerges which changes the standard ideas about the properties of the universe and requires a corresponding change in the related theoretical concepts. Chilla et al. use a laboratory experiment to investigate the possible influence of a large-scale structure on the intermittency of small-scale structures. They study a flow between coaxial co-rotating disks generating a strong axial vortex over a turbulent background. They show that the cascade process is preserved, although strongly modified, and they discuss the relevance of parameters developed for the description of intermittency in homogeneous turbulence for evaluating this modification.

  8. Narrow-Line Seyfert 1 Galaxies

    NASA Technical Reports Server (NTRS)

    Leighly, Karen M.

    2000-01-01

    The primary work during this year has been the analysis and interpretation of our HST spectra from two extreme Narrow-line Seyfert 1 galaxies (NLS1s) Infrared Astronomy Satellite (IRAS) 13224-3809 and 1H 0707-495. This work has been presented as an invited talk at the workshop entitled "Observational and theoretical progress in the Study of Narrow-line Seyfert 1 Galaxies" held in Bad Honnef, Germany December 8-11, as a contributed talk at the January 2000 AAS meeting in Atlanta, Georgia, and as a contributed talk at the workshop "Probing the Physics of Active Galactic Nuclei by Multiwavelength Monitoring" held at Goddard Space Flight Center June 20-22, 2000.

  9. Simulating the Thermal Response of High Explosives on Time Scales of Days to Microseconds

    NASA Astrophysics Data System (ADS)

    Yoh, Jack J.; McClelland, Matthew A.

    2004-07-01

    We present an overview of computational techniques for simulating the thermal cookoff of high explosives using a multi-physics hydrodynamics code, ALE3D. Recent improvements to the code have aided our computational capability in modeling the response of energetic materials systems exposed to extreme thermal environments, such as fires. We consider an idealized model process for a confined explosive involving the transition from slow heating to rapid deflagration in which the time scale changes from days to hundreds of microseconds. The heating stage involves thermal expansion and decomposition according to an Arrhenius kinetics model while a pressure-dependent burn model is employed during the explosive phase. We describe and demonstrate the numerical strategies employed to make the transition from slow to fast dynamics.
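
    The slow-heating-to-runaway behavior can be sketched with a zero-dimensional model (illustrative only -- this is not ALE3D, and the Arrhenius parameters below are invented, not calibrated to any real explosive):

    ```python
    import numpy as np
    from scipy.integrate import solve_ivp

    R = 8.314        # J/(mol K)
    A = 1.0e12       # 1/s, pre-exponential factor (assumed)
    Ea = 1.5e5       # J/mol, activation energy (assumed)
    Q = 2.0e6        # J/kg, heat of reaction (assumed)
    c = 1.0e3        # J/(kg K), specific heat (assumed)
    ramp = 0.005     # K/s, slow external heating (a fire-like thermal ramp)

    def rhs(t, y):
        T, lam = y                                   # temperature, burn fraction
        rate = A * (1.0 - lam) * np.exp(-Ea / (R * T))
        return [ramp + (Q / c) * rate, rate]         # self-heating feeds back on T

    runaway = lambda t, y: y[0] - 1000.0             # stop once T crosses 1000 K
    runaway.terminal = True

    sol = solve_ivp(rhs, [0.0, 5.0e5], [300.0, 0.0],
                    events=runaway, method='LSODA')
    print(f"time to thermal runaway: {sol.t[-1] / 3600:.1f} h")
    ```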

  10. Variability in the Propagation Phase of CFD-Based Noise Prediction: Summary of Results From Category 8 of the BANC-III Workshop

    NASA Technical Reports Server (NTRS)

    Lopes, Leonard; Redonnet, Stephane; Imamura, Taro; Ikeda, Tomoaki; Zawodny, Nikolas; Cunha, Guilherme

    2015-01-01

    The usage of Computational Fluid Dynamics (CFD) in noise prediction has typically been a two-part process: accurately predicting the flow conditions in the near field, then propagating the noise from the near field to the observer. Due to the increase in computing power and the cost benefit when weighed against wind tunnel testing, the usage of CFD to estimate the local flow field of complex geometrical structures has become more routine. Recently, the Benchmark problems in Airframe Noise Computation (BANC) workshops have provided a community focus on accurately simulating the local flow field near the body with various CFD approaches. However, to date, little effort has gone into assessing the impact of the propagation phase of noise prediction. This paper includes results from the BANC-III workshop, which explored variability in the propagation phase of CFD-based noise prediction through two test cases: an analytical solution of a quadrupole source near a sphere and a computational solution around a nose landing gear. Agreement between three codes was very good for the analytic test case, but CFD-based noise predictions indicate that the propagation phase can introduce 3 dB or more of variability in noise predictions.

  11. NASA Workshop on Computational Structural Mechanics 1987, part 3

    NASA Technical Reports Server (NTRS)

    Sykes, Nancy P. (Editor)

    1989-01-01

    Computational Structural Mechanics (CSM) topics are explored. Algorithms and software for nonlinear structural dynamics, concurrent algorithms for transient finite element analysis, computational methods and software systems for dynamics and control of large space structures, and the use of multi-grid for structural analysis are discussed.

  12. A workshop report on HIV mHealth synergy and strategy meeting to review emerging evidence-based mHealth interventions and develop a framework for scale-up of these interventions

    PubMed Central

    Karanja, Sarah; Mbuagbaw, Lawrence; Ritvo, Paul; Law, Judith; Kyobutungi, Catherine; Reid, Graham; Ram, Ravi; Estambale, Benson; Lester, Richard

    2011-01-01

    mHealth is a term used to refer to mobile technologies, such as personal digital assistants and mobile phones, for healthcare. mHealth initiatives to support the care and treatment of patients are emerging globally, and this workshop brought together researchers, policy makers, information and communication technology programmers, academics and civil society representatives for a one-and-a-half-day synergy meeting in Kenya to review regional evidence-based mHealth research for HIV care and treatment, review mHealth technologies for adherence and retention interventions in anti-retroviral therapy (ART) programs, and develop a framework for scale-up of evidence-based mHealth interventions. The workshop was held in May 2011 in Nairobi, Kenya, and was funded by the Canadian Global Health Research Initiatives (GHRI) and the US Centers for Disease Control and Prevention (CDC). At the end of the workshop, participants came up with a framework to guide mHealth initiatives in the region and a plan to work together in scaling up evidence-based mHealth interventions. The participants acknowledged the importance of the meeting in setting the pace for strengthening and coordinating mHealth initiatives and unanimously agreed to hold a follow-up meeting after three months. PMID:22187619

  13. A workshop report on HIV mHealth synergy and strategy meeting to review emerging evidence-based mHealth interventions and develop a framework for scale-up of these interventions.

    PubMed

    Karanja, Sarah; Mbuagbaw, Lawrence; Ritvo, Paul; Law, Judith; Kyobutungi, Catherine; Reid, Graham; Ram, Ravi; Estambale, Benson; Lester, Richard

    2011-01-01

    mHealth refers to the use of mobile technologies, such as personal digital assistants and mobile phones, for healthcare. mHealth initiatives to support patient care and treatment are emerging globally, and this workshop brought together researchers, policy makers, information and communication technology programmers, academics and civil society representatives for a one-and-a-half-day synergy meeting in Kenya to review regional evidence-based mHealth research for HIV care and treatment, review mHealth technologies for adherence and retention interventions in anti-retroviral therapy (ART) programs, and develop a framework for scale-up of evidence-based mHealth interventions. The workshop was held in May 2011 in Nairobi, Kenya and was funded by the Canadian Global Health Research Initiative (GHRI) and the US Centers for Disease Control and Prevention (CDC). At the end of the workshop, participants produced a framework to guide mHealth initiatives in the region and a plan to work together in scaling up evidence-based mHealth interventions. The participants acknowledged the importance of the meeting in setting the pace for strengthening and coordinating mHealth initiatives and unanimously agreed to hold a follow-up meeting after three months.

  14. Resources and Approaches for Teaching Quantitative and Computational Skills in the Geosciences and Allied Fields

    NASA Astrophysics Data System (ADS)

    Orr, C. H.; Mcfadden, R. R.; Manduca, C. A.; Kempler, L. A.

    2016-12-01

    Teaching with data, simulations, and models in the geosciences can increase many facets of student success in the classroom and in the workforce. Teaching undergraduates about programming and improving students' quantitative and computational skills expands their perception of Geoscience beyond field-based studies. Processing data and developing quantitative models are critically important for Geoscience students: they need to be able to perform calculations, analyze data, create numerical models and visualizations, and more deeply understand complex systems, all essential aspects of modern science. These skills require students to have comfort and skill with languages and tools such as MATLAB, and such computational and quantitative thinking must build over a 4-year degree program across courses and disciplines. However, in courses focused on Geoscience content it can be challenging to get students comfortable with using computational methods to answer Geoscience questions. To help bridge this gap, we have partnered with MathWorks to develop two workshops focused on collecting and developing strategies and resources that help faculty teach students to incorporate data, simulations, and models into the curriculum at the course and program levels. We brought together faculty members from the sciences, including Geoscience and allied fields, who teach computation and quantitative thinking skills using MATLAB, to build a resource collection for teaching. These materials and the outcomes of the workshops are freely available on our website. The workshop outcomes include a collection of teaching activities, essays, and course descriptions that can help faculty incorporate computational skills at the course or program level. The teaching activities include in-class assignments, problem sets, labs, projects, and toolboxes, ranging from programming assignments to creating and using models. The outcomes also include workshop syntheses that highlight best practices, a set of webpages supporting teaching with software such as MATLAB, and an interest group actively discussing these issues in Geoscience and allied fields. Learn more and view the resources at http://serc.carleton.edu/matlab_computation2016/index.html

  15. Summary of Cumulus Parameterization Workshop

    NASA Technical Reports Server (NTRS)

    Tao, Wei-Kuo; Starr, David OC.; Hou, Arthur; Newman, Paul; Sud, Yogesh

    2002-01-01

    A workshop on cumulus parameterization took place at the NASA Goddard Space Flight Center on December 3-5, 2001. The major objectives of this workshop were (1) to review the problem of representing moist processes in large-scale models (mesoscale models, numerical weather prediction models and atmospheric general circulation models), (2) to review the state of the art in cumulus parameterization schemes, and (3) to discuss needs for future research and applications. There were a total of 31 presentations and about 100 participants from the United States, Japan, the United Kingdom, France and South Korea. The specific presentations and discussions during the workshop are summarized in this paper.

  16. The Astronomy Workshop

    NASA Astrophysics Data System (ADS)

    Hamilton, D. P.; Asbury, M. L.; Proctor, A.

    2001-12-01

    The Astronomy Workshop (http://janus.astro.umd.edu) is an interactive online astronomy resource developed and maintained at the University of Maryland for use by students, educators and the general public. It has been extensively tested and used successfully at many different levels, including high school and junior high school science classes, university introductory astronomy courses, and university intermediate and advanced astronomy courses. Topics currently covered in the Astronomy Workshop include: Animated Orbits of Planets and Moons: the orbits of the nine planets and 91 known planetary satellites are shown in animated, to-scale drawings; the orbiting bodies move at their correct relative speeds about their parent, which is rendered as an attractive, to-scale GIF image. Solar System Collisions: this most popular of our applications shows what happens when an asteroid or comet with user-defined size and speed impacts a given planet; the program calculates many effects, including the country impacted (if Earth is the target), the energy of the explosion, the crater size, and the magnitude of the planetquake generated, and it displays a relevant image (e.g., a terrestrial crater, a lunar crater, etc.). Planetary and Satellite Data Calculators: these tools allow the user to easily calculate physical data for all of the planets or satellites simultaneously, making comparison very easy. Orbital Simulations: these tools allow the student to investigate different aspects of the three-body problem of celestial mechanics. Astronomy Workshop Bulletin Board: get innovative teaching ideas, read about in-class experiences with the Astronomy Workshop, and share your ideas with other educators by posting on the Bulletin Board. Funding for the Astronomy Workshop is provided by the National Science Foundation.
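
    To illustrate the kind of computation the Solar System Collisions tool performs (a sketch only: the impactor density and the megaton conversion below are assumed round numbers, not values taken from the site itself):

      import math

      # Kinetic energy of a spherical impactor: E = (1/2) m v^2,
      # with mass from an assumed bulk density.
      def impact_energy_joules(diameter_m, speed_m_s, density_kg_m3=3000.0):
          radius = diameter_m / 2.0
          mass = density_kg_m3 * (4.0 / 3.0) * math.pi * radius ** 3
          return 0.5 * mass * speed_m_s ** 2

      # A 100 m rocky impactor arriving at 20 km/s:
      energy = impact_energy_joules(diameter_m=100.0, speed_m_s=20_000.0)
      print(f"Impact energy: {energy:.2e} J "
            f"(~{energy / 4.184e15:.0f} megatons TNT)")  # 1 Mt = 4.184e15 J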

  17. Emotional intelligence and coping styles: An intervention in geriatric nurses.

    PubMed

    Sarabia-Cobo, Carmen María; Suárez, Soraya González; Menéndez Crispín, Ernesto J; Sarabia Cobo, A Belén; Pérez, Victoria; de Lorena, Pablo; Rodríguez Rodríguez, Cristina; Sanlúcar Gross, Laura

    2017-06-01

    Current research indicates a relationship between emotional intelligence (EI), stress, coping strategies, well-being and mental health. EI skills and knowledge, and coping strategies, can be increased with training. The aims of this study were to use a controlled design to test the impact of theoretically based training on the different components of EI and coping styles in a sample of nurses working with older adults. A group of 92 professionals (RNs and CNAs) who attended a workshop on EI were included in the study. They completed a self-reported measure of EI and coping styles on three occasions: pre-workshop, post-workshop and at one-year follow-up. The EI workshop consisted of four 4-hour sessions conducted over a four-week period, with sessions held at one-week intervals; this interval allowed participants to apply what was taught during each session to their daily lives. The instruments used to measure EI and coping were the Trait Meta-Mood Scale and the CAE test. There were significant differences between the pre- and post-workshop measures, both at the end of the workshop and up to one year later, for both the Trait Meta-Mood Scale scores and the CAE test: there was a significant increase in EI and coping styles after the workshop and one year thereafter. The workshop was useful for developing EI in these professionals. The immediate impact on participants' emotional consciousness was particularly significant, and the long-term impact was notable for the significant increase in EI and most coping styles. Copyright © 2017 Elsevier Inc. All rights reserved.

  18. 16th IHIW: analysis of HLA population data, with updated results for 1996 to 2012 workshop data (AHPD project report).

    PubMed

    Riccio, M E; Buhler, S; Nunes, J M; Vangenot, C; Cuénod, M; Currat, M; Di, D; Andreani, M; Boldyreva, M; Chambers, G; Chernova, M; Chiaroni, J; Darke, C; Di Cristofaro, J; Dubois, V; Dunn, P; Edinur, H A; Elamin, N; Eliaou, J-F; Grubic, Z; Jaatinen, T; Kanga, U; Kervaire, B; Kolesar, L; Kunachiwa, W; Lokki, M L; Mehra, N; Nicoloso, G; Paakkanen, R; Voniatis, D Papaioannou; Papasteriades, C; Poli, F; Richard, L; Romón Alonso, I; Slavčev, A; Sulcebe, G; Suslova, T; Testi, M; Tiercy, J-M; Varnavidou, A; Vidan-Jeras, B; Wennerström, A; Sanchez-Mazas, A

    2013-02-01

    We present here the results of the Analysis of HLA Population Data (AHPD) project of the 16th International HLA and Immunogenetics Workshop (16IHIW) held in Liverpool in May-June 2012. Thanks to the collaboration of 25 laboratories from 18 different countries, HLA genotypic data for 59 new population samples (either well-defined populations or donor registry samples) were gathered and 55 were analysed statistically following HLA-NET recommendations. The new data included, among others, large sets of well-defined populations from north-east Europe and West Asia, as well as many donor registry data from European countries. The Gene[rate] computer tools were combined to create a Gene[rate] computer pipeline to automatically (i) estimate allele frequencies by an expectation-maximization algorithm accommodating ambiguities, (ii) estimate heterozygosity, (iii) test for Hardy-Weinberg equilibrium (HWE), (iv) test for selective neutrality, (v) generate frequency graphs and summary statistics for each sample at each locus and (vi) plot multidimensional scaling (MDS) analyses comparing the new samples with previous IHIW data. Intrapopulation analyses show that HWE is rarely rejected, while neutrality tests often indicate a significant excess of heterozygotes compared with neutral expectations. The comparison of the 16IHIW AHPD data with data collected during previous workshops (12th-15th) shows that geography is an excellent predictor of HLA genetic differentiations for HLA-A, -B and -DRB1 loci but not for HLA-DQ, whose patterns are probably more influenced by natural selection. In Europe, HLA genetic variation clearly follows a north to south-east axis despite a low level of differentiation between European, North African and West Asian populations. Pacific populations are genetically close to Austronesian-speaking South-East Asian and Taiwanese populations, in agreement with current theories on the peopling of Oceania. Thanks to this project, HLA genetic variation is more clearly defined worldwide and better interpreted in relation to human peopling history and HLA molecular evolution. © 2012 Blackwell Publishing Ltd.
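
    To illustrate the expectation-maximization idea behind step (i), here is a minimal "gene-counting" sketch of allele-frequency estimation with ambiguous typings; it is a toy illustration under Hardy-Weinberg assumptions, not the Gene[rate] pipeline itself:

      from collections import defaultdict

      def em_allele_freqs(observations, n_iter=100):
          """observations: list of candidate-genotype lists, each genotype an
          (allele_a, allele_b) pair; ambiguity = more than one candidate."""
          alleles = {a for obs in observations for g in obs for a in g}
          freqs = {a: 1.0 / len(alleles) for a in alleles}  # uniform start
          for _ in range(n_iter):
              counts = defaultdict(float)
              for obs in observations:
                  # E-step: weight each candidate genotype by its HWE
                  # probability (p^2 for homozygotes, 2pq for heterozygotes).
                  w = {g: (1 if g[0] == g[1] else 2) * freqs[g[0]] * freqs[g[1]]
                       for g in obs}
                  total = sum(w.values())
                  for (a, b), weight in w.items():
                      counts[a] += weight / total
                      counts[b] += weight / total
              # M-step: normalize expected allele counts into frequencies.
              n = sum(counts.values())
              freqs = {a: c / n for a, c in counts.items()}
          return freqs

      # Two unambiguous A/A typings plus one ambiguous between (A,B) and (A,C):
      obs = [[("A", "A")], [("A", "A")], [("A", "B"), ("A", "C")]]
      print(em_allele_freqs(obs))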

  19. Cross-scale phenological data integration to benefit resource management and monitoring

    USGS Publications Warehouse

    Richardson, Andrew D.; Weltzin, Jake F.; Morisette, Jeffrey T.

    2017-01-01

    Climate change is presenting new challenges for natural resource managers charged with maintaining sustainable ecosystems and landscapes. Phenology, the branch of science dealing with seasonal natural phenomena (bird migration or plant flowering in response to weather changes, for example), bridges the gap between the biosphere and the climate system. Phenological processes operate across scales that span orders of magnitude, from leaf to globe and from days to seasons, making phenology ideally suited to multi-scale, multi-platform data integration and to delivering information at spatial and temporal scales suitable to inform resource management decisions. This report summarizes a workshop held in June 2016 to investigate the opportunities and challenges facing multi-scale, multi-platform integration of phenological data to support natural resource management decision-making.

  20. A Decade-Long European-Scale Convection-Resolving Climate Simulation on GPUs

    NASA Astrophysics Data System (ADS)

    Leutwyler, D.; Fuhrer, O.; Ban, N.; Lapillonne, X.; Lüthi, D.; Schar, C.

    2016-12-01

    Convection-resolving models have proven to be very useful tools in numerical weather prediction and climate research. However, due to their extremely demanding computational requirements, they have so far been limited to short simulations and/or small computational domains. Innovations in supercomputing have led to new machine designs that combine conventional multi-core CPUs with accelerators such as graphics processing units (GPUs). One of the first atmospheric models to be fully ported to GPUs is COSMO, the weather and climate model of the Consortium for Small-Scale Modeling. The GPU-enabled version allows us to expand the simulation domain to areas spanning continents and the simulated period to a full decade. We present results from a decade-long, convection-resolving climate simulation over Europe using the GPU-enabled COSMO version on a computational domain of 1536x1536x60 gridpoints, driven by the ERA-Interim reanalysis. The results illustrate how the approach allows the representation of interactions between synoptic-scale and mesoscale atmospheric circulations at scales ranging from 1000 to 10 km. We discuss some of the advantages and prospects of using GPUs and focus on the performance of the convection-resolving modeling approach on the European scale. Specifically, we investigate the organization of convective clouds and validate hourly rainfall distributions against various high-resolution data sets.
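
    For a rough sense of the problem size (a back-of-the-envelope aside; the field count and precision below are assumptions for illustration, not COSMO's actual configuration):

      # Size of the quoted simulation grid and a crude memory estimate.
      nx, ny, nz = 1536, 1536, 60
      gridpoints = nx * ny * nz
      fields = 10                    # assumed number of 3-D prognostic fields
      bytes_per_value = 4            # assumed single precision
      gb = gridpoints * fields * bytes_per_value / 1e9
      print(f"{gridpoints:.2e} gridpoints; ~{gb:.0f} GB per time level")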

  1. The Two Ts: Teaching and Technology

    ERIC Educational Resources Information Center

    Flannery, Maura C.

    2004-01-01

    A professor of biology shares her experience at the BioQUEST workshop she attended. She discusses the BioQUEST Curriculum Consortium, a large-scale college biology project focusing on active learning strategies and the use of technology in teaching, and describes the approaches presented at the workshop.

  2. The International Symposium on Grids and Clouds

    NASA Astrophysics Data System (ADS)

    The International Symposium on Grids and Clouds (ISGC) 2012 will be held at Academia Sinica in Taipei from 26 February to 2 March 2012, with co-located events and workshops. The conference is hosted by the Academia Sinica Grid Computing Centre (ASGC). 2012 marks the tenth anniversary of ISGC, which over the last decade has tracked the convergence, collaboration and innovation of individual researchers across the Asia-Pacific region into a coherent community. With the continuous support and dedication of its delegates, ISGC has provided the primary international distributed computing platform where distinguished researchers and collaboration partners from around the world share their knowledge and experiences. The last decade has seen the wide-scale emergence of e-Infrastructure as a critical asset for the modern e-Scientist. Large-scale research infrastructures and instruments that produce a torrent of electronic data are forcing a generational change in the scientific process and in the mechanisms used to analyse the resulting data deluge. No longer can the processing of these vast amounts of data and the production of relevant scientific results be undertaken by a single scientist. Virtual Research Communities that span organisations around the world, through an integrated digital infrastructure connecting the trust and administrative domains of multiple resource providers, have become critical in supporting these analyses. Topics covered in ISGC 2012 include: High Energy Physics, Biomedicine & Life Sciences, Earth Science, Environmental Changes and Natural Disaster Mitigation, Humanities & Social Sciences, Operations & Management, Middleware & Interoperability, Security and Networking, Infrastructure Clouds & Virtualisation, Business Models & Sustainability, Data Management, Distributed Volunteer & Desktop Grid Computing, High Throughput Computing, and High Performance, Manycore & GPU Computing.

  3. A scale-invariant change detection method for land use/cover change research

    NASA Astrophysics Data System (ADS)

    Xing, Jin; Sieber, Renee; Caelli, Terrence

    2018-07-01

    Land Use/Cover Change (LUCC) detection relies increasingly on comparing remote sensing images with different spatial and spectral scales. Building on scale-invariant image analysis algorithms from computer vision, we propose a scale-invariant LUCC detection method to identify changes in scale-heterogeneous images. The method is composed of an entropy-based spatial decomposition; two scale-invariant feature extraction algorithms, Maximally Stable Extremal Regions (MSER) and the Scale-Invariant Feature Transform (SIFT); a spatial regression voting method to integrate the MSER and SIFT results; a Markov Random Field-based smoothing method; and a support vector machine classifier to assign LUCC labels. We test the scale invariance of the new method in a LUCC case study for Montreal, Canada, 2005-2012, and find that it provides accuracy similar to a resampling-based approach while avoiding the LUCC distortion incurred by resampling.
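
    For readers unfamiliar with the two feature extractors, the following sketch applies MSER and SIFT to a hypothetical image pair with OpenCV and treats unmatched keypoints as crude change candidates; the entropy decomposition, voting, MRF smoothing and SVM stages of the actual method are omitted, and the file names and ratio-test threshold are assumptions:

      import cv2

      # Hypothetical co-located scenes from the two dates.
      img_t1 = cv2.imread("scene_2005.png", cv2.IMREAD_GRAYSCALE)
      img_t2 = cv2.imread("scene_2012.png", cv2.IMREAD_GRAYSCALE)

      # SIFT keypoints and descriptors for each date.
      sift = cv2.SIFT_create()
      kp1, des1 = sift.detectAndCompute(img_t1, None)
      kp2, des2 = sift.detectAndCompute(img_t2, None)

      # Ratio-test matching: keypoints with no stable counterpart in the
      # other date are weak evidence of land-cover change at that location.
      matcher = cv2.BFMatcher(cv2.NORM_L2)
      matches = matcher.knnMatch(des1, des2, k=2)
      good = [m for m, n in matches if m.distance < 0.75 * n.distance]
      matched = {m.queryIdx for m in good}
      change_candidates = [kp1[i].pt for i in range(len(kp1)) if i not in matched]

      # Stable regions that could anchor per-region voting.
      mser = cv2.MSER_create()
      regions, _ = mser.detectRegions(img_t1)
      print(len(change_candidates), "unmatched keypoints;",
            len(regions), "MSER regions")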

  4. Tropical precipitation extremes: Response to SST-induced warming in aquaplanet simulations

    NASA Astrophysics Data System (ADS)

    Bhattacharya, Ritthik; Bordoni, Simona; Teixeira, João.

    2017-04-01

    The scaling of tropical precipitation extremes in response to warming is studied in aquaplanet experiments using the global Weather Research and Forecasting (WRF) model. We show that the scaling of precipitation extremes is highly sensitive to spatial and temporal averaging: while instantaneous grid-point extreme precipitation scales more strongly than the percentage increase (~7% K^-1) predicted by the Clausius-Clapeyron (CC) relationship, extremes of zonally and temporally averaged precipitation follow a slight sub-CC scaling, in agreement with results from Coupled Model Intercomparison Project (CMIP) models. The scaling depends crucially on the employed convection parameterization, particularly when grid-point instantaneous extremes are considered. These results highlight that understanding the response of precipitation extremes to warming requires consideration of dynamic changes in addition to the thermodynamic response. Changes in grid-scale precipitation, unlike those in convective-scale precipitation, scale linearly with the resolved flow; hence, dynamic changes include changes in both large-scale and convective-scale motions.
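
    The ~7% K^-1 figure can be recovered from the Clausius-Clapeyron relation, d(ln e_s)/dT = L / (R_v T^2); a quick check with standard constants (an illustrative aside, not part of the study):

      # Fractional increase of saturation vapor pressure per kelvin.
      L_V = 2.5e6   # latent heat of vaporization, J/kg
      R_V = 461.0   # gas constant for water vapor, J/(kg K)

      for temp_k in (273.15, 288.15, 300.0):
          rate = L_V / (R_V * temp_k ** 2)
          print(f"T = {temp_k:6.2f} K: e_s rises ~{100 * rate:.1f}% per K")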

  5. Proceedings of the Toronto TEAM/ACES workshop

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Turner, L.R.

    The third TEAM Workshop of the third round was held at Ontario Hydro in Toronto on 25-26 October 1990, immediately following the Conference on Electromagnetic Field Computation. This was the first joint workshop with ACES (Applied Computational Electromagnetics Society), whose goals are similar to TEAM's but whose members tend to work at higher frequencies (antennas, propagation, and scattering). A fusion problem, the eddy current heating of the case of the Euratom Large Coil Project coil, was adopted as Problem 14 at the Oxford Workshop, and a solution to that problem was presented at Toronto by Oskar Biro of the Graz University of Technology (Austria). Individual solutions were also presented for Problems 8 (Flaw in a Plate) and 9 (Moving Coil inside a Pipe). Five new solutions were presented to Problem 13 (DC Coil in a Ferromagnetic Yoke), and Koji Fujiwara of Okayama University summarized these solutions along with the similar number presented at Oxford. The solutions agreed well in the air but disagreed in the steel. Codes with a formulation in magnetic field strength or scalar potential underestimated the flux density in the steel, and codes based on flux density or vector potential overestimated it. Codes with edge elements appeared to do better than codes with nodal elements. These results stimulated considerable discussion; in my view that was the most valuable result of the workshop.

  6. Finding a roadmap to achieve large neuromorphic hardware systems

    PubMed Central

    Hasler, Jennifer; Marr, Bo

    2013-01-01

    Neuromorphic systems are gaining importance in an era where CMOS digital computing techniques are reaching physical limits. These silicon systems mimic extremely energy-efficient neural computing structures, potentially both for solving engineering applications and for understanding neural computation. Toward this end, the authors provide a glimpse of what the technology evolution roadmap looks like for these systems, so that neuromorphic engineers may gain the same benefit of anticipation and foresight that IC designers gained from Moore's law many years ago. Scaling of energy efficiency, performance, and size is discussed, as well as how the implementation and application space of neuromorphic systems is expected to evolve over time. PMID:24058330

  7. Software Junctus: Joining Sign Language and Alphabetical Writing

    NASA Astrophysics Data System (ADS)

    Valentini, Carla Beatris; Bisol, Cláudia A.; Dalla Santa, Cristiane

    The authors’ aim is to describe the workshops developed to test the use of an authorship program that allows the simultaneous use of sign language and alphabetical writing. The workshops were prepared and conducted by a Computer Science undergraduate, with the support of the Program of Students’ Integration and Mediation (Programa de Integração e Mediação do Acadêmico - PIMA) at the University of Caxias do Sul. Two sign language interpreters, two deaf students and one hearing student, who also teach at a special school for the deaf, participated in the workshops. The main characteristics of the software and the development of the workshops are presented with examples of educational projects created during their development. Possible improvements are also outlined.

  8. Using Rollback Avoidance to Mitigate Failures in Next-Generation Extreme-Scale Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Levy, Scott N.

    2016-05-01

    High-performance computing (HPC) systems enable scientists to numerically model complex phenomena in many important physical systems. The next major milestone in the development of HPC systems is the construction of the first supercomputer capable of executing more than an exaflop, 10^18 floating-point operations per second. On systems of this scale, failures will occur much more frequently than on current systems. As a result, resilience is a key obstacle to building next-generation extreme-scale systems. Coordinated checkpointing is currently the most widely used mechanism for handling failures on HPC systems. Although coordinated checkpointing remains effective on current systems, increasing the scale of today's systems to build next-generation systems will increase the cost of fault tolerance, as more and more time is taken away from the application to protect against or recover from failure. Rollback avoidance techniques seek to mitigate the cost of checkpoint/restart by allowing an application to continue its execution, rather than rolling back to an earlier checkpoint, when failures occur. These techniques include failure prediction and preventive migration, replicated computation, fault-tolerant algorithms, and software-based memory fault correction. In this thesis, we examine how rollback avoidance techniques can be used to address failures on extreme-scale systems. Using a combination of analytic modeling and simulation, we evaluate the potential impact of rollback avoidance on these systems. We then present a novel rollback avoidance technique that exploits similarities in application memory. Finally, we examine the feasibility of using this technique to protect against memory faults in kernel memory.
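
    As context for why checkpoint/restart grows costly at scale, the standard first-order Young/Daly estimate of the optimal checkpoint interval, tau = sqrt(2 * delta * MTBF), can be evaluated for plausible numbers (the MTBF and checkpoint cost below are assumptions, not figures from the thesis):

      import math

      # Young/Daly first-order optimal checkpoint interval.
      def optimal_interval_s(checkpoint_cost_s, system_mtbf_s):
          return math.sqrt(2.0 * checkpoint_cost_s * system_mtbf_s)

      node_mtbf_s = 5 * 365 * 86400    # assumed 5-year per-node MTBF
      checkpoint_cost_s = 600.0        # assumed 10-minute checkpoint
      for nodes in (10_000, 100_000, 1_000_000):
          mtbf = node_mtbf_s / nodes   # system MTBF shrinks with node count
          tau = optimal_interval_s(checkpoint_cost_s, mtbf)
          # Rough fraction of time lost to checkpointing plus rework:
          overhead = checkpoint_cost_s / tau + tau / (2 * mtbf)
          print(f"{nodes:>9} nodes: system MTBF {mtbf / 3600:6.2f} h, "
                f"checkpoint every {tau / 60:6.1f} min, "
                f"~{100 * overhead:.0f}% overhead")

    For the assumed numbers the estimated overhead climbs from roughly a quarter of the machine at ten thousand nodes to an untenable level at a million nodes, which is exactly the regime where rollback avoidance becomes attractive.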

  9. High-resolution downscaling for hydrological management

    NASA Astrophysics Data System (ADS)

    Ulbrich, Uwe; Rust, Henning; Meredith, Edmund; Kpogo-Nuwoklo, Komlan; Vagenas, Christos

    2017-04-01

    Hydrological modellers and water managers require high-resolution climate data to model regional hydrologies and how these may respond to future changes in the large-scale climate. The ability to successfully model such changes, and by extension to plan critical infrastructure, is often impeded by a lack of suitable climate data. This typically takes the form of too-coarse data from climate models, which are not sufficiently detailed in either space or time to support water management decisions and hydrological research. BINGO (Bringing INnovation in onGOing water management) aims to bridge the gap between the needs of hydrological modellers and planners and the currently available range of climate data, with the overarching aim of providing adaptation strategies for climate change-related challenges. Producing the kilometre- and sub-daily-scale climate data needed by hydrologists through continuous simulations is generally computationally infeasible. To circumvent this hurdle, we adopt a two-pronged approach involving (1) selective dynamical downscaling and (2) conditional stochastic weather generators, with the former presented here. We take an event-based approach to downscaling in order to achieve the kilometre-scale input needed by hydrological modellers. Computational expense is minimized by identifying extremal weather patterns for each BINGO research site in lower-resolution simulations, and then downscaling to the kilometre (convection-permitting) scale only those events during which such patterns occur. Here we (1) outline the methodology behind the selection of the events and (2) compare the modelled precipitation distribution and variability (preconditioned on the extremal weather patterns) with that found in observations.
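
    A minimal sketch of the event-selection idea, on synthetic data and with an assumed 95th-percentile threshold (the actual BINGO selection is based on extremal weather patterns, not a simple site threshold):

      import numpy as np

      # Synthetic ~10-year daily precipitation series at one site.
      rng = np.random.default_rng(0)
      daily_precip_mm = rng.gamma(shape=0.5, scale=6.0, size=3650)

      # Keep only days above a high percentile; only these events would be
      # handed to the expensive km-scale dynamical downscaling.
      threshold = np.percentile(daily_precip_mm, 95)
      event_days = np.flatnonzero(daily_precip_mm > threshold)

      print(f"Threshold: {threshold:.1f} mm/day")
      print(f"Downscaling {event_days.size} of {daily_precip_mm.size} days "
            f"({100 * event_days.size / daily_precip_mm.size:.0f}%)")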

  10. Nearly extremal apparent horizons in simulations of merging black holes

    NASA Astrophysics Data System (ADS)

    Lovelace, Geoffrey; Scheel, Mark A.; Owen, Robert; Giesler, Matthew; Katebi, Reza; Szilágyi, Béla; Chu, Tony; Demos, Nicholas; Hemberger, Daniel A.; Kidder, Lawrence E.; Pfeiffer, Harald P.; Afshari, Nousha

    2015-03-01

    The spin angular momentum S of an isolated Kerr black hole is bounded by the surface area A of its apparent horizon: 8πS ≤ A, with equality for extremal black holes. In this paper, we explore the extremality of individual and common apparent horizons for merging, rapidly spinning binary black holes. We consider simulations of merging black holes with equal masses M and initial spin angular momenta aligned with the orbital angular momentum, including new simulations with spin magnitudes up to S/M^2 = 0.994. We measure the area and (using approximate Killing vectors) the spin on the individual and common apparent horizons, finding that the inequality 8πS < A is satisfied in all cases but is very close to equality on the common apparent horizon at the instant it first appears. We also evaluate the Booth-Fairhurst extremality, whose value for a given apparent horizon depends on the scaling of the horizon's null normal vectors. In particular, we introduce a gauge-invariant lower bound on the extremality by computing the smallest value that Booth and Fairhurst's extremality parameter can take for any scaling. Using this lower bound, we conclude that the common horizons are at least moderately close to extremal just after they appear. Finally, following Lovelace et al (2008 Phys. Rev. D 78 084017), we construct quasiequilibrium binary-black-hole initial data with 'overspun' marginally trapped surfaces with 8πS > A. We show that the overspun surfaces are indeed superextremal: our lower bound on their Booth-Fairhurst extremality exceeds unity. However, we confirm that these superextremal surfaces are always surrounded by marginally outer trapped surfaces (i.e., by apparent horizons) with 8πS < A. The extremality lower bound on the enclosing apparent horizon is always less than unity but can exceed the value for an extremal Kerr black hole.
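
    The bound 8πS ≤ A can be checked directly for Kerr black holes (in G = c = 1 units): with dimensionless spin χ = S/M^2, the horizon area is A = 8πM^2(1 + sqrt(1 - χ^2)), so the bound reduces to χ ≤ 1 + sqrt(1 - χ^2), saturated only at χ = 1. A short numerical check (illustrative, not code from the paper):

      import math

      # Ratio 8*pi*S / A for Kerr at several spins, including the paper's 0.994.
      M = 1.0
      for chi in (0.5, 0.9, 0.994, 1.0):
          S = chi * M ** 2
          A = 8 * math.pi * M ** 2 * (1 + math.sqrt(1 - chi ** 2))
          print(f"chi = {chi:5.3f}: 8*pi*S / A = {8 * math.pi * S / A:.4f}")

    At χ = 0.994 the ratio is already about 0.90, consistent with the near-extremal horizons in these simulations sitting close to, but below, the bound.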

  11. Elders, Students & Computers--Background Information. Illinois Series on Educational Technology of Computers. Number 8.

    ERIC Educational Resources Information Center

    Jaycox, Kathy; Hicks, Bruce

    This report reviews the literature relating to computer uses for elders. Topics include: (1) variables affecting computer use by elders; (2) organizations and programs serving elders in Champaign County, Illinois; (3) University of Illinois workshops on problems of older people; (4) The Senior Citizens Project of Volunteer Illini Projects; (5)…

  12. WORKSHOP REPORT: COMPUTATIONAL TOXICOLOGY: FRAMEWORK, PARTNERSHIPS, AND PROGRAM DEVELOPMENT, SEPTEMBER 29-30, 2003, RESEARCH TRIANGLE PARK, NORTH CAROLINA

    EPA Science Inventory

    Computational toxicology is a new research initiative being developed within the Office of Research and Development (ORD) of the US Environmental Protection Agency (EPA). Operationally, it is defined as the application of mathematical and computer models together with molecular c...

  13. Fundamentals of Library Automation and Technology. Participant Workbook.

    ERIC Educational Resources Information Center

    Bridge, Frank; Walton, Robert

    This workbook presents outlines of topics to be covered during a two-day workshop on the fundamentals for library automation. Topics for the first day include: (1) Introduction; (2) Computer Technology--A Historical Overview; (3) Evolution of Library Automation; (4) Computer Hardware Technology--An Introduction; (5) Computer Software…

  14. Summary of the First Network-Centric Sensing Community Workshop, ’Netted Sensors: A Government, Industry and Academia Dialogue’

    DTIC Science & Technology

    2006-04-01

    and Scalability, (2) Sensors and Platforms, (3) Distributed Computing and Processing, (4) Information Management, (5) Fusion and Resource Management...use of the deployed system. 3.3 Distributed Computing and Processing Session: The Distributed Computing and Processing Session consisted of three

  15. Research Directions for Cyber Experimentation: Workshop Discussion Analysis.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    DeWaard, Elizabeth; Deccio, Casey; Fritz, David Jakob

    Sandia National Laboratories hosted a workshop on August 11, 2017 entitled "Research Directions for Cyber Experimentation," which focused on identifying and addressing research gaps within the field of cyber experimentation, particularly emulation testbeds. This report mainly documents the discussion toward the end of the workshop, which covered research gaps such as developing a sustainable research infrastructure, expanding cyber experimentation, and making the field more accessible to subject matter experts who may not have a background in computer science. Other gaps include methodologies for rigorous experimentation, validation, and uncertainty quantification, which, if addressed, also have the potential to bridge the gap between cyber experimentation and cyber engineering. Workshop attendees presented various ways to overcome these research gaps; however, the main conclusion was that better communication, through more workshops, conferences, email lists, and Slack channels, among other opportunities, is needed to overcome them.

  16. National Forum on the Future of Automated Materials Processing in US Industry: The Role of Sensors. Report of a workshop (1st) held at Santa Barbara, California on December 16-17, 1985

    NASA Astrophysics Data System (ADS)

    Yolken, H. T.; Mehrabian, R.

    1985-12-01

    These are the proceedings of the workshop "A National Forum on the Future of Automated Materials Processing in U.S. Industry - The Role of Sensors," the first of two workshops sponsored by the Industrial Research Institute and the White House Office of Science and Technology Policy, Committee on Materials Working Group on Automation of Materials Processing. The second workshop will address the other two key components required for automated materials processing: process models, and artificial intelligence coupled with computer integration of the system. The objective of these workshops is to identify and assess important issues affecting the competitive position of U.S. industry related to its ability to automate production processes for basic and advanced materials, and to develop approaches for improved capability through cooperative R&D and associated efforts.

  17. Fault Tolerant Real-Time Networks

    DTIC Science & Technology

    2007-05-30

    Alberto Sangiovanni-Vincentelli, editors, Hybrid Systems: Computation and Control. Fourth International Workshop (HSCC'01), Rome, Italy, March 2001...average dwell time by solving optimization problems. In Ashish Tiwari and Joao P. Hespanha, editors, Hybrid Systems: Computation and Control (HSCC'06

  18. Computational toxicity in 21st century safety sciences (China talk - Fuzhou China)

    EPA Science Inventory

    Presentation at the Joint Meeting of the Analytical Toxicology and Computational Toxicology Committee (Chinese Society of Toxicology) International Workshop on Advanced Chemical Safety Assessment Technologies, 11 May 2016, Fuzhou University, Fuzhou, China.

  19. Support System Effects on the NASA Common Research Model

    NASA Technical Reports Server (NTRS)

    Rivers, S. Melissa B.; Hunter, Craig A.

    2012-01-01

    An experimental investigation of the NASA Common Research Model was conducted in the NASA Langley National Transonic Facility and the NASA Ames 11-Foot Transonic Wind Tunnel Facility for use in the Drag Prediction Workshop. As data from the experimental investigations were collected, a large difference in moment values was seen between the experimental data and the computational data from the 4th Drag Prediction Workshop. This difference led to the present work: a computational assessment of model support system interference effects on the Common Research Model. The configurations computed during this investigation were the wing/body/tail=0deg without the support system and the wing/body/tail=0deg with the support system. The results confirm that adding the support system to the computational cases shifts the pitching moment in the direction of the experimental results.

  20. Proceedings of the Annual Southwest Park and Recreation Training Institute (28th, Kingston, Oklahoma, January 30-February 2, 1983).

    ERIC Educational Resources Information Center

    Texas Tech Univ., Lubbock. Dept. of Park Administration and Landscape Architecture.

    Summaries are given on conference workshops and sessions on the topics of: (1) computers and their use in parks and leisure service agencies; (2) maintenance; (3) private initiatives in county park systems; (4) community education workshops; (5) trees and shrubs for the urban environment; (6) adjustments to severe fiscal constraints; (7) improving…
