Sample records for future generation computer

  1. Manufacturing Magic and Computational Creativity

    PubMed Central

    Williams, Howard; McOwan, Peter W.

    2016-01-01

    This paper describes techniques in computational creativity, blending mathematical modeling and psychological insight, to generate new magic tricks. The details of an explicit computational framework capable of creating new magic tricks are summarized, and evaluated against a range of contemporary theories about what constitutes a creative system. To allow further development of the proposed system we situate this approach to the generation of magic in the wider context of other areas of application in computational creativity in performance arts. We show how approaches in these domains could be incorporated to enhance future magic generation systems, and critically review possible future applications of such magic generating computers. PMID:27375533

  2. Future trends in computer waste generation in India.

    PubMed

    Dwivedy, Maheshwar; Mittal, R K

    2010-11-01

    The objective of this paper is to estimate the future projection of computer waste in India and to subsequently analyze their flow at the end of their useful phase. For this purpose, the study utilizes the logistic model-based approach proposed by Yang and Williams to forecast future trends in computer waste. The model estimates future projection of computer penetration rate utilizing their first lifespan distribution and historical sales data. A bounding analysis on the future carrying capacity was simulated using the three parameter logistic curve. The observed obsolete generation quantities from the extrapolated penetration rates are then used to model the disposal phase. The results of the bounding analysis indicate that in the year 2020, around 41-152 million units of computers will become obsolete. The obsolete computer generation quantities are then used to estimate the End-of-Life outflows by utilizing a time-series multiple lifespan model. Even a conservative estimate of the future recycling capacity of PCs will reach upwards of 30 million units during 2025. Apparently, more than 150 million units could be potentially recycled in the upper bound case. However, considering significant future investment in the e-waste recycling sector from all stakeholders in India, we propose a logistic growth in the recycling rate and estimate the requirement of recycling capacity between 60 and 400 million units for the lower and upper bound case during 2025. Finally, we compare the future obsolete PC generation amount of the US and India. Copyright © 2010 Elsevier Ltd. All rights reserved.
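
    As a minimal sketch of the kind of model the abstract describes, the example below uses a three-parameter logistic penetration curve whose implied annual sales are convolved with an assumed first-lifespan distribution to give obsolete units per year. All parameter values, the lifespan weights, and the 2020 readout are illustrative assumptions, not figures from the paper.

    ```python
    import numpy as np

    def logistic_penetration(years, capacity, rate, midpoint):
        """Three-parameter logistic curve for the installed base of computers."""
        return capacity / (1.0 + np.exp(-rate * (years - midpoint)))

    def obsolete_per_year(annual_sales, lifespan_pmf):
        """Convolve annual sales with a first-lifespan distribution to estimate
        the number of units becoming obsolete each year."""
        return np.convolve(annual_sales, lifespan_pmf)

    years = np.arange(1995, 2026)
    # Hypothetical parameters: 300 million-unit carrying capacity, inflection near 2015.
    penetration = logistic_penetration(years, capacity=300e6, rate=0.35, midpoint=2015)
    sales = np.diff(penetration, prepend=penetration[0])        # implied annual sales
    lifespan = np.array([0.05, 0.15, 0.30, 0.30, 0.15, 0.05])   # lifespan PMF over lags 0-5 years
    obsolete = obsolete_per_year(sales, lifespan)
    print(f"obsolete units in 2020 (toy numbers): {obsolete[list(years).index(2020)]:.2e}")
    ```

    A bounding analysis like the one in the paper would repeat this calculation with low and high carrying-capacity assumptions.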

  3. Computational Fluid Dynamics: Past, Present, And Future

    NASA Technical Reports Server (NTRS)

    Kutler, Paul

    1988-01-01

Paper reviews development of computational fluid dynamics and explores future prospects of technology. Report covers such topics as computer technology, turbulence, development of solution methodology, development of algorithms, definition of flow geometries, generation of computational grids, and pre- and post-data processing.

  4. (Some) Computer Futures: Mainframes.

    ERIC Educational Resources Information Center

    Joseph, Earl C.

    Possible futures for the world of mainframe computers can be forecast through studies identifying forces of change and their impact on current trends. Some new prospects for the future have been generated by advances in information technology; for example, recent United States successes in applied artificial intelligence (AI) have created new…

  5. The Next Generation of Personal Computers.

    ERIC Educational Resources Information Center

    Crecine, John P.

    1986-01-01

    Discusses factors converging to create high-capacity, low-cost nature of next generation of microcomputers: a coherent vision of what graphics workstation and future computing environment should be like; hardware developments leading to greater storage capacity at lower costs; and development of software and expertise to exploit computing power…

  6. Challenges in scaling NLO generators to leadership computers

    NASA Astrophysics Data System (ADS)

    Benjamin, D.; Childers, JT; Hoeche, S.; LeCompte, T.; Uram, T.

    2017-10-01

Exascale computing resources are roughly a decade away and will be capable of 100 times more computing than current supercomputers. In the last year, Energy Frontier experiments crossed a milestone of 100 million core-hours used at the Argonne Leadership Computing Facility, Oak Ridge Leadership Computing Facility, and NERSC. The Fortran-based leading-order parton generator called Alpgen was successfully scaled to millions of threads to achieve this level of usage on Mira. Sherpa and MadGraph are next-to-leading order generators used heavily by LHC experiments for simulation. Integration times for high-multiplicity or rare processes can take a week or more on standard Grid machines, even using all 16 cores. We will describe our ongoing work to scale the Sherpa generator to thousands of threads on leadership-class machines and reduce run-times to less than a day. This work allows the experiments to leverage large-scale parallel supercomputers for event generation today, freeing tens of millions of grid hours for other work, and paving the way for future applications (simulation, reconstruction) on these and future supercomputers.
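
    The scaling work itself targets MPI and leadership-class machines; purely as a toy illustration of the underlying pattern (independent generation jobs with distinct random seeds farmed out to workers and merged afterwards, not the actual Sherpa or Alpgen integration), a minimal Python sketch is:

    ```python
    from multiprocessing import Pool
    import random

    def generate_events(task):
        """Stand-in for one independent generator job: a worker-specific seed
        and a number of fake event weights to produce."""
        seed, n_events = task
        rng = random.Random(seed)
        return [rng.expovariate(1.0) for _ in range(n_events)]

    if __name__ == "__main__":
        tasks = [(seed, 10_000) for seed in range(8)]   # one job per core/rank
        with Pool(processes=8) as pool:
            results = pool.map(generate_events, tasks)
        all_events = [w for chunk in results for w in chunk]
        print(len(all_events), sum(all_events) / len(all_events))
    ```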

  7. Recursive computer architecture for VLSI

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Treleaven, P.C.; Hopkins, R.P.

    1982-01-01

A general-purpose computer architecture based on the concept of recursion and suitable for VLSI computer systems built from replicated (lego-like) computing elements is presented. The recursive computer architecture is defined by presenting a program organisation, a machine organisation and an experimental machine implementation oriented to VLSI. The experimental implementation is being restricted to simple, identical microcomputers, each containing a memory, a processor and a communications capability. This future generation of lego-like computer systems is termed fifth generation computers by the Japanese. 30 references.

  8. A Combinatorial Geometry Computer Description of the MEP-021A Generator Set

    DTIC Science & Technology

    1979-02-01

Keywords: Generator Computer Description, Gasoline Generator, GIFT, MEP-021A. ... the Geometric Information for Targets (GIFT) computer code. The GIFT code traces shotlines through a COM-GEOM description from any specified attack ... The GIFT code is also stored on magnetic tape for future vulnerability analysis.

  9. Is There Computer Graphics after Multimedia?

    ERIC Educational Resources Information Center

    Booth, Kellogg S.

    Computer graphics has been driven by the desire to generate real-time imagery subject to constraints imposed by the human visual system. The future of computer graphics, when off-the-shelf systems have full multimedia capability and when standard computing engines render imagery faster than real-time, remains to be seen. A dedicated pipeline for…

  10. Computer Aided Design of Computer Generated Holograms for electron beam fabrication

    NASA Technical Reports Server (NTRS)

    Urquhart, Kristopher S.; Lee, Sing H.; Guest, Clark C.; Feldman, Michael R.; Farhoosh, Hamid

    1989-01-01

    Computer Aided Design (CAD) systems that have been developed for electrical and mechanical design tasks are also effective tools for the process of designing Computer Generated Holograms (CGHs), particularly when these holograms are to be fabricated using electron beam lithography. CAD workstations provide efficient and convenient means of computing, storing, displaying, and preparing for fabrication many of the features that are common to CGH designs. Experience gained in the process of designing CGHs with various types of encoding methods is presented. Suggestions are made so that future workstations may further accommodate the CGH design process.

  11. Current Grid Generation Strategies and Future Requirements in Hypersonic Vehicle Design, Analysis and Testing

    NASA Technical Reports Server (NTRS)

    Papadopoulos, Periklis; Venkatapathy, Ethiraj; Prabhu, Dinesh; Loomis, Mark P.; Olynick, Dave; Arnold, James O. (Technical Monitor)

    1998-01-01

Recent advances in computational power enable computational fluid dynamic modeling of increasingly complex configurations. A review of the grid generation methodologies implemented in support of the computational work performed for the X-38 and X-33 is presented. In strategizing topological constructs and blocking structures, the factors considered are the geometric configuration, optimal grid size, numerical algorithms, accuracy requirements, the physics of the problem at hand, computational expense, and the available computer hardware. Also addressed are grid refinement strategies, the effects of wall spacing, and convergence. The significance of the grid is demonstrated through a comparison of computational and experimental results for the aeroheating environment experienced by the X-38 vehicle. Special topics in grid generation strategies, such as modeling control surface deflections and material mapping, are also addressed.

  12. Culture and Risk: Does the Future Compute? A Symposium.

    ERIC Educational Resources Information Center

    Barnes, Susan B.; Perkinson, Henry J.; Talbott, Stephen L.

    1998-01-01

Presents a symposium on the impact of computers on culture. Argues that the computer has mathematized culture and that widespread risk aversion has been generated everywhere. Finds that the ways in which communication technologies are used in social contexts are a topic of concern to communication scholars. (PA)

  13. The Instrument of the Future: Computers in Education.

    ERIC Educational Resources Information Center

    Leonard, Rex; LeCroy, Barbara

    Before computers will be able to fulfill their potential in education, two major challenges must be overcome--the lack of well-trained teachers and a lack of general knowledge about software and its capabilities. Teachers must acquire some computer literacy skills, including programming, word processing, materials generation and record keeping. In…

  14. A Systematic Review of Tablet Computers and Portable Media Players as Speech Generating Devices for Individuals with Autism Spectrum Disorder.

    PubMed

    Lorah, Elizabeth R; Parnell, Ashley; Whitby, Peggy Schaefer; Hantula, Donald

    2015-12-01

Powerful, portable, off-the-shelf handheld devices, such as tablet-based computers (e.g., iPad®, Galaxy®) or portable multimedia players (e.g., iPod®), can be adapted to function as speech generating devices for individuals with autism spectrum disorders or related developmental disabilities. This paper reviews the research in this new and rapidly growing area and delineates an agenda for future investigations. In general, participants using these devices acquired verbal repertoires quickly. Studies comparing these devices to picture exchange or manual sign language found that acquisition was often quicker when using a tablet computer and that the vast majority of participants preferred using the device to picture exchange or manual sign language. Future research in interface design, user experience, and extended verbal repertoires is recommended.

  15. Developing the next generation of diverse computer scientists: the need for enhanced, intersectional computing identity theory

    NASA Astrophysics Data System (ADS)

    Rodriguez, Sarah L.; Lehman, Kathleen

    2017-10-01

    This theoretical paper explores the need for enhanced, intersectional computing identity theory for the purpose of developing a diverse group of computer scientists for the future. Greater theoretical understanding of the identity formation process specifically for computing is needed in order to understand how students come to understand themselves as computer scientists. To ensure that the next generation of computer scientists is diverse, this paper presents a case for examining identity development intersectionally, understanding the ways in which women and underrepresented students may have difficulty identifying as computer scientists and be systematically oppressed in their pursuit of computer science careers. Through a review of the available scholarship, this paper suggests that creating greater theoretical understanding of the computing identity development process will inform the way in which educational stakeholders consider computer science practices and policies.

  16. Efficient storage, computation, and exposure of computer-generated holograms by electron-beam lithography.

    PubMed

    Newman, D M; Hawley, R W; Goeckel, D L; Crawford, R D; Abraham, S; Gallagher, N C

    1993-05-10

An efficient storage format was developed for computer-generated holograms for use in electron-beam lithography. This method employs run-length encoding and Lempel-Ziv-Welch compression and succeeds in exposing holograms that were previously infeasible owing to the hologram's tremendous pattern-data file size. These holograms also require significant computation; thus the algorithm was implemented on a parallel computer, which improved performance by 2 orders of magnitude. The decompression algorithm was integrated into the Cambridge electron-beam machine's front-end processor. Although this provides much-needed ability, some hardware enhancements will be required in the future to overcome inadequacies in the current front-end processor that result in a lengthy exposure time.
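
    The abstract names run-length encoding (paired with Lempel-Ziv-Welch compression) for the binary pattern data; below is a minimal run-length encoder/decoder sketch for a single pattern row. The LZW stage and the actual e-beam data format are not reproduced here, and the row values are invented for illustration.

    ```python
    def rle_encode(bits):
        """Run-length encode a sequence of 0/1 pattern values as (value, run) pairs."""
        runs = []
        prev, count = bits[0], 1
        for b in bits[1:]:
            if b == prev:
                count += 1
            else:
                runs.append((prev, count))
                prev, count = b, 1
        runs.append((prev, count))
        return runs

    def rle_decode(runs):
        """Expand (value, run) pairs back into the original sequence."""
        out = []
        for value, count in runs:
            out.extend([value] * count)
        return out

    row = [0] * 12 + [1] * 5 + [0] * 20 + [1] * 3
    assert rle_decode(rle_encode(row)) == row
    print(rle_encode(row))  # [(0, 12), (1, 5), (0, 20), (1, 3)]
    ```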

  17. A New Model that Generates Lotka's Law.

    ERIC Educational Resources Information Center

    Huber, John C.

    2002-01-01

    Develops a new model for a process that generates Lotka's Law. Topics include measuring scientific productivity through the number of publications; rate of production; career duration; randomness; Poisson distribution; computer simulations; goodness-of-fit; theoretical support for the model; and future research. (Author/LRW)
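
    As a hedged illustration of such a generative process (exponential career lengths and productivity rates feeding Poisson publication counts; the structure and parameter choices are assumptions for illustration, not Huber's actual model):

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def simulate_author(mean_rate=0.5, mean_career=10.0):
        """One synthetic career: exponential career length (years) and exponential
        productivity rate (papers per year), papers drawn as a Poisson count."""
        career_years = max(1, int(rng.exponential(mean_career)))
        rate = rng.exponential(mean_rate)
        return int(rng.poisson(rate * career_years))

    papers = [simulate_author() for _ in range(20000)]
    authors_with = {n: papers.count(n) for n in range(1, 6)}
    print(authors_with)   # frequencies fall off roughly in a Lotka-like pattern
    ```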

  18. Computer Aided Grid Interface: An Interactive CFD Pre-Processor

    NASA Technical Reports Server (NTRS)

    Soni, Bharat K.

    1997-01-01

NASA maintains an applications-oriented computational fluid dynamics (CFD) effort complementary to and in support of the aerodynamic-propulsion design and test activities. This is especially true at NASA/MSFC, where the goal is to advance and optimize present and future liquid-fueled rocket engines. Numerical grid generation plays a significant role in the fluid flow simulations utilizing CFD. An overall goal of the current project was to develop a geometry-grid generation tool that will help engineers, scientists and CFD practitioners to analyze design problems involving complex geometries in a timely fashion. This goal is accomplished by developing the CAGI: Computer Aided Grid Interface system. The CAGI system is developed by integrating CAD/CAM (Computer Aided Design/Computer Aided Manufacturing) geometric system output and/or Initial Graphics Exchange Specification (IGES) files (including all the NASA-IGES entities), geometry manipulations and generations associated with grid constructions, and robust grid generation methodologies. This report describes the development process of the CAGI system.

  19. Computer Aided Grid Interface: An Interactive CFD Pre-Processor

    NASA Technical Reports Server (NTRS)

    Soni, Bharat K.

    1996-01-01

NASA maintains an applications-oriented computational fluid dynamics (CFD) effort complementary to and in support of the aerodynamic-propulsion design and test activities. This is especially true at NASA/MSFC, where the goal is to advance and optimize present and future liquid-fueled rocket engines. Numerical grid generation plays a significant role in the fluid flow simulations utilizing CFD. An overall goal of the current project was to develop a geometry-grid generation tool that will help engineers, scientists and CFD practitioners to analyze design problems involving complex geometries in a timely fashion. This goal is accomplished by developing the Computer Aided Grid Interface system (CAGI). The CAGI system is developed by integrating CAD/CAM (Computer Aided Design/Computer Aided Manufacturing) geometric system output and/or Initial Graphics Exchange Specification (IGES) files (including all the NASA-IGES entities), geometry manipulations and generations associated with grid constructions, and robust grid generation methodologies. This report describes the development process of the CAGI system.

  20. Preliminary Computational Study for Future Tests in the NASA Ames 9 foot x 7 foot Wind Tunnel

    NASA Technical Reports Server (NTRS)

    Pearl, Jason M.; Carter, Melissa B.; Elmiligui, Alaa A.; WInski, Courtney S.; Nayani, Sudheer N.

    2016-01-01

The NASA Advanced Air Vehicles Program, Commercial Supersonics Technology Project seeks to advance tools and techniques to make over-land supersonic flight feasible. In this study, preliminary computational results are presented for future tests in the NASA Ames 9 foot x 7 foot supersonic wind tunnel to be conducted in early 2016. Shock-plume interactions and their effect on pressure signature are examined for six model geometries. Near-field pressure signatures are assessed using the CFD code USM3D to model the proposed test geometries in free-air. Additionally, results obtained using the commercial grid generation software Pointwise (registered trademark) are compared to results using VGRID, the NASA Langley Research Center in-house mesh generation program.

  1. Computational Burden Resulting from Image Recognition of High Resolution Radar Sensors

    PubMed Central

    López-Rodríguez, Patricia; Fernández-Recio, Raúl; Bravo, Ignacio; Gardel, Alfredo; Lázaro, José L.; Rufo, Elena

    2013-01-01

This paper presents a methodology for high resolution radar image generation and automatic target recognition, emphasizing the computational cost involved in the process. In order to obtain focused inverse synthetic aperture radar (ISAR) images, certain signal processing algorithms must be applied to the information sensed by the radar. From actual data collected by radar, the stages and algorithms needed to obtain ISAR images are reviewed, including high resolution range profile generation, motion compensation and ISAR formation. Target recognition is achieved by comparing the generated set of actual ISAR images with a database of ISAR images generated by electromagnetic software. High resolution radar image generation and target recognition processes are burdensome and time consuming, so to determine the most suitable implementation platform the analysis of the computational complexity is of great interest. To this end, and since target identification must be completed in real time, the computational burden of both processes, the generation and the comparison with a database, is explained separately. Conclusions are drawn about implementation platforms and calculation efficiency in order to reduce time consumption in a possible future implementation. PMID:23609804
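
    As an illustration of one of the stages listed above, a high-resolution range profile can be formed by an inverse FFT of stepped-frequency returns; the sketch below synthesizes returns from hypothetical point scatterers and recovers their ranges. The waveform parameters and scatterer positions are invented for illustration and are not taken from the paper.

    ```python
    import numpy as np

    c = 3.0e8                                   # speed of light [m/s]
    f0, bandwidth, n_steps = 10e9, 500e6, 128   # hypothetical stepped-frequency waveform
    freqs = f0 + np.arange(n_steps) * (bandwidth / n_steps)

    # Hypothetical point scatterers: (range [m], amplitude)
    scatterers = [(10.0, 1.0), (13.5, 0.6), (17.2, 0.8)]

    # Frequency-domain returns: one phase ramp per scatterer.
    returns = sum(a * np.exp(-1j * 4 * np.pi * freqs * r / c) for r, a in scatterers)

    profile = np.abs(np.fft.ifft(returns))         # high-resolution range profile
    range_axis = np.arange(n_steps) * c / (2 * bandwidth)
    top_bins = np.argsort(profile)[-3:]
    print(np.sort(range_axis[top_bins]).round(1))  # approximately the scatterer ranges
    ```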

  2. Computational burden resulting from image recognition of high resolution radar sensors.

    PubMed

    López-Rodríguez, Patricia; Fernández-Recio, Raúl; Bravo, Ignacio; Gardel, Alfredo; Lázaro, José L; Rufo, Elena

    2013-04-22

This paper presents a methodology for high resolution radar image generation and automatic target recognition, emphasizing the computational cost involved in the process. In order to obtain focused inverse synthetic aperture radar (ISAR) images, certain signal processing algorithms must be applied to the information sensed by the radar. From actual data collected by radar, the stages and algorithms needed to obtain ISAR images are reviewed, including high resolution range profile generation, motion compensation and ISAR formation. Target recognition is achieved by comparing the generated set of actual ISAR images with a database of ISAR images generated by electromagnetic software. High resolution radar image generation and target recognition processes are burdensome and time consuming, so to determine the most suitable implementation platform the analysis of the computational complexity is of great interest. To this end, and since target identification must be completed in real time, the computational burden of both processes, the generation and the comparison with a database, is explained separately. Conclusions are drawn about implementation platforms and calculation efficiency in order to reduce time consumption in a possible future implementation.

  3. Molecular electronics: The technology of sixth generation computers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jarvis, M.T.; Miller, R.K.

    1987-01-01

In February 1986, Japan began the 6th Generation project. At the 1987 Economic Summit in Venice, Prime Minister Yasuhiro Nakasone opened the project to world collaboration. A project director suggests that the 6th Generation "may just be a turning point for human society." The major rationale for building molecular electronic devices is to achieve advances in computational densities and speeds. Proposed chromophore chains for molecular-scale chips, for example, could be spaced closer than today's silicon elements by a factor of almost 100. This book describes the research and proposed designs for molecular electronic devices and computers. It examines specific potential applications and the relationship of molecular electronics to silicon technology, presents the first published survey of experts on research issues, applications, and forecasts of future developments, and also includes a market forecast. An interesting suggestion of the survey is that the chemical industry may become a significant factor in the computer industry as the sixth generation unfolds.

  4. Computer-Controlled Cylindrical Polishing Process for Development of Grazing Incidence Optics for Hard X-Ray Region

    NASA Technical Reports Server (NTRS)

    Khan, Gufran Sayeed; Gubarev, Mikhail; Speegle, Chet; Ramsey, Brian

    2010-01-01

The presentation includes grazing incidence X-ray optics, motivation and challenges, mid-spatial-frequency generation in cylindrical polishing, design considerations for the polishing lap, simulation studies and experimental results, future scope, and a summary. Topics include the current status of replication optics technology, the cylindrical polishing process using a large-size polishing lap, non-conformance of the polishing lap to the optics, development of software and the polishing machine, deterministic prediction of polishing, a polishing experiment under optimum conditions, and a polishing experiment based on a known error profile. Future plans include determination of non-uniformity in the polishing lap compliance, development of a polishing sequence based on a known error profile of the specimen, software for generating a mandrel polishing sequence, design and development of a flexible polishing lap, and a computer-controlled localized polishing process.

  5. Prosthetically directed implant placement using computer software to ensure precise placement and predictable prosthetic outcomes. Part 2: rapid-prototype medical modeling and stereolithographic drilling guides requiring bone exposure.

    PubMed

    Rosenfeld, Alan L; Mandelaris, George A; Tardieu, Philippe B

    2006-08-01

The purpose of this paper is to expand on part 1 of this series (published in the previous issue) regarding the emerging future of computer-guided implant dentistry. This article will introduce the concept of rapid-prototype medical modeling as well as describe the utilization and fabrication of computer-generated surgical drilling guides used during implant surgery. The placement of dental implants has traditionally been an intuitive process, whereby the surgeon relies on mental navigation to achieve optimal implant positioning. Through rapid-prototype medical modeling and the stereolithographic process, surgical drilling guides (e.g., SurgiGuide) can be created. These guides are generated from a surgical implant plan created with a computer software system that incorporates all relevant prosthetic information from which the surgical plan is developed. The utilization of computer-generated planning and stereolithographically generated surgical drilling guides embraces the concept of collaborative accountability and supersedes traditional mental navigation on all levels of implant therapy.

  6. Duct flow nonuniformities study for space shuttle main engine

    NASA Technical Reports Server (NTRS)

    Thoenes, J.

    1985-01-01

To improve the Space Shuttle Main Engine (SSME) design and to support the development of future generations of rocket engines, a combined experimental/analytical study was undertaken with the goals of, first, establishing an experimental data base for the flow conditions in the SSME high pressure fuel turbopump (HPFTP) hot gas manifold (HGM) and, second, setting up a computer model of the SSME HGM flow field. Using the test data to verify the computer model, it should be possible in the future to computationally scan contemplated advanced design configurations and limit costly testing to the most promising designs. The effort of establishing and using the computer model is detailed. The comparison of computational results and experimental data clearly demonstrates that computational fluid dynamics (CFD) techniques can be used successfully to predict the gross features of three-dimensional fluid flow through configurations as intricate as the SSME turbopump hot gas manifold.

  7. Computational Fragment-Based Drug Design: Current Trends, Strategies, and Applications.

    PubMed

    Bian, Yuemin; Xie, Xiang-Qun Sean

    2018-04-09

Fragment-based drug design (FBDD) has been an effective methodology in drug development for decades. Successful applications of this strategy have brought both opportunities and challenges to the field of pharmaceutical science. Recent progress in computational fragment-based drug design provides an additional approach for future research in a time- and labor-efficient manner. Combining multiple in silico methodologies, computational FBDD offers flexibility in fragment library selection, protein model generation, and fragment/compound docking mode prediction. These characteristics give computational FBDD an advantage in designing novel and promising compounds for a given target. The purpose of this review is to discuss the latest advances, ranging from commonly used strategies to novel concepts and technologies in computational fragment-based drug design. In particular, specifications and advantages are compared between experimental and computational FBDD, and limitations and future prospects are discussed and emphasized.

  8. Vector computer memory bank contention

    NASA Technical Reports Server (NTRS)

    Bailey, D. H.

    1985-01-01

    A number of vector supercomputers feature very large memories. Unfortunately the large capacity memory chips that are used in these computers are much slower than the fast central processing unit (CPU) circuitry. As a result, memory bank reservation times (in CPU ticks) are much longer than on previous generations of computers. A consequence of these long reservation times is that memory bank contention is sharply increased, resulting in significantly lowered performance rates. The phenomenon of memory bank contention in vector computers is analyzed using both a Markov chain model and a Monte Carlo simulation program. The results of this analysis indicate that future generations of supercomputers must either employ much faster memory chips or else feature very large numbers of independent memory banks.
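
    A hedged Monte Carlo sketch of the contention effect described above, assuming uniformly random bank accesses, one access issued per CPU tick, and a fixed bank reservation time (all numbers are illustrative, not the paper's model):

    ```python
    import random

    def simulate(num_banks=64, reservation=8, accesses=100_000, rng=random):
        """Issue one random bank access per tick; stall while the chosen bank is
        still reserved. Returns average ticks per access (1.0 = no contention)."""
        busy_until = [0] * num_banks
        tick = 0
        for _ in range(accesses):
            bank = rng.randrange(num_banks)
            tick = max(tick + 1, busy_until[bank])   # wait if the bank is reserved
            busy_until[bank] = tick + reservation
        return tick / accesses

    for banks in (16, 64, 256):
        print(banks, round(simulate(num_banks=banks), 2))
    ```

    Increasing the number of independent banks drives the average back toward one tick per access, which is the trade-off the abstract points to.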

  9. Vector computer memory bank contention

    NASA Technical Reports Server (NTRS)

    Bailey, David H.

    1987-01-01

    A number of vector supercomputers feature very large memories. Unfortunately the large capacity memory chips that are used in these computers are much slower than the fast central processing unit (CPU) circuitry. As a result, memory bank reservation times (in CPU ticks) are much longer than on previous generations of computers. A consequence of these long reservation times is that memory bank contention is sharply increased, resulting in significantly lowered performance rates. The phenomenon of memory bank contention in vector computers is analyzed using both a Markov chain model and a Monte Carlo simulation program. The results of this analysis indicate that future generations of supercomputers must either employ much faster memory chips or else feature very large numbers of independent memory banks.

  10. Data Acquisition and Mass Storage

    NASA Astrophysics Data System (ADS)

    Vande Vyvre, P.

    2004-08-01

    The experiments performed at supercolliders will constitute a new challenge in several disciplines of High Energy Physics and Information Technology. This will definitely be the case for data acquisition and mass storage. The microelectronics, communication, and computing industries are maintaining an exponential increase of the performance of their products. The market of commodity products remains the largest and the most competitive market of technology products. This constitutes a strong incentive to use these commodity products extensively as components to build the data acquisition and computing infrastructures of the future generation of experiments. The present generation of experiments in Europe and in the US already constitutes an important step in this direction. The experience acquired in the design and the construction of the present experiments has to be complemented by a large R&D effort executed with good awareness of industry developments. The future experiments will also be expected to follow major trends of our present world: deliver physics results faster and become more and more visible and accessible. The present evolution of the technologies and the burgeoning of GRID projects indicate that these trends will be made possible. This paper includes a brief overview of the technologies currently used for the different tasks of the experimental data chain: data acquisition, selection, storage, processing, and analysis. The major trends of the computing and networking technologies are then indicated with particular attention paid to their influence on the future experiments. Finally, the vision of future data acquisition and processing systems and their promise for future supercolliders is presented.

  11. Future petroleum geologist: discussion

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Davis, G.D.

    1987-07-01

Robert R. Berg's (1986) article, "The Future Petroleum Geologist," summarizes the findings of the 13-member AAPG Select Committee on The Future Petroleum Geologist appointed by President William L. Fisher in July 1985. While this undertaking is laudable, particularly considering present circumstances in the petroleum industry, the committee has apparently overlooked a vital aspect concerning the future knowledge requirements of the petroleum geologist. Specifically, the Select Committee makes no mention of the need for computer literacy in its list of educational training categories. Obviously, AAPG is well aware of both the interest in computers among its membership and the increasing need for training and familiarity in this discipline. The Select Committee on The Future Petroleum Geologist, while undertaking a difficult and potentially controversial task, has omitted an important aspect of the background requirements for generations of future petroleum geologists; the committee should consider an amendment to its recommendations to reflect this increasingly important field of study.

  12. The next generation of command post computing

    NASA Astrophysics Data System (ADS)

    Arnold, Ross D.; Lieb, Aaron J.; Samuel, Jason M.; Burger, Mitchell A.

    2015-05-01

    The future of command post computing demands an innovative new solution to address a variety of challenging operational needs. The Command Post of the Future is the Army's primary command and control decision support system, providing situational awareness and collaborative tools for tactical decision making, planning, and execution management from Corps to Company level. However, as the U.S. Army moves towards a lightweight, fully networked battalion, disconnected operations, thin client architecture and mobile computing become increasingly essential. The Command Post of the Future is not designed to support these challenges in the coming decade. Therefore, research into a hybrid blend of technologies is in progress to address these issues. This research focuses on a new command and control system utilizing the rich collaboration framework afforded by Command Post of the Future coupled with a new user interface consisting of a variety of innovative workspace designs. This new system is called Tactical Applications. This paper details a brief history of command post computing, presents the challenges facing the modern Army, and explores the concepts under consideration for Tactical Applications that meet these challenges in a variety of innovative ways.

  13. Demonstration of measurement-only blind quantum computing

    NASA Astrophysics Data System (ADS)

    Greganti, Chiara; Roehsner, Marie-Christine; Barz, Stefanie; Morimae, Tomoyuki; Walther, Philip

    2016-01-01

    Blind quantum computing allows for secure cloud networks of quasi-classical clients and a fully fledged quantum server. Recently, a new protocol has been proposed, which requires a client to perform only measurements. We demonstrate a proof-of-principle implementation of this measurement-only blind quantum computing, exploiting a photonic setup to generate four-qubit cluster states for computation and verification. Feasible technological requirements for the client and the device-independent blindness make this scheme very applicable for future secure quantum networks.

  14. Three-Dimensional Displays In The Future Flight Station

    NASA Astrophysics Data System (ADS)

    Bridges, Alan L.

    1984-10-01

This review paper summarizes the development and applications of computer techniques for the representation of three-dimensional data in the future flight station. It covers the development of the Lockheed-NASA Advanced Concepts Flight Station (ACFS) research simulators. These simulators contain: a Pilot's Desk Flight Station (PDFS) with five 13-inch diagonal, color, cathode ray tubes on the main instrument panel; a computer-generated day and night visual system; a six-degree-of-freedom motion base; and a computer complex. This paper reviews current research, development, and evaluation of easily modifiable display systems and software requirements for three-dimensional displays that may be developed for the PDFS. This includes the analysis and development of a 3-D representation of the entire flight profile. This 3-D flight path, or "Highway-in-the-Sky", will utilize motion and perspective cues to tightly couple the human responses of the pilot to the aircraft control systems. The use of custom logic, e.g., graphics engines, may provide the processing power and architecture required for 3-D computer-generated imagery (CGI) or visual scene simulation (VSS). Diffraction or holographic head-up displays (HUDs) will also be integrated into the ACFS simulator to permit research on the requirements and use of these "out-the-window" projection systems. Future research may include the retrieval of high-resolution, perspective view terrain maps which could then be overlaid with current weather information or other selectable cultural features.

  15. Rapid Geometry Creation for Computer-Aided Engineering Parametric Analyses: A Case Study Using ComGeom2 for Launch Abort System Design

    NASA Technical Reports Server (NTRS)

    Hawke, Veronica; Gage, Peter; Manning, Ted

    2007-01-01

    ComGeom2, a tool developed to generate Common Geometry representation for multidisciplinary analysis, has been used to create a large set of geometries for use in a design study requiring analysis by two computational codes. This paper describes the process used to generate the large number of configurations and suggests ways to further automate the process and make it more efficient for future studies. The design geometry for this study is the launch abort system of the NASA Crew Launch Vehicle.

  16. Has computational creativity successfully made it "Beyond the Fence" in musical theatre?

    NASA Astrophysics Data System (ADS)

    Jordanous, Anna

    2017-10-01

    A significant test for software is to task it with replicating human performance, as done recently with creative software and the commercial project Beyond the Fence (undertaken for a television documentary Computer Says Show). The remit of this project was to use computer software as much as possible to produce "the world's first computer-generated musical". Several creative systems were used to generate this musical, which was performed in London's West End in 2016. This paper considers the challenge of evaluating this project. Current computational creativity evaluation methods are ill-suited to evaluating projects that involve creative input from multiple systems and people. Following recent inspiration within computational creativity research from interaction design, here the DECIDE evaluation framework is applied to evaluate the Beyond the Fence project. Evaluation finds that the project was reasonably successful at achieving the task of using computational generation to produce a credible musical. Lessons have been learned for future computational creativity projects though, particularly for affording creative software more agency and enabling software to interact with other creative partners. Upon reflection, the DECIDE framework emerges as a useful evaluation "checklist" (if not a tangible operational methodology) for evaluating multiple creative systems participating in a creative task.

  17. Alloy design for aircraft engines

    NASA Astrophysics Data System (ADS)

    Pollock, Tresa M.

    2016-08-01

    Metallic materials are fundamental to advanced aircraft engines. While perceived as mature, emerging computational, experimental and processing innovations are expanding the scope for discovery and implementation of new metallic materials for future generations of advanced propulsion systems.

  18. Next generation keyboards: The importance of cognitive compatibility

    NASA Technical Reports Server (NTRS)

    Amell, John R.; Ewry, Michael E.; Colle, Herbert A.

    1988-01-01

    The computer keyboard of today is essentially the same as it has been for many years. Few advances have been made in keyboard design even though computer systems in general have made remarkable progress in improvements. This paper discusses the future of keyboards, their competition and compatibility with voice input systems, and possible special-application intelligent keyboards for controlling complex systems.

  19. Computer-assisted total hip arthroplasty: coding the next generation of navigation systems for orthopedic surgery.

    PubMed

    Renkawitz, Tobias; Tingart, Markus; Grifka, Joachim; Sendtner, Ernst; Kalteis, Thomas

    2009-09-01

    This article outlines the scientific basis and a state-of-the-art application of computer-assisted orthopedic surgery in total hip arthroplasty (THA) and provides a future perspective on this technology. Computer-assisted orthopedic surgery in primary THA has the potential to couple 3D simulations with real-time evaluations of surgical performance, which has brought these developments from the research laboratory all the way to clinical use. Nonimage- or imageless-based navigation systems without the need for additional pre- or intra-operative image acquisition have stood the test to significantly reduce the variability in positioning the acetabular component and have shown precise measurement of leg length and offset changes during THA. More recently, computer-assisted orthopedic surgery systems have opened a new frontier for accurate surgical practice in minimally invasive, tissue-preserving THA. The future generation of imageless navigation systems will switch from simple measurement tasks to real navigation tools. These software algorithms will consider the cup and stem as components of a coupled biomechanical system, navigating the orthopedic surgeon to find an optimized complementary component orientation rather than target values intraoperatively, and are expected to have a high impact on clinical practice and postoperative functionality in modern THA.

  20. Interconnection requirements in avionic systems

    NASA Astrophysics Data System (ADS)

    Vergnolle, Claude; Houssay, Bruno

    1991-04-01

The future aircraft generation will have thousands of smart electromagnetic sensors distributed all over. Each sensor is connected by fiber links to the mainframe computer in charge of the real-time correlation of the signals. Such a computer must be compactly built and massively parallel: it needs 3-D optical free-space interconnects between neighbouring boards and reconfigurable interconnects via a holographic backplane. The optical interconnect facilities will also be used to build fault-tolerant computers through large redundancy.

  1. Current state and future direction of computer systems at NASA Langley Research Center

    NASA Technical Reports Server (NTRS)

    Rogers, James L. (Editor); Tucker, Jerry H. (Editor)

    1992-01-01

Computer systems have advanced at a rate unmatched by any other area of technology. As performance has dramatically increased, there has been an equally dramatic reduction in cost. This constant cost performance improvement has precipitated the pervasiveness of computer systems into virtually all areas of technology. This improvement is due primarily to advances in microelectronics. Most people are now convinced that the new generation of supercomputers will be built using a large number (possibly thousands) of high performance microprocessors. Although the spectacular improvements in computer systems have come about because of these hardware advances, there has also been a steady improvement in software techniques. In an effort to understand how these hardware and software advances will affect research at NASA LaRC, the Computer Systems Technical Committee drafted this white paper to examine the current state and possible future directions of computer systems at the Center. This paper discusses selected important areas of computer systems including real-time systems, embedded systems, high performance computing, distributed computing networks, data acquisition systems, artificial intelligence, and visualization.

  2. Medicinal chemistry in drug discovery in big pharma: past, present and future.

    PubMed

    Campbell, Ian B; Macdonald, Simon J F; Procopiou, Panayiotis A

    2018-02-01

    The changes in synthetic and medicinal chemistry and related drug discovery science as practiced in big pharma over the past few decades are described. These have been predominantly driven by wider changes in society namely the computer, internet and globalisation. Thoughts about the future of medicinal chemistry are also discussed including sharing the risks and costs of drug discovery and the future of outsourcing. The continuing impact of access to substantial computing power and big data, the use of algorithms in data analysis and drug design are also presented. The next generation of medicinal chemists will communicate in ways that reflect social media and the results of constantly being connected to each other and data. Copyright © 2017. Published by Elsevier Ltd.

  3. Practical applications of interactive voice technologies: Some accomplishments and prospects

    NASA Technical Reports Server (NTRS)

    Grady, Michael W.; Hicklin, M. B.; Porter, J. E.

    1977-01-01

    A technology assessment of the application of computers and electronics to complex systems is presented. Three existing systems which utilize voice technology (speech recognition and speech generation) are described. Future directions in voice technology are also described.

  4. This Is Your Future: A Case Study Approach to Foster Health Literacy

    ERIC Educational Resources Information Center

    Brey, Rebecca A.; Clark, Susan E.; Wantz, Molly S.

    2008-01-01

    Today's young people seem to live in an even faster fast-paced society than previous generations. As in the past, they are involved in sports, music, school, church, work, and are exposed to many forms of mass media that add to their base of information. However, they also have instant access to computer-generated information such as the Internet,…

  5. The present and future of de novo whole-genome assembly.

    PubMed

    Sohn, Jang-Il; Nam, Jin-Wu

    2018-01-01

With the advent of next-generation sequencing (NGS) technology, various de novo assembly algorithms based on the de Bruijn graph have been developed to construct chromosome-level sequences. However, numerous technical or computational challenges in de novo assembly still remain, although many bright ideas and heuristics have been suggested to tackle the challenges in both experimental and computational settings. In this review, we categorize de novo assemblers on the basis of the type of de Bruijn graphs (Hamiltonian and Eulerian) and discuss the challenges of de novo assembly for short NGS reads regarding computational complexity and assembly ambiguity. Then, we discuss how the limitations of the short reads can be overcome by using a single-molecule sequencing platform that generates long reads of up to several kilobases. In fact, the long read assembly has caused a paradigm shift in whole-genome assembly in terms of algorithms and supporting steps. We also summarize (i) hybrid assemblies using both short and long reads and (ii) overlap-based assemblies for long reads and discuss their challenges and future prospects. This review provides guidelines to determine the optimal approach for a given input data type, computational budget or genome. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
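
    As a toy illustration of the de Bruijn graph construction these assemblers build on (nodes are (k-1)-mers and each k-mer contributes one edge; a real assembler adds error correction, graph simplification and an Eulerian-style traversal):

    ```python
    from collections import defaultdict

    def de_bruijn(reads, k):
        """Build a de Bruijn graph: nodes are (k-1)-mers, each k-mer adds an edge."""
        graph = defaultdict(list)
        for read in reads:
            for i in range(len(read) - k + 1):
                kmer = read[i:i + k]
                graph[kmer[:-1]].append(kmer[1:])
        return graph

    reads = ["ACGTACGA", "GTACGAT"]          # invented toy reads
    for node, successors in de_bruijn(reads, k=4).items():
        print(node, "->", successors)
    ```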

  6. Human Modeling for Ground Processing Human Factors Engineering Analysis

    NASA Technical Reports Server (NTRS)

    Stambolian, Damon B.; Lawrence, Brad A.; Stelges, Katrine S.; Steady, Marie-Jeanne O.; Ridgwell, Lora C.; Mills, Robert E.; Henderson, Gena; Tran, Donald; Barth, Tim

    2011-01-01

There have been many advancements and accomplishments over the last few years using human modeling for human factors engineering analysis for the design of spacecraft. The key methods used for this are motion capture and computer-generated human models. The focus of this paper is to explain the human modeling currently used at Kennedy Space Center (KSC), and to explain the future plans for human modeling for future spacecraft designs.

  7. CGAT: a model for immersive personalized training in computational genomics

    PubMed Central

    Sims, David; Ponting, Chris P.

    2016-01-01

    How should the next generation of genomics scientists be trained while simultaneously pursuing high quality and diverse research? CGAT, the Computational Genomics Analysis and Training programme, was set up in 2010 by the UK Medical Research Council to complement its investment in next-generation sequencing capacity. CGAT was conceived around the twin goals of training future leaders in genome biology and medicine, and providing much needed capacity to UK science for analysing genome scale data sets. Here we outline the training programme employed by CGAT and describe how it dovetails with collaborative research projects to launch scientists on the road towards independent research careers in genomics. PMID:25981124

  8. Poster — Thur Eve — 74: Distributed, asynchronous, reactive dosimetric and outcomes analysis using DICOMautomaton

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Clark, Haley; BC Cancer Agency, Surrey, B.C.; BC Cancer Agency, Vancouver, B.C.

    2014-08-15

Many have speculated about the future of computational technology in clinical radiation oncology. It has been advocated that the next generation of computational infrastructure will improve on the current generation by incorporating richer aspects of automation, more heavily and seamlessly featuring distributed and parallel computation, and providing more flexibility toward aggregate data analysis. In this report we describe how a recently created — but currently existing — analysis framework (DICOMautomaton) incorporates these aspects. DICOMautomaton supports a variety of use cases but is especially suited for dosimetric outcomes correlation analysis, investigation and comparison of radiotherapy treatment efficacy, and dose-volume computation. We describe: how it overcomes computational bottlenecks by distributing workload across a network of machines; how modern, asynchronous computational techniques are used to reduce blocking and avoid unnecessary computation; and how issues of out-of-date data are addressed using reactive programming techniques and data dependency chains. We describe internal architecture of the software and give a detailed demonstration of how DICOMautomaton could be used to search for correlations between dosimetric and outcomes data.

  9. Status of Sundstrand research

    NASA Technical Reports Server (NTRS)

    Bateman, Don

    1991-01-01

    Wind shear detection status is presented in the form of view-graphs. The following subject areas are covered: second generation detection (Q-bias, gamma bias, temperature biases, maneuvering flight modulation, and altitude modulation); third generation wind shear detection (use wind shear computation to augment flight path and terrain alerts, modulation of alert thresholds based on wind/terrain data base, incorporate wind shear/terrain alert enhancements from predictive sensor data); and future research and development.

  10. Deterministic and robust generation of single photons from a single quantum dot with 99.5% indistinguishability using adiabatic rapid passage.

    PubMed

    Wei, Yu-Jia; He, Yu-Ming; Chen, Ming-Cheng; Hu, Yi-Nan; He, Yu; Wu, Dian; Schneider, Christian; Kamp, Martin; Höfling, Sven; Lu, Chao-Yang; Pan, Jian-Wei

    2014-11-12

    Single photons are attractive candidates of quantum bits (qubits) for quantum computation and are the best messengers in quantum networks. Future scalable, fault-tolerant photonic quantum technologies demand both stringently high levels of photon indistinguishability and generation efficiency. Here, we demonstrate deterministic and robust generation of pulsed resonance fluorescence single photons from a single semiconductor quantum dot using adiabatic rapid passage, a method robust against fluctuation of driving pulse area and dipole moments of solid-state emitters. The emitted photons are background-free, have a vanishing two-photon emission probability of 0.3% and a raw (corrected) two-photon Hong-Ou-Mandel interference visibility of 97.9% (99.5%), reaching a precision that places single photons at the threshold for fault-tolerant surface-code quantum computing. This single-photon source can be readily scaled up to multiphoton entanglement and used for quantum metrology, boson sampling, and linear optical quantum computing.

  11. Computational physical oceanography -- A comprehensive approach based on generalized CFD/grid techniques for planetary scale simulations of oceanic flows. Final report, September 1, 1995--August 31, 1996

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Beddhu, M.; Jiang, M.Y.; Whitfield, D.L.

The original intention for this work was to impart the technology that was developed in the field of computational aeronautics to the field of computational physical oceanography. This technology transfer involved grid generation techniques and solution procedures to solve the governing equations over the grids thus generated. Specifically, boundary fitting non-orthogonal grids would be generated over a sphere taking into account the topography of the ocean floor and the topography of the continents. The solution methodology to be employed involved the application of an upwind, finite volume discretization procedure that uses higher order numerical fluxes at the cell faces to discretize the governing equations and an implicit Newton relaxation technique to solve the discretized equations. This report summarizes the efforts put forth during the past three years to achieve these goals and indicates the future direction of this work as it is still an ongoing effort.
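
    As a much-simplified illustration of the upwind finite-volume idea mentioned above (first-order and explicit for 1-D linear advection, whereas the report's scheme is higher-order, implicit and applied to the full governing equations on curvilinear grids):

    ```python
    import numpy as np

    def upwind_advection(u0, speed, dx, dt, steps):
        """First-order upwind finite-volume update for du/dt + a du/dx = 0, a > 0.
        The cell-face flux is taken from the upwind (left) cell."""
        u = u0.copy()
        for _ in range(steps):
            flux = speed * u                          # upwind flux at each cell
            u[1:] -= dt / dx * (flux[1:] - flux[:-1])
        return u

    x = np.linspace(0.0, 1.0, 101)
    u0 = np.where((x > 0.1) & (x < 0.3), 1.0, 0.0)    # square pulse
    u = upwind_advection(u0, speed=1.0, dx=x[1] - x[0], dt=0.004, steps=100)
    print(round(x[np.argmax(u)], 2))   # pulse centre has advected right by roughly a*t
    ```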

  12. Exploiting current-generation graphics hardware for synthetic-scene generation

    NASA Astrophysics Data System (ADS)

    Tanner, Michael A.; Keen, Wayne A.

    2010-04-01

Increasing seeker frame rate and pixel count, as well as the demand for higher levels of scene fidelity, have driven scene generation software for hardware-in-the-loop (HWIL) and software-in-the-loop (SWIL) testing to higher levels of parallelization. Because modern PC graphics cards provide multiple computational cores (240 shader cores for current NVIDIA GeForce and Quadro cards), implementation of phenomenology codes on graphics processing units (GPUs) offers significant potential for simultaneous enhancement of simulation frame rate and fidelity. To take advantage of this potential requires algorithm implementation that is structured to minimize data transfers between the central processing unit (CPU) and the GPU. In this paper, preliminary methodologies developed at the Kinetic Hardware In-The-Loop Simulator (KHILS) will be presented. Included in this paper will be various language tradeoffs between conventional shader programming, Compute Unified Device Architecture (CUDA) and Open Computing Language (OpenCL), including performance trades and possible pathways for future tool development.

  13. Assessment of a stochastic downscaling methodology in generating an ensemble of hourly future climate time series

    NASA Astrophysics Data System (ADS)

    Fatichi, S.; Ivanov, V. Y.; Caporali, E.

    2013-04-01

    This study extends a stochastic downscaling methodology to generation of an ensemble of hourly time series of meteorological variables that express possible future climate conditions at a point-scale. The stochastic downscaling uses general circulation model (GCM) realizations and an hourly weather generator, the Advanced WEather GENerator (AWE-GEN). Marginal distributions of factors of change are computed for several climate statistics using a Bayesian methodology that can weight GCM realizations based on the model relative performance with respect to a historical climate and a degree of disagreement in projecting future conditions. A Monte Carlo technique is used to sample the factors of change from their respective marginal distributions. As a comparison with traditional approaches, factors of change are also estimated by averaging GCM realizations. With either approach, the derived factors of change are applied to the climate statistics inferred from historical observations to re-evaluate parameters of the weather generator. The re-parameterized generator yields hourly time series of meteorological variables that can be considered to be representative of future climate conditions. In this study, the time series are generated in an ensemble mode to fully reflect the uncertainty of GCM projections, climate stochasticity, as well as uncertainties of the downscaling procedure. Applications of the methodology in reproducing future climate conditions for the periods of 2000-2009, 2046-2065 and 2081-2100, using the period of 1962-1992 as the historical baseline are discussed for the location of Firenze (Italy). The inferences of the methodology for the period of 2000-2009 are tested against observations to assess reliability of the stochastic downscaling procedure in reproducing statistics of meteorological variables at different time scales.
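
    A hedged sketch of the factor-of-change step described above: sample multiplicative factors from assumed marginal distributions and apply them to observed monthly statistics before re-parameterizing the weather generator. The normal marginals, parameter values and ensemble size are illustrative assumptions, not those of AWE-GEN or the Bayesian weighting used in the paper.

    ```python
    import numpy as np

    rng = np.random.default_rng(42)

    def sample_factors(mean, std, n_months=12):
        """Monte Carlo draw of multiplicative factors of change, one per month
        (normal marginals assumed purely for illustration)."""
        return rng.normal(mean, std, size=n_months)

    # Observed historical statistic (illustrative): mean monthly precipitation [mm].
    historical_mean_precip = np.array([60, 55, 50, 45, 40, 30, 20, 25, 45, 70, 80, 65.0])

    # Assumed marginal distribution of the precipitation factor of change.
    factor_mean, factor_std = 0.95, 0.08
    ensemble = []
    for member in range(100):
        factors = sample_factors(factor_mean, factor_std)
        future_mean_precip = historical_mean_precip * factors
        ensemble.append(future_mean_precip)   # would feed the weather-generator re-fit

    print(np.mean(ensemble, axis=0).round(1))
    ```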

  14. Computational biology for ageing

    PubMed Central

    Wieser, Daniela; Papatheodorou, Irene; Ziehm, Matthias; Thornton, Janet M.

    2011-01-01

    High-throughput genomic and proteomic technologies have generated a wealth of publicly available data on ageing. Easy access to these data, and their computational analysis, is of great importance in order to pinpoint the causes and effects of ageing. Here, we provide a description of the existing databases and computational tools on ageing that are available for researchers. We also describe the computational approaches to data interpretation in the field of ageing including gene expression, comparative and pathway analyses, and highlight the challenges for future developments. We review recent biological insights gained from applying bioinformatics methods to analyse and interpret ageing data in different organisms, tissues and conditions. PMID:21115530

  15. Eruptive event generator based on the Gibson-Low magnetic configuration

    NASA Astrophysics Data System (ADS)

    Borovikov, D.; Sokolov, I. V.; Manchester, W. B.; Jin, M.; Gombosi, T. I.

    2017-08-01

    Coronal mass ejections (CMEs), a kind of energetic solar eruption, are an integral subject of space weather research. Numerical magnetohydrodynamic (MHD) modeling, which requires powerful computational resources, is one of the primary means of studying the phenomenon. As such resources become more accessible, the demand grows for user-friendly tools that facilitate the process of simulating CMEs for scientific and operational purposes. The Eruptive Event Generator based on Gibson-Low flux rope (EEGGL), a new publicly available computational model presented in this paper, is an effort to meet this demand. EEGGL allows one to compute the parameters of a model flux rope driving a CME via an intuitive graphical user interface. We provide a brief overview of the physical principles behind EEGGL and its functionality. Ways toward future improvements of the tool are outlined.

  16. CGAT: a model for immersive personalized training in computational genomics.

    PubMed

    Sims, David; Ponting, Chris P; Heger, Andreas

    2016-01-01

    How should the next generation of genomics scientists be trained while simultaneously pursuing high quality and diverse research? CGAT, the Computational Genomics Analysis and Training programme, was set up in 2010 by the UK Medical Research Council to complement its investment in next-generation sequencing capacity. CGAT was conceived around the twin goals of training future leaders in genome biology and medicine, and providing much needed capacity to UK science for analysing genome scale data sets. Here we outline the training programme employed by CGAT and describe how it dovetails with collaborative research projects to launch scientists on the road towards independent research careers in genomics. © The Author 2015. Published by Oxford University Press.

  17. Vectors into the Future of Mass and Interpersonal Communication Research: Big Data, Social Media, and Computational Social Science.

    PubMed

    Cappella, Joseph N

    2017-10-01

    Simultaneous developments in big data, social media, and computational social science have set the stage for how we think about and understand interpersonal and mass communication. This article explores some of the ways that these developments generate 4 hypothetical "vectors" - directions - into the next generation of communication research. These vectors include developments in network analysis, modeling interpersonal and social influence, recommendation systems, and the blurring of distinctions between interpersonal and mass audiences through narrowcasting and broadcasting. The methods and research in these arenas are occurring in areas outside the typical boundaries of the communication discipline but engage classic, substantive questions in mass and interpersonal communication.

  18. Transmission expansion with smart switching under demand uncertainty and line failures

    DOE PAGES

    Schumacher, Kathryn M.; Chen, Richard Li-Yang; Cohn, Amy E. M.

    2016-06-07

    One of the major challenges in deciding where to build new transmission lines is the uncertainty regarding future loads, renewable generation output and equipment failures. We propose a robust optimization model whose transmission expansion solutions ensure that demand can be met over a wide range of conditions. Specifically, we require feasible operation for all loads and renewable generation levels within given ranges, and for all single transmission line failures. Furthermore, we consider transmission switching as an allowable recovery action. This relatively inexpensive method of redirecting power flows improves resiliency, but introduces computational challenges. Lastly, we present a novel algorithm to solve this model. Computational results are discussed.
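
    The robustness requirement can be sketched compactly under strong simplifications (this is not the paper's model: a capacity-only transport network replaces the DC power-flow physics, the expansion and switching decisions are fixed, and all network data are illustrative). The sketch checks feasibility for every demand corner combined with every single-line failure.

    ```python
    import numpy as np
    from scipy.optimize import linprog

    # Toy 3-bus network: (from_bus, to_bus, capacity).  Illustrative data only.
    lines = [(0, 1, 2.0), (1, 2, 1.0), (0, 2, 1.5)]
    gen_cap = np.array([2.0, 0.0, 0.0])      # generation available only at bus 0

    def feasible(demand, failed_line):
        """Capacity-only check: can the surviving lines route generation to the loads?"""
        active = [i for i in range(len(lines)) if i != failed_line]
        n_bus, n_fl = len(gen_cap), len(active)
        # Nodal balance: -outflow + inflow + generation = demand at every bus.
        A_eq = np.zeros((n_bus, n_fl + n_bus))
        for k, i in enumerate(active):
            f, t, _ = lines[i]
            A_eq[f, k], A_eq[t, k] = -1.0, 1.0
        A_eq[:, n_fl:] = np.eye(n_bus)
        bounds = [(-lines[i][2], lines[i][2]) for i in active] + [(0.0, g) for g in gen_cap]
        res = linprog(np.zeros(n_fl + n_bus), A_eq=A_eq, b_eq=demand,
                      bounds=bounds, method="highs")
        return res.success

    # Demand uncertainty: corners of the load ranges at buses 1 and 2.
    corners = [np.array([0.0, d1, d2]) for d1 in (0.5, 1.0) for d2 in (0.2, 0.5)]
    robust = all(feasible(d, fail) for d in corners for fail in range(len(lines)))
    print("feasible for every demand corner and single-line failure:", robust)
    ```

    In the full model the feasibility subproblem also carries Kirchhoff voltage constraints, and switching lines in or out becomes a genuine recovery action; in this capacity-only relaxation switching cannot help, which is one reason the real problem is much harder.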

  19. Information Technologies in the Health Care System. Hearing before the Subcommittee on Investigations and Oversight of the Committee on Science and Technology. U.S. House of Representatives, Ninety-Ninth Congress, Second Session.

    ERIC Educational Resources Information Center

    Congress of the U.S., Washington, DC. House Committee on Science and Technology.

    Hearings on the use of computer technology in the health care field are presented to provide information needed by Congress and the Food and Drug Administration to make future policies. Medical computing systems can make interpretations of data on the patient's health and can generate diagnostic recommendations to the physician. Included are…

  20. A Prospectus for the Future Development of a Speech Lab: Hypertext Applications.

    ERIC Educational Resources Information Center

    Berube, David M.

    This paper presents a plan for the next generation of speech laboratories which integrates technologies of modern communication in order to improve and modernize the instructional process. The paper first examines the application of intermediate technologies including audio-video recording and playback, computer assisted instruction and testing…

  1. Connecting to the Future

    ERIC Educational Resources Information Center

    Kennedy, Mike

    2010-01-01

    For the generation of people whose classroom memories consist of chalk squeaking on a blackboard, weather-beaten textbooks and a ready supply of sharpened No. 2 pencils, the resources available to students in many 21st-century American schools may seem unfamiliar, even amazing. Computer networks with access to the Internet--wired or wireless--have…

  2. Finding the Future That Fits.

    ERIC Educational Resources Information Center

    Taylor, Alison

    In 2000, a government-supported foundation called Careers the Next Generation (CNG) in Alberta, Canada, began coordinating summer internships for high school students in information and computer technology (ICT). The participating firms represented a mix of large and small private and public organizations in high-tech and other industries in the…

  3. Assessing the benefits and economic values of trees

    Treesearch

    David J. Nowak

    2017-01-01

    Understanding the environmental, economic, and social/community benefits of nature, in particular trees and forests, can lead to better vegetation management and designs to optimize environmental quality and human health for current and future generations. Computer models have been developed to assess forest composition and its associated effects on environmental...

  4. President's Information Technology Advisory Committee Interim Report to the President.

    ERIC Educational Resources Information Center

    National Coordination Office for Information Technology Research and Development, Arlington, VA.

    This document is the Interim Report on future directions for Federal support of research and development in high performance computing, communications, information technology, and the Next Generation Internet. This report provides a more detailed explanation of the findings and recommendations summarized by the President's Information Technology…

  5. The FuturICT education accelerator

    NASA Astrophysics Data System (ADS)

    Johnson, J.; Buckingham Shum, S.; Willis, A.; Bishop, S.; Zamenopoulos, T.; Swithenby, S.; MacKay, R.; Merali, Y.; Lorincz, A.; Costea, C.; Bourgine, P.; Louçã, J.; Kapenieks, A.; Kelley, P.; Caird, S.; Bromley, J.; Deakin Crick, R.; Goldspink, C.; Collet, P.; Carbone, A.; Helbing, D.

    2012-11-01

    Education is a major force for economic and social wellbeing. Despite high aspirations, education at all levels can be expensive and ineffective. Three Grand Challenges are identified: (1) enable people to learn orders of magnitude more effectively, (2) enable people to learn at orders of magnitude less cost, and (3) demonstrate success by exemplary interdisciplinary education in complex systems science. A ten year `man-on-the-moon' project is proposed in which FuturICT's unique combination of Complexity, Social and Computing Sciences could provide an urgently needed transdisciplinary language for making sense of educational systems. In close dialogue with educational theory and practice, and grounded in the emerging data science and learning analytics paradigms, this will translate into practical tools (both analytical and computational) for researchers, practitioners and leaders; generative principles for resilient educational ecosystems; and innovation for radically scalable, yet personalised, learner engagement and assessment. The proposed Education Accelerator will serve as a `wind tunnel' for testing these ideas in the context of real educational programmes, with an international virtual campus delivering complex systems education exploiting the new understanding of complex, social, computationally enhanced organisational structure developed within FuturICT.

  6. A New Look at NASA: Strategic Research In Information Technology

    NASA Technical Reports Server (NTRS)

    Alfano, David; Tu, Eugene (Technical Monitor)

    2002-01-01

    This viewgraph presentation provides information on research undertaken by NASA to facilitate the development of information technologies. Specific ideas covered here include: 1) Bio/nano technologies: biomolecular and nanoscale systems and tools for assembly and computing; 2) Evolvable hardware: autonomous self-improving, self-repairing hardware and software for survivable space systems in extreme environments; 3) High Confidence Software Technologies: formal methods, high-assurance software design, and program synthesis; 4) Intelligent Controls and Diagnostics: Next generation machine learning, adaptive control, and health management technologies; 5) Revolutionary computing: New computational models to increase capability and robustness to enable future NASA space missions.

  7. Advances in Computational Capabilities for Hypersonic Flows

    NASA Technical Reports Server (NTRS)

    Kumar, Ajay; Gnoffo, Peter A.; Moss, James N.; Drummond, J. Philip

    1997-01-01

    The paper reviews the growth and advances in computational capabilities for hypersonic applications over the period from the mid-1980's to the present day. The current status of the code development issues such as surface and field grid generation, algorithms, physical and chemical modeling, and validation is provided. A brief description of some of the major codes being used at NASA Langley Research Center for hypersonic continuum and rarefied flows is provided, along with their capabilities and deficiencies. A number of application examples are presented, and future areas of research to enhance accuracy, reliability, efficiency, and robustness of computational codes are discussed.

  8. Data-driven indexing mechanism for the recognition of polyhedral objects

    NASA Astrophysics Data System (ADS)

    McLean, Stewart; Horan, Peter; Caelli, Terry M.

    1992-02-01

    This paper is concerned with the problem of searching large model databases. To date, most object recognition systems have concentrated on the problem of matching using simple searching algorithms. This is quite acceptable when the number of object models is small. However, in the future, general purpose computer vision systems will be required to recognize hundreds or perhaps thousands of objects and, in such circumstances, efficient searching algorithms will be needed. The problem of searching a large model database is one which must be addressed if future computer vision systems are to be at all effective. In this paper we present a method we call data-driven feature-indexed hypothesis generation as one solution to the problem of searching large model databases.
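
    The indexing idea can be illustrated with a toy inverted index from quantized feature tuples to candidate models, so that observed scene features generate model hypotheses directly rather than being matched against every model in turn (a minimal stand-in for the paper's data-driven feature-indexed hypothesis generation; all features and models are invented for the example).

    ```python
    from collections import defaultdict

    # Quantize a (vertex angle, edge-length ratio) feature so that nearby
    # measurements fall into the same index bin.
    def quantize(angle_deg, length_ratio):
        return (round(angle_deg / 10.0), round(length_ratio, 1))

    # Toy model database: each polyhedral model is described by a few features.
    model_features = {
        "cube":    [(90, 1.0)],
        "wedge":   [(90, 1.0), (45, 1.4)],
        "pyramid": [(60, 1.2)],
    }

    index = defaultdict(set)
    for model, feats in model_features.items():
        for a, r in feats:
            index[quantize(a, r)].add(model)

    def hypotheses(observed_features):
        """Vote for models whose indexed features match the observed ones."""
        votes = defaultdict(int)
        for a, r in observed_features:
            for model in index.get(quantize(a, r), ()):
                votes[model] += 1
        return sorted(votes.items(), key=lambda kv: -kv[1])

    # Two noisy measurements from the scene; the wedge collects the most votes.
    print(hypotheses([(89.0, 1.02), (44.0, 1.38)]))
    ```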

  9. The transition of GTDS to the Unix workstation environment

    NASA Technical Reports Server (NTRS)

    Carter, D.; Metzinger, R.; Proulx, R.; Cefola, P.

    1995-01-01

    Future Flight Dynamics systems should take advantage of the possibilities provided by current and future generations of low-cost, high performance workstation computing environments with Graphical User Interface. The port of the existing mainframe Flight Dynamics systems to the workstation environment offers an economic approach for combining the tremendous engineering heritage that has been encapsulated in these systems with the advantages of the new computing environments. This paper will describe the successful transition of the Draper Laboratory R&D version of GTDS (Goddard Trajectory Determination System) from the IBM Mainframe to the Unix workstation environment. The approach will be a mix of historical timeline notes, descriptions of the technical problems overcome, and descriptions of associated SQA (software quality assurance) issues.

  10. Enabling Grid Computing resources within the KM3NeT computing model

    NASA Astrophysics Data System (ADS)

    Filippidis, Christos

    2016-04-01

    KM3NeT is a future European deep-sea research infrastructure hosting a new generation of neutrino detectors that - located at the bottom of the Mediterranean Sea - will open a new window on the universe and answer fundamental questions in both particle physics and astrophysics. International collaborative scientific experiments, like KM3NeT, are generating datasets which are increasing exponentially in both complexity and volume, making their analysis, archival, and sharing one of the grand challenges of the 21st century. These experiments, for the most part, adopt computing models consisting of different Tiers with several computing centres, providing a specific set of services for the different steps of data processing such as detector calibration, simulation and data filtering, reconstruction and analysis. The computing requirements are extremely demanding and usually span from serial to multi-parallel or GPU-optimized jobs. The collaborative nature of these experiments demands very frequent WAN data transfers and data sharing among individuals and groups. In order to support the aforementioned demanding computing requirements we enabled Grid Computing resources, operated by EGI, within the KM3NeT computing model. In this study we describe our first advances in this field and the method for the KM3NeT users to utilize the EGI computing resources in a simulation-driven use-case.

  11. Stress Computed Tomography Myocardial Perfusion Imaging: A New Topic in Cardiology.

    PubMed

    Seitun, Sara; Castiglione Morelli, Margherita; Budaj, Irilda; Boccalini, Sara; Galletto Pregliasco, Athena; Valbusa, Alberto; Cademartiri, Filippo; Ferro, Carlo

    2016-02-01

    Since its introduction about 15 years ago, coronary computed tomography angiography has become today the most accurate clinical instrument for noninvasive assessment of coronary atherosclerosis. Important technical developments have led to a continuous stream of new clinical applications together with a significant reduction in radiation dose exposure. Latest generation computed tomography scanners (≥ 64 slices) allow the possibility of performing static or dynamic perfusion imaging during stress by using coronary vasodilator agents (adenosine, dipyridamole, or regadenoson), combining both functional and anatomical information in the same examination. In this article, the emerging role and state-of-the-art of myocardial computed tomography perfusion imaging are reviewed and are illustrated by clinical cases from our experience with a second-generation dual-source 128-slice scanner (Somatom Definition Flash, Siemens; Erlangen, Germany). Technical aspects, data analysis, diagnostic accuracy, radiation dose and future prospects are reviewed. Copyright © 2015 Sociedad Española de Cardiología. Published by Elsevier España, S.L.U. All rights reserved.

  12. Estimation of future outflows of e-waste in India.

    PubMed

    Dwivedy, Maheshwar; Mittal, R K

    2010-03-01

    The purpose of this study is to construct an approach and a methodology to estimate the future outflows of electronic waste (e-waste) in India. Consequently, the study utilizes a time-series multiple lifespan end-of-life model proposed by Peralta and Fontanos for estimating the current and future quantities of e-waste in India. The model estimates future e-waste generation quantities by modeling their usage and disposal. The present work considers two scenarios for the approximation of e-waste generation based on user preferences to store or to recycle the e-waste. This model will help formal recyclers in India to make strategic decisions in planning for appropriate recycling infrastructure and institutional capacity building. Also, an extension of the model proposed by Peralta and Fontanos is developed with the objective of helping decision makers to conduct WEEE estimates under a variety of assumptions to suit their region of study. During 2007-2011, the total WEEE estimates will be around 2.5 million metric tons, which includes waste from personal computers (PCs), televisions, refrigerators and washing machines. During the said period, waste from PCs will account for 30% of the total units of WEEE generated. Copyright 2009 Elsevier Ltd. All rights reserved.
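
    The core accounting in such lifespan models can be sketched in a few lines (illustrative numbers only, not the paper's data or its two storage/recycling scenarios): sales in each year are spread over later years according to an assumed lifespan distribution, and the outflow in a given year is the sum of those contributions.

    ```python
    import numpy as np

    # Illustrative annual PC sales (million units) for consecutive years.
    sales = np.array([2.0, 2.5, 3.1, 3.8, 4.7, 5.8])

    # Assumed lifespan distribution: probability that a unit is discarded
    # k years after it was sold (k = 0..5).  Values are illustrative only.
    lifespan_pmf = np.array([0.00, 0.05, 0.15, 0.30, 0.30, 0.20])

    # Obsolete outflow per year is the convolution of sales with the lifespan PMF.
    outflow = np.convolve(sales, lifespan_pmf)

    for year, amount in enumerate(outflow):
        print(f"year {year}: {amount:.2f} million units discarded")
    ```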

  13. Estimation of future outflows of e-waste in India

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dwivedy, Maheshwar, E-mail: dwivedy_m@bits-pilani.ac.i; Mittal, R.K.

    2010-03-15

    The purpose of this study is to construct an approach and a methodology to estimate the future outflows of electronic waste (e-waste) in India. Consequently, the study utilizes a time-series multiple lifespan end-of-life model proposed by Peralta and Fontanos for estimating the current and future quantities of e-waste in India. The model estimates future e-waste generation quantities by modeling their usage and disposal. The present work considers two scenarios for the approximation of e-waste generation based on user preferences to store or to recycle the e-waste. This model will help formal recyclers in India to make strategic decisions in planning for appropriate recycling infrastructure and institutional capacity building. Also an extension of the model proposed by Peralta and Fontanos is developed with the objective of helping decision makers to conduct WEEE estimates under a variety of assumptions to suit their region of study. During 2007-2011, the total WEEE estimates will be around 2.5 million metric tons which include waste from personal computers (PC), television, refrigerators and washing machines. During the said period, the waste from PC will account for 30% of total units of WEEE generated.

  14. Computational tools for copy number variation (CNV) detection using next-generation sequencing data: features and perspectives.

    PubMed

    Zhao, Min; Wang, Qingguo; Wang, Quan; Jia, Peilin; Zhao, Zhongming

    2013-01-01

    Copy number variation (CNV) is a prevalent form of critical genetic variation that leads to an abnormal number of copies of large genomic regions in a cell. Microarray-based comparative genome hybridization (arrayCGH) and genotyping arrays have been the standard technologies for detecting large regions subject to copy number changes in genomes; more recently, high-resolution sequence data have become analyzable by next-generation sequencing (NGS). During the last several years, NGS-based analysis has been widely applied to identify CNVs in both healthy and diseased individuals. Correspondingly, the strong demand for NGS-based CNV analyses has fuelled development of numerous computational methods and tools for CNV detection. In this article, we review the recent advances in computational methods pertaining to CNV detection using whole genome and whole exome sequencing data. Additionally, we discuss their strengths and weaknesses and suggest directions for future development.

  15. Computational tools for copy number variation (CNV) detection using next-generation sequencing data: features and perspectives

    PubMed Central

    2013-01-01

    Copy number variation (CNV) is a prevalent form of critical genetic variation that leads to an abnormal number of copies of large genomic regions in a cell. Microarray-based comparative genome hybridization (arrayCGH) and genotyping arrays have been the standard technologies for detecting large regions subject to copy number changes in genomes; more recently, high-resolution sequence data have become analyzable by next-generation sequencing (NGS). During the last several years, NGS-based analysis has been widely applied to identify CNVs in both healthy and diseased individuals. Correspondingly, the strong demand for NGS-based CNV analyses has fuelled development of numerous computational methods and tools for CNV detection. In this article, we review the recent advances in computational methods pertaining to CNV detection using whole genome and whole exome sequencing data. Additionally, we discuss their strengths and weaknesses and suggest directions for future development. PMID:24564169

  16. The applicability of a computer model for predicting head injury incurred during actual motor vehicle collisions.

    PubMed

    Moran, Stephan G; Key, Jason S; McGwin, Gerald; Keeley, Jason W; Davidson, James S; Rue, Loring W

    2004-07-01

    Head injury is a significant cause of both morbidity and mortality. Motor vehicle collisions (MVCs) are the most common source of head injury in the United States. No studies have conclusively determined the applicability of computer models for accurate prediction of head injuries sustained in actual MVCs. This study sought to determine the applicability of such models for predicting head injuries sustained by MVC occupants. The Crash Injury Research and Engineering Network (CIREN) database was queried for restrained drivers who sustained a head injury. These collisions were modeled using occupant dynamic modeling (MADYMO) software, and head injury scores were generated. The computer-generated head injury scores then were evaluated with respect to the actual head injuries sustained by the occupants to determine the applicability of MADYMO computer modeling for predicting head injury. Five occupants meeting the selection criteria for the study were selected from the CIREN database. The head injury scores generated by MADYMO were lower than expected given the actual injuries sustained. In only one case did the computer analysis predict a head injury of a severity similar to that actually sustained by the occupant. Although computer modeling accurately simulates experimental crash tests, it may not be applicable for predicting head injury in actual MVCs. Many complicating factors surrounding actual MVCs make accurate computer modeling difficult. Future modeling efforts should consider variables such as age of the occupant and should account for a wider variety of crash scenarios.

  17. Mind the Noise When Identifying Computational Models of Cognition from Brain Activity.

    PubMed

    Kolossa, Antonio; Kopp, Bruno

    2016-01-01

    The aim of this study was to analyze how measurement error affects the validity of modeling studies in computational neuroscience. A synthetic validity test was created using simulated P300 event-related potentials as an example. The model space comprised four computational models of single-trial P300 amplitude fluctuations which differed in terms of complexity and dependency. The single-trial fluctuation of simulated P300 amplitudes was computed on the basis of one of the models, at various levels of measurement error and at various numbers of data points. Bayesian model selection was performed based on exceedance probabilities. At very low numbers of data points, the least complex model generally outperformed the data-generating model. Invalid model identification also occurred at low levels of data quality and under low numbers of data points if the winning model's predictors were closely correlated with the predictors from the data-generating model. Given sufficient data quality and numbers of data points, the data-generating model could be correctly identified, even against models which were very similar to the data-generating model. Thus, a number of variables affect the validity of computational modeling studies, and data quality and the number of data points are among the main factors relevant to the issue. Further, the nature of the model space (i.e., model complexity, model dependency) should not be neglected. This study provided quantitative results which show the importance of ensuring the validity of computational modeling via adequately prepared studies. The accomplishment of synthetic validity tests is recommended for future applications. Beyond that, we propose to make the demonstration of sufficient validity via adequate simulations mandatory for computational modeling studies.
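
    The logic of such a synthetic validity test can be sketched as follows (a toy stand-in: ordinary least squares and the Bayesian information criterion replace the study's single-trial P300 models and Bayesian model selection with exceedance probabilities; all numbers are illustrative). Data are generated from a known model at a chosen noise level and number of trials, every candidate model is fitted, and the winner is compared with the data-generating model.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    def bic(y, X):
        """Least-squares fit of y on X and the Bayesian information criterion."""
        beta, *_ = np.linalg.lstsq(X, y, rcond=None)
        resid = y - X @ beta
        n, k = X.shape
        return n * np.log(np.mean(resid ** 2)) + k * np.log(n)

    def identify(n_trials, noise_sd):
        t = np.arange(n_trials)
        p1 = np.sin(0.05 * t)                                    # data-generating predictor
        p2 = 0.9 * p1 + 0.1 * rng.standard_normal(n_trials)      # closely correlated rival
        y = 2.0 * p1 + noise_sd * rng.standard_normal(n_trials)  # simulated trial amplitudes

        candidates = {
            "null":       np.ones((n_trials, 1)),
            "generating": np.column_stack([np.ones(n_trials), p1]),
            "rival":      np.column_stack([np.ones(n_trials), p2]),
        }
        scores = {name: bic(y, X) for name, X in candidates.items()}
        return min(scores, key=scores.get)

    for n, sd in [(20, 3.0), (200, 3.0), (200, 0.5)]:
        print(f"trials={n:4d}, noise={sd}: winner = {identify(n, sd)}")
    ```

    Repeating such a loop over many simulated datasets, noise levels, and trial counts yields the kind of recovery rates on which the validity argument rests.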

  18. Image-guided tissue engineering

    PubMed Central

    Ballyns, Jeffrey J; Bonassar, Lawrence J

    2009-01-01

    Replication of anatomic shape is a significant challenge in developing implants for regenerative medicine. This has led to significant interest in using medical imaging techniques such as magnetic resonance imaging and computed tomography to design tissue engineered constructs. Implementation of medical imaging and computer aided design in combination with technologies for rapid prototyping of living implants enables the generation of highly reproducible constructs with spatial resolution up to 25 μm. In this paper, we review the medical imaging modalities available and a paradigm for choosing a particular imaging technique. We also present fabrication techniques and methodologies for producing cellular engineered constructs. Finally, we comment on future challenges involved with image guided tissue engineering and efforts to generate engineered constructs ready for implantation. PMID:19583811

  19. Single-electron random-number generator (RNG) for highly secure ubiquitous computing applications

    NASA Astrophysics Data System (ADS)

    Uchida, Ken; Tanamoto, Tetsufumi; Fujita, Shinobu

    2007-11-01

    Since the security of all modern cryptographic techniques relies on unpredictable and irreproducible digital keys generated by random-number generators (RNGs), the realization of high-quality RNGs is essential for secure communications. In this report, a new RNG, which utilizes single-electron phenomena, is proposed. A room-temperature-operating silicon single-electron transistor (SET) with a nearby electron pocket is used as a high-quality, ultra-small RNG. In the proposed RNG, stochastic single-electron capture/emission processes to/from the electron pocket are detected with high sensitivity by the SET, and result in giant random telegraphic signals (GRTS) on the SET current. It is experimentally demonstrated that the single-electron RNG generates extremely high-quality random digital sequences at room temperature, in spite of its simple configuration. Because of its small size and low power consumption, the single-electron RNG is promising as a key nanoelectronic device for future ubiquitous computing systems with highly secure mobile communication capabilities.
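
    Conceptually, the post-processing from a two-level telegraph signal to unbiased bits can be sketched as follows (illustrative only; the device's actual sampling and conditioning of the SET current are not described in the abstract): threshold the signal into raw bits, then apply von Neumann debiasing to remove the bias caused by unequal capture and emission probabilities. Note that von Neumann debiasing also assumes the raw bits are independent, which a real telegraph signal only approximates when sampled sparsely.

    ```python
    import numpy as np

    rng = np.random.default_rng(7)

    # Simulated, biased two-level random telegraph signal (high level ~30% of samples),
    # standing in for an SET current modulated by single-electron capture/emission.
    levels = rng.random(20000) < 0.3
    signal = np.where(levels, 1.0, 0.2) + 0.05 * rng.standard_normal(20000)

    raw_bits = (signal > 0.6).astype(int)      # threshold into raw, biased bits

    def von_neumann(bits):
        """Debias: take non-overlapping pairs, map 01 -> 0 and 10 -> 1, drop 00 and 11."""
        pairs = bits[: len(bits) // 2 * 2].reshape(-1, 2)
        keep = pairs[:, 0] != pairs[:, 1]
        return pairs[keep, 0]

    out = von_neumann(raw_bits)
    print("raw bit bias:", raw_bits.mean().round(3))
    print("debiased bias:", out.mean().round(3), "surviving bits:", out.size)
    ```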

  20. Development of standardized air-blown coal gasifier/gas turbine concepts for future electric power systems, Volume 4. Appendix C: Design and performance of standardized fixed bed air-blown gasifier IGCC systems for future electric power generation: Final report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1991-02-01

    This appendix is a compilation of work done to predict overall cycle performance from gasifier to generator terminals. A spreadsheet has been generated for each case to show flows within a cycle. The spreadsheet shows gaseous or solid composition of flow, temperature of flow, quantity of flow, and heat content of flow. Prediction of steam and gas turbine performance was obtained by the computer program GTPro. Outputs of all runs for each combined cycle reviewed have been added to this appendix. A process schematic displaying all flows predicted through GTPro and the spreadsheet is also added to this appendix. The numbered bubbles on the schematic correspond to columns on the top headings of the spreadsheet.

  1. Web-Based Teaching: The Beginning of the End for Universities?

    ERIC Educational Resources Information Center

    Wyatt, Ray

    This paper describes a World Wide Web-based, generic, inter-disciplinary subject called computer-aided policymaking. It has been offered at Melbourne University (Australia) from the beginning of 2001. It has generated some salutary lessons in marketing and pedagogy, but overall it is concluded that Web-based teaching has a rosy future.…

  2. Video Games Take Testing to the Next Level

    ERIC Educational Resources Information Center

    Rothman, Robert

    2011-01-01

    Young people playing "Halo" or "World of Warcraft" might not realize it, but they are working on the prototypes for a future generation of student tests. The increasing popularity of video and computer games may cause concern among parents, who fear their children are spending too much time on them. However, educators and researchers increasingly…

  3. Past, Present, and Future Trends in Teaching Clinical Skills through Web-Based Learning Environments

    ERIC Educational Resources Information Center

    Coe Regan, Jo Ann R.; Youn, Eric J.

    2008-01-01

    Distance education in social work has grown significantly due to the use of interactive television and computer networks. Given the recent developments in delivering distance education utilizing Web-based technology, this article presents a literature review focused on identifying generational trends in the development of Web-based learning…

  4. NREL and Partners Highlight Collaboration and Explore Future During Partner

    Science.gov Websites

    …exascale computing, more efficient photovoltaics, and next-generation wind turbine blades… "What does this do to our grid?" At sessions across NREL's South Table Mountain… as the Solar Energy Research Institute (SERI), the lab and its research partners have helped shape…

  5. The influence of leg-to-body ratio (LBR) on judgments of female physical attractiveness: assessments of computer-generated images varying in LBR.

    PubMed

    Frederick, David A; Hadji-Michael, Maria; Furnham, Adrian; Swami, Viren

    2010-01-01

    The leg-to-body ratio (LBR), which is reliably associated with developmental stability and health outcomes, is an understudied component of human physical attractiveness. Several studies examining the effects of LBR on aesthetic judgments have been limited by the reliance on stimuli composed of hand-drawn silhouettes. In the present study, we developed a new set of female computer-generated images portraying eight levels of LBR that fell within the typical range of human variation. A community sample of 207 Britons in London and students from two samples drawn from a US university (Ns=940, 114) rated the physical attractiveness of the images. We found that mid-ranging female LBRs were perceived as maximally attractive. The present research overcomes some of the problems associated with past work on LBR and aesthetic preferences through use of computer-generated images rather than hand-drawn images and provides an instrument that may be useful in future investigations of LBR preferences. Copyright 2009 Elsevier Ltd. All rights reserved.

  6. Fundamental device design considerations in the development of disruptive nanoelectronics.

    PubMed

    Singh, R; Poole, J O; Poole, K F; Vaidya, S D

    2002-01-01

    In the last quarter of a century, silicon-based integrated circuits (ICs) have played a major role in the growth of the economy throughout the world. A number of new technologies, such as quantum computing, molecular computing, DNA molecules for computing, etc., are currently being explored to create a product to replace semiconductor transistor technology. We have examined all of the currently explored options and found that none of them is suitable as a replacement for silicon ICs. In this paper we provide fundamental device criteria that must be satisfied for the successful operation of a manufacturable, not-yet-invented device. The two fundamental limits are heat removal and reliability. The switching speed of any practical man-made computing device will be in the range of 10(-15) to 10(-3) s. Heisenberg's uncertainty principle and the computer architecture set the heat generation limit. The thermal conductivity of the materials used in the fabrication of a nanodimensional device sets the heat removal limit. In current electronic products, redundancy plays a significant part in improving the reliability of parts with macroscopic defects. In the future, microscopic and even nanoscopic defects will play a critical role in the reliability of disruptive nanoelectronics. Lattice vibrations will set the intrinsic reliability of future computing systems. The two critical limits discussed in this paper provide criteria for the selection of materials used in the fabrication of future devices. Our work shows that diamond contains the clue to providing computing devices that will surpass the performance of silicon-based nanoelectronics.
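
    The heat-generation limit mentioned above can be made concrete with a back-of-the-envelope estimate (assumptions: the energy-time uncertainty relation E >= hbar/(2*dt) is taken as the minimum energy per switching event, and the device density and activity factor are purely illustrative):

    ```python
    # Back-of-the-envelope estimate of an uncertainty-limited switching power density.
    hbar = 1.054571817e-34    # J*s

    def min_power_density(dt_s, devices_per_cm2=1e10, activity=0.1):
        e_min = hbar / (2.0 * dt_s)          # minimum energy per switching event (J)
        switches_per_s = activity / dt_s     # each active device switches once per dt
        return e_min * switches_per_s * devices_per_cm2   # W per cm^2

    for dt in (1e-15, 1e-12, 1e-9):
        print(f"switching time {dt:.0e} s -> lower bound ~{min_power_density(dt):.2e} W/cm^2")
    ```

    Even this crude estimate shows how rapidly the dissipation floor rises (as 1/dt squared) toward the femtosecond end of the 10(-15) to 10(-3) s range quoted above, which is why heat removal and material thermal conductivity dominate the selection criteria.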

  7. SmaggIce User Guide. 1.0

    NASA Technical Reports Server (NTRS)

    Baez, Marivell; Vickerman, Mary; Choo, Yung

    2000-01-01

    SmaggIce (Surface Modeling And Grid Generation for Iced Airfoils) is one of NASA's aircraft icing research codes developed at the Glenn Research Center. It is a software toolkit used in the process of aerodynamic performance prediction of iced airfoils. It includes tools which complement the 2D grid-based Computational Fluid Dynamics (CFD) process: geometry probing and surface preparation for gridding (smoothing and re-discretization of geometry). Future releases will also include support for all aspects of gridding: domain decomposition; perimeter discretization; grid generation and modification.

  8. A 3D TCAD simulation of a thermoelectric module configured for thermoelectric power generation, cooling and heating

    NASA Astrophysics Data System (ADS)

    Gould, C. A.; Shammas, N. Y. A.; Grainger, S.; Taylor, I.; Simpson, K.

    2012-06-01

    This paper documents the 3D modeling and simulation of a three couple thermoelectric module using the Synopsys Technology Computer Aided Design (TCAD) semiconductor simulation software. Simulation results are presented for thermoelectric power generation, cooling and heating, and successfully demonstrate the basic thermoelectric principles. The 3D TCAD simulation model of a three couple thermoelectric module can be used in the future to evaluate different thermoelectric materials, device structures, and improve the efficiency and performance of thermoelectric modules.

  9. A second golden age of aeroacoustics?

    PubMed

    Lele, Sanjiva K; Nichols, Joseph W

    2014-08-13

    In 1992, Sir James Lighthill foresaw the dawn of a second golden age in aeroacoustics enabled by computer simulations (Hardin JC, Hussaini MY (eds) 1993 Computational aeroacoustics, New York, NY: Springer (doi:10.1007/978-1-4613-8342-0)). This review traces the progress in large-scale computations to resolve the noise-source processes and the methods devised to predict the far-field radiated sound using this information. Keeping the focus on aviation-related noise sources, a brief account of the progress in simulations of jet noise, fan noise and airframe noise is given, highlighting the key technical issues and challenges. The complex geometry of nozzle elements and airframe components as well as the high Reynolds number of target applications require careful assessment of the discretization algorithms on unstructured grids and of the modelling compromises. High-fidelity simulations with 200-500 million points are not uncommon today and are used to improve scientific understanding of the noise generation process in specific situations. We attempt to discern where the future might take us, especially if exascale computing becomes a reality in 10 years. A pressing question in this context concerns the role of modelling in the coming era. While the sheer scale of the data generated by large-scale simulations will require new methods for data analysis and data visualization, it is our view that suitable theoretical formulations and reduced models will be even more important in future. © 2014 The Author(s) Published by the Royal Society. All rights reserved.

  10. Modeling and Analysis of Power Processing Systems (MAPPS). Volume 1: Technical report

    NASA Technical Reports Server (NTRS)

    Lee, F. C.; Rahman, S.; Carter, R. A.; Wu, C. H.; Yu, Y.; Chang, R.

    1980-01-01

    Computer aided design and analysis techniques were applied to power processing equipment. Topics covered include: (1) discrete time domain analysis of switching regulators for performance analysis; (2) design optimization of power converters using augmented Lagrangian penalty function technique; (3) investigation of current-injected multiloop controlled switching regulators; and (4) application of optimization for Navy VSTOL energy power system. The generation of the mathematical models and the development and application of computer aided design techniques to solve the different mathematical models are discussed. Recommendations are made for future work that would enhance the application of the computer aided design techniques for power processing systems.

  11. 2001: Things to come.

    PubMed

    Apuzzo, M L; Liu, C Y

    2001-10-01

    THIS ARTICLE DISCUSSES elements in the definition of modernity and emerging futurism in neurological surgery. In particular, it describes evolution, discovery, and paradigm shifts in the field and forces responsible for their realization. It analyzes the cyclical reinvention of the discipline experienced during the past generation and attempts to identify apertures to the near and more remote future. Subsequently, it focuses on forces and discovery in computational science, imaging, molecular science, biomedical engineering, and information processing as they relate to the theme of minimalism that is evident in the field. These areas are explained in the light of future possibilities offered by the emerging field of nanotechnology with molecular engineering.

  12. Current and anticipated uses of thermal-hydraulic codes in Germany

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Teschendorff, V.; Sommer, F.; Depisch, F.

    1997-07-01

    In Germany, one third of the electrical power is generated by nuclear plants. ATHLET and S-RELAP5 are successfully applied for safety analyses of the existing PWR and BWR reactors and possible future reactors, e.g. EPR. Continuous development and assessment of thermal-hydraulic codes are necessary in order to meet present and future needs of licensing organizations, utilities, and vendors. Desired improvements include thermal-hydraulic models, multi-dimensional simulation, computational speed, interfaces to coupled codes, and code architecture. Real-time capability will be essential for application in full-scope simulators. Comprehensive code validation and quantification of uncertainties are prerequisites for future best-estimate analyses.

  13. Computational Fluid Dynamics of Whole-Body Aircraft

    NASA Astrophysics Data System (ADS)

    Agarwal, Ramesh

    1999-01-01

    The current state of the art in computational aerodynamics for whole-body aircraft flowfield simulations is described. Recent advances in geometry modeling, surface and volume grid generation, and flow simulation algorithms have led to accurate flowfield predictions for increasingly complex and realistic configurations. As a result, computational aerodynamics has emerged as a crucial enabling technology for the design and development of flight vehicles. Examples illustrating the current capability for the prediction of transport and fighter aircraft flowfields are presented. Unfortunately, accurate modeling of turbulence remains a major difficulty in the analysis of viscosity-dominated flows. In the future, inverse design methods, multidisciplinary design optimization methods, artificial intelligence technology, and massively parallel computer technology will be incorporated into computational aerodynamics, opening up greater opportunities for improved product design at substantially reduced costs.

  14. High-Throughput Computing on High-Performance Platforms: A Case Study

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Oleynik, D; Panitkin, S; Matteo, Turilli

    The computing systems used by LHC experiments have historically consisted of the federation of hundreds to thousands of distributed resources, ranging from small to mid-size. In spite of the impressive scale of the existing distributed computing solutions, the federation of small to mid-size resources will be insufficient to meet projected future demands. This paper is a case study of how the ATLAS experiment has embraced Titan, a DOE leadership facility, in conjunction with traditional distributed high-throughput computing to reach sustained production scales of approximately 52M core-hours a year. The three main contributions of this paper are: (i) a critical evaluation of design and operational considerations to support the sustained, scalable and production usage of Titan; (ii) a preliminary characterization of a next generation executor for PanDA to support new workloads and advanced execution modes; and (iii) early lessons for how current and future experimental and observational systems can be integrated with production supercomputers and other platforms in a general and extensible manner.

  15. Computational Planning in Facial Surgery.

    PubMed

    Zachow, Stefan

    2015-10-01

    This article reflects the research of the last two decades in computational planning for cranio-maxillofacial surgery. Model-guided and computer-assisted surgery planning has tremendously developed due to ever increasing computational capabilities. Simulators for education, planning, and training of surgery are often compared with flight simulators, where maneuvers are also trained to reduce a possible risk of failure. Meanwhile, digital patient models can be derived from medical image data with astonishing accuracy and thus can serve for model surgery to derive a surgical template model that represents the envisaged result. Computerized surgical planning approaches, however, are often still explorative, meaning that a surgeon tries to find a therapeutic concept based on his or her expertise using computational tools that are mimicking real procedures. Future perspectives of an improved computerized planning may be that surgical objectives will be generated algorithmically by employing mathematical modeling, simulation, and optimization techniques. Planning systems thus act as intelligent decision support systems. However, surgeons can still use the existing tools to vary the proposed approach, but they mainly focus on how to transfer objectives into reality. Such a development may result in a paradigm shift for future surgery planning. Thieme Medical Publishers 333 Seventh Avenue, New York, NY 10001, USA.

  16. Heat-driven liquid metal cooling device for the thermal management of a computer chip

    NASA Astrophysics Data System (ADS)

    Ma, Kun-Quan; Liu, Jing

    2007-08-01

    The tremendous heat generated in a computer chip or very large scale integrated circuit raises many challenging issues to be solved. Recently, liquid metal with a low melting point was established as the most conductive coolant for efficiently cooling the computer chip. Here, by making full use of the dual merits of the liquid metal, i.e. superior heat transfer performance and the ability to be driven electromagnetically, we demonstrate for the first time the liquid-cooling concept for the thermal management of a computer chip using waste heat to power the thermoelectric generator (TEG) and thus the flow of the liquid metal. Such a device consumes no external net energy, which makes it a self-supporting and completely silent liquid-cooling module. Experiments on devices driven by one or two stage TEGs indicate that a dramatic temperature drop on the simulating chip has been realized without the aid of any fans. The higher the heat load, the larger the temperature decrease caused by the cooling device. Further, the two TEGs will generate a larger current if a copper plate is sandwiched between them to enhance heat dissipation there. This new method is expected to be significant in the future thermal management of a desktop or notebook computer, where both efficient cooling and extremely low energy consumption are of major concern.

  17. Research on the generation of the background with sea and sky in infrared scene

    NASA Astrophysics Data System (ADS)

    Dong, Yan-zhi; Han, Yan-li; Lou, Shu-li

    2008-03-01

    Preserving the texture of infrared images is important for scene generation in simulations of anti-ship infrared imaging guidance. We studied the fractal method and applied it to infrared scene generation. We adopted the method of horizontal-vertical (HV) partition to encode the original image. Based on the properties of infrared images with a sea-sky background, we took advantage of a Local Iterated Function System (LIFS) to decrease the computational complexity and increase the processing rate. Representative results are presented. The results show that the fractal method preserves the texture of infrared images well and can be widely used for infrared scene generation in the future.
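
    As a rough illustration of fractal texture synthesis for a sea-sky background (a generic midpoint-displacement sketch, not the paper's HV-partition/LIFS encoder; every parameter below is illustrative):

    ```python
    import numpy as np

    rng = np.random.default_rng(3)

    def fractal_line(n_points, roughness=0.6):
        """1D midpoint displacement: a simple fractal profile for one sea-surface row."""
        size = 1
        while size + 1 < n_points:
            size *= 2
        line = np.zeros(size + 1)
        step, scale = size, 1.0
        while step > 1:
            half = step // 2
            mids = np.arange(half, size, step)
            line[mids] = 0.5 * (line[mids - half] + line[mids + half]) \
                         + scale * rng.standard_normal(mids.size)
            scale *= roughness
            step = half
        return line[:n_points]

    # Toy "infrared" scene: a smooth sky gradient above a fractal-textured sea.
    h, w, horizon = 128, 256, 48
    sky = np.linspace(0.8, 0.5, horizon)[:, None] * np.ones((horizon, w))
    sea = np.vstack([0.4 + 0.02 * fractal_line(w) for _ in range(h - horizon)])
    scene = np.vstack([sky, sea])
    print(scene.shape, float(scene.min()), float(scene.max()))
    ```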

  18. Translational Biomedical Informatics in the Cloud: Present and Future

    PubMed Central

    Chen, Jiajia; Qian, Fuliang; Yan, Wenying; Shen, Bairong

    2013-01-01

    Next generation sequencing and other high-throughput experimental techniques of recent decades have driven the exponential growth in publicly available molecular and clinical data. This information explosion has prepared the ground for the development of translational bioinformatics. The scale and dimensionality of data, however, pose obvious challenges in data mining, storage, and integration. In this paper we demonstrated the utility and promise of cloud computing for tackling the big data problems. We also outline our vision that cloud computing could be an enabling tool to facilitate translational bioinformatics research. PMID:23586054

  19. New ARCH: Future Generation Internet Architecture

    DTIC Science & Technology

    2004-08-01

    …a vocabulary to talk about a system. This provides a framework (a "reference model")… Modularity and abstraction are central tenets of Computer Science thinking. Modularity breaks a system into parts, normally to permit… this complexity is hidden. Abstraction suggests a structure for the system. A popular and simple structure is a layered model: lower layer…

  20. Future Directions: How Virtual Reality Can Further Improve the Assessment and Treatment of Eating Disorders and Obesity.

    PubMed

    Gutiérrez-Maldonado, José; Wiederhold, Brenda K; Riva, Giuseppe

    2016-02-01

    Transdisciplinary efforts for further elucidating the etiology of eating and weight disorders and improving the effectiveness of the available evidence-based interventions are imperative at this time. Recent studies indicate that computer-generated graphic environments (virtual reality; VR) can integrate and extend existing treatments for eating and weight disorders (EWDs). Future possibilities for VR to improve current approaches include its use for altering the experience of the body (embodiment) in real time and as a cue-exposure tool for reducing food craving.

  1. Advanced Aerospace Materials by Design

    NASA Technical Reports Server (NTRS)

    Srivastava, Deepak; Djomehri, Jahed; Wei, Chen-Yu

    2004-01-01

    The advances in the emerging field of nanophase thermal and structural composite materials; materials with embedded sensors and actuators for morphing structures; light-weight composite materials for energy and power storage; and large surface area materials for in-situ resource generation and waste recycling, are expected to revolutionize the capabilities of virtually every system comprising future robotic and human Moon and Mars exploration missions. A high-performance multiscale simulation platform, including the computational capabilities and resources of Columbia, the new supercomputer, is being developed to discover, validate, and prototype the next generation of such advanced materials. This exhibit will describe the porting and scaling of multiscale physics-based core computer simulation codes for discovering and designing carbon nanotube-polymer composite materials for light-weight, load-bearing structural and thermal protection applications.

  2. Integration and Exposure of Large Scale Computational Resources Across the Earth System Grid Federation (ESGF)

    NASA Astrophysics Data System (ADS)

    Duffy, D.; Maxwell, T. P.; Doutriaux, C.; Williams, D. N.; Chaudhary, A.; Ames, S.

    2015-12-01

    As the size of remote sensing observations and model output data grows, the volume of the data has become overwhelming, even to many scientific experts. As societies are forced to better understand, mitigate, and adapt to climate changes, the combination of Earth observation data and global climate model projections is crucial not only to scientists but to policy makers, downstream applications, and even the public. Scientific progress on understanding climate is critically dependent on the availability of a reliable infrastructure that promotes data access, management, and provenance. The Earth System Grid Federation (ESGF) has created such an environment for the Intergovernmental Panel on Climate Change (IPCC). ESGF provides a federated global cyber infrastructure for data access and management of model outputs generated for the IPCC Assessment Reports (AR). The current generation of the ESGF federated grid allows consumers of the data to find and download data with limited capabilities for server-side processing. Since the amount of data for future ARs is expected to grow dramatically, ESGF is working on integrating server-side analytics throughout the federation. The ESGF Compute Working Team (CWT) has created a Web Processing Service (WPS) Application Programming Interface (API) to enable access to scalable computational resources. The API is the exposure point to high performance computing resources across the federation. Specifically, the API allows users to execute simple operations, such as maximum, minimum, average, and anomalies, on ESGF data without having to download the data. These operations are executed at the ESGF data node site with access to large amounts of parallel computing capabilities. This presentation will highlight the WPS API, its capabilities, provide implementation details, and discuss future developments.
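
    A client-side call to such a WPS endpoint might look roughly like the hedged sketch below; the endpoint URL, the operation identifier and the datainputs encoding are placeholders rather than the actual ESGF CWT specification, which should be taken from the project's documentation.

    ```python
    # Hedged sketch of a WPS Execute request asking a data node to compute an
    # average server-side.  URL, identifier and datainputs below are illustrative.
    import requests

    WPS_ENDPOINT = "https://esgf-node.example.org/wps"    # hypothetical endpoint

    params = {
        "service": "WPS",
        "request": "Execute",
        "version": "1.0.0",
        "identifier": "average",                                          # hypothetical operation
        "datainputs": "variable=tas;domain=1980-01/2010-12;axes=time",    # illustrative encoding
    }

    response = requests.get(WPS_ENDPOINT, params=params, timeout=60)
    print(response.status_code)
    print(response.text[:500])    # WPS responses are XML status/result documents
    ```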

  3. Phonological studies of the new gas-induced agitated reactor using computational fluid dynamics.

    PubMed

    Yang, T C; Hsu, Y C; Wang, S F

    2001-06-01

    An ozone-induced agitated reactor has been found to be very effective in degrading industrial wastewater. However, the cost of ozone generation as well as its short residence time in reactors has restricted its application at a commercial scale. An innovative gas-induced draft tube installed inside a conventional agitated reactor was shown to effectively retain the ozone in the reactor. The setup was demonstrated to raise the ozone utilization rate to 96%, compared with the conventional rate of 60%, above the onset speed. This work investigates the mixing mechanism of the innovative gas-induced reactor for future scale-up design using the technique of computational fluid dynamics. A three-dimensional flow model was proposed to compute the liquid-gas free surface as well as the flow patterns inside the reactor. The turbulent effects generated by two 45-degree pitched-blade turbines were considered, and the two-phase mixing phenomena were modeled with the Eulerian-Eulerian technique. The consistency of the free-surface profiles and the fluid flow patterns showed good agreement between the computational results and the experimental observations.

  4. Can An Evolutionary Process Create English Text?

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bailey, David H.

    Critics of the conventional theory of biological evolution have asserted that while natural processes might result in some limited diversity, nothing fundamentally new can arise from 'random' evolution. In response, biologists such as Richard Dawkins have demonstrated that a computer program can generate a specific short phrase via evolution-like iterations starting with random gibberish. While such demonstrations are intriguing, they are flawed in that they have a fixed, pre-specified future target, whereas in real biological evolution there is no fixed future target, but only a complicated 'fitness landscape'. In this study, a significantly more sophisticated evolutionary scheme is employed to produce text segments reminiscent of a Charles Dickens novel. The aggregate size of these segments is larger than the computer program and the input Dickens text, even when comparing compressed data (as a measure of information content).
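
    The earlier fixed-target demonstration credited to Dawkins can be reproduced in a few lines (a sketch of that simpler experiment, not of the study's open-ended, Dickens-based scheme, which uses a fitness landscape instead of a target):

    ```python
    import random

    random.seed(0)
    TARGET = "METHINKS IT IS LIKE A WEASEL"
    ALPHABET = "ABCDEFGHIJKLMNOPQRSTUVWXYZ "

    def mutate(parent, rate=0.05):
        """Copy the parent, changing each character with a small probability."""
        return "".join(random.choice(ALPHABET) if random.random() < rate else c
                       for c in parent)

    def fitness(candidate):
        return sum(a == b for a, b in zip(candidate, TARGET))

    parent = "".join(random.choice(ALPHABET) for _ in range(len(TARGET)))
    generation = 0
    while parent != TARGET:
        offspring = [mutate(parent) for _ in range(100)]
        parent = max(offspring + [parent], key=fitness)   # keep the best seen so far
        generation += 1

    print(f"reached the target phrase after {generation} generations")
    ```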

  5. The future: biomarkers, biosensors, neuroinformatics, and e-neuropsychiatry.

    PubMed

    Lowe, Christopher R

    2011-01-01

    The emergence of molecular biomarkers for psychological, psychiatric, and neurodegenerative disorders is beginning to change current diagnostic paradigms for this debilitating family of mental illnesses. The development of new genomic, proteomic, and metabolomic tools has created the prospect of sensitive and specific biochemical tests to replace traditional pen-and-paper questionnaires. In the future, the realization of biosensor technologies, point-of-care testing, and the fusion of clinical biomarker data, electroencephalogram, and MRI data with the patient's past medical history, biopatterns, and prognosis may create personalized bioprofiles or fingerprints for brain disorders. Further, the application of mobile communications technology and grid computing to support data-, computation- and knowledge-based tasks will assist disease prediction, diagnosis, prognosis, and compliance monitoring. It is anticipated that, ultimately, mobile devices could become the next generation of personalized pharmacies. Copyright © 2011 Elsevier Inc. All rights reserved.

  6. Virtual reality and planetary exploration

    NASA Technical Reports Server (NTRS)

    Mcgreevy, Michael W.

    1992-01-01

    Exploring planetary environments is central to NASA's missions and goals. A new computing technology called Virtual Reality has much to offer in support of planetary exploration. This technology augments and extends human presence within computer-generated and remote spatial environments. Historically, NASA has been a leader in many of the fundamental concepts and technologies that comprise Virtual Reality. Indeed, Ames Research Center has a central role in the development of this rapidly emerging approach to using computers. This ground breaking work has inspired researchers in academia, industry, and the military. Further, NASA's leadership in this technology has spun off new businesses, has caught the attention of the international business community, and has generated several years of positive international media coverage. In the future, Virtual Reality technology will enable greatly improved human-machine interactions for more productive planetary surface exploration. Perhaps more importantly, Virtual Reality technology will democratize the experience of planetary exploration and thereby broaden understanding of, and support for, this historic enterprise.

  7. Virtual reality and planetary exploration

    NASA Astrophysics Data System (ADS)

    McGreevy, Michael W.

    Exploring planetary environments is central to NASA's missions and goals. A new computing technology called Virtual Reality has much to offer in support of planetary exploration. This technology augments and extends human presence within computer-generated and remote spatial environments. Historically, NASA has been a leader in many of the fundamental concepts and technologies that comprise Virtual Reality. Indeed, Ames Research Center has a central role in the development of this rapidly emerging approach to using computers. This ground breaking work has inspired researchers in academia, industry, and the military. Further, NASA's leadership in this technology has spun off new businesses, has caught the attention of the international business community, and has generated several years of positive international media coverage. In the future, Virtual Reality technology will enable greatly improved human-machine interactions for more productive planetary surface exploration. Perhaps more importantly, Virtual Reality technology will democratize the experience of planetary exploration and thereby broaden understanding of, and support for, this historic enterprise.

  8. Simplifying silicon burning: Application of quasi-equilibrium to (alpha) network nucleosynthesis

    NASA Technical Reports Server (NTRS)

    Hix, W. R.; Thielemann, F.-K.; Khokhlov, A. M.; Wheeler, J. C.

    1997-01-01

    While the need for accurate calculation of nucleosynthesis and the resulting rate of thermonuclear energy release within hydrodynamic models of stars and supernovae is clear, the computational expense of these nucleosynthesis calculations often forces a compromise in accuracy to reduce the computational cost. To redress this trade-off of accuracy for speed, the authors present an improved nuclear network which takes advantage of quasi-equilibrium in order to reduce the number of independent nuclei, and hence the computational cost of nucleosynthesis, without significant reduction in accuracy. In this paper they discuss the first application of this method, the further reduction in size of the minimal alpha network. The resultant QSE-reduced alpha network is twice as fast as the conventional alpha network it replaces and requires the tracking of half as many abundance variables, while accurately estimating the rate of energy generation. Such a reduction in cost is particularly necessary for future generations of multi-dimensional supernova models.

  9. "Now, Year Ones, This Is Your Life!" Preparing the Present Generation of Students for a World of Shrinking Distances.

    ERIC Educational Resources Information Center

    Beare, Hedley

    2001-01-01

    Forecasts for the future are made against the backdrop of population growth, environmental change, information technology, and globalization. Schools and teachers as we know them will change radically, perhaps become obsolete, as computers and the Internet enable access to information from anywhere, any time. Learning will become a life-long,…

  10. Advanced Collaborative Environments Supporting Systems Integration and Design

    DTIC Science & Technology

    2003-03-01

    These environments allow multiple individuals to concurrently view a virtual system or product model while maintaining natural, human communication. These virtual systems operate within a computer-generated... As a result, TARDEC researchers and system developers are using this advanced high-end visualization technology to develop future...

  11. New Perspectives on Popular Culture, Science and Technology: Web Browsers and the New Illiteracy

    ERIC Educational Resources Information Center

    Charters, Elizabeth

    2004-01-01

    Analysts predict that the knowledge economy of the near future will require people to be both computer literate and print literate. However, some of the reading and thinking habits of current college students suggest that electronic media such as web browsers may be limiting the new generation's ability to absorb and process what they read. Their…

  12. How Has Internet Use Changed between 2012 and 2015? PISA in Focus No. 83

    ERIC Educational Resources Information Center

    Echazarra, Alfonso

    2018-01-01

    In the growing world of digital technology everything is about speed: computer processors have doubled their performance every two years for decades; the future 5G mobile phone generation is predicted to be about 100 times faster than the current 4G and 20 000 times faster than the "ancient" 3G; and, according to the International…

  13. An Investigation into Spike-Based Neuromorphic Approaches for Artificial Olfactory Systems

    PubMed Central

    Osseiran, Adam

    2017-01-01

    The implementation of neuromorphic methods has delivered promising results for vision and auditory sensors. These methods focus on mimicking the neuro-biological architecture to generate and process spike-based information with minimal power consumption. With increasing interest in developing low-power and robust chemical sensors, the application of neuromorphic engineering concepts for electronic noses has provided an impetus for research focusing on improving these instruments. While conventional e-noses apply computationally expensive and power-consuming data-processing strategies, neuromorphic olfactory sensors implement the biological olfaction principles found in humans and insects to simplify the handling of multivariate sensory data by generating and processing spike-based information. Over the last decade, research on neuromorphic olfaction has established the capability of these sensors to tackle problems that plague the current e-nose implementations such as drift, response time, portability, power consumption and size. This article brings together the key contributions in neuromorphic olfaction and identifies future research directions to develop near-real-time olfactory sensors that can be implemented for a range of applications such as biosecurity and environmental monitoring. Furthermore, we aim to expose the computational parallels between neuromorphic olfaction and gustation for future research focusing on the correlation of these senses. PMID:29125586
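
    As a hedged illustration of the spike-based encoding principle described above (not taken from the reviewed work), the sketch below converts an analogue sensor reading into a spike train with a basic leaky integrate-and-fire neuron; the time constant, threshold, and input gain are arbitrary assumptions.

    ```python
    import numpy as np

    def lif_spike_train(inputs, dt=1e-3, tau=0.02, threshold=1.0, gain=100.0):
        """Convert an analogue input signal into spike times using a
        leaky integrate-and-fire neuron (all parameters are illustrative)."""
        v = 0.0
        spikes = []
        for step, x in enumerate(inputs):
            # Leaky integration of the (scaled) sensor input.
            v += dt * (-v / tau + gain * x)
            if v >= threshold:          # threshold crossing emits a spike
                spikes.append(step * dt)
                v = 0.0                 # reset after the spike
        return spikes

    if __name__ == "__main__":
        t = np.arange(0, 1, 1e-3)
        # Synthetic "sensor response": a slowly rising concentration signal.
        signal = np.clip(t - 0.2, 0, None)
        print(f"{len(lif_spike_train(signal))} spikes emitted")
    ```

    The point of such an encoding is that downstream processing operates on sparse spike events rather than continuously sampled analogue values, which is what enables the low power consumption discussed in the abstract.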

  14. Medical imaging and registration in computer assisted surgery.

    PubMed

    Simon, D A; Lavallée, S

    1998-09-01

    Imaging, sensing, and computing technologies that are being introduced to aid in the planning and execution of surgical procedures are providing orthopaedic surgeons with a powerful new set of tools for improving clinical accuracy, reliability, and patient outcomes while reducing costs and operating times. Current computer assisted surgery systems typically include a measurement process for collecting patient specific medical data, a decision making process for generating a surgical plan, a registration process for aligning the surgical plan to the patient, and an action process for accurately achieving the goals specified in the plan. Some of the key concepts in computer assisted surgery applied to orthopaedics are outlined, with a focus on the basic framework and underlying technologies. In addition, technical challenges and future trends in the field are discussed.
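
    As a hedged illustration of the registration step described above (not the authors' method), the sketch below computes a rigid transform aligning planned fiducial points to their measured positions on the patient using the standard SVD-based (Kabsch) least-squares fit; the point sets are synthetic assumptions.

    ```python
    import numpy as np

    def rigid_register(plan_pts, patient_pts):
        """Least-squares rigid registration (rotation R, translation t) that
        maps plan_pts onto patient_pts, via the SVD-based Kabsch method."""
        p_mean = plan_pts.mean(axis=0)
        q_mean = patient_pts.mean(axis=0)
        P = plan_pts - p_mean
        Q = patient_pts - q_mean
        U, _, Vt = np.linalg.svd(P.T @ Q)
        D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])  # avoid reflections
        R = Vt.T @ D @ U.T
        t = q_mean - R @ p_mean
        return R, t

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        plan = rng.normal(size=(6, 3))                     # fiducials in the surgical plan
        true_R, _ = np.linalg.qr(rng.normal(size=(3, 3)))  # random orthogonal matrix
        if np.linalg.det(true_R) < 0:                      # ensure a proper rotation
            true_R[:, 0] *= -1
        patient = plan @ true_R.T + np.array([10.0, -2.0, 4.0])
        R, t = rigid_register(plan, patient)
        residual = np.linalg.norm(plan @ R.T + t - patient)
        print(f"registration residual: {residual:.2e}")
    ```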

  15. An expert fitness diagnosis system based on elastic cloud computing.

    PubMed

    Tseng, Kevin C; Wu, Chia-Chuan

    2014-01-01

    This paper presents an expert diagnosis system based on cloud computing. It classifies a user's fitness level based on supervised machine learning techniques. This system is able to learn and make customized diagnoses according to the user's physiological data, such as age, gender, and body mass index (BMI). In addition, an elastic algorithm based on the Poisson distribution is presented to allocate computation resources dynamically. It predicts the required resources in the future according to the exponential moving average of past observations. The experimental results show that Naïve Bayes is the best classifier, with the highest accuracy (90.8%), and that the elastic algorithm is able to closely capture the trend of requests generated from the Internet and thus assign the corresponding computation resources to ensure quality of service.
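
    The abstract does not give the algorithm itself, so the following is only a minimal sketch, assuming an exponential moving average of past request counts is used to predict the next interval's load and to size the number of compute instances; the smoothing factor and per-instance capacity are invented parameters.

    ```python
    import math

    def predict_and_allocate(request_history, alpha=0.3, capacity_per_node=50):
        """Predict the next-interval request rate with an exponential moving
        average (EMA) and return the number of compute nodes to provision.
        alpha and capacity_per_node are illustrative assumptions."""
        ema = request_history[0]
        for count in request_history[1:]:
            ema = alpha * count + (1 - alpha) * ema   # standard EMA update
        nodes = max(1, math.ceil(ema / capacity_per_node))
        return ema, nodes

    if __name__ == "__main__":
        history = [120, 95, 140, 180, 160, 210]       # requests per interval
        predicted, nodes = predict_and_allocate(history)
        print(f"predicted load ~ {predicted:.0f} requests -> provision {nodes} node(s)")
    ```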

  16. Integrated Sustainable Planning for Industrial Region Using Geospatial Technology

    NASA Astrophysics Data System (ADS)

    Tiwari, Manish K.; Saxena, Aruna; Katare, Vivek

    2012-07-01

    Geospatial techniques and their scope of application have undergone an order-of-magnitude change since their advent, and they are now universally accepted as a most important and modern tool for mapping and monitoring various natural resources as well as amenities and infrastructure. The huge and voluminous spatial database generated from various remote sensing platforms needs proper management (storage, retrieval, manipulation, and analysis) to extract the desired information, which is beyond the capability of the human brain. This is where computer-aided GIS technology comes into play. A GIS with major input from remote sensing satellites for natural resource management applications must be able to handle spatiotemporal data, supporting spatiotemporal queries and other spatial operations. Software and computer-based tools are designed to make things easier for the user and to improve the efficiency and quality of information-processing tasks. The natural resources are a common heritage that we have shared with past generations, and our future generations will inherit these resources from us. Our greed for resources and our tremendous technological capacity to exploit them at a much larger scale have created a situation in which we have started withdrawing from future stocks. The Bhopal capital region has attracted the attention of planners from the beginning of the five-year-plan strategy for industrial development. A number of projects were carried out in the individual districts (Bhopal, Rajgarh, Shajapur, Raisen, Sehore) and gave fruitful results, but no serious efforts have been made to involve the entire region, and no use has been made of the latest geospatial techniques (remote sensing, GIS, GPS) to prepare a well-structured computerized database, without which it is very difficult to retrieve, analyze, and compare the data for monitoring as well as for planning future developmental activities.

  17. Production Level CFD Code Acceleration for Hybrid Many-Core Architectures

    NASA Technical Reports Server (NTRS)

    Duffy, Austen C.; Hammond, Dana P.; Nielsen, Eric J.

    2012-01-01

    In this work, a novel graphics processing unit (GPU) distributed sharing model for hybrid many-core architectures is introduced and employed in the acceleration of a production-level computational fluid dynamics (CFD) code. The latest generation graphics hardware allows multiple processor cores to simultaneously share a single GPU through concurrent kernel execution. This feature has allowed the NASA FUN3D code to be accelerated in parallel with up to four processor cores sharing a single GPU. For codes to scale and fully use resources on these and the next generation machines, codes will need to employ some type of GPU sharing model, as presented in this work. Findings include the effects of GPU sharing on overall performance. A discussion of the inherent challenges that parallel unstructured CFD codes face in accelerator-based computing environments is included, with considerations for future generation architectures. This work was completed by the author in August 2010, and reflects the analysis and results of the time.

  18. A globally calibrated scheme for generating daily meteorology from monthly statistics: Global-WGEN (GWGEN) v1.0

    NASA Astrophysics Data System (ADS)

    Sommer, Philipp S.; Kaplan, Jed O.

    2017-10-01

    While a wide range of Earth system processes occur at daily and even subdaily timescales, many global vegetation and other terrestrial dynamics models historically used monthly meteorological forcing both to reduce computational demand and because global datasets were lacking. Recently, dynamic land surface modeling has moved towards resolving daily and subdaily processes, and global datasets containing daily and subdaily meteorology have become available. These meteorological datasets, however, cover only the instrumental era of the last approximately 120 years at best, are subject to considerable uncertainty, and represent extremely large data files with associated computational costs of data input/output and file transfer. For periods before the recent past or in the future, global meteorological forcing can be provided by climate model output, but the quality of these data at high temporal resolution is low, particularly for daily precipitation frequency and amount. Here, we present GWGEN, a globally applicable statistical weather generator for the temporal downscaling of monthly climatology to daily meteorology. Our weather generator is parameterized using a global meteorological database and simulates daily values of five common variables: minimum and maximum temperature, precipitation, cloud cover, and wind speed. GWGEN is lightweight, modular, and requires a minimal set of monthly mean variables as input. The weather generator may be used in a range of applications, for example, in global vegetation, crop, soil erosion, or hydrological models. While GWGEN does not currently perform spatially autocorrelated multi-point downscaling of daily weather, this additional functionality could be implemented in future versions.
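
    As a hedged sketch of how such monthly-to-daily downscaling can work in principle (not GWGEN's actual parameterization), the example below generates daily precipitation from a monthly mean using a two-state Markov chain for wet/dry occurrence and an exponential distribution for wet-day amounts; the transition probabilities are invented.

    ```python
    import numpy as np

    def daily_precip_from_monthly(monthly_total_mm, days=30,
                                  p_wet_given_dry=0.2, p_wet_given_wet=0.6,
                                  seed=0):
        """Downscale a monthly precipitation total to daily values using a
        two-state Markov chain for occurrence and exponential wet-day amounts.
        Transition probabilities are illustrative, not GWGEN's fitted values."""
        rng = np.random.default_rng(seed)
        # Stationary wet-day frequency implied by the Markov chain.
        pi_wet = p_wet_given_dry / (1 - p_wet_given_wet + p_wet_given_dry)
        mean_wet_amount = monthly_total_mm / (days * pi_wet)

        precip = np.zeros(days)
        wet = rng.random() < pi_wet
        for d in range(days):
            p = p_wet_given_wet if wet else p_wet_given_dry
            wet = rng.random() < p
            if wet:
                precip[d] = rng.exponential(mean_wet_amount)
        return precip

    if __name__ == "__main__":
        series = daily_precip_from_monthly(90.0)   # 90 mm in a 30-day month
        print(f"simulated total: {series.sum():.1f} mm over "
              f"{np.count_nonzero(series)} wet days")
    ```

    A production generator such as GWGEN conditions these parameters on observed monthly statistics and handles several variables jointly; the sketch only shows the occurrence/amount split that underlies most stochastic weather generators.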

  19. Tailored program evaluation: Past, present, future.

    PubMed

    Suggs, L Suzanne; Cowdery, Joan E; Carroll, Jennifer B

    2006-11-01

    This paper discusses measurement issues related to the evaluation of computer-tailored health behavior change programs. As the first generation of commercially available tailored products is utilized in health promotion programming, programmers and researchers are becoming aware of the unique challenges that the evaluation of these programs presents. A project is presented that used an online tailored health behavior assessment (HBA) in a worksite setting. Process and outcome evaluation methods are described and include the challenges faced, and strategies proposed and implemented, for meeting them. Implications for future research in tailored program development, implementation, and evaluation are also discussed.

  20. Cue generation: How learners flexibly support future retrieval.

    PubMed

    Tullis, Jonathan G; Benjamin, Aaron S

    2015-08-01

    The successful use of memory requires us to be sensitive to the cues that will be present during retrieval. In many situations, we have some control over the external cues that we will encounter. For instance, learners create shopping lists at home to help remember what items to later buy at the grocery store, and they generate computer file names to help remember the contents of those files. Generating cues in the service of later cognitive goals is a complex task that lies at the intersection of metacognition, communication, and memory. In this series of experiments, we investigated how and how well learners generate external mnemonic cues. Across 5 experiments, learners generated a cue for each target word in a to-be-remembered list and received these cues during a later cued recall test. Learners flexibly generated cues in response to different instructional demands and study list compositions. When generating mnemonic cues, as compared to descriptions of target items, learners produced cues that were more distinct than mere descriptions and consequently elicited greater cued recall performance than those descriptions. When learners were aware of competing targets in the study list, they generated mnemonic cues with smaller cue-to-target associative strength but that were even more distinct. These adaptations led to fewer confusions among competing targets and enhanced cued recall performance. These results provide another example of the metacognitively sophisticated tactics that learners use to effectively support future retrieval.

  1. Towards a Cloud Computing Environment: Near Real-time Cloud Product Processing and Distribution for Next Generation Satellites

    NASA Astrophysics Data System (ADS)

    Nguyen, L.; Chee, T.; Minnis, P.; Palikonda, R.; Smith, W. L., Jr.; Spangenberg, D.

    2016-12-01

    The NASA LaRC Satellite ClOud and Radiative Property retrieval System (SatCORPS) processes and derives near real-time (NRT) global cloud products from operational geostationary satellite imager datasets. These products are being used in NRT to improve forecast models and aircraft icing warnings and to support aircraft field campaigns. Next-generation satellites, such as the Japanese Himawari-8 and the upcoming NOAA GOES-R, present challenges for NRT data processing and product dissemination due to the increase in temporal and spatial resolution. The volume of data is expected to increase roughly tenfold. This increase in data volume will require additional IT resources to keep up with the processing demands to satisfy NRT requirements. In addition, these resources are not readily available due to cost and other technical limitations. To anticipate and meet these computing resource requirements, we have employed a hybrid cloud computing environment to augment the generation of SatCORPS products. This paper will describe the workflow to ingest, process, and distribute SatCORPS products and the technologies used. Lessons learned from working on both the AWS Cloud and GovCloud will be discussed: benefits, similarities, and differences that could impact the decision to use cloud computing and storage. A detailed cost analysis will be presented. In addition, future cloud utilization, parallelization, and architecture layout will be discussed for GOES-R.

  2. NASA HPCC Technology for Aerospace Analysis and Design

    NASA Technical Reports Server (NTRS)

    Schulbach, Catherine H.

    1999-01-01

    The Computational Aerosciences (CAS) Project is part of NASA's High Performance Computing and Communications Program. Its primary goal is to accelerate the availability of high-performance computing technology to the US aerospace community-thus providing the US aerospace community with key tools necessary to reduce design cycle times and increase fidelity in order to improve safety, efficiency and capability of future aerospace vehicles. A complementary goal is to hasten the emergence of a viable commercial market within the aerospace community for the advantage of the domestic computer hardware and software industry. The CAS Project selects representative aerospace problems (especially design) and uses them to focus efforts on advancing aerospace algorithms and applications, systems software, and computing machinery to demonstrate vast improvements in system performance and capability over the life of the program. Recent demonstrations have served to assess the benefits of possible performance improvements while reducing the risk of adopting high-performance computing technology. This talk will discuss past accomplishments in providing technology to the aerospace community, present efforts, and future goals. For example, the times to do full combustor and compressor simulations (of aircraft engines) have been reduced by factors of 320:1 and 400:1 respectively. While this has enabled new capabilities in engine simulation, the goal of an overnight, dynamic, multi-disciplinary, 3-dimensional simulation of an aircraft engine is still years away and will require new generations of high-end technology.

  3. High End Computing Technologies for Earth Science Applications: Trends, Challenges, and Innovations

    NASA Technical Reports Server (NTRS)

    Parks, John (Technical Monitor); Biswas, Rupak; Yan, Jerry C.; Brooks, Walter F.; Sterling, Thomas L.

    2003-01-01

    Earth science applications of the future will stress the capabilities of even the highest performance supercomputers in the areas of raw compute power, mass storage management, and software environments. These NASA mission critical problems demand usable multi-petaflops and exabyte-scale systems to fully realize their science goals. With an exciting vision of the technologies needed, NASA has established a comprehensive program of advanced research in computer architecture, software tools, and device technology to ensure that, in partnership with US industry, it can meet these demanding requirements with reliable, cost effective, and usable ultra-scale systems. NASA will exploit, explore, and influence emerging high end computing architectures and technologies to accelerate the next generation of engineering, operations, and discovery processes for NASA Enterprises. This article captures this vision and describes the concepts, accomplishments, and the potential payoff of the key thrusts that will help meet the computational challenges in Earth science applications.

  4. A comprehensive approach to decipher biological computation to achieve next generation high-performance exascale computing.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    James, Conrad D.; Schiess, Adrian B.; Howell, Jamie

    2013-10-01

    The human brain (volume = 1200 cm^3) consumes 20 W and is capable of performing > 10^16 operations/s. Current supercomputer technology has reached 10^15 operations/s, yet it requires 1500 m^3 and 3 MW, giving the brain a 10^12 advantage in operations/s/W/cm^3. Thus, to reach exascale computation, two achievements are required: 1) improved understanding of computation in biological tissue, and 2) a paradigm shift towards neuromorphic computing where hardware circuits mimic properties of neural tissue. To address 1), we will interrogate corticostriatal networks in mouse brain tissue slices, specifically with regard to their frequency filtering capabilities as a function of input stimulus. To address 2), we will instantiate biological computing characteristics such as multi-bit storage into hardware devices with future computational and memory applications. Resistive memory devices will be modeled, designed, and fabricated in the MESA facility in consultation with our internal and external collaborators.
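
    As a quick sanity check on the quoted figures (the numbers themselves come from the abstract), the ratio of operations per second per watt per cm^3 can be reproduced directly:

    ```python
    # Efficiency ratio implied by the figures quoted in the abstract.
    brain = 1e16 / (20 * 1200)          # ops/s per W per cm^3 (20 W, 1200 cm^3)
    super_ = 1e15 / (3e6 * 1.5e9)       # 3 MW, 1500 m^3 = 1.5e9 cm^3
    print(f"brain advantage ~ {brain / super_:.1e}")   # ~2e12, i.e. roughly 10^12
    ```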

  5. Ink Jet For Business Graphic Application

    NASA Astrophysics Data System (ADS)

    Hooper, Dana H.

    1987-04-01

    This talk covers the use of computer-generated color output in the preparation of professional, memorable presentations. The focus is on this application and today's business graphics marketplace. To provide background, an overview of the factors and trends influencing the market for color hard copy output is essential. The availability of lower cost computing technology, improved graphics software and user interfaces, and the availability of color copiers is combining with the latest generation of color ink jet printers to cause strong growth in the use of color hardcopy devices in the business graphics marketplace. The market is expected to grow at a compound annual growth rate in excess of 25% and reach a level of 5 billion by 1990. Color lasography and ink jet technology based products are expected to increase share significantly, primarily at the expense of pen plotters. Essential to this growth is the latest generation of products. The Xerox 4020 Color Ink Jet Printer embodies the latest ink jet technology and is a good example of this new generation of products. The printer brings highly reliable color to a broad range of business users. The 4020 is driven by over 50 software packages, allowing users compatibility and supporting a variety of applications. The 4020 is easy to operate and maintain and is capable of producing excellent hardcopy and transparencies at an attractive price point. Several specific application areas were discussed. Images were typically created on an IBM PC or compatible with a graphics application package and output to the Xerox 4020 Color Ink Jet Printer. Bar charts, line graphs, pie charts, integrated text and graphics, reports, and maps were displayed with a brief description. Additionally, the use of color in brain scanning to discern and communicate information, and in computer-generated art, demonstrates the wide variety of potential applications. Images may be output to paper or to transparency for overhead presentation. The future of color in the business graphics market looks bright and will continue to be strongly influenced by future product introductions.

  6. Transient Mathematical Modeling for Liquid Rocket Engine Systems: Methods, Capabilities, and Experience

    NASA Technical Reports Server (NTRS)

    Seymour, David C.; Martin, Michael A.; Nguyen, Huy H.; Greene, William D.

    2005-01-01

    The subject of mathematical modeling of the transient operation of liquid rocket engines is presented in overview form from the perspective of engineers working at the NASA Marshall Space Flight Center. The necessity of creating and utilizing accurate mathematical models as part of the liquid rocket engine development process has become well established and is likely to increase in importance in the future. The issues of design considerations for transient operation, development testing, and failure scenario simulation are discussed. An overview of the derivation of the basic governing equations is presented along with a discussion of computational and numerical issues associated with the implementation of these equations in computer codes. Also, work in the field of generating usable fluid property tables is presented along with an overview of efforts to be undertaken in the future to improve the tools used for the mathematical modeling process.

  7. Transient Mathematical Modeling for Liquid Rocket Engine Systems: Methods, Capabilities, and Experience

    NASA Technical Reports Server (NTRS)

    Martin, Michael A.; Nguyen, Huy H.; Greene, William D.; Seymout, David C.

    2003-01-01

    The subject of mathematical modeling of the transient operation of liquid rocket engines is presented in overview form from the perspective of engineers working at the NASA Marshall Space Flight Center. The necessity of creating and utilizing accurate mathematical models as part of the liquid rocket engine development process has become well established and is likely to increase in importance in the future. The issues of design considerations for transient operation, development testing, and failure scenario simulation are discussed. An overview of the derivation of the basic governing equations is presented along with a discussion of computational and numerical issues associated with the implementation of these equations in computer codes. Also, work in the field of generating usable fluid property tables is presented along with an overview of efforts to be undertaken in the future to improve the tools used for the mathematical modeling process.

  8. Systems Engineering and Integration (SE and I)

    NASA Technical Reports Server (NTRS)

    Chevers, ED; Haley, Sam

    1990-01-01

    The issue of technology advancement and future space transportation vehicles is addressed. The challenge is to develop systems which can be evolved and improved in small incremental steps, where each increment reduces present cost, improves reliability, or does neither but sets the stage for a subsequent incremental upgrade that does. Future requirements are interface standards for commercial off-the-shelf products to aid in the development of integrated facilities; an enhanced automated code generation system coupled to specification and design documentation; modeling tools that support data flow analysis; and shared project databases consisting of technical characteristics, cost information, measurement parameters, and reusable software programs. Topics addressed include: advanced avionics development strategy; risk analysis and management; tool quality management; low cost avionics; cost estimation and benefits; computer aided software engineering; computer systems and software safety; system testability; advanced avionics laboratories; and rapid prototyping. This presentation is represented by viewgraphs only.

  9. Motion planning: A journey of robots, molecules, digital actors, and other artifacts

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Latombe, J.C.

    1999-11-01

    During the past three decades, motion planning has emerged as a crucial and productive research area in robotics. In the mid-1980s, the most advanced planners were barely able to compute collision-free paths for objects crawling in planar workspaces. Today, planners efficiently deal with robots with many degrees of freedom in complex environments. Techniques also exist to generate quasi-optimal trajectories, coordinate multiple robots, deal with dynamic and kinematic constraints, and handle dynamic environments. This paper describes some of these achievements, presents new problems that have recently emerged, discusses applications likely to motivate future research, and finally gives expectations for the coming years. It stresses the fact that nonrobotics applications (e.g., graphic animation, surgical planning, computational biology) are growing in importance and are likely to shape future motion-planning research more than robotics itself.

  10. The future is 'ambient'

    NASA Astrophysics Data System (ADS)

    Lugmayr, Artur

    2006-02-01

    The research field of ambient media is starting to spread rapidly, and the first applications for consumer homes are on the way. Ambient media is the logical continuation of research around media. Media has been evolving from old media (e.g. print media), to integrated presentation in one form (multimedia, or new media), to generating a synthetic world (virtual reality), to the natural environment as the user interface (ambient media), and will evolve towards real/synthetic indistinguishable media (bio-media or bio-multimedia). After the IT bubble burst, multimedia lacked a vision of potential future scenarios and applications. Within this research paper the potentials, applications, and market-available solutions of mobile ambient multimedia are studied. The different features of ambient mobile multimedia are manifold and include wearable computers, adaptive software, context awareness, ubiquitous computers, middleware, and wireless networks. The paper especially focuses on algorithms and methods that can be utilized to realize modern mobile ambient systems.

  11. Effects of episodic future thinking on discounting: Personalized age-progressed pictures improve risky long-term health decisions.

    PubMed

    Kaplan, Brent A; Reed, Derek D; Jarmolowicz, David P

    2016-03-01

    Many everyday choices are associated with both delayed and probabilistic outcomes. The temporal attention hypothesis suggests that individuals' decision making can be improved by focusing attention on temporally distal events and implies that environmental manipulations that bring temporally distal outcomes into focus may alter an individual's degree of discounting. One such manipulation, episodic future thinking, has been shown to lower discount rates; however, several questions remain about the applicability of episodic future thinking to domains other than delay discounting. The present experiments examine the effects of a modified episodic-future-thinking procedure, in which participants viewed age-progressed computer-generated images of themselves and answered questions related to their future, on probability discounting in the context of both a delayed health gain and loss. Results indicate that modified episodic future thinking effectively altered individuals' degree of discounting in the predicted directions and demonstrate the applicability of episodic future thinking to decision making about socially significant outcomes. © 2015 Society for the Experimental Analysis of Behavior.
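
    For context (these equations are the standard hyperbolic forms from the discounting literature rather than formulas given in the abstract), the degree of delay or probability discounting is typically summarized by a single free parameter, k or h, indexing how steeply subjective value V of an amount A falls with delay D or with the odds against receipt:

    ```latex
    % Hyperbolic discounting of a delayed amount (k = delay discount rate)
    % and of a probabilistic amount (h = probability discount rate,
    % Theta = odds against receipt).
    \[
    V = \frac{A}{1 + kD}, \qquad
    V = \frac{A}{1 + h\,\Theta}, \qquad \Theta = \frac{1 - p}{p}
    \]
    ```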

  12. USSR Report, Science and Technology Policy.

    DTIC Science & Technology

    1987-01-21

    ...prerequisites for the effective application of mathematical economic methods and computer hardware for the predesign forecasting of future generations of... (M.I. Belkin, I.G. Bogorodskiy, et al.; EKONOMIKA I MATEMATICHESKIYE METODY, No 3, 1986). Development of the Mathematical Economics Instrumentarium at... To ensure the effectiveness of the automated elaboration of plans, it is essential, in the first place, to improve the skills of the planning specialists.

  13. "Computer as Data Gatherer" for a New Generation: Martorella's Predictions, the Past, the Present, and the Future of Technology in Social Studies

    ERIC Educational Resources Information Center

    Friedman, Adam

    2014-01-01

    In his 1997 article "Technology and the Social Studies--or: Which Way to the Sleeping Giant?" Peter Martorella made several predictions regarding technology resources in the social studies. Through a 2014 lens, Martorella's Internet seems archaic, yet two of his predictions were particularly poignant and have had a significant impact on…

  14. Teaching Sustainability through System Dynamics: Exploring Stocks and Flows Embedded in Dynamic Computer Models of an Agricultural Land Management System

    ERIC Educational Resources Information Center

    Pallant, Amy; Lee, Hee-Sun

    2017-01-01

    During the past several decades, there has been a growing awareness of the ways humans affect Earth systems. As global problems emerge, educating the next generation of citizens to be able to make informed choices related to future outcomes is increasingly important. The challenge for educators is figuring out how to prepare students to think…

  15. Planning Systems for Distributed Operations

    NASA Technical Reports Server (NTRS)

    Maxwell, Theresa G.

    2002-01-01

    This viewgraph representation presents an overview of the mission planning process involving distributed operations (such as the International Space Station (ISS)) and the computer hardware and software systems needed to support such an effort. Topics considered include: evolution of distributed planning systems, ISS distributed planning, the Payload Planning System (PPS), future developments in distributed planning systems, Request Oriented Scheduling Engine (ROSE) and Next Generation distributed planning systems.

  16. Intelligent Control for Future Autonomous Distributed Sensor Systems

    DTIC Science & Technology

    2007-03-26

    ...recognized, the use of a pre-computed reconfiguration solution that fits the recognized scenario could allow reconfiguration to take place without... This data was loaded into the program developed to visualize the seabed, and the simulation was then performed using frames to denote the target... to generate separate images for each eye. Users wear lightweight, inexpensive polarized eyeglasses and see a stereoscopic image.

  17. On the Cusp of Change: Examining Pre-Service Teachers' Beliefs about ICT and Envisioning the Digital Classroom of the Future

    ERIC Educational Resources Information Center

    Fluck, A.; Dowden, T.

    2013-01-01

    Few contemporary pre-service teachers would have completed their schooling with the extensive aid of computers. Yet, classroom use of information and communication technology (ICT) is now ubiquitous in much of the world. Today's pre-service teachers are the "cusp generation" who, at a unique moment in history, straddle the two worlds of…

  18. Computational Models and Emergent Properties of Respiratory Neural Networks

    PubMed Central

    Lindsey, Bruce G.; Rybak, Ilya A.; Smith, Jeffrey C.

    2012-01-01

    Computational models of the neural control system for breathing in mammals provide a theoretical and computational framework bringing together experimental data obtained from different animal preparations under various experimental conditions. Many of these models were developed in parallel and iteratively with experimental studies and provided predictions guiding new experiments. This data-driven modeling approach has advanced our understanding of respiratory network architecture and neural mechanisms underlying generation of the respiratory rhythm and pattern, including their functional reorganization under different physiological conditions. Models reviewed here vary in neurobiological details and computational complexity and span multiple spatiotemporal scales of respiratory control mechanisms. Recent models describe interacting populations of respiratory neurons spatially distributed within the Bötzinger and pre-Bötzinger complexes and rostral ventrolateral medulla that contain core circuits of the respiratory central pattern generator (CPG). Network interactions within these circuits along with intrinsic rhythmogenic properties of neurons form a hierarchy of multiple rhythm generation mechanisms. The functional expression of these mechanisms is controlled by input drives from other brainstem components, including the retrotrapezoid nucleus and pons, which regulate the dynamic behavior of the core circuitry. The emerging view is that the brainstem respiratory network has rhythmogenic capabilities at multiple levels of circuit organization. This allows flexible, state-dependent expression of different neural pattern-generation mechanisms under various physiological conditions, enabling a wide repertoire of respiratory behaviors. Some models consider control of the respiratory CPG by pulmonary feedback and network reconfiguration during defensive behaviors such as cough. Future directions in modeling of the respiratory CPG are considered. PMID:23687564
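
    As a hedged, minimal illustration of the network rhythmogenesis idea discussed above (far simpler than the reviewed models), the sketch below simulates two mutually inhibitory neural populations with slow adaptation, a classic half-center arrangement that produces alternating bursts; all rate constants and weights are arbitrary assumptions rather than fitted respiratory parameters.

    ```python
    import numpy as np

    def half_center_oscillator(t_max=10.0, dt=1e-3,
                               tau=0.25, tau_a=0.5, beta=2.5, w=2.5, drive=1.5):
        """Two mutually inhibitory units with adaptation (Matsuoka-style).
        Returns the firing rates of both units; parameters are illustrative."""
        steps = int(t_max / dt)
        x = np.array([0.1, 0.0])   # membrane-like states (asymmetric start)
        a = np.zeros(2)            # slow adaptation variables
        rates = np.zeros((steps, 2))
        for i in range(steps):
            y = np.maximum(x, 0.0)                                  # rectified rates
            x += dt / tau * (-x + drive - beta * a - w * y[::-1])   # mutual inhibition
            a += dt / tau_a * (y - a)                               # adaptation dynamics
            rates[i] = y
        return rates

    if __name__ == "__main__":
        r = half_center_oscillator()
        # Count alternations in which unit is dominant (a crude rhythm check).
        dominant = np.argmax(r, axis=1)
        switches = np.count_nonzero(np.diff(dominant))
        print(f"{switches} dominance switches over the simulation")
    ```

    The reviewed models layer many such interacting mechanisms (intrinsic bursting, multiple populations, state-dependent drives) on top of this basic reciprocal-inhibition motif.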

  19. Survey of MapReduce frame operation in bioinformatics.

    PubMed

    Zou, Quan; Li, Xu-Bin; Jiang, Wen-Rui; Lin, Zi-Yu; Li, Gui-Lin; Chen, Ke

    2014-07-01

    Bioinformatics is challenged by the fact that traditional analysis tools have difficulty in processing large-scale data from high-throughput sequencing. The open source Apache Hadoop project, which adopts the MapReduce framework and a distributed file system, has recently given bioinformatics researchers an opportunity to achieve scalable, efficient and reliable computing performance on Linux clusters and on cloud computing services. In this article, we present MapReduce framework-based applications that can be employed in next-generation sequencing and other biological domains. In addition, we discuss the challenges faced by this field as well as future work on parallel computing in bioinformatics. © The Author 2013. Published by Oxford University Press. For Permissions, please email: journals.permissions@oup.com.
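
    As a hedged illustration of the MapReduce pattern in a sequencing context (not code from the surveyed tools), the sketch below counts k-mers with plain Python map and reduce phases; on a Hadoop cluster the same two functions would be distributed across nodes and the shuffle between them handled by the framework.

    ```python
    from collections import defaultdict

    def map_phase(read, k=4):
        """Map: emit (k-mer, 1) pairs from a single sequencing read."""
        return [(read[i:i + k], 1) for i in range(len(read) - k + 1)]

    def reduce_phase(pairs):
        """Reduce: sum the counts for each k-mer key."""
        counts = defaultdict(int)
        for kmer, n in pairs:
            counts[kmer] += n
        return dict(counts)

    if __name__ == "__main__":
        reads = ["ACGTACGTGA", "TTACGTACGA", "ACGTTTACGT"]   # toy reads
        pairs = [pair for read in reads for pair in map_phase(read)]
        counts = reduce_phase(pairs)
        top = sorted(counts.items(), key=lambda kv: -kv[1])[:3]
        print("most frequent 4-mers:", top)
    ```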

  20. DORMAN computer program (study 2.5). Volume 1: Executive summary. [development of data bank for computerized information storage of NASA programs

    NASA Technical Reports Server (NTRS)

    Stricker, L. T.

    1973-01-01

    The DORCA Applications study has been directed at development of a data bank management computer program identified as DORMAN. Because of the size of the DORCA data files and the manipulations required on that data to support analyses with the DORCA program, automated data techniques to replace time-consuming manual input generation are required. The Dynamic Operations Requirements and Cost Analysis (DORCA) program was developed for use by NASA in planning future space programs. Both programs are designed for implementation on the UNIVAC 1108 computing system. The purpose of this Executive Summary Report is to define for the NASA management the basic functions of the DORMAN program and its capabilities.

  1. Lewis Research Center studies of multiple large wind turbine generators on a utility network

    NASA Technical Reports Server (NTRS)

    Gilbert, L. J.; Triezenberg, D. M.

    1979-01-01

    A NASA-Lewis program to study the anticipated performance of a wind turbine generator farm on an electric utility network is surveyed. The paper describes the approach of the Lewis Wind Energy Project Office to developing analysis capabilities in the area of wind turbine generator-utility network computer simulations. Attention is given to areas such as the Lewis-Purdue hybrid simulation, an independent stability study, the DOE multiunit plant study, and the WEST simulator. Also covered are the Lewis Mod-2 simulation, including analog simulation of a two-wind-turbine system and comparison with Boeing simulation results, and the gust response of a two-machine model. Finally, future work is noted, and it is concluded that the study shows little interaction between the generators and between the generators and the bus.

  2. Modeling Imperfect Generator Behavior in Power System Operation Models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Krad, Ibrahim

    A key component in power system operations is the use of computer models to quickly study and analyze different operating conditions and futures in an efficient manner. The output of these models is sensitive to the data used in them as well as the assumptions made during their execution. One typical assumption is that generators and load assets perfectly follow operator control signals. While this is a valid simulation assumption, generators may not always accurately follow control signals. This imperfect response of generators could impact cost and reliability metrics. This paper proposes a generator model that captures this imperfect behavior and examines its impact on production costs and reliability metrics using a steady-state power system operations model. Preliminary analysis shows that while costs remain relatively unchanged, there could be significant impacts on reliability metrics.
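
    The paper's actual model is not given in the abstract, so the following is only a minimal sketch, assuming imperfect dispatch-following is represented as a first-order lag plus a ramp-rate limit and small noise around the operator's setpoint; all parameters are invented.

    ```python
    import numpy as np

    def imperfect_response(setpoints, dt=1.0, tau=20.0, ramp_limit=2.0,
                           noise_std=0.5, seed=0):
        """Simulate a generator that tracks a dispatch setpoint (MW) imperfectly:
        a first-order lag (time constant tau, s), a ramp-rate limit (MW per step),
        and zero-mean noise. All parameters are illustrative assumptions."""
        rng = np.random.default_rng(seed)
        output = np.zeros(len(setpoints))
        p = setpoints[0]
        for i, target in enumerate(setpoints):
            desired_change = (target - p) * (1 - np.exp(-dt / tau))  # lagged pursuit
            change = np.clip(desired_change, -ramp_limit, ramp_limit)
            p += change + rng.normal(0.0, noise_std)
            output[i] = p
        return output

    if __name__ == "__main__":
        signal = np.concatenate([np.full(30, 100.0), np.full(30, 150.0)])  # MW steps
        actual = imperfect_response(signal)
        print(f"mean tracking error: {np.mean(np.abs(actual - signal)):.2f} MW")
    ```

    Feeding such a response model into a production-cost simulation, instead of assuming perfect setpoint tracking, is the kind of comparison the abstract describes.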

  3. CFD research and systems in Kawasaki Heavy Industries and its future prospects

    NASA Astrophysics Data System (ADS)

    Hiraoka, Koichi

    1990-09-01

    The KHI Computational Fluid Dynamics (CFD) system is composed of a VP100 computer and 2-D and 3-D Euler and/or Navier-Stokes (NS) analysis software. For KHI, this system has become a very powerful aerodynamic tool together with the Kawasaki 1 m Transonic Wind Tunnel. The 2-D Euler/NS software, developed in-house, is fully automated, requires no special skill, and was successfully applied to the design of the YXX high-lift devices and the SST supersonic inlet, among others. The 3-D Euler/NS software, developed under joint research with NAL, has an interactively operated multi-block-type grid generator and can effectively generate grids around complex airplane shapes. Due to main memory size limitations, 3-D analyses of relatively simple shapes, such as the SST wing-body, were computed in-house on the VP100, whereas more detailed 3-D analyses, such as those of ASUKA and HOPE, were computed on the NAL VP400, which is 10 times more powerful than the VP100, under KHI-NAL joint research. These analysis results correlate very well with experimental results. However, the present CFD system is less productive than the wind tunnel and has applicability limitations.

  4. Advanced Helmet Mounted Display (AHMD) for simulator applications

    NASA Astrophysics Data System (ADS)

    Sisodia, Ashok; Riser, Andrew; Bayer, Michael; McGuire, James P.

    2006-05-01

    The Advanced Helmet Mounted Display (AHMD), augmented reality visual system first presented at last year's Cockpit and Future Displays for Defense and Security conference, has now been evaluated in a number of military simulator applications and by L-3 Link Simulation and Training. This paper presents the preliminary results of these evaluations and describes current and future simulator and training applications for HMD technology. The AHMD blends computer-generated data (symbology, synthetic imagery, enhanced imagery) with the actual and simulated visible environment. The AHMD is designed specifically for highly mobile deployable, minimum resource demanding reconfigurable virtual training systems to satisfy the military's in-theater warrior readiness objective. A description of the innovative AHMD system and future enhancements will be discussed.

  5. Crops in silico: A community wide multi-scale computational modeling framework of plant canopies

    NASA Astrophysics Data System (ADS)

    Srinivasan, V.; Christensen, A.; Borkiewic, K.; Yiwen, X.; Ellis, A.; Panneerselvam, B.; Kannan, K.; Shrivastava, S.; Cox, D.; Hart, J.; Marshall-Colon, A.; Long, S.

    2016-12-01

    Current crop models predict a looming gap between supply and demand for primary foodstuffs over the next 100 years. While significant yield increases were achieved in major food crops during the early years of the green revolution, the current rates of yield increases are insufficient to meet future projected food demand. Furthermore, with projected reduction in arable land, decrease in water availability, and increasing impacts of climate change on future food production, innovative technologies are required to sustainably improve crop yield. To meet these challenges, we are developing Crops in silico (Cis), a biologically informed, multi-scale, computational modeling framework that can facilitate whole plant simulations of crop systems. The Cis framework is capable of linking models of gene networks, protein synthesis, metabolic pathways, physiology, growth, and development in order to investigate crop response to different climate scenarios and resource constraints. This modeling framework will provide the mechanistic details to generate testable hypotheses toward accelerating directed breeding and engineering efforts to increase future food security. A primary objective for building such a framework is to create synergy among an inter-connected community of biologists and modelers to create a realistic virtual plant. This framework advantageously casts the detailed mechanistic understanding of individual plant processes across various scales in a common scalable framework that makes use of current advances in high performance and parallel computing. We are currently designing a user friendly interface that will make this tool equally accessible to biologists and computer scientists. Critically, this framework will provide the community with much needed tools for guiding future crop breeding and engineering, understanding the emergent implications of discoveries at the molecular level for whole plant behavior, and improved prediction of plant and ecosystem responses to the environment.

  6. Generation of a dynamo magnetic field in a protoplanetary accretion disk

    NASA Technical Reports Server (NTRS)

    Stepinski, T.; Levy, E. H.

    1987-01-01

    A new computational technique is developed that allows realistic calculations of dynamo magnetic field generation in disk geometries corresponding to protoplanetary and protostellar accretion disks. The approach is of sufficient generality to allow, in the future, a wide class of accretion disk problems to be solved. Here, basic modes of a disk dynamo are calculated. Spatially localized oscillatory states are found to occur in Keplerian disks. A physical interpretation is given that argues that spatially localized fields of the type found in these calculations constitute the basic modes of a Keplerian disk dynamo.

  7. Computational modeling of brain tumors: discrete, continuum or hybrid?

    NASA Astrophysics Data System (ADS)

    Wang, Zhihui; Deisboeck, Thomas S.

    In spite of all efforts, patients diagnosed with highly malignant brain tumors (gliomas), continue to face a grim prognosis. Achieving significant therapeutic advances will also require a more detailed quantitative understanding of the dynamic interactions among tumor cells, and between these cells and their biological microenvironment. Data-driven computational brain tumor models have the potential to provide experimental tumor biologists with such quantitative and cost-efficient tools to generate and test hypotheses on tumor progression, and to infer fundamental operating principles governing bidirectional signal propagation in multicellular cancer systems. This review highlights the modeling objectives of and challenges with developing such in silico brain tumor models by outlining two distinct computational approaches: discrete and continuum, each with representative examples. Future directions of this integrative computational neuro-oncology field, such as hybrid multiscale multiresolution modeling are discussed.

  8. Computational modeling of brain tumors: discrete, continuum or hybrid?

    NASA Astrophysics Data System (ADS)

    Wang, Zhihui; Deisboeck, Thomas S.

    2008-04-01

    In spite of all efforts, patients diagnosed with highly malignant brain tumors (gliomas), continue to face a grim prognosis. Achieving significant therapeutic advances will also require a more detailed quantitative understanding of the dynamic interactions among tumor cells, and between these cells and their biological microenvironment. Data-driven computational brain tumor models have the potential to provide experimental tumor biologists with such quantitative and cost-efficient tools to generate and test hypotheses on tumor progression, and to infer fundamental operating principles governing bidirectional signal propagation in multicellular cancer systems. This review highlights the modeling objectives of and challenges with developing such in silico brain tumor models by outlining two distinct computational approaches: discrete and continuum, each with representative examples. Future directions of this integrative computational neuro-oncology field, such as hybrid multiscale multiresolution modeling are discussed.

  9. Fundamentals, current state of the development of, and prospects for further improvement of the new-generation thermal-hydraulic computational HYDRA-IBRAE/LM code for simulation of fast reactor systems

    NASA Astrophysics Data System (ADS)

    Alipchenkov, V. M.; Anfimov, A. M.; Afremov, D. A.; Gorbunov, V. S.; Zeigarnik, Yu. A.; Kudryavtsev, A. V.; Osipov, S. L.; Mosunova, N. A.; Strizhov, V. F.; Usov, E. V.

    2016-02-01

    The conceptual fundamentals of the development of the new-generation system thermal-hydraulic computational HYDRA-IBRAE/LM code are presented. The code is intended to simulate the thermalhydraulic processes that take place in the loops and the heat-exchange equipment of liquid-metal cooled fast reactor systems under normal operation and anticipated operational occurrences and during accidents. The paper provides a brief overview of Russian and foreign system thermal-hydraulic codes for modeling liquid-metal coolants and gives grounds for the necessity of development of a new-generation HYDRA-IBRAE/LM code. Considering the specific engineering features of the nuclear power plants (NPPs) equipped with the BN-1200 and the BREST-OD-300 reactors, the processes and the phenomena are singled out that require a detailed analysis and development of the models to be correctly described by the system thermal-hydraulic code in question. Information on the functionality of the computational code is provided, viz., the thermalhydraulic two-phase model, the properties of the sodium and the lead coolants, the closing equations for simulation of the heat-mass exchange processes, the models to describe the processes that take place during the steam-generator tube rupture, etc. The article gives a brief overview of the usability of the computational code, including a description of the support documentation and the supply package, as well as possibilities of taking advantages of the modern computer technologies, such as parallel computations. The paper shows the current state of verification and validation of the computational code; it also presents information on the principles of constructing of and populating the verification matrices for the BREST-OD-300 and the BN-1200 reactor systems. The prospects are outlined for further development of the HYDRA-IBRAE/LM code, introduction of new models into it, and enhancement of its usability. It is shown that the program of development and practical application of the code will allow carrying out in the nearest future the computations to analyze the safety of potential NPP projects at a qualitatively higher level.

  10. A Future Accelerated Cognitive Distributed Hybrid Testbed for Big Data Science Analytics

    NASA Astrophysics Data System (ADS)

    Halem, M.; Prathapan, S.; Golpayegani, N.; Huang, Y.; Blattner, T.; Dorband, J. E.

    2016-12-01

    As increased sensor spectral data volumes from current and future Earth Observing satellites are assimilated into high-resolution climate models, intensive cognitive machine learning technologies are needed to data mine, extract, and intercompare model outputs. It is clear today that the next generation of computers and storage, beyond petascale cluster architectures, will be data centric. They will manage data movement and process data in place. Future cluster nodes have been announced that integrate multiple CPUs with high-speed links to GPUs and MICs on their backplanes, with massive non-volatile RAM and access to active flash RAM disk storage. Active Ethernet-connected key-value store disk drives with 10 GbE or higher are now available through the Kinetic Open Storage Alliance. At the UMBC Center for Hybrid Multicore Productivity Research, a future state-of-the-art Accelerated Cognitive Computer System (ACCS) for Big Data science is being integrated into the current IBM iDataplex computational system `bluewave'. Based on the next-generation IBM 200 PF Sierra processor, an interim two-node IBM Power S822 testbed is being integrated with dual Power 8 processors with 10 cores, 1 TB of RAM, a PCIe link to a K80 GPU, and an FPGA Coherent Accelerator Processor Interface card to 20 TB of flash RAM. This system is to be updated to the Power 8+, with NVLink 1.0 and the Pascal GPU, late in 2016. Moreover, the Seagate 96 TB Kinetic Disk system with 24 Ethernet-connected active disks is integrated into the ACCS storage system. A Lightweight Virtual File System developed at the NASA GSFC is installed on bluewave. Since remote access to publicly available quantum annealing computers is available at several government labs, the ACCS will offer an in-line Restricted Boltzmann Machine optimization capability to the D-Wave 2X quantum annealing processor over the campus high-speed 100 Gb network to Internet2 for large files. As an evaluation test of the cognitive functionality of the architecture, the following studies utilizing all the system components will be presented: (i) a near-real-time climate change study generating CO2 fluxes, (ii) a deep-dive capability into an 8000 x 8000 pixel image pyramid display, and (iii) large dense and sparse eigenvalue decompositions.

  11. Future impacts of distributed power generation on ambient ozone and particulate matter concentrations in the San Joaquin Valley of California.

    PubMed

    Vutukuru, Satish; Carreras-Sospedra, Marc; Brouwer, Jacob; Dabdub, Donald

    2011-12-01

    Distributed power generation-electricity generation that is produced by many small stationary power generators distributed throughout an urban air basin-has the potential to supply a significant portion of electricity in future years. As a result, distributed generation may lead to increased pollutant emissions within an urban air basin, which could adversely affect air quality. However, the use of combined heating and power with distributed generation may reduce the energy consumption for space heating and air conditioning, resulting in a net decrease of pollutant and greenhouse gas emissions. This work used a systematic approach based on land-use geographical information system data to determine the spatial and temporal distribution of distributed generation emissions in the San Joaquin Valley Air Basin of California and simulated the potential air quality impacts using state-of-the-art three-dimensional computer models. The evaluation of the potential market penetration of distributed generation focuses on the year 2023. In general, the air quality impacts of distributed generation were found to be small due to the restrictive 2007 California Air Resources Board air emission standards applied to all distributed generation units and due to the use of combined heating and power. Results suggest that if distributed generation units were allowed to emit at the current Best Available Control Technology standards (which are less restrictive than the 2007 California Air Resources Board standards), air quality impacts of distributed generation could compromise compliance with the federal 8-hr average ozone standard in the region.

  12. Future Impacts of Distributed Power Generation on Ambient Ozone and Particulate Matter Concentrations in the San Joaquin Valley of California.

    PubMed

    Vutukuru, Satish; Carreras-Sospedra, Marc; Brouwer, Jacob; Dabdub, Donald

    2011-12-01

    Distributed power generation-electricity generation that is produced by many small stationary power generators distributed throughout an urban air basin-has the potential to supply a significant portion of electricity in future years. As a result, distributed generation may lead to increased pollutant emissions within an urban air basin, which could adversely affect air quality. However, the use of combined heating and power with distributed generation may reduce the energy consumption for space heating and air conditioning, resulting in a net decrease of pollutant and greenhouse gas emissions. This work used a systematic approach based on land-use geographical information system data to determine the spatial and temporal distribution of distributed generation emissions in the San Joaquin Valley Air Basin of California and simulated the potential air quality impacts using state-of-the-art three-dimensional computer models. The evaluation of the potential market penetration of distributed generation focuses on the year 2023. In general, the air quality impacts of distributed generation were found to be small due to the restrictive 2007 California Air Resources Board air emission standards applied to all distributed generation units and due to the use of combined heating and power. Results suggest that if distributed generation units were allowed to emit at the current Best Available Control Technology standards (which are less restrictive than the 2007 California Air Resources Board standards), air quality impacts of distributed generation could compromise compliance with the federal 8-hr average ozone standard in the region.

  13. An Integrated Data-Driven Strategy for Safe-by-Design Nanoparticles: The FP7 MODERN Project.

    PubMed

    Brehm, Martin; Kafka, Alexander; Bamler, Markus; Kühne, Ralph; Schüürmann, Gerrit; Sikk, Lauri; Burk, Jaanus; Burk, Peeter; Tamm, Tarmo; Tämm, Kaido; Pokhrel, Suman; Mädler, Lutz; Kahru, Anne; Aruoja, Villem; Sihtmäe, Mariliis; Scott-Fordsmand, Janeck; Sorensen, Peter B; Escorihuela, Laura; Roca, Carlos P; Fernández, Alberto; Giralt, Francesc; Rallo, Robert

    2017-01-01

    The development and implementation of safe-by-design strategies is key for the safe development of future generations of nanotechnology enabled products. The safety testing of the huge variety of nanomaterials that can be synthesized is unfeasible due to time and cost constraints. Computational modeling facilitates the implementation of alternative testing strategies in a time- and cost-effective way. The development of predictive nanotoxicology models requires the use of high quality experimental data on the structure, physicochemical properties and bioactivity of nanomaterials. The FP7 Project MODERN has developed and evaluated the main components of a computational framework for the evaluation of the environmental and health impacts of nanoparticles. This chapter describes each of the elements of the framework including aspects related to data generation, management and integration; development of nanodescriptors; establishment of nanostructure-activity relationships; identification of nanoparticle categories; hazard ranking and risk assessment.

  14. [Simulation in surgical training].

    PubMed

    Nabavi, A; Schipper, J

    2017-01-01

    Patient safety during operations hinges on the surgeon's skills and abilities. However, surgical training has come under a variety of restrictions. The future challenge is to acquire dexterity within legislative time constraints, with fewer "simple" cases available and increasing expectations for surgical results. Are there alternatives to traditional master-apprentice learning? A literature review and analysis of the development, implementation, and evaluation of surgical simulation are presented. Simulation, using a variety of methods, most importantly physical and virtual (computer-generated) models, provides a safe environment to practice basic and advanced skills without endangering patients. These environments have specific strengths and weaknesses. Simulations can only serve to decrease the slope of learning curves, but cannot be a substitute for the real situation. Thus, they have to be an integral part of a comprehensive training curriculum. Our surgical societies have to take up that challenge to ensure the training of future generations.

  15. [INVITED] Computational intelligence for smart laser materials processing

    NASA Astrophysics Data System (ADS)

    Casalino, Giuseppe

    2018-03-01

    Computational intelligence (CI) involves using a computer algorithm to capture hidden knowledge from data and to use it to train an 'intelligent machine' to make complex decisions without human intervention. As simulation becomes more prevalent from design and planning to manufacturing and operations, laser material processing can also benefit from computer-generated knowledge through soft computing. This work is a review of the state of the art in the methodology and applications of CI in laser materials processing (LMP), which is nowadays receiving increasing interest from world-class manufacturers and Industry 4.0. The focus is on the methods that have proven effective and robust in solving several problems in welding, cutting, drilling, surface treating and additive manufacturing using the laser beam. After a basic description of the most common computational intelligence techniques employed in manufacturing, four sections, namely laser joining, machining, surface treatment, and additive manufacturing, cover the most recent applications in the already extensive literature on CI in LMP. Finally, emerging trends and future challenges are identified and discussed.
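    The review describes data-driven process models in general terms. The sketch below is a hedged, minimal illustration of the simplest such surrogate: a quadratic response surface fitted by least squares to a synthetic process window mapping laser power and travel speed to weld depth. All numbers are invented for the example; the CI methods surveyed in the paper (neural networks, fuzzy systems, evolutionary algorithms) generalize this idea.

```python
import numpy as np

# Illustrative surrogate model (not from the review): fit a quadratic response
# surface mapping laser power (W) and travel speed (mm/s) to weld depth (mm)
# on synthetic data.
rng = np.random.default_rng(1)
power = rng.uniform(1000, 3000, 50)          # hypothetical process window
speed = rng.uniform(10, 50, 50)
depth = 0.002 * power - 0.03 * speed + 0.5 + rng.normal(0, 0.1, 50)  # toy data

# Design matrix for a full quadratic model in two variables.
X = np.column_stack([np.ones_like(power), power, speed,
                     power**2, speed**2, power * speed])
coeffs, *_ = np.linalg.lstsq(X, depth, rcond=None)

def predict_depth(p, s):
    """Evaluate the fitted response surface at power p and speed s."""
    return np.dot([1.0, p, s, p**2, s**2, p * s], coeffs)

print(f"Predicted depth at 2000 W, 30 mm/s: {predict_depth(2000, 30):.2f} mm")
```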

  16. Computational Aerothermodynamic Design Issues for Hypersonic Vehicles

    NASA Technical Reports Server (NTRS)

    Gnoffo, Peter A.; Weilmuenster, K. James; Hamilton, H. Harris, II; Olynick, David R.; Venkatapathy, Ethiraj

    1997-01-01

    A brief review of the evolutionary progress in computational aerothermodynamics is presented. The current status of computational aerothermodynamics is then discussed, with emphasis on its capabilities and limitations for contributions to the design process of hypersonic vehicles. Some topics to be highlighted include: (1) aerodynamic coefficient predictions with emphasis on high temperature gas effects; (2) surface heating and temperature predictions for thermal protection system (TPS) design in a high temperature, thermochemical nonequilibrium environment; (3) methods for extracting and extending computational fluid dynamic (CFD) solutions for efficient utilization by all members of a multidisciplinary design team; (4) physical models; (5) validation process and error estimation; and (6) gridding and solution generation strategies. Recent experiences in the design of X-33 will be featured. Computational aerothermodynamic contributions to Mars Pathfinder, METEOR, and Stardust (Comet Sample return) will also provide context for this discussion. Some of the barriers that currently limit computational aerothermodynamics to a predominantly reactive mode in the design process will also be discussed, with the goal of providing focus for future research.

  17. Computational Aerothermodynamic Design Issues for Hypersonic Vehicles

    NASA Technical Reports Server (NTRS)

    Gnoffo, Peter A.; Weilmuenster, K. James; Hamilton, H. Harris, II; Olynick, David R.; Venkatapathy, Ethiraj

    2005-01-01

    A brief review of the evolutionary progress in computational aerothermodynamics is presented. The current status of computational aerothermodynamics is then discussed, with emphasis on its capabilities and limitations for contributions to the design process of hypersonic vehicles. Some topics to be highlighted include: (1) aerodynamic coefficient predictions with emphasis on high temperature gas effects; (2) surface heating and temperature predictions for thermal protection system (TPS) design in a high temperature, thermochemical nonequilibrium environment; (3) methods for extracting and extending computational fluid dynamic (CFD) solutions for efficient utilization by all members of a multidisciplinary design team; (4) physical models; (5) validation process and error estimation; and (6) gridding and solution generation strategies. Recent experiences in the design of X-33 will be featured. Computational aerothermodynamic contributions to Mars Pathfinder, METEOR, and Stardust (Comet Sample return) will also provide context for this discussion. Some of the barriers that currently limit computational aerothermodynamics to a predominantly reactive mode in the design process will also be discussed, with the goal of providing focus for future research.

  18. Computational Aerothermodynamic Design Issues for Hypersonic Vehicles

    NASA Technical Reports Server (NTRS)

    Olynick, David R.; Venkatapathy, Ethiraj

    2004-01-01

    A brief review of the evolutionary progress in computational aerothermodynamics is presented. The current status of computational aerothermodynamics is then discussed, with emphasis on its capabilities and limitations for contributions to the design process of hypersonic vehicles. Some topics to be highlighted include: (1) aerodynamic coefficient predictions with emphasis on high temperature gas effects; (2) surface heating and temperature predictions for thermal protection system (TPS) design in a high temperature, thermochemical nonequilibrium environment; (3) methods for extracting and extending computational fluid dynamic (CFD) solutions for efficient utilization by all members of a multidisciplinary design team; (4) physical models; (5) validation process and error estimation; and (6) gridding and solution generation strategies. Recent experiences in the design of X-33 will be featured. Computational aerothermodynamic contributions to Mars Pathfinder, METEOR, and Stardust (Comet Sample return) will also provide context for this discussion. Some of the barriers that currently limit computational aerothermodynamics to a predominantly reactive mode in the design process will also be discussed, with the goal of providing focus for future research.

  19. Introducing Enabling Computational Tools to the Climate Sciences: Multi-Resolution Climate Modeling with Adaptive Cubed-Sphere Grids

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jablonowski, Christiane

    The research investigates and advances strategies for bridging the scale discrepancies between local, regional and global phenomena in climate models without the prohibitive computational costs of global cloud-resolving simulations. In particular, the research explores new frontiers in computational geoscience by introducing high-order Adaptive Mesh Refinement (AMR) techniques into climate research. AMR and statically-adapted variable-resolution approaches represent an emerging trend for atmospheric models and are likely to become the new norm in future-generation weather and climate models. The research advances the understanding of multi-scale interactions in the climate system and showcases a pathway to model these interactions effectively with advanced computational tools, like the Chombo AMR library developed at the Lawrence Berkeley National Laboratory. The research is interdisciplinary and combines applied mathematics, scientific computing and the atmospheric sciences. In this research project, a hierarchy of high-order atmospheric models on cubed-sphere computational grids has been developed that serves as an algorithmic prototype for the finite-volume solution-adaptive Chombo-AMR approach. The foci of the investigations have been the characteristics of both static mesh adaptations and dynamically-adaptive grids that can capture flow fields of interest like tropical cyclones. Six research themes have been chosen. These are (1) the introduction of adaptive mesh refinement techniques into the climate sciences, (2) advanced algorithms for nonhydrostatic atmospheric dynamical cores, (3) an assessment of the interplay between resolved-scale dynamical motions and subgrid-scale physical parameterizations, (4) evaluation techniques for atmospheric model hierarchies, (5) the comparison of AMR refinement strategies and (6) tropical cyclone studies with a focus on multi-scale interactions and variable-resolution modeling. The results of this research project demonstrate significant advances in all six research areas. The major conclusions are that statically-adaptive variable-resolution modeling is currently becoming mature in the climate sciences, and that AMR holds outstanding promise for future-generation weather and climate models on high-performance computing architectures.
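    As a much-simplified illustration of the flagging step that underlies adaptive mesh refinement, the sketch below marks 1D cells whose local solution gradient exceeds a threshold. It is not the Chombo AMR library or the project's cubed-sphere algorithms; real block-structured AMR organizes flagged cells into nested patches, and the threshold here is hypothetical.

```python
import numpy as np

# Much-simplified AMR flagging criterion: refine where the gradient is large.
x = np.linspace(0, 1, 200)
q = np.tanh((x - 0.5) / 0.02)          # a sharp front, e.g. a tracer field

grad = np.abs(np.gradient(q, x))       # cell-wise gradient magnitude
threshold = 0.1 * grad.max()           # hypothetical refinement threshold
refine = grad > threshold              # boolean refinement flags per cell

print(f"{refine.sum()} of {len(x)} cells flagged for refinement "
      f"around x = {x[refine].min():.2f}..{x[refine].max():.2f}")
```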

  20. Considerations in video playback design: using optic flow analysis to examine motion characteristics of live and computer-generated animation sequences.

    PubMed

    Woo, Kevin L; Rieucau, Guillaume

    2008-07-01

    The increasing use of the video playback technique in behavioural ecology reveals a growing need to ensure better control of the visual stimuli that focal animals experience. Technological advances now allow researchers to develop computer-generated animations instead of using video sequences of live-acting demonstrators. However, care must be taken to match the motion characteristics (speed and velocity) of the animation to the original video source. Here, we presented a tool based on the use of an optic flow analysis program to measure the resemblance of the motion characteristics of computer-generated animations to those of videos of live-acting animals. We examined three distinct displays (tail-flick (TF), push-up body rock (PUBR), and slow arm wave (SAW)) exhibited by animations of Jacky dragons (Amphibolurus muricatus) that were compared to the original video sequences of live lizards. We found no significant differences between the motion characteristics of videos and animations across all three displays. Our results showed that our animations are similar to the original videos in the speed and velocity features of each display. Researchers need to ensure that similar motion characteristics in animation and video stimuli are represented, and this feature is a critical component in the future success of the video playback technique.
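    The abstract does not specify the optic flow program used. As a hedged sketch of the general idea, the snippet below uses OpenCV's dense Farneback optical flow to compute the mean per-frame flow speed of a clip, which could then be compared between a live-animal video and its animation; the file names are hypothetical placeholders.

```python
import cv2
import numpy as np

def mean_flow_speed(path, max_frames=200):
    """Mean dense optical-flow speed (pixels/frame) over the first frames of a clip."""
    cap = cv2.VideoCapture(path)
    ok, prev = cap.read()
    if not ok:
        raise IOError(f"Could not read {path}")
    prev = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)
    speeds = []
    for _ in range(max_frames):
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        flow = cv2.calcOpticalFlowFarneback(prev, gray, None,
                                            0.5, 3, 15, 3, 5, 1.2, 0)
        speeds.append(np.linalg.norm(flow, axis=2).mean())  # mean |(u, v)| per pixel
        prev = gray
    cap.release()
    return float(np.mean(speeds))

# Hypothetical file names: compare a live-lizard clip with its animation.
print(mean_flow_speed("live_TF.avi"), mean_flow_speed("animated_TF.avi"))
```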

  1. A Novel Numerical Approach for Generation and Propagation of Rotor-Stator Interaction Noise

    NASA Astrophysics Data System (ADS)

    Patel, Krishna

    As turbofan engine designs move towards bypass ratios ≥12 and corresponding low pressure ratios, fan rotor blade tip Mach numbers are reduced, leading to rotor-stator interaction becoming an important contributor to tonal fan noise. For future aircraft configurations employing boundary layer ingestion, non-uniform flow enters the fan. The impact of such non-uniform flows on the generation and propagation of rotor-stator interaction tones has yet to be assessed. In this thesis, a novel approach is proposed to numerically predict the generation and propagation of rotor-stator interaction noise with distorted inflow. The approach enables a 42% reduction in computational cost compared to traditional approaches employing a sliding interface between the rotor and stator. Such an interface may distort rotor wakes and can cause non-physical acoustic wave reflections if time steps are not sufficiently small. Computational costs are reduced by modelling the rotor using distributed, volumetric body forces. This eliminates the need for a sliding interface and thus allows a larger time step size. The force model responds to local flow conditions and thus can capture the effects of long-wavelength flow distortions. Since interaction noise is generated by the incidence of the rotor wakes onto the stator vanes, the key challenge is to produce the wakes using a body force field since the rotor blades are not directly modelled. It is shown that such an approach can produce wakes by concentrating the viscous forces along streamtubes in the last 15% chord. The new approach to rotor wake generation is assessed on the GE R4 fan from NASA's Source Diagnostic Test, for which the computed overall aerodynamic performance matches the experiment to within 1%. The rotor blade wakes are generated with widths in excellent agreement and depths in fair agreement with the experiment. An assessment of modal sound power levels computed in the exhaust duct indicates that this approach can be used for predicting downstream propagating interaction noise.

  2. Holodeck: Telepresence Dome Visualization System Simulations

    NASA Technical Reports Server (NTRS)

    Hite, Nicolas

    2012-01-01

    This paper explores the simulation and consideration of different image-projection strategies for the Holodeck, a dome that will be used for highly immersive telepresence operations in future endeavors of the National Aeronautics and Space Administration (NASA). Its visualization system will include a full 360 degree projection onto the dome's interior walls in order to display video streams from both simulations and recorded video. Because humans innately trust their vision to precisely report their surroundings, the Holodeck's visualization system is crucial to its realism. This system will be rigged with an integrated hardware and software infrastructure, namely a system of projectors that will work with a Graphics Processing Unit (GPU) and computer to both project images onto the dome and correct warping in those projections in real time. Using both Computer-Aided Design (CAD) and ray-tracing software, virtual models of various dome/projector geometries were created and simulated via tracking and analysis of virtual light sources, leading to the selection of two possible configurations for installation. Research into image warping and the generation of dome-ready video content was also conducted, including generation of fisheye images, distortion correction, and the development of a reliable content-generation pipeline.
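    To illustrate the kind of mapping math behind fisheye content generation and warp correction, the sketch below applies a standard equidistant ("f-theta") fisheye model to recover the dome direction covered by each pixel of a square fisheye frame. It is only an illustration under that assumed model; a real dome pipeline would build per-projector warp maps from the measured dome/projector geometry, and the frame size and field of view here are hypothetical.

```python
import numpy as np

# Equidistant fisheye mapping: pixel radius is proportional to zenith angle.
size = 1024                                  # fisheye frame is size x size pixels
fov = np.pi                                  # 180-degree dome coverage

v, u = np.mgrid[0:size, 0:size]
x = (u - size / 2) / (size / 2)              # normalized coordinates in [-1, 1]
y = (v - size / 2) / (size / 2)
r = np.sqrt(x**2 + y**2)

theta = r * (fov / 2)                        # zenith angle on the dome
phi = np.arctan2(y, x)                       # azimuth on the dome
valid = r <= 1.0                             # pixels inside the fisheye circle

print(f"{valid.mean():.1%} of the frame maps onto the dome; "
      f"max zenith angle {np.degrees(theta[valid].max()):.0f} deg")
```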

  3. Evaluating the uncertainty of predicting future climate time series at the hourly time scale

    NASA Astrophysics Data System (ADS)

    Caporali, E.; Fatichi, S.; Ivanov, V. Y.

    2011-12-01

    A stochastic downscaling methodology is developed to generate hourly, point-scale time series for several meteorological variables, such as precipitation, cloud cover, shortwave radiation, air temperature, relative humidity, wind speed, and atmospheric pressure. The methodology uses multi-model General Circulation Model (GCM) realizations and an hourly weather generator, AWE-GEN. Probabilistic descriptions of factors of change (a measure of climate change with respect to historic conditions) are computed for several climate statistics and different aggregation times using a Bayesian approach that weights the individual GCM contributions. The Monte Carlo method is applied to sample the factors of change from their respective distributions, thereby permitting the generation of time series in an ensemble fashion that reflects the uncertainty of future climate projections as well as the uncertainty of the downscaling procedure. Applications of the methodology and probabilistic expressions of certainty in reproducing future climates for the periods 2000 - 2009, 2046 - 2065 and 2081 - 2100, using the 1962 - 1992 period as the baseline, are discussed for the location of Firenze (Italy). The climate predictions for the period of 2000 - 2009 are tested against observations, permitting an assessment of the reliability and uncertainties of the methodology in reproducing statistics of meteorological variables at different time scales.
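    The core ensemble idea can be sketched very roughly as follows: draw a factor of change from a distribution (standing in for the Bayesian-weighted multi-model posterior) and apply it to a baseline series, once per ensemble member. This is a deliberate simplification: the actual methodology re-parameterizes the AWE-GEN weather generator rather than rescaling observations, and every number below is hypothetical.

```python
import numpy as np

# Simplified ensemble-of-futures sketch using sampled factors of change.
rng = np.random.default_rng(42)
baseline = rng.gamma(shape=0.1, scale=2.0, size=24 * 365)    # toy hourly precip (mm)

n_ensemble = 100
factors = rng.normal(loc=1.10, scale=0.15, size=n_ensemble)  # sampled factors of change
ensemble = np.array([f * baseline for f in factors])          # one future per member

annual_totals = ensemble.sum(axis=1)
print(f"Future annual precipitation: {annual_totals.mean():.0f} mm "
      f"(5th-95th percentile: {np.percentile(annual_totals, 5):.0f}-"
      f"{np.percentile(annual_totals, 95):.0f} mm)")
```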

  4. Visidep (TM): A Three-Dimensional Imaging System For The Unaided Eye

    NASA Astrophysics Data System (ADS)

    McLaurin, A. Porter; Jones, Edwin R.; Cathey, LeConte

    1984-05-01

    The VISIDEP process for creating images in three dimensions on flat screens is suitable for photographic, electrographic and computer generated imaging systems. Procedures for generating these images vary from medium to medium due to the specific requirements of each technology. Imaging requirements for photographic and electrographic media are more directly tied to the hardware than are computer based systems. Applications of these technologies are not limited to entertainment, but have implications for training, interactive computer/video systems, medical imaging, and inspection equipment. Through minor modification the system can provide three-dimensional images with accurately measurable relationships for robotics, adding this capability to future developments in artificial intelligence. In almost any area requiring image analysis or critical review, VISIDEP provides the added advantage of three-dimensionality. All of this is readily accomplished without aids to the human eye. The system can be viewed in full color, false-color infrared, and monochromatic modalities from any angle and is also viewable with a single eye. Thus, the potential of application for this developing system is extensive and covers the broad spectrum of human endeavor from entertainment to scientific study.

  5. Aerosciences, Aero-Propulsion and Flight Mechanics Technology Development for NASA's Next Generation Launch Technology Program

    NASA Technical Reports Server (NTRS)

    Cockrell, Charles E., Jr.

    2003-01-01

    The Next Generation Launch Technology (NGLT) program, Vehicle Systems Research and Technology (VSR&T) project is pursuing technology advancements in aerothermodynamics, aeropropulsion and flight mechanics to enable development of future reusable launch vehicle (RLV) systems. The current design trade space includes rocket-propelled, hypersonic airbreathing and hybrid systems in two-stage and single-stage configurations. Aerothermodynamics technologies include experimental and computational databases to evaluate stage separation of two-stage vehicles as well as computational and trajectory simulation tools for this problem. Additionally, advancements in high-fidelity computational tools and measurement techniques are being pursued along with the study of flow physics phenomena, such as boundary-layer transition. Aero-propulsion technology development includes scramjet flowpath development and integration, with a current emphasis on hypervelocity (Mach 10 and above) operation, as well as the study of aero-propulsive interactions and the impact on overall vehicle performance. Flight mechanics technology development is focused on advanced guidance, navigation and control (GN&C) algorithms and adaptive flight control systems for both rocket-propelled and airbreathing vehicles.

  6. Unsteady Aero Computation of a 1 1/2 Stage Large Scale Rotating Turbine

    NASA Technical Reports Server (NTRS)

    To, Wai-Ming

    2012-01-01

    This report is the documentation of the work performed for the Subsonic Rotary Wing Project under NASA's Fundamental Aeronautics Program. It was funded through Task Number NNC10E420T under GESS-2 Contract NNC06BA07B in the period of 10/1/2010 to 8/31/2011. The objective of the task is to provide support for the development of variable speed power turbine technology through application of computational fluid dynamics analyses. This includes work elements in mesh generation, multistage URANS simulations, and post-processing of the simulation results for comparison with the experimental data. The unsteady CFD calculations were performed with the TURBO code running in multistage single passage (phase lag) mode. Meshes for the blade rows were generated with the NASA-developed TCGRID code. The CFD performance is assessed and improvements are recommended for future research in this area. For that purpose, the United Technologies Research Center's 1 1/2 stage Large Scale Rotating Turbine was selected to be the candidate engine configuration for this computational effort because of the completeness and availability of the data.

  7. CFD Modelling of a Quadrupole Vortex Inside a Cylindrical Channel for Research into Advanced Hybrid Rocket Designs

    NASA Astrophysics Data System (ADS)

    Godfrey, B.; Majdalani, J.

    2014-11-01

    This study relies on computational fluid dynamics (CFD) tools to analyse a possible method for creating a stable quadrupole vortex within a simulated, circular-port, cylindrical rocket chamber. A model of the vortex generator is created in a SolidWorks CAD program and then the grid is generated using the Pointwise mesh generation software. The non-reactive flowfield is simulated using an open source computational program, Stanford University Unstructured (SU2). Subsequent analysis and visualization are performed using ParaView. The vortex generation approach that we employ consists of four tangentially injected monopole vortex generators that are arranged symmetrically with respect to the center of the chamber in such a way as to produce a quadrupole vortex with a common downwash. The present investigation focuses on characterizing the flow dynamics so that future investigations can be undertaken with increasing levels of complexity. Our CFD simulations help to elucidate the onset of vortex filaments within the monopole tubes, and the evolution of quadrupole vortices downstream of the injection faceplate. Our results indicate that the quadrupole vortices produced using the present injection pattern can become quickly unstable, to the extent of dissipating soon after being introduced into the simulated rocket chamber. We conclude that a change in the geometrical configuration will be necessary to produce more stable quadrupoles.

  8. Application of Intrusion Tolerance Technology to Joint Battlespace Infosphere (JBI)

    DTIC Science & Technology

    2003-02-01

    performance, scalability and Security Issues and Requirements for Internet-Scale Publish-Subscribe Systems Chenxi Wang, Antonio Carzaniga, David ...by the Defense Advanced Research Agency, under the agreement number F30602-96-1-0314. The work of David Evans was supported in part by the...Future Generations of Computer Science. October 1998. [10]. D. Chaum, C. Crepeau, and I. Damgard. “Multiparty Unconditionally Secure Protocols,” In

  9. Spin Glass Patch Planting

    NASA Technical Reports Server (NTRS)

    Wang, Wenlong; Mandra, Salvatore; Katzgraber, Helmut G.

    2016-01-01

    In this paper, we propose a patch planting method for creating arbitrarily large spin glass instances with known ground states. The scaling of the computational complexity of these instances with various block numbers and sizes is investigated and compared with random instances using population annealing Monte Carlo and the quantum annealing DW2X machine. The method can be useful for benchmarking future-generation quantum annealing machines as well as classical and quantum optimization algorithms.
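    For orientation only, the sketch below shows a much-simplified planting construction that is not the paper's block/patch scheme: choosing couplings J_ij = t_i * t_j makes the planted configuration t a ground state of H(s) = -sum_<ij> J_ij s_i s_j by construction, with ground-state energy equal to minus the number of bonds. Lattice size and seed are arbitrary.

```python
import numpy as np

# Simplified planting of a known ground state on an L x L Ising lattice.
rng = np.random.default_rng(7)
L = 8
t = rng.choice([-1, 1], size=(L, L))            # planted configuration

def energy(s, t):
    """H(s) = -sum over nearest-neighbour bonds of J_ij s_i s_j with J_ij = t_i t_j."""
    e = -np.sum((t[:, :-1] * t[:, 1:]) * (s[:, :-1] * s[:, 1:]))   # horizontal bonds
    e -= np.sum((t[:-1, :] * t[1:, :]) * (s[:-1, :] * s[1:, :]))   # vertical bonds
    return e

n_bonds = 2 * L * (L - 1)
print("Planted energy:", energy(t, t), "expected:", -n_bonds)
print("Random state:  ", energy(rng.choice([-1, 1], size=(L, L)), t))
```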

  10. Assessment of Laminar, Convective Aeroheating Prediction Uncertainties for Mars Entry Vehicles

    NASA Technical Reports Server (NTRS)

    Hollis, Brian R.; Prabhu, Dinesh K.

    2011-01-01

    An assessment of computational uncertainties is presented for numerical methods used by NASA to predict laminar, convective aeroheating environments for Mars entry vehicles. A survey was conducted of existing experimental heat-transfer and shock-shape data for high enthalpy, reacting-gas CO2 flows and five relevant test series were selected for comparison to predictions. Solutions were generated at the experimental test conditions using NASA state-of-the-art computational tools and compared to these data. The comparisons were evaluated to establish predictive uncertainties as a function of total enthalpy and to provide guidance for future experimental testing requirements to help lower these uncertainties.

  11. Assessment of Laminar, Convective Aeroheating Prediction Uncertainties for Mars-Entry Vehicles

    NASA Technical Reports Server (NTRS)

    Hollis, Brian R.; Prabhu, Dinesh K.

    2013-01-01

    An assessment of computational uncertainties is presented for numerical methods used by NASA to predict laminar, convective aeroheating environments for Mars-entry vehicles. A survey was conducted of existing experimental heat transfer and shock-shape data for high-enthalpy reacting-gas CO2 flows, and five relevant test series were selected for comparison with predictions. Solutions were generated at the experimental test conditions using NASA state-of-the-art computational tools and compared with these data. The comparisons were evaluated to establish predictive uncertainties as a function of total enthalpy and to provide guidance for future experimental testing requirements to help lower these uncertainties.

  12. Computer-aided injection molding system

    NASA Astrophysics Data System (ADS)

    Wang, K. K.; Shen, S. F.; Cohen, C.; Hieber, C. A.; Isayev, A. I.

    1982-10-01

    Achievements are reported in cavity-filling simulation, modeling viscoelastic effects, measuring and predicting frozen-in birefringence in molded parts, measuring residual stresses and associated mechanical properties of molded parts, and developing an interactive mold-assembly design program and an automatic NC machining data generation and verification program. The Cornell Injection Molding Program (CIMP) consortium is discussed, as are computer user manuals that have been published by the consortium. Major tasks which should be addressed in future efforts are listed, including: (1) predict and experimentally determine the post-filling behavior of thermoplastics; (2) simulate and experimentally investigate the injection molding of thermosets and filled materials; and (3) further investigate residual stresses, orientation and mechanical properties.

  13. Online Treatment and Virtual Therapists in Child and Adolescent Psychiatry.

    PubMed

    Schueller, Stephen M; Stiles-Shields, Colleen; Yarosh, Lana

    2017-01-01

    Online and virtual therapies are a well-studied and efficacious treatment option for various mental and behavioral health conditions among children and adolescents. However, many interventions have not considered the unique affordances offered by technologies that might align with the capacities and interests of youth users. In this article, the authors discuss learnings from child-computer interaction that can inform future generations of interventions and guide developers, practitioners, and researchers how to best use new technologies for youth populations. The article concludes with innovative examples illustrating future potentials of online and virtual therapies such as gaming and social networking. Copyright © 2016 Elsevier Inc. All rights reserved.

  14. Future requirements in surface modeling and grid generation

    NASA Technical Reports Server (NTRS)

    Cosner, Raymond R.

    1995-01-01

    The past ten years have seen steady progress in surface modeling procedures, and wholesale changes in grid generation technology. Today, it seems fair to state that a satisfactory grid can be developed to model nearly any configuration of interest. The issues at present focus on operational concerns such as cost and quality. Continuing evolution of the engineering process is placing new demands on the technologies of surface modeling and grid generation. In the evolution toward a multidisciplinary analysis-based design environment, methods developed for Computational Fluid Dynamics are finding acceptance in many additional applications. These two trends, the normal evolution of the process and a watershed shift toward concurrent and multidisciplinary analysis, will be considered in assessing current capabilities and needed technological improvements.

  15. Computational Drafting of Plot Structures for Russian Folk Tales.

    PubMed

    Gervás, Pablo

    The plots of stories are known to follow general patterns in terms of their overall structure. This was the basic tenet of structuralist approaches to narratology. Vladimir Propp proposed a procedure for the generation of new tales based on his semi-formal description of the structure of Russian folk tales. This is one of the first existing instances of a creative process described procedurally. The present paper revisits Propp's morphology to build a system that generates instances of Russian folk tales. Propp's view of the folk tale as a rigid sequence of character functions is employed as a plot driver, and some issues that Propp declared relevant but did not explore in detail, such as long-range dependencies between functions or the importance of endings, are given computational shape in the context of a broader architecture that captures all the aspects discussed by Propp. A set of simple evaluation metrics for the resulting outputs is defined, inspired by Propp's formalism. The potential of the resulting system as a creative story generation system is discussed, along with possible lines of future work.
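    As a toy illustration of the "rigid sequence of character functions as plot driver" idea, and not of the paper's actual architecture, the sketch below realizes each function in a fixed Propp-style sequence by picking one canned sentence. The function names follow Propp; the sentence templates are invented for the example.

```python
import random

# Toy Propp-style plot drafting: a fixed sequence of character functions,
# each realized by one randomly chosen template sentence.
FUNCTIONS = ["absentation", "interdiction", "violation",
             "villainy", "departure", "struggle", "victory", "return"]

REALIZATIONS = {
    "absentation":  ["The parents leave for the fair."],
    "interdiction": ["The hero is told not to open the cellar door."],
    "violation":    ["The hero opens the cellar door anyway."],
    "villainy":     ["A dragon carries off the princess.",
                     "A witch steals the magic apples."],
    "departure":    ["The hero sets out from home."],
    "struggle":     ["The hero and the villain fight on the bridge.",
                     "The hero and the villain compete in a riddle contest."],
    "victory":      ["The villain is defeated."],
    "return":       ["The hero returns and is married."],
}

def draft_tale(seed=None):
    """Draft one tale by realizing each function in order."""
    rng = random.Random(seed)
    return " ".join(rng.choice(REALIZATIONS[f]) for f in FUNCTIONS)

print(draft_tale(seed=3))
```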

  16. Thermoelectric Power Generation System for Future Hybrid Vehicles Using Hot Exhaust Gas

    NASA Astrophysics Data System (ADS)

    Kim, Sun-Kook; Won, Byeong-Cheol; Rhi, Seok-Ho; Kim, Shi-Ho; Yoo, Jeong-Ho; Jang, Ju-Chan

    2011-05-01

    The present experimental and computational study investigates a new exhaust gas waste heat recovery system for hybrid vehicles, using a thermoelectric module (TEM) and heat pipes to produce electric power. It proposes a new thermoelectric generation (TEG) system, working with heat pipes to produce electricity from a limited hot surface area. The current TEG system is directly connected to the exhaust pipe, and the amount of electricity generated by the TEMs is directly proportional to their heated area. Current exhaust pipes fail to offer a sufficiently large hot surface area for the high-efficiency waste heat recovery required. To overcome this, a new TEG system has been designed to have an enlarged hot surface area by the addition of ten heat pipes, which act as highly efficient heat transfer devices and can transmit the heat to many TEMs. As designed, this new waste heat recovery system produces a maximum 350 W when the hot exhaust gas heats the evaporator surface of the heat pipe to 170°C; this promises great possibilities for application of this technology in future energy-efficient hybrid vehicles.
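    For rough context on the reported output, the snippet below applies the standard matched-load estimate of thermoelectric module power, P ≈ N (S ΔT)² / (4R). Every parameter value is a hypothetical placeholder and is not taken from the paper, which reports about 350 W with a 170°C evaporator surface.

```python
# Back-of-envelope matched-load estimate of thermoelectric generator output.
S = 0.05        # module Seebeck coefficient, V/K (hypothetical)
R = 1.5         # module internal resistance, ohm (hypothetical)
dT = 120.0      # hot-to-cold temperature difference across a module, K (hypothetical)
n_modules = 24  # number of TEMs fed by the heat pipes (hypothetical)

p_module = (S * dT) ** 2 / (4.0 * R)   # matched-load power per module, W
print(f"Per module: {p_module:.1f} W, system: {n_modules * p_module:.0f} W")
```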

  17. Virtual worlds to support patient group communication? A questionnaire study investigating potential for virtual world focus group use by respiratory patients.

    PubMed

    Taylor, Michael J; Taylor, Dave; Vlaev, Ivo; Elkin, Sarah

    2017-01-01

    Recent advances in communication technologies enable potential provision of remote education for patients using computer-generated environments known as virtual worlds. Previous research has revealed highly variable levels of patient receptiveness to using information technologies for healthcare-related purposes. This preliminary study involved implementing a questionnaire investigating attitudes and access to computer technologies of respiratory outpatients, in order to assess potential for use of virtual worlds to facilitate health-related education for this sample. Ninety-four patients with a chronic respiratory condition completed surveys, which were distributed at a Chest Clinic. In accordance with our prediction, younger participants were more likely to be able to use, and have access to, a computer, and some patients were keen to explore the use of virtual worlds for healthcare-related purposes: of those with access to computer facilities, 14.50% expressed a willingness to attend a virtual world focus group. Results indicate future virtual world health education facilities should be designed to cater for younger patients, because this group are most likely to accept and use such facilities. Within the study sample, this is likely to comprise people diagnosed with asthma. Future work could investigate the potential of creating a virtual world asthma education facility.

  18. Virtual worlds to support patient group communication? A questionnaire study investigating potential for virtual world focus group use by respiratory patients

    PubMed Central

    Taylor, Michael J.; Taylor, Dave; Vlaev, Ivo; Elkin, Sarah

    2015-01-01

    Recent advances in communication technologies enable potential provision of remote education for patients using computer-generated environments known as virtual worlds. Previous research has revealed highly variable levels of patient receptiveness to using information technologies for healthcare-related purposes. This preliminary study involved implementing a questionnaire investigating attitudes and access to computer technologies of respiratory outpatients, in order to assess potential for use of virtual worlds to facilitate health-related education for this sample. Ninety-four patients with a chronic respiratory condition completed surveys, which were distributed at a Chest Clinic. In accordance with our prediction, younger participants were more likely to be able to use, and have access to, a computer, and some patients were keen to explore the use of virtual worlds for healthcare-related purposes: of those with access to computer facilities, 14.50% expressed a willingness to attend a virtual world focus group. Results indicate future virtual world health education facilities should be designed to cater for younger patients, because this group are most likely to accept and use such facilities. Within the study sample, this is likely to comprise people diagnosed with asthma. Future work could investigate the potential of creating a virtual world asthma education facility. PMID:28239187

  19. Interactive design and analysis of future large spacecraft concepts

    NASA Technical Reports Server (NTRS)

    Garrett, L. B.

    1981-01-01

    An interactive computer-aided design program used to perform systems-level design and analysis of large spacecraft concepts is presented. Emphasis is on rapid design, analysis of integrated spacecraft, and automatic spacecraft modeling for lattice structures. Capabilities and performance of multidiscipline applications modules, the executive and data management software, and graphics display features are reviewed. A single user at an interactive terminal can create, design, analyze, and conduct parametric studies of Earth-orbiting spacecraft with relative ease. Data generated in the design, analysis, and performance evaluation of an Earth-orbiting large diameter antenna satellite are used to illustrate current capabilities. Computer run time statistics for the individual modules quantify the speed at which modeling, analysis, and design evaluation of integrated spacecraft concepts are accomplished in a user-interactive computing environment.

  20. Physical aspects of computing the flow of a viscous fluid

    NASA Technical Reports Server (NTRS)

    Mehta, U. B.

    1984-01-01

    One of the main themes in fluid dynamics at present and in the future is going to be computational fluid dynamics with the primary focus on the determination of drag, flow separation, vortex flows, and unsteady flows. A computation of the flow of a viscous fluid requires an understanding and consideration of the physical aspects of the flow. This is done by identifying the flow regimes and the scales of fluid motion, and the sources of vorticity. Discussions of flow regimes deal with conditions of incompressibility, transitional and turbulent flows, Navier-Stokes and non-Navier-Stokes regimes, shock waves, and strain fields. Discussions of the scales of fluid motion consider transitional and turbulent flows, thin- and slender-shear layers, triple- and four-deck regions, viscous-inviscid interactions, shock waves, strain rates, and temporal scales. In addition, the significance and generation of vorticity are discussed. These physical aspects mainly guide computations of the flow of a viscous fluid.

  1. Future Directions: Advances and Implications of Virtual Environments Designed for Pain Management

    PubMed Central

    Soomro, Ahmad; Riva, Giuseppe; Wiederhold, Mark D.

    2014-01-01

    Abstract Pain symptoms have been addressed with a variety of therapeutic measures in the past, but as we look to the future, we begin encountering new options for patient care and individual health and well-being. Recent studies indicate that computer-generated graphic environments—virtual reality (VR)—can offer effective cognitive distractions for individuals suffering from pain arising from a variety of physical and psychological illnesses. Studies also indicate the effectiveness of VR for both chronic and acute pain conditions. Future possibilities for VR to address pain-related concerns include such diverse groups as military personnel, space exploration teams, the general labor force, and our ever increasing elderly population. VR also shows promise to help in such areas as drug abuse, at-home treatments, and athletic injuries. PMID:24892206

  2. Future directions: advances and implications of virtual environments designed for pain management.

    PubMed

    Wiederhold, Brenda K; Soomro, Ahmad; Riva, Giuseppe; Wiederhold, Mark D

    2014-06-01

    Pain symptoms have been addressed with a variety of therapeutic measures in the past, but as we look to the future, we begin encountering new options for patient care and individual health and well-being. Recent studies indicate that computer-generated graphic environments--virtual reality (VR)--can offer effective cognitive distractions for individuals suffering from pain arising from a variety of physical and psychological illnesses. Studies also indicate the effectiveness of VR for both chronic and acute pain conditions. Future possibilities for VR to address pain-related concerns include such diverse groups as military personnel, space exploration teams, the general labor force, and our ever increasing elderly population. VR also shows promise to help in such areas as drug abuse, at-home treatments, and athletic injuries.

  3. CANFAR + Skytree: Mining Massive Datasets as an Essential Part of the Future of Astronomy

    NASA Astrophysics Data System (ADS)

    Ball, Nicholas M.

    2013-01-01

    The future study of large astronomical datasets, consisting of hundreds of millions to billions of objects, will be dominated by large computing resources, and by analysis tools of the necessary scalability and sophistication to extract useful information. Significant effort will be required to fulfil their potential as a provider of the next generation of science results. To date, computing systems have allowed either sophisticated analysis of small datasets, e.g., most astronomy software, or simple analysis of large datasets, e.g., database queries. At the Canadian Astronomy Data Centre, we have combined our cloud computing system, the Canadian Advanced Network for Astronomical Research (CANFAR), with the world's most advanced machine learning software, Skytree, to create the world's first cloud computing system for data mining in astronomy. This allows the full sophistication of the huge fields of data mining and machine learning to be applied to the hundreds of millions of objects that make up current large datasets. CANFAR works by utilizing virtual machines, which appear to the user as equivalent to a desktop. Each machine is replicated as desired to perform large-scale parallel processing. Such an arrangement carries far more flexibility than other cloud systems, because it enables the user to immediately install and run the same code that they already utilize for science on their desktop. We demonstrate the utility of the CANFAR + Skytree system by showing science results obtained, including assigning photometric redshifts with full probability density functions (PDFs) to a catalog of approximately 133 million galaxies from the MegaPipe reductions of the Canada-France-Hawaii Telescope Legacy Wide and Deep surveys. Each PDF is produced nonparametrically from 100 instances of the photometric parameters for each galaxy, generated by perturbing within the errors on the measurements. Hence, we produce, store, and assign redshifts to a catalog of over 13 billion object instances. This catalog is comparable in size to those expected from next-generation surveys, such as the Large Synoptic Survey Telescope. The CANFAR + Skytree system is open for use by any interested member of the astronomical community.
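    The PDF-by-perturbation idea can be sketched independently of the Skytree model actually used on CANFAR: perturb each galaxy's photometry within its quoted errors 100 times, run a point estimator on every instance, and treat the resulting 100 estimates as that galaxy's redshift PDF. The estimator, array sizes, and magnitudes below are hypothetical stand-ins.

```python
import numpy as np

# Sketch of photometric-redshift PDFs via Monte Carlo perturbation of photometry.
rng = np.random.default_rng(0)

def toy_photoz(mags):
    """Hypothetical point estimator standing in for the trained ML model."""
    return 0.05 * (mags[..., 0] - mags[..., -1]) + 0.3

n_galaxies, n_bands, n_instances = 1000, 5, 100
mags = rng.uniform(18, 25, size=(n_galaxies, n_bands))     # toy catalog magnitudes
errs = rng.uniform(0.02, 0.2, size=(n_galaxies, n_bands))  # toy magnitude errors

# Perturb within errors: shape (n_galaxies, n_instances, n_bands).
perturbed = mags[:, None, :] + errs[:, None, :] * rng.standard_normal(
    (n_galaxies, n_instances, n_bands))
z_samples = toy_photoz(perturbed)            # per-instance redshift estimates
z_median = np.median(z_samples, axis=1)      # one summary statistic of each PDF
print(z_samples.shape, z_median[:3])
```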

  4. Computer-generated, calligraphic, full-spectrum color system for visual simulation landing approach maneuvers

    NASA Technical Reports Server (NTRS)

    Chase, W. D.

    1975-01-01

    The calligraphic chromatic projector described was developed to improve the perceived realism of visual scene simulation ('out-the-window visuals'). The optical arrangement of the projector is illustrated and discussed. The device permits drawing 2000 vectors in as many as 500 colors, all above critical flicker frequencies, and use of high scene resolution and brightness at an acceptable level to the pilot, with the maximum system capabilities of 1000 lines and 1000 fL. The device for generating the colors is discussed, along with an experiment conducted to demonstrate potential improvements in performance and pilot opinion. Current research work and future research plans are noted.

  5. Light-Field Imaging Toolkit

    NASA Astrophysics Data System (ADS)

    Bolan, Jeffrey; Hall, Elise; Clifford, Chris; Thurow, Brian

    The Light-Field Imaging Toolkit (LFIT) is a collection of MATLAB functions designed to facilitate the rapid processing of raw light field images captured by a plenoptic camera. An included graphical user interface streamlines the necessary post-processing steps associated with plenoptic images. The generation of perspective shifted views and computationally refocused images is supported, in both single image and animated formats. LFIT performs necessary calibration, interpolation, and structuring steps to enable future applications of this technology.
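    To show the principle behind computational refocusing, and not LFIT's actual MATLAB implementation, the sketch below performs shift-and-add refocusing on a synthetic light field: each sub-aperture view is shifted in proportion to its (u, v) offset and a chosen focal parameter, then the views are averaged. Array sizes and the focal parameter are arbitrary.

```python
import numpy as np

# Shift-and-add refocusing of a synthetic 4D light field (u, v, y, x).
rng = np.random.default_rng(0)
n_u, n_v, h, w = 5, 5, 64, 64
light_field = rng.random((n_u, n_v, h, w))     # synthetic sub-aperture images

def refocus(lf, alpha):
    """Average all sub-aperture views after shifting each by alpha times its offset."""
    n_u, n_v, h, w = lf.shape
    out = np.zeros((h, w))
    for u in range(n_u):
        for v in range(n_v):
            du = int(round(alpha * (u - n_u // 2)))
            dv = int(round(alpha * (v - n_v // 2)))
            out += np.roll(lf[u, v], shift=(du, dv), axis=(0, 1))
    return out / (n_u * n_v)

refocused = refocus(light_field, alpha=1.5)
print(refocused.shape, refocused.mean())
```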

  6. Computer applications in diagnostic imaging.

    PubMed

    Horii, S C

    1991-03-01

    This article has introduced the nature, generation, use, and future of digital imaging. As digital technology has transformed other aspects of our lives--has the reader tried to buy a conventional record album recently? almost all music store stock is now compact disks--it is sure to continue to transform medicine as well. Whether that transformation will be to our liking as physicians or a source of frustration and disappointment is dependent on understanding the issues involved.

  7. Green Cloud on the Horizon

    NASA Astrophysics Data System (ADS)

    Ali, Mufajjul

    This paper proposes a Green Cloud model for mobile Cloud computing. The proposed model leverages the current trend of IaaS (Infrastructure as a Service), PaaS (Platform as a Service) and SaaS (Software as a Service), and looks at a new paradigm called "Network as a Service" (NaaS). The Green Cloud model proposes various Telco revenue-generating streams and services, together with CaaS (Cloud as a Service), for the near future.

  8. On the Relevancy of Efficient, Integrated Computer and Network Monitoring in HEP Distributed Online Environment

    NASA Astrophysics Data System (ADS)

    Carvalho, D.; Gavillet, Ph.; Delgado, V.; Albert, J. N.; Bellas, N.; Javello, J.; Miere, Y.; Ruffinoni, D.; Smith, G.

    Large scientific equipment is controlled by computer systems whose complexity is growing, driven on the one hand by the volume and variety of the information, its distributed nature, and the sophistication of its treatment, and on the other hand by the fast evolution of the computer and network market. Some people call these generically Large-Scale Distributed Data Intensive Information Systems, or Distributed Computer Control Systems (DCCS) for those systems dealing more with real-time control. Taking advantage of (or forced by) the distributed architecture, the tasks are more and more often implemented as Client-Server applications. In this framework the monitoring of the computer nodes, the communications network and the applications becomes of primary importance for ensuring the safe running and guaranteed performance of the system. With the future generation of HEP experiments, such as those at the LHC, in view, it is proposed to integrate the various functions of DCCS monitoring into one general-purpose multi-layer system.

  9. Installing computers in older adults' homes and teaching them to access a patient education web site: a systematic approach.

    PubMed

    Dauz, Emily; Moore, Jan; Smith, Carol E; Puno, Florence; Schaag, Helen

    2004-01-01

    This article describes the experiences of nurses who, as part of a large clinical trial, brought the Internet into older adults' homes by installing a computer, if needed, and connecting to a patient education Web site. Most of these patients had not previously used the Internet and were taught even basic computer skills when necessary. Because of increasing use of the Internet in patient education, assessment, and home monitoring, nurses in various roles currently connect with patients to monitor their progress, teach about medications, and answer questions about appointments and treatments. Thus, nurses find themselves playing the role of technology managers for patients with home-based Internet connections. This article provides step-by-step procedures for computer installation and training in the form of protocols, checklists, and patient user guides. By following these procedures, nurses can install computers, arrange Internet access, teach and connect to their patients, and prepare themselves to install future generations of technological devices.

  10. TestDose: A nuclear medicine software based on Monte Carlo modeling for generating gamma camera acquisitions and dosimetry

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Garcia, Marie-Paule, E-mail: marie-paule.garcia@univ-brest.fr; Villoing, Daphnée; McKay, Erin

    Purpose: The TestDose platform was developed to generate scintigraphic imaging protocols and associated dosimetry by Monte Carlo modeling. TestDose is part of a broader project (www.dositest.com) whose aim is to identify the biases induced by different clinical dosimetry protocols. Methods: The TestDose software allows handling the whole pipeline from virtual patient generation to resulting planar and SPECT images and dosimetry calculations. The originality of their approach relies on the implementation of functional segmentation for the anthropomorphic model representing a virtual patient. Two anthropomorphic models are currently available: 4D XCAT and ICRP 110. A pharmacokinetic model describes the biodistribution of a given radiopharmaceutical in each defined compartment at various time-points. The Monte Carlo simulation toolkit GATE offers the possibility to accurately simulate scintigraphic images and absorbed doses in volumes of interest. The TestDose platform relies on GATE to reproduce precisely any imaging protocol and to provide reference dosimetry. For image generation, TestDose stores user’s imaging requirements and generates automatically command files used as input for GATE. Each compartment is simulated only once and the resulting output is weighted using pharmacokinetic data. Resulting compartment projections are aggregated to obtain the final image. For dosimetry computation, emission data are stored in the platform database and relevant GATE input files are generated for the virtual patient model and associated pharmacokinetics. Results: Two samples of software runs are given to demonstrate the potential of TestDose. A clinical imaging protocol for the Octreoscan™ therapeutical treatment was implemented using the 4D XCAT model. Whole-body “step and shoot” acquisitions at different times postinjection and one SPECT acquisition were generated within reasonable computation times. Based on the same Octreoscan™ kinetics, a dosimetry computation performed on the ICRP 110 model is also presented. Conclusions: The proposed platform offers a generic framework to implement any scintigraphic imaging protocols and voxel/organ-based dosimetry computation. Thanks to the modular nature of TestDose, other imaging modalities could be supported in the future such as positron emission tomography.

  11. TestDose: A nuclear medicine software based on Monte Carlo modeling for generating gamma camera acquisitions and dosimetry.

    PubMed

    Garcia, Marie-Paule; Villoing, Daphnée; McKay, Erin; Ferrer, Ludovic; Cremonesi, Marta; Botta, Francesca; Ferrari, Mahila; Bardiès, Manuel

    2015-12-01

    The TestDose platform was developed to generate scintigraphic imaging protocols and associated dosimetry by Monte Carlo modeling. TestDose is part of a broader project (www.dositest.com) whose aim is to identify the biases induced by different clinical dosimetry protocols. The TestDose software allows handling the whole pipeline from virtual patient generation to resulting planar and SPECT images and dosimetry calculations. The originality of their approach relies on the implementation of functional segmentation for the anthropomorphic model representing a virtual patient. Two anthropomorphic models are currently available: 4D XCAT and ICRP 110. A pharmacokinetic model describes the biodistribution of a given radiopharmaceutical in each defined compartment at various time-points. The Monte Carlo simulation toolkit gate offers the possibility to accurately simulate scintigraphic images and absorbed doses in volumes of interest. The TestDose platform relies on gate to reproduce precisely any imaging protocol and to provide reference dosimetry. For image generation, TestDose stores user's imaging requirements and generates automatically command files used as input for gate. Each compartment is simulated only once and the resulting output is weighted using pharmacokinetic data. Resulting compartment projections are aggregated to obtain the final image. For dosimetry computation, emission data are stored in the platform database and relevant gate input files are generated for the virtual patient model and associated pharmacokinetics. Two samples of software runs are given to demonstrate the potential of TestDose. A clinical imaging protocol for the Octreoscan™ therapeutical treatment was implemented using the 4D XCAT model. Whole-body "step and shoot" acquisitions at different times postinjection and one SPECT acquisition were generated within reasonable computation times. Based on the same Octreoscan™ kinetics, a dosimetry computation performed on the ICRP 110 model is also presented. The proposed platform offers a generic framework to implement any scintigraphic imaging protocols and voxel/organ-based dosimetry computation. Thanks to the modular nature of TestDose, other imaging modalities could be supported in the future such as positron emission tomography.

  12. A Climate Statistics Tool and Data Repository

    NASA Astrophysics Data System (ADS)

    Wang, J.; Kotamarthi, V. R.; Kuiper, J. A.; Orr, A.

    2017-12-01

    Researchers at Argonne National Laboratory and collaborating organizations have generated regional scale, dynamically downscaled climate model output using Weather Research and Forecasting (WRF) version 3.3.1 at a 12km horizontal spatial resolution over much of North America. The WRF model is driven by boundary conditions obtained from three independent global scale climate models and two different future greenhouse gas emission scenarios, named representative concentration pathways (RCPs). The repository of results has a temporal resolution of three hours for all the simulations, includes more than 50 variables, is stored in Network Common Data Form (NetCDF) files, and the data volume is nearly 600Tb. A condensed 800Gb set of NetCDF files was made for selected variables most useful for climate-related planning, including daily precipitation, relative humidity, solar radiation, maximum temperature, minimum temperature, and wind. The WRF model simulations are conducted for three 10-year time periods (1995-2004, 2045-2054, and 2085-2094) and two future scenarios (RCP4.5 and RCP8.5). An open-source tool was coded using Python 2.7.8 and ESRI ArcGIS 10.3.1 programming libraries to parse the NetCDF files, compute summary statistics, and output results as GIS layers. Eight sets of summary statistics were generated as examples for the contiguous U.S. states and much of Alaska, including number of days over 90°F, number of days with a heat index over 90°F, heat waves, monthly and annual precipitation, drought, extreme precipitation, multi-model averages, and model bias. This paper will provide an overview of the project to generate the main and condensed data repositories, describe the Python tool and how to use it, present the GIS results of the computed examples, and discuss some of the ways they can be used for planning. The condensed climate data, Python tool, computed GIS results, and documentation of the work are shared on the Internet.
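    One of the summary statistics listed above (days per year with a maximum temperature over 90°F) can be sketched with xarray reading a NetCDF file of daily maxima. This is a hedged illustration only: the file name "tmax_daily.nc", the variable name "TMAX", and the Kelvin units are hypothetical stand-ins for the condensed WRF output, and the original tool used Python 2.7.8 with ArcGIS libraries rather than xarray.

```python
import xarray as xr

# Count days per year above 90 F from a NetCDF file of daily maximum temperature.
ds = xr.open_dataset("tmax_daily.nc")                    # hypothetical condensed file
tmax_f = (ds["TMAX"] - 273.15) * 9.0 / 5.0 + 32.0        # Kelvin -> Fahrenheit

hot_days = (tmax_f > 90.0).groupby("time.year").sum("time")   # days > 90 F per year
hot_days.mean("year").to_netcdf("days_over_90F.nc")           # grid of annual means
```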

  13. Logistical Consideration in Computer-Based Screening of Astronaut Applicants

    NASA Technical Reports Server (NTRS)

    Galarza, Laura

    2000-01-01

    This presentation reviews the logistical, ergonomic, and psychometric issues and data related to the development and operational use of a computer-based system for the psychological screening of astronaut applicants. The Behavioral Health and Performance Group (BHPG) at the Johnson Space Center upgraded its astronaut psychological screening and selection procedures for the 1999 astronaut applicants and subsequent astronaut selection cycles. The questionnaires, tests, and inventories were upgraded from a paper-and-pencil system to a computer-based system. Members of the BHPG and a computer programmer designed and developed the needed interfaces (screens, buttons, etc.) and programs for the astronaut psychological assessment system. This intranet-based system included user-friendly computer-based administration of tests, test scoring, generation of reports, integration of test administration and test output into a single system, and a complete database for past, present, and future selection data. Upon completion of the system development phase, four beta and usability tests were conducted with the newly developed system. The first three tests included 1 to 3 participants each. The final system test was conducted with 23 participants tested simultaneously. Usability and ergonomic data were collected from the system (beta) test participants and from 1999 astronaut applicants who volunteered the information in exchange for anonymity. Beta and usability test data were analyzed to examine operational, ergonomic, programming, test administration, and scoring issues related to computer-based testing. Results showed a preference for computer-based testing over paper-and-pencil procedures. The data also reflected specific ergonomic, usability, psychometric, and logistical concerns that should be taken into account in future selection cycles. In conclusion, psychological, psychometric, human, and logistical factors must be examined and considered carefully when developing and using a computer-based system for psychological screening and selection.

  14. Monitoring of computing resource use of active software releases at ATLAS

    NASA Astrophysics Data System (ADS)

    Limosani, Antonio; ATLAS Collaboration

    2017-10-01

    The LHC is the world's most powerful particle accelerator, colliding protons at a centre-of-mass energy of 13 TeV. As the energy and frequency of collisions have grown in the search for new physics, so too has the demand for the computing resources needed for event reconstruction. We report on the evolution of resource usage, in terms of CPU and RAM, in key ATLAS offline reconstruction workflows at the Tier-0 at CERN and on the WLCG. Monitoring of workflows is achieved using the ATLAS PerfMon package, the standard ATLAS performance monitoring system running inside Athena jobs. Systematic daily monitoring has recently been expanded beyond event reconstruction to include all workflows, from Monte Carlo generation through to end-user physics analysis. Moreover, the move to a multiprocess mode in production jobs has facilitated the use of tools, such as "MemoryMonitor", to measure the memory shared across processes in jobs. Resource consumption is broken down into software domains and displayed in plots generated using Python visualization libraries and collected into pre-formatted, auto-generated Web pages, which allow the ATLAS developer community to track the performance of their algorithms. This information is, however, preferentially directed to domain leaders and developers through the use of JIRA and via reports given at ATLAS software meetings. Finally, we take a glimpse of the future by reporting on the expected CPU and RAM usage in benchmark workflows associated with the High-Luminosity LHC and anticipate the ways performance monitoring will evolve to understand and benchmark future workflows.
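
    The per-domain reporting step can be pictured with a small plotting sketch. The figures and domain names below are made-up stand-ins for what PerfMon/MemoryMonitor would report, and the plot only illustrates the kind of summary collected into the auto-generated Web pages.

      # Illustrative only: bar chart of CPU time per software domain.
      import matplotlib
      matplotlib.use("Agg")                      # write to file, no display needed
      import matplotlib.pyplot as plt

      cpu_seconds = {"Tracking": 512.0, "Calorimetry": 203.0,   # assumed figures
                     "MuonReco": 118.0, "EventIO": 44.0}

      domains = list(cpu_seconds)
      plt.bar(domains, [cpu_seconds[d] for d in domains])
      plt.ylabel("CPU time per event batch [s]")
      plt.title("Reconstruction cost by software domain (illustrative)")
      plt.savefig("domain_cpu.png", dpi=150)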

  15. The Bright, Artificial Intelligence-Augmented Future of Neuroimaging Reading.

    PubMed

    Hainc, Nicolin; Federau, Christian; Stieltjes, Bram; Blatow, Maria; Bink, Andrea; Stippich, Christoph

    2017-01-01

    Radiologists are among the first physicians to be directly affected by advances in computer technology. Computers are already capable of analyzing medical imaging data, and with decades worth of digital information available for training, will an artificial intelligence (AI) one day signal the end of the human radiologist? With the ever increasing work load combined with the looming doctor shortage, radiologists will be pushed far beyond their current estimated 3 s allotted time-of-analysis per image; an AI with super-human capabilities might seem like a logical replacement. We feel, however, that AI will lead to an augmentation rather than a replacement of the radiologist. The AI will be relied upon to handle the tedious, time-consuming tasks of detecting and segmenting outliers while possibly generating new, unanticipated results that can then be used as sources of medical discovery. This will affect not only radiologists but all physicians and also researchers dealing with medical imaging. Therefore, we must embrace future technology and collaborate interdisciplinary to spearhead the next revolution in medicine.

  16. Applications of Deep Learning and Reinforcement Learning to Biological Data.

    PubMed

    Mahmud, Mufti; Kaiser, Mohammed Shamim; Hussain, Amir; Vassanelli, Stefano

    2018-06-01

    Rapid advances in hardware-based technologies during the past decades have opened up new possibilities for life scientists to gather multimodal data in various application domains, such as omics, bioimaging, medical imaging, and (brain/body)-machine interfaces. These have generated novel opportunities for development of dedicated data-intensive machine learning techniques. In particular, recent research in deep learning (DL), reinforcement learning (RL), and their combination (deep RL) promise to revolutionize the future of artificial intelligence. The growth in computational power accompanied by faster and increased data storage, and declining computing costs have already allowed scientists in various fields to apply these techniques on data sets that were previously intractable owing to their size and complexity. This paper provides a comprehensive survey on the application of DL, RL, and deep RL techniques in mining biological data. In addition, we compare the performances of DL techniques when applied to different data sets across various application domains. Finally, we outline open issues in this challenging research area and discuss future development perspectives.

  17. Initial Progress Toward Development of a Voice-Based Computer-Delivered Motivational Intervention for Heavy Drinking College Students: An Experimental Study

    PubMed Central

    Lechner, William J; MacGlashan, James; Wray, Tyler B; Littman, Michael L

    2017-01-01

    Background Computer-delivered interventions have been shown to be effective in reducing alcohol consumption in heavy drinking college students. However, these computer-delivered interventions rely on mouse, keyboard, or touchscreen responses for interactions between the users and the computer-delivered intervention. The principles of motivational interviewing suggest that in-person interventions may be effective, in part, because they encourage individuals to think through and speak aloud their motivations for changing a health behavior, which current computer-delivered interventions do not allow. Objective The objective of this study was to take the initial steps toward development of a voice-based computer-delivered intervention that can ask open-ended questions and respond appropriately to users’ verbal responses, more closely mirroring a human-delivered motivational intervention. Methods We developed (1) a voice-based computer-delivered intervention that was run by a human controller and that allowed participants to speak their responses to scripted prompts delivered by speech generation software and (2) a text-based computer-delivered intervention that relied on the mouse, keyboard, and computer screen for all interactions. We randomized 60 heavy drinking college students to interact with the voice-based computer-delivered intervention and 30 to interact with the text-based computer-delivered intervention and compared their ratings of the systems as well as their motivation to change drinking and their drinking behavior at 1-month follow-up. Results Participants reported that the voice-based computer-delivered intervention engaged positively with them in the session and delivered content in a manner consistent with motivational interviewing principles. At 1-month follow-up, participants in the voice-based computer-delivered intervention condition reported significant decreases in quantity, frequency, and problems associated with drinking, and increased perceived importance of changing drinking behaviors. In comparison to the text-based computer-delivered intervention condition, those assigned to voice-based computer-delivered intervention reported significantly fewer alcohol-related problems at the 1-month follow-up (incident rate ratio 0.60, 95% CI 0.44-0.83, P=.002). The conditions did not differ significantly on perceived importance of changing drinking or on measures of drinking quantity and frequency of heavy drinking. Conclusions Results indicate that it is feasible to construct a series of open-ended questions and a bank of responses and follow-up prompts that can be used in a future fully automated voice-based computer-delivered intervention that may mirror more closely human-delivered motivational interventions to reduce drinking. Such efforts will require using advanced speech recognition capabilities and machine-learning approaches to train a program to mirror the decisions made by human controllers in the voice-based computer-delivered intervention used in this study. In addition, future studies should examine enhancements that can increase the perceived warmth and empathy of voice-based computer-delivered intervention, possibly through greater personalization, improvements in the speech generation software, and embodying the computer-delivered intervention in a physical form. PMID:28659259

  18. 48 CFR 53.105 - Computer generation.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 48 Federal Acquisition Regulations System 2 2010-10-01 2010-10-01 false Computer generation. 53...) CLAUSES AND FORMS FORMS General 53.105 Computer generation. (a) Agencies may computer-generate the... be computer generated by the public. Unless prohibited by agency regulations, forms prescribed by...

  19. 48 CFR 53.105 - Computer generation.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 48 Federal Acquisition Regulations System 2 2011-10-01 2011-10-01 false Computer generation. 53...) CLAUSES AND FORMS FORMS General 53.105 Computer generation. (a) Agencies may computer-generate the... be computer generated by the public. Unless prohibited by agency regulations, forms prescribed by...

  20. Computer-Generated Feedback on Student Writing

    ERIC Educational Resources Information Center

    Ware, Paige

    2011-01-01

    A distinction must be made between "computer-generated scoring" and "computer-generated feedback". Computer-generated scoring refers to the provision of automated scores derived from mathematical models built on organizational, syntactic, and mechanical aspects of writing. In contrast, computer-generated feedback, the focus of this article, refers…

  1. Making extreme computations possible with virtual machines

    NASA Astrophysics Data System (ADS)

    Reuter, J.; Chokoufe Nejad, B.; Ohl, T.

    2016-10-01

    State-of-the-art algorithms generate scattering amplitudes for high-energy physics at leading order for high-multiplicity processes as compiled code (in Fortran, C or C++). For complicated processes the size of these libraries can become tremendous (many GiB). We show that amplitudes can be translated to byte-code instructions, which even reduce the size by one order of magnitude. The byte-code is interpreted by a Virtual Machine with runtimes comparable to compiled code and a better scaling with additional legs. We study the properties of this algorithm, as an extension of the Optimizing Matrix Element Generator (O'Mega). The bytecode matrix elements are available as alternative input for the event generator WHIZARD. The bytecode interpreter can be implemented very compactly, which will help with a future implementation on massively parallel GPUs.
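
    The general idea of replacing large compiled expression code by compact interpreted instructions can be shown with a toy stack-based interpreter. This does not reproduce O'Mega's instruction set or the WHIZARD interface; it only illustrates the byte-code-plus-virtual-machine pattern.

      # Toy stack-based byte-code interpreter (illustrative only).
      PUSH, ADD, MUL = 0, 1, 2

      def run(bytecode, constants):
          stack = []
          for op, arg in bytecode:
              if op == PUSH:
                  stack.append(constants[arg])
              elif op == ADD:
                  b, a = stack.pop(), stack.pop()
                  stack.append(a + b)
              elif op == MUL:
                  b, a = stack.pop(), stack.pop()
                  stack.append(a * b)
          return stack.pop()

      # (c0 + c1) * c2 with constants [2.0, 3.0, 4.0] evaluates to 20.0
      program = [(PUSH, 0), (PUSH, 1), (ADD, None), (PUSH, 2), (MUL, None)]
      print(run(program, [2.0, 3.0, 4.0]))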

  2. Computational modeling of epidermal cell fate determination systems.

    PubMed

    Ryu, Kook Hui; Zheng, Xiaohua; Huang, Ling; Schiefelbein, John

    2013-02-01

    Cell fate decisions are of primary importance for plant development. Their simple 'either-or' outcome and dynamic nature has attracted the attention of computational modelers. Recent efforts have focused on modeling the determination of several epidermal cell types in the root and shoot of Arabidopsis where many molecular components have been defined. Results of integrated modeling and molecular biology experimentation in these systems have highlighted the importance of competitive positive and negative factors and interconnected feedback loops in generating flexible yet robust mechanisms for establishing distinct gene expression programs in neighboring cells. These models have proven useful in judging hypotheses and guiding future research. Copyright © 2012 Elsevier Ltd. All rights reserved.

  3. Event parallelism: Distributed memory parallel computing for high energy physics experiments

    NASA Astrophysics Data System (ADS)

    Nash, Thomas

    1989-12-01

    This paper describes the present and expected future development of distributed-memory parallel computers for high-energy physics experiments. It covers the use of event-parallel microprocessor farms, particularly at Fermilab, including both ACP multiprocessors and farms of MicroVAXes. These systems have proven very cost-effective in the past. A case is made for moving to the more open environment of UNIX and RISC processors. The 2nd Generation ACP Multiprocessor System, which is based on powerful RISC systems, is described. Given the promise of still more extraordinary increases in processor performance, a new emphasis on point-to-point, rather than bussed, communication will be required. Developments in this direction are described.
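
    The essence of event parallelism is that events are statistically independent and can be processed in any order, so a farm simply distributes them across nodes. The sketch below uses a local process pool as a stand-in for the farm nodes described above; the "reconstruction" is a placeholder.

      # Schematic event-parallel processing with a worker pool.
      from multiprocessing import Pool

      def reconstruct(event):
          # placeholder "reconstruction": sum of simulated hit energies
          return event["id"], sum(event["hits"])

      if __name__ == "__main__":
          events = [{"id": i, "hits": [0.1 * i, 1.0, 2.5]} for i in range(1000)]
          with Pool(processes=4) as pool:
              results = pool.map(reconstruct, events)
          print(results[:3])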

  4. Simulator for multilevel optimization research

    NASA Technical Reports Server (NTRS)

    Padula, S. L.; Young, K. C.

    1986-01-01

    A computer program designed to simulate and improve multilevel optimization techniques is described. By using simple analytic functions to represent complex engineering analyses, the simulator can generate and test a large variety of multilevel decomposition strategies in a relatively short time. This type of research is an essential step toward routine optimization of large aerospace systems. The paper discusses the types of optimization problems handled by the simulator and gives input and output listings and plots for a sample problem. It also describes multilevel implementation techniques which have value beyond the present computer program. Thus, this document serves as a user's manual for the simulator and as a guide for building future multilevel optimization applications.

  5. Applying Parallel Adaptive Methods with GeoFEST/PYRAMID to Simulate Earth Surface Crustal Dynamics

    NASA Technical Reports Server (NTRS)

    Norton, Charles D.; Lyzenga, Greg; Parker, Jay; Glasscoe, Margaret; Donnellan, Andrea; Li, Peggy

    2006-01-01

    This viewgraph presentation reviews the use of Adaptive Mesh Refinement (AMR) in simulating the crustal dynamics of the Earth's surface. AMR simultaneously improves solution quality, time to solution, and computer memory requirements when compared to generating and running on a globally fine mesh. The use of AMR in simulating the dynamics of the Earth's surface is spurred by proposed future NASA missions, such as InSAR, for Earth surface deformation and other measurements. These missions will require support for large-scale adaptive numerical methods using AMR to model observations. AMR was chosen because it has been successful in computational fluid dynamics for predictive simulation of complex flows around complex structures.

  6. Quantifying the Beauty of Words: A Neurocognitive Poetics Perspective

    PubMed Central

    Jacobs, Arthur M.

    2017-01-01

    In this paper I would like to pave the ground for future studies in Computational Stylistics and (Neuro-)Cognitive Poetics by describing procedures for predicting the subjective beauty of words. A set of eight tentative word features is computed via Quantitative Narrative Analysis (QNA) and a novel metric for quantifying word beauty, the aesthetic potential is proposed. Application of machine learning algorithms fed with this QNA data shows that a classifier of the decision tree family excellently learns to split words into beautiful vs. ugly ones. The results shed light on surface and semantic features theoretically relevant for affective-aesthetic processes in literary reading and generate quantitative predictions for neuroaesthetic studies of verbal materials. PMID:29311877

  7. Quantifying the Beauty of Words: A Neurocognitive Poetics Perspective.

    PubMed

    Jacobs, Arthur M

    2017-01-01

    In this paper I would like to pave the ground for future studies in Computational Stylistics and (Neuro-)Cognitive Poetics by describing procedures for predicting the subjective beauty of words. A set of eight tentative word features is computed via Quantitative Narrative Analysis (QNA) and a novel metric for quantifying word beauty, the aesthetic potential is proposed. Application of machine learning algorithms fed with this QNA data shows that a classifier of the decision tree family excellently learns to split words into beautiful vs. ugly ones. The results shed light on surface and semantic features theoretically relevant for affective-aesthetic processes in literary reading and generate quantitative predictions for neuroaesthetic studies of verbal materials.
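
    The classification step reported in the two records above can be sketched generically: a decision tree is trained and cross-validated on numeric word features. The features and "beauty" labels below are synthetic stand-ins, not the QNA data used in the paper.

      # Sketch of the classification step only (synthetic data).
      import numpy as np
      from sklearn.model_selection import cross_val_score
      from sklearn.tree import DecisionTreeClassifier

      rng = np.random.default_rng(0)
      X = rng.normal(size=(200, 8))                     # eight word features per item
      y = (X[:, 0] + 0.5 * X[:, 3] > 0).astype(int)     # toy "beautiful vs. ugly" label

      clf = DecisionTreeClassifier(max_depth=4, random_state=0)
      print(cross_val_score(clf, X, y, cv=5).mean())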

  8. A Bright Future for Evolutionary Methods in Drug Design.

    PubMed

    Le, Tu C; Winkler, David A

    2015-08-01

    Most medicinal chemists understand that chemical space is extremely large, essentially infinite. Although high-throughput experimental methods allow exploration of drug-like space more rapidly, they are still insufficient to fully exploit the opportunities that such large chemical space offers. Evolutionary methods can synergistically blend automated synthesis and characterization methods with computational design to identify promising regions of chemical space more efficiently. We describe how evolutionary methods are implemented, and provide examples of published drug development research in which these methods have generated molecules with increased efficacy. We anticipate that evolutionary methods will play an important role in future drug discovery. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
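
    The evolutionary loop itself is generic, even though real systems decode genomes into candidate molecules and score them by assay or docking. The minimal genetic algorithm below shows only the mechanics (selection, crossover, mutation) with bit-string genomes and a toy fitness function standing in for a molecular objective.

      # Minimal genetic-algorithm loop (toy fitness stands in for an assay/docking score).
      import random
      random.seed(1)

      GENOME_LEN, POP, GENERATIONS = 32, 40, 25

      def fitness(g):                       # toy objective: count of set bits
          return sum(g)

      def mutate(g, rate=0.02):
          return [b ^ (random.random() < rate) for b in g]

      def crossover(a, b):
          cut = random.randrange(1, GENOME_LEN)
          return a[:cut] + b[cut:]

      pop = [[random.randint(0, 1) for _ in range(GENOME_LEN)] for _ in range(POP)]
      for _ in range(GENERATIONS):
          pop.sort(key=fitness, reverse=True)
          parents = pop[:POP // 2]
          children = [mutate(crossover(random.choice(parents), random.choice(parents)))
                      for _ in range(POP - len(parents))]
          pop = parents + children

      print(fitness(max(pop, key=fitness)))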

  9. On recent advances and future research directions for computational fluid dynamics

    NASA Technical Reports Server (NTRS)

    Baker, A. J.; Soliman, M. O.; Manhardt, P. D.

    1986-01-01

    This paper highlights some recent accomplishments regarding CFD numerical algorithm constructions for generation of discrete approximate solutions to classes of Reynolds-averaged Navier-Stokes equations. Following an overview of turbulent closure modeling, and development of appropriate conservation law systems, a Taylor weak-statement semi-discrete approximate solution algorithm is developed. Various forms for completion to the final linear algebra statement are cited, as are a range of candidate numerical linear algebra solution procedures. This development sequence emphasizes the key building blocks of a CFD RNS algorithm, including solution trial and test spaces, integration procedure and added numerical stability mechanisms. A range of numerical results are discussed focusing on key topics guiding future research directions.

  10. Natural history's hypothetical moments: narratives of contingency in Victorian culture.

    PubMed

    Choi, Tina Young

    2009-01-01

    This essay focuses on the ways in which works by Robert Chambers, Charles Darwin, and George Eliot encouraged readers to imagine the future as contingent. But where Chambers alludes to Charles Babbage's computational engine and the period's life insurance industry to hint at the role of contingency in natural history, Darwin insists on the importance of contingently determined outcomes to speciation. The "Origin" consistently exercises the reader's speculative energies by generating conditional statements, causal hypotheses, and diverging alternatives. "Adam Bede" constitutes its characters' interior lives around the proliferation of such contingent narratives. To reflect on the future or on the past, these works suggest, demands a temporal, moral, and narrative complexity in one's thinking.

  11. Generation of referring expressions: assessing the Incremental Algorithm.

    PubMed

    van Deemter, Kees; Gatt, Albert; van der Sluis, Ielka; Power, Richard

    2012-07-01

    A substantial amount of recent work in natural language generation has focused on the generation of ''one-shot'' referring expressions whose only aim is to identify a target referent. Dale and Reiter's Incremental Algorithm (IA) is often thought to be the best algorithm for maximizing the similarity to referring expressions produced by people. We test this hypothesis by eliciting referring expressions from human subjects and computing the similarity between the expressions elicited and the ones generated by algorithms. It turns out that the success of the IA depends substantially on the ''preference order'' (PO) employed by the IA, particularly in complex domains. While some POs cause the IA to produce referring expressions that are very similar to expressions produced by human subjects, others cause the IA to perform worse than its main competitors; moreover, it turns out to be difficult to predict the success of a PO on the basis of existing psycholinguistic findings or frequencies in corpora. We also examine the computational complexity of the algorithms in question and argue that there are no compelling reasons for preferring the IA over some of its main competitors on these grounds. We conclude that future research on the generation of referring expressions should explore alternatives to the IA, focusing on algorithms, inspired by the Greedy Algorithm, which do not work with a fixed PO. Copyright © 2011 Cognitive Science Society, Inc.
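
    For readers unfamiliar with the IA, a simplified rendering captures its core behavior: walk a fixed preference order and keep any attribute that rules out remaining distractors, stopping once the target is distinguished. Standard refinements (e.g. always including the type attribute) are omitted in this sketch.

      # Simplified Dale & Reiter Incremental Algorithm.
      def incremental_algorithm(target, distractors, preference_order):
          description = {}
          remaining = list(distractors)
          for attr in preference_order:
              value = target[attr]
              ruled_out = [d for d in remaining if d.get(attr) != value]
              if ruled_out:
                  description[attr] = value
                  remaining = [d for d in remaining if d.get(attr) == value]
              if not remaining:
                  break
          return description                     # may overspecify; never backtracks

      target = {"type": "chair", "colour": "red", "size": "large"}
      others = [{"type": "chair", "colour": "blue", "size": "large"},
                {"type": "table", "colour": "red", "size": "large"}]
      print(incremental_algorithm(target, others, ["type", "colour", "size"]))
      # -> {'type': 'chair', 'colour': 'red'}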

  12. A novel in silico approach to drug discovery via computational intelligence.

    PubMed

    Hecht, David; Fogel, Gary B

    2009-04-01

    A computational intelligence drug discovery platform is introduced as an innovative technology designed to accelerate high-throughput drug screening for generalized protein-targeted drug discovery. This technology results in collections of novel small molecule compounds that bind to protein targets as well as details on predicted binding modes and molecular interactions. The approach was tested on dihydrofolate reductase (DHFR) for novel antimalarial drug discovery; however, the methods developed can be applied broadly in early stage drug discovery and development. For this purpose, an initial fragment library was defined, and an automated fragment assembly algorithm was generated. These were combined with a computational intelligence screening tool for prescreening of compounds relative to DHFR inhibition. The entire method was assayed relative to spaces of known DHFR inhibitors and with chemical feasibility in mind, leading to experimental validation in future studies.

  13. Computational protein design-the next generation tool to expand synthetic biology applications.

    PubMed

    Gainza-Cirauqui, Pablo; Correia, Bruno Emanuel

    2018-05-02

    One powerful approach to engineer synthetic biology pathways is the assembly of proteins sourced from one or more natural organisms. However, synthetic pathways often require custom functions or biophysical properties not displayed by natural proteins, limitations that could be overcome through modern protein engineering techniques. Structure-based computational protein design is a powerful tool to engineer new functional capabilities in proteins, and it is beginning to have a profound impact in synthetic biology. Here, we review efforts to increase the capabilities of synthetic biology using computational protein design. We focus primarily on computationally designed proteins not only validated in vitro, but also shown to modulate different activities in living cells. Efforts made to validate computational designs in cells can illustrate both the challenges and opportunities in the intersection of protein design and synthetic biology. We also highlight protein design approaches, which although not validated as conveyors of new cellular function in situ, may have rapid and innovative applications in synthetic biology. We foresee that in the near-future, computational protein design will vastly expand the functional capabilities of synthetic cells. Copyright © 2018. Published by Elsevier Ltd.

  14. Progress in computational toxicology.

    PubMed

    Ekins, Sean

    2014-01-01

    Computational methods have been widely applied to toxicology across pharmaceutical, consumer product and environmental fields over the past decade. Progress in computational toxicology is now reviewed. A literature review was performed on computational models for hepatotoxicity (e.g. for drug-induced liver injury (DILI)), cardiotoxicity, renal toxicity and genotoxicity. In addition various publications have been highlighted that use machine learning methods. Several computational toxicology model datasets from past publications were used to compare Bayesian and Support Vector Machine (SVM) learning methods. The increasing amounts of data for defined toxicology endpoints have enabled machine learning models that have been increasingly used for predictions. It is shown that across many different models Bayesian and SVM perform similarly based on cross validation data. Considerable progress has been made in computational toxicology in a decade in both model development and availability of larger scale or 'big data' models. The future efforts in toxicology data generation will likely provide us with hundreds of thousands of compounds that are readily accessible for machine learning models. These models will cover relevant chemistry space for pharmaceutical, consumer product and environmental applications. Copyright © 2013 Elsevier Inc. All rights reserved.
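
    The model comparison reported above follows a standard pattern: cross-validate a Bayesian and an SVM classifier on the same descriptor matrix and compare mean scores. The sketch below reproduces only that pattern on synthetic data, not the toxicology endpoints or descriptors from the reviewed datasets.

      # Cross-validated comparison of a Bayesian and an SVM classifier (synthetic data).
      from sklearn.datasets import make_classification
      from sklearn.model_selection import cross_val_score
      from sklearn.naive_bayes import GaussianNB
      from sklearn.svm import SVC

      X, y = make_classification(n_samples=500, n_features=50, random_state=0)
      for name, model in [("Bayesian (naive Bayes)", GaussianNB()),
                          ("SVM (RBF kernel)", SVC(kernel="rbf", C=1.0))]:
          scores = cross_val_score(model, X, y, cv=5)
          print(f"{name}: mean CV accuracy = {scores.mean():.3f}")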

  15. A Framework for Parallel Unstructured Grid Generation for Complex Aerodynamic Simulations

    NASA Technical Reports Server (NTRS)

    Zagaris, George; Pirzadeh, Shahyar Z.; Chrisochoides, Nikos

    2009-01-01

    A framework for parallel unstructured grid generation targeting both shared memory multi-processors and distributed memory architectures is presented. The two fundamental building-blocks of the framework consist of: (1) the Advancing-Partition (AP) method used for domain decomposition and (2) the Advancing Front (AF) method used for mesh generation. Starting from the surface mesh of the computational domain, the AP method is applied recursively to generate a set of sub-domains. Next, the sub-domains are meshed in parallel using the AF method. The recursive nature of domain decomposition naturally maps to a divide-and-conquer algorithm which exhibits inherent parallelism. For the parallel implementation, the Master/Worker pattern is employed to dynamically balance the varying workloads of each task on the set of available CPUs. Performance results by this approach are presented and discussed in detail as well as future work and improvements.
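
    The divide-and-conquer structure of the framework can be outlined schematically: partition recursively until sub-domains are small enough, then mesh them with a pool of workers (the Master/Worker pattern). The geometric Advancing-Partition and Advancing-Front methods themselves are not reproduced; the meshing call below is a placeholder.

      # Schematic outline of recursive decomposition plus Master/Worker meshing.
      from concurrent.futures import ProcessPoolExecutor

      def partition(domain, max_size):
          """Recursively bisect until each sub-domain is small enough."""
          if domain["size"] <= max_size:
              return [domain]
          half = domain["size"] / 2.0
          left = {"name": domain["name"] + "L", "size": half}
          right = {"name": domain["name"] + "R", "size": half}
          return partition(left, max_size) + partition(right, max_size)

      def mesh_subdomain(sub):
          # stand-in for advancing-front meshing of one sub-domain
          return sub["name"], int(sub["size"] * 1000)   # pretend element count

      if __name__ == "__main__":
          subdomains = partition({"name": "D", "size": 16.0}, max_size=2.0)
          with ProcessPoolExecutor(max_workers=4) as pool:
              meshes = list(pool.map(mesh_subdomain, subdomains))
          print(len(subdomains), "sub-domains meshed:", meshes[:3])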

  16. A characterization of workflow management systems for extreme-scale applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ferreira da Silva, Rafael; Filgueira, Rosa; Pietri, Ilia

    The automation of the execution of computational tasks is at the heart of improving scientific productivity. Over the last years, scientific workflows have been established as an important abstraction that captures data processing and computation of large and complex scientific applications. By allowing scientists to model and express entire data processing steps and their dependencies, workflow management systems relieve scientists from the details of an application and manage its execution on a computational infrastructure. As the resource requirements of today's computational and data science applications that process vast amounts of data keep increasing, there is a compelling case for a new generation of advances in high-performance computing, commonly termed extreme-scale computing, which will bring forth multiple challenges for the design of workflow applications and management systems. This paper presents a novel characterization of workflow management systems using features commonly associated with extreme-scale computing applications. We classify 15 popular workflow management systems in terms of workflow execution models, heterogeneous computing environments, and data access methods. Finally, the paper also surveys workflow applications and identifies gaps for future research on the road to extreme-scale workflows and management systems.

  17. A characterization of workflow management systems for extreme-scale applications

    DOE PAGES

    Ferreira da Silva, Rafael; Filgueira, Rosa; Pietri, Ilia; ...

    2017-02-16

    The automation of the execution of computational tasks is at the heart of improving scientific productivity. Over the last years, scientific workflows have been established as an important abstraction that captures data processing and computation of large and complex scientific applications. By allowing scientists to model and express entire data processing steps and their dependencies, workflow management systems relieve scientists from the details of an application and manage its execution on a computational infrastructure. As the resource requirements of today's computational and data science applications that process vast amounts of data keep increasing, there is a compelling case for a new generation of advances in high-performance computing, commonly termed extreme-scale computing, which will bring forth multiple challenges for the design of workflow applications and management systems. This paper presents a novel characterization of workflow management systems using features commonly associated with extreme-scale computing applications. We classify 15 popular workflow management systems in terms of workflow execution models, heterogeneous computing environments, and data access methods. Finally, the paper also surveys workflow applications and identifies gaps for future research on the road to extreme-scale workflows and management systems.

  18. The DoD's High Performance Computing Modernization Program - Ensuring the National Earth System Prediction Capability Becomes Operational

    NASA Astrophysics Data System (ADS)

    Burnett, W.

    2016-12-01

    The Department of Defense's (DoD) High Performance Computing Modernization Program (HPCMP) provides high performance computing to address the most significant challenges in computational resources, software application support and nationwide research and engineering networks. Today, the HPCMP has a critical role in ensuring the National Earth System Prediction Capability (N-ESPC) achieves initial operational status in 2019. A 2015 study commissioned by the HPCMP found that N-ESPC computational requirements will exceed interconnect bandwidth capacity due to the additional load from data assimilation and passing connecting data between ensemble codes. Memory bandwidth and I/O bandwidth will continue to be significant bottlenecks for the Navy's Hybrid Coordinate Ocean Model (HYCOM) scalability - by far the major driver of computing resource requirements in the N-ESPC. The study also found that few of the N-ESPC model developers have detailed plans to ensure their respective codes scale through 2024. Three HPCMP initiatives are designed to directly address and support these issues: Productivity Enhancement, Technology, Transfer and Training (PETTT), the HPCMP Applications Software Initiative (HASI), and Frontier Projects. PETTT supports code conversion by providing assistance, expertise and training in scalable and high-end computing architectures. HASI addresses the continuing need for modern application software that executes effectively and efficiently on next-generation high-performance computers. Frontier Projects enable research and development that could not be achieved using typical HPCMP resources by providing multi-disciplinary teams access to exceptional amounts of high performance computing resources. Finally, the Navy's DoD Supercomputing Resource Center (DSRC) currently operates a 6 Petabyte system, of which Naval Oceanography receives 15% of operational computational system use, or approximately 1 Petabyte of the processing capability. The DSRC will provide the DoD with future computing assets to initially operate the N-ESPC in 2019. This talk will further describe how DoD's HPCMP will ensure N-ESPC becomes operational, efficiently and effectively, using next-generation high performance computing.

  19. Photonic-Enabled RF Canceller with Tunable Time-Delay Taps

    DTIC Science & Technology

    2016-12-05

    Future 5G wireless networks can benefit from the use of in-band full-duplex technologies; In-Band Full-Duplex (IBFD) technologies are being considered for 5th-generation (5G) wireless systems, and this report describes a photonic-enabled RF canceller with tunable time-delay taps. For the measurements, a network analyzer was configured to sweep 10 MHz to 6 GHz with +10 dBm of output power and to compute the time-domain transmission between the indicated ports. Keywords: microwave photonics, RF cancellation. Lexington, Massachusetts, USA.

  20. Teachers' Participation in Professional Development Concerning the Implementation of New Technologies in Class: A Latent Class Analysis of Teachers and the Relationship with the Use of Computers, ICT Self-Efficacy and Emphasis on Teaching ICT Skills

    ERIC Educational Resources Information Center

    Drossel, Kerstin; Eickelmann, Birgit

    2017-01-01

    The increasing availability of new technologies in an ever more digitalized world has gained momentum in practically all spheres of life, making technology-related skills a key competence not only in professional settings. Thus, schools assume responsibility for imparting these skills to their students, and hence to future generations of…

  1. Design of mobile shelters for communication purposes

    NASA Astrophysics Data System (ADS)

    Lotens, W. A.; Leebeek, H. J.

    1982-03-01

    A general design for a future generation of shelters, to be used as mobile work places, is presented. Design criteria involve ergonomics, functional suitability, and air conditioning. Electronics, power supply, and personnel get their own compartments. Work space is provided for two people with room for two more. Center of mass and cable connections are considered. Air conditioning requirements are calculated with a computer program. The result is an integrated design, applicable to shelters for several purposes.

  2. The house of the future

    ScienceCinema

    None

    2017-12-09

    Learn what it will take to create tomorrow's net-zero energy home as scientists reveal the secrets of cool roofs, smart windows, and computer-driven energy control systems. The net-zero energy home: Scientists are working to make tomorrow's homes more than just energy efficient -- they want them to be zero energy. Iain Walker, a scientist in the Lab's Energy Performance of Buildings Group, will discuss what it takes to develop net-zero energy houses that generate as much energy as they use through highly aggressive energy efficiency and on-site renewable energy generation. Talking back to the grid: Imagine programming your house to use less energy if the electricity grid is full or prices are high. Mary Ann Piette, deputy director of Berkeley Lab's building technology department and director of the Lab's Demand Response Research Center, will discuss how new technologies are enabling buildings to listen to the grid and automatically change their thermostat settings or lighting loads, among other demands, in response to fluctuating electricity prices. The networked (and energy efficient) house: In the future, your home's lights, climate control devices, computers, windows, and appliances could be controlled via a sophisticated digital network. If it's plugged in, it'll be connected. Bruce Nordman, an energy scientist in Berkeley Lab's Energy End-Use Forecasting group, will discuss how he and other scientists are working to ensure these networks help homeowners save energy.

  3. The house of the future

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    Learn what it will take to create tomorrow's net-zero energy home as scientists reveal the secrets of cool roofs, smart windows, and computer-driven energy control systems. The net-zero energy home: Scientists are working to make tomorrow's homes more than just energy efficient -- they want them to be zero energy. Iain Walker, a scientist in the Lab's Energy Performance of Buildings Group, will discuss what it takes to develop net-zero energy houses that generate as much energy as they use through highly aggressive energy efficiency and on-site renewable energy generation. Talking back to the grid: Imagine programming your house to use less energy if the electricity grid is full or prices are high. Mary Ann Piette, deputy director of Berkeley Lab's building technology department and director of the Lab's Demand Response Research Center, will discuss how new technologies are enabling buildings to listen to the grid and automatically change their thermostat settings or lighting loads, among other demands, in response to fluctuating electricity prices. The networked (and energy efficient) house: In the future, your home's lights, climate control devices, computers, windows, and appliances could be controlled via a sophisticated digital network. If it's plugged in, it'll be connected. Bruce Nordman, an energy scientist in Berkeley Lab's Energy End-Use Forecasting group, will discuss how he and other scientists are working to ensure these networks help homeowners save energy.

  4. Future tense: call for a new generation of artists

    NASA Astrophysics Data System (ADS)

    Ohlmann, Dietmar

    1995-02-01

    Some people try hard to educate others about the beauty and technical benefits of holographic applications, but another generation is already waiting to learn more about the media that speak to them about the future. Today the most common questions are 'How can I make holograms with a computer?' and 'Can I do it with an Amiga?' For the MIT specialists these are now very simple questions. We can expect the present shape of the holographic laboratory to pass into history. I personally like to work with a VHS camera and mix it with CAD/CAM images, but computer and video are not the only media that will change the face of holography. The He-Ne laser will be replaced by diode lasers: at a wavelength of 690 nm, some of them deliver 40 mW in single mode and single line, in a package no bigger than your little finger. With such energy in so small a container, the state of the art is drifting rapidly toward more flexibility. Using new media and introducing them into our societies gives us a new responsibility. Would too much media kill the art? I do not think so, because I like the variety of media, which gives new possibilities of expression. The game with new media is the power of creativity, and it will find its meaning by itself.

  5. A New Computational Technique for the Generation of Optimised Aircraft Trajectories

    NASA Astrophysics Data System (ADS)

    Chircop, Kenneth; Gardi, Alessandro; Zammit-Mangion, David; Sabatini, Roberto

    2017-12-01

    A new computational technique based on Pseudospectral Discretisation (PSD) and adaptive bisection ɛ-constraint methods is proposed to solve multi-objective aircraft trajectory optimisation problems formulated as nonlinear optimal control problems. This technique is applicable to a variety of next-generation avionics and Air Traffic Management (ATM) Decision Support Systems (DSS) for strategic and tactical replanning operations. These include the future Flight Management Systems (FMS) and the 4-Dimensional Trajectory (4DT) planning and intent negotiation/validation tools envisaged by SESAR and NextGen for a global implementation. In particular, after describing the PSD method, the adaptive bisection ɛ-constraint method is presented, allowing an efficient solution of problems in which two or more performance indices are to be minimised simultaneously. Initial simulation case studies were performed adopting suitable aircraft dynamics models and addressing a classical vertical trajectory optimisation problem with two simultaneous objectives. Subsequently, a more advanced 4DT simulation case study is presented with a focus on representative ATM optimisation objectives in the Terminal Manoeuvring Area (TMA). The simulation results are analysed in depth and corroborated by flight performance analysis, supporting the validity of the proposed computational techniques.
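
    The plain ɛ-constraint idea underlying the method is easy to state: minimise one objective subject to an upper bound ɛ on the other, and sweep (or bisect) ɛ to trace the Pareto front. The sketch below shows that basic idea on two analytic objectives; the paper's adaptive-bisection refinement and the pseudospectral transcription of the trajectory dynamics are not reproduced.

      # Plain epsilon-constraint sweep for a bi-objective toy problem.
      import numpy as np
      from scipy.optimize import minimize

      f1 = lambda x: (x[0] - 1.0) ** 2 + x[1] ** 2          # e.g. a "fuel" objective
      f2 = lambda x: x[0] ** 2 + (x[1] - 2.0) ** 2          # e.g. a "time" objective

      pareto = []
      for eps in np.linspace(0.5, 4.0, 8):
          cons = [{"type": "ineq", "fun": lambda x, e=eps: e - f2(x)}]   # enforce f2(x) <= eps
          res = minimize(f1, x0=[0.0, 0.0], method="SLSQP", constraints=cons)
          if res.success:
              pareto.append((f1(res.x), f2(res.x)))

      for point in pareto:
          print("f1 = %.3f  f2 = %.3f" % point)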

  6. Recent Developments in the VISRAD 3-D Target Design and Radiation Simulation Code

    NASA Astrophysics Data System (ADS)

    Macfarlane, Joseph; Golovkin, Igor; Sebald, James

    2017-10-01

    The 3-D view factor code VISRAD is widely used in designing HEDP experiments at major laser and pulsed-power facilities, including NIF, OMEGA, OMEGA-EP, ORION, Z, and LMJ. It simulates target designs by generating a 3-D grid of surface elements, utilizing a variety of 3-D primitives and surface removal algorithms, and can be used to compute the radiation flux throughout the surface element grid by computing element-to-element view factors and solving power balance equations. Target set-up and beam pointing are facilitated by allowing users to specify positions and angular orientations using a variety of coordinate systems (e.g., that of any laser beam, target component, or diagnostic port). Analytic modeling of laser beam spatial profiles for OMEGA DPPs and NIF CPPs is used to compute laser intensity profiles throughout the grid of surface elements. VISRAD includes a variety of user-friendly graphics for setting up targets and displaying results, can readily display views from any point in space, and can be used to generate image sequences for animations. We will discuss recent improvements to conveniently assess beam capture on target and beam clearance of diagnostic components, as well as plans for future developments.
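
    The element-to-element coupling at the heart of a view factor code reduces, for small planar facets, to the differential-area formula F_ij ≈ cos(theta_i) cos(theta_j) A_j / (pi r^2). The sketch below evaluates that formula for two facets; VISRAD's surface-removal logic and power-balance solve are not reproduced.

      # Differential-area view factor between two small planar facets.
      import numpy as np

      def view_factor(ci, ni, cj, nj, area_j):
          r_vec = cj - ci
          r = np.linalg.norm(r_vec)
          cos_i = np.dot(ni, r_vec) / (np.linalg.norm(ni) * r)
          cos_j = np.dot(nj, -r_vec) / (np.linalg.norm(nj) * r)
          if cos_i <= 0.0 or cos_j <= 0.0:      # elements face away from each other
              return 0.0
          return cos_i * cos_j * area_j / (np.pi * r ** 2)

      # two unit-area facets, 5 units apart, directly facing each other
      print(view_factor(np.array([0.0, 0.0, 0.0]), np.array([0.0, 0.0, 1.0]),
                        np.array([0.0, 0.0, 5.0]), np.array([0.0, 0.0, -1.0]),
                        area_j=1.0))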

  7. Mantle convection on modern supercomputers

    NASA Astrophysics Data System (ADS)

    Weismüller, Jens; Gmeiner, Björn; Mohr, Marcus; Waluga, Christian; Wohlmuth, Barbara; Rüde, Ulrich; Bunge, Hans-Peter

    2015-04-01

    Mantle convection is the cause for plate tectonics, the formation of mountains and oceans, and the main driving mechanism behind earthquakes. The convection process is modeled by a system of partial differential equations describing the conservation of mass, momentum and energy. Characteristic to mantle flow is the vast disparity of length scales from global to microscopic, turning mantle convection simulations into a challenging application for high-performance computing. As system size and technical complexity of the simulations continue to increase, design and implementation of simulation models for next generation large-scale architectures demand an interdisciplinary co-design. Here we report about recent advances of the TERRA-NEO project, which is part of the high visibility SPPEXA program, and a joint effort of four research groups in computer sciences, mathematics and geophysical application under the leadership of FAU Erlangen. TERRA-NEO develops algorithms for future HPC infrastructures, focusing on high computational efficiency and resilience in next generation mantle convection models. We present software that can resolve the Earth's mantle with up to 10^12 grid points and scales efficiently to massively parallel hardware with more than 50,000 processors. We use our simulations to explore the dynamic regime of mantle convection assessing the impact of small scale processes on global mantle flow.

  8. Experimental Stage Separation Tool Development in NASA Langley's Aerothermodynamics Laboratory

    NASA Technical Reports Server (NTRS)

    Murphy, Kelly J.; Scallion, William I.

    2005-01-01

    As part of the research effort at NASA in support of the stage separation and ascent aerothermodynamics research program, proximity testing of a generic bimese wing-body configuration was conducted in NASA Langley's Aerothermodynamics Laboratory in the 20-Inch Mach 6 Air Tunnel. The objective of this work is the development of experimental tools and testing methodologies to apply to hypersonic stage separation problems for future multi-stage launch vehicle systems. Aerodynamic force and moment proximity data were generated at a nominal Mach number of 6 over a small range of angles of attack. The generic bimese configuration was tested in a belly-to-belly and back-to-belly orientation at 86 relative proximity locations. Over 800 aerodynamic proximity data points were taken to serve as a database for code validation. Longitudinal aerodynamic data generated in this test program show very good agreement with viscous computational predictions. Thus a framework has been established to study separation problems in the hypersonic regime using coordinated experimental and computational tools.

  9. Smart Sensors: Why and when the origin was and why and where the future will be

    NASA Astrophysics Data System (ADS)

    Corsi, C.

    2013-12-01

    Smart Sensors are a technology developed in the 1970s, when processing capabilities based on readout integrated with signal processing were still far from the complexity needed in advanced IR surveillance and warning systems, because of the enormous amount of noise and unwanted signals emitted by the operating scenario, especially in military applications. Smart Sensor technology was kept restricted within a closed military environment and then exploded in applications and performance in the 1990s, thanks to the impressive improvements in integrated signal read-out and processing achieved by CCD-CMOS technologies in FPAs. In fact, the rapid advances of "very large scale integration" (VLSI) processor technology and mosaic EO detector array technology allowed new generations of Smart Sensors to be developed with much improved signal processing, by integrating microcomputers and other VLSI signal processors inside the sensor structure and thereby achieving some basic functions of living eyes (dynamic stare, non-uniformity compensation, spatial and temporal filtering). New and future technologies (nanotechnology, bio-organic electronics, bio-computing) are igniting a new generation of Smart Sensors, extending smartness from the space-time domain to spectroscopic, functional, multi-domain signal processing. The history and future prospects of Smart Sensors are reported.

  10. Predicting future discoveries from current scientific literature.

    PubMed

    Petrič, Ingrid; Cestnik, Bojan

    2014-01-01

    Knowledge discovery in biomedicine is a time-consuming process starting from basic research, through preclinical testing, towards possible clinical applications. Crossing conceptual boundaries is often needed for groundbreaking biomedical research that generates highly inventive discoveries. We demonstrate the ability of a creative literature mining method to advance valuable new discoveries based on rare ideas from existing literature. When emerging ideas from the scientific literature are put together as fragments of knowledge in a systematic way, they may lead to original, sometimes surprising, research findings. If enough scientific evidence is already published for the association of such findings, they can be considered scientific hypotheses. In this chapter, we describe a method for the computer-aided generation of such hypotheses based on the existing scientific literature. Our literature-based discovery of NF-kappaB and its possible connections to autism was recently corroborated by the scientific community, which confirms the ability of our literature mining methodology to accelerate future discoveries based on rare ideas from existing literature.
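
    One widely used scheme for this kind of computer-aided hypothesis generation is the classic ABC linking idea: find bridging terms B that co-occur with concept A in one body of literature and with concept C in another, even though A and C are rarely discussed together. The sketch below is a generic illustration of that idea on toy abstracts, not the authors' actual pipeline.

      # Generic ABC-style bridging-term search on toy abstracts.
      def cooccurring_terms(abstracts, concept):
          terms = set()
          for text in abstracts:
              words = set(text.lower().split())
              if concept in words:
                  terms |= words - {concept}
          return terms

      literature_a = ["nf-kappab inflammation signaling pathway",
                      "oxidative stress activates nf-kappab"]
      literature_c = ["autism linked to inflammation markers",
                      "oxidative stress reported in autism"]

      bridges = (cooccurring_terms(literature_a, "nf-kappab")
                 & cooccurring_terms(literature_c, "autism"))
      print(sorted(bridges))        # ['inflammation', 'oxidative', 'stress']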

  11. Evaluation of a CFD Method for Aerodynamic Database Development using the Hyper-X Stack Configuration

    NASA Technical Reports Server (NTRS)

    Parikh, Paresh; Engelund, Walter; Armand, Sasan; Bittner, Robert

    2004-01-01

    A computational fluid dynamic (CFD) study is performed on the Hyper-X (X-43A) Launch Vehicle stack configuration in support of the aerodynamic database generation in the transonic to hypersonic flow regime. The main aim of the study is the evaluation of a CFD method that can be used to support aerodynamic database development for similar future configurations. The CFD method uses the NASA Langley Research Center developed TetrUSS software, which is based on tetrahedral, unstructured grids. The Navier-Stokes computational method is first evaluated against a set of wind tunnel test data to gain confidence in the code's application to hypersonic Mach number flows. The evaluation includes comparison of the longitudinal stability derivatives on the complete stack configuration (which includes the X-43A/Hyper-X Research Vehicle, the launch vehicle and an adapter connecting the two), detailed surface pressure distributions at selected locations on the stack body, and component (rudder, elevons) forces and moments. The CFD method is further used to predict the stack aerodynamic performance at flow conditions where no experimental data are available, as well as component loads for mechanical design and aero-elastic analyses. An excellent match between the computed and the test data over a range of flow conditions provides a computational tool that may be used for future similar hypersonic configurations with confidence.

  12. Perspectives on Emerging/Novel Computing Paradigms and Future Aerospace Workforce Environments

    NASA Technical Reports Server (NTRS)

    Noor, Ahmed K.

    2003-01-01

    The accelerating pace of computing technology development shows no signs of abating. Computing power of 100 Tflop/s is likely to be reached by 2004 and Pflop/s (10^15 Flop/s) by 2007. The fundamental physical limits of computation, including information storage limits, communication limits and computation rate limits, will likely be reached by the middle of the present millennium. To overcome these limits, novel technologies and new computing paradigms will be developed. An attempt is made in this overview to put the diverse activities related to new computing paradigms in perspective and to set the stage for the succeeding presentations. The presentation is divided into five parts. In the first part, a brief historical account is given of the development of computer and networking technologies. The second part provides brief overviews of the three emerging computing paradigms: grid, ubiquitous, and autonomic computing. The third part lists future computing alternatives and the characteristics of the future computing environment. The fourth part describes future aerospace workforce research, learning and design environments. The fifth part lists the objectives of the workshop and some of the sources of information on future computing paradigms.

  13. Enabling the environmentally clean air transportation of the future: a vision of computational fluid dynamics in 2030

    PubMed Central

    Slotnick, Jeffrey P.; Khodadoust, Abdollah; Alonso, Juan J.; Darmofal, David L.; Gropp, William D.; Lurie, Elizabeth A.; Mavriplis, Dimitri J.; Venkatakrishnan, Venkat

    2014-01-01

    As global air travel expands rapidly to meet demand generated by economic growth, it is essential to continue to improve the efficiency of air transportation to reduce its carbon emissions and address concerns about climate change. Future transports must be ‘cleaner’ and designed to include technologies that will continue to lower engine emissions and reduce community noise. The use of computational fluid dynamics (CFD) will be critical to enable the design of these new concepts. In general, the ability to simulate aerodynamic and reactive flows using CFD has progressed rapidly during the past several decades and has fundamentally changed the aerospace design process. Advanced simulation capabilities not only enable reductions in ground-based and flight-testing requirements, but also provide added physical insight, and enable superior designs at reduced cost and risk. In spite of considerable success, reliable use of CFD has remained confined to a small region of the operating envelope due, in part, to the inability of current methods to reliably predict turbulent, separated flows. Fortunately, the advent of much more powerful computing platforms provides an opportunity to overcome a number of these challenges. This paper summarizes the findings and recommendations from a recent NASA-funded study that provides a vision for CFD in the year 2030, including an assessment of critical technology gaps and needed development, and identifies the key CFD technology advancements that will enable the design and development of much cleaner aircraft in the future. PMID:25024413

  14. Multimission image processing and science data visualization

    NASA Technical Reports Server (NTRS)

    Green, William B.

    1993-01-01

    The Operational Science Analysis (OSA) functional area supports science instrument data display, analysis, visualization, and photo processing in support of flight operations of planetary spacecraft managed by the Jet Propulsion Laboratory (JPL). This paper describes the data products generated by the OSA functional area and the current computer system used to generate these data products. The objectives of a system upgrade now in progress are described. The design approach to development of the new system is reviewed, including the use of the Unix operating system and X Window display standards to provide platform independence, portability, and modularity within the new system. The new system should provide a modular and scalable capability supporting a variety of future missions at JPL.

  15. Integrated software system for low level waste management

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Worku, G.

    1995-12-31

    In the continually changing and uncertain world of low level waste management, many generators in the US are faced with the prospect of having to store their waste on site for the indefinite future. This consequently increases the set of tasks performed by the generators in the areas of packaging, characterizing, classifying, screening (if a set of acceptance criteria applies), and managing the inventory for the duration of onsite storage. When disposal sites become available, it is expected that the work will require re-evaluating the waste packages, including possible re-processing, re-packaging, or re-classifying in preparation for shipment for disposal under the regulatory requirements of the time. In this day and age, when there is wide use of computers and computer literacy is at high levels, an important waste management tool would be an integrated software system that aids waste management personnel in conducting these tasks quickly and accurately. It has become evident that such an integrated radwaste management software system offers great benefits to radwaste generators both in the US and other countries. This paper discusses one such approach to integrated radwaste management utilizing some globally accepted radiological assessment software applications.

  16. Aircraft noise synthesis system

    NASA Technical Reports Server (NTRS)

    Mccurdy, David A.; Grandle, Robert E.

    1987-01-01

    A second-generation Aircraft Noise Synthesis System has been developed to provide test stimuli for studies of community annoyance to aircraft flyover noise. The computer-based system generates realistic, time-varying, audio simulations of aircraft flyover noise at a specified observer location on the ground. The synthesis takes into account the time-varying aircraft position relative to the observer; specified reference spectra consisting of broadband, narrowband, and pure-tone components; directivity patterns; Doppler shift; atmospheric effects; and ground effects. These parameters can be specified and controlled in such a way as to generate stimuli in which certain noise characteristics, such as duration or tonal content, are independently varied, while the remaining characteristics, such as broadband content, are held constant. The system can also generate simulations of the predicted noise characteristics of future aircraft. A description of the synthesis system and a discussion of the algorithms and methods used to generate the simulations are provided. An appendix describing the input data and providing user instructions is also included.
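
    As a rough illustration of the kind of geometry-driven effects such a synthesis must account for, the Python sketch below computes a time-varying Doppler-shifted frequency and a spherical-spreading level change for a straight, level flyover. The speed, altitude, and source frequency are assumed values chosen only for illustration and are not taken from the NASA system.

      import numpy as np

      # Illustrative flyover geometry (assumed values, not from the NASA system):
      # the aircraft flies level at altitude h with constant speed v past an observer at the origin.
      c = 340.0          # speed of sound, m/s
      v = 80.0           # aircraft speed, m/s
      h = 300.0          # altitude, m
      f_source = 1000.0  # emitted pure-tone frequency, Hz

      t = np.linspace(-20.0, 20.0, 2001)       # time relative to overhead passage, s
      x = v * t                                 # along-track position, m
      r = np.sqrt(x**2 + h**2)                  # slant range to observer, m

      # Radial velocity component toward the observer (negative while approaching).
      v_radial = v * x / r

      # Classical Doppler shift for a moving source, plus spherical-spreading level change
      # relative to the closest point of approach.
      f_observed = f_source * c / (c + v_radial)
      level_change_db = -20.0 * np.log10(r / r.min())

      for ti, fi, ai in zip(t[::500], f_observed[::500], level_change_db[::500]):
          print(f"t = {ti:+6.1f} s   f = {fi:7.1f} Hz   level change = {ai:6.1f} dB")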

  17. Computational Nanoelectronics and Nanotechnology at NASA ARC

    NASA Technical Reports Server (NTRS)

    Saini, Subhash; Kutler, Paul (Technical Monitor)

    1998-01-01

    Both physical and economic considerations indicate that the scaling era of CMOS will run out of steam around the year 2010. However, physical laws also indicate that it is possible to compute at a rate of a billion times present speeds with the expenditure of only one Watt of electrical power. NASA has long-term needs where ultra-small semiconductor devices are needed for critical applications: high performance, low power, compact computers for intelligent autonomous vehicles and Petaflop computing technology are some key examples. To advance the design, development, and production of future generation micro- and nano-devices, the IT Modeling and Simulation Group has been started at NASA Ames with the goal of developing an integrated simulation environment that addresses problems related to nanoelectronics and molecular nanotechnology. An overview of the nanoelectronics and nanotechnology research activities being carried out at Ames Research Center will be presented. We will also present the vision and the research objectives of the IT Modeling and Simulation Group, including the applications of nanoelectronic-based devices relevant to NASA missions.

  18. Computational Nanoelectronics and Nanotechnology at NASA ARC

    NASA Technical Reports Server (NTRS)

    Saini, Subhash

    1998-01-01

    Both physical and economic considerations indicate that the scaling era of CMOS will run out of steam around the year 2010. However, physical laws also indicate that it is possible to compute at a rate of a billion times present speeds with the expenditure of only one Watt of electrical power. NASA has long-term needs where ultra-small semiconductor devices are needed for critical applications: high performance, low power, compact computers for intelligent autonomous vehicles and Petaflop computing technology are some key examples. To advance the design, development, and production of future generation micro- and nano-devices, the IT Modeling and Simulation Group has been started at NASA Ames with the goal of developing an integrated simulation environment that addresses problems related to nanoelectronics and molecular nanotechnology. An overview of the nanoelectronics and nanotechnology research activities being carried out at Ames Research Center will be presented. We will also present the vision and the research objectives of the IT Modeling and Simulation Group, including the applications of nanoelectronic-based devices relevant to NASA missions.

  19. Airfoil Vibration Dampers program

    NASA Technical Reports Server (NTRS)

    Cook, Robert M.

    1991-01-01

    The Airfoil Vibration Damper program has consisted of an analysis phase and a testing phase. During the analysis phase, a state-of-the-art computer code was developed, which can be used to guide designers in the placement and sizing of friction dampers. The use of this computer code was demonstrated by performing representative analyses on turbine blades from the High Pressure Oxidizer Turbopump (HPOTP) and High Pressure Fuel Turbopump (HPFTP) of the Space Shuttle Main Engine (SSME). The testing phase of the program consisted of performing friction damping tests on two different cantilever beams. Data from these tests provided an empirical check on the accuracy of the computer code developed in the analysis phase. Results of the analysis and testing showed that the computer code can accurately predict the performance of friction dampers. In addition, a valuable set of friction damping data was generated, which can be used to aid in the design of friction dampers, as well as provide benchmark test cases for future code developers.

  20. Scout: high-performance heterogeneous computing made simple

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jablin, James; Mc Cormick, Patrick; Herlihy, Maurice

    2011-01-26

    Researchers must often write their own simulation and analysis software. During this process they simultaneously confront both computational and scientific problems. Current strategies for aiding the generation of performance-oriented programs do not abstract the software development from the science. Furthermore, the problem is becoming increasingly complex and pressing with the continued development of many-core and heterogeneous (CPU-GPU) architectures. To achieve high performance, scientists must expertly navigate both software and hardware. Co-design between computer scientists and research scientists can alleviate but not solve this problem. The science community requires better tools for developing, optimizing, and future-proofing codes, allowing scientists to focus on their research while still achieving high computational performance. Scout is a parallel programming language and extensible compiler framework targeting heterogeneous architectures. It provides the abstraction required to buffer scientists from the constantly-shifting details of hardware while still realizing high performance by encapsulating software and hardware optimization within a compiler framework.

  1. Modern design of a fast front-end computer

    NASA Astrophysics Data System (ADS)

    Šoštarić, Z.; Aničić, D.; Sekolec, L.; Su, J.

    1994-12-01

    Front-end computers (FEC) at Paul Scherrer Institut provide access to accelerator CAMAC-based sensors and actuators by way of a local area network. In the scope of the new generation FEC project, a front-end is regarded as a collection of services. The functionality of one such service is described in terms of Yourdon's environment, behaviour, processor and task models. The computational model (the software representation of the environment) of the service is defined separately, using the information model of the Shlaer-Mellor method and the Sather OO language. In parallel with the analysis, and later with the design, a suite of test programmes was developed to evaluate the feasibility of different computing platforms for the project, and a set of rapid prototypes was produced to resolve different implementation issues. The past and future aspects of the project and its driving forces are presented. Justification of the choice of methodology, platform and requirements is given. We conclude with a description of the present state, priorities and limitations of our project.

  2. Automated Generation of Message-Passing Programs: An Evaluation Using CAPTools

    NASA Technical Reports Server (NTRS)

    Hribar, Michelle R.; Jin, Haoqiang; Yan, Jerry C.; Saini, Subhash (Technical Monitor)

    1998-01-01

    Scientists at NASA Ames Research Center have been developing computational aeroscience applications on highly parallel architectures over the past ten years. During that same time period, a steady transition of hardware and system software also occurred, forcing us to expend great effort in migrating and re-coding our applications. As applications and machine architectures become increasingly complex, the cost and time required for this process will become prohibitive. In this paper, we present the first set of results in our evaluation of interactive parallelization tools. In particular, we evaluate CAPTools' ability to parallelize computational aeroscience applications. CAPTools was tested on serial versions of the NAS Parallel Benchmarks and ARC3D, a computational fluid dynamics application, on two platforms: the SGI Origin 2000 and the Cray T3E. This evaluation includes performance, amount of user interaction required, limitations, and portability. Based on these results, a discussion of the feasibility of computer-aided parallelization of aerospace applications is presented along with suggestions for future work.

  3. 48 CFR 52.253-1 - Computer Generated Forms.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 48 Federal Acquisition Regulations System 2 2012-10-01 2012-10-01 false Computer Generated Forms....253-1 Computer Generated Forms. As prescribed in FAR 53.111, insert the following clause: Computer... by the Federal Acquisition Regulation (FAR) may be submitted on a computer generated version of the...

  4. 48 CFR 52.253-1 - Computer Generated Forms.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 48 Federal Acquisition Regulations System 2 2013-10-01 2013-10-01 false Computer Generated Forms....253-1 Computer Generated Forms. As prescribed in FAR 53.111, insert the following clause: Computer... by the Federal Acquisition Regulation (FAR) may be submitted on a computer generated version of the...

  5. 48 CFR 52.253-1 - Computer Generated Forms.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 48 Federal Acquisition Regulations System 2 2014-10-01 2014-10-01 false Computer Generated Forms....253-1 Computer Generated Forms. As prescribed in FAR 53.111, insert the following clause: Computer... by the Federal Acquisition Regulation (FAR) may be submitted on a computer generated version of the...

  6. 27 CFR 19.634 - Computer-generated reports and transaction forms.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 27 Alcohol, Tobacco Products and Firearms 1 2013-04-01 2013-04-01 false Computer-generated reports... Reports Filing Forms and Reports § 19.634 Computer-generated reports and transaction forms. TTB will accept computer-generated reports of operations and transaction forms made using a computer printer on...

  7. 27 CFR 19.634 - Computer-generated reports and transaction forms.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 27 Alcohol, Tobacco Products and Firearms 1 2014-04-01 2014-04-01 false Computer-generated reports... Reports Filing Forms and Reports § 19.634 Computer-generated reports and transaction forms. TTB will accept computer-generated reports of operations and transaction forms made using a computer printer on...

  8. 27 CFR 19.634 - Computer-generated reports and transaction forms.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 27 Alcohol, Tobacco Products and Firearms 1 2012-04-01 2012-04-01 false Computer-generated reports... Reports Filing Forms and Reports § 19.634 Computer-generated reports and transaction forms. TTB will accept computer-generated reports of operations and transaction forms made using a computer printer on...

  9. 27 CFR 19.634 - Computer-generated reports and transaction forms.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 27 Alcohol, Tobacco Products and Firearms 1 2011-04-01 2011-04-01 false Computer-generated reports... Reports Filing Forms and Reports § 19.634 Computer-generated reports and transaction forms. TTB will accept computer-generated reports of operations and transaction forms made using a computer printer on...

  10. 48 CFR 52.253-1 - Computer Generated Forms.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 48 Federal Acquisition Regulations System 2 2011-10-01 2011-10-01 false Computer Generated Forms....253-1 Computer Generated Forms. As prescribed in FAR 53.111, insert the following clause: Computer... by the Federal Acquisition Regulation (FAR) may be submitted on a computer generated version of the...

  11. 48 CFR 52.253-1 - Computer Generated Forms.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 48 Federal Acquisition Regulations System 2 2010-10-01 2010-10-01 false Computer Generated Forms....253-1 Computer Generated Forms. As prescribed in FAR 53.111, insert the following clause: Computer... by the Federal Acquisition Regulation (FAR) may be submitted on a computer generated version of the...

  12. The Proposal of a Evolutionary Strategy Generating the Data Structures Based on a Horizontal Tree for the Tests

    NASA Astrophysics Data System (ADS)

    Żukowicz, Marek; Markiewicz, Michał

    2016-09-01

    The aim of the article is to present a mathematical definition of the object model known in computer science as TreeList, and to show how this model can be used to design an evolutionary algorithm whose purpose is to generate structures based on this object. The first chapter introduces the reader to the problem of presenting data using the TreeList object. The second chapter describes the problem of testing data structures based on TreeList. The third presents a mathematical model of the TreeList object and the parameters used both to determine the utility of structures created through this model and in the evolutionary strategy that generates these structures for testing purposes. The last chapter provides a brief summary and plans for future research related to the algorithm presented in the article.
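
    To make the general idea concrete, the Python sketch below implements a minimal (1+1) evolutionary strategy that mutates randomly generated trees, used here as a stand-in for TreeList-like structures, toward an assumed target size. The fitness function and all parameters are illustrative assumptions and do not reproduce the algorithm proposed in the article.

      import random

      # Minimal (1+1) evolutionary strategy evolving a random tree (a stand-in for a
      # TreeList-like structure) toward an assumed target number of nodes.

      def random_tree(depth=3, max_children=3):
          """Build a random tree represented as nested lists of children."""
          if depth == 0:
              return []
          return [random_tree(depth - 1, max_children)
                  for _ in range(random.randint(0, max_children))]

      def count_nodes(tree):
          return 1 + sum(count_nodes(child) for child in tree)

      def mutate(tree, depth=3):
          """Return a mutated copy: occasionally add or remove a subtree."""
          tree = [mutate(child, depth - 1) for child in tree]
          if depth > 0 and random.random() < 0.3:
              tree.append(random_tree(depth - 1))
          if tree and random.random() < 0.3:
              tree.pop(random.randrange(len(tree)))
          return tree

      def fitness(tree, target_nodes=20):
          """Assumed utility: closeness of the structure size to a target."""
          return -abs(count_nodes(tree) - target_nodes)

      parent = random_tree()
      for generation in range(200):
          child = mutate(parent)
          if fitness(child) >= fitness(parent):
              parent = child
      print("final tree size:", count_nodes(parent), "fitness:", fitness(parent))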

  13. Machine-aided indexing for NASA STI

    NASA Technical Reports Server (NTRS)

    Wilson, John

    1987-01-01

    One of the major components of the NASA/STI processing system is machine-aided indexing (MAI). MAI is a computer process that generates a set of indexing terms selected from NASA's thesaurus; it is used for indexing technical reports, is based on the text of the report, and is reviewed by indexers. This paper summarizes the MAI objectives and discusses the NASA Lexical Dictionary, subject switching, and phrase matching of natural language. The benefits of using MAI are mentioned, and MAI production improvement and the future of MAI are briefly addressed.
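
    As a toy illustration of thesaurus-based phrase matching (only the general idea, not NASA's MAI software), the Python sketch below scans a piece of text for phrases that appear in a small controlled vocabulary. The thesaurus entries and the sample sentence are invented for this example.

      import re

      # Scan text for the longest phrases that appear in a controlled vocabulary and
      # return the corresponding preferred index terms. Tiny invented thesaurus and text.
      thesaurus = {
          "computational fluid dynamics": "CFD",
          "wind tunnel": "WIND TUNNELS",
          "turbulence model": "TURBULENCE MODELS",
          "grid generation": "GRID GENERATION",
      }

      text = ("The study applied computational fluid dynamics and a new turbulence "
              "model, with results compared against wind tunnel measurements.")

      def suggest_index_terms(text, thesaurus):
          """Return thesaurus terms whose phrases occur in the text (longest phrases first)."""
          found = []
          for phrase in sorted(thesaurus, key=len, reverse=True):
              if re.search(r"\b" + re.escape(phrase) + r"\b", text, re.IGNORECASE):
                  found.append(thesaurus[phrase])
          return found

      print(suggest_index_terms(text, thesaurus))
      # -> ['CFD', 'TURBULENCE MODELS', 'WIND TUNNELS']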

  14. Transonic Symposium: Theory, Application, and Experiment, volume 1, part 2

    NASA Technical Reports Server (NTRS)

    Foughner, Jerome T., Jr. (Compiler)

    1989-01-01

    In order to assess the state of the art in transonic flow disciplines and to glimpse at future directions, NASA-Langley held a Transonic Symposium. Emphasis was placed on steady, three dimensional external, transonic flow and its simulation, both numerically and experimentally. The symposium included technical sessions on wind tunnel and flight experiments; computational fluid dynamic applications; inviscid methods and grid generation; viscous methods and boundary layer stability; and wind tunnel techniques and wall interference. This, being volume 1, is unclassified.

  15. The Interactive Generation of Alphanumerics and Symbology with Designs on the Future.

    DTIC Science & Technology

    1985-06-01

    Interface in Airborne Systems. 22-26. IEEE/AESS Symposium, Dayton, Ohio, December. 17. Damodaran, L. and K. D. Eason. "Design Procedures for User...Petrocelli Books, Inc., 1982. 19. Eason, K. D. and L. Damodaran. "The Needs of the Commercial User" in Computing Skills and the User Interface, edited...inadequate or inappropriate for another (Auld and others, 1981: 78-92; Eason and Damodaran, 1981: 115-122; Damodaran and Eason, 1981: 373-387). These

  16. Unstructured mesh algorithms for aerodynamic calculations

    NASA Technical Reports Server (NTRS)

    Mavriplis, D. J.

    1992-01-01

    The use of unstructured mesh techniques for solving complex aerodynamic flows is discussed. The principal advantages of unstructured mesh strategies, as they relate to complex geometries, adaptive meshing capabilities, and parallel processing, are emphasized. The various aspects required for the efficient and accurate solution of aerodynamic flows are addressed. These include mesh generation, mesh adaptivity, solution algorithms, convergence acceleration, and turbulence modeling. Computations of viscous turbulent two-dimensional flows and inviscid three-dimensional flows about complex configurations are demonstrated. Remaining obstacles and directions for future research are also outlined.

  17. Techniques and resources for storm-scale numerical weather prediction

    NASA Technical Reports Server (NTRS)

    Droegemeier, Kelvin; Grell, Georg; Doyle, James; Soong, Su-Tzai; Skamarock, William; Bacon, David; Staniforth, Andrew; Crook, Andrew; Wilhelmson, Robert

    1993-01-01

    The topics discussed include the following: multiscale application of the 5th-generation PSU/NCAR mesoscale model, the coupling of nonhydrostatic atmospheric and hydrostatic ocean models for air-sea interaction studies; a numerical simulation of cloud formation over complex topography; adaptive grid simulations of convection; an unstructured grid, nonhydrostatic meso/cloud scale model; efficient mesoscale modeling for multiple scales using variable resolution; initialization of cloud-scale models with Doppler radar data; and making effective use of future computing architectures, networks, and visualization software.

  18. TOPICAL REVIEW: Advances and challenges in computational plasma science

    NASA Astrophysics Data System (ADS)

    Tang, W. M.; Chan, V. S.

    2005-02-01

    Scientific simulation, which provides a natural bridge between theory and experiment, is an essential tool for understanding complex plasma behaviour. Recent advances in simulations of magnetically confined plasmas are reviewed in this paper, with illustrative examples, chosen from associated research areas such as microturbulence, magnetohydrodynamics and other topics. Progress has been stimulated, in particular, by the exponential growth of computer speed along with significant improvements in computer technology. The advances in both particle and fluid simulations of fine-scale turbulence and large-scale dynamics have produced increasingly good agreement between experimental observations and computational modelling. This was enabled by two key factors: (a) innovative advances in analytic and computational methods for developing reduced descriptions of physics phenomena spanning widely disparate temporal and spatial scales and (b) access to powerful new computational resources. Excellent progress has been made in developing codes for which computer run-time and problem-size scale well with the number of processors on massively parallel processors (MPPs). Examples include the effective usage of the full power of multi-teraflop (multi-trillion floating point computations per second) MPPs to produce three-dimensional, general geometry, nonlinear particle simulations that have accelerated advances in understanding the nature of turbulence self-regulation by zonal flows. These calculations, which typically utilized billions of particles for thousands of time-steps, would not have been possible without access to powerful present generation MPP computers and the associated diagnostic and visualization capabilities. In looking towards the future, the current results from advanced simulations provide great encouragement for being able to include increasingly realistic dynamics to enable deeper physics insights into plasmas in both natural and laboratory environments. This should produce the scientific excitement which will help to (a) stimulate enhanced cross-cutting collaborations with other fields and (b) attract the bright young talent needed for the future health of the field of plasma science.

  19. Advances and challenges in computational plasma science

    NASA Astrophysics Data System (ADS)

    Tang, W. M.

    2005-02-01

    Scientific simulation, which provides a natural bridge between theory and experiment, is an essential tool for understanding complex plasma behaviour. Recent advances in simulations of magnetically confined plasmas are reviewed in this paper, with illustrative examples, chosen from associated research areas such as microturbulence, magnetohydrodynamics and other topics. Progress has been stimulated, in particular, by the exponential growth of computer speed along with significant improvements in computer technology. The advances in both particle and fluid simulations of fine-scale turbulence and large-scale dynamics have produced increasingly good agreement between experimental observations and computational modelling. This was enabled by two key factors: (a) innovative advances in analytic and computational methods for developing reduced descriptions of physics phenomena spanning widely disparate temporal and spatial scales and (b) access to powerful new computational resources. Excellent progress has been made in developing codes for which computer run-time and problem-size scale well with the number of processors on massively parallel processors (MPPs). Examples include the effective usage of the full power of multi-teraflop (multi-trillion floating point computations per second) MPPs to produce three-dimensional, general geometry, nonlinear particle simulations that have accelerated advances in understanding the nature of turbulence self-regulation by zonal flows. These calculations, which typically utilized billions of particles for thousands of time-steps, would not have been possible without access to powerful present generation MPP computers and the associated diagnostic and visualization capabilities. In looking towards the future, the current results from advanced simulations provide great encouragement for being able to include increasingly realistic dynamics to enable deeper physics insights into plasmas in both natural and laboratory environments. This should produce the scientific excitement which will help to (a) stimulate enhanced cross-cutting collaborations with other fields and (b) attract the bright young talent needed for the future health of the field of plasma science.

  20. NASA Scientists Push the Limits of Computer Technology

    NASA Technical Reports Server (NTRS)

    1998-01-01

    Dr. Donald Frazier, a NASA researcher, uses a blue laser shining through a quartz window into a special mix of chemicals to generate a polymer film on the inside quartz surface. As the chemicals respond to the laser light, they adhere to the glass surface, forming optical films. Dr. Frazier and Dr. Mark S. Paley developed the process in the Space Sciences Laboratory at NASA's Marshall Space Flight Center in Huntsville, AL. Working aboard the Space Shuttle, a science team led by Dr. Frazier formed thin films potentially useful in optical computers with fewer impurities than those formed on Earth. Patterns of these films can be traced onto the quartz surface. In the optical computers of the future, these films could replace electronic circuits and wires, making the systems more efficient and cost-effective, as well as lighter and more compact. Photo credit: NASA/Marshall Space Flight Center.

  1. NASA Scientists Push the Limits of Computer Technology

    NASA Technical Reports Server (NTRS)

    1998-01-01

    NASA researcher Dr. Donald Frazier uses a blue laser shining through a quartz window into a special mix of chemicals to generate a polymer film on the inside quartz surface. As the chemicals respond to the laser light, they adhere to the glass surface, forming optical films. Dr. Frazier and Dr. Mark S. Paley developed the process in the Space Sciences Laboratory at NASA's Marshall Space Flight Center in Huntsville, AL. Working aboard the Space Shuttle, a science team led by Dr. Frazier formed thin films potentially useful in optical computers with fewer impurities than those formed on Earth. Patterns of these films can be traced onto the quartz surface. In the optical computers of the future, these films could replace electronic circuits and wires, making the systems more efficient and cost-effective, as well as lighter and more compact. Photo credit: NASA/Marshall Space Flight Center.

  2. NASA Scientists Push the Limits of Computer Technology

    NASA Technical Reports Server (NTRS)

    1999-01-01

    NASA researcher Dr. Donald Frazier uses a blue laser shining through a quartz window into a special mix of chemicals to generate a polymer film on the inside quartz surface. As the chemicals respond to the laser light, they adhere to the glass surface, forming optical films. Dr. Frazier and Dr. Mark S. Paley developed the process in the Space Sciences Laboratory at NASA's Marshall Space Flight Center in Huntsville, AL. Working aboard the Space Shuttle, a science team led by Dr. Frazier formed thin films potentially useful in optical computers with fewer impurities than those formed on Earth. Patterns of these films can be traced onto the quartz surface. In the optical computers of the future, these films could replace electronic circuits and wires, making the systems more efficient and cost-effective, as well as lighter and more compact. Photo credit: NASA/Marshall Space Flight Center.

  3. Modeling Pilot Behavior for Assessing Integrated Alert and Notification Systems on Flight Decks

    NASA Technical Reports Server (NTRS)

    Cover, Mathew; Schnell, Thomas

    2010-01-01

    Numerous new flight deck configurations for caution, warning, and alerts can be conceived, yet testing them with human-in-the-loop experiments to evaluate each one would not be practical. New sensors, instruments, and displays are being put into cockpits every day, and this is particularly true as we enter the dawn of the Next Generation Air Transportation System (NextGen). By modeling pilot behavior in a computer simulation, an unlimited number of unique caution, warning, and alert configurations can be evaluated 24/7 by a computer. These computer simulations can then identify the most promising candidate formats to further evaluate in higher fidelity, but more costly, human-in-the-loop (HITL) simulations. Evaluations using batch simulations with human performance models save time and money, and enable a broader consideration of possible caution, warning, and alerting configurations for future flight decks.

  4. Biosensors with Built-In Biomolecular Logic Gates for Practical Applications

    PubMed Central

    Lai, Yu-Hsuan; Sun, Sin-Cih; Chuang, Min-Chieh

    2014-01-01

    Molecular logic gates, designs constructed with biological and chemical molecules, have emerged as an alternative computing approach to silicon-based logic operations. These molecular computers are capable of receiving and integrating multiple stimuli of biochemical significance to generate a definitive output, opening a new research avenue to advanced diagnostics and therapeutics which demand handling of complex factors and precise control. In molecularly gated devices, Boolean logic computations can be activated by specific inputs and accurately processed via bio-recognition, bio-catalysis, and selective chemical reactions. In this review, we survey recent advances of the molecular logic approaches to practical applications of biosensors, including designs constructed with proteins, enzymes, nucleic acids, nanomaterials, and organic compounds, as well as the research avenues for future development of digitally operating “sense and act” schemes that logically process biochemical signals through networked circuits to implement intelligent control systems. PMID:25587423
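
    As a toy illustration of the logic-gate idea described above (not a model of any specific biochemical system), the Python sketch below treats two input concentrations as Boolean signals via an assumed threshold and combines them into an AND output; all numbers are arbitrary.

      # Toy two-input biochemical AND gate: each input is "1" when its concentration
      # exceeds an assumed threshold, and the reporter output is produced only when both
      # inputs are present. All values are illustrative assumptions, not measurements.
      THRESHOLD = 1.0  # arbitrary concentration units

      def logic_input(concentration, threshold=THRESHOLD):
          return 1 if concentration >= threshold else 0

      def and_gate(conc_a, conc_b):
          return logic_input(conc_a) & logic_input(conc_b)

      for a, b in [(0.1, 0.2), (2.0, 0.3), (0.4, 1.5), (2.0, 1.5)]:
          print(f"input A = {a:3.1f}, input B = {b:3.1f}  ->  output = {and_gate(a, b)}")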

  5. Cloud access to interoperable IVOA-compliant VOSpace storage

    NASA Astrophysics Data System (ADS)

    Bertocco, S.; Dowler, P.; Gaudet, S.; Major, B.; Pasian, F.; Taffoni, G.

    2018-07-01

    Handling, processing and archiving the huge amount of data produced by the new generation of experiments and instruments in Astronomy and Astrophysics are among the more exciting challenges to address in designing the future data management infrastructures and computing services. We investigated the feasibility of a data management and computation infrastructure, available world-wide, with the aim of merging the FAIR data management provided by IVOA standards with the efficiency and reliability of a cloud approach. Our work involved the Canadian Advanced Network for Astronomy Research (CANFAR) infrastructure and the European EGI federated cloud (EFC). We designed and deployed a pilot data management and computation infrastructure that provides IVOA-compliant VOSpace storage resources and wide access to interoperable federated clouds. In this paper, we detail the main user requirements covered, the technical choices and the implemented solutions and we describe the resulting Hybrid cloud Worldwide infrastructure, its benefits and limitations.

  6. Computing Properties of Hadrons, Nuclei and Nuclear Matter from Quantum Chromodynamics (LQCD)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Negele, John W.

    Building on the success of two preceding generations of Scientific Discovery through Advanced Computing (SciDAC) projects, this grant supported the MIT component (P.I. John Negele) of a multi-institutional SciDAC-3 project that also included Brookhaven National Laboratory, the lead laboratory with P. I. Frithjof Karsch serving as Project Director, Thomas Jefferson National Accelerator Facility with P. I. David Richards serving as Co-director, University of Washington with P. I. Martin Savage, University of North Carolina with P. I. Rob Fowler, and College of William and Mary with P. I. Andreas Stathopoulos. Nationally, this multi-institutional project coordinated the software development effort that the nuclear physics lattice QCD community needs to ensure that lattice calculations can make optimal use of forthcoming leadership-class and dedicated hardware, including that at the national laboratories, and to exploit future computational resources in the Exascale era.

  7. Probabilistic Structural Analysis Theory Development

    NASA Technical Reports Server (NTRS)

    Burnside, O. H.

    1985-01-01

    The objective of the Probabilistic Structural Analysis Methods (PSAM) project is to develop analysis techniques and computer programs for predicting the probabilistic response of critical structural components for current and future space propulsion systems. This technology will play a central role in establishing system performance and durability. The first year's technical activity is concentrating on probabilistic finite element formulation strategy and code development. Work is also in progress to survey critical materials and Space Shuttle main engine components. The probabilistic finite element computer program NESSUS (Numerical Evaluation of Stochastic Structures Under Stress) is being developed. The final probabilistic code will have, in the general case, the capability of performing nonlinear dynamic analysis of stochastic structures. It is the goal of the approximate methods effort to increase problem-solving efficiency relative to finite element methods by using energy methods to generate trial solutions which satisfy the structural boundary conditions. These approximate methods will be less computer intensive relative to the finite element approach.
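
    To illustrate what a probabilistic structural response looks like in the simplest possible setting (a Monte Carlo sketch, not the NESSUS formulation), the Python code below samples a random load and modulus for a cantilever beam and estimates the distribution of its tip deflection; all dimensions and distributions are assumed.

      import numpy as np

      # Monte Carlo estimate of the tip deflection of a cantilever beam,
      # delta = P * L^3 / (3 * E * I), with load P and modulus E treated as random.
      # All distributions and dimensions below are assumed for illustration.
      rng = np.random.default_rng(0)
      n_samples = 100_000

      L = 1.5                                               # beam length, m
      I = 8.0e-6                                            # second moment of area, m^4
      P = rng.normal(10_000.0, 1_500.0, n_samples)          # load, N
      E = rng.lognormal(np.log(2.0e11), 0.05, n_samples)    # Young's modulus, Pa

      delta = P * L**3 / (3.0 * E * I)                      # tip deflection, m

      limit = 8.0e-3                                        # assumed allowable deflection, m
      print(f"mean deflection      : {delta.mean() * 1e3:.2f} mm")
      print(f"99th percentile      : {np.percentile(delta, 99) * 1e3:.2f} mm")
      print(f"P(deflection > {limit * 1e3:.0f} mm): {np.mean(delta > limit):.4f}")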

  8. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Slattery, Stuart R

    ExaMPM is a mini-application for the Material Point Method (MPM) for studying the application of MPM to future exascale computing systems. MPM is a general method for computational mechanics and fluids and is used in a wide variety of science and engineering disciplines to study problems with large deformations, phase change, fracture, and other phenomena. ExaMPM provides a reference implementation of MPM as described in the 1994 work of Sulsky et al. (Sulsky, Deborah, Zhen Chen, and Howard L. Schreyer. "A particle method for history-dependent materials." Computer Methods in Applied Mechanics and Engineering 118.1-2 (1994): 179-196.). The software can solve basic MPM problems in solid mechanics using the original algorithm of Sulsky with explicit time integration, basic geometries, and free-slip and no-slip boundary conditions as described in the reference. ExaMPM is intended to be used as a starting point to design new parallel algorithms for the next generation of DOE supercomputers.

  9. Initial Progress Toward Development of a Voice-Based Computer-Delivered Motivational Intervention for Heavy Drinking College Students: An Experimental Study.

    PubMed

    Kahler, Christopher W; Lechner, William J; MacGlashan, James; Wray, Tyler B; Littman, Michael L

    2017-06-28

    Computer-delivered interventions have been shown to be effective in reducing alcohol consumption in heavy drinking college students. However, these computer-delivered interventions rely on mouse, keyboard, or touchscreen responses for interactions between the users and the computer-delivered intervention. The principles of motivational interviewing suggest that in-person interventions may be effective, in part, because they encourage individuals to think through and speak aloud their motivations for changing a health behavior, which current computer-delivered interventions do not allow. The objective of this study was to take the initial steps toward development of a voice-based computer-delivered intervention that can ask open-ended questions and respond appropriately to users' verbal responses, more closely mirroring a human-delivered motivational intervention. We developed (1) a voice-based computer-delivered intervention that was run by a human controller and that allowed participants to speak their responses to scripted prompts delivered by speech generation software and (2) a text-based computer-delivered intervention that relied on the mouse, keyboard, and computer screen for all interactions. We randomized 60 heavy drinking college students to interact with the voice-based computer-delivered intervention and 30 to interact with the text-based computer-delivered intervention and compared their ratings of the systems as well as their motivation to change drinking and their drinking behavior at 1-month follow-up. Participants reported that the voice-based computer-delivered intervention engaged positively with them in the session and delivered content in a manner consistent with motivational interviewing principles. At 1-month follow-up, participants in the voice-based computer-delivered intervention condition reported significant decreases in quantity, frequency, and problems associated with drinking, and increased perceived importance of changing drinking behaviors. In comparison to the text-based computer-delivered intervention condition, those assigned to voice-based computer-delivered intervention reported significantly fewer alcohol-related problems at the 1-month follow-up (incident rate ratio 0.60, 95% CI 0.44-0.83, P=.002). The conditions did not differ significantly on perceived importance of changing drinking or on measures of drinking quantity and frequency of heavy drinking. Results indicate that it is feasible to construct a series of open-ended questions and a bank of responses and follow-up prompts that can be used in a future fully automated voice-based computer-delivered intervention that may mirror more closely human-delivered motivational interventions to reduce drinking. Such efforts will require using advanced speech recognition capabilities and machine-learning approaches to train a program to mirror the decisions made by human controllers in the voice-based computer-delivered intervention used in this study. In addition, future studies should examine enhancements that can increase the perceived warmth and empathy of voice-based computer-delivered intervention, possibly through greater personalization, improvements in the speech generation software, and embodying the computer-delivered intervention in a physical form. ©Christopher W Kahler, William J Lechner, James MacGlashan, Tyler B Wray, Michael L Littman. Originally published in JMIR Mental Health (http://mental.jmir.org), 28.06.2017.

  10. Effects of computer-based graphic organizers to solve one-step word problems for middle school students with mild intellectual disability: A preliminary study.

    PubMed

    Sheriff, Kelli A; Boon, Richard T

    2014-08-01

    The purpose of this study was to examine the effects of computer-based graphic organizers, using Kidspiration 3© software, to solve one-step word problems. Participants included three students with mild intellectual disability enrolled in a functional academic skills curriculum in a self-contained classroom. A multiple probe single-subject research design (Horner & Baer, 1978) was used to evaluate the effectiveness of computer-based graphic organizers to solving mathematical one-step word problems. During the baseline phase, the students completed a teacher-generated worksheet that consisted of nine functional word problems in a traditional format using a pencil, paper, and a calculator. In the intervention and maintenance phases, the students were instructed to complete the word problems using a computer-based graphic organizer. Results indicated that all three of the students improved in their ability to solve the one-step word problems using computer-based graphic organizers compared to traditional instructional practices. Limitations of the study and recommendations for future research directions are discussed. Copyright © 2014 Elsevier Ltd. All rights reserved.

  11. Computation of Asteroid Proper Elements on the Grid

    NASA Astrophysics Data System (ADS)

    Novakovic, B.; Balaz, A.; Knezevic, Z.; Potocnik, M.

    2009-12-01

    A procedure for gridification of the computation of asteroid proper orbital elements is described. The need to speed up the time-consuming computations and make them more efficient is justified by the large increase in observational data expected from the next generation all-sky surveys. We give the basic notion of proper elements and of the contemporary theories and methods used to compute them for different populations of objects. Proper elements for nearly 70,000 asteroids have been derived since the beginning of the use of the Grid infrastructure for this purpose. The average time for the catalog updates is significantly shortened with respect to the time needed with stand-alone workstations. We also present the basics of Grid computing, the concepts of Grid middleware and its Workload Management System. The practical steps we undertook to efficiently gridify our application are described in full detail. We present the results of a comprehensive testing of the performance of different Grid sites, and offer some practical conclusions based on the benchmark results and on our experience. Finally, we propose some possibilities for future work.

  12. Payload/orbiter contamination control requirement study, volume 2, exhibit A

    NASA Technical Reports Server (NTRS)

    Bareiss, L. E.; Hooper, V. W.; Rantanen, R. O.; Ress, E. B.

    1974-01-01

    The computer printout data generated during the Payload/Orbiter Contamination Control Requirement Study are presented. The computer listings of the input surface data matrices, the viewfactor data matrices, and the geometric relationship data matrices for the three orbiter/spacelab configurations analyzed in this study are given. These configurations have been broken up into the geometrical surfaces and nodes necessary to define the principal critical surfaces whether they are contaminant sources, experimental surfaces, or operational surfaces. A numbering scheme was established based upon nodal numbers that relates the various spacelab surfaces to a specific surface material or function. This numbering system was developed for the spacelab configurations such that future extension to a surface mapping capability could be developed as required.

  13. Whole-genome CNV analysis: advances in computational approaches.

    PubMed

    Pirooznia, Mehdi; Goes, Fernando S; Zandi, Peter P

    2015-01-01

    Accumulating evidence indicates that DNA copy number variation (CNV) is likely to make a significant contribution to human diversity and also play an important role in disease susceptibility. Recent advances in genome sequencing technologies have enabled the characterization of a variety of genomic features, including CNVs. This has led to the development of several bioinformatics approaches to detect CNVs from next-generation sequencing data. Here, we review recent advances in CNV detection from whole genome sequencing. We discuss the informatics approaches and current computational tools that have been developed as well as their strengths and limitations. This review will assist researchers and analysts in choosing the most suitable tools for CNV analysis as well as provide suggestions for new directions in future development.

  14. Online Treatment and Virtual Therapists in Child and Adolescent Psychiatry

    PubMed Central

    Schueller, Stephen M.; Stiles-Shields, Colleen; Yarosh, Lana

    2016-01-01

    Online and virtual therapies are a well-studied and efficacious treatment option for various mental and behavioral health conditions among children and adolescents. That said, many interventions have not considered the unique affordances offered by technologies that might align with the capacities and interests of youth users. In this article, we discuss learnings from child-computer interaction that can inform future generations of interventions and guide developers, practitioners, and researchers in how best to utilize new technologies for youth populations. We highlight issues related to usability and user experience, including challenge and feedback, social interaction, and storytelling. We conclude with innovative examples illustrating future potentials of online and virtual therapies, such as gaming and social networking. PMID:27837935

  15. WOMBAT: A Scalable and High-performance Astrophysical Magnetohydrodynamics Code

    NASA Astrophysics Data System (ADS)

    Mendygral, P. J.; Radcliffe, N.; Kandalla, K.; Porter, D.; O'Neill, B. J.; Nolting, C.; Edmon, P.; Donnert, J. M. F.; Jones, T. W.

    2017-02-01

    We present a new code for astrophysical magnetohydrodynamics specifically designed and optimized for high performance and scaling on modern and future supercomputers. We describe a novel hybrid OpenMP/MPI programming model that emerged from a collaboration between Cray, Inc. and the University of Minnesota. This design utilizes MPI-RMA optimized for thread scaling, which allows the code to run extremely efficiently at very high thread counts ideal for the latest generation of multi-core and many-core architectures. Such performance characteristics are needed in the era of “exascale” computing. We describe and demonstrate our high-performance design in detail with the intent that it may be used as a model for other, future astrophysical codes intended for applications demanding exceptional performance.

  16. A Micro-Level Data-Calibrated Agent-Based Model: The Synergy between Microsimulation and Agent-Based Modeling.

    PubMed

    Singh, Karandeep; Ahn, Chang-Won; Paik, Euihyun; Bae, Jang Won; Lee, Chun-Hee

    2018-01-01

    Artificial life (ALife) examines systems related to natural life, its processes, and its evolution, using simulations with computer models, robotics, and biochemistry. In this article, we focus on the computer modeling, or "soft," aspects of ALife and prepare a framework for scientists and modelers to be able to support such experiments. The framework is designed and built to be a parallel as well as distributed agent-based modeling environment, and does not require end users to have expertise in parallel or distributed computing. Furthermore, we use this framework to implement a hybrid model using microsimulation and agent-based modeling techniques to generate an artificial society. We leverage this artificial society to simulate and analyze population dynamics using Korean population census data. The agents in this model derive their decisional behaviors from real data (microsimulation feature) and interact among themselves (agent-based modeling feature) to proceed in the simulation. The behaviors, interactions, and social scenarios of the agents are varied to perform an analysis of population dynamics. We also estimate the future cost of pension policies based on the future population structure of the artificial society. The proposed framework and model demonstrates how ALife techniques can be used by researchers in relation to social issues and policies.
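
    A minimal sketch of the micro-level, data-calibrated idea is given below in Python: agents are seeded from census-like micro records and advanced year by year with simple transition probabilities. The records, birth rate, and death rates are illustrative placeholders, not the Korean census data or the calibrated model described in the article.

      import random

      # Seed agents from pretend "census" micro records and advance the artificial
      # society one year at a time using assumed, simple transition probabilities.
      random.seed(1)

      population = [{"age": random.randint(0, 90), "employed": random.random() < 0.6}
                    for _ in range(10_000)]

      BIRTH_RATE = 0.012                            # assumed annual births per capita
      DEATH_RATE = {"young": 0.001, "old": 0.05}    # assumed annual death rates by age group

      def step(pop):
          """Advance the artificial society by one year."""
          survivors = []
          for agent in pop:
              rate = DEATH_RATE["old"] if agent["age"] >= 65 else DEATH_RATE["young"]
              if random.random() > rate:
                  agent["age"] += 1
                  survivors.append(agent)
          births = int(BIRTH_RATE * len(survivors))
          survivors.extend({"age": 0, "employed": False} for _ in range(births))
          return survivors

      for year in range(2020, 2031):
          share_65_plus = sum(a["age"] >= 65 for a in population) / len(population)
          print(f"{year}: population = {len(population):6d}, share 65+ = {share_65_plus:.2%}")
          population = step(population)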

  17. Eye-gaze determination of user intent at the computer interface

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Goldberg, J.H.; Schryver, J.C.

    1993-12-31

    Determination of user intent at the computer interface through eye-gaze monitoring can significantly aid applications for the disabled, as well as telerobotics and process control interfaces. Whereas current eye-gaze control applications are limited to object selection and x/y gazepoint tracking, a methodology was developed here to discriminate a more abstract interface operation: zooming-in or out. This methodology first collects samples of eye-gaze location looking at controlled stimuli, at 30 Hz, just prior to a user's decision to zoom. The sample is broken into data frames, or temporal snapshots. Within a data frame, all spatial samples are connected into a minimum spanning tree, then clustered, according to user defined parameters. Each cluster is mapped to one in the prior data frame, and statistics are computed from each cluster. These characteristics include cluster size, position, and pupil size. A multiple discriminant analysis uses these statistics both within and between data frames to formulate optimal rules for assigning the observations into zoom-in, zoom-out, or no zoom conditions. The statistical procedure effectively generates heuristics for future assignments, based upon these variables. Future work will enhance the accuracy and precision of the modeling technique, and will empirically test users in controlled experiments.
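
    The frame-wise clustering step can be sketched as follows (Python, using SciPy): gaze samples in one data frame are connected by a minimum spanning tree, edges longer than an assumed threshold are cut, and the connected components are treated as fixation clusters. The synthetic samples and the threshold are illustrative only, not the parameters used in the study.

      import numpy as np
      from scipy.spatial.distance import pdist, squareform
      from scipy.sparse.csgraph import minimum_spanning_tree, connected_components

      # Synthetic gaze samples for one data frame: two fixation regions in screen pixels.
      rng = np.random.default_rng(42)
      frame = np.vstack([
          rng.normal(loc=(200, 150), scale=8, size=(15, 2)),
          rng.normal(loc=(420, 300), scale=8, size=(15, 2)),
      ])

      # Build the minimum spanning tree over pairwise screen-coordinate distances.
      dist = squareform(pdist(frame))
      mst = minimum_spanning_tree(dist).toarray()

      # Cut MST edges longer than the (assumed) clustering threshold, in pixels.
      THRESHOLD = 40.0
      mst[mst > THRESHOLD] = 0.0

      # Remaining connected components are the clusters for this data frame.
      n_clusters, labels = connected_components((mst + mst.T) > 0, directed=False)
      print("clusters found:", n_clusters)
      for k in range(n_clusters):
          centroid = frame[labels == k].mean(axis=0)
          print(f"  cluster {k}: {np.sum(labels == k):2d} samples, centroid = {centroid.round(1)}")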

  18. POSE Algorithms for Automated Docking

    NASA Technical Reports Server (NTRS)

    Heaton, Andrew F.; Howard, Richard T.

    2011-01-01

    POSE (relative position and attitude) can be computed in many different ways. Given a sensor that measures bearing to a finite number of spots corresponding to known features (such as a target) of a spacecraft, a number of different algorithms can be used to compute the POSE. NASA has sponsored the development of a flash LIDAR proximity sensor called the Vision Navigation Sensor (VNS) for use by the Orion capsule in future docking missions. This sensor generates data that can be used by a variety of algorithms to compute POSE solutions inside of 15 meters, including at the critical docking range of approximately 1-2 meters. Previously NASA participated in a DARPA program called Orbital Express that achieved the first automated docking for the American space program. During this mission a large set of high quality mated sensor data was obtained at what is essentially the docking distance. This data set is perhaps the most accurate truth data in existence for docking proximity sensors in orbit. In this paper, the flight data from Orbital Express is used to test POSE algorithms at 1.22 meters range. Two different POSE algorithms are tested for two different Fields-of-View (FOVs) and two different pixel noise levels. The results of the analysis are used to predict future performance of the POSE algorithms with VNS data.
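
    One classical building block for this kind of computation is sketched below in Python: given the known body-frame positions of target features and their (simulated, noisy) positions in the sensor frame, the relative rotation and translation are recovered with the SVD-based Kabsch method. This is a generic textbook technique shown for illustration, not the flight algorithms evaluated in the paper; the target geometry and noise level are assumed.

      import numpy as np

      def kabsch_pose(model_pts, sensed_pts):
          """Return rotation R and translation t such that sensed ~= R @ model + t."""
          mu_m, mu_s = model_pts.mean(axis=0), sensed_pts.mean(axis=0)
          H = (model_pts - mu_m).T @ (sensed_pts - mu_s)
          U, _, Vt = np.linalg.svd(H)
          d = np.sign(np.linalg.det(Vt.T @ U.T))          # guard against reflections
          R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
          t = mu_s - R @ mu_m
          return R, t

      # Assumed feature layout on the docking target (metres, body frame).
      model = np.array([[0.00, 0.00, 0.00],
                        [0.30, 0.00, 0.00],
                        [0.00, 0.30, 0.00],
                        [0.15, 0.15, 0.10]])

      # Simulate a "true" relative pose at roughly docking range, plus measurement noise.
      yaw = np.deg2rad(5.0)
      R_true = np.array([[np.cos(yaw), -np.sin(yaw), 0.0],
                         [np.sin(yaw),  np.cos(yaw), 0.0],
                         [0.0,          0.0,         1.0]])
      t_true = np.array([0.05, -0.02, 1.22])
      rng = np.random.default_rng(7)
      sensed = (model @ R_true.T) + t_true + rng.normal(0.0, 0.002, model.shape)

      R_est, t_est = kabsch_pose(model, sensed)
      print("estimated range (m):", np.linalg.norm(t_est))
      print("estimated yaw (deg):", np.degrees(np.arctan2(R_est[1, 0], R_est[0, 0])))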

  19. Computational analysis of forebody tangential slot blowing on the high alpha research vehicle

    NASA Technical Reports Server (NTRS)

    Gee, Ken

    1994-01-01

    Current and future fighter aircraft can maneuver in the high-angle-of-attack flight regime while flying at low subsonic and transonic freestream Mach numbers. However, at any flight speed, the ability of the vertical tails to generate yawing moment is limited in high-angle-of-attack flight. Thus, any system designed to provide the pilot with additional side force and yawing moment must work in both low subsonic and transonic flight. However, previous investigations of the effectiveness of forebody tangential slot blowing in generating the desired control forces and moments have been limited to the low subsonic freestream flow regime. In order to investigate the effectiveness of tangential slot blowing in transonic flight, a computational fluid dynamics analysis was carried out during the grant period. Computational solutions were obtained at three different freestream Mach numbers and at various jet mass flow ratios. All results were obtained using the isolated F/A-18 forebody grid geometry at 30.3 degrees angle of attack. One goal of the research was to determine the effect of freestream Mach number on the effectiveness of forebody tangential slot blowing in generating yawing moment. The second part of the research studied the force onset time lag associated with blowing. The time required for the yawing moment to reach a steady-state value from the onset of blowing may have an impact on the implementation of a pneumatic system on a flight vehicle.

  20. Field Instrumentation With Bricks: Wireless Networks Built From Tough, Cheap, Reliable Field Computers

    NASA Astrophysics Data System (ADS)

    Fatland, D. R.; Anandakrishnan, S.; Heavner, M.

    2004-12-01

    We describe tough, cheap, reliable field computers configured as wireless networks for distributed high-volume data acquisition and low-cost data recovery. Running under the GNU/Linux open source model these network nodes ('Bricks') are intended for either autonomous or managed deployment for many months in harsh Arctic conditions. We present here results from Generation-1 Bricks used in 2004 for glacier seismology research in Alaska and Antarctica and describe future generation Bricks in terms of core capabilities and a growing list of field applications. Subsequent generations of Bricks will feature low-power embedded architecture, large data storage capacity (GB), long range telemetry (15 km+ up from 3 km currently), and robust operational software. The list of Brick applications is growing to include Geodetic GPS, Bioacoustics (bats to whales), volcano seismicity, tracking marine fauna, ice sounding via distributed microwave receivers and more. This NASA-supported STTR project capitalizes on advancing computer/wireless technology to get scientists more data per research budget dollar, solving system integration problems and thereby getting researchers out of the hardware lab and into the field. One exemplary scenario: An investigator can install a Brick network in a remote polar environment to collect data for several months and then fly over the site to recover the data via wireless telemetry. In the past year Brick networks have moved beyond proof-of-concept to the full-bore development and testing stage; they will be a mature and powerful tool available for IPY 2007-8.

  1. Microcomputers and the Future.

    ERIC Educational Resources Information Center

    Uhlig, George E.

    Dangers are inherent in predicting the future. In discussing the future of computers, specifically, it is useful to consider the brief history of computers from the development of ENIAC to microcomputers. Advances in computer technology can be seen by looking at changes in individual components, including internal and external memory, the…

  2. Battlefield awareness computers: the engine of battlefield digitization

    NASA Astrophysics Data System (ADS)

    Ho, Jackson; Chamseddine, Ahmad

    1997-06-01

    To modernize the army for the 21st century, the U.S. Army Digitization Office (ADO) initiated in 1995 the Force XXI Battle Command Brigade-and-Below (FBCB2) Applique program, which became a centerpiece in the U.S. Army's master plan to win future information wars. The Applique team led by TRW fielded a 'tactical Internet' for Brigade and below command to demonstrate the advantages of 'shared situation awareness' and battlefield digitization in advanced war-fighting experiments (AWE) to be conducted in March 1997 at the Army's National Training Center in California. Computing Devices is designated the primary hardware developer for the militarized version of the battlefield awareness computers. The first generation of militarized battlefield awareness computer, designated as the V3 computer, was an integration of off-the-shelf components developed to meet the aggressive delivery requirements of the Task Force XXI AWE. The design efficiency and cost effectiveness of the computer hardware were secondary in importance to delivery deadlines imposed by the March 1997 AWE. However, declining defense budgets will impose cost constraints on the Force XXI production hardware that can only be met by rigorous value engineering to further improve design optimization for battlefield awareness without compromising the level of reliability the military has come to expect in modern military hardened vetronics. To answer the Army's needs for a more cost effective computing solution, Computing Devices developed a second generation 'combat ready' battlefield awareness computer, designated the V3+, which is designed specifically to meet the upcoming demands of Force XXI (FBCB2) and beyond. The primary design objective is to achieve a technologically superior design, value engineered to strike an optimal balance between reliability, life cycle cost, and procurement cost. Recognizing that the diverse digitization demands of Force XXI cannot be adequately met by any one computer hardware solution, Computing Devices is planning to develop a notebook sized military computer designed for space limited vehicle-mounted applications, as well as a high-performance portable workstation equipped with a 19-inch, full color, ultra-high resolution and high brightness active matrix liquid crystal display (AMLCD) targeting command post and tactical operations center (TOC) applications. Together with the wearable computers Computing Devices developed at the Minneapolis facility for dismounted soldiers, Computing Devices will have a complete suite of interoperable battlefield awareness computers spanning the entire spectrum of battle digitization operating environments. Although this paper's primary focus is on the second generation 'combat ready' battlefield awareness computer, the V3+, this paper also briefly discusses the extension of the V3+ architecture to address the needs of the embedded and command post applications.

  3. Method based on the Laplace equations to reconstruct the river terrain for two-dimensional hydrodynamic numerical modeling

    NASA Astrophysics Data System (ADS)

    Lai, Ruixun; Wang, Min; Yang, Ming; Zhang, Chao

    2018-02-01

    The accuracy of the widely-used two-dimensional hydrodynamic numerical model depends on the quality of the river terrain model, particularly in the main channel. However, in most cases, the bathymetry of the river channel is difficult or expensive to obtain in the field, and there is a lack of available data to describe the geometry of the river channel. We introduce a method, originating from elliptic grid generation, to generate streamlines of the river channel. The streamlines are numerically solved with the Laplace equations. In the process, streamlines in the physical domain are first computed in a computational domain, and then transformed back to the physical domain. The interpolated streamlines are integrated with the surrounding topography to reconstruct the entire river terrain model. The approach was applied to a meandering reach of the Qinhe River, a tributary in the middle reaches of the Yellow River, China. Cross-sectional validation and the two-dimensional shallow-water equations are used to test the performance of the river terrain generated. The results show that the approach can reconstruct the river terrain using the data from measured cross-sections. Furthermore, the created river terrain can maintain a geometrical shape consistent with the measurements, while generating a smooth main channel. Finally, several limitations and opportunities for future research are discussed.
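
    A minimal sketch of the elliptic (Laplace) smoothing idea is shown below in Python: interior grid coordinates of a channel are obtained by Jacobi iteration of the discrete Laplace equation, with assumed bank and cross-section geometry supplying the boundary values. A real application would use surveyed cross-sections instead of the invented boundary data used here.

      import numpy as np

      # Interior grid coordinates are obtained by solving Laplace's equation, with the
      # boundary values taken from the channel banks and end cross-sections (all invented).
      ni, nj = 41, 21                      # streamwise and cross-stream grid points
      x = np.zeros((ni, nj))
      y = np.zeros((ni, nj))

      s = np.linspace(0.0, 1.0, ni)
      t = np.linspace(0.0, 1.0, nj)

      # Assumed boundary geometry: a gently meandering channel 100 m long, 20 m wide.
      x[:, 0], y[:, 0] = 100.0 * s, 10.0 * np.sin(2 * np.pi * s)           # left bank
      x[:, -1], y[:, -1] = 100.0 * s, 10.0 * np.sin(2 * np.pi * s) + 20.0  # right bank
      x[0, :], y[0, :] = 0.0, 20.0 * t                                      # upstream section
      x[-1, :], y[-1, :] = 100.0, 20.0 * t                                  # downstream section

      # Jacobi iteration for the discrete Laplace equation in the interior.
      for _ in range(2000):
          x[1:-1, 1:-1] = 0.25 * (x[2:, 1:-1] + x[:-2, 1:-1] + x[1:-1, 2:] + x[1:-1, :-2])
          y[1:-1, 1:-1] = 0.25 * (y[2:, 1:-1] + y[:-2, 1:-1] + y[1:-1, 2:] + y[1:-1, :-2])

      # Each interior column of (x, y) can now be read as a smooth streamline between the banks.
      print("interior streamline sample:", list(zip(x[20, ::5].round(1), y[20, ::5].round(1))))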

  4. Computing Literacy in the University of the Future.

    ERIC Educational Resources Information Center

    Gantt, Vernon W.

    In exploring the impact of microcomputers and the future of the university in 1985 and beyond, a distinction should be made between computing literacy--the ability to use a computer--and computer literacy, which goes beyond successful computer use to include knowing how to program in various computer languages and understanding what goes on…

  5. The Bright, Artificial Intelligence-Augmented Future of Neuroimaging Reading

    PubMed Central

    Hainc, Nicolin; Federau, Christian; Stieltjes, Bram; Blatow, Maria; Bink, Andrea; Stippich, Christoph

    2017-01-01

    Radiologists are among the first physicians to be directly affected by advances in computer technology. Computers are already capable of analyzing medical imaging data, and with decades worth of digital information available for training, will an artificial intelligence (AI) one day signal the end of the human radiologist? With ever-increasing workloads and a looming doctor shortage, radiologists will be pushed far beyond their current estimated 3 s allotted time-of-analysis per image; an AI with super-human capabilities might seem like a logical replacement. We feel, however, that AI will lead to an augmentation rather than a replacement of the radiologist. The AI will be relied upon to handle the tedious, time-consuming tasks of detecting and segmenting outliers while possibly generating new, unanticipated results that can then be used as sources of medical discovery. This will affect not only radiologists but all physicians and also researchers dealing with medical imaging. Therefore, we must embrace future technology and collaborate across disciplines to spearhead the next revolution in medicine. PMID:28983278

  6. Highlights of the Workshop

    NASA Technical Reports Server (NTRS)

    Noor, Ahmed K.

    1997-01-01

    Economic stresses are forcing many industries to reduce cost and time-to-market, and to insert emerging technologies into their products. Engineers are asked to design faster, ever more complex systems. Hence, there is a need for novel design paradigms and effective design tools to reduce the design and development times. Several computational tools and facilities have been developed to support the design process. Some of these are described in subsequent presentations. The focus of the workshop is on the computational tools and facilities which have high potential for use in future design environments for aerospace systems. The outline for the introductory remarks is given. First, the characteristics and design drivers for future aerospace systems are outlined; second, the simulation-based design environment and some of its key modules are described; third, the vision for the next-generation design environment being planned by NASA, the UVA ACT Center, and JPL is presented. The anticipated major benefits of the planned environment are listed; fourth, some of the government-supported programs related to simulation-based design are listed; and fifth, the objectives and format of the workshop are presented.

  7. Bacteria as computers making computers

    PubMed Central

    Danchin, Antoine

    2009-01-01

    Various efforts to integrate biological knowledge into networks of interactions have produced a lively microbial systems biology. Putting molecular biology and computer sciences in perspective, we review another trend in systems biology, in which recursivity and information replace the usual concepts of differential equations, feedback and feedforward loops and the like. Noting that the processes of gene expression separate the genome from the cell machinery, we analyse the role of the separation between machine and program in computers. However, computers do not make computers. For cells to make cells requires a specific organization of the genetic program, which we investigate using available knowledge. Microbial genomes are organized into a paleome (the name emphasizes the role of the corresponding functions from the time of the origin of life), comprising a constructor and a replicator, and a cenome (emphasizing community-relevant genes), made up of genes that permit life in a particular context. The cell duplication process supposes rejuvenation of the machine and replication of the program. The paleome also possesses genes that enable information to accumulate in a ratchet-like process down the generations. Systems biology must include the dynamics of information creation in its future developments. PMID:19016882

  8. Bacteria as computers making computers.

    PubMed

    Danchin, Antoine

    2009-01-01

    Various efforts to integrate biological knowledge into networks of interactions have produced a lively microbial systems biology. Putting molecular biology and computer sciences in perspective, we review another trend in systems biology, in which recursivity and information replace the usual concepts of differential equations, feedback and feedforward loops and the like. Noting that the processes of gene expression separate the genome from the cell machinery, we analyse the role of the separation between machine and program in computers. However, computers do not make computers. For cells to make cells requires a specific organization of the genetic program, which we investigate using available knowledge. Microbial genomes are organized into a paleome (the name emphasizes the role of the corresponding functions from the time of the origin of life), comprising a constructor and a replicator, and a cenome (emphasizing community-relevant genes), made up of genes that permit life in a particular context. The cell duplication process supposes rejuvenation of the machine and replication of the program. The paleome also possesses genes that enable information to accumulate in a ratchet-like process down the generations. Systems biology must include the dynamics of information creation in its future developments.

  9. SU-E-T-628: A Cloud Computing Based Multi-Objective Optimization Method for Inverse Treatment Planning.

    PubMed

    Na, Y; Suh, T; Xing, L

    2012-06-01

    Multi-objective (MO) plan optimization entails generation of an enormous number of IMRT or VMAT plans constituting the Pareto surface, which presents a computationally challenging task. The purpose of this work is to overcome the hurdle by developing an efficient MO method using the emerging cloud computing platform. As the backbone of cloud computing for optimizing inverse treatment planning, Amazon Elastic Compute Cloud with a master node (17.1 GB memory, 2 virtual cores, 420 GB instance storage, 64-bit platform) is used. The master node is able to scale seamlessly a number of working-group instances, called workers, based on user-defined settings that account for MO functions in the clinical setting. Each worker solves the objective function with an efficient sparse decomposition method. Workers are automatically terminated when their tasks are finished. The optimized plans are archived to the master node to generate the Pareto solution set. Three clinical cases have been planned using the developed MO IMRT and VMAT planning tools to demonstrate the advantages of the proposed method. The target dose coverage and critical structure sparing of plans obtained using the cloud computing platform are identical to those obtained using a desktop PC (Intel Xeon® CPU 2.33GHz, 8GB memory). It is found that the MO planning substantially speeds up the generation of the Pareto set for both types of plans. The speedup scales approximately linearly with the number of nodes used for computing. With the use of N nodes, the computational time is reduced according to the fitted model 0.2+2.3/N, with r^2>0.99, on average across the cases, making real-time MO planning possible. A cloud computing infrastructure is developed for MO optimization. The algorithm substantially improves the speed of inverse plan optimization. The platform is valuable for both MO planning and future off- or on-line adaptive re-planning. © 2012 American Association of Physicists in Medicine.
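
    The reported scaling, computational time falling roughly as 0.2+2.3/N with the number of nodes N, can be reproduced in form by a simple least-squares fit of T(N) = a + b/N to measured run times. A minimal sketch with made-up timing numbers (not the study's data):

      import numpy as np

      # Hypothetical wall-clock times (minutes) measured for different node counts.
      nodes = np.array([1, 2, 4, 8, 16, 32], dtype=float)
      times = np.array([2.55, 1.32, 0.78, 0.49, 0.35, 0.27])

      # Fit T(N) = a + b / N by ordinary least squares: design-matrix columns [1, 1/N].
      A = np.column_stack([np.ones_like(nodes), 1.0 / nodes])
      (a, b), _, _, _ = np.linalg.lstsq(A, times, rcond=None)

      pred = A @ np.array([a, b])
      ss_res = np.sum((times - pred) ** 2)
      ss_tot = np.sum((times - times.mean()) ** 2)
      r2 = 1.0 - ss_res / ss_tot

      print(f"T(N) ~ {a:.2f} + {b:.2f}/N, r^2 = {r2:.3f}")  # abstract reports ~0.2 + 2.3/N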

  10. Computational aerodynamics development and outlook /Dryden Lecture in Research for 1979/

    NASA Technical Reports Server (NTRS)

    Chapman, D. R.

    1979-01-01

    Some past developments and current examples of computational aerodynamics are briefly reviewed. An assessment is made of the requirements on future computer memory and speed imposed by advanced numerical simulations, giving emphasis to the Reynolds averaged Navier-Stokes equations and to turbulent eddy simulations. Experimental scales of turbulence structure are used to determine the mesh spacings required to adequately resolve turbulent energy and shear. Assessment also is made of the changing market environment for developing future large computers, and of the projections of micro-electronics memory and logic technology that affect future computer capability. From the two assessments, estimates are formed of the future time scale in which various advanced types of aerodynamic flow simulations could become feasible. Areas of research judged especially relevant to future developments are noted.

  11. Harnessing the uncertainty monster: Putting quantitative constraints on the intergenerational social discount rate

    NASA Astrophysics Data System (ADS)

    Lewandowsky, Stephan; Freeman, Mark C.; Mann, Michael E.

    2017-09-01

    There is broad consensus among economists that unmitigated climate change will ultimately have adverse global economic consequences, that the costs of inaction will likely outweigh the cost of taking action, and that social planners should therefore put a price on carbon. However, there is considerable debate and uncertainty about the appropriate value of the social discount rate, that is, the extent to which future damages should be discounted relative to mitigation costs incurred now. We briefly review the ethical issues surrounding the social discount rate and then report a simulation experiment that constrains the value of the discount rate by considering four sources of uncertainty and ambiguity: scientific uncertainty about the extent of future warming, social uncertainty about future population and future economic development, political uncertainty about future mitigation trajectories, and ethical ambiguity about how much the welfare of future generations should be valued today. We compute a certainty-equivalent declining discount rate that accommodates all those sources of uncertainty and ambiguity. The forward (instantaneous) discount rate converges to a value near 0% by century's end, and the spot (horizon) discount rate drops below 2% by 2100 and below previous estimates by 2070.
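
    The certainty-equivalent declining discount rate follows the standard construction of averaging discount factors, rather than rates, across uncertain scenarios and then recovering spot and forward rates from the averaged factor. A minimal sketch with assumed scenario rates and weights (the paper's uncertainty sources are far richer):

      import numpy as np

      # Hypothetical scenarios for the constant discount rate, with equal weights.
      rates = np.array([0.01, 0.02, 0.04, 0.07])    # 1%..7% per year
      weights = np.full(len(rates), 0.25)

      t = np.arange(1, 101)                         # horizon in years, 1..100

      # Certainty-equivalent discount factor: probability-weighted average of e^{-r t}.
      D = (weights[:, None] * np.exp(-rates[:, None] * t)).sum(axis=0)

      spot = -np.log(D) / t                         # spot (horizon) rate
      forward = -np.diff(np.log(D)) / np.diff(t)    # forward (instantaneous) rate, discretized

      print(f"spot rate at t=100: {spot[-1]:.4f}")
      print(f"forward rate near t=100: {forward[-1]:.4f}")  # tends toward the lowest scenario rate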

  12. Development of the Symbolic Manipulator Laboratory modeling package for the kinematic design and optimization of the Future Armor Rearm System robot. Ammunition Logistics Program

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    March-Leuba, S.; Jansen, J.F.; Kress, R.L.

    1992-08-01

    A new program package, Symbolic Manipulator Laboratory (SML), for the automatic generation of both kinematic and static manipulator models in symbolic form is presented. Critical design parameters may be identified and optimized using symbolic models, as shown in the sample application presented for the Future Armor Rearm System (FARS) arm. The computer-aided development of the symbolic models yields equations with reduced numerical complexity. Emphasis has been placed on simplification of the closed-form solutions and on user-friendly operation. The main emphasis of this research is the development of a methodology, implemented in a computer program, capable of generating symbolic kinematic and static force models of manipulators. The fact that the models are obtained trigonometrically reduced is among the most significant results of this work and the most difficult to implement. Mathematica, a commercial program that allows symbolic manipulation, is used to implement the program package. SML is written such that the user can change any of the subroutines or create new ones easily. To assist the user, on-line help has been written to make SML a user-friendly package. Some sample applications are presented. The design and optimization of the 5-degrees-of-freedom (DOF) FARS manipulator using SML is discussed. Finally, the kinematic and static models of two different 7-DOF manipulators are calculated symbolically.
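
    SML itself is built on Mathematica, but the idea of automatically generating trigonometrically simplified symbolic kinematics can be illustrated with any computer-algebra package. A rough Python/SymPy sketch for a planar 2-DOF arm follows (not the FARS model; the link lengths and joint variables are assumptions):

      import sympy as sp

      # Joint angles and link lengths as symbols.
      q1, q2, l1, l2 = sp.symbols('q1 q2 l1 l2', real=True)

      def link_transform(theta, length):
          """Planar homogeneous transform: rotate by theta, translate along the rotated x-axis."""
          return sp.Matrix([[sp.cos(theta), -sp.sin(theta), length * sp.cos(theta)],
                            [sp.sin(theta),  sp.cos(theta), length * sp.sin(theta)],
                            [0, 0, 1]])

      # Forward kinematics of a planar 2-link arm: chain the link transforms.
      T = link_transform(q1, l1) * link_transform(q2, l2)
      x = sp.trigsimp(T[0, 2])     # l1*cos(q1) + l2*cos(q1 + q2)
      y = sp.trigsimp(T[1, 2])     # l1*sin(q1) + l2*sin(q1 + q2)
      print(x, y)

      # The symbolic Jacobian used for static force models follows directly.
      J = sp.Matrix([x, y]).jacobian([q1, q2]).applyfunc(sp.trigsimp)
      print(J)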

  13. From prediction error to incentive salience: mesolimbic computation of reward motivation

    PubMed Central

    Berridge, Kent C.

    2011-01-01

    Reward contains separable psychological components of learning, incentive motivation and pleasure. Most computational models have focused only on the learning component of reward, but the motivational component is equally important in reward circuitry, and even more directly controls behavior. Modeling the motivational component requires recognition of additional control factors besides learning. Here I will discuss how mesocorticolimbic mechanisms generate the motivation component of incentive salience. Incentive salience takes Pavlovian learning and memory as one input and as an equally important input takes neurobiological state factors (e.g., drug states, appetite states, satiety states) that can vary independently of learning. Neurobiological state changes can produce unlearned fluctuations or even reversals in the ability of a previously-learned reward cue to trigger motivation. Such fluctuations in cue-triggered motivation can dramatically depart from all previously learned values about the associated reward outcome. Thus a consequence of the difference between incentive salience and learning can be to decouple cue-triggered motivation of the moment from previously learned values of how good the associated reward has been in the past. Another consequence can be to produce irrationally strong motivation urges that are not justified by any memories of previous reward values (and without distorting associative predictions of future reward value). Such irrationally strong motivation may be especially problematic in addiction. To comprehend these phenomena, future models of mesocorticolimbic reward function should address the neurobiological state factors that participate to control generation of incentive salience. PMID:22487042

  14. An Operationally Based Vision Assessment Simulator for Domes

    NASA Technical Reports Server (NTRS)

    Archdeacon, John; Gaska, James; Timoner, Samson

    2012-01-01

    The Operational Based Vision Assessment (OBVA) simulator was designed and built by NASA and the United States Air Force (USAF) to provide the Air Force School of Aerospace Medicine (USAFSAM) with a scientific testing laboratory to study human vision and testing standards in an operationally relevant environment. This paper describes the general design objectives and implementation characteristics of the simulator visual system being created to meet these requirements. A key design objective for the OBVA research simulator is to develop a real-time computer image generator (IG) and display subsystem that can display and update at 120 frames per second (design target), or at a minimum, 60 frames per second, with minimal transport delay using commercial off-the-shelf (COTS) technology. There are three key parts of the OBVA simulator that are described in this paper: i) the real-time computer image generator, ii) the various COTS technology used to construct the simulator, and iii) the spherical dome display and real-time distortion correction subsystem. We describe the various issues, possible COTS solutions, and remaining problem areas identified by NASA and the USAF while designing and building the simulator for future vision research. We also describe the critically important relationship of the physical display components including distortion correction for the dome consistent with an objective of minimizing latency in the system. The performance of the automatic calibration system used in the dome is also described. Various recommendations for possible future implementations shall also be discussed.

  15. Nonlinear metamaterials for holography

    PubMed Central

    Almeida, Euclides; Bitton, Ora

    2016-01-01

    A hologram is an optical element storing phase and possibly amplitude information enabling the reconstruction of a three-dimensional image of an object by illumination and scattering of a coherent beam of light, and the image is generated at the same wavelength as the input laser beam. In recent years, it was shown that information can be stored in nanometric antennas giving rise to ultrathin components. Here we demonstrate nonlinear multilayer metamaterial holograms. A background free image is formed at a new frequency—the third harmonic of the illuminating beam. Using e-beam lithography of multilayer plasmonic nanoantennas, we fabricate polarization-sensitive nonlinear elements such as blazed gratings, lenses and other computer-generated holograms. These holograms are analysed and prospects for future device applications are discussed. PMID:27545581

  16. Calculating Reuse Distance from Source Code

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Narayanan, Sri Hari Krishna; Hovland, Paul

    The efficient use of a system is of paramount importance in high-performance computing. Applications need to be engineered for future systems even before the architecture of such a system is clearly known. Static performance analysis that generates performance bounds is one way to approach the task of understanding application behavior. Performance bounds provide an upper limit on the performance of an application on a given architecture. Predicting cache hierarchy behavior and accesses to main memory is a requirement for accurate performance bounds. This work presents our static reuse distance algorithm to generate reuse distance histograms. We then use these histograms to predict cache miss rates. Experimental results for the kernels studied show that the approach is accurate.
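
    Reuse distance (LRU stack distance) is the number of distinct addresses touched between two accesses to the same address; the record derives these statically from source code, but the definition itself is easiest to state on a trace. A small dynamic sketch for an assumed address trace, for orientation only:

      from collections import OrderedDict, Counter
      import math

      def reuse_distance_histogram(trace):
          """Histogram of LRU stack distances; first-time accesses count as infinite distance."""
          stack = OrderedDict()          # most recently used address is last
          hist = Counter()
          for addr in trace:
              if addr in stack:
                  # Distance = number of distinct addresses accessed since addr's last use.
                  keys = list(stack.keys())
                  dist = len(keys) - 1 - keys.index(addr)
                  del stack[addr]
              else:
                  dist = math.inf        # cold access
              hist[dist] += 1
              stack[addr] = True         # reinsert as most recently used
          return hist

      trace = ['a', 'b', 'c', 'a', 'b', 'd', 'a']
      print(reuse_distance_histogram(trace))
      # A fully associative LRU cache of capacity C hits exactly the accesses with distance < C.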

  17. Enabling the environmentally clean air transportation of the future: a vision of computational fluid dynamics in 2030.

    PubMed

    Slotnick, Jeffrey P; Khodadoust, Abdollah; Alonso, Juan J; Darmofal, David L; Gropp, William D; Lurie, Elizabeth A; Mavriplis, Dimitri J; Venkatakrishnan, Venkat

    2014-08-13

    As global air travel expands rapidly to meet demand generated by economic growth, it is essential to continue to improve the efficiency of air transportation to reduce its carbon emissions and address concerns about climate change. Future transports must be 'cleaner' and designed to include technologies that will continue to lower engine emissions and reduce community noise. The use of computational fluid dynamics (CFD) will be critical to enable the design of these new concepts. In general, the ability to simulate aerodynamic and reactive flows using CFD has progressed rapidly during the past several decades and has fundamentally changed the aerospace design process. Advanced simulation capabilities not only enable reductions in ground-based and flight-testing requirements, but also provide added physical insight, and enable superior designs at reduced cost and risk. In spite of considerable success, reliable use of CFD has remained confined to a small region of the operating envelope due, in part, to the inability of current methods to reliably predict turbulent, separated flows. Fortunately, the advent of much more powerful computing platforms provides an opportunity to overcome a number of these challenges. This paper summarizes the findings and recommendations from a recent NASA-funded study that provides a vision for CFD in the year 2030, including an assessment of critical technology gaps and needed development, and identifies the key CFD technology advancements that will enable the design and development of much cleaner aircraft in the future. © 2014 The Author(s) Published by the Royal Society. All rights reserved.

  18. GLAS Spacecraft Pointing Study

    NASA Technical Reports Server (NTRS)

    Born, George H.; Gold, Kenn; Ondrey, Michael; Kubitschek, Dan; Axelrad, Penina; Komjathy, Attila

    1998-01-01

    Science requirements for the GLAS mission demand that the laser altimeter be pointed to within 50 m of the location of the previous repeat ground track. The satellite will be flown in a repeat orbit of 182 days. Operationally, the required pointing information will be determined on the ground using the nominal ground track, to which pointing is desired, and the current propagated orbit of the satellite as inputs to the roll computation algorithm developed by CCAR. The roll profile will be used to generate a set of fit coefficients which can be uploaded on a daily basis and used by the on-board attitude control system. In addition, an algorithm has been developed for computation of the associated command quaternions which will be necessary when pointing at targets of opportunity. It may be desirable in the future to perform the roll calculation in an autonomous real-time mode on-board the spacecraft. GPS can provide near real-time tracking of the satellite, and the nominal ground track can be stored in the on-board computer. It will be necessary to choose the spacing of this nominal ground track to meet storage requirements in the on-board environment. Several methods for generating the roll profile from a sparse reference ground track are presented.
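
    The core geometric step, rolling the boresight so the altimeter footprint lands on the reference ground track, reduces in a flat-Earth approximation to taking the arctangent of the cross-track offset over the spacecraft altitude. The sketch below uses assumed numbers and is not the CCAR algorithm, which works with the propagated orbit and the stored nominal ground track:

      import math

      def required_roll(cross_track_offset_m, altitude_m):
          """Roll angle (degrees) to shift the nadir footprint by a small cross-track offset.

          Flat-Earth small-angle approximation; the operational algorithm instead uses the
          propagated orbit and the reference ground track to build the full roll profile.
          """
          return math.degrees(math.atan2(cross_track_offset_m, altitude_m))

      # GLAS-like numbers (assumed): ~600 km altitude, 50 m pointing requirement.
      print(required_roll(50.0, 600e3))    # ~0.005 deg, i.e. tens of microradians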

  19. Real-time stylistic prediction for whole-body human motions.

    PubMed

    Matsubara, Takamitsu; Hyon, Sang-Ho; Morimoto, Jun

    2012-01-01

    The ability to predict human motion is crucial in several contexts such as human tracking by computer vision and the synthesis of human-like computer graphics. Previous work has focused on off-line processes with well-segmented data; however, many applications such as robotics require real-time control with efficient computation. In this paper, we propose a novel approach called real-time stylistic prediction for whole-body human motions to satisfy these requirements. This approach uses a novel generative model to represent a whole-body human motion including rhythmic motion (e.g., walking) and discrete motion (e.g., jumping). The generative model is composed of a low-dimensional state (phase) dynamics and a two-factor observation model, allowing it to capture the diversity of motion styles in humans. A real-time adaptation algorithm was derived to estimate both state variables and style parameter of the model from non-stationary unlabeled sequential observations. Moreover, with a simple modification, the algorithm allows real-time adaptation even from incomplete (partial) observations. Based on the estimated state and style, a future motion sequence can be accurately predicted. In our implementation, it takes less than 15 ms for both adaptation and prediction at each observation. Our real-time stylistic prediction was evaluated for human walking, running, and jumping behaviors. Copyright © 2011 Elsevier Ltd. All rights reserved.

  20. Research prioritization through prediction of future impact on biomedical science: a position paper on inference-analytics.

    PubMed

    Ganapathiraju, Madhavi K; Orii, Naoki

    2013-08-30

    Advances in biotechnology have created "big-data" situations in molecular and cellular biology. Several sophisticated algorithms have been developed that process big data to generate hundreds of biomedical hypotheses (or predictions). The bottleneck to translating this large number of biological hypotheses is that each of them needs to be studied by experimentation for interpreting its functional significance. Even when the predictions are estimated to be very accurate, from a biologist's perspective, the choice of which of these predictions is to be studied further is made based on factors like availability of reagents and resources and the possibility of formulating some reasonable hypothesis about its biological relevance. When viewed from a global perspective, say from that of a federal funding agency, ideally the choice of which prediction should be studied would be made based on which of them can make the most translational impact. We propose that algorithms be developed to identify which of the computationally generated hypotheses have potential for high translational impact; this way, funding agencies and scientific community can invest resources and drive the research based on a global view of biomedical impact without being deterred by local view of feasibility. In short, data-analytic algorithms analyze big-data and generate hypotheses; in contrast, the proposed inference-analytic algorithms analyze these hypotheses and rank them by predicted biological impact. We demonstrate this through the development of an algorithm to predict biomedical impact of protein-protein interactions (PPIs) which is estimated by the number of future publications that cite the paper which originally reported the PPI. This position paper describes a new computational problem that is relevant in the era of big-data and discusses the challenges that exist in studying this problem, highlighting the need for the scientific community to engage in this line of research. The proposed class of algorithms, namely inference-analytic algorithms, is necessary to ensure that resources are invested in translating those computational outcomes that promise maximum biological impact. Application of this concept to predict biomedical impact of PPIs illustrates not only the concept, but also the challenges in designing these algorithms.

  1. High-Efficiency High-Resolution Global Model Developments at the NASA Goddard Data Assimilation Office

    NASA Technical Reports Server (NTRS)

    Lin, Shian-Jiann; Atlas, Robert (Technical Monitor)

    2002-01-01

    The Data Assimilation Office (DAO) has been developing a new generation of ultra-high resolution General Circulation Model (GCM) that is suitable for 4-D data assimilation, numerical weather predictions, and climate simulations. These three applications have conflicting requirements. For 4-D data assimilation and weather predictions, it is highly desirable to run the model at the highest possible spatial resolution (e.g., 55 km or finer) so as to be able to resolve and predict socially and economically important weather phenomena such as tropical cyclones, hurricanes, and severe winter storms. For climate change applications, the model simulations need to be carried out for decades, if not centuries. To reduce uncertainty in climate change assessments, the next generation model would also need to be run at a fine enough spatial resolution that can at least marginally simulate the effects of intense tropical cyclones. Scientific problems (e.g., parameterization of subgrid scale moist processes) aside, all three areas of application require the model's computational performance to be dramatically improved as compared to the previous generation. In this talk, I will present the current and future developments of the "finite-volume dynamical core" at the Data Assimilation Office. This dynamical core applies modern monotonicity-preserving algorithms and is genuinely conservative by construction, not by an ad hoc fixer. The "discretization" of the conservation laws is purely local, which is clearly advantageous for resolving sharp gradient flow features. In addition, the local nature of the finite-volume discretization also has a significant advantage on distributed memory parallel computers. Together with a unique vertically Lagrangian control volume discretization that essentially reduces the dimension of the computational problem from three to two, the finite-volume dynamical core is very efficient, particularly at high resolutions. I will also present the computational design of the dynamical core using a hybrid distributed-shared memory programming paradigm that is portable to virtually any of today's high-end parallel super-computing clusters.

  2. High-Efficiency High-Resolution Global Model Developments at the NASA Goddard Data Assimilation Office

    NASA Technical Reports Server (NTRS)

    Lin, Shian-Jiann; Atlas, Robert (Technical Monitor)

    2002-01-01

    The Data Assimilation Office (DAO) has been developing a new generation of ultra-high resolution General Circulation Model (GCM) that is suitable for 4-D data assimilation, numerical weather predictions, and climate simulations. These three applications have conflicting requirements. For 4-D data assimilation and weather predictions, it is highly desirable to run the model at the highest possible spatial resolution (e.g., 55 km or finer) so as to be able to resolve and predict socially and economically important weather phenomena such as tropical cyclones, hurricanes, and severe winter storms. For climate change applications, the model simulations need to be carried out for decades, if not centuries. To reduce uncertainty in climate change assessments, the next generation model would also need to be run at a fine enough spatial resolution that can at least marginally simulate the effects of intense tropical cyclones. Scientific problems (e.g., parameterization of subgrid scale moist processes) aside, all three areas of application require the model's computational performance to be dramatically improved as compared to the previous generation. In this talk, I will present the current and future developments of the "finite-volume dynamical core" at the Data Assimilation Office. This dynamical core applies modern monotonicity-preserving algorithms and is genuinely conservative by construction, not by an ad hoc fixer. The "discretization" of the conservation laws is purely local, which is clearly advantageous for resolving sharp gradient flow features. In addition, the local nature of the finite-volume discretization also has a significant advantage on distributed memory parallel computers. Together with a unique vertically Lagrangian control volume discretization that essentially reduces the dimension of the computational problem from three to two, the finite-volume dynamical core is very efficient, particularly at high resolutions. I will also present the computational design of the dynamical core using a hybrid distributed-shared memory programming paradigm that is portable to virtually any of today's high-end parallel super-computing clusters.

  3. Computational Approaches to Simulation and Optimization of Global Aircraft Trajectories

    NASA Technical Reports Server (NTRS)

    Ng, Hok Kwan; Sridhar, Banavar

    2016-01-01

    This study examines three possible approaches to improving the speed in generating wind-optimal routes for air traffic at the national or global level: (a) using the resources of a supercomputer, (b) running the computations on multiple commercially available computers, and (c) implementing the same algorithms in NASA's Future ATM Concepts Evaluation Tool (FACET); each is compared with a standard implementation run on a single CPU. Wind-optimal aircraft trajectories are computed using global air traffic schedules. The run time and wait time on the supercomputer for trajectory optimization using various numbers of CPUs, ranging from 80 to 10,240 units, are compared with the total computational time for running the same computation on a single desktop computer and on multiple commercially available computers to assess the potential computational enhancement from parallel processing on computer clusters. This study also re-implements the trajectory optimization algorithm for further reduction of computational time through algorithm modifications and integrates it with FACET so that time-optimal routes between worldwide airport pairs in a wind field can be calculated for use with existing FACET applications. The implementations of the trajectory optimization algorithms use the MATLAB, Python, and Java programming languages. The performance evaluations compare their computational efficiencies and consider the potential applications of the optimized trajectories. The paper shows that, in the absence of special privileges on a supercomputer, a cluster of commercially available computers provides a feasible approach for national and global air traffic system studies.
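
    The cluster-of-commodity-computers option amounts to distributing independent airport-pair optimizations across worker processes. A schematic sketch using Python's standard multiprocessing module; the cost function is a great-circle placeholder, not the wind-optimal solver integrated with FACET:

      from multiprocessing import Pool
      from itertools import combinations
      import math

      AIRPORTS = {            # hypothetical lat/lon pairs (degrees)
          'SFO': (37.62, -122.38), 'JFK': (40.64, -73.78),
          'LHR': (51.47, -0.45),   'NRT': (35.77, 140.39),
      }

      def optimize_route(pair):
          """Placeholder for a wind-optimal trajectory optimization for one airport pair."""
          (a, (lat1, lon1)), (b, (lat2, lon2)) = pair
          # Stand-in cost: great-circle distance; a real solver would integrate wind fields.
          p1, p2 = math.radians(lat1), math.radians(lat2)
          dlon = math.radians(lon2 - lon1)
          angle = math.acos(math.sin(p1) * math.sin(p2) +
                            math.cos(p1) * math.cos(p2) * math.cos(dlon))
          return (a, b, angle * 6371.0)    # km on a spherical Earth

      if __name__ == '__main__':
          pairs = list(combinations(AIRPORTS.items(), 2))
          with Pool() as pool:             # each airport pair is an independent task
              for a, b, dist in pool.map(optimize_route, pairs):
                  print(f"{a}-{b}: {dist:.0f} km")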

  4. Decomposition Technique for Remaining Useful Life Prediction

    NASA Technical Reports Server (NTRS)

    Saha, Bhaskar (Inventor); Goebel, Kai F. (Inventor); Saxena, Abhinav (Inventor); Celaya, Jose R. (Inventor)

    2014-01-01

    The prognostic tool disclosed here decomposes the problem of estimating the remaining useful life (RUL) of a component or sub-system into two separate regression problems: the feature-to-damage mapping and the operational conditions-to-damage-rate mapping. These maps are initially generated in off-line mode. One or more regression algorithms are used to generate each of these maps from measurements (and features derived from these), operational conditions, and ground truth information. This decomposition technique allows for the explicit quantification and management of different sources of uncertainty present in the process. Next, the maps are used in an on-line mode where run-time data (sensor measurements and operational conditions) are used in conjunction with the maps generated in off-line mode to estimate both current damage state as well as future damage accumulation. Remaining life is computed by subtracting the instance when the extrapolated damage reaches the failure threshold from the instance when the prediction is made.
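
    The decomposition can be mocked up with two small regressions, one from features to damage and one from operating conditions to damage rate, with remaining life obtained by extrapolating accumulated damage to a failure threshold. A toy sketch on synthetic data (model forms, threshold, and data are assumptions, not the patented method's details):

      import numpy as np

      rng = np.random.default_rng(0)

      # --- Off-line: fit the two maps from synthetic "ground truth" data ---------------
      features = rng.uniform(0, 1, 200)
      damage = 0.8 * features + 0.05 * rng.normal(size=200)                   # feature -> damage
      conditions = rng.uniform(0, 1, 200)
      damage_rate = 0.02 + 0.03 * conditions + 0.002 * rng.normal(size=200)   # condition -> rate

      map_f2d = np.polyfit(features, damage, 1)          # linear feature-to-damage map
      map_c2r = np.polyfit(conditions, damage_rate, 1)   # linear conditions-to-rate map

      # --- On-line: estimate current damage, then extrapolate to the threshold ---------
      FAILURE_THRESHOLD = 1.0
      current_feature = 0.55            # run-time sensor-derived feature (assumed)
      future_condition = 0.40           # assumed future operating condition

      current_damage = np.polyval(map_f2d, current_feature)
      rate = np.polyval(map_c2r, future_condition)

      rul = max(0.0, (FAILURE_THRESHOLD - current_damage) / rate)
      print(f"estimated damage {current_damage:.2f}, RUL ~ {rul:.1f} time units")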

  5. De Novo Design of Bioactive Small Molecules by Artificial Intelligence

    PubMed Central

    Merk, Daniel; Friedrich, Lukas; Grisoni, Francesca

    2018-01-01

    Abstract Generative artificial intelligence offers a fresh view on molecular design. We present the first‐time prospective application of a deep learning model for designing new druglike compounds with desired activities. For this purpose, we trained a recurrent neural network to capture the constitution of a large set of known bioactive compounds represented as SMILES strings. By transfer learning, this general model was fine‐tuned on recognizing retinoid X and peroxisome proliferator‐activated receptor agonists. We synthesized five top‐ranking compounds designed by the generative model. Four of the compounds revealed nanomolar to low‐micromolar receptor modulatory activity in cell‐based assays. Apparently, the computational model intrinsically captured relevant chemical and biological knowledge without the need for explicit rules. The results of this study advocate generative artificial intelligence for prospective de novo molecular design, and demonstrate the potential of these methods for future medicinal chemistry. PMID:29319225
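
    The workflow described here, pretraining a character-level recurrent network on SMILES strings and then fine-tuning it on a small focused set before sampling new strings, can be outlined compactly. The sketch below is a schematic PyTorch outline with a toy vocabulary and toy data, not the authors' model or hyperparameters:

      import torch
      import torch.nn as nn

      # Toy SMILES "corpus"; real training would use hundreds of thousands of strings.
      smiles = ["CCO", "c1ccccc1", "CC(=O)O", "CCN"]
      chars = sorted(set("".join(smiles)) | {"^", "$"})     # ^ start token, $ end token
      stoi = {c: i for i, c in enumerate(chars)}

      def encode(s):
          return torch.tensor([stoi[c] for c in "^" + s + "$"])

      class SmilesRNN(nn.Module):
          def __init__(self, vocab, emb=32, hidden=64):
              super().__init__()
              self.embed = nn.Embedding(vocab, emb)
              self.gru = nn.GRU(emb, hidden, batch_first=True)
              self.head = nn.Linear(hidden, vocab)
          def forward(self, x, h=None):
              out, h = self.gru(self.embed(x), h)
              return self.head(out), h

      def train(model, data, epochs, lr):
          opt = torch.optim.Adam(model.parameters(), lr=lr)
          loss_fn = nn.CrossEntropyLoss()
          for _ in range(epochs):
              for s in data:
                  seq = encode(s).unsqueeze(0)              # shape (1, T)
                  logits, _ = model(seq[:, :-1])            # predict the next character
                  loss = loss_fn(logits.squeeze(0), seq[0, 1:])
                  opt.zero_grad(); loss.backward(); opt.step()

      model = SmilesRNN(len(chars))
      train(model, smiles, epochs=50, lr=1e-3)              # "pretraining" on the general corpus
      focused = ["CCO", "CCN"]                              # stand-in for a set of known actives
      train(model, focused, epochs=20, lr=1e-4)             # transfer learning: smaller learning rate

      # Sampling: feed the start token and draw characters until the end token appears.
      with torch.no_grad():
          x, h, out = torch.tensor([[stoi["^"]]]), None, []
          for _ in range(40):
              logits, h = model(x, h)
              nxt = torch.multinomial(torch.softmax(logits[0, -1], dim=-1), 1)
              if nxt.item() == stoi["$"]:
                  break
              out.append(chars[nxt.item()]); x = nxt.view(1, 1)
          print("".join(out))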

  6. 17 CFR 1.18 - Records for and relating to financial reporting and monthly computation by futures commission...

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... financial reporting and monthly computation by futures commission merchants and introducing brokers. 1.18... UNDER THE COMMODITY EXCHANGE ACT Minimum Financial and Related Reporting Requirements § 1.18 Records for and relating to financial reporting and monthly computation by futures commission merchants and...

  7. 17 CFR 1.18 - Records for and relating to financial reporting and monthly computation by futures commission...

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... financial reporting and monthly computation by futures commission merchants and introducing brokers. 1.18... UNDER THE COMMODITY EXCHANGE ACT Minimum Financial and Related Reporting Requirements § 1.18 Records for and relating to financial reporting and monthly computation by futures commission merchants and...

  8. Emerging and Future Computing Paradigms and Their Impact on the Research, Training, and Design Environments of the Aerospace Workforce

    NASA Technical Reports Server (NTRS)

    Noor, Ahmed K. (Compiler)

    2003-01-01

    The document contains the proceedings of the training workshop on Emerging and Future Computing Paradigms and their impact on the Research, Training and Design Environments of the Aerospace Workforce. The workshop was held at NASA Langley Research Center, Hampton, Virginia, March 18 and 19, 2003. The workshop was jointly sponsored by Old Dominion University and NASA. Workshop attendees came from NASA, other government agencies, industry and universities. The objectives of the workshop were to a) provide broad overviews of the diverse activities related to new computing paradigms, including grid computing, pervasive computing, high-productivity computing, and the IBM-led autonomic computing; and b) identify future directions for research that have high potential for future aerospace workforce environments. The format of the workshop included twenty-one half-hour overview-type presentations and three exhibits by vendors.

  9. Method and computer program product for maintenance and modernization backlogging

    DOEpatents

    Mattimore, Bernard G; Reynolds, Paul E; Farrell, Jill M

    2013-02-19

    According to one embodiment, a computer program product for determining future facility conditions includes a computer readable medium having computer readable program code stored therein. The computer readable program code includes computer readable program code for calculating a time period specific maintenance cost, for calculating a time period specific modernization factor, and for calculating a time period specific backlog factor. Future facility conditions equal the time period specific maintenance cost plus the time period specific modernization factor plus the time period specific backlog factor. In another embodiment, a computer-implemented method for calculating future facility conditions includes calculating a time period specific maintenance cost, calculating a time period specific modernization factor, and calculating a time period specific backlog factor. Future facility conditions equal the time period specific maintenance cost plus the time period specific modernization factor plus the time period specific backlog factor. Other embodiments are also presented.

  10. Numerical and experimental investigation of melting with internal heat generation within cylindrical enclosures

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Amber Shrivastava; Brian Williams; Ali S. Siahpush

    2014-06-01

    There have been significant efforts by the heat transfer community to investigate the melting phenomenon of materials. These efforts have included the analytical development of equations to represent melting, numerical development of computer codes to assist in modeling the phenomena, and collection of experimental data. The understanding of the melting phenomenon has application in several areas of interest, for example, the melting of a Phase Change Material (PCM) used as a thermal storage medium as well as the melting of the fuel bundle in a nuclear power plant during an accident scenario. The objective of this research is two-fold. First, a numerical investigation, using computational fluid dynamics (CFD), of melting with internal heat generation for a vertical cylindrical geometry is presented. Second, to the best of the authors' knowledge, very few engineering experimental results are available for the case of melting with Internal Heat Generation (IHG). An experiment was performed to produce such data using resistive, or Joule, heating as the IHG mechanism. The numerical results are compared against the experimental results and show favorable correlation. Uncertainties in the numerical and experimental analyses are discussed. Based on the numerical and experimental analyses, recommendations are made for future work.
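
    The governing idea of melting with internal heat generation can be sketched with a one-dimensional enthalpy method: march the enthalpy forward with conduction plus a volumetric source, and recover temperature and melt fraction from the enthalpy. The planar slab, material properties, and source strength below are illustrative assumptions, not the cylindrical experiment described above:

      import numpy as np

      # 1-D planar enthalpy-method sketch of melting driven by internal heat generation (IHG).
      rho, c, k, Lf = 800.0, 2000.0, 0.5, 2.0e5   # density, heat capacity, conductivity, latent heat
      Tm, Tw = 50.0, 20.0                          # melting point and cooled-wall temperature (C)
      q = 2.0e5                                    # volumetric heat generation rate, W/m^3

      n = 51
      dx = 0.10 / (n - 1)                          # 10 cm slab
      dt = 0.4 * dx**2 * rho * c / k               # explicit stability limit with margin
      Hs, Hl = rho * c * Tm, rho * c * Tm + rho * Lf
      steps = 20000

      H = np.full(n, rho * c * Tw)                 # enthalpy per unit volume, initially solid

      def temperature(H):
          return np.where(H < Hs, H / (rho * c),                    # solid
                 np.where(H < Hl, Tm,                               # mushy (melting) zone
                          Tm + (H - Hl) / (rho * c)))               # liquid

      for _ in range(steps):
          T = temperature(H)
          T[0] = T[-1] = Tw                        # cooled walls
          lap = (T[2:] - 2 * T[1:-1] + T[:-2]) / dx**2
          H[1:-1] += dt * (k * lap + q)            # energy balance: conduction + IHG

      melt_fraction = np.clip((H - Hs) / (rho * Lf), 0.0, 1.0).mean()
      print(f"average melt fraction after {steps * dt:.0f} s: {melt_fraction:.2f}")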

  11. Public Outreach at RAL: Engaging the Next Generation of Scientists and Engineers

    NASA Astrophysics Data System (ADS)

    Corbett, G.; Ryall, G.; Palmer, S.; Collier, I. P.; Adams, J.; Appleyard, R.

    2015-12-01

    The Rutherford Appleton Laboratory (RAL) is part of the UK's Science and Technology Facilities Council (STFC). As part of the Royal Charter that established the STFC, the organisation is required to generate public awareness and encourage public engagement and dialogue in relation to the science undertaken. The staff at RAL firmly support this activity as it is important to encourage the next generation of students to consider studying Science, Technology, Engineering, and Mathematics (STEM) subjects, providing the UK with a highly skilled work-force in the future. To this end, the STFC undertakes a variety of outreach activities. This paper will describe the outreach activities undertaken by RAL, particularly focussing on those of the Scientific Computing Department (SCD). These activities include: an Arduino based activity day for 12-14 year-olds to celebrate Ada Lovelace day; running a centre as part of the Young Rewired State - encouraging 11-18 year-olds to create web applications with open data; sponsoring a team in the Engineering Education Scheme - supporting a small team of 16-17 year-olds to solve a real world engineering problem; as well as the more traditional tours of facilities. These activities could serve as an example for other sites involved in scientific computing around the globe.

  12. Internally generated hippocampal sequences as a vantage point to probe future-oriented cognition.

    PubMed

    Pezzulo, Giovanni; Kemere, Caleb; van der Meer, Matthijs A A

    2017-05-01

    Information processing in the rodent hippocampus is fundamentally shaped by internally generated sequences (IGSs), expressed during two different network states: theta sequences, which repeat and reset at the ∼8 Hz theta rhythm associated with active behavior, and punctate sharp wave-ripple (SWR) sequences associated with wakeful rest or slow-wave sleep. A potpourri of diverse functional roles has been proposed for these IGSs, resulting in a fragmented conceptual landscape. Here, we advance a unitary view of IGSs, proposing that they reflect an inferential process that samples a policy from the animal's generative model, supported by hippocampus-specific priors. The same inference affords different cognitive functions when the animal is in distinct dynamical modes, associated with specific functional networks. Theta sequences arise when inference is coupled to the animal's action-perception cycle, supporting online spatial decisions, predictive processing, and episode encoding. SWR sequences arise when the animal is decoupled from the action-perception cycle and may support offline cognitive processing, such as memory consolidation, the prospective simulation of spatial trajectories, and imagination. We discuss the empirical bases of this proposal in relation to rodent studies and highlight how the proposed computational principles can shed light on the mechanisms of future-oriented cognition in humans. © 2017 New York Academy of Sciences.

  13. Next Generation Analytic Tools for Large Scale Genetic Epidemiology Studies of Complex Diseases

    PubMed Central

    Mechanic, Leah E.; Chen, Huann-Sheng; Amos, Christopher I.; Chatterjee, Nilanjan; Cox, Nancy J.; Divi, Rao L.; Fan, Ruzong; Harris, Emily L.; Jacobs, Kevin; Kraft, Peter; Leal, Suzanne M.; McAllister, Kimberly; Moore, Jason H.; Paltoo, Dina N.; Province, Michael A.; Ramos, Erin M.; Ritchie, Marylyn D.; Roeder, Kathryn; Schaid, Daniel J.; Stephens, Matthew; Thomas, Duncan C.; Weinberg, Clarice R.; Witte, John S.; Zhang, Shunpu; Zöllner, Sebastian; Feuer, Eric J.; Gillanders, Elizabeth M.

    2012-01-01

    Over the past several years, genome-wide association studies (GWAS) have succeeded in identifying hundreds of genetic markers associated with common diseases. However, most of these markers confer relatively small increments of risk and explain only a small proportion of familial clustering. To identify obstacles to future progress in genetic epidemiology research and provide recommendations to NIH for overcoming these barriers, the National Cancer Institute sponsored a workshop entitled “Next Generation Analytic Tools for Large-Scale Genetic Epidemiology Studies of Complex Diseases” on September 15–16, 2010. The goal of the workshop was to facilitate discussions on (1) statistical strategies and methods to efficiently identify genetic and environmental factors contributing to the risk of complex disease; and (2) how to develop, apply, and evaluate these strategies for the design, analysis, and interpretation of large-scale complex disease association studies in order to guide NIH in setting the future agenda in this area of research. The workshop was organized as a series of short presentations covering scientific (gene-gene and gene-environment interaction, complex phenotypes, and rare variants and next generation sequencing) and methodological (simulation modeling and computational resources and data management) topic areas. Specific needs to advance the field were identified during each session and are summarized. PMID:22147673

  14. Institutional computing (IC) information session

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Koch, Kenneth R; Lally, Bryan R

    2011-01-19

    The LANL Institutional Computing Program (IC) will host an information session about the current state of unclassified Institutional Computing at Los Alamos, exciting plans for the future, and the current call for proposals for science and engineering projects requiring computing. Program representatives will give short presentations and field questions about the call for proposals and future planned machines, and discuss technical support available to existing and future projects. Los Alamos has started making a serious institutional investment in open computing available to our science projects, and that investment is expected to increase even more.

  15. Interactions with Virtual People: Do Avatars Dream of Digital Sheep?. Chapter 6

    NASA Technical Reports Server (NTRS)

    Slater, Mel; Sanchez-Vives, Maria V.

    2007-01-01

    This paper explores another form of artificial entity, one without physical embodiment. We use the term virtual characters for a type of interactive object that has become familiar in computer games and within virtual reality applications, and we refer to these as avatars: three-dimensional graphical objects in more-or-less human form which can interact with humans. Sometimes such avatars will be representations of real humans interacting together within a shared networked virtual environment; at other times they will be entirely computer-generated characters. Unlike other authors, who reserve the term agent for entirely computer-generated characters and avatar for virtual embodiments of real people, the same term is used here for both, because avatars and agents are on a continuum. The question is where their behaviour originates. At the extremes the behaviour is either completely computer generated or comes only from tracking of a real person. However, not every aspect of a real person can be tracked (every eyebrow move, every blink, every breath); rather, real tracking data would be supplemented by inferred behaviours, programmed on the basis of the available information about what the real human is doing and her/his underlying emotional and psychological state. Hence there is always some programmed behaviour; it is only a matter of how much. In any case the same underlying problem remains: how can the human character be portrayed in such a manner that its actions are believable and have an impact on the real people with whom it interacts? This paper has three main parts. In the first part we review some evidence suggesting that humans react with appropriate affect in their interactions with virtual human characters, or with other humans who are represented as avatars, even though the representational fidelity is relatively low. The evidence comes from the realm of psychotherapy, where virtual social situations are created that test whether people react appropriately within them, and from experiments on face-to-face virtual communication between people in the same shared virtual environments. The second part gives some clues about why this might happen, taking into account modern theories of perception from neuroscience. The third part offers some speculations about the future development of the relationship between people and virtual people. We suggest that a more likely scenario than the world becoming populated by physically embodied virtual people (robots, androids) is that in the relatively near future we will interact more and more in our everyday lives with virtual people: bank managers, shop assistants, instructors, and so on. What is happening in the movies with computer-generated individuals and entire crowds may move into the space of everyday life.

  16. A framework for different levels of integration of computational models into web-based virtual patients.

    PubMed

    Kononowicz, Andrzej A; Narracott, Andrew J; Manini, Simone; Bayley, Martin J; Lawford, Patricia V; McCormack, Keith; Zary, Nabil

    2014-01-23

    Virtual patients are increasingly common tools used in health care education to foster learning of clinical reasoning skills. One potential way to expand their functionality is to augment virtual patients' interactivity by enriching them with computational models of physiological and pathological processes. The primary goal of this paper was to propose a conceptual framework for the integration of computational models within virtual patients, with particular focus on (1) characteristics to be addressed while preparing the integration, (2) the extent of the integration, (3) strategies to achieve integration, and (4) methods for evaluating the feasibility of integration. An additional goal was to pilot the first investigation of changing framework variables on altering perceptions of integration. The framework was constructed using an iterative process informed by Soft System Methodology. The Virtual Physiological Human (VPH) initiative has been used as a source of new computational models. The technical challenges associated with development of virtual patients enhanced by computational models are discussed from the perspectives of a number of different stakeholders. Concrete design and evaluation steps are discussed in the context of an exemplar virtual patient employing the results of the VPH ARCH project, as well as improvements for future iterations. The proposed framework consists of four main elements. The first element is a list of feasibility features characterizing the integration process from three perspectives: the computational modelling researcher, the health care educationalist, and the virtual patient system developer. The second element included three integration levels: basic, where a single set of simulation outcomes is generated for specific nodes in the activity graph; intermediate, involving pre-generation of simulation datasets over a range of input parameters; advanced, including dynamic solution of the model. The third element is the description of four integration strategies, and the last element consisted of evaluation profiles specifying the relevant feasibility features and acceptance thresholds for specific purposes. The group of experts who evaluated the virtual patient exemplar found higher integration more interesting, but at the same time they were more concerned with the validity of the result. The observed differences were not statistically significant. This paper outlines a framework for the integration of computational models into virtual patients. The opportunities and challenges of model exploitation are discussed from a number of user perspectives, considering different levels of model integration. The long-term aim for future research is to isolate the most crucial factors in the framework and to determine their influence on the integration outcome.

  17. A Framework for Different Levels of Integration of Computational Models Into Web-Based Virtual Patients

    PubMed Central

    Narracott, Andrew J; Manini, Simone; Bayley, Martin J; Lawford, Patricia V; McCormack, Keith; Zary, Nabil

    2014-01-01

    Background Virtual patients are increasingly common tools used in health care education to foster learning of clinical reasoning skills. One potential way to expand their functionality is to augment virtual patients’ interactivity by enriching them with computational models of physiological and pathological processes. Objective The primary goal of this paper was to propose a conceptual framework for the integration of computational models within virtual patients, with particular focus on (1) characteristics to be addressed while preparing the integration, (2) the extent of the integration, (3) strategies to achieve integration, and (4) methods for evaluating the feasibility of integration. An additional goal was to pilot the first investigation of changing framework variables on altering perceptions of integration. Methods The framework was constructed using an iterative process informed by Soft System Methodology. The Virtual Physiological Human (VPH) initiative has been used as a source of new computational models. The technical challenges associated with development of virtual patients enhanced by computational models are discussed from the perspectives of a number of different stakeholders. Concrete design and evaluation steps are discussed in the context of an exemplar virtual patient employing the results of the VPH ARCH project, as well as improvements for future iterations. Results The proposed framework consists of four main elements. The first element is a list of feasibility features characterizing the integration process from three perspectives: the computational modelling researcher, the health care educationalist, and the virtual patient system developer. The second element included three integration levels: basic, where a single set of simulation outcomes is generated for specific nodes in the activity graph; intermediate, involving pre-generation of simulation datasets over a range of input parameters; advanced, including dynamic solution of the model. The third element is the description of four integration strategies, and the last element consisted of evaluation profiles specifying the relevant feasibility features and acceptance thresholds for specific purposes. The group of experts who evaluated the virtual patient exemplar found higher integration more interesting, but at the same time they were more concerned with the validity of the result. The observed differences were not statistically significant. Conclusions This paper outlines a framework for the integration of computational models into virtual patients. The opportunities and challenges of model exploitation are discussed from a number of user perspectives, considering different levels of model integration. The long-term aim for future research is to isolate the most crucial factors in the framework and to determine their influence on the integration outcome. PMID:24463466

  18. A survey of artificial immune system based intrusion detection.

    PubMed

    Yang, Hua; Li, Tao; Hu, Xinlei; Wang, Feng; Zou, Yang

    2014-01-01

    In the area of computer security, Intrusion Detection (ID) is a mechanism that attempts to discover abnormal access to computers by analyzing various interactions. There is a lot of literature about ID, but this study only surveys the approaches based on Artificial Immune System (AIS). The use of AIS in ID is an appealing concept in current techniques. This paper summarizes AIS based ID methods from a new view point; moreover, a framework is proposed for the design of AIS based ID Systems (IDSs). This framework is analyzed and discussed based on three core aspects: antibody/antigen encoding, generation algorithm, and evolution mode. Then we collate the commonly used algorithms, their implementation characteristics, and the development of IDSs into this framework. Finally, some of the future challenges in this area are also highlighted.
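
    A classic AIS generation algorithm covered by such surveys is negative selection: random detectors are generated and any detector that matches the 'self' (normal) set is discarded, so the survivors flag only anomalous patterns. A minimal binary-string sketch; the matching rule, string length, and thresholds are assumptions for illustration:

      import random

      random.seed(1)
      L, R = 12, 7          # pattern length and r-contiguous-bits matching threshold

      def matches(detector, pattern, r=R):
          """r-contiguous-bits rule: detector matches if r consecutive positions agree."""
          run = 0
          for d, p in zip(detector, pattern):
              run = run + 1 if d == p else 0
              if run >= r:
                  return True
          return False

      def random_pattern():
          return tuple(random.randint(0, 1) for _ in range(L))

      # 'Self' set: patterns representing normal behaviour (assumed encoding of activity).
      self_set = [random_pattern() for _ in range(30)]

      # Negative selection: keep only candidate detectors that match no self pattern.
      detectors = []
      while len(detectors) < 50:
          cand = random_pattern()
          if not any(matches(cand, s) for s in self_set):
              detectors.append(cand)

      def is_anomalous(pattern):
          return any(matches(d, pattern) for d in detectors)

      probe = random_pattern()
      print("anomalous" if is_anomalous(probe) else "looks like self")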

  19. Advanced helmet mounted display (AHMD)

    NASA Astrophysics Data System (ADS)

    Sisodia, Ashok; Bayer, Michael; Townley-Smith, Paul; Nash, Brian; Little, Jay; Cassarly, William; Gupta, Anurag

    2007-04-01

    Due to significantly increased U.S. military involvement in deterrent, observer, security, peacekeeping and combat roles around the world, the military expects significant future growth in the demand for deployable virtual reality trainers with networked simulation capability for the battle-space visualization process. The use of HMD technology in simulated virtual environments has been driven by the demand for more effective training tools. The AHMD overlays computer-generated data (symbology, synthetic imagery, enhanced imagery) onto the actual and simulated visible environment. The AHMD can be used to support deployable reconfigurable training solutions as well as traditional simulation requirements, UAV augmented reality, air traffic control, and Command, Control, Communications, Computers, Intelligence, Surveillance, and Reconnaissance (C4ISR) applications. This paper will describe the design improvements implemented for production of the AHMD System.

  20. RISC Processors and High Performance Computing

    NASA Technical Reports Server (NTRS)

    Bailey, David H.; Saini, Subhash; Craw, James M. (Technical Monitor)

    1995-01-01

    This tutorial will discuss the top five RISC microprocessors and the parallel systems in which they are used. It will provide a unique cross-machine comparison not available elsewhere. The effective performance of these processors will be compared by citing standard benchmarks in the context of real applications. The latest NAS Parallel Benchmarks, both absolute performance and performance per dollar, will be listed. The next generation of the NPB will be described. The tutorial will conclude with a discussion of future directions in the field. Technology Transfer Considerations: All of these computer systems are commercially available internationally. Information about these processors is available in the public domain, mostly from the vendors themselves. The NAS Parallel Benchmarks and their results have been previously approved numerous times for public release, beginning back in 1991.

  1. Aeromechanics and Aeroacoustics Predictions of the Boeing-SMART Rotor Using Coupled-CFD/CSD Analyses

    NASA Technical Reports Server (NTRS)

    Bain, Jeremy; Sim, Ben W.; Sankar, Lakshmi; Brentner, Ken

    2010-01-01

    This paper will highlight helicopter aeromechanics and aeroacoustics prediction capabilities developed by Georgia Institute of Technology, the Pennsylvania State University, and Northern Arizona University under the Helicopter Quieting Program (HQP) sponsored by the Tactical Technology Office of the Defense Advanced Research Projects Agency (DARPA). First initiated in 2004, the goal of the HQP was to develop high fidelity, state-of-the-art computational tools for designing advanced helicopter rotors with reduced acoustic perceptibility and enhanced performance. A critical step towards achieving this objective is the development of rotorcraft prediction codes capable of assessing a wide range of helicopter configurations and operations for future rotorcraft designs. This includes novel next-generation rotor systems that incorporate innovative passive and/or active elements to meet future challenging military performance and survivability goals.

  2. WOMBAT: A Scalable and High-performance Astrophysical Magnetohydrodynamics Code

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mendygral, P. J.; Radcliffe, N.; Kandalla, K.

    2017-02-01

    We present a new code for astrophysical magnetohydrodynamics specifically designed and optimized for high performance and scaling on modern and future supercomputers. We describe a novel hybrid OpenMP/MPI programming model that emerged from a collaboration between Cray, Inc. and the University of Minnesota. This design utilizes MPI-RMA optimized for thread scaling, which allows the code to run extremely efficiently at very high thread counts ideal for the latest generation of multi-core and many-core architectures. Such performance characteristics are needed in the era of “exascale” computing. We describe and demonstrate our high-performance design in detail with the intent that it may be used as a model for other, future astrophysical codes intended for applications demanding exceptional performance.

  3. Climate change impact on wave energy in the Persian Gulf

    NASA Astrophysics Data System (ADS)

    Kamranzad, Bahareh; Etemad-Shahidi, Amir; Chegini, Vahid; Yeganeh-Bakhtiary, Abbas

    2015-06-01

    Excessive usage of fossil fuels and high emission of greenhouse gases have increased the earth's temperature, and consequently have changed the patterns of natural phenomena such as wind speed, wave height, etc. Renewable energy resources are ideal alternatives to reduce the negative effects of increasing greenhouse gas emissions and climate change. However, these energy sources are also sensitive to the changing climate. In this study, the effect of climate change on wave energy in the Persian Gulf is investigated. For this purpose, future wind data obtained from the CGCM3.1 model were downscaled using a hybrid approach: modification factors were computed based on local wind data (ECMWF) and applied to the control and future CGCM3.1 wind data. The downscaled wind data were used to generate the wave characteristics in the future based on the A2, B1, and A1B scenarios, while the ECMWF wind field was used to generate the wave characteristics in the control period. The results of these two 30-year wave simulations using the SWAN model showed that the average wave power changes slightly in the future. Assessment of the wave power spatial distribution showed that the reduction of the average wave power is greater in the middle parts of the Persian Gulf. Investigation of wave power distribution in two coastal stations (Boushehr and Assalouyeh ports) indicated that the annual wave energy will decrease in both stations, while the wave power distribution for different intervals of significant wave height and peak period will also change in Assalouyeh according to all scenarios.
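    To make the workflow above concrete, the short sketch below illustrates two of its ingredients under simplifying assumptions: a single multiplicative modification factor that rescales GCM winds against a local reference climatology, and the deep-water estimate of wave power flux, P = ρ g² Hs² Te / (64π). The wind series and wave parameters are toy values; the study itself computes the factors from ECMWF data and runs a full SWAN wave model.

```python
# Illustrative sketch, not the study's pipeline: (1) a scalar "modification factor"
# downscaling of GCM wind speeds, and (2) deep-water wave power in kW per metre of
# wave crest, P = rho * g^2 * Hs^2 * Te / (64 * pi). All numbers are toy values.
import numpy as np

RHO, G = 1025.0, 9.81  # seawater density (kg/m^3), gravity (m/s^2)

def modification_factor(local_wind, gcm_wind_control):
    """Ratio of local reference wind climatology to the control-run GCM climatology."""
    return np.mean(local_wind) / np.mean(gcm_wind_control)

def wave_power_kw_per_m(hs, te):
    """Deep-water wave energy flux (kW/m) from significant wave height and energy period."""
    return RHO * G**2 * hs**2 * te / (64.0 * np.pi) / 1000.0

# toy wind series (m/s)
local = np.array([5.1, 6.3, 4.8, 7.0])
gcm_control = np.array([4.0, 5.2, 4.1, 5.9])
gcm_future = np.array([4.3, 5.0, 4.4, 6.1])

factor = modification_factor(local, gcm_control)
downscaled_future_wind = factor * gcm_future   # would feed the wave model (e.g. SWAN)

# example wave parameters a wave model might return for one sea state
print(round(wave_power_kw_per_m(hs=1.2, te=5.0), 2), "kW/m")
```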

  4. Didactical determinants use of information and communication technology in process of training of future specialists.

    PubMed

    Palamar, Borys I; Vaskivska, Halyna O; Palamar, Svitlana P

    This article addresses the significance of computer equipment for organizing cooperation between professors and future specialists. Such subject-subject interaction can be directed toward forming the professional skills of future specialists. Using information and communication technologies in the education system can solve a range of didactic tasks: improving the teaching of subjects in higher education, supporting the self-directed learning of future specialists, motivating learning and self-learning, and developing reflection in the learning process. The authors consider computer equipment an instrument for developing the intellectual skills, potential, and willingness of future specialists to solve communicative tasks and problems on a creative basis. Based on the results of their research, the authors draw conclusions about the effectiveness of computer technologies in teaching future specialists and in their self-directed learning. Inadequate provision of computer equipment in higher education institutions, a lack of appropriate educational programs, and professors' poor knowledge and use of computers have a negative impact on the organization of teaching. Computer equipment and ICT in general are instruments for developing the intellectual skills, potential, and willingness of future specialists to solve communicative tasks and problems. The formation of a psychosocial environment for the development of future specialists is therefore a multifaceted, complex, and didactically important issue.

  5. Crew appliance study

    NASA Technical Reports Server (NTRS)

    Proctor, B. W.; Reysa, R. P.; Russell, D. J.

    1975-01-01

    Viable crew appliance concepts were identified by means of a thorough literature search. Studies were made of the food management, personal hygiene, housekeeping, and off-duty habitability functions to determine which concepts best satisfy the Space Shuttle Orbiter and Modular Space Station mission requirements. Models of selected appliance concepts not currently included in the generalized environmental-thermal control and life support systems computer program were developed and validated. Development plans of selected concepts were generated for future reference. A shuttle freezer conceptual design was developed and a test support activity was provided for regenerative environmental control life support subsystems.

  6. Information Technology for the Twenty-First Century: A Bold Investment in America's Future

    NASA Astrophysics Data System (ADS)

    1999-06-01

    With this Information Technology for the Twenty-First Century (IT2) initiative, the Federal Government is making an important re-commitment to fundamental research in information technology. The IT2 initiative proposes $366 million in increased investments in computing, information, and communications research and development (R&D) to help expand the knowledge base in fundamental information science, advance the Nation's capabilities in cutting-edge research, and train the next generation of researchers who will sustain the Information Revolution well into the 21st Century.

  7. Forecasting Occurrences of Activities.

    PubMed

    Minor, Bryan; Cook, Diane J

    2017-07-01

    While activity recognition has been shown to be valuable for pervasive computing applications, less work has focused on techniques for forecasting the future occurrence of activities. We present an activity forecasting method to predict the time that will elapse until a target activity occurs. This method generates an activity forecast using a regression tree classifier and offers an advantage over sequence prediction methods in that it can predict expected time until an activity occurs. We evaluate this algorithm on real-world smart home datasets and provide evidence that our proposed approach is most effective at predicting activity timings.
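    A minimal sketch of this idea follows, using a scikit-learn regression tree in place of the authors' implementation: features summarizing the current smart-home state are mapped to the number of minutes until a target activity next occurs. The feature names, synthetic labels, and model settings are invented for illustration.

```python
# Regression-tree activity forecasting sketch: map features of the current smart-home
# state to minutes until a target activity occurs. Features and labels are synthetic;
# this is an illustration of the technique, not the authors' pipeline.
import numpy as np
from sklearn.tree import DecisionTreeRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)

# toy features: [hour of day, minutes since last kitchen event, motion count in last hour]
X = np.column_stack([
    rng.integers(0, 24, 500),
    rng.uniform(0, 180, 500),
    rng.integers(0, 60, 500),
])
# toy label: minutes until the next "prepare dinner" activity (synthetic rule + noise)
y = np.clip((18 - X[:, 0]) % 24 * 60 - 0.2 * X[:, 1] + rng.normal(0, 15, 500), 0, None)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = DecisionTreeRegressor(max_depth=5).fit(X_train, y_train)

print("predicted minutes until activity:", model.predict(X_test[:3]).round(1))
```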

  8. Product Operations Status Summary Metrics

    NASA Technical Reports Server (NTRS)

    Takagi, Atsuya; Toole, Nicholas

    2010-01-01

    The Product Operations Status Summary Metrics (POSSUM) computer program provides a readable view into the state of the Phoenix Operations Product Generation Subsystem (OPGS) data pipeline. POSSUM provides a user interface that can search the data store, collect product metadata, and display the results in an easily-readable layout. It was designed with flexibility in mind for support in future missions. Flexibility over various data store hierarchies is provided through the disk-searching facilities of Marsviewer. This is a proven program that has been in operational use since the first day of the Phoenix mission.

  9. Future of Department of Defense Cloud Computing Amid Cultural Confusion

    DTIC Science & Technology

    2013-03-01

    enterprise cloud-computing environment and transition to a public cloud service provider. Services have started the development of individual cloud-computing environments...endorsing cloud computing. It addresses related issues in matters of service culture changes and how strategic leaders will dictate the future of cloud...through data center consolidation and individual Service-provided cloud computing.

  10. Design of a modular digital computer system, DRL 4. [for meeting future requirements of spaceborne computers

    NASA Technical Reports Server (NTRS)

    1972-01-01

    The design is reported of an advanced modular computer system designated the Automatically Reconfigurable Modular Multiprocessor System, which anticipates requirements for higher computing capacity and reliability for future spaceborne computers. Subjects discussed include: an overview of the architecture, mission analysis, synchronous and nonsynchronous scheduling control, reliability, and data transmission.

  11. Hardware Considerations for Computer Based Education in the 1980's.

    ERIC Educational Resources Information Center

    Hirschbuhl, John J.

    1980-01-01

    In the future, computers will be needed to sift through the vast proliferation of available information. Among new developments in computer technology are videodisc microcomputers and holography. Predictions for future developments include laser libraries for the visually handicapped and Computer Assisted Dialogue. (JN)

  12. How valid are future generations' arguments for preserving wilderness?

    Treesearch

    Thomas A. More; James R. Averill; Thomas H. Stevens

    2000-01-01

    We are often urged to preserve wilderness for the sake of future generations. Future generations consist of potential persons who are mute stakeholders in the decisions of today. Many claims about the rights of future generations or our present obligations to them have been vigorously advanced and just as vigorously denied. Recent theorists, however, have argued for a...

  13. Thermal Analysis of Magnetically-Coupled Pump for Cryogenic Applications

    NASA Technical Reports Server (NTRS)

    Senocak, Inanc; Udaykumar, H. S.; Ndri, Narcisse; Francois, Marianne; Shyy, Wei

    1999-01-01

    A magnetically-coupled pump is under evaluation at Kennedy Space Center for possible cryogenic applications. A major concern is the impact of low-temperature fluid flows on pump performance. As a first step toward addressing this and related issues, a computational fluid dynamics and heat transfer tool has been applied to a pump geometry. The computational tool includes (i) a commercial grid generator to handle multiple grid blocks and complicated geometric definitions, and (ii) in-house computational fluid dynamics and heat transfer software developed in the Principal Investigator's group at the University of Florida. Both pure-conduction and combined convection-conduction computations have been conducted. A pure-conduction analysis gives insufficient information about the overall thermal distribution. Combined convection-conduction analysis indicates the significant influence of the coolant over the entire flow path. Since 2-D simulation is of limited help, future work on full 3-D modeling of the pump using multiple materials is needed. A comprehensive and accurate model can be developed to take into account the effect of multi-phase flow in the cooling flow loop, and the magnetic interactions.

  14. Software for Testing Electroactive Structural Components

    NASA Technical Reports Server (NTRS)

    Moses, Robert W.; Fox, Robert L.; Dimery, Archie D.; Bryant, Robert G.; Shams, Qamar

    2003-01-01

    A computer program generates a graphical user interface that, in combination with its other features, facilitates the acquisition and preprocessing of experimental data on the strain response, hysteresis, and power consumption of a multilayer composite-material structural component containing one or more built-in sensor(s) and/or actuator(s) based on piezoelectric materials. This program runs in conjunction with LabVIEW software in a computer-controlled instrumentation system. For a test, a specimen is instrumented with applied-voltage and current sensors and with strain gauges. Once the computational connection to the test setup has been made via the LabVIEW software, this program causes the test instrumentation to step through specified configurations. If the user is satisfied with the test results as displayed by the software, the user activates an icon on a front-panel display, causing the raw current, voltage, and strain data to be digitized and saved. The data are also put into a spreadsheet and can be plotted on a graph. Graphical displays are saved in an image file for future reference. The program also computes and displays the power and the phase angle between voltage and current.
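    The sketch below shows one way the derived quantities mentioned above (power and the voltage-current phase angle) can be computed from sampled waveforms, assuming a known drive frequency and roughly sinusoidal steady state. It is an illustration of the calculation, not the LabVIEW program itself; the sampling rate, drive level, and piezo load values are invented.

```python
# Estimate real power and voltage-current phase angle from sampled waveforms by
# extracting the fundamental component at a known drive frequency. Illustrative only;
# all signal parameters are toy values.
import numpy as np

def fundamental_phasor(signal, fs, f0):
    """Complex amplitude of the component at f0 (single-bin DFT over the record)."""
    t = np.arange(len(signal)) / fs
    return 2.0 * np.mean(signal * np.exp(-2j * np.pi * f0 * t))

def power_and_phase(voltage, current, fs, f0):
    v, i = fundamental_phasor(voltage, fs, f0), fundamental_phasor(current, fs, f0)
    phase_deg = np.degrees(np.angle(v) - np.angle(i))   # positive if voltage leads current
    real_power = 0.5 * np.real(v * np.conj(i))          # average power at f0
    return real_power, phase_deg

# toy data: 100 V, 60 Hz drive into a slightly capacitive piezo actuator
fs, f0, n = 50_000, 60.0, 50_000                        # 1 s record, integer cycles
t = np.arange(n) / fs
v = 100 * np.sin(2 * np.pi * f0 * t)
i = 0.02 * np.sin(2 * np.pi * f0 * t + np.radians(80))  # current leads by 80 degrees

p, phi = power_and_phase(v, i, fs, f0)
print(f"real power = {p:.3f} W, phase angle = {phi:.1f} deg")
```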

  15. Computational modeling of psychiatric illnesses via well-defined neurophysiological and neurocognitive biomarkers.

    PubMed

    Siekmeier, Peter J

    2015-10-01

    A good deal of recent research has centered on the identification of biomarkers and endophenotypic measures of psychiatric illnesses using in vivo and in vitro studies. This is understandable, as these measures, as opposed to complex clinical phenotypes, may be more closely related to neurobiological and genetic vulnerabilities. However, instantiation of such biomarkers using computational models (in silico studies) has received less attention. This approach could become increasingly important, given the wealth of detailed information produced by recent basic neuroscience research, and the increasing availability of high-capacity computing platforms. The purpose of this review is to survey the current state of the art of research in this area. We discuss computational approaches to schizophrenia, bipolar disorder, Alzheimer's disease, fragile X syndrome and autism, and argue that it represents a promising and underappreciated research modality. In conclusion, we outline specific avenues for future research and discuss potential uses of in silico models to conduct "virtual experiments," to generate novel hypotheses, and to aid neuropsychiatric drug development. Copyright © 2015 Elsevier Ltd. All rights reserved.

  16. A computational imaging target specific detectivity metric

    NASA Astrophysics Data System (ADS)

    Preece, Bradley L.; Nehmetallah, George

    2017-05-01

    Due to the large quantity of low-cost, high-speed computational processing available today, computational imaging (CI) systems are expected to have a major role in next-generation multifunctional cameras. The purpose of this work is to quantify the performance of these CI systems in a standardized manner. Due to the diversity of CI system designs that are available today or proposed for the near future, there are significant challenges in modeling and calculating a standardized detection signal-to-noise ratio (SNR) to measure the performance of these systems. In this paper, we develop a path forward for a standardized detectivity metric for CI systems. The detectivity metric is designed to evaluate the performance of a CI system searching for a specific known target or signal of interest, and is defined as the optimal linear matched filter SNR, similar to the Hotelling SNR, calculated in computational space with special considerations for standardization. Therefore, the detectivity metric is designed to be flexible, in order to handle various types of CI systems and specific targets, while keeping the complexity and assumptions of the systems to a minimum.
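    The core of such a metric can be written down compactly: for a known target signature s (target minus background, expressed in the system's computational measurement space) and noise covariance Σ, the optimal linear matched-filter SNR is sqrt(sᵀ Σ⁻¹ s). The sketch below evaluates that quantity for toy inputs; the signature, covariance model, and dimensions are illustrative assumptions, not values from the paper.

```python
# Optimal linear matched-filter (Hotelling-style) SNR for a known signature in
# correlated Gaussian noise: SNR = sqrt(s^T Sigma^{-1} s). Toy inputs only.
import numpy as np

def matched_filter_snr(signature, noise_cov):
    """Detection SNR of the optimal linear filter for a known signal."""
    w = np.linalg.solve(noise_cov, signature)     # optimal filter weights Sigma^{-1} s
    return float(np.sqrt(signature @ w))

rng = np.random.default_rng(1)
n = 64                                            # number of computational measurements
s = 0.05 * rng.standard_normal(n)                 # target-minus-background signature

# toy noise covariance: white detector noise plus a correlated background term
A = rng.standard_normal((n, n))
noise_cov = 1e-3 * np.eye(n) + 1e-4 * (A @ A.T) / n

print(f"detectivity (matched-filter SNR): {matched_filter_snr(s, noise_cov):.2f}")
```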

  17. Viewing ISS Data in Real Time via the Internet

    NASA Technical Reports Server (NTRS)

    Myers, Gerry; Chamberlain, Jim

    2004-01-01

    EZStream is a computer program that enables authorized users at diverse terrestrial locations to view, in real time, data generated by scientific payloads aboard the International Space Station (ISS). The only computation/communication resource needed for use of EZStream is a computer equipped with standard Web-browser software and a connection to the Internet. EZStream runs in conjunction with the TReK software, described in a prior NASA Tech Briefs article, that coordinates multiple streams of data for the ground communication system of the ISS. EZStream includes server components that interact with TReK within the ISS ground communication system and client components that reside in the users' remote computers. Once an authorized client has logged in, a server component of EZStream pulls the requested data from a TReK application-program interface and sends the data to the client. Future EZStream enhancements will include (1) extensions that enable the server to receive and process arbitrary data streams on its own and (2) a Web-based graphical-user-interface-building subprogram that enables a client who lacks programming expertise to create customized display Web pages.

  18. 3D Acoustic Full Waveform Inversion for Engineering Purpose

    NASA Astrophysics Data System (ADS)

    Lim, Y.; Shin, S.; Kim, D.; Kim, S.; Chung, W.

    2017-12-01

    Seismic waveform inversion is one of the most actively researched seismic data processing techniques. In recent years, with an increase in marine development projects, seismic surveys are commonly conducted for engineering purposes; however, research on applying waveform inversion in this setting remains limited. Waveform inversion updates the subsurface physical properties by minimizing the difference between modeled and observed data. Furthermore, it can be used to generate an accurate subsurface image; however, the technique consumes substantial computational resources. Its most compute-intensive step is the calculation of the gradient and Hessian values, and this cost is far more severe in 3D than in 2D. This paper introduces a new method for calculating gradient and Hessian values, in an effort to reduce the computational burden. In conventional waveform inversion, the calculation area covers all sources and receivers. In seismic surveys for engineering purposes, the number of receivers is limited. Therefore, it is inefficient to construct the Hessian and gradient over the entire region (Figure 1). In order to tackle this problem, we calculate the gradient and the Hessian for a single shot only within the range spanned by the relevant source and receivers, and then sum these contributions over all shots (Figure 2). In this paper, we demonstrate that reducing the area over which the Hessian and gradient are calculated for one shot reduces the overall amount of computation and therefore the computation time. Furthermore, it is shown that waveform inversion can be suitably applied for engineering purposes. In future research, we propose to ascertain an effective calculation range. This research was supported by the Basic Research Project (17-3314) of the Korea Institute of Geoscience and Mineral Resources (KIGAM) funded by the Ministry of Science, ICT and Future Planning of Korea.
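    The bookkeeping idea (computing each shot's gradient only on a sub-grid spanning that shot's source and receivers, then accumulating into the global model) can be sketched as follows. The local gradient kernel is a placeholder for a real adjoint-state computation, and the grid sizes and acquisition geometry are invented.

```python
# Per-shot windowed gradient accumulation sketch. The local kernel stands in for an
# actual wave-equation adjoint computation; geometry and sizes are illustrative.
import numpy as np

NX, NY = 400, 200                     # global model grid (x, depth)
global_grad = np.zeros((NX, NY))

def local_gradient(shape):
    """Placeholder for the adjoint-state gradient computed on a local sub-grid."""
    return np.random.default_rng(0).standard_normal(shape)

def shot_window(src_x, rec_xs, pad=20):
    """Index range in x covering the source and receivers, plus a padding halo."""
    lo = max(0, min(src_x, min(rec_xs)) - pad)
    hi = min(NX, max(src_x, max(rec_xs)) + pad)
    return lo, hi

shots = [(50, [60, 70, 80, 90]), (150, [160, 170, 180, 190]), (300, [310, 320, 330, 340])]

for src_x, rec_xs in shots:
    lo, hi = shot_window(src_x, rec_xs)
    # compute only (hi - lo) * NY cells instead of NX * NY cells for every shot
    global_grad[lo:hi, :] += local_gradient((hi - lo, NY))

print("non-zero fraction of global gradient:", np.count_nonzero(global_grad) / global_grad.size)
```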

  19. Advanced Free Flight Planner and Dispatcher's Workstation: Preliminary Design Specification

    NASA Technical Reports Server (NTRS)

    Wilson, J.; Wright, C.; Couluris, G. J.

    1997-01-01

    The National Aeronautics and Space Administration (NASA) has implemented the Advanced Air Transportation Technology (AATT) program to investigate future improvements to the national and international air traffic management systems. This research, as part of the AATT program, developed preliminary design requirements for an advanced Airline Operations Control (AOC) dispatcher's workstation, with emphasis on flight planning. This design will support the implementation of an experimental workstation in NASA laboratories that would emulate AOC dispatch operations. The work developed an airline flight plan database and specified requirements for: a computer tool for generation and evaluation of free flight, user preferred trajectories (UPT); the kernel of an advanced flight planning system to be incorporated into the UPT-generation tool; and an AOC workstation to house the UPT-generation tool and to provide a real-time testing environment. A prototype for the advanced flight plan optimization kernel was developed and demonstrated. The flight planner uses dynamic programming to search a four-dimensional wind and temperature grid to identify the optimal route, altitude, and speed for successive segments of a flight. An iterative process is employed in which a series of trajectories is successively refined until the UPT is identified. The flight planner is designed to function in the current operational environment as well as in free flight. The free flight environment would enable greater flexibility in UPT selection based on alleviation of current procedural constraints. The prototype also takes advantage of advanced computer processing capabilities to implement more powerful optimization routines than would be possible with older computer systems.
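    A much-simplified sketch of the dynamic-programming idea: pick an altitude for each flight segment so as to minimize total time, given winds that vary by segment and altitude, with a small penalty for changing flight level between segments. The real planner searches a four-dimensional wind and temperature grid over route, altitude, and speed; the segment lengths, winds, airspeed, and penalty below are invented.

```python
# Dynamic-programming sketch: choose an altitude per segment to minimize total flight
# time given per-segment, per-altitude winds and a climb/descent penalty. Toy data.
import numpy as np

segments_km = [250, 300, 275, 320]                 # segment lengths along the route
altitudes_ft = [31000, 35000, 39000]
true_airspeed_kmh = 850.0

# headwind (+) / tailwind (-) in km/h, indexed [segment, altitude]
wind = np.array([[ 40,  20, -10],
                 [ 60,  30,   0],
                 [ 20, -10, -30],
                 [ 10, -20, -40]], dtype=float)
climb_penalty_h = 0.05                             # hours per altitude-level change

n_seg, n_alt = wind.shape
cost = np.full((n_seg, n_alt), np.inf)             # best time to finish segment i at altitude j
choice = np.zeros((n_seg, n_alt), dtype=int)

cost[0] = segments_km[0] / (true_airspeed_kmh - wind[0])
for i in range(1, n_seg):
    seg_time = segments_km[i] / (true_airspeed_kmh - wind[i])
    for j in range(n_alt):
        trans = cost[i - 1] + climb_penalty_h * np.abs(np.arange(n_alt) - j)
        choice[i, j] = int(np.argmin(trans))
        cost[i, j] = trans[choice[i, j]] + seg_time[j]

# backtrack the optimal altitude profile
path = [int(np.argmin(cost[-1]))]
for i in range(n_seg - 1, 0, -1):
    path.append(int(choice[i, path[-1]]))
path.reverse()

print("optimal altitudes:", [altitudes_ft[j] for j in path])
print(f"total time: {cost[-1].min():.2f} h")
```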

  20. Development of a PC-based diabetes simulator in collaboration with teenagers with type 1 diabetes.

    PubMed

    Nordfeldt, S; Hanberger, L; Malm, F; Ludvigsson, J

    2007-02-01

    The main aim of this study was to develop and test in a pilot study a PC-based interactive diabetes simulator prototype as a part of future Internet-based support systems for young teenagers and their families. A second aim was to gain experience in user-centered design (UCD) methods applied to such subjects. Using UCD methods, a computer scientist participated in iterative user group sessions involving teenagers with Type 1 diabetes 13-17 years old and parents. Input was transformed into a requirements specification by the computer scientist and advisors. This was followed by gradual prototype development based on a previously developed mathematical core. Individual test sessions were followed by a pilot study with five subjects testing a prototype. The process was evaluated by registration of flow and content of input and opinions from expert advisors. It was initially difficult to motivate teenagers to participate. User group discussion topics ranged from concrete to more academic matters. The issue of a simulator created active discussions among parents and teenagers. A large amount of input was generated from discussions among the teenagers. Individual test runs generated useful input. A pilot study suggested that the gradually elaborated software was functional. A PC-based diabetes simulator may create substantial interest among teenagers and parents, and the prototype seems worthy of further development and studies. UCD methods may generate significant input for computer support system design work and contribute to a functional design. Teenager involvement in design work may require time, patience, and flexibility.

  1. Raster-Based Approach to Solar Pressure Modeling

    NASA Technical Reports Server (NTRS)

    Wright, Theodore W. II

    2013-01-01

    An algorithm has been developed to take advantage of the graphics processing hardware in modern computers to efficiently compute high-fidelity solar pressure forces and torques on spacecraft, taking into account the possibility of self-shading due to the articulation of spacecraft components such as solar arrays. The process is easily extended to compute other results that depend on three-dimensional attitude analysis, such as solar array power generation or free molecular flow drag. The impact of photons upon a spacecraft introduces small forces and moments. The magnitude and direction of the forces depend on the material properties of the spacecraft components being illuminated. The parts of the components being lit depends on the orientation of the craft with respect to the Sun, as well as the gimbal angles for any significant moving external parts (solar arrays, typically). Some components may shield others from the Sun. The purpose of this innovation is to enable high-fidelity computation of solar pressure and power generation effects of illuminated portions of spacecraft, taking self-shading from spacecraft attitude and movable components into account. The key idea in this innovation is to compute results dependent upon complicated geometry by using an image to break the problem into thousands or millions of sub-problems with simple geometry, and then the results from the simpler problems are combined to give high-fidelity results for the full geometry. This process is performed by constructing a 3D model of a spacecraft using an appropriate computer language (OpenGL), and running that model on a modern computer's 3D accelerated video processor. This quickly and accurately generates a view of the model (as shown on a computer screen) that takes rotation and articulation of spacecraft components into account. When this view is interpreted as the spacecraft as seen by the Sun, then only the portions of the craft visible in the view are illuminated. The view as shown on the computer screen is composed of up to millions of pixels. Each of those pixels is associated with a small illuminated area of the spacecraft. For each pixel, it is possible to compute its position, angle (surface normal) from the view direction, and the spacecraft material (and therefore, optical coefficients) associated with that area. With this information, the area associated with each pixel can be modeled as a simple flat plate for calculating solar pressure. The vector sum of these individual flat plate models is a high-fidelity approximation of the solar pressure forces and torques on the whole vehicle. In addition to using optical coefficients associated with each spacecraft material to calculate solar pressure, a power generation coefficient is added for computing solar array power generation from the sum of the illuminated areas. Similarly, other area-based calculations, such as free molecular flow drag, are also enabled. Because the model rendering is separated from other calculations, it is relatively easy to add a new model to explore a new vehicle or mission configuration. Adding a new model is performed by adding OpenGL code, but a future version might read a mesh file exported from a computer-aided design (CAD) system to enable very rapid turnaround for new designs
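    The per-pixel accumulation step can be sketched as follows, assuming the rendering pass (not shown) has already produced, for every illuminated pixel, the area it represents, the outward surface normal, and the optical coefficients of the material seen there. The flat-plate force model used is the common specular-plus-diffuse form; all numerical inputs are toy values rather than data from the actual tool.

```python
# Sum flat-plate solar-pressure contributions over per-pixel buffers assumed to come
# from a rendering pass. Uses the common specular + diffuse flat-plate model; all
# inputs are toy values for illustration.
import numpy as np

P_SRP = 4.56e-6                        # solar radiation pressure at 1 AU, N/m^2
sun_dir = np.array([0.0, 0.0, 1.0])    # unit vector from surface toward the Sun (view axis)

def pixel_force(area, normal, c_spec, c_diff):
    """SRP force (N) on one small flat plate; zero if the plate faces away from the Sun."""
    cos_t = float(normal @ sun_dir)
    if cos_t <= 0.0:
        return np.zeros(3)
    return -P_SRP * area * cos_t * ((1.0 - c_spec) * sun_dir
                                    + 2.0 * (c_spec * cos_t + c_diff / 3.0) * normal)

# toy per-pixel buffers, as if read back from the renderer
areas = np.full(4, 2.5e-4)                              # m^2 represented by each pixel
normals = np.array([[0, 0, 1], [0, 0.2, 0.98], [0, -0.2, 0.98], [1, 0, 0]], float)
normals /= np.linalg.norm(normals, axis=1, keepdims=True)
c_spec, c_diff = np.full(4, 0.3), np.full(4, 0.1)       # per-material optical coefficients

total = sum(pixel_force(a, n, cs, cd)
            for a, n, cs, cd in zip(areas, normals, c_spec, c_diff))
print("net SRP force vector (N):", total)
```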

  2. Visions of the Future - the Changing Role of Actors in Data-Intensive Science

    NASA Astrophysics Data System (ADS)

    Schäfer, L.; Klump, J. F.

    2013-12-01

    Around the world scientific disciplines are increasingly facing the challenge of a burgeoning volume of research data. This data avalanche consists of a stream of information generated from sensors and scientific instruments, digital recordings, social-science surveys or drawn from the World Wide Web. All areas of the scientific economy are affected by this rapid growth in data, from the logging of digs in Archaeology, telescope data with observations of distant galaxies in Astrophysics or data from polls and surveys in the Social Sciences. The challenge for science is not only to process the data through analysis, reduction and visualization, but also to set up infrastructures for provisioning and storing the data. The rise of new technologies and developments also poses new challenges for the actors in the area of research data infrastructures. Libraries, as one of the actors, enable access to digital media and support the publication of research data and its long-term archiving. Digital media and research data, however, introduce new aspects into the libraries' range of activities. How are we to imagine the library of the future? The library as an interface to the computer centers? Will library and computer center fuse into a new service unit? What role will scientific publishers play in the future? Currently, traditional forms of publication, namely articles for conferences and journals, still carry greater weight. But will this still be the case in the future? New forms of publication are already making their presence felt. The tasks of the computer centers may also change. Yesterday their remit was the provisioning of fast hardware, whereas now everything revolves around data and services. Finally, how about the researchers themselves? Not such a long time ago, Geoscience was not necessarily seen as linked to Computer Science. Nowadays, modern Geoscience relies heavily on IT and its techniques. Thus, to what extent will the profile of the modern geoscientist change? This gives rise to the question of what tools are required to locate and pursue the correct course in a networked world. One tool from the area of innovation management is the scenario technique. This poster will outline visions of the future as possible developments of the scientific world in 2020 (or later). The scenarios presented will show possible developments, both positive and negative. It is then up to the actors themselves to define their own position in this context, to rethink it and consider steps that can achieve a positive development for the future.

  3. Flame-Vortex Studies to Quantify Markstein Numbers Needed to Model Flame Extinction Limits

    NASA Technical Reports Server (NTRS)

    Driscoll, James F.; Feikema, Douglas A.

    2003-01-01

    This work has quantified a database of Markstein numbers for unsteady flames; future work will quantify a database of flame extinction limits for unsteady conditions. Unsteady extinction limits have not been documented previously; both a stretch rate and a residence time must be measured, since extinction requires that the stretch rate be sufficiently large for a sufficiently long residence time. The Markstein number (Ma) was measured for an inwardly-propagating flame (IPF) that is negatively stretched under microgravity conditions. Computations also were performed using RUN-1DL to explain the measurements. The Markstein number of an inwardly-propagating flame, for both the microgravity experiment and the computations, is significantly larger than that of an outwardly-propagating flame (OPF). The computed profiles of the various species within the flame suggest reasons. Computed hydrogen concentrations build up ahead of the IPF but not the OPF. Understanding was gained by running the computations for both simplified and full-chemistry conditions. To explain the experimental findings, numerical simulations of both inwardly and outwardly propagating spherical flames (with complex chemistry) were generated using the RUN-1DL code, which includes 16 species and 46 reactions.

  4. High-Performance Computer Modeling of the Cosmos-Iridium Collision

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Olivier, S; Cook, K; Fasenfest, B

    2009-08-28

    This paper describes the application of a new, integrated modeling and simulation framework, encompassing the space situational awareness (SSA) enterprise, to the recent Cosmos-Iridium collision. This framework is based on a flexible, scalable architecture to enable efficient simulation of the current SSA enterprise, and to accommodate future advancements in SSA systems. In particular, the code is designed to take advantage of massively parallel, high-performance computer systems available, for example, at Lawrence Livermore National Laboratory. We will describe the application of this framework to the recent collision of the Cosmos and Iridium satellites, including (1) detailed hydrodynamic modeling of the satellite collision and resulting debris generation, (2) orbital propagation of the simulated debris and analysis of the increased risk to other satellites, (3) calculation of the radar and optical signatures of the simulated debris and modeling of debris detection with space surveillance radar and optical systems, (4) determination of simulated debris orbits from modeled space surveillance observations and analysis of the resulting orbital accuracy, and (5) comparison of these modeling and simulation results with Space Surveillance Network observations. We will also discuss the use of this integrated modeling and simulation framework to analyze the risks and consequences of future satellite collisions and to assess strategies for mitigating or avoiding future incidents, including the addition of new sensor systems, used in conjunction with the Space Surveillance Network, for improving space situational awareness.

  5. Physics-based Modeling of Material Behavior and Damage Initiation in Nanoengineered Composites

    NASA Astrophysics Data System (ADS)

    Subramanian, Nithya

    Materials with unprecedented properties are necessary to make dramatic changes in current and future aerospace platforms. Hybrid materials and composites are increasingly being used in aircraft and spacecraft frames; however, future platforms will require an optimal design of novel materials that enable operation in a variety of environments and produce known/predicted damage mechanisms. Nanocomposites and nanoengineered composites with CNTs have the potential to make significant improvements in strength, stiffness, fracture toughness, flame retardancy and resistance to corrosion. Therefore, these materials have generated tremendous scientific and technical interest over the past decade and various architectures are being explored for applications to light-weight airframe structures. However, the success of such materials with significantly improved performance metrics requires careful control of the parameters during synthesis and processing. Their implementation is also limited due to the lack of complete understanding of the effects the nanoparticles impart to the bulk properties of composites. It is common for computational methods to be applied to explain phenomena measured or observed experimentally. Frequently, a given phenomenon or material property is only considered to be fully understood when the associated physics has been identified through accompanying calculations or simulations. The computationally and experimentally integrated research presented in this dissertation provides improved understanding of the mechanical behavior and response, including damage and failure, in CNT nanocomposites, enhancing confidence in their applications. The computations at the atomistic level help in understanding the underlying mechanochemistry and allow a systematic investigation of the complex CNT architectures and the material performance across a wide range of parameters. Simulation of the bond breakage phenomena and development of the interface to continuum-scale damage capture the effects of applied loading and damage precursors and provide insight into the safety of nanoengineered composites under service loads. The validated modeling methodology is expected to be a step in the direction of computationally-assisted design and certification of novel materials, thus accelerating the pace of their implementation in future applications.

  6. Planck 2015 results. XII. Full focal plane simulations

    NASA Astrophysics Data System (ADS)

    Planck Collaboration; Ade, P. A. R.; Aghanim, N.; Arnaud, M.; Ashdown, M.; Aumont, J.; Baccigalupi, C.; Banday, A. J.; Barreiro, R. B.; Bartlett, J. G.; Bartolo, N.; Battaner, E.; Benabed, K.; Benoît, A.; Benoit-Lévy, A.; Bernard, J.-P.; Bersanelli, M.; Bielewicz, P.; Bock, J. J.; Bonaldi, A.; Bonavera, L.; Bond, J. R.; Borrill, J.; Bouchet, F. R.; Boulanger, F.; Bucher, M.; Burigana, C.; Butler, R. C.; Calabrese, E.; Cardoso, J.-F.; Castex, G.; Catalano, A.; Challinor, A.; Chamballu, A.; Chiang, H. C.; Christensen, P. R.; Clements, D. L.; Colombi, S.; Colombo, L. P. L.; Combet, C.; Couchot, F.; Coulais, A.; Crill, B. P.; Curto, A.; Cuttaia, F.; Danese, L.; Davies, R. D.; Davis, R. J.; de Bernardis, P.; de Rosa, A.; de Zotti, G.; Delabrouille, J.; Delouis, J.-M.; Désert, F.-X.; Dickinson, C.; Diego, J. M.; Dolag, K.; Dole, H.; Donzelli, S.; Doré, O.; Douspis, M.; Ducout, A.; Dupac, X.; Efstathiou, G.; Elsner, F.; Enßlin, T. A.; Eriksen, H. K.; Fergusson, J.; Finelli, F.; Forni, O.; Frailis, M.; Fraisse, A. A.; Franceschi, E.; Frejsel, A.; Galeotta, S.; Galli, S.; Ganga, K.; Ghosh, T.; Giard, M.; Giraud-Héraud, Y.; Gjerløw, E.; González-Nuevo, J.; Górski, K. M.; Gratton, S.; Gregorio, A.; Gruppuso, A.; Gudmundsson, J. E.; Hansen, F. K.; Hanson, D.; Harrison, D. L.; Henrot-Versillé, S.; Hernández-Monteagudo, C.; Herranz, D.; Hildebrandt, S. R.; Hivon, E.; Hobson, M.; Holmes, W. A.; Hornstrup, A.; Hovest, W.; Huffenberger, K. M.; Hurier, G.; Jaffe, A. H.; Jaffe, T. R.; Jones, W. C.; Juvela, M.; Karakci, A.; Keihänen, E.; Keskitalo, R.; Kiiveri, K.; Kisner, T. S.; Kneissl, R.; Knoche, J.; Kunz, M.; Kurki-Suonio, H.; Lagache, G.; Lamarre, J.-M.; Lasenby, A.; Lattanzi, M.; Lawrence, C. R.; Leonardi, R.; Lesgourgues, J.; Levrier, F.; Liguori, M.; Lilje, P. B.; Linden-Vørnle, M.; Lindholm, V.; López-Caniego, M.; Lubin, P. M.; Macías-Pérez, J. F.; Maggio, G.; Maino, D.; Mandolesi, N.; Mangilli, A.; Maris, M.; Martin, P. G.; Martínez-González, E.; Masi, S.; Matarrese, S.; McGehee, P.; Meinhold, P. R.; Melchiorri, A.; Melin, J.-B.; Mendes, L.; Mennella, A.; Migliaccio, M.; Mitra, S.; Miville-Deschênes, M.-A.; Moneti, A.; Montier, L.; Morgante, G.; Mortlock, D.; Moss, A.; Munshi, D.; Murphy, J. A.; Naselsky, P.; Nati, F.; Natoli, P.; Netterfield, C. B.; Nørgaard-Nielsen, H. U.; Noviello, F.; Novikov, D.; Novikov, I.; Oxborrow, C. A.; Paci, F.; Pagano, L.; Pajot, F.; Paoletti, D.; Pasian, F.; Patanchon, G.; Pearson, T. J.; Perdereau, O.; Perotto, L.; Perrotta, F.; Pettorino, V.; Piacentini, F.; Piat, M.; Pierpaoli, E.; Pietrobon, D.; Plaszczynski, S.; Pointecouteau, E.; Polenta, G.; Pratt, G. W.; Prézeau, G.; Prunet, S.; Puget, J.-L.; Rachen, J. P.; Rebolo, R.; Reinecke, M.; Remazeilles, M.; Renault, C.; Renzi, A.; Ristorcelli, I.; Rocha, G.; Roman, M.; Rosset, C.; Rossetti, M.; Roudier, G.; Rubiño-Martín, J. A.; Rusholme, B.; Sandri, M.; Santos, D.; Savelainen, M.; Scott, D.; Seiffert, M. D.; Shellard, E. P. S.; Spencer, L. D.; Stolyarov, V.; Stompor, R.; Sudiwala, R.; Sutton, D.; Suur-Uski, A.-S.; Sygnet, J.-F.; Tauber, J. A.; Terenzi, L.; Toffolatti, L.; Tomasi, M.; Tristram, M.; Tucci, M.; Tuovinen, J.; Valenziano, L.; Valiviita, J.; Van Tent, B.; Vielva, P.; Villa, F.; Wade, L. A.; Wandelt, B. D.; Wehus, I. K.; Welikala, N.; Yvon, D.; Zacchei, A.; Zonca, A.

    2016-09-01

    We present the 8th full focal plane simulation set (FFP8), deployed in support of the Planck 2015 results. FFP8 consists of 10 fiducial mission realizations reduced to 18 144 maps, together with the most massive suite of Monte Carlo realizations of instrument noise and CMB ever generated, comprising 10^4 mission realizations reduced to about 10^6 maps. The resulting maps incorporate the dominant instrumental, scanning, and data analysis effects, and the remaining subdominant effects will be included in future updates. Generated at a cost of some 25 million CPU-hours spread across multiple high-performance-computing (HPC) platforms, FFP8 is used to validate and verify analysis algorithms and their implementations, and to remove biases from and quantify uncertainties in the results of analyses of the real data.

  7. Time activities at the BIPM

    NASA Technical Reports Server (NTRS)

    Thomas, Claudine

    1995-01-01

    The generation and dissemination of International Atomic Time, TAI, and of Coordinated Universal Time, UTC, are explicitly mentioned in the list of the principal tasks of the BIPM, recalled in the Comptes Rendus of the 18th Conference Generale des Poids et Mesures, in 1987. These tasks are fulfilled by the BIPM Time Section, thanks to international cooperation with national timing centers, which maintain, under metrological conditions, the clocks used to generate TAI. Besides the current work of data collection and processing, research activities are carried out in order to adapt the computation of TAI to the most recent improvements occurring in the time and frequency domains. Studies concerning the application of general relativity and pulsar timing to time metrology are also actively pursued. This paper summarizes the work done in all these fields and outlines future projects.

  8. Development of standardized air-blown coal gasifier/gas turbine concepts for future electric power systems, Volume 4

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1991-02-01

    This appendix is a compilation of work done to predict overall cycle performance from gasifier to generator terminals. A spreadsheet has been generated for each case to show flows within a cycle. The spreadsheet shows gaseous or solid composition of flow, temperature of flow, quantity of flow, and heat content of flow. Prediction of steam and gas turbine performance was obtained by the computer program GTPro. Outputs of all runs for each combined cycle reviewed have been added to this appendix. A process schematic displaying all flows predicted through GTPro and the spreadsheet is also added to this appendix. The numbered bubbles on the schematic correspond to columns in the top headings of the spreadsheet.

  9. Cochlear Implant Electrode Localization Using an Ultra-High Resolution Scan Mode on Conventional 64-Slice and New Generation 192-Slice Multi-Detector Computed Tomography.

    PubMed

    Carlson, Matthew L; Leng, Shuai; Diehn, Felix E; Witte, Robert J; Krecke, Karl N; Grimes, Josh; Koeller, Kelly K; Bruesewitz, Michael R; McCollough, Cynthia H; Lane, John I

    2017-08-01

    A new generation 192-slice multi-detector computed tomography (MDCT) clinical scanner provides enhanced image quality and superior electrode localization over conventional MDCT. Currently, accurate and reliable cochlear implant electrode localization using conventional MDCT scanners remains elusive. Eight fresh-frozen cadaveric temporal bones were implanted with full-length cochlear implant electrodes. Specimens were subsequently scanned with conventional 64-slice and new generation 192-slice MDCT scanners utilizing ultra-high resolution modes. Additionally, all specimens were scanned with micro-CT to provide a reference criterion for electrode position. Images were reconstructed according to routine temporal bone clinical protocols. Three neuroradiologists, blinded to scanner type, reviewed images independently to assess resolution of individual electrodes, scalar localization, and severity of image artifact. Serving as the reference standard, micro-CT identified scalar crossover in one specimen; imaging of all remaining cochleae demonstrated complete scala tympani insertions. The 192-slice MDCT scanner exhibited improved resolution of individual electrodes (p < 0.01), superior scalar localization (p < 0.01), and reduced blooming artifact (p < 0.05), compared with conventional 64-slice MDCT. There was no significant difference between platforms when comparing streak or ring artifact. The new generation 192-slice MDCT scanner offers several notable advantages for cochlear implant imaging compared with conventional MDCT. This technology provides important feedback regarding electrode position and course, which may help in future optimization of surgical technique and electrode design.

  10. From what should we protect future generations: germ-line therapy or genetic screening?

    PubMed

    Mallia, Pierre; ten Have, Henk

    2003-01-01

    This paper discusses the issue of whether we have responsibilities to future generations with respect to genetic screening, including for purposes of selective abortion or discard. Future generations have been discussed at length among scholars. The concept of 'Guardian for Future Generations' is tackled and its main criticisms discussed. Whilst germ-line cures, it is argued, can only affect family trees, genetic screening and testing can have wider implications. If asking how this may affect future generations is a legitimate question and since we indeed make retrospective moral judgements, it would be wise to consider that future generations will make the same retrospective judgements on us. Moreover such technologies affect present embryos to which we indeed can be considered to have an obligation.

  11. Reducing a Knowledge-Base Search Space When Data Are Missing

    NASA Technical Reports Server (NTRS)

    James, Mark

    2007-01-01

    This software addresses the problem of how to efficiently execute a knowledge base in the presence of missing data. Computationally, this is an exponentially expensive operation that without heuristics generates a search space of 1 + 2^n possible scenarios, where n is the number of rules in the knowledge base. Even for a knowledge base of the most modest size, say 16 rules, it would produce 65,537 possible scenarios. The purpose of this software is to reduce the complexity of this operation to a more manageable size. The problem that this system solves is to develop an automated approach that can reason in the presence of missing data. This is a meta-reasoning capability that repeatedly calls a diagnostic engine/model to provide prognoses and prognosis tracking. In the big picture, the scenario generator takes as its input the current state of a system, including probabilistic information from Data Forecasting. Using model-based reasoning techniques, it returns an ordered list of fault scenarios that could be generated from the current state, i.e., the plausible future failure modes of the system as it presently stands. The scenario generator models a Potential Fault Scenario (PFS) as a black box, the input of which is a set of states tagged with priorities and the output of which is one or more potential fault scenarios tagged by a confidence factor. The results from the system are used by a model-based diagnostician to predict the future health of the monitored system.
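    The combinatorics and one simple pruning heuristic can be illustrated as follows: with n rules whose antecedents may be unknown, exhaustive reasoning yields 1 + 2^n scenarios, so a practical generator caps the scenarios it emits, for example by a joint-confidence threshold. The rules, confidence values, and threshold below are invented and do not reproduce the heuristics of the actual software.

```python
# Scenario-space growth (1 + 2**n) and a toy confidence-threshold pruning heuristic.
# Rules, confidences, and the threshold are illustrative, not the actual system.
from itertools import combinations

rules = {"R1": 0.9, "R2": 0.7, "R3": 0.4, "R4": 0.2}     # rule -> confidence it applies
n = len(rules)
print("exhaustive scenario count:", 1 + 2**n)             # 17 here; 65,537 for n = 16

def scenarios(rule_conf, min_confidence=0.1):
    """Yield rule subsets whose joint confidence stays above a threshold."""
    names = sorted(rule_conf, key=rule_conf.get, reverse=True)
    yield ()                                               # the 'no rule fires' scenario
    for k in range(1, len(names) + 1):
        for subset in combinations(names, k):
            conf = 1.0
            for r in subset:
                conf *= rule_conf[r]
            if conf >= min_confidence:
                yield subset

kept = list(scenarios(rules))
print("scenarios kept after pruning:", len(kept))
```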

  12. Multi-Attribute Task Battery - Applications in pilot workload and strategic behavior research

    NASA Technical Reports Server (NTRS)

    Arnegard, Ruth J.; Comstock, J. R., Jr.

    1991-01-01

    The Multi-Attribute Task (MAT) Battery provides a benchmark set of tasks for use in a wide range of lab studies of operator performance and workload. The battery incorporates tasks analogous to activities that aircraft crewmembers perform in flight, while providing a high degree of experimenter control, performance data on each subtask, and freedom to use nonpilot test subjects. Features not found in existing computer based tasks include an auditory communication task (to simulate Air Traffic Control communication), a resource management task permitting many avenues or strategies of maintaining target performance, a scheduling window which gives the operator information about future task demands, and the option of manual or automated control of tasks. Performance data are generated for each subtask. In addition, the task battery may be paused and onscreen workload rating scales presented to the subject. The MAT Battery requires a desktop computer with color graphics. The communication task requires a serial link to a second desktop computer with a voice synthesizer or digitizer card.

  13. The multi-attribute task battery for human operator workload and strategic behavior research

    NASA Technical Reports Server (NTRS)

    Comstock, J. Raymond, Jr.; Arnegard, Ruth J.

    1992-01-01

    The Multi-Attribute Task (MAT) Battery provides a benchmark set of tasks for use in a wide range of lab studies of operator performance and workload. The battery incorporates tasks analogous to activities that aircraft crewmembers perform in flight, while providing a high degree of experimenter control, performance data on each subtask, and freedom to use nonpilot test subjects. Features not found in existing computer based tasks include an auditory communication task (to simulate Air Traffic Control communication), a resource management task permitting many avenues or strategies of maintaining target performance, a scheduling window which gives the operator information about future task demands, and the option of manual or automated control of tasks. Performance data are generated for each subtask. In addition, the task battery may be paused and onscreen workload rating scales presented to the subject. The MAT Battery requires a desktop computer with color graphics. The communication task requires a serial link to a second desktop computer with a voice synthesizer or digitizer card.

  14. Opportunities for nonvolatile memory systems in extreme-scale high-performance computing

    DOE PAGES

    Vetter, Jeffrey S.; Mittal, Sparsh

    2015-01-12

    For extreme-scale high-performance computing systems, system-wide power consumption has been identified as one of the key constraints moving forward, where DRAM main memory systems account for about 30 to 50 percent of a node's overall power consumption. As the benefits of device scaling for DRAM memory slow, it will become increasingly difficult to keep memory capacities balanced with increasing computational rates offered by next-generation processors. However, several emerging memory technologies related to nonvolatile memory (NVM) devices are being investigated as an alternative for DRAM. Moving forward, NVM devices could offer solutions for HPC architectures. Researchers are investigating how to integrate these emerging technologies into future extreme-scale HPC systems and how to expose these capabilities in the software stack and applications. In addition, current results show several of these strategies could offer high-bandwidth I/O, larger main memory capacities, persistent data structures, and new approaches for application resilience and output postprocessing, such as transaction-based incremental checkpointing and in situ visualization, respectively.

  15. Digital pathology imaging as a novel platform for standardization and globalization of quantitative nephropathology

    PubMed Central

    Gimpel, Charlotte; Kain, Renate; Laurinavicius, Arvydas; Bueno, Gloria; Zeng, Caihong; Liu, Zhihong; Schaefer, Franz; Kretzler, Matthias; Holzman, Lawrence B.; Hewitt, Stephen M.

    2017-01-01

    The introduction of digital pathology to nephrology provides a platform for the development of new methodologies and protocols for visual, morphometric and computer-aided assessment of renal biopsies. Application of digital imaging to pathology made substantial progress over the past decade; it is now in use for education, clinical trials and translational research. Digital pathology evolved as a valuable tool to generate comprehensive structural information in digital form, a key prerequisite for achieving precision pathology for computational biology. The application of this new technology on an international scale is driving novel methods for collaborations, providing unique opportunities but also challenges. Standardization of methods needs to be rigorously evaluated and applied at each step, from specimen processing to scanning, uploading into digital repositories, morphologic, morphometric and computer-aided assessment, data collection and analysis. In this review, we discuss the status and opportunities created by the application of digital imaging to precision nephropathology, and present a vision for the near future. PMID:28584625

  16. Design and Analysis Tools for Supersonic Inlets

    NASA Technical Reports Server (NTRS)

    Slater, John W.; Folk, Thomas C.

    2009-01-01

    Computational tools are being developed for the design and analysis of supersonic inlets. The objective is to update existing tools and provide design and low-order aerodynamic analysis capability for advanced inlet concepts. The Inlet Tools effort includes aspects of creating an electronic database of inlet design information, a document describing inlet design and analysis methods, a geometry model for describing the shape of inlets, and computer tools that implement the geometry model and methods. The geometry model has a set of basic inlet shapes that include pitot, two-dimensional, axisymmetric, and stream-traced inlet shapes. The inlet model divides the inlet flow field into parts that facilitate the design and analysis methods. The inlet geometry model constructs the inlet surfaces through the generation and transformation of planar entities based on key inlet design factors. Future efforts will focus on developing the inlet geometry model, the inlet design and analysis methods, and a Fortran 95 code to implement the model and methods. Other computational platforms, such as Java, will also be explored.

  17. Digital pathology imaging as a novel platform for standardization and globalization of quantitative nephropathology.

    PubMed

    Barisoni, Laura; Gimpel, Charlotte; Kain, Renate; Laurinavicius, Arvydas; Bueno, Gloria; Zeng, Caihong; Liu, Zhihong; Schaefer, Franz; Kretzler, Matthias; Holzman, Lawrence B; Hewitt, Stephen M

    2017-04-01

    The introduction of digital pathology to nephrology provides a platform for the development of new methodologies and protocols for visual, morphometric and computer-aided assessment of renal biopsies. Application of digital imaging to pathology made substantial progress over the past decade; it is now in use for education, clinical trials and translational research. Digital pathology evolved as a valuable tool to generate comprehensive structural information in digital form, a key prerequisite for achieving precision pathology for computational biology. The application of this new technology on an international scale is driving novel methods for collaborations, providing unique opportunities but also challenges. Standardization of methods needs to be rigorously evaluated and applied at each step, from specimen processing to scanning, uploading into digital repositories, morphologic, morphometric and computer-aided assessment, data collection and analysis. In this review, we discuss the status and opportunities created by the application of digital imaging to precision nephropathology, and present a vision for the near future.

  18. Reconfigurable Hardware Adapts to Changing Mission Demands

    NASA Technical Reports Server (NTRS)

    2003-01-01

    A new class of computing architectures and processing systems, which use reconfigurable hardware, is creating a revolutionary approach to implementing future spacecraft systems. With the increasing complexity of electronic components, engineers must design next-generation spacecraft systems with new technologies in both hardware and software. Derivation Systems, Inc., of Carlsbad, California, has been working through NASA s Small Business Innovation Research (SBIR) program to develop key technologies in reconfigurable computing and Intellectual Property (IP) soft cores. Founded in 1993, Derivation Systems has received several SBIR contracts from NASA s Langley Research Center and the U.S. Department of Defense Air Force Research Laboratories in support of its mission to develop hardware and software for high-assurance systems. Through these contracts, Derivation Systems began developing leading-edge technology in formal verification, embedded Java, and reconfigurable computing for its PF3100, Derivational Reasoning System (DRS ), FormalCORE IP, FormalCORE PCI/32, FormalCORE DES, and LavaCORE Configurable Java Processor, which are designed for greater flexibility and security on all space missions.

  19. Low latency network and distributed storage for next generation HPC systems: the ExaNeSt project

    NASA Astrophysics Data System (ADS)

    Ammendola, R.; Biagioni, A.; Cretaro, P.; Frezza, O.; Lo Cicero, F.; Lonardo, A.; Martinelli, M.; Paolucci, P. S.; Pastorelli, E.; Pisani, F.; Simula, F.; Vicini, P.; Navaridas, J.; Chaix, F.; Chrysos, N.; Katevenis, M.; Papaeustathiou, V.

    2017-10-01

    With processor architecture evolution, the HPC market has undergone a paradigm shift. The adoption of low-cost, Linux-based clusters extended the reach of HPC from its roots in modelling and simulation of complex physical systems to a broader range of industries, from biotechnology, cloud computing, computer analytics and big data challenges to manufacturing sectors. In this perspective, near-future HPC systems can be envisioned as composed of millions of low-power computing cores, densely packed (and therefore requiring appropriate cooling technology), with a tightly interconnected, low-latency, high-performance network and a distributed storage architecture. Each of these features (dense packing, distributed storage and a high-performance interconnect) represents a challenge, made all the harder by the need to solve them at the same time. These challenges lie as stumbling blocks along the road towards Exascale-class systems; the ExaNeSt project acknowledges them and tasks itself with investigating ways around them.

  20. Optimal dynamic remapping of parallel computations

    NASA Technical Reports Server (NTRS)

    Nicol, David M.; Reynolds, Paul F., Jr.

    1987-01-01

    A large class of computations is characterized by a sequence of phases, with phase changes occurring unpredictably. The decision problem was considered regarding the remapping of workload to processors in a parallel computation when the utility of remapping and the future behavior of the workload are uncertain, and phases exhibit stable execution requirements during a given phase, but requirements may change radically between phases. For these problems a workload assignment generated for one phase may hinder performance during the next phase. This problem is treated formally for a probabilistic model of computation with at most two phases. The fundamental problem of balancing the expected remapping performance gain against the delay cost was addressed. Stochastic dynamic programming is used to show that the remapping decision policy minimizing the expected running time of the computation has an extremely simple structure. Because the gain may not be predictable, the performance of a heuristic policy that does not require estimation of the gain is examined. The heuristic method's feasibility is demonstrated by its use on an adaptive fluid dynamics code on a multiprocessor. The results suggest that except in extreme cases, the remapping decision problem is essentially that of dynamically determining whether gain can be achieved by remapping after a phase change. The results also suggest that this heuristic is applicable to computations with more than two phases.
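
    As a minimal illustration of the decision being formalized above (not the paper's actual stochastic dynamic program), the sketch below remaps only when the expected accumulated imbalance penalty over the remaining phase exceeds the one-time remapping delay; all names and quantities are hypothetical.

```python
# Hedged sketch: remap after a suspected phase change only if the expected
# saving over the remaining phase outweighs the one-time remapping delay.
def should_remap(step_time_balanced, step_time_current,
                 expected_steps_remaining, remap_delay):
    """Return True if remapping is expected to pay for itself."""
    per_step_penalty = step_time_current - step_time_balanced
    expected_gain = per_step_penalty * expected_steps_remaining
    return expected_gain > remap_delay

# Example: 0.2 s/step imbalance over ~500 remaining steps vs. a 30 s remap.
print(should_remap(1.0, 1.2, 500, 30.0))   # True
```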

  1. 78 FR 68058 - Next Generation Risk Assessment: Incorporation of Recent Advances in Molecular, Computational...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-11-13

    ... Generation Risk Assessment: Incorporation of Recent Advances in Molecular, Computational, and Systems Biology... Generation Risk Assessment: Incorporation of Recent Advances in Molecular, Computational, and Systems Biology..., computational, and systems biology data can better inform risk assessment. This draft document is available for...

  2. The RANDOM computer program: A linear congruential random number generator

    NASA Technical Reports Server (NTRS)

    Miles, R. F., Jr.

    1986-01-01

    The RANDOM Computer Program is a FORTRAN program for generating random number sequences and testing linear congruential random number generators (LCGs). The linear congruential form of random number generator is discussed, and the selection of parameters of an LCG for a microcomputer is described. This document describes the following: (1) The RANDOM Computer Program; (2) RANDOM.MOD, the computer code needed to implement an LCG in a FORTRAN program; and (3) The RANCYCLE and the ARITH Computer Programs that provide computational assistance in the selection of parameters for an LCG. The RANDOM, RANCYCLE, and ARITH Computer Programs are written in Microsoft FORTRAN for the IBM PC microcomputer and its compatibles. With only minor modifications, the RANDOM Computer Program and its LCG can be run on most microcomputers or mainframe computers.
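
    For readers unfamiliar with the form being tested, the sketch below implements a generic linear congruential generator in Python; the multiplier and increment are commonly cited textbook constants, not necessarily the parameters selected by the RANDOM program.

```python
# Generic LCG of the form x_{n+1} = (a * x_n + c) mod m.
# Constants are illustrative, not those chosen by the RANDOM program.
def lcg(seed, a=1664525, c=1013904223, m=2**32):
    """Yield an endless stream of pseudo-random integers in [0, m)."""
    x = seed % m
    while True:
        x = (a * x + c) % m
        yield x

def lcg_uniform(seed, n, m=2**32):
    """Return n floats in [0, 1) drawn from the LCG above."""
    gen = lcg(seed, m=m)
    return [next(gen) / m for _ in range(n)]

print(lcg_uniform(seed=42, n=5))
```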

  3. Identifying Differences between Depressed Adolescent Suicide Ideators and Attempters

    PubMed Central

    Auerbach, Randy P.; Millner, Alexander J.; Stewart, Jeremy G.; Esposito, Erika

    2015-01-01

    Background Adolescent depression and suicide are pressing public health concerns, and identifying key differences among suicide ideators and attempters is critical. The goal of the current study is to test whether depressed adolescent suicide attempters report greater anhedonia severity and exhibit aberrant effort-cost computations in the face of uncertainty. Methods Depressed adolescents (n = 101) ages 13–19 years were administered structured clinical interviews to assess current mental health disorders and a history of suicidality (suicide ideators = 55, suicide attempters = 46). Then, participants completed self-report instruments assessing symptoms of suicidal ideation, depression, anhedonia, and anxiety as well as a computerized effort-cost computation task. Results Compared with depressed adolescent suicide ideators, attempters report greater anhedonia severity, even after concurrently controlling for symptoms of suicidal ideation, depression, and anxiety. Additionally, when completing the effort-cost computation task, suicide attempters are less likely to pursue the difficult, high-value option when outcomes are uncertain. Follow-up, trial-level analyses of effort-cost computations suggest that receipt of reward does not influence future decision-making among suicide attempters; however, suicide ideators exhibit a win-stay approach when receiving rewards on previous trials. Limitations Findings should be considered in light of limitations including a modest sample size, which limits generalizability, and the cross-sectional design. Conclusions Depressed adolescent suicide attempters are characterized by greater anhedonia severity, which may impair the ability to integrate previous rewarding experiences to inform future decisions. Taken together, this may generate a feeling of powerlessness that contributes to increased suicidality and a needless loss of life. PMID:26233323

  4. From data to analysis: linking NWChem and Avogadro with the syntax and semantics of Chemical Markup Language.

    PubMed

    de Jong, Wibe A; Walker, Andrew M; Hanwell, Marcus D

    2013-05-24

    Multidisciplinary integrated research requires the ability to couple the diverse sets of data obtained from a range of complex experiments and computer simulations. Integrating data requires semantically rich information. In this paper an end-to-end use of semantically rich data in computational chemistry is demonstrated utilizing the Chemical Markup Language (CML) framework. Semantically rich data is generated by the NWChem computational chemistry software with the FoX library and utilized by the Avogadro molecular editor for analysis and visualization. The NWChem computational chemistry software has been modified and coupled to the FoX library to write CML compliant XML data files. The FoX library was expanded to represent the lexical input files and molecular orbitals used by the computational chemistry software. Draft dictionary entries and a format for molecular orbitals within CML CompChem were developed. The Avogadro application was extended to read in CML data, and display molecular geometry and electronic structure in the GUI allowing for an end-to-end solution where Avogadro can create input structures, generate input files, NWChem can run the calculation and Avogadro can then read in and analyse the CML output produced. The developments outlined in this paper will be made available in future releases of NWChem, FoX, and Avogadro. The production of CML compliant XML files for computational chemistry software such as NWChem can be accomplished relatively easily using the FoX library. The CML data can be read in by a newly developed reader in Avogadro and analysed or visualized in various ways. A community-based effort is needed to further develop the CML CompChem convention and dictionary. This will enable the long-term goal of allowing a researcher to run simple "Google-style" searches of chemistry and physics and have the results of computational calculations returned in a comprehensible form alongside articles from the published literature.
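
    As a rough illustration of consuming such CML output downstream, the sketch below reads atom records from a CML-style XML file with the Python standard library; the element and attribute names (atomArray/atom, elementType, x3/y3/z3) follow common CML usage, but the exact convention emitted by NWChem's FoX output may differ.

```python
# Hedged sketch: pull atomic symbols and 3-D coordinates out of a CML file.
import xml.etree.ElementTree as ET

CML_NS = {"cml": "http://www.xml-cml.org/schema"}   # common CML namespace

def read_atoms(path):
    """Return a list of (element, x, y, z) tuples from a CML-style file."""
    root = ET.parse(path).getroot()
    atoms = []
    for atom in root.findall(".//cml:atomArray/cml:atom", CML_NS):
        atoms.append((atom.get("elementType"),
                      float(atom.get("x3")),
                      float(atom.get("y3")),
                      float(atom.get("z3"))))
    return atoms
```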

  5. From data to analysis: linking NWChem and Avogadro with the syntax and semantics of Chemical Markup Language

    PubMed Central

    2013-01-01

    Background Multidisciplinary integrated research requires the ability to couple the diverse sets of data obtained from a range of complex experiments and computer simulations. Integrating data requires semantically rich information. In this paper an end-to-end use of semantically rich data in computational chemistry is demonstrated utilizing the Chemical Markup Language (CML) framework. Semantically rich data is generated by the NWChem computational chemistry software with the FoX library and utilized by the Avogadro molecular editor for analysis and visualization. Results The NWChem computational chemistry software has been modified and coupled to the FoX library to write CML compliant XML data files. The FoX library was expanded to represent the lexical input files and molecular orbitals used by the computational chemistry software. Draft dictionary entries and a format for molecular orbitals within CML CompChem were developed. The Avogadro application was extended to read in CML data, and display molecular geometry and electronic structure in the GUI allowing for an end-to-end solution where Avogadro can create input structures, generate input files, NWChem can run the calculation and Avogadro can then read in and analyse the CML output produced. The developments outlined in this paper will be made available in future releases of NWChem, FoX, and Avogadro. Conclusions The production of CML compliant XML files for computational chemistry software such as NWChem can be accomplished relatively easily using the FoX library. The CML data can be read in by a newly developed reader in Avogadro and analysed or visualized in various ways. A community-based effort is needed to further develop the CML CompChem convention and dictionary. This will enable the long-term goal of allowing a researcher to run simple “Google-style” searches of chemistry and physics and have the results of computational calculations returned in a comprehensible form alongside articles from the published literature. PMID:23705910

  6. Entropy generation method to quantify thermal comfort.

    PubMed

    Boregowda, S C; Tiwari, S N; Chaturvedi, S K

    2001-12-01

    The present paper presents a thermodynamic approach to assess the quality of human-thermal environment interaction and quantify thermal comfort. The approach involves the development of an entropy generation term by applying the second law of thermodynamics to the combined human-environment system. The entropy generation term combines both human thermal physiological responses and thermal environmental variables to provide an objective measure of thermal comfort. The original concepts and definitions form the basis for establishing the mathematical relationship between thermal comfort and the entropy generation term. As a result of this logical and deterministic approach, an Objective Thermal Comfort Index (OTCI) is defined and established as a function of entropy generation. In order to verify the entropy-based thermal comfort model, human thermal physiological responses due to changes in ambient conditions are simulated using a well-established and validated human thermal model developed at the Institute of Environmental Research of Kansas State University (KSU). The finite element based KSU human thermal computer model is being utilized as a "Computational Environmental Chamber" to conduct a series of simulations to examine the human thermal responses to different environmental conditions. The output from the simulation, which includes human thermal responses, and the input data consisting of environmental conditions are fed into the thermal comfort model. Continuous monitoring of thermal comfort in comfortable and extreme environmental conditions is demonstrated. The Objective Thermal Comfort Index values obtained from the entropy-based model are validated against regression-based Predicted Mean Vote (PMV) values. The PMV values are generated by applying the corresponding air temperatures and vapor pressures that were used in the computer simulation to the regression equation. The preliminary results indicate that the OTCI and PMV values correlate well under ideal conditions. However, an experimental study is needed in the future to fully establish the validity of the OTCI formula and the model. One of the practical applications of this index is that it could be integrated into thermal control systems to develop human-centered environmental control systems for potential use in aircraft, mass transit vehicles, intelligent building systems, and space vehicles.
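
    As a minimal illustration of the second-law bookkeeping underlying such an index (not the paper's full OTCI model, which combines many physiological and environmental terms), the sketch below computes the entropy generated by steady heat exchange between the skin and the environment.

```python
# Entropy generated by an irreversible heat flow Q from the skin (T_skin) to
# the environment (T_env): S_gen = Q * (1/T_env - 1/T_skin). Values are
# illustrative; the mapping to a comfort index is hypothetical.
def entropy_generation(q_watts, t_skin_k, t_env_k):
    """Entropy generation rate (W/K) for heat flow from skin to environment."""
    return q_watts * (1.0 / t_env_k - 1.0 / t_skin_k)

print(entropy_generation(q_watts=90.0, t_skin_k=307.0, t_env_k=296.0))
```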

  7. Entropy generation method to quantify thermal comfort

    NASA Technical Reports Server (NTRS)

    Boregowda, S. C.; Tiwari, S. N.; Chaturvedi, S. K.

    2001-01-01

    The present paper presents a thermodynamic approach to assess the quality of human-thermal environment interaction and quantify thermal comfort. The approach involves the development of an entropy generation term by applying the second law of thermodynamics to the combined human-environment system. The entropy generation term combines both human thermal physiological responses and thermal environmental variables to provide an objective measure of thermal comfort. The original concepts and definitions form the basis for establishing the mathematical relationship between thermal comfort and the entropy generation term. As a result of this logical and deterministic approach, an Objective Thermal Comfort Index (OTCI) is defined and established as a function of entropy generation. In order to verify the entropy-based thermal comfort model, human thermal physiological responses due to changes in ambient conditions are simulated using a well-established and validated human thermal model developed at the Institute of Environmental Research of Kansas State University (KSU). The finite element based KSU human thermal computer model is being utilized as a "Computational Environmental Chamber" to conduct a series of simulations to examine the human thermal responses to different environmental conditions. The output from the simulation, which includes human thermal responses, and the input data consisting of environmental conditions are fed into the thermal comfort model. Continuous monitoring of thermal comfort in comfortable and extreme environmental conditions is demonstrated. The Objective Thermal Comfort Index values obtained from the entropy-based model are validated against regression-based Predicted Mean Vote (PMV) values. The PMV values are generated by applying the corresponding air temperatures and vapor pressures that were used in the computer simulation to the regression equation. The preliminary results indicate that the OTCI and PMV values correlate well under ideal conditions. However, an experimental study is needed in the future to fully establish the validity of the OTCI formula and the model. One of the practical applications of this index is that it could be integrated into thermal control systems to develop human-centered environmental control systems for potential use in aircraft, mass transit vehicles, intelligent building systems, and space vehicles.

  8. Toward a molecular programming language for algorithmic self-assembly

    NASA Astrophysics Data System (ADS)

    Patitz, Matthew John

    Self-assembly is the process whereby relatively simple components autonomously combine to form more complex objects. Nature exhibits self-assembly to form everything from microscopic crystals to living cells to galaxies. With a desire to both form increasingly sophisticated products and to understand the basic components of living systems, scientists have developed and studied artificial self-assembling systems. One such framework is the Tile Assembly Model introduced by Erik Winfree in 1998. In this model, simple two-dimensional square 'tiles' are designed so that they self-assemble into desired shapes. The work in this thesis consists of a series of results which build toward the future goal of designing an abstracted, high-level programming language for designing the molecular components of self-assembling systems which can perform powerful computations and form into intricate structures. The first two sets of results demonstrate self-assembling systems which perform infinite series of computations that characterize computably enumerable and decidable languages, and exhibit tools for algorithmically generating the necessary sets of tiles. In the next chapter, methods for generating tile sets which self-assemble into complicated shapes, namely a class of discrete self-similar fractal structures, are presented. Next, a software package for graphically designing tile sets, simulating their self-assembly, and debugging designed systems is discussed. Finally, a high-level programming language which abstracts much of the complexity and tedium of designing such systems, while preventing many of the common errors, is presented. The summation of this body of work presents a broad coverage of the spectrum of desired outputs from artificial self-assembling systems and a progression in the sophistication of tools used to design them. By creating a broader and deeper set of modular tools for designing self-assembling systems, we hope to increase the complexity which is attainable. These tools provide a solid foundation for future work in both the Tile Assembly Model and explorations into more advanced models.

  9. From prediction error to incentive salience: mesolimbic computation of reward motivation.

    PubMed

    Berridge, Kent C

    2012-04-01

    Reward contains separable psychological components of learning, incentive motivation and pleasure. Most computational models have focused only on the learning component of reward, but the motivational component is equally important in reward circuitry, and even more directly controls behavior. Modeling the motivational component requires recognition of additional control factors besides learning. Here I discuss how mesocorticolimbic mechanisms generate the motivation component of incentive salience. Incentive salience takes Pavlovian learning and memory as one input and as an equally important input takes neurobiological state factors (e.g. drug states, appetite states, satiety states) that can vary independently of learning. Neurobiological state changes can produce unlearned fluctuations or even reversals in the ability of a previously learned reward cue to trigger motivation. Such fluctuations in cue-triggered motivation can dramatically depart from all previously learned values about the associated reward outcome. Thus, one consequence of the difference between incentive salience and learning can be to decouple cue-triggered motivation of the moment from previously learned values of how good the associated reward has been in the past. Another consequence can be to produce irrationally strong motivation urges that are not justified by any memories of previous reward values (and without distorting associative predictions of future reward value). Such irrationally strong motivation may be especially problematic in addiction. To understand these phenomena, future models of mesocorticolimbic reward function should address the neurobiological state factors that participate to control generation of incentive salience. © 2012 The Author. European Journal of Neuroscience © 2012 Federation of European Neuroscience Societies and Blackwell Publishing Ltd.
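
    The core distinction drawn above can be caricatured in a few lines: cue-triggered "wanting" is the learned cue value dynamically modulated by the current physiological state. The multiplicative gain below is a deliberate simplification for illustration, not the published mesolimbic model.

```python
# Toy illustration: the same learned cue value yields very different
# motivation depending on a state-dependent gain (appetite, satiety, drugs).
def incentive_salience(learned_cue_value, state_gain):
    """state_gain > 1 amplifies 'wanting'; state_gain < 1 dampens it."""
    return learned_cue_value * state_gain

print(incentive_salience(1.0, 0.2))   # sated state: weak wanting
print(incentive_salience(1.0, 3.0))   # deprived state: amplified wanting
```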

  10. Non-harmful insertion of data mimicking computer network attacks

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Neil, Joshua Charles; Kent, Alexander; Hash, Jr, Curtis Lee

    Non-harmful data mimicking computer network attacks may be inserted in a computer network. Anomalous real network connections may be generated between a plurality of computing systems in the network. Data mimicking an attack may also be generated. The generated data may be transmitted between the plurality of computing systems using the real network connections and measured to determine whether an attack is detected.
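
    A toy sketch of the idea, with hypothetical host names and record format: benign connection records are generated along a random chain of real hosts and tagged as synthetic so a detector can be exercised without harmful traffic.

```python
import random

def generate_mimic_path(hosts, length=4):
    """Pick a random chain of distinct hosts to emulate an attack traversal."""
    return random.sample(hosts, k=min(length, len(hosts)))

def emit_mimic_events(path):
    """Produce benign connection records along the chain, flagged as synthetic."""
    return [{"src": a, "dst": b, "synthetic": True}
            for a, b in zip(path, path[1:])]

events = emit_mimic_events(generate_mimic_path(["h1", "h2", "h3", "h4", "h5"]))
print(events)
```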

  11. Simulation of a proposed emergency outlet from Devils Lake, North Dakota

    USGS Publications Warehouse

    Vecchia, Aldo V.

    2002-01-01

    From 1993 to 2001, Devils Lake rose more than 25 feet, flooding farmland, roads, and structures around the lake and causing more than $400 million in damages in the Devils Lake Basin. In July 2001, the level of Devils Lake was at 1,448.0 feet above sea level, which was the highest lake level in more than 160 years. The lake could continue to rise to several feet above its natural spill elevation to the Sheyenne River (1,459 feet above sea level) in future years, causing extensive additional flooding in the basin and, in the event of an uncontrolled natural spill, downstream in the Red River of the North Basin as well. The outlet simulation model described in this report was developed to determine the potential effects of various outlet alternatives on the future lake levels and water quality of Devils Lake. Lake levels of Devils Lake are controlled largely by precipitation on the lake surface, evaporation from the lake surface, and surface inflow. For this study, a monthly water-balance model was developed to compute the change in total volume of Devils Lake, and a regression model was used to estimate monthly water-balance data on the basis of limited recorded data. Estimated coefficients for the regression model indicated fitted precipitation on the lake surface was greater than measured precipitation in most months, fitted evaporation from the lake surface was less than estimated evaporation in most months, and ungaged inflow was about 2 percent of gaged inflow in most months. Dissolved sulfate was considered to be the key water-quality constituent for evaluating the effects of a proposed outlet on downstream water quality. Because large differences in sulfate concentrations existed among the various bays of Devils Lake, monthly water-balance data were used to develop detailed water and sulfate mass-balance models to compute changes in sulfate load for each of six major storage compartments in response to precipitation, evaporation, inflow, and outflow from each compartment. The storage compartments--five for Devils Lake and one for Stump Lake--were connected by bridge openings, culverts, or natural channels that restricted mixing between compartments. A numerical algorithm was developed to calculate inflow and outflow from each compartment. Sulfate loads for the storage compartments first were calculated using the assumptions that no interaction occurred between the bottom sediments and the water column and no wind- or buoyancy-induced mixing occurred between compartments. However, because the fitted sulfate loads did not agree with the estimated sulfate loads, which were obtained from recorded sulfate concentrations, components were added to the sulfate mass-balance model to account for the flux of sulfate between bottom sediments and the lake and for mixing between storage compartments. Mixing between compartments can occur during periods of open water because of wind and during periods of ice cover because of water-density differences between compartments. Sulfate loads calculated using the sulfate mass-balance model with sediment interaction and mixing between compartments closely matched sulfate loads computed from historical concentrations. The water and sulfate mass-balance models were used to calculate potential future lake levels and sulfate concentrations for Devils Lake and Stump Lake given potential future values of monthly precipitation, evaporation, and inflow. Potential future inputs were generated using a scenario approach and a stochastic approach.
In the scenario approach, historical values of precipitation, evaporation, and inflow were repeated in the future for a particular sequence of historical years. In the stochastic approach, a statistical time-series model was developed to randomly generate potential future inputs. The scenario approach was used to evaluate the effectiveness of various outlet alternatives, and the stochastic approach was used to evaluate the hydrologic and water-quality effects of the potential outlet alternatives that were selected on the basis of the scenario analysis. Given potential future lake levels and sulfate concentrations generated using either the scenario or stochastic approach and potential future ambient flows and sulfate concentrations for the Sheyenne River receiving waters, daily outlet discharges could be calculated for virtually any outlet alternative. For the scenario approach, future ambient flows and sulfate concentrations for the Sheyenne River were generated using the same sequence of years used for generating water-balance data for Devils Lake. For the stochastic approach, a procedure was developed for generating daily Sheyenne River flows and sulfate concentrations that were "in-phase" with the generated water-balance data for Devils Lake. Simulation results for the scenario approach indicated that neither of the West Bay outlet alternatives provided effective flood-damage reduction without exceeding downstream water-quality constraints. However, both Pelican Lake outlet alternatives provided significant flood-damage reduction with only minor downstream water-quality changes. The most effective alternative for controlling rising lake levels was a Pelican Lake outlet with a 480-cubic-foot-per-second pump capacity and a 250-milligram-per-liter downstream sulfate constraint. However, this plan is costly because of the high pump capacity and the requirement of a control structure on Highway 19 to control the level of Pelican Lake. A less costly, though less effective for flood-damage reduction, plan is a Pelican Lake outlet with a 300-cubic-foot-per-second pump capacity and a 250-milligram-per-liter downstream sulfate constraint. The plan is less costly because the pump capacity is smaller and because the control structure on Highway 19 is not required. The less costly Pelican Lake alternative with a 450-milligram-per-liter downstream sulfate constraint rather than a 250-milligram-per-liter downstream sulfate constraint was identified by the U.S. Army Corps of Engineers as the preferred alternative for detailed design and engineering analysis. Simulation results for the stochastic approach indicated that the geologic history of lake-level fluctuations of Devils Lake for the past 2,500 years was consistent with a climatic history that consisted of two climate states--a wet state, similar to conditions during 1980-99, and a normal state, similar to conditions during 1950-78. The transition times between the wet and normal climatic periods occurred randomly. The average duration of the wet climatic periods was 20 years, and the average duration of the normal climatic periods was 120 years. The stochastic approach was used to generate 10,000 independent sequences of lake levels and sulfate concentrations for Devils Lake for water years 2001-50. Each trace began with the same starting conditions, and the duration of the current wet cycle was generated randomly for each trace.
Each trace was generated for the baseline (natural) condition and for the Pelican Lake outlet with a 300-cubic-foot-per-second pump capacity and a 450-milligram-per-liter downstream sulfate constraint. The outlet significantly lowered the probabilities of future lake-level increases within the next 50 years and did not substantially increase the probabilities of reaching low lake levels or poor water-quality conditions during the same period.
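
    In the spirit of the monthly water-balance model described above, the sketch below shows the basic volume update; the function name, units and the omission of the area-volume relationship are simplifications for illustration only.

```python
# Hypothetical monthly water-balance update: volume change equals net gain
# over the lake surface (precipitation minus evaporation) plus inflow minus
# any outlet withdrawal. All quantities are in illustrative units.
def monthly_volume_change(lake_area_acres, precip_ft, evap_ft,
                          inflow_acre_ft, outlet_acre_ft=0.0):
    """Return the change in lake volume (acre-feet) for one month."""
    on_lake = lake_area_acres * (precip_ft - evap_ft)
    return on_lake + inflow_acre_ft - outlet_acre_ft

print(monthly_volume_change(120_000, 0.20, 0.25, 15_000, outlet_acre_ft=8_000))
```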

  12. Identifying opportune landing sites in degraded visual environments with terrain and cultural databases

    NASA Astrophysics Data System (ADS)

    Moody, Marc; Fisher, Robert; Little, J. Kristin

    2014-06-01

    Boeing has developed a degraded visual environment navigational aid that is flying on the Boeing AH-6 light attack helicopter. The navigational aid is a two dimensional software digital map underlay generated by the Boeing™ Geospatial Embedded Mapping Software (GEMS) and fully integrated with the operational flight program. The page format on the aircraft's multi function displays (MFD) is termed the Approach page. The existing work utilizes Digital Terrain Elevation Data (DTED) and OpenGL ES 2.0 graphics capabilities to compute the pertinent graphics underlay entirely on the graphics processor unit (GPU) within the AH-6 mission computer. The next release will incorporate cultural databases containing Digital Vertical Obstructions (DVO) to warn the crew of towers, buildings, and power lines when choosing an opportune landing site. Future IRAD will include Light Detection and Ranging (LIDAR) point cloud generating sensors to provide 2D and 3D synthetic vision on the final approach to the landing zone. Collision detection with respect to terrain, cultural, and point cloud datasets may be used to further augment the crew warning system. The techniques for creating the digital map underlay leverage the GPU almost entirely, making this solution viable on most embedded mission computing systems with an OpenGL ES 2.0 capable GPU. This paper focuses on the AH-6 crew interface process for determining a landing zone and flying the aircraft to it.

  13. Mantle Convection on Modern Supercomputers

    NASA Astrophysics Data System (ADS)

    Weismüller, J.; Gmeiner, B.; Huber, M.; John, L.; Mohr, M.; Rüde, U.; Wohlmuth, B.; Bunge, H. P.

    2015-12-01

    Mantle convection is the cause of plate tectonics, the formation of mountains and oceans, and the main driving mechanism behind earthquakes. The convection process is modeled by a system of partial differential equations describing the conservation of mass, momentum and energy. Characteristic of mantle flow is the vast disparity of length scales from global to microscopic, turning mantle convection simulations into a challenging application for high-performance computing. As system size and technical complexity of the simulations continue to increase, design and implementation of simulation models for next-generation large-scale architectures is handled successfully only in an interdisciplinary context. A new priority program by the German Research Foundation (DFG), named SPPEXA, addresses this issue and brings together computer scientists, mathematicians and application scientists around grand challenges in HPC. Here we report on the TERRA-NEO project, which is part of the high-visibility SPPEXA program and a joint effort of four research groups. TERRA-NEO develops algorithms for future HPC infrastructures, focusing on high computational efficiency and resilience in next-generation mantle convection models. We present software that can resolve the Earth's mantle with up to 10^12 grid points and scales efficiently to massively parallel hardware with more than 50,000 processors. We use our simulations to explore the dynamic regime of mantle convection and assess the impact of small-scale processes on global mantle flow.

  14. Efficiently mapping structure-property relationships of gas adsorption in porous materials: application to Xe adsorption.

    PubMed

    Kaija, A R; Wilmer, C E

    2017-09-08

    Designing better porous materials for gas storage or separations applications frequently leverages known structure-property relationships. Reliable structure-property relationships, however, only reveal themselves when adsorption data on many porous materials are aggregated and compared. Gathering enough data experimentally is prohibitively time consuming, and even approaches based on large-scale computer simulations face challenges. Brute force computational screening approaches that do not efficiently sample the space of porous materials may be ineffective when the number of possible materials is too large. Here we describe a general and efficient computational method for mapping structure-property spaces of porous materials that can be useful for adsorption related applications. We describe an algorithm that generates random porous "pseudomaterials", for which we calculate structural characteristics (e.g., surface area, pore size and void fraction) and also gas adsorption properties via molecular simulations. Here we chose to focus on void fraction and Xe adsorption at 1 bar, 5 bar, and 10 bar. The algorithm then identifies pseudomaterials with rare combinations of void fraction and Xe adsorption and mutates them to generate new pseudomaterials, thereby selectively adding data only to those parts of the structure-property map that are the least explored. Use of this method can help guide the design of new porous materials for gas storage and separations applications in the future.
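
    A stripped-down sketch of the sampling loop described above: random "pseudomaterials" are generated, the least-crowded point of the structure-property map is found, and that material is mutated to push sampling into unexplored regions. The property function is a stand-in for the molecular simulations used in the paper, and all parameters are hypothetical.

```python
import random

def random_material():
    return {"void_fraction": random.uniform(0.0, 1.0),
            "epsilon": random.uniform(1.0, 500.0)}      # hypothetical interaction strength

def properties(m):
    # Placeholder for a real adsorption simulation (e.g. GCMC uptake at 1 bar).
    return (m["void_fraction"], m["void_fraction"] * m["epsilon"] / 500.0)

def crowding(point, points, radius=0.05):
    """Count how many sampled points fall near this point on the property map."""
    return sum(1 for p in points
               if abs(p[0] - point[0]) < radius and abs(p[1] - point[1]) < radius)

def mutate(m, scale=0.05):
    child = dict(m)
    child["void_fraction"] = min(1.0, max(0.0, m["void_fraction"] + random.gauss(0, scale)))
    child["epsilon"] = min(500.0, max(1.0, m["epsilon"] + random.gauss(0, scale * 500)))
    return child

materials = [random_material() for _ in range(50)]
for _ in range(200):
    pts = [properties(m) for m in materials]
    rarest = min(materials, key=lambda m: crowding(properties(m), pts))
    materials.append(mutate(rarest))
print(len(materials), "pseudomaterials sampled")
```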

  15. Geometry definition and grid generation for a complete fighter aircraft

    NASA Technical Reports Server (NTRS)

    Edwards, T. A.

    1986-01-01

    Recent advances in computing power and numerical solution procedures have enabled computational fluid dynamicists to attempt increasingly difficult problems. In particular, efforts are focusing on computations of complex three-dimensional flow fields about realistic aerodynamic bodies. To perform such computations, a very accurate and detailed description of the surface geometry must be provided, and a three-dimensional grid must be generated in the space around the body. The geometry must be supplied in a format compatible with the grid generation requirements, and must be verified to be free of inconsistencies. This paper presents a procedure for performing the geometry definition of a fighter aircraft that makes use of a commercial computer-aided design/computer-aided manufacturing system. Furthermore, visual representations of the geometry are generated using a computer graphics system for verification of the body definition. Finally, the three-dimensional grids for fighter-like aircraft are generated by means of an efficient new parabolic grid generation method. This method exhibits good control of grid quality.

  16. Geometry definition and grid generation for a complete fighter aircraft

    NASA Technical Reports Server (NTRS)

    Edwards, Thomas A.

    1986-01-01

    Recent advances in computing power and numerical solution procedures have enabled computational fluid dynamicists to attempt increasingly difficult problems. In particular, efforts are focusing on computations of complex three-dimensional flow fields about realistic aerodynamic bodies. To perform such computations, a very accurate and detailed description of the surface geometry must be provided, and a three-dimensional grid must be generated in the space around the body. The geometry must be supplied in a format compatible with the grid generation requirements, and must be verified to be free of inconsistencies. A procedure for performing the geometry definition of a fighter aircraft that makes use of a commercial computer-aided design/computer-aided manufacturing system is presented. Furthermore, visual representations of the geometry are generated using a computer graphics system for verification of the body definition. Finally, the three-dimensional grids for fighter-like aircraft are generated by means of an efficient new parabolic grid generation method. This method exhibits good control of grid quality.

  17. Flexible Ionic-Electronic Hybrid Oxide Synaptic TFTs with Programmable Dynamic Plasticity for Brain-Inspired Neuromorphic Computing.

    PubMed

    John, Rohit Abraham; Ko, Jieun; Kulkarni, Mohit R; Tiwari, Naveen; Chien, Nguyen Anh; Ing, Ng Geok; Leong, Wei Lin; Mathews, Nripan

    2017-08-01

    Emulation of biological synapses is necessary for future brain-inspired neuromorphic computational systems that could look beyond the standard von Neumann architecture. Here, artificial synapses based on ionic-electronic hybrid oxide-based transistors on rigid and flexible substrates are demonstrated. The flexible transistors reported here depict a high field-effect mobility of ≈9 cm² V⁻¹ s⁻¹ with good mechanical performance. Comprehensive learning abilities/synaptic rules like paired-pulse facilitation, excitatory and inhibitory postsynaptic currents, spike-time-dependent plasticity, consolidation, superlinear amplification, and dynamic logic are successfully established, depicting concurrent processing and memory functionalities with spatiotemporal correlation. The results present a fully solution-processable approach to fabricate artificial synapses for next-generation transparent neural circuits. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  18. A Survey of Artificial Immune System Based Intrusion Detection

    PubMed Central

    Li, Tao; Hu, Xinlei; Wang, Feng; Zou, Yang

    2014-01-01

    In the area of computer security, Intrusion Detection (ID) is a mechanism that attempts to discover abnormal access to computers by analyzing various interactions. There is a lot of literature about ID, but this study only surveys the approaches based on Artificial Immune System (AIS). The use of AIS in ID is an appealing concept in current techniques. This paper summarizes AIS based ID methods from a new view point; moreover, a framework is proposed for the design of AIS based ID Systems (IDSs). This framework is analyzed and discussed based on three core aspects: antibody/antigen encoding, generation algorithm, and evolution mode. Then we collate the commonly used algorithms, their implementation characteristics, and the development of IDSs into this framework. Finally, some of the future challenges in this area are also highlighted. PMID:24790549
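
    To make the "generation algorithm" aspect of the framework concrete, the sketch below shows classical negative selection over fixed-length binary strings with an r-contiguous-bits matching rule; encodings, parameters and the self set are all illustrative.

```python
import random

def matches(detector, sample, r=4):
    """r-contiguous-bits rule: some aligned window of length r is identical."""
    return any(detector[i:i + r] == sample[i:i + r]
               for i in range(len(detector) - r + 1))

def generate_detectors(self_set, n=20, length=16, r=4):
    """Keep random detectors only if they do not match any normal sample."""
    detectors = []
    while len(detectors) < n:
        d = "".join(random.choice("01") for _ in range(length))
        if not any(matches(d, s, r) for s in self_set):
            detectors.append(d)
    return detectors

def is_anomalous(sample, detectors, r=4):
    return any(matches(d, sample, r) for d in detectors)

normal = ["0000111100001111", "0011001100110011"]
detectors = generate_detectors(normal)
print(is_anomalous("1111000011110000", detectors))
```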

  19. Multiprocessor architectural study

    NASA Technical Reports Server (NTRS)

    Kosmala, A. L.; Stanten, S. F.; Vandever, W. H.

    1972-01-01

    An architectural design study was made of a multiprocessor computing system intended to meet functional and performance specifications appropriate to a manned space station application. Intermetrics' previous experience and accumulated knowledge of the multiprocessor field are used to generate a baseline philosophy for the design of a future SUMC* multiprocessor. Interrupts are defined and the crucial questions of interrupt structure, such as processor selection and response time, are discussed. Memory hierarchy and performance are discussed extensively with particular attention to the design approach which utilizes a cache memory associated with each processor. The ability of an individual processor to approach its theoretical maximum performance is then analyzed in terms of a hit ratio. Memory management is envisioned as a virtual memory system implemented either through segmentation or paging. Addressing is discussed in terms of the various register designs adopted by current computers and those of advanced design.

  20. Microgravity

    NASA Image and Video Library

    1998-02-27

    NASA researcher Dr. Donald Frazier uses a blue laser shining through a quartz window into a special mix of chemicals to generate a polymer film on the inside quartz surface. As the chemicals respond to the laser light, they adhere to the glass surface, forming optical films. Dr. Frazier and Dr. Mark S. Paley developed the process in the Space Sciences Laboratory at NASA's Marshall Space Flight Center in Huntsville, AL. Working aboard the Space Shuttle, a science team led by Dr. Frazier formed thin films potentially useful in optical computers with fewer impurities than those formed on Earth. Patterns of these films can be traced onto the quartz surface. In the optical computers of the future, these films could replace electronic circuits and wires, making the systems more efficient and cost-effective, as well as lighter and more compact. Photo credit: NASA/Marshall Space Flight Center

  1. Microgravity

    NASA Image and Video Library

    1999-05-26

    NASA researcher Dr. Donald Frazier uses a blue laser shining through a quartz window into a special mix of chemicals to generate a polymer film on the inside quartz surface. As the chemicals respond to the laser light, they adhere to the glass surface, forming optical films. Dr. Frazier and Dr. Mark S. Paley developed the process in the Space Sciences Laboratory at NASA's Marshall Space Flight Center in Huntsville, AL. Working aboard the Space Shuttle, a science team led by Dr. Frazier formed thin films potentially useful in optical computers with fewer impurities than those formed on Earth. Patterns of these films can be traced onto the quartz surface. In the optical computers of the future, these films could replace electronic circuits and wires, making the systems more efficient and cost-effective, as well as lighter and more compact. Photo credit: NASA/Marshall Space Flight Center

  2. Knowledge-based computational intelligence development for predicting protein secondary structures from sequences.

    PubMed

    Shen, Hong-Bin; Yi, Dong-Liang; Yao, Li-Xiu; Yang, Jie; Chou, Kuo-Chen

    2008-10-01

    In the postgenomic age, with the avalanche of protein sequences generated and relatively slow progress in determining their structures by experiments, it is important to develop automated methods to predict the structure of a protein from its sequence. Membrane proteins are a special group in the protein family that accounts for approximately 30% of all proteins; however, solved membrane protein structures represent less than 1% of known protein structures to date. Although great success has been achieved in developing computational intelligence techniques to predict secondary structures in both globular and membrane proteins, there is still much challenging work in this regard. In this review article, we first summarize the recent progress of automated methodology development in predicting protein secondary structures, especially in membrane proteins; we then give some future directions in this research field.

  3. Empirical Relationships Between Optical Properties and Equivalent Diameters of Fractal Soot Aggregates at 550 nm Wavelength.

    NASA Technical Reports Server (NTRS)

    Pandey, Apoorva; Chakrabarty, Rajan K.; Liu, Li; Mishchenko, Michael I.

    2015-01-01

    Soot aggregates (SAs), fractal clusters of small, spherical carbonaceous monomers, modulate the incoming visible solar radiation and contribute significantly to climate forcing. Experimentalists and climate modelers typically assume a spherical morphology for SAs when computing their optical properties, causing significant errors. Here, we calculate the optical properties of freshly-generated (fractal dimension Df = 1.8) and aged (Df = 2.6) SAs at 550 nm wavelength using the numerically exact superposition T-matrix method. These properties were expressed as functions of equivalent aerosol diameters as measured by contemporary aerosol instruments. This work improves upon previous efforts wherein SA optical properties were computed as a function of monomer number, rendering them unusable in practical applications. Future research will address the sensitivity of the reported empirical relationships to variations in the refractive index, fractal prefactor, and monomer overlap of SAs.

  4. Low-Energy Truly Random Number Generation with Superparamagnetic Tunnel Junctions for Unconventional Computing

    NASA Astrophysics Data System (ADS)

    Vodenicarevic, D.; Locatelli, N.; Mizrahi, A.; Friedman, J. S.; Vincent, A. F.; Romera, M.; Fukushima, A.; Yakushiji, K.; Kubota, H.; Yuasa, S.; Tiwari, S.; Grollier, J.; Querlioz, D.

    2017-11-01

    Low-energy random number generation is critical for many emerging computing schemes proposed to complement or replace von Neumann architectures. However, current random number generators are always associated with an energy cost that is prohibitive for these computing schemes. We introduce random number bit generation based on specific nanodevices: superparamagnetic tunnel junctions. We experimentally demonstrate high-quality random bit generation that represents an orders-of-magnitude improvement in energy efficiency over current solutions. We show that the random generation speed improves with nanodevice scaling, and we investigate the impact of temperature, magnetic field, and cross talk. Finally, we show how alternative computing schemes can be implemented using superparamagnetic tunnel junctions as random number generators. These results open the way for fabricating efficient hardware computing devices leveraging stochasticity, and they highlight an alternative use for emerging nanodevices.
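
    As a side note on post-processing such physical bit streams (not claimed to be the authors' method), von Neumann debiasing is a standard correction that removes bias from a stream of independent but biased bits at the cost of throughput, as sketched below.

```python
# Von Neumann debiasing: map bit pairs 01 -> 0 and 10 -> 1, discard 00 and 11.
def von_neumann_debias(bits):
    out = []
    for b1, b2 in zip(bits[::2], bits[1::2]):   # non-overlapping pairs
        if b1 != b2:
            out.append(b1)
    return out

print(von_neumann_debias([1, 1, 0, 1, 1, 0, 0, 0, 0, 1]))   # [0, 1, 0]
```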

  5. A high-speed DAQ framework for future high-level trigger and event building clusters

    NASA Astrophysics Data System (ADS)

    Caselle, M.; Ardila Perez, L. E.; Balzer, M.; Dritschler, T.; Kopmann, A.; Mohr, H.; Rota, L.; Vogelgesang, M.; Weber, M.

    2017-03-01

    Modern data acquisition and trigger systems require a throughput of several GB/s and latencies of the order of microseconds. To satisfy such requirements, a heterogeneous readout system based on FPGA readout cards and GPU-based computing nodes coupled by InfiniBand has been developed. The incoming data from the back-end electronics is delivered directly into the internal memory of GPUs through a dedicated peer-to-peer PCIe communication. High-performance DMA engines have been developed for direct communication between FPGAs and GPUs using "DirectGMA (AMD)" and "GPUDirect (NVIDIA)" technologies. The proposed infrastructure is a candidate for future generations of event building clusters, high-level trigger filter farms and low-level trigger systems. In this paper the heterogeneous FPGA-GPU architecture will be presented and its performance will be discussed.

  6. [OMICS AND BIG DATA, MAJOR ADVANCES TOWARDS PERSONALIZED MEDICINE OF THE FUTURE?].

    PubMed

    Scheen, A J

    2015-01-01

    The increasing interest in personalized medicine has evolved together with two major technological advances. First, new-generation DNA sequencing methods, which are rapid and less expensive, combined with remarkable progress in molecular biology, have led to the post-genomic era (transcriptomics, proteomics, metabolomics). Second, the refinement of computing tools (IT) allows the immediate analysis of a huge amount of data (especially those resulting from the omics approaches) and thus creates a new universe for medical research, that of analysis by computerized modelling. This article for scientific communication and popularization briefly describes the main advances in these two fields of interest. These technological advances are combined with those occurring in communication, which makes possible the development of artificial intelligence. These major advances will most probably represent the foundation of the personalized medicine of the future.

  7. Two schemes for rapid generation of digital video holograms using PC cluster

    NASA Astrophysics Data System (ADS)

    Park, Hanhoon; Song, Joongseok; Kim, Changseob; Park, Jong-Il

    2017-12-01

    Computer-generated holography (CGH), which is a process of generating digital holograms, is computationally expensive. Recently, several methods/systems for parallelizing the process using graphics processing units (GPUs) have been proposed. Indeed, use of multiple GPUs or a personal computer (PC) cluster (each PC with GPUs) enabled great improvements in the process speed. However, extant literature has less often explored systems involving rapid generation of multiple digital holograms and specialized systems for rapid generation of a digital video hologram. This study proposes a system that uses a PC cluster and is able to more efficiently generate a video hologram. The proposed system is designed to simultaneously generate multiple frames and accelerate the generation by parallelizing the CGH computations across a number of frames, as opposed to separately generating each individual frame while parallelizing the CGH computations within each frame. The proposed system also enables the subprocesses for generating each frame to execute in parallel through multithreading. With these two schemes, the proposed system significantly reduced the data communication time for generating a digital hologram when compared with that of the state-of-the-art system.
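
    The granularity contrast described above can be sketched as follows: rather than splitting one frame's hologram plane across workers, whole frames are distributed so several frames are generated concurrently. The per-frame routine here is a toy stand-in for a real CGH kernel, and all parameters are illustrative.

```python
import math
from concurrent.futures import ProcessPoolExecutor

def generate_frame_hologram(frame_points, width=64, height=64, k=2.0):
    """Toy per-frame CGH stand-in: accumulate a cosine fringe from each point."""
    plane = [[0.0] * width for _ in range(height)]
    for (x, y, z) in frame_points:
        for v in range(height):
            for u in range(width):
                r = math.sqrt((u - x) ** 2 + (v - y) ** 2 + z ** 2)
                plane[v][u] += math.cos(k * r)
    return plane

def generate_video_hologram(frames, workers=4):
    """Frame-level parallelism: each worker computes whole frames."""
    with ProcessPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(generate_frame_hologram, frames))

if __name__ == "__main__":
    video = [[(32.0, 32.0, 40.0 + t)] for t in range(8)]   # one moving point per frame
    print(len(generate_video_hologram(video)), "frames generated")
```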

  8. Intergenerational equity and conservation

    NASA Technical Reports Server (NTRS)

    Otoole, R. P.; Walton, A. L.

    1980-01-01

    The issue of intergenerational equity in the use of natural resources is discussed in the context of coal mining conversion. An attempt is made to determine whether there is a clear-cut benefit to future generations in setting minimum coal extraction efficiency standards in mining. It is demonstrated that preserving fossil fuels beyond the economically efficient level is not necessarily beneficial to future generations even in terms of their own preferences. Setting fossil fuel conservation targets for intermediate products (i.e., energy) may increase the quantities of fossil fuels available to future generations and hence lower the costs, but there may be serious disadvantages to future generations as well. The use of relatively inexpensive fossil fuels in this generation may result in more infrastructure development and more knowledge production available to future generations. The value of fossil fuels versus these other endowments in the future depends on many factors which cannot possibly be evaluated at present. Since it cannot be determined whether future generations are being helped or harmed, it is recommended that intergenerational equity not be used as a factor in setting coal mine extraction efficiency standards, or in establishing requirements.

  9. Speed and path control for conflict-free flight in high air traffic demand in terminal airspace

    NASA Astrophysics Data System (ADS)

    Rezaei, Ali

    To accommodate the growing air traffic demand, flights will need to be planned and navigated with a much higher level of precision than today's aircraft flight path. The Next Generation Air Transportation System (NextGen) stands to benefit significantly in safety and efficiency from such movement of aircraft along precisely defined paths. Air Traffic Operations (ATO) relying on such precision--the Precision Air Traffic Operations or PATO--are the foundation of the high throughput capacity envisioned for future airports. In PATO, the preferred method is to manage the air traffic by assigning a speed profile to each aircraft in a given fleet in a given airspace (in practice known as speed control). In this research, an algorithm has been developed, set in the context of a Hybrid Control System (HCS) model, that determines whether a speed control solution exists for a given fleet of aircraft in a given airspace and, if so, computes this solution as a collective speed profile that assures separation if executed without deviation. Uncertainties such as weather are not considered but the algorithm can be modified to include uncertainties. The algorithm first computes all feasible sequences (i.e., all sequences that allow the given fleet of aircraft to reach destinations without violating the FAA's separation requirement) by looking at all pairs of aircraft. Then, the most likely sequence is determined and the speed control solution is constructed by a backward trajectory generation, starting with the aircraft last out and proceeding to the first out. This computation can be done for different sequences in parallel, which helps to reduce the computation time. If such a solution does not exist, then the algorithm calculates a minimal path modification (known as path control) that will allow separation-compliant speed control. We will also prove that the algorithm will modify the path without creating a new separation violation. The new path will be generated by adding new waypoints in the airspace. As a byproduct, instead of minimal path modification, one can use the aircraft arrival time schedule to generate the sequence in which the aircraft reach their destinations.
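
    A toy version of the sequencing step is sketched below: arrival orders for a small fleet are enumerated and kept only if the implied arrival times respect a minimum time separation at the merge point. The separation value and time windows are illustrative, not the dissertation's model.

```python
from itertools import permutations

MIN_SEPARATION_S = 90.0   # hypothetical required time separation at the fix

def feasible_sequences(earliest, latest):
    """earliest/latest map aircraft id -> achievable arrival-time window (s)."""
    results = []
    for order in permutations(earliest):
        t, ok = float("-inf"), True
        for a in order:
            t = max(earliest[a], t + MIN_SEPARATION_S)
            if t > latest[a]:
                ok = False
                break
        if ok:
            results.append(order)
    return results

print(feasible_sequences({"A": 0, "B": 30, "C": 60},
                         {"A": 200, "B": 260, "C": 400}))
```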

  10. Now and next-generation sequencing techniques: future of sequence analysis using cloud computing.

    PubMed

    Thakur, Radhe Shyam; Bandopadhyay, Rajib; Chaudhary, Bratati; Chatterjee, Sourav

    2012-01-01

    Advances in the field of sequencing techniques have resulted in the greatly accelerated production of huge sequence datasets. This presents immediate challenges in database maintenance at datacenters. It provides additional computational challenges in data mining and sequence analysis. Together these represent a significant overburden on traditional stand-alone computer resources, and to reach effective conclusions quickly and efficiently, the virtualization of the resources and computation on a pay-as-you-go concept (together termed "cloud computing") has recently appeared. The collective resources of the datacenter, including both hardware and software, can be available publicly, being then termed a public cloud, the resources being provided in a virtual mode to the clients who pay according to the resources they employ. Examples of public companies providing these resources include Amazon, Google, and Joyent. The computational workload is shifted to the provider, which also implements required hardware and software upgrades over time. A virtual environment is created in the cloud corresponding to the computational and data storage needs of the user via the internet. The task is then performed, the results transmitted to the user, and the environment finally deleted after all tasks are completed. In this discussion, we focus on the basics of cloud computing, and go on to analyze the prerequisites and overall working of clouds. Finally, the applications of cloud computing in biological systems, particularly in comparative genomics, genome informatics, and SNP detection are discussed with reference to traditional workflows.

  11. Skel: Generative Software for Producing Skeletal I/O Applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Logan, J.; Klasky, S.; Lofstead, J.

    2011-01-01

    Massively parallel computations consist of a mixture of computation, communication, and I/O. As part of the co-design for the inevitable progress towards exascale computing, we must apply lessons learned from past work to succeed in this new age of computing. Of the three components listed above, implementing an effective parallel I/O solution has often been overlooked by application scientists and was usually added to large scale simulations only when existing serial techniques had failed. As scientist teams scaled their codes to run on hundreds of processors, it was common to call on an I/O expert to implement a set of more scalable I/O routines. These routines were easily separated from the calculations and communication, and in many cases, an I/O kernel was derived from the application which could be used for testing I/O performance independent of the application. These I/O kernels developed a life of their own, used as a broad measure for comparing different I/O techniques. Unfortunately, as years passed and computation and communication changes required changes to the I/O, the separate I/O kernel used for benchmarking remained static, no longer providing an accurate indicator of the I/O performance of the simulation and making I/O research less relevant for the application scientists. In this paper we describe a new approach to this problem where I/O kernels are replaced with skeletal I/O applications automatically generated from an abstract set of simulation I/O parameters. We realize this abstraction by leveraging the ADIOS middleware's XML I/O specification with additional runtime parameters. Skeletal applications offer all of the benefits of I/O kernels including allowing I/O optimizations to focus on useful I/O patterns. Moreover, since they are automatically generated, it is easy to produce an updated I/O skeleton whenever the simulation's I/O changes. In this paper we analyze the performance of automatically generated I/O skeletal applications for the S3D and GTS codes. We show that these skeletal applications achieve performance comparable to that of the production applications. We wrap up the paper with a discussion of future changes to make the skeletal application better approximate the actual I/O performed in the simulation.
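
    The generative idea can be illustrated in a few lines: from an abstract description of the simulation's output variables and write cadence, a stand-alone program that reproduces only the I/O pattern is emitted. The dictionary format and the emitted template below are hypothetical and much simpler than the ADIOS XML specification used by Skel.

```python
IO_SPEC = {"steps": 10, "variables": {"temperature": 1_000_000, "pressure": 1_000_000}}

TEMPLATE = """\
# auto-generated skeletal I/O application
for step in range({steps}):
{writes}
"""

def generate_skeleton(spec):
    """Emit source code that writes zero-filled files matching the I/O pattern."""
    writes = "\n".join(
        f'    open("{name}_{{}}.bin".format(step), "wb").write(bytes(8 * {n}))'
        for name, n in spec["variables"].items())
    return TEMPLATE.format(steps=spec["steps"], writes=writes)

print(generate_skeleton(IO_SPEC))
```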

  12. Health Management Applications for International Space Station

    NASA Technical Reports Server (NTRS)

    Alena, Richard; Duncavage, Dan

    2005-01-01

    Traditional mission and vehicle management involves teams of highly trained specialists monitoring vehicle status and crew activities, responding rapidly to any anomalies encountered during operations. These teams work from the Mission Control Center and have access to engineering support teams with specialized expertise in International Space Station (ISS) subsystems. Integrated System Health Management (ISHM) applications can significantly augment these capabilities by providing enhanced monitoring, prognostic and diagnostic tools for critical decision support and mission management. The Intelligent Systems Division of NASA Ames Research Center is developing many prototype applications using model-based reasoning, data mining and simulation, working with Mission Control through the ISHM Testbed and Prototypes Project. This paper will briefly describe information technology that supports current mission management practice, and will extend this to a vision for future mission control workflow incorporating new ISHM applications. It will describe ISHM applications currently under development at NASA and will define technical approaches for implementing our vision of future human exploration mission management incorporating artificial intelligence and distributed web service architectures using specific examples. Several prototypes are under development, each highlighting a different computational approach. The ISStrider application allows in-depth analysis of Caution and Warning (C&W) events by correlating real-time telemetry with the logical fault trees used to define off-nominal events. The application uses live telemetry data and the Livingstone diagnostic inference engine to display the specific parameters and fault trees that generated the C&W event, allowing a flight controller to identify the root cause of the event from thousands of possibilities by simply navigating animated fault tree models on their workstation. SimStation models the functional power flow for the ISS Electrical Power System and can predict power balance for nominal and off-nominal conditions. SimStation uses real-time telemetry data to keep detailed computational physics models synchronized with actual ISS power system state. In the event of failure, the application can then rapidly diagnose root cause, predict future resource levels and even correlate technical documents relevant to the specific failure. These advanced computational models will allow better insight and more precise control of ISS subsystems, increasing safety margins by speeding up anomaly resolution and reducing engineering team effort and cost. This technology will make operating ISS more efficient and is directly applicable to next-generation exploration missions and Crew Exploration Vehicles.

  13. 76 FR 79609 - Federal Acquisition Regulation; Clarification of Standards for Computer Generation of Forms

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-12-22

    ... Regulation; Clarification of Standards for Computer Generation of Forms AGENCY: Department of Defense (DoD... American National Standards Institute X12, as the valid standard to use for computer-generated forms. FAR... optional forms on their computers. In addition to clarifying that FIPS 161 is no longer in use, public...

  14. COMPUTER SUPPORT SYSTEMS FOR ESTIMATING CHEMICAL TOXICITY: PRESENT CAPABILITIES AND FUTURE TRENDS

    EPA Science Inventory

    Computer Support Systems for Estimating Chemical Toxicity: Present Capabilities and Future Trends

    A wide variety of computer-based artificial intelligence (AI) and decision support systems exist currently to aid in the assessment of toxicity for environmental chemicals. T...

  15. Modeling a space-based quantum link that includes an adaptive optics system

    NASA Astrophysics Data System (ADS)

    Duchane, Alexander W.; Hodson, Douglas D.; Mailloux, Logan O.

    2017-10-01

    Quantum Key Distribution uses optical pulses to generate shared random bit strings between two locations. If a high percentage of the optical pulses are comprised of single photons, then the statistical nature of light and information theory can be used to generate secure shared random bit strings which can then be converted to keys for encryption systems. When these keys are incorporated along with symmetric encryption techniques such as a one-time pad, then this method of key generation and encryption is resistant to future advances in quantum computing which will significantly degrade the effectiveness of current asymmetric key sharing techniques. This research first reviews the transition of Quantum Key Distribution free-space experiments from the laboratory environment to field experiments, and finally, ongoing space experiments. Next, a propagation model for an optical pulse from low-earth orbit to ground and the effects of turbulence on the transmitted optical pulse is described. An Adaptive Optics system is modeled to correct for the aberrations caused by the atmosphere. The long-term point spread function of the completed low-earth orbit to ground optical system is explored in the results section. Finally, the impact of this optical system and its point spread function on an overall quantum key distribution system as well as the future work necessary to show this impact is described.
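
    As background to why an adaptive optics system is modeled at the receiving terminal, the standard angular-resolution scalings (generic relations, not results from this paper) can be written in LaTeX as

        \theta_{\mathrm{diff}} \approx 1.22\,\frac{\lambda}{D},
        \qquad
        \theta_{\mathrm{seeing}} \approx \frac{\lambda}{r_0},

    so for a receiving aperture with D much larger than the atmospheric coherence length r_0, the uncorrected long-term point spread function is broadened by roughly D/r_0 relative to the diffraction limit; the modeled adaptive optics system aims to recover a core close to the diffraction-limited width and thereby improve single-photon collection for the quantum link.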

  16. Crops In Silico: Generating Virtual Crops Using an Integrative and Multi-scale Modeling Platform.

    PubMed

    Marshall-Colon, Amy; Long, Stephen P; Allen, Douglas K; Allen, Gabrielle; Beard, Daniel A; Benes, Bedrich; von Caemmerer, Susanne; Christensen, A J; Cox, Donna J; Hart, John C; Hirst, Peter M; Kannan, Kavya; Katz, Daniel S; Lynch, Jonathan P; Millar, Andrew J; Panneerselvam, Balaji; Price, Nathan D; Prusinkiewicz, Przemyslaw; Raila, David; Shekar, Rachel G; Shrivastava, Stuti; Shukla, Diwakar; Srinivasan, Venkatraman; Stitt, Mark; Turk, Matthew J; Voit, Eberhard O; Wang, Yu; Yin, Xinyou; Zhu, Xin-Guang

    2017-01-01

    Multi-scale models can facilitate whole plant simulations by linking gene networks, protein synthesis, metabolic pathways, physiology, and growth. Whole plant models can be further integrated with ecosystem, weather, and climate models to predict how various interactions respond to environmental perturbations. These models have the potential to fill in missing mechanistic details and generate new hypotheses to prioritize directed engineering efforts. Outcomes will potentially accelerate improvement of crop yield and sustainability, and increase future food security. It is time for a paradigm shift in plant modeling, from largely isolated efforts to a connected community that takes advantage of advances in high performance computing and mechanistic understanding of plant processes. Tools for guiding future crop breeding and engineering, understanding the implications of discoveries at the molecular level for whole plant behavior, and improved prediction of plant and ecosystem responses to the environment are urgently needed. The purpose of this perspective is to introduce Crops in silico (cropsinsilico.org), an integrative and multi-scale modeling platform, as one solution that combines isolated modeling efforts toward the generation of virtual crops, which is open and accessible to the entire plant biology community. The major challenges involved in both the development and deployment of a shared, multi-scale modeling platform, which are summarized in this prospectus, were recently identified during the first Crops in silico Symposium and Workshop.

  17. Crops In Silico: Generating Virtual Crops Using an Integrative and Multi-scale Modeling Platform

    PubMed Central

    Marshall-Colon, Amy; Long, Stephen P.; Allen, Douglas K.; Allen, Gabrielle; Beard, Daniel A.; Benes, Bedrich; von Caemmerer, Susanne; Christensen, A. J.; Cox, Donna J.; Hart, John C.; Hirst, Peter M.; Kannan, Kavya; Katz, Daniel S.; Lynch, Jonathan P.; Millar, Andrew J.; Panneerselvam, Balaji; Price, Nathan D.; Prusinkiewicz, Przemyslaw; Raila, David; Shekar, Rachel G.; Shrivastava, Stuti; Shukla, Diwakar; Srinivasan, Venkatraman; Stitt, Mark; Turk, Matthew J.; Voit, Eberhard O.; Wang, Yu; Yin, Xinyou; Zhu, Xin-Guang

    2017-01-01

    Multi-scale models can facilitate whole plant simulations by linking gene networks, protein synthesis, metabolic pathways, physiology, and growth. Whole plant models can be further integrated with ecosystem, weather, and climate models to predict how various interactions respond to environmental perturbations. These models have the potential to fill in missing mechanistic details and generate new hypotheses to prioritize directed engineering efforts. Outcomes will potentially accelerate improvement of crop yield and sustainability, and increase future food security. It is time for a paradigm shift in plant modeling, from largely isolated efforts to a connected community that takes advantage of advances in high performance computing and mechanistic understanding of plant processes. Tools for guiding future crop breeding and engineering, understanding the implications of discoveries at the molecular level for whole plant behavior, and improved prediction of plant and ecosystem responses to the environment are urgently needed. The purpose of this perspective is to introduce Crops in silico (cropsinsilico.org), an integrative and multi-scale modeling platform, as one solution that combines isolated modeling efforts toward the generation of virtual crops, which is open and accessible to the entire plant biology community. The major challenges involved in both the development and deployment of a shared, multi-scale modeling platform, which are summarized in this prospectus, were recently identified during the first Crops in silico Symposium and Workshop. PMID:28555150

  18. Designing Next Generation Massively Multithreaded Architectures for Irregular Applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tumeo, Antonino; Secchi, Simone; Villa, Oreste

    Irregular applications, such as data mining or graph-based computations, show unpredictable memory/network access patterns and control structures. Massively multi-threaded architectures with large node count, like the Cray XMT, have been shown to address their requirements better than commodity clusters. In this paper we present the approaches that we are currently pursuing to design future generations of these architectures. First, we introduce the Cray XMT and compare it to other multithreaded architectures. We then propose an evolution of the architecture, integrating multiple cores per node and next generation network interconnect. We advocate the use of hardware support for remote memory reference aggregation to optimize network utilization. For this evaluation we developed a highly parallel, custom simulation infrastructure for multi-threaded systems. Our simulator executes unmodified XMT binaries with very large datasets, capturing effects due to contention and hot-spotting, while predicting execution times with greater than 90% accuracy. We also discuss the FPGA prototyping approach that we are employing to study efficient support for irregular applications in next generation manycore processors.

  19. Compilation of Abstracts for SC12 Conference Proceedings

    NASA Technical Reports Server (NTRS)

    Morello, Gina Francine (Compiler)

    2012-01-01

    1 A Breakthrough in Rotorcraft Prediction Accuracy Using Detached Eddy Simulation; 2 Adjoint-Based Design for Complex Aerospace Configurations; 3 Simulating Hypersonic Turbulent Combustion for Future Aircraft; 4 From a Roar to a Whisper: Making Modern Aircraft Quieter; 5 Modeling of Extended Formation Flight on High-Performance Computers; 6 Supersonic Retropropulsion for Mars Entry; 7 Validating Water Spray Simulation Models for the SLS Launch Environment; 8 Simulating Moving Valves for Space Launch System Liquid Engines; 9 Innovative Simulations for Modeling the SLS Solid Rocket Booster Ignition; 10 Solid Rocket Booster Ignition Overpressure Simulations for the Space Launch System; 11 CFD Simulations to Support the Next Generation of Launch Pads; 12 Modeling and Simulation Support for NASA's Next-Generation Space Launch System; 13 Simulating Planetary Entry Environments for Space Exploration Vehicles; 14 NASA Center for Climate Simulation Highlights; 15 Ultrascale Climate Data Visualization and Analysis; 16 NASA Climate Simulations and Observations for the IPCC and Beyond; 17 Next-Generation Climate Data Services: MERRA Analytics; 18 Recent Advances in High-Resolution Global Atmospheric Modeling; 19 Causes and Consequences of Turbulence in the Earth's Protective Shield; 20 NASA Earth Exchange (NEX): A Collaborative Supercomputing Platform; 21 Powering Deep Space Missions: Thermoelectric Properties of Complex Materials; 22 Meeting NASA's High-End Computing Goals Through Innovation; 23 Continuous Enhancements to the Pleiades Supercomputer for Maximum Uptime; 24 Live Demonstrations of 100-Gbps File Transfers Across LANs and WANs; 25 Untangling the Computing Landscape for Climate Simulations; 26 Simulating Galaxies and the Universe; 27 The Mysterious Origin of Stellar Masses; 28 Hot-Plasma Geysers on the Sun; 29 Turbulent Life of Kepler Stars; 30 Modeling Weather on the Sun; 31 Weather on Mars: The Meteorology of Gale Crater; 32 Enhancing Performance of NASA's High-End Computing Applications; 33 Designing Curiosity's Perfect Landing on Mars; 34 The Search Continues: Kepler's Quest for Habitable Earth-Sized Planets.

  20. The Argonne Leadership Computing Facility 2010 annual report.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Drugan, C.

    Researchers found more ways than ever to conduct transformative science at the Argonne Leadership Computing Facility (ALCF) in 2010. Both familiar initiatives and innovative new programs at the ALCF are now serving a growing, global user community with a wide range of computing needs. The Department of Energy's (DOE) INCITE Program remained vital in providing scientists with major allocations of leadership-class computing resources at the ALCF. For calendar year 2011, 35 projects were awarded 732 million supercomputer processor-hours for computationally intensive, large-scale research projects with the potential to significantly advance key areas in science and engineering. Argonne also continued to provide Director's Discretionary allocations - 'start up' awards - for potential future INCITE projects. And DOE's new ASCR Leadership Computing (ALCC) Program allocated resources to 10 ALCF projects, with an emphasis on high-risk, high-payoff simulations directly related to the Department's energy mission, national emergencies, or for broadening the research community capable of using leadership computing resources. While delivering more science today, we've also been laying a solid foundation for high performance computing in the future. After a successful DOE Lehman review, a contract was signed to deliver Mira, the next-generation Blue Gene/Q system, to the ALCF in 2012. The ALCF is working with the 16 projects that were selected for the Early Science Program (ESP) to enable them to be productive as soon as Mira is operational. Preproduction access to Mira will enable ESP projects to adapt their codes to its architecture and collaborate with ALCF staff in shaking down the new system. We expect the 10-petaflops system to stoke economic growth and improve U.S. competitiveness in key areas such as advancing clean energy and addressing global climate change. Ultimately, we envision Mira as a stepping-stone to exascale-class computers that will be faster than petascale-class computers by a factor of a thousand. Pete Beckman, who served as the ALCF's Director for the past few years, has been named director of the newly created Exascale Technology and Computing Institute (ETCi). The institute will focus on developing exascale computing to extend scientific discovery and solve critical science and engineering problems. Just as Pete's leadership propelled the ALCF to great success, we know that ETCi will benefit immensely from his expertise and experience. Without question, the future of supercomputing is certainly in good hands. I would like to thank Pete for all his effort over the past two years, during which he oversaw the establishment of ALCF2, the deployment of the Magellan project, and increases in utilization, availability, and number of projects using ALCF1. He managed the rapid growth of ALCF staff and made the facility what it is today. All the staff and users are better for Pete's efforts.

  1. Hard Copy Market Overview

    NASA Astrophysics Data System (ADS)

    Testan, Peter R.

    1987-04-01

    A number of Color Hard Copy (CHC) market drivers are currently indicating strong growth in the use of CHC technologies for the business graphics marketplace. These market drivers relate to product, software, color monitors and color copiers. The use of color in business graphics allows more information to be relayed than is normally the case in a monochrome format. The communicative powers of full-color computer generated output in the business graphics application area will continue to induce end users to desire and require color in their future applications. A number of color hard copy technologies will be utilized in the presentation graphics arena. Thermal transfer, ink jet, photographic and electrophotographic technologies are all expected to be utilized in the business graphics presentation application area in the future. Since the end of 1984, the availability of color application software packages has grown significantly. Sales revenue generated by business graphics software is expected to grow at a compound annual growth rate of just over 40 percent to 1990. Increased availability of packages to allow the integration of text and graphics is expected. Currently, the latest versions of page description languages such as Postscript, Interpress and DDL all support color output. The use of color monitors will also drive the demand for color hard copy in the business graphics marketplace. The availability of higher resolution screens is allowing color monitors to be easily used for both text and graphics applications in the office environment. During 1987, the sales of color monitors are expected to surpass the sales of monochrome monitors. Another major color hard copy market driver will be the color copier. In order to take advantage of the communications power of computer generated color output, multiple copies are required for distribution. Product introductions of a new generation of color copiers are now underway, with additional introductions expected during 1987. The color hard copy market continues to be in a state of constant change, typical of any immature market. However, much of the change is positive. During 1985, the color hard copy market generated $1.2 billion. By 1990, total market revenue is expected to exceed $5.5 billion. The business graphics CHC application area is expected to grow at a compound annual growth rate greater than 40 percent to 1990.

  2. Programming Equity into Computer Education: Today's Guide to the Schools of the Future. A PEER Computer Equity Action Kit.

    ERIC Educational Resources Information Center

    Martin-McCormick, Lynda; And Others

    An advocacy packet on educational equity in computer education consists of five separate materials. A booklet entitled "Today's Guide to the Schools of the Future" contains four sections. The first section, a computer equity assessment guide, includes interview questions about school policies and allocation of resources, student and teacher…

  3. A Qualitative Look at Preservice Teacher's Perceptions of the Future of Computers in Education.

    ERIC Educational Resources Information Center

    Schnackenberg, Heidi L.; Savenye, Wilhelmina C.

    A qualitative study was conducted to determine the perceptions of preservice teachers on how computers will be used in schools in the future. Undergraduate students (n=40) were given a 60-minute multimedia presentation on how computer and multimedia technologies are used in schools, followed by group discussions on the ways in which computers will…

  4. A new generation in computing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kahn, R.E.

    1983-11-01

    The fifth generation of computers is described. The three disciplines involved in bringing such a new generation to reality are microelectronics; artificial intelligence; and computer systems and architecture. Applications in industry, offices, aerospace, education, health care and retailing are outlined. An analysis is given of research efforts in the US, Japan, U.K., and Europe. Fifth generation programming languages are detailed.

  5. Perspectives on the Future of CFD

    NASA Technical Reports Server (NTRS)

    Kwak, Dochan

    2000-01-01

    This viewgraph presentation gives an overview of the future of computational fluid dynamics (CFD), which in the past has pioneered the field of flow simulation. Over time, CFD has progressed along with computing power: numerical methods have advanced as CPU and memory capacity have increased. Complex configurations are routinely computed now, and direct numerical simulations (DNS) and large eddy simulations (LES) are used to study turbulence. As computing resources have shifted to parallel and distributed platforms, computer science aspects such as scalability (algorithmic and implementation), portability, and transparent coding have advanced. Examples of potential future (or current) challenges include risk assessment, limitations of heuristic models, and the development of CFD and information technology (IT) tools.

  6. An insight into cyanobacterial genomics--a perspective.

    PubMed

    Lakshmi, Palaniswamy Thanga Velan

    2007-05-20

    At the turn of the millennium, cyanobacteria deserve to be reviewed in order to understand their past, present and future. The advent of post-genomic research, which encompasses functional genomics, structural genomics, transcriptomics, pharmacogenomics, proteomics and metabolomics, allows a systematic, wide-ranging approach to biological system studies. Thus, by exploiting genomic and associated protein information through computational analyses, the fledgling information generated by biotechnological analyses could be extrapolated to fill in the lacuna of scarce information on cyanobacteria. As an effort in this direction, this paper attempts to highlight the available perspectives and to encourage researchers to concentrate on the field of cyanobacterial informatics.

  7. Creating technical heritage object replicas in a virtual environment

    NASA Astrophysics Data System (ADS)

    Egorova, Olga; Shcherbinin, Dmitry

    2016-03-01

    The paper presents innovative informatics methods for creating virtual technical heritage replicas, which are of significant scientific and practical importance not only to researchers but to the public in general. By performing 3D modeling and animation of aircraft, spaceships, architectural-engineering buildings, and other technical objects, learning is fostered while the replicas are preserved for future generations. Modern approaches based on the wide usage of computer technologies attract a greater number of young people to explore the history of science and technology and renew their interest in the field of mechanical engineering.

  8. COLAcode: COmoving Lagrangian Acceleration code

    NASA Astrophysics Data System (ADS)

    Tassev, Svetlin V.

    2016-02-01

    COLAcode is a serial particle mesh-based N-body code illustrating the COLA (COmoving Lagrangian Acceleration) method; it solves for Large Scale Structure (LSS) in a frame that is comoving with observers following trajectories calculated in Lagrangian Perturbation Theory (LPT). It differs from standard N-body codes by trading accuracy at small scales to gain computational speed without sacrificing accuracy at large scales. This is useful for generating large ensembles of accurate mock halo catalogs required to study galaxy clustering and weak lensing; such catalogs are needed to perform detailed error analysis for ongoing and future surveys of LSS.
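
    The essential idea of the COLA split can be written schematically in LaTeX (simplified notation, not the code's exact time variables):

        \mathbf{x}(t) = \mathbf{x}_{\mathrm{LPT}}(t) + \mathbf{x}_{\mathrm{res}}(t),
        \qquad
        \frac{d^{2}\mathbf{x}_{\mathrm{res}}}{dt^{2}}
          = -\nabla\Phi(\mathbf{x}) - \frac{d^{2}\mathbf{x}_{\mathrm{LPT}}}{dt^{2}},

    so the particle-mesh force solver evolves only the residual displacement about the analytically known LPT trajectory; even with very few time steps the large scales are reproduced essentially exactly by LPT, while small-scale accuracy is traded for speed, which is the compromise described above.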

  9. 3D detectors with high space and time resolution

    NASA Astrophysics Data System (ADS)

    Loi, A.

    2018-01-01

    For future high luminosity LHC experiments it will be important to develop new detector systems with increased space and time resolution and also better radiation hardness in order to operate in a high luminosity environment. A possible technology which could provide such performance is the 3D silicon detector. This work explores possible pixel geometries by designing and simulating different solutions, using Sentaurus Technology Computer Aided Design (TCAD) as the design and simulation tool, and analysing their performance. A key factor during the selection was the generated electric field and the carrier velocity inside the active area of the pixel.

  10. ATLAS computing on CSCS HPC

    NASA Astrophysics Data System (ADS)

    Filipcic, A.; Haug, S.; Hostettler, M.; Walker, R.; Weber, M.

    2015-12-01

    The Piz Daint Cray XC30 HPC system at CSCS, the Swiss National Supercomputing centre, was the highest ranked European system on TOP500 in 2014, also featuring GPU accelerators. Event generation and detector simulation for the ATLAS experiment have been enabled for this machine. We report on the technical solutions, performance, HPC policy challenges and possible future opportunities for HEP on extreme HPC systems. In particular a custom made integration to the ATLAS job submission system has been developed via the Advanced Resource Connector (ARC) middleware. Furthermore, a partial GPU acceleration of the Geant4 detector simulations has been implemented.

  11. Documentation of Two- and Three-Dimensional Hypersonic Shock Wave/Turbulent Boundary Layer Interaction Flows

    NASA Technical Reports Server (NTRS)

    Kussoy, Marvin I.; Horstman, Clifford C.

    1989-01-01

    Experimental data for a series of two- and three-dimensional shock wave/turbulent boundary layer interaction flows at Mach 7 are presented. Test bodies, composed of simple geometric shapes, were designed to generate flows with varying degrees of pressure gradient, boundary-layer separation, and turning angle. The data include surface-pressure and heat-transfer distributions as well as limited mean-flow-field surveys in both the undisturbed and the interaction regimes. The data are presented in a convenient form for use in validating existing or future computational models of these generic hypersonic flows.

  12. Strategic Adaptation of SCA for STRS

    NASA Technical Reports Server (NTRS)

    Quinn, Todd; Kacpura, Thomas

    2007-01-01

    The Space Telecommunication Radio System (STRS) architecture is being developed to provide a standard framework for future NASA space radios with greater degrees of interoperability and flexibility to meet new mission requirements. The space environment imposes unique operational requirements with restrictive size, weight, and power constraints that are significantly smaller than terrestrial-based military communication systems. With the harsh radiation environment of space, the computing and processing resources are typically one or two generations behind current terrestrial technologies. Despite these differences, there are elements of the SCA that can be adapted to facilitate the design and implementation of the STRS architecture.

  13. Treatment and management of child pornography use.

    PubMed

    Seto, Michael C; Ahmed, A G

    2014-06-01

    Advances in Internet and other digital technologies have created ready and affordable access to pornography involving real children or computer-generated images of children. To better understand and manage child pornography users, clinicians must acquaint themselves with the characteristics and behaviors of these offenders. This article distinguishes motivations to use child pornography and different types of child pornography offenders and provides a brief overview of the assessment, diagnosis, and management options available. The authors conclude with recommendations on future directions in the assessment, diagnosis, and management of child pornography offenders. Copyright © 2014 Elsevier Inc. All rights reserved.

  14. The next generation of similarity measures that fully explore the semantics in biomedical ontologies.

    PubMed

    Couto, Francisco M; Pinto, H Sofia

    2013-10-01

    There is a prominent trend to augment and improve the formality of biomedical ontologies. For example, this is shown by the current effort on adding description logic axioms, such as disjointness. One of the key ontology applications that can take advantage of this effort is the conceptual (functional) similarity measurement. The presence of description logic axioms in biomedical ontologies makes the current structural or extensional approaches weaker and further away from providing sound semantics-based similarity measures. Although beneficial in small ontologies, the exploration of description logic axioms by semantics-based similarity measures is computationally expensive. This limitation is critical for biomedical ontologies, which normally contain thousands of concepts. Thus, to gain their rightful place, biomedical functional similarity measures have to find how this rich and powerful knowledge can be fully explored while keeping computational costs feasible. This manuscript aims at promoting and guiding the development of compelling tools that deliver what the biomedical community will require in the near future: a next generation of biomedical similarity measures that efficiently and fully explore the semantics present in biomedical ontologies.
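
    One concrete member of the family of measures discussed here is Resnik's information-content similarity, sketched below in Python over a toy is-a DAG; the terms, edges, and annotation probabilities are invented for illustration, and the sketch deliberately ignores the description-logic axioms whose efficient exploitation is the open problem raised above.

        # Minimal sketch of an information-content (Resnik) similarity over a toy
        # ontology; terms, edges, and probabilities are invented for illustration.
        import math

        # child -> set of parents (a tiny is-a DAG)
        PARENTS = {
            "catalytic activity": {"molecular function"},
            "kinase activity": {"catalytic activity"},
            "transferase activity": {"catalytic activity"},
            "molecular function": set(),
        }
        # fraction of annotated gene products reaching each term (toy numbers)
        P = {"molecular function": 1.0, "catalytic activity": 0.4,
             "kinase activity": 0.05, "transferase activity": 0.15}

        def ancestors(term):
            """Return the term plus all of its ancestors in the DAG."""
            seen, stack = set(), [term]
            while stack:
                t = stack.pop()
                if t not in seen:
                    seen.add(t)
                    stack.extend(PARENTS.get(t, ()))
            return seen

        def ic(term):
            """Information content: rarer terms are more informative."""
            return -math.log(P[term])

        def resnik(t1, t2):
            """IC of the most informative common ancestor of the two terms."""
            common = ancestors(t1) & ancestors(t2)
            return max(ic(t) for t in common)

        print(resnik("kinase activity", "transferase activity"))  # IC of 'catalytic activity'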

  15. Towards Large Eddy Simulation of gas turbine compressors

    NASA Astrophysics Data System (ADS)

    McMullan, W. A.; Page, G. J.

    2012-07-01

    With increasing computing power, Large Eddy Simulation could be a useful simulation tool for gas turbine axial compressor design. This paper outlines a series of simulations performed on compressor geometries, ranging from a Controlled Diffusion Cascade stator blade to the periodic sector of a stage in a 3.5 stage axial compressor. The simulation results show that LES may offer advantages over traditional RANS methods when off-design conditions are considered - flow regimes where RANS models often fail to converge. The time-dependent nature of LES permits the resolution of transient flow structures, and can elucidate new mechanisms of vorticity generation on blade surfaces. It is shown that accurate LES is heavily reliant on both the near-wall mesh fidelity and the ability of the imposed inflow condition to recreate the conditions found in the reference experiment. For components embedded in a compressor this requires the generation of turbulence fluctuations at the inlet plane. A recycling method is developed that improves the quality of the flow in a single stage calculation of an axial compressor, and indicates that future developments in both the recycling technique and computing power will bring simulations of axial compressors within reach of industry in the coming years.

  16. Computationally efficient modeling of proprioceptive signals in the upper limb for prostheses: a simulation study

    PubMed Central

    Williams, Ian; Constandinou, Timothy G.

    2014-01-01

    Accurate models of proprioceptive neural patterns could one day play an important role in the creation of an intuitive proprioceptive neural prosthesis for amputees. This paper looks at combining efficient implementations of biomechanical and proprioceptor models in order to generate signals that mimic human muscular proprioceptive patterns for future experimental work in prosthesis feedback. A neuro-musculoskeletal model of the upper limb with 7 degrees of freedom and 17 muscles is presented and generates real time estimates of muscle spindle and Golgi Tendon Organ neural firing patterns. Unlike previous neuro-musculoskeletal models, muscle activation and excitation levels are unknowns in this application and an inverse dynamics tool (static optimization) is integrated to estimate these variables. A proprioceptive prosthesis will need to be portable and this is incompatible with the computationally demanding nature of standard biomechanical and proprioceptor modeling. This paper uses and proposes a number of approximations and optimizations to make real time operation on portable hardware feasible. Finally, technical obstacles to mimicking natural feedback for an intuitive proprioceptive prosthesis, as well as issues and limitations with existing models, are identified and discussed. PMID:25009463

  17. CaloGAN: Simulating 3D high energy particle showers in multilayer electromagnetic calorimeters with generative adversarial networks

    NASA Astrophysics Data System (ADS)

    Paganini, Michela; de Oliveira, Luke; Nachman, Benjamin

    2018-01-01

    The precise modeling of subatomic particle interactions and propagation through matter is paramount for the advancement of nuclear and particle physics searches and precision measurements. The most computationally expensive step in the simulation pipeline of a typical experiment at the Large Hadron Collider (LHC) is the detailed modeling of the full complexity of physics processes that govern the motion and evolution of particle showers inside calorimeters. We introduce CaloGAN, a new fast simulation technique based on generative adversarial networks (GANs). We apply these neural networks to the modeling of electromagnetic showers in a longitudinally segmented calorimeter and achieve speedup factors comparable to or better than existing full simulation techniques on CPU (100x-1000x) and even faster on GPU (up to ~10^5x). There are still challenges for achieving precision across the entire phase space, but our solution can reproduce a variety of geometric shower shape properties of photons, positrons, and charged pions. This represents a significant stepping stone toward a full neural network-based detector simulation that could save significant computing time and enable many analyses now and in the future.
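
    A minimal sketch of the adversarial training loop underlying a GAN is shown below in Python/PyTorch; it uses a toy one-dimensional distribution and tiny fully connected networks, and is not the CaloGAN architecture or its calorimeter data.

        # Minimal GAN sketch (not the CaloGAN architecture): a generator learns to
        # mimic a toy 1-D "energy deposit" distribution under an adversarial loss.
        import torch
        import torch.nn as nn

        torch.manual_seed(0)
        LATENT = 8

        G = nn.Sequential(nn.Linear(LATENT, 32), nn.ReLU(), nn.Linear(32, 1))
        D = nn.Sequential(nn.Linear(1, 32), nn.ReLU(), nn.Linear(32, 1), nn.Sigmoid())
        opt_g = torch.optim.Adam(G.parameters(), lr=1e-3)
        opt_d = torch.optim.Adam(D.parameters(), lr=1e-3)
        bce = nn.BCELoss()

        def real_batch(n=64):
            # Toy stand-in for detector data: log-normal "energies".
            return torch.exp(torch.randn(n, 1) * 0.25 + 1.0)

        for step in range(2000):
            # --- Discriminator update: real -> 1, generated -> 0 ---
            real = real_batch()
            fake = G(torch.randn(real.size(0), LATENT)).detach()
            loss_d = bce(D(real), torch.ones_like(D(real))) + \
                     bce(D(fake), torch.zeros_like(D(fake)))
            opt_d.zero_grad(); loss_d.backward(); opt_d.step()

            # --- Generator update: try to fool the discriminator ---
            gen = G(torch.randn(64, LATENT))
            loss_g = bce(D(gen), torch.ones(64, 1))
            opt_g.zero_grad(); loss_g.backward(); opt_g.step()

        print("generated sample mean:", G(torch.randn(1000, LATENT)).mean().item())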

  18. Software Surface Modeling and Grid Generation Steering Committee

    NASA Technical Reports Server (NTRS)

    Smith, Robert E. (Editor)

    1992-01-01

    It is a NASA objective to promote improvements in the capability and efficiency of computational fluid dynamics. Grid generation, the creation of a discrete representation of the solution domain, is an essential part of computational fluid dynamics. However, grid generation about complex boundaries requires sophisticated surface-model descriptions of the boundaries. The surface modeling and the associated computation of surface grids consume an extremely large percentage of the total time required for volume grid generation. Efficient and user friendly software systems for surface modeling and grid generation are critical for computational fluid dynamics to reach its potential. The papers presented here represent the state-of-the-art in software systems for surface modeling and grid generation. Several papers describe improved techniques for grid generation.

  19. OpenSoC Fabric

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2014-08-21

    Recent advancements in technology scaling have shown a trend towards greater integration with large-scale chips containing thousands of processors connected to memories and other I/O devices using non-trivial network topologies. Software simulation proves insufficient to study the tradeoffs in such complex systems due to slow execution time, whereas hardware RTL development is too time-consuming. We present OpenSoC Fabric, an on-chip network generation infrastructure which aims to provide a parameterizable and powerful on-chip network generator for evaluating future high performance computing architectures based on SoC technology. OpenSoC Fabric leverages a new hardware DSL, Chisel, which contains powerful abstractions provided by its base language, Scala, and generates both software (C++) and hardware (Verilog) models from a single code base. The OpenSoC Fabric infrastructure is modeled after existing state-of-the-art simulators, offers large and powerful collections of configuration options, and follows object-oriented design and functional programming to make functionality extension as easy as possible.

  20. De Novo Design of Bioactive Small Molecules by Artificial Intelligence.

    PubMed

    Merk, Daniel; Friedrich, Lukas; Grisoni, Francesca; Schneider, Gisbert

    2018-01-01

    Generative artificial intelligence offers a fresh view on molecular design. We present the first-time prospective application of a deep learning model for designing new druglike compounds with desired activities. For this purpose, we trained a recurrent neural network to capture the constitution of a large set of known bioactive compounds represented as SMILES strings. By transfer learning, this general model was fine-tuned on recognizing retinoid X and peroxisome proliferator-activated receptor agonists. We synthesized five top-ranking compounds designed by the generative model. Four of the compounds revealed nanomolar to low-micromolar receptor modulatory activity in cell-based assays. Apparently, the computational model intrinsically captured relevant chemical and biological knowledge without the need for explicit rules. The results of this study advocate generative artificial intelligence for prospective de novo molecular design, and demonstrate the potential of these methods for future medicinal chemistry. © 2018 The Authors. Published by Wiley-VCH Verlag GmbH & Co. KGaA.

  1. A Summary Description of a Computer Program Concept for the Design and Simulation of Solar Pond Electric Power Generation Systems

    NASA Technical Reports Server (NTRS)

    1984-01-01

    The plant consists of a solar pond electric power generation subsystem, an electric power transformer and switch yard, a large solar pond, a water treatment plant, and numerous storage and evaporation ponds. Because a solar pond stores thermal energy over a long period of time, plant operation at any point in time is dependent upon past operation and perceived future generation plans. This time or past-history factor introduces a new dimension into the design process. The design optimization of a plant must go beyond examination of operational state points and consider the seasonal variations in solar input, solar pond energy storage, and the desired plant annual duty-cycle profile. Models or design tools will be required to optimize a plant design. These models should be developed to include a proper but not excessive level of detail. The model should be targeted to a specific objective and not conceived as a do-everything analysis tool, i.e., system design and not gradient-zone stability.

  2. CW-pumped telecom band polarization entangled photon pair generation in a Sagnac interferometer.

    PubMed

    Li, Yan; Zhou, Zhi-Yuan; Ding, Dong-Sheng; Shi, Bao-Sen

    2015-11-02

    Polarization entangled photon pair sources are widely used in many quantum information processing applications such as teleportation, quantum communication, quantum computation and high precision quantum metrology. We report on the generation of a continuous-wave pumped 1550 nm polarization entangled photon pair source at telecom wavelength using a type-II periodically poled KTiOPO4 (PPKTP) crystal in a Sagnac interferometer. Hong-Ou-Mandel (HOM) interference measurement yields a signal and idler photon bandwidth of 2.4 nm. High quality of entanglement is verified by various kinds of measurements, for example two-photon interference fringes, Bell inequality and quantum state tomography. The source can be tuned over a broad range against temperature or pump power without loss of visibility. This source will be used in our future experiments such as generation of an orbital angular momentum entangled source at telecom wavelength for quantum frequency up-conversion, entanglement-based quantum key distribution and many other quantum optics experiments at telecom wavelengths.
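
    For reference, the polarization-entangled state nominally produced by a type-II Sagnac source has the generic Bell-state form (written in LaTeX; the phase and the exact conventions of this particular source are not specified in the abstract):

        |\psi\rangle = \frac{1}{\sqrt{2}}\left( |H\rangle_{s}\,|V\rangle_{i}
                       + e^{i\phi}\, |V\rangle_{s}\,|H\rangle_{i} \right),

    and a CHSH Bell test on such a state can reach values of |S| up to 2\sqrt{2} \approx 2.83, beyond the local-realistic bound of 2, which is the kind of Bell-inequality violation used to certify entanglement quality.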

  3. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vallisneri, Michele

    We report on three numerical experiments on the implementation of Time-Delay Interferometry (TDI) for LISA, performed with Synthetic LISA, a C++/Python package that we developed to simulate the LISA science process at the level of scientific and technical requirements. Specifically, we study the laser-noise residuals left by first-generation TDI when the LISA armlengths have a realistic time dependence; we characterize the armlength-measurement accuracies that are needed to have effective laser-noise cancellation in both first- and second-generation TDI; and we estimate the quantization and telemetry bitdepth needed for the phase measurements. Synthetic LISA generates synthetic time series of the LISA fundamental noises, as filtered through all the TDI observables; it also provides a streamlined module to compute the TDI responses to gravitational waves according to a full model of TDI, including the motion of the LISA array and the temporal and directional dependence of the armlengths. We discuss the theoretical model that underlies the simulation, its implementation, and its use in future investigations on system-characterization and data-analysis prototyping for LISA.

  4. Structural Mechanics and Dynamics Branch

    NASA Technical Reports Server (NTRS)

    Stefko, George

    2003-01-01

    The 2002 annual report of the Structural Mechanics and Dynamics Branch reflects the majority of the work performed by the branch staff during the 2002 calendar year. Its purpose is to give a brief review of the branch's technical accomplishments. The Structural Mechanics and Dynamics Branch develops innovative computational tools, benchmark experimental data, and solutions to long-term barrier problems in the areas of propulsion aeroelasticity, active and passive damping, engine vibration control, rotor dynamics, magnetic suspension, structural mechanics, probabilistics, smart structures, engine system dynamics, and engine containment. Furthermore, the branch is developing a compact, nonpolluting, bearingless electric machine with electric power supplied by fuel cells for future "more electric" aircraft. An ultra-high-power-density machine that can generate projected power densities of 50 hp/lb or more, in comparison to conventional electric machines, which usually generate 0.2 hp/lb, is under development for application to electric drives for propulsive fans or propellers. In the future, propulsion and power systems will need to be lighter, to operate at higher temperatures, and to be more reliable in order to achieve higher performance and economic viability. The Structural Mechanics and Dynamics Branch is working to achieve these complex, challenging goals.

  5. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Haugen, Carl C.; Forget, Benoit; Smith, Kord S.

    Most high performance computing systems being deployed currently and envisioned for the future are based on making use of heavy parallelism across many computational nodes and many concurrent cores. These types of heavily parallel systems often have relatively little memory per core but large amounts of computing capability. This places a significant constraint on how data storage is handled in many Monte Carlo codes. This is made even more significant in fully coupled multiphysics simulations, which require simulations of many physical phenomena to be carried out concurrently on individual processing nodes, further reducing the amount of memory available for storage of Monte Carlo data. As such, there has been a move towards on-the-fly nuclear data generation to reduce the memory requirements associated with interpolation between pre-generated large nuclear data tables for a selection of system temperatures. Methods have been previously developed and implemented in MIT’s OpenMC Monte Carlo code for both the resolved resonance regime and the unresolved resonance regime, but are currently absent for the thermal energy regime. While there are many components involved in generating a thermal neutron scattering cross section on-the-fly, this work will focus on a proposed method for determining the energy and direction of a neutron after a thermal incoherent inelastic scattering event. This work proposes a rejection sampling based method using the thermal scattering kernel to determine the correct outgoing energy and angle. The goal of this project is to be able to treat the full S(α,β) kernel for graphite, to assist in high fidelity simulations of the TREAT reactor at Idaho National Laboratory. The method is, however, sufficiently general to be applicable to other thermal scattering materials, and can be initially validated with the continuous analytic free gas model.
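
    The rejection-sampling pattern on which the proposed method builds can be sketched generically in Python; the target and proposal densities below are toys chosen for illustration, not the actual S(α,β) kernel for graphite.

        # Generic rejection-sampling sketch: draw from a target density f using an
        # envelope M*g that bounds it. The densities here are toys, not S(alpha, beta).
        import math
        import random

        def f(x):                       # target density on [0, 10] (unnormalized)
            return math.exp(-x) * math.sin(x) ** 2

        def sample_g():                 # proposal: uniform on [0, 10]
            return random.uniform(0.0, 10.0)

        G_PDF = 1.0 / 10.0              # proposal density value
        M = 1.0 / G_PDF                 # ensures f(x) <= M * G_PDF, since f <= 1

        def rejection_sample():
            while True:
                x = sample_g()
                u = random.uniform(0.0, 1.0)
                if u * M * G_PDF <= f(x):   # accept with probability f(x) / (M * g(x))
                    return x

        samples = [rejection_sample() for _ in range(10000)]
        print("sample mean:", sum(samples) / len(samples))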

  6. Extreme Scale Computing to Secure the Nation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brown, D L; McGraw, J R; Johnson, J R

    2009-11-10

    Since the dawn of modern electronic computing in the mid 1940's, U.S. national security programs have been dominant users of every new generation of high-performance computer. Indeed, the first general-purpose electronic computer, ENIAC (the Electronic Numerical Integrator and Computer), was used to calculate the expected explosive yield of early thermonuclear weapons designs. Even the U.S. numerical weather prediction program, another early application for high-performance computing, was initially funded jointly by sponsors that included the U.S. Air Force and Navy, agencies interested in accurate weather predictions to support U.S. military operations. For the decades of the cold war, national security requirements continued to drive the development of high performance computing (HPC), including advancement of the computing hardware and development of sophisticated simulation codes to support weapons and military aircraft design and numerical weather prediction, as well as data-intensive applications such as cryptography and cybersecurity. U.S. national security concerns continue to drive the development of high-performance computers and software in the U.S., and in fact, events following the end of the cold war have driven an increase in the growth rate of computer performance at the high end of the market. This mainly derives from our nation's observance of a moratorium on underground nuclear testing beginning in 1992, followed by our voluntary adherence to the Comprehensive Test Ban Treaty (CTBT) beginning in 1995. The CTBT prohibits further underground nuclear tests, which in the past had been a key component of the nation's science-based program for assuring the reliability, performance and safety of U.S. nuclear weapons. In response to this change, the U.S. Department of Energy (DOE) initiated the Science-Based Stockpile Stewardship (SBSS) program in response to the Fiscal Year 1994 National Defense Authorization Act, which requires, 'in the absence of nuclear testing, a program to: (1) Support a focused, multifaceted program to increase the understanding of the enduring stockpile; (2) Predict, detect, and evaluate potential problems of the aging of the stockpile; (3) Refurbish and re-manufacture weapons and components, as required; and (4) Maintain the science and engineering institutions needed to support the nation's nuclear deterrent, now and in the future'. This program continues to fulfill its national security mission by adding significant new capabilities for producing scientific results through large-scale computational simulation coupled with careful experimentation, including sub-critical nuclear experiments permitted under the CTBT. To develop the computational science and the computational horsepower needed to support its mission, SBSS initiated the Accelerated Strategic Computing Initiative, later renamed the Advanced Simulation & Computing (ASC) program (sidebar: 'History of ASC Computing Program Computing Capability'). The modern 3D computational simulation capability of the ASC program supports the assessment and certification of the current nuclear stockpile through calibration with past underground test (UGT) data. While an impressive accomplishment, continued evolution of national security mission requirements will demand computing resources at a significantly greater scale than we have today. In particular, continued observance and potential Senate confirmation of the Comprehensive Test Ban Treaty (CTBT), together with the U.S. administration's promise for a significant reduction in the size of the stockpile and the inexorable aging and consequent refurbishment of the stockpile, all demand increasing refinement of our computational simulation capabilities. Assessment of the present and future stockpile with increased confidence of the safety and reliability without reliance upon calibration with past or future test data is a long-term goal of the ASC program. This will be accomplished through significant increases in the scientific bases that underlie the computational tools. Computer codes must be developed that replace phenomenology with increased levels of scientific understanding together with an accompanying quantification of uncertainty. These advanced codes will place significantly higher demands on the computing infrastructure than do the current 3D ASC codes. This article discusses not only the need for a future computing capability at the exascale for the SBSS program, but also considers high performance computing requirements for broader national security questions. For example, the increasing concern over potential nuclear terrorist threats demands a capability to assess threats and potential disablement technologies as well as a rapid forensic capability for determining a nuclear weapons design from post-detonation evidence (nuclear counterterrorism).

  7. Computer Applications in Teaching and Learning.

    ERIC Educational Resources Information Center

    Halley, Fred S.; And Others

    Some examples of the usage of computers in teaching and learning are examination generation, automatic exam grading, student tracking, problem generation, computational examination generators, program packages, simulation, and programing skills for problem solving. These applications are non-trivial and do fulfill the basic assumptions necessary…

  8. Automated apparatus and method of generating native code for a stitching machine

    NASA Technical Reports Server (NTRS)

    Miller, Jeffrey L. (Inventor)

    2000-01-01

    A computer system automatically generates CNC code for a stitching machine. The computer determines the locations of a present stitching point and a next stitching point. If a constraint is not found between the present stitching point and the next stitching point, the computer generates code for making a stitch at the next stitching point. If a constraint is found, the computer generates code for changing a condition (e.g., direction) of the stitching machine's stitching head.
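
    A minimal Python sketch of the branching logic described in the abstract is given below; the point and constraint representation and the emitted command strings are hypothetical, not the stitching machine's actual native code.

        # Toy sketch of the described control flow: emit a stitch command unless a
        # constraint lies between the present and next stitching points, in which
        # case emit a head-condition (direction) change first. Output strings are
        # illustrative only, not real machine code.
        from dataclasses import dataclass

        @dataclass
        class Point:
            x: float
            y: float

        def constraint_between(p: Point, q: Point, constraints) -> bool:
            # Hypothetical check: a constraint is flagged if its x lies between p and q.
            return any(min(p.x, q.x) <= c.x <= max(p.x, q.x) for c in constraints)

        def generate_code(points, constraints):
            code = []
            for present, nxt in zip(points, points[1:]):
                if constraint_between(present, nxt, constraints):
                    code.append(f"CHANGE_DIRECTION ; constraint before ({nxt.x:.1f},{nxt.y:.1f})")
                code.append(f"STITCH X{nxt.x:.1f} Y{nxt.y:.1f}")
            return code

        path = [Point(0, 0), Point(1, 0), Point(2, 0), Point(3, 0)]
        obstacles = [Point(1.5, 0)]
        print("\n".join(generate_code(path, obstacles)))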

  9. Fuel Performance Experiments and Modeling: Fission Gas Bubble Nucleation and Growth in Alloy Nuclear Fuels

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McDeavitt, Sean; Shao, Lin; Tsvetkov, Pavel

    2014-04-07

    Advanced fast reactor systems being developed under the DOE's Advanced Fuel Cycle Initiative are designed to destroy TRU isotopes generated in existing and future nuclear energy systems. Over the past 40 years, multiple experiments and demonstrations have been completed using U-Zr, U-Pu-Zr, U-Mo and other metal alloys. As a result, multiple empirical and semi-empirical relationships have been established to develop empirical performance modeling codes. Many mechanistic questions about fission gas mobility, bubble coalescence, and gas release have been answered through industrial experience, research, and empirical understanding. The advent of modern computational materials science, however, opens new doors of development such that physics-based multi-scale models may be developed to enable a new generation of predictive fuel performance codes that are not limited by empiricism.

  10. Design of the Hybrid Wing Body with Nacelle: N3-X Propulsion-Airframe Configuration

    NASA Technical Reports Server (NTRS)

    Kim, Hyoungjin; Harding, David; Gronstal, David T.; Liou, May-Fun; Liou, Meng-Sing

    2016-01-01

    The Hybrid Wing Body (HWB) aircraft is of great interest for future transport concepts due to its promises of reduced aircraft noise, nitrous-oxide emissions, and fuel consumption. A design parameterization method for HWB configurations with mail slot nacelle has been developed for a fast exploration of design space in conceptual and preliminary design phases of a HWB configuration. A HWB planform model by Laughlin [11] was implemented, and the Class Shape Transformation (CST) airfoil generation method by Kulfan [10] was utilized to construct the needed geometry for computational high fidelity aerodynamic simulations. Geometric constraints for the parameterization such as internal cabin and cargo hold layouts were imposed on the geometry generation. A CFD simulation was performed for a HWB configuration generated by the current geometric modeler, clearly showing a significant effect of the installed nacelle on the flowfield.
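
    The CST representation referenced above can be sketched compactly; the following Python snippet implements the standard class/shape-function form with Bernstein polynomials, using arbitrary coefficients rather than the paper's actual HWB geometry.

        # Sketch of the Class Shape Transformation (CST) airfoil representation used
        # for geometry parameterization; the coefficient values here are arbitrary.
        import math

        def cst_surface(psi, coeffs, zeta_te=0.0, n1=0.5, n2=1.0):
            """Non-dimensional surface ordinate zeta at chordwise station psi in [0, 1]."""
            n = len(coeffs) - 1
            class_fn = psi ** n1 * (1.0 - psi) ** n2             # round nose, sharp trailing edge
            shape_fn = sum(a * math.comb(n, i) * psi ** i * (1.0 - psi) ** (n - i)
                           for i, a in enumerate(coeffs))        # Bernstein-polynomial expansion
            return class_fn * shape_fn + psi * zeta_te           # trailing-edge thickness offset

        upper = [0.17, 0.16, 0.15, 0.14]   # arbitrary Bernstein coefficients (illustration only)
        for k in range(11):
            psi = k / 10.0
            print(f"psi={psi:.1f}  zeta={cst_surface(psi, upper):+.4f}")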

  11. Room temperature operation of electro-optical bistability in the edge-emitting tunneling-collector transistor laser

    NASA Astrophysics Data System (ADS)

    Feng, M.; Holonyak, N.; Wang, C. Y.

    2017-09-01

    Optical bistable devices are fundamental to digital photonics as building blocks of switches, logic gates, and memories in future computer systems. Here, we demonstrate both optical and electrical bistability and capability for switching in a single transistor operated at room temperature. The electro-optical hysteresis is explained by the interaction of electron-hole (e-h) generation and recombination dynamics with the cavity photon modulation in different switching paths. The switch-UP and switch-DOWN threshold voltages are determined by the rate difference of photon generation at the base quantum-well and the photon absorption via intra-cavity photon-assisted tunneling controlled by the collector voltage. Thus, the transistor laser electro-optical bistable switching is programmable with base current and collector voltage, and the basis for high speed optical logic processors.

  12. Planck 2015 results: XII. Full focal plane simulations

    DOE PAGES

    Ade, P. A. R.; Aghanim, N.; Arnaud, M.; ...

    2016-09-20

    In this paper, we present the 8th full focal plane simulation set (FFP8), deployed in support of the Planck 2015 results. FFP8 consists of 10 fiducial mission realizations reduced to 18 144 maps, together with the most massive suite of Monte Carlo realizations of instrument noise and CMB ever generated, comprising 10^4 mission realizations reduced to about 10^6 maps. The resulting maps incorporate the dominant instrumental, scanning, and data analysis effects, and the remaining subdominant effects will be included in future updates. Finally, generated at a cost of some 25 million CPU-hours spread across multiple high-performance-computing (HPC) platforms, FFP8 is used to validate and verify analysis algorithms and their implementations, and to remove biases from and quantify uncertainties in the results of analyses of the real data.

  13. Taking a fresh look at boiling heat transfer on the road to improved nuclear economics and efficiency

    DOE PAGES

    Pointer, William David; Baglietto, Emilio

    2016-05-01

    Here, in the effort to reinvigorate innovation in the way we design, build, and operate the nuclear power generating stations of today and tomorrow, nothing can be taken for granted. Not even the seemingly familiar physics of boiling water. The Consortium for the Advanced Simulation of Light Water Reactors, or CASL, is focused on the deployment of advanced modeling and simulation capabilities to enable the nuclear industry to reduce uncertainties in the prediction of multi-physics phenomena and continue to improve the performance of today’s Light Water Reactors and their fuel. An important part of the CASL mission is the development of a next generation thermal hydraulics simulation capability, integrating the history of engineering models based on experimental experience with the computing technology of the future.

  14. Study of variability of permittivity and its mapping over lunar surface and subsurface using multisensors datasets

    NASA Astrophysics Data System (ADS)

    Calla, O. P. N.; Mathur, Shubhra; Gadri, Kishan Lal; Jangid, Monika

    2016-12-01

    In the present paper, permittivity maps of the equatorial lunar surface are generated using brightness temperature (TB) data obtained from the Microwave Radiometer (MRM) of Chang'e-1 and physical temperature (TP) data obtained from the Diviner instrument of the Lunar Reconnaissance Orbiter (LRO). Permittivity mapping is not carried out above 60° latitude towards the lunar poles due to large anomalies in the physical temperature obtained from Diviner. The microwave frequencies used to generate these maps are 3 GHz, 7.8 GHz, 19.35 GHz and 37 GHz. Permittivity values are simulated using TB values at these four frequencies, with weighted averages of the Diviner physical temperature used to compute permittivity at each microwave frequency. Longer microwave wavelengths probe deeper layers of the lunar surface than shorter wavelengths. Initially, microwave emissivity is estimated using TB values from MRM and physical temperature (TP) from Diviner. From the estimated emissivity, the real part of the permittivity (ε) is calculated using the Fresnel equations, and permittivity maps of the equatorial lunar surface are generated. The simulated permittivity values are normalized with respect to density for easy comparison with the permittivity values of Apollo samples as well as with those of the Terrestrial Analogue of Lunar Soil (TALS) JSC-1A. A lower value of the dielectric constant (ε‧) indicates that the corresponding lunar surface is smooth and free of rough rocky terrain; thus, these data can help in selecting suitable landing sites for future lunar missions. The results of this paper will serve as input to future exploration of the lunar surface.
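
    A minimal sketch (Python/NumPy) of the emissivity-to-permittivity step described above, assuming a smooth surface observed at normal incidence so that the Fresnel reflectivity reduces to R = ((sqrt(eps) - 1) / (sqrt(eps) + 1))^2 and emissivity e = 1 - R = TB/TP; the function name and example values are illustrative, not from the paper.

        import numpy as np

        def permittivity_from_temperatures(tb, tp):
            """Estimate the real permittivity from microwave brightness temperature (tb)
            and physical temperature (tp), assuming emissivity e = TB/TP and a smooth
            surface at normal incidence (simplified Fresnel relation)."""
            e = np.clip(tb / tp, 1e-6, 1.0)               # emissivity estimate
            r = 1.0 - e                                   # normal-incidence reflectivity
            n = (1.0 + np.sqrt(r)) / (1.0 - np.sqrt(r))   # effective refractive index
            return n ** 2                                 # real part of the permittivity

        # Example: TB = 230 K at 3 GHz with a Diviner-weighted TP = 250 K
        print(permittivity_from_temperatures(np.array([230.0]), np.array([250.0])))  # about 3.2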

  15. Specifying the Concept of Future Generations for Addressing Issues Related to High-Level Radioactive Waste.

    PubMed

    Kermisch, Celine

    2016-12-01

    The nuclear community frequently refers to the concept of "future generations" when discussing the management of high-level radioactive waste. However, this notion is generally not defined. In this context, we have to assume a wide definition of the concept of future generations, conceived as people who will live after the contemporary people are dead. This definition thus embraces each generation following ours, without any restriction in time. The aim of this paper is to show that, in the debate about nuclear waste, this broad notion should be further specified, and to clarify the related implications for nuclear waste management policies. Therefore, we provide an ethical analysis of different management strategies for high-level waste in the light of two principles, protection of future generations-based on safety and security-and respect for their choice. This analysis shows that high-level waste management options have different ethical impacts across future generations, depending on whether the memory of the waste and its location is lost or not. We suggest taking this distinction into account by introducing the notions of "close future generations" and "remote future generations", which has important implications for nuclear waste management policies insofar as it stresses that a retrievable disposal has fewer benefits than usually assumed.

  16. Now and Next-Generation Sequencing Techniques: Future of Sequence Analysis Using Cloud Computing

    PubMed Central

    Thakur, Radhe Shyam; Bandopadhyay, Rajib; Chaudhary, Bratati; Chatterjee, Sourav

    2012-01-01

    Advances in the field of sequencing techniques have resulted in the greatly accelerated production of huge sequence datasets. This presents immediate challenges in database maintenance at datacenters. It provides additional computational challenges in data mining and sequence analysis. Together these represent a significant overburden on traditional stand-alone computer resources, and to reach effective conclusions quickly and efficiently, the virtualization of the resources and computation on a pay-as-you-go concept (together termed “cloud computing”) has recently appeared. The collective resources of the datacenter, including both hardware and software, can be available publicly, being then termed a public cloud, the resources being provided in a virtual mode to the clients who pay according to the resources they employ. Examples of public companies providing these resources include Amazon, Google, and Joyent. The computational workload is shifted to the provider, which also implements required hardware and software upgrades over time. A virtual environment is created in the cloud corresponding to the computational and data storage needs of the user via the internet. The task is then performed, the results transmitted to the user, and the environment finally deleted after all tasks are completed. In this discussion, we focus on the basics of cloud computing, and go on to analyze the prerequisites and overall working of clouds. Finally, the applications of cloud computing in biological systems, particularly in comparative genomics, genome informatics, and SNP detection are discussed with reference to traditional workflows. PMID:23248640

  17. THE FUTURE OF COMPUTER-BASED TOXICITY PREDICTION: MECHANISM-BASED MODELS VS. INFORMATION MINING APPROACHES

    EPA Science Inventory


    The Future of Computer-Based Toxicity Prediction:
    Mechanism-Based Models vs. Information Mining Approaches

    When we speak of computer-based toxicity prediction, we are generally referring to a broad array of approaches which rely primarily upon chemical structure ...

  18. Educational Computer Utilization and Computer Communications.

    ERIC Educational Resources Information Center

    Singh, Jai P.; Morgan, Robert P.

    As part of an analysis of educational needs and telecommunications requirements for future educational satellite systems, three studies were carried out. 1) The role of the computer in education was examined and both current status and future requirements were analyzed. Trade-offs between remote time sharing and remote batch process were explored…

  19. Scenario-neutral Food Security Risk Assessment: A livestock Heat Stress Case Study

    NASA Astrophysics Data System (ADS)

    Broman, D.; Rajagopalan, B.; Hopson, T. M.

    2015-12-01

    Food security risk assessments can provide decision-makers with actionable information to identify critical system limitations, and alternatives to mitigate the impacts of future conditions. The majority of current risk assessments have been scenario-led, and results are limited by the scenarios - selected future states of the world's climate system and socioeconomic factors. A generic scenario-neutral framework for food security risk assessments is presented here that uses plausible states of the world without initially assigning likelihoods. Measures of system vulnerabilities are identified and system risk is assessed for these states. This framework benefits greatly from research in the water and natural resource fields on adapting planning to provide better risk assessments. To illustrate the utility of this framework we develop a case study using livestock heat stress risk within the pastoral system of West Africa. Heat stress can have a major impact not only on livestock owners, but on the greater food production system, decreasing livestock growth, milk production, and reproduction, and in severe cases causing death. A heat stress index calculated from daily weather is used as a vulnerability measure and is computed from historic daily weather data at several locations in the study region. To generate plausible states, a stochastic weather generator is developed to generate synthetic weather sequences at each location, consistent with the seasonal climate. A spatial model of monthly and seasonal heat stress provides projections of current and future livestock heat stress measures across the study region, and can incorporate seasonal climate and other external covariates. These models, when linked with empirical thresholds of heat stress risk for specific breeds, offer decision-makers actionable information for use in near-term warning systems as well as for future planning. Future assessments can indicate under which states livestock are at greatest risk of heat stress and, when coupled with assessments of additional measures (e.g., water and fodder availability), can point to alternatives that provide satisfactory performance under a wide range of states (e.g., optimal cattle breed, supplemental feed, increased water access).
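
    As an illustration of the vulnerability measure described above, the sketch below (Python) computes a daily temperature-humidity index (THI) from a toy stochastic weather sequence and counts exceedances of a stress threshold. The THI formulation and the threshold of 78 are commonly used for cattle but are assumptions here, not values taken from the study.

        import numpy as np

        rng = np.random.default_rng(0)

        def thi(temp_c, rel_hum_pct):
            """One commonly used cattle temperature-humidity index (assumed form)."""
            return 0.8 * temp_c + (rel_hum_pct / 100.0) * (temp_c - 14.4) + 46.4

        # Toy stochastic weather generator: one 120-day season of daily weather
        temp = rng.normal(loc=32.0, scale=3.0, size=120)                    # deg C
        rh = np.clip(rng.normal(loc=55.0, scale=15.0, size=120), 5, 100)    # percent

        index = thi(temp, rh)
        stress_days = int(np.sum(index > 78.0))   # days above the assumed stress threshold
        print(f"THI exceeded 78 on {stress_days} of {index.size} days")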

  20. Fast generation of computer-generated holograms using wavelet shrinkage.

    PubMed

    Shimobaba, Tomoyoshi; Ito, Tomoyoshi

    2017-01-09

    Computer-generated holograms (CGHs) are generated by superimposing complex amplitudes emitted from a number of object points. However, this superposition process remains very time-consuming even when using the latest computers. We propose a fast calculation algorithm for CGHs that uses a wavelet shrinkage method, eliminating small wavelet coefficient values to express approximated complex amplitudes using only a few representative wavelet coefficients.
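
    A minimal reconstruction of the idea (hard-thresholding the wavelet coefficients of a complex amplitude field so that only a few representative coefficients remain), using a single-level Haar transform in Python/NumPy. This is an illustrative sketch, not the authors' algorithm or parameters.

        import numpy as np

        def haar2d(x):
            """Single-level orthonormal 2-D Haar transform (works on complex arrays)."""
            a = (x[0::2, :] + x[1::2, :]) / np.sqrt(2)
            d = (x[0::2, :] - x[1::2, :]) / np.sqrt(2)
            return ((a[:, 0::2] + a[:, 1::2]) / np.sqrt(2),   # LL
                    (a[:, 0::2] - a[:, 1::2]) / np.sqrt(2),   # LH
                    (d[:, 0::2] + d[:, 1::2]) / np.sqrt(2),   # HL
                    (d[:, 0::2] - d[:, 1::2]) / np.sqrt(2))   # HH

        def ihaar2d(ll, lh, hl, hh):
            """Inverse of haar2d."""
            a = np.empty((ll.shape[0], 2 * ll.shape[1]), dtype=ll.dtype)
            d = np.empty_like(a)
            a[:, 0::2], a[:, 1::2] = (ll + lh) / np.sqrt(2), (ll - lh) / np.sqrt(2)
            d[:, 0::2], d[:, 1::2] = (hl + hh) / np.sqrt(2), (hl - hh) / np.sqrt(2)
            x = np.empty((2 * a.shape[0], a.shape[1]), dtype=a.dtype)
            x[0::2, :], x[1::2, :] = (a + d) / np.sqrt(2), (a - d) / np.sqrt(2)
            return x

        def shrink(coeffs, keep_ratio=0.05):
            """Hard shrinkage: zero all but the largest-magnitude wavelet coefficients."""
            mags = np.concatenate([np.abs(c).ravel() for c in coeffs])
            thresh = np.sort(mags)[-max(1, int(keep_ratio * mags.size))]
            return [np.where(np.abs(c) >= thresh, c, 0) for c in coeffs]

        # Toy complex amplitude from a single object point (Fresnel-like phase pattern)
        n, wavelength, z, pitch = 256, 532e-9, 0.1, 8e-6
        y, x = np.mgrid[-n // 2:n // 2, -n // 2:n // 2] * pitch
        field = np.exp(1j * np.pi * (x**2 + y**2) / (wavelength * z))

        approx = ihaar2d(*shrink(haar2d(field), keep_ratio=0.05))
        err = np.linalg.norm(field - approx) / np.linalg.norm(field)
        print(f"relative error with 5% of coefficients kept: {err:.3f}")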

  1. Measurement of Satellite Impact Test Fragments for Modeling Orbital Debris

    NASA Technical Reports Server (NTRS)

    Hill, Nicole M.

    2009-01-01

    There are over 13,000 catalogued objects 10 cm and larger in orbit around Earth [ODQN, January 2009, p12]. More than 6000 of these objects are fragments from explosions and collisions. As the earth-orbiting object count increases, debris-generating collisions in the future become a statistical inevitability. To aid in understanding this collision risk, the NASA Orbital Debris Program Office has developed computer models that calculate the quantity and orbits of debris both currently in orbit and in future epochs. In order to create a reasonable computer model of the orbital debris environment, it is important to understand the mechanics of debris creation as a result of a collision. The measurement of the physical characteristics of debris resulting from ground-based, hypervelocity impact testing aids in understanding the sizes and shapes of debris produced from potential impacts in orbit. To advance the accuracy of fragment shape/size determination, the NASA Orbital Debris Program Office recently implemented a computerized measurement system. The goal of this system is to improve knowledge and understanding of the relation between commonly used dimensions and overall shape. The technique developed involves scanning a single fragment with a hand-held laser device, measuring its size properties using a sophisticated software tool, and creating a three-dimensional computer model to demonstrate how the object might appear in orbit. This information is used to aid optical techniques in shape determination. This more automated and repeatable method provides higher accuracy in the size and shape determination of debris.

  2. Computer vision-based technologies and commercial best practices for the advancement of the motion imagery tradecraft

    NASA Astrophysics Data System (ADS)

    Phipps, Marja; Capel, David; Srinivasan, James

    2014-06-01

    Motion imagery capabilities within the Department of Defense/Intelligence Community (DoD/IC) have advanced significantly over the last decade, attempting to meet continuously growing data collection, video processing and analytical demands in operationally challenging environments. The motion imagery tradecraft has evolved accordingly, enabling teams of analysts to effectively exploit data and generate intelligence reports across multiple phases in structured Full Motion Video (FMV) Processing Exploitation and Dissemination (PED) cells. Yet now the operational requirements are drastically changing. The exponential growth in motion imagery data continues, but to this the community adds multi-INT data, interoperability with existing and emerging systems, expanded data access, nontraditional users, collaboration, automation, and support for ad hoc configurations beyond the current FMV PED cells. To break from the legacy system lifecycle, we look toward a technology application and commercial adoption model that will meet these future Intelligence, Surveillance and Reconnaissance (ISR) challenges. In this paper, we explore the application of cutting-edge computer vision technology to meet existing FMV PED shortfalls and address future capability gaps. For example, real-time georegistration services developed from computer-vision-based feature tracking, multiple-view geometry, and statistical methods allow the fusion of motion imagery with other georeferenced information sources, providing unparalleled situational awareness. We then describe how these motion imagery capabilities may be readily deployed in a dynamically integrated analytical environment; employing an extensible framework, leveraging scalable enterprise-wide infrastructure and following commercial best practices.

  3. Using NERSC High-Performance Computing (HPC) systems for high-energy nuclear physics applications with ALICE

    NASA Astrophysics Data System (ADS)

    Fasel, Markus

    2016-10-01

    High-Performance Computing Systems are powerful tools tailored to support large-scale applications that rely on low-latency inter-process communications to run efficiently. By design, these systems often impose constraints on application workflows, such as limited external network connectivity and whole node scheduling, that make more general-purpose computing tasks, such as those commonly found in high-energy nuclear physics applications, more difficult to carry out. In this work, we present a tool designed to simplify access to such complicated environments by handling the common tasks of job submission, software management, and local data management, in a framework that is easily adaptable to the specific requirements of various computing systems. The tool, initially constructed to process stand-alone ALICE simulations for detector and software development, was successfully deployed on the NERSC computing systems, Carver, Hopper and Edison, and is being configured to provide access to the next generation NERSC system, Cori. In this report, we describe the tool and discuss our experience running ALICE applications on NERSC HPC systems. The discussion will include our initial benchmarks of Cori compared to other systems and our attempts to leverage the new capabilities offered with Cori to support data-intensive applications, with a future goal of full integration of such systems into ALICE grid operations.

  4. Ferromagnetism in doped or undoped spintronics nanomaterials

    NASA Astrophysics Data System (ADS)

    Qiang, You

    2010-10-01

    Much interest has been sparked by the discovery of ferromagnetism in a range of doped and undoped oxide semiconductors. The development of ferromagnetic oxide semiconductor materials with giant magnetoresistance (GMR) offers many advantages in spintronics devices for future miniaturization of computers. Among them, TM-doped ZnO is an extensively studied n-type wide-band-gap (3.36 eV) semiconductor of tremendous interest for future miniaturized computers, blue light-emitting devices, and solar cells. In this talk, Co-doped ZnO and Co-doped Cu2O semiconductor nanoclusters are successfully synthesized by a third generation sputtering-gas-aggregation cluster technique. The Co-doped nanoclusters are ferromagnetic with Curie temperatures above room temperature. Both types of Co-doped nanoclusters show positive magnetoresistance (PMR) at low temperature, but the amplitude of the PMRs shows an anomalous difference. For similar Co doping concentration at 5 K, PMR is greater than 800% for Co-doped ZnO but only 5% for Co-doped Cu2O nanoclusters. The giant PMR in Co-doped ZnO, attributed to a large Zeeman splitting effect, has a linear dependence on the applied magnetic field with very high sensitivity, which makes it attractive for future spintronics applications. The small PMR in Co-doped Cu2O is related to its vanishing density of states at the Fermi level. Undoped Zn/ZnO core-shell nanoparticles show strong ferromagnetic properties above room temperature due to defect-induced magnetization at the interface.

  5. A large-scale simulation of climate change effects on flood regime - A case study for the Alabama-Coosa-Tallapoosa River Basin

    NASA Astrophysics Data System (ADS)

    Dullo, T. T.; Gangrade, S.; Marshall, R.; Islam, S. R.; Ghafoor, S. K.; Kao, S. C.; Kalyanapu, A. J.

    2017-12-01

    The damage and cost of flooding are continuously increasing due to climate change and variability, which compels the development and advancement of global flood hazard models. However, due to computational expense, evaluation of large-scale, high-resolution flood regimes remains a challenge. The objective of this research is to use a coupled modeling framework that consists of a dynamically downscaled suite of eleven Coupled Model Intercomparison Project Phase 5 (CMIP5) climate models, a distributed hydrologic model called DHSVM, and a computationally efficient two-dimensional hydraulic model called Flood2D-GPU to study the impacts of climate change on flood regime in the Alabama-Coosa-Tallapoosa (ACT) River Basin. Downscaled meteorological forcings for 40 years in the historical period (1966-2005) and 40 years in the future period (2011-2050) were used as inputs to drive the calibrated DHSVM to generate annual maximum flood hydrographs. These flood hydrographs, along with 30-m resolution digital elevation data and estimated surface roughness, were then used by Flood2D-GPU to estimate high-resolution flood depth, velocity, duration, and regime. Preliminary results for the Conasauga river basin (an upper subbasin within the ACT) indicate that seven of the eleven climate projections show an average increase of 25 km2 in flooded area between the historic and future projections. Future work will focus on illustrating the effects of climate change on flood duration and area for the entire ACT basin.

  6. A Three-Dimensional Statistical Average Skull: Application of Biometric Morphing in Generating Missing Anatomy.

    PubMed

    Teshima, Tara Lynn; Patel, Vaibhav; Mainprize, James G; Edwards, Glenn; Antonyshyn, Oleh M

    2015-07-01

    The utilization of three-dimensional modeling technology in craniomaxillofacial surgery has grown exponentially during the last decade. Future development, however, is hindered by the lack of a normative three-dimensional anatomic dataset and a statistical mean three-dimensional virtual model. The purpose of this study is to develop and validate a protocol to generate a statistical three-dimensional virtual model based on a normative dataset of adult skulls. Two hundred adult skull CT images were reviewed. The average three-dimensional skull was computed by processing each CT image in the series using thin-plate spline geometric morphometric protocol. Our statistical average three-dimensional skull was validated by reconstructing patient-specific topography in cranial defects. The experiment was repeated 4 times. In each case, computer-generated cranioplasties were compared directly to the original intact skull. The errors describing the difference between the prediction and the original were calculated. A normative database of 33 adult human skulls was collected. Using 21 anthropometric landmark points, a protocol for three-dimensional skull landmarking and data reduction was developed and a statistical average three-dimensional skull was generated. Our results show the root mean square error (RMSE) for restoration of a known defect using the native best match skull, our statistical average skull, and worst match skull was 0.58, 0.74, and 4.4 mm, respectively. The ability to statistically average craniofacial surface topography will be a valuable instrument for deriving missing anatomy in complex craniofacial defects and deficiencies as well as in evaluating morphologic results of surgery.
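
    For reference, the reported error metric is a straightforward root mean square of pointwise distances between the computer-generated cranioplasty surface and the intact skull; a minimal sketch follows (Python), with randomly generated point arrays standing in for the registered surfaces.

        import numpy as np

        def rmse(predicted, original):
            """Root mean square of per-point distances between two registered
            surfaces given as N x 3 coordinate arrays (millimetres)."""
            d = np.linalg.norm(predicted - original, axis=1)
            return float(np.sqrt(np.mean(d ** 2)))

        # Hypothetical registered surface samples over a restored cranial defect
        rng = np.random.default_rng(1)
        original = rng.uniform(0.0, 100.0, size=(5000, 3))
        predicted = original + rng.normal(0.0, 0.4, size=original.shape)
        print(f"RMSE = {rmse(predicted, original):.2f} mm")   # about 0.7 mm here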

  7. The Millennial Generation: Developing Leaders for the Future Security Environment

    DTIC Science & Technology

    2011-02-15

    The record text for this Strategy Research Project consists of citation and metadata fragments: references to The Dumbest Generation (Penguin Group, New York, 2009), pp. 8, 10, and to National Academy of Sciences, "Generation Y: The Millennials ... Ready or Not, Here ...", plus report-form fields for "The Millennial Generation: Developing Leaders for the Future Security Environment" by Colonel Lance ...

  8. Health Data Entanglement and artificial intelligence-based analysis: a brand new methodology to improve the effectiveness of healthcare services.

    PubMed

    Capone, A; Cicchetti, A; Mennini, F S; Marcellusi, A; Baio, G; Favato, G

    2016-01-01

    Healthcare expenses will be the most relevant policy issue for most governments in the EU and in the USA. This expenditure can be associated with two major categories: demographic and economic drivers. Factors driving healthcare expenditure have rarely been recognised, measured and comprehended. An improvement of health data generation and analysis is mandatory, and in order to tackle healthcare spending growth, it may be useful to design and implement an effective, advanced system to generate and analyse these data. A methodological approach based on Health Data Entanglement (HDE) can be a suitable option. By definition, in HDE a large number of data sets from several sources are functionally interconnected and processed by learning machines that generate patterns of highly probable future health conditions of a population. The entanglement concept is borrowed from quantum physics and means that multiple particles (information) are linked together in a way such that the measurement of one particle's quantum state (individual health conditions and related economic requirements) determines the possible quantum states of other particles (population health forecasts to predict their impact). The value created by the HDE is based on the combined evaluation of clinical, economic and social effects generated by health interventions. To predict the future health conditions of a population, analyses of data are performed using self-learning AI, in which sequential decisions are based on Bayesian algorithmic probabilities. HDE and AI-based analysis can be adopted to improve the effectiveness of the health governance system in ways that also lead to better quality of care.

  9. [Generation Y : recruitment, retention and development].

    PubMed

    Schmidt, C E; Möller, J; Schmidt, K; Gerbershagen, M U; Wappler, F; Limmroth, V; Padosch, S A; Bauer, M

    2011-06-01

    There is a significant shortage of highly qualified personnel in medicine, especially skilled doctors and nurses. This shortage of qualified labor has led to competition between hospitals. In analyzing the circumstances of this competition, nurses and doctors of the so-called generation Y are of particular importance. Recruitment and retention of these staff members will become a critical success factor for hospitals in the future. An internet search was conducted using the key words "generation Y and medicine, demography, personnel and hospitals". A search in Medline/PubMed for scientific studies on the topic of labor shortage was performed using the key words "personnel, shortage doctors, generation X, baby boomer, personnel and demographic changes, staff". Finally, sources from public institutions and academic medical societies were analyzed. The data were sorted by main categories and relevance for hospitals. Statistical analysis was done using descriptive measures. The analysis confirmed the heterogeneous and complex flood of information on the topics of demography and generations. A comparison of the generations showed that they can be separated into baby boomers (born 1946-1964, "live to work"), generation X (born 1965-1980, "work to live") and generation Y (born 1981 and after, "live while working"). Members of generation Y are oriented toward competence and less toward hierarchies. They exchange information using modern communication methods and within networks. The internet and computers are part of their daily routine. Employees of generation Y challenge leadership in hospitals with increased demands. However, generation Y can significantly increase professionalization and competitiveness for hospitals.

  10. Can We Communicate with Other Generations?

    ERIC Educational Resources Information Center

    Mellert, Robert B.

    Communicating with future generations is not merely a question of "how" to communicate, but also of "what." Our major moral responsibilities to future generations concern the size of future populations, conservation of nonrenewable resources, diversity of the gene pool, and quality of the environment. To determine our…

  11. Predictive Technologies: Can Smart Tools Augment the Brain's Predictive Abilities?

    PubMed Central

    Pezzulo, Giovanni; D'Ausilio, Alessandro; Gaggioli, Andrea

    2016-01-01

    The ability of “looking into the future”—namely, the capacity of anticipating future states of the environment or of the body—represents a fundamental function of human (and animal) brains. A goalkeeper who tries to guess the ball's direction; a chess player who attempts to anticipate the opponent's next move; or a man in love who tries to calculate the chances that she will say yes—in all these cases, people are simulating possible future states of the world in order to maximize the success of their decisions or actions. Research in neuroscience is showing that our ability to predict the behavior of physical or social phenomena is largely dependent on the brain's ability to integrate current and past information to generate (probabilistic) simulations of the future. But could predictive processing be augmented using advanced technologies? In this contribution, we discuss how computational technologies may be used to support, facilitate or enhance the prediction of future events, by considering illustrative scenarios across different domains, from simpler sensorimotor decisions to more complex cognitive tasks. We also examine the key scientific and technical challenges that must be faced to turn this vision into reality. PMID:27199648

  12. Computation of canonical correlation and best predictable aspect of future for time series

    NASA Technical Reports Server (NTRS)

    Pourahmadi, Mohsen; Miamee, A. G.

    1989-01-01

    The canonical correlation between the (infinite) past and future of a stationary time series is shown to be the limit of the canonical correlation between the (infinite) past and (finite) future, and computation of the latter is reduced to a (generalized) eigenvalue problem involving (finite) matrices. This provides a convenient and essentially finite-dimensional algorithm for computing canonical correlations and components of a time series. An upper bound is conjectured for the largest canonical correlation.
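
    The finite-dimensional computation can be sketched as follows (Python/NumPy): sample covariance blocks between a length-p past and a length-f future are formed, and the squared canonical correlations are the eigenvalues of Sxx^{-1} Sxy Syy^{-1} Syx. The block lengths and the AR(1) test series are illustrative choices, not taken from the paper.

        import numpy as np

        def canonical_correlations(series, p=5, f=5):
            """Canonical correlations between length-p past and length-f future blocks."""
            n = len(series) - p - f + 1
            past = np.stack([series[i:i + p] for i in range(n)])              # (n, p)
            future = np.stack([series[i + p:i + p + f] for i in range(n)])    # (n, f)
            X = past - past.mean(0)
            Y = future - future.mean(0)
            Sxx, Syy, Sxy = X.T @ X / n, Y.T @ Y / n, X.T @ Y / n
            # Squared canonical correlations: eigenvalues of Sxx^-1 Sxy Syy^-1 Syx
            M = np.linalg.solve(Sxx, Sxy) @ np.linalg.solve(Syy, Sxy.T)
            eig = np.sort(np.real(np.linalg.eigvals(M)))[::-1]
            return np.sqrt(np.clip(eig, 0, 1))

        # Example: AR(1) series x_t = 0.8 x_{t-1} + noise
        rng = np.random.default_rng(0)
        x = np.zeros(5000)
        for t in range(1, x.size):
            x[t] = 0.8 * x[t - 1] + rng.normal()
        print(canonical_correlations(x))   # largest value approaches 0.8 for this AR(1)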

  13. Mathematical modeling and computational prediction of cancer drug resistance.

    PubMed

    Sun, Xiaoqiang; Hu, Bin

    2017-06-23

    Diverse forms of resistance to anticancer drugs can lead to the failure of chemotherapy. Drug resistance is one of the most intractable issues for successfully treating cancer in current clinical practice. Effective clinical approaches that could counter drug resistance by restoring the sensitivity of tumors to the targeted agents are urgently needed. As numerous experimental results on resistance mechanisms have been obtained and a mass of high-throughput data has been accumulated, mathematical modeling and computational predictions using systematic and quantitative approaches have become increasingly important, as they can potentially provide deeper insights into resistance mechanisms, generate novel hypotheses or suggest promising treatment strategies for future testing. In this review, we first briefly summarize the current progress of experimentally revealed resistance mechanisms of targeted therapy, including genetic mechanisms, epigenetic mechanisms, posttranslational mechanisms, cellular mechanisms, microenvironmental mechanisms and pharmacokinetic mechanisms. Subsequently, we list several currently available databases and Web-based tools related to drug sensitivity and resistance. Then, we focus primarily on introducing some state-of-the-art computational methods used in drug resistance studies, including mechanism-based mathematical modeling approaches (e.g. molecular dynamics simulation, kinetic model of molecular networks, ordinary differential equation model of cellular dynamics, stochastic model, partial differential equation model, agent-based model, pharmacokinetic-pharmacodynamic model, etc.) and data-driven prediction methods (e.g. omics data-based conventional screening approach for node biomarkers, static network approach for edge biomarkers and module biomarkers, dynamic network approach for dynamic network biomarkers and dynamic module network biomarkers, etc.). Finally, we discuss several further questions and future directions for the use of computational methods for studying drug resistance, including inferring drug-induced signaling networks, multiscale modeling, drug combinations and precision medicine. © The Author 2017. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  14. Efficient multi-scenario Model Predictive Control for water resources management with ensemble streamflow forecasts

    NASA Astrophysics Data System (ADS)

    Tian, Xin; Negenborn, Rudy R.; van Overloop, Peter-Jules; María Maestre, José; Sadowska, Anna; van de Giesen, Nick

    2017-11-01

    Model Predictive Control (MPC) is one of the most advanced real-time control techniques that has been widely applied to Water Resources Management (WRM). MPC can manage the water system in a holistic manner and has a flexible structure to incorporate specific elements, such as setpoints and constraints. Therefore, MPC has shown its versatile performance in many branches of WRM. Nonetheless, with the in-depth understanding of stochastic hydrology in recent studies, MPC also faces the challenge of how to cope with hydrological uncertainty in its decision-making process. A possible way to embed the uncertainty is to generate an Ensemble Forecast (EF) of hydrological variables, rather than a deterministic one. The combination of MPC and EF results in a more comprehensive approach: Multi-scenario MPC (MS-MPC). In this study, we will first assess the model performance of MS-MPC, considering an ensemble streamflow forecast. Notably, computational inefficiency may be a critical obstacle that hinders the applicability of MS-MPC. In fact, with more scenarios taken into account, the computational burden of solving an optimization problem in MS-MPC accordingly increases. To deal with this challenge, we propose the Adaptive Control Resolution (ACR) approach as a computationally efficient scheme to practically reduce the number of control variables in MS-MPC. In brief, the ACR approach uses mixed-resolution control time steps, finer in the near future and coarser in the distant future. The ACR-MPC approach is tested on a real-world case study: an integrated flood control and navigation problem in the North Sea Canal of the Netherlands. Such an approach reduces the computation time by 18% or more in our case study. At the same time, the model performance of ACR-MPC remains close to that of conventional MPC.
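
    One simple way to picture the mixed-resolution horizon is sketched below (Python): control steps stay fine in the near term and coarsen further out, shrinking the number of decision variables per scenario. The step pattern and horizon are invented for illustration; the paper's ACR scheme defines its own resolution schedule.

        def acr_steps(horizon_hours=48, base_step=1, growth=2, block=6):
            """Control step sizes (hours): fine near-term, doubling every `block` steps."""
            steps, t, step = [], 0, base_step
            while t < horizon_hours:
                for _ in range(block):
                    if t >= horizon_hours:
                        break
                    steps.append(min(step, horizon_hours - t))
                    t += steps[-1]
                step *= growth
            return steps

        uniform = [1] * 48          # conventional MPC: one control move per hour
        adaptive = acr_steps(48)    # ACR-style mixed resolution
        print(len(uniform), "vs", len(adaptive), "control variables per scenario")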

  15. Long-term simulations of dissolved oxygen concentrations in Lake Trout lakes

    NASA Astrophysics Data System (ADS)

    Jabbari, A.; Boegman, L.; MacKay, M.; Hadley, K.; Paterson, A.; Jeziorski, A.; Nelligan, C.; Smol, J. P.

    2016-02-01

    Lake Trout are a rare and valuable natural resource that are threatened by multiple environmental stressors. With the added threat of climate warming, there is growing concern among resource managers that increased thermal stratification will reduce the habitat quality of deep-water Lake Trout lakes through enhanced oxygen depletion. To address this issue, a three-part study is underway, which aims to: analyze sediment cores to understand the past, develop empirical formulae to model the present and apply computational models to forecast the future. This presentation reports on the computational modeling efforts. To this end, a simple dissolved oxygen sub-model has been embedded in the one-dimensional bulk mixed-layer thermodynamic Canadian Small Lake Model (CSLM). This model is currently being incorporated into the Canadian Land Surface Scheme (CLASS), the primary land surface component of Environment Canada's global and regional climate modelling systems. The oxygen model was calibrated and validated by hind-casting temperature and dissolved oxygen profiles from two Lake Trout lakes on the Canadian Shield. These data sets include 5 years of high-frequency (10 s to 10 min) data from Eagle Lake and 30 years of bi-weekly data from Harp Lake. Initial results show that temperature and dissolved oxygen were predicted with root mean square errors of <1.5 °C and <3 mg L-1, respectively. Ongoing work is validating the model, over climate-change-relevant timescales, against dissolved oxygen reconstructions from the sediment cores and predicting future deep-water temperature and dissolved oxygen concentrations in Canadian Lake Trout lakes under future climate change scenarios. This model will provide a useful tool for managers to ensure sustainable fishery resources for future generations.

  16. Science & Technology Review June 2012

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Poyneer, L A

    2012-04-20

    This month's issue has the following articles: (1) A New Era in Climate System Analysis - Commentary by William H. Goldstein; (2) Seeking Clues to Climate Change - By comparing past climate records with results from computer simulations, Livermore scientists can better understand why Earth's climate has changed and how it might change in the future; (3) Finding and Fixing a Supercomputer's Faults - Livermore experts have developed innovative methods to detect hardware faults in supercomputers and help applications recover from errors that do occur; (4) Targeting Ignition - Enhancements to the cryogenic targets for National Ignition Facility experiments are furthering work to achieve fusion ignition with energy gain; (5) Neural Implants Come of Age - A new generation of fully implantable, biocompatible neural prosthetics offers hope to patients with neurological impairment; and (6) Incubator Busy Growing Energy Technologies - Six collaborations with industrial partners are using the Laboratory's high-performance computing resources to find solutions to urgent energy-related problems.

  17. S-Cube: Enabling the Next Generation of Software Services

    NASA Astrophysics Data System (ADS)

    Metzger, Andreas; Pohl, Klaus

    The Service Oriented Architecture (SOA) paradigm is increasingly adopted by industry for building distributed software systems. However, when designing, developing and operating innovative software services and service-based systems, several challenges exist. Those challenges include how to manage the complexity of those systems, how to establish, monitor and enforce Quality of Service (QoS) and Service Level Agreements (SLAs), as well as how to build those systems such that they can proactively adapt to dynamically changing requirements and context conditions. Developing foundational solutions for those challenges requires joint efforts of different research communities such as Business Process Management, Grid Computing, Service Oriented Computing and Software Engineering. This paper provides an overview of S-Cube, the European Network of Excellence on Software Services and Systems. S-Cube brings together researchers from leading research institutions across Europe, who join their competences to develop foundations, theories as well as methods and tools for future service-based systems.

  18. Enhancing performance of next generation FSO communication systems using soft computing-based predictions.

    PubMed

    Kazaura, Kamugisha; Omae, Kazunori; Suzuki, Toshiji; Matsumoto, Mitsuji; Mutafungwa, Edward; Korhonen, Timo O; Murakami, Tadaaki; Takahashi, Koichi; Matsumoto, Hideki; Wakamori, Kazuhiko; Arimoto, Yoshinori

    2006-06-12

    The deterioration and deformation of a free-space optical beam wave-front as it propagates through the atmosphere can reduce the link availability and may introduce burst errors, thus degrading the performance of the system. We investigate the suitability of utilizing soft-computing (SC) based tools for improving the performance of free-space optical (FSO) communications systems. The SC-based tools are used for the prediction of key parameters of a FSO communications system. Measured data collected from an experimental FSO communication system are used as training and testing data for a proposed multi-layer neural network predictor (MNNP) used to predict future parameter values. The predicted parameters are essential for reducing transmission errors by improving the antenna's accuracy in tracking data beams. This is particularly important during periods of strong atmospheric turbulence. The parameter values predicted using the proposed tool show acceptable conformity with original measurements.
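
    A rough illustration of this style of soft-computing predictor (Python, scikit-learn): a small multi-layer perceptron learns to predict the next value of a received-signal parameter from a sliding window of past samples. The window length, network size, and synthetic signal are assumptions for the sketch, not the MNNP configuration from the paper.

        import numpy as np
        from sklearn.neural_network import MLPRegressor

        rng = np.random.default_rng(0)

        # Synthetic received-power-like series: slow fading plus turbulence-induced noise
        t = np.arange(4000)
        signal = -20 + 3 * np.sin(2 * np.pi * t / 500) + rng.normal(0, 0.8, t.size)

        # Sliding windows: predict the next sample from the previous w samples
        w = 20
        X = np.stack([signal[i:i + w] for i in range(signal.size - w)])
        y = signal[w:]
        split = 3000
        model = MLPRegressor(hidden_layer_sizes=(32, 16), max_iter=1000, random_state=0)
        model.fit(X[:split], y[:split])

        pred = model.predict(X[split:])
        rmse = np.sqrt(np.mean((pred - y[split:]) ** 2))
        print(f"one-step-ahead RMSE on held-out data: {rmse:.2f} dB")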

  19. Power monitoring and control for large scale projects: SKA, a case study

    NASA Astrophysics Data System (ADS)

    Barbosa, Domingos; Barraca, João. Paulo; Maia, Dalmiro; Carvalho, Bruno; Vieira, Jorge; Swart, Paul; Le Roux, Gerhard; Natarajan, Swaminathan; van Ardenne, Arnold; Seca, Luis

    2016-07-01

    Large sensor-based science infrastructures for radio astronomy like the SKA will be among the most intensive data-driven projects in the world, facing very demanding computation, storage and management requirements and, above all, power demands. The geographically wide distribution of the SKA and its associated processing requirements in the form of tailored High Performance Computing (HPC) facilities require a greener approach to the Information and Communications Technologies (ICT) adopted for the data processing, to enable operational compliance with potentially strict power budgets. Reducing electricity costs and improving system power monitoring, together with the generation and management of electricity at the system level, are paramount to avoid future inefficiencies and higher costs and to enable fulfillment of the Key Science Cases. Here we outline major characteristics and innovation approaches to address power efficiency and long-term power sustainability for radio astronomy projects, focusing on Green ICT for science and smart power monitoring and control.

  20. Post-Genomics and Vaccine Improvement for Leishmania

    PubMed Central

    Seyed, Negar; Taheri, Tahereh; Rafati, Sima

    2016-01-01

    Leishmaniasis is a parasitic disease that primarily affects Asia, Africa, South America, and the Mediterranean basin. Despite extensive efforts to develop an effective prophylactic vaccine, no promising vaccine is available yet. However, recent advancements in computational vaccinology on the one hand and genome sequencing approaches on the other have generated new hopes in vaccine development. Computational genome mining for new vaccine candidates is known as reverse vaccinology and is believed to further extend the current list of Leishmania vaccine candidates. Reverse vaccinology can also reduce the intrinsic risks associated with live attenuated vaccines. Individual epitopes arranged in tandem as polytopes are also a possible outcome of reverse genome mining. Here, we will briefly compare reverse vaccinology with conventional vaccinology in respect to Leishmania vaccine, and we will discuss how it influences the aforementioned topics. We will also introduce new in vivo models that will bridge the gap between human and laboratory animal models in future studies. PMID:27092123

  1. Packaging strategies for printed circuit board components. Volume I, materials & thermal stresses.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Neilsen, Michael K.; Austin, Kevin N.; Adolf, Douglas Brian

    2011-09-01

    Decisions on material selections for electronics packaging can be quite complicated by the need to balance the criteria to withstand severe impacts yet survive deep thermal cycles intact. Many times, material choices are based on historical precedence perhaps ignorant of whether those initial choices were carefully investigated or whether the requirements on the new component match those of previous units. The goal of this program focuses on developing both increased intuition for generic packaging guidelines and computational methodologies for optimizing packaging in specific components. Initial efforts centered on characterization of classes of materials common to packaging strategies and computational analyses of stresses generated during thermal cycling to identify strengths and weaknesses of various material choices. Future studies will analyze the same example problems incorporating the effects of curing stresses as needed and analyzing dynamic loadings to compare trends with the quasi-static conclusions.

  2. Role of virtual reality for cerebral palsy management.

    PubMed

    Weiss, Patrice L Tamar; Tirosh, Emanuel; Fehlings, Darcy

    2014-08-01

    Virtual reality is the use of interactive simulations to present users with opportunities to perform in virtual environments that appear, sound, and less frequently, feel similar to real-world objects and events. Interactive computer play refers to the use of a game where a child interacts and plays with virtual objects in a computer-generated environment. Because of their distinctive attributes that provide ecologically realistic and motivating opportunities for active learning, these technologies have been used in pediatric rehabilitation over the past 15 years. The ability of virtual reality to create opportunities for active repetitive motor/sensory practice adds to its potential for neuroplasticity and learning in individuals with neurologic disorders. The objectives of this article are to provide an overview of how virtual reality and gaming are used clinically, to present the results of several example studies that demonstrate their use in research, and to briefly remark on future developments. © The Author(s) 2014.

  3. Automatic mathematical modeling for space application

    NASA Technical Reports Server (NTRS)

    Wang, Caroline K.

    1987-01-01

    A methodology for automatic mathematical modeling is described. The major objective is to create a very friendly environment for engineers to design, maintain and verify their model and also automatically convert the mathematical model into FORTRAN code for conventional computation. A demonstration program was designed for modeling the Space Shuttle Main Engine simulation mathematical model called Propulsion System Automatic Modeling (PSAM). PSAM provides a very friendly and well organized environment for engineers to build a knowledge base for base equations and general information. PSAM contains an initial set of component process elements for the Space Shuttle Main Engine simulation and a questionnaire that allows the engineer to answer a set of questions to specify a particular model. PSAM is then able to automatically generate the model and the FORTRAN code. A future goal is to download the FORTRAN code to the VAX/VMS system for conventional computation.
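
    As a toy picture of the final step (turning a specified model equation into FORTRAN source for conventional computation), a template-driven generator might look like the sketch below (Python). The subroutine layout, the '^' to '**' translation, and the example equations are invented for illustration and are not PSAM's actual behaviour or output.

        def generate_fortran(components):
            """Emit one fixed-form FORTRAN subroutine per component equation (toy)."""
            units = []
            for name, expr in components.items():
                code = expr.upper().replace("^", "**")   # '^' -> FORTRAN exponentiation
                units.append("\n".join([
                    f"      SUBROUTINE {name.upper()}(X, Y)",
                    "      REAL X, Y",
                    f"C     AUTO-GENERATED FROM: Y = {expr}",
                    f"      Y = {code}",
                    "      RETURN",
                    "      END",
                ]))
            return "\n\n".join(units)

        # Hypothetical component equations captured by a questionnaire-style spec
        model = {"pump": "2.5*x^2 + 0.3*x", "nozzle": "0.9*x - 1.2"}
        print(generate_fortran(model))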

  4. Utilizing population variation, vaccination, and systems biology to study human immunology

    PubMed Central

    Tsang, John S.

    2016-01-01

    The move toward precision medicine has highlighted the importance of understanding biological variability within and across individuals in the human population. In particular, given the prevalent involvement of the immune system in diverse pathologies, an important question is how much and what information about the state of the immune system is required to enable accurate prediction of future health and response to medical interventions. Towards addressing this question, recent studies using vaccination as a model perturbation and systems-biology approaches are beginning to provide a glimpse of how natural population variation together with multiplexed, high-throughput measurement and computational analysis can be used to uncover predictors of immune response quality in humans. Here I discuss recent developments in this emerging field, with emphasis on baseline correlates of vaccination responses, sources of immune-state variability, as well as relevant features of study design, data generation, and computational analysis. PMID:26187853

  5. The Virtual Pelvic Floor, a tele-immersive educational environment.

    PubMed Central

    Pearl, R. K.; Evenhouse, R.; Rasmussen, M.; Dech, F.; Silverstein, J. C.; Prokasy, S.; Panko, W. B.

    1999-01-01

    This paper describes the development of the Virtual Pelvic Floor, a new method of teaching the complex anatomy of the pelvic region utilizing virtual reality and advanced networking technology. Virtual reality technology allows improved visualization of three-dimensional structures over conventional media because it supports stereo vision, viewer-centered perspective, large angles of view, and interactivity. Two or more ImmersaDesk systems, drafting table format virtual reality displays, are networked together providing an environment where teacher and students share a high quality three-dimensional anatomical model, and are able to converse, see each other, and to point in three dimensions to indicate areas of interest. This project was realized by the teamwork of surgeons, medical artists and sculptors, computer scientists, and computer visualization experts. It demonstrates the future of virtual reality for surgical education and applications for the Next Generation Internet. PMID:10566378

  6. Neural-network quantum state tomography

    NASA Astrophysics Data System (ADS)

    Torlai, Giacomo; Mazzola, Guglielmo; Carrasquilla, Juan; Troyer, Matthias; Melko, Roger; Carleo, Giuseppe

    2018-05-01

    The experimental realization of increasingly complex synthetic quantum systems calls for the development of general theoretical methods to validate and fully exploit quantum resources. Quantum state tomography (QST) aims to reconstruct the full quantum state from simple measurements, and therefore provides a key tool to obtain reliable analytics [1-3]. However, exact brute-force approaches to QST place a high demand on computational resources, making them unfeasible for anything except small systems [4,5]. Here we show how machine learning techniques can be used to perform QST of highly entangled states with more than a hundred qubits, to a high degree of accuracy. We demonstrate that machine learning allows one to reconstruct traditionally challenging many-body quantities—such as the entanglement entropy—from simple, experimentally accessible measurements. This approach can benefit existing and future generations of devices ranging from quantum computers to ultracold-atom quantum simulators [6-8].

  7. Creating targeted initial populations for genetic product searches in heterogeneous markets

    NASA Astrophysics Data System (ADS)

    Foster, Garrett; Turner, Callaway; Ferguson, Scott; Donndelinger, Joseph

    2014-12-01

    Genetic searches often use randomly generated initial populations to maximize diversity and enable a thorough sampling of the design space. While many of these initial configurations perform poorly, the trade-off between population diversity and solution quality is typically acceptable for small-scale problems. Navigating complex design spaces, however, often requires computationally intelligent approaches that improve solution quality. This article draws on research advances in market-based product design and heuristic optimization to strategically construct 'targeted' initial populations. Targeted initial designs are created using respondent-level part-worths estimated from discrete choice models. These designs are then integrated into a traditional genetic search. Two case study problems of differing complexity are presented to illustrate the benefits of this approach. In both problems, targeted populations lead to computational savings and product configurations with improved market share of preferences. Future research efforts to tailor this approach and extend it towards multiple objectives are also discussed.
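
    A minimal sketch of the seeding idea (Python/NumPy): each sampled respondent's part-worths select that respondent's highest-utility level for every attribute, and those targeted designs are mixed with random designs to retain diversity. The attribute structure and part-worth values are hypothetical stand-ins for estimates from a discrete choice model.

        import numpy as np

        rng = np.random.default_rng(0)
        n_respondents, levels_per_attribute = 200, [3, 4, 2, 5]   # hypothetical attributes

        # Respondent-level part-worths from a discrete choice model (random stand-ins here)
        partworths = [rng.normal(size=(n_respondents, k)) for k in levels_per_attribute]

        def targeted_population(size):
            """Seed designs: for sampled respondents, choose each attribute's best level."""
            picks = rng.choice(n_respondents, size=size, replace=True)
            return np.stack([[int(np.argmax(pw[r])) for pw in partworths] for r in picks])

        def random_population(size):
            """Conventional seeding: uniform random attribute levels."""
            return np.stack([[rng.integers(k) for k in levels_per_attribute] for _ in range(size)])

        # Mixed initial population: targeted designs for quality, random designs for diversity
        initial = np.vstack([targeted_population(30), random_population(20)])
        print(initial.shape)   # (50, 4) -> 50 candidate products over 4 attributes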

  8. Virtual Reality-Enhanced Extinction of Phobias and Post-Traumatic Stress.

    PubMed

    Maples-Keller, Jessica L; Yasinski, Carly; Manjin, Nicole; Rothbaum, Barbara Olasov

    2017-07-01

    Virtual reality (VR) refers to an advanced technological communication interface in which the user is actively participating in a computer-generated 3-dimensional virtual world that includes computer sensory input devices used to simulate real-world interactive experiences. VR has been used within psychiatric treatment for anxiety disorders, particularly specific phobias and post-traumatic stress disorder, given several advantages that VR provides for use within treatment for these disorders. Exposure therapy for anxiety disorder is grounded in fear-conditioning models, in which extinction learning involves the process through which conditioned fear responses decrease or are inhibited. The present review will provide an overview of extinction training and anxiety disorder treatment, advantages for using VR within extinction training, a review of the literature regarding the effectiveness of VR within exposure therapy for specific phobias and post-traumatic stress disorder, and limitations and future directions of the extant empirical literature.

  9. An Overview of High Performance Computing and Challenges for the Future

    ScienceCinema

    Google Tech Talks

    2017-12-09

    In this talk we examine how high performance computing has changed over the last 10 years and look toward the future in terms of trends. These changes have had and will continue to have a major impact on our software. A new generation of software libraries and algorithms are needed for the effective and reliable use of (wide area) dynamic, distributed and parallel environments. Some of the software and algorithm challenges have already been encountered, such as management of communication and memory hierarchies through a combination of compile-time and run-time techniques, but the increased scale of computation, depth of memory hierarchies, range of latencies, and increased run-time environment variability will make these problems much harder. We will focus on the redesign of software to fit multicore architectures. Speaker: Jack Dongarra (University of Tennessee, Oak Ridge National Laboratory, University of Manchester). Jack Dongarra received a Bachelor of Science in Mathematics from Chicago State University in 1972 and a Master of Science in Computer Science from the Illinois Institute of Technology in 1973. He received his Ph.D. in Applied Mathematics from the University of New Mexico in 1980. He worked at the Argonne National Laboratory until 1989, becoming a senior scientist. He now holds an appointment as University Distinguished Professor of Computer Science in the Electrical Engineering and Computer Science Department at the University of Tennessee, has the position of a Distinguished Research Staff member in the Computer Science and Mathematics Division at Oak Ridge National Laboratory (ORNL), Turing Fellow in the Computer Science and Mathematics Schools at the University of Manchester, and an Adjunct Professor in the Computer Science Department at Rice University. He specializes in numerical algorithms in linear algebra, parallel computing, the use of advanced-computer architectures, programming methodology, and tools for parallel computers. His research includes the development, testing and documentation of high quality mathematical software. He has contributed to the design and implementation of the following open source software packages and systems: EISPACK, LINPACK, the BLAS, LAPACK, ScaLAPACK, Netlib, PVM, MPI, NetSolve, Top500, ATLAS, and PAPI. He has published approximately 200 articles, papers, reports and technical memoranda and he is coauthor of several books. He was awarded the IEEE Sid Fernbach Award in 2004 for his contributions in the application of high performance computers using innovative approaches. He is a Fellow of the AAAS, ACM, and the IEEE and a member of the National Academy of Engineering.

  10. An Overview of High Performance Computing and Challenges for the Future

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Google Tech Talks

    In this talk we examine how high performance computing has changed over the last 10 years and look toward the future in terms of trends. These changes have had and will continue to have a major impact on our software. A new generation of software libraries and algorithms are needed for the effective and reliable use of (wide area) dynamic, distributed and parallel environments. Some of the software and algorithm challenges have already been encountered, such as management of communication and memory hierarchies through a combination of compile-time and run-time techniques, but the increased scale of computation, depth of memory hierarchies, range of latencies, and increased run-time environment variability will make these problems much harder. We will focus on the redesign of software to fit multicore architectures. Speaker: Jack Dongarra (University of Tennessee, Oak Ridge National Laboratory, University of Manchester). Jack Dongarra received a Bachelor of Science in Mathematics from Chicago State University in 1972 and a Master of Science in Computer Science from the Illinois Institute of Technology in 1973. He received his Ph.D. in Applied Mathematics from the University of New Mexico in 1980. He worked at the Argonne National Laboratory until 1989, becoming a senior scientist. He now holds an appointment as University Distinguished Professor of Computer Science in the Electrical Engineering and Computer Science Department at the University of Tennessee, has the position of a Distinguished Research Staff member in the Computer Science and Mathematics Division at Oak Ridge National Laboratory (ORNL), Turing Fellow in the Computer Science and Mathematics Schools at the University of Manchester, and an Adjunct Professor in the Computer Science Department at Rice University. He specializes in numerical algorithms in linear algebra, parallel computing, the use of advanced-computer architectures, programming methodology, and tools for parallel computers. His research includes the development, testing and documentation of high quality mathematical software. He has contributed to the design and implementation of the following open source software packages and systems: EISPACK, LINPACK, the BLAS, LAPACK, ScaLAPACK, Netlib, PVM, MPI, NetSolve, Top500, ATLAS, and PAPI. He has published approximately 200 articles, papers, reports and technical memoranda and he is coauthor of several books. He was awarded the IEEE Sid Fernbach Award in 2004 for his contributions in the application of high performance computers using innovative approaches. He is a Fellow of the AAAS, ACM, and the IEEE and a member of the National Academy of Engineering.

  11. Computer-Generated, Three-Dimensional Character Animation: A Report and Analysis.

    ERIC Educational Resources Information Center

    Kingsbury, Douglas Lee

    This master's thesis details the experience gathered in the production "Snoot and Muttly," a short character animation with 3-D computer generated images, and provides an analysis of the computer-generated 3-D character animation system capabilities. Descriptions are provided of the animation environment at the Ohio State University…

  12. Turbofan noise generation. Volume 2: Computer programs

    NASA Technical Reports Server (NTRS)

    Ventres, C. S.; Theobald, M. A.; Mark, W. D.

    1982-01-01

    The use of a package of computer programs developed to calculate the in-duct acoustic modes excited by a fan/stator stage operating at subsonic tip speed is described. The following three noise source mechanisms are included: (1) sound generated by the rotor blades interacting with turbulence ingested into, or generated within, the inlet duct; (2) sound generated by the stator vanes interacting with the turbulent wakes of the rotor blades; and (3) sound generated by the stator vanes interacting with the velocity deficits in the mean wakes of the rotor blades. The computations for the three noise mechanisms are coded as three separate computer program packages. The computer codes are described by means of block diagrams, tables of data and variables, and example program executions; FORTRAN listings are included.

  13. Turbofan noise generation. Volume 2: Computer programs

    NASA Astrophysics Data System (ADS)

    Ventres, C. S.; Theobald, M. A.; Mark, W. D.

    1982-07-01

    The use of a package of computer programs developed to calculate the in-duct acoustic modes excited by a fan/stator stage operating at subsonic tip speed is described. The following three noise source mechanisms are included: (1) sound generated by the rotor blades interacting with turbulence ingested into, or generated within, the inlet duct; (2) sound generated by the stator vanes interacting with the turbulent wakes of the rotor blades; and (3) sound generated by the stator vanes interacting with the velocity deficits in the mean wakes of the rotor blades. The computations for the three noise mechanisms are coded as three separate computer program packages. The computer codes are described by means of block diagrams, tables of data and variables, and example program executions; FORTRAN listings are included.

  14. Establishment of sustainable health science for future generations: from a hundred years ago to a hundred years in the future.

    PubMed

    Mori, Chisato; Todaka, Emiko

    2009-01-01

    Recently, we have investigated the relationship between environment and health from a scientific perspective and developed a new academic field, "Sustainable Health Science," that will contribute to creating a healthy environment for future generations. There are three key points in Sustainable Health Science. The first key point is "focusing on future generations": society should improve the environment and prevent possible adverse health effects on future generations (Environmental Preventive Medicine). The second key point is the "precautionary principle." The third key point is "transdisciplinary science," which means that not only medical science but also other scientific fields, such as architectural and engineering science, should be involved. Here, we introduce our recent challenging project, the "Chemiless Town Project," in which a model town is under construction with fewer chemicals. In the project, a trial of an education program and a health-examination system for chemical exposure is going to be conducted. In the future, we aim to establish health examinations for exposure to chemicals in women of reproductive age so that the risk of adverse health effects to future generations will decrease and they can enjoy a better quality of life. We hope that society will accept the importance of forming a sustainable society for future generations, not only with regard to chemicals but also to the whole surrounding environment. As a Native American proverb tells us, we should live considering the effects on seven generations into the future.

  15. Integrated circuit test-port architecture and method and apparatus of test-port generation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Teifel, John

    A method and apparatus are provided for generating RTL code for a test-port interface of an integrated circuit. In an embodiment, a test-port table is provided as input data. A computer automatically parses the test-port table into data structures and analyzes it to determine input, output, local, and output-enable port names. The computer generates address-detect and test-enable logic constructed from combinational functions. The computer generates one-hot multiplexer logic for at least some of the output ports. The one-hot multiplexer logic for each port is generated so as to enable the port to toggle between data signals and test signals. The computer then completes the generation of the RTL code.
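
    To make the generation step concrete, the hypothetical Python sketch below turns one row of an assumed test-port table into Verilog one-hot multiplexer logic of the kind the abstract describes; the table layout, signal names, and mode selects are invented for illustration and are not taken from the patent.

      # Hypothetical sketch: emit a Verilog one-hot multiplexer so an output
      # port can toggle between its functional data signal and a test signal.
      def one_hot_mux(port, signals, selects, width):
          # Build: assign port = ({W{sel0}} & sig0) | ({W{sel1}} & sig1);
          # Exactly one select is expected to be high at a time (one-hot).
          terms = [f"({{{width}{{{sel}}}}} & {sig})"
                   for sig, sel in zip(signals, selects)]
          return f"assign {port} = " + " | ".join(terms) + ";"

      def generate_test_port_rtl(table):
          # table: list of rows with keys port, data, test, width (assumed format).
          return "\n".join(
              one_hot_mux(row["port"],
                          [row["data"], row["test"]],
                          ["func_mode", "test_mode"],  # assumed one-hot mode signals
                          row["width"])
              for row in table)

      if __name__ == "__main__":
          table = [{"port": "gpio_out", "data": "core_gpio", "test": "tp_gpio", "width": 8}]
          print(generate_test_port_rtl(table))
          # assign gpio_out = ({8{func_mode}} & core_gpio) | ({8{test_mode}} & tp_gpio);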

  16. Current and Future Development of a Non-hydrostatic Unified Atmospheric Model (NUMA)

    DTIC Science & Technology

    2010-09-09

    Slide fragments describing the capabilities sought for NUMA: (1) highly scalable on current and future computer architectures (exascale computing and beyond, and GPUs); (2) flexibility (truncated). Further fragments on exascale computing note that 10 of the Top 500 systems are already in the petascale range and that GPUs (e.g., Mare Nostrum) are also worth watching; a final item beginning "Numerical" is truncated.

  17. Comparison of voice-automated transcription and human transcription in generating pathology reports.

    PubMed

    Al-Aynati, Maamoun M; Chorneyko, Katherine A

    2003-06-01

    Software that can convert spoken words into written text has been available since the early 1980s. Early continuous speech systems were developed in 1994, with the latest commercially available editions having a claimed speech-recognition accuracy of up to 98% at natural speech rates. To evaluate the efficacy of one commercially available voice-recognition software system with pathology vocabulary in generating pathology reports and to compare this with human transcription. To draw cost analysis conclusions regarding human versus computer-based transcription. Two hundred six routine pathology reports from the surgical pathology material handled at St Joseph's Healthcare, Hamilton, Ontario, were generated simultaneously using computer-based transcription and human transcription. The following hardware and software were used: a desktop 450-MHz Intel Pentium III processor with 192 MB of RAM, a speech-quality sound card (Sound Blaster), a noise-canceling headset microphone, and IBM ViaVoice Pro version 8 with pathology vocabulary support (Voice Automated, Huntington Beach, Calif). The cost of the hardware and software used was approximately Can $2250. A total of 23,458 words were transcribed using both methods, with a mean of 114 words per report. The mean accuracy rate was 93.6% (range, 87.4%-96%) using the computer software, compared to a mean accuracy of 99.6% (range, 99.4%-99.8%) for human transcription (P < .001). Time needed to edit documents by the primary evaluator (M.A.) using the computer was on average twice that needed for editing the documents produced by human transcriptionists (range, 1.4-3.5 times). The extra time needed to edit documents was 67 minutes per week (13 minutes per day). Computer-based continuous speech-recognition systems can be successfully used in pathology practice, even during the handling of gross pathology specimens. The relatively low accuracy rate of this voice-recognition software, with the resultant increased editing burden on pathologists, may not encourage its application on a wide scale in pathology departments with sufficient human transcription services, despite significant potential financial savings. However, computer-based transcription represents an attractive and relatively inexpensive alternative to human transcription in departments where there is a shortage of transcription services, and will no doubt become more commonly used in pathology departments in the future.
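
    The editing burden follows directly from the reported accuracy rates; the short calculation below (illustrative arithmetic only, using the study's published totals) shows roughly how many words each method would leave to correct.

      # Approximate number of words needing correction, from the reported
      # word count and mean accuracy rates.
      total_words = 23458
      asr_errors = total_words * (1 - 0.936)    # about 1,501 words to fix
      human_errors = total_words * (1 - 0.996)  # about 94 words to fix
      print(round(asr_errors), round(human_errors))  # 1501 94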

  18. Virtual presence for mission visualization: computer game technology provides a new approach

    NASA Astrophysics Data System (ADS)

    Hussey, K.

    2007-08-01

    The concept of virtual presence for mission and planetary science visualization is to allow the public to "see" in space as if they were either riding aboard or standing next to an ESA/NASA spacecraft. Our approach to accomplishing this goal is to utilize and extend the same technology used by the computer gaming industry. With this technology, people would be able to immediately "look" in any direction from their virtual location and "zoom in" at will. Whenever real data for their "view" exists, it would be incorporated into the scene. Where data is missing, a high-fidelity simulation of the view would be generated to fill in the chosen field of view. The observer could also change the time of observation into the past or future. The potential for applying this technology to the development of educational curricula is huge. On the engineering side, all allowable spacecraft and environmental parameters that are being measured and sent to Earth would be immediately viewable, as if looking at the dashboard of a car or the instrument panel of an aircraft. Historical information could also be displayed upon request. This can revolutionize the way the general public and the planetary science community view ESA/NASA missions, and it provides an educational context that is attractive to the younger generation. While conceptually using this technology is quite simple, the cross-discipline technical challenges are very demanding. This technology is currently under development and application at JPL to assist current missions in viewing their data, communicating with the public, and visualizing future mission plans. Real-time demonstrations of the technology described will be shown.

  19. Computer image generation: Reconfigurability as a strategy in high fidelity space applications

    NASA Technical Reports Server (NTRS)

    Bartholomew, Michael J.

    1989-01-01

    The demand for realistic, high fidelity, computer image generation systems to support space simulation is well established. However, as the number and diversity of space applications increase, the complexity and cost of computer image generation systems also increase. One strategy used to harmonize cost with varied requirements is the establishment of a reconfigurable image generation system that can be adapted rapidly and easily to meet new and changing requirements. The reconfigurability strategy is discussed through the life cycle of system conception, specification, design, implementation, operation, and support for high fidelity computer image generation systems. The discussion is limited to those issues directly associated with the reconfigurability and adaptability of a specialized scene generation system in a multi-faceted space applications environment. Examples and insights gained through the recent development and installation of the Improved Multi-function Scene Generation System at the Johnson Space Center Systems Engineering Simulator are reviewed and compared with current simulator industry practices. The results are clear; the strategy of reconfigurability applied to space simulation requirements provides a viable path to supporting diverse applications with an adaptable computer image generation system.

  20. Model and Scenario Variations in Predicted Number of Generations of Spodoptera litura Fab. on Peanut during Future Climate Change Scenario

    PubMed Central

    Srinivasa Rao, Mathukumalli; Swathi, Pettem; Rama Rao, Chitiprolu Anantha; Rao, K. V.; Raju, B. M. K.; Srinivas, Karlapudi; Manimanjari, Dammu; Maheswari, Mandapaka

    2015-01-01

    The present study features the estimation of the number of generations of tobacco caterpillar, Spodoptera litura Fab., on peanut crop at six locations in India using MarkSim, which provides General Circulation Model (GCM) projections of future daily maximum (Tmax) and minimum (Tmin) air temperatures from six models, viz., BCCR-BCM2.0, CNRM-CM3, CSIRO-Mk3.5, ECHams5, INCM-CM3.0 and MIROC3.2, along with an ensemble of the six, for three emission scenarios (A2, A1B and B1). These data were used to predict future pest scenarios following the growing degree days approach in four different climate periods, viz., Baseline-1975, Near future (NF)-2020, Distant future (DF)-2050 and Very distant future (VDF)-2080. It is predicted that more generations would occur during the three future climate periods, with significant variation among scenarios and models. Among the seven models, 1–2 additional generations were predicted during DF and VDF due to higher future temperatures in the CNRM-CM3, ECHams5 and CSIRO-Mk3.5 models. The temperature projections of these models indicated that the generation time would decrease by 18–22% over baseline. Analysis of variance (ANOVA) was used to partition the variation in the predicted number of generations and generation time of S. litura on peanut during the crop season. Geographical location explained 34% of the total variation in number of generations, followed by time period (26%), model (1.74%) and scenario (0.74%). The remaining 14% of the variation was explained by interactions. The increased number of generations and reduction of generation time across the six peanut-growing locations of India suggest that the incidence of S. litura may increase due to the projected increase in temperatures in future climate change periods. PMID:25671564
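
    A growing degree days calculation of the kind named in the abstract can be sketched as below; the base temperature and the degree days required per generation are placeholders for illustration, not the study's calibrated values for S. litura.

      # Illustrative growing degree day (GDD) sketch: accumulate daily GDD
      # from Tmax/Tmin and convert the seasonal total into generations.
      def degree_days(tmax, tmin, t_base):
          # Simple averaging method for one day's GDD.
          return max((tmax + tmin) / 2.0 - t_base, 0.0)

      def generations_per_season(daily_tmax, daily_tmin, t_base=10.0, dd_per_gen=500.0):
          total = sum(degree_days(hi, lo, t_base)
                      for hi, lo in zip(daily_tmax, daily_tmin))
          return total / dd_per_gen

      # Synthetic 120-day season: a uniform 32/22 C day gives about 4 generations.
      print(round(generations_per_season([32.0] * 120, [22.0] * 120), 2))  # 4.08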

  1. Computer Series, 13: Bits and Pieces, 11.

    ERIC Educational Resources Information Center

    Moore, John W., Ed.

    1982-01-01

    Describes computer programs (with ordering information) on various topics including, among others, modeling of thermodynamics and economics of solar energy, radioactive decay simulation, stoichiometry drill/tutorial (in Spanish), computer-generated safety quiz, medical chemistry computer game, medical biochemistry question bank, generation of…

  2. eXascale PRogramming Environment and System Software (XPRESS)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chapman, Barbara; Gabriel, Edgar

    Exascale systems, with a thousand times the compute capacity of today's leading edge petascale computers, are expected to emerge during the next decade. Their software systems will need to facilitate the exploitation of exceptional amounts of concurrency in applications, and ensure that jobs continue to run despite the occurrence of system failures and other kinds of hard and soft errors. Adapting computations at runtime to cope with changes in the execution environment, as well as to improve power and performance characteristics, is likely to become the norm. As a result, considerable innovation is required to develop system support to meet the needs of future computing platforms. The XPRESS project aims to develop and prototype a revolutionary software system for extreme-scale computing for both exascale and strong-scaled problems. The XPRESS collaborative research project will advance the state-of-the-art in high performance computing and enable exascale computing for current and future DOE mission-critical applications and supporting systems. The goals of the XPRESS research project are to: A. enable exascale performance capability for DOE applications, both current and future, B. develop and deliver a practical computing system software X-stack, OpenX, for future practical DOE exascale computing systems, and C. provide programming methods and environments for effective means of expressing application and system software for portable exascale system execution.

  3. New computing systems, future computing environment, and their implications on structural analysis and design

    NASA Technical Reports Server (NTRS)

    Noor, Ahmed K.; Housner, Jerrold M.

    1993-01-01

    Recent advances in computer technology that are likely to impact structural analysis and design of flight vehicles are reviewed. A brief summary is given of the advances in microelectronics, networking technologies, and in the user-interface hardware and software. The major features of new and projected computing systems, including high performance computers, parallel processing machines, and small systems, are described. Advances in programming environments, numerical algorithms, and computational strategies for new computing systems are reviewed. The impact of the advances in computer technology on structural analysis and the design of flight vehicles is described. A scenario for future computing paradigms is presented, and the near-term needs in the computational structures area are outlined.

  4. Toward a web-based real-time radiation treatment planning system in a cloud computing environment.

    PubMed

    Na, Yong Hum; Suh, Tae-Suk; Kapp, Daniel S; Xing, Lei

    2013-09-21

    To exploit the potential dosimetric advantages of intensity modulated radiation therapy (IMRT) and volumetric modulated arc therapy (VMAT), an in-depth approach is required to provide efficient computing methods. This needs to incorporate clinically related organ specific constraints, Monte Carlo (MC) dose calculations, and large-scale plan optimization. This paper describes our first steps toward a web-based real-time radiation treatment planning system in a cloud computing environment (CCE). The Amazon Elastic Compute Cloud (EC2) with a master node (an m2.xlarge instance with 17.1 GB of memory, two virtual cores with 3.25 EC2 Compute Units each, 420 GB of instance storage, and a 64-bit platform) is used as the backbone of cloud computing for dose calculation and plan optimization. The master node is able to scale the workers on an 'on-demand' basis. MC dose calculation is employed to generate accurate beamlet dose kernels by parallel tasks. The intensity modulation optimization uses total-variation regularization (TVR) and generates piecewise constant fluence maps for each initial beam direction in a distributed manner over the CCE. The optimized fluence maps are segmented into deliverable apertures. The shape of each aperture is iteratively rectified to be a sequence of arcs using the manufacturer's constraints. The output plan file from the EC2 is sent to the Simple Storage Service. Three de-identified clinical cancer treatment plans have been studied for evaluating the performance of the new planning platform with 6 MV flattening filter free beams (40 × 40 cm²) from the Varian TrueBeam(TM) STx linear accelerator. A CCE leads to speed-ups of up to 14-fold for both dose kernel calculations and plan optimizations in the head and neck, lung, and prostate cancer cases considered in this study. The proposed system relies on a CCE that is able to provide an infrastructure for parallel and distributed computing. The resultant plans from the cloud computing are identical to PC-based IMRT and VMAT plans, confirming the reliability of the cloud computing platform. This cloud computing infrastructure has been established for radiation treatment planning. It substantially improves the speed of inverse planning and makes future on-treatment adaptive re-planning possible.
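
    As a rough illustration of the total-variation-regularized fluence optimization step (not the authors' implementation), the sketch below fits non-negative beamlet weights to a prescribed dose through a synthetic dose-influence matrix while a smoothed TV penalty pushes the solution toward a piecewise-constant map; the matrix, penalty weight, and step size are all placeholder values.

      # Sketch of TVR fluence-map optimization by projected gradient descent.
      import numpy as np

      def tv_grad(x, eps=1e-6):
          # Gradient of the smoothed 1D total-variation term sum sqrt(dx^2 + eps).
          dx = np.diff(x)
          w = dx / np.sqrt(dx**2 + eps)
          g = np.zeros_like(x)
          g[:-1] -= w
          g[1:] += w
          return g

      def optimize_fluence(A, d, lam=0.1, step=2e-3, iters=5000):
          x = np.zeros(A.shape[1])
          for _ in range(iters):
              grad = A.T @ (A @ x - d) + lam * tv_grad(x)
              x = np.maximum(x - step * grad, 0.0)  # fluence must stay non-negative
          return x

      if __name__ == "__main__":
          rng = np.random.default_rng(0)
          A = rng.random((40, 20))            # synthetic dose-influence matrix
          x_true = np.repeat([1.0, 3.0], 10)  # piecewise-constant "true" fluence
          x_fit = optimize_fluence(A, A @ x_true)
          print(np.round(x_fit, 1))           # roughly recovers the two plateaus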

  5. Toward a web-based real-time radiation treatment planning system in a cloud computing environment

    NASA Astrophysics Data System (ADS)

    Hum Na, Yong; Suh, Tae-Suk; Kapp, Daniel S.; Xing, Lei

    2013-09-01

    To exploit the potential dosimetric advantages of intensity modulated radiation therapy (IMRT) and volumetric modulated arc therapy (VMAT), an in-depth approach is required to provide efficient computing methods. This needs to incorporate clinically related organ specific constraints, Monte Carlo (MC) dose calculations, and large-scale plan optimization. This paper describes our first steps toward a web-based real-time radiation treatment planning system in a cloud computing environment (CCE). The Amazon Elastic Compute Cloud (EC2) with a master node (an m2.xlarge instance with 17.1 GB of memory, two virtual cores with 3.25 EC2 Compute Units each, 420 GB of instance storage, and a 64-bit platform) is used as the backbone of cloud computing for dose calculation and plan optimization. The master node is able to scale the workers on an 'on-demand' basis. MC dose calculation is employed to generate accurate beamlet dose kernels by parallel tasks. The intensity modulation optimization uses total-variation regularization (TVR) and generates piecewise constant fluence maps for each initial beam direction in a distributed manner over the CCE. The optimized fluence maps are segmented into deliverable apertures. The shape of each aperture is iteratively rectified to be a sequence of arcs using the manufacturer's constraints. The output plan file from the EC2 is sent to the Simple Storage Service. Three de-identified clinical cancer treatment plans have been studied for evaluating the performance of the new planning platform with 6 MV flattening filter free beams (40 × 40 cm²) from the Varian TrueBeam(TM) STx linear accelerator. A CCE leads to speed-ups of up to 14-fold for both dose kernel calculations and plan optimizations in the head and neck, lung, and prostate cancer cases considered in this study. The proposed system relies on a CCE that is able to provide an infrastructure for parallel and distributed computing. The resultant plans from the cloud computing are identical to PC-based IMRT and VMAT plans, confirming the reliability of the cloud computing platform. This cloud computing infrastructure has been established for radiation treatment planning. It substantially improves the speed of inverse planning and makes future on-treatment adaptive re-planning possible.

  6. Applications of patient-specific 3D printing in medicine.

    PubMed

    Heller, Martin; Bauer, Heide-Katharina; Goetze, Elisabeth; Gielisch, Matthias; Roth, Klaus E; Drees, Philipp; Maier, Gerrit S; Dorweiler, Bernhard; Ghazy, Ahmed; Neufurth, Meik; Müller, Werner E G; Schröder, Heinz C; Wang, Xiaohong; Vahl, Christian-Friedrich; Al-Nawas, Bilal

    Already three decades ago, the potential of medical 3D printing (3DP) or rapid prototyping for improved patient treatment began to be recognized. Since then, more and more medical indications in different surgical disciplines have been improved by using this new technique. Numerous examples have demonstrated the enormous benefit of 3DP in the medical care of patients by, for example, planning complex surgical interventions preoperatively, reducing implantation steps and anesthesia times, and helping with intraoperative orientation. At the beginning of every individual 3D model, patient-specific data on the basis of computed tomography (CT), magnetic resonance imaging (MRI), or ultrasound data is generated, which is then digitalized and processed using computer-aided design/computer-aided manufacturing (CAD/CAM) software. Finally, the resulting data sets are used to generate 3D-printed models or even implants. There are a variety of different application areas in the various medical fields, eg, drill or positioning templates, or surgical guides in maxillofacial surgery, or patient-specific implants in orthopedics. Furthermore, in vascular surgery it is possible to visualize pathologies such as aortic aneurysms so as to improve the planning of surgical treatment. Although rapid prototyping of individual models and implants is already applied very successfully in regenerative medicine, most of the materials used for 3DP are not yet suitable for implantation in the body. Therefore, it will be necessary in future to develop novel therapy approaches and design new materials in order to completely reconstruct natural tissue.

  7. Sensor planning for moving targets

    NASA Astrophysics Data System (ADS)

    Musman, Scott A.; Lehner, Paul; Elsaesser, Chris

    1994-10-01

    Planning a search for moving ground targets is difficult for humans and computationally intractable. This paper describes a technique to solve such problems. The main idea is to combine probability of detection assessments with computational search heuristics to generate sensor plans which approximately maximize either the probability of detection or a user-specified knowledge function (e.g., determining the target's probable destination; locating the enemy tanks). In contrast to supercomputer-based moving target search planning, our technique has been implemented using workstation technology. The data structures generated by sensor planning can be used to evaluate sensor reports during plan execution. Our system revises its objective function with each sensor report, allowing the user to assess both the current situation and the expected value of future information. This capability is particularly useful in situations involving a high rate of sensor reporting, helping the user focus attention on the sensor reports most pertinent to current needs. Our planning approach is implemented in a three-layer architecture. The layers are: mobility analysis, followed by sensor coverage analysis, and concluding with sensor plan analysis. Using these layers, it is possible to describe the physical, spatial, and temporal characteristics of a scenario in the first two layers, and customize the final analysis to specific intelligence objectives. The architecture also allows a user to customize operational parameters in each of the three major components of the system. As examples of these performance options, we briefly describe the mobility analysis and discuss issues affecting sensor plan analysis.
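
    The combination of detection-probability assessment with a search heuristic can be illustrated with a toy greedy planner (not the paper's algorithm): given a prior over target cells and per-sensor detection probabilities, it repeatedly adds the sensor task that most increases the overall probability of detection. All names and numbers below are invented.

      # Toy greedy sensor-plan sketch that approximately maximizes P(detect).
      def detection_prob(prior, selected, pd):
          # P(detect) = sum_c prior[c] * (1 - prod_s (1 - pd[s][c]))
          p = 0.0
          for cell, pc in prior.items():
              miss = 1.0
              for s in selected:
                  miss *= 1.0 - pd[s].get(cell, 0.0)
              p += pc * (1.0 - miss)
          return p

      def greedy_plan(prior, pd, budget):
          plan, remaining = [], list(pd)
          for _ in range(budget):
              best = max(remaining, key=lambda s: detection_prob(prior, plan + [s], pd))
              plan.append(best)
              remaining.remove(best)
          return plan

      prior = {"cellA": 0.5, "cellB": 0.3, "cellC": 0.2}
      pd = {"uav1": {"cellA": 0.8},
            "uav2": {"cellB": 0.7, "cellC": 0.4},
            "radar": {"cellA": 0.5, "cellB": 0.5}}
      print(greedy_plan(prior, pd, budget=2))  # ['uav1', 'uav2']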

  8. Virtual reality: a reality for future military pilotage?

    NASA Astrophysics Data System (ADS)

    McIntire, John P.; Martinsen, Gary L.; Marasco, Peter L.; Havig, Paul R.

    2009-05-01

    Virtual reality (VR) systems provide exciting new ways to interact with information and with the world. The visual VR environment can be synthetic (computer generated) or be an indirect view of the real world using sensors and displays. With the potential opportunities of a VR system, the question arises about what benefits or detriments a military pilot might incur by operating in such an environment. Immersive and compelling VR displays could be accomplished with an HMD (e.g., imagery on the visor), large area collimated displays, or by putting the imagery on an opaque canopy. But what issues arise when, instead of viewing the world directly, a pilot views a "virtual" image of the world? Is 20/20 visual acuity in a VR system good enough? To deliver this acuity over the entire visual field would require over 43 megapixels (MP) of display surface for an HMD or about 150 MP for an immersive CAVE system, either of which presents a serious challenge with current technology. Additionally, the same number of sensor pixels would be required to drive the displays to this resolution (and formidable network architectures required to relay this information), or massive computer clusters are necessary to create an entirely computer-generated virtual reality with this resolution. Can we presently implement such a system? What other visual requirements or engineering issues should be considered? With the evolving technology, there are many technological issues and human factors considerations that need to be addressed before a pilot is placed within a virtual cockpit.
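
    The pixel counts quoted above follow from treating 20/20 acuity as roughly one arcminute per pixel (60 pixels per degree); the short check below reproduces the order of magnitude under assumed fields of view, which are our guesses rather than figures from the paper.

      # Back-of-the-envelope pixel budget for eye-limited resolution.
      PIX_PER_DEG = 60                                  # ~1 arcminute per pixel
      hmd = (120 * PIX_PER_DEG) * (100 * PIX_PER_DEG)   # assumed 120 x 100 deg HMD field
      cave = (360 * PIX_PER_DEG) * (115 * PIX_PER_DEG)  # assumed wrap-around CAVE field
      print(round(hmd / 1e6, 1), round(cave / 1e6, 1))  # 43.2 and 149.0 megapixels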

  9. Telescience Support Center Data System Software

    NASA Technical Reports Server (NTRS)

    Rahman, Hasan

    2010-01-01

    The Telescience Support Center (TSC) team has developed a database-driven, increment-specific Data Requirement Document (DRD) generation tool that automates much of the work required for generating and formatting the DRD. It creates a database to load the required changes to configure the TSC data system, thus eliminating a substantial amount of labor in database entry and formatting. The TSC database contains the TSC systems configuration, along with the experimental data, in which human physiological data must be de-commutated in real time. The data for each experiment also must be cataloged and archived for future retrieval. TSC software provides tools and resources for ground operation and data distribution to remote users consisting of PIs (principal investigators), biomedical engineers, scientists, engineers, payload specialists, and computer scientists. Operations support is provided for computer systems access, detailed networking, and mathematical and computational problems of the International Space Station telemetry data. User training is provided for on-site staff, biomedical researchers, and other remote personnel in the usage of the space-bound services via the Internet, which enables significant resource savings for the physical facility along with the time savings versus traveling to NASA sites. The software used in support of the TSC could easily be adapted to other control center applications. This would include not only other NASA payload monitoring facilities, but also other types of control activities, such as monitoring and control of the electric grid, chemical or nuclear plant processes, air traffic control, and the like.

  10. Dynamic Optical Networks for Future Internet Environments

    NASA Astrophysics Data System (ADS)

    Matera, Francesco

    2014-05-01

    This article reports an overview on the evolution of the optical network scenario taking into account the exponential growth of connected devices, big data, and cloud computing that is driving a concrete transformation impacting the information and communication technology world. This hyper-connected scenario is deeply affecting relationships between individuals, enterprises, citizens, and public administrations, fostering innovative use cases in practically any environment and market, and introducing new opportunities and new challenges. The successful realization of this hyper-connected scenario depends on different elements of the ecosystem. In particular, it builds on connectivity and functionalities allowed by converged next-generation networks and their capacity to support and integrate with the Internet of Things, machine-to-machine, and cloud computing. This article aims at providing some hints of this scenario to contribute to analyze impacts on optical system and network issues and requirements. In particular, the role of the software-defined network is investigated by taking into account all scenarios regarding data centers, cloud computing, and machine-to-machine and trying to illustrate all the advantages that could be introduced by advanced optical communications.

  11. Operations analysis (study 2.1): Program manual and users guide for the LOVES computer code

    NASA Technical Reports Server (NTRS)

    Wray, S. T., Jr.

    1975-01-01

    The information necessary to use the LOVES Computer Program in its existing state, or to modify the program to include studies not properly handled by the basic model, is provided. The Users Guide defines the basic elements assembled together to form the model for servicing satellites in orbit. Because the program is a simulation, the method of attack is to disassemble the problem into a sequence of events, each occurring instantaneously and each creating one or more other events in the future. The main driving force of the simulation is the deterministic launch schedule of satellites and the subsequent failure of the various modules which make up the satellites. The LOVES Computer Program uses a random number generator to simulate the failure of module elements and therefore operates over a long span of time, typically 10 to 15 years. The sequence of events is varied by making several runs in succession with different random numbers, resulting in a Monte Carlo technique to determine statistical parameters of minimum value, average value, and maximum value.
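
    The event-driven Monte Carlo structure described here can be sketched in a few lines (purely illustrative, not the LOVES code): a deterministic launch schedule seeds an event queue, random module failures spawn servicing events, and repeated runs with different seeds yield minimum, average, and maximum statistics.

      # Minimal discrete-event Monte Carlo sketch in the spirit of the abstract.
      import heapq, random, statistics

      def simulate(horizon_years=15, launches=(0.0, 2.0, 4.0), mttf_years=3.0, seed=0):
          rng = random.Random(seed)
          events = [(t, "launch") for t in launches]   # deterministic launch schedule
          heapq.heapify(events)
          services = 0
          while events:
              t, kind = heapq.heappop(events)
              if t > horizon_years:
                  break
              if kind == "launch":
                  # The new satellite's module will fail at a random future time.
                  heapq.heappush(events, (t + rng.expovariate(1.0 / mttf_years), "failure"))
              else:  # failure: dispatch a servicing flight; the module later fails again
                  services += 1
                  heapq.heappush(events, (t + rng.expovariate(1.0 / mttf_years), "failure"))
          return services

      runs = [simulate(seed=s) for s in range(100)]
      print(min(runs), round(statistics.mean(runs), 1), max(runs))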

  12. Deep learning and the electronic structure problem

    NASA Astrophysics Data System (ADS)

    Mills, Kyle; Spanner, Michael; Tamblyn, Isaac

    In the past decade, the fields of artificial intelligence and computer vision have progressed remarkably. Supported by the enthusiasm of large tech companies, as well as significant hardware advances and the utilization of graphical processing units to accelerate computations, deep neural networks (DNN) are gaining momentum as a robust choice for many diverse machine learning applications. We have demonstrated the ability of a DNN to solve a quantum mechanical eigenvalue equation directly, without the need to compute a wavefunction, and without knowledge of the underlying physics. We have trained a convolutional neural network to predict the total energy of an electron in a confining, 2-dimensional electrostatic potential. We numerically solved the one-electron Schrödinger equation for millions of electrostatic potentials, and used this as training data for our neural network. Four classes of potentials were assessed: the canonical cases of the harmonic oscillator and infinite well, and two types of randomly generated potentials for which no analytic solution is known. We compare the performance of the neural network and consider how these results could lead to future advances in electronic structure theory.
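
    A convolutional regressor of the general kind described can be sketched as follows; the architecture, grid size, and the random placeholder data standing in for solver-generated (potential, energy) pairs are ours, not the authors'.

      # Illustrative PyTorch sketch: map a 2D potential on a 64x64 grid to a
      # single predicted energy value.
      import torch
      import torch.nn as nn

      class EnergyCNN(nn.Module):
          def __init__(self):
              super().__init__()
              self.features = nn.Sequential(
                  nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(),
                  nn.MaxPool2d(2),
                  nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
                  nn.MaxPool2d(2),
              )
              self.head = nn.Sequential(nn.Flatten(), nn.Linear(32 * 16 * 16, 1))

          def forward(self, v):      # v: (batch, 1, 64, 64) potential grids
              return self.head(self.features(v))

      model = EnergyCNN()
      opt = torch.optim.Adam(model.parameters(), lr=1e-3)
      loss_fn = nn.MSELoss()

      # Placeholder batch; real training data would come from a numerical
      # Schrodinger solver evaluated on many random potentials.
      v = torch.randn(8, 1, 64, 64)
      e = torch.randn(8, 1)
      for _ in range(10):
          opt.zero_grad()
          loss = loss_fn(model(v), e)
          loss.backward()
          opt.step()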

  13. Inlets, ducts, and nozzles

    NASA Technical Reports Server (NTRS)

    Abbott, John M.; Anderson, Bernhard H.; Rice, Edward J.

    1990-01-01

    The internal fluid mechanics research program in inlets, ducts, and nozzles consists of a balanced effort between the development of computational tools (both parabolized Navier-Stokes and full Navier-Stokes) and the conduct of experimental research. The experiments are designed to better understand the fluid flow physics, to develop new or improved flow models, and to provide benchmark quality data sets for validation of the computational methods. The inlet, duct, and nozzle research program is described according to three major classifications of flow phenomena: (1) highly 3-D flow fields; (2) shock-boundary-layer interactions; and (3) shear layer control. Specific examples of current and future elements of the research program are described for each of these phenomenon. In particular, the highly 3-D flow field phenomenon is highlighted by describing the computational and experimental research program in transition ducts having a round-to-rectangular area variation. In the case of shock-boundary-layer interactions, the specific details of research for normal shock-boundary-layer interactions are described. For shear layer control, research in vortex generators and the use of aerodynamic excitation for enhancement of the jet mixing process are described.

  14. Numerical Modelling of Staged Combustion Aft-Injected Hybrid Rocket Motors

    NASA Astrophysics Data System (ADS)

    Nijsse, Jeff

    The staged combustion aft-injected hybrid (SCAIH) rocket motor is a promising design for the future of hybrid rocket propulsion. Advances in computational fluid dynamics and scientific computing have made computational modelling an effective tool in hybrid rocket motor design and development. The focus of this thesis is the numerical modelling of the SCAIH rocket motor in a turbulent combustion, high-speed, reactive flow framework accounting for solid soot transport and radiative heat transfer. The SCAIH motor is modelled with a shear coaxial injector with liquid oxygen injected in the center at sub-critical conditions: 150 K and 150 m/s (Mach ≈ 0.9), and a gas-generator gas-solid mixture of one-third carbon soot by mass injected in the annular opening at 1175 K and 460 m/s (Mach ≈ 0.6). Flow conditions in the near-injector region and the flame anchoring mechanism are of particular interest. Overall, the flow is shown to exhibit instabilities, and the flame is shown to anchor directly on the injector faceplate with temperatures in excess of 2700 K.

  15. Computer-aided classification of forest cover types from small scale aerial photography

    NASA Astrophysics Data System (ADS)

    Bliss, John C.; Bonnicksen, Thomas M.; Mace, Thomas H.

    1980-11-01

    The US National Park Service must map forest cover types over extensive areas in order to fulfill its goal of maintaining or reconstructing presettlement vegetation within national parks and monuments. Furthermore, such cover type maps must be updated on a regular basis to document vegetation changes. Computer-aided classification of small scale aerial photography is a promising technique for generating forest cover type maps efficiently and inexpensively. In this study, seven cover types were classified with an overall accuracy of 62 percent from a reproduction of a 1∶120,000 color infrared transparency of a conifer-hardwood forest. The results were encouraging, given the degraded quality of the photograph and the fact that features were not centered, as well as the lack of information on lens vignetting characteristics to make corrections. Suggestions are made for resolving these problems in future research and applications. In addition, it is hypothesized that the overall accuracy is artificially low because the computer-aided classification more accurately portrayed the intermixing of cover types than the hand-drawn maps to which it was compared.

  16. Design and analysis of advanced flight planning concepts

    NASA Technical Reports Server (NTRS)

    Sorensen, John A.

    1987-01-01

    The objectives of this continuing effort are to develop and evaluate new algorithms and advanced concepts for flight management and flight planning. This includes the minimization of fuel or direct operating costs, the integration of the airborne flight management and ground-based flight planning processes, and the enhancement of future traffic management systems design. Flight management (FMS) concepts are for on-board profile computation and steering of transport aircraft in the vertical plane between a city pair and along a given horizontal path. Flight planning (FPS) concepts are for the pre-flight ground based computation of the three-dimensional reference trajectory that connects the city pair and specifies the horizontal path, fuel load, and weather profiles for initializing the FMS. As part of these objectives, a new computer program called EFPLAN has been developed and utilized to study advanced flight planning concepts. EFPLAN represents an experimental version of an FPS. It has been developed to generate reference flight plans compatible as input to an FMS and to provide various options for flight planning research. This report describes EFPLAN and the associated research conducted in its development.
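
    The fuel-versus-time trade-off at the heart of direct operating cost minimization can be shown with a toy calculation (not EFPLAN's cost model; the burn model and prices are invented): the planner evaluates candidate cruise speeds and keeps the cheapest.

      # Toy direct-operating-cost trade-off: faster cruise saves time cost but
      # burns more fuel; pick the cheapest candidate speed.
      def direct_operating_cost(speed_kts, dist_nm=2000.0, fuel_price=0.8, time_cost=25.0):
          time_min = dist_nm / speed_kts * 60.0
          fuel_per_min = 40.0 + 0.00034 * speed_kts**2   # crude burn model, kg/min
          return fuel_per_min * time_min * fuel_price + time_min * time_cost

      candidates = range(380, 521, 10)
      best = min(candidates, key=direct_operating_cost)
      print(best, round(direct_operating_cost(best)))    # 460 29884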

  17. Computers in the Classroom: The School of the Future, The Future of the School.

    ERIC Educational Resources Information Center

    Tapia, Ivan, Ed.

    1995-01-01

    Computer uses in the classroom is the theme topic of this journal issue. Contents include: "Emo Welzl: 1995 Leibniz Laureate" (Hartmut Wewetzer); "Learning to Read with the Aid of a Computer: Research Project with Children Starting School" (Horst Meermann); "The Multimedia School: The Comenius Pilot Project" (Tom Sperlich); "A Very Useful Piece of…

  18. Middle School Girls' Envisioned Future in Computing

    ERIC Educational Resources Information Center

    Friend, Michelle

    2015-01-01

    Experience is necessary but not sufficient to cause girls to envision a future career in computing. This study investigated the experiences and attitudes of girls who had taken three years of mandatory computer science classes in an all-girls setting in middle school, measured at the end of eighth grade. The one third of participants who were open…

  19. Applications of computer-graphics animation for motion-perception research

    NASA Technical Reports Server (NTRS)

    Proffitt, D. R.; Kaiser, M. K.

    1986-01-01

    The advantages and limitations of using computer animated stimuli in studying motion perception are presented and discussed. Most current programs of motion perception research could not be pursued without the use of computer graphics animation. Computer generated displays afford latitudes of freedom and control that are almost impossible to attain through conventional methods. There are, however, limitations to this presentational medium. At present, computer generated displays present simplified approximations of the dynamics in natural events. Very little is known about how the differences between natural events and computer simulations influence perceptual processing. In practice, the differences are assumed to be irrelevant to the questions under study, and that findings with computer generated stimuli will generalize to natural events.

  20. Positron Computed Tomography: Current State, Clinical Results and Future Trends

    DOE R&D Accomplishments Database

    Schelbert, H. R.; Phelps, M. E.; Kuhl, D. E.

    1980-09-01

    An overview is presented of positron computed tomography: its advantages over single photon emission tomography, its use in metabolic studies of the heart and chemical investigation of the brain, and future trends. (ACR)
