Sample records for generation computer project

  1. French Plans for Fifth Generation Computer Systems.

    DTIC Science & Technology

    1984-12-07

    ...centrally managed project in France that covers all facets of the...French industry in electronics, computers, software, and services and to make the...Centre National de Recherche Scientifique (CNRS) Cooperative Research...of Japan's Fifth Generation Project, the French scientific and industrial community...systems, man-computer interaction, novel computer structures, knowledge-based computer systems...The National Projects...The French Ministry of Research and...

  2. Has computational creativity successfully made it "Beyond the Fence" in musical theatre?

    NASA Astrophysics Data System (ADS)

    Jordanous, Anna

    2017-10-01

    A significant test for software is to task it with replicating human performance, as done recently with creative software and the commercial project Beyond the Fence (undertaken for a television documentary Computer Says Show). The remit of this project was to use computer software as much as possible to produce "the world's first computer-generated musical". Several creative systems were used to generate this musical, which was performed in London's West End in 2016. This paper considers the challenge of evaluating this project. Current computational creativity evaluation methods are ill-suited to evaluating projects that involve creative input from multiple systems and people. Following recent inspiration within computational creativity research from interaction design, here the DECIDE evaluation framework is applied to evaluate the Beyond the Fence project. Evaluation finds that the project was reasonably successful at achieving the task of using computational generation to produce a credible musical. Lessons have been learned for future computational creativity projects though, particularly for affording creative software more agency and enabling software to interact with other creative partners. Upon reflection, the DECIDE framework emerges as a useful evaluation "checklist" (if not a tangible operational methodology) for evaluating multiple creative systems participating in a creative task.

  3. Japanese supercomputer technology.

    PubMed

    Buzbee, B L; Ewald, R H; Worlton, W J

    1982-12-17

    Under the auspices of the Ministry of International Trade and Industry, the Japanese have launched a National Superspeed Computer Project intended to produce high-performance computers for scientific computation and a Fifth-Generation Computer Project intended to incorporate and exploit concepts of artificial intelligence. If these projects are successful, which appears likely, advanced economic and military research in the United States may become dependent on access to supercomputers of foreign manufacture.

  4. Molecular electronics: The technology of sixth generation computers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jarvis, M.T.; Miller, R.K.

    1987-01-01

    In February 1986, Japan began the 6th Generation project. At the 1987 Economic Summit in Venice, Prime Minister Yasuhiro Nakasone opened the project to world collaboration. A project director suggests that the 6th Generation ''may just be a turning point for human society.'' The major rationale for building molecular electronic devices is to achieve advances in computational densities and speeds. Proposed chromophore chains for molecular-scale chips, for example, could be spaced closer than today's silicon elements by a factor of almost 100. This book describes the research and proposed designs for molecular electronic devices and computers. It examines specific potential applications and the relationship of molecular electronics to silicon technology, presents the first published survey of experts on research issues, applications, and forecasts of future developments, and also includes a market forecast. An interesting suggestion of the survey is that the chemical industry may become a significant factor in the computer industry as the sixth generation unfolds.

  5. Digital computer programs for generating oblique orthographic projections and contour plots

    NASA Technical Reports Server (NTRS)

    Giles, G. L.

    1975-01-01

    User and programmer documentation is presented for two programs for automatic plotting of digital data. One of the programs generates oblique orthographic projections of three-dimensional numerical models, and the other generates contour plots of data distributed in an arbitrary planar region. A general description of the computational algorithms, user instructions, and complete listings of the programs are given. Several plots are included to illustrate various program options, and a single example is described to facilitate learning the use of the programs.

  6. View generated database

    NASA Technical Reports Server (NTRS)

    Downward, James G.

    1992-01-01

    This document represents the final report for the View Generated Database (VGD) project, NAS7-1066. It documents the work done on the project up to the point at which all project work was terminated due to lack of project funds. The VGD was to provide the capability to accurately represent any real-world object or scene as a computer model. Such models include both an accurate spatial/geometric representation of surfaces of the object or scene, as well as any surface detail present on the object. Applications of such models are numerous, including acquisition and maintenance of work models for tele-autonomous systems, generation of accurate 3-D geometric/photometric models for various 3-D vision systems, and graphical models for realistic rendering of 3-D scenes via computer graphics.

  7. Student-Generated Multimedia Projects in the Classroom.

    ERIC Educational Resources Information Center

    Green, Tim; Brown, Abbie H.

    2002-01-01

    Explains how student-generated, computer-based multimedia projects can be meaningful learning activities to integrate into the K-12 curriculum. Describes three phases: design, including goals and objectives of the project and use of the finished product; production, including choosing multimedia software, prototyping, and usability testing; and…

  8. Computer-Generated Phase Diagrams for Binary Mixtures.

    ERIC Educational Resources Information Center

    Jolls, Kenneth R.; And Others

    1983-01-01

    Computer programs that generate projections of thermodynamic phase surfaces through computer graphics were used to produce diagrams representing properties of water and steam and the pressure-volume-temperature behavior of most of the common equations of state. The program, program options emphasizing thermodynamic features of interest, and…

  9. Visualization of x-ray computer tomography using computer-generated holography

    NASA Astrophysics Data System (ADS)

    Daibo, Masahiro; Tayama, Norio

    1998-09-01

    A theory for converting x-ray projection data directly into a hologram, combining computed tomography (CT) with the computer-generated hologram (CGH), is proposed. The purpose of this study is to offer a theory for realizing an all-electronic, high-speed, see-through 3D visualization system for application to medical diagnosis and non-destructive testing. First, the CT is expressed using the pseudo-inverse matrix obtained by singular value decomposition. The CGH is expressed in matrix form. Next, the 'projection to hologram conversion' (PTHC) matrix is calculated by multiplying the phase matrix of the CGH with the pseudo-inverse matrix of the CT. Finally, the projection vector is converted directly to the hologram vector by multiplying the PTHC matrix with the projection vector. By incorporating holographic analog computation into CT reconstruction, the amount of calculation is drastically reduced. We demonstrate a CT cross section reconstructed by a He-Ne laser in 3D space from real x-ray projection data acquired with x-ray television equipment, using our direct conversion technique.
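
    A minimal NumPy sketch of the direct projection-to-hologram conversion (PTHC) described above. The matrices A and H and all sizes are random, illustrative stand-ins for the CT system matrix and the CGH phase matrix, not values from the paper.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    n_pix = 32 * 32      # voxels in the CT cross section
    n_proj = 60 * 32     # ray sums: 60 views x 32 detector bins
    n_holo = 64 * 64     # hologram sample points

    A = rng.random((n_proj, n_pix))   # CT forward-projection matrix (stand-in)
    H = rng.random((n_holo, n_pix))   # CGH phase matrix (stand-in)

    # CT reconstruction expressed via the pseudo-inverse (SVD-based, as above)
    A_pinv = np.linalg.pinv(A)

    # "Projection to hologram conversion" matrix: hologram = H @ (A_pinv @ p)
    PTHC = H @ A_pinv

    p = rng.random(n_proj)            # measured x-ray projection vector
    hologram = PTHC @ p               # direct conversion, no explicit CT image
    ```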

  10. Electromagnetic tracking of motion in the proximity of computer generated graphical stimuli: a tutorial.

    PubMed

    Schnabel, Ulf H; Hegenloh, Michael; Müller, Hermann J; Zehetleitner, Michael

    2013-09-01

    Electromagnetic motion-tracking systems have the advantage of capturing the tempo-spatial kinematics of movements independently of the visibility of the sensors. However, they are limited in that they cannot be used in the proximity of electromagnetic field sources, such as computer monitors. This prevents exploiting the tracking potential of the sensor system together with that of computer-generated visual stimulation. Here we present a solution for presenting computer-generated visual stimulation that does not distort the electromagnetic field required for precise motion tracking, by means of a back projection medium. In one experiment, we verify that cathode ray tube monitors, as well as thin-film-transistor monitors, distort electro-magnetic sensor signals even at a distance of 18 cm. Our back projection medium, by contrast, leads to no distortion of the motion-tracking signals even when the sensor is touching the medium. This novel solution permits combining the advantages of electromagnetic motion tracking with computer-generated visual stimulation.

  11. Final Report on Institutional Computing Project s15_hilaserion, “Kinetic Modeling of Next-Generation High-Energy, High-Intensity Laser-Ion Accelerators as an Enabling Capability”

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Albright, Brian James; Yin, Lin; Stark, David James

    This proposal sought on the order of 1M core-hours of Institutional Computing time intended to enable computing by a new LANL postdoc (David Stark) working under LDRD ER project 20160472ER (PI: Lin Yin) on laser-ion acceleration. The project was “off-cycle,” initiating in June of 2016 with a postdoc hire.

  12. MAPPER: A personal computer map projection tool

    NASA Technical Reports Server (NTRS)

    Bailey, Steven A.

    1993-01-01

    MAPPER is a set of software tools designed to let users create and manipulate map projections on a personal computer (PC). The capability exists to generate five popular map projections: azimuthal, cylindrical, Mercator, Lambert, and sinusoidal. Data for projections are contained in five coordinate databases at various resolutions. MAPPER is managed by a system of pull-down windows. This interface allows the user to intuitively create, view, and export maps to other platforms.
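
    For reference, the standard textbook formulas for two of the five projections named above, on a unit sphere; this is an illustrative sketch, not MAPPER's actual code.

    ```python
    import math

    def mercator(lon_deg, lat_deg):
        """Mercator: x = longitude, y = ln(tan(pi/4 + latitude/2))."""
        lon, lat = math.radians(lon_deg), math.radians(lat_deg)
        return lon, math.log(math.tan(math.pi / 4 + lat / 2))

    def sinusoidal(lon_deg, lat_deg, central_lon_deg=0.0):
        """Sinusoidal (equal-area): x = longitude * cos(latitude), y = latitude."""
        lon = math.radians(lon_deg - central_lon_deg)
        lat = math.radians(lat_deg)
        return lon * math.cos(lat), lat

    print(mercator(2.35, 48.85))     # e.g. Paris
    print(sinusoidal(2.35, 48.85))
    ```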

  13. Perceptual factors that influence use of computer enhanced visual displays

    NASA Technical Reports Server (NTRS)

    Littman, David; Boehm-Davis, Debbie

    1993-01-01

    This document is the final report for the NASA/Langley contract entitled 'Perceptual Factors that Influence Use of Computer Enhanced Visual Displays.' The document consists of two parts. The first part contains a discussion of the problem to which the grant was addressed, a brief discussion of work performed under the grant, and several issues suggested for follow-on work. The second part, presented as Appendix I, contains the annual report produced by Dr. Ann Fulop, the Postdoctoral Research Associate who worked on-site in this project. The main focus of this project was to investigate perceptual factors that might affect a pilot's ability to use computer generated information that is projected into the same visual space that contains information about real world objects. For example, computer generated visual information can identify the type of an attacking aircraft, or its likely trajectory. Such computer generated information must not be so bright that it adversely affects a pilot's ability to perceive other potential threats in the same volume of space. Or, perceptual attributes of computer generated and real display components should not contradict each other in ways that lead to problems of accommodation and, thus, distance judgments. The purpose of the research carried out under this contract was to begin to explore the perceptual factors that contribute to effective use of these displays.

  14. Development of New Generation of Multibody System Computer Software

    DTIC Science & Technology

    2012-04-12

    DEVELOPMENT OF NEW GENERATION OF MULTIBODY SYSTEM COMPUTER SOFTWARE. Ahmed A. Shabana, University of Illinois at Chicago; Paramsothy Jayakumar; Michael Letherwood. (Report documentation page fields omitted.)

  15. American River Watershed Project, California. Part 1: Main Report. Part 2: Final Supplemental Environmental Impact Statement/Environmental Impact Report. Supplemental Information Report

    DTIC Science & Technology

    1996-03-01

    Figure VIII-1, Computer-generated rendering of flood detention dam; Figure VIII-2, American River Watershed Project Schedule...shows a plan view of the dam and plate 19 shows the dam in section and profile. Figure VIII-1 is a computer-generated rendering of the dam. Table VIII-1...Williamson Act render the land ineligible for continued protection under that law, the local sponsor would be responsible for compensating the landowners

  16. A Longitudinal Study of Student Attitudes toward Computers: Resolving an Attitude Decay Paradox.

    ERIC Educational Resources Information Center

    McKinnon, David H.; Nolan, C. J. Patrick; Sinclair, Kenneth E.

    2000-01-01

    Describes results of an integrated curriculum project in New Zealand that generated paradoxical results regarding student motivation and attitudes toward computer use during junior high school, where attitudes became less positive. Examines implications for the design and implementation of curriculum projects involving student use of computer…

  17. Ultra-high resolution computed tomography imaging

    DOEpatents

    Paulus, Michael J.; Sari-Sarraf, Hamed; Tobin, Jr., Kenneth William; Gleason, Shaun S.; Thomas, Jr., Clarence E.

    2002-01-01

    A method for ultra-high resolution computed tomography imaging, comprising the steps of: focusing a high energy particle beam, for example x-rays or gamma-rays, onto a target object; acquiring a 2-dimensional projection data set representative of the target object; generating a corrected projection data set by applying a deconvolution algorithm, having an experimentally determined transfer function, to the 2-dimensional data set; storing the corrected projection data set; incrementally rotating the target object through an angle of approximately 180 degrees, and after each incremental rotation, repeating the radiating, acquiring, generating and storing steps; and, after the rotating step, applying a cone-beam algorithm, for example a modified tomographic reconstruction algorithm, to the corrected projection data sets to generate a 3-dimensional image. The size of the spot focus of the beam is reduced to not greater than approximately 1 micron, and even to not greater than approximately 0.5 microns.
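
    A sketch of the per-view correction step: deconvolving one 2-D projection with an experimentally determined transfer function. The patent specifies a deconvolution algorithm but not this particular one; the FFT-based regularized inverse filter and the constant eps are assumptions.

    ```python
    import numpy as np

    def deconvolve_projection(projection, psf, eps=1e-3):
        """Correct one 2-D projection with a measured transfer function.

        psf: point-spread function, same shape as projection, centered.
        eps: regularization constant (our choice, not from the patent).
        """
        P = np.fft.fft2(projection)
        H = np.fft.fft2(np.fft.ifftshift(psf))   # move PSF center to origin
        corrected = P * np.conj(H) / (np.abs(H) ** 2 + eps)
        return np.real(np.fft.ifft2(corrected))

    # Applied to the projection acquired at each rotation increment; the
    # corrected set then feeds the cone-beam reconstruction.
    ```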

  18. EPA DSSTox and ToxCast Project Updates: Generating New Data and Linkages in Support of Public Toxico-Cheminformatics Efforts

    EPA Science Inventory

    EPA’s National Center for Computational Toxicology is generating data and capabilities to support a new paradigm for toxicity screening and prediction. The DSSTox project is improving public access to quality structure-annotated chemical toxicity information in less summarized fo...

  19. Intelligent supercomputers: the Japanese computer sputnik

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Walter, G.

    1983-11-01

    Japan's government-supported fifth-generation computer project has had a pronounced effect on the American computer and information systems industry. The US firms are intensifying their research on and production of intelligent supercomputers, a combination of computer architecture and artificial intelligence software programs. While the present generation of computers is built for the processing of numbers, the new supercomputers will be designed specifically for the solution of symbolic problems and the use of artificial intelligence software. This article discusses new and exciting developments that will increase computer capabilities in the 1990s. 4 references.

  20. The Fifth Generation. An annotated bibliography

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bramer, M.; Bramer, D.

    The Japanese Fifth Generation Computer System project constitutes a radical reappraisal of the functions which an advanced computer system should be able to perform, the programming languages needed to implement such functions, and the machine architectures suitable for supporting the chosen languages. The book guides the reader through the ever-growing literature on the project, and the international responses, including the United Kingdom Government's Alvey Program and the MCC Program in the United States. Evaluative abstracts are given, including books, journal articles, unpublished reports and material at both overview and technical levels.

  1. Advanced Computer Image Generation Techniques Exploiting Perceptual Characteristics. Final Report.

    ERIC Educational Resources Information Center

    Stenger, Anthony J.; And Others

    This study suggests and identifies computer image generation (CIG) algorithms for visual simulation that improve the training effectiveness of CIG simulators and identifies areas of basic research in visual perception that are significant for improving CIG technology. The first phase of the project entailed observing three existing CIG simulators.…

  2. Compensation for Transport Delays Produced by Computer Image Generation Systems. Cooperative Training Series.

    ERIC Educational Resources Information Center

    Ricard, G. L.; And Others

    The cooperative Navy/Air Force project described is aimed at the problem of image-flutter encountered when visual displays that present computer generated images are used for the simulation of certain flying situations. Two experiments are described which extend laboratory work on delay compensation schemes to the simulation of formation flight in…

  3. Viewing-zone control of integral imaging display using a directional projection and elemental image resizing method.

    PubMed

    Alam, Md Ashraful; Piao, Mei-Lan; Bang, Le Thanh; Kim, Nam

    2013-10-01

    Viewing-zone control of integral imaging (II) displays using a directional projection and elemental image (EI) resizing method is proposed. Directional projection of EIs with the same size of microlens pitch causes an EI mismatch at the EI plane. In this method, EIs are generated computationally using a newly introduced algorithm: the directional elemental image generation and resizing algorithm considering the directional projection geometry of each pixel as well as an EI resizing method to prevent the EI mismatch. Generated EIs are projected as a collimated projection beam with a predefined directional angle, either horizontally or vertically. The proposed II display system allows reconstruction of a 3D image within a predefined viewing zone that is determined by the directional projection angle.

  4. Why Micros?

    ERIC Educational Resources Information Center

    Myers, Darlene

    1983-01-01

    Discusses development of the first computers; identifies five generations of computers; reviews projections for sales and proliferation of microcomputers into the marketplace; outlines procedures for selecting microcomputers and evaluating software; and reviews several articles on microcomputers centering on user requirements. (MBR)

  5. Holodeck: Telepresence Dome Visualization System Simulations

    NASA Technical Reports Server (NTRS)

    Hite, Nicolas

    2012-01-01

    This paper explores the simulation and consideration of different image-projection strategies for the Holodeck, a dome that will be used for highly immersive telepresence operations in future endeavors of the National Aeronautics and Space Administration (NASA). Its visualization system will include a full 360 degree projection onto the dome's interior walls in order to display video streams from both simulations and recorded video. Because humans innately trust their vision to precisely report their surroundings, the Holodeck's visualization system is crucial to its realism. This system will be rigged with an integrated hardware and software infrastructure, namely a system of projectors that will interface with a Graphics Processing Unit (GPU) and computer to both project images onto the dome and correct warping in those projections in real time. Using both Computer-Aided Design (CAD) and ray-tracing software, virtual models of various dome/projector geometries were created and simulated via tracking and analysis of virtual light sources, leading to the selection of two possible configurations for installation. Research into image warping and the generation of dome-ready video content was also conducted, including generation of fisheye images, distortion correction, and the development of a reliable content-generation pipeline.

  6. Computer tomography of the neurocranium.

    PubMed

    Liliequist, B; Forssell, A

    1976-07-01

    The experience with computer tomography of the neurocranium in 300 patients submitted for computer tomography of the brain is reported. The more appropriate projections which may be obtained with the second generation of scanners in combination with an elaborated reconstruction technique seem to constitute a replacement of conventional skull films.

  7. A computer based approach for Material, Manpower and Equipment managementin the Construction Projects

    NASA Astrophysics Data System (ADS)

    Sasidhar, Jaladanki; Muthu, D.; Venkatasubramanian, C.; Ramakrishnan, K.

    2017-07-01

    The success of any construction project depends on managing resources efficiently so that the project is completed within a reasonable budget and schedule without compromising quality. Inefficient or untimely procurement of materials, deployment of labor, and mobilization of machinery all cause delay, reduce quality, and ultimately raise the project cost. It is a known fact that project cost can be controlled by taking corrective action on resource mobilization at the right time. This research focuses on integrating management systems with the computer to generate a model that uses an OOM data structure and includes automatic commodity code generation, automatic takeoff execution, intelligent purchase order generation, and integrated design and schedule components to overcome stock-out problems. For equipment management, an inventory management module is suggested: a data set covering equipment registration number, equipment number, description, date of purchase, manufacturer, price, market value, equipment life, and production data (equipment number, date, job name, hourly rate, insurance, depreciation, taxes, storage cost, interest, and oil, grease, and fuel consumption) is analyzed, and decision support systems are generated to overcome problems arising from improper management. Labor problems are managed through scheduling and strategic management of human resources. The resulting decision-support tool mobilizes resources at the right time, helping the project manager finish the project on schedule and avoid abnormal project cost; the research also determines the percentage of delays caused by poor management of materials, manpower, and machinery in different types of projects and how that percentage varies from project to project.

  8. Microprocessor Control of Low Speed VSTOL Flight.

    DTIC Science & Technology

    1979-06-08

    [Glossary fragment: IAS, Indicated Air Speed; I/O, Input/Output; KIAS, Knots Indicated Air Speed; NATOPS, Naval Air Training and Operating Procedures Standardization; SAS...] ...computer programming necessary in the research, and contain, in the form of computer-generated time histories, the results of the project...of the aircraft causes airflow over the wings and therefore produces aerodynamic lift. As the transition progresses, wing-generated lift gradually

  9. Digital Poetry: A Narrow Relation between Poetics and the Codes of the Computational Logic

    NASA Astrophysics Data System (ADS)

    Laurentiz, Silvia

    The project "Percorrendo Escrituras" (Walking Through Writings Project) has been developed at ECA-USP Fine Arts Department. Summarizing, it intends to study different structures of digital information that share the same universe and are generators of a new aesthetics condition. The aim is to search which are the expressive possibilities of the computer among the algorithm functions and other of its specific properties. It is a practical, theoretical and interdisciplinary project where the study of programming evolutionary language, logic and mathematics take us to poetic experimentations. The focus of this research is the digital poetry, and it comes from poetics of permutation combinations and culminates with dynamic and complex systems, autonomous, multi-user and interactive, through agents generation derivations, filtration and emergent standards. This lecture will present artworks that use some mechanisms introduced by cybernetics and the notion of system in digital poetry that demonstrate the narrow relationship between poetics and the codes of computational logic.

  10. A Project to Computerize Performance Objectives and Criterion-Referenced Measures in Occupational Education for Research and Determination of Applicability to Handicapped Learners. Final Report.

    ERIC Educational Resources Information Center

    Lee, Connie W.; Hinson, Tony M.

    This publication is the final report of a 21-month project designed to (1) expand and refine the computer capabilities of the Vocational-Technical Education Consortium of States (V-TECS) to ensure rapid data access for generating routine and special occupational data-based reports; (2) develop and implement a computer storage and retrieval system…

  11. Integrated geometry and grid generation system for complex configurations

    NASA Technical Reports Server (NTRS)

    Akdag, Vedat; Wulf, Armin

    1992-01-01

    A grid generation system was developed that enables grid generation for complex configurations. The system called ICEM/CFD is described and its role in computational fluid dynamics (CFD) applications is presented. The capabilities of the system include full computer aided design (CAD), grid generation on the actual CAD geometry definition using robust surface projection algorithms, interfacing easily with known CAD packages through common file formats for geometry transfer, grid quality evaluation of the volume grid, coupling boundary condition set-up for block faces with grid topology generation, multi-block grid generation with or without point continuity and block to block interface requirement, and generating grid files directly compatible with known flow solvers. The interactive and integrated approach to the problem of computational grid generation not only substantially reduces manpower time but also increases the flexibility of later grid modifications and enhancements which is required in an environment where CFD is integrated into a product design cycle.

  12. Future trends in computer waste generation in India.

    PubMed

    Dwivedy, Maheshwar; Mittal, R K

    2010-11-01

    The objective of this paper is to estimate the future projection of computer waste in India and to subsequently analyze its flow at the end of the useful phase. For this purpose, the study utilizes the logistic model-based approach proposed by Yang and Williams to forecast future trends in computer waste. The model estimates the future computer penetration rate using the first-lifespan distribution and historical sales data. A bounding analysis of the future carrying capacity was simulated using the three-parameter logistic curve. The obsolete generation quantities observed from the extrapolated penetration rates are then used to model the disposal phase. The results of the bounding analysis indicate that in the year 2020, around 41-152 million units of computers will become obsolete. The obsolete computer generation quantities are then used to estimate the end-of-life outflows using a time-series multiple-lifespan model. Even a conservative estimate puts the future recycling capacity for PCs upwards of 30 million units during 2025; more than 150 million units could potentially be recycled in the upper bound case. However, considering significant future investment in the e-waste recycling sector from all stakeholders in India, we propose a logistic growth in the recycling rate and estimate the required recycling capacity between 60 and 400 million units for the lower and upper bound cases during 2025. Finally, we compare the future obsolete PC generation amounts of the US and India.
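
    A sketch of the three-parameter logistic penetration model the study builds on. K, r, t0, and the fixed five-year lifespan are illustrative assumptions; the paper fits the parameters and uses a full lifespan distribution rather than a single lag.

    ```python
    import numpy as np

    def logistic(t, K, r, t0):
        """Three-parameter logistic curve: units in use at time t."""
        return K / (1.0 + np.exp(-r * (t - t0)))

    years = np.arange(1995, 2026)
    in_use = logistic(years, K=250e6, r=0.25, t0=2017)

    # Crude obsolescence proxy: units that entered use `lifespan` years ago
    # are assumed retired, so cumulative obsolete units by year t track the
    # in-use stock at t - lifespan.
    lifespan = 5
    obsolete = logistic(years - lifespan, K=250e6, r=0.25, t0=2017)
    print(obsolete[years == 2020])   # rough obsolete-stock estimate for 2020
    ```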

  13. Supplemental Tables to the Annual Energy Outlook

    EIA Publications

    2017-01-01

    The Annual Energy Outlook (AEO) Supplemental tables were generated for the reference case of the AEO using the National Energy Modeling System, a computer-based model which produces annual projections of energy markets. Most of the tables were not published in the AEO, but contain regional and other more detailed projections underlying the AEO projections.

  14. Intelligent Computer-Assisted Instruction: A Review and Assessment of ICAI Research and Its Potential for Education.

    ERIC Educational Resources Information Center

    Dede, Christopher J.; And Others

    The first of five sections in this report places intelligent computer-assisted instruction (ICAI) in its historical context through discussions of traditional computer-assisted instruction (CAI) linear and branching programs; TICCIT and PLATO IV, two CAI demonstration projects funded by the National Science Foundation; generative programs, the…

  15. Volunteer Computing Experience with ATLAS@Home

    NASA Astrophysics Data System (ADS)

    Adam-Bourdarios, C.; Bianchi, R.; Cameron, D.; Filipčič, A.; Isacchini, G.; Lançon, E.; Wu, W.; ATLAS Collaboration

    2017-10-01

    ATLAS@Home is a volunteer computing project which allows the public to contribute to computing for the ATLAS experiment through their home or office computers. The project has grown continuously since its creation in mid-2014 and now counts almost 100,000 volunteers. The combined volunteers’ resources make up a sizeable fraction of overall resources for ATLAS simulation. This paper takes stock of the experience gained so far and describes the next steps in the evolution of the project. These improvements include running natively on Linux to ease deployment on, for example, university clusters; using multiple cores inside one task to reduce memory requirements; and running different types of workload, such as event generation. In addition to technical details, the success of ATLAS@Home as an outreach tool is evaluated.

  16. ART/Ada design project, phase 1. Task 3 report: Test plan

    NASA Technical Reports Server (NTRS)

    Allen, Bradley P.

    1988-01-01

    The plan is described for the integrated testing and benchmarking of the Phase 1 Ada-based ESBT Design Research Project. The integration testing is divided into two phases: (1) the modules that do not rely on the Ada code generated by the Ada Generator are tested before the Ada Generator is implemented; and (2) all modules are integrated and tested with the Ada code generated by the Ada Generator. Its performance and size, as well as its functionality, are verified in this phase. The target platform is a DEC Ada compiler on VAX minicomputers and VAX stations running the VMS operating system.

  17. A computer program for the generation of logic networks from task chart data

    NASA Technical Reports Server (NTRS)

    Herbert, H. E.

    1980-01-01

    The Network Generation Program (NETGEN), which creates logic networks from task chart data, is presented. NETGEN is written in CDC FORTRAN IV (Extended) and runs in a batch mode on the CDC 6000 and CYBER 170 series computers. Data is input via a two-card format and contains information regarding the specific tasks in a project. From this data, NETGEN constructs a logic network of related activities, with each activity having unique predecessor and successor nodes, activity duration, descriptions, etc. NETGEN then prepares this data on two files that can be used in the Project Planning Analysis and Reporting System Batch Network Scheduling program and the EZPERT graphics program.
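
    A sketch of the kind of activity network NETGEN constructs; the field names and record layout are illustrative, not NETGEN's actual two-card format.

    ```python
    from dataclasses import dataclass

    @dataclass
    class Activity:
        name: str
        duration: int      # e.g. working days
        predecessor: int   # unique predecessor node
        successor: int     # unique successor node
        description: str = ""

    tasks = [
        Activity("design", 10, predecessor=1, successor=2),
        Activity("build",  20, predecessor=2, successor=3),
        Activity("test",    5, predecessor=3, successor=4),
    ]

    # A downstream scheduler (PERT/CPM-style) consumes this network; along
    # a single chain the earliest finish is just the sum of durations.
    print(sum(t.duration for t in tasks))   # 35
    ```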

  18. 36 CFR Appendix A to Part 1234 - Minimum Security Standards for Level III Federal Facilities

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... technology and blast standards. Immediate review of ongoing projects may generate savings in the... critical systems (alarm systems, radio communications, computer facilities, etc.) Required. Occupant... all exterior windows (shatter protection) Recommended. Review current projects for blast standards...

  19. 36 CFR Appendix A to Part 1234 - Minimum Security Standards for Level III Federal Facilities

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... technology and blast standards. Immediate review of ongoing projects may generate savings in the... critical systems (alarm systems, radio communications, computer facilities, etc.) Required. Occupant... all exterior windows (shatter protection) Recommended. Review current projects for blast standards...

  20. Multisensor fusion for 3D target tracking using track-before-detect particle filter

    NASA Astrophysics Data System (ADS)

    Moshtagh, Nima; Romberg, Paul M.; Chan, Moses W.

    2015-05-01

    This work presents a novel fusion mechanism for estimating the three-dimensional trajectory of a moving target using images collected by multiple imaging sensors. The proposed projective particle filter avoids explicit target detection prior to fusion. In the projective particle filter, particles that represent the posterior density (of the target state in a high-dimensional space) are projected onto the lower-dimensional observation space. Measurements are generated directly in the observation space (image plane) and a marginal (sensor) likelihood is computed. The particles' states and their weights are updated using the joint likelihood computed from all the sensors. The 3D state estimate of the target (system track) is then generated from the states of the particles. This approach is similar to track-before-detect particle filters that are known to perform well in tracking dim and stealthy targets in image collections. Our approach extends the track-before-detect approach to 3D tracking using the projective particle filter. The performance of this measurement-level fusion method is compared with that of a track-level fusion algorithm using the projective particle filter. In the track-level fusion algorithm, the 2D sensor tracks are generated separately and transmitted to a fusion center, where they are treated as measurements to the state estimator. The 2D sensor tracks are then fused to reconstruct the system track. A realistic synthetic scenario with a boosting target was generated and used to study the performance of the fusion mechanisms.
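
    A sketch of the projective particle filter's weight update: particles in 3-D state space are projected onto each sensor's image plane and reweighted by the joint likelihood over all sensors. The camera matrices, intensity-based likelihood, and sizes are illustrative assumptions, not the paper's implementation.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    n_particles = 500
    particles = rng.normal(size=(n_particles, 3))     # 3-D target positions
    weights = np.full(n_particles, 1.0 / n_particles)

    def project(P, x):
        """Project 3-D points x (N,3) to pixels with a 3x4 camera matrix P."""
        xh = np.hstack([x, np.ones((len(x), 1))])
        uvw = xh @ P.T
        return uvw[:, :2] / uvw[:, 2:3]

    def sensor_likelihood(image, pixels, sigma=1.0):
        """Marginal likelihood: sample the frame intensity at projected pixels."""
        u = np.clip(pixels[:, 0].astype(int), 0, image.shape[1] - 1)
        v = np.clip(pixels[:, 1].astype(int), 0, image.shape[0] - 1)
        return np.exp(image[v, u] / sigma)             # brighter pixel, likelier

    cameras = [np.hstack([np.eye(3), np.array([[10.0], [10.0], [5.0]])])
               for _ in range(2)]                      # two stand-in sensors
    images = [rng.random((32, 32)) for _ in cameras]   # stand-in focal planes

    for P, img in zip(cameras, images):                # joint likelihood update
        weights *= sensor_likelihood(img, project(P, particles))
    weights /= weights.sum()

    estimate = weights @ particles                     # 3-D system-track estimate
    ```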

  1. Overview of the Cranked-Arrow Wing Aerodynamics Project International

    NASA Technical Reports Server (NTRS)

    Obara, Clifford J.; Lamar, John E.

    2008-01-01

    This paper provides a brief history of the F-16XL-1 aircraft, its role in the High Speed Research program and how it was morphed into the Cranked Arrow Wing Aerodynamics Project. Various flight, wind-tunnel and Computational Fluid Dynamics data sets were generated as part of the project. These unique and open flight datasets for surface pressures, boundary-layer profiles and skin-friction distributions, along with surface flow data, are described and sample data comparisons given. This is followed by a description of how the project became internationalized to be known as Cranked Arrow Wing Aerodynamics Project International and is concluded by an introduction to the results of a four year computational predictive study of data collected at flight conditions by participating researchers.

  2. Computer Aided Grid Interface: An Interactive CFD Pre-Processor

    NASA Technical Reports Server (NTRS)

    Soni, Bharat K.

    1997-01-01

    NASA maintains applications-oriented computational fluid dynamics (CFD) efforts complementary to and in support of aerodynamic-propulsion design and test activities. This is especially true at NASA/MSFC, where the goal is to advance and optimize present and future liquid-fueled rocket engines. Numerical grid generation plays a significant role in fluid flow simulations utilizing CFD. An overall goal of the current project was to develop a geometry-grid generation tool that will help engineers, scientists, and CFD practitioners analyze design problems involving complex geometries in a timely fashion. This goal is accomplished by developing the CAGI: Computer Aided Grid Interface system. The CAGI system is developed by integrating CAD/CAM (Computer Aided Design/Computer Aided Manufacturing) geometric system output and/or Initial Graphics Exchange Specification (IGES) files (including all the NASA-IGES entities), geometry manipulations and generations associated with grid constructions, and robust grid generation methodologies. This report describes the development process of the CAGI system.

  3. Computer Aided Grid Interface: An Interactive CFD Pre-Processor

    NASA Technical Reports Server (NTRS)

    Soni, Bharat K.

    1996-01-01

    NASA maintains applications-oriented computational fluid dynamics (CFD) efforts complementary to and in support of aerodynamic-propulsion design and test activities. This is especially true at NASA/MSFC, where the goal is to advance and optimize present and future liquid-fueled rocket engines. Numerical grid generation plays a significant role in fluid flow simulations utilizing CFD. An overall goal of the current project was to develop a geometry-grid generation tool that will help engineers, scientists, and CFD practitioners analyze design problems involving complex geometries in a timely fashion. This goal is accomplished by developing the Computer Aided Grid Interface system (CAGI). The CAGI system is developed by integrating CAD/CAM (Computer Aided Design/Computer Aided Manufacturing) geometric system output and/or Initial Graphics Exchange Specification (IGES) files (including all the NASA-IGES entities), geometry manipulations and generations associated with grid constructions, and robust grid generation methodologies. This report describes the development process of the CAGI system.

  4. DEMONSTRATION OF THE ANALYTIC ELEMENT METHOD FOR WELLHEAD PROTECTION - PROJECT SUMMARY

    EPA Science Inventory

    A new computer program has been developed to determine time-of-travel capture zones in relatively simple geohydrological settings. The WhAEM package contains an analytic element model that uses superposition of (many) closed form analytical solutions to generate a ground-water fl...

  5. Fast polyenergetic forward projection for image formation using OpenCL on a heterogeneous parallel computing platform.

    PubMed

    Zhou, Lili; Clifford Chao, K S; Chang, Jenghwa

    2012-11-01

    Simulated projection images of digital phantoms constructed from CT scans have been widely used for clinical and research applications, but their quality and computation speed are not optimal for real-time comparison with radiography acquired with an x-ray source of different energies. In this paper, the authors performed polyenergetic forward projections using the open computing language (OpenCL) in a parallel computing ecosystem consisting of a CPU and a general purpose graphics processing unit (GPGPU) for fast and realistic image formation. The proposed polyenergetic forward projection uses a lookup table containing the NIST-published mass attenuation coefficients (μ/ρ) for different tissue types and photon energies ranging from 1 keV to 20 MeV. The CT images of the sites of interest are first segmented into different tissue types based on the CT numbers and converted to a three-dimensional attenuation phantom by linking each voxel to the corresponding tissue type in the lookup table. The x-ray source can be a radioisotope or an x-ray generator with a known spectrum described as weight w(n) for energy bin E(n). The Siddon method is used to compute the x-ray transmission line integral for E(n), and the x-ray fluence is the weighted sum of the exponential of the line integral over all energy bins, with added Poisson noise. To validate this method, a digital head and neck phantom constructed from the CT scan of a Rando head phantom was segmented into three regions (air, gray/white matter, and bone) for calculating the polyenergetic projection images for the Mohan 4 MV energy spectrum. To accelerate the calculation, the authors partitioned the workloads using task parallelism and data parallelism and scheduled them in a parallel computing ecosystem consisting of a CPU and a GPGPU (NVIDIA Tesla C2050) using OpenCL only. The authors explored a task-overlapping strategy and a sequential method for generating the first and subsequent digitally reconstructed radiographs (DRRs). A dispatcher was designed to drive the high degree of parallelism of the task-overlapping strategy. Numerical experiments were conducted to compare the performance of the OpenCL/GPGPU-based implementation with the CPU-based implementation. The projection images were similar to typical portal images obtained with a 4 or 6 MV x-ray source. For a phantom size of 512 × 512 × 223, the time for calculating the line integrals for a 512 × 512 image panel was 16.2 ms on the GPGPU for one energy bin, compared to 8.83 s on the CPU. The total computation time for generating one polyenergetic projection image of 512 × 512 was 0.3 s (141 s on the CPU). The relative difference between the projection images obtained with the CPU-based and OpenCL/GPGPU-based implementations was on the order of 10^-6 and was virtually indistinguishable. The task-overlapping strategy was 5.84 and 1.16 times faster than the sequential method for the first and subsequent DRRs, respectively. The authors have successfully built digital phantoms using anatomic CT images and NIST μ/ρ tables for simulating realistic polyenergetic projection images, and optimized the processing speed with parallel computing using a GPGPU/OpenCL-based implementation. The computation time (0.3 s per projection image) is fast enough for real-time IGRT (image-guided radiotherapy) applications.
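
    A sketch of the polyenergetic fluence model stated above: a weighted sum over energy bins of the exponential of the line integral, with Poisson noise. The spectrum, attenuation table, and path lengths below are made-up stand-ins for the NIST μ/ρ values and Siddon ray-traced lengths.

    ```python
    import numpy as np

    rng = np.random.default_rng(2)

    energies = np.array([0.5, 1.0, 2.0, 4.0])    # MeV bin centers E(n)
    spectrum = np.array([0.4, 0.3, 0.2, 0.1])    # weights w(n), summing to 1

    # mu[tissue, energy] in 1/cm for (air, soft tissue, bone) -- made-up values
    mu = np.array([[1e-4, 8e-5, 6e-5, 5e-5],
                   [0.22, 0.07, 0.049, 0.034],
                   [0.40, 0.12, 0.08, 0.056]])

    # Per-ray intersection length (cm) with each tissue, e.g. from Siddon tracing
    path_cm = np.array([5.0, 18.0, 3.0])

    line_integrals = path_cm @ mu                    # one value per energy bin
    fluence = spectrum @ np.exp(-line_integrals)     # relative transmitted fluence
    counts = rng.poisson(fluence * 1e5)              # noisy detector signal
    print(fluence, counts)
    ```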

  6. Modern design of a fast front-end computer

    NASA Astrophysics Data System (ADS)

    Šoštarić, Z.; Aničić, D.; Sekolec, L.; Su, J.

    1994-12-01

    Front-end computers (FEC) at Paul Scherrer Institut provide access to accelerator CAMAC-based sensors and actuators by way of a local area network. In the scope of the new generation FEC project, a front-end is regarded as a collection of services. The functionality of one such service is described in terms of Yourdon's environment, behaviour, processor and task models. The computational model (software representation of the environment) of the service is defined separately, using the information model of the Shlaer-Mellor method, and the Sather OO language. In parallel with the analysis, and later with the design, a suite of test programmes was developed to evaluate the feasibility of different computing platforms for the project, and a set of rapid prototypes was produced to resolve different implementation issues. The past and future aspects of the project and its driving forces are presented. Justification of the choice of methodology, platform and requirements is given. We conclude with a description of the present state, priorities and limitations of our project.

  7. Intelligent Control of a Sensor-Actuator System via Kernelized Least-Squares Policy Iteration

    PubMed Central

    Liu, Bo; Chen, Sanfeng; Li, Shuai; Liang, Yongsheng

    2012-01-01

    In this paper a new framework, called Compressive Kernelized Reinforcement Learning (CKRL), for computing near-optimal policies in sequential decision making under uncertainty is proposed, incorporating non-adaptive, data-independent Random Projections and nonparametric Kernelized Least-Squares Policy Iteration (KLSPI). Random Projections are a fast, non-adaptive dimensionality reduction framework in which high-dimensional data is projected onto a random lower-dimensional subspace via spherically random rotation and coordination sampling. KLSPI introduces the kernel trick into the LSPI framework for Reinforcement Learning, often achieving faster convergence and providing automatic feature selection via various kernel sparsification approaches. In this approach, policies are computed in a low-dimensional subspace generated by projecting the high-dimensional features onto a set of random bases. We first show how Random Projections constitute an efficient sparsification technique and how our method often converges faster than regular LSPI, at lower computational cost. The theoretical foundation underlying this approach is a fast approximation of Singular Value Decomposition (SVD). Finally, simulation results are exhibited on benchmark MDP domains, which confirm gains both in computation time and in performance in large feature spaces. PMID:22736969
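
    A sketch of the Random Projections step described above: a Gaussian random matrix maps high-dimensional features to a random low-dimensional subspace before policy evaluation. The dimensions are illustrative, not from the paper.

    ```python
    import numpy as np

    rng = np.random.default_rng(3)
    d_high, d_low, n_samples = 10_000, 50, 200

    features = rng.normal(size=(n_samples, d_high))    # raw state features

    # Gaussian random projection; the 1/sqrt(d_low) scaling approximately
    # preserves pairwise distances (Johnson-Lindenstrauss).
    R = rng.normal(size=(d_high, d_low)) / np.sqrt(d_low)
    low_dim = features @ R                             # shape (200, 50)

    # Policy evaluation (e.g. LSPI's least-squares fit) then runs on
    # low_dim instead of features, at much lower cost.
    ```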

  8. GEODE An electrical energy supply with high availability

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mertz, J.L.; Gerard, M.J.; Girard, J.

    1983-10-01

    Project GEODE describes an electrical energy supply characterized by its very high availability. It is to be used in the PTT (French telephone company) telephone exchanges and is targeted for an unavailability of better than 10^-6. In order to achieve this performance Merlin Gerin has adopted: a double bus bar architecture, remote controlled electrical equipment, a motor-generator set specifically designed for this project, and computer assisted surveillance. The authors present the overall reliability calculations for this project along with those for the energy sources: the E.d.F (French utility company) network and the motor-generators.

  9. EPA Project Updates: DSSTox and ToxCast Generating New Data and Data Linkages for Use in Predictive Modeling

    EPA Science Inventory

    EPA's National Center for Computational Toxicology is building capabilities to support a new paradigm for toxicity screening and prediction. The DSSTox project is improving public access to quality structure-annotated chemical toxicity information in less summarized forms than tr...

  10. Systems and Software Producibility Collaboration and Experimental Environment (SPRUCE)

    DTIC Science & Technology

    2009-04-23

    [Briefing slide fragments:] Manhattan Project-like research – transition timeframe needed • Current generation programs – DoD acquisitions over next 1-5 years • Next...Specific Computing Plant • Basic / transformational research • Sponsored by Lockheed Martin

  11. Update on the Culicoides sonorensis transcriptome project: a peek into the molecular biology of the midge

    USDA-ARS?s Scientific Manuscript database

    Next Generation Sequencing is transforming the way scientists collect and measure an organism’s genetic background and gene dynamics, while bioinformatics and super-computing are merging to facilitate parallel sample computation and interpretation at unprecedented speeds. Analyzing the complete gene...

  12. Grammar as a Programming Language. Artificial Intelligence Memo 391.

    ERIC Educational Resources Information Center

    Rowe, Neil

    Student projects that involve writing generative grammars in the computer language, "LOGO," are described in this paper, which presents a grammar-running control structure that allows students to modify and improve the grammar interpreter itself while learning how a simple kind of computer parser works. Included are procedures for…

  13. Lessons Learned on Management of CAS Development.

    ERIC Educational Resources Information Center

    Boyadjieff, Kiril

    1995-01-01

    Computer-assisted studies (CAS) attract foreign language professionals' attention due to the reliability of personal computers, the decreasing cost of available technology, and the new generation of students for whom electronic media are a familiar habitat. This article focuses on a project of the Defense Language Institute that produced over…

  14. Tracking the NGS revolution: managing life science research on shared high-performance computing clusters.

    PubMed

    Dahlö, Martin; Scofield, Douglas G; Schaal, Wesley; Spjuth, Ola

    2018-05-01

    Next-generation sequencing (NGS) has transformed the life sciences, and many research groups are newly dependent upon computer clusters to store and analyze large datasets. This creates challenges for e-infrastructures accustomed to hosting computationally mature research in other sciences. Using data gathered from our own clusters at UPPMAX computing center at Uppsala University, Sweden, where core hour usage of ∼800 NGS and ∼200 non-NGS projects is now similar, we compare and contrast the growth, administrative burden, and cluster usage of NGS projects with projects from other sciences. The number of NGS projects has grown rapidly since 2010, with growth driven by entry of new research groups. Storage used by NGS projects has grown more rapidly since 2013 and is now limited by disk capacity. NGS users submit nearly twice as many support tickets per user, and 11 more tools are installed each month for NGS projects than for non-NGS projects. We developed usage and efficiency metrics and show that computing jobs for NGS projects use more RAM than non-NGS projects, are more variable in core usage, and rarely span multiple nodes. NGS jobs use booked resources less efficiently for a variety of reasons. Active monitoring can improve this somewhat. Hosting NGS projects imposes a large administrative burden at UPPMAX due to large numbers of inexperienced users and diverse and rapidly evolving research areas. We provide a set of recommendations for e-infrastructures that host NGS research projects. We provide anonymized versions of our storage, job, and efficiency databases.

  15. Tracking the NGS revolution: managing life science research on shared high-performance computing clusters

    PubMed Central

    2018-01-01

    Background: Next-generation sequencing (NGS) has transformed the life sciences, and many research groups are newly dependent upon computer clusters to store and analyze large datasets. This creates challenges for e-infrastructures accustomed to hosting computationally mature research in other sciences. Using data gathered from our own clusters at UPPMAX computing center at Uppsala University, Sweden, where core hour usage of ∼800 NGS and ∼200 non-NGS projects is now similar, we compare and contrast the growth, administrative burden, and cluster usage of NGS projects with projects from other sciences. Results: The number of NGS projects has grown rapidly since 2010, with growth driven by entry of new research groups. Storage used by NGS projects has grown more rapidly since 2013 and is now limited by disk capacity. NGS users submit nearly twice as many support tickets per user, and 11 more tools are installed each month for NGS projects than for non-NGS projects. We developed usage and efficiency metrics and show that computing jobs for NGS projects use more RAM than non-NGS projects, are more variable in core usage, and rarely span multiple nodes. NGS jobs use booked resources less efficiently for a variety of reasons. Active monitoring can improve this somewhat. Conclusions: Hosting NGS projects imposes a large administrative burden at UPPMAX due to large numbers of inexperienced users and diverse and rapidly evolving research areas. We provide a set of recommendations for e-infrastructures that host NGS research projects. We provide anonymized versions of our storage, job, and efficiency databases. PMID:29659792

  16. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Walker, Andrew; Lawrence, Earl

    The Response Surface Modeling (RSM) Tool Suite is a collection of three codes used to generate an empirical interpolation function for a collection of drag coefficient calculations computed with Test Particle Monte Carlo (TPMC) simulations. The first code, "Automated RSM", automates the generation of a drag coefficient RSM for a particular object to a single command. "Automated RSM" first creates a Latin Hypercube Sample (LHS) of 1,000 ensemble members to explore the global parameter space. For each ensemble member, a TPMC simulation is performed and the object drag coefficient is computed. In the next step of the "Automated RSM" code, a Gaussian process is used to fit the TPMC simulations. In the final step, Markov Chain Monte Carlo (MCMC) is used to evaluate the non-analytic probability distribution function from the Gaussian process. The second code, "RSM Area", creates a look-up table for the projected area of the object based on input limits on the minimum and maximum allowed pitch and yaw angles and pitch and yaw angle intervals. The projected area from the look-up table is used to compute the ballistic coefficient of the object based on its pitch and yaw angle. An accurate ballistic coefficient is crucial in accurately computing the drag on an object. The third code, "RSM Cd", uses the RSM generated by the "Automated RSM" code and the projected area look-up table generated by the "RSM Area" code to accurately compute the drag coefficient and ballistic coefficient of the object. The user can modify the object velocity, object surface temperature, the translational temperature of the gas, the species concentrations of the gas, and the pitch and yaw angles of the object. Together, these codes allow for the accurate derivation of an object's drag coefficient and ballistic coefficient under any conditions with only knowledge of the object's geometry and mass.
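
    A sketch of the "Automated RSM" pipeline under stated assumptions: fake_tpmc stands in for the expensive TPMC drag calculation, the two input parameters and their bounds are invented, and scikit-learn's Gaussian process replaces whatever GP implementation the suite actually uses.

    ```python
    import numpy as np
    from scipy.stats import qmc
    from sklearn.gaussian_process import GaussianProcessRegressor

    def fake_tpmc(x):
        """Placeholder for a Test Particle Monte Carlo drag calculation."""
        velocity, surface_temp = x
        return 2.2 + 0.3 * np.sin(velocity) + 0.1 * surface_temp

    sampler = qmc.LatinHypercube(d=2, seed=0)
    unit = sampler.random(n=1000)                        # 1,000 ensemble members
    lo, hi = np.array([7.0, 0.0]), np.array([8.0, 1.0])  # assumed parameter bounds
    X = qmc.scale(unit, lo, hi)
    y = np.array([fake_tpmc(x) for x in X])              # drag coefficient per run

    surrogate = GaussianProcessRegressor().fit(X, y)     # the response surface
    cd, sd = surrogate.predict([[7.5, 0.4]], return_std=True)
    print(cd, sd)   # interpolated drag coefficient and its uncertainty
    ```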

  17. Computing Properties of Hadrons, Nuclei and Nuclear Matter from Quantum Chromodynamics (LQCD)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Negele, John W.

    Building on the success of two preceding generations of Scientific Discovery through Advanced Computing (SciDAC) projects, this grant supported the MIT component (P.I. John Negele) of a multi-institutional SciDAC-3 project that also included Brookhaven National Laboratory, the lead laboratory with P.I. Frithjof Karsch serving as Project Director, Thomas Jefferson National Accelerator Facility with P.I. David Richards serving as Co-director, University of Washington with P.I. Martin Savage, University of North Carolina with P.I. Rob Fowler, and College of William and Mary with P.I. Andreas Stathopoulos. Nationally, this multi-institutional project coordinated the software development effort that the nuclear physics lattice QCD community needs to ensure that lattice calculations can make optimal use of forthcoming leadership-class and dedicated hardware, including that at the national laboratories, and to exploit future computational resources in the Exascale era.

  18. Computational modeling of Radioisotope Thermoelectric Generators (RTG) for interplanetary and deep space travel

    NASA Astrophysics Data System (ADS)

    Nejat, Cyrus; Nejat, Narsis; Nejat, Najmeh

    2014-06-01

    This research project is part of Narsis Nejat's Master of Science thesis, conducted at Shiraz University. The goals of this research are to build a computer model to evaluate the thermal power, electrical power, emitted/absorbed dose, and emitted/absorbed dose rate for static Radioisotope Thermoelectric Generators (RTGs). This includes a comprehensive study of the types of RTG systems, and in particular RTG fuels derived from both natural and artificial isotopes; calculation of the permissible dose for the radioisotopes selected from the above; and conceptual design modeling and comparison between several NASA-made RTGs and the project computer model, pointing out the strong and weak points of using this model for simulation in the nuclear industry. Heat is converted to electricity in RTGs by two major methods: static conversion and dynamic conversion. The model created for this project is for RTGs in which heat is converted to electricity statically. The model gives good approximate results when compared with the SNAP-3, SNAP-19, MHW, and GPHS RTGs in terms of electrical power, efficiency, specific power, type of mission, and the amount of fuel mass required to accomplish the mission.

  19. 9 CFR 381.132 - Labeling approval.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... drawn, computer generated or other reasonable facsimiles that clearly reflect and project the final...) (2)-(9). The management of the official establishment or establishment certified under a foreign...

  20. Gravitation, Symmetry and Undergraduates

    NASA Astrophysics Data System (ADS)

    Jorgensen, Jamie

    2001-04-01

    This talk will discuss "Project Petrov", which is designed to investigate gravitational fields with symmetry. Project Petrov represents a collaboration involving physicists and mathematicians as well as graduate and undergraduate math and physics students. An overview of Project Petrov will be given, with an emphasis on students' contributions, including software to classify and generate Lie algebras, to classify isometry groups, and to compute the isometry group of a given metric.

  1. Experimental Investigation of Project Orion Crew Exploration Vehicle Aeroheating in AEDC Tunnel 9

    NASA Technical Reports Server (NTRS)

    Hollis, Brian R.; Horvath, Thomas J.; Berger, Karen T.; Lillard, Randolph P.; Kirk, Benjamin S.; Coblish, Joseph J.; Norris, Joseph D.

    2008-01-01

    An investigation of the aeroheating environment of the Project Orion Crew Entry Vehicle has been performed in the Arnold Engineering Development Center Tunnel 9. The goals of this test were to measure turbulent heating augmentation levels on the heat shield and to obtain high-fidelity heating data for assessment of computational fluid dynamics methods. Laminar and turbulent predictions were generated for all wind tunnel test conditions and comparisons were performed with the data for the purpose of helping to define uncertainty margins for the computational method. Data from both the wind tunnel test and the computational study are presented herein.

  2. 2nd Generation QUATARA Flight Computer Project

    NASA Technical Reports Server (NTRS)

    Falker, Jay; Keys, Andrew; Fraticelli, Jose Molina; Capo-Iugo, Pedro; Peeples, Steven

    2015-01-01

    Single core flight computer boards have been designed, developed, and tested (DD&T) to be flown in small satellites for the last few years. In this project, a prototype flight computer will be designed as a distributed multi-core system containing four microprocessors running code in parallel. This flight computer will be capable of performing multiple computationally intensive tasks such as processing digital and/or analog data, controlling actuator systems, managing cameras, operating robotic manipulators and transmitting/receiving from/to a ground station. In addition, this flight computer will be designed to be fault tolerant by creating both a robust physical hardware connection and by using a software voting scheme to determine each processor's performance. This voting scheme will leverage the work done for the Space Launch System (SLS) flight software. The prototype flight computer will be constructed with Commercial Off-The-Shelf (COTS) components which are estimated to survive for two years in a low-Earth orbit.
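
    As an illustration of the kind of software voting scheme described here (not the actual SLS implementation), a majority vote over redundant processor outputs might look like the following sketch.

```python
# A minimal sketch of majority voting across redundant processors, assuming
# four processors return candidate outputs for the same task.
from collections import Counter

def majority_vote(outputs):
    """Return the value agreed on by a majority of processors,
    or None if no majority exists (signalling a fault condition)."""
    value, count = Counter(outputs).most_common(1)[0]
    return value if count > len(outputs) // 2 else None

# Example: processor 4 disagrees (e.g., a radiation-induced bit flip);
# the voter still returns the majority result.
readings = [0x3FA2, 0x3FA2, 0x3FA2, 0x3FB2]
print(hex(majority_vote(readings)))  # 0x3fa2
```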

  3. Funder Report on Decision Support Systems Project Dissemination Activities, Fiscal Year 1985.

    ERIC Educational Resources Information Center

    Tetlow, William L.

    Dissemination activities for the Decision Support Systems (DSS) for fiscal year (FY) 1985 are reported by the National Center for Higher Education Management Systems (NCHEMS). The main means for disseminating results of the DSS research and development project has been through computer-generated video presentations at meetings of higher education…

  4. An efficient and scalable analysis framework for variant extraction and refinement from population-scale DNA sequence data.

    PubMed

    Jun, Goo; Wing, Mary Kate; Abecasis, Gonçalo R; Kang, Hyun Min

    2015-06-01

    The analysis of next-generation sequencing data is computationally and statistically challenging because of the massive volume of data and imperfect data quality. We present GotCloud, a pipeline for efficiently detecting and genotyping high-quality variants from large-scale sequencing data. GotCloud automates sequence alignment, sample-level quality control, variant calling, filtering of likely artifacts using machine-learning techniques, and genotype refinement using haplotype information. The pipeline can process thousands of samples in parallel and requires less computational resources than current alternatives. Experiments with whole-genome and exome-targeted sequence data generated by the 1000 Genomes Project show that the pipeline provides effective filtering against false positive variants and high power to detect true variants. Our pipeline has already contributed to variant detection and genotyping in several large-scale sequencing projects, including the 1000 Genomes Project and the NHLBI Exome Sequencing Project. We hope it will now prove useful to many medical sequencing studies. © 2015 Jun et al.; Published by Cold Spring Harbor Laboratory Press.

  5. Programming Pluralism: Using Learning Analytics to Detect Patterns in the Learning of Computer Programming

    ERIC Educational Resources Information Center

    Blikstein, Paulo; Worsley, Marcelo; Piech, Chris; Sahami, Mehran; Cooper, Steven; Koller, Daphne

    2014-01-01

    New high-frequency, automated data collection and analysis algorithms could offer new insights into complex learning processes, especially for tasks in which students have opportunities to generate unique open-ended artifacts such as computer programs. These approaches should be particularly useful because the need for scalable project-based and…

  6. Microform Catalogs: A Viable Alternative for Texas Libraries.

    ERIC Educational Resources Information Center

    Cox, Carolyn, M.; Juergens, Bonnie

    This project proposed to develop and test the use of microform catalogs produced from computer-generated magnetic tape records in both fiche and film formats. The Computer Output Microform (COM) catalog developed for this purpose is a union list of titles from the five participating libraries--Houston and Dallas Public Libraries, Texas State…

  7. Using the Microcomputer for Advertising Research Presentations.

    ERIC Educational Resources Information Center

    Larkin, Ernest F.

    A midwestern university is testing a program that uses the Apple II computer to help students in an advertising research course develop their skills in preparing and presenting research reports using computer generated graphics for both oral and written presentations. One of the course requirements is the preparation of a final project, including…

  8. Syringomyelia

    MedlinePlus

    ... is the most reliable way to diagnose syringomyelia. Computer-generated radio waves and a powerful magnetic field ... a searchable database of current and past research projects supported by NIH and other federal agencies. RePORTER ...

  9. First 3 years of operation of RIACS (Research Institute for Advanced Computer Science) (1983-1985)

    NASA Technical Reports Server (NTRS)

    Denning, P. J.

    1986-01-01

    The focus of the Research Institute for Advanced Computer Science (RIACS) is to explore matches between advanced computing architectures and the processes of scientific research. An architecture evaluation of the MIT static dataflow machine, specification of a graphical language for expressing distributed computations, and specification of an expert system for aiding in grid generation for two-dimensional flow problems were initiated. Research projects for 1984 and 1985 are summarized.

  10. Volumetric visualization algorithm development for an FPGA-based custom computing machine

    NASA Astrophysics Data System (ADS)

    Sallinen, Sami J.; Alakuijala, Jyrki; Helminen, Hannu; Laitinen, Joakim

    1998-05-01

    Rendering volumetric medical images is a burdensome computational task for contemporary computers due to the large size of the data sets. Custom designed reconfigurable hardware could considerably speed up volume visualization if an algorithm suitable for the platform is used. We present an algorithm and speedup techniques for visualizing volumetric medical CT and MR images with a custom-computing machine based on a Field Programmable Gate Array (FPGA). We also present simulated performance results of the proposed algorithm calculated with a software implementation running on a desktop PC. Our algorithm is capable of generating perspective projection renderings of single and multiple isosurfaces with transparency, simulated X-ray images, and Maximum Intensity Projections (MIP). Although more speedup techniques exist for parallel projection than for perspective projection, we have constrained ourselves to perspective viewing, because of its importance in the field of radiotherapy. The algorithm we have developed is based on ray casting, and the rendering is sped up by three different methods: shading speedup by gradient precalculation, a new generalized version of Ray-Acceleration by Distance Coding (RADC), and background ray elimination by speculative ray selection.
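
    To illustrate the core operation the abstract describes, here is a minimal perspective ray-casting Maximum Intensity Projection over a synthetic volume; the camera geometry and sampling step are illustrative, not the authors' setup, and none of the RADC-style accelerations are included.

```python
# Minimal perspective ray-casting MIP through a synthetic volume (toy geometry).
import numpy as np

vol = np.zeros((64, 64, 64), dtype=np.float32)
vol[24:40, 24:40, 24:40] = 1.0          # a bright cube as test data

eye = np.array([32.0, 32.0, -80.0])      # camera in front of the volume
W = H = 48                               # output image size
image = np.zeros((H, W), dtype=np.float32)

for v in range(H):
    for u in range(W):
        # Ray from the eye through pixel (u, v) on an image plane at z = 0.
        target = np.array([u * 64.0 / W, v * 64.0 / H, 0.0])
        d = target - eye
        d /= np.linalg.norm(d)
        t = np.arange(0.0, 220.0, 0.5)   # sample positions along the ray
        pts = eye + t[:, None] * d
        idx = np.round(pts).astype(int)  # nearest-neighbor sampling
        ok = np.all((idx >= 0) & (idx < 64), axis=1)
        if ok.any():
            image[v, u] = vol[idx[ok, 0], idx[ok, 1], idx[ok, 2]].max()

# 'image' now holds the MIP; an acceleration such as RADC would skip empty
# space instead of sampling every 0.5 units as done here.
```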

  11. Automated recognition of helium speech. Phase I: Investigation of microprocessor based analysis/synthesis system

    NASA Astrophysics Data System (ADS)

    Jelinek, H. J.

    1986-01-01

    This is the Final Report of Electronic Design Associates on its Phase I SBIR project. The purpose of this project is to develop a method for correcting helium speech, as experienced in diver-surface communication. The goal of the Phase I study was to design, prototype, and evaluate a real time helium speech corrector system based upon digital signal processing techniques. The general approach was to develop hardware (an IBM PC board) to digitize helium speech and software (a LAMBDA computer based simulation) to translate the speech. As planned in the study proposal, this initial prototype may now be used to assess expected performance from a self contained real time system which uses an identical algorithm. The Final Report details the work carried out to produce the prototype system. Four major project tasks were completed: a signal processing scheme for converting helium speech to normal sounding speech was devised; the scheme was simulated on a general purpose (LAMBDA) computer; actual helium speech was supplied to the simulation and converted speech was generated; and an IBM-PC based 14 bit data Input/Output board was designed and built. In addition, a bibliography of references on speech processing was compiled.

  12. What's in your next-generation sequence data? An exploration of unmapped DNA and RNA sequence reads from the bovine reference individual

    USDA-ARS?s Scientific Manuscript database

    BACKGROUND: Next-generation sequencing projects commonly commence by aligning reads to a reference genome assembly. While improvements in alignment algorithms and computational hardware have greatly enhanced the efficiency and accuracy of alignments, a significant percentage of reads often remain u...

  13. Methods, apparatuses, and computer-readable media for projectional morphological analysis of N-dimensional signals

    DOEpatents

    Glazoff, Michael V.; Gering, Kevin L.; Garnier, John E.; Rashkeev, Sergey N.; Pyt'ev, Yuri Petrovich

    2016-05-17

    Embodiments discussed herein in the form of methods, systems, and computer-readable media deal with the application of advanced "projectional" morphological algorithms for solving a broad range of problems. In a method of performing projectional morphological analysis, an N-dimensional input signal is supplied. At least one N-dimensional form indicative of at least one feature in the N-dimensional input signal is identified. The N-dimensional input signal is filtered relative to the at least one N-dimensional form and an N-dimensional output signal is generated indicating results of the filtering at least as differences in the N-dimensional input signal relative to the at least one N-dimensional form.
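
    The patent's "projectional" operators are more general than standard morphology, but the basic idea of filtering a signal relative to an N-dimensional form can be illustrated with a 2-D grey opening and its difference image, as in this rough sketch.

```python
# A rough 2-D illustration of filtering a signal relative to a structural form,
# in the spirit of the morphological analysis described above (the patent's
# actual "projectional" operators are more general than this opening filter).
import numpy as np
from scipy import ndimage

rng = np.random.default_rng(0)
signal = rng.normal(0.0, 0.1, size=(128, 128))
signal[60:68, 60:68] += 2.0              # an 8x8 feature embedded in noise

form = np.ones((8, 8))                   # the form to test the signal against

# Opening keeps structures into which the form fits; the difference image
# (a "top-hat") shows what the input contains beyond that form.
opened = ndimage.grey_opening(signal, footprint=form)
difference = signal - opened

print(float(opened[62, 62]), float(difference.max()))
```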

  14. High-Resolution Large-Field-of-View Three-Dimensional Hologram Display System and Method Thereof

    NASA Technical Reports Server (NTRS)

    Chao, Tien-Hsin (Inventor); Mintz, Frederick W. (Inventor); Tsou, Peter (Inventor); Bryant, Nevin A. (Inventor)

    2001-01-01

    A real-time, dynamic, free space-virtual reality, 3-D image display system is enabled by using a unique form of Aerogel as the primary display media. A preferred embodiment of this system comprises a 3-D mosaic topographic map which is displayed by fusing four projected hologram images. In this embodiment, four holographic images are projected from four separate holograms. Each holographic image subtends a quadrant of the 4(pi) solid angle. By fusing these four holographic images, a static 3-D image such as a featured terrain map would be visible for 360 deg in the horizontal plane and 180 deg in the vertical plane. An input, either acquired by a 3-D image sensor or generated by computer animation, is first converted into a 2-D computer generated hologram (CGH). This CGH is then downloaded into a large liquid crystal (LC) panel. A laser projector illuminates the CGH-filled LC panel and generates and displays a real 3-D image in the Aerogel matrix.

  15. Computer Cataloging of Electronic Journals in Unstable Aggregator Databases: The Hong Kong Baptist University Library Experience.

    ERIC Educational Resources Information Center

    Li, Yiu-On; Leung, Shirley W.

    2001-01-01

    Discussion of aggregator databases focuses on a project at the Hong Kong Baptist University library to integrate full-text electronic journal titles from three unstable aggregator databases into its online public access catalog (OPAC). Explains the development of the electronic journal computer program (EJCOP) to generate MARC records for…

  16. Automatic speech recognition and training for severely dysarthric users of assistive technology: the STARDUST project.

    PubMed

    Parker, Mark; Cunningham, Stuart; Enderby, Pam; Hawley, Mark; Green, Phil

    2006-01-01

    The STARDUST project developed robust computer speech recognizers for use by eight people with severe dysarthria and concomitant physical disability to access assistive technologies. Independent computer speech recognizers trained with normal speech are of limited functional use by those with severe dysarthria due to limited and inconsistent proximity to "normal" articulatory patterns. Severe dysarthric output may also be characterized by a small mass of distinguishable phonetic tokens making the acoustic differentiation of target words difficult. Speaker dependent computer speech recognition using Hidden Markov Models was achieved by the identification of robust phonetic elements within the individual speaker output patterns. A new system of speech training using computer generated visual and auditory feedback reduced the inconsistent production of key phonetic tokens over time.

  17. The next generation

    NASA Technical Reports Server (NTRS)

    Yudkin, Howard

    1988-01-01

    The next generation of computer systems is studied by examining the relevant processes and methodologies. The present generation is adequate for small projects but not for large ones: it does not address the iterative nature of requirements, resolution, and implementation; it does not address the complexity issues of requirements stabilization; it does not explicitly address reuse opportunities; and it does not help with people shortages. Therefore, there is a need to define and automate improved software engineering processes. Some help may be gained by reuse and prototyping, which are two sides of the same coin: reuse library parts are used to generate good approximations to desired solutions, i.e., prototypes, and rapid prototype composition implies the use of preexistent parts, i.e., reusable parts.

  18. Computational simulations of supersonic magnetohydrodynamic flow control, power and propulsion systems

    NASA Astrophysics Data System (ADS)

    Wan, Tian

    This work is motivated by the lack of a fully coupled computational tool that successfully solves the turbulent chemically reacting Navier-Stokes equations, the electron energy conservation equation, and the electric current Poisson equation. In the present work, the abovementioned equations are solved in a fully coupled manner using fully implicit parallel GMRES methods. The system of Navier-Stokes equations is solved using a GMRES method with combined Schwarz and ILU(0) preconditioners. The electron energy equation and the electric current Poisson equation are solved using a GMRES method with combined SOR and Jacobi preconditioners. The fully coupled method has also been implemented successfully in an unstructured solver, US3D, and convergence test results are presented. This new method is shown to be two to five times faster than the original DPLR method. The Poisson solver is validated with analytic test problems. Then, four problems are selected; two of them are computed to explore the possibility of onboard MHD control and power generation, and the other two are simulations of experiments. First, the possibility of onboard reentry shock control by a magnetic field is explored. As part of a previous project, MHD power generation onboard a re-entry vehicle is also simulated. Then, the MHD acceleration experiments conducted at NASA Ames Research Center are simulated. Lastly, the MHD power generation experiments known as the HVEPS project are simulated. For code validation, the scramjet experiments at the University of Queensland are simulated first, and the generator section of the HVEPS test facility is computed next. The main conclusion is that the computational tool is accurate for different types of problems and flow conditions, and its accuracy and efficiency are necessary when the flow complexity increases.
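
    As a toy illustration of one ingredient named above, the snippet below runs SciPy's GMRES with an incomplete-LU preconditioner on a small sparse test system; the actual solver's ILU(0)/Schwarz setup and flow Jacobians are not reproduced.

```python
# Preconditioned GMRES on a 1-D Poisson-like operator standing in for a flow
# Jacobian (a minimal sketch, not the solver described in the abstract).
import numpy as np
import scipy.sparse as sp
import scipy.sparse.linalg as spla

n = 200
A = sp.diags([-1.0, 2.0, -1.0], [-1, 0, 1], shape=(n, n), format="csc")
b = np.ones(n)

# Incomplete LU factorization, restricted to approximate an ILU(0)-like fill.
ilu = spla.spilu(A, drop_tol=0.0, fill_factor=1)
M = spla.LinearOperator((n, n), matvec=ilu.solve)   # preconditioner M ~ A^-1

x, info = spla.gmres(A, b, M=M, restart=30)
print(info, np.linalg.norm(A @ x - b))              # info == 0 on convergence
```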

  19. HyperForest: A high performance multi-processor architecture for real-time intelligent systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Garcia, P. Jr.; Rebeil, J.P.; Pollard, H.

    1997-04-01

    Intelligent Systems are characterized by the intensive use of computer power. The computer revolution of the last few years is what has made possible the development of the first generation of Intelligent Systems. Software for second generation Intelligent Systems will be more complex and will require more powerful computing engines in order to meet real-time constraints imposed by new robots, sensors, and applications. A multiprocessor architecture was developed that merges the advantages of message-passing and shared-memory structures: expendability and real-time compliance. The HyperForest architecture will provide an expandable real-time computing platform for computationally intensive Intelligent Systems and open the doors for the application of these systems to more complex tasks in environmental restoration and cleanup projects, flexible manufacturing systems, and DOE's own production and disassembly activities.

  20. Evaluation of Computer Tools for Idea Generation and Team Formation in Project-Based Learning

    ERIC Educational Resources Information Center

    Ardaiz-Villanueva, Oscar; Nicuesa-Chacon, Xabier; Brene-Artazcoz, Oscar; Sanz de Acedo Lizarraga, Maria Luisa; Sanz de Acedo Baquedano, Maria Teresa

    2011-01-01

    The main objective of this research was to validate the effectiveness of Wikideas and Creativity Connector tools to stimulate the generation of ideas and originality by university students organized into groups according to their indexes of creativity and affinity. Another goal of the study was to evaluate the classroom climate created by these…

  1. An application of artificial intelligence to the interpretation of mass spectra.

    NASA Technical Reports Server (NTRS)

    Buchanan, B. G.; Duffield, A. M.; Robertson, A. V.

    1971-01-01

    Description of the DENDRAL (Dendritic Algorithm) project, the objectives of which were to base the computer program on an algorithm that generates an exhaustive, nonredundant list of all the structural isomers of a given chemical composition, and to devise a computer program that would perform an organic structure determination, given a molecular formula and a mass spectrum. This program is called 'Heuristic DENDRAL' and it operates by using the known structure/spectrum correlations to constrain the DENDRAL isomer generator to produce a single isomer for that composition. The collaboration of chemists and computer scientists has produced a tool of some practical utility from the chemical viewpoint, and an interesting program from the viewpoint of artificial intelligence.

  2. Using Volunteer Computing to Study Some Features of Diagonal Latin Squares

    NASA Astrophysics Data System (ADS)

    Vatutin, Eduard; Zaikin, Oleg; Kochemazov, Stepan; Valyaev, Sergey

    2017-12-01

    This study concerns several features of diagonal Latin squares (DLSs) of small order. The authors suggest an algorithm for computing the minimal and maximal numbers of transversals of DLSs. According to this algorithm, all DLSs of a particular order are generated, and for each square all its transversals and diagonal transversals are constructed. The algorithm was implemented and applied to DLSs of order at most 7 on a personal computer. The experiment for order 8 was performed in the volunteer computing project Gerasim@home. In addition, the problem of finding pairs of orthogonal DLSs of order 10 was considered and reduced to the Boolean satisfiability problem. The obtained problem turned out to be very hard, therefore it was decomposed into a family of subproblems. In order to solve the problem, the volunteer computing project SAT@home was used. As a result, several dozen pairs of the described kind were found.
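
    The transversal count at the core of the study can be sketched directly: a transversal picks one cell per row and column such that all visited symbols differ. The brute-force version below is only feasible for small orders, consistent with the abstract's note that orders up to 7 ran on a personal computer.

```python
# Count transversals of a (diagonal) Latin square by brute force over column
# permutations; the example square is an order-4 DLS chosen for illustration.
from itertools import permutations

def count_transversals(square):
    n = len(square)
    count = 0
    # A transversal picks column cols[r] in each row r (a permutation) such
    # that the n visited symbols are pairwise distinct.
    for cols in permutations(range(n)):
        symbols = {square[r][cols[r]] for r in range(n)}
        if len(symbols) == n:
            count += 1
    return count

# Order-4 diagonal Latin square (both diagonals also contain all symbols).
dls = [[0, 1, 2, 3],
       [2, 3, 0, 1],
       [3, 2, 1, 0],
       [1, 0, 3, 2]]
print(count_transversals(dls))
```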

  3. Improving Student Learning via Mobile Phone Video Content: Evidence from the BridgeIT India Project

    ERIC Educational Resources Information Center

    Wennersten, Matthew; Quraishy, Zubeeda Banu; Velamuri, Malathi

    2015-01-01

    Past efforts invested in computer-based education technology interventions have generated little evidence of affordable success at scale. This paper presents the results of a mobile phone-based intervention conducted in the Indian states of Andhra Pradesh and Tamil Nadu in 2012-13. The BridgeIT project provided a pool of audio-visual learning…

  4. eScience for molecular-scale simulations and the eMinerals project.

    PubMed

    Salje, E K H; Artacho, E; Austen, K F; Bruin, R P; Calleja, M; Chappell, H F; Chiang, G-T; Dove, M T; Frame, I; Goodwin, A L; Kleese van Dam, K; Marmier, A; Parker, S C; Pruneda, J M; Todorov, I T; Trachenko, K; Tyer, R P; Walker, A M; White, T O H

    2009-03-13

    We review the work carried out within the eMinerals project to develop eScience solutions that facilitate a new generation of molecular-scale simulation work. Technological developments include integration of compute and data systems, development of collaborative frameworks and new researcher-friendly tools for grid job submission, XML data representation, information delivery, metadata harvesting and metadata management. A number of diverse science applications illustrate how these tools are being used for large parameter-sweep studies, an emerging type of study for which the integration of computing, data and collaboration is essential.

  5. Cylindrical Projection of Jupiter

    NASA Image and Video Library

    1996-01-29

    This computer generated map of Jupiter was made from 10 color images of Jupiter taken Feb. 1, 1979, by NASA Voyager 1, during a single, 10 hour rotation of the planet. http://photojournal.jpl.nasa.gov/catalog/PIA00011

  6. Comparison of Computational-Model and Experimental-Example Trained Neural Networks for Processing Speckled Fringe Patterns

    NASA Technical Reports Server (NTRS)

    Decker, A. J.; Fite, E. B.; Thorp, S. A.; Mehmed, O.

    1998-01-01

    The responses of artificial neural networks to experimental and model-generated inputs are compared for detection of damage in twisted fan blades using electronic holography. The training-set inputs, for this work, are experimentally generated characteristic patterns of the vibrating blades. The outputs are damage-flag indicators or second derivatives of the sensitivity-vector-projected displacement vectors from a finite element model. Artificial neural networks have been trained in the past with computational-model-generated training sets. This approach avoids the difficult inverse calculations traditionally used to compare interference fringes with the models. But the high modeling standards are hard to achieve, even with fan-blade finite-element models.

  8. Creation of Synthetic Surface Temperature and Precipitation Ensembles Through A Computationally Efficient, Mixed Method Approach

    NASA Astrophysics Data System (ADS)

    Hartin, C.; Lynch, C.; Kravitz, B.; Link, R. P.; Bond-Lamberty, B. P.

    2017-12-01

    Typically, uncertainty quantification of internal variability relies on large ensembles of climate model runs under multiple forcing scenarios or perturbations in a parameter space. Computationally efficient, standard pattern scaling techniques only generate one realization and do not capture the complicated dynamics of the climate system (i.e., stochastic variations with a frequency-domain structure). In this study, we generate large ensembles of climate data with spatially and temporally coherent variability across a subselection of Coupled Model Intercomparison Project Phase 5 (CMIP5) models. First, for each CMIP5 model we apply a pattern emulation approach to derive the model response to external forcing. We take all the spatial and temporal variability that isn't explained by the emulator and decompose it into non-physically based structures through use of empirical orthogonal functions (EOFs). Then, we perform a Fourier decomposition of the EOF projection coefficients to capture the input fields' temporal autocorrelation so that our new emulated patterns reproduce the proper timescales of climate response and "memory" in the climate system. Through this 3-step process, we derive computationally efficient climate projections consistent with CMIP5 model trends and modes of variability, which address a number of deficiencies inherent in the ability of pattern scaling to reproduce complex climate model behavior.
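
    The 3-step process (emulator residuals, EOF decomposition, Fourier phase treatment) can be condensed into a short sketch. The snippet below uses synthetic residuals and phase randomization to generate a new realization with the same EOF structure and power spectra; it is a simplified reading of the method, not the authors' code.

```python
# EOFs of residual variability plus Fourier phase randomization of the EOF
# coefficients, so new realizations keep the original temporal autocorrelation.
import numpy as np

rng = np.random.default_rng(1)
nt, nspace = 240, 50                       # months x grid cells (toy sizes)
resid = rng.normal(size=(nt, nspace))      # stand-in for model-minus-emulator

# Step 2: EOFs via SVD; rows of Vt are spatial patterns, U*S are coefficients.
U, S, Vt = np.linalg.svd(resid, full_matrices=False)
coeffs = U * S                             # (time, mode) projection coefficients

# Step 3: Fourier-decompose each coefficient series and randomize phases.
spec = np.fft.rfft(coeffs, axis=0)
phases = np.exp(1j * rng.uniform(0, 2 * np.pi, size=spec.shape))
phases[0] = 1.0                            # keep the mean (zero frequency)
phases[-1] = 1.0                           # keep the Nyquist term real
new_coeffs = np.fft.irfft(spec * phases, n=nt, axis=0)

# A new synthetic realization with the same modes and temporal spectra.
new_resid = new_coeffs @ Vt
print(new_resid.shape)
```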

  9. Engineering specification and system design for CAD/CAM of custom shoes: UMC project effort

    NASA Technical Reports Server (NTRS)

    Bao, Han P.

    1991-01-01

    The goal of this project is to supplement the footwear design system of North Carolina State University (NCSU) with a software module to design and manufacture a combination sole. The four areas of concentration were: customization of NASCAD (NASA Computer Aided Design) to the footwear project; use of CENCIT data; computer aided manufacturing activities; and beginning work for the bottom elements of shoes. The task of generating a software module for producing a sole was completed with a demonstrated product realization. The software written in C was delivered to NCSU for inclusion in their design system for custom footwear known as LASTMOD. The machining process of the shoe last was improved using a spiral tool path approach.

  10. The mission events graphic generator software: A small tool with big results

    NASA Technical Reports Server (NTRS)

    Lupisella, Mark; Leibee, Jack; Scaffidi, Charles

    1993-01-01

    Utilization of graphics has long been a useful methodology for many aspects of spacecraft operations. A personal computer based software tool that implements straightforward graphics and greatly enhances spacecraft operations is presented. This unique software tool is the Mission Events Graphic Generator (MEGG) software which is used in support of the Hubble Space Telescope (HST) Project. MEGG reads the HST mission schedule and generates a graphical timeline.

  11. Radio Synthesis Imaging - A High Performance Computing and Communications Project

    NASA Astrophysics Data System (ADS)

    Crutcher, Richard M.

    The National Science Foundation has funded a five-year High Performance Computing and Communications project at the National Center for Supercomputing Applications (NCSA) for the direct implementation of several of the computing recommendations of the Astronomy and Astrophysics Survey Committee (the "Bahcall report"). This paper is a summary of the project goals and a progress report. The project will implement a prototype of the next generation of astronomical telescope systems - remotely located telescopes connected by high-speed networks to very high performance, scalable architecture computers and on-line data archives, which are accessed by astronomers over Gbit/sec networks. Specifically, a data link has been installed between the BIMA millimeter-wave synthesis array at Hat Creek, California and NCSA at Urbana, Illinois for real-time transmission of data to NCSA. Data are automatically archived, and may be browsed and retrieved by astronomers using the NCSA Mosaic software. In addition, an on-line digital library of processed images will be established. BIMA data will be processed on a very high performance distributed computing system, with I/O, user interface, and most of the software system running on the NCSA Convex C3880 supercomputer or Silicon Graphics Onyx workstations connected by HiPPI to the high performance, massively parallel Thinking Machines Corporation CM-5. The very computationally intensive algorithms for calibration and imaging of radio synthesis array observations will be optimized for the CM-5 and new algorithms which utilize the massively parallel architecture will be developed. Code running simultaneously on the distributed computers will communicate using the Data Transport Mechanism developed by NCSA. The project will also use the BLANCA Gbit/s testbed network between Urbana and Madison, Wisconsin to connect an Onyx workstation in the University of Wisconsin Astronomy Department to the NCSA CM-5, for development of long-distance distributed computing. Finally, the project is developing 2D and 3D visualization software as part of the international AIPS++ project. This research and development project is being carried out by a team of experts in radio astronomy, algorithm development for massively parallel architectures, high-speed networking, database management, and Thinking Machines Corporation personnel. The development of this complete software, distributed computing, and data archive and library solution to the radio astronomy computing problem will advance our expertise in high performance computing and communications technology and the application of these techniques to astronomical data processing.

  12. Modeling of urban solid waste management system: The case of Dhaka city

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sufian, M.A.; Bala, B.K.

    2007-07-01

    This paper presents a system dynamics computer model to predict solid waste generation, collection capacity and electricity generation from solid waste and to assess the needs for waste management of the urban city of Dhaka, Bangladesh. Simulated results show that solid waste generation, collection capacity and electricity generation potential from solid waste increase with time. Population, uncleared waste, untreated waste, composite index and public concern are projected to increase with time for Dhaka city. Simulated results also show that increasing the budget for collection capacity alone does not improve environmental quality; rather an increased budget is required for both collection and treatment of solid wastes of Dhaka city. Finally, this model can be used as a computer laboratory for urban solid waste management (USWM) policy analysis.
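
    The stock-and-flow logic of such a system dynamics model can be sketched with a simple Euler integration. All coefficients below are made up for illustration; they are not the paper's calibrated values for Dhaka.

```python
# A toy stock-and-flow sketch in the spirit of the model above: waste
# accumulates with population and is drawn down by a budget-limited
# collection capacity (all numbers assumed, not from the paper).
population = 12.0e6          # people (assumed initial value)
uncleared = 1.0e6            # tonnes of uncleared waste (assumed)
capacity = 3.0e3             # tonnes/day collection capacity (assumed)

for year in range(2007, 2027):
    generation = population * 0.45e-3 * 365      # tonnes/yr at 0.45 kg/person/day
    collected = min(generation + uncleared, capacity * 365)
    uncleared = max(uncleared + generation - collected, 0.0)
    population *= 1.04                            # ~4% urban growth per year
    capacity *= 1.03                              # 3% budget-driven growth
print(round(uncleared / 1e6, 2), "Mt uncleared after 20 years")
```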

  13. An Integrated Data-Driven Strategy for Safe-by-Design Nanoparticles: The FP7 MODERN Project.

    PubMed

    Brehm, Martin; Kafka, Alexander; Bamler, Markus; Kühne, Ralph; Schüürmann, Gerrit; Sikk, Lauri; Burk, Jaanus; Burk, Peeter; Tamm, Tarmo; Tämm, Kaido; Pokhrel, Suman; Mädler, Lutz; Kahru, Anne; Aruoja, Villem; Sihtmäe, Mariliis; Scott-Fordsmand, Janeck; Sorensen, Peter B; Escorihuela, Laura; Roca, Carlos P; Fernández, Alberto; Giralt, Francesc; Rallo, Robert

    2017-01-01

    The development and implementation of safe-by-design strategies is key for the safe development of future generations of nanotechnology enabled products. The safety testing of the huge variety of nanomaterials that can be synthetized is unfeasible due to time and cost constraints. Computational modeling facilitates the implementation of alternative testing strategies in a time and cost effective way. The development of predictive nanotoxicology models requires the use of high quality experimental data on the structure, physicochemical properties and bioactivity of nanomaterials. The FP7 Project MODERN has developed and evaluated the main components of a computational framework for the evaluation of the environmental and health impacts of nanoparticles. This chapter describes each of the elements of the framework including aspects related to data generation, management and integration; development of nanodescriptors; establishment of nanostructure-activity relationships; identification of nanoparticle categories; hazard ranking and risk assessment.

  14. Intelligent Manufacturing of Commercial Optics Final Report CRADA No. TC-0313-92

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Taylor, J. S.; Pollicove, H.

    The project combined the research and development efforts of LLNL and the University of Rochester Center for Manufacturing Optics (COM) to develop a new generation of flexible computer controlled optics grinding machines. COM's principal near term development effort is to commercialize the OPTICAM-SM, a new prototype spherical grinding machine. A crucial requirement for commercializing the OPTICAM-SM is the development of a predictable and repeatable material removal process (deterministic micro-grinding) that yields high quality surfaces and minimizes non-deterministic polishing. OPTICAM machine tools and the fabrication process development studies are part of COM's response to the DOD (ARPA) request to implement a modernization strategy for revitalizing the U.S. optics manufacturing base. This project was entered into in order to develop a new generation of flexible, computer-controlled optics grinding machines.

  15. Hardware for hard-up schools?

    NASA Astrophysics Data System (ADS)

    St John, Stuart A.

    2012-01-01

    The purpose of this work was to investigate ways in which everyday computers can be used in schools to fulfil several of the roles of more expensive, specialized laboratory equipment for teaching and learning purposes. The brief adopted was to keep things as straightforward as possible so that any school science department with a few basic tools can copy the ideas presented. The project has so far produced a simple, safe input device to enable use of a computer as an oscilloscope and the conversion of external speakers into a signal generator. They are not without their limitations, but the intention is that they may provide opportunities for hands-on learning in schools where budgets are very limited. Several teaching ideas are outlined, with pointers for further development. It is hoped that interest in the project may generate further application of the ideas to the teaching of high school physics.

  16. CFD and Neutron codes coupling on a computational platform

    NASA Astrophysics Data System (ADS)

    Cerroni, D.; Da Vià, R.; Manservisi, S.; Menghini, F.; Scardovelli, R.

    2017-01-01

    In this work we investigate the thermal-hydraulic behavior of a PWR nuclear reactor core, evaluating the power generation distribution while taking into account the local temperature field. The temperature field, evaluated using a self-developed CFD module, is exchanged with a neutron code, DONJON-DRAGON, which updates the macroscopic cross sections and evaluates the new neutron flux. From the updated neutron flux the new peak factor is evaluated and the new temperature field is computed. The exchange of data between the two codes is made possible by their inclusion in the computational platform SALOME, an open-source tool developed by the collaborative project NURESAFE. The numerical libraries MEDmem, included in the SALOME platform, are used in this work for the projection of computational fields from one problem to another. The two problems are driven by a common supervisor that can access the computational fields of both systems: in every time step, the temperature field is extracted from the CFD problem and set into the neutron problem. After this iteration the new power peak factor is projected back into the CFD problem and the new time step can be computed. Several computational examples, in which both neutron and thermal-hydraulic quantities are parametrized, are reported at the end of this work.

  17. FRACTAL Systems & Project suite: engineering tools for improving development and operation of the systems

    NASA Astrophysics Data System (ADS)

    Pérez-Calpena, A.; Mujica-Alvarez, E.; Osinde-Lopez, J.; García-Vargas, M.

    2008-07-01

    This paper describes the FRACTAL Systems & Projects suite. This suite is composed of several tools (GECO, DOCMA and SUMO) that provide the capabilities that all organizations need to store and manage the system information generated along a project's lifetime, from the design phase to the operation phase. The amount of information generated in a project keeps growing in size and complexity along the project's lifetime, to an extent that it becomes impossible to manage without the aid of specific computer-based tools. The suite described in this paper is the solution developed by FRACTAL to assist the execution of different scientific projects, mainly related to telescopes and instruments, for astronomical research centres. These tools help system and project engineers to maintain technical control of the systems and to ensure an optimal use of resources. GECO eases the control of system configuration data; DOCMA provides the means to organise and manage the documents generated in the project; SUMO allows managing and scheduling the operation, the maintenance activities and the resources during the operational phase of a system. These tools improve project communication by making the information available to authorized users (project team, customers, Consortium members, etc.). Finally, and depending on project needs, these three tools can be used in an integrated or independent manner.

  18. Meeting Report: The Terabase Metagenomics Workshop and the Vision of an Earth Microbiome Project

    PubMed Central

    Gilbert, Jack A.; Meyer, Folker; Antonopoulos, Dion; Balaji, Pavan; Brown, C. Titus; Brown, Christopher T.; Desai, Narayan; Eisen, Jonathan A; Evers, Dirk; Field, Dawn; Feng, Wu; Huson, Daniel; Jansson, Janet; Knight, Rob; Knight, James; Kolker, Eugene; Konstantindis, Kostas; Kostka, Joel; Kyrpides, Nikos; Mackelprang, Rachel; McHardy, Alice; Quince, Christopher; Raes, Jeroen; Sczyrba, Alexander; Shade, Ashley; Stevens, Rick

    2010-01-01

    Between July 18th and 24th 2010, 26 leading microbial ecology, computation, bioinformatics and statistics researchers came together in Snowbird, Utah (USA) to discuss the challenge of how to best characterize the microbial world using next-generation sequencing technologies. The meeting was entitled “Terabase Metagenomics” and was sponsored by the Institute for Computing in Science (ICiS) summer 2010 workshop program. The aim of the workshop was to explore the fundamental questions relating to microbial ecology that could be addressed using advances in sequencing potential. Technological advances in next-generation sequencing platforms such as the Illumina HiSeq 2000 can generate in excess of 250 billion base pairs of genetic information in 8 days. Thus, the generation of a trillion base pairs of genetic information is becoming a routine matter. The main outcome from this meeting was the birth of a concept and practical approach to exploring microbial life on earth, the Earth Microbiome Project (EMP). Here we briefly describe the highlights of this meeting and provide an overview of the EMP concept and how it can be applied to exploration of the microbiome of each ecosystem on this planet. PMID:21304727

  19. Design and Construction of Detector and Data Acquisition Elements for Proton Computed Tomography

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fermi Research Alliance; Northern Illinois University

    2015-07-15

    Proton computed tomography (pCT) offers an alternative to x-ray imaging with potential for three-dimensional imaging, reduced radiation exposure, and in-situ imaging. Northern Illinois University (NIU) is developing a second-generation proton computed tomography system with a goal of demonstrating the feasibility of three-dimensional imaging within clinically realistic imaging times. The second-generation pCT system comprises a tracking system, a calorimeter, data acquisition, a computing farm, and software algorithms. The proton beam encounters the upstream tracking detectors, the patient or phantom, the downstream tracking detectors, and a calorimeter. The schematic layout of the pCT system is shown. The data acquisition sends the proton scattering information to an offline computing farm. Major innovations of the second generation pCT project involve an increased data acquisition rate (MHz range) and development of three-dimensional imaging algorithms. The Fermilab Particle Physics Division and Northern Illinois Center for Accelerator and Detector Development at Northern Illinois University worked together to design and construct the tracking detectors, calorimeter, readout electronics and detector mounting system.

  20. 3D Modelling with the Samsung Gear 360

    NASA Astrophysics Data System (ADS)

    Barazzetti, L.; Previtali, M.; Roncoroni, F.

    2017-02-01

    The Samsung Gear 360 is a consumer grade spherical camera able to capture photos and videos. The aim of this work is to test the metric accuracy and the level of detail achievable with the Samsung Gear 360 coupled with digital modelling techniques based on photogrammetry/computer vision algorithms. Results demonstrate that the direct use of the projection generated inside the mobile phone or with Gear 360 Action Direction (the desktop software for post-processing) has a relatively low metric accuracy. As these results were in contrast with the accuracy achieved by using the original fisheye images (front and rear facing images) in photogrammetric reconstructions, an alternative solution to generate the equirectangular projections was developed. A calibration aimed at estimating the intrinsic parameters of the two-lens camera, as well as their relative orientation, allowed one to generate new equirectangular projections from which a significant improvement of geometric accuracy was achieved.
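
    The geometry behind regenerating an equirectangular projection from back-to-back fisheye images can be sketched as below, assuming an ideal equidistant lens model (r = f * theta); the paper's calibrated intrinsics and relative orientation would replace these idealized values.

```python
# Map each equirectangular pixel to a lens choice and fisheye sample position,
# under an assumed equidistant fisheye model (not the calibrated camera).
import numpy as np

def equirect_to_fisheye(width, height, fisheye_size, fov_deg=195.0):
    """For each equirectangular pixel return (lens_id, u, v) sample coords."""
    j, i = np.meshgrid(np.arange(width), np.arange(height))
    lon = (j / width - 0.5) * 2.0 * np.pi          # longitude in [-pi, pi]
    lat = (0.5 - i / height) * np.pi               # latitude in [-pi/2, pi/2]

    # Unit view direction for each pixel.
    x = np.cos(lat) * np.sin(lon)
    y = np.sin(lat)
    z = np.cos(lat) * np.cos(lon)

    front = z >= 0                                  # which lens sees this ray
    zl = np.where(front, z, -z)                     # flip rays for rear lens
    theta = np.arccos(np.clip(zl, -1.0, 1.0))       # angle from lens axis
    phi = np.arctan2(y, np.where(front, x, -x))

    f = (fisheye_size / 2.0) / np.radians(fov_deg / 2.0)  # equidistant focal
    r = f * theta
    u = fisheye_size / 2.0 + r * np.cos(phi)
    v = fisheye_size / 2.0 + r * np.sin(phi)
    return front.astype(int), u, v                  # sample each lens image here

lens, u, v = equirect_to_fisheye(512, 256, 1280)
print(lens.shape, u.min() >= 0)
```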

  1. DEGAS: Dynamic Exascale Global Address Space Programming Environments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Demmel, James

    The Dynamic, Exascale Global Address Space programming environment (DEGAS) project will develop the next generation of programming models and runtime systems to meet the challenges of Exascale computing. The Berkeley part of the project concentrated on communication-optimal code generation to optimize speed and energy efficiency by reducing data movement. Our work developed communication lower bounds and/or communication-avoiding algorithms (that either meet the lower bound, or do much less communication than their conventional counterparts) for a variety of algorithms, including linear algebra, machine learning and genomics.
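
    As background for the communication lower bounds mentioned (a classical result in this literature, not a claim quoted from this report): for dense n x n matrix multiplication on P processors, each with local memory M, the number of words W moved by at least one processor satisfies

```latex
% Memory-dependent communication lower bound (classical result, given here as
% assumed context). W = words moved by some processor; P = processors; M = local memory.
\[
  W \;=\; \Omega\!\left(\frac{n^{3}}{P\,\sqrt{M}}\right)
\]
```

    Communication-avoiding algorithms are, in this sense, those whose data movement attains such bounds up to lower-order factors.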

  2. The Triangle: a Multiprocessor Architecture for Fast Curve and Surface Generation.

    DTIC Science & Technology

    1987-08-01

    Keywords: curves and surfaces, B-splines, computer-aided geometric design, graphics hardware. Recoverable cited references: "Electronic Design", October 30, 1986; M. A. Penna and R. R. Patterson, Projective Geometry and its Applications to Computer Graphics, Prentice-Hall, Englewood Cliffs, N.J., 1985.

  3. RELIABILITY, AVAILABILITY, AND SERVICEABILITY FOR PETASCALE HIGH-END COMPUTING AND BEYOND

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chokchai "Box" Leangsuksun

    2011-05-31

    Our project is a multi-institutional research effort that adopts an interplay of reliability, availability, and serviceability (RAS) aspects to solve resilience issues in high-end scientific computing on the next generation of supercomputers. Results lie in the following tracks: failure prediction in a large scale HPC; investigation of reliability issues and mitigation techniques, including in GPGPU-based HPC systems; and HPC resilience runtime and tools.

  4. Real-time fuzzy inference based robot path planning

    NASA Technical Reports Server (NTRS)

    Pacini, Peter J.; Teichrow, Jon S.

    1990-01-01

    This project addresses the problem of adaptive trajectory generation for a robot arm. Conventional trajectory generation involves computing a path in real time to minimize a performance measure such as expended energy. This method can be computationally intensive, and it may yield poor results if the trajectory is weakly constrained. Typically some implicit constraints are known, but cannot be encoded analytically. The alternative approach used here is to formulate domain-specific knowledge, including implicit and ill-defined constraints, in terms of fuzzy rules. These rules utilize linguistic terms to relate input variables to output variables. Since the fuzzy rulebase is determined off-line, only high-level, computationally light processing is required in real time. Potential applications for adaptive trajectory generation include missile guidance and various sophisticated robot control tasks, such as automotive assembly, high speed electrical parts insertion, stepper alignment, and motion control for high speed parcel transfer systems.
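
    The rule-based approach described here can be illustrated with a minimal Mamdani-style fuzzy inference step: triangular memberships, two hand-written rules, and centroid defuzzification. The rule content is illustrative, not the project's actual rulebase.

```python
# Minimal fuzzy inference sketch: map distance-to-goal to a commanded speed.
import numpy as np

def tri(x, a, b, c):
    """Triangular membership function rising on [a, b], falling on [b, c]."""
    return np.maximum(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0)

def plan_speed(distance_to_goal):
    """Map distance (m) to commanded joint speed (rad/s) via fuzzy rules."""
    speed = np.linspace(0.0, 1.0, 201)                  # output universe
    near = tri(distance_to_goal, -0.5, 0.0, 0.5)        # 'distance is near'
    far = tri(distance_to_goal, 0.2, 1.0, 1.8)          # 'distance is far'

    # Rule 1: IF near THEN slow.  Rule 2: IF far THEN fast. (Mamdani clipping)
    slow = np.minimum(near, tri(speed, -0.4, 0.0, 0.4))
    fast = np.minimum(far, tri(speed, 0.3, 1.0, 1.7))
    agg = np.maximum(slow, fast)

    return float((speed * agg).sum() / agg.sum())       # centroid defuzzification

print(plan_speed(0.1), plan_speed(0.9))  # slow near the goal, fast when far
```

    Because the rulebase is fixed offline, the online work reduces to a handful of membership evaluations and a weighted average, matching the "computationally light" real-time claim above.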

  5. Vehicle Component Characterization. Volume 1 : Project Results.

    DOT National Transportation Integrated Search

    1987-01-01

    This study developed a set of data which could be used in computer crash occupant simulation models to study automobile crashworthiness. The data generated has been used to develop a data base on the National Highway Traffic Safety Administration's V...

  6. e-Stars Template Builder

    NASA Technical Reports Server (NTRS)

    Cox, Brian

    2003-01-01

    e-Stars Template Builder is a computer program that implements a concept of enabling users to rapidly gain access to information on projects of NASA's Jet Propulsion Laboratory. The information about a given project is not stored in a database, but rather in a network that follows the project as it develops. e-Stars Template Builder resides on a server computer, using Practical Extraction and Reporting Language (PERL) scripts to create what are called "e-STARS node templates," which are software constructs that allow for project-specific configurations. The user's computer need not be equipped with special software other than an Internet browser. e-Stars Template Builder is compatible with Windows, Macintosh, and UNIX operating systems. A user invokes e-Stars Template Builder from a browser window. Operations that can be performed by the user include the creation of child processes and the addition of links and descriptions of documentation to existing pages or nodes. By means of this addition of "child processes" of nodes, a network that reflects the development of a project is generated.

  7. System Matrix Analysis for Computed Tomography Imaging

    PubMed Central

    Flores, Liubov; Vidal, Vicent; Verdú, Gumersindo

    2015-01-01

    In practical applications of computed tomography imaging (CT), it is often the case that the set of projection data is incomplete owing to the physical conditions of the data acquisition process. On the other hand, the high radiation dose imposed on patients is also undesired. These issues demand that high quality CT images can be reconstructed from limited projection data. For this reason, iterative methods of image reconstruction have become a topic of increased research interest. Several algorithms have been proposed for few-view CT. We consider that the accurate solution of the reconstruction problem also depends on the system matrix that simulates the scanning process. In this work, we analyze the application of the Siddon method to generate elements of the matrix and we present results based on real projection data. PMID:26575482
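
    Siddon's method mentioned above computes, for each ray, its intersection length with every grid cell it crosses; those lengths are the nonzero entries of one row of the system matrix. A 2-D sketch of the parametric construction follows.

```python
# 2-D Siddon-style ray tracing: parametric crossings with the grid lines give
# per-cell intersection lengths, i.e., one system-matrix row for one ray.
import numpy as np

def siddon_row(p0, p1, nx, ny):
    """Lengths of segment p0->p1 inside each cell of an nx x ny unit grid."""
    p0, p1 = np.asarray(p0, float), np.asarray(p1, float)
    d = p1 - p0
    # Parameter values where the ray crosses vertical/horizontal grid lines.
    alphas = [0.0, 1.0]
    for axis, n in ((0, nx), (1, ny)):
        if d[axis] != 0.0:
            a = (np.arange(n + 1) - p0[axis]) / d[axis]
            alphas.extend(a[(a > 0.0) & (a < 1.0)])
    alphas = np.unique(alphas)

    row = {}
    length = np.linalg.norm(d)
    for a0, a1 in zip(alphas[:-1], alphas[1:]):
        mid = p0 + 0.5 * (a0 + a1) * d          # segment midpoint picks the cell
        ix, iy = int(mid[0]), int(mid[1])
        if 0 <= ix < nx and 0 <= iy < ny:
            row[(ix, iy)] = (a1 - a0) * length  # intersection length
    return row

# One ray across a 4x4 image grid (cells are unit squares).
print(siddon_row((0.0, 0.5), (4.0, 3.5), 4, 4))
```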

  8. Software life cycle methodologies and environments

    NASA Technical Reports Server (NTRS)

    Fridge, Ernest

    1991-01-01

    Products of this project will significantly improve the quality and productivity of Space Station Freedom Program software processes by improving software reliability and safety and by broadening the range of problems that can be solved with computational solutions. The project brings in Computer Aided Software Engineering (CASE) technology for environments such as the Engineering Script Language/Parts Composition System (ESL/PCS) application generator, an intelligent user interface for cost avoidance in setting up operational computer runs, a framework programmable platform for defining process and software development work flow control, a process for bringing CASE technology into an organization's culture, and the CLIPS/CLIPS Ada language for developing expert systems; and for methodologies such as a method for developing fault tolerant, distributed systems and a method for developing systems for common sense reasoning and for solving expert systems problems when only approximate truths are known.

  9. Eastern Renewable Generation Integration Study: Redefining What’s Possible for Renewable Energy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bloom, Aaron

    NREL project manager Aaron Bloom introduces NREL’s Eastern Renewable Generation Integration Study (ERGIS) and high-performance computing capabilities and new methodologies that allowed NREL to model operations of the Eastern Interconnection at unprecedented fidelity. ERGIS shows that the Eastern Interconnection can balance the variability and uncertainty of wind and solar photovoltaics at a 5-minute level, for one simulated year.

  10. FOUNTAIN: A JAVA open-source package to assist large sequencing projects

    PubMed Central

    Buerstedde, Jean-Marie; Prill, Florian

    2001-01-01

    Background Better automation, lower cost per reaction and a heightened interest in comparative genomics has led to a dramatic increase in DNA sequencing activities. Although the large sequencing projects of specialized centers are supported by in-house bioinformatics groups, many smaller laboratories face difficulties managing the appropriate processing and storage of their sequencing output. The challenges include documentation of clones, templates and sequencing reactions, and the storage, annotation and analysis of the large number of generated sequences. Results We describe here a new program, named FOUNTAIN, for the management of large sequencing projects. FOUNTAIN uses the JAVA computer language and data storage in a relational database. Starting with a collection of sequencing objects (clones), the program generates and stores information related to the different stages of the sequencing project using a web browser interface for user input. The generated sequences are subsequently imported and annotated based on BLAST searches against the public databases. In addition, simple algorithms to cluster sequences and determine putative polymorphic positions are implemented. Conclusions A simple, but flexible and scalable software package is presented to facilitate data generation and storage for large sequencing projects. Open source and largely platform and database independent, we wish FOUNTAIN to be improved and extended in a community effort. PMID:11591214

  11. Cheminformatic Analysis of the US EPA ToxCast Chemical Library

    EPA Science Inventory

    The ToxCast project is employing high throughput screening (HTS) technologies, along with chemical descriptors and computational models, to develop approaches for screening and prioritizing environmental chemicals for further toxicity testing. ToxCast Phase I generated HTS data f...

  12. HeatmapGenerator: high performance RNAseq and microarray visualization software suite to examine differential gene expression levels using an R and C++ hybrid computational pipeline.

    PubMed

    Khomtchouk, Bohdan B; Van Booven, Derek J; Wahlestedt, Claes

    2014-01-01

    The graphical visualization of gene expression data using heatmaps has become an integral component of modern-day medical research. Heatmaps are used extensively to plot quantitative differences in gene expression levels, such as those measured with RNAseq and microarray experiments, to provide qualitative large-scale views of the transcriptomic landscape. Creating high-quality heatmaps is a computationally intensive task, often requiring considerable programming experience, particularly for customizing features to a specific dataset at hand. Software to create publication-quality heatmaps is developed with the R programming language, C++ programming language, and OpenGL application programming interface (API) to create industry-grade high performance graphics. We create a graphical user interface (GUI) software package called HeatmapGenerator for Windows OS and Mac OS X as an intuitive, user-friendly alternative to researchers with minimal prior coding experience to allow them to create publication-quality heatmaps using R graphics without sacrificing their desired level of customization. The simplicity of HeatmapGenerator is that it only requires the user to upload a preformatted input file and download the publicly available R software language, among a few other operating system-specific requirements. Advanced features such as color, text labels, scaling, legend construction, and even database storage can be easily customized with no prior programming knowledge. We provide an intuitive and user-friendly software package, HeatmapGenerator, to create high-quality, customizable heatmaps generated using the high-resolution color graphics capabilities of R. The software is available for Microsoft Windows and Apple Mac OS X. HeatmapGenerator is released under the GNU General Public License and publicly available at: http://sourceforge.net/projects/heatmapgenerator/. The Mac OS X direct download is available at: http://sourceforge.net/projects/heatmapgenerator/files/HeatmapGenerator_MAC_OSX.tar.gz/download. The Windows OS direct download is available at: http://sourceforge.net/projects/heatmapgenerator/files/HeatmapGenerator_WINDOWS.zip/download.
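
    This is not HeatmapGenerator itself (which is R/C++ with OpenGL), but the core task it automates, plotting a labeled expression matrix as a heatmap, can be sketched in a few lines; the gene and sample names below are illustrative.

```python
# Minimal analogue of the heatmap-plotting task described above, using
# matplotlib and random stand-in expression values.
import numpy as np
import matplotlib.pyplot as plt

genes = ["BDNF", "FOXP2", "TP53", "EGFR"]            # illustrative labels
samples = ["ctrl_1", "ctrl_2", "case_1", "case_2"]
expr = np.random.default_rng(7).normal(size=(4, 4))  # stand-in for RNAseq values

fig, ax = plt.subplots()
im = ax.imshow(expr, cmap="RdBu_r", aspect="auto")
ax.set_xticks(range(len(samples)))
ax.set_xticklabels(samples, rotation=45, ha="right")
ax.set_yticks(range(len(genes)))
ax.set_yticklabels(genes)
fig.colorbar(im, label="expression (a.u.)")
fig.tight_layout()
fig.savefig("heatmap.png", dpi=300)
```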

  13. Image

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Marsh, Amber; Harsch, Tim; Pitt, Julie

    2007-08-31

    The computer side of the IMAGE project consists of a collection of Perl scripts that perform a variety of tasks: scripts are available to insert, update and delete data in the underlying Oracle database, download data from NCBI's GenBank and other sources, and generate data files for download by interested parties. Web scripts make up the tracking interface and the various tools available on the project website (image.llnl.gov), which provide a search interface to the database.

  14. Molecular Dynamics-based Simulations of Bulk/Interfacial Structures and Diffusion Behaviors in Nuclear Waste Glasses

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Du, Jincheng; Rimsza, Jessica; Deng, Lu

    This NEUP project aimed to generate accurate atomic structural models of nuclear waste glasses by using large-scale molecular dynamics-based computer simulations, and to use these models to investigate self-diffusion behaviors, interfacial structures, and hydrated gel structures formed during dissolution of these glasses. The goal was to obtain realistic and accurate short- and medium-range structures of these complex oxide glasses, to provide a mechanistic understanding of their dissolution behaviors, and to generate reliable information with predictive power for designing nuclear waste glasses for long-term geological storage. Looking back at the research accomplishments of this project, most of the scientific goals initially proposed were achieved through intensive research over the three and a half year period of the project. The project also generated a wealth of scientific data and vibrant discussions with various groups through collaborations within and outside the project. Throughout the project, one book chapter and 14 peer-reviewed journal publications were produced (including one under review), and 16 presentations (including 8 invited talks) were made to disseminate the results at national and international conferences. Furthermore, the project trained several outstanding graduate students and young researchers for the future workforce in nuclear-related fields, especially nuclear waste immobilization. One postdoc and four PhD students were fully or partially supported through the project, with intensive training in materials science and engineering and expertise in glass science and nuclear waste disposal.

  15. Techniques for deriving tissue structure from multiple projection dual-energy x-ray absorptiometry

    NASA Technical Reports Server (NTRS)

    Feldmesser, Howard S. (Inventor); Charles, Jr., Harry K. (Inventor); Beck, Thomas J. (Inventor); Magee, Thomas C. (Inventor)

    2004-01-01

    Techniques for deriving bone properties from images generated by a dual-energy x-ray absorptiometry apparatus include receiving first image data having pixels indicating bone mineral density projected at a first angle of a plurality of projection angles. Second image data and third image data are also received. The second image data indicates bone mineral density projected at a different, second angle. The third image data indicates bone mineral density projected at a third angle, different from both the first angle and the second angle. Principal moments of inertia for a bone in the subject are computed based on the first, second and third image data. The techniques allow high-precision, high-resolution dual-energy x-ray attenuation images to be used for computing principal moments of inertia and strength moduli of individual bones, plus risk of injury and changes in risk of injury to a patient.
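
    The final step described above is standard rigid-body mechanics: given a recovered cross-sectional density map, the principal moments are the eigenvalues of the second-moment (inertia) tensor. The sketch below shows only that step, under simplifying assumptions (the multi-angle reconstruction of the density map itself is omitted, and the toy cross-section is invented).

```python
# Sketch: given a 2-D cross-sectional mineral-density map rho (arbitrary
# units) on a uniform pixel grid, compute the inertia tensor about the
# centroid and its principal moments. The patent derives such maps from
# three DXA projection angles; that reconstruction is not shown here.
import numpy as np

def principal_moments(rho, pixel_size=1.0):
    ys, xs = np.indices(rho.shape)
    xs = xs * pixel_size
    ys = ys * pixel_size
    mass = rho.sum()
    xbar = (rho * xs).sum() / mass
    ybar = (rho * ys).sum() / mass
    dx, dy = xs - xbar, ys - ybar
    ixx = (rho * dy**2).sum()          # second moment about the x-axis
    iyy = (rho * dx**2).sum()          # second moment about the y-axis
    ixy = -(rho * dx * dy).sum()       # product of inertia
    tensor = np.array([[ixx, ixy], [ixy, iyy]])
    return np.linalg.eigvalsh(tensor)  # principal moments, ascending

rho = np.zeros((64, 64))
rho[20:44, 28:36] = 1.0                # toy elongated "bone" cross-section
print(principal_moments(rho, pixel_size=0.5))
```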

  16. Computer Simulation Performed for Columbia Project Cooling System

    NASA Technical Reports Server (NTRS)

    Ahmad, Jasim

    2005-01-01

    This demo shows a high-fidelity simulation of the air flow in the main computer room housing the Columbia (10,240 Intel Itanium processors) system. The simulation assessed the performance of the cooling system, identified deficiencies, and recommended modifications to eliminate them. It used two in-house software packages on NAS supercomputers: Chimera Grid Tools, to generate a geometric model of the computer room, and the OVERFLOW-2 code, for fluid and thermal simulation. This state-of-the-art technology can be easily extended to provide a general capability for air flow analyses in any modern computer room.

  17. Enabling the First Ever Measurement of Coherent Neutrino Scattering Through Background Neutron Measurements.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Reyna, David; Betty, Rita

    Using High Performance Computing to Examine the Processes of Neurogenesis Underlying Pattern Separation/Completion of Episodic Information - Sandia researchers developed novel methods and metrics for studying the computational function of neurogenesis, thus generating substantial impact for the neuroscience and neural computing communities. This work could benefit applications in machine learning and other analysis activities. The purpose of this project was to computationally model the impact of neural population dynamics within the neurobiological memory system in order to examine how subareas in the brain enable pattern separation and completion of information in memory across time as associated experiences.

  18. TestDose: A nuclear medicine software based on Monte Carlo modeling for generating gamma camera acquisitions and dosimetry

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Garcia, Marie-Paule, E-mail: marie-paule.garcia@univ-brest.fr; Villoing, Daphnée; McKay, Erin

    Purpose: The TestDose platform was developed to generate scintigraphic imaging protocols and associated dosimetry by Monte Carlo modeling. TestDose is part of a broader project (www.dositest.com) whose aim is to identify the biases induced by different clinical dosimetry protocols. Methods: The TestDose software handles the whole pipeline from virtual patient generation to the resulting planar and SPECT images and dosimetry calculations. The originality of the approach lies in the implementation of functional segmentation for the anthropomorphic model representing a virtual patient. Two anthropomorphic models are currently available: 4D XCAT and ICRP 110. A pharmacokinetic model describes the biodistribution of a given radiopharmaceutical in each defined compartment at various time-points. The Monte Carlo simulation toolkit GATE offers the possibility to accurately simulate scintigraphic images and absorbed doses in volumes of interest. The TestDose platform relies on GATE to reproduce precisely any imaging protocol and to provide reference dosimetry. For image generation, TestDose stores the user's imaging requirements and automatically generates command files used as input for GATE. Each compartment is simulated only once and the resulting output is weighted using pharmacokinetic data. Resulting compartment projections are aggregated to obtain the final image. For dosimetry computation, emission data are stored in the platform database and relevant GATE input files are generated for the virtual patient model and associated pharmacokinetics. Results: Two sample software runs are given to demonstrate the potential of TestDose. A clinical imaging protocol for the Octreoscan™ therapeutic treatment was implemented using the 4D XCAT model. Whole-body "step and shoot" acquisitions at different times postinjection and one SPECT acquisition were generated within reasonable computation times. Based on the same Octreoscan™ kinetics, a dosimetry computation performed on the ICRP 110 model is also presented. Conclusions: The proposed platform offers a generic framework to implement any scintigraphic imaging protocol and voxel/organ-based dosimetry computation. Thanks to the modular nature of TestDose, other imaging modalities could be supported in the future, such as positron emission tomography.
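
    The "simulate each compartment once, then weight and aggregate" idea described above reduces to a simple weighted sum of per-compartment projections. The sketch below illustrates that step only; the arrays and activity values are placeholders, not GATE output.

```python
# Minimal sketch of TestDose's aggregation idea: each compartment is
# simulated once (projection P_k per compartment), then weighted by that
# compartment's pharmacokinetic activity A_k(t) at the acquisition time
# and summed into the final image.
import numpy as np

def aggregate_projection(compartment_projections, activities):
    """compartment_projections: (K, H, W) array; activities: K weights."""
    weights = np.asarray(activities)[:, None, None]
    return (weights * compartment_projections).sum(axis=0)

rng = np.random.default_rng(1)
projections = rng.random((3, 128, 128))   # 3 simulated compartments (fake)
activities_at_t = [5.2, 1.7, 0.4]         # activity per compartment at time t
image = aggregate_projection(projections, activities_at_t)
```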

  19. TestDose: A nuclear medicine software based on Monte Carlo modeling for generating gamma camera acquisitions and dosimetry.

    PubMed

    Garcia, Marie-Paule; Villoing, Daphnée; McKay, Erin; Ferrer, Ludovic; Cremonesi, Marta; Botta, Francesca; Ferrari, Mahila; Bardiès, Manuel

    2015-12-01

    The TestDose platform was developed to generate scintigraphic imaging protocols and associated dosimetry by Monte Carlo modeling. TestDose is part of a broader project (www.dositest.com) whose aim is to identify the biases induced by different clinical dosimetry protocols. The TestDose software handles the whole pipeline from virtual patient generation to the resulting planar and SPECT images and dosimetry calculations. The originality of the approach lies in the implementation of functional segmentation for the anthropomorphic model representing a virtual patient. Two anthropomorphic models are currently available: 4D XCAT and ICRP 110. A pharmacokinetic model describes the biodistribution of a given radiopharmaceutical in each defined compartment at various time-points. The Monte Carlo simulation toolkit GATE offers the possibility to accurately simulate scintigraphic images and absorbed doses in volumes of interest. The TestDose platform relies on GATE to reproduce precisely any imaging protocol and to provide reference dosimetry. For image generation, TestDose stores the user's imaging requirements and automatically generates command files used as input for GATE. Each compartment is simulated only once and the resulting output is weighted using pharmacokinetic data. Resulting compartment projections are aggregated to obtain the final image. For dosimetry computation, emission data are stored in the platform database and relevant GATE input files are generated for the virtual patient model and associated pharmacokinetics. Two sample software runs are given to demonstrate the potential of TestDose. A clinical imaging protocol for the Octreoscan™ therapeutic treatment was implemented using the 4D XCAT model. Whole-body "step and shoot" acquisitions at different times postinjection and one SPECT acquisition were generated within reasonable computation times. Based on the same Octreoscan™ kinetics, a dosimetry computation performed on the ICRP 110 model is also presented. The proposed platform offers a generic framework to implement any scintigraphic imaging protocol and voxel/organ-based dosimetry computation. Thanks to the modular nature of TestDose, other imaging modalities could be supported in the future, such as positron emission tomography.

  20. Computation of the ensemble channelized Hotelling observer signal-to-noise ratio for ordered-subset image reconstruction using noisy data

    NASA Astrophysics Data System (ADS)

    Soares, Edward J.; Gifford, Howard C.; Glick, Stephen J.

    2003-05-01

    We investigated the estimation of the ensemble channelized Hotelling observer (CHO) signal-to-noise ratio (SNR) for ordered-subset (OS) image reconstruction using noisy projection data. Previously, we computed the ensemble CHO SNR using a method for approximating the channelized covariance of OS reconstruction, which requires knowledge of the noise-free projection data. Here, we use a "plug-in" approach, in which noisy data is used in place of the noise-free data in the aforementioned channelized covariance approximation. Additionally, we evaluated the use of smoothing of the noisy projections before use in the covariance approximation. The task was detection of a 10% contrast Gaussian signal within a slice of the MCAT phantom. Simulated projections of the MCAT phantom were scaled and Poisson noise was added to create 100 noisy signal-absent data sets. Simulated projections of the scaled signal were then added to the noisy background projections to create 100 noisy signal-present data sets. These noisy data sets were then used to generate 100 estimates of the ensemble CHO SNR for reconstructions at various iterates. For comparison purposes, the same calculation was repeated with the noise-free data. The results, reported as plots of the average CHO SNR generated in this fashion, along with 95% confidence intervals, demonstrate that this approach works very well, and would allow optimization of imaging systems and reconstruction methods using a more accurate object model (i.e., real patient data).
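
    For readers unfamiliar with the CHO figure of merit: channel outputs are v = Tᵀg for each image g, and SNR² = Δv̄ᵀ K⁻¹ Δv̄, where Δv̄ is the mean channel-output difference between signal-present and signal-absent samples and K is the average channel covariance. A minimal plug-in sketch (with a random stand-in channel matrix and synthetic data, not the paper's frequency-selective channels):

```python
# Plug-in CHO SNR sketch: v = T^T g per noisy sample, then
# SNR^2 = dv^T K^{-1} dv with dv the mean channel-output difference and
# K the average channel covariance estimated from the noisy samples.
import numpy as np

def cho_snr(signal_present, signal_absent, channels):
    vs = signal_present @ channels    # (n_samples, n_channels)
    vn = signal_absent @ channels
    dv = vs.mean(axis=0) - vn.mean(axis=0)
    k = 0.5 * (np.cov(vs, rowvar=False) + np.cov(vn, rowvar=False))
    return float(np.sqrt(dv @ np.linalg.solve(k, dv)))

rng = np.random.default_rng(2)
channels = rng.normal(size=(4096, 4))   # stand-in for 4 selective channels
absent = rng.normal(size=(100, 4096))   # 100 signal-absent samples
present = absent + 0.05                 # weak additive "signal"
print(cho_snr(present, absent, channels))
```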

  1. Project-based faculty development for e-learning.

    PubMed

    Vyas, Rashmi; Faith, Minnie; Selvakumar, Dhayakani; Pulimood, Anna; Lee, Mary

    2016-12-01

    The Christian Medical College, Vellore, in collaboration with Tufts University, Boston, conducted an advanced workshop in e-learning for medical faculty members in India. E-learning can enhance educational reforms for today's computer-literate generation, and keep faculty members up to speed in a rapidly changing world. The purpose of this paper is to report on the design and evaluation of a project-based faculty member development programme focused on developing faculty members as educators and as peer trainers who can use e-learning for educational reforms. During a 2-day workshop, 29 participants in groups of two or three developed 13 e-learning projects for implementation in their institutions. Evaluation of the workshop was through written feedback from the participants at the end of the workshop and by telephone interview with one participant from each project group at the end of one year. Content analysis of qualitative data was performed. The participants reported that they were motivated to implement e-learning projects and recognised the need for and usefulness of e-learning. The majority of projects (10 out of 13) were implemented 'to some extent' or 'to a great extent'; they faced challenges with a lack of resources and administrative support, but faculty members were able to overcome them. E-learning can enhance educational reforms for today's computer-literate generation. IMPLICATIONS: Designing feasible e-learning projects in small groups and obtaining hands-on experience with e-learning tools enhance the effectiveness of subsequent implementation. To successfully incorporate e-learning when designing educational reforms, faculty member training, continuing support and infrastructure facilities are essential. © 2016 John Wiley & Sons Ltd.

  2. Instituto Nacional de Electrification, Guatemala Load Dispatch Center and Global Communications Center. Feasibility report (Instituto Nacional de Electrificacion, Guatemala Centro Nacional de Despacho de Carga y Sistema Global de Comunicaciones). Export trade information

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1985-07-01

    The report presents the results of a feasibility study for the National Load Dispatch Center and Global Communications System Project in Guatemala. The project consists of a communication system which will provide Instituto Nacional de Electrificacion (INDE) operations personnel direct voice access to all major power system facilities. In addition, a modern computer-based load dispatch center has been configured on a secure and reliable basis to provide automatic generation control of all major interconnected generating plants within Guatemala.

  3. Tempest: Tools for Addressing the Needs of Next-Generation Climate Models

    NASA Astrophysics Data System (ADS)

    Ullrich, P. A.; Guerra, J. E.; Pinheiro, M. C.; Fong, J.

    2015-12-01

    Tempest is a comprehensive simulation-to-science infrastructure that tackles the needs of next-generation, high-resolution, data intensive climate modeling activities. This project incorporates three key components: TempestDynamics, a global modeling framework for experimental numerical methods and high-performance computing; TempestRemap, a toolset for arbitrary-order conservative and consistent remapping between unstructured grids; and TempestExtremes, a suite of detection and characterization tools for identifying weather extremes in large climate datasets. In this presentation, the latest advances with the implementation of this framework will be discussed, and a number of projects now utilizing these tools will be featured.

  4. FY96-98 Summary Report Mercury: Next Generation Laser for High Energy Density Physics SI-014

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bayramian, A.; Beach, R.; Bibeau, C.

    The scope of the Mercury Laser project encompasses the research, development, and engineering required to build a new generation of diode-pumped solid-state lasers for Inertial Confinement Fusion (ICF). The Mercury Laser will be the first integrated demonstration of laser diodes, crystals, and gas cooling within a scalable laser architecture. This report is intended to summarize the progress accomplished during the first three years of the project. Due to the technological challenges associated with production of 900 nm diode-bars, heatsinks, and high optical-quality Yb:S-FAP crystals, the initial focus of the project was primarily centered on the R&D in these three areas. During the third year of the project, the R&D continued in parallel with the development of computer codes, partial activation of the laser, component testing, and code validation where appropriate.

  5. CGAT: a model for immersive personalized training in computational genomics

    PubMed Central

    Sims, David; Ponting, Chris P.

    2016-01-01

    How should the next generation of genomics scientists be trained while simultaneously pursuing high quality and diverse research? CGAT, the Computational Genomics Analysis and Training programme, was set up in 2010 by the UK Medical Research Council to complement its investment in next-generation sequencing capacity. CGAT was conceived around the twin goals of training future leaders in genome biology and medicine, and providing much needed capacity to UK science for analysing genome scale data sets. Here we outline the training programme employed by CGAT and describe how it dovetails with collaborative research projects to launch scientists on the road towards independent research careers in genomics. PMID:25981124

  6. High-speed high-accuracy three-dimensional shape measurement using digital binary defocusing method versus sinusoidal method

    NASA Astrophysics Data System (ADS)

    Hyun, Jae-Sang; Li, Beiwen; Zhang, Song

    2017-07-01

    This paper presents our research findings on high-speed high-accuracy three-dimensional shape measurement using digital light processing (DLP) technologies. In particular, we compare two different sinusoidal fringe generation techniques using the DLP projection devices: direct projection of computer-generated 8-bit sinusoidal patterns (a.k.a., the sinusoidal method), and the creation of sinusoidal patterns by defocusing binary patterns (a.k.a., the binary defocusing method). This paper mainly examines their performance on high-accuracy measurement applications under precisely controlled settings. Two different projection systems were tested in this study: a commercially available inexpensive projector and the DLP development kit. Experimental results demonstrated that the binary defocusing method always outperforms the sinusoidal method if a sufficient number of phase-shifted fringe patterns can be used.
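
    To make the two pattern-generation strategies concrete, the sketch below builds one period family of 8-bit sinusoidal patterns and their binary counterparts (with defocus approximated by a Gaussian blur), then recovers the wrapped phase with standard N-step phase shifting. Pattern period, blur width, and sign convention are illustrative choices, not the paper's exact settings.

```python
# Sketch of the two fringe-generation strategies compared in the paper:
# direct 8-bit sinusoidal patterns vs. binary patterns that become
# quasi-sinusoidal after projector defocus (modeled here as a Gaussian
# blur), followed by standard N-step phase-shifting phase retrieval.
import numpy as np
from scipy.ndimage import gaussian_filter1d

width, period, steps = 1024, 36, 4
x = np.arange(width)

def sinusoidal(k):
    return 0.5 + 0.5 * np.cos(2 * np.pi * x / period - 2 * np.pi * k / steps)

def binary_defocused(k, blur=6.0):
    binary = (sinusoidal(k) > 0.5).astype(float)   # squared binary pattern
    return gaussian_filter1d(binary, blur)         # defocus approximation

patterns = [binary_defocused(k) for k in range(steps)]
num = sum(p * np.sin(2 * np.pi * k / steps) for k, p in enumerate(patterns))
den = sum(p * np.cos(2 * np.pi * k / steps) for k, p in enumerate(patterns))
phase = np.arctan2(num, den)   # wrapped phase; sign convention may vary
```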

  7. High-speed 3D imaging using digital binary defocusing method vs sinusoidal method

    NASA Astrophysics Data System (ADS)

    Zhang, Song; Hyun, Jae-Sang; Li, Beiwen

    2017-02-01

    This paper presents our research findings on high-speed 3D imaging using digital light processing (DLP) technologies. In particular, we compare two different sinusoidal fringe generation techniques using the DLP projection devices: direct projection of 8-bit computer generated sinusoidal patterns (a.k.a., the sinusoidal method), and the creation of sinusoidal patterns by defocusing binary patterns (a.k.a., the binary defocusing method). This paper mainly examines their performance on high-accuracy measurement applications under precisely controlled settings. Two different projection systems were tested in this study: a commercially available inexpensive projector and the DLP development kit. Experimental results demonstrated that the binary defocusing method always outperforms the sinusoidal method if a sufficient number of phase-shifted fringe patterns can be used.

  8. Integration of an open interface PC scene generator using COTS DVI converter hardware

    NASA Astrophysics Data System (ADS)

    Nordland, Todd; Lyles, Patrick; Schultz, Bret

    2006-05-01

    Commercial-Off-The-Shelf (COTS) personal computer (PC) hardware is increasingly capable of computing high dynamic range (HDR) scenes for military sensor testing at high frame rates. New electro-optical and infrared (EO/IR) scene projectors feature electrical interfaces that can accept the DVI output of these PC systems. However, military Hardware-in-the-loop (HWIL) facilities such as those at the US Army Aviation and Missile Research Development and Engineering Center (AMRDEC) utilize a sizeable inventory of existing projection systems that were designed to use the Silicon Graphics Incorporated (SGI) digital video port (DVP, also known as DVP2 or DD02) interface. To mate the new DVI-based scene generation systems to these legacy projection systems, CG2 Inc., a Quantum3D Company (CG2), has developed a DVI-to-DVP converter called Delta DVP. This device takes progressive scan DVI input, converts it to digital parallel data, and combines and routes color components to derive a 16-bit wide luminance channel replicated on a DVP output interface. The HWIL Functional Area of AMRDEC has developed a suite of modular software to perform deterministic real-time, wave band-specific rendering of sensor scenes, leveraging the features of commodity graphics hardware and open source software. Together, these technologies enable sensor simulation and test facilities to integrate scene generation and projection components with diverse pedigrees.
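
    The core of the Delta DVP conversion described above is combining 8-bit color components into a single 16-bit luminance word. The sketch below shows one way such packing could work; the actual channel-to-bit routing used by the converter is not documented here, so the mapping (green as high byte, red as low byte) is purely an assumption.

```python
# Hypothetical illustration of deriving a 16-bit luminance word from two
# 8-bit DVI color components. The real Delta DVP routing may differ; the
# green-high/red-low mapping below is an assumption for illustration.
import numpy as np

def pack_luminance16(red8, green8):
    return (green8.astype(np.uint16) << 8) | red8.astype(np.uint16)

red = np.array([[0x12, 0xFF]], dtype=np.uint8)
green = np.array([[0xAB, 0x00]], dtype=np.uint8)
print([hex(v) for v in pack_luminance16(red, green).ravel()])
# -> ['0xab12', '0xff']
```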

  9. Eastern Renewable Generation Integration Study: Redefining What’s Possible for Renewable Energy

    ScienceCinema

    Bloom, Aaron

    2018-01-16

    NREL project manager Aaron Bloom introduces NREL’s Eastern Renewable Generation Integration Study (ERGIS) and high-performance computing capabilities and new methodologies that allowed NREL to model operations of the Eastern Interconnection at unprecedented fidelity. ERGIS shows that the Eastern Interconnection can balance the variability and uncertainty of wind and solar photovoltaics at a 5-minute level, for one simulated year.

  10. Computer programs for plotting spot-beam coverages from an earth synchronous satellite and earth-station antenna elevation angle contours

    NASA Technical Reports Server (NTRS)

    Stagl, T. W.; Singh, J. P.

    1972-01-01

    A description and listings of computer programs for plotting geographical and political features of the world or a specified portion of it, for plotting spot-beam coverages from an earth-synchronous satellite over the computer-generated map, and for plotting polar perspective views of the earth and earth-station antenna elevation contours for a given satellite location are presented. The programs were prepared in connection with a project on the application of communication satellites to educational development.
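
    The elevation-angle contours mentioned above follow from standard geostationary-satellite geometry: the elevation at a station depends on the central angle between the station and the subsatellite point. A short sketch of that calculation (station and satellite positions are example values):

```python
# Elevation-angle geometry behind such contour plots: for a geostationary
# satellite, elevation follows from the central angle gamma between the
# earth station and the subsatellite point.
# Constants: Re = 6378 km (earth radius), Rs = 42164 km (orbit radius).
import math

RE, RS = 6378.0, 42164.0

def elevation_deg(station_lat_deg, station_lon_deg, satellite_lon_deg):
    lat = math.radians(station_lat_deg)
    dlon = math.radians(station_lon_deg - satellite_lon_deg)
    cos_gamma = math.cos(lat) * math.cos(dlon)
    gamma = math.acos(cos_gamma)
    return math.degrees(math.atan2(cos_gamma - RE / RS, math.sin(gamma)))

# Example: station at 40N, 75W viewing a satellite stationed at 100W.
print(f"{elevation_deg(40.0, -75.0, -100.0):.1f} deg")   # about 37 deg
```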

  11. Project Listen Compute Show (LCS) - Marine

    DTIC Science & Technology

    2004-02-01

    Figure 15. Block diagram of a BB-5. Notice the discrete components between the FPGA and the display connection. All of these are scheduled to be... scheduled to form the core of the next generation projection product. This architecture is expected to scale to true HDTV resolution of 1920 by 1080...flight schedule obtained from a SABRE database in order to offer on-time status. We have developed more sophisticated mechanisms for dealing with

  12. Holographic memory system based on projection recording of computer-generated 1D Fourier holograms.

    PubMed

    Betin, A Yu; Bobrinev, V I; Donchenko, S S; Odinokov, S B; Evtikhiev, N N; Starikov, R S; Starikov, S N; Zlokazov, E Yu

    2014-10-01

    Utilization of computer generation of holographic structures significantly simplifies the optical scheme used to record the microholograms in a holographic memory recording system. Digital holographic synthesis also makes it possible to account for the nonlinear errors of the recording system and thereby improve microhologram quality. Multiplexed recording of holograms is a widespread technique for increasing the data recording density. In this article we present a holographic memory system based on digital synthesis of amplitude one-dimensional (1D) Fourier transform holograms and the multiplexed recording of these holograms onto the holographic carrier using an optical projection scheme. 1D Fourier transform holograms are very sensitive to the orientation of the anamorphic optical element (a cylindrical lens) that is required for reconstruction of the encoded data object. The multiplexed recording of several holograms with different orientations in an optical projection scheme allowed reconstruction of the data object from each hologram by rotating the cylindrical lens to the corresponding angle. We also discuss two optical schemes for reading out the recorded holograms: a full-page readout system and a line-by-line readout system. We consider the benefits of both systems and present the results of experimental modeling of 1D Fourier hologram non-multiplexed and multiplexed recording and reconstruction.
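
    As a rough numerical illustration of synthesizing one row of an amplitude 1D Fourier-transform hologram, the sketch below interferes a data row's spectrum with a tilted plane reference wave. The carrier frequency, row length, and normalization are illustrative assumptions, not the authors' parameters.

```python
# Rough sketch: synthesize an amplitude 1-D Fourier-transform hologram for
# one row of encoded binary data by interfering its spectrum with a tilted
# plane reference wave and keeping the (real, non-negative) intensity.
import numpy as np

n = 256
data_row = np.random.default_rng(3).integers(0, 2, n).astype(float)
spectrum = np.fft.fftshift(np.fft.fft(data_row))
spectrum /= np.abs(spectrum).max()

carrier = np.exp(2j * np.pi * 0.25 * np.arange(n))   # tilted reference wave
hologram = np.abs(spectrum + carrier) ** 2           # amplitude hologram row
hologram /= hologram.max()                           # scale to 0..1 for output

# Optically, reconstruction needs a cylindrical lens (1-D Fourier transform);
# numerically, an inverse FFT of the hologram reveals the data sidebands.
reconstruction = np.abs(np.fft.ifft(np.fft.ifftshift(hologram)))
```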

  13. Resource Aware Intelligent Network Services (RAINS) Final Technical Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lehman, Tom; Yang, Xi

    The Resource Aware Intelligent Network Services (RAINS) project conducted research and developed technologies in the area of cyberinfrastructure resource modeling and computation. The goal of this work was to provide a foundation to enable intelligent, software-defined services which span the network AND the resources which connect to the network. A Multi-Resource Service Plane (MRSP) was defined, which allows resource owners/managers to locate and place themselves, from a topology and service availability perspective, within the dynamic networked cyberinfrastructure ecosystem. The MRSP enables the presentation of integrated topology views and computation results which can include resources across the spectrum of compute, storage, and networks. The RAINS-developed MRSP includes the following key components: i) the Multi-Resource Service (MRS) Ontology/Multi-Resource Markup Language (MRML), ii) the Resource Computation Engine (RCE), and iii) the Modular Driver Framework (to allow integration of a variety of external resources). The MRS/MRML is a general and extensible modeling framework that allows resource owners to model, or describe, a wide variety of resource types. All resources are described using three categories of elements: resources, services, and relationships between the elements. This modeling framework defines a common method for the transformation of cyberinfrastructure resources into data in the form of MRML models. In order to realize this infrastructure datification, the RAINS project developed a model-based computation system, the RAINS Computation Engine (RCE). The RCE has the ability to ingest, process, integrate, and compute based on automatically generated MRML models. The RCE interacts with the resources through system drivers which are specific to the type of external network or resource controller. The RAINS project developed a modular and pluggable driver system which facilitates a variety of resource controllers automatically generating, maintaining, and distributing MRML-based resource descriptions. Once all of the resource topologies are absorbed by the RCE, a connected graph of the full distributed system topology is constructed, which forms the basis for computation and workflow processing. The RCE includes a Modular Computation Element (MCE) framework which allows tailoring of the computation process to the specific set of resources under control and the services desired. The input and output of an MCE are both model data based on the MRS/MRML ontology and schema. RAINS project accomplishments include: development of a general and extensible multi-resource modeling framework; design of the RCE, with key capabilities to absorb a variety of multi-resource model types and build integrated models; a novel architecture which uses model-based communications across the full stack; flexible provision of abstract or intent-based user-facing interfaces; workflow processing based on model descriptions; release of the RCE as open source software; deployment of the RCE in the University of Maryland/Mid-Atlantic Crossroads ScienceDMZ in prototype mode, with a plan under way to transition to production; deployment at the Argonne National Laboratory DTN Facility in prototype mode; and selection of the RCE by the DOE SENSE (SDN for End-to-end Networked Science at the Exascale) project as the basis for their orchestration service.
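
    The "absorb per-resource models, build one connected graph, then compute over it" workflow described above can be illustrated with a toy sketch. MRML itself is an ontology-based format; the dict-based model and node names below are invented stand-ins.

```python
# Toy sketch of the RCE idea: merge per-resource topology descriptions into
# one connected graph, then answer reachability/path queries over it.
from collections import deque

# Each "resource model" contributes links (compute, storage, network).
models = [
    {("dtn-A", "switch-1"), ("switch-1", "switch-2")},      # site A driver
    {("switch-2", "switch-3"), ("switch-3", "storage-B")},  # site B driver
]

graph = {}
for model in models:
    for u, v in model:
        graph.setdefault(u, set()).add(v)
        graph.setdefault(v, set()).add(u)

def reachable(src, dst):
    seen, queue = {src}, deque([src])
    while queue:
        node = queue.popleft()
        if node == dst:
            return True
        for nxt in graph[node] - seen:
            seen.add(nxt)
            queue.append(nxt)
    return False

print(reachable("dtn-A", "storage-B"))   # True once both models are merged
```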

  14. eduCRATE--a Virtual Hospital architecture.

    PubMed

    Stoicu-Tivadar, Lăcrimioara; Stoicu-Tivadar, Vasile; Berian, Dorin; Drăgan, Simona; Serban, Alexandru; Serban, Corina

    2014-01-01

    eduCRATE is a complex project proposal which aims to develop a virtual learning environment offering interactive digital content through original and integrated solutions using cloud computing, complex multimedia systems in virtual space, and personalized design with avatars. Compared with existing similar products, the project brings the novelty of using languages for medical guidelines in order to ensure maximum flexibility. The Virtual Hospital simulations will create interactive clinical scenarios for which students will find solutions for positive diagnosis and therapeutic management. The solution, based on cloud computing and immersive multimedia, is an attractive option in education because it is economical and matches the current working style of the young generation to whom it is addressed.

  15. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Savic, Vesna; Hector, Louis G.; Ezzat, Hesham

    This paper presents an overview of a four-year project focused on development of an integrated computational materials engineering (ICME) toolset for third generation advanced high-strength steels (3GAHSS). Following a brief look at ICME as an emerging discipline within the Materials Genome Initiative, technical tasks in the ICME project will be discussed. Specific aims of the individual tasks are multi-scale, microstructure-based material model development using state-of-the-art computational and experimental techniques, forming, toolset assembly, design optimization, integration and technical cost modeling. The integrated approach is initially illustrated using a 980 grade transformation induced plasticity (TRIP) steel, subject to a two-step quenching and partitioning (Q&P) heat treatment, as an example.

  16. Use of a virtual world computer environment for international distance education: lessons from a pilot project using Second Life

    PubMed Central

    2014-01-01

    Virtual worlds (VWs), in which participants navigate as avatars through three-dimensional, computer-generated, realistic-looking environments, are emerging as important new technologies for distance health education. However, there is relatively little documented experience using VWs for international healthcare training. The Geneva Foundation for Medical Education and Research (GFMER) conducted a VW training for healthcare professionals enrolled in a GFMER training course. This paper describes the development, delivery, and results of a pilot project undertaken to explore the potential of VWs as an environment for distance healthcare education for an international audience that has generally limited access to conventionally delivered education. PMID:24555833

  17. The DoD's High Performance Computing Modernization Program - Ensuring the National Earth Systems Prediction Capability Becomes Operational

    NASA Astrophysics Data System (ADS)

    Burnett, W.

    2016-12-01

    The Department of Defense's (DoD) High Performance Computing Modernization Program (HPCMP) provides high performance computing to address the most significant challenges in computational resources, software application support and nationwide research and engineering networks. Today, the HPCMP has a critical role in ensuring the National Earth System Prediction Capability (N-ESPC) achieves initial operational status in 2019. A 2015 study commissioned by the HPCMP found that N-ESPC computational requirements will exceed interconnect bandwidth capacity due to the additional load from data assimilation and passing connecting data between ensemble codes. Memory bandwidth and I/O bandwidth will continue to be significant bottlenecks for the Navy's Hybrid Coordinate Ocean Model (HYCOM) scalability - by far the major driver of computing resource requirements in the N-ESPC. The study also found that few of the N-ESPC model developers have detailed plans to ensure their respective codes scale through 2024. Three HPCMP initiatives are designed to directly address and support these issues: Productivity Enhancement, Technology, Transfer and Training (PETTT), the HPCMP Applications Software Initiative (HASI), and Frontier Projects. PETTT supports code conversion by providing assistance, expertise and training in scalable and high-end computing architectures. HASI addresses the continuing need for modern application software that executes effectively and efficiently on next-generation high-performance computers. Frontier Projects enable research and development that could not be achieved using typical HPCMP resources by providing multi-disciplinary teams access to exceptional amounts of high performance computing resources. Finally, the Navy's DoD Supercomputing Resource Center (DSRC) currently operates a 6 Petabyte system, of which Naval Oceanography receives 15% of operational computational system use, or approximately 1 Petabyte of the processing capability. The DSRC will provide the DoD with future computing assets to initially operate the N-ESPC in 2019. This talk will further describe how DoD's HPCMP will ensure N-ESPC becomes operational, efficiently and effectively, using next-generation high performance computing.

  18. The National Grid Project: A system overview

    NASA Technical Reports Server (NTRS)

    Gaither, Adam; Gaither, Kelly; Jean, Brian; Remotigue, Michael; Whitmire, John; Soni, Bharat; Thompson, Joe; Dannenhoffer, John; Weatherill, Nigel

    1995-01-01

    The National Grid Project (NGP) is a comprehensive numerical grid generation software system that is being developed at the National Science Foundation (NSF) Engineering Research Center (ERC) for Computational Field Simulation (CFS) at Mississippi State University (MSU). NGP is supported by a coalition of U.S. industries and federal laboratories. The objective of the NGP is to significantly decrease the amount of time it takes to generate a numerical grid for complex geometries and to increase the quality of these grids to enable computational field simulations for applications in industry. A geometric configuration can be discretized into grids (or meshes) that have two fundamental forms: structured and unstructured. Structured grids are formed by intersecting curvilinear coordinate lines and are composed of quadrilateral (2D) and hexahedral (3D) logically rectangular cells. The connectivity of a structured grid provides for trivial identification of neighboring points by incrementing coordinate indices. Unstructured grids are composed of cells of any shape (commonly triangles, quadrilaterals, tetrahedra and hexahedra), but do not have trivial identification of neighbors by incrementing an index. For unstructured grids, a set of points and an associated connectivity table is generated to define unstructured cell shapes and neighboring points. Hybrid grids are a combination of structured grids and unstructured grids. Chimera (overset) grids are intersecting or overlapping structured grids. The NGP system currently provides a user interface that integrates both 2D and 3D structured and unstructured grid generation, a solid modeling topology data management system, an internal Computer Aided Design (CAD) system based on Non-Uniform Rational B-Splines (NURBS), a journaling language, and a grid/solution visualization system.
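
    The structured-versus-unstructured distinction drawn above comes down to how neighbors are found: by incrementing indices on a logically rectangular grid, or by consulting an explicit connectivity table. A small sketch of both lookups (grid sizes and the toy triangle table are invented):

```python
# Structured: neighbors of cell (i, j) on an N x M logically rectangular
# grid are found by simply incrementing indices.
def structured_neighbors(i, j, n, m):
    candidates = [(i - 1, j), (i + 1, j), (i, j - 1), (i, j + 1)]
    return [(a, b) for a, b in candidates if 0 <= a < n and 0 <= b < m]

# Unstructured: a connectivity table maps each triangle to its vertex ids;
# a point's neighbors must be gathered by scanning the cells that share it.
triangles = [(0, 1, 2), (1, 3, 2), (2, 3, 4)]

def point_neighbors(point):
    neighbors = set()
    for tri in triangles:
        if point in tri:
            neighbors.update(tri)
    neighbors.discard(point)
    return sorted(neighbors)

print(structured_neighbors(4, 0, 10, 10))  # [(3, 0), (5, 0), (4, 1)]
print(point_neighbors(2))                  # [0, 1, 3, 4]
```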

  19. A Summary of NASA Research Exploring the Acoustics of Small Unmanned Aerial Systems

    NASA Technical Reports Server (NTRS)

    Zawodny, Nikolas S.; Christian, Andrew; Cabell, Randolph

    2018-01-01

    Proposed uses of small unmanned aerial systems (sUAS) have the potential to expose large portions of communities to a new noise source. In order to understand the potential noise impact of sUAS, NASA initiated acoustics research as one component of the 3-year DELIVER project, with the goal of documenting the feasibility of using existing aircraft design tools and methods on this class of vehicles. This paper summarizes the acoustics research conducted within the DELIVER project. The research described here represents an initial study, and subsequent research building on the findings of this work has been proposed for other NASA projects. The paper summarizes acoustics research in four areas: measurements of noise generated by flyovers of small unmanned aerial vehicles, measurements in controlled test facilities to understand the noise generated by components of these vehicles, computational predictions of component and full vehicle noise, and psychoacoustic tests including auralizations conducted to assess human annoyance to the noise generated by these vehicles.

  20. Electric Grid Expansion Planning with High Levels of Variable Generation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hadley, Stanton W.; You, Shutang; Shankar, Mallikarjun

    2016-02-01

    Renewables account for a growing proportion of generation capacity in U.S. power grids. As their variability has an increasing influence on power system operation, it is necessary to consider their impact in system expansion planning. To this end, this project studies the generation and transmission expansion co-optimization problem of the US Eastern Interconnection (EI) power grid with a high wind power penetration rate. In this project, the generation and transmission expansion problem for the EI system is modeled as a mixed-integer programming (MIP) problem. This study analyzed a time series creation method to capture the diversity of load and wind power across balancing regions in the EI system. The obtained time series can be easily introduced into the MIP co-optimization problem and then solved robustly through available MIP solvers. Simulation results show that the proposed time series generation method and the expansion co-optimization model can improve the expansion result significantly after considering the diversity of wind and load across EI regions. The improved expansion plan that combines generation and transmission will aid system planners and policy makers in maximizing social welfare. This study shows that modeling load and wind variations and diversities across balancing regions produces significantly different expansion results compared with former studies. For example, if wind is modeled in more detail (by increasing the number of wind output levels) so that more wind blocks are considered in expansion planning, transmission expansion will be larger and the expansion timing will be earlier. Regarding generation expansion, more wind scenarios will slightly reduce wind generation expansion in the EI system and increase the expansion of other generation such as gas. Also, adopting detailed wind scenarios reveals that it may be uneconomic to expand transmission networks for transmitting a large amount of wind power over a long distance in the EI system. Incorporating more details of renewables in expansion planning will inevitably increase the computational burden. Therefore, high performance computing (HPC) techniques are urgently needed for power system operation and planning optimization. As a scoping study task, this project tested some preliminary parallel computation techniques such as breaking the simulation task down into several sub-tasks based on chronology splitting or sample splitting, and then assigning these sub-tasks to different cores. Testing results show significant time reduction when a simulation task is split into several sub-tasks for parallel execution.
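
    To make the MIP co-optimization idea concrete, here is a deliberately tiny sketch: choose new gas capacity and whether to build one candidate line to serve two load blocks with fixed wind profiles. All numbers and names are invented, and PuLP with the bundled CBC solver is assumed to be installed; the real model spans the whole Eastern Interconnection with chronological wind/load time series.

```python
# Toy generation/transmission expansion co-optimization MIP.
import pulp

blocks = [0, 1]                     # two representative time blocks
load = {0: 90.0, 1: 140.0}          # MW demand per block at bus 2
wind = {0: 60.0, 1: 10.0}           # MW wind available at bus 1 per block

prob = pulp.LpProblem("toy_expansion", pulp.LpMinimize)
gas_cap = pulp.LpVariable("gas_capacity_mw", lowBound=0)      # build at bus 2
build_line = pulp.LpVariable("build_line", cat="Binary")      # bus1 -> bus2
flow = {b: pulp.LpVariable(f"flow_{b}", lowBound=0) for b in blocks}
gas = {b: pulp.LpVariable(f"gas_{b}", lowBound=0) for b in blocks}

# Annualized build costs plus a small fuel proxy cost (invented numbers).
prob += 50 * gas_cap + 4000 * build_line + pulp.lpSum(10 * gas[b] for b in blocks)
for b in blocks:
    prob += flow[b] <= wind[b]              # line carries wind only
    prob += flow[b] <= 100 * build_line     # 100 MW line, if built
    prob += gas[b] <= gas_cap               # dispatch within built capacity
    prob += flow[b] + gas[b] >= load[b]     # serve the load block

prob.solve(pulp.PULP_CBC_CMD(msg=False))
print(pulp.value(gas_cap), pulp.value(build_line))
```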

  1. A new DoD initiative: the Computational Research and Engineering Acquisition Tools and Environments (CREATE) program

    NASA Astrophysics Data System (ADS)

    Arevalo, S.; Atwood, C.; Bell, P.; Blacker, T. D.; Dey, S.; Fisher, D.; Fisher, D. A.; Genalis, P.; Gorski, J.; Harris, A.; Hill, K.; Hurwitz, M.; Kendall, R. P.; Meakin, R. L.; Morton, S.; Moyer, E. T.; Post, D. E.; Strawn, R.; Veldhuizen, D. v.; Votta, L. G.; Wynn, S.; Zelinski, G.

    2008-07-01

    In FY2008, the U.S. Department of Defense (DoD) initiated the Computational Research and Engineering Acquisition Tools and Environments (CREATE) program, a $360M program with a two-year planning phase and a ten-year execution phase. CREATE will develop and deploy three computational engineering tool sets for DoD acquisition programs to use to design aircraft, ships and radio-frequency antennas. The planning and execution of CREATE are based on the 'lessons learned' from case studies of large-scale computational science and engineering projects. The case studies stress the importance of a stable, close-knit development team; a focus on customer needs and requirements; verification and validation; flexible and agile planning, management, and development processes; risk management; realistic schedules and resource levels; balanced short- and long-term goals and deliverables; and stable, long-term support by the program sponsor. Since it began in FY2008, the CREATE program has built a team and project structure, developed requirements and begun validating them, identified candidate products, established initial connections with the acquisition programs, begun detailed project planning and development, and generated the initial collaboration infrastructure necessary for success by its multi-institutional, multidisciplinary teams.

  2. India's Computational Biology Growth and Challenges.

    PubMed

    Chakraborty, Chiranjib; Bandyopadhyay, Sanghamitra; Agoramoorthy, Govindasamy

    2016-09-01

    India's computational science is growing swiftly due to the rapid expansion of internet and information technology services. The bioinformatics sector of India has been transforming rapidly, creating a competitive position in the global bioinformatics market. Bioinformatics is widely used across India to address a wide range of biological issues. Recently, computational researchers and biologists have been collaborating in projects such as database development, sequence analysis, genomic prospecting and algorithm generation. In this paper, we present the Indian computational biology scenario, highlighting bioinformatics-related educational activities, manpower development, the internet boom, the service industry, research activities, and the conferences and training undertaken by the corporate and government sectors. Nonetheless, this new field of science faces many challenges.

  3. Automatic documentation system extension to multi-manufacturers' computers and to measure, improve, and predict software reliability. Appendix A and B

    NASA Technical Reports Server (NTRS)

    Simmons, D. B.

    1975-01-01

    A manual which explains how to use the DOMONIC command language is presented. The manual consists of the following sections: 1. Introduction; 2. What You Must Know to Use DOMONIC; 3. Functions of DOMONIC; 4. Entering and Manipulating Data; 5. Templates and Data Definitions; 6. Recipes and Document Generation; 7. Initiating a Project; and 8. Entering and Changing Security Controls. The first four sections must be known by all DOMONIC users. Sections five, seven and eight describe functions normally used only by project management. Section six tells how to generate documents. The manual specifies what commands to use in performing each of the functions mentioned above.

  4. Kinematic Measurement of Knee Prosthesis from Single-Plane Projection Images

    NASA Astrophysics Data System (ADS)

    Hirokawa, Shunji; Ariyoshi, Shogo; Takahashi, Kenji; Maruyama, Koichi

    In this paper, the measurement of 3D motion from 2D perspective projections of a knee prosthesis is described. The technique reported by Banks and Hodge was further developed in this study. The estimation was performed in two steps. The first-step estimation was performed under the assumption of orthogonal projection. The second-step estimation was then carried out based upon the perspective projection to accomplish a more accurate estimate. The simulation results demonstrated that the technique achieved sufficient position/orientation estimation accuracy for prosthetic kinematics. We then applied our algorithm to CCD images, thereby examining the influences of various artifacts, possibly incorporated through the imaging process, on the estimation accuracies. We found that accuracies in the experiment were influenced mainly by the geometric discrepancies between the prosthesis component and the computer-generated model, and by the spatial inconsistencies between the coordinate axes of the positioner and those of the computer model. However, we verified that our algorithm could achieve proper and consistent estimation even for the CCD images.

  5. CGAT: a model for immersive personalized training in computational genomics.

    PubMed

    Sims, David; Ponting, Chris P; Heger, Andreas

    2016-01-01

    How should the next generation of genomics scientists be trained while simultaneously pursuing high quality and diverse research? CGAT, the Computational Genomics Analysis and Training programme, was set up in 2010 by the UK Medical Research Council to complement its investment in next-generation sequencing capacity. CGAT was conceived around the twin goals of training future leaders in genome biology and medicine, and providing much needed capacity to UK science for analysing genome scale data sets. Here we outline the training programme employed by CGAT and describe how it dovetails with collaborative research projects to launch scientists on the road towards independent research careers in genomics. © The Author 2015. Published by Oxford University Press.

  6. Power monitoring and control for large scale projects: SKA, a case study

    NASA Astrophysics Data System (ADS)

    Barbosa, Domingos; Barraca, João. Paulo; Maia, Dalmiro; Carvalho, Bruno; Vieira, Jorge; Swart, Paul; Le Roux, Gerhard; Natarajan, Swaminathan; van Ardenne, Arnold; Seca, Luis

    2016-07-01

    Large sensor-based science infrastructures for radio astronomy like the SKA will be among the most intensive data-driven projects in the world, facing very high computation, storage, management, and above all power demands. The geographically wide distribution of the SKA and its associated processing requirements, in the form of tailored High Performance Computing (HPC) facilities, require a greener approach to the Information and Communications Technologies (ICT) adopted for the data processing, to enable operational compliance with potentially strict power budgets. Reducing electricity costs, improving system power monitoring, and managing electricity generation at the system level are paramount to avoid future inefficiencies and higher costs, and to enable fulfillment of the key science cases. Here we outline major characteristics and innovation approaches to address power efficiency and long-term power sustainability for radio astronomy projects, focusing on green ICT for science and smart power monitoring and control.

  7. Four-body trajectory optimization

    NASA Technical Reports Server (NTRS)

    Pu, C. L.; Edelbaum, T. N.

    1974-01-01

    A comprehensive optimization program has been developed for computing fuel-optimal trajectories between the earth and a point in the sun-earth-moon system. It presents methods for generating fuel-optimal two-impulse trajectories, which may originate at the earth or at a point in space, and fuel-optimal three-impulse trajectories between two points in space. The extrapolation of the state vector and the computation of the state transition matrix are accomplished by the Stumpff-Weiss method. The cost and constraint gradients are computed analytically in terms of the terminal state and the state transition matrix. The 4-body Lambert problem is solved by using the Newton-Raphson method. An accelerated gradient projection method is used to optimize a 2-impulse trajectory with a terminal constraint. Davidon's Variance Method is used both in the accelerated gradient projection method and in the outer loop of the 3-impulse trajectory optimization problem.
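
    To illustrate the Newton-Raphson targeting step used for the Lambert problem, the generic sketch below iterates on an initial velocity until a propagated endpoint matches a target, using a finite-difference Jacobian. The propagator here is a trivial placeholder, not the paper's 4-body Stumpff-Weiss extrapolation.

```python
# Generic Newton-Raphson targeting sketch in the spirit of a Lambert solve:
# adjust v0 until propagate(v0) hits the target position.
import numpy as np

def propagate(v0):
    # Placeholder dynamics standing in for the 4-body propagator.
    return v0 * 2.0 + np.array([0.1, -0.2, 0.05]) * np.linalg.norm(v0)

def lambert_newton(target, v0, tol=1e-10, max_iter=50):
    for _ in range(max_iter):
        miss = propagate(v0) - target
        if np.linalg.norm(miss) < tol:
            return v0
        # Numerical Jacobian d(endpoint)/d(v0) by finite differences.
        jac = np.empty((3, 3))
        for k in range(3):
            dv = np.zeros(3)
            dv[k] = 1e-6
            jac[:, k] = (propagate(v0 + dv) - propagate(v0)) / 1e-6
        v0 = v0 - np.linalg.solve(jac, miss)   # Newton update
    raise RuntimeError("Newton iteration did not converge")

print(lambert_newton(np.array([1.0, 2.0, 3.0]), np.ones(3)))
```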

  8. Turbine Internal and Film Cooling Modeling For 3D Navier-Stokes Codes

    NASA Technical Reports Server (NTRS)

    DeWitt, Kenneth; Garg, Vijay; Ameri, Ali

    2005-01-01

    The aim of this research project is to make use of NASA Glenn on-site computational facilities in order to develop, validate and apply aerodynamic, heat transfer, and turbine cooling models for use in advanced 3D Navier-Stokes Computational Fluid Dynamics (CFD) codes such as the Glenn-HT code. Specific areas of effort include: application of the Glenn-HT code to specific configurations made available under the Turbine Based Combined Cycle (TBCC) and Ultra Efficient Engine Technology (UEET) projects, and validating the use of a multi-block code for the time-accurate computation of the detailed flow and heat transfer of cooled turbine airfoils. The goal of the current research is to improve the predictive ability of the Glenn-HT code. This will enable the design of more efficient turbine components for both aviation and power generation. The models will be tested against specific configurations provided by NASA Glenn.

  9. The use of PC based VR in clinical medicine: the VREPAR projects.

    PubMed

    Riva, G; Bacchetta, M; Baruffi, M; Borgomainerio, E; Defrance, C; Gatti, F; Galimberti, C; Fontaneto, S; Marchi, S; Molinari, E; Nugues, P; Rinaldi, S; Rovetta, A; Ferretti, G S; Tonci, A; Wann, J; Vincelli, F

    1999-01-01

    Virtual reality (VR) is an emerging technology that alters the way individuals interact with computers: a 3D computer-generated environment in which a person can move about and interact as if actually inside it. Given the high computational power required to create virtual environments, these are usually developed on expensive high-end workstations. However, the significant advances in PC hardware that have been made over the last three years are making PC-based VR a possible solution for clinical assessment and therapy. VREPAR - Virtual Reality Environments for Psychoneurophysiological Assessment and Rehabilitation - are two European Community funded projects (Telematics for health - HC 1053/HC 1055 - http://www.psicologia.net) that are trying to develop a modular PC-based virtual reality system for the medical market. The paper describes the rationale of the developed modules and the preliminary results obtained.

  10. Technologies for Large Data Management in Scientific Computing

    NASA Astrophysics Data System (ADS)

    Pace, Alberto

    2014-01-01

    In recent years, intense usage of computing has been the main strategy of investigations in several scientific research projects. The progress in computing technology has opened unprecedented opportunities for systematic collection of experimental data and the associated analysis that were considered impossible only a few years ago. This paper focuses on the strategies in use: it reviews the various components that are necessary for an effective solution that ensures the storage, the long-term preservation, and the worldwide distribution of the large quantities of data that are necessary in a large scientific research project. The paper also mentions several examples of data management solutions used in High Energy Physics for the CERN Large Hadron Collider (LHC) experiments in Geneva, Switzerland, which generate more than 30,000 terabytes of data every year that need to be preserved, analyzed, and made available to a community of several tens of thousands of scientists worldwide.

  11. A Bitslice Implementation of Anderson's Attack on A5/1

    NASA Astrophysics Data System (ADS)

    Bulavintsev, Vadim; Semenov, Alexander; Zaikin, Oleg; Kochemazov, Stepan

    2018-03-01

    The A5/1 keystream generator is a part of the Global System for Mobile Communications (GSM) protocol, employed in cellular networks all over the world. Its cryptographic resistance has been extensively analyzed in dozens of papers. However, almost all corresponding methods either employ specialized hardware or require an extensive preprocessing stage and significant amounts of memory. In the present study, a bitslice variant of Anderson's Attack on A5/1 is implemented. It requires very little computer memory and no preprocessing. Moreover, the attack can be made even more efficient by harnessing the computing power of modern Graphics Processing Units (GPUs). As a result, using commonly available GPUs this method can quite efficiently recover the secret key using only 64 bits of keystream. To test the performance of the implementation, a volunteer computing project was launched. Ten instances of A5/1 cryptanalysis were successfully solved in this project in a single week.
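
    The bitslicing idea is that bit i of every register is stored across one 64-bit machine word, so 64 independent A5/1 instances are evaluated with each bitwise operation. The sketch below shows only the majority-clocking decision (A5/1 clocks a register when its clocking bit agrees with the majority of the three clocking bits); the full attack adds the LFSR feedback taps and keystream equations.

```python
# Bitsliced A5/1 majority clocking: one 64-bit word holds the clocking bit
# of 64 independent cipher instances; all instances are decided at once.
MASK = (1 << 64) - 1

def majority(c1, c2, c3):
    # Bitwise majority across all 64 instances simultaneously.
    return (c1 & c2) | (c1 & c3) | (c2 & c3)

def clock_enable(c1, c2, c3):
    m = majority(c1, c2, c3)
    # Register k advances in instance i when bit i of enable_k is set,
    # i.e. when the register's clocking bit equals the majority bit.
    return (~(c1 ^ m) & MASK, ~(c2 ^ m) & MASK, ~(c3 ^ m) & MASK)

# Example: instance 0 has clocking bits (1,1,0); instance 1 has (0,0,0).
c1, c2, c3 = 0b01, 0b01, 0b00
e1, e2, e3 = clock_enable(c1, c2, c3)
print(e1 & 1, e2 & 1, e3 & 1)   # 1 1 0 -> registers 1, 2 advance in inst. 0
print((e1 >> 1) & 1, (e2 >> 1) & 1, (e3 >> 1) & 1)   # 1 1 1 -> all advance
```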

  12. A pervasive parallel framework for visualization: final report for FWP 10-014707

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Moreland, Kenneth D.

    2014-01-01

    We are on the threshold of a transformative change in the basic architecture of high-performance computing. The use of accelerator processors, characterized by large core counts, shared but asymmetrical memory, and heavy thread loading, is quickly becoming the norm in high performance computing. These accelerators represent significant challenges in updating our existing base of software. An intrinsic problem with this transition is a fundamental programming shift from message-passing processes to much finer-grained thread scheduling with memory sharing. Another problem is the lack of stability in accelerator implementation; processor and compiler technology is currently changing rapidly. This report documents the results of our three-year ASCR project to address these challenges. Our project includes the development of the Dax toolkit, which contains the beginnings of new algorithms for a new generation of computers and the underlying infrastructure to rapidly prototype and build further algorithms as necessary.

  13. Acceleration of incremental-pressure-correction incompressible flow computations using a coarse-grid projection method

    NASA Astrophysics Data System (ADS)

    Kashefi, Ali; Staples, Anne

    2016-11-01

    Coarse grid projection (CGP) methodology is a novel multigrid method for systems involving decoupled nonlinear evolution equations and linear elliptic equations. The nonlinear equations are solved on a fine grid and the linear equations are solved on a corresponding coarsened grid. Mapping functions transfer data between the two grids. Here we propose a version of CGP for incompressible flow computations using incremental pressure correction methods, called IFEi-CGP (implicit-time-integration, finite-element, incremental coarse grid projection). Incremental pressure correction schemes solve Poisson's equation for an intermediate variable and not the pressure itself. This fact contributes to IFEi-CGP's efficiency in two ways. First, IFEi-CGP preserves the velocity field accuracy even for a high level of pressure field grid coarsening, and thus significant speedup is achieved. Second, because incremental schemes reduce the errors that arise from boundaries with artificial homogeneous Neumann conditions, CGP generates undamped flows for simulations with velocity Dirichlet boundary conditions. Comparisons of the data accuracy and CPU times for the incremental-CGP versus non-incremental-CGP computations are presented.
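
    A conceptual 1D sketch of the CGP idea follows: the expensive elliptic (pressure-like) solve is performed on a grid coarsened by a factor of two, with injection and linear interpolation standing in for the paper's mapping functions. Grid sizes and the source term are illustrative choices, not the authors' finite-element setup.

```python
# 1-D coarse grid projection sketch: restrict the fine-grid source term,
# solve the Poisson problem on the coarse grid, prolong the result back.
import numpy as np

def restrict(fine):                    # fine (2n+1 nodes) -> coarse (n+1)
    return fine[::2]

def prolong(coarse):                   # linear interpolation back to fine
    fine = np.empty(2 * len(coarse) - 1)
    fine[::2] = coarse
    fine[1::2] = 0.5 * (coarse[:-1] + coarse[1:])
    return fine

def poisson_solve(rhs, h):
    """Direct 1-D Poisson solve, u'' = rhs, with u = 0 at both ends."""
    n = len(rhs) - 2
    a = (np.diag(-2.0 * np.ones(n)) + np.diag(np.ones(n - 1), 1)
         + np.diag(np.ones(n - 1), -1))
    u = np.zeros_like(rhs)
    u[1:-1] = np.linalg.solve(a / h**2, rhs[1:-1])
    return u

fine_x = np.linspace(0.0, 1.0, 65)
divergence = np.sin(np.pi * fine_x)          # stand-in for the source term
coarse_rhs = restrict(divergence)
coarse_u = poisson_solve(coarse_rhs, fine_x[2] - fine_x[0])
pressure_on_fine = prolong(coarse_u)         # used to correct fine velocities
```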

  14. Learning to Drive a Wheelchair in Virtual Reality

    ERIC Educational Resources Information Center

    Inman, Dean P.; Loge, Ken; Cram, Aaron; Peterson, Missy

    2011-01-01

    This research project studied the effect that a technology-based training program, WheelchairNet, could have on the education of children with physical disabilities by providing a chance to practice driving virtual motorized wheelchairs safely within a computer-generated world. Programmers created three virtual worlds for training. Scenarios…

  15. An Interactive Introduction to Protein Structure

    ERIC Educational Resources Information Center

    Lee, W. Theodore

    2004-01-01

    To improve student understanding of protein structure and the significance of noncovalent interactions in protein structure and function, students are assigned a project to write a paper complemented with computer-generated images. The assignment provides an opportunity for students to select a protein structure that is of interest and detail…

  16. 9 CFR 317.4 - Labeling approval.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... Devices, except for generically approved labeling authorized for use in § 317.5(b). The management of the... indication of final color, as specified in § 317.2. FSIS will accept sketches that are hand drawn, computer generated or other reasonable facsimiles that clearly reflect and project the final version of the labeling...

  17. 15 CFR 750.7 - Issuance of licenses.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    .... After a license application is approved, a computer generated license is issued by the Department of... technology, commodities will be approved with a quantity or dollar value limit. The “Unit” paragraph within... is related to a multi-year project, when production lead time will not permit an export or reexport...

  18. 47 CFR 2.911 - Written application required.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... representative who shall indicate his title, such as plant manager, project engineer, etc. (d) Technical test... signature; however, the Office of Engineering and Technology may allow signature by any symbol executed or... computer-generated electronic impulses. [39 FR 5919, Feb. 15, 1974, as amended at 39 FR 27802, Aug. 1, 1974...

  19. 47 CFR 2.911 - Written application required.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... representative who shall indicate his title, such as plant manager, project engineer, etc. (d) Technical test... signature; however, the Office of Engineering and Technology may allow signature by any symbol executed or... computer-generated electronic impulses. [39 FR 5919, Feb. 15, 1974, as amended at 39 FR 27802, Aug. 1, 1974...

  20. The Newsroom to the Year 2001.

    ERIC Educational Resources Information Center

    Keirstead, Phillip O.

    A projected scenario for a television broadcast newsroom in 2001 includes a nearly completely computerized system, one that will write scripts, select and create graphics, organize newscasts and visuals, keep records, do research, and manage the newsroom from terminals. This computer system will generate many more newscasts…

  1. Evaluation of Computer-Assisted Instruction in Principles of Economics.

    ERIC Educational Resources Information Center

    Coates, Dennis; Humphreys, Brad R.

    2001-01-01

    Assesses the effectiveness of supplementary Web-based materials and activities in traditional introductory college economics courses. Results suggest that faculty should focus more on developing self-test quizzes and effective bulletin board discussion projects as opposed to generating online content related to text or lecture notes. (Author/LRW)

  2. 47 CFR 2.911 - Written application required.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... representative who shall indicate his title, such as plant manager, project engineer, etc. (d) Technical test... signature; however, the Office of Engineering and Technology may allow signature by any symbol executed or... computer-generated electronic impulses. [39 FR 5919, Feb. 15, 1974, as amended at 39 FR 27802, Aug. 1, 1974...

  3. 47 CFR 2.911 - Written application required.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... representative who shall indicate his title, such as plant manager, project engineer, etc. (d) Technical test... signature; however, the Office of Engineering and Technology may allow signature by any symbol executed or... computer-generated electronic impulses. [39 FR 5919, Feb. 15, 1974, as amended at 39 FR 27802, Aug. 1, 1974...

  4. 47 CFR 2.911 - Written application required.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... representative who shall indicate his title, such as plant manager, project engineer, etc. (d) Technical test... signature; however, the Office of Engineering and Technology may allow signature by any symbol executed or... computer-generated electronic impulses. [39 FR 5919, Feb. 15, 1974, as amended at 39 FR 27802, Aug. 1, 1974...

  5. Computer driven optical keratometer and method of evaluating the shape of the cornea

    NASA Technical Reports Server (NTRS)

    Baroth, Edmund C. (Inventor); Mouneimme, Samih A. (Inventor)

    1994-01-01

    An apparatus and method for measuring the shape of the cornea utilize only one reticle to generate a pattern of rings projected onto the surface of a subject's eye. The reflected pattern is focused onto an imaging device such as a video camera, and a computer compares the reflected pattern with a reference pattern stored in the computer's memory. The differences between the reflected and stored patterns are used to calculate the deformation of the cornea, which may be useful for pre- and post-operative evaluation of the eye by surgeons.

  6. Space-time least-squares Petrov-Galerkin projection in nonlinear model reduction.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Choi, Youngsoo; Carlberg, Kevin Thomas

    Our work proposes a space-time least-squares Petrov-Galerkin (ST-LSPG) projection method for model reduction of nonlinear dynamical systems. In contrast to typical nonlinear model-reduction methods that first apply Petrov-Galerkin projection in the spatial dimension and subsequently apply time integration to numerically resolve the resulting low-dimensional dynamical system, the proposed method applies projection in space and time simultaneously. To accomplish this, the method first introduces a low-dimensional space-time trial subspace, which can be obtained by computing tensor decompositions of state-snapshot data. The method then computes discrete-optimal approximations in this space-time trial subspace by minimizing the residual arising after time discretization over all space and time in a weighted ℓ2-norm. This norm can be defined to enable complexity reduction (i.e., hyper-reduction) in time, which leads to space-time collocation and space-time GNAT variants of the ST-LSPG method. Advantages of the approach relative to typical spatial-projection-based nonlinear model reduction methods such as Galerkin projection and least-squares Petrov-Galerkin projection include: (1) a reduction of both the spatial and temporal dimensions of the dynamical system, (2) the removal of spurious temporal modes (e.g., unstable growth) from the state space, and (3) error bounds that exhibit slower growth in time. Numerical examples performed on model problems in fluid dynamics demonstrate the ability of the method to generate orders-of-magnitude computational savings relative to spatial-projection-based reduced-order models without sacrificing accuracy.
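
    At its core, the method replaces time stepping with one least-squares problem over reduced space-time coordinates. A minimal Gauss-Newton sketch, assuming a user-supplied space-time residual and Jacobian and a basis Phi (which the paper obtains from tensor decompositions of snapshots):

        import numpy as np

        def st_lspg(residual, jacobian, Phi, y0, iters=20, tol=1e-10):
            # Minimize ||r(Phi @ y)||_2 over the k reduced coordinates y,
            # instead of integrating n_space unknowns over n_time steps.
            y = np.array(y0, dtype=float)
            for _ in range(iters):
                r = residual(Phi @ y)         # full space-time residual
                J = jacobian(Phi @ y) @ Phi   # reduced Jacobian, shape (n, k)
                dy, *_ = np.linalg.lstsq(J, -r, rcond=None)
                y += dy
                if np.linalg.norm(dy) <= tol * max(1.0, np.linalg.norm(y)):
                    break
            return Phi @ y                    # reconstructed space-time state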

  7. Description of the F-16XL Geometry and Computational Grids Used in CAWAPI

    NASA Technical Reports Server (NTRS)

    Boelens, O. J.; Badcock, K. J.; Gortz, S.; Morton, S.; Fritz, W.; Karman, S. L., Jr.; Michal, T.; Lamar, J. E.

    2009-01-01

    The objective of the Cranked-Arrow Wing Aerodynamics Project International (CAWAPI) was to allow a comprehensive validation of Computational Fluid Dynamics methods against the CAWAP flight database. A major part of this work involved the generation of high-quality computational grids. Prior to the grid generation, an IGES file containing the air-tight geometry of the F-16XL aircraft was generated in a cooperation of the CAWAPI partners. Based on this geometry description, both structured and unstructured grids have been generated. The baseline structured (multi-block) grid (and a family of derived grids) was generated by the National Aerospace Laboratory NLR. Although the algorithms used by NLR had become available just before CAWAPI, and thus only limited experience with their application to such a complex configuration had been gained, a grid of good quality was generated well within four weeks. This time compared favourably with that required to produce the unstructured grids in CAWAPI. The baseline all-tetrahedral and hybrid unstructured grids were generated at NASA Langley Research Center and the USAFA, respectively. To provide more geometrical resolution, trimmed unstructured grids were generated at EADS-MAS, the UTSimCenter, Boeing Phantom Works and KTH/FOI. All grids generated within the framework of CAWAPI are discussed in the article. Results obtained on both the structured and the unstructured grids showed a significant improvement in agreement with flight test data in comparison with those obtained on the structured multi-block grid used during CAWAP.

  8. Effects on Training Using Illumination in Virtual Environments

    NASA Technical Reports Server (NTRS)

    Maida, James C.; Novak, M. S. Jennifer; Mueller, Kristian

    1999-01-01

    Camera-based tasks are commonly performed during orbital operations, and orbital lighting conditions, such as high-contrast shadowing and glare, are a factor in performance. Computer-based training using virtual environments is a common tool used to make and keep CTW members proficient. If computer-based training included some of these harsh lighting conditions, would the crew increase their proficiency? The project goal was to determine whether computer-based training increases proficiency if one trains for a camera-based task using computer-generated virtual environments with enhanced lighting conditions, such as shadows and glare, rather than the color-shaded computer images normally used in simulators. Previous experiments were conducted using a two-degree-of-freedom docking system: test subjects had to align a boresight camera using a hand controller with one axis of translation and one axis of rotation. Two sets of subjects were trained on two computer simulations using computer-generated virtual environments, one with lighting and one without. Results revealed that when subjects were constrained by time and accuracy, those who trained with simulated lighting conditions performed significantly better than those who did not. To reinforce these results for speed and accuracy, the task complexity was increased.

  9. Laser Boron Fusion Reactor With Picosecond Petawatt Block Ignition

    NASA Astrophysics Data System (ADS)

    Hora, Heinrich; Eliezer, Shalom; Wang, Jiaxiang; Korn, Georg; Nissim, Noaz; Xu, Yan-Xia; Lalousis, Paraskevas; Kirchhoff, Gotz J.; Miley, George H.

    2018-05-01

    For the development of a laser boron fusion reactor driven by picosecond laser pulses of more than 30 petawatts power, advances are reported in computations of plasma block generation by the dielectric explosion of the interaction. Further results concern the direct-drive ignition mechanism by a single laser pulse, which avoids the problems of spherical irradiation. Because the stopping lengths of the generated alpha particles in the plasma are sufficiently large, results from other projects can be used.

  10. OVERGRID: A Unified Overset Grid Generation Graphical Interface

    NASA Technical Reports Server (NTRS)

    Chan, William M.; Akien, Edwin W. (Technical Monitor)

    1999-01-01

    This paper presents a unified graphical interface and gridding strategy for performing overset grid generation. The interface, called OVERGRID, has been specifically designed to follow an efficient overset gridding strategy, and contains general grid manipulation capabilities as well as modules that are specifically suited for overset grids. General grid utilities include functions for grid redistribution, smoothing, concatenation, extraction, extrapolation, projection, and many others. Modules specially tailored for overset grids include a seam curve extractor, hyperbolic and algebraic surface grid generators, a hyperbolic volume grid generator, and a Cartesian box grid generator. Grid visualization is achieved using OpenGL, while widgets are constructed with Tcl/Tk. The software is portable between various platforms, from UNIX workstations to personal computers.

  11. Off-farm applications of solar energy in agriculture

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Berry, R.E.

    1980-01-01

    Food processing applications make up almost all present off-farm studies of solar energy in agriculture. Research, development and demonstration projects on solar food processing have shown significant progress over the past 3 years. Projects have included computer simulation and mathematical models, hardware and process development for removing moisture from horticultural or animal products, integration of energy conservation with solar energy augmentation in conventional processes, and commercial-scale demonstrations. The demonstration projects include solar-heated air for drying prunes and raisins, soy beans, and onions/garlic, and solar-generated steam for orange juice pasteurization. Several new and planned projects hold considerable promise for commercial exploitation in future food processes.

  12. Bringing Computational Thinking into the High School Science and Math Classroom

    NASA Astrophysics Data System (ADS)

    Trouille, Laura; Beheshti, E.; Horn, M.; Jona, K.; Kalogera, V.; Weintrop, D.; Wilensky, U.; University CT-STEM Project, Northwestern; University CenterTalent Development, Northwestern

    2013-01-01

    Computational thinking (for example, the thought processes involved in developing algorithmic solutions to problems that can then be automated for computation) has revolutionized the way we do science. The Next Generation Science Standards require that teachers support their students’ development of computational thinking and computational modeling skills. As a result, there is a very high demand among teachers for quality materials. Astronomy provides an abundance of opportunities to support student development of computational thinking skills. Our group has taken advantage of this to create a series of astronomy-based computational thinking lesson plans for use in typical physics, astronomy, and math high school classrooms. This project is funded by the NSF Computing Education for the 21st Century grant and is jointly led by Northwestern University’s Center for Interdisciplinary Exploration and Research in Astrophysics (CIERA), the Computer Science department, the Learning Sciences department, and the Office of STEM Education Partnerships (OSEP). I will also briefly present the online ‘Astro Adventures’ courses for middle and high school students I have developed through NU’s Center for Talent Development. The online courses take advantage of many of the amazing online astronomy enrichment materials available to the public, including a range of hands-on activities and the ability to take images with the Global Telescope Network. The course culminates with an independent computational research project.

  13. EPA Project Updates: DSSTox and ToxCast Generating New ...

    EPA Pesticide Factsheets

    EPA's National Center for Computational Toxicology is building capabilities to support a new paradigm for toxicity screening and prediction. The DSSTox project is improving public access to quality structure-annotated chemical toxicity information in less summarized forms than traditionally employed in SAR modeling, and in ways that facilitate data mining and data read-across. The DSSTox Structure-Browser, launched in September 2007, provides structure searchability across the entire published DSSTox toxicity-related inventory and is enabling linkages between previously isolated toxicity data resources. As of early March 2008, the public DSSTox inventory has been integrated into PubChem, allowing a user to take full advantage of PubChem structure-activity and bioassay clustering features. The most recent DSSTox version of the Carcinogenic Potency Database file (CPDBAS) illustrates ways in which various summary definitions of carcinogenic activity can be employed in modeling and data mining. Phase I of the ToxCast project is generating high-throughput screening data from several hundred biochemical and cell-based assays for a set of 320 chemicals, mostly pesticide actives, with rich toxicology profiles. Incorporating and expanding traditional SAR concepts into this new high-throughput, data-rich world poses conceptual and practical challenges, but also holds great promise for improving predictive capabilities.

  14. Computational scalability of large size image dissemination

    NASA Astrophysics Data System (ADS)

    Kooper, Rob; Bajcsy, Peter

    2011-01-01

    We have investigated the computational scalability of the image pyramid building needed for dissemination of very large image data. The sources of large images include high resolution microscopes and telescopes, remote sensing and airborne imaging, and high resolution scanners. The term 'large' is understood from a user perspective: an image larger than the display, or larger than the memory or disk available to hold it. The application drivers for our work are digitization projects such as the Lincoln Papers project (each image scan is about 100-150MB, or about 5000x8000 pixels, with the total number to be around 200,000) and the UIUC library scanning project for historical maps from the 17th and 18th centuries (a smaller number of larger images). The goal of our work is to understand the computational scalability of web-based dissemination using image pyramids for these large image scans, as well as the preservation aspects of the data. We report our computational benchmarks for (a) building image pyramids to be disseminated using the Microsoft Seadragon library, (b) a computation execution approach using hyper-threading to generate image pyramids and to utilize the underlying hardware, and (c) an image pyramid preservation approach using various hard drive configurations of Redundant Array of Independent Disks (RAID) drives for input/output operations. The benchmarks are obtained with a map (334.61 MB, JPEG format, 17591x15014 pixels). The discussion combines the speed and preservation objectives.
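
    The pyramid itself is conceptually simple; the scaling questions arise from doing the following millions of times. A minimal sketch using Pillow (file names are illustrative; a Seadragon-style viewer would additionally cut each level into fixed-size tiles, omitted here):

        from PIL import Image

        def build_pyramid(path, tile=256, out_prefix="level"):
            # Write successively halved copies of the image until one
            # fits within a single tile.
            img = Image.open(path)
            level = 0
            while True:
                img.save(f"{out_prefix}_{level}.jpg", quality=90)
                if max(img.size) <= tile:
                    break
                img = img.resize((max(1, img.width // 2), max(1, img.height // 2)),
                                 Image.LANCZOS)
                level += 1
            return level + 1   # number of pyramid levels written

        # e.g. build_pyramid("map_scan.jpg") on a 17591x15014 scan writes 8 levels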

  15. MIDAS, prototype Multivariate Interactive Digital Analysis System, phase 1. Volume 3: Wiring diagrams

    NASA Technical Reports Server (NTRS)

    Kriegler, F. J.; Christenson, D.; Gordon, M.; Kistler, R.; Lampert, S.; Marshall, R.; Mclaughlin, R.

    1974-01-01

    The MIDAS System is a third-generation, fast, multispectral recognition system able to keep pace with the large quantity and high rates of data acquisition from present and projected sensors. A principal objective of the MIDAS Program is to provide a system well interfaced with the human operator and thus to obtain large overall reductions in turn-around time and significant gains in throughput. The hardware and software generated in Phase I of the overall program are described. The system contains a mini-computer to control the various high-speed processing elements in the data path and a classifier which implements an all-digital prototype multivariate-Gaussian maximum likelihood decision algorithm operating at 2 x 10^5 pixels/sec. Sufficient hardware was developed to perform signature extraction from computer-compatible tapes, compute classifier coefficients, control the classifier operation, and diagnose operation. The MIDAS construction and wiring diagrams are given.
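
    The decision rule that MIDAS hard-wires can be stated compactly. A NumPy sketch of multivariate-Gaussian maximum-likelihood classification (assuming equal class priors, a simplification on our part):

        import numpy as np

        def gaussian_ml_classify(pixels, means, covs):
            # pixels: (n, d) spectral vectors; means: list of (d,) class
            # mean vectors; covs: list of (d, d) class covariance matrices.
            scores = np.empty((pixels.shape[0], len(means)))
            for k, (mu, S) in enumerate(zip(means, covs)):
                Sinv = np.linalg.inv(S)
                _, logdet = np.linalg.slogdet(S)
                diff = pixels - mu
                # Squared Mahalanobis distance of every pixel to class k.
                maha = np.einsum('ij,jk,ik->i', diff, Sinv, diff)
                scores[:, k] = -0.5 * (maha + logdet)   # log-likelihood + const
            return scores.argmax(axis=1)                # most likely class per pixel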

  16. MIDAS, prototype Multivariate Interactive Digital Analysis System, Phase 1. Volume 2: Diagnostic system

    NASA Technical Reports Server (NTRS)

    Kriegler, F. J.; Christenson, D.; Gordon, M.; Kistler, R.; Lampert, S.; Marshall, R.; Mclaughlin, R.

    1974-01-01

    The MIDAS System is a third-generation, fast, multispectral recognition system able to keep pace with the large quantity and high rates of data acquisition from present and projected sensors. A principal objective of the MIDAS Program is to provide a system well interfaced with the human operator and thus to obtain large overall reductions in turn-around time and significant gains in throughput. The hardware and software generated in Phase I of the overall program are described. The system contains a mini-computer to control the various high-speed processing elements in the data path and a classifier which implements an all-digital prototype multivariate-Gaussian maximum likelihood decision algorithm operating at 2 x 10^5 pixels/sec. Sufficient hardware was developed to perform signature extraction from computer-compatible tapes, compute classifier coefficients, control the classifier operation, and diagnose operation. Diagnostic programs used to test MIDAS' operations are presented.

  17. The Argonne Leadership Computing Facility 2010 annual report.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Drugan, C.

    Researchers found more ways than ever to conduct transformative science at the Argonne Leadership Computing Facility (ALCF) in 2010. Both familiar initiatives and innovative new programs at the ALCF are now serving a growing, global user community with a wide range of computing needs. The Department of Energy's (DOE) INCITE Program remained vital in providing scientists with major allocations of leadership-class computing resources at the ALCF. For calendar year 2011, 35 projects were awarded 732 million supercomputer processor-hours for computationally intensive, large-scale research projects with the potential to significantly advance key areas in science and engineering. Argonne also continued to provide Director's Discretionary allocations - 'start up' awards - for potential future INCITE projects. And DOE's new ASCR Leadership Computing Challenge (ALCC) Program allocated resources to 10 ALCF projects, with an emphasis on high-risk, high-payoff simulations directly related to the Department's energy mission, national emergencies, or broadening the research community capable of using leadership computing resources. While delivering more science today, we have also been laying a solid foundation for high performance computing in the future. After a successful DOE Lehman review, a contract was signed to deliver Mira, the next-generation Blue Gene/Q system, to the ALCF in 2012. The ALCF is working with the 16 projects that were selected for the Early Science Program (ESP) to enable them to be productive as soon as Mira is operational. Preproduction access to Mira will enable ESP projects to adapt their codes to its architecture and collaborate with ALCF staff in shaking down the new system. We expect the 10-petaflops system to stoke economic growth and improve U.S. competitiveness in key areas such as advancing clean energy and addressing global climate change. Ultimately, we envision Mira as a stepping-stone to exascale-class computers that will be faster than petascale-class computers by a factor of a thousand. Pete Beckman, who served as the ALCF's Director for the past few years, has been named director of the newly created Exascale Technology and Computing Institute (ETCi). The institute will focus on developing exascale computing to extend scientific discovery and solve critical science and engineering problems. Just as Pete's leadership propelled the ALCF to great success, we know that ETCi will benefit immensely from his expertise and experience. Without question, the future of supercomputing is in good hands. I would like to thank Pete for all his effort over the past two years, during which he oversaw the establishment of ALCF2, the deployment of the Magellan project, and increases in the utilization, availability, and number of projects using ALCF1. He managed the rapid growth of ALCF staff and made the facility what it is today. All the staff and users are better for Pete's efforts.

  18. An Approach to Integrate a Space-Time GIS Data Model with High Performance Computers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Dali; Zhao, Ziliang; Shaw, Shih-Lung

    2011-01-01

    In this paper, we describe an approach to integrating a Space-Time GIS data model on a high performance computing platform. The Space-Time GIS data model has been developed in a desktop computing environment. We use the Space-Time GIS data model to generate a GIS module, which organizes a series of remote sensing datasets. We are in the process of porting the GIS module into an HPC environment, in which the GIS module handles large datasets directly via a parallel file system. Although it is an ongoing project, the authors hope this effort can inspire further discussion on the integration of GIS with high performance computing platforms.

  19. Parallel Algorithms for the Exascale Era

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Robey, Robert W.

    New parallel algorithms are needed to reach the exascale level of parallelism with millions of cores. We look at some of the research developed by students in projects at LANL. The research blends ideas from the early days of computing while weaving in the fresh approach brought by students new to the field of high performance computing. We look at the reproducibility of global sums and why it is important to parallel computing. Next we look at how the concept of hashing has led to the development of more scalable algorithms suitable for next-generation parallel computers. Nearly all of this work has been done by undergraduates and published in leading scientific journals.
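
    The reproducibility issue is easy to demonstrate: floating-point addition is not associative, so the grouping chosen by a parallel reduction changes the answer. A small Python illustration, with math.fsum standing in for the exactly rounded, order-independent sums such research pursues:

        import math

        vals = [1e16, 1.0, -1e16, 1.0]
        print(vals[0] + vals[1] + vals[2] + vals[3])  # 1.0 (one 1.0 is absorbed)
        print(vals[0] + vals[2] + vals[1] + vals[3])  # 2.0 (regrouped partial sums)
        print(math.fsum(vals))                        # 2.0, exact for any ordering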

  20. PREPARING FOR EXASCALE: ORNL Leadership Computing Application Requirements and Strategy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Joubert, Wayne; Kothe, Douglas B; Nam, Hai Ah

    2009-12-01

    In 2009 the Oak Ridge Leadership Computing Facility (OLCF), a U.S. Department of Energy (DOE) facility at the Oak Ridge National Laboratory (ORNL) National Center for Computational Sciences (NCCS), elicited petascale computational science requirements from leading computational scientists in the international science community. This effort targeted science teams whose projects received large computer allocation awards on OLCF systems. A clear finding of this process was that in order to reach their science goals over the next several years, multiple projects will require computational resources in excess of an order of magnitude more powerful than those currently available. Additionally, for the longer term, next-generation science will require computing platforms of exascale capability in order to reach DOE science objectives over the next decade. It is generally recognized that achieving exascale in the proposed time frame will require disruptive changes in computer hardware and software. Processor hardware will necessarily become heterogeneous and will include accelerator technologies. Software must undergo the concomitant changes needed to extract the available performance from this heterogeneous hardware. This disruption promises to be substantial, not unlike the change to the message-passing paradigm in the computational science community over 20 years ago. Since technological disruptions take time to assimilate, we must aggressively embark on this course of change now, to ensure that science applications and their underlying programming models are mature and ready when exascale computing arrives. This includes initiation of application readiness efforts to adapt existing codes to heterogeneous architectures, support of relevant software tools, and procurement of next-generation hardware testbeds for porting and testing codes. The 2009 OLCF requirements process identified numerous actions necessary to meet this challenge: (1) Hardware capabilities must be advanced on multiple fronts, including peak flops, node memory capacity, interconnect latency, interconnect bandwidth, and memory bandwidth. (2) Effective parallel programming interfaces must be developed to exploit the power of emerging hardware. (3) Science application teams must now begin to adapt and reformulate application codes for the new hardware and software, typified by hierarchical and disparate layers of compute, memory, and concurrency. (4) Algorithm research must be realigned to exploit this hierarchy. (5) When possible, mathematical libraries must be used to encapsulate the required operations in an efficient and useful way. (6) Software tools must be developed to make the new hardware more usable. (7) Science application software must be improved to cope with the increasing complexity of computing systems. (8) Data management efforts must be readied for the larger quantities of data generated by larger, more accurate science models. Requirements elicitation, analysis, validation, and management comprise a difficult and inexact process, particularly in periods of technological change. Nonetheless, the OLCF requirements modeling process is becoming increasingly quantitative and actionable as it matures, and the process this year has identified clear and concrete steps to be taken.
This report discloses (1) the fundamental science case driving the need for the next generation of computer hardware, (2) application usage trends that illustrate the science need, (3) application performance characteristics that drive the need for increased hardware capabilities, (4) resource and process requirements that make the development and deployment of science applications on next-generation hardware successful, and (5) summary recommendations for the required next steps within the computer and computational science communities.

  1. Preliminary Computational Study for Future Tests in the NASA Ames 9 foot x 7 foot Wind Tunnel

    NASA Technical Reports Server (NTRS)

    Pearl, Jason M.; Carter, Melissa B.; Elmiligui, Alaa A.; Winski, Courtney S.; Nayani, Sudheer N.

    2016-01-01

    The NASA Advanced Air Vehicles Program, Commercial Supersonics Technology Project seeks to advance tools and techniques to make over-land supersonic flight feasible. In this study, preliminary computational results are presented for future tests in the NASA Ames 9 foot x 7 foot supersonic wind tunnel to be conducted in early 2016. Shock-plume interactions and their effect on pressure signature are examined for six model geometries. Near-field pressure signatures are assessed using the CFD code USM3D to model the proposed test geometries in free air. Additionally, results obtained using the commercial grid generation software Pointwise (registered trademark) are compared to results using VGRID, the NASA Langley Research Center in-house mesh generation program.

  2. Objective evaluation of linear and nonlinear tomosynthetic reconstruction algorithms

    NASA Astrophysics Data System (ADS)

    Webber, Richard L.; Hemler, Paul F.; Lavery, John E.

    2000-04-01

    This investigation objectively tests five different tomosynthetic reconstruction methods involving three different digital sensors, each used in a different radiologic application: chest, breast, and pelvis, respectively. The common task was to simulate a specific representative projection for each application by summation of appropriately shifted tomosynthetically generated slices produced by using the five algorithms. These algorithms were, respectively, (1) conventional back projection; (2) iteratively deconvoluted back projection; (3) a nonlinear algorithm similar to back projection, except that the minimum value from all of the component projections for each pixel was computed instead of the average value; (4) a similar algorithm wherein the maximum value was computed instead of the minimum value; and (5) the same type of algorithm except that the median value was computed. Using these five algorithms, we obtained data from each sensor-tissue combination, yielding three factorially distributed series of contiguous tomosynthetic slices. The respective slice stacks then were aligned orthogonally and averaged to yield an approximation of a single orthogonal projection radiograph of the complete (unsliced) tissue thickness. Resulting images were histogram equalized, and actual projection control images were subtracted from their tomosynthetically synthesized counterparts. Standard deviations of the resulting histograms were recorded as inverse figures of merit (FOMs). Visual rankings of image differences for a subset (breast data only) by five human observers were also performed to determine whether the subjective observations correlated with the homologous FOMs. Nonparametric statistical analysis of these data demonstrated significant differences (P < 0.05) between reconstruction algorithms. The nonlinear minimization reconstruction method nearly always outperformed the other methods tested. Observer rankings were similar to those measured objectively.
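
    Four of the five pixelwise combination rules reduce to simple statistics over the aligned slice stack. A NumPy sketch (the alignment/shifting of slices is assumed to be already done, and the iterative-deconvolution variant is omitted):

        import numpy as np

        def fuse_slices(slices, method="average"):
            # slices: (n_slices, H, W) stack of shifted tomosynthetic slices.
            stack = np.asarray(slices, dtype=float)
            if method == "average":    # conventional back projection
                return stack.mean(axis=0)
            if method == "minimum":    # the nonlinear minimization rule
                return stack.min(axis=0)
            if method == "maximum":
                return stack.max(axis=0)
            if method == "median":
                return np.median(stack, axis=0)
            raise ValueError(f"unknown method: {method}")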

  3. "To Err Is Human, but to Persist Is Diabolical": Loss of Organizational Memory and E-Learning Projects

    ERIC Educational Resources Information Center

    Ozdemir, Selcuk

    2010-01-01

    Many countries around the world install millions of computers, printers, projectors, smartboards, and similar technologies in primary and secondary schools to equip new generations with the ability to effectively access and critically evaluate information and communication technologies. However, experiences from different countries show that…

  4. Fibonacci and Nature. Mathematics Investigations for Schools.

    ERIC Educational Resources Information Center

    Newton, Lynn D.

    1987-01-01

    Sets forth the history of the Fibonacci Sequence and details its occurrence in nature and its potential for project work in schools. Ideas and activities include the rabbit problem, investigations of the sequence itself, its relationship to plants, music, snail shells, and the golden section. Computer generation of spirals is also discussed. (PK)
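
    As a pointer toward such project work, a short Python sketch (our own illustration) that generates the sequence and the points of a golden-ratio spiral, whose radius grows by the golden section each quarter turn as in the classic Fibonacci-squares picture:

        import math

        def fibonacci(n):
            seq = [1, 1]
            while len(seq) < n:
                seq.append(seq[-1] + seq[-2])
            return seq[:n]

        def golden_spiral_points(turns=4, steps_per_turn=90):
            phi = (1 + math.sqrt(5)) / 2
            pts = []
            for i in range(turns * steps_per_turn):
                theta = 2 * math.pi * i / steps_per_turn
                r = phi ** (2 * theta / math.pi)   # grows by phi per quarter turn
                pts.append((r * math.cos(theta), r * math.sin(theta)))
            return pts

        print(fibonacci(10))   # [1, 1, 2, 3, 5, 8, 13, 21, 34, 55]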

  5. Library Information System Time-Sharing (LISTS) Project. Final Report.

    ERIC Educational Resources Information Center

    Black, Donald V.

    The Library Information System Time-Sharing (LISTS) experiment was based on three innovations in data processing technology: (1) the advent of computer time-sharing on third-generation machines, (2) the development of general-purpose file-management software and (3) the introduction of large, library-oriented data bases. The main body of the…

  6. Remote Sensing: The View from Above. Know Your Environment.

    ERIC Educational Resources Information Center

    Academy of Natural Sciences, Philadelphia, PA.

    This publication identifies some of the general concepts of remote sensing and explains the image collection process and computer-generated reconstruction of the data. Monitoring the ecological collapse in coral reefs, weather phenomena like El Nino/La Nina, and U.S. Space Shuttle-based sensing projects are some of the areas for which remote…

  7. Effects of Response-Driven Feedback in Computer Science Learning

    ERIC Educational Resources Information Center

    Fernandez Aleman, J. L.; Palmer-Brown, D.; Jayne, C.

    2011-01-01

    This paper presents the results of a project on generating diagnostic feedback for guided learning in a first-year course on programming and a Master's course on software quality. An online multiple-choice questions (MCQs) system is integrated with neural network-based data analysis. Findings about how students use the system suggest that the…

  8. Progress Monitoring in Grade 5 Science for Low Achievers

    ERIC Educational Resources Information Center

    Vannest, Kimberly J.; Parker, Richard; Dyer, Nicole

    2011-01-01

    This article presents procedures and results from a 2-year project developing science key vocabulary (KV) short tests suitable for progress monitoring Grade 5 science in Texas public schools using computer-generated, -administered, and -scored assessments. KV items included KV definitions and important usages in a multiple-choice cloze format. A…

  9. GeNN: a code generation framework for accelerated brain simulations

    NASA Astrophysics Data System (ADS)

    Yavuz, Esin; Turner, James; Nowotny, Thomas

    2016-01-01

    Large-scale numerical simulations of detailed brain circuit models are important for identifying hypotheses on brain functions and testing their consistency and plausibility. An ongoing challenge for simulating realistic models is, however, computational speed. In this paper, we present the GeNN (GPU-enhanced Neuronal Networks) framework, which aims to facilitate the use of graphics accelerators for computational models of large-scale neuronal networks to address this challenge. GeNN is an open source library that generates code to accelerate the execution of network simulations on NVIDIA GPUs, through a flexible and extensible interface, which does not require in-depth technical knowledge from the users. We present performance benchmarks showing that 200-fold speedup compared to a single core of a CPU can be achieved for a network of one million conductance based Hodgkin-Huxley neurons but that for other models the speedup can differ. GeNN is available for Linux, Mac OS X and Windows platforms. The source code, user manual, tutorials, Wiki, in-depth example projects and all other related information can be found on the project website http://genn-team.github.io/genn/.
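
    GeNN's input is a model description from which it generates CUDA kernels, so the sketch below is not GeNN's API; it is a plain NumPy illustration of the per-neuron arithmetic such a generated kernel performs for conductance-based Hodgkin-Huxley cells (standard textbook parameters assumed):

        import numpy as np

        def hh_step(V, m, h, n, I_ext, dt=0.01):
            # One forward-Euler step, vectorized over a neuron population
            # (V in mV, time in ms; all state variables are equal-shape arrays).
            a_m = 0.1 * (V + 40.0) / (1.0 - np.exp(-(V + 40.0) / 10.0))
            b_m = 4.0 * np.exp(-(V + 65.0) / 18.0)
            a_h = 0.07 * np.exp(-(V + 65.0) / 20.0)
            b_h = 1.0 / (1.0 + np.exp(-(V + 35.0) / 10.0))
            a_n = 0.01 * (V + 55.0) / (1.0 - np.exp(-(V + 55.0) / 10.0))
            b_n = 0.125 * np.exp(-(V + 65.0) / 80.0)
            m = m + dt * (a_m * (1.0 - m) - b_m * m)
            h = h + dt * (a_h * (1.0 - h) - b_h * h)
            n = n + dt * (a_n * (1.0 - n) - b_n * n)
            I_ion = (120.0 * m**3 * h * (V - 50.0)   # Na+ current
                     + 36.0 * n**4 * (V + 77.0)      # K+ current
                     + 0.3 * (V + 54.387))           # leak current
            V = V + dt * (I_ext - I_ion)             # membrane capacitance = 1
            return V, m, h, n

        # e.g. a million neurons: V = -65.0 * np.ones(1_000_000), and so on.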

  10. GeNN: a code generation framework for accelerated brain simulations.

    PubMed

    Yavuz, Esin; Turner, James; Nowotny, Thomas

    2016-01-07

    Large-scale numerical simulations of detailed brain circuit models are important for identifying hypotheses on brain functions and testing their consistency and plausibility. An ongoing challenge for simulating realistic models is, however, computational speed. In this paper, we present the GeNN (GPU-enhanced Neuronal Networks) framework, which aims to facilitate the use of graphics accelerators for computational models of large-scale neuronal networks to address this challenge. GeNN is an open source library that generates code to accelerate the execution of network simulations on NVIDIA GPUs, through a flexible and extensible interface, which does not require in-depth technical knowledge from the users. We present performance benchmarks showing that 200-fold speedup compared to a single core of a CPU can be achieved for a network of one million conductance based Hodgkin-Huxley neurons but that for other models the speedup can differ. GeNN is available for Linux, Mac OS X and Windows platforms. The source code, user manual, tutorials, Wiki, in-depth example projects and all other related information can be found on the project website http://genn-team.github.io/genn/.

  11. GeNN: a code generation framework for accelerated brain simulations

    PubMed Central

    Yavuz, Esin; Turner, James; Nowotny, Thomas

    2016-01-01

    Large-scale numerical simulations of detailed brain circuit models are important for identifying hypotheses on brain functions and testing their consistency and plausibility. An ongoing challenge for simulating realistic models is, however, computational speed. In this paper, we present the GeNN (GPU-enhanced Neuronal Networks) framework, which aims to facilitate the use of graphics accelerators for computational models of large-scale neuronal networks to address this challenge. GeNN is an open source library that generates code to accelerate the execution of network simulations on NVIDIA GPUs, through a flexible and extensible interface, which does not require in-depth technical knowledge from the users. We present performance benchmarks showing that 200-fold speedup compared to a single core of a CPU can be achieved for a network of one million conductance based Hodgkin-Huxley neurons but that for other models the speedup can differ. GeNN is available for Linux, Mac OS X and Windows platforms. The source code, user manual, tutorials, Wiki, in-depth example projects and all other related information can be found on the project website http://genn-team.github.io/genn/. PMID:26740369

  12. Computational mechanics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Raboin, P J

    1998-01-01

    The Computational Mechanics thrust area is a vital and growing facet of the Mechanical Engineering Department at Lawrence Livermore National Laboratory (LLNL). This work supports the development of computational analysis tools in the areas of structural mechanics and heat transfer. Over 75 analysts depend on thrust area-supported software running on a variety of computing platforms to meet the demands of LLNL programs. Interactions with the Department of Defense (DOD) High Performance Computing and Modernization Program and the Defense Special Weapons Agency are of special importance as they support our ParaDyn project in its development of new parallel capabilities for DYNA3D.more » Working with DOD customers has been invaluable to driving this technology in directions mutually beneficial to the Department of Energy. Other projects associated with the Computational Mechanics thrust area include work with the Partnership for a New Generation Vehicle (PNGV) for ''Springback Predictability'' and with the Federal Aviation Administration (FAA) for the ''Development of Methodologies for Evaluating Containment and Mitigation of Uncontained Engine Debris.'' In this report for FY-97, there are five articles detailing three code development activities and two projects that synthesized new code capabilities with new analytic research in damage/failure and biomechanics. The article this year are: (1) Energy- and Momentum-Conserving Rigid-Body Contact for NIKE3D and DYNA3D; (2) Computational Modeling of Prosthetics: A New Approach to Implant Design; (3) Characterization of Laser-Induced Mechanical Failure Damage of Optical Components; (4) Parallel Algorithm Research for Solid Mechanics Applications Using Finite Element Analysis; and (5) An Accurate One-Step Elasto-Plasticity Algorithm for Shell Elements in DYNA3D.« less

  13. Guidelines for developing distributed virtual environment applications

    NASA Astrophysics Data System (ADS)

    Stytz, Martin R.; Banks, Sheila B.

    1998-08-01

    We have conducted a variety of projects that served to investigate the limits of virtual environments and distributed virtual environment (DVE) technology for the military and medical professions. The projects include an application that allows the user to interactively explore a high-fidelity, dynamic scale model of the Solar System and a high-fidelity, photorealistic, rapidly reconfigurable aircraft simulator. Additional projects are a system for observing, analyzing, and understanding the activity in a military distributed virtual environment, a distributed threat simulator for training Air Force pilots, a virtual spaceplane to determine user interface requirements for a planned military spaceplane system, and an automated wingman for use in supplementing or replacing human-controlled systems in a DVE. The last two projects are a virtual environment user interface framework and a system for training hospital emergency department personnel. In the process of designing and assembling the DVE applications in support of these projects, we have developed rules of thumb and insights into assembling DVE applications and the environment itself. In this paper, we open with a brief review of the applications that were the source for our insights and then present the lessons learned as a result of these projects. The lessons we have learned fall primarily into five areas: requirements development, software architecture, human-computer interaction, graphical database modeling, and construction of computer-generated forces.

  14. Opportunities for Breakthroughs in Large-Scale Computational Simulation and Design

    NASA Technical Reports Server (NTRS)

    Alexandrov, Natalia; Alter, Stephen J.; Atkins, Harold L.; Bey, Kim S.; Bibb, Karen L.; Biedron, Robert T.; Carpenter, Mark H.; Cheatwood, F. McNeil; Drummond, Philip J.; Gnoffo, Peter A.

    2002-01-01

    Opportunities for breakthroughs in the large-scale computational simulation and design of aerospace vehicles are presented. Computational fluid dynamics tools to be used within multidisciplinary analysis and design methods are emphasized. The opportunities stem from speedups and robustness improvements in the underlying unit operations associated with simulation (geometry modeling, grid generation, physical modeling, analysis, etc.). Further, an improved programming environment can synergistically integrate these unit operations to leverage the gains. The speedups result from reducing the problem setup time through geometry modeling and grid generation operations, and reducing the solution time through the operation counts associated with solving the discretized equations to a sufficient accuracy. The opportunities are addressed only at a general level here, but an extensive list of references containing further details is included. The opportunities discussed are being addressed through the Fast Adaptive Aerospace Tools (FAAST) element of the Advanced Systems Concept to Test (ASCoT) and the third Generation Reusable Launch Vehicles (RLV) projects at NASA Langley Research Center. The overall goal is to enable greater inroads into the design process with large-scale simulations.

  15. PeptideNavigator: An interactive tool for exploring large and complex data sets generated during peptide-based drug design projects.

    PubMed

    Diller, Kyle I; Bayden, Alexander S; Audie, Joseph; Diller, David J

    2018-01-01

    There is growing interest in peptide-based drug design and discovery. Due to their relatively large size, polymeric nature, and chemical complexity, the design of peptide-based drugs presents an interesting "big data" challenge. Here, we describe an interactive computational environment, PeptideNavigator, for naturally exploring the tremendous amount of information generated during a peptide drug design project. The purpose of PeptideNavigator is the presentation of large and complex experimental and computational data sets, particularly 3D data, so as to enable multidisciplinary scientists to make optimal decisions during a peptide drug discovery project. PeptideNavigator provides users with numerous viewing options, such as scatter plots, sequence views, and sequence frequency diagrams. These views allow for the collective visualization and exploration of many peptides and their properties, ultimately enabling the user to focus on a small number of peptides of interest. To drill down into the details of individual peptides, PeptideNavigator provides users with a Ramachandran plot viewer and a fully featured 3D visualization tool. Each view is linked, allowing the user to seamlessly navigate from collective views of large peptide data sets to the details of individual peptides with promising property profiles. Two case studies, based on MHC-1A activating peptides and MDM2 scaffold design, are presented to demonstrate the utility of PeptideNavigator in the context of disparate peptide-design projects. Copyright © 2017 Elsevier Ltd. All rights reserved.

  16. NASA's Aviation Safety and Modeling Project

    NASA Technical Reports Server (NTRS)

    Chidester, Thomas R.; Statler, Irving C.

    2006-01-01

    The Aviation Safety Monitoring and Modeling (ASMM) Project of NASA's Aviation Safety program is cultivating sources of data and developing automated computer hardware and software to facilitate efficient, comprehensive, and accurate analyses of the data collected from large, heterogeneous databases throughout the national aviation system. The ASMM addresses the need to provide means for increasing safety by enabling the identification and correction of predisposing conditions that could lead to accidents or to incidents that pose aviation risks. A major component of the ASMM Project is the Aviation Performance Measuring System (APMS), which is developing the next generation of software tools for analyzing and interpreting flight data.

  17. ProjectQ: Compiling quantum programs for various backends

    NASA Astrophysics Data System (ADS)

    Haener, Thomas; Steiger, Damian S.; Troyer, Matthias

    In order to control quantum computers beyond the current generation, a high-level quantum programming language and optimizing compilers will be essential. Therefore, we have developed ProjectQ - an open source software framework to facilitate implementing and running quantum algorithms both in software and on actual quantum hardware. Here, we introduce the backends available in ProjectQ. These include a high-performance simulator and emulator to test and debug quantum algorithms, tools for resource estimation, and interfaces to several small-scale quantum devices. We demonstrate the workings of the framework and show how easily it can be further extended to control upcoming quantum hardware.
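
    A minimal ProjectQ program in the style of the framework's own examples; by default MainEngine uses the high-performance simulator backend mentioned above, and a different backend (such as a resource counter or a hardware interface) can be passed in instead:

        from projectq import MainEngine
        from projectq.ops import All, CNOT, H, Measure

        eng = MainEngine()               # default backend: the simulator
        qureg = eng.allocate_qureg(2)

        H | qureg[0]                     # superpose the first qubit
        CNOT | (qureg[0], qureg[1])      # entangle -> Bell pair
        All(Measure) | qureg             # measure both qubits

        eng.flush()                      # send the circuit to the backend
        print(int(qureg[0]), int(qureg[1]))   # the two outcomes always agree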

  18. Paranoia.Ada: A diagnostic program to evaluate Ada floating-point arithmetic

    NASA Technical Reports Server (NTRS)

    Hjermstad, Chris

    1986-01-01

    Many essential software functions in the mission-critical computer resource application domain depend on floating-point arithmetic. Numerically intensive functions associated with the Space Station project, such as ephemeris generation or the implementation of Kalman filters, are likely to employ the floating-point facilities of Ada. Paranoia.Ada appears to be a valuable program for ensuring that Ada environments and their underlying hardware exhibit the precision and correctness required to satisfy mission computational requirements. As a diagnostic tool, Paranoia.Ada reveals many essential characteristics of an Ada floating-point implementation. Equipped with such knowledge, programmers need not tremble before the complex task of floating-point computation.
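
    Paranoia.Ada is, of course, an Ada program; purely to convey the style of black-box probing it performs, here is a small Python sketch (after Malcolm's classic algorithm) that deduces the radix and significand width of the host arithmetic from rounding behavior alone:

        def probe_arithmetic():
            # Grow a until the successor of a is no longer representable.
            a = 1.0
            while ((a + 1.0) - a) - 1.0 == 0.0:
                a += a
            # The gap between representable numbers there reveals the radix.
            b = 1.0
            while (a + b) - a == 0.0:
                b += b
            radix = int((a + b) - a)
            # Count how many radix digits the significand holds.
            t, a = 0, 1.0
            while ((a + 1.0) - a) - 1.0 == 0.0:
                a *= radix
                t += 1
            return radix, t

        print(probe_arithmetic())   # (2, 53) for IEEE-754 double precision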

  19. Contributions of TetrUSS to Project Orion

    NASA Technical Reports Server (NTRS)

    Mcmillin, Susan N.; Frink, Neal T.; Kerimo, Johannes; Ding, Djiang; Nayani, Sudheer; Parlette, Edward B.

    2011-01-01

    The NASA Constellation program has relied heavily on Computational Fluid Dynamics simulations for generating aerodynamic databases and design loads. The Orion Project focuses on the Orion Crew Module and the Orion Launch Abort Vehicle. NASA TetrUSS codes (GridTool/VGRID/USM3D) have been applied in a supporting role to the Crew Exploration Vehicle Aerosciences Project for investigating various aerodynamic sensitivities and supplementing the aerodynamic database. This paper provides an overview of the contributions from the TetrUSS team to the Project Orion Crew Module and Launch Abort Vehicle aerodynamics, along with selected examples to highlight the challenges encountered along the way. A brief description of geometries and tasks will be discussed followed by a description of the flow solution process that produced production level computational solutions. Four tasks conducted by the USM3D team will be discussed to show how USM3D provided aerodynamic data for inclusion in the Orion aero-database, contributed data for the build-up of aerodynamic uncertainties for the aero-database, and provided insight into the flow features about the Crew Module and the Launch Abort Vehicle.

  20. Inventory of Federal energy-related environment and safety research for FY 1979. Volume II. Project listings and indexes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    This volume contains summaries of FY 1979 government-sponsored environment and safety research related to energy, arranged by log number, which groups the projects by reporting agency. The log number is a unique number assigned to each project from a block of numbers set aside for each contributing agency. Information elements included in the summary listings are project title, principal investigators, research organization, project number, contract number, supporting organization, funding level, related energy sources with numbers indicating percentages of effort devoted to each, and R and D categories. A brief description of each project is given, followed by the subject index terms that were assigned for computer searching and for generating the printed subject index in the back of this volume.

  1. Integrated Graphics Operations and Analysis Lab Development of Advanced Computer Graphics Algorithms

    NASA Technical Reports Server (NTRS)

    Wheaton, Ira M.

    2011-01-01

    The focus of this project is to aid the IGOAL in researching and implementing algorithms for advanced computer graphics. First, this project focused on porting the current International Space Station (ISS) Xbox experience to the web. Previously, the ISS interior fly-around education and outreach experience ran only on an Xbox 360. One of the desires was to take this experience and make it into something that can be put on NASA's educational site for anyone to access. The current code works in the Unity game engine, which does have cross-platform capability but is not 100% compatible. The tasks for an intern to complete this portion consisted of gaining familiarity with Unity and the current ISS Xbox code, porting the Xbox code to the web as is, and modifying the code to work well as a web application. In addition, a procedurally generated cloud algorithm will be developed. Currently, the clouds used in AGEA animations and the Xbox experiences are a texture map. The desire is to create a procedurally generated cloud algorithm to provide dynamically generated clouds for both AGEA animations and the Xbox experiences. This task consists of gaining familiarity with AGEA and the plug-in interface, developing the algorithm, creating an AGEA plug-in to implement the algorithm inside AGEA, and creating a Unity script to implement the algorithm for the Xbox. This portion of the project could not be completed in the time frame of the internship; however, the IGOAL will continue to work on it in the future.
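
    One common basis for procedurally generated clouds (our illustrative choice, not necessarily the one the IGOAL adopted) is fractal value noise: summing octaves of smoothed random grids at doubling frequency and halving amplitude. A NumPy sketch:

        import numpy as np

        def cloud_texture(size=256, octaves=5, seed=0):
            # Sum bilinearly upsampled random grids; each octave doubles
            # the spatial frequency and halves the amplitude (1/f noise).
            rng = np.random.default_rng(seed)
            img = np.zeros((size, size))
            amp, freq = 1.0, 4
            for _ in range(octaves):
                grid = rng.random((freq + 1, freq + 1))
                xs = np.linspace(0.0, freq, size)
                x0 = xs.astype(int).clip(0, freq - 1)
                fx = xs - x0
                # Bilinear interpolation: first along rows, then columns.
                rows = grid[x0] * (1 - fx[:, None]) + grid[x0 + 1] * fx[:, None]
                img += amp * (rows[:, x0] * (1 - fx) + rows[:, x0 + 1] * fx)
                amp /= 2.0
                freq *= 2
            return (img - img.min()) / (img.max() - img.min())   # scale to [0, 1]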

  2. 2009 ALCF annual report.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Beckman, P.; Martin, D.; Drugan, C.

    2010-11-23

    This year the Argonne Leadership Computing Facility (ALCF) delivered nearly 900 million core hours of science. The research conducted at their leadership-class facility touched our lives in both minute and massive ways - whether it was studying the catalytic properties of gold nanoparticles, predicting protein structures, or unearthing the secrets of exploding stars. The authors remained true to their vision to act as the forefront computational center in extending science frontiers by solving pressing problems for our nation. Our success in this endeavor was due mainly to the Department of Energy's (DOE) INCITE (Innovative and Novel Computational Impact on Theory and Experiment) program. The program awards significant amounts of computing time to computationally intensive, unclassified research projects that can make high-impact scientific advances. This year, DOE allocated 400 million hours of time to 28 research projects at the ALCF. Scientists from around the world conducted the research, representing such esteemed institutions as the Princeton Plasma Physics Laboratory, the National Institute of Standards and Technology, and the European Center for Research and Advanced Training in Scientific Computation. Argonne also provided Director's Discretionary allocations for research challenges, addressing such issues as reducing aerodynamic noise, critical for next-generation 'green' energy systems. Intrepid - the ALCF's 557-teraflops IBM Blue Gene/P supercomputer - enabled astounding scientific solutions and discoveries. Intrepid went into full production five months ahead of schedule. As a result, the ALCF nearly doubled the days of production computing available to the DOE Office of Science, INCITE awardees, and Argonne projects. One of the fastest supercomputers in the world for open science, the energy-efficient system uses about one-third as much electricity as a machine of comparable size built with more conventional parts. In October 2009, President Barack Obama recognized the excellence of the entire Blue Gene series by awarding it the National Medal of Technology and Innovation. Other noteworthy achievements included the ALCF's collaboration with the National Energy Research Scientific Computing Center (NERSC) to examine cloud computing as a potential new computing paradigm for scientists. Named Magellan, the DOE-funded initiative will explore which science application programming models work well within the cloud, as well as evaluate the challenges that come with this new paradigm. The ALCF obtained approval for its next-generation machine, a 10-petaflops system to be delivered in 2012. This system will allow us to resolve ever more pressing problems, even more expeditiously, through breakthrough science in the years to come.

  3. Conservation status of polar bears (Ursus maritimus) in relation to projected sea-ice declines

    NASA Astrophysics Data System (ADS)

    Laidre, K. L.; Regehr, E. V.; Akcakaya, H. R.; Amstrup, S. C.; Atwood, T.; Lunn, N.; Obbard, M.; Stern, H. L., III; Thiemann, G.; Wiig, O.

    2016-12-01

    Loss of Arctic sea ice due to climate change is the most serious threat to polar bears (Ursus maritimus) throughout their circumpolar range. We performed a data-based sensitivity analysis with respect to this threat by evaluating the potential response of the global polar bear population to projected sea-ice conditions. We conducted 1) an assessment of generation length for polar bears; 2) development of a standardized sea-ice metric representing important habitat characteristics for the species; and 3) population projections over three generations, using computer simulation and statistical models representing alternative relationships between sea ice and polar bear abundance. Using three separate approaches, the median percent change in mean global population size for polar bears between 2015 and 2050 ranged from -4% (95% CI = -62%, 50%) to -43% (95% CI = -76%, -20%). Results highlight the potential for large reductions in the global population if sea-ice loss continues. They also highlight the large amount of uncertainty in statistical projections of polar bear abundance and the sensitivity of projections to plausible alternative assumptions. The median probability of a reduction in the mean global population size of polar bears greater than 30% over three generations was approximately 0.71 (range 0.20-0.95). The median probability of a reduction greater than 50% was approximately 0.07 (range 0-0.35), and the probability of a reduction greater than 80% was negligible.

  4. An Overview of Ares-I CFD Ascent Aerodynamic Data Development And Analysis Based on USM3D

    NASA Technical Reports Server (NTRS)

    Abdol-Hamid, Khaled S.; Ghaffari, Farhad; Parlette, Edward B.

    2011-01-01

    An overview of the computational results obtained from the NASA Langley-developed unstructured-grid, Reynolds-averaged Navier-Stokes flow solver USM3D, in support of the Ares-I project within NASA's Constellation program, is presented. The numerical data are obtained for representative flow conditions pertinent to the ascent phase of the trajectory at both wind tunnel and flight Reynolds numbers, without including any propulsion effects. The USM3D flow solver has been designated to have the primary role within the Ares-I project in developing the computational aerodynamic data for the vehicle, while other flow solvers, namely OVERFLOW and FUN3D, have supporting roles to provide complementary results for fewer cases as part of the verification process to ensure code-to-code solution consistency. Similarly, as part of the solution validation efforts, the predicted numerical results are correlated with the aerodynamic wind tunnel data that have been generated within the project in the past few years. Sample aerodynamic results and the processes established for the computational solution/data development for the evolving Ares-I design cycles are presented.

  5. Research data collection methods: from paper to tablet computers.

    PubMed

    Wilcox, Adam B; Gallagher, Kathleen D; Boden-Albala, Bernadette; Bakken, Suzanne R

    2012-07-01

    Primary data collection is a critical activity in clinical research. Even with significant advances in technical capabilities, clear benefits of use, and even user preferences for electronic systems for collecting primary data, paper-based data collection is still common in clinical research settings. However, with recent developments in both clinical research and tablet computer technology, the comparative advantages and disadvantages of data collection methods should be determined. To describe case studies using multiple methods of data collection, including next-generation tablets, and to consider their various advantages and disadvantages. We reviewed 5 modern case studies that used primary data collection, with methods ranging from paper to next-generation tablet computers. We performed semistructured telephone interviews with each project, considering factors relevant to data collection. We address specific issues with workflow, implementation, and security for these different methods, and identify differences in implementation that led to different technology considerations for each case study. There remain multiple methods for primary data collection, each with its own strengths and weaknesses. Two recent methods are electronic health record templates and next-generation tablet computers. Electronic health record templates can link data directly to medical records but are notably difficult to use. Current tablet computers are substantially different from previous technologies with regard to user familiarity and software cost. The use of cloud-based storage for tablet computers, however, creates a specific challenge for clinical research that must be considered but can be overcome.

  6. Biorthogonal projected energies of a Gutzwiller similarity transformed Hamiltonian.

    PubMed

    Wahlen-Strothman, J M; Scuseria, G E

    2016-12-07

    We present a method incorporating biorthogonal orbital optimization, symmetry projection, and double-occupancy screening with a non-unitary similarity transformation generated by the Gutzwiller factor, and apply it to the Hubbard model. Energies are calculated with mean-field computational scaling, with high-quality results comparable to coupled cluster singles and doubles. This builds on previous work performing similarity transformations with more general, two-body Jastrow-style correlators. The theory is tested on 2D lattices ranging from small systems to the thermodynamic limit and is compared to available reference data.
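
    The inline formula was lost in this record's markup. Assuming the standard form of the Gutzwiller correlator (an assumption here, not recovered from the paper), the transformation it describes is:

```latex
% Double-occupancy counter and Gutzwiller factor (assumed standard form):
\hat{D} = \sum_i \hat{n}_{i\uparrow}\,\hat{n}_{i\downarrow}, \qquad
\hat{P}_G = e^{-\alpha \hat{D}}
% Non-unitary similarity transformation of the Hubbard Hamiltonian:
\bar{H} = \hat{P}_G^{-1}\,\hat{H}\,\hat{P}_G
        = e^{\alpha\hat{D}}\,\hat{H}\,e^{-\alpha\hat{D}}
```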

  7. A general software reliability process simulation technique

    NASA Technical Reports Server (NTRS)

    Tausworthe, Robert C.

    1991-01-01

    The structure and rationale of the generalized software reliability process, together with the design and implementation of a computer program that simulates this process, are described. Given assumed parameters of a particular project, users of this program can generate simulated status timelines of work products, numbers of injected anomalies, and the progress of testing, fault isolation, repair, validation, and retest. Such timelines are useful for comparison with actual timeline data, for validating the project input parameters, and for providing data to researchers in reliability prediction modeling.
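
    As a toy illustration of such a simulator (not Tausworthe's actual model, which has many more parameters and process stages), the following Python sketch injects anomalies while code is being produced, then discovers and repairs them during test, emitting a status timeline:

```python
import numpy as np

def simulate_reliability(days=200, inject_rate=0.8, detect_p=0.05,
                         repair_per_day=2, seed=1):
    """Toy software reliability process: anomalies are injected while
    code is produced, discovered during test, and repaired at a
    limited rate. Returns (day, latent, found, fixed) tuples."""
    rng = np.random.default_rng(seed)
    latent, found, fixed = 0, 0, 0
    timeline = []
    for day in range(days):
        if day < days // 2:                 # first half: development
            latent += rng.poisson(inject_rate)
        else:                               # second half: test & repair
            discovered = rng.binomial(latent, detect_p)
            latent -= discovered
            found += discovered
            repairs = min(found, repair_per_day)
            found -= repairs
            fixed += repairs
        timeline.append((day, latent, found, fixed))
    return timeline

print(simulate_reliability()[-1])   # final status snapshot
```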

  8. Relationships among Individual Task Self-Efficacy, Self-Regulated Learning Strategy Use and Academic Performance in a Computer-Supported Collaborative Learning Environment

    ERIC Educational Resources Information Center

    Wilson, Kimberly; Narayan, Anupama

    2016-01-01

    This study investigates relationships between self-efficacy, self-regulated learning strategy use and academic performance. Participants were 96 undergraduate students working on projects with three subtasks (idea generation task, methodical task and data collection) in a blended learning environment. Task self-efficacy was measured with…

  9. Supporting Concrete Visual Thinking in Multiplicative Reasoning: Difficulties and Opportunities. Technical Report 88-16.

    ERIC Educational Resources Information Center

    Kaput, James J.

    The Educational Technology Center has attempted to develop a series of computer based learning environments to support the learning and application of multiplicative reasoning. The work and software described in this paper, including the teaching experiment that generated the error phenomena examined, is part of a larger ongoing research project.…

  10. A general solution to the hidden-line problem. [to graphically represent aerodynamic stability derivatives

    NASA Technical Reports Server (NTRS)

    Hedgley, D. R., Jr.

    1982-01-01

    The requirements for computer-generated perspective projections of three-dimensional objects have escalated. A general solution was developed, and its theory is presented. The method is very efficient: it minimizes the selection of points and the comparison of line segments, and hence avoids the devastation of square-law growth.
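
    For context, the projection step itself is elementary; the report's contribution is the efficient hidden-line logic layered on top of it. A minimal perspective projection sketch, with the viewer at the origin and the image plane at distance d (both assumptions of this sketch):

```python
import numpy as np

def perspective_project(points, d=1.0):
    """Project 3-D points onto the plane z = d, viewer at the origin.
    `points` is an (N, 3) array with z > 0 for visible geometry."""
    points = np.asarray(points, dtype=float)
    x, y, z = points[:, 0], points[:, 1], points[:, 2]
    return np.column_stack((d * x / z, d * y / z))

# The farther endpoint of an edge lands closer to the image center:
print(perspective_project([[1, 1, 2], [1, 1, 4]]))
```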

  11. Beam Generated Vorticity and Convective Channel Mixing.

    DTIC Science & Technology

    1980-09-17

    Performing organization: Laboratory for Computational Physics, Naval Research Laboratory.

  12. Computer-assisted Lemmatisation of a Cornish Text Corpus for Lexicographical Purposes

    ERIC Educational Resources Information Center

    Mills, Jon

    2002-01-01

    This project sets out to discover and develop techniques for the lemmatisation of a historical corpus of the Cornish language in order that a lemmatised dictionary macrostructure can be generated from the corpus. The system should be capable of uniquely identifying every lexical item that is attested in the corpus. A survey of published and…

  13. Optical Computing. Organization of the 1993 Photonics Science Topical Meetings Held in Palm Springs, California on March 16 - 19, 1993. Technical Digest Series, Volume 7

    DTIC Science & Technology

    1993-03-19

    Program listing (partially recoverable): a network implementation using asymmetric Fabry-Perot modulators (Andrew Jennings); OWA3, multiwavelength optical half adder (Pochi Yeh) (p. 68); OWA4, wavelength multiplexed computer-generated volume ... (p. 96); OWC3, content add... ATMOS and OSCAR are RACE projects mentioned in the text; broadly speaking, this "optical ether" has helped shape new systems architectures.

  14. Compiling for Application Specific Computational Acceleration in Reconfigurable Architectures Final Report CRADA No. TSB-2033-01

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    De Supinski, B.; Caliga, D.

    2017-09-28

    The primary objective of this project was to develop memory optimization technology to efficiently deliver data to, and distribute data within, the SRC-6's Field Programmable Gate Array ("FPGA") based Multi-Adaptive Processors (MAPs). The hardware/software approach was to explore efficient MAP configurations and generate the compiler technology to exploit those configurations. This memory-accessing technology represents an important step towards making reconfigurable symmetric multi-processor (SMP) architectures a cost-effective solution for large-scale scientific computing.

  15. Using Equation-Free Computation to Accelerate Network-Free Stochastic Simulation of Chemical Kinetics.

    PubMed

    Lin, Yen Ting; Chylek, Lily A; Lemons, Nathan W; Hlavacek, William S

    2018-06-21

    The chemical kinetics of many complex systems can be concisely represented by reaction rules, which can be used to generate reaction events via a kinetic Monte Carlo method that has been termed network-free simulation. Here, we demonstrate accelerated network-free simulation through a novel approach to equation-free computation. In this process, variables are introduced that approximately capture system state. Derivatives of these variables are estimated using short bursts of exact stochastic simulation and finite differencing. The variables are then projected forward in time via a numerical integration scheme, after which a new exact stochastic simulation is initialized and the whole process repeats. The projection step increases efficiency by bypassing the firing of numerous individual reaction events. As we show, the projected variables may be defined as populations of building blocks of chemical species. The maximal number of connected molecules included in these building blocks determines the degree of approximation. Equation-free acceleration of network-free simulation is found to be both accurate and efficient.
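
    A hedged sketch of the projective loop described above, applied to a toy birth-death process (the rate constants, burst length, and projection horizon are invented; the paper's coarse variables are populations of building blocks of species, not a raw count):

```python
import numpy as np

rng = np.random.default_rng(7)
K_PROD, GAMMA = 10.0, 0.1            # birth rate, per-capita death rate

def ssa_burst(n, t_burst):
    """Exact (Gillespie) stochastic simulation of a birth-death
    process, run for a short burst of length t_burst."""
    t = 0.0
    while True:
        total = K_PROD + GAMMA * n
        dt = rng.exponential(1.0 / total)
        if t + dt > t_burst:
            return n
        t += dt
        n += 1 if rng.random() < K_PROD / total else -1

def projective_step(n, t_burst=1.0, t_project=5.0, replicas=20):
    """Equation-free step: average short exact bursts, estimate the
    coarse derivative by finite differencing, then project forward
    with a cheap explicit integration step."""
    mean_end = np.mean([ssa_burst(n, t_burst) for _ in range(replicas)])
    slope = (mean_end - n) / t_burst
    return max(0, int(round(mean_end + slope * t_project)))

n = 0
for _ in range(15):
    n = projective_step(n)
print("coarse population:", n)       # relaxes toward K_PROD/GAMMA = 100
```

    The projection bypasses the firing of individual reaction events over the projected interval, which is where the speedup comes from.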

  16. A flow visualization study of single-arm sculling movement emulating cephalopod thrust generation

    NASA Astrophysics Data System (ADS)

    Kazakidi, Asimina; Gnanamanickam, Ebenezer P.; Tsakiris, Dimitris P.; Ekaterinaris, John A.

    2014-11-01

    In addition to jet propulsion, octopuses use arm-swimming motion as an effective means of generating bursts of thrust, for hunting, defense, or escape. The individual role of their arms, acting as thrust generators during this motion, is still under investigation, in view of an increasing robotic interest for alternative modes of propulsion, inspired by the octopus. Computational studies have revealed that thrust generation is associated with complex vortical flow patterns in the wake of the moving arm, however further experimental validation is required. Using the hydrogen bubble technique, we studied the flow disturbance around a single octopus-like robotic arm, undergoing two-stroke sculling movements in quiescent fluid. Although simplified, sculling profiles have been found to adequately capture the fundamental kinematics of the octopus arm-swimming behavior. In fact, variation of the sculling parameters alters considerably the generation of forward thrust. Flow visualization revealed the generation of complex vortical structures around both rigid and compliant arms. Increased disturbance was evident near the tip, particularly at the transitional phase between recovery and power strokes. These results are in good qualitative agreement with computational and robotic studies. Work funded by the ESF-GSRT HYDRO-ROB Project PE7(281).

  17. Efficient and Flexible Climate Analysis with Python in a Cloud-Based Distributed Computing Framework

    NASA Astrophysics Data System (ADS)

    Gannon, C.

    2017-12-01

    As climate models become progressively more advanced, and spatial resolution further improved through various downscaling projects, climate projections at a local level are increasingly insightful and valuable. However, the raw size of climate datasets presents numerous hurdles for analysts wishing to develop customized climate risk metrics or perform site-specific statistical analysis. Four Twenty Seven, a climate risk consultancy, has implemented a Python-based distributed framework to analyze large climate datasets in the cloud. With the freedom afforded by efficiently processing these datasets, we are able to customize and continually develop new climate risk metrics using the most up-to-date data. Here we outline our process for using Python packages such as XArray and Dask to evaluate netCDF files in a distributed framework, StarCluster to operate in a cluster-computing environment, cloud computing services to access publicly hosted datasets, and how this setup is particularly valuable for generating climate change indicators and performing localized statistical analysis.
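
    A minimal sketch of that pattern, with a hypothetical file glob and a CMIP-style variable name (daily maximum temperature in Kelvin); xarray plus dask performs the lazy, chunked evaluation that a distributed cluster can then execute:

```python
import xarray as xr

# Lazily open a (hypothetical) multi-file daily-temperature dataset;
# dask chunks it so it never has to fit in memory at once.
ds = xr.open_mfdataset("tasmax_day_*.nc", chunks={"time": 365})

# Example indicator: annual count of days above 35 degC (data in Kelvin).
hot_days = (ds["tasmax"] > 35.0 + 273.15).groupby("time.year").sum("time")

result = hot_days.compute()   # triggers the distributed computation
```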

  18. Potential New Ligand Systems for Binding Uranyl Ions in Seawater Environments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Arnold, John

    2014-12-13

    Work began this quarter on a new project involving a combined computational and biosynthetic approach to selective recognition of uranyl ion in aqueous solution. This project exploits the results of computational studies to discover new ligand classes. Synthetic studies will follow to generate target systems for uranyl binding and determination of binding constants. The process will be iterative, with results from computation informing synthesis, and vice versa. The theme of the ligand classes to be examined initially will be biologically based. New phosphonate-containing α-amino acid N-carboxyanhydride (NCA) monomers were used recently to prepare well-defined phosphonate-containing polypeptides and block copolypeptides. Our first approach is to utilize these phosphate- and phosphonate-containing NCAs for the coordination of uranyl. The work includes the laboratory-scale preparation of a series of NCAs and the full thermodynamic and spectroscopic characterization of the resulting uranyl complexes. We are also evaluating the sequestering activity in different physiological and environmental conditions of these copolymers as well as their biodegradability.

  19. Low latency network and distributed storage for next generation HPC systems: the ExaNeSt project

    NASA Astrophysics Data System (ADS)

    Ammendola, R.; Biagioni, A.; Cretaro, P.; Frezza, O.; Lo Cicero, F.; Lonardo, A.; Martinelli, M.; Paolucci, P. S.; Pastorelli, E.; Pisani, F.; Simula, F.; Vicini, P.; Navaridas, J.; Chaix, F.; Chrysos, N.; Katevenis, M.; Papaeustathiou, V.

    2017-10-01

    With processor architecture evolution, the HPC market has undergone a paradigm shift. The adoption of low-cost, Linux-based clusters extended the reach of HPC from its roots in modelling and simulation of complex physical systems to a broader range of industries, from biotechnology, cloud computing, computer analytics and big data challenges to manufacturing sectors. In this perspective, near-future HPC systems can be envisioned as composed of millions of low-power computing cores, densely packed (and therefore cooled by appropriate technology), tightly interconnected by a low-latency, high-performance network, and equipped with a distributed storage architecture. Each of these features (dense packing, distributed storage, and a high-performance interconnect) represents a challenge, made all the harder by the need to solve them at the same time. These challenges lie as stumbling blocks along the road towards Exascale-class systems; the ExaNeSt project acknowledges them and tasks itself with investigating ways around them.

  20. The Human Toxome Collaboratorium: A Shared Environment for Multi-Omic Computational Collaboration within a Consortium.

    PubMed

    Fasani, Rick A; Livi, Carolina B; Choudhury, Dipanwita R; Kleensang, Andre; Bouhifd, Mounir; Pendse, Salil N; McMullen, Patrick D; Andersen, Melvin E; Hartung, Thomas; Rosenberg, Michael

    2015-01-01

    The Human Toxome Project is part of a long-term vision to modernize toxicity testing for the 21st century. In the initial phase of the project, a consortium of six academic, commercial, and government organizations has partnered to map pathways of toxicity, using endocrine disruption as a model hazard. Experimental data is generated at multiple sites, and analyzed using a range of computational tools. While effectively gathering, managing, and analyzing the data for high-content experiments is a challenge in its own right, doing so for a growing number of -omics technologies, with larger data sets, across multiple institutions complicates the process. Interestingly, one of the most difficult, ongoing challenges has been the computational collaboration between the geographically separate institutions. Existing solutions cannot handle the growing heterogeneous data, provide a computational environment for consistent analysis, accommodate different workflows, and adapt to the constantly evolving methods and goals of a research project. To meet the needs of the project, we have created and managed The Human Toxome Collaboratorium, a shared computational environment hosted on third-party cloud services. The Collaboratorium provides a familiar virtual desktop, with a mix of commercial, open-source, and custom-built applications. It shares some of the challenges of traditional information technology, but with unique and unexpected constraints that emerge from the cloud. Here we describe the problems we faced, the current architecture of the solution, an example of its use, the major lessons we learned, and the future potential of the concept. In particular, the Collaboratorium represents a novel distribution method that could increase the reproducibility and reusability of results from similar large, multi-omic studies.

  1. Use of Emerging Grid Computing Technologies for the Analysis of LIGO Data

    NASA Astrophysics Data System (ADS)

    Koranda, Scott

    2004-03-01

    The LIGO Scientific Collaboration (LSC) today faces the challenge of enabling analysis of terabytes of LIGO data by hundreds of scientists from institutions all around the world. To meet this challenge the LSC is developing tools, infrastructure, applications, and expertise leveraging Grid Computing technologies available today, and making available to LSC scientists compute resources at sites across the United States and Europe. We use digital credentials for strong and secure authentication and authorization to compute resources and data. Building on top of products from the Globus project for high-speed data transfer and information discovery we have created the Lightweight Data Replicator (LDR) to securely and robustly replicate data to resource sites. We have deployed at our computing sites the Virtual Data Toolkit (VDT) Server and Client packages, developed in collaboration with our partners in the GriPhyN and iVDGL projects, providing uniform access to distributed resources for users and their applications. Taken together these Grid Computing technologies and infrastructure have formed the LSC DataGrid--a coherent and uniform environment across two continents for the analysis of gravitational-wave detector data. Much work, however, remains in order to scale current analyses and recent lessons learned need to be integrated into the next generation of Grid middleware.

  2. Principles of metadata organization at the ENCODE data coordination center

    PubMed Central

    Hong, Eurie L.; Sloan, Cricket A.; Chan, Esther T.; Davidson, Jean M.; Malladi, Venkat S.; Strattan, J. Seth; Hitz, Benjamin C.; Gabdank, Idan; Narayanan, Aditi K.; Ho, Marcus; Lee, Brian T.; Rowe, Laurence D.; Dreszer, Timothy R.; Roe, Greg R.; Podduturi, Nikhil R.; Tanaka, Forrest; Hilton, Jason A.; Cherry, J. Michael

    2016-01-01

    The Encyclopedia of DNA Elements (ENCODE) Data Coordinating Center (DCC) is responsible for organizing, describing and providing access to the diverse data generated by the ENCODE project. The description of these data, known as metadata, includes the biological sample used as input, the protocols and assays performed on these samples, the data files generated from the results and the computational methods used to analyze the data. Here, we outline the principles and philosophy used to define the ENCODE metadata in order to create a metadata standard that can be applied to diverse assays and multiple genomic projects. In addition, we present how the data are validated and used by the ENCODE DCC in creating the ENCODE Portal (https://www.encodeproject.org/). Database URL: www.encodeproject.org PMID:26980513

  3. Enhancing Collaborative Learning through Group Intelligence Software

    NASA Astrophysics Data System (ADS)

    Tan, Yin Leng; Macaulay, Linda A.

    Employers increasingly demand not only academic excellence from graduates but also excellent interpersonal skills and the ability to work collaboratively in teams. This paper discusses the role of Group Intelligence software in helping to develop these higher order skills in the context of an enquiry based learning (EBL) project. The software supports teams in generating ideas, categorizing, prioritizing, voting and multi-criteria decision making and automatically generates a report of each team session. Students worked in a Group Intelligence lab designed to support both face to face and computer-mediated communication and employers provided feedback at two key points in the year long team project. Evaluation of the effectiveness of Group Intelligence software in collaborative learning was based on five key concepts of creativity, participation, productivity, engagement and understanding.

  4. Computer based interpretation of infrared spectra-structure of the knowledge-base, automatic rule generation and interpretation

    NASA Astrophysics Data System (ADS)

    Ehrentreich, F.; Dietze, U.; Meyer, U.; Abbas, S.; Schulz, H.

    1995-04-01

    A main task within the SpecInfo project is to develop interpretation tools that can handle many more of the complicated, more specific spectrum-structure correlations. In the first step, the empirical knowledge about the assignment of structural groups and their characteristic IR bands was collected from the literature and represented in a computer-readable, well-structured form. Vague, verbal rules are managed by introducing linguistic variables. The next step was the development of automatic rule-generating procedures. We combined and enlarged the IDIOTS algorithm with Blaffert's set-theory-based algorithm. The procedures were successfully applied to the SpecInfo database. The realization of the preceding items is a prerequisite for the improvement of the computerized structure elucidation procedure.
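
    In miniature, such assignment rules pair a structural group with a characteristic wavenumber interval and a verbal intensity term. The sketch below uses textbook-style example values, not SpecInfo data; the intensity strings stand in for the linguistic variables:

```python
# Rule base: structural group -> (low cm^-1, high cm^-1, verbal intensity).
RULES = {
    "O-H stretch": (3200, 3550, "strong, broad"),
    "C=O stretch": (1670, 1780, "strong"),
    "C-H stretch": (2850, 3000, "medium"),
}

def assign_bands(peaks_cm1):
    """Return the candidate structural groups whose characteristic
    interval contains each observed IR band position."""
    return {peak: [name for name, (lo, hi, _) in RULES.items()
                   if lo <= peak <= hi]
            for peak in peaks_cm1}

print(assign_bands([1715, 2920, 3400]))
```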

  5. Computer integrated documentation

    NASA Technical Reports Server (NTRS)

    Boy, Guy

    1991-01-01

    The main technical issues of the Computer Integrated Documentation (CID) project are presented. The problem of automating document management and maintenance is analyzed from both an artificial intelligence viewpoint and a human factors viewpoint. Possible technologies for CID are reviewed: conventional approaches to indexing and information retrieval; hypertext; and knowledge-based systems. A particular effort was made to provide an appropriate representation for contextual knowledge. This representation is used to generate context on hypertext links; thus, indexing in CID is context-sensitive. The implementation of the current version of CID is described. It includes a hypertext database, a knowledge-based management and maintenance system, and a user interface. A series of theoretical considerations is also presented, such as navigation in hyperspace, acquisition of indexing knowledge, generation and maintenance of large documentation, and relations to other work.

  6. A HWIL test facility of infrared imaging laser radar using direct signal injection

    NASA Astrophysics Data System (ADS)

    Wang, Qian; Lu, Wei; Wang, Chunhui; Wang, Qi

    2005-01-01

    Laser radar has been widely used in recent years, and hardware-in-the-loop (HWIL) testing of laser radar has become important because of its low cost and high fidelity compared with on-the-fly testing and all-digital simulation, respectively. Scene generation and projection are two key technologies in hardware-in-the-loop testing of laser radar, and they are complicated because the 3D images result from time delay. The scene generation process begins with the definition of the target geometry, reflectivity, and range. The real-time 3D scene generation computer is PC-based hardware, and the 3D target models were built using 3dsMAX. The scene generation software was written in C and OpenGL and is executed to extract the Z-buffer from the bit planes to main memory as a range image. These pixels contain each target position x, y, z and its respective intensity and range values. Expensive optical injection technologies for scene projection, such as LDP arrays, VCSEL arrays, and DMDs, and the associated scene generation, are ongoing work; however, optical scene projection is complicated and often unaffordable. In this paper a cheaper test facility is described that uses direct electronic injection to provide range images for laser radar testing. Electronic delay and pulse-shaping circuits inject the scenes directly into the seeker's signal processing unit.
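
    The depth values read back from the Z-buffer are nonlinear in distance, so the scene generator must invert the projection to obtain metric range. For a standard OpenGL perspective projection the inversion is closed-form; the near/far values below are made up for the sanity check:

```python
import numpy as np

def zbuffer_to_range(z_w, near, far):
    """Convert normalized window-space depths (0..1) from a standard
    OpenGL perspective projection back to metric eye-space range."""
    z_w = np.asarray(z_w, dtype=float)
    return (far * near) / (far - z_w * (far - near))

# Depths 0 and 1 must map back to the clip planes themselves:
print(zbuffer_to_range([0.0, 1.0], near=10.0, far=3000.0))  # [10., 3000.]
```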

  7. Analyzing How We Do Analysis and Consume Data, Results from the SciDAC-Data Project

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ding, P.; Aliaga, L.; Mubarak, M.

    One of the main goals of the Dept. of Energy funded SciDAC-Data project is to analyze the more than 410,000 high energy physics datasets that have been collected, generated and defined over the past two decades by experiments using the Fermilab storage facilities. These datasets have been used as the input to over 5.6 million recorded analysis projects, for which detailed analytics have been gathered. The analytics and meta information for these datasets and analysis projects are being combined with knowledge of their part of the HEP analysis chains for major experiments to understand how modern computing and data delivery is being used. We present the first results of this project, which examine in detail how the CDF, D0, NOvA, MINERvA and MicroBooNE experiments have organized, classified and consumed petascale datasets to produce their physics results. The results include analysis of the correlations in dataset/file overlap, data usage patterns, data popularity, dataset dependency and temporary dataset consumption. The results provide critical insight into how workflows and data delivery schemes can be combined with different caching strategies to more efficiently perform the work required to mine these large HEP data volumes and to understand the physics analysis requirements for the next generation of HEP computing facilities. In particular we present a detailed analysis of the NOvA data organization and consumption model corresponding to their first and second oscillation results (2014-2016) and the first look at the analysis of the Tevatron Run II experiments. We present statistical distributions for the characterization of these data and data driven models describing their consumption.

  8. Analyzing how we do Analysis and Consume Data, Results from the SciDAC-Data Project

    NASA Astrophysics Data System (ADS)

    Ding, P.; Aliaga, L.; Mubarak, M.; Tsaris, A.; Norman, A.; Lyon, A.; Ross, R.

    2017-10-01

    One of the main goals of the Dept. of Energy funded SciDAC-Data project is to analyze the more than 410,000 high energy physics datasets that have been collected, generated and defined over the past two decades by experiments using the Fermilab storage facilities. These datasets have been used as the input to over 5.6 million recorded analysis projects, for which detailed analytics have been gathered. The analytics and meta information for these datasets and analysis projects are being combined with knowledge of their part of the HEP analysis chains for major experiments to understand how modern computing and data delivery is being used. We present the first results of this project, which examine in detail how the CDF, D0, NOvA, MINERvA and MicroBooNE experiments have organized, classified and consumed petascale datasets to produce their physics results. The results include analysis of the correlations in dataset/file overlap, data usage patterns, data popularity, dataset dependency and temporary dataset consumption. The results provide critical insight into how workflows and data delivery schemes can be combined with different caching strategies to more efficiently perform the work required to mine these large HEP data volumes and to understand the physics analysis requirements for the next generation of HEP computing facilities. In particular we present a detailed analysis of the NOvA data organization and consumption model corresponding to their first and second oscillation results (2014-2016) and the first look at the analysis of the Tevatron Run II experiments. We present statistical distributions for the characterization of these data and data driven models describing their consumption.

  9. Functional identification of spike-processing neural circuits.

    PubMed

    Lazar, Aurel A; Slutskiy, Yevgeniy B

    2014-02-01

    We introduce a novel approach for a complete functional identification of biophysical spike-processing neural circuits. The circuits considered accept multidimensional spike trains as their input and comprise a multitude of temporal receptive fields and conductance-based models of action potential generation. Each temporal receptive field describes the spatiotemporal contribution of all synapses between any two neurons and incorporates the (passive) processing carried out by the dendritic tree. The aggregate dendritic current produced by a multitude of temporal receptive fields is encoded into a sequence of action potentials by a spike generator modeled as a nonlinear dynamical system. Our approach builds on the observation that during any experiment, an entire neural circuit, including its receptive fields and biophysical spike generators, is projected onto the space of stimuli used to identify the circuit. Employing the reproducing kernel Hilbert space (RKHS) of trigonometric polynomials to describe input stimuli, we quantitatively describe the relationship between underlying circuit parameters and their projections. We also derive experimental conditions under which these projections converge to the true parameters. In doing so, we achieve the mathematical tractability needed to characterize the biophysical spike generator and identify the multitude of receptive fields. The algorithms obviate the need to repeat experiments in order to compute the neurons' rate of response, rendering our methodology of interest to both experimental and theoretical neuroscientists.

  10. Parallelizable 3D statistical reconstruction for C-arm tomosynthesis system

    NASA Astrophysics Data System (ADS)

    Wang, Beilei; Barner, Kenneth; Lee, Denny

    2005-04-01

    Clinical diagnosis and security detection tasks increasingly require 3D information, which is difficult or impossible to obtain from 2D radiographs. As a 3D radiographic and non-destructive imaging technique, digital tomosynthesis is especially suited to cases where 3D information is required but complete projection data are not available. Nowadays, FBP (filtered back projection) is extensively used in industry for its speed and simplicity. However, it has difficulty with situations where only a limited number of projections from constrained directions are available, or where the SNR (signal-to-noise ratio) of the projections is low. In order to deal with noise and take into account a priori information about the object, a statistical image reconstruction method is described based on the acquisition model of X-ray projections. We formulate an ML (maximum likelihood) function for this model and develop an ordered-subsets iterative algorithm to estimate the unknown attenuation of the object. Simulations show that satisfactory results can be obtained after 1 to 2 iterations, after which there is no significant improvement in image quality. An adaptive Wiener filter is also applied to the reconstructed image to remove noise. Some approximations to speed up the reconstruction computation are also considered. Applying this method to computer-generated projections of a revised Shepp phantom and to true projections from diagnostic radiographs of a patient's hand and mammography images yields reconstructions of impressive quality. Parallel programming is also implemented and tested. The quality of the reconstructed object is preserved, while the computation time is reduced by almost a factor of the number of threads used.
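
    The paper formulates an ML objective for an X-ray acquisition model and solves it with ordered subsets. As a hedged illustration of the family of algorithms involved, here is the textbook multiplicative ML-EM update for a Poisson model y ~ Poisson(Ax); an ordered-subsets variant applies the same update cyclically to subsets of the rays, and the transmission model of the paper refines the likelihood:

```python
import numpy as np

def mlem(A, y, n_iter=50, eps=1e-12):
    """Classic ML-EM update for y ~ Poisson(Ax): A is the
    (n_rays, n_voxels) system matrix, y the measured projections."""
    x = np.ones(A.shape[1])
    norm = A.T @ np.ones(A.shape[0])           # sensitivity image
    for _ in range(n_iter):
        ratio = y / np.maximum(A @ x, eps)     # measured / predicted
        x *= (A.T @ ratio) / np.maximum(norm, eps)
    return x

# Tiny 2-voxel, 3-ray toy problem with consistent, noise-free data:
A = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
x_true = np.array([2.0, 5.0])
print(mlem(A, A @ x_true))                     # converges toward x_true
```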

  11. ExM:System Support for Extreme-Scale, Many-Task Applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Katz, Daniel S

    The ever-increasing power of supercomputer systems is both driving and enabling the emergence of new problem-solving methods that require the efficient execution of many concurrent and interacting tasks. Methodologies such as rational design (e.g., in materials science), uncertainty quantification (e.g., in engineering), parameter estimation (e.g., for chemical and nuclear potential functions, and in economic energy systems modeling), massive dynamic graph pruning (e.g., in phylogenetic searches), Monte-Carlo-based iterative fixing (e.g., in protein structure prediction), and inverse modeling (e.g., in reservoir simulation) all have these requirements. These many-task applications frequently have aggregate computing needs that demand the fastest computers. For example, proposed next-generation climate model ensemble studies will involve 1,000 or more runs, each requiring 10,000 cores for a week, to characterize model sensitivity to initial condition and parameter uncertainty. The goal of the ExM project is to achieve the technical advances required to execute such many-task applications efficiently, reliably, and easily on petascale and exascale computers. In this way, we will open up extreme-scale computing to new problem-solving methods and application classes. In this document, we report on the combined technical progress of the collaborative ExM project, and the institutional financial status of the portion of the project at University of Chicago, over the first 8 months (through April 30, 2011).

  12. Optimizing Preseason Training Loads in Australian Football.

    PubMed

    Carey, David L; Crow, Justin; Ong, Kok-Leong; Blanch, Peter; Morris, Meg E; Dascombe, Ben J; Crossley, Kay M

    2018-02-01

    To investigate whether preseason training plans for Australian football can be computer generated using current training-load guidelines to optimize injury-risk reduction and performance improvement. A constrained optimization problem was defined for daily total and sprint distance, using the preseason schedule of an elite Australian football team as a template. Maximizing total training volume and maximizing Banister-model-projected performance were both considered optimization objectives. Cumulative workload and acute:chronic workload-ratio constraints were placed on training programs to reflect current guidelines on relative and absolute training loads for injury-risk reduction. Optimization software was then used to generate preseason training plans. The optimization framework was able to generate training plans that satisfied relative and absolute workload constraints. Increasing the off-season chronic training loads enabled the optimization algorithm to prescribe higher amounts of "safe" training and attain higher projected performance levels. Simulations showed that using a Banister-model objective led to plans that included a taper in training load prior to competition to minimize fatigue and maximize projected performance. In contrast, when the objective was to maximize total training volume, more frequent training was prescribed to accumulate as much load as possible. Feasible training plans that maximize projected performance and satisfy injury-risk constraints can be automatically generated by an optimization problem for Australian football. The optimization methods allow for individualized training-plan design and the ability to adapt to changing training objectives and different training-load metrics.
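
    To make the two optimization objectives concrete, the sketch below evaluates a candidate load plan under a Banister impulse-response performance model and computes the acute:chronic workload ratio used as the injury-risk constraint. All model constants are invented, not the paper's fitted values:

```python
import numpy as np

def banister_performance(loads, p0=500.0, k1=1.0, k2=2.0,
                         tau1=42.0, tau2=7.0):
    """Banister model: performance = baseline + fitness - fatigue,
    each an exponentially weighted sum of past daily loads."""
    perf = np.empty(len(loads))
    for t in range(len(loads)):
        past = loads[:t]
        lag = t - np.arange(t)
        perf[t] = (p0 + k1 * np.sum(past * np.exp(-lag / tau1))
                      - k2 * np.sum(past * np.exp(-lag / tau2)))
    return perf

def acwr(loads, t, acute=7, chronic=28):
    """Acute:chronic workload ratio at day t (injury-risk proxy)."""
    acute_load = np.mean(loads[max(0, t - acute):t]) if t else 0.0
    chronic_load = np.mean(loads[max(0, t - chronic):t]) if t else 1.0
    return acute_load / max(chronic_load, 1e-9)

loads = np.full(90, 100.0)
loads[-10:] = 40.0                     # a taper before competition
print(banister_performance(loads)[-1], acwr(loads, 89))
```

    An optimizer then searches over `loads` to maximize either total volume or the Banister-projected performance, subject to the workload-ratio constraints; the taper emerges naturally from the fatigue term.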

  13. Short-term Temperature Prediction Using Adaptive Computing on Dynamic Scales

    NASA Astrophysics Data System (ADS)

    Hu, W.; Cervone, G.; Jha, S.; Balasubramanian, V.; Turilli, M.

    2017-12-01

    When predicting temperature, there are specific places and times when high-accuracy predictions are harder. For example, not all the sub-regions in the domain require the same amount of computing resources to generate an accurate prediction. Plateau areas might require fewer computing resources than mountainous areas because of the steeper gradient of temperature change in the latter. However, it is difficult to estimate beforehand the optimal allocation of computational resources, because several parameters in addition to orography play a role in determining the accuracy of the forecasts. The allocation of resources to perform simulations can become a bottleneck because it requires human intervention to stop jobs or start new ones. The goal of this project is to design and develop a dynamic approach to generating short-term temperature predictions that automatically determines the required computing resources and the geographic scales of the predictions based on the spatial and temporal uncertainties. The predictions and the prediction quality metrics are computed using the Analog Ensemble (AnEn) technique, and the parallelization on high performance computing systems is accomplished using Ensemble Toolkit, one component of the RADICAL-Cybertools family of tools. RADICAL-Cybertools decouple the science needs from the computational capabilities by building an intermediate layer to run general ensemble patterns, regardless of the science. In this research, we show how Ensemble Toolkit allows generating high-resolution temperature forecasts at different spatial and temporal resolutions. The AnEn algorithm is run using NAM analysis and forecast data for the continental United States for a period of 2 years. The AnEn results show that the temperature forecasts perform well according to different probabilistic and deterministic statistical tests.
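
    The analog-ensemble idea itself is compact: find the k historical forecasts most similar to the current forecast and use their verifying observations as the predictive ensemble. A simplified sketch on synthetic data (the operational AnEn uses NAM predictors and a weighted, windowed similarity metric):

```python
import numpy as np

def analog_ensemble(hist_fcst, hist_obs, current_fcst, k=20):
    """Return the observations that verified the k past forecasts
    most similar (Euclidean distance) to the current forecast."""
    dist = np.linalg.norm(hist_fcst - current_fcst, axis=1)
    idx = np.argsort(dist)[:k]
    return hist_obs[idx]

rng = np.random.default_rng(0)
hist_fcst = rng.normal(15, 8, size=(5000, 3))         # toy predictors
hist_obs = hist_fcst[:, 0] + rng.normal(0, 1, 5000)   # verifying temps
ens = analog_ensemble(hist_fcst, hist_obs, np.array([20.0, 5.0, 0.7]))
print(ens.mean(), ens.std())   # ensemble mean and spread (uncertainty)
```

    The ensemble spread is exactly the per-location uncertainty signal the project uses to decide where to spend more computing resources.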

  14. Analysis of chaos attractors of MCG-recordings.

    PubMed

    Jiang, Shiqin; Yang, Fan; Yi, Panke; Chen, Bo; Luo, Ming; Wang, Lemin

    2006-01-01

    By studying the chaos attractor of the cardiac magnetic induction strength B(z) generated by the electrical activity of the heart, we found that its projection in the reconstructed phase space has a shape similar to the map of the total current dipole vector. It is worth noting that the map of the total current dipole vector is computed from MCG recordings measured at 36 locations, whereas the chaos attractor of B(z) is generated from only one cardiac magnetic field recording on the measurement plane. We discuss only two subjects of different ages in this paper.
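
    Reconstructing a phase space from a single recorded component is conventionally done with delay-coordinate (Takens) embedding; a minimal sketch on a synthetic B(z)-like signal (the paper's embedding dimension and lag are not given in this record, so both are illustrative):

```python
import numpy as np

def delay_embed(x, dim=3, lag=25):
    """Takens delay-coordinate embedding of a scalar series x:
    each row is (x[t], x[t+lag], ..., x[t+(dim-1)*lag])."""
    n = len(x) - (dim - 1) * lag
    return np.column_stack([x[i * lag : i * lag + n] for i in range(dim)])

t = np.linspace(0, 60, 6000)
bz = np.sin(2 * np.pi * 1.2 * t) + 0.3 * np.sin(2 * np.pi * 2.4 * t)
attractor = delay_embed(bz)        # points trace a closed loop in 3-D
print(attractor.shape)
```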

  15. Physics and Novel Schemes of Laser Radiation Pressure Acceleration for Quasi-monoenergetic Proton Generation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liu, Chuan S.; Shao, Xi

    2016-06-14

    The main objective of our work is to provide theoretical basis and modeling support for the design and experimental setup of a compact laser proton accelerator to produce high quality proton beams tunable in energy from 50 to 250 MeV using a short-pulse sub-petawatt laser. We performed theoretical and computational studies of energy scaling and Rayleigh-Taylor instability development in laser radiation pressure acceleration (RPA) and developed novel RPA-based schemes to remedy/suppress instabilities for high-quality quasimonoenergetic proton beam generation, as we proposed. During the project period, we published nine peer-reviewed journal papers and made twenty conference presentations, including six invited talks on our work. The project supported one graduate student, who received his PhD degree in physics in 2013, and supported two post-doctoral associates. We also mentored three high school students and one undergraduate physics major by inspiring their interests and having them involved in the project.

  16. Design and analysis of sustainable computer mouse using design for disassembly methodology

    NASA Astrophysics Data System (ADS)

    Roni Sahroni, Taufik; Fitri Sukarman, Ahmad; Agung Mahardini, Karunia

    2017-12-01

    This paper presents the design and analysis of a computer mouse using the Design for Disassembly methodology. The existing computer mouse model consists of a number of unnecessary parts that add assembly and disassembly time in production. The objective of this project is to design a new computer mouse based on the Design for Disassembly (DFD) methodology. The methodology proceeded from sketch generation through concept selection and concept scoring. Based on the design screening, design concept B was selected for further analysis. A new computer mouse design is proposed using a fastening system. Furthermore, three materials (ABS, polycarbonate, and high-density PE) were evaluated to determine the environmental impact category. Sustainability analysis was conducted using SolidWorks software. As a result, high-density PE gives the lowest environmental impact while providing a high maximum stress value.

  17. Tomographic methods in flow diagnostics

    NASA Technical Reports Server (NTRS)

    Decker, Arthur J.

    1993-01-01

    This report presents a viewpoint of tomography that should be well adapted to currently available optical measurement technology as well as the needs of computational and experimental fluid dynamicists. The goals in mind are to record data with the fastest optical array sensors; to process the data with the fastest parallel-processing technology available for small computers; and to generate results for both experimental and theoretical data. An in-depth example treats interferometric data as it might be recorded in an aeronautics test facility, but the results are applicable whenever fluid properties are to be measured or applied from projections of those properties. The paper discusses both computed and neural net calibration tomography. The report also contains an overview of key definitions and computational methods, key references, computational problems such as ill-posedness, artifacts, and missing data, and some possible and current research topics.

  18. PyBoolNet: a python package for the generation, analysis and visualization of boolean networks.

    PubMed

    Klarner, Hannes; Streck, Adam; Siebert, Heike

    2017-03-01

    The goal of this project is to provide a simple interface to working with Boolean networks. Emphasis is put on easy access to a large number of common tasks including the generation and manipulation of networks, attractor and basin computation, model checking and trap space computation, execution of established graph algorithms as well as graph drawing and layouts. PyBoolNet is a Python package for working with Boolean networks that supports simple access to model checking via NuSMV, standard graph algorithms via NetworkX and visualization via dot. In addition, state-of-the-art attractor computation exploiting Potassco ASP is implemented. The package is function-based and uses only native Python and NetworkX data types. https://github.com/hklarner/PyBoolNet. hannes.klarner@fu-berlin.de.
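
    Rather than guess at the package's exact function signatures, here is a package-independent sketch of one task PyBoolNet automates: attractor computation for a synchronous Boolean network by exhaustive state enumeration. The three-gene network is made up; real networks need the symbolic methods (model checking, ASP) the package provides:

```python
from itertools import product

# Toy 3-gene network: each rule maps the state tuple to one new bit.
RULES = [
    lambda s: s[1] and not s[2],   # v0
    lambda s: s[0],                # v1
    lambda s: not s[0],            # v2
]

def step(state):
    return tuple(int(f(state)) for f in RULES)

def attractors():
    """Iterate every state under synchronous update until a state
    repeats, then record the cycle (fixed points are 1-cycles)."""
    found = set()
    for state in product((0, 1), repeat=len(RULES)):
        seen = []
        while state not in seen:
            seen.append(state)
            state = step(state)
        cycle = seen[seen.index(state):]
        k = cycle.index(min(cycle))        # canonical rotation
        found.add(tuple(cycle[k:] + cycle[:k]))
    return found

print(attractors())   # two fixed points and one 2-cycle
```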

  19. Next Generation Workload Management System For Big Data on Heterogeneous Distributed Computing

    NASA Astrophysics Data System (ADS)

    Klimentov, A.; Buncic, P.; De, K.; Jha, S.; Maeno, T.; Mount, R.; Nilsson, P.; Oleynik, D.; Panitkin, S.; Petrosyan, A.; Porter, R. J.; Read, K. F.; Vaniachine, A.; Wells, J. C.; Wenaus, T.

    2015-05-01

    The Large Hadron Collider (LHC), operating at the international CERN Laboratory in Geneva, Switzerland, is leading Big Data driven scientific explorations. Experiments at the LHC explore the fundamental nature of matter and the basic forces that shape our universe, and were recently credited for the discovery of a Higgs boson. ATLAS and ALICE are the largest collaborations ever assembled in the sciences and are at the forefront of research at the LHC. To address an unprecedented multi-petabyte data processing challenge, both experiments rely on a heterogeneous distributed computational infrastructure. The ATLAS experiment uses PanDA (Production and Data Analysis) Workload Management System (WMS) for managing the workflow for all data processing on hundreds of data centers. Through PanDA, ATLAS physicists see a single computing facility that enables rapid scientific breakthroughs for the experiment, even though the data centers are physically scattered all over the world. The scale is demonstrated by the following numbers: PanDA manages O(10^2) sites, O(10^5) cores, O(10^8) jobs per year, O(10^3) users, and the ATLAS data volume is O(10^17) bytes. In 2013 we started an ambitious program to expand PanDA to all available computing resources, including opportunistic use of commercial and academic clouds and Leadership Computing Facilities (LCF). The project titled 'Next Generation Workload Management and Analysis System for Big Data' (BigPanDA) is funded by DOE ASCR and HEP. Extending PanDA to clouds and LCF presents new challenges in managing heterogeneity and supporting workflow. The BigPanDA project is underway to setup and tailor PanDA at the Oak Ridge Leadership Computing Facility (OLCF) and at the National Research Center "Kurchatov Institute" together with ALICE distributed computing and ORNL computing professionals. Our approach to integration of HPC platforms at the OLCF and elsewhere is to reuse, as much as possible, existing components of the PanDA system. We will present our current accomplishments with running the PanDA WMS at OLCF and other supercomputers and demonstrate our ability to use PanDA as a portal independent of the computing facilities infrastructure for High Energy and Nuclear Physics as well as other data-intensive science applications.

  20. Materials Database Development for Ballistic Impact Modeling

    NASA Technical Reports Server (NTRS)

    Pereira, J. Michael

    2007-01-01

    A set of experimental data is being generated under the Fundamental Aeronautics Program Supersonics project to help create and validate accurate computational impact models of jet engine impact events. The data generated will include material property data generated at a range of different strain rates, from 1x10^-4/sec to 5x10^4/sec, over a range of temperatures. In addition, carefully instrumented ballistic impact tests will be conducted on flat plates and curved structures to provide material and structural response information to help validate the computational models. The material property data and the ballistic impact data will be generated using materials from the same lot, as far as possible. It was found in preliminary testing that the surface finish of test specimens has an effect on measured high strain rate tension response of AL2024. Both the maximum stress and maximum elongation are greater on specimens with a smoother finish. This report gives an overview of the testing that is being conducted and presents results of preliminary testing of the surface finish study.

  1. Continuous Variable Cluster State Generation over the Optical Spatial Mode Comb

    DOE PAGES

    Pooser, Raphael C.; Jing, Jietai

    2014-10-20

    One-way quantum computing uses single qubit projective measurements performed on a cluster state (a highly entangled state of multiple qubits) in order to enact quantum gates. The model is promising due to its potential scalability; the cluster state may be produced at the beginning of the computation and operated on over time. Continuous variables (CV) offer another potential benefit in the form of deterministic entanglement generation. This determinism can lead to robust cluster states and scalable quantum computation. Recent demonstrations of CV cluster states have made great strides on the path to scalability utilizing either time or frequency multiplexing in optical parametric oscillators (OPO) both above and below threshold. The techniques relied on a combination of entangling operators and beam splitter transformations. Here we show that an analogous transformation exists for amplifiers with Gaussian input states operating on multiple spatial modes. By judicious selection of local oscillators (LOs), the spatial mode distribution is analogous to the optical frequency comb consisting of axial modes in an OPO cavity. We outline an experimental system that generates cluster states across the spatial frequency comb which can also scale the amount of quantum noise reduction to potentially larger than in other systems.

  2. SigmaPlot 2000, Version 6.00, SPSS Inc. Computer Software Project Management, Requirements, and Design Document

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    HURLBUT, S.T.

    2000-10-24

    SigmaPlot is a vendor software product that will be used to convert the area under an absorbance curve generated by a Fourier transform infrared spectrometer (FTIR) to a relative area. SigmaPlot will be used in conjunction with procedure ZA-565-301, ''Determination of Moisture by Supercritical Fluid Extraction and Infrared Detection.''

  3. Boosting Bandwidth: Colleges Are Upgrading Their Network Infrastructure to Support Next-Generation Technologies

    ERIC Educational Resources Information Center

    Wong, Wylie

    2014-01-01

    In October 2013, nearly half of U.S. community colleges--46.2 percent-- said upgrading their campus networks is a "very important priority" within the next three years, according to The Campus Computing Project, a survey of 94 community colleges. More robust networks are needed to support a host of new wireless access points and the…

  4. New techniques for positron emission tomography in the study of human neurological disorders

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kuhl, D.E.

    1992-07-01

    The general goals of the physics and kinetic modeling projects are to: (1) improve the quantitative information extractable from PET images, and (2) develop, implement and optimize tracer kinetic models for new PET neurotransmitter/receptor ligands aided by computer simulations. Work towards improving PET quantification has included projects evaluating: (1) iterative reconstruction algorithms using supplemental boundary information, (2) automated registration of dynamic PET emission and transmission data using sinogram edge detection, and (3) automated registration of multiple subjects to a common coordinate system, including the use of non-linear warping methods. Simulation routines have been developed providing more accurate representation of data generated from neurotransmitter/receptor studies. Routines consider data generated from complex compartmental models, high or low specific activity administrations, non-specific binding, pre- or post-injection of cold or competing ligands, temporal resolution of the data, and radiolabeled metabolites. Computer simulations and human PET studies have been performed to optimize kinetic models for four new neurotransmitter/receptor ligands, [{sup 11}C]TRB (muscarinic), [{sup 11}C]flumazenil (benzodiazepine), [{sup 18}F]GBR12909, (dopamine), and [{sup 11}C]NMPB (muscarinic).

  5. New techniques for positron emission tomography in the study of human neurological disorders

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kuhl, D.E.

    1992-01-01

    The general goals of the physics and kinetic modeling projects are to: (1) improve the quantitative information extractable from PET images, and (2) develop, implement and optimize tracer kinetic models for new PET neurotransmitter/receptor ligands aided by computer simulations. Work towards improving PET quantification has included projects evaluating: (1) iterative reconstruction algorithms using supplemental boundary information, (2) automated registration of dynamic PET emission and transmission data using sinogram edge detection, and (3) automated registration of multiple subjects to a common coordinate system, including the use of non-linear warping methods. Simulation routines have been developed providing more accurate representation of data generated from neurotransmitter/receptor studies. Routines consider data generated from complex compartmental models, high or low specific activity administrations, non-specific binding, pre- or post-injection of cold or competing ligands, temporal resolution of the data, and radiolabeled metabolites. Computer simulations and human PET studies have been performed to optimize kinetic models for four new neurotransmitter/receptor ligands, ({sup 11}C)TRB (muscarinic), ({sup 11}C)flumazenil (benzodiazepine), ({sup 18}F)GBR12909, (dopamine), and ({sup 11}C)NMPB (muscarinic).

  6. Review of Cranked-Arrow Wing Aerodynamics Project: Its International Aeronautical Community Role

    NASA Technical Reports Server (NTRS)

    Lamar, John E.; Obara, Clifford J.

    2007-01-01

    This paper provides a brief history of the F-16XL-1 aircraft, its role in the High Speed Research (HSR) program, and how it was morphed into the Cranked Arrow Wing Aerodynamics Project (CAWAP). Various flight, wind-tunnel and Computational Fluid Dynamics (CFD) data sets were generated during the CAWAP. These unique and open flight datasets for surface pressures, boundary-layer profiles and skin-friction distributions, along with surface flow data, are described and sample data comparisons given. This is followed by a description of how the project became internationalized as the Cranked Arrow Wing Aerodynamics Project International (CAWAPI), and it concludes with an introduction to the results of a four-year CFD predictive study of data collected at flight conditions by participating researchers.

  7. Great Lakes Offshore Wind Project: Utility and Regional Integration Study

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sajadi, Amirhossein; Loparo, Kenneth A.; D'Aquila, Robert

    This project aims to identify the transmission system upgrades needed to facilitate offshore wind projects, as well as the operational impacts of offshore generation on the regional transmission system in the Great Lakes region. A simulation model of the US Eastern Interconnection was used as the test system for a case study investigating the integration of a 1000-MW offshore wind farm operating in Lake Erie into the FirstEnergy/PJM service territory. The findings of this research provide recommendations on offshore wind integration scenarios, the locations of points of interconnection, wind profile modeling and simulation, and computational methods to quantify performance, along with the operating changes and equipment upgrades needed to mitigate system performance issues introduced by an offshore wind project.

  8. Inventory of Federal energy-related environment and safety research for FY 1978. Volume II. Project listings and indexes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    1979-12-01

    This volume contains summaries of FY-1978 government-sponsored environment and safety research related to energy. Project summaries were collected by Aerospace Corporation under contract with the Department of Energy, Office of Program Coordination, under the Assistant Secretary for Environment. Summaries are arranged by log number, which groups the projects by reporting agency. The log number is a unique number assigned to each project from a block of numbers set aside for each agency. Information about the projects is included in the summary listings. This includes the project title, principal investigators, research organization, project number, contract number, supporting organization, funding level if known, related energy sources with numbers indicating percentages of effort devoted to each, and R and D categories. A brief description of each project is given, and this is followed by subject index terms that were assigned for computer searching and for generating the printed subject index in Volume IV.

  9. Undergraduate computational physics projects on quantum computing

    NASA Astrophysics Data System (ADS)

    Candela, D.

    2015-08-01

    Computational projects on quantum computing suitable for students in a junior-level quantum mechanics course are described. In these projects students write their own programs to simulate quantum computers. Knowledge is assumed of introductory quantum mechanics through the properties of spin 1/2. Initial, more easily programmed projects treat the basics of quantum computation, quantum gates, and Grover's quantum search algorithm. These are followed by more advanced projects to increase the number of qubits and implement Shor's quantum factoring algorithm. The projects can be run on a typical laptop or desktop computer, using most programming languages. Supplementing resources available elsewhere, the projects are presented here in a self-contained format especially suitable for a short computational module for physics students.
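
    The article's basic projects (state vectors, gates, Grover's search) can be illustrated with a few lines of NumPy. The sketch below is not taken from the paper's materials; it simply shows the kind of simulation a student might write, with the oracle as a sign flip on the marked amplitude and the diffusion step as inversion about the mean.

```python
import numpy as np

def grover_search(n_qubits, marked):
    """Simulate Grover's search and return the final probability of the marked state."""
    N = 2 ** n_qubits
    state = np.full(N, 1.0 / np.sqrt(N))            # uniform superposition after Hadamards

    oracle = np.ones(N)
    oracle[marked] = -1.0                            # oracle: phase flip on the marked item

    n_iter = int(np.floor(np.pi / 4 * np.sqrt(N)))   # near-optimal number of Grover iterations
    for _ in range(n_iter):
        state *= oracle                              # apply the oracle
        state = 2 * state.mean() - state             # inversion about the mean (diffusion)

    return np.abs(state[marked]) ** 2

print(grover_search(6, marked=37))                   # probability close to 1
```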

  10. A concurrent distributed system for aircraft tactical decision generation

    NASA Technical Reports Server (NTRS)

    Mcmanus, John W.

    1990-01-01

    A research program investigating the use of AI techniques to aid in the development of a tactical decision generator (TDG) for within visual range (WVR) air combat engagements is discussed. The application of AI programming and problem-solving methods in the development and implementation of a concurrent version of the computerized logic for air-to-air warfare simulations (CLAWS) program, a second-generation TDG, is presented. Concurrent computing environments and programming approaches are discussed, and the design and performance of a prototype concurrent TDG system (Cube CLAWS) are presented. It is concluded that the Cube CLAWS has provided a useful testbed to evaluate the development of a distributed blackboard system. The project has shown that the complexity of developing specialized software on a distributed, message-passing architecture such as the Hypercube is not overwhelming, and that reasonable speedups and processor efficiency can be achieved by a distributed blackboard system. The project has also highlighted some of the costs of using a distributed approach to designing a blackboard system.

  11. Generating a Dynamic Synthetic Population – Using an Age-Structured Two-Sex Model for Household Dynamics

    PubMed Central

    Namazi-Rad, Mohammad-Reza; Mokhtarian, Payam; Perez, Pascal

    2014-01-01

    Generating a reliable computer-simulated synthetic population is necessary for knowledge processing and decision-making analysis in agent-based systems in order to measure, interpret and describe each target area and the human activity patterns within it. In this paper, both synthetic reconstruction (SR) and combinatorial optimisation (CO) techniques are discussed for generating a reliable synthetic population for a certain geographic region (in Australia) using aggregated- and disaggregated-level information available for such an area. A CO algorithm using the quadratic function of population estimators is presented in this paper in order to generate a synthetic population while considering a two-fold nested structure for the individuals and households within the target areas. The baseline population in this study is generated from the confidentialised unit record files (CURFs) and 2006 Australian census tables. The dynamics of the created population is then projected over five years using a dynamic micro-simulation model for individual- and household-level demographic transitions. This projection is then compared with the 2011 Australian census. A prediction interval is provided for the population estimates obtained by the bootstrapping method, by which the variability structure of a predictor can be replicated in a bootstrap distribution. PMID:24733522
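
    As a rough illustration of the combinatorial-optimisation idea (selecting household records so that their aggregated attributes match census marginals under a quadratic fit criterion), the sketch below uses a hypothetical record structure (dicts with an "attrs" list of category labels) and simple swap-based hill climbing; it is not the authors' algorithm, only the general pattern.

```python
import random

def marginal_counts(selection, targets):
    """Aggregate the attributes of selected records into marginal counts."""
    counts = {c: 0 for c in targets}
    for record in selection:
        for c in record["attrs"]:
            if c in counts:
                counts[c] += 1
    return counts

def quadratic_fit(counts, targets):
    """Quadratic distance between synthetic marginals and census targets."""
    return sum((counts[c] - targets[c]) ** 2 for c in targets)

def co_synthesis(records, targets, n_households, n_iter=20000, seed=0):
    """Combinatorial optimisation by record swapping: keep a swap only if it
    does not worsen the quadratic fit to the target tables."""
    rng = random.Random(seed)
    selection = [rng.choice(records) for _ in range(n_households)]
    best = quadratic_fit(marginal_counts(selection, targets), targets)
    for _ in range(n_iter):
        i = rng.randrange(n_households)
        old = selection[i]
        selection[i] = rng.choice(records)
        new = quadratic_fit(marginal_counts(selection, targets), targets)
        if new <= best:
            best = new
        else:
            selection[i] = old                      # reject the swap
    return selection, best
```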

  12. A method to encapsulate model structural uncertainty in ensemble projections of future climate: EPIC v1.0

    NASA Astrophysics Data System (ADS)

    Lewis, Jared; Bodeker, Greg E.; Kremser, Stefanie; Tait, Andrew

    2017-12-01

    A method, based on climate pattern scaling, has been developed to expand a small number of projections of fields of a selected climate variable (X) into an ensemble that encapsulates a wide range of indicative model structural uncertainties. The method described in this paper is referred to as the Ensemble Projections Incorporating Climate model uncertainty (EPIC) method. Each ensemble member is constructed by adding contributions from (1) a climatology derived from observations that represents the time-invariant part of the signal; (2) a contribution from forced changes in X, where those changes can be statistically related to changes in global mean surface temperature (Tglobal); and (3) a contribution from unforced variability that is generated by a stochastic weather generator. The patterns of unforced variability are also allowed to respond to changes in Tglobal. The statistical relationships between changes in X (and its patterns of variability) and Tglobal are obtained in a training phase. Then, in an implementation phase, 190 simulations of Tglobal are generated using a simple climate model tuned to emulate 19 different global climate models (GCMs) and 10 different carbon cycle models. Using the generated Tglobal time series and the correlation between the forced changes in X and Tglobal, obtained in the training phase, the forced change in the X field can be generated many times using Monte Carlo analysis. A stochastic weather generator is used to generate realistic representations of weather which include spatial coherence. Because GCMs and regional climate models (RCMs) are less likely to correctly represent unforced variability compared to observations, the stochastic weather generator takes as input measures of variability derived from observations, but also responds to forced changes in climate in a way that is consistent with the RCM projections. This approach to generating a large ensemble of projections is many orders of magnitude more computationally efficient than running multiple GCM or RCM simulations. Such a large ensemble of projections permits a description of a probability density function (PDF) of future climate states rather than a small number of individual story lines within that PDF, which may not be representative of the PDF as a whole; the EPIC method largely corrects for such potential sampling biases. The method is useful for providing projections of changes in climate to users wishing to investigate the impacts and implications of climate change in a probabilistic way. A web-based tool, using the EPIC method to provide probabilistic projections of changes in daily maximum and minimum temperatures for New Zealand, has been developed and is described in this paper.
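
    A minimal sketch of the additive construction described above (observed climatology + pattern-scaled forced change + stochastic unforced variability). The array shapes, the stand-in weather generator, and all numbers are illustrative assumptions, not values from the paper.

```python
import numpy as np

def epic_member(climatology, beta, t_global, weather_gen, rng):
    """One ensemble member: climatology + pattern-scaled forced change + stochastic
    unforced variability. climatology/beta are spatial fields; t_global is a time series."""
    forced = beta[None, :] * t_global[:, None]            # (time, space) forced change
    unforced = weather_gen(len(t_global), rng)            # (time, space) stochastic weather
    return climatology[None, :] + forced + unforced

# Illustrative inputs: 3 grid points, 50 time steps (all made up).
rng = np.random.default_rng(0)
clim = np.array([14.0, 15.0, 13.5])                        # observed climatology
beta = np.array([1.2, 0.8, 1.0])                           # local change per 1 K of Tglobal
t_glob = np.linspace(0.0, 2.0, 50)                         # emulated global warming path
noise = lambda n, r: 0.5 * r.standard_normal((n, 3))       # stand-in weather generator
member = epic_member(clim, beta, t_glob, noise, rng)
print(member.shape)                                        # (50, 3)
```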

  13. Practical implementation of tetrahedral mesh reconstruction in emission tomography

    PubMed Central

    Boutchko, R.; Sitek, A.; Gullberg, G. T.

    2014-01-01

    This paper presents a practical implementation of image reconstruction on tetrahedral meshes optimized for emission computed tomography with parallel beam geometry. Tetrahedral mesh built on a point cloud is a convenient image representation method, intrinsically three-dimensional and with a multi-level resolution property. Image intensities are defined at the mesh nodes and linearly interpolated inside each tetrahedron. For the given mesh geometry, the intensities can be computed directly from tomographic projections using iterative reconstruction algorithms with a system matrix calculated using an exact analytical formula. The mesh geometry is optimized for a specific patient using a two stage process. First, a noisy image is reconstructed on a finely-spaced uniform cloud. Then, the geometry of the representation is adaptively transformed through boundary-preserving node motion and elimination. Nodes are removed in constant intensity regions, merged along the boundaries, and moved in the direction of the mean local intensity gradient in order to provide higher node density in the boundary regions. Attenuation correction and detector geometric response are included in the system matrix. Once the mesh geometry is optimized, it is used to generate the final system matrix for ML-EM reconstruction of node intensities and for visualization of the reconstructed images. In dynamic PET or SPECT imaging, the system matrix generation procedure is performed using a quasi-static sinogram, generated by summing projection data from multiple time frames. This system matrix is then used to reconstruct the individual time frame projections. Performance of the new method is evaluated by reconstructing simulated projections of the NCAT phantom and the method is then applied to dynamic SPECT phantom and patient studies and to a dynamic microPET rat study. Tetrahedral mesh-based images are compared to the standard voxel-based reconstruction for both high and low signal-to-noise ratio projection datasets. The results demonstrate that the reconstructed images represented as tetrahedral meshes based on point clouds offer image quality comparable to that achievable using a standard voxel grid while allowing substantial reduction in the number of unknown intensities to be reconstructed and reducing the noise. PMID:23588373
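
    The node intensities are obtained with ML-EM once the system matrix is available. The sketch below shows the generic multiplicative ML-EM update for any linear basis (voxels or mesh nodes), assuming for simplicity that the system matrix fits in memory as a dense NumPy array; the paper's analytically computed, mesh-specific matrix is more involved.

```python
import numpy as np

def ml_em(A, y, n_iter=50, eps=1e-12):
    """Generic ML-EM for emission tomography: A is the system matrix
    (projection bins x basis functions), y the measured counts."""
    x = np.ones(A.shape[1])                 # start from a uniform image
    sens = A.sum(axis=0) + eps              # sensitivity (back-projection of ones)
    for _ in range(n_iter):
        proj = A @ x + eps                  # forward projection of current estimate
        x *= (A.T @ (y / proj)) / sens      # multiplicative EM update
    return x

# Toy usage with a random system matrix and Poisson data.
rng = np.random.default_rng(0)
A = rng.random((200, 50))
y = rng.poisson(A @ np.ones(50)).astype(float)
img = ml_em(A, y)
```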

  14. Practical implementation of tetrahedral mesh reconstruction in emission tomography

    NASA Astrophysics Data System (ADS)

    Boutchko, R.; Sitek, A.; Gullberg, G. T.

    2013-05-01

    This paper presents a practical implementation of image reconstruction on tetrahedral meshes optimized for emission computed tomography with parallel beam geometry. Tetrahedral mesh built on a point cloud is a convenient image representation method, intrinsically three-dimensional and with a multi-level resolution property. Image intensities are defined at the mesh nodes and linearly interpolated inside each tetrahedron. For the given mesh geometry, the intensities can be computed directly from tomographic projections using iterative reconstruction algorithms with a system matrix calculated using an exact analytical formula. The mesh geometry is optimized for a specific patient using a two stage process. First, a noisy image is reconstructed on a finely-spaced uniform cloud. Then, the geometry of the representation is adaptively transformed through boundary-preserving node motion and elimination. Nodes are removed in constant intensity regions, merged along the boundaries, and moved in the direction of the mean local intensity gradient in order to provide higher node density in the boundary regions. Attenuation correction and detector geometric response are included in the system matrix. Once the mesh geometry is optimized, it is used to generate the final system matrix for ML-EM reconstruction of node intensities and for visualization of the reconstructed images. In dynamic PET or SPECT imaging, the system matrix generation procedure is performed using a quasi-static sinogram, generated by summing projection data from multiple time frames. This system matrix is then used to reconstruct the individual time frame projections. Performance of the new method is evaluated by reconstructing simulated projections of the NCAT phantom and the method is then applied to dynamic SPECT phantom and patient studies and to a dynamic microPET rat study. Tetrahedral mesh-based images are compared to the standard voxel-based reconstruction for both high and low signal-to-noise ratio projection datasets. The results demonstrate that the reconstructed images represented as tetrahedral meshes based on point clouds offer image quality comparable to that achievable using a standard voxel grid while allowing substantial reduction in the number of unknown intensities to be reconstructed and reducing the noise.

  15. Principles of metadata organization at the ENCODE data coordination center.

    PubMed

    Hong, Eurie L; Sloan, Cricket A; Chan, Esther T; Davidson, Jean M; Malladi, Venkat S; Strattan, J Seth; Hitz, Benjamin C; Gabdank, Idan; Narayanan, Aditi K; Ho, Marcus; Lee, Brian T; Rowe, Laurence D; Dreszer, Timothy R; Roe, Greg R; Podduturi, Nikhil R; Tanaka, Forrest; Hilton, Jason A; Cherry, J Michael

    2016-01-01

    The Encyclopedia of DNA Elements (ENCODE) Data Coordinating Center (DCC) is responsible for organizing, describing and providing access to the diverse data generated by the ENCODE project. The description of these data, known as metadata, includes the biological sample used as input, the protocols and assays performed on these samples, the data files generated from the results and the computational methods used to analyze the data. Here, we outline the principles and philosophy used to define the ENCODE metadata in order to create a metadata standard that can be applied to diverse assays and multiple genomic projects. In addition, we present how the data are validated and used by the ENCODE DCC in creating the ENCODE Portal (https://www.encodeproject.org/). Database URL: www.encodeproject.org. © The Author(s) 2016. Published by Oxford University Press.

  16. Electromagnetic Simulations for Aerospace Application Final Report CRADA No. TC-0376-92

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Madsen, N.; Meredith, S.

    Electromagnetic (EM) simulation tools play an important role in the design cycle, allowing optimization of a design before it is fabricated for testing. The purpose of this cooperative project was to provide Lockheed with state-of-the-art electromagnetic (EM) simulation software that will enable the optimal design of the next generation of low-observable (LO) military aircraft through the VHF regime. More particularly, the project was principally code development and validation, its goal to produce a 3-D, conforming-grid, time-domain (TD) EM simulation tool, consisting of a mesh generator, a DS13D-based simulation kernel, and an RCS postprocessor, which was useful in the optimization of LO aircraft, both for full-aircraft simulations run on a massively parallel computer and for small-scale problems run on a UNIX workstation.

  17. Incomplete projection reconstruction of computed tomography based on the modified discrete algebraic reconstruction technique

    NASA Astrophysics Data System (ADS)

    Yang, Fuqiang; Zhang, Dinghua; Huang, Kuidong; Gao, Zongzhao; Yang, YaFei

    2018-02-01

    Based on the discrete algebraic reconstruction technique (DART), this study develops and tests an improved algorithm for incomplete projection data that generates high-quality reconstructed images by reducing artifacts and noise in computed tomography. For incomplete projections, an augmented Lagrangian method based on compressed sensing is first used in the initial reconstruction for the segmentation step of DART, yielding higher contrast between boundary and non-boundary pixels. A block-matching 3D filtering operator is then used to suppress noise and improve the gray-level distribution of the reconstructed image. Finally, simulation studies with a polychromatic spectrum were performed to test the performance of the new algorithm. The results show a significant improvement in the signal-to-noise ratios (SNRs) and average gradients (AGs) of images reconstructed from incomplete data: the SNRs and AGs of images reconstructed by DART-ALBM were on average 30%-40% and 10% higher, respectively, than those of images reconstructed by the DART algorithm. Because the improved DART-ALBM algorithm is more robust for limited-view reconstruction, sharpening image edges and improving the gray-level distribution of non-boundary pixels, it has the potential to improve image quality from incomplete or sparse projections.
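
    For reference, the two figures of merit quoted above can be computed as follows on a 2-D reconstructed slice. These are common textbook definitions; the paper's exact formulas may differ.

```python
import numpy as np

def snr_db(recon, reference):
    """Signal-to-noise ratio in dB relative to a reference image (one common definition)."""
    err = recon - reference
    return 10.0 * np.log10(np.sum(reference ** 2) / np.sum(err ** 2))

def average_gradient(img):
    """Average gradient: mean magnitude of the finite-difference gradient, a sharpness measure."""
    gy, gx = np.gradient(img.astype(float))
    return np.mean(np.sqrt((gx ** 2 + gy ** 2) / 2.0))
```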

  18. Acceleration of Monte Carlo SPECT simulation using convolution-based forced detection

    NASA Astrophysics Data System (ADS)

    de Jong, H. W. A. M.; Slijpen, E. T. P.; Beekman, F. J.

    2001-02-01

    Monte Carlo (MC) simulation is an established tool to calculate photon transport through tissue in Emission Computed Tomography (ECT). Since the first appearance of MC a large variety of variance reduction techniques (VRT) have been introduced to speed up these notoriously slow simulations. One example of a very effective and established VRT is known as forced detection (FD). In standard FD the path from the photon's scatter position to the camera is chosen stochastically from the appropriate probability density function (PDF), modeling the distance-dependent detector response. In order to speed up MC the authors propose a convolution-based FD (CFD) which involves replacing the sampling of the PDF by a convolution with a kernel which depends on the position of the scatter event. The authors validated CFD for parallel-hole Single Photon Emission Computed Tomography (SPECT) using a digital thorax phantom. Comparison of projections estimated with CFD and standard FD shows that both estimates converge to practically identical projections (maximum bias 0.9% of peak projection value), despite the slightly different photon paths used in CFD and standard FD. Projections generated with CFD converge, however, to a noise-free projection up to one or two orders of magnitude faster, which is extremely useful in many applications such as model-based image reconstruction.
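
    The contrast between standard FD and CFD can be sketched as follows, assuming for illustration a Gaussian distance-dependent detector response and a per-event convolution (real implementations batch events by distance and use the measured response kernel). Standard FD draws one stochastic detector position per scatter event; CFD deposits the weight deterministically and blurs it with the response kernel, so every event contributes a noise-free response.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def add_scatter_event_fd(proj, pos_xy, weight, sigma, rng):
    """Standard forced detection (sketch): sample one detector position from the
    distance-dependent response PDF and deposit the photon weight in that bin."""
    ix = int(round(rng.normal(pos_xy[0], sigma)))
    iy = int(round(rng.normal(pos_xy[1], sigma)))
    if 0 <= ix < proj.shape[0] and 0 <= iy < proj.shape[1]:
        proj[ix, iy] += weight

def add_scatter_event_cfd(proj, pos_xy, weight, sigma):
    """Convolution-based forced detection (sketch): deposit the weight at the geometric
    position and convolve it with the distance-dependent kernel."""
    impulse = np.zeros_like(proj)
    ix, iy = int(round(pos_xy[0])), int(round(pos_xy[1]))
    if 0 <= ix < proj.shape[0] and 0 <= iy < proj.shape[1]:
        impulse[ix, iy] = weight
    proj += gaussian_filter(impulse, sigma)
```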

  19. The computer integrated documentation project: A merge of hypermedia and AI techniques

    NASA Technical Reports Server (NTRS)

    Mathe, Nathalie; Boy, Guy

    1993-01-01

    To generate intelligent indexing that allows context-sensitive information retrieval, a system must be able to acquire knowledge directly through interaction with users. In this paper, we present the architecture for CID (Computer Integrated Documentation). CID is a system that enables integration of various technical documents in a hypertext framework and includes an intelligent browsing system that incorporates indexing in context. CID's knowledge-based indexing mechanism allows case based knowledge acquisition by experimentation. It utilizes on-line user information requirements and suggestions either to reinforce current indexing in case of success or to generate new knowledge in case of failure. This allows CID's intelligent interface system to provide helpful responses, based on previous experience (user feedback). We describe CID's current capabilities and provide an overview of our plans for extending the system.

  20. Real-time volumetric image reconstruction and 3D tumor localization based on a single x-ray projection image for lung cancer radiotherapy.

    PubMed

    Li, Ruijiang; Jia, Xun; Lewis, John H; Gu, Xuejun; Folkerts, Michael; Men, Chunhua; Jiang, Steve B

    2010-06-01

    To develop an algorithm for real-time volumetric image reconstruction and 3D tumor localization based on a single x-ray projection image for lung cancer radiotherapy. Given a set of volumetric images of a patient at N breathing phases as the training data, deformable image registration was performed between a reference phase and the other N-1 phases, resulting in N-1 deformation vector fields (DVFs). These DVFs can be represented efficiently by a few eigenvectors and coefficients obtained from principal component analysis (PCA). By varying the PCA coefficients, new DVFs can be generated, which, when applied to the reference image, lead to new volumetric images. A volumetric image can then be reconstructed from a single projection image by optimizing the PCA coefficients such that its computed projection matches the measured one. The 3D location of the tumor can be derived by applying the inverted DVF to its position in the reference image. The algorithm was implemented on graphics processing units (GPUs) to achieve real-time efficiency. The training data were generated using a realistic and dynamic mathematical phantom with ten breathing phases. The testing data were 360 cone beam projections corresponding to one gantry rotation, simulated using the same phantom with a 50% increase in breathing amplitude. The average relative image intensity error of the reconstructed volumetric images is 6.9% +/- 2.4%. The average 3D tumor localization error is 0.8 +/- 0.5 mm. On an NVIDIA Tesla C1060 GPU card, the average computation time for reconstructing a volumetric image from each projection is 0.24 s (range: 0.17-0.35 s). The authors have shown the feasibility of reconstructing volumetric images and localizing tumor positions in 3D in near real-time from a single x-ray image.
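
    The core optimisation loop can be sketched as below. Here `warp` and `project` are assumed stand-ins for the DVF warping and forward-projection operators (the paper implements these on GPUs), and the Powell optimizer is an illustrative choice, not necessarily the authors'.

```python
import numpy as np
from scipy.optimize import minimize

def reconstruct_from_projection(ref_vol, dvf_mean, dvf_eigvecs, measured_proj,
                                warp, project, w0=None):
    """Find PCA coefficients w such that the projection of the warped reference
    volume matches the measured x-ray projection (sketch only)."""
    if w0 is None:
        w0 = np.zeros(dvf_eigvecs.shape[0])

    def cost(w):
        dvf = dvf_mean + np.tensordot(w, dvf_eigvecs, axes=1)  # new deformation field
        vol = warp(ref_vol, dvf)                               # deform the reference CT
        return np.sum((project(vol) - measured_proj) ** 2)     # projection mismatch

    res = minimize(cost, w0, method="Powell")                  # derivative-free optimizer
    return res.x
```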

  1. The 4D nucleome project.

    PubMed

    Dekker, Job; Belmont, Andrew S; Guttman, Mitchell; Leshyk, Victor O; Lis, John T; Lomvardas, Stavros; Mirny, Leonid A; O'Shea, Clodagh C; Park, Peter J; Ren, Bing; Politz, Joan C Ritland; Shendure, Jay; Zhong, Sheng

    2017-09-13

    The 4D Nucleome Network aims to develop and apply approaches to map the structure and dynamics of the human and mouse genomes in space and time with the goal of gaining deeper mechanistic insights into how the nucleus is organized and functions. The project will develop and benchmark experimental and computational approaches for measuring genome conformation and nuclear organization, and investigate how these contribute to gene regulation and other genome functions. Validated experimental technologies will be combined with biophysical approaches to generate quantitative models of spatial genome organization in different biological states, both in cell populations and in single cells.

  2. The 4D Nucleome Project

    PubMed Central

    Dekker, Job; Belmont, Andrew S.; Guttman, Mitchell; Leshyk, Victor O.; Lis, John T.; Lomvardas, Stavros; Mirny, Leonid A.; O’Shea, Clodagh C.; Park, Peter J.; Ren, Bing; Ritland Politz, Joan C.; Shendure, Jay; Zhong, Sheng

    2017-01-01

    The 4D Nucleome Network aims to develop and apply approaches to map the structure and dynamics of the human and mouse genomes in space and time with the goal of gaining deeper mechanistic understanding of how the nucleus is organized and functions. The project will develop and benchmark experimental and computational approaches for measuring genome conformation and nuclear organization, and investigate how these contribute to gene regulation and other genome functions. Validated experimental approaches will be combined with biophysical modeling to generate quantitative models of spatial genome organization in different biological states, both in cell populations and in single cells. PMID:28905911

  3. LST CGM Generator and Viewer Final Report CRADA No. TSB-1558-98

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vickers, Don; Larson, Don

    The purpose of this project was to jointly develop and test a software plug-in that would convert native Pro/ENGINEER digital engineering drawings to Computer Graphics Metafile (CGM) format. If it was not feasible to convert the Pro/ENGINEER files, we planned to develop and test a similar conversion of native AutoCAD engineering drawings to CGM. CGM viewer plug-ins were developed as needed. There were four main tasks in this project: 1. Requirements for CGM Plug-in 2. Product Evaluation 3. Product Development Feasibility Study 4. Developing a "Plug-In" Application.

  4. The aeroacoustics of supersonic jets

    NASA Technical Reports Server (NTRS)

    Morris, Philip J.; McLaughlin, Dennis K.

    1995-01-01

    This research project was a joint experimental/computational study of noise in supersonic jets. The experiments were performed in a low to moderate Reynolds number anechoic supersonic jet facility. Computations have focused on the modeling of the effect of an external shroud on the generation and radiation of jet noise. This report summarizes the results of the research program in the form of the Master's and Doctoral theses of those students who obtained their degrees with the assistance of this research grant. In addition, the presentations and publications made by the principal investigators and the research students are appended.

  5. Removing a barrier to computer-based outbreak and disease surveillance--the RODS Open Source Project.

    PubMed

    Espino, Jeremy U; Wagner, M; Szczepaniak, C; Tsui, F C; Su, H; Olszewski, R; Liu, Z; Chapman, W; Zeng, X; Ma, L; Lu, Z; Dara, J

    2004-09-24

    Computer-based outbreak and disease surveillance requires high-quality software that is well-supported and affordable. Developing software in an open-source framework, which entails free distribution and use of software and continuous, community-based software development, can produce software with such characteristics, and can do so rapidly. The objective of the Real-Time Outbreak and Disease Surveillance (RODS) Open Source Project is to accelerate the deployment of computer-based outbreak and disease surveillance systems by writing software and catalyzing the formation of a community of users, developers, consultants, and scientists who support its use. The University of Pittsburgh seeded the Open Source Project by releasing the RODS software under the GNU General Public License. An infrastructure was created, consisting of a website, mailing lists for developers and users, designated software developers, and shared code-development tools. These resources are intended to encourage growth of the Open Source Project community. Progress is measured by assessing website usage, number of software downloads, number of inquiries, number of system deployments, and number of new features or modules added to the code base. During September--November 2003, users generated 5,370 page views of the project website, 59 software downloads, 20 inquiries, one new deployment, and addition of four features. Thus far, health departments and companies have been more interested in using the software as is than in customizing or developing new features. The RODS laboratory anticipates that after initial installation has been completed, health departments and companies will begin to customize the software and contribute their enhancements to the public code base.

  6. The virtual digital nuclear power plant: A modern tool for supporting the lifecycle of VVER-based nuclear power units

    NASA Astrophysics Data System (ADS)

    Arkadov, G. V.; Zhukavin, A. P.; Kroshilin, A. E.; Parshikov, I. A.; Solov'ev, S. L.; Shishov, A. V.

    2014-10-01

    The article describes the "Virtual Digital VVER-Based Nuclear Power Plant" computerized system comprising a totality of verified initial data (sets of input data for a model intended for describing the behavior of nuclear power plant (NPP) systems in design and emergency modes of their operation) and a unified system of new-generation computation codes intended for carrying out coordinated computation of the variety of physical processes in the reactor core and NPP equipment. Experiments with the demonstration version of the "Virtual Digital VVER-Based NPP" computerized system has shown that it is in principle possible to set up a unified system of computation codes in a common software environment for carrying out interconnected calculations of various physical phenomena at NPPs constructed according to the standard AES-2006 project. With the full-scale version of the "Virtual Digital VVER-Based NPP" computerized system put in operation, the concerned engineering, design, construction, and operating organizations will have access to all necessary information relating to the NPP power unit project throughout its entire lifecycle. The domestically developed commercial-grade software product set to operate as an independently operating application to the project will bring about additional competitive advantages in the modern market of nuclear power technologies.

  7. Yahoo! Compute Coop (YCC). A Next-Generation Passive Cooling Design for Data Centers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Robison, AD; Page, Christina; Lytle, Bob

    The purpose of the Yahoo! Compute Coop (YCC) project is to research, design, build and implement a greenfield "efficient data factory" and to specifically demonstrate that the YCC concept is feasible for large facilities housing tens of thousands of heat-producing computing servers. The project scope for the Yahoo! Compute Coop technology includes: - Analyzing and implementing ways in which to drastically decrease energy consumption and waste output. - Analyzing the laws of thermodynamics and implementing naturally occurring environmental effects in order to maximize the "free-cooling" for large data center facilities. "Free cooling" is the direct usage of outside air to cool the servers vs. traditional "mechanical cooling" which is supplied by chillers or other Dx units. - Redesigning and simplifying building materials and methods. - Shortening and simplifying build-to-operate schedules while at the same time reducing initial build and operating costs. Selected for its favorable climate, the greenfield project site is located in Lockport, NY. Construction on the 9.0 MW critical load data center facility began in May 2009, with the fully operational facility deployed in September 2010. The relatively low initial build cost, compatibility with current server and network models, and the efficient use of power and water are all key features that make it a highly compatible and globally implementable design innovation for the data center industry. Yahoo! Compute Coop technology is designed to achieve 99.98% uptime availability. This integrated building design allows for free cooling 99% of the year via the building's unique shape and orientation, as well as server physical configuration.

  8. NASA HPCC Technology for Aerospace Analysis and Design

    NASA Technical Reports Server (NTRS)

    Schulbach, Catherine H.

    1999-01-01

    The Computational Aerosciences (CAS) Project is part of NASA's High Performance Computing and Communications Program. Its primary goal is to accelerate the availability of high-performance computing technology to the US aerospace community-thus providing the US aerospace community with key tools necessary to reduce design cycle times and increase fidelity in order to improve safety, efficiency and capability of future aerospace vehicles. A complementary goal is to hasten the emergence of a viable commercial market within the aerospace community for the advantage of the domestic computer hardware and software industry. The CAS Project selects representative aerospace problems (especially design) and uses them to focus efforts on advancing aerospace algorithms and applications, systems software, and computing machinery to demonstrate vast improvements in system performance and capability over the life of the program. Recent demonstrations have served to assess the benefits of possible performance improvements while reducing the risk of adopting high-performance computing technology. This talk will discuss past accomplishments in providing technology to the aerospace community, present efforts, and future goals. For example, the times to do full combustor and compressor simulations (of aircraft engines) have been reduced by factors of 320:1 and 400:1 respectively. While this has enabled new capabilities in engine simulation, the goal of an overnight, dynamic, multi-disciplinary, 3-dimensional simulation of an aircraft engine is still years away and will require new generations of high-end technology.

  9. Transparent volume imaging

    NASA Astrophysics Data System (ADS)

    Wixson, Steve E.

    1990-07-01

    Transparent Volume Imaging began with the stereo x-ray in 1895 and ended for most investigators when radiation safety concerns eliminated the second view. Today, similar images can be generated by computer without safety hazards, providing improved perception and new means of image quantification. A volumetric workstation is under development based on an operational prototype. The workstation consists of multiple symbolic and numeric processors, a binocular stereo color display generator with large image memory and a liquid crystal shutter, voice input and output, a 3D pointer that uses projection lenses so that structures in 3-space can be touched directly, 3D hard copy using vectograph and lenticular printing, and presentation facilities using stereo 35-mm slides and stereo videotape projection. Volumetric software includes a volume window manager, Mayo Clinic's Analyze program, and our Digital Stereo Microscope (DSM) algorithms. The DSM uses stereo x-ray-like projections, rapidly oscillating motion, and focal depth cues such that detail can be studied in the spatial context of the entire set of data. Focal depth cues are generated with a lens-and-aperture algorithm that produces a plane of sharp focus; multiple stereo pairs, each with a different plane of sharp focus, are generated and stored in the large memory for interactive selection using a physical or symbolic depth selector. More recent work is studying non-linear focusing. Psychophysical studies are underway to understand how people perceive images on a volumetric display and how accurately 3-dimensional structures can be quantitated from these displays.

  10. Numerical computation of complex multi-body Navier-Stokes flows with applications for the integrated Space Shuttle launch vehicle

    NASA Technical Reports Server (NTRS)

    Chan, William M.

    1993-01-01

    An enhanced grid system for the Space Shuttle Orbiter was built by integrating CAD definitions from several sources and then generating the surface and volume grids. The new grid system contains geometric components not modeled previously, plus significant enhancements to geometry that had been modeled in the old grid system. The new orbiter grids were then integrated with new grids for the rest of the launch vehicle. Enhancements were made to the hyperbolic grid generator HYPGEN, and new tools were developed for grid projection, manipulation, and modification; Cartesian box grid and far-field grid generation; and post-processing of flow solver data.

  11. Security Approaches in Using Tablet Computers for Primary Data Collection in Clinical Research

    PubMed Central

    Wilcox, Adam B.; Gallagher, Kathleen; Bakken, Suzanne

    2013-01-01

    Next-generation tablets (iPads and Android tablets) may potentially improve the collection and management of clinical research data. The widespread adoption of tablets, coupled with decreased software and hardware costs, has led to increased consideration of tablets for primary research data collection. When using tablets for the Washington Heights/Inwood Infrastructure for Comparative Effectiveness Research (WICER) project, we found that the devices give rise to inherent security issues associated with the potential use of cloud-based data storage approaches. This paper identifies and describes major security considerations for primary data collection with tablets; proposes a set of architectural strategies for implementing data collection forms with tablet computers; and discusses the security, cost, and workflow of each strategy. The paper briefly reviews the strategies with respect to their implementation for three primary data collection activities for the WICER project. PMID:25848559

  12. Security approaches in using tablet computers for primary data collection in clinical research.

    PubMed

    Wilcox, Adam B; Gallagher, Kathleen; Bakken, Suzanne

    2013-01-01

    Next-generation tablets (iPads and Android tablets) may potentially improve the collection and management of clinical research data. The widespread adoption of tablets, coupled with decreased software and hardware costs, has led to increased consideration of tablets for primary research data collection. When using tablets for the Washington Heights/Inwood Infrastructure for Comparative Effectiveness Research (WICER) project, we found that the devices give rise to inherent security issues associated with the potential use of cloud-based data storage approaches. This paper identifies and describes major security considerations for primary data collection with tablets; proposes a set of architectural strategies for implementing data collection forms with tablet computers; and discusses the security, cost, and workflow of each strategy. The paper briefly reviews the strategies with respect to their implementation for three primary data collection activities for the WICER project.

  13. ENFIN--A European network for integrative systems biology.

    PubMed

    Kahlem, Pascal; Clegg, Andrew; Reisinger, Florian; Xenarios, Ioannis; Hermjakob, Henning; Orengo, Christine; Birney, Ewan

    2009-11-01

    Integration of biological data of various types and the development of adapted bioinformatics tools represent critical objectives to enable research at the systems level. The European Network of Excellence ENFIN is engaged in developing an adapted infrastructure to connect databases, and platforms to enable both the generation of new bioinformatics tools and the experimental validation of computational predictions. With the aim of bridging the gap existing between standard wet laboratories and bioinformatics, the ENFIN Network runs integrative research projects to bring the latest computational techniques to bear directly on questions dedicated to systems biology in the wet laboratory environment. The Network maintains internally close collaboration between experimental and computational research, enabling a permanent cycling of experimental validation and improvement of computational prediction methods. The computational work includes the development of a database infrastructure (EnCORE), bioinformatics analysis methods and a novel platform for protein function analysis FuncNet.

  14. COED Transactions, Vol. X, No. 4, April 1978. An Experience in Teaching "COBOL?" to Graduate Engineers.

    ERIC Educational Resources Information Center

    Bremmer, Dale; Childs, Bart

    This document discusses the importance of computing knowledge and experience in the techniques of fast data retrieval for today's engineer. It describes a course designed to teach the engineer the COBOL Language structure. One of the projects of the course, a report generator (REGE) written in COBOL which is used to alter, sort and print selected…

  15. Automated measurement of human body shape and curvature using computer vision

    NASA Astrophysics Data System (ADS)

    Pearson, Jeremy D.; Hobson, Clifford A.; Dangerfield, Peter H.

    1993-06-01

    A system to measure the surface shape of the human body has been constructed. The system uses a fringe pattern generated by projection of multi-stripe structured light. The optical methodology used is fully described and the algorithms used to process acquired digital images are outlined. The system has been applied to the measurement of the shape of the human back in scoliosis.

  16. Survey of MapReduce frame operation in bioinformatics.

    PubMed

    Zou, Quan; Li, Xu-Bin; Jiang, Wen-Rui; Lin, Zi-Yu; Li, Gui-Lin; Chen, Ke

    2014-07-01

    Bioinformatics is challenged by the fact that traditional analysis tools have difficulty in processing large-scale data from high-throughput sequencing. The open source Apache Hadoop project, which adopts the MapReduce framework and a distributed file system, has recently given bioinformatics researchers an opportunity to achieve scalable, efficient and reliable computing performance on Linux clusters and on cloud computing services. In this article, we present MapReduce frame-based applications that can be employed in next-generation sequencing and other biological domains. In addition, we discuss the challenges faced by this field as well as future work on parallel computing in bioinformatics. © The Author 2013. Published by Oxford University Press. For Permissions, please email: journals.permissions@oup.com.
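
    The MapReduce pattern itself is easy to illustrate outside Hadoop. The toy example below counts k-mers in sequencing reads with a map step that emits partial counts and a reduce step that merges them; it is a single-machine sketch of the programming model, not a Hadoop job, and the reads are made up.

```python
from collections import Counter
from functools import reduce

def map_reads(read, k=8):
    """Map step: emit (k-mer, count) pairs for one sequencing read."""
    return Counter(read[i:i + k] for i in range(len(read) - k + 1))

def reduce_counts(c1, c2):
    """Reduce step: merge partial k-mer counts."""
    c1.update(c2)
    return c1

reads = ["ACGTACGTGGT", "TTGACGTACGA"]
total = reduce(reduce_counts, (map_reads(r) for r in reads), Counter())
print(total.most_common(3))
```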

  17. Quantitative, steady-state properties of Catania's computational model of the operant reserve.

    PubMed

    Berg, John P; McDowell, J J

    2011-05-01

    Catania (2005) found that a computational model of the operant reserve (Skinner, 1938) produced realistic behavior in initial, exploratory analyses. Although Catania's operant reserve computational model demonstrated potential to simulate varied behavioral phenomena, the model was not systematically tested. The current project replicated and extended the Catania model, clarified its capabilities through systematic testing, and determined the extent to which it produces behavior corresponding to matching theory. Significant departures from both classic and modern matching theory were found in behavior generated by the model across all conditions. The results suggest that a simple, dynamic operant model of the reflex reserve does not simulate realistic steady state behavior. Copyright © 2011 Elsevier B.V. All rights reserved.

  18. Three-dimensional monochromatic x-ray computed tomography using synchrotron radiation

    NASA Astrophysics Data System (ADS)

    Saito, Tsuneo; Kudo, Hiroyuki; Takeda, Tohoru; Itai, Yuji; Tokumori, Kenji; Toyofuku, Fukai; Hyodo, Kazuyuki; Ando, Masami; Nishimura, Katsuyuki; Uyama, Chikao

    1998-08-01

    We describe a technique of 3D computed tomography (3D CT) using monochromatic x rays generated by synchrotron radiation, which performs a direct reconstruction of a 3D volume image of an object from its cone-beam projections. For the development, we propose a practical scanning orbit of the x-ray source to obtain complete 3D information on an object, and its corresponding 3D image reconstruction algorithm. The validity and usefulness of the proposed scanning orbit and reconstruction algorithm were confirmed by computer simulation studies. Based on these investigations, we have developed a prototype 3D monochromatic x-ray CT using synchrotron radiation, which provides exact 3D reconstruction and material-selective imaging by using the K-edge energy subtraction technique.

  19. Range and egomotion estimation from compound photodetector arrays with parallel optical axis using optical flow techniques.

    PubMed

    Chahl, J S

    2014-01-20

    This paper describes an application for arrays of narrow-field-of-view sensors with parallel optical axes. These devices exhibit some complementary characteristics with respect to conventional perspective projection or angular projection imaging devices. Conventional imaging devices measure rotational egomotion directly by measuring the angular velocity of the projected image. Translational egomotion cannot be measured directly by these devices because the induced image motion depends on the unknown range of the viewed object. On the other hand, a known translational motion generates image velocities which can be used to recover the ranges of objects and hence the three-dimensional (3D) structure of the environment. A new method is presented for computing egomotion and range using the properties of linear arrays of independent narrow-field-of-view optical sensors. An approximate parallel projection can be used to measure translational egomotion in terms of the velocity of the image. On the other hand, a known rotational motion of the paraxial sensor array generates image velocities, which can be used to recover the 3D structure of the environment. Results of tests of an experimental array confirm these properties.
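
    To first order, the two properties read directly off the parallel-projection model. The tiny sketch below assumes unit magnification, lateral motion, and small rotations about an axis perpendicular to the optical axes, and ignores the sign conventions and calibration specific to the paper.

```python
def translation_from_flow(image_velocity):
    """Under approximate parallel projection with unit magnification, lateral image
    velocity equals the camera's lateral translation speed, independent of depth."""
    return image_velocity

def depth_from_rotation(image_velocity, omega):
    """A known rotation rate omega (rad/s) about an axis perpendicular to the optical
    axes induces image velocity of roughly omega * depth, so depth ~ velocity / omega."""
    return image_velocity / omega

# Example: a feature drifts at 0.02 m/s across the array while it rotates at 0.01 rad/s.
print(depth_from_rotation(0.02, 0.01))   # about 2 m, to first order
```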

  20. Lewis Research Center studies of multiple large wind turbine generators on a utility network

    NASA Technical Reports Server (NTRS)

    Gilbert, L. J.; Triezenberg, D. M.

    1979-01-01

    A NASA-Lewis program to study the anticipated performance of a wind turbine generator farm on an electric utility network is surveyed. The paper describes the approach of the Lewis Wind Energy Project Office to developing analysis capabilities in the area of wind turbine generator-utility network computer simulations. Attention is given to areas such as the Lewis-Purdue hybrid simulation, an independent stability study, the DOE multiunit plant study, and the WEST simulator. Also covered are the Lewis Mod-2 simulation, including analog simulation of a two-wind-turbine system and comparison with Boeing simulation results, and the gust response of a two-machine model. Finally, future work is noted, and it is concluded that the study shows little interaction between the generators and between the generators and the bus.

  1. OBSERVING LYAPUNOV EXPONENTS OF INFINITE-DIMENSIONAL DYNAMICAL SYSTEMS

    PubMed Central

    OTT, WILLIAM; RIVAS, MAURICIO A.; WEST, JAMES

    2016-01-01

    Can Lyapunov exponents of infinite-dimensional dynamical systems be observed by projecting the dynamics into ℝN using a ‘typical’ nonlinear projection map? We answer this question affirmatively by developing embedding theorems for compact invariant sets associated with C1 maps on Hilbert spaces. Examples of such discrete-time dynamical systems include time-T maps and Poincaré return maps generated by the solution semigroups of evolution partial differential equations. We make every effort to place hypotheses on the projected dynamics rather than on the underlying infinite-dimensional dynamical system. In so doing, we adopt an empirical approach and formulate checkable conditions under which a Lyapunov exponent computed from experimental data will be a Lyapunov exponent of the infinite-dimensional dynamical system under study (provided the nonlinear projection map producing the data is typical in the sense of prevalence). PMID:28066028

  2. OBSERVING LYAPUNOV EXPONENTS OF INFINITE-DIMENSIONAL DYNAMICAL SYSTEMS.

    PubMed

    Ott, William; Rivas, Mauricio A; West, James

    2015-12-01

    Can Lyapunov exponents of infinite-dimensional dynamical systems be observed by projecting the dynamics into ℝN using a 'typical' nonlinear projection map? We answer this question affirmatively by developing embedding theorems for compact invariant sets associated with C1 maps on Hilbert spaces. Examples of such discrete-time dynamical systems include time-T maps and Poincaré return maps generated by the solution semigroups of evolution partial differential equations. We make every effort to place hypotheses on the projected dynamics rather than on the underlying infinite-dimensional dynamical system. In so doing, we adopt an empirical approach and formulate checkable conditions under which a Lyapunov exponent computed from experimental data will be a Lyapunov exponent of the infinite-dimensional dynamical system under study (provided the nonlinear projection map producing the data is typical in the sense of prevalence).

  3. Application of electrochemical energy storage in solar thermal electric generation systems

    NASA Technical Reports Server (NTRS)

    Das, R.; Krauthamer, S.; Frank, H.

    1982-01-01

    This paper assesses the status, cost, and performance of existing electrochemical energy storage systems, and projects the cost, performance, and availability of advanced storage systems for application in terrestrial solar thermal electric generation. A 10 MWe solar plant with five hours of storage is considered and the cost of delivered energy is computed for sixteen different storage systems. The results indicate that the five most attractive electrochemical storage systems use the following battery types: zinc-bromine (Exxon), iron-chromium redox (NASA/Lewis Research Center, LeRC), sodium-sulfur (Ford), sodium-sulfur (Dow), and zinc-chlorine (Energy Development Associates, EDA).

  4. Large-scale-system effectiveness analysis. Final report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Patton, A.D.; Ayoub, A.K.; Foster, J.W.

    1979-11-01

    The objective of the research project has been the investigation and development of methods for calculating system reliability indices that have absolute, measurable significance to consumers. Such indices are a necessary prerequisite to any scheme for system optimization that includes the economic consequences of consumer service interruptions. A further area of investigation has been the joint consideration of generation and transmission in reliability studies. Methods for finding or estimating the probability distributions of some measures of reliability performance have been developed. The application of modern Monte Carlo simulation methods to compute reliability indices in generating systems has been studied.
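
    As an example of the kind of Monte Carlo reliability computation discussed, the sketch below estimates a loss-of-load probability for a small generating system by sampling independent unit outages. The unit data are made up, and the index is a standard generating-system measure rather than necessarily one of the specific indices developed in the report.

```python
import random

def lolp_monte_carlo(units, load, n_samples=100_000, seed=1):
    """Estimate loss-of-load probability: `units` is a list of (capacity_MW,
    forced_outage_rate) pairs; count samples where available capacity falls below load."""
    rng = random.Random(seed)
    shortfalls = 0
    for _ in range(n_samples):
        available = sum(cap for cap, q in units if rng.random() > q)  # unit up with prob 1-q
        if available < load:
            shortfalls += 1
    return shortfalls / n_samples

units = [(200, 0.05)] * 5 + [(100, 0.08)] * 4   # illustrative system, 1400 MW installed
print(lolp_monte_carlo(units, load=1100))
```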

  5. Holographic optical disc

    NASA Astrophysics Data System (ADS)

    Zhou, Gan; An, Xin; Pu, Allen; Psaltis, Demetri; Mok, Fai H.

    1999-11-01

    The holographic disc is a high capacity, disk-based data storage device that can provide the performance for next generation mass data storage needs. With a projected capacity approaching 1 terabit on a single 12 cm platter, the holographic disc has the potential to become a highly efficient storage hardware for data warehousing applications. The high readout rate of holographic disc makes it especially suitable for generating multiple, high bandwidth data streams such as required for network server computers. Multimedia applications such as interactive video and HDTV can also potentially benefit from the high capacity and fast data access of holographic memory.

  6. Successful generation of structural information for fragment-based drug discovery.

    PubMed

    Öster, Linda; Tapani, Sofia; Xue, Yafeng; Käck, Helena

    2015-09-01

    Fragment-based drug discovery relies upon structural information for efficient compound progression, yet it is often challenging to generate structures with bound fragments. A summary of recent literature reveals that a wide repertoire of experimental procedures is employed to generate ligand-bound crystal structures successfully. We share in-house experience from setting up and executing fragment crystallography in a project that resulted in 55 complex structures. The ligands span five orders of magnitude in affinity and the resulting structures are made available to be of use, for example, for development of computational methods. Analysis of the results revealed that ligand properties such as potency, ligand efficiency (LE) and, to some degree, clogP influence the success of complex structure generation. Copyright © 2015 Elsevier Ltd. All rights reserved.
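
    Ligand efficiency is commonly computed as binding free energy per heavy atom; the small helper below uses this standard definition (it is not specific to this paper, and the example numbers are illustrative).

```python
import math

def ligand_efficiency(kd_molar, n_heavy_atoms, temperature_k=298.15):
    """LE = -RT ln(Kd) / N_heavy, in kcal/mol per heavy atom (standard definition)."""
    R = 0.0019872  # gas constant in kcal/(mol*K)
    return -R * temperature_k * math.log(kd_molar) / n_heavy_atoms

print(round(ligand_efficiency(1e-6, 20), 2))  # a 1 uM binder with 20 heavy atoms: ~0.41
```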

  7. SU-D-17A-02: Four-Dimensional CBCT Using Conventional CBCT Dataset and Iterative Subtraction Algorithm of a Lung Patient

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hu, E; Lasio, G; Yi, B

    2014-06-01

    Purpose: The Iterative Subtraction Algorithm (ISA) method retrospectively generates a cone-beam CT image for a pre-selected motion phase from the full-motion cone-beam CT acquired at standard rotation speed. This work evaluates the ISA method with real lung patient data. Methods: The goal of the ISA algorithm is to extract motion and no-motion components from the full-reconstruction CBCT. The workflow consists of subtracting from the full CBCT all of the undesired motion phases to obtain a motion-deblurred single-phase CBCT image, followed by iteration of this subtraction process. ISA is realized as follows: 1) The projections are sorted into phases, and a full reconstruction from all phases generates an image CTM. 2) Forward projections of CTM are generated at the desired-phase projection angles; reconstructing the difference between the measured projections and these forward projections yields CTSub1, which diminishes the desired phase component. 3) By adding CTSub1 back to CTM, a no-motion CBCT, CTS1, can be computed. 4) CTS1 still contains a residual motion component. 5) This residual motion component can be further reduced by iteration. The ISA 4D-CBCT technique was implemented using the Varian Trilogy accelerator OBI system. To evaluate the method, a lung patient CBCT dataset was used. The reconstruction algorithm is FDK. Results: The single-phase CBCT reconstruction generated via ISA successfully isolates the desired motion phase from the full-motion CBCT, effectively reducing motion blur. It also shows improved image quality, with reduced streak artifacts compared with reconstructions from unprocessed phase-sorted projections only. Conclusion: A CBCT motion-deblurring algorithm, ISA, has been developed and evaluated with lung patient data. The algorithm allows improved visualization of a single motion phase extracted from a standard CBCT dataset. This study has been supported by the National Institutes of Health through R01CA133539.
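
    A loose sketch of the enumerated steps is given below. Here `recon`, `forward_project`, and `phase_of` are assumed stand-ins for FDK reconstruction, the cone-beam forward projector, and the phase-sorting lookup; the actual weighting and iteration details are in the paper, so this is only the general shape of the residual-subtraction loop.

```python
import numpy as np

def isa_phase_cbct(projections, angles, phase_of, target_phase,
                   recon, forward_project, n_iter=3):
    """Iterative Subtraction Algorithm sketch: refine a target-phase CBCT by
    repeatedly reconstructing residual target-phase projections and adding them back."""
    ct_m = recon(projections, angles)                        # step 1: full, motion-blurred CBCT
    target = [i for i, _ in enumerate(projections) if phase_of(i) == target_phase]
    ct_phase = ct_m
    for _ in range(n_iter):                                  # steps 2-5: iterate the subtraction
        resid = [projections[i] - forward_project(ct_phase, angles[i]) for i in target]
        ct_sub = recon(resid, [angles[i] for i in target])   # reconstruct the residual
        ct_phase = ct_phase + ct_sub                         # add back -> target-phase estimate
    return ct_phase
```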

  8. Improvements on non-equilibrium and transport Green function techniques: The next-generation TRANSIESTA

    NASA Astrophysics Data System (ADS)

    Papior, Nick; Lorente, Nicolás; Frederiksen, Thomas; García, Alberto; Brandbyge, Mads

    2017-03-01

    We present novel methods implemented within the non-equilibrium Green function code (NEGF) TRANSIESTA based on density functional theory (DFT). Our flexible, next-generation DFT-NEGF code handles devices with one or multiple electrodes (Ne ≥ 1) with individual chemical potentials and electronic temperatures. We describe its novel methods for electrostatic gating, contour optimizations, and assertion of charge conservation, as well as the newly implemented algorithms for optimized and scalable matrix inversion, performance-critical pivoting, and hybrid parallelization. Additionally, a generic NEGF "post-processing" code (TBTRANS/PHTRANS) for electron and phonon transport is presented with several novelties such as Hamiltonian interpolations, Ne ≥ 1 electrode capability, bond-currents, generalized interface for user-defined tight-binding transport, transmission projection using eigenstates of a projected Hamiltonian, and fast inversion algorithms for large-scale simulations easily exceeding 106 atoms on workstation computers. The new features of both codes are demonstrated and bench-marked for relevant test systems.

  9. Fundamentals, current state of the development of, and prospects for further improvement of the new-generation thermal-hydraulic computational HYDRA-IBRAE/LM code for simulation of fast reactor systems

    NASA Astrophysics Data System (ADS)

    Alipchenkov, V. M.; Anfimov, A. M.; Afremov, D. A.; Gorbunov, V. S.; Zeigarnik, Yu. A.; Kudryavtsev, A. V.; Osipov, S. L.; Mosunova, N. A.; Strizhov, V. F.; Usov, E. V.

    2016-02-01

    The conceptual fundamentals of the development of the new-generation system thermal-hydraulic computational HYDRA-IBRAE/LM code are presented. The code is intended to simulate the thermal-hydraulic processes that take place in the loops and the heat-exchange equipment of liquid-metal-cooled fast reactor systems under normal operation, during anticipated operational occurrences, and during accidents. The paper provides a brief overview of Russian and foreign system thermal-hydraulic codes for modeling liquid-metal coolants and justifies the need to develop the new-generation HYDRA-IBRAE/LM code. Considering the specific engineering features of the nuclear power plants (NPPs) equipped with the BN-1200 and the BREST-OD-300 reactors, the processes and phenomena are singled out that require detailed analysis and model development in order to be correctly described by the system thermal-hydraulic code in question. Information on the functionality of the computational code is provided, viz., the thermal-hydraulic two-phase model, the properties of the sodium and lead coolants, the closing equations for simulation of heat-mass exchange processes, the models describing the processes that take place during a steam-generator tube rupture, etc. The article gives a brief overview of the usability of the computational code, including a description of the support documentation and the supply package, as well as the possibilities of taking advantage of modern computer technologies, such as parallel computations. The paper shows the current state of verification and validation of the computational code; it also presents information on the principles of constructing and populating the verification matrices for the BREST-OD-300 and the BN-1200 reactor systems. The prospects are outlined for further development of the HYDRA-IBRAE/LM code, introduction of new models into it, and enhancement of its usability. It is shown that the program of development and practical application of the code will make it possible in the near future to analyze the safety of prospective NPP projects at a qualitatively higher level.

  10. The Generation Challenge Programme Platform: Semantic Standards and Workbench for Crop Science

    PubMed Central

    Bruskiewich, Richard; Senger, Martin; Davenport, Guy; Ruiz, Manuel; Rouard, Mathieu; Hazekamp, Tom; Takeya, Masaru; Doi, Koji; Satoh, Kouji; Costa, Marcos; Simon, Reinhard; Balaji, Jayashree; Akintunde, Akinnola; Mauleon, Ramil; Wanchana, Samart; Shah, Trushar; Anacleto, Mylah; Portugal, Arllet; Ulat, Victor Jun; Thongjuea, Supat; Braak, Kyle; Ritter, Sebastian; Dereeper, Alexis; Skofic, Milko; Rojas, Edwin; Martins, Natalia; Pappas, Georgios; Alamban, Ryan; Almodiel, Roque; Barboza, Lord Hendrix; Detras, Jeffrey; Manansala, Kevin; Mendoza, Michael Jonathan; Morales, Jeffrey; Peralta, Barry; Valerio, Rowena; Zhang, Yi; Gregorio, Sergio; Hermocilla, Joseph; Echavez, Michael; Yap, Jan Michael; Farmer, Andrew; Schiltz, Gary; Lee, Jennifer; Casstevens, Terry; Jaiswal, Pankaj; Meintjes, Ayton; Wilkinson, Mark; Good, Benjamin; Wagner, James; Morris, Jane; Marshall, David; Collins, Anthony; Kikuchi, Shoshi; Metz, Thomas; McLaren, Graham; van Hintum, Theo

    2008-01-01

    The Generation Challenge programme (GCP) is a global crop research consortium directed toward crop improvement through the application of comparative biology and genetic resources characterization to plant breeding. A key consortium research activity is the development of a GCP crop bioinformatics platform to support GCP research. This platform includes the following: (i) shared, public platform-independent domain models, ontology, and data formats to enable interoperability of data and analysis flows within the platform; (ii) web service and registry technologies to identify, share, and integrate information across diverse, globally dispersed data sources, as well as to access high-performance computational (HPC) facilities for computationally intensive, high-throughput analyses of project data; (iii) platform-specific middleware reference implementations of the domain model integrating a suite of public (largely open-access/-source) databases and software tools into a workbench to facilitate biodiversity analysis, comparative analysis of crop genomic data, and plant breeding decision making. PMID:18483570

  11. A fast and flexible panoramic virtual reality system for behavioural and electrophysiological experiments.

    PubMed

    Takalo, Jouni; Piironen, Arto; Honkanen, Anna; Lempeä, Mikko; Aikio, Mika; Tuukkanen, Tuomas; Vähäsöyrinki, Mikko

    2012-01-01

    Ideally, neuronal functions would be studied by performing experiments with unconstrained animals whilst they behave in their natural environment. Although this is not feasible currently for most animal models, one can mimic the natural environment in the laboratory by using a virtual reality (VR) environment. Here we present a novel VR system based upon a spherical projection of computer generated images using a modified commercial data projector with an add-on fish-eye lens. This system provides equidistant visual stimulation with extensive coverage of the visual field, high spatio-temporal resolution and flexible stimulus generation using a standard computer. It also includes a track-ball system for closed-loop behavioural experiments with walking animals. We present a detailed description of the system and characterize it thoroughly. Finally, we demonstrate the VR system's performance whilst operating in closed-loop conditions by showing the movement trajectories of the cockroaches during exploratory behaviour in a VR forest.

  12. Generating Animated Displays of Spacecraft Orbits

    NASA Technical Reports Server (NTRS)

    Candey, Robert M.; Chimiak, Reine A.; Harris, Bernard T.

    2005-01-01

    Tool for Interactive Plotting, Sonification, and 3D Orbit Display (TIPSOD) is a computer program for generating interactive, animated, four-dimensional (space and time) displays of spacecraft orbits. TIPSOD utilizes the programming interface of the Satellite Situation Center Web (SSCWeb) services to communicate with the SSC logic and database by use of the open protocols of the Internet. TIPSOD is implemented in Java 3D and effects an extension of the preexisting SSCWeb two-dimensional static graphical displays of orbits. Orbits can be displayed in any or all of the following seven reference systems: true-of-date (an inertial system), J2000 (another inertial system), geographic, geomagnetic, geocentric solar ecliptic, geocentric solar magnetospheric, and solar magnetic. In addition to orbits, TIPSOD computes and displays Sibeck's magnetopause and Fairfield's bow-shock surfaces. TIPSOD can be used by the scientific community as a means of projection or interpretation. It also has potential as an educational tool.

  13. AutoAssemblyD: a graphical user interface system for several genome assemblers.

    PubMed

    Veras, Adonney Allan de Oliveira; de Sá, Pablo Henrique Caracciolo Gomes; Azevedo, Vasco; Silva, Artur; Ramos, Rommel Thiago Jucá

    2013-01-01

    Next-generation sequencing technologies have increased the amount of biological data generated. Thus, bioinformatics has become important because new methods and algorithms are necessary to manipulate and process such data. However, certain challenges have emerged, such as genome assembly using short reads from high-throughput platforms. In this context, several algorithms have been developed, such as Velvet, Abyss, Euler-SR, Mira, Edna, Maq, SHRiMP, Newbler, ALLPATHS, Bowtie and BWA. However, most such assemblers do not have a graphical interface, which makes their use difficult for users without computing experience, given the complexity of the assembler syntax. Thus, to make the operation of such assemblers accessible to users without a computing background, we developed AutoAssemblyD, a graphical tool for genome assembly submission and remote management by multiple assemblers through XML templates. AutoAssemblyD is freely available at https://sourceforge.net/projects/autoassemblyd. It requires Sun JDK 6 or higher.

  14. BridgeUP: STEM. Creating Opportunities for Women through Tiered Mentorship

    NASA Astrophysics Data System (ADS)

    Secunda, Amy; Cornelis, Juliette; Ferreira, Denelis; Gomez, Anay; Khan, Ariba; Li, Anna; Soo, Audrey; Mac Low, Mordecai

    2018-01-01

    BridgeUP: STEM is an ambitious and exciting initiative responding to the extensive gender and opportunity gaps in the STEM pipeline for women, girls, and under-resourced youth. BridgeUP: STEM has developed a distinct identity in the landscape of computer science education by embedding programming in the context of scientific research. One way this is accomplished is through a tiered mentorship program. Five Helen Fellows are chosen from a pool of female, post-baccalaureate applicants to be mentored by researchers at the American Museum of Natural History in a computational research project. The Helen Fellows then act as mentors to six high school women (Brown Scholars), guiding them through a computational project aligned with their own research. This year, three of the Helen Fellows, and by extension eighteen Brown Scholars, are performing computational astrophysics research. This poster presents one example of a tiered mentorship team working on modeling the migration of stellar-mass black holes (BH) in active galactic nucleus (AGN) disks. Drawing an analogy from the well-studied migration and formation of planets in protoplanetary disks to the newer field of migration and formation of binary BH in AGN disks, the Helen Fellow is working with her mentors to adapt an N-body code, carrying the migration torques over from the protoplanetary-disk case to the AGN-disk case, to model how binary BH form. The aim is to better understand and make predictions for gravitational-wave observations from the Laser Interferometer Gravitational-Wave Observatory (LIGO). The Brown Scholars then run the Helen Fellow's code for a variety of initial stellar-mass BH population distributions that they generate using Python, and produce visualizations of the output to be used in a published paper. Over the course of the project, students develop a basic understanding of the physics related to their project and strengthen their practical computational skills.

  15. Optimal resolution in maximum entropy image reconstruction from projections with multigrid acceleration

    NASA Technical Reports Server (NTRS)

    Limber, Mark A.; Manteuffel, Thomas A.; Mccormick, Stephen F.; Sholl, David S.

    1993-01-01

    We consider the problem of image reconstruction from a finite number of projections over the space L¹(Ω), where Ω is a compact subset of ℝ². We prove that, given a discretization of the projection space, the function that generates the correct projection data and maximizes the Boltzmann-Shannon entropy is piecewise constant on a certain discretization of Ω, which we call the 'optimal grid'. It is on this grid that one obtains the maximum resolution given the problem setup. The size of this grid grows very quickly as the number of projections and the number of cells per projection grow, indicating that fast computational methods are essential to make its use feasible. We use a Fenchel duality formulation of the problem to keep the number of variables small while still using the optimal discretization, and propose a multilevel scheme to improve convergence of a simple cyclic maximization scheme applied to the dual problem.
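    In standard notation (assumed here rather than quoted from the paper), the problem described is the constrained entropy maximization

    ```latex
    \max_{f \ge 0} \; -\int_{\Omega} f(x)\,\ln f(x)\,\mathrm{d}x
    \quad \text{subject to} \quad (R f)_{p,c} = g_{p,c},
    \qquad p = 1,\dots,P,\; c = 1,\dots,C,
    ```

    where R is the discretized projection operator, g the measured data, P the number of projections and C the number of cells per projection. The Fenchel dual replaces the unknown image f by one multiplier per measurement constraint, which is why the dual formulation keeps the number of variables small even when the optimal grid is very fine.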

  16. Introduction to SIMRAND: Simulation of research and development project

    NASA Technical Reports Server (NTRS)

    Miles, R. F., Jr.

    1982-01-01

    SIMRAND (SIMulation of Research ANd Development Projects) is a methodology developed to aid the engineering and management decision process in selecting the optimal set of systems or tasks to be funded on a research and development project. A project may have a set of systems or tasks under consideration whose total cost exceeds the allocated budget. Other factors such as personnel and facilities may also enter as constraints. Thus the project's management must select, from among the complete set of systems or tasks under consideration, a partial set that satisfies all project constraints. The SIMRAND methodology uses the analytical techniques of probability theory, the decision analysis of management science, and computer simulation in selecting this optimal partial set. The SIMRAND methodology is truly a management tool. It initially specifies the information that must be generated by the engineers, thereby guiding management's direction of the engineering effort, and it ranks the alternatives according to the preferences of the decision makers.
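    A toy sketch of the kind of selection SIMRAND performs: enumerate the task subsets that fit the budget, score each by Monte Carlo simulation of its uncertain outcomes, and rank by expected utility. All names, distributions and the utility model below are illustrative assumptions, not taken from the report.

    ```python
    import itertools
    import random

    def simrand_rank(tasks, budget, utility, n_sims=1000):
        """Rank budget-feasible task subsets by simulated expected utility.
        `tasks` maps name -> (cost, sampler), where sampler() draws one
        simulated performance value for that task."""
        feasible = [
            s for r in range(1, len(tasks) + 1)
            for s in itertools.combinations(tasks, r)
            if sum(tasks[t][0] for t in s) <= budget
        ]
        scores = {}
        for subset in feasible:
            total = 0.0
            for _ in range(n_sims):
                # One simulated project outcome for this subset of tasks.
                total += utility(sum(tasks[t][1]() for t in subset))
            scores[subset] = total / n_sims
        return sorted(scores.items(), key=lambda kv: -kv[1])

    tasks = {"A": (3.0, lambda: random.gauss(5, 1)),
             "B": (2.0, lambda: random.gauss(3, 2)),
             "C": (4.0, lambda: random.gauss(6, 3))}
    print(simrand_rank(tasks, budget=6.0, utility=lambda v: v)[0])
    ```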

  17. Knowledge sharing and collaboration in translational research, and the DC-THERA Directory

    PubMed Central

    Gündel, Michaela; Austyn, Jonathan M.; Cavalieri, Duccio; Scognamiglio, Ciro; Brandizi, Marco

    2011-01-01

    Biomedical research relies increasingly on large collections of data sets and knowledge whose generation, representation and analysis often require large collaborative and interdisciplinary efforts. This dimension of ‘big data’ research calls for the development of computational tools to manage such a vast amount of data, as well as tools that can improve communication and access to information from collaborating researchers and from the wider community. Whenever research projects have a defined temporal scope, an additional issue of data management arises, namely how the knowledge generated within the project can be made available beyond its boundaries and life-time. DC-THERA is a European ‘Network of Excellence’ (NoE) that spawned a very large collaborative and interdisciplinary research community, focusing on the development of novel immunotherapies derived from fundamental research in dendritic cell immunobiology. In this article we introduce the DC-THERA Directory, which is an information system designed to support knowledge management for this research community and beyond. We present how the use of metadata and Semantic Web technologies can effectively help to organize the knowledge generated by modern collaborative research, how these technologies can enable effective data management solutions during and beyond the project lifecycle, and how resources such as the DC-THERA Directory fit into the larger context of e-science. PMID:21969471

  18. Impact analysis of government investment on water projects in the arid Gansu Province of China

    NASA Astrophysics Data System (ADS)

    Wang, Zhan; Deng, Xiangzheng; Li, Xiubin; Zhou, Qing; Yan, Haiming

    In this paper, we introduce a three-level nested Constant Elasticity of Substitution (CES) production function into a static Computable General Equilibrium (CGE) model. Through four levels of factor productivity, we construct a three-level nested production function of land-use productivity in the conceptual modeling framework. The first level of factor productivity is generated by basic value-added land. On the second level, factor productivity in each sector is generated by human activities, which represents human intervention in the first level of factor productivity. On the third level, water allocation reshapes the non-linear structure of transactions between the first and second levels. From the perspective of resource utilization, we examine the economic efficiency of water allocation. The scenario-based empirical analysis shows that the three-level nested CES production function within the CGE model is well-behaved in representing the economic system of the case study area. First, water scarcity harms economic production, so government investment in water projects in Gansu affects economic outcomes. Second, large-scale government financing of water projects depreciates the present value of social welfare. Moreover, water use for environmental adaptation puts pressure on the water supply; the theoretical water price can rise sharply due to the increasing costs of factor inputs. Third, water-use efficiency can be improved by water projects, typically benefiting from the expansion of water-saving irrigation areas, even in the expanding dry areas of Gansu. Therefore, increased government financing of water projects can depreciate the present value of social welfare but benefit economic efficiency for future generations.
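    For reference, a single two-input CES aggregator of the kind nested three times here (generic symbols, not the authors' calibration):

    ```latex
    Y = A\left[\delta X_1^{-\rho} + (1-\delta)\, X_2^{-\rho}\right]^{-1/\rho},
    \qquad
    \sigma = \frac{1}{1+\rho},
    ```

    where A is total factor productivity, δ the share parameter and σ the elasticity of substitution. Nesting means replacing X₁ or X₂ with another CES aggregate, which is how the land, human-activity and water-allocation levels are stacked into one production structure.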

  19. Introducing Enabling Computational Tools to the Climate Sciences: Multi-Resolution Climate Modeling with Adaptive Cubed-Sphere Grids

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jablonowski, Christiane

    The research investigates and advances strategies for bridging the scale discrepancies between local, regional and global phenomena in climate models without the prohibitive computational costs of global cloud-resolving simulations. In particular, the research explores new frontiers in computational geoscience by introducing high-order Adaptive Mesh Refinement (AMR) techniques into climate research. AMR and statically-adapted variable-resolution approaches represent an emerging trend for atmospheric models and are likely to become the new norm in future-generation weather and climate models. The research advances the understanding of multi-scale interactions in the climate system and showcases a pathway for modeling these interactions effectively with advanced computational tools, like the Chombo AMR library developed at the Lawrence Berkeley National Laboratory. The research is interdisciplinary and combines applied mathematics, scientific computing and the atmospheric sciences. In this research project, a hierarchy of high-order atmospheric models on cubed-sphere computational grids has been developed that serves as an algorithmic prototype for the finite-volume solution-adaptive Chombo-AMR approach. The investigations have focused on the characteristics of both static mesh adaptations and dynamically-adaptive grids that can capture flow fields of interest, such as tropical cyclones. Six research themes were chosen: (1) the introduction of adaptive mesh refinement techniques into the climate sciences, (2) advanced algorithms for nonhydrostatic atmospheric dynamical cores, (3) an assessment of the interplay between resolved-scale dynamical motions and subgrid-scale physical parameterizations, (4) evaluation techniques for atmospheric model hierarchies, (5) the comparison of AMR refinement strategies and (6) tropical cyclone studies with a focus on multi-scale interactions and variable-resolution modeling. The results of this research project demonstrate significant advances in all six research areas. The major conclusions are that statically-adaptive variable-resolution modeling is becoming mature in the climate sciences, and that AMR holds outstanding promise for future-generation weather and climate models on high-performance computing architectures.

  20. Generating Mosaics of Astronomical Images

    NASA Technical Reports Server (NTRS)

    Bergou, Attila; Berriman, Bruce; Good, John; Jacob, Joseph; Katz, Daniel; Laity, Anastasia; Prince, Thomas; Williams, Roy

    2005-01-01

    "Montage" is the name of a service of the National Virtual Observatory (NVO), and of software being developed to implement the service via the World Wide Web. Montage generates science-grade custom mosaics of astronomical images on demand from input files that comply with the Flexible Image Transport System (FITS) standard and contain image data registered on projections that comply with the World Coordinate System (WCS) standards. "Science-grade" in this context signifies that terrestrial and instrumental features are removed from images in a way that can be described quantitatively. "Custom" refers to user-specified parameters of projection, coordinates, size, rotation, and spatial sampling. The greatest value of Montage is expected to lie in its ability to analyze images at multiple wavelengths, delivering them on a common projection, coordinate system, and spatial sampling, and thereby enabling further analysis as though they were part of a single, multi-wavelength image. Montage will be deployed as a computation-intensive service through existing astronomy portals and other Web sites. It will be integrated into the emerging NVO architecture and will be executed on the TeraGrid. The Montage software will also be portable and publicly available.

  1. Final Results from A Pilot Project to Investigate Wake Vortex Patterns and Weather Patterns at the Atlantic City Airport by the Richard Stockton College of NJ and the FAA

    NASA Astrophysics Data System (ADS)

    Trout, Joseph; Manson, J. Russell; King, David; Decicco, Nicolas; Prince, Alyssa; di Mercurio, Alexis; Rios, Manual

    2017-01-01

    Wake vortex turbulence is the turbulence generated by an aircraft in flight. This turbulence is created by vortices at the wing tips that may decay slowly and persist for several minutes after creation. These vortices and the associated turbulence are hazardous to other aircraft in the vicinity. The strength, formation and lifetime of the turbulence and vortices are affected by many factors, including the weather. Here we present the final results of the pilot project: an investigation of low-level wind fields generated by the Weather Research and Forecasting (WRF) Model and an analysis of historical data. The findings from the historical data and the simulations were used as inputs to a computational fluid dynamics model (OpenFOAM) to show that the vortices can be simulated using OpenFOAM. Presented here are the updated results from a research grant, ``A Pilot Project to Investigate Wake Vortex Patterns and Weather Patterns at the Atlantic City Airport by the Stockton University and the FAA''.

  2. Using virtual environment technology for preadapting astronauts to the novel sensory conditions of microgravity

    NASA Technical Reports Server (NTRS)

    Duncan, K. M.; Harm, D. L.; Crosier, W. G.; Worthington, J. W.

    1993-01-01

    A unique training device is being developed at the Johnson Space Center Neurosciences Laboratory to help reduce or eliminate Space Motion Sickness (SMS) and spatial orientation disturbances that occur during spaceflight. The Device for Orientation and Motion Environments Preflight Adaptation Trainer (DOME PAT) uses virtual reality technology to simulate some sensory rearrangements experienced by astronauts in microgravity. By exposing a crew member to this novel environment preflight, it is expected that he/she will become partially adapted, and thereby suffer fewer symptoms inflight. The DOME PAT is a 3.7 m spherical dome, within which a 170 by 100 deg field of view computer-generated visual database is projected. The visual database currently in use depicts the interior of a Shuttle spacelab. The trainee uses a six degree-of-freedom, isometric force hand controller to navigate through the virtual environment. Alternatively, the trainee can be 'moved' about within the virtual environment by the instructor, or can look about within the environment by wearing a restraint that controls scene motion in response to head movements. The computer system is comprised of four personal computers that provide the real time control and user interface, and two Silicon Graphics computers that generate the graphical images. The image generator computers use custom algorithms to compensate for spherical image distortion, while maintaining a video update rate of 30 Hz. The DOME PAT is the first such system known to employ virtual reality technology to reduce the untoward effects of the sensory rearrangement associated with exposure to microgravity, and it does so in a very cost-effective manner.

  3. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Salama, A.; Mikhail, M.

    Comprehensive software packages have been developed at the Western Research Centre as tools to help coal preparation engineers analyze, evaluate, and control coal cleaning processes. The COal Preparation Software package (COPS) performs three functions: (1) data handling and manipulation; (2) data analysis, including the generation of washability data, performance evaluation and prediction, density and size modeling, and evaluation of density and size partition characteristics and attrition curves; and (3) generation of graphics output. The Separation ChARacteristics Estimation (SCARE) software packages were developed to balance raw density or size separation data; both the density and the size separation cases are considered. The generated balanced data can take the balanced or normalized form. The scaled form is desirable for direct determination of the partition functions (curves). The raw and generated separation data are displayed in tabular and/or graphical forms. The software packages described in this paper are valuable tools for coal preparation plant engineers and operators for evaluating process performance, adjusting plant parameters, and balancing raw density or size separation data. These packages have been applied very successfully in many projects carried out by WRC for the Canadian coal preparation industry. The software packages are designed to run on a personal computer (PC).

  4. ReSeqTools: an integrated toolkit for large-scale next-generation sequencing based resequencing analysis.

    PubMed

    He, W; Zhao, S; Liu, X; Dong, S; Lv, J; Liu, D; Wang, J; Meng, Z

    2013-12-04

    Large-scale next-generation sequencing (NGS)-based resequencing detects sequence variations, constructs evolutionary histories, and identifies phenotype-related genotypes. However, NGS-based resequencing studies generate extraordinarily large amounts of data, making computations difficult. Effective use and analysis of these data for NGS-based resequencing studies remains a difficult task for individual researchers. Here, we introduce ReSeqTools, a full-featured toolkit for NGS (Illumina sequencing)-based resequencing analysis, which processes raw data, interprets mapping results, and identifies and annotates sequence variations. ReSeqTools provides abundant scalable functions for routine resequencing analysis in different modules to facilitate customization of the analysis pipeline. ReSeqTools is designed to use compressed data files as input or output to save storage space and facilitates faster and more computationally efficient large-scale resequencing studies in a user-friendly manner. It offers abundant practical functions and generates useful statistics during the analysis pipeline, which significantly simplifies resequencing analysis. Its integrated algorithms and abundant sub-functions provide a solid foundation for special demands in resequencing projects. Users can combine these functions to construct their own pipelines for other purposes.

  5. Computer-Aided Parallelizer and Optimizer

    NASA Technical Reports Server (NTRS)

    Jin, Haoqiang

    2011-01-01

    The Computer-Aided Parallelizer and Optimizer (CAPO) automates the insertion of compiler directives (see figure) to facilitate parallel processing on Shared Memory Parallel (SMP) machines. While CAPO currently is integrated seamlessly into CAPTools (developed at the University of Greenwich, now marketed as ParaWise), CAPO was independently developed at Ames Research Center as one of the components for the Legacy Code Modernization (LCM) project. The current version takes serial FORTRAN programs, performs interprocedural data dependence analysis, and generates OpenMP directives. Due to the widely supported OpenMP standard, the generated OpenMP codes have the potential to run on a wide range of SMP machines. CAPO relies on accurate interprocedural data dependence information currently provided by CAPTools. Compiler directives are generated through identification of parallel loops in the outermost level, construction of parallel regions around parallel loops and optimization of parallel regions, and insertion of directives with automatic identification of private, reduction, induction, and shared variables. Attempts also have been made to identify potential pipeline parallelism (implemented with point-to-point synchronization). Although directives are generated automatically, user interaction with the tool is still important for producing good parallel codes. A comprehensive graphical user interface is included for users to interact with the parallelization process.

  6. Aerosciences, Aero-Propulsion and Flight Mechanics Technology Development for NASA's Next Generation Launch Technology Program

    NASA Technical Reports Server (NTRS)

    Cockrell, Charles E., Jr.

    2003-01-01

    The Next Generation Launch Technology (NGLT) program, Vehicle Systems Research and Technology (VSR&T) project is pursuing technology advancements in aerothermodynamics, aeropropulsion and flight mechanics to enable development of future reusable launch vehicle (RLV) systems. The current design trade space includes rocket-propelled, hypersonic airbreathing and hybrid systems in two-stage and single-stage configurations. Aerothermodynamics technologies include experimental and computational databases to evaluate stage separation of two-stage vehicles as well as computational and trajectory simulation tools for this problem. Additionally, advancements in high-fidelity computational tools and measurement techniques are being pursued along with the study of flow physics phenomena, such as boundary-layer transition. Aero-propulsion technology development includes scramjet flowpath development and integration, with a current emphasis on hypervelocity (Mach 10 and above) operation, as well as the study of aero-propulsive interactions and the impact on overall vehicle performance. Flight mechanics technology development is focused on advanced guidance, navigation and control (GN&C) algorithms and adaptive flight control systems for both rocket-propelled and airbreathing vehicles.

  7. Unsteady Aero Computation of a 1 1/2 Stage Large Scale Rotating Turbine

    NASA Technical Reports Server (NTRS)

    To, Wai-Ming

    2012-01-01

    This report documents the work performed for the Subsonic Rotary Wing Project under NASA's Fundamental Aeronautics Program. It was funded through Task Number NNC10E420T under GESS-2 Contract NNC06BA07B in the period from 10/1/2010 to 8/31/2011. The objective of the task is to support the development of variable-speed power turbine technology through the application of computational fluid dynamics analyses. This includes work elements in mesh generation, multistage URANS simulations, and post-processing of the simulation results for comparison with the experimental data. The unsteady CFD calculations were performed with the TURBO code running in multistage single-passage (phase-lag) mode. Meshes for the blade rows were generated with the NASA-developed TCGRID code. The CFD performance is assessed and improvements are recommended for future research in this area. The United Technologies Research Center's 1 1/2 stage Large Scale Rotating Turbine was selected as the candidate engine configuration for this computational effort because of the completeness and availability of its data.

  8. Space coding for sensorimotor transformations can emerge through unsupervised learning.

    PubMed

    De Filippo De Grazia, Michele; Cutini, Simone; Lisi, Matteo; Zorzi, Marco

    2012-08-01

    The posterior parietal cortex (PPC) is fundamental for sensorimotor transformations because it combines multiple sensory inputs and posture signals into different spatial reference frames that drive motor programming. Here, we present a computational model mimicking the sensorimotor transformations occurring in the PPC. A recurrent neural network with one layer of hidden neurons (restricted Boltzmann machine) learned a stochastic generative model of the sensory data without supervision. After the unsupervised learning phase, the activity of the hidden neurons was used to compute a motor program (a population code on a bidimensional map) through a simple linear projection and delta rule learning. The average motor error, calculated as the difference between the expected and the computed output, was less than 3°. Importantly, analyses of the hidden neurons revealed gain-modulated visual receptive fields, thereby showing that space coding for sensorimotor transformations similar to that observed in the PPC can emerge through unsupervised learning. These results suggest that gain modulation is an efficient coding strategy to integrate visual and postural information toward the generation of motor commands.
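    A minimal sketch of the readout stage described above: a linear projection from hidden-unit activities to a motor output, trained with the delta rule. The hidden activities here are random stand-ins for trained RBM features, and all sizes are illustrative.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Toy stand-in for the readout: hidden activities h are mapped to a
    # motor output y by a linear projection W learned with the delta rule.
    n_hidden, n_motor, n_samples, lr = 50, 20, 1000, 0.05
    H = rng.random((n_samples, n_hidden))          # "hidden" activities
    W_true = rng.normal(size=(n_hidden, n_motor))  # synthetic target mapping
    Y = H @ W_true                                 # desired motor programs

    W = np.zeros((n_hidden, n_motor))
    for h, y in zip(H, Y):
        y_hat = h @ W
        W += lr * np.outer(h, y - y_hat)  # delta rule: dW = lr * h (y - y_hat)

    print("mean abs error after one pass:", np.abs(H @ W - Y).mean())
    ```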

  9. Use of cone beam computed tomography in periodontology

    PubMed Central

    Acar, Buket; Kamburoğlu, Kıvanç

    2014-01-01

    Diagnosis of periodontal disease mainly depends on clinical signs and symptoms. However, in the case of bone destruction, radiographs are valuable diagnostic tools as an adjunct to the clinical examination. Two-dimensional periapical and panoramic radiographs are routinely used for diagnosing periodontal bone levels. In two-dimensional imaging, evaluation of bone craters, lamina dura and periodontal bone level is limited by projection geometry and the superimposition of adjacent anatomical structures. These limitations of 2D radiographs can be eliminated by three-dimensional imaging techniques such as computed tomography. Cone beam computed tomography (CBCT) generates 3D volumetric images and is commonly used in dentistry. All CBCT units provide axial, coronal and sagittal multi-planar reconstructed images without magnification. Panoramic images without distortion and magnification can also be generated with curved planar reformation. CBCT displays the 3D images that are necessary for the diagnosis of intrabony defects, furcation involvements and buccal/lingual bone destruction. CBCT applications provide obvious benefits in periodontics; however, they should be used only for correct indications, weighing the necessity of the examination against its potential hazards. PMID:24876918

  10. 15Mcps photon-counting X-ray computed tomography system using a ZnO-MPPC detector and its application to gadolinium imaging.

    PubMed

    Sato, Eiichi; Sugimura, Shigeaki; Endo, Haruyuki; Oda, Yasuyuki; Abudurexiti, Abulajiang; Hagiwara, Osahiko; Osawa, Akihiro; Matsukiyo, Hiroshi; Enomoto, Toshiyuki; Watanabe, Manabu; Kusachi, Shinya; Sato, Shigehiro; Ogawa, Akira; Onagawa, Jun

    2012-01-01

    A 15 Mcps (mega counts per second) photon-counting X-ray computed tomography (CT) system of the first-generation type consists of an X-ray generator, a turntable, a translation stage, a two-stage controller, a detector consisting of a 2-mm-thick zinc oxide (ZnO) single-crystal scintillator and an MPPC (multipixel photon counter) module, a counter card (CC), and a personal computer (PC). High-speed photon counting was carried out using the detector in the X-ray CT system. The maximum count rate was 15 Mcps at a tube voltage of 100 kV and a tube current of 1.95 mA. Tomography is accomplished by repeated translations and rotations of an object, and projection curves of the object are obtained by the translation. The pulses of the event signal from the module are counted by the CC in conjunction with the PC. The minimum exposure time for obtaining a tomogram was 15 min, and photon-counting CT was accomplished using gadolinium-based contrast media.

  11. A rapid parallelization of cone-beam projection and back-projection operator based on texture fetching interpolation

    NASA Astrophysics Data System (ADS)

    Xie, Lizhe; Hu, Yining; Chen, Yang; Shi, Luyao

    2015-03-01

    Projection and back-projection are the most computationally expensive parts of Computed Tomography (CT) reconstruction, and parallelization strategies using GPU computing techniques have been introduced for them. In this paper we present a new parallelization scheme for both projection and back-projection, based on NVIDIA's CUDA technology. Instead of building a complex model, we aimed at optimizing the existing algorithm and making it suitable for CUDA implementation so as to gain computation speed. Besides making use of texture-fetching operations, which provide faster interpolation, we fixed the number of samples in the projection computation to keep blocks and threads synchronized, thus preventing the latency caused by inconsistent computational complexity. Experimental results demonstrate the computational efficiency and imaging quality of the proposed method.
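    The fixed-sample idea can be sketched on the CPU as below: every ray takes the same number of bilinear-interpolated samples, so parallel threads would do identical work. The NumPy bilinear lookup is a stand-in for the hardware texture fetch; this is an illustrative sketch, not the paper's kernel.

    ```python
    import numpy as np

    def forward_project_ray(img, p0, p1, n_samples=256):
        """Integrate img along the ray p0 -> p1 using a fixed number of
        bilinear-interpolated samples (same work per ray by construction)."""
        ts = np.linspace(0.0, 1.0, n_samples)
        xs = p0[0] + ts * (p1[0] - p0[0])
        ys = p0[1] + ts * (p1[1] - p0[1])
        h, w = img.shape
        x0 = np.clip(np.floor(xs).astype(int), 0, w - 2)
        y0 = np.clip(np.floor(ys).astype(int), 0, h - 2)
        fx, fy = xs - x0, ys - y0
        # Bilinear interpolation (what a GPU texture fetch does in hardware).
        val = ((1 - fx) * (1 - fy) * img[y0, x0] +
               fx * (1 - fy) * img[y0, x0 + 1] +
               (1 - fx) * fy * img[y0 + 1, x0] +
               fx * fy * img[y0 + 1, x0 + 1])
        step = np.hypot(p1[0] - p0[0], p1[1] - p0[1]) / (n_samples - 1)
        return val.sum() * step

    img = np.ones((64, 64))
    print(forward_project_ray(img, (0.0, 31.5), (63.0, 31.5)))  # ~63 for a unit image
    ```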

  12. Development of stable Grid service at the next generation system of KEKCC

    NASA Astrophysics Data System (ADS)

    Nakamura, T.; Iwai, G.; Matsunaga, H.; Murakami, K.; Sasaki, T.; Suzuki, S.; Takase, W.

    2017-10-01

    Many experiments in the field of accelerator-based science are actively running at the High Energy Accelerator Research Organization (KEK) using the SuperKEKB and J-PARC accelerators in Japan. The computing demand at KEK from the various experiments for data processing, analysis, and MC simulation is steadily increasing. This is not limited to the high-energy experiments: the computing requirements of the hadron and neutrino experiments and of several astroparticle physics projects are also rapidly increasing due to very-high-precision measurements. Under this situation, several projects supported by KEK, namely the Belle II, T2K, ILC and KAGRA experiments, are going to utilize the Grid computing infrastructure as their main computing resource. The Grid system and services at KEK, which are already in production, were upgraded for further stable operation at the same time as the whole-scale hardware replacement of the KEK Central Computer System (KEKCC). The next-generation KEKCC system started operation at the beginning of September 2016. The basic Grid services, e.g., BDII, VOMS, LFC, the CREAM computing element and the StoRM storage element, are built on a more robust hardware configuration. Since raw data transfer is one of the most important tasks for the KEKCC, two redundant GridFTP servers are attached to the StoRM service instances with 40 Gbps network bandwidth on the LHCONE routing. These are dedicated to Belle II raw data transfer to other sites, apart from the servers used for data transfer by the other VOs. Additionally, we prepared a redundant configuration for the database-oriented services, such as LFC and AMGA, using LifeKeeper. The LFC servers comprise two read/write servers and two read-only servers for the Belle II experiment, each with an individual database for load balancing. The FTS3 service is newly deployed as a service for Belle II data distribution. A CVMFS stratum-0 service has been started for the Belle II software repository, and a stratum-1 service is provided for the other VOs. In this way, many upgrades have been made for the production Grid services at the KEK Computing Research Center. In this paper, we introduce the detailed hardware configuration of the Grid instances and several mechanisms used to construct a robust Grid system in the next-generation KEKCC.

  13. Representative Sampling: Follow-up of Spring 1972 and Spring 1973 Students. TEX-SIS FOLLOW-UP SC3.

    ERIC Educational Resources Information Center

    Wilkinson, Larry; And Others

    This report presents the findings of a research study, conducted by the College of the Mainland (COM) as a subcontractor for Project FOLLOW-UP, designed to test the accuracy of random sampling and to measure non-response bias in mail surveys. In 1975, a computer-generated random sample of 500 students was drawn from a population of 1,256 students…

  14. Russian and Chinese Information Warfare: Theory and Practice

    DTIC Science & Technology

    2004-06-01

    Excerpted topic list (recovered from the report's tables): integral neurolinguistic programming; placing essential programs into the conscious or subconscious mind; subconscious suggestions that modify human…; generators of special rays; optical systems; neurolinguistic programming; computer psychotechnology; the mass media; audiovisual effects; special effects.

  15. Onboard processor technology review

    NASA Technical Reports Server (NTRS)

    Benz, Harry F.

    1990-01-01

    The general need and requirements for the onboard embedded processors necessary to control and manipulate data in spacecraft systems are discussed. The current known requirements are reviewed from a user perspective, based on current practices in the spacecraft development process. The current capabilities of available processor technologies are then discussed, and these are projected to the generation of spacecraft computers currently under identified, funded development. An appraisal is provided for the current national developmental effort.

  16. Cost and schedule analytical techniques development

    NASA Technical Reports Server (NTRS)

    1994-01-01

    This contract provided technical services and products to the Marshall Space Flight Center's Engineering Cost Office (PP03) and the Program Plans and Requirements Office (PP02) for the period 3 Aug. 1991 - 30 Nov. 1994. The accomplishments summarized here cover the REDSTAR database, the NASCOM hard-copy database, the NASCOM automated database, the NASCOM cost model, complexity generators, program planning, schedules, NASA computer connectivity, other analytical techniques, and special project support.

  17. More steam for Kern River

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rintoul, B.

    1973-02-01

    While production generally is declining elsewhere in California, the Kern River field continues to post gains. The field last year produced at an all-time high for the second year in a row, putting out at least 1.5 million bbl more than in its previous peak year, and there is every reason to believe the gains will continue through this year. Steam is the factor that underlies Kern River's resurgence, and Getty Oil Co., the field's premier steamer, recently added to its already imposing array of steam-generating equipment a pair of large boilers, each capable of generating 240 million Btu per hr. Along with expansion of the steaming effort, the company also expanded its water-treating facilities, making sure there will be plenty of feed water for the steam generators at work in the field. The new boilers are being used to furnish steam to 136 wells in a steam displacement project. The purpose of going to a larger generator has been to gain higher efficiency. The components that have made Getty Oil the leading steamer at Kern River, and the field, in turn, the world capital for oil-field steam operations, include shallow wells, steam generators and, since 1969, a computer. The entire project is described in detail.

  18. QADATA user's manual; an interactive computer program for the retrieval and analysis of the results from the external blind sample quality- assurance project of the U.S. Geological Survey

    USGS Publications Warehouse

    Lucey, K.J.

    1990-01-01

    The U.S. Geological Survey conducts an external blind sample quality-assurance project for its National Water Quality Laboratory in Denver, Colorado, based on the analysis of reference water samples. Reference samples containing selected inorganic and nutrient constituents are disguised as environmental samples at the Survey's office in Ocala, Florida, and are sent periodically through other Survey offices to the laboratory. The results of this blind sample project indicate the quality of analytical data produced by the laboratory. This report provides instructions on the use of QADATA, an interactive, menu-driven program that allows users to retrieve the results of the blind sample quality-assurance project. The QADATA program, which is available on the U.S. Geological Survey's national computer network, accesses a blind sample database that contains more than 50,000 determinations from the last five water years for approximately 40 constituents at various concentrations. The data can be retrieved from the database for any user-defined time period and for any or all available constituents. After the user defines the retrieval, the program prepares statistical tables, control charts, and precision plots and generates a report which can be transferred to the user's office through the computer network. A discussion of the interpretation of the program output is also included. This quality-assurance information will permit users to document the quality of the analytical results received from the laboratory. The blind sample data are entered into the database within weeks after being produced by the laboratory and can be retrieved to meet the needs of specific projects or programs. (USGS)

  19. Feasibility study for mega-electron-volt electron beam tomography.

    PubMed

    Hampel, U; Bärtling, Y; Hoppe, D; Kuksanov, N; Fadeev, S; Salimov, R

    2012-09-01

    Electron beam tomography is a promising imaging modality for the study of fast technical processes. But for many technical objects of interest x rays of several hundreds of keV energy are required to achieve sufficient material penetration. In this article we report on a feasibility study for fast electron beam computed tomography with a 1 MeV electron beam. The experimental setup comprises an electrostatic accelerator with beam optics, transmission target, and a single x-ray detector. We employed an inverse fan-beam tomography approach with radiographic projections being generated from the linearly moving x-ray source. Angular projections were obtained by rotating the object.

  20. All-optical animation projection system with rotating fieldstone.

    PubMed

    Ishii, Yuko; Takayama, Yoshihisa; Kodate, Kashiko

    2007-06-11

    A simple and compact rewritable holographic memory system using a fieldstone of Ulexite is proposed. The role of the fieldstone is to impose random patterns on the reference beam to record plural images with the random-reference multiplexing scheme. The operations for writing and reading holograms are carried out by simply rotating the fieldstone in one direction. One of the features of this approach is found in a way to generate random patterns without computer drawings. The experimental study confirms that our system enables the smooth readout of the stored images one after another so that the series of reproduced images are projected as an animation.

  1. All-optical animation projection system with rotating fieldstone

    NASA Astrophysics Data System (ADS)

    Ishii, Yuko; Takayama, Yoshihisa; Kodate, Kashiko

    2007-06-01

    A simple and compact rewritable holographic memory system using a fieldstone of Ulexite is proposed. The role of the fieldstone is to impose random patterns on the reference beam to record plural images with the random-reference multiplexing scheme. The operations for writing and reading holograms are carried out by simply rotating the fieldstone in one direction. One of the features of this approach is found in a way to generate random patterns without computer drawings. The experimental study confirms that our system enables the smooth readout of the stored images one after another so that the series of reproduced images are projected as an animation.

  2. The recognition of potato varieties using of neural image analysis method

    NASA Astrophysics Data System (ADS)

    Przybył, K.; Górna, K.; Wojcieszak, D.; Czekała, W.; Ludwiczak, A.; Przybylak, A.; Boniecki, P.; Koszela, K.; Zaborowicz, M.; Janczak, D.; Lewicki, A.

    2015-07-01

    The aim of this paper was to extract representative features and generate an appropriate neural model for the classification of edible potato varieties. Potatoes of the Vineta and Denar varieties were the empirical objects of this study. The main concept of the project was to develop and prepare an image database using computer image analysis software, and then to choose the neural model with the greatest ability to identify the selected variety. Ultimately, the project aims to assist and accelerate the work of the expert who classifies and stores different varieties of potatoes in heaps.

  3. Automated documentation generator for advanced protein crystal growth

    NASA Technical Reports Server (NTRS)

    Maddux, Gary A.; Provancha, Anna; Chattam, David; Ford, Ronald

    1993-01-01

    The System Management and Production Laboratory at the Research Institute, the University of Alabama in Huntsville (UAH), was tasked by the Microgravity Experiment Projects (MEP) Office of the Payload Projects Office (PPO) at Marshall Space Flight Center (MSFC) to conduct research in the current methods of written documentation control and retrieval. The goals of this research were to determine the logical interrelationships within selected NASA documentation, and to expand on a previously developed prototype system to deliver a distributable, electronic knowledge-based system. This computer application would then be used to provide a paperless interface between the appropriate parties for the required NASA document.

  4. Development of a Cloud Resolving Model for Heterogeneous Supercomputers

    NASA Astrophysics Data System (ADS)

    Sreepathi, S.; Norman, M. R.; Pal, A.; Hannah, W.; Ponder, C.

    2017-12-01

    A cloud-resolving climate model is needed to reduce major systematic errors in climate simulations due to structural uncertainty in numerical treatments of convection, such as convective storm systems. This research describes the porting effort to enable the SAM (System for Atmosphere Modeling) cloud-resolving model to run on heterogeneous supercomputers using GPUs (Graphical Processing Units). We have isolated a standalone configuration of SAM that is targeted for integration into the DOE ACME (Accelerated Climate Modeling for Energy) Earth System model. We have identified key computational kernels from the model and offloaded them to a GPU using the OpenACC programming model. Furthermore, we are investigating various optimization strategies intended to enhance GPU utilization, including loop fusion/fission, coalesced data access and loop refactoring to a higher abstraction level. We will present early performance results, lessons learned, and optimization strategies. The computational platform used in this study is the Summitdev system, an early testbed that is one generation removed from Summit, the next leadership-class supercomputer at Oak Ridge National Laboratory. The system contains 54 nodes, each with 2 IBM POWER8 CPUs and 4 NVIDIA Tesla P100 GPUs. This work is part of a larger project, the ACME-MMF component of the U.S. Department of Energy (DOE) Exascale Computing Project. The ACME-MMF approach addresses structural uncertainty in cloud processes by replacing traditional parameterizations with a cloud-resolving "superparameterization" within each grid cell of the global climate model. Superparameterization dramatically increases arithmetic intensity, making the MMF approach an ideal strategy to achieve good performance on emerging exascale computing architectures. The goal of the project is to integrate superparameterization into ACME and explore its full potential to scientifically and computationally advance climate simulation and prediction.

  5. Video image processing

    NASA Technical Reports Server (NTRS)

    Murray, N. D.

    1985-01-01

    Current technology projections indicate a lack of availability of special-purpose computing for Space Station applications. Potential functions for video image special-purpose processing are being investigated, such as smoothing, enhancement, restoration and filtering, data compression, feature extraction, object detection and identification, pixel interpolation/extrapolation, spectral estimation and factorization, and vision synthesis. Architectural approaches are also being identified and a conceptual design generated. Computationally simple algorithms will be researched and their image/vision effectiveness determined. Suitable algorithms will be implemented in an overall architectural approach that provides image/vision processing at video rates that are flexible, selectable, and programmable. Information is given in the form of charts, diagrams and outlines.

  6. Distributed databases for materials study of thermo-kinetic properties

    NASA Astrophysics Data System (ADS)

    Toher, Cormac

    2015-03-01

    High-throughput computational materials science provides researchers with the opportunity to rapidly generate large databases of materials properties. To rapidly add thermal properties to the AFLOWLIB consortium and Materials Project repositories, we have implemented an automated quasi-harmonic Debye model, the Automatic GIBBS Library (AGL). This enables us to screen thousands of materials for thermal conductivity, bulk modulus, thermal expansion and related properties. The search and sort functions of the online database can then be used to identify suitable materials for more in-depth study using more precise computational or experimental techniques. AFLOW-AGL source code is public domain and will soon be released within the GNU-GPL license.
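    For orientation, the central relation of a quasi-harmonic Debye model of this kind, in standard GIBBS-style notation (an assumption for illustration, not quoted from the AGL abstract):

    ```latex
    G^{*}(V; P, T) \;=\; E(V) \;+\; PV \;+\; A_{\mathrm{vib}}\big(\Theta_D(V);\, T\big),
    \qquad
    \left(\frac{\partial G^{*}}{\partial V}\right)_{P,T} = 0,
    ```

    where E(V) is the static DFT energy-volume curve, Θ_D(V) the volume-dependent Debye temperature, and A_vib the vibrational Helmholtz free energy; minimizing G* over V at each (P, T) yields the equation of state from which the bulk modulus, thermal expansion and Debye-model thermal estimates follow.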

  7. Three-dimensional monochromatic x-ray CT

    NASA Astrophysics Data System (ADS)

    Saito, Tsuneo; Kudo, Hiroyuki; Takeda, Tohoru; Itai, Yuji; Tokumori, Kenji; Toyofuku, Fukai; Hyodo, Kazuyuki; Ando, Masami; Nishimura, Ktsuyuki; Uyama, Chikao

    1995-08-01

    In this paper, we describe a 3D computed tomography (3D CT) system using monochromatic x-rays generated by synchrotron radiation, which performs a direct reconstruction of the 3D volume image of an object from its cone-beam projections. For the development of the 3D CT system, the scanning orbit of the x-ray source needed to obtain complete 3D information about an object, and the corresponding 3D image reconstruction algorithm, are considered. Computer simulation studies demonstrate the validity of the proposed scanning method and reconstruction algorithm. A prototype experimental 3D CT system was constructed. Basic phantom examinations and a specific-material CT image obtained by energy subtraction in this experimental system are shown.

  8. Brain Computation Is Organized via Power-of-Two-Based Permutation Logic.

    PubMed

    Xie, Kun; Fox, Grace E; Liu, Jun; Lyu, Cheng; Lee, Jason C; Kuang, Hui; Jacobs, Stephanie; Li, Meng; Liu, Tianming; Song, Sen; Tsien, Joe Z

    2016-01-01

    There is considerable scientific interest in understanding how cell assemblies-the long-presumed computational motif-are organized so that the brain can generate intelligent cognition and flexible behavior. The Theory of Connectivity proposes that the origin of intelligence is rooted in a power-of-two-based permutation logic (N = 2^i - 1), producing specific-to-general cell-assembly architecture capable of generating specific perceptions and memories, as well as generalized knowledge and flexible actions. We show that this power-of-two-based permutation logic is widely used in cortical and subcortical circuits across animal species and is conserved for the processing of a variety of cognitive modalities including appetitive, emotional and social information. However, modulatory neurons, such as dopaminergic (DA) neurons, use a simpler logic despite their distinct subtypes. Interestingly, this specific-to-general permutation logic remained largely intact although NMDA receptors-the synaptic switch for learning and memory-were deleted throughout adulthood, suggesting that the logic is developmentally pre-configured. Moreover, this computational logic is implemented in the cortex via combining a random-connectivity strategy in superficial layers 2/3 with nonrandom organizations in deep layers 5/6. This randomness of layers 2/3 cliques-which preferentially encode specific and low-combinatorial features and project inter-cortically-is ideal for maximizing cross-modality novel pattern-extraction, pattern-discrimination and pattern-categorization using sparse code, consequently explaining why it requires hippocampal offline-consolidation. In contrast, the nonrandomness in layers 5/6-which consists of few specific cliques but a higher portion of more general cliques projecting mostly to subcortical systems-is ideal for feedback-control of motivation, emotion, consciousness and behaviors. These observations suggest that the brain's basic computational algorithm is indeed organized by the power-of-two-based permutation logic. This simple mathematical logic can account for brain computation across the entire evolutionary spectrum, ranging from the simplest neural networks to the most complex.
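    The count N = 2^i - 1 is simply the number of non-empty subsets of i inputs. A tiny enumeration (illustrative only) makes the specific-to-general ordering of the assemblies explicit:

    ```python
    from itertools import combinations

    def assemblies(inputs):
        """All non-empty input subsets, ordered from specific (single
        inputs) to general (all inputs together): 2**i - 1 of them."""
        return [c for r in range(1, len(inputs) + 1)
                for c in combinations(inputs, r)]

    inputs = ["A", "B", "C"]
    groups = assemblies(inputs)
    print(len(groups), "==", 2 ** len(inputs) - 1)  # 7 == 7
    print(groups)  # ('A',), ('B',), ('C',), ('A','B'), ..., ('A','B','C')
    ```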

  9. Brain Computation Is Organized via Power-of-Two-Based Permutation Logic

    PubMed Central

    Xie, Kun; Fox, Grace E.; Liu, Jun; Lyu, Cheng; Lee, Jason C.; Kuang, Hui; Jacobs, Stephanie; Li, Meng; Liu, Tianming; Song, Sen; Tsien, Joe Z.

    2016-01-01

    There is considerable scientific interest in understanding how cell assemblies—the long-presumed computational motif—are organized so that the brain can generate intelligent cognition and flexible behavior. The Theory of Connectivity proposes that the origin of intelligence is rooted in a power-of-two-based permutation logic (N = 2^i – 1), producing specific-to-general cell-assembly architecture capable of generating specific perceptions and memories, as well as generalized knowledge and flexible actions. We show that this power-of-two-based permutation logic is widely used in cortical and subcortical circuits across animal species and is conserved for the processing of a variety of cognitive modalities including appetitive, emotional and social information. However, modulatory neurons, such as dopaminergic (DA) neurons, use a simpler logic despite their distinct subtypes. Interestingly, this specific-to-general permutation logic remained largely intact although NMDA receptors—the synaptic switch for learning and memory—were deleted throughout adulthood, suggesting that the logic is developmentally pre-configured. Moreover, this computational logic is implemented in the cortex via combining a random-connectivity strategy in superficial layers 2/3 with nonrandom organizations in deep layers 5/6. This randomness of layers 2/3 cliques—which preferentially encode specific and low-combinatorial features and project inter-cortically—is ideal for maximizing cross-modality novel pattern-extraction, pattern-discrimination and pattern-categorization using sparse code, consequently explaining why it requires hippocampal offline-consolidation. In contrast, the nonrandomness in layers 5/6—which consists of few specific cliques but a higher portion of more general cliques projecting mostly to subcortical systems—is ideal for feedback-control of motivation, emotion, consciousness and behaviors. These observations suggest that the brain’s basic computational algorithm is indeed organized by the power-of-two-based permutation logic. This simple mathematical logic can account for brain computation across the entire evolutionary spectrum, ranging from the simplest neural networks to the most complex. PMID:27895562

  10. Next Generation Workload Management System For Big Data on Heterogeneous Distributed Computing

    DOE PAGES

    Klimentov, A.; Buncic, P.; De, K.; ...

    2015-05-22

    The Large Hadron Collider (LHC), operating at the international CERN Laboratory in Geneva, Switzerland, is leading Big Data driven scientific explorations. Experiments at the LHC explore the fundamental nature of matter and the basic forces that shape our universe, and were recently credited for the discovery of a Higgs boson. ATLAS and ALICE are the largest collaborations ever assembled in the sciences and are at the forefront of research at the LHC. To address an unprecedented multi-petabyte data processing challenge, both experiments rely on a heterogeneous distributed computational infrastructure. The ATLAS experiment uses PanDA (Production and Data Analysis) Workload Management System (WMS) for managing the workflow for all data processing on hundreds of data centers. Through PanDA, ATLAS physicists see a single computing facility that enables rapid scientific breakthroughs for the experiment, even though the data centers are physically scattered all over the world. The scale is demonstrated by the following numbers: PanDA manages O(10^2) sites, O(10^5) cores, O(10^8) jobs per year, O(10^3) users, and ATLAS data volume is O(10^17) bytes. In 2013 we started an ambitious program to expand PanDA to all available computing resources, including opportunistic use of commercial and academic clouds and Leadership Computing Facilities (LCF). The project titled 'Next Generation Workload Management and Analysis System for Big Data' (BigPanDA) is funded by DOE ASCR and HEP. Extending PanDA to clouds and LCF presents new challenges in managing heterogeneity and supporting workflow. The BigPanDA project is underway to set up and tailor PanDA at the Oak Ridge Leadership Computing Facility (OLCF) and at the National Research Center "Kurchatov Institute" together with ALICE distributed computing and ORNL computing professionals. Our approach to integration of HPC platforms at the OLCF and elsewhere is to reuse, as much as possible, existing components of the PanDA system. Finally, we will present our current accomplishments with running the PanDA WMS at OLCF and other supercomputers and demonstrate our ability to use PanDA as a portal independent of the computing facilities infrastructure for High Energy and Nuclear Physics as well as other data-intensive science applications.

  11. Next Generation Workload Management System For Big Data on Heterogeneous Distributed Computing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Klimentov, A.; Buncic, P.; De, K.

    The Large Hadron Collider (LHC), operating at the international CERN Laboratory in Geneva, Switzerland, is leading Big Data driven scientific explorations. Experiments at the LHC explore the fundamental nature of matter and the basic forces that shape our universe, and were recently credited for the discovery of a Higgs boson. ATLAS and ALICE are the largest collaborations ever assembled in the sciences and are at the forefront of research at the LHC. To address an unprecedented multi-petabyte data processing challenge, both experiments rely on a heterogeneous distributed computational infrastructure. The ATLAS experiment uses PanDA (Production and Data Analysis) Workload Management System (WMS) for managing the workflow for all data processing on hundreds of data centers. Through PanDA, ATLAS physicists see a single computing facility that enables rapid scientific breakthroughs for the experiment, even though the data centers are physically scattered all over the world. The scale is demonstrated by the following numbers: PanDA manages O(10^2) sites, O(10^5) cores, O(10^8) jobs per year, O(10^3) users, and ATLAS data volume is O(10^17) bytes. In 2013 we started an ambitious program to expand PanDA to all available computing resources, including opportunistic use of commercial and academic clouds and Leadership Computing Facilities (LCF). The project titled 'Next Generation Workload Management and Analysis System for Big Data' (BigPanDA) is funded by DOE ASCR and HEP. Extending PanDA to clouds and LCF presents new challenges in managing heterogeneity and supporting workflow. The BigPanDA project is underway to set up and tailor PanDA at the Oak Ridge Leadership Computing Facility (OLCF) and at the National Research Center "Kurchatov Institute" together with ALICE distributed computing and ORNL computing professionals. Our approach to integration of HPC platforms at the OLCF and elsewhere is to reuse, as much as possible, existing components of the PanDA system. Finally, we will present our current accomplishments with running the PanDA WMS at OLCF and other supercomputers and demonstrate our ability to use PanDA as a portal independent of the computing facilities infrastructure for High Energy and Nuclear Physics as well as other data-intensive science applications.

  12. Experimental generation of Laguerre-Gaussian beam using digital micromirror device.

    PubMed

    Ren, Yu-Xuan; Li, Ming; Huang, Kun; Wu, Jian-Guang; Gao, Hong-Fang; Wang, Zi-Qiang; Li, Yin-Mei

    2010-04-01

    A digital micromirror device (DMD) modulates laser intensity through computer control of the device. We experimentally investigate the performance of the modulation property of a DMD and optimize the modulation procedure through image correction. Furthermore, Laguerre-Gaussian (LG) beams with different topological charges are generated by projecting a series of forklike gratings onto the DMD. We measure the field distribution with and without correction, the energy of LG beams with different topological charges, and the polarization property in sequence. Experimental results demonstrate that it is possible to generate LG beams with a DMD that allows the use of a high-intensity laser with proper correction to the input images, and that the polarization state of the LG beam differs from that of the input beam.
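
    A minimal sketch of the kind of binary fork grating such an experiment projects onto the DMD is given below; the resolution, grating period and topological charge are arbitrary placeholders, and the image corrections described in the record are omitted.

      # Sketch: binary "fork" grating for generating an LG-like beam of topological charge l
      # with an amplitude-only device such as a DMD.  Resolution and period are placeholders.
      import numpy as np

      def fork_grating(nx=1024, ny=768, l=3, period_px=12):
          y, x = np.mgrid[-ny // 2:ny // 2, -nx // 2:nx // 2]
          phi = np.arctan2(y, x)                          # azimuthal angle
          phase = l * phi + 2.0 * np.pi * x / period_px   # helical phase plus linear carrier
          return (np.mod(phase, 2.0 * np.pi) < np.pi).astype(np.uint8)  # binarise for the mirrors

      pattern = fork_grating()
      print(pattern.shape, pattern.dtype, pattern.mean())  # roughly half the mirrors are "on"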

  13. Directional sinogram interpolation for sparse angular acquisition in cone-beam computed tomography.

    PubMed

    Zhang, Hua; Sonke, Jan-Jakob

    2013-01-01

    Cone-beam (CB) computed tomography (CT) is widely used in the field of medical imaging for guidance. Inspired by Bertram's directional interpolation (BDI) methods, directional sinogram interpolation (DSI) was implemented to generate more CB projections by optimized (iterative) double-orientation estimation in sinogram space and directional interpolation. A new CBCT was subsequently reconstructed with the Feldkamp algorithm using both the original and interpolated CB projections. The proposed method was evaluated on both phantom and clinical data, and image quality was assessed by the correlation ratio (CR) between the interpolated image and a gold standard obtained from fully measured projections. Additionally, streak artifact reduction and image blur were assessed. In a CBCT reconstructed by 40 acquired projections over an arc of 360 degrees, streak artifacts dropped 20.7% and 6.7% in a thorax phantom when our method was compared to linear interpolation (LI) and BDI methods. Meanwhile, image blur was assessed with a head-and-neck phantom, where the image blur of DSI was 20.1% and 24.3% less than LI and BDI. When our method was compared to the LI and BDI methods, CR increased by 4.4% and 3.1%. Our method reduced the streak artifacts of sparsely acquired CBCT, and the image blur induced by interpolation was kept below that of the other interpolation methods.
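
    For context, the linear-interpolation (LI) baseline mentioned above can be sketched as a pixel-wise blend of angularly adjacent projections; this is only the baseline, not the directional (BDI/DSI) estimation itself, and the arrays below are stand-ins for measured data.

      # Sketch of the linear-interpolation (LI) baseline: synthesise an intermediate projection
      # half-way between two measured cone-beam projections by pixel-wise blending.
      # The directional (BDI/DSI) methods instead interpolate along estimated edge orientations.
      import numpy as np

      def interpolate_projection(proj_a, proj_b, t=0.5):
          """Linearly blend two projections acquired at neighbouring gantry angles (0 <= t <= 1)."""
          return (1.0 - t) * proj_a + t * proj_b

      proj_0 = np.random.rand(256, 256)   # stand-in for a measured projection
      proj_9 = np.random.rand(256, 256)   # stand-in for one acquired 9 degrees later
      proj_mid = interpolate_projection(proj_0, proj_9)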

  14. Adapting wheat to uncertain future

    NASA Astrophysics Data System (ADS)

    Semenov, Mikhail; Stratonovitch, Pierre

    2015-04-01

    This study describes the integration of climate change projections from the Coupled Model Intercomparison Project Phase 5 (CMIP5) multi-model ensemble with the LARS-WG weather generator, which delivers an attractive option for downscaling of large-scale climate projections from global climate models (GCMs) to local-scale climate scenarios for impact assessments. A subset of 18 GCMs from the CMIP5 ensemble and 2 RCPs, RCP4.5 and RCP8.5, were integrated with LARS-WG. Climate sensitivity indexes for temperature and precipitation were computed for all GCMs and for 21 regions in the world. For computationally demanding impact assessments, where it is not practical to explore all possible combinations of GCM × RCP, climate sensitivity indexes could be used to select a subset of GCMs from CMIP5 with contrasting climate sensitivity. This would allow the uncertainty in impacts resulting from the CMIP5 ensemble to be quantified while conducting fewer simulation experiments. As an example, the in silico design of a wheat ideotype optimised for future climate scenarios in Europe was described. Two contrasting GCMs were selected for the analysis, "hot" HadGEM2-ES and "cool" GISS-E2-R-CC, along with 2 RCPs. Despite large uncertainty in climate projections, several wheat traits were identified as beneficial for the high-yielding wheat ideotypes that could be used as targets for wheat improvement by breeders.
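
    One plausible reading of such a temperature sensitivity index is the change in a regional mean between a future and a baseline period; the sketch below ranks a few GCMs this way and keeps the most contrasting pair. The model names are real CMIP5 labels, but the temperature values are invented placeholders rather than LARS-WG or CMIP5 output.

      # Sketch: rank GCMs by a simple temperature sensitivity index (future minus baseline
      # regional mean) and keep the two most contrasting models.  Numbers are placeholders.
      baseline_mean_t = {"HadGEM2-ES": 9.8, "GISS-E2-R-CC": 9.7, "MIROC5": 9.9}    # deg C, baseline period
      future_mean_t   = {"HadGEM2-ES": 14.1, "GISS-E2-R-CC": 11.2, "MIROC5": 12.6}  # deg C, end of century

      sensitivity = {gcm: future_mean_t[gcm] - baseline_mean_t[gcm] for gcm in baseline_mean_t}
      hot = max(sensitivity, key=sensitivity.get)
      cool = min(sensitivity, key=sensitivity.get)
      print(sensitivity)                      # e.g. {'HadGEM2-ES': 4.3, 'GISS-E2-R-CC': 1.5, 'MIROC5': 2.7}
      print("contrasting pair:", hot, cool)   # the "hot" and "cool" models used for the impact runs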

  15. Three-dimensional DNA image cytometry by optical projection tomographic microscopy for early cancer diagnosis.

    PubMed

    Agarwal, Nitin; Biancardi, Alberto M; Patten, Florence W; Reeves, Anthony P; Seibel, Eric J

    2014-04-01

    Aneuploidy is typically assessed by flow cytometry (FCM) and image cytometry (ICM). We used optical projection tomographic microscopy (OPTM) for assessing cellular DNA content using absorption and fluorescence stains. OPTM combines some of the attributes of both FCM and ICM and generates isometric high-resolution three-dimensional (3-D) images of single cells. Although the depth of field of the microscope objective was in the submicron range, it was extended by scanning the objective's focal plane. The extended depth of field image is similar to a projection in a conventional x-ray computed tomography. These projections were later reconstructed using computed tomography methods to form a 3-D image. We also present an automated method for 3-D nuclear segmentation. Nuclei of chicken, trout, and triploid trout erythrocyte were used to calibrate OPTM. Ratios of integrated optical densities extracted from 50 images of each standard were compared to ratios of DNA indices from FCM. A comparison of mean square errors with thionin, hematoxylin, Feulgen, and SYTOX green was done. Feulgen technique was preferred as it showed highest stoichiometry, least variance, and preserved nuclear morphology in 3-D. The addition of this quantitative biomarker could further strengthen existing classifiers and improve early diagnosis of cancer using 3-D microscopy.
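
    The DNA content measure referred to here, the integrated optical density (IOD), is simply the optical density summed over the segmented nucleus and normalised to a reference population. A toy sketch with synthetic arrays is shown below; it is not the OPTM processing chain.

      # Sketch: DNA index from integrated optical density (IOD), i.e. per-voxel optical density
      # summed inside the segmented nucleus and normalised to a diploid reference value.
      import numpy as np

      def integrated_od(nuclear_mask, od_volume):
          """Sum optical density over voxels flagged as nucleus in a 3-D segmentation mask."""
          return float(od_volume[nuclear_mask].sum())

      rng = np.random.default_rng(0)
      od = rng.random((64, 64, 64))              # synthetic optical-density volume
      mask = np.zeros_like(od, dtype=bool)
      mask[20:40, 20:40, 20:40] = True           # synthetic nuclear segmentation

      iod_sample = integrated_od(mask, od)
      iod_reference = 0.5 * iod_sample           # pretend diploid reference IOD for the demo
      print("DNA index:", iod_sample / iod_reference)   # ~2.0 would indicate doubled DNA content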

  16. Kern River steam expansion

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rintoul, B.

    1970-09-15

    The newest addition to Getty Oil Co.'s imposing array of steam equipment at Kern River is a 240-million-btu-per-hr boiler. This boiler is almost 5 times more powerful than the previous largest piece of steam-generating hardware in use in the field. The huge boiler went into operation in Aug. on the Canfield Fee property on Sec. 29, 28S-28E. It is being used to furnish steam for 60 wells in a displacement project. The components that have made Getty Oil Co. the leading steamer at Kern River and the field, in turn, the world capital for oil-field steam operations include shallow wells, steam generators, and--since last year--a computer. There are more than 4,500 oil wells in the Kern River field, including more than 2,600 on Getty Oil properties. Getty Oil's steam operations involve 2,469 producing wells and 151 injection wells, including 2,167 producing wells in stimulation projects and 302 producing wells in displacement projects. The Kern River drilling program for 1970 consists of 313 wells of which 179 are steam-injection wells for the expansion of displacement projects. Wells are shallow, drilled mainly to the Kern River Series sands at an average depth of 900 ft, with a few drilled to the China Grade zone at an average depth of 1,300 ft. To furnish steam for the massive Kern River program, Getty Oil has assembled a force of 96 steam generators.

  17. Exploring new kinds of relationships using generative music-making software.

    PubMed

    Dillon, Steve; Jones, Anita

    2009-08-01

    This project focuses upon the use of jam2jam, a generative computer system, to increase access to improvization experiences for children and to facilitate new kinds of relationships with artists. The network jamming system uses visual and audio cultural materials to enable communities to be expressive with artistic materials that they value as a community. As the system is part of a network, performances can be shared between communities at great distances and recordings of performances can be uploaded to a digital social network (http://www.jam2jam.com/) and shared both locally and with the wider community. This paper examines a preliminary project where artwork made by Indigenous mental health clients in Far North Queensland was digitized and given to a group of 8-12-year-old urban Indigenous children to 'improvize' with and make music/video clips using the jam2jam instrument. It seeks to generate a discussion and identify applications within creative arts-led community health settings to facilitate new kinds of relationships with self, peers, local community, culture and artists through collaborative improvization.

  18. A Structured-Inquiry Approach to Teaching Neurophysiology Using Computer Simulation

    PubMed Central

    Crisp, Kevin M.

    2012-01-01

    Computer simulation is a valuable tool for teaching the fundamentals of neurophysiology in undergraduate laboratories where time and equipment limitations restrict the amount of course content that can be delivered through hands-on interaction. However, students often find such exercises to be tedious and unstimulating. In an effort to engage students in the use of computational modeling while developing a deeper understanding of neurophysiology, an attempt was made to use an educational neurosimulation environment as the basis for a novel, inquiry-based research project. During the semester, students in the class wrote a research proposal, used the Neurodynamix II simulator to generate a large data set, analyzed their modeling results statistically, and presented their findings at the Midbrains Neuroscience Consortium undergraduate poster session. Learning was assessed in the form of a series of short term papers and two 10-min in-class writing responses to the open-ended question, “How do ion channels influence neuronal firing?”, which they completed on weeks 6 and 15 of the semester. Students’ answers to this question showed a deeper understanding of neuronal excitability after the project; their term papers revealed evidence of critical thinking about computational modeling and neuronal excitability. Suggestions for the adaptation of this structured-inquiry approach into shorter term lab experiences are discussed. PMID:23494064
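
    As a rough illustration of the in-class question "How do ion channels influence neuronal firing?", the toy leaky integrate-and-fire sketch below (a stand-in, not Neurodynamix II) shows how a larger potassium-like leak conductance lowers, and eventually abolishes, the firing rate for a fixed injected current. All parameter values are generic placeholders.

      # Toy leaky integrate-and-fire neuron (not Neurodynamix II): a larger potassium-like
      # leak conductance pulls the membrane toward the leak reversal and reduces firing.
      def spike_count(g_leak, i_inj=0.3e-9, t_max=1.0, dt=1e-4,
                      c_m=200e-12, e_leak=-70e-3, v_thresh=-50e-3, v_reset=-65e-3):
          v, spikes = e_leak, 0
          for _ in range(int(t_max / dt)):
              dv = (-(g_leak * (v - e_leak)) + i_inj) / c_m   # forward-Euler membrane update
              v += dv * dt
              if v >= v_thresh:
                  v, spikes = v_reset, spikes + 1
          return spikes

      for g in (5e-9, 10e-9, 20e-9):          # leak conductance in siemens
          print(g, spike_count(g), "spikes/s")  # firing rate drops as the leak grows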

  19. Three-dimensional elliptic grid generation for an F-16

    NASA Technical Reports Server (NTRS)

    Sorenson, Reese L.

    1988-01-01

    A case history depicting the effort to generate a computational grid for the simulation of transonic flow about an F-16 aircraft at realistic flight conditions is presented. The flow solver for which this grid is designed is a zonal one, using the Reynolds averaged Navier-Stokes equations near the surface of the aircraft, and the Euler equations in regions removed from the aircraft. A body conforming global grid, suitable for the Euler equation, is first generated using 3-D Poisson equations having inhomogeneous terms modeled after the 2-D GRAPE code. Regions of the global grid are then designated for zonal refinement as appropriate to accurately model the flow physics. Grid spacing suitable for solution of the Navier-Stokes equations is generated in the refinement zones by simple subdivision of the given coarse grid intervals. That grid generation project is described, with particular emphasis on the global coarse grid.
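
    The elliptic generation step can be illustrated with a stripped-down 2-D smoother in which each interior node relaxes toward the average of its neighbours; the actual work solved 3-D Poisson equations with GRAPE-style inhomogeneous control terms to cluster points near the surface, which the sketch below omits.

      # Sketch: elliptic smoothing of a 2-D structured grid.  Each interior node is relaxed
      # toward the average of its four neighbours (a Laplace-type smoother); boundary nodes
      # stay fixed.  The F-16 work used 3-D Poisson equations with forcing terms, not shown here.
      import numpy as np

      def smooth_grid(x, y, iters=200):
          for _ in range(iters):
              x[1:-1, 1:-1] = 0.25 * (x[2:, 1:-1] + x[:-2, 1:-1] + x[1:-1, 2:] + x[1:-1, :-2])
              y[1:-1, 1:-1] = 0.25 * (y[2:, 1:-1] + y[:-2, 1:-1] + y[1:-1, 2:] + y[1:-1, :-2])
          return x, y

      # Start from an algebraic (sheared) grid; the interior relaxes toward a smooth mapping.
      xi, eta = np.meshgrid(np.linspace(0, 1, 21), np.linspace(0, 1, 21), indexing="ij")
      x0 = xi + 0.2 * eta * (1 - eta)
      y0 = eta
      x_s, y_s = smooth_grid(x0.copy(), y0.copy())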

  20. Final Report from The University of Texas at Austin for DEGAS: Dynamic Global Address Space programming environments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Erez, Mattan; Yelick, Katherine; Sarkar, Vivek

    The Dynamic, Exascale Global Address Space programming environment (DEGAS) project will develop the next generation of programming models and runtime systems to meet the challenges of Exascale computing. Our approach is to provide an efficient and scalable programming model that can be adapted to application needs through the use of dynamic runtime features and domain-specific languages for computational kernels. We address the following technical challenges: Programmability: Rich set of programming constructs based on a Hierarchical Partitioned Global Address Space (HPGAS) model, demonstrated in UPC++. Scalability: Hierarchical locality control, lightweight communication (extended GASNet), and efficient synchronization mechanisms (Phasers). Performance Portability: Just-in-time specialization (SEJITS) for generating hardware-specific code and scheduling libraries for domain-specific adaptive runtimes (Habanero). Energy Efficiency: Communication-optimal code generation to optimize energy efficiency by reducing data movement. Resilience: Containment Domains for flexible, domain-specific resilience, using state capture mechanisms and lightweight, asynchronous recovery mechanisms. Interoperability: Runtime and language interoperability with MPI and OpenMP to encourage broad adoption.

  1. Respiratory motion guided four dimensional cone beam computed tomography: encompassing irregular breathing

    NASA Astrophysics Data System (ADS)

    O'Brien, Ricky T.; Cooper, Benjamin J.; Kipritidis, John; Shieh, Chun-Chien; Keall, Paul J.

    2014-02-01

    Four dimensional cone beam computed tomography (4DCBCT) images suffer from angular under sampling and bunching of projections due to a lack of feedback between the respiratory signal and the acquisition system. To address this problem, respiratory motion guided 4DCBCT (RMG-4DCBCT) regulates the gantry velocity and projection time interval, in response to the patient’s respiratory signal, with the aim of acquiring evenly spaced projections in a number of phase or displacement bins during the respiratory cycle. Our previous study of RMG-4DCBCT was limited to sinusoidal breathing traces. Here we expand on that work to provide a practical algorithm for the case of real patient breathing data. We give a complete description of RMG-4DCBCT including full details on how to implement the algorithms to determine when to move the gantry and when to acquire projections in response to the patient’s respiratory signal. We simulate a realistic working RMG-4DCBCT system using 112 breathing traces from 24 lung cancer patients. Acquisition used phase-based binning and parameter settings typically used on commercial 4DCBCT systems (4 min acquisition time, 1200 projections across 10 respiratory bins), with the acceleration and velocity constraints of current generation linear accelerators. We quantified streaking artefacts and image noise for conventional and RMG-4DCBCT methods by reconstructing projection data selected from an oversampled set of Catphan phantom projections. RMG-4DCBCT allows us to optimally trade-off image quality, acquisition time and image dose. For example, for the same image quality and acquisition time as conventional 4DCBCT approximately half the imaging dose is needed. Alternatively, for the same imaging dose, the image quality as measured by the signal to noise ratio, is improved by 63% on average. C-arm cone beam computed tomography systems, with an acceleration up to 200°/s2, a velocity up to 100°/s and the acquisition of 80 projections per second, allow the image acquisition time to be reduced to below 60 s. We have made considerable progress towards realizing a system to reduce projection clustering in conventional 4DCBCT imaging and hence reduce the imaging dose to the patient.
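
    The phase-based binning that both conventional and RMG-4DCBCT acquisitions rely on can be sketched as follows: detect inhale peaks in the respiratory trace, assign each projection timestamp a phase in [0, 2*pi), and bucket it into one of ten bins. The synthetic trace, timing values and peak-detection settings below are placeholders, and the gantry-control algorithm itself is not shown.

      # Sketch: phase-based binning of projection timestamps against a respiratory trace.
      # Inhale peaks define phase 0/2*pi; each projection falls into one of n_bins bins.
      import numpy as np
      from scipy.signal import find_peaks

      t = np.arange(0.0, 240.0, 0.01)                                        # 4 min trace at 100 Hz
      resp = np.sin(2 * np.pi * t / 4.0) + 0.05 * np.random.randn(t.size)    # ~4 s breathing period

      peaks, _ = find_peaks(resp, height=0.5, distance=200)                  # crude peak detection
      peak_times = t[peaks]

      def phase_at(time, peak_times):
          """Linear phase in [0, 2*pi) between the enclosing pair of inhale peaks."""
          i = np.searchsorted(peak_times, time) - 1
          if i < 0 or i >= len(peak_times) - 1:
              return None                                   # outside the first/last complete cycle
          t0, t1 = peak_times[i], peak_times[i + 1]
          return 2 * np.pi * (time - t0) / (t1 - t0)

      n_bins = 10
      proj_times = np.linspace(5.0, 235.0, 1200)            # nominal 1200 projections over the scan
      bins = []
      for tp in proj_times:
          ph = phase_at(tp, peak_times)
          if ph is not None:
              bins.append(min(int(ph / (2 * np.pi) * n_bins), n_bins - 1))
      print(np.bincount(bins, minlength=n_bins))            # projections landing in each bin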

  2. The iPlant Collaborative: Cyberinfrastructure for Plant Biology.

    PubMed

    Goff, Stephen A; Vaughn, Matthew; McKay, Sheldon; Lyons, Eric; Stapleton, Ann E; Gessler, Damian; Matasci, Naim; Wang, Liya; Hanlon, Matthew; Lenards, Andrew; Muir, Andy; Merchant, Nirav; Lowry, Sonya; Mock, Stephen; Helmke, Matthew; Kubach, Adam; Narro, Martha; Hopkins, Nicole; Micklos, David; Hilgert, Uwe; Gonzales, Michael; Jordan, Chris; Skidmore, Edwin; Dooley, Rion; Cazes, John; McLay, Robert; Lu, Zhenyuan; Pasternak, Shiran; Koesterke, Lars; Piel, William H; Grene, Ruth; Noutsos, Christos; Gendler, Karla; Feng, Xin; Tang, Chunlao; Lent, Monica; Kim, Seung-Jin; Kvilekval, Kristian; Manjunath, B S; Tannen, Val; Stamatakis, Alexandros; Sanderson, Michael; Welch, Stephen M; Cranston, Karen A; Soltis, Pamela; Soltis, Doug; O'Meara, Brian; Ane, Cecile; Brutnell, Tom; Kleibenstein, Daniel J; White, Jeffery W; Leebens-Mack, James; Donoghue, Michael J; Spalding, Edgar P; Vision, Todd J; Myers, Christopher R; Lowenthal, David; Enquist, Brian J; Boyle, Brad; Akoglu, Ali; Andrews, Greg; Ram, Sudha; Ware, Doreen; Stein, Lincoln; Stanzione, Dan

    2011-01-01

    The iPlant Collaborative (iPlant) is a United States National Science Foundation (NSF) funded project that aims to create an innovative, comprehensive, and foundational cyberinfrastructure in support of plant biology research (PSCIC, 2006). iPlant is developing cyberinfrastructure that uniquely enables scientists throughout the diverse fields that comprise plant biology to address Grand Challenges in new ways, to stimulate and facilitate cross-disciplinary research, to promote biology and computer science research interactions, and to train the next generation of scientists on the use of cyberinfrastructure in research and education. Meeting humanity's projected demands for agricultural and forest products and the expectation that natural ecosystems be managed sustainably will require synergies from the application of information technologies. The iPlant cyberinfrastructure design is based on an unprecedented period of research community input, and leverages developments in high-performance computing, data storage, and cyberinfrastructure for the physical sciences. iPlant is an open-source project with application programming interfaces that allow the community to extend the infrastructure to meet its needs. iPlant is sponsoring community-driven workshops addressing specific scientific questions via analysis tool integration and hypothesis testing. These workshops teach researchers how to add bioinformatics tools and/or datasets into the iPlant cyberinfrastructure enabling plant scientists to perform complex analyses on large datasets without the need to master the command-line or high-performance computational services.

  3. The iPlant Collaborative: Cyberinfrastructure for Plant Biology

    PubMed Central

    Goff, Stephen A.; Vaughn, Matthew; McKay, Sheldon; Lyons, Eric; Stapleton, Ann E.; Gessler, Damian; Matasci, Naim; Wang, Liya; Hanlon, Matthew; Lenards, Andrew; Muir, Andy; Merchant, Nirav; Lowry, Sonya; Mock, Stephen; Helmke, Matthew; Kubach, Adam; Narro, Martha; Hopkins, Nicole; Micklos, David; Hilgert, Uwe; Gonzales, Michael; Jordan, Chris; Skidmore, Edwin; Dooley, Rion; Cazes, John; McLay, Robert; Lu, Zhenyuan; Pasternak, Shiran; Koesterke, Lars; Piel, William H.; Grene, Ruth; Noutsos, Christos; Gendler, Karla; Feng, Xin; Tang, Chunlao; Lent, Monica; Kim, Seung-Jin; Kvilekval, Kristian; Manjunath, B. S.; Tannen, Val; Stamatakis, Alexandros; Sanderson, Michael; Welch, Stephen M.; Cranston, Karen A.; Soltis, Pamela; Soltis, Doug; O'Meara, Brian; Ane, Cecile; Brutnell, Tom; Kleibenstein, Daniel J.; White, Jeffery W.; Leebens-Mack, James; Donoghue, Michael J.; Spalding, Edgar P.; Vision, Todd J.; Myers, Christopher R.; Lowenthal, David; Enquist, Brian J.; Boyle, Brad; Akoglu, Ali; Andrews, Greg; Ram, Sudha; Ware, Doreen; Stein, Lincoln; Stanzione, Dan

    2011-01-01

    The iPlant Collaborative (iPlant) is a United States National Science Foundation (NSF) funded project that aims to create an innovative, comprehensive, and foundational cyberinfrastructure in support of plant biology research (PSCIC, 2006). iPlant is developing cyberinfrastructure that uniquely enables scientists throughout the diverse fields that comprise plant biology to address Grand Challenges in new ways, to stimulate and facilitate cross-disciplinary research, to promote biology and computer science research interactions, and to train the next generation of scientists on the use of cyberinfrastructure in research and education. Meeting humanity's projected demands for agricultural and forest products and the expectation that natural ecosystems be managed sustainably will require synergies from the application of information technologies. The iPlant cyberinfrastructure design is based on an unprecedented period of research community input, and leverages developments in high-performance computing, data storage, and cyberinfrastructure for the physical sciences. iPlant is an open-source project with application programming interfaces that allow the community to extend the infrastructure to meet its needs. iPlant is sponsoring community-driven workshops addressing specific scientific questions via analysis tool integration and hypothesis testing. These workshops teach researchers how to add bioinformatics tools and/or datasets into the iPlant cyberinfrastructure enabling plant scientists to perform complex analyses on large datasets without the need to master the command-line or high-performance computational services. PMID:22645531

  4. Using Compilers to Enhance Cryptographic Product Development

    NASA Astrophysics Data System (ADS)

    Bangerter, E.; Barbosa, M.; Bernstein, D.; Damgård, I.; Page, D.; Pagter, J. I.; Sadeghi, A.-R.; Sovio, S.

    Developing high-quality software is hard in the general case, and it is significantly more challenging in the case of cryptographic software. A high degree of new skill and understanding must be learnt and applied without error to avoid vulnerability and inefficiency. This is often beyond the financial, manpower or intellectual resources available. In this paper we present the motivation for the European-funded CACE (Computer Aided Cryptography Engineering) project. The main objective of CACE is to provide engineers (with limited or no expertise in cryptography) with a toolbox that allows them to generate robust and efficient implementations of cryptographic primitives. We also present some preliminary results already obtained in the early stages of this project, and discuss the relevance of the project as perceived by stakeholders in the mobile device arena.

  5. From Speech to Emotional Interaction: EmotiRob Project

    NASA Astrophysics Data System (ADS)

    Le Tallec, Marc; Saint-Aimé, Sébastien; Jost, Céline; Villaneau, Jeanne; Antoine, Jean-Yves; Letellier-Zarshenas, Sabine; Le-Pévédic, Brigitte; Duhaut, Dominique

    This article presents research work done in the domain of nonverbal emotional interaction for the EmotiRob project. It is a component of the MAPH project, the objective of which is to give comfort to vulnerable children and/or those undergoing long-term hospitalisation through the help of an emotional robot companion. It is important to note that we are not trying to reproduce human emotion and behavior, but trying to make a robot emotionally expressive. This paper will present the different hypotheses we have used from understanding to emotional reaction. We begin the article with a presentation of the MAPH and EmotiRob projects. Then, we briefly describe the speech understanding system, the iGrace computational model of emotions, and the integration of dynamic behavior. We conclude with a description of the architecture of Emi, as well as improvements to be made to its next generation.

  6. CGS-MSFSS Project report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Harvey-Collard, Patrick

    2015-10-27

    From January 2015 to July 2015, I was doing research at Sandia National Laboratories in Albuquerque, United States. My work there consisted of performing experimental measurements using Sandia’s unique silicon quantum computing platform. The project is about coupling donor spin quantum bits, or qubits, to quantum dots in a silicon nanostructure based on conventional microchip technology. During the project, I devised a new quantum state readout mechanism that allows better, longer-lived measurement signals. The measurement (or readout) mechanism is key to any qubit architecture. Next, I was able to demonstrate a quantum manipulation of the two-electron spin states of the coupled donor and quantum dot system. This constitutes a breakthrough for donor spin qubits in silicon because it could enable larger systems consisting of many qubits. This project will lead to publications in scientific journals and presentations at international conferences, and it generates exciting new opportunities for manipulating nature at the nanoscale.

  7. Fast fringe pattern phase demodulation using FIR Hilbert transformers

    NASA Astrophysics Data System (ADS)

    Gdeisat, Munther; Burton, David; Lilley, Francis; Arevalillo-Herráez, Miguel

    2016-01-01

    This paper suggests the use of FIR Hilbert transformers to extract the phase of fringe patterns. This method is computationally faster than any known spatial method that produces wrapped phase maps. Also, the algorithm does not require any parameters to be adjusted which are dependent upon the specific fringe pattern that is being processed, or upon the particular setup of the optical fringe projection system that is being used. It is therefore particularly suitable for full algorithmic automation. The accuracy and validity of the suggested method has been tested using both computer-generated and real fringe patterns. This novel algorithm has been proposed for its advantages in terms of computational processing speed as it is the fastest available method to extract the wrapped phase information from a fringe pattern.
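
    A minimal sketch of the idea, assuming scipy's remez-designed FIR Hilbert transformer rather than the authors' exact filter, is shown below: remove the DC term from one fringe row, filter it to obtain the quadrature component, and take the angle of the resulting analytic signal as the wrapped phase. The filter length, carrier frequency and test phase are placeholders.

      # Sketch: wrapped-phase extraction from one fringe-pattern row with an FIR Hilbert
      # transformer designed by scipy.signal.remez (type='hilbert').  All parameters are
      # placeholders, not the values used in the paper.
      import numpy as np
      from scipy.signal import remez

      n = 1024
      x = np.arange(n)
      true_phase = 3.0 * np.sin(2 * np.pi * x / n)                     # synthetic object phase
      fringe = 128 + 100 * np.cos(2 * np.pi * 0.05 * x + true_phase)   # carrier ~0.05 cycles/pixel

      h = remez(101, [0.02, 0.48], [1.0], type='hilbert')              # odd-length FIR Hilbert transformer
      ac = fringe - fringe.mean()                                      # crude DC removal
      quadrature = np.convolve(ac, h, mode='same')                     # 'same' compensates the group delay
      wrapped = np.angle(ac + 1j * quadrature)                         # wrapped total phase (carrier + object),
                                                                       # up to the filter's sign convention
      print(wrapped[:5])   # values in (-pi, pi]; edge samples are less reliable due to filter transients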

  8. Software design studies emphasizing Project LOGOS

    NASA Technical Reports Server (NTRS)

    1972-01-01

    The results of a research project on the development of computer software are presented. Research funds of $200,000 were expended over a three year period for software design and projects in connection with Project LOGOS (computer-aided design and certification of computing systems). Abstracts of theses prepared during the project are provided.

  9. High resolution global climate modelling; the UPSCALE project, a large simulation campaign

    NASA Astrophysics Data System (ADS)

    Mizielinski, M. S.; Roberts, M. J.; Vidale, P. L.; Schiemann, R.; Demory, M.-E.; Strachan, J.; Edwards, T.; Stephens, A.; Lawrence, B. N.; Pritchard, M.; Chiu, P.; Iwi, A.; Churchill, J.; del Cano Novales, C.; Kettleborough, J.; Roseblade, W.; Selwood, P.; Foster, M.; Glover, M.; Malcolm, A.

    2014-01-01

    The UPSCALE (UK on PRACE: weather-resolving Simulations of Climate for globAL Environmental risk) project constructed and ran an ensemble of HadGEM3 (Hadley centre Global Environment Model 3) atmosphere-only global climate simulations over the period 1985-2011, at resolutions of N512 (25 km), N216 (60 km) and N96 (130 km) as used in current global weather forecasting, seasonal prediction and climate modelling respectively. Alongside these present climate simulations a parallel ensemble looking at extremes of future climate was run, using a time-slice methodology to consider conditions at the end of this century. These simulations were primarily performed using a 144 million core hour, single year grant of computing time from PRACE (the Partnership for Advanced Computing in Europe) in 2012, with additional resources supplied by the Natural Environmental Research Council (NERC) and the Met Office. Almost 400 terabytes of simulation data were generated on the HERMIT supercomputer at the high performance computing center Stuttgart (HLRS), and transferred to the JASMIN super-data cluster provided by the Science and Technology Facilities Council Centre for Data Archival (STFC CEDA) for analysis and storage. In this paper we describe the implementation of the project, present the technical challenges in terms of optimisation, data output, transfer and storage that such a project involves and include details of the model configuration and the composition of the UPSCALE dataset. This dataset is available for scientific analysis to allow assessment of the value of model resolution in both present and potential future climate conditions.

  10. High-resolution global climate modelling: the UPSCALE project, a large-simulation campaign

    NASA Astrophysics Data System (ADS)

    Mizielinski, M. S.; Roberts, M. J.; Vidale, P. L.; Schiemann, R.; Demory, M.-E.; Strachan, J.; Edwards, T.; Stephens, A.; Lawrence, B. N.; Pritchard, M.; Chiu, P.; Iwi, A.; Churchill, J.; del Cano Novales, C.; Kettleborough, J.; Roseblade, W.; Selwood, P.; Foster, M.; Glover, M.; Malcolm, A.

    2014-08-01

    The UPSCALE (UK on PRACE: weather-resolving Simulations of Climate for globAL Environmental risk) project constructed and ran an ensemble of HadGEM3 (Hadley Centre Global Environment Model 3) atmosphere-only global climate simulations over the period 1985-2011, at resolutions of N512 (25 km), N216 (60 km) and N96 (130 km) as used in current global weather forecasting, seasonal prediction and climate modelling respectively. Alongside these present climate simulations a parallel ensemble looking at extremes of future climate was run, using a time-slice methodology to consider conditions at the end of this century. These simulations were primarily performed using a 144 million core hour, single year grant of computing time from PRACE (the Partnership for Advanced Computing in Europe) in 2012, with additional resources supplied by the Natural Environment Research Council (NERC) and the Met Office. Almost 400 terabytes of simulation data were generated on the HERMIT supercomputer at the High Performance Computing Center Stuttgart (HLRS), and transferred to the JASMIN super-data cluster provided by the Science and Technology Facilities Council Centre for Data Archival (STFC CEDA) for analysis and storage. In this paper we describe the implementation of the project, present the technical challenges in terms of optimisation, data output, transfer and storage that such a project involves and include details of the model configuration and the composition of the UPSCALE data set. This data set is available for scientific analysis to allow assessment of the value of model resolution in both present and potential future climate conditions.

  11. Promoting Interests in Atmospheric Science at a Liberal Arts Institution

    NASA Astrophysics Data System (ADS)

    Roussev, S.; Sherengos, P. M.; Limpasuvan, V.; Xue, M.

    2007-12-01

    Coastal Carolina University (CCU) students in Computer Science participated in a project to set up an operational weather forecast for the local community. The project involved the construction of two computing clusters and the automation of daily forecasting. Funded by NSF-MRI, two high-performance clusters were successfully established to run the University of Oklahoma's Advanced Regional Prediction System (ARPS). Daily weather predictions are made over South Carolina and North Carolina at 3-km horizontal resolution (roughly 1.9 miles) using initial and boundary condition data provided by UNIDATA. At this high resolution, the model is cloud-resolving, thus providing a detailed picture of heavy thunderstorms and precipitation. Forecast results are displayed on CCU's website (https://marc.coastal.edu/HPC) to complement observations at the National Weather Service in Wilmington N.C. Present efforts include providing forecasts at 1-km resolution (or finer), comparisons with other models such as the Weather Research and Forecasting (WRF) model, and the examination of local phenomena (such as waterspouts and tornadoes). Through these activities the students learn about shell scripting, cluster operating systems, and web design. More importantly, students are introduced to Atmospheric Science, the processes involved in making weather forecasts, and the interpretation of their forecasts. Simulations generated by the forecasts will be integrated into the contents of CCU courses such as Fluid Dynamics, Atmospheric Sciences, Atmospheric Physics, and Remote Sensing. Operated jointly between the departments of Applied Physics and Computer Science, the clusters are expected to be used by CCU faculty and students for future research and inquiry-based projects in Computer Science, Applied Physics, and Marine Science.

  12. A Comprehensive Infrastructure for Big Data in Cancer Research: Accelerating Cancer Research and Precision Medicine

    PubMed Central

    Hinkson, Izumi V.; Davidsen, Tanja M.; Klemm, Juli D.; Chandramouliswaran, Ishwar; Kerlavage, Anthony R.; Kibbe, Warren A.

    2017-01-01

    Advancements in next-generation sequencing and other -omics technologies are accelerating the detailed molecular characterization of individual patient tumors, and driving the evolution of precision medicine. Cancer is no longer considered a single disease, but rather, a diverse array of diseases wherein each patient has a unique collection of germline variants and somatic mutations. Molecular profiling of patient-derived samples has led to a data explosion that could help us understand the contributions of environment and germline to risk, therapeutic response, and outcome. To maximize the value of these data, an interdisciplinary approach is paramount. The National Cancer Institute (NCI) has initiated multiple projects to characterize tumor samples using multi-omic approaches. These projects harness the expertise of clinicians, biologists, computer scientists, and software engineers to investigate cancer biology and therapeutic response in multidisciplinary teams. Petabytes of cancer genomic, transcriptomic, epigenomic, proteomic, and imaging data have been generated by these projects. To address the data analysis challenges associated with these large datasets, the NCI has sponsored the development of the Genomic Data Commons (GDC) and three Cloud Resources. The GDC ensures data and metadata quality, ingests and harmonizes genomic data, and securely redistributes the data. During its pilot phase, the Cloud Resources tested multiple cloud-based approaches for enhancing data access, collaboration, computational scalability, resource democratization, and reproducibility. These NCI-led efforts are continuously being refined to better support open data practices and precision oncology, and to serve as building blocks of the NCI Cancer Research Data Commons. PMID:28983483

  13. Direct Partial Oxidations Using Molecular Oxygen - Final Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kemp, Richard

    In 2006, Richard A. Kemp (University of New Mexico) and Karen I. Goldberg (University of Washington) formed a team and began to investigate new strategies to accomplish direct selective aerobic oxidations, with a particular emphasis on the epoxidation of propylene and higher olefins. This DOE-BES funded project was renewed twice and concluded after a no-cost extension earlier this year. Multiple novel strategies involving homogeneous catalyst systems were initiated and investigated during the award. Important fundamental understanding and insight concerning requirements for promotion of aerobic olefin epoxidation was generated. During the tenure of this project, new knowledge was generated concerning the synthesis, characterization and aerobic reactivity of metal hydrides and hydroxides. Key results describing synthetic strategies and optimization of the preparation of mononuclear late metal hydrides were published. The team reported the first example of O2 insertion into a Pd-H bond, a reaction which had been proposed in the literature but never previously observed. Our experimental investigation of the mechanism was later followed by computational work, and a description of what is now referred to as the Hydrogen Atom Abstraction (HAA) pathway for this reaction has been widely accepted in the community. After investigation of many other late metal hydrides, both experimentally and computationally, the team put together a chapter that included a description of key contributing factors that allow reaction by the HAA mechanism. A brief sampling of other classic papers from our project includes hydrogenolysis reactions of late metal hydroxide and alkoxide complexes, the synthesis of nickel-hydrides, and the involvement of hemilabile ligands in promoting new reaction pathways.

  14. Desktop Computing Integration Project

    NASA Technical Reports Server (NTRS)

    Tureman, Robert L., Jr.

    1992-01-01

    The Desktop Computing Integration Project for the Human Resources Management Division (HRMD) of LaRC was designed to help division personnel use personal computing resources to perform job tasks. The three goals of the project were to involve HRMD personnel in desktop computing, link mainframe data to desktop capabilities, and to estimate training needs for the division. The project resulted in increased usage of personal computers by Awards specialists, an increased awareness of LaRC resources to help perform tasks, and personal computer output that was used in presentation of information to center personnel. In addition, the necessary skills for HRMD personal computer users were identified. The Awards Office was chosen for the project because of the consistency of their data requests and the desire of employees in that area to use the personal computer.

  15. Final Report for Project "A high-throughput pipeline for mapping inter-species interactions and metabolic synergy relevant to next-generation biofuel production"

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Segre, Daniel; Marx, Christopher J.; Northen, Trent

    The goal of our project was to implement a pipeline for the systematic, computationally-driven study and optimization of microbial interactions and their effect on lignocellulose degradation and biofuel production. We specifically sought to design and construct artificial microbial consortia that could collectively degrade lignocellulose from plant biomass, and produce precursors of energy-rich biofuels. This project fits into the bigger picture goal of helping identify a sustainable strategy for the production of energy-rich biofuels that would satisfy the existing energy constraints and demand of our society. Based on the observation that complex natural microbial communities tend to be metabolically efficient and ecologically robust, we pursued the study of a microbial system in which the desired engineering function is achieved through division of labor across multiple microbial species. Our approach was aimed at bypassing the complexity of natural communities by establishing a rational approach to design small synthetic microbial consortia. Towards this goal, we combined multiple approaches, including computer modeling of ecosystem-level microbial metabolism, mass spectrometry of metabolites, genetic engineering, and experimental evolution. The microbial production of biofuels from lignocellulose is a complex, multi-step process. Microbial consortia are an ideal approach to consolidated bioprocessing: a community of microorganisms performs a wide variety of functions more efficiently and is more resilient to environmental perturbations than a microbial monoculture. Each organism we chose for this project addresses a specific challenge: lignin degradation (Pseudomonas putida); (hemi)cellulose degradation (Cellulomonas fimi); lignin degradation product demethoxylation (Methylobacterium spp); generation of biofuel lipid precursors (Yarrowia lipolytica). These organisms are genetically tractable, aerobic, and have been used in biotechnological applications. Throughout the project, we have used mass spectrometry to characterize and measure the metabolic inputs and outputs of each of these consortium members, providing valuable information for model refinement, and enabling the establishment of metabolism-mediated interactions. In addition to lignocellulose degradation, we have started addressing the challenge of removing metabolites (e.g. formaldehyde) produced by the demethoxylation of lignin monomers, which can otherwise inhibit microbial growth due to their toxicity. On the computational side, we have implemented genome-scale models for all consortium members, based on KBase reconstructions and literature curation, and we studied small consortia and their properties. Overall, our project has identified a complex landscape of interaction types and metabolic processes relevant to community-level functions, illustrating the challenges and opportunities of microbial community engineering for the transformation of biomass into bioproducts.

  16. Implementation of a Proposed System for Automated Microcode Generation.

    DTIC Science & Technology

    1984-12-01

    Marcia Elaine Provance, Naval Postgraduate... The proposed microprogramming system is organized around a series of menus which are presented to a microprogrammer so that she can build... organization is not affected. A computer can also respond more easily to new performance demands and problem solutions. A richer or a larger instruction set

  17. Design for the Maintainer: Projecting Maintenance Performance from Design Characteristics.

    DTIC Science & Technology

    1981-07-01

    of Kahneman and Tversky (Tversky & Kahneman, 1974; Kahneman & Tversky, 1979). They have observed some general principles to which human decision-makers tend to adhere. The first of these is the "representativeness heuristic". According to this principle, the question "will event A be generated by process B?" will be decided affirmatively to the extent that event A resembles process B. According to this principle, if failure in a computer

  18. ML-o-Scope: A Diagnostic Visualization System for Deep Machine Learning Pipelines

    DTIC Science & Technology

    2014-05-16

    ML-o-scope: a diagnostic visualization system for deep machine learning pipelines. Daniel Bruckner, Electrical Engineering and Computer Sciences... the system as a support for tuning large-scale object-classification pipelines. 1 Introduction: A new generation of pipelined machine learning models

  19. Control Improvisation with Application to Music

    DTIC Science & Technology

    2013-11-04

    Control Improvisation with Application to Music. Alexandre Donze, Sophie Libkind, Sanjit A. Seshia, David Wessel, Electrical Engineering and Computer... domain of music. More specifically, we consider the scenario of generating a monophonic Jazz melody (solo) on a given song harmonization. The music is

  20. Employing Deceptive Dynamic Network Topology Through Software-Defined Networking

    DTIC Science & Technology

    2014-03-01

    manage economies, banking, and businesses, to the way we gather intelligence and militaries wage war. With computer networks and the Internet, we have seen... space, along with other generated statistics, similar to that performed by the Ant Census project. As we have shown, there is an extensive and diverse... calculated RTT for each probe. In the ping statistics, we are presented with the details of probes sent and responses received, and the calculated packet loss

  1. Genome Improvement at JGI-HAGSC

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Grimwood, Jane; Schmutz, Jeremy J.; Myers, Richard M.

    Since the completion of the sequencing of the human genome, the Joint Genome Institute (JGI) has rapidly expanded its scientific goals in several DOE mission-relevant areas. At the JGI-HAGSC, we have kept pace with this rapid expansion of projects with our focus on assessing, assembling, improving and finishing eukaryotic whole genome shotgun (WGS) projects for which the shotgun sequence is generated at the Production Genomic Facility (JGI-PGF). We follow this by combining the draft WGS with genomic resources generated at JGI-HAGSC or in collaborator laboratories (including BAC end sequences, genetic maps and FLcDNA sequences) to produce an improved draft sequence. For eukaryotic genomes important to the DOE mission, we then add further information from directed experiments to produce reference genomic sequences that are publicly available for any scientific researcher. Also, we have continued our program for producing BAC-based finished sequence, both for adding information to JGI genome projects and for small BAC-based sequencing projects proposed through any of the JGI sequencing programs. We have now built our computational expertise in WGS assembly and analysis and have moved eukaryotic genome assembly from the JGI-PGF to JGI-HAGSC. We have concentrated our assembly development work on large plant genomes and complex fungal and algal genomes.

  2. Advances in Toxico-Cheminformatics: Supporting a New ...

    EPA Pesticide Factsheets

    EPA’s National Center for Computational Toxicology is building capabilities to support a new paradigm for toxicity screening and prediction through the harnessing of legacy toxicity data, creation of data linkages, and generation of new high-throughput screening (HTS) data. The DSSTox project is working to improve public access to quality structure-annotated chemical toxicity information in less summarized forms than traditionally employed in SAR modeling, and in ways that facilitate both data-mining and read-across. Both DSSTox Structure-Files and the dedicated on-line DSSTox Structure-Browser are enabling seamless structure-based searching and linkages to and from previously isolated, chemically indexed public toxicity data resources (e.g., NTP, EPA IRIS, CPDB). Most recently, structure-enabled search capabilities have been extended to chemical exposure-related microarray experiments in the public EBI Array Express database, additionally linking this resource to the NIEHS CEBS toxicogenomics database. The public DSSTox chemical and bioassay inventory has been recently integrated into PubChem, allowing a user to take full advantage of PubChem structure-activity and bioassay clustering features. The DSSTox project is providing cheminformatics support for EPA’s ToxCastTM project, as well as supporting collaborations with the National Toxicology Program (NTP) HTS and the NIH Chemical Genomics Center (NCGC). Phase I of the ToxCastTM project is generating HT

  3. Computer animation for minimally invasive surgery: computer system requirements and preferred implementations

    NASA Astrophysics Data System (ADS)

    Pieper, Steven D.; McKenna, Michael; Chen, David; McDowall, Ian E.

    1994-04-01

    We are interested in the application of computer animation to surgery. Our current project, a navigation and visualization tool for knee arthroscopy, relies on real-time computer graphics and the human interface technologies associated with virtual reality. We believe that this new combination of techniques will lead to improved surgical outcomes and decreased health care costs. To meet these expectations in the medical field, the system must be safe, usable, and cost-effective. In this paper, we outline some of the most important hardware and software specifications in the areas of video input and output, spatial tracking, stereoscopic displays, computer graphics models and libraries, mass storage and network interfaces, and operating systems. Since this is a fairly new combination of technologies and a new application, our justification for our specifications are drawn from the current generation of surgical technology and by analogy to other fields where virtual reality technology has been more extensively applied and studied.

  4. 16th IHIW: analysis of HLA population data, with updated results for 1996 to 2012 workshop data (AHPD project report).

    PubMed

    Riccio, M E; Buhler, S; Nunes, J M; Vangenot, C; Cuénod, M; Currat, M; Di, D; Andreani, M; Boldyreva, M; Chambers, G; Chernova, M; Chiaroni, J; Darke, C; Di Cristofaro, J; Dubois, V; Dunn, P; Edinur, H A; Elamin, N; Eliaou, J-F; Grubic, Z; Jaatinen, T; Kanga, U; Kervaire, B; Kolesar, L; Kunachiwa, W; Lokki, M L; Mehra, N; Nicoloso, G; Paakkanen, R; Voniatis, D Papaioannou; Papasteriades, C; Poli, F; Richard, L; Romón Alonso, I; Slavčev, A; Sulcebe, G; Suslova, T; Testi, M; Tiercy, J-M; Varnavidou, A; Vidan-Jeras, B; Wennerström, A; Sanchez-Mazas, A

    2013-02-01

    We present here the results of the Analysis of HLA Population Data (AHPD) project of the 16th International HLA and Immunogenetics Workshop (16IHIW) held in Liverpool in May-June 2012. Thanks to the collaboration of 25 laboratories from 18 different countries, HLA genotypic data for 59 new population samples (either well-defined populations or donor registry samples) were gathered and 55 were analysed statistically following HLA-NET recommendations. The new data included, among others, large sets of well-defined populations from north-east Europe and West Asia, as well as many donor registry data from European countries. The Gene[rate] computer tools were combined to create a Gene[rate] computer pipeline to automatically (i) estimate allele frequencies by an expectation-maximization algorithm accommodating ambiguities, (ii) estimate heterozygosity, (iii) test for Hardy-Weinberg equilibrium (HWE), (iv) test for selective neutrality, (v) generate frequency graphs and summary statistics for each sample at each locus and (vi) plot multidimensional scaling (MDS) analyses comparing the new samples with previous IHIW data. Intrapopulation analyses show that HWE is rarely rejected, while neutrality tests often indicate a significant excess of heterozygotes compared with neutral expectations. The comparison of the 16IHIW AHPD data with data collected during previous workshops (12th-15th) shows that geography is an excellent predictor of HLA genetic differentiations for HLA-A, -B and -DRB1 loci but not for HLA-DQ, whose patterns are probably more influenced by natural selection. In Europe, HLA genetic variation clearly follows a north to south-east axis despite a low level of differentiation between European, North African and West Asian populations. Pacific populations are genetically close to Austronesian-speaking South-East Asian and Taiwanese populations, in agreement with current theories on the peopling of Oceania. Thanks to this project, HLA genetic variation is more clearly defined worldwide and better interpreted in relation to human peopling history and HLA molecular evolution. © 2012 Blackwell Publishing Ltd.
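
    The frequency-estimation and HWE steps of such a pipeline can be illustrated, for the unambiguous-genotype case, by direct gene counting followed by a chi-square comparison of observed and HWE-expected genotype counts. The sketch below uses an invented toy sample and omits the EM-based ambiguity handling that the Gene[rate] tools actually perform.

      # Sketch: allele frequencies by gene counting and a simple Hardy-Weinberg chi-square
      # for one locus with unambiguous genotypes.  Toy data; the Gene[rate] pipeline instead
      # uses an EM algorithm so that ambiguous HLA typings can be accommodated.
      from collections import Counter
      from itertools import combinations_with_replacement
      from scipy.stats import chi2

      genotypes = [("A*01", "A*02"), ("A*01", "A*01"), ("A*02", "A*03"),
                   ("A*02", "A*02"), ("A*01", "A*03"), ("A*03", "A*03")]   # toy sample

      n = len(genotypes)
      allele_counts = Counter(a for g in genotypes for a in g)
      freq = {a: c / (2 * n) for a, c in allele_counts.items()}            # gene-counting estimates

      observed = Counter(tuple(sorted(g)) for g in genotypes)
      chi_sq = 0.0
      for a1, a2 in combinations_with_replacement(sorted(freq), 2):
          expected = (freq[a1] ** 2 if a1 == a2 else 2 * freq[a1] * freq[a2]) * n
          chi_sq += (observed.get((a1, a2), 0) - expected) ** 2 / expected

      k = len(freq)
      df = k * (k - 1) // 2        # k(k+1)/2 genotype classes - 1 - (k-1) estimated frequencies
      print(freq)
      print("chi-square:", round(chi_sq, 2), "df:", df, "p:", round(chi2.sf(chi_sq, df), 3))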

  5. Strategic Computing Computer Vision: Taking Image Understanding To The Next Plateau

    NASA Astrophysics Data System (ADS)

    Simpson, R. L., Jr.

    1987-06-01

    The overall objective of the Strategic Computing (SC) Program of the Defense Advanced Research Projects Agency (DARPA) is to develop and demonstrate a new generation of machine intelligence technology which can form the basis for more capable military systems in the future and also maintain a position of world leadership for the US in computer technology. Begun in 1983, SC represents a focused research strategy for accelerating the evolution of new technology and its rapid prototyping in realistic military contexts. Among the very ambitious demonstration prototypes being developed within the SC Program are: 1) the Pilot's Associate, which will aid the pilot in route planning, aerial target prioritization, evasion of missile threats, and aircraft emergency safety procedures during flight; 2) the AirLand Battle Management program (ALBM), a battle management project for the Army that is just getting started and will use knowledge-based systems technology to assist in the generation and evaluation of tactical options and plans at the Corps level; 3) a more established battle management program for the Navy, the Fleet Command Center Battle Management Program (FCCBMP) at Pearl Harbor, which is employing knowledge-based systems and natural language technology in an evolutionary testbed situated in an operational command center to demonstrate and evaluate intelligent decision aids that can assist in the evaluation of fleet readiness and explore alternatives during contingencies; and 4) the Autonomous Land Vehicle (ALV), which integrates in a major robotic testbed the technologies for dynamic image understanding and knowledge-based route planning with replanning during execution, hosted on new advanced parallel architectures. The goal of the Strategic Computing computer vision technology base (SCVision) is to develop generic technology that will enable the construction of complete, robust, high-performance image understanding systems to support a wide range of DoD applications. Possible applications include autonomous vehicle navigation, photointerpretation, smart weapons, and robotic manipulation. This paper provides an overview of the technical and program management plans being used in evolving this critical national technology.

  6. Leveraging the Power of High Performance Computing for Next Generation Sequencing Data Analysis: Tricks and Twists from a High Throughput Exome Workflow

    PubMed Central

    Wonczak, Stephan; Thiele, Holger; Nieroda, Lech; Jabbari, Kamel; Borowski, Stefan; Sinha, Vishal; Gunia, Wilfried; Lang, Ulrich; Achter, Viktor; Nürnberg, Peter

    2015-01-01

    Next generation sequencing (NGS) has been a great success and is now a standard method of research in the life sciences. With this technology, dozens of whole genomes or hundreds of exomes can be sequenced in rather short time, producing huge amounts of data. Complex bioinformatics analyses are required to turn these data into scientific findings. In order to run these analyses fast, automated workflows implemented on high performance computers are state of the art. While providing sufficient compute power and storage to meet the NGS data challenge, high performance computing (HPC) systems require special care when utilized for high throughput processing. This is especially true if the HPC system is shared by different users. Here, stability, robustness and maintainability are as important for automated workflows as speed and throughput. To achieve all of these aims, dedicated solutions have to be developed. In this paper, we present the tricks and twists that we utilized in the implementation of our exome data processing workflow. It may serve as a guideline for other high throughput data analysis projects using a similar infrastructure. The code implementing our solutions is provided in the supporting information files. PMID:25942438
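
    The paper's concrete solutions are in its supporting files; as a generic illustration of the kind of robustness trick it describes, the Python sketch below runs one workflow stage idempotently, writes output atomically, and retries with back-off, which matters on a shared cluster. The samtools command in the usage comment is a hypothetical example step, not the paper's workflow.

```python
import os, subprocess, tempfile, time

def run_step(cmd, final_output, retries=3, wait=60):
    """Run one workflow stage robustly: skip if already done, write to a
    temp file first, and rename only on success, so downstream steps
    never see partial output."""
    if os.path.exists(final_output):                 # idempotent restart
        return
    out_dir = os.path.dirname(os.path.abspath(final_output))
    for attempt in range(1, retries + 1):
        fd, tmp = tempfile.mkstemp(dir=out_dir)
        os.close(fd)
        try:
            subprocess.run(cmd + [tmp], check=True)  # step writes to tmp
            os.replace(tmp, final_output)            # atomic rename on POSIX
            return
        except subprocess.CalledProcessError:
            os.unlink(tmp)
            if attempt == retries:
                raise
            time.sleep(wait * attempt)               # back off, then retry

# hypothetical usage: sort an alignment, producing sorted.bam atomically
# run_step(["samtools", "sort", "input.bam", "-o"], "sorted.bam")
```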

  7. Non-Adiabatic Molecular Dynamics Methods for Materials Discovery

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Furche, Filipp; Parker, Shane M.; Muuronen, Mikko J.

    2017-04-04

    The flow of radiative energy in light-driven materials such as photosensitizer dyes or photocatalysts is governed by non-adiabatic transitions between electronic states and cannot be described within the Born-Oppenheimer approximation commonly used in electronic structure theory. The non-adiabatic molecular dynamics (NAMD) methods based on Tully surface hopping and time-dependent density functional theory developed in this project have greatly extended the range of molecular materials that can be tackled by NAMD simulations. New algorithms to compute molecular excited state and response properties efficiently were developed. Fundamental limitations of common non-linear response methods were discovered and characterized. Methods for accurate computations of vibronic spectra of materials such as black absorbers were developed and applied. It was shown that open-shell TDDFT methods capture bond breaking in NAMD simulations, a longstanding challenge for single-reference molecular dynamics simulations. The methods developed in this project were applied to study the photodissociation of acetaldehyde and revealed that non-adiabatic effects are experimentally observable in fragment kinetic energy distributions. Finally, the project enabled the first detailed NAMD simulations of photocatalytic water oxidation by titania nanoclusters, uncovering the mechanism of this fundamentally important reaction for fuel generation and storage.

  8. OpenSim Model Improvements to Support High Joint Angle Resistive Exercising

    NASA Technical Reports Server (NTRS)

    Gallo, Christopher; Thompson, William; Lewandowski, Beth; Humphreys, Brad

    2016-01-01

    Long duration space travel to Mars or to an asteroid will expose astronauts to extended periods of reduced gravity. Since gravity is not present to aid loading, astronauts will use resistive and aerobic exercise regimes for the duration of the space flight to minimize the loss of bone density, muscle mass and aerobic capacity that occurs during exposure to a reduced gravity environment. Unlike the International Space Station (ISS), the area available for an exercise device in the next generation of spacecraft is limited. Therefore, compact resistance exercise device prototypes are being developed. The Advanced Resistive Exercise Device (ARED) currently on the ISS is being used as a benchmark for the functional performance of these new devices. Rigorous testing of these proposed devices in space flight is difficult, so computational modeling provides an estimate of the muscle forces and joint loads during exercise, giving insight into their efficacy in protecting the musculoskeletal health of astronauts. The NASA Digital Astronaut Project (DAP) is supporting the Advanced Exercise Concepts (AEC) Project, the Exercise Physiology and Countermeasures (ExPC) project and National Space Biomedical Research Institute (NSBRI)-funded researchers by developing computational models of exercising with these new advanced exercise device concepts.

  9. Final Report: MaRSPlus Sensor System Electrical Cable Management and Distributed Motor Control Computer Interface

    NASA Technical Reports Server (NTRS)

    Reil, Robin

    2011-01-01

    The success of JPL's Next Generation Imaging Spectrometer (NGIS) in Earth remote sensing has inspired a follow-on instrument project, the MaRSPlus Sensor System (MSS). One of JPL's responsibilities in the MSS project involves updating the documentation from the previous JPL airborne imagers to provide all the information necessary for an outside customer to operate the instrument independently. As part of this documentation update, I created detailed electrical cabling diagrams to provide JPL technicians with clear and concise build instructions and a database to track the status of cables from order to build to delivery. Simultaneously, a distributed motor control system is being developed for potential use on the proposed 2018 Mars rover mission. This system would significantly reduce the mass necessary for rover motor control, making more mass space available to other important spacecraft systems. The current stage of the project consists of a desktop computer talking to a single "cold box" unit containing the electronics to drive a motor. In order to test the electronics, I developed a graphical user interface (GUI) using MATLAB to allow a user to send simple commands to the cold box and display the responses received in a user-friendly format.

  10. Feasibility study for application of the compressed-sensing framework to interior computed tomography (ICT) for low-dose, high-accurate dental x-ray imaging

    NASA Astrophysics Data System (ADS)

    Je, U. K.; Cho, H. M.; Cho, H. S.; Park, Y. O.; Park, C. K.; Lim, H. W.; Kim, K. S.; Kim, G. A.; Park, S. Y.; Woo, T. H.; Choi, S. I.

    2016-02-01

    In this paper, we propose a new, next-generation type of CT examination, the so-called Interior Computed Tomography (ICT), which may lead to dose reduction to the patient outside the target region-of-interest (ROI) in dental x-ray imaging. Here an x-ray beam from each projection position covers only a relatively small ROI containing the diagnostic target within the examined structure, yielding imaging benefits such as reduced scatter, lower system cost, and lower imaging dose. We considered the compressed-sensing (CS) framework, rather than common filtered-backprojection (FBP)-based algorithms, for more accurate ICT reconstruction. We implemented a CS-based ICT algorithm and performed a systematic simulation to investigate the imaging characteristics. Simulation conditions with two ROI ratios of 0.28 and 0.14 between the target and the whole phantom sizes and four projection numbers of 360, 180, 90, and 45 were tested. We successfully reconstructed ICT images of substantially high image quality by using the CS framework even with few-view projection data, while still preserving sharp edges in the images.
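
    As a rough illustration of CS-style few-view reconstruction, the Python sketch below minimizes a least-squares data term plus an anisotropic total-variation penalty by subgradient descent. The small random system matrix is a toy stand-in for a real fan-beam ICT geometry, and this is a generic sketch, not the authors' algorithm.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 16                                   # image is n x n
x_true = np.zeros((n, n)); x_true[4:12, 5:10] = 1.0
A = rng.random((8 * n, n * n))           # toy few-view system matrix (underdetermined)
b = A @ x_true.ravel()                   # simulated projection data

def tv_subgrad(img):
    """Subgradient of anisotropic total variation, -div(sign(grad))."""
    g = np.zeros_like(img)
    dx = np.sign(np.diff(img, axis=0))
    dy = np.sign(np.diff(img, axis=1))
    g[:-1, :] -= dx; g[1:, :] += dx
    g[:, :-1] -= dy; g[:, 1:] += dy
    return g

x = np.zeros(n * n)
step, lam = 1e-5, 0.1
for _ in range(500):
    grad = 2 * A.T @ (A @ x - b) + lam * tv_subgrad(x.reshape(n, n)).ravel()
    x = np.clip(x - step * grad, 0, None)   # gradient step + nonnegativity
```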

  11. High Performance GPU-Based Fourier Volume Rendering.

    PubMed

    Abdellah, Marwan; Eldeib, Ayman; Sharawi, Amr

    2015-01-01

    Fourier volume rendering (FVR) is a significant visualization technique that has been used widely in digital radiography. As a result of its O(N^2 log N) time complexity, it provides a faster alternative to spatial domain volume rendering algorithms, which are O(N^3) computationally complex. Relying on the Fourier projection-slice theorem, this technique operates on the spectral representation of a 3D volume instead of processing its spatial representation to generate attenuation-only projections that look like X-ray radiographs. Due to the rapid evolution of its underlying architecture, the graphics processing unit (GPU) became an attractive platform that can deliver enormous raw computational power compared to the central processing unit (CPU) on a per-dollar basis. The introduction of the compute unified device architecture (CUDA) technology enables embarrassingly parallel algorithms to run efficiently on CUDA-capable GPU architectures. In this work, a high-performance GPU-accelerated implementation of the FVR pipeline on CUDA-enabled GPUs is presented. This implementation achieves a speed-up of 117x over a single-threaded hybrid CPU-GPU implementation by executing the rendering pipeline entirely on recent GPU architectures.
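
    The projection-slice theorem that FVR relies on can be verified in a few lines of Python: the projection of a volume along z equals the inverse 2D FFT of the kz = 0 plane of its 3D FFT. This is a CPU-only sketch of the underlying identity, not the paper's CUDA pipeline.

```python
import numpy as np

vol = np.random.rand(64, 64, 64)                   # toy 3D volume

proj_spatial = vol.sum(axis=2)                     # direct line integrals along z
central_slice = np.fft.fftn(vol)[:, :, 0]          # kz = 0 plane of the 3D spectrum
proj_fourier = np.fft.ifft2(central_slice).real    # back to the image domain

# identical up to floating-point rounding
assert np.allclose(proj_spatial, proj_fourier)
```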

  12. Knowledge Based Systems: A Critical Survey of Major Concepts, Issues, and Techniques. M.S. Thesis Final Report, 1 Jul. 1985 - 31 Dec. 1987

    NASA Technical Reports Server (NTRS)

    Dominick, Wayne D. (Editor); Kavi, Srinu

    1984-01-01

    This Working Paper Series entry presents a detailed survey of knowledge based systems. After being in a relatively dormant state for many years, only recently is Artificial Intelligence (AI) - that branch of computer science that attempts to have machines emulate intelligent behavior - accomplishing practical results. Most of these results can be attributed to the design and use of Knowledge-Based Systems, KBSs (or expert systems) - problem solving computer programs that can reach a level of performance comparable to that of a human expert in some specialized problem domain. These systems can act as a consultant for various requirements like medical diagnosis, military threat analysis, project risk assessment, etc. These systems possess knowledge to enable them to make intelligent decisions. They are, however, not meant to replace the human specialists in any particular domain. A critical survey of recent work in interactive KBSs is reported. A case study (MYCIN) of a KBS, a list of existing KBSs, and an introduction to the Japanese Fifth Generation Computer Project are provided as appendices. Finally, an extensive set of KBS-related references is provided at the end of the report.

  13. Highlights of X-Stack ExM Deliverable Swift/T

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wozniak, Justin M.

    Swift/T is a key success from the ExM (System support for extreme-scale, many-task applications) X-Stack project, which proposed to use concurrent dataflow as an innovative programming model to exploit extreme parallelism in exascale computers. The Swift/T component of the project reimplemented the Swift language from scratch to allow applications that compose scientific modules together to be built and run on available petascale computers (Blue Gene, Cray). Swift/T does this via a new compiler and runtime that generates and executes the application as an MPI program. We assume that mission-critical emerging exascale applications will be composed as scalable applications using existing software components, connected by data dependencies. Developers wrap native code fragments using a higher-level language, then build composite applications to form a computational experiment. This exemplifies hierarchical concurrency: lower-level messaging libraries are used for fine-grained parallelism; high-level control is used for inter-task coordination. These patterns are best expressed with dataflow, but static DAGs (i.e., other workflow languages) limit the applications that can be built; they do not provide the expressiveness of Swift, such as conditional execution, iteration, and recursive functions.
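
    As a rough analogy (in Python, not Swift/T syntax) to the dataflow style described, the sketch below composes two hypothetical task functions so that the coordination logic, including a conditional, lives in the high-level layer while the tasks themselves run in parallel workers.

```python
from concurrent.futures import ThreadPoolExecutor

def simulate(param):            # stand-in for a wrapped native code fragment
    return param * param

def analyze(value):             # second stage, fed by dataflow
    return value + 1

with ThreadPoolExecutor() as pool:
    sims = [pool.submit(simulate, p) for p in range(8)]   # fine-grained tasks
    # high-level coordination: conditional execution on each resolved result
    second = [pool.submit(analyze, f.result()) for f in sims
              if f.result() % 2 == 0]
    print([r.result() for r in second])
```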

  14. Application of data mining approaches to drug delivery.

    PubMed

    Ekins, Sean; Shimada, Jun; Chang, Cheng

    2006-11-30

    Computational approaches play a key role in all areas of the pharmaceutical industry from data mining, experimental and clinical data capture to pharmacoeconomics and adverse events monitoring. They will likely continue to be indispensable assets along with a growing library of software applications. This is primarily due to the increasingly massive amount of biology, chemistry and clinical data, which is now entering the public domain mainly as a result of NIH and commercially funded projects. We are therefore in need of new methods for mining this mountain of data in order to enable new hypothesis generation. The computational approaches include, but are not limited to, database compilation, quantitative structure activity relationships (QSAR), pharmacophores, network visualization models, decision trees, machine learning algorithms and multidimensional data visualization software that could be used to improve drug delivery after mining public and/or proprietary data. We will discuss some areas of unmet needs in the area of data mining for drug delivery that can be addressed with new software tools or databases of relevance to future pharmaceutical projects.
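
    As one runnable instance of the approaches listed above, the sketch below fits a decision tree to synthetic molecular-descriptor data. scikit-learn is assumed to be available, and the descriptors, labels, and threshold rule are invented purely for illustration.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)
# columns: molecular weight, logP, polar surface area (hypothetical descriptors)
X = rng.normal([350, 2.5, 80], [80, 1.0, 25], size=(200, 3))
y = (X[:, 1] > 2.0) & (X[:, 2] < 90)        # toy "permeable" labeling rule

model = DecisionTreeClassifier(max_depth=3).fit(X, y)
print(model.predict([[400, 3.1, 70]]))      # classify a new candidate compound
```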

  15. Wide field imaging problems in radio astronomy

    NASA Astrophysics Data System (ADS)

    Cornwell, T. J.; Golap, K.; Bhatnagar, S.

    2005-03-01

    The new generation of synthesis radio telescopes now being proposed, designed, and constructed face substantial problems in making images over wide fields of view. Such observations are required either to achieve the full sensitivity limit in crowded fields or for surveys. The Square Kilometre Array (SKA Consortium, Tech. Rep., 2004), now being developed by an international consortium of 15 countries, will require advances well beyond the current state of the art. We review the theory of synthesis radio telescopes for large fields of view. We describe a new algorithm, W projection, for correcting the non-coplanar baselines aberration. This algorithm has improved performance over those previously used (typically an order of magnitude in speed). Despite the advent of W projection, the computing hardware required for SKA wide field imaging is estimated to cost up to $500M (2015 dollars). This is about half the target cost of the SKA. Reconfigurable computing is one way in which the costs can be decreased dramatically.

  16. A Novel Latin Hypercube Algorithm via Translational Propagation

    PubMed Central

    Pan, Guang; Ye, Pengcheng

    2014-01-01

    Metamodels have been widely used in engineering design to facilitate analysis and optimization of complex systems that involve computationally expensive simulation programs. The accuracy of metamodels is directly related to the experimental designs used. Optimal Latin hypercube designs are frequently used and have been shown to have good space-filling and projective properties. However, the high cost of constructing them limits their use. In this paper, a methodology for creating novel Latin hypercube designs via a translational propagation and successive local enumeration (TPSLE) algorithm is developed without using formal optimization. TPSLE is based on the insight that a near-optimal Latin hypercube design can be constructed by translationally propagating a small initial block of points generated by the SLE algorithm. In effect, TPSLE offers a balanced trade-off between efficiency and sampling performance. The proposed algorithm is compared to two existing algorithms and is found to be much more efficient in terms of computation time while retaining acceptable space-filling and projective properties. PMID:25276844
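
    For orientation, here is the baseline Latin hypercube construction in Python. Note this is plain random LHS, not the TPSLE algorithm itself, which instead tiles an SLE-optimized seed block by translational propagation.

```python
import numpy as np

def latin_hypercube(n_points, n_dims, seed=None):
    """Random Latin hypercube on [0,1): exactly one point falls in each of
    the n_points equal-width strata of every dimension."""
    rng = np.random.default_rng(seed)
    # one sample per stratum, then shuffle strata independently per dimension
    samples = (rng.random((n_points, n_dims))
               + np.arange(n_points)[:, None]) / n_points
    for d in range(n_dims):
        rng.shuffle(samples[:, d])   # decouple the dimensions
    return samples

design = latin_hypercube(10, 2, seed=0)
print(design)
```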

  17. A Comprehensive Toolset for General-Purpose Private Computing and Outsourcing

    DTIC Science & Technology

    2016-12-08

    project and scientific advances made towards each of the research thrusts throughout the project duration. 1 Project Objectives Cloud computing enables...possibilities that the cloud enables is computation outsourcing, when the client can utilize any necessary computing resources for its computational task...Security considerations, however, stand on the way of harnessing the full benefits of cloud computing to the fullest extent and prevent clients from

  18. A Volunteer Computing Project for Solving Geoacoustic Inversion Problems

    NASA Astrophysics Data System (ADS)

    Zaikin, Oleg; Petrov, Pavel; Posypkin, Mikhail; Bulavintsev, Vadim; Kurochkin, Ilya

    2017-12-01

    A volunteer computing project aimed at solving computationally hard inverse problems in underwater acoustics is described. This project was used to study the possibilities of sound speed profile reconstruction in a shallow-water waveguide using a dispersion-based geoacoustic inversion scheme. The computational capabilities provided by the project allowed us to investigate the accuracy of the inversion for different mesh sizes of the sound speed profile discretization grid. This problem is well suited to volunteer computing because it can be easily decomposed into independent, simpler subproblems, as sketched below.
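
    The decomposition property mentioned in the last sentence can be illustrated in miniature in Python: a search over candidate sound-speed profiles splits into independent work units, each scoring one profile. The misfit function and grid values here are invented stand-ins for the project's dispersion-based forward model.

```python
from itertools import product

def misfit(profile):
    """Stand-in for the expensive forward model plus comparison with
    measured dispersion data; here just a toy quadratic."""
    return sum((v - 1490.0) ** 2 for v in profile)

# coarse grid over sound speed at three depths -> 27 independent subproblems
work_units = [tuple(p) for p in product([1480, 1490, 1500], repeat=3)]

# each volunteer host would receive a chunk of work_units; here we just map
best = min(work_units, key=misfit)
print("best candidate profile:", best)
```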

  19. EPA DSSTox and ToxCast Project Updates: Generating New ...

    EPA Pesticide Factsheets

    EPA’s National Center for Computational Toxicology is generating data and capabilities to support a new paradigm for toxicity screening and prediction. The DSSTox project is improving public access to quality structure-annotated chemical toxicity information in less summarized forms than traditionally employed in SAR modeling, and in ways that facilitate data-mining and data read-across. The DSSTox Structure-Browser provides structure searchability across the full published DSSTox toxicity-related inventory, enables linkages to and from previously isolated toxicity data resources (soon to include the public microarray resources GEO, ArrayExpress, and CEBS), and provides link-outs to cross-indexed public resources such as PubChem, ChemSpider, and ACToR. The published DSSTox inventory and bioassay information also have been integrated into PubChem, allowing a user to take full advantage of PubChem structure-activity and bioassay clustering features. Phase I of the ToxCast™ project has generated high-throughput screening (HTS) data from several hundred biochemical and cell-based assays for a set of 320 chemicals, mostly pesticide actives, with rich toxicology profiles. DSSTox and ACToR are providing the primary cheminformatics support for ToxCast™ and collaborative efforts with the National Toxicology Program's HTS Program and the NIH Chemical Genomics Center. DSSTox will also be a primary vehicle for publishing ToxCast™ ToxRef summarized bioassay data for use

  20. SHERPA Electromechanical Test Bed

    NASA Technical Reports Server (NTRS)

    Wason, John D.

    2005-01-01

    SHERPA (Strap-on High-altitude Entry Reconnaissance and Precision Aeromaneuver system) is a concept for low-cost, high-accuracy Martian reentry guidance for small scout-class missions with a capsule diameter of approximately 1 meter. This system uses moving masses to change the center of gravity of the capsule in order to control the lift generated by the controlled imbalance. This project involved designing a small proof-of-concept demonstration system that can be used to test the concept through bench-top testing, hardware-in-the-loop testing, and eventually a drop test from a helicopter. This project has focused on the mechatronic design aspects of the system, including the mechanical, electrical, computer, and low-level control elements of the concept demonstration system.

  1. Interactive Therapeutic Multi-sensory Environment for Cerebral Palsy People

    NASA Astrophysics Data System (ADS)

    Mauri, Cesar; Solanas, Agusti; Granollers, Toni; Bagés, Joan; García, Mabel

    The Interactive Therapeutic Sensory Environment (ITSE) research project offers new opportunities for stimulation, interaction and interactive creation for people with moderate and severe mental and physical disabilities. Based mainly on computer vision techniques, the ITSE project allows the capture of users' gestures and their transformation into images, sounds and vibrations. Currently, at the APPC, we are working on a prototype that is capable of generating sounds based on the users' motion and of digitally processing the users' vocal sounds. Tests with impaired users show that ITSE promotes participation, engagement and play. In this paper, we briefly describe the ITSE system, the experimental methodology, the preliminary results and some future goals.

  2. Manufacturing Methods and Technology Project Summary Reports

    DTIC Science & Technology

    1985-06-01

    Computer -Aided Design (CAD)/ Computer -Aided Manufacturing (CAM) Process for the Production of Cold Forged Gears Project 483 6121 - Robotic Welding and...Caliber Projectile Bodies Project 682 8370 - Automatic Inspection and 1-I1 Process Control of Weapons Parts Manufacturing METALS Project 181 7285 - Cast...designed for use on each project. Experience suggested that a general purpose computer interface might be designed that could be used on any project

  3. Analysis and methodology for aeronautical systems technology program planning

    NASA Technical Reports Server (NTRS)

    White, M. J.; Gershkoff, I.; Lamkin, S.

    1983-01-01

    A structured methodology was developed that allows the generation, analysis, and rank-ordering of system concepts by their benefits and costs, indicating the preferred order of implementation. The methodology is supported by a base of data on civil transport aircraft fleet growth projections and data on aircraft performance relating the contribution of each element of the aircraft to overall performance. The performance data are used to assess the benefits of proposed concepts. The methodology includes a computer program for performing the calculations needed to rank-order the concepts and compute their cumulative benefit-to-cost ratio. The use of the methodology and supporting data is illustrated through the analysis of actual system concepts from various sources.
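
    The ranking calculation described above reduces to a few lines of code. In the Python sketch below the concept names and benefit/cost numbers are hypothetical.

```python
# Rank concepts by benefit-to-cost ratio and report the cumulative ratio
# as each one is added to the program.
concepts = [
    ("laminar flow control", 40.0, 10.0),   # (name, benefit, cost)
    ("composite structures", 25.0, 12.0),
    ("advanced avionics",    15.0,  3.0),
]

ranked = sorted(concepts, key=lambda c: c[1] / c[2], reverse=True)
cum_benefit = cum_cost = 0.0
for name, benefit, cost in ranked:
    cum_benefit += benefit
    cum_cost += cost
    print(f"{name:25s} b/c={benefit / cost:5.2f} "
          f"cumulative b/c={cum_benefit / cum_cost:5.2f}")
```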

  4. Simulating electric field interactions with polar molecules using spectroscopic databases

    NASA Astrophysics Data System (ADS)

    Owens, Alec; Zak, Emil J.; Chubb, Katy L.; Yurchenko, Sergei N.; Tennyson, Jonathan; Yachmenev, Andrey

    2017-03-01

    Ro-vibrational Stark-associated phenomena of small polyatomic molecules are modelled using extensive spectroscopic data generated as part of the ExoMol project. The external field Hamiltonian is built from the computed ro-vibrational line list of the molecule in question. The Hamiltonian we propose is general and suitable for any polar molecule in the presence of an electric field. By exploiting precomputed data, the often prohibitively expensive computations associated with high accuracy simulations of molecule-field interactions are avoided. Applications to strong terahertz field-induced ro-vibrational dynamics of PH3 and NH3, and spontaneous emission data for optoelectrical Sisyphus cooling of H2CO and CH3Cl are discussed.
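
    Schematically, the field Hamiltonian described above is H(E) = H0 - mu_z * E in the basis of precomputed field-free states. The Python sketch below builds and diagonalizes such a matrix; the energies and dipole matrix elements are placeholders, not actual ExoMol line-list data.

```python
import numpy as np

energies = np.array([0.0, 1.0, 2.5])        # field-free term values (arb. units)
mu_z = np.array([[0.0, 0.3, 0.0],           # dipole matrix elements between
                 [0.3, 0.0, 0.5],           # those states (arb. units)
                 [0.0, 0.5, 0.0]])

def stark_levels(E_field):
    """Field-dressed energies from H = H0 - mu_z * E."""
    H = np.diag(energies) - mu_z * E_field
    return np.linalg.eigvalsh(H)

for E in (0.0, 0.01, 0.05):
    print(E, stark_levels(E))
```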

  5. Electromagnetic Physics Models for Parallel Computing Architectures

    NASA Astrophysics Data System (ADS)

    Amadio, G.; Ananya, A.; Apostolakis, J.; Aurora, A.; Bandieramonte, M.; Bhattacharyya, A.; Bianchini, C.; Brun, R.; Canal, P.; Carminati, F.; Duhem, L.; Elvira, D.; Gheata, A.; Gheata, M.; Goulas, I.; Iope, R.; Jun, S. Y.; Lima, G.; Mohanty, A.; Nikitina, T.; Novak, M.; Pokorski, W.; Ribon, A.; Seghal, R.; Shadura, O.; Vallecorsa, S.; Wenzel, S.; Zhang, Y.

    2016-10-01

    The recent emergence of hardware architectures characterized by many-core or accelerated processors has opened new opportunities for concurrent programming models taking advantage of both SIMD and SIMT architectures. GeantV, a next generation detector simulation, has been designed to exploit both the vector capability of mainstream CPUs and multi-threading capabilities of coprocessors including NVidia GPUs and Intel Xeon Phi. The characteristics of these architectures are very different in terms of the vectorization depth and type of parallelization needed to achieve optimal performance. In this paper we describe implementation of electromagnetic physics models developed for parallel computing architectures as a part of the GeantV project. Results of preliminary performance evaluation and physics validation are presented as well.

  6. Comparing Networks from a Data Analysis Perspective

    NASA Astrophysics Data System (ADS)

    Li, Wei; Yang, Jing-Yu

    To probe network characteristics, the two predominant ways of comparing networks are global property statistics and subgraph enumeration. However, the former captures only limited information while the latter requires exhaustive computation. Here, we present an approach to compare networks from the perspective of data analysis. Initially, the approach projects each node of the original network to a high-dimensional data point, so that the network is seen as a cloud of data points. Then the dispersion information of the principal component analysis (PCA) projection of the generated data clouds can be used to distinguish networks. We applied this node projection method to the yeast protein-protein interaction networks and the Internet Autonomous System networks, two types of networks with several similar higher-order properties. The method can efficiently distinguish one from the other. The identical result on different datasets from independent sources also indicates that the method is a robust and universal framework.
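
    A simplified reading of the node-projection idea, not the authors' code: represent each node by its row of the adjacency matrix, run PCA, and compare how the variance is spread across principal axes. The Python sketch below contrasts a ring and a star network.

```python
import numpy as np

def pca_dispersion(adjacency, k=5):
    """Summarize the node point cloud (rows of the adjacency matrix) by the
    fraction of variance captured along the top k principal axes."""
    X = adjacency - adjacency.mean(axis=0)       # center the node cloud
    s = np.linalg.svd(X, compute_uv=False)       # singular values
    var = s**2 / (s**2).sum()
    return var[:k]                               # dispersion signature

n = 12
ring = np.zeros((n, n), int)
for i in range(n):
    ring[i, (i + 1) % n] = ring[(i + 1) % n, i] = 1
star = np.zeros((n, n), int)
star[0, 1:] = star[1:, 0] = 1

print(pca_dispersion(ring))   # variance spread over many axes
print(pca_dispersion(star))   # variance concentrated on few axes
```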

  7. Physical and Chemical Processes Opacity Project: an Overview and Some Preliminary Results

    NASA Astrophysics Data System (ADS)

    Mendoza, C.

    1990-11-01

    The Opacity Project is an international effort dedicated to the calculation of the vast, accurate atomic data required to estimate stellar envelope opacities. We give an overview of the project including astrophysical, atomic-physical and computational aspects. The volume and quality of the data which are being generated can be appreciated in the preliminary results that are presented. Key words: ATOMIC PROCESSES - OPACITIES - STARS: INTERIORS

  8. Photoacoustic projection imaging using an all-optical detector array

    NASA Astrophysics Data System (ADS)

    Bauer-Marschallinger, J.; Felbermayer, K.; Berer, T.

    2018-02-01

    We present a prototype for all-optical photoacoustic projection imaging. By generating projection images, photoacoustic information of large volumes can be retrieved with less effort compared to common photoacoustic computed tomography where many detectors and/or multiple measurements are required. In our approach, an array of 60 integrating line detectors is used to acquire photoacoustic waves. The line detector array consists of fiber-optic Mach-Zehnder interferometers, distributed on a cylindrical surface. From the measured variation of the optical path lengths of the interferometers, induced by photoacoustic waves, a photoacoustic projection image can be reconstructed. The resulting images represent the projection of the three-dimensional spatial light absorbance within the imaged object onto a two-dimensional plane, perpendicular to the line detector array. The fiber-optic detectors achieve a noise-equivalent pressure of 24 Pa at a 10 MHz bandwidth. We present the operational principle, the structure of the array, and resulting images. The system can acquire high-resolution projection images of large volumes within a short period of time. Imaging large volumes at high frame rates facilitates monitoring of dynamic processes.

  9. 48 CFR 53.105 - Computer generation.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 48 Federal Acquisition Regulations System 2 2010-10-01 2010-10-01 false Computer generation. 53...) CLAUSES AND FORMS FORMS General 53.105 Computer generation. (a) Agencies may computer-generate the... be computer generated by the public. Unless prohibited by agency regulations, forms prescribed by...

  10. 48 CFR 53.105 - Computer generation.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 48 Federal Acquisition Regulations System 2 2011-10-01 2011-10-01 false Computer generation. 53...) CLAUSES AND FORMS FORMS General 53.105 Computer generation. (a) Agencies may computer-generate the... be computer generated by the public. Unless prohibited by agency regulations, forms prescribed by...

  11. Mobile Computing for Aerospace Applications

    NASA Technical Reports Server (NTRS)

    Alena, Richard; Swietek, Gregory E. (Technical Monitor)

    1994-01-01

    The use of commercial computer technology in specific aerospace mission applications can reduce the cost and project cycle time required for the development of special-purpose computer systems. Additionally, the pace of technological innovation in the commercial market has made new computer capabilities available for demonstrations and flight tests. Three areas of research and development being explored by the Portable Computer Technology Project at NASA Ames Research Center are the application of commercial client/server network computing solutions to crew support and payload operations, the analysis of requirements for portable computing devices, and testing of wireless data communication links as extensions to the wired network. This paper will present computer architectural solutions to portable workstation design including the use of standard interfaces, advanced flat-panel displays and network configurations incorporating both wired and wireless transmission media. It will describe the design tradeoffs used in selecting high-performance processors and memories, interfaces for communication and peripheral control, and high resolution displays. The packaging issues for safe and reliable operation aboard spacecraft and aircraft are presented. The current status of wireless data links for portable computers is discussed from a system design perspective. An end-to-end data flow model for payload science operations from the experiment flight rack to the principal investigator is analyzed using capabilities provided by the new generation of computer products. A future flight experiment on-board the Russian MIR space station will be described in detail including system configuration and function, the characteristics of the spacecraft operating environment, the flight qualification measures needed for safety review, and the specifications of the computing devices to be used in the experiment. The software architecture chosen shall be presented. An analysis of the performance characteristics of wireless data links in the spacecraft environment will be discussed. Network performance and operation will be modeled and preliminary test results presented. A crew support application will be demonstrated in conjunction with the network metrics experiment.

  12. MIDAS, prototype Multivariate Interactive Digital Analysis System, phase 1. Volume 1: System description

    NASA Technical Reports Server (NTRS)

    Kriegler, F. J.

    1974-01-01

    The MIDAS System is described as a third-generation fast multispectral recognition system able to keep pace with the large quantity and high rates of data acquisition from present and projected sensors. A principal objective of the MIDAS program is to provide a system well interfaced with the human operator and thus to obtain large overall reductions in turnaround time and significant gains in throughput. The hardware and software are described. The system contains a mini-computer to control the various high-speed processing elements in the data path, and a classifier which implements an all-digital prototype multivariate-Gaussian maximum likelihood decision algorithm operating at 200,000 pixels/sec. Sufficient hardware was developed to perform signature extraction from computer-compatible tapes, compute classifier coefficients, control the classifier operation, and diagnose operation.
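
    The decision rule MIDAS implements in hardware can be sketched in software as follows: assign each pixel to the class whose multivariate-Gaussian signature gives the highest log-likelihood. The four-band training signatures in this Python sketch are hypothetical.

```python
import numpy as np

def fit_signature(pixels):                      # pixels: (n_samples, bands)
    """Estimate a class signature: mean, inverse covariance, log-det."""
    mean = pixels.mean(axis=0)
    cov = np.cov(pixels, rowvar=False)
    return mean, np.linalg.inv(cov), np.linalg.slogdet(cov)[1]

def classify(pixels, signatures):
    scores = []
    for mean, cov_inv, logdet in signatures:
        d = pixels - mean
        # log-likelihood up to a constant: -0.5 * (logdet + Mahalanobis^2)
        scores.append(-0.5 * (logdet + np.einsum('ij,jk,ik->i', d, cov_inv, d)))
    return np.argmax(scores, axis=0)            # maximum-likelihood class

rng = np.random.default_rng(1)
water = rng.normal([10, 40, 5, 3], 2.0, (200, 4))     # 4 spectral bands
crop  = rng.normal([30, 25, 60, 45], 4.0, (200, 4))
sigs = [fit_signature(water), fit_signature(crop)]
test = rng.normal([30, 25, 60, 45], 4.0, (10, 4))
print(classify(test, sigs))                     # expected: mostly class 1
```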

  13. Computationally Efficient Characterization of Potential Energy Surfaces Based on Fingerprint Distances

    NASA Astrophysics Data System (ADS)

    Schaefer, Bastian; Goedecker, Stefan; Goedecker Group Team

    Based on Lennard-Jones, silicon, sodium-chloride and gold clusters, it was found that the uphill barrier energies of transition states between directly connected minima tend to increase with increasing structural differences between the two minima. Based on this insight, it also turned out that post-processing minima hopping data at negligible computational cost makes it possible to obtain qualitative topological information on potential energy surfaces that can be stored in so-called qualitative connectivity databases. These qualitative connectivity databases are used for generating fingerprint disconnectivity graphs that give a first qualitative idea of the thermodynamic and kinetic properties of a system of interest. This research was supported by the NCCR MARVEL, funded by the Swiss National Science Foundation. Computer time was provided by the Swiss National Supercomputing Centre (CSCS) under Project ID No. s499.

  14. Virtual personal assistance

    NASA Astrophysics Data System (ADS)

    Aditya, K.; Biswadeep, G.; Kedar, S.; Sundar, S.

    2017-11-01

    Human-computer communication has been in growing demand in recent years. The new generation of autonomous technology aspires to give computer interfaces emotional states that relate to and consider the user as well as the system environment. The existing computational model is based on artificial intelligence, augmented externally by multi-modal expression with semi-human characteristics. The main problem with this multi-modal expression, however, is that the hardware control given to the Artificial Intelligence (AI) is very limited. In this project, we therefore try to give the AI more control over the hardware. Two main parts, Speech-to-Text (STT) and Text-to-Speech (TTS) engines, are used to accomplish this requirement. In this work, we use a Raspberry Pi 3, a speaker and a microphone as hardware; for the programming part, we use Python scripting.
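
    A minimal sketch of the STT-to-TTS loop, assuming the third-party speech_recognition and pyttsx3 Python packages (the record names only Python on a Raspberry Pi, not specific libraries). The command handling here is a toy stand-in for the hardware-control layer.

```python
import speech_recognition as sr
import pyttsx3

recognizer = sr.Recognizer()
engine = pyttsx3.init()                         # offline text-to-speech engine

with sr.Microphone() as mic:
    recognizer.adjust_for_ambient_noise(mic)
    audio = recognizer.listen(mic)              # capture one utterance

try:
    command = recognizer.recognize_google(audio).lower()
except sr.UnknownValueError:
    command = ""

# toy dispatch standing in for the hardware-control layer
reply = "turning lights on" if "light" in command else "sorry, say again"
engine.say(reply)                               # speak the response
engine.runAndWait()
```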

  15. Computer Series, 101: Accurate Equations of State in Computational Chemistry Projects.

    ERIC Educational Resources Information Center

    Albee, David; Jones, Edward

    1989-01-01

    Discusses the use of computers in chemistry courses at the United States Military Academy. Provides two examples of computer projects: (1) equations of state, and (2) solving for molar volume. Presents BASIC and PASCAL listings for the second project. Lists 10 applications for physical chemistry. (MVL)
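
    As one worked instance of the "solving for molar volume" project, here is a Newton iteration on the van der Waals equation for CO2 in Python. The constants are standard, but the code is a reconstruction for illustration, not the article's BASIC or PASCAL listing.

```python
R = 0.082057                   # gas constant, L*atm/(mol*K)
a, b = 3.592, 0.04267          # van der Waals constants for CO2
T, P = 300.0, 10.0             # temperature (K) and pressure (atm)

def f(V):
    """Van der Waals equation rearranged to f(V) = 0."""
    return (P + a / V**2) * (V - b) - R * T

def fprime(V):
    return P + a / V**2 - 2 * a * (V - b) / V**3

V = R * T / P                  # ideal-gas volume as the starting guess
for _ in range(50):
    step = f(V) / fprime(V)    # Newton update
    V -= step
    if abs(step) < 1e-10:
        break

print(f"molar volume = {V:.4f} L/mol")   # slightly below the ideal 2.46 L/mol
```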

  16. Reclassification and Documentation in a Medium-sized Medical Center Library: The MTST System in the Simultaneous Production of Catalog Cards and a Computer Stored Record

    PubMed Central

    Love, Erika; Butzin, Diane; Robinson, Robert E.; Lee, Soo

    1971-01-01

    A project to recatalog and reclassify the book collection of the Bowman Gray School of Medicine Library utilizing the Magnetic Tape/Selectric Typewriter system for simultaneous catalog card production and computer stored data acquisition marks the beginning of eventual computerization of all library operations. A keyboard optical display system will be added by late 1970. Major input operations requiring the creation of “hard copy” will continue via the MTST system. Updating, editing and retrieval operations as well as input without hard copy production will be done through the “on-line” keyboard optical display system. Once the library's first data bank, the book catalog, has been established the computer may be consulted directly for library holdings from any optical display terminal throughout the medical center. Three basic information retrieval operations may be carried out through “on-line” optical display terminals. Output options include the reproduction of part or all of a given document, or the generation of statistical data, which are derived from two Acquisition Code lines. The creation of a central bibliographic record of Bowman Gray Faculty publications patterned after the cataloging program is presently under way. The cataloging and computer storage of serial holdings records will begin after completion of the reclassification project. All acquisitions added to the collection since October 1967 are computer-stored and fully retrievable. Reclassification of older titles will be completed in early 1971. PMID:5542915

  17. Computer-Generated Feedback on Student Writing

    ERIC Educational Resources Information Center

    Ware, Paige

    2011-01-01

    A distinction must be made between "computer-generated scoring" and "computer-generated feedback". Computer-generated scoring refers to the provision of automated scores derived from mathematical models built on organizational, syntactic, and mechanical aspects of writing. In contrast, computer-generated feedback, the focus of this article, refers…

  18. Montage Version 3.0

    NASA Technical Reports Server (NTRS)

    Jacob, Joseph; Katz, Daniel; Prince, Thomas; Berriman, Graham; Good, John; Laity, Anastasia

    2006-01-01

    The final version (3.0) of the Montage software has been released. To recapitulate from previous NASA Tech Briefs articles about Montage: This software generates custom, science-grade mosaics of astronomical images on demand from input files that comply with the Flexible Image Transport System (FITS) standard and contain image data registered on projections that comply with the World Coordinate System (WCS) standards. This software can be executed on single-processor computers, multi-processor computers, and such networks of geographically dispersed computers as the National Science Foundation's TeraGrid or NASA's Information Power Grid. The primary advantage of running Montage in a grid environment is that computations can be done on a remote supercomputer for efficiency. Multiple computers at different sites can be used for different parts of a computation, a significant advantage in cases of computations for large mosaics that demand more processor time than is available at any one site. Version 3.0 incorporates several improvements over prior versions. The most significant improvement is that this version is accessible to scientists located anywhere, through operational Web services that provide access to data from several large astronomical surveys and construct mosaics on either local workstations or remote computational grids as needed.

  19. A Research Program in Computer Technology

    DTIC Science & Technology

    1976-07-01

    K PROGRAM VERIFICATION 12 [Shaw76b] Shaw, M., W. A. Wulf, and R. L. London, Abstraction and Verification in Alphard: Iteration and Generators...millisecond frame of speech: pitch, gain, and 10 k-parameters (often called reflection coefficients). The 12 parameters from each frame are encoded into...

  20. Multi-attribute criteria applied to electric generation energy system analysis LDRD.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kuswa, Glenn W.; Tsao, Jeffrey Yeenien; Drennen, Thomas E.

    2005-10-01

    This report began with a Laboratory-Directed Research and Development (LDRD) project to improve Sandia National Laboratories' multidisciplinary capabilities in energy systems analysis. The aim is to understand how various electricity generating options can best serve needs in the United States. The initial product is documented in a series of white papers that span a broad range of topics, including the successes and failures of past modeling studies, sustainability, oil dependence, energy security, and nuclear power. Summaries of these projects are included here. These projects have provided a background and discussion framework for the Energy Systems Analysis LDRD team to carry out an inter-comparison of many of the commonly available electric power sources in present use, comparisons of those options, and efforts needed to realize progress towards those options. A computer aid has been developed to compare various options based on cost and other attributes such as technological, social, and policy constraints. The Energy Systems Analysis team has developed a multi-criteria framework that will allow comparison of energy options with a set of metrics that can be used across all technologies. This report discusses several evaluation techniques and introduces the set of criteria developed for this LDRD.
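
    A minimal sketch of such a multi-criteria comparison in Python: normalize each attribute, apply stakeholder weights, and rank. The options, attributes, and weights below are illustrative, not the LDRD team's actual metrics.

```python
import numpy as np

options = ["coal", "gas", "nuclear", "wind"]
# columns: levelized cost ($/MWh, lower better), CO2 (t/GWh, lower better),
#          capacity factor (higher better) -- all hypothetical values
X = np.array([[60.,  900., 0.85],
              [50.,  400., 0.87],
              [90.,   10., 0.92],
              [45.,   10., 0.35]])
higher_is_better = np.array([False, False, True])
weights = np.array([0.4, 0.4, 0.2])

norm = (X - X.min(0)) / (X.max(0) - X.min(0))   # rescale each column to [0,1]
norm[:, ~higher_is_better] = 1 - norm[:, ~higher_is_better]  # flip "lower better"
scores = norm @ weights
for name, s in sorted(zip(options, scores), key=lambda t: -t[1]):
    print(f"{name:8s} {s:.3f}")
```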

  1. Mantle convection on modern supercomputers

    NASA Astrophysics Data System (ADS)

    Weismüller, Jens; Gmeiner, Björn; Mohr, Marcus; Waluga, Christian; Wohlmuth, Barbara; Rüde, Ulrich; Bunge, Hans-Peter

    2015-04-01

    Mantle convection is the cause for plate tectonics, the formation of mountains and oceans, and the main driving mechanism behind earthquakes. The convection process is modeled by a system of partial differential equations describing the conservation of mass, momentum and energy. Characteristic to mantle flow is the vast disparity of length scales from global to microscopic, turning mantle convection simulations into a challenging application for high-performance computing. As system size and technical complexity of the simulations continue to increase, design and implementation of simulation models for next generation large-scale architectures demand an interdisciplinary co-design. Here we report about recent advances of the TERRA-NEO project, which is part of the high visibility SPPEXA program, and a joint effort of four research groups in computer sciences, mathematics and geophysical application under the leadership of FAU Erlangen. TERRA-NEO develops algorithms for future HPC infrastructures, focusing on high computational efficiency and resilience in next generation mantle convection models. We present software that can resolve the Earth's mantle with up to 10^12 grid points and scales efficiently to massively parallel hardware with more than 50,000 processors. We use our simulations to explore the dynamic regime of mantle convection assessing the impact of small scale processes on global mantle flow.

  2. Fluorescent x-ray computed tomography with synchrotron radiation using fan collimator

    NASA Astrophysics Data System (ADS)

    Takeda, Tohoru; Akiba, Masahiro; Yuasa, Tetsuya; Kazama, Masahiro; Hoshino, Atsunori; Watanabe, Yuuki; Hyodo, Kazuyuki; Dilmanian, F. Avraham; Akatsuka, Takao; Itai, Yuji

    1996-04-01

    We describe a new system of fluorescent x-ray computed tomography applied to image nonradioactive contrast materials in vivo. The system operates on the basis of first-generation computed tomography (CT). The experiment was also simulated using the Monte Carlo method. The research was carried out at the BLNE-5A bending-magnet beam line of the TRISTAN Accumulation Ring at KEK, Japan. An acrylic cylindrical phantom containing five paraxial channels of 5 and 4 mm diameters was imaged. The channels were filled with a diluted iodine-based contrast material, with iodine concentrations of 2 mg/ml and 500 μg/ml. Spectra obtained with the system's high-purity germanium (HPGe) detector clearly separated the Kα and Kβ1 x-ray fluorescent lines and the Compton scattering. CT images were reconstructed from projections generated by integrating the counts in these spectral lines. The method had adequate sensitivity and detection power, as shown by the experiment and predicted by the simulations, to show the iodine content of the phantom channels, which corresponded to 1 and 4 μg of iodine per pixel in the reconstructed images.

  3. Software for Use with Optoelectronic Measuring Tool

    NASA Technical Reports Server (NTRS)

    Ballard, Kim C.

    2004-01-01

    A computer program has been written to facilitate and accelerate the process of measurement by use of the apparatus described in "Optoelectronic Tool Adds Scale Marks to Photographic Images" (KSC-12201). The tool contains four laser diodes that generate parallel beams of light spaced apart at a known distance. The beams of light are used to project bright spots that serve as scale marks that become incorporated into photographic images (including film and electronic images). The sizes of objects depicted in the images can readily be measured by reference to the scale marks. The computer program is applicable to a scene that contains the laser spots and that has been imaged in a square pixel format that can be imported into a graphical user interface (GUI) generated by the program. It is assumed that the laser spots and the distance(s) to be measured all lie in the same plane and that the plane is perpendicular to the line of sight of the camera used to record the image.
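
    The core computation behind such a tool is a scale calibration from the known laser-spot separation. In the Python sketch below the spot coordinates, separation, and picked points are hypothetical.

```python
import numpy as np

# Two laser spots a known distance apart fix the image scale; any in-plane
# pixel distance then converts to physical units.
spot_a = np.array([312.0, 480.0])        # spot centers in pixel coordinates
spot_b = np.array([519.0, 482.0])
SPOT_SEPARATION_MM = 50.0                # known beam spacing of the tool

mm_per_pixel = SPOT_SEPARATION_MM / np.linalg.norm(spot_b - spot_a)

# measure an object by picking its two endpoints in the same plane
p1 = np.array([100.0, 220.0])
p2 = np.array([346.0, 310.0])
length_mm = mm_per_pixel * np.linalg.norm(p2 - p1)
print(f"{length_mm:.1f} mm")
```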

  4. The infection algorithm: an artificial epidemic approach for dense stereo correspondence.

    PubMed

    Olague, Gustavo; Fernández, Francisco; Pérez, Cynthia B; Lutton, Evelyne

    2006-01-01

    We present a new bio-inspired approach applied to a problem of stereo image matching. This approach is based on an artificial epidemic process, which we call the infection algorithm. The problem at hand is a basic one in computer vision for 3D scene reconstruction. It has many complex aspects and is known to be extremely difficult. The aim is to match the contents of two images in order to obtain 3D information that allows the generation of simulated projections from a viewpoint different from those of the initial photographs. This process is known as view synthesis. The algorithm we propose exploits the image contents in order to produce only the necessary 3D depth information, while saving computational time. It is based on a set of distributed rules, which propagate like an artificial epidemic over the images. Experiments on a pair of real images are presented, and realistic reprojected images have been generated.

  5. Virtual finger boosts three-dimensional imaging and microsurgery as well as terabyte volume image visualization and analysis.

    PubMed

    Peng, Hanchuan; Tang, Jianyong; Xiao, Hang; Bria, Alessandro; Zhou, Jianlong; Butler, Victoria; Zhou, Zhi; Gonzalez-Bellido, Paloma T; Oh, Seung W; Chen, Jichao; Mitra, Ananya; Tsien, Richard W; Zeng, Hongkui; Ascoli, Giorgio A; Iannello, Giulio; Hawrylycz, Michael; Myers, Eugene; Long, Fuhui

    2014-07-11

    Three-dimensional (3D) bioimaging, visualization and data analysis are in strong need of powerful 3D exploration techniques. We develop virtual finger (VF) to generate 3D curves, points and regions-of-interest in the 3D space of a volumetric image with a single finger operation, such as a computer mouse stroke, or click or zoom from the 2D-projection plane of an image as visualized with a computer. VF provides efficient methods for acquisition, visualization and analysis of 3D images for roundworm, fruitfly, dragonfly, mouse, rat and human. Specifically, VF enables instant 3D optical zoom-in imaging, 3D free-form optical microsurgery, and 3D visualization and annotation of terabytes of whole-brain image volumes. VF also leads to orders of magnitude better efficiency of automated 3D reconstruction of neurons and similar biostructures over our previous systems. We use VF to generate from images of 1,107 Drosophila GAL4 lines a projectome of a Drosophila brain.

  6. Initial Results of Coupling the Output of a Regional Weather Model and a Localized Computational Fluid Dynamics Model at the Atlantic City International Airport

    NASA Astrophysics Data System (ADS)

    Trout, Joseph; Manson, J. Russell; Rios, Manny; King, David; Decicco, Nicholas

    2015-04-01

    Wake vortex turbulence is the turbulence generated by an aircraft in flight. This turbulence is created by vortices at the tips of the wing that may decay slowly and persist for several minutes after creation. The strength, formation and lifetime of the turbulence and vortices are affected by many factors, including the weather. Here we present the preliminary results of an investigation of low-level wind fields generated by the Weather Research and Forecasting Model and an analysis of historical data. The simulations are used as inputs for the computational fluid dynamics model (OpenFoam) that will be used to investigate the effect of weather on wake turbulence. The initial results of the OpenFoam model are presented elsewhere. Presented here are the initial results from a research grant, "A Pilot Project to Investigate Wake Vortex Patterns and Weather Patterns at the Atlantic City Airport by the Richard Stockton College of NJ and the FAA".

  7. Improving student learning via mobile phone video content: Evidence from the BridgeIT India project

    NASA Astrophysics Data System (ADS)

    Wennersten, Matthew; Quraishy, Zubeeda Banu; Velamuri, Malathi

    2015-08-01

    Past efforts invested in computer-based education technology interventions have generated little evidence of affordable success at scale. This paper presents the results of a mobile phone-based intervention conducted in the Indian states of Andhra Pradesh and Tamil Nadu in 2012-13. The BridgeIT project provided a pool of audio-visual learning materials organised in accordance with a system of syllabi pacing charts. Teachers of Standard 5 and 6 English and Science classes were notified of the availability of new videos via text messages (SMS), which they downloaded onto their phones using an open-source application and showed, with suggested activities, to students on a TV screen using a TV-out cable. In their evaluation of this project, the authors of this paper found that the test scores of children who experienced the intervention improved by 0.36 standard deviations in English and 0.98 standard deviations in Science in Andhra Pradesh, relative to students in similar classrooms who did not experience the intervention. Differences between treatment and control schools in Tamil Nadu were less marked. The intervention was also cost-effective, relative to other computer-based interventions. Based on these results, the authors argue that it is possible to use mobile phones to produce a strong positive and statistically significant effect in terms of teaching and learning quality across a large number of classrooms in India at a lower cost per student than past computer-based interventions.

  8. An Intelligent Computer-aided Training System (CAT) for Diagnosing Adult Illiterates: Integrating NASA Technology into Workplace Literacy

    NASA Technical Reports Server (NTRS)

    Yaden, David B., Jr.

    1991-01-01

    An important part of NASA's mission involves the secondary application of its technologies in the public and private sectors. One current application being developed is The Adult Literacy Evaluator, a simulation-based diagnostic tool designed to assess the operant literacy abilities of adults having difficulties in learning to read and write. Using Intelligent Computer-Aided Training (ICAT) system technology in addition to speech recognition, closed-captioned television (CCTV), live video and other state-of-the-art graphics and storage capabilities, this project attempts to overcome the negative effects of adult literacy assessment by allowing the client to interact with an intelligent computer system which simulates real-life literacy activities and materials and which measures literacy performance in the actual context of its use. The specific objectives of the project are as follows: (1) to develop a simulation-based diagnostic tool to assess adults' prior knowledge about reading and writing processes in actual contexts of application; (2) to provide a profile of readers' strengths and weaknesses; and (3) to suggest instructional strategies and materials which can be used as a beginning point for remediation. In the first and development phase of the project, descriptions of literacy events and environments are being written and functional literacy documents analyzed for their components. From these descriptions, scripts are being generated which define the interaction between the student, an on-screen guide and the simulated literacy environment.

  9. Lunar laser ranging data processing in a Unix/X windows environment

    NASA Technical Reports Server (NTRS)

    Ricklefs, Randall L.; Ries, Judit G.

    1993-01-01

    In cooperation with the NASA Crustal Dynamics Project initiative placing workstation computers at each of its laser ranging stations to handle data filtering and normal-pointing, MLRS personnel have developed a new generation of software to provide the same services for the lunar laser ranging data type. The Unix operating system and X windows/Motif provide an environment for both batch and interactive filtering and normal-pointing as well as prediction calculations. The goal is to provide a transportable and maintainable data reduction environment. This software and some sample displays are presented. Previously, the lunar (or satellite) data could be processed on one computer while data was taken on the other, but the reduction of the data was totally interactive and in no way automated. In addition, lunar predictions were produced on-site, another first in the effort to down-size historically mainframe-based applications. Extraction of earth rotation parameters was at one time attempted on site in near-realtime. In 1988, the Crustal Dynamics Project SLR Computer Panel mandated the installation of Hewlett-Packard 9000/360 Unix workstations at each NASA-operated laser ranging station to relieve the aging controller computers of much of their data and communications handling responsibility and to provide on-site data filtering and normal-pointing for a growing list of artificial satellite targets. This was seen by MLRS staff as an opportunity to provide a better lunar data processing environment as well.

  10. Lunar laser ranging data processing in a Unix/X windows environment

    NASA Astrophysics Data System (ADS)

    Ricklefs, Randall L.; Ries, Judit G.

    1993-06-01

    In cooperation with the NASA Crustal Dynamics Project initiative placing workstation computers at each of its laser ranging stations to handle data filtering and normal pointing, MLRS personnel have developed a new generation of software to provide the same services for the lunar laser ranging data type. The Unix operating system and X windows/Motif provide an environment for both batch and interactive filtering and normal pointing as well as prediction calculations. The goal is to provide a transportable and maintainable data reduction environment. This software and some sample displays are presented. Previously, two on-site computers were used so that the lunar (or satellite) data could be processed on one computer while data was taken on the other. The reduction of the data was totally interactive and in no way automated. In addition, lunar predictions were produced on-site, another first in the effort to down-size historically mainframe-based applications. Extraction of earth rotation parameters was at one time attempted on site in near-realtime. In 1988, the Crustal Dynamics Project SLR Computer Panel mandated the installation of Hewlett-Packard 9000/360 Unix workstations at each NASA-operated laser ranging station to relieve the aging controller computers of much of their data and communications handling responsibility and to provide on-site data filtering and normal pointing for a growing list of artificial satellite targets. This was seen by MLRS staff as an opportunity to provide a better lunar data processing environment as well.

  11. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gonzalez Anez, Francisco

    This paper presents two development projects (STARMATE and VIRMAN) focused on supporting training in maintenance. Both projects aim at specifying, designing, developing, and demonstrating prototypes allowing computer-guided maintenance of complex mechanical elements using Augmented and Virtual Reality techniques. VIRMAN is a Spanish development project. Its objective is to create a computer tool for elaborating maintenance training courses and delivering training based on 3D virtual reality models of complex components. The training delivery includes 3D displays of recorded maintenance procedures with all complementary information needed to understand the intervention. Users are requested to perform the maintenance intervention while trying to follow the procedure. Users can be evaluated on the level of knowledge achieved, and instructors can check the evaluation records left during the training sessions. VIRMAN is simple software that runs on a regular computer and can be used in an Internet framework. STARMATE is a step forward in the area of virtual reality. STARMATE is a European Commission project in the frame of 'Information Societies Technologies'. A consortium of five companies and one research institute shares their expertise in this new technology. STARMATE provides two main functionalities: (1) user assistance for achieving assembly/disassembly and following maintenance procedures, and (2) workforce training. The project relies on Augmented Reality techniques, a growing area in Virtual Reality research. The idea of Augmented Reality is to combine a real scene, viewed by the user, with a virtual scene, generated by a computer, augmenting the reality with additional information. The user interface consists of see-through goggles, headphones, a microphone and an optical tracking system. All these devices are integrated in a helmet connected to two regular computers. The user's hands are free for performing the maintenance intervention, and he can navigate in the virtual world thanks to a voice recognition system and a virtual pointing device. The maintenance work is guided with audio instructions, and 2D and 3D information is displayed directly in the user's goggles. A position-tracking system allows 3D virtual models to be displayed in the positions of their real counterparts independently of the user's location. The user can create his own virtual environment, placing the required information wherever he wants. The STARMATE system is applicable to a large variety of real work situations. (author)
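
    The registration step described above, displaying 3D virtual models at the positions of their real counterparts, reduces to projecting model points into the user's view with the pose reported by the tracking system. A minimal pinhole-camera sketch of that step; the intrinsics, pose and anchor point are made-up values, not STARMATE's:

      import numpy as np

      def project(points_world, R, t, K):
          """Project 3D world points into pixel coordinates for a tracked pose."""
          cam = R @ points_world.T + t[:, None]   # world -> camera frame
          uvw = K @ cam                           # camera -> image plane
          return (uvw[:2] / uvw[2]).T             # perspective divide

      K = np.array([[800.0, 0, 320], [0, 800.0, 240], [0, 0, 1]])  # illustrative intrinsics
      R = np.eye(3)                          # tracked head orientation
      t = np.array([0.0, 0.0, 0.0])          # tracked head position
      anchor = np.array([[0.1, -0.05, 1.5]])  # virtual label position in metres
      print(project(anchor, R, t, K))        # pixel where the annotation is drawn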

  12. The Secret Life of Quarks, Final Report for the University of North Carolina at Chapel Hill

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fowler, Robert J.

    This final report summarizes activities and results at the University of North Carolina as part of the SciDAC-2 Project The Secret Life of Quarks: National Computational Infrastructure for Lattice Quantum Chromodynamics. The overall objective of the project is to construct the software needed to study quantum chromodynamics (QCD), the theory of the strong interactions of subatomic physics, and similar strongly coupled gauge theories anticipated to be of importance in the LHC era. It built upon the successful efforts of the SciDAC-1 project National Computational Infrastructure for Lattice Gauge Theory, in which a QCD Applications Programming Interface (QCD API) was developed that enables lattice gauge theorists to make effective use of a wide variety of massively parallel computers. In the SciDAC-2 project, optimized versions of the QCD API were created for the IBM BlueGene/L (BG/L) and BlueGene/P (BG/P), the Cray XT3/XT4 and its successors, and clusters based on multi-core processors and Infiniband communications networks. The QCD API is being used to enhance the performance of the major QCD community codes and to create new applications. Software libraries of physics tools have been expanded to contain sharable building blocks for inclusion in application codes, performance analysis and visualization tools, and software for automation of physics work flow. New software tools were designed for managing the large data sets generated in lattice QCD simulations, and for sharing them through the International Lattice Data Grid consortium. As part of the overall project, researchers at UNC were funded through ASCR to work in three general areas. The main thrust has been performance instrumentation and analysis in support of the SciDAC QCD code base as it evolved and as it moved to new computation platforms. In support of the performance activities, performance data was collected in a database for the purpose of broader analysis. Third, the UNC work was done at RENCI (Renaissance Computing Institute), which has extensive expertise and facilities for scientific data visualization, so we acted in an ongoing consulting and support role in that area.

  13. Projects Using a Computer Algebra System in First-Year Undergraduate Mathematics

    ERIC Educational Resources Information Center

    Rosenzweig, Martin

    2007-01-01

    This paper illustrates the use of computer-based projects in two one-semester first-year undergraduate mathematics classes. Developed over a period of years, the approach is one in which the classes are organised into work-groups, with computer-based projects being undertaken periodically to illustrate the class material. These projects are…

  14. Enabling Extreme Scale Earth Science Applications at the Oak Ridge Leadership Computing Facility

    NASA Astrophysics Data System (ADS)

    Anantharaj, V. G.; Mozdzynski, G.; Hamrud, M.; Deconinck, W.; Smith, L.; Hack, J.

    2014-12-01

    The Oak Ridge Leadership Computing Facility (OLCF), established at the Oak Ridge National Laboratory (ORNL) under the auspices of the U.S. Department of Energy (DOE), welcomes investigators from universities, government agencies, national laboratories and industry who are prepared to perform breakthrough research across a broad domain of scientific disciplines, including earth and space sciences. Titan, the OLCF flagship system, is currently listed as #2 in the Top500 list of supercomputers in the world, and is the largest available for open science. The computational resources are allocated primarily via the Innovative and Novel Computational Impact on Theory and Experiment (INCITE) program, sponsored by the U.S. DOE Office of Science. In 2014, over 2.25 billion core hours on Titan were awarded via INCITE projects, including 14% of the allocation toward earth sciences. The INCITE competition is also open to research scientists based outside the USA. In fact, international research projects account for 12% of the INCITE awards in 2014. The INCITE scientific review panel also includes 20% participation from international experts. Recent accomplishments in earth sciences at OLCF include the world's first continuous simulation of 21,000 years of earth's climate history (2009); and an unprecedented simulation of a magnitude 8 earthquake over 125 sq. miles. One of the ongoing international projects involves scaling the ECMWF Integrated Forecasting System (IFS) model to over 200K cores of Titan. ECMWF is a partner in the EU funded Collaborative Research into Exascale Systemware, Tools and Applications (CRESTA) project. The significance of the research carried out within this project is the demonstration of techniques required to scale current generation Petascale capable simulation codes towards the performance levels required for running on future Exascale systems. One of the techniques pursued by ECMWF is to use Fortran2008 coarrays to overlap computations and communications and to reduce the total volume of data communicated. Use of Titan has enabled ECMWF to plan future scalability developments and resource requirements. We will also discuss the best practices developed over the years in navigating logistical, legal and regulatory hurdles involved in supporting the facility's diverse user community.

  15. ComputerTown. A Do-It-Yourself Community Computer Project.

    ERIC Educational Resources Information Center

    Loop, Liza; And Others

    This manual based on Menlo Park's experiences in participating in a nationwide experimental computer literacy project provides guidelines for the development and management of a ComputerTown. This project, which was begun in September 1981 in the Menlo Park Public Library, concentrates on providing public access to microcomputers through hands-on…

  16. Constitutive Modeling of the Thermomechanical Behavior of Rock Salt

    NASA Astrophysics Data System (ADS)

    Hampel, A.

    2016-12-01

    For the safe disposal of heat-generating high-level radioactive waste in rock salt formations, highly reliable numerical simulations of the thermomechanical and hydraulic behavior of the host rock have to be performed. Today, the huge progress in computer technology has enabled experts to calculate large and detailed computer models of underground repositories. However, the big advances in computer technology are only beneficial when the applied material models and modeling procedures also meet very high demands. They result from the fact that the evaluation of the long-term integrity of the geological barrier requires an extrapolation of a highly nonlinear deformation behavior to up to 1 million years, while the underlying experimental investigations in the laboratory or in situ have a duration of only days, weeks or at most some years. Several advanced constitutive models were developed and continuously improved to describe the dependences of various deformation phenomena in rock salt on in-situ relevant boundary conditions: transient and steady-state creep, evolution of damage and dilatancy in the DRZ, failure, post-failure behavior, residual strength, damage and dilatancy reduction, and healing. In a joint project series between 2004 and 2016, fundamental features of the advanced models were investigated and compared in detail with benchmark calculations. The study included procedures for the determination of characteristic salt-type-specific model parameter values and for the performance of numerical calculations of underground structures. Based on the results of this work and on specific laboratory investigations, the rock mechanical modeling is currently developed further in a common research project of experts from Germany and the United States. In this presentation, an overview about the work and results of the project series is given and the current joint research project WEIMOS is introduced.

  17. Modular Bundle Adjustment for Photogrammetric Computations

    NASA Astrophysics Data System (ADS)

    Börlin, N.; Murtiyoso, A.; Grussenmeyer, P.; Menna, F.; Nocerino, E.

    2018-05-01

    In this paper we investigate how the residuals in bundle adjustment can be split into a composition of simple functions. According to the chain rule, the Jacobian (linearisation) of the residual can be formed as a product of the Jacobians of the individual steps. When implemented, this enables a modularisation of the computation of the bundle adjustment residuals and Jacobians where each component has limited responsibility. This enables simple replacement of components to implement, for example, different projection or rotation models by exchanging a module. The technique has previously been used to implement bundle adjustment in the open-source package DBAT (Börlin and Grussenmeyer, 2013) based on the Photogrammetric and Computer Vision interpretations of the Brown (1971) lens distortion model. In this paper, we applied the technique to investigate how affine distortions can be used to model the projection of a tilt-shift lens. Two extended distortion models were implemented to test the hypothesis that the ordering of the affine and lens distortion steps can be changed to reduce the size of the residuals of a tilt-shift lens calibration. Results on synthetic data confirm that the ordering of the affine and lens distortion steps matters and is detectable by DBAT. However, when applied to a real camera calibration data set of a tilt-shift lens, no difference between the extended models was seen. This suggests that the tested hypothesis is false and that other effects need to be modelled to better explain the projection. The relatively low implementation effort that was needed to generate the models suggests that the technique can be used to investigate other novel projection models in photogrammetry, including modelling changes in the 3D geometry to better understand the tilt-shift lens.
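
    The modularisation rests on the chain rule: if a residual is a composition r(x) = f(g(x)), its Jacobian is the product of the per-step Jacobians, Jr(x) = Jf(g(x)) Jg(x). A small numeric check of that identity; the two steps below are toy functions, not DBAT's projection modules:

      import numpy as np

      def g(x):   # step 1: a toy nonlinear map
          return np.array([np.sin(x[0]) + x[1], x[0] * x[1]])

      def Jg(x):  # its analytic Jacobian
          return np.array([[np.cos(x[0]), 1.0], [x[1], x[0]]])

      def f(y):   # step 2: a toy "projection-like" map
          return np.array([y[0] / (1.0 + y[1] ** 2)])

      def Jf(y):
          d = 1.0 + y[1] ** 2
          return np.array([[1.0 / d, -2.0 * y[0] * y[1] / d ** 2]])

      x = np.array([0.3, 0.7])
      chain = Jf(g(x)) @ Jg(x)   # product of per-step Jacobians

      eps = 1e-6                 # central-difference Jacobian of the composition
      numeric = np.array([(f(g(x + eps * e)) - f(g(x - eps * e))) / (2 * eps)
                          for e in np.eye(2)]).T
      print(np.allclose(chain, numeric, atol=1e-6))  # True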

  18. Scidac-Data: Enabling Data Driven Modeling of Exascale Computing

    DOE PAGES

    Mubarak, Misbah; Ding, Pengfei; Aliaga, Leo; ...

    2017-11-23

    Here, the SciDAC-Data project is a DOE-funded initiative to analyze and exploit two decades of information and analytics that have been collected by the Fermilab data center on the organization, movement, and consumption of high energy physics (HEP) data. The project analyzes the analysis patterns and data organization that have been used by NOvA, MicroBooNE, MINERvA, CDF, D0, and other experiments to develop realistic models of HEP analysis workflows and data processing. The SciDAC-Data project aims to provide both realistic input vectors and corresponding output data that can be used to optimize and validate simulations of HEP analysis. These simulations are designed to address questions of data handling, cache optimization, and workflow structures that are the prerequisites for modern HEP analysis chains to be mapped and optimized to run on the next generation of leadership-class exascale computing facilities. We present the use of a subset of the SciDAC-Data distributions, acquired from analysis of approximately 71,000 HEP workflows run on the Fermilab data center and corresponding to over 9 million individual analysis jobs, as the input to detailed queuing simulations that model the expected data consumption and caching behaviors of the work running in high performance computing (HPC) and high throughput computing (HTC) environments. In particular we describe how the Sequential Access via Metadata (SAM) data-handling system in combination with the dCache/Enstore-based data archive facilities has been used to develop radically different models for analyzing the HEP data. We also show how the simulations may be used to assess the impact of design choices in archive facilities.
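
    The caching behaviour that such queuing simulations model can be illustrated with a minimal LRU cache replayed against a synthetic access trace. The Zipf-like workload below is an invented stand-in for the SciDAC-Data distributions, not the Fermilab trace itself:

      import random
      from collections import OrderedDict

      class LRUCache:
          """Least-recently-used cache holding a fixed number of files."""
          def __init__(self, capacity):
              self.capacity, self.store = capacity, OrderedDict()
          def access(self, key):
              hit = key in self.store
              if hit:
                  self.store.move_to_end(key)
              else:
                  self.store[key] = True
                  if len(self.store) > self.capacity:
                      self.store.popitem(last=False)  # evict the coldest file
              return hit

      # Synthetic Zipf-like workload: a few hot datasets dominate accesses.
      random.seed(42)
      files = [f"dataset_{i}" for i in range(10_000)]
      weights = [1.0 / (i + 1) for i in range(len(files))]
      trace = random.choices(files, weights=weights, k=100_000)

      cache = LRUCache(capacity=1_000)
      hits = sum(cache.access(f) for f in trace)
      print(f"hit rate: {hits / len(trace):.1%}")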

  19. Scidac-Data: Enabling Data Driven Modeling of Exascale Computing

    NASA Astrophysics Data System (ADS)

    Mubarak, Misbah; Ding, Pengfei; Aliaga, Leo; Tsaris, Aristeidis; Norman, Andrew; Lyon, Adam; Ross, Robert

    2017-10-01

    The SciDAC-Data project is a DOE-funded initiative to analyze and exploit two decades of information and analytics that have been collected by the Fermilab data center on the organization, movement, and consumption of high energy physics (HEP) data. The project analyzes the analysis patterns and data organization that have been used by NOvA, MicroBooNE, MINERvA, CDF, D0, and other experiments to develop realistic models of HEP analysis workflows and data processing. The SciDAC-Data project aims to provide both realistic input vectors and corresponding output data that can be used to optimize and validate simulations of HEP analysis. These simulations are designed to address questions of data handling, cache optimization, and workflow structures that are the prerequisites for modern HEP analysis chains to be mapped and optimized to run on the next generation of leadership-class exascale computing facilities. We present the use of a subset of the SciDAC-Data distributions, acquired from analysis of approximately 71,000 HEP workflows run on the Fermilab data center and corresponding to over 9 million individual analysis jobs, as the input to detailed queuing simulations that model the expected data consumption and caching behaviors of the work running in high performance computing (HPC) and high throughput computing (HTC) environments. In particular we describe how the Sequential Access via Metadata (SAM) data-handling system in combination with the dCache/Enstore-based data archive facilities has been used to develop radically different models for analyzing the HEP data. We also show how the simulations may be used to assess the impact of design choices in archive facilities.

  20. Integrated Computational Materials Engineering Development of Advanced High Strength Steel for Lightweight Vehicles

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hector, Jr., Louis G.; McCarty, Eric D.

    The goal of the ICME 3GAHSS project was to successfully demonstrate the applicability of Integrated Computational Materials Engineering (ICME) for the development and deployment of third generation advanced high strength steels (3GAHSS) for immediate weight reduction in passenger vehicles. The ICME approach integrated results from well-established computational and experimental methodologies to develop a suite of material constitutive models (deformation and failure), manufacturing process and performance simulation modules, a properties database, as well as the computational environment linking them together for both performance prediction and material optimization. This is the Final Report for the ICME 3GAHSS project, which achieved the following objectives: 1) Developed a 3GAHSS ICME model, which includes atomistic, crystal plasticity, state variable and forming models. The 3GAHSS model was implemented in commercially available LS-DYNA and a user guide was developed to facilitate use of the model. 2) Developed and produced two 3GAHSS alloys using two different chemistries and manufacturing processes, for use in calibrating and validating the 3GAHSS ICME Model. 3) Optimized the design of an automotive subassembly by substituting 3GAHSS for AHSS yielding a design that met or exceeded all baseline performance requirements with a 30% mass savings. A technical cost model was also developed to estimate the cost per pound of weight saved when substituting 3GAHSS for AHSS. The project demonstrated the potential for 3GAHSS to achieve up to 30% weight savings in an automotive structure at a cost penalty of $0.32 to $1.26 per pound of weight saved. The 3GAHSS ICME Model enables the user to design 3GAHSS to desired mechanical properties in terms of strength and ductility.
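
    The headline metric, cost per pound of weight saved, is a simple ratio of added part cost to mass removed. A toy illustration with invented numbers (the project's technical cost model is far more detailed):

      def cost_per_pound_saved(baseline_cost, new_cost,
                               baseline_mass_lb, mass_saving_fraction):
          """Added cost divided by the pounds of mass removed."""
          pounds_saved = baseline_mass_lb * mass_saving_fraction
          return (new_cost - baseline_cost) / pounds_saved

      # Invented subassembly numbers, not the project's cost model inputs.
      print(f"${cost_per_pound_saved(100.0, 112.0, 40.0, 0.30):.2f} per lb saved")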

  1. Scidac-Data: Enabling Data Driven Modeling of Exascale Computing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mubarak, Misbah; Ding, Pengfei; Aliaga, Leo

    Here, the SciDAC-Data project is a DOE-funded initiative to analyze and exploit two decades of information and analytics that have been collected by the Fermilab data center on the organization, movement, and consumption of high energy physics (HEP) data. The project analyzes the analysis patterns and data organization that have been used by NOvA, MicroBooNE, MINERvA, CDF, D0, and other experiments to develop realistic models of HEP analysis workflows and data processing. The SciDAC-Data project aims to provide both realistic input vectors and corresponding output data that can be used to optimize and validate simulations of HEP analysis. These simulations are designed to address questions of data handling, cache optimization, and workflow structures that are the prerequisites for modern HEP analysis chains to be mapped and optimized to run on the next generation of leadership-class exascale computing facilities. We present the use of a subset of the SciDAC-Data distributions, acquired from analysis of approximately 71,000 HEP workflows run on the Fermilab data center and corresponding to over 9 million individual analysis jobs, as the input to detailed queuing simulations that model the expected data consumption and caching behaviors of the work running in high performance computing (HPC) and high throughput computing (HTC) environments. In particular we describe how the Sequential Access via Metadata (SAM) data-handling system in combination with the dCache/Enstore-based data archive facilities has been used to develop radically different models for analyzing the HEP data. We also show how the simulations may be used to assess the impact of design choices in archive facilities.

  2. "First generation" automated DNA sequencing technology.

    PubMed

    Slatko, Barton E; Kieleczawa, Jan; Ju, Jingyue; Gardner, Andrew F; Hendrickson, Cynthia L; Ausubel, Frederick M

    2011-10-01

    Beginning in the 1980s, automation of DNA sequencing has greatly increased throughput, reduced costs, and enabled large projects to be completed more easily. The development of automation technology paralleled the development of other aspects of DNA sequencing: better enzymes and chemistry, separation and imaging technology, sequencing protocols, robotics, and computational advancements (including base-calling algorithms with quality scores, database developments, and sequence analysis programs). Despite the emergence of high-throughput sequencing platforms, automated Sanger sequencing technology remains useful for many applications. This unit provides background and a description of the "First-Generation" automated DNA sequencing technology. It also includes protocols for using the current Applied Biosystems (ABI) automated DNA sequencing machines. © 2011 by John Wiley & Sons, Inc.

  3. SDN-NGenIA, a software defined next generation integrated architecture for HEP and data intensive science

    NASA Astrophysics Data System (ADS)

    Balcas, J.; Hendricks, T. W.; Kcira, D.; Mughal, A.; Newman, H.; Spiropulu, M.; Vlimant, J. R.

    2017-10-01

    The SDN Next Generation Integrated Architecture (SDN-NGenIA) project addresses some of the key challenges facing the present and next generations of science programs in HEP, astrophysics, and other fields, whose potential discoveries depend on their ability to distribute, process and analyze globally distributed Petascale to Exascale datasets. The SDN-NGenIA system under development by Caltech and partner HEP and network teams is focused on the coordinated use of network, computing and storage infrastructures. It builds on the experience gained in previous and recently completed projects that use dynamic circuits with bandwidth guarantees to support major network flows, as demonstrated across the LHC Open Network Environment [1] and in large-scale demonstrations over the last three years, and recently integrated with the PhEDEx and Asynchronous Stage Out data management applications of the CMS experiment at the Large Hadron Collider. In addition to the general program goals of supporting the network needs of the LHC and other science programs with similar needs, a recent focus is the use of the Leadership HPC facility at Argonne National Lab (ALCF) for data intensive applications.

  4. High-Tech Opens Doors.

    ERIC Educational Resources Information Center

    Eichleay, Kristen; Pressman, Harvey

    1987-01-01

    Exemplary projects which help disabled people use technology (particularly computers) expand their employment opportunities include: Project Entry (Seattle); Georgia Computer Programmer Project (Atlanta); Perkins Project with Industry (Watertown, Massachusetts); Project Byte (Newton, Massachusetts); Technology Relevant to You (St. Louis); Special…

  5. Stereoscopic Machine-Vision System Using Projected Circles

    NASA Technical Reports Server (NTRS)

    Mackey, Jeffrey R.

    2010-01-01

    A machine-vision system capable of detecting obstacles large enough to damage or trap a robotic vehicle is undergoing development. The system includes (1) a pattern generator that projects concentric circles of laser light forward onto the terrain, (2) a stereoscopic pair of cameras that are aimed forward to acquire images of the circles, (3) a frame grabber and digitizer for acquiring image data from the cameras, and (4) a single-board computer that processes the data. The system is being developed as a prototype of machine-vision systems to enable robotic vehicles (rovers) on remote planets to avoid craters, large rocks, and other terrain features that could capture or damage the vehicles. Potential terrestrial applications of systems like this one could include terrain mapping, collision avoidance, navigation of robotic vehicles, mining, and robotic rescue. This system is based partly on the same principles as those of a prior stereoscopic machine-vision system in which the cameras acquire images of a single stripe of laser light that is swept forward across the terrain. However, this system is designed to afford improvements over some of the undesirable features of the prior system, including the need for a pan-and-tilt mechanism to aim the laser to generate the swept stripe, ambiguities in interpretation of the single-stripe image, the time needed to sweep the stripe across the terrain and process the data from many images acquired during that time, and difficulty of calibration because of the narrowness of the stripe. In this system, the pattern generator does not contain any moving parts and need not be mounted on a pan-and-tilt mechanism: the pattern of concentric circles is projected steadily in the forward direction. The system calibrates itself by use of data acquired during projection of the concentric-circle pattern onto a known target representing flat ground. The calibration-target image data are stored in the computer memory for use as a template in processing terrain images. During operation on terrain, the images acquired by the left and right cameras are analyzed. The analysis includes (1) computation of the horizontal and vertical dimensions and the aspect ratios of rectangles that bound the circle images and (2) comparison of these aspect ratios with those of the template. Coordinates of distortions of the circles are used to identify and locate objects. If the analysis leads to identification of an object of significant size, then stereoscopic-vision algorithms are used to estimate the distance to the object. The time taken in performing this analysis on a single pair of images acquired by the left and right cameras in this system is a fraction of the time taken in processing the many pairs of images acquired in a sweep of the laser stripe across the field of view in the prior system. The results of the analysis include data on sizes and shapes of, and distances and directions to, objects. Coordinates of objects are updated as the vehicle moves so that intelligent decisions regarding speed and direction can be made. The results of the analysis are utilized in a computational decision-making process that generates obstacle-avoidance data and feeds those data to the control system of the robotic vehicle.
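
    The two analysis steps, bounding-rectangle aspect ratios compared against a flat-ground template, followed by a stereoscopic range estimate, can be sketched compactly. All numbers below (template ratio, ellipse, disparity, focal length, baseline) are invented for illustration:

      import numpy as np

      def aspect_ratio(circle_pixels):
          """Aspect ratio of the rectangle bounding a detected circle (N x 2 pixels)."""
          spans = circle_pixels.max(axis=0) - circle_pixels.min(axis=0)
          return spans[0] / spans[1]  # width / height

      def stereo_depth(disparity_px, focal_px, baseline_m):
          """Classic stereo range estimate: Z = f * B / d."""
          return focal_px * baseline_m / disparity_px

      # A circle foreshortened by an obstacle deviates from the template ratio.
      template_ratio = 1.8  # stored from the flat-ground calibration target
      theta = np.linspace(0, 2 * np.pi, 360)
      observed = np.column_stack([90 * np.cos(theta), 35 * np.sin(theta)])
      if abs(aspect_ratio(observed) - template_ratio) > 0.2:
          print(f"object candidate at ~{stereo_depth(12.0, 700.0, 0.3):.1f} m")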

  6. Evaluation of simulated ocean carbon in the CMIP5 earth system models

    NASA Astrophysics Data System (ADS)

    Orr, James; Brockmann, Patrick; Seferian, Roland; Servonnat, Jérôme; Bopp, Laurent

    2013-04-01

    We maintain a centralized model output archive containing output from the previous generation of Earth System Models (ESMs), 7 models used in the IPCC AR4 assessment. Output is in a common format located on a centralized server and is publicly available through a web interface. Through the same interface, LSCE/IPSL has also made available output from the Coupled Model Intercomparison Project (CMIP5), the foundation for the ongoing IPCC AR5 assessment. The latter includes ocean biogeochemical fields from more than 13 ESMs. Modeling partners across 3 EU projects refer to the combined AR4-AR5 archive and comparison as OCMIP5, building on previous phases of OCMIP (Ocean Carbon Cycle Intercomparison Project) and making a clear link to IPCC AR5 (CMIP5). While now focusing on assessing the latest generation of results (AR5, CMIP5), this effort is also able to put them in context (AR4). For model comparison and evaluation, we have also stored computed derived variables (e.g., those needed to assess ocean acidification) and key fields regridded to a common 1°x1° grid, thus complementing the standard CMIP5 archive. The combined AR4-AR5 output (OCMIP5) has been used to compute standard quantitative metrics, both global and regional, and those have been synthesized with summary diagrams. In addition, for key biogeochemical fields we have deconvolved spatiotemporal components of the mean square error in order to constrain which models go wrong where. Here we will detail results from these evaluations, which have exploited gridded climatological data. The archive, interface, and centralized evaluation provide a solid technical foundation, upon which collaboration and communication are being broadened in the ocean biogeochemical modeling community. Ultimately we aim to encourage wider use of the OCMIP5 archive.
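
    One standard way to deconvolve the mean square error between a model field m and observations o is the decomposition MSE = (mean(m) - mean(o))^2 + (sigma_m - sigma_o)^2 + 2 sigma_m sigma_o (1 - r), separating bias, amplitude error and phase (correlation) error. A numpy sketch under that assumption (this is a textbook decomposition; the paper's exact formulation may differ):

      import numpy as np

      def mse_components(model, obs):
          """Split MSE into bias^2, amplitude-error and phase-error terms."""
          bias2 = (model.mean() - obs.mean()) ** 2
          sm, so = model.std(), obs.std()
          r = np.corrcoef(model, obs)[0, 1]
          return bias2, (sm - so) ** 2, 2 * sm * so * (1 - r)

      rng = np.random.default_rng(3)
      obs = rng.normal(2.0, 1.0, 500)                # synthetic "climatology"
      model = 0.8 * obs + rng.normal(0.3, 0.5, 500)  # an imperfect model field
      b2, amp, ph = mse_components(model, obs)
      print(f"MSE={np.mean((model - obs) ** 2):.3f} = {b2:.3f} + {amp:.3f} + {ph:.3f}")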

  7. Climate Science's Globally Distributed Infrastructure

    NASA Astrophysics Data System (ADS)

    Williams, D. N.

    2016-12-01

    The Earth System Grid Federation (ESGF) is primarily funded by the Department of Energy's (DOE's) Office of Science (the Office of Biological and Environmental Research [BER] Climate Data Informatics Program and the Office of Advanced Scientific Computing Research Next Generation Network for Science Program), the National Oceanic and Atmospheric Administration (NOAA), the National Aeronautics and Space Administration (NASA), and the National Science Foundation (NSF), the European Infrastructure for the European Network for Earth System Modeling (IS-ENES), and the Australian National University (ANU). Support also comes from other U.S. federal and international agencies. The federation works across multiple worldwide data centers and spans seven international network organizations to provide users with the ability to access, analyze, and visualize data using a globally federated collection of networks, computers, and software. Its architecture employs a series of geographically distributed peer nodes that are independently administered and united by common federation protocols and application programming interfaces (APIs). The full ESGF infrastructure has now been adopted by multiple Earth science projects and allows access to petabytes of geophysical data, including the Coupled Model Intercomparison Project (CMIP; output used by the Intergovernmental Panel on Climate Change assessment reports), multiple model intercomparison projects (MIPs; endorsed by the World Climate Research Programme [WCRP]), and the Accelerated Climate Modeling for Energy (ACME; ESGF is included in the overarching ACME workflow process to store model output). ESGF is a successful example of integration of disparate open-source technologies into a cohesive functional system that serves the needs of the global climate science community. Data served by ESGF includes not only model output but also observational data from satellites and instruments, reanalysis, and generated images.

  8. NOTE: Wobbled splatting—a fast perspective volume rendering method for simulation of x-ray images from CT

    NASA Astrophysics Data System (ADS)

    Birkfellner, Wolfgang; Seemann, Rudolf; Figl, Michael; Hummel, Johann; Ede, Christopher; Homolka, Peter; Yang, Xinhui; Niederer, Peter; Bergmann, Helmar

    2005-05-01

    3D/2D registration, the automatic assignment of a global rigid-body transformation matching the coordinate systems of patient and preoperative volume scan using projection images, is an important topic in image-guided therapy and radiation oncology. A crucial part of most 3D/2D registration algorithms is the fast computation of digitally rendered radiographs (DRRs) to be compared iteratively to radiographs or portal images. Since registration is an iterative process, fast generation of DRRs—which are perspective summed voxel renderings—is desired. In this note, we present a simple and rapid method for generation of DRRs based on splat rendering. As opposed to conventional splatting, antialiasing of the resulting images is not achieved by means of computing a discrete point spread function (a so-called footprint), but by stochastic distortion of either the voxel positions in the volume scan or by the simulation of a focal spot of the x-ray tube with non-zero diameter. Our method generates slightly blurred DRRs suitable for registration purposes at framerates of approximately 10 Hz when rendering volume images with a size of 30 MB.
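
    The note's central idea, replacing a footprint convolution with stochastic jitter of voxel positions before perspective projection, can be sketched as follows: each voxel is perturbed by a small random offset, projected, and its intensity summed into the image. The geometry and toy volume below are invented for illustration, not the paper's setup:

      import numpy as np

      def wobbled_splat_drr(voxels_xyz, intensities, focal, image_hw,
                            jitter_sigma=0.5, seed=0):
          """Perspective summed-voxel rendering with stochastic position jitter."""
          rng = np.random.default_rng(seed)
          pts = voxels_xyz + rng.normal(0.0, jitter_sigma, voxels_xyz.shape)  # wobble
          u = focal * pts[:, 0] / pts[:, 2] + image_hw[1] / 2  # perspective projection
          v = focal * pts[:, 1] / pts[:, 2] + image_hw[0] / 2
          img = np.zeros(image_hw)
          ui, vi = u.astype(int), v.astype(int)
          ok = (ui >= 0) & (ui < image_hw[1]) & (vi >= 0) & (vi < image_hw[0])
          np.add.at(img, (vi[ok], ui[ok]), intensities[ok])    # accumulate splats
          return img

      # Toy volume: a random blob of attenuating voxels 200 mm from the source.
      rng = np.random.default_rng(1)
      vox = rng.normal([0, 0, 200], [20, 20, 20], size=(50_000, 3))
      drr = wobbled_splat_drr(vox, np.ones(len(vox)), focal=500.0, image_hw=(480, 640))
      print(drr.sum(), drr.max())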

  9. PointCom: semi-autonomous UGV control with intuitive interface

    NASA Astrophysics Data System (ADS)

    Rohde, Mitchell M.; Perlin, Victor E.; Iagnemma, Karl D.; Lupa, Robert M.; Rohde, Steven M.; Overholt, James; Fiorani, Graham

    2008-04-01

    Unmanned ground vehicles (UGVs) will play an important role in the nation's next-generation ground force. Advances in sensing, control, and computing have enabled a new generation of technologies that bridge the gap between manual UGV teleoperation and full autonomy. In this paper, we present current research on a unique command and control system for UGVs named PointCom (Point-and-Go Command). PointCom is a semi-autonomous command system for one or multiple UGVs. The system, when complete, will be easy to operate and will enable significant reduction in operator workload by utilizing an intuitive image-based control framework for UGV navigation and allowing a single operator to command multiple UGVs. The project leverages new image processing algorithms for monocular visual servoing and odometry to yield a unique, high-performance fused navigation system. Human Computer Interface (HCI) techniques from the entertainment software industry are being used to develop video-game style interfaces that require little training and build upon the navigation capabilities. By combining an advanced navigation system with an intuitive interface, a semi-autonomous control and navigation system is being created that is robust, user friendly, and less burdensome than many current generation systems.

  10. The influence of a game-making project on male and female learners' attitudes to computing

    NASA Astrophysics Data System (ADS)

    Robertson, Judy

    2013-03-01

    There is a pressing need for gender inclusive approaches to engage young people in computer science. A recent popular approach has been to harness learners' enthusiasm for computer games to motivate them to learn computer science concepts through game authoring. This article describes a study in which 992 learners across 13 schools took part in a game-making project. It provides evidence from 225 pre-test and post-test questionnaires on how learners' attitudes to computing changed during the project, as well as qualitative reflections from the class teachers on how the project affected their learners. Results indicate that girls did not enjoy the experience as much as boys, and that in fact, the project may make pupils less inclined to study computing in the future. This has important implications for future efforts to engage young people in computing.

  11. Yes, I Can: Action Projects To Resolve Equity Issues in Educational Computing. A Project of ECCO, the Educational Computer Consortium of Ohio.

    ERIC Educational Resources Information Center

    Fredman, Alice, Ed.

    This book presents reports on selected "local action" projects that were developed as part of the Equity in Technology Project, which was inaugurated in 1985 by the Educational Computer Consortium of Ohio (ECCO). The book is organized into three sections, one for each of the populations targeted by the project. An introduction by Alice Fredman…

  12. NASA Game Changing Development Program Manufacturing Innovation Project

    NASA Technical Reports Server (NTRS)

    Tolbert, Carol; Vickers, John

    2011-01-01

    This presentation examines the new NASA Manufacturing Innovation Project. The project is a part of the Game Changing Development Program which is one element of the Space Technology Programs managed by the Office of the Chief Technologist. The project includes innovative technologies in model-based manufacturing, digital additive manufacturing, and other next generation manufacturing tools. The project is also coupled with the larger federal initiatives in this area including the National Digital Engineering and Manufacturing Initiative and the Advanced Manufacturing Partnership. In addition to NASA, other interagency partners include the Department of Defense, Department of Commerce, NIST, Department of Energy, and the National Science Foundation. The development of game-changing manufacturing technologies is critical for NASA's mission of exploration, strengthening America's manufacturing competitiveness, and is highly related to current challenges in defense manufacturing activities. There is strong consensus across industry, academia, and government that the future competitiveness of U.S. industry will be determined, in large part, by a technologically advanced manufacturing sector. This presentation highlights the promise of next generation manufacturing technologies in addressing the challenges faced by NASA and by the Department of Defense. The project focuses on maturing innovative/high payoff model-based manufacturing technologies that may lead to entirely new approaches for a broad array of future NASA missions and solutions to significant national needs. Digital manufacturing and computer-integrated manufacturing "virtually" guarantee advantages in quality, speed, and cost and offer many long-term benefits across the entire product lifecycle. This paper addresses key enablers and emerging strategies in areas such as: current government initiatives, model-based manufacturing, and additive manufacturing.

  13. Photonic Interrogation and Control of Nano Processes

    NASA Technical Reports Server (NTRS)

    Jassemnejad, Baha

    2003-01-01

    My research activities for the summer of 2003 consisted of two projects: One project was concerned with determining a method for predicting the static and dynamic assembly properties of nano-structures using laser tweezers. The other project was to investigate the generation of Laguerre-Gaussian modes using a spatial light modulator incorporated into an optical tweezers system. Concerning the first project, I initially pursued the approach suggested by my NASA colleague Dr. Art Decker. This approach involved mimicking the model of the structure of the atomic nucleus for the assembly of 1 to 100 atoms using allowed quadrupole transitions induced by the orbital angular momentum of a Laguerre-Gaussian (doughnut) laser mode. After realizing the inaptness of the nuclear model for the nanostructure model as far as the binding forces and transitions were concerned, I focused on using a quantum dot model. This model was not suitable either, because the host lattice influences the electronic structure of the quantum dot. Thus one other option that I decided to pursue was the approach of molecular quantum mechanics. In this approach the nanostructure is treated as a large (10-100 nm) molecule constructed from a single element or multiple elements. Subsequent to consultation with Dr. Fred Morales, a chemical engineer at NASA GRC, and Dr. David Ball, a computational chemist at Cleveland State University, I acquired a molecular-quantum computation software package, Hyperchem 7.0. This software allows simulation of the static and dynamic behavior of different molecular structures. The time that I spent on this project was about eight weeks. Once this suitable approach was identified, I realized the need to collaborate with a computational quantum chemist to pursue searching for stable nanostructures in the range of 10-100 nm that can be assembled using laser tweezers. The second project was about generating laser tweezers that possess orbital angular momentum. As shown, we were able to generate laser tweezers modes of different orbital angular momentum using a spatial light modulator incorporated into a laser tweezers system. The motivation for investigating these types of modes stems from being able to spin particles at high speeds and also to orient two particles in separate traps and then join them together. Also, there has been recent intense interest in fundamental physics research on the orbital angular momentum of light. The fact that circularly polarized light may have associated with it angular momentum that relates to the spin of individual photons (spin 0 for plane-polarized light, spin +1 for right-circularly polarized light and spin -1 for left-circularly polarized light) was first demonstrated by Beth in 1936. Orbital angular momentum is, however, distinct from spin in that the spin angular momentum of light is intrinsically linked to the behavior of the electric field in the light, whereas orbital angular momentum is a consequence of inclined wavefronts. In 1992, L. Allen et al. showed that the Laguerre-Gaussian (LG) modes could possess well-defined orbital angular momentum exceeding one unit of the reduced Planck constant, i.e. l·ħ per photon, where l is the azimuthal index of the mode.
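
    The quantitative statement at the end can be written compactly. Following the distinction drawn above between the polarization (spin) and wavefront (orbital) contributions, the angular momentum per photon is, as a sketch in standard notation:

      \[
        S_z = \sigma\hbar, \qquad \sigma \in \{-1, 0, +1\}
        \quad \text{(spin, from polarization)}
      \]
      \[
        L_z = l\hbar, \qquad l = 0, \pm 1, \pm 2, \dots
        \quad \text{(orbital, from the helical wavefronts of an } LG_{p,l} \text{ mode)}
      \]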

  14. [Computer-supported patient history: a workplace analysis].

    PubMed

    Schubiger, G; Weber, D; Winiker, H; Desgrandchamps, D; Imahorn, P

    1995-04-29

    Since 1991, an extensive computer network has been developed and implemented at the Cantonal Hospital of Lucerne. The medical applications include computer aided management of patient charts, medical correspondence, and compilation of diagnosis statistics according to the ICD-9 code. In 1992, the system was introduced as a pilot project in the departments of pediatrics and pediatric surgery of the Lucerne Children's Hospital. This new system has been prospectively evaluated using a workplace analysis. The time taken to complete patient charts and surgical reports was recorded for 14 days before and after the introduction of the computerized system. This analysis was performed for both physicians and secretarial staff. The time delay between the discharge of the patient and the mailing of the discharge letter to the family doctor was also recorded. By conventional means, the average time for the physician to generate a patient chart (26 minutes, n = 119) was slightly lower than the time needed with the computer system (28 minutes, n = 177). However, for a discharge letter, the time needed by the physician was reduced by one third with the computer system and by more than one half for the secretarial staff (32 and 66 minutes conventionally; 22 and 24 minutes respectively with the computer system; p < 0.0001). The time required for the generation of surgical reports was reduced from 17 to 13 minutes per patient and the processing time by secretaries from 37 to 14 minutes. The time delay between the discharge of the patient and the mailing of the discharge letter was reduced by 50% from 7.6 to 3.9 days.(ABSTRACT TRUNCATED AT 250 WORDS)

  15. Human genomics projects and precision medicine.

    PubMed

    Carrasco-Ramiro, F; Peiró-Pastor, R; Aguado, B

    2017-09-01

    The completion of the Human Genome Project (HGP) in 2001 opened the floodgates to a deeper understanding of medicine. There are dozens of HGP-like projects currently in progress, involving from a few tens to several million genomes and ranging from specialized goals to more general approaches. However, data generation, storage, management and analysis in public and private cloud computing platforms have raised concerns about privacy and security. The knowledge gained from further research has changed the field of genomics and is now slowly permeating into clinical medicine. The new precision (personalized) medicine, where genome sequencing and data analysis are essential components, allows tailored diagnosis and treatment according to the information from the patient's own genome and specific environmental factors. P4 (predictive, preventive, personalized and participatory) medicine is introducing new concepts, challenges and opportunities. This review summarizes current sequencing technologies, concentrates on ongoing human genomics projects, and provides some examples in which precision medicine has already demonstrated clinical impact in diagnosis and/or treatment.

  16. Hardware Acceleration of Adaptive Neural Algorithms.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    James, Conrad D.

    As traditional numerical computing has faced challenges, researchers have turned towards alternative computing approaches to reduce power-per-computation metrics and improve algorithm performance. Here, we describe an approach towards non-conventional computing that strengthens the connection between machine learning and neuroscience concepts. The Hardware Acceleration of Adaptive Neural Algorithms (HAANA) project has developed neural machine learning algorithms and hardware for applications in image processing and cybersecurity. While machine learning methods are effective at extracting relevant features from many types of data, the effectiveness of these algorithms degrades when subjected to real-world conditions. Our team has generated novel neural-inspired approaches to improve the resiliency and adaptability of machine learning algorithms. In addition, we have also designed and fabricated hardware architectures and microelectronic devices specifically tuned towards the training and inference operations of neural-inspired algorithms. Finally, our multi-scale simulation framework allows us to assess the impact of microelectronic device properties on algorithm performance.

  17. Use of agents to implement an integrated computing environment

    NASA Technical Reports Server (NTRS)

    Hale, Mark A.; Craig, James I.

    1995-01-01

    Integrated Product and Process Development (IPPD) embodies the simultaneous application of both system and quality engineering methods throughout an iterative design process. The use of IPPD results in the time-conscious, cost-saving development of engineering systems. To implement IPPD, a Decision-Based Design perspective is encapsulated in an approach that focuses on the role of the human designer in product development. The approach has two parts and is outlined in this paper. First, an architecture, called DREAMS, is being developed that facilitates design from a decision-based perspective. Second, a supporting computing infrastructure, called IMAGE, is being designed. Agents are used to implement the overall infrastructure on the computer. Successful agent utilization requires that each agent be made of three components: the resource, the model, and the wrap. Current work is focused on the development of generalized agent schemes and associated demonstration projects. When in place, the technology-independent computing infrastructure will aid the designer in systematically generating the knowledge used to facilitate decision-making.
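
    The three-part agent anatomy named above (resource, model, wrap) suggests a simple composition. The sketch below is one hypothetical reading of that structure in Python, not the IMAGE implementation:

      from dataclasses import dataclass
      from typing import Any, Callable

      @dataclass
      class Agent:
          """An agent exposes a computational resource behind a common interface."""
          resource: Callable[..., Any]   # the underlying tool or analysis code
          model: dict                    # description of the inputs/outputs it expects
          wrap: Callable[[dict], dict]   # adapter between framework requests and resource

          def run(self, request):
              args = self.wrap(request)                       # translate the request
              result = self.resource(**args)                  # invoke the resource
              return {"output": result, "model": self.model}  # annotated reply

      # A trivial "analysis code" exposed as an agent.
      area_agent = Agent(
          resource=lambda width, height: width * height,
          model={"inputs": ["width", "height"], "output": "area"},
          wrap=lambda req: {"width": req["w"], "height": req["h"]},
      )
      print(area_agent.run({"w": 3.0, "h": 2.0}))  # {'output': 6.0, ...}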

  18. Conservation status of polar bears (Ursus maritimus) in relation to projected sea-ice declines.

    PubMed

    Regehr, Eric V; Laidre, Kristin L; Akçakaya, H Resit; Amstrup, Steven C; Atwood, Todd C; Lunn, Nicholas J; Obbard, Martyn; Stern, Harry; Thiemann, Gregory W; Wiig, Øystein

    2016-12-01

    Loss of Arctic sea ice owing to climate change is the primary threat to polar bears throughout their range. We evaluated the potential response of polar bears to sea-ice declines by (i) calculating generation length (GL) for the species, which determines the timeframe for conservation assessments; (ii) developing a standardized sea-ice metric representing important habitat; and (iii) using statistical models and computer simulation to project changes in the global population under three approaches relating polar bear abundance to sea ice. Mean GL was 11.5 years. Ice-covered days declined in all subpopulation areas during 1979-2014 (median -1.26 days year-1). The estimated probabilities that reductions in the mean global population size of polar bears will be greater than 30%, 50% and 80% over three generations (35-41 years) were 0.71 (range 0.20-0.95), 0.07 (range 0-0.35) and less than 0.01 (range 0-0.02), respectively. According to IUCN Red List reduction thresholds, which provide a common measure of extinction risk across taxa, these results are consistent with listing the species as vulnerable. Our findings support the potential for large declines in polar bear numbers owing to sea-ice loss, and highlight near-term uncertainty in statistical projections as well as the sensitivity of projections to different plausible assumptions. © 2016 The Authors.
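
    Probabilistic statements of this kind can be reproduced in spirit with a simple Monte Carlo: draw a per-generation growth multiplier from an assumed distribution, compound it over three generations, and count the runs whose total decline crosses each Red List threshold. The distribution below is invented for illustration, not the paper's fitted model:

      import numpy as np

      rng = np.random.default_rng(7)
      n_runs, generations = 100_000, 3

      # Hypothetical per-generation multiplier (mean ~15% decline, wide uncertainty).
      multipliers = rng.normal(0.85, 0.15, size=(n_runs, generations)).clip(0.01)
      final_fraction = multipliers.prod(axis=1)  # population after 3 generations

      for threshold in (0.30, 0.50, 0.80):
          p = np.mean(final_fraction < 1.0 - threshold)
          print(f"P(decline > {threshold:.0%}) = {p:.2f}")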

  19. Conservation status of polar bears (Ursus maritimus) in relation to projected sea-ice declines

    USGS Publications Warehouse

    Regehr, Eric V.; Laidre, Kristin L.; Akçakaya, H. Resit; Amstrup, Steven C.; Atwood, Todd C.; Lunn, Nicholas J.; Obbard, Martyn E.; Stern, Harry; Thiemann, Gregory W.; Wiig, Øystein

    2016-01-01

    Loss of Arctic sea ice owing to climate change is the primary threat to polar bears throughout their range. We evaluated the potential response of polar bears to sea-ice declines by (i) calculating generation length (GL) for the species, which determines the timeframe for conservation assessments; (ii) developing a standardized sea-ice metric representing important habitat; and (iii) using statistical models and computer simulation to project changes in the global population under three approaches relating polar bear abundance to sea ice. Mean GL was 11.5 years. Ice-covered days declined in all subpopulation areas during 1979–2014 (median −1.26 days year−1). The estimated probabilities that reductions in the mean global population size of polar bears will be greater than 30%, 50% and 80% over three generations (35–41 years) were 0.71 (range 0.20–0.95), 0.07 (range 0–0.35) and less than 0.01 (range 0–0.02), respectively. According to IUCN Red List reduction thresholds, which provide a common measure of extinction risk across taxa, these results are consistent with listing the species as vulnerable. Our findings support the potential for large declines in polar bear numbers owing to sea-ice loss, and highlight near-term uncertainty in statistical projections as well as the sensitivity of projections to different plausible assumptions.

  20. Perspectives from the Avian Phylogenomics Project: Questions that Can Be Answered with Sequencing All Genomes of a Vertebrate Class.

    PubMed

    Jarvis, Erich D

    2016-01-01

    The rapid pace of advances in genome technology, with concomitant reductions in cost, makes it feasible that one day in our lifetime we will have available extant genomes of entire classes of species, including vertebrates. I recently helped co-coordinate the large-scale Avian Phylogenomics Project, which collected and sequenced genomes of 48 bird species representing most currently classified orders to address a range of questions in phylogenomics and comparative genomics. The consortium was able to answer questions not previously possible with just a few genomes. This success spurred on the creation of a project to sequence the genomes of at least one individual of all extant ∼10,500 bird species. The initiation of this project has led us to consider what questions now impossible to answer could be answered with all genomes, and could drive new questions now unimaginable. These include the generation of a highly resolved family tree of extant species, genome-wide association studies across species to identify genetic substrates of many complex traits, redefinition of species and the species concept, reconstruction of the genomes of common ancestors, and generation of new computational tools to address these questions. Here I present visions for the future by posing and answering questions regarding what scientists could potentially do with available genomes of an entire vertebrate class.

  1. High resolution extremity CT for biomechanics modeling

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ashby, A.E.; Brand, H.; Hollerbach, K.

    1995-09-23

    With the advent of ever more powerful computing and finite element analysis (FEA) capabilities, the bone and joint geometry detail available from either commercial surface definitions or from medical CT scans is inadequate. For dynamic FEA modeling of joints, precise articular contours are necessary to get appropriate contact definition. In this project, a fresh cadaver extremity was suspended in paraffin in a Lucite cylinder and then scanned with an industrial CT system to generate a high resolution data set for use in biomechanics modeling.

  2. Fiber specklegram sensors sensitivities at high temperatures

    NASA Astrophysics Data System (ADS)

    Rodriguez-Cobo, L.; Lomer, M.; Lopez-Higuera, J. M.

    2015-09-01

    In this work, the sensitivity of Fiber Specklegram Sensors (FSS) to high temperatures (up to 800ºC) has been studied. Two multimode silica fibers were introduced into a tubular furnace while light from a HeNe laser source was launched into one end of each fiber, projecting speckle patterns onto a commercial webcam. A computer controlled different heating and cooling sweeps while the specklegram evolution was recorded. The achieved results exhibit a remarkable linearity in the FSS's sensitivity for temperatures under 800ºC, following the thermal expansion of fused silica.
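
    Specklegram sensing typically quantifies pattern change by correlating each camera frame against a reference frame; as the fiber is perturbed, the correlation drops. A minimal zero-normalized correlation sketch on synthetic frames (random stand-ins for the webcam images, with an arbitrary decorrelation model):

      import numpy as np

      def znc(frame, reference):
          """Zero-normalized correlation between two speckle frames."""
          a = frame - frame.mean()
          b = reference - reference.mean()
          return float((a * b).sum() / np.sqrt((a * a).sum() * (b * b).sum()))

      rng = np.random.default_rng(5)
      reference = rng.random((240, 320))
      for temp in (25, 200, 400, 800):                     # degrees C (illustrative)
          drift = rng.random((240, 320)) * (temp / 800.0)  # more change when hotter
          print(temp, round(znc(reference + drift, reference), 3))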

  3. Magnetic field errors tolerances of Nuclotron booster

    NASA Astrophysics Data System (ADS)

    Butenko, Andrey; Kazinova, Olha; Kostromin, Sergey; Mikhaylov, Vladimir; Tuzikov, Alexey; Khodzhibagiyan, Hamlet

    2018-04-01

    Accurate generation of the magnetic field in the units of the booster synchrotron for the NICA project is one of the most important conditions for achieving the required parameters and high-quality accelerator operation. Studies of the linear and nonlinear dynamics of the 197Au31+ ion beam in the booster have been carried out with the MADX program. Analytical estimates of the magnetic field error tolerances and numerical computations of the dynamic aperture of the booster DFO magnetic lattice are presented. Closed-orbit distortion due to random magnetic field errors and errors in the layout of the booster units was also evaluated.
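
    A first analytical estimate of the closed-orbit distortion from random dipole field errors uses the standard amplification formula: uncorrelated kicks theta_i at locations with beta function beta_i produce an rms orbit x_rms = sqrt(beta(s) * sum(beta_i * theta_i^2)) / (2*sqrt(2)*|sin(pi*Q)|). A sketch with made-up lattice numbers, not the actual NICA booster optics:

      import numpy as np

      def rms_orbit_distortion(beta_obs, betas, theta_rms, tune):
          """Expected rms closed-orbit distortion from random dipole kicks."""
          amplification = np.sqrt(beta_obs * np.sum(betas * theta_rms ** 2))
          return amplification / (2.0 * np.sqrt(2.0) * abs(np.sin(np.pi * tune)))

      betas = np.full(40, 8.0)    # beta function at 40 dipoles, metres (illustrative)
      theta = np.full(40, 1e-4)   # 0.1 mrad rms kick per dipole from field errors
      print(f"x_rms = {1e3 * rms_orbit_distortion(10.0, betas, theta, tune=4.8):.2f} mm")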

  4. ENFIN a network to enhance integrative systems biology.

    PubMed

    Kahlem, Pascal; Birney, Ewan

    2007-12-01

    Integration of biological data of various types and development of adapted bioinformatics tools represent critical objectives to enable research at the systems level. The European Network of Excellence ENFIN is engaged in developing both an adapted infrastructure connecting databases and platforms to enable the generation of new bioinformatics tools, and the experimental validation of computational predictions. We will give an overview of the projects tackled within ENFIN and discuss the challenges associated with integration for systems biology.

  5. Projected Applications of a "Weather in a Box" Computing System at the NASA Short-Term Prediction Research and Transition (SPoRT) Center

    NASA Technical Reports Server (NTRS)

    Jedlovec, Gary J.; Molthan, Andrew; Zavodsky, Bradley T.; Case, Jonathan L.; LaFontaine, Frank J.; Srikishen, Jayanthi

    2010-01-01

    The NASA Short-term Prediction Research and Transition Center (SPoRT)'s new "Weather in a Box" resources will provide weather research and forecast modeling capabilities for real-time application. Model output will provide additional forecast guidance and research into the impacts of new NASA satellite data sets and software capabilities. By combining several research tools and satellite products, SPoRT can generate model guidance that is strongly influenced by unique NASA contributions.

  6. Flow Simulation of N3-X Hybrid Wing-Body Configuration

    NASA Technical Reports Server (NTRS)

    Kim, Hyoungjin; Liou, Meng-Sing

    2013-01-01

    System studies show that an N3-X hybrid wing-body aircraft with a turboelectric distributed propulsion system using a mail-slot inlet/nozzle nacelle can meet the environmental and performance goals for N+3 generation transports (three generations beyond the current air transport technology level) set by NASA's Subsonic Fixed Wing Project. In this study, a Navier-Stokes flow simulation of the N3-X on hybrid unstructured meshes was conducted, including the mail-slot propulsor. The geometry of the mail-slot propulsor was generated by a CAD (Computer-Aided Design)-free shape parameterization. A body-force approach was used for a more realistic and efficient simulation of the turning and loss effects of the fan blades and the inlet-fan interactions. The flow simulation results for the N3-X demonstrate the validity of the present approach.

  7. [Earth Science Technology Office's Computational Technologies Project

    NASA Technical Reports Server (NTRS)

    Fischer, James (Technical Monitor); Merkey, Phillip

    2005-01-01

    This grant supported the effort to characterize the problem domain of the Earth Science Technology Office's Computational Technologies (CT) Project and to engage the Beowulf Cluster Computing Community as well as the High Performance Computing Research Community, so that we could predict the applicability of these technologies to the scientific community represented by the CT Project and formulate long-term strategies to provide the computational resources necessary to attain its anticipated scientific objectives. Specifically, the goal of the evaluation effort was to use the information gathered over the course of the Round-3 investigations to quantify the trends in scientific expectations, the algorithmic requirements, and the capabilities of high-performance computers to satisfy this anticipated need.

  8. Developing the Next Generation of Science Data System Engineers

    NASA Technical Reports Server (NTRS)

    Moses, John F.; Behnke, Jeanne; Durachka, Christopher D.

    2016-01-01

    At Goddard, engineers and scientists with a range of experience in science data systems are needed to employ new technologies and develop advances in capabilities for supporting new Earth and Space science research. Engineers with extensive experience in science data, software engineering, and computer-information architectures are needed to lead and perform these activities. The increasing types and complexity of instrument data and emerging computer technologies, coupled with the current shortage of computer engineers with backgrounds in science, have led to the need to develop a career path for science data systems engineers and architects. The current career path, for undergraduate students studying disciplines such as Computer Engineering or Physical Science, generally begins with serving on a development team in any of the disciplines, where they can work in depth on existing Goddard data systems or serve with a specific NASA science team. There they begin to understand the data, infuse technologies, and learn the architectures of science data systems. From here the typical career involves peer mentoring, on-the-job training, or graduate-level studies in analytics, computational science, and applied science and mathematics. At the most senior level, engineers become subject matter experts and system architects, leading discipline-specific data centers and large software development projects. They are recognized as subject matter experts in a science domain, have project management expertise, lead standards efforts, and lead international projects. A long career development remains necessary not only because of the breadth of knowledge required across physical sciences and engineering disciplines, but also because of the diversity of instrument data being developed today both by NASA and by international partner agencies, and because multidiscipline science and practitioner communities expect to have access to all types of observational data. This paper describes an approach to defining career-path guidance for college-bound high school and undergraduate engineering students and for junior and senior engineers from various disciplines.

  9. Developing the Next Generation of Science Data System Engineers

    NASA Astrophysics Data System (ADS)

    Moses, J. F.; Durachka, C. D.; Behnke, J.

    2015-12-01

    At Goddard, engineers and scientists with a range of experience in science data systems are needed to employ new technologies and develop advances in capabilities for supporting new Earth and Space science research. Engineers with extensive experience in science data, software engineering, and computer-information architectures are needed to lead and perform these activities. The increasing types and complexity of instrument data and emerging computer technologies, coupled with the current shortage of computer engineers with backgrounds in science, have led to the need to develop a career path for science data systems engineers and architects. The current career path, for undergraduate students studying disciplines such as Computer Engineering or Physical Science, generally begins with serving on a development team in any of the disciplines, where they can work in depth on existing Goddard data systems or serve with a specific NASA science team. There they begin to understand the data, infuse technologies, and learn the architectures of science data systems. From here the typical career involves peer mentoring, on-the-job training, or graduate-level studies in analytics, computational science, and applied science and mathematics. At the most senior level, engineers become subject matter experts and system architects, leading discipline-specific data centers and large software development projects. They are recognized as subject matter experts in a science domain, have project management expertise, lead standards efforts, and lead international projects. A long career development remains necessary not only because of the breadth of knowledge required across physical sciences and engineering disciplines, but also because of the diversity of instrument data being developed today both by NASA and by international partner agencies, and because multi-discipline science and practitioner communities expect to have access to all types of observational data. This paper describes an approach to defining career-path guidance for college-bound high school and undergraduate engineering students and for junior and senior engineers from various disciplines.

  10. Status of the Short-Pulse X-ray Project at the Advanced Photon Source

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nassiri, A; Berenc, T G; Borland, M

    2012-07-01

    The Advanced Photon Source Upgrade (APS-U) Project at Argonne will include generation of short-pulse x-rays based on Zholents' deflecting-cavity scheme. We have chosen superconducting (SC) cavities in order to have a continuous train of crabbed bunches and flexibility of operating modes. In collaboration with Jefferson Laboratory, we are prototyping and testing a number of single-cell deflecting cavities and associated auxiliary systems, with promising initial results. In collaboration with Lawrence Berkeley National Laboratory, we are working to develop the state-of-the-art timing, synchronization, and differential rf phase stability systems that are required for SPX. A collaboration with the Advanced Computations Department at the Stanford Linear Accelerator Center is looking into simulations of complex, multi-cavity geometries with lower- and higher-order-mode waveguide dampers using ACE3P. This contribution provides the current R&D status of the SPX project.

  11. Views on the impact of HOST

    NASA Technical Reports Server (NTRS)

    Esgar, J. B.; Sokolowski, Daniel E.

    1989-01-01

    The Hot Section Technology (HOST) Project, which was initiated by NASA Lewis Research Center in 1980 and concluded in 1987, was aimed at improving advanced aircraft engine hot section durability through better technical understanding and more accurate design analysis capability. The project was a multidisciplinary, multiorganizational, focused research effort that involved 21 organizations and 70 research and technology activities and generated approximately 250 research reports. No major hardware was developed. To evaluate whether HOST had a significant impact on the overall aircraft engine industry in the development of new engines, interviews were conducted with 41 participants in the project to obtain their views. The summarized results of these interviews are presented. Emphasis is placed on results relative to three-dimensional inelastic structural analysis, thermomechanical fatigue testing, constitutive modeling, combustor aerothermal modeling, turbine heat transfer, protective coatings, computer codes, improved engine design capability, reduced engine development costs, and the impacts on technology transfer and the industry-government partnership.

  12. Methodology in the Assessment of Construction and Development Investment Projects, Including the Graphic Multi-Criteria Analysis - a Systemic Approach

    NASA Astrophysics Data System (ADS)

    Szafranko, Elżbieta

    2017-10-01

    Assessment of variant solutions developed for a building investment project needs to be made at the planning stage. While considering alternative solutions, the investor defines various criteria, but directly evaluating the degree to which the developed variants fulfil them can be very difficult. In practice, there are different methods that enable the user to include a large number of parameters in an analysis, but their implementation can be challenging. Some methods require advanced mathematical computations, preceded by complicated input data processing, and the generated results may not lend themselves easily to interpretation. Hence, during her research, the author has developed a systemic approach that involves several methods and compares their outcomes. The final stage of the proposed method consists of a graphic interpretation of the results. The method has been tested on a variety of building and development projects.
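
    The abstract does not describe the paper's specific assessment methods; as a minimal illustration of multi-criteria evaluation of design variants, the sketch below ranks variants by a weighted sum of investor-defined criterion scores. All scores and weights are invented, and this is not the author's methodology.

    ```python
    import numpy as np

    # Minimal multi-criteria scoring sketch (weighted sum), for illustration only.
    # Rows = design variants, columns = criterion scores on a common 1-5 scale.
    scores = np.array([
        [4, 3, 5, 2],   # variant A
        [3, 5, 2, 4],   # variant B
        [5, 2, 4, 3],   # variant C
    ], dtype=float)
    weights = np.array([0.4, 0.3, 0.2, 0.1])   # investor-defined criterion weights

    totals = scores @ weights                  # aggregate score per variant
    ranking = np.argsort(totals)[::-1]         # best variant first
    for idx in ranking:
        print(f"variant {'ABC'[idx]}: {totals[idx]:.2f}")
    ```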

  13. VERS: a virtual environment for reconstructive surgery planning

    NASA Astrophysics Data System (ADS)

    Montgomery, Kevin N.

    1997-05-01

    The Virtual Environment for Reconstructive Surgery (VERS) project at the NASA Ames Biocomputation Center is applying virtual reality technology to aid surgeons in planning surgeries. We are working with a craniofacial surgeon at Stanford to assemble and visualize the bone structure of patients requiring reconstructive surgery, whether due to developmental abnormalities or trauma. This project is an extension of our previous work in 3D reconstruction, mesh generation, and immersive visualization. The VR system, consisting of an SGI Onyx RE2, a FakeSpace BOOM and ImmersiveWorkbench, a Virtual Technologies CyberGlove, and an Ascension Technologies tracker, is currently in development and has already been used to visualize defects preoperatively. In the near future it will be used to plan surgeries more fully and to compute the projected effect on soft tissue structure. This paper presents the work in progress and details the production of a high-performance, collaborative, networked virtual environment.

  14. Comparison of Aircraft Icing Growth Assessment Software

    NASA Technical Reports Server (NTRS)

    Wright, William; Potapczuk, Mark G.; Levinson, Laurie H.

    2011-01-01

    A research project is underway to produce computer software that can accurately predict ice growth under any meteorological conditions for any aircraft surface. An extensive, quantifiable comparison of the results against the database of ice shapes generated in the NASA Glenn Icing Research Tunnel (IRT) has been performed, including additional data taken to extend the database into the Supercooled Large Drop (SLD) regime. The project shows the differences in ice shape among LEWICE 3.2.2, GlennICE, and experimental data, and addresses the validation of the software against a recent set of ice-shape data in the SLD regime. This validation effort mirrors similar efforts undertaken for previous validations of LEWICE; those reports quantified the ice accretion prediction capabilities of the LEWICE software. Several ice geometry features were proposed for comparing ice shapes in a quantitative manner. The resulting analysis showed that LEWICE compared well to the available experimental data.

  15. Three-Dimensional Displays In The Future Flight Station

    NASA Astrophysics Data System (ADS)

    Bridges, Alan L.

    1984-10-01

    This review paper summarizes the development and applications of computer techniques for the representation of three-dimensional data in the future flight station. It covers the development of the Lockheed-NASA Advanced Concepts Flight Station (ACFS) research simulators. These simulators contain: A Pilot's Desk Flight Station (PDFS) with five 13- inch diagonal, color, cathode ray tubes on the main instrument panel; a computer-generated day and night visual system; a six-degree-of-freedom motion base; and a computer complex. This paper reviews current research, development, and evaluation of easily modifiable display systems and software requirements for three-dimensional displays that may be developed for the PDFS. This includes the analysis and development of a 3-D representation of the entire flight profile. This 3-D flight path, or "Highway-in-the-Sky", will utilize motion and perspective cues to tightly couple the human responses of the pilot to the aircraft control systems. The use of custom logic, e.g., graphics engines, may provide the processing power and architecture required for 3-D computer-generated imagery (CGI) or visual scene simulation (VSS). Diffraction or holographic head-up displays (HUDs) will also be integrated into the ACFS simulator to permit research on the requirements and use of these "out-the-window" projection systems. Future research may include the retrieval of high-resolution, perspective view terrain maps which could then be overlaid with current weather information or other selectable cultural features.

  16. Modeling digital breast tomosynthesis imaging systems for optimization studies

    NASA Astrophysics Data System (ADS)

    Lau, Beverly Amy

    Digital breast tomosynthesis (DBT) is a new imaging modality for breast imaging. In tomosynthesis, multiple images of the compressed breast are acquired at different angles, and the projection view images are reconstructed to yield images of slices through the breast. One of the main problems to be addressed in the development of DBT is determining the optimal parameter settings to obtain images ideal for the detection of cancer. Since it would be unethical to irradiate women multiple times to explore potentially optimal geometries for tomosynthesis, it is ideal to use a computer simulation to generate projection images. Existing tomosynthesis models have modeled scatter and the detector without accounting for the oblique angles of incidence that tomosynthesis introduces. Moreover, these models frequently use geometry-specific physical factors measured from real systems, which severely limits the robustness of their algorithms for optimization. The goal of this dissertation was to design the framework for a computer simulation of tomosynthesis that would produce images that are sensitive to changes in acquisition parameters, so that an optimization study would be feasible. A computer physics simulation of the tomosynthesis system was developed. The x-ray source was modeled as a polychromatic spectrum based on published spectral data, and the inverse-square law was applied. Scatter was applied using a convolution method with angle-dependent scatter point spread functions (sPSFs), followed by scaling using an angle-dependent scatter-to-primary ratio (SPR). Monte Carlo simulations were used to generate sPSFs for a 5-cm breast with a 1-cm air gap. Detector effects were included through geometric propagation of the image onto layers of the detector, which were blurred using depth-dependent detector point-spread functions (PRFs). Depth-dependent PRFs were calculated every 5 microns through a 200-micron-thick CsI detector using Monte Carlo simulations. Electronic noise was added as Gaussian noise as a last step of the model. The sPSFs and detector PRFs were verified to match published data, and the noise power spectrum (NPS) from simulated flat-field images was shown to match empirically measured data from a digital mammography unit. A novel anthropomorphic software breast phantom was developed for 3D imaging simulation. Projection view images of the phantom were shown to have similar structure to real breasts in the spatial frequency domain, using the power-law exponent beta to quantify tissue complexity. The physics simulation and computer breast phantom were used together, following methods from a published study with real tomosynthesis images of real breasts. The simulation model and 3D numerical breast phantoms were able to reproduce the trends in the experimental data. This result demonstrates the ability of the tomosynthesis physics model to generate images sensitive to changes in acquisition parameters.
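
    A minimal sketch of the scatter-and-noise chain described above: the primary image is convolved with a scatter point spread function, the scatter is scaled to a target scatter-to-primary ratio, and Gaussian electronic noise is added last. The sPSF shape, SPR value, and noise level are illustrative assumptions, not the dissertation's measured parameters.

    ```python
    import numpy as np
    from scipy.signal import fftconvolve

    def simulate_projection(primary, spsf, spr, gain=1.0, sigma_e=5.0, seed=0):
        """Toy tomosynthesis projection: scatter = (primary * sPSF), scaled so
        that total scatter / total primary equals the given SPR, then Gaussian
        electronic noise is added. All values here are illustrative."""
        scatter = fftconvolve(primary, spsf, mode="same")
        scatter *= spr * primary.sum() / scatter.sum()   # enforce the target SPR
        rng = np.random.default_rng(seed)
        return gain * (primary + scatter) + rng.normal(0.0, sigma_e, primary.shape)

    # Example: flat-field primary image and a broad Gaussian sPSF.
    y, x = np.mgrid[-32:33, -32:33]
    spsf = np.exp(-(x**2 + y**2) / (2 * 12.0**2))
    spsf /= spsf.sum()
    flat = np.full((128, 128), 1000.0)
    img = simulate_projection(flat, spsf, spr=0.3)
    print(img.mean(), img.std())
    ```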

  17. Social Computing as Next-Gen Learning Paradigm: A Platform and Applications

    NASA Astrophysics Data System (ADS)

    Margherita, Alessandro; Taurino, Cesare; Del Vecchio, Pasquale

    As a field at the intersection of computer science and human behavior, social computing can contribute significantly to the endeavor of innovating how individuals and groups interact for learning and working purposes. In particular, the generation of Internet applications tagged as Web 2.0 provides an opportunity to create new “environments” where people can exchange knowledge and experience, create new knowledge, and learn together. This chapter illustrates the design and application of a prototype platform that embeds tools such as blogs, wikis, folksonomies, and RSS in a single web-based system. The platform has been developed to support a case-based and project-driven learning strategy for the development of business and technology management competencies in undergraduate and graduate education programs. A set of illustrative scenarios is described to show how a learning community can be promoted, created, and sustained through the technological platform.

  18. Realistic Covariance Prediction for the Earth Science Constellation

    NASA Technical Reports Server (NTRS)

    Duncan, Matthew; Long, Anne

    2006-01-01

    Routine satellite operations for the Earth Science Constellation (ESC) include collision risk assessment between members of the constellation and other orbiting space objects. One component of the risk assessment process is computing the collision probability between two space objects. The collision probability is computed using Monte Carlo techniques as well as by numerically integrating relative state probability density functions. Each algorithm takes as inputs state vector and state vector uncertainty information for both objects. The state vector uncertainty information is expressed in terms of a covariance matrix. The collision probability computation is only as good as the inputs. Therefore, to obtain a collision calculation that is a useful decision-making metric, realistic covariance matrices must be used as inputs to the calculation. This paper describes the process used by the NASA/Goddard Space Flight Center's Earth Science Mission Operations Project to generate realistic covariance predictions for three of the Earth Science Constellation satellites: Aqua, Aura and Terra.
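
    As a sketch of the Monte Carlo component of the risk assessment described above, the code below samples both objects' positions from their Gaussian uncertainties (the covariance matrices) and counts how often the objects fall within a combined hard-body radius. It is a simplified, position-only illustration, not the operational ESC algorithm; all numbers are hypothetical.

    ```python
    import numpy as np

    def collision_probability_mc(mu1, cov1, mu2, cov2, hard_body_radius,
                                 n=200_000, seed=0):
        """Monte Carlo collision probability at closest approach, assuming
        Gaussian 3-D position uncertainties for both objects."""
        rng = np.random.default_rng(seed)
        p1 = rng.multivariate_normal(mu1, cov1, size=n)
        p2 = rng.multivariate_normal(mu2, cov2, size=n)
        miss = np.linalg.norm(p1 - p2, axis=1)
        return float(np.mean(miss < hard_body_radius))

    # Hypothetical encounter: 200 m nominal miss distance, 100 m combined radius.
    mu1 = np.zeros(3)
    mu2 = np.array([200.0, 0.0, 0.0])            # meters
    cov = np.diag([120.0, 80.0, 60.0]) ** 2      # position covariance [m^2]
    print(collision_probability_mc(mu1, cov, mu2, cov, hard_body_radius=100.0))
    ```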

  19. Methods for simulation-based analysis of fluid-structure interaction.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Barone, Matthew Franklin; Payne, Jeffrey L.

    2005-10-01

    Methods for the analysis of fluid-structure interaction using high-fidelity simulations are critically reviewed. First, a literature review of modern numerical techniques for the simulation of aeroelastic phenomena is presented. The review focuses on methods contained within the arbitrary Lagrangian-Eulerian (ALE) framework for coupling computational fluid dynamics codes to computational structural mechanics codes. The review treats mesh movement algorithms, the role of the geometric conservation law, time advancement schemes, wetted surface interface strategies, and some representative applications. The complexity and computational expense of coupled Navier-Stokes/structural dynamics simulations point to the need for reduced order modeling to facilitate parametric analysis. The proper orthogonal decomposition (POD)/Galerkin projection approach for building a reduced order model (ROM) is presented, along with ideas for extending the methodology to allow construction of ROMs based on data generated from ALE simulations.
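
    A compact illustration of the POD step named above: the dominant modes of a snapshot matrix are obtained from an SVD, and a new state is projected onto them to get reduced coordinates (a Galerkin step would then evolve those coordinates in time). This is generic POD code run on invented data, not the report's implementation.

    ```python
    import numpy as np

    def pod_basis(snapshots, energy=0.99):
        """Proper orthogonal decomposition of a snapshot matrix (columns are
        flow-field snapshots). Returns the leading modes capturing the requested
        fraction of energy, plus the snapshot mean."""
        mean = snapshots.mean(axis=1, keepdims=True)
        U, s, _ = np.linalg.svd(snapshots - mean, full_matrices=False)
        cum = np.cumsum(s**2) / np.sum(s**2)
        r = int(np.searchsorted(cum, energy)) + 1
        return U[:, :r], mean

    # Correlated synthetic snapshots: a traveling sine wave sampled 40 times.
    t = np.linspace(0, 2 * np.pi, 500)
    X = np.column_stack([np.sin(t + 0.1 * k) for k in range(40)])
    U, mean = pod_basis(X)
    a = U.T @ (X[:, [0]] - mean)   # reduced coordinates of the first snapshot
    print(U.shape, a.shape)        # a handful of modes capture 99% of the energy
    ```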

  20. Status and prospects of computational fluid dynamics for unsteady transonic viscous flows

    NASA Technical Reports Server (NTRS)

    Mccroskey, W. J.; Kutler, P.; Bridgeman, J. O.

    1984-01-01

    Applications of computational aerodynamics to aeronautical research, design, and analysis have increased rapidly over the past decade, and these applications offer significant benefits to aeroelasticians. The past developments are traced by means of a number of specific examples, and the trends are projected over the next several years. The crucial factors that limit the present capabilities for unsteady analyses are identified; they include computer speed and memory, algorithm and solution methods, grid generation, turbulence modeling, vortex modeling, data processing, and coupling of the aerodynamic and structural dynamic analyses. The prospects for overcoming these limitations are presented, and many improvements appear to be readily attainable. If so, a complete and reliable numerical simulation of the unsteady, transonic viscous flow around a realistic fighter aircraft configuration could become possible within the next decade. The possibilities of using artificial intelligence concepts to hasten the achievement of this goal are also discussed.

  1. Computational Physics? Some perspectives and responses of the undergraduate physics community

    NASA Astrophysics Data System (ADS)

    Chonacky, Norman

    2011-03-01

    Any of the many possible answers to the evocative question ``What is ...'' will likely be heavily shaded by the experience of the respondent. This is partly due to the absence of a canon of practice in this still immature, hence dynamic and exciting, method of physics. The diversity of responses is even more apparent in the area of physics education, and more disruptive, because an undergraduate educational canon uniformly accepted across institutions has already existed for decades. I will present evidence of this educational community's lagging response to the challenge of the current dynamic and diverse practice of computational physics in research. I will also summarize current measures that attempt to respond to this lag, discuss a research-based approach for moving beyond these early measures, and suggest how DCOMP might help. I hope this will generate criticisms and concurrences from the floor. Research support for material in this talk was from: IEEE Computer Society; Shodor Foundation; TeraGrid Project.

  2. Randomized interpolative decomposition of separated representations

    NASA Astrophysics Data System (ADS)

    Biagioni, David J.; Beylkin, Daniel; Beylkin, Gregory

    2015-01-01

    We introduce an algorithm to compute tensor interpolative decomposition (dubbed CTD-ID) for the reduction of the separation rank of Canonical Tensor Decompositions (CTDs). Tensor ID selects, for a user-defined accuracy ɛ, a near optimal subset of terms of a CTD to represent the remaining terms via a linear combination of the selected terms. CTD-ID can be used as an alternative to or in combination with the Alternating Least Squares (ALS) algorithm. We present examples of its use within a convergent iteration to compute inverse operators in high dimensions. We also briefly discuss the spectral norm as a computational alternative to the Frobenius norm in estimating approximation errors of tensor ID. We reduce the problem of finding tensor IDs to that of constructing interpolative decompositions of certain matrices. These matrices are generated via randomized projection of the terms of the given tensor. We provide cost estimates and several examples of the new approach to the reduction of separation rank.
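
    The reduction of tensor ID to matrix ID via randomized projection can be sketched in a few lines: sketch the rows with a Gaussian random matrix, pick skeleton columns by column-pivoted QR of the sketch, and solve for the coefficients expressing the remaining columns. This is a generic randomized matrix ID run on an invented matrix; it is not the paper's CTD-ID code.

    ```python
    import numpy as np
    from scipy.linalg import qr

    def interpolative_decomposition(A, k, oversample=10, seed=0):
        """Randomized interpolative decomposition sketch: returns k skeleton
        column indices idx and coefficients P with A approx= A[:, idx] @ P."""
        rng = np.random.default_rng(seed)
        Y = rng.standard_normal((k + oversample, A.shape[0])) @ A  # row sketch
        _, _, piv = qr(Y, mode="economic", pivoting=True)          # pivoted QR
        idx = piv[:k]
        # Express all columns as a linear combination of the skeleton columns.
        P, *_ = np.linalg.lstsq(A[:, idx], A, rcond=None)
        return idx, P

    # Nearly rank-1 test matrix; 5 skeleton columns reproduce it accurately.
    rng = np.random.default_rng(1)
    A = np.outer(np.arange(1, 101), np.ones(50)) + 1e-3 * rng.random((100, 50))
    idx, P = interpolative_decomposition(A, k=5)
    print(np.linalg.norm(A - A[:, idx] @ P) / np.linalg.norm(A))  # small error
    ```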

  3. A description of the index of active Florida water data collection stations and a user's guide for station or site information retrieval using computer program Findex H578

    USGS Publications Warehouse

    Merritt, M.L.

    1977-01-01

    A computerized index of water-data collection activities, together with retrieval software to generate publication lists of this information, was developed for Florida. This system serves a vital need in the administration of the many and diverse water-data collection activities. Previously, needed data were very difficult to assemble for use in program planning or project implementation. Largely descriptive, the report tells how a file of computer card images was established that contains entries for all sites in Florida at which there is currently a water-data collection activity. Entries include information such as identification number, station name, location, type of site, county, information about data collection, funding, and other pertinent details. The computer program FINDEX selectively retrieves entries and lists them in a format suitable for publication. Updating of the index is done routinely. (Woodard-USGS)

  4. Probabilistic Structural Analysis Theory Development

    NASA Technical Reports Server (NTRS)

    Burnside, O. H.

    1985-01-01

    The objective of the Probabilistic Structural Analysis Methods (PSAM) project is to develop analysis techniques and computer programs for predicting the probabilistic response of critical structural components for current and future space propulsion systems. This technology will play a central role in establishing system performance and durability. The first year's technical activity is concentrating on probabilistic finite element formulation strategy and code development. Work is also in progress to survey critical materials and space shuttle main engine components. The probabilistic finite element computer program NESSUS (Numerical Evaluation of Stochastic Structures Under Stress) is being developed. The final probabilistic code will have, in the general case, the capability of performing nonlinear dynamic analysis of stochastic structures. It is the goal of the approximate-methods effort to increase problem-solving efficiency relative to finite element methods by using energy methods to generate trial solutions that satisfy the structural boundary conditions. These approximate methods will be less computer-intensive than the finite element approach.
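
    As a toy contrast to the finite element machinery described above, the sketch below performs a brute-force Monte Carlo probabilistic analysis of a closed-form structural response, the cantilever tip deflection delta = P L^3 / (3 E I), with random load and modulus. The numbers are invented, and this is not the NESSUS formulation.

    ```python
    import numpy as np

    # Monte Carlo probabilistic structural analysis on a closed-form response:
    # tip deflection of an end-loaded cantilever, delta = P L^3 / (3 E I).
    rng = np.random.default_rng(0)
    n = 100_000
    L, I = 2.0, 4.0e-5                           # length [m], second moment [m^4]
    P = rng.normal(10_000.0, 1_500.0, n)         # random load [N]
    E = rng.lognormal(np.log(2.0e11), 0.05, n)   # random modulus [Pa]
    delta = P * L**3 / (3.0 * E * I)
    print("P(delta > 4.5 mm) =", np.mean(delta > 4.5e-3))  # exceedance probability
    ```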

  5. 3D image acquisition by fiber-based fringe projection

    NASA Astrophysics Data System (ADS)

    Pfeifer, Tilo; Driessen, Sascha

    2005-02-01

    In macroscopic production processes, several measuring methods are used to assure the quality of 3D parts. One of the most widespread techniques is fringe projection: a fast and accurate method to capture the topography of a part as a computer file, which can be processed in further steps, e.g., to compare the measured part to a given CAD file. In this article it is shown how the fringe projection method is applied to a fiber-optic system. The fringes generated by a miniaturized fringe projector (MiniRot) are first projected onto the front end of an image guide using special optics. The image guide serves as a transmitter for the fringes in order to bring them onto the surface of a micro part. A second image guide is used to observe the micro part. It is mounted at an angle relative to the illuminating image guide so that the triangulation condition is fulfilled. With a CCD camera connected to the second image guide, the projected fringes are recorded, and the data are analyzed by an image processing system.
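
    The article does not state which fringe-analysis algorithm the image processing system uses; a standard choice is four-step phase shifting, sketched below on synthetic fringe images. The phase-to-height calibration of the triangulation setup is omitted, and all data here are synthetic.

    ```python
    import numpy as np

    def four_step_phase(i0, i1, i2, i3):
        """Standard four-step phase-shifting evaluation of projected fringes
        (shifts of 0, 90, 180, 270 degrees): phi = atan2(I3 - I1, I0 - I2).
        The wrapped phase maps to surface height after triangulation calibration."""
        return np.arctan2(i3 - i1, i0 - i2)

    # Synthetic test: recover a known phase ramp from four shifted fringe images.
    x = np.linspace(0, 4 * np.pi, 256)
    phi_true = np.tile(x, (64, 1))
    frames = [100 + 50 * np.cos(phi_true + k * np.pi / 2) for k in range(4)]
    phi = four_step_phase(*frames)
    # Compare modulo 2*pi, since the recovered phase is wrapped.
    print(np.allclose(np.angle(np.exp(1j * (phi - phi_true))), 0, atol=1e-6))
    ```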

  6. Algorithmic design for 3D printing at building scale

    DOE PAGES

    Guerguis, Maged; Eikevik, Leif; Obendorf, Andrew; ...

    2017-01-01

    Here, this paper addresses the use of algorithmic design paired with additive manufacturing and their potential impact on the architectural design and fabrication of a full-sized building, as demonstrated with the AMIE (Additive Manufacturing and Integrated Energy) project, a collaboration to 3D-print a building and a vehicle. Both the car and the building were designed to generate, store, and share energy in an effort to reduce or eliminate reliance on the power grid. This paper outlines our methodology in successfully designing for these innovative strategies, with a focus on the use of computational design tools as a catalyst for design optimization, integrated project delivery, rapid prototyping, and fabrication of building elements using additive manufacturing.

  7. Full-parallax 3D display from stereo-hybrid 3D camera system

    NASA Astrophysics Data System (ADS)

    Hong, Seokmin; Ansari, Amir; Saavedra, Genaro; Martinez-Corral, Manuel

    2018-04-01

    In this paper, we propose an innovative approach to producing the microimages ready for display on an integral-imaging monitor. Our main contribution is the use of a stereo-hybrid 3D camera system, which picks up a 3D data pair and composes a denser point cloud. An intrinsic difficulty is that the hybrid sensors are dissimilar and therefore must be equalized. The processed data facilitate generating an integral image after computationally projecting the information through a virtual pinhole array. We illustrate this procedure with imaging experiments that provide microimages with enhanced quality. After projection of these microimages onto the integral-imaging monitor, 3D images are produced with large parallax and viewing angle.
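
    A toy version of the computational pickup described above: each point of the cloud is projected through every pinhole of a virtual array onto a sensor plane behind it, filling one microimage per pinhole. The pinhole pitch, gap, and resolutions are invented parameters, not the paper's calibration.

    ```python
    import numpy as np

    def microimages(points, colors, n_lens=8, mi_px=16, pitch=1.0, gap=2.0):
        """Project a 3-D point cloud through an n_lens x n_lens virtual pinhole
        array (pinholes at z=0, sensor plane at z=-gap); illustrative model only."""
        canvas = np.zeros((n_lens * mi_px, n_lens * mi_px, 3))
        for (x, y, z), c in zip(points, colors):
            for i in range(n_lens):
                for j in range(n_lens):
                    px = (i - n_lens / 2) * pitch   # pinhole center coordinates
                    py = (j - n_lens / 2) * pitch
                    # Ray point -> pinhole, intersected with the sensor plane.
                    u = px + (px - x) * gap / z
                    v = py + (py - y) * gap / z
                    ui = int(i * mi_px + mi_px / 2 + (u - px) / pitch * mi_px)
                    vi = int(j * mi_px + mi_px / 2 + (v - py) / pitch * mi_px)
                    if (i * mi_px <= ui < (i + 1) * mi_px
                            and j * mi_px <= vi < (j + 1) * mi_px):
                        canvas[ui, vi] = c
        return canvas

    pts = np.array([[0.0, 0.0, 50.0], [2.0, -1.0, 60.0]])
    cols = np.array([[1.0, 0.0, 0.0], [0.0, 1.0, 0.0]])
    print(microimages(pts, cols).shape)   # (128, 128, 3) microimage mosaic
    ```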

  8. Algorithmic design for 3D printing at building scale

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Guerguis, Maged; Eikevik, Leif; Obendorf, Andrew

    Here, this paper addresses the use of algorithmic design paired with additive manufacturing and their potential impact on the architectural design and fabrication of a full-sized building, as demonstrated with the AMIE (Additive Manufacturing and Integrated Energy) project, a collaboration to 3D-print a building and a vehicle. Both the car and the building were designed to generate, store, and share energy in an effort to reduce or eliminate reliance on the power grid. This paper outlines our methodology in successfully designing for these innovative strategies, with a focus on the use of computational design tools as a catalyst for design optimization, integrated project delivery, rapid prototyping, and fabrication of building elements using additive manufacturing.

  9. Interference testing methods of large astronomical mirrors base on lenses and CGH wavefront correctors

    NASA Astrophysics Data System (ADS)

    Abdulkadyrov, Magomed A.; Belousov, Sergey P.; Patrikeev, Vladimir E.; Semenov, Alexandr P.

    2010-07-01

    In recent years and at present, LZOS, JSC has been producing a range of primary mirrors for astronomical telescopes with diameters of more than 1 m under contracts with foreign companies. Simultaneous testing of the aspherical surface figure by means of a lens corrector and a CGH (computer-generated hologram) corrector, together with testing of the corrector itself using the CGH, makes it possible to meet the challenge of definitive testing of the mirrors' surface figure. The results of the successful figuring of mirrors with diameters up to 4 m, such as those for the VISTA project (European Southern Observatory), the TNT (Thai National Telescope, Australia-Thailand), the LCO telescopes (Las Cumbres Observatory, USA), and Russian national projects, and the fulfilment of these mirrors' specification requirements, are considered sufficient evidence.

  10. A parallel data management system for large-scale NASA datasets

    NASA Technical Reports Server (NTRS)

    Srivastava, Jaideep

    1993-01-01

    The past decade has seen phenomenal growth in the amount of data, and resultant information, generated by NASA's operations and research projects. A key application is the reprocessing problem, which has been identified as requiring data management capabilities beyond those available today (PRAT93). The Intelligent Information Fusion (IIF) system (ROEL91) is an ongoing NASA project with similar requirements. Deriving our understanding of NASA's future data management needs from the above, this paper describes an approach to using parallel computer systems (processor and I/O architectures) to develop an efficient parallel database management system that addresses those needs. Specifically, we propose to investigate issues in low-level record organization and management, complex query processing, and query compilation and scheduling.

  11. A sustainability model based on cloud infrastructures for core and downstream Copernicus services

    NASA Astrophysics Data System (ADS)

    Manunta, Michele; Calò, Fabiana; De Luca, Claudio; Elefante, Stefano; Farres, Jordi; Guzzetti, Fausto; Imperatore, Pasquale; Lanari, Riccardo; Lengert, Wolfgang; Zinno, Ivana; Casu, Francesco

    2014-05-01

    The incoming Sentinel missions have been designed to be the first remote sensing satellite system devoted to operational services. In particular, the Synthetic Aperture Radar (SAR) Sentinel-1 sensor, dedicated to acquiring globally over land in the interferometric mode, guarantees an unprecedented capability to investigate and monitor Earth surface deformations related to natural and man-made hazards. Thanks to the global coverage strategy and the 12-day revisit time, jointly with the free and open access data policy, such a system will allow an extensive application of Differential Interferometric SAR (DInSAR) techniques. In such a framework, the European Commission has been funding several projects through the GMES and Copernicus programs, aimed at preparing the user community for the operational and extensive use of Sentinel-1 products for risk mitigation and management purposes. Among them, the FP7-DORIS project (www.doris-project.eu), an advanced GMES downstream service coordinated by the National Research Council of Italy (CNR), is based on the full exploitation of advanced DInSAR products in landslide and subsidence contexts. In particular, the DORIS project has developed innovative scientific techniques and methodologies to support Civil Protection Authorities (CPA) during the pre-event, event, and post-event phases of the risk management cycle. Nonetheless, the huge data stream expected from the Sentinel-1 satellite may jeopardize the effective use of such data in emergency response and security scenarios. This potential bottleneck can be properly overcome through the development of modern infrastructures able to efficiently provide computing resources as well as advanced services for big data management, processing, and dissemination. In this framework, CNR and ESA have established a cooperation to foster the use of GRID and cloud computing platforms for remote sensing data processing, and to make advanced and innovative tools for DInSAR product generation and exploitation available to a large audience. In particular, CNR is porting the multi-temporal DInSAR technique referred to as the Small Baseline Subset (SBAS) onto the ESA G-POD (Grid Processing On Demand) and CIOP (Cloud Computing Operational Pilot) platforms (Elefante et al., 2013) within the SuperSites Exploitation Platform (SSEP) project, whose aim is to contribute to the development of an ecosystem for big geo-data processing and dissemination. This work focuses on presenting the main results achieved by the DORIS project concerning the use of advanced DInSAR products for supporting CPA during the risk management cycle. Furthermore, based on the DORIS experience, a sustainability model for Core and Downstream Copernicus services, based on the effective exploitation of cloud platforms, is proposed. In this framework, the remote sensing community, both service providers and users, can significantly benefit from the Helix Nebula - The Science Cloud initiative, created by European scientific institutions, agencies, SMEs, and enterprises to pave the way for the development and exploitation of a cloud computing infrastructure for science. REFERENCES: Elefante, S., Imperatore, P., Zinno, I., Manunta, M., Mathot, E., Brito, F., Farres, J., Lengert, W., Lanari, R., Casu, F., 2013, "SBAS-DINSAR time series generation on cloud computing platforms", IEEE IGARSS Conference, Melbourne (AU), July 2013.

  12. Computer Animations as Astronomy Educational Tool: Immanuel Kant and the Island Universes Hypothesis

    NASA Astrophysics Data System (ADS)

    Mijic, M.; Park, D.; Zumaeta, J.; Simonian, V.; Levitin, S.; Sullivan, A.; Kang, E. Y. E.; Longson, T.

    2008-11-01

    The development of astronomy is built on well-defined watershed moments, when an individual or a group of individuals makes a discovery or a measurement that expands, and sometimes dramatically improves, our knowledge of the Universe. The purpose of the Scientific Visualization project at Cal State Los Angeles is to bring these moments to life with the use of computer animations, the medium of the 21st century that appeals to the generations that grew up in the Internet age. Our first story describes Immanuel Kant's remarkable Island Universes hypothesis. Using elementary principles of the then-new Newtonian mechanics, Kant made a bold and ultimately correct interpretation of the Milky Way and of the objects that we now call galaxies.

  13. Computer Animations as Astronomy Educational Tool: Immanuel Kant and The Island Universes Hypothesis

    NASA Astrophysics Data System (ADS)

    Mijic, Milan; Park, D.; Zumaeta, J.; Dong, H.; Simonian, V.; Levitin, S.; Sullivan, A.; Kang, E. Y. E.; Longson, T.; State LA SciVi Project, Cal

    2008-05-01

    The development of astronomy is built on well-defined watershed moments, when an individual or a group of individuals makes a discovery or a measurement that expands, and sometimes dramatically improves, our knowledge of the Universe. The purpose of the Scientific Visualization project at Cal State LA is to bring these moments to life with the use of computer animations, the medium of the 21st century that appeals to the generations that grew up in the Internet age. Our first story describes Immanuel Kant's remarkable Island Universes hypothesis. Using elementary principles of the then-new Newtonian mechanics, Kant made a bold and ultimately correct interpretation of the Milky Way and of the objects that we now call galaxies.

  14. Electromagnetic physics models for parallel computing architectures

    DOE PAGES

    Amadio, G.; Ananya, A.; Apostolakis, J.; ...

    2016-11-21

    The recent emergence of hardware architectures characterized by many-core or accelerated processors has opened new opportunities for concurrent programming models taking advantage of both SIMD and SIMT architectures. GeantV, a next-generation detector simulation, has been designed to exploit both the vector capability of mainstream CPUs and the multi-threading capabilities of coprocessors, including NVidia GPUs and the Intel Xeon Phi. The characteristics of these architectures are very different in terms of the vectorization depth and the type of parallelization needed to achieve optimal performance. In this paper we describe the implementation of electromagnetic physics models developed for parallel computing architectures as part of the GeantV project. The results of a preliminary performance evaluation and physics validation are also presented.

  15. Cloud4Psi: cloud computing for 3D protein structure similarity searching.

    PubMed

    Mrozek, Dariusz; Małysiak-Mrozek, Bożena; Kłapciński, Artur

    2014-10-01

    Popular methods for 3D protein structure similarity searching, especially those that generate high-quality alignments, such as Combinatorial Extension (CE) and Flexible structure Alignment by Chaining Aligned fragment pairs allowing Twists (FATCAT), are still time consuming. As a consequence, performing similarity searches against large repositories of structural data requires increased computational resources that are not always available. Cloud computing provides huge amounts of computational power that can be provisioned on a pay-as-you-go basis. We have developed a cloud-based system that allows scaling of the similarity searching process vertically and horizontally. Cloud4Psi (Cloud for Protein Similarity) was tested in the Microsoft Azure cloud environment and provided good, almost linearly proportional acceleration when scaled out onto many computational units. Cloud4Psi is available as Software as a Service for testing purposes at http://cloud4psi.cloudapp.net/. For source code and software availability, please visit the Cloud4Psi project home page at http://zti.polsl.pl/dmrozek/science/cloud4psi.htm. © The Author 2014. Published by Oxford University Press.

  16. Cloud4Psi: cloud computing for 3D protein structure similarity searching

    PubMed Central

    Mrozek, Dariusz; Małysiak-Mrozek, Bożena; Kłapciński, Artur

    2014-01-01

    Summary: Popular methods for 3D protein structure similarity searching, especially those that generate high-quality alignments, such as Combinatorial Extension (CE) and Flexible structure Alignment by Chaining Aligned fragment pairs allowing Twists (FATCAT), are still time consuming. As a consequence, performing similarity searches against large repositories of structural data requires increased computational resources that are not always available. Cloud computing provides huge amounts of computational power that can be provisioned on a pay-as-you-go basis. We have developed a cloud-based system that allows scaling of the similarity searching process vertically and horizontally. Cloud4Psi (Cloud for Protein Similarity) was tested in the Microsoft Azure cloud environment and provided good, almost linearly proportional acceleration when scaled out onto many computational units. Availability and implementation: Cloud4Psi is available as Software as a Service for testing purposes at: http://cloud4psi.cloudapp.net/. For source code and software availability, please visit the Cloud4Psi project home page at http://zti.polsl.pl/dmrozek/science/cloud4psi.htm. Contact: dariusz.mrozek@polsl.pl PMID:24930141

  17. Executable research compendia in geoscience research infrastructures

    NASA Astrophysics Data System (ADS)

    Nüst, Daniel

    2017-04-01

    From generation through analysis and collaboration to communication, scientific research requires the right tools. Scientists create their own software using third-party libraries and platforms. Cloud computing, Open Science, public data infrastructures, and Open Source offer scientists unprecedented opportunities, nowadays often in a field "Computational X" (e.g. computational seismology) or X-informatics (e.g. geoinformatics) [0]. This increases complexity and generates more innovation, e.g. Environmental Research Infrastructures (environmental RIs [1]). Researchers in Computational X write their software relying on both source code (e.g. from https://github.com) and binary libraries (e.g. from package managers such as APT, https://wiki.debian.org/Apt, or CRAN, https://cran.r-project.org/). They download data from domain-specific (cf. https://re3data.org) or generic (e.g. https://zenodo.org) data repositories and deploy computations remotely (e.g. on the European Open Science Cloud). The results themselves are archived, given persistent identifiers, connected to other works (e.g. using https://orcid.org/), and listed in metadata catalogues. A single researcher, intentionally or not, interacts with all sub-systems of RIs: data acquisition, data access, data processing, data curation, and community support [2]. To preserve computational research, [3] proposes the Executable Research Compendium (ERC), a container format that closes the gap of dependency preservation by encapsulating the runtime environment. ERCs and RIs can be integrated for different uses: (i) Coherence: ERC services validate completeness, integrity, and results; (ii) Metadata: ERCs connect the different parts of a piece of research and facilitate discovery; (iii) Exchange and Preservation: ERCs as usable building blocks are the shared and archived entity; (iv) Self-consistency: ERCs remove dependence on ephemeral sources; (v) Execution: ERC services create and execute a packaged analysis but integrate with existing platforms for display and control. These integrations are vital for capturing workflows in RIs and connect key stakeholders (scientists, publishers, librarians). They are demonstrated using developments from the DFG-funded project Opening Reproducible Research (http://o2r.info). Semi-automatic creation of ERCs based on research workflows is a core goal of the project. References: [0] Tony Hey, Stewart Tansley, Kristin Tolle (eds.), 2009. The Fourth Paradigm: Data-Intensive Scientific Discovery. Microsoft Research. [1] P. Martin et al., Open Information Linking for Environmental Research Infrastructures, 2015 IEEE 11th International Conference on e-Science, Munich, 2015, pp. 513-520. doi: 10.1109/eScience.2015.66. [2] Y. Chen et al., Analysis of Common Requirements for Environmental Science Research Infrastructures, The International Symposium on Grids and Clouds (ISGC) 2013, Taipei, 2013, http://pos.sissa.it/archive/conferences/179/032/ISGC. [3] Opening Reproducible Research, Geophysical Research Abstracts Vol. 18, EGU2016-7396, 2016, http://meetingorganizer.copernicus.org/EGU2016/EGU2016-7396.pdf.

  18. 48 CFR 52.253-1 - Computer Generated Forms.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 48 Federal Acquisition Regulations System 2 2012-10-01 2012-10-01 false Computer Generated Forms....253-1 Computer Generated Forms. As prescribed in FAR 53.111, insert the following clause: Computer... by the Federal Acquisition Regulation (FAR) may be submitted on a computer generated version of the...

  19. 48 CFR 52.253-1 - Computer Generated Forms.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 48 Federal Acquisition Regulations System 2 2013-10-01 2013-10-01 false Computer Generated Forms....253-1 Computer Generated Forms. As prescribed in FAR 53.111, insert the following clause: Computer... by the Federal Acquisition Regulation (FAR) may be submitted on a computer generated version of the...

  20. 48 CFR 52.253-1 - Computer Generated Forms.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 48 Federal Acquisition Regulations System 2 2014-10-01 2014-10-01 false Computer Generated Forms....253-1 Computer Generated Forms. As prescribed in FAR 53.111, insert the following clause: Computer... by the Federal Acquisition Regulation (FAR) may be submitted on a computer generated version of the...

  1. 27 CFR 19.634 - Computer-generated reports and transaction forms.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 27 Alcohol, Tobacco Products and Firearms 1 2013-04-01 2013-04-01 false Computer-generated reports... Reports Filing Forms and Reports § 19.634 Computer-generated reports and transaction forms. TTB will accept computer-generated reports of operations and transaction forms made using a computer printer on...

  2. 27 CFR 19.634 - Computer-generated reports and transaction forms.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 27 Alcohol, Tobacco Products and Firearms 1 2014-04-01 2014-04-01 false Computer-generated reports... Reports Filing Forms and Reports § 19.634 Computer-generated reports and transaction forms. TTB will accept computer-generated reports of operations and transaction forms made using a computer printer on...

  3. 27 CFR 19.634 - Computer-generated reports and transaction forms.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 27 Alcohol, Tobacco Products and Firearms 1 2012-04-01 2012-04-01 false Computer-generated reports... Reports Filing Forms and Reports § 19.634 Computer-generated reports and transaction forms. TTB will accept computer-generated reports of operations and transaction forms made using a computer printer on...

  4. 27 CFR 19.634 - Computer-generated reports and transaction forms.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 27 Alcohol, Tobacco Products and Firearms 1 2011-04-01 2011-04-01 false Computer-generated reports... Reports Filing Forms and Reports § 19.634 Computer-generated reports and transaction forms. TTB will accept computer-generated reports of operations and transaction forms made using a computer printer on...

  5. 48 CFR 52.253-1 - Computer Generated Forms.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 48 Federal Acquisition Regulations System 2 2011-10-01 2011-10-01 false Computer Generated Forms....253-1 Computer Generated Forms. As prescribed in FAR 53.111, insert the following clause: Computer... by the Federal Acquisition Regulation (FAR) may be submitted on a computer generated version of the...

  6. 48 CFR 52.253-1 - Computer Generated Forms.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 48 Federal Acquisition Regulations System 2 2010-10-01 2010-10-01 false Computer Generated Forms....253-1 Computer Generated Forms. As prescribed in FAR 53.111, insert the following clause: Computer... by the Federal Acquisition Regulation (FAR) may be submitted on a computer generated version of the...

  7. Advanced Three-Dimensional Display System

    NASA Technical Reports Server (NTRS)

    Geng, Jason

    2005-01-01

    A desktop-scale, computer-controlled display system, initially developed for NASA and now known as the VolumeViewer™, generates three-dimensional (3D) images of 3D objects in a display volume. This system differs fundamentally from stereoscopic and holographic display systems: The images generated by this system are truly 3D in that they can be viewed from almost any angle, without the aid of special eyeglasses. It is possible to walk around the system while gazing at its display volume to see a displayed object from a changing perspective, and multiple observers standing at different positions around the display can view the object simultaneously from their individual perspectives, as though the displayed object were a real 3D object. At the time of writing this article, only partial information on the design and principle of operation of the system was available. It is known that the system includes a high-speed, silicon-backplane, ferroelectric-liquid-crystal spatial light modulator (SLM), multiple high-power lasers for projecting images in multiple colors, a rotating helix that serves as a moving screen for displaying voxels [volume cells or volume elements, in analogy to pixels (picture cells or picture elements) in two-dimensional (2D) images], and a host computer. The rotating helix and its motor drive are the only moving parts. Under control by the host computer, a stream of 2D image patterns is generated on the SLM and projected through optics onto the surface of the rotating helix. The system utilizes a parallel pixel/voxel-addressing scheme: All the pixels of the 2D pattern on the SLM are addressed simultaneously by laser beams. This parallel addressing scheme overcomes the difficulty of achieving both high resolution and a high frame rate in a raster scanning or serial addressing scheme. It has been reported that the structure of the system is simple and easy to build, that the optical design and alignment are not difficult, and that the system can be built by use of commercial off-the-shelf products. A prototype of the system displays an image of 1,024 by 768 by 170 (=133,693,440) voxels. In future designs, the resolution could be increased. The maximum number of voxels that can be generated depends upon the spatial resolution of the SLM and the speed of rotation of the helix. For example, one could use an available SLM that has 1,024 by 1,024 pixels. Incidentally, this SLM is capable of operation at a switching speed of 300,000 frames per second. Implementation of full-color displays in future versions of the system would be straightforward: One could use three SLMs for red, green, and blue, respectively, and the colors of the voxels could be automatically controlled. An optically simpler alternative would be to use a single red/green/blue light projector and synchronize the projection of each color with the generation of patterns for that color on a single SLM.

  8. Progress on the DPASS project

    NASA Astrophysics Data System (ADS)

    Galkin, Sergei A.; Bogatu, I. N.; Svidzinski, V. A.

    2015-11-01

    A novel project to develop the Disruption Prediction And Simulation Suite (DPASS), a set of comprehensive computational tools to predict, model, and analyze disruption events in tokamaks, has recently been started at FAR-TECH, Inc. DPASS will eventually address the following aspects of the disruption problem: MHD, plasma edge dynamics, plasma-wall interaction, and the generation and losses of runaway electrons. DPASS uses the 3-D Disruption Simulation Code (DSC-3D) as a core tool and will have a modular structure. DSC is a one-fluid, non-linear, time-dependent 3D MHD code that simulates the dynamics of a tokamak plasma surrounded by a pure vacuum B-field in the real geometry of a conducting tokamak vessel. DSC utilizes an adaptive meshless technique, with adaptation to the moving plasma boundary, accurate magnetic flux conservation, and resolution of the plasma surface current. DSC also has an option to neglect the plasma inertia in order to eliminate the fast magnetosonic scale; this option can be turned on and off as needed. During Phase I of the project, two modules will be developed: a computational module for modeling massive gas injection and the main plasma response, and a module for nanoparticle plasma-jet injection as an innovative disruption mitigation scheme. We will report on the progress of this development. Work is supported by US DOE SBIR grant # DE-SC0013727.

  9. Dual-modal three-dimensional imaging of single cells with isometric high resolution using an optical projection tomography microscope

    NASA Astrophysics Data System (ADS)

    Miao, Qin; Rahn, J. Richard; Tourovskaia, Anna; Meyer, Michael G.; Neumann, Thomas; Nelson, Alan C.; Seibel, Eric J.

    2009-11-01

    The practice of clinical cytology relies on bright-field microscopy using absorption dyes like hematoxylin and eosin in the transmission mode, while the practice of research microscopy relies on fluorescence microscopy in the epi-illumination mode. The optical projection tomography microscope is an optical microscope that can generate 3-D images of single cells with isometric high resolution in both absorption and fluorescence modes. Although the depth of field of the microscope objective is in the submicron range, it can be extended by scanning the objective's focal plane. The extended-depth-of-field image is similar to a projection in conventional x-ray computed tomography. Cells suspended in optical gel flow through a custom-designed microcapillary. Multiple pseudoprojection images are taken by rotating the microcapillary. After these pseudoprojection images are aligned, computed tomography methods are applied to create a 3-D reconstruction. 3-D reconstructed images of single cells are shown in both absorption and fluorescence modes. The fluorescence spatial resolution is measured at 0.35 μm in both axial and lateral dimensions. Since fluorescence and absorption images are taken during two different rotations, mechanical error may cause misalignment of the 3-D images. This mechanical error is estimated to be within the resolution of the system.
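
    The tomographic step described above can be sketched with standard computed-tomography routines: pseudoprojections at many rotation angles form a sinogram, which filtered back projection inverts. The code below uses a 2-D phantom slice as a stand-in for a single-cell volume; it is generic CT code, not the instrument's software.

    ```python
    import numpy as np
    from skimage.data import shepp_logan_phantom
    from skimage.transform import radon, iradon, rescale

    image = rescale(shepp_logan_phantom(), 0.25)           # synthetic "cell" slice
    angles = np.linspace(0.0, 180.0, 180, endpoint=False)  # capillary rotation angles
    sinogram = radon(image, theta=angles)                  # pseudoprojection stack
    reconstruction = iradon(sinogram, theta=angles, filter_name="ramp")
    print(np.abs(reconstruction - image).mean())           # small reconstruction error
    ```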

  10. Projection decomposition algorithm for dual-energy computed tomography via deep neural network.

    PubMed

    Xu, Yifu; Yan, Bin; Chen, Jian; Zeng, Lei; Li, Lei

    2018-03-15

    Dual-energy computed tomography (DECT) has been widely used to improve identification of substances from different spectral information. Decomposition of mixed test samples into two materials relies on a well-calibrated material decomposition function. This work aims to establish and validate a data-driven algorithm for estimating that decomposition function. A deep neural network (DNN) consisting of two sub-nets is proposed to solve the projection decomposition problem. The compressing sub-net, essentially a stacked auto-encoder (SAE), learns a compact representation of the energy spectrum. The decomposing sub-net, with a two-layer structure, fits the nonlinear transform between energy projections and basis-material thicknesses. The proposed DNN not only delivers images with lower standard deviation and higher quality on both simulated and real data, but also yields the best performance in cases mixed with photon noise. Moreover, the DNN takes only 0.4 s to generate a decomposition solution for a 360 × 512 projection, which is about 200 times faster than the competing algorithms. The DNN model is applicable to decomposition tasks with different dual energies. Experimental results demonstrate the strong function-fitting ability of the DNN. The deep learning paradigm thus provides a promising approach to solving the nonlinear problem in DECT.
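
    As a concrete illustration of the decomposing sub-net idea, the sketch below trains a small two-layer network in plain numpy to map dual-energy projection pairs to two basis-material thicknesses. The synthetic forward model, layer sizes, and learning rate are our own assumptions for the sketch, not the paper's architecture or data.

        import numpy as np

        rng = np.random.default_rng(0)

        # Synthetic training data: basis-material thicknesses -> nonlinear
        # dual-energy projections (a made-up forward model for illustration).
        t = rng.uniform(0.0, 5.0, size=(1000, 2))
        p = np.stack([1.0 - np.exp(-(0.50 * t[:, 0] + 0.20 * t[:, 1])),
                      1.0 - np.exp(-(0.30 * t[:, 0] + 0.25 * t[:, 1]))], axis=1)

        # Two-layer network: tanh hidden layer, linear output.
        W1 = rng.normal(0.0, 0.5, (2, 32)); b1 = np.zeros(32)
        W2 = rng.normal(0.0, 0.5, (32, 2)); b2 = np.zeros(2)

        for step in range(5000):            # full-batch gradient descent
            h = np.tanh(p @ W1 + b1)
            out = h @ W2 + b2               # predicted thicknesses
            err = out - t
            gW2 = h.T @ err / len(t); gb2 = err.mean(axis=0)
            gh = (err @ W2.T) * (1.0 - h ** 2)
            gW1 = p.T @ gh / len(t); gb1 = gh.mean(axis=0)
            for param, grad in ((W1, gW1), (b1, gb1), (W2, gW2), (b2, gb2)):
                param -= 0.1 * grad

        print("training MSE:", float((err ** 2).mean()))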

  11. Agent Model Development for Assessing Climate-Induced Geopolitical Instability.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Boslough, Mark B.; Backus, George A.

    2005-12-01

    We present the initial stages of development of new agent-based computational methods to generate and test hypotheses about linkages between environmental change and international instability. This report summarizes the first year's effort of an originally proposed three-year Laboratory Directed Research and Development (LDRD) project. The preliminary work focused on a set of simple agent-based models and benefited from lessons learned in previous related projects and case studies of human response to climate change and environmental scarcity. Our approach was to define a qualitative model using extremely simple cellular agent models akin to Lovelock's Daisyworld and Schelling's segregation model. Such models do not require significant computing resources, and users can modify behavior rules to gain insights. One of the difficulties in agent-based modeling is finding the right balance between model simplicity and real-world representation. Our approach was to keep agent behaviors as simple as possible during the development stage (described herein) and to ground them with a realistic geospatial Earth system model in subsequent years. This work is directed toward incorporating projected climate data--including various CO2 scenarios from the Intergovernmental Panel on Climate Change (IPCC) Third Assessment Report--and ultimately toward coupling a useful agent-based model to a general circulation model.
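
    For readers unfamiliar with the cited models, the sketch below implements a minimal Schelling-style segregation model of the kind the report describes; the grid size, vacancy fraction, and tolerance threshold are illustrative choices, not parameters from the report.

        import numpy as np

        rng = np.random.default_rng(0)
        N, threshold = 50, 0.5                 # grid size and tolerance
        # 0 = empty cell; 1 and 2 are the two agent types.
        grid = rng.choice([0, 1, 2], size=(N, N), p=[0.10, 0.45, 0.45])

        def unhappy(g):
            """Coordinates of agents with too few like-typed neighbors."""
            out = []
            for i, j in np.argwhere(g > 0):
                nb = g[max(i - 1, 0):i + 2, max(j - 1, 0):j + 2]
                like = np.count_nonzero(nb == g[i, j]) - 1   # exclude self
                others = np.count_nonzero(nb > 0) - 1
                if others and like / others < threshold:
                    out.append((i, j))
            return out

        for sweep in range(20):                # relocation sweeps
            movers = unhappy(grid)
            empties = [tuple(e) for e in np.argwhere(grid == 0)]
            for i, j in movers:                # move each unhappy agent
                if not empties:
                    break
                k = int(rng.integers(len(empties)))
                grid[empties[k]], grid[i, j] = grid[i, j], 0
                empties[k] = (i, j)            # vacated cell becomes empty
            print(f"sweep {sweep}: {len(movers)} unhappy agents")

    Even a toy like this exhibits the key behavior the report exploits: the unhappy-agent count falls sweep by sweep as segregated clusters emerge from a purely local rule, which is what makes such models useful for quick hypothesis testing before grounding them in realistic geospatial data.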

  12. A Set of Computer Projects for an Electromagnetic Fields Class.

    ERIC Educational Resources Information Center

    Gleeson, Ronald F.

    1989-01-01

    Presented are three computer projects: vector analysis, electric field intensities at various distances, and the Biot-Savart law. Programming suggestions and project results are provided. One month is suggested for each project. (MVL)
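
    As an example of what the third project might look like, the sketch below numerically integrates the Biot-Savart law for a circular current loop and checks the result against the analytic on-axis field; the loop parameters are illustrative choices, not values from the article.

        import numpy as np

        mu0 = 4e-7 * np.pi            # permeability of free space (T*m/A)
        I, R = 1.0, 0.05              # loop current (A) and radius (m)

        # Discretize the loop into segments and sum
        # dB = mu0*I/(4*pi) * (dl x r) / |r|^3 over all of them.
        theta = np.linspace(0.0, 2.0 * np.pi, 1000, endpoint=False)
        pts = np.stack([R * np.cos(theta), R * np.sin(theta),
                        np.zeros_like(theta)], axis=1)
        dl = np.roll(pts, -1, axis=0) - pts       # segment vectors (closed loop)

        field_point = np.array([0.0, 0.0, 0.02])  # 2 cm above the loop center
        r = field_point - pts
        r_mag = np.linalg.norm(r, axis=1, keepdims=True)
        B = (mu0 * I / (4.0 * np.pi)) * np.cross(dl, r) / r_mag ** 3
        B_num = B.sum(axis=0)

        # Analytic on-axis field for a circular loop, for comparison.
        z = field_point[2]
        B_exact = mu0 * I * R ** 2 / (2.0 * (R ** 2 + z ** 2) ** 1.5)
        print(B_num, B_exact)   # z-components should agree closely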

  13. Projection-Based Reduced Order Modeling for Spacecraft Thermal Analysis

    NASA Technical Reports Server (NTRS)

    Qian, Jing; Wang, Yi; Song, Hongjun; Pant, Kapil; Peabody, Hume; Ku, Jentung; Butler, Charles D.

    2015-01-01

    This paper presents a mathematically rigorous, subspace projection-based reduced order modeling (ROM) methodology and an integrated framework to automatically generate reduced order models for spacecraft thermal analysis. Two key steps in the reduced order modeling procedure are described: (1) the acquisition of a full-scale spacecraft model in ordinary differential equation (ODE) and differential algebraic equation (DAE) form to resolve its dynamic thermal behavior; and (2) the ROM itself, which markedly reduces the dimension of the full-scale model. Specifically, proper orthogonal decomposition (POD), in conjunction with the discrete empirical interpolation method (DEIM) and the trajectory piecewise-linear (TPWL) method, is developed to address the strongly nonlinear thermal effects due to coupled conductive and radiative heat transfer in the spacecraft environment. Case studies using NASA-relevant satellite models are undertaken to verify the capability and to assess the computational performance of the ROM technique in terms of speed-up and error relative to the full-scale model. The ROM exhibits excellent agreement in spatiotemporal thermal profiles (<0.5% relative error over pertinent time scales) along with salient computational acceleration (up to two orders of magnitude speed-up) over the full-scale analysis. These findings establish the feasibility of ROM to perform rational and computationally affordable thermal analysis, develop reliable thermal control strategies for spacecraft, and greatly reduce development cycle times and costs.
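
    The POD step at the heart of the method can be summarized in a few lines: collect snapshots of the full model, take an SVD, keep the leading modes, and project the system operators onto them. The sketch below shows this with placeholder data; the DEIM and TPWL treatment of the nonlinear radiative terms is omitted, and all sizes are illustrative assumptions.

        import numpy as np

        rng = np.random.default_rng(0)
        n, m, k = 2000, 100, 10            # full states, snapshots, kept modes

        # Snapshot matrix: columns are full-model temperature solutions at
        # sampled times (random placeholder data here).
        X = rng.random((n, m))

        # POD basis = leading left singular vectors of the snapshot matrix.
        U, s, _ = np.linalg.svd(X, full_matrices=False)
        Phi = U[:, :k]                     # n x k reduced basis
        print("captured snapshot energy:", (s[:k] ** 2).sum() / (s ** 2).sum())

        # For a linearized full model T' = A T + b, Galerkin projection gives
        # the k-dimensional system a' = (Phi^T A Phi) a + Phi^T b, T ~ Phi a.
        A = -np.eye(n) + 0.01 * rng.standard_normal((n, n)) / np.sqrt(n)
        b = rng.standard_normal(n)
        A_r, b_r = Phi.T @ A @ Phi, Phi.T @ b
        print(A_r.shape, b_r.shape)        # (10, 10) (10,)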

  14. The impact of CFD on development test facilities - A National Research Council projection. [computational fluid dynamics

    NASA Technical Reports Server (NTRS)

    Korkegi, R. H.

    1983-01-01

    The results of a National Research Council study on the effect that advances in computational fluid dynamics (CFD) will have on conventional aeronautical ground testing are reported. Current CFD capabilities include the depiction of linearized inviscid flows with a boundary layer, initial use of Euler codes on supercomputers to automatically generate a grid, research and development on the Reynolds-averaged Navier-Stokes (N-S) equations, and preliminary research on solutions to the full N-S equations. Improvement in the range of CFD usage depends on the development of more powerful supercomputers, exceeding even the projected abilities of the NASA Numerical Aerodynamic Simulator (1 BFLOP/sec). Full representation of the Reynolds-averaged N-S equations will require over one million grid points, a computing level predicted to be available in 15 yr. Present capabilities allow identification of data anomalies, confirmation of data accuracy, and assessment of the adequacy of model design in wind-tunnel trials. Account can be taken of wall effects and the Reynolds number in any flight regime during simulation. CFD can actually be more accurate than instrumented tests, since all points in a flow can be modeled with CFD, while they cannot all be monitored with instrumentation in a wind tunnel.

  15. Sma3s: A universal tool for easy functional annotation of proteomes and transcriptomes.

    PubMed

    Casimiro-Soriguer, Carlos S; Muñoz-Mérida, Antonio; Pérez-Pulido, Antonio J

    2017-06-01

    The current cheapening of next-generation sequencing has led to enormous growth in the number of sequenced genomes and transcriptomes, allowing wet labs to obtain the sequences of their organisms of study. To make the most of these data, one of the first steps is the functional annotation of the protein-coding genes. But this has been a slow and tedious step, as it can involve the characterization of thousands of sequences. Sma3s is an accurate computational tool for annotating proteins in an unattended way. We have now developed a completely new version, which includes functionalities that will be of utility for fundamental and applied science. Currently, the results provide functional categories such as biological processes, which are useful both for characterizing particular sequence datasets and for comparing results from different projects. One of the most important innovations is that it now has low computational requirements, and the complete annotation of a single proteome or transcriptome usually takes around 24 hours on a personal computer. Sma3s has been tested with a large number of complete proteomes and transcriptomes, and it has demonstrated its potential in health science and other specific projects.

  16. A Petaflops Era Computing Analysis

    NASA Technical Reports Server (NTRS)

    Preston, Frank S.

    1998-01-01

    This report covers a study of the potential for petaflops (10(exp 15) floating-point operations per second) computing. The study was performed during 1996 and should be considered the first step in an ongoing effort. The analysis concludes that a petaflops system is technically feasible, but not with today's state of the art. Since the computer arena is now a commodity business, most experts expect that a petaflops system will evolve from current technology in an evolutionary fashion. To meet the price expectations of users waiting for petaflops performance, great improvements in lowering component costs will be required. Lower power consumption is also a must. The present rate of progress in improved performance places the date of introduction of petaflops systems at about 2010. Several years before that date, it is projected that chip feature sizes will reach the now-known resolution limit. Aside from the economic problems and constraints, software is identified as the major problem. The tone of this initial study is more pessimistic than most of the published material available on petaflops systems. Workers in the field are expected to generate more data, which could serve to provide a basis for a more informed projection. This report includes an annotated bibliography.

  17. cual-id: Globally Unique, Correctable, and Human-Friendly Sample Identifiers for Comparative Omics Studies.

    PubMed

    Chase, John H; Bolyen, Evan; Rideout, Jai Ram; Caporaso, J Gregory

    2016-01-01

    The number of samples in high-throughput comparative "omics" studies is increasing rapidly due to declining experimental costs. To keep sample data and metadata manageable and to ensure the integrity of scientific results as the scale of these projects continues to increase, it is essential that we transition to better-designed sample identifiers. Ideally, sample identifiers should be globally unique across projects, project teams, and institutions; short (to facilitate manual transcription); correctable with respect to common types of transcription errors; opaque, meaning that they do not contain information about the samples; and compatible with existing standards. We present cual-id, a lightweight command line tool that creates, or mints, sample identifiers that meet these criteria without reliance on centralized infrastructure. cual-id allows users to assign universally unique identifiers, or UUIDs, that are globally unique to their samples. UUIDs are too long to be conveniently written on sampling materials, such as swabs or microcentrifuge tubes, however, so cual-id additionally generates human-friendly 4- to 12-character identifiers that map to their UUIDs and are unique within a project. By convention, we use "cual-id" to refer to the software, "CualID" to refer to the short, human-friendly identifiers, and "UUID" to refer to the globally unique identifiers. CualIDs are used by humans when they manually write or enter identifiers, while the longer UUIDs are used by computers to unambiguously reference a sample. Finally, cual-id optionally generates printable label sticker sheets containing Code 128 bar codes and CualIDs for labeling of sample collection and processing materials. IMPORTANCE The adoption of identifiers that are globally unique, correctable, and easily handwritten or manually entered into a computer will be a major step forward for sample tracking in comparative omics studies. As the fields transition to more-centralized sample management, for example, across labs within an institution, across projects funded under a common program, or in systems designed to facilitate meta- and/or integrated analysis, sample identifiers generated with cual-id will not need to change; thus, costly and error-prone updating of data and metadata identifiers will be avoided. Further, using cual-id will ensure that transcription errors in sample identifiers do not require the discarding of otherwise-useful samples that may have been expensive to obtain. Finally, cual-id is simple to install and use and is free for all use. No centralized infrastructure is required to ensure global uniqueness, so it is feasible for any lab to get started using these identifiers within their existing infrastructure.
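
    To make the identifier scheme concrete, the sketch below mints a UUID alongside a short, check-charactered companion ID. The alphabet, length, and check rule here are our own assumptions for illustration; they are not cual-id's actual algorithm or API.

        import uuid

        # 32-character alphabet that avoids look-alike characters (l/1, o/0).
        ALPHABET = "abcdefghijkmnpqrstuvwxyz23456789"

        def mint(length=6):
            """Return a (UUID, short ID) pair; the short ID ends in a check char."""
            full = uuid.uuid4()                     # globally unique, for computers
            body = [ALPHABET[b % len(ALPHABET)] for b in full.bytes[:length - 1]]
            check = ALPHABET[sum(ALPHABET.index(c) for c in body) % len(ALPHABET)]
            return full, "".join(body) + check      # short, for humans

        def valid(short_id):
            """Catch any single-character substitution in a short ID's body."""
            body, check = short_id[:-1], short_id[-1]
            return ALPHABET[sum(ALPHABET.index(c) for c in body)
                            % len(ALPHABET)] == check

        full_id, short_id = mint()
        print(full_id, short_id, valid(short_id))   # e.g. <uuid> hqt3mk True

    A simple modular check character of this kind detects every single-character substitution in the body, which addresses the most common class of transcription error the article describes; richer schemes (and barcode generation) are left to the real tool.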

  18. Group Projects and the Computer Science Curriculum

    ERIC Educational Resources Information Center

    Joy, Mike

    2005-01-01

    Group projects in computer science are normally delivered with reference to good software engineering practice. The discipline of software engineering is rapidly evolving, and the application of the latest 'agile techniques' to group projects causes a potential conflict with constraints imposed by regulating bodies on the computer science…

  19. Science Teacher Efficacy and Extrinsic Factors Toward Professional Development Using Video Games in a Design-Based Research Model: The Next Generation of STEM Learning

    NASA Astrophysics Data System (ADS)

    Annetta, Leonard A.; Frazier, Wendy M.; Folta, Elizabeth; Holmes, Shawn; Lamb, Richard; Cheng, Meng-Tzu

    2013-02-01

    Design-based research principles guided the study of 51 secondary science teachers in the second year of a 3-year professional development project. The project entailed the creation of student-centered, inquiry-based science video games. A professional development model appropriate for infusing innovative technologies into standards-based curricula was employed to determine how science teachers' attitudes and efficacy were impacted while designing science-based video games. The study's mixed-method design ascertained teacher efficacy on five factors related to technology and gaming (general computer use, science learning, inquiry teaching and learning, synchronous chat/text, and playing video games) using a web-based survey. Qualitative data in the form of online blog posts were gathered during the project to assist in the triangulation and assessment of teacher efficacy. Data analyses consisted of an analysis of variance and serial coding of teachers' reflective responses. Results indicated that participants who used computers daily had higher efficacy when using inquiry-based teaching methods and in science teaching and learning. Additional emergent findings revealed possible motivating factors for efficacy. This professional development project focused on inquiry as a pedagogical strategy, standards-based science learning as a means to develop content knowledge, and creating video games as technological knowledge. The project was consistent with the Technological Pedagogical Content Knowledge (TPCK) framework, where the overlapping circles of the three components indicate development of an integrated understanding of the suggested relationships. Findings provide suggestions for the development of standards-based science education software, its integration into the curriculum, and strategies for implementing technology in teaching practices.

  20. Grid today, clouds on the horizon

    NASA Astrophysics Data System (ADS)

    Shiers, Jamie

    2009-04-01

    By the time of CCP 2008, the largest scientific machine in the world - the Large Hadron Collider - had been cooled down as scheduled to its operational temperature of below 2 kelvin, and injection tests were starting. Collisions of proton beams at 5+5 TeV were expected within one to two months of the initial tests, with data taking at design energy (7+7 TeV) foreseen for 2009. In order to process the data from this world machine, we have put our "Higgs in one basket" - that of Grid computing [The Worldwide LHC Computing Grid (WLCG), in: Proceedings of the Conference on Computational Physics 2006 (CCP 2006), vol. 177, 2007, pp. 219-223]. After many years of preparation, 2008 saw a final "Common Computing Readiness Challenge" (CCRC'08), aimed at demonstrating full readiness for 2008 data taking, processing, and analysis. By definition, this relied on a worldwide production Grid infrastructure. But change - as always - is on the horizon. The current funding model for Grids - which in Europe has been through three generations of EGEE projects, together with related projects in other parts of the world, including South America - is evolving towards a long-term, sustainable e-infrastructure, such as the European Grid Initiative (EGI) [The European Grid Initiative Design Study, website at http://web.eu-egi.eu/]. At the same time, potentially new paradigms, such as "Cloud Computing", are emerging. This paper summarizes the results of CCRC'08 and discusses the potential impact of future Grid funding on both regional and international application communities. It contrasts the Grid and Cloud computing models from both technical and sociological points of view. Finally, it discusses the requirements of production application communities in terms of stability and continuity in the medium to long term.
