Sample records for enabling computational technologies

  1. Enabling Logistics With Portable and Wireless Technology Study. Volume 1

    DTIC Science & Technology

    2004-08-06

    Final report (Volume I) of the Enabling Logistics with Portable and Wireless Technology Study, authored by Dr. Soundar Kumara of the School of Industrial Engineering; the excerpt also cites a 2001 project report from the Ubiquitous Computing Group at Microsoft Research.

  2. Enabling Self-Directed Computer Use for Individuals with Cerebral Palsy: A Systematic Review of Assistive Devices and Technologies

    ERIC Educational Resources Information Center

    Davies, T. Claire; Mudge, Suzie; Ameratunga, Shanthi; Stott, N. Susan

    2010-01-01

    Aim: The purpose of this study was to systematically review published evidence on the development, use, and effectiveness of devices and technologies that enable or enhance self-directed computer access by individuals with cerebral palsy (CP). Methods: Nine electronic databases were searched using keywords "computer", "software", "spastic",…

  3. (CICT) Computing, Information, and Communications Technology Overview

    NASA Technical Reports Server (NTRS)

    VanDalsem, William R.

    2003-01-01

    The goal of the Computing, Information, and Communications Technology (CICT) program is to enable NASA's Scientific Research, Space Exploration, and Aerospace Technology Missions with greater mission assurance, for less cost, with increased science return through the development and use of advanced computing, information and communications technologies. This viewgraph presentation includes diagrams of how the political guidance behind CICT is structured. The presentation profiles each part of the NASA Mission in detail, and relates the Mission to the activities of CICT. CICT's Integrated Capability Goal is illustrated, and hypothetical missions which could be enabled by CICT are profiled. CICT technology development is profiled.

  4. Research | Computational Science | NREL

    Science.gov Websites

    NREL's computational science experts use advanced high-performance computing (HPC) technologies, thereby accelerating the transformation of our nation's energy system. Enabling High-Impact Research: NREL's computational science capabilities enable high-impact research. Some recent examples…

  5. Computing, Information, and Communications Technology (CICT) Program Overview

    NASA Technical Reports Server (NTRS)

    VanDalsem, William R.

    2003-01-01

    The Computing, Information and Communications Technology (CICT) Program's goal is to enable NASA's Scientific Research, Space Exploration, and Aerospace Technology Missions with greater mission assurance, for less cost, with increased science return through the development and use of advanced computing, information and communication technologies.

  6. Recent development on computer aided tissue engineering--a review.

    PubMed

    Sun, Wei; Lal, Pallavi

    2002-02-01

    The utilization of computer-aided technologies in tissue engineering has evolved in the development of a new field of computer-aided tissue engineering (CATE). This article reviews recent development and application of enabling computer technology, imaging technology, computer-aided design and computer-aided manufacturing (CAD and CAM), and rapid prototyping (RP) technology in tissue engineering, particularly, in computer-aided tissue anatomical modeling, three-dimensional (3-D) anatomy visualization and 3-D reconstruction, CAD-based anatomical modeling, computer-aided tissue classification, computer-aided tissue implantation and prototype modeling assisted surgical planning and reconstruction.

  7. Effectiveness and Impact of Technology-Enabled Project-Based Learning with the Use of Process Prompts in Teacher Education

    ERIC Educational Resources Information Center

    Chen, Ching-Huei; Chan, Lim-Ha

    2011-01-01

    This study investigated the effectiveness and impacts of process prompts on students' learning and computer self-efficacy within the technology-enabled project-based learning (PBL) context in an undergraduate educational technology course. If the aim is to prepare prospective teachers to effectively, efficiently, and engagingly use technologies in…

  8. Civil propulsion technology for the next twenty-five years

    NASA Technical Reports Server (NTRS)

    Rosen, Robert; Facey, John R.

    1987-01-01

    The next twenty-five years will see major advances in civil propulsion technology that will result in completely new aircraft systems for domestic, international, commuter and high-speed transports. These aircraft will include advanced aerodynamic, structural, and avionic technologies resulting in major new system capabilities and economic improvements. Propulsion technologies will include high-speed turboprops in the near term, very high bypass ratio turbofans, high efficiency small engines and advanced cycles utilizing high temperature materials for high-speed propulsion. Key fundamental enabling technologies include increased temperature capability and advanced design methods. Increased temperature capability will be based on improved composite materials such as metal matrix, intermetallics, ceramics, and carbon/carbon as well as advanced heat transfer techniques. Advanced design methods will make use of advances in internal computational fluid mechanics, reacting flow computation, computational structural mechanics and computational chemistry. The combination of advanced enabling technologies, new propulsion concepts and advanced control approaches will provide major improvements in civil aircraft.

  9. Real-World Neuroimaging Technologies

    DTIC Science & Technology

    2013-05-10

    The system enables long-term wear of up to 10 consecutive hours of operation. Its wireless technologies, light weight (200 g), and dry sensors support monitoring of brain activity in real-world scenarios. Index terms: behavioral science, biomarkers, body sensor networks, brain-computer interfaces, data acquisition, electroencephalography monitoring, translational…

  10. Physician communication via Internet-enabled technology: A systematic review.

    PubMed

    Barr, Neil G; Randall, Glen E; Archer, Norman P; Musson, David M

    2017-10-01

    The use of Internet-enabled technology (information and communication technology such as smartphone applications) may enrich information exchange among providers and, consequently, improve health care delivery. The purpose of this systematic review was to gain a greater understanding of the role that Internet-enabled technology plays in enhancing communication among physicians. Studies were identified through a search in three electronic platforms: the Association for Computing Machinery Digital Library, ProQuest, and Web of Science. The search identified 5140 articles; of these, 21 met all inclusion criteria. In general, physicians were satisfied with Internet-enabled technology, but consensus was lacking regarding whether Internet-enabled technology improved efficiency or made a difference to clinical decision-making. Internet-enabled technology can play an important role in enhancing communication among physicians, but the extent of that benefit is influenced by (1) the impact of Internet-enabled technology on existing work practices, (2) the availability of adequate resources, and (3) the nature of institutional elements, such as privacy legislation.

  11. Embodying Computational Thinking: Initial Design of an Emerging Technological Learning Tool

    ERIC Educational Resources Information Center

    Daily, Shaundra B.; Leonard, Alison E.; Jörg, Sophie; Babu, Sabarish; Gundersen, Kara; Parmar, Dhaval

    2015-01-01

    This emerging technology report describes virtual environment interactions, an approach for blending movement and computer programming as an embodied way to support girls in building computational thinking skills. The authors seek to understand how body syntonicity might enable young learners to bootstrap their intuitive knowledge in order to…

  12. Beyond Theory: Improving Public Relations Writing through Computer Technology.

    ERIC Educational Resources Information Center

    Neff, Bonita Dostal

    Computer technology (primarily word processing) enables the student of public relations writing to improve the writing process through increased flexibility in writing, enhanced creativity, increased support of management skills and team work. A new instructional model for computer use in public relations courses at Purdue University Calumet…

  13. Microsystems Technology Symposium: Enabling Future Capability (BRIEFING CHARTS)

    DTIC Science & Technology

    2007-03-07

    Briefing charts from the Microsystems Technology Office covering microsystems, wireless and networked systems, embedded computation, signal processing, and communications, along with the symposium agenda (sessions on communication and actuation).

  14. The Adoption of Grid Computing Technology by Organizations: A Quantitative Study Using Technology Acceptance Model

    ERIC Educational Resources Information Center

    Udoh, Emmanuel E.

    2010-01-01

    Advances in grid technology have enabled some organizations to harness enormous computational power on demand. However, the prediction of widespread adoption of the grid technology has not materialized despite the obvious grid advantages. This situation has encouraged intense efforts to close the research gap in the grid adoption process. In this…

  15. Developing Computer Programming Concepts and Skills via Technology-Enriched Language-Art Projects: A Case Study

    ERIC Educational Resources Information Center

    Lee, Young-Jin

    2010-01-01

    Teaching computer programming to young children has been considered difficult because of its abstract and complex nature. The objectives of this study are (1) to investigate whether an innovative educational technology tool called Scratch could enable young children to learn abstract knowledge of computer programming while creating multimedia…

  16. Evaluation of Two Different Teaching Concepts in Dentistry Using Computer Technology

    ERIC Educational Resources Information Center

    Reich, Sven; Simon, James F.; Ruedinger, Dirk; Shortall, Adrian; Wichmann, Manfred; Frankenberger, Roland

    2007-01-01

    The common teaching goal of two different phantom head courses was to enable the students to provide an all-ceramic restoration by the means of computer technology. The aim of this study was to compare these two courses with regard to the different educational methods using identical computer software. Undergraduate dental students from a single…

  17. Lab4CE: A Remote Laboratory for Computer Education

    ERIC Educational Resources Information Center

    Broisin, Julien; Venant, Rémi; Vidal, Philippe

    2017-01-01

    Remote practical activities have been demonstrated to be efficient when learners come to acquire inquiry skills. In computer science education, virtualization technologies are gaining popularity as this technological advance enables instructors to implement realistic practical learning activities, and learners to engage in authentic and…

  18. Engineering and Computing Portal to Solve Environmental Problems

    NASA Astrophysics Data System (ADS)

    Gudov, A. M.; Zavozkin, S. Y.; Sotnikov, I. Y.

    2018-01-01

    This paper describes the architecture and services of the Engineering and Computing Portal, a complex solution that provides access to high-performance computing resources and makes it possible to carry out computational experiments, teach parallel technologies, and solve computing tasks, including technogenic safety ones.

  19. Ultimate computing. Biomolecular consciousness and nano Technology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hameroff, S.R.

    1987-01-01

    The book advances the premise that the cytoskeleton is the cell's nervous system, the biological controller/computer. If indeed cytoskeletal dynamics in the nanoscale (billionth meter, billionth second) are the texture of intracellular information processing, emerging "NanoTechnologies" (scanning tunneling microscopy, Feynman machines, von Neumann replicators, etc.) should enable direct monitoring, decoding and interfacing between biological and technological information devices. This in turn could result in important biomedical applications and perhaps a merger of mind and machine: Ultimate Computing.

  20. A Method for Selection of Appropriate Assistive Technology for Computer Access

    ERIC Educational Resources Information Center

    Jenko, Mojca

    2010-01-01

    Assistive technologies (ATs) for computer access enable people with disabilities to be included in the information society. Current methods for assessment and selection of the most appropriate AT for each individual are nonstandardized, lengthy, subjective, and require substantial clinical experience of a multidisciplinary team. This manuscript…

  1. The Computer's Debt to Science.

    ERIC Educational Resources Information Center

    Branscomb, Lewis M.

    1984-01-01

    Discusses discoveries and applications of science that have enabled the computer industry to introduce new technology each year and produce 25 percent more for the customer at constant cost. Potential limits to progress, disc storage technology, programming and end-user interface, and designing for ease of use are considered. Glossary is included.…

  2. Learner-Interface Interaction for Technology-Enhanced Active Learning

    ERIC Educational Resources Information Center

    Sinha, Neelu; Khreisat, Laila; Sharma, Kiron

    2009-01-01

    Neelu Sinha, Laila Khreisat, and Kiron Sharma describe how learner-interface interaction promotes active learning in computer science education. In a pilot study using technology that combines DyKnow software with a hardware platform of pen-enabled HP Tablet notebook computers, Sinha, Khreisat, and Sharma created dynamic learning environments by…

  3. Pervasive access to images and data--the use of computing grids and mobile/wireless devices across healthcare enterprises.

    PubMed

    Pohjonen, Hanna; Ross, Peeter; Blickman, Johan G; Kamman, Richard

    2007-01-01

    Emerging technologies are transforming the workflows in healthcare enterprises. Computing grids and handheld mobile/wireless devices are providing clinicians with enterprise-wide access to all patient data and analysis tools on a pervasive basis. In this paper, emerging technologies are presented that provide computing grids and streaming-based access to image and data management functions, and system architectures that enable pervasive computing on a cost-effective basis. Finally, the implications of such technologies are investigated regarding the positive impacts on clinical workflows.

  4. A New Look at NASA: Strategic Research In Information Technology

    NASA Technical Reports Server (NTRS)

    Alfano, David; Tu, Eugene (Technical Monitor)

    2002-01-01

    This viewgraph presentation provides information on research undertaken by NASA to facilitate the development of information technologies. Specific ideas covered here include: 1) Bio/nano technologies: biomolecular and nanoscale systems and tools for assembly and computing; 2) Evolvable hardware: autonomous self-improving, self-repairing hardware and software for survivable space systems in extreme environments; 3) High Confidence Software Technologies: formal methods, high-assurance software design, and program synthesis; 4) Intelligent Controls and Diagnostics: Next generation machine learning, adaptive control, and health management technologies; 5) Revolutionary computing: New computational models to increase capability and robustness to enable future NASA space missions.

  5. A Longitudinal Examination of the Effects of Computer Self-Efficacy Growth on Performance during Technology Training

    ERIC Educational Resources Information Center

    Downey, James P.; Kher, Hemant V.

    2015-01-01

    Technology training in the classroom is critical in preparing students for upper level classes as well as professional careers, especially in fields such as technology. One of the key enablers to this process is computer self-efficacy (CSE), which has an extensive stream of empirical research. Despite this, one of the missing pieces is how CSE…

  6. Computer Technology-Integrated Projects Should Not Supplant Craft Projects in Science Education

    ERIC Educational Resources Information Center

    Klopp, Tabatha J.; Rule, Audrey C.; Schneider, Jean Suchsland; Boody, Robert M.

    2014-01-01

    The current emphasis on computer technology integration and narrowing of the curriculum has displaced arts and crafts. However, the hands-on, concrete nature of craft work in science modeling enables students to understand difficult concepts and to be engaged and motivated while learning spatial, logical, and sequential thinking skills. Analogy…

  7. How Word Processing Is Changing Our Teaching: New Technologies, New Approaches, New Challenges.

    ERIC Educational Resources Information Center

    Rodrigues, Dawn; Rodrigues, Raymond

    1989-01-01

    Presents teaching variations with a word-processing package and related tools that enable teachers to develop different computer-writing pedagogies for their distinct contexts: traditional classroom, computer lab, or some combination of both. Emphasizes that teachers who re-envision teaching with regard to available technology can create dynamic…

  8. Computer Technology Integration and Student Learning: Barriers and Promise

    ERIC Educational Resources Information Center

    Keengwe, Jared; Onchwari, Grace; Wachira, Patrick

    2008-01-01

    Political and institutional support has enabled many institutions of learning to spend millions of dollars to acquire educational computing tools (Ficklen and Muscara, "Am Educ" 25(3):22-29, 2001) that have not been effectively integrated into the curriculum. While access to educational technology tools has remarkably improved in most schools,…

  9. Computing, Information and Communications Technology (CICT) Website

    NASA Technical Reports Server (NTRS)

    Hardman, John; Tu, Eugene (Technical Monitor)

    2002-01-01

    The Computing, Information and Communications Technology Program (CICT) was established in 2001 to ensure NASA's continuing leadership in emerging technologies. It is a coordinated, Agency-wide effort to develop and deploy key enabling technologies for a broad range of mission-critical tasks. The NASA CICT program is designed to address Agency-specific computing, information, and communications technology requirements beyond the projected capabilities of commercially available solutions. The areas of technical focus have been chosen for their impact on NASA's missions, their national importance, and the technical challenge they provide to the Program. In order to meet its objectives, the CICT Program is organized into the following four technology-focused projects: 1) Computing, Networking and Information Systems (CNIS); 2) Intelligent Systems (IS); 3) Space Communications (SC); 4) Information Technology Strategic Research (ITSR).

  10. Translational bioinformatics in the cloud: an affordable alternative

    PubMed Central

    2010-01-01

    With the continued exponential expansion of publicly available genomic data and access to low-cost, high-throughput molecular technologies for profiling patient populations, computational technologies and informatics are becoming vital considerations in genomic medicine. Although cloud computing technology is being heralded as a key enabling technology for the future of genomic research, available case studies are limited to applications in the domain of high-throughput sequence data analysis. The goal of this study was to evaluate the computational and economic characteristics of cloud computing in performing a large-scale data integration and analysis representative of research problems in genomic medicine. We find that the cloud-based analysis compares favorably in both performance and cost in comparison to a local computational cluster, suggesting that cloud computing technologies might be a viable resource for facilitating large-scale translational research in genomic medicine. PMID:20691073
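
    The cost question at the heart of this record can be illustrated with simple arithmetic. The sketch below contrasts the amortized cost per core-hour of a locally owned cluster with on-demand cloud pricing for a fixed analysis; every dollar figure, core count, and utilization value is a placeholder assumption, not a number from the study.

    ```python
    # Back-of-the-envelope comparison: local cluster vs. on-demand cloud.
    # All figures below are illustrative assumptions, not data from the study.

    def local_cost_per_core_hour(purchase_price, lifetime_years, cores,
                                 utilization, annual_ops_cost):
        """Amortized cost of one core-hour on a locally owned cluster."""
        usable_hours = lifetime_years * 365 * 24 * utilization
        total_cost = purchase_price + annual_ops_cost * lifetime_years
        return total_cost / (cores * usable_hours)

    def cloud_cost_per_core_hour(instance_price_per_hour, cores_per_instance):
        """On-demand cost of one core-hour in the cloud (no idle time billed)."""
        return instance_price_per_hour / cores_per_instance

    job_core_hours = 5_000  # size of a hypothetical integration/analysis run

    local = local_cost_per_core_hour(purchase_price=150_000, lifetime_years=4,
                                     cores=256, utilization=0.6,
                                     annual_ops_cost=20_000)
    cloud = cloud_cost_per_core_hour(instance_price_per_hour=0.68,
                                     cores_per_instance=16)

    print(f"local cluster: ${local * job_core_hours:,.0f}")
    print(f"cloud        : ${cloud * job_core_hours:,.0f}")
    ```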

  11. Simulating Technology Processes to Foster Learning.

    ERIC Educational Resources Information Center

    Krumholtz, Nira

    1998-01-01

    Based on a spiral model of technology evolution, elementary students used LOGO computer software to become both developers and users of technology. The computerized environment enabled 87% to reach intuitive understanding of physical concepts; 24% expressed more formal scientific understanding. (SK)

  12. Optical interconnection networks for high-performance computing systems

    NASA Astrophysics Data System (ADS)

    Biberman, Aleksandr; Bergman, Keren

    2012-04-01

    Enabled by silicon photonic technology, optical interconnection networks have the potential to be a key disruptive technology in computing and communication industries. The enduring pursuit of performance gains in computing, combined with stringent power constraints, has fostered the ever-growing computational parallelism associated with chip multiprocessors, memory systems, high-performance computing systems and data centers. Sustaining these parallelism growths introduces unique challenges for on- and off-chip communications, shifting the focus toward novel and fundamentally different communication approaches. Chip-scale photonic interconnection networks, enabled by high-performance silicon photonic devices, offer unprecedented bandwidth scalability with reduced power consumption. We demonstrate that the silicon photonic platforms have already produced all the high-performance photonic devices required to realize these types of networks. Through extensive empirical characterization in much of our work, we demonstrate such feasibility of waveguides, modulators, switches and photodetectors. We also demonstrate systems that simultaneously combine many functionalities to achieve more complex building blocks. We propose novel silicon photonic devices, subsystems, network topologies and architectures to enable unprecedented performance of these photonic interconnection networks. Furthermore, the advantages of photonic interconnection networks extend far beyond the chip, offering advanced communication environments for memory systems, high-performance computing systems, and data centers.

  13. 3D printing in chemical engineering and catalytic technology: structured catalysts, mixers and reactors.

    PubMed

    Parra-Cabrera, Cesar; Achille, Clement; Kuhn, Simon; Ameloot, Rob

    2018-01-02

    Computer-aided fabrication technologies combined with simulation and data processing approaches are changing our way of manufacturing and designing functional objects. Also in the field of catalytic technology and chemical engineering the impact of additive manufacturing, also referred to as 3D printing, is steadily increasing thanks to a rapidly decreasing equipment threshold. Although still in an early stage, the rapid and seamless transition between digital data and physical objects enabled by these fabrication tools will benefit both research and manufacture of reactors and structured catalysts. Additive manufacturing closes the gap between theory and experiment, by enabling accurate fabrication of geometries optimized through computational fluid dynamics and the experimental evaluation of their properties. This review highlights the research using 3D printing and computational modeling as digital tools for the design and fabrication of reactors and structured catalysts. The goal of this contribution is to stimulate interactions at the crossroads of chemistry and materials science on the one hand and digital fabrication and computational modeling on the other.

  14. The Nuclear Energy Advanced Modeling and Simulation Enabling Computational Technologies FY09 Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Diachin, L F; Garaizar, F X; Henson, V E

    2009-10-12

    In this document we report on the status of the Nuclear Energy Advanced Modeling and Simulation (NEAMS) Enabling Computational Technologies (ECT) effort. In particular, we provide the context for ECT in the broader NEAMS program and describe the three pillars of the ECT effort, namely, (1) tools and libraries, (2) software quality assurance, and (3) computational facility (computers, storage, etc.) needs. We report on our FY09 deliverables to determine the needs of the integrated performance and safety codes (IPSCs) in these three areas and lay out the general plan for software quality assurance to meet the requirements of DOE and the DOE Advanced Fuel Cycle Initiative (AFCI). We conclude with a brief description of our interactions with the Idaho National Laboratory computer center to determine what is needed to expand their role as a NEAMS user facility.

  15. Three-Dimensional Nanobiocomputing Architectures With Neuronal Hypercells

    DTIC Science & Technology

    2007-06-01

    …von Neumann architectures, and CMOS fabrication. Novel solutions for massively parallel distributed computing and processing (pipelined due to systolic…) use processing platforms that utilize molecular hardware within an enabling organization and architecture. Microsystems and Nanotechnologies investigated a novel 3D3 (hardware-software-nanotechnology) technology to design super-high-performance computing…

  16. DOE Office of Scientific and Technical Information (OSTI.GOV)

    McCaskey, Alexander J.

    Hybrid programming models for beyond-CMOS technologies will prove critical for integrating new computing technologies alongside our existing infrastructure. Unfortunately the software infrastructure required to enable this is lacking or not available. XACC is a programming framework for extreme-scale, post-exascale accelerator architectures that integrates alongside existing conventional applications. It is a pluggable framework for programming languages developed for next-gen computing hardware architectures like quantum and neuromorphic computing. It lets computational scientists efficiently off-load classically intractable work to attached accelerators through user-friendly Kernel definitions. XACC makes post-exascale hybrid programming approachable for domain computational scientists.
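
    To make the kernel off-load idea concrete, here is a minimal sketch of a hybrid host/accelerator workflow in Python. The Accelerator class and its execute method are hypothetical stand-ins for the kind of pluggable backend the record describes; they are assumptions for illustration and are not XACC's actual API.

    ```python
    # Hypothetical sketch of hybrid classical/accelerator programming in the
    # spirit of the framework described above. The Accelerator interface is an
    # assumption for illustration only; it is not the real XACC API.

    class Accelerator:
        """Stand-in for a pluggable quantum or neuromorphic backend."""

        def __init__(self, name):
            self.name = name

        def execute(self, kernel, *args):
            # A real backend would compile `kernel` to its native instruction
            # set and run it on attached hardware; here it just runs locally.
            return kernel(*args)

    def bell_counts(shots):
        """Kernel: the classically intractable work would be expressed here."""
        # Placeholder: a real kernel would prepare and measure a Bell state.
        return {"00": shots // 2, "11": shots - shots // 2}

    qpu = Accelerator("simulated-qpu")
    counts = qpu.execute(bell_counts, 1024)   # off-load the kernel
    print(counts)                             # classical post-processing
    ```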

  17. Distributed Network, Wireless and Cloud Computing Enabled 3-D Ultrasound; a New Medical Technology Paradigm

    PubMed Central

    Meir, Arie; Rubinsky, Boris

    2009-01-01

    Medical technologies are indispensable to modern medicine. However, they have become exceedingly expensive and complex and are not available to the economically disadvantaged majority of the world population in underdeveloped as well as developed parts of the world. For example, according to the World Health Organization about two thirds of the world population does not have access to medical imaging. In this paper we introduce a new medical technology paradigm centered on wireless technology and cloud computing that was designed to overcome the problems of increasing health technology costs. We demonstrate the value of the concept with an example; the design of a wireless, distributed network and central (cloud) computing enabled three-dimensional (3-D) ultrasound system. Specifically, we demonstrate the feasibility of producing a 3-D high end ultrasound scan at a central computing facility using the raw data acquired at the remote patient site with an inexpensive low end ultrasound transducer designed for 2-D, through a mobile device and wireless connection link between them. Producing high-end 3D ultrasound images with simple low-end transducers reduces the cost of imaging by orders of magnitude. It also removes the requirement of having a highly trained imaging expert at the patient site, since the need for hand-eye coordination and the ability to reconstruct a 3-D mental image from 2-D scans, which is a necessity for high quality ultrasound imaging, is eliminated. This could enable relatively untrained medical workers in developing nations to administer imaging and a more accurate diagnosis, effectively saving the lives of people. PMID:19936236
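
    On the client side, the architecture described in this record (and repeated in the next) reduces to shipping raw 2-D transducer frames to a central computing facility and retrieving the reconstructed 3-D volume. A minimal sketch of that upload step follows; the server URL, form field names, and response schema are assumptions for illustration, not part of the published system.

    ```python
    # Minimal client-side sketch: raw 2-D scan frames are posted to a central
    # (cloud) facility for 3-D reconstruction. The URL, field names and
    # response schema are illustrative assumptions, not the published system.
    import io

    import numpy as np
    import requests

    def upload_frames(frames, server="https://example.org/reconstruct"):
        """Send raw frames (n_frames x height x width) and return a job id."""
        buf = io.BytesIO()
        np.save(buf, frames)                        # serialize the raw data
        resp = requests.post(server,
                             files={"frames": ("frames.npy", buf.getvalue())},
                             timeout=60)
        resp.raise_for_status()
        return resp.json()["job_id"]                # assumed response schema

    if __name__ == "__main__":
        fake_scan = np.zeros((128, 64, 64), dtype=np.uint8)  # placeholder data
        print(upload_frames(fake_scan))
    ```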

  18. Distributed network, wireless and cloud computing enabled 3-D ultrasound; a new medical technology paradigm.

    PubMed

    Meir, Arie; Rubinsky, Boris

    2009-11-19

    Medical technologies are indispensable to modern medicine. However, they have become exceedingly expensive and complex and are not available to the economically disadvantaged majority of the world population in underdeveloped as well as developed parts of the world. For example, according to the World Health Organization about two thirds of the world population does not have access to medical imaging. In this paper we introduce a new medical technology paradigm centered on wireless technology and cloud computing that was designed to overcome the problems of increasing health technology costs. We demonstrate the value of the concept with an example; the design of a wireless, distributed network and central (cloud) computing enabled three-dimensional (3-D) ultrasound system. Specifically, we demonstrate the feasibility of producing a 3-D high end ultrasound scan at a central computing facility using the raw data acquired at the remote patient site with an inexpensive low end ultrasound transducer designed for 2-D, through a mobile device and wireless connection link between them. Producing high-end 3D ultrasound images with simple low-end transducers reduces the cost of imaging by orders of magnitude. It also removes the requirement of having a highly trained imaging expert at the patient site, since the need for hand-eye coordination and the ability to reconstruct a 3-D mental image from 2-D scans, which is a necessity for high quality ultrasound imaging, is eliminated. This could enable relatively untrained medical workers in developing nations to administer imaging and a more accurate diagnosis, effectively saving the lives of people.

  19. Multiband Radio Frequency Interconnect (MRFI) Technology For Next Generation Mobile/Airborne Computing Systems

    DTIC Science & Technology

    2017-02-01

    The multiband radio frequency interconnect (MRFI) technology enables high scalability and reconfigurability for inter-CPU/memory communications with an increased number of communication channels in frequency… testing in the University of California, Los Angeles (UCLA) Center for High Frequency Electronics, and Dr. Afshin Momtaz at Broadcom Corporation…

  20. Open Science in the Cloud: Towards a Universal Platform for Scientific and Statistical Computing

    NASA Astrophysics Data System (ADS)

    Chine, Karim

    The UK, through the e-Science program, the US, through the NSF-funded cyberinfrastructure, and the European Union, through the ICT Calls, aimed to provide "the technological solution to the problem of efficiently connecting data, computers, and people with the goal of enabling derivation of novel scientific theories and knowledge" [1]. The Grid (Foster, 2002; Foster, Kesselman, Nick, & Tuecke, 2002), foreseen as a major accelerator of discovery, didn't meet the expectations it had excited at its beginnings and was not adopted by the broad population of research professionals. The Grid is a good tool for particle physicists and it has allowed them to tackle the tremendous computational challenges inherent to their field. However, as a technology and paradigm for delivering computing on demand, it doesn't work and it can't be fixed. On one hand, "the abstractions that Grids expose - to the end-user, to the deployers and to application developers - are inappropriate and they need to be higher level" (Jha, Merzky, & Fox), and on the other hand, academic Grids are inherently economically unsustainable. They can't compete with a service outsourced to industry whose quality and price would be driven by market forces. The virtualization technologies and their corollary, the Infrastructure-as-a-Service (IaaS) style cloud, hold the promise to enable what the Grid failed to deliver: a sustainable environment for computational sciences that would lower the barriers for accessing federated computational resources, software tools and data; enable collaboration and resource sharing; and provide the building blocks of a ubiquitous platform for traceable and reproducible computational research.

  1. The fusion code XGC: Enabling kinetic study of multi-scale edge turbulent transport in ITER [Book Chapter]

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    D'Azevedo, Eduardo; Abbott, Stephen; Koskela, Tuomas

    The XGC fusion gyrokinetic code combines state-of-the-art, portable computational and algorithmic technologies to enable complicated multiscale simulations of turbulence and transport dynamics in ITER edge plasma on the largest US open-science computer, the Cray XK7 Titan, at its maximal heterogeneous capability. Such simulations had not been possible before because the time-to-solution was more than a factor of 10 too long to complete one physics case in less than 5 days of wall-clock time. Frontier techniques such as nested OpenMP parallelism; adaptive parallel I/O; staging I/O and data reduction using dynamic and asynchronous application interactions; dynamic repartitioning for balancing computational work in pushing particles and in grid-related work; scalable and accurate discretization algorithms for non-linear Coulomb collisions; and communication-avoiding subcycling technology for pushing particles on both CPUs and GPUs are also utilized to dramatically improve the scalability and time-to-solution, hence enabling the difficult kinetic ITER edge simulation on a present-day leadership-class computer.
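
    One technique named in this record, communication-avoiding subcycling of the particle push, can be illustrated independently of the full gyrokinetic model: particles take several cheap substeps between expensive field updates. The toy force field and step sizes in the sketch below are arbitrary assumptions; this is not XGC's numerics.

    ```python
    # Illustrative subcycling sketch: particles take several cheap substeps
    # between (expensive, communication-heavy) field updates. The harmonic
    # force used here is an arbitrary assumption, not XGC physics.
    import numpy as np

    def field(x):
        return -x                      # placeholder restoring force

    def push(x, v, dt, nsub):
        """Advance particles by dt using nsub kick-drift substeps."""
        h = dt / nsub
        for _ in range(nsub):
            v += h * field(x)          # kick
            x += h * v                 # drift
        return x, v

    rng = np.random.default_rng(0)
    x = rng.normal(size=10_000)
    v = np.zeros_like(x)
    for step in range(100):            # outer loop: field solve, communication
        x, v = push(x, v, dt=0.1, nsub=8)
    print(x.std(), v.std())
    ```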

  2. Enabling the environmentally clean air transportation of the future: a vision of computational fluid dynamics in 2030

    PubMed Central

    Slotnick, Jeffrey P.; Khodadoust, Abdollah; Alonso, Juan J.; Darmofal, David L.; Gropp, William D.; Lurie, Elizabeth A.; Mavriplis, Dimitri J.; Venkatakrishnan, Venkat

    2014-01-01

    As global air travel expands rapidly to meet demand generated by economic growth, it is essential to continue to improve the efficiency of air transportation to reduce its carbon emissions and address concerns about climate change. Future transports must be ‘cleaner’ and designed to include technologies that will continue to lower engine emissions and reduce community noise. The use of computational fluid dynamics (CFD) will be critical to enable the design of these new concepts. In general, the ability to simulate aerodynamic and reactive flows using CFD has progressed rapidly during the past several decades and has fundamentally changed the aerospace design process. Advanced simulation capabilities not only enable reductions in ground-based and flight-testing requirements, but also provide added physical insight, and enable superior designs at reduced cost and risk. In spite of considerable success, reliable use of CFD has remained confined to a small region of the operating envelope due, in part, to the inability of current methods to reliably predict turbulent, separated flows. Fortunately, the advent of much more powerful computing platforms provides an opportunity to overcome a number of these challenges. This paper summarizes the findings and recommendations from a recent NASA-funded study that provides a vision for CFD in the year 2030, including an assessment of critical technology gaps and needed development, and identifies the key CFD technology advancements that will enable the design and development of much cleaner aircraft in the future. PMID:25024413

  3. Heat Treatment Used to Strengthen Enabling Coating Technology for Oil-Free Turbomachinery

    NASA Technical Reports Server (NTRS)

    Edmonds, Brian J.; DellaCorte, Christopher

    2002-01-01

    The PS304 high-temperature solid lubricant coating is a key enabling technology for Oil-Free turbomachinery propulsion and power systems. Breakthroughs in the performance of advanced foil air bearings and improvements in computer-based finite element modeling techniques are the key technologies enabling the development of Oil-Free aircraft engines being pursued by the Oil-Free Turbomachinery team at the NASA Glenn Research Center. PS304 is a plasma spray coating applied to the surface of shafts operating against foil air bearings or in any other component requiring solid lubrication at high temperatures, where conventional materials such as graphite cannot function.

  4. Cloud Computing Fundamentals

    NASA Astrophysics Data System (ADS)

    Furht, Borko

    In the introductory chapter we define the concept of cloud computing and cloud services, and we introduce layers and types of cloud computing. We discuss the differences between cloud computing and cloud services. New technologies that enabled cloud computing are presented next. We also discuss cloud computing features, standards, and security issues. We introduce the key cloud computing platforms, their vendors, and their offerings. We discuss cloud computing challenges and the future of cloud computing.

  5. Administrative Uses of Computers in the Schools.

    ERIC Educational Resources Information Center

    Bluhm, Harry P.

    This book, intended for school administrators, provides a comprehensive account of how computer information systems can enable administrators at both middle and top management levels to manage the educational enterprise. It can be used as a textbook in an educational administration course emphasizing computer technology in education, an…

  6. 7 CFR 1739.3 - Definitions.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... terrestrial technology having the capacity to provide transmission facilities that enable subscribers of the...) Computer Access Points and wireless access, that is used for the purposes of providing free access to and..., and after normal working hours and on Saturdays or Sunday. Computer Access Point means a new computer...

  7. Enabling Large-Scale Biomedical Analysis in the Cloud

    PubMed Central

    Lin, Ying-Chih; Yu, Chin-Sheng; Lin, Yen-Jen

    2013-01-01

    Recent progress in high-throughput instrumentations has led to an astonishing growth in both volume and complexity of biomedical data collected from various sources. The planet-size data brings serious challenges to the storage and computing technologies. Cloud computing is an alternative to crack the nut because it gives concurrent consideration to enable storage and high-performance computing on large-scale data. This work briefly introduces the data intensive computing system and summarizes existing cloud-based resources in bioinformatics. These developments and applications would facilitate biomedical research to make the vast amount of diversification data meaningful and usable. PMID:24288665

  8. Megatux

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2012-09-25

    The Megatux platform enables the emulation of large-scale (multi-million-node) distributed systems. In particular, it allows for the emulation of large-scale networks interconnecting a very large number of emulated computer systems. It does this by leveraging virtualization and associated technologies to allow hundreds of virtual computers to be hosted on a single moderately sized server or workstation. Virtualization technology provided by modern processors allows multiple guest OSs to run at the same time, sharing the hardware resources. The Megatux platform can be deployed on a single PC, a small cluster of a few boxes, or a large cluster of computers. With a modest cluster, the Megatux platform can emulate complex organizational networks. By using virtualization, we emulate the hardware but run actual software, enabling large scale without sacrificing fidelity.

  9. Reconfigurable Computing As an Enabling Technology for Single-Photon-Counting Laser Altimetry

    NASA Technical Reports Server (NTRS)

    Powell, Wesley; Hicks, Edward; Pinchinat, Maxime; Dabney, Philip; McGarry, Jan; Murray, Paul

    2003-01-01

    Single-photon-counting laser altimetry is a new measurement technique offering significant advantages in vertical resolution, reducing instrument size, mass, and power, and reducing laser complexity as compared to analog or threshold detection laser altimetry techniques. However, these improvements come at the cost of a dramatically increased requirement for onboard real-time data processing. Reconfigurable computing has been shown to offer considerable performance advantages in performing this processing. These advantages have been demonstrated on the Multi-KiloHertz Micro-Laser Altimeter (MMLA), an aircraft based single-photon-counting laser altimeter developed by NASA Goddard Space Flight Center with several potential spaceflight applications. This paper describes how reconfigurable computing technology was employed to perform MMLA data processing in real-time under realistic operating constraints, along with the results observed. This paper also expands on these prior results to identify concepts for using reconfigurable computing to enable spaceflight single-photon-counting laser altimeter instruments.
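
    The onboard processing burden mentioned here comes from picking the true surface return out of a high rate of background photons. A simple software analogue of that task, histogramming photon times of flight over many shots and taking the densest bin, is sketched below; the photon rates, range gate, and bin width are assumptions, not MMLA parameters.

    ```python
    # Software analogue of the real-time processing a single-photon-counting
    # altimeter performs onboard: accumulate photon time-of-flight events over
    # many laser shots and take the densest histogram bin as the surface
    # return. Rates, gate length and bin width are illustrative assumptions.
    import numpy as np

    rng = np.random.default_rng(1)
    gate_ns, bin_ns, true_range_ns = 10_000.0, 5.0, 6_237.0

    events = []
    for _ in range(1_000):                                         # laser shots
        noise = rng.uniform(0, gate_ns, rng.poisson(3))            # background
        signal = rng.normal(true_range_ns, 1.0, rng.poisson(0.2))  # rare returns
        events.append(np.concatenate([noise, signal]))

    tof = np.concatenate(events)
    hist, edges = np.histogram(tof, bins=int(gate_ns / bin_ns), range=(0, gate_ns))
    best = hist.argmax()
    print(f"estimated return near {edges[best]:.0f} ns (truth {true_range_ns} ns)")
    ```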

  10. Omics-Based Strategies in Precision Medicine: Toward a Paradigm Shift in Inborn Errors of Metabolism Investigations

    PubMed Central

    Tebani, Abdellah; Afonso, Carlos; Marret, Stéphane; Bekri, Soumeya

    2016-01-01

    The rise of technologies that simultaneously measure thousands of data points represents the heart of systems biology. These technologies have had a huge impact on the discovery of next-generation diagnostics, biomarkers, and drugs in the precision medicine era. Systems biology aims to achieve systemic exploration of complex interactions in biological systems. Driven by high-throughput omics technologies and the computational surge, it enables multi-scale and insightful overviews of cells, organisms, and populations. Precision medicine capitalizes on these conceptual and technological advancements and stands on two main pillars: data generation and data modeling. High-throughput omics technologies allow the retrieval of comprehensive and holistic biological information, whereas computational capabilities enable high-dimensional data modeling and, therefore, accessible and user-friendly visualization. Furthermore, bioinformatics has enabled comprehensive multi-omics and clinical data integration for insightful interpretation. Despite their promise, the translation of these technologies into clinically actionable tools has been slow. In this review, we present state-of-the-art multi-omics data analysis strategies in a clinical context. The challenges of omics-based biomarker translation are discussed. Perspectives regarding the use of multi-omics approaches for inborn errors of metabolism (IEM) are presented by introducing a new paradigm shift in addressing IEM investigations in the post-genomic era. PMID:27649151

  11. Omics-Based Strategies in Precision Medicine: Toward a Paradigm Shift in Inborn Errors of Metabolism Investigations.

    PubMed

    Tebani, Abdellah; Afonso, Carlos; Marret, Stéphane; Bekri, Soumeya

    2016-09-14

    The rise of technologies that simultaneously measure thousands of data points represents the heart of systems biology. These technologies have had a huge impact on the discovery of next-generation diagnostics, biomarkers, and drugs in the precision medicine era. Systems biology aims to achieve systemic exploration of complex interactions in biological systems. Driven by high-throughput omics technologies and the computational surge, it enables multi-scale and insightful overviews of cells, organisms, and populations. Precision medicine capitalizes on these conceptual and technological advancements and stands on two main pillars: data generation and data modeling. High-throughput omics technologies allow the retrieval of comprehensive and holistic biological information, whereas computational capabilities enable high-dimensional data modeling and, therefore, accessible and user-friendly visualization. Furthermore, bioinformatics has enabled comprehensive multi-omics and clinical data integration for insightful interpretation. Despite their promise, the translation of these technologies into clinically actionable tools has been slow. In this review, we present state-of-the-art multi-omics data analysis strategies in a clinical context. The challenges of omics-based biomarker translation are discussed. Perspectives regarding the use of multi-omics approaches for inborn errors of metabolism (IEM) are presented by introducing a new paradigm shift in addressing IEM investigations in the post-genomic era.

  12. Brain-computer interfaces in neurological rehabilitation.

    PubMed

    Daly, Janis J; Wolpaw, Jonathan R

    2008-11-01

    Recent advances in analysis of brain signals, training patients to control these signals, and improved computing capabilities have enabled people with severe motor disabilities to use their brain signals for communication and control of objects in their environment, thereby bypassing their impaired neuromuscular system. Non-invasive, electroencephalogram (EEG)-based brain-computer interface (BCI) technologies can be used to control a computer cursor or a limb orthosis, for word processing and accessing the internet, and for other functions such as environmental control or entertainment. By re-establishing some independence, BCI technologies can substantially improve the lives of people with devastating neurological disorders such as advanced amyotrophic lateral sclerosis. BCI technology might also restore more effective motor control to people after stroke or other traumatic brain disorders by helping to guide activity-dependent brain plasticity by use of EEG brain signals to indicate to the patient the current state of brain activity and to enable the user to subsequently lower abnormal activity. Alternatively, by use of brain signals to supplement impaired muscle control, BCIs might increase the efficacy of a rehabilitation protocol and thus improve muscle control for the patient.

  13. Mobile Computing: Trends Enabling Virtual Management

    ERIC Educational Resources Information Center

    Kuyatt, Alan E.

    2011-01-01

    The growing power of mobile computing, with its constantly available wireless link to information, creates an opportunity to use innovative ways to work from any location. This technological capability allows companies to remove constraints of physical proximity so that people and enterprises can work together at a distance. Mobile computing is…

  14. Web-Based Architecture to Enable Compute-Intensive CAD Tools and Multi-user Synchronization in Teleradiology

    NASA Astrophysics Data System (ADS)

    Mehta, Neville; Kompalli, Suryaprakash; Chaudhary, Vipin

    Teleradiology is the electronic transmission of radiological patient images, such as x-rays, CT, or MR across multiple locations. The goal could be interpretation, consultation, or medical records keeping. Information technology solutions have enabled electronic records and their associated benefits are evident in health care today. However, salient aspects of collaborative interfaces, and computer assisted diagnostic (CAD) tools are yet to be integrated into workflow designs. The Computer Assisted Diagnostics and Interventions (CADI) group at the University at Buffalo has developed an architecture that facilitates web-enabled use of CAD tools, along with the novel concept of synchronized collaboration. The architecture can support multiple teleradiology applications and case studies are presented here.

  15. Processing Shotgun Proteomics Data on the Amazon Cloud with the Trans-Proteomic Pipeline*

    PubMed Central

    Slagel, Joseph; Mendoza, Luis; Shteynberg, David; Deutsch, Eric W.; Moritz, Robert L.

    2015-01-01

    Cloud computing, where scalable, on-demand compute cycles and storage are available as a service, has the potential to accelerate mass spectrometry-based proteomics research by providing simple, expandable, and affordable large-scale computing to all laboratories regardless of location or information technology expertise. We present new cloud computing functionality for the Trans-Proteomic Pipeline, a free and open-source suite of tools for the processing and analysis of tandem mass spectrometry datasets. Enabled with Amazon Web Services cloud computing, the Trans-Proteomic Pipeline now accesses large scale computing resources, limited only by the available Amazon Web Services infrastructure, for all users. The Trans-Proteomic Pipeline runs in an environment fully hosted on Amazon Web Services, where all software and data reside on cloud resources to tackle large search studies. In addition, it can also be run on a local computer with computationally intensive tasks launched onto the Amazon Elastic Compute Cloud service to greatly decrease analysis times. We describe the new Trans-Proteomic Pipeline cloud service components, compare the relative performance and costs of various Elastic Compute Cloud service instance types, and present on-line tutorials that enable users to learn how to deploy cloud computing technology rapidly with the Trans-Proteomic Pipeline. We provide tools for estimating the necessary computing resources and costs given the scale of a job and demonstrate the use of cloud enabled Trans-Proteomic Pipeline by performing over 1100 tandem mass spectrometry files through four proteomic search engines in 9 h and at a very low cost. PMID:25418363
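
    This record and the PubMed version that follows describe the same work. The cloud mechanics reduce to requesting Elastic Compute Cloud capacity, dispatching the search tasks, and releasing the instances; a minimal boto3 sketch of the request-and-release step is shown below. The AMI id, instance type, and instance count are placeholders, and this is not the Trans-Proteomic Pipeline's own launch tooling.

    ```python
    # Minimal sketch of requesting EC2 capacity for a batch of search tasks,
    # in the spirit of the workflow described above. The AMI id, instance type
    # and counts are placeholders; this is not TPP's own launch tooling.
    import boto3

    ec2 = boto3.client("ec2", region_name="us-west-2")

    resp = ec2.run_instances(
        ImageId="ami-0123456789abcdef0",   # placeholder AMI with tools installed
        InstanceType="c5.4xlarge",         # placeholder compute-optimized type
        MinCount=1,
        MaxCount=8,                        # scale out for a large search study
    )
    instance_ids = [i["InstanceId"] for i in resp["Instances"]]
    print("launched:", instance_ids)

    # ... dispatch search jobs to the instances and collect the results ...

    ec2.terminate_instances(InstanceIds=instance_ids)   # release capacity
    ```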

  16. Processing shotgun proteomics data on the Amazon cloud with the trans-proteomic pipeline.

    PubMed

    Slagel, Joseph; Mendoza, Luis; Shteynberg, David; Deutsch, Eric W; Moritz, Robert L

    2015-02-01

    Cloud computing, where scalable, on-demand compute cycles and storage are available as a service, has the potential to accelerate mass spectrometry-based proteomics research by providing simple, expandable, and affordable large-scale computing to all laboratories regardless of location or information technology expertise. We present new cloud computing functionality for the Trans-Proteomic Pipeline, a free and open-source suite of tools for the processing and analysis of tandem mass spectrometry datasets. Enabled with Amazon Web Services cloud computing, the Trans-Proteomic Pipeline now accesses large scale computing resources, limited only by the available Amazon Web Services infrastructure, for all users. The Trans-Proteomic Pipeline runs in an environment fully hosted on Amazon Web Services, where all software and data reside on cloud resources to tackle large search studies. In addition, it can also be run on a local computer with computationally intensive tasks launched onto the Amazon Elastic Compute Cloud service to greatly decrease analysis times. We describe the new Trans-Proteomic Pipeline cloud service components, compare the relative performance and costs of various Elastic Compute Cloud service instance types, and present on-line tutorials that enable users to learn how to deploy cloud computing technology rapidly with the Trans-Proteomic Pipeline. We provide tools for estimating the necessary computing resources and costs given the scale of a job and demonstrate the use of cloud enabled Trans-Proteomic Pipeline by performing over 1100 tandem mass spectrometry files through four proteomic search engines in 9 h and at a very low cost. © 2015 by The American Society for Biochemistry and Molecular Biology, Inc.

  17. Learning COMCAT (Computer Output Microform Catalog): Library Training Program for Foreign Students at New York Institute of Technology.

    ERIC Educational Resources Information Center

    Chiang, Ching-hsin

    This thesis reports on the designer's plans and experiences in carrying out the design, development, implementation, and evaluation of a project, the purpose of which was to develop a training program that would enable foreign students at the New York Institute of Technology (NYIT) to use the Computer Output Microform Catalog (COMCAT) and to…

  18. Computational Support for Technology- Investment Decisions

    NASA Technical Reports Server (NTRS)

    Adumitroaie, Virgil; Hua, Hook; Lincoln, William; Block, Gary; Mrozinski, Joseph; Shelton, Kacie; Weisbin, Charles; Elfes, Alberto; Smith, Jeffrey

    2007-01-01

    Strategic Assessment of Risk and Technology (START) is a user-friendly computer program that assists human managers in making decisions regarding research-and-development investment portfolios in the presence of uncertainties and of non-technological constraints that include budgetary and time limits, restrictions related to infrastructure, and programmatic and institutional priorities. START facilitates quantitative analysis of technologies, capabilities, missions, scenarios and programs, and thereby enables the selection and scheduling of value-optimal development efforts. START incorporates features that, variously, perform or support a unique combination of functions, most of which are not systematically performed or supported by prior decision- support software. These functions include the following: Optimal portfolio selection using an expected-utility-based assessment of capabilities and technologies; Temporal investment recommendations; Distinctions between enhancing and enabling capabilities; Analysis of partial funding for enhancing capabilities; and Sensitivity and uncertainty analysis. START can run on almost any computing hardware, within Linux and related operating systems that include Mac OS X versions 10.3 and later, and can run in Windows under the Cygwin environment. START can be distributed in binary code form. START calls, as external libraries, several open-source software packages. Output is in Excel (.xls) file format.
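
    The portfolio-selection function listed above can be illustrated with a toy expected-utility model: choose the subset of candidate investments that maximizes expected value subject to a budget limit. The brute-force sketch below is purely illustrative; the candidate technologies, costs, success probabilities, and values are invented and have no connection to START's internal model.

    ```python
    # Toy expected-utility portfolio selection under a budget, in the spirit
    # of the capability described above. Candidates, costs, probabilities and
    # values are invented for illustration; this is not START's algorithm.
    from itertools import combinations

    candidates = {   # name: (cost, probability_of_success, value_if_successful)
        "tech_A": (3.0, 0.7, 10.0),
        "tech_B": (5.0, 0.5, 18.0),
        "tech_C": (2.0, 0.9, 4.0),
        "tech_D": (4.0, 0.6, 12.0),
    }
    budget = 9.0

    def cost(portfolio):
        return sum(candidates[name][0] for name in portfolio)

    def expected_value(portfolio):
        return sum(p * v for _, p, v in (candidates[name] for name in portfolio))

    feasible = (combo
                for r in range(len(candidates) + 1)
                for combo in combinations(candidates, r)
                if cost(combo) <= budget)
    best = max(feasible, key=expected_value)
    print(best, expected_value(best))
    ```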

  19. Virtual Team Effectiveness: An Empirical Examination of the Use of Communication Technologies on Trust and Virtual Team Performance

    ERIC Educational Resources Information Center

    Thomas, Valerie Brown

    2010-01-01

    Ubiquitous technology and agile organizational structures have enabled a strategic response to increasingly competitive, complex, and unpredictable challenges faced by many organizations. Using cyberinfrastructure, which is primarily the network of information, computers, communication technologies, and people, traditional organizations have…

  20. Evaluating Technology Integration in the Elementary School: A Site-Based Approach.

    ERIC Educational Resources Information Center

    Mowe, Richard

    This book enables educators at the elementary level to conduct formative evaluations of their technology programs in minimum time. Most of the technology is computer related, including word processing, graphics, desktop publishing, spreadsheets, databases, instructional software, programming, and telecommunications. The design of the book is aimed…

  1. A network-based distributed, media-rich computing and information environment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Phillips, R.L.

    1995-12-31

    Sunrise is a Los Alamos National Laboratory (LANL) project started in October 1993. It is intended to be a prototype National Information Infrastructure development project. A main focus of Sunrise is to tie together enabling technologies (networking, object-oriented distributed computing, graphical interfaces, security, multi-media technologies, and data-mining technologies) with several specific applications. A diverse set of application areas was chosen to ensure that the solutions developed in the project are as generic as possible. Some of the application areas are materials modeling, medical records and image analysis, transportation simulations, and K-12 education. This paper provides a description of Sunrise and a view of the architecture and objectives of this evolving project. The primary objectives of Sunrise are three-fold: (1) To develop common information-enabling tools for advanced scientific research and its applications to industry; (2) To enhance the capabilities of important research programs at the Laboratory; (3) To define a new way of collaboration between computer science and industrially-relevant research.

  2. Vehicle telematics as a platform for road use fees : final report.

    DOT National Transportation Integrated Search

    2016-11-01

    Vehicle telematics systems are composed of various onboard communications, positioning technologies, and computing technologies. Much of the data generated and/or gathered by these systems can be used to determine travel. These systems enable a range...

  3. Technologies and Reformed-Based Science Instruction: The Examination of a Professional Development Model Focused on Supporting Science Teaching and Learning with Technologies

    ERIC Educational Resources Information Center

    Campbell, Todd; Longhurst, Max L.; Wang, Shiang-Kwei; Hsu, Hui-Yin; Coster, Dan C.

    2015-01-01

    While access to computers, other technologies, and cyber-enabled resources that could be leveraged for enhancing student learning in science is increasing, generally it has been found that teachers use technology more for administrative purposes or to support traditional instruction. This use of technology, especially to support traditional…

  4. Enabling the environmentally clean air transportation of the future: a vision of computational fluid dynamics in 2030.

    PubMed

    Slotnick, Jeffrey P; Khodadoust, Abdollah; Alonso, Juan J; Darmofal, David L; Gropp, William D; Lurie, Elizabeth A; Mavriplis, Dimitri J; Venkatakrishnan, Venkat

    2014-08-13

    As global air travel expands rapidly to meet demand generated by economic growth, it is essential to continue to improve the efficiency of air transportation to reduce its carbon emissions and address concerns about climate change. Future transports must be 'cleaner' and designed to include technologies that will continue to lower engine emissions and reduce community noise. The use of computational fluid dynamics (CFD) will be critical to enable the design of these new concepts. In general, the ability to simulate aerodynamic and reactive flows using CFD has progressed rapidly during the past several decades and has fundamentally changed the aerospace design process. Advanced simulation capabilities not only enable reductions in ground-based and flight-testing requirements, but also provide added physical insight, and enable superior designs at reduced cost and risk. In spite of considerable success, reliable use of CFD has remained confined to a small region of the operating envelope due, in part, to the inability of current methods to reliably predict turbulent, separated flows. Fortunately, the advent of much more powerful computing platforms provides an opportunity to overcome a number of these challenges. This paper summarizes the findings and recommendations from a recent NASA-funded study that provides a vision for CFD in the year 2030, including an assessment of critical technology gaps and needed development, and identifies the key CFD technology advancements that will enable the design and development of much cleaner aircraft in the future. © 2014 The Author(s) Published by the Royal Society. All rights reserved.

  5. Data Movement Dominates: Advanced Memory Technology to Address the Real Exascale Power Problem

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bergman, Keren

    Energy is the fundamental barrier to Exascale supercomputing, and it is dominated by the cost of moving data from one point to another, not by computation. Similarly, performance is dominated by data movement, not computation. The solution to this problem requires three critical technologies: 3D integration, optical chip-to-chip communication, and a new communication model. The central goal of the Sandia-led "Data Movement Dominates" project was to develop memory systems and new architectures based on these technologies that have the potential to lower the cost of local memory accesses by orders of magnitude and provide substantially more bandwidth. Only through such transformational advances can future systems reach the goals of Exascale computing within a manageable power budget. The Sandia-led team included co-PIs from Columbia University, Lawrence Berkeley Lab, and the University of Maryland. The Columbia effort of Data Movement Dominates focused on developing a physically accurate simulation environment, and experimental verification, for optically-connected memory (OCM) systems that can enable continued performance scaling through high bandwidth capacity, energy-efficient bit-rate transparency, and time-of-flight latency. With OCM, memory device parallelism and total capacity can scale to match future high-performance computing requirements without sacrificing data-movement efficiency. When we consider systems with integrated photonics, links to memory can be seamlessly integrated with the interconnection network; in a sense, memory becomes a primary aspect of the interconnection network. At the core of the Columbia effort, toward expanding our understanding of OCM-enabled computing, we have created an integrated modeling and simulation environment that uniquely integrates the physical behavior of the optical layer. The PhoenxSim suite of design and software tools developed under this effort has enabled the co-design and performance evaluation of photonics-enabled OCM architectures on Exascale computing systems.
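
    A back-of-the-envelope comparison makes the "data movement dominates" point concrete. The per-operation energies below are assumed, order-of-magnitude illustrative values only, not figures from the project report.

        # Assumed, illustrative energy figures (order of magnitude only).
        FLOP_ENERGY_PJ  = 20.0    # assumed energy per double-precision FLOP, in pJ
        DRAM_PJ_PER_BIT = 10.0    # assumed energy per bit for an off-chip DRAM access

        # Moving a single 64-bit operand from DRAM versus computing with it:
        word_move_pj = 64 * DRAM_PJ_PER_BIT
        ratio = word_move_pj / FLOP_ENERGY_PJ
        print(f"moving one operand costs roughly {ratio:.0f}x the FLOP that uses it")  # ~32x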

  6. Software Reuse Methods to Improve Technological Infrastructure for e-Science

    NASA Technical Reports Server (NTRS)

    Marshall, James J.; Downs, Robert R.; Mattmann, Chris A.

    2011-01-01

    Social computing has the potential to contribute to scientific research. Ongoing developments in information and communications technology improve capabilities for enabling scientific research, including research fostered by social computing capabilities. The recent emergence of e-Science practices has demonstrated the benefits from improvements in the technological infrastructure, or cyber-infrastructure, that has been developed to support science. Cloud computing is one example of this e-Science trend. Our own work in the area of software reuse offers methods that can be used to improve new technological development, including cloud computing capabilities, to support scientific research practices. In this paper, we focus on software reuse and its potential to contribute to the development and evaluation of information systems and related services designed to support new capabilities for conducting scientific research.

  7. Optical computing, optical memory, and SBIRs at Foster-Miller

    NASA Astrophysics Data System (ADS)

    Domash, Lawrence H.

    1994-03-01

    A desktop design and manufacturing system for binary diffractive elements, MacBEEP, was developed with the optical researcher in mind. Optical processing systems for specialized tasks such as cellular automaton computation and fractal measurement were constructed. A new family of switchable holograms has enabled several applications for control of laser beams in optical memories. New spatial light modulators and optical logic elements have been demonstrated based on a more manufacturable semiconductor technology. Novel synthetic and polymeric nonlinear materials for optical storage are under development in an integrated memory architecture. SBIR programs enable creative contributions from smaller companies, both product oriented and technology oriented, and support advances that might not otherwise be developed.

  8. The change in critical technologies for computational physics

    NASA Technical Reports Server (NTRS)

    Watson, Val

    1990-01-01

    It is noted that the types of technology required for computational physics are changing as the field matures. Emphasis has shifted from computer technology to algorithm technology and, finally, to visual analysis technology as areas of critical research for this field. High-performance graphical workstations tied to a supercomputer with high-speed communications, along with the development of specially tailored visualization software, have enabled analysis of highly complex fluid-dynamics simulations. Particular reference is made here to the development of visual analysis tools at NASA's Numerical Aerodynamics Simulation Facility. The next technology which this field requires is one that would eliminate visual clutter by extracting key features of simulations of physics and technology in order to create displays that clearly portray these key features. Research in the tuning of visual displays to human cognitive abilities is proposed. The immediate transfer of technology to all levels of computers, specifically the inclusion of visualization primitives in basic software developments for all workstations and PCs, is recommended.

  9. Dynamic Transportation Navigation

    NASA Astrophysics Data System (ADS)

    Meng, Xiaofeng; Chen, Jidong

    Miniaturization of computing devices and advances in wireless communication and sensor technology are some of the forces propelling computing from the stationary desktop to the mobile outdoors. Some important classes of new applications that will be enabled by this revolutionary development include intelligent traffic management, location-based services, tourist services, mobile electronic commerce, and digital battlefield. Some existing application classes that will benefit from the development include transportation and air traffic control, weather forecasting, emergency response, mobile resource management, and mobile workforce. Location management, i.e., the management of transient location information, is an enabling technology for all these applications. In this chapter, we present the applications of moving objects management and their functionalities, in particular the application of dynamic traffic navigation, which is challenging due to the highly variable traffic state and the requirement for fast, on-line computation.
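
    A minimal sketch of one standard location-management policy for moving objects is shown below: threshold-based dead reckoning, in which a vehicle reports its position only when it drifts too far from the position the server would predict. The names, units, and 100 m threshold are illustrative assumptions, not taken from this chapter.

        import math

        THRESHOLD_M = 100.0   # assumed reporting threshold

        def predicted_position(last_report, now):
            """Server-side prediction: last reported position plus velocity times elapsed time."""
            (x, y), (vx, vy), t0 = last_report
            dt = now - t0
            return (x + vx * dt, y + vy * dt)

        def needs_update(true_pos, last_report, now):
            """Report only when the true position deviates too far from the prediction."""
            px, py = predicted_position(last_report, now)
            deviation = math.hypot(true_pos[0] - px, true_pos[1] - py)
            return deviation > THRESHOLD_M

        # Reported at t=0 s at (0, 0), heading east at 15 m/s; by t=60 s the vehicle
        # has actually turned off the predicted road and sits at (700, 120).
        last = ((0.0, 0.0), (15.0, 0.0), 0.0)
        print(needs_update((700.0, 120.0), last, 60.0))   # True: deviation is about 233 m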

  10. Enabling campus grids with open science grid technology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Weitzel, Derek; Bockelman, Brian; Swanson, David

    2011-01-01

    The Open Science Grid is a recognized key component of the US national cyber-infrastructure enabling scientific discovery through advanced high throughput computing. The principles and techniques that underlie the Open Science Grid can also be applied to campus grids, since many of the requirements are the same even if the implementation technologies differ. We find five requirements for a campus grid: trust relationships, job submission, resource independence, accounting, and data management. The Holland Computing Center's campus grid at the University of Nebraska-Lincoln was designed to fulfill these requirements. A bridging daemon was designed to bring non-Condor clusters into a grid managed by Condor. Condor features that make it possible to bridge Condor sites into a multi-campus grid have been exploited at the Holland Computing Center as well.

  11. GeoChronos: An On-line Collaborative Platform for Earth Observation Scientists

    NASA Astrophysics Data System (ADS)

    Gamon, J. A.; Kiddle, C.; Curry, R.; Markatchev, N.; Zonta-Pastorello, G., Jr.; Rivard, B.; Sanchez-Azofeifa, G. A.; Simmonds, R.; Tan, T.

    2009-12-01

    Recent advances in cyberinfrastructure are offering new solutions to the growing challenges of managing and sharing large data volumes. Web 2.0 and social networking technologies provide the means for scientists to collaborate and share information more effectively. Cloud computing technologies can provide scientists with transparent and on-demand access to applications served over the Internet in a dynamic and scalable manner. Semantic Web technologies allow data to be linked together in a manner understandable by machines, enabling greater automation. Combining all of these technologies enables the creation of very powerful platforms. GeoChronos (http://geochronos.org/), part of a CANARIE Network Enabled Platforms project, is an online collaborative platform that incorporates these technologies to enable members of the earth observation science community to share data and scientific applications and to collaborate more effectively. The GeoChronos portal is built on an open source social networking platform called Elgg. Elgg provides a full set of social networking functionalities similar to Facebook, including blogs, tags, media/document sharing, wikis, friends/contacts, groups, discussions, message boards, calendars, status, activity feeds and more. An underlying cloud computing infrastructure enables scientists to access dynamically provisioned applications via the portal for visualizing and analyzing data. Users are able to access and run the applications from any computer that has a Web browser and Internet connectivity and do not need to manage and maintain the applications themselves. Semantic Web technologies, such as the Resource Description Framework (RDF), are being employed for relating and linking together spectral, satellite, meteorological and other data. Social networking functionality plays an integral part in facilitating the sharing of data and applications. Examples of recent GeoChronos users during the early testing phase have included the IAI International Wireless Sensor Networking Summer School at the University of Alberta and the IAI Tropi-Dry community. Current GeoChronos activities include the development of a web-based spectral library and related analytical and visualization tools, in collaboration with members of the SpecNet community. The GeoChronos portal will be open to all members of the earth observation science community when the project nears completion at the end of 2010.

  12. Cloud Computing as a Core Discipline in a Technology Entrepreneurship Program

    ERIC Educational Resources Information Center

    Lawler, James; Joseph, Anthony

    2012-01-01

    Education in entrepreneurship continues to be a developing area of curricula for computer science and information systems students. Entrepreneurship is enabled frequently by cloud computing methods that furnish benefits to especially medium and small-sized firms. Expanding upon an earlier foundation paper, the authors of this paper present an…

  13. Virtual Computing Laboratories: A Case Study with Comparisons to Physical Computing Laboratories

    ERIC Educational Resources Information Center

    Burd, Stephen D.; Seazzu, Alessandro F.; Conway, Christopher

    2009-01-01

    Current technology enables schools to provide remote or virtual computing labs that can be implemented in multiple ways ranging from remote access to banks of dedicated workstations to sophisticated access to large-scale servers hosting virtualized workstations. This paper reports on the implementation of a specific lab using remote access to…

  14. An Adaptive Testing System for Supporting Versatile Educational Assessment

    ERIC Educational Resources Information Center

    Huang, Yueh-Min; Lin, Yen-Ting; Cheng, Shu-Chen

    2009-01-01

    With the rapid growth of computer and mobile technology, it is a challenge to integrate computer based test (CBT) with mobile learning (m-learning) especially for formative assessment and self-assessment. In terms of self-assessment, computer adaptive test (CAT) is a proper way to enable students to evaluate themselves. In CAT, students are…

  15. Using Computer Technology to Create a Revolutionary New Style of Biology.

    ERIC Educational Resources Information Center

    Monaghan, Peter

    1993-01-01

    A $13-million gift of William Gates III to the University of Washington has enabled establishment of the country's first department in molecular biotechnology, a combination of medicine and molecular biology to be practiced by researchers versed in a variety of fields, including computer science, computation, applied physics, and engineering. (MSE)

  16. High performance computing and communications: Advancing the frontiers of information technology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    1997-12-31

    This report, which supplements the President's Fiscal Year 1997 Budget, describes the interagency High Performance Computing and Communications (HPCC) Program. The HPCC Program will celebrate its fifth anniversary in October 1996 with an impressive array of accomplishments to its credit. Over its five-year history, the HPCC Program has focused on developing high performance computing and communications technologies that can be applied to computation-intensive applications. Major highlights for FY 1996: (1) High performance computing systems enable practical solutions to complex problems with accuracies not possible five years ago; (2) HPCC-funded research in very large scale networking techniques has been instrumental in the evolution of the Internet, which continues exponential growth in size, speed, and availability of information; (3) The combination of hardware capability measured in gigaflop/s, networking technology measured in gigabit/s, and new computational science techniques for modeling phenomena has demonstrated that very large scale accurate scientific calculations can be executed across heterogeneous parallel processing systems located thousands of miles apart; (4) Federal investments in HPCC software R and D support researchers who pioneered the development of parallel languages and compilers, high performance mathematical, engineering, and scientific libraries, and software tools--technologies that allow scientists to use powerful parallel systems to focus on Federal agency mission applications; and (5) HPCC support for virtual environments has enabled the development of immersive technologies, where researchers can explore and manipulate multi-dimensional scientific and engineering problems. Educational programs fostered by the HPCC Program have brought into classrooms new science and engineering curricula designed to teach computational science. This document contains a small sample of the significant HPCC Program accomplishments in FY 1996.

  17. Integrated Engineering Information Technology, FY93 accomplishments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Harris, R.N.; Miller, D.K.; Neugebauer, G.L.

    1994-03-01

    The Integrated Engineering Information Technology (IEIT) project is providing a comprehensive, easy-to-use computer network solution for communicating with coworkers both inside and outside Sandia National Laboratories. IEIT capabilities include computer networking, electronic mail, mechanical design, and data management. These network-based tools have one fundamental purpose: to help create a concurrent engineering environment that will enable Sandia organizations to excel in today's increasingly competitive business environment.

  18. The efficacy of computer-enabled discharge communication interventions: a systematic review.

    PubMed

    Motamedi, Soror Mona; Posadas-Calleja, Juan; Straus, Sharon; Bates, David W; Lorenzetti, Diane L; Baylis, Barry; Gilmour, Janet; Kimpton, Shandra; Ghali, William A

    2011-05-01

    Traditional manual/dictated discharge summaries are inaccurate, inconsistent and untimely. Computer-enabled discharge communications may improve information transfer by providing a standardised document that immediately links acute and community healthcare providers. The objective was to conduct a systematic review evaluating the efficacy of computer-enabled discharge communication compared with traditional communication for patients discharged from acute care hospitals. Data sources were MEDLINE, EMBASE, the Cochrane CENTRAL Register of Controlled Trials and MEDLINE In-Process. Keywords from three themes were combined: discharge communication; electronic/online/web-based; and controlled interventional studies. Study types included: clinical trials, quasi-experimental studies with concurrent controls and controlled before-after studies. Interventions included: (1) automatic population of a discharge document by computer database(s); (2) transmission of discharge information via computer technology; or (3) computer technology providing a 'platform' for dynamic discharge communication. Controls included: no intervention or traditional manual/dictated discharge summaries. Primary outcomes included: mortality, readmission and adverse events/near misses. Secondary outcomes included: timeliness, accuracy, quality/completeness and physician/patient satisfaction. Descriptions of interventions and study outcomes were extracted by two independent reviewers. 12 unique studies were identified: eight randomised controlled trials and four quasi-experimental studies. Pooling/meta-analysis was not possible, given the heterogeneity of measures and outcomes reported. The primary outcomes of mortality and readmission were inconsistently reported. There was no significant difference in mortality, and one study reported reduced long-term readmission. Intervention groups experienced reductions in perceived medical errors/adverse events, and improvements in timeliness and physician/patient satisfaction. Computer-enabled discharge communications appear beneficial with respect to a number of important secondary outcomes. Primary outcomes of mortality and readmission are less commonly reported in this literature and require further study.

  19. PICSiP: new system-in-package technology using a high bandwidth photonic interconnection layer for converged microsystems

    NASA Astrophysics Data System (ADS)

    Tekin, Tolga; Töpper, Michael; Reichl, Herbert

    2009-05-01

    Technological frontiers between semiconductor technology, packaging, and system design are disappearing. Scaling down geometries [1] alone does not deliver improved performance, lower power, smaller size, and lower cost. It will require "More than Moore" [2] through the tighter integration of system level components at the package level. System-in-Package (SiP) will deliver the efficient use of three dimensions (3D) through innovation in packaging and interconnect technology. A key bottleneck to the implementation of high-performance microelectronic systems, including SiP, is the lack of low-latency, high-bandwidth, and high density off-chip interconnects. Some of the challenges in achieving high-bandwidth chip-to-chip communication using electrical interconnects include the high losses in the substrate dielectric, reflections and impedance discontinuities, and susceptibility to crosstalk [3]. The incentive to use photonics to overcome these challenges and provide low-latency, high-bandwidth communication will enable the vision of optical computing within next-generation architectures. Supercomputers of today offer sustained performance of more than petaflops, which can be increased by utilizing optical interconnects. Next-generation computing architectures are needed with ultra-low power consumption and ultra-high performance, enabled by novel interconnection technologies. In this paper we discuss a CMOS compatible underlying technology to enable next generation optical computing architectures. By introducing a new optical layer within the 3D SiP, the development of converged microsystems and their deployment in next generation optical computing architectures will be leveraged.

  20. Current status and future prospects for enabling chemistry technology in the drug discovery process.

    PubMed

    Djuric, Stevan W; Hutchins, Charles W; Talaty, Nari N

    2016-01-01

    This review covers recent advances in the implementation of enabling chemistry technologies into the drug discovery process. Areas covered include parallel synthesis chemistry, high-throughput experimentation, automated synthesis and purification methods, flow chemistry methodology including photochemistry, electrochemistry, and the handling of "dangerous" reagents. Also featured are advances in the "computer-assisted drug design" area and the expanding application of novel mass spectrometry-based techniques to a wide range of drug discovery activities.

  1. Computational Fluid Dynamics of Whole-Body Aircraft

    NASA Astrophysics Data System (ADS)

    Agarwal, Ramesh

    1999-01-01

    The current state of the art in computational aerodynamics for whole-body aircraft flowfield simulations is described. Recent advances in geometry modeling, surface and volume grid generation, and flow simulation algorithms have led to accurate flowfield predictions for increasingly complex and realistic configurations. As a result, computational aerodynamics has emerged as a crucial enabling technology for the design and development of flight vehicles. Examples illustrating the current capability for the prediction of transport and fighter aircraft flowfields are presented. Unfortunately, accurate modeling of turbulence remains a major difficulty in the analysis of viscosity-dominated flows. In the future, inverse design methods, multidisciplinary design optimization methods, artificial intelligence technology, and massively parallel computer technology will be incorporated into computational aerodynamics, opening up greater opportunities for improved product design at substantially reduced costs.

  2. Digital microscopy. Bringing new technology into focus.

    PubMed

    2010-06-01

    Digital microscopy enables the scanning of microscope slides so that they can be viewed, analyzed, and archived on a computer. While the technology is not yet widely accepted by pathologists, a switch to digital microscopy systems seems to be inevitable in the near future.

  3. Methods for transition toward computer assisted cognitive examination.

    PubMed

    Jurica, P; Valenzi, S; Struzik, Z R; Cichocki, A

    2015-01-01

    We present a software framework which enables the extension of current methods for the assessment of cognitive fitness using recent technological advances. Screening for cognitive impairment is becoming more important as the world's population grows older, and current methods could be enhanced by the use of computers. Introduction of new methods to clinics requires basic tools for the collection and communication of the collected data. Our aim is to develop tools that, with minimal interference, offer new opportunities for the enhancement of current interview-based cognitive examinations. We suggest methods, and discuss the process, by which established cognitive tests can be adapted for data collection through digitization by pen-enabled tablets. We discuss a number of methods for evaluation of the collected data, which promise to increase the resolution and objectivity of the common scoring strategy based on visual inspection. By involving computers in the roles of both instructing and scoring, we aim to increase the precision and reproducibility of cognitive examination. The tools provided in the Python framework CogExTools, available at http://bsp.brain.riken.jp/cogextools/, enable the design, application and evaluation of screening tests for the assessment of cognitive impairment. The toolbox is a research platform; it represents a foundation for further collaborative development by the wider research community and enthusiasts. It is free to download and use, and open-source. We introduce a set of open-source tools that facilitate the design and development of new cognitive tests for modern technology. We provide these tools in order to enable the adaptation of technology for cognitive examination in clinical settings. The tools provide the first step in a possible transition toward standardized mental state examination using computers.
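
    To illustrate the sort of objective measure such digitized tests make possible (this is a hypothetical example, not the CogExTools API), the sketch below computes simple kinematic features from pen samples recorded as (x, y, t) tuples:

        import math

        def stroke_features(samples):
            """Compute path length, duration, and mean speed from (x_mm, y_mm, t_s) samples."""
            length = 0.0
            duration = samples[-1][2] - samples[0][2]
            for (x0, y0, _), (x1, y1, _) in zip(samples, samples[1:]):
                length += math.hypot(x1 - x0, y1 - y0)
            mean_speed = length / duration if duration > 0 else 0.0
            return {"path_length_mm": length, "duration_s": duration, "mean_speed_mm_s": mean_speed}

        samples = [(0, 0, 0.00), (3, 4, 0.05), (6, 8, 0.10), (6, 13, 0.20)]
        print(stroke_features(samples))   # 15 mm of path over 0.2 s -> mean speed 75 mm/s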

  4. A Grid Infrastructure for Supporting Space-based Science Operations

    NASA Technical Reports Server (NTRS)

    Bradford, Robert N.; Redman, Sandra H.; McNair, Ann R. (Technical Monitor)

    2002-01-01

    Emerging technologies for computational grid infrastructures have the potential for revolutionizing the way computers are used in all aspects of our lives. Computational grids are currently being implemented to provide large-scale, dynamic, and secure research and engineering environments based on standards and next-generation reusable software, enabling greater science and engineering productivity through shared resources and distributed computing for less cost than traditional architectures. Combined with the emerging technologies of high-performance networks, grids provide researchers, scientists and engineers the first real opportunity for an effective distributed collaborative environment with access to resources such as computational and storage systems, instruments, and software tools and services for the most computationally challenging applications.

  5. Grid computing in large pharmaceutical molecular modeling.

    PubMed

    Claus, Brian L; Johnson, Stephen R

    2008-07-01

    Most major pharmaceutical companies have employed grid computing to expand their compute resources with the intention of minimizing additional financial expenditure. Historically, one of the issues restricting widespread utilization of the grid resources in molecular modeling is the limited set of suitable applications amenable to coarse-grained parallelization. Recent advances in grid infrastructure technology coupled with advances in application research and redesign will enable fine-grained parallel problems, such as quantum mechanics and molecular dynamics, which were previously inaccessible to the grid environment. This will enable new science as well as increase resource flexibility to load balance and schedule existing workloads.

  6. NASA Center for Climate Simulation (NCCS) Advanced Technology AT5 Virtualized Infiniband Report

    NASA Technical Reports Server (NTRS)

    Thompson, John H.; Bledsoe, Benjamin C.; Wagner, Mark; Shakshober, John; Fromkin, Russ

    2013-01-01

    The NCCS is part of the Computational and Information Sciences and Technology Office (CISTO) of Goddard Space Flight Center's (GSFC) Sciences and Exploration Directorate. The NCCS's mission is to enable scientists to increase their understanding of the Earth, the solar system, and the universe by supplying state-of-the-art high performance computing (HPC) solutions. To accomplish this mission, the NCCS (https://www.nccs.nasa.gov) provides high performance compute engines, mass storage, and network solutions to meet the specialized needs of the Earth and space science user communities

  7. Using Free Computational Resources to Illustrate the Drug Design Process in an Undergraduate Medicinal Chemistry Course

    ERIC Educational Resources Information Center

    Rodrigues, Ricardo P.; Andrade, Saulo F.; Mantoani, Susimaire P.; Eifler-Lima, Vera L.; Silva, Vinicius B.; Kawano, Daniel F.

    2015-01-01

    Advances in, and dissemination of, computer technologies in the field of drug research now enable the use of molecular modeling tools to teach important concepts of drug design to chemistry and pharmacy students. A series of computer laboratories is described to introduce undergraduate students to commonly adopted "in silico" drug design…

  8. End User Computing at a South African Technikon: Enabling Disadvantaged Students To Meet Employers' Requirements.

    ERIC Educational Resources Information Center

    Marsh, Cecille

    A two-phase study examined the skills required of competent end-users of computers in the workplace and assessed the computing awareness and technological environment of first-year students entering historically disadvantaged technikons in South Africa. First, a DACUM (Developing a Curriculum) panel of nine representatives of local business and…

  9. Optimized microsystems-enabled photovoltaics

    DOEpatents

    Cruz-Campa, Jose Luis; Nielson, Gregory N.; Young, Ralph W.; Resnick, Paul J.; Okandan, Murat; Gupta, Vipin P.

    2015-09-22

    Technologies pertaining to designing microsystems-enabled photovoltaic (MEPV) cells are described herein. A first restriction for a first parameter of an MEPV cell is received. Subsequently, a selection of a second parameter of the MEPV cell is received. Values for a plurality of parameters of the MEPV cell are computed such that the MEPV cell is optimized with respect to the second parameter, wherein the values for the plurality of parameters are computed based at least in part upon the restriction for the first parameter.
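
    The general pattern described in the abstract, receiving a restriction on one parameter and computing the remaining parameters so that a second quantity is optimized, can be sketched as a constrained numerical optimization. The surrogate "efficiency" model below is a made-up placeholder rather than a physical MEPV model; only the optimization pattern is illustrated.

        from scipy.optimize import minimize

        MAX_THICKNESS_UM = 20.0   # the received restriction on the first parameter (assumed value)

        def negative_efficiency(params):
            thickness_um, finger_spacing_um = params
            # Toy surrogate: efficiency rises with thickness but saturates, and is
            # penalized when contact fingers are spaced far from an assumed optimum.
            return -(0.2 * thickness_um / (1.0 + 0.05 * thickness_um)
                     - 0.001 * (finger_spacing_um - 50.0) ** 2)

        result = minimize(
            negative_efficiency,
            x0=[10.0, 40.0],
            bounds=[(1.0, MAX_THICKNESS_UM), (10.0, 200.0)],
        )
        print(result.x)   # thickness driven to the 20 um cap, spacing settling near 50 um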

  10. Engaging Cyber Communities

    DTIC Science & Technology

    2010-04-01

    technology-centric operations such as computer network attack and computer network defense. This leads to the question of whether the US military is... information and infrastructure. For the purpose of military operations, CNO are divided into CNA, CND, and computer network exploitation (CNE) enabling...of a CNA if they take undesirable action," and from a defensive stance in CND, "providing information about non-military threat to computers in

  11. A new milling machine for computer-aided, in-office restorations.

    PubMed

    Kurbad, Andreas

    Chairside computer-aided design/computer-aided manufacturing (CAD/CAM) technology requires an effective technical basis to obtain dental restorations with optimal marginal accuracy, esthetics, and longevity in as short a timeframe as possible. This article describes a compact, 5-axis milling machine based on an innovative milling technology (5XT - five-axis turn-milling technique), which is capable of achieving high-precision milling results within a very short processing time. Furthermore, the device's compact dimensioning and state-of-the-art mode of operation facilitate its use in the dental office. This model is also an option to be considered for use in smaller dental laboratories, especially as the open input format enables it to be quickly and simply integrated into digital processing systems already in use. The possibility of using ceramic and polymer materials with varying properties enables the manufacture of restorations covering all conceivable indications in the field of fixed dental prosthetics.

  12. Fundamental energy limits of SET-based Brownian NAND and half-adder circuits. Preliminary findings from a physical-information-theoretic methodology

    NASA Astrophysics Data System (ADS)

    Ercan, İlke; Suyabatmaz, Enes

    2018-06-01

    The saturation in the efficiency and performance scaling of conventional electronic technologies is driving the development of novel computational paradigms. Brownian circuits are among the promising alternatives that can exploit fluctuations to increase the efficiency of information processing in nanocomputing. A Brownian cellular automaton, where signals propagate randomly and are driven by local transition rules, can be made computationally universal by embedding arbitrary asynchronous circuits on it. One of the potential realizations of such circuits is via single electron tunneling (SET) devices, since SET technology enables simulation of noise and fluctuations in a fashion similar to Brownian search. In this paper, we perform a physical-information-theoretic analysis of the efficiency limitations of Brownian NAND and half-adder circuits implemented using SET technology. The method we employ here establishes a solid ground for studying the computational and physical features of this emerging technology on an equal footing, and yields fundamental lower bounds that provide valuable insight into how far its efficiency can be improved in principle. In order to provide a basis for comparison, we also analyze a NAND gate and half-adder circuit implemented in complementary metal oxide semiconductor technology to show how the fundamental bound of the Brownian circuit compares against a conventional paradigm.
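
    For context, the textbook reference point for such analyses is the Landauer limit, k_B T ln 2, the minimum energy dissipated per irreversible bit erasure; the circuit-specific bounds in the paper come from its own physical-information-theoretic methodology and are not this single number. A minimal computation of the limit at an assumed room temperature:

        import math

        K_B = 1.380649e-23   # Boltzmann constant, J/K
        T = 300.0            # assumed operating temperature, K

        landauer_j = K_B * T * math.log(2)
        landauer_mev = landauer_j / 1.602176634e-19 * 1000
        print(f"{landauer_j:.3e} J ({landauer_mev:.1f} meV) per bit erased")   # ~2.87e-21 J, ~17.9 meV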

  13. Developing Intranets: Practical Issues for Implementation and Design.

    ERIC Educational Resources Information Center

    Trowbridge, Dave

    1996-01-01

    An intranet is a system which has "domesticated" the technologies of the Internet for specific organizational settings and goals. Although the adaptability of Hypertext Markup Language to intranets is sometimes limited, implementing various protocols and technologies enables organizations to share files among heterogeneous computers,…

  14. Gender Differences in Computer- and Instrumental-Based Musical Composition

    ERIC Educational Resources Information Center

    Shibazaki, Kagari; Marshall, Nigel A.

    2013-01-01

    Background: Previous studies have argued that technology can be a major support to the music teacher enabling, amongst other things, increased student motivation, higher levels of confidence and more individualised learning to take place [Bolton, J. 2008. "Technologically mediated composition learning: Josh's story." "British…

  15. Enabling Smart Manufacturing Research and Development using a Product Lifecycle Test Bed.

    PubMed

    Helu, Moneer; Hedberg, Thomas

    2015-01-01

    Smart manufacturing technologies require a cyber-physical infrastructure to collect and analyze data and information across the manufacturing enterprise. This paper describes a concept for a product lifecycle test bed built on a cyber-physical infrastructure that enables smart manufacturing research and development. The test bed consists of a Computer-Aided Technologies (CAx) Lab and a Manufacturing Lab that interface through the product model creating a "digital thread" of information across the product lifecycle. The proposed structure and architecture of the test bed is presented, which highlights the challenges and requirements of implementing a cyber-physical infrastructure for manufacturing. The novel integration of systems across the product lifecycle also helps identify the technologies and standards needed to enable interoperability between design, fabrication, and inspection. Potential research opportunities enabled by the test bed are also discussed, such as providing publicly accessible CAx and manufacturing reference data, virtual factory data, and a representative industrial environment for creating, prototyping, and validating smart manufacturing technologies.

  16. Enabling Smart Manufacturing Research and Development using a Product Lifecycle Test Bed

    PubMed Central

    Helu, Moneer; Hedberg, Thomas

    2017-01-01

    Smart manufacturing technologies require a cyber-physical infrastructure to collect and analyze data and information across the manufacturing enterprise. This paper describes a concept for a product lifecycle test bed built on a cyber-physical infrastructure that enables smart manufacturing research and development. The test bed consists of a Computer-Aided Technologies (CAx) Lab and a Manufacturing Lab that interface through the product model creating a “digital thread” of information across the product lifecycle. The proposed structure and architecture of the test bed is presented, which highlights the challenges and requirements of implementing a cyber-physical infrastructure for manufacturing. The novel integration of systems across the product lifecycle also helps identify the technologies and standards needed to enable interoperability between design, fabrication, and inspection. Potential research opportunities enabled by the test bed are also discussed, such as providing publicly accessible CAx and manufacturing reference data, virtual factory data, and a representative industrial environment for creating, prototyping, and validating smart manufacturing technologies. PMID:28664167

  17. Data management and analysis for the Earth System Grid

    NASA Astrophysics Data System (ADS)

    Williams, D. N.; Ananthakrishnan, R.; Bernholdt, D. E.; Bharathi, S.; Brown, D.; Chen, M.; Chervenak, A. L.; Cinquini, L.; Drach, R.; Foster, I. T.; Fox, P.; Hankin, S.; Henson, V. E.; Jones, P.; Middleton, D. E.; Schwidder, J.; Schweitzer, R.; Schuler, R.; Shoshani, A.; Siebenlist, F.; Sim, A.; Strand, W. G.; Wilhelmi, N.; Su, M.

    2008-07-01

    The international climate community is expected to generate hundreds of petabytes of simulation data within the next five to seven years. This data must be accessed and analyzed by thousands of analysts worldwide in order to provide accurate and timely estimates of the likely impact of climate change on physical, biological, and human systems. Climate change is thus not only a scientific challenge of the first order but also a major technological challenge. In order to address this technological challenge, the Earth System Grid Center for Enabling Technologies (ESG-CET) has been established within the U.S. Department of Energy's Scientific Discovery through Advanced Computing (SciDAC)-2 program, with support from the offices of Advanced Scientific Computing Research and Biological and Environmental Research. ESG-CET's mission is to provide climate researchers worldwide with access to the data, information, models, analysis tools, and computational capabilities required to make sense of enormous climate simulation datasets. Its specific goals are to (1) make data more useful to climate researchers by developing Grid technology that enhances data usability; (2) meet specific distributed database, data access, and data movement needs of national and international climate projects; (3) provide a universal and secure web-based data access portal for broad multi-model data collections; and (4) provide a wide-range of Grid-enabled climate data analysis tools and diagnostic methods to international climate centers and U.S. government agencies. Building on the successes of the previous Earth System Grid (ESG) project, which has enabled thousands of researchers to access tens of terabytes of data from a small number of ESG sites, ESG-CET is working to integrate a far larger number of distributed data providers, high-bandwidth wide-area networks, and remote computers in a highly collaborative problem-solving environment.

  18. A self-learning camera for the validation of highly variable and pseudorandom patterns

    NASA Astrophysics Data System (ADS)

    Kelley, Michael

    2004-05-01

    Reliable and productive manufacturing operations have depended on people to quickly detect and solve problems whenever they appear. Over the last 20 years, more and more manufacturing operations have embraced machine vision systems to increase productivity, reliability and cost-effectiveness, including reducing the number of human operators required. Although machine vision technology has long been capable of solving simple problems, it has still not been broadly implemented. The reason is that until now, no machine vision system has been designed to meet the unique demands of complicated pattern recognition. The ZiCAM family was specifically developed to be the first practical hardware to meet these needs. To be able to address non-traditional applications, the machine vision industry must include smart camera technology that meets its users' demands for lower costs, better performance and the ability to address applications of irregular lighting, patterns and color. The next-generation smart cameras will need to evolve as a fundamentally different kind of sensor, with new technology that behaves like a human but performs like a computer. Neural network based systems, coupled with self-taught, n-space, non-linear modeling, promise to be the enabler of the next generation of machine vision equipment. Image processing technology is now available that enables a system to match an operator's subjectivity. A Zero-Instruction-Set-Computer (ZISC) powered smart camera allows high-speed fuzzy-logic processing without the need for computer programming. This can address applications of validating highly variable and pseudo-random patterns. A hardware-based implementation of a neural network, the Zero-Instruction-Set-Computer, enables a vision system to "think" and "inspect" like a human, with the speed and reliability of a machine.
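
    The sketch below illustrates the restricted-Coulomb-energy/RBF style of recognition that ZISC-type neural hardware is generally described as implementing: each stored prototype carries an influence radius, and an input is classified only if it falls inside some prototype's field, otherwise it is flagged as unknown. The prototypes, radii, and labels are invented for illustration.

        import math

        prototypes = [
            # (feature vector, influence radius, label) -- all values hypothetical
            ([0.9, 0.1, 0.2], 0.35, "good_part"),
            ([0.2, 0.8, 0.7], 0.30, "defect_A"),
        ]

        def classify(features):
            best = None
            for center, radius, label in prototypes:
                dist = math.dist(features, center)
                if dist <= radius and (best is None or dist < best[0]):
                    best = (dist, label)
            return best[1] if best else "unknown"   # "unknown" can be routed to a human operator

        print(classify([0.85, 0.15, 0.25]))   # good_part
        print(classify([0.5, 0.5, 0.5]))      # unknown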

  19. Embedded Web Technology: Internet Technology Applied to Real-Time System Control

    NASA Technical Reports Server (NTRS)

    Daniele, Carl J.

    1998-01-01

    The NASA Lewis Research Center is developing software tools to bridge the gap between the traditionally non-real-time Internet technology and the real-time, embedded-controls environment for space applications. Internet technology has been expanding at a phenomenal rate. The simple World Wide Web browsers (such as earlier versions of Netscape, Mosaic, and Internet Explorer) that resided on personal computers just a few years ago only enabled users to log into and view a remote computer site. With current browsers, users not only view but also interact with remote sites. In addition, the technology now supports numerous computer platforms (PC's, MAC's, and Unix platforms), thereby providing platform independence.In contrast, the development of software to interact with a microprocessor (embedded controller) that is used to monitor and control a space experiment has generally been a unique development effort. For each experiment, a specific graphical user interface (GUI) has been developed. This procedure works well for a single-user environment. However, the interface for the International Space Station (ISS) Fluids and Combustion Facility will have to enable scientists throughout the world and astronauts onboard the ISS, using different computer platforms, to interact with their experiments in the Fluids and Combustion Facility. Developing a specific GUI for all these users would be cost prohibitive. An innovative solution to this requirement, developed at Lewis, is to use Internet technology, where the general problem of platform independence has already been partially solved, and to leverage this expanding technology as new products are developed. This approach led to the development of the Embedded Web Technology (EWT) program at Lewis, which has the potential to significantly reduce software development costs for both flight and ground software.

  20. Current status and future prospects for enabling chemistry technology in the drug discovery process

    PubMed Central

    Djuric, Stevan W.; Hutchins, Charles W.; Talaty, Nari N.

    2016-01-01

    This review covers recent advances in the implementation of enabling chemistry technologies into the drug discovery process. Areas covered include parallel synthesis chemistry, high-throughput experimentation, automated synthesis and purification methods, flow chemistry methodology including photochemistry, electrochemistry, and the handling of “dangerous” reagents. Also featured are advances in the “computer-assisted drug design” area and the expanding application of novel mass spectrometry-based techniques to a wide range of drug discovery activities. PMID:27781094

  1. A Quantitative Examination of User Experience as an Antecedent to Student Perception in Technology Acceptance Modeling

    ERIC Educational Resources Information Center

    Butler, Rory

    2013-01-01

    Internet-enabled mobile devices have increased the accessibility of learning content for students. Given the ubiquitous nature of mobile computing technology, a thorough understanding of the acceptance factors that impact a learner's intention to use mobile technology as an augment to their studies is warranted. Student acceptance of mobile…

  2. Faculty Attitude towards Integrating Technology in Teaching at a Four-Year Southeastern University

    ERIC Educational Resources Information Center

    Palmore, Donna Venetta

    2011-01-01

    Studies have shown that computer technology has brought about a noticeable change in the manner in which education is delivered to students. Further research suggests that the use of technology enables educators to effectively communicate with their students in an interactive learning environment designed to meet their individual needs. Moreover,…

  3. BeeSim: Leveraging Wearable Computers in Participatory Simulations with Young Children

    ERIC Educational Resources Information Center

    Peppler, Kylie; Danish, Joshua; Zaitlen, Benjamin; Glosson, Diane; Jacobs, Alexander; Phelps, David

    2010-01-01

    New technologies have enabled students to become active participants in computational simulations of dynamic and complex systems (called Participatory Simulations), providing a "first-person" perspective on complex systems. However, most existing Participatory Simulations have targeted older children, teens, and adults assuming that such concepts…

  4. Experiences of Student Mathematics-Teachers in Computer-Based Mathematics Learning Environment

    ERIC Educational Resources Information Center

    Karatas, Ilhan

    2011-01-01

    Computer technology in mathematics education enabled the students find many opportunities for investigating mathematical relationships, hypothesizing, and making generalizations. These opportunities were provided to pre-service teachers through a faculty course. At the end of the course, the teachers were assigned project tasks involving…

  5. Mental models, metaphors and their use in the education of nurses.

    PubMed

    Burke, L M; Wilson, A M

    1997-11-01

    A great deal of nurses' confidence in the use of information technology (IT) depends both on the way computers are introduced to students in the college and how such education is continued and applied when they are practitioners. It is therefore vital that teachers of IT assist nurses to discover ways of learning to utilize and apply computers within their workplace with whatever methods are available. One method which has been introduced with success in other fields is the use of mental models and metaphors. Mental models and metaphors enable individuals to learn by building on past learning. Concepts and ideas which have already been internalized from past experience can be transferred and adapted for usage in a new learning situation with computers and technology. This article explores the use of mental models and metaphors for the technological education of nurses. The concepts themselves will be examined, followed by suggestions for possible applications specifically in the field of nursing and health care. Finally the role of the teacher in enabling improved learning as a result of these techniques will be addressed.

  6. The Montage architecture for grid-enabled science processing of large, distributed datasets

    NASA Technical Reports Server (NTRS)

    Jacob, Joseph C.; Katz, Daniel S.; Prince, Thomas; Berriman, Bruce G.; Good, John C.; Laity, Anastasia C.; Deelman, Ewa; Singh, Gurmeet; Su, Mei-Hui

    2004-01-01

    Montage is an Earth Science Technology Office (ESTO) Computational Technologies (CT) Round III Grand Challenge investigation to deploy a portable, compute-intensive, custom astronomical image mosaicking service for the National Virtual Observatory (NVO). Although Montage is developing a compute- and data-intensive service for the astronomy community, we are also helping to address a problem that spans both Earth and Space science, namely how to efficiently access and process multi-terabyte, distributed datasets. In both communities, the datasets are massive, and are stored in distributed archives that are, in most cases, remote from the available computational resources. Therefore, state-of-the-art computational grid technologies are a key element of the Montage portal architecture. This paper describes the aspects of the Montage design that are applicable to both the Earth and Space science communities.

  7. Mobile Technology in Educational Services

    ERIC Educational Resources Information Center

    Chen, Jueming; Kinshuk

    2005-01-01

    The use of computers and the Internet has successfully enabled educational institutions to provide their students and staff members with various online educational services. With the recent developments in mobile technology, further possibilities are emerging to provide such services through mobile devices such as mobile phones and PDAs. By…

  8. The Role of Technology in Supporting Learning Communities.

    ERIC Educational Resources Information Center

    Riel, Margaret; Fulton, Kathleen

    2001-01-01

    In a learning community, students learn to cooperate and make teams work. Past technologies (print, photography, film, and computers) have enabled idea sharing, but are one-way communication modes. Broader learning communities have been made possible through electronic field trips, online mentoring, science investigations, and humanities…

  9. Understanding Teachers' Routines to Inform Classroom Technology Design

    ERIC Educational Resources Information Center

    An, Pengcheng; Bakker, Saskia; Eggen, Berry

    2017-01-01

    Secondary school teachers have quite busy and complex routines in their classrooms. However, present classroom technologies usually require focused attention from teachers while being interacted with, which restricts their use in teachers' daily routines. Peripheral interaction is a human-computer interaction style that aims to enable interaction…

  10. DARPA TRADES Annual Report.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Aguilo Valentin, Miguel Alejandro; Trujillo, Susie

    During calendar year 2017, Sandia National Laboratories (SNL) made strides towards developing an open, portable design platform rich in high-performance computing (HPC)-enabled modeling, analysis and synthesis tools. The main focus was to lay the foundations of the core interfaces that will enable plug-n-play insertion of synthesis optimization technologies in the areas of modeling, analysis and synthesis.

  11. Enabling the Differently-Abled

    ERIC Educational Resources Information Center

    Pal, Sonali

    2009-01-01

    It is perhaps unfortunate that enabling technologies do not come with an "ability warning", as they generally require the user to already have acquired a certain level of IT skills, in a similar way that online courses require users to have a certain level of prior IT knowledge. Accessing a computer and making the most of e-learning…

  12. Evaluating Musical Dis/abilities: Operationalizing the Capability Approach

    ERIC Educational Resources Information Center

    Watts, Michael; Ridley, Barbara

    2007-01-01

    We use this paper to suggest the use of Sen's capability approach in interpreting disability. The substantive focus is our evaluation of the Drake Music Project, which uses electronic and computer technologies to enable severely disabled people to explore, compose and perform music. We consider how the process of making music enables the musicians…

  13. Networking Technologies Enable Advances in Earth Science

    NASA Technical Reports Server (NTRS)

    Johnson, Marjory; Freeman, Kenneth; Gilstrap, Raymond; Beck, Richard

    2004-01-01

    This paper describes an experiment to prototype a new way of conducting science by applying networking and distributed computing technologies to an Earth Science application. A combination of satellite, wireless, and terrestrial networking provided geologists at a remote field site with interactive access to supercomputer facilities at two NASA centers, thus enabling them to validate and calibrate remotely sensed geological data in near-real time. This represents a fundamental shift in the way that Earth scientists analyze remotely sensed data. In this paper we describe the experiment and the network infrastructure that enabled it, analyze the data flow during the experiment, and discuss the scientific impact of the results.

  14. Advanced Exploration Technologies: Micro and Nano Technologies Enabling Space Missions in the 21st Century

    NASA Technical Reports Server (NTRS)

    Krabach, Timothy

    1998-01-01

    Some of the many new and advanced exploration technologies which will enable space missions in the 21st century and specifically the Manned Mars Mission are explored in this presentation. Some of these are the system on a chip, the Computed-Tomography imaging Spectrometer, the digital camera on a chip, and other Micro Electro Mechanical Systems (MEMS) technology for space. Some of these MEMS are the silicon micromachined microgyroscope, a subliming solid micro-thruster, a micro-ion thruster, a silicon seismometer, a dewpoint microhygrometer, a micro laser doppler anemometer, and tunable diode laser (TDL) sensors. The advanced technology insertion is critical for NASA to decrease mass, volume, power and mission costs, and increase functionality, science potential and robustness.

  15. WDM package enabling high-bandwidth optical intrasystem interconnects for high-performance computer systems

    NASA Astrophysics Data System (ADS)

    Schrage, J.; Soenmez, Y.; Happel, T.; Gubler, U.; Lukowicz, P.; Mrozynski, G.

    2006-02-01

    The trend is to apply optical interconnection technology at ever shorter distances, moving from long-haul, metro-access, and intersystem links toward intrasystem links. Intrasystem interconnects such as data buses between microprocessors and memory blocks are still based on copper today, which creates a bottleneck in computer systems because the achievable bandwidth of electrical interconnects is limited by their underlying physical properties. Approaches that address this problem by embedding optical multimode polymer waveguides into the board (electro-optical circuit board technology, EOCB) have been reported earlier, and the feasibility of optical interconnection technology in chip-to-chip applications has been validated in a number of projects. For cost reasons, waveguides with large cross sections are used in order to relax alignment requirements and to allow automatic placement and assembly without active alignment of components. On the other hand, the bandwidth of these highly multimodal waveguides is restricted by mode dispersion. Extending WDM technology to intrasystem applications will provide the high bandwidth required by future high-performance computer systems: with, for example, 8 wavelength channels of 12 Gbps (SDR) each, optical on-board interconnects can achieve data rates an order of magnitude higher than electrical interconnects over the distances typical of today's computer boards and backplanes, and twice that again if DDR signaling is applied to the optical channels as well. In this paper we discuss an approach for a hybrid integrated optoelectronic WDM package that could enable the application of WDM technology to EOCB.
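
    To make the bandwidth arithmetic in the abstract explicit, the short sketch below works through the quoted example figures (8 channels at 12 Gbps each). It only illustrates the multiplication; it says nothing about the EOCB package itself.

```python
# Aggregate on-board bandwidth for the example WDM link quoted in the abstract:
# 8 wavelength channels at 12 Gbps each, with and without double-data-rate signaling.
channels = 8
rate_per_channel_gbps = 12

sdr_aggregate = channels * rate_per_channel_gbps   # 8 * 12 = 96 Gbps
ddr_aggregate = 2 * sdr_aggregate                   # 192 Gbps with DDR signaling

print(f"SDR aggregate: {sdr_aggregate} Gbps")
print(f"DDR aggregate: {ddr_aggregate} Gbps")
```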

  16. White paper on science operations

    NASA Technical Reports Server (NTRS)

    Schreier, Ethan J.

    1991-01-01

    Major changes are taking place in the way astronomy gets done. There are continuing advances in observational capabilities across the frequency spectrum, involving both ground-based and space-based facilities. There is also very rapid evolution of relevant computing and data management technologies. However, although the new technologies are filtering in to the astronomy community, and astronomers are looking at their computing needs in new ways, there is little coordination or coherent policy. Furthermore, although there is great awareness of the evolving technologies in the arena of operations, much of the existing operations infrastructure is ill-suited to take advantage of them. Astronomy, especially space astronomy, has often been at the cutting edge of computer use in data reduction and image analysis, but has been somewhat removed from advanced applications in operations, which have tended to be implemented by industry rather than by the end-user scientists. The purpose of this paper is threefold. First, we briefly review the background and general status of astronomy-related computing. Second, we make recommendations in three areas: data analysis; operations (directed primarily to NASA-related activities); and issues of management and policy, believing that these must be addressed to enable technological progress and to proceed through the next decade. Finally, we recommend specific NASA-related work as part of the Astrotech-21 plans, to enable better science operations in the operations of the Great Observatories and in the lunar outpost era.

  17. Process for selecting NEAMS applications for access to Idaho National Laboratory high performance computing resources

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Michael Pernice

    2010-09-01

    INL has agreed to provide participants in the Nuclear Energy Advanced Modeling and Simulation (NEAMS) program with access to its high performance computing (HPC) resources under sponsorship of the Enabling Computational Technologies (ECT) program element. This report documents the process used to select applications and the software stack in place at INL.

  18. Enhancing Student Engagement: A Group Case Study Approach

    ERIC Educational Resources Information Center

    Taneja, Aakash

    2014-01-01

    Computing professionals work in groups and collaborate with individuals having diverse backgrounds and behaviors. The Accreditation Board for Engineering and Technology (ABET) characterizes that a computing program must enable students to attain the ability to analyze a problem, design and evaluate a solution, and work effectively on teams to…

  19. Undergraduate College Students, Laptop Computers, and Lifelong Learning

    ERIC Educational Resources Information Center

    Tan, Chong Leng; Morris, John S.

    2006-01-01

    Many universities and colleges list the development of lifelong learning skills as a curriculum objective and have adopted laptop programs that may enable lifelong learning. The purpose of this research is to address the effectiveness of a technology-based and computer-mediated learning environment in achieving lifelong learning skills from the…

  20. Visualising "Junk" DNA through Bioinformatics

    ERIC Educational Resources Information Center

    Elwess, Nancy L.; Latourelle, Sandra M.; Cauthorn, Olivia

    2005-01-01

    One of the hottest areas of science today is the field in which biology, information technology,and computer science are merged into a single discipline called bioinformatics. This field enables the discovery and analysis of biological data, including nucleotide and amino acid sequences that are easily accessed through the use of computers. As…

  1. Comparative Analysis of Palm and Wearable Computers for Participatory Simulations

    ERIC Educational Resources Information Center

    Klopfer, Eric; Yoon, Susan; Rivas, Luz

    2004-01-01

    Recent educational computer-based technologies have offered promising lines of research that promote social constructivist learning goals, develop skills required to operate in a knowledge-based economy (Roschelle et al. 2000), and enable more authentic science-like problem-solving. In our research programme, we have been interested in combining…

  2. Multi-petascale highly efficient parallel supercomputer

    DOEpatents

    Asaad, Sameh; Bellofatto, Ralph E.; Blocksome, Michael A.; Blumrich, Matthias A.; Boyle, Peter; Brunheroto, Jose R.; Chen, Dong; Cher, Chen -Yong; Chiu, George L.; Christ, Norman; Coteus, Paul W.; Davis, Kristan D.; Dozsa, Gabor J.; Eichenberger, Alexandre E.; Eisley, Noel A.; Ellavsky, Matthew R.; Evans, Kahn C.; Fleischer, Bruce M.; Fox, Thomas W.; Gara, Alan; Giampapa, Mark E.; Gooding, Thomas M.; Gschwind, Michael K.; Gunnels, John A.; Hall, Shawn A.; Haring, Rudolf A.; Heidelberger, Philip; Inglett, Todd A.; Knudson, Brant L.; Kopcsay, Gerard V.; Kumar, Sameer; Mamidala, Amith R.; Marcella, James A.; Megerian, Mark G.; Miller, Douglas R.; Miller, Samuel J.; Muff, Adam J.; Mundy, Michael B.; O'Brien, John K.; O'Brien, Kathryn M.; Ohmacht, Martin; Parker, Jeffrey J.; Poole, Ruth J.; Ratterman, Joseph D.; Salapura, Valentina; Satterfield, David L.; Senger, Robert M.; Smith, Brian; Steinmacher-Burow, Burkhard; Stockdell, William M.; Stunkel, Craig B.; Sugavanam, Krishnan; Sugawara, Yutaka; Takken, Todd E.; Trager, Barry M.; Van Oosten, James L.; Wait, Charles D.; Walkup, Robert E.; Watson, Alfred T.; Wisniewski, Robert W.; Wu, Peng

    2015-07-14

    A multi-petascale, highly efficient parallel supercomputer delivering 100 petaOPS-scale computing at decreased cost, power, and footprint, while allowing maximum packaging density of processing nodes from an interconnect point of view. The supercomputer exploits technological advances in VLSI that enable a computing model in which many processors are integrated into a single Application Specific Integrated Circuit (ASIC). Each computing node comprises a system-on-chip ASIC utilizing four or more processors integrated into one die, each with full access to all system resources. This enables adaptive partitioning of the processors to functions such as compute or messaging I/O on an application-by-application basis and, preferably, adaptive partitioning of functions across the various algorithmic phases within an application; if I/O or other processors are underutilized, they can participate in computation or communication. Nodes are interconnected by a five-dimensional torus network with DMA that maximizes the throughput of packet communications between nodes and minimizes latency.

  3. Rapid Technology Assessment via Unified Deployment of Global Optical and Virtual Diagnostics

    NASA Technical Reports Server (NTRS)

    Jordan, Jeffrey D.; Watkins, A. Neal; Fleming, Gary A.; Leighty, Bradley D.; Schwartz, Richard J.; Ingram, JoAnne L.; Grinstead, Keith D., Jr.; Oglesby, Donald M.; Tyler, Charles

    2003-01-01

    This paper discusses recent developments in rapid technology assessment resulting from an active collaboration between researchers at the Air Force Research Laboratory (AFRL) at Wright Patterson Air Force Base (WPAFB) and the NASA Langley Research Center (LaRC). This program targets the unified development and deployment of global measurement technologies coupled with a virtual diagnostic interface to enable the comparative evaluation of experimental and computational results. Continuing efforts focus on the development of seamless data translation methods to enable integration of data sets of disparate file format in a common platform. Results from a successful low-speed wind tunnel test at WPAFB in which global surface pressure distributions were acquired simultaneously with model deformation and geometry measurements are discussed and comparatively evaluated with numerical simulations. Intensity- and lifetime-based pressure-sensitive paint (PSP) and projection moire interferometry (PMI) results are presented within the context of rapid technology assessment to enable simulation-based R&D.

  4. A Level Playing Field

    ERIC Educational Resources Information Center

    Harac, Lani

    2004-01-01

    In this article, the author features the Universal Design for Learning, a computer-assisted methodology that has enabled special-needs kids in the Boston area to stay in regular classrooms. Developed by a nonprofit group called the Center for Applied Special Technology, the UDL approach--in which students use whatever print or technological tools…

  5. Adaptive Social Learning Based on Crowdsourcing

    ERIC Educational Resources Information Center

    Karataev, Evgeny; Zadorozhny, Vladimir

    2017-01-01

    Many techniques have been developed to enhance learning experience with computer technology. A particularly great influence of technology on learning came with the emergence of the web and adaptive educational hypermedia systems. While the web enables users to interact and collaborate with each other to create, organize, and share knowledge via…

  6. Chi-Square Statistics, Tests of Hypothesis and Technology.

    ERIC Educational Resources Information Center

    Rochowicz, John A.

    The use of technology such as computers and programmable calculators enables students to find p-values and conduct tests of hypotheses in many different ways. Comprehension and interpretation of a research problem become the focus for statistical analysis. This paper describes how to calculate chi-square statistics and p-values for statistical…
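
    As a concrete illustration of the kind of calculation the paper describes (the paper itself works with spreadsheets and programmable calculators, so the SciPy call and the toy counts below are illustrative assumptions, not material from the paper):

```python
# Chi-square goodness-of-fit statistic and p-value for hypothetical category counts.
from scipy import stats

observed = [18, 22, 20, 40]   # made-up observed frequencies
expected = [25, 25, 25, 25]   # frequencies expected under the null hypothesis

chi2, p_value = stats.chisquare(f_obs=observed, f_exp=expected)
print(f"chi-square = {chi2:.2f}, p-value = {p_value:.4f}")
```

    A p-value below the chosen significance level would lead to rejecting the null hypothesis that the categories are equally likely.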

  7. Embedded 100 Gbps Photonic Components

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kuznia, Charlie

    This innovation in fiber optic component technology increases the performance and reduces the size and power consumption of optical communications within dense network systems, such as advanced distributed computing systems and data centers. VCSEL technology is enabling short-reach (<100 m), >100 Gbps optical interconnections over multi-mode fiber in commercial applications.

  8. Lifelong Learning: Skills and Online Resources

    ERIC Educational Resources Information Center

    Lim, Russell F.; Hsiung, Bob C.; Hales, Deborah J.

    2006-01-01

    Objective: Advances in information technology enable the practicing psychiatrist's quest to keep up-to-date with new discoveries in psychiatry, as well as to meet recertification requirements. However, physicians' computer skills do not always keep up with technology, nor do they take advantage of online search and continuing education services.…

  9. A Web-based Distributed Voluntary Computing Platform for Large Scale Hydrological Computations

    NASA Astrophysics Data System (ADS)

    Demir, I.; Agliamzanov, R.

    2014-12-01

    Distributed volunteer computing can enable researchers and scientists to form large parallel computing environments that harness the computing power of the millions of computers on the Internet and use them to run large-scale environmental simulations and models that serve the common good of local communities and the world. Recent developments in web technologies and standards allow client-side scripting languages to run at speeds close to native applications and to utilize the power of Graphics Processing Units (GPUs). Using a client-side scripting language like JavaScript, we have developed an open distributed computing framework that makes it easy for researchers to write their own hydrologic models and run them on volunteer computers. Website owners can easily enable their sites so that visitors can volunteer their computer resources to help run advanced hydrological models and simulations. Using a web-based system allows users to start volunteering their computational resources within seconds, without installing any software. The framework distributes the model simulation to thousands of nodes in small spatial and computational units. A relational database system is utilized for managing data connections and queue management for the distributed computing nodes. In this paper, we present a web-based distributed volunteer computing platform to enable large-scale hydrological simulations and model runs in an open and integrated environment.
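
    A minimal sketch of the server-side queue idea described above, assuming a hypothetical SQLite schema; the paper does not publish its actual schema or API, so the table and function names here are invented for illustration.

```python
# Hypothetical relational task queue for distributing small simulation tiles to volunteer nodes.
import sqlite3

conn = sqlite3.connect("volunteer_queue.db")
conn.execute("""CREATE TABLE IF NOT EXISTS tasks (
                  id INTEGER PRIMARY KEY,
                  tile TEXT,                        -- spatial subdomain to simulate
                  status TEXT DEFAULT 'pending',
                  result REAL)""")

def enqueue(tile):
    """Add one spatial tile of the model domain to the queue."""
    conn.execute("INSERT INTO tasks (tile) VALUES (?)", (tile,))
    conn.commit()

def claim_next():
    """Hand the next pending tile to a volunteer browser node."""
    row = conn.execute(
        "SELECT id, tile FROM tasks WHERE status = 'pending' LIMIT 1").fetchone()
    if row:
        conn.execute("UPDATE tasks SET status = 'running' WHERE id = ?", (row[0],))
        conn.commit()
    return row

def report(task_id, value):
    """Record the result returned by a volunteer node."""
    conn.execute("UPDATE tasks SET status = 'done', result = ? WHERE id = ?",
                 (value, task_id))
    conn.commit()
```

    In such a design, volunteer browsers would poll a web endpoint backed by claim_next() and post results back through report().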

  10. 2001 Industry Studies: Information

    DTIC Science & Technology

    2001-01-01

    increasingly demand communications, computers, and software for use in the Internet, intranets, and extranets. Information technology (IT)-enabled...As the number of Internet users increases, so does the demand for the rapid deployment of information and telecommunication technologies. The key...proliferation has become uncontrollable. Only then will the US maintain the lead in the IT market.

  11. Cloud Pedagogy: Utilizing Web-Based Technologies for the Promotion of Social Constructivist Learning in Science Teacher Preparation Courses

    ERIC Educational Resources Information Center

    Barak, Miri

    2017-01-01

    The new guidelines for science education emphasize the need to introduce computers and digital technologies as a means of enabling visualization and data collection and analysis. This requires science teachers to bring advanced technologies into the classroom and use them wisely. Hence, the goal of this study was twofold: to examine the…

  12. Application of machine learning methods in bioinformatics

    NASA Astrophysics Data System (ADS)

    Yang, Haoyu; An, Zheng; Zhou, Haotian; Hou, Yawen

    2018-05-01

    With the development of bioinformatics, high-throughput genomic technologies have enabled biology to enter the era of big data [1]. Bioinformatics is an interdisciplinary field covering the acquisition, management, analysis, interpretation, and application of biological information; it grew out of the Human Genome Project. The field of machine learning, which aims to develop computer algorithms that improve with experience, holds promise to enable computers to assist humans in the analysis of large, complex data sets [2]. This paper analyzes and compares various machine learning algorithms and their applications in bioinformatics.
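
    A minimal sketch of one workflow of the kind surveyed here: representing DNA sequences by k-mer counts and training a classifier on them. The toy sequences, labels, and the choice of scikit-learn are illustrative assumptions, not material from the paper.

```python
# Classify DNA sequences from their 3-mer counts using a random forest (toy data).
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.ensemble import RandomForestClassifier

sequences = ["ATGCGTACGT", "ATGCGTTTGT", "CCGGAATTCC", "CCGGAATACC"]
labels    = [1, 1, 0, 0]          # hypothetical class labels (e.g., two sequence families)

# Represent each sequence by its character 3-mer counts.
vectorizer = CountVectorizer(analyzer="char", ngram_range=(3, 3))
X = vectorizer.fit_transform(sequences)

model = RandomForestClassifier(n_estimators=50, random_state=0).fit(X, labels)
print(model.predict(vectorizer.transform(["ATGCGTACTT"])))
```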

  13. Technical Assessment: Integrated Photonics

    DTIC Science & Technology

    2015-10-01

    in global internet protocol traffic as a function of time by local access technology. Photonics continues to play a critical role in enabling this...communication networks. This has enabled services like the internet, high performance computing, and power-efficient large-scale data centers. The...signal processing, quantum information science, and optics for free space applications. However, major obstacles challenge the implementation of

  14. TK3 eBook Software to Author, Distribute, and Use Electronic Course Content for Medical Education

    ERIC Educational Resources Information Center

    Morton, David A.; Foreman, K. Bo; Goede, Patricia A.; Bezzant, John L.; Albertine, Kurt H.

    2007-01-01

    The methods for authoring and distributing course content are undergoing substantial changes due to advancement in computer technology. Paper has been the traditional method to author and distribute course content. Paper enables students to personalize content through highlighting and note taking but does not enable the incorporation of multimedia…

  15. The Computing And Interdisciplinary Systems Office: Annual Review and Planning Meeting

    NASA Technical Reports Server (NTRS)

    Lytle, John K.

    2003-01-01

    The goal of this research is to develop an advanced engineering analysis system that enables high-fidelity, multi-disciplinary, full propulsion system simulations to be performed early in the design process (a virtual test cell that integrates propulsion and information technologies). This will enable rapid, high-confidence, cost-effective design of revolutionary systems.

  16. Computational high-resolution optical imaging of the living human retina

    NASA Astrophysics Data System (ADS)

    Shemonski, Nathan D.; South, Fredrick A.; Liu, Yuan-Zhi; Adie, Steven G.; Scott Carney, P.; Boppart, Stephen A.

    2015-07-01

    High-resolution in vivo imaging is of great importance for the fields of biology and medicine. The introduction of hardware-based adaptive optics (HAO) has pushed the limits of optical imaging, enabling high-resolution near diffraction-limited imaging of previously unresolvable structures. In ophthalmology, when combined with optical coherence tomography, HAO has enabled a detailed three-dimensional visualization of photoreceptor distributions and individual nerve fibre bundles in the living human retina. However, the introduction of HAO hardware and supporting software adds considerable complexity and cost to an imaging system, limiting the number of researchers and medical professionals who could benefit from the technology. Here we demonstrate a fully automated computational approach that enables high-resolution in vivo ophthalmic imaging without the need for HAO. The results demonstrate that computational methods in coherent microscopy are applicable in highly dynamic living systems.

  17. Computer science: Key to a space program renaissance. The 1981 NASA/ASEE summer study on the use of computer science and technology in NASA. Volume 2: Appendices

    NASA Technical Reports Server (NTRS)

    Freitas, R. A., Jr. (Editor); Carlson, P. A. (Editor)

    1983-01-01

    Adoption of an aggressive computer science research and technology program within NASA will: (1) enable new mission capabilities such as autonomous spacecraft, reliability and self-repair, and low-bandwidth intelligent Earth sensing; (2) lower manpower requirements, especially in the areas of Space Shuttle operations, by making fuller use of control center automation, technical support, and internal utilization of state-of-the-art computer techniques; (3) reduce project costs via improved software verification, software engineering, enhanced scientist/engineer productivity, and increased managerial effectiveness; and (4) significantly improve internal operations within NASA with electronic mail, managerial computer aids, an automated bureaucracy and uniform program operating plans.

  18. The World as Viewed by and with Unpaired Electrons

    PubMed Central

    Eaton, Sandra S.; Eaton, Gareth R.

    2012-01-01

    Recent advances in electron paramagnetic resonance (EPR) include capabilities for applications to areas as diverse as archeology, beer shelf life, biological structure, dosimetry, in vivo imaging, molecular magnets, and quantum computing. Enabling technologies include multifrequency continuous wave, pulsed, and rapid scan EPR. Interpretation is enhanced by increasingly powerful computational models. PMID:22975244

  19. Fostering Recursive Thinking in Combinatorics through the Use of Manipulatives and Computing Technology.

    ERIC Educational Resources Information Center

    Abramovich, Sergei; Pieper, Anne

    1996-01-01

    Describes the use of manipulatives for solving simple combinatorial problems which can lead to the discovery of recurrence relations for permutations and combinations. Numerical evidence and visual imagery generated by a computer spreadsheet through modeling these relations can enable students to experience the ease and power of combinatorial…

  20. Computer-Aided Engineering Tools | Water Power | NREL

    Science.gov Websites

    energy converters that will provide a full range of simulation capabilities for single devices and arrays. Simulation of water power technologies on high-performance computers enables the study of complex systems and experimentation. Such simulation is critical to accelerate progress in energy programs within the U.S. Department

  1. Computer-Mediated Communication as Experienced by Korean Women Students in US Higher Education

    ERIC Educational Resources Information Center

    Baek, Mikyung; Damarin, Suzanne K.

    2008-01-01

    Having grown up in an age of rapidly developing electronic communication technology, today's students come to higher education with high levels of comfort and familiarity with computer-mediated communication (CMC, hereafter). The students' level of comfort with CMC, coupled with CMC's promises of enabling supplemental class discussion as well as…

  2. Automated Analysis of Short Responses in an Interactive Synthetic Tutoring System for Introductory Physics

    ERIC Educational Resources Information Center

    Nakamura, Christopher M.; Murphy, Sytil K.; Christel, Michael G.; Stevens, Scott M.; Zollman, Dean A.

    2016-01-01

    Computer-automated assessment of students' text responses to short-answer questions represents an important enabling technology for online learning environments. We have investigated the use of machine learning to train computer models capable of automatically classifying short-answer responses and assessed the results. Our investigations are part…

  3. Mobile computing device as tools for college student education: a case on flashcards application

    NASA Astrophysics Data System (ADS)

    Kang, Congying

    2012-04-01

    Traditionally, college students have used flash cards as a tool to remember large bodies of knowledge, such as nomenclature, structures, and reactions in chemistry. Educational and information technology have enabled flashcards to be viewed on computers, for example as slides in PowerPoint, serving as channels of drilling and feedback for learners. The current generation of students is more adept with information technology and mobile computing devices; for example, they use their mobile phones much more intensively every day. Trends in using the mobile phone as an educational tool are analyzed, and an educational technology initiative is proposed that uses mobile phone flashcard applications to help students learn biology and chemistry. Experiments show that users responded positively to these mobile flashcards.

  4. Grid Computing Education Support

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Steven Crumb

    2008-01-15

    The GGF Student Scholar program gave GGF the opportunity to bring over sixty qualified graduate and undergraduate students with interests in grid technologies to its three annual events over the three-year program.

  5. Providing Assistive Technology Applications as a Service Through Cloud Computing.

    PubMed

    Mulfari, Davide; Celesti, Antonio; Villari, Massimo; Puliafito, Antonio

    2015-01-01

    Users with disabilities interact with Personal Computers (PCs) using Assistive Technology (AT) software solutions. Such applications run on the PC that a person with a disability commonly uses. However, configuring AT applications is not trivial, especially when the user needs to work on a PC that does not allow him or her to rely on his or her AT tools (e.g., at work, at university, or at an Internet point). In this paper, we discuss how cloud computing provides a valid technological solution to this scenario. With the emergence of cloud computing, many applications are executed on top of virtual machines (VMs). Virtualization provides a software implementation of a real computer able to execute a standard operating system and any kind of application. We propose building personalized VMs running AT programs and settings. Using remote desktop technology, our solution enables users to control their customized virtual desktop environment through an HTML5-based web interface running on any computer equipped with a browser, wherever they are.

  6. Aesthetics in Young Children's Lives: From Music Technology Curriculum Perspective

    ERIC Educational Resources Information Center

    Ko, Chia-Hui; Chou, Mei-Ju

    2013-01-01

    Music technology is a term commonly used to refer to electronic forms of the musical arts, particularly devices and computer software that enable the facilitation, playback, recording, composition, storage, and performance of various musical compositions. There has been a growing awareness of the importance of aesthetics in early childhood…

  7. Beyond Functionality and Technocracy: Creating Human Involvement with Educational Technology

    ERIC Educational Resources Information Center

    Westera, Wim

    2005-01-01

    Innovation of education is highly topical. It is boosted by a range of new technologies, which enable new modes of learning that are independent of time and place through Web-based delivery and computer-mediated communication. However, innovators in education often encounter intrinsic conservatism or even deliberate obstructions. For…

  8. Effects of Practice Type in the Here and Now Mobile Learning Environment

    ERIC Educational Resources Information Center

    Tutty, Jeremy I.; Martin, Florence

    2014-01-01

    This generation of technology is characterized by mobile and portable devices such as smartphones and tablet computers with wireless broadband access. Mobile technologies enable a new kind of learning called "here and now learning," where learners have access to information anytime and anywhere to perform authentic activities in the…

  9. Wibree: wireless communication technology

    NASA Astrophysics Data System (ADS)

    Fernandes e Fizardo, Trima Piedade

    2011-12-01

    Electronic devices are now everywhere, and the world has become thoroughly mobile with a host of new equipment. The number of computing and telecommunications devices is increasing, and with it the focus on how to connect them to each other. The usual solutions are cables or infrared light for file transfer and synchronization, but infrared requires line of sight. To address these problems, Wibree radio technology complements other local connectivity technologies: it consumes only a fraction of the power of other radio technologies, enables smaller and less costly implementations, and is easy to integrate with Bluetooth solutions. It can also be used to enable communication among several units, such as small radio LANs. This paper examines why this technology has attracted wide attention, as well as its pros and cons relative to other technologies.

  10. Communication and collaboration technologies.

    PubMed

    Cheeseman, Susan E

    2012-01-01

    This is the third in a series of columns exploring health information technology (HIT) in the neonatal intensive care unit (NICU). The first column provided background information on the implementation of information technology throughout the health care delivery system, as well as the requisite informatics competencies needed for nurses to fully engage in the digital era of health care. The second column focused on information and resources to master basic computer competencies described by the TIGER initiative (Technology Informatics Guiding Education Reform) as learning about computers, computer networks, and the transfer of data.1 This column will provide additional information related to basic computer competencies, focusing on communication and collaboration technologies. Computers and the Internet have transformed the way we communicate and collaborate. Electronic communication is the ability to exchange information through the use of computer equipment and software.2 Broadly defined, any technology that facilitates linking one or more individuals together is a collaborative tool. Collaboration using technology encompasses an extensive range of applications that enable groups of individuals to work together, including e-mail, instant messaging (IM), and several web applications collectively referred to as Web 2.0 technologies. The term Web 2.0 refers to web applications where users interact and collaborate with each other in a collective exchange of ideas, generating content in a virtual community. Examples of Web 2.0 technologies include social networking sites, blogs, wikis, video sharing sites, and mashups. Many organizations are developing collaborative strategies and tools for employees to connect and interact using web-based social media technologies.3

  11. The Sunrise project: An R&D project for a national information infrastructure prototype

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lee, Juhnyoung

    1995-02-01

    Sunrise is a Los Alamos National Laboratory (LANL) project started in October 1993. It is intended to be a prototype National Information Infrastructure (NII) development project. A main focus of Sunrise is to tie together enabling technologies (networking, object-oriented distributed computing, graphical interfaces, security, multimedia technologies, and data mining technologies) with several specific applications. A diverse set of application areas was chosen to ensure that the solutions developed in the project are as generic as possible. Some of the application areas are materials modeling, medical records and image analysis, transportation simulations, and education. This paper provides a description of Sunrise and a view of the architecture and objectives of this evolving project. The primary objectives of Sunrise are three-fold: (1) to develop common information-enabling tools for advanced scientific research and its applications to industry; (2) to enhance the capabilities of important research programs at the Laboratory; and (3) to define a new way of collaboration between computer science and industrially relevant research.

  12. Neuromorphic computing enabled by physics of electron spins: Prospects and perspectives

    NASA Astrophysics Data System (ADS)

    Sengupta, Abhronil; Roy, Kaushik

    2018-03-01

    “Spintronics” refers to the understanding of the physics of electron spin-related phenomena. While most of the significant advances in this field have been driven primarily by memory, recent research has demonstrated that various facets of the underlying physics of spin transport and manipulation can directly mimic the functionalities of the computational primitives in neuromorphic computation, i.e., the neurons and synapses. Given the potential of these spintronic devices to implement bio-mimetic computations at very low terminal voltages, several spin-device structures have been proposed as the core building blocks of neuromorphic circuits and systems to implement brain-inspired computing. Such an approach is expected to play a key role in circumventing the problems of ever-increasing power dissipation and hardware requirements for implementing neuro-inspired algorithms in conventional digital CMOS technology. This review article outlines perspectives on spin-enabled neuromorphic computing: its status, challenges, and future prospects.

  13. Structural and Computational Biology in the Design of Immunogenic Vaccine Antigens

    PubMed Central

    Liljeroos, Lassi; Malito, Enrico; Ferlenghi, Ilaria; Bottomley, Matthew James

    2015-01-01

    Vaccination is historically one of the most important medical interventions for the prevention of infectious disease. Previously, vaccines were typically made of rather crude mixtures of inactivated or attenuated causative agents. However, over the last 10–20 years, several important technological and computational advances have enabled major progress in the discovery and design of potently immunogenic recombinant protein vaccine antigens. Here we discuss three key breakthrough approaches that have potentiated structural and computational vaccine design. Firstly, genomic sciences gave birth to the field of reverse vaccinology, which has enabled the rapid computational identification of potential vaccine antigens. Secondly, major advances in structural biology, experimental epitope mapping, and computational epitope prediction have yielded molecular insights into the immunogenic determinants defining protective antigens, enabling their rational optimization. Thirdly, and most recently, computational approaches have been used to convert this wealth of structural and immunological information into the design of improved vaccine antigens. This review aims to illustrate the growing power of combining sequencing, structural and computational approaches, and we discuss how this may drive the design of novel immunogens suitable for future vaccines urgently needed to increase the global prevention of infectious disease. PMID:26526043

  14. Structural and Computational Biology in the Design of Immunogenic Vaccine Antigens.

    PubMed

    Liljeroos, Lassi; Malito, Enrico; Ferlenghi, Ilaria; Bottomley, Matthew James

    2015-01-01

    Vaccination is historically one of the most important medical interventions for the prevention of infectious disease. Previously, vaccines were typically made of rather crude mixtures of inactivated or attenuated causative agents. However, over the last 10-20 years, several important technological and computational advances have enabled major progress in the discovery and design of potently immunogenic recombinant protein vaccine antigens. Here we discuss three key breakthrough approaches that have potentiated structural and computational vaccine design. Firstly, genomic sciences gave birth to the field of reverse vaccinology, which has enabled the rapid computational identification of potential vaccine antigens. Secondly, major advances in structural biology, experimental epitope mapping, and computational epitope prediction have yielded molecular insights into the immunogenic determinants defining protective antigens, enabling their rational optimization. Thirdly, and most recently, computational approaches have been used to convert this wealth of structural and immunological information into the design of improved vaccine antigens. This review aims to illustrate the growing power of combining sequencing, structural and computational approaches, and we discuss how this may drive the design of novel immunogens suitable for future vaccines urgently needed to increase the global prevention of infectious disease.

  15. Digital imaging of root traits (DIRT): a high-throughput computing and collaboration platform for field-based root phenomics.

    PubMed

    Das, Abhiram; Schneider, Hannah; Burridge, James; Ascanio, Ana Karine Martinez; Wojciechowski, Tobias; Topp, Christopher N; Lynch, Jonathan P; Weitz, Joshua S; Bucksch, Alexander

    2015-01-01

    Plant root systems are key drivers of plant function and yield. They are also under-explored targets to meet global food and energy demands. Many new technologies have been developed to characterize crop root system architecture (CRSA). These technologies have the potential to accelerate progress in understanding the genetic control and environmental response of CRSA. Putting this potential into practice requires new methods and algorithms to analyze CRSA in digital images. Most prior approaches have focused solely on the estimation of root traits from images, yet no integrated platform exists that allows easy and intuitive access to trait extraction and analysis methods from images combined with storage solutions linked to metadata. Automated high-throughput phenotyping methods are increasingly used in laboratory-based efforts to link plant genotype with phenotype, whereas similar field-based studies remain predominantly manual and low-throughput. Here, we present an open-source phenomics platform, "DIRT", as a means to integrate scalable supercomputing architectures into field experiments and analysis pipelines. DIRT is an online platform that enables researchers to store images of plant roots, measure dicot and monocot root traits under field conditions, and share data and results within collaborative teams and the broader community. The DIRT platform seamlessly connects end-users with large-scale compute "commons", enabling the estimation and analysis of root phenotypes from field experiments of unprecedented size. DIRT is an automated high-throughput computing and collaboration platform for field-based crop root phenomics. The platform is accessible at http://www.dirt.iplantcollaborative.org/ and hosted on the iPlant cyber-infrastructure using high-throughput grid computing resources of the Texas Advanced Computing Center (TACC). DIRT is a high-volume central depository and high-throughput RSA trait computation platform for plant scientists working on crop roots. It enables scientists to store, manage, and share crop root images with metadata and to compute RSA traits from thousands of images in parallel. It makes high-throughput RSA trait computation available to the community with just a few button clicks, enabling plant scientists to spend more time on science rather than on technology. All stored and computed data are easily accessible to the public and the broader scientific community. We hope that easy data accessibility will attract new tool developers and spur creative data usage that may even be applied to other fields of science.
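
    A minimal sketch of what computing traits from many images in parallel can look like on a single machine. This is not DIRT's actual code: the directory path, the threshold, and the crude area trait are illustrative assumptions standing in for real RSA trait extraction.

```python
# Compute a simple root trait (fraction of dark pixels) for many images in parallel.
from multiprocessing import Pool
from glob import glob
import numpy as np
from PIL import Image

def root_area_fraction(path, threshold=80):
    """Fraction of pixels darker than `threshold`: a crude stand-in for a real RSA trait."""
    img = np.asarray(Image.open(path).convert("L"))
    return path, float((img < threshold).mean())

if __name__ == "__main__":
    image_paths = glob("root_images/*.png")      # hypothetical image directory
    with Pool() as pool:
        for path, trait in pool.map(root_area_fraction, image_paths):
            print(path, trait)
```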

  16. Human and Robotic Space Mission Use Cases for High-Performance Spaceflight Computing

    NASA Technical Reports Server (NTRS)

    Some, Raphael; Doyle, Richard; Bergman, Larry; Whitaker, William; Powell, Wesley; Johnson, Michael; Goforth, Montgomery; Lowry, Michael

    2013-01-01

    Spaceflight computing is a key resource in NASA space missions and a core determining factor of spacecraft capability, with ripple effects throughout the spacecraft, end-to-end system, and mission. Onboard computing can be aptly viewed as a "technology multiplier" in that advances provide direct dramatic improvements in flight functions and capabilities across the NASA mission classes, and enable new flight capabilities and mission scenarios, increasing science and exploration return. Space-qualified computing technology, however, has not advanced significantly in well over ten years and the current state of the practice fails to meet the near- to mid-term needs of NASA missions. Recognizing this gap, the NASA Game Changing Development Program (GCDP), under the auspices of the NASA Space Technology Mission Directorate, commissioned a study on space-based computing needs, looking out 15-20 years. The study resulted in a recommendation to pursue high-performance spaceflight computing (HPSC) for next-generation missions, and a decision to partner with the Air Force Research Lab (AFRL) in this development.

  17. System design and implementation of digital-image processing using computational grids

    NASA Astrophysics Data System (ADS)

    Shen, Zhanfeng; Luo, Jiancheng; Zhou, Chenghu; Huang, Guangyu; Ma, Weifeng; Ming, Dongping

    2005-06-01

    As a special type of digital image, remotely sensed images are playing increasingly important roles in our daily lives. Because of the enormous amounts of data involved and the difficulties of data processing and transfer, an important issue for computer and geo-science experts is developing internet technology for rapid remotely sensed image processing. Computational grids can solve this problem effectively. These networks of computer workstations enable the sharing of data and resources, and are used by computer experts to balance uneven network resources and usage. In China, computational grids combined with spatial-information-processing technology have formed a new technology, namely spatial-information grids. In the field of remotely sensed images, spatial-information grids work more effectively for network computing, data processing, resource sharing, task cooperation, and so on. This paper focuses mainly on the application of computational grids to digital-image processing. First, we describe the architecture of digital-image processing on the basis of computational grids; its implementation is then discussed in detail with respect to middleware technology. The whole network-based intelligent image-processing system is evaluated on the basis of an experimental analysis of remotely sensed image-processing tasks; the results confirm the feasibility of applying computational grids to digital-image processing.

  18. Virtual reality and planetary exploration

    NASA Technical Reports Server (NTRS)

    Mcgreevy, Michael W.

    1992-01-01

    Exploring planetary environments is central to NASA's missions and goals. A new computing technology called Virtual Reality has much to offer in support of planetary exploration. This technology augments and extends human presence within computer-generated and remote spatial environments. Historically, NASA has been a leader in many of the fundamental concepts and technologies that comprise Virtual Reality. Indeed, Ames Research Center has a central role in the development of this rapidly emerging approach to using computers. This ground breaking work has inspired researchers in academia, industry, and the military. Further, NASA's leadership in this technology has spun off new businesses, has caught the attention of the international business community, and has generated several years of positive international media coverage. In the future, Virtual Reality technology will enable greatly improved human-machine interactions for more productive planetary surface exploration. Perhaps more importantly, Virtual Reality technology will democratize the experience of planetary exploration and thereby broaden understanding of, and support for, this historic enterprise.

  19. Virtual reality and planetary exploration

    NASA Astrophysics Data System (ADS)

    McGreevy, Michael W.

    Exploring planetary environments is central to NASA's missions and goals. A new computing technology called Virtual Reality has much to offer in support of planetary exploration. This technology augments and extends human presence within computer-generated and remote spatial environments. Historically, NASA has been a leader in many of the fundamental concepts and technologies that comprise Virtual Reality. Indeed, Ames Research Center has a central role in the development of this rapidly emerging approach to using computers. This ground breaking work has inspired researchers in academia, industry, and the military. Further, NASA's leadership in this technology has spun off new businesses, has caught the attention of the international business community, and has generated several years of positive international media coverage. In the future, Virtual Reality technology will enable greatly improved human-machine interactions for more productive planetary surface exploration. Perhaps more importantly, Virtual Reality technology will democratize the experience of planetary exploration and thereby broaden understanding of, and support for, this historic enterprise.

  20. Nanoelectronic programmable synapses based on phase change materials for brain-inspired computing.

    PubMed

    Kuzum, Duygu; Jeyasingh, Rakesh G D; Lee, Byoungil; Wong, H-S Philip

    2012-05-09

    Brain-inspired computing is an emerging field, which aims to extend the capabilities of information technology beyond digital logic. A compact nanoscale device, emulating biological synapses, is needed as the building block for brain-like computational systems. Here, we report a new nanoscale electronic synapse based on technologically mature phase change materials employed in optical data storage and nonvolatile memory applications. We utilize continuous resistance transitions in phase change materials to mimic the analog nature of biological synapses, enabling the implementation of a synaptic learning rule. We demonstrate different forms of spike-timing-dependent plasticity using the same nanoscale synapse with picojoule level energy consumption.
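
    For readers unfamiliar with spike-timing-dependent plasticity, the sketch below shows a generic pair-based STDP weight-update rule of the kind such synaptic devices are used to emulate. The amplitudes and time constant are textbook-style placeholders, not the device parameters reported in the paper.

```python
# Generic pair-based STDP rule: potentiation when the presynaptic spike precedes the
# postsynaptic spike, depression otherwise, with exponentially decaying magnitude.
import math

def stdp_delta_w(dt_ms, a_plus=0.01, a_minus=0.012, tau_ms=20.0):
    """Weight change for a spike pair separated by dt = t_post - t_pre (milliseconds)."""
    if dt_ms > 0:    # pre before post -> potentiation
        return a_plus * math.exp(-dt_ms / tau_ms)
    elif dt_ms < 0:  # post before pre -> depression
        return -a_minus * math.exp(dt_ms / tau_ms)
    return 0.0

for dt in (-40, -10, -1, 1, 10, 40):
    print(dt, round(stdp_delta_w(dt), 5))
```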

  1. Experimental demonstration of blind quantum computing

    NASA Astrophysics Data System (ADS)

    Barz, Stefanie; Kashefi, Elham; Broadbent, Anne; Fitzsimons, Joe; Zeilinger, Anton; Walther, Philip

    2012-02-01

    Quantum computers are among the most promising applications of quantum-enhanced technologies. Quantum effects such as superposition and entanglement enable computational speed-ups that are unattainable using classical computers. The challenges in realising quantum computers suggest that in the near future, only a few facilities worldwide will be capable of operating such devices. In order to exploit these computers, users would seemingly have to give up their privacy. It was recently shown that this is not the case and that, via the universal blind quantum computation protocol, quantum mechanics provides a way to guarantee that the user's data remain private. Here, we demonstrate the first experimental version of this protocol using polarisation-entangled photonic qubits. We demonstrate various blind one- and two-qubit gate operations as well as blind versions of Deutsch's and Grover's algorithms. When the technology to build quantum computers becomes available, this will become an important privacy-preserving feature of quantum information processing.

  2. Live broadcast of laparoscopic surgery to handheld computers.

    PubMed

    Gandsas, A; McIntire, K; Park, A

    2004-06-01

    Thanks to advances in computer power and miniaturization technology, portable electronic devices are now being used to assist physicians with various applications that extend far beyond Web browsing or sending e-mail. Handheld computers are used for electronic medical records, billing, and coding, and to enable convenient access to electronic journals for reference purposes. The results of diagnostic investigations, such as laboratory results, study reports, and still radiographic pictures, can also be downloaded into portable devices for later viewing. Handheld computer technology, combined with wireless protocols and streaming video technology, has the added potential to become a powerful educational tool for medical students and residents. The purpose of this study was to assess the feasibility of transferring multimedia data in real time to a handheld computer via a wireless network and displaying them on the computer screens of clients at remote locations. A laparoscopic splenectomy was broadcast live to eight handheld computers simultaneously through our institution's wireless network. All eight viewers were able to view the procedure and to hear the surgeon's comments throughout the entire duration of the operation. Handheld computer technology can play a key role in surgical education by delivering information to surgical residents or students when they are geographically distant from the actual event. Validation of this new technology by conducting clinical research is still needed to determine whether resident physicians or medical students can benefit from the use of handheld computers.

  3. Cultural and Global Linkages of Emotional Support through Online Support Groups.

    ERIC Educational Resources Information Center

    Gary, Juneau Mahan

    Computer technology is altering the way people cope with emotional distress. Computers enable people worldwide and from all cultural groups to give and receive emotional support when it may be culturally stigmatizing to seek face-to-face support or when support services are limited or non-existent. Online support groups attract a broad range of…

  4. EduSpeak[R]: A Speech Recognition and Pronunciation Scoring Toolkit for Computer-Aided Language Learning Applications

    ERIC Educational Resources Information Center

    Franco, Horacio; Bratt, Harry; Rossier, Romain; Rao Gadde, Venkata; Shriberg, Elizabeth; Abrash, Victor; Precoda, Kristin

    2010-01-01

    SRI International's EduSpeak[R] system is a software development toolkit that enables developers of interactive language education software to use state-of-the-art speech recognition and pronunciation scoring technology. Automatic pronunciation scoring allows the computer to provide feedback on the overall quality of pronunciation and to point to…

  5. The world as viewed by and with unpaired electrons.

    PubMed

    Eaton, Sandra S; Eaton, Gareth R

    2012-10-01

    Recent advances in electron paramagnetic resonance (EPR) include capabilities for applications to areas as diverse as archeology, beer shelf life, biological structure, dosimetry, in vivo imaging, molecular magnets, and quantum computing. Enabling technologies include multifrequency continuous wave, pulsed, and rapid scan EPR. Interpretation is enhanced by increasingly powerful computational models. Copyright © 2012 Elsevier Inc. All rights reserved.

  6. [A skin cell segregating control system based on PC].

    PubMed

    Liu, Wen-zhong; Zhou, Ming; Zhang, Hong-bing

    2005-11-01

    A skin cell segregating control system based on a PC (personal computer) is presented in this paper. Its front controller is a single-chip microcomputer that can handle six patients simultaneously, providing great convenience for the clinical treatment of vitiligo. Serial port communication technology makes it possible to monitor and control the front controller from a PC terminal, and computer image acquisition technology enables the synchronous acquisition of pathologic skin cell images before and after the operation, together with the case history. Clinical tests prove its conformity with national standards and the pre-set technological requirements.
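
    A minimal sketch of host-side serial-port monitoring and control of the kind described above, assuming the pySerial library; the port name, baud rate, and command string are hypothetical, since the abstract does not specify the controller's protocol.

```python
# Send a hypothetical command to a single-chip front controller and read back its status.
import serial

with serial.Serial("COM1", 9600, timeout=1) as port:
    port.write(b"START 3\n")        # hypothetical command: start treatment on channel 3
    status = port.readline()        # controller is assumed to reply with a status line
    print(status.decode(errors="replace"))
```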

  7. Towards programmable plant genetic circuits.

    PubMed

    Medford, June I; Prasad, Ashok

    2016-07-01

    Synthetic biology enables the construction of genetic circuits with predictable gene functions in plants. Detailed quantitative descriptions of the transfer function, or input-output function, of genetic parts (promoters, 5' and 3' untranslated regions, etc.) are collected. These data are then used in computational simulations to determine the parts' robustness and desired properties, enabling the best components to be selected for experimental testing in plants. In addition, the process forms an iterative workflow that allows substantial improvement of validated elements with sub-optimal function. These processes enable computational functions such as digital logic in living plants, following the pathway of technological advances that took us from vacuum tubes to cell phones. © 2016 The Authors The Plant Journal © 2016 John Wiley & Sons Ltd.
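
    A minimal sketch of the kind of transfer-function model referred to above: a repressible promoter described by a Hill function behaves as a NOT gate. The parameter values are illustrative assumptions, not measured data from the paper.

```python
# Steady-state transfer function of a promoter repressed by its input (Hill repression).
def repressor_output(input_level, v_max=100.0, k=10.0, n=2.0, leak=1.0):
    """Output expression level as a function of repressor input; high input gives low output."""
    return leak + v_max / (1.0 + (input_level / k) ** n)

for x in (0, 1, 10, 100):
    print(f"input={x:>3}  output={repressor_output(x):7.2f}")
```

    Composing such characterized transfer functions is what allows digital-logic behavior (here, a NOT gate) to be predicted in simulation before any construct is tested in plants.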

  8. Physics and Robotic Sensing -- the good, the bad, and approaches to making it work

    NASA Astrophysics Data System (ADS)

    Huff, Brian

    2011-03-01

    All of the technological advances that have benefited consumer electronics have direct application to robotics. Technological advances have resulted in the dramatic reduction in size, cost, and weight of computing systems, while simultaneously doubling computational speed every eighteen months. The same manufacturing advancements that have enabled this rapid increase in computational power are now being leveraged to produce small, powerful and cost-effective sensing technologies applicable for use in mobile robotics applications. Despite the increase in computing and sensing resources available to today's robotic systems developers, there are sensing problems typically found in unstructured environments that continue to frustrate the widespread use of robotics and unmanned systems. This talk presents how physics has contributed to the creation of the technologies that are making modern robotics possible. The talk discusses theoretical approaches to robotic sensing that appear to suffer when they are deployed in the real world. Finally the author presents methods being used to make robotic sensing more robust.

  9. Aerosciences, Aero-Propulsion and Flight Mechanics Technology Development for NASA's Next Generation Launch Technology Program

    NASA Technical Reports Server (NTRS)

    Cockrell, Charles E., Jr.

    2003-01-01

    The Next Generation Launch Technology (NGLT) program, Vehicle Systems Research and Technology (VSR&T) project is pursuing technology advancements in aerothermodynamics, aeropropulsion and flight mechanics to enable development of future reusable launch vehicle (RLV) systems. The current design trade space includes rocket-propelled, hypersonic airbreathing and hybrid systems in two-stage and single-stage configurations. Aerothermodynamics technologies include experimental and computational databases to evaluate stage separation of two-stage vehicles as well as computational and trajectory simulation tools for this problem. Additionally, advancements in high-fidelity computational tools and measurement techniques are being pursued along with the study of flow physics phenomena, such as boundary-layer transition. Aero-propulsion technology development includes scramjet flowpath development and integration, with a current emphasis on hypervelocity (Mach 10 and above) operation, as well as the study of aero-propulsive interactions and the impact on overall vehicle performance. Flight mechanics technology development is focused on advanced guidance, navigation and control (GN&C) algorithms and adaptive flight control systems for both rocket-propelled and airbreathing vehicles.

  10. Health Technology-Enabled Interventions for Adherence Support and Retention in Care Among US HIV-Infected Adolescents and Young Adults: An Integrative Review.

    PubMed

    Navarra, Ann-Margaret Dunn; Gwadz, Marya Viorst; Whittemore, Robin; Bakken, Suzanne R; Cleland, Charles M; Burleson, Winslow; Jacobs, Susan Kaplan; Melkus, Gail D'Eramo

    2017-11-01

    The objective of this integrative review was to describe current US trends for health technology-enabled adherence interventions among behaviorally HIV-infected youth (ages 13-29 years), and present the feasibility and efficacy of identified interventions. A comprehensive search was executed across five electronic databases (January 2005-March 2016). Of the 1911 identified studies, nine met the inclusion criteria of quantitative or mixed methods design, technology-enabled adherence and or retention intervention for US HIV-infected youth. The majority were small pilots. Intervention dose varied between studies applying similar technology platforms with more than half not informed by a theoretical framework. Retention in care was not a reported outcome, and operationalization of adherence was heterogeneous across studies. Despite these limitations, synthesized findings from this review demonstrate feasibility of computer-based interventions, and initial efficacy of SMS texting for adherence support among HIV-infected youth. Moving forward, there is a pressing need for the expansion of this evidence base.

  11. The Russia Project: Building Digital Bridges and Meeting Adolescent Needs

    ERIC Educational Resources Information Center

    Beal, Candy; Cuper, Pru; Dalton, Pat

    2005-01-01

    The intent of good education is to meet the needs of learners. How educators go about meeting those needs varies from one context to the next, and has lately been affected by the advent of technology-enhanced learning tools. Today, computer technology applications enable teachers to accelerate the pace of learning, increase the depth of in-school…

  12. Towards Individualized Instruction with Technology-Enabled Tools and Methods: An Exploratory Study. CRESST Report 854

    ERIC Educational Resources Information Center

    Chung, Gregory K. W. K.; Delacruz, Girlie C.; Dionne, Gary B.; Baker, Eva L.; Lee, John J.; Osmundson, Ellen

    2016-01-01

    This report addresses a renewed interest in individualized instruction, driven in part by advances in technology and assessment as well as a persistent desire to increase the access, efficiency, and cost effectiveness of training and education. Using computer-based instruction we delivered extremely efficient instruction targeted to low knowledge…

  13. Adapting to Student Learning Styles: Engaging Students with Cell Phone Technology in Organic Chemistry Instruction

    ERIC Educational Resources Information Center

    Pursell, David P.

    2009-01-01

    Students of organic chemistry traditionally make 3 x 5 in. flash cards to assist learning nomenclature, structures, and reactions. Advances in educational technology have enabled flash cards to be viewed on computers, offering an endless array of drilling and feedback for students. The current generation of students is less inclined to use…

  14. Digital Devices, Distraction, and Student Performance: Does In-Class Cell Phone Use Reduce Learning?

    ERIC Educational Resources Information Center

    Duncan, Douglas K.; Hoekstra, Angel R.; Wilcox, Bethany R.

    2012-01-01

    The recent increase in use of digital devices such as laptop computers, iPads, and web-enabled cell phones has generated concern about how technologies affect student performance. Combining observation, survey, and interview data, this research assesses the effects of technology use on student attitudes and learning. Data were gathered in eight…

  15. Convergent and Divergent Computer-Mediated Communication Tasks in an English for Academic Purposes Course

    ERIC Educational Resources Information Center

    Jackson, Daniel O.

    2011-01-01

    This article describes the implementation of technology-mediated tasks in an English for academic purposes (EAP) curriculum at a Japanese university. The course addressed the needs of English majors at the school by enabling more efficient completion of academic work, including essay writing. One way that technology supported this goal was through…

  16. Beyond "Classroom" Technology: The Equipment Circulation Program at Rasmuson Library, University of Alaska Fairbanks

    ERIC Educational Resources Information Center

    Jensen, Karen

    2008-01-01

    The library at the University of Alaska Fairbanks offers a unique equipment lending program through its Circulation Desk. The program features a wide array of equipment types, generous circulation policies, and unrestricted borrowing, enabling students, staff, and faculty to experiment with the latest in audio, video, and computer technologies,…

  17. The Teaching of Anthropogenic Climate Change and Earth Science via Technology-Enabled Inquiry Education

    ERIC Educational Resources Information Center

    Bush, Drew; Sieber, Renee; Seiler, Gale; Chandler, Mark

    2016-01-01

    A gap has existed between the tools and processes of scientists working on anthropogenic global climate change (AGCC) and the technologies and curricula available to educators teaching the subject through student inquiry. Designing realistic scientific inquiry into AGCC poses a challenge because research on it relies on complex computer models,…

  18. Virtual Worlds to Support Patient Group Communication? A Questionnaire Study Investigating Potential for Virtual World Focus Group Use by Respiratory Patients

    ERIC Educational Resources Information Center

    Taylor, Michael J.; Taylor, Dave; Vlaev, Ivo; Elkin, Sarah

    2017-01-01

    Recent advances in communication technologies enable potential provision of remote education for patients using computer-generated environments known as virtual worlds. Previous research has revealed highly variable levels of patient receptiveness to using information technologies for healthcare-related purposes. This preliminary study involved…

  19. Quantum Testbeds Stakeholder Workshop (QTSW) Report meeting purpose and agenda.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hebner, Gregory A.

    Quantum computing (QC) is a promising early-stage technology with the potential to provide scientific computing capabilities far beyond what is possible with even an Exascale computer in specific problems of relevance to the Office of Science. These include (but are not limited to) materials modeling, molecular dynamics, and quantum chromodynamics. However, commercial QC systems are not yet available and the technical maturity of current QC hardware, software, algorithms, and systems integration is woefully incomplete. Thus, there is a significant opportunity for DOE to define the technology building blocks and solve the system integration issues to enable a revolutionary tool. Once realized, QC will have world-changing impact on economic competitiveness, the scientific enterprise, and citizen well-being. Prior to this workshop, DOE / Office of Advanced Scientific Computing Research (ASCR) hosted a workshop in 2015 to explore QC scientific applications. The goal of that workshop was to assess the viability of QC technologies to meet the computational requirements in support of DOE’s science and energy mission and to identify the potential impact of these technologies.

  20. Enabling Earth Science: The Facilities and People of the NCCS

    NASA Technical Reports Server (NTRS)

    2002-01-01

    The NCCS's mass data storage system allows scientists to store and manage the vast amounts of data generated by these computations, and its high-speed network connections allow the data to be accessed quickly from the NCCS archives. Some NCCS users perform studies that are directly enabled by their ability to run computationally expensive and data-intensive simulations. Because the number and type of questions scientists research often are limited by computing power, the NCCS continually pursues the latest technologies in computing, mass storage, and networking. Just as important as the processors, tapes, and routers of the NCCS are the personnel who administer this hardware, create and manage accounts, maintain security, and assist the scientists, often working one on one with them.

  1. Application of M-JPEG compression hardware to dynamic stimulus production.

    PubMed

    Mulligan, J B

    1997-01-01

    Inexpensive circuit boards have appeared on the market which transform a normal micro-computer's disk drive into a video disk capable of playing extended video sequences in real time. This technology enables the performance of experiments which were previously impossible, or at least prohibitively expensive. The new technology achieves this capability using special-purpose hardware to compress and decompress individual video frames, enabling a video stream to be transferred over relatively low-bandwidth disk interfaces. This paper will describe the use of such devices for visual psychophysics and present the technical issues that must be considered when evaluating individual products.
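    As a rough illustration of the bandwidth argument above, the sketch below (Python, with illustrative frame size, frame rate, and compression ratio that are assumptions rather than figures from the study) compares the sustained disk transfer rate needed for uncompressed playback with the rate needed once frames are compressed in hardware.

        # Why per-frame compression makes real-time playback feasible over a
        # low-bandwidth disk interface; all numbers here are illustrative.
        def required_bandwidth(width, height, bytes_per_pixel, fps, compression_ratio=1.0):
            """Sustained transfer rate (MB/s) needed to stream video frames."""
            raw_bytes_per_second = width * height * bytes_per_pixel * fps
            return raw_bytes_per_second / compression_ratio / 1e6

        uncompressed = required_bandwidth(640, 480, 3, 60)                      # ~55 MB/s
        mjpeg_like = required_bandwidth(640, 480, 3, 60, compression_ratio=15)  # ~3.7 MB/s
        print(f"uncompressed: {uncompressed:.1f} MB/s, compressed: {mjpeg_like:.1f} MB/s")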

  2. Exploring the experience of clients with tetraplegia utilizing assistive technology for computer access.

    PubMed

    Folan, Alyce; Barclay, Linda; Cooper, Cathy; Robinson, Merren

    2015-01-01

    Assistive technology for computer access can be used to help people with a spinal cord injury utilize mainstream computer applications, thereby enabling participation in a variety of meaningful occupations. The aim of this study was to gain an understanding of the experiences of clients with tetraplegia trialing assistive technologies for computer access during different stages in a public rehabilitation service. In order to explore the experiences of clients with tetraplegia trialing assistive technologies for computer use, qualitative methodology was selected. Data were collected from seven participants using semi-structured interviews, which were audio-taped, transcribed and analyzed thematically. Three main themes were identified. These were: getting back into life, assisting in adjusting to injury and learning new skills. The findings from this study demonstrated that people with tetraplegia can be assisted to return to previous life roles or engage in new roles, through developing skills in the use of assistive technology for computer access. Being able to use computers for meaningful activities contributed to the participants gaining an enhanced sense of self-efficacy, and thereby quality of life. Implications for Rehabilitation: Findings from this pilot study indicate that people with tetraplegia can be assisted to return to previous life roles, and develop new roles that have meaning to them through the use of assistive technologies for computer use. Being able to use the internet to socialize, and complete daily tasks, contributed to the participants gaining a sense of control over their lives. Early introduction to assistive technology is important to ensure sufficient time for newly injured people to feel comfortable enough with the assistive technology to use the computers productively by the time of discharge. Further research into this important and expanding area is indicated.

  3. Practical 3D Printing of Antennas and RF Electronics

    DTIC Science & Technology

    2017-03-01

    Passive RF; Combiners. Introduction: Additive manufacturing can reduce the time and material costs in a design cycle and enable the on-demand printing of...performance, and create Computer Assisted Manufacturing (CAM) files. By intelligently leveraging this process, the design can be readily updated or...advances in 3D printing technology now enable antennas and RF electronics to be designed and prototyped significantly faster than conventional

  4. Army AL&T, October-December 2008

    DTIC Science & Technology

    2008-12-01

    during the WIN-T technology demonstration Nov. 8, 2007, at Naval Air Engineering Station, Lakehurst, NJ. (U.S. Army photo by Russ Messeroll.) 16 OCTOBER...worldwide communications architecture, enabling connectivity from the global backbone to regional networks to posts/camps/stations, and, lastly, to...Force Tracker. • Tacticomp™ wireless and Global Positioning System (GPS)-enabled hand-held computer. • One Station Remote Video Terminal. • Counter

  5. VCSEL-based optical transceiver module for high-speed short-reach interconnect

    NASA Astrophysics Data System (ADS)

    Yagisawa, Takatoshi; Oku, Hideki; Mori, Tatsuhiro; Tsudome, Rie; Tanaka, Kazuhiro; Daikuhara, Osamu; Komiyama, Takeshi; Ide, Satoshi

    2017-02-01

    Interconnects have become increasingly important in high-performance computing systems and high-end servers, alongside improvements in computing capability. Recently, active optical cables (AOCs) have started being used for this purpose instead of conventionally used copper cables. The AOC dramatically extends the transmission distance of high-speed signals thanks to its broadband characteristics; however, it tends to increase cost. In this paper, we report our developed quad small form-factor pluggable (QSFP) AOC utilizing cost-effective optical-module technologies. These are a unique structure using a commonly available flexible printed circuit (FPC) in combination with an optical waveguide that enables low-cost, high-precision assembly with passive alignment; a lens-integrated ferrule that improves productivity by eliminating a polishing process for physical contact of a standard PMT connector for the optical waveguide; and an overdrive technology that enables 100 Gb/s (25 Gb/s × 4-channel) operation with a low-cost 14 Gb/s vertical-cavity surface-emitting laser (VCSEL) array. The QSFP AOC demonstrated clear eye opening and error-free operation at 100 Gb/s with a high yield rate even though the 14 Gb/s VCSEL was used, thanks to the low coupling loss resulting from the high-precision alignment of optical devices and the overdrive technology.

  6. Distributed nuclear medicine applications using World Wide Web and Java technology.

    PubMed

    Knoll, P; Höll, K; Mirzaei, S; Koriska, K; Köhn, H

    2000-01-01

    At present, medical applications applying World Wide Web (WWW) technology are mainly used to view static images and to retrieve some information. The Java platform is a relatively new way of computing, especially designed for network computing and distributed applications, which enables interactive connection between user and information via the WWW. The Java 2 Software Development Kit (SDK), including the Java2D API, Java Remote Method Invocation (RMI) technology, Object Serialization and the Java Advanced Imaging (JAI) extension, was used to achieve a robust, platform-independent and network-centric solution. Medical image processing software based on this technology is presented, and adequate performance capability of Java is demonstrated by an iterative reconstruction algorithm for single photon emission computerized tomography (SPECT).
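    To give a concrete sense of the kind of computation the abstract refers to, the following minimal sketch runs an MLEM-style iterative reconstruction update in Python/NumPy on a synthetic system matrix. It is only an illustration of the general algorithm class; the cited work implements its reconstruction in Java, and none of the array sizes or data here come from that study.

        import numpy as np

        # Minimal MLEM-style iterative reconstruction sketch (illustrative only;
        # the system matrix is a random stand-in for a real SPECT projector).
        rng = np.random.default_rng(0)
        n_pixels, n_bins = 64, 128
        A = rng.random((n_bins, n_pixels))          # system (projection) matrix
        x_true = rng.random(n_pixels)               # "true" activity distribution
        y = A @ x_true                              # noiseless projection data

        x = np.ones(n_pixels)                       # uniform initial estimate
        sensitivity = A.T @ np.ones(n_bins)         # normalization term A^T 1
        for _ in range(50):
            ratio = y / (A @ x + 1e-12)             # measured vs. estimated projections
            x *= (A.T @ ratio) / sensitivity        # multiplicative MLEM update

        print("relative error:", np.linalg.norm(x - x_true) / np.linalg.norm(x_true))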

  7. Technical description of space ultra reliable modular computer (SUMC), model 2 B

    NASA Technical Reports Server (NTRS)

    1975-01-01

    The design features of the SUMC-2B computer, also called the IBM-HTC, are described. It is a general-purpose digital computer implemented with flexible hardware elements and microprogramming to enable low-cost customizing to a wide range of applications. It executes the S/360 standard instruction set to maintain problem-state compatibility. Memory technology, extended instruction sets, and I/O channel variations are among the available options.

  8. Science in the cloud (SIC): A use case in MRI connectomics

    PubMed Central

    Gorgolewski, Krzysztof J.; Kleissas, Dean; Roncal, William Gray; Litt, Brian; Wandell, Brian; Poldrack, Russel A.; Wiener, Martin; Vogelstein, R. Jacob; Burns, Randal

    2017-01-01

    Modern technologies are enabling scientists to collect extraordinary amounts of complex and sophisticated data across a huge range of scales like never before. With this onslaught of data, we can allow the focal point to shift from data collection to data analysis. Unfortunately, lack of standardized sharing mechanisms and practices often make reproducing or extending scientific results very difficult. With the creation of data organization structures and tools that drastically improve code portability, we now have the opportunity to design such a framework for communicating extensible scientific discoveries. Our proposed solution leverages these existing technologies and standards, and provides an accessible and extensible model for reproducible research, called ‘science in the cloud’ (SIC). Exploiting scientific containers, cloud computing, and cloud data services, we show the capability to compute in the cloud and run a web service that enables intimate interaction with the tools and data presented. We hope this model will inspire the community to produce reproducible and, importantly, extensible results that will enable us to collectively accelerate the rate at which scientific breakthroughs are discovered, replicated, and extended. PMID:28327935

  9. Enabling NVM for Data-Intensive Scientific Services

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Carns, Philip; Jenkins, John; Seo, Sangmin

    Specialized, transient data services are playing an increasingly prominent role in data-intensive scientific computing. These services offer flexible, on-demand pairing of applications with storage hardware using semantics that are optimized for the problem domain. Concurrent with this trend, upcoming scientific computing and big data systems will be deployed with emerging NVM technology to achieve the highest possible price/productivity ratio. Clearly, therefore, we must develop techniques to facilitate the confluence of specialized data services and NVM technology. In this work we explore how to enable the composition of NVM resources within transient distributed services while still retaining their essential performance characteristics. Our approach involves eschewing the conventional distributed file system model and instead projecting NVM devices as remote microservices that leverage user-level threads, RPC services, RMA-enabled network transports, and persistent memory libraries in order to maximize performance. We describe a prototype system that incorporates these concepts, evaluate its performance for key workloads on an exemplar system, and discuss how the system can be leveraged as a component of future data-intensive architectures.
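    The microservice-style access model described above (NVM reached through remote procedure calls rather than a distributed file system) can be sketched very roughly with standard-library Python, using a memory-mapped file as a stand-in for a persistent-memory device. The real prototype relies on user-level threads, RMA-capable transports, and persistent-memory libraries that this toy omits; the path, port, and region size are hypothetical.

        import mmap
        import os
        from xmlrpc.server import SimpleXMLRPCServer

        REGION_PATH = "/tmp/fake_pmem.img"   # stand-in for a persistent-memory device
        REGION_SIZE = 1 << 20                # 1 MiB region, chosen arbitrarily

        with open(REGION_PATH, "wb") as f:
            f.truncate(REGION_SIZE)
        region = mmap.mmap(os.open(REGION_PATH, os.O_RDWR), REGION_SIZE)

        def put(offset, data):
            """Write a string at a byte offset and persist it."""
            payload = data.encode()
            region[offset:offset + len(payload)] = payload
            region.flush()                   # analogous to flushing to persistent media
            return len(payload)

        def get(offset, length):
            """Read bytes back from the region."""
            return region[offset:offset + length].decode(errors="replace")

        server = SimpleXMLRPCServer(("localhost", 8000), allow_none=True)
        server.register_function(put)
        server.register_function(get)
        server.serve_forever()               # clients call put/get over RPC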

  10. Science in the cloud (SIC): A use case in MRI connectomics.

    PubMed

    Kiar, Gregory; Gorgolewski, Krzysztof J; Kleissas, Dean; Roncal, William Gray; Litt, Brian; Wandell, Brian; Poldrack, Russel A; Wiener, Martin; Vogelstein, R Jacob; Burns, Randal; Vogelstein, Joshua T

    2017-05-01

    Modern technologies are enabling scientists to collect extraordinary amounts of complex and sophisticated data across a huge range of scales like never before. With this onslaught of data, we can allow the focal point to shift from data collection to data analysis. Unfortunately, lack of standardized sharing mechanisms and practices often make reproducing or extending scientific results very difficult. With the creation of data organization structures and tools that drastically improve code portability, we now have the opportunity to design such a framework for communicating extensible scientific discoveries. Our proposed solution leverages these existing technologies and standards, and provides an accessible and extensible model for reproducible research, called 'science in the cloud' (SIC). Exploiting scientific containers, cloud computing, and cloud data services, we show the capability to compute in the cloud and run a web service that enables intimate interaction with the tools and data presented. We hope this model will inspire the community to produce reproducible and, importantly, extensible results that will enable us to collectively accelerate the rate at which scientific breakthroughs are discovered, replicated, and extended. © The Author 2017. Published by Oxford University Press.

  11. Using spatial principles to optimize distributed computing for enabling the physical science discoveries

    PubMed Central

    Yang, Chaowei; Wu, Huayi; Huang, Qunying; Li, Zhenlong; Li, Jing

    2011-01-01

    Contemporary physical science studies rely on the effective analyses of geographically dispersed spatial data and simulations of physical phenomena. Single computers and generic high-end computing are not sufficient to process the data for complex physical science analysis and simulations, which can be successfully supported only through distributed computing, best optimized through the application of spatial principles. Spatial computing, the computing aspect of a spatial cyberinfrastructure, refers to a computing paradigm that utilizes spatial principles to optimize distributed computers to catalyze advancements in the physical sciences. Spatial principles govern the interactions between scientific parameters across space and time by providing the spatial connections and constraints to drive the progression of the phenomena. Therefore, spatial computing studies could better position us to leverage spatial principles in simulating physical phenomena and, by extension, advance the physical sciences. Using geospatial science as an example, this paper illustrates through three research examples how spatial computing could (i) enable data intensive science with efficient data/services search, access, and utilization, (ii) facilitate physical science studies with enabling high-performance computing capabilities, and (iii) empower scientists with multidimensional visualization tools to understand observations and simulations. The research examples demonstrate that spatial computing is of critical importance to design computing methods to catalyze physical science studies with better data access, phenomena simulation, and analytical visualization. We envision that spatial computing will become a core technology that drives fundamental physical science advancements in the 21st century. PMID:21444779
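    One simple way to picture the "spatial principles" argument is locality-aware partitioning: tiles of a gridded domain that border each other are kept on the same worker so their interactions never cross the network. The sketch below is only an illustration of that idea in Python; the grid size, worker count, and row-block strategy are assumptions, not the authors' system.

        import math

        # Locality-aware assignment of grid tiles to workers: contiguous row
        # blocks keep neighbouring tiles together, so halo exchanges stay local.
        def assign_tiles(nx_tiles, ny_tiles, n_workers):
            """Map each (i, j) tile index to a worker id."""
            rows_per_worker = math.ceil(nx_tiles / n_workers)
            return {(i, j): min(i // rows_per_worker, n_workers - 1)
                    for i in range(nx_tiles) for j in range(ny_tiles)}

        mapping = assign_tiles(nx_tiles=8, ny_tiles=8, n_workers=4)
        # Tiles (0, 0) and (1, 0) share a boundary and land on the same worker.
        print(mapping[(0, 0)], mapping[(1, 0)], mapping[(7, 7)])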

  12. Using spatial principles to optimize distributed computing for enabling the physical science discoveries.

    PubMed

    Yang, Chaowei; Wu, Huayi; Huang, Qunying; Li, Zhenlong; Li, Jing

    2011-04-05

    Contemporary physical science studies rely on the effective analyses of geographically dispersed spatial data and simulations of physical phenomena. Single computers and generic high-end computing are not sufficient to process the data for complex physical science analysis and simulations, which can be successfully supported only through distributed computing, best optimized through the application of spatial principles. Spatial computing, the computing aspect of a spatial cyberinfrastructure, refers to a computing paradigm that utilizes spatial principles to optimize distributed computers to catalyze advancements in the physical sciences. Spatial principles govern the interactions between scientific parameters across space and time by providing the spatial connections and constraints to drive the progression of the phenomena. Therefore, spatial computing studies could better position us to leverage spatial principles in simulating physical phenomena and, by extension, advance the physical sciences. Using geospatial science as an example, this paper illustrates through three research examples how spatial computing could (i) enable data intensive science with efficient data/services search, access, and utilization, (ii) facilitate physical science studies with enabling high-performance computing capabilities, and (iii) empower scientists with multidimensional visualization tools to understand observations and simulations. The research examples demonstrate that spatial computing is of critical importance to design computing methods to catalyze physical science studies with better data access, phenomena simulation, and analytical visualization. We envision that spatial computing will become a core technology that drives fundamental physical science advancements in the 21st century.

  13. Data-Base Software For Tracking Technological Developments

    NASA Technical Reports Server (NTRS)

    Aliberti, James A.; Wright, Simon; Monteith, Steve K.

    1996-01-01

    The Technology Tracking System (TechTracS) computer program was developed for storing and retrieving information on technologies, and related patent information, developed under the auspices of NASA Headquarters and NASA's field centers. The contents of the database include multiple scanned still images and QuickTime movies as well as text. TechTracS includes word-processing, report-editing, chart-and-graph-editing, and search-editing subprograms. Extensive keyword searching capabilities enable rapid location of technologies, innovators, and companies. The system performs routine functions automatically and serves multiple users.

  14. Switched on or switched off? A survey of mobile, computer and Internet use in a community mental health rehabilitation sample.

    PubMed

    Tobitt, Simon; Percival, Robert

    2017-07-04

    UK society is undergoing a technological revolution, including meeting health needs through technology. Government policy is shifting towards a "digital by default" position. Studies have trialled health technology interventions for those experiencing psychosis and shown them to be useful. This study aimed to gauge levels of engagement with mobile phones (Internet-enabled or cell phone), computers and the Internet in the specific population of community mental health rehabilitation. Two surveys were conducted: with service-users, on use/non-use of technologies and interest in technology interventions and support; and with placements, on facilities and support available to service-users. Levels of engagement in this population were substantially lower than those recorded in the general UK and other clinical populations: 40.2% regularly use mobiles, 17.5% computers, and 14.4% the Internet. Users of all three technologies were significantly younger than non-users. Users of mobiles and computers were significantly more likely to live in lower-support/higher-independence placements. Of surveyed placements, 35.5% provide a communal computer and 38.7% IT skills sessions. Community mental health rehabilitation service-users risk finding themselves excluded by a "digital divide". Action is needed to ensure equal access to online opportunities, including healthcare innovations. Clinical and policy implications are discussed.

  15. Computer methods in designing tourist equipment for people with disabilities

    NASA Astrophysics Data System (ADS)

    Zuzda, Jolanta GraŻyna; Borkowski, Piotr; Popławska, Justyna; Latosiewicz, Robert; Moska, Eleonora

    2017-11-01

    Modern technologies enable disabled people to enjoy physical activity every day. Many new structures are matched individually and created for people who fancy active tourism, giving them wider opportunities for active pastime. The process of creating this type of devices in every stage, from initial design through assessment to validation, is assisted by various types of computer support software.

  16. Computer Science (CS) Education in Indian Schools: Situation Analysis Using Darmstadt Model

    ERIC Educational Resources Information Center

    Raman, Raghu; Venkatasubramanian, Smrithi; Achuthan, Krishnashree; Nedungadi, Prema

    2015-01-01

    Computer science (CS) and its enabling technologies are at the heart of this information age, yet its adoption as a core subject by senior secondary students in Indian schools is low and has not reached critical mass. Though there have been efforts to create core curriculum standards for subjects like Physics, Chemistry, Biology, and Math, CS…

  17. Rapid Development and Distribution of Mobile Media-Rich Clinical Practice Guidelines Nationwide in Colombia.

    PubMed

    Flórez-Arango, José F; Sriram Iyengar, M; Caicedo, Indira T; Escobar, German

    2017-01-01

    Development and electronic distribution of clinical practice guidelines is costly and challenging. This poster presents a rapid method to represent existing guidelines in an auditable, computer-executable multimedia format. We used a technology that enables a small number of clinicians to develop, in a short period of time, a substantial amount of computer-executable guidelines without programming.

  18. Immersive, Interactive, Web-Enabled Computer Simulation as a Trigger for Learning: The next Generation of Problem-Based Learning in Educational Leadership

    ERIC Educational Resources Information Center

    Mann, Dale; Reardon, R. M.; Becker, J. D.; Shakeshaft, C.; Bacon, Nicholas

    2011-01-01

    This paper describes the use of advanced computer technology in an innovative educational leadership program. This program integrates full-motion video scenarios that simulate the leadership challenges typically faced by principals over the course of a full school year. These scenarios require decisions that are then coupled to consequences and…

  19. Assisting People with Developmental Disabilities to Improve Computer Pointing Efficiency through Multiple Mice and Automatic Pointing Assistive Programs

    ERIC Educational Resources Information Center

    Shih, Ching-Hsiang

    2011-01-01

    This study combines multi-mice technology (people with disabilities can use standard mice, instead of specialized alternative computer input devices, to achieve complete mouse operation) with an assistive pointing function (i.e. cursor-capturing, which enables the user to move the cursor to the target center automatically), to assess whether two…

  20. Performance analysis of a parallel Monte Carlo code for simulating solar radiative transfer in cloudy atmospheres using CUDA-enabled NVIDIA GPU

    NASA Astrophysics Data System (ADS)

    Russkova, Tatiana V.

    2017-11-01

    One tool to improve the performance of Monte Carlo methods for numerical simulation of light transport in the Earth's atmosphere is parallel technology. A new algorithm oriented to parallel execution on CUDA-enabled NVIDIA graphics processors is discussed. The efficiency of parallelization is analyzed on the basis of calculating the upward and downward fluxes of solar radiation in both vertically homogeneous and inhomogeneous models of the atmosphere. The results of testing the new code under various atmospheric conditions, including continuous single-layered and multilayered clouds and selective molecular absorption, are presented. The results of testing the code using video cards with different compute capability are analyzed. It is shown that the changeover of computing from conventional PCs to the architecture of graphics processors gives more than a hundredfold increase in performance and fully reveals the capabilities of the technology used.
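    The one-thread-per-photon parallelization pattern discussed above can be sketched in Python with Numba's CUDA support, which the snippet below assumes is installed along with a CUDA-capable GPU. It estimates the direct transmittance of a homogeneous slab, a toy stand-in for the full radiative transfer model; the optical depth and photon count are arbitrary.

        import math
        import numpy as np
        from numba import cuda
        from numba.cuda.random import (create_xoroshiro128p_states,
                                       xoroshiro128p_uniform_float32)

        TAU = 1.5            # optical depth of a homogeneous slab (illustrative)
        N_PHOTONS = 1 << 20  # one GPU thread per photon

        @cuda.jit
        def survive(rng_states, tau, out):
            i = cuda.grid(1)
            if i < out.size:
                u = xoroshiro128p_uniform_float32(rng_states, i)
                path = -math.log(1.0 - u)        # sample an exponential free path
                out[i] = 1.0 if path > tau else 0.0

        threads = 256
        blocks = (N_PHOTONS + threads - 1) // threads
        rng_states = create_xoroshiro128p_states(threads * blocks, seed=1)
        out = cuda.device_array(N_PHOTONS, dtype=np.float32)
        survive[blocks, threads](rng_states, TAU, out)

        # Monte Carlo estimate of exp(-tau), averaged on the host
        print(out.copy_to_host().mean(), math.exp(-TAU))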

  1. Big Data: An Opportunity for Collaboration with Computer Scientists on Data-Driven Science

    NASA Astrophysics Data System (ADS)

    Baru, C.

    2014-12-01

    Big data technologies are evolving rapidly, driven by the need to manage ever increasing amounts of historical data; process relentless streams of human and machine-generated data; and integrate data of heterogeneous structure from extremely heterogeneous sources of information. Big data is inherently an application-driven problem. Developing the right technologies requires an understanding of the applications domain. An intriguing aspect of this phenomenon, though, is that the availability of the data itself enables new applications not previously conceived of! In this talk, we will discuss how the big data phenomenon creates an imperative for collaboration among domain scientists (in this case, geoscientists) and computer scientists. Domain scientists provide the application requirements as well as insights about the data involved, while computer scientists help assess whether problems can be solved with currently available technologies or require adaptation of existing technologies and/or development of new technologies. The synergy can create vibrant collaborations potentially leading to new science insights as well as development of new data technologies and systems. The area of interface between geosciences and computer science, also referred to as geoinformatics, is, we believe, a fertile area for interdisciplinary research.

  2. An Overview of the NASA Aerospace Flight Battery Systems Program

    NASA Technical Reports Server (NTRS)

    Manzo, Michelle

    2003-01-01

    Develop an understanding of the safety issues relating to space use and qualification of new Li-Ion technology for manned applications. Enable use of new technology batteries in GFE equipment, such as laptop computers and camcorders. Establish a database for an optimized set of cells (and batteries) exhibiting acceptable performance and abuse characteristics for utilization as building blocks for numerous applications.

  3. Tapping into Graduate Students' Collaborative Technology Experience in a Research Methods Class: Insights on Teaching Research Methods in a Malaysian and American Setting

    ERIC Educational Resources Information Center

    Vasquez-Colina, Maria D.; Maslin-Ostrowski, Pat; Baba, Suria

    2017-01-01

    This case study used qualitative and quantitative methods to investigate challenges of learning and teaching research methods by examining graduate students' use of collaborative technology (i.e., digital tools that enable collaboration and information seeking such as software and social media) and students' computer self-efficacy. We conducted…

  4. Post-Coma Persons with Motor and Communication/Consciousness Impairments Choose among Environmental Stimuli and Request Stimulus Repetitions via Assistive Technology

    ERIC Educational Resources Information Center

    Lancioni, Giulio E.; Singh, Nirbhay N.; O'Reilly, Mark F.; Sigafoos, Jeff; Buonocunto, Francesca; Sacco, Valentina; Colonna, Fabio; Navarro, Jorge; Lanzilotti, Crocifissa; Oliva, Doretta; Megna, Gianfranco

    2010-01-01

    This study assessed whether a program based on microswitch and computer technology would enable three post-coma participants (adults) with motor and communication/consciousness impairments to choose among environmental stimuli and request their repetition whenever they so desired. Within each session, 16 stimuli (12 preferred and 4 non-preferred)…

  5. The Use of Weblog in Language Learning: Motivation of Second Language Learners in Reading Classroom

    ERIC Educational Resources Information Center

    Sulaiman, Ahmad Nasaruddin; Kassim, Asiah

    2010-01-01

    The age of technology has enabled learners to interact with other users outside the four walls of the classroom. Weblogs, in particular, provide a channel for asynchronous computer-mediated communication to take place in the learning process. Motivation is one of the learning aspects that is greatly enhanced by the use of technology.…

  6. The Lighter Side of Things: The Inevitable Convergence of the Internet of Things and Cybersecurity

    NASA Technical Reports Server (NTRS)

    Davis, Jerry

    2017-01-01

    By the year 2020 it is estimated that there will be more than 50 billion devices connected to the Internet. These devices not only include traditional electronics such as smartphones and other mobile compute devices, but also eEnabled technologies such as cars, airplanes and smartgrids. The IoT brings with it the promise of efficiency, greater remote management of industrial processes and further opens the doors to world of vehicle autonomy. However, IoT enabled technology will have to operate and contend in the contested domain of cyberspace. This discussion will touch on the impact that cybersecurity has on IoT and the people, processes and technology required to mitigate cyber risks.

  7. Comparison of Scientific Calipers and Computer-Enabled CT Review for the Measurement of Skull Base and Craniomaxillofacial Dimensions

    PubMed Central

    Citardi, Martin J.; Herrmann, Brian; Hollenbeak, Chris S.; Stack, Brendan C.; Cooper, Margaret; Bucholz, Richard D.

    2001-01-01

    Traditionally, cadaveric studies and plain-film cephalometrics provided information about craniomaxillofacial proportions and measurements; however, advances in computer technology now permit software-based review of computed tomography (CT)-based models. Distances between standardized anatomic points were measured on five dried human skulls with standard scientific calipers (Geneva Gauge, Albany, NY) and through computer workstation (StealthStation 2.6.4, Medtronic Surgical Navigation Technology, Louisville, CO) review of corresponding CT scans. Differences in measurements between the caliper and CT model were not statistically significant for each parameter. Measurements obtained by computer workstation CT review of the cranial skull base are an accurate representation of actual bony anatomy. Such information has important implications for surgical planning and clinical research. PMID:17167599
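    The computation underlying the CT-based measurements is essentially a Euclidean distance between landmark coordinates scaled by the voxel spacing. The short Python sketch below shows that calculation with made-up voxel indices and spacing; it is illustrative only and does not reproduce the navigation workstation used in the study.

        import numpy as np

        def landmark_distance(idx_a, idx_b, spacing_mm):
            """Euclidean distance (mm) between two voxel-index landmarks."""
            a = np.asarray(idx_a, dtype=float) * np.asarray(spacing_mm, dtype=float)
            b = np.asarray(idx_b, dtype=float) * np.asarray(spacing_mm, dtype=float)
            return float(np.linalg.norm(a - b))

        # Hypothetical (row, column, slice) picks for two skull-base landmarks
        d = landmark_distance((120, 256, 40), (300, 250, 38), spacing_mm=(0.5, 0.5, 1.25))
        print(f"{d:.1f} mm")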

  8. Current Capabilities at SNL for the Integration of Small Modular Reactors onto Smart Microgrids Using Sandia's Smart Microgrid Technology High Performance Computing and Advanced Manufacturing.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rodriguez, Salvador B.

    Smart grids are a crucial component for enabling the nation’s future energy needs, as part of a modernization effort led by the Department of Energy. Smart grids and smart microgrids are being considered in niche applications, and as part of a comprehensive energy strategy to help manage the nation’s growing energy demands, for critical infrastructures, military installations, small rural communities, and large populations with limited water supplies. As part of a far-reaching strategic initiative, Sandia National Laboratories (SNL) presents herein a unique, three-pronged approach to integrate small modular reactors (SMRs) into microgrids, with the goal of providing economically-competitive, reliable, and secure energy to meet the nation’s needs. SNL’s triad methodology involves an innovative blend of smart microgrid technology, high performance computing (HPC), and advanced manufacturing (AM). In this report, Sandia’s current capabilities in those areas are summarized, as well as paths forward that will enable DOE to achieve its energy goals. In the area of smart grid/microgrid technology, Sandia’s current computational capabilities can model the entire grid, including temporal aspects and cyber security issues. Our tools include system development, integration, testing and evaluation, monitoring, and sustainment.

  9. NEXUS - Resilient Intelligent Middleware

    NASA Astrophysics Data System (ADS)

    Kaveh, N.; Hercock, R. Ghanea

    Service-oriented computing, a composition of distributed-object computing, component-based, and Web-based concepts, is becoming the widespread choice for developing dynamic heterogeneous software assets available as services across a network. One of the major strengths of service-oriented technologies is the high abstraction layer and large granularity level at which software assets are viewed compared to traditional object-oriented technologies. Collaboration through encapsulated and separately defined service interfaces creates a service-oriented environment, whereby multiple services can be linked together through their interfaces to compose a functional system. This approach enables better integration of legacy and non-legacy services, via wrapper interfaces, and allows for service composition at a more abstract level especially in cases such as vertical market stacks. The heterogeneous nature of service-oriented technologies and the granularity of their software components makes them a suitable computing model in the pervasive domain.

  10. Health-Enabled Smart Sensor Fusion Technology

    NASA Technical Reports Server (NTRS)

    Wang, Ray

    2012-01-01

    A process was designed to fuse data from multiple sensors in order to make a more accurate estimation of the environment and overall health in an intelligent rocket test facility (IRTF), to provide reliable, high-confidence measurements for a variety of propulsion test articles. The objective of the technology is to provide sensor fusion based on a distributed architecture. Specifically, the fusion technology is intended to provide health condition monitoring capability at the intelligent transceiver, such as RF signal strength, battery reading, computing resource monitoring, and sensor data reading. The technology also provides analytic and diagnostic intelligence at the intelligent transceiver, enhancing the IEEE 1451.x-based standard for sensor data management and distribution, as well as providing appropriate communications protocols to enable complex interactions to support timely and high-quality flow of information among the system elements.
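    As a minimal illustration of the fusion step itself, the sketch below combines redundant sensor readings by inverse-variance weighting, one common way to turn several noisy measurements into a single higher-confidence estimate. The readings and variances are invented; the facility system described above layers transceiver health monitoring, IEEE 1451.x data management, and communications protocols on top of this kind of calculation.

        import numpy as np

        def fuse(readings, variances):
            """Inverse-variance weighted estimate from redundant sensors."""
            w = 1.0 / np.asarray(variances, dtype=float)
            readings = np.asarray(readings, dtype=float)
            estimate = np.sum(w * readings) / np.sum(w)
            fused_variance = 1.0 / np.sum(w)
            return estimate, fused_variance

        # Three hypothetical pressure sensors reading the same test article
        est, var = fuse(readings=[101.2, 99.8, 100.5], variances=[0.4, 0.1, 0.2])
        print(f"fused estimate: {est:.2f} (variance {var:.3f})")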

  11. CICT Computing, Information, and Communications Technology Program

    NASA Technical Reports Server (NTRS)

    Laufenberg, Lawrence; Tu, Eugene (Technical Monitor)

    2002-01-01

    The CICT Program is part of the NASA Aerospace Technology Enterprise's fundamental technology thrust to develop tools, processes, and technologies that enable new aerospace system capabilities and missions. The CICT Program's four key objectives are: provide seamless access to NASA resources, including ground-, air-, and space-based distributed information technology resources, so that NASA scientists and engineers can more easily control missions, make new scientific discoveries, and design the next-generation space vehicles; provide high-rate data delivery from these assets directly to users for missions; develop goal-oriented, human-centered systems; and research, develop and evaluate revolutionary technology.

  12. Envisioning a Future of Computational Geoscience in a Data Rich Semantic World

    NASA Astrophysics Data System (ADS)

    Kumar, P.; Elag, M.; Jiang, P.; Marini, L.

    2015-12-01

    Advances in observational systems and reduction in their cost are allowing us to explore, monitor, and digitally represent our environment in unprecedented detail and over large areas. Low-cost in situ sensors, unmanned autonomous vehicles, imaging technologies, and other new observational approaches, along with airborne and space-borne systems, are allowing us to measure nearly everything, almost everywhere, and at almost all times. Under the aegis of observatories they are enabling an integrated view across space and time scales ranging from storms to seasons to years and, in some cases, decades. Rapid increase in the convergence of computational, communication and information systems and their inter-operability through advances in technologies such as the semantic web can provide opportunities to further facilitate fusion and synthesis of heterogeneous measurements with knowledge systems. This integration can enable us to break disciplinary boundaries and bring sensor data directly to desktop or handheld devices. We describe cyberinfrastructure efforts being developed through projects such as Earthcube Geosemantics (http://geosemantics.hydrocomplexity.net), SEAD (http://sead-data.net/), and Browndog (http://browndog.ncsa.illinois.edu/) so that data across all of earth science can be easily shared and integrated with models. This also includes efforts to enable models to become interoperable among themselves and with data using technologies that enable human-out-of-the-loop integration. Through such technologies our ability to use real-time information for decision-making and scientific investigations will increase multifold. Data go through a sequence of steps, often iterative, from collection to long-term preservation. Similarly, a scientific investigation and its associated outcomes are composed of a number of iterative steps from problem identification to solutions. However, the integration between these two pathways is rather limited. We describe characteristics of new technologies that are needed to bring these processes together in the near future to significantly reduce the latency between data, science, and agile and informed actions that support sustainability.

  13. Environmental Models as a Service: Enabling Interoperability through RESTful Endpoints and API Documentation

    EPA Science Inventory

    Achieving interoperability in environmental modeling has evolved as software technology has progressed. The recent rise of cloud computing and proliferation of web services initiated a new stage for creating interoperable systems. Scientific programmers increasingly take advantag...

  14. Environmental Models as a Service: Enabling Interoperability through RESTful Endpoints and API Documentation.

    EPA Science Inventory

    Achieving interoperability in environmental modeling has evolved as software technology has progressed. The recent rise of cloud computing and proliferation of web services initiated a new stage for creating interoperable systems. Scientific programmers increasingly take advantag...

  15. Transforming System Engineering through Model-Centric Engineering

    DTIC Science & Technology

    2015-11-18

    best practices and provide computational technologies for real-time training within digital engineering environments Multidisciplinary System...MBSE well due to continued training and practicing. While MBSE is a part of the MCE it does not encompass the full idea and enabling technologies of... practices against other Industry contractors and it was believed that ABC was trailing the others in the use of MDAO capabilities. They decided that

  16. Trans-Proteomic Pipeline, a standardized data processing pipeline for large-scale reproducible proteomics informatics

    PubMed Central

    Deutsch, Eric W.; Mendoza, Luis; Shteynberg, David; Slagel, Joseph; Sun, Zhi; Moritz, Robert L.

    2015-01-01

    Democratization of genomics technologies has enabled the rapid determination of genotypes. More recently the democratization of comprehensive proteomics technologies is enabling the determination of the cellular phenotype and the molecular events that define its dynamic state. Core proteomic technologies include mass spectrometry to define protein sequence, protein:protein interactions, and protein post-translational modifications. Key enabling technologies for proteomics are bioinformatic pipelines to identify, quantitate, and summarize these events. The Trans-Proteomics Pipeline (TPP) is a robust open-source standardized data processing pipeline for large-scale reproducible quantitative mass spectrometry proteomics. It supports all major operating systems and instrument vendors via open data formats. Here we provide a review of the overall proteomics workflow supported by the TPP, its major tools, and how it can be used in its various modes from desktop to cloud computing. We describe new features for the TPP, including data visualization functionality. We conclude by describing some common perils that affect the analysis of tandem mass spectrometry datasets, as well as some major upcoming features. PMID:25631240

  17. Trans-Proteomic Pipeline, a standardized data processing pipeline for large-scale reproducible proteomics informatics.

    PubMed

    Deutsch, Eric W; Mendoza, Luis; Shteynberg, David; Slagel, Joseph; Sun, Zhi; Moritz, Robert L

    2015-08-01

    Democratization of genomics technologies has enabled the rapid determination of genotypes. More recently the democratization of comprehensive proteomics technologies is enabling the determination of the cellular phenotype and the molecular events that define its dynamic state. Core proteomic technologies include MS to define protein sequence, protein:protein interactions, and protein PTMs. Key enabling technologies for proteomics are bioinformatic pipelines to identify, quantitate, and summarize these events. The Trans-Proteomics Pipeline (TPP) is a robust open-source standardized data processing pipeline for large-scale reproducible quantitative MS proteomics. It supports all major operating systems and instrument vendors via open data formats. Here, we provide a review of the overall proteomics workflow supported by the TPP, its major tools, and how it can be used in its various modes from desktop to cloud computing. We describe new features for the TPP, including data visualization functionality. We conclude by describing some common perils that affect the analysis of MS/MS datasets, as well as some major upcoming features. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  18. Template Interfaces for Agile Parallel Data-Intensive Science

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ramakrishnan, Lavanya; Gunter, Daniel; Pastorello, Gilerto Z.

    Tigres provides a programming library to compose and execute large-scale data-intensive scientific workflows from desktops to supercomputers. DOE User Facilities and large science collaborations are increasingly generating large enough data sets that it is no longer practical to download them to a desktop to operate on them. They are instead stored at centralized compute and storage resources such as high performance computing (HPC) centers. Analysis of this data requires an ability to run on these facilities, but with current technologies, scaling an analysis to an HPC center and to a large data set is difficult even for experts. Tigres is addressing the challenge of enabling collaborative analysis of DOE Science data through a new concept of reusable "templates" that enable scientists to easily compose, run and manage collaborative computational tasks. These templates define common computation patterns used in analyzing a data set.
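    The "template" idea, reusable building blocks that capture common computation patterns, can be sketched generically in Python as below. The names sequence and parallel and their signatures are hypothetical illustrations of the concept, not the actual Tigres API.

        from concurrent.futures import ProcessPoolExecutor

        def sequence(data, *tasks):
            """Run tasks one after another, feeding each output to the next."""
            for task in tasks:
                data = task(data)
            return data

        def parallel(items, task, max_workers=4):
            """Apply one task independently to every item (fan-out pattern)."""
            with ProcessPoolExecutor(max_workers=max_workers) as pool:
                return list(pool.map(task, items))

        def calibrate(x):
            return x * 0.98  # stand-in for a per-file analysis step

        if __name__ == "__main__":
            raw = [0.7, 1.3, 2.4, 0.9]
            result = sequence(parallel(raw, calibrate), sum)  # fan-out, then reduce
            print(result)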

  19. TK3 eBook software to author, distribute, and use electronic course content for medical education.

    PubMed

    Morton, David A; Foreman, K Bo; Goede, Patricia A; Bezzant, John L; Albertine, Kurt H

    2007-03-01

    The methods for authoring and distributing course content are undergoing substantial changes due to advancement in computer technology. Paper has been the traditional method to author and distribute course content. Paper enables students to personalize content through highlighting and note taking but does not enable the incorporation of multimedia elements. Computers enable multimedia content but have lacked the capability for the user to personalize the content. Therefore, we investigated TK3 eBooks as a potential solution to incorporate the benefits of both paper and computer technology. The objective of our study was to assess the utility of TK3 eBooks in the context of authoring and distributing dermatology course content for use by second-year medical students at the University of Utah School of Medicine during the spring of 2004. We incorporated all dermatology course content into TK3 eBook format. TK3 eBooks enable students to personalize information through tools such as "notebook," "hiliter," "stickies," mark pages, and keyword search. Students were given the course content in both paper and eBook formats. At the conclusion of the dermatology course, students completed a questionnaire designed to evaluate the effectiveness of the eBooks compared with paper. Students perceived eBooks as an effective way to distribute course content and as a study tool. However, students preferred paper over eBooks to take notes during lecture. In conclusion, the present study demonstrated that eBooks provide a convenient method for authoring, distributing, and using course content but that students preferred paper to take notes during lecture.

  20. Understanding access and use of technology among youth with first-episode psychosis to inform the development of technology-enabled therapeutic interventions.

    PubMed

    Abdel-Baki, Amal; Lal, Shalini; D-Charron, Olivier; Stip, Emmanuel; Kara, Nadjia

    2017-02-01

    Computers, video games and technological devices are part of young people's everyday lives. However, their use in first-episode psychosis (FEP) treatment is rare. The purpose of this study was to better understand the access and use of technology among individuals with FEP, including gaming activities, to inform future development of technology-enabled therapeutic applications. A self-administered survey on use of technological tools was completed by 71 FEP individuals. PCs/laptops were used by all participants; cellphones/smartphones by 92%; and consoles by 83% (mainly male and younger participants). Women texted and used social networks more frequently; men played games (mainly action) more often. The younger individuals reported playing games frequently (32% daily) with less use of the Web and social networks (favourite: Facebook). These data will be useful for developing Web-based psychoeducation tools and cognitive remediation video games for youth with FEP. © 2015 Wiley Publishing Asia Pty Ltd.

  1. Human and Robotic Space Mission Use Cases for High-Performance Spaceflight Computing

    NASA Technical Reports Server (NTRS)

    Doyle, Richard; Bergman, Larry; Some, Raphael; Whitaker, William; Powell, Wesley; Johnson, Michael; Goforth, Montgomery; Lowry, Michael

    2013-01-01

    Spaceflight computing is a key resource in NASA space missions and a core determining factor of spacecraft capability, with ripple effects throughout the spacecraft, end-to-end system, and the mission; it can be aptly viewed as a "technology multiplier" in that advances in onboard computing provide dramatic improvements in flight functions and capabilities across the NASA mission classes, and will enable new flight capabilities and mission scenarios, increasing science and exploration return per mission-dollar.

  2. Large-scale neural circuit mapping data analysis accelerated with the graphical processing unit (GPU).

    PubMed

    Shi, Yulin; Veidenbaum, Alexander V; Nicolau, Alex; Xu, Xiangmin

    2015-01-15

    Modern neuroscience research demands computing power. Neural circuit mapping studies such as those using laser scanning photostimulation (LSPS) produce large amounts of data and require intensive computation for post hoc processing and analysis. Here we report on the design and implementation of a cost-effective desktop computer system for accelerated experimental data processing with recent GPU computing technology. A new version of Matlab software with GPU enabled functions is used to develop programs that run on Nvidia GPUs to harness their parallel computing power. We evaluated both the central processing unit (CPU) and GPU-enabled computational performance of our system in benchmark testing and practical applications. The experimental results show that the GPU-CPU co-processing of simulated data and actual LSPS experimental data clearly outperformed the multi-core CPU with up to a 22× speedup, depending on computational tasks. Further, we present a comparison of numerical accuracy between GPU and CPU computation to verify the precision of GPU computation. In addition, we show how GPUs can be effectively adapted to improve the performance of commercial image processing software such as Adobe Photoshop. To our best knowledge, this is the first demonstration of GPU application in neural circuit mapping and electrophysiology-based data processing. Together, GPU enabled computation enhances our ability to process large-scale data sets derived from neural circuit mapping studies, allowing for increased processing speeds while retaining data precision. Copyright © 2014 Elsevier B.V. All rights reserved.
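    The study above used MATLAB's GPU-enabled functions; the same GPU/CPU co-processing pattern can be sketched in Python with CuPy, which mirrors the NumPy API. The snippet assumes CuPy and a CUDA-capable GPU are available, and the random array is a stand-in for photostimulation map data; timings will vary by hardware.

        import time
        import numpy as np
        import cupy as cp   # assumes CuPy and a CUDA-capable GPU

        frames = np.random.rand(512, 512, 200).astype(np.float32)  # stand-in data

        t0 = time.perf_counter()
        cpu_result = np.fft.fft2(frames, axes=(0, 1))              # CPU processing
        cpu_s = time.perf_counter() - t0

        frames_gpu = cp.asarray(frames)                            # move data to GPU once
        t0 = time.perf_counter()
        gpu_result = cp.fft.fft2(frames_gpu, axes=(0, 1))          # same call on the GPU
        cp.cuda.Stream.null.synchronize()                          # wait for kernels to finish
        gpu_s = time.perf_counter() - t0

        print(f"CPU {cpu_s:.2f} s vs GPU {gpu_s:.2f} s")
        # Numerical agreement check, analogous to the precision comparison above
        print(np.allclose(cpu_result, cp.asnumpy(gpu_result), atol=1e-3))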

  3. Large scale neural circuit mapping data analysis accelerated with the graphical processing unit (GPU)

    PubMed Central

    Shi, Yulin; Veidenbaum, Alexander V.; Nicolau, Alex; Xu, Xiangmin

    2014-01-01

    Background: Modern neuroscience research demands computing power. Neural circuit mapping studies such as those using laser scanning photostimulation (LSPS) produce large amounts of data and require intensive computation for post-hoc processing and analysis. New Method: Here we report on the design and implementation of a cost-effective desktop computer system for accelerated experimental data processing with recent GPU computing technology. A new version of Matlab software with GPU enabled functions is used to develop programs that run on Nvidia GPUs to harness their parallel computing power. Results: We evaluated both the central processing unit (CPU) and GPU-enabled computational performance of our system in benchmark testing and practical applications. The experimental results show that the GPU-CPU co-processing of simulated data and actual LSPS experimental data clearly outperformed the multi-core CPU with up to a 22x speedup, depending on computational tasks. Further, we present a comparison of numerical accuracy between GPU and CPU computation to verify the precision of GPU computation. In addition, we show how GPUs can be effectively adapted to improve the performance of commercial image processing software such as Adobe Photoshop. Comparison with Existing Method(s): To our best knowledge, this is the first demonstration of GPU application in neural circuit mapping and electrophysiology-based data processing. Conclusions: Together, GPU enabled computation enhances our ability to process large-scale data sets derived from neural circuit mapping studies, allowing for increased processing speeds while retaining data precision. PMID:25277633

  4. CE-ACCE: The Cloud Enabled Advanced sCience Compute Environment

    NASA Astrophysics Data System (ADS)

    Cinquini, L.; Freeborn, D. J.; Hardman, S. H.; Wong, C.

    2017-12-01

    Traditionally, Earth Science data from NASA remote sensing instruments has been processed by building custom data processing pipelines (often based on a common workflow engine or framework) which are typically deployed and run on an internal cluster of computing resources. This approach has some intrinsic limitations: it requires each mission to develop and deploy a custom software package on top of the adopted framework; it makes use of dedicated hardware, network and storage resources, which must be specifically purchased, maintained and re-purposed at mission completion; and computing services cannot be scaled on demand beyond the capability of the available servers. More recently, the rise of Cloud computing, coupled with other advances in containerization technology (most prominently, Docker) and micro-services architecture, has enabled a new paradigm, whereby space mission data can be processed through standard system architectures, which can be seamlessly deployed and scaled on demand on either on-premise clusters, or commercial Cloud providers. In this talk, we will present one such architecture named CE-ACCE ("Cloud Enabled Advanced sCience Compute Environment"), which we have been developing at the NASA Jet Propulsion Laboratory over the past year. CE-ACCE is based on the Apache OODT ("Object Oriented Data Technology") suite of services for full data lifecycle management, which are turned into a composable array of Docker images, and complemented by a plug-in model for mission-specific customization. We have applied this infrastructure to both flying and upcoming NASA missions, such as ECOSTRESS and SMAP, and demonstrated deployment on the Amazon Cloud, either using simple EC2 instances, or advanced AWS services such as Amazon Lambda and ECS (EC2 Container Services).
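    A very small Python sketch of the container-per-processing-step idea described above: a processing image is launched on demand against staged input via the Docker CLI. The image name, mount paths, and command are placeholders, not the actual CE-ACCE or OODT configuration.

        import subprocess

        def run_step(image, input_path, output_dir):
            """Run one containerized processing step against staged input."""
            cmd = [
                "docker", "run", "--rm",
                "-v", f"{input_path}:/data/input:ro",
                "-v", f"{output_dir}:/data/output",
                image,
                "process", "/data/input", "/data/output",
            ]
            subprocess.run(cmd, check=True)   # raise if the step fails

        # Hypothetical image and paths, purely for illustration
        run_step("example/l2-processor:latest", "/tmp/granule.h5", "/tmp/products")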

  5. NASA HPCC Technology for Aerospace Analysis and Design

    NASA Technical Reports Server (NTRS)

    Schulbach, Catherine H.

    1999-01-01

    The Computational Aerosciences (CAS) Project is part of NASA's High Performance Computing and Communications Program. Its primary goal is to accelerate the availability of high-performance computing technology to the US aerospace community, thus providing it with key tools necessary to reduce design cycle times and increase fidelity in order to improve safety, efficiency and capability of future aerospace vehicles. A complementary goal is to hasten the emergence of a viable commercial market within the aerospace community, to the advantage of the domestic computer hardware and software industry. The CAS Project selects representative aerospace problems (especially design) and uses them to focus efforts on advancing aerospace algorithms and applications, systems software, and computing machinery to demonstrate vast improvements in system performance and capability over the life of the program. Recent demonstrations have served to assess the benefits of possible performance improvements while reducing the risk of adopting high-performance computing technology. This talk will discuss past accomplishments in providing technology to the aerospace community, present efforts, and future goals. For example, the times to perform full combustor and compressor simulations (of aircraft engines) have been reduced by factors of 320:1 and 400:1, respectively. While this has enabled new capabilities in engine simulation, the goal of an overnight, dynamic, multi-disciplinary, 3-dimensional simulation of an aircraft engine is still years away and will require new generations of high-end technology.

  6. Silicon photonics integrated circuits: a manufacturing platform for high density, low power optical I/O's.

    PubMed

    Absil, Philippe P; Verheyen, Peter; De Heyn, Peter; Pantouvaki, Marianna; Lepage, Guy; De Coster, Jeroen; Van Campenhout, Joris

    2015-04-06

    Silicon photonics integrated circuits are considered to enable future computing systems with optical input-outputs co-packaged with CMOS chips to circumvent the limitations of electrical interfaces. In this paper we present the recent progress made to enable dense multiplexing by exploiting the integration advantage of silicon photonics integrated circuits. We also discuss the manufacturability of such circuits, a key factor for a wide adoption of this technology.

  7. The Galaxy platform for accessible, reproducible and collaborative biomedical analyses: 2016 update

    PubMed Central

    Afgan, Enis; Baker, Dannon; van den Beek, Marius; Blankenberg, Daniel; Bouvier, Dave; Čech, Martin; Chilton, John; Clements, Dave; Coraor, Nate; Eberhard, Carl; Grüning, Björn; Guerler, Aysam; Hillman-Jackson, Jennifer; Von Kuster, Greg; Rasche, Eric; Soranzo, Nicola; Turaga, Nitesh; Taylor, James; Nekrutenko, Anton; Goecks, Jeremy

    2016-01-01

    High-throughput data production technologies, particularly ‘next-generation’ DNA sequencing, have ushered in widespread and disruptive changes to biomedical research. Making sense of the large datasets produced by these technologies requires sophisticated statistical and computational methods, as well as substantial computational power. This has led to an acute crisis in life sciences, as researchers without informatics training attempt to perform computation-dependent analyses. Since 2005, the Galaxy project has worked to address this problem by providing a framework that makes advanced computational tools usable by non-experts. Galaxy seeks to make data-intensive research more accessible, transparent and reproducible by providing a Web-based environment in which users can perform computational analyses and have all of the details automatically tracked for later inspection, publication, or reuse. In this report we highlight recently added features enabling biomedical analyses on a large scale. PMID:27137889

  8. Integrating Cloud-Computing-Specific Model into Aircraft Design

    NASA Astrophysics Data System (ADS)

    Zhimin, Tian; Qi, Lin; Guangwen, Yang

    Cloud computing is becoming increasingly relevant, as it will enable the companies involved in spreading this technology to open the door to Web 3.0. The new categories of services it introduces will gradually replace many types of computational resources currently in use. In this perspective, grid computing, the basic element for the large-scale supply of cloud services, will play a fundamental role in defining how those services are provided. This paper attempts to integrate a cloud-computing-specific model into aircraft design. The work has achieved good results in sharing licenses of large-scale, expensive software such as CFD (Computational Fluid Dynamics), UG, and CATIA.

  9. Using speech recognition to enhance the Tongue Drive System functionality in computer access.

    PubMed

    Huo, Xueliang; Ghovanloo, Maysam

    2011-01-01

    Tongue Drive System (TDS) is a wireless tongue-operated assistive technology (AT), which can enable people with severe physical disabilities to access computers and drive powered wheelchairs using their volitional tongue movements. TDS offers six discrete commands, simultaneously available to the users, for pointing and typing as a substitute for mouse and keyboard in computer access, respectively. To enhance the TDS performance in typing, we have added a microphone, an audio codec, and a wireless audio link to its readily available 3-axial magnetic sensor array, and combined it with commercially available speech recognition software, Dragon NaturallySpeaking, which is regarded as one of the most efficient means of text entry. Our preliminary evaluations indicate that the combined TDS and speech recognition technologies can provide end users with significantly higher performance than using each technology alone, particularly in completing tasks that require both pointing and text entry, such as web surfing.

  10. Virtual reality applications to automated rendezvous and capture

    NASA Technical Reports Server (NTRS)

    Hale, Joseph; Oneil, Daniel

    1991-01-01

    Virtual Reality (VR) is a rapidly developing Human/Computer Interface (HCI) technology. The evolution of high-speed graphics processors and the development of specialized anthropomorphic user interface devices, which more fully involve the human senses, have enabled VR technology. Recently, the maturity of this technology has reached a level where it can be used as a tool in a variety of applications. This paper provides an overview of VR technology, VR activities at Marshall Space Flight Center (MSFC), and applications of VR to Automated Rendezvous and Capture (AR&C), and identifies areas of VR technology that require further development.

  11. Software for Classroom Music Making.

    ERIC Educational Resources Information Center

    Ely, Mark C.

    1992-01-01

    Describes musical instrument digital interface (MIDI), a communication system that uses digital data to enable MIDI-equipped instruments to communicate with each other. Includes discussion of music editors, sequencers, compositional software, and commonly used computers. Suggests uses for the technology for students and teachers. Urges further…

  12. Environmental Models as a Service: Enabling Interoperability through RESTful Endpoints and API Documentation (presentation)

    EPA Science Inventory

    Achieving interoperability in environmental modeling has evolved as software technology has progressed. The recent rise of cloud computing and proliferation of web services initiated a new stage for creating interoperable systems. Scientific programmers increasingly take advantag...

  13. Enabling Analytics in the Cloud for Earth Science Data

    NASA Technical Reports Server (NTRS)

    Ramachandran, Rahul; Lynnes, Christopher; Bingham, Andrew W.; Quam, Brandi M.

    2018-01-01

    The purpose of this workshop was to hold interactive discussions where providers, users, and other stakeholders could explore the convergence of three main elements in the rapidly developing world of technology: Big Data, Cloud Computing, and Analytics, [for earth science data].

  14. Computer-Based Technologies in Dentistry: Types and Applications

    PubMed Central

    Albuha Al-Mussawi, Raja’a M.; Farid, Farzaneh

    2016-01-01

    During dental education, dental students learn how to examine patients, make diagnoses, plan treatment and perform dental procedures perfectly and efficiently. However, progress in computer-based technologies including virtual reality (VR) simulators, augmented reality (AR) and computer aided design/computer aided manufacturing (CAD/CAM) systems has resulted in new modalities for instruction and practice of dentistry. Virtual reality dental simulators enable repeated, objective and assessable practice in various controlled situations. Superimposition of three-dimensional (3D) virtual images on actual images in AR allows surgeons to simultaneously visualize the surgical site and superimpose informative 3D images of invisible regions on the surgical site to serve as a guide. The use of CAD/CAM systems for designing and manufacturing of dental appliances and prostheses has been well established. This article reviews computer-based technologies, their application in dentistry and their potentials and limitations in promoting dental education, training and practice. Practitioners will be able to choose from a broader spectrum of options in their field of practice by becoming familiar with new modalities of training and practice. PMID:28392819

  15. Computer-Based Technologies in Dentistry: Types and Applications.

    PubMed

    Albuha Al-Mussawi, Raja'a M; Farid, Farzaneh

    2016-06-01

    During dental education, dental students learn how to examine patients, make diagnoses, plan treatment and perform dental procedures perfectly and efficiently. However, progress in computer-based technologies including virtual reality (VR) simulators, augmented reality (AR) and computer aided design/computer aided manufacturing (CAD/CAM) systems has resulted in new modalities for instruction and practice of dentistry. Virtual reality dental simulators enable repeated, objective and assessable practice in various controlled situations. Superimposition of three-dimensional (3D) virtual images on actual images in AR allows surgeons to simultaneously visualize the surgical site and superimpose informative 3D images of invisible regions on the surgical site to serve as a guide. The use of CAD/CAM systems for designing and manufacturing of dental appliances and prostheses has been well established. This article reviews computer-based technologies, their application in dentistry and their potentials and limitations in promoting dental education, training and practice. Practitioners will be able to choose from a broader spectrum of options in their field of practice by becoming familiar with new modalities of training and practice.

  16. High Performance Computing Innovation Service Portal Study (HPC-ISP)

    DTIC Science & Technology

    2009-04-01

    threatened by global competition. It is essential that these suppliers remain competitive and maintain their technological advantage. In this increasingly...place themselves, as well as customers who rely on them, in competitive jeopardy. Despite the potential competitive advantage associated with adopting...computing users into the HPC fold and to enable more entry-level users to exploit HPC more fully for competitive advantage. About half of the surveyed

  17. Investigating the English Language Needs of the Female Students at the Faculty of Computing and Information Technology at King Abdulaziz University in Saudi Arabia

    ERIC Educational Resources Information Center

    Fadel, Sahar; Rajab, Hussam

    2017-01-01

    In the field of computer science, specific English language skills are needed to facilitate the students' academic progress. Needs analysis is generally believed to be an important element in ESP/EAP context because it enables practitioners and curriculum designers to determine the learners' needs in a particular academic context. In this regard,…

  18. Look-ahead Dynamic Simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2015-10-20

    The look-ahead dynamic simulation software system incorporates high-performance parallel computing technologies, significantly reduces the solution time for each transient simulation case, and brings dynamic simulation analysis into on-line applications to enable more transparency for better reliability and asset utilization. It takes a snapshot of the current power grid status, performs the system dynamic simulation in parallel, and outputs the transient response of the power system in real time.

  19. GSFC Cutting Edge Avionics Technologies for Spacecraft

    NASA Technical Reports Server (NTRS)

    Luers, Philip J.; Culver, Harry L.; Plante, Jeannette

    1998-01-01

    With the launch of NASA's first fiber optic bus on SAMPEX in 1992, GSFC has ushered in an era of new technology development and insertion into flight programs. Predating such programs as the Lewis and Clark missions and the New Millennium Program, GSFC has spearheaded the drive to use cutting-edge technologies on spacecraft for three reasons: to enable next-generation Space and Earth Science, to shorten spacecraft development schedules, and to reduce the cost of NASA missions. The technologies developed have addressed three focus areas: standard interface components, high-performance processing, and high-density packaging techniques enabling lower-cost systems. To realize the benefits of standard interface components, GSFC has developed and utilized radiation-hardened/tolerant devices such as PCI target ASICs, Parallel Fiber Optic Data Bus terminals, MIL-STD-1773 and AS1773 transceivers, and the Essential Services Node. High-performance processing has been the focus of the Mongoose I and Mongoose V rad-hard 32-bit processor programs as well as the SMEX-Lite Computation Hub. High-density packaging techniques have resulted in 3-D stacked DRAM packages and chip-on-board processes. Lower-cost systems have been demonstrated by judiciously using all of our technology developments to enable "plug and play" scalable architectures. The paper will present a survey of development and insertion experiences for the above technologies, as well as future plans to enable more "better, faster, cheaper" spacecraft. Details of ongoing GSFC programs such as Ultra-Low Power electronics, rad-hard FPGAs, PCI master ASICs, and next-generation Mongoose processors will also be presented.

  20. Space Technology for Palate Surgery

    NASA Technical Reports Server (NTRS)

    1980-01-01

    University of Miami utilized NASA's spacecraft viewing technology to develop the optical profilometer, which provides more accurate measurements of cleft palate casts than has heretofore been possible, enabling better planning of corrective surgery. The lens-like instrument electronically scans a palate cast, precisely measuring its irregular contours by detecting minute differences in the intensity of a light beam reflected off the cast. Readings are computer processed and delivered to the surgeon by a teleprinter.

  1. Propulsion System Modeling and Simulation

    NASA Technical Reports Server (NTRS)

    Tai, Jimmy C. M.; McClure, Erin K.; Mavris, Dimitri N.; Burg, Cecile

    2002-01-01

    The Aerospace Systems Design Laboratory at the School of Aerospace Engineering in Georgia Institute of Technology has developed a core competency that enables propulsion technology managers to make technology investment decisions substantiated by propulsion and airframe technology system studies. This method assists the designer/manager in selecting appropriate technology concepts while accounting for the presence of risk and uncertainty as well as interactions between disciplines. This capability is incorporated into a single design simulation system that is described in this paper. This propulsion system design environment is created with a commercially available software called iSIGHT, which is a generic computational framework, and with analysis programs for engine cycle, engine flowpath, mission, and economic analyses. iSIGHT is used to integrate these analysis tools within a single computer platform and facilitate information transfer amongst the various codes. The resulting modeling and simulation (M&S) environment in conjunction with the response surface method provides the designer/decision-maker an analytical means to examine the entire design space from either a subsystem and/or system perspective. The results of this paper will enable managers to analytically play what-if games to gain insight into the benefits (and/or degradation) of changing engine cycle design parameters. Furthermore, the propulsion design space will be explored probabilistically to show the feasibility and viability of the propulsion system integrated with a vehicle.

  2. Platform Architecture for Decentralized Positioning Systems.

    PubMed

    Kasmi, Zakaria; Norrdine, Abdelmoumen; Blankenbach, Jörg

    2017-04-26

    A platform architecture for positioning systems is essential for the realization of a flexible localization system, which interacts with other systems and supports various positioning technologies and algorithms. The decentralized processing of a position enables pushing the application-level knowledge into a mobile station and avoids the communication with a central unit such as a server or a base station. In addition, the calculation of the position on low-cost and resource-constrained devices presents a challenge due to limited computing and storage capacity, as well as limited power supply. Therefore, we propose a platform architecture that enables the design of a system with reusability of the components, extensibility (e.g., with other positioning technologies) and interoperability. Furthermore, the position is computed on a low-cost device such as a microcontroller, which simultaneously performs additional tasks such as data collecting or preprocessing based on an operating system. The platform architecture is designed, implemented and evaluated on the basis of two positioning systems: a field strength system and a time of arrival-based positioning system.

  3. Platform Architecture for Decentralized Positioning Systems

    PubMed Central

    Kasmi, Zakaria; Norrdine, Abdelmoumen; Blankenbach, Jörg

    2017-01-01

    A platform architecture for positioning systems is essential for the realization of a flexible localization system, which interacts with other systems and supports various positioning technologies and algorithms. The decentralized processing of a position enables pushing the application-level knowledge into a mobile station and avoids the communication with a central unit such as a server or a base station. In addition, the calculation of the position on low-cost and resource-constrained devices presents a challenge due to limited computing and storage capacity, as well as limited power supply. Therefore, we propose a platform architecture that enables the design of a system with reusability of the components, extensibility (e.g., with other positioning technologies) and interoperability. Furthermore, the position is computed on a low-cost device such as a microcontroller, which simultaneously performs additional tasks such as data collecting or preprocessing based on an operating system. The platform architecture is designed, implemented and evaluated on the basis of two positioning systems: a field strength system and a time of arrival-based positioning system. PMID:28445414
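
    As a small illustration of the kind of position computation such a platform might run on a constrained device, the following Python/NumPy sketch solves a time-of-arrival fix by Gauss-Newton least squares. It is an assumption for illustration, not the authors' implementation; anchor positions and measurements are made up.

```python
# Minimal sketch (not the authors' platform code): least-squares position fix
# from time-of-arrival (TOA) range measurements to known anchor stations.
import numpy as np

C = 299_792_458.0  # propagation speed (m/s), assuming radio signals

def toa_position(anchors, toas, iterations=20):
    """Gauss-Newton solution of ||x - a_i|| = c * t_i for the 2-D position x."""
    ranges = C * np.asarray(toas)
    x = anchors.mean(axis=0)               # initial guess: centroid of anchors
    for _ in range(iterations):
        diffs = x - anchors                # (n, 2)
        dists = np.linalg.norm(diffs, axis=1)
        residuals = dists - ranges
        jacobian = diffs / dists[:, None]  # d||x - a_i|| / dx
        step, *_ = np.linalg.lstsq(jacobian, residuals, rcond=None)
        x = x - step
    return x

anchors = np.array([[0.0, 0.0], [30.0, 0.0], [0.0, 30.0], [30.0, 30.0]])
true_pos = np.array([12.0, 7.5])
toas = np.linalg.norm(true_pos - anchors, axis=1) / C  # noiseless example
print(toa_position(anchors, toas))          # approximately [12.0, 7.5]
```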

  4. Web services as applications' integration tool: QikProp case study.

    PubMed

    Laoui, Abdel; Polyakov, Valery R

    2011-07-15

    Web services are a new technology that enables the integration of applications running on different platforms, primarily by using XML to enable communication among different computers over the Internet. A large number of applications were designed as stand-alone systems before the concept of Web services was introduced, and it is a challenge to integrate them into larger computational networks. A generally applicable method of wrapping stand-alone applications into Web services was developed and is described. To test the technology, it was applied to QikProp for DOS (Windows). Although the performance of the application did not change when it was delivered as a Web service, this form of deployment offered several advantages, such as simplified and centralized maintenance, a smaller number of licenses, and practically no training for the end user. Because by using the described approach almost any legacy application can be wrapped as a Web service, this form of delivery may be recommended as a global alternative to traditional deployment solutions. Copyright © 2011 Wiley Periodicals, Inc.
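
    The wrapping idea can be sketched in a few lines: the following hypothetical Python example exposes a stand-alone command-line program as a simple HTTP service. The executable name and port are placeholders, and this sketch does not reproduce the XML/SOAP interface used in the original work.

```python
# Minimal sketch of the wrapping idea (not the authors' QikProp wrapper):
# exposing a stand-alone command-line program as a simple HTTP service.
# The executable name "legacy_app" is a hypothetical placeholder.
import subprocess
from http.server import BaseHTTPRequestHandler, HTTPServer

class WrapperHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        length = int(self.headers.get("Content-Length", 0))
        payload = self.rfile.read(length)            # input file sent by the client
        with open("input.dat", "wb") as fh:
            fh.write(payload)
        # Run the legacy program exactly as it would be run from the shell.
        result = subprocess.run(["./legacy_app", "input.dat"],
                                capture_output=True, timeout=300)
        self.send_response(200 if result.returncode == 0 else 500)
        self.send_header("Content-Type", "text/plain")
        self.end_headers()
        self.wfile.write(result.stdout or result.stderr)

if __name__ == "__main__":
    HTTPServer(("0.0.0.0", 8080), WrapperHandler).serve_forever()
```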

  5. Some recent applications of Navier-Stokes codes to rotorcraft

    NASA Technical Reports Server (NTRS)

    Mccroskey, W. J.

    1992-01-01

    Many operational limitations of helicopters and other rotary-wing aircraft are due to nonlinear aerodynamic phenomena including unsteady, three-dimensional transonic and separated flow near the surfaces and highly vortical flow in the wakes of rotating blades. Modern computational fluid dynamics (CFD) technology offers new tools to study and simulate these complex flows. However, existing Euler and Navier-Stokes codes have to be modified significantly for rotorcraft applications, and the enormous computational requirements presently limit their use in routine design applications. Nevertheless, the Euler/Navier-Stokes technology is progressing in anticipation of future supercomputers that will enable meaningful calculations to be made for complete rotorcraft configurations.

  6. Highlighting the medical applications of 3D printing in Egypt

    PubMed Central

    Abdelghany, Khaled; Hamza, Hosamuddin

    2015-01-01

    Computer-assisted designing/computer-assisted manufacturing (CAD/CAM) technology has enabled medical practitioners to tailor physical models in a patient and purpose-specific fashion. It allows the designing and manufacturing of templates, appliances and devices with a high range of accuracy using biocompatible materials. The technique, nevertheless, relies on digital scanning (e.g., using intraoral scanners) and/or digital imaging (e.g., CT and MRI). In developing countries, there are some technical and financial limitations of implementing such advanced tools as an essential portion of medical applications. This paper focuses on the surgical and dental use of 3D printing technology in Egypt as a developing country. PMID:26807414

  7. Computer-assisted abdominal surgery: new technologies.

    PubMed

    Kenngott, H G; Wagner, M; Nickel, F; Wekerle, A L; Preukschas, A; Apitz, M; Schulte, T; Rempel, R; Mietkowski, P; Wagner, F; Termer, A; Müller-Stich, Beat P

    2015-04-01

    Computer-assisted surgery is a wide field of technologies with the potential to enable the surgeon to improve efficiency and efficacy of diagnosis, treatment, and clinical management. This review provides an overview of the most important new technologies and their applications. A MEDLINE database search was performed revealing a total of 1702 references. All references were considered for information on six main topics, namely image guidance and navigation, robot-assisted surgery, human-machine interface, surgical processes and clinical pathways, computer-assisted surgical training, and clinical decision support. Further references were obtained through cross-referencing the bibliography cited in each work. Based on their respective field of expertise, the authors chose 64 publications relevant for the purpose of this review. Computer-assisted systems are increasingly used not only in experimental studies but also in clinical studies. Although computer-assisted abdominal surgery is still in its infancy, the number of studies is constantly increasing, and clinical studies start showing the benefits of computers used not only as tools of documentation and accounting but also for directly assisting surgeons during diagnosis and treatment of patients. Further developments in the field of clinical decision support even have the potential of causing a paradigm shift in how patients are diagnosed and treated.

  8. HTMT-class Latency Tolerant Parallel Architecture for Petaflops Scale Computation

    NASA Technical Reports Server (NTRS)

    Sterling, Thomas; Bergman, Larry

    2000-01-01

    Computational Aero Sciences and other numeric-intensive computation disciplines demand computing throughputs substantially greater than the Teraflops-scale systems only now becoming available. The related fields of fluids, structures, thermal, combustion, and dynamic controls are among the interdisciplinary areas that, in combination with sufficient resolution and advanced adaptive techniques, may force performance requirements towards Petaflops. This will be especially true for compute-intensive models such as Navier-Stokes, or when such system models are only part of a larger design optimization computation involving many design points. Yet recent experience with conventional MPP configurations comprising commodity processing and memory components has shown that larger scale frequently results in higher programming difficulty and lower system efficiency. While important advances in system software and algorithmic techniques have had some impact on efficiency and programmability for certain classes of problems, in general it is unlikely that software alone will resolve the challenges to higher scalability. As in the past, future generations of high-end computers may require a combination of hardware architecture and system software advances to enable efficient operation at a Petaflops level. The NASA-led HTMT project has engaged the talents of a broad interdisciplinary team to develop a new strategy in high-end system architecture to deliver petaflops-scale computing in the 2004/5 timeframe. The Hybrid-Technology MultiThreaded parallel computer architecture incorporates several advanced technologies in combination with an innovative dynamic adaptive scheduling mechanism to provide unprecedented performance and efficiency within practical constraints of cost, complexity, and power consumption. The emerging superconductor Rapid Single Flux Quantum electronics can operate at 100 GHz (the record is 770 GHz) at one percent of the power required by conventional semiconductor logic. Wave Division Multiplexing optical communications can approach a peak per-fiber bandwidth of 1 Tbps, and the new Data Vortex network topology employing this technology can connect tens of thousands of ports, providing a bi-section bandwidth on the order of a Petabyte per second with latencies well below 100 nanoseconds, even under heavy loads. Processor-in-Memory (PIM) technology combines logic and memory on the same chip, exposing the internal bandwidth of the memory row buffers at low latency. Holographic photorefractive storage technologies provide high-density memory with access a thousand times faster than conventional disk technologies. Together these technologies enable a new class of shared-memory system architecture with a peak performance in the range of a Petaflops but size and power requirements comparable to today's largest Teraflops-scale systems. To achieve high sustained performance, HTMT combines an advanced multithreading processor architecture with a memory-driven coarse-grained latency management strategy called "percolation", yielding high efficiency while reducing much of the parallel programming burden. This paper will present the basic system architecture characteristics made possible through this series of advanced technologies and then give a detailed description of the new percolation approach to runtime latency management.

  9. Single-server blind quantum computation with quantum circuit model

    NASA Astrophysics Data System (ADS)

    Zhang, Xiaoqian; Weng, Jian; Li, Xiaochun; Luo, Weiqi; Tan, Xiaoqing; Song, Tingting

    2018-06-01

    Blind quantum computation (BQC) enables the client, who has few quantum technologies, to delegate her quantum computation to a server, who has strong quantum computational capabilities and learns nothing about the client's quantum inputs, outputs and algorithms. In this article, we propose a single-server BQC protocol with the quantum circuit model by replacing any quantum gate with a combination of rotation operators. Trap quantum circuits are introduced, together with the combination of rotation operators, such that the server learns nothing about the quantum algorithms. The client only needs to perform operations X and Z, while the server honestly performs rotation operators.
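
    The underlying decomposition, replacing a gate by a combination of rotation operators, can be illustrated with a standard textbook identity; the following Python/NumPy sketch checks that the Hadamard gate equals e^{i*pi/2} Ry(pi/2) Rz(pi). This is an illustrative identity only, not the paper's trap-circuit construction.

```python
# Minimal sketch (standard textbook identity, not the paper's protocol):
# any single-qubit gate can be written, up to a global phase, as a product
# of rotation operators; here H = e^{i*pi/2} * Ry(pi/2) * Rz(pi).
import numpy as np

def rz(theta):
    return np.array([[np.exp(-1j * theta / 2), 0],
                     [0, np.exp(1j * theta / 2)]])

def ry(theta):
    return np.array([[np.cos(theta / 2), -np.sin(theta / 2)],
                     [np.sin(theta / 2),  np.cos(theta / 2)]])

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
reconstructed = np.exp(1j * np.pi / 2) * ry(np.pi / 2) @ rz(np.pi)
print(np.allclose(reconstructed, H))   # True
```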

  10. Computational Approach for Developing Blood Pump

    NASA Technical Reports Server (NTRS)

    Kwak, Dochan

    2002-01-01

    This viewgraph presentation provides an overview of the computational approach to developing a ventricular assist device (VAD) which utilizes NASA aerospace technology. The VAD is used as a temporary support to sick ventricles for those who suffer from late-stage congestive heart failure (CHF). The need for donor hearts is much greater than their availability, and the VAD is seen as a bridge-to-transplant. The computational issues confronting the design of a more advanced, reliable VAD include the modelling of viscous incompressible flow. A computational approach provides the possibility of quantifying the flow characteristics, which is especially valuable for analyzing a compact design with highly sensitive operating conditions. Computational fluid dynamics (CFD) and rocket engine technology have been applied to modify the design of a VAD which enabled human transplantation. The computing requirement for this project is still large, however, and the unsteady analysis of the entire system from natural heart to aorta involves several hundred revolutions of the impeller. Further study is needed to assess the impact of mechanical VADs on the human body.

  11. Turbulent Flow Simulation at the Exascale: Opportunities and Challenges Workshop: August 4-5, 2015, Washington, D.C.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sprague, Michael A.; Boldyrev, Stanislav; Fischer, Paul

    This report details the impact exascale will bring to turbulent-flow simulations in applied science and technology. The need for accurate simulation of turbulent flows is evident across the DOE applied-science and engineering portfolios, including combustion, plasma physics, nuclear-reactor physics, wind energy, and atmospheric science. The workshop brought together experts in turbulent-flow simulation, computational mathematics, and high-performance computing. Building upon previous ASCR workshops on exascale computing, participants defined a research agenda and path forward that will enable scientists and engineers to continually leverage, engage, and direct advances in computational systems on the path to exascale computing.

  12. Transformation of OODT CAS to Perform Larger Tasks

    NASA Technical Reports Server (NTRS)

    Mattmann, Chris; Freeborn, Dana; Crichton, Daniel; Hughes, John; Ramirez, Paul; Hardman, Sean; Woollard, David; Kelly, Sean

    2008-01-01

    A computer program denoted OODT CAS has been transformed to enable performance of larger tasks that involve greatly increased data volumes and increasingly intensive processing of data on heterogeneous, geographically dispersed computers. Prior to the transformation, OODT CAS (also alternatively denoted, simply, 'CAS') [wherein 'OODT' signifies 'Object-Oriented Data Technology' and 'CAS' signifies 'Catalog and Archive Service'] was a proven software component used to manage scientific data from spaceflight missions. In the transformation, CAS was split into two separate components representing its canonical capabilities: file management and workflow management. In addition, CAS was augmented by addition of a resource-management component. This third component enables CAS to manage heterogeneous computing by use of diverse resources, including high-performance clusters of computers, commodity computing hardware, and grid computing infrastructures. CAS is now more easily maintainable, evolvable, and reusable. These components can be used separately or, taking advantage of synergies, can be used together. Other elements of the transformation included addition of a separate Web presentation layer that supports distribution of data products via Really Simple Syndication (RSS) feeds, and provision for full Resource Description Framework (RDF) exports of metadata.

  13. Semantic Web meets Integrative Biology: a survey.

    PubMed

    Chen, Huajun; Yu, Tong; Chen, Jake Y

    2013-01-01

    Integrative Biology (IB) uses experimental or computational quantitative technologies to characterize biological systems at the molecular, cellular, tissue and population levels. IB typically involves the integration of the data, knowledge and capabilities across disciplinary boundaries in order to solve complex problems. We identify a series of bioinformatics problems posed by interdisciplinary integration: (i) data integration that interconnects structured data across related biomedical domains; (ii) ontology integration that brings jargons, terminologies and taxonomies from various disciplines into a unified network of ontologies; (iii) knowledge integration that integrates disparate knowledge elements from multiple sources; (iv) service integration that builds applications out of services provided by different vendors. We argue that IB can benefit significantly from the integration solutions enabled by Semantic Web (SW) technologies. The SW enables scientists to share content beyond the boundaries of applications and websites, resulting in a web of data that is meaningful and understandable to any computer. In this review, we provide insight into how SW technologies can be used to build open, standardized and interoperable solutions for interdisciplinary integration on a global basis. We present a rich set of case studies in system biology, integrative neuroscience, bio-pharmaceutics and translational medicine, to highlight the technical features and benefits of SW applications in IB.

  14. An X-Ray computed tomography/positron emission tomography system designed specifically for breast imaging.

    PubMed

    Boone, John M; Yang, Kai; Burkett, George W; Packard, Nathan J; Huang, Shih-ying; Bowen, Spencer; Badawi, Ramsey D; Lindfors, Karen K

    2010-02-01

    Mammography has served the population of women who are at-risk for breast cancer well over the past 30 years. While mammography has undergone a number of changes as digital detector technology has advanced, other modalities such as computed tomography have experienced technological sophistication over this same time frame as well. The advent of large field-of-view flat panel detector systems has enabled the development of breast CT and several other niche CT applications, which rely on cone beam geometry. The breast, it turns out, is well suited to cone beam CT imaging because the lack of bones reduces artifacts, and the natural tapering of the breast anteriorly reduces the x-ray path lengths through the breast at large cone angle, reducing cone beam artifacts as well. We are in the process of designing a third prototype system which will enable the use of breast CT for image-guided interventional procedures. This system will have several copies fabricated so that several breast CT scanners can be used in a multi-institutional clinical trial to better understand the role that this technology can bring to breast imaging.

  15. Command, Control and Communications Capabilities Enabling 21st Century Missions, a Historical Perspective

    NASA Technical Reports Server (NTRS)

    Waterman, Robert D.; Rice, Herbert D.; Waterman, Susan J.

    2010-01-01

    Command, Control and Communications (CCC) has evolved through the years from simple switches, dials, analogue hardwire networks and lights to a modern computer-based digital network. However, there are two closely coupled pillars upon which a CCC system is built. The first is that technology drives the pace of advancement. The second is that a culture that fosters resistance to change can limit technological advancements in the CCC system. While technology has advanced at a tremendous rate throughout the years, the change in culture has moved slowly. This paper will attempt to show through a historical perspective where specific design decisions for early CCC systems have erroneously evolved into general requirements being imposed on later systems. Finally this paper will provide a glimpse into the future directions envisioned for CCC capabilities that will enable 21st century missions.

  16. Benefits of information technology-enabled diabetes management.

    PubMed

    Bu, Davis; Pan, Eric; Walker, Janice; Adler-Milstein, Julia; Kendrick, David; Hook, Julie M; Cusack, Caitlin M; Bates, David W; Middleton, Blackford

    2007-05-01

    To determine the financial and clinical benefits of implementing information technology (IT)-enabled disease management systems. A computer model was created to project the impact of IT-enabled disease management on care processes, clinical outcomes, and medical costs for patients with type 2 diabetes aged >25 years in the U.S. Several ITs were modeled (e.g., diabetes registries, computerized decision support, remote monitoring, patient self-management systems, and payer-based systems). Estimates of care process improvements were derived from published literature. Simulations projected outcomes for both payer and provider organizations, scaled to the national level. The primary outcome was medical cost savings, in 2004 U.S. dollars discounted at 5%. Secondary measures include reduction of cardiovascular, cerebrovascular, neuropathy, nephropathy, and retinopathy clinical outcomes. All forms of IT-enabled disease management improved the health of patients with diabetes and reduced health care expenditures. Over 10 years, diabetes registries saved $14.5 billion, computerized decision support saved $10.7 billion, payer-centered technologies saved $7.10 billion, remote monitoring saved $326 million, self-management saved $285 million, and integrated provider-patient systems saved $16.9 billion. IT-enabled diabetes management has the potential to improve care processes, delay diabetes complications, and save health care dollars. Of existing systems, provider-centered technologies such as diabetes registries currently show the most potential for benefit. Fully integrated provider-patient systems would have even greater potential for benefit. These benefits must be weighed against the implementation costs.
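
    For readers unfamiliar with the discounting used in the cost figures above, the following Python sketch shows how a stream of projected annual savings is discounted at 5% to a present value. The yearly amounts are made up and bear no relation to the study's simulation model.

```python
# Minimal sketch (illustrative only, not the study's computer model): how a
# stream of projected end-of-year savings is discounted to present value
# at 5%, as in the cost figures quoted above. The amounts are made up.
def present_value(annual_savings, rate=0.05):
    """Discount a list of end-of-year cash flows back to year zero."""
    return sum(s / (1 + rate) ** year
               for year, s in enumerate(annual_savings, start=1))

hypothetical_savings = [1.0e9] * 10          # $1B per year for 10 years
print(f"PV over 10 years: ${present_value(hypothetical_savings) / 1e9:.2f}B")
```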

  17. Federated and Cloud Enabled Resources for Data Management and Utilization

    NASA Astrophysics Data System (ADS)

    Rankin, R.; Gordon, M.; Potter, R. G.; Satchwill, B.

    2011-12-01

    The emergence of cloud computing over the past three years has led to a paradigm shift in how data can be managed, processed and made accessible. Building on the federated data management system offered through the Canadian Space Science Data Portal (www.cssdp.ca), we demonstrate how heterogeneous and geographically distributed data sets and modeling tools have been integrated to form a virtual data center and computational modeling platform that has services for data processing and visualization embedded within it. We also discuss positive and negative experiences in utilizing Eucalyptus and OpenStack cloud applications, and job scheduling facilitated by Condor and Star Cluster. We summarize our findings by demonstrating use of these technologies in the Cloud Enabled Space Weather Data Assimilation and Modeling Platform CESWP (www.ceswp.ca), which is funded through Canarie's (canarie.ca) Network Enabled Platforms program in Canada.

  18. Simulation Based Exploration of Critical Zone Dynamics in Intensively Managed Landscapes

    NASA Astrophysics Data System (ADS)

    Kumar, P.

    2017-12-01

    The advent of high-resolution measurements of topographic and (vertical) vegetation features using aerial LiDAR is enabling us to resolve micro-scale (~1 m) landscape structural characteristics over large areas. Availability of hyperspectral measurements is further augmenting these LiDAR data by enabling the biogeochemical characterization of vegetation and soils at unprecedented spatial resolutions (~1-10 m). Such data have opened up novel opportunities for modeling Critical Zone processes and exploring questions that were not possible before. We show how an integrated 3-D model at 1 m grid resolution can enable us to resolve micro-topographic and ecological dynamics and their control on hydrologic and biogeochemical processes over large areas. We address the computational challenge of such detailed modeling by exploiting hybrid CPU and GPU computing technologies. We show results of moisture, biogeochemical, and vegetation dynamics from studies in the Critical Zone Observatory for Intensively Managed Landscapes (IMLCZO) in the Midwestern United States.

  19. Polymer waveguides for electro-optical integration in data centers and high-performance computers.

    PubMed

    Dangel, Roger; Hofrichter, Jens; Horst, Folkert; Jubin, Daniel; La Porta, Antonio; Meier, Norbert; Soganci, Ibrahim Murat; Weiss, Jonas; Offrein, Bert Jan

    2015-02-23

    To satisfy the intra- and inter-system bandwidth requirements of future data centers and high-performance computers, low-cost low-power high-throughput optical interconnects will become a key enabling technology. To tightly integrate optics with the computing hardware, particularly in the context of CMOS-compatible silicon photonics, optical printed circuit boards using polymer waveguides are considered as a formidable platform. IBM Research has already demonstrated the essential silicon photonics and interconnection building blocks. A remaining challenge is electro-optical packaging, i.e., the connection of the silicon photonics chips with the system. In this paper, we present a new single-mode polymer waveguide technology and a scalable method for building the optical interface between silicon photonics chips and single-mode polymer waveguides.

  20. Perspectives of IT Professionals on Employing Server Virtualization Technologies

    ERIC Educational Resources Information Center

    Sligh, Darla

    2010-01-01

    Server virtualization enables a physical computer to support multiple applications logically by decoupling the application from the hardware layer, thereby reducing operational costs and keeping IT organizations competitive in delivering IT services to their enterprises. IT organizations continually examine the efficiency of their internal IT systems and…

  1. ACOT Classroom Networks: Today and Tomorrow. ACOT Report #5.

    ERIC Educational Resources Information Center

    Knapp, Linda

    The Apple Classrooms of Tomorrow (ACOT) research project provides classroom sites with equipment, ongoing support, and training, enabling educators to discover the potential of networked learning environments. ACOT networks link together technology from Apple IIe computers and Image Writer printers, to Macintosh II systems, synthesizers, laserdisc…

  2. The Charge of the Byte Brigade: Educators Lead the Fourth Revolution.

    ERIC Educational Resources Information Center

    Gardner, David Pierpont

    1986-01-01

    Opportunities provided by computer technologies are considered from two perspectives: what they are enabling teachers and researchers to do and implications for the future. Processing information, opening access to the library, and coordinating telecommunications are discussed, including the University of California MELVYL system. (MLW)

  3. A Multi-Temporal Context-Aware System for Competences Management

    ERIC Educational Resources Information Center

    Rosa, João H.; Barbosa, Jorge L.; Kich, Marcos; Brito, Lucas

    2015-01-01

    The evolution of computing technology and wireless networks has contributed to the miniaturization of mobile devices and their increase in power, providing services anywhere and anytime. In this scenario, applications have considered the user's contexts to make decisions (Context Awareness). Context-aware applications have enabled new…

  4. Solving probability reasoning based on DNA strand displacement and probability modules.

    PubMed

    Zhang, Qiang; Wang, Xiaobiao; Wang, Xiaojun; Zhou, Changjun

    2017-12-01

    In computational biology, DNA strand displacement technology is used to simulate the computation process and has shown strong computing ability. Most researchers use it to solve logic problems, but it is only rarely used in probabilistic reasoning. To process probabilistic reasoning, a conditional probability derivation model and a total probability model based on DNA strand displacement were established in this paper. The models were assessed through the game "read your mind." They have been shown to enable the application of probabilistic reasoning in genetic diagnosis. Copyright © 2017 Elsevier Ltd. All rights reserved.
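
    The arithmetic that the conditional-probability and total-probability modules realize chemically can be written down directly; the following Python sketch shows both relations with made-up numbers. It does not model the DNA strand displacement reactions themselves.

```python
# Minimal sketch (plain arithmetic, not the DNA strand displacement circuits):
# the two relations the paper's modules realize, the law of total probability
# and conditional probability, for a two-hypothesis "read your mind" style
# example with made-up numbers.
def total_probability(prior, likelihood):
    """P(E) = sum_i P(H_i) * P(E | H_i)."""
    return sum(prior[h] * likelihood[h] for h in prior)

def conditional(prior, likelihood, hypothesis):
    """P(H | E) = P(H) * P(E | H) / P(E)."""
    return prior[hypothesis] * likelihood[hypothesis] / total_probability(prior, likelihood)

prior = {"thinks_of_card_A": 0.5, "thinks_of_card_B": 0.5}
likelihood = {"thinks_of_card_A": 0.9, "thinks_of_card_B": 0.2}   # P(answer "yes" | H)
print(total_probability(prior, likelihood))                 # P("yes") = 0.55
print(conditional(prior, likelihood, "thinks_of_card_A"))   # ~0.818
```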

  5. FPGA-Based X-Ray Detection and Measurement for an X-Ray Polarimeter

    NASA Technical Reports Server (NTRS)

    Gregory, Kyle; Hill, Joanne; Black, Kevin; Baumgartner, Wayne

    2013-01-01

    This technology enables detection and measurement of x-rays in an x-ray polarimeter using a field-programmable gate array (FPGA). The technology was developed for the Gravitational and Extreme Magnetism Small Explorer (GEMS) mission. It performs precision energy and timing measurements, as well as rejection of non-x-ray events. It enables the GEMS polarimeter to detect precisely when an event has taken place so that additional measurements can be made. The technology also enables this function to be performed in an FPGA using limited resources so that mass and power can be minimized while reliability for a space application is maximized and precise real-time operation is achieved. This design requires a low-noise, charge-sensitive preamplifier; a high-speed analog-to-digital converter (ADC); and an x-ray detector with a cathode terminal. It functions by computing a sum of differences for time-samples whose difference exceeds a programmable threshold. A state machine advances through states as a programmable number of consecutive samples exceeds or fails to exceed this threshold. The pulse height is recorded as the accumulated sum. The track length is also measured based on the time from the start to the end of accumulation. For tracks longer than a certain length, the algorithm estimates the barycenter of charge deposit by comparing the accumulator value at the midpoint to the final accumulator value. The design also employs a number of techniques for rejecting background events. This innovation enables the function to be performed in space where it can operate autonomously with a rapid response time. This implementation combines advantages of computing system-based approaches with those of pure analog approaches. The result is an implementation that is highly reliable, performs in real-time, rejects background events, and consumes minimal power.
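
    A rough software model of the detection logic described above (sum of differences above a programmable threshold, a run-length state machine, and a midpoint-versus-final accumulator comparison) is sketched below in Python. The thresholds, run lengths, and samples are made up, and the real design is FPGA logic rather than software.

```python
# Approximate software model of the algorithm described above (the flight
# implementation is FPGA logic; thresholds, counts, and samples are made up).
def detect_pulse(samples, diff_threshold=5, run_length=3):
    """Accumulate differences that exceed a threshold; report pulse height,
    track length, and a midpoint/final accumulator comparison (barycenter cue)."""
    acc, accumulating = 0, False
    start = end = None
    history = []
    for i in range(1, len(samples)):
        d = samples[i] - samples[i - 1]
        if d > diff_threshold:
            if not accumulating:
                accumulating, start = True, i
            acc += d
            history.append(acc)
            end = i
        elif accumulating and (i - end) >= run_length:
            break                          # run of sub-threshold samples ends the event
    if start is None:
        return None
    return {"height": acc,
            "track_length": end - start + 1,
            "midpoint_ratio": history[len(history) // 2] / acc if acc else 0.0}

print(detect_pulse([0, 1, 2, 10, 25, 45, 50, 51, 51, 50]))
```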

  6. A Look at the Impact of High-End Computing Technologies on NASA Missions

    NASA Technical Reports Server (NTRS)

    Biswas, Rupak; Dunbar, Jill; Hardman, John; Bailey, F. Ron; Wheeler, Lorien; Rogers, Stuart

    2012-01-01

    From its bold start nearly 30 years ago and continuing today, the NASA Advanced Supercomputing (NAS) facility at Ames Research Center has enabled remarkable breakthroughs in the space agency's science and engineering missions. Throughout this time, NAS experts have influenced the state-of-the-art in high-performance computing (HPC) and related technologies such as scientific visualization, system benchmarking, batch scheduling, and grid environments. We highlight the pioneering achievements and innovations originating from and made possible by NAS resources and know-how, from early supercomputing environment design and software development, to long-term simulation and analyses critical to design safe Space Shuttle operations and associated spinoff technologies, to the highly successful Kepler Mission's discovery of new planets now capturing the world's imagination.

  7. Aero-Structural Assessment of an Inflatable Aerodynamic Decelerator

    NASA Technical Reports Server (NTRS)

    Sheta, Essam F.; Venugopalan, Vinod; Tan, X. G.; Liever, Peter A.; Habchi, Sami D.

    2010-01-01

    NASA is conducting an Entry, Descent and Landing Systems Analysis (EDL-SA) Study to determine the key technology development projects that should be undertaken for enabling the landing of large payloads on Mars for both human and robotic missions. Inflatable Aerodynamic Decelerators (IADs) are one of the candidate technologies. A variety of EDL architectures are under consideration. The current effort is directed at the development and simulation of a computational framework for inflatable structures.

  8. Report on Technology Horizons: A Vision for Air Force Science and Technology During 2010-2030. Volume 1

    DTIC Science & Technology

    2010-05-15

    flow and decision processes across the air and space domains. It thus comprises traditional wired and fiber-optic computer networks based on...dual flow path design allow high volumetric efficiency, and high cruise speed provides significantly increased survivability. Vertical takeoff...emerging “third-stream engine architectures” can enable for constant mass flow engines that can provide further reductions in fuel consumption. A wide

  9. Designing of routing algorithms in autonomous distributed data transmission system for mobile computing devices with ‘WiFi-Direct’ technology

    NASA Astrophysics Data System (ADS)

    Nikitin, I. A.; Sherstnev, V. S.; Sherstneva, A. I.; Botygin, I. A.

    2017-02-01

    The results of research into existing routing protocols in wireless networks and their main features are discussed in the paper. Based on these protocol data, routing protocols for wireless networks, including search routing algorithms and phone-directory exchange algorithms, are designed with the ‘WiFi-Direct’ technology. Algorithms without an IP protocol were designed, which increased their efficiency by working only with the MAC addresses of the devices. The developed algorithms are expected to be used in mobile software engineering with the Android platform as the base. Simpler algorithms and formats than those of the well-known routing protocols, together with the rejection of the IP protocol, enable the developed protocols to run on more primitive mobile devices. Implementing the protocols in industry enables the creation of data transmission networks among workstations and mobile robots without any access points.
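
    A minimal sketch of MAC-address-only route discovery of the kind described, written here as a breadth-first flood in Python over a made-up topology, is shown below; it illustrates the idea rather than the paper's Android implementation.

```python
# Minimal sketch (not the paper's Android implementation): flooding-style
# route discovery keyed on MAC addresses only, with no IP layer. The
# topology below is a made-up example.
from collections import deque

links = {                       # adjacency list of directly reachable peers
    "aa:01": ["aa:02", "aa:03"],
    "aa:02": ["aa:01", "aa:04"],
    "aa:03": ["aa:01", "aa:04"],
    "aa:04": ["aa:02", "aa:03", "aa:05"],
    "aa:05": ["aa:04"],
}

def discover_route(source, target):
    """Breadth-first search, mirroring a flooded route-request/route-reply."""
    queue = deque([[source]])
    visited = {source}
    while queue:
        path = queue.popleft()
        if path[-1] == target:
            return path                        # route as a list of MAC addresses
        for neighbor in links.get(path[-1], []):
            if neighbor not in visited:
                visited.add(neighbor)
                queue.append(path + [neighbor])
    return None

print(discover_route("aa:01", "aa:05"))        # ['aa:01', 'aa:02', 'aa:04', 'aa:05']
```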

  10. Integration of problem-based learning and innovative technology into a self-care course.

    PubMed

    McFalls, Marsha

    2013-08-12

    To assess the integration of problem-based learning and technology into a self-care course. Problem-based learning (PBL) activities were developed and implemented in place of lectures in a self-care course. Students used technology, such as computer-generated virtual patients and iPads, during the PBL sessions. Students' scores on post-case quizzes were higher than on pre-case quizzes used to assess baseline knowledge. Student satisfaction with problem-based learning and the use of technology in the course remained consistent throughout the semester. Integrating problem-based learning and technology into a self-care course enabled students to become active learners.

  11. An Internet of Things Approach to Electrical Power Monitoring and Outage Reporting

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Koch, Daniel B

    The so-called Internet of Things concept has captured much attention recently as ordinary devices are connected to the Internet for monitoring and control purposes. One enabling technology is the proliferation of low-cost, single board computers with built-in network interfaces. Some of these are capable of hosting full-fledged operating systems that provide rich programming environments. Taken together, these features enable inexpensive solutions for even traditional tasks such as the one presented here for electrical power monitoring and outage reporting.
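
    A minimal sketch of such a monitor is shown below in Python: a single-board computer polls a mains-sense input and reports state changes over HTTP. The sensor read and reporting endpoint are placeholders, not the implementation described in the report.

```python
# Minimal sketch (not the report's implementation): a single-board computer
# polling a mains-sense input and reporting an outage over HTTP. The
# read_mains_present() function and the reporting URL are placeholders.
import json
import time
import urllib.request

REPORT_URL = "http://example.org/outage-report"   # hypothetical endpoint

def read_mains_present():
    """Placeholder for a GPIO or ADC read that senses line voltage."""
    return True

def report(event):
    body = json.dumps({"event": event, "time": time.time()}).encode()
    req = urllib.request.Request(REPORT_URL, data=body,
                                 headers={"Content-Type": "application/json"})
    urllib.request.urlopen(req, timeout=10)

powered = True
while True:                     # simple polling loop; runs until interrupted
    present = read_mains_present()
    if powered and not present:
        report("outage")        # power just dropped
    elif not powered and present:
        report("restored")      # power came back
    powered = present
    time.sleep(5)               # poll every few seconds
```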

  12. Software for Collaborative Engineering of Launch Rockets

    NASA Technical Reports Server (NTRS)

    Stanley, Thomas Troy

    2003-01-01

    The Rocket Evaluation and Cost Integration for Propulsion and Engineering (RECIPE) software enables collaborative computing with automated exchange of information in the design and analysis of launch rockets and other complex systems. RECIPE can interact with and incorporate a variety of programs, including legacy codes, that model aspects of a system from the perspectives of different technological disciplines (e.g., aerodynamics, structures, propulsion, trajectory, aeroheating, controls, and operations) and that are used by different engineers on different computers running different operating systems. RECIPE consists mainly of (1) ISCRM, a file-transfer subprogram that makes it possible for legacy codes executed in their original operating systems on their original computers to exchange data, and (2) CONES, an easy-to-use file-wrapper subprogram that enables the integration of legacy codes. RECIPE provides a tightly integrated conceptual framework that emphasizes connectivity among the programs used by the collaborators, linking these programs in a manner that provides some configuration control while facilitating collaborative engineering tradeoff studies, including design-to-cost studies. In comparison with prior collaborative-engineering schemes, one based on the use of RECIPE enables fewer engineers to do more in less time.

  13. Individualization, globalization and health--about sustainable information technologies and the aim of medical informatics.

    PubMed

    Haux, Reinhold

    2006-12-01

    This paper discusses aspects of information technologies for health care, in particular on transinstitutional health information systems (HIS) and on health-enabling technologies, with some consequences for the aim of medical informatics. It is argued that with the extended range of health information systems and the perspective of having adequate transinstitutional HIS architectures, a substantial contribution can be made to better patient-centered care, with possibilities ranging from regional, national to even global care. It is also argued that in applying health-enabling technologies, using ubiquitous, pervasive computing environments and ambient intelligence approaches, we can expect that in addition care will become more specific and tailored for the individual, and that we can achieve better personalized care. In developing health care systems towards transinstitutional HIS and health-enabling technologies, the aim of medical informatics, to contribute to the progress of the sciences and to high-quality, efficient, and affordable health care that does justice to the individual and to society, may be extended to also contributing to self-determined and self-sufficient (autonomous) life. Reference is made and examples are given from the Yearbook of Medical Informatics of the International Medical Informatics Association (IMIA) and from the work of Professor Jochen Moehr.

  14. Using Multi-Core Systems for Rover Autonomy

    NASA Technical Reports Server (NTRS)

    Clement, Brad; Estlin, Tara; Bornstein, Benjamin; Springer, Paul; Anderson, Robert C.

    2010-01-01

    Task Objectives are: (1) Develop and demonstrate key capabilities for rover long-range science operations using multi-core computing, (a) Adapt three rover technologies to execute on SOA multi-core processor (b) Illustrate performance improvements achieved (c) Demonstrate adapted capabilities with rover hardware, (2) Targeting three high-level autonomy technologies (a) Two for onboard data analysis (b) One for onboard command sequencing/planning, (3) Technologies identified as enabling for future missions, (4) Benefits will be measured along several metrics: (a) Execution time / Power requirements (b) Number of data products processed per unit time (c) Solution quality

  15. Workflow4Metabolomics: a collaborative research infrastructure for computational metabolomics

    PubMed Central

    Giacomoni, Franck; Le Corguillé, Gildas; Monsoor, Misharl; Landi, Marion; Pericard, Pierre; Pétéra, Mélanie; Duperier, Christophe; Tremblay-Franco, Marie; Martin, Jean-François; Jacob, Daniel; Goulitquer, Sophie; Thévenot, Etienne A.; Caron, Christophe

    2015-01-01

    Summary: The complex, rapidly evolving field of computational metabolomics calls for collaborative infrastructures where the large volume of new algorithms for data pre-processing, statistical analysis and annotation can be readily integrated regardless of the implementation language, evaluated on reference datasets and chained to build ad hoc workflows for users. We have developed Workflow4Metabolomics (W4M), the first fully open-source and collaborative online platform for computational metabolomics. W4M is a virtual research environment built upon the Galaxy web-based platform technology. It enables ergonomic integration, exchange and running of individual modules and workflows. Alternatively, the whole W4M framework and computational tools can be downloaded as a virtual machine for local installation. Availability and implementation: the http://workflow4metabolomics.org homepage enables users to open a private account and access the infrastructure. W4M is developed and maintained by the French Bioinformatics Institute (IFB) and the French Metabolomics and Fluxomics Infrastructure (MetaboHUB). Contact: contact@workflow4metabolomics.org PMID:25527831
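    A tiny sketch of the "chained modules" idea: each step takes a table and returns a table, so independently written modules (pre-processing, statistics, annotation) can be composed into an ad hoc workflow. This only illustrates the concept; it is not the W4M or Galaxy API, and the peak data are invented.

    ```python
    # Compose independently written processing steps into a simple workflow.
    from typing import Callable, Dict, List

    Table = List[Dict[str, float]]
    Step = Callable[[Table], Table]

    def normalize(rows: Table) -> Table:
        total = sum(r["intensity"] for r in rows) or 1.0
        return [{**r, "intensity": r["intensity"] / total} for r in rows]

    def filter_low(rows: Table) -> Table:
        return [r for r in rows if r["intensity"] > 0.01]

    def run_workflow(rows: Table, steps: List[Step]) -> Table:
        for step in steps:          # each module sees only the previous output
            rows = step(rows)
        return rows

    peaks = [{"mz": 180.06, "intensity": 4.0}, {"mz": 255.23, "intensity": 0.02}]
    print(run_workflow(peaks, [normalize, filter_low]))
    ```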

  16. Workflow4Metabolomics: a collaborative research infrastructure for computational metabolomics.

    PubMed

    Giacomoni, Franck; Le Corguillé, Gildas; Monsoor, Misharl; Landi, Marion; Pericard, Pierre; Pétéra, Mélanie; Duperier, Christophe; Tremblay-Franco, Marie; Martin, Jean-François; Jacob, Daniel; Goulitquer, Sophie; Thévenot, Etienne A; Caron, Christophe

    2015-05-01

    The complex, rapidly evolving field of computational metabolomics calls for collaborative infrastructures where the large volume of new algorithms for data pre-processing, statistical analysis and annotation can be readily integrated regardless of the implementation language, evaluated on reference datasets and chained to build ad hoc workflows for users. We have developed Workflow4Metabolomics (W4M), the first fully open-source and collaborative online platform for computational metabolomics. W4M is a virtual research environment built upon the Galaxy web-based platform technology. It enables ergonomic integration, exchange and running of individual modules and workflows. Alternatively, the whole W4M framework and computational tools can be downloaded as a virtual machine for local installation. The http://workflow4metabolomics.org homepage enables users to open a private account and access the infrastructure. W4M is developed and maintained by the French Bioinformatics Institute (IFB) and the French Metabolomics and Fluxomics Infrastructure (MetaboHUB). contact@workflow4metabolomics.org. © The Author 2014. Published by Oxford University Press.

  17. Improving Access to Data While Protecting Confidentiality: Prospects for the Future.

    ERIC Educational Resources Information Center

    Duncan, George T.; Pearson, Robert W.

    Providing researchers, especially those in the social sciences, with access to publicly collected microdata furthers research while advancing public policy goals in a democratic society. However, while technological improvements have eased remote access to these databases and enabled computer-using researchers to perform sophisticated statistical…

  18. Podcasting Is Dead. Long Live Video!

    ERIC Educational Resources Information Center

    Cann, Alan J.

    2007-01-01

    Podcasting (an automatic mechanism whereby multimedia computer files are transferred from a server to a client, en.wikipedia.org/wiki/podcast) is becoming increasingly popular in education. Although podcasts enable students and teachers to share information anywhere at any time, the most frequent application of the technology to date has been to…

  19. An Investigation into Specifying Service Level Agreements for Provisioning Cloud Computing Services

    DTIC Science & Technology

    2012-12-01

    …the service delivery be measured? 3. Key Performance Indicators (KPIs): describe the KPIs and the responsible party for producing the KPIs. 4. …service level objectives (SLOs) that are evaluated according to measurable Key Performance Indicators (KPIs). Automatic SLA protection enables further…
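    A hedged sketch of the SLO/KPI evaluation pattern described in the excerpt: each service level objective is checked against a measured key performance indicator. The metric names and thresholds below are invented for illustration, not taken from the report.

    ```python
    # Evaluate service level objectives against measured KPIs.
    from dataclasses import dataclass

    @dataclass
    class SLO:
        kpi: str                      # name of the measured indicator
        threshold: float              # contractual target
        higher_is_better: bool = True

    def evaluate(slos, measurements):
        """Return a dict of KPI name -> 'met'/'breached' given measured values."""
        report = {}
        for slo in slos:
            value = measurements[slo.kpi]
            ok = value >= slo.threshold if slo.higher_is_better else value <= slo.threshold
            report[slo.kpi] = "met" if ok else "breached"
        return report

    slos = [SLO("availability_pct", 99.9), SLO("response_ms", 200, higher_is_better=False)]
    print(evaluate(slos, {"availability_pct": 99.95, "response_ms": 340}))
    # {'availability_pct': 'met', 'response_ms': 'breached'}
    ```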

  20. The Aggregate Exposure Pathway (AEP): A conceptual framework for advancing exposure science research and transforming risk assessment

    EPA Science Inventory

    Recent advances in analytical methods, biomarker discovery, cell-based assay development, computational tools, sensors/monitors, and omics technologies have enabled new streams of exposure and toxicity data to be generated at higher volumes and speeds. These new data offer the opport...

  1. The Mercury System: Embedding Computation into Disk Drives

    DTIC Science & Technology

    2004-08-20

    …enabling technologies to build extremely fast data search engines. We do this by moving the search closer to the data and performing it in hardware… engine searches in parallel across a disk or disk surface. 2. System parallelism: searching is off-loaded to search engines and the main processor can…

  2. A Collaborative Model for Ubiquitous Learning Environments

    ERIC Educational Resources Information Center

    Barbosa, Jorge; Barbosa, Debora; Rabello, Solon

    2016-01-01

    Use of mobile devices and widespread adoption of wireless networks have enabled the emergence of Ubiquitous Computing. Application of this technology to improving education strategies gave rise to Ubiquitous e-Learning, also known as Ubiquitous Learning. There are several approaches to organizing ubiquitous learning environments, but most of them…

  3. (In)Forming: The Affordances of Digital Fabrication in Architectural Education

    ERIC Educational Resources Information Center

    Cabrinha, Mark Newell

    2010-01-01

    This research focuses on the effect of technology on the culture of architectural education through the lens of digital fabrication (CAD/CAM). As the computer was introduced into design education long before digital fabrication was accessible, design culture has prioritized image over material experience. Digital fabrication enables a material…

  4. Data base management system configuration specification. [computer storage devices

    NASA Technical Reports Server (NTRS)

    Neiers, J. W.

    1979-01-01

    The functional requirements and the configuration of the data base management system are described. Techniques and technology which will enable more efficient and timely transfer of useful data from the sensor to the user, extraction of information by the user, and exchange of information among the users are demonstrated.

  5. Exploring Moodle Functionality for Managing Open Distance Learning E-Assessments

    ERIC Educational Resources Information Center

    Koneru, Indira

    2017-01-01

    Current and emerging technologies enable Open Distance Learning (ODL) institutions to integrate e-Learning in innovative ways and add value to the existing teaching-learning and assessment processes. ODL e-Assessment systems have evolved from Computer Assisted/Aided Assessment (CAA) systems through intelligent assessment and feedback systems.…

  6. Virtualising the Quantitative Research Methods Course: An Island-Based Approach

    ERIC Educational Resources Information Center

    Baglin, James; Reece, John; Baker, Jenalle

    2015-01-01

    Many recent improvements in pedagogical practice have been enabled by the rapid development of innovative technologies, particularly for teaching quantitative research methods and statistics. This study describes the design, implementation, and evaluation of a series of specialised computer laboratory sessions. The sessions combined the use of an…

  7. To the Cloud! A Grassroots Proposal to Accelerate Brain Science Discovery

    PubMed Central

    Vogelstein, Joshua T.; Mensh, Brett; Hausser, Michael; Spruston, Nelson; Evans, Alan; Kording, Konrad; Amunts, Katrin; Ebell, Christoph; Muller, Jeff; Telefont, Martin; Hill, Sean; Koushika, Sandhya P.; Cali, Corrado; Valdés-Sosa, Pedro Antonio; Littlewood, Peter; Koch, Christof; Saalfeld, Stephan; Kepecs, Adam; Peng, Hanchuan; Halchenko, Yaroslav O.; Kiar, Gregory; Poo, Mu-Ming; Poline, Jean-Baptiste; Milham, Michael P.; Schaffer, Alyssa Picchini; Gidron, Rafi; Okano, Hideyuki; Calhoun, Vince D; Chun, Miyoung; Kleissas, Dean M.; Vogelstein, R. Jacob; Perlman, Eric; Burns, Randal; Huganir, Richard; Miller, Michael I.

    2018-01-01

    The revolution in neuroscientific data acquisition is creating an analysis challenge. We propose leveraging cloud-computing technologies to enable large-scale neurodata storing, exploring, analyzing, and modeling. This utility will empower scientists globally to generate and test theories of brain function and dysfunction. PMID:27810005

  8. Quantum Sensors at the Intersections of Fundamental Science, Quantum Information Science & Computing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chattopadhyay, Swapan; Falcone, Roger; Walsworth, Ronald

    Over the last twenty years, there has been a boom in quantum science - i.e., the development and exploitation of quantum systems to enable qualitatively and quantitatively new capabilities, with high-impact applications and fundamental insights that can range across all areas of science and technology.

  9. Integrating Blended Teaching and Learning to Enhance Graduate Attributes

    ERIC Educational Resources Information Center

    Hermens, Antoine; Clarke, Elizabeth

    2009-01-01

    Purpose: The purpose of this paper is to explore the role of computer based business simulations in higher education as innovative tools of teaching and learning to enhance students' practical understanding of real business problems. Whether the integration of business simulation technologies will enable significant innovation in teaching and…

  10. A practical approach to virtualization in HEP

    NASA Astrophysics Data System (ADS)

    Buncic, P.; Aguado Sánchez, C.; Blomer, J.; Harutyunyan, A.; Mudrinic, M.

    2011-01-01

    In the attempt to solve the problem of processing data coming from LHC experiments at CERN at a rate of 15 PB per year, for almost a decade the High Energy Physics (HEP) community has focused its efforts on the development of the Worldwide LHC Computing Grid. This generated great interest and expectations, promising to revolutionize computing. Meanwhile, having initially taken part in the Grid standardization process, industry has moved in a different direction and started promoting the Cloud Computing paradigm, which aims to solve problems on a similar scale and in an equally seamless way as was expected of the idealized Grid approach. A key enabling technology behind Cloud computing is server virtualization. In early 2008, an R&D project was established in the PH-SFT group at CERN to investigate how virtualization technology could be used to improve and simplify the daily interaction of physicists with experiment software frameworks and the Grid infrastructure. In this article we shall first briefly compare Grid and Cloud computing paradigms and then summarize the results of the R&D activity, pointing out where and how virtualization technology could be effectively used in our field in order to maximize practical benefits whilst avoiding potential pitfalls.

  11. Riding the Hype Wave: Evaluating new AI Techniques for their Applicability in Earth Science

    NASA Astrophysics Data System (ADS)

    Ramachandran, R.; Zhang, J.; Maskey, M.; Lee, T. J.

    2016-12-01

    Every few years a new technology rides the hype wave generated by the computer science community. Converts to this new technology who surface from both the science community and the informatics community promulgate that it can radically improve or even change the existing scientific process. Recent examples of new technology following in the footsteps of "big data" now include deep learning algorithms and knowledge graphs. Deep learning algorithms mimic the human brain and process information through multiple stages of transformation and representation. These algorithms are able to learn complex functions that map pixels directly to outputs without relying on human-crafted features and solve some of the complex classification problems that exist in science. Similarly, knowledge graphs aggregate information around defined topics that enable users to resolve their query without having to navigate and assemble information manually. Knowledge graphs could potentially be used in scientific research to assist in hypothesis formulation, testing, and review. The challenge for the Earth science research community is to evaluate these new technologies by asking the right questions and considering what-if scenarios. What is this new technology enabling/providing that is innovative and different? Can one justify the adoption costs with respect to the research returns? Since nothing comes for free, utilizing a new technology entails adoption costs that may outweigh the benefits. Furthermore, these technologies may require significant computing infrastructure in order to be utilized effectively. Results from two different projects will be presented along with lessons learned from testing these technologies. The first project primarily evaluates deep learning techniques for different applications of image retrieval within Earth science while the second project builds a prototype knowledge graph constructed for Hurricane science.
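    As a concrete illustration of the first project's image-retrieval use case, the sketch below reduces images to feature vectors and retrieves the nearest neighbours by cosine similarity. In the project the features would come from a deep network; here they are random stand-ins, and the catalog size and dimensionality are arbitrary.

    ```python
    # Nearest-neighbour image retrieval over feature vectors (cosine similarity).
    # Features are random placeholders for deep-network embeddings.
    import numpy as np

    rng = np.random.default_rng(0)
    catalog = rng.normal(size=(1000, 128))            # 1000 archived image features
    catalog /= np.linalg.norm(catalog, axis=1, keepdims=True)

    def retrieve(query: np.ndarray, k: int = 5) -> np.ndarray:
        """Return indices of the k most similar catalog images."""
        q = query / np.linalg.norm(query)
        scores = catalog @ q
        return np.argsort(scores)[::-1][:k]

    print(retrieve(rng.normal(size=128)))
    ```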

  12. Challenges and Emerging Concepts in the Development of Adaptive, Computer-based Tutoring Systems for Team Training

    DTIC Science & Technology

    2011-11-01

    …sensing technologies are showing promise as enablers of computer-based perception of each team member's behavior and physiology, with the goal of predicting unobserved variables (e.g., cognitive state)… an essential element of team performance. The perception that other team members may be unable to perform their tasks is detrimental to trust and…

  13. Proactive human-computer collaboration for information discovery

    NASA Astrophysics Data System (ADS)

    DiBona, Phil; Shilliday, Andrew; Barry, Kevin

    2016-05-01

    Lockheed Martin Advanced Technology Laboratories (LM ATL) is researching methods, representations, and processes for human/autonomy collaboration to scale analysis and hypothesis substantiation for intelligence analysts. This research establishes a machine-readable hypothesis representation that is commonsensical to the human analyst. The representation unifies context between the human and computer, enabling autonomy, in the form of analytic software, to support the analyst by proactively acquiring, assessing, and organizing high-value information that is needed to inform and substantiate hypotheses.
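    A minimal, hypothetical sketch of what a machine-readable hypothesis might look like: a claim plus the evidence slots that would substantiate or refute it, so analytic software can see which slots still need information. All field names are invented; this is not LM ATL's representation.

    ```python
    # A hypothesis as a claim with open/filled evidence slots.
    from dataclasses import dataclass, field

    @dataclass
    class EvidenceSlot:
        question: str
        supporting: list = field(default_factory=list)
        refuting: list = field(default_factory=list)

        def is_open(self) -> bool:
            return not (self.supporting or self.refuting)

    @dataclass
    class Hypothesis:
        claim: str
        slots: list

        def open_information_needs(self):
            # Autonomy could proactively acquire data for these open slots.
            return [s.question for s in self.slots if s.is_open()]

    h = Hypothesis("Facility X is expanding production",
                   [EvidenceSlot("Has truck traffic increased?"),
                    EvidenceSlot("Are new structures visible in imagery?")])
    print(h.open_information_needs())
    ```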

  14. Information Systems for NASA's Aeronautics and Space Enterprises

    NASA Technical Reports Server (NTRS)

    Kutler, Paul

    1998-01-01

    The aerospace industry is being challenged to reduce costs and development time as well as utilize new technologies to improve product performance. Information technology (IT) is the key to providing revolutionary solutions to the challenges posed by the increasing complexity of NASA's aeronautics and space missions and the sophisticated nature of the systems that enable them. The NASA Ames vision is to develop technologies enabling the information age, expanding the frontiers of knowledge for aeronautics and space, improving America's competitive position, and inspiring future generations. Ames' missions to accomplish that vision include: 1) performing research to support the American aviation community through the unique integration of computation, experimentation, simulation and flight testing, 2) studying the health of our planet, understanding living systems in space and the origins of the universe, and developing technologies for space flight, and 3) researching, developing and delivering information technologies and applications. Information technology may be defined as the use of advanced computing systems to generate data, analyze data, transform data into knowledge, and serve as an aid in the decision-making process. The knowledge from transformed data can be displayed in visual, virtual and multimedia environments. The decision-making process can be fully autonomous or aided by cognitive processes, i.e., computational aids designed to leverage human capacities. IT systems can learn as they go, developing the capability to make decisions or aid the decision-making process on the basis of experiences gained using limited data inputs. In the future, information systems will be used to aid space mission synthesis, virtual aerospace system design, aid damaged aircraft during landing, perform robotic surgery, and monitor the health and status of spacecraft and planetary probes. NASA Ames, through the Center of Excellence for Information Technology Office, is leading the effort in pursuit of revolutionary, IT-based approaches to satisfying NASA's aeronautics and space requirements. The objective of the effort is to incorporate information technologies within each of the Agency's four Enterprises, i.e., Aeronautics and Space Transportation Technology, Earth Science, Human Exploration and Development of Space, and Space Sciences. The end results of these efforts for Enterprise programs and projects should be reduced cost, enhanced mission capability and expedited mission completion.

  15. Aeroelastic Tailoring of Transport Aircraft Wings: State-of-the-Art and Potential Enabling Technologies

    NASA Technical Reports Server (NTRS)

    Jutte, Christine; Stanford, Bret K.

    2014-01-01

    This paper provides a brief overview of the state-of-the-art for aeroelastic tailoring of subsonic transport aircraft and offers additional resources on related research efforts. Emphasis is placed on aircraft having straight or aft swept wings. The literature covers computational synthesis tools developed for aeroelastic tailoring and numerous design studies focused on discovering new methods for passive aeroelastic control. Several new structural and material technologies are presented as potential enablers of aeroelastic tailoring, including selectively reinforced materials, functionally graded materials, fiber tow steered composite laminates, and various nonconventional structural designs. In addition, smart materials and structures whose properties or configurations change in response to external stimuli are presented as potential active approaches to aeroelastic tailoring.

  16. Subject-enabled analytics model on measurement statistics in health risk expert system for public health informatics.

    PubMed

    Chung, Chi-Jung; Kuo, Yu-Chen; Hsieh, Yun-Yu; Li, Tsai-Chung; Lin, Cheng-Chieh; Liang, Wen-Miin; Liao, Li-Na; Li, Chia-Ing; Lin, Hsueh-Chun

    2017-11-01

    This study applied open source technology to establish a subject-enabled analytics model that can enhance measurement statistics of case studies with public health data in cloud computing. The infrastructure of the proposed model comprises three domains: 1) the health measurement data warehouse (HMDW) for the case study repository, 2) the self-developed modules of online health risk information statistics (HRIStat) for cloud computing, and 3) the prototype of a Web-based process automation system in statistics (PASIS) for the health risk assessment of case studies with subject-enabled evaluation. The system design employed freeware including Java applications, MySQL, and R packages to drive a health risk expert system (HRES). In the design, the HRIStat modules enforce the typical analytics methods for biomedical statistics, and the PASIS interfaces enable process automation of the HRES for cloud computing. The Web-based model supports two modes, step-by-step analysis and an auto-computing process, for preliminary evaluation and real-time computation, respectively. The proposed model was evaluated by recomputing prior research on the epidemiological measurement of diseases caused by either heavy metal exposures in the environment or clinical complications in hospital. The simulation validity was confirmed with commercial statistics software. The model was installed in a stand-alone computer and in a cloud-server workstation to verify computing performance for a data amount of more than 230K sets. Both setups reached an efficiency of about 10^5 sets per second. The Web-based PASIS interface can be used for cloud computing, and the HRIStat module can be flexibly expanded with advanced subjects for measurement statistics. The analytics procedure of the HRES prototype is capable of providing assessment criteria prior to estimating the potential risk to public health. Copyright © 2017 Elsevier B.V. All rights reserved.
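    A hedged sketch of the kind of measurement statistic and throughput figure described above: computing a relative risk over many record sets and timing how many sets per second are processed. The data, field layout, and risk measure are invented for illustration, not the HRES implementation.

    ```python
    # Compute a relative risk for many record sets and report throughput.
    import random
    import time

    def relative_risk(exposed_cases, exposed_n, unexposed_cases, unexposed_n):
        return (exposed_cases / exposed_n) / (unexposed_cases / unexposed_n)

    random.seed(1)
    sets = [(random.randint(5, 50), 1000, random.randint(5, 50), 1000)
            for _ in range(100_000)]

    start = time.perf_counter()
    results = [relative_risk(*s) for s in sets]
    elapsed = time.perf_counter() - start
    print(f"{len(sets) / elapsed:,.0f} sets per second")
    ```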

  17. Community Petascale Project for Accelerator Science and Simulation: Advancing Computational Science for Future Accelerators and Accelerator Technologies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Spentzouris, P.; /Fermilab; Cary, J.

    The design and performance optimization of particle accelerators are essential for the success of the DOE scientific program in the next decade. Particle accelerators are very complex systems whose accurate description involves a large number of degrees of freedom and requires the inclusion of many physics processes. Building on the success of the SciDAC-1 Accelerator Science and Technology project, the SciDAC-2 Community Petascale Project for Accelerator Science and Simulation (ComPASS) is developing a comprehensive set of interoperable components for beam dynamics, electromagnetics, electron cooling, and laser/plasma acceleration modelling. ComPASS is providing accelerator scientists the tools required to enable the necessary accelerator simulation paradigm shift from high-fidelity single physics process modeling (covered under SciDAC1) to high-fidelity multiphysics modeling. Our computational frameworks have been used to model the behavior of a large number of accelerators and accelerator R&D experiments, assisting both their design and performance optimization. As parallel computational applications, the ComPASS codes have been shown to make effective use of thousands of processors. ComPASS is in the first year of executing its plan to develop the next-generation HPC accelerator modeling tools. ComPASS aims to develop an integrated simulation environment that will utilize existing and new accelerator physics modules with petascale capabilities, by employing modern computing and solver technologies. The ComPASS vision is to deliver to accelerator scientists a virtual accelerator and virtual prototyping modeling environment, with the necessary multiphysics, multiscale capabilities. The plan for this development includes delivering accelerator modeling applications appropriate for each stage of the ComPASS software evolution. Such applications are already being used to address challenging problems in accelerator design and optimization. The ComPASS organization for software development and applications accounts for the natural domain areas (beam dynamics, electromagnetics, and advanced acceleration), and all areas depend on the enabling technologies activities, such as solvers and component technology, to deliver the desired performance and integrated simulation environment. The ComPASS applications focus on computationally challenging problems important for design or performance optimization to all major HEP, NP, and BES accelerator facilities. With the cost and complexity of particle accelerators rising, the use of computation to optimize their designs and find improved operating regimes becomes essential, potentially leading to significant cost savings with modest investment.

  18. Dual-energy computed tomography for detection of coronary artery disease

    PubMed Central

    Danad, Ibrahim; Ó Hartaigh, Bríain; Min, James K.

    2016-01-01

    Recent advances in computed tomography (CT) technology have fulfilled the prerequisites for the cardiac application of dual-energy CT (DECT) imaging. By exploiting the unique characteristics of materials when exposed to two different x-ray energies, DECT holds great promise for the diagnosis and management of coronary artery disease. It allows for the assessment of myocardial perfusion to discern the hemodynamic significance of coronary disease and possesses high accuracy for the detection and characterization of coronary plaques, while facilitating reductions in radiation dose. As such, DECT has enabled cardiac CT to advance beyond the mere detection of coronary stenosis, expanding its role in the evaluation and management of coronary atherosclerosis. PMID:26549789

  19. Using Speech Recognition to Enhance the Tongue Drive System Functionality in Computer Access

    PubMed Central

    Huo, Xueliang; Ghovanloo, Maysam

    2013-01-01

    The Tongue Drive System (TDS) is a wireless, tongue-operated assistive technology (AT) that can enable people with severe physical disabilities to access computers and drive powered wheelchairs using their volitional tongue movements. TDS offers six discrete commands, simultaneously available to the users, for pointing and typing as a substitute for mouse and keyboard in computer access, respectively. To enhance the TDS performance in typing, we have added a microphone, an audio codec, and a wireless audio link to its readily available 3-axial magnetic sensor array, and combined it with commercially available speech recognition software, Dragon NaturallySpeaking, which is regarded as one of the most efficient means of text entry. Our preliminary evaluations indicate that the combined TDS and speech recognition technologies can provide end users with significantly higher performance than using each technology alone, particularly in completing tasks that require both pointing and text entry, such as web surfing. PMID:22255801
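    An illustrative sketch of combining the two input streams described above: discrete tongue commands drive the pointer while recognized speech enters text. The command names and event format are hypothetical, not the TDS or Dragon APIs.

    ```python
    # Route tongue-command events to pointer moves and speech events to text entry.
    POINTER_COMMANDS = {"left": (-10, 0), "right": (10, 0), "up": (0, -10), "down": (0, 10)}

    class CombinedInput:
        def __init__(self):
            self.pointer = [0, 0]
            self.text = []

        def handle(self, event: dict):
            if event["source"] == "tds" and event["command"] in POINTER_COMMANDS:
                dx, dy = POINTER_COMMANDS[event["command"]]
                self.pointer[0] += dx
                self.pointer[1] += dy
            elif event["source"] == "speech":
                self.text.append(event["utterance"])

    ui = CombinedInput()
    for e in [{"source": "tds", "command": "right"},
              {"source": "speech", "utterance": "search web surfing"}]:
        ui.handle(e)
    print(ui.pointer, " ".join(ui.text))
    ```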

  20. Multimodal approaches for emotion recognition: a survey

    NASA Astrophysics Data System (ADS)

    Sebe, Nicu; Cohen, Ira; Gevers, Theo; Huang, Thomas S.

    2004-12-01

    Recent technological advances have enabled human users to interact with computers in ways previously unimaginable. Beyond the confines of the keyboard and mouse, new modalities for human-computer interaction such as voice, gesture, and force-feedback are emerging. Despite important advances, one necessary ingredient for natural interaction is still missing: emotions. Emotions play an important role in human-to-human communication and interaction, allowing people to express themselves beyond the verbal domain. The ability to understand human emotions is desirable for the computer in several applications. This paper explores new ways of human-computer interaction that enable the computer to be more aware of the user's emotional and attentional expressions. We present the basic research in the field and the recent advances into the emotion recognition from facial, voice, and physiological signals, where the different modalities are treated independently. We then describe the challenging problem of multimodal emotion recognition and we advocate the use of probabilistic graphical models when fusing the different modalities. We also discuss the difficult issues of obtaining reliable affective data, obtaining ground truth for emotion recognition, and the use of unlabeled data.
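    A minimal sketch of probabilistic fusion of modalities: per-modality class posteriors are combined under a conditional-independence assumption, a very simple stand-in for the richer graphical models advocated above. The emotion labels and numbers are invented for illustration.

    ```python
    # Fuse per-modality posteriors assuming conditional independence given the class.
    import numpy as np

    EMOTIONS = ["happy", "angry", "neutral"]

    def fuse(posteriors, prior):
        """Combine per-modality posteriors into a single class distribution."""
        joint = prior.copy()
        for p in posteriors:
            joint *= p / prior        # recover likelihood ratios, then re-weight
        return joint / joint.sum()

    prior = np.array([1 / 3, 1 / 3, 1 / 3])
    face  = np.array([0.6, 0.1, 0.3])   # from a facial-expression classifier
    voice = np.array([0.5, 0.2, 0.3])   # from a prosody classifier
    fused = fuse([face, voice], prior)
    print(dict(zip(EMOTIONS, fused.round(3))))
    ```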

  1. Multimodal approaches for emotion recognition: a survey

    NASA Astrophysics Data System (ADS)

    Sebe, Nicu; Cohen, Ira; Gevers, Theo; Huang, Thomas S.

    2005-01-01

    Recent technological advances have enabled human users to interact with computers in ways previously unimaginable. Beyond the confines of the keyboard and mouse, new modalities for human-computer interaction such as voice, gesture, and force-feedback are emerging. Despite important advances, one necessary ingredient for natural interaction is still missing: emotions. Emotions play an important role in human-to-human communication and interaction, allowing people to express themselves beyond the verbal domain. The ability to understand human emotions is desirable for the computer in several applications. This paper explores new ways of human-computer interaction that enable the computer to be more aware of the user's emotional and attentional expressions. We present the basic research in the field and the recent advances into the emotion recognition from facial, voice, and physiological signals, where the different modalities are treated independently. We then describe the challenging problem of multimodal emotion recognition and we advocate the use of probabilistic graphical models when fusing the different modalities. We also discuss the difficult issues of obtaining reliable affective data, obtaining ground truth for emotion recognition, and the use of unlabeled data.

  2. Using computer decision support systems in NHS emergency and urgent care: ethnographic study using normalisation process theory

    PubMed Central

    2013-01-01

    Background: Information and communication technologies (ICTs) are often proposed as ‘technological fixes’ for problems facing healthcare. They promise to deliver services more quickly and cheaply. Yet research on the implementation of ICTs reveals a litany of delays, compromises and failures. Case studies have established that these technologies are difficult to embed in everyday healthcare. Methods: We undertook an ethnographic comparative analysis of a single computer decision support system in three different settings to understand the implementation and everyday use of this technology, which is designed to deal with calls to emergency and urgent care services. We examined the deployment of this technology in an established 999 ambulance call-handling service, a new single point of access for urgent care and an established general practice out-of-hours service. We used Normalization Process Theory as a framework to enable systematic cross-case analysis. Results: Our data comprise nearly 500 hours of observation, interviews with 64 call-handlers and stakeholders, and documents about the technology and settings. The technology has been implemented and is used distinctively in each setting, reflecting important differences between work and contexts. Using Normalisation Process Theory we show how the work (collective action) of implementing the system and maintaining its routine use was enabled by a range of actors who established coherence for the technology, secured buy-in (cognitive participation) and engaged in on-going appraisal and adjustment (reflexive monitoring). Conclusions: Huge effort was expended and continues to be required to implement and keep this technology in use. This innovation must be understood both as a computer technology and as a set of practices related to that technology, kept in place by a network of actors in particular contexts. While technologies can be ‘made to work’ in different settings, successful implementation has been achieved, and will only be maintained, through the efforts of those involved in the specific settings and if the wider context continues to support the coherence, cognitive participation, and reflective monitoring processes that surround this collective action. Implementation is more than simply putting technologies in place – it requires new resources and considerable effort, perhaps on an on-going basis. PMID:23522021

  3. GPU Accelerated Prognostics

    NASA Technical Reports Server (NTRS)

    Gorospe, George E., Jr.; Daigle, Matthew J.; Sankararaman, Shankar; Kulkarni, Chetan S.; Ng, Eley

    2017-01-01

    Prognostic methods enable operators and maintainers to predict the future performance for critical systems. However, these methods can be computationally expensive and may need to be performed each time new information about the system becomes available. In light of these computational requirements, we have investigated the application of graphics processing units (GPUs) as a computational platform for real-time prognostics. Recent advances in GPU technology have reduced cost and increased the computational capability of these highly parallel processing units, making them more attractive for the deployment of prognostic software. We present a survey of model-based prognostic algorithms with considerations for leveraging the parallel architecture of the GPU and a case study of GPU-accelerated battery prognostics with computational performance results.
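    The sketch below shows the data-parallel structure that makes prognostics a good fit for GPUs: many independent Monte Carlo samples of a degradation model. It is vectorized with NumPy on the CPU purely for illustration; the battery model and its constants are invented and are not the paper's algorithms or performance results.

    ```python
    # Monte Carlo end-of-life prediction over many independent samples.
    import numpy as np

    rng = np.random.default_rng(0)
    n_samples = 100_000
    capacity = np.full(n_samples, 1.0)                 # normalized battery capacity
    fade_rate = rng.normal(2e-4, 5e-5, n_samples)      # per-cycle capacity fade
    threshold, cycles = 0.8, 0

    eol = np.zeros(n_samples)                          # end-of-life cycle count
    alive = np.ones(n_samples, dtype=bool)
    while alive.any() and cycles < 5000:
        cycles += 1
        capacity[alive] -= fade_rate[alive]
        newly_dead = alive & (capacity <= threshold)
        eol[newly_dead] = cycles
        alive &= ~newly_dead

    print(f"median predicted end of life: {np.median(eol[eol > 0]):.0f} cycles")
    ```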

  4. Photonic-Enabled RF Canceller with Tunable Time-Delay Taps

    DTIC Science & Technology

    2016-12-05

    ports indicated in Fig. 1. The analyzer was configured to sweep 10 MHz to 6 GHz with +10 dBm of output power , and compute the time-domain transmission ...Laboratory Lexington, Massachusetts, USA Abstract—Future 5G wireless networks can benefit from the use of in-band full-duplex technologies that allow access...microwave photonics, RF cancellation. I. INTRODUCTION In-Band Full-Duplex (IBFD) technologies are being consid- ered for 5th generation (5G) wireless

  5. NASA information sciences and human factors program

    NASA Technical Reports Server (NTRS)

    1991-01-01

    The Data Systems Program consists of research and technology devoted to controlling, processing, storing, manipulating, and analyzing space-derived data. The objectives of the program are to provide the technology advancements needed to enable affordable utilization of space-derived data, to substantially increase on-board processing and recording capability for future missions, and to provide the high-speed, high-volume computational systems anticipated for missions such as the evolutionary Space Station and the Earth Observing System.

  6. Systems cell biology

    PubMed Central

    Mast, Fred D.; Ratushny, Alexander V.

    2014-01-01

    Systems cell biology melds high-throughput experimentation with quantitative analysis and modeling to understand many critical processes that contribute to cellular organization and dynamics. Recently, there have been several advances in technology and in the application of modeling approaches that enable the exploration of the dynamic properties of cells. Merging technology and computation offers an opportunity to objectively address unsolved cellular mechanisms, and has revealed emergent properties and helped to gain a more comprehensive and fundamental understanding of cell biology. PMID:25225336

  7. Friction Stir Welding and Processing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hovanski, Yuri; Carsley, John; Clarke, Kester D.

    2015-05-01

    With nearly twenty years of international research and collaboration in friction stir welding (FSW) and processing, industrial applications have spread into nearly every feasible market. Currently, applications exist in the aerospace, railway, automotive, personal computer, technology, marine, cutlery, and construction markets, as well as several others. Implementation of FSW has demonstrated diverse opportunities ranging from enabling new materials to reducing the production costs of current welding technologies by enabling condensed packaging solutions for traditional fabrication and assembly. TMS has sponsored focused instruction and communication in this technology area for more than fifteen years, with leadership from the Shaping and Forming Committee, which organizes a biennial symposium each odd year at the annual meeting. A focused publication produced from each of these symposia now comprises eight volumes detailing the primary research and development activities in this area over the last two decades. The articles assembled herein focus on both recent developments and technology reviews of several key markets from international experts in this area.

  8. High-speed quantum networking by ship

    NASA Astrophysics Data System (ADS)

    Devitt, Simon J.; Greentree, Andrew D.; Stephens, Ashley M.; van Meter, Rodney

    2016-11-01

    Networked entanglement is an essential component for a plethora of quantum computation and communication protocols. Direct transmission of quantum signals over long distances is prevented by fibre attenuation and the no-cloning theorem, motivating the development of quantum repeaters, designed to purify entanglement, extending its range. Quantum repeaters have been demonstrated over short distances, but error-corrected, global repeater networks with high bandwidth require new technology. Here we show that error-corrected quantum memories installed in cargo containers and carried by ship can provide a flexible connection between local networks, enabling low-latency, high-fidelity quantum communication across global distances at higher bandwidths than previously proposed. With demonstrations of technology with sufficient fidelity to enable topological error-correction, implementation of the quantum memories is within reach, and bandwidth increases with improvements in fabrication. Our approach to quantum networking avoids technological restrictions of repeater deployment, providing an alternate path to a worldwide Quantum Internet.
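    A back-of-the-envelope illustration of the argument: a ship's effective bandwidth is the number of qubits it delivers divided by its transit time, while a direct fibre's rate decays exponentially with distance. All numbers below are invented for illustration and are not taken from the paper.

    ```python
    # Compare ship-carried quantum memories against direct fibre transmission.
    def ship_bandwidth(qubits_per_container, containers, transit_days):
        return qubits_per_container * containers / (transit_days * 86400)  # qubits/s

    def fibre_bandwidth(source_rate_hz, distance_km, attenuation_db_per_km=0.2):
        loss = 10 ** (-attenuation_db_per_km * distance_km / 10)
        return source_rate_hz * loss

    print(f"ship : {ship_bandwidth(1e12, 100, 10):.3g} qubits/s")
    print(f"fibre: {fibre_bandwidth(1e9, 10_000):.3g} qubits/s")  # effectively zero
    ```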

  9. High-speed quantum networking by ship

    PubMed Central

    Devitt, Simon J.; Greentree, Andrew D.; Stephens, Ashley M.; Van Meter, Rodney

    2016-01-01

    Networked entanglement is an essential component for a plethora of quantum computation and communication protocols. Direct transmission of quantum signals over long distances is prevented by fibre attenuation and the no-cloning theorem, motivating the development of quantum repeaters, designed to purify entanglement, extending its range. Quantum repeaters have been demonstrated over short distances, but error-corrected, global repeater networks with high bandwidth require new technology. Here we show that error-corrected quantum memories installed in cargo containers and carried by ship can provide a flexible connection between local networks, enabling low-latency, high-fidelity quantum communication across global distances at higher bandwidths than previously proposed. With demonstrations of technology with sufficient fidelity to enable topological error-correction, implementation of the quantum memories is within reach, and bandwidth increases with improvements in fabrication. Our approach to quantum networking avoids technological restrictions of repeater deployment, providing an alternate path to a worldwide Quantum Internet. PMID:27805001

  10. High-speed quantum networking by ship.

    PubMed

    Devitt, Simon J; Greentree, Andrew D; Stephens, Ashley M; Van Meter, Rodney

    2016-11-02

    Networked entanglement is an essential component for a plethora of quantum computation and communication protocols. Direct transmission of quantum signals over long distances is prevented by fibre attenuation and the no-cloning theorem, motivating the development of quantum repeaters, designed to purify entanglement, extending its range. Quantum repeaters have been demonstrated over short distances, but error-corrected, global repeater networks with high bandwidth require new technology. Here we show that error-corrected quantum memories installed in cargo containers and carried by ship can provide a flexible connection between local networks, enabling low-latency, high-fidelity quantum communication across global distances at higher bandwidths than previously proposed. With demonstrations of technology with sufficient fidelity to enable topological error-correction, implementation of the quantum memories is within reach, and bandwidth increases with improvements in fabrication. Our approach to quantum networking avoids technological restrictions of repeater deployment, providing an alternate path to a worldwide Quantum Internet.

  11. Neuroadaptive technology enables implicit cursor control based on medial prefrontal cortex activity.

    PubMed

    Zander, Thorsten O; Krol, Laurens R; Birbaumer, Niels P; Gramann, Klaus

    2016-12-27

    The effectiveness of today's human-machine interaction is limited by a communication bottleneck as operators are required to translate high-level concepts into a machine-mandated sequence of instructions. In contrast, we demonstrate effective, goal-oriented control of a computer system without any form of explicit communication from the human operator. Instead, the system generated the necessary input itself, based on real-time analysis of brain activity. Specific brain responses were evoked by violating the operators' expectations to varying degrees. The evoked brain activity demonstrated detectable differences reflecting congruency with or deviations from the operators' expectations. Real-time analysis of this activity was used to build a user model of those expectations, thus representing the optimal (expected) state as perceived by the operator. Based on this model, which was continuously updated, the computer automatically adapted itself to the expectations of its operator. Further analyses showed this evoked activity to originate from the medial prefrontal cortex and to exhibit a linear correspondence to the degree of expectation violation. These findings extend our understanding of human predictive coding and provide evidence that the information used to generate the user model is task-specific and reflects goal congruency. This paper demonstrates a form of interaction without any explicit input by the operator, enabling computer systems to become neuroadaptive, that is, to automatically adapt to specific aspects of their operator's mindset. Neuroadaptive technology significantly widens the communication bottleneck and has the potential to fundamentally change the way we interact with technology.
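    A highly simplified, purely illustrative sketch of the closed loop described above: each candidate action receives a graded estimate of how strongly it violates the operator's expectation, and the system drifts toward the least-surprising action. In the study that estimate comes from real-time EEG analysis; here it is a random stand-in, and the greedy update omits any exploration strategy.

    ```python
    # Maintain a simple user model of expected violation per action and adapt to it.
    import random

    random.seed(0)
    ACTIONS = ["left", "right", "up", "down"]
    expected_violation = {a: 0.5 for a in ACTIONS}     # the evolving user model
    TARGET = "right"                                   # the operator's (hidden) intent

    def measured_violation(action: str) -> float:
        # Stand-in for the graded brain response to an unexpected move.
        return random.gauss(0.1 if action == TARGET else 0.8, 0.1)

    for step in range(50):
        action = min(expected_violation, key=expected_violation.get)
        signal = measured_violation(action)
        # Running average keeps the user model current (learning rate 0.2).
        expected_violation[action] += 0.2 * (signal - expected_violation[action])

    print(min(expected_violation, key=expected_violation.get))  # converges to "right"
    ```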

  12. Cloud@Home: A New Enhanced Computing Paradigm

    NASA Astrophysics Data System (ADS)

    Distefano, Salvatore; Cunsolo, Vincenzo D.; Puliafito, Antonio; Scarpa, Marco

    Cloud computing is a distributed computing paradigm that mixes aspects of Grid computing ("… hardware and software infrastructure that provides dependable, consistent, pervasive, and inexpensive access to high-end computational capabilities" (Foster, 2002)), Internet computing ("…a computing platform geographically distributed across the Internet" (Milenkovic et al., 2003)), Utility computing ("a collection of technologies and business practices that enables computing to be delivered seamlessly and reliably across multiple computers, ... available as needed and billed according to usage, much like water and electricity are today" (Ross & Westerman, 2004)), Autonomic computing ("computing systems that can manage themselves given high-level objectives from administrators" (Kephart & Chess, 2003)), Edge computing ("… provides a generic template facility for any type of application to spread its execution across a dedicated grid, balancing the load …" (Davis, Parikh, & Weihl, 2004)), and Green computing (a new frontier of ethical computing, starting from the assumption that in the near future energy costs will be related to environmental pollution).

  13. Construction of Blaze at the University of Illinois at Chicago: A Shared, High-Performance, Visual Computer for Next-Generation Cyberinfrastructure-Accelerated Scientific, Engineering, Medical and Public Policy Research

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brown, Maxine D.; Leigh, Jason

    2014-02-17

    The Blaze high-performance visual computing system serves the high-performance computing research and education needs of the University of Illinois at Chicago (UIC). Blaze consists of a state-of-the-art, networked computer cluster and an ultra-high-resolution visualization system called CAVE2(TM) that is currently not available anywhere else in Illinois. This system is connected via a high-speed 100-Gigabit network to the State of Illinois' I-WIRE optical network, as well as to national and international high-speed networks, such as Internet2 and the Global Lambda Integrated Facility. This enables Blaze to serve as an on-ramp to national cyberinfrastructure, such as the National Science Foundation's Blue Waters petascale computer at the National Center for Supercomputing Applications at the University of Illinois at Urbana-Champaign and the Department of Energy's Argonne Leadership Computing Facility (ALCF) at Argonne National Laboratory. DOE award # DE-SC005067, leveraged with NSF award #CNS-0959053 for "Development of the Next-Generation CAVE Virtual Environment (NG-CAVE)," enabled us to create a first-of-its-kind high-performance visual computing system. The UIC Electronic Visualization Laboratory (EVL) worked with two U.S. companies to advance their commercial products and maintain U.S. leadership in the global information technology economy. New applications are being enabled with the CAVE2/Blaze visual computing system that is advancing scientific research and education in the U.S. and globally and helping to train the next-generation workforce.

  14. Use of Emerging Grid Computing Technologies for the Analysis of LIGO Data

    NASA Astrophysics Data System (ADS)

    Koranda, Scott

    2004-03-01

    The LIGO Scientific Collaboration (LSC) today faces the challenge of enabling analysis of terabytes of LIGO data by hundreds of scientists from institutions all around the world. To meet this challenge, the LSC is developing tools, infrastructure, applications, and expertise leveraging Grid Computing technologies available today, and making available to LSC scientists compute resources at sites across the United States and Europe. We use digital credentials for strong and secure authentication and authorization to compute resources and data. Building on top of products from the Globus project for high-speed data transfer and information discovery, we have created the Lightweight Data Replicator (LDR) to securely and robustly replicate data to resource sites. We have deployed at our computing sites the Virtual Data Toolkit (VDT) Server and Client packages, developed in collaboration with our partners in the GriPhyN and iVDGL projects, providing uniform access to distributed resources for users and their applications. Taken together, these Grid Computing technologies and infrastructure have formed the LSC DataGrid--a coherent and uniform environment across two continents for the analysis of gravitational-wave detector data. Much work remains, however, to scale current analyses, and recent lessons learned need to be integrated into the next generation of Grid middleware.
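    A toy sketch of the "securely and robustly replicate data" task handled by a tool like LDR: copy a file to a replica location and verify it by checksum before recording it in a catalog. Paths, the catalog format, and the local copy are illustrative only; the real LDR builds on Globus transfer tools rather than local file copies.

    ```python
    # Checksum-verified replication of a data file into a replica directory.
    import hashlib
    import shutil
    from pathlib import Path

    def sha256(path: Path) -> str:
        h = hashlib.sha256()
        with path.open("rb") as f:
            for chunk in iter(lambda: f.read(1 << 20), b""):
                h.update(chunk)
        return h.hexdigest()

    def replicate(src: Path, dst_dir: Path, catalog: dict) -> None:
        dst_dir.mkdir(parents=True, exist_ok=True)
        dst = dst_dir / src.name
        shutil.copy2(src, dst)
        if sha256(dst) != sha256(src):        # robustness: verify before cataloguing
            dst.unlink()
            raise IOError(f"checksum mismatch replicating {src}")
        catalog[src.name] = str(dst)

    catalog = {}
    # replicate(Path("H-R8_frame.gwf"), Path("/data/replicas"), catalog)  # hypothetical file
    ```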

  15. Computation Directorate Annual Report 2003

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Crawford, D L; McGraw, J R; Ashby, S F

    Big computers are icons: symbols of the culture, and of the larger computing infrastructure that exists at Lawrence Livermore. Through the collective effort of Laboratory personnel, they enable scientific discovery and engineering development on an unprecedented scale. For more than three decades, the Computation Directorate has supplied the big computers that enable the science necessary for Laboratory missions and programs. Livermore supercomputing is uniquely mission driven. The high-fidelity weapon simulation capabilities essential to the Stockpile Stewardship Program compel major advances in weapons codes and science, compute power, and computational infrastructure. Computation's activities align with this vital mission of the Department of Energy. Increasingly, non-weapons Laboratory programs also rely on computer simulation. World-class achievements have been accomplished by LLNL specialists working in multi-disciplinary research and development teams. In these teams, Computation personnel employ a wide array of skills, from desktop support expertise, to complex applications development, to advanced research. Computation's skilled professionals make the Directorate the success that it has become. These individuals know the importance of the work they do and the many ways it contributes to Laboratory missions. They make appropriate and timely decisions that move the entire organization forward. They make Computation a leader in helping LLNL achieve its programmatic milestones. I dedicate this inaugural Annual Report to the people of Computation in recognition of their continuing contributions. I am proud that we perform our work securely and safely. Despite increased cyber attacks on our computing infrastructure from the Internet, advanced cyber security practices ensure that our computing environment remains secure. Through Integrated Safety Management (ISM) and diligent oversight, we address safety issues promptly and aggressively. The safety of our employees, whether at work or at home, is a paramount concern. Even as the Directorate meets today's supercomputing requirements, we are preparing for the future. We are investigating open-source cluster technology, the basis of our highly successful Multiprogrammatic Capability Resource (MCR). Several breakthrough discoveries have resulted from MCR calculations coupled with theory and experiment, prompting Laboratory scientists to demand ever-greater capacity and capability. This demand is being met by a new 23-TF system, Thunder, with architecture modeled on MCR. In preparation for the ''after-next'' computer, we are researching technology even farther out on the horizon--cell-based computers. Assuming that the funding and the technology hold, we will acquire the cell-based machine BlueGene/L within the next 12 months.

  16. XVis: Visualization for the Extreme-Scale Scientific-Computation Ecosystem: Mid-year report FY17 Q2

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Moreland, Kenneth D.; Pugmire, David; Rogers, David

    The XVis project brings together the key elements of research to enable scientific discovery at extreme scale. Scientific computing will no longer be purely about how fast computations can be performed. Energy constraints, processor changes, and I/O limitations necessitate significant changes in both the software applications used in scientific computation and the ways in which scientists use them. Components for modeling, simulation, analysis, and visualization must work together in a computational ecosystem, rather than working independently as they have in the past. This project provides the necessary research and infrastructure for scientific discovery in this new computational ecosystem by addressing four interlocking challenges: emerging processor technology, in situ integration, usability, and proxy analysis.

  17. XVis: Visualization for the Extreme-Scale Scientific-Computation Ecosystem: Year-end report FY17.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Moreland, Kenneth D.; Pugmire, David; Rogers, David

    The XVis project brings together the key elements of research to enable scientific discovery at extreme scale. Scientific computing will no longer be purely about how fast computations can be performed. Energy constraints, processor changes, and I/O limitations necessitate significant changes in both the software applications used in scientific computation and the ways in which scientists use them. Components for modeling, simulation, analysis, and visualization must work together in a computational ecosystem, rather than working independently as they have in the past. This project provides the necessary research and infrastructure for scientific discovery in this new computational ecosystem by addressing four interlocking challenges: emerging processor technology, in situ integration, usability, and proxy analysis.

  18. XVis: Visualization for the Extreme-Scale Scientific-Computation Ecosystem. Mid-year report FY16 Q2

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Moreland, Kenneth D.; Sewell, Christopher; Childs, Hank

    The XVis project brings together the key elements of research to enable scientific discovery at extreme scale. Scientific computing will no longer be purely about how fast computations can be performed. Energy constraints, processor changes, and I/O limitations necessitate significant changes in both the software applications used in scientific computation and the ways in which scientists use them. Components for modeling, simulation, analysis, and visualization must work together in a computational ecosystem, rather than working independently as they have in the past. This project provides the necessary research and infrastructure for scientific discovery in this new computational ecosystem by addressing four interlocking challenges: emerging processor technology, in situ integration, usability, and proxy analysis.

  19. XVis: Visualization for the Extreme-Scale Scientific-Computation Ecosystem: Year-end report FY15 Q4.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Moreland, Kenneth D.; Sewell, Christopher; Childs, Hank

    The XVis project brings together the key elements of research to enable scientific discovery at extreme scale. Scientific computing will no longer be purely about how fast computations can be performed. Energy constraints, processor changes, and I/O limitations necessitate significant changes in both the software applications used in scientific computation and the ways in which scientists use them. Components for modeling, simulation, analysis, and visualization must work together in a computational ecosystem, rather than working independently as they have in the past. This project provides the necessary research and infrastructure for scientific discovery in this new computational ecosystem by addressing four interlocking challenges: emerging processor technology, in situ integration, usability, and proxy analysis.

  20. The Future Medical Science and Colorectal Surgeons

    PubMed Central

    2017-01-01

    Future medical technology breakthroughs will build from the incredible progress made in computers, biotechnology, and nanotechnology and from the information learned from the human genome. With such technology and information, computer-aided diagnoses, organ replacement, gene therapy, personalized drugs, and even age reversal will become possible. True 3-dimensional system technology will enable surgeons to envision key clinical features and will help them in planning complex surgery. Surgeons will enter surgical instructions in a virtual space from a remote medical center, order a medical robot to perform the operation, and review the operation in real time on a monitor. Surgeons will be better than artificial intelligence or automated robots when surgeons (or we) love patients and ask questions for a better future. The purpose of this paper is to look at future medical science and the changing role of colorectal surgeons. PMID:29354602

  1. National Combustion Code: A Multidisciplinary Combustor Design System

    NASA Technical Reports Server (NTRS)

    Stubbs, Robert M.; Liu, Nan-Suey

    1997-01-01

    The Internal Fluid Mechanics Division conducts both basic research and technology, and system technology research for aerospace propulsion systems components. The research within the division, which is both computational and experimental, is aimed at improving fundamental understanding of flow physics in inlets, ducts, nozzles, turbomachinery, and combustors. This article and the following three articles highlight some of the work accomplished in 1996. A multidisciplinary combustor design system is critical for optimizing the combustor design process. Such a system should include sophisticated computer-aided design (CAD) tools for geometry creation, advanced mesh generators for creating solid model representations, a common framework for fluid flow and structural analyses, modern postprocessing tools, and parallel processing. The goal of the present effort is to develop some of the enabling technologies and to demonstrate their overall performance in an integrated system called the National Combustion Code.

  2. The Future Medical Science and Colorectal Surgeons.

    PubMed

    Kim, Young Jin

    2017-12-01

    Future medical technology breakthroughs will build on the incredible progress made in computers, biotechnology, and nanotechnology and on the information learned from the human genome. With such technology and information, computer-aided diagnoses, organ replacement, gene therapy, personalized drugs, and even age reversal will become possible. True 3-dimensional system technology will enable surgeons to envision key clinical features and will help them in planning complex surgery. Surgeons will enter surgical instructions in a virtual space from a remote medical center, order a medical robot to perform the operation, and review the operation in real time on a monitor. Surgeons will remain better than artificial intelligence or automated robots as long as they love their patients and keep asking questions for a better future. The purpose of this paper is to look at the future of medical science and the changing role of colorectal surgeons.

  3. Additive Manufacturing and High-Performance Computing: a Disruptive Latent Technology

    NASA Astrophysics Data System (ADS)

    Goodwin, Bruce

    2015-03-01

    This presentation will discuss the relationship between recent advances in Additive Manufacturing (AM) technology, High-Performance Computing (HPC) simulation and design capabilities, and related advances in Uncertainty Quantification (UQ), and then examine their impacts upon national and international security. The presentation surveys how AM accelerates the fabrication process, while HPC combined with UQ provides a fast track for the engineering design cycle. The combination of AM and HPC/UQ almost eliminates the iterative engineering design and prototype cycle, thereby dramatically reducing the cost of production and time-to-market. These methods thereby present significant benefits for US national interests, both civilian and military, in an age of austerity. Finally, considering cyber security issues and the advent of the ``cloud,'' these disruptive, currently latent technologies may well enable proliferation and so challenge both nuclear and non-nuclear aspects of international security.

  4. Database Integrity Monitoring for Synthetic Vision Systems Using Machine Vision and SHADE

    NASA Technical Reports Server (NTRS)

    Cooper, Eric G.; Young, Steven D.

    2005-01-01

    In an effort to increase situational awareness, the aviation industry is investigating technologies that allow pilots to visualize what is outside of the aircraft during periods of low-visibility. One of these technologies, referred to as Synthetic Vision Systems (SVS), provides the pilot with real-time computer-generated images of obstacles, terrain features, runways, and other aircraft regardless of weather conditions. To help ensure the integrity of such systems, methods of verifying the accuracy of synthetically-derived display elements using onboard remote sensing technologies are under investigation. One such method is based on a shadow detection and extraction (SHADE) algorithm that transforms computer-generated digital elevation data into a reference domain that enables direct comparison with radar measurements. This paper describes machine vision techniques for making this comparison and discusses preliminary results from application to actual flight data.
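
    As an illustration of the general idea behind comparing terrain data with radar returns, the sketch below computes which cells along a single range line of a digital elevation model would fall in radar shadow. It is a minimal Python example under assumed geometry (a stationary sensor, a 1-D profile, hypothetical units); it is not the SHADE algorithm from the paper, whose transform and comparison steps are more involved.

        import numpy as np

        def radar_shadow_mask(profile, sensor_height, cell_size):
            """Mark cells along one range line that lie in radar shadow.

            profile       : 1-D array of terrain elevations ordered by increasing
                            range from the sensor (hypothetical units: metres)
            sensor_height : elevation of the sensor above the datum
            cell_size     : ground distance between adjacent samples

            A cell is shadowed when its elevation angle as seen from the sensor
            is no greater than the largest elevation angle of any nearer cell,
            i.e. nearer terrain blocks the line of sight.
            """
            shadow = np.zeros(profile.shape, dtype=bool)
            max_elev_angle = -np.inf
            for i, z in enumerate(profile):
                ground_range = (i + 1) * cell_size
                angle = np.arctan2(z - sensor_height, ground_range)
                if angle <= max_elev_angle:
                    shadow[i] = True          # blocked by nearer, higher terrain
                else:
                    max_elev_angle = angle    # this cell becomes the new horizon
            return shadow

        # Toy example: a ridge at mid-range casts a shadow behind it.
        dem_line = np.array([10, 12, 15, 60, 58, 20, 18, 17, 16, 15], dtype=float)
        print(radar_shadow_mask(dem_line, sensor_height=100.0, cell_size=30.0))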

  5. An Infrastructure to Enable Lightweight Context-Awareness for Mobile Users

    PubMed Central

    Curiel, Pablo; Lago, Ana B.

    2013-01-01

    Mobile phones enable us to carry out a wider range of tasks every day, and as a result they have become more ubiquitous than ever. However, they are still more limited in terms of processing power and interaction capabilities than traditional computers, and the often distracting and time-constricted scenarios in which we use them do not help in alleviating these limitations. Context-awareness is a valuable technique to address these issues, as it enables application behaviour to be adapted to each situation. In this paper we present a context management infrastructure for mobile environments, aimed at controlling the context information life-cycle in these kinds of scenarios, with the main goal of enabling applications and services to adapt their behaviour to better meet end-user needs. This infrastructure relies on semantic technologies and open standards to improve interoperability, and is based on a central element, the context manager. This element acts as a central context repository and takes on most of the computational burden of handling this kind of information, relieving the more resource-constrained devices in the system of these tasks. PMID:23899932
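
    The central context repository described above can be pictured with a very small publish/subscribe sketch. The class and method names below are hypothetical and deliberately minimal; this is not the paper's actual infrastructure or API, only an assumed illustration of a manager that stores the latest context values and notifies interested applications so that lightweight devices need not do the bookkeeping themselves.

        from collections import defaultdict

        class ContextManager:
            """Minimal central context repository (illustrative sketch only)."""

            def __init__(self):
                self._store = {}                       # key -> latest value
                self._subscribers = defaultdict(list)  # key -> callbacks

            def publish(self, key, value):
                # Store the newest context fact and notify every subscriber.
                self._store[key] = value
                for callback in self._subscribers[key]:
                    callback(key, value)

            def subscribe(self, key, callback):
                self._subscribers[key].append(callback)

            def query(self, key, default=None):
                return self._store.get(key, default)

        # Hypothetical usage: an application adapts when the user's location changes.
        manager = ContextManager()
        manager.subscribe("user.location", lambda k, v: print(f"adapting UI for {v}"))
        manager.publish("user.location", "meeting-room-3")
        print(manager.query("user.location"))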

  6. 2000 Numerical Propulsion System Simulation Review

    NASA Technical Reports Server (NTRS)

    Lytle, John; Follen, Greg; Naiman, Cynthia; Veres, Joseph; Owen, Karl; Lopez, Isaac

    2001-01-01

    The technologies necessary to enable detailed numerical simulations of complete propulsion systems are being developed at the NASA Glenn Research Center in cooperation with industry, academia, and other government agencies. Large scale, detailed simulations will be of great value to the nation because they eliminate some of the costly testing required to develop and certify advanced propulsion systems. In addition, time and cost savings will be achieved by enabling design details to be evaluated early in the development process before a commitment is made to a specific design. This concept is called the Numerical Propulsion System Simulation (NPSS). NPSS consists of three main elements: (1) engineering models that enable multidisciplinary analysis of large subsystems and systems at various levels of detail, (2) a simulation environment that maximizes designer productivity, and (3) a cost-effective, high-performance computing platform. A fundamental requirement of the concept is that the simulations must be capable of overnight execution on easily accessible computing platforms. This will greatly facilitate the use of large-scale simulations in a design environment. This paper describes the current status of the NPSS with specific emphasis on the progress made over the past year on air breathing propulsion applications. Major accomplishments include the first formal release of the NPSS object-oriented architecture (NPSS Version 1) and the demonstration of a one order of magnitude reduction in computing cost-to-performance ratio using a cluster of personal computers. The paper also describes the future NPSS milestones, which include the simulation of space transportation propulsion systems in response to increased emphasis on safe, low cost access to space within NASA's Aerospace Technology Enterprise. In addition, the paper contains a summary of the feedback received from industry partners on the fiscal year 1999 effort and the actions taken over the past year to respond to that feedback. NPSS was supported in fiscal year 2000 by the High Performance Computing and Communications Program.

  7. 2001 Numerical Propulsion System Simulation Review

    NASA Technical Reports Server (NTRS)

    Lytle, John; Follen, Gregory; Naiman, Cynthia; Veres, Joseph; Owen, Karl; Lopez, Isaac

    2002-01-01

    The technologies necessary to enable detailed numerical simulations of complete propulsion systems are being developed at the NASA Glenn Research Center in cooperation with industry, academia and other government agencies. Large scale, detailed simulations will be of great value to the nation because they eliminate some of the costly testing required to develop and certify advanced propulsion systems. In addition, time and cost savings will be achieved by enabling design details to be evaluated early in the development process before a commitment is made to a specific design. This concept is called the Numerical Propulsion System Simulation (NPSS). NPSS consists of three main elements: (1) engineering models that enable multidisciplinary analysis of large subsystems and systems at various levels of detail, (2) a simulation environment that maximizes designer productivity, and (3) a cost-effective, high-performance computing platform. A fundamental requirement of the concept is that the simulations must be capable of overnight execution on easily accessible computing platforms. This will greatly facilitate the use of large-scale simulations in a design environment. This paper describes the current status of the NPSS with specific emphasis on the progress made over the past year on air breathing propulsion applications. Major accomplishments include the first formal release of the NPSS object-oriented architecture (NPSS Version 1) and the demonstration of a one order of magnitude reduction in computing cost-to-performance ratio using a cluster of personal computers. The paper also describes the future NPSS milestones, which include the simulation of space transportation propulsion systems in response to increased emphasis on safe, low cost access to space within NASA's Aerospace Technology Enterprise. In addition, the paper contains a summary of the feedback received from industry partners on the fiscal year 2000 effort and the actions taken over the past year to respond to that feedback. NPSS was supported in fiscal year 2001 by the High Performance Computing and Communications Program.

  8. Technology Requirements and Selection for Securely Partitioning OBSW

    NASA Astrophysics Data System (ADS)

    Mendham, Peter; Windsor, James; Eckstein, Knut

    2010-08-01

    The Securely Partitioning Spacecraft Computing Resources project is a current ESA TRP activity investigating the application of secure time and space partitioning (TSP) technologies to enable multi-use missions from a single platform. Secure TSP technologies are used in a number of application areas outside the space domain, and an opportunity exists to 'spin in' a suitable solution. The selection of a technology for use within the European space industry relies on an understanding of the requirements for the application of secure TSP, of which this paper presents a summary. Further, the paper outlines the selection process taken by the project and highlights promising solutions for use today.

  9. Clinical operations generation next… The age of technology and outsourcing

    PubMed Central

    Temkar, Priya

    2015-01-01

    Huge cost pressures and the need to drive faster approvals have driven a technology transformation in the clinical trial (CT) industry. The CT industry is thus leveraging mobile data, cloud computing, social media, robotic automation, and electronic source to drive efficiencies in a big way. Outsourcing of clinical operations support services to technology companies with a clinical edge is gaining tremendous importance. This paper provides an overview of current technology trends, applicable Food and Drug Administration (FDA) guidelines, the basic challenges the pharma industry faces in trying to implement such changes, and its shift towards outsourcing these services so that it can focus on site operations. PMID:26623386

  10. Clinical operations generation next… The age of technology and outsourcing.

    PubMed

    Temkar, Priya

    2015-01-01

    Huge cost pressures and the need to drive faster approvals have driven a technology transformation in the clinical trial (CT) industry. The CT industry is thus leveraging mobile data, cloud computing, social media, robotic automation, and electronic source to drive efficiencies in a big way. Outsourcing of clinical operations support services to technology companies with a clinical edge is gaining tremendous importance. This paper provides an overview of current technology trends, applicable Food and Drug Administration (FDA) guidelines, the basic challenges the pharma industry faces in trying to implement such changes, and its shift towards outsourcing these services so that it can focus on site operations.

  11. Challenging Technology, and Technology Infusion into 21st Century

    NASA Technical Reports Server (NTRS)

    Chau, S. N.; Hunter, D. J.

    2001-01-01

    In preparing for the space exploration challenges of the next century, the National Aeronautics and Space Administration (NASA) Center for Integrated Space Micro-Systems (CISM) is chartered to develop advanced spacecraft systems that can be adapted for a large spectrum of future space missions. Enabling this task are revolutions in the miniaturization of electrical, mechanical, and computational functions. On the other hand, these revolutionary technologies usually have much lower readiness levels than those required by flight projects. The mission of the Advanced Micro Spacecraft (AMS) task in CISM is to bridge the readiness gap between advanced technologies and flight projects. Additional information is contained in the original extended abstract.

  12. "First generation" automated DNA sequencing technology.

    PubMed

    Slatko, Barton E; Kieleczawa, Jan; Ju, Jingyue; Gardner, Andrew F; Hendrickson, Cynthia L; Ausubel, Frederick M

    2011-10-01

    Beginning in the 1980s, automation of DNA sequencing has greatly increased throughput, reduced costs, and enabled large projects to be completed more easily. The development of automation technology paralleled the development of other aspects of DNA sequencing: better enzymes and chemistry, separation and imaging technology, sequencing protocols, robotics, and computational advancements (including base-calling algorithms with quality scores, database developments, and sequence analysis programs). Despite the emergence of high-throughput sequencing platforms, automated Sanger sequencing technology remains useful for many applications. This unit provides background and a description of the "First-Generation" automated DNA sequencing technology. It also includes protocols for using the current Applied Biosystems (ABI) automated DNA sequencing machines. © 2011 by John Wiley & Sons, Inc.

  13. Integration of Problem-based Learning and Innovative Technology Into a Self-Care Course

    PubMed Central

    2013-01-01

    Objective. To assess the integration of problem-based learning and technology into a self-care course. Design. Problem-based learning (PBL) activities were developed and implemented in place of lectures in a self-care course. Students used technology, such as computer-generated virtual patients and iPads, during the PBL sessions. Assessments. Students’ scores on post-case quizzes were higher than on pre-case quizzes used to assess baseline knowledge. Student satisfaction with problem-based learning and the use of technology in the course remained consistent throughout the semester. Conclusion. Integrating problem-based learning and technology into a self-care course enabled students to become active learners. PMID:23966730

  14. Bilayer avalanche spin-diode logic

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Friedman, Joseph S., E-mail: joseph.friedman@u-psud.fr; Querlioz, Damien; Fadel, Eric R.

    2015-11-15

    A novel spintronic computing paradigm is proposed and analyzed in which InSb p-n bilayer avalanche spin-diodes are cascaded to efficiently perform complex logic operations. This spin-diode logic family uses control wires to generate magnetic fields that modulate the resistance of the spin-diodes, and currents through these devices control the resistance of cascaded devices. Electromagnetic simulations are performed to demonstrate the cascading mechanism, and guidelines are provided for the development of this innovative computing technology. This cascading scheme permits compact logic circuits with switching speeds determined by electromagnetic wave propagation rather than electron motion, enabling high-performance spintronic computing.

  15. VASCOMP 2. The V/STOL aircraft sizing and performance computer program. Volume 6: User's manual, revision 3

    NASA Technical Reports Server (NTRS)

    Schoen, A. H.; Rosenstein, H.; Stanzione, K.; Wisniewski, J. S.

    1980-01-01

    This report describes the use of the V/STOL Aircraft Sizing and Performance Computer Program (VASCOMP II). The program is useful in performing aircraft parametric studies in a quick and cost efficient manner. Problem formulation and data development were performed by the Boeing Vertol Company and reflects the present preliminary design technology. The computer program, written in FORTRAN IV, has a broad range of input parameters, to enable investigation of a wide variety of aircraft. User oriented features of the program include minimized input requirements, diagnostic capabilities, and various options for program flexibility.

  16. Computer-assisted instruction and diagnosis of radiographic findings.

    PubMed

    Harper, D; Butler, C; Hodder, R; Allman, R; Woods, J; Riordan, D

    1984-04-01

    Recent advances in computer technology, including high bit-density storage, digital imaging, and the ability to interface microprocessors with videodisk, create enormous opportunities in the field of medical education. This program, utilizing a personal computer, videodisk, BASIC language, a linked textfile system, and a triangulation approach to the interpretation of radiographs developed by Dr. W. L. Thompson, can enable the user to engage in a user-friendly, dynamic teaching program in radiology, applicable to various levels of expertise. Advantages include a relatively more compact and inexpensive system with rapid access and ease of revision which requires little instruction to the user.

  17. Modern Methods for fast generation of digital holograms

    NASA Astrophysics Data System (ADS)

    Tsang, P. W. M.; Liu, J. P.; Cheung, K. W. K.; Poon, T.-C.

    2010-06-01

    With the advancement of computers, digital holography (DH) has become an area of interest that has gained much popularity. Research findings derived from this technology enable holograms representing three-dimensional (3-D) scenes to be acquired with optical means, or generated with numerical computation. In both cases, the holograms are in the form of numerical data that can be recorded, transmitted, and processed with digital techniques. In addition, the availability of high-capacity digital storage and wide-band communication technologies has paved the way for real-time video holographic systems, enabling animated 3-D content to be encoded as holographic data and distributed via existing media. At present, development in DH has reached a reasonable degree of maturity, but the heavy computation involved still imposes difficulties in practical applications. In this paper, a summary of a number of recent accomplishments in overcoming this problem is presented. Subsequently, we propose an economical framework that is suitable for real-time generation and transmission of holographic video signals over existing distribution media. The proposed framework includes an aspect of extending the depth range of the object scene, which is important for the display of large-scale objects.
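
    To make the "heavy computation" concrete, the sketch below generates a hologram numerically with the classic point-source (point-cloud) method: every object point contributes a spherical wavefront sampled at every hologram pixel, so the cost grows as points times pixels, which is exactly what fast-generation methods try to reduce. All parameter values are illustrative assumptions, and the code is a generic textbook-style sketch rather than any of the specific accelerated methods surveyed in the paper.

        import numpy as np

        def point_cloud_hologram(points, amplitudes, nx, ny, pitch, wavelength):
            """Brute-force point-source computer-generated hologram (CGH).

            Each object point (x, y, z) contributes a spherical wave sampled on
            the hologram plane; summing them gives the complex field whose phase
            (or interference with a reference beam) would be displayed.
            """
            k = 2.0 * np.pi / wavelength
            xs = (np.arange(nx) - nx / 2) * pitch
            ys = (np.arange(ny) - ny / 2) * pitch
            X, Y = np.meshgrid(xs, ys)
            field = np.zeros((ny, nx), dtype=complex)
            for (px, py, pz), amp in zip(points, amplitudes):
                r = np.sqrt((X - px) ** 2 + (Y - py) ** 2 + pz ** 2)
                field += amp * np.exp(1j * k * r) / r
            return field

        # Toy scene: two points at different depths, 512x512 hologram, 8 um pixels.
        pts = [(0.0, 0.0, 0.05), (0.5e-3, 0.0, 0.07)]        # metres
        holo = point_cloud_hologram(pts, [1.0, 0.8], 512, 512,
                                    pitch=8e-6, wavelength=633e-9)
        print(holo.shape, np.abs(holo).max())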

  18. Toward a Dynamically Reconfigurable Computing and Communication System for Small Spacecraft

    NASA Technical Reports Server (NTRS)

    Kifle, Muli; Andro, Monty; Tran, Quang K.; Fujikawa, Gene; Chu, Pong P.

    2003-01-01

    Future science missions will require the use of multiple spacecraft with multiple sensor nodes autonomously responding and adapting to a dynamically changing space environment. The acquisition of random scientific events will require rapidly changing network topologies, distributed processing power, and a dynamic resource management strategy. Optimum utilization and configuration of spacecraft communications and navigation resources will be critical in meeting the demand of these stringent mission requirements. There are two important trends to follow with respect to NASA's (National Aeronautics and Space Administration) future scientific missions: the use of multiple satellite systems and the development of an integrated space communications network. Reconfigurable computing and communication systems may enable versatile adaptation of a spacecraft system's resources by dynamic allocation of the processor hardware to perform new operations or to maintain functionality due to malfunctions or hardware faults. Advancements in FPGA (Field Programmable Gate Array) technology make it possible to incorporate major communication and network functionalities in FPGA chips and provide the basis for a dynamically reconfigurable communication system. Advantages of higher computation speeds and accuracy are envisioned with tremendous hardware flexibility to ensure maximum survivability of future science mission spacecraft. This paper discusses the requirements, enabling technologies, and challenges associated with dynamically reconfigurable space communications systems.

  19. Art Maps--Mapping the Multiple Meanings of Place

    ERIC Educational Resources Information Center

    Sinker, Rebecca; Giannachi, Gabriella; Carletti, Laura

    2013-01-01

    Digital technology enables us to prospect, generate, assemble and share eclectic materials, creating virtual journeys, stories or exhibitions through the internet, viewed on computer but also on location via mobile devices. How does the ability to create and curate in this way enhance or transform our access to and understanding of art, as well as…

  20. Component Exchange Community: A Model of Utilizing Research Components to Foster International Collaboration

    ERIC Educational Resources Information Center

    Deng, Yi-Chan; Lin, Taiyu; Kinshuk; Chan, Tak-Wai

    2006-01-01

    "One-to-one" technology enhanced learning research refers to the design and investigation of learning environments and learning activities where every learner is equipped with at least one portable computing device enabled by wireless capability. G1:1 is an international research community coordinated by a network of laboratories conducting…

  1. Biomimetic robots using EAP as artificial muscles - progress and challenges

    NASA Technical Reports Server (NTRS)

    Bar-Cohen, Yoseph

    2004-01-01

    Biology offers a great model for emulation in areas ranging from tools and computational algorithms to materials science, mechanisms, and information technology. In recent years, the field of biomimetics, namely mimicking biology, has blossomed, with significant advances enabling the reverse engineering of many animals' functions and the implementation of some of these capabilities.

  2. From the Teachers' Perspective: A Way of Simplicity for Multimedia Design

    ERIC Educational Resources Information Center

    Hirca, Necati

    2009-01-01

    Presently, teaching and presentation methods are changing from chalk and blackboards to interactive methods. Multimedia technology is now used in many schools; however, many of the commercially available software programs do not allow teachers to share their experiences. Adobe Captivate 3 is a computer program that enables teachers, without…

  3. Models, Databases, and Simulation Tools Needed for the Realization of Integrated Computational Materials Engineering. Proceedings of the Symposium Held at Materials Science and Technology 2010

    NASA Technical Reports Server (NTRS)

    Arnold, Steven M. (Editor); Wong, Terry T. (Editor)

    2011-01-01

    Topics covered include: An Annotative Review of Multiscale Modeling and its Application to Scales Inherent in the Field of ICME; and A Multiscale, Nonlinear, Modeling Framework Enabling the Design and Analysis of Composite Materials and Structures.

  4. Empowering Teachers to Create Educational Software: A Constructivist Approach Utilizing Etoys, Pair Programming and Cognitive Apprenticeship

    ERIC Educational Resources Information Center

    Lee, Young-Jin

    2011-01-01

    This study investigates whether a visual programming environment called Etoys could enable teachers to create software applications meeting their own instructional needs. Twenty-four teachers who participated in the study successfully developed their own educational computer programs in the educational technology course employing cognitive…

  5. Speed challenge: a case for hardware implementation in soft-computing

    NASA Technical Reports Server (NTRS)

    Daud, T.; Stoica, A.; Duong, T.; Keymeulen, D.; Zebulum, R.; Thomas, T.; Thakoor, A.

    2000-01-01

    For over a decade, JPL has been actively involved in soft computing research on theory, architecture, applications, and electronics hardware. The driving force in all our research activities, in addition to the promise of a potential enabling technology, has been the creation of a niche that imparts an orders-of-magnitude speed advantage through implementation in parallel processing hardware, with algorithms made especially suitable for hardware implementation. We review our work on neural networks, fuzzy logic, and evolvable hardware, with selected application examples requiring real-time response capabilities.

  6. Tracking-Data-Conversion Tool

    NASA Technical Reports Server (NTRS)

    Flora-Adams, Dana; Makihara, Jeanne; Benenyan, Zabel; Berner, Jeff; Kwok, Andrew

    2007-01-01

    Object Oriented Data Technology (OODT) is a software framework for creating a Web-based system for exchange of scientific data that are stored in diverse formats on computers at different sites under the management of scientific peers. OODT software consists of a set of cooperating, distributed peer components that provide distributed peer-to-peer (P2P) services that enable one peer to search and retrieve data managed by another peer. In effect, computers running OODT software at different locations become parts of an integrated data-management system.

  7. Software Framework for Peer Data-Management Services

    NASA Technical Reports Server (NTRS)

    Hughes, John; Hardman, Sean; Crichton, Daniel; Hyon, Jason; Kelly, Sean; Tran, Thuy

    2007-01-01

    Object Oriented Data Technology (OODT) is a software framework for creating a Web-based system for exchange of scientific data that are stored in diverse formats on computers at different sites under the management of scientific peers. OODT software consists of a set of cooperating, distributed peer components that provide distributed peer-to-peer (P2P) services that enable one peer to search and retrieve data managed by another peer. In effect, computers running OODT software at different locations become parts of an integrated data-management system.

  8. Information visualization: Beyond traditional engineering

    NASA Technical Reports Server (NTRS)

    Thomas, James J.

    1995-01-01

    This presentation addresses a different aspect of the human-computer interface: specifically, the human-information interface. This interface will be dominated by an emerging technology called Information Visualization (IV). IV goes beyond the traditional views of computer graphics and CAD systems and enables new approaches for engineering. IV specifically must visualize text, documents, sound, images, and video in such a way that the human can rapidly interact with and understand the content structure of information entities. IV is the interactive visual interface between humans and their information resources.

  9. Computer-assisted design of flux-cored wires

    NASA Astrophysics Data System (ADS)

    Dubtsov, Yu N.; Zorin, I. V.; Sokolov, G. N.; Antonov, A. A.; Artem'ev, A. A.; Lysak, V. I.

    2017-02-01

    The algorithm and a description of the AlMe-WireLaB software for the computer-assisted design of flux-cored wires are introduced. The software functionality is illustrated by selecting the components of a flux-cored wire that yields deposited metal of the Fe-Cr-C-Mo-Ni-Ti-B system. It is demonstrated that the developed software enables a technologically reliable flux-cored wire to be designed for surfacing, producing deposited metal of the specified composition.

  10. Structural Technology Evaluation and Analysis Program (STEAP). Delivery Order 0037: Prognosis-Based Control Reconfiguration for an Aircraft with Faulty Actuator to Enable Performance in a Degraded State

    DTIC Science & Technology

    2010-12-01

    computers in 1953. HIL motion simulators were also built for the dynamic testing of vehicle components (e.g., suspensions, bodies) with hydraulic or...complex, comprehensive mechanical systems can be simulated in real time by parallel computers; examples include multibody systems, brake systems...hard constraints in a multivariable control framework. And the third aspect is the ability to perform online optimization. These aspects result in

  11. A review of emerging non-volatile memory (NVM) technologies and applications

    NASA Astrophysics Data System (ADS)

    Chen, An

    2016-11-01

    This paper will review emerging non-volatile memory (NVM) technologies, with the focus on phase change memory (PCM), spin-transfer-torque random-access-memory (STTRAM), resistive random-access-memory (RRAM), and ferroelectric field-effect-transistor (FeFET) memory. These promising NVM devices are evaluated in terms of their advantages, challenges, and applications. Their performance is compared based on reported parameters of major industrial test chips. Memory selector devices and cell structures are discussed. Changing market trends toward low power (e.g., mobile, IoT) and data-centric applications create opportunities for emerging NVMs. High-performance and low-cost emerging NVMs may simplify memory hierarchy, introduce non-volatility in logic gates and circuits, reduce system power, and enable novel architectures. Storage-class memory (SCM) based on high-density NVMs could fill the performance and density gap between memory and storage. Some unique characteristics of emerging NVMs can be utilized for novel applications beyond the memory space, e.g., neuromorphic computing, hardware security, etc. In the beyond-CMOS era, emerging NVMs have the potential to fulfill more important functions and enable more efficient, intelligent, and secure computing systems.

  12. Research Institute for Advanced Computer Science: Annual Report October 1998 through September 1999

    NASA Technical Reports Server (NTRS)

    Leiner, Barry M.; Gross, Anthony R. (Technical Monitor)

    1999-01-01

    The Research Institute for Advanced Computer Science (RIACS) carries out basic research and technology development in computer science, in support of the National Aeronautics and Space Administration's missions. RIACS is located at the NASA Ames Research Center (ARC). It currently operates under a multiple year grant/cooperative agreement that began on October 1, 1997 and is up for renewal in the year 2002. ARC has been designated NASA's Center of Excellence in Information Technology. In this capacity, ARC is charged with the responsibility to build an Information Technology Research Program that is preeminent within NASA. RIACS serves as a bridge between NASA ARC and the academic community, and RIACS scientists and visitors work in close collaboration with NASA scientists. RIACS has the additional goal of broadening the base of researchers in these areas of importance to the nation's space and aeronautics enterprises. RIACS research focuses on the three cornerstones of information technology research necessary to meet the future challenges of NASA missions: (1) Automated Reasoning for Autonomous Systems. Techniques are being developed enabling spacecraft that will be self-guiding and self-correcting to the extent that they will require little or no human intervention. Such craft will be equipped to independently solve problems as they arise, and fulfill their missions with minimum direction from Earth. (2) Human-Centered Computing. Many NASA missions require synergy between humans and computers, with sophisticated computational aids amplifying human cognitive and perceptual abilities; (3) High Performance Computing and Networking. Advances in the performance of computing and networking continue to have major impact on a variety of NASA endeavors, ranging from modeling and simulation to data analysis of large datasets to collaborative engineering, planning and execution. In addition, RIACS collaborates with NASA scientists to apply information technology research to a variety of NASA application domains. RIACS also engages in other activities, such as workshops, seminars, and visiting scientist programs, designed to encourage and facilitate collaboration between the university and NASA information technology research communities.

  13. Research Institute for Advanced Computer Science

    NASA Technical Reports Server (NTRS)

    Gross, Anthony R. (Technical Monitor); Leiner, Barry M.

    2000-01-01

    The Research Institute for Advanced Computer Science (RIACS) carries out basic research and technology development in computer science, in support of the National Aeronautics and Space Administration's missions. RIACS is located at the NASA Ames Research Center. It currently operates under a multiple year grant/cooperative agreement that began on October 1, 1997 and is up for renewal in the year 2002. Ames has been designated NASA's Center of Excellence in Information Technology. In this capacity, Ames is charged with the responsibility to build an Information Technology Research Program that is preeminent within NASA. RIACS serves as a bridge between NASA Ames and the academic community, and RIACS scientists and visitors work in close collaboration with NASA scientists. RIACS has the additional goal of broadening the base of researchers in these areas of importance to the nation's space and aeronautics enterprises. RIACS research focuses on the three cornerstones of information technology research necessary to meet the future challenges of NASA missions: (1) Automated Reasoning for Autonomous Systems. Techniques are being developed enabling spacecraft that will be self-guiding and self-correcting to the extent that they will require little or no human intervention. Such craft will be equipped to independently solve problems as they arise, and fulfill their missions with minimum direction from Earth; (2) Human-Centered Computing. Many NASA missions require synergy between humans and computers, with sophisticated computational aids amplifying human cognitive and perceptual abilities; (3) High Performance Computing and Networking. Advances in the performance of computing and networking continue to have major impact on a variety of NASA endeavors, ranging from modeling and simulation to data analysis of large datasets to collaborative engineering, planning and execution. In addition, RIACS collaborates with NASA scientists to apply information technology research to a variety of NASA application domains. RIACS also engages in other activities, such as workshops, seminars, and visiting scientist programs, designed to encourage and facilitate collaboration between the university and NASA information technology research communities.

  14. Universal blind quantum computation for hybrid system

    NASA Astrophysics Data System (ADS)

    Huang, He-Liang; Bao, Wan-Su; Li, Tan; Li, Feng-Guang; Fu, Xiang-Qun; Zhang, Shuo; Zhang, Hai-Long; Wang, Xiang

    2017-08-01

    As progress on the development of building quantum computer continues to advance, first-generation practical quantum computers will be available for ordinary users in the cloud style similar to IBM's Quantum Experience nowadays. Clients can remotely access the quantum servers using some simple devices. In such a situation, it is of prime importance to keep the security of the client's information. Blind quantum computation protocols enable a client with limited quantum technology to delegate her quantum computation to a quantum server without leaking any privacy. To date, blind quantum computation has been considered only for an individual quantum system. However, practical universal quantum computer is likely to be a hybrid system. Here, we take the first step to construct a framework of blind quantum computation for the hybrid system, which provides a more feasible way for scalable blind quantum computation.

  15. Cloud Bursting with GlideinWMS: Means to satisfy ever increasing computing needs for Scientific Workflows

    NASA Astrophysics Data System (ADS)

    Mhashilkar, Parag; Tiradani, Anthony; Holzman, Burt; Larson, Krista; Sfiligoi, Igor; Rynge, Mats

    2014-06-01

    Scientific communities have been at the forefront of adopting new technologies and methodologies in computing. Scientific computing has influenced how science is done today, achieving breakthroughs that were impossible several decades ago. For the past decade, several such communities in the Open Science Grid (OSG) and the European Grid Infrastructure (EGI) have been using GlideinWMS to run complex application workflows and effectively share computational resources over the grid. GlideinWMS is a pilot-based workload management system (WMS) that creates, on demand, a dynamically sized overlay HTCondor batch system on grid resources. At present, the computational resources shared over the grid are just adequate to sustain the computing needs. We envision that the complexity of the science driven by "Big Data" will further push the need for computational resources. To fulfill their increasing demands and/or to run specialized workflows, some of the big communities like CMS are investigating the use of cloud computing as Infrastructure-as-a-Service (IaaS) with GlideinWMS as a potential alternative to fill the void. Similarly, communities with no previous access to computing resources can use GlideinWMS to set up a batch system on the cloud infrastructure. To enable this, the architecture of GlideinWMS has been extended to support interfacing GlideinWMS with different scientific and commercial cloud providers like HLT, FutureGrid, FermiCloud and Amazon EC2. In this paper, we describe a solution for cloud bursting with GlideinWMS. The paper describes the approach, architectural changes and lessons learned while enabling support for cloud infrastructures in GlideinWMS.
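
    A toy provisioning policy helps illustrate what cloud bursting means operationally: when idle demand exceeds what the grid-side pilots can absorb, request additional cloud VMs up to a budget cap. The function below is a hedged sketch with made-up parameter names and thresholds; it is not the GlideinWMS frontend's actual matchmaking or provisioning logic.

        def cloud_burst_plan(idle_jobs, running_pilots, grid_capacity,
                             slots_per_vm=8, max_cloud_vms=100):
            """Toy elastic-provisioning policy in the spirit of cloud bursting.

            When the number of idle jobs exceeds what the already-provisioned
            grid pilots can absorb, request enough cloud VMs (each contributing
            a fixed number of batch slots) to cover the shortfall, up to a cap.
            """
            grid_slots = min(running_pilots * slots_per_vm, grid_capacity)
            shortfall = max(idle_jobs - grid_slots, 0)
            vms_needed = -(-shortfall // slots_per_vm)   # ceiling division
            return min(vms_needed, max_cloud_vms)

        # Example: 5000 idle jobs, 200 pilots on a grid capped at 1200 slots.
        print(cloud_burst_plan(idle_jobs=5000, running_pilots=200, grid_capacity=1200))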

  16. Cloud Bursting with GlideinWMS: Means to satisfy ever increasing computing needs for Scientific Workflows

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mhashilkar, Parag; Tiradani, Anthony; Holzman, Burt

    Scientific communities have been at the forefront of adopting new technologies and methodologies in computing. Scientific computing has influenced how science is done today, achieving breakthroughs that were impossible several decades ago. For the past decade, several such communities in the Open Science Grid (OSG) and the European Grid Infrastructure (EGI) have been using GlideinWMS to run complex application workflows and effectively share computational resources over the grid. GlideinWMS is a pilot-based workload management system (WMS) that creates, on demand, a dynamically sized overlay HTCondor batch system on grid resources. At present, the computational resources shared over the grid are just adequate to sustain the computing needs. We envision that the complexity of the science driven by 'Big Data' will further push the need for computational resources. To fulfill their increasing demands and/or to run specialized workflows, some of the big communities like CMS are investigating the use of cloud computing as Infrastructure-as-a-Service (IaaS) with GlideinWMS as a potential alternative to fill the void. Similarly, communities with no previous access to computing resources can use GlideinWMS to set up a batch system on the cloud infrastructure. To enable this, the architecture of GlideinWMS has been extended to support interfacing GlideinWMS with different scientific and commercial cloud providers like HLT, FutureGrid, FermiCloud and Amazon EC2. In this paper, we describe a solution for cloud bursting with GlideinWMS. The paper describes the approach, architectural changes and lessons learned while enabling support for cloud infrastructures in GlideinWMS.

  17. Systems cell biology.

    PubMed

    Mast, Fred D; Ratushny, Alexander V; Aitchison, John D

    2014-09-15

    Systems cell biology melds high-throughput experimentation with quantitative analysis and modeling to understand many critical processes that contribute to cellular organization and dynamics. Recently, there have been several advances in technology and in the application of modeling approaches that enable the exploration of the dynamic properties of cells. Merging technology and computation offers an opportunity to objectively address unsolved cellular mechanisms, and has revealed emergent properties and helped to gain a more comprehensive and fundamental understanding of cell biology. © 2014 Mast et al.

  18. Computation as the mechanistic bridge between precision medicine and systems therapeutics.

    PubMed

    Hansen, J; Iyengar, R

    2013-01-01

    Over the past 50 years, like molecular cell biology, medicine and pharmacology have been driven by a reductionist approach. The focus on individual genes and cellular components as disease loci and drug targets has been a necessary step in understanding the basic mechanisms underlying tissue/organ physiology and drug action. Recent progress in genomics and proteomics, as well as advances in other technologies that enable large-scale data gathering and computational approaches, is providing new knowledge of both normal and disease states. Systems-biology approaches enable integration of knowledge from different types of data for precision medicine and systems therapeutics. In this review, we describe recent studies that contribute to these emerging fields and discuss how together these fields can lead to a mechanism-based therapy for individual patients.

  19. Computation as the Mechanistic Bridge Between Precision Medicine and Systems Therapeutics

    PubMed Central

    Hansen, J; Iyengar, R

    2014-01-01

    Over the past 50 years, like molecular cell biology, medicine and pharmacology have been driven by a reductionist approach. The focus on individual genes and cellular components as disease loci and drug targets has been a necessary step in understanding the basic mechanisms underlying tissue/organ physiology and drug action. Recent progress in genomics and proteomics, as well as advances in other technologies that enable large-scale data gathering and computational approaches, is providing new knowledge of both normal and disease states. Systems-biology approaches enable integration of knowledge from different types of data for precision medicine and systems therapeutics. In this review, we describe recent studies that contribute to these emerging fields and discuss how together these fields can lead to a mechanism-based therapy for individual patients. PMID:23212109

  20. Electron collisions with atoms, ions, molecules, and surfaces: Fundamental science empowering advances in technology

    PubMed Central

    Bartschat, Klaus; Kushner, Mark J.

    2016-01-01

    Electron collisions with atoms, ions, molecules, and surfaces are critically important to the understanding and modeling of low-temperature plasmas (LTPs), and thus to the development of technologies based on LTPs. Recent progress in obtaining experimental benchmark data and the development of highly sophisticated computational methods is highlighted. With the cesium-based diode-pumped alkali laser and remote plasma etching of Si3N4 as examples, we demonstrate how accurate and comprehensive datasets for electron collisions enable complex modeling of plasma-using technologies that empower our high-technology–based society. PMID:27317740

  1. Looking ahead through a rearview mirror

    NASA Astrophysics Data System (ADS)

    Koehler, Richard F.; Bares, Jan

    1993-06-01

    Electrophotography, as an original invention, was just another way to make a copy. Its development into a continuous process made it historic. As with any technology, the evolution proceeded along several fronts, in particular the advancement of enabling components including stimulation and sponsorship of research in related scientific disciplines, development of technology and engineering solutions, and expansion of the market while satisfying existing demand. The evolution, driven by customer and market requirements, has followed the paradigm of any other technology-based appliance: growth in performance and reliability and reduction in size and cost, ultimately enabling the transition all the way from highly functional centralized machines to personal devices. Besides this traditional evolution, xerography expanded when it could link with other technologies. The most dramatic breakthroughs that led to rapid market expansion occurred when digital electronics enabled printing and image processing, and the proliferation of personal computers launched a robust color creation and hardcopy market. The electrophotography industry was prepared for this opportunity and made possible desktop publishing, distributed printing, and recently, color copying and printing with acceptable color fidelity. What early indicators signaled the evolutionary paths, and the divergences, electrophotography would take? In this paper, we examine the history, including relevant publications, to find such indicators. Current literature is also considered in that light.

  2. Micro/nano-computed tomography technology for quantitative dynamic, multi-scale imaging of morphogenesis.

    PubMed

    Gregg, Chelsea L; Recknagel, Andrew K; Butcher, Jonathan T

    2015-01-01

    Tissue morphogenesis and embryonic development are dynamic events challenging to quantify, especially considering the intricate events that happen simultaneously in different locations and time. Micro- and more recently nano-computed tomography (micro/nanoCT) has been used for the past 15 years to characterize large 3D fields of tortuous geometries at high spatial resolution. We and others have advanced micro/nanoCT imaging strategies for quantifying tissue- and organ-level fate changes throughout morphogenesis. Exogenous soft tissue contrast media enable visualization of vascular lumens and tissues via extravasation. Furthermore, the emergence of antigen-specific tissue contrast enables direct quantitative visualization of protein and mRNA expression. Micro-CT X-ray doses appear to be non-embryotoxic, enabling longitudinal imaging studies in live embryos. In this chapter we present established soft tissue contrast protocols for obtaining high-quality micro/nanoCT images and the image processing techniques useful for quantifying anatomical and physiological information from the data sets.

  3. Moving mobile: using an open-sourced framework to enable a web-based health application on touch devices.

    PubMed

    Lindsay, Joseph; McLean, J Allen; Bains, Amrita; Ying, Tom; Kuo, M H

    2013-01-01

    Computer devices using touch-enabled technology are becoming more prevalent today. The application of a touch-screen high-definition surgical monitor could allow not only high-definition video from an endoscopic camera to be displayed, but also relevant patient and health-related data to be displayed and interacted with. However, this technology has not been quickly embraced by all health care organizations. Although traditional keyboard- or mouse-based software programs may function flawlessly on a touch-based device, many are not practical due to their use of small buttons, small fonts and very complex menu systems. This paper describes an approach taken to overcome these problems. A real case study was used to demonstrate the novelty and efficiency of the proposed method.

  4. Atlas2 Cloud: a framework for personal genome analysis in the cloud

    PubMed Central

    2012-01-01

    Background Until recently, sequencing has primarily been carried out in large genome centers which have invested heavily in developing the computational infrastructure that enables genomic sequence analysis. The recent advancements in next generation sequencing (NGS) have led to a wide dissemination of sequencing technologies and data to highly diverse research groups. It is expected that clinical sequencing will become part of diagnostic routines shortly. However, limited accessibility to computational infrastructure and high quality bioinformatic tools, and the demand for personnel skilled in data analysis and interpretation, remain a serious bottleneck. To this end, cloud computing and Software-as-a-Service (SaaS) technologies can help address these issues. Results We successfully enabled the Atlas2 Cloud pipeline for personal genome analysis on two different cloud service platforms: a community cloud via the Genboree Workbench, and a commercial cloud via the Amazon Web Services using the Software-as-a-Service model. We report a case study of personal genome analysis using our Atlas2 Genboree pipeline. We also outline a detailed cost structure for running Atlas2 Amazon on whole exome capture data, providing cost projections in terms of storage, compute and I/O when running Atlas2 Amazon on a large data set. Conclusions We find that providing a web interface and an optimized pipeline clearly facilitates usage of cloud computing for personal genome analysis, but for it to be routinely used for large scale projects there needs to be a paradigm shift in the way we develop tools, in standard operating procedures, and in funding mechanisms. PMID:23134663
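
    The storage/compute/I/O cost breakdown mentioned above can be sketched as simple arithmetic. The rates and per-sample figures in the function below are placeholders chosen for illustration; they are assumptions, not the numbers reported in the Atlas2 Amazon study.

        def exome_cloud_cost(n_samples, gb_per_sample=8.0, cpu_hours_per_sample=24.0,
                             storage_price_gb_month=0.10, cpu_price_hour=0.12,
                             months_retained=3, io_price_per_sample=0.50):
            """Back-of-the-envelope cloud cost projection for exome analysis.

            Mirrors the storage / compute / I/O breakdown described in the
            abstract, with every rate here being a hypothetical placeholder.
            """
            storage = n_samples * gb_per_sample * storage_price_gb_month * months_retained
            compute = n_samples * cpu_hours_per_sample * cpu_price_hour
            io = n_samples * io_price_per_sample
            return {"storage": storage, "compute": compute, "io": io,
                    "total": storage + compute + io}

        # Example projection for a hypothetical 500-sample exome project.
        print(exome_cloud_cost(n_samples=500))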

  5. Atlas2 Cloud: a framework for personal genome analysis in the cloud.

    PubMed

    Evani, Uday S; Challis, Danny; Yu, Jin; Jackson, Andrew R; Paithankar, Sameer; Bainbridge, Matthew N; Jakkamsetti, Adinarayana; Pham, Peter; Coarfa, Cristian; Milosavljevic, Aleksandar; Yu, Fuli

    2012-01-01

    Until recently, sequencing has primarily been carried out in large genome centers which have invested heavily in developing the computational infrastructure that enables genomic sequence analysis. The recent advancements in next generation sequencing (NGS) have led to a wide dissemination of sequencing technologies and data to highly diverse research groups. It is expected that clinical sequencing will become part of diagnostic routines shortly. However, limited accessibility to computational infrastructure and high quality bioinformatic tools, and the demand for personnel skilled in data analysis and interpretation, remain a serious bottleneck. To this end, cloud computing and Software-as-a-Service (SaaS) technologies can help address these issues. We successfully enabled the Atlas2 Cloud pipeline for personal genome analysis on two different cloud service platforms: a community cloud via the Genboree Workbench, and a commercial cloud via the Amazon Web Services using the Software-as-a-Service model. We report a case study of personal genome analysis using our Atlas2 Genboree pipeline. We also outline a detailed cost structure for running Atlas2 Amazon on whole exome capture data, providing cost projections in terms of storage, compute and I/O when running Atlas2 Amazon on a large data set. We find that providing a web interface and an optimized pipeline clearly facilitates usage of cloud computing for personal genome analysis, but for it to be routinely used for large scale projects there needs to be a paradigm shift in the way we develop tools, in standard operating procedures, and in funding mechanisms.

  6. High End Computer Network Testbedding at NASA Goddard Space Flight Center

    NASA Technical Reports Server (NTRS)

    Gary, James Patrick

    1998-01-01

    The Earth & Space Data Computing (ESDC) Division, at the Goddard Space Flight Center, is involved in developing and demonstrating various high end computer networking capabilities. The ESDC has several high end supercomputers. These are used: (1) to run computer simulations of the climate system; (2) to support the Earth and Space Sciences (ESS) project; and (3) to support the Grand Challenge (GC) Science, which is aimed at understanding the turbulent convection and dynamos in stars. GC research occurs at many sites throughout the country, and this research is enabled, in part, by multiple high performance network interconnections. The application drivers for High End Computer Networking use distributed supercomputing to support virtual reality applications, such as TerraVision (i.e., a three-dimensional browser of remotely accessed data), and Cave Automatic Virtual Environments (CAVE). Workstations can access and display data from multiple CAVEs with video servers, which allows for group/project collaborations using a combination of video, data, voice and shared whiteboarding. The ESDC is also developing and demonstrating a high degree of interoperability between satellite and terrestrial-based networks. To this end, the ESDC is conducting research and evaluations of new computer networking protocols and related technologies which improve the interoperability of satellite and terrestrial networks. The ESDC is also involved in the Security Proof of Concept Keystone (SPOCK) program sponsored by the National Security Agency (NSA). The SPOCK activity provides a forum for government users and security technology providers to share information on security requirements, emerging technologies and new product developments. Also, the ESDC is involved in the Trans-Pacific Digital Library Experiment, which aims to demonstrate and evaluate the use of high performance satellite communications and advanced data communications protocols to enable interactive digital library data access between the U.S. Library of Congress, the National Library of Japan and other digital library sites at 155 MegaBytes Per Second. The ESDC participation in this program is the Trans-Pacific access to GLOBE visualizations in real time. ESDC is participating in the Department of Defense's ATDNet with Multiwavelength Optical Network (MONET), a fully switched Wavelength Division Networking testbed. This presentation is in viewgraph format.

  7. Elucidating reaction mechanisms on quantum computers.

    PubMed

    Reiher, Markus; Wiebe, Nathan; Svore, Krysta M; Wecker, Dave; Troyer, Matthias

    2017-07-18

    With rapid recent advances in quantum technology, we are close to the threshold of quantum devices whose computational powers can exceed those of classical supercomputers. Here, we show that a quantum computer can be used to elucidate reaction mechanisms in complex chemical systems, using the open problem of biological nitrogen fixation in nitrogenase as an example. We discuss how quantum computers can augment classical computer simulations used to probe these reaction mechanisms, to significantly increase their accuracy and enable hitherto intractable simulations. Our resource estimates show that, even when taking into account the substantial overhead of quantum error correction, and the need to compile into discrete gate sets, the necessary computations can be performed in reasonable time on small quantum computers. Our results demonstrate that quantum computers will be able to tackle important problems in chemistry without requiring exorbitant resources.

  8. Elucidating reaction mechanisms on quantum computers

    PubMed Central

    Reiher, Markus; Wiebe, Nathan; Svore, Krysta M.; Wecker, Dave; Troyer, Matthias

    2017-01-01

    With rapid recent advances in quantum technology, we are close to the threshold of quantum devices whose computational powers can exceed those of classical supercomputers. Here, we show that a quantum computer can be used to elucidate reaction mechanisms in complex chemical systems, using the open problem of biological nitrogen fixation in nitrogenase as an example. We discuss how quantum computers can augment classical computer simulations used to probe these reaction mechanisms, to significantly increase their accuracy and enable hitherto intractable simulations. Our resource estimates show that, even when taking into account the substantial overhead of quantum error correction, and the need to compile into discrete gate sets, the necessary computations can be performed in reasonable time on small quantum computers. Our results demonstrate that quantum computers will be able to tackle important problems in chemistry without requiring exorbitant resources. PMID:28674011

  9. Elucidating reaction mechanisms on quantum computers

    NASA Astrophysics Data System (ADS)

    Reiher, Markus; Wiebe, Nathan; Svore, Krysta M.; Wecker, Dave; Troyer, Matthias

    2017-07-01

    With rapid recent advances in quantum technology, we are close to the threshold of quantum devices whose computational powers can exceed those of classical supercomputers. Here, we show that a quantum computer can be used to elucidate reaction mechanisms in complex chemical systems, using the open problem of biological nitrogen fixation in nitrogenase as an example. We discuss how quantum computers can augment classical computer simulations used to probe these reaction mechanisms, to significantly increase their accuracy and enable hitherto intractable simulations. Our resource estimates show that, even when taking into account the substantial overhead of quantum error correction, and the need to compile into discrete gate sets, the necessary computations can be performed in reasonable time on small quantum computers. Our results demonstrate that quantum computers will be able to tackle important problems in chemistry without requiring exorbitant resources.

  10. Machine learning for Big Data analytics in plants.

    PubMed

    Ma, Chuang; Zhang, Hao Helen; Wang, Xiangfeng

    2014-12-01

    Rapid advances in high-throughput genomic technology have enabled biology to enter the era of 'Big Data' (large datasets). The plant science community not only needs to build its own Big-Data-compatible parallel computing and data management infrastructures, but also to seek novel analytical paradigms to extract information from the overwhelming amounts of data. Machine learning offers promising computational and analytical solutions for the integrative analysis of large, heterogeneous and unstructured datasets on the Big-Data scale, and is gradually gaining popularity in biology. This review introduces the basic concepts and procedures of machine-learning applications and envisages how machine learning could interface with Big Data technology to facilitate basic research and biotechnology in the plant sciences. Copyright © 2014 Elsevier Ltd. All rights reserved.
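    As a minimal, hypothetical illustration of the kind of supervised machine-learning workflow the review discusses (the expression matrix and phenotype below are synthetic, and scikit-learn is merely one convenient toolkit, not one prescribed by the authors):

        # Minimal sketch with synthetic data: train a classifier on a pretend
        # gene-expression matrix and rank features for follow-up.
        import numpy as np
        from sklearn.ensemble import RandomForestClassifier
        from sklearn.model_selection import cross_val_score

        rng = np.random.default_rng(0)
        # Pretend expression matrix: 200 samples x 1000 genes, binary phenotype.
        X = rng.normal(size=(200, 1000))
        y = (X[:, :10].sum(axis=1) > 0).astype(int)  # phenotype driven by 10 genes

        model = RandomForestClassifier(n_estimators=200, random_state=0)
        scores = cross_val_score(model, X, y, cv=5)
        print(f"5-fold CV accuracy: {scores.mean():.2f} +/- {scores.std():.2f}")

        # Feature importances point back at candidate genes for follow-up work.
        model.fit(X, y)
        top = np.argsort(model.feature_importances_)[::-1][:10]
        print("top-ranked features:", top)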

  11. High-Throughput Thermodynamic Modeling and Uncertainty Quantification for ICME

    NASA Astrophysics Data System (ADS)

    Otis, Richard A.; Liu, Zi-Kui

    2017-05-01

    One foundational component of the integrated computational materials engineering (ICME) and Materials Genome Initiative is the computational thermodynamics based on the calculation of phase diagrams (CALPHAD) method. The CALPHAD method pioneered by Kaufman has enabled the development of thermodynamic, atomic mobility, and molar volume databases of individual phases in the full space of temperature, composition, and sometimes pressure for technologically important multicomponent engineering materials, along with sophisticated computational tools for using the databases. In this article, our recent efforts will be presented in terms of developing new computational tools for high-throughput modeling and uncertainty quantification based on high-throughput, first-principles calculations and the CALPHAD method along with their potential propagations to downstream ICME modeling and simulations.
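    A hedged illustration of the uncertainty-quantification idea (this is not the authors' tooling or data): propagating an uncertain binary interaction parameter through a CALPHAD-style regular-solution Gibbs energy of mixing by simple Monte Carlo sampling.

        # Monte Carlo propagation of an uncertain interaction parameter L0
        # through a binary regular-solution Gibbs energy of mixing (illustrative
        # values only; real CALPHAD assessments involve many more parameters).
        import numpy as np

        R = 8.314  # J/(mol K)

        def gibbs_mix(x, T, L0):
            """Ideal mixing entropy plus a single Redlich-Kister term."""
            return R * T * (x * np.log(x) + (1 - x) * np.log(1 - x)) + L0 * x * (1 - x)

        rng = np.random.default_rng(42)
        # Assume L0 is known only to within a standard deviation of 2 kJ/mol.
        L0_samples = rng.normal(loc=-15e3, scale=2e3, size=5000)

        x = np.linspace(0.01, 0.99, 99)
        T = 1000.0  # K
        G = np.array([gibbs_mix(x, T, L0) for L0 in L0_samples])  # shape (5000, 99)

        mean = G.mean(axis=0)
        lo, hi = np.percentile(G, 2.5, axis=0), np.percentile(G, 97.5, axis=0)
        i = np.argmin(mean)
        print(f"Minimum mean G_mix at x = {x[i]:.2f}: {mean[i]:.0f} J/mol "
              f"(95% band {lo[i]:.0f} to {hi[i]:.0f})")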

  12. NASA Advanced Supercomputing Facility Expansion

    NASA Technical Reports Server (NTRS)

    Thigpen, William W.

    2017-01-01

    The NASA Advanced Supercomputing (NAS) Division enables advances in high-end computing technologies and in modeling and simulation methods to tackle some of the toughest science and engineering challenges facing NASA today. The name "NAS" has long been associated with leadership and innovation throughout the high-end computing (HEC) community. We play a significant role in shaping HEC standards and paradigms, and provide leadership in the areas of large-scale InfiniBand fabrics, Lustre open-source filesystems, and hyperwall technologies. We provide an integrated high-end computing environment to accelerate NASA missions and make revolutionary advances in science. Pleiades, a petaflop-scale supercomputer, is used by scientists throughout the U.S. to support NASA missions, and is ranked among the most powerful systems in the world. One of our key focus areas is in modeling and simulation to support NASA's real-world engineering applications and make fundamental advances in modeling and simulation methods.

  13. Blind topological measurement-based quantum computation.

    PubMed

    Morimae, Tomoyuki; Fujii, Keisuke

    2012-01-01

    Blind quantum computation is a novel secure quantum-computing protocol that enables Alice, who does not have sufficient quantum technology at her disposal, to delegate her quantum computation to Bob, who has a fully fledged quantum computer, in such a way that Bob cannot learn anything about Alice's input, output and algorithm. A recent proof-of-principle experiment demonstrating blind quantum computation in an optical system has raised new challenges regarding the scalability of blind quantum computation in realistic noisy conditions. Here we show that fault-tolerant blind quantum computation is possible in a topologically protected manner using the Raussendorf-Harrington-Goyal scheme. The error threshold of our scheme is 4.3 × 10⁻³, which is comparable to that (7.5 × 10⁻³) of non-blind topological quantum computation. As the error per gate of the order 10⁻³ was already achieved in some experimental systems, our result implies that secure cloud quantum computation is within reach.

  14. Blind topological measurement-based quantum computation

    NASA Astrophysics Data System (ADS)

    Morimae, Tomoyuki; Fujii, Keisuke

    2012-09-01

    Blind quantum computation is a novel secure quantum-computing protocol that enables Alice, who does not have sufficient quantum technology at her disposal, to delegate her quantum computation to Bob, who has a fully fledged quantum computer, in such a way that Bob cannot learn anything about Alice's input, output and algorithm. A recent proof-of-principle experiment demonstrating blind quantum computation in an optical system has raised new challenges regarding the scalability of blind quantum computation in realistic noisy conditions. Here we show that fault-tolerant blind quantum computation is possible in a topologically protected manner using the Raussendorf-Harrington-Goyal scheme. The error threshold of our scheme is 4.3 × 10⁻³, which is comparable to that (7.5 × 10⁻³) of non-blind topological quantum computation. As the error per gate of the order 10⁻³ was already achieved in some experimental systems, our result implies that secure cloud quantum computation is within reach.

  15. Computational Approaches to Phenotyping

    PubMed Central

    Lussier, Yves A.; Liu, Yang

    2007-01-01

    The recent completion of the Human Genome Project has made possible a high-throughput “systems approach” for accelerating the elucidation of molecular underpinnings of human diseases, and subsequent derivation of molecular-based strategies to more effectively prevent, diagnose, and treat these diseases. Although altered phenotypes are among the most reliable manifestations of altered gene functions, research using systematic analysis of phenotype relationships to study human biology is still in its infancy. This article focuses on the emerging field of high-throughput phenotyping (HTP) phenomics research, which aims to capitalize on novel high-throughput computation and informatics technology developments to derive genomewide molecular networks of genotype–phenotype associations, or “phenomic associations.” The HTP phenomics research field faces the challenge of technological research and development to generate novel tools in computation and informatics that will allow researchers to amass, access, integrate, organize, and manage phenotypic databases across species and enable genomewide analysis to associate phenotypic information with genomic data at different scales of biology. Key state-of-the-art technological advancements critical for HTP phenomics research are covered in this review. In particular, we highlight the power of computational approaches to conduct large-scale phenomics studies. PMID:17202287

  16. Multi-scale computation methods: Their applications in lithium-ion battery research and development

    NASA Astrophysics Data System (ADS)

    Siqi, Shi; Jian, Gao; Yue, Liu; Yan, Zhao; Qu, Wu; Wangwei, Ju; Chuying, Ouyang; Ruijuan, Xiao

    2016-01-01

    Based upon advances in theoretical algorithms, modeling and simulations, and computer technologies, the rational design of materials, cells, devices, and packs in the field of lithium-ion batteries is being realized incrementally and will at some point trigger a paradigm revolution by combining calculations and experiments linked by a big shared database, enabling accelerated development of the whole industrial chain. Theory and multi-scale modeling and simulation, as supplements to experimental efforts, can help greatly to close some of the current experimental and technological gaps, as well as predict path-independent properties and help to fundamentally understand path-independent performance in multiple spatial and temporal scales. Project supported by the National Natural Science Foundation of China (Grant Nos. 51372228 and 11234013), the National High Technology Research and Development Program of China (Grant No. 2015AA034201), and Shanghai Pujiang Program, China (Grant No. 14PJ1403900).

  17. Radiology: "killer app" for next generation networks?

    PubMed

    McNeill, Kevin M

    2004-03-01

    The core principles of digital radiology were well developed by the end of the 1980s. During the following decade, tremendous improvements in computer technology enabled realization of those principles at an affordable cost. In this decade, work can focus on highly distributed radiology in the context of the integrated health care enterprise. Over the same period, computer networking has evolved from a relatively obscure field used by a small number of researchers across low-speed serial links to a pervasive technology that affects nearly all facets of society. Development directions in network technology will ultimately provide end-to-end data paths with speeds that match or exceed the speeds of data paths within the local network and even within workstations. This article describes key developments in Next Generation Networks, potential obstacles, and scenarios in which digital radiology can become a "killer app" that helps to drive deployment of new network infrastructure.

  18. Designing and using computer simulations in medical education and training: an introduction.

    PubMed

    Friedl, Karl E; O'Neil, Harold F

    2013-10-01

    Computer-based technologies informed by the science of learning are becoming increasingly prevalent in education and training. For the Department of Defense (DoD), this presents a great potential advantage to the effective preparation of a new generation of technologically enabled service members. Military medicine has broad education and training challenges ranging from first aid and personal protective skills for every service member to specialized combat medic training; many of these challenges can be met with gaming and simulation technologies that this new generation has embraced. However, comprehensive use of medical games and simulation to augment expert mentorship is still limited to elite medical provider training programs, but can be expected to become broadly used in the training of first responders and allied health care providers. The purpose of this supplement is to review the use of computer games and simulation to teach and assess medical knowledge and skills. This review and other DoD research policy sources will form the basis for development of a research and development road map and guidelines for use of this technology in military medicine. Reprint & Copyright © 2013 Association of Military Surgeons of the U.S.

  19. FY10 Engineering Innovations, Research and Technology Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lane, M A; Aceves, S M; Paulson, C N

    This report summarizes key research, development, and technology advancements in Lawrence Livermore National Laboratory's Engineering Directorate for FY2010. These efforts exemplify Engineering's nearly 60-year history of developing and applying the technology innovations needed for the Laboratory's national security missions, and embody Engineering's mission to "Enable program success today and ensure the Laboratory's vitality tomorrow." Leading off the report is a section featuring compelling engineering innovations. These innovations range from advanced hydrogen storage that enables clean vehicles, to new nuclear material detection technologies, to a landmine detection system using ultra-wideband ground-penetrating radar. Many have been recognized with R&D Magazine's prestigious R&D 100 Award; all are examples of the forward-looking application of innovative engineering to pressing national problems and challenging customer requirements. Engineering's capability development strategy includes both fundamental research and technology development. Engineering research creates the competencies of the future where discovery-class groundwork is required. Our technology development (or reduction to practice) efforts enable many of the research breakthroughs across the Laboratory to translate from the world of basic research to the national security missions of the Laboratory. This portfolio approach produces new and advanced technological capabilities, and is a unique component of the value proposition of the Lawrence Livermore Laboratory. The balance of the report highlights this work in research and technology, organized into thematic technical areas: Computational Engineering; Micro/Nano-Devices and Structures; Measurement Technologies; Engineering Systems for Knowledge Discovery; and Energy Manipulation. Our investments in these areas serve not only known programmatic requirements of today and tomorrow, but also anticipate the breakthrough engineering innovations that will be needed in the future.

  20. Advanced Avionics and Processor Systems for a Flexible Space Exploration Architecture

    NASA Technical Reports Server (NTRS)

    Keys, Andrew S.; Adams, James H.; Smith, Leigh M.; Johnson, Michael A.; Cressler, John D.

    2010-01-01

    The Advanced Avionics and Processor Systems (AAPS) project, formerly known as the Radiation Hardened Electronics for Space Environments (RHESE) project, endeavors to develop advanced avionic and processor technologies anticipated to be used by NASA's currently evolving space exploration architectures. The AAPS project is a part of the Exploration Technology Development Program, which funds an entire suite of technologies that are aimed at enabling NASA's ability to explore beyond low Earth orbit. NASA's Marshall Space Flight Center (MSFC) manages the AAPS project. AAPS uses a broad-scoped approach to developing avionic and processor systems. Investment areas include advanced electronic designs and technologies capable of providing environmental hardness, reconfigurable computing techniques, software tools for radiation effects assessment, and radiation environment modeling tools. Near-term emphasis within the multiple AAPS tasks focuses on developing prototype components using semiconductor processes and materials (such as Silicon-Germanium (SiGe)) to enhance a device's tolerance to radiation events and low-temperature environments. As the SiGe technology will culminate in a delivered prototype this fiscal year, the project shifts its focus to developing low-power, high-efficiency total processor hardening techniques. In addition to processor development, the project endeavors to demonstrate techniques applicable to reconfigurable computing and partially reconfigurable Field Programmable Gate Arrays (FPGAs). This capability gives avionic architectures the ability to develop FPGA-based, radiation-tolerant processor boards that can serve in multiple physical locations throughout the spacecraft and perform multiple functions during the course of the mission. The individual tasks that comprise AAPS are diverse, yet united in the common endeavor to develop electronics capable of operating within the harsh environment of space. Specifically, the AAPS tasks for the Federal fiscal year of 2010 are: Silicon-Germanium (SiGe) Integrated Electronics for Extreme Environments, Modeling of Radiation Effects on Electronics, Radiation Hardened High Performance Processors (HPP), and Reconfigurable Computing.

  1. Identifying the impact of G-quadruplexes on Affymetrix 3' arrays using cloud computing.

    PubMed

    Memon, Farhat N; Owen, Anne M; Sanchez-Graillet, Olivia; Upton, Graham J G; Harrison, Andrew P

    2010-01-15

    A tetramer quadruplex structure is formed by four parallel strands of DNA/RNA containing runs of guanine. These quadruplexes are able to form because guanine can Hoogsteen hydrogen bond to other guanines, and a tetrad of guanines can form a stable arrangement. Recently we have discovered that probes on Affymetrix GeneChips that contain runs of guanine do not measure gene expression reliably. We associate this finding with the likelihood that quadruplexes are forming on the surface of GeneChips. In order to cope with the rapidly expanding size of GeneChip array datasets in the public domain, we are exploring the use of cloud computing to replicate our experiments on 3' arrays to look at the effect of the location of G-spots (runs of guanines). Cloud computing is a recently introduced high-performance solution that takes advantage of the computational infrastructure of large organisations such as Amazon and Google. We expect that cloud computing will become widely adopted because it enables bioinformaticians to avoid capital expenditure on expensive computing resources and to only pay a cloud computing provider for what is used. Moreover, as well as being financially efficient, cloud computing is an ecologically friendly technology; it enables efficient data-sharing, and we expect it to be faster for development purposes. Here we propose the advantageous use of cloud computing to perform a large data-mining analysis of public domain 3' arrays.
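    The core "G-spot" screen itself is simple; a small sketch follows (the probe sequences are hypothetical, and the published analysis scanned full Affymetrix 3' array probe sets on cloud resources rather than toy examples):

        # Flag probes containing a run of four or more guanines, where
        # quadruplexes may form and distort the expression signal.
        import re

        G_RUN = re.compile(r"G{4,}")

        def has_g_spot(probe_seq: str) -> bool:
            """True if the probe contains a run of >= 4 consecutive guanines."""
            return bool(G_RUN.search(probe_seq.upper()))

        probes = {
            "probe_A": "ATCGGGGTACGTTAGCATCGATCGA",   # contains GGGG -> suspect
            "probe_B": "ATCGAGGTACGTTAGCATCGATCGA",   # no run of 4 Gs
        }
        for name, seq in probes.items():
            print(name, "G-spot" if has_g_spot(seq) else "ok")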

  2. Computer Technology-Integrated Projects Should not Supplant Craft Projects in Science Education

    NASA Astrophysics Data System (ADS)

    Klopp, Tabatha J.; Rule, Audrey C.; Suchsland Schneider, Jean; Boody, Robert M.

    2014-03-01

    The current emphasis on computer technology integration and narrowing of the curriculum has displaced arts and crafts. However, the hands-on, concrete nature of craft work in science modeling enables students to understand difficult concepts and to be engaged and motivated while learning spatial, logical, and sequential thinking skills. Analogy use is also helpful in understanding unfamiliar, complex science concepts. This study of 28 academically advanced elementary to middle-school students examined student work and perceptions during a science unit focused on four fossil organisms: crinoid, brachiopod, horn coral and trilobite. The study compared: (1) analogy-focused instruction to independent Internet research and (2) computer technology-rich products to crafts-based products. Findings indicate student products were more creative after analogy-based instruction and when made using technology. However, students expressed a strong desire to engage in additional craft work after making craft products and enjoyed making crafts more after analogy-focused instruction. Additionally, more science content was found in the craft products than the technology-rich products. Students expressed a particular liking for two of the fossil organisms because they had been modeled with crafts. The authors recommend that room should be retained for crafts in the science curriculum to model science concepts.

  3. BNCI systems as a potential assistive technology: ethical issues and participatory research in the BrainAble project.

    PubMed

    Carmichael, Clare; Carmichael, Patrick

    2014-01-01

    This paper highlights aspects related to current research and thinking about ethical issues in relation to Brain Computer Interface (BCI) and Brain-Neuronal Computer Interfaces (BNCI) research through the experience of one particular project, BrainAble, which is exploring and developing the potential of these technologies to enable people with complex disabilities to control computers. It describes how ethical practice has been developed both within the multidisciplinary research team and with participants. The paper presents findings in which participants shared their views of the project prototypes, of the potential of BCI/BNCI systems as an assistive technology, and of their other possible applications. This draws attention to the importance of ethical practice in projects where high expectations of technologies, and representations of "ideal types" of disabled users may reinforce stereotypes or drown out participant "voices". Ethical frameworks for research and development in emergent areas such as BCI/BNCI systems should be based on broad notions of a "duty of care" while being sufficiently flexible that researchers can adapt project procedures according to participant needs. They need to be frequently revisited, not only in the light of experience, but also to ensure they reflect new research findings and ever more complex and powerful technologies.

  4. Effectiveness of e-Lab Use in Science Teaching at the Omani Schools

    ERIC Educational Resources Information Center

    Al Musawi, A.; Ambusaidi, A.; Al-Balushi, S.; Al-Balushi, K.

    2015-01-01

    Computer and information technology can be used so that students can, individually, in groups, or by electronic demonstration, experiment and draw conclusions for the required activities in an electronic form, in what is now called an "e-lab". It enables students to conduct experiments more flexibly and in an interactive way using multimedia.…

  5. A Computational Method for Enabling Teaching-Learning Process in Huge Online Courses and Communities

    ERIC Educational Resources Information Center

    Mora, Higinio; Ferrández, Antonio; Gil, David; Peral, Jesús

    2017-01-01

    Massive Open Online Courses and e-learning represent the future of the teaching-learning processes through the development of Information and Communication Technologies. They are the response to the new education needs of society. However, this future also presents many challenges such as the processing of online forums when a huge number of…

  6. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Troy Hiltbrand; Daniel Jones

    As we look at the cyber security ecosystem, are we planning to fight the battle as we did yesterday, with firewalls and intrusion detection systems (IDS), or are we sensing a change in how security is evolving and planning accordingly? With the technology enablement and possible financial benefits of cloud computing, the traditional tools for establishing and maintaining our cyber security ecosystems are being dramatically altered.

  7. Use of Short Podcasts to Reinforce Learning Outcomes in Biology

    ERIC Educational Resources Information Center

    Aguiar, Cristina; Carvalho, Ana Amelia; Carvalho, Carla Joana

    2009-01-01

    Podcasts are audio or video files which can be automatically downloaded to one's computer when the episodes become available, then later transferred to a portable player for listening. The technology thereby enables the user to listen to and/or watch the content anywhere at any time. Formerly popular as radio shows, podcasting was rapidly explored…

  8. Using assistive technology for schoolwork: the experience of children with physical disabilities.

    PubMed

    Murchland, Sonya; Parkyn, Helen

    2010-01-01

    This study explored the experience of children with physical disabilities using assistive technology for participation with schoolwork to gain a greater understanding of their perspectives and subjective experiences. A qualitative study involving thematic analysis of in-depth interviews of the child with a parent or significant adult. Purposeful sampling from a larger study recruited five children aged between 10 and 14 years, with differing physical disabilities who attended mainstream schools. All children used computer-based assistive technology. All of the children recognised that assistive technology enabled them to participate and reduced the impact of their physical disability, allowing independent participation, and facilitated higher learning outcomes. Issues related to ease of use, social implications and assistive technology systems are discussed.

  9. Development of a Multi-Disciplinary Computing Environment (MDICE)

    NASA Technical Reports Server (NTRS)

    Kingsley, Gerry; Siegel, John M., Jr.; Harrand, Vincent J.; Lawrence, Charles; Luker, Joel J.

    1999-01-01

    The growing need for and importance of multi-component and multi-disciplinary engineering analysis has been understood for many years. For many applications, loose (or semi-implicit) coupling is optimal, and allows the use of various legacy codes without requiring major modifications. For this purpose, CFDRC and NASA LeRC have developed a computational environment to enable coupling between various flow analysis codes at several levels of fidelity. This has been referred to as the Visual Computing Environment (VCE), and is being successfully applied to the analysis of several aircraft engine components. Recently, CFDRC and AFRL/VAAC (WL) have extended the framework and scope of VCE to enable complex multi-disciplinary simulations. The chosen initial focus is on aeroelastic aircraft applications. The developed software is referred to as MDICE-AE, an extensible system suitable for integration of several engineering analysis disciplines. This paper describes the methodology, basic architecture, chosen software technologies, salient library modules, and the current status of and plans for MDICE. A fluid-structure interaction application is described in a separate companion paper.
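    A generic sketch of the loose (semi-implicit) coupling idea described above (this is not the MDICE or VCE API; the two solver functions are stand-ins for fluid and structural codes that exchange interface data once per coupling iteration, with an under-relaxed update):

        # Stand-in "legacy codes" coupled loosely through interface data.
        def fluid_solver(displacement):
            """Stand-in CFD step: returns an interface load for a wall displacement."""
            return 100.0 - 40.0 * displacement

        def structure_solver(load):
            """Stand-in structural step: returns the displacement caused by the load."""
            return load / 200.0

        def couple(max_iters=50, relax=0.5, tol=1e-8):
            disp = 0.0
            for it in range(max_iters):
                load = fluid_solver(disp)            # discipline 1
                new_disp = structure_solver(load)    # discipline 2
                if abs(new_disp - disp) < tol:
                    return disp, it
                disp = disp + relax * (new_disp - disp)  # under-relaxed update
            return disp, max_iters

        disp, iters = couple()
        print(f"converged interface displacement = {disp:.6f} after {iters} iterations")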

  10. A PICKSC Science Gateway for enabling the common plasma physicist to run kinetic software

    NASA Astrophysics Data System (ADS)

    Hu, Q.; Winjum, B. J.; Zonca, A.; Youn, C.; Tsung, F. S.; Mori, W. B.

    2017-10-01

    Computer simulations offer tremendous opportunities for studying plasmas, ranging from simulations for students that illuminate fundamental educational concepts to research-level simulations that advance scientific knowledge. Nevertheless, there is a significant hurdle to using simulation tools. Users must navigate codes and software libraries, determine how to wrangle output into meaningful plots, and oftentimes confront a significant cyberinfrastructure with powerful computational resources. Science gateways offer a Web-based environment to run simulations without needing to learn or manage the underlying software and computing cyberinfrastructure. We discuss our progress on creating a Science Gateway for the Particle-in-Cell and Kinetic Simulation Software Center that enables users to easily run and analyze kinetic simulations with our software. We envision that this technology could benefit a wide range of plasma physicists, both in the use of our simulation tools as well as in its adaptation for running other plasma simulation software. Supported by NSF under Grant ACI-1339893 and by the UCLA Institute for Digital Research and Education.

  11. A Spacelab Expert System for Remote Engineering and Science

    NASA Technical Reports Server (NTRS)

    Groleau, Nick; Colombano, Silvano; Friedland, Peter (Technical Monitor)

    1994-01-01

    NASA's space science program is based on strictly pre-planned activities. This approach does not always result in the best science. We describe an existing computer system that enables space science to be conducted in a more reactive manner through advanced automation techniques that were recently used on the SLS-2 space shuttle flight of October 1993. Advanced computing techniques, usually developed in the field of Artificial Intelligence, allow large portions of the scientific investigator's knowledge to be "packaged" in a portable computer to present advice to the astronaut operator. We strongly believe that this technology has wide applicability to other forms of remote science/engineering. In this brief article, we present the technology of remote science/engineering assistance as implemented for the SLS-2 space shuttle flight. We begin with a logical overview of the system (paying particular attention to the implementation details relevant to the use of the embedded knowledge for system reasoning), then describe its use and success in space, and conclude with ideas about possible earth uses of the technology in the life and medical sciences.

  12. The future challenge for aeropropulsion

    NASA Technical Reports Server (NTRS)

    Rosen, Robert; Bowditch, David N.

    1992-01-01

    NASA's research in aeropropulsion is focused on improving the efficiency, capability, and environmental compatibility for all classes of future aircraft. The development of innovative concepts, and theoretical, experimental, and computational tools provide the knowledge base for continued propulsion system advances. Key enabling technologies include advances in internal fluid mechanics, structures, light-weight high-strength composite materials, and advanced sensors and controls. Recent emphasis has been on the development of advanced computational tools in internal fluid mechanics, structural mechanics, reacting flows, and computational chemistry. For subsonic transport applications, very high bypass ratio turbofans with increased engine pressure ratio are being investigated to increase fuel efficiency and reduce airport noise levels. In a joint supersonic cruise propulsion program with industry, the critical environmental concerns of emissions and community noise are being addressed. NASA is also providing key technologies for the National Aerospaceplane, and is studying propulsion systems that provide the capability for aircraft to accelerate to and cruise in the Mach 4-6 speed range. The combination of fundamental, component, and focused technology development underway at NASA will make possible dramatic advances in aeropropulsion efficiency and environmental compatibility for future aeronautical vehicles.

  13. [Facing the challenges of ubiquitous computing in the health care sector].

    PubMed

    Georgieff, Peter; Friedewald, Michael

    2010-01-01

    The steady progress of microelectronics, communications and information technology will enable the realisation of the vision for "ubiquitous computing", where the Internet extends into the real world, embracing everyday objects. The necessary technical basis is already in place. Due to their diminishing size, constantly falling price and declining energy consumption, processors, communications modules and sensors are being increasingly integrated into everyday objects today. This development is opening up huge opportunities for both the economy and individuals. In the present paper we discuss possible applications, but also technical, social and economic barriers to a wide-spread use of ubiquitous computing in the health care sector.

  14. Bioinformatics for Exploration

    NASA Technical Reports Server (NTRS)

    Johnson, Kathy A.

    2006-01-01

    For the purpose of this paper, bioinformatics is defined as the application of computer technology to the management of biological information. It can be thought of as the science of developing computer databases and algorithms to facilitate and expedite biological research. This is a crosscutting capability that supports nearly all human health areas ranging from computational modeling, to pharmacodynamics research projects, to decision support systems within autonomous medical care. Bioinformatics serves to increase the efficiency and effectiveness of the life sciences research program. It provides data, information, and knowledge capture which further supports management of the bioastronautics research roadmap - identifying gaps that still remain and enabling the determination of which risks have been addressed.

  15. Simulating Human Cognition in the Domain of Air Traffic Control

    NASA Technical Reports Server (NTRS)

    Freed, Michael; Johnston, James C.; Null, Cynthia H. (Technical Monitor)

    1995-01-01

    Experiments intended to assess performance in human-machine interactions are often prohibitively expensive, unethical or otherwise impractical to run. Approximations of experimental results can be obtained, in principle, by simulating the behavior of subjects using computer models of human mental behavior. Computer simulation technology has been developed for this purpose. Our goal is to produce a cognitive model suitable to guide the simulation machinery and enable it to closely approximate a human subject's performance in experimental conditions. The described model is designed to simulate a variety of cognitive behaviors involved in routine air traffic control. As the model is elaborated, our ability to predict the effects of novel circumstances on controller error rates and other performance characteristics should increase. This will enable the system to project the impact of proposed changes to air traffic control procedures and equipment on controller performance.

  16. Strategic Computing Computer Vision: Taking Image Understanding To The Next Plateau

    NASA Astrophysics Data System (ADS)

    Simpson, R. L., Jr.

    1987-06-01

    The overall objective of the Strategic Computing (SC) Program of the Defense Advanced Research Projects Agency (DARPA) is to develop and demonstrate a new generation of machine intelligence technology which can form the basis for more capable military systems in the future and also maintain a position of world leadership for the US in computer technology. Begun in 1983, SC represents a focused research strategy for accelerating the evolution of new technology and its rapid prototyping in realistic military contexts. Among the very ambitious demonstration prototypes being developed within the SC Program are: 1) the Pilot's Associate, which will aid the pilot in route planning, aerial target prioritization, evasion of missile threats, and aircraft emergency safety procedures during flight; 2) the AirLand Battle Management (ALBM) program, a battle management project for the Army which is just getting started and will use knowledge-based systems technology to assist in the generation and evaluation of tactical options and plans at the Corps level; 3) the other, more established battle management program, for the Navy: the Fleet Command Center Battle Management Program (FCCBMP) at Pearl Harbor. The FCCBMP is employing knowledge-based systems and natural language technology in an evolutionary testbed situated in an operational command center to demonstrate and evaluate intelligent decision aids that can assist in the evaluation of fleet readiness and explore alternatives during contingencies; and 4) the Autonomous Land Vehicle (ALV), which integrates in a major robotic testbed the technologies for dynamic image understanding and knowledge-based route planning with replanning during execution, hosted on new advanced parallel architectures. The goal of the Strategic Computing computer vision technology base (SCVision) is to develop generic technology that will enable the construction of complete, robust, high-performance image understanding systems to support a wide range of DoD applications. Possible applications include autonomous vehicle navigation, photointerpretation, smart weapons, and robotic manipulation. This paper provides an overview of the technical and program management plans being used in evolving this critical national technology.

  17. Medical informatics--an Australian perspective.

    PubMed

    Hannan, T

    1991-06-01

    Computers, like the X-ray and the stethoscope, can be seen as clinical tools that provide physicians with improved expertise in solving patient management problems. As tools, they enable us to extend our clinical information base, and they also provide facilities that improve the delivery of the health care we provide. Automation (computerisation) in the health domain will cause the computer to become a more integral part of health care management and delivery before the start of the next century. To understand how the computer assists those who deliver and manage health care, it is important to be aware of its functional capabilities and how we can use them in medical practice. The rapid technological advances in computers over the last two decades have had both beneficial and counterproductive effects on the implementation of effective computer applications in the delivery of health care. For example, in the 1990s the computer hobbyist is able to make an investment of less than $10,000 on computer hardware that will match or exceed the technological capacities of machines of the 1960s. These rapid technological advances, which have produced a quantum leap in our ability to store and process information, have tended to make us overlook the need for effective computer programmes which will meet the needs of patient care. As the 1990s begin, those delivering health care (e.g., physicians, nurses, pharmacists, administrators...) need to become more involved in directing the effective implementation of computer applications that will provide the tools for improved information management, knowledge processing, and ultimately better patient care.

  18. Cloud-based Jupyter Notebooks for Water Data Analysis

    NASA Astrophysics Data System (ADS)

    Castronova, A. M.; Brazil, L.; Seul, M.

    2017-12-01

    The development and adoption of technologies by the water science community to improve our ability to openly collaborate and share workflows will have a transformative impact on how we address the challenges associated with collaborative and reproducible scientific research. Jupyter notebooks offer one solution by providing an open-source platform for creating metadata-rich toolchains for modeling and data analysis applications. Adoption of this technology within the water sciences, coupled with publicly available datasets from agencies such as USGS, NASA, and EPA enables researchers to easily prototype and execute data intensive toolchains. Moreover, implementing this software stack in a cloud-based environment extends its native functionality to provide researchers a mechanism to build and execute toolchains that are too large or computationally demanding for typical desktop computers. Additionally, this cloud-based solution enables scientists to disseminate data processing routines alongside journal publications in an effort to support reproducibility. For example, these data collection and analysis toolchains can be shared, archived, and published using the HydroShare platform or downloaded and executed locally to reproduce scientific analysis. This work presents the design and implementation of a cloud-based Jupyter environment and its application for collecting, aggregating, and munging various datasets in a transparent, sharable, and self-documented manner. The goals of this work are to establish a free and open source platform for domain scientists to (1) conduct data intensive and computationally intensive collaborative research, (2) utilize high performance libraries, models, and routines within a pre-configured cloud environment, and (3) enable dissemination of research products. This presentation will discuss recent efforts towards achieving these goals, and describe the architectural design of the notebook server in an effort to support collaborative and reproducible science.
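    A notebook-style sketch of the kind of munging and aggregation toolchain described above, using synthetic data so it is self-contained (in practice the series would be pulled from a public service such as USGS NWIS before the same resampling steps are applied):

        # Synthesize an hourly "streamflow" record, then perform typical munging:
        # resample to daily means, flag low-flow days, and summarize by month.
        import numpy as np
        import pandas as pd

        idx = pd.date_range("2017-01-01", "2017-12-31 23:00", freq="H")
        rng = np.random.default_rng(1)
        flow = 50 + 20 * np.sin(2 * np.pi * idx.dayofyear / 365) + rng.normal(0, 5, len(idx))
        df = pd.DataFrame({"discharge_cfs": flow}, index=idx)

        daily = df.resample("D").mean()
        daily["low_flow"] = daily["discharge_cfs"] < daily["discharge_cfs"].quantile(0.1)
        monthly = daily["discharge_cfs"].resample("M").agg(["mean", "min", "max"])
        print(monthly.round(1))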

  19. Live theater on a virtual stage: incorporating soft skills and teamwork in computer graphics education.

    PubMed

    Schweppe, M; Geigel, J

    2011-01-01

    Industry has increasingly emphasized the need for "soft" or interpersonal skills development and team-building experience in the college curriculum. Here, we discuss our experiences with providing such opportunities via a collaborative project called the Virtual Theater. In this joint project between the Rochester Institute of Technology's School of Design and Department of Computer Science, the goal is to enable live performance in a virtual space with participants in different physical locales. Students work in teams, collaborating with other students in and out of their disciplines.

  20. Technical report: precisely fitting bars on implants in five steps-a CAD/CAM concept for the edentulous mandible.

    PubMed

    Beuer, Florian; Schweiger, Josef; Huber, Martin; Engels, Jörg; Stimmelmayr, Michael

    2014-06-01

    Various treatment concepts have been presented for the edentulous mandible. Manufacturing tension-free and precisely fitting bars on dental implants was previously a great challenge in prosthetic dentistry and required great effort. Modern computer aided design/computer aided manufacturing technology in combination with some clinical modifications of the established workflow enables the clinician to achieve precise results in a very efficient way. The innovative five-step concept is presented in a clinical case. © 2014 by the American College of Prosthodontists.

  1. Mechanical Properties in Metal-Organic Frameworks: Emerging Opportunities and Challenges for Device Functionality and Technological Applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Burtch, Nicholas C.; Heinen, Jurn; Bennett, Thomas D.

    We report that some of the most remarkable recent developments in metal–organic framework (MOF) performance properties can only be rationalized by the mechanical properties endowed by their hybrid inorganic–organic nanoporous structures. While these characteristics create intriguing application prospects, the same attributes also present challenges that will need to be overcome to enable the integration of MOFs with technologies where these promising traits can be exploited. In this review, emerging opportunities and challenges are identified for MOF-enabled device functionality and technological applications that arise from their fascinating mechanical properties. This is discussed not only in the context of their more well-studied gas storage and separation applications, but also for instances where MOFs serve as components of functional nanodevices. Recent advances in understanding MOF mechanical structure–property relationships due to attributes such as defects and interpenetration are highlighted, and open questions related to state-of-the-art computational approaches for quantifying their mechanical properties are critically discussed.

  2. Mechanical Properties in Metal-Organic Frameworks: Emerging Opportunities and Challenges for Device Functionality and Technological Applications

    DOE PAGES

    Burtch, Nicholas C.; Heinen, Jurn; Bennett, Thomas D.; ...

    2017-11-17

    We report that some of the most remarkable recent developments in metal–organic framework (MOF) performance properties can only be rationalized by the mechanical properties endowed by their hybrid inorganic–organic nanoporous structures. While these characteristics create intriguing application prospects, the same attributes also present challenges that will need to be overcome to enable the integration of MOFs with technologies where these promising traits can be exploited. In this review, emerging opportunities and challenges are identified for MOF-enabled device functionality and technological applications that arise from their fascinating mechanical properties. This is discussed not only in the context of their more well-studied gas storage and separation applications, but also for instances where MOFs serve as components of functional nanodevices. Recent advances in understanding MOF mechanical structure–property relationships due to attributes such as defects and interpenetration are highlighted, and open questions related to state-of-the-art computational approaches for quantifying their mechanical properties are critically discussed.

  3. Mechanical Properties in Metal-Organic Frameworks: Emerging Opportunities and Challenges for Device Functionality and Technological Applications.

    PubMed

    Burtch, Nicholas C; Heinen, Jurn; Bennett, Thomas D; Dubbeldam, David; Allendorf, Mark D

    2017-11-17

    Some of the most remarkable recent developments in metal-organic framework (MOF) performance properties can only be rationalized by the mechanical properties endowed by their hybrid inorganic-organic nanoporous structures. While these characteristics create intriguing application prospects, the same attributes also present challenges that will need to be overcome to enable the integration of MOFs with technologies where these promising traits can be exploited. In this review, emerging opportunities and challenges are identified for MOF-enabled device functionality and technological applications that arise from their fascinating mechanical properties. This is discussed not only in the context of their more well-studied gas storage and separation applications, but also for instances where MOFs serve as components of functional nanodevices. Recent advances in understanding MOF mechanical structure-property relationships due to attributes such as defects and interpenetration are highlighted, and open questions related to state-of-the-art computational approaches for quantifying their mechanical properties are critically discussed. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  4. Methodical and technological aspects of creation of interactive computer learning systems

    NASA Astrophysics Data System (ADS)

    Vishtak, N. M.; Frolov, D. A.

    2017-01-01

    The article presents a methodology for developing an interactive computer training system for power plant personnel. The methods used include a synthesis of scientific and methodological sources on computer-based training systems in vocational education, systems analysis, and structural and object-oriented modeling of information systems. The relevance of interactive computer training systems for personnel preparation in educational and training centers is demonstrated. The development stages of such systems are identified, and the factors governing their efficient use are analysed. A work procedure for each development stage is proposed that optimizes the time, financial, and labor expenditure required to create the interactive computer training system.

  5. The Tongue Enables Computer and Wheelchair Control for People with Spinal Cord Injury

    PubMed Central

    Kim, Jeonghee; Park, Hangue; Bruce, Joy; Sutton, Erica; Rowles, Diane; Pucci, Deborah; Holbrook, Jaimee; Minocha, Julia; Nardone, Beatrice; West, Dennis; Laumann, Anne; Roth, Eliot; Jones, Mike; Veledar, Emir; Ghovanloo, Maysam

    2015-01-01

    The Tongue Drive System (TDS) is a wireless and wearable assistive technology, designed to allow individuals with severe motor impairments such as tetraplegia to access their environment using voluntary tongue motion. Previous TDS trials used a magnetic tracer temporarily attached to the top surface of the tongue with tissue adhesive. We investigated TDS efficacy for controlling a computer and driving a powered wheelchair in two groups of able-bodied subjects and a group of volunteers with spinal cord injury (SCI) at C6 or above. All participants received a magnetic tongue barbell and used the TDS for five to six consecutive sessions. The performance of the group was compared for TDS versus keypad and TDS versus a sip-and-puff device (SnP) using accepted measures of speed and accuracy. All performance measures improved over the course of the trial. The gap between keypad and TDS performance narrowed for able-bodied subjects. Despite participants with SCI already having familiarity with the SnP, their performance measures were up to three times better with the TDS than with the SnP and continued to improve. TDS flexibility and the inherent characteristics of the human tongue enabled individuals with high-level motor impairments to access computers and drive wheelchairs at speeds that were faster than traditional assistive technologies but with comparable accuracy. PMID:24285485

  6. The Centre of High-Performance Scientific Computing, Geoverbund, ABC/J - Geosciences enabled by HPSC

    NASA Astrophysics Data System (ADS)

    Kollet, Stefan; Görgen, Klaus; Vereecken, Harry; Gasper, Fabian; Hendricks-Franssen, Harrie-Jan; Keune, Jessica; Kulkarni, Ketan; Kurtz, Wolfgang; Sharples, Wendy; Shrestha, Prabhakar; Simmer, Clemens; Sulis, Mauro; Vanderborght, Jan

    2016-04-01

    The Centre of High-Performance Scientific Computing (HPSC TerrSys) was founded in 2011 to establish a centre of competence in high-performance scientific computing in terrestrial systems and the geosciences, enabling fundamental and applied geoscientific research in the Geoverbund ABC/J (the geoscientific research alliance of the Universities of Aachen, Cologne, Bonn and the Research Centre Jülich, Germany). The specific goals of HPSC TerrSys are to achieve relevance at the national and international level in (i) the development and application of HPSC technologies in the geoscientific community; (ii) student education; (iii) HPSC services and support, also to the wider geoscientific community; and (iv) the industry and public sectors via, e.g., useful applications and data products. A key feature of HPSC TerrSys is the Simulation Laboratory Terrestrial Systems, which is located at the Jülich Supercomputing Centre (JSC) and provides extensive capabilities with respect to porting, profiling, tuning and performance monitoring of geoscientific software in JSC's supercomputing environment. We will present a summary of success stories of HPSC applications, including integrated terrestrial model development, parallel profiling and its application from watersheds to the continent; massively parallel data assimilation using physics-based models and ensemble methods; quasi-operational terrestrial water and energy monitoring; and convection-permitting climate simulations over Europe. The success stories stress the need for a formalized education of students in the application of HPSC technologies in the future.

  7. The Design and Implementation of NASA's Advanced Flight Computing Module

    NASA Technical Reports Server (NTRS)

    Alkakaj, Leon; Straedy, Richard; Jarvis, Bruce

    1995-01-01

    This paper describes a working flight computer Multichip Module (MCM) developed jointly by JPL and TRW under their respective research programs in a collaborative fashion. The MCM is fabricated by nCHIP and is packaged within a 2 by 4 inch Al package from Coors. This flight computer module is one of three modules under development by NASA's Advanced Flight Computer (AFC) program. Further development of the Mass Memory and the programmable I/O MCM modules will follow. The three building-block modules will then be stacked into a 3D MCM configuration. The mass and volume achieved for the flight computer MCM, 89 grams and 1.5 cubic inches respectively, represent a major enabling technology for future deep space as well as commercial remote sensing applications.

  8. Lobachevsky Year at Kazan University: Center of Science, Education, Intellectual-Cognitive Tourism "Kazan - GeoNa - 2020+" and "Kazan-Moon-2020+" projects

    NASA Astrophysics Data System (ADS)

    Gusev, A.; Trudkova, N.

    2017-09-01

    Center "GeoNa" will enable scientists and teachers of the Russian universities to join to advanced achievements of a science, information technologies; to establish scientific communications with foreign colleagues in sphere of the high technology, educational projects and Intellectual-Cognitive Tourism. The Project "Kazan - Moon - 2020+" is directed on the decision of fundamental problems of celestial mechanics, selenodesy and geophysics of the Moon(s) connected to carrying out of complex theoretical researches and computer modelling.

  9. Personal technology use by U.S. military service members and veterans: an update.

    PubMed

    Bush, Nigel E; Wheeler, William M

    2015-04-01

    Although personal electronic devices, such as mobile phones, computers, and tablets, increasingly are being leveraged as vehicles for health in the civilian world, almost nothing is known about personal technology use in the U.S. military. In 2012 we conducted a unique survey of personal technologies used by U.S. military service members. However, with the rapidly growing sophistication of personal technology and changes in consumer habits, that knowledge must be continuously updated to be useful. Accordingly, we recently surveyed new samples of active duty service members, National Guard and Reserve, and veterans. We collected data by online surveys in 2013 from 239 active, inactive, and former service members. Online surveys were completed in person via laptop computers at a large military installation and remotely via Web-based surveys posted on the Army Knowledge Online Web site and on a Defense Center Facebook social media channel. We measured high rates of personal technology use by service members at home across popular electronic media. The most dramatic change since our earlier survey was the tremendous increase in mobile phone use at home for a wide variety of purposes. Participants also reported moderate non-work uses of computers and tablets while on recent deployment to Iraq and Afghanistan, but almost no mobile phone use, ostensibly because of military restrictions in the war zone. These latest results will enable researchers and technology developers to target their efforts on the most promising and popular technologies for psychological health in the military.

  10. On Representative Spaceflight Instrument and Associated Instrument Sensor Web Framework

    NASA Technical Reports Server (NTRS)

    Kizhner, Semion; Patel, Umeshkumar; Vootukuru, Meg

    2007-01-01

    Sensor Web-based adaptation and sharing of spaceflight mission resources, including those of the space-ground and control-user communication segments, could greatly benefit from the use of heritage Internet protocols and devices applied to spaceflight (SpaceIP). This has been successfully demonstrated by a few recent spaceflight experiments. However, while terrestrial applications of Internet protocols are well developed and understood (mostly due to billions of dollars in investments by the military and industry), the spaceflight application of Internet protocols is still in its infancy. Progress in the development of SpaceIP-enabled instrument components will largely determine the utilization of those investments and the acceptance of SpaceIP in years to come. As with SpaceIP, commercial real-time, instrument-colocated computational resources, data compression, and storage can be enabled on board a spacecraft and, in turn, support a powerful application of Sensor Web-based design to a spaceflight instrument. Sensor Web-enabled reconfiguration and adaptation of hardware resources and information systems will put Field Programmable Gate Arrays (FPGAs) and other aerospace programmable logic devices to the use for which this technology was intended. These are a few obvious potential benefits of Sensor Web technologies for spaceflight applications; however, they are still waiting to be explored, because a new approach to spaceflight instrumentation is needed to make these mature Sensor Web technologies applicable to spaceflight. In this paper we present an approach to developing related and enabling spaceflight instrument-level technologies based on the new concept of a representative spaceflight Instrument Sensor Web (ISW).

  11. An overview of computer-based natural language processing

    NASA Technical Reports Server (NTRS)

    Gevarter, W. B.

    1983-01-01

    Computer-based Natural Language Processing (NLP) is the key to enabling humans and their computer-based creations to interact with machines in natural language (like English, Japanese, German, etc., in contrast to formal computer languages). The doors that such an achievement can open have made this a major research area in Artificial Intelligence and Computational Linguistics. Commercial natural language interfaces to computers have recently entered the market, and the future looks bright for other applications as well. This report reviews the basic approaches to such systems, the techniques utilized, applications, the state of the art of the technology, issues and research requirements, the major participants and, finally, future trends and expectations. It is anticipated that this report will prove useful to engineering and research managers, potential users, and others who will be affected by this field as it unfolds.

  12. Development of AN Innovative Three-Dimensional Complete Body Screening Device - 3D-CBS

    NASA Astrophysics Data System (ADS)

    Crosetto, D. B.

    2004-07-01

    This article describes an innovative technological approach that increases the efficiency with which a large number of particles (photons) can be detected and analyzed. The three-dimensional complete body screening (3D-CBS) combines the functional imaging capability of Positron Emission Tomography (PET) with the anatomical imaging capability of Computed Tomography (CT). The novel techniques provide better images in a shorter time with less radiation to the patient. A primary means of accomplishing this is the use of a larger solid angle, but this requires a new electronic technique capable of handling the increased data rate. This technique, combined with an improved and simplified detector assembly, enables the execution of complex real-time algorithms and allows more efficient use of economical crystals. These are the principal features of this invention. A good synergy of advanced techniques in particle detection, together with technological progress in industry (the latest FPGA technology) and simple but cost-effective ideas, yields a revolutionary invention. This technology enables an improvement in PET efficiency of more than 400 times at once, compared with the two- to three-fold improvements achieved every five years over the past decades. Details of the electronics are provided, including an IBM PC board with a parallel-processing architecture implemented in FPGA, enabling the execution of a programmable complex real-time algorithm for best detection of photons.
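
    The central electronic task described above is pairing photon detections into coincidence events at very high data rates. The sketch below is a minimal, purely illustrative software version of that pairing step; the time window, detector identifiers, and hit list are assumptions, not parameters of the 3D-CBS electronics.

```python
# Minimal sketch, not the 3D-CBS electronics: pairing photon hits into
# coincidence events within a fixed time window. Window width, detector IDs,
# and the hit list are illustrative assumptions.
COINCIDENCE_WINDOW_NS = 5.0

def find_coincidences(hits):
    """hits: list of (timestamp_ns, detector_id), assumed sorted by time."""
    pairs = []
    for i, (t1, d1) in enumerate(hits):
        for t2, d2 in hits[i + 1:]:
            if t2 - t1 > COINCIDENCE_WINDOW_NS:
                break                      # later hits are even further away in time
            if d2 != d1:                   # hits on two different detectors -> candidate event
                pairs.append(((t1, d1), (t2, d2)))
    return pairs

hits = [(0.0, 3), (2.1, 17), (40.0, 5), (44.5, 12)]
print(find_coincidences(hits))   # [((0.0, 3), (2.1, 17)), ((40.0, 5), (44.5, 12))]
```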

  13. Augmenting Research, Education, and Outreach with Client-Side Web Programming.

    PubMed

    Abriata, Luciano A; Rodrigues, João P G L M; Salathé, Marcel; Patiny, Luc

    2018-05-01

    The evolution of computing and web technologies over the past decade has enabled the development of fully fledged scientific applications that run directly on web browsers. Powered by JavaScript, the lingua franca of web programming, these 'web apps' are starting to revolutionize and democratize scientific research, education, and outreach. Copyright © 2017 Elsevier Ltd. All rights reserved.

  14. Another Philadelphia Story: Mobilizing Resistance and Widening the Educational Imagination in the Midst of Corporate Assault on the Public Sphere

    ERIC Educational Resources Information Center

    Fine, Michelle

    2013-01-01

    Information about the MIT Media Lab PLATFORM, a Summit of Innovators, is presented. This Summit was comprised of engineers, computer scientists, coders, activists, designers, and technology wizards, a gathering of people of color who have been remarkably successful, provocative, and creative against the odds. The event enabled the author and other…

  15. Influence of an Integrated Learning Diagnosis and Formative Assessment-Based Personalized Web Learning Approach on Students' Learning Performances and Perceptions

    ERIC Educational Resources Information Center

    Wongwatkit, Charoenchai; Srisawasdi, Niwat; Hwang, Gwo-Jen; Panjaburee, Patcharin

    2017-01-01

    The advancement of computer and communication technologies has enabled students to learn across various real-world contexts with supports from the learning system. In the meantime, researchers have emphasized the necessity of providing personalized learning guidance or support by considering individual students' status and needs in order to…

  16. A Federal Vision for Future Computing: A Nanotechnology-Inspired Grand Challenge

    DTIC Science & Technology

    2016-07-29

    Science Foundation (NSF), Department of Defense (DOD), National Institute of Standards and Technology (NIST), Intelligence Community (IC) Introduction...multiple Federal agencies: • Intelligent big data sensors that act autonomously and are programmable via the network for increased flexibility, and... intelligence for scientific discovery enabled by rapid extreme-scale data analysis, capable of understanding and making sense of results and thereby

  17. Students as Virtual Scientists: An Exploration of Students' and Teachers' Perceived Realness of a Remote Electron Microscopy Investigation

    ERIC Educational Resources Information Center

    Childers, Gina; Jones, M. Gail

    2015-01-01

    Remote access technologies enable students to investigate science by utilizing scientific tools and communicating in real-time with scientists and researchers with only a computer and an Internet connection. Very little is known about student perceptions of how real remote investigations are and how immersed the students are in the experience.…

  18. Use of a Technology-Enhanced Version of the Good Behavior Game in an Elementary School Setting

    ERIC Educational Resources Information Center

    Lynne, Shauna; Radley, Keith C.; Dart, Evan H.; Tingstrom, Daniel H.; Barry, Christopher T.; Lum, John D. K.

    2017-01-01

    The purpose of this study was to investigate the effectiveness of a variation of the Good Behavior Game (GBG) in which teachers used ClassDojo to manage each team's progress. ClassDojo is a computer-based program that enables teachers to award students with points for demonstrating target behaviors. Dependent variables included class-wide…

  19. "Now, Year Ones, This Is Your Life!" Preparing the Present Generation of Students for a World of Shrinking Distances.

    ERIC Educational Resources Information Center

    Beare, Hedley

    2001-01-01

    Forecasts for the future are made against the backdrop of population growth, environmental change, information technology, and globalization. Schools and teachers as we know them will change radically, perhaps become obsolete, as computers and the Internet enable access to information from anywhere, any time. Learning will become a life-long,…

  20. Computer Assisted Reading in German as a Foreign Language, Developing and Testing an NLP-Based Application

    ERIC Educational Resources Information Center

    Wood, Peter

    2011-01-01

    "QuickAssist," the program presented in this paper, uses natural language processing (NLP) technologies. It places a range of NLP tools at the disposal of learners, intended to enable them to independently read and comprehend a German text of their choice while they extend their vocabulary, learn about different uses of particular words,…

  1. Speech-Enabled Tools for Augmented Interaction in E-Learning Applications

    ERIC Educational Resources Information Center

    Selouani, Sid-Ahmed A.; Lê, Tang-Hô; Benahmed, Yacine; O'Shaughnessy, Douglas

    2008-01-01

    This article presents systems that use speech technology to emulate the one-on-one interaction a student can get from a virtual instructor. A web-based learning tool, the Learn IN Context (LINC+) system, designed and used in a real mixed-mode learning context for a computer (C++ language) programming course taught at the Université de Moncton…

  2. State University of New York Institute of Technology (SUNYIT) Visiting Scholars Program

    DTIC Science & Technology

    2013-05-01

    team members, and build the necessary backend metal interconnections. APPROVED FOR PUBLIC RELEASE; DISTRIBUTION UNLIMITED 4 Baek-Young Choi...Cooperative and Opportunistic Mobile Cloud for Energy Efficient Positioning; Department of Computer Science Electrical Engineering, University of...Missouri - Kansas City The fast growing popularity of smartphones and tablets enables us the use of various intelligent mobile applications. As many of

  3. Bowling Ball Spotting

    NASA Technical Reports Server (NTRS)

    1985-01-01

    Exactatron, an accurate weighing and spotting system in bowling ball manufacture, was developed by Ebonite International engineers with the assistance of a NASA computer search which identified Jet Propulsion Laboratory (JPL) technology. The JPL research concerned a means of determining the center of an object's mass, and an apparatus for measuring liquid viscosity, enabling Ebonite to identify the exact spotting of the drilling point for top weighting.

  4. New Window into the Human Body

    NASA Technical Reports Server (NTRS)

    1985-01-01

    Michael Vannier, MD, a former NASA engineer, recognized the similarity between NASA's computerized image processing technology and nuclear magnetic resonance. With technical assistance from Kennedy Space Center, he developed a computer program for Mallinckrodt Institute of Radiology enabling Nuclear Magnetic Resonance (NMR) to scan body tissue for earlier diagnoses. Dr. Vannier feels that "satellite imaging" has opened a new window into the human body.

  5. Operational flash flood forecasting platform based on grid technology

    NASA Astrophysics Data System (ADS)

    Thierion, V.; Ayral, P.-A.; Angelini, V.; Sauvagnargues-Lesage, S.; Nativi, S.; Payrastre, O.

    2009-04-01

    Flash flood events in the south of France, such as those of 8-9 September 2002 in the Grand Delta territory, caused substantial economic and human damage. Following this catastrophic hydrological situation, a reform of the flood warning services was initiated (put in place in 2006). This reform transformed the 52 existing flood warning services (SAC) into 22 flood forecasting services (SPC), assigning them more hydrologically consistent territories and a new, effective hydrological forecasting mission. In addition, a national central service (SCHAPI) was created to ease this transformation and to support the local services in their new objectives. New operational requirements were identified: the SPC and SCHAPI are responsible for clearly disseminating crucial hydrological information to public bodies, civil protection actors, and the population so that potentially dramatic flood events can be better anticipated, and an effective hydrological forecasting capability appears essential for these services, particularly for flash flood phenomena. Improving and optimizing the models was therefore one of the most critical requirements. Initially dedicated to supporting forecasters in their monitoring mission through the analysis of gauging stations and rainfall radar images, the hydrological models must become more efficient at anticipating the hydrological situation. Understanding the natural phenomena occurring during flash floods is the main focus of current hydrological research; rather than trying to explain such complex processes, the research presented here addresses the well-known need of these services for computational power and data storage capacity. In recent years, Grid technology has emerged as a technological revolution in high-performance computing (HPC), allowing large-scale resource sharing, use of distributed computational power, and collaboration across networks. The EGEE (Enabling Grids for E-science in Europe) project currently represents the most important effort in Grid technology development. This paper presents an operational flash flood forecasting platform developed in the framework of the CYCLOPS European project, which provides one of the virtual organizations of the EGEE project. The platform is designed to run multi-simulation processes that ease forecasting operations for several supervised watersheds in the Grand Delta (SPC-GD) territory. By providing multiple remote computing elements, the Grid infrastructure enables the processing of multiple rainfall scenarios, derived from the original meteorological forecast transmitted by Meteo-France, together with their respective hydrological simulations; a minimal sketch of this scenario fan-out is given below. First results show that, starting from one forecast scenario, the approach permits simulation of more than 200 different scenarios to support forecasters in their mission, and it appears to be an efficient hydrological decision-making tool. Although the system appears operational, model validity still has to be confirmed, so further research is necessary to improve the model core with respect to hydrological aspects. Finally, the platform could become an efficient tool for developing other modelling aspects such as calibration or data assimilation in real-time processing.
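
    A minimal sketch of the scenario fan-out mentioned above, assuming a hypothetical run_hydrological_model function in place of the real watershed model and local worker processes in place of EGEE computing elements:

```python
# Minimal sketch (not the EGEE/CYCLOPS middleware): fan one meteorological
# forecast out into many perturbed rainfall scenarios and run the hydrological
# simulations in parallel. `run_hydrological_model` is purely illustrative.
from concurrent.futures import ProcessPoolExecutor
import random

def perturb_rainfall(forecast_mm, factor):
    return [h * factor for h in forecast_mm]

def run_hydrological_model(scenario):
    # placeholder: peak discharge grows with peak rainfall (illustrative only)
    return max(scenario) * 3.5

def forecast_ensemble(forecast_mm, n_scenarios=200, workers=8):
    factors = [random.uniform(0.5, 2.0) for _ in range(n_scenarios)]
    scenarios = [perturb_rainfall(forecast_mm, f) for f in factors]
    with ProcessPoolExecutor(max_workers=workers) as pool:
        peaks = list(pool.map(run_hydrological_model, scenarios))
    return peaks

if __name__ == "__main__":
    forecast = [2.0, 15.0, 40.0, 22.0, 5.0]   # hourly rainfall (mm), assumed
    peaks = forecast_ensemble(forecast)
    print(f"simulated {len(peaks)} scenarios, max simulated peak ~ {max(peaks):.1f}")
```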

  6. Interoperability of Neuroscience Modeling Software

    PubMed Central

    Cannon, Robert C.; Gewaltig, Marc-Oliver; Gleeson, Padraig; Bhalla, Upinder S.; Cornelis, Hugo; Hines, Michael L.; Howell, Fredrick W.; Muller, Eilif; Stiles, Joel R.; Wils, Stefan; De Schutter, Erik

    2009-01-01

    Neuroscience increasingly uses computational models to assist in the exploration and interpretation of complex phenomena. As a result, considerable effort is invested in the development of software tools and technologies for numerical simulations and for the creation and publication of models. The diversity of related tools leads to the duplication of effort and hinders model reuse. Development practices and technologies that support interoperability between software systems therefore play an important role in making the modeling process more efficient and in ensuring that published models can be reliably and easily reused. Various forms of interoperability are possible including the development of portable model description standards, the adoption of common simulation languages or the use of standardized middleware. Each of these approaches finds applications within the broad range of current modeling activity. However more effort is required in many areas to enable new scientific questions to be addressed. Here we present the conclusions of the “Neuro-IT Interoperability of Simulators” workshop, held at the 11th computational neuroscience meeting in Edinburgh (July 19-20 2006; http://www.cnsorg.org). We assess the current state of interoperability of neural simulation software and explore the future directions that will enable the field to advance. PMID:17873374
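
    As an illustration of the portable-model-description approach mentioned above, the sketch below encodes a toy neuron model as plain data that different simulators could each import; the schema and parameter names are invented, not taken from any existing standard.

```python
# Minimal sketch, assuming a toy declarative format: a model described as plain
# data rather than simulator-specific code, so that each simulator can build
# its own native objects from the same description.
import json

model_description = {
    "cells": [
        {"id": "pyramidal", "type": "izhikevich",
         "params": {"a": 0.02, "b": 0.2, "c": -65.0, "d": 8.0}},
    ],
    "populations": [{"name": "cortex", "cell": "pyramidal", "size": 100}],
    "projections": [{"source": "cortex", "target": "cortex", "weight": 0.5, "delay_ms": 1.5}],
}

def load_population(description, name):
    """A simulator-specific importer would map this data onto its own objects."""
    pop = next(p for p in description["populations"] if p["name"] == name)
    cell = next(c for c in description["cells"] if c["id"] == pop["cell"])
    return pop["size"], cell["params"]

print(json.dumps(model_description, indent=2))
print(load_population(model_description, "cortex"))
```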

  7. Numerical Technology for Large-Scale Computational Electromagnetics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sharpe, R; Champagne, N; White, D

    The key bottleneck of implicit computational electromagnetics tools for large complex geometries is the solution of the resulting linear system of equations. The goal of this effort was to research and develop critical numerical technology that alleviates this bottleneck for large-scale computational electromagnetics (CEM). The mathematical operators and numerical formulations used in this arena of CEM yield linear equations that are complex valued, unstructured, and indefinite. Also, simultaneously applying multiple mathematical modeling formulations to different portions of a complex problem (hybrid formulations) results in a mixed structure linear system, further increasing the computational difficulty. Typically, these hybrid linear systems are solved using a direct solution method, which was acceptable for Cray-class machines but does not scale adequately for ASCI-class machines. Additionally, LLNL's previously existing linear solvers were not well suited for the linear systems that are created by hybrid implicit CEM codes. Hence, a new approach was required to make effective use of ASCI-class computing platforms and to enable the next generation design capabilities. Multiple approaches were investigated, including the latest sparse-direct methods developed by our ASCI collaborators. In addition, approaches that combine domain decomposition (or matrix partitioning) with general-purpose iterative methods and special purpose pre-conditioners were investigated. Special-purpose pre-conditioners that take advantage of the structure of the matrix were adapted and developed based on intimate knowledge of the matrix properties. Finally, new operator formulations were developed that radically improve the conditioning of the resulting linear systems thus greatly reducing solution time. The goal was to enable the solution of CEM problems that are 10 to 100 times larger than our previous capability.
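
    A minimal sketch of the solver combination discussed above, assuming a small random complex-valued sparse matrix in place of a real CEM operator: a general-purpose iterative method (GMRES) preconditioned with an incomplete LU factorization via SciPy.

```python
# Minimal sketch (not the LLNL solvers): a complex-valued, unstructured sparse
# system solved with GMRES plus an incomplete-LU preconditioner. The matrix is
# a small random stand-in, shifted along the diagonal to keep it well posed.
import numpy as np
import scipy.sparse as sp
import scipy.sparse.linalg as spla

rng = np.random.default_rng(0)
n = 500
A = sp.random(n, n, density=0.01, random_state=0) + 1j * sp.random(n, n, density=0.01, random_state=1)
A = A + (5 + 1j) * sp.identity(n)                       # diagonal shift (illustrative)
b = rng.standard_normal(n) + 1j * rng.standard_normal(n)

ilu = spla.spilu(A.tocsc(), drop_tol=1e-4)              # incomplete LU preconditioner
M = spla.LinearOperator((n, n), matvec=ilu.solve, dtype=complex)

x, info = spla.gmres(A, b, M=M)
rel_res = np.linalg.norm(A @ x - b) / np.linalg.norm(b)
print("converged" if info == 0 else f"GMRES info={info}", "relative residual:", rel_res)
```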

  8. Enabling technologies for fiber optic sensing

    NASA Astrophysics Data System (ADS)

    Ibrahim, Selwan K.; Farnan, Martin; Karabacak, Devrez M.; Singer, Johannes M.

    2016-04-01

    In order for fiber optic sensors to compete with electrical sensors, several critical parameters need to be addressed, such as performance, cost, size, and reliability. Relying on technologies developed in different industrial sectors helps to achieve this goal in a more efficient and cost-effective way. FAZ Technology has developed a tunable-laser-based optical interrogator built on technologies from the telecommunication sector and optical transducers/sensors based on components sourced from the automotive market. By combining Fiber Bragg Grating (FBG) sensing technology with the above, high-speed, high-precision, reliable quasi-distributed optical sensing systems for temperature, pressure, acoustics, acceleration, etc. have been developed. Careful design is needed to filter out sources of measurement drift and error due to different effects, e.g. polarization and birefringence, coating imperfections, sensor packaging, etc. Also, to achieve high-speed and high-performance optical sensing systems, combining and synchronizing multiple optical interrogators, similar to what has been done with computer processors to deliver supercomputing power, is an attractive solution. This path can be achieved by using photonic integrated circuit (PIC) technology, which opens the door to scaling up and delivering powerful optical sensing systems in an efficient and cost-effective way.
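
    As a simple illustration of FBG interrogation, the sketch below converts a measured Bragg-wavelength shift into a temperature change; the reference wavelength and the ~10 pm/°C sensitivity are typical textbook values for a bare grating near 1550 nm, not FAZ Technology specifications, and packaged transducers are calibrated individually.

```python
# Minimal sketch, assuming a bare FBG used as a temperature sensor: convert a
# measured Bragg-wavelength shift into a temperature change. The constants are
# typical textbook values, not device specifications.
REFERENCE_WAVELENGTH_NM = 1550.000     # Bragg wavelength at the reference temperature
TEMP_SENSITIVITY_PM_PER_C = 10.0       # assumed sensitivity near 1550 nm

def temperature_change(measured_wavelength_nm):
    shift_pm = (measured_wavelength_nm - REFERENCE_WAVELENGTH_NM) * 1000.0
    return shift_pm / TEMP_SENSITIVITY_PM_PER_C

print(temperature_change(1550.253))    # +253 pm shift -> about +25.3 degC
```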

  9. An Overview Of NASA's Solar Sail Propulsion Project

    NASA Technical Reports Server (NTRS)

    Garbe, Gregory; Montgomery, Edward E., IV

    2003-01-01

    Research conducted by the In-Space Propulsion (ISP) Technologies Projects is at the forefront of NASA's efforts to mature propulsion technologies that will enable or enhance a variety of space science missions. The ISP Program is developing technologies from a Technology Readiness Level (TRL) of 3 through TRL 6. Activities under the different technology areas are selected through the NASA Research Announcement (NRA) process. The ISP Program goal is to mature a suite of reliable advanced propulsion technologies that will promote more cost-efficient missions through the reduction of interplanetary mission trip time, increased scientific payload mass fraction, and longer on-station operations. These propulsion technologies will also enable missions with previously inaccessible orbits (e.g., non-Keplerian, high solar latitudes). The ISP Program technology suite has been prioritized by an agency-wide study. Solar sail propulsion is one of ISP's three high-priority technology areas. Solar sail propulsion systems will be required to meet the challenge of monitoring and predicting space weather set by the Office of Space Science's (OSS) Living with a Star (LWS) program. Near-to-mid-term mission needs include monitoring of solar activity and observations at high solar latitudes. Near-term work funded by the ISP solar sail propulsion project is centered around the quantitative demonstration of the scalability of present solar sail subsystem designs and concepts to future mission requirements through ground testing, computer modeling and analytical simulations. This talk will review the solar sail technology roadmap, current funded technology development work, future funding opportunities, and mission applications.
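
    One quantity commonly used when judging how sail designs scale to mission requirements is the characteristic acceleration at 1 AU; the sketch below computes it for an ideal flat sail with assumed area, mass, and reflectivity, and is not drawn from the ISP project's own models.

```python
# Minimal sketch, not from the ISP project documents: characteristic
# acceleration of an ideal flat solar sail at 1 AU. Sail area, total mass, and
# reflectivity are illustrative assumptions.
SOLAR_FLUX_1AU_W_M2 = 1361.0           # approximate solar constant
SPEED_OF_LIGHT_M_S = 299_792_458.0

def characteristic_acceleration(area_m2, total_mass_kg, reflectivity=0.9):
    # radiation pressure on a reflective surface: (1 + r) * S / c
    pressure = (1.0 + reflectivity) * SOLAR_FLUX_1AU_W_M2 / SPEED_OF_LIGHT_M_S
    return pressure * area_m2 / total_mass_kg          # m/s^2

print(characteristic_acceleration(area_m2=40 * 40, total_mass_kg=150))  # ~9e-5 m/s^2
```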

  10. The contributions of digital technologies in the teaching of nursing skills: an integrative review.

    PubMed

    Silveira, Maurício de Souza; Cogo, Ana Luísa Petersen

    2017-07-13

    To analyze the contributions of digital educational technologies used in teaching nursing skills. Integrative literature review searching five databases, from 2006 to 2015, combining the descriptors 'education, nursing', 'educational technology', and 'computer-assisted instruction' or related terms in English. A sample of 30 articles was grouped into the thematic categories 'technology in simulation with manikins', 'incentive to learning' and 'teaching of nursing skills'. Different formats of digital educational technology used in teaching nursing skills were identified, such as videos, learning management systems, applications, hypertext, games, and virtual reality simulators. These digital materials contributed to the acquisition of the theoretical knowledge that underpins practice, enhanced teaching, and enabled the use of active learning methods, breaking with the traditional approach of demonstrating and repeating procedures.

  11. The wireless Web and patient care.

    PubMed

    Bergeron, B P

    2001-01-01

    Wireless computing, when integrated with the Web, is poised to revolutionize the practice and teaching of medicine. As vendors introduce wireless Web technologies in the medical community that have been used successfully in the business and consumer markets, clinicians can expect profound increases in the amount of patient data, as well as the ease with which those data are acquired, analyzed, and disseminated. The enabling technologies involved in this transformation to the wireless Web range from the new generation of wireless PDAs, eBooks, and wireless data acquisition peripherals to new wireless network protocols. The rate-limiting step in the application of this technology in medicine is not technology per se but rather how quickly clinicians and their patients come to accept and appreciate the benefits and limitations of the application of wireless Web technology.

  12. Building new computational models to support health behavior change and maintenance: new opportunities in behavioral research.

    PubMed

    Spruijt-Metz, Donna; Hekler, Eric; Saranummi, Niilo; Intille, Stephen; Korhonen, Ilkka; Nilsen, Wendy; Rivera, Daniel E; Spring, Bonnie; Michie, Susan; Asch, David A; Sanna, Alberto; Salcedo, Vicente Traver; Kukakfa, Rita; Pavel, Misha

    2015-09-01

    Adverse and suboptimal health behaviors and habits are responsible for approximately 40 % of preventable deaths, in addition to their unfavorable effects on quality of life and economics. Our current understanding of human behavior is largely based on static "snapshots" of human behavior, rather than ongoing, dynamic feedback loops of behavior in response to ever-changing biological, social, personal, and environmental states. This paper first discusses how new technologies (i.e., mobile sensors, smartphones, ubiquitous computing, and cloud-enabled processing/computing) and emerging systems modeling techniques enable the development of new, dynamic, and empirical models of human behavior that could facilitate just-in-time adaptive, scalable interventions. The paper then describes concrete steps to the creation of robust dynamic mathematical models of behavior including: (1) establishing "gold standard" measures, (2) the creation of a behavioral ontology for shared language and understanding tools that both enable dynamic theorizing across disciplines, (3) the development of data sharing resources, and (4) facilitating improved sharing of mathematical models and tools to support rapid aggregation of the models. We conclude with the discussion of what might be incorporated into a "knowledge commons," which could help to bring together these disparate activities into a unified system and structure for organizing knowledge about behavior.
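
    As a toy example of the dynamic, feedback-oriented models the paper calls for, the sketch below simulates a first-order "fluid analogy" model in which behavior responds to an intervention dose with a gain and a time constant; the parameter values are illustrative, not estimates from data.

```python
# Minimal sketch, assuming a toy first-order dynamical model of behavior:
# behavior y responds to an intervention dose u with gain K and time constant
# tau. All parameter values are illustrative.
def simulate_behavior(doses, gain=0.8, tau=5.0, y0=0.0):
    y = y0
    trajectory = []
    for u in doses:
        y = y + (gain * u - y) / tau      # discrete-time first-order update
        trajectory.append(y)
    return trajectory

# ten days of a constant intervention, then ten days without it
doses = [10.0] * 10 + [0.0] * 10
for day, level in enumerate(simulate_behavior(doses), start=1):
    print(f"day {day:2d}: behavior level {level:5.2f}")
```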

  13. Photonic Design: From Fundamental Solar Cell Physics to Computational Inverse Design

    NASA Astrophysics Data System (ADS)

    Miller, Owen Dennis

    Photonic innovation is becoming ever more important in the modern world. Optical systems are dominating shorter and shorter communications distances, LED's are rapidly emerging for a variety of applications, and solar cells show potential to be a mainstream technology in the energy space. The need for novel, energy-efficient photonic and optoelectronic devices will only increase. This work unites fundamental physics and a novel computational inverse design approach towards such innovation. The first half of the dissertation is devoted to the physics of high-efficiency solar cells. As solar cells approach fundamental efficiency limits, their internal physics transforms. Photonic considerations, instead of electronic ones, are the key to reaching the highest voltages and efficiencies. Proper photon management led to Alta Device's recent dramatic increase of the solar cell efficiency record to 28.3%. Moreover, approaching the Shockley-Queisser limit for any solar cell technology will require light extraction to become a part of all future designs. The second half of the dissertation introduces inverse design as a new computational paradigm in photonics. An assortment of techniques (FDTD, FEM, etc.) have enabled quick and accurate simulation of the "forward problem" of finding fields for a given geometry. However, scientists and engineers are typically more interested in the inverse problem: for a desired functionality, what geometry is needed? Answering this question breaks from the emphasis on the forward problem and forges a new path in computational photonics. The framework of shape calculus enables one to quickly find superior, non-intuitive designs. Novel designs for optical cloaking and sub-wavelength solar cell applications are presented.
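
    The forward/inverse distinction can be illustrated with a toy loop: a stand-in "forward solver" maps a design parameter to a figure of merit, and a numerical-gradient search looks for the design that maximizes it. This is only a sketch of the idea, not the shape-calculus method developed in the dissertation.

```python
# Minimal sketch of the forward/inverse distinction. The "forward model" is a
# deliberately simple, invented stand-in for a full electromagnetic simulation.
import math

def forward_model(thickness_nm):
    # stand-in for an FDTD/FEM solve: figure of merit peaks at 120 nm
    return math.exp(-((thickness_nm - 120.0) / 40.0) ** 2)

def inverse_design(initial=60.0, learning_rate=100.0, iterations=300, h=1e-3):
    x = initial
    for _ in range(iterations):
        grad = (forward_model(x + h) - forward_model(x - h)) / (2 * h)  # numerical gradient
        x += learning_rate * grad                                       # ascend the figure of merit
    return x

best = inverse_design()
print(f"best thickness ~ {best:.1f} nm, figure of merit {forward_model(best):.3f}")
```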

  14. Application of Computer-Assisted Design and Manufacturing-Fabricated Artificial Bone in the Reconstruction of Craniofacial Bone Defects.

    PubMed

    Liang, Weiqiang; Yao, Yuanyuan; Huang, Zixian; Chen, Yuhong; Ji, Chenyang; Zhang, Jinming

    2016-07-01

    The purpose of this study was to evaluate the clinical application of individual craniofacial bone fabrications using computer-assisted design (CAD)-computer-assisted manufacturing technology for the reconstruction of craniofacial bone defects. A total of 8 patients diagnosed with craniofacial bone defects were enrolled in this study between May 2007 and August 2010. After computed tomography scans were obtained, the patients were fitted with artificial bone that was created using CAD software, rapid prototyping technology, and epoxy-methyl acrylate resin and hydroxyapatite materials. The fabrication was fixed to the defect area with titanium screws, and soft tissue defects were repaired if necessary. The fabrications were precisely fixed to the defect areas, and all wounds healed well without any serious complications except for 1 case with intraoral incision dehiscence, which required further treatment. Postoperative curative effects were retrospectively observed after 6 to 48 months, acceptable anatomic and cosmetic outcomes were obtained, and no rejections or other complications occurred. The use of CAD-computer-assisted manufacturing technology-assisted epoxy-methyl acrylate resin and hydroxyapatite composite artificial bone to treat patients with craniofacial bone defects could enable the precise reconstruction of these defects and obtain good anatomic and cosmetic outcomes. Copyright © 2016 American Association of Oral and Maxillofacial Surgeons. Published by Elsevier Inc. All rights reserved.

  15. Barriers to use of information and computer technology by Australia's nurses: a national survey.

    PubMed

    Eley, Robert; Fallon, Tony; Soar, Jeffrey; Buikstra, Elizabeth; Hegney, Desley

    2009-04-01

    To support policy planning for health, the barriers to the use of health information and computer technology (ICT) by nurses in Australia were determined. Australia, in line with many countries, aims to achieve a better quality of care and health outcomes through effective and innovative use of health information. Nurses form the largest component of the health workforce. Successful adoption of ICT by nurses will be a requirement for success. No national study has been undertaken to determine the barriers to adoption. A self-administered postal survey was conducted. A questionnaire was distributed to 10,000 members of the Australian Nursing Federation. Twenty possible barriers to the use of health ICT uptake were offered and responses were given on a five point Likert scale. Work demands, access to computers and lack of support were the principal barriers faced by nurses to their adoption of the technology in the workplace. Factors that were considered to present few barriers included age and lack of interest. While age was not considered by the respondents to be a barrier, their age was positively correlated with several barriers, including knowledge and confidence in the use of computers. Results indicate that to use the information and computer technologies being brought into health care fully, barriers that prevent the principal users from embracing those technologies must be addressed. Factors such as the age of the nurse and their level of job must be considered when developing strategies to overcome barriers. The findings of the present study provide essential information not only for national government and state health departments but also for local administrators and managers to enable clinical nurses to meet present and future job requirements.

  16. No Photon Left Behind: Advanced Optics at ARPA-E for Buildings and Solar Energy

    NASA Astrophysics Data System (ADS)

    Branz, Howard M.

    2015-04-01

    Key technology challenges in building efficiency and solar energy utilization require transformational optics, plasmonics and photonics technologies. We describe advanced optical technologies funded by the Advanced Research Projects Agency - Energy. Buildings technologies include a passive daytime photonic cooler, infra-red computer vision mapping for energy audit, and dual-band electrochromic windows based on plasmonic absorption. Solar technologies include novel hybrid energy converters that combine high-efficiency photovoltaics with concentrating solar thermal collection and storage. Because the marginal cost of thermal energy storage is low, these systems enable generation of inexpensive and dispatchable solar energy that can be deployed when the sun doesn't shine. The solar technologies under development include nanoparticle plasmonic spectrum splitting, Rugate filter interference structures and photovoltaic cells that can operate efficiently at over 400° C.

  17. Advanced Computational Methods in Bio-Mechanics.

    PubMed

    Al Qahtani, Waleed M S; El-Anwar, Mohamed I

    2018-04-15

    A novel partnership between surgeons and machines, made possible by advances in computing and engineering technology, could overcome many of the limitations of traditional surgery. By extending surgeons' ability to plan and carry out surgical interventions more accurately and with fewer traumas, computer-integrated surgery (CIS) systems could help to improve clinical outcomes and the efficiency of healthcare delivery. CIS systems could have a similar impact on surgery to that long since realised in computer-integrated manufacturing. Mathematical modelling and computer simulation have proved tremendously successful in engineering. Computational mechanics has enabled technological developments in virtually every area of our lives. One of the greatest challenges for mechanists is to extend the success of computational mechanics to fields outside traditional engineering, in particular to biology, the biomedical sciences, and medicine. Biomechanics has significant potential for applications in the orthopaedic industry and the performing arts, since the skills needed for these activities are visibly related to the human musculoskeletal and nervous systems. Although biomechanics is widely used nowadays in the orthopaedic industry to design orthopaedic implants for human joints, dental parts, external fixations and other medical purposes, numerous research efforts funded by billions of dollars are under way to build a new future for sports and human healthcare in what is called the biomechanics era.

  18. Mobile, Cloud, and Big Data Computing: Contributions, Challenges, and New Directions in Telecardiology

    PubMed Central

    Hsieh, Jui-Chien; Li, Ai-Hsien; Yang, Chung-Chi

    2013-01-01

    Many studies have indicated that computing technology can enable off-site cardiologists to read patients’ electrocardiograph (ECG), echocardiography (ECHO), and relevant images via smart phones during pre-hospital, in-hospital, and post-hospital teleconsultation, which not only identifies emergency cases in need of immediate treatment, but also prevents unnecessary re-hospitalizations. Meanwhile, several studies have combined cloud computing and mobile computing to facilitate better storage, delivery, retrieval, and management of medical files for telecardiology. In the future, the aggregated ECG and images from hospitals worldwide will become big data, which should be used to develop an e-consultation program helping on-site practitioners deliver appropriate treatment. With information technology, real-time tele-consultation and tele-diagnosis of ECG and images can be practiced via an e-platform for clinical, research, and educational purposes. In promoting the application of information technology to telecardiology, we need to resolve several issues: (1) data confidentiality in the cloud, (2) data interoperability among hospitals, and (3) network latency and accessibility. If these challenges are overcome, tele-consultation will be ubiquitous, easy to perform, inexpensive, and beneficial. Most importantly, these services will increase global collaboration and advance clinical practice, education, and scientific research in cardiology. PMID:24232290

  19. Virtual worlds to support patient group communication? A questionnaire study investigating potential for virtual world focus group use by respiratory patients.

    PubMed

    Taylor, Michael J; Taylor, Dave; Vlaev, Ivo; Elkin, Sarah

    2017-01-01

    Recent advances in communication technologies enable potential provision of remote education for patients using computer-generated environments known as virtual worlds. Previous research has revealed highly variable levels of patient receptiveness to using information technologies for healthcare-related purposes. This preliminary study involved implementing a questionnaire investigating attitudes and access to computer technologies of respiratory outpatients, in order to assess potential for use of virtual worlds to facilitate health-related education for this sample. Ninety-four patients with a chronic respiratory condition completed surveys, which were distributed at a Chest Clinic. In accordance with our prediction, younger participants were more likely to be able to use, and to have access to, a computer, and some patients were keen to explore using virtual worlds for healthcare-related purposes: of those with access to computer facilities, 14.50% expressed a willingness to attend a virtual world focus group. Results indicate future virtual world health education facilities should be designed to cater for younger patients, because this group are most likely to accept and use such facilities. Within the study sample, this is likely to comprise of people diagnosed with asthma. Future work could investigate the potential of creating a virtual world asthma education facility.

  20. Virtual worlds to support patient group communication? A questionnaire study investigating potential for virtual world focus group use by respiratory patients

    PubMed Central

    Taylor, Michael J.; Taylor, Dave; Vlaev, Ivo; Elkin, Sarah

    2015-01-01

    Recent advances in communication technologies enable potential provision of remote education for patients using computer-generated environments known as virtual worlds. Previous research has revealed highly variable levels of patient receptiveness to using information technologies for healthcare-related purposes. This preliminary study involved implementing a questionnaire investigating attitudes and access to computer technologies of respiratory outpatients, in order to assess potential for use of virtual worlds to facilitate health-related education for this sample. Ninety-four patients with a chronic respiratory condition completed surveys, which were distributed at a Chest Clinic. In accordance with our prediction, younger participants were more likely to be able to use, and to have access to, a computer, and some patients were keen to explore using virtual worlds for healthcare-related purposes: of those with access to computer facilities, 14.50% expressed a willingness to attend a virtual world focus group. Results indicate future virtual world health education facilities should be designed to cater for younger patients, because this group are most likely to accept and use such facilities. Within the study sample, this is likely to comprise of people diagnosed with asthma. Future work could investigate the potential of creating a virtual world asthma education facility. PMID:28239187

  1. Mobile, cloud, and big data computing: contributions, challenges, and new directions in telecardiology.

    PubMed

    Hsieh, Jui-Chien; Li, Ai-Hsien; Yang, Chung-Chi

    2013-11-13

    Many studies have indicated that computing technology can enable off-site cardiologists to read patients' electrocardiograph (ECG), echocardiography (ECHO), and relevant images via smart phones during pre-hospital, in-hospital, and post-hospital teleconsultation, which not only identifies emergency cases in need of immediate treatment, but also prevents unnecessary re-hospitalizations. Meanwhile, several studies have combined cloud computing and mobile computing to facilitate better storage, delivery, retrieval, and management of medical files for telecardiology. In the future, the aggregated ECG and images from hospitals worldwide will become big data, which should be used to develop an e-consultation program helping on-site practitioners deliver appropriate treatment. With information technology, real-time tele-consultation and tele-diagnosis of ECG and images can be practiced via an e-platform for clinical, research, and educational purposes. In promoting the application of information technology to telecardiology, we need to resolve several issues: (1) data confidentiality in the cloud, (2) data interoperability among hospitals, and (3) network latency and accessibility. If these challenges are overcome, tele-consultation will be ubiquitous, easy to perform, inexpensive, and beneficial. Most importantly, these services will increase global collaboration and advance clinical practice, education, and scientific research in cardiology.

  2. Wearable Intrinsically Soft, Stretchable, Flexible Devices for Memories and Computing.

    PubMed

    Rajan, Krishna; Garofalo, Erik; Chiolerio, Alessandro

    2018-01-27

    A recent trend in the development of high mass consumption electron devices is towards electronic textiles (e-textiles), smart wearable devices, smart clothes, and flexible or printable electronics. Intrinsically soft, stretchable, flexible, Wearable Memories and Computing devices (WMCs) bring us closer to sci-fi scenarios, where future electronic systems are totally integrated in our everyday outfits and help us in achieving a higher comfort level, interacting for us with other digital devices such as smartphones and domotics, or with analog devices, such as our brain/peripheral nervous system. WMC will enable each of us to contribute to open and big data systems as individual nodes, providing real-time information about physical and environmental parameters (including air pollution monitoring, sound and light pollution, chemical or radioactive fallout alert, network availability, and so on). Furthermore, WMC could be directly connected to the human brain, enabling extremely fast operation and unprecedented interface complexity, directly mapping the continuous states available to biological systems. This review focuses on recent advances in nanotechnology and materials science and pays particular attention to results and promising technologies that enable intrinsically soft, stretchable, flexible WMC.

  3. Atomic switch: atom/ion movement controlled devices for beyond von-neumann computers.

    PubMed

    Hasegawa, Tsuyoshi; Terabe, Kazuya; Tsuruoka, Tohru; Aono, Masakazu

    2012-01-10

    An atomic switch is a nanoionic device that controls the diffusion of metal ions/atoms and their reduction/oxidation processes in the switching operation to form/annihilate a conductive path. Since metal atoms can provide a highly conductive channel even if their cluster size is in the nanometer scale, atomic switches may enable downscaling to smaller than the 11 nm technology node, which is a great challenge for semiconductor devices. Atomic switches also possess novel characteristics, such as high on/off ratios, very low power consumption and non-volatility. The unique operating mechanisms of these devices have enabled the development of various types of atomic switch, such as gap-type and gapless-type two-terminal atomic switches and three-terminal atomic switches. Novel functions, such as selective volatile/nonvolatile, synaptic, memristive, and photo-assisted operations have been demonstrated. Such atomic switch characteristics can not only improve the performance of present-day electronic systems, but also enable development of new types of electronic systems, such as beyond von Neumann computers. Copyright © 2012 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  4. Wearable Intrinsically Soft, Stretchable, Flexible Devices for Memories and Computing

    PubMed Central

    Rajan, Krishna; Garofalo, Erik

    2018-01-01

    A recent trend in the development of high mass consumption electron devices is towards electronic textiles (e-textiles), smart wearable devices, smart clothes, and flexible or printable electronics. Intrinsically soft, stretchable, flexible, Wearable Memories and Computing devices (WMCs) bring us closer to sci-fi scenarios, where future electronic systems are totally integrated in our everyday outfits and help us in achieving a higher comfort level, interacting for us with other digital devices such as smartphones and domotics, or with analog devices, such as our brain/peripheral nervous system. WMC will enable each of us to contribute to open and big data systems as individual nodes, providing real-time information about physical and environmental parameters (including air pollution monitoring, sound and light pollution, chemical or radioactive fallout alert, network availability, and so on). Furthermore, WMC could be directly connected to the human brain, enabling extremely fast operation and unprecedented interface complexity, directly mapping the continuous states available to biological systems. This review focuses on recent advances in nanotechnology and materials science and pays particular attention to results and promising technologies that enable intrinsically soft, stretchable, flexible WMC. PMID:29382050

  5. Robust Online Monitoring for Calibration Assessment of Transmitters and Instrumentation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ramuhalli, Pradeep; Coble, Jamie B.; Shumaker, Brent

    Robust online monitoring (OLM) technologies are expected to enable the extension or elimination of periodic sensor calibration intervals in operating and new reactors. These advances in OLM technologies will improve the safety and reliability of current and planned nuclear power systems through improved accuracy and increased reliability of sensors used to monitor key parameters. In this article, we present an overview of research being performed within the Nuclear Energy Enabling Technologies (NEET)/Advanced Sensors and Instrumentation (ASI) program to develop OLM algorithms that use sensor outputs and, in combination with other available information, 1) determine whether one or more sensors are out of calibration or failing and 2) replace a failing sensor with reliable, accurate sensor outputs. Algorithm development is focused on the following OLM functions: signal validation, virtual sensing, and sensor response-time assessment. These algorithms incorporate, at their base, a Gaussian Process-based uncertainty quantification (UQ) method. Various plant models (using kernel regression, GP, or hierarchical models) may be used to predict sensor responses under various plant conditions. These predicted responses can then be applied in fault detection (sensor output and response time) and in computing the correct value (virtual sensing) of a failing physical sensor. The methods being evaluated in this work can compute confidence levels along with the predicted sensor responses, and as a result, may have the potential for compensating for sensor drift in real time (online recalibration). Evaluation was conducted using data from multiple sources (laboratory flow loops and plant data). Ongoing research in this project is focused on further evaluation of the algorithms, optimization for accuracy and computational efficiency, and integration into a suite of tools for robust OLM that are applicable to monitoring sensor calibration state in nuclear power plants.
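
    A minimal sketch of the signal-validation and virtual-sensing idea, assuming synthetic data rather than plant measurements: a Gaussian Process regression model predicts one sensor's reading from two correlated sensors and flags samples whose residual exceeds the model's own uncertainty band.

```python
# Minimal sketch, assuming synthetic data: GP regression predicts one sensor
# from two correlated sensors; large residuals relative to the predictive
# uncertainty are flagged as possible drift or faults.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(1)
X_train = rng.uniform(0, 10, size=(200, 2))                  # two reference sensors
y_train = 2.0 * X_train[:, 0] + np.sin(X_train[:, 1]) + rng.normal(0, 0.1, 200)

gp = GaussianProcessRegressor(kernel=RBF() + WhiteKernel(), normalize_y=True)
gp.fit(X_train, y_train)

X_new = np.array([[4.0, 1.0], [7.0, 3.0]])
measured = np.array([8.9, 16.5])                             # second value drifts high
predicted, std = gp.predict(X_new, return_std=True)
for meas, pred, s in zip(measured, predicted, std):
    status = "OK" if abs(meas - pred) < 3 * s else "possible drift/fault"
    print(f"measured {meas:5.2f}  predicted {pred:5.2f} +/- {s:4.2f}  -> {status}")
```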

  6. Computing facilities available to final-year students at 3 UK dental schools in 1997/8: their use, and students' attitudes to information technology.

    PubMed

    Grigg, P; Macfarlane, T V; Shearer, A C; Jepson, N J; Stephens, C D

    2001-08-01

    To identify computer facilities available in 3 dental schools where 3 different approaches to the use of technology-based learning material have been adopted and assess dental students' perception of their own computer skills and their attitudes towards information technology. Multicentre cross sectional by questionnaire. All 181 dental students in their final year of study (1997-8). The overall participation rate was 80%. There were no differences between schools in the students' self assessment of their IT skills but only 1/3 regarded themselves as competent in basic skills and nearly 50% of students in all 3 schools felt that insufficient IT training had been provided to enable them to follow their course without difficulty. There were significant differences between schools in most of the other areas examined which reflect the different ways in which IT can be used to support the dental course. 1. Students value IT as an educational tool. 2. Their awareness of the relevance of a knowledge of information technology for their future careers remains generally low. 3. There is a need to provide effective instruction in IT skills for those dental students who do not acquire these during secondary education.

  7. The emergence of spatial cyberinfrastructure.

    PubMed

    Wright, Dawn J; Wang, Shaowen

    2011-04-05

    Cyberinfrastructure integrates advanced computer, information, and communication technologies to empower computation-based and data-driven scientific practice and improve the synthesis and analysis of scientific data in a collaborative and shared fashion. As such, it now represents a paradigm shift in scientific research that has facilitated easy access to computational utilities and streamlined collaboration across distance and disciplines, thereby enabling scientific breakthroughs to be reached more quickly and efficiently. Spatial cyberinfrastructure seeks to resolve longstanding complex problems of handling and analyzing massive and heterogeneous spatial datasets as well as the necessity and benefits of sharing spatial data flexibly and securely. This article provides an overview and potential future directions of spatial cyberinfrastructure. The remaining four articles of the special feature are introduced and situated in the context of providing empirical examples of how spatial cyberinfrastructure is extending and enhancing scientific practice for improved synthesis and analysis of both physical and social science data. The primary focus of the articles is spatial analyses using distributed and high-performance computing, sensor networks, and other advanced information technology capabilities to transform massive spatial datasets into insights and knowledge.

  8. The emergence of spatial cyberinfrastructure

    PubMed Central

    Wright, Dawn J.; Wang, Shaowen

    2011-01-01

    Cyberinfrastructure integrates advanced computer, information, and communication technologies to empower computation-based and data-driven scientific practice and improve the synthesis and analysis of scientific data in a collaborative and shared fashion. As such, it now represents a paradigm shift in scientific research that has facilitated easy access to computational utilities and streamlined collaboration across distance and disciplines, thereby enabling scientific breakthroughs to be reached more quickly and efficiently. Spatial cyberinfrastructure seeks to resolve longstanding complex problems of handling and analyzing massive and heterogeneous spatial datasets as well as the necessity and benefits of sharing spatial data flexibly and securely. This article provides an overview and potential future directions of spatial cyberinfrastructure. The remaining four articles of the special feature are introduced and situated in the context of providing empirical examples of how spatial cyberinfrastructure is extending and enhancing scientific practice for improved synthesis and analysis of both physical and social science data. The primary focus of the articles is spatial analyses using distributed and high-performance computing, sensor networks, and other advanced information technology capabilities to transform massive spatial datasets into insights and knowledge. PMID:21467227

  9. Power monitoring and control for large scale projects: SKA, a case study

    NASA Astrophysics Data System (ADS)

    Barbosa, Domingos; Barraca, João. Paulo; Maia, Dalmiro; Carvalho, Bruno; Vieira, Jorge; Swart, Paul; Le Roux, Gerhard; Natarajan, Swaminathan; van Ardenne, Arnold; Seca, Luis

    2016-07-01

    Large sensor-based science infrastructures for radio astronomy such as the SKA will be among the most data-intensive projects in the world, facing very demanding computation, storage, management and, above all, power requirements. The geographically wide distribution of the SKA and its associated processing requirements, in the form of tailored High Performance Computing (HPC) facilities, require a greener approach to the Information and Communications Technologies (ICT) adopted for data processing, to enable operational compliance with potentially strict power budgets. Reducing electricity costs and improving system-level power monitoring and the generation and management of electricity are paramount to avoiding future inefficiencies and higher costs and to enabling fulfillment of the key science cases. Here we outline major characteristics of, and innovative approaches to, power efficiency and long-term power sustainability for radio astronomy projects, focusing on green ICT for science and smart power monitoring and control.

  10. A wearable computing platform for developing cloud-based machine learning models for health monitoring applications.

    PubMed

    Patel, Shyamal; McGinnis, Ryan S; Silva, Ikaro; DiCristofaro, Steve; Mahadevan, Nikhil; Jortberg, Elise; Franco, Jaime; Martin, Albert; Lust, Joseph; Raj, Milan; McGrane, Bryan; DePetrillo, Paolo; Aranyosi, A J; Ceruolo, Melissa; Pindado, Jesus; Ghaffari, Roozbeh

    2016-08-01

    Wearable sensors have the potential to enable clinical-grade ambulatory health monitoring outside the clinic. Technological advances have enabled development of devices that can measure vital signs with great precision and significant progress has been made towards extracting clinically meaningful information from these devices in research studies. However, translating measurement accuracies achieved in the controlled settings such as the lab and clinic to unconstrained environments such as the home remains a challenge. In this paper, we present a novel wearable computing platform for unobtrusive collection of labeled datasets and a new paradigm for continuous development, deployment and evaluation of machine learning models to ensure robust model performance as we transition from the lab to home. Using this system, we train activity classification models across two studies and track changes in model performance as we go from constrained to unconstrained settings.
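
    A minimal sketch of the modeling step, assuming synthetic labeled data in place of the platform's sensor streams: windowed accelerometer features feed a simple activity classifier of the kind that would then be re-evaluated as data move from constrained (lab) to unconstrained (home) settings.

```python
# Minimal sketch with synthetic data: summary features of accelerometer windows
# train a simple activity classifier. Classes, window length, and noise levels
# are invented for illustration.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

def window_features(accel_window):
    """Mean, standard deviation and peak of the acceleration magnitude."""
    mag = np.linalg.norm(accel_window, axis=1)
    return [mag.mean(), mag.std(), mag.max()]

# two fake activity classes: "still" (label 0) vs "walking" (label 1)
windows = [rng.normal(0, 0.05 if lbl == 0 else 0.6, size=(100, 3))
           for lbl in (0, 1) for _ in range(150)]
labels = [0] * 150 + [1] * 150
X = np.array([window_features(w) for w in windows])
X_tr, X_te, y_tr, y_te = train_test_split(X, labels, test_size=0.3, random_state=0)

clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_tr, y_tr)
print("held-out accuracy:", clf.score(X_te, y_te))
```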

  11. Blind topological measurement-based quantum computation

    PubMed Central

    Morimae, Tomoyuki; Fujii, Keisuke

    2012-01-01

    Blind quantum computation is a novel secure quantum-computing protocol that enables Alice, who does not have sufficient quantum technology at her disposal, to delegate her quantum computation to Bob, who has a fully fledged quantum computer, in such a way that Bob cannot learn anything about Alice's input, output and algorithm. A recent proof-of-principle experiment demonstrating blind quantum computation in an optical system has raised new challenges regarding the scalability of blind quantum computation in realistic noisy conditions. Here we show that fault-tolerant blind quantum computation is possible in a topologically protected manner using the Raussendorf–Harrington–Goyal scheme. The error threshold of our scheme is 4.3×10⁻³, which is comparable to that (7.5×10⁻³) of non-blind topological quantum computation. As an error per gate of the order of 10⁻³ has already been achieved in some experimental systems, our result implies that secure cloud quantum computation is within reach. PMID:22948818

  12. When technology became language: the origins of the linguistic conception of computer programming, 1950-1960.

    PubMed

    Nofre, David; Priestley, Mark; Alberts, Gerard

    2014-01-01

    Language is one of the central metaphors around which the discipline of computer science has been built. The language metaphor entered modern computing as part of a cybernetic discourse, but during the second half of the 1950s acquired a more abstract meaning, closely related to the formal languages of logic and linguistics. The article argues that this transformation was related to the appearance of the commercial computer in the mid-1950s. Managers of computing installations and specialists on computer programming in academic computer centers, confronted with an increasing variety of machines, called for the creation of "common" or "universal languages" to enable the migration of computer code from machine to machine. Finally, the article shows how the idea of a universal language was a decisive step in the emergence of programming languages, in the recognition of computer programming as a proper field of knowledge, and eventually in the way we think of the computer.

  13. Capturing Petascale Application Characteristics with the Sequoia Toolkit

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vetter, Jeffrey S; Bhatia, Nikhil; Grobelny, Eric M

    2005-01-01

    Characterization of the computation, communication, memory, and I/O demands of current scientific applications is crucial for identifying which technologies will enable petascale scientific computing. In this paper, we present the Sequoia Toolkit for characterizing HPC applications. The Sequoia Toolkit consists of the Sequoia trace capture library and the Sequoia Event Analysis Library, or SEAL, that facilitates the development of tools for analyzing Sequoia event traces. Using the Sequoia Toolkit, we have characterized the behavior of application runs with up to 2048 application processes. To illustrate the use of the Sequoia Toolkit, we present a preliminary characterization of LAMMPS, a molecular dynamics application of great interest to the computational biology community.
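
    A minimal sketch of event-trace capture, not the Sequoia trace format: timestamped records for the kinds of operations a characterization toolkit measures (computation, communication, I/O), collected during a run and dumped for later analysis. Event names and fields are invented.

```python
# Minimal sketch of trace capture: a context manager records the duration and
# metadata of each instrumented region into an in-memory trace, which is then
# serialized for post-run analysis. All event kinds/fields are illustrative.
import json
import time
from contextlib import contextmanager

TRACE = []

@contextmanager
def traced(kind, **details):
    start = time.perf_counter()
    yield
    TRACE.append({"kind": kind, "duration_s": time.perf_counter() - start, **details})

with traced("compute", phase="force_update"):
    sum(i * i for i in range(100_000))
with traced("io", file="restart.dat", bytes=4096):
    time.sleep(0.01)                      # stand-in for a write

print(json.dumps(TRACE, indent=2))
```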

  14. National Geographic Society Kids Network: Report on 1994 teacher participants

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    In 1994, National Geographic Society Kids Network, a computer/telecommunications-based science curriculum, was presented to elementary and middle school teachers through summer programs sponsored by NGS and US DOE. The network program assists teachers in understanding the process of doing science; understanding the role of computers and telecommunications in the study of science, math, and engineering; and utilizing computers and telecommunications appropriately in the classroom. The program enables teachers to integrate science, math, and technology with other subjects with the ultimate goal of encouraging students of all abilities to pursue careers in science/math/engineering. This report assesses the impact of the network program on participating teachers.

  15. Computational medical imaging and hemodynamics framework for functional analysis and assessment of cardiovascular structures.

    PubMed

    Wong, Kelvin K L; Wang, Defeng; Ko, Jacky K L; Mazumdar, Jagannath; Le, Thu-Thao; Ghista, Dhanjoo

    2017-03-21

    Cardiac dysfunction constitutes a common cardiovascular health issue in society, and has been an investigation topic of strong focus by researchers in the medical imaging community. Diagnostic modalities based on echocardiography, magnetic resonance imaging, chest radiography and computed tomography are common techniques that provide cardiovascular structural information to diagnose heart defects. However, functional information of cardiovascular flow, which can in fact be used to support the diagnosis of many cardiovascular diseases with a myriad of hemodynamics performance indicators, remains unexplored to its full potential. Some of these indicators constitute important cardiac functional parameters associated with cardiovascular abnormalities. With the advancement of computer technology that facilitates high-speed computational fluid dynamics, the realization of a support diagnostic platform for hemodynamics quantification and analysis can be achieved. This article reviews the state-of-the-art medical imaging and high-fidelity multi-physics computational analyses that together enable reconstruction of cardiovascular structures and hemodynamic flow patterns within them, such as of the left ventricle (LV) and carotid bifurcations. The combined medical imaging and hemodynamic analysis enables us to study the mechanisms of cardiovascular disease-causing dysfunctions, such as how (1) cardiomyopathy causes left ventricular remodeling and loss of contractility leading to heart failure, and (2) modeling of LV construction and simulation of intra-LV hemodynamics can enable us to determine the optimum procedure of surgical ventriculation to restore its contractility and health. This combined medical imaging and hemodynamics framework can potentially extend medical knowledge of cardiovascular defects, associated hemodynamic behavior and their surgical restoration, by means of an integrated medical image diagnostics and hemodynamic performance analysis framework.
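
    The governing equations behind such CFD-based hemodynamic analyses are not written out in this record; a generic statement of the incompressible Navier-Stokes equations typically solved in this setting (a hedged sketch, not the article's own formulation) is

        \rho \left( \frac{\partial \mathbf{u}}{\partial t} + (\mathbf{u} \cdot \nabla)\mathbf{u} \right) = -\nabla p + \mu \nabla^{2} \mathbf{u}, \qquad \nabla \cdot \mathbf{u} = 0,

    where \mathbf{u} is the blood velocity field, p the pressure, \rho the density and \mu the dynamic viscosity; indicators such as wall shear stress are then derived from the computed velocity field.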

  16. 3D Printing of Biomolecular Models for Research and Pedagogy

    PubMed Central

    Da Veiga Beltrame, Eduardo; Tyrwhitt-Drake, James; Roy, Ian; Shalaby, Raed; Suckale, Jakob; Pomeranz Krummel, Daniel

    2017-01-01

    The construction of physical three-dimensional (3D) models of biomolecules can uniquely contribute to the study of the structure-function relationship. 3D structures are most often perceived using the two-dimensional and exclusively visual medium of the computer screen. Converting digital 3D molecular data into real objects enables information to be perceived through an expanded range of human senses, including direct stereoscopic vision, touch, and interaction. Such tangible models facilitate new insights, enable hypothesis testing, and serve as psychological or sensory anchors for conceptual information about the functions of biomolecules. Recent advances in consumer 3D printing technology enable, for the first time, the cost-effective fabrication of high-quality and scientifically accurate models of biomolecules in a variety of molecular representations. However, the optimization of the virtual model and its printing parameters is difficult and time consuming without detailed guidance. Here, we provide a guide on the digital design and physical fabrication of biomolecule models for research and pedagogy using open source or low-cost software and low-cost 3D printers that use fused filament fabrication technology. PMID:28362403

  17. Nurses using futuristic technology in today's healthcare setting.

    PubMed

    Wolf, Debra M; Kapadia, Amar; Kintzel, Jessie; Anton, Bonnie B

    2009-01-01

    Human computer interaction (HCI) here takes the form of nurses using voice-assisted technology within a clinical setting to document patient care in real time, retrieve patient information from care plans, and complete routine tasks. This is a reality currently utilized by clinicians in acute and long term care settings. Voice-assisted documentation provides hands- and eyes-free, accurate documentation while enabling effective communication and task management. The speech technology increases the accuracy of documentation while interfacing directly with the electronic health record (EHR). Using technology consisting of a lightweight headset and a small, fist-sized wireless computer, verbal responses to easy-to-follow cues are converted into a database system, allowing staff to obtain individualized care status reports on demand. To further assist staff in their daily process, this innovative technology allows staff to send and receive pages as needed. This paper will discuss how leading edge and award winning technology is being integrated within the United States. Collaborative efforts between clinicians and analysts will be discussed, reflecting the interactive design and build functionality. Features such as the system's voice responses and directed cues will be shared, along with how easily data can be documented, viewed and retrieved. Outcome data will be presented on how the technology impacted the organization's quality outcomes, financial reimbursement, and employees' level of satisfaction.

  18. Highly Parallel Computing Architectures by using Arrays of Quantum-dot Cellular Automata (QCA): Opportunities, Challenges, and Recent Results

    NASA Technical Reports Server (NTRS)

    Fijany, Amir; Toomarian, Benny N.

    2000-01-01

    There has been significant improvement in the performance of VLSI devices, in terms of size, power consumption, and speed, in recent years, and this trend may continue for the near future. However, it is a well-known fact that there are major obstacles, i.e., the physical limitation of feature size reduction and the ever increasing cost of foundries, that would prevent the long term continuation of this trend. This has motivated the exploration of some fundamentally new technologies that are not dependent on the conventional feature size approach. Such technologies are expected to enable scaling to continue to the ultimate level, i.e., molecular and atomistic size. Quantum computing, quantum dot-based computing, DNA based computing, biologically inspired computing, etc., are examples of such new technologies. In particular, quantum dot-based computing using Quantum-dot Cellular Automata (QCA) has recently been intensely investigated as a promising new technology capable of offering significant improvement over conventional VLSI in terms of reduction of feature size (and hence increase in integration level), reduction of power consumption, and increase of switching speed. Quantum dot-based computing and memory in general, and QCA specifically, are intriguing to NASA due to their high packing density (10(exp 11) - 10(exp 12) per square cm), low power consumption (no transfer of current) and potentially higher radiation tolerance. Under the Revolutionary Computing Technology (RTC) Program at the NASA/JPL Center for Integrated Space Microelectronics (CISM), we have been investigating the potential applications of QCA for the space program. To this end, exploiting the intrinsic features of QCA, we have designed novel QCA-based circuits for co-planar (i.e., single layer) and compact implementation of a class of data permutation matrices, a class of interconnection networks, and a bit-serial processor. Building upon these circuits, we have developed novel algorithms and QCA-based architectures for highly parallel and systolic computation of signal/image processing applications, such as the FFT and Wavelet and Walsh-Hadamard Transforms.

  19. Collaborative Visualization Project: shared-technology learning environments for science learning

    NASA Astrophysics Data System (ADS)

    Pea, Roy D.; Gomez, Louis M.

    1993-01-01

    Project-enhanced science learning (PESL) provides students with opportunities for 'cognitive apprenticeships' in authentic scientific inquiry using computers for data collection and analysis. Student teams work on projects with teacher guidance to develop and apply their understanding of science concepts and skills. We are applying advanced computing and communications technologies to augment and transform PESL at-a-distance (beyond the boundaries of the individual school), which is limited today to asynchronous, text-only networking and unsuitable for collaborative science learning involving shared access to multimedia resources such as data, graphs, tables, pictures, and audio-video communication. Our work creates user technology (a Collaborative Science Workbench providing PESL design support and shared synchronous document views, program, and data access; a Science Learning Resource Directory for easy access to resources including two-way video links to collaborators, mentors, museum exhibits, and media-rich resources such as scientific visualization graphics) and refines enabling technologies (audiovisual and shared-data telephony, networking) for this PESL niche. We characterize participation scenarios for using these resources and we discuss national networked access to science education expertise.

  20. Web-based continuing medical education. (II): Evaluation study of computer-mediated continuing medical education.

    PubMed

    Curran, V R; Hoekman, T; Gulliver, W; Landells, I; Hatcher, L

    2000-01-01

    Over the years, various distance learning technologies and methods have been applied to the continuing medical education needs of rural and remote physicians. They have included audio teleconferencing, slow scan imaging, correspondence study, and compressed videoconferencing. The recent emergence and growth of Internet, World Wide Web (Web), and compact disk read-only-memory (CD-ROM) technologies have introduced new opportunities for providing continuing education to the rural medical practitioner. This evaluation study assessed the instructional effectiveness of a hybrid computer-mediated courseware delivery system on dermatologic office procedures. A hybrid delivery system merges Web documents, multimedia, computer-mediated communications, and CD-ROMs to enable self-paced instruction and collaborative learning. Using a modified pretest to post-test control group study design, several evaluative criteria (participant reaction, learning achievement, self-reported performance change, and instructional transactions) were assessed by various qualitative and quantitative data collection methods. This evaluation revealed that a hybrid computer-mediated courseware system was an effective means for increasing knowledge (p < .05) and improving self-reported competency (p < .05) in dermatologic office procedures, and that participants were very satisfied with the self-paced instruction and use of asynchronous computer conferencing for collaborative information sharing among colleagues.

  1. Open source system OpenVPN in a function of Virtual Private Network

    NASA Astrophysics Data System (ADS)

    Skendzic, A.; Kovacic, B.

    2017-05-01

    The use of Virtual Private Networks (VPNs) can establish a high level of security in network communication. VPN technology enables highly secure networking over distributed or public network infrastructure. A VPN applies its own security and management rules inside the network and can be set up over different communication channels, such as the Internet or separate ISP communication infrastructure. A VPN creates a secure communication channel over a public network between two endpoints (computers). OpenVPN is an open source software product under the GNU General Public License (GPL) that can be used to establish VPN communication between two computers inside a business local network over public communication infrastructure. It uses dedicated security protocols and 256-bit encryption, and it is capable of traversing network address translators (NATs) and firewalls. It allows computers to authenticate each other using a pre-shared secret key, certificates, or a username and password. This work gives a review of VPN technology with a special emphasis on OpenVPN. The paper also compares alternatives and outlines the financial benefits of using open source VPN software in a business environment.
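
    The record itself contains no configuration; as a minimal sketch, the Python snippet below writes out an illustrative point-to-point OpenVPN configuration of the pre-shared static key variety mentioned above. The host name, tunnel addresses, port and cipher are placeholder assumptions, not values from the paper.

        # Illustrative only: emits a minimal point-to-point OpenVPN client
        # configuration using a pre-shared static key; all values are placeholders.
        CLIENT_CONF = """\
        # Point-to-point tunnel using a pre-shared static key
        # (generate the key beforehand, e.g. with: openvpn --genkey --secret static.key)
        dev tun
        proto udp
        remote vpn.example.org 1194
        ifconfig 10.8.0.2 10.8.0.1
        secret static.key
        cipher AES-256-CBC
        keepalive 10 60
        persist-key
        persist-tun
        """

        with open("client.conf", "w") as f:
            f.write(CLIENT_CONF)
        print("Wrote client.conf; start the tunnel with: openvpn --config client.conf")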

  2. Evaluation of Semantic Web Technologies for Storing Computable Definitions of Electronic Health Records Phenotyping Algorithms.

    PubMed

    Papež, Václav; Denaxas, Spiros; Hemingway, Harry

    2017-01-01

    Electronic Health Records (EHR) are electronic data generated during or as a byproduct of routine patient care. Structured, semi-structured and unstructured EHR offer researchers unprecedented phenotypic breadth and depth and have the potential to accelerate the development of precision medicine approaches at scale. A main EHR use-case is defining phenotyping algorithms that identify disease status, onset and severity. Phenotyping algorithms utilize diagnoses, prescriptions, laboratory tests, symptoms and other elements in order to identify patients with or without a specific trait. No common standardized, structured, computable format exists for storing phenotyping algorithms. The majority of algorithms are stored as human-readable descriptive text documents, which makes their translation to code challenging, due to their inherent complexity, and hinders their sharing and re-use across the community. In this paper, we evaluate two key Semantic Web technologies, the Web Ontology Language and the Resource Description Framework, for enabling computable representations of EHR-driven phenotyping algorithms.
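
    As a purely illustrative sketch of the idea, the Python snippet below (using the rdflib library) stores one hypothetical phenotyping rule as RDF triples; the namespace, class and property names, and code list are invented and are not the representation proposed by the authors.

        # Hypothetical RDF encoding of a phenotyping rule; the namespace and
        # property names are illustrative, not the authors' actual schema.
        from rdflib import Graph, Literal, Namespace, RDF

        PHENO = Namespace("http://example.org/phenotyping#")

        g = Graph()
        g.bind("pheno", PHENO)

        algo = PHENO["Type2DiabetesAlgorithm"]
        rule = PHENO["rule1"]

        g.add((algo, RDF.type, PHENO.PhenotypingAlgorithm))
        g.add((algo, PHENO.hasRule, rule))
        g.add((rule, RDF.type, PHENO.DiagnosisRule))
        g.add((rule, PHENO.codeSystem, Literal("ICD-10")))
        g.add((rule, PHENO.includesCode, Literal("E11")))   # type 2 diabetes mellitus
        g.add((rule, PHENO.minimumOccurrences, Literal(2)))

        print(g.serialize(format="turtle"))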

  3. Photonics for aerospace sensors

    NASA Astrophysics Data System (ADS)

    Pellegrino, John; Adler, Eric D.; Filipov, Andree N.; Harrison, Lorna J.; van der Gracht, Joseph; Smith, Dale J.; Tayag, Tristan J.; Viveiros, Edward A.

    1992-11-01

    The maturation in the state-of-the-art of optical components is enabling increased applications for the technology. Most notable is the ever-expanding market for fiber optic data and communications links, familiar in both commercial and military markets. The inherent properties of optics and photonics, however, have suggested that components and processors may be designed that offer advantages over more commonly considered digital approaches for a variety of airborne sensor and signal processing applications. Various academic, industrial, and governmental research groups have been actively investigating and exploiting these properties of high bandwidth, large degree of parallelism in computation (e.g., processing in parallel over a two-dimensional field), and interconnectivity, and have succeeded in advancing the technology to the stage of systems demonstration. Such advantages as computational throughput and low operating power consumption are highly attractive for many computationally intensive problems. This review covers the key devices necessary for optical signal and image processors, some of the system application demonstration programs currently in progress, and active research directions for the implementation of next-generation architectures.

  4. The Audacity of Fiber-Wireless (FiWi) Networks

    NASA Astrophysics Data System (ADS)

    Maier, Martin; Ghazisaidi, Navid; Reisslein, Martin

    A plethora of enabling optical and wireless technologies have been emerging that can be used to build future-proof bimodal fiber-wireless (FiWi) broadband access networks. After overviewing key enabling radio-over-fiber (RoF) and radio-and-fiber (R&F) technologies and briefly surveying the state of the art of FiWi networks, we introduce an Ethernet-based access-metro FiWi network, called SuperMAN, that integrates next-generation WiFi and WiMAX networks with WDM-enhanced EPON and RPR networks. Throughout the paper we pay close attention to the technical challenges and opportunities of FiWi networks, but also elaborate on their societal benefits and potential to shift the current research focus from optical-wireless networking to the exploitation of personal and in-home computing facilities to create new unforeseen services and applications as we are about to enter the Petabyte age.

  5. Control of coherent information via on-chip photonic-phononic emitter-receivers.

    PubMed

    Shin, Heedeuk; Cox, Jonathan A; Jarecki, Robert; Starbuck, Andrew; Wang, Zheng; Rakich, Peter T

    2015-03-05

    Rapid progress in integrated photonics has fostered numerous chip-scale sensing, computing and signal processing technologies. However, many crucial filtering and signal delay operations are difficult to perform with all-optical devices. Unlike photons propagating at luminal speeds, GHz-acoustic phonons moving at slower velocities allow information to be stored, filtered and delayed over comparatively smaller length-scales with remarkable fidelity. Hence, controllable and efficient coupling between coherent photons and phonons enables new signal processing technologies that greatly enhance the performance and potential impact of integrated photonics. Here we demonstrate a mechanism for coherent information processing based on travelling-wave photon-phonon transduction, which achieves a phonon emit-and-receive process between distinct nanophotonic waveguides. Using this device physics, which supports GHz frequencies, we create wavelength-insensitive radiofrequency photonic filters with high frequency selectivity, narrow linewidth and high power-handling in silicon. More generally, this emit-and-receive concept is the impetus for enabling new signal processing schemes.

  6. MiRTE: Mixed Reality Triage and Evacuation game for Mass Casualty information systems design, testing and training.

    PubMed

    Yu, Xunyi; Ganz, Aura

    2011-01-01

    In this paper we introduce a Mixed Reality Triage and Evacuation game, MiRTE, that is used in the development, testing and training of Mass Casualty Incident (MCI) information systems for first responders. Using the Source game engine from Valve Software, MiRTE creates immersive virtual environments to simulate various incident scenarios, and enables interactions between multiple players/first responders. What distinguishes it from a pure computer simulation game is that it can interface with external mass casualty incident management systems, such as DIORAMA. The game enables system developers to specify the technical requirements of the underlying technology and to test different design alternatives. After the information system hardware and software are completed, the game can simulate various algorithms, such as localization technologies, and interface with the actual user interface on PCs and smartphones. We implemented and tested the game with the DIORAMA system.

  7. RIACS

    NASA Technical Reports Server (NTRS)

    Moore, Robert C.

    1998-01-01

    The Research Institute for Advanced Computer Science (RIACS) was established by the Universities Space Research Association (USRA) at the NASA Ames Research Center (ARC) on June 6, 1983. RIACS is privately operated by USRA, a consortium of universities that serves as a bridge between NASA and the academic community. Under a five-year co-operative agreement with NASA, research at RIACS is focused on areas that are strategically enabling to the Ames Research Center's role as NASA's Center of Excellence for Information Technology. The primary mission of RIACS is to carry out research and development in computer science. This work is devoted in the main to tasks that are strategically enabling with respect to NASA's bold missions in space exploration and aeronautics. There are three foci for this work: (1) Automated Reasoning, (2) Human-Centered Computing, and (3) High Performance Computing and Networking. RIACS has the additional goal of broadening the base of researchers in these areas of importance to the nation's space and aeronautics enterprises. Through its visiting scientist program, RIACS facilitates the participation of university-based researchers, including both faculty and students, in the research activities of NASA and RIACS. RIACS researchers work in close collaboration with NASA computer scientists on projects such as the Remote Agent Experiment on the Deep Space One mission, and Super-Resolution Surface Modeling.

  8. HPC Annual Report 2017

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dennig, Yasmin

    Sandia National Laboratories has a long history of significant contributions to the high performance computing community and industry. Our innovative computer architectures allowed the United States to become the first to break the teraFLOP barrier, propelling us into the international spotlight. Our advanced simulation and modeling capabilities have been integral in high consequence US operations such as Operation Burnt Frost. Strong partnerships with industry leaders, such as Cray, Inc. and Goodyear, have enabled them to leverage our high performance computing (HPC) capabilities to gain a tremendous competitive edge in the marketplace. As part of our continuing commitment to providing modern computing infrastructure and systems in support of Sandia missions, we made a major investment in expanding Building 725 to serve as the new home of HPC systems at Sandia. Work is expected to be completed in 2018 and will result in a modern facility of approximately 15,000 square feet of computer center space. The facility will be ready to house the newest National Nuclear Security Administration/Advanced Simulation and Computing (NNSA/ASC) prototype platform being acquired by Sandia, with delivery in late 2019 or early 2020. This new system will enable continuing advances by Sandia science and engineering staff in the areas of operating system R&D, operation cost effectiveness (power and innovative cooling technologies), user environment and application code performance.

  9. In the blink of an eye: head mounted displays development within BAE Systems

    NASA Astrophysics Data System (ADS)

    Cameron, Alex

    2015-05-01

    There has been an explosion of interest in head worn displays in recent years, particularly for consumer applications, with an attendant ramping up of investment into key enabling technologies to provide what is in essence a mobile computer display. However, head mounted systems have been around for over 40 years, and today's consumer products are building on a legacy of knowledge and technology created by companies such as BAE Systems, who have been designing and fielding helmet mounted displays (HMDs) for a wide range of specialist applications. Although the dominant application area has been military aviation, solutions have been fielded for soldier, ground vehicle, simulation, medical, racing car and even subsea navigation applications. What sets these HMDs apart is that they provide the user with accurate conformal information embedded in the user's real world view. The information presented is intuitive and easy to use because it overlays the real world, and it enables users to stay head up and eyes out, improving their effectiveness, reducing workload and improving safety. Such systems are an enabling technology in the provision of enhanced Situation Awareness (SA) and in reducing user workload in high intensity situations. These capabilities are finding much wider application in new types of compact man mounted audio/visual products enabled by the emergence of new families of micro displays, novel optical concepts and ultra-compact low power processing solutions. This paper therefore provides a personal summary of BAE Systems' 40-year journey in developing and fielding head mounted systems and their applications.

  10. The Experiences of Female High School Students and Interest in STEM: Factors Leading to the Selection of an Engineering or Computer Science Major

    ERIC Educational Resources Information Center

    Genoways, Sharon K.

    2017-01-01

    STEM (Science, Technology, Engineering and Math) education creates critical thinkers, increases science literacy, and enables the next generation of innovators, which leads to new products and processes that sustain our economy (Hossain & Robinson, 2012). We have been hearing the warnings for several years, that there simply are not enough…

  11. Innovating toward Sustainability: How Computer Labs Can Enable New Staffing Structures, and New Savings. Schools in Crisis: Making Ends Meet

    ERIC Educational Resources Information Center

    Simburg, Suzanne; Roza, Marguerite

    2012-01-01

    Even as new educational technologies have emerged, staffing innovations have seemed all but impossible in American schools. Charter and district schools alike long ago surrendered to the notion that education requires at least as many core teachers as is determined from dividing enrollment by class size. A few new school designs suggest that we…

  12. Initial Determination of Low Earth Orbits Using Commercial Telescopes

    DTIC Science & Technology

    2008-03-01

    many new technologies have significantly changed the face of private astronomy. Developments such as inexpensive but high-quality sensors, rapid... astronomy. Unparalleled access to quality equipment, rapid personal computing, and extensive community support enable nearly anyone to achieve feats in...other subdisciplines of astronomy, this field benefits greatly from recent advances. This project examines how modern equipment is used to track Low Earth

  13. Proceedings on Combating the Unrestricted Warfare Threat: Integrating Strategy, Analysis, and Technology, 20-21 March 2007

    DTIC Science & Technology

    2007-03-01

    Prosthetics to enable return to units without loss of capability. Quantum...and will give us a big advantage in terms of unrestricted warfare. Figure 17: High-Productivity Computing System. Prosthetics: We have an exciting...program in prosthetics (Figure 18). It started with a monkey at Duke University. We put microelectronic implants into her brain, taught her

  14. Design of a Multi-Touch Tabletop for Simulation-Based Training

    DTIC Science & Technology

    2014-06-01

    receive, for example using point and click mouse-based computer interactions to specify the routes that vehicles take as part of a convoy...learning, coordination and support for planning. We first provide background in tabletop interaction in general and survey earlier efforts to use...tremendous progress over the past five years. Touch detection technologies now enable multiple users to interact simultaneously on large areas with

  15. A Survey of Robotic Technology.

    DTIC Science & Technology

    1983-07-01

    developed the following definition of a robot: A robot is a reprogrammable multifunctional manipulator designed to move material, parts, tools, or specialized...subroutines, commands to specific actuators, computations based on sensor data, etc. For instance, the job might be to assemble an automobile...the set-up developed at Draper Labs to enable a robot to assemble an automobile alternator. The assembly operation is impressive to watch. The number

  16. Globus | Informatics Technology for Cancer Research (ITCR)

    Cancer.gov

    Globus software services provide secure cancer research data transfer, synchronization, and sharing in distributed environments at large scale. These services can be integrated into applications and research data gateways, leveraging Globus identity management, single sign-on, search, and authorization capabilities. Globus Genomics integrates Globus with the Galaxy genomics workflow engine and Amazon Web Services to enable cancer genomics analysis that can elastically scale compute resources with demand.
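
    As a purely illustrative sketch (not taken from the record), the Python snippet below uses the Globus SDK to submit a file transfer between two endpoints; the endpoint UUIDs, paths and access token are placeholders.

        # Illustrative use of the Globus Python SDK; endpoint IDs, paths and the
        # access token are placeholders, not real resources.
        import globus_sdk

        TRANSFER_TOKEN = "..."          # obtained via a Globus OAuth2 flow
        SRC_ENDPOINT = "SOURCE-ENDPOINT-UUID"
        DST_ENDPOINT = "DEST-ENDPOINT-UUID"

        tc = globus_sdk.TransferClient(
            authorizer=globus_sdk.AccessTokenAuthorizer(TRANSFER_TOKEN)
        )

        tdata = globus_sdk.TransferData(tc, SRC_ENDPOINT, DST_ENDPOINT,
                                        label="genomics results", sync_level="checksum")
        tdata.add_item("/results/sample01.vcf", "/shared/sample01.vcf")

        task = tc.submit_transfer(tdata)
        print("Submitted transfer task:", task["task_id"])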

  17. Implementation of Protocols to Enable Doctoral Training in Physical and Computational Chemistry of a Blind Graduate Student

    ERIC Educational Resources Information Center

    Minkara, Mona S.; Weaver, Michael N.; Gorske, Jim; Bowers, Clifford R.; Merz, Kenneth M., Jr.

    2015-01-01

    There exists a sparse representation of blind and low-vision students in science, technology, engineering and mathematics (STEM) fields. This is due in part to these individuals being discouraged from pursuing STEM degrees as well as a lack of appropriate adaptive resources in upper level STEM courses and research. Mona Minkara is a rising fifth…

  18. The Functionality of a Geography Information System (GIS) Technology in Geography Teaching: Application of a Sample Lesson

    ERIC Educational Resources Information Center

    Ozgen, Nurettin

    2009-01-01

    A Geographic Information System (GIS) is a high-performance, computer-aided software chain that enables us to understand, interpret, capture, update, map, and display natural and human-originated events on Earth, and allows us to present such phenomena in a synthesized form. Therefore, a GIS is an important information system in which…

  19. Google Earth Engine: a new cloud-computing platform for global-scale earth observation data and analysis

    NASA Astrophysics Data System (ADS)

    Moore, R. T.; Hansen, M. C.

    2011-12-01

    Google Earth Engine is a new technology platform that enables monitoring and measurement of changes in the earth's environment, at planetary scale, on a large catalog of earth observation data. The platform offers intrinsically-parallel computational access to thousands of computers in Google's data centers. Initial efforts have focused primarily on global forest monitoring and measurement, in support of REDD+ activities in the developing world. The intent is to put this platform into the hands of scientists and developing world nations, in order to advance the broader operational deployment of existing scientific methods, and strengthen the ability for public institutions and civil society to better understand, manage and report on the state of their natural resources. Earth Engine currently hosts online nearly the complete historical Landsat archive of L5 and L7 data collected over more than twenty-five years. Newly-collected Landsat imagery is downloaded from USGS EROS Center into Earth Engine on a daily basis. Earth Engine also includes a set of historical and current MODIS data products. The platform supports generation, on-demand, of spatial and temporal mosaics, "best-pixel" composites (for example to remove clouds and gaps in satellite imagery), as well as a variety of spectral indices. Supervised learning methods are available over the Landsat data catalog. The platform also includes a new application programming framework, or "API", that allows scientists access to these computational and data resources, to scale their current algorithms or develop new ones. Under the covers of the Google Earth Engine API is an intrinsically-parallel image-processing system. Several forest monitoring applications powered by this API are currently in development and expected to be operational in 2011. Combining science with massive data and technology resources in a cloud-computing framework can offer advantages of computational speed, ease-of-use and collaboration, as well as transparency in data and methods. Methods developed for global processing of MODIS data to map land cover are being adopted for use with Landsat data. Specifically, the MODIS Vegetation Continuous Field product methodology has been applied for mapping forest extent and change at national scales using Landsat time-series data sets. Scaling this method to continental and global scales is enabled by Google Earth Engine computing capabilities. By combining the supervised learning VCF approach with the Landsat archive and cloud computing, unprecedented monitoring of land cover dynamics is enabled.
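
    For illustration only, a typical use of the Earth Engine Python API for the kind of composite-and-index computation described above might look like the sketch below; the image collection ID, region, dates and bands are assumptions, not part of the abstract.

        # Illustrative Earth Engine Python API usage; the collection ID, region,
        # dates and band names are placeholder assumptions.
        import ee

        ee.Initialize()

        region = ee.Geometry.Rectangle([-61.0, -4.0, -60.0, -3.0])   # arbitrary Amazon tile

        # Build a "best-pixel" (median) composite from one year of Landsat scenes
        composite = (ee.ImageCollection("LANDSAT/LE07/C02/T1_L2")
                     .filterBounds(region)
                     .filterDate("2010-01-01", "2010-12-31")
                     .median())

        # A simple spectral index (NDVI) computed from the composite
        ndvi = composite.normalizedDifference(["SR_B4", "SR_B3"]).rename("NDVI")

        stats = ndvi.reduceRegion(reducer=ee.Reducer.mean(), geometry=region, scale=30)
        print("Mean NDVI:", stats.getInfo())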

  20. Promise of new imaging technologies for assessing ovarian function.

    PubMed

    Singh, Jaswant; Adams, Gregg P; Pierson, Roger A

    2003-10-15

    Advancements in imaging technologies over the last two decades have ushered a quiet revolution in research approaches to the study of ovarian structure and function. The most significant changes in our understanding of the ovary have resulted from the use of ultrasonography which has enabled sequential analyses in live animals. Computer-assisted image analysis and mathematical modeling of the dynamic changes within the ovary has permitted exciting new avenues of research with readily quantifiable endpoints. Spectral, color-flow and power Doppler imaging now facilitate physiologic interpretations of vascular dynamics over time. Similarly, magnetic resonance imaging (MRI) is emerging as a research tool in ovarian imaging. New technologies, such as three-dimensional ultrasonography and MRI, ultrasound-based biomicroscopy and synchrotron-based techniques each have the potential to enhance our real-time picture of ovarian function to the near-cellular level. Collectively, information available in ultrasonography, MRI, computer-assisted image analysis and mathematical modeling heralds a new era in our understanding of the basic processes of female and male reproduction.

  1. Creating a histology-embryology free digital image database using high-end microscopy and computer techniques for on-line biomedical education.

    PubMed

    Silva-Lopes, Victor W; Monteiro-Leal, Luiz H

    2003-07-01

    The development of new technology and the possibility of fast information delivery by either Internet or Intranet connections are changing education. Microanatomy education depends basically on the correct interpretation of microscopy images by students. Modern microscopes coupled to computers enable the presentation of these images in a digital form by creating image databases. However, access to this new technology is restricted to those living in cities and towns with an Information Technology (IT) infrastructure. This study describes the creation of a free Internet histology database composed of high-quality images and also presents an inexpensive way to supply it to a greater number of students through Internet/Intranet connections. By using state-of-the-art scientific instruments, we developed a Web page (http://www2.uerj.br/~micron/atlas/atlasenglish/index.htm) that, in association with a multimedia microscopy laboratory, intends to help in the reduction of the IT educational gap between developed and underdeveloped regions. Copyright 2003 Wiley-Liss, Inc.

  2. System and method for design and optimization of grid connected photovoltaic power plant with multiple photovoltaic module technologies

    DOEpatents

    Thomas, Bex George; Elasser, Ahmed; Bollapragada, Srinivas; Galbraith, Anthony William; Agamy, Mohammed; Garifullin, Maxim Valeryevich

    2016-03-29

    A system and method of using one or more DC-DC/DC-AC converters and/or alternative devices allows strings of multiple module technologies to coexist within the same PV power plant. A computing (optimization) framework estimates the percentage allocation of PV power plant capacity to selected PV module technologies. The framework and its supporting components consider irradiation, temperature, spectral profiles, cost and other practical constraints to achieve the lowest levelized cost of electricity, maximum output and minimum system cost. The system and method can function using any device enabling distributed maximum power point tracking at the module, string or combiner level.
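
    The patent abstract gives no numbers; as a toy illustration of allocating plant capacity across module technologies, the Python sketch below (SciPy) minimizes installed cost subject to the allocations summing to one and meeting a plant-average yield target. All costs, yields and the target are invented placeholders, not the patented optimization.

        # Toy capacity-allocation example; module costs, yields and the yield
        # target are invented placeholders, not data from the patent.
        import numpy as np
        from scipy.optimize import linprog

        techs = ["mono-Si", "poly-Si", "thin-film"]
        cost = np.array([950.0, 880.0, 760.0])          # $ per kW installed (placeholder)
        yield_kwh = np.array([1750.0, 1700.0, 1600.0])  # kWh per kW per year (placeholder)
        min_specific_yield = 1680.0                     # required plant-average kWh/kW/year

        # Minimize total cost; allocations sum to 1; plant-average yield >= target
        res = linprog(
            c=cost,
            A_ub=[-yield_kwh],            # -yield . x <= -target  <=>  yield . x >= target
            b_ub=[-min_specific_yield],
            A_eq=[np.ones(3)],
            b_eq=[1.0],
            bounds=[(0.0, 1.0)] * 3,
            method="highs",
        )

        for name, frac in zip(techs, res.x):
            print(f"{name}: {frac:.1%} of plant capacity")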

  3. Note-taking and Handouts in The Digital Age.

    PubMed

    Stacy, Elizabeth Moore; Cain, Jeff

    2015-09-25

    Most educators consider note-taking a critical component of formal classroom learning. Advancements in technology such as tablet computers, mobile applications, and recorded lectures are altering classroom dynamics and affecting the way students compose and review class notes. These tools may improve a student's ability to take notes, but they also may hinder learning. In an era of dynamic technology developments, it is important for educators to routinely examine and evaluate influences on formal and informal learning environments. This paper discusses key background literature on student note-taking, identifies recent trends and potential implications of mobile technologies on classroom note-taking and student learning, and discusses future directions for note-taking in the context of digitally enabled lifelong learning.

  4. NASA's Advanced Information Systems Technology (AIST) Program: Advanced Concepts and Disruptive Technologies

    NASA Astrophysics Data System (ADS)

    Little, M. M.; Moe, K.; Komar, G.

    2014-12-01

    NASA's Earth Science Technology Office (ESTO) manages a wide range of information technology projects under the Advanced Information Systems Technology (AIST) Program. The AIST Program aims to support all phases of NASA's Earth Science program with the goal of enabling new observations and information products, increasing the accessibility and use of Earth observations, and reducing the risk and cost of satellite and ground based information systems. Recent initiatives feature computational technologies to improve information extracted from data streams or model outputs and researchers' tools for Big Data analytics. Data-centric technologies enable research communities to facilitate collaboration and increase the speed with which results are produced and published. In the future NASA anticipates more small satellites (e.g., CubeSats), mobile drones and ground-based in-situ sensors will advance the state-of-the-art regarding how scientific observations are performed, given the flexibility, cost and deployment advantages of new operations technologies. This paper reviews the success of the program and the lessons learned. Infusion of these technologies is challenging and the paper discusses the obstacles and strategies to adoption by the earth science research and application efforts. It also describes alternative perspectives for the future program direction and for realizing the value in the steps to transform observations from sensors to data, to information, and to knowledge, namely: sensor measurement concepts development; data acquisition and management; data product generation; and data exploitation for science and applications.

  5. DOE Office of Scientific and Technical Information (OSTI.GOV)

    D'Azevedo, Eduardo; Abbott, Stephen; Koskela, Tuomas

    The XGC fusion gyrokinetic code combines state-of-the-art, portable computational and algorithmic technologies to enable complicated multiscale simulations of turbulence and transport dynamics in the ITER edge plasma on the largest US open-science computer, the CRAY XK7 Titan, at its maximal heterogeneous capability. Such simulations had not been possible before because the time-to-solution fell short by a factor of over 10 for less than 5 days of wall-clock time for one physics case. Frontier techniques are employed, such as nested OpenMP parallelism, adaptive parallel I/O, staging I/O and data reduction using dynamic and asynchronous application interactions, and dynamic repartitioning.

  6. A Platform-Independent Plugin for Navigating Online Radiology Cases.

    PubMed

    Balkman, Jason D; Awan, Omer A

    2016-06-01

    Software methods that enable navigation of radiology cases on various digital platforms differ between handheld devices and desktop computers. This has resulted in poor compatibility of online radiology teaching files across mobile smartphones, tablets, and desktop computers. A standardized, platform-independent, or "agnostic" approach for presenting online radiology content was produced in this work by leveraging modern hypertext markup language (HTML) and JavaScript web software technology. We describe the design and evaluation of this software, demonstrate its use across multiple viewing platforms, and make it publicly available as a model for future development efforts.

  7. Computer numeric control generation of toric surfaces

    NASA Astrophysics Data System (ADS)

    Bradley, Norman D.; Ball, Gary A.; Keller, John R.

    1994-05-01

    Until recently, the manufacture of toric ophthalmic lenses relied largely upon expensive, manual techniques for generation and polishing. Recent gains in computer numeric control (CNC) technology and tooling enable lens designers to employ single- point diamond, fly-cutting methods in the production of torics. Fly-cutting methods continue to improve, significantly expanding lens design possibilities while lowering production costs. Advantages of CNC fly cutting include precise control of surface geometry, rapid production with high throughput, and high-quality lens surface finishes requiring minimal polishing. As accessibility and affordability increase within the ophthalmic market, torics promise to dramatically expand lens design choices available to consumers.

  8. Development of microgravity, full body functional reach envelope using 3-D computer graphic models and virtual reality technology

    NASA Technical Reports Server (NTRS)

    Lindsey, Patricia F.

    1994-01-01

    In microgravity conditions mobility is greatly enhanced and body stability is difficult to achieve. Because of these difficulties, optimum placement and accessibility of objects and controls can be critical to required tasks on board shuttle flights or on the proposed space station. Anthropometric measurement of the maximum reach of occupants of a microgravity environment provides knowledge about maximum functional placement for tasking situations. Calculations for a full body, functional reach envelope for microgravity environments are imperative. To this end, three-dimensional computer modeled human figures, providing a method of anthropometric measurement, were used to locate the data points that define the full body, functional reach envelope. Virtual reality technology was utilized to enable an occupant of the microgravity environment to experience movement within the reach envelope while immersed in a simulated microgravity environment.

  9. Paradigms of perception in clinical practice.

    PubMed

    Jacobson, Francine L; Berlanstein, Bruce P; Andriole, Katherine P

    2006-06-01

    Display strategies for medical images in radiology have evolved in tandem with the technology by which images are made. The close of the 20th century, nearly coincident with the 100th anniversary of the discovery of x-rays, brought radiologists to a new crossroad in the evolution of image display. The increasing availability, speed, and flexibility of computer technology can now revolutionize how images are viewed and interpreted. Radiologists are not yet in agreement regarding the next paradigm for image display. The possibilities are being explored systematically through the Society for Computer Applications in Radiology's Transforming the Radiological Interpretation Process initiative. The varied input of radiologists who work in a large variety of settings will enable new display strategies to best serve radiologists in the detection and quantification of disease. Considerations and possibilities for the future are presented in this paper.

  10. Silicon photonics for high-performance interconnection networks

    NASA Astrophysics Data System (ADS)

    Biberman, Aleksandr

    2011-12-01

    We assert in the course of this work that silicon photonics has the potential to be a key disruptive technology in computing and communication industries. The enduring pursuit of performance gains in computing, combined with stringent power constraints, has fostered the ever-growing computational parallelism associated with chip multiprocessors, memory systems, high-performance computing systems, and data centers. Sustaining these parallelism growths introduces unique challenges for on- and off-chip communications, shifting the focus toward novel and fundamentally different communication approaches. This work showcases that chip-scale photonic interconnection networks, enabled by high-performance silicon photonic devices, enable unprecedented bandwidth scalability with reduced power consumption. We demonstrate that the silicon photonic platforms have already produced all the high-performance photonic devices required to realize these types of networks. Through extensive empirical characterization in much of this work, we demonstrate such feasibility of waveguides, modulators, switches, and photodetectors. We also demonstrate systems that simultaneously combine many functionalities to achieve more complex building blocks. Furthermore, we leverage the unique properties of available silicon photonic materials to create novel silicon photonic devices, subsystems, network topologies, and architectures to enable unprecedented performance of these photonic interconnection networks and computing systems. We show that the advantages of photonic interconnection networks extend far beyond the chip, offering advanced communication environments for memory systems, high-performance computing systems, and data centers. Furthermore, we explore the immense potential of all-optical functionalities implemented using parametric processing in the silicon platform, demonstrating unique methods that have the ability to revolutionize computation and communication. Silicon photonics enables new sets of opportunities that we can leverage for performance gains, as well as new sets of challenges that we must solve. Leveraging its inherent compatibility with standard fabrication techniques of the semiconductor industry, combined with its capability of dense integration with advanced microelectronics, silicon photonics also offers a clear path toward commercialization through low-cost mass-volume production. Combining empirical validations of feasibility, demonstrations of massive performance gains in large-scale systems, and the potential for commercial penetration of silicon photonics, the impact of this work will become evident in the many decades that follow.

  11. ViDI: Virtual Diagnostics Interface. Volume 2; Unified File Format and Web Services as Applied to Seamless Data Transfer

    NASA Technical Reports Server (NTRS)

    Fleming, Gary A. (Technical Monitor); Schwartz, Richard J.

    2004-01-01

    The desire to revolutionize the aircraft design cycle from its currently lethargic pace to a fast turn-around operation enabling the optimization of non-traditional configurations is a critical challenge facing the aeronautics industry. In response, a large scale effort is underway to not only advance the state of the art in wind tunnel testing, computational modeling, and information technology, but to unify these often disparate elements into a cohesive design resource. This paper will address Seamless Data Transfer, the critical central nervous system that will enable a wide variety of varied components to work together.

  12. Recent Enhancements to the Development of CFD-Based Aeroelastic Reduced-Order Models

    NASA Technical Reports Server (NTRS)

    Silva, Walter A.

    2007-01-01

    Recent enhancements to the development of CFD-based unsteady aerodynamic and aeroelastic reduced-order models (ROMs) are presented. These enhancements include the simultaneous application of structural modes as CFD input, static aeroelastic analysis using a ROM, and matched-point solutions using a ROM. The simultaneous application of structural modes as CFD input enables the computation of the unsteady aerodynamic state-space matrices with a single CFD execution, independent of the number of structural modes. The responses obtained from a simultaneous excitation of the CFD-based unsteady aerodynamic system are processed using system identification techniques in order to generate an unsteady aerodynamic state-space ROM. Once the unsteady aerodynamic state-space ROM is generated, a method for computing the static aeroelastic response using this unsteady aerodynamic ROM and a state-space model of the structure is presented. Finally, a method is presented that enables the computation of matched-point solutions using a single ROM that is applicable over a range of dynamic pressures and velocities for a given Mach number. These enhancements represent a significant advancement of unsteady aerodynamic and aeroelastic ROM technology.
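
    As background, state-space ROMs of this general kind are often identified from impulse or pulse responses with algorithms such as the Eigensystem Realization Algorithm (ERA); the Python/NumPy sketch below is a generic single-input, single-output ERA, offered only as an illustration of the technique and not as the specific identification procedure used in the paper.

        # Generic ERA sketch: identify a discrete-time state-space ROM from
        # impulse-response samples. Illustrative only; not the paper's code.
        import numpy as np

        def era(markov, order, rows=20, cols=20):
            """markov[k-1] = h_k, the impulse response at step k (k >= 1)."""
            H0 = np.array([[markov[i + j] for j in range(cols)] for i in range(rows)])
            H1 = np.array([[markov[i + j + 1] for j in range(cols)] for i in range(rows)])
            U, s, Vt = np.linalg.svd(H0)
            Ur, Vr = U[:, :order], Vt[:order, :].T
            Sr_half = np.diag(np.sqrt(s[:order]))
            Sr_half_inv = np.diag(1.0 / np.sqrt(s[:order]))
            A = Sr_half_inv @ Ur.T @ H1 @ Vr @ Sr_half_inv
            B = (Sr_half @ Vr.T)[:, :1]
            C = (Ur @ Sr_half)[:1, :]
            return A, B, C

        # Example: recover a simple two-state system from its impulse response
        a_true = np.array([[0.9, 0.1], [-0.1, 0.9]])
        b_true = np.array([[1.0], [0.0]])
        c_true = np.array([[1.0, 0.0]])
        h = [(c_true @ np.linalg.matrix_power(a_true, k - 1) @ b_true).item()
             for k in range(1, 60)]
        A, B, C = era(h, order=2)
        print("Identified eigenvalues:", np.linalg.eigvals(A))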

  13. Supercomputers ready for use as discovery machines for neuroscience.

    PubMed

    Helias, Moritz; Kunkel, Susanne; Masumoto, Gen; Igarashi, Jun; Eppler, Jochen Martin; Ishii, Shin; Fukai, Tomoki; Morrison, Abigail; Diesmann, Markus

    2012-01-01

    NEST is a widely used tool to simulate biological spiking neural networks. Here we explain the improvements, guided by a mathematical model of memory consumption, that enable us to exploit for the first time the computational power of the K supercomputer for neuroscience. Multi-threaded components for wiring and simulation combine 8 cores per MPI process to achieve excellent scaling. K is capable of simulating networks corresponding to a brain area with 10(8) neurons and 10(12) synapses in the worst case scenario of random connectivity; for larger networks of the brain its hierarchical organization can be exploited to constrain the number of communicating computer nodes. We discuss the limits of the software technology, comparing maximum filling scaling plots for K and the JUGENE BG/P system. The usability of these machines for network simulations has become comparable to running simulations on a single PC. Turn-around times in the range of minutes even for the largest systems enable a quasi interactive working style and render simulations on this scale a practical tool for computational neuroscience.
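
    The memory-consumption model itself is not reproduced in this record; a generic per-process form of such a model (a hedged sketch of the idea, not the authors' exact expression) is

        M(N, K, P) \approx M_{0}(P) + \frac{N}{P}\, m_{\mathrm{n}} + \frac{N K}{P}\, m_{\mathrm{s}} + N\, m_{\mathrm{o}}(P),

    where N is the total number of neurons, K the average number of synapses per neuron, P the number of MPI processes, m_n and m_s the per-neuron and per-synapse memory costs, and M_0 and m_o overhead terms that grow with the number of processes; such a model exposes which data structures dominate memory at a given machine scale.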

  14. Optimizing high performance computing workflow for protein functional annotation.

    PubMed

    Stanberry, Larissa; Rekepalli, Bhanu; Liu, Yuan; Giblock, Paul; Higdon, Roger; Montague, Elizabeth; Broomall, William; Kolker, Natali; Kolker, Eugene

    2014-09-10

    Functional annotation of newly sequenced genomes is one of the major challenges in modern biology. With modern sequencing technologies, the protein sequence universe is rapidly expanding. Newly sequenced bacterial genomes alone contain over 7.5 million proteins. The rate of data generation has far surpassed that of protein annotation. The volume of protein data makes manual curation infeasible, whereas a high compute cost limits the utility of existing automated approaches. In this work, we present an improved and optimized automated workflow to enable large-scale protein annotation. The workflow uses high performance computing architectures and a low complexity classification algorithm to assign proteins into existing clusters of orthologous groups of proteins. On the basis of the Position-Specific Iterative Basic Local Alignment Search Tool the algorithm ensures at least 80% specificity and sensitivity of the resulting classifications. The workflow utilizes highly scalable parallel applications for classification and sequence alignment. Using Extreme Science and Engineering Discovery Environment supercomputers, the workflow processed 1,200,000 newly sequenced bacterial proteins. With the rapid expansion of the protein sequence universe, the proposed workflow will enable scientists to annotate big genome data.
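
    The pipeline itself is not reproduced in this record; as an illustrative sketch of the fan-out step, the Python snippet below shards proteins across worker processes and assigns each to its best-scoring cluster of orthologous groups (COG) when the score clears a threshold. The classifier function, scores and threshold are placeholders standing in for the PSI-BLAST-based classification used in the actual workflow.

        # Illustrative sharding of protein classification across worker processes.
        # best_cog_hit() is a placeholder for the PSI-BLAST-based classifier; the
        # threshold is likewise invented.
        from multiprocessing import Pool

        SCORE_THRESHOLD = 0.8   # placeholder for the specificity/sensitivity cutoff

        def best_cog_hit(sequence):
            """Placeholder classifier: return (cog_id, score) for one protein."""
            return "COG0001", 0.93   # a real implementation would run PSI-BLAST here

        def classify(record):
            protein_id, sequence = record
            cog_id, score = best_cog_hit(sequence)
            return protein_id, (cog_id if score >= SCORE_THRESHOLD else None)

        if __name__ == "__main__":
            proteins = [("P1", "MKT..."), ("P2", "MSA..."), ("P3", "MGL...")]  # toy records
            with Pool(processes=4) as pool:
                for protein_id, cog in pool.map(classify, proteins):
                    print(protein_id, "->", cog if cog else "unassigned")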

  15. Optimizing high performance computing workflow for protein functional annotation

    PubMed Central

    Stanberry, Larissa; Rekepalli, Bhanu; Liu, Yuan; Giblock, Paul; Higdon, Roger; Montague, Elizabeth; Broomall, William; Kolker, Natali; Kolker, Eugene

    2014-01-01

    Functional annotation of newly sequenced genomes is one of the major challenges in modern biology. With modern sequencing technologies, the protein sequence universe is rapidly expanding. Newly sequenced bacterial genomes alone contain over 7.5 million proteins. The rate of data generation has far surpassed that of protein annotation. The volume of protein data makes manual curation infeasible, whereas a high compute cost limits the utility of existing automated approaches. In this work, we present an improved and optimized automated workflow to enable large-scale protein annotation. The workflow uses high performance computing architectures and a low complexity classification algorithm to assign proteins into existing clusters of orthologous groups of proteins. On the basis of the Position-Specific Iterative Basic Local Alignment Search Tool the algorithm ensures at least 80% specificity and sensitivity of the resulting classifications. The workflow utilizes highly scalable parallel applications for classification and sequence alignment. Using Extreme Science and Engineering Discovery Environment supercomputers, the workflow processed 1,200,000 newly sequenced bacterial proteins. With the rapid expansion of the protein sequence universe, the proposed workflow will enable scientists to annotate big genome data. PMID:25313296

  16. Supercomputers Ready for Use as Discovery Machines for Neuroscience

    PubMed Central

    Helias, Moritz; Kunkel, Susanne; Masumoto, Gen; Igarashi, Jun; Eppler, Jochen Martin; Ishii, Shin; Fukai, Tomoki; Morrison, Abigail; Diesmann, Markus

    2012-01-01

    NEST is a widely used tool to simulate biological spiking neural networks. Here we explain the improvements, guided by a mathematical model of memory consumption, that enable us to exploit for the first time the computational power of the K supercomputer for neuroscience. Multi-threaded components for wiring and simulation combine 8 cores per MPI process to achieve excellent scaling. K is capable of simulating networks corresponding to a brain area with 10(8) neurons and 10(12) synapses in the worst case scenario of random connectivity; for larger networks of the brain its hierarchical organization can be exploited to constrain the number of communicating computer nodes. We discuss the limits of the software technology, comparing maximum filling scaling plots for K and the JUGENE BG/P system. The usability of these machines for network simulations has become comparable to running simulations on a single PC. Turn-around times in the range of minutes even for the largest systems enable a quasi interactive working style and render simulations on this scale a practical tool for computational neuroscience. PMID:23129998

  17. Advanced Information Technology Investments at the NASA Earth Science Technology Office

    NASA Astrophysics Data System (ADS)

    Clune, T.; Seablom, M. S.; Moe, K.

    2012-12-01

    The NASA Earth Science Technology Office (ESTO) regularly makes investments for nurturing advanced concepts in information technology to enable rapid, low-cost acquisition, processing and visualization of Earth science data in support of future NASA missions and climate change research. In 2012, the National Research Council published a mid-term assessment of the 2007 decadal survey for future space missions supporting Earth science and applications [1]. The report stated, "Earth sciences have advanced significantly because of existing observational capabilities and the fruit of past investments, along with advances in data and information systems, computer science, and enabling technologies." The report found that NASA had responded favorably and aggressively to the decadal survey and noted the role of the recent ESTO solicitation for information systems technologies that partnered with the NASA Applied Sciences Program to support the transition into operations. NASA's future missions are key stakeholders for the ESTO technology investments. Also driving these investments is the need for the Agency to properly address questions regarding the prediction, adaptation, and eventual mitigation of climate change. The Earth Science Division has championed interdisciplinary research, recognizing that the Earth must be studied as a complete system in order to address key science questions [2]. Information technology investments in the low-mid technology readiness level (TRL) range play a key role in meeting these challenges. ESTO's Advanced Information Systems Technology (AIST) program invests in higher risk / higher reward technologies that solve the most challenging problems of the information processing chain. This spans the space segment, where the information pipeline begins, to the end user, where knowledge is ultimately advanced. The objectives of the program are to reduce the risk, cost, size, and development time of Earth Science space-based and ground-based systems, increase the accessibility and utility of science data, and to enable new observation measurements and information products. We will discuss the ESTO investment strategy for information technology development, the methods used to assess stakeholder needs and technology advancements, and technology partnerships to enhance the infusion of the resulting technology. We also describe specific investments and their potential impact on enabling NASA missions and scientific discovery. [1] "Earth Science and Applications from Space: A Midterm Assessment of NASA's Implementation of the Decadal Survey", 2012: National Academies Press, http://www.nap.edu/catalog.php?record_id=13405 [2] "Responding to the Challenge of Climate and Environmental Change: NASA's Plan for a Climate-Centric Architecture for Earth Observations and Applications from Space", 2010: NASA Tech Memo, http://science.nasa.gov/media/medialibrary/2010/07/01/Climate_Architecture_Final.pdf

  18. Copyright Ownership in a Networked Multimedia Environment

    NASA Technical Reports Server (NTRS)

    Williams, Vernon E.

    1994-01-01

    The explosion of computer communications in the United States has spurred the development of many new technologies. One of these new technologies is Mosaic and the World-Wide Web. Mosaic is a user interface that uses the internet as a backbone for communications. The Mosaic interface enables a user to manipulate text, images and graphics produced by different authors. The flexibility that Mosaic offers raises significant copyright issues. This paper attempts to analyze these issues using current copyright law as a framework. The author then goes on to offer a different analysis that may result from future developments in copyright law.

  19. Perioperative nurse training in cardiothoracic surgical robotics.

    PubMed

    Connor, M A; Reinbolt, J A; Handley, P J

    2001-12-01

    The exponential growth of OR technology during the past 10 years has placed increased demands on perioperative nurses. Proficiency is required not only in patient care but also in the understanding, operating, and troubleshooting of video systems, computers, and cutting edge medical devices. The formation of a surgical team dedicated to robotically assisted cardiac surgery requires careful selection, education, and hands-on practice. This article details the six-week training process undertaken at Sarasota Memorial Hospital, Sarasota, Fla, which enabled staff members to deliver excellent patient care with a high degree of confidence in themselves and the robotic technology.

  20. Pruning a decision tree for selecting computer-related assistive devices for people with disabilities.

    PubMed

    Chi, Chia-Fen; Tseng, Li-Kai; Jang, Yuh

    2012-07-01

    Many disabled individuals lack extensive knowledge about assistive technology, which could help them use computers. In 1997, Denis Anson developed a decision tree of 49 evaluative questions designed to evaluate the functional capabilities of the disabled user and choose an appropriate combination of assistive devices, from a selection of 26, that enables the individual to use a computer. In general, occupational therapists guide disabled users through this process. They often have to go over repetitive questions in order to find an appropriate device. A disabled user may require an alphanumeric entry device, a pointing device, an output device, a performance enhancement device, or some combination of these. Therefore, the current research eliminates redundant questions and divides Anson's decision tree into multiple independent subtrees to meet the actual demands of computer users with disabilities. The modified decision tree was tested by six disabled users to prove it can determine a complete set of assistive devices with a smaller number of evaluative questions. A means to insert new categories of computer-related assistive devices was included to ensure the decision tree can be expanded and updated. The current decision tree can help disabled users and assistive technology practitioners find appropriate computer-related assistive devices that meet clients' individual needs in an efficient manner.
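
    As a purely illustrative sketch of the independent-subtree idea, the Python snippet below walks one hypothetical pointing-device subtree with yes/no questions, so that only the questions on the relevant path are asked; the questions and device names are invented and are not Anson's actual decision tree.

        # Hypothetical pointing-device subtree; questions and devices are invented
        # for illustration and are not Anson's actual decision tree.
        class Node:
            def __init__(self, question=None, yes=None, no=None, device=None):
                self.question, self.yes, self.no, self.device = question, yes, no, device

        pointing_subtree = Node(
            question="Can the user operate a standard mouse accurately?",
            yes=Node(device="standard mouse"),
            no=Node(
                question="Does the user have reliable head movement?",
                yes=Node(device="head-tracking pointer"),
                no=Node(device="switch-based scanning pointer"),
            ),
        )

        def recommend(node, answer_fn):
            """Walk one independent subtree, asking only the questions on its path."""
            while node.device is None:
                node = node.yes if answer_fn(node.question) else node.no
            return node.device

        # Example run with canned answers standing in for a clinician interview
        answers = {"Can the user operate a standard mouse accurately?": False,
                   "Does the user have reliable head movement?": True}
        print(recommend(pointing_subtree, lambda q: answers[q]))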
