Science.gov

Sample records for adaptive computer technologies

  1. Reconfigurable environmentally adaptive computing

    NASA Technical Reports Server (NTRS)

    Coxe, Robin L. (Inventor); Galica, Gary E. (Inventor)

    2008-01-01

    Described are methods and apparatus, including computer program products, for reconfigurable environmentally adaptive computing technology. An environmental signal representative of an external environmental condition is received. A processing configuration is automatically selected, based on the environmental signal, from a plurality of processing configurations. A reconfigurable processing element is reconfigured to operate according to the selected processing configuration. In some examples, the environmental condition is detected and the environmental signal is generated based on the detected condition.
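The selection step the abstract describes can be sketched in a few lines. This is an illustrative analogue only, not code from the patent; the configuration names, thresholds, and fields below are all hypothetical.

```python
# Hypothetical sketch of the claimed flow: an environmental condition is
# detected, an environmental signal is generated from it, and a processing
# configuration is automatically selected from a plurality of configurations.

from dataclasses import dataclass

@dataclass
class ProcessingConfig:
    name: str
    clock_mhz: int

# A plurality of processing configurations, keyed by environmental regime.
CONFIGS = {
    "cold":    ProcessingConfig("low-power", 100),
    "nominal": ProcessingConfig("standard", 400),
    "hot":     ProcessingConfig("throttled", 200),
}

def classify_environment(temp_c: float) -> str:
    """Generate an environmental signal from a detected condition."""
    if temp_c < 0:
        return "cold"
    if temp_c > 70:
        return "hot"
    return "nominal"

def select_configuration(temp_c: float) -> ProcessingConfig:
    """Automatically select a configuration based on the signal."""
    return CONFIGS[classify_environment(temp_c)]
```

A reconfigurable processing element would then be switched to the returned configuration; that hardware step is outside the scope of this sketch.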

  2. Using Neural Net Technology To Enhance the Efficiency of a Computer Adaptive Testing Application.

    ERIC Educational Resources Information Center

    Van Nelson, C.; Henriksen, Larry W.

    The potential for computer adaptive testing (CAT) has been well documented. In order to improve the efficiency of this process, it may be possible to utilize a neural network, or more specifically, a back propagation neural network. The paper asserts that in order to accomplish this end, it must be shown that grouping examinees by ability as…

  3. Adaptive Technology that Provides Access to Computers. DO-IT Program.

    ERIC Educational Resources Information Center

    Washington Univ., Seattle.

    This brochure describes the different types of barriers individuals with mobility impairments, blindness, low vision, hearing impairments, and specific learning disabilities face in providing computer input, interpreting output, and reading documentation. The adaptive hardware and software that has been developed to provide functional alternatives…

  4. Technology transfer for adaptation

    NASA Astrophysics Data System (ADS)

    Biagini, Bonizella; Kuhl, Laura; Gallagher, Kelly Sims; Ortiz, Claudia

    2014-09-01

    Technology alone will not be able to solve adaptation challenges, but it is likely to play an important role. As a result of the role of technology in adaptation and the importance of international collaboration for climate change, technology transfer for adaptation is a critical but understudied issue. Through an analysis of Global Environment Facility-managed adaptation projects, we find there is significantly more technology transfer occurring in adaptation projects than might be expected given the pessimistic rhetoric surrounding technology transfer for adaptation. Most projects focused on demonstration and early deployment/niche formation for existing technologies rather than earlier stages of innovation, which is understandable considering the pilot nature of the projects. Key challenges for the transfer process, including technology selection and appropriateness under climate change, markets and access to technology, and diffusion strategies are discussed in more detail.

  5. Architecture Adaptive Computing Environment

    NASA Technical Reports Server (NTRS)

    Dorband, John E.

    2006-01-01

    Architecture Adaptive Computing Environment (aCe) is a software system that includes a language, compiler, and run-time library for parallel computing. aCe was developed to enable programmers to write programs, more easily than was previously possible, for a variety of parallel computing architectures. Heretofore, it has been difficult to write parallel programs for parallel computers and more difficult to port them to different parallel computing architectures. In contrast, aCe is supportable on all high-performance computing architectures. Currently, it is supported on Linux clusters. aCe uses parallel programming constructs that facilitate the writing of parallel programs. Such constructs were used in single-instruction/multiple-data (SIMD) programming languages of the 1980s, including Parallel Pascal, Parallel Forth, C*, *LISP, and MasPar MPL. In aCe, these constructs are extended and implemented for both SIMD and multiple-instruction/multiple-data (MIMD) architectures. Two new constructs incorporated in aCe are those of (1) scalar and virtual variables and (2) pre-computed paths. The scalar-and-virtual-variables construct increases flexibility in optimizing memory utilization in various architectures. The pre-computed-paths construct enables the compiler to pre-compute part of a communication operation once, rather than computing it every time the communication operation is performed.
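The pre-computed-paths idea can be illustrated outside aCe itself. The sketch below is a NumPy analogue, not aCe code: the index arithmetic for a nearest-neighbor "communication" on a ring is computed once and then reused on every step, rather than recomputed inside the loop.

```python
# Illustrative analogue of pre-computed paths: compute the communication
# pattern (here, gather indices for a left-neighbor shift on a ring) once,
# then reuse it for every communication operation.

import numpy as np

n = 8
# Pre-computed path: element i will read from its left neighbor (i-1) mod n.
left_neighbor = np.roll(np.arange(n), 1)

def step(values: np.ndarray) -> np.ndarray:
    # Reuse the pre-computed index array; no per-step index arithmetic.
    return values[left_neighbor]
```

In a real MIMD setting the pre-computed object would be a message schedule rather than an index array, but the saving is the same: the path is paid for once.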

  6. Computer Technology for Industry

    NASA Technical Reports Server (NTRS)

    1979-01-01

    In this age of the computer, more and more business firms are automating their operations for increased efficiency in a great variety of jobs, from simple accounting to managing inventories, from precise machining to analyzing complex structures. In the interest of national productivity, NASA is providing assistance both to longtime computer users and newcomers to automated operations. Through a special technology utilization service, NASA saves industry time and money by making available already developed computer programs which have secondary utility. A computer program is essentially a set of instructions which tells the computer how to produce desired information or effects by drawing upon its stored input. Developing a new program from scratch can be costly and time-consuming. Very often, however, a program developed for one purpose can readily be adapted to a totally different application. To help industry take advantage of existing computer technology, NASA operates the Computer Software Management and Information Center (COSMIC) (registered trademark), located at the University of Georgia. COSMIC maintains a large library of computer programs developed for NASA, the Department of Defense, the Department of Energy and other technology-generating agencies of the government. The Center gets a continual flow of software packages, screens them for adaptability to private sector usage, stores them and informs potential customers of their availability.

  7. Adaptive computing for people with disabilities.

    PubMed

    Merrow, S L; Corbett, C D

    1994-01-01

    Adaptive computing is a relatively new area, and little has been written in the nursing literature on the topic. "Adaptive computing" refers to the professional services and the technology (both hardware and software) that make computing technology accessible for persons with disabilities. Nurses in many settings such as schools, industry, rehabilitation facilities, and the community, can use knowledge of adaptive computing as they counsel, advise, and advocate for people with disabilities. Nurses with an awareness and knowledge of adaptive computing will be better able to promote high-level wellness for individuals with disabilities, thus maximizing their potential for an active, fulfilling life. People with different types of disabilities, including visual, mobility, hearing, learning, and communication disorders and acquired brain injuries, may benefit from computer adaptations. Disabled people encounter barriers to computing in six major areas: 1) the environment, 2) data entry, 3) information output, 4) technical documentation, 5) support, and 6) training. After a discussion of these barriers, the criteria for selecting appropriate adaptations and selected examples of adaptations are presented. Several case studies illustrate the evaluation process and the development of adaptive computer solutions. PMID:8082064

  8. Advanced Adaptive Optics Technology Development

    SciTech Connect

    Olivier, S

    2001-09-18

    The NSF Center for Adaptive Optics (CfAO) is supporting research on advanced adaptive optics technologies. CfAO research activities include development and characterization of micro-electro-mechanical systems (MEMS) deformable mirror (DM) technology, as well as development and characterization of high-resolution adaptive optics systems using liquid crystal (LC) spatial light modulator (SLM) technology. This paper presents an overview of the CfAO advanced adaptive optics technology development activities including current status and future plans.

  9. Ubiquitous Computing Technologies in Education

    ERIC Educational Resources Information Center

    Hwang, Gwo-Jen; Wu, Ting-Ting; Chen, Yen-Jung

    2007-01-01

    The prosperous development of wireless communication and sensor technologies has attracted the attention of researchers from both computer and education fields. Various investigations have been made for applying the new technologies to education purposes, such that more active and adaptive learning activities can be conducted in the real world.…

  10. Adaptive Technologies for Training and Education

    ERIC Educational Resources Information Center

    Durlach, Paula J., Ed.; Lesgold, Alan M., Ed.

    2012-01-01

    This edited volume provides an overview of the latest advancements in adaptive training technology. Intelligent tutoring has been deployed for well-defined and relatively static educational domains such as algebra and geometry. However, this adaptive approach to computer-based training has yet to come into wider usage for domains that are less…

  11. Optimizing Computer Technology Integration

    ERIC Educational Resources Information Center

    Dillon-Marable, Elizabeth; Valentine, Thomas

    2006-01-01

    The purpose of this study was to better understand what optimal computer technology integration looks like in adult basic skills education (ABSE). One question guided the research: How is computer technology integration best conceptualized and measured? The study used the Delphi method to map the construct of computer technology integration and…

  12. Computer Access. Tech Use Guide: Using Computer Technology.

    ERIC Educational Resources Information Center

    Council for Exceptional Children, Reston, VA. Center for Special Education Technology.

    One of nine brief guides for special educators on using computer technology, this guide focuses on access including adaptations in input devices, output devices, and computer interfaces. Low technology devices include "no-technology" devices (usually modifications to existing devices), simple switches, and multiple switches. High technology input…

  13. Computers: Educational Technology Paradox?

    ERIC Educational Resources Information Center

    Hashim, Hajah Rugayah Hj.; Mustapha, Wan Narita

    2005-01-01

    As we move further into the new millennium, the need to engage learners with new technology and help them adapt to it has been the main aim of many institutions of higher learning in Malaysia. The involvement of the government in huge technology-based projects like the Multimedia Super Corridor Highway (MSC) and one of its flagships, the Smart Schools have…

  14. Computational adaptive optics of the human retina

    NASA Astrophysics Data System (ADS)

    South, Fredrick A.; Liu, Yuan-Zhi; Carney, P. Scott; Boppart, Stephen A.

    2016-03-01

    It is well known that patient-specific ocular aberrations limit imaging resolution in the human retina. Previously, hardware adaptive optics (HAO) has been employed to measure and correct these aberrations to acquire high-resolution images of various retinal structures. While the resulting aberration-corrected images are of great clinical importance, clinical use of HAO has not been widespread due to the cost and complexity of these systems. We present a technique termed computational adaptive optics (CAO) for aberration correction in the living human retina without the use of hardware adaptive optics components. In CAO, complex interferometric data acquired using optical coherence tomography (OCT) is manipulated in post-processing to adjust the phase of the optical wavefront. In this way, the aberrated wavefront can be corrected. We summarize recent results in this technology for retinal imaging, including aberration-corrected imaging in multiple retinal layers and practical considerations such as phase stability and image optimization.
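The core CAO operation, adjusting the phase of the optical wavefront in post-processing, can be sketched with a toy model. This is an assumed model for illustration, not the authors' code: an aberration is represented as a phase screen applied to the complex field in the spatial-frequency (pupil) domain, and correction multiplies by the conjugate phase.

```python
# Toy model of computational aberration correction: a pupil-plane phase
# screen is applied to a complex image field via the 2-D FFT, and removed
# in post-processing by applying the opposite phase.

import numpy as np

def apply_pupil_phase(field: np.ndarray, phase: np.ndarray) -> np.ndarray:
    """Apply a pupil-plane phase screen to a complex image field."""
    spectrum = np.fft.fft2(field)
    return np.fft.ifft2(spectrum * np.exp(1j * phase))

rng = np.random.default_rng(0)
truth = rng.standard_normal((64, 64)) + 1j * rng.standard_normal((64, 64))

# Quadratic phase standing in for defocus (illustrative coefficient only).
fy, fx = np.meshgrid(np.fft.fftfreq(64), np.fft.fftfreq(64), indexing="ij")
defocus = 50.0 * (fx**2 + fy**2)

aberrated = apply_pupil_phase(truth, defocus)
corrected = apply_pupil_phase(aberrated, -defocus)   # the CAO correction step
```

In practice the phase is unknown and must be estimated (e.g., by image-quality optimization), and phase stability of the interferometric acquisition is what makes the correction meaningful, as the abstract notes.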

  15. Access to College for All: ITAC Project--Computer and Adaptive Computer Technologies in the Cegeps for Students with Disabilities = L'accessibilite au cegep pour tous: Projet ITAC--informatique et technologies adaptees dans les cegeps pour les etudiants handicapes.

    ERIC Educational Resources Information Center

    Fichten, Catherine S.; Barile, Maria

    This report discusses outcomes of three empirical studies which investigated the computer and adaptive computer technology needs and concerns of Quebec college students with various disabilities, professors, and individuals responsible for providing services to students with disabilities. Key findings are highlighted and recommendations are made…

  16. Computers boost structural technology

    NASA Technical Reports Server (NTRS)

    Noor, Ahmed K.; Venneri, Samuel L.

    1989-01-01

    Derived from matrix methods of structural analysis and finite element methods developed over the last three decades, computational structures technology (CST) blends computer science, numerical analysis, and approximation theory into structural analysis and synthesis. Recent significant advances in CST include stochastic-based modeling, strategies for performing large-scale structural calculations on new computing systems, and the integration of CST with other disciplinary modules for multidisciplinary analysis and design. New methodologies have been developed at NASA for integrated fluid-thermal structural analysis and integrated aerodynamic-structure-control design. The need for multiple views of data for different modules also led to the development of a number of sophisticated data-base management systems. For CST to play a role in the future development of structures technology and in the multidisciplinary design of future flight vehicles, major advances and computational tools are needed in a number of key areas.

  17. QPSO-Based Adaptive DNA Computing Algorithm

    PubMed Central

    Karakose, Mehmet; Cigdem, Ugur

    2013-01-01

    DNA (deoxyribonucleic acid) computing, a computation model that uses DNA molecules for information storage, has increasingly been applied to optimization and data analysis in recent years. However, the DNA computing algorithm has limitations in terms of convergence speed, adaptability, and effectiveness. In this paper, a new approach for the improvement of DNA computing is proposed. This approach aims to run the DNA computing algorithm with parameters adapted toward the desired goal using quantum-behaved particle swarm optimization (QPSO). The contributions of the proposed QPSO-based adaptive DNA computing algorithm are as follows: (1) the population size, crossover rate, maximum number of operations, enzyme and virus mutation rates, and fitness function of the DNA computing algorithm are tuned simultaneously for adaptive operation; (2) the adaptive algorithm uses QPSO for goal-driven progress, faster operation, and flexibility with respect to data; and (3) a numerical realization of the DNA computing algorithm with the proposed approach is implemented for system identification. Two experiments with different systems were carried out to evaluate the performance of the proposed approach, with comparative results. Experimental results obtained with Matlab and FPGA implementations demonstrate effective optimization, considerable convergence speed, and high accuracy relative to the standard DNA computing algorithm. PMID:23935409
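A minimal QPSO loop, the optimizer the paper uses to tune its DNA-algorithm parameters, can be sketched as follows. This is the standard quantum-behaved PSO update (mean-best attractor with a delta-potential-well sampling step), not the paper's full adaptive pipeline; here it simply minimizes a sphere function.

```python
# Minimal QPSO sketch: each particle is drawn around a local attractor
# (a random mix of its personal best and the global best), with a step
# scaled by its distance to the mean of all personal bests (mbest).

import numpy as np

def qpso(objective, dim=2, n_particles=20, iters=200, beta=0.75, seed=1):
    rng = np.random.default_rng(seed)
    x = rng.uniform(-5, 5, (n_particles, dim))
    pbest = x.copy()
    pbest_f = np.array([objective(p) for p in pbest])
    for _ in range(iters):
        gbest = pbest[pbest_f.argmin()]
        mbest = pbest.mean(axis=0)                  # mean best position
        phi = rng.random((n_particles, dim))
        attractor = phi * pbest + (1 - phi) * gbest
        u = rng.random((n_particles, dim))
        sign = np.where(rng.random((n_particles, dim)) < 0.5, 1.0, -1.0)
        x = attractor + sign * beta * np.abs(mbest - x) * np.log(1.0 / u)
        f = np.array([objective(xi) for xi in x])
        improved = f < pbest_f
        pbest[improved] = x[improved]
        pbest_f[improved] = f[improved]
    return pbest[pbest_f.argmin()], pbest_f.min()

best_x, best_f = qpso(lambda v: float((v**2).sum()))
```

In the paper's setting, the objective would score a full run of the DNA computing algorithm under a candidate parameter vector (population size, crossover rate, mutation rates, etc.).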

  18. Computer Technology and Education.

    ERIC Educational Resources Information Center

    Senter, Joy

    1981-01-01

    Examines educational tasks in general computing, including computer-assisted instruction, computer-managed instruction, word processing, secretarial and business applications, time sharing, and networking to larger computers. (CT)

  19. Adaptable Deployable Entry and Placement Technology (ADEPT)

    NASA Video Gallery

    The Adaptable, Deployable Entry Placement Technology (ADEPT) Project will test and demonstrate a deployable aeroshell concept as a viable thermal protection system for entry, descent, and landing o...

  20. Computer Technology in Adult Education.

    ERIC Educational Resources Information Center

    Slider, Patty; Hodges, Kathy; Carter, Cea; White, Barbara

    This publication provides materials to help adult educators use computer technology in their teaching. Section 1, Computer Basics, contains activities and materials on these topics: increasing computer literacy, computer glossary, parts of a computer, keyboard, disk care, highlighting text, scrolling and wrap-around text, setting up text,…

  1. Computing technology in the 1980's. [computers

    NASA Technical Reports Server (NTRS)

    Stone, H. S.

    1978-01-01

    Advances in computing technology have been led by consistently improving semiconductor technology. The semiconductor industry has turned out ever faster, smaller, and less expensive devices since transistorized computers were first introduced 20 years ago. For the next decade, there appear to be new advances possible, with the rate of introduction of improved devices at least equal to the historic trends. The implication of these projections is that computers will enter new markets and will truly be pervasive in business, home, and factory as their cost diminishes and their computational power expands to new levels. The computer industry as we know it today will be greatly altered in the next decade, primarily because the raw computer system will give way to computer-based turn-key information and control systems.

  2. Principals' Relationship with Computer Technology

    ERIC Educational Resources Information Center

    Brockmeier, Lantry L.; Sermon, Janet M.; Hope, Warren C.

    2005-01-01

    This investigation sought information about principals and their relationship with computer technology. Several questions were fundamental to the inquiry. Are principals prepared to facilitate the attainment of technology's promise through the integration of computer technology into the teaching and learning process? Are principals prepared to use…

  3. Computer Viruses. Technology Update.

    ERIC Educational Resources Information Center

    Ponder, Tim, Comp.; Ropog, Marty, Comp.; Keating, Joseph, Comp.

    This document provides general information on computer viruses, how to help protect a computer network from them, and measures to take if a computer becomes infected. Highlights include the origins of computer viruses; virus contraction; a description of some common virus types (File Virus, Boot Sector/Partition Table Viruses, Trojan Horses, and…

  4. Computers and Technological Forecasting

    ERIC Educational Resources Information Center

    Martino, Joseph P.

    1971-01-01

    Forecasting is becoming increasingly automated, thanks in large measure to the computer. It is now possible for a forecaster to submit his data to a computation center and call for the appropriate program. (No knowledge of statistics is required.) (Author)

  5. Computer Technology Directory.

    ERIC Educational Resources Information Center

    Exceptional Parent, 1990

    1990-01-01

    This directory lists approximately 300 commercial vendors that offer computer hardware, software, and communication aids for children with disabilities. The company listings indicate computer compatibility and specific disabilities served by their products. (JDD)

  6. [Earth Science Technology Office's Computational Technologies Project

    NASA Technical Reports Server (NTRS)

    Fischer, James (Technical Monitor); Merkey, Phillip

    2005-01-01

    This grant supported the effort to characterize the problem domain of the Earth Science Technology Office's Computational Technologies (CT) Project and to engage the Beowulf cluster computing community as well as the high-performance computing research community, so that the applicability of these technologies to the scientific community represented by the CT Project could be predicted and long-term strategies formulated to provide the computational resources necessary to attain the Project's anticipated scientific objectives. Specifically, the goal of the evaluation effort is to use the information gathered over the course of the Round-3 investigations to quantify trends in scientific expectations, algorithmic requirements, and the capabilities of high-performance computers to satisfy the anticipated need.

  7. Computer and information technology: hardware.

    PubMed

    O'Brien, D

    1998-02-01

    Computers open the door to an ever-expanding arena of knowledge and technology. Most nurses practicing in the perianesthesia setting were educated before the computer era, and many fear computers and the associated technology. Frequently, the greatest difficulty is finding the resources and knowing what questions to ask. The following is the first in a series of articles on computers and information technology. This article discusses computer hardware to get the novice started or the experienced user upgraded to access new technologies and the Internet. Future articles will discuss start-up and usual software applications, getting up to speed on the information superhighway, and other technologies that will broaden our knowledge and expand our personal and professional world. PMID:9543967

  8. Technological Innovation: Teacher Preparation, Adaptability, and Effectiveness.

    ERIC Educational Resources Information Center

    Hurley, Noel; Mundy, Pamela

    This study examined elementary teachers' perceptions about their preparation for efficient, effective implementation of technology, the adaptability of technology to teaching style, and the effect on students of their technology use, investigating whether there was a correlation between those three variables. This work also examined the effects of…

  9. Computer Technology for Industry.

    ERIC Educational Resources Information Center

    Aviation/Space, 1982

    1982-01-01

    A special National Aeronautics and Space Administration (NASA) service is contributing to national productivity by providing industry with reusable, low-cost, government-developed computer programs. Located at the University of Georgia, NASA's Computer Software Management and Information Center (COSMIC) has developed programs for equipment…

  10. Decoding Technology: Computer Shortcuts

    ERIC Educational Resources Information Center

    Walker, Tim; Donohue, Chip

    2008-01-01

    For the typical early childhood administrator, there will never be enough hours in a day to finish the work that needs to be done. This includes numerous hours spent on a computer tracking enrollment, managing the budget, researching curriculum ideas online, and many other administrative tasks. Improving an administrator's computer efficiency can…

  11. Computer Technology and Nursing Education.

    ERIC Educational Resources Information Center

    Southern Council on Collegiate Education for Nursing, Atlanta, GA.

    The influences of computer technology on college nursing education programs and health care delivery systems are discussed in eight papers. The use of computers is considered, with attention to clinical care, nursing education and continuing education, administration, and research. Attention is also directed to basic computer terminology, computer…

  12. Computers, Technology, and Disability. [Update.

    ERIC Educational Resources Information Center

    American Council on Education, Washington, DC. HEATH Resource Center.

    This paper describes programs and resources that focus on access of postsecondary students with disabilities to computers and other forms of technology. Increased access to technological devices and services is provided to students with disabilities under the Technology-Related Assistance for Individuals with Disabilities Act (Tech Act). Section…

  13. Parallel computations and control of adaptive structures

    NASA Technical Reports Server (NTRS)

    Park, K. C.; Alvin, Kenneth F.; Belvin, W. Keith; Chong, K. P. (Editor); Liu, S. C. (Editor); Li, J. C. (Editor)

    1991-01-01

    The equations of motion for structures with adaptive elements for vibration control are formulated for parallel computation, to be used in a software package for real-time control of flexible space structures. A brief introduction to the state of the art in parallel computational capability is also presented. Time-marching strategies are developed for effective use of massively parallel mapping, partitioning, and the necessary arithmetic operations. An example is offered of the simulation of control-structure interaction on a parallel computer, and the impact of the presented approach on applications in disciplines other than the aerospace industry is assessed.
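The time-marching structure of such a simulation can be shown on an assumed scalar model (this is purely illustrative, not the paper's parallel formulation): a one-degree-of-freedom structure m·x″ + k·x = u, where the adaptive element supplies an active-damping control force u = −g·x′.

```python
# Semi-implicit Euler time marching of a controlled single-DOF structure.
# In the paper, an update of this kind is mapped and partitioned across a
# massively parallel machine; the marching loop itself is the same shape.

def march(m=1.0, k=4.0, g=0.8, dt=0.01, steps=5000, x0=1.0):
    x, v = x0, 0.0
    for _ in range(steps):
        u = -g * v                # control force from the adaptive element
        a = (u - k * x) / m       # acceleration from the equation of motion
        v += dt * a               # update velocity first (semi-implicit)
        x += dt * v               # then position
    return x, v

x_final, v_final = march()
```

With these (hypothetical) parameters the closed loop is a damped oscillator, so the vibration decays toward rest.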

  14. Effects of Computer Technology

    ERIC Educational Resources Information Center

    Heydinger, Richard B.; Norris, Donald M.

    1976-01-01

    The expanding networks of computer hardware, software, and organizations for controlling them and the institutional data bases they access are described. Improvements are making the data sources accessible but raise some new problems for data managers and institutional researchers. (Author/LBH)

  15. Computer Technology for Industry

    NASA Technical Reports Server (NTRS)

    1982-01-01

    Shell Oil Company used a COSMIC program called VISCEL to ensure the accuracy of the company's new computer code for analyzing polymers and chemical compounds. Shell reported that there were no other programs available that could provide the necessary calculations. Shell produces chemicals for plastic products used in the manufacture of automobiles, housewares, appliances, film, textiles, electronic equipment and furniture.

  16. Computer Technology and Social Issues.

    ERIC Educational Resources Information Center

    Garson, G. David

    Computing involves social issues and political choices. Issues such as privacy, computer crime, gender inequity, disemployment, and electronic democracy versus "Big Brother" are addressed in the context of efforts to develop a national public policy for information technology. A broad range of research and case studies are examined in an attempt…

  17. Adaptive security systems -- Combining expert systems with adaptive technologies

    SciTech Connect

    Argo, P.; Loveland, R.; Anderson, K.

    1997-09-01

    The Adaptive Multisensor Integrated Security System (AMISS) uses a variety of computational intelligence techniques to reason from raw sensor data through an array of processing layers to arrive at an assessment for alarm/alert conditions based on human behavior within a secure facility. In this paper, the authors give an overview of the system and briefly describe some of the major components of the system. This system is currently under development and testing in a realistic facility setting.

  18. Trusted Computing Technologies, Intel Trusted Execution Technology.

    SciTech Connect

    Guise, Max Joseph; Wendt, Jeremy Daniel

    2011-01-01

    We describe the current state of the art in trusted computing technologies, focusing mainly on Intel's Trusted Execution Technology (TXT). This document is based on existing documentation and tests of two existing TXT-based systems: Intel's Trusted Boot and Invisible Things Lab's Qubes OS. We describe what features are lacking in current implementations, describe what a mature system could provide, and present a list of developments to watch. Critical systems perform operation-critical computations on high-importance data. In such systems, the inputs, computation steps, and outputs may be highly sensitive. Sensitive components must be protected from both unauthorized release and unauthorized alteration: unauthorized users should not access the sensitive input and output data, nor be able to alter them; the computation contains intermediate data with the same requirements, and executes algorithms that unauthorized users should not be able to know or alter. Due to various system requirements, such critical systems are frequently built from commercial hardware, employ commercial software, and require network access. These hardware, software, and network system components increase the risk that sensitive input data, computation, and output data may be compromised.

  19. Computational technology for high-temperature aerospace structures

    NASA Technical Reports Server (NTRS)

    Noor, A. K.; Card, M. F.

    1992-01-01

    The status and some recent developments of computational technology for high-temperature aerospace structures are summarized. Discussion focuses on a number of aspects, including: goals of computational technology for high-temperature structures; computational material modeling; life prediction methodology; computational modeling of high-temperature composites; error estimation and adaptive improvement strategies; strategies for solution of fluid flow/thermal/structural problems; probabilistic methods and stochastic modeling approaches; and integrated analysis and design. Recent trends in the high-performance computing environment are described, and the research areas with high potential for meeting future technological needs are identified.

  20. Techniques for grid manipulation and adaptation. [computational fluid dynamics

    NASA Technical Reports Server (NTRS)

    Choo, Yung K.; Eisemann, Peter R.; Lee, Ki D.

    1992-01-01

    Two approaches have been taken to provide systematic grid manipulation for improved grid quality. One is the control point form (CPF) of algebraic grid generation. It provides explicit control of the physical grid shape and grid spacing through the movement of the control points. It works well in the interactive computer graphics environment and hence can be a good candidate for integration with other emerging technologies. The other approach is grid adaptation using a numerical mapping between the physical space and a parametric space. Grid adaptation is achieved by modifying the mapping functions through the effects of grid control sources. The adaptation process can be repeated in a cyclic manner if satisfactory results are not achieved after a single application.
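The second approach, modifying the mapping between parametric and physical space under the influence of grid control sources, can be illustrated in one dimension. This sketch is not the paper's CPF method; it adapts a 1-D grid by equidistributing an assumed weight function, so cells shrink where the weight (playing the role of a control source) is large.

```python
# Mapping-based 1-D grid adaptation by equidistribution: the parametric
# space is the uniform index, and the index-to-physical mapping is
# modified by inverting the normalized cumulative weight.

import numpy as np

def adapt_grid(weight, n_cells=20, n_sample=2001):
    """Place n_cells+1 points on [0, 1] equidistributing `weight`."""
    xs = np.linspace(0.0, 1.0, n_sample)
    w = weight(xs)
    cum = np.concatenate(
        ([0.0], np.cumsum(0.5 * (w[1:] + w[:-1]) * np.diff(xs)))
    )
    cum /= cum[-1]                       # normalized cumulative weight
    targets = np.linspace(0.0, 1.0, n_cells + 1)
    return np.interp(targets, cum, xs)   # invert the mapping

# Cluster points near x = 0.5, where the (hypothetical) weight peaks.
grid = adapt_grid(lambda x: 1.0 + 20.0 * np.exp(-200.0 * (x - 0.5) ** 2))
```

As in the paper's cyclic procedure, the weight could be recomputed from the new grid's solution and the adaptation repeated until satisfactory.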

  1. Adaptable Computing Environment/Self-Assembling Software

    SciTech Connect

    Osbourn, Gordon C.; Bouchard, Ann M.; Bartholomew, John W.

    2007-09-25

    Complex software applications are difficult to learn to use and to remember how to use. Further, the user has no control over the functionality available in a given application. The software we use can be created and modified only by a relatively small group of elite, highly skilled artisans known as programmers. "Normal users" are powerless to create and modify software themselves, because the tools for software development, designed by and for programmers, are a barrier to entry. This software, when completed, will be a user-adaptable computing environment in which the user is really in control of his/her own software, able to adapt the system, make new parts of the system interactive, and even modify the behavior of the system itself. Some key features of the basic environment that have been implemented are (a) books in bookcases, where all data is stored, (b) context-sensitive compass menus (compass, because the buttons are located in compass directions relative to the mouse cursor position), (c) importing tabular data and displaying it in a book, (d) light-weight table querying/sorting, (e) a Reach&Get capability (sort of a "smart" copy/paste that prevents the user from copying invalid data), and (f) a LogBook that automatically logs all user actions that change data or the system itself. To bootstrap toward full end-user adaptability, we implemented a set of development tools. With the development tools, compass menus can be made and customized.

  2. Alberta Education's Computer Technology Project.

    ERIC Educational Resources Information Center

    Thiessen, Jim

    This description of activities initiated through the Computer Technology Project of the provincial education ministry in Alberta, Canada, covers the 2-year period beginning with establishment of the project by the Alberta Department of Education in October 1981. Activities described include: (1) the establishment of the Office of Educational…

  3. Optical Computers and Space Technology

    NASA Technical Reports Server (NTRS)

    Abdeldayem, Hossin A.; Frazier, Donald O.; Penn, Benjamin; Paley, Mark S.; Witherow, William K.; Banks, Curtis; Hicks, Rosilen; Shields, Angela

    1995-01-01

    The rapidly increasing demand for greater speed and efficiency on the information superhighway requires significant improvements over conventional electronic logic circuits. Optical interconnections and optical integrated circuits are strong candidates for overcoming the severe limitations that conventional electronic logic circuits impose on the growth of speed and complexity of today's computations. The new optical technology has increased the demand for high-quality optical materials. NASA's recent involvement in processing optical materials in space has demonstrated that a new and unique class of high-quality optical materials is processible in a microgravity environment. Microgravity processing can induce improved order in these materials and could have a significant impact on the development of optical computers. We will discuss NASA's role in processing these materials and report on some of the associated nonlinear optical properties, which are quite useful for optical computer technology.

  4. Center for Advanced Computational Technology

    NASA Technical Reports Server (NTRS)

    Noor, Ahmed K.

    2000-01-01

    The Center for Advanced Computational Technology (ACT) was established to serve as a focal point for diverse research activities pertaining to application of advanced computational technology to future aerospace systems. These activities include the use of numerical simulations, artificial intelligence methods, multimedia and synthetic environments, and computational intelligence, in the modeling, analysis, sensitivity studies, optimization, design and operation of future aerospace systems. The Center is located at NASA Langley and is an integral part of the School of Engineering and Applied Science of the University of Virginia. The Center has four specific objectives: 1) conduct innovative research on applications of advanced computational technology to aerospace systems; 2) act as pathfinder by demonstrating to the research community what can be done (high-potential, high-risk research); 3) help in identifying future directions of research in support of the aeronautical and space missions of the twenty-first century; and 4) help in the rapid transfer of research results to industry and in broadening awareness among researchers and engineers of the state-of-the-art in applications of advanced computational technology to the analysis, design prototyping and operations of aerospace and other high-performance engineering systems. In addition to research, Center activities include helping in the planning and coordination of the activities of a multi-center team of NASA and JPL researchers who are developing an intelligent synthesis environment for future aerospace systems; organizing workshops and national symposia; as well as writing state-of-the-art monographs and NASA special publications on timely topics.

  5. Adaptive Management of Computing and Network Resources for Spacecraft Systems

    NASA Technical Reports Server (NTRS)

    Pfarr, Barbara; Welch, Lonnie R.; Detter, Ryan; Tjaden, Brett; Huh, Eui-Nam; Szczur, Martha R. (Technical Monitor)

    2000-01-01

    It is likely that NASA's future spacecraft systems will consist of distributed processes which will handle dynamically varying workloads in response to perceived scientific events, the spacecraft environment, spacecraft anomalies and user commands. Since all situations and possible uses of sensors cannot be anticipated during pre-deployment phases, an approach for dynamically adapting the allocation of distributed computational and communication resources is needed. To address this, we are evolving the DeSiDeRaTa adaptive resource management approach to enable reconfigurable ground and space information systems. The DeSiDeRaTa approach embodies a set of middleware mechanisms for adapting resource allocations, and a framework for reasoning about the real-time performance of distributed application systems. The framework and middleware will be extended to accommodate (1) the dynamic aspects of intra-constellation network topologies, and (2) the complete real-time path from the instrument to the user. We are developing a ground-based testbed that will enable NASA to perform early evaluation of adaptive resource management techniques without the expense of first deploying them in space. The benefits of the proposed effort are numerous, including the ability to use sensors in new ways not anticipated at design time; the production of information technology that ties the sensor web together; the accommodation of greater numbers of missions with fewer resources; and the opportunity to leverage the DeSiDeRaTa project's expertise, infrastructure and models for adaptive resource management for distributed real-time systems.

  7. Shape threat detection via adaptive computed tomography

    NASA Astrophysics Data System (ADS)

    Masoudi, Ahmad; Thamvichai, Ratchaneekorn; Neifeld, Mark A.

    2016-05-01

    X-ray Computed Tomography (CT) is used widely for screening purposes. Conventional x-ray threat detection systems employ image reconstruction and segmentation algorithms prior to making threat/no-threat decisions. We find that in many cases these pre-processing steps can degrade detection performance. Therefore, in this work we investigate methods that operate directly on the CT measurements. We analyze a fixed-gantry system containing 25 x-ray sources and 2200 photon-counting detectors. We present a new method for improving threat detection performance: a greedy adaptive algorithm which at each time step uses information from previous measurements to design the next measurement. We utilize sequential hypothesis testing (SHT) to derive both the optimal "next measurement" and the stopping criterion that ensures a target probability of error Pe. We find that by selecting the next x-ray source according to such a greedy adaptive algorithm, we can reduce Pe by a factor of 42.4 relative to the conventional measurement sequence employing all 25 sources in turn.
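The sequential hypothesis testing (SHT) stopping rule this record relies on is, in its classical form, Wald's sequential probability ratio test. A minimal sketch with a toy Gaussian measurement model (not the paper's x-ray likelihoods, and without the adaptive source-selection step):

```python
import math

def sprt(samples, llr, alpha=0.01, beta=0.01):
    """Wald's sequential probability ratio test (SPRT).

    Accumulates the log-likelihood ratio of H1 (threat) vs. H0
    (no threat) one measurement at a time and stops as soon as the
    target error probabilities alpha and beta are met, using Wald's
    classical threshold approximations.
    """
    upper = math.log((1.0 - beta) / alpha)   # crossing it accepts H1
    lower = math.log(beta / (1.0 - alpha))   # crossing it accepts H0
    s = 0.0
    for n, x in enumerate(samples, start=1):
        s += llr(x)
        if s >= upper:
            return "threat", n
        if s <= lower:
            return "no-threat", n
    return "undecided", n

# toy model: unit-variance Gaussian measurements, H0 mean 0 vs. H1 mean 1;
# log N(x; 1, 1) - log N(x; 0, 1) simplifies to x - 1/2
llr = lambda x: x - 0.5
decision, n_used = sprt([1.0] * 20, llr)   # 20 measurements, all equal to 1.0
```

With these inputs the statistic climbs by 0.5 per measurement and crosses the upper threshold after 10 samples, illustrating how the stopping time adapts to the evidence.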

  8. Visual Impairments. Tech Use Guide: Using Computer Technology.

    ERIC Educational Resources Information Center

    Carr, Annette

    This guide describes adaptive technology for reading printed text and producing written material, to assist the student who has a visual impairment. The special technologies discussed include auditory text access, text enlargement, tactile text access, portable notetaking devices, and computer access. The guide concludes with lists of the…

  9. Some Practical Examples of Computer-Adaptive Sequential Testing.

    ERIC Educational Resources Information Center

    Luecht, Richard M.; Nungester, Ronald J.

    1998-01-01

    Describes an integrated approach to test development and administration called computer-adaptive sequential testing (CAST). CAST incorporates adaptive testing methods with automated test assembly. Describes the CAST framework and demonstrates several applications using a medical-licensure example. (SLD)

  10. Infinite possibilities: Computational structures technology

    NASA Astrophysics Data System (ADS)

    Beam, Sherilee F.

    1994-12-01

    Computational Fluid Dynamics (or CFD) methods are very familiar to the research community. Even the general public has had some exposure to CFD images, primarily through the news media. However, very little attention has been paid to CST--Computational Structures Technology. Yet, no important design can be completed without it. During the first half of this century, researchers only dreamed of designing and building structures on a computer. Today their dreams have become practical realities as computational methods are used in all phases of design, fabrication and testing of engineering systems. Increasingly complex structures can now be built in even shorter periods of time. Over the past four decades, computer technology has been developing, and early finite element methods have grown from small in-house programs to numerous commercial software programs. When coupled with advanced computing systems, they help engineers make dramatic leaps in designing and testing concepts. The goals of CST include: predicting how a structure will behave under actual operating conditions; designing and complementing other experiments conducted on a structure; investigating microstructural damage or chaotic, unpredictable behavior; helping material developers in improving material systems; and being a useful tool in design systems optimization and sensitivity techniques. Applying CST to a structure problem requires five steps: (1) observe the specific problem; (2) develop a computational model for numerical simulation; (3) develop and assemble software and hardware for running the codes; (4) post-process and interpret the results; and (5) use the model to analyze and design the actual structure. Researchers in both industry and academia continue to make significant contributions to advance this technology with improvements in software, collaborative computing environments and supercomputing systems. 
As these environments and systems evolve, computational structures technology will…

  11. Infinite possibilities: Computational structures technology

    NASA Technical Reports Server (NTRS)

    Beam, Sherilee F.

    1994-01-01

    Computational Fluid Dynamics (or CFD) methods are very familiar to the research community. Even the general public has had some exposure to CFD images, primarily through the news media. However, very little attention has been paid to CST--Computational Structures Technology. Yet, no important design can be completed without it. During the first half of this century, researchers only dreamed of designing and building structures on a computer. Today their dreams have become practical realities as computational methods are used in all phases of design, fabrication and testing of engineering systems. Increasingly complex structures can now be built in even shorter periods of time. Over the past four decades, computer technology has been developing, and early finite element methods have grown from small in-house programs to numerous commercial software programs. When coupled with advanced computing systems, they help engineers make dramatic leaps in designing and testing concepts. The goals of CST include: predicting how a structure will behave under actual operating conditions; designing and complementing other experiments conducted on a structure; investigating microstructural damage or chaotic, unpredictable behavior; helping material developers in improving material systems; and being a useful tool in design systems optimization and sensitivity techniques. Applying CST to a structure problem requires five steps: (1) observe the specific problem; (2) develop a computational model for numerical simulation; (3) develop and assemble software and hardware for running the codes; (4) post-process and interpret the results; and (5) use the model to analyze and design the actual structure. Researchers in both industry and academia continue to make significant contributions to advance this technology with improvements in software, collaborative computing environments and supercomputing systems. 
As these environments and systems evolve, computational structures technology will…

  12. ICAN Computer Code Adapted for Building Materials

    NASA Technical Reports Server (NTRS)

    Murthy, Pappu L. N.

    1997-01-01

    The NASA Lewis Research Center has been involved in developing composite micromechanics and macromechanics theories over the last three decades. These activities have resulted in several composite mechanics theories and structural analysis codes whose applications range from material behavior design and analysis to structural component response. One of these computer codes, the Integrated Composite Analyzer (ICAN), is designed primarily to address issues related to designing polymer matrix composites and predicting their properties - including hygral, thermal, and mechanical load effects. Recently, under a cost-sharing cooperative agreement with a Fortune 500 corporation, Master Builders Inc., ICAN was adapted to analyze building materials. The high costs and technical difficulties involved with the fabrication of continuous-fiber-reinforced composites sometimes limit their use. Particulate-reinforced composites can be thought of as a viable alternative. They are as easily processed to near-net shape as monolithic materials, yet have the improved stiffness, strength, and fracture toughness that is characteristic of continuous-fiber-reinforced composites. For example, particle-reinforced metal-matrix composites show great potential for a variety of automotive applications, such as disk brake rotors, connecting rods, cylinder liners, and other high-temperature applications. Building materials, such as concrete, can be thought of as one of the oldest materials in this category of multiphase, particle-reinforced materials. The adaptation of ICAN to analyze particle-reinforced composite materials involved the development of new micromechanics-based theories. A derivative of the ICAN code, ICAN/PART, was developed and delivered to Master Builders Inc. as a part of the cooperative activity.

  13. Inflatable technologies: Adaptability from dream to reality

    NASA Astrophysics Data System (ADS)

    Häuplik-Meusburger, Sandra; Sommer, Bernhard; Aguzzi, Manuela

    2009-09-01

    With the increasing investment in a sustained human presence beyond low Earth orbit, the interest in the development of lightweight structures once again comes to the fore. In general, lightweight structural concepts can include inflatable, erectable or deployable parts utilizing membranes, composites or hybrid concepts. These structures clearly offer great advantages, not only for structures on Earth, but especially for space stations and also for the building of habitats and associated infrastructure on the Moon and Mars. By applying intelligent, constructive and packaging concepts, they are particularly interesting because they combine maximum load capacity and minimum use of material with an increase in operational and habitable volume. In addition to the structural advantages, lightweight and adaptable structural concepts can be one of the most important strategies in assisting sustainable space exploration development. With the term "adaptable structural concepts" we refer to the ability to adapt to changing requirements, e.g. mission objectives, crew condition and technological developments. This paper presents a selection of innovative developments, both past and present, in lightweight and adaptable architectural concepts that look beyond deployability and enable innovative building structures; their features are examined through detailed discussion and examples.

  14. Computer technologies and institutional memory

    NASA Technical Reports Server (NTRS)

    Bell, Christopher; Lachman, Roy

    1989-01-01

    NASA programs for manned space flight are in their 27th year. Scientists and engineers who worked continuously on the development of aerospace technology during that period are approaching retirement. The resulting loss to the organization will be considerable. Although this problem is general to the NASA community, the problem was explored in terms of the institutional memory and technical expertise of a single individual in the Man-Systems division. The main domain of the expert was spacecraft lighting, which became the subject area for analysis in these studies. The report starts with an analysis of the cumulative expertise and institutional memory of technical employees of organizations such as NASA. A set of solutions to this problem are examined and found inadequate. Two solutions were investigated at length: hypertext and expert systems. Illustrative examples were provided of hypertext and expert system representation of spacecraft lighting. These computer technologies can be used to ameliorate the problem of the loss of invaluable personnel.

  15. Military engine computational structures technology

    NASA Technical Reports Server (NTRS)

    Thomson, Daniel E.

    1992-01-01

    Integrated High Performance Turbine Engine Technology Initiative (IHPTET) goals require a strong analytical base. Effective analysis of composite materials is critical to life analysis and structural optimization. Accurate life prediction for all material systems is critical. User-friendly systems are also desirable. Post-processing of results is very important. The IHPTET goal is to double turbine engine propulsion capability by the year 2003. Fifty percent of the goal will come from advanced materials and structures; the other 50 percent will come from increased performance. Computer programs are listed.

  16. A Very High Order, Adaptable MESA Implementation for Aeroacoustic Computations

    NASA Technical Reports Server (NTRS)

    Dyson, Roger W.; Goodrich, John W.

    2000-01-01

    Since computational efficiency and wave resolution scale with accuracy, the ideal would be infinitely high accuracy for problems with widely varying wavelength scales. Currently, many computational aeroacoustics methods are limited to 4th-order-accurate Runge-Kutta methods in time, which limits their resolution and efficiency. However, a new procedure for implementing the Modified Expansion Solution Approximation (MESA) schemes, based upon Hermitian divided differences, is presented, which extends the effective accuracy of the MESA schemes to 57th order in space and time when using 128-bit floating point precision. This new approach has the advantages of reducing round-off error, being easy to program, and being more computationally efficient than previous approaches. Its accuracy is limited only by the floating point hardware. The advantages of this new approach are demonstrated by solving the linearized Euler equations in an open bi-periodic domain. A 500th-order MESA scheme can now be created in seconds, making these schemes ideally suited for the next generation of high-performance 256-bit (double quadruple) or higher precision computers. This ease of creation makes it possible to adapt the algorithm to the mesh in time instead of its converse, which is ideal for resolving the varying wavelength scales that occur in noise generation simulations. Finally, the sources of round-off error that affect the very high order methods are examined, and remedies are provided that effectively increase the accuracy of the MESA schemes while using current computer technology.

  17. Test Anxiety, Computer-Adaptive Testing and the Common Core

    ERIC Educational Resources Information Center

    Colwell, Nicole Makas

    2013-01-01

    This paper highlights the current findings and issues regarding the role of computer-adaptive testing in test anxiety. The computer-adaptive test (CAT) proposed by one of the Common Core consortia brings these issues to the forefront. Research has long indicated that test anxiety impairs student performance. More recent research indicates that…

  18. Center for Computational Structures Technology

    NASA Technical Reports Server (NTRS)

    Noor, Ahmed K.; Perry, Ferman W.

    1995-01-01

    The Center for Computational Structures Technology (CST) is intended to serve as a focal point for the diverse CST research activities. The CST activities include the use of numerical simulation and artificial intelligence methods in modeling, analysis, sensitivity studies, and optimization of flight-vehicle structures. The Center is located at NASA Langley and is an integral part of the School of Engineering and Applied Science of the University of Virginia. The key elements of the Center are: (1) conducting innovative research on advanced topics of CST; (2) acting as pathfinder by demonstrating to the research community what can be done (high-potential, high-risk research); (3) strong collaboration with NASA scientists and researchers from universities and other government laboratories; and (4) rapid dissemination of CST to industry, through integration of industrial personnel into the ongoing research efforts.

  19. Implementation of Multispectral Image Classification on a Remote Adaptive Computer

    NASA Technical Reports Server (NTRS)

    Figueiredo, Marco A.; Gloster, Clay S.; Stephens, Mark; Graves, Corey A.; Nakkar, Mouna

    1999-01-01

    As the demand for higher-performance computers for the processing of remote sensing science algorithms increases, the need to investigate new computing paradigms is justified. Field Programmable Gate Arrays (FPGAs) enable the implementation of algorithms at the hardware gate level, leading to orders-of-magnitude performance increases over microprocessor-based systems. The automatic classification of spaceborne multispectral images is an example of a computation-intensive application that can benefit from implementation on an FPGA-based custom computing machine (adaptive or reconfigurable computer). A probabilistic neural network is used here to classify pixels of a multispectral LANDSAT-2 image. The implementation described utilizes Java client/server application programs to access the adaptive computer from a remote site. Results verify that a remote hardware version of the algorithm (implemented on an adaptive computer) is significantly faster than a local software version of the same algorithm implemented on a typical general-purpose computer.
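A probabilistic neural network of the kind this record mentions is essentially a Parzen-window classifier: one Gaussian kernel per training sample, summed per class. A small software sketch with synthetic 4-band "spectral" data (the class names, data, and smoothing parameter are illustrative, not taken from the paper):

```python
import numpy as np

def pnn_classify(train_x, train_y, test_x, sigma=0.1):
    """Probabilistic neural network: a Parzen-window density estimate
    per class; each test pixel is assigned the class whose training
    samples give the largest summed Gaussian kernel response."""
    classes = np.unique(train_y)
    scores = []
    for c in classes:
        xc = train_x[train_y == c]                       # pattern layer for class c
        d2 = ((test_x[:, None, :] - xc[None, :, :]) ** 2).sum(-1)
        scores.append(np.exp(-d2 / (2.0 * sigma ** 2)).sum(axis=1))  # summation layer
    return classes[np.argmax(scores, axis=0)]            # decision layer

# toy two-class problem in a 4-band feature space (e.g. "water" vs. "soil")
rng = np.random.default_rng(1)
train_x = np.vstack([rng.normal(0.2, 0.05, (50, 4)),    # class 0 samples
                     rng.normal(0.8, 0.05, (50, 4))])   # class 1 samples
train_y = np.array([0] * 50 + [1] * 50)
pixels = np.array([[0.2, 0.2, 0.2, 0.2], [0.8, 0.8, 0.8, 0.8]])
labels = pnn_classify(train_x, train_y, pixels)
```

The all-pairs distance computation in the pattern layer is embarrassingly parallel, which is what makes this classifier a natural fit for an FPGA implementation.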

  20. Computer Software for Forestry Technology Curricula. Final Report.

    ERIC Educational Resources Information Center

    Watson, Roy C.; Scobie, Walter R.

    Since microcomputers are being used more and more frequently in the forest products industry in the Pacific Northwest, Green River Community College conducted a project to search for BASIC language computer programs pertaining to forestry, and when possible, to adapt such software for use in teaching forestry technology. The search for applicable…

  1. The research on thermal adaptability reinforcement technology for photovoltaic modules

    NASA Astrophysics Data System (ADS)

    Su, Nana; Zhou, Guozhong

    2015-10-01

    Nowadays, photovoltaic modules pack more high-performance components into a smaller space. They are also required to work in severe temperature conditions for special uses, such as aerospace. As temperature rises, the failure rate increases exponentially, which significantly reduces reliability. In order to improve the thermal adaptability of photovoltaic modules, this paper investigates reinforcement technologies. Thermoelectric coolers are widely used in aerospace, which has a harsh working environment, so theoretical formulas for computing refrigerating efficiency, refrigerating capacity, and temperature difference are described in detail. The optimum operating current for three classical working conditions is obtained, which can be used to guide the design of the driving circuit. Taking an equipment enclosure as an example, we use a thermoelectric cooler to reinforce its thermal adaptability. A physical model and a thermal model are built from the physical dimensions and constraint conditions, and the model is simulated in Flotherm. The temperature field contours are shown to verify the effectiveness of the reinforcement.
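For a single-stage thermoelectric cooler, the refrigerating-capacity and optimum-current relations the record alludes to are standard textbook formulas: Qc = αITc − I²R/2 − KΔT, with input power P = αIΔT + I²R. A hedged sketch (the module parameters below are illustrative, not from the paper):

```python
def peltier(alpha, R, K, I, t_cold, t_hot):
    """Single-stage thermoelectric-cooler relations: heat pumped at the
    cold face, electrical input power, and coefficient of performance."""
    dT = t_hot - t_cold
    q_c = alpha * I * t_cold - 0.5 * I ** 2 * R - K * dT   # W, refrigerating capacity
    p_in = alpha * I * dT + I ** 2 * R                     # W, electrical input
    return q_c, p_in, q_c / p_in

def current_for_max_cooling(alpha, R, t_cold):
    # dQc/dI = alpha * Tc - I * R = 0  ->  I = alpha * Tc / R
    return alpha * t_cold / R

# illustrative module parameters (not taken from the paper)
alpha, R, K = 0.05, 2.0, 0.5        # Seebeck V/K, resistance ohm, conductance W/K
i_opt = current_for_max_cooling(alpha, R, t_cold=280.0)
q_c, p_in, cop = peltier(alpha, R, K, i_opt, t_cold=280.0, t_hot=300.0)
```

With these numbers the maximum-cooling current is 7 A, pumping 39 W at a COP of about 0.37; in practice the drive current is tuned per working condition, as the record describes.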

  2. Art and Technology: Computers in the Studio?

    ERIC Educational Resources Information Center

    Ruby-Baird, Janet

    1997-01-01

    Because the graphic industry demands graduates with computer skills, art students want college programs that include complex computer technologies. However, students can produce good computer art only if they have mastered traditional drawing and design skills. Discusses designing an art curriculum including both technology and traditional course…

  3. Advanced laptop and small personal computer technology

    NASA Technical Reports Server (NTRS)

    Johnson, Roger L.

    1991-01-01

    Advanced laptop and small personal computer technology is presented in the form of the viewgraphs. The following areas of hand carried computers and mobile workstation technology are covered: background, applications, high end products, technology trends, requirements for the Control Center application, and recommendations for the future.

  4. A survey of software adaptation in mobile and ubiquitous computing

    NASA Astrophysics Data System (ADS)

    Kakousis, Konstantinos; Paspallis, Nearchos; Papadopoulos, George Angelos

    2010-11-01

    Driven by the vast proliferation of mobile devices and ubiquitous computing, dynamic software adaptation is becoming one of the most common terms in Software Engineering and Computer Science in general. After the evolution in autonomic and ubiquitous computing, we will soon expect devices to understand our changing needs and react to them as transparently as possible. Software adaptation is not a new term though; it has been extensively researched in several domains and in numerous forms. This has resulted in several interpretations of adaptation. This survey aims to provide a disambiguation of the term, as it is understood in ubiquitous computing, and a critical evaluation of existing software adaptation approaches. In particular, we focus on existing solutions that enable dynamic software modifications that happen on resource constrained devices, deployed in mobile and ubiquitous computing environments.

  5. Faculty Computer Expertise and Use of Instructional Technology. Technology Survey.

    ERIC Educational Resources Information Center

    Gabriner, Robert; Mery, Pamela

    This report shows the findings of a 1997 technology survey used to assess degrees of faculty computer expertise and the use of instructional technology. Part 1 reviews general findings of the fall 1997 technology survey: (1) the level of computer expertise among faculty, staff and administrators appears to be increasing; (2) in comparison with the…

  6. RASCAL: A Rudimentary Adaptive System for Computer-Aided Learning.

    ERIC Educational Resources Information Center

    Stewart, John Christopher

    Both the background of computer-assisted instruction (CAI) systems in general and the requirements of a computer-aided learning system which would be a reasonable assistant to a teacher are discussed. RASCAL (Rudimentary Adaptive System for Computer-Aided Learning) is a first attempt at defining a CAI system which would individualize the learning…

  7. The Efficacy of Psychophysiological Measures for Implementing Adaptive Technology

    NASA Technical Reports Server (NTRS)

    Scerbo, Mark W.; Freeman, Frederick G.; Mikulka, Peter J.; Parasuraman, Raja; DiNocero, Francesco; Prinzel, Lawrence J., III

    2001-01-01

    Adaptive automation refers to technology that can change its mode of operation dynamically. Further, both the technology and the operator can initiate changes in the level or mode of automation. The present paper reviews research on adaptive technology. It is divided into three primary sections. In the first section, issues surrounding the development and implementation of adaptive automation are presented. Because physiological-based measures show much promise for implementing adaptive automation, the second section is devoted to examining candidate indices. In the final section, those techniques that show the greatest promise for adaptive automation as well as issues that still need to be resolved are discussed.

  8. Computer Technology: State of the Art.

    ERIC Educational Resources Information Center

    Withington, Frederic G.

    1981-01-01

    Describes the nature of modern general-purpose computer systems, including hardware, semiconductor electronics, microprocessors, computer architecture, input output technology, and system control programs. Seven suggested readings are cited. (FM)

  9. A Guide to Computer Adaptive Testing Systems

    ERIC Educational Resources Information Center

    Davey, Tim

    2011-01-01

    Some brand names are used generically to describe an entire class of products that perform the same function. "Kleenex," "Xerox," "Thermos," and "Band-Aid" are good examples. The term "computerized adaptive testing" (CAT) is similar in that it is often applied uniformly across a diverse family of testing methods. Although the various members of…

  10. Adaptive Finite-Element Computation In Fracture Mechanics

    NASA Technical Reports Server (NTRS)

    Min, J. B.; Bass, J. M.; Spradley, L. W.

    1995-01-01

    Report discusses recent progress in use of solution-adaptive finite-element computational methods to solve two-dimensional problems in linear elastic fracture mechanics. Method also shown extensible to three-dimensional problems.

  11. The Computer as Adaptive Instructional Decision Maker.

    ERIC Educational Resources Information Center

    Kopstein, Felix F.; Seidel, Robert J.

    The computer's potential for education, and most particularly for instruction, is contingent on the development of a class of instructional decision models (formal instructional strategies) that interact with the student through appropriate peripheral equipment (man-machine interfaces). Computer hardware and software by themselves should not be…

  12. Solution-adaptive finite element method in computational fracture mechanics

    NASA Technical Reports Server (NTRS)

    Min, J. B.; Bass, J. M.; Spradley, L. W.

    1993-01-01

    Some recent results obtained using solution-adaptive finite element method in linear elastic two-dimensional fracture mechanics problems are presented. The focus is on the basic issue of adaptive finite element method for validating the applications of new methodology to fracture mechanics problems by computing demonstration problems and comparing the stress intensity factors to analytical results.
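The solution-adaptive idea in this record can be sketched as an error-indicator-driven h-refinement loop. The indicator below uses the interpolation error of a model function with a singular gradient, loosely mimicking a crack-tip stress concentration; all names and values are illustrative, not the paper's method.

```python
import numpy as np

def refine(nodes, indicator, frac=0.3):
    """One cycle of solution-adaptive h-refinement: bisect the fraction
    of elements carrying the largest error indicator."""
    eta = indicator(nodes)                           # one value per element
    cutoff = np.sort(eta)[int((1 - frac) * len(eta))]
    mids = 0.5 * (nodes[:-1] + nodes[1:])
    return np.sort(np.concatenate([nodes, mids[eta >= cutoff]]))

def indicator(nodes):
    # h^2 |u''| per element for u(x) = x**0.6, whose gradient is
    # singular at x = 0 (a stand-in for a crack-tip field)
    h = np.diff(nodes)
    mids = 0.5 * (nodes[:-1] + nodes[1:])
    return h ** 2 * 0.24 * mids ** -1.4

nodes = np.linspace(0.01, 1.0, 11)
for _ in range(4):
    nodes = refine(nodes, indicator)
```

After a few cycles the mesh is strongly graded toward the singular end while the smooth end keeps its coarse spacing, which is the behavior an adaptive fracture-mechanics solver exploits near a crack tip.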

  13. Moral Responsibility and Computer Technology.

    ERIC Educational Resources Information Center

    Friedman, Batya

    Noting a recent increase in the number of cases of computer crime and computer piracy, this paper takes up the question, "How can understanding the social context of computing help us--as parents, educators, and members of government and industry--to educate young people to become morally responsible members of an electronic information…

  14. Trajectory Optimization with Adaptive Deployable Entry and Placement Technology Architecture

    NASA Astrophysics Data System (ADS)

    Saranathan, H.; Saikia, S.; Grant, M. J.; Longuski, J. M.

    2014-06-01

    This paper compares the results of trajectory optimization for Adaptive Deployable Entry and Placement Technology (ADEPT) using different control methods. ADEPT addresses the limitations of current EDL technology in delivering heavy payloads to Mars.

  15. Adapting Books: Ready, Set, Read!: EAT: Equipment, Adaptations, and Technology

    ERIC Educational Resources Information Center

    Schoonover, Judith; Norton-Darr, Sally

    2016-01-01

    Developing multimodal materials to introduce or extend literacy experiences sets the stage for literacy success. Alternative ways to organize, display, arrange, interact with, and respond to information produce greater understanding of concepts. Adaptations include making books easier to use (turning pages or holding) and text easier to read…

  16. Implementing the Computer-Adaptive Sequential Testing (CAST) Framework To Mass Produce High Quality Computer-Adaptive and Mastery Tests.

    ERIC Educational Resources Information Center

    Luecht, Richard M.

    Computerized testing has created new challenges for the production and administration of test forms. This paper describes a multi-stage, testlet-based framework for test design, assembly, and administration called computer-adaptive sequential testing (CAST). CAST is a structured testing approach that is amenable to both adaptive and mastery…

  17. Computer Technology and Education: A Policy Delphi.

    ERIC Educational Resources Information Center

    Steier, Lloyd P.

    Realizing the educational potential of computer technology largely depends on developing appropriate policies related to the technology. A Policy Delphi method was used to identify changes in education that are both probable and possible on account of the introduction of computers, and to explore potential patterns for arriving at a desired…

  18. Computer Technology-Infused Learning Enhancement

    ERIC Educational Resources Information Center

    Keengwe, Jared; Anyanwu, Longy O.

    2007-01-01

    The purpose of the study was to determine students' perception of instructional integration of computer technology to improve learning. Two key questions were investigated in this study: (a) What is the students' perception of faculty integration of computer technology into classroom instruction? (b) To what extent does the students' perception of…

  19. College Students' Attitude towards Computer Technology

    ERIC Educational Resources Information Center

    Njagi, K. O.; Havice, W. L.

    2011-01-01

    Recent advances in the contemporary world, especially in the area of computer technology, have heralded the development and implementation of new and innovative teaching strategies and particularly with the Internet revolution. This study assessed students' attitude towards computer technology. Specifically, the study assessed differences in…

  20. Prior Computer Experience and Technology Acceptance

    ERIC Educational Resources Information Center

    Varma, Sonali

    2010-01-01

    Prior computer experience with information technology has been identified as a key variable (Lee, Kozar, & Larsen, 2003) that can influence an individual's future use of newer computer technology. The lack of a theory driven approach to measuring prior experience has however led to conceptually different factors being used interchangeably in…

  1. Theory-Guided Technology in Computer Science.

    ERIC Educational Resources Information Center

    Ben-Ari, Mordechai

    2001-01-01

    Examines the history of major achievements in computer science as portrayed by winners of the prestigious Turing award and identifies a possibly unique activity called Theory-Guided Technology (TGT). Researchers develop TGT by using theoretical results to create practical technology. Discusses reasons why TGT is practical in computer science and…

  2. Computer technology for autistic students.

    PubMed

    Panyan, M V

    1984-12-01

    The first purpose of this article is to review the literature related to the use of computers with autistic individuals. Although only a limited number of applications have been reported, the potential of the computer to facilitate the progress of autistic persons is promising. The second purpose is to identify specific learning problems or styles associated with autism from the research literature and link these with the unique aspects of computer-based instruction. For example, the computer's role in improving the motivation of autistic individuals is related to its capacity to analyze the reinforcing qualities of a particular event interactively and immediately for each user. Finally, recommendations that may enable computers to be maximally beneficial in assessing the learning process and remediating learning problems are offered. Two such recommendations are selecting appropriate software and integrating computer instruction within the classroom environment. PMID:6549182

  3. Technologies for Visualization in Computational Aerosciences

    NASA Technical Reports Server (NTRS)

    Miceli, Kristina D.; Cooper, D. M. (Technical Monitor)

    1993-01-01

    State-of-the-art research in computational aerosciences produces complex, time-dependent datasets. Simulations can also be multidisciplinary in nature, coupling two or more physical disciplines such as fluid dynamics, structural dynamics, thermodynamics, and acoustics. Many diverse technologies are necessary for visualizing computational aerosciences simulations. This paper describes these technologies and how they contribute to building effective tools for use by domain scientists. These technologies include data management, distributed environments, advanced user interfaces, rapid prototyping environments, parallel computation, and methods to visualize the scalar and vector fields associated with computational aerosciences datasets.

  4. Adaptive DNA Computing Algorithm by Using PCR and Restriction Enzyme

    NASA Astrophysics Data System (ADS)

    Kon, Yuji; Yabe, Kaoru; Rajaee, Nordiana; Ono, Osamu

    In this paper, we introduce an adaptive DNA computing algorithm that uses the polymerase chain reaction (PCR) and a restriction enzyme. The adaptive algorithm is designed within the Adleman-Lipton paradigm [3] of DNA computing. Unlike the Adleman-Lipton architecture, however, it introduces a cutting operation and a mechanism by which the molecules used in one computation are fed back into the next cycle. Moreover, PCR amplification is applied to the fed-back molecules, so that the concentration differences that arise among base sequences can be exploited again. This operation prunes the pool of candidate solution molecules, and the optimal solution to the shortest-path problem is obtained. The validity of the proposed adaptive algorithm is examined in logical simulation, and we conclude by proposing a chemical experiment that applies the adaptive algorithm, with actual DNA molecules, to an optimal network problem.
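
    The Adleman-Lipton generate-and-filter idea that this algorithm builds on can be illustrated in software. A minimal sketch, with hypothetical function names and directed edges; the actual algorithm's PCR amplification, enzyme cutting, and feedback cycle are not modeled here:

    ```python
    import itertools

    def adleman_filter(edges, n, start, end):
        """Software analogue of Adleman-style generate-and-filter:
        enumerate candidate vertex sequences (the 'strand library'),
        then keep only those that are valid start-to-end paths
        visiting every vertex exactly once (Hamiltonian paths)."""
        survivors = []
        for perm in itertools.permutations(range(n)):
            if perm[0] != start or perm[-1] != end:
                continue  # filter 1: wrong endpoints
            if all((a, b) in edges for a, b in zip(perm, perm[1:])):
                survivors.append(perm)  # filter 2: every step is a directed edge
        return survivors

    # Adleman's original instance had 7 vertices; a toy 4-vertex graph:
    edges = {(0, 1), (1, 2), (2, 3), (0, 2), (1, 3)}
    print(adleman_filter(edges, 4, start=0, end=3))  # → [(0, 1, 2, 3)]
    ```

    In the wet-lab version each filter is a chemical separation step; the feedback of surviving molecules into the next cycle is what makes the paper's algorithm adaptive.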

  5. Multithreaded Model for Dynamic Load Balancing Parallel Adaptive PDE Computations

    NASA Technical Reports Server (NTRS)

    Chrisochoides, Nikos

    1995-01-01

    We present a multithreaded model for the dynamic load-balancing of numerical, adaptive computations required for the solution of Partial Differential Equations (PDE's) on multiprocessors. Multithreading is used as a means of exploiting concurrency at the processor level in order to tolerate the synchronization costs inherent to traditional (non-threaded) parallel adaptive PDE solvers. Our preliminary analysis for parallel, adaptive PDE solvers indicates that multithreading can be used as a mechanism to mask the overheads required for the dynamic balancing of processor workloads with the computations required for the actual numerical solution of the PDE's. Also, multithreading can simplify the implementation of dynamic load-balancing algorithms, a task that is very difficult for traditional data-parallel adaptive PDE computations. Unfortunately, multithreading does not always reduce program complexity; it often makes code reuse difficult and increases software complexity.
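
    The core idea of masking load imbalance by letting idle threads pull the next unit of work can be sketched with a shared task queue. A minimal illustration in Python with hypothetical names; the paper targets multiprocessor PDE solvers, which this toy does not attempt to reproduce:

    ```python
    import threading, queue

    def run_balanced(tasks, n_workers=4):
        """Minimal dynamic load balancing: idle threads pull the next
        task from a shared queue, so uneven task costs are absorbed
        without a static partition of the work."""
        work = queue.Queue()
        for t in tasks:
            work.put(t)
        results, lock = [], threading.Lock()

        def worker():
            while True:
                try:
                    t = work.get_nowait()
                except queue.Empty:
                    return  # queue drained: this thread is done
                r = t()  # 'solve' one adaptive subproblem
                with lock:
                    results.append(r)

        threads = [threading.Thread(target=worker) for _ in range(n_workers)]
        for th in threads: th.start()
        for th in threads: th.join()
        return results

    # tasks of uneven cost, e.g. refined vs. unrefined mesh patches
    tasks = [lambda k=k: sum(range(k * 10000)) for k in range(8)]
    print(len(run_balanced(tasks)))  # all 8 tasks complete
    ```

    A static partition would pin each task to a thread in advance; here a thread that finishes a cheap task immediately picks up the next one, which is the masking effect the abstract describes.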

  6. Computer Technology Resources for Literacy Projects.

    ERIC Educational Resources Information Center

    Florida State Council on Aging, Tallahassee.

    This resource booklet was prepared to assist literacy projects and community adult education programs in determining the technology they need to serve more older persons. Section 1 contains the following reprinted articles: "The Human Touch in the Computer Age: Seniors Learn Computer Skills from Schoolkids" (Suzanne Kashuba); "Computer Instruction…

  7. The Prisoner's Dilemma: A Computer Adaption.

    ERIC Educational Resources Information Center

    Ashmore, Timothy M.

    1987-01-01

    Presents a computerized version of the Prisoner's Dilemma game, which runs on several Apple computers. Makes a case for utilizing the program in interpersonal, small group, and social conflict communication classes. Describes the four computerized versions of the game: rational, partially rational, nonrational, and assumed rational. (JD)

  8. Computation as an emergent feature of adaptive synchronization

    NASA Astrophysics Data System (ADS)

    Zanin, M.; Papo, D.; Sendiña-Nadal, I.; Boccaletti, S.

    2011-12-01

    We report on the spontaneous emergence of computation from adaptive synchronization of networked dynamical systems. The fundamentals are nonlinear elements, interacting in a directed graph via a coupling that adapts itself to the synchronization level between two input signals. These units can emulate different Boolean logics, and perform any computational task in a Turing sense, each specific operation being associated with a given network's motif. The resilience of the computation against noise is proven, and the general applicability is demonstrated with regard to periodic and chaotic oscillators, and excitable systems mimicking neural dynamics.

  9. Education & Technology: Reflections on Computing in Classrooms.

    ERIC Educational Resources Information Center

    Fisher, Charles, Ed.; Dwyer, David C., Ed.; Yocam, Keith, Ed.

    This volume examines learning in the age of technology, describes changing practices in technology-rich classrooms, and proposes new ways to support teachers as they incorporate technology into their work. It commemorates the eleventh anniversary of the Apple Classrooms of Tomorrow (ACOT) Project, when Apple Computer, Inc., in partnership with a…

  10. From Computer Lab to Technology Class.

    ERIC Educational Resources Information Center

    Sherwood, Sandra

    1999-01-01

    Discussion of integrating technology into elementary school classrooms focuses on teacher training that is based on a three-year plan developed at an elementary school in Marathon, New York. Describes the role of a technology teacher who facilitates technology integration by running the computer lab, offering workshops, and developing inservice…

  11. Adaptive computational methods for aerothermal heating analysis

    NASA Technical Reports Server (NTRS)

    Price, John M.; Oden, J. Tinsley

    1988-01-01

    The development of adaptive gridding techniques for finite-element analysis of fluid dynamics equations is described. The developmental work was done with the Euler equations with concentration on shock and inviscid flow field capturing. Ultimately this methodology is to be applied to a viscous analysis for the purpose of predicting accurate aerothermal loads on complex shapes subjected to high speed flow environments. The development of local error estimate strategies as a basis for refinement strategies is discussed, as well as the refinement strategies themselves. The application of the strategies to triangular elements and a finite-element flux-corrected-transport numerical scheme are presented. The implementation of these strategies in the GIM/PAGE code for 2-D and 3-D applications is documented and demonstrated.
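
    The refine-where-the-local-error-estimate-is-large loop described above can be illustrated in one dimension. A hedged sketch: the midpoint-versus-linear-interpolant indicator below is a generic stand-in, not the report's actual error estimate:

    ```python
    def refine(xs, f, tol):
        """One pass of error-driven h-refinement on a 1D mesh:
        estimate the local error on each interval by comparing the
        linear interpolant to the midpoint value, and bisect
        intervals whose indicator exceeds tol."""
        new_xs = [xs[0]]
        for a, b in zip(xs, xs[1:]):
            mid = 0.5 * (a + b)
            indicator = abs(f(mid) - 0.5 * (f(a) + f(b)))
            if indicator > tol:
                new_xs.append(mid)  # refine: insert the midpoint
            new_xs.append(b)
        return new_xs

    mesh = [0.0, 0.5, 1.0]
    print(refine(mesh, lambda x: x * x, tol=0.01))  # → [0.0, 0.25, 0.5, 0.75, 1.0]
    ```

    In practice the loop is repeated until every indicator falls below tolerance; for shock capturing, the indicator concentrates refinement near the discontinuity.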

  12. Adapting Inspection Data for Computer Numerical Control

    NASA Technical Reports Server (NTRS)

    Hutchison, E. E.

    1986-01-01

    Machining time for repetitive tasks reduced. Program converts measurements of stub post locations by coordinate-measuring machine into form used by numerical-control computer. Work time thus reduced by 10 to 15 minutes for each post. Since there are 600 such posts on each injector, time saved per injector is 100 to 150 hours. With modifications this approach applicable to machining of many precise holes on large machine frames and similar objects.

  13. Adapting Technological Interventions to Meet the Needs of Priority Populations.

    PubMed

    Linke, Sarah E; Larsen, Britta A; Marquez, Becky; Mendoza-Vasconez, Andrea; Marcus, Bess H

    2016-01-01

    Cardiovascular diseases (CVD) comprise the leading cause of mortality worldwide, accounting for 3 in 10 deaths. Individuals with certain risk factors, including tobacco use, obesity, low levels of physical activity, type 2 diabetes mellitus, racial/ethnic minority status and low socioeconomic status, experience higher rates of CVD and are, therefore, considered priority populations. Technological devices such as computers and smartphones are now routinely utilized in research studies aiming to prevent CVD and its risk factors, and they are also rampant in the public and private health sectors. Traditional health behavior interventions targeting these risk factors have been adapted for technology-based approaches. This review provides an overview of technology-based interventions conducted in these priority populations as well as the challenges and gaps to be addressed in future research. Researchers currently possess tremendous opportunities to engage in technology-based implementation and dissemination science to help spread evidence-based programs focusing on CVD risk factors in these and other priority populations. PMID:26957186

  14. Confidence in Pass/Fail Decisions for Computer Adaptive and Paper and Pencil Examinations.

    ERIC Educational Resources Information Center

    Bergstrom, Betty A.; Lunz, Mary E.

    The level of confidence in pass/fail decisions obtained with computer adaptive tests (CATs) was compared to decisions based on paper-and-pencil tests. Subjects included 645 medical technology students from 238 educational programs across the country. The tests used in this study constituted part of the subjects' review for the certification…
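
    A common way to quantify confidence in a pass/fail decision, given an ability estimate and its standard error, is a normal approximation around the cut score. A sketch of that calculation (illustrative only; the study's exact method may differ):

    ```python
    from math import erf, sqrt

    def decision_confidence(theta, se, cut):
        """Probability, under a normal approximation, that the true
        ability lies on the same side of the cut score as the
        point estimate theta (standard error se)."""
        z = abs(theta - cut) / se
        return 0.5 * (1.0 + erf(z / sqrt(2.0)))  # standard normal CDF at z

    # Estimate 0.8 logits above the cut, with standard error 0.25:
    print(round(decision_confidence(theta=1.3, se=0.25, cut=0.5), 3))  # → 0.999
    ```

    The comparison in the abstract hinges on this quantity: an adaptive test concentrates items near the cut score, shrinking the standard error there and raising decision confidence for a fixed test length.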

  15. Computer technology: implications for nurse educators.

    PubMed

    Rambo, A

    1994-01-01

    The purpose of this paper is to discuss implications of computer technology for nursing education. Effects of computer anxiety and strategies to minimize them are presented. Computer assisted instruction (CAI) and interactive videodisc (IVD) are alternative instructional strategies for content dissemination and learning enhancement. Faculty must be cognizant of design factors facilitating usage when selecting programs. Issues of privacy, confidentiality, information security, and impact on nursing practice have risen with increased computer usage. PMID:7831133

  16. Adaptive Fuzzy Systems in Computational Intelligence

    NASA Technical Reports Server (NTRS)

    Berenji, Hamid R.

    1996-01-01

    In recent years, interest in computational intelligence techniques, which currently include neural networks, fuzzy systems, and evolutionary programming, has grown significantly, and a number of applications have been developed in government and industry. In the future, an essential element of these systems will be fuzzy systems that can learn from experience by using neural networks to refine their performance. The GARIC architecture, introduced earlier, is an example of a fuzzy reinforcement learning system that has been applied in several control domains, such as cart-pole balancing, simulation of Space Shuttle orbital operations, and tether control. A number of examples from GARIC's applications in these domains will be demonstrated.

  17. Computers and Writing. Tech Use Guide: Using Computer Technology.

    ERIC Educational Resources Information Center

    Dalton, Bridget

    One of nine brief guides for special educators on using computer technology, this guide focuses on the use of computers to improve skills and attitudes in writing instruction. Pre-writing tools such as group brainstorming, story webs, free-writing, journal entries, and prewriting guides help generate ideas and can be carried out either on or off…

  18. Computing, Information and Communications Technology (CICT) Website

    NASA Technical Reports Server (NTRS)

    Hardman, John; Tu, Eugene (Technical Monitor)

    2002-01-01

    The Computing, Information and Communications Technology Program (CICT) was established in 2001 to ensure NASA's Continuing leadership in emerging technologies. It is a coordinated, Agency-wide effort to develop and deploy key enabling technologies for a broad range of mission-critical tasks. The NASA CICT program is designed to address Agency-specific computing, information, and communications technology requirements beyond the projected capabilities of commercially available solutions. The areas of technical focus have been chosen for their impact on NASA's missions, their national importance, and the technical challenge they provide to the Program. In order to meet its objectives, the CICT Program is organized into the following four technology focused projects: 1) Computing, Networking and Information Systems (CNIS); 2) Intelligent Systems (IS); 3) Space Communications (SC); 4) Information Technology Strategic Research (ITSR).

  19. Theory-Guided Technology in Computer Science

    NASA Astrophysics Data System (ADS)

    Ben-Ari, Mordechai

    Scientists usually identify themselves as either theoreticians or experimentalists, while technology - the application of science in practice - is done by engineers. In computer science, these distinctions are often blurred. This paper examines the history of major achievements in computer science as portrayed by the winners of the prestigious Turing Award and identifies a possibly unique activity called Theory-Guided Technology (TGT). Researchers develop TGT by using theoretical results to create practical technology. The reasons why TGT is practical in computer science are discussed, as is the cool reception TGT has received from software engineers.

  20. An Adaptive Sensor Mining Framework for Pervasive Computing Applications

    NASA Astrophysics Data System (ADS)

    Rashidi, Parisa; Cook, Diane J.

    Analyzing sensor data in pervasive computing applications brings unique challenges to the KDD community. The challenge is heightened when the underlying data source is dynamic and the patterns change. We introduce a new adaptive mining framework that detects patterns in sensor data, and more importantly, adapts to the changes in the underlying model. In our framework, the frequent and periodic patterns of data are first discovered by the Frequent and Periodic Pattern Miner (FPPM) algorithm; and then any changes in the discovered patterns over the lifetime of the system are discovered by the Pattern Adaptation Miner (PAM) algorithm, in order to adapt to the changing environment. This framework also captures vital context information present in pervasive computing applications, such as the startup triggers and temporal information. In this paper, we present a description of our mining framework and validate the approach using data collected in the CASAS smart home testbed.
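
    The frequent-pattern step can be illustrated with a toy counter over a sensor event stream. A sketch only: FPPM and PAM, as described in the abstract, also handle periodicity, context, and adaptation to changing patterns, none of which appear here:

    ```python
    from collections import Counter

    def frequent_patterns(events, max_len=3, min_support=2):
        """Toy stand-in for the frequent-pattern discovery step:
        count contiguous event subsequences up to max_len and keep
        those that meet a support threshold."""
        counts = Counter()
        for n in range(2, max_len + 1):
            for i in range(len(events) - n + 1):
                counts[tuple(events[i:i + n])] += 1
        return {p: c for p, c in counts.items() if c >= min_support}

    # a hypothetical smart-home event stream
    stream = ["door", "light", "coffee", "door", "light", "coffee", "tv"]
    print(frequent_patterns(stream))
    ```

    In the paper's framework, a second pass (PAM) would monitor whether these discovered patterns drift over time and update the model accordingly.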

  1. Electrooptical adaptive switching network for the hypercube computer

    NASA Technical Reports Server (NTRS)

    Chow, E.; Peterson, J.

    1988-01-01

    An all-optical network design for the hyperswitch network using regular free-space interconnects between electronic processor nodes is presented. The adaptive routing model used is described, and an adaptive routing control example is presented. The design demonstrates that existing electrooptical techniques are sufficient for implementing efficient parallel architectures without the need for more complex means of implementing arbitrary interconnection schemes. The electrooptical hyperswitch network significantly improves the communication performance of the hypercube computer.
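
    Adaptive routing in a hypercube can be sketched from the addressing scheme alone: the legal next hops toward the destination are the neighbors reached by flipping any bit in which the source and destination addresses differ. A minimal illustration; the hyperswitch's actual control logic is more involved than this:

    ```python
    def next_hops(src, dst, dimensions):
        """Candidate next hops in a hypercube: flip any bit in which
        src and dst differ. An adaptive router may choose any of
        these, e.g. to avoid a congested or faulty link."""
        diff = src ^ dst
        return [src ^ (1 << d) for d in range(dimensions) if diff & (1 << d)]

    # 4-dimensional hypercube, routing node 0b0000 -> 0b1011
    print([bin(h) for h in next_hops(0b0000, 0b1011, 4)])  # flips of bits 0, 1, and 3
    ```

    Fixed-order (e-cube) routing always flips the lowest differing bit; the adaptivity discussed in the abstract comes from being free to pick among all the candidates at each step.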

  2. (CICT) Computing, Information, and Communications Technology Overview

    NASA Technical Reports Server (NTRS)

    VanDalsem, William R.

    2003-01-01

    The goal of the Computing, Information, and Communications Technology (CICT) program is to enable NASA's Scientific Research, Space Exploration, and Aerospace Technology Missions with greater mission assurance, for less cost, with increased science return through the development and use of advanced computing, information and communications technologies. This viewgraph presentation includes diagrams of how the political guidance behind CICT is structured. The presentation profiles each part of the NASA Mission in detail, and relates the Mission to the activities of CICT. CICT's Integrated Capability Goal is illustrated, and hypothetical missions which could be enabled by CICT are profiled. CICT technology development is profiled.

  3. Adapting Teaching Strategies To Encompass New Technologies.

    ERIC Educational Resources Information Center

    Oravec, Jo Ann

    2001-01-01

    The explosion of special-purpose computing devices--Internet appliances, handheld computers, wireless Internet, networked household appliances--challenges business educators attempting to provide computer literacy education. At a minimum, they should address connectivity, expanded applications, and social and public policy implications of these…

  4. Ultimate computing. Biomolecular consciousness and nano Technology

    SciTech Connect

    Hameroff, S.R.

    1987-01-01

    The book advances the premise that the cytoskeleton is the cell's nervous system, the biological controller/computer. If indeed cytoskeletal dynamics in the nanoscale (billionth meter, billionth second) are the texture of intracellular information processing, emerging "NanoTechnologies" (scanning tunneling microscopy, Feynman machines, von Neumann replicators, etc.) should enable direct monitoring, decoding and interfacing between biological and technological information devices. This in turn could result in important biomedical applications and perhaps a merger of mind and machine: Ultimate Computing.

  5. Computer technology forecast study for general aviation

    NASA Technical Reports Server (NTRS)

    Seacord, C. L.; Vaughn, D.

    1976-01-01

    A multi-year, multi-faceted program is underway to investigate and develop potential improvements in airframes, engines, and avionics for general aviation aircraft. The objective of this study was to assemble information that will allow the government to assess the trends in computer and computer/operator interface technology that may have application to general aviation in the 1980's and beyond. The current state of the art of computer hardware is assessed, technical developments in computer hardware are predicted, and nonaviation large volume users of computer hardware are identified.

  6. Computer Technology and the Social Studies.

    ERIC Educational Resources Information Center

    Glenn, Allen D.; Klassen, Daniel L.

    1983-01-01

    The citizen of tomorrow needs to understand the role of information in political systems; computer technology and information storage, retrieval, and use; the implications of information systems for individual rights; and the impact of computer crime, databanks, and systems analysis on the social, economic, and political spheres. (QKR)

  7. On the Emergence of New Computer Technologies

    ERIC Educational Resources Information Center

    Asaolu, Olumuyiwa Sunday

    2006-01-01

    This work presents a review of the development and application of computers. It traces the highlights of emergent computing technologies shaping our world. Recent trends in hardware and software deployment are chronicled as well as their impact on various segments of the society. The expectations for the future are also discussed along with…

  8. Computer Maintenance Technology. Suggested Basic Course Outline.

    ERIC Educational Resources Information Center

    Texas A and M Univ., College Station. Vocational Instructional Services.

    This competency-based basic course outline is designed for a two-year secondary program in computer maintenance technology. The first year is devoted to basic electricity and electronics, the second to the troubleshooting, maintenance, and service of microcomputers. (The repair section is based upon the Apple II computer, disc drive, monitor, and…

  9. Applications of Computer Technology in Intercollegiate Debate.

    ERIC Educational Resources Information Center

    Kay, Jack, Ed.

    1986-01-01

    Focusing on how computers can and should be used in intercollegiate forensics, this journal issue offers the perspectives of a number of forensics instructors. The lead article, "Applications of Computer Technology in Intercollegiate Debate" by Theodore F. Sheckels, Jr., discusses five areas in which forensics educators might use computer…

  10. [Computer technologies in teaching pathological anatomy].

    PubMed

    Ponomarev, A B; Fedorov, D N

    2015-01-01

    The paper gives experience with personal computers used at the Academician A.L. Strukov Department of Pathological Anatomy for more than 20 years. It shows the objective necessity of introducing computer technologies at all stages of acquiring skills in anatomical pathology, including lectures, students' free work, test check, etc. PMID:26027397

  11. Adapting the human-computer interface for reading literacy and computer skill to facilitate collection of information directly from patients.

    PubMed

    Lobach, David F; Arbanas, Jennifer M; Mishra, Dharani D; Campbell, Marci; Wildemuth, Barbara M

    2004-01-01

    Clinical information collected directly from patients is critical to the practice of medicine. Past efforts to collect this information using computers have had limited utility because these efforts required users to be facile with the computerized information collecting system. In this paper we describe the design, development, and function of a computer system that uses recent technology to overcome the limitations of previous computer-based data collection tools by adapting the human-computer interface to the native language, reading literacy, and computer skills of the user. Specifically, our system uses a numerical representation of question content, multimedia, and touch screen technology to adapt the computer interface to the native language, reading literacy, and computer literacy of the user. In addition, the system supports health literacy needs throughout the data collection session and provides contextually relevant disease-specific education to users based on their responses to the questions. The system has been successfully used in an academically affiliated family medicine clinic and in an indigent adult medicine clinic. PMID:15360991

  12. Implementing Computer Technologies: Teachers' Perceptions and Practices

    ERIC Educational Resources Information Center

    Wozney, Lori; Venkatesh, Vivek; Abrami, Philip

    2006-01-01

    This study investigates personal and setting characteristics, teacher attitudes, and current computer technology practices among 764 elementary and secondary teachers from both private and public school sectors in Quebec. Using expectancy-value theory, the Technology Implementation Questionnaire (TIQ) was developed; it consists of 33 belief items…

  13. Computer Technology and Maintenance Curriculum. Final Report.

    ERIC Educational Resources Information Center

    Manchester Community Coll., CT.

    A project was conducted by Manchester Community College and Howell Cheney Vocational Technical School in Connecticut to develop a joint curriculum for a two-year computer technology and maintenance program. During the year the project was conducted, a high technology advisory council was formed, consisting of industry and faculty representatives…

  14. Adaptive Hypermedia Educational System Based on XML Technologies.

    ERIC Educational Resources Information Center

    Baek, Yeongtae; Wang, Changjong; Lee, Sehoon

    This paper proposes an adaptive hypermedia educational system using XML technologies, such as XML, XSL, XSLT, and XLink. Adaptive systems are capable of altering the presentation of the content of the hypermedia on the basis of a dynamic understanding of the individual user. The user profile can be collected in a user model, while the knowledge…

  15. Estimating Skin Cancer Risk: Evaluating Mobile Computer-Adaptive Testing

    PubMed Central

    Djaja, Ngadiman; Janda, Monika; Olsen, Catherine M; Whiteman, David C

    2016-01-01

    Background Response burden is a major detriment to questionnaire completion rates. Computer adaptive testing may offer advantages over non-adaptive testing, including reduction of numbers of items required for precise measurement. Objective Our aim was to compare the efficiency of non-adaptive (NAT) and computer adaptive testing (CAT) facilitated by Partial Credit Model (PCM)-derived calibration to estimate skin cancer risk. Methods We used a random sample from a population-based Australian cohort study of skin cancer risk (N=43,794). All 30 items of the skin cancer risk scale were calibrated with the Rasch PCM. A total of 1000 cases generated following a normal distribution (mean [SD] 0 [1]) were simulated using three Rasch models with three fixed-item (dichotomous, rating scale, and partial credit) scenarios, respectively. We calculated the comparative efficiency and precision of CAT and NAT (shortening of questionnaire length and the count difference number ratio less than 5% using independent t tests). Results We found that use of CAT led to smaller person standard error of the estimated measure than NAT, with substantially higher efficiency but no loss of precision, reducing response burden by 48%, 66%, and 66% for dichotomous, Rating Scale Model, and PCM models, respectively. Conclusions CAT-based administrations of the skin cancer risk scale could substantially reduce participant burden without compromising measurement precision. A mobile computer adaptive test was developed to help people efficiently assess their skin cancer risk. PMID:26800642
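
    The CAT cycle the study evaluates, selecting the most informative item and then re-estimating ability, can be sketched for the simpler dichotomous Rasch case. The study also calibrates with the Partial Credit Model, which this sketch does not model:

    ```python
    from math import exp

    def p_correct(theta, b):
        """Dichotomous Rasch model: P(correct) for ability theta, difficulty b."""
        return 1.0 / (1.0 + exp(-(theta - b)))

    def select_item(theta, bank, used):
        """Rasch item information peaks where difficulty equals ability,
        so pick the unused item whose difficulty is nearest the estimate."""
        return min((i for i in range(len(bank)) if i not in used),
                   key=lambda i: abs(bank[i] - theta))

    def update_theta(theta, items, responses, steps=10):
        """Maximum-likelihood ability update via Newton-Raphson."""
        for _ in range(steps):
            ps = [p_correct(theta, b) for b in items]
            info = sum(p * (1.0 - p) for p in ps)           # test information
            theta += sum(r - p for r, p in zip(responses, ps)) / info
        return theta, 1.0 / info ** 0.5                      # estimate, standard error

    bank = [-2.0, -1.0, 0.0, 1.0, 2.0]  # hypothetical item difficulties
    print(select_item(0.3, bank, used={2}))  # next-best item once item 2 is used
    ```

    Each administered item tightens the standard error returned by `update_theta`; stopping once it falls below a target is what lets CAT cut the item count, the 48-66% reduction the abstract reports, without losing precision.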

  16. The Importance of Formalizing Computational Models of Face Adaptation Aftereffects

    PubMed Central

    Ross, David A.; Palmeri, Thomas J.

    2016-01-01

    Face adaptation is widely used as a means to probe the neural representations that support face recognition. While the theories that relate face adaptation to behavioral aftereffects may seem conceptually simple, our work has shown that testing computational instantiations of these theories can lead to unexpected results. Instantiating a model of face adaptation not only requires specifying how faces are represented and how adaptation shapes those representations but also specifying how decisions are made, translating hidden representational states into observed responses. Considering the high-dimensionality of face representations, the parallel activation of multiple representations, and the non-linearity of activation functions and decision mechanisms, intuitions alone are unlikely to succeed. If the goal is to understand mechanism, not simply to examine the boundaries of a behavioral phenomenon or correlate behavior with brain activity, then formal computational modeling must be a component of theory testing. To illustrate, we highlight our recent computational modeling of face adaptation aftereffects and discuss how models can be used to understand the mechanisms by which faces are recognized. PMID:27378960

  17. Future Information Processing Technology--1983, Computer Science and Technology.

    ERIC Educational Resources Information Center

    Kay, Peg, Ed.; Powell, Patricia, Ed.

    Developed by the Institute for Computer Sciences and Technology and the Defense Intelligence Agency with input from other federal agencies, this detailed document contains the 1983 technical forecast for the information processing industry through 1997. Part I forecasts the underlying technologies of hardware and software, discusses changes in the…

  18. Adaptive Engine Technologies for Aviation CO2 Emissions Reduction

    NASA Technical Reports Server (NTRS)

    Mercer, Carolyn R.; Haller, William J.; Tong, Michael T.

    2006-01-01

    Adaptive turbine engine technologies are assessed for their potential to reduce carbon dioxide emissions from commercial air transports. Technologies including inlet, fan, and compressor flow control, compressor stall control, blade clearance control, combustion control, active bearings, and enabling technologies such as active materials and wireless sensors are discussed. The method of systems assessment is described, including strengths and weaknesses of the approach. Performance benefit estimates are presented for each technology, with a summary of the potential emissions reduction possible from the development of new, adaptively controlled engine components.

  19. Cloud Computing Technologies and Applications

    NASA Astrophysics Data System (ADS)

    Zhu, Jinzy

    In a nutshell, the existing Internet provides us content in the form of videos, emails, and information served up in web pages. With Cloud Computing, the next generation of the Internet will allow us to "buy" IT services from a web portal, drastically expanding the types of merchandise available beyond those on e-commerce sites such as eBay and Taobao. We will be able to rent from a virtual storefront the basic necessities of a virtual data center — CPU, memory, and storage — and add on top of that the necessary middleware (web application servers, databases, enterprise service bus, etc.) as the platform(s) to support the applications we would like to either rent from an Independent Software Vendor (ISV) or develop ourselves. Together, this is what we call "IT as a Service," or ITaaS, bundled for us end users as a virtual data center.

  20. Sequential decision making in computational sustainability via adaptive submodularity

    USGS Publications Warehouse

    Andreas Krause; Daniel Golovin; Converse, Sarah J.

    2015-01-01

    Many problems in computational sustainability require making a sequence of decisions in complex, uncertain environments. Such problems are generally notoriously difficult. In this article, we review the recently discovered notion of adaptive submodularity, an intuitive diminishing returns condition that generalizes the classical notion of submodular set functions to sequential decision problems. Problems exhibiting the adaptive submodularity property can be efficiently and provably near-optimally solved using simple myopic policies. We illustrate this concept in several case studies of interest in computational sustainability: First, we demonstrate how it can be used to efficiently plan for resolving uncertainty in adaptive management scenarios. Secondly, we show how it applies to dynamic conservation planning for protecting endangered species, a case study carried out in collaboration with the US Geological Survey and the US Fish and Wildlife Service.
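
    The myopic policy the article refers to can be illustrated with a toy coverage objective (the site names and habitat cells below are invented for illustration; adaptive submodularity extends the same greedy idea to sequential decisions under uncertainty):

```python
def greedy_maximize(ground_set, gain, budget):
    """Greedy selection: repeatedly add the element with the largest marginal
    gain. For (adaptive) submodular objectives this myopic policy is provably
    near-optimal (a (1 - 1/e) approximation in the classic setting)."""
    chosen = []
    for _ in range(budget):
        best = max((e for e in ground_set if e not in chosen),
                   key=lambda e: gain(chosen, e), default=None)
        if best is None or gain(chosen, best) <= 0:
            break
        chosen.append(best)
    return chosen

# Illustrative objective: each candidate "monitoring site" covers a set of
# habitat cells; the value of a selection is the number of cells covered.
sites = {
    "A": {1, 2, 3},
    "B": {3, 4},
    "C": {4, 5, 6, 7},
    "D": {7, 8},
}

def coverage_gain(chosen, candidate):
    covered = set().union(*(sites[s] for s in chosen)) if chosen else set()
    return len(sites[candidate] - covered)

picked = greedy_maximize(list(sites), coverage_gain, budget=2)
```

    Coverage functions like this one exhibit diminishing returns: the more sites already chosen, the less any new site can add, which is exactly the property the greedy guarantee relies on.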

  1. Adaptive-mesh algorithms for computational fluid dynamics

    NASA Technical Reports Server (NTRS)

    Powell, Kenneth G.; Roe, Philip L.; Quirk, James

    1993-01-01

    The basic goal of adaptive-mesh algorithms is to distribute computational resources wisely by increasing the resolution of 'important' regions of the flow and decreasing the resolution of regions that are less important. While this goal is worthwhile, implementing schemes with this degree of sophistication remains more of an art than a science. In this paper, the basic pieces of adaptive-mesh algorithms are described and some of the possible ways to implement them are discussed and compared. These basic pieces are the data structure to be used, the generation of an initial mesh, the criterion used to adapt the mesh to the solution, and the flow-solver algorithm on the resulting mesh. Each of these is discussed, with particular emphasis on methods suitable for the computation of compressible flows.
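
    Two of the basic pieces named above — the adaptation criterion and the mesh modification — can be sketched in one dimension (the jump threshold and midpoint-insertion strategy are deliberate simplifications of what production codes do):

```python
def flag_cells(values, threshold):
    """Adaptation criterion: flag cells where the neighbor-to-neighbor jump in
    the solution exceeds the threshold, i.e. where the flow varies rapidly."""
    flags = [False] * len(values)
    for i in range(len(values) - 1):
        if abs(values[i + 1] - values[i]) > threshold:
            flags[i] = flags[i + 1] = True
    return flags

def refine_mesh(xs, flags):
    """One refinement sweep: insert a midpoint in every flagged interval,
    leaving smooth regions coarse."""
    new_xs = [xs[0]]
    for i in range(len(xs) - 1):
        if flags[i] and flags[i + 1]:
            new_xs.append(0.5 * (xs[i] + xs[i + 1]))
        new_xs.append(xs[i + 1])
    return new_xs

# A step-like profile (a crude stand-in for a shock) triggers refinement
# near the jump only.
xs = [0.0, 0.25, 0.5, 0.75, 1.0]
u = [0.0, 0.0, 1.0, 1.0, 1.0]
fine_xs = refine_mesh(xs, flag_cells(u, threshold=0.5))
```

    A full scheme would re-solve on the refined mesh and iterate, and would also coarsen regions that no longer trigger the criterion.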

  2. Computer Adaptive Testing for Small Scale Programs and Instructional Systems

    ERIC Educational Resources Information Center

    Rudner, Lawrence M.; Guo, Fanmin

    2011-01-01

    This study investigates measurement decision theory (MDT) as an underlying model for computer adaptive testing when the goal is to classify examinees into one of a finite number of groups. The first analysis compares MDT with a popular item response theory model and finds little difference in terms of the percentage of correct classifications. The…

  3. X-Y plotter adapter developed for SDS-930 computer

    NASA Technical Reports Server (NTRS)

    Robertson, J. B.

    1968-01-01

    The Graphical Display Adapter provides a real-time display for digital computerized experiments. The display uses a memory oscilloscope, which retains a single trace until erased. The adapter is a small hardware unit that connects the J-box feature of the SDS-930 computer to either an X-Y plotter or a memory oscilloscope.

  4. Research on Key Technologies of Cloud Computing

    NASA Astrophysics Data System (ADS)

    Zhang, Shufen; Yan, Hongcan; Chen, Xuebin

    With the development of multi-core processors, virtualization, distributed storage, broadband Internet, and automatic management, a new computing mode named cloud computing has emerged. It distributes computation tasks over a resource pool consisting of massive numbers of computers, so that application systems can obtain computing power, storage space, and software services according to demand. It concentrates all the computing resources and manages them automatically through software, without human intervention. This frees application providers from tedious implementation details and lets them concentrate on their business, which favors innovation and reduces cost. The ultimate goal of cloud computing is to provide calculation, services, and applications as a public utility, so that people can use computing resources just as they use water, electricity, gas, and the telephone. Currently, the understanding of cloud computing is still developing and changing, and cloud computing has no unanimous definition. This paper describes the three main service forms of cloud computing (SaaS, PaaS, and IaaS), compares the definitions of cloud computing given by Google, Amazon, IBM, and other companies, summarizes the basic characteristics of cloud computing, and emphasizes key technologies such as data storage, data management, virtualization, and programming models.

  5. Unstructured Adaptive Grid Computations on an Array of SMPs

    NASA Technical Reports Server (NTRS)

    Biswas, Rupak; Pramanick, Ira; Sohn, Andrew; Simon, Horst D.

    1996-01-01

    Dynamic load balancing is necessary for parallel adaptive methods that solve unsteady CFD problems on unstructured grids. In this paper we present such a dynamic load balancing framework, called JOVE. Results on a four-POWERnode POWER CHALLENGEarray demonstrate that load balancing gives significant performance improvements over no load balancing for such adaptive computations. The parallel speedup of JOVE, implemented using MPI on the POWER CHALLENGEarray, was significant, reaching as high as 31 on 32 processors. An implementation of JOVE that exploits the 'array of SMPs' architecture was also studied; this hybrid JOVE outperformed flat JOVE by up to 28% on the meshes and adaption models tested. With large, realistic meshes and actual flow-solver and adaption phases incorporated into JOVE, hybrid JOVE can be expected to yield a significant advantage over flat JOVE, especially as the number of processors is increased, demonstrating the scalability of the array-of-SMPs architecture.

  6. AHaH computing with thermodynamic RAM: bridging the technology stack

    NASA Astrophysics Data System (ADS)

    Nugent, Alex

    2014-05-01

    We introduce the motivations behind AHaH computing, an emerging new form of adaptive computing with many applications in machine learning. We then present a technology stack or specification describing the multiple levels of abstraction and specialization needed to support AHaH computing.

  7. The Technology Fix: The Promise and Reality of Computers in Our Schools

    ERIC Educational Resources Information Center

    Pflaum, William D.

    2004-01-01

    During the technology boom of the 1980s and 1990s, computers seemed set to revolutionize education. Do any of these promises sound familiar? (1) Technology would help all students learn better, thanks to multimedia programs capable of adapting to individual needs, learning styles, and skill levels; (2) Technology would transform the teacher's role…

  8. Probabilistic co-adaptive brain-computer interfacing

    NASA Astrophysics Data System (ADS)

    Bryan, Matthew J.; Martin, Stefan A.; Cheung, Willy; Rao, Rajesh P. N.

    2013-12-01

    Objective. Brain-computer interfaces (BCIs) are confronted with two fundamental challenges: (a) the uncertainty associated with decoding noisy brain signals, and (b) the need for co-adaptation between the brain and the interface so as to cooperatively achieve a common goal in a task. We seek to mitigate these challenges. Approach. We introduce a new approach to brain-computer interfacing based on partially observable Markov decision processes (POMDPs). POMDPs provide a principled approach to handling uncertainty and achieving co-adaptation in the following manner: (1) Bayesian inference is used to compute posterior probability distributions (‘beliefs’) over brain and environment state, and (2) actions are selected based on entire belief distributions in order to maximize total expected reward; by employing methods from reinforcement learning, the POMDP’s reward function can be updated over time to allow for co-adaptive behaviour. Main results. We illustrate our approach using a simple non-invasive BCI which optimizes the speed-accuracy trade-off for individual subjects based on the signal-to-noise characteristics of their brain signals. We additionally demonstrate that the POMDP BCI can automatically detect changes in the user’s control strategy and can co-adaptively switch control strategies on-the-fly to maximize expected reward. Significance. Our results suggest that the framework of POMDPs offers a promising approach for designing BCIs that can handle uncertainty in neural signals and co-adapt with the user on an ongoing basis. The fact that the POMDP BCI maintains a probability distribution over the user’s brain state allows a much more powerful form of decision making than traditional BCI approaches, which have typically been based on the output of classifiers or regression techniques. Furthermore, the co-adaptation of the system allows the BCI to make online improvements to its behaviour, adjusting itself automatically to the user’s changing
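
    The belief-update-then-act loop described above can be sketched for a hypothetical two-state BCI (the states, observation model, and rewards below are invented for illustration; a full POMDP controller would also plan over future beliefs rather than act greedily on the current one):

```python
def belief_update(belief, likelihood, observation):
    """Bayesian filter step: posterior ∝ likelihood(obs | state) × prior."""
    posterior = {s: likelihood[s][observation] * p for s, p in belief.items()}
    z = sum(posterior.values())
    return {s: p / z for s, p in posterior.items()}

def select_action(belief, reward):
    """Pick the action maximizing expected reward under the full belief."""
    actions = next(iter(reward.values())).keys()
    return max(actions,
               key=lambda a: sum(p * reward[s][a] for s, p in belief.items()))

# Hypothetical two-state BCI: the user intends "left" or "right"; each decoded
# sample is observed correctly with probability 0.8 (noisy brain signals).
likelihood = {"left": {"L": 0.8, "R": 0.2}, "right": {"L": 0.2, "R": 0.8}}
# Committing to the wrong command is costly; "wait" gathers more evidence,
# which is how the belief-based controller trades speed against accuracy.
reward = {
    "left":  {"go_left": 1.0, "go_right": -5.0, "wait": -0.1},
    "right": {"go_left": -5.0, "go_right": 1.0, "wait": -0.1},
}

belief = {"left": 0.5, "right": 0.5}
for obs in ["L", "L", "L"]:
    belief = belief_update(belief, likelihood, obs)
action = select_action(belief, reward)
```

    Because the decision is taken over the entire belief distribution rather than a single classifier output, an uncertain belief naturally favors "wait" while a confident one commits — the speed-accuracy trade-off the authors optimize per subject.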

  9. Adaptation of Technological Pedagogical Content Knowledge Scale to Turkish

    ERIC Educational Resources Information Center

    Kaya, Zehra; Kaya, Osman Nafiz; Emre, Irfan

    2013-01-01

    The purpose of this study was to adapt "Survey of Pre-service Teachers' Knowledge of Teaching and Technology" in order to assess pre-service primary teachers' Technological Pedagogical Content Knowledge (TPACK) to Turkish. 407 pre-service primary teachers (227 female and 180 male) in their final semester in Education Faculties…

  10. Memory System Technologies for Future High-End Computing Systems

    SciTech Connect

    McKee, S A; de Supinski, B R; Mueller, F; Tyson, G S

    2003-05-16

    Our ability to solve Grand Challenge Problems in computing hinges on the development of reliable and efficient High-End Computing systems. Unfortunately, the increasing gap between memory and processor speeds remains one of the major bottlenecks in modern architectures. Uniprocessor nodes still suffer, but symmetric multiprocessor nodes--where access to physical memory is shared among all processors--are among the hardest hit. In the latter case, the memory system must juggle multiple working sets and maintain memory coherence, on top of simply responding to access requests. To illustrate the severity of the current situation, consider two important examples: even the high-performance parallel supercomputers in use at Department of Energy National labs observe single-processor utilization rates as low as 5%, and transaction processing commercial workloads see utilizations of at most about 33%. A wealth of research demonstrates that traditional memory systems are incapable of bridging the processor/memory performance gap, and the problem continues to grow. The success of future High-End Computing platforms therefore depends on our developing hardware and software technologies to dramatically relieve the memory bottleneck. In order to take better advantage of the tremendous computing power of modern microprocessors and future High-End systems, we consider it crucial to develop the hardware for intelligent, adaptable memory systems; the middleware and OS modifications to manage them; and the compiler technology and performance tools to exploit them. Taken together, these will provide the foundations for meeting the requirements of future generations of performance-critical, parallel systems based on either uniprocessor or SMP nodes (including PIM organizations). 
We feel that such solutions should not be vendor-specific, but should be sufficiently general and adaptable that the technologies could be leveraged by any commercial vendor of High-End Computing systems.

  11. Enabling Computational Technologies for Terascale Scientific Simulations

    SciTech Connect

    Ashby, S.F.

    2000-08-24

    We develop scalable algorithms and object-oriented code frameworks for terascale scientific simulations on massively parallel processors (MPPs). Our research in multigrid-based linear solvers and adaptive mesh refinement enables Laboratory programs to use MPPs to explore important physical phenomena. For example, our research aids stockpile stewardship by making practical detailed 3D simulations of radiation transport. The need to solve large linear systems arises in many applications, including radiation transport, structural dynamics, combustion, and flow in porous media. These systems result from discretizations of partial differential equations on computational meshes. Our first research objective is to develop multigrid preconditioned iterative methods for such problems and to demonstrate their scalability on MPPs. Scalability describes how total computational work grows with problem size; it measures how effectively additional resources can help solve increasingly larger problems. Many factors contribute to scalability: computer architecture, parallel implementation, and choice of algorithm. Scalable algorithms have been shown to decrease simulation times by several orders of magnitude.

  12. Method and system for environmentally adaptive fault tolerant computing

    NASA Technical Reports Server (NTRS)

    Copenhaver, Jason L. (Inventor); Jeremy, Ramos (Inventor); Wolfe, Jeffrey M. (Inventor); Brenner, Dean (Inventor)

    2010-01-01

    A method and system for adapting fault tolerant computing. The method includes the steps of measuring an environmental condition representative of an environment. An on-board processing system's sensitivity to the measured environmental condition is measured. It is determined whether to reconfigure a fault tolerance of the on-board processing system based in part on the measured environmental condition. The fault tolerance of the on-board processing system may be reconfigured based in part on the measured environmental condition.
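
    The decision step the patent abstract describes — mapping a measured environmental condition, scaled by the system's sensitivity to it, onto a fault-tolerance configuration — might be sketched as follows (the thresholds and mode names are hypothetical, not taken from the patent):

```python
def choose_fault_tolerance(radiation_level, sensitivity, thresholds=(0.2, 0.6)):
    """Map a measured environmental condition to a fault-tolerance mode.

    Scaled exposure below the first threshold runs unprotected for speed;
    between the thresholds, duplex (detect-and-retry) redundancy is enabled;
    above the second, triple modular redundancy (TMR) is enabled.
    All mode names and threshold values here are illustrative.
    """
    exposure = radiation_level * sensitivity
    low, high = thresholds
    if exposure < low:
        return "simplex"
    if exposure < high:
        return "duplex"
    return "tmr"
```

    Reconfiguring only when the environment demands it lets the system spend its redundancy budget (and throughput) adaptively rather than worst-case all the time.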

  13. Computation emerges from adaptive synchronization of networking neurons.

    PubMed

    Zanin, Massimiliano; Del Pozo, Francisco; Boccaletti, Stefano

    2011-01-01

    The activity of networking neurons is largely characterized by the alternation of synchronous and asynchronous spiking sequences. One of the most relevant challenges scientists face today is relating that evidence to the fundamental mechanisms through which the brain computes and processes information, as well as to the onset (or progress) of a number of neurological illnesses. In other words, the problem is how to associate an organized dynamics of interacting neural assemblies with a computational task. Here we show that computation can be seen as a feature emerging from the collective dynamics of an ensemble of networking neurons, which interact by means of adaptive dynamical connections. Namely, by associating logical states with synchronous neuronal dynamics, we show how the usual Boolean logic can be fully recovered, and a universal Turing machine can be constructed. Furthermore, we show that, besides static binary gates, a wider class of logical operations can be efficiently constructed as the fundamental computational elements interact within an adaptive network, each operation being represented by a specific motif. Our approach qualitatively differs from past attempts to encode information and compute with complex systems, where computation was instead the consequence of control loops enforcing a desired state onto the specific system's dynamics. Being the result of an emergent process, the computation mechanism described here is not limited to a binary Boolean logic but can involve a much larger number of states. As such, our results can suggest new concepts for understanding the real computing processes taking place in the brain. PMID:22073167

  14. Adaptive kinetic-fluid solvers for heterogeneous computing architectures

    NASA Astrophysics Data System (ADS)

    Zabelok, Sergey; Arslanbekov, Robert; Kolobov, Vladimir

    2015-12-01

    We show the feasibility and benefits of porting an adaptive multi-scale kinetic-fluid code to CPU-GPU systems. Challenges are due to the irregular data access of the adaptive Cartesian mesh, the vast difference in computational cost between kinetic and fluid cells, and the desire to evenly load all CPUs and GPUs during grid adaptation and algorithm refinement. Our Unified Flow Solver (UFS) combines Adaptive Mesh Refinement (AMR) with automatic cell-by-cell selection of kinetic or fluid solvers based on continuum breakdown criteria. Using GPUs enables hybrid simulations of mixed rarefied-continuum flows with a million Boltzmann cells, each having a 24 × 24 × 24 velocity mesh. We describe the implementation of CUDA kernels for three modules in UFS: the direct Boltzmann solver using the discrete velocity method (DVM), the Direct Simulation Monte Carlo (DSMC) solver, and a mesoscopic solver based on the Lattice Boltzmann Method (LBM), all using the adaptive Cartesian mesh. Double-digit speedups on a single GPU and good scaling across multiple GPUs have been demonstrated.
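
    Continuum breakdown criteria of the kind mentioned above are commonly based on a gradient-length local Knudsen number; a one-dimensional sketch (the breakdown threshold of 0.05 is a typical value from the literature, not necessarily the one UFS uses):

```python
def local_knudsen(mean_free_path, density, cell_size):
    """Gradient-length local Knudsen number Kn = lambda * |grad(rho)| / rho,
    with the gradient approximated by a central finite difference."""
    kn = []
    for i in range(len(density)):
        lo = max(i - 1, 0)
        hi = min(i + 1, len(density) - 1)
        grad = abs(density[hi] - density[lo]) / ((hi - lo) * cell_size)
        kn.append(mean_free_path * grad / density[i])
    return kn

def select_solvers(kn_values, breakdown=0.05):
    """Kinetic solver where the continuum assumption breaks down, fluid
    elsewhere (cell-by-cell selection)."""
    return ["kinetic" if kn > breakdown else "fluid" for kn in kn_values]

# A sharp density gradient (e.g. across a shock) triggers the kinetic solver
# locally, while smooth regions keep the cheap fluid solver.
rho = [1.0, 1.0, 1.0, 0.4, 0.1, 0.1]
kn = local_knudsen(mean_free_path=0.01, density=rho, cell_size=0.1)
solvers = select_solvers(kn)
```

    The uneven mix this produces — a few expensive kinetic cells among many cheap fluid cells — is exactly the load-balancing challenge the abstract highlights for CPU-GPU partitioning.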

  15. Instructional Technology in Computer Science Education

    ERIC Educational Resources Information Center

    Jenny, Frederick J.

    2004-01-01

    The Web, the Internet, the intranet and associated resources, campus computer labs, smart classrooms, course management systems, and a plethora of software packages all offer opportunities for every classroom instructor to enrich in-class and out-of-class activities. Why should an instructor consider the integration of technology into their…

  16. Publishing a School Newspaper Using Computer Technology.

    ERIC Educational Resources Information Center

    Whitney, Jeanne; And Others

    By publishing a quarterly school and community newspaper, sixth, seventh, and eighth graders get involved in the writing of many types of articles, proofreading, communication skills, interviewing skills, investigative reporting, photography, artistic and graphic design, and computer technology. As the students work together on each issue of the…

  17. Women Workers as Users of Computer Technology.

    ERIC Educational Resources Information Center

    Larwood, Laurie

    1992-01-01

    Discussion of expectations, trends, and implications of growth of computer technology and its effect on women workers argues that the experience of women is different from that of men in the nature of jobs in which women are found, their training and education, home-family conflict, and discrimination. The impact on women of increasing…

  18. Competency Index. [Business/Computer Technologies Cluster.

    ERIC Educational Resources Information Center

    Ohio State Univ., Columbus. Center on Education and Training for Employment.

    This index allows the user to scan the competencies under each title for the 28 subjects appropriate for use in a competency list for the 12 occupations within the business/computer technologies cluster. Titles of the 28 units are as follows: employability skills; professionalism; teamwork; professional and ethical standards; economic and business…

  19. Business/Computer Technologies. State Competency Profile.

    ERIC Educational Resources Information Center

    Ohio State Univ., Columbus. Center on Education and Training for Employment.

    This document contains 272 competencies, grouped into 36 units, for tech prep programs in the business/computer technology cluster. The competencies were developed through collaboration of Ohio business, industry, and labor representatives and secondary and associate degree educators. The competencies are rated either "essential" (necessary to…

  20. University Students' Perceptions of Computer Technology Experiences

    ERIC Educational Resources Information Center

    Inoue, Yukiko

    2007-01-01

    Using a survey as the research method (from designing the survey through reporting on its results), the author examined students' perceptions of computers and information technology. In fall 2005, a survey questionnaire was administered to students enrolled in education courses at a university in the western Pacific. Attention was given to…

  1. Computer Servicing Technology. Florida Vocational Program Guide.

    ERIC Educational Resources Information Center

    University of South Florida, Tampa. Dept. of Adult and Vocational Education.

    This program guide identifies primary concerns in the organization, operation, and evaluation of a computer servicing technology program. It is designed for local school district and community college administrators, instructors, program advisory committees, and regional coordinating councils. The guide begins with the Dictionary of Occupational…

  2. Evaluation of Advanced Computing Techniques and Technologies: Reconfigurable Computing

    NASA Technical Reports Server (NTRS)

    Wells, B. Earl

    2003-01-01

    The focus of this project was threefold: to survey the technology of reconfigurable computing and determine its level of maturity and suitability for NASA applications; to better understand and assess the effectiveness of the reconfigurable design paradigm utilized within the HAL-15 reconfigurable computer system, which was made available to NASA MSFC for this purpose by Star Bridge Systems, Inc.; and to implement at least one application that would benefit from the performance levels possible with reconfigurable hardware. It was originally proposed that experiments in fault tolerance and dynamic reconfigurability would be performed, but time constraints mandated that these be pursued as future research.

  3. Computational Fluid Dynamics Technology for Hypersonic Applications

    NASA Technical Reports Server (NTRS)

    Gnoffo, Peter A.

    2003-01-01

    Several current challenges in computational fluid dynamics and aerothermodynamics for hypersonic vehicle applications are discussed. Example simulations are presented from code validation and code benchmarking efforts to illustrate capabilities and limitations. Opportunities to advance the state-of-art in algorithms, grid generation and adaptation, and code validation are identified. Highlights of diverse efforts to address these challenges are then discussed. One such effort to re-engineer and synthesize the existing analysis capability in LAURA, VULCAN, and FUN3D will provide context for these discussions. The critical (and evolving) role of agile software engineering practice in the capability enhancement process is also noted.

  4. Petascale Computing Enabling Technologies Project Final Report

    SciTech Connect

    de Supinski, B R

    2010-02-14

    The Petascale Computing Enabling Technologies (PCET) project addressed challenges arising from current trends in computer architecture that will lead to large-scale systems with many more nodes, each of which uses multicore chips. These factors will soon lead to systems that have over one million processors. Also, the use of multicore chips will lead to less memory and less memory bandwidth per core. We need fundamentally new algorithmic approaches to cope with these memory constraints and the huge number of processors. Further, correct, efficient code development is difficult even with the number of processors in current systems; more processors will only make it harder. The goal of PCET was to overcome these challenges by developing the computer science and mathematical underpinnings needed to realize the full potential of our future large-scale systems. Our research results will significantly increase the scientific output obtained from LLNL large-scale computing resources by improving application scientist productivity and system utilization. Our successes include scalable mathematical algorithms that adapt to these emerging architecture trends and through code correctness and performance methodologies that automate critical aspects of application development as well as the foundations for application-level fault tolerance techniques. PCET's scope encompassed several research thrusts in computer science and mathematics: code correctness and performance methodologies, scalable mathematics algorithms appropriate for multicore systems, and application-level fault tolerance techniques. Due to funding limitations, we focused primarily on the first two thrusts, although our work also lays the foundation for the needed advances in fault tolerance. In the area of scalable mathematics algorithms, our preliminary work established that OpenMP performance of the AMG linear solver benchmark and important individual kernels on Atlas did not match the predictions of our

  5. Topology and grid adaption for high-speed flow computations

    NASA Technical Reports Server (NTRS)

    Abolhassani, Jamshid S.; Tiwari, Surendra N.

    1989-01-01

    This study investigates the effects of grid topology and grid adaptation on numerical solutions of the Navier-Stokes equations. In the first part of this study, a general procedure is presented for computation of high-speed flow over complex three-dimensional configurations. The flow field is simulated on the surface of a Butler wing in a uniform stream. Results are presented for a Mach number of 3.5 and a Reynolds number of 2,000,000. O-type and H-type grids were used for this study, and the results are compared with each other and with other theoretical and experimental results. The results demonstrate that while the H-type grid is suitable for the leading and trailing edges, a more accurate solution can be obtained for the middle part of the wing with an O-type grid. In the second part of this study, methods of grid adaption are reviewed and a method is developed with the capability of adapting to several variables. The method is based on a variational approach, is algebraic, and is formulated so that no matrix inversion is needed. It is used in conjunction with the calculation of hypersonic flow over a blunt-nose body. A movie was produced showing simultaneously the transient behavior of the solution and the grid adaption.

  6. Adaptive scapula bone remodeling computational simulation: Relevance to regenerative medicine

    NASA Astrophysics Data System (ADS)

    Sharma, Gulshan B.; Robertson, Douglas D.

    2013-07-01

    Shoulder arthroplasty success has been attributed to many factors, including bone quality, soft tissue balancing, surgeon experience, and implant design. Improved long-term success is primarily limited by glenoid implant loosening. Prosthesis design examines materials and shape and determines whether the design can withstand a lifetime of use. Finite element (FE) analyses have been extensively used to study stresses and strains produced in implants and bone. However, these static analyses only measure a moment in time and not the adaptive response to the altered environment produced by the therapeutic intervention. Computational analyses that integrate remodeling rules predict how bone will respond over time. Recent work has shown that subject-specific two- and three-dimensional adaptive bone remodeling models are feasible and valid. Feasibility and validation were achieved computationally, simulating bone remodeling using an intact human scapula, initially resetting the scapular bone material properties to be uniform, numerically simulating sequential loading, and comparing the bone remodeling simulation results to the actual scapula's material properties. A three-dimensional scapula FE bone model was created using volumetric computed tomography images. Muscle and joint load and boundary conditions were applied based on values reported in the literature. Internal bone remodeling was based on element strain-energy density. Initially, all bone elements were assigned a homogeneous density. All loads were applied for 10 iterations. After every iteration, each bone element's remodeling stimulus was compared to its corresponding reference stimulus and its material properties modified. The simulation achieved convergence. At the end of the simulation the predicted and actual specimen bone apparent density were plotted and compared. Location of high and low predicted bone density was comparable to the actual specimen. 
High predicted bone density was greater than actual
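
    The iteration described above — compare each element's strain-energy-density stimulus to a reference and adjust its density accordingly — can be sketched as follows (the linear update rule, rate, and density bounds are illustrative stand-ins for the actual remodeling law):

```python
def remodel_step(density, sed, reference, rate=0.1, rho_min=0.05, rho_max=2.0):
    """One remodeling iteration: each element's stimulus (strain-energy
    density per unit mass) is compared to the reference; density rises where
    the element is over-stimulated and resorbs where it is under-stimulated."""
    updated = []
    for rho, u in zip(density, sed):
        stimulus = u / rho                       # SED per unit mass
        rho_new = rho + rate * (stimulus - reference)
        updated.append(min(max(rho_new, rho_min), rho_max))  # clamp to bounds
    return updated

# Homogeneous start, as in the simulation described above: heavily loaded
# elements densify, unloaded elements resorb, balanced ones stay put.
density = [1.0, 1.0, 1.0]
sed = [2.0, 1.0, 0.1]   # hypothetical per-element strain-energy densities
for _ in range(20):
    density = remodel_step(density, sed, reference=1.0)
```

    In a full FE simulation the strain-energy densities would be recomputed from the load cases after every density update, so the stimulus field and the material distribution co-evolve until convergence.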


  8. Adaptive filtering image preprocessing for smart FPA technology

    NASA Astrophysics Data System (ADS)

    Brooks, Geoffrey W.

    1995-05-01

    This paper discusses two applications of adaptive filters for image processing on parallel architectures. The first, based on the results of previously accomplished work, summarizes analyses of various adaptive filters implemented for pixel-level image prediction. FIR filters, fixed and adaptive IIR filters, and various variable-step-size algorithms were compared, with a focus on algorithm complexity versus the ability to predict future pixel values. A Gaussian smoothing operation with varying spatial and temporal constants was also applied for comparison of random-noise reduction. The second application is a suggestion to use memory-adaptive IIR filters for detecting and tracking motion within an image. Objects within an image are made of edges, or segments, with varying degrees of motion. A previously published application describes FIR filters connecting pixels and using correlations to determine motion and direction. That implementation seems limited to detecting motion coinciding with the FIR filter operation rate and its associated harmonics. Upgrading the FIR structures to adaptive IIR structures can eliminate these limitations. These and any other pixel-level adaptive filtering applications require data memory for filter parameters and some basic computational capability. Tradeoffs have to be made between chip real estate and these desired features. System tradeoffs will also have to be made as to where it makes the most sense to do which level of processing. Although smart pixels may not be ready to implement adaptive filters, applications such as these should give the smart pixel designer some long-range goals.
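
    A pixel-level adaptive predictor of the kind compared here can be illustrated with a basic LMS-adapted FIR filter operating on one pixel's temporal samples; the filter order and step size below are arbitrary illustrative choices, not values from the paper:

```python
import numpy as np

def lms_predict(signal, order=4, mu=0.01):
    """Predict each sample from the previous `order` samples using an
    FIR filter whose weights adapt by the LMS rule; returns the
    predictions and the per-sample prediction error."""
    w = np.zeros(order)
    preds = np.zeros_like(signal, dtype=float)
    errs = np.zeros_like(signal, dtype=float)
    for n in range(order, len(signal)):
        x = signal[n - order:n][::-1]      # most recent sample first
        preds[n] = w @ x
        errs[n] = signal[n] - preds[n]
        w += 2 * mu * errs[n] * x          # LMS weight update
    return preds, errs
```

    On a stationary pixel stream the prediction error decays toward zero; a sudden persistent error is the kind of signal the motion-detection application would exploit.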

  9. Adaptive quantum computation in changing environments using projective simulation

    PubMed Central

    Tiersch, M.; Ganahl, E. J.; Briegel, H. J.

    2015-01-01

    Quantum information processing devices need to be robust and stable against external noise and internal imperfections to ensure correct operation. In a setting of measurement-based quantum computation, we explore how an intelligent agent endowed with a projective simulator can act as controller to adapt measurement directions to an external stray field of unknown magnitude in a fixed direction. We assess the agent’s learning behavior in static and time-varying fields and explore composition strategies in the projective simulator to improve the agent’s performance. We demonstrate the applicability by correcting for stray fields in a measurement-based algorithm for Grover’s search. Thereby, we lay out a path for adaptive controllers based on intelligent agents for quantum information tasks. PMID:26260263
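
    A minimal two-layer projective simulator can illustrate the learning scheme: percept clips connect to action clips through h-values that are reinforced by reward, while a forgetting rate relaxes them back toward their initial value, which is what lets such an agent track time-varying fields. The class below is a generic sketch, not the authors' implementation:

```python
import random

class PSAgent:
    """Minimal two-layer projective simulator: actions are sampled with
    probability proportional to h-values; rewarded transitions are
    reinforced, and forgetting (gamma) enables adaptation to drift."""
    def __init__(self, n_percepts, n_actions, gamma=0.01):
        self.h = [[1.0] * n_actions for _ in range(n_percepts)]
        self.gamma = gamma

    def act(self, percept):
        hs = self.h[percept]
        r = random.uniform(0.0, sum(hs))
        acc = 0.0
        for a, w in enumerate(hs):
            acc += w
            if r <= acc:
                return a
        return len(hs) - 1

    def learn(self, percept, action, reward):
        # forgetting pulls all h-values back toward 1, then reinforce
        for row in self.h:
            for a in range(len(row)):
                row[a] -= self.gamma * (row[a] - 1.0)
        self.h[percept][action] += reward
```

    In the paper's setting, the "actions" would be candidate measurement-direction corrections and the reward would reflect the success of the computation.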

  10. Adaptive E-Learning Environments: Research Dimensions and Technological Approaches

    ERIC Educational Resources Information Center

    Di Bitonto, Pierpaolo; Roselli, Teresa; Rossano, Veronica; Sinatra, Maria

    2013-01-01

    One of the most closely investigated topics in e-learning research has always been the effectiveness of adaptive learning environments. The technological evolutions that have dramatically changed the educational world in the last six decades have allowed ever more advanced and smarter solutions to be proposed. The focus of this paper is to depict…

  11. Adaptive wall technology for minimization of wall interferences in transonic wind tunnels

    NASA Technical Reports Server (NTRS)

    Wolf, Stephen W. D.

    1988-01-01

    Modern experimental techniques to improve free air simulations in transonic wind tunnels by use of adaptive wall technology are reviewed. Considered are the significant advantages of adaptive wall testing techniques with respect to wall interferences, Reynolds number, tunnel drive power, and flow quality. The application of these testing techniques relies on making the test section boundaries adjustable and using a rapid wall adjustment procedure. A historical overview shows how the disjointed development of these testing techniques, since 1938, is closely linked to available computer support. An overview of Adaptive Wall Test Section (AWTS) designs shows a preference for relatively simple designs with solid adaptive walls in 2- and 3-D testing. Operational aspects of AWTSs are discussed with regard to production-type operation, where adaptive wall adjustments need to be quick. Both 2- and 3-D data are presented to illustrate the quality of AWTS data over the transonic speed range. Adaptive wall technology is available for general use in 2-D testing, even in cryogenic wind tunnels. In 3-D testing, more refinement of the adaptive wall testing techniques is required before more widespread use can be planned.

  12. Technologies for Achieving Field Ubiquitous Computing

    NASA Astrophysics Data System (ADS)

    Nagashima, Akira

    Although the term “ubiquitous” may sound like jargon from information appliances, ubiquitous computing is an emerging concept in industrial automation. This paper presents the author's visions of field ubiquitous computing, which is based on the novel Internet Protocol IPv6. IPv6-based instrumentation will realize the next generation of manufacturing excellence. This paper focuses on the following five key issues: 1. IPv6 standardization; 2. IPv6 interfaces embedded in field devices; 3. Compatibility with FOUNDATION fieldbus; 4. Network security for field applications; and 5. Wireless technologies to complement IP instrumentation. Furthermore, the paper discusses the principles of digital plant operations and ubiquitous production that support these key technologies in achieving field ubiquitous systems.

  13. Technology Needs for Teachers Web Development and Curriculum Adaptations

    NASA Technical Reports Server (NTRS)

    Carroll, Christy J.

    1999-01-01

    Computer-based mathematics and science curricula focusing on NASA inventions and technologies will enhance current teacher knowledge and skills. Materials and interactive software developed by educators will allow students to integrate their various courses, to work cooperatively, and to collaborate with both NASA scientists and students at other locations by using computer networks, email and the World Wide Web.

  14. Use of Computer Technology for English Language Learning: Do Learning Styles, Gender, and Age Matter?

    ERIC Educational Resources Information Center

    Lee, Cynthia; Yeung, Alexander Seeshing; Ip, Tiffany

    2016-01-01

    Computer technology provides spaces and locales for language learning. However, learning style preference and demographic variables may affect the effectiveness of technology use for a desired goal. Adapting Reid's pioneering Perceptual Learning Style Preference Questionnaire (PLSPQ), this study investigated the relations of university students'…

  15. Reviews of computing technology: Client-server technology

    SciTech Connect

    Johnson, S.M.

    1990-09-01

    One of the most frequently heard terms in the computer industry these days is "client-server." There is much misinformation available on the topic, and competitive pressures on software vendors have led to a great deal of hype with little in the way of supporting products. The purpose of this document is to explain what is meant by client-server applications, why the Advanced Technology and Architecture (ATA) section of the Information Resources Management (IRM) Department sees this emerging technology as key for computer applications during the next ten years, and what ATA sees as the existing standards and products available today. Because of the relative immaturity of existing client-server products, IRM is not yet guidelining any specific client-server products, except those that are components of guidelined data communications products or database management systems.

  17. Computer Education and Instructional Technology Teacher Trainees' Opinions about Cloud Computing Technology

    ERIC Educational Resources Information Center

    Karamete, Aysen

    2015-01-01

    This study examines the present state of cloud computing usage in the department of Computer Education and Instructional Technology (CEIT) amongst teacher trainees in the School of Necatibey Education, Balikesir University, Turkey. In this study, a questionnaire with open-ended questions was used. 17 CEIT teacher trainees…

  18. Computational Support for Technology- Investment Decisions

    NASA Technical Reports Server (NTRS)

    Adumitroaie, Virgil; Hua, Hook; Lincoln, William; Block, Gary; Mrozinski, Joseph; Shelton, Kacie; Weisbin, Charles; Elfes, Alberto; Smith, Jeffrey

    2007-01-01

    Strategic Assessment of Risk and Technology (START) is a user-friendly computer program that assists human managers in making decisions regarding research-and-development investment portfolios in the presence of uncertainties and of non-technological constraints that include budgetary and time limits, restrictions related to infrastructure, and programmatic and institutional priorities. START facilitates quantitative analysis of technologies, capabilities, missions, scenarios and programs, and thereby enables the selection and scheduling of value-optimal development efforts. START incorporates features that, variously, perform or support a unique combination of functions, most of which are not systematically performed or supported by prior decision-support software. These functions include the following: Optimal portfolio selection using an expected-utility-based assessment of capabilities and technologies; Temporal investment recommendations; Distinctions between enhancing and enabling capabilities; Analysis of partial funding for enhancing capabilities; and Sensitivity and uncertainty analysis. START can run on almost any computing hardware under Linux and related operating systems, including Mac OS X versions 10.3 and later, and can run in Windows under the Cygwin environment. START can be distributed in binary code form. START calls, as external libraries, several open-source software packages. Output is in Excel (.xls) file format.
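
    The core selection step (expected-utility-based portfolio choice under a budget) can be illustrated with a brute-force sketch; the data layout and field names here are hypothetical, and START's actual optimizer is more sophisticated than exhaustive enumeration:

```python
from itertools import combinations

def best_portfolio(techs, budget):
    """Exhaustive search over technology subsets: maximize the summed
    expected utility (value * success probability) within a budget."""
    best, best_u = (), 0.0
    names = list(techs)
    for r in range(len(names) + 1):
        for subset in combinations(names, r):
            cost = sum(techs[t]["cost"] for t in subset)
            if cost > budget:
                continue
            u = sum(techs[t]["value"] * techs[t]["p_success"]
                    for t in subset)
            if u > best_u:
                best, best_u = subset, u
    return set(best), best_u
```

    Exhaustive search is exponential in the number of candidate technologies; a production tool would use integer programming or a comparable solver for the same objective.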

  19. Grid computing technology for hydrological applications

    NASA Astrophysics Data System (ADS)

    Lecca, G.; Petitdidier, M.; Hluchy, L.; Ivanovic, M.; Kussul, N.; Ray, N.; Thieron, V.

    2011-06-01

    Advances in e-Infrastructure promise to revolutionize sensing systems and the way in which data are collected and assimilated, and complex water systems are simulated and visualized. According to the EU Infrastructure 2010 work-programme, data and compute infrastructures and their underlying technologies, whether oriented to tackling scientific challenges or to complex problem solving in engineering, are expected to converge into so-called knowledge infrastructures, leading to more effective research, education and innovation in the next decade and beyond. Grid technology is recognized as a fundamental component of e-Infrastructures. Nevertheless, this emerging paradigm highlights several topics, including data management, algorithm optimization, security, performance (speed, throughput, bandwidth, etc.), and scientific cooperation and collaboration issues, that require further examination to fully exploit it and to better inform future research policies. The paper illustrates the results of six different surface and subsurface hydrology applications that have been deployed on the Grid. All the applications aim to answer strong requirements from civil society at large, relative to natural and anthropogenic risks. Grid technology has been successfully tested to improve flood prediction, groundwater resources management and Black Sea hydrological survey by providing large computing resources. It is also shown that Grid technology facilitates e-cooperation among partners by means of services for authentication and authorization, seamless access to distributed data sources, data protection and access rights, and standardization.

  20. Collection of Articles on Computers and Information Technology.

    ERIC Educational Resources Information Center

    Shannon, A. G.; And Others

    Four articles focus on computers, information technology, and education: (1) "Information Technology: Some Implications for Education" (A. G. Shannon, B. S. Thorton, and Gareth Locksley) examines the last phase of technological development, the communication phase, as it relates to computer technology in education; (2) "Computers in the…

  1. Reviews of computing technology: Object-oriented technology

    SciTech Connect

    Skeen, D.C.

    1993-03-01

    A useful metaphor in introducing object-oriented concepts is the idea of a computer hardware manufacturer assembling products from an existing stock of electronic parts. In this analogy, think of the parts as pieces of computer software and of the finished products as computer applications. Like its counterpart, the object is capable of performing its specific function in a wide variety of different applications. The advantages to assembling hardware using a set of prebuilt parts are obvious. The design process is greatly simplified in this scenario, since the designer needs only to carry the design down to the chip level, rather than to the transistor level. As a result, the designer is free to develop a more reliable and feature rich product. Also, since the component parts are reused in several different products, the parts can be made more robust and subjected to more rigorous testing than would be economically feasible for a part used in only one piece of equipment. Additionally, maintenance on the resulting systems is simplified because of the part-level consistency from one type of equipment to another. The remainder of this document introduces the techniques used to develop objects, the benefits of the technology, outstanding issues that remain with the technology, industry direction for the technology, and the impact that object-oriented technology is likely to have on the organization. While going through this material, the reader will find it useful to remember the parts analogy and to keep in mind that the overall purpose of object-oriented technology is to create software parts and to construct applications using those parts.

  2. Computer microworld development adapted to children's conceptions: A case study

    NASA Astrophysics Data System (ADS)

    Couturier, Russell Lawrence

    This research studied changes in ten middle school students' scientific conceptions during interaction with a computer microworld designed adaptively for exploring phases of the moon. Following direct observations of lunar phenomena, five students participated in the development of the computer microworld. The researcher implemented software design requests from the students based on their real world and microworld experience. Five different students used the final revised microworld and provided additional feedback. All sessions were transcribed and analyzed. Evidence from this case study suggests that this constructionist activity was a good catalyst for inducing conceptual change in learners, especially the five who had considerable ownership in the software development. Implications for classroom teaching strategies and suggestions for future research are offered.

  3. Experience with automatic, dynamic load balancing and adaptive finite element computation

    SciTech Connect

    Wheat, S.R.; Devine, K.D.; Maccabe, A.B.

    1993-10-01

    Distributed memory, Massively Parallel (MP), MIMD technology has enabled the development of applications requiring computational resources previously unobtainable. Structural mechanics and fluid dynamics applications, for example, are often solved by finite element methods (FEMs) requiring millions of degrees of freedom to accurately simulate physical phenomena. Adaptive methods, which automatically refine or coarsen meshes and vary the order of accuracy of the numerical solution, offer greater robustness and computational efficiency than traditional FEMs by reducing the amount of computation required away from physical structures such as shock waves and boundary layers. On MP computers, FEMs frequently result in distributed processor load imbalances. To overcome load imbalance, many MP FEMs use static load balancing as a preprocessor to the finite element calculation. Adaptive methods complicate the load imbalance problem since the work per element is not uniform across the solution domain and changes as the computation proceeds. Therefore, dynamic load balancing is required to maintain global load balance. We describe a dynamic, fine-grained, element-based data migration system that maintains global load balance and is effective in the presence of changing work loads. Global load balance is achieved by overlapping neighborhoods of processors, where each neighborhood performs local load balancing. The method utilizes an automatic element management system library to which a programmer integrates the application's computational description. The library's flexibility supports a large class of finite element and finite difference based applications.
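
    The neighborhood-based balancing idea (each processor exchanges load only with its neighbors, and overlapping neighborhoods propagate balance globally) can be sketched as a simple diffusion scheme; the step size, topology, and update rule below are illustrative assumptions, not the paper's migration algorithm:

```python
def diffuse_load(loads, neighbors, alpha=0.5, rounds=10):
    """Local diffusion balancing: each processor shifts a fraction of
    its load difference toward lighter neighbors. Because neighborhoods
    overlap, repeated local exchanges flatten the global distribution."""
    loads = list(loads)
    for _ in range(rounds):
        new = loads[:]
        for p, nbrs in neighbors.items():
            for q in nbrs:
                # transfer is conservative: what leaves p arrives at q
                delta = alpha * (loads[p] - loads[q]) / (len(nbrs) + 1)
                new[p] -= delta
                new[q] += delta
        loads = new
    return loads
```

    In the adaptive-FEM setting, "load" would be a per-processor work estimate and the transfers would correspond to migrating elements between neighboring processors.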

  4. UV light source adaptive sensing technology for flue gas measurement

    NASA Astrophysics Data System (ADS)

    Sun, Changku; Zhang, Chi; Sun, Bo; Liu, Bin; Wang, Peng

    2010-11-01

    The UV absorption spectrometry technique DOAS (Differential Optical Absorption Spectroscopy) has been widely used in continuous monitoring of flue gas and has achieved good results. The DOAS method is based on the basic law of light absorption, the Lambert-Beer law. SO2 and NOx are the principal components of flue gas; the DOAS method considers these components simultaneously and applies mathematical methods to measure their concentrations. Continuous Emission Monitoring Systems (CEMS) based on the DOAS principle currently come in two probe styles: in-situ and extractive. For an in-situ CEMS based on the DOAS method, prolonged use of the UV light source, lenses contaminated by floating oil, and the complex environment of the flue all attenuate the spectral intensity, which affects the accuracy of measurement. In this article, an in-situ continuous monitoring system based on the DOAS method is described, and a component adaptive sensing technology is proposed. Using this adaptive sensing technology, the CEMS can adjust the integration time of the spectrometer according to the non-measurement attenuation of the light source intensity and automatically compensate the loss of spectral intensity. Under laboratory conditions, experiments measuring SO2 and NO standard gases with the adaptive sensing technology were performed, considering many different levels of light intensity attenuation. The results show that the adaptive sensing technology can well compensate the non-measurement loss of spectral intensity. In field measurement, this technology can greatly reduce the measurement error caused by attenuation of light intensity; compared with a handheld gas analyzer, the average error of concentration measurement is less than 2% FS (full scale).
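
    The measurement principle reduces to the Lambert-Beer law, and the adaptive compensation amounts to rescaling the spectrometer's integration time as the received intensity decays; both helpers below are hypothetical sketches (names, target level, and limits are assumptions), not the system's actual algorithm:

```python
import numpy as np

def doas_concentration(I, I0, sigma_diff, path_length):
    """Lambert-Beer inversion: A = ln(I0 / I) = sigma * c * L,
    so the gas concentration is c = A / (sigma * L)."""
    absorbance = np.log(I0 / I)
    return absorbance / (sigma_diff * path_length)

def adjust_integration_time(t_current, peak_counts,
                            target_counts=0.8 * 65535,
                            t_min=1.0, t_max=500.0):
    """Scale the spectrometer integration time so the peak signal stays
    near a target level despite lamp ageing or lens contamination."""
    scale = target_counts / max(peak_counts, 1.0)
    return float(np.clip(t_current * scale, t_min, t_max))
```

    Compensating in integration time rather than in software gain keeps the detected spectrum in the detector's linear range, which is why the concentration retrieval remains accurate as the source degrades.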

  5. Improving student retention in computer engineering technology

    NASA Astrophysics Data System (ADS)

    Pierozinski, Russell Ivan

    The purpose of this research project was to improve student retention in the Computer Engineering Technology program at the Northern Alberta Institute of Technology by reducing the number of dropouts and increasing the graduation rate. This action research project utilized a mixed methods approach of a survey and face-to-face interviews. The participants were male and female, with a large majority ranging from 18 to 21 years of age. The research found that participants recognized their skills and capability, but their capacity to remain in the program was dependent on understanding and meeting the demanding pace and rigour of the program. The participants recognized that curriculum delivery along with instructor-student interaction had an impact on student retention. To be successful in the program, students required support in four domains: academic, learning management, career, and social.

  6. [Computer technology in clinical preventive medicine].

    PubMed

    Okada, M

    1990-12-01

    Predicting who will suffer from diseases in the future is basically mathematical work. Current computer technology will accelerate the progress of preventive medicine. In this respect, there are two useful tools for the research. First, long-term archiving of health-care information is valuable in a retrospective study, such as the determination of diagnostic criteria for prediction. Health-care information here includes past history, laboratory data, and dietary habits. Using such criteria, potential patients can be discriminated from truly healthy persons. Second, prediction is successfully carried out on the basis of mathematical equations which represent the relationship between disease status and health-care information. In conclusion, technology for database management and mathematical modelling is essential for the basic study in preventive medicine. PMID:2082030
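
    One common form for such equations relating disease status to health-care information is a logistic model; the sketch below is a generic illustration (the feature names and weights are hypothetical and would in practice be fitted to archived records):

```python
import math

def risk_score(features, weights, bias):
    """Logistic model: map health-care measurements (e.g. lab values,
    dietary indices) to a probability of future disease."""
    z = bias + sum(w * x for w, x in zip(weights, features))
    return 1.0 / (1.0 + math.exp(-z))
```

    Thresholding such a score is one way "potential patients" can be discriminated from truly healthy persons in a screening setting.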

  7. Adaptive Optics Technology for High-Resolution Retinal Imaging

    PubMed Central

    Lombardo, Marco; Serrao, Sebastiano; Devaney, Nicholas; Parravano, Mariacristina; Lombardo, Giuseppe

    2013-01-01

    Adaptive optics (AO) is a technology used to improve the performance of optical systems by reducing the effects of optical aberrations. The direct visualization of the photoreceptor cells, capillaries and nerve fiber bundles represents the major benefit of adding AO to retinal imaging. Adaptive optics is opening a new frontier for clinical research in ophthalmology, providing new information on the early pathological changes of the retinal microstructures in various retinal diseases. We have reviewed AO technology for retinal imaging, providing information on the core components of an AO retinal camera. The most commonly used wavefront sensing and correcting elements are discussed. Furthermore, we discuss current applications of AO imaging to a population of healthy adults and to the most frequent causes of blindness, including diabetic retinopathy, age-related macular degeneration and glaucoma. We conclude our work with a discussion on future clinical prospects for AO retinal imaging. PMID:23271600

  8. The Contextualized Technology Adaptation Process (CTAP): Optimizing Health Information Technology to Improve Mental Health Systems.

    PubMed

    Lyon, Aaron R; Wasse, Jessica Knaster; Ludwig, Kristy; Zachry, Mark; Bruns, Eric J; Unützer, Jürgen; McCauley, Elizabeth

    2016-05-01

    Health information technologies have become a central fixture in the mental healthcare landscape, but few frameworks exist to guide their adaptation to novel settings. This paper introduces the contextualized technology adaptation process (CTAP) and presents data collected during Phase 1 of its application to measurement feedback system development in school mental health. The CTAP is built on models of human-centered design and implementation science and incorporates repeated mixed methods assessments to guide the design of technologies to ensure high compatibility with a destination setting. CTAP phases include: (1) Contextual evaluation, (2) Evaluation of the unadapted technology, (3) Trialing and evaluation of the adapted technology, (4) Refinement and larger-scale implementation, and (5) Sustainment through ongoing evaluation and system revision. Qualitative findings from school-based practitioner focus groups are presented, which provided information for CTAP Phase 1, contextual evaluation, surrounding education sector clinicians' workflows, types of technologies currently available, and influences on technology use. Discussion focuses on how findings will inform subsequent CTAP phases, as well as their implications for future technology adaptation across content domains and service sectors. PMID:25677251

  9. Making Effective Use of Computer Technology.

    ERIC Educational Resources Information Center

    Ornstein, Allan C.

    1992-01-01

    Six computer applications in education are word processing, computer-assisted instruction, computer-aided design, computer authoring systems, computer data systems, and computer storage. Computers may assist students with three learning stages: acquisition, transformation, and evaluation of information. Advances in computer programming, software,…

  10. Advancements in adaptive aerodynamic technologies for airfoils and wings

    NASA Astrophysics Data System (ADS)

    Jepson, Jeffrey Keith

    required for the airfoil-aircraft matching. Examples are presented to illustrate the flapped-airfoil design approach for a general aviation aircraft and the results are validated by comparison with results from post-design aircraft performance computations. Once the airfoil is designed to incorporate a TE flap, it is important to determine the most suitable flap angles along the wing for different flight conditions. The second part of this dissertation presents a method for determining the optimum flap angles to minimize drag based on pressures measured at select locations on the wing. Computational flow simulations using a panel method are used "in the loop" for demonstrating closed-loop control of the flaps. Examples in the paper show that the control algorithm is successful in correctly adapting the wing to achieve the target lift distributions for minimizing induced drag while adjusting the wing angle of attack for operation of the wing in the drag bucket. It is shown that the "sense-and-adapt" approach developed is capable of handling varying and unpredictable inflow conditions. Such a capability could be useful in adapting long-span flexible wings that may experience significant and unknown atmospheric inflow variations along the span. To further develop the "sense-and-adapt" approach, the method was tested experimentally in the third part of the research. The goal of the testing was to see if the same results found computationally could be obtained experimentally. The North Carolina State University subsonic wind tunnel was used for these tests. Results from the testing showed that the "sense-and-adapt" approach has the same performance experimentally as it did computationally. The research presented in this dissertation is a stepping stone towards further development of the concept, which includes modeling the system in the Simulink environment and flight experiments using uninhabited aerial vehicles.
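
    The closed-loop "sense-and-adapt" idea (nudging each trailing-edge flap toward a target sectional lift inferred from measured pressures) can be sketched as simple proportional feedback; the gain, travel limit, and interface below are illustrative assumptions, not the dissertation's control law:

```python
def update_flaps(flaps, measured_cl, target_cl, gain=2.0, limit=15.0):
    """One step of a sense-and-adapt loop: deflect each flap (degrees)
    in proportion to its sectional lift error, clamped to travel limits."""
    return [max(-limit, min(limit, f + gain * (t - m)))
            for f, m, t in zip(flaps, measured_cl, target_cl)]
```

    Because each station is corrected independently from local measurements, a loop like this needs no model of the inflow, which is what makes the approach attractive for unpredictable spanwise gust variations.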

  11. The Feasibility of Adaptive Unstructured Computations On Petaflops Systems

    NASA Technical Reports Server (NTRS)

    Biswas, Rupak; Oliker, Leonid; Heber, Gerd; Gao, Guang; Saini, Subhash (Technical Monitor)

    1999-01-01

    This viewgraph presentation covers the advantages of mesh adaptation, unstructured grids, and dynamic load balancing. It illustrates parallel adaptive communications, and explains PLUM (Parallel dynamic load balancing for adaptive unstructured meshes), and PSAW (Proper Self Avoiding Walks).

  12. Authoring of Adaptive Computer Assisted Assessment of Free-Text Answers

    ERIC Educational Resources Information Center

    Alfonseca, Enrique; Carro, Rosa M.; Freire, Manuel; Ortigosa, Alvaro; Perez, Diana; Rodriguez, Pilar

    2005-01-01

    Adaptation techniques can be applied not only to the multimedia contents or navigational possibilities of a course, but also to the assessment. In order to facilitate the authoring of adaptive free-text assessment and its integration within adaptive web-based courses, Adaptive Hypermedia techniques and Free-text Computer Assisted Assessment are…

  13. Learners' Perceptions and Illusions of Adaptivity in Computer-Based Learning Environments

    ERIC Educational Resources Information Center

    Vandewaetere, Mieke; Vandercruysse, Sylke; Clarebout, Geraldine

    2012-01-01

    Research on computer-based adaptive learning environments has shown exemplary growth. Although the mechanisms of effective adaptive instruction are unraveled systematically, little is known about the relative effect of learners' perceptions of adaptivity in adaptive learning environments. As previous research has demonstrated that the learners'…

  14. Teaching with Technology: The Classroom Manager. Cost-Conscious Computing.

    ERIC Educational Resources Information Center

    Smith, Rhea; And Others

    1992-01-01

    Teachers discuss how to make the most of technology in the classroom during a tight economy. Ideas include recycling computer printer ribbons, buying replacement batteries for computer power supply packs, upgrading via software, and soliciting donated computer equipment. (SM)

  15. Adopt or Adapt: Sanitation Technology Choices in Urbanizing Malawi.

    PubMed

    Chunga, Richard M; Ensink, Jeroen H J; Jenkins, Marion W; Brown, Joe

    2016-01-01

    This paper presents the results of a mixed-methods study examining adaptation strategies that property owners in low-income, rapidly urbanizing areas in Malawi adopt to address the limitations of pit latrines, the most common method of disposing human excreta. A particular challenge is lack of space for constructing new latrines as population density increases: traditional practice has been to cap full pits and simply move to a new site, but increasing demands on space require new approaches to extend the service life of latrines. In this context, we collected data on sanitation technology choices from January to September 2013 through 48 in-depth interviews and a stated preference survey targeting 1,300 property owners from 27 low-income urban areas. Results showed that property owners with concern about space for replacing pit latrines were 1.8 times more likely to select pit emptying service over the construction of new pit latrines with a slab floor (p = 0.02) but there was no significant association between concern about space for replacing pit latrines and intention to adopt locally promoted, novel sanitation technology known as ecological sanitation (ecosan). Property owners preferred to adapt existing, known technology by constructing replacement pit latrines on old pit latrine locations, reducing the frequency of replacing pit latrines, or via emptying pit latrines when full. This study highlights potential challenges to adoption of wholly new sanitation technologies, even when they present clear advantages to end users. To scale, alternative sanitation technologies for rapidly urbanising cities should offer clear advantages, be affordable, be easy to use when shared among multiple households, and their design should be informed by existing adaptation strategies and local knowledge. PMID:27532871

  17. Evaluating Computer Technology Integration in a Centralized School System

    ERIC Educational Resources Information Center

    Eteokleous, N.

    2008-01-01

    The study evaluated the current situation in Cyprus elementary classrooms regarding computer technology integration in an attempt to identify ways of expanding teachers' and students' experiences with computer technology. It examined how Cypriot elementary teachers use computers, and the factors that influence computer integration in their…

  18. A Multiscale Computational Framework to Understand Vascular Adaptation

    PubMed Central

    Garbey, Marc; Rahman, Mahbubur; Berceli, Scott A.

    2015-01-01

    The failure rate for vascular interventions (vein bypass grafting, arterial angioplasty/stenting) remains unacceptably high. Over the past two decades, researchers have applied a wide variety of approaches to investigate the primary failure mechanisms, neointimal hyperplasia and aberrant remodeling of the wall, in an effort to identify novel therapeutic strategies. Despite incremental progress, specific cause/effect linkages among the primary drivers of the pathology (hemodynamic factors, inflammatory biochemical mediators, cellular effectors) and the vascular occlusive phenotype remain lacking. We propose a multiscale computational framework of vascular adaptation to develop a bridge between theory and experimental observation and to provide a method for the systematic testing of relevant clinical hypotheses. The cornerstone of our model is a feedback mechanism between environmental conditions and dynamic tissue plasticity described at the cellular level with an agent-based model. Our implementation (i) is modular, (ii) starts from basic mechano-biology principles at the cell level, and (iii) facilitates the agile development of the model. PMID:25977733

  19. Computation of Transient Nonlinear Ship Waves Using AN Adaptive Algorithm

    NASA Astrophysics Data System (ADS)

    Çelebi, M. S.

    2000-04-01

    An indirect boundary integral method is used to solve transient nonlinear ship wave problems. A resulting mixed boundary value problem is solved at each time-step using a mixed Eulerian-Lagrangian time integration technique. Two dynamic node allocation techniques, which basically distribute nodes on an ever-changing body surface, are presented. Both two-sided hyperbolic tangent and variational grid generation algorithms are developed and compared on station curves. A ship hull form is generated in parametric space using a B-spline surface representation. Two-sided hyperbolic tangent and variational adaptive curve grid-generation methods are then applied on the hull station curves to generate effective node placement. The numerical algorithm in the first method uses two stretching parameters. In the second method, a conservative form of the parametric variational Euler-Lagrange equations is used to perform an adaptive gridding on each station. The resulting unsymmetrical influence coefficient matrix is solved using both a restarted version of GMRES based on the modified Gram-Schmidt procedure and a line Jacobi method based on LU decomposition. The convergence rates of both matrix iteration techniques are improved with specially devised preconditioners. Numerical examples of node placements on typical hull cross-sections using both techniques are discussed, and fully nonlinear ship wave patterns and wave resistance computations are presented.
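
The two-sided hyperbolic tangent stretching mentioned in the abstract can be sketched in a few lines. The formula below is a common textbook (Vinokur-style) form with a single stretching parameter, not necessarily the exact two-parameter variant used in the paper:

```python
import math

def tanh_two_sided(n, delta):
    """Place n + 1 nodes on [0, 1], clustered toward both endpoints.

    delta > 0 is the stretching parameter: larger values pack more
    nodes near 0 and 1 (illustrative parameterization).
    """
    half = math.tanh(delta / 2.0)
    return [0.5 * (1.0 + math.tanh(delta * (i / n - 0.5)) / half)
            for i in range(n + 1)]

nodes = tanh_two_sided(10, 4.0)
# spacing near the endpoints is tighter than near the middle
```

The map is monotone and symmetric about 0.5, so a uniform parameter grid is compressed toward both ends of each station curve.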

  20. Configurable multiplier modules for an adaptive computing system

    NASA Astrophysics Data System (ADS)

    Pfänder, O. A.; Pfleiderer, H.-J.; Lachowicz, S. W.

    2006-09-01

    The importance of reconfigurable hardware is increasing steadily. For example, the primary approach of using adaptive systems based on programmable gate arrays and configurable routing resources has gone mainstream, and high-performance programmable logic devices are rivaling traditional application-specific hardwired integrated circuits. Also, the idea of moving from the 2-D domain into a 3-D design which stacks several active layers above each other is gaining momentum in research and industry, to cope with the demand for smaller devices with a higher scale of integration. However, optimized arithmetic blocks in coarse-grain reconfigurable arrays as well as field-programmable architectures still play an important role. In countless digital systems and signal processing applications, multiplication is one of the critical challenges, where in many cases a trade-off between area usage and data throughput has to be made. But the a priori choice of word-length and number representation can also be replaced by a dynamic choice at run-time, in order to improve flexibility, area efficiency and the level of parallelism in computation. In this contribution, we look at an adaptive computing system called 3-D-SoftChip to point out what parameters are crucial to implement flexible multiplier blocks as optimized elements for accelerated processing. The 3-D-SoftChip architecture uses a novel approach to 3-dimensional integration based on flip-chip bonding with indium bumps. The modular construction, the introduction of interfaces to realize the exchange of intermediate data, and the reconfigurable sign handling approach will be explained, as well as a beneficial way to handle and distribute the numerous required control signals.
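
The run-time choice of word length and number representation can be modeled in software. The sketch below (illustrative only, not the 3-D-SoftChip datapath) reinterprets the same raw operand bits as either unsigned or two's-complement values before multiplying:

```python
def to_signed(value, bits):
    """Reinterpret a raw bit pattern as a two's-complement integer."""
    value &= (1 << bits) - 1
    return value - (1 << bits) if value >> (bits - 1) else value

def multiply(a, b, bits, signed):
    """Multiply two operands under a run-time-selected word length and
    sign mode (a software model of reconfigurable sign handling)."""
    if signed:
        a, b = to_signed(a, bits), to_signed(b, bits)
    else:
        mask = (1 << bits) - 1
        a, b = a & mask, b & mask
    return a * b

# the same bit pattern 0b1111 means 15 unsigned but -1 signed in 4 bits
assert multiply(0b1111, 2, bits=4, signed=False) == 30
assert multiply(0b1111, 2, bits=4, signed=True) == -2
```

In hardware the mode bit would steer sign-extension logic; here it simply selects how the operand bits are decoded.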

  1. Spacecraft computer technology at Southwest Research Institute

    NASA Technical Reports Server (NTRS)

    Shirley, D. J.

    1993-01-01

    Southwest Research Institute (SwRI) has developed and delivered spacecraft computers for a number of different near-Earth-orbit spacecraft including shuttle experiments and SDIO free-flyer experiments. We describe the evolution of the basic SwRI spacecraft computer design from those weighing in at 20 to 25 lb and using 20 to 30 W to newer models weighing less than 5 lb and using only about 5 W, yet delivering twice the processing throughput. Because of their reduced size, weight, and power, these newer designs are especially applicable to planetary instrument requirements. The basis of our design evolution has been the availability of more powerful processor chip sets and the development of higher density packaging technology, coupled with more aggressive design strategies in incorporating high-density FPGA technology and use of high-density memory chips. In addition to reductions in size, weight, and power, the newer designs also address the necessity of survival in the harsh radiation environment of space. Spurred by participation in such programs as MSTI, LACE, RME, Delta 181, Delta Star, and RADARSAT, our designs have evolved in response to program demands to be small, low-powered units, radiation tolerant enough to be suitable for both Earth-orbit microsats and for planetary instruments. Present designs already include MIL-STD-1750 and Multi-Chip Module (MCM) technology with near-term plans to include RISC processors and higher-density MCM's. Long term plans include development of whole-core processors on one or two MCM's.

  2. Overview of deformable mirror technologies for adaptive optics and astronomy

    NASA Astrophysics Data System (ADS)

    Madec, P.-Y.

    2012-07-01

    From the burning mirrors said to have set fire to Roman ships during the siege of Syracuse to the more contemporary piezoelectric deformable mirrors widely used in astronomy, and from the very large voice coil deformable mirrors considered for future Extremely Large Telescopes to the very small and compact ones embedded in Multi Object Adaptive Optics systems, this paper aims to give an overview of Deformable Mirror technology for Adaptive Optics and Astronomy. First, the main drivers for the design of Deformable Mirrors are recalled, related not only to atmospheric aberration compensation but also to environmental conditions and mechanical constraints. Then the different technologies available today for the manufacturing of Deformable Mirrors will be described, and their pros and cons analyzed. A review of the companies and institutes with the capability to deliver Deformable Mirrors to astronomers will be presented, as well as lessons learned from the past 25 years of technological development and operation on sky. In conclusion, perspectives will be tentatively drawn regarding the future of Deformable Mirror technology for Astronomy.

  3. Computer-aided design and computer science technology

    NASA Technical Reports Server (NTRS)

    Fulton, R. E.; Voigt, S. J.

    1976-01-01

    A description is presented of computer-aided design requirements and the resulting computer science advances needed to support aerospace design. The aerospace design environment is examined, taking into account problems of data handling and aspects of computer hardware and software. The interactive terminal is normally the primary interface between the computer system and the engineering designer. Attention is given to user aids, interactive design, interactive computations, the characteristics of design information, data management requirements, hardware advancements, and computer science developments.

  4. Computer vision research with new imaging technology

    NASA Astrophysics Data System (ADS)

    Hou, Guangqi; Liu, Fei; Sun, Zhenan

    2015-12-01

    Light field imaging is capable of capturing dense multi-view 2D images in one snapshot, recording both the intensity values and directions of rays simultaneously. As an emerging 3D device, the light field camera has been widely used in digital refocusing, depth estimation, stereoscopic display, etc. Traditional multi-view stereo (MVS) methods only perform well on strongly textured surfaces, while the depth map contains numerous holes and large ambiguities on textureless or low-textured regions. In this paper, we apply light field imaging technology to 3D face modeling in computer vision. Based on a 3D morphable model, we estimate the pose parameters from facial feature points. Then the depth map is estimated through the epipolar plane images (EPIs) method. Finally, a high-quality 3D face model is recovered via a fusion strategy. We evaluate the effectiveness and robustness on face images captured by a light field camera with different poses.

  5. Role of computer technology in neurosurgery.

    PubMed

    Abdelwahab, M G; Cavalcanti, D D; Preul, M C

    2010-08-01

    In the clinical office, during surgical planning, or in the operating room, neurosurgeons have been surrounded by the digital world, either recreating old tools or introducing new ones. Technological refinements, chiefly based on the use of computer systems, have altered the modus operandi of neurosurgery. In the emergency room or in the office, patient data are entered, digitally dictated, or gathered from electronic medical records. Images from every modality can be examined on a Picture Archiving and Communication System (PACS) or can be seen remotely on cell phones. Surgical planning is based on high-resolution reconstructions, and microsurgical or radiosurgical approaches can be assessed precisely using stereotaxy. Tumor resection, abscess or hematoma evacuation, or the management of vascular lesions can be assisted intraoperatively by new imaging resources integrated into the surgical microscope. Mathematical models can dictate how a lesion may recur as well as how often a particular patient should be followed. Finally, virtual reality is being developed as a training tool for residents and surgeons by preoperatively simulating complex surgical scenarios. Altogether, digital technology has touched each level of patient care, helping to enhance the safety of procedures and thereby improve the outcomes of patients undergoing neurosurgical procedures. PMID:20802430

  6. Adaptive finite element simulation of flow and transport applications on parallel computers

    NASA Astrophysics Data System (ADS)

    Kirk, Benjamin Shelton

    The subject of this work is the adaptive finite element simulation of problems arising in flow and transport applications on parallel computers. Of particular interest are new contributions to adaptive mesh refinement (AMR) in this parallel high-performance context, including novel work on data structures, treatment of constraints in a parallel setting, generality and extensibility via object-oriented programming, and the design/implementation of a flexible software framework. This technology and software capability then enables more robust, reliable treatment of multiscale--multiphysics problems and specific studies of fine scale interaction such as those in biological chemotaxis (Chapter 4) and high-speed shock physics for compressible flows (Chapter 5). The work begins by presenting an overview of key concepts and data structures employed in AMR simulations. Of particular interest is how these concepts are applied in the physics-independent software framework which is developed here and is the basis for all the numerical simulations performed in this work. This open-source software framework has been adopted by a number of researchers in the U.S. and abroad for use in a wide range of applications. The dynamic nature of adaptive simulations poses particular issues for efficient implementation on distributed-memory parallel architectures. Communication cost, computational load balance, and memory requirements must all be considered when developing adaptive software for this class of machines. Specific extensions to the adaptive data structures to enable implementation on parallel computers are therefore considered in detail. The libMesh framework for performing adaptive finite element simulations on parallel computers is developed to provide a concrete implementation of the above ideas. This physics-independent framework is applied to two distinct flow and transport applications classes in the subsequent application studies to illustrate the flexibility of the
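
The flag-and-refine cycle at the heart of AMR can be illustrated with a toy 1-D pass (purely illustrative; libMesh's real data structures and API are far richer): any cell whose error estimate exceeds a tolerance is split in half, and the pass repeats until no cell is flagged.

```python
def refine_1d(cells, error_fn, tol):
    """One flag-and-refine pass: split every cell (a, b) whose
    estimated error exceeds tol into two children."""
    out = []
    for a, b in cells:
        if error_fn(a, b) > tol:
            mid = 0.5 * (a + b)
            out += [(a, mid), (mid, b)]
        else:
            out.append((a, b))
    return out

# refine until the (toy) per-cell error, here just the cell width,
# drops below the tolerance
cells = [(0.0, 1.0)]
width = lambda a, b: b - a
while any(width(a, b) > 0.3 for a, b in cells):
    cells = refine_1d(cells, width, 0.3)
# the mesh ends up with 4 uniform cells of width 0.25
```

In a real parallel AMR code the interesting part is what this sketch omits: repartitioning the refined cells across processors and constraining hanging degrees of freedom at refinement boundaries.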

  7. Computer Utilization in Industrial Arts/Technology Education. Curriculum Guide.

    ERIC Educational Resources Information Center

    Connecticut Industrial Arts Association.

    This guide is intended to assist industrial arts/technology education teachers in helping students in grades K-12 understand the impact of computers and computer technology in the world. Discussed in the introductory sections are the ways in which computers have changed the face of business, industry, and education and training; the scope and…

  8. Computer Science and Technology Publications. NBS Publications List 84.

    ERIC Educational Resources Information Center

    National Bureau of Standards (DOC), Washington, DC. Inst. for Computer Sciences and Technology.

    This bibliography lists publications of the Institute for Computer Sciences and Technology of the National Bureau of Standards. Publications are listed by subject in the areas of computer security, computer networking, and automation technology. Sections list publications of: (1) current Federal Information Processing Standards; (2) computer…

  9. The Invisible Barrier to Integrating Computer Technology in Education

    ERIC Educational Resources Information Center

    Aflalo, Ester

    2014-01-01

    The article explores contradictions in teachers' perceptions regarding the place of computer technologies in education. The research population included 47 teachers who have incorporated computers in the classroom for several years. The teachers expressed positive attitudes regarding the decisive importance of computer technologies in furthering…

  10. A Wafer Transfer Technology for MEMS Adaptive Optics

    NASA Technical Reports Server (NTRS)

    Yang, Eui-Hyeok; Wiberg, Dean V.

    2001-01-01

    Adaptive optics systems require the combination of several advanced technologies such as precision optics, wavefront sensors, deformable mirrors, and lasers with high-speed control systems. The deformable mirror with a continuous membrane is a key component of these systems. This paper describes a new technique for transferring an entire wafer-level silicon membrane from one substrate to another. This technology is developed for the fabrication of a compact deformable mirror with a continuous facet. A 1 µm thick silicon membrane, 100 mm in diameter, has been successfully transferred without using adhesives or polymers (i.e. wax, epoxy, or photoresist). Smaller or larger diameter membranes can also be transferred using this technique. The fabricated actuator membrane with an electrode gap of 1.5 µm shows a vertical deflection of 0.37 µm at 55 V.

  11. Computing, Information, and Communications Technology (CICT) Program Overview

    NASA Technical Reports Server (NTRS)

    VanDalsem, William R.

    2003-01-01

    The Computing, Information and Communications Technology (CICT) Program's goal is to enable NASA's Scientific Research, Space Exploration, and Aerospace Technology Missions with greater mission assurance, for less cost, with increased science return through the development and use of advanced computing, information and communication technologies

  12. Adaptive Fault Tolerance for Many-Core Based Space-Borne Computing

    NASA Technical Reports Server (NTRS)

    James, Mark; Springer, Paul; Zima, Hans

    2010-01-01

    This paper describes an approach to providing software fault tolerance for future deep-space robotic NASA missions, which will require a high degree of autonomy supported by an enhanced on-board computational capability. Such systems have become possible as a result of the emerging many-core technology, which is expected to offer 1024-core chips by 2015. We discuss the challenges and opportunities of this new technology, focusing on introspection-based adaptive fault tolerance that takes into account the specific requirements of applications, guided by a fault model. Introspection supports runtime monitoring of the program execution with the goal of identifying, locating, and analyzing errors. Fault tolerance assertions for the introspection system can be provided by the user, domain-specific knowledge, or via the results of static or dynamic program analysis. This work is part of an on-going project at the Jet Propulsion Laboratory in Pasadena, California.

  13. 3D-SoftChip: A Novel Architecture for Next-Generation Adaptive Computing Systems

    NASA Astrophysics Data System (ADS)

    Kim, Chul; Rassau, Alex; Lachowicz, Stefan; Lee, Mike Myung-Ok; Eshraghian, Kamran

    2006-12-01

    This paper introduces a novel architecture for next-generation adaptive computing systems, which we term 3D-SoftChip. The 3D-SoftChip is a 3-dimensional (3D) vertically integrated adaptive computing system combining state-of-the-art processing and 3D interconnection technology. It comprises the vertical integration of two chips (a configurable array processor and an intelligent configurable switch) through an indium bump interconnection array (IBIA). The configurable array processor (CAP) is an array of heterogeneous processing elements (PEs), while the intelligent configurable switch (ICS) comprises a switch block, 32-bit dedicated RISC processor for control, on-chip program/data memory, data frame buffer, along with a direct memory access (DMA) controller. This paper introduces the novel 3D-SoftChip architecture for real-time communication and multimedia signal processing as a next-generation computing system. The paper further describes the advanced HW/SW codesign and verification methodology, including high-level system modeling of the 3D-SoftChip using SystemC, being used to determine the optimum hardware specification in the early design stage.

  14. Cloud Computing. Technology Briefing. Number 1

    ERIC Educational Resources Information Center

    Alberta Education, 2013

    2013-01-01

    Cloud computing is Internet-based computing in which shared resources, software and information are delivered as a service that computers or mobile devices can access on demand. Cloud computing is already used extensively in education. Free or low-cost cloud-based services are used daily by learners and educators to support learning, social…

  15. Identifying Differential Item Functioning in Multi-Stage Computer Adaptive Testing

    ERIC Educational Resources Information Center

    Gierl, Mark J.; Lai, Hollis; Li, Johnson

    2013-01-01

    The purpose of this study is to evaluate the performance of CATSIB (Computer Adaptive Testing-Simultaneous Item Bias Test) for detecting differential item functioning (DIF) when items in the matching and studied subtest are administered adaptively in the context of a realistic multi-stage adaptive test (MST). MST was simulated using a 4-item…

  16. Relationships between Computer Self-Efficacy, Technology, Attitudes and Anxiety: Development of the Computer Technology Use Scale (CTUS)

    ERIC Educational Resources Information Center

    Conrad, Agatha M.; Munro, Don

    2008-01-01

    Two studies are reported which describe the development and evaluation of a new instrument, the Computer Technology Use Scale (CTUS), comprising three domains: computer self-efficacy, attitudes to technology, and technology related anxiety. Study 1 describes the development of the instrument and explores its factor structure. Study 2 used…

  17. The Steam Engine and the Computer: What Makes Technology Revolutionary.

    ERIC Educational Resources Information Center

    Simon, Herbert A.

    1987-01-01

    This discussion of technological revolution focuses on the computer and its uses in education. Contrasts between human traits, such as insight and creativity, and computer capabilities are discussed; the computer as its own instructional device is described; and possible educational changes resulting from computers are addressed. (LRW)

  18. Wireless Adaptive Therapeutic TeleGaming in a Pervasive Computing Environment

    NASA Astrophysics Data System (ADS)

    Peters, James F.; Szturm, Tony; Borkowski, Maciej; Lockery, Dan; Ramanna, Sheela; Shay, Barbara

    This chapter introduces a wireless, pervasive computing approach to adaptive therapeutic telegaming considered in the context of near set theory. Near set theory provides a formal basis for observation, comparison and classification of perceptual granules. A perceptual granule is defined by a collection of objects that are graspable by the senses or by the mind. In the proposed pervasive computing approach to telegaming, a handicapped person (e.g., stroke patient with limited hand, finger, arm function) plays a video game by interacting with familiar instrumented objects such as cups, cutlery, soccer balls, nozzles, screw top-lids, spoons, so that the technology that makes therapeutic exercise game-playing possible is largely invisible (Archives of Physical Medicine and Rehabilitation 89:2213-2217, 2008). The basic approach to adaptive learning (AL) in the proposed telegaming environment is ethology-inspired and is quite different from the traditional approach to reinforcement learning. In biologically-inspired learning, organisms learn to achieve some goal by durable modification of behaviours in response to signals from the environment resulting from specific experiences (Animal Behavior, 1995). The term adaptive is used here in an ethological sense, where learning by an organism results from modifying behaviour in response to perceived changes in the environment. To instill adaptivity in a video game, it is assumed that learning by a video game is episodic. During an episode, the behaviour of a player is measured indirectly by tracking the occurrence of gaming events such as a hit or a miss of a target (e.g., hitting a moving ball with a game paddle). An ethogram provides a record of behaviour feature values that serves as the basis of a functional registry of handicapped players for gaming adaptivity. An important practical application of adaptive gaming is therapeutic rehabilitation exercise carried out in parallel with playing action video games. Enjoyable and
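
The episodic behaviour measurement described above, tallying gaming events such as hits and misses of a target, can be sketched as a minimal ethogram. The names and structure below are illustrative, not the authors' implementation:

```python
from collections import defaultdict

def ethogram(events):
    """Tally (episode, outcome) gaming events and return the hit rate
    per episode, the kind of behaviour record an adaptive game could
    use to adjust difficulty between episodes."""
    counts = defaultdict(lambda: {"hit": 0, "miss": 0})
    for episode, outcome in events:
        counts[episode][outcome] += 1
    return {ep: c["hit"] / (c["hit"] + c["miss"])
            for ep, c in counts.items()}

rates = ethogram([(1, "hit"), (1, "miss"), (2, "hit"), (2, "hit")])
# rates: {1: 0.5, 2: 1.0}, i.e. the player improved between episodes
```

An adaptive game could, for instance, slow the target when the latest episode's hit rate falls below a threshold.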

  19. Using Assistive Technology Adaptations To Include Students with Learning Disabilities in Cooperative Learning Activities.

    ERIC Educational Resources Information Center

    Bryant, Diane Pedrotty; Bryant, Brian R.

    1998-01-01

    Discusses a process for integrating technology adaptations for students with learning disabilities into cooperative-learning activities in terms of three components: (1) selecting adaptations; (2) monitoring use of adaptations during cooperative-learning activities; and (3) evaluating the adaptations' effectiveness. Barriers to and support systems…

  20. An adaptive replacement algorithm for paged-memory computer systems.

    NASA Technical Reports Server (NTRS)

    Thorington, J. M., Jr.; Irwin, J. D.

    1972-01-01

    A general class of adaptive replacement schemes for use in paged memories is developed. One such algorithm, called SIM, is simulated using a probability model that generates memory traces, and the results of the simulation of this adaptive scheme are compared with those obtained using the best nonlookahead algorithms. A technique for implementing this type of adaptive replacement algorithm with state-of-the-art digital hardware is also presented.
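
The abstract does not give SIM's details, but the flavour of an adaptive replacement scheme can be sketched with a toy policy (purely illustrative, not the paper's algorithm) that steers between LRU and FIFO eviction using feedback from a small "ghost" list of recently evicted pages:

```python
from collections import OrderedDict, deque

def adaptive_faults(trace, frames, ghost_size=8):
    """Count page faults for a toy adaptive policy: when a recently
    evicted page is re-referenced, penalize the policy (LRU or FIFO)
    that chose that victim, steering future evictions."""
    recency = OrderedDict()            # key order: least -> most recently used
    arrival = deque()                  # insertion (FIFO) order
    ghost = deque(maxlen=ghost_size)   # (page, policy) of recent evictions
    lean_lru = 0                       # >= 0 means "evict by LRU"
    faults = 0
    for page in trace:
        if page in recency:
            recency.move_to_end(page)  # hit: refresh recency order
            continue
        faults += 1
        for g_page, g_policy in ghost:
            if g_page == page:         # a regretted eviction
                lean_lru += 1 if g_policy == "fifo" else -1
                break
        if len(recency) == frames:     # memory full: must evict
            if lean_lru >= 0:
                victim, _ = recency.popitem(last=False)
                policy = "lru"
            else:
                victim = arrival[0]
                del recency[victim]
                policy = "fifo"
            arrival.remove(victim)
            ghost.append((victim, policy))
        recency[page] = True
        arrival.append(page)
    return faults
```

Like the paper's setup, such a policy can be compared against the best nonlookahead algorithms by counting faults over generated memory traces.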

  1. Condition Driven Adaptive Music Generation for Computer Games

    NASA Astrophysics Data System (ADS)

    Naushad, Alamgir; Muhammad, Tufail

    2013-02-01

    The video game industry has grown into a multi-billion dollar, worldwide industry. Background music can adapt to the specific game content throughout the length of the play. Adaptive music can be further explored by looking at particular conditions in the game: a given condition drives the generation of specific background music that best fits the active game content throughout the gameplay. This research paper outlines the use of condition-driven adaptive music generation for computer games.

  2. Attitudes to Technology, Perceived Computer Self-Efficacy and Computer Anxiety as Predictors of Computer Supported Education

    ERIC Educational Resources Information Center

    Celik, Vehbi; Yesilyurt, Etem

    2013-01-01

    There is a large body of research regarding computer supported education, perceptions of computer self-efficacy, computer anxiety and the technological attitudes of teachers and teacher candidates. However, no study has been conducted on the correlation between and effect of computer supported education, perceived computer self-efficacy, computer…

  3. Numerical Technology for Large-Scale Computational Electromagnetics

    SciTech Connect

    Sharpe, R; Champagne, N; White, D; Stowell, M; Adams, R

    2003-01-30

    The key bottleneck of implicit computational electromagnetics tools for large complex geometries is the solution of the resulting linear system of equations. The goal of this effort was to research and develop critical numerical technology that alleviates this bottleneck for large-scale computational electromagnetics (CEM). The mathematical operators and numerical formulations used in this arena of CEM yield linear equations that are complex valued, unstructured, and indefinite. Also, simultaneously applying multiple mathematical modeling formulations to different portions of a complex problem (hybrid formulations) results in a mixed structure linear system, further increasing the computational difficulty. Typically, these hybrid linear systems are solved using a direct solution method, which was acceptable for Cray-class machines but does not scale adequately for ASCI-class machines. Additionally, LLNL's previously existing linear solvers were not well suited for the linear systems that are created by hybrid implicit CEM codes. Hence, a new approach was required to make effective use of ASCI-class computing platforms and to enable next-generation design capabilities. Multiple approaches were investigated, including the latest sparse-direct methods developed by our ASCI collaborators. In addition, approaches that combine domain decomposition (or matrix partitioning) with general-purpose iterative methods and special-purpose preconditioners were investigated. Special-purpose preconditioners that take advantage of the structure of the matrix were adapted and developed based on intimate knowledge of the matrix properties. Finally, new operator formulations were developed that radically improve the conditioning of the resulting linear systems, thus greatly reducing solution time. The goal was to enable the solution of CEM problems that are 10 to 100 times larger than our previous capability.
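
The role a preconditioner plays is easiest to see in the simplest case: Jacobi iteration, which is Richardson iteration preconditioned by the matrix diagonal. The sketch below is a minimal stand-in for the structure-aware preconditioners and Krylov solvers the abstract describes; it converges for strictly diagonally dominant systems:

```python
def solve_jacobi(A, b, iters=100):
    """Jacobi iteration x <- x + D^{-1} (b - A x), with D = diag(A):
    the simplest diagonally preconditioned scheme."""
    n = len(b)
    x = [0.0] * n
    for _ in range(iters):
        r = [b[i] - sum(A[i][j] * x[j] for j in range(n))
             for i in range(n)]
        x = [x[i] + r[i] / A[i][i] for i in range(n)]
    return x

# a strictly diagonally dominant 3x3 system
A = [[4.0, 1.0, 0.0],
     [1.0, 5.0, 2.0],
     [0.0, 2.0, 6.0]]
b = [1.0, 0.0, 3.0]
x = solve_jacobi(A, b)
# the residual b - A x is tiny after 100 sweeps
```

For the complex, indefinite systems of CEM this scheme would stall, which is exactly why the effort described above pairs Krylov methods with preconditioners built from knowledge of the matrix structure.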

  4. An object-oriented, technology-adaptive information model

    NASA Technical Reports Server (NTRS)

    Anyiwo, Joshua C.

    1995-01-01

    The primary objective was to develop a computer information system for effectively presenting NASA's technologies to American industries, for appropriate commercialization. To this end a comprehensive information management model, applicable to a wide variety of situations, and immune to computer software/hardware technological gyrations, was developed. The model consists of four main elements: a DATA_STORE, a data PRODUCER/UPDATER_CLIENT and a data PRESENTATION_CLIENT, anchored to a central object-oriented SERVER engine. This server engine facilitates exchanges among the other model elements and safeguards the integrity of the DATA_STORE element. It is designed to support new technologies, as they become available, such as Object Linking and Embedding (OLE), on-demand audio-video data streaming with compression (such as is required for video conferencing), Worldwide Web (WWW) and other information services and browsing, fax-back data requests, presentation of information on CD-ROM, and regular in-house database management, regardless of the data model in place. The four components of this information model interact through a system of intelligent message agents which are customized to specific information exchange needs. This model is at the leading edge of modern information management models. It is independent of technological changes and can be implemented in a variety of ways to meet the specific needs of any communications situation. This summer a partial implementation of the model has been achieved. The structure of the DATA_STORE has been fully specified and successfully tested using Microsoft's FoxPro 2.6 database management system. Data PRODUCER/UPDATER and PRESENTATION architectures have been developed and also successfully implemented in FoxPro; and work has started on a full implementation of the SERVER engine. The model has also been successfully applied to a CD-ROM presentation of NASA's technologies in support of Langley Research Center's TAG

  5. Use of personal computing technology by deaf-blind individuals.

    PubMed

    Zuckerman, D

    1984-10-01

    This paper describes a system that enables deaf-blind people to work with microcomputers. The system utilizes the International Morse Code as a general communication medium. The deaf-blind person "hears" Morse code via a vibrotactile device to "see" the computer's screen. This technique enables deaf-blind individuals to receive immediate feedback from their typing and to scan the screen. This makes it possible for them to use the keyboard and screen in the same way as do seeing persons. A side benefit is that it provides a means for deaf-blind people to communicate with the sighted through a common medium. The sighted person can see the screen while the deaf-blind person feels it. Hardware cost to equip a standard personal computer with this interface is negligible. Vibrotactile Morse code is particularly viable because it can be adapted for the individual's particular tactile sensitivities. Morse-encoded tactile communication fits well in a social facilitation context for learning. It is technologically simple and standard. This work can significantly improve the quality of life for deaf-blind individuals because it provides new opportunities for communication and vocation. PMID:6239896
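As a minimal illustrative sketch (hypothetical code, not from the paper; the `MORSE` table and the `encode` and `timing_units` names are invented here), the Morse encoding and the on/off timing pattern a vibrotactile driver could play back might look like this:

```python
# Sketch: text -> Morse code -> (vibrate?, duration) pairs for a
# vibrotactile driver. Durations are in dot-time units, following
# standard Morse timing: dot = 1, dash = 3, intra-letter gap = 1,
# inter-letter gap = 3.
MORSE = {
    "A": ".-", "B": "-...", "C": "-.-.", "D": "-..", "E": ".",
    "F": "..-.", "G": "--.", "H": "....", "I": "..", "J": ".---",
    "K": "-.-", "L": ".-..", "M": "--", "N": "-.", "O": "---",
    "P": ".--.", "Q": "--.-", "R": ".-.", "S": "...", "T": "-",
    "U": "..-", "V": "...-", "W": ".--", "X": "-..-", "Y": "-.--",
    "Z": "--..",
}

def encode(text: str) -> str:
    """Encode a word as Morse code, letters separated by spaces."""
    return " ".join(MORSE[c] for c in text.upper() if c in MORSE)

def timing_units(morse: str, dot=1, dash=3, gap=1, letter_gap=3):
    """Convert Morse symbols into (vibrate, duration) pairs."""
    out = []
    for i, ch in enumerate(morse):
        if ch == ".":
            out.append((True, dot))
        elif ch == "-":
            out.append((True, dash))
        elif ch == " ":
            out.append((False, letter_gap))
            continue
        # Insert the intra-letter gap between consecutive symbols.
        if i + 1 < len(morse) and morse[i + 1] in ".-":
            out.append((False, gap))
    return out
```

Because the output is a plain list of on/off intervals, the same pattern can drive a vibrotactile actuator, a buzzer, or an LED, and the unit durations can be stretched to match an individual's tactile sensitivity, as the paper emphasizes.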

  6. Cognitive Effects with and of Computer Technology.

    ERIC Educational Resources Information Center

    Salomon, Gavriel

    1990-01-01

    Discusses the distinction between cognitive effects with computers, whereby an individual's performance is redefined and upgraded during intellectual partnership with the computer, and effects of computers, whereby such partnership leaves durable and generalizable cognitive residues. Suggests two mechanisms for affecting cognition: skill…

  7. Computers--Teaching, Technology, and Applications.

    ERIC Educational Resources Information Center

    Cocco, Anthony M.; And Others

    1995-01-01

    Includes "Managing Personality Types in the Computer Classroom" (Cocco); "External I/O Input/Output with a PC" (Fryda); "The Future of CAD/CAM Computer-Assisted Design/Computer-Assisted Manufacturing Software" (Fulton); and "Teaching Quality Assurance--A Laboratory Approach" (Wojslaw). (SK)

  8. CICT Computing, Information, and Communications Technology Program

    NASA Technical Reports Server (NTRS)

    Laufenberg, Lawrence; Tu, Eugene (Technical Monitor)

    2002-01-01

    The CICT Program is part of the NASA Aerospace Technology Enterprise's fundamental technology thrust to develop tools, processes, and technologies that enable new aerospace system capabilities and missions. The CICT Program's four key objectives are to: provide seamless access to NASA resources (including ground-, air-, and space-based distributed information technology resources) so that NASA scientists and engineers can more easily control missions, make new scientific discoveries, and design the next-generation space vehicles; provide high-rate data delivery from these assets directly to users for missions; develop goal-oriented, human-centered systems; and research, develop, and evaluate revolutionary technology.

  9. The Impact of Computer Technology on the Family.

    ERIC Educational Resources Information Center

    Bailey, Anne Wiseman

    1982-01-01

    Computer technology as it affects home and family life is explored. Elements of this technology which are examined include electronic fund transfers (EFT), consumer rights and responsibilities relating to use of EFT, working at home via computer, housing design, costs of computerizing the home, and computerized aids for the handicapped. (CT)

  10. Strengthening Computer Technology Programs. Special Publication Series No. 49.

    ERIC Educational Resources Information Center

    McKinney, Floyd L., Comp.

    Three papers present examples of strategies used by developing institutions and historically black colleges to strengthen computer technology programs. "Promoting Industry Support in Developing a Computer Technology Program" (Albert D. Robinson) describes how the Washtenaw Community College (Ann Arbor, Michigan) Electrical/Electronics Department…

  11. Preschool Children. Tech Use Guide: Using Computer Technology.

    ERIC Educational Resources Information Center

    Council for Exceptional Children, Reston, VA. Center for Special Education Technology.

    One of nine brief guides for special educators on using computer technology, this guide focuses on uses with preschool children with either mild to severe disabilities. Especially noted is the ability of the computer to provide access to environmental experiences otherwise inaccessible to the young handicapped child. Appropriate technology for…

  12. Guide for Teachers. Tech Use Guide: Using Computer Technology.

    ERIC Educational Resources Information Center

    Council for Exceptional Children, Reston, VA. Center for Special Education Technology.

    One of nine brief guides for special educators on using computer technology, this guide is specifically directed to special education teachers and encourages them to become critical consumers of technology to ensure its most effective use. Brief descriptions of the various uses of the computer in the school setting--as an instructional tool, as an…

  13. Contactless thin adaptive mirror technology: past, present, and future

    NASA Astrophysics Data System (ADS)

    Biasi, Roberto; Gallieni, Daniele; Salinari, Piero; Riccardi, Armando; Mantegazza, Paolo

    2010-07-01

    The contactless, voice coil motor adaptive mirror technology originated with an idea by Piero Salinari in 1993. This idea has progressively evolved into real systems thanks to a fruitful collaboration involving Italian research institutes (INAF - Osservatorio Astrofisico di Arcetri and the Aerospace Department of Politecnico di Milano) and small Italian enterprises (Microgate and ADS). Collaboration between research institutions and industry is still very much in place, but the technology has now left the initial R&D phase, reaching a stage in which entire projects are managed by the industrial entities. In this paper we present the baseline concept and its evolution, describing the main progress milestones. These are paced by the actual implementation of this idea in real systems, from MMT to LBT, Magellan, VLT, GMT and E-ELT. The fundamental concept and layout have remained unchanged through this evolution, maintaining their intrinsic advantages: tolerance to actuator failures, mechanical de-coupling and relaxed tolerances between the correcting mirror and reference structure, large stroke, and hysteresis-free behavior. Moreover, this concept has proven its scalability to very large systems with thousands of controlled degrees of freedom. Notwithstanding the solidity of the fundamentals, the implementation has evolved considerably from the beginning in order to deal with the dimensional, power, maintainability and reliability constraints imposed by the increased size of the targeted systems.

  14. Cutting Technology Costs with Refurbished Computers

    ERIC Educational Resources Information Center

    Dessoff, Alan

    2010-01-01

    Many district administrators are finding that they can save money on computers by buying preowned ones instead of new ones. The practice has other benefits as well: It allows districts to give more computers to more students who need them, and it also promotes good environmental practices by keeping the machines out of landfills, where they…

  15. Computers and Autistic Learners: An Evolving Technology.

    ERIC Educational Resources Information Center

    Hedbring, Charles

    1985-01-01

    A research and demonstration computer center for severely handicapped autistic children, STEPPE-Lab, which uses computers as an augmentative communication and instructional system, is described. The article first reviews the keyboard, joystick, mouse, and drawing tablet as augmentative devices for helping communication disordered children interact…

  16. Emerging Uses of Computer Technology in Qualitative Research.

    ERIC Educational Resources Information Center

    Parker, D. Randall

    The application of computer technology in qualitative research and evaluation ranges from simple word processing to doing sophisticated data sorting and retrieval. How computer software can be used for qualitative research is discussed. Researchers should consider the use of computers in data analysis in light of their own familiarity and comfort…

  17. Fostering an Informal Learning Community of Computer Technologies at School

    ERIC Educational Resources Information Center

    Xiao, Lu; Carroll, John M.

    2007-01-01

    Computer technologies develop at a challenging fast pace. Formal education should not only teach students basic computer skills to meet current computer needs, but also foster student development of informal learning ability for a lifelong learning process. On the other hand, students growing up in the digital world are often more skilled with…

  18. A Framework for the Development of Context-Adaptable User Interfaces for Ubiquitous Computing Systems.

    PubMed

    Varela, Gervasio; Paz-Lopez, Alejandro; Becerra, Jose A; Duro, Richard

    2016-01-01

    This paper addresses the problem of developing user interfaces for Ubiquitous Computing (UC) and Ambient Intelligence (AmI) systems. These kinds of systems are expected to provide a natural user experience, considering interaction modalities adapted to the user's abilities and preferences and using whatever interaction devices are present in the environment. These interaction devices are not necessarily known at design time. The task is quite complicated due to the variety of devices and technologies and the diversity of scenarios, and it usually burdens the developer with the need to create many different UIs in order to cover the foreseeable user-environment combinations. Here, we propose a UI abstraction framework for UC and AmI systems that effectively improves the portability of those systems between different environments and for different users. It allows developers to design and implement a single UI capable of being deployed with different devices and modalities regardless of the physical location. PMID:27399711

  19. A Framework for the Development of Context-Adaptable User Interfaces for Ubiquitous Computing Systems

    PubMed Central

    Varela, Gervasio; Paz-Lopez, Alejandro; Becerra, Jose A.; Duro, Richard

    2016-01-01

    This paper addresses the problem of developing user interfaces for Ubiquitous Computing (UC) and Ambient Intelligence (AmI) systems. These kinds of systems are expected to provide a natural user experience, considering interaction modalities adapted to the user's abilities and preferences and using whatever interaction devices are present in the environment. These interaction devices are not necessarily known at design time. The task is quite complicated due to the variety of devices and technologies and the diversity of scenarios, and it usually burdens the developer with the need to create many different UIs in order to cover the foreseeable user-environment combinations. Here, we propose a UI abstraction framework for UC and AmI systems that effectively improves the portability of those systems between different environments and for different users. It allows developers to design and implement a single UI capable of being deployed with different devices and modalities regardless of the physical location. PMID:27399711

  20. Audit and Evaluation of Computer Security. Computer Science and Technology.

    ERIC Educational Resources Information Center

    Ruthberg, Zella G.

    This is a collection of consensus reports, each produced at a session of an invitational workshop sponsored by the National Bureau of Standards. The purpose of the workshop was to explore the state-of-the-art and define appropriate subjects for future research in the audit and evaluation of computer security. Leading experts in the audit and…

  1. Planning Computer Lessons. Tech Use Guide: Using Computer Technology.

    ERIC Educational Resources Information Center

    Council for Exceptional Children, Reston, VA. Center for Special Education Technology.

    This guide offers planning and organizing ideas for effectively using computers in classrooms that include students both with and without disabilities. The guide addresses: developing lesson plans, introducing the lesson in a way that builds motivation, providing guided and independent practice, extending the learning, and choosing software.…

  2. Computer Network Interconnection: Problems and Prospects. Computer Science & Technology Series.

    ERIC Educational Resources Information Center

    Cotton, Ira W.

    This report examines the current situation regarding the interconnection of computer networks, especially packet switched networks (PSNs). The emphasis is on identifying the barriers to interconnection and on surveying approaches to a solution, rather than on recommending any single course of action. Sufficient organizational and technical background…

  3. Impact of Computer Technology on Design and Craft Education

    ERIC Educational Resources Information Center

    Thorsteinsson, Gisli

    2014-01-01

    This research aims to answer the question, "How has the use of computer technology benefited the compulsory education system, focusing on Design and Technology?" To answer this question, it was necessary to focus on interactive whiteboards, e-portfolios and digital projectors as the main technology formats. An initial literature…

  4. Computer-Assisted Technology for the Twice Exceptional

    ERIC Educational Resources Information Center

    Rizza, Mary

    2006-01-01

    Technology helps students develop coping strategies to deal with various learning differences. Assistive technology is a common intervention provided to students with disabilities and generally varies depending on student need. Within gifted education, the use of computers and technology is concentrated on curricular applications and activities…

  5. Guided Discovery Learning with Computer-Based Simulation Games: Effects of Adaptive and Non-Adaptive Instructional Support.

    ERIC Educational Resources Information Center

    Leutner, Detlev

    1993-01-01

    System-initiated adaptive advice and learner-requested nonadaptive background information were investigated in computer simulation game experiments with 64 seventh graders, 38 college students, and 80 seventh and eighth graders in Germany. Results are discussed in terms of theories of problem solving, intelligence, memory, and information…

  6. Reviews of computing technology: Software overview

    SciTech Connect

    Hartshorn, W.R.; Johnson, A.L.

    1994-01-05

    The Savannah River Site Computing Architecture states that the site computing environment will be standards-based, data-driven, and workstation-oriented. Larger server systems deliver needed information to users in a client-server relationship. Goals of the Architecture include utilizing computing resources effectively, maintaining a high level of data integrity, developing a robust infrastructure, and storing data in such a way as to promote accessibility and usability. This document describes the current storage environment at Savannah River Site (SRS) and presents some of the problems that will be faced and strategies that are planned over the next few years.

  7. Mistaking Computers for Technology: Technology Literacy and the Digital Divide

    ERIC Educational Resources Information Center

    Amiel, Tel

    2006-01-01

    No other information and communication technology has swept the globe with greater speed than the Internet, having the potential to promote vast social, economic, and political transformations. As new technologies become available the pattern of adoption and diffusion creates disparities in access and ownership. At the most basic this gap is…

  8. Emerging Trends in Technology Education Computer Applications.

    ERIC Educational Resources Information Center

    Hazari, Sunil I.

    1993-01-01

    Graphical User Interface (GUI)--and its variant, pen computing--is rapidly replacing older types of operating environments. Despite its heavier demand for processing power, GUI has many advantages. (SK)

  9. Molecular determinants of enzyme cold adaptation: comparative structural and computational studies of cold- and warm-adapted enzymes.

    PubMed

    Papaleo, Elena; Tiberti, Matteo; Invernizzi, Gaetano; Pasi, Marco; Ranzani, Valeria

    2011-11-01

    The identification of the molecular mechanisms underlying enzyme cold adaptation is a hot topic both for fundamental research and industrial applications. In the present contribution, we review the last decades of structural and computational investigations on cold-adapted enzymes in comparison to their warm-adapted counterparts. Comparative sequence and structural studies allow the definition of a multitude of adaptation strategies. Different enzymes have evolved diverse mechanisms to adapt to low temperatures, so that a general theory for enzyme cold adaptation cannot be formulated. However, some common features can be traced in the dynamic and flexibility properties of these enzymes, as well as in their intra- and inter-molecular interaction networks. Interestingly, the current data suggest that a family-centered point of view is necessary in comparative analyses of cold- and warm-adapted enzymes. In fact, enzymes belonging to the same family or superfamily, thus sharing at least the three-dimensional fold and common features of the functional sites, have evolved similar structural and dynamic patterns to overcome the detrimental effects of low temperatures. PMID:21827423

  10. An adaptable Boolean net trainable to control a computing robot

    SciTech Connect

    Lauria, F. E.; Prevete, R.; Milo, M.; Visco, S.

    1999-03-22

    We discuss a method to implement a Hebbian rule in a Boolean neural network so as to obtain an adaptable universal control system. We start by presenting both the Boolean neural net and the Hebbian rule we have considered. We then discuss, first, the problems arising when the latter is naively implemented in a Boolean neural net and, second, the method that enables us to overcome them and the ensuing adaptable Boolean neural net paradigm. Next, we present the adaptable Boolean neural net as an intelligent control system, actually controlling a writing robot, and discuss how to train it to execute the elementary arithmetic operations on operands represented by numerals with an arbitrary number of digits.

  11. Adaptive, associative, and self-organizing functions in neural computing.

    PubMed

    Kohonen, T

    1987-12-01

    This paper contains an attempt to describe certain adaptive and cooperative functions encountered in neural networks. The approach is a compromise between biological accuracy and mathematical clarity. Two types of differential equations seem to describe the basic effects underlying the formation of these functions: the equation for the electrical activity of the neuron and the adaptation equation that describes changes in its input connectivities. Various phenomena and operations are derivable from them: clustering of activity in a laterally interconnected network; adaptive formation of feature detectors; the autoassociative memory function; and self-organized formation of ordered sensory maps. The discussion attempts to reason which functions are readily amenable to analytical modeling and which phenomena seem to ensue from the more complex interactions that take place in the brain. PMID:20523469
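The two equation types mentioned in the abstract can be sketched in a standard textbook form (an illustrative reconstruction in generic notation, not necessarily the paper's exact formulation): an activity equation for unit i, and a Hebbian-type adaptation law with an activity-dependent forgetting term for its input weights:

```latex
% Activity eta_i of neuron i, driven by inputs xi_j through weights mu_ij,
% with a nonlinear leakage/loss term gamma(eta_i):
\frac{d\eta_i}{dt} = \sum_j \mu_{ij}\,\xi_j - \gamma(\eta_i)

% Adaptation of the input connectivities (Hebbian gain minus an
% activity-dependent forgetting term beta(eta_i)):
\frac{d\mu_{ij}}{dt} = \alpha\,\eta_i\,\xi_j - \beta(\eta_i)\,\mu_{ij}
```

With lateral interconnections added to the first equation, coupled systems of this form give rise to the activity clustering, feature-detector formation, and self-organizing map behaviors listed in the abstract.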

  12. Evolving technologies for Space Station Freedom computer-based workstations

    NASA Technical Reports Server (NTRS)

    Jensen, Dean G.; Rudisill, Marianne

    1990-01-01

    Viewgraphs on evolving technologies for Space Station Freedom computer-based workstations are presented. The human-computer computer software environment modules are described. The following topics are addressed: command and control workstation concept; cupola workstation concept; Japanese experiment module RMS workstation concept; remote devices controlled from workstations; orbital maneuvering vehicle free flyer; remote manipulator system; Japanese experiment module exposed facility; Japanese experiment module small fine arm; flight telerobotic servicer; human-computer interaction; and workstation/robotics related activities.

  13. Improving Adaptive Learning Technology through the Use of Response Times

    ERIC Educational Resources Information Center

    Mettler, Everett; Massey, Christine M.; Kellman, Philip J.

    2011-01-01

    Adaptive learning techniques have typically scheduled practice using learners' accuracy and item presentation history. We describe an adaptive learning system (Adaptive Response Time Based Sequencing--ARTS) that uses both accuracy and response time (RT) as direct inputs into sequencing. Response times are used to assess learning strength and…

  14. Application of advanced computational technology to propulsion CFD

    NASA Astrophysics Data System (ADS)

    Szuch, John R.

    The Internal Fluid Mechanics Division of the NASA Lewis Research Center is combining the key elements of computational fluid dynamics, aerothermodynamic experiments, and advanced computational technology to bring internal computational fluid dynamics (ICFM) to a state of practical application for aerospace propulsion system design. This paper presents an overview of efforts underway at NASA Lewis to advance and apply computational technology to ICFM. These efforts include the use of modern, software engineering principles for code development, the development of an AI-based user-interface for large codes, the establishment of a high-performance, data communications network to link ICFM researchers and facilities, and the application of parallel processing to speed up computationally intensive and/or time-critical ICFM problems. A multistage compressor flow physics program is cited as an example of efforts to use advanced computational technology to enhance a current NASA Lewis ICFM research program.

  15. Enabling technologies for visible adaptive optics: the Magellan adaptive secondary VisAO camera

    NASA Astrophysics Data System (ADS)

    Kopon, Derek; Males, Jared; Close, Laird M.; Gasho, Victor

    2009-08-01

    Since its beginnings, diffraction-limited ground-based adaptive optics (AO) imaging has been limited to wavelengths in the near IR (λ>1μm) and longer. Visible AO (λ<1μm) has proven to be difficult because shorter wavelengths require wavefront correction on very short spatial and temporal scales. The pupil must be sampled very finely, which requires dense actuator spacing and fine wavefront sampling with large dynamic range. In addition, atmospheric dispersion is much more significant in the visible than in the near-IR. Imaging over a broad visible band requires a very good Atmospheric Dispersion Corrector (ADC). Even with these technologies, our AO simulations using the CAOS code, combined with the optical and site parameters for the 6.5m Magellan telescope, demonstrate a large temporal variability of visible (λ=0.7μm) Strehl on timescales of 50 ms. Over several hundred milliseconds, the visible Strehl can be as high as 50% and as low as 10%. Taking advantage of periods of high Strehl requires either the ability to read out the CCD very fast, thereby introducing significant amounts of read noise, or the use of a fast asynchronous shutter that can block the low-Strehl light. Our Magellan VisAO camera will use an advanced ADC, a high-speed shutter, and our 585-actuator adaptive secondary to achieve broadband (0.5-1.0 μm) diffraction-limited images on the 6.5m Magellan Clay telescope at Las Campanas Observatory in Chile. These will be the sharpest and deepest visible direct images taken to date, with a resolution of 17 mas, a factor of 2.7 better than the diffraction limit of the Hubble Space Telescope.

  16. Implementation and Evaluation of Multiple Adaptive Control Technologies for a Generic Transport Aircraft Simulation

    NASA Technical Reports Server (NTRS)

    Campbell, Stefan F.; Kaneshige, John T.; Nguyen, Nhan T.; Krishakumar, Kalmanje S.

    2010-01-01

    Presented here is the evaluation of multiple adaptive control technologies for a generic transport aircraft simulation. For this study, seven model reference adaptive control (MRAC) based technologies were considered. Each technology was integrated into an identical dynamic-inversion control architecture and tuned using a methodology based on metrics and specific design requirements. Simulation tests were then performed to evaluate each technology's sensitivity to time delay, flight condition, model uncertainty, and artificially induced cross-coupling. The resulting robustness and performance characteristics were used to identify potential strengths, weaknesses, and integration challenges of the individual adaptive control technologies.

  17. Identifying Reading Problems with Computer-Adaptive Assessments

    ERIC Educational Resources Information Center

    Merrell, C.; Tymms, P.

    2007-01-01

    This paper describes the development of an adaptive assessment called Interactive Computerised Assessment System (InCAS) that is aimed at children of a wide age and ability range to identify specific reading problems. Rasch measurement has been used to create the equal interval scales that form each part of the assessment. The rationale for the…

  18. Adaptive optics scanning laser ophthalmoscope imaging: technology update

    PubMed Central

    Merino, David; Loza-Alvarez, Pablo

    2016-01-01

    Adaptive optics (AO) retinal imaging has become very popular in the past few years, especially within the ophthalmic research community. Several different retinal techniques, such as fundus imaging cameras or optical coherence tomography systems, have been coupled with AO in order to produce impressive images showing individual cell mosaics over different layers of the in vivo human retina. The combination of AO with scanning laser ophthalmoscopy has been extensively used to generate impressive images of the human retina with unprecedented resolution, showing individual photoreceptor cells, retinal pigment epithelium cells, as well as microscopic capillary vessels, or the nerve fiber layer. Over the past few years, the technique has evolved to develop several different applications not only in the clinic but also in different animal models, thanks to technological developments in the field. These developments have specific applications to different fields of investigation, which are not limited to the study of retinal diseases but also to the understanding of the retinal function and vision science. This review is an attempt to summarize these developments in an understandable and brief manner in order to guide the reader into the possibilities that AO scanning laser ophthalmoscopy offers, as well as its limitations, which should be taken into account when planning on using it. PMID:27175057

  19. Study of large adaptive arrays for space technology applications

    NASA Technical Reports Server (NTRS)

    Berkowitz, R. S.; Steinberg, B.; Powers, E.; Lim, T.

    1977-01-01

    The research in large adaptive antenna arrays for space technology applications is reported. Specifically, two tasks were considered. The first was a system design study for accurate determination of the positions and the frequencies of sources radiating from the earth's surface that could be used for the rapid location of people or vehicles in distress. This system design study led to a nonrigid array about 8 km in size with means for locating the array element positions, receiving signals from the earth and determining the source locations and frequencies of the transmitting sources. It is concluded that this system design is feasible, and satisfies the desired objectives. The second task was an experiment to determine the largest earthbound array which could simulate a spaceborne experiment. It was determined that an 800 ft array would perform indistinguishably in both locations and it is estimated that one several times larger also would serve satisfactorily. In addition the power density spectrum of the phase difference fluctuations across a large array was measured. It was found that the spectrum falls off approximately as f^(-5/2).

  20. Adaptive optics scanning laser ophthalmoscope imaging: technology update.

    PubMed

    Merino, David; Loza-Alvarez, Pablo

    2016-01-01

    Adaptive optics (AO) retinal imaging has become very popular in the past few years, especially within the ophthalmic research community. Several different retinal techniques, such as fundus imaging cameras or optical coherence tomography systems, have been coupled with AO in order to produce impressive images showing individual cell mosaics over different layers of the in vivo human retina. The combination of AO with scanning laser ophthalmoscopy has been extensively used to generate impressive images of the human retina with unprecedented resolution, showing individual photoreceptor cells, retinal pigment epithelium cells, as well as microscopic capillary vessels, or the nerve fiber layer. Over the past few years, the technique has evolved to develop several different applications not only in the clinic but also in different animal models, thanks to technological developments in the field. These developments have specific applications to different fields of investigation, which are not limited to the study of retinal diseases but also to the understanding of the retinal function and vision science. This review is an attempt to summarize these developments in an understandable and brief manner in order to guide the reader into the possibilities that AO scanning laser ophthalmoscopy offers, as well as its limitations, which should be taken into account when planning on using it. PMID:27175057

  1. An Investigation on Computer-Adaptive Multistage Testing Panels for Multidimensional Assessment

    ERIC Educational Resources Information Center

    Wang, Xinrui

    2013-01-01

    The computer-adaptive multistage testing (ca-MST) has been developed as an alternative to computerized adaptive testing (CAT), and been increasingly adopted in large-scale assessments. Current research and practice only focus on ca-MST panels for credentialing purposes. The ca-MST test mode, therefore, is designed to gauge a single scale. The…

  2. Adaptive finite element methods for two-dimensional problems in computational fracture mechanics

    NASA Technical Reports Server (NTRS)

    Min, J. B.; Bass, J. M.; Spradley, L. W.

    1994-01-01

    Some recent results obtained using solution-adaptive finite element methods in two-dimensional problems in linear elastic fracture mechanics are presented. The focus is on the basic issue of adaptive finite element methods for validating the new methodology by computing demonstration problems and comparing the stress intensity factors to analytical results.

  3. Embedded Data Processor and Portable Computer Technology testbeds

    NASA Technical Reports Server (NTRS)

    Alena, Richard; Liu, Yuan-Kwei; Goforth, Andre; Fernquist, Alan R.

    1993-01-01

    Attention is given to current activities in the Embedded Data Processor and Portable Computer Technology testbed configurations that are part of the Advanced Data Systems Architectures Testbed at the Information Sciences Division at NASA Ames Research Center. The Embedded Data Processor Testbed evaluates advanced microprocessors for potential use in mission and payload applications within the Space Station Freedom Program. The Portable Computer Technology (PCT) Testbed integrates and demonstrates advanced portable computing devices and data system architectures. The PCT Testbed uses both commercial and custom-developed devices to demonstrate the feasibility of functional expansion and networking for portable computers in flight missions.

  4. A New Socio-technical Model for Studying Health Information Technology in Complex Adaptive Healthcare Systems

    PubMed Central

    Sittig, Dean F.; Singh, Hardeep

    2011-01-01

    Conceptual models have been developed to address challenges inherent in studying health information technology (HIT). This manuscript introduces an 8-dimensional model specifically designed to address the socio-technical challenges involved in design, development, implementation, use, and evaluation of HIT within complex adaptive healthcare systems. The 8 dimensions are not independent, sequential, or hierarchical, but rather are interdependent and interrelated concepts similar to compositions of other complex adaptive systems. Hardware and software computing infrastructure refers to equipment and software used to power, support, and operate clinical applications and devices. Clinical content refers to textual or numeric data and images that constitute the “language” of clinical applications. The human computer interface includes all aspects of the computer that users can see, touch, or hear as they interact with it. People refers to everyone who interacts in some way with the system, from developer to end-user, including potential patient-users. Workflow and communication are the processes or steps involved in assuring that patient care tasks are carried out effectively. Two additional dimensions of the model are internal organizational features (e.g., policies, procedures, and culture) and external rules and regulations, both of which may facilitate or constrain many aspects of the preceding dimensions. The final dimension is measurement and monitoring, which refers to the process of measuring and evaluating both intended and unintended consequences of HIT implementation and use. We illustrate how our model has been successfully applied in real-world complex adaptive settings to understand and improve HIT applications at various stages of development and implementation. PMID:20959322

  5. Novel Approaches to Adaptive Angular Approximations in Computational Transport

    SciTech Connect

    Marvin L. Adams; Igor Carron; Paul Nelson

    2006-06-04

    The particle-transport equation is notoriously difficult to discretize accurately, largely because the solution can be discontinuous in every variable. At any given spatial position and energy E, for example, the transport solution  can be discontinuous at an arbitrary number of arbitrary locations in the direction domain. Even if the solution is continuous it is often devoid of smoothness. This makes the direction variable extremely difficult to discretize accurately. We have attacked this problem with adaptive discretizations in the angle variables, using two distinctly different approaches. The first approach used wavelet function expansions directly and exploited their ability to capture sharp local variations. The second used discrete ordinates with a spatially varying quadrature set that adapts to the local solution. The first approach is very different from that in today’s transport codes, while the second could conceivably be implemented in such codes. Both approaches succeed in reducing angular discretization error to any desired level. The work described and results presented in this report add significantly to the understanding of angular discretization in transport problems and demonstrate that it is possible to solve this important long-standing problem in deterministic transport. Our results show that our adaptive discrete-ordinates (ADO) approach successfully: 1) Reduces angular discretization error to user-selected “tolerance” levels in a variety of difficult test problems; 2) Achieves a given error with significantly fewer unknowns than non-adaptive discrete ordinates methods; 3) Can be implemented within standard discrete-ordinates solution techniques, and thus could generate a significant impact on the field in a relatively short time. Our results show that our adaptive wavelet approach: 1) Successfully reduces the angular discretization error to arbitrarily small levels in a variety of difficult test problems, even when using the

  6. The Application of Computer Technology for Development.

    ERIC Educational Resources Information Center

    United Nations, New York, NY. Dept. of Economic and Social Affairs.

    At its twenty-third session, the General Assembly adopted resolution 2458(XXII) requesting the Secretary-General to prepare a report giving special consideration to the situation of the developing countries with regard to: (1) the results already obtained and the needs and prospects for the use of electronic computers in accelerating the process…

  7. Cloud Computing Technologies Facilitate Earth Research

    NASA Technical Reports Server (NTRS)

    2015-01-01

    Under a Space Act Agreement, NASA partnered with Seattle-based Amazon Web Services to make the agency's climate and Earth science satellite data publicly available on the company's servers. Users can access the data for free, but they can also pay to use Amazon's computing services to analyze and visualize information using the same software available to NASA researchers.

  8. Continuing Health Education Through Computer Technology.

    ERIC Educational Resources Information Center

    Held, Thomas H.; Kappelman, Murray M.

    Computer assisted instruction is beginning to have an important role in the rapidly expanding field of continuing education for health science professionals. At the present time, there are 22 medical specialty boards, all of which require or are about to require some form of continuing medical education for re-certification, and studies are being…

  9. CACTUS: Calculator and Computer Technology User Service.

    ERIC Educational Resources Information Center

    Hyde, Hartley

    1998-01-01

Presents an activity in which students use computer-based spreadsheets to find out how much grain accumulates on a chess board when a single grain of rice is put on the first square and the amount is doubled for each subsequent square until the chess board is covered. (ASK)
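The doubling computation at the heart of this activity is easy to verify programmatically. The following Python sketch is a hypothetical illustration of the tally (the function name and structure are assumptions, not part of the original spreadsheet exercise):

```python
# Grains on a chessboard: 1 grain on square 1, doubling on each
# subsequent square. Square n holds 2**(n-1) grains, so the total
# over all squares is 2**squares - 1.

def chessboard_grains(squares: int = 64) -> int:
    """Total grains after doubling across all squares."""
    return sum(2 ** (n - 1) for n in range(1, squares + 1))

print(chessboard_grains())  # 2**64 - 1 = 18446744073709551615
```

A spreadsheet reproduces the same idea with a column of cells, each cell doubling the one above it, plus a running sum.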

  10. Technological Imperatives: Using Computers in Academic Debate.

    ERIC Educational Resources Information Center

    Ticku, Ravinder; Phelps, Greg

    Intended for forensic educators and debate teams, this document details how one university debate team, at the University of Iowa, makes use of computer resources on campus to facilitate storage and retrieval of information useful to debaters. The introduction notes the problem of storing and retrieving the amount of information required by debate…

  11. Using Interactive Computer Technology to Enhance Learning

    ERIC Educational Resources Information Center

    Pemberton, Joy R.; Borrego, Joaquin, Jr.; Cohen, Lee M.

    2006-01-01

    We assessed the effects of using LearnStar[TM], an interactive, computer-based teaching tool, as an in-class exam review method. Students with higher LearnStar review scores had higher grades. Furthermore, students' satisfaction ratings indicated that LearnStar reviews were more enjoyable and conducive to participation than traditional reviews.…

  12. Voice-coil technology for the E-ELT M4 Adaptive Unit

    NASA Astrophysics Data System (ADS)

    Gallieni, D.; Tintori, M.; Mantegazza, M.; Anaclerio, E.; Crimella, L.; Acerboni, M.; Biasi, R.; Angerer, G.; Andrigettoni, M.; Merler, A.; Veronese, D.; Carel, J.-L.; Marque, G.; Molinari, E.; Tresoldi, D.; Toso, G.; Spanó, P.; Riva, M.; Mazzoleni, R.; Riccardi, A.; Mantegazza, P.; Manetti, M.; Morandini, M.; Vernet, E.; Hubin, N.; Jochum, L.; Madec, P.; Dimmler, M.; Koch, F.

We present our design of the E-ELT M4 Adaptive Unit based on voice-coil-driven deformable mirror technology. This technology was developed by the INAF-Arcetri, Microgate, and ADS team over the past 15 years and has been adopted by a number of large ground-based telescopes, such as the MMT, LBT, Magellan, and most recently the VLT in the frame of the Adaptive Telescope Facility project. Our design is based on contactless force actuators made of permanent magnets glued on the back of the deformable mirror and coils mounted on a stiff reference structure. We use capacitive sensors to close a position loop co-located with each actuator. Dedicated high-performance parallel processors implement the local decentralized control at the actuator level and a centralized feed-forward computation of all the actuator forces. In our previous systems, this allowed us to achieve dynamic performance well in line with the requirements of the M4 Adaptive Unit (M4AU) case. The actuator density of our design is on the order of 30 mm spacing, for a total of about 6000 actuators on the M4AU, and it fulfills the fitting-error and correction requirements of the E-ELT high-order DM. Moreover, our contactless technology makes the deformable mirror tolerant of up to 5% actuator failures without compromising the system's ability to reach its specified performance, besides allowing large mechanical tolerances between the reference structure and the deformable mirror. Finally, we present the Demonstration Prototype we are building in the frame of the M4AU Phase B study to measure the optical dynamical performance predicted by our design. This prototype will be fully representative of the M4AU features; in particular, it will address the controllability of two adjacent segments of the 2-mm-thick mirror and implement the actuator "brick" modular concept that has been adopted to dramatically improve the maintainability of the final unit.

  13. Technology and society: ideological implications of information and computer technologies in the Soviet Union

    SciTech Connect

    Weigle, M.A.

    1988-01-01

This study examines the impact of technology on the USSR's social system from the perspective of Soviet ideological development. The analysis of information and computer technologies within this framework de-emphasizes both modernization theories and those that assume unchallenged Communist Party control over technological development. Previous studies have examined the level of Soviet technological achievements and the gap between this level and those in the West, many referring to ideological boundaries of Soviet technological development without, however, systematically analyzing the resulting implications for the Soviet ideology of Marxism-Leninism. This study develops a framework for analyzing the impact of new technologies in the USSR in the fields of technology, ideology, and the scientific and technological revolution. On the basis of this framework, examination turns to the relevant Soviet theoretical and technical literature and debates among Soviet elites, concluding that the introduction of information and computer technologies and the organization of computer networks have exacerbated tensions in Soviet Marxism-Leninism.

  14. Adaptive computational methods for SSME internal flow analysis

    NASA Technical Reports Server (NTRS)

    Oden, J. T.

    1986-01-01

Adaptive finite element methods for the analysis of classes of problems in compressible and incompressible flow of interest in SSME (space shuttle main engine) analysis and design are described. The general objective of the adaptive methods is to improve and to quantify the quality of numerical solutions to the governing partial differential equations of fluid dynamics in two-dimensional cases. There are several different families of adaptive schemes that can be used to improve the quality of solutions in complex flow simulations. Among these are: (1) r-methods (node-redistribution or moving mesh methods) in which a fixed number of nodal points is allowed to migrate to points in the mesh where high error is detected; (2) h-methods, in which the mesh size h is automatically refined to reduce local error; and (3) p-methods, in which the local degree p of the finite element approximation is increased to reduce local error. Two of the three basic techniques have been studied in this project: an r-method for steady Euler equations in two dimensions and a p-method for transient, laminar, viscous incompressible flow. Numerical results are presented. A brief introduction to residual methods of a posteriori error estimation is also given and some pertinent conclusions of the study are listed.
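The h-method described in this abstract — refining the mesh locally wherever an error indicator is large — can be sketched in one dimension. The code below is a hypothetical illustration only; the mesh representation, the bisection strategy, and the example error indicator are assumptions, not taken from the report:

```python
# 1-D h-refinement sketch: bisect any interval whose local error
# indicator exceeds a tolerance, repeating until all indicators
# are below tolerance (or a pass limit is reached).

def refine(mesh, error_indicator, tol=1e-3, max_passes=10):
    """Return a refined sorted mesh (list of node coordinates)."""
    for _ in range(max_passes):
        new_mesh, refined = [], False
        for a, b in zip(mesh[:-1], mesh[1:]):
            new_mesh.append(a)
            if error_indicator(a, b) > tol:      # high local error detected
                new_mesh.append(0.5 * (a + b))   # h-refinement: bisect
                refined = True
        new_mesh.append(mesh[-1])
        if not refined:
            break
        mesh = new_mesh
    return mesh

# Example: a crude indicator that grows where x**4 varies fastest,
# so refinement concentrates near x = 1.
mesh = refine([0.0, 0.5, 1.0], lambda a, b: (b - a) * abs(b**4 - a**4))
```

An r-method would instead keep the node count fixed and move nodes toward high-error regions, and a p-method would raise the polynomial degree on the flagged intervals rather than splitting them.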

  15. The Principal and the Pauper: Administrator Training in Computer Technology.

    ERIC Educational Resources Information Center

    Isherwood, Geoffrey B.

    1985-01-01

    Describes principals' responsibilities in overseeing educational technology. Outlines computer knowledge and skills principals should possess about hardware, software, word processing, data bases, spreadsheets, integrated software, and school applications (scheduling, grading, recording attendance, budgeting, maintaining inventories, etc.).…

  16. Using Computer Technology To Foster Learning for Understanding.

    ERIC Educational Resources Information Center

    Van Melle, Elaine; Tomalty, Lewis

    2000-01-01

    Describes how computer technology, specifically the use of a multimedia CD-ROM, was integrated into a microbiology curriculum as part of the transition from focusing on facts to fostering learning for understanding. (Contains 30 references.) (Author/YDS)

  17. Theories of Learning and Computer-Mediated Instructional Technologies.

    ERIC Educational Resources Information Center

    Hung, David

    2001-01-01

    Describes four major models of learning: behaviorism, cognitivism, constructivism, and social constructivism. Discusses situated cognition; differences between learning theories and instructional approaches; and how computer-mediated technologies can be integrated with learning theories. (LRW)

  18. A survey of adaptive control technology in robotics

    NASA Technical Reports Server (NTRS)

    Tosunoglu, S.; Tesar, D.

    1987-01-01

    Previous work on the adaptive control of robotic systems is reviewed. Although the field is relatively new and does not yet represent a mature discipline, considerable attention has been given to the design of sophisticated robot controllers. Here, adaptive control methods are divided into model reference adaptive systems and self-tuning regulators with further definition of various approaches given in each class. The similarity and distinct features of the designed controllers are delineated and tabulated to enhance comparative review.

  19. WRF4G project: Adaptation of WRF Model to Distributed Computing Infrastructures

    NASA Astrophysics Data System (ADS)

    Cofino, Antonio S.; Fernández Quiruelas, Valvanuz; García Díez, Markel; Blanco Real, Jose C.; Fernández, Jesús

    2013-04-01

Nowadays, Grid computing is a powerful computational tool that is ready to be used by the scientific community in different areas (such as biomedicine, astrophysics, climate, etc.). However, the use of these distributed computing infrastructures (DCIs) is not yet common practice in climate research, and only a few teams and applications in this area take advantage of them. Thus, the first objective of this project is to popularize the use of this technology in the atmospheric sciences area. In order to achieve this objective, one of the most widely used applications has been taken (WRF, a limited-area model and successor of the MM5 model), which has a user community of more than 8,000 researchers worldwide. This community develops its research activity in different areas and could benefit from the advantages of Grid resources (case study simulations, regional hindcast/forecast, sensitivity studies, etc.). The WRF model is also used as input by the energy and natural-hazards communities, which will likewise benefit. However, Grid infrastructures have some drawbacks for the execution of applications that make intensive use of CPU and memory for a long period of time. This makes it necessary to develop a specific framework (middleware). This middleware encapsulates the application and provides appropriate services for the monitoring and management of the jobs and the data. Thus, the second objective of the project consists of the development of a generic adaptation of WRF for the Grid (WRF4G), to be distributed as open source and to be integrated in the official WRF development cycle. The use of this WRF adaptation should be transparent and useful to face any of the previously described studies, and avoid any of the problems of the Grid infrastructure. Moreover, it should simplify access to the Grid infrastructures for the research teams, and also free them from the technical and computational aspects of the use of the Grid.
Finally, in order to

  20. Adapting Computational Data Structures Technology to Reason about Infinity

    ERIC Educational Resources Information Center

    Goldberg, Robert; Hammerman, Natalie

    2004-01-01

    The NCTM curriculum states that students should be able to "compare and contrast the real number system and its various subsystems with regard to their structural characteristics." In evaluating overall conformity to the 1989 standard, the National Council of Teachers of Mathematics (NCTM) requires that "teachers must value and encourage the use…

  1. Full custom VLSI - A technology for high performance computing

    NASA Technical Reports Server (NTRS)

    Maki, Gary K.; Whitaker, Sterling R.

    1990-01-01

    Full custom VLSI is presented as a viable technology for addressing the need for the computing capabilities required for the real-time health monitoring of spacecraft systems. This technology presents solutions that cannot be realized with stored program computers or semicustom VLSI; also, it is not dependent on current IC processes. It is argued that, while design time is longer, full custom VLSI produces the fastest and densest VLSI solution and that high density normally also yields low manufacturing costs.

  2. Restricted access processor - An application of computer security technology

    NASA Technical Reports Server (NTRS)

    Mcmahon, E. M.

    1985-01-01

    This paper describes a security guard device that is currently being developed by Computer Sciences Corporation (CSC). The methods used to provide assurance that the system meets its security requirements include the system architecture, a system security evaluation, and the application of formal and informal verification techniques. The combination of state-of-the-art technology and the incorporation of new verification procedures results in a demonstration of the feasibility of computer security technology for operational applications.

  3. Computer technology -- 1996: Applications and methodology. PVP-Volume 326

    SciTech Connect

    Hulbert, G.M.; Hsu, K.H.; Lee, T.W.; Nicholas, T.

    1996-12-01

    The primary objective of the Computer Technology Committee of the ASME Pressure Vessels and Piping Division is to promote interest and technical exchange in the field of computer technology, related to the design and analysis of pressure vessels and piping. The topics included in this volume are: analysis of bolted joints; nonlinear analysis, applications and methodology; finite element analysis and applications; and behavior of materials. Separate abstracts were prepared for 23 of the papers in this volume.

  4. Beyond Theory: Improving Public Relations Writing through Computer Technology.

    ERIC Educational Resources Information Center

    Neff, Bonita Dostal

    Computer technology (primarily word processing) enables the student of public relations writing to improve the writing process through increased flexibility in writing, enhanced creativity, increased support of management skills and team work. A new instructional model for computer use in public relations courses at Purdue University Calumet…

  5. "Computer" and "Information and Communication Technology": Students' Culture Specific Interpretations

    ERIC Educational Resources Information Center

    Elen, Jan; Clarebout, Geraldine; Sarfo, Frederick Kwaku; Louw, Lambertus Philippus; Poysa-Tarhonen, Johanna; Stassens, Nick

    2010-01-01

    Given the use of information and communication technology (ICT) and computer as synonyms in ICT-integration research on the one hand, and the potential problems in doing so on the other, this contribution tries to gain insight in the understanding of the words computer and ICT in different settings. In five different countries (Belgium, Finland,…

  6. Using Computer Technology To Enhance Middle School Science.

    ERIC Educational Resources Information Center

    Jermanovich, Trudy

    This practicum was designed to encourage middle school science teachers to utilize computer technology as an enhancement in order to provide students with an additional means of addressing their basic skills areas. The primary goals were to provide information on the ease of utilization of appropriate computer-managed software through networking…

  7. Computer-Mediated Technology and Transcultural Counselor Education.

    ERIC Educational Resources Information Center

    McFadden, John

    2000-01-01

    This manuscript traces the history of computer technologies, their applications in mental health settings, and suggests that transcultural counselor educators engage their students in the design of a case-based computer simulation. The avatar-focused simulation offers an unprecedented environment for experimentation in collaborative learning and…

  8. Information and Communicative Technology--Computers as Research Tools

    ERIC Educational Resources Information Center

    Sarsani, Mahender Reddy

    2007-01-01

The emergence of "the electronic age/electronic cottages/the electronic world" has affected the whole world; in particular, the emergence of computers has penetrated everyone's life to a remarkable degree. They are being used in various fields including education. Recent advances, especially in the area of computer technology have…

  9. Institute for Computer Sciences and Technology. Annual Report FY 1986.

    ERIC Educational Resources Information Center

    National Bureau of Standards (DOC), Washington, DC. Inst. for Computer Sciences and Technology.

    Activities of the Institute for Computer Sciences and Technology (ICST) within the U.S. Department of Commerce during fiscal year 1986 are described in this annual report, which summarizes research and publications by ICST in the following areas: (1) standards and guidelines for computer security, including encryption and message authentication…

  10. Selecting Software. Tech Use Guide: Using Computer Technology.

    ERIC Educational Resources Information Center

    Council for Exceptional Children, Reston, VA. Center for Special Education Technology.

    One of nine brief guides for special educators on using computer technology, this guide focuses on the selection of software for use in the special education classroom. Four types of software used for computer assisted instruction are briefly described: tutorials; drill and practice; educational games; and simulations. The increasing use of tool…

  11. Computer-Integrated Manufacturing Technology. Tech Prep Competency Profile.

    ERIC Educational Resources Information Center

    Lakeland Tech Prep Consortium, Kirtland, OH.

    This tech prep competency profile for computer-integrated manufacturing technology begins with definitions for four occupations: manufacturing technician, quality technician, mechanical engineering technician, and computer-assisted design/drafting (CADD) technician. A chart lists competencies by unit and indicates whether entire or partial unit is…

  12. Coached, Interactive Computer Simulations: A New Technology for Training.

    ERIC Educational Resources Information Center

    Hummel, Thomas J.

    This paper provides an overview of a prototype simulation-centered intelligent computer-based training (CBT) system--implemented using expert system technology--which provides: (1) an environment in which trainees can learn and practice complex skills; (2) a computer-based coach or mentor to critique performance, suggest improvements, and provide…

  13. Using Computer Technology To Aid the Disabled Reader.

    ERIC Educational Resources Information Center

    Balajthy, Ernest

    When matched for achievement level and educational objectives, computer technology can be particularly effective with at-risk students. Computer-assisted instructional software is the most widely available type of software. An exciting development pertinent to literacy education is the development of the "electronic book" (also called "interactive…

  14. Two Year Computer System Technology Curricula for the '80's.

    ERIC Educational Resources Information Center

    Palko, Donald N.; Hata, David M.

    1982-01-01

    The computer industry is viewed on a collision course with a human resources crisis. Changes expected during the next decade are outlined, with expectations noted that merging of hardware and software skills will be met in a technician's skill set. Essential curricula components of a computer system technology program are detailed. (MP)

  15. The Future of Computer Technology in K-12 Education.

    ERIC Educational Resources Information Center

    Bennett, Frederick

    2002-01-01

    Asserts that schools are not taking advantage of computer technology to improve student learning. Argues that schools must alter basic practices (just as businesses did) to take full advantage of computerized education. Provides examples of how some schools have taken advantage of the interactive power of the computer. (PKP)

  16. GPU-based computational adaptive optics for volumetric optical coherence microscopy

    NASA Astrophysics Data System (ADS)

    Tang, Han; Mulligan, Jeffrey A.; Untracht, Gavrielle R.; Zhang, Xihao; Adie, Steven G.

    2016-03-01

Optical coherence tomography (OCT) is a non-invasive imaging technique that measures reflectance from within biological tissues. Current higher-NA optical coherence microscopy (OCM) technologies with near cellular resolution have limitations on volumetric imaging capabilities due to the trade-offs between resolution vs. depth-of-field and sensitivity to aberrations. Such trade-offs can be addressed using computational adaptive optics (CAO), which corrects aberration computationally for all depths based on the complex optical field measured by OCT. However, due to the large size of datasets plus the computational complexity of CAO and OCT algorithms, it is a challenge to achieve high-resolution 3D-OCM reconstructions at speeds suitable for clinical and research OCM imaging. In recent years, real-time OCT reconstruction incorporating both dispersion and defocus correction has been achieved through parallel computing on graphics processing units (GPUs). We add to these methods by implementing depth-dependent aberration correction for volumetric OCM using plane-by-plane phase deconvolution. Following both defocus and aberration correction, our reconstruction algorithm achieved a depth-independent transverse resolution of 2.8 µm, equal to the diffraction-limited focal plane resolution. We have translated the CAO algorithm to a CUDA code implementation and tested the speed of the software in real time using two GPUs, the NVIDIA Quadro K600 and GeForce TITAN Z. For a data volume containing 4096×256×256 voxels, our system's processing speed can keep up with the 60 kHz acquisition rate of the line-scan camera, and takes 1.09 seconds to simultaneously update the CAO correction for 3 en face planes at user-selectable depths.

  17. Adapting Wireless Technology to Lighting Control and Environmental Sensing

    SciTech Connect

    Dana Teasdale; Francis Rubinstein; Dave Watson; Steve Purdy

    2005-10-01

The high cost of retrofitting buildings with advanced lighting control systems is a barrier to adoption of this energy-saving technology. Wireless technology, however, offers a solution to mounting installation costs since it requires no additional wiring to implement. To demonstrate the feasibility of such a system, a prototype wirelessly-controlled advanced lighting system was designed and built. The system includes the following components: a wirelessly-controllable analog circuit module (ACM), a wirelessly-controllable electronic dimmable ballast, a T8 3-lamp fixture, an environmental multi-sensor, a current transducer, and control software. The ACM, dimmable ballast, multi-sensor, and current transducer were all integrated with SmartMesh™ wireless mesh networking nodes, called motes, enabling wireless communication, sensor monitoring, and actuator control. Each mote-enabled device has a reliable communication path to the SmartMesh Manager, a single board computer that controls network functions and connects the wireless network to a PC running lighting control software. The ACM is capable of locally driving one or more standard 0-10 Volt electronic dimmable ballasts through relay control and a 0-10 Volt controllable output. The mote-integrated electronic dimmable ballast is designed to drive a standard 3-lamp T8 light fixture. The environmental multi-sensor measures occupancy, light level and temperature. The current transducer is used to measure the power consumed by the fixture. Control software was developed to implement advanced lighting algorithms, including daylight ramping, occupancy control, and demand response. Engineering prototypes of each component were fabricated and tested in a bench-scale system. Based on standard industry practices, a cost analysis was conducted. 
It is estimated that the installation cost of a wireless advanced lighting control system for a retrofit application is at least 30% lower than a comparable wired system for

  18. Space systems computer-aided design technology

    NASA Technical Reports Server (NTRS)

    Garrett, L. B.

    1984-01-01

    The interactive Design and Evaluation of Advanced Spacecraft (IDEAS) system is described, together with planned capability increases in the IDEAS system. The system's disciplines consist of interactive graphics and interactive computing. A single user at an interactive terminal can create, design, analyze, and conduct parametric studies of earth-orbiting satellites, which represents a timely and cost-effective method during the conceptual design phase where various missions and spacecraft options require evaluation. Spacecraft concepts evaluated include microwave radiometer satellites, communication satellite systems, solar-powered lasers, power platforms, and orbiting space stations.

  19. Promoting Technology-Assisted Active Learning in Computer Science Education

    ERIC Educational Resources Information Center

    Gao, Jinzhu; Hargis, Jace

    2010-01-01

    This paper describes specific active learning strategies for teaching computer science, integrating both instructional technologies and non-technology-based strategies shown to be effective in the literature. The theoretical learning components addressed include an intentional method to help students build metacognitive abilities, as well as…

  20. Factors Influencing Cloud-Computing Technology Adoption in Developing Countries

    ERIC Educational Resources Information Center

    Hailu, Alemayehu

    2012-01-01

    Adoption of new technology has complicating components both from the selection, as well as decision-making criteria and process. Although new technology such as cloud computing provides great benefits especially to the developing countries, it has challenges that may complicate the selection decision and subsequent adoption process. This study…

  1. Video and Computer Technologies for Extended-Campus Programming.

    ERIC Educational Resources Information Center

    Sagan, Edgar L.; And Others

    This paper discusses video and computer technologies for extended-campus programming (courses and programs at off-campus sites). The first section provides an overview of the distance education program at the University of Kentucky (UK), and highlights the improved access to graduate and professional programs, advances in technology, funding,…

  2. Building Computer Technology Skills in TESOL Teacher Education

    ERIC Educational Resources Information Center

    DelliCarpini, Margo

    2012-01-01

    This paper reports on an action research study that investigated factors influencing TESOL (teaching English to speakers of other languages) teacher candidates' (TCs) selection and use of technology in the English as a second language (ESL) classroom and the influence of explicit training in context in the use of computer technology for second…

  3. Exploring Computer Technology. The Illinois Plan for Industrial Education.

    ERIC Educational Resources Information Center

    Illinois State Univ., Normal.

    This guide, which is one in the "Exploration" series of curriculum guides intended to assist junior high and middle school industrial educators in helping their students explore diverse industrial situations and technologies used in industry, deals with exploring computer technology. The following topics are covered in the individual lessons: the…

  4. COMPUGIRLS: Stepping Stone to Future Computer-Based Technology Pathways

    ERIC Educational Resources Information Center

    Lee, Jieun; Husman, Jenefer; Scott, Kimberly A.; Eggum-Wilkens, Natalie D.

    2015-01-01

    The COMPUGIRLS: Culturally relevant technology program for adolescent girls was developed to promote underrepresented girls' future possible selves and career pathways in computer-related technology fields. We hypothesized that the COMPUGIRLS would promote academic possible selves and self-regulation to achieve these possible selves. We compared…

  5. Computed Tomography Technology: Development and Applications for Defence

    NASA Astrophysics Data System (ADS)

    Baheti, G. L.; Saxena, Nisheet; Tripathi, D. K.; Songara, K. C.; Meghwal, L. R.; Meena, V. L.

    2008-09-01

    Computed Tomography (CT) has revolutionized the field of Non-Destructive Testing and Evaluation (NDT&E). Tomography for industrial applications warrants the design and development of customized solutions catering to specific visualization requirements. The present paper highlights Tomography Technology Solutions implemented at Defence Laboratory, Jodhpur (DLJ). Details on the technological developments carried out and their utilization for various Defence applications have been covered.

  6. Beyond Computer Literacy: Supporting Youth's Positive Development through Technology

    ERIC Educational Resources Information Center

    Bers, Marina Umaschi

    2010-01-01

    In a digital era in which technology plays a role in most aspects of a child's life, having the competence and confidence to use computers might be a necessary step, but not a goal in itself. Developing character traits that will serve children to use technology in a safe way to communicate and connect with others, and providing opportunities for…

  7. Use of Computer Technology To Help Students with Special Needs.

    ERIC Educational Resources Information Center

    Hasselbring, Ted S.; Glaser, Candyce H. Williams

    2000-01-01

    Reviews the role of computer technology in promoting the education of children with special needs within regular classrooms, discussing: technologies for students with mild learning and behavioral disorders, speech and language disorders, hearing impairments, visual impairments, and severe physical disabilities. Examines barriers to effective…

  8. National Survey of Computer Aided Manufacturing in Industrial Technology Programs.

    ERIC Educational Resources Information Center

    Heidari, Farzin

    The current status of computer-aided manufacturing in the 4-year industrial technology programs in the United States was studied. All industrial technology department chairs were mailed a questionnaire divided into program information, equipment information, and general comments sections. The questionnaire was designed to determine the subjects…

  9. The Federal Government's Role in Advancing Computer Technology

    ERIC Educational Resources Information Center

    Information Hotline, 1978

    1978-01-01

    As part of the Federal Data Processing Reorganization Study submitted by the Science and Technology Team, the Federal Government's role in advancing and diffusing computer technology is discussed. Findings and conclusions assess the state-of-the-art in government and in industry, and five recommendations provide directions for government policy…

  10. The Future of Mobile Technology and Mobile Wireless Computing

    ERIC Educational Resources Information Center

    Hart, Jim; Hannan, Mike

    2004-01-01

    It is often stated that mobile wireless computing is going to be the next big technology revolution that will grip the world in the same way mobile telephones did in the 1990s. However, while the technology is rapidly improving, the rate of uptake has been lower than expected. This paper describes some of the reasons for this, and discusses some…

  11. Comparison and Equating of Paper-Administered, Computer-Administered and Computerized Adaptive Tests of Achievement.

    ERIC Educational Resources Information Center

    Olsen, James B.; And Others

    Student achievement test scores were compared and equated, using three different testing methods: paper-administered, computer-administered, and computerized adaptive testing. The tests were developed from third and sixth grade mathematics item banks of the California Assessment Program. The paper and the computer-administered tests were identical…
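
    At the core of the computerized adaptive variant is re-estimating the examinee's ability after each response so that the next item can be matched to it. Below is a minimal sketch of that estimation step, assuming the one-parameter (Rasch) IRT model and Newton-Raphson maximum likelihood; the function names and values are illustrative, not taken from the study.

```python
import math

def rasch_p(theta, b):
    """Probability of a correct response under the Rasch (1PL) model."""
    return 1.0 / (1.0 + math.exp(-(theta - b)))

def estimate_ability(responses, difficulties, iterations=20):
    """Newton-Raphson maximum-likelihood estimate of ability theta.

    responses    -- list of 0/1 item scores (must contain both values)
    difficulties -- matching list of item difficulty parameters b
    """
    theta = 0.0
    for _ in range(iterations):
        p = [rasch_p(theta, b) for b in difficulties]
        grad = sum(x - pi for x, pi in zip(responses, p))   # dlogL/dtheta
        info = sum(pi * (1.0 - pi) for pi in p)             # Fisher information
        theta += grad / info
    return theta
```

    In an adaptive test, this estimate would be recomputed after every response and used to pick the next item whose difficulty is closest to the current theta.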

  12. Comparing Computer-Adaptive and Curriculum-Based Measurement Methods of Assessment

    ERIC Educational Resources Information Center

    Shapiro, Edward S.; Gebhardt, Sarah N.

    2012-01-01

    This article reported the concurrent, predictive, and diagnostic accuracy of a computer-adaptive test (CAT) and curriculum-based measurements (CBM; both computation and concepts/application measures) for universal screening in mathematics among students in first through fourth grade. Correlational analyses indicated moderate to strong…

  13. Network Computer Technology. Phase I: Viability and Promise within NASA's Desktop Computing Environment

    NASA Technical Reports Server (NTRS)

    Paluzzi, Peter; Miller, Rosalind; Kurihara, West; Eskey, Megan

    1998-01-01

    Over the past several months, major industry vendors have made a business case for the network computer as a win-win solution toward lowering total cost of ownership. This report provides results from Phase I of the Ames Research Center network computer evaluation project. It identifies factors to be considered for determining cost of ownership; further, it examines where, when, and how network computer technology might fit in NASA's desktop computing architecture.

  14. Helping Students Adapt to Computer-Based Encrypted Examinations

    ERIC Educational Resources Information Center

    Baker-Eveleth, Lori; Eveleth, Daniel M.; O'Neill, Michele; Stone, Robert W.

    2006-01-01

    The College of Business and Economics at the University of Idaho conducted a pilot study that used commercially available encryption software called Securexam to deliver computer-based examinations. A multi-step implementation procedure was developed, implemented, and then evaluated on the basis of what students viewed as valuable. Two key aspects…

  15. Adapting the traveling salesman problem to an adiabatic quantum computer

    NASA Astrophysics Data System (ADS)

    Warren, Richard H.

    2013-04-01

    We show how to guide a quantum computer to select an optimal tour for the traveling salesman. This is significant because it opens a rapid solution method for the wide range of applications of the traveling salesman problem, which include vehicle routing, job sequencing and data clustering.
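
    Guiding an adiabatic quantum computer to a tour generally means recasting the TSP as a quadratic unconstrained binary optimization (QUBO) problem: binary variables x[i, t] mark city i at tour position t, penalty terms enforce a valid permutation, and the tour length is the objective. The following is a minimal sketch of that standard encoding; the penalty weight and indexing scheme are illustrative assumptions, not details from this paper.

```python
import itertools
import numpy as np

def tsp_qubo(dist, penalty=10.0):
    """Build a QUBO matrix for the TSP.

    Binary variable x[i, t] = 1 means city i is visited at tour position t;
    variables are flattened to index i * n + t. The constant offset
    2 * n * penalty from expanding the squared constraints is dropped.
    """
    n = len(dist)
    Q = np.zeros((n * n, n * n))

    def idx(i, t):
        return i * n + t

    # (sum - 1)^2 penalties: each variable sits in one per-city and one
    # per-position constraint, giving -penalty twice on the diagonal.
    for i in range(n):
        for t in range(n):
            Q[idx(i, t), idx(i, t)] -= 2 * penalty
    for i in range(n):
        for t1, t2 in itertools.combinations(range(n), 2):
            Q[idx(i, t1), idx(i, t2)] += 2 * penalty   # one city, two positions
    for t in range(n):
        for i1, i2 in itertools.combinations(range(n), 2):
            Q[idx(i1, t), idx(i2, t)] += 2 * penalty   # one position, two cities

    # Objective: distance between consecutive tour positions (cyclic).
    for t in range(n):
        for i in range(n):
            for j in range(n):
                if i != j:
                    Q[idx(i, t), idx(j, (t + 1) % n)] += dist[i][j]
    return Q
```

    For a feasible assignment the energy x·Q·x plus the dropped constant equals the tour length, so the annealer's ground state encodes an optimal tour.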

  16. Computer simulation program is adaptable to industrial processes

    NASA Technical Reports Server (NTRS)

    Schultz, F. E.

    1966-01-01

    The Reaction kinetics ablation program /REKAP/, developed to simulate ablation of various materials, provides mathematical formulations for computer programs which can simulate certain industrial processes. The programs are based on the use of nonsymmetrical difference equations that are employed to solve complex partial differential equation systems.

  17. Computers and terminals as an aid to international technology transfer

    NASA Technical Reports Server (NTRS)

    Sweeney, W. T.

    1974-01-01

    As technology transfer becomes more popular and proves to be an economical way for companies of all sizes to take advantage of the tremendous amount of new technology available from sources all over the world, the introduction of computers and terminals into the international technology transfer process is proving to be a successful way for companies to take part in these new business opportunities.

  18. Parallel computation with adaptive methods for elliptic and hyperbolic systems

    SciTech Connect

    Benantar, M.; Biswas, R.; Flaherty, J.E.; Shephard, M.S.

    1990-01-01

    We consider the solution of two dimensional vector systems of elliptic and hyperbolic partial differential equations on a shared memory parallel computer. For elliptic problems, the spatial domain is discretized using a finite quadtree mesh generation procedure and the differential system is discretized by a finite element-Galerkin technique with a piecewise linear polynomial basis. Resulting linear algebraic systems are solved using the conjugate gradient technique with element-by-element and symmetric successive over-relaxation preconditioners. Stiffness matrix assembly and linear system solutions are processed in parallel with computations scheduled on noncontiguous quadrants of the tree in order to minimize process synchronization. Determining noncontiguous regions by coloring the regular finite quadtree structure is far simpler than coloring elements of the unstructured mesh that the finite quadtree procedure generates. We describe linear-time complexity coloring procedures that use six and eight colors.
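
    The scheduling idea in this abstract (color the quadrants so that no two neighbors share a color, then assemble each color class in parallel without synchronization) can be illustrated with a generic greedy coloring. This sketch is not the authors' linear-time six- and eight-color procedure, only the underlying principle.

```python
import itertools

def greedy_coloring(adjacency):
    """Assign each region the smallest color not used by an already-colored
    neighbor. Regions of equal color share no edge, so their stiffness-matrix
    contributions can be assembled concurrently without locking.

    adjacency -- dict mapping each region to the set of its neighbors
    """
    colors = {}
    for node in sorted(adjacency):
        used = {colors[nb] for nb in adjacency[node] if nb in colors}
        colors[node] = next(c for c in itertools.count() if c not in used)
    return colors
```

    A parallel assembly loop would then iterate over the color classes, dispatching all regions of one color to worker threads before moving to the next color.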

  19. Adaptive critic design for computer intrusion detection system

    NASA Astrophysics Data System (ADS)

    Novokhodko, Alexander; Wunsch, Donald C., II; Dagli, Cihan H.

    2001-03-01

    This paper summarizes ongoing research. A neural network is used to detect computer system intrusions based on data from the system audit trail generated by the Solaris Basic Security Module; the data have been provided by Lincoln Labs, MIT. The system alerts the human operator when it encounters suspicious activity logged in the audit trail. To reduce the false alarm rate and accommodate the temporal indefiniteness of the moment of attack, a reinforcement learning approach is chosen to train the network.

  20. From biological neural networks to thinking machines: Transitioning biological organizational principles to computer technology

    NASA Technical Reports Server (NTRS)

    Ross, Muriel D.

    1991-01-01

    The three-dimensional organization of the vestibular macula is under study by computer-assisted reconstruction and simulation methods as a model for more complex neural systems. One goal of this research is to transition knowledge of biological neural network architecture and functioning to computer technology, to contribute to the development of thinking computers. Maculas are organized as weighted neural networks for parallel distributed processing of information. The network is characterized by non-linearity of its terminal/receptive fields. Wiring appears to develop through constrained randomness. A further property is the presence of two main circuits, highly channeled and distributed modifying, that are connected through feedforward-feedback collaterals and a biasing subcircuit. Computer simulations demonstrate that differences in geometry of the feedback (afferent) collaterals affect the timing and the magnitude of voltage changes delivered to the spike initiation zone. Feedforward (efferent) collaterals act as voltage followers and likely inhibit neurons of the distributed modifying circuit. These results illustrate the importance of feedforward-feedback loops, of timing, and of inhibition in refining neural network output. They also suggest that it is the distributed modifying network that is most involved in adaptation, memory, and learning. Tests of macular adaptation, through hyper- and microgravitational studies, support this hypothesis since synapses in the distributed modifying circuit, but not the channeled circuit, are altered. Transitioning knowledge of biological systems to computer technology, however, remains problematical.

  1. Adapting to a Computer-Oriented Society: The Leadership Role of Business and Liberal Arts Faculties.

    ERIC Educational Resources Information Center

    O'Gorman, David E.

    The need for higher education to take a proactive rather than a reactive stance in dealing with the impact of the computer is considered. The field of computerized video technology is briefly discussed. It is suggested that disparate groups such as the liberal arts and business faculties should cooperate to maximize the use of computer technology.…

  2. A survey on adaptive engine technology for serious games

    NASA Astrophysics Data System (ADS)

    Rasim; Langi, Armein Z. R.; Munir; Rosmansyah, Yusep

    2016-02-01

    Serious games have become a priceless tool in learning because they can simulate abstract concepts so that they appear more realistic. The problem faced is that players differ in their ability to play a game: a player becomes frustrated if the game is too difficult, or bored if it is too easy. Serious games contain non-player characters (NPCs), and an NPC should be able to adapt to the player so that the player feels comfortable playing the game. Serious game development must therefore involve an adaptive engine, that is, a learning machine that can adapt to different players. The development of adaptive engines can be viewed in terms of frameworks and algorithms. Frameworks include those based on rules, plans, organization descriptions, player proficiency, and learning style and cognitive state; algorithms include agent-based and non-agent-based approaches.
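
    As a concrete illustration of the rules-based end of this design space, a minimal adaptive-difficulty rule might track the player's recent win rate against a target band. The thresholds and step size below are illustrative assumptions, not values from the survey.

```python
def adjust_difficulty(difficulty, recent_results, target_win_rate=0.7, step=0.1):
    """Rule-based adaptive-engine sketch: raise difficulty when the player
    wins clearly more often than the target rate, lower it when they win
    clearly less often, and leave it alone inside the comfort band.

    recent_results -- list of 0/1 outcomes for the player's last few rounds
    """
    win_rate = sum(recent_results) / len(recent_results)
    if win_rate > target_win_rate + 0.1:
        difficulty += step
    elif win_rate < target_win_rate - 0.1:
        difficulty -= step
    return max(0.0, min(1.0, difficulty))
```

    A plan-based or learning-style-based framework would replace this fixed rule with a model of the player, but the adapt-to-keep-comfortable loop is the same.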

  3. Adapting Wireless Technology to Lighting Control and Environmental Sensing

    SciTech Connect

    Dana Teasdale; Francis Rubinstein; David S. Watson; Steve Purdy

    2006-04-30

    Although advanced lighting control systems offer significant energy savings, the high cost of retrofitting buildings with such systems is a barrier to adoption of this energy-saving technology. Wireless technology, however, offers a solution to mounting installation costs since it requires no additional wiring to implement. To demonstrate the feasibility of such a system, a prototype wirelessly-controlled advanced lighting system was designed and built. The system includes the following components: a wirelessly-controllable analog circuit module (ACM), a wirelessly-controllable electronic dimmable ballast, a T8 3-lamp fixture, an environmental multi-sensor, a current transducer, and control software. The ACM, dimmable ballast, multi-sensor, and current transducer were all integrated with SmartMesh(TM) wireless mesh networking nodes, called motes, enabling wireless communication, sensor monitoring, and actuator control. Each mote-enabled device has a reliable communication path to the SmartMesh Manager, a single-board computer that controls network functions and connects the wireless network to a PC running lighting control software. The ACM is capable of locally driving one or more standard 0-10 Volt electronic dimmable ballasts through relay control and a 0-10 Volt controllable output, in addition to accepting 0-24 Volt and 0-10 Volt inputs. The mote-integrated electronic dimmable ballast is designed to drive a standard 3-lamp T8 light fixture. The environmental multi-sensor measures occupancy, light level, and temperature. The current transducer is used to measure the power consumed by the fixture. Control software was developed to implement advanced lighting algorithms, including open- and closed-loop daylight ramping, occupancy control, and demand response. Engineering prototypes of each component were fabricated and tested in a bench-scale system. Based on standard industry practices, a cost analysis was conducted. It is estimated that the…
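
    The closed-loop daylight-ramping algorithm mentioned above can be sketched as a simple feedback loop: read the light level at the multi-sensor, nudge the 0-10 Volt dimming command toward the setpoint, and clamp it to the ballast's range. The controller gain and the room's 60-lux-per-volt response below are hypothetical numbers, not values from the project.

```python
def daylight_step(dim_level, measured_lux, target_lux, gain=0.01):
    """One control update: move the 0-10 V dimming command toward the
    setpoint in proportion to the error, then clamp to the ballast range.
    The gain must satisfy gain * (lux per volt) < 2 for a stable loop."""
    dim_level += gain * (target_lux - measured_lux)
    return min(10.0, max(0.0, dim_level))

def simulate(daylight_lux, target_lux=500.0, steps=200):
    """Toy room model: the sensor reads daylight plus a hypothetical
    60 lux per volt of electric-light contribution from the ballast."""
    dim = 5.0
    for _ in range(steps):
        measured = daylight_lux + 60.0 * dim
        dim = daylight_step(dim, measured, target_lux)
    return dim
```

    As daylight rises toward the setpoint, the loop dims the electric light toward zero, which is the energy-saving behavior the project set out to demonstrate.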

  4. Adaption of space station technology for lunar operations

    NASA Technical Reports Server (NTRS)

    Garvey, J. M.

    1992-01-01

    Space Station Freedom technology will have the potential for numerous applications in an early lunar base program. The benefits of utilizing station technology in such a fashion include reduced development and facility costs for lunar base systems, shorter schedules, and verification of such technology through space station experience. This paper presents an assessment of opportunities for using station technology in a lunar base program, particularly in the lander/ascent vehicles and surface modules.

  5. Impact of new computing systems on computational mechanics and flight-vehicle structures technology

    NASA Technical Reports Server (NTRS)

    Noor, A. K.; Storaasli, O. O.; Fulton, R. E.

    1984-01-01

    Advances in computer technology which may have an impact on computational mechanics and flight vehicle structures technology were reviewed. The characteristics of supersystems, highly parallel systems, and small systems are summarized. The interrelations of numerical algorithms and software with parallel architectures are discussed. A scenario for future hardware/software environment and engineering analysis systems is presented. Research areas with potential for improving the effectiveness of analysis methods in the new environment are identified.

  6. Adapting Wood Technology to Teach Design and Engineering

    ERIC Educational Resources Information Center

    Rummel, Robert A.

    2012-01-01

    Technology education has changed dramatically over the last few years. The transition of industrial arts to technology education and, more recently, the pursuit of design and engineering have resulted in technology education teachers often needing to change their curriculum and course activities to meet the demands of a rapidly changing profession.…

  7. Combining Brain–Computer Interfaces and Assistive Technologies: State-of-the-Art and Challenges

    PubMed Central

    Millán, J. d. R.; Rupp, R.; Müller-Putz, G. R.; Murray-Smith, R.; Giugliemma, C.; Tangermann, M.; Vidaurre, C.; Cincotti, F.; Kübler, A.; Leeb, R.; Neuper, C.; Müller, K.-R.; Mattia, D.

    2010-01-01

    In recent years, new research has brought the field of electroencephalogram (EEG)-based brain–computer interfacing (BCI) out of its infancy and into a phase of relative maturity through many demonstrated prototypes such as brain-controlled wheelchairs, keyboards, and computer games. With this proof-of-concept phase in the past, the time is now ripe to focus on the development of practical BCI technologies that can be brought out of the lab and into real-world applications. In particular, we focus on the prospect of improving the lives of countless disabled individuals through a combination of BCI technology with existing assistive technologies (AT). In pursuit of more practical BCIs for use outside of the lab, in this paper, we identify four application areas where disabled individuals could greatly benefit from advancements in BCI technology, namely, “Communication and Control”, “Motor Substitution”, “Entertainment”, and “Motor Recovery”. We review the current state of the art and possible future developments, while discussing the main research issues in these four areas. In particular, we expect the most progress in the development of technologies such as hybrid BCI architectures, user–machine adaptation algorithms, the exploitation of users’ mental states for BCI reliability and confidence measures, the incorporation of principles in human–computer interaction (HCI) to improve BCI usability, and the development of novel BCI technology including better EEG devices. PMID:20877434

  8. Beyond computer literacy: supporting youth's positive development through technology.

    PubMed

    Bers, Marina Umaschi

    2010-01-01

    In a digital era in which technology plays a role in most aspects of a child's life, having the competence and confidence to use computers might be a necessary step, but not a goal in itself. Developing character traits that will serve children to use technology in a safe way to communicate and connect with others, and providing opportunities for children to make a better world through the use of their computational skills, is just as important. The Positive Technological Development framework (PTD), a natural extension of the computer literacy and the technological fluency movements that have influenced the world of educational technology, adds psychosocial, civic, and ethical components to the cognitive ones. PTD examines the developmental tasks of a child growing up in our digital era and provides a model for developing and evaluating technology-rich youth programs. The explicit goal of PTD programs is to support children in the positive uses of technology to lead more fulfilling lives and make the world a better place. This article introduces the concept of PTD and presents examples of the Zora virtual world program for young people that the author developed following this framework. PMID:21240949

  9. The change in critical technologies for computational physics

    NASA Technical Reports Server (NTRS)

    Watson, Val

    1990-01-01

    It is noted that the types of technology required for computational physics are changing as the field matures. Emphasis has shifted from computer technology to algorithm technology and, finally, to visual analysis technology as areas of critical research for this field. High-performance graphical workstations tied to a supercomputer with high-speed communications, along with the development of especially tailored visualization software, have enabled analysis of highly complex fluid-dynamics simulations. Particular reference is made here to the development of visual analysis tools at NASA's Numerical Aerodynamic Simulation Facility. The next technology this field requires is one that would eliminate visual clutter by extracting the key features of physics simulations in order to create displays that clearly portray those key features. Research in the tuning of visual displays to human cognitive abilities is proposed. The immediate transfer of this technology to all levels of computers, specifically the inclusion of visualization primitives in basic software development for all workstations and PCs, is recommended.

  10. Application of covariate shift adaptation techniques in brain-computer interfaces.

    PubMed

    Li, Yan; Kambara, Hiroyuki; Koike, Yasuharu; Sugiyama, Masashi

    2010-06-01

    A phenomenon often found in session-to-session transfers of brain-computer interfaces (BCIs) is nonstationarity. It can be caused by fatigue and changing attention levels of the user, differing electrode placements, and varying impedances, among other reasons. Covariate shift adaptation is an effective method that can adapt to the testing sessions without the need for labeling the testing session data. The method was applied to a BCI Competition III dataset. Results showed that covariate shift adaptation compares favorably with methods used in the BCI competition in coping with nonstationarities. Specifically, bagging combined with covariate shift helped to increase stability when applied to the competition dataset. An online experiment also proved the effectiveness of the bagged covariate shift method. Thus, it can be summarized that covariate shift adaptation is helpful in realizing adaptive BCI systems. PMID:20172795
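
    Covariate shift adaptation reduces, in its simplest form, to importance weighting: training trials are reweighted by the density ratio p_test(x) / p_train(x) before the decoder is fit, so that the training distribution mimics the unlabeled test session. Below is a minimal sketch that models both input densities as one-dimensional Gaussians; real BCI pipelines use more sophisticated density-ratio estimators, so this shows only the principle.

```python
import numpy as np

def gaussian_pdf(x, mu, sigma):
    """Density of a normal distribution, evaluated elementwise."""
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

def importance_weights(x_train, x_test):
    """Density-ratio weights w(x) = p_test(x) / p_train(x), with both
    densities modeled as 1-D Gaussians fitted to the observed inputs."""
    return (gaussian_pdf(x_train, x_test.mean(), x_test.std())
            / gaussian_pdf(x_train, x_train.mean(), x_train.std()))

def weighted_linear_fit(x, y, w):
    """Importance-weighted least squares: minimize sum w_i (y_i - a x_i - b)^2."""
    X = np.column_stack([x, np.ones_like(x)])
    W = np.diag(w)
    a, b = np.linalg.solve(X.T @ W @ X, X.T @ W @ y)
    return a, b
```

    No test-session labels are needed at any point: the weights come from the input distributions alone, which is what makes the approach usable for session-to-session transfer.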

  11. Design of a fault tolerant airborne digital computer. Volume 2: Computational requirements and technology

    NASA Technical Reports Server (NTRS)

    Ratner, R. S.; Shapiro, E. B.; Zeidler, H. M.; Wahlstrom, S. E.; Clark, C. B.; Goldberg, J.

    1973-01-01

    This final report summarizes the work on the design of a fault tolerant digital computer for aircraft. Volume 2 is composed of two parts. Part 1 is concerned with the computational requirements associated with an advanced commercial aircraft. Part 2 reviews the technology that will be available for the implementation of the computer in the 1975-1985 period. With regard to the computational task, 26 computations have been categorized according to computational load, memory requirements, criticality, permitted down-time, and the need to save data in order to effect a roll-back. The technology part stresses the impact of large scale integration (LSI) on the realization of logic and memory. Also considered were module interconnection possibilities so as to minimize fault propagation.

  12. Understanding and enhancing user acceptance of computer technology

    NASA Technical Reports Server (NTRS)

    Rouse, William B.; Morris, Nancy M.

    1986-01-01

    Technology-driven efforts to implement computer technology often encounter problems due to lack of acceptance or begrudging acceptance of the personnel involved. It is argued that individuals' acceptance of automation, in terms of either computerization or computer aiding, is heavily influenced by their perceptions of the impact of the automation on their discretion in performing their jobs. It is suggested that desired levels of discretion reflect needs to feel in control and achieve self-satisfaction in task performance, as well as perceptions of inadequacies of computer technology. Discussion of these factors leads to a structured set of considerations for performing front-end analysis, deciding what to automate, and implementing the resulting changes.

  13. First Year Preservice Teachers' Attitudes toward Computers from Computer Education and Instructional Technology Department

    ERIC Educational Resources Information Center

    Yakin, Ilker; Sumuer, Evren

    2007-01-01

    The purpose of the study is to explore the attitudes of first-year university students towards computers. The study focuses on preservice teachers (N=46), 33 male and 12 female, from the Middle East Technical University Computer Education and Instructional Technology (CEIT) department. The study is delimited to first-year preservice teachers…

  14. Mechanical Design Technology--Modified. (Computer Assisted Drafting, Computer Aided Design). Curriculum Grant 84/85.

    ERIC Educational Resources Information Center

    Schoolcraft Coll., Livonia, MI.

    This document is a curriculum guide for a program in mechanical design technology (computer-assisted drafting and design developed at Schoolcraft College, Livonia, Michigan). The program helps students to acquire the skills of drafters and to interact with electronic equipment, with the option of becoming efficient in the computer-aided…

  15. Engineering Technology Programs Courses Guide for Computer Aided Design and Computer Aided Manufacturing.

    ERIC Educational Resources Information Center

    Georgia Univ., Athens. Div. of Vocational Education.

    This guide describes the requirements for courses in computer-aided design and computer-aided manufacturing (CAD/CAM) that are part of engineering technology programs conducted in vocational-technical schools in Georgia. The guide is organized in five sections. The first section provides a rationale for occupations in design and in production,…

  16. Portable Computer Technology (PCT) Research and Development Program Phase 2

    NASA Technical Reports Server (NTRS)

    Castillo, Michael; McGuire, Kenyon; Sorgi, Alan

    1995-01-01

    This project report focused on: (1) design and development of two Advanced Portable Workstation 2 (APW 2) units, which incorporate advanced technology features such as a low-power Pentium processor, a high-resolution color display, National Television Standards Committee (NTSC) video handling capabilities, a Personal Computer Memory Card International Association (PCMCIA) interface, and Small Computer System Interface (SCSI) and ethernet interfaces; (2) use of these units to integrate and demonstrate advanced wireless network and portable video capabilities; and (3) qualification of the APW 2 systems for use in specific experiments aboard the Mir Space Station. A major objective of the PCT Phase 2 program was to help guide future choices in computing platforms and techniques for meeting National Aeronautics and Space Administration (NASA) mission objectives, with a focus on developing optimal configurations of computing hardware, software applications, and network technologies for use on NASA missions.

  17. Program on Promoting Climate Change Adaptation Technologies Bridging Policy Making and Science Research in Taiwan

    NASA Astrophysics Data System (ADS)

    Chiang, Y.; Chiang, W.; Sui, C.; Tung, C.; Ho, H.; Li, M.; Chan, S.; Climate Change Adaptation Technologies Program, National Science Council, Taiwan

    2010-12-01

    Climate change adaptation calls for innovative technologies and demands transdisciplinary studies at various temporal and spatial scales. In our proposed program, a systematic and scientific framework will be developed to promote innovative adaptation technologies with respect to providing decision-making information for government sectors, enhancing the applicability of scientific research output, strengthening national research capabilities, and integrating both academic and non-academic resources. The objectives of this program are to identify key issues, required technologies, and scientific knowledge for climate change adaptation, and to build a transdisciplinary platform bridging science-supported technologies required by government sectors and demand-oriented scientific research conducted by academic communities. The approach proposed herein will be practiced in vulnerable regions, such as urban, rural, mountain, river basin, and coastal areas, which are particularly sensitive to climate change. The first phase of three years (2011~2013) of work is to deploy the framework and strategies of climate change impact assessment and adaptation measures between related government sectors and researchers from academic communities. The proposed framework involves three principal research groups, namely Environmental System, Vulnerability Assessment, and Risk Management and Adaptation Technology. The goal of the first group, Environmental System, is to combine climate change projections with enhanced scientific and environmental monitoring technologies for better adaptation to future scenarios in different social, economic, and environmental sectors, to support the planning of adaptation measures, and to reduce the uncertainties in assessing vulnerability. The goal of the second group, Vulnerability Assessment, is to identify the interfaces and information structures of climate change vulnerability issues and to develop protocols, models, and indices for vulnerability assessment. The goal of…

  18. Application of software technology to a future spacecraft computer design

    NASA Technical Reports Server (NTRS)

    Labaugh, R. J.

    1980-01-01

    A study was conducted to determine how major improvements in spacecraft computer systems can be obtained from recent advances in hardware and software technology. Investigations into integrated circuit technology indicated that the CMOS/SOS chip set being developed for the Air Force Avionics Laboratory at Wright Patterson had the best potential for improving the performance of spaceborne computer systems. An integral part of the chip set is the bit slice arithmetic and logic unit. The flexibility allowed by microprogramming, combined with the software investigations, led to the specification of a baseline architecture and instruction set.

  19. L(sub 1) Adaptive Flight Control System: Flight Evaluation and Technology Transition

    NASA Technical Reports Server (NTRS)

    Xargay, Enric; Hovakimyan, Naira; Dobrokhodov, Vladimir; Kaminer, Isaac; Gregory, Irene M.; Cao, Chengyu

    2010-01-01

    Certification of adaptive control technologies for both manned and unmanned aircraft represents a major challenge for current Verification and Validation techniques. A key missing step towards flight certification of adaptive flight control systems is the definition and development of analysis tools and methods to support Verification and Validation for nonlinear systems, similar to the procedures currently used for linear systems. In this paper, we describe and demonstrate the advantages of L(sub 1) adaptive control architectures for closing some of the gaps in certification of adaptive flight control systems, which may facilitate the transition of adaptive control into military and commercial aerospace applications. As illustrative examples, we present the results of a piloted simulation evaluation on the NASA AirSTAR flight test vehicle, and results of an extensive flight test program conducted by the Naval Postgraduate School to demonstrate the advantages of L(sub 1) adaptive control as a verifiable robust adaptive flight control system.

  20. Collaborative Learning with Multi-Touch Technology: Developing Adaptive Expertise

    ERIC Educational Resources Information Center

    Mercier, Emma M.; Higgins, Steven E.

    2013-01-01

    Developing fluency and flexibility in mathematics is a key goal of upper primary schooling; however, while fluency can be developed with practice, designing activities that support the development of flexibility is more difficult. Drawing on concepts of adaptive expertise, we developed a task for a multi-touch classroom, NumberNet, that aimed to…

  1. Study on rule-based adaptive fuzzy excitation control technology

    NASA Astrophysics Data System (ADS)

    Zhao, Hui; Wang, Hong-jun; Liu, Lu-yuan; Yue, You-jun

    2008-10-01

    A power system is a typical nonlinear system, and it is hard to achieve excellent control performance with a conventional PID controller under different operating conditions. A fuzzy parameter-adaptive PID excitation controller is effective at overcoming the influence of small disturbances, but the performance of the control system worsens when the operating conditions change greatly or larger disturbances occur. To solve this problem, this article presents a rule-adaptive fuzzy control scheme for a synchronous generator excitation system. In this scheme, control rule adaptation is implemented by regulating the value of the parameter di under the given proportional divisors K1, K2, and K3 of the fuzzy sets Ai and Bi. The rule-adaptive mechanism is constituted by two groups of original rules governing the self-generation and self-correction of the control rules. Using these two groups of rules, the control rules activated by states 1 and 2 in figure 2 can be regulated automatically and simultaneously at time instant k. Results from both theoretical analysis and simulation show that the presented scheme is effective, feasible, and performs well.
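
    The abstract does not give the full rule base or the di/K1-K3 parameters, but the general idea of rescheduling a PID gain from a fuzzy membership of the error can be sketched as follows; the membership shape, gains, and the single rule used here ("raise Kp when the error is large") are illustrative assumptions, not the paper's scheme.

```python
def mu_large(e, spread=1.0):
    """Saturating-triangular membership for 'error is large', on [0, 1]."""
    return min(1.0, abs(e) / spread)

class FuzzyPID:
    """PID whose proportional gain is rescheduled each step by a fuzzy rule."""
    def __init__(self, kp0=2.0, ki=0.5, kd=0.05, dt=0.01):
        self.kp0, self.ki, self.kd, self.dt = kp0, ki, kd, dt
        self.integral, self.prev_e = 0.0, 0.0

    def step(self, setpoint, measured):
        e = setpoint - measured
        de = (e - self.prev_e) / self.dt
        kp = self.kp0 * (1.0 + mu_large(e))   # rule: large error -> higher Kp
        self.integral += e * self.dt
        self.prev_e = e
        return kp * e + self.ki * self.integral + self.kd * de
```

    Recomputing the gain fresh each step (gain scheduling) rather than accumulating rule adjustments keeps the loop bounded and easy to analyze.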

  2. Computational Characterization of Visually Induced Auditory Spatial Adaptation

    PubMed Central

    Wozny, David R.; Shams, Ladan

    2011-01-01

    Recent research investigating the principles governing human perception has provided increasing evidence for probabilistic inference in human perception. For example, human auditory and visual localization judgments closely resemble those of a Bayesian causal inference observer, in which the underlying causal structure of the stimuli is inferred based on both the available sensory evidence and prior knowledge. However, most previous studies have focused on characterizing perceptual inference within a static environment, and therefore little is known about how this inference process changes when observers are exposed to a new environment. In this study we aimed to computationally characterize the change in auditory spatial perception induced by repeated auditory–visual spatial conflict, known as the ventriloquist aftereffect. In theory, this change could reflect a shift in the auditory sensory representations (i.e., a shift in the auditory likelihood distribution), a decrease in the precision of the auditory estimates (i.e., an increase in the spread of the likelihood distribution), a shift in the auditory bias (i.e., a shift in the prior distribution), an increase or decrease in the strength of the auditory bias (i.e., the spread of the prior distribution), or a combination of these. By quantitatively estimating the parameters of the perceptual process for each individual observer using a Bayesian causal inference model, we found that the shift in the perceived locations after exposure was associated with a shift in the mean of the auditory likelihood functions in the direction of the experienced visual offset. The results suggest that repeated exposure to a fixed auditory–visual discrepancy is attributed by the nervous system to sensory representation error; as a result, the sensory map of space is recalibrated to correct the error. PMID:22069383
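
    The model class described (Gaussian auditory and visual likelihoods combined with a spatial prior, plus a shift of the auditory likelihood mean toward the visual offset) can be illustrated with a minimal sketch; the variances, learning rate, and shift rule below are illustrative assumptions, not the paper's fitted parameters or its full causal inference model.

```python
def fused_estimate(x_a, x_v, var_a, var_v, mu_p=0.0, var_p=1e6):
    """Reliability-weighted combination of auditory and visual likelihoods
    with a Gaussian spatial prior (the common-cause case)."""
    w_a, w_v, w_p = 1 / var_a, 1 / var_v, 1 / var_p
    return (w_a * x_a + w_v * x_v + w_p * mu_p) / (w_a + w_v + w_p)

def recalibrate(mu_shift, x_a, x_v, rate=0.1):
    """Model the aftereffect as a shift of the auditory likelihood mean:
    after each discrepant trial, shift by a fraction of the residual error.
    Vision is assumed far more precise (var 0.25) than audition (var 4.0)."""
    perceived = fused_estimate(x_a + mu_shift, x_v, var_a=4.0, var_v=0.25)
    return mu_shift + rate * (perceived - (x_a + mu_shift))

# Repeated exposure to a fixed +5 degree audio-visual offset.
shift = 0.0
for _ in range(50):
    shift = recalibrate(shift, x_a=0.0, x_v=5.0)
```

    Under these assumptions the auditory map drifts toward the experienced visual offset, mirroring the likelihood-mean shift the study reports.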

  3. Helicopter mission optimization study. [portable computer technology for flight optimization

    NASA Technical Reports Server (NTRS)

    Olson, J. R.

    1978-01-01

    The feasibility of using low-cost, portable computer technology to help a helicopter pilot optimize flight parameters to minimize fuel consumption and takeoff and landing noise was demonstrated. Eight separate computer programs were developed for use in the helicopter cockpit on a hand-held computer. The programs give the helicopter pilot the ability to calculate power required, minimum fuel consumption for both range and endurance, maximum speed, and a minimum noise profile for both takeoff and landing. Each program is defined by a maximum of two magnetic cards. The helicopter pilot is required to key in the proper input parameters, such as gross weight, outside air temperature, or pressure altitude.
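
    The kind of calculation described, finding the best-endurance (minimum power) and best-range (minimum power per unit speed) airspeeds from the inputs the pilot keys in, can be sketched with a textbook power model, power = induced + parasite ~ a/V + b*V^3; the coefficients and grid search below are illustrative assumptions, not the actual hand-held programs.

```python
def best_speeds(weight=2000.0, k_i=1e-2, k_p=0.01):
    """Grid-search a simple power curve for the max-endurance (min power)
    and max-range (min power-per-speed) airspeeds, in m/s."""
    def power(v):
        return k_i * weight**2 / v + k_p * v**3  # induced + parasite power

    speeds = [v / 10 for v in range(10, 801)]    # 1.0 .. 80.0 m/s
    v_endurance = min(speeds, key=power)
    v_range = min(speeds, key=lambda v: power(v) / v)
    return v_endurance, v_range
```

    As in the real programs, the gross weight enters the induced-power term, and the best-range speed comes out higher than the best-endurance speed.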

  4. Metabolic adaptation of skeletal muscle to high altitude hypoxia: how new technologies could resolve the controversies

    PubMed Central

    2009-01-01

    In most tissues of the body, cellular ATP production predominantly occurs via mitochondrial oxidative phosphorylation of reduced intermediates, which are in turn derived from substrates such as glucose and fatty acids. In order to maintain ATP homeostasis, and therefore cellular function, the mitochondria require a constant supply of fuels and oxygen. In many disease states, or in healthy individuals at altitude, tissue oxygen levels fall and the cell must meet this hypoxic challenge to maintain energetics and limit oxidative stress. In humans at altitude and patients with respiratory disease, loss of skeletal muscle mitochondrial density is a consistent finding. Recent studies that have used cultured cells and genetic mouse models have elucidated a number of elegant adaptations that allow cells with a diminished mitochondrial population to function effectively in hypoxia. This article reviews these findings alongside studies of hypoxic human skeletal muscle, putting them into the context of whole-body physiology and acclimatization to high-altitude hypoxia. A number of current controversies are highlighted, which may eventually be resolved by a systems physiology approach that considers the time- or tissue-dependent nature of some adaptive responses. Future studies using high-throughput metabolomic, transcriptomic, and proteomic technologies to investigate hypoxic skeletal muscle in humans and animal models could resolve many of these controversies, and a case is therefore made for the integration of resulting data into computational models that account for factors such as duration and extent of hypoxic exposure, subjects' backgrounds, and whether data have been acquired from active or sedentary individuals. An integrated and more quantitative understanding of the body's metabolic response to hypoxia and the conditions under which adaptive processes occur could reveal much about the ways that tissues function in the very many disease states where hypoxia is a

  5. Cases on Technological Adaptability and Transnational Learning: Issues and Challenges

    ERIC Educational Resources Information Center

    Mukerji, Siran, Ed.; Tripathi, Purnendu, Ed.

    2010-01-01

    Technology holds the key for bridging the gap between access to quality education and the need for enhanced learning experiences. This book contains case studies on divergent themes of personalized learning environments, inclusive learning for social change, innovative learning and assessment techniques, technology and international partnership…

  6. Adapting Technology for School Improvement: A Global Perspective

    ERIC Educational Resources Information Center

    Chapman, David W., Ed.; Mahlck, Lars O., Ed.

    2004-01-01

    This book presents a compilation of articles based on the premise that the move to advanced technology use in primary and secondary schools offers great hope for improving the access, quality, and efficiency of basic education. The aim of the book is to identify and examine how information technologies can be, and are being, used to strengthen the…

  7. Applications of automatic mesh generation and adaptive methods in computational medicine

    SciTech Connect

    Schmidt, J.A.; Macleod, R.S.; Johnson, C.R.; Eason, J.C.

    1995-12-31

    Important problems in Computational Medicine exist that can benefit from the implementation of adaptive mesh refinement techniques. Biological systems are so inherently complex that only efficient models running on state-of-the-art hardware can begin to simulate reality. To tackle the complex geometries associated with medical applications, we present a general-purpose mesh generation scheme based upon the Delaunay tessellation algorithm and an iterative point generator. In addition, automatic two- and three-dimensional adaptive mesh refinement methods are presented that are derived from local and global estimates of the finite element error. Mesh generation and adaptive refinement techniques are utilized to obtain accurate approximations of bioelectric fields within anatomically correct models of the heart and human thorax. Specifically, we explore the simulation of cardiac defibrillation and the general forward and inverse problems in electrocardiography (ECG). Comparisons between uniform and adaptive refinement techniques are made to highlight the computational efficiency and accuracy of adaptive methods in the solution of field problems in computational medicine.
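
    The estimate-mark-refine loop at the heart of such adaptive methods can be illustrated in one dimension (the paper itself works with 2-D and 3-D Delaunay meshes); the midpoint-interpolation error estimator, tolerance, and test function below are illustrative assumptions.

```python
import math

def refine(nodes, f, tol=1e-3, max_iter=20):
    """Repeatedly bisect intervals whose local interpolation error estimate
    (gap between f at the midpoint and the linear interpolant) exceeds tol."""
    for _ in range(max_iter):
        new_nodes = set(nodes)
        for a, b in zip(nodes, nodes[1:]):
            mid = 0.5 * (a + b)
            est = abs(f(mid) - 0.5 * (f(a) + f(b)))  # local error estimate
            if est > tol:
                new_nodes.add(mid)                   # mark and refine
        if len(new_nodes) == len(nodes):
            break                                    # everywhere below tol
        nodes = sorted(new_nodes)
    return nodes

# A sharp feature near x = 0.3 attracts refinement; flat regions stay coarse.
mesh = refine([0.0, 0.5, 1.0], lambda x: math.exp(-100 * (x - 0.3) ** 2))
```

    The payoff is the one the abstract highlights: resolution is concentrated where the solution varies rapidly, instead of uniformly everywhere.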

  8. Instructors' Integration of Computer Technology: Examining the Role of Interaction

    ERIC Educational Resources Information Center

    Kim, Hoe Kyeung; Rissel, Dorothy

    2008-01-01

    Computer technology has the potential to provide rich resources for language teaching and learning. However, it continues to be underutilized, even though its availability, familiarity, and sophistication are steadily increasing. This case study explored the way in which three language instructors' beliefs about language teaching and learning…

  9. Introduction to CAD/Computers. High-Technology Training Module.

    ERIC Educational Resources Information Center

    Lockerby, Hugh

    This learning module for an eighth-grade introductory technology course is designed to help teachers introduce students to computer-assisted design (CAD) in a communications unit on graphics. The module contains a module objective and five specific objectives, a content outline, suggested instructor methodology, student activities, a list of six…

  10. Computer-Aided Drafting. Education for Technology Employment.

    ERIC Educational Resources Information Center

    Northern Illinois Univ., De Kalb. Dept. of Technology.

    This computer-aided drafting (CAD) curriculum was developed to provide drafting instructors in Illinois with a useful guide for relating an important new technological advance to the vocational classroom. The competency-based learning activity guides are written to be used with any CAD system being used at the secondary and postsecondary levels.…

  11. Implementation of Assistive Computer Technology: A Model for School Systems

    ERIC Educational Resources Information Center

    Morrison, Karen

    2007-01-01

    Many researchers conclude that assistive computer technology (ACT) has the potential for improving educational outcomes and improving the quality of life for those with disabilities (Blackhurst & Edyburn, 2000; Fisher & Frey 2001; Lewis, 1993; Lindsey, 1993). While it is recognized that ACT can have a positive impact on learning for students with…

  12. Troubling Discourse: Basic Writing and Computer-Mediated Technologies

    ERIC Educational Resources Information Center

    Jonaitis, Leigh A.

    2012-01-01

    Through an examination of literature in the fields of Basic Writing and developmental education, this essay provides some historical perspective and examines the prevalent discourses on the use of computer-mediated technologies in the basic writing classroom. The author uses Bertram Bruce's (1997) framework of various "stances" on…

  13. Computer Technology Integration and Student Learning: Barriers and Promise

    ERIC Educational Resources Information Center

    Keengwe, Jared; Onchwari, Grace; Wachira, Patrick

    2008-01-01

    Political and institutional support has enabled many institutions of learning to spend millions of dollars to acquire educational computing tools (Ficklen and Muscara, "Am Educ" 25(3):22-29, 2001) that have not been effectively integrated into the curriculum. While access to educational technology tools has remarkably improved in most schools,…

  14. Computer integrated manufacturing and technology transfer for improving aerospace productivity

    NASA Astrophysics Data System (ADS)

    Farrington, P. A.; Sica, J.

    1992-03-01

    This paper reviews a cooperative effort between the Alabama Industrial Development Training Institute and the University of Alabama in Huntsville to implement a prototype computer-integrated manufacturing system. The primary use of this system will be to educate Alabama companies on the organizational and technological issues involved in the implementation of advanced manufacturing systems.

  15. ENVIRONMENTAL CONSEQUENCES OF TELEMATICS: TELECOMMUNICATION, COMPUTATION, AND INFORMATION TECHNOLOGIES

    EPA Science Inventory

    Current important research needs whose results will be critical to Environmental Protection Agency's mission in the next two to three decades with regard to a major expansion in the use of telematics, i.e. telecommunications, computer, and information technology, are identified. ...

  16. Computational Structures Technology for Airframes and Propulsion Systems

    NASA Technical Reports Server (NTRS)

    Noor, Ahmed K. (Compiler); Housner, Jerrold M. (Compiler); Starnes, James H., Jr. (Compiler); Hopkins, Dale A. (Compiler); Chamis, Christos C. (Compiler)

    1992-01-01

    This conference publication contains the presentations and discussions from the joint University of Virginia (UVA)/NASA Workshops. The presentations included NASA Headquarters perspectives on High Speed Civil Transport (HSCT), goals and objectives of the UVA Center for Computational Structures Technology (CST), NASA and Air Force CST activities, CST activities for airframes and propulsion systems in industry, and CST activities at Sandia National Laboratory.

  17. NASA CST aids U.S. industry. [computational structures technology

    NASA Technical Reports Server (NTRS)

    Housner, Jerry M.; Pinson, Larry D.

    1993-01-01

    The effect of NASA's Computational Structures Technology (CST) research on aerospace vehicle design and operation is discussed. The application of this research to a proposed version of a high-speed civil transport, to composite structures in aerospace, to the study of crack growth, and to resolving field problems is addressed.

  18. Computer Technology and Student Preferences in a Nutrition Course

    ERIC Educational Resources Information Center

    Temple, Norman J.; Kemp, Wendy C.; Benson, Wendy A.

    2006-01-01

    This study assessed learner preferences for using computer-based technology in a distance education course. A questionnaire was posted to students who had taken an undergraduate nutrition course at Athabasca University, Canada. The response rate was 57.1% (176 returned out of 308). Subjects were predominately female (93.7%) and nursing students…

  19. Beyond Computer Literacy: Technology Integration and Curriculum Transformation

    ERIC Educational Resources Information Center

    Safar, Ammar H.; AlKhezzi, Fahad A.

    2013-01-01

    Personal computers, the Internet, smartphones, and other forms of information and communication technology (ICT) have changed our world, our job, our personal lives, as well as how we manage our knowledge and time effectively and efficiently. Research findings in the past decades have acknowledged and affirmed that the content the ICT medium…

  20. Pervasive Computing and Communication Technologies for U-Learning

    ERIC Educational Resources Information Center

    Park, Young C.

    2014-01-01

    The development of digital information transfer, storage, and communication methods has a significant effect on education. The assimilation of pervasive computing and communication technologies marks another great step forward, with Ubiquitous Learning (U-learning) emerging for next-generation learners. In the evolutionary view the 5G (or…

  1. Computer-Assisted Foreign Language Teaching and Learning: Technological Advances

    ERIC Educational Resources Information Center

    Zou, Bin; Xing, Minjie; Wang, Yuping; Sun, Mingyu; Xiang, Catherine H.

    2013-01-01

    Computer-Assisted Foreign Language Teaching and Learning: Technological Advances highlights new research and an original framework that brings together foreign language teaching, experiments and testing practices that utilize the most recent and widely used e-learning resources. This comprehensive collection of research will offer linguistic…

  2. Implications of Computer Technology. Harvard University Program on Technology and Society.

    ERIC Educational Resources Information Center

    Taviss, Irene; Burbank, Judith

    Lengthy abstracts of a small number of selected books and articles on the implications of computer technology are presented, preceded by a brief state-of-the-art survey which traces the impact of computers on the structure of economic and political organizations and socio-cultural patterns. A summary statement introduces each of the three abstract…

  3. Providing Assistive Technology Applications as a Service Through Cloud Computing.

    PubMed

    Mulfari, Davide; Celesti, Antonio; Villari, Massimo; Puliafito, Antonio

    2015-01-01

    Users with disabilities interact with Personal Computers (PCs) using Assistive Technology (AT) software solutions. Such applications run on a PC that a person with a disability commonly uses. However, configuring AT applications is not trivial, especially when the user needs to work on a PC that does not allow him/her to rely on his/her AT tools (e.g., at work, at university, in an Internet point). In this paper, we discuss how cloud computing provides a valid technological solution to enhance such a scenario. With the emergence of cloud computing, many applications are executed on top of virtual machines (VMs). Virtualization allows us to achieve a software implementation of a real computer able to execute a standard operating system and any kind of application. In this paper we propose to build personalized VMs running AT programs and settings. By using remote desktop technology, our solution enables users to control their customized virtual desktop environment by means of an HTML5-based web interface running on any computer equipped with a browser, wherever they are. PMID:26132225

  4. Overview of European technology in computers, telecommunications, and electronics

    NASA Astrophysics Data System (ADS)

    Blackburn, J. F.

    1990-05-01

    The emergence of the personal computer, the growing use of distributed systems, and the increasing demand for supercomputers and mini-supercomputers are causing a profound impact on the European computer market. An equally profound development in telecommunications is the integration of voice, data, and images in the public network systems - the Integrated Service Digital Network (ISDN). The programs being mounted in Europe to meet the challenges of these technologies are described. The Europe-wide trends and actions with respect to computers, telecommunications, and microelectronics are discussed, and the major European collaborative programs in these fields are described. Specific attention is given to the European Strategic Programme for Research and Development in Information (ESPRIT); Research in Advanced Communications for Europe (RACE); European Research Coordination Agency (Eureka) programs; Joint European Submicron Silicon Initiative (JESSI); and the recently combined programs Basic Research Industrial Technologies in Europe/European Research in Advanced Materials (BRITE/EURAM).

  5. Implementing a Computer/Technology Endorsement in a Classroom Technology Master's Program.

    ERIC Educational Resources Information Center

    Brownell, Gregg; O'Bannon, Blanche; Brownell, Nancy

    In the spring of 1998, the Master's program in Classroom Technology at Bowling Green State University (Ohio) was granted conditional approval to grant, as part of the program, the new State of Ohio Department of Education computer/technology endorsement. This paper briefly describes Ohio's change from certification to licensure, the removal of…

  6. The Adoption of Grid Computing Technology by Organizations: A Quantitative Study Using Technology Acceptance Model

    ERIC Educational Resources Information Center

    Udoh, Emmanuel E.

    2010-01-01

    Advances in grid technology have enabled some organizations to harness enormous computational power on demand. However, the prediction of widespread adoption of the grid technology has not materialized despite the obvious grid advantages. This situation has encouraged intense efforts to close the research gap in the grid adoption process. In this…

  7. Impact of Load Balancing on Unstructured Adaptive Grid Computations for Distributed-Memory Multiprocessors

    NASA Technical Reports Server (NTRS)

    Sohn, Andrew; Biswas, Rupak; Simon, Horst D.

    1996-01-01

    The computational requirements for an adaptive solution of unsteady problems change as the simulation progresses. This causes workload imbalance among processors on a parallel machine which, in turn, requires significant data movement at runtime. We present a new dynamic load-balancing framework, called JOVE, that balances the workload across all processors with a global view. Whenever the computational mesh is adapted, JOVE is activated to eliminate the load imbalance. JOVE has been implemented on an IBM SP2 distributed-memory machine in MPI for portability. Experimental results for two model meshes demonstrate that mesh adaption with load balancing gives more than a sixfold improvement over one without load balancing. We also show that JOVE gives a 24-fold speedup on 64 processors compared to sequential execution.
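
    The abstract does not spell out JOVE's algorithm, but the idea of rebalancing with a global view after mesh adaption changes per-processor workloads can be sketched with a simple greedy policy; the unit-of-work transfer scheme below is an illustrative assumption, not JOVE's actual method.

```python
def rebalance(loads, unit=1.0):
    """Greedily move `unit` of work from the heaviest to the lightest
    processor while doing so still narrows the spread; return the new
    per-processor loads and the number of transfers performed."""
    loads = list(loads)
    moves = 0
    while True:
        hi = max(range(len(loads)), key=lambda i: loads[i])
        lo = min(range(len(loads)), key=lambda i: loads[i])
        if loads[hi] - loads[lo] <= unit:
            break                      # already as balanced as `unit` allows
        loads[hi] -= unit
        loads[lo] += unit
        moves += 1
    return loads, moves
```

    In a real framework the transfers would correspond to migrating mesh partitions between processors, so minimizing `moves` (data movement) matters as much as the final balance.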

  8. Promoting Contextual Vocabulary Learning through an Adaptive Computer-Assisted EFL Reading System

    ERIC Educational Resources Information Center

    Wang, Y.-H.

    2016-01-01

    The study developed an adaptive computer-assisted reading system and investigated its effect on promoting English as a foreign language learner-readers' contextual vocabulary learning performance. Seventy Taiwanese college students were assigned to two reading groups. Participants in the customised reading group read online English texts, each of…

  9. Students' Perceived Usefulness of Formative Feedback for a Computer-Adaptive Test

    ERIC Educational Resources Information Center

    Lilley, Mariana; Barker, Trevor

    2007-01-01

    In this paper we report on research related to the provision of automated feedback based on a computer adaptive test (CAT), used in formative assessment. A cohort of 76 second year university undergraduates took part in a formative assessment with a CAT and were provided with automated feedback on their performance. A sample of students responded…

  10. Lessons Learned in Designing and Implementing a Computer-Adaptive Test for English

    ERIC Educational Resources Information Center

    Burston, Jack; Neophytou, Maro

    2014-01-01

    This paper describes the lessons learned in designing and implementing a computer-adaptive test (CAT) for English. The early identification of students with weak L2 English proficiency is of critical importance in university settings that have compulsory English language course graduation requirements. The most efficient means of diagnosing the L2…

  11. Comparing Computer Adaptive and Curriculum-Based Measures of Math in Progress Monitoring

    ERIC Educational Resources Information Center

    Shapiro, Edward S.; Dennis, Minyi Shih; Fu, Qiong

    2015-01-01

    The purpose of the study was to compare the use of a Computer Adaptive Test and Curriculum-Based Measurement in the assessment of mathematics. This study also investigated the degree to which slope or rate of change predicted student outcomes on the annual state assessment of mathematics above and beyond scores of single point screening…

  12. A Practical Computer Adaptive Testing Model for Small-Scale Scenarios

    ERIC Educational Resources Information Center

    Tao, Yu-Hui; Wu, Yu-Lung; Chang, Hsin-Yi

    2008-01-01

    Computer adaptive testing (CAT) is theoretically sound and efficient, and is commonly seen in larger testing programs. It is, however, rarely seen in smaller-scale scenarios, such as classrooms or daily business routines, because of the complexity of most adopted Item Response Theory (IRT) models. While the Sequential Probability Ratio Test…

  13. The Effect of Adaptive Confidence Strategies in Computer-Assisted Instruction on Learning and Learner Confidence

    ERIC Educational Resources Information Center

    Warren, Richard Daniel

    2012-01-01

    The purpose of this research was to investigate the effects of including adaptive confidence strategies in instructionally sound computer-assisted instruction (CAI) on learning and learner confidence. Seventy-one general educational development (GED) learners recruited from various GED learning centers at community colleges in the southeast United…

  14. The ARCS Model for Developing Motivationally-Adaptive Computer-Assisted Instruction.

    ERIC Educational Resources Information Center

    Song, Sang H.; Keller, John M.

    This study examined the effects of a prototype of motivationally-adaptive CAI (Computer Assisted Instruction), developed in accordance with the ARCS (Attention, Relevance, Confidence, Satisfaction) model of motivational design, on achievement, perceived motivation (both overall motivation and for each of the ARCS components), efficiency, and…

  15. An Adaptive Feedback and Review Paradigm for Computer-Based Drills.

    ERIC Educational Resources Information Center

    Siegel, Martin A.; Misselt, A. Lynn

    The Corrective Feedback Paradigm (CFP), which has been refined and expanded through use on the PLATO IV Computer-Based Education System, is based on instructional design strategies implied by stimulus-locus analyses, direct instruction, and instructional feedback methods. Features of the paradigm include adaptive feedback techniques with…

  16. A case for Sandia investment in complex adaptive systems science and technology.

    SciTech Connect

    Colbaugh, Richard; Tsao, Jeffrey Yeenien; Johnson, Curtis Martin; Backus, George A.; Brown, Theresa Jean; Jones, Katherine A.

    2012-05-01

    This white paper makes a case for Sandia National Laboratories investments in complex adaptive systems science and technology (S&T) -- investments that could enable higher-value-added and more-robustly-engineered solutions to challenges of importance to Sandia's national security mission and to the nation. Complex adaptive systems are ubiquitous in Sandia's national security mission areas. We often ignore the adaptive complexity of these systems by narrowing our 'aperture of concern' to systems or subsystems with a limited range of function exposed to a limited range of environments over limited periods of time. But by widening our aperture of concern we could increase our impact considerably. To do so, the science and technology of complex adaptive systems must mature considerably. Despite an explosion of interest outside of Sandia, however, that science and technology is still in its youth. What has been missing is contact with real (rather than model) systems and real domain-area detail. With its center of gravity as an engineering laboratory, Sandia has made considerable progress applying existing science and technology to real complex adaptive systems. It has focused much less, however, on advancing the science and technology itself. But its close contact with real systems and real domain-area detail represents a powerful strength with which to help complex adaptive systems science and technology mature. Sandia is thus both a prime beneficiary of, and potentially a prime contributor to, complex adaptive systems science and technology.
Building a productive program in complex adaptive systems science and technology at Sandia will not be trivial, but a credible path can be envisioned: in the short run, continue to apply existing science and technology to real domain-area complex adaptive systems; in the medium run, jump-start the creation of new science and technology capability through Sandia's Laboratory Directed Research and Development program; and

  17. Preservice Teachers' Computer Literacy: Validation of an Instrument To Measure Self-efficacy for Computer-based Technologies.

    ERIC Educational Resources Information Center

    Buhendwa, Frank M.

    Instruments used in a study by M. B. Kinzie and M. A. Delacourt (1991), the Attitude towards Computer Technologies (ACT) and the Self-efficacy for Computer Technologies (SCT), assess preservice teachers' perceived usefulness of and comfort level with specific computer technologies. This study uses a population confirmed to be similar to that used…

  18. National Energy Research Scientific Computing Center (NERSC): Advancing the frontiers of computational science and technology

    SciTech Connect

    Hules, J.

    1996-11-01

    National Energy Research Scientific Computing Center (NERSC) provides researchers with high-performance computing tools to tackle science's biggest and most challenging problems. Founded in 1974 by DOE/ER, the Controlled Thermonuclear Research Computer Center was the first unclassified supercomputer center and was the model for those that followed. Over the years the center's name was changed to the National Magnetic Fusion Energy Computer Center and then to NERSC; it was relocated to LBNL. NERSC, one of the largest unclassified scientific computing resources in the world, is the principal provider of general-purpose computing services to DOE/ER programs: Magnetic Fusion Energy, High Energy and Nuclear Physics, Basic Energy Sciences, Health and Environmental Research, and the Office of Computational and Technology Research. NERSC users are a diverse community located throughout US and in several foreign countries. This brochure describes: the NERSC advantage, its computational resources and services, future technologies, scientific resources, and computational science of scale (interdisciplinary research over a decade or longer; examples: combustion in engines, waste management chemistry, global climate change modeling).

  19. SIMCA T 1.0: A SAS Computer Program for Simulating Computer Adaptive Testing

    ERIC Educational Resources Information Center

    Raiche, Gilles; Blais, Jean-Guy

    2006-01-01

    Monte Carlo methodologies are frequently applied to study the sampling distribution of the estimated proficiency level in adaptive testing. These methods eliminate real situational constraints. However, these Monte Carlo methodologies are not currently supported by the available software programs, and when these programs are available, their…

  20. Self-adaptive phosphor coating technology for wafer-level scale chip packaging

    NASA Astrophysics Data System (ADS)

    Linsong, Zhou; Haibo, Rao; Wei, Wang; Xianlong, Wan; Junyuan, Liao; Xuemei, Wang; Da, Zhou; Qiaolin, Lei

    2013-05-01

    A new self-adaptive phosphor coating technology has been successfully developed, adopting a slurry method combined with a self-exposure process. A phosphor suspension in water-soluble photoresist is applied, exposed by the LED's own blue light, and developed to form a conformal phosphor coating that self-adapts to the angular distribution of blue-light intensity and delivers better spatial color uniformity. The technology has also been successfully applied at the wafer surface to realize a wafer-level scale conformal phosphor coating. First-stage experiments show satisfying results and adequately demonstrate the flexibility of the self-adaptive coating technology for WLSCP applications.

  1. Advanced computer modeling techniques expand belt conveyor technology

    SciTech Connect

    Alspaugh, M.

    1998-07-01

    Increased mining production is continuing to challenge engineers and manufacturers to keep up. The pressure to produce larger and more versatile equipment is increasing. This paper will show some recent major projects in the belt conveyor industry that have pushed the limits of design and engineering technology. Also, it will discuss the systems engineering discipline and advanced computer modeling tools that have helped make these achievements possible. Several examples of technologically advanced designs will be reviewed. However, new technology can sometimes produce increased problems with equipment availability and reliability if not carefully developed. Computer modeling techniques that help one design larger equipment can also compound operational headaches if engineering processes and algorithms are not carefully analyzed every step of the way.

  2. Renewable energy technologies and its adaptation in an urban environment

    SciTech Connect

    Thampi, K. Ravindranathan Byrne, Owen Surolia, Praveen K.

    2014-01-28

    This general article is based on the inaugural talk delivered at the opening of the OMTAT 2013 conference. It notes that the integration of renewable energy sources into the living and transport sectors still presents a daunting task. Although the earth and its atmosphere continually receive 1.7 × 10^17 watts of radiation from the sun, sustainable and environment-friendly energy options supply only about 16% of the world's energy consumption, mostly met by biomass, and a paltry 0.04% is attributed to solar. First- and second-generation solar cells offer mature technologies for applications. The most important difficulty with regard to integration with structures is not only the additional cost, but also the lack of sufficient knowledge in managing the available energy smartly and efficiently. The incorporation of PV as a part of the building fabric greatly reduces the overall costs compared with retrofitting. BIPV (Building Integrated Photovoltaics) is a critical technology for establishing aesthetically pleasing solar structures. Infusing PV into building elements is greatly simplified by some of the second-generation thin-film technologies now manufactured as flexible panels. The same holds true for third-generation technologies under development, such as dye- and quantum-dot-sensitized solar cells. Additionally, these technologies offer transparent or translucent solar cells for incorporation into windows and skylights. This review deals with the present state of solar cell technologies suitable for BIPV and the status of BIPV applications and their future prospects.

  3. Renewable energy technologies and its adaptation in an urban environment

    NASA Astrophysics Data System (ADS)

    Thampi, K. Ravindranathan; Byrne, Owen; Surolia, Praveen K.

    2014-01-01

    This general article is based on the inaugural talk delivered at the opening of the OMTAT 2013 conference. It notes that the integration of renewable energy sources into the living and transport sectors still presents a daunting task. Although the earth and its atmosphere continually receive 1.7 × 10^17 watts of radiation from the sun, sustainable and environment-friendly energy options supply only about 16% of the world's energy consumption, mostly met by biomass, and a paltry 0.04% is attributed to solar. First- and second-generation solar cells offer mature technologies for applications. The most important difficulty with regard to integration with structures is not only the additional cost, but also the lack of sufficient knowledge in managing the available energy smartly and efficiently. The incorporation of PV as a part of the building fabric greatly reduces the overall costs compared with retrofitting. BIPV (Building Integrated Photovoltaics) is a critical technology for establishing aesthetically pleasing solar structures. Infusing PV into building elements is greatly simplified by some of the second-generation thin-film technologies now manufactured as flexible panels. The same holds true for third-generation technologies under development, such as dye- and quantum-dot-sensitized solar cells. Additionally, these technologies offer transparent or translucent solar cells for incorporation into windows and skylights. This review deals with the present state of solar cell technologies suitable for BIPV and the status of BIPV applications and their future prospects.

  4. The role of computer technology in applied computational chemical-physics

    NASA Astrophysics Data System (ADS)

    Chillemi, G.; Rosati, M.; Sanna, N.

    2001-09-01

    In this paper we discuss the increasing role played by past and upcoming silicon technology in solving real computational applications in correlated scientific fields ranging from quantum chemistry and materials science to atomic and molecular physics and biochemistry. Although the wide range of applications of computer technology in these areas does not permit a full rationale of its present and future role, some basic features are so clearly defined that an attempt to identify common numerical behaviours is now feasible. Several theoretical approaches have been developed to study bound and unbound interactions among physical particles, with the aim of providing a feasible numerical path to the solution of the proposed equations. Apart from the evident scientific diversity among the cited computational fields, it is becoming clear that they share common numerical devices in terms of computer architectures, algorithms and low-level functions. This fact, coupled with the role of the numerically intensive technology provider committed to offering a computational solution to the needs of scientific users on a common general-purpose computing platform, offers a unique way to analyse the basic numeric requirements in this area. Specific computational examples in classical and quantum mechanics drawn from biochemistry and physics applications are reported in this paper, and through an exposition of the basic elements of the theories involved, a discussion is opened on alternatives to, and optimization of, the use of current parallel technologies. Whenever possible, numerical results obtained on general-purpose mid-range parallel machines are compared with forecasts for on-silicon routines in order to assess the viability of this solution for the (bio)chemical-physics computational community.

  5. Contingency support using adaptive telemetry extractor and expert system technologies

    NASA Technical Reports Server (NTRS)

    Bryant, Thomas; Cruse, Bryant; Wende, Charles

    1987-01-01

    The 'telemetry analysis logic for operations support' prototype system constitutes an expert system that is charged with contingency planning for the NASA Hubble Space Telescope (HST); this system has demonstrated the feasibility of using an adaptive telemetry extractor/reformatter that is integrated with an expert system. A test case generated by a simulator has demonstrated the reduction of the time required for analysis of a complex series of failures to a few minutes, from the hour usually required. The HST's telemetry extractor will be able to read real-time engineering telemetry streams and disk-based data. Telemetry format changes will be handled almost instantaneously.

  6. IMRT planning on adaptive volume structures--a decisive reduction in computational complexity.

    PubMed

    Scherrer, Alexander; Küfer, Karl-Heinz; Bortfeld, Thomas; Monz, Michael; Alonso, Fernando

    2005-05-01

    The objective of radiotherapy planning is to find a compromise between the conflicting goals of delivering a sufficiently high dose to the target volume while widely sparing critical structures. The search for such a compromise requires the computation of several plans, which mathematically means solving several optimization problems. In the case of intensity modulated radiotherapy (IMRT) these problems are large-scale, hence the accumulated computational expense is very high. The adaptive clustering method presented in this paper overcomes this difficulty. The main idea is to use a preprocessed hierarchy of aggregated dose-volume information as a basis for individually adapted approximations of the original optimization problems. This leads to a decisively reduced computational expense: numerical experiments on several sets of real clinical data typically show computation times decreased by a factor of about 10. In contrast to earlier work in this field, this reduction in computational complexity does not lead to a loss in accuracy: the adaptive clustering method produces the optimum of the original optimization problem. PMID:15843735
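
    The aggregation idea can be sketched in a few lines (an illustrative toy on synthetic data, not the authors' algorithm): voxels whose rows of the dose-influence matrix are nearly identical are merged into weighted clusters, so the optimizer works on far fewer dose constraints.

```python
import numpy as np

def cluster_voxels(D, tol=0.05):
    """Greedily merge voxels (rows of the dose-influence matrix D)
    whose dose responses differ from a cluster mean by less than
    tol (relative to the row norm). Returns cluster means and sizes."""
    sums, sizes = [], []
    for d in D:
        for i, s in enumerate(sums):
            mean = s / sizes[i]
            if np.linalg.norm(d - mean) <= tol * np.linalg.norm(d):
                sums[i] = s + d          # absorb voxel into this cluster
                sizes[i] += 1
                break
        else:
            sums.append(d.copy())        # start a new cluster
            sizes.append(1)
    w = np.array(sizes, dtype=float)
    return np.array(sums) / w[:, None], w

rng = np.random.default_rng(0)
base = rng.random((5, 8))                     # 5 distinct dose responses
D = np.repeat(base, 40, axis=0)               # 200 voxels, mostly near-duplicates
D = D + 0.001 * rng.standard_normal(D.shape)  # small per-voxel variation
Dc, w = cluster_voxels(D)
print(D.shape[0], "->", Dc.shape[0])          # e.g. 200 -> a handful of clusters
```

    A real planner would build such a hierarchy once in preprocessing, optimize on the coarse problem, and refine clusters only where the approximation error matters.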

  7. Architecture-Adaptive Computing Environment: A Tool for Teaching Parallel Programming

    NASA Technical Reports Server (NTRS)

    Dorband, John E.; Aburdene, Maurice F.

    2002-01-01

    Recently, networked and cluster computation have become very popular. This paper is an introduction to a new C based parallel language for architecture-adaptive programming, aCe C. The primary purpose of aCe (Architecture-adaptive Computing Environment) is to encourage programmers to implement applications on parallel architectures by providing them the assurance that future architectures will be able to run their applications with a minimum of modification. A secondary purpose is to encourage computer architects to develop new types of architectures by providing an easily implemented software development environment and a library of test applications. This new language should be an ideal tool to teach parallel programming. In this paper, we will focus on some fundamental features of aCe C.

  8. Computational adaptive optics for broadband interferometric tomography of tissues and cells

    NASA Astrophysics Data System (ADS)

    Adie, Steven G.; Mulligan, Jeffrey A.

    2016-03-01

    Adaptive optics (AO) can shape aberrated optical wavefronts to physically restore the constructive interference needed for high-resolution imaging. With access to the complex optical field, however, many functions of optical hardware can be achieved computationally, including focusing and the compensation of optical aberrations to restore the constructive interference required for diffraction-limited imaging performance. Holography, which employs interferometric detection of the complex optical field, was developed based on this connection between hardware and computational image formation, although this link has only recently been exploited for 3D tomographic imaging in scattering biological tissues. This talk will present the underlying imaging science behind computational image formation with optical coherence tomography (OCT) -- a beam-scanned version of broadband digital holography. Analogous to hardware AO (HAO), we demonstrate computational adaptive optics (CAO) and optimization of the computed pupil correction in 'sensorless mode' (Zernike polynomial corrections with feedback from image metrics) or with the use of 'guide-stars' in the sample. We discuss the concept of an 'isotomic volume' as the volumetric extension of the 'isoplanatic patch' introduced in astronomical AO. Recent CAO results and ongoing work are highlighted to point to the potential biomedical impact of computed broadband interferometric tomography. We also discuss the advantages and disadvantages of HAO vs. CAO for the effective shaping of optical wavefronts, and highlight opportunities for hybrid approaches that synergistically combine the unique advantages of hardware and computational methods for rapid volumetric tomography with cellular resolution.
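
    The central trick of CAO, applying a phase correction in the computed pupil rather than with a deformable mirror, can be illustrated with a toy 'sensorless' search over a single Zernike defocus coefficient (a minimal sketch with made-up parameters, not the authors' pipeline):

```python
import numpy as np

def defocus_phase(n, coeff):
    """Zernike defocus term coeff * (2 r^2 - 1) on an n x n pupil grid."""
    y, x = np.mgrid[-1:1:n * 1j, -1:1:n * 1j]
    return coeff * (2 * (x**2 + y**2) - 1)

def apply_pupil_phase(field, phase):
    """Multiply the field's spectrum by exp(i * phase): a pupil-plane correction."""
    F = np.fft.fftshift(np.fft.fft2(field))
    return np.fft.ifft2(np.fft.ifftshift(F * np.exp(1j * phase)))

def sharpness(field):
    """Image metric: higher when intensity is concentrated (in focus)."""
    I = np.abs(field) ** 2
    return np.sum(I**2) / np.sum(I) ** 2

n = 64
obj = np.zeros((n, n))
obj[n // 2, n // 2] = 1.0                                  # point scatterer
aberrated = apply_pupil_phase(obj, defocus_phase(n, 6.0))  # simulated defocus
# 'sensorless' CAO: scan candidate corrections, keep the sharpest image
coeffs = np.linspace(-10, 10, 41)
best = max(coeffs,
           key=lambda c: sharpness(apply_pupil_phase(aberrated, defocus_phase(n, c))))
# the search recovers the conjugate of the applied aberration (coeff = -6)
```

    In hardware AO the same exp(i * phase) factor would be imposed by a deformable mirror before detection; with the complex field in hand it becomes a multiplication applied after the fact.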

  9. Productivity and Job Security: Retraining to Adapt to Technological Change.

    ERIC Educational Resources Information Center

    National Center for Productivity and Quality of Working Life, Washington, DC.

    This report, the first of a series on productivity and job security, presents five case studies to illustrate retraining to achieve worker's adjustment to technology. The first of seven chapters addresses the following issues: the availability of job training/retraining data, the desirability of informing workers in advance of technological…

  10. Adapting Corporate Portal Technology for Management E-Learning.

    ERIC Educational Resources Information Center

    Hamlin, Michael D.; Griffy-Brown, Charla

    As the largest provider of MBA talent in the Western United States, the Graziadio School of Business and Management (Malibu, California) had a need to create a technology infrastructure to support students and staff distributed in seven educational centers throughout California. This paper describes the process by which the School came to…

  11. Building climate adaptation capabilities through technology and community

    NASA Astrophysics Data System (ADS)

    Murray, D.; McWhirter, J.; Intsiful, J. D.; Cozzini, S.

    2011-12-01

    To effectively plan for adaptation to changes in climate, decision makers require infrastructure and tools that will provide them with timely access to current and future climate information. For example, climate scientists and operational forecasters need to access global and regional model projections and current climate information that they can use to prepare monitoring products and reports and then publish these for the decision makers. Through the UNDP Africa Adaptation Programme, an infrastructure is being built across Africa that will provide multi-tiered access to such information. Web-accessible servers running RAMADDA, an open source content management system for geoscience information, will provide access to the information at many levels: from raw and processed climate model output to real-time climate conditions and predictions to documents and presentations for government officials. Output from regional climate models (e.g. RegCM4) and downscaled global climate models will be accessible through RAMADDA. The Integrated Data Viewer (IDV) is being used by scientists to create visualizations that assist the understanding of climate processes and projections, using the data on these as well as external servers. Since RAMADDA is more than a data server, it is also being used as a publishing platform for the generated material, which will be available and searchable by the decision makers. Users can wade through the enormous volumes of information and extract subsets for their region or project of interest. Participants from 20 countries attended workshops at ICTP during 2011. They received training on setting up and installing the servers and necessary software and are now working on deploying the systems in their respective countries. This is the first time an integrated and comprehensive approach to climate change adaptation has been widely applied in Africa. It is expected that this infrastructure will enhance North-South collaboration and improve the

  12. Computational structures technology and UVA Center for CST

    NASA Technical Reports Server (NTRS)

    Noor, Ahmed K.

    1992-01-01

    Rapid advances in computer hardware have had a profound effect on various engineering and mechanics disciplines, including the materials, structures, and dynamics disciplines. A new technology, computational structures technology (CST), has recently emerged as an insightful blend between material modeling, structural and dynamic analysis and synthesis on the one hand, and other disciplines such as computer science, numerical analysis, and approximation theory, on the other hand. CST is an outgrowth of finite element methods developed over the last three decades. The focus of this presentation is on some aspects of CST which can impact future airframes and propulsion systems, as well as on the newly established University of Virginia (UVA) Center for CST. The background and goals for CST are described along with the motivations for developing CST, and a brief discussion is made on computational material modeling. We look at the future in terms of technical needs, computing environment, and research directions. The newly established UVA Center for CST is described. One of the research projects of the Center is described, and a brief summary of the presentation is given.

  13. Computing, information, and communications: Technologies for the 21st Century

    SciTech Connect

    1998-11-01

    To meet the challenges of a radically new and technologically demanding century, the Federal Computing, Information, and Communications (CIC) programs are investing in long-term research and development (R and D) to advance computing, information, and communications in the United States. CIC R and D programs help Federal departments and agencies to fulfill their evolving missions, assure the long-term national security, better understand and manage the physical environment, improve health care, help improve the teaching of children, provide tools for lifelong training and distance learning to the workforce, and sustain critical US economic competitiveness. One of the nine committees of the National Science and Technology Council (NSTC), the Committee on Computing, Information, and Communications (CCIC)--through its CIC R and D Subcommittee--coordinates R and D programs conducted by twelve Federal departments and agencies in cooperation with US academia and industry. These R and D programs are organized into five Program Component Areas: (1) HECC--High End Computing and Computation; (2) LSN--Large Scale Networking, including the Next Generation Internet Initiative; (3) HCS--High Confidence Systems; (4) HuCS--Human Centered Systems; and (5) ETHR--Education, Training, and Human Resources. A brief synopsis of FY 1997 accomplishments and FY 1998 goals by PCA is presented. This report, which supplements the President's Fiscal Year 1998 Budget, describes the interagency CIC programs.

  14. Creating presentation graphics with MS-DOS computer technology.

    PubMed

    Van Hoozer, H; Warner, S; Felton, G

    1989-01-01

    This article describes how The University of Iowa College of Nursing Instructional Design Services uses MS-DOS computer technology to create presentation graphics to support nursing education, research, scholarly productivity, and service. Hardware and software are described and examples are presented to illustrate the use of software to create alphanumeric, schematic, and freeform pictures. The authors stress that the use of computer-aided design and production does not eliminate the use of traditional principles of visual design, but rather necessitates their application. PMID:2752333

  15. Report of the Panel on Computer and Information Technology

    NASA Technical Reports Server (NTRS)

    Lundstrom, Stephen F.; Larsen, Ronald L.

    1984-01-01

    Aircraft have become more and more dependent on computers (information processing) for improved performance and safety. It is clear that this activity will grow, since information processing technology has advanced by a factor of 10 every 5 years for the past 35 years and will continue to do so. Breakthroughs in device technology, from vacuum tubes through transistors to integrated circuits, contribute to this rapid pace. This progress is nearly matched by similar, though not as dramatic, advances in numerical software and algorithms. Progress has not been easy. Many technical and nontechnical challenges were surmounted. The outlook is for continued growth in capability but will require surmounting new challenges. The technology forecast presented in this report has been developed by extrapolating current trends and assessing the possibilities of several high-risk research topics. In the process, critical problem areas that require research and development emphasis have been identified. The outlook assumes a positive perspective; the projected capabilities are possible by the year 2000, and adequate resources will be made available to achieve them. Computer and information technology forecasts and the potential impacts of this technology on aeronautics are identified. Critical issues and technical challenges underlying the achievement of forecasted performance and benefits are addressed.

  16. Application of Adaptive Decision Aiding Systems to Computer-Assisted Instruction. Final Report, January-December 1974.

    ERIC Educational Resources Information Center

    May, Donald M.; And Others

    The minicomputer-based Computerized Diagnostic and Decision Training (CDDT) system described combines the principles of artificial intelligence, decision theory, and adaptive computer assisted instruction for training in electronic troubleshooting. The system incorporates an adaptive computer program which learns the student's diagnostic and…

  17. Adaptive Traffic Route Control in QoS Provisioning for Cognitive Radio Technology with Heterogeneous Wireless Systems

    NASA Astrophysics Data System (ADS)

    Yamamoto, Toshiaki; Ueda, Tetsuro; Obana, Sadao

    As one of the dynamic spectrum access technologies, “cognitive radio technology,” which aims to improve spectrum efficiency, has been studied. In cognitive radio networks, each node recognizes radio conditions and, according to them, optimizes its wireless communication routes. Cognitive radio systems integrate heterogeneous wireless systems not only by switching over them but also by aggregating and utilizing them simultaneously. The adaptive control of switchover use and concurrent use of various wireless systems will offer stable and flexible wireless communication. In this paper, we propose an adaptive traffic route control scheme that provides high quality of service (QoS) for cognitive radio technology, and examine the performance of the proposed scheme through field trials and computer simulations. The results of field trials show that adaptive route control according to radio conditions improves user IP throughput by more than 20% and reduces one-way delay to less than 1/6 with the concurrent use of IEEE802.16 and IEEE802.11 wireless media. Moreover, the simulation results assuming hundreds of mobile terminals reveal that the number of users receiving the required QoS of voice over IP (VoIP) service and the total network throughput of FTP users both more than double with the proposed algorithm. The proposed adaptive traffic route control scheme can enhance the performance of cognitive radio technologies by providing appropriate communication routes for various applications to satisfy their required QoS.

  18. Cloud computing and patient engagement: leveraging available technology.

    PubMed

    Noblin, Alice; Cortelyou-Ward, Kendall; Servan, Rosa M

    2014-01-01

    Cloud computing technology has the potential to transform medical practices and improve patient engagement and quality of care. However, issues such as privacy and security and "fit" can make incorporation of the cloud an intimidating decision for many physicians. This article summarizes the four most common types of clouds and discusses their ideal uses, how they engage patients, and how they improve the quality of care offered. This technology also can be used to meet Meaningful Use requirements 1 and 2; and, if speculation is correct, the cloud will provide the necessary support needed for Meaningful Use 3 as well. PMID:25807597

  19. Using archaeogenomic and computational approaches to unravel the history of local adaptation in crops

    PubMed Central

    Allaby, Robin G.; Gutaker, Rafal; Clarke, Andrew C.; Pearson, Neil; Ware, Roselyn; Palmer, Sarah A.; Kitchen, James L.; Smith, Oliver

    2015-01-01

    Our understanding of the evolution of domestication has changed radically in the past 10 years, from a relatively simplistic rapid origin scenario to a protracted complex process in which plants adapted to the human environment. The adaptation of plants continued as the human environment changed with the expansion of agriculture from its centres of origin. Using archaeogenomics and computational models, we can observe genome evolution directly and understand how plants adapted to the human environment and the regional conditions to which agriculture expanded. We have applied various archaeogenomics approaches as exemplars to study local adaptation of barley to drought resistance at Qasr Ibrim, Egypt. We show the utility of DNA capture, ancient RNA, methylation patterns and DNA from charred remains of archaeobotanical samples from low latitudes where preservation conditions restrict ancient DNA research to within a Holocene timescale. The genomic level of analyses that is now possible, and the complexity of the evolutionary process of local adaptation means that plant studies are set to move to the genome level, and account for the interaction of genes under selection in systems-level approaches. This way we can understand how plants adapted during the expansion of agriculture across many latitudes with rapidity. PMID:25487329

  20. Using archaeogenomic and computational approaches to unravel the history of local adaptation in crops.

    PubMed

    Allaby, Robin G; Gutaker, Rafal; Clarke, Andrew C; Pearson, Neil; Ware, Roselyn; Palmer, Sarah A; Kitchen, James L; Smith, Oliver

    2015-01-19

    Our understanding of the evolution of domestication has changed radically in the past 10 years, from a relatively simplistic rapid origin scenario to a protracted complex process in which plants adapted to the human environment. The adaptation of plants continued as the human environment changed with the expansion of agriculture from its centres of origin. Using archaeogenomics and computational models, we can observe genome evolution directly and understand how plants adapted to the human environment and the regional conditions to which agriculture expanded. We have applied various archaeogenomics approaches as exemplars to study local adaptation of barley to drought resistance at Qasr Ibrim, Egypt. We show the utility of DNA capture, ancient RNA, methylation patterns and DNA from charred remains of archaeobotanical samples from low latitudes where preservation conditions restrict ancient DNA research to within a Holocene timescale. The genomic level of analyses that is now possible, and the complexity of the evolutionary process of local adaptation means that plant studies are set to move to the genome level, and account for the interaction of genes under selection in systems-level approaches. This way we can understand how plants adapted during the expansion of agriculture across many latitudes with rapidity. PMID:25487329

  1. Reinforcement learning for adaptive threshold control of restorative brain-computer interfaces: a Bayesian simulation.

    PubMed

    Bauer, Robert; Gharabaghi, Alireza

    2015-01-01

    Restorative brain-computer interfaces (BCI) are increasingly used to provide feedback of neuronal states in a bid to normalize pathological brain activity and achieve behavioral gains. However, patients and healthy subjects alike often show a large variability, or even inability, of brain self-regulation for BCI control, known as BCI illiteracy. Although current co-adaptive algorithms are powerful for assistive BCIs, their inherent class switching clashes with the operant conditioning goal of restorative BCIs. Moreover, due to the treatment rationale, the classifier of restorative BCIs usually has a constrained feature space, thus limiting the possibility of classifier adaptation. In this context, we applied a Bayesian model of neurofeedback and reinforcement learning for different threshold selection strategies to study the impact of threshold adaptation of a linear classifier on optimizing restorative BCIs. For each feedback iteration, we first determined the thresholds that result in minimal action entropy and maximal instructional efficiency. We then used the resulting vector for the simulation of continuous threshold adaptation. We could thus show that threshold adaptation can improve reinforcement learning, particularly in cases of BCI illiteracy. Finally, on the basis of information-theory, we provided an explanation for the achieved benefits of adaptive threshold setting. PMID:25729347
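
    The flavor of continuous threshold adaptation can be conveyed with a tiny simulation (a simple stochastic-approximation sketch, not the paper's Bayesian model; the feature distribution, target rate, and learning rate are invented): after each trial, the classifier threshold is nudged so the feedback (reward) rate drifts toward a level thought to sustain operant conditioning.

```python
import numpy as np

def adapt_threshold(samples, target_rate=0.7, lr=0.05, theta=0.0):
    """After each trial, nudge the classifier threshold theta so the
    long-run feedback (reward) rate approaches target_rate."""
    rewards = []
    for x in samples:
        reward = float(x > theta)             # feedback given when the feature exceeds theta
        theta += lr * (reward - target_rate)  # too much reward -> raise the bar, and vice versa
        rewards.append(reward)
    return theta, float(np.mean(rewards[-200:]))  # final threshold, late reward rate

rng = np.random.default_rng(1)
smr_feature = rng.normal(0.5, 1.0, size=2000)  # simulated per-trial SMR feature values
theta, rate = adapt_threshold(smr_feature)
# rate settles near target_rate even though theta started far from optimal
```

    A fixed threshold would instead leave a low-performing ("illiterate") user with a reward rate too low for reinforcement learning to take hold, which is the failure mode the adaptive scheme addresses.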

  2. Reinforcement learning for adaptive threshold control of restorative brain-computer interfaces: a Bayesian simulation

    PubMed Central

    Bauer, Robert; Gharabaghi, Alireza

    2015-01-01

    Restorative brain-computer interfaces (BCI) are increasingly used to provide feedback of neuronal states in a bid to normalize pathological brain activity and achieve behavioral gains. However, patients and healthy subjects alike often show a large variability, or even inability, of brain self-regulation for BCI control, known as BCI illiteracy. Although current co-adaptive algorithms are powerful for assistive BCIs, their inherent class switching clashes with the operant conditioning goal of restorative BCIs. Moreover, due to the treatment rationale, the classifier of restorative BCIs usually has a constrained feature space, thus limiting the possibility of classifier adaptation. In this context, we applied a Bayesian model of neurofeedback and reinforcement learning for different threshold selection strategies to study the impact of threshold adaptation of a linear classifier on optimizing restorative BCIs. For each feedback iteration, we first determined the thresholds that result in minimal action entropy and maximal instructional efficiency. We then used the resulting vector for the simulation of continuous threshold adaptation. We could thus show that threshold adaptation can improve reinforcement learning, particularly in cases of BCI illiteracy. Finally, on the basis of information-theory, we provided an explanation for the achieved benefits of adaptive threshold setting. PMID:25729347

  3. Adaptation of existing infrared technologies to unanticipated applications

    NASA Astrophysics Data System (ADS)

    Peng, Philip

    2005-01-01

    Radiation thermometry is just one of many applications, both potential and realized, of infrared technology. During the global SARS (Severe Acute Respiratory Syndrome) crisis in 2003, the technology was used as a preliminary screening method to identify infected persons and defend against a major outbreak, since the primary symptom of the disease is elevated body temperature. ATC promptly developed a product designed specifically for mass-volume crowd screening of febrile individuals. For this application, the machine must register subjects' temperatures rapidly and efficiently, with a specified degree of accuracy, and function for extended periods of time. The equipment must be safe to use, easily deployed, and require minimal maintenance. The ATIR-303 model satisfies all of these and other prerequisite conditions admirably. Studies were performed on the correlation between the maximum temperature registered among an individual's facial features, as measured under the conditions of use, and the individual's core temperature. The results demonstrated that the ATIR-303 is well suited to this application. Other applications of infrared technology, such as medical diagnosis, non-destructive testing, security, and search and rescue, are also areas of interest to ATC, and the progress ATC has achieved in these areas is presented as well.

  4. Large-Scale Assessment of a Fully Automatic Co-Adaptive Motor Imagery-Based Brain Computer Interface

    PubMed Central

    Acqualagna, Laura; Botrel, Loic; Vidaurre, Carmen; Kübler, Andrea; Blankertz, Benjamin

    2016-01-01

    In recent years Brain Computer Interface (BCI) technology has benefited from the development of sophisticated machine learning methods that let the user operate the BCI after a few trials of calibration. One remarkable example is the recent development of co-adaptive techniques, which have proved to extend the use of BCIs to people unable to achieve successful control with the standard BCI procedure. These improvements are especially essential for BCIs based on modulation of the Sensorimotor Rhythm (SMR), since a non-negligible percentage of users is unable to operate SMR-BCIs efficiently. In this study we evaluated for the first time a fully automatic co-adaptive BCI system on a large scale. A pool of 168 participants naive to BCIs operated the co-adaptive SMR-BCI in one single session. Different psychological interventions were performed prior to the BCI session in order to investigate how motor coordination training and relaxation could influence BCI performance. A neurophysiological indicator based on the Power Spectral Density (PSD) was extracted from a recording of a few minutes of resting-state brain activity and tested as a predictor of BCI performance. Results show that high accuracies in operating the BCI could be reached by the majority of the participants before the end of the session. BCI performance could be significantly predicted by the neurophysiological indicator, consolidating the validity of the previously developed model. Nevertheless, we still found about 22% of users with performance significantly lower than the threshold of efficient BCI control at the end of the session. Since inter-subject variability is still the major problem of BCI technology, we pointed out crucial issues for those who did not achieve sufficient control. Finally, we propose developments to move a step forward toward the applicability of the promising co-adaptive methods. PMID:26891350

  5. Diversity in computing technologies and strategies for dynamic resource allocation

    SciTech Connect

    Garzoglio, G.; Gutsche, O.

    2015-01-01

    Here, High Energy Physics (HEP) is a very data intensive and trivially parallelizable science discipline. HEP is probing nature at increasingly finer details requiring ever increasing computational resources to process and analyze experimental data. In this paper, we discuss how HEP provisioned resources so far using Grid technologies, how HEP is starting to include new resource providers like commercial Clouds and HPC installations, and how HEP is transparently provisioning resources at these diverse providers.

  6. Vector Field Visual Data Analysis Technologies for Petascale Computational Science

    SciTech Connect

    Garth, Christoph; Deines, Eduard; Joy, Kenneth I.; Bethel, E. Wes; Childs, Hank; Weber, Gunther; Ahern, Sean; Pugmire, Dave; Sanderson, Allen; Johnson, Chris

    2009-11-13

    State-of-the-art computational science simulations generate large-scale vector field data sets. Visualization and analysis is a key aspect of obtaining insight into these data sets and represents an important challenge. This article discusses possibilities and challenges of modern vector field visualization and focuses on methods and techniques developed in the SciDAC Visualization and Analytics Center for Enabling Technologies (VACET) and deployed in the open-source visualization tool, VisIt.

  7. Diversity in Computing Technologies and Strategies for Dynamic Resource Allocation

    NASA Astrophysics Data System (ADS)

    Garzoglio, G.; Gutsche, O.

    2015-12-01

    High Energy Physics (HEP) is a very data intensive and trivially parallelizable science discipline. HEP is probing nature at increasingly finer details requiring ever increasing computational resources to process and analyze experimental data. In this paper, we discuss how HEP provisioned resources so far using Grid technologies, how HEP is starting to include new resource providers like commercial Clouds and HPC installations, and how HEP is transparently provisioning resources at these diverse providers.

  8. Diversity in Computing Technologies and Strategies for Dynamic Resource Allocation

    DOE PAGESBeta

    Garzoglio, G.; Gutsche, O.

    2015-12-23

    High Energy Physics (HEP) is a very data intensive and trivially parallelizable science discipline. HEP is probing nature at increasingly finer details requiring ever increasing computational resources to process and analyze experimental data. In this paper, we discuss how HEP provisioned resources so far using Grid technologies, how HEP is starting to include new resource providers like commercial Clouds and HPC installations, and how HEP is transparently provisioning resources at these diverse providers.

  9. Electrical hand tools and techniques: A compilation. [utilization of space technology for tools and adapters

    NASA Technical Reports Server (NTRS)

    1974-01-01

    Space technology utilization for developing tools, adapters, and fixtures and procedures for assembling, installing, and servicing electrical components and equipment are discussed. Some of the items considered are: (1) pivotal screwdriver, (2) termination locator tool for shielded cables, (3) solder application tools, (4) insulation and shield removing tool, and (5) torque wrench adapter for cable connector engaging ring. Diagrams of the various tools and devices are provided.

  10. Adaptive Nulling: A New Enabling Technology for Interferometric Exoplanet

    NASA Technical Reports Server (NTRS)

    Lay, Oliver P.; Jeganathan, Muthu; Peters, Robert

    2003-01-01

    Deep, stable nulling of starlight requires careful control of the amplitudes and phases of the beams being combined. The detection of Earth-like planets using the interferometer architectures currently being considered for the Terrestrial Planet Finder mission requires that the E-field amplitudes be balanced at the level of approx. 0.1%, and the phases controlled at the level of 1 mrad (corresponding to approx. 1.5 nm for a wavelength of 10 microns). These conditions must be met simultaneously at all wavelengths across the science band, and for both polarization states, imposing unrealistic tolerances on the symmetry between the optical beamtrains. We introduce the concept of a compensator, inserted into the beamtrain, that can adaptively correct for the mismatches across the spectrum, enabling deep nulls with realistic, imperfect optics. The design presented uses a deformable mirror to adjust the amplitude and phase of each beam as an arbitrary function of wavelength and polarization. A proof-of-concept experiment will be conducted at visible/near-IR wavelengths, followed by a system operating in the mid-IR band.
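    The quoted tolerances can be related to null depth via the commonly used first-order approximation N ≈ (δa² + δφ²)/4 for fractional amplitude mismatch δa and phase error δφ (in radians). This sketch, with hypothetical function names, checks the numbers in the abstract:

    ```python
    import math

    def null_depth(amp_mismatch, phase_err_rad):
        """First-order null depth for small fractional amplitude mismatch
        and phase error (radians): N ~ (da^2 + dphi^2) / 4."""
        return (amp_mismatch ** 2 + phase_err_rad ** 2) / 4.0

    def phase_to_opd(phase_rad, wavelength):
        """Optical path difference corresponding to a phase error."""
        return phase_rad * wavelength / (2 * math.pi)

    # Tolerances quoted in the abstract: 0.1% amplitude, 1 mrad phase
    N = null_depth(1e-3, 1e-3)
    opd = phase_to_opd(1e-3, 10e-6)
    ```

    With δa = 0.1% and δφ = 1 mrad this gives N ≈ 5×10⁻⁷, and 1 mrad at λ = 10 µm corresponds to an optical path difference of roughly 1.6 nm, consistent with the ~1.5 nm figure quoted above.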

  11. The technological influence on health professionals' care: translation and adaptation of scales

    PubMed Central

    Almeida, Carlos Manuel Torres; Almeida, Filipe Nuno Alves dos Santos; Escola, Joaquim José Jacinto; Rodrigues, Vitor Manuel Costa Pereira

    2016-01-01

    Objectives: in this study, two research tools were validated to study the impact of technological influence on health professionals' care practice. Methods: the following methodological steps were taken: bibliographic review, selection of the scales, translation and cultural adaptation and analysis of psychometric properties. Results: the psychometric properties of the scale were assessed based on its application to a sample of 341 individuals (nurses, physicians, final-year nursing and medical students). The validity, reliability and internal consistency were tested. Two scales were found: Caring Attributes Questionnaire (adapted) with a Cronbach's Alpha coefficient of 0.647 and the Technological Influence Questionnaire (adapted) with an Alpha coefficient of 0.777. Conclusions: the scales are easy to apply and reveal reliable psychometric properties, an additional quality as they permit generalized studies on a theme as important as the impact of technological influence in health care. PMID:27143537
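    Cronbach's alpha, the internal-consistency coefficient reported above, can be computed from a respondents-by-items score matrix. A generic sketch (not the authors' analysis code):

    ```python
    import numpy as np

    def cronbach_alpha(items):
        """Cronbach's alpha for an (n_respondents, n_items) score matrix:
        alpha = k/(k-1) * (1 - sum(item variances) / variance of total scores)."""
        items = np.asarray(items, dtype=float)
        k = items.shape[1]
        item_vars = items.var(axis=0, ddof=1).sum()
        total_var = items.sum(axis=1).var(ddof=1)
        return k / (k - 1) * (1 - item_vars / total_var)
    ```

    Perfectly correlated items yield alpha = 1.0; values around 0.65-0.78, as reported for the two adapted scales, indicate moderate-to-acceptable internal consistency.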

  12. 25 CFR 502.7 - Electronic, computer or other technologic aid.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 25 Indians 2 2013-04-01 2013-04-01 false Electronic, computer or other technologic aid. 502.7... DEFINITIONS OF THIS CHAPTER § 502.7 Electronic, computer or other technologic aid. (a) Electronic, computer or... applicable Federal communications law. (b) Electronic, computer or other technologic aids include, but...

  13. 25 CFR 502.7 - Electronic, computer or other technologic aid.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 25 Indians 2 2012-04-01 2012-04-01 false Electronic, computer or other technologic aid. 502.7... DEFINITIONS OF THIS CHAPTER § 502.7 Electronic, computer or other technologic aid. (a) Electronic, computer or... applicable Federal communications law. (b) Electronic, computer or other technologic aids include, but...

  14. 25 CFR 502.7 - Electronic, computer or other technologic aid.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 25 Indians 2 2014-04-01 2014-04-01 false Electronic, computer or other technologic aid. 502.7... DEFINITIONS OF THIS CHAPTER § 502.7 Electronic, computer or other technologic aid. (a) Electronic, computer or... applicable Federal communications law. (b) Electronic, computer or other technologic aids include, but...

  15. 25 CFR 502.7 - Electronic, computer or other technologic aid.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 25 Indians 2 2010-04-01 2010-04-01 false Electronic, computer or other technologic aid. 502.7... DEFINITIONS OF THIS CHAPTER § 502.7 Electronic, computer or other technologic aid. (a) Electronic, computer or... applicable Federal communications law. (b) Electronic, computer or other technologic aids include, but...

  16. 25 CFR 502.7 - Electronic, computer or other technologic aid.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 25 Indians 2 2011-04-01 2011-04-01 false Electronic, computer or other technologic aid. 502.7... DEFINITIONS OF THIS CHAPTER § 502.7 Electronic, computer or other technologic aid. (a) Electronic, computer or... applicable Federal communications law. (b) Electronic, computer or other technologic aids include, but...

  17. Vortex-dominated conical-flow computations using unstructured adaptively-refined meshes

    NASA Technical Reports Server (NTRS)

    Batina, John T.

    1989-01-01

    A conical Euler/Navier-Stokes algorithm is presented for the computation of vortex-dominated flows. The flow solver involves a multistage Runge-Kutta time stepping scheme which uses a finite-volume spatial discretization on an unstructured grid made up of triangles. The algorithm also employs an adaptive mesh refinement procedure which enriches the mesh locally to more accurately resolve the vortical flow features. Results are presented for several highly-swept delta wing and circular cone cases at high angles of attack and at supersonic freestream flow conditions. Accurate solutions were obtained more efficiently when adaptive mesh refinement was used in contrast with refining the grid globally. The paper presents descriptions of the conical Euler/Navier-Stokes flow solver and adaptive mesh refinement procedures along with results which demonstrate the capability.
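    The local-versus-global refinement idea above can be illustrated with a deliberately simplified 1D sketch: only cells whose error indicator exceeds a tolerance are subdivided, so resolution is added where the flow features are, not everywhere. This is not the paper's unstructured-triangle algorithm, just the general mechanism:

    ```python
    def adapt(cells, indicator, tol):
        """One pass of local refinement: split each cell whose error
        indicator exceeds tol into two halves (cells are (a, b) intervals)."""
        out = []
        for (a, b) in cells:
            if indicator(a, b) > tol:
                m = 0.5 * (a + b)
                out.extend([(a, m), (m, b)])   # refine locally
            else:
                out.append((a, b))             # keep coarse cell
        return out

    # Example: a width-based indicator splits only the wide cell
    refined = adapt([(0.0, 1.0), (1.0, 3.0)], lambda a, b: b - a, 1.5)
    ```

    Iterating this pass with a feature-sensitive indicator (e.g., based on vorticity) concentrates cells near vortical structures, which is the efficiency gain the abstract reports over global refinement.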

  18. Reviews of computing technology: Fiber distributed data interface

    SciTech Connect

    Johnson, A.J.

    1991-12-01

    Fiber Distributed Data Interface, more commonly known as FDDI, is the name of the standard that describes a new local area network (LAN) technology for the '90s. This technology is based on fiber-optic communications and, at a data transmission rate of 100 million bits per second (Mbps), provides a full order of magnitude improvement over previous LAN standards such as Ethernet and Token Ring. FDDI has been accepted as a standard by all major computer manufacturers and is a national standard as defined by the American National Standards Institute (ANSI). FDDI will become part of the US Government Open Systems Interconnection Profile (GOSIP) under GOSIP Version 3 and will become an international standard promoted by the International Standards Organization (ISO). It is important to note that there are no competing standards for high-performance LANs, so FDDI acceptance is nearly universal. This technology report describes FDDI as a technology, looks at its applications, examines the current economics of using it, and describes activities and plans by the Information Resource Management (IRM) department to implement this technology at the Savannah River Site.

  20. Computer technology futures for the improvement of assessment

    NASA Astrophysics Data System (ADS)

    Baker, Eva L.; O'Neil, Harold F.

    1995-03-01

    With a focus on the interaction between computer technology and assessment, we first review the typical functions served by technology in the support of various assessment purposes. These include efficiencies in person and item sampling and in administration, analysis, and reporting. Our major interest is the extent to which technology can provide unique opportunities to understand performance. Two examples are described: a tool-based knowledge representation approach to assess content understanding and a team problem-solving task involving negotiation. The first example, using HyperCard as well as paper-and-pencil variations, has been tested in science and history fields. Its continuing challenge is to determine a strategy for creating and validating scoring criteria. The second example, involving a workforce readiness task for secondary school, has used expert-novice comparisons to infer performance standards. These examples serve as the context for the exploration of validity, equity, and utility.

  1. Global assessment of technological innovation for climate change adaptation and mitigation in developing world.

    PubMed

    Adenle, Ademola A; Azadi, Hossein; Arbiol, Joseph

    2015-09-15

    Concerns about mitigating and adapting to climate change have renewed the incentive for agricultural research investments and the development of further innovation priorities around the world, particularly in developing countries. In the near future, the development of new agricultural measures and the proper diffusion of technologies will greatly influence farmers' ability to adapt to and mitigate climate change. Using bibliometric approaches based on academic journal publications and patent data, we assess the impact of research and development (R&D) for new and existing technologies within the context of climate change mitigation and adaptation. We show that many developing countries invest limited resources in R&D for relevant technologies that have great potential for mitigation and adaptation in agricultural production. We also discuss constraints, including weak infrastructure, limited research capacity, and lack of credit facilities and technology transfer, that may hinder the application of innovation in tackling the challenges of climate change. A range of policy measures is also suggested to overcome the identified constraints and to ensure that the potential of innovation for climate change mitigation and adaptation is realized. PMID:26189184

  2. Communication: Spin-free quantum computational simulations and symmetry adapted states.

    PubMed

    Whitfield, James Daniel

    2013-07-14

    The ideas of digital simulation of quantum systems using a quantum computer parallel the original ideas of numerical simulation using a classical computer. In order for quantum computational simulations to advance to a competitive point, many techniques from classical simulations must be imported into the quantum domain. In this article, we consider the applications of symmetry in the context of quantum simulation. Building upon well established machinery, we propose a form of first quantized simulation that only requires the spatial part of the wave function, thereby allowing spin-free quantum computational simulations. We go further and discuss the preparation of N-body states with specified symmetries based on projection techniques. We consider two simple examples, molecular hydrogen and cyclopropenyl cation, to illustrate the ideas. The methods here are the first to explicitly deal with preparing N-body symmetry-adapted states and open the door for future investigations into group theory, chemistry, and quantum simulation. PMID:23862919

  3. Industrialization of housing construction adapting building technology to Kuwait environment

    SciTech Connect

    Ezz Al Din, M.A.

    1984-01-01

    A major study of the industrialization of housing construction was conducted by the Kuwait University Department of Civil Engineering, in conjunction with the Kuwait Institute for Scientific Research, to compare and contrast limited- and average-income group housing. Data from this study permit a preliminary assessment of the impact of building technology change, among other factors, on construction costs. A case study, with identification of the designers' and users' points of view on the issues addressed in the paper, is followed by an evaluation and a concept for public and private space. The findings of the study are then presented and considered, and their meaning and significance for Kuwait, as well as for other developing countries, are assessed. Concluding remarks and recommendations complete the paper.

  4. Adaptation.

    PubMed

    Broom, Donald M

    2006-01-01

    The term adaptation is used in biology in three different ways. It may refer to changes which occur at the cell and organ level, at the individual level, or at the level of gene action and evolutionary processes. Adaptation by cells, especially nerve cells, helps in communication within the body, the distinguishing of stimuli, the avoidance of overload and the conservation of energy. The time course and complexity of these mechanisms vary. Adaptive characters of organisms, including adaptive behaviours, increase fitness, so this adaptation is evolutionary. The major part of this paper concerns adaptation by individuals and its relationship to welfare. In complex animals, feed-forward control is widely used. Individuals predict problems and adapt by acting before the environmental effect is substantial. Much of adaptation involves brain control, and animals have a set of needs, located in the brain and acting largely via motivational mechanisms, to regulate life. Needs may be for resources but are also for actions and stimuli which are part of the mechanism that has evolved to obtain the resources. Hence pigs do not just need food but need to be able to carry out actions like rooting in earth or manipulating materials which are part of foraging behaviour. The welfare of an individual is its state as regards its attempts to cope with its environment. This state includes various adaptive mechanisms, including feelings and those which cope with disease. The part of welfare which is concerned with coping with pathology is health. Disease, which implies some significant effect of pathology, always results in poor welfare. Welfare varies over a range from very good, when adaptation is effective and there are feelings of pleasure or contentment, to very poor. A key point concerning the concept of individual adaptation in relation to welfare is that welfare may be good or poor while adaptation is occurring. Some adaptation is very easy and energetically cheap and

  5. Energy-saving technology of vector controlled induction motor based on the adaptive neuro-controller

    NASA Astrophysics Data System (ADS)

    Engel, E.; Kovalev, I. V.; Karandeev, D.

    2015-10-01

    The ongoing evolution of the power system towards a Smart Grid implies an important role of intelligent technologies, but poses strict requirements on their control schemes to preserve stability and controllability. This paper presents the adaptive neuro-controller for the vector control of induction motor within Smart Gird. The validity and effectiveness of the proposed energy-saving technology of vector controlled induction motor based on adaptive neuro-controller are verified by simulation results at different operating conditions over a wide speed range of induction motor.

  6. Reliability of MEMS deformable mirror technology used in adaptive optics imaging systems

    NASA Astrophysics Data System (ADS)

    Hartzell, Allyson L.; Cornelissen, Steven A.; Bierden, Paul A.; Lam, Charlie V.; Davis, Daniel F.

    2010-02-01

    Deformable mirror (DM) technology based on microelectromechanical systems (MEMS), produced by Boston Micromachines Corporation, has been demonstrated to be an enabling component in a variety of adaptive optics applications, such as high-contrast imaging in astronomy, multi-object adaptive optics, free-space laser communication, and microscopy. Many of these applications require DMs with thousands of actuators operating at frame rates up to 10 kHz for many years, which demands sufficient device reliability to avoid failures. In this paper we present improvements in MEMS deformable mirror reliability, along with test data and device lifetime predictions showing that trillions of actuator cycles can be achieved without failure.

  7. Campus Computing, 1998. The Ninth National Survey of Desktop Computing and Information Technology in American Higher Education.

    ERIC Educational Resources Information Center

    Green, Kenneth C.

    This report presents findings of a June 1998 survey of computing officials at 1,623 two- and four-year U.S. colleges and universities concerning the use of computer technology. The survey found that computing and information technology (IT) are now core components of the campus environment and classroom experience. However, key aspects of IT…

  8. Applying Computer Adaptive Testing to Optimize Online Assessment of Suicidal Behavior: A Simulation Study

    PubMed Central

    de Vries, Anton LM; de Groot, Marieke H; de Keijser, Jos; Kerkhof, Ad JFM

    2014-01-01

    Background The Internet is used increasingly for both suicide research and prevention. To optimize online assessment of suicidal patients, there is a need for short, good-quality tools to assess elevated risk of future suicidal behavior. Computer adaptive testing (CAT) can be used to reduce response burden and improve accuracy, and to make the available pencil-and-paper tools more appropriate for online administration. Objective The aim was to test whether an item response–based computer adaptive simulation can be used to reduce the length of the Beck Scale for Suicide Ideation (BSS). Methods The data used for our simulation were obtained from a large multicenter trial from The Netherlands: the Professionals in Training to STOP suicide (PITSTOP suicide) study. We applied a principal components analysis (PCA), confirmatory factor analysis (CFA), and a graded response model (GRM), and simulated a CAT. Results The scores of 505 patients were analyzed. Psychometric analyses showed the questionnaire to be unidimensional with good internal consistency. The computer adaptive simulation showed that, on average, 4 items (instead of the full 19) were sufficient to estimate the elevation of risk of future suicidal behavior. Conclusions This study demonstrated that CAT can be applied successfully to reduce the length of the Dutch version of the BSS. We argue that the use of CAT can improve accuracy and reduce response burden when assessing the risk of future suicidal behavior online. Because CAT can be daunting for clinicians and applied scientists, we offer a concrete example of our computer adaptive simulation of the Dutch version of the BSS at the end of the paper. PMID:25213259
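    The core CAT loop picks, at each step, the unadministered item most informative at the current ability estimate, which is how a 19-item scale can shrink to ~4 items per respondent. A minimal sketch using a 2PL model (the study itself used a graded response model; names and parameters here are illustrative):

    ```python
    import math

    def p_correct(theta, a, b):
        """2PL item response probability: discrimination a, difficulty b."""
        return 1.0 / (1.0 + math.exp(-a * (theta - b)))

    def item_information(theta, a, b):
        """Fisher information of a 2PL item at ability theta."""
        p = p_correct(theta, a, b)
        return a * a * p * (1 - p)

    def next_item(theta, items, asked):
        """Index of the unasked (a, b) item with maximal information at theta."""
        return max((i for i in range(len(items)) if i not in asked),
                   key=lambda i: item_information(theta, *items[i]))
    ```

    After each response the ability estimate is updated and the loop repeats until a stopping rule (e.g., standard error below a target) is met.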

  9. Quality Assurance Challenges for Motion-Adaptive Radiation Therapy: Gating, Breath Holding, and Four-Dimensional Computed Tomography

    SciTech Connect

    Jiang, Steve B. Wolfgang, John; Mageras, Gig S.

    2008-05-01

    Compared with conventional three-dimensional (3D) conformal radiation therapy and intensity-modulated radiation therapy treatments, quality assurance (QA) for motion-adaptive radiation therapy involves various challenges because of the added temporal dimension. Here we discuss those challenges for three specific techniques related to motion-adaptive therapy: namely respiratory gating, breath holding, and four-dimensional computed tomography. Similar to the introduction of any other new technologies in clinical practice, typical QA measures should be taken for these techniques also, including initial testing of equipment and clinical procedures, as well as frequent QA examinations during the early stage of implementation. Here, rather than covering every QA aspect in depth, we focus on some major QA challenges. The biggest QA challenge for gating and breath holding is how to ensure treatment accuracy when internal target position is predicted using external surrogates. Recommended QA measures for each component of treatment, including simulation, planning, patient positioning, and treatment delivery and verification, are discussed. For four-dimensional computed tomography, some major QA challenges have also been discussed.

  10. An Adaptive Paradigm for Computer-Aided Detection of Colonic Polyps

    PubMed Central

    Wang, Huafeng; Liang, Zhengrong; Li, Lihong C.; Han, Hao; Song, Bowen; Pickhardt, Perry J.; Barish, Matthew A.; Lascarides, Chris E.

    2015-01-01

    Most previous efforts in developing computer-aided detection (CADe) of colonic polyps apply similar measures or parameters to detect polyps regardless of their locations, under the implicit assumption that all polyps reside in a similar local environment, e.g., on a relatively flat colon wall. In reality, this implicit assumption is frequently invalid, because the haustral folds can present a very different local environment from that of the relatively flat colon wall. We conjecture that this assumption may be a major cause of missed detections, especially of small polyps (<10 mm linear size) located on the haustral folds. In this paper, we take up the concept of adaptiveness and present an adaptive paradigm for CADe of colonic polyps. First, we decompose the complicated colon structure into two simplified sub-structures, each with similar properties: (1) the relatively flat colon wall and (2) the ridge-shaped haustral folds. We then develop local environment descriptions that adaptively reflect each of these two simplified sub-structures. To show the impact of the adaptiveness of the local environment descriptions on the polyp detection task, we focus on local geometrical measures of the volume data for both the detection of initial polyp candidates (IPCs) and the reduction of false positives (FPs) in the IPC pool. The experimental outcome using the local geometrical measures is very impressive: not only are the previously missed small polyps on the folds detected, but the small polyps on the folds that were previously discarded during FP reduction are also retained. It is expected that this adaptive paradigm will have a great impact on detecting small polyps, measuring their volumes and volume changes over time, and optimizing their management plan. PMID:26348125

  11. Evaluation of the adaptation of zirconia-based fixed partial dentures using micro-CT technology.

    PubMed

    Borba, Márcia; Miranda, Walter Gomes; Cesar, Paulo Francisco; Griggs, Jason Allan; Bona, Alvaro Della

    2013-01-01

    The objective of the study was to measure the marginal and internal fit of zirconia-based all-ceramic three-unit fixed partial dentures (FPDs) (Y-TZP - LAVA, 3M-ESPE), using a novel methodology based on micro-computed tomography (micro-CT) technology. Stainless steel models of prepared abutments were fabricated to design FPDs. Ten frameworks were produced with 9 mm2 connector cross-sections using a LAVA CAD/CAM system. All FPDs were veneered with a compatible porcelain. Each FPD was seated on the original model and scanned using micro-CT. Files were processed using NRecon and CTAn software. Adobe Photoshop and ImageJ software were used to analyze the cross-sectional images. Five measuring points were selected, as follows: MG - marginal gap; CA - chamfer area; AW - axial wall; AOT - axio-occlusal transition area; OA - occlusal area. Results were statistically analyzed by the Kruskal-Wallis test and Tukey's post hoc test (α = 0.05). There were significant differences in gap width between the measurement points evaluated. MG showed the smallest median gap width (42 µm). OA had the largest median gap dimension (125 µm), followed by the AOT point (105 µm). CA and AW gap width values were statistically similar, 66 and 65 µm respectively. It was thus possible to conclude that different levels of adaptation were observed within the FPD at the different measuring points. In addition, micro-CT technology seems to be a reliable tool for evaluating the fit of dental restorations. PMID:24036977
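    The Kruskal-Wallis test used above compares ranked gap widths across the measuring points. A self-contained sketch of the H statistic (without tie correction; illustrative, not the authors' analysis code):

    ```python
    def kruskal_h(groups):
        """Kruskal-Wallis H statistic (no tie correction) for k groups of
        measurements: rank all values jointly, then
        H = 12/(N(N+1)) * sum(R_i^2 / n_i) - 3(N+1)."""
        data = sorted((v, gi) for gi, g in enumerate(groups) for v in g)
        n = len(data)
        rank_sums = {}
        for rank, (_v, gi) in enumerate(data, start=1):
            rank_sums[gi] = rank_sums.get(gi, 0) + rank
        return (12.0 / (n * (n + 1))
                * sum(rank_sums[gi] ** 2 / len(g)
                      for gi, g in enumerate(groups))
                - 3 * (n + 1))
    ```

    In practice H would be compared against a chi-squared distribution with k−1 degrees of freedom at α = 0.05; library routines (e.g., SciPy's `kruskal`) also apply a tie correction.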

  12. Computer program for distance learning of pesticide application technology.

    PubMed

    Maia, Bruno; Cunha, Joao P A R

    2011-12-01

    Distance learning presents great potential for mitigating field problems in pesticide application technology. Given the lack of teaching material on pesticide spraying technology in the Portuguese language and the increasing availability of distance learning, this study developed and evaluated a computer program for distance learning of the theory of pesticide spraying technology using information technology tools. The modules comprising the course, named Pulverizar, were: (1) Basic concepts, (2) Factors that affect application, (3) Equipment, (4) Spray nozzles, (5) Sprayer calibration, (6) Aerial application, (7) Chemigation, (8) Physical-chemical properties, (9) Formulations, (10) Adjuvants, (11) Water quality, and (12) Adequate use of pesticides. The program was made available to the public on July 1st, 2008, hosted at the web site www.pulverizar.iciag.ufu.br, and proved simple, robust, and practical as a complement to traditional teaching in the education of professionals in the agricultural sciences. The program Pulverizar, which was well received in its initial evaluation, can facilitate the mastery of pesticide spraying technology by people involved in agricultural production. PMID:22159349

  13. An a-posteriori finite element error estimator for adaptive grid computation of viscous incompressible flows

    NASA Astrophysics Data System (ADS)

    Wu, Heng

    2000-10-01

    In this thesis, an a-posteriori error estimator is presented and employed for solving viscous incompressible flow problems. In an effort to detect local flow features, such as vortices and separation, and to resolve flow details precisely, a velocity angle error estimator e_θ, based on the spatial derivative of the velocity direction field, is designed and constructed. The a-posteriori error estimator corresponds to the antisymmetric part of the deformation-rate tensor, and it is sensitive to the second derivative of the velocity angle field. A rationality discussion reveals that the velocity angle error estimator is a curvature error estimator whose value reflects the accuracy of the streamline curves. It is also found that the velocity angle error estimator contains the nonlinear convective term of the Navier-Stokes equations, and that it identifies and computes the direction difference when the convective acceleration direction and the flow velocity direction disagree. By benchmarking computed variables against the analytic solution of Kovasznay flow or the finest-grid solution of cavity flow, it is demonstrated that the velocity angle error estimator performs better than the strain error estimator. The benchmarking also shows that the computed profile obtained using e_θ achieves the best match with the true θ field and is asymptotic to the true θ variation field, with the promise of fewer unknowns. Unstructured grids are adapted by employing local cell division as well as unrefinement of transition cells. Element and node classes efficiently construct a hierarchical data structure that provides cell and node cross-references at each adaptive level. Element and node pointers dynamically maintain the connectivity of adjacent elements and adjacent nodes, avoiding time-consuming search processes. The adaptive scheme is applied to viscous incompressible flow at different
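
    A toy one-dimensional analogue can convey the idea behind a velocity-angle indicator: compute θ = atan2(v, u) at sample points along a line and flag intervals where the flow direction changes sharply, taking care to unwrap the 2π jump. This is an illustrative sketch under assumed names, not the thesis's element-wise finite element formulation.

```python
import math

def angle_jump(a, b):
    """Signed smallest difference between two angles in radians."""
    d = b - a
    while d > math.pi:
        d -= 2 * math.pi
    while d < -math.pi:
        d += 2 * math.pi
    return d

def velocity_angle_indicator(u, v):
    """Per-interval refinement indicator on a 1D line of velocity samples:
    the absolute change of the direction theta = atan2(v, u) between
    neighbours (a discrete |d theta/ds| up to the spacing).  Large values
    flag curved streamlines, e.g. near vortices or separation."""
    theta = [math.atan2(vi, ui) for ui, vi in zip(u, v)]
    return [abs(angle_jump(a, b)) for a, b in zip(theta, theta[1:])]

# A uniform stream produces zero indicator; a turning flow does not:
print(velocity_angle_indicator([1.0, 1.0, 1.0], [0.0, 0.0, 0.0]))
print(velocity_angle_indicator([1.0, 1.0, 0.5, 0.0],
                               [0.0, 0.1, 0.5, 1.0]))
```

    Cells whose indicator exceeds a tolerance would be marked for the local division described above, while cells far below it become candidates for unrefinement.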

  14. Foreign and Domestic Accomplishments in Magnetic Bubble Device Technology. Computer Science & Technology.

    ERIC Educational Resources Information Center

    Warnar, Robert B. J.; Calomeris, Peter J.

    This document assesses the status of magnetic bubble technology as displayed by non-U.S. research and manufacturing facilities. Non-U.S. research and U.S. accomplishments are described while both technical and economic factors are addressed. Magnetic bubble devices are discussed whenever their application could impact future computer system…

  15. Computer-Based Learning: The Key 'Technological Multiplier' for Technology Transfer.

    ERIC Educational Resources Information Center

    Reynolds, Angus

    1982-01-01

    The use of computer-based learning (CBL) is discussed. The author examines the appropriate use of the technology; its cost; identifying the best potential applications of CBL; and the use of CBL by major airlines, oil companies, universities, manufacturers, and government. (CT)

  16. Computational fluid dynamics for propulsion technology: Geometric grid visualization in CFD-based propulsion technology research

    NASA Technical Reports Server (NTRS)

    Ziebarth, John P.; Meyer, Doug

    1992-01-01

    The coordination of necessary resources, facilities, and specialized personnel to provide technical integration activities in the area of computational fluid dynamics applied to propulsion technology is examined, including the coordination of CFD activities between government, industry, and universities. Current geometry modeling, grid generation, and graphical methods are established for use in the analysis of CFD design methodologies.

  17. Use of Soft Computing Technologies For Rocket Engine Control

    NASA Technical Reports Server (NTRS)

    Trevino, Luis C.; Olcmen, Semih; Polites, Michael

    2003-01-01

    The problem addressed in this paper is to explore how Soft Computing Technologies (SCT) could be employed to further improve overall engine system reliability and performance. Specifically, this is presented by enhancing rocket engine control and engine health management (EHM) using SCT coupled with conventional control technologies and the sound software engineering practices used in Marshall's Flight Software Group. The principal goals are to improve software management, software development time and maintenance, processor execution, fault tolerance and mitigation, and nonlinear control in power-level transitions. The intent is not to discuss any shortcomings of existing engine control and EHM methodologies, but to provide alternative design choices for control, EHM, implementation, performance, and sustaining engineering. The approaches outlined in this paper require knowledge in the fields of rocket engine propulsion, software engineering for embedded systems, and soft computing technologies (i.e., neural networks, fuzzy logic, and Bayesian belief networks), much of which is presented in this paper. The first targeted demonstration rocket engine platform is the MC-1 (formerly the FASTRAC engine), which is simulated with hardware and software in the Marshall Avionics & Software Testbed laboratory that

  18. Technology--The Equalizer.

    ERIC Educational Resources Information Center

    Sloane, Eydie

    1989-01-01

    This article describes a number of computer-based learning tools for disabled students. Adaptive input devices, assistive technologies, software, and hardware and software resources are discussed. (IAH)

  19. Application Specific Performance Technology for Productive Parallel Computing

    SciTech Connect

    Malony, Allen D.; Shende, Sameer

    2008-09-30

    Our accomplishments over the last three years of the DOE project Application-Specific Performance Technology for Productive Parallel Computing (DOE Agreement: DE-FG02-05ER25680) are described below. The project will have met all of its objectives by the time of its completion at the end of September, 2008. Two extensive yearly progress reports were produced in March 2006 and March 2007 and were previously submitted to the DOE Office of Advanced Scientific Computing Research (OASCR). Following an overview of the objectives of the project, we summarize for each of the project areas the achievements in the first two years, and then describe in more detail the project accomplishments this past year. At the end, we discuss the relationship of the proposed renewal application to the work done on the current project.

  20. Technologies for Large Data Management in Scientific Computing

    NASA Astrophysics Data System (ADS)

    Pace, Alberto

    2014-01-01

    In recent years, intense usage of computing has been the main strategy of investigation in several scientific research projects. Progress in computing technology has opened unprecedented opportunities for systematic collection of experimental data and the associated analysis that were considered impossible only a few years ago. This paper focuses on the strategies in use: it reviews the various components necessary for an effective solution that ensures the storage, long-term preservation, and worldwide distribution of the large quantities of data required in a large scientific research project. The paper also mentions several examples of data management solutions used in High Energy Physics for the CERN Large Hadron Collider (LHC) experiments in Geneva, Switzerland, which generate more than 30,000 terabytes of data every year that must be preserved, analyzed, and made available to a community of several tens of thousands of scientists worldwide.

  1. SAR data exploitation: computational technology enabling SAR ATR algorithm development

    NASA Astrophysics Data System (ADS)

    Majumder, Uttam K.; Casteel, Curtis H., Jr.; Buxa, Peter; Minardi, Michael J.; Zelnio, Edmund G.; Nehrbass, John W.

    2007-04-01

    A fundamental issue with synthetic aperture radar (SAR) application development is data processing and exploitation in real time or near real time. The power of high-performance computing (HPC) clusters, FPGAs, and the IBM Cell processor presents new algorithm development possibilities that have not been fully leveraged. In this paper, we illustrate capabilities for SAR data exploitation that were impractical over the last decade due to computing limitations. We envision that SAR imagery encompassing city-size coverage at extremely high levels of fidelity could be processed in near real time using the above technologies to empower the warfighter with access to critical information for the war on terror, homeland defense, and urban warfare.

  2. Using Computer Technology to Foster Learning for Understanding

    PubMed Central

    VAN MELLE, ELAINE; TOMALTY, LEWIS

    2000-01-01

    The literature shows that students typically use either a surface approach to learning, in which the emphasis is on memorization of facts, or a deep approach to learning, in which learning for understanding is the primary focus. This paper describes how computer technology, specifically the use of a multimedia CD-ROM, was integrated into a microbiology curriculum as part of the transition from focusing on facts to fostering learning for understanding. Evaluation of the changes in approaches to learning over the course of the term showed a statistically significant shift toward a deep approach to learning, as measured by the Study Process Questionnaire. Additional data collected showed that the use of computer technology supported this shift by providing students with the opportunity to apply what they had learned in class to order tests and interpret the test results in relation to specific patient-focused case studies. The extent of the impact, however, varied among different groups of students in the class. For example, students who were recent high school graduates did not show a statistically significant increase in deep learning scores over the course of the term and did not perform as well in the course. The results also showed that a surface approach to learning was an important aspect of learning for understanding, although only those students who were able to combine a surface with a deep approach to learning were able to learn for understanding. Implications of this finding for the future use of computer technology and learning for understanding are considered. PMID:23653533

  3. Three Authentic Curriculum-Integration Approaches to Bird Adaptations That Incorporate Technology and Thinking Skills

    ERIC Educational Resources Information Center

    Rule, Audrey C.; Barrera, Manuel T., III

    2008-01-01

    Integration of subject areas with technology and thinking skills is a way to help teachers cope with today's overloaded curriculum and to help students see the connectedness of different curriculum areas. This study compares three authentic approaches to teaching a science unit on bird adaptations for habitat that integrate thinking skills and…

  4. How Should the Higher Education Workforce Adapt to Advancements in Technology for Teaching and Learning?

    ERIC Educational Resources Information Center

    Kukulska-Hulme, Agnes

    2012-01-01

    In a time of change, higher education is in the position of having to adapt to external conditions created by widespread adoption of popular technologies such as social media, social networking services and mobile devices. For faculty members, there must be opportunities for concrete experiences capable of generating a personal conviction that a…

  5. Adaptive Technology for the Internet: Making Electronic Resources Accessible to All.

    ERIC Educational Resources Information Center

    Mates, Barbara T.

    This book seeks to guide information providers in establishing accessible World Wide Web sites and acquiring the hardware and software needed by people with disabilities, focusing on access to the Internet using large print, voice, and Braille. The book also covers how to acquire the funds for adaptive technology, what type of equipment to choose,…

  6. Adaptation technology between IP layer and optical layer in optical Internet

    NASA Astrophysics Data System (ADS)

    Ji, Yuefeng; Li, Hua; Sun, Yongmei

    2001-10-01

    Wavelength division multiplexing (WDM) optical networking provides a platform with high bandwidth capacity and is expected to be the backbone infrastructure supporting next-generation high-speed multi-service networks (ATM, IP, etc.). Since IP will be the predominant data traffic for the foreseeable future, much attention has been focused on making full use of the bandwidth of the WDM optical network through IP over WDM, which has been proposed as the most promising technology for a new kind of network, the so-called Optical Internet. According to the OSI model, IP sits in the 3rd layer (network layer) and the optical network in the 1st layer (physical layer), so the key issue is what adaptation technology should be used in the 2nd layer (data link layer). In this paper, we first analyze and compare the adaptation technologies currently used in backbone networks. Second, addressing the drawbacks of those technologies, we present a novel adaptation protocol (DONA) between the IP layer and the optical layer in the Optical Internet and describe it in detail. Third, the gigabit transmission adapter (GTA) we developed based on the novel protocol is described. Finally, we set up an experimental platform to apply and verify DONA and the GTA; the results and conclusions of the experiment are given.

  7. Turkish Adaptation of Technological Pedagogical Content Knowledge Survey for Elementary Teachers

    ERIC Educational Resources Information Center

    Kaya, Sibel; Dag, Funda

    2013-01-01

    The purpose of this study was to adapt the Technological Pedagogical Content Knowledge (TPACK) Survey developed by Schmidt and colleagues into Turkish and investigate its factor structure through exploratory and confirmatory factor analysis. The participants were 352 elementary pre-service teachers from three large universities in northwestern…

  8. Flexibly Adaptive Professional Development in Support of Teaching Science with Geospatial Technology

    ERIC Educational Resources Information Center

    Trautmann, Nancy M.; MaKinster, James G.

    2010-01-01

    The "flexibly adaptive" model of professional development, developed in the GIT Ahead project, enables secondary science teachers to incorporate a variety of geospatial technology applications into wide-ranging classroom contexts. Teacher impacts were evaluated quantitatively and qualitatively. Post-questionnaire responses showed significant…

  9. Resituation or Resistance? Higher Education Teachers' Adaptations to Technological Change

    ERIC Educational Resources Information Center

    Westberry, Nicola; McNaughton, Susan; Billot, Jennie; Gaeta, Helen

    2015-01-01

    This paper presents the findings from a project that explored teachers' adaptations to technological change in four large classes in higher education. In these classes, lecturers changed from single- to multi-lecture settings mediated by videoconferencing, requiring them to transfer their beliefs and practices into a new pedagogical space.…

  10. Overcoming Barriers in the Use of Adaptive and Assistive Technology in Special Education.

    ERIC Educational Resources Information Center

    Bushrow, Kathy M.; Turner, Keith D.

    This paper examines change and change facilitators as they affect full use of adaptive and assistive technology (AAT) in special education, and compares qualitative versus quantitative methods of researching the change process. Four administrators and two teachers from a rural school district completed the Stages of Concern Questionnaire, which…

  11. Emerging computer technologies and the news media of the future

    NASA Technical Reports Server (NTRS)

    Vrabel, Debra A.

    1993-01-01

    The media environment of the future may be dramatically different from what exists today. As new computing and communications technologies evolve and synthesize to form a global, integrated communications system of networks, public domain hardware and software, and consumer products, it will be possible for citizens to fulfill most information needs at any time and from any place, to obtain desired information easily and quickly, to obtain information in a variety of forms, and to experience and interact with information in a variety of ways. This system will transform almost every institution, every profession, and every aspect of human life--including the creation, packaging, and distribution of news and information by media organizations. This paper presents one vision of a 21st-century global information system and how it might be used by citizens. It surveys some of the technologies now on the market that are paving the way for the new media environment.

  12. Reconfigurable high-speed optoelectronic interconnect technology for multiprocessor computers

    NASA Astrophysics Data System (ADS)

    Cheng, Julian

    1995-06-01

    We describe a compact optoelectronic switching technology for interconnecting multiple computer processors and shared memory modules through dynamically reconfigurable optical paths, providing simultaneous, high-speed communication among different nodes. Each switch provides an optical link to other nodes as well as electrical access to an individual processor, and it can perform optical and optoelectronic switching to convert digital data between various electrical and optical input/output formats. This multifunctional switching technology is based on the monolithic integration of arrays of vertical-cavity surface-emitting lasers with photodetectors and heterojunction bipolar transistors. The various digital switching and routing functions, as well as optically cascaded multistage operation, have been experimentally demonstrated.

  13. Adaptive methods and parallel computation for partial differential equations. Final report

    SciTech Connect

    Biswas, R.; Benantar, M.; Flaherty, J.E.

    1992-05-01

    Consider the adaptive solution of two-dimensional vector systems of hyperbolic and elliptic partial differential equations on shared-memory parallel computers. Hyperbolic systems are approximated by an explicit finite volume technique and solved by a recursive local mesh refinement procedure on a tree-structured grid. Local refinement of the time steps and spatial cells of a coarse base mesh is performed in regions where a refinement indicator exceeds a prescribed tolerance. Computational procedures that sequentially traverse the tree while processing solutions on each grid in parallel, that process solutions at the same tree level in parallel, and that dynamically assign processors to nodes of the tree have been developed and applied to an example. Computational results comparing a variety of heuristic processor load balancing techniques and refinement strategies are presented.
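
    The local refinement idea described above can be sketched as a quadtree-style subdivision pass driven by a refinement indicator: any cell whose indicator exceeds the tolerance is split, up to a maximum level. This is a generic illustration with hypothetical names, not the report's tree code; transition-cell unrefinement, parallel traversal, and processor load balancing are omitted.

```python
import math

def refine(cells, indicator, tol, max_level):
    """One adaptive pass over a flat list of square cells (x, y, size,
    level): split every cell whose indicator exceeds tol into four
    equal children (2D quadtree-style), up to max_level."""
    out = []
    for (x, y, s, lvl) in cells:
        if indicator(x, y, s) > tol and lvl < max_level:
            h = s / 2.0
            out += [(x, y, h, lvl + 1), (x + h, y, h, lvl + 1),
                    (x, y + h, h, lvl + 1), (x + h, y + h, h, lvl + 1)]
        else:
            out.append((x, y, s, lvl))
    return out

# Refine a unit square toward a "feature" at the origin: the indicator
# grows with cell size and with proximity of the cell center to (0, 0).
def corner_indicator(x, y, s):
    return s / math.hypot(x + s / 2.0, y + s / 2.0)

cells = [(0.0, 0.0, 1.0, 0)]
for _ in range(3):
    cells = refine(cells, corner_indicator, tol=1.0, max_level=3)
print(len(cells), min(c[2] for c in cells))  # -> 10 0.125
```

    Only the cells nearest the feature are subdivided to the finest level, mirroring the localized refinement of time steps and spatial cells described in the abstract.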

  14. Adaptive information interchange system of the fiber-optic measuring networks with the computer

    NASA Astrophysics Data System (ADS)

    Denisov, Igor V.; Drozdov, Roman S.; Sedov, Victor A.

    2005-06-01

    In the present paper, the characteristics and application possibilities of a system for parallel input-output of information from a fiber-optic measuring network into a computer are considered. The system consists of two parts: one mainframe and several expansion blocks. The first part is internal and connects directly to a socket on the motherboard of the personal computer. It is designed to buffer system signals, develop the commands that control input-output of signals into the personal computer, and generate signals for the expansion blocks. The second part is external and connects to the mainframe by means of cables. It is designed to transform information from the fiber-optic measuring network into mainframe signals and to adapt instrument settings. An analysis of the system's speed in processing analog and digital data is presented. Possible schemes for using the system to process quasistationary and dynamic fields are considered.

  15. Computation Directorate and Science& Technology Review Computational Science and Research Featured in 2002

    SciTech Connect

    Alchorn, A L

    2003-04-04

    Thank you for your interest in the activities of the Lawrence Livermore National Laboratory Computation Directorate. This collection of articles from the Laboratory's Science & Technology Review highlights the most significant computational projects, achievements, and contributions during 2002. In 2002, LLNL marked the 50th anniversary of its founding. Scientific advancement in support of our national security mission has always been the core of the Laboratory. So that researchers could better understand and predict complex physical phenomena, the Laboratory has pushed the limits of the largest, fastest, most powerful computers in the world. In the late 1950s, Edward Teller--one of the LLNL founders--proposed that the Laboratory commission a Livermore Advanced Research Computer (LARC) built to Livermore's specifications. He tells the story of being in Washington, DC, when John Von Neumann asked to talk about the LARC. Von Neumann thought Teller wanted too much memory in the machine (the specifications called for 20-30,000 words), and Teller was too smart to argue with him. Later Teller invited Von Neumann to the Laboratory and showed him one of the design codes being prepared for the LARC. He asked Von Neumann for suggestions on fitting the code into 10,000 words of memory, and flattered him about ''Labbies'' not being smart enough to figure it out. Von Neumann dropped his objections, and the LARC arrived with 30,000 words of memory. Memory, and how close memory is to the processor, is still of interest to us today. Livermore's first supercomputer was the Remington-Rand Univac-1. It had 5600 vacuum tubes and was 2 meters wide by 4 meters long. This machine was commonly referred to as a 1-KFlop machine [E+3]. Skip ahead 50 years: the ASCI White machine at the Laboratory today, produced by IBM, is rated at a peak performance of 12.3 TFlops, or E+13. We've improved computer processing power by 10 orders of magnitude in 50 years, and I do not believe there's any reason to think we won

  16. Adaptive allocation of decisionmaking responsibility between human and computer in multitask situations

    NASA Technical Reports Server (NTRS)

    Chu, Y.-Y.; Rouse, W. B.

    1979-01-01

    As human and computer come to have overlapping decisionmaking abilities, a dynamic or adaptive allocation of responsibilities may be the best mode of human-computer interaction. It is suggested that the computer serve as a backup decisionmaker, accepting responsibility when human workload becomes excessive and relinquishing responsibility when workload becomes acceptable. A queueing theory formulation of multitask decisionmaking is used and a threshold policy for turning the computer on/off is proposed. This policy minimizes event-waiting cost subject to human workload constraints. An experiment was conducted with a balanced design of several subject runs within a computer-aided multitask flight management situation with different task demand levels. It was found that computer aiding enhanced subsystem performance as well as subjective ratings. The queueing model appears to be an adequate representation of the multitask decisionmaking situation, and to be capable of predicting system performance in terms of average waiting time and server occupancy. Server occupancy was further found to correlate highly with the subjective effort ratings.
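
    The threshold policy described above can be sketched as a simple discrete-time simulation: the computer backup switches on when the task queue reaches an upper threshold and off again once it drains to a lower one. The function, parameter names, and rates below are hypothetical illustrations, not the paper's queueing-theory formulation.

```python
def run_threshold_policy(arrivals, human_rate=1, computer_rate=1,
                         turn_on=4, turn_off=1):
    """Toy discrete-time sketch of a threshold on/off policy: the
    computer accepts decisionmaking responsibility when the queue
    reaches `turn_on` tasks and relinquishes it once the queue falls
    to `turn_off`.  Returns (queue length, computer on?) per step."""
    queue, computer_on = 0, False
    history = []
    for new_tasks in arrivals:
        queue += new_tasks
        if not computer_on and queue >= turn_on:
            computer_on = True   # human workload excessive: aid kicks in
        served = human_rate + (computer_rate if computer_on else 0)
        queue = max(0, queue - served)
        if computer_on and queue <= turn_off:
            computer_on = False  # workload acceptable again
        history.append((queue, computer_on))
    return history

# A burst of arrivals triggers the computer aid, which then hands
# responsibility back to the human as the queue drains:
for step in run_threshold_policy([2, 2, 2, 0, 0, 0]):
    print(step)
```

    A real threshold would be chosen, as in the abstract, to minimize event-waiting cost subject to human workload constraints; here the thresholds are fixed by hand for illustration.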

  17. Primer on computers and information technology. Part two: an introduction to computer networking.

    PubMed

    Channin, D S; Chang, P J

    1997-01-01

    Computer networks are a way of connecting computers together such that they can exchange information. For this exchange to be successful, system behavior must be planned and specified very clearly at a number of different levels. Although there are many choices to be made at each level, often there are simple decisions that can be made to rapidly reduce the number of options. Planning is most important at the highest (application) and lowest (wiring) levels, whereas the middle levels must be specified to ensure compatibility. Because of the widespread use of the Internet, solutions based on Internet technologies are often cost-effective and should be considered when designing a network. As in all technical fields, consultation with experts (ie, computer networking specialists) may be worthwhile. PMID:9225395

  18. Advances in computer-aided design and computer-aided manufacture technology.

    PubMed

    Calamia, J R

    1996-01-01

    Although the development of computer-aided design (CAD) and computer-aided manufacture (CAM) technology and the benefits of increased productivity became obvious in the automobile and aerospace industries in the 1970s, investigations of this technology's application in the field of dentistry did not begin until the 1980s. Only now are we beginning to see the fruits of this work with the commercial availability of some systems; the potential for this technology seems boundless. This article reviews the recent literature with emphasis on the period from June 1992 to May 1993. This review should familiarize the reader with some of the latest developments in this technology, including a brief description of some systems currently available and the clinical and economical rationale for their acceptance into the dental mainstream. This article concentrates on a particular system, the Cerec (Siemens/Pelton and Crane, Charlotte, NC) system, for three reasons: First, this system has been available since 1985 and, as a result, has a track record of almost 7 years of data. Most of the data have just recently been released and consequently, much of this year's literature on CAD-CAM is monopolized by studies using this system. Second, this system was developed as a mobile, affordable, direct chairside CAD-CAM restorative method. As such, it is of special interest to the patient, providing a one-visit restoration. Third, the author is currently engaged in research using this particular system and has a working knowledge of this system's capabilities. PMID:9462062

  19. Condition monitoring through advanced sensor and computational technology : final report (January 2002 to May 2005).

    SciTech Connect

    Kim, Jung-Taek; Luk, Vincent K.

    2005-05-01

    The overall goal of this joint research project was to develop and demonstrate advanced sensors and computational technology for continuous monitoring of the condition of components, structures, and systems in advanced and next-generation nuclear power plants (NPPs). This project included investigating and adapting several advanced sensor technologies from Korean and US national laboratory research communities, some of which were developed and applied in non-nuclear industries. The project team investigated and developed sophisticated signal processing, noise reduction, and pattern recognition techniques and algorithms. The researchers installed sensors and conducted condition monitoring tests on two test loops, a check valve (an active component) and a piping elbow (a passive component), to demonstrate the feasibility of using advanced sensors and computational technology to achieve the project goal. Acoustic emission (AE) devices, optical fiber sensors, accelerometers, and ultrasonic transducers (UTs) were used to detect mechanical vibratory response of check valve and piping elbow in normal and degraded configurations. Chemical sensors were also installed to monitor the water chemistry in the piping elbow test loop. Analysis results of processed sensor data indicate that it is feasible to differentiate between the normal and degraded (with selected degradation mechanisms) configurations of these two components from the acquired sensor signals, but it is questionable that these methods can reliably identify the level and type of degradation. Additional research and development efforts are needed to refine the differentiation techniques and to reduce the level of uncertainties.

  20. The ESO adaptive optics real-time computer platform: a step toward the future

    NASA Astrophysics Data System (ADS)

    Fedrigo, Enrico; Donaldson, Robert; Soenke, Christian; Hubin, Norbert N.

    2004-10-01

    ESO now operates several AO systems at the Paranal observatory. Most of them are the outcome of different and independent efforts, resulting in different and incompatible systems with all the problems of maintaining and evolving them. At the same time, industry is now offering powerful embedded computers and new standard technologies that enable the construction of massive real-time parallel computers, with a technology roadmap that looks extremely promising. The ESO AO Platform initiative aims to take this unique opportunity, gathering all the experience accumulated so far in building and operating AO systems together with the recent advances offered by industry, to define and build a standard hardware and software platform able to run every near-future AO system of the VLT, with an eye towards OWL. We review the key technologies that enable the design of a common AO-RTC, and we discuss the main choices of the AO Platform initiative.

  1. Accelerating technology development through integrated computation and experimentation

    SciTech Connect

    Shekhawat, Dushyant; Srivastava, Rameshwar

    2013-01-01

    This special section of Energy & Fuels comprises a selection of papers presented at the topical conference “Accelerating Technology Development through Integrated Computation and Experimentation”, sponsored and organized by the United States Department of Energy’s National Energy Technology Laboratory (NETL) as part of the 2012 American Institute of Chemical Engineers (AIChE) Annual Meeting held in Pittsburgh, PA, Oct 28−Nov 2, 2012. That topical conference focused on the latest research and development efforts in five main areas related to fossil energy, with each area focusing on the utilization of both experimental and computational approaches: (1) gas separations (membranes, sorbents, and solvents for CO₂, H₂, and O₂ production), (2) CO₂ utilization (enhanced oil recovery, chemical production, mineralization, etc.), (3) carbon sequestration (flow in natural systems), (4) advanced power cycles (oxy-combustion, chemical looping, gasification, etc.), and (5) fuel processing (H₂ production for fuel cells).

  2. COMET-AR User's Manual: COmputational MEchanics Testbed with Adaptive Refinement

    NASA Technical Reports Server (NTRS)

    Moas, E. (Editor)

    1997-01-01

    The COMET-AR User's Manual provides a reference manual for the Computational Structural Mechanics Testbed with Adaptive Refinement (COMET-AR), a software system developed jointly by Lockheed Palo Alto Research Laboratory and NASA Langley Research Center under contract NAS1-18444. The COMET-AR system is an extended version of an earlier finite element based structural analysis system called COMET, also developed by Lockheed and NASA. The primary extensions are the adaptive mesh refinement capabilities and a new "object-like" database interface that makes COMET-AR easier to extend further. This User's Manual provides a detailed description of the user interface to COMET-AR from the viewpoint of a structural analyst.

  3. Parallel Adaptive Computation of Blood Flow in a 3D "Whole" Body Model

    NASA Astrophysics Data System (ADS)

    Zhou, M.; Figueroa, C. A.; Taylor, C. A.; Sahni, O.; Jansen, K. E.

    2008-11-01

    Accurate numerical simulations of vascular trauma require the consideration of a larger portion of the vasculature than previously considered, due to the systemic nature of the human body's response. A patient-specific 3D model composed of 78 connected arterial branches extending from the neck to the lower legs is constructed to effectively represent the entire body. Recently developed outflow boundary conditions that appropriately represent the downstream vascular beds not included in the 3D computational domain are applied at the 78 outlets. In this work, the pulsatile blood flow simulations are started on a fairly uniform, unstructured mesh that is subsequently adapted using a solution-based approach to efficiently resolve the flow features. The adapted mesh contains non-uniform, anisotropic elements resulting in resolution that conforms with the physical length scales present in the problem. The effects of the mesh resolution on the flow field are studied, specifically on the relevant quantities of pressure, velocity, and wall shear stress.
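    Outflow boundary conditions of this kind are commonly reduced-order Windkessel models, which represent the downstream vascular bed as a resistance-compliance-resistance (RCR) circuit: the distal pressure obeys C dP_d/dt = Q - P_d/R_d, and the outlet pressure is P = R_p Q + P_d. A sketch with forward-Euler time stepping and illustrative (not patient-specific) parameter values:

```python
import numpy as np

def windkessel_pressure(q, dt, Rp, C, Rd):
    """Outlet pressure from a three-element (RCR) Windkessel model,
    the kind of reduced-order downstream model used as an outflow
    boundary condition. Parameters and units here are illustrative."""
    p_d = 0.0                  # distal (capacitor) pressure
    p = np.empty_like(q)
    for i, qi in enumerate(q):
        p_d += dt * (qi - p_d / Rd) / C   # C dP_d/dt = Q - P_d/R_d
        p[i] = Rp * qi + p_d              # proximal resistive drop + distal pressure
    return p

t = np.linspace(0.0, 1.0, 1000)
q = 80.0 + 40.0 * np.sin(2 * np.pi * t)   # hypothetical pulsatile flow waveform
p = windkessel_pressure(q, dt=t[1] - t[0], Rp=100.0, C=1e-4, Rd=1200.0)
print(p.shape)  # (1000,)
```

    In a full 3D simulation the same relation is applied weakly at each of the outlets, coupling the local flow rate to the outlet pressure at every time step.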

  4. Does Computer Technology Improve Student Learning and Achievement? How, When, and under What Conditions?

    ERIC Educational Resources Information Center

    Schacter, John; Fagnano, Cheryl

    1999-01-01

    Discussion of the implementation of computer technology in schools focuses on the need to design computer technologies based on sound learning theories. Considers the results of meta-analyses on computer-based instruction; socio-cultural learning theory; computer-supported collaborative learning; constructivist theories; project-based learning;…

  5. Evaluating the Appropriateness of a New Computer-Administered Measure of Adaptive Function for Children and Youth with Autism Spectrum Disorders

    ERIC Educational Resources Information Center

    Coster, Wendy J.; Kramer, Jessica M.; Tian, Feng; Dooley, Meghan; Liljenquist, Kendra; Kao, Ying-Chia; Ni, Pengsheng

    2016-01-01

    The Pediatric Evaluation of Disability Inventory-Computer Adaptive Test is an alternative method for describing the adaptive function of children and youth with disabilities using a computer-administered assessment. This study evaluated the performance of the Pediatric Evaluation of Disability Inventory-Computer Adaptive Test with a national…

  6. In vivo laser speckle imaging by adaptive contrast computation for microvasculature assessment

    NASA Astrophysics Data System (ADS)

    Basak, Kausik; Dey, Goutam; Mahadevappa, Manjunatha; Mandal, Mahitosh; Dutta, Pranab Kumar

    2014-11-01

    Interference of light backscattered from a diffuse surface leads to speckle formation in laser speckle imaging. These time-integrated speckle patterns can be statistically analyzed to study the flow profile of moving scatterers. Simple speckle contrast analysis techniques have limited ability to distinguish thin structures due to the presence of corrupting speckles. This paper presents a high-resolution imaging technique based on adaptive computation of contrast for laser speckle contrast analysis (adLASCA). Speckle images of the retinal microvasculature in a mouse model are acquired during normal and reduced blood flow conditions. Initially, the speckle images are registered to compensate for movements associated with heartbeat and respiration. Adaptive computation is performed using local image statistics, estimated within a spatially moving window over successive time frames. Experimental evidence suggests that adLASCA outperforms other contrast analysis methods, yielding a significant improvement in contrast resolution. Fine vessels can be distinguished more efficiently, with reduced fluctuations in contrast level. Quantitative performance of adLASCA is evaluated by computing the standard deviation corresponding to fluctuations caused by unwanted speckles; there is a significant reduction in standard deviation compared to other methods. Therefore, adLASCA can be used for enhancing microvasculature in high-resolution perfusion imaging with a reduced effect of corrupting speckles.
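    The spatial part of the contrast computation is simply K = σ/μ over a moving window. Below is a sketch of plain spatial LASCA; adLASCA additionally adapts the statistics across registered time frames, which is not shown. The window size and the synthetic exponential-intensity frame are illustrative only:

```python
import numpy as np
from numpy.lib.stride_tricks import sliding_window_view

def speckle_contrast(image, window=7):
    """Local speckle contrast K = sigma/mu over a spatial moving window."""
    patches = sliding_window_view(image, (window, window))
    mu = patches.mean(axis=(-2, -1))
    sigma = patches.std(axis=(-2, -1))
    return sigma / np.maximum(mu, 1e-12)   # guard against division by zero

rng = np.random.default_rng(1)
frame = rng.exponential(scale=1.0, size=(64, 64))  # fully developed speckle has
K = speckle_contrast(frame)                        # exponential intensity statistics
print(K.shape)  # (58, 58)
```

    Low K indicates blurring of the speckle by moving scatterers (flow); high K indicates static tissue.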

  7. Computational adaptive optics for broadband optical interferometric tomography of biological tissue

    NASA Astrophysics Data System (ADS)

    Boppart, Stephen A.

    2015-03-01

    High-resolution real-time tomography of biological tissues is important for many areas of biological investigations and medical applications. Cellular-level optical tomography, however, has been challenging because of the compromise between transverse imaging resolution and depth-of-field, the system and sample aberrations that may be present, and the low imaging sensitivity deep in scattering tissues. The use of computed optical imaging techniques has the potential to address several of these long-standing limitations and challenges. Two related techniques are interferometric synthetic aperture microscopy (ISAM) and computational adaptive optics (CAO). Through three-dimensional Fourier-domain resampling, in combination with high-speed OCT, ISAM can be used to achieve high-resolution in vivo tomography with enhanced depth sensitivity over a depth-of-field extended by more than an order of magnitude, in real time. Subsequently, aberration correction with CAO can be performed on a tomogram, rather than on the optical beam of a broadband optical interferometry system. Based on principles of Fourier optics, aberration correction with CAO is performed on a virtual pupil using Zernike polynomials, offering the potential to augment or even replace the more complicated and expensive adaptive optics hardware with algorithms implemented on a standard desktop computer. Interferometric tomographic reconstructions are characterized with tissue phantoms containing sub-resolution scattering particles, and in both ex vivo and in vivo biological tissue. This review will collectively establish the foundation for high-speed volumetric cellular-level optical interferometric tomography in living tissues.
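    The virtual-pupil correction described above amounts to multiplying the 2D Fourier spectrum of a complex en-face field by the conjugate of a Zernike aberration phase. A sketch using a defocus-only pupil (Zernike Z_2^0) with an illustrative coefficient; the real method optimizes the Zernike coefficients against an image metric and processes the full broadband tomogram:

```python
import numpy as np

def apply_virtual_pupil_phase(field, defocus_coeff):
    """Computational aberration-correction sketch: multiply the Fourier
    (virtual pupil) spectrum of a complex en-face field by the conjugate
    of a Zernike defocus phase. Defocus-only, coefficient illustrative."""
    ny, nx = field.shape
    FX, FY = np.meshgrid(np.fft.fftfreq(nx), np.fft.fftfreq(ny))
    rho2 = FX**2 + FY**2
    rho2 /= rho2.max()                                  # normalized pupil radius^2
    zernike_defocus = defocus_coeff * (2.0 * rho2 - 1.0)  # Z_2^0, unnormalized
    spectrum = np.fft.fft2(field)
    return np.fft.ifft2(spectrum * np.exp(-1j * zernike_defocus))

rng = np.random.default_rng(3)
field = rng.standard_normal((128, 128)) + 1j * rng.standard_normal((128, 128))
out = apply_virtual_pupil_phase(field, defocus_coeff=2.0)
print(out.shape, out.dtype)  # (128, 128) complex128
```

    Because the correction is a pointwise phase in the pupil plane, trying a new set of aberration coefficients costs only two FFTs per en-face plane, which is what makes hardware-free correction practical.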

  8. Applying Observations from Technological Transformations in Complex Adaptive Systems to Inform Health Policy on Technology Adoption

    PubMed Central

    Phillips, Andrew B.; Merrill, Jacqueline

    2012-01-01

    Many complex markets such as banking and manufacturing have benefited significantly from technology adoption. Each of these complex markets experienced increased efficiency, quality, security, and customer involvement as a result of technology transformation in their industry. Healthcare has not benefited to the same extent. We provide initial findings from a policy analysis of complex markets and the features of these transformations that can influence health technology adoption and acceptance. PMID:24199112

  10. Adapt

    NASA Astrophysics Data System (ADS)

    Bargatze, L. F.

    2015-12-01

    Active Data Archive Product Tracking (ADAPT) is a collection of software routines that permits one to generate XML metadata files to describe and register data products in support of the NASA Heliophysics Virtual Observatory VxO effort. ADAPT is also a philosophy. The ADAPT concept is to use any and all available metadata associated with scientific data to produce XML metadata descriptions in a consistent, uniform, and organized fashion to provide blanket access to the full complement of data stored on a targeted data server. In this poster, we present an application of ADAPT to describe all of the data products that are stored by using the Common Data File (CDF) format served out by the CDAWEB and SPDF data servers hosted at the NASA Goddard Space Flight Center. These data servers are the primary repositories for NASA Heliophysics data. For this purpose, the ADAPT routines have been used to generate data resource descriptions by using an XML schema named Space Physics Archive, Search, and Extract (SPASE). SPASE is the designated standard for documenting Heliophysics data products, as adopted by the Heliophysics Data and Model Consortium. The set of SPASE XML resource descriptions produced by ADAPT includes high-level descriptions of numerical data products, display data products, or catalogs and also includes low-level "Granule" descriptions. A SPASE Granule is effectively a universal access metadata resource; a Granule associates an individual data file (e.g. a CDF file) with a "parent" high-level data resource description, assigns a resource identifier to the file, and lists the corresponding access URL(s). The CDAWEB and SPDF file systems were queried to provide the input required by the ADAPT software to create an initial set of SPASE metadata resource descriptions. Then, the CDAWEB and SPDF data repositories were subsequently queried on a nightly basis and the CDF file lists were checked for any changes such as the occurrence of new, modified, or deleted
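    A Granule description of the kind produced by ADAPT can be sketched as a small XML builder. The element names below are a simplified approximation of the SPASE schema, not the full standard, and the identifiers and URL are made up for illustration:

```python
import xml.etree.ElementTree as ET

def make_granule(resource_id, parent_id, url):
    """Build a minimal Granule record in the spirit of SPASE.

    Simplified element set: a real SPASE Granule carries a version
    attribute, release date, and further source metadata."""
    spase = ET.Element("Spase")
    granule = ET.SubElement(spase, "Granule")
    ET.SubElement(granule, "ResourceID").text = resource_id
    ET.SubElement(granule, "ParentID").text = parent_id   # link to high-level description
    source = ET.SubElement(granule, "Source")
    ET.SubElement(source, "URL").text = url               # access URL for the data file
    return ET.tostring(spase, encoding="unicode")

xml = make_granule(
    "spase://Example/Granule/AC_H0_MFI/20150101",      # hypothetical identifiers
    "spase://Example/NumericalData/AC_H0_MFI",
    "https://cdaweb.gsfc.nasa.gov/data/example.cdf",
)
print(xml)
```

    A nightly crawler then only needs to diff the server's file list against the registered ResourceIDs to emit new, modified, or deleted Granules.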

  11. Performance Evaluation of Emerging High Performance Computing Technologies using WRF

    NASA Astrophysics Data System (ADS)

    Newby, G. B.; Morton, D.

    2008-12-01

    The Arctic Region Supercomputing Center (ARSC) has evaluated multicore processors and other emerging processor technologies for a variety of high performance computing applications in the earth and space sciences, especially climate and weather applications. A flagship effort has been to assess dual-core processor nodes on ARSC's Midnight supercomputer, in which two-socket systems were compared to eight-socket systems. Midnight is utilized for ARSC's twice-daily weather research and forecasting (WRF) model runs, available at weather.arsc.edu. Among other findings on Midnight, it was found that the Hypertransport system for interconnecting Opteron processors, memory, and other subsystems does not scale as well on eight-socket (sixteen-processor) systems as on two-socket (four-processor) systems. A fundamental limitation is the cache snooping operation performed whenever a computational thread accesses main memory. This increases memory latency as the number of processor sockets increases. This is particularly noticeable on applications such as WRF that are primarily CPU-bound, versus applications that are bound by input/output or communication. The new Cray XT5 supercomputer at ARSC features quad-core processors, and will host a variety of scaling experiments for WRF, CCSM4, and other models. Early results will be presented, including a series of WRF runs for Alaska with grid resolutions under 2 km. ARSC will discuss a set of standardized test cases for the Alaska domain, similar to existing test cases for CONUS. These test cases will provide different configuration sizes and resolutions, suitable for single processors up to thousands. Beyond multi-core Opteron-based supercomputers, ARSC has examined WRF and other applications on additional emerging technologies. One such technology is the graphics processing unit, or GPU. The 9800-series nVidia GPU was evaluated with the cuBLAS software library. While in-socket GPUs might be forthcoming in the future, current

  12. Rapid Computation of Thermodynamic Properties over Multidimensional Nonbonded Parameter Spaces Using Adaptive Multistate Reweighting.

    PubMed

    Naden, Levi N; Shirts, Michael R

    2016-04-12

    We show how thermodynamic properties of molecular models can be computed over a large, multidimensional parameter space by combining multistate reweighting analysis with a linear basis function approach. This approach reduces the computational cost to estimate thermodynamic properties from molecular simulations for over 130,000 tested parameter combinations from over 1000 CPU years to tens of CPU days. This speed increase is achieved primarily by computing the potential energy as a linear combination of basis functions, computed from either modified simulation code or as the difference of energy between two reference states, which can be done without any simulation code modification. The thermodynamic properties are then estimated with the Multistate Bennett Acceptance Ratio (MBAR) as a function of multiple model parameters without the need to define a priori how the states are connected by a pathway. Instead, we adaptively sample a set of points in parameter space to create mutual configuration space overlap. The existence of regions of poor configuration space overlap is detected by analyzing the eigenvalues of the sampled states' overlap matrix. The configuration space overlap to sampled states is monitored alongside the mean and maximum uncertainty to determine convergence, as neither the uncertainty nor the configuration space overlap alone is a sufficient metric of convergence. This adaptive sampling scheme is demonstrated by estimating with high precision the solvation free energies of charged particles of Lennard-Jones plus Coulomb functional form with charges between -2 and +2 and generally physical values of σij and ϵij in TIP3P water. We also compute entropy, enthalpy, and radial distribution functions of arbitrary unsampled parameter combinations using only the data from these sampled states and use the estimates of free energies over the entire space to examine the deviation of atomistic simulations from the Born approximation to the solvation free
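    The linear basis-function trick can be sketched in a few lines: each stored configuration keeps its basis energies u_i(x), and the potential energy at any unsampled parameter set is then just a dot product, so no re-simulation is needed. The three-term decomposition and coefficient mapping below are hypothetical stand-ins for the Lennard-Jones-plus-Coulomb terms; the MBAR estimator itself (available in the pymbar package) is not shown:

```python
import numpy as np

# Per-configuration basis energies u_i(x), stored once per snapshot.
# Columns here are hypothetical: LJ repulsive, LJ attractive, Coulomb.
rng = np.random.default_rng(2)
n_frames = 1000
u_basis = rng.random((n_frames, 3))

def energy_at(params, u_basis):
    """U(x; theta) = sum_i h_i(theta) * u_i(x) -- evaluated for every
    stored frame at once, with no additional simulation."""
    c12, c6, q2 = params          # linear coefficients set by (sigma, epsilon, q)
    h = np.array([c12, -c6, q2])  # signs: repulsion adds, attraction subtracts
    return u_basis @ h

# Energies of all stored frames at an *unsampled* parameter combination:
u_new = energy_at((1.2, 0.8, 4.0), u_basis)
print(u_new.shape)  # (1000,)
```

    Evaluating 130,000 parameter combinations this way costs 130,000 matrix-vector products over stored basis energies rather than 130,000 simulations, which is the source of the quoted speedup.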

  13. Computer-Adaptive Testing for Students with Disabilities: A Review of the Literature. Research Report. ETS RR-11-32

    ERIC Educational Resources Information Center

    Stone, Elizabeth; Davey, Tim

    2011-01-01

    There has been an increased interest in developing computer-adaptive testing (CAT) and multistage assessments for K-12 accountability assessments. The move to adaptive testing has been met with some resistance by those in the field of special education who express concern about routing of students with divergent profiles (e.g., some students with…

  14. Implementation of Parallel Computing Technology to Vortex Flow

    NASA Technical Reports Server (NTRS)

    Dacles-Mariani, Jennifer

    1999-01-01

    Mainframe supercomputers such as the Cray C90 were invaluable in obtaining large-scale computations using several million grid points to resolve salient features of a tip vortex flow over a lifting wing. However, real flight configurations require tracking not only the flow over several lifting wings but also its growth and decay in the near- and intermediate-wake regions, not to mention the interaction of these vortices with each other. Resolving and tracking the evolution and interaction of these vortices shed from complex bodies is computationally intensive. Parallel computing technology is an attractive option for solving these flows. In planetary science, vortical flows are also important in studying how planets and protoplanets form when cosmic dust and gases become gravitationally unstable and eventually form planets or protoplanets. The current paradigm for the formation of planetary systems maintains that the planets accreted from the nebula of gas and dust left over from the formation of the Sun. Traditional theory also indicates that such a preplanetary nebula took the form of a flattened disk. The coagulation of dust led to the settling of aggregates toward the midplane of the disk, where they grew further into asteroid-like planetesimals. Some of the issues still remaining in this process are the onset of gravitational instability, the role of turbulence in the damping of particles, and radial effects. In this study the focus is on the role of turbulence and radial effects.

  15. Using Adaptive Learning Technologies to Personalize Instruction to Student Interests: The Impact of Relevant Contexts on Performance and Learning Outcomes

    ERIC Educational Resources Information Center

    Walkington, Candace A.

    2013-01-01

    Adaptive learning technologies are emerging in educational settings as a means to customize instruction to learners' background, experiences, and prior knowledge. Here, a technology-based personalization intervention within an intelligent tutoring system (ITS) for secondary mathematics was used to adapt instruction to students' personal interests.…

  16. Multiresolution Wavelet Based Adaptive Numerical Dissipation Control for Shock-Turbulence Computations

    NASA Technical Reports Server (NTRS)

    Sjoegreen, B.; Yee, H. C.

    2001-01-01

    The recently developed essentially fourth-order or higher low-dissipative shock-capturing scheme of Yee, Sandham and Djomehri (1999) was aimed at minimizing numerical dissipation for high-speed compressible viscous flows containing shocks, shears and turbulence. To detect nonsmooth behavior and control the amount of numerical dissipation to be added, Yee et al. employed an artificial compression method (ACM) of Harten (1978) but utilized it in an entirely different context than Harten originally intended. The ACM sensor consists of two tuning parameters and is highly dependent on the physical problem. To minimize the tuning of parameters and the physical problem dependence, new sensors with improved detection properties are proposed. The new sensors are derived from appropriate non-orthogonal wavelet basis functions and can be used to completely switch off the extra numerical dissipation outside shock layers. The non-dissipative spatial base scheme of arbitrarily high order of accuracy can be maintained without compromising its stability in all parts of the domain where the solution is smooth. Two types of redundant non-orthogonal wavelet basis functions are considered. One is the B-spline wavelet (Mallat & Zhong 1992) used by Gerritsen and Olsson (1996) in an adaptive mesh refinement method to determine regions where refinement should be done. The other is a modification of the multiresolution method of Harten (1995) obtained by converting it to a new, redundant, non-orthogonal wavelet. The wavelet sensor is then obtained by computing the estimated Lipschitz exponent of a chosen physical quantity (or vector) to be sensed on a chosen wavelet basis function. Both wavelet sensors can be viewed as dual-purpose adaptive methods leading to dynamic numerical dissipation control and improved grid adaptation indicators. Consequently, they are useful not only for shock-turbulence computations but also for computational aeroacoustics and numerical combustion. In addition, these
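    A drastically simplified version of such a sensor can be sketched with plain first-difference (Haar-like) detail coefficients standing in for the paper's redundant B-spline and multiresolution wavelets; the Lipschitz-exponent estimate is reduced to a normalized magnitude threshold, and the cutoff value is arbitrary:

```python
import numpy as np

def smoothness_sensor(u, cutoff=0.1):
    """Flag cells where first-level detail coefficients are large, i.e.
    where the solution is non-smooth and extra numerical dissipation
    should be switched on; elsewhere the sensor switches it off."""
    detail = np.abs(np.diff(u))              # first differences ~ Haar details
    detail = detail / (detail.max() + 1e-300)
    flags = np.zeros(u.shape, dtype=bool)
    flags[:-1] |= detail > cutoff            # mark both cells sharing a large jump
    flags[1:] |= detail > cutoff
    return flags

x = np.linspace(-1, 1, 201)
u = np.where(x < 0, 1.0, 0.0) + 0.01 * np.sin(40 * np.pi * x)  # shock + smooth waves
flags = smoothness_sensor(u)
print(flags.sum())  # 2 -- only the two cells adjacent to the jump
```

    The smooth high-frequency waves are left unflagged, so the non-dissipative base scheme is retained there, which is the behavior the wavelet sensors are designed to achieve.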

  17. Automated interferometric synthetic aperture microscopy and computational adaptive optics for improved optical coherence tomography.

    PubMed

    Xu, Yang; Liu, Yuan-Zhi; Boppart, Stephen A; Carney, P Scott

    2016-03-10

    In this paper, we introduce an algorithm framework for the automation of interferometric synthetic aperture microscopy (ISAM). Under this framework, common processing steps such as dispersion correction, Fourier domain resampling, and computational adaptive optics aberration correction are carried out as metrics-assisted parameter search problems. We further present the results of this algorithm applied to phantom and biological tissue samples and compare with manually adjusted results. With the automated algorithm, near-optimal ISAM reconstruction can be achieved without manual adjustment. At the same time, the technical barrier for the nonexpert using ISAM imaging is also significantly lowered. PMID:26974799

  18. Campus Computing, 1996. The Seventh National Survey of Desktop Computing and Information Technology in American Higher Education.

    ERIC Educational Resources Information Center

    Green, Kenneth C.

    This report presents the findings of a June, 1996, survey of computing officials at 660 two- and four-year colleges and universities across the United States concerning the use of computer technology on college campuses. The survey found that instructional integration and user support emerged as the two most important information technology (IT)…

  19. Advances in computer-aided design and computer-aided manufacture technology.

    PubMed

    Calamia, J R

    1994-01-01

    Although the development of computer-aided design (CAD) and computer-aided manufacture (CAM) technology and the benefits of increased productivity became obvious in the automobile and aerospace industries in the 1970s, investigations of this technology's application in the field of dentistry did not begin until the 1980s. Only now are we beginning to see the fruits of this work with the commercial availability of some systems; the potential for this technology seems boundless. This article reviews the recent literature with emphasis on the period from June 1992 to May 1993. This review should familiarize the reader with some of the latest developments in this technology, including a brief description of some systems currently available and the clinical and economic rationale for their acceptance into the dental mainstream. This article concentrates on a particular system, the Cerec (Siemens/Pelton and Crane, Charlotte, NC) system, for three reasons: first, this system has been available since 1985 and, as a result, has a track record of almost 7 years of data. Most of the data have just recently been released and consequently, much of this year's literature on CAD-CAM is monopolized by studies using this system. Second, this system was developed as a mobile, affordable, direct chairside CAD-CAM restorative method. As such, it is of special interest to the dentist who will offer this new technology directly to the patient, providing a one-visit restoration. Third, the author is currently engaged in research using this particular system and has a working knowledge of this system's capabilities. PMID:8032444

  20. The Picatinny Technology Transfer Innovation Center: A business incubator concept adapted to federal laboratory technology transfer

    SciTech Connect

    Wittig, T.; Greenfield, J.

    1996-10-01

    In recent years, the US defense industrial base spawned the aerospace industry, among other successes, and served as the nation's technology seed bed. However, as the defense industrial base shrinks and public and private resources become scarcer, the merging of the commercial and defense communities becomes necessary to maintain national technological competencies. Cooperative efforts such as technology transfer provide an attractive, cost-effective, well-leveraged alternative to independently funded research and development (R and D). The sharing of knowledge, resources, and innovation among defense contractors and other public sector firms, academia, and other organizations has become exceedingly attractive. Recent legislation involving technology transfer provides for the sharing of federal laboratory resources with the private sector. The Army Research, Development and Engineering Center (ARDEC), Picatinny Arsenal, NJ, a designer of weapons systems, is one of the nation's major laboratories with this requirement. To achieve its important technology transfer mission, ARDEC reviewed its capabilities, resources, intellectual property, and products with commercial potential. The purpose of the review was to develop a viable plan for effecting a technology transfer cultural change within the ARDEC, Picatinny Arsenal and with the private sector. This report highlights the issues identified, discussed, and resolved prior to the transformation of a temporarily vacant federal building on the Picatinny installation into a business incubator. ARDEC's discussions and rationale for the decisions and actions that led to the implementation of the Picatinny Technology Transfer Innovation Center are discussed.

  1. Method and system for rendering and interacting with an adaptable computing environment

    DOEpatents

    Osbourn, Gordon Cecil; Bouchard, Ann Marie

    2012-06-12

    An adaptable computing environment is implemented with software entities termed "s-machines", which self-assemble into hierarchical data structures capable of rendering and interacting with the computing environment. A hierarchical data structure includes a first hierarchical s-machine bound to a second hierarchical s-machine. The first hierarchical s-machine is associated with a first layer of a rendering region on a display screen and the second hierarchical s-machine is associated with a second layer of the rendering region overlaying at least a portion of the first layer. A screen element s-machine is linked to the first hierarchical s-machine. The screen element s-machine manages data associated with a screen element rendered to the display screen within the rendering region at the first layer.

  2. An adaptable head retention and alignment device for computed tomography scanning of Macaca mulatta.

    PubMed

    Bidez, M W; Mcloughlin, S W; Chen, Y; Lakshminarayanan, A V; Jeffcoat, M K

    1994-10-01

    An adaptable retention device has been developed for the purpose of holding and aligning the head of a sedated primate subject during computed tomography (CT) scan procedures. The device is used to obtain closely reproducible CT scan studies before and after dental implant placement in the mandibles of nine subjects. Geometric and material properties are extracted from these studies for the purpose of developing finite element computer models. The device is constructed of low-density acrylic and consists of a horizontal base to which lateral supports are affixed. The device is placed on the CT table and axially aligned with the scan beam. Repeatable, calibrated CT studies of primate implant subjects were possible using the head-holding device. PMID:7962014

  3. Dynamic Load Balancing for Adaptive Computations on Distributed-Memory Machines

    NASA Technical Reports Server (NTRS)

    1999-01-01

    Dynamic load balancing is central to adaptive mesh-based computations on large-scale parallel computers. The principal investigator has investigated various issues of the dynamic load balancing problem under NASA JOVE and JAG grants. The major accomplishments of the project are two graph partitioning algorithms and a load balancing framework. The S-HARP dynamic graph partitioner is the fastest dynamic graph partitioner known to date. It can partition a graph of over 100,000 vertices in 0.25 seconds on a 64-processor Cray T3E distributed-memory multiprocessor while maintaining scalability of over 16-fold speedup. Other known and widely used dynamic graph partitioners take a second or two while giving low scalability of a few-fold speedup on 64 processors. These results have been published in journals and peer-reviewed flagship conferences.
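    As a toy illustration of the balanced-partitioning problem such partitioners solve, the sketch below bisects a graph by growing one half via breadth-first search from an arbitrary seed. This is only a caricature of the objective (equal halves, few cut edges); S-HARP itself uses far more sophisticated, parallel techniques:

```python
from collections import deque

def bfs_bisect(adj):
    """Split a graph (adjacency-list dict) into two halves by growing
    one partition via BFS until it holds half the vertices."""
    n = len(adj)
    target = n // 2
    part = {0}
    queue = deque([0])
    while queue and len(part) < target:
        v = queue.popleft()
        for w in adj[v]:
            if w not in part and len(part) < target:
                part.add(w)
                queue.append(w)
    return part, set(range(n)) - part

# 6-node path graph: 0-1-2-3-4-5
adj = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2, 4], 4: [3, 5], 5: [4]}
left, right = bfs_bisect(adj)
print(sorted(left), sorted(right))  # [0, 1, 2] [3, 4, 5]
```

    Growing a partition by BFS keeps each half connected, which tends to keep the edge cut, and hence interprocessor communication, small; recursive application yields 2^k parts.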

  4. Adaptive intermittent control: A computational model explaining motor intermittency observed in human behavior.

    PubMed

    Sakaguchi, Yutaka; Tanaka, Masato; Inoue, Yasuyuki

    2015-07-01

    It is a fundamental question how our brain performs a given motor task in a real-time fashion with the slow sensorimotor system. Computational theory has proposed the influential idea of feed-forward control, but it has mainly treated the case where the movement is ballistic (such as reaching), because the motor commands must be calculated in advance of movement execution. As a possible mechanism for operating feed-forward control in continuous motor tasks (such as target tracking), we propose a control model called "adaptive intermittent control" or "segmented control," in which the brain adaptively divides the continuous time axis into discrete segments and executes feed-forward control in each segment. The idea of intermittent control has been proposed in the fields of control theory, biological modeling, and nonlinear dynamical systems. Compared with these previous models, the key feature of the proposed model is that the system speculatively determines the segmentation based on future prediction and its uncertainty. The results of computer simulation showed that the proposed model realized faithful visuo-manual tracking with realistic sensorimotor delays and with lower computational costs (i.e., with a smaller number of segments). Furthermore, it replicated "motor intermittency," that is, the intermittent discontinuities commonly observed in human movement trajectories. We discuss that temporally segmented control is an inevitable strategy for the brain, which has to achieve a given task at small computational (or cognitive) cost, using a slow control system in an uncertain, variable environment, and that motor intermittency is a side effect of this strategy. PMID:25897510
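    The segmentation idea can be caricatured in a few lines: execute an open-loop (feed-forward) segment, and open a new segment only when the current plan is exhausted or the tracking error grows too large. The fixed error trigger below is a simplification; the paper's model instead chooses segment boundaries speculatively from predicted uncertainty:

```python
import numpy as np

def intermittent_track(target, horizon=10, threshold=0.5):
    """Piecewise feed-forward (segmented) tracking of a 1D target.

    A new open-loop segment (here a straight-line plan toward the
    currently observed target) is computed only when the old segment
    runs out or the error exceeds a threshold -- a toy stand-in for
    uncertainty-based segmentation."""
    pos, out, replans = 0.0, [], 0
    plan, step = None, 0
    for goal in target:
        if plan is None or step >= horizon or abs(pos - goal) > threshold:
            plan = np.linspace(pos, goal, horizon + 1)[1:]  # new feed-forward segment
            step = 0
            replans += 1
        pos = plan[step]          # execute segment open-loop, no feedback within it
        step += 1
        out.append(pos)
    return np.array(out), replans

t = np.arange(100)
target = np.sin(2 * np.pi * t / 50)
traj, replans = intermittent_track(target)
print(traj.shape, replans)
```

    The number of replanning events plays the role of the model's computational cost: fewer, longer segments are cheaper but track less faithfully.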

  5. The PLUTO Code for Adaptive Mesh Computations in Astrophysical Fluid Dynamics

    NASA Astrophysics Data System (ADS)

    Mignone, A.; Zanni, C.; Tzeferacos, P.; van Straalen, B.; Colella, P.; Bodo, G.

    2012-01-01

    We present a description of the adaptive mesh refinement (AMR) implementation of the PLUTO code for solving the equations of classical and special relativistic magnetohydrodynamics (MHD and RMHD). The current release exploits, in addition to the static grid version of the code, the distributed infrastructure of the CHOMBO library for multidimensional parallel computations over block-structured, adaptively refined grids. We employ a conservative finite-volume approach where primary flow quantities are discretized at the cell center in a dimensionally unsplit fashion using the Corner Transport Upwind method. Time stepping relies on a characteristic tracing step where piecewise parabolic method, weighted essentially non-oscillatory, or slope-limited linear interpolation schemes can be handily adopted. A characteristic decomposition-free version of the scheme is also illustrated. The solenoidal condition of the magnetic field is enforced by augmenting the equations with a generalized Lagrange multiplier providing propagation and damping of divergence errors through a mixed hyperbolic/parabolic explicit cleaning step. Among the novel features, we describe an extension of the scheme to include non-ideal dissipative processes, such as viscosity, resistivity, and anisotropic thermal conduction without operator splitting. Finally, we illustrate an efficient treatment of point-local, potentially stiff source terms over hierarchical nested grids by taking advantage of the adaptivity in time. Several multidimensional benchmarks and applications to problems of astrophysical relevance assess the potentiality of the AMR version of PLUTO in resolving flow features separated by large spatial and temporal disparities.

  6. THE PLUTO CODE FOR ADAPTIVE MESH COMPUTATIONS IN ASTROPHYSICAL FLUID DYNAMICS

    SciTech Connect

    Mignone, A.; Tzeferacos, P.; Zanni, C.; Bodo, G.; Van Straalen, B.; Colella, P.

    2012-01-01

    We present a description of the adaptive mesh refinement (AMR) implementation of the PLUTO code for solving the equations of classical and special relativistic magnetohydrodynamics (MHD and RMHD). The current release exploits, in addition to the static grid version of the code, the distributed infrastructure of the CHOMBO library for multidimensional parallel computations over block-structured, adaptively refined grids. We employ a conservative finite-volume approach where primary flow quantities are discretized at the cell center in a dimensionally unsplit fashion using the Corner Transport Upwind method. Time stepping relies on a characteristic tracing step where piecewise parabolic method, weighted essentially non-oscillatory, or slope-limited linear interpolation schemes can be handily adopted. A characteristic decomposition-free version of the scheme is also illustrated. The solenoidal condition of the magnetic field is enforced by augmenting the equations with a generalized Lagrange multiplier providing propagation and damping of divergence errors through a mixed hyperbolic/parabolic explicit cleaning step. Among the novel features, we describe an extension of the scheme to include non-ideal dissipative processes, such as viscosity, resistivity, and anisotropic thermal conduction without operator splitting. Finally, we illustrate an efficient treatment of point-local, potentially stiff source terms over hierarchical nested grids by taking advantage of the adaptivity in time. Several multidimensional benchmarks and applications to problems of astrophysical relevance assess the potentiality of the AMR version of PLUTO in resolving flow features separated by large spatial and temporal disparities.

  7. Reconfigurable and adaptive photonic networks for high-performance computing systems.

    PubMed

    Kodi, Avinash; Louri, Ahmed

    2009-08-01

    As feature sizes decrease to the submicrometer regime and clock rates increase to the multigigahertz range, the limited bandwidth at higher bit rates and longer communication distances in electrical interconnects will create a major bandwidth imbalance in future high-performance computing (HPC) systems. We explore the application of an optoelectronic interconnect for the design of flexible, high-bandwidth, reconfigurable and adaptive interconnection architectures for chip-to-chip and board-to-board HPC systems. Reconfigurability is realized by interconnecting arrays of optical transmitters, and adaptivity is implemented by a dynamic bandwidth reallocation (DBR) technique that balances the load on each communication channel. We evaluate a DBR technique, the lockstep (LS) protocol, that monitors traffic intensities, reallocates bandwidth, and adapts to changes in communication patterns. We incorporate this DBR technique into a detailed discrete-event network simulator to evaluate the performance for uniform, nonuniform, and permutation communication patterns. Simulation results indicate that, even without reconfiguration, the optics-based system architecture outperforms electrical interconnects for uniform and nonuniform patterns; with reconfiguration applied, the dynamically reconfigurable optoelectronic interconnect performs much better for all communication patterns. Based on the performance study, the reconfigured architecture shows 30%-50% higher throughput and 50%-75% lower network latency than HPC electrical networks. PMID:19649024
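    The core of dynamic bandwidth reallocation can be illustrated by a toy policy, not the LS protocol itself, that divides spare bandwidth in proportion to monitored traffic intensity subject to a guaranteed per-channel minimum. All parameter values here are illustrative.

```python
def reallocate_bandwidth(intensities, total_bw=64.0, min_bw=1.0):
    """Toy DBR step: every channel keeps a guaranteed minimum and the
    remaining bandwidth is split in proportion to measured traffic."""
    n = len(intensities)
    spare = total_bw - n * min_bw
    assert spare >= 0, "total bandwidth must cover the per-channel minimum"
    total = sum(intensities)
    if total == 0:                       # idle network: equal shares
        return [total_bw / n] * n
    return [min_bw + spare * x / total for x in intensities]
```

    Reapplying the policy each monitoring interval shifts bandwidth toward whichever channels are currently loaded, which is the adaptive behaviour the abstract attributes to DBR.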

  8. Application of adaptive and neural network computational techniques to Traffic Volume and Classification Monitoring

    SciTech Connect

    Mead, W.C.; Fisher, H.N.; Jones, R.D.; Bisset, K.R.; Lee, L.A.

    1993-09-01

    We are developing a Traffic Volume and Classification Monitoring (TVCM) system based on adaptive and neural network computational techniques. The value of neural networks in this application lies in their ability to learn from data and to form a mapping of arbitrary topology. The piezoelectric strip and magnetic loop sensors typically used for TVCM provide signals that are complicated and variable, and that correspond in indirect ways with the desired FHWA 13-class classification system. Further, the wide variety of vehicle configurations adds to the complexity of the classification task. Our goal is to provide a TVCM system featuring high accuracy, adaptability to wide sensor and environmental variations, and continuous fault detection. We have instrumented an experimental TVCM site, developed PC-based on-line data acquisition software, collected a large database of vehicles' signals together with accurate ground truth determination, and analyzed the data off-line with a neural net classification system that can distinguish between class 2 (automobiles) and class 3 (utility vehicles) with better than 90% accuracy. The neural network used, called the Connectionist Hyperprism Classification (CHC) network, features simple basis functions; rapid, linear training algorithms for basis function amplitudes and widths; and basis function elimination that enhances network speed and accuracy. Work is in progress to extend the system to other classes, to quantify the system's adaptability, and to develop automatic fault detection techniques.
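    The basis-functions-plus-linear-training idea can be sketched generically. The CHC network's actual basis functions and training rules are not reproduced here, so this stand-in uses plain Gaussian bases with a least-squares readout.

```python
import numpy as np

def rbf_features(X, centers, width=1.0):
    """Gaussian basis-function activations for each sample."""
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-d2 / (2.0 * width ** 2))

def train_readout(X, y, centers, width=1.0):
    """Linear (least-squares) training of the basis-function amplitudes."""
    Phi = rbf_features(X, centers, width)
    w, *_ = np.linalg.lstsq(Phi, y.astype(float), rcond=None)
    return w

def classify(X, centers, w, width=1.0):
    """Threshold the linear readout to get a two-class decision."""
    return (rbf_features(X, centers, width) @ w > 0.5).astype(int)
```

    Because the amplitudes enter linearly, training reduces to one least-squares solve, which is the kind of rapid linear training the abstract highlights.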

  9. An adaptive filter bank for motor imagery based Brain Computer Interface.

    PubMed

    Thomas, Kavitha P; Guan, Cuntai; Tong, Lau Chiew; Prasad, Vinod A

    2008-01-01

    Brain Computer Interface (BCI) provides an alternative communication and control method for people with severe motor disabilities. Motor imagery patterns are widely used in Electroencephalogram (EEG) based BCIs. These motor imagery activities are associated with variations in the alpha and beta band power of EEG signals, called Event Related Desynchronization/Synchronization (ERD/ERS). The dominant frequency bands are subject-specific, and therefore the performance of motor imagery based BCIs is sensitive to both temporal filtering and spatial filtering. As the optimum filter is strongly subject-dependent, we propose a method that selects the subject-specific discriminative frequency components using time-frequency plots of the Fisher ratio of two-class motor imagery patterns. We also propose a low-complexity adaptive Finite Impulse Response (FIR) filter bank system based on a coefficient decimation technique, which can realize the subject-specific bandpass filters adaptively depending on the information in the Fisher ratio map. Features are extracted only from the selected frequency components. The proposed adaptive filter bank based system offers an average classification accuracy of about 90%, which is slightly better than the existing fixed filter bank system. PMID:19162856
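    Selecting subject-specific bands from a Fisher-ratio map can be sketched as a per-band score on precomputed band powers. This is a simplified illustration, not the paper's full time-frequency implementation.

```python
import numpy as np

def fisher_ratio_bands(power_a, power_b):
    """Fisher ratio per frequency band for two motor-imagery classes:
    (difference of class means)^2 / (sum of class variances).
    power_a, power_b: (trials, bands) arrays of band power."""
    num = (power_a.mean(axis=0) - power_b.mean(axis=0)) ** 2
    den = power_a.var(axis=0) + power_b.var(axis=0) + 1e-12
    return num / den

def select_bands(power_a, power_b, k=2):
    """Indices of the k most discriminative bands (largest Fisher ratio)."""
    fr = fisher_ratio_bands(power_a, power_b)
    return np.argsort(fr)[::-1][:k]
```

    Features would then be extracted only from the selected bands, which is what makes the resulting filter bank subject-specific.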

  10. The Utah Educational Technology Initiative Year Two Evaluation: Program Implementation, Computer Acquisition and Placement, and Computer Use.

    ERIC Educational Resources Information Center

    Mergendoller, John R.; And Others

    This evaluation report describes program implementation, computer acquisition and placement, and computer use during the second year (1991-92) of the Utah Educational Technology Initiative (ETI). In addition, it discusses the various ways computers are used in Utah schools and reports the opinions and experiences of ETI coordinators in the 12…

  11. Reviews of computing technology: A review of compound document architectures

    SciTech Connect

    Hudson, B.J.

    1991-10-01

    This review of computing technology will define, describe, and give examples of various approaches to document management through the use of compound document architectures. Experts agree that only 10% of business information exists in machine-readable form, but much of what is stored is not in useful form. As a result, the average business document is copied over a dozen times during its life, and duplicate copies are stored in numerous locations. The goal of compound document architectures is to provide an information support environment in which rapid access to the correct information in the proper format is simplified. A compound document architecture provides structure to seemingly unstructured electronic documents, and standardizes the methods for interchange and access of entire or partial documents by authors and users.

  12. Application of modern computer technology to EPRI (Electric Power Research Institute) nuclear computer programs: Final report

    SciTech Connect

    Feinauer, L.R.

    1989-08-01

    Many of the nuclear analysis programs in use today were designed and developed well over a decade ago. Within this time frame, tremendous changes in hardware and software technologies have made it necessary to revise and/or restructure most of the analysis programs to take advantage of these changes. As computer programs mature from the development phase to being production programs, program maintenance and portability become very important issues. The maintenance costs associated with a particular computer program can generally be expected to exceed the total development costs by as much as a factor of two. Many of the problems associated with high maintenance costs can be traced back to either poorly designed coding structure, or "quick fix" modifications which do not preserve the original coding structure. The lack of standardization between hardware designs presents an obstacle to the software designer in providing 100% portable coding; however, conformance to certain guidelines can ensure portability between a wide variety of machines and operating systems. This report presents guidelines for upgrading EPRI nuclear computer programs to conform to current programming standards while maintaining flexibility for accommodating future hardware and software design trends. Guidelines for development of new computer programs are also presented. 22 refs., 10 figs.

  13. When should irrigators invest in more water-efficient technologies as an adaptation to climate change?

    NASA Astrophysics Data System (ADS)

    Malek, K.; Adam, J. C.; Stockle, C.; Brady, M.; Yoder, J.

    2015-12-01

    The western US is expected to experience more frequent droughts of higher magnitude and persistence due to climate change, with potentially large impacts on agricultural productivity and the economy. Irrigated farmers have many options for minimizing drought impacts, including changing crops, engaging in water markets, and switching irrigation technologies. Switching to more efficient irrigation technologies, which increase water availability in the crop root zone by reducing irrigation losses, receives significant attention because of the promise of maintaining current production with less water. However, more efficient irrigation systems are almost always a more capital-intensive adaptation strategy, particularly compared with changing crops or trading water. A farmer's decision to switch will depend on how much money they project to save by reducing drought damages. The objective of this study is to explore when (and under what climate change scenarios) it makes economic sense for farmers to invest in a new irrigation system. This study was performed over the Yakima River Basin (YRB) in Washington State, although the tools and information gained from this study are transferable to other watersheds in the western US. We used VIC-CropSyst, a large-scale grid-based modeling framework that simulates hydrological processes while mechanistically capturing crop water use, growth, and development. The water flows simulated by VIC-CropSyst were used to run the RiverWare river system and water management model (YAK-RW), which simulates river processes and calculates regional water availability for agricultural use each day (i.e., the prorationing ratio). An automated computational platform has been developed and programmed to perform the economic analysis for each grid cell, crop type, and future climate projection separately, which allows us to explore whether or not implementing a new irrigation system is economically viable. Results of this study indicate that
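    In its simplest form, the per-grid-cell investment test the abstract describes reduces to a discounted cash-flow comparison: switch only when the avoided drought losses repay the capital cost. This sketch uses illustrative numbers, not the study's parameters.

```python
def irrigation_upgrade_pays_off(capital_cost, annual_savings, years=20, rate=0.05):
    """Toy investment test: return True when the net present value of the
    projected annual savings (avoided drought damages) over the system's
    lifetime exceeds the up-front capital cost."""
    npv = sum(annual_savings / (1.0 + rate) ** t
              for t in range(1, years + 1)) - capital_cost
    return npv > 0
```

    Running such a test separately per grid cell, crop, and climate projection is what lets the study map where and when the upgrade is economically viable.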

  14. Technical Education Transfer: Perceptions of Employee Computer Technology Self-Efficacy.

    ERIC Educational Resources Information Center

    Decker, C. A.

    1999-01-01

    This study investigated influences on employee self-efficacy of computer technologies resulting from computer-training programs that were intended to meet individual and organization objectives for university personnel. Influences on the transfer of training process included previous computer training, computer-use requirements, computer-use…

  15. What Can Computer Technology Offer Special Education? Research & Resources: Special Education Information for Policymakers.

    ERIC Educational Resources Information Center

    National Conference of State Legislatures, Washington, DC.

    Intended for policymakers, this brief addresses issues related to computer technology and its contributions to special education. Trends are noted and three types of applications are considered: computer assisted instruction, computer managed instruction, and computer support activities. Descriptions of several computer applications in local and…

  16. Isosurface Computation Made Simple: Hardware Acceleration, Adaptive Refinement and Tetrahedral Stripping

    SciTech Connect

    Pascucci, V

    2004-02-18

    This paper presents a simple approach for rendering isosurfaces of a scalar field. Using the vertex programming capability of commodity graphics cards, we transfer the cost of computing an isosurface from the Central Processing Unit (CPU), running the main application, to the Graphics Processing Unit (GPU), rendering the images. We consider a tetrahedral decomposition of the domain and draw one quadrangle (quad) primitive per tetrahedron. A vertex program transforms the quad into the piece of isosurface within the tetrahedron (see Figure 2). In this way, the main application is only devoted to streaming the vertices of the tetrahedra from main memory to the graphics card. For adaptively refined rectilinear grids, the optimization of this streaming process leads to the definition of a new 3D space-filling curve, which generalizes the 2D Sierpinski curve used for efficient rendering of triangulated terrains. We maintain the simplicity of the scheme when constructing view-dependent adaptive refinements of the domain mesh. In particular, we guarantee the absence of T-junctions by satisfying local bounds in our nested error basis. The expensive stage of fixing cracks in the mesh is completely avoided. We discuss practical tradeoffs in the distribution of the workload between the application and the graphics hardware. With current GPUs it is convenient to perform certain computations on the main CPU. Beyond performance considerations, which will change with new generations of GPUs, this approach has the major advantage of completely avoiding the storage in memory of the isosurface vertices and triangles.
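    The per-tetrahedron step, turning one primitive into the piece of isosurface inside a tetrahedron, amounts to linear interpolation along each edge whose endpoint values straddle the isovalue. Here is a CPU-side sketch of that geometry, not the actual vertex program.

```python
import numpy as np

def tet_isosurface(verts, vals, iso):
    """Points where an isosurface crosses a tetrahedron's edges, found by
    linear interpolation along each edge whose endpoint values straddle
    the isovalue. verts: 4 vertex positions; vals: 4 scalar values."""
    verts = np.asarray(verts, dtype=float)
    vals = np.asarray(vals, dtype=float)
    edges = [(0, 1), (0, 2), (0, 3), (1, 2), (1, 3), (2, 3)]
    pts = []
    for i, j in edges:
        a, b = vals[i] - iso, vals[j] - iso
        if a * b < 0:                     # edge crosses the isovalue
            t = a / (a - b)               # interpolation parameter in (0, 1)
            pts.append(verts[i] + t * (verts[j] - verts[i]))
    return np.array(pts)
```

    Depending on the sign pattern of the four values, the crossing set is a triangle or a quadrilateral, which is why one quad primitive per tetrahedron suffices.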

  17. Evolutionary adaptive eye tracking for low-cost human computer interaction applications

    NASA Astrophysics Data System (ADS)

    Shen, Yan; Shin, Hak Chul; Sung, Won Jun; Khim, Sarang; Kim, Honglak; Rhee, Phill Kyu

    2013-01-01

    We present an evolutionary adaptive eye-tracking framework aiming for low-cost human computer interaction. The main focus is to guarantee eye-tracking performance without using high-cost devices and strongly controlled situations. The performance optimization of eye tracking is formulated as the dynamic control problem of deciding on an eye-tracking algorithm structure and its associated thresholds/parameters, where the dynamic control space is denoted by genotype and phenotype spaces. The evolutionary algorithm is responsible for exploring the genotype control space, and the reinforcement learning algorithm organizes the evolved genotype into a reactive phenotype. The evolutionary algorithm encodes an eye-tracking scheme as a genetic code based on image variation analysis. Then, the reinforcement learning algorithm defines internal states in a phenotype control space limited by the perceived genetic code and carries out interactive adaptations. The proposed method can achieve optimal performance by balancing the difficulty of running the evolutionary algorithm in real time against the drawback of the reinforcement learning algorithm's huge search space. Extensive experiments were carried out using webcam image sequences and yielded very encouraging results. The framework can be readily applied to other low-cost vision-based human computer interactions in solving their intrinsic brittleness in unstable operational environments.

  18. Adaptive controller for dynamic power and performance management in the virtualized computing systems.

    PubMed

    Wen, Chengjian; Long, Xiang; Mu, Yifen

    2013-01-01

    Power and performance management in large-scale computing systems such as data centers has attracted a lot of interest from both enterprises and academic researchers, as power saving has become more and more important in many fields. Because of the multiple objectives, multiple influential factors, and hierarchical structure of such systems, the problem is indeed complex and hard. In this paper, the problem is investigated in a virtualized computing system. Specifically, it is formulated as a power optimization problem with constraints on performance. Then, an adaptive controller based on a least-squares self-tuning regulator (LS-STR) is designed to track performance as the first step; as the second step, the resource computed by the controller is allocated so as to minimize power consumption. Simulations are designed to test the effectiveness of this method and to compare it with some other controllers. The simulation results show that the adaptive controller is generally effective: it is applicable for different performance metrics, for different workloads, and for single and multiple workloads; it can track the performance requirement effectively and save power consumption significantly. PMID:23451241
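    The identification half of a least-squares self-tuning regulator is recursive least squares, which updates a model of the plant online as workload conditions change. The following is a generic sketch of that estimator; the controller synthesis and the paper's performance model are omitted.

```python
import numpy as np

class RecursiveLeastSquares:
    """Recursive least-squares parameter estimator, the identification
    core of an LS self-tuning regulator."""
    def __init__(self, n, lam=0.99):
        self.theta = np.zeros(n)       # parameter estimate
        self.P = np.eye(n) * 1e3       # estimate covariance
        self.lam = lam                 # forgetting factor (< 1 adapts)

    def update(self, phi, y):
        """One update from regressor phi and measured output y."""
        phi = np.asarray(phi, dtype=float)
        k = self.P @ phi / (self.lam + phi @ self.P @ phi)
        self.theta = self.theta + k * (y - phi @ self.theta)
        self.P = (self.P - np.outer(k, phi @ self.P)) / self.lam
        return self.theta
```

    The forgetting factor below 1 discounts old data, which is what lets the regulator re-tune itself when the workload, and hence the plant, changes.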

  19. Fast computation of an optimal controller for large-scale adaptive optics.

    PubMed

    Massioni, Paolo; Kulcsár, Caroline; Raynaud, Henri-François; Conan, Jean-Marc

    2011-11-01

    The linear quadratic Gaussian regulator provides the minimum-variance control solution for a linear time-invariant system. For adaptive optics (AO) applications, under the hypothesis of a deformable mirror with instantaneous response, such a controller boils down to a minimum-variance phase estimator (a Kalman filter) and a projection onto the mirror space. The Kalman filter gain can be computed by solving an algebraic Riccati matrix equation, whose computational complexity grows very quickly with the size of the telescope aperture. This "curse of dimensionality" makes the standard solvers for Riccati equations very slow in the case of extremely large telescopes. In this article, we propose a way of computing the Kalman gain for AO systems by means of an approximation that considers the turbulence phase screen as the cropped version of an infinite-size screen. We demonstrate the advantages of the methods for both off- and on-line computational time, and we evaluate its performance for classical AO as well as for wide-field tomographic AO with multiple natural guide stars. Simulation results are reported. PMID:22048298
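    The baseline computation the paper accelerates, obtaining the steady-state Kalman gain from the Riccati equation, can be done for small systems by simply iterating the Riccati difference equation to convergence. A sketch of that standard baseline (not the paper's cropped-screen approximation):

```python
import numpy as np

def kalman_gain(A, C, Q, R, iters=500):
    """Steady-state Kalman gain for x' = A x + w, y = C x + v, with
    process/measurement noise covariances Q and R, obtained by
    iterating the Riccati difference equation."""
    n = A.shape[0]
    P = np.eye(n)
    for _ in range(iters):
        Pp = A @ P @ A.T + Q                               # time update
        K = Pp @ C.T @ np.linalg.inv(C @ Pp @ C.T + R)     # gain
        P = (np.eye(n) - K @ C) @ Pp                       # measurement update
    return K
```

    Each iteration costs O(n^3) in the state dimension, which is exactly the scaling that becomes prohibitive for extremely large telescope apertures and motivates the approximation proposed in the article.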

  20. U.S. perspective on technology demonstration experiments for adaptive structures

    NASA Technical Reports Server (NTRS)

    Aswani, Mohan; Wada, Ben K.; Garba, John A.

    1991-01-01

    Evaluation of design concepts for adaptive structures is being performed in support of several focused research programs. These include programs such as Precision Segmented Reflector (PSR), Control Structure Interaction (CSI), and the Advanced Space Structures Technology Research Experiment (ASTREX). Although not specifically designed for adaptive structure technology validation, relevant experiments can be performed using the Passive and Active Control of Space Structures (PACOSS) testbed, the Space Integrated Controls Experiment (SPICE), the CSI Evolutionary Model (CEM), and the Dynamic Scale Model Test (DSMT) Hybrid Scale. In addition to the ground test experiments, several space flight experiments have been planned, including a reduced gravity experiment aboard the KC-135 aircraft, shuttle middeck experiments, and the Inexpensive Flight Experiment (INFLEX).

  1. Adaptation of NASA technology for the optimum design of orthopedic knee implants.

    PubMed

    Saravanos, D A; Mraz, P J; Davy, D T; Hopkins, D A

    1991-03-01

    NASA technology originally developed for designing aircraft turbine-engine blades has been adapted and applied to orthopedic knee implants. This article describes a method for tailoring an implant for optimal interaction with the environment of the tibia. The implant components are designed to control stresses in the bone for minimizing bone degradation and preventing failures. Engineers expect the tailoring system to improve knee prosthesis design and allow customized implants for individual patients. PMID:10150099

  2. Advances in computer technology: impact on the practice of medicine.

    PubMed

    Groth-Vasselli, B; Singh, K; Farnsworth, P N

    1995-01-01

    Advances in computer technology provide a wide range of applications which are revolutionizing the practice of medicine. The development of new software for the office creates a web of communication among physicians, staff members, health care facilities and associated agencies. This provides the physician with the prospect of a paperless office. At the other end of the spectrum, the development of 3D work stations and software based on computational chemistry permits visualization of protein molecules involved in disease. Computer assisted molecular modeling has been used to construct working 3D models of lens alpha-crystallin. The 3D structure of alpha-crystallin is basic to our understanding of the molecular mechanisms involved in lens fiber cell maturation, stabilization of the inner nuclear region, the maintenance of lens transparency and cataractogenesis. The major component of the high molecular weight aggregates that occur during cataractogenesis is alpha-crystallin subunits. Subunits of alpha-crystallin occur in other tissues of the body. In the central nervous system accumulation of these subunits in the form of dense inclusion bodies occurs in pathological conditions such as Alzheimer's disease, Huntington's disease, multiple sclerosis and toxoplasmosis (Iwaki, Wisniewski et al., 1992), as well as neoplasms of astrocyte origin (Iwaki, Iwaki, et al., 1991). Also cardiac ischemia is associated with an increased alpha B synthesis (Chiesi, Longoni et al., 1990). On a more global level, the molecular structure of alpha-crystallin may provide information pertaining to the function of small heat shock proteins, hsp, in maintaining cell stability under the stress of disease. PMID:8721907

  3. Flexibly Adaptive Professional Development in Support of Teaching Science with Geospatial Technology

    NASA Astrophysics Data System (ADS)

    Trautmann, Nancy M.; Makinster, James G.

    2010-04-01

    The flexibly adaptive model of professional development, developed in the GIT Ahead project, enables secondary science teachers to incorporate a variety of geospatial technology applications into wide-ranging classroom contexts. Teacher impacts were evaluated quantitatively and qualitatively. Post-questionnaire responses showed significant growth in teachers’ perceived technological expertise, interest, and ability to integrate geospatial technology into their science teaching. Application of the Technical Pedagogical Content Knowledge (TPACK) framework to three case studies illustrates such growth. Crucial aspects of professional development in support of teaching science with geospatial technology include intensive training, ongoing support, a supportive learning community, and flexibility in terms of support provided and implementation expectations. Implications are presented for design of professional development and use of TPACK in evaluating impacts.

  4. Adaptation of the perfluorocarbon tracer technology for aqueous-phase studies in subsurface applications

    SciTech Connect

    Senum, G.I.; Goodrich, R.W.; Wilson, R.; Dietz, R.N.

    1990-01-01

    The perfluorocarbon tracer (PFT) technology developed by the Tracer Technology Center at Brookhaven National Laboratory can be easily adapted for use in aqueous-phase tracer studies in subsurface hydrological applications. The advantages of the PFT technology in this application are that it is a multi-tracer technology (up to 5 or 6 PFTs may be used in a single experiment), that the PFTs are completely non-toxic and inert, and that the PFTs can be detected with four orders of magnitude greater sensitivity than fluorescent dyes. The disadvantages are that the PFTs are only sparingly soluble in water and are also volatile; these are minimized by the PFT deployment and sampling methodologies given in this report. 15 refs., 3 tabs.

  5. Survey of subsurface geophysical exploration technologies adaptable to an airborne platform

    SciTech Connect

    Taylor, K.A.

    1992-12-01

    This report has been prepared by the US Department of Energy (DOE) as part of a Research Development Demonstration Testing and Evaluation (RDDT&E) project by EG&G Energy Measurement's (EG&G/EM) Remote Sensing Laboratory. It examines geophysical detection techniques which may be used in Environmental Restoration/Waste Management (ER/WM) surveys to locate buried waste, waste containers, potential waste migratory paths, and aquifer depths. Because of the Remote Sensing Laboratory's unique survey capabilities, only those technologies which have been adapted or are capable of being adapted to an airborne platform were studied. This survey describes several of the available subsurface survey technologies and discusses the basic capabilities of each: the target detectability, required geologic conditions, and associated survey methods. Because the airborne capabilities of these survey techniques have not been fully developed, the chapters deal mostly with the ground-based capabilities of each of the technologies, with reference made to the airborne capabilities where applicable. The information about each survey technique came from various contractors whose companies employ these specific technologies. EG&G/EM cannot guarantee or verify the accuracy of the contractor information; however, the data given is an indication of the technologies that are available.

  6. Survey of subsurface geophysical exploration technologies adaptable to an airborne platform

    SciTech Connect

    Taylor, K.A.

    1992-12-01

    This report has been prepared by the US Department of Energy (DOE) as part of a Research Development Demonstration Testing and Evaluation (RDDT&E) project by EG&G Energy Measurement's (EG&G/EM) Remote Sensing Laboratory. It examines geophysical detection techniques which may be used in Environmental Restoration/Waste Management (ER/WM) surveys to locate buried waste, waste containers, potential waste migratory paths, and aquifer depths. Because of the Remote Sensing Laboratory's unique survey capabilities, only those technologies which have been adapted or are capable of being adapted to an airborne platform were studied. This survey describes several of the available subsurface survey technologies and discusses the basic capabilities of each: the target detectability, required geologic conditions, and associated survey methods. Because the airborne capabilities of these survey techniques have not been fully developed, the chapters deal mostly with the ground-based capabilities of each of the technologies, with reference made to the airborne capabilities where applicable. The information about each survey technique came from various contractors whose companies employ these specific technologies. EG&G/EM cannot guarantee or verify the accuracy of the contractor information; however, the data given is an indication of the technologies that are available.

  7. Important Advances in Technology and Unique Applications to Cardiovascular Computed Tomography

    PubMed Central

    Chaikriangkrai, Kongkiat; Choi, Su Yeon; Nabi, Faisal; Chang, Su Min

    2014-01-01

    For the past decade, multidetector cardiac computed tomography and its main application, coronary computed tomography angiography, have been established as a noninvasive technique for anatomical assessment of coronary arteries. This new era of coronary artery evaluation by coronary computed tomography angiography has arisen from the rapid advancement in computed tomography technology, which has led to massive diagnostic and prognostic clinical studies in various patient populations. This article gives a brief overview of current multidetector cardiac computed tomography systems, developing cardiac computed tomography technologies in both hardware and software fields, innovative radiation exposure reduction measures, multidetector cardiac computed tomography functional studies, and their newer clinical applications beyond coronary computed tomography angiography. PMID:25574342

  8. Important advances in technology and unique applications to cardiovascular computed tomography.

    PubMed

    Chaikriangkrai, Kongkiat; Choi, Su Yeon; Nabi, Faisal; Chang, Su Min

    2014-01-01

    For the past decade, multidetector cardiac computed tomography and its main application, coronary computed tomography angiography, have been established as a noninvasive technique for anatomical assessment of coronary arteries. This new era of coronary artery evaluation by coronary computed tomography angiography has arisen from the rapid advancement in computed tomography technology, which has led to massive diagnostic and prognostic clinical studies in various patient populations. This article gives a brief overview of current multidetector cardiac computed tomography systems, developing cardiac computed tomography technologies in both hardware and software fields, innovative radiation exposure reduction measures, multidetector cardiac computed tomography functional studies, and their newer clinical applications beyond coronary computed tomography angiography. PMID:25574342

  9. FPGA-based slope computation for ELTs adaptive optics wavefront sensors

    NASA Astrophysics Data System (ADS)

    Rodríguez Ramos, L. F.; Díaz Garcia, J. J.; Piqueras Meseguer, J. J.; Martin Hernando, Y.; Rodríguez Ramos, J. M.

    2008-07-01

    Wavefront sensors for ELT laser guide stars are planned to have specifically developed sensor chips, which will probably include readout logic and D/A conversion, followed by a powerful FPGA slope computer located very close to the sensor, but not inside it, for reasons of flexibility and simplicity. This paper presents the architecture of an FPGA-based wavefront slope computer capable of handling the sensor output stream in a massively parallel approach. It will feature the ability to perform dark and flat-field correction, the flexibility needed for allocating complex processing schemes, the capability of undertaking at maximum speed all computations expected of it, even those not strictly related to the calculation of the slopes, and the necessary housekeeping controls to properly command it and evaluate its behaviour. Feasibility using today's technology is evaluated, clearly showing its viability, together with an analysis of the amount of external memory, power consumption and printed circuit board space needed.
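    The per-subaperture arithmetic such a slope computer pipelines, dark subtraction, flat-field correction, and a centre-of-gravity slope estimate, can be sketched as follows. This is a generic Shack-Hartmann centroid in floating point, not the instrument's fixed-point firmware.

```python
import numpy as np

def subaperture_slopes(img, dark=None, flat=None):
    """Centre-of-gravity slope of one subaperture spot after optional
    dark and flat-field correction; the slope is the centroid offset
    (x, y) from the subaperture centre, in pixels."""
    img = img.astype(float)
    if dark is not None:
        img = img - dark
    if flat is not None:
        img = img / np.where(flat > 0, flat, 1.0)
    total = img.sum()
    ys, xs = np.indices(img.shape)
    cy = (ys * img).sum() / total
    cx = (xs * img).sum() / total
    c = (np.array(img.shape) - 1) / 2.0    # geometric subaperture centre
    return cx - c[1], cy - c[0]
```

    Because each subaperture is processed independently with only multiply-accumulate operations, the computation maps naturally onto the massively parallel FPGA pipeline the paper describes.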

  10. Coevolution of adaptive technology, maladaptive culture and population size in a producer–scrounger game

    PubMed Central

    Lehmann, Laurent; Feldman, Marcus W.

    2009-01-01

    Technology (i.e. tools, methods of cultivation and domestication, systems of construction and appropriation, machines) has increased the vital rates of humans, and is one of the defining features of the transition from Malthusian ecological stagnation to a potentially perpetual rising population growth. Maladaptations, on the other hand, encompass behaviours, customs and practices that decrease the vital rates of individuals. Technology and maladaptations are part of the total stock of culture carried by the individuals in a population. Here, we develop a quantitative model for the coevolution of cumulative adaptive technology and maladaptive culture in a ‘producer–scrounger’ game, which can also usefully be interpreted as an ‘individual–social’ learner interaction. Producers (individual learners) are assumed to invent new adaptations and maladaptations by trial-and-error learning, insight or deduction, and they pay the cost of innovation. Scroungers (social learners) are assumed to copy or imitate (cultural transmission) both the adaptations and maladaptations generated by producers. We show that the coevolutionary dynamics of producers and scroungers in the presence of cultural transmission can have a variety of effects on population carrying capacity. From stable polymorphism, where scroungers bring an advantage to the population (increase in carrying capacity), to periodic cycling, where scroungers decrease carrying capacity, we find that selection-driven cultural innovation and transmission may send a population on the path of indefinite growth or to extinction. PMID:19692409
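
    A toy discrete-time simulation can illustrate the qualitative dynamics the abstract describes: producers generate both adaptive and maladaptive culture at a cost, scroungers copy it for free, and the accumulated stocks feed back on vital rates. Every parameter value and functional form below is an illustrative assumption, not the authors' actual model.

```python
def simulate(generations=200):
    """Toy sketch of producer-scrounger coevolution with cumulative
    adaptive technology (A) and maladaptive culture (M).
    All coefficients are illustrative assumptions."""
    p = 0.5          # frequency of producers (individual learners)
    A, M = 1.0, 0.0  # stocks of adaptive technology, maladaptive culture
    N = 100.0        # population size
    history = []
    for _ in range(generations):
        # Producers innovate both adaptations and (less often) maladaptations
        A += 0.10 * p
        M += 0.02 * p
        # Scroungers copy whatever culture producers generate
        A += 0.05 * (1 - p)
        M += 0.03 * (1 - p)
        # Vital rates: technology raises growth, maladaptations and crowding lower it
        growth = 0.05 + 0.01 * A - 0.02 * M - 0.001 * N
        N = max(N * (1.0 + growth), 0.0)
        # Producers pay an innovation cost; scroungers free-ride on the same stock
        w_p = 1.0 + 0.01 * A - 0.05
        w_s = 1.0 + 0.01 * A
        w_bar = p * w_p + (1 - p) * w_s
        p = min(max(p * w_p / w_bar, 0.01), 0.99)
        history.append((p, A, M, N))
    return history
```

    Under these particular parameters scroungers invade (producer frequency falls) while the technology stock still accumulates; other parameter choices can produce the cycling or extinction outcomes the abstract mentions.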

  12. Review of Enabling Technologies to Facilitate Secure Compute Customization

    SciTech Connect

    Aderholdt, Ferrol; Caldwell, Blake A; Hicks, Susan Elaine; Koch, Scott M; Naughton, III, Thomas J; Pelfrey, Daniel S; Pogge, James R; Scott, Stephen L; Shipman, Galen M; Sorrillo, Lawrence

    2014-12-01

    High performance computing environments are used for a wide variety of workloads, ranging from simulation and data transformation to analysis and complex workflows. These systems may process data for a variety of users, often requiring strong separation between job allocations. There are many challenges to establishing these secure enclaves within the shared infrastructure of high-performance computing (HPC) environments. The isolation mechanisms in the system software are the basic building blocks for enabling secure compute enclaves. There are a variety of approaches, and the focus of this report is to review the different virtualization technologies that facilitate the creation of secure compute enclaves. The report reviews current operating system (OS) protection mechanisms and modern virtualization technologies to better understand their performance and isolation properties. We also examine the feasibility of running "virtualized" computing resources as non-privileged users, and of providing controlled administrative permissions for standard users running within a virtualized context. Our examination includes technologies such as Linux containers (LXC [32], Docker [15]) and full virtualization (KVM [26], Xen [5]). We categorize these approaches into two broad groups: OS-level virtualization and system-level virtualization. OS-level virtualization uses containers to partition a single OS kernel into Virtual Environments (VE), e.g., LXC; the resources within the host's kernel are virtualized only in the sense of separate namespaces. In contrast, system-level virtualization uses hypervisors to manage multiple OS kernels and virtualize the physical resources (hardware) to create Virtual Machines (VM), e.g., Xen, KVM. This terminology of VE and VM, detailed in Section 2, is used throughout the report to distinguish between the two approaches to providing virtualized execution environments.

  13. The Diffusion of Computer-Based Technology in K-12 Schools: Teachers' Perspectives

    ERIC Educational Resources Information Center

    Colandrea, John Louis

    2012-01-01

    Because computer technology represents a major financial outlay for school districts and is an efficient method of preparing and delivering lessons, studying the process of teacher adoption of computer use is beneficial and adds to the current body of knowledge. Because the teacher is the ultimate user of computer technology for lesson preparation…

  14. A Detailed Analysis over Some Important Issues towards Using Computer Technology into the EFL Classrooms

    ERIC Educational Resources Information Center

    Gilakjani, Abbas Pourhosein

    2014-01-01

    Computer technology has changed the ways we work, learn, interact and spend our leisure time. Computer technology has changed every aspect of our daily life--how and where we get our news, how we order goods and services, and how we communicate. This study investigates some of the significant issues concerning the use of computer technology…

  15. Factors Contributing to Teachers' Use of Computer Technology in the Classroom

    ERIC Educational Resources Information Center

    Gilakjani, Abbas Pourhosein

    2013-01-01

    Many factors influence teachers' use of computer technology in their classrooms. The goal of this study is to identify some of the important factors contributing to teachers' use of computer technology. The first goal of this paper is to discuss computer self-efficacy. The second goal is to explain teaching experience. The third goal is to…

  16. Computer architectures for computational physics work done by Computational Research and Technology Branch and Advanced Computational Concepts Group

    NASA Technical Reports Server (NTRS)

    1985-01-01

    Slides are reproduced that describe the importance of having high performance number crunching and graphics capability. They also indicate the types of research and development underway at Ames Research Center to ensure that, in the near term, Ames is a smart buyer and user, and in the long-term that Ames knows the best possible solutions for number crunching and graphics needs. The drivers for this research are real computational physics applications of interest to Ames and NASA. They are concerned with how to map the applications, and how to maximize the physics learned from the results of the calculations. The computer graphics activities are aimed at getting maximum information from the three-dimensional calculations by using the real time manipulation of three-dimensional data on the Silicon Graphics workstation. Work is underway on new algorithms that will permit the display of experimental results that are sparse and random, the same way that the dense and regular computed results are displayed.

  17. Computer and mobile technology-based interventions for substance use disorders: an organizing framework.

    PubMed

    Litvin, Erika B; Abrantes, Ana M; Brown, Richard A

    2013-03-01

    Research devoted to the development of therapeutic, behavioral interventions for substance use disorders (SUDs) that can be accessed and delivered via computer and mobile technologies has increased rapidly during the past decade. Numerous recent reviews of this literature have supported the efficacy of technology-based interventions (TBIs), but have also revealed their great heterogeneity and a limited understanding of treatment mechanisms. We conducted a "review of reviews" focused on summarizing findings of previous reviews with respect to moderators of TBIs' efficacy, and present an organizing framework of considerations involved in designing and evaluating TBIs for SUDs. The four primary elements that comprise our framework are Accessibility, Usage, Human Contact, and Intervention Content, with several sub-elements within each category. We offer some suggested directions for future research grouped within these four primary considerations. We believe that technology affords unique opportunities to improve, support, and supplement therapeutic and peer relationships via dynamic applications that adapt to individuals' constantly changing motivation and treatment needs. We hope that our framework will aid in guiding programmatic progress in this exciting field. PMID:23254225

  18. Technological Metaphors and Moral Education: The Hacker Ethic and the Computational Experience

    ERIC Educational Resources Information Center

    Warnick, Bryan R.

    2004-01-01

    This essay is an attempt to understand how technological metaphors, particularly computer metaphors, are relevant to moral education. After discussing various types of technological metaphors, it is argued that technological metaphors enter moral thought through their "functional descriptions." The computer metaphor is then explored by turning to…

  19. Resource-constrained complexity-scalable video decoding via adaptive B-residual computation

    NASA Astrophysics Data System (ADS)

    Peng, Sharon S.; Zhong, Zhun

    2002-01-01

    As media processing gradually migrates from hardware to software-programmable platforms, the number of media processing functions added to the media processor grows even faster than the ever-increasing processor power can support. Computational-complexity-scalable algorithms therefore become powerful vehicles for implementing time-critical yet complexity-constrained applications such as MPEG-2 video decoding. In this paper, we present an adaptive, resource-constrained, complexity-scalable MPEG-2 video decoding scheme that makes a good trade-off between decoding complexity and output quality. Based on the available computational resources and the energy level of B-frame residuals, the scalable decoding algorithm selectively decodes B-residual blocks to significantly reduce system complexity. Furthermore, we describe an iterative procedure that dynamically adjusts the complexity level to achieve the best possible output quality under a given resource constraint. Experimental results show that up to 20% of total computational complexity can be saved with satisfactory output visual quality.
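
    The selection idea (spend the cycle budget on the highest-energy B-residual blocks and let the rest fall back to prediction only, with an iteratively adjusted threshold) can be sketched as follows. The function names, uniform per-block cost model, and adjustment step are illustrative assumptions, not the paper's actual algorithm.

```python
def select_b_residuals(energies, cost_per_block, budget):
    """Illustrative sketch: decode B-frame residual blocks in descending
    energy order until the cycle budget is spent; skipped blocks use
    motion-compensated prediction only."""
    order = sorted(range(len(energies)), key=lambda i: energies[i], reverse=True)
    decoded, spent = set(), 0
    for i in order:
        if spent + cost_per_block > budget:
            break
        decoded.add(i)
        spent += cost_per_block
    return decoded

def adjust_threshold(threshold, used, budget, step=0.1):
    """Illustrative per-frame feedback in the spirit of the iterative
    adjustment procedure: tighten when over budget, relax when under."""
    if used > budget:
        return threshold * (1 + step)       # decode fewer blocks next frame
    return max(threshold * (1 - step), 1e-6)  # decode more blocks next frame
```

    In a real decoder the cost of a block is not uniform (it depends on coded block pattern and IDCT work), so the selection would weigh energy against a per-block cost estimate rather than a flat count.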

  20. Prototype Space Technology Hall of Fame exhibit at Technology 2003: Analysis of data from computer-based questionnaire

    NASA Technical Reports Server (NTRS)

    Ewell, Robert N.

    1994-01-01

    The U.S. Space Foundation displayed its prototype Space Technology Hall of Fame exhibit design at the Technology 2003 conference in Anaheim, CA, December 7-9, 1993. In order to sample public opinion on space technology in general and the exhibit in particular, a computer-based survey was set up as a part of the display. The data collected was analyzed.