Science.gov

Sample records for adaptive computer technologies

  1. Adaptive Computing.

    ERIC Educational Resources Information Center

    Harrell, William

    1999-01-01

    Provides information on various adaptive technology resources available to people with disabilities. (Contains 19 references, an annotated list of 129 websites, and 12 additional print resources.) (JOW)

  2. Reconfigurable environmentally adaptive computing

    NASA Technical Reports Server (NTRS)

    Coxe, Robin L. (Inventor); Galica, Gary E. (Inventor)

    2008-01-01

    Described are methods and apparatus, including computer program products, for reconfigurable environmentally adaptive computing technology. An environmental signal representative of an external environmental condition is received. A processing configuration is automatically selected, based on the environmental signal, from a plurality of processing configurations. A reconfigurable processing element is reconfigured to operate according to the selected processing configuration. In some examples, the environmental condition is detected and the environmental signal is generated based on the detected condition.
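    As a rough illustration of the selection step described above, the sketch below maps a sensed temperature to the fastest processing configuration whose threshold it satisfies. The configuration table, thresholds, and names are hypothetical, not taken from the patent.

```python
from dataclasses import dataclass

@dataclass
class Config:
    name: str
    max_temp_c: float  # configuration is usable up to this ambient temperature
    clock_mhz: int

# Hypothetical configuration table: faster settings tolerate less heat.
CONFIGS = [
    Config("full-speed", 40.0, 800),
    Config("balanced", 70.0, 400),
    Config("low-power", 100.0, 100),
]

def select_configuration(temp_c: float) -> Config:
    """Pick the fastest configuration valid for the sensed temperature."""
    valid = [c for c in CONFIGS if temp_c <= c.max_temp_c]
    return max(valid, key=lambda c: c.clock_mhz)
```

    In a full system this selection would drive the reconfiguration of the processing element; here it only returns the chosen entry.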

  3. Using Neural Net Technology To Enhance the Efficiency of a Computer Adaptive Testing Application.

    ERIC Educational Resources Information Center

    Van Nelson, C.; Henriksen, Larry W.

    The potential for computer adaptive testing (CAT) has been well documented. In order to improve the efficiency of this process, it may be possible to utilize a neural network, or more specifically, a back propagation neural network. The paper asserts that in order to accomplish this end, it must be shown that grouping examinees by ability as…

  4. Adaptive Technology that Provides Access to Computers. DO-IT Program.

    ERIC Educational Resources Information Center

    Washington Univ., Seattle.

    This brochure describes the different types of barriers individuals with mobility impairments, blindness, low vision, hearing impairments, and specific learning disabilities face in providing computer input, interpreting output, and reading documentation. The adaptive hardware and software that has been developed to provide functional alternatives…

  5. Technology transfer for adaptation

    NASA Astrophysics Data System (ADS)

    Biagini, Bonizella; Kuhl, Laura; Gallagher, Kelly Sims; Ortiz, Claudia

    2014-09-01

    Technology alone will not be able to solve adaptation challenges, but it is likely to play an important role. As a result of the role of technology in adaptation and the importance of international collaboration for climate change, technology transfer for adaptation is a critical but understudied issue. Through an analysis of Global Environment Facility-managed adaptation projects, we find there is significantly more technology transfer occurring in adaptation projects than might be expected given the pessimistic rhetoric surrounding technology transfer for adaptation. Most projects focused on demonstration and early deployment/niche formation for existing technologies rather than earlier stages of innovation, which is understandable considering the pilot nature of the projects. Key challenges for the transfer process, including technology selection and appropriateness under climate change, markets and access to technology, and diffusion strategies, are discussed in more detail.

  6. Architecture Adaptive Computing Environment

    NASA Technical Reports Server (NTRS)

    Dorband, John E.

    2006-01-01

    Architecture Adaptive Computing Environment (aCe) is a software system that includes a language, compiler, and run-time library for parallel computing. aCe was developed to enable programmers to write programs, more easily than was previously possible, for a variety of parallel computing architectures. Heretofore, it has been perceived to be difficult to write parallel programs for parallel computers and more difficult to port the programs to different parallel computing architectures. In contrast, aCe is supportable on all high-performance computing architectures. Currently, it is supported on LINUX clusters. aCe uses parallel programming constructs that facilitate writing of parallel programs. Such constructs were used in single-instruction/multiple-data (SIMD) programming languages of the 1980s, including Parallel Pascal, Parallel Forth, C*, *LISP, and MasPar MPL. In aCe, these constructs are extended and implemented for both SIMD and multiple-instruction/multiple-data (MIMD) architectures. Two new constructs incorporated in aCe are those of (1) scalar and virtual variables and (2) pre-computed paths. The scalar-and-virtual-variables construct increases flexibility in optimizing memory utilization in various architectures. The pre-computed-paths construct enables the compiler to pre-compute part of a communication operation once, rather than computing it every time the communication operation is performed.
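    The pre-computed-paths idea can be illustrated outside aCe itself: the index pattern of a communication operation is computed once and then reused on every application. The NumPy sketch below is only an analogy to the construct, not aCe code.

```python
import numpy as np

def precompute_shift_path(n: int, offset: int) -> np.ndarray:
    """Compute, once, the index pattern realizing a cyclic shift by `offset`."""
    return (np.arange(n) + offset) % n

def apply_path(values: np.ndarray, path: np.ndarray) -> np.ndarray:
    """Reuse the precomputed pattern instead of rebuilding it each step."""
    return values[path]

path = precompute_shift_path(8, 1)   # the index computation, done once
v = np.arange(8.0)
for _ in range(3):                   # the "communication", done many times
    v = apply_path(v, path)
```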

  7. Computer Technology for Industry

    NASA Technical Reports Server (NTRS)

    1979-01-01

    In this age of the computer, more and more business firms are automating their operations for increased efficiency in a great variety of jobs, from simple accounting to managing inventories, from precise machining to analyzing complex structures. In the interest of national productivity, NASA is providing assistance both to longtime computer users and newcomers to automated operations. Through a special technology utilization service, NASA saves industry time and money by making available already developed computer programs which have secondary utility. A computer program is essentially a set of instructions which tells the computer how to produce desired information or effects by drawing upon its stored input. Developing a new program from scratch can be costly and time-consuming. Very often, however, a program developed for one purpose can readily be adapted to a totally different application. To help industry take advantage of existing computer technology, NASA operates the Computer Software Management and Information Center (COSMIC), a registered trademark, located at the University of Georgia. COSMIC maintains a large library of computer programs developed for NASA, the Department of Defense, the Department of Energy and other technology-generating agencies of the government. The Center gets a continual flow of software packages, screens them for adaptability to private sector usage, stores them and informs potential customers of their availability.

  8. Adaptive computing for people with disabilities.

    PubMed

    Merrow, S L; Corbett, C D

    1994-01-01

    Adaptive computing is a relatively new area, and little has been written in the nursing literature on the topic. "Adaptive computing" refers to the professional services and the technology (both hardware and software) that make computing technology accessible for persons with disabilities. Nurses in many settings such as schools, industry, rehabilitation facilities, and the community, can use knowledge of adaptive computing as they counsel, advise, and advocate for people with disabilities. Nurses with an awareness and knowledge of adaptive computing will be better able to promote high-level wellness for individuals with disabilities, thus maximizing their potential for an active, fulfilling life. People with different types of disabilities, including visual, mobility, hearing, and learning impairments, communication disorders, and acquired brain injuries, may benefit from computer adaptations. Disabled people encounter barriers to computing in six major areas: 1) the environment, 2) data entry, 3) information output, 4) technical documentation, 5) support, and 6) training. After a discussion of these barriers, the criteria for selecting appropriate adaptations and selected examples of adaptations are presented. Several case studies illustrate the evaluation process and the development of adaptive computer solutions. PMID:8082064

  9. Advanced Adaptive Optics Technology Development

    SciTech Connect

    Olivier, S

    2001-09-18

    The NSF Center for Adaptive Optics (CfAO) is supporting research on advanced adaptive optics technologies. CfAO research activities include development and characterization of micro-electro-mechanical systems (MEMS) deformable mirror (DM) technology, as well as development and characterization of high-resolution adaptive optics systems using liquid crystal (LC) spatial light modulator (SLM) technology. This paper presents an overview of the CfAO advanced adaptive optics technology development activities including current status and future plans.

  10. Ubiquitous Computing Technologies in Education

    ERIC Educational Resources Information Center

    Hwang, Gwo-Jen; Wu, Ting-Ting; Chen, Yen-Jung

    2007-01-01

    The prosperous development of wireless communication and sensor technologies has attracted the attention of researchers from both computer and education fields. Various investigations have been made for applying the new technologies to education purposes, such that more active and adaptive learning activities can be conducted in the real world.…

  11. Adaptive Technologies for Training and Education

    ERIC Educational Resources Information Center

    Durlach, Paula J., Ed.; Lesgold, Alan M., Ed.

    2012-01-01

    This edited volume provides an overview of the latest advancements in adaptive training technology. Intelligent tutoring has been deployed for well-defined and relatively static educational domains such as algebra and geometry. However, this adaptive approach to computer-based training has yet to come into wider usage for domains that are less…

  12. Computer Access. Tech Use Guide: Using Computer Technology.

    ERIC Educational Resources Information Center

    Council for Exceptional Children, Reston, VA. Center for Special Education Technology.

    One of nine brief guides for special educators on using computer technology, this guide focuses on access including adaptations in input devices, output devices, and computer interfaces. Low technology devices include "no-technology" devices (usually modifications to existing devices), simple switches, and multiple switches. High technology input…

  13. Computers: Educational Technology Paradox?

    ERIC Educational Resources Information Center

    Hashim, Hajah Rugayah Hj.; Mustapha, Wan Narita

    2005-01-01

    As we move further into the new millennium, the need to involve and adapt learners with new technology has been the main aim of many institutions of higher learning in Malaysia. The involvement of the government in huge technology-based projects like the Multimedia Super Corridor Highway (MSC) and one of its flagships, the Smart Schools have…

  14. Access to College for All: ITAC Project--Computer and Adaptive Computer Technologies in the Cegeps for Students with Disabilities = L'accessibilite au cegep pour tous: Projet ITAC--informatique et technologies adaptees dans les cegeps pour les etudiants handicapes.

    ERIC Educational Resources Information Center

    Fichten, Catherine S.; Barile, Maria

    This report discusses outcomes of three empirical studies which investigated the computer and adaptive computer technology needs and concerns of Quebec college students with various disabilities, professors, and individuals responsible for providing services to students with disabilities. Key findings are highlighted and recommendations are made…

  15. Computers boost structural technology

    NASA Technical Reports Server (NTRS)

    Noor, Ahmed K.; Venneri, Samuel L.

    1989-01-01

    Derived from matrix methods of structural analysis and finite element methods developed over the last three decades, computational structures technology (CST) blends computer science, numerical analysis, and approximation theory into structural analysis and synthesis. Recent significant advances in CST include stochastic-based modeling, strategies for performing large-scale structural calculations on new computing systems, and the integration of CST with other disciplinary modules for multidisciplinary analysis and design. New methodologies have been developed at NASA for integrated fluid-thermal structural analysis and integrated aerodynamic-structure-control design. The need for multiple views of data for different modules also led to the development of a number of sophisticated data-base management systems. For CST to play a role in the future development of structures technology and in the multidisciplinary design of future flight vehicles, major advances and computational tools are needed in a number of key areas.

  16. QPSO-based adaptive DNA computing algorithm.

    PubMed

    Karakose, Mehmet; Cigdem, Ugur

    2013-01-01

    DNA (deoxyribonucleic acid) computing, a computation model that uses DNA molecules for information storage, has been increasingly used for optimization and data analysis in recent years. However, the DNA computing algorithm has some limitations in terms of convergence speed, adaptability, and effectiveness. In this paper, a new approach for the improvement of DNA computing is proposed. This approach aims to perform the DNA computing algorithm with adaptive parameters tuned towards the desired goal using quantum-behaved particle swarm optimization (QPSO). The contributions of the proposed QPSO-based adaptive DNA computing algorithm are as follows: (1) the population size, crossover rate, maximum number of operations, enzyme and virus mutation rates, and fitness function of the DNA computing algorithm are tuned simultaneously in the adaptive process; (2) the adaptation is performed by the QPSO algorithm for goal-driven progress, faster operation, and flexibility in data; and (3) a numerical realization of the DNA computing algorithm with the proposed approach is implemented for system identification. Two experiments with different systems were carried out to evaluate the performance of the proposed approach, with comparative results. Experimental results obtained with Matlab and FPGA demonstrate effective optimization, considerable convergence speed, and high accuracy relative to the standard DNA computing algorithm.
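    A generic QPSO driver of the kind the abstract relies on can be sketched as follows. Here it tunes two stand-in parameters (a crossover rate and a mutation rate) against a toy quadratic error surface that substitutes for the paper's DNA-computing fitness; all names and constants are illustrative assumptions.

```python
import math, random

def qpso_minimize(fitness, dim, n_particles=20, iters=200, beta=0.75, seed=1):
    """Quantum-behaved PSO: each coordinate is sampled around an attractor
    between the personal best and global best, with a step scaled by the
    distance to the swarm's mean-best point."""
    rng = random.Random(seed)
    X = [[rng.random() for _ in range(dim)] for _ in range(n_particles)]
    pbest = [x[:] for x in X]
    pval = [fitness(x) for x in X]
    g = min(range(n_particles), key=pval.__getitem__)
    gbest, gval = pbest[g][:], pval[g]
    for _ in range(iters):
        mbest = [sum(p[d] for p in pbest) / n_particles for d in range(dim)]
        for i in range(n_particles):
            for d in range(dim):
                phi = rng.random()
                u = rng.uniform(1e-12, 1.0)
                attractor = phi * pbest[i][d] + (1 - phi) * gbest[d]
                step = beta * abs(mbest[d] - X[i][d]) * math.log(1.0 / u)
                X[i][d] = attractor + step if rng.random() < 0.5 else attractor - step
            f = fitness(X[i])
            if f < pval[i]:
                pbest[i], pval[i] = X[i][:], f
                if f < gval:
                    gbest, gval = X[i][:], f
    return gbest, gval

# Stand-in fitness: pretend the DNA algorithm performs best at
# crossover = 0.6 and mutation = 0.05 (made-up numbers).
def toy_fitness(params):
    crossover, mutation = params
    return (crossover - 0.6) ** 2 + (mutation - 0.05) ** 2

best, err = qpso_minimize(toy_fitness, dim=2)
```

    In the paper's setting, the fitness would instead measure the identification error produced by a full run of the DNA computing algorithm under the candidate parameters.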

  17. Computer Technology in Adult Education.

    ERIC Educational Resources Information Center

    Slider, Patty; Hodges, Kathy; Carter, Cea; White, Barbara

    This publication provides materials to help adult educators use computer technology in their teaching. Section 1, Computer Basics, contains activities and materials on these topics: increasing computer literacy, computer glossary, parts of a computer, keyboard, disk care, highlighting text, scrolling and wrap-around text, setting up text,…

  18. Computing technology in the 1980's. [computers]

    NASA Technical Reports Server (NTRS)

    Stone, H. S.

    1978-01-01

    Advances in computing technology have been led by consistently improving semiconductor technology. The semiconductor industry has turned out ever faster, smaller, and less expensive devices since transistorized computers were first introduced 20 years ago. For the next decade, there appear to be new advances possible, with the rate of introduction of improved devices at least equal to the historic trends. The implication of these projections is that computers will enter new markets and will truly be pervasive in business, home, and factory as their cost diminishes and their computational power expands to new levels. The computer industry as we know it today will be greatly altered in the next decade, primarily because the raw computer system will give way to computer-based turn-key information and control systems.

  19. Computer Viruses. Technology Update.

    ERIC Educational Resources Information Center

    Ponder, Tim, Comp.; Ropog, Marty, Comp.; Keating, Joseph, Comp.

    This document provides general information on computer viruses, how to help protect a computer network from them, measures to take if a computer becomes infected. Highlights include the origins of computer viruses; virus contraction; a description of some common virus types (File Virus, Boot Sector/Partition Table Viruses, Trojan Horses, and…

  20. Nursing Education Update: Computer Technology.

    ERIC Educational Resources Information Center

    Gothler, Ann M.

    1985-01-01

    A survey of nursing faculty showed that 91 percent of nursing education programs had faculty members who had attended or participated in a conference on computers during 1983 and 1984. Other survey responses concerned computer applications integrated into nursing courses, required courses in computer technology, and computer-assisted instruction.…

  1. Adaptable Deployable Entry and Placement Technology (ADEPT)

    NASA Video Gallery

    The Adaptable, Deployable Entry Placement Technology (ADEPT) Project will test and demonstrate a deployable aeroshell concept as a viable thermal protection system for entry, descent, and landing o...

  2. Earth Science Technology Office's Computational Technologies Project

    NASA Technical Reports Server (NTRS)

    Fischer, James (Technical Monitor); Merkey, Phillip

    2005-01-01

    This grant supported the effort to characterize the problem domain of the Earth Science Technology Office's Computational Technologies (CT) Project and to engage the Beowulf cluster computing community, as well as the high-performance computing research community, in order to predict the applicability of these technologies to the scientific community represented by the CT Project and to formulate long-term strategies for providing the computational resources necessary to attain its anticipated scientific objectives. Specifically, the goal of the evaluation effort is to use the information gathered over the course of the Round-3 investigations to quantify trends in scientific expectations, algorithmic requirements, and the capability of high-performance computers to satisfy the anticipated need.

  3. Adaptive multiclass classification for brain computer interfaces.

    PubMed

    Llera, A; Gómez, V; Kappen, H J

    2014-06-01

    We consider the problem of multiclass adaptive classification for brain-computer interfaces and propose the use of multiclass pooled mean linear discriminant analysis (MPMLDA), a multiclass generalization of the adaptation rule introduced by Vidaurre, Kawanabe, von Bünau, Blankertz, and Müller (2010) for the binary class setting. Using publicly available EEG data sets and tangent space mapping (Barachant, Bonnet, Congedo, & Jutten, 2012) as a feature extractor, we demonstrate that MPMLDA can significantly outperform state-of-the-art multiclass static and adaptive methods. Furthermore, efficient learning rates can be achieved using data from different subjects.
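    The underlying adaptation idea, re-centring linear discriminants on a pooled feature mean that is updated with every unlabeled trial, can be sketched as follows. The class structure and learning rate are illustrative simplifications, not the published MPMLDA.

```python
import numpy as np

class PooledMeanLDA:
    """Sketch of unsupervised bias adaptation via the pooled feature mean
    (after Vidaurre et al., 2010); the multiclass handling here is an
    illustrative simplification, not the published MPMLDA algorithm."""

    def __init__(self, weights, class_means, eta=0.05):
        self.W = np.asarray(weights, float)     # (n_classes, n_features)
        self.mu = np.mean(np.asarray(class_means, float), axis=0)
        self.eta = eta                          # adaptation rate (assumed)

    def update(self, x):
        # Running update of the pooled mean with each unlabeled trial.
        self.mu = (1 - self.eta) * self.mu + self.eta * np.asarray(x, float)

    def predict(self, x):
        # Score each class relative to the adapted pooled mean.
        scores = self.W @ (np.asarray(x, float) - self.mu)
        return int(np.argmax(scores))

clf = PooledMeanLDA([[1.0, 0.0], [-1.0, 0.0]], [[1.0, 0.0], [-1.0, 0.0]])
before = clf.predict([2.0, 0.0])   # class 0 before any drift
for _ in range(200):
    clf.update([4.0, 0.0])         # session features drift to the right
after = clf.predict([2.0, 0.0])    # same trial, re-centred decision
```

    The toy session shows the point of the rule: as the feature distribution drifts, the same trial can change class because the decision boundary follows the pooled mean.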

  4. Computer and information technology: hardware.

    PubMed

    O'Brien, D

    1998-02-01

    Computers open the door to an ever-expanding arena of knowledge and technology. Most nurses practicing in the perianesthesia setting were educated before the computer era, and many fear computers and the associated technology. Frequently, the greatest difficulty is finding the resources and knowing what questions to ask. The following is the first in a series of articles on computers and information technology. This article discusses computer hardware to get the novice started or the experienced user upgraded to access new technologies and the Internet. Future articles will discuss start-up and usual software applications, getting up to speed on the information superhighway, and other technologies that will broaden our knowledge and expand our personal and professional world. PMID:9543967

  5. Decoding Technology: Computer Shortcuts

    ERIC Educational Resources Information Center

    Walker, Tim; Donohue, Chip

    2008-01-01

    For the typical early childhood administrator, there will never be enough hours in a day to finish the work that needs to be done. This includes numerous hours spent on a computer tracking enrollment, managing the budget, researching curriculum ideas online, and many other administrative tasks. Improving an administrator's computer efficiency can…

  6. Computers, Technology, and Disability. [Update.

    ERIC Educational Resources Information Center

    American Council on Education, Washington, DC. HEATH Resource Center.

    This paper describes programs and resources that focus on access of postsecondary students with disabilities to computers and other forms of technology. Increased access to technological devices and services is provided to students with disabilities under the Technology-Related Assistance for Individuals with Disabilities Act (Tech Act). Section…

  7. Evaluation Parameters for Computer-Adaptive Testing

    ERIC Educational Resources Information Center

    Georgiadou, Elisabeth; Triantafillou, Evangelos; Economides, Anastasios A.

    2006-01-01

    With the proliferation of computers in test delivery today, adaptive testing has become quite popular, especially when examinees must be classified into two categories (pass/fail, master/nonmaster). Several well-established organisations have provided standards and guidelines for the design and evaluation of educational and psychological testing.…

  8. Parallel computations and control of adaptive structures

    NASA Technical Reports Server (NTRS)

    Park, K. C.; Alvin, Kenneth F.; Belvin, W. Keith; Chong, K. P. (Editor); Liu, S. C. (Editor); Li, J. C. (Editor)

    1991-01-01

    The equations of motion for structures with adaptive elements for vibration control are presented for parallel computations to be used as a software package for real-time control of flexible space structures. A brief introduction of the state-of-the-art parallel computational capability is also presented. Time marching strategies are developed for an effective use of massive parallel mapping, partitioning, and the necessary arithmetic operations. An example is offered for the simulation of control-structure interaction on a parallel computer, and the impact of the approach presented for applications in disciplines other than the aerospace industry is assessed.

  9. Computer Technology for Industry

    NASA Technical Reports Server (NTRS)

    1982-01-01

    Shell Oil Company used a COSMIC program, called VISCEL, to ensure the accuracy of the company's new computer code for analyzing polymers and chemical compounds. Shell reported that there were no other programs available that could provide the necessary calculations. Shell produces chemicals for plastic products used in the manufacture of automobiles, housewares, appliances, film, textiles, electronic equipment and furniture.

  10. Adaptive and perceptual learning technologies in medical education and training.

    PubMed

    Kellman, Philip J

    2013-10-01

    Recent advances in the learning sciences offer remarkable potential to improve medical education and maximize the benefits of emerging medical technologies. This article describes 2 major innovation areas in the learning sciences that apply to simulation and other aspects of medical learning: Perceptual learning (PL) and adaptive learning technologies. PL technology offers, for the first time, systematic, computer-based methods for teaching pattern recognition, structural intuition, transfer, and fluency. Synergistic with PL are new adaptive learning technologies that optimize learning for each individual, embed objective assessment, and implement mastery criteria. The author describes the Adaptive Response-Time-based Sequencing (ARTS) system, which uses each learner's accuracy and speed in interactive learning to guide spacing, sequencing, and mastery. In recent efforts, these new technologies have been applied in medical learning contexts, including adaptive learning modules for initial medical diagnosis and perceptual/adaptive learning modules (PALMs) in dermatology, histology, and radiology. Results of all these efforts indicate the remarkable potential of perceptual and adaptive learning technologies, individually and in combination, to improve learning in a variety of medical domains.
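    In the spirit of the response-time-based mastery criterion described above, a minimal sketch might retire an item once recent answers are both accurate and fast; the window size and speed threshold below are hypothetical, not the published ARTS parameters.

```python
from collections import deque

class MasteryTracker:
    """Toy mastery criterion: an item counts as mastered when the last
    `window` responses were all correct AND faster than `max_rt` seconds.
    Thresholds are illustrative assumptions, not ARTS's actual rules."""

    def __init__(self, window=3, max_rt=5.0):
        self.window, self.max_rt = window, max_rt
        self.history = deque(maxlen=window)  # keeps only the recent responses

    def record(self, correct: bool, rt_seconds: float) -> None:
        self.history.append(correct and rt_seconds <= self.max_rt)

    def mastered(self) -> bool:
        return len(self.history) == self.window and all(self.history)

t = MasteryTracker()
for rt in (2.0, 1.5, 3.0):          # three fast, correct answers
    t.record(True, rt)
mastered_fast = t.mastered()
t.record(True, 9.0)                 # correct but slow: not mastery evidence
mastered_slow = t.mastered()
```

    A real adaptive sequencer would also use these signals to schedule spacing between presentations, which this sketch omits.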

  11. Adaptive computation algorithm for RBF neural network.

    PubMed

    Han, Hong-Gui; Qiao, Jun-Fei

    2012-02-01

    A novel learning algorithm is proposed for nonlinear modelling and identification using radial basis function neural networks. The proposed method simplifies neural network training through the use of an adaptive computation algorithm (ACA). In addition, the convergence of the ACA is analyzed by the Lyapunov criterion. The proposed algorithm offers two important advantages. First, the model performance can be significantly improved through ACA, and the modelling error is uniformly ultimately bounded. Secondly, the proposed ACA can reduce computational cost and accelerate the training speed. The proposed method is then employed to model a classical nonlinear system with a limit cycle and to identify a nonlinear dynamic system, exhibiting the effectiveness of the proposed algorithm. Computational complexity analysis and simulation results demonstrate its effectiveness.
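    An RBF network model of the kind the paper builds on can be sketched with a Gaussian hidden layer and least-squares output weights; the paper's ACA itself (adaptive structure with Lyapunov-analyzed convergence) is not reproduced here.

```python
import numpy as np

def rbf_design(X, centers, width):
    """Gaussian hidden-layer activations for 1-D inputs."""
    return np.exp(-((X[:, None] - centers[None, :]) ** 2) / (2 * width ** 2))

def rbf_fit(X, y, centers, width):
    """Least-squares fit of the linear output weights."""
    w, *_ = np.linalg.lstsq(rbf_design(X, centers, width), y, rcond=None)
    return w

# Toy nonlinear modelling task: approximate sin(x) on [0, 2*pi].
X = np.linspace(0.0, 2 * np.pi, 50)
centers = np.linspace(0.0, 2 * np.pi, 10)   # fixed centers (assumed layout)
w = rbf_fit(X, np.sin(X), centers, width=0.7)
err = np.max(np.abs(rbf_design(X, centers, 0.7) @ w - np.sin(X)))
```

    The ACA's contribution in the paper is precisely what this sketch lacks: adapting the network structure and parameters online with a convergence guarantee rather than fitting fixed centers in batch.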

  12. Trusted Computing Technologies, Intel Trusted Execution Technology.

    SciTech Connect

    Guise, Max Joseph; Wendt, Jeremy Daniel

    2011-01-01

    We describe the current state-of-the-art in Trusted Computing Technologies - focusing mainly on Intel's Trusted Execution Technology (TXT). This document is based on existing documentation and tests of two existing TXT-based systems: Intel's Trusted Boot and Invisible Things Lab's Qubes OS. We describe what features are lacking in current implementations, describe what a mature system could provide, and present a list of developments to watch. Critical systems perform operation-critical computations on high importance data. In such systems, the inputs, computation steps, and outputs may be highly sensitive. Sensitive components must be protected from both unauthorized release, and unauthorized alteration: Unauthorized users should not access the sensitive input and sensitive output data, nor be able to alter them; the computation contains intermediate data with the same requirements, and executes algorithms that the unauthorized should not be able to know or alter. Due to various system requirements, such critical systems are frequently built from commercial hardware, employ commercial software, and require network access. These hardware, software, and network system components increase the risk that sensitive input data, computation, and output data may be compromised.

  13. Computational technology for high-temperature aerospace structures

    NASA Technical Reports Server (NTRS)

    Noor, A. K.; Card, M. F.

    1992-01-01

    The status and some recent developments of computational technology for high-temperature aerospace structures are summarized. Discussion focuses on a number of aspects including: goals of computational technology for high-temperature structures; computational material modeling; life prediction methodology; computational modeling of high-temperature composites; error estimation and adaptive improvement strategies; strategies for solution of fluid flow/thermal/structural problems; and probabilistic methods and stochastic modeling approaches, integrated analysis and design. Recent trends in high-performance computing environment are described and the research areas which have high potential for meeting future technological needs are identified.

  14. Teacher Teams and Computer Technology.

    ERIC Educational Resources Information Center

    Hecht, Jeffrey B.; Roberts, Nicole K.; Schoon, Perry L.; Fansler, Gigi

    This research used three groups in a quasi-experimental approach to assess the combined impact of teacher teaming and computer technology on student grade point averages (GPAs). Ninth-grade students' academic achievement in each of four different subject areas (algebra, biology, world cultures, and English) was studied. Two separate treatments…

  15. Computer Technology. Career Education Guide.

    ERIC Educational Resources Information Center

    Dependents Schools (DOD), Washington, DC. European Area.

    The curriculum guide is designed to provide students with realistic training in computer technology theory and practice within the secondary educational framework and to prepare them for entry into an occupation or continuing postsecondary education. Each unit plan consists of a description of the area under consideration, estimated hours of…

  16. Techniques for grid manipulation and adaptation. [computational fluid dynamics]

    NASA Technical Reports Server (NTRS)

    Choo, Yung K.; Eisemann, Peter R.; Lee, Ki D.

    1992-01-01

    Two approaches have been taken to provide systematic grid manipulation for improved grid quality. One is the control point form (CPF) of algebraic grid generation. It provides explicit control of the physical grid shape and grid spacing through the movement of the control points. It works well in the interactive computer graphics environment and hence can be a good candidate for integration with other emerging technologies. The other approach is grid adaptation using a numerical mapping between the physical space and a parametric space. Grid adaptation is achieved by modifying the mapping functions through the effects of grid control sources. The adaptation process can be repeated in a cyclic manner if satisfactory results are not achieved after a single application.
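    The mapping-based adaptation can be illustrated in one dimension by equidistribution: nodes are moved so that each cell carries an equal share of a weight function playing the role of the grid control sources. This is a stand-in sketch, not the method of the paper.

```python
import numpy as np

def adapt_grid(x, weight, n=None):
    """Redistribute 1-D grid nodes so each cell holds an equal share of the
    integrated weight (equidistribution). `weight` is a positive function
    standing in for the grid control sources."""
    n = len(x) if n is None else n
    w = weight(x)
    # Trapezoidal cumulative integral of the weight, normalized to [0, 1].
    cdf = np.concatenate([[0.0], np.cumsum(0.5 * (w[1:] + w[:-1]) * np.diff(x))])
    cdf /= cdf[-1]
    # Invert the mapping: equal increments in cdf give the new node locations.
    return np.interp(np.linspace(0.0, 1.0, n), cdf, x)

x = np.linspace(0.0, 1.0, 101)
adapted = adapt_grid(x, lambda s: np.exp(-5.0 * s))  # cluster nodes near s = 0
```

    Iterating this step with an updated weight mirrors the cyclic re-adaptation the abstract describes.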

  17. Optical Computers and Space Technology

    NASA Technical Reports Server (NTRS)

    Abdeldayem, Hossin A.; Frazier, Donald O.; Penn, Benjamin; Paley, Mark S.; Witherow, William K.; Banks, Curtis; Hicks, Rosilen; Shields, Angela

    1995-01-01

    The rapidly increasing demand for greater speed and efficiency on the information superhighway requires significant improvements over conventional electronic logic circuits. Optical interconnections and optical integrated circuits are strong candidates to overcome the severe limitations that conventional electronic logic circuits impose on the growth in speed and complexity of computation. The new optical technology has increased the demand for high quality optical materials. NASA's recent involvement in processing optical materials in space has demonstrated that a new and unique class of high quality optical materials is processible in a microgravity environment. Microgravity processing can induce improved order in these materials and could have a significant impact on the development of optical computers. We will discuss NASA's role in processing these materials and report on some of the associated nonlinear optical properties which are quite useful for optical computer technology.

  18. Adaptable Computing Environment/Self-Assembling Software

    SciTech Connect

    Osbourn, Gordon C.; Bouchard, Ann M.; Bartholomew, John W.

    2007-09-25

    Complex software applications are difficult to learn to use and to remember how to use. Further, the user has no control over the functionality available in a given application. The software we use can be created and modified only by a relatively small group of elite, highly skilled artisans known as programmers. "Normal users" are powerless to create and modify software themselves, because the tools for software development, designed by and for programmers, are a barrier to entry. This software, when completed, will be a user-adaptable computing environment in which the user is really in control of his/her own software, able to adapt the system, make new parts of the system interactive, and even modify the behavior of the system itself. Some key features of the basic environment that have been implemented are (a) books in bookcases, where all data is stored, (b) context-sensitive compass menus (compass, because the buttons are located in compass directions relative to the mouse cursor position), (c) importing tabular data and displaying it in a book, (d) light-weight table querying/sorting, (e) a Reach&Get capability (sort of a "smart" copy/paste that prevents the user from copying invalid data), and (f) a LogBook that automatically logs all user actions that change data or the system itself. To bootstrap toward full end-user adaptability, we implemented a set of development tools. With the development tools, compass menus can be made and customized.

  19. Center for Advanced Computational Technology

    NASA Technical Reports Server (NTRS)

    Noor, Ahmed K.

    2000-01-01

    The Center for Advanced Computational Technology (ACT) was established to serve as a focal point for diverse research activities pertaining to application of advanced computational technology to future aerospace systems. These activities include the use of numerical simulations, artificial intelligence methods, multimedia and synthetic environments, and computational intelligence, in the modeling, analysis, sensitivity studies, optimization, design and operation of future aerospace systems. The Center is located at NASA Langley and is an integral part of the School of Engineering and Applied Science of the University of Virginia. The Center has four specific objectives: 1) conduct innovative research on applications of advanced computational technology to aerospace systems; 2) act as pathfinder by demonstrating to the research community what can be done (high-potential, high-risk research); 3) help in identifying future directions of research in support of the aeronautical and space missions of the twenty-first century; and 4) help in the rapid transfer of research results to industry and in broadening awareness among researchers and engineers of the state-of-the-art in applications of advanced computational technology to the analysis, design prototyping and operations of aerospace and other high-performance engineering systems. In addition to research, Center activities include helping in the planning and coordination of the activities of a multi-center team of NASA and JPL researchers who are developing an intelligent synthesis environment for future aerospace systems; organizing workshops and national symposia; as well as writing state-of-the-art monographs and NASA special publications on timely topics.

  1. Shape threat detection via adaptive computed tomography

    NASA Astrophysics Data System (ADS)

    Masoudi, Ahmad; Thamvichai, Ratchaneekorn; Neifeld, Mark A.

    2016-05-01

    X-ray Computed Tomography (CT) is used widely for screening purposes. Conventional x-ray threat detection systems employ image reconstruction and segmentation algorithms prior to making threat/no-threat decisions. We find that in many cases these pre-processing steps can degrade detection performance. Therefore, in this work we investigate methods that operate directly on the CT measurements. We analyze a fixed-gantry system containing 25 x-ray sources and 2200 photon counting detectors. We present a new method for improving threat detection performance: a greedy adaptive algorithm which at each time step uses information from previous measurements to design the next measurement. We utilize sequential hypothesis testing (SHT) in order to derive both the optimal "next measurement" and the stopping criterion that ensures a target probability of error Pe. We find that by selecting the next x-ray source according to such a greedy adaptive algorithm, we can reduce Pe by a factor of 42.4 relative to the conventional measurement sequence employing all 25 sources in sequence.
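    The greedy SHT loop described above can be sketched as follows. This is a minimal illustration assuming a simple Bernoulli detector model with hypothetical per-source statistics; it is not the paper's actual measurement model or source-selection rule.

```python
import math
import random

def kl_bernoulli(p, q):
    """KL divergence between Bernoulli(p) and Bernoulli(q) measurements."""
    return p * math.log(p / q) + (1 - p) * math.log((1 - p) / (1 - q))

def adaptive_sprt(p_threat, p_clear, target_pe, truth, rng):
    """Greedy sequential hypothesis test: at each step fire the remaining source
    whose two hypothesis distributions differ most (largest KL divergence),
    update the log-likelihood ratio, and stop once Wald's threshold for the
    target error probability Pe is crossed."""
    thresh = math.log((1 - target_pe) / target_pe)  # symmetric Wald threshold
    remaining = list(range(len(p_threat)))
    llr = 0.0
    while remaining:
        s = max(remaining, key=lambda i: kl_bernoulli(p_threat[i], p_clear[i]))
        remaining.remove(s)
        # Simulate one detector reading under the true state of the object.
        hit = rng.random() < (p_threat[s] if truth else p_clear[s])
        if hit:
            llr += math.log(p_threat[s] / p_clear[s])
        else:
            llr += math.log((1 - p_threat[s]) / (1 - p_clear[s]))
        if abs(llr) >= thresh:
            break  # stopping criterion met before exhausting all sources
    return llr > 0  # True -> declare threat
```

    The greedy choice and the Wald stopping rule together let the test terminate after only a few informative measurements instead of always cycling through every source.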

  2. Infinite possibilities: Computational structures technology

    NASA Astrophysics Data System (ADS)

    Beam, Sherilee F.

    1994-12-01

    Computational Fluid Dynamics (or CFD) methods are very familiar to the research community. Even the general public has had some exposure to CFD images, primarily through the news media. However, very little attention has been paid to CST--Computational Structures Technology. Yet, no important design can be completed without it. During the first half of this century, researchers only dreamed of designing and building structures on a computer. Today their dreams have become practical realities as computational methods are used in all phases of design, fabrication and testing of engineering systems. Increasingly complex structures can now be built in even shorter periods of time. Over the past four decades, computer technology has been developing, and early finite element methods have grown from small in-house programs to numerous commercial software programs. When coupled with advanced computing systems, they help engineers make dramatic leaps in designing and testing concepts. The goals of CST include: predicting how a structure will behave under actual operating conditions; designing and complementing other experiments conducted on a structure; investigating microstructural damage or chaotic, unpredictable behavior; helping material developers in improving material systems; and being a useful tool in design systems optimization and sensitivity techniques. Applying CST to a structure problem requires five steps: (1) observe the specific problem; (2) develop a computational model for numerical simulation; (3) develop and assemble software and hardware for running the codes; (4) post-process and interpret the results; and (5) use the model to analyze and design the actual structure. Researchers in both industry and academia continue to make significant contributions to advance this technology with improvements in software, collaborative computing environments and supercomputing systems. 
As these environments and systems evolve, computational structures technology will

  4. Case Studies in Computer Adaptive Test Design through Simulation.

    ERIC Educational Resources Information Center

    Eignor, Daniel R.; And Others

    The extensive computer simulation work done in developing the computer adaptive versions of the Graduate Record Examinations (GRE) Board General Test and the College Board Admissions Testing Program (ATP) Scholastic Aptitude Test (SAT) is described in this report. Both the GRE General and SAT computer adaptive tests (CATs), which are fixed length…

  5. ICAN Computer Code Adapted for Building Materials

    NASA Technical Reports Server (NTRS)

    Murthy, Pappu L. N.

    1997-01-01

    The NASA Lewis Research Center has been involved in developing composite micromechanics and macromechanics theories over the last three decades. These activities have resulted in several composite mechanics theories and structural analysis codes whose applications range from material behavior design and analysis to structural component response. One of these computer codes, the Integrated Composite Analyzer (ICAN), is designed primarily to address issues related to designing polymer matrix composites and predicting their properties - including hygral, thermal, and mechanical load effects. Recently, under a cost-sharing cooperative agreement with a Fortune 500 corporation, Master Builders Inc., ICAN was adapted to analyze building materials. The high costs and technical difficulties involved with the fabrication of continuous-fiber-reinforced composites sometimes limit their use. Particulate-reinforced composites can be thought of as a viable alternative. They are as easily processed to near-net shape as monolithic materials, yet have the improved stiffness, strength, and fracture toughness that is characteristic of continuous-fiber-reinforced composites. For example, particle-reinforced metal-matrix composites show great potential for a variety of automotive applications, such as disk brake rotors, connecting rods, cylinder liners, and other high-temperature applications. Building materials, such as concrete, can be thought of as one of the oldest materials in this category of multiphase, particle-reinforced materials. The adaptation of ICAN to analyze particle-reinforced composite materials involved the development of new micromechanics-based theories. A derivative of the ICAN code, ICAN/PART, was developed and delivered to Master Builders Inc. as a part of the cooperative activity.

  6. Military engine computational structures technology

    NASA Technical Reports Server (NTRS)

    Thomson, Daniel E.

    1992-01-01

    Integrated High Performance Turbine Engine Technology Initiative (IHPTET) goals require a strong analytical base. Effective analysis of composite materials is critical to life analysis and structural optimization. Accurate life prediction for all material systems is critical. User friendly systems are also desirable. Post processing of results is very important. The IHPTET goal is to double turbine engine propulsion capability by the year 2003. Fifty percent of the goal will come from advanced materials and structures, the other 50 percent will come from increasing performance. Computer programs are listed.

  7. Computer technologies and institutional memory

    NASA Technical Reports Server (NTRS)

    Bell, Christopher; Lachman, Roy

    1989-01-01

    NASA programs for manned space flight are in their 27th year. Scientists and engineers who worked continuously on the development of aerospace technology during that period are approaching retirement. The resulting loss to the organization will be considerable. Although this problem is general to the NASA community, the problem was explored in terms of the institutional memory and technical expertise of a single individual in the Man-Systems division. The main domain of the expert was spacecraft lighting, which became the subject area for analysis in these studies. The report starts with an analysis of the cumulative expertise and institutional memory of technical employees of organizations such as NASA. A set of solutions to this problem is examined and found inadequate. Two solutions were investigated at length: hypertext and expert systems. Illustrative examples were provided of hypertext and expert system representation of spacecraft lighting. These computer technologies can be used to ameliorate the problem of the loss of invaluable personnel.

  8. A Very High Order, Adaptable MESA Implementation for Aeroacoustic Computations

    NASA Technical Reports Server (NTRS)

    Dydson, Roger W.; Goodrich, John W.

    2000-01-01

    Since computational efficiency and wave resolution scale with accuracy, the ideal would be infinitely high accuracy for problems with widely varying wavelength scales. Currently, many computational aeroacoustics methods are limited to 4th-order-accurate Runge-Kutta methods in time, which limits their resolution and efficiency. However, a new procedure for implementing the Modified Expansion Solution Approximation (MESA) schemes, based upon Hermitian divided differences, is presented which extends the effective accuracy of the MESA schemes to 57th order in space and time when using 128-bit floating point precision. This new approach has the advantages of reducing round-off error, being easy to program, and being more computationally efficient than previous approaches. Its accuracy is limited only by the floating point hardware. The advantages of this new approach are demonstrated by solving the linearized Euler equations in an open bi-periodic domain. A 500th-order MESA scheme can now be created in seconds, making these schemes ideally suited for the next generation of high performance 256-bit (double quadruple) or higher precision computers. This ease of creation makes it possible to adapt the algorithm to the mesh in time instead of its converse: this is ideal for resolving varying wavelength scales which occur in noise generation simulations. Finally, the sources of round-off error which affect the very high order methods are examined, and remedies are provided that effectively increase the accuracy of the MESA schemes while using current computer technology.
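    The Hermitian divided differences underlying the MESA implementation can be illustrated with a standard Newton-form Hermite interpolation table. This is a textbook sketch in exact rational arithmetic, not the authors' code; the node values in the usage example are illustrative.

```python
from fractions import Fraction

def hermite_coeffs(xs, fs, dfs):
    """Build the Newton divided-difference table for Hermite interpolation:
    each node is doubled, and a repeated node's first divided difference is
    replaced by the derivative value f'(x). Returns the doubled node list and
    the Newton-form coefficients (the table's diagonal)."""
    z = [x for x in xs for _ in (0, 1)]  # each node appears twice
    n = len(z)
    q = [[Fraction(0)] * n for _ in range(n)]
    for i in range(n):
        q[i][0] = Fraction(fs[i // 2])
    for i in range(1, n):
        if z[i] == z[i - 1]:
            q[i][1] = Fraction(dfs[i // 2])  # f[z_i, z_i] = f'(z_i)
        else:
            q[i][1] = (q[i][0] - q[i - 1][0]) / (z[i] - z[i - 1])
    for j in range(2, n):
        for i in range(j, n):
            q[i][j] = (q[i][j - 1] - q[i - 1][j - 1]) / (z[i] - z[i - j])
    return z, [q[i][i] for i in range(n)]

def hermite_eval(z, coeffs, x):
    """Evaluate the Newton-form polynomial at x by Horner-like accumulation."""
    result = coeffs[-1]
    for k in range(len(coeffs) - 2, -1, -1):
        result = result * (x - z[k]) + coeffs[k]
    return result
```

    Matching both values and derivatives at each node is what lets Hermitian constructions like this double the order of accuracy obtainable from a given stencil.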

  9. Retinal imaging using adaptive optics technology.

    PubMed

    Kozak, Igor

    2014-04-01

    Adaptive optics (AO) is a technology used to improve the performance of optical systems by reducing the effect of wavefront distortions. Retinal imaging using AO aims to compensate for higher order aberrations originating from the cornea and the lens by using a deformable mirror. The main application of AO retinal imaging has been to assess photoreceptor cell density, spacing, and mosaic regularity in normal and diseased eyes. Apart from photoreceptors, the retinal pigment epithelium, retinal nerve fiber layer, retinal vessel wall, and lamina cribrosa can also be visualized with AO technology. Recent interest in AO technology in eye research has resulted in a growing number of reports and publications utilizing this technology in both animals and humans. With the availability of the first commercial instruments, AO technology is being transformed from a research tool into a diagnostic instrument. The current challenges include imaging eyes with less than perfect optical media, the formation of normative databases for acquired images such as cone mosaics, and the cost of the technology. The opportunities for AO will include more detailed diagnosis, with descriptions of new findings in retinal diseases and glaucoma, as well as the expansion of AO into clinical trials, which has already started. PMID:24843304

  10. Test Anxiety, Computer-Adaptive Testing and the Common Core

    ERIC Educational Resources Information Center

    Colwell, Nicole Makas

    2013-01-01

    This paper highlights the current findings and issues regarding the role of computer-adaptive testing in test anxiety. The computer-adaptive test (CAT) proposed by one of the Common Core consortia brings these issues to the forefront. Research has long indicated that test anxiety impairs student performance. More recent research indicates that…

  11. Adaptive Resource Management Technology for Satellite Constellations

    NASA Technical Reports Server (NTRS)

    Welch, Lonnie; Tjaden, Brett; Pfarr, Barbara B.; Hennessy, Joseph F. (Technical Monitor)

    2002-01-01

    This manuscript describes the Sensor Web Adaptive Resource Manager (SWARM) project. The primary focus of the project is on the design and prototyping of middleware for managing computing and network resources in a way that enables the information systems of satellite constellations to provide realtime performance within dynamic environments. The middleware has been prototyped, and it has been evaluated by employing it to manage a pool of distributed resources for the ITOS (Integrated Test and Operations System) satellite command and control software system. The design of the middleware is discussed and a summary of the evaluation effort is provided.

  12. Center for Computational Structures Technology

    NASA Technical Reports Server (NTRS)

    Noor, Ahmed K.; Perry, Ferman W.

    1995-01-01

    The Center for Computational Structures Technology (CST) is intended to serve as a focal point for the diverse CST research activities. The CST activities include the use of numerical simulation and artificial intelligence methods in modeling, analysis, sensitivity studies, and optimization of flight-vehicle structures. The Center is located at NASA Langley and is an integral part of the School of Engineering and Applied Science of the University of Virginia. The key elements of the Center are: (1) conducting innovative research on advanced topics of CST; (2) acting as pathfinder by demonstrating to the research community what can be done (high-potential, high-risk research); (3) strong collaboration with NASA scientists and researchers from universities and other government laboratories; and (4) rapid dissemination of CST to industry, through integration of industrial personnel into the ongoing research efforts.

  13. Implementation of Multispectral Image Classification on a Remote Adaptive Computer

    NASA Technical Reports Server (NTRS)

    Figueiredo, Marco A.; Gloster, Clay S.; Stephens, Mark; Graves, Corey A.; Nakkar, Mouna

    1999-01-01

    As the demand for higher performance computers for the processing of remote sensing science algorithms increases, the need to investigate new computing paradigms is justified. Field Programmable Gate Arrays enable the implementation of algorithms at the hardware gate level, leading to orders of magnitude performance increases over microprocessor-based systems. The automatic classification of spaceborne multispectral images is an example of a computation-intensive application that can benefit from implementation on an FPGA-based custom computing machine (adaptive or reconfigurable computer). A probabilistic neural network is used here to classify pixels of a multispectral LANDSAT-2 image. The implementation described utilizes Java client/server application programs to access the adaptive computer from a remote site. Results verify that a remote hardware version of the algorithm (implemented on an adaptive computer) is significantly faster than a local software version of the same algorithm implemented on a typical general-purpose computer.
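    The probabilistic neural network used for pixel classification can be sketched as a Parzen-window classifier. This is a minimal NumPy illustration; the smoothing parameter, band count, and training data below are assumptions for demonstration, not the configuration used in the paper.

```python
import numpy as np

def pnn_classify(train_x, train_y, test_x, sigma=1.0):
    """Parzen-window probabilistic neural network: assign each test pixel to
    the class whose training samples produce the largest mean Gaussian kernel
    activation (pattern layer -> summation layer -> decision layer)."""
    classes = np.unique(train_y)
    scores = np.empty((len(test_x), len(classes)))
    for j, c in enumerate(classes):
        centers = train_x[train_y == c]  # pattern-layer units for class c
        # Squared Euclidean distance from every test pixel to every center.
        d2 = ((test_x[:, None, :] - centers[None, :, :]) ** 2).sum(axis=-1)
        scores[:, j] = np.exp(-d2 / (2.0 * sigma ** 2)).mean(axis=1)
    return classes[np.argmax(scores, axis=1)]
```

    Because each test pixel's score is independent of the others, this structure maps naturally onto parallel hardware such as an FPGA, which is what makes PNNs attractive for adaptive-computer implementations.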

  14. Computer and Information Technologies: Student and Service Provider Perspectives.

    ERIC Educational Resources Information Center

    Fichten, Catherine S.; Barile, Maria; Asuncion, Jennison; Judd, Darlene; Alapin, Iris; Lavers, Jason; Havel, Alice; Wolforth, Joan

    This presentation discusses the outcomes of a project that investigated the computer, information, and learning and adaptive technology needs and concerns of Canadian postsecondary students with disabilities. A series of four focus groups involving 33 students with disabilities and 25 service providers was held. The study found: (1) colleges have…

  15. Computer Software for Forestry Technology Curricula. Final Report.

    ERIC Educational Resources Information Center

    Watson, Roy C.; Scobie, Walter R.

    Since microcomputers are being used more and more frequently in the forest products industry in the Pacific Northwest, Green River Community College conducted a project to search for BASIC language computer programs pertaining to forestry, and when possible, to adapt such software for use in teaching forestry technology. The search for applicable…

  16. Adaptive sports technology and biomechanics: wheelchairs.

    PubMed

    Cooper, Rory A; De Luigi, Arthur Jason

    2014-08-01

    Wheelchair sports are an important tool in the rehabilitation of people with severe chronic disabilities and have been a driving force for innovation in technology and practice. In this paper, we will present an overview of the adaptive technology used in Paralympic sports with a special focus on wheeled technology and the impact of design on performance (defined as achieving the greatest level of athletic ability and minimizing the risk of injury). Many advances in manual wheelchairs trace their origins to wheelchair sports. Features of wheelchairs that were used for racing and basketball 25 or more years ago have become integral to the manual wheelchairs that people now use every day; moreover, the current components used on ultralight wheelchairs also have benefitted from technological advances developed for sports wheelchairs. For example, the wheels now used on chairs for daily mobility incorporate many of the components first developed for sports chairs. Also, advances in manufacturing and the availability of aerospace materials have driven current wheelchair design and manufacture. Basic principles of sports wheelchair design are universal across sports and include fit; minimizing weight while maintaining high stiffness; minimizing rolling resistance; and optimizing the sports-specific design of the chair. However, a well-designed and fitted wheelchair is not sufficient for optimal sports performance: the athlete must be well trained, skilled, and use effective biomechanics because wheelchair athletes face some unique biomechanical challenges.

  17. Adaptive Decision Aiding in Computer-Assisted Instruction: Adaptive Computerized Training System (ACTS).

    ERIC Educational Resources Information Center

    Hopf-Weichel, Rosemarie; And Others

    This report describes results of the first year of a three-year program to develop and evaluate a new Adaptive Computerized Training System (ACTS) for electronics maintenance training. (ACTS incorporates an adaptive computer program that learns the student's diagnostic and decision value structure, compares it to that of an expert, and adapts the…

  18. Faculty Computer Expertise and Use of Instructional Technology. Technology Survey.

    ERIC Educational Resources Information Center

    Gabriner, Robert; Mery, Pamela

    This report shows the findings of a 1997 technology survey used to assess degrees of faculty computer expertise and the use of instructional technology. Part 1 reviews general findings of the fall 1997 technology survey: (1) the level of computer expertise among faculty, staff and administrators appears to be increasing; (2) in comparison with the…

  19. RASCAL: A Rudimentary Adaptive System for Computer-Aided Learning.

    ERIC Educational Resources Information Center

    Stewart, John Christopher

    Both the background of computer-assisted instruction (CAI) systems in general and the requirements of a computer-aided learning system which would be a reasonable assistant to a teacher are discussed. RASCAL (Rudimentary Adaptive System for Computer-Aided Learning) is a first attempt at defining a CAI system which would individualize the learning…

  20. Computer Technology: State of the Art.

    ERIC Educational Resources Information Center

    Withington, Frederic G.

    1981-01-01

    Describes the nature of modern general-purpose computer systems, including hardware, semiconductor electronics, microprocessors, computer architecture, input output technology, and system control programs. Seven suggested readings are cited. (FM)

  1. The research on thermal adaptability reinforcement technology for photovoltaic modules

    NASA Astrophysics Data System (ADS)

    Su, Nana; Zhou, Guozhong

    2015-10-01

    Nowadays, photovoltaic modules contain more high-performance components in smaller spaces, and for special uses such as aerospace they are also required to work in severe temperature conditions. As temperature rises, the failure rate increases exponentially, significantly reducing reliability. In order to improve the thermal adaptability of photovoltaic modules, this paper presents research on reinforcement technologies. The thermoelectric cooler is widely used in aerospace, where the working environment is harsh. Theoretical formulas for computing refrigerating efficiency, refrigerating capacity, and temperature difference are described in detail. The optimum operating current for three classical working conditions is obtained, which can be used to guide the design of the driver circuit. Taking an equipment enclosure as an example, we use a thermoelectric cooler to reinforce its thermal adaptability. After building physical and thermal models from the physical dimensions and constraint conditions, the model is simulated in Flotherm. The temperature field contours are shown to verify the effectiveness of the reinforcement.
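    The thermoelectric-cooler quantities mentioned above follow standard single-stage relations. A minimal sketch, assuming the usual textbook model with Seebeck coefficient alpha, electrical resistance R, and thermal conductance K; these are not necessarily the exact formulas derived in the paper.

```python
def tec_performance(alpha, R, K, I, t_cold, t_hot):
    """Single-stage thermoelectric cooler (standard model): cooling power at
    the cold face, electrical input power, and coefficient of performance."""
    dT = t_hot - t_cold
    # Peltier cooling minus half the Joule heating minus back-conduction.
    q_cold = alpha * I * t_cold - 0.5 * I ** 2 * R - K * dT
    p_in = alpha * I * dT + I ** 2 * R  # electrical power drawn by the module
    cop = q_cold / p_in if p_in > 0 else float("inf")
    return q_cold, p_in, cop

def current_for_max_cooling(alpha, R, t_cold):
    """Optimum current for maximum cooling capacity: dQc/dI = alpha*Tc - I*R = 0."""
    return alpha * t_cold / R
```

    Solving dQc/dI = 0 yields the optimum drive current alpha*Tc/R, which is the kind of operating-point result used to guide the driver-circuit design described in the abstract.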

  2. A Guide to Computer Adaptive Testing Systems

    ERIC Educational Resources Information Center

    Davey, Tim

    2011-01-01

    Some brand names are used generically to describe an entire class of products that perform the same function. "Kleenex," "Xerox," "Thermos," and "Band-Aid" are good examples. The term "computerized adaptive testing" (CAT) is similar in that it is often applied uniformly across a diverse family of testing methods. Although the various members of…

  3. Solution-adaptive finite element method in computational fracture mechanics

    NASA Technical Reports Server (NTRS)

    Min, J. B.; Bass, J. M.; Spradley, L. W.

    1993-01-01

    Some recent results obtained using solution-adaptive finite element method in linear elastic two-dimensional fracture mechanics problems are presented. The focus is on the basic issue of adaptive finite element method for validating the applications of new methodology to fracture mechanics problems by computing demonstration problems and comparing the stress intensity factors to analytical results.

  4. The Efficacy of Psychophysiological Measures for Implementing Adaptive Technology

    NASA Technical Reports Server (NTRS)

    Scerbo, Mark W.; Freeman, Frederick G.; Mikulka, Peter J.; Parasuraman, Raja; DiNocero, Francesco; Prinzel, Lawrence J., III

    2001-01-01

    Adaptive automation refers to technology that can change its mode of operation dynamically. Further, both the technology and the operator can initiate changes in the level or mode of automation. The present paper reviews research on adaptive technology. It is divided into three primary sections. In the first section, issues surrounding the development and implementation of adaptive automation are presented. Because physiologically based measures show much promise for implementing adaptive automation, the second section is devoted to examining candidate indices. In the final section, those techniques that show the greatest promise for adaptive automation, as well as issues that still need to be resolved, are discussed.

  5. Theory-Guided Technology in Computer Science.

    ERIC Educational Resources Information Center

    Ben-Ari, Mordechai

    2001-01-01

    Examines the history of major achievements in computer science as portrayed by winners of the prestigious Turing award and identifies a possibly unique activity called Theory-Guided Technology (TGT). Researchers develop TGT by using theoretical results to create practical technology. Discusses reasons why TGT is practical in computer science and…

  6. Prior Computer Experience and Technology Acceptance

    ERIC Educational Resources Information Center

    Varma, Sonali

    2010-01-01

    Prior computer experience with information technology has been identified as a key variable (Lee, Kozar, & Larsen, 2003) that can influence an individual's future use of newer computer technology. The lack of a theory driven approach to measuring prior experience has however led to conceptually different factors being used interchangeably in…

  7. Computer technology for autistic students.

    PubMed

    Panyan, M V

    1984-12-01

    The first purpose of this article is to review the literature related to the use of computers with autistic individuals. Although only a limited number of applications have been reported, the potential of the computer to facilitate the progress of autistic persons is promising. The second purpose is to identify specific learning problems or styles associated with autism from the research literature and link these with the unique aspects of computer-based instruction. For example, the computer's role in improving the motivation of autistic individuals is related to its capacity to analyze the reinforcing qualities of a particular event interactively and immediately for each user. Finally, recommendations that may enable computers to be maximally beneficial in assessing the learning process and remediating learning problems are offered. Two such recommendations are selecting appropriate software and integrating computer instruction within the classroom environment. PMID:6549182

  9. Center for Computer Sciences and Technology.

    ERIC Educational Resources Information Center

    National Bureau of Standards (DOC), Washington, DC.

    Functions of the Center for Computer Sciences and Technology (CCST), a national center for computer research and development for the United States government, are described. CCST provides computer and related services to the National Bureau of Standards of which it is a part and to other government agencies on a cost-reimbursable basis. The Office…

  10. Computer Service Technology (An Associate Degree Program).

    ERIC Educational Resources Information Center

    McQuay, Paul L.; Bronk, Carol G.

    Delaware County College's (DCC's) computer service technology program is described in this paper, along with job market needs for computer personnel in Delaware County and nationwide. First, the type of work performed by computer service technicians and the areas in which they are employed are outlined. Next, the objectives of DCC's program are…

  11. SLJ Special Section: Computer Technology and Libraries.

    ERIC Educational Resources Information Center

    School Library Journal, 1984

    1984-01-01

    This 5-article section provides an annotated bibliography of 39 books about computer technology; describes peripheral equipment essential for building computer systems; suggests an approach to software evaluation and selection; describes the implementation of a computer program for elementary/secondary students and teachers; and reviews 34…

  12. Computer Technology Resources for Literacy Projects.

    ERIC Educational Resources Information Center

    Florida State Council on Aging, Tallahassee.

    This resource booklet was prepared to assist literacy projects and community adult education programs in determining the technology they need to serve more older persons. Section 1 contains the following reprinted articles: "The Human Touch in the Computer Age: Seniors Learn Computer Skills from Schoolkids" (Suzanne Kashuba); "Computer Instruction…

  13. Adaptive classification on brain-computer interfaces using reinforcement signals.

    PubMed

    Llera, A; Gómez, V; Kappen, H J

    2012-11-01

    We introduce a probabilistic model that combines a classifier with an extra reinforcement signal (RS) encoding the probability that an erroneous feedback was delivered by the classifier. This representation computes the class probabilities given the task-related features and the reinforcement signal. Estimating the parameter values under this model with expectation maximization (EM) shows that some existing adaptive classifiers are particular cases of such an EM algorithm. Further, we present a new algorithm for adaptive classification, which we call the constrained means adaptive classifier, and show using EEG data and simulated RS that this classifier is able to significantly outperform state-of-the-art adaptive classifiers.
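
    The core idea — fusing a classifier's output with a reinforcement signal that estimates the chance the feedback was wrong — can be sketched for the binary case as follows. This is an illustrative simplification with hypothetical function names, not the paper's constrained means adaptive classifier:

```python
def posterior_with_rs(p_class, p_error):
    """Combine the classifier's probability that the sample is class 1
    with a reinforcement signal giving the probability that the
    delivered feedback was erroneous: if the feedback was wrong, the
    true label is the other class."""
    return p_class * (1.0 - p_error) + (1.0 - p_class) * p_error

def adapt_means(mean0, mean1, x, posterior, lr=0.1):
    """One crude EM-style adaptation step on scalar class means:
    shift each mean toward the new feature value x, weighted by the
    corresponding class posterior."""
    mean0 = mean0 + lr * (1.0 - posterior) * (x - mean0)
    mean1 = mean1 + lr * posterior * (x - mean1)
    return mean0, mean1
```

    A confident RS (p_error near 1) effectively flips the classifier's decision, which is how simulated error signals can drive adaptation even without true labels.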

  14. Multithreaded Model for Dynamic Load Balancing Parallel Adaptive PDE Computations

    NASA Technical Reports Server (NTRS)

    Chrisochoides, Nikos

    1995-01-01

    We present a multithreaded model for the dynamic load balancing of numerical, adaptive computations required for the solution of Partial Differential Equations (PDEs) on multiprocessors. Multithreading is used as a means of exploiting concurrency at the processor level in order to tolerate the synchronization costs inherent in traditional (non-threaded) parallel adaptive PDE solvers. Our preliminary analysis of parallel, adaptive PDE solvers indicates that multithreading can be used as a mechanism to mask the overheads required for the dynamic balancing of processor workloads with the computations required for the actual numerical solution of the PDEs. Multithreading can also simplify the implementation of dynamic load-balancing algorithms, a task that is very difficult for traditional data-parallel adaptive PDE computations. Unfortunately, multithreading does not always reduce program complexity, often hinders code re-usability, and increases software complexity.
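
    The load-balancing idea — letting idle threads pull work rather than fixing a static partition — can be sketched with a shared work queue. This is a generic illustration of the mechanism, not the paper's multithreaded PDE solver:

```python
import threading
import queue

def balance(tasks, n_workers=4):
    """Dynamic load balancing with a shared work queue: each worker
    thread pulls the next task as soon as it finishes its current one,
    so unevenly sized tasks need no static partitioning."""
    q = queue.Queue()
    for t in tasks:
        q.put(t)
    results = []
    lock = threading.Lock()

    def worker():
        while True:
            try:
                task = q.get_nowait()
            except queue.Empty:
                return
            r = task()                 # run the (possibly expensive) task
            with lock:
                results.append(r)

    threads = [threading.Thread(target=worker) for _ in range(n_workers)]
    for th in threads:
        th.start()
    for th in threads:
        th.join()
    return results
```

    In an adaptive solver the queue would hold refinement or solve tasks whose costs are unknown in advance; the pull model masks that imbalance automatically.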

  15. Adaptive DNA Computing Algorithm by Using PCR and Restriction Enzyme

    NASA Astrophysics Data System (ADS)

    Kon, Yuji; Yabe, Kaoru; Rajaee, Nordiana; Ono, Osamu

    In this paper, we introduce an adaptive DNA computing algorithm by using polymerase chain reaction (PCR) and restriction enzyme. The adaptive algorithm is designed based on Adleman-Lipton paradigm[3] of DNA computing. In this work, however, unlike the Adleman- Lipton architecture a cutting operation has been introduced to the algorithm and the mechanism in which the molecules used by computation were feedback to the next cycle devised. Moreover, the amplification by PCR is performed in the molecule used by feedback and the difference concentration arisen in the base sequence can be used again. By this operation the molecules which serve as a solution candidate can be reduced down and the optimal solution is carried out in the shortest path problem. The validity of the proposed adaptive algorithm is considered with the logical simulation and finally we go on to propose applying adaptive algorithm to the chemical experiment which used the actual DNA molecules for solving an optimal network problem.

  16. Simple and Effective Algorithms: Computer-Adaptive Testing.

    ERIC Educational Resources Information Center

    Linacre, John Michael

    Computer-adaptive testing (CAT) allows improved security, greater scoring accuracy, shorter testing periods, quicker availability of results, and reduced guessing and other undesirable test behavior. Simple approaches can be applied by the classroom teacher, or other content specialist, who possesses simple computer equipment and elementary…
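
    One of the "simple approaches" a teacher with elementary equipment could apply is Rasch-based item selection: give the unused item closest in difficulty to the current ability estimate, then step the estimate up or down. A minimal sketch under an assumed Rasch item bank (not drawn from the cited text):

```python
import math

def rasch_p(theta, b):
    """Probability of a correct answer under the Rasch model,
    with ability theta and item difficulty b."""
    return 1.0 / (1.0 + math.exp(-(theta - b)))

def next_item(theta, difficulties, used):
    """Pick the unused item whose difficulty is closest to the current
    ability estimate; for the Rasch model this maximizes information."""
    candidates = [i for i in range(len(difficulties)) if i not in used]
    return min(candidates, key=lambda i: abs(difficulties[i] - theta))

def update_theta(theta, correct, step):
    """Stepwise ability update: move up after a correct answer, down
    after an incorrect one; halving step each turn settles the estimate."""
    return theta + step if correct else theta - step
```

    Because each item is pitched near the examinee's level, blind guessing pays off less and fewer items are needed than in a fixed-form test.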

  17. Adapting Grids For Computing Two-Dimensional Flows

    NASA Technical Reports Server (NTRS)

    Davies, Carol B.; Venkatapathy, Ethiraj

    1992-01-01

    SAGE2D is a two-dimensional implementation of the Self Adaptive Grid Evolution computer program, which intelligently redistributes initial grid points on the basis of an initial flow-field solution. Grids are modified according to the initially computed flows, enabling recomputation at greater accuracy. Written in FORTRAN 77.
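
    The redistribution principle — equidistribute a weight built from the solution gradient so that points cluster where the flow varies fastest — can be sketched in one dimension. This is a schematic Python illustration of the idea, not the SAGE2D FORTRAN code:

```python
import numpy as np

def redistribute(x, u):
    """Redistribute grid points so that the weight w = 1 + |du/dx| is
    equidistributed: cells shrink where the solution varies fastest."""
    w = 1.0 + np.abs(np.gradient(u, x))
    # cumulative "weighted arc length" along the grid (trapezoidal rule)
    s = np.concatenate(([0.0],
                        np.cumsum(0.5 * (w[1:] + w[:-1]) * np.diff(x))))
    s_new = np.linspace(0.0, s[-1], len(x))   # equal weight increments
    return np.interp(s_new, s, x)             # map back to physical space
```

    Recomputing the flow on the redistributed grid resolves sharp features (e.g., a steep front) with the same point count.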

  18. An Adaptive Evaluation Structure for Computer-Based Instruction.

    ERIC Educational Resources Information Center

    Welsh, William A.

    Adaptive Evaluation Structure (AES) is a set of linked computer programs designed to increase the effectiveness of interactive computer-assisted instruction at the college level. The package has four major features, the first of which is based on a prior cognitive inventory and on the accuracy and pace of student responses. AES adjusts materials…

  19. From Computer Lab to Technology Class.

    ERIC Educational Resources Information Center

    Sherwood, Sandra

    1999-01-01

    Discussion of integrating technology into elementary school classrooms focuses on teacher training that is based on a three-year plan developed at an elementary school in Marathon, New York. Describes the role of a technology teacher who facilitates technology integration by running the computer lab, offering workshops, and developing inservice…

  20. Art Therapists and Computer Technology

    ERIC Educational Resources Information Center

    Peterson, Brent C.; Stovall, Kay; Elkins, David E.; Parker-Bell, Barbara

    2005-01-01

    The purpose of this study was to understand the impact of technology on art therapists by exploring how art therapists own and use technology and to determine barriers to ownership and use. A survey was conducted at the 2002 annual conference of the American Art Therapy Association in Washington, DC. Of the 250 surveys distributed, 195 were…

  1. Computation as an emergent feature of adaptive synchronization

    NASA Astrophysics Data System (ADS)

    Zanin, M.; Papo, D.; Sendiña-Nadal, I.; Boccaletti, S.

    2011-12-01

    We report on the spontaneous emergence of computation from adaptive synchronization of networked dynamical systems. The fundamental units are nonlinear elements interacting in a directed graph via a coupling that adapts itself to the synchronization level between two input signals. These units can emulate different Boolean logics and perform any computational task in the Turing sense, each specific operation being associated with a given network motif. The resilience of the computation against noise is proven, and general applicability is demonstrated for periodic and chaotic oscillators and for excitable systems mimicking neural dynamics.

  2. Parallel computation of geometry control in adaptive truss structures

    NASA Technical Reports Server (NTRS)

    Ramesh, A. V.; Utku, S.; Wada, B. K.

    1992-01-01

    The fast computation of geometry control in adaptive truss structures involves two distinct parts: the efficient integration of the inverse kinematic differential equations that govern the geometry control, and the fast computation of the Jacobian, which appears on the right-hand side of the inverse kinematic equations. This paper presents an efficient parallel implementation of the Jacobian computation on an MIMD machine. A large speedup is obtained from the parallel implementation, which reduces the Jacobian computation to an O(M-squared/n) procedure on an n-processor machine, where M is the number of members in the adaptive truss. The parallel algorithm given here is a good candidate for on-line geometry control of adaptive structures using attached processors.

  3. Adapting Inspection Data for Computer Numerical Control

    NASA Technical Reports Server (NTRS)

    Hutchison, E. E.

    1986-01-01

    Machining time for repetitive tasks reduced. Program converts measurements of stub post locations by coordinate-measuring machine into form used by numerical-control computer. Work time thus reduced by 10 to 15 minutes for each post. Since there are 600 such posts on each injector, time saved per injector is 100 to 150 hours. With modifications this approach applicable to machining of many precise holes on large machine frames and similar objects.

  4. Computers and Writing. Tech Use Guide: Using Computer Technology.

    ERIC Educational Resources Information Center

    Dalton, Bridget

    One of nine brief guides for special educators on using computer technology, this guide focuses on the use of computers to improve skills and attitudes in writing instruction. Pre-writing tools such as group brainstorming, story webs, free-writing, journal entries, and prewriting guides help generate ideas and can be carried out either on or off…

  5. Adaptive computational methods for aerothermal heating analysis

    NASA Technical Reports Server (NTRS)

    Price, John M.; Oden, J. Tinsley

    1988-01-01

    The development of adaptive gridding techniques for finite-element analysis of fluid dynamics equations is described. The developmental work was done with the Euler equations with concentration on shock and inviscid flow field capturing. Ultimately this methodology is to be applied to a viscous analysis for the purpose of predicting accurate aerothermal loads on complex shapes subjected to high speed flow environments. The development of local error estimate strategies as a basis for refinement strategies is discussed, as well as the refinement strategies themselves. The application of the strategies to triangular elements and a finite-element flux-corrected-transport numerical scheme are presented. The implementation of these strategies in the GIM/PAGE code for 2-D and 3-D applications is documented and demonstrated.

  6. Adapting Books: Ready, Set, Read!: EAT: Equipment, Adaptations, and Technology

    ERIC Educational Resources Information Center

    Schoonover, Judith; Norton-Darr, Sally

    2016-01-01

    Developing multimodal materials to introduce or extend literacy experiences sets the stage for literacy success. Alternative ways to organize, display and arrange, interact and respond to information produces greater understanding of concepts. Adaptations include making books easier to use (turning pages or holding), and text easier to read…

  7. Theory-Guided Technology in Computer Science

    NASA Astrophysics Data System (ADS)

    Ben-Ari, Mordechai

    Scientists usually identify themselves as either theoreticians or experimentalists, while technology - the application of science in practice - is done by engineers. In computer science, these distinctions are often blurred. This paper examines the history of major achievements in computer science as portrayed by the winners of the prestigious Turing Award and identifies a possibly unique activity called Theory-Guided Technology (TGT). Researchers develop TGT by using theoretical results to create practical technology. The reasons why TGT is practical in computer science are discussed, as is the cool reception TGT has received from software engineers.

  8. Computing, Information and Communications Technology (CICT) Website

    NASA Technical Reports Server (NTRS)

    Hardman, John; Tu, Eugene (Technical Monitor)

    2002-01-01

    The Computing, Information and Communications Technology Program (CICT) was established in 2001 to ensure NASA's Continuing leadership in emerging technologies. It is a coordinated, Agency-wide effort to develop and deploy key enabling technologies for a broad range of mission-critical tasks. The NASA CICT program is designed to address Agency-specific computing, information, and communications technology requirements beyond the projected capabilities of commercially available solutions. The areas of technical focus have been chosen for their impact on NASA's missions, their national importance, and the technical challenge they provide to the Program. In order to meet its objectives, the CICT Program is organized into the following four technology focused projects: 1) Computing, Networking and Information Systems (CNIS); 2) Intelligent Systems (IS); 3) Space Communications (SC); 4) Information Technology Strategic Research (ITSR).

  9. Advances and trends in computational structures technology

    NASA Technical Reports Server (NTRS)

    Noor, A. K.; Venneri, S. L.

    1990-01-01

    The major goals of computational structures technology (CST) are outlined, and recent advances in CST are examined. These include computational material modeling, stochastic-based modeling, computational methods for articulated structural dynamics, strategies and numerical algorithms for new computing systems, and multidisciplinary analysis and optimization. The role of CST in the future development of structures technology and the multidisciplinary design of future flight vehicles is addressed, and the future directions of CST research in the prediction of failures of structural components, the solution of large-scale structural problems, and quality assessment and control of numerical simulations are discussed.

  10. Adaptive Fuzzy Systems in Computational Intelligence

    NASA Technical Reports Server (NTRS)

    Berenji, Hamid R.

    1996-01-01

    In recent years, interest in computational intelligence techniques, which currently include neural networks, fuzzy systems, and evolutionary programming, has grown significantly, and a number of their applications have been developed in government and industry. In the future, an essential element in these systems will be fuzzy systems that can learn from experience by using neural networks to refine their performance. The GARIC architecture, introduced earlier, is an example of a fuzzy reinforcement learning system which has been applied in several control domains, such as cart-pole balancing, simulation of Space Shuttle orbital operations, and tether control. A number of examples from GARIC's applications in these domains will be demonstrated.

  11. (CICT) Computing, Information, and Communications Technology Overview

    NASA Technical Reports Server (NTRS)

    VanDalsem, William R.

    2003-01-01

    The goal of the Computing, Information, and Communications Technology (CICT) program is to enable NASA's Scientific Research, Space Exploration, and Aerospace Technology Missions with greater mission assurance, for less cost, with increased science return through the development and use of advanced computing, information and communications technologies. This viewgraph presentation includes diagrams of how the political guidance behind CICT is structured. The presentation profiles each part of the NASA Mission in detail, and relates the Mission to the activities of CICT. CICT's Integrated Capability Goal is illustrated, and hypothetical missions which could be enabled by CICT are profiled. CICT technology development is profiled.

  12. An Adaptive Sensor Mining Framework for Pervasive Computing Applications

    NASA Astrophysics Data System (ADS)

    Rashidi, Parisa; Cook, Diane J.

    Analyzing sensor data in pervasive computing applications brings unique challenges to the KDD community. The challenge is heightened when the underlying data source is dynamic and the patterns change. We introduce a new adaptive mining framework that detects patterns in sensor data, and more importantly, adapts to the changes in the underlying model. In our framework, the frequent and periodic patterns of data are first discovered by the Frequent and Periodic Pattern Miner (FPPM) algorithm; and then any changes in the discovered patterns over the lifetime of the system are discovered by the Pattern Adaptation Miner (PAM) algorithm, in order to adapt to the changing environment. This framework also captures vital context information present in pervasive computing applications, such as the startup triggers and temporal information. In this paper, we present a description of our mining framework and validate the approach using data collected in the CASAS smart home testbed.
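
    The two-stage mining loop the framework describes — discover frequent patterns, then detect how they drift — can be caricatured with sliding-window counts. These functions are simplified stand-ins, not the actual FPPM/PAM algorithms:

```python
from collections import Counter

def frequent_patterns(events, length=2, min_count=3):
    """FPPM-style step (stand-in): count contiguous event windows of a
    fixed length and keep those occurring at least min_count times."""
    windows = [tuple(events[i:i + length])
               for i in range(len(events) - length + 1)]
    return {p: c for p, c in Counter(windows).items() if c >= min_count}

def pattern_drift(old, new):
    """PAM-style step (stand-in): report which frequent patterns
    emerged or vanished between two mining passes, so the model can
    adapt to the changing environment."""
    return {"emerged": set(new) - set(old),
            "vanished": set(old) - set(new)}
```

    In a smart-home setting the events would be sensor firings (door, light, motion), and a vanished pattern signals that the resident's routine has changed.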

  13. Computer Programming Projects in Technology Courses.

    ERIC Educational Resources Information Center

    Thomas, Charles R.

    1985-01-01

    Discusses programming projects in applied technology courses, examining documentation, formal reports, and implementation. Includes recommendations based on experience with a sophomore machine elements course which provided computers for problem solving exercises. (DH)

  14. Adapting Technological Interventions to Meet the Needs of Priority Populations.

    PubMed

    Linke, Sarah E; Larsen, Britta A; Marquez, Becky; Mendoza-Vasconez, Andrea; Marcus, Bess H

    2016-01-01

    Cardiovascular diseases (CVD) comprise the leading cause of mortality worldwide, accounting for 3 in 10 deaths. Individuals with certain risk factors, including tobacco use, obesity, low levels of physical activity, type 2 diabetes mellitus, racial/ethnic minority status and low socioeconomic status, experience higher rates of CVD and are, therefore, considered priority populations. Technological devices such as computers and smartphones are now routinely utilized in research studies aiming to prevent CVD and its risk factors, and they are also rampant in the public and private health sectors. Traditional health behavior interventions targeting these risk factors have been adapted for technology-based approaches. This review provides an overview of technology-based interventions conducted in these priority populations as well as the challenges and gaps to be addressed in future research. Researchers currently possess tremendous opportunities to engage in technology-based implementation and dissemination science to help spread evidence-based programs focusing on CVD risk factors in these and other priority populations. PMID:26957186

  15. Electrooptical adaptive switching network for the hypercube computer

    NASA Technical Reports Server (NTRS)

    Chow, E.; Peterson, J.

    1988-01-01

    An all-optical network design for the hyperswitch network using regular free-space interconnects between electronic processor nodes is presented. The adaptive routing model used is described, and an adaptive routing control example is presented. The design demonstrates that existing electrooptical techniques are sufficient for implementing efficient parallel architectures without the need for more complex means of implementing arbitrary interconnection schemes. The electrooptical hyperswitch network significantly improves the communication performance of the hypercube computer.

  16. Ultimate computing. Biomolecular consciousness and nano Technology

    SciTech Connect

    Hameroff, S.R.

    1987-01-01

    The book advances the premise that the cytoskeleton is the cell's nervous system, the biological controller/computer. If indeed cytoskeletal dynamics at the nanoscale (billionths of a meter, billionths of a second) are the texture of intracellular information processing, emerging "NanoTechnologies" (scanning tunneling microscopy, Feynman machines, von Neumann replicators, etc.) should enable direct monitoring, decoding, and interfacing between biological and technological information devices. This in turn could result in important biomedical applications and perhaps a merger of mind and machine: Ultimate Computing.

  17. Computer technology forecast study for general aviation

    NASA Technical Reports Server (NTRS)

    Seacord, C. L.; Vaughn, D.

    1976-01-01

    A multi-year, multi-faceted program is underway to investigate and develop potential improvements in airframes, engines, and avionics for general aviation aircraft. The objective of this study was to assemble information that will allow the government to assess the trends in computer and computer/operator interface technology that may have application to general aviation in the 1980's and beyond. The current state of the art of computer hardware is assessed, technical developments in computer hardware are predicted, and nonaviation large volume users of computer hardware are identified.

  18. Computational issue in the analysis of adaptive control systems

    NASA Technical Reports Server (NTRS)

    Kosut, Robert L.

    1989-01-01

    Adaptive systems under slow parameter adaptation can be analyzed by the method of averaging. This provides a means to assess the stability (and instability) properties of most adaptive systems, either continuous-time or (more importantly for practice) discrete-time, as well as an estimate of the region of attraction. Although the method of averaging is conceptually straightforward, even simple examples are well beyond hand calculation. Specific software tools are proposed that can provide the basis for a user-friendly environment in which to perform the computations involved in the averaging analysis.

  19. [Computer technologies in teaching pathological anatomy].

    PubMed

    Ponomarev, A B; Fedorov, D N

    2015-01-01

    The paper describes experience with personal computers used at the Academician A.L. Strukov Department of Pathological Anatomy for more than 20 years. It shows the objective necessity of introducing computer technologies at all stages of acquiring skills in anatomical pathology, including lectures, students' independent work, test checks, etc.

  20. Applications of Computer Technology in Intercollegiate Debate.

    ERIC Educational Resources Information Center

    Kay, Jack, Ed.

    1986-01-01

    Focusing on how computers can and should be used in intercollegiate forensics, this journal issue offers the perspectives of a number of forensics instructors. The lead article, "Applications of Computer Technology in Intercollegiate Debate" by Theodore F. Sheckels, Jr., discusses five areas in which forensics educators might use computer…

  1. Computer Technology: For Better or Worse?

    ERIC Educational Resources Information Center

    Ware, Willis H.

    Computer technology ought to be among the most helpful and useful of any technology, but if it is not treated with care, the society will be worse off for it. The Federal Privacy Act of 1974 established the Privacy Protection Study Commission whose business started in June, 1975. In examination of the private sector, the commission's…

  2. Crystal gazing v. computer system technology projections.

    NASA Astrophysics Data System (ADS)

    Wells, Donald C.

    The following sections are included: * INTRODUCTION * PREDICTIONS FOR THE EARLY NINETIES * EVOLUTION OF COMPUTER CAPACITIES * UNIX IS COMING! * GRAPHICS TECHNOLOGY * MASSIVE ARCHIVAL STORAGE * ADA AND PROJECT MANAGEMENT * ARTIFICIAL INTELLIGENCE TECHNOLOGY * FILLING IN THE DETAILS * UNIX DESIDERATA * ICON-DRIVEN COMMAND LANGUAGES? * AI AGAIN * DISCUSSION * REFERENCES * BIBLIOGRAPHY—FOR FURTHER READING

  3. Research on Key Technologies of Cloud Computing

    NASA Astrophysics Data System (ADS)

    Zhang, Shufen; Yan, Hongcan; Chen, Xuebin

    With the development of multi-core processors, virtualization, distributed storage, broadband Internet, and automatic management, a new computing mode named cloud computing has emerged. It distributes computation tasks over a resource pool consisting of massive numbers of computers, so that application systems can obtain computing power, storage space, and software services according to demand. It concentrates all the computing resources and manages them automatically through software, without human intervention. This frees application providers from tedious details so they can concentrate on their business, which favors innovation and reduces cost. The ultimate goal of cloud computing is to provide calculation, services, and applications as a public utility, so that people can use computer resources just as they use water, electricity, gas, and the telephone. Currently, the understanding of cloud computing is still developing and changing, and cloud computing has no unanimous definition. This paper describes the three main service forms of cloud computing (SaaS, PaaS, and IaaS), compares the definitions of cloud computing given by Google, Amazon, IBM, and other companies, summarizes the basic characteristics of cloud computing, and emphasizes key technologies such as data storage, data management, virtualization, and the programming model.

  4. Culture Computing: Interactive Technology to Explore Culture

    NASA Astrophysics Data System (ADS)

    Cheok, Adrian David

    The present-day rapid development of media science and digital technology offers the modern generation new opportunities as well as challenges, as a new fundamental literacy. Therefore, to reach the modern generation on issues such as an appreciation of cultures, we have to find common ground based on digital media technology. In an increasingly hybrid cultural environment, the interaction and fusion of cultural factors with computer technology offers a way to provide an experience of the cultures of the world within the environments the modern generation inhabits. Our research has created novel mergings of traditional cultures and literature with recent media literacy. Three cultural computing systems, Media Me, BlogWall and Confucius Computer, are presented in this chapter. Studies showed that users gave positive feedback on their experience of interacting with these cultural computing systems.

  5. Cloud Computing Technologies and Applications

    NASA Astrophysics Data System (ADS)

    Zhu, Jinzy

    In a nutshell, the existing Internet provides content to us in the form of videos, emails, and information served up in web pages. With Cloud Computing, the next generation of the Internet will allow us to "buy" IT services from a web portal, drastically expanding the types of merchandise available beyond those on e-commerce sites such as eBay and Taobao. We would be able to rent from a virtual storefront the basic necessities to build a virtual data center, such as CPU, memory, and storage, and add on top of that the necessary middleware (web application servers, databases, enterprise service bus, etc.) as the platform(s) to support the applications we would like either to rent from an Independent Software Vendor (ISV) or to develop ourselves. Together this is what we call "IT as a Service," or ITaaS, bundled for us end users as a virtual data center.

  6. Estimating Skin Cancer Risk: Evaluating Mobile Computer-Adaptive Testing

    PubMed Central

    Djaja, Ngadiman; Janda, Monika; Olsen, Catherine M; Whiteman, David C

    2016-01-01

    Background Response burden is a major detriment to questionnaire completion rates. Computer adaptive testing may offer advantages over non-adaptive testing, including reduction of numbers of items required for precise measurement. Objective Our aim was to compare the efficiency of non-adaptive (NAT) and computer adaptive testing (CAT) facilitated by Partial Credit Model (PCM)-derived calibration to estimate skin cancer risk. Methods We used a random sample from a population-based Australian cohort study of skin cancer risk (N=43,794). All 30 items of the skin cancer risk scale were calibrated with the Rasch PCM. A total of 1000 cases generated following a normal distribution (mean [SD] 0 [1]) were simulated using three Rasch models with three fixed-item (dichotomous, rating scale, and partial credit) scenarios, respectively. We calculated the comparative efficiency and precision of CAT and NAT (shortening of questionnaire length and the count difference number ratio less than 5% using independent t tests). Results We found that use of CAT led to smaller person standard error of the estimated measure than NAT, with substantially higher efficiency but no loss of precision, reducing response burden by 48%, 66%, and 66% for dichotomous, Rating Scale Model, and PCM models, respectively. Conclusions CAT-based administrations of the skin cancer risk scale could substantially reduce participant burden without compromising measurement precision. A mobile computer adaptive test was developed to help people efficiently assess their skin cancer risk. PMID:26800642

  7. The Importance of Formalizing Computational Models of Face Adaptation Aftereffects

    PubMed Central

    Ross, David A.; Palmeri, Thomas J.

    2016-01-01

    Face adaptation is widely used as a means to probe the neural representations that support face recognition. While the theories that relate face adaptation to behavioral aftereffects may seem conceptually simple, our work has shown that testing computational instantiations of these theories can lead to unexpected results. Instantiating a model of face adaptation not only requires specifying how faces are represented and how adaptation shapes those representations but also specifying how decisions are made, translating hidden representational states into observed responses. Considering the high-dimensionality of face representations, the parallel activation of multiple representations, and the non-linearity of activation functions and decision mechanisms, intuitions alone are unlikely to succeed. If the goal is to understand mechanism, not simply to examine the boundaries of a behavioral phenomenon or correlate behavior with brain activity, then formal computational modeling must be a component of theory testing. To illustrate, we highlight our recent computational modeling of face adaptation aftereffects and discuss how models can be used to understand the mechanisms by which faces are recognized. PMID:27378960

  8. X-Y plotter adapter developed for SDS-930 computer

    NASA Technical Reports Server (NTRS)

    Robertson, J. B.

    1968-01-01

    The Graphical Display Adapter provides a real-time display for digital computerized experiments. The display uses a memory oscilloscope, which records a single trace until erased. The adapter is a small hardware unit that interfaces, through the J-box feature of the SDS-930 computer, to either an X-Y plotter or a memory oscilloscope.

  9. Computer Adaptive Testing for Small Scale Programs and Instructional Systems

    ERIC Educational Resources Information Center

    Rudner, Lawrence M.; Guo, Fanmin

    2011-01-01

    This study investigates measurement decision theory (MDT) as an underlying model for computer adaptive testing when the goal is to classify examinees into one of a finite number of groups. The first analysis compares MDT with a popular item response theory model and finds little difference in terms of the percentage of correct classifications. The…

  10. Adaptive-mesh algorithms for computational fluid dynamics

    NASA Technical Reports Server (NTRS)

    Powell, Kenneth G.; Roe, Philip L.; Quirk, James

    1993-01-01

    The basic goal of adaptive-mesh algorithms is to distribute computational resources wisely by increasing the resolution of 'important' regions of the flow and decreasing the resolution of regions that are less important. While this goal is worthwhile, implementing schemes that have this degree of sophistication remains more of an art than a science. In this paper, the basic pieces of adaptive-mesh algorithms are described and some of the possible ways to implement them are discussed and compared. These basic pieces are the data structure to be used, the generation of an initial mesh, the criterion to be used to adapt the mesh to the solution, and the flow-solver algorithm on the resulting mesh. Each of these is discussed, with particular emphasis on methods suitable for the computation of compressible flows.
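
    As a minimal illustration of the adaptation-criterion piece, the sketch below flags 1-D cells whose undivided solution difference exceeds a tolerance and bisects them; a production AMR code would pair this with the data structure, coarsening, and flow solver the abstract describes. The criterion and tolerance are illustrative, not from the paper.

```python
def refine_mesh(x, u, tol):
    """One refinement pass: split any cell whose solution jump exceeds tol.

    x -- sorted node coordinates, u -- solution values at the nodes."""
    new_x = [x[0]]
    for i in range(len(x) - 1):
        # adaptation criterion: undivided difference of u across the cell
        if abs(u[i + 1] - u[i]) > tol:
            new_x.append(0.5 * (x[i] + x[i + 1]))  # insert the cell midpoint
        new_x.append(x[i + 1])
    return new_x
```

    Applied to a near-discontinuous profile, only the cells straddling the steep gradient are subdivided, concentrating resolution where it matters.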

  11. Sequential decision making in computational sustainability via adaptive submodularity

    USGS Publications Warehouse

    Andreas Krause,; Daniel Golovin,; Converse, Sarah J.

    2015-01-01

    Many problems in computational sustainability require making a sequence of decisions in complex, uncertain environments. Such problems are notoriously difficult. In this article, we review the recently discovered notion of adaptive submodularity, an intuitive diminishing returns condition that generalizes the classical notion of submodular set functions to sequential decision problems. Problems exhibiting the adaptive submodularity property can be efficiently and provably near-optimally solved using simple myopic policies. We illustrate this concept in several case studies of interest in computational sustainability: First, we demonstrate how it can be used to efficiently plan for resolving uncertainty in adaptive management scenarios. Second, we show how it applies to dynamic conservation planning for protecting endangered species, a case study carried out in collaboration with the US Geological Survey and the US Fish and Wildlife Service.
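
    The "simple myopic policy" can be sketched for adaptive stochastic coverage, a standard adaptive-submodular example: repeatedly take the action with the greatest expected marginal coverage gain given the outcomes observed so far. The sets and success probabilities below are invented for illustration, not taken from the article's case studies.

```python
import random

def adaptive_greedy(cover, p, budget, rng=random.Random(1)):
    """Myopic policy for adaptive stochastic coverage.

    cover -- dict: action -> set of elements it covers if it succeeds
    p     -- dict: action -> probability the action succeeds"""
    covered, chosen = set(), []
    remaining = set(cover)
    for _ in range(budget):
        if not remaining:
            break
        # expected marginal gain of each action, conditioned on observations so far
        best = max(remaining, key=lambda a: p[a] * len(cover[a] - covered))
        remaining.remove(best)
        chosen.append(best)
        if rng.random() < p[best]:  # observe the realized outcome, then adapt
            covered |= cover[best]
    return chosen, covered
```

    Under adaptive submodularity, this greedy policy is provably within a constant factor of the optimal adaptive policy, which is why such simple loops suffice.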

  12. Unstructured Adaptive Grid Computations on an Array of SMPs

    NASA Technical Reports Server (NTRS)

    Biswas, Rupak; Pramanick, Ira; Sohn, Andrew; Simon, Horst D.

    1996-01-01

    Dynamic load balancing is necessary for parallel adaptive methods to solve unsteady CFD problems on unstructured grids. In this paper, we present such a dynamic load balancing framework, called JOVE. Results on a four-POWERnode POWER CHALLENGEarray demonstrated that load balancing gives significant performance improvements over no load balancing for such adaptive computations. The parallel speedup of JOVE, implemented using MPI on the POWER CHALLENGEarray, was significant, being as high as 31 for 32 processors. An implementation of JOVE that exploits the 'array of SMPs' architecture was also studied; this hybrid JOVE outperformed flat JOVE by up to 28% on the meshes and adaption models tested. With large, realistic meshes and actual flow-solver and adaption phases incorporated into JOVE, hybrid JOVE can be expected to yield significant advantages over flat JOVE, especially as the number of processors is increased, thus demonstrating the scalability of the array-of-SMPs architecture.
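
    The load-balancing idea can be illustrated with a longest-processing-time-first heuristic: sort the estimated per-partition costs and always hand the next-largest piece to the least-loaded processor. This is a generic sketch of the principle, not JOVE's actual repartitioning algorithm.

```python
import heapq

def lpt_balance(task_costs, n_workers):
    """Longest-processing-time-first: assign each task to the least-loaded worker."""
    heap = [(0.0, w) for w in range(n_workers)]   # (current load, worker id)
    heapq.heapify(heap)
    assignment = {w: [] for w in range(n_workers)}
    # consider tasks from most to least expensive
    for t, cost in sorted(enumerate(task_costs), key=lambda tc: -tc[1]):
        load, w = heapq.heappop(heap)             # least-loaded worker so far
        assignment[w].append(t)
        heapq.heappush(heap, (load + cost, w))
    return assignment
```

    In an adaptive computation the per-cell costs change after every mesh adaption, so a balancing step like this must be re-run (and weighed against data-movement cost) each time the grid changes.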

  13. The Technology Fix: The Promise and Reality of Computers in Our Schools

    ERIC Educational Resources Information Center

    Pflaum, William D.

    2004-01-01

    During the technology boom of the 1980s and 1990s, computers seemed set to revolutionize education. Do any of these promises sound familiar? (1) Technology would help all students learn better, thanks to multimedia programs capable of adapting to individual needs, learning styles, and skill levels; (2) Technology would transform the teacher's role…

  14. Adaptive Hypermedia Educational System Based on XML Technologies.

    ERIC Educational Resources Information Center

    Baek, Yeongtae; Wang, Changjong; Lee, Sehoon

    This paper proposes an adaptive hypermedia educational system using XML technologies, such as XML, XSL, XSLT, and XLink. Adaptive systems are capable of altering the presentation of the content of the hypermedia on the basis of a dynamic understanding of the individual user. The user profile can be collected in a user model, while the knowledge…

  15. Adaptive Engine Technologies for Aviation CO2 Emissions Reduction

    NASA Technical Reports Server (NTRS)

    Mercer, Carolyn R.; Haller, William J.; Tong, Michael T.

    2006-01-01

    Adaptive turbine engine technologies are assessed for their potential to reduce carbon dioxide emissions from commercial air transports. Technologies including inlet, fan, and compressor flow control, compressor stall control, blade clearance control, combustion control, active bearings and enabling technologies such as active materials and wireless sensors are discussed. The method of systems assessment is described, including strengths and weaknesses of the approach. Performance benefit estimates are presented for each technology, with a summary of potential emissions reduction possible from the development of new, adaptively controlled engine components.

  16. Probabilistic co-adaptive brain-computer interfacing

    NASA Astrophysics Data System (ADS)

    Bryan, Matthew J.; Martin, Stefan A.; Cheung, Willy; Rao, Rajesh P. N.

    2013-12-01

    Objective. Brain-computer interfaces (BCIs) are confronted with two fundamental challenges: (a) the uncertainty associated with decoding noisy brain signals, and (b) the need for co-adaptation between the brain and the interface so as to cooperatively achieve a common goal in a task. We seek to mitigate these challenges. Approach. We introduce a new approach to brain-computer interfacing based on partially observable Markov decision processes (POMDPs). POMDPs provide a principled approach to handling uncertainty and achieving co-adaptation in the following manner: (1) Bayesian inference is used to compute posterior probability distributions (‘beliefs’) over brain and environment state, and (2) actions are selected based on entire belief distributions in order to maximize total expected reward; by employing methods from reinforcement learning, the POMDP’s reward function can be updated over time to allow for co-adaptive behaviour. Main results. We illustrate our approach using a simple non-invasive BCI which optimizes the speed-accuracy trade-off for individual subjects based on the signal-to-noise characteristics of their brain signals. We additionally demonstrate that the POMDP BCI can automatically detect changes in the user’s control strategy and can co-adaptively switch control strategies on-the-fly to maximize expected reward. Significance. Our results suggest that the framework of POMDPs offers a promising approach for designing BCIs that can handle uncertainty in neural signals and co-adapt with the user on an ongoing basis. The fact that the POMDP BCI maintains a probability distribution over the user’s brain state allows a much more powerful form of decision making than traditional BCI approaches, which have typically been based on the output of classifiers or regression techniques. Furthermore, the co-adaptation of the system allows the BCI to make online improvements to its behaviour, adjusting itself automatically to the user’s changing
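
    Step (1) of the approach, maintaining a belief over hidden brain state, is ordinary Bayesian filtering. A discrete-state sketch (hypothetical two-state example, not the paper's decoder):

```python
def belief_update(belief, transition, likelihood):
    """One POMDP belief update: predict with the state-transition model,
    then reweight by the likelihood of the new neural observation."""
    n = len(belief)
    predicted = [sum(transition[prev][s] * belief[prev] for prev in range(n))
                 for s in range(n)]
    posterior = [likelihood[s] * predicted[s] for s in range(n)]
    z = sum(posterior)  # normalizing constant (probability of the observation)
    return [q / z for q in posterior]
```

    An action is then selected to maximize expected reward under this whole distribution, rather than under a single decoded label as in classifier-based BCIs.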

  17. Overview of adaptive finite element analysis in computational geodynamics

    NASA Astrophysics Data System (ADS)

    May, D. A.; Schellart, W. P.; Moresi, L.

    2013-10-01

    The use of numerical models to develop insight and intuition into the dynamics of the Earth over geological time scales is a firmly established practice in the geodynamics community. As our depth of understanding grows, and hand-in-hand with improvements in analytical techniques and higher resolution remote sensing of the physical structure and state of the Earth, there is a continual need to develop more efficient, accurate and reliable numerical techniques. This is necessary to ensure that we can meet the challenge of generating robust conclusions, interpretations and predictions from improved observations. In adaptive numerical methods, the desire is generally to maximise the quality of the numerical solution for a given amount of computational effort. Neither of these terms has a unique, universal definition, but typically there is a trade off between the number of unknowns we can calculate to obtain a more accurate representation of the Earth, and the resources (time and computational memory) required to compute them. In the engineering community, this topic has been extensively examined using the adaptive finite element (AFE) method. Recently, the applicability of this technique to geodynamic processes has started to be explored. In this review we report on the current status and usage of spatially adaptive finite element analysis in the field of geodynamics. The objective of this review is to provide a brief introduction to the area of spatially adaptive finite element analysis, including a summary of different techniques to define spatial adaptation and of different approaches to guide the adaptive process in order to control the discretisation error inherent within the numerical solution. An overview of the current state of the art in adaptive modelling in geodynamics is provided, together with a discussion pertaining to the issues related to using adaptive analysis techniques and perspectives for future research in this area. Additionally, we also provide a

  18. Evaluation of Advanced Computing Techniques and Technologies: Reconfigurable Computing

    NASA Technical Reports Server (NTRS)

    Wells, B. Earl

    2003-01-01

    The focus of this project was to survey the technology of reconfigurable computing and determine its level of maturity and suitability for NASA applications; to better understand and assess the effectiveness of the reconfigurable design paradigm utilized within the HAL-15 reconfigurable computer system, which Star Bridge Systems, Inc. made available to NASA MSFC for this purpose; and to implement at least one application that would benefit from the performance levels possible with reconfigurable hardware. It was originally proposed that experiments in fault tolerance and dynamic reconfigurability would be performed, but time constraints mandated that these be pursued as future research.

  19. Enabling Computational Technologies for Terascale Scientific Simulations

    SciTech Connect

    Ashby, S.F.

    2000-08-24

    We develop scalable algorithms and object-oriented code frameworks for terascale scientific simulations on massively parallel processors (MPPs). Our research in multigrid-based linear solvers and adaptive mesh refinement enables Laboratory programs to use MPPs to explore important physical phenomena. For example, our research aids stockpile stewardship by making practical detailed 3D simulations of radiation transport. The need to solve large linear systems arises in many applications, including radiation transport, structural dynamics, combustion, and flow in porous media. These systems result from discretizations of partial differential equations on computational meshes. Our first research objective is to develop multigrid preconditioned iterative methods for such problems and to demonstrate their scalability on MPPs. Scalability describes how total computational work grows with problem size; it measures how effectively additional resources can help solve increasingly larger problems. Many factors contribute to scalability: computer architecture, parallel implementation, and choice of algorithm. Scalable algorithms have been shown to decrease simulation times by several orders of magnitude.
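
    A two-grid V-cycle, the building block of the multigrid solvers described, can be sketched for the 1-D Poisson problem -u'' = f (weighted-Jacobi smoothing, full-weighting restriction, linear interpolation; grid sizes and sweep counts here are chosen for illustration only):

```python
def jacobi(u, f, h, sweeps, w=2.0 / 3.0):
    """Weighted-Jacobi relaxation for -u'' = f with zero boundary values."""
    n = len(u)
    for _ in range(sweeps):
        nxt = u[:]
        for i in range(n):
            left = u[i - 1] if i > 0 else 0.0
            right = u[i + 1] if i < n - 1 else 0.0
            nxt[i] = (1 - w) * u[i] + w * 0.5 * (left + right + h * h * f[i])
        u = nxt
    return u

def residual(u, f, h):
    """Residual f - A u of the standard 3-point Laplacian."""
    n = len(u)
    r = []
    for i in range(n):
        left = u[i - 1] if i > 0 else 0.0
        right = u[i + 1] if i < n - 1 else 0.0
        r.append(f[i] - (2 * u[i] - left - right) / (h * h))
    return r

def two_grid(u, f, h):
    """One V-cycle: pre-smooth, solve the restricted residual equation on a
    half-resolution grid, interpolate the correction back, post-smooth."""
    u = jacobi(u, f, h, 3)
    r = residual(u, f, h)
    # full-weighting restriction to the coarse points (every other fine node)
    rc = [0.25 * r[2 * i] + 0.5 * r[2 * i + 1] + 0.25 * r[2 * i + 2]
          for i in range((len(r) - 1) // 2)]
    ec = jacobi([0.0] * len(rc), rc, 2 * h, 50)  # near-exact coarse solve
    # linear interpolation of the coarse-grid correction
    e = [0.0] * len(u)
    for i, ci in enumerate(ec):
        e[2 * i + 1] += ci
        e[2 * i] += 0.5 * ci
        e[2 * i + 2] += 0.5 * ci
    u = [ui + ei for ui, ei in zip(u, e)]
    return jacobi(u, f, h, 3)
```

    Because the coarse grid removes the smooth error components that the smoother handles poorly, the residual drops by a roughly constant factor per cycle independent of problem size, which is the source of the scalability the abstract emphasizes.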

  20. Video Adaptation Model Based on Cognitive Lattice in Ubiquitous Computing

    NASA Astrophysics Data System (ADS)

    Kim, Svetlana; Yoon, Yong-Ik

    The multimedia service delivery chain poses many challenges today. There is increasing terminal diversity, network heterogeneity, and pressure to satisfy user preferences. This situation encourages the need for personalized content to provide the user with the best possible experience in ubiquitous computing. This paper introduces a personalized content preparation and delivery framework for multimedia service. The personalized video adaptation is expected to satisfy individual users' needs in video content. The cognitive lattice plays a significant role in video annotation to meet users' preferences on video content. In this paper, a comprehensive solution for PVA (Personalized Video Adaptation) is proposed based on the cognitive lattice concept. The PVA is implemented based on the MPEG-21 Digital Item Adaptation framework. One of the challenges is how to quantify users' preferences on video content.

  1. Instructional Technology in Computer Science Education

    ERIC Educational Resources Information Center

    Jenny, Frederick J.

    2004-01-01

    The Web, the Internet, the intranet and associated resources, campus computer labs, smart classrooms, course management systems, and a plethora of software packages all offer opportunities for every classroom instructor to enrich in-class and out-of-class activities. Why should an instructor consider the integration of technology into their…

  2. Women Workers as Users of Computer Technology.

    ERIC Educational Resources Information Center

    Larwood, Laurie

    1992-01-01

    Discussion of expectations, trends, and implications of growth of computer technology and its effect on women workers argues that the experience of women is different from that of men in the nature of jobs in which women are found, their training and education, home-family conflict, and discrimination. The impact on women of increasing…

  3. Publishing a School Newspaper Using Computer Technology.

    ERIC Educational Resources Information Center

    Whitney, Jeanne; And Others

    By publishing a quarterly school and community newspaper, sixth, seventh, and eighth graders get involved in the writing of many types of articles, proofreading, communication skills, interviewing skills, investigative reporting, photography, artistic and graphic design, and computer technology. As the students work together on each issue of the…

  4. Business/Computer Technologies. State Competency Profile.

    ERIC Educational Resources Information Center

    Ohio State Univ., Columbus. Center on Education and Training for Employment.

    This document contains 272 competencies, grouped into 36 units, for tech prep programs in the business/computer technology cluster. The competencies were developed through collaboration of Ohio business, industry, and labor representatives and secondary and associate degree educators. The competencies are rated either "essential" (necessary to…

  5. Competency Index. [Business/Computer Technologies Cluster.

    ERIC Educational Resources Information Center

    Ohio State Univ., Columbus. Center on Education and Training for Employment.

    This index allows the user to scan the competencies under each title for the 28 subjects appropriate for use in a competency list for the 12 occupations within the business/computer technologies cluster. Titles of the 28 units are as follows: employability skills; professionalism; teamwork; professional and ethical standards; economic and business…

  6. Computation emerges from adaptive synchronization of networking neurons.

    PubMed

    Zanin, Massimiliano; Del Pozo, Francisco; Boccaletti, Stefano

    2011-01-01

    The activity of networking neurons is largely characterized by the alternation of synchronous and asynchronous spiking sequences. One of the most relevant challenges that scientists are facing today is, then, relating that evidence with the fundamental mechanisms through which the brain computes and processes information, as well as with the arousal (or progress) of a number of neurological illnesses. In other words, the problem is how to associate an organized dynamics of interacting neural assemblies to a computational task. Here we show that computation can be seen as a feature emerging from the collective dynamics of an ensemble of networking neurons, which interact by means of adaptive dynamical connections. Namely, by associating logical states to synchronous neurons' dynamics, we show how the usual Boolean logics can be fully recovered, and a universal Turing machine can be constructed. Furthermore, we show that, besides the static binary gates, a wider class of logical operations can be efficiently constructed as the fundamental computational elements interact within an adaptive network, each operation being represented by a specific motif. Our approach qualitatively differs from the past attempts to encode information and compute with complex systems, where computation was instead the consequence of the application of control loops enforcing a desired state into the specific system's dynamics. Being the result of an emergent process, the computation mechanism here described is not limited to a binary Boolean logic, but it can involve a much larger number of states. As such, our results can enlighten new concepts for the understanding of the real computing processes taking place in the brain. PMID:22073167

  7. Adaptation of Technological Pedagogical Content Knowledge Scale to Turkish

    ERIC Educational Resources Information Center

    Kaya, Zehra; Kaya, Osman Nafiz; Emre, Irfan

    2013-01-01

    The purpose of this study was to adapt "Survey of Pre-service Teachers' Knowledge of Teaching and Technology" in order to assess pre-service primary teachers' Technological Pedagogical Content Knowledge (TPACK) to Turkish. 407 pre-service primary teachers (227 female and 180 male) in their final semester in Education Faculties…

  8. Method and system for environmentally adaptive fault tolerant computing

    NASA Technical Reports Server (NTRS)

    Copenhaver, Jason L. (Inventor); Jeremy, Ramos (Inventor); Wolfe, Jeffrey M. (Inventor); Brenner, Dean (Inventor)

    2010-01-01

    A method and system for adapting fault tolerant computing. The method includes the steps of measuring an environmental condition representative of an environment. An on-board processing system's sensitivity to the measured environmental condition is measured. It is determined whether to reconfigure a fault tolerance of the on-board processing system based in part on the measured environmental condition. The fault tolerance of the on-board processing system may be reconfigured based in part on the measured environmental condition.
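
    The reconfiguration decision can be sketched as a simple mapping from a measured environmental condition to a redundancy mode. The flux thresholds and mode names below are invented for illustration; the patent does not specify these values.

```python
def select_fault_tolerance(radiation_flux, thresholds=(1e2, 1e4)):
    """Map a measured environmental condition to a fault-tolerance configuration."""
    low, high = thresholds
    if radiation_flux < low:
        return "simplex"          # benign environment: no redundancy, full throughput
    if radiation_flux < high:
        return "duplex-compare"   # moderate upset rate: duplicate and compare
    return "tmr"                  # harsh environment: triple modular redundancy
```

    Trading throughput for redundancy only when the environment demands it is the core idea: the system pays the cost of voting logic only while the measured condition warrants it.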

  9. An intelligent computational algorithm based on neural network for spatial data mining in adaptability evaluation

    NASA Astrophysics Data System (ADS)

    Miao, Zuohua; Xu, Hong; Chen, Yong; Zeng, Xiangyang

    2009-10-01

    The back-propagation neural network (BPNN) is an intelligent computational model based on learning from examples. This model differs from traditional adaptability evaluation by symbolic logic reasoning based on knowledge and rules. At the same time, the BPNN model has shortcomings such as slow convergence and entrapment in local minima. During adaptability evaluation, the factors are diverse, complicated, and uncertain, so an effective model should adopt data mining methods and fuzzy logic technology. In this paper, the authors improve the back-propagation procedure of the BPNN and apply fuzzy logic theory for dynamic inference of fuzzy rules. The authors also give a detailed description of the training and experiment process of the novel model.
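
    A minimal back-propagation network of the kind the abstract builds on can be sketched as below (sigmoid units, plain gradient descent on squared error; layer sizes and learning rate are illustrative, and the paper's fuzzy-inference extension is not included):

```python
import math
import random

class BPNN:
    """Minimal one-hidden-layer back-propagation network with sigmoid units."""

    def __init__(self, n_in, n_hid, rng=random.Random(0)):
        # each row holds n_in weights plus a trailing bias term
        self.w1 = [[rng.uniform(-1, 1) for _ in range(n_in + 1)] for _ in range(n_hid)]
        self.w2 = [rng.uniform(-1, 1) for _ in range(n_hid + 1)]

    @staticmethod
    def act(z):
        return 1.0 / (1.0 + math.exp(-z))

    def forward(self, x):
        h = [self.act(w[-1] + sum(wi * xi for wi, xi in zip(w, x))) for w in self.w1]
        y = self.act(self.w2[-1] + sum(w * hi for w, hi in zip(self.w2, h)))
        return h, y

    def train_step(self, x, t, lr=0.5):
        """One gradient-descent step on the squared error for pattern (x, t)."""
        h, y = self.forward(x)
        dy = (y - t) * y * (1 - y)                   # output-layer delta
        for j, hj in enumerate(h):
            dh = dy * self.w2[j] * hj * (1 - hj)     # hidden-layer delta (pre-update w2)
            self.w2[j] -= lr * dy * hj
            for i, xi in enumerate(x):
                self.w1[j][i] -= lr * dh * xi
            self.w1[j][-1] -= lr * dh                # hidden bias
        self.w2[-1] -= lr * dy                       # output bias
        return 0.5 * (y - t) ** 2
```

    Each call to train_step returns the current squared-error loss, which should fall as training proceeds; the slow convergence and local minima the abstract mentions are properties of exactly this update rule.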

  10. Adaptive kinetic-fluid solvers for heterogeneous computing architectures

    NASA Astrophysics Data System (ADS)

    Zabelok, Sergey; Arslanbekov, Robert; Kolobov, Vladimir

    2015-12-01

    We show feasibility and benefits of porting an adaptive multi-scale kinetic-fluid code to CPU-GPU systems. Challenges are due to the irregular data access for adaptive Cartesian mesh, vast difference of computational cost between kinetic and fluid cells, and desire to evenly load all CPUs and GPUs during grid adaptation and algorithm refinement. Our Unified Flow Solver (UFS) combines Adaptive Mesh Refinement (AMR) with automatic cell-by-cell selection of kinetic or fluid solvers based on continuum breakdown criteria. Using GPUs enables hybrid simulations of mixed rarefied-continuum flows with a million Boltzmann cells, each having a 24 × 24 × 24 velocity mesh. We describe the implementation of CUDA kernels for three modules in UFS: the direct Boltzmann solver using the discrete velocity method (DVM), the Direct Simulation Monte Carlo (DSMC) solver, and a mesoscopic solver based on the Lattice Boltzmann Method (LBM), all using adaptive Cartesian mesh. Double-digit speedups on single GPU and good scaling for multi-GPUs have been demonstrated.
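
    The cell-by-cell solver selection can be sketched with a gradient-length local Knudsen number as the continuum-breakdown criterion; the 0.05 cutoff is a commonly quoted illustrative value, not necessarily the one UFS uses.

```python
def select_solver(density, mean_free_path, grad_density, kn_crit=0.05):
    """Choose a kinetic or fluid solver for one cell from a local breakdown criterion.

    Uses the gradient-length local Knudsen number Kn = lambda * |grad rho| / rho:
    where gradients are steep relative to the mean free path, the continuum
    (fluid) description breaks down and the kinetic solver is required."""
    kn_local = mean_free_path * abs(grad_density) / max(density, 1e-300)
    return "kinetic" if kn_local > kn_crit else "fluid"
```

    The cost asymmetry the abstract highlights follows directly: a "kinetic" cell carries a full 24 × 24 × 24 velocity mesh, while a "fluid" cell stores only a handful of moments, so load balancing must account for which branch each cell takes.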

  11. Petascale Computing Enabling Technologies Project Final Report

    SciTech Connect

    de Supinski, B R

    2010-02-14

    The Petascale Computing Enabling Technologies (PCET) project addressed challenges arising from current trends in computer architecture that will lead to large-scale systems with many more nodes, each of which uses multicore chips. These factors will soon lead to systems that have over one million processors. Also, the use of multicore chips will lead to less memory and less memory bandwidth per core. We need fundamentally new algorithmic approaches to cope with these memory constraints and the huge number of processors. Further, correct, efficient code development is difficult even with the number of processors in current systems; more processors will only make it harder. The goal of PCET was to overcome these challenges by developing the computer science and mathematical underpinnings needed to realize the full potential of our future large-scale systems. Our research results will significantly increase the scientific output obtained from LLNL large-scale computing resources by improving application scientist productivity and system utilization. Our successes include scalable mathematical algorithms that adapt to these emerging architecture trends, code correctness and performance methodologies that automate critical aspects of application development, and the foundations for application-level fault tolerance techniques. PCET's scope encompassed several research thrusts in computer science and mathematics: code correctness and performance methodologies, scalable mathematics algorithms appropriate for multicore systems, and application-level fault tolerance techniques. Due to funding limitations, we focused primarily on the first two thrusts although our work also lays the foundation for the needed advances in fault tolerance. In the area of scalable mathematics algorithms, our preliminary work established that OpenMP performance of the AMG linear solver benchmark and important individual kernels on Atlas did not match the predictions of our

  12. Computational Fluid Dynamics Technology for Hypersonic Applications

    NASA Technical Reports Server (NTRS)

    Gnoffo, Peter A.

    2003-01-01

    Several current challenges in computational fluid dynamics and aerothermodynamics for hypersonic vehicle applications are discussed. Example simulations are presented from code validation and code benchmarking efforts to illustrate capabilities and limitations. Opportunities to advance the state-of-art in algorithms, grid generation and adaptation, and code validation are identified. Highlights of diverse efforts to address these challenges are then discussed. One such effort to re-engineer and synthesize the existing analysis capability in LAURA, VULCAN, and FUN3D will provide context for these discussions. The critical (and evolving) role of agile software engineering practice in the capability enhancement process is also noted.

  13. Enabling computational technologies for subsurface simulations

    SciTech Connect

    Falgout, R D

    1999-02-22

    We collaborated with Environmental Programs to develop and apply advanced computational methodologies for simulating multiphase flow through heterogeneous porous media. The primary focus was on developing a fast accurate advection scheme using a new temporal subcycling technique and on the scalable and efficient solution of the nonlinear Richards' equation used to model two-phase (variably saturated) flow. The resulting algorithms can be orders-of-magnitude faster than existing methods. Our computational technologies were applied to the simulation of subsurface fluid flow and chemical transport in the context of two important applications: water resource management and groundwater remediation.

  14. Advances in computed tomography imaging technology.

    PubMed

    Ginat, Daniel Thomas; Gupta, Rajiv

    2014-07-11

    Computed tomography (CT) is an essential tool in diagnostic imaging for evaluating many clinical conditions. In recent years, there have been several notable advances in CT technology that already have had or are expected to have a significant clinical impact, including extreme multidetector CT, iterative reconstruction algorithms, dual-energy CT, cone-beam CT, portable CT, and phase-contrast CT. These techniques and their clinical applications are reviewed and illustrated in this article. In addition, emerging technologies that address deficiencies in these modalities are discussed.

  15. Technologies for Achieving Field Ubiquitous Computing

    NASA Astrophysics Data System (ADS)

    Nagashima, Akira

    Although the term “ubiquitous” may sound like jargon used in information appliances, ubiquitous computing is an emerging concept in industrial automation. This paper presents the author's visions of field ubiquitous computing, which is based on the novel Internet Protocol IPv6. IPv6-based instrumentation will realize the next generation manufacturing excellence. This paper focuses on the following five key issues: 1. IPv6 standardization; 2. IPv6 interfaces embedded in field devices; 3. Compatibility with FOUNDATION fieldbus; 4. Network securities for field applications; and 5. Wireless technologies to complement IP instrumentation. Furthermore, the principles of digital plant operations and ubiquitous production to support the above key technologies to achieve field ubiquitous systems are discussed.

  16. Topology and grid adaption for high-speed flow computations

    NASA Technical Reports Server (NTRS)

    Abolhassani, Jamshid S.; Tiwari, Surendra N.

    1989-01-01

    This study investigates the effects of grid topology and grid adaptation on numerical solutions of the Navier-Stokes equations. In the first part of this study, a general procedure is presented for computation of high-speed flow over complex three-dimensional configurations. The flow field is simulated on the surface of a Butler wing in a uniform stream. Results are presented for Mach number 3.5 and a Reynolds number of 2,000,000. The O-type and H-type grids have been used for this study, and the results are compared with each other and with other theoretical and experimental results. The results demonstrate that while the H-type grid is suitable for the leading and trailing edges, a more accurate solution can be obtained for the middle part of the wing with an O-type grid. In the second part of this study, methods of grid adaption are reviewed and a method is developed with the capability of adapting to several variables. This method is based on a variational approach and is an algebraic method. Also, the method has been formulated in such a way that there is no need for any matrix inversion. This method is used in conjunction with the calculation of hypersonic flow over a blunt-nose body. A movie has been produced which shows simultaneously the transient behavior of the solution and the grid adaption.
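
    An algebraic, inversion-free redistribution of the kind described can be sketched in one dimension by equidistributing a weight (monitor) function: place nodes so that every cell carries an equal share of the weight integral. The trapezoid-rule details below are illustrative, not the paper's exact formulation, and the weight is assumed positive everywhere.

```python
import bisect

def equidistribute(x, w):
    """Algebraic grid adaptation: redistribute nodes so each cell carries an
    equal share of the integral of the weight function w (given at nodes x).
    No matrix inversion is needed -- only a cumulative sum and interpolation."""
    n = len(x)
    # cumulative integral of w by the trapezoid rule
    c = [0.0]
    for i in range(n - 1):
        c.append(c[-1] + 0.5 * (w[i] + w[i + 1]) * (x[i + 1] - x[i]))
    total = c[-1]
    new_x = [x[0]]
    for k in range(1, n - 1):
        target = total * k / (n - 1)          # equal share for the k-th node
        j = max(0, min(bisect.bisect_left(c, target) - 1, n - 2))
        frac = (target - c[j]) / (c[j + 1] - c[j])   # invert within cell j
        new_x.append(x[j] + frac * (x[j + 1] - x[j]))
    new_x.append(x[-1])
    return new_x
```

    With a uniform weight the grid is left unchanged; with weight concentrated near a feature (for example a shock), nodes cluster there, which is the adaptive behavior the abstract describes.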

  17. Adaptive scapula bone remodeling computational simulation: Relevance to regenerative medicine

    SciTech Connect

    Sharma, Gulshan B.; Robertson, Douglas D.

    2013-07-01

    Shoulder arthroplasty success has been attributed to many factors including bone quality, soft tissue balancing, surgeon experience, and implant design. Improved long-term success is primarily limited by glenoid implant loosening. Prosthesis design examines materials and shape and determines whether the design should withstand a lifetime of use. Finite element (FE) analyses have been extensively used to study stresses and strains produced in implants and bone. However, these static analyses only measure a moment in time and not the adaptive response to the altered environment produced by the therapeutic intervention. Computational analyses that integrate remodeling rules predict how bone will respond over time. Recent work has shown that subject-specific two- and three-dimensional adaptive bone remodeling models are feasible and valid. Feasibility and validation were achieved computationally, simulating bone remodeling using an intact human scapula, initially resetting the scapular bone material properties to be uniform, numerically simulating sequential loading, and comparing the bone remodeling simulation results to the actual scapula’s material properties. Three-dimensional scapula FE bone model was created using volumetric computed tomography images. Muscle and joint load and boundary conditions were applied based on values reported in the literature. Internal bone remodeling was based on element strain-energy density. Initially, all bone elements were assigned a homogeneous density. All loads were applied for 10 iterations. After every iteration, each bone element’s remodeling stimulus was compared to its corresponding reference stimulus and its material properties modified. The simulation achieved convergence. At the end of the simulation the predicted and actual specimen bone apparent density were plotted and compared. Location of high and low predicted bone density was comparable to the actual specimen. High predicted bone density was greater than
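
    The element-level remodeling rule (a strain-energy-density stimulus compared against a reference, with a dead zone in between) can be sketched as below. The reference stimulus, dead-zone width, step size, and density bounds are invented illustrative values, not the paper's.

```python
def remodel_step(density, sed, k_ref=0.004, step=0.05,
                 rho_min=0.01, rho_max=1.74):
    """One remodeling iteration over all elements.

    density -- apparent density of each element (g/cm^3, illustrative units)
    sed     -- strain-energy density of each element from the FE solve
    The stimulus U/rho is compared to a reference k_ref with a +/-35% dead zone."""
    new = []
    for rho, u in zip(density, sed):
        stimulus = u / rho
        if stimulus > 1.35 * k_ref:      # overloaded: deposit bone
            rho += step
        elif stimulus < 0.65 * k_ref:    # underloaded: resorb bone
            rho -= step
        # inside the dead zone the element is left unchanged
        new.append(min(rho_max, max(rho_min, rho)))
    return new
```

    Iterating FE solve and remodel_step until densities stop changing is the convergence loop the abstract describes; starting from a homogeneous density, load-bearing regions densify while shielded regions resorb.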

  18. Adaptive scapula bone remodeling computational simulation: Relevance to regenerative medicine

    NASA Astrophysics Data System (ADS)

    Sharma, Gulshan B.; Robertson, Douglas D.

    2013-07-01

Shoulder arthroplasty success has been attributed to many factors, including bone quality, soft tissue balancing, surgeon experience, and implant design. Improved long-term success is primarily limited by glenoid implant loosening. Prosthesis design examines materials and shape to determine whether the implant can withstand a lifetime of use. Finite element (FE) analyses have been used extensively to study the stresses and strains produced in implants and bone. However, these static analyses capture only a moment in time, not the adaptive response to the altered environment produced by the therapeutic intervention. Computational analyses that integrate remodeling rules predict how bone will respond over time. Recent work has shown that subject-specific two- and three-dimensional adaptive bone remodeling models are feasible and valid. Feasibility and validation were achieved computationally by simulating bone remodeling in an intact human scapula: the scapular bone material properties were initially reset to be uniform, sequential loading was simulated numerically, and the remodeling simulation results were compared to the actual scapula's material properties. A three-dimensional scapular FE bone model was created using volumetric computed tomography images. Muscle and joint loads and boundary conditions were applied based on values reported in the literature. Internal bone remodeling was based on element strain-energy density. Initially, all bone elements were assigned a homogeneous density. All loads were applied for 10 iterations. After every iteration, each bone element's remodeling stimulus was compared to its corresponding reference stimulus and its material properties were modified. The simulation achieved convergence. At the end of the simulation the predicted and actual specimen bone apparent densities were plotted and compared. The locations of high and low predicted bone density were comparable to the actual specimen. High predicted bone density was greater than actual…
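
The iterative scheme described above — uniform initial density, repeated loading, a per-element comparison of the strain-energy stimulus against a reference, then a material-property update — can be sketched as follows. This is an illustrative sketch, not the authors' code: the step size, dead-zone width, density bounds, and the fixed strain-energy input are hypothetical, and a real simulation would re-run the FE solve in every iteration.

```python
import numpy as np

def remodel(density, strain_energy, s_ref, lazy=0.25, step=0.05,
            rho_min=0.01, rho_max=1.8, n_iter=10):
    """Strain-energy-density-driven internal remodeling (sketch).

    Each element whose stimulus (SED per unit mass) exceeds the reference
    by more than the 'lazy' dead zone gains density; elements well below
    the reference lose density. All parameter values are hypothetical,
    and strain_energy is held fixed in place of a per-iteration FE solve.
    """
    rho = density.copy()
    for _ in range(n_iter):
        stimulus = strain_energy / rho                     # remodeling stimulus
        rho = np.where(stimulus > (1 + lazy) * s_ref, rho + step, rho)
        rho = np.where(stimulus < (1 - lazy) * s_ref, rho - step, rho)
        rho = np.clip(rho, rho_min, rho_max)               # physical bounds
    return rho
```

Elements in the dead zone keep their density, which is what lets such simulations converge rather than oscillate.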

  19. Computer Education and Instructional Technology Teacher Trainees' Opinions about Cloud Computing Technology

    ERIC Educational Resources Information Center

    Karamete, Aysen

    2015-01-01

    This study aims to show the present state of cloud computing usage in the department of Computer Education and Instructional Technology (CEIT) among teacher trainees in the School of Necatibey Education, Balikesir University, Turkey. In this study, a questionnaire with open-ended questions was used. 17 CEIT teacher trainees…

  20. Safeguards technology and computer security training

    SciTech Connect

    Hunteman, W.J.; Zack, N.R.

    1992-09-01

    The Los Alamos National Laboratory Safeguards Systems Group provides a variety of training services to the federal government and its contractors. The US Department of Energy sponsors a Safeguards Technology Training Program at Los Alamos in which seminars are offered on materials accounting for nuclear safeguards, measurement control for materials accounting, and variance propagation and systems analysis. These seminars provide guidance and techniques for accounting for nuclear material, developing and quantifying quality nuclear material measurements, and assessing overall accounting system performance. The Safeguards Systems Group also provides training in computer and data security applications, including a workshop on the Los Alamos Vulnerability/Risk Assessment System (LAVA), computer system security officer training, and nuclear material safeguards training for managers, all available on request. This paper describes the purpose, content, and expected benefits of the training activities, which can be applied at nuclear materials facilities or wherever there are computer and/or data security concerns.

  1. Adaptive quantum computation in changing environments using projective simulation

    PubMed Central

    Tiersch, M.; Ganahl, E. J.; Briegel, H. J.

    2015-01-01

    Quantum information processing devices need to be robust and stable against external noise and internal imperfections to ensure correct operation. In a setting of measurement-based quantum computation, we explore how an intelligent agent endowed with a projective simulator can act as controller to adapt measurement directions to an external stray field of unknown magnitude in a fixed direction. We assess the agent’s learning behavior in static and time-varying fields and explore composition strategies in the projective simulator to improve the agent’s performance. We demonstrate the applicability by correcting for stray fields in a measurement-based algorithm for Grover’s search. Thereby, we lay out a path for adaptive controllers based on intelligent agents for quantum information tasks. PMID:26260263
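
The control idea — an agent that adjusts measurement settings based on rewarded trial outcomes — can be illustrated with a minimal two-layer projective-simulation-style learner. The percept and action names, reward scheme, and parameter values below are hypothetical, not taken from the paper:

```python
import random

class PSAgent:
    """Minimal two-layer projective-simulation-style agent (sketch).

    Percepts connect to actions through h-values; actions are sampled
    with probability proportional to h, rewarded edges are strengthened,
    and damping toward the initial value lets the agent re-adapt when
    the environment (e.g. a stray field) changes over time.
    """
    def __init__(self, actions, gamma=0.05):
        self.actions = list(actions)
        self.gamma = gamma          # damping rate (hypothetical value)
        self.h = {}                 # (percept, action) -> h-value

    def act(self, percept):
        weights = [self.h.get((percept, a), 1.0) for a in self.actions]
        return random.choices(self.actions, weights=weights)[0]

    def learn(self, percept, action, reward):
        for key in list(self.h):    # damp all edges toward their initial value
            self.h[key] += -self.gamma * (self.h[key] - 1.0)
        key = (percept, action)
        self.h[key] = self.h.get(key, 1.0) + reward
```

Because damping continually erodes old h-values, a field that reverses direction mid-session is eventually unlearned and the agent converges on the new correction.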

  2. Adaptive quantum computation in changing environments using projective simulation

    NASA Astrophysics Data System (ADS)

    Tiersch, M.; Ganahl, E. J.; Briegel, H. J.

    2015-08-01

    Quantum information processing devices need to be robust and stable against external noise and internal imperfections to ensure correct operation. In a setting of measurement-based quantum computation, we explore how an intelligent agent endowed with a projective simulator can act as controller to adapt measurement directions to an external stray field of unknown magnitude in a fixed direction. We assess the agent’s learning behavior in static and time-varying fields and explore composition strategies in the projective simulator to improve the agent’s performance. We demonstrate the applicability by correcting for stray fields in a measurement-based algorithm for Grover’s search. Thereby, we lay out a path for adaptive controllers based on intelligent agents for quantum information tasks.

  3. Use of Computer Technology for English Language Learning: Do Learning Styles, Gender, and Age Matter?

    ERIC Educational Resources Information Center

    Lee, Cynthia; Yeung, Alexander Seeshing; Ip, Tiffany

    2016-01-01

    Computer technology provides spaces and locales for language learning. However, learning style preference and demographic variables may affect the effectiveness of technology use for a desired goal. Adapting Reid's pioneering Perceptual Learning Style Preference Questionnaire (PLSPQ), this study investigated the relations of university students'…

  4. Reviews of computing technology: Client-server technology

    SciTech Connect

    Johnson, S.M.

    1990-09-01

    One of the most frequently heard terms in the computer industry these days is "client-server." There is much misinformation available on the topic, and competitive pressures on software vendors have led to a great deal of hype with little in the way of supporting products. The purpose of this document is to explain what is meant by client-server applications, why the Advanced Technology and Architecture (ATA) section of the Information Resources Management (IRM) Department sees this emerging technology as key for computer applications during the next ten years, and what ATA sees as the existing standards and products available today. Because of the relative immaturity of existing client-server products, IRM is not yet recommending any specific client-server products, except those that are components of already-recommended data communications products or database management systems.

  5. Reviews of computing technology: Client-server technology

    SciTech Connect

    Johnson, S.M.

    1990-09-01

    One of the most frequently heard terms in the computer industry these days is "client-server." There is much misinformation available on the topic, and competitive pressures on software vendors have led to a great deal of hype with little in the way of supporting products. The purpose of this document is to explain what is meant by client-server applications, why the Advanced Technology and Architecture (ATA) section of the Information Resources Management (IRM) Department sees this emerging technology as key for computer applications during the next ten years, and what ATA sees as the existing standards and products available today. Because of the relative immaturity of existing client-server products, IRM is not yet recommending any specific client-server products, except those that are components of already-recommended data communications products or database management systems.

  6. Grid computing technology for hydrological applications

    NASA Astrophysics Data System (ADS)

    Lecca, G.; Petitdidier, M.; Hluchy, L.; Ivanovic, M.; Kussul, N.; Ray, N.; Thieron, V.

    2011-06-01

    Advances in e-Infrastructure promise to revolutionize sensing systems, the way in which data are collected and assimilated, and the way complex water systems are simulated and visualized. According to the EU Infrastructure 2010 work-programme, data and compute infrastructures and their underlying technologies, whether oriented toward scientific challenges or complex problem solving in engineering, are expected to converge into so-called knowledge infrastructures, leading to more effective research, education and innovation in the next decade and beyond. Grid technology is recognized as a fundamental component of e-Infrastructures. Nevertheless, this emerging paradigm raises several topics, including data management, algorithm optimization, security, performance (speed, throughput, bandwidth, etc.), and scientific cooperation and collaboration issues, that require further examination to fully exploit it and to better inform future research policies. The paper illustrates the results of six different surface and subsurface hydrology applications that have been deployed on the Grid. All the applications aim to answer strong requirements from civil society at large, relative to natural and anthropogenic risks. Grid technology has been successfully tested to improve flood prediction, groundwater resources management and Black Sea hydrological survey, by providing large computing resources. It is also shown that Grid technology facilitates e-cooperation among partners by means of services for authentication and authorization, seamless access to distributed data sources, data protection and access rights, and standardization.

  7. Computational Support for Technology- Investment Decisions

    NASA Technical Reports Server (NTRS)

    Adumitroaie, Virgil; Hua, Hook; Lincoln, William; Block, Gary; Mrozinski, Joseph; Shelton, Kacie; Weisbin, Charles; Elfes, Alberto; Smith, Jeffrey

    2007-01-01

    Strategic Assessment of Risk and Technology (START) is a user-friendly computer program that assists human managers in making decisions regarding research-and-development investment portfolios in the presence of uncertainties and of non-technological constraints that include budgetary and time limits, restrictions related to infrastructure, and programmatic and institutional priorities. START facilitates quantitative analysis of technologies, capabilities, missions, scenarios and programs, and thereby enables the selection and scheduling of value-optimal development efforts. START incorporates features that, variously, perform or support a unique combination of functions, most of which are not systematically performed or supported by prior decision-support software. These functions include the following: Optimal portfolio selection using an expected-utility-based assessment of capabilities and technologies; Temporal investment recommendations; Distinctions between enhancing and enabling capabilities; Analysis of partial funding for enhancing capabilities; and Sensitivity and uncertainty analysis. START can run on almost any computing hardware, within Linux and related operating systems that include Mac OS X versions 10.3 and later, and can run in Windows under the Cygwin environment. START can be distributed in binary code form. START calls, as external libraries, several open-source software packages. Output is in Excel (.xls) file format.
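
The core selection step — choosing, under a budget cap, the subset of development efforts with the highest expected utility — can be sketched as an exhaustive search. The field names and numbers below are illustrative, not START's actual data model or algorithm:

```python
from itertools import combinations

def best_portfolio(techs, budget):
    """Exhaustive expected-utility portfolio selection under a budget cap.

    Each candidate technology is a dict with hypothetical fields:
    'name', 'cost', 'p_success' (probability of success), 'utility'
    (value if successful). Feasible for small candidate lists only;
    larger problems call for knapsack DP or integer programming.
    """
    best, best_value = (), 0.0
    for r in range(len(techs) + 1):
        for subset in combinations(techs, r):
            cost = sum(t["cost"] for t in subset)
            if cost > budget:
                continue                          # infeasible portfolio
            value = sum(t["p_success"] * t["utility"] for t in subset)
            if value > best_value:
                best, best_value = subset, value
    return [t["name"] for t in best], best_value
```

For example, with a budget of 8, a cheap low-value option can displace an expensive one precisely because expected (probability-weighted) utility, not raw utility, is maximized.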

  8. Reviews of computing technology: Object-oriented technology

    SciTech Connect

    Skeen, D.C.

    1993-03-01

    A useful metaphor in introducing object-oriented concepts is the idea of a computer hardware manufacturer assembling products from an existing stock of electronic parts. In this analogy, think of the parts as pieces of computer software and of the finished products as computer applications. Like its counterpart, the object is capable of performing its specific function in a wide variety of different applications. The advantages to assembling hardware using a set of prebuilt parts are obvious. The design process is greatly simplified in this scenario, since the designer needs only to carry the design down to the chip level, rather than to the transistor level. As a result, the designer is free to develop a more reliable and feature-rich product. Also, since the component parts are reused in several different products, the parts can be made more robust and subjected to more rigorous testing than would be economically feasible for a part used in only one piece of equipment. Additionally, maintenance on the resulting systems is simplified because of the part-level consistency from one type of equipment to another. The remainder of this document introduces the techniques used to develop objects, the benefits of the technology, outstanding issues that remain with the technology, industry direction for the technology, and the impact that object-oriented technology is likely to have on the organization. While going through this material, the reader will find it useful to remember the parts analogy and to keep in mind that the overall purpose of object-oriented technology is to create software parts and to construct applications using those parts.

  9. Coarse-node computations with an adaptive node structure

    SciTech Connect

    Tzanos, C.P.

    1988-01-01

    The analysis with COMMIX of liquid metal reactor (LMR) intermediate heat exchanger (IHX) transients that are characterized by low flows, and especially imbalanced low flows, shows that if a coarse-node structure is used, the predicted temperatures are significantly different from those given by a fine-node structure. If a fine-node structure is used for problems that involve a large part of the plant, the computation time becomes excessive. This paper presents an improved version of an adaptive node structure. At this stage this version has been applied only to one-dimensional problems.

  10. Technology Needs for Teachers Web Development and Curriculum Adaptations

    NASA Technical Reports Server (NTRS)

    Carroll, Christy J.

    1999-01-01

    Computer-based mathematics and science curricula focusing on NASA inventions and technologies will enhance current teacher knowledge and skills. Materials and interactive software developed by educators will allow students to integrate their various courses, to work cooperatively, and to collaborate with both NASA scientists and students at other locations by using computer networks, email and the World Wide Web.

  11. Adaptive Mesh Refinement in Computational Astrophysics -- Methods and Applications

    NASA Astrophysics Data System (ADS)

    Balsara, D.

    2001-12-01

    The advent of robust, reliable and accurate higher order Godunov schemes for many of the systems of equations of interest in computational astrophysics has made it important to understand how to solve them in multi-scale fashion. This is so because the physics associated with astrophysical phenomena evolves in multi-scale fashion and we wish to arrive at a multi-scale simulation capability to represent the physics. Because astrophysical systems have magnetic fields, multi-scale magnetohydrodynamics (MHD) is of special interest. In this paper we first discuss general issues in adaptive mesh refinement (AMR). We then focus on the important issues in carrying out divergence-free AMR-MHD and catalogue the progress we have made in that area. We show that AMR methods lend themselves to easy parallelization. We then discuss applications of the RIEMANN framework for AMR-MHD to problems in computational astrophysics.

  12. Adaptive filtering image preprocessing for smart FPA technology

    NASA Astrophysics Data System (ADS)

    Brooks, Geoffrey W.

    1995-05-01

    This paper discusses two applications of adaptive filters for image processing on parallel architectures. The first, based on the results of previously accomplished work, summarizes the analyses of various adaptive filters implemented for pixel-level image prediction. FIR filters, fixed and adaptive IIR filters, and various variable step size algorithms were compared with a focus on algorithm complexity against the ability to predict future pixel values. A Gaussian smoothing operation with varying spatial and temporal constants was also applied for comparisons of random noise reduction. The second application is a suggestion to use memory-adaptive IIR filters for detecting and tracking motion within an image. Objects within an image are made of edges, or segments, with varying degrees of motion. An application has been previously published that describes FIR filters connecting pixels and using correlations to determine motion and direction. That implementation seems limited to detecting motion coinciding with the FIR filter operation rate and the associated harmonics. Upgrading the FIR structures with adaptive IIR structures can eliminate these limitations. These and any other pixel-level adaptive filtering applications require data memory for filter parameters and some basic computational capability. Tradeoffs have to be made between chip real estate and these desired features. System tradeoffs will also have to be made as to where it makes the most sense to do which level of processing. Although smart pixels may not be ready to implement adaptive filters, applications such as these should give the smart pixel designer some long range goals.
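
As a concrete instance of the pixel-level prediction filters compared above, here is a normalized-LMS adaptive FIR one-step predictor. The filter order and step size are arbitrary illustrative choices, not values from the paper:

```python
import numpy as np

def nlms_predict(signal, order=4, mu=0.5, eps=1e-8):
    """One-step-ahead prediction of a pixel's time series with a
    normalized-LMS adaptive FIR filter (sketch; order and mu are
    illustrative). Returns per-step predictions and prediction errors.
    """
    w = np.zeros(order)                     # adaptive filter taps
    preds, errs = [], []
    for n in range(order, len(signal)):
        x = signal[n - order:n][::-1]       # most recent sample first
        y = w @ x                           # prediction of signal[n]
        e = signal[n] - y                   # prediction error
        w = w + mu * e * x / (x @ x + eps)  # normalized step update
        preds.append(y)
        errs.append(e)
    return np.array(preds), np.array(errs)
```

On a stationary input the error shrinks geometrically; the prediction error itself is the quantity such smart-FPA schemes would threshold to flag changing (moving) pixels.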

  13. Experience with automatic, dynamic load balancing and adaptive finite element computation

    SciTech Connect

    Wheat, S.R.; Devine, K.D.; Maccabe, A.B.

    1993-10-01

    Distributed memory, Massively Parallel (MP), MIMD technology has enabled the development of applications requiring computational resources previously unobtainable. Structural mechanics and fluid dynamics applications, for example, are often solved by finite element methods (FEMs) requiring millions of degrees of freedom to accurately simulate physical phenomena. Adaptive methods, which automatically refine or coarsen meshes and vary the order of accuracy of the numerical solution, offer greater robustness and computational efficiency than traditional FEMs by reducing the amount of computation required away from physical structures such as shock waves and boundary layers. On MP computers, FEMs frequently result in distributed processor load imbalances. To overcome load imbalance, many MP FEMs use static load balancing as a preprocessor to the finite element calculation. Adaptive methods complicate the load imbalance problem since the work per element is not uniform across the solution domain and changes as the computation proceeds. Therefore, dynamic load balancing is required to maintain global load balance. We describe a dynamic, fine-grained, element-based data migration system that maintains global load balance and is effective in the presence of changing work loads. Global load balance is achieved by overlapping neighborhoods of processors, where each neighborhood performs local load balancing. The method utilizes an automatic element management system library to which a programmer integrates the application's computational description. The library's flexibility supports a large class of finite element and finite difference based applications.
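
The overlapping-neighborhood idea — global balance emerging purely from local exchanges — can be caricatured with a diffusion scheme on a ring of processors. The topology, rate, and update schedule below are illustrative, not the library's actual algorithm:

```python
def balance(loads, rounds=50, alpha=0.5):
    """Diffusion-style local load balancing on a ring of processors.

    Each overlapping neighborhood (p, p+1) shifts a fraction of the load
    difference toward the lighter member; repeated local exchanges drive
    the global distribution toward uniform. alpha and the ring topology
    are illustrative choices.
    """
    loads = list(loads)
    n = len(loads)
    for _ in range(rounds):
        for p in range(n):
            q = (p + 1) % n                            # right neighbour
            delta = alpha * (loads[p] - loads[q]) / 2.0
            loads[p] -= delta                          # migrate work
            loads[q] += delta
    return loads
```

Because neighborhoods overlap (each processor belongs to two pairs), work migrates around the whole ring even though no processor ever sees global state — the property the abstract highlights.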

  14. Adaptive E-Learning Environments: Research Dimensions and Technological Approaches

    ERIC Educational Resources Information Center

    Di Bitonto, Pierpaolo; Roselli, Teresa; Rossano, Veronica; Sinatra, Maria

    2013-01-01

    One of the most closely investigated topics in e-learning research has always been the effectiveness of adaptive learning environments. The technological evolutions that have dramatically changed the educational world in the last six decades have allowed ever more advanced and smarter solutions to be proposed. The focus of this paper is to depict…

  15. Adaptive wall technology for minimization of wall interferences in transonic wind tunnels

    NASA Technical Reports Server (NTRS)

    Wolf, Stephen W. D.

    1988-01-01

    Modern experimental techniques to improve free air simulations in transonic wind tunnels by use of adaptive wall technology are reviewed. Considered are the significant advantages of adaptive wall testing techniques with respect to wall interferences, Reynolds number, tunnel drive power, and flow quality. The application of these testing techniques relies on making the test section boundaries adjustable and using a rapid wall adjustment procedure. A historical overview shows how the disjointed development of these testing techniques, since 1938, is closely linked to available computer support. An overview of Adaptive Wall Test Section (AWTS) designs shows a preference for use of relatively simple designs with solid adaptive walls in 2- and 3-D testing. Operational aspects of AWTS's are discussed with regard to production type operation where adaptive wall adjustments need to be quick. Both 2- and 3-D data are presented to illustrate the quality of AWTS data over the transonic speed range. Adaptive wall technology is available for general use in 2-D testing, even in cryogenic wind tunnels. In 3-D testing, more refinement of the adaptive wall testing techniques is required before more widespread use can be planned.

  16. PCCM2: A GCM adapted for scalable parallel computers

    SciTech Connect

    Drake, J.; Semeraro, B.D.; Worley, P.; Foster, I.; Michalakes, J.; Toonen, B.; Hack, J.J.; Williamson, D.L.

    1994-01-01

    The Computer Hardware, Advanced Mathematics and Model Physics (CHAMMP) program seeks to provide climate researchers with an advanced modeling capability for the study of global change issues. One of the more ambitious projects being undertaken in the CHAMMP program is the development of PCCM2, an adaptation of the Community Climate Model (CCM2) for scalable parallel computers. PCCM2 uses a message-passing, domain-decomposition approach, in which each processor is allocated responsibility for computation on one part of the computational grid, and messages are generated to communicate data between processors. Much of the research effort associated with development of a parallel code of this sort is concerned with identifying efficient decomposition and communication strategies. In PCCM2, this task is complicated by the need to support both semi-Lagrangian transport and spectral transport. Load balancing and parallel I/O techniques are also required. In this paper, the authors review the various parallel algorithms used in PCCM2 and the work done to arrive at a validated model.
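
The domain-decomposition message pattern — each processor owning a slab of the grid and exchanging edge values with its neighbours — can be sketched for a 1-D periodic grid. Message passing is simulated here with direct array copies; PCCM2's real decomposition is multi-dimensional and must also serve the spectral and semi-Lagrangian transports:

```python
def exchange_halos(subdomains):
    """One halo-exchange step for a 1-D periodic domain decomposition.

    Each subdomain is a list [ghost, interior..., ghost]; after the
    exchange, each rank's ghost cells hold its neighbours' edge values,
    so a local stencil update needs no further communication.
    """
    n = len(subdomains)
    for r, u in enumerate(subdomains):
        u[0] = subdomains[(r - 1) % n][-2]   # left ghost <- left neighbour's edge
        u[-1] = subdomains[(r + 1) % n][1]   # right ghost <- right neighbour's edge
    return subdomains
```

In a real message-passing code each assignment becomes a send/receive pair, and overlapping these messages with interior computation is one of the efficiency concerns the abstract mentions.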

  17. Toward unsupervised adaptation of LDA for brain-computer interfaces.

    PubMed

    Vidaurre, C; Kawanabe, M; von Bünau, P; Blankertz, B; Müller, K R

    2011-03-01

    Brain-computer interface (BCI) users experience a step of significant difficulty when going from the calibration recording to the feedback application. This effect has been studied previously and a supervised adaptation solution has been proposed. In this paper, we suggest a simple unsupervised adaptation method for the linear discriminant analysis (LDA) classifier that effectively solves this problem by counteracting the harmful effect of nonclass-related nonstationarities in electroencephalography (EEG) during BCI sessions performed with motor imagery tasks. For this, we first introduce three types of adaptation procedures and investigate them in an offline study with 19 datasets. Then, we select one of the proposed methods and analyze it further. The chosen classifier is tested offline on data from 80 healthy users and four high spinal cord injury patients. Finally, for the first time in the BCI literature, we apply this unsupervised classifier in online experiments. Additionally, we show that its performance is significantly better than the state-of-the-art supervised approach.
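
The flavor of such an unsupervised update — tracking a drifting global feature mean with unlabeled trials so the classifier's decision boundary stays centered, no labels required — can be sketched as follows. This is a simplification of the pooled-mean idea; the names, the adaptation rate, and the fixed weight vector are illustrative:

```python
import numpy as np

class AdaptiveLDA:
    """LDA-style classifier whose bias tracks the running pooled mean
    of unlabeled trials (sketch of an unsupervised adaptation scheme).

    The weight vector w stays fixed from calibration; only the running
    global mean mu is updated, which shifts the decision boundary to
    follow nonclass-related drifts in the features.
    """
    def __init__(self, w, mu_global, eta=0.05):
        self.w = w              # fixed weights from the calibration session
        self.mu = mu_global     # running pooled feature mean
        self.eta = eta          # adaptation rate (illustrative value)

    def classify(self, x):
        # Unsupervised: update the pooled mean from the trial itself.
        self.mu = (1 - self.eta) * self.mu + self.eta * x
        return 1 if self.w @ (x - self.mu) > 0 else -1
```

Because only the class-independent mean is estimated, no feedback labels are needed during the session, which is exactly what makes the scheme unsupervised.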

  18. Improving student retention in computer engineering technology

    NASA Astrophysics Data System (ADS)

    Pierozinski, Russell Ivan

    The purpose of this research project was to improve student retention in the Computer Engineering Technology program at the Northern Alberta Institute of Technology by reducing the number of dropouts and increasing the graduation rate. This action research project utilized a mixed methods approach of a survey and face-to-face interviews. The participants were male and female, with a large majority ranging from 18 to 21 years of age. The research found that participants recognized their skills and capability, but their capacity to remain in the program was dependent on understanding and meeting the demanding pace and rigour of the program. The participants recognized that curriculum delivery along with instructor-student interaction had an impact on student retention. To be successful in the program, students required support in four domains: academic, learning management, career, and social.

  19. Teaching with Technology: The Classroom Manager. Cost-Conscious Computing.

    ERIC Educational Resources Information Center

    Smith, Rhea; And Others

    1992-01-01

    Teachers discuss how to make the most of technology in the classroom during a tight economy. Ideas include recycling computer printer ribbons, buying replacement batteries for computer power supply packs, upgrading via software, and soliciting donated computer equipment. (SM)

  20. Evaluating Computer Technology Integration in a Centralized School System

    ERIC Educational Resources Information Center

    Eteokleous, N.

    2008-01-01

    The study evaluated the current situation in Cyprus elementary classrooms regarding computer technology integration in an attempt to identify ways of expanding teachers' and students' experiences with computer technology. It examined how Cypriot elementary teachers use computers, and the factors that influence computer integration in their…

  1. A Multiscale Computational Framework to Understand Vascular Adaptation

    PubMed Central

    Garbey, Marc; Rahman, Mahbubur; Berceli, Scott A.

    2015-01-01

    The failure rate for vascular interventions (vein bypass grafting, arterial angioplasty/stenting) remains unacceptably high. Over the past two decades, researchers have applied a wide variety of approaches to investigate the primary failure mechanisms, neointimal hyperplasia and aberrant remodeling of the wall, in an effort to identify novel therapeutic strategies. Despite incremental progress, specific cause/effect linkages among the primary drivers of the pathology (hemodynamic factors, inflammatory biochemical mediators, cellular effectors) and the vascular occlusive phenotype remain lacking. We propose a multiscale computational framework of vascular adaptation to develop a bridge between theory and experimental observation and to provide a method for the systematic testing of relevant clinical hypotheses. The cornerstone of our model is a feedback mechanism between environmental conditions and dynamic tissue plasticity described at the cellular level with an agent-based model. Our implementation (i) is modular, (ii) starts from basic mechano-biology principles at the cell level and (iii) facilitates the agile development of the model. PMID:25977733

  2. Electronic Quality of Life Assessment Using Computer-Adaptive Testing

    PubMed Central

    2016-01-01

    Background: Quality of life (QoL) questionnaires are desirable for clinical practice but can be time-consuming to administer and interpret, making their widespread adoption difficult. Objective: Our aim was to assess the performance of the World Health Organization Quality of Life (WHOQOL)-100 questionnaire as four item banks to facilitate adaptive testing using simulated computer adaptive tests (CATs) for physical, psychological, social, and environmental QoL. Methods: We used data from the UK WHOQOL-100 questionnaire (N=320) to calibrate item banks using item response theory, which included psychometric assessments of differential item functioning, local dependency, unidimensionality, and reliability. We simulated CATs to assess the number of items administered before prespecified levels of reliability were met. Results: The item banks (40 items) all displayed good model fit (P>.01) and were unidimensional (fewer than 5% of t tests significant), reliable (Person Separation Index>.70), and free from differential item functioning (no significant analysis of variance interaction) or local dependency (residual correlations < +.20). When matched for reliability, the item banks were between 45% and 75% shorter than paper-based WHOQOL measures. Across the four domains, a high standard of reliability (alpha>.90) could be gained with a median of 9 items. Conclusions: Using CAT, simulated assessments were as reliable as paper-based forms of the WHOQOL with a fraction of the number of items. These properties suggest that these item banks are suitable for computerized adaptive assessment. These item banks have the potential for international development using existing alternative language versions of the WHOQOL items. PMID:27694100
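
The adaptive loop being simulated — administer the most informative remaining item, update the ability estimate, stop once reliability suffices — can be sketched with a grid-based two-parameter logistic (2PL) model. The item parameters and stopping threshold below are illustrative, not the WHOQOL calibration:

```python
import numpy as np

def simulate_cat(a, b, responses, se_stop=0.33):
    """Simulated computer adaptive test over a 2PL item bank (sketch).

    a, b: discrimination/difficulty arrays; responses[i] is the fixed
    0/1 answer the simulee would give to item i. Ability is tracked as
    a posterior on a grid (EAP); testing stops when the posterior SD
    drops below se_stop or the bank is exhausted.
    """
    theta = np.linspace(-4, 4, 161)
    post = np.exp(-0.5 * theta ** 2)              # N(0, 1) prior
    post /= post.sum()
    remaining = list(range(len(a)))
    administered = []
    while remaining:
        est = (theta * post).sum()                # current EAP estimate
        info = [a[i] ** 2 * p * (1 - p)           # Fisher information at est
                for i in remaining
                for p in [1 / (1 + np.exp(-a[i] * (est - b[i])))]]
        i = remaining.pop(int(np.argmax(info)))   # most informative item
        administered.append(i)
        p = 1 / (1 + np.exp(-a[i] * (theta - b[i])))
        post *= p if responses[i] else (1 - p)    # Bayesian update
        post /= post.sum()
        mean = (theta * post).sum()
        sd = np.sqrt(((theta - mean) ** 2 * post).sum())
        if sd < se_stop:                          # reliability reached
            break
    return (theta * post).sum(), administered
```

The item-saving behavior reported in the abstract comes from this stopping rule: respondents far from an item bank's difficulty range reach the precision threshold after very few items.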

  3. Configurable multiplier modules for an adaptive computing system

    NASA Astrophysics Data System (ADS)

    Pfänder, O. A.; Pfleiderer, H.-J.; Lachowicz, S. W.

    2006-09-01

    The importance of reconfigurable hardware is increasing steadily. For example, the primary approach of using adaptive systems based on programmable gate arrays and configurable routing resources has gone mainstream and high-performance programmable logic devices are rivaling traditional application-specific hardwired integrated circuits. Also, the idea of moving from the 2-D domain into a 3-D design which stacks several active layers above each other is gaining momentum in research and industry, to cope with the demand for smaller devices with a higher scale of integration. However, optimized arithmetic blocks in coarse-grain reconfigurable arrays as well as field-programmable architectures still play an important role. In countless digital systems and signal processing applications, the multiplication is one of the critical challenges, where in many cases a trade-off between area usage and data throughput has to be made. But the a priori choice of word-length and number representation can also be replaced by a dynamic choice at run-time, in order to improve flexibility, area efficiency and the level of parallelism in computation. In this contribution, we look at an adaptive computing system called 3-D-SoftChip to point out what parameters are crucial to implement flexible multiplier blocks into optimized elements for accelerated processing. The 3-D-SoftChip architecture uses a novel approach to 3-dimensional integration based on flip-chip bonding with indium bumps. The modular construction, the introduction of interfaces to realize the exchange of intermediate data, and the reconfigurable sign handling approach will be explained, as well as a beneficial way to handle and distribute the numerous required control signals.
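
The run-time word-length choice can be made concrete with a software model of a shift-add multiplier whose operand width is selected per call. This is a behavioral sketch (unsigned only), not the 3-D-SoftChip datapath:

```python
def shift_add_mul(a, b, width):
    """Unsigned shift-add multiplication at a run-time-selectable word
    length (behavioral sketch of a configurable multiplier block).

    Operands are masked to `width` bits, mirroring a hardware block
    reconfigured for that operand size; the product occupies 2*width bits.
    """
    mask = (1 << width) - 1
    a &= mask
    b &= mask
    acc = 0
    for i in range(width):          # one partial product per multiplier bit
        if (b >> i) & 1:
            acc += a << i
    return acc & ((1 << (2 * width)) - 1)
```

In hardware the same trade-off appears as the number of partial-product rows instantiated: a narrower configured width saves area and latency at the cost of dynamic range.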

  4. Learners' Perceptions and Illusions of Adaptivity in Computer-Based Learning Environments

    ERIC Educational Resources Information Center

    Vandewaetere, Mieke; Vandercruysse, Sylke; Clarebout, Geraldine

    2012-01-01

    Research on computer-based adaptive learning environments has shown exemplary growth. Although the mechanisms of effective adaptive instruction are unraveled systematically, little is known about the relative effect of learners' perceptions of adaptivity in adaptive learning environments. As previous research has demonstrated that the learners'…

  5. Computer-aided design and computer science technology

    NASA Technical Reports Server (NTRS)

    Fulton, R. E.; Voigt, S. J.

    1976-01-01

    A description is presented of computer-aided design requirements and the resulting computer science advances needed to support aerospace design. The aerospace design environment is examined, taking into account problems of data handling and aspects of computer hardware and software. The interactive terminal is normally the primary interface between the computer system and the engineering designer. Attention is given to user aids, interactive design, interactive computations, the characteristics of design information, data management requirements, hardware advancements, and computer science developments.

  6. Adaptive optics technology for high-resolution retinal imaging.

    PubMed

    Lombardo, Marco; Serrao, Sebastiano; Devaney, Nicholas; Parravano, Mariacristina; Lombardo, Giuseppe

    2012-12-27

    Adaptive optics (AO) is a technology used to improve the performance of optical systems by reducing the effects of optical aberrations. The direct visualization of the photoreceptor cells, capillaries and nerve fiber bundles represents the major benefit of adding AO to retinal imaging. Adaptive optics is opening a new frontier for clinical research in ophthalmology, providing new information on the early pathological changes of the retinal microstructures in various retinal diseases. We have reviewed AO technology for retinal imaging, providing information on the core components of an AO retinal camera. The most commonly used wavefront sensing and correcting elements are discussed. Furthermore, we discuss current applications of AO imaging to a population of healthy adults and to the most frequent causes of blindness, including diabetic retinopathy, age-related macular degeneration and glaucoma. We conclude our work with a discussion on future clinical prospects for AO retinal imaging.
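
The control core of such a camera — turning wavefront-sensor slope measurements into deformable-mirror commands — is commonly a least-squares reconstruction, sketched here. The toy interaction matrix and the absence of loop-gain/temporal filtering are simplifications:

```python
import numpy as np

def dm_commands(interaction_matrix, slopes, rcond=1e-3):
    """One adaptive-optics control step (sketch): least-squares fit of
    deformable-mirror commands to measured wavefront-sensor slopes via
    the pseudo-inverse of a calibrated interaction matrix.
    """
    reconstructor = np.linalg.pinv(interaction_matrix, rcond=rcond)
    return -reconstructor @ slopes   # negative feedback cancels the aberration
```

The interaction matrix is measured during calibration by poking each actuator and recording the resulting sensor slopes; `rcond` truncates poorly sensed modes so noise is not amplified into the mirror.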

  7. Adaptive Optics Technology for High-Resolution Retinal Imaging

    PubMed Central

    Lombardo, Marco; Serrao, Sebastiano; Devaney, Nicholas; Parravano, Mariacristina; Lombardo, Giuseppe

    2013-01-01

    Adaptive optics (AO) is a technology used to improve the performance of optical systems by reducing the effects of optical aberrations. The direct visualization of the photoreceptor cells, capillaries and nerve fiber bundles represents the major benefit of adding AO to retinal imaging. Adaptive optics is opening a new frontier for clinical research in ophthalmology, providing new information on the early pathological changes of the retinal microstructures in various retinal diseases. We have reviewed AO technology for retinal imaging, providing information on the core components of an AO retinal camera. The most commonly used wavefront sensing and correcting elements are discussed. Furthermore, we discuss current applications of AO imaging to a population of healthy adults and to the most frequent causes of blindness, including diabetic retinopathy, age-related macular degeneration and glaucoma. We conclude our work with a discussion on future clinical prospects for AO retinal imaging. PMID:23271600

  8. Computer vision research with new imaging technology

    NASA Astrophysics Data System (ADS)

    Hou, Guangqi; Liu, Fei; Sun, Zhenan

    2015-12-01

    Light field imaging is capable of capturing dense multi-view 2D images in one snapshot, recording both the intensity values and the directions of rays simultaneously. As an emerging 3D device, the light field camera has been widely used in digital refocusing, depth estimation, stereoscopic display, etc. Traditional multi-view stereo (MVS) methods perform well only on strongly textured surfaces; on textureless or low-textured regions the depth map contains numerous holes and large ambiguities. In this paper, we apply light field imaging technology to 3D face modeling in computer vision. Based on a 3D morphable model, we estimate the pose parameters from facial feature points. Then the depth map is estimated through the epipolar plane image (EPI) method. Finally, a high-quality 3D face model is recovered via a fusion strategy. We evaluate the effectiveness and robustness on face images captured by a light field camera at different poses.
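
    The EPI step the abstract mentions rests on a simple fact: a scene point traces a straight line through the epipolar plane image, and the line's slope is the disparity (depth is proportional to its reciprocal). A toy gradient-based slope estimator, not the authors' method:

```python
import numpy as np

def epi_disparity(epi):
    """Estimate disparity (pixel shift per view) from one epipolar plane
    image, a 2D slice (n_views, n_pixels) of the light field. Along a
    scene point's trace I(s, u) = f(u - d*s), so dI/ds = -d * dI/du, and
    a least-squares fit over all pixels recovers d. Depth is then
    proportional to 1/d. Toy estimator for intuition only."""
    gs, gu = np.gradient(epi.astype(float))
    return -(gs * gu).sum() / (gu * gu).sum()
```

    Practical EPI methods estimate the slope locally (e.g., via the structure tensor) rather than with one global fit, which is what handles depth discontinuities.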

  9. Adopt or Adapt: Sanitation Technology Choices in Urbanizing Malawi.

    PubMed

    Chunga, Richard M; Ensink, Jeroen H J; Jenkins, Marion W; Brown, Joe

    2016-01-01

    This paper presents the results of a mixed-methods study examining adaptation strategies that property owners in low-income, rapidly urbanizing areas in Malawi adopt to address the limitations of pit latrines, the most common method of disposing of human excreta. A particular challenge is lack of space for constructing new latrines as population density increases: traditional practice has been to cap full pits and simply move to a new site, but increasing demands on space require new approaches to extend the service life of latrines. In this context, we collected data on sanitation technology choices from January to September 2013 through 48 in-depth interviews and a stated preference survey targeting 1,300 property owners from 27 low-income urban areas. Results showed that property owners with concern about space for replacing pit latrines were 1.8 times more likely to select a pit emptying service over the construction of new pit latrines with a slab floor (p = 0.02), but there was no significant association between concern about space for replacing pit latrines and intention to adopt a locally promoted, novel sanitation technology known as ecological sanitation (ecosan). Property owners preferred to adapt existing, known technology by constructing replacement pit latrines on old pit latrine locations, reducing the frequency of replacing pit latrines, or emptying pit latrines when full. This study highlights potential challenges to adoption of wholly new sanitation technologies, even when they present clear advantages to end users. To achieve scale, alternative sanitation technologies for rapidly urbanizing cities should offer clear advantages, be affordable, be easy to use when shared among multiple households, and be informed in their design by existing adaptation strategies and local knowledge.
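
    The "1.8 times more likely" result is an odds-type comparison; with entirely made-up counts (not the study's data) the arithmetic looks like:

```python
def odds_ratio(concern_yes_empty, concern_yes_new, concern_no_empty, concern_no_new):
    """Odds ratio from a 2x2 table of counts: owners with/without space
    concerns choosing pit emptying vs a new slab-floor latrine. The
    counts used below are hypothetical, purely to show the arithmetic."""
    odds_concern = concern_yes_empty / concern_yes_new
    odds_no_concern = concern_no_empty / concern_no_new
    return odds_concern / odds_no_concern

# Hypothetical counts chosen so the ratio comes out at 1.8:
OR = odds_ratio(90, 50, 50, 50)  # -> 1.8
```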

  10. Adopt or Adapt: Sanitation Technology Choices in Urbanizing Malawi

    PubMed Central

    Chunga, Richard M.; Ensink, Jeroen H. J.; Jenkins, Marion W.; Brown, Joe

    2016-01-01

    This paper presents the results of a mixed-methods study examining adaptation strategies that property owners in low-income, rapidly urbanizing areas in Malawi adopt to address the limitations of pit latrines, the most common method of disposing of human excreta. A particular challenge is lack of space for constructing new latrines as population density increases: traditional practice has been to cap full pits and simply move to a new site, but increasing demands on space require new approaches to extend the service life of latrines. In this context, we collected data on sanitation technology choices from January to September 2013 through 48 in-depth interviews and a stated preference survey targeting 1,300 property owners from 27 low-income urban areas. Results showed that property owners with concern about space for replacing pit latrines were 1.8 times more likely to select a pit emptying service over the construction of new pit latrines with a slab floor (p = 0.02), but there was no significant association between concern about space for replacing pit latrines and intention to adopt a locally promoted, novel sanitation technology known as ecological sanitation (ecosan). Property owners preferred to adapt existing, known technology by constructing replacement pit latrines on old pit latrine locations, reducing the frequency of replacing pit latrines, or emptying pit latrines when full. This study highlights potential challenges to adoption of wholly new sanitation technologies, even when they present clear advantages to end users. To achieve scale, alternative sanitation technologies for rapidly urbanizing cities should offer clear advantages, be affordable, be easy to use when shared among multiple households, and be informed in their design by existing adaptation strategies and local knowledge. PMID:27532871

  12. Role of computer technology in neurosurgery.

    PubMed

    Abdelwahab, M G; Cavalcanti, D D; Preul, M C

    2010-08-01

    In the clinical office, during surgical planning, or in the operating room, neurosurgeons have been surrounded by the digital world, either recreating old tools or introducing new ones. Technological refinements, chiefly based on the use of computer systems, have altered the modus operandi of neurosurgery. In the emergency room or in the office, patient data are entered, digitally dictated, or gathered from electronic medical records. Images from every modality can be examined on a Picture Archiving and Communication System (PACS) or viewed remotely on cell phones. Surgical planning is based on high-resolution reconstructions, and microsurgical or radiosurgical approaches can be assessed precisely using stereotaxy. Tumor resection, abscess or hematoma evacuation, or the management of vascular lesions can be assisted intraoperatively by new imaging resources integrated into the surgical microscope. Mathematical models can predict how a lesion may recur as well as how often a particular patient should be followed. Finally, virtual reality is being developed as a training tool for residents and surgeons by preoperatively simulating complex surgical scenarios. Altogether, each level of patient care has been affected by digital technology, helping to enhance the safety of procedures and thereby improve the outcomes of patients undergoing neurosurgery.

  13. Overview of deformable mirror technologies for adaptive optics and astronomy

    NASA Astrophysics Data System (ADS)

    Madec, P.-Y.

    2012-07-01

    From the burning mirrors reputedly used during the siege of Syracuse to set fire to the Romans’ ships to more contemporary piezoelectric deformable mirrors widely used in astronomy, from very large voice coil deformable mirrors considered in future Extremely Large Telescopes to very small and compact ones embedded in Multi Object Adaptive Optics systems, this paper aims at giving an overview of Deformable Mirror technology for Adaptive Optics and Astronomy. First, the main drivers for the design of Deformable Mirrors are recalled, related not only to atmospheric aberration compensation but also to environmental conditions and mechanical constraints. Then the different technologies available today for the manufacturing of Deformable Mirrors are described, with their pros and cons analyzed. A review of the Companies and Institutes with capabilities in delivering Deformable Mirrors to astronomers is presented, as well as lessons learned from the past 25 years of technological development and operation on sky. In conclusion, perspectives are tentatively drawn regarding the future of Deformable Mirror technology for Astronomy.

  14. Computer Graphics. Curriculum Guide for Technology Education.

    ERIC Educational Resources Information Center

    Craft, Clyde O.

    This curriculum guide for a 1-quarter or 1-semester course in computer graphics is designed to be used with Apple II computers. Some of the topics covered include the following: computer graphics terminology and applications, operating Apple computers, graphics programming in BASIC using various programs and commands, computer graphics painting,…

  15. Computer Utilization in Industrial Arts/Technology Education. Curriculum Guide.

    ERIC Educational Resources Information Center

    Connecticut Industrial Arts Association.

    This guide is intended to assist industrial arts/technology education teachers in helping students in grades K-12 understand the impact of computers and computer technology in the world. Discussed in the introductory sections are the ways in which computers have changed the face of business, industry, and education and training; the scope and…

  16. Computer Science and Technology Publications. NBS Publications List 84.

    ERIC Educational Resources Information Center

    National Bureau of Standards (DOC), Washington, DC. Inst. for Computer Sciences and Technology.

    This bibliography lists publications of the Institute for Computer Sciences and Technology of the National Bureau of Standards. Publications are listed by subject in the areas of computer security, computer networking, and automation technology. Sections list publications of: (1) current Federal Information Processing Standards; (2) computer…

  17. The Invisible Barrier to Integrating Computer Technology in Education

    ERIC Educational Resources Information Center

    Aflalo, Ester

    2014-01-01

    The article explores contradictions in teachers' perceptions regarding the place of computer technologies in education. The research population included 47 teachers who have incorporated computers in the classroom for several years. The teachers expressed positive attitudes regarding the decisive importance of computer technologies in furthering…

  18. Using Type II Computer Network Technology To Reach Distance Students.

    ERIC Educational Resources Information Center

    Eastmond, Dan; Granger, Dan

    1998-01-01

    This article, in a series on computer technology and distance education, focuses on "Type II Technology," courses using textbooks and course guides for primary delivery, but enhancing them with computer conferencing as the main vehicle of instructional communication. Discusses technology proficiency, maximizing learning in conferencing…

  19. Computing, Information, and Communications Technology (CICT) Program Overview

    NASA Technical Reports Server (NTRS)

    VanDalsem, William R.

    2003-01-01

    The Computing, Information and Communications Technology (CICT) Program's goal is to enable NASA's Scientific Research, Space Exploration, and Aerospace Technology Missions with greater mission assurance, for less cost, and with increased science return through the development and use of advanced computing, information and communication technologies.

  20. Computer Technology and Academic Skill Training for Improving Disabled Students' Academic Performance: Applications and Limitations.

    ERIC Educational Resources Information Center

    Severs, Mary K.

    The Educational Center for Disabled Students at the University of Nebraska-Lincoln is designed to improve the academic performance and attitudes toward success of disabled students through computer technology and academic skills training. Adaptive equipment interventions take into account keyboard access and screen and voice output. Non-adaptive…

  1. Adaptive finite element simulation of flow and transport applications on parallel computers

    NASA Astrophysics Data System (ADS)

    Kirk, Benjamin Shelton

    The subject of this work is the adaptive finite element simulation of problems arising in flow and transport applications on parallel computers. Of particular interest are new contributions to adaptive mesh refinement (AMR) in this parallel high-performance context, including novel work on data structures, treatment of constraints in a parallel setting, generality and extensibility via object-oriented programming, and the design/implementation of a flexible software framework. This technology and software capability then enables more robust, reliable treatment of multiscale/multiphysics problems and specific studies of fine-scale interaction such as those in biological chemotaxis (Chapter 4) and high-speed shock physics for compressible flows (Chapter 5). The work begins by presenting an overview of key concepts and data structures employed in AMR simulations. Of particular interest is how these concepts are applied in the physics-independent software framework which is developed here and is the basis for all the numerical simulations performed in this work. This open-source software framework has been adopted by a number of researchers in the U.S. and abroad for use in a wide range of applications. The dynamic nature of adaptive simulations poses particular issues for efficient implementation on distributed-memory parallel architectures. Communication cost, computational load balance, and memory requirements must all be considered when developing adaptive software for this class of machines. Specific extensions to the adaptive data structures to enable implementation on parallel computers are therefore considered in detail. The libMesh framework for performing adaptive finite element simulations on parallel computers is developed to provide a concrete implementation of the above ideas. This physics-independent framework is applied to two distinct flow and transport application classes in the subsequent application studies to illustrate the flexibility of the framework.
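
    The flag-and-refine cycle at the heart of AMR can be sketched in a few lines for a 1D mesh; this is a generic illustration of statistical element flagging with a jump-in-gradient indicator, not libMesh code:

```python
import numpy as np

def adapt(nodes, f, frac=0.3, n_cycles=5):
    """Hypothetical 1D h-refinement loop: bisect the elements whose
    jump-in-gradient error indicator is largest (statistical flagging,
    in the spirit of Kelly-type indicators). 'f' is the field being
    resolved; all parameter names here are illustrative."""
    for _ in range(n_cycles):
        x = np.sort(nodes)
        mid = 0.5 * (x[:-1] + x[1:])
        slopes = np.diff(f(x)) / np.diff(x)
        # Indicator: jump in the piecewise-linear gradient at interior
        # nodes, attributed to the two adjacent elements.
        jumps = np.abs(np.diff(slopes))
        eta = np.zeros(len(x) - 1)
        eta[:-1] += jumps
        eta[1:] += jumps
        refine = eta >= np.quantile(eta, 1.0 - frac)
        nodes = np.concatenate([x, mid[refine]])
    return np.sort(nodes)
```

    Running this on a field with a sharp internal layer (e.g., a steep tanh) concentrates small elements around the layer while the smooth regions keep the coarse mesh, which is the behavior the parallel data structures in the dissertation must support at scale.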

  2. Computer-adaptive test to measure community reintegration of Veterans.

    PubMed

    Resnik, Linda; Tian, Feng; Ni, Pengsheng; Jette, Alan

    2012-01-01

    The Community Reintegration of Injured Service Members (CRIS) measure consists of three scales measuring extent of, perceived limitations in, and satisfaction with community reintegration. Length of the CRIS may be a barrier to its widespread use. Using item response theory (IRT) and computer-adaptive test (CAT) methodologies, this study developed and evaluated a briefer community reintegration measure called the CRIS-CAT. Large item banks for each CRIS scale were constructed. A convenience sample of 517 Veterans responded to all items. Exploratory and confirmatory factor analyses (CFAs) were used to identify the dimensionality within each domain, and IRT methods were used to calibrate items. Accuracy and precision of CATs of different lengths were compared with the full-item bank, and data were examined for differential item functioning (DIF). CFAs supported unidimensionality of scales. Acceptable item fit statistics were found for final models. Accuracy of 10-, 15-, 20-, and variable-item CATs for all three scales was 0.88 or above. CAT precision increased with number of items administered and decreased at the upper ranges of each scale. Three items exhibited moderate DIF by sex. The CRIS-CAT demonstrated promising measurement properties and is recommended for use in community reintegration assessment. PMID:22773259
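
    The CAT machinery described (IRT calibration, then adaptive administration) reduces at each step to: pick the unadministered item with maximum Fisher information at the current ability estimate, then re-estimate ability. A minimal two-parameter-logistic (2PL) sketch, illustrative only and not the CRIS-CAT implementation:

```python
import numpy as np

def p2pl(theta, a, b):
    """2PL probability of a correct response."""
    return 1.0 / (1.0 + np.exp(-a * (theta - b)))

def cat_step(theta_hat, a, b, asked):
    """Index of the most informative unasked item at theta_hat."""
    p = p2pl(theta_hat, a, b)
    info = a**2 * p * (1 - p)        # Fisher information per item
    info[asked] = -np.inf            # don't re-administer items
    return int(np.argmax(info))

def eap(responses, a, b, grid=np.linspace(-4, 4, 161)):
    """Expected a posteriori ability estimate, standard normal prior.
    'responses' is a list of (item_index, 0/1) pairs."""
    like = np.ones_like(grid)
    for i, u in responses:
        p = p2pl(grid, a[i], b[i])
        like *= p if u else (1 - p)
    post = like * np.exp(-grid**2 / 2)
    return float((grid * post).sum() / post.sum())
```

    The abstract's observation that CAT precision drops at the upper ranges of each scale falls out of this directly: once the remaining items' difficulties sit far below the examinee's ability, every available item carries little information.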

  4. Adaptive Offset Correction for Intracortical Brain Computer Interfaces

    PubMed Central

    Homer, Mark L.; Perge, János A.; Black, Michael J.; Harrison, Matthew T.; Cash, Sydney S.; Hochberg, Leigh R.

    2014-01-01

    Intracortical brain computer interfaces (iBCIs) decode intended movement from neural activity for the control of external devices such as a robotic arm. Standard approaches include a calibration phase to estimate decoding parameters. During iBCI operation, the statistical properties of the neural activity can depart from those observed during calibration, sometimes hindering a user’s ability to control the iBCI. To address this problem, we adaptively correct the offset terms within a Kalman filter decoder via penalized maximum likelihood estimation. The approach can handle rapid shifts in neural signal behavior (on the order of seconds) and requires no knowledge of the intended movement. The algorithm, called MOCA, was tested using simulated neural activity and evaluated retrospectively using data collected from two people with tetraplegia operating an iBCI. In 19 clinical research test cases, where a nonadaptive Kalman filter yielded relatively high decoding errors, MOCA significantly reduced these errors (10.6 ±10.1%; p<0.05, pairwise t-test). MOCA did not significantly change the error in the remaining 23 cases where a nonadaptive Kalman filter already performed well. These results suggest that MOCA provides more robust decoding than the standard Kalman filter for iBCIs. PMID:24196868

  5. Adaptive offset correction for intracortical brain-computer interfaces.

    PubMed

    Homer, Mark L; Perge, Janos A; Black, Michael J; Harrison, Matthew T; Cash, Sydney S; Hochberg, Leigh R

    2014-03-01

    Intracortical brain-computer interfaces (iBCIs) decode intended movement from neural activity for the control of external devices such as a robotic arm. Standard approaches include a calibration phase to estimate decoding parameters. During iBCI operation, the statistical properties of the neural activity can depart from those observed during calibration, sometimes hindering a user's ability to control the iBCI. To address this problem, we adaptively correct the offset terms within a Kalman filter decoder via penalized maximum likelihood estimation. The approach can handle rapid shifts in neural signal behavior (on the order of seconds) and requires no knowledge of the intended movement. The algorithm, called the multiple offset correction algorithm (MOCA), was tested using simulated neural activity and evaluated retrospectively using data collected from two people with tetraplegia operating an iBCI. In 19 clinical research test cases, where a nonadaptive Kalman filter yielded relatively high decoding errors, MOCA significantly reduced these errors (10.6 ± 10.1%; p < 0.05, pairwise t-test). MOCA did not significantly change the error in the remaining 23 cases where a nonadaptive Kalman filter already performed well. These results suggest that MOCA provides more robust decoding than the standard Kalman filter for iBCIs.
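
    The core idea of re-estimating a Kalman decoder's offset online can be caricatured in a scalar filter where the measurement offset is tracked by an exponentially weighted (ridge-like) innovation average. This is a simplified stand-in, not the published MOCA algorithm; every name and constant below is illustrative:

```python
import numpy as np

def kalman_adaptive_offset(y, A=1.0, H=1.0, Q=0.01, R=0.1,
                           lam=0.95, x0=0.0, P0=1.0):
    """Scalar Kalman filter whose measurement offset c is re-estimated
    online from an exponentially weighted average of the innovations --
    a toy analogue of adaptive offset correction, not MOCA itself."""
    x, P, c = x0, P0, 0.0
    out = []
    for yt in y:
        x, P = A * x, A * P * A + Q            # predict
        v = yt - (H * x + c)                   # innovation uses offset
        S = H * P * H + R
        K = P * H / S
        x, P = x + K * v, (1 - K * H) * P      # update
        c = lam * c + (1 - lam) * v            # shrinkage offset update
        out.append(x)
    return np.array(out), c
```

    The point of the shrinkage term is the same as in the abstract: a persistent innovation bias signals a shifted baseline, and absorbing it into the offset keeps the state estimate (and hence decoded movement) usable without recalibration.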

  6. Cloud Computing. Technology Briefing. Number 1

    ERIC Educational Resources Information Center

    Alberta Education, 2013

    2013-01-01

    Cloud computing is Internet-based computing in which shared resources, software and information are delivered as a service that computers or mobile devices can access on demand. Cloud computing is already used extensively in education. Free or low-cost cloud-based services are used daily by learners and educators to support learning, social…

  7. The Steam Engine and the Computer: What Makes Technology Revolutionary.

    ERIC Educational Resources Information Center

    Simon, Herbert A.

    1987-01-01

    This discussion of technological revolution focuses on the computer and its uses in education. Contrasts between human traits, such as insight and creativity, and computer capabilities are discussed; the computer as its own instructional device is described; and possible educational changes resulting from computers are addressed. (LRW)

  8. Adaptive Fault Tolerance for Many-Core Based Space-Borne Computing

    NASA Technical Reports Server (NTRS)

    James, Mark; Springer, Paul; Zima, Hans

    2010-01-01

    This paper describes an approach to providing software fault tolerance for future deep-space robotic NASA missions, which will require a high degree of autonomy supported by an enhanced on-board computational capability. Such systems have become possible as a result of the emerging many-core technology, which is expected to offer 1024-core chips by 2015. We discuss the challenges and opportunities of this new technology, focusing on introspection-based adaptive fault tolerance that takes into account the specific requirements of applications, guided by a fault model. Introspection supports runtime monitoring of the program execution with the goal of identifying, locating, and analyzing errors. Fault tolerance assertions for the introspection system can be provided by the user, domain-specific knowledge, or via the results of static or dynamic program analysis. This work is part of an on-going project at the Jet Propulsion Laboratory in Pasadena, California.
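
    Introspection-based monitoring of the kind described, with user-supplied fault-tolerance assertions checked during execution, can be sketched as follows (class and method names are hypothetical, not the JPL system):

```python
class Introspector:
    """Toy runtime monitor in the spirit of the abstract: user-supplied
    fault-tolerance assertions are checked as the computation runs, and
    violations are logged with enough context to locate the error."""

    def __init__(self):
        self.assertions = []   # (name, predicate) pairs
        self.faults = []       # (step, name, state snapshot) records

    def watch(self, name, predicate):
        """Register an assertion over the observed program state."""
        self.assertions.append((name, predicate))

    def check(self, state, step):
        """Evaluate all assertions; record any violation. Returns True
        while no fault has been observed."""
        for name, pred in self.assertions:
            if not pred(state):
                self.faults.append((step, name, dict(state)))
        return not self.faults
```

    A recovery layer would consume `faults` to trigger rollback or reconfiguration; here the monitor only identifies and locates the error, which is the introspection step the paper focuses on.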

  9. 3D-SoftChip: A Novel Architecture for Next-Generation Adaptive Computing Systems

    NASA Astrophysics Data System (ADS)

    Kim, Chul; Rassau, Alex; Lachowicz, Stefan; Lee, Mike Myung-Ok; Eshraghian, Kamran

    2006-12-01

    This paper introduces a novel architecture for next-generation adaptive computing systems, which we term 3D-SoftChip. The 3D-SoftChip is a 3-dimensional (3D) vertically integrated adaptive computing system combining state-of-the-art processing and 3D interconnection technology. It comprises the vertical integration of two chips (a configurable array processor and an intelligent configurable switch) through an indium bump interconnection array (IBIA). The configurable array processor (CAP) is an array of heterogeneous processing elements (PEs), while the intelligent configurable switch (ICS) comprises a switch block, a 32-bit dedicated RISC processor for control, on-chip program/data memory, a data frame buffer, and a direct memory access (DMA) controller. The intended application domain is real-time communication and multimedia signal processing. The paper further describes the advanced HW/SW codesign and verification methodology, including high-level system modeling of the 3D-SoftChip using SystemC, used to determine the optimum hardware specification in the early design stage.

  10. Attitudes to Technology, Perceived Computer Self-Efficacy and Computer Anxiety as Predictors of Computer Supported Education

    ERIC Educational Resources Information Center

    Celik, Vehbi; Yesilyurt, Etem

    2013-01-01

    There is a large body of research regarding computer supported education, perceptions of computer self-efficacy, computer anxiety and the technological attitudes of teachers and teacher candidates. However, no study has been conducted on the correlation between and effect of computer supported education, perceived computer self-efficacy, computer…

  11. Computer-based technological applications in psychotherapy training.

    PubMed

    Berger, Thomas

    2004-03-01

    Despite obvious and documented advantages of technological applications in education, psychotherapy training based on information technology is either still rare or limited to technical innovations such as videotape recordings. This article outlines opportunities new computer-based learning technology create for psychotherapy training. First, approaches that include computer-mediated communication between trainees and teachers/supervisors are presented. Then, computer-based learning technology for self-study purposes is discussed in the context of educational approaches to learning. Computer-based tools that have been developed and evaluated in the context of psychotherapy training are described. Evaluations of the tools are discussed.

  12. Applied Computer Technology in Cree and Naskapi Language Programs.

    ERIC Educational Resources Information Center

    Jancewicz, Bill; MacKenzie, Marguerite

    2002-01-01

    Discusses the parameters for the application of computer technology in Cree and Naskapi language programs, and shows that the deliberate and structured introduction of these technologies to indigenous language programs can facilitate indigenous language stabilization and development. (Author/VWL)

  13. Identifying Differential Item Functioning in Multi-Stage Computer Adaptive Testing

    ERIC Educational Resources Information Center

    Gierl, Mark J.; Lai, Hollis; Li, Johnson

    2013-01-01

    The purpose of this study is to evaluate the performance of CATSIB (Computer Adaptive Testing-Simultaneous Item Bias Test) for detecting differential item functioning (DIF) when items in the matching and studied subtest are administered adaptively in the context of a realistic multi-stage adaptive test (MST). MST was simulated using a 4-item…
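
    DIF detection with matching is easiest to see in the Mantel-Haenszel common odds ratio over matched score strata; CATSIB layers a regression correction on top of this idea, but the simpler statistic (shown with made-up counts in the test) conveys the mechanics:

```python
def mh_odds_ratio(right_ref, wrong_ref, right_foc, wrong_foc):
    """Mantel-Haenszel common odds ratio across matched score strata --
    a standard DIF statistic, simpler than CATSIB, shown for intuition.
    Each argument is a per-stratum list of correct/incorrect counts for
    the reference (ref) and focal (foc) groups."""
    num = den = 0.0
    for a, b, c, d in zip(right_ref, wrong_ref, right_foc, wrong_foc):
        n = a + b + c + d
        num += a * d / n
        den += b * c / n
    return num / den
```

    A ratio near 1 indicates no DIF after matching; values well above or below 1 flag items favoring the reference or focal group. In an MST, as the study notes, the matching variable itself comes from adaptively administered items, which is what complicates detection.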

  14. A Wafer Transfer Technology for MEMS Adaptive Optics

    NASA Technical Reports Server (NTRS)

    Yang, Eui-Hyeok; Wiberg, Dean V.

    2001-01-01

    Adaptive optics systems require the combination of several advanced technologies such as precision optics, wavefront sensors, deformable mirrors, and lasers with high-speed control systems. The deformable mirror with a continuous membrane is a key component of these systems. This paper describes a new technique for transferring an entire wafer-level silicon membrane from one substrate to another. This technology is developed for the fabrication of a compact deformable mirror with a continuous facet. A 1 (mu)m thick silicon membrane, 100 mm in diameter, has been successfully transferred without using adhesives or polymers (i.e. wax, epoxy, or photoresist). Smaller or larger diameter membranes can also be transferred using this technique. The fabricated actuator membrane with an electrode gap of 1.5 (mu)m shows a vertical deflection of 0.37 (mu)m at 55 V.
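
    As a sanity check on the quoted figures (1.5 µm electrode gap, 0.37 µm deflection at 55 V), a crude parallel-plate model gives the electrostatic pressure on the membrane; note the quoted deflection stays below the classic gap/3 pull-in limit. This is back-of-envelope physics, not the paper's model:

```python
EPS0 = 8.854e-12  # vacuum permittivity, F/m

def electrostatic_pressure(volts, gap_m, deflection_m=0.0):
    """Parallel-plate estimate of the electrostatic pressure (Pa) on a
    membrane actuator at a given deflection; a crude approximation that
    ignores membrane curvature and fringing fields."""
    g = gap_m - deflection_m
    return EPS0 * volts**2 / (2.0 * g**2)

# Figures from the abstract: 55 V across a 1.5 um gap, deflected 0.37 um.
P = electrostatic_pressure(55.0, 1.5e-6, 0.37e-6)   # on the order of 1e4 Pa
PULL_IN_DEFLECTION = 1.5e-6 / 3.0                    # parallel-plate limit, 0.5 um
```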

  15. Wireless Adaptive Therapeutic TeleGaming in a Pervasive Computing Environment

    NASA Astrophysics Data System (ADS)

    Peters, James F.; Szturm, Tony; Borkowski, Maciej; Lockery, Dan; Ramanna, Sheela; Shay, Barbara

    This chapter introduces a wireless, pervasive computing approach to adaptive therapeutic telegaming considered in the context of near set theory. Near set theory provides a formal basis for observation, comparison and classification of perceptual granules. A perceptual granule is defined by a collection of objects that are graspable by the senses or by the mind. In the proposed pervasive computing approach to telegaming, a handicapped person (e.g., a stroke patient with limited hand, finger, or arm function) plays a video game by interacting with familiar instrumented objects such as cups, cutlery, soccer balls, nozzles, screw-top lids, and spoons, so that the technology that makes therapeutic exercise game-playing possible is largely invisible (Archives of Physical Medicine and Rehabilitation 89:2213-2217, 2008). The basic approach to adaptive learning (AL) in the proposed telegaming environment is ethology-inspired and is quite different from the traditional approach to reinforcement learning. In biologically-inspired learning, organisms learn to achieve some goal by durable modification of behaviours in response to signals from the environment resulting from specific experiences (Animal Behavior, 1995). The term adaptive is used here in an ethological sense, where learning by an organism results from modifying behaviour in response to perceived changes in the environment. To instill adaptivity in a video game, it is assumed that learning by a video game is episodic. During an episode, the behaviour of a player is measured indirectly by tracking the occurrence of gaming events such as a hit or a miss of a target (e.g., hitting a moving ball with a game paddle). An ethogram provides a record of behaviour feature values that forms the basis of a functional registry used to adapt gaming to handicapped players. An important practical application of adaptive gaming is therapeutic rehabilitation exercise carried out in parallel with playing action video games. Enjoyable and
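
    Episodic adaptivity of the sort described, logging hit/miss events during an episode and then adjusting difficulty toward a target success rate, can be sketched as follows (all game parameters here are hypothetical, not the authors' system):

```python
from collections import deque

class AdaptivePaddle:
    """Toy episodic adaptivity: the game logs hit/miss events (a minimal
    'ethogram') during an episode and, at episode end, nudges ball speed
    so the player's hit rate tracks a therapeutic target."""

    def __init__(self, speed=1.0, target=0.7, window=20, step=0.05):
        self.speed, self.target, self.step = speed, target, step
        self.events = deque(maxlen=window)   # recent hit/miss record

    def record(self, hit):
        self.events.append(bool(hit))

    def end_episode(self):
        """Adapt difficulty from the episode's observed behaviour."""
        if self.events:
            rate = sum(self.events) / len(self.events)
            factor = (1 + self.step) if rate > self.target else (1 - self.step)
            self.speed *= factor
        return self.speed
```

    Keeping the success rate near a target is what makes the exercise both therapeutic and enjoyable: too easy and the movement practice is wasted, too hard and the player disengages.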

  16. Numerical Technology for Large-Scale Computational Electromagnetics

    SciTech Connect

    Sharpe, R; Champagne, N; White, D; Stowell, M; Adams, R

    2003-01-30

    The key bottleneck of implicit computational electromagnetics tools for large complex geometries is the solution of the resulting linear system of equations. The goal of this effort was to research and develop critical numerical technology that alleviates this bottleneck for large-scale computational electromagnetics (CEM). The mathematical operators and numerical formulations used in this arena of CEM yield linear equations that are complex valued, unstructured, and indefinite. Also, simultaneously applying multiple mathematical modeling formulations to different portions of a complex problem (hybrid formulations) results in a mixed structure linear system, further increasing the computational difficulty. Typically, these hybrid linear systems are solved using a direct solution method, which was acceptable for Cray-class machines but does not scale adequately for ASCI-class machines. Additionally, LLNL's previously existing linear solvers were not well suited for the linear systems that are created by hybrid implicit CEM codes. Hence, a new approach was required to make effective use of ASCI-class computing platforms and to enable the next generation design capabilities. Multiple approaches were investigated, including the latest sparse-direct methods developed by our ASCI collaborators. In addition, approaches that combine domain decomposition (or matrix partitioning) with general-purpose iterative methods and special purpose pre-conditioners were investigated. Special-purpose pre-conditioners that take advantage of the structure of the matrix were adapted and developed based on intimate knowledge of the matrix properties. Finally, new operator formulations were developed that radically improve the conditioning of the resulting linear systems thus greatly reducing solution time. The goal was to enable the solution of CEM problems that are 10 to 100 times larger than our previous capability.
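The combination the abstract describes (a general-purpose iterative method plus a special-purpose preconditioner for a complex-valued, unstructured system) can be illustrated with off-the-shelf tools. This is a generic SciPy sketch with a synthetic matrix, not LLNL's solver technology; the matrix construction and tolerances are assumptions for the demonstration.

```python
# Sketch: preconditioned iterative solution of a complex-valued, unstructured
# sparse linear system, in the spirit of the CEM solver work described above.
# The synthetic matrix below is an assumption, not a CEM discretization.
import numpy as np
import scipy.sparse as sp
import scipy.sparse.linalg as spla

rng = np.random.default_rng(0)
n = 200

# Random sparse complex matrix; a diagonal shift keeps the example well-posed.
A = sp.random(n, n, density=0.02, random_state=0)
A = A + 1j * sp.random(n, n, density=0.02, random_state=1)
A = A + sp.diags(np.full(n, 4.0 + 1.0j))

b = rng.standard_normal(n) + 1j * rng.standard_normal(n)

# Incomplete-LU factorization used as a special-purpose preconditioner.
ilu = spla.spilu(A.tocsc(), drop_tol=1e-4)
M = spla.LinearOperator((n, n), matvec=ilu.solve, dtype=complex)

# GMRES: a general-purpose Krylov iterative method for indefinite systems.
x, info = spla.gmres(A, b, M=M)
residual = np.linalg.norm(A @ x - b) / np.linalg.norm(b)
```

With a good preconditioner the iteration count stays small; without one, GMRES on an indefinite complex system can stagnate, which is exactly the bottleneck the project targeted.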

  17. The Adaptive Aerosol Delivery (AAD) technology: Past, present, and future.

    PubMed

    Denyer, John; Dyche, Tony

    2010-04-01

    Conventional aerosol delivery systems and the availability of new technologies have led to the development of "intelligent" nebulizers such as the I-neb Adaptive Aerosol Delivery (AAD) System. Based on the AAD technology, the I-neb AAD System has been designed to continuously adapt to changes in the patient's breathing pattern, and to pulse aerosol only during the inspiratory part of the breathing cycle. This eliminates waste of aerosol during exhalation, and creates a foundation for precise aerosol (dose) delivery. To facilitate the delivery of precise metered doses of aerosol to the patient, a unique metering chamber design has been developed. Through the vibrating mesh technology, the metering chamber design, and the AAD Disc function, the aerosol output rate and metered (delivered) dose can be tailored to the demands of the specific drug to be delivered. In the I-neb AAD System, aerosol delivery is guided through two algorithms, one for the Tidal Breathing Mode (TBM), and one for slow and deep inhalations, the Target Inhalation Mode (TIM). The aim of TIM is to reduce the treatment time by increasing the total inhalation time per minute, and to increase lung deposition by reducing impaction in the upper airways through slow and deep inhalations. A key feature of the AAD technology is the patient feedback mechanisms that are provided to guide the patient on delivery performance. These feedback signals, which include visual, audible, and tactile forms, are configured in a feedback cascade that leads to a high level of compliance with the use of the I-neb AAD System. The I-neb Insight and the Patient Logging System facilitate a further degree of sophistication to the feedback mechanisms, by providing information on long term adherence and compliance data. These can be assessed by patients and clinicians via a Web-based delivery of information in the form of customized graphical analyses.

  18. The Adaptive Aerosol Delivery (AAD) Technology: Past, Present, and Future

    PubMed Central

    Dyche, Tony

    2010-01-01

    Abstract Conventional aerosol delivery systems and the availability of new technologies have led to the development of “intelligent” nebulizers such as the I-neb Adaptive Aerosol Delivery (AAD) System. Based on the AAD technology, the I-neb AAD System has been designed to continuously adapt to changes in the patient's breathing pattern, and to pulse aerosol only during the inspiratory part of the breathing cycle. This eliminates waste of aerosol during exhalation, and creates a foundation for precise aerosol (dose) delivery. To facilitate the delivery of precise metered doses of aerosol to the patient, a unique metering chamber design has been developed. Through the vibrating mesh technology, the metering chamber design, and the AAD Disc function, the aerosol output rate and metered (delivered) dose can be tailored to the demands of the specific drug to be delivered. In the I-neb AAD System, aerosol delivery is guided through two algorithms, one for the Tidal Breathing Mode (TBM), and one for slow and deep inhalations, the Target Inhalation Mode (TIM). The aim of TIM is to reduce the treatment time by increasing the total inhalation time per minute, and to increase lung deposition by reducing impaction in the upper airways through slow and deep inhalations. A key feature of the AAD technology is the patient feedback mechanisms that are provided to guide the patient on delivery performance. These feedback signals, which include visual, audible, and tactile forms, are configured in a feedback cascade that leads to a high level of compliance with the use of the I-neb AAD System. The I-neb Insight and the Patient Logging System facilitate a further degree of sophistication to the feedback mechanisms, by providing information on long term adherence and compliance data. These can be assessed by patients and clinicians via a Web-based delivery of information in the form of customized graphical analyses. PMID:20373904

  19. Computers--Teaching, Technology, and Applications.

    ERIC Educational Resources Information Center

    Cocco, Anthony M.; And Others

    1995-01-01

    Includes "Managing Personality Types in the Computer Classroom" (Cocco); "External I/O Input/Output with a PC" (Fryda); "The Future of CAD/CAM Computer-Assisted Design/Computer-Assisted Manufacturing Software" (Fulton); and "Teaching Quality Assurance--A Laboratory Approach" (Wojslaw). (SK)

  20. Condition Driven Adaptive Music Generation for Computer Games

    NASA Astrophysics Data System (ADS)

    Naushad, Alamgir; Muhammad, Tufail

    2013-02-01

    The video game industry has grown into a worldwide, multi-billion-dollar business. Background music tends to adapt to the specific game content throughout the length of play. Adaptive music can be taken further by looking at particular conditions in the game: a given condition drives the generation of specific background music that best fits the active game content throughout the gameplay. This paper outlines the use of condition-driven adaptive music generation to incorporate adaptivity dynamically into computer games.

  1. Preschool Children. Tech Use Guide: Using Computer Technology.

    ERIC Educational Resources Information Center

    Council for Exceptional Children, Reston, VA. Center for Special Education Technology.

    One of nine brief guides for special educators on using computer technology, this guide focuses on uses with preschool children with mild to severe disabilities. Especially noted is the ability of the computer to provide access to environmental experiences otherwise inaccessible to the young handicapped child. Appropriate technology for…

  2. Guide for Teachers. Tech Use Guide: Using Computer Technology.

    ERIC Educational Resources Information Center

    Council for Exceptional Children, Reston, VA. Center for Special Education Technology.

    One of nine brief guides for special educators on using computer technology, this guide is specifically directed to special education teachers and encourages them to become critical consumers of technology to ensure its most effective use. Brief descriptions of the various uses of the computer in the school setting--as an instructional tool, as an…

  3. CICT Computing, Information, and Communications Technology Program

    NASA Technical Reports Server (NTRS)

    Laufenberg, Lawrence; Tu, Eugene (Technical Monitor)

    2002-01-01

    The CICT Program is part of the NASA Aerospace Technology Enterprise's fundamental technology thrust to develop tools, processes, and technologies that enable new aerospace system capabilities and missions. The CICT Program's four key objectives are to: provide seamless access to NASA resources (including ground-, air-, and space-based distributed information technology resources) so that NASA scientists and engineers can more easily control missions, make new scientific discoveries, and design the next-generation space vehicles; provide high-data delivery from these assets directly to users for missions; develop goal-oriented, human-centered systems; and research, develop, and evaluate revolutionary technology.

  4. An object-oriented, technology-adaptive information model

    NASA Technical Reports Server (NTRS)

    Anyiwo, Joshua C.

    1995-01-01

    The primary objective was to develop a computer information system for effectively presenting NASA's technologies to American industries, for appropriate commercialization. To this end a comprehensive information management model, applicable to a wide variety of situations, and immune to computer software/hardware technological gyrations, was developed. The model consists of four main elements: a DATA_STORE, a data PRODUCER/UPDATER_CLIENT and a data PRESENTATION_CLIENT, anchored to a central object-oriented SERVER engine. This server engine facilitates exchanges among the other model elements and safeguards the integrity of the DATA_STORE element. It is designed to support new technologies, as they become available, such as Object Linking and Embedding (OLE), on-demand audio-video data streaming with compression (such as is required for video conferencing), World Wide Web (WWW) and other information services and browsing, fax-back data requests, presentation of information on CD-ROM, and regular in-house database management, regardless of the data model in place. The four components of this information model interact through a system of intelligent message agents which are customized to specific information exchange needs. This model is at the leading edge of modern information management models. It is independent of technological changes and can be implemented in a variety of ways to meet the specific needs of any communications situation. This summer a partial implementation of the model has been achieved. The structure of the DATA_STORE has been fully specified and successfully tested using Microsoft's FoxPro 2.6 database management system. Data PRODUCER/UPDATER and PRESENTATION architectures have been developed and also successfully implemented in FoxPro; and work has started on a full implementation of the SERVER engine. The model has also been successfully applied to a CD-ROM presentation of NASA's technologies in support of Langley Research Center's TAG…

  5. A more efficient anisotropic mesh adaptation for the computation of Lagrangian coherent structures

    NASA Astrophysics Data System (ADS)

    Fortin, A.; Briffard, T.; Garon, A.

    2015-03-01

    The computation of Lagrangian coherent structures is increasingly used in fluid mechanics to determine subtle fluid flow structures. We present in this paper a new adaptive method for the efficient computation of the Finite Time Lyapunov Exponent (FTLE), from which Lagrangian coherent structures can be obtained. This new adaptive method considerably reduces the computational burden without any loss of accuracy in the FTLE field.
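The quantity at the heart of the abstract, the FTLE field, is computed from the spatial gradient of the flow map. Below is a minimal uniform-grid sketch (not the paper's adaptive method) using an analytically known saddle flow, for which the FTLE is the constant stretching rate lam.

```python
# Minimal FTLE computation on a uniform grid (NumPy). The saddle flow and its
# exact flow map are assumptions chosen so the answer is known in closed form.
import numpy as np

lam, T = 0.5, 2.0
x = np.linspace(-1, 1, 101)
y = np.linspace(-1, 1, 101)
X, Y = np.meshgrid(x, y, indexing="ij")

# Flow map of dx/dt = lam*x, dy/dt = -lam*y integrated over time T.
PX = X * np.exp(lam * T)
PY = Y * np.exp(-lam * T)

# Finite-difference Jacobian of the flow map.
dPXdx, dPXdy = np.gradient(PX, x, y)
dPYdx, dPYdy = np.gradient(PY, x, y)

ftle = np.zeros_like(X)
for i in range(X.shape[0]):
    for j in range(X.shape[1]):
        F = np.array([[dPXdx[i, j], dPXdy[i, j]],
                      [dPYdx[i, j], dPYdy[i, j]]])
        C = F.T @ F                       # Cauchy-Green deformation tensor
        lmax = np.linalg.eigvalsh(C)[-1]  # largest eigenvalue
        ftle[i, j] = np.log(np.sqrt(lmax)) / T
```

For a real velocity field the flow map must be obtained by integrating trajectories, which is where the cost lies and why the paper's anisotropic mesh adaptation pays off.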

  6. Math Attitudes of Computer Education and Instructional Technology Students

    ERIC Educational Resources Information Center

    Tekerek, Mehmet; Yeniterzi, Betul; Ercan, Orhan

    2011-01-01

    Computer Education and Instructional Technology (CEIT) departments train computer teachers to fill the gap of computer instructors in all grades of schools in Turkey. Additionally, graduates can also work as instructional technologists or software developers. The curriculum of CEIT departments includes mathematics courses. The aim of this study is to…

  7. Using Assistive Technology Adaptations To Include Students with Learning Disabilities in Cooperative Learning Activities.

    ERIC Educational Resources Information Center

    Bryant, Diane Pedrotty; Bryant, Brian R.

    1998-01-01

    Discusses a process for integrating technology adaptations for students with learning disabilities into cooperative-learning activities in terms of three components: (1) selecting adaptations; (2) monitoring use of adaptations during cooperative-learning activities; and (3) evaluating the adaptations' effectiveness. Barriers to and support systems…

  8. Computers and Autistic Learners: An Evolving Technology.

    ERIC Educational Resources Information Center

    Hedbring, Charles

    1985-01-01

    A research and demonstration computer center for severely handicapped autistic children, STEPPE-Lab, which uses computers as an augmentative communication and instructional system, is described. The article first reviews the keyboard, joystick, mouse, and drawing tablet as augmentative devices for helping communication disordered children interact…

  9. Cutting Technology Costs with Refurbished Computers

    ERIC Educational Resources Information Center

    Dessoff, Alan

    2010-01-01

    Many district administrators are finding that they can save money on computers by buying preowned ones instead of new ones. The practice has other benefits as well: It allows districts to give more computers to more students who need them, and it also promotes good environmental practices by keeping the machines out of landfills, where they…

  10. Questioning the Humanist Vision of Computer Technology.

    ERIC Educational Resources Information Center

    Van Alkemade, Kim

    Scholarship's reliance on humanism as a critical principle is problematic and needs to be called into question. The issue of access in "Computers and Composition" scholarship illustrates how a critical reliance on humanism may actually betray humanist values scholars in the field tend to promote. Scholarship in "Computers and Composition" has…

  11. High performance computing for deformable image registration: towards a new paradigm in adaptive radiotherapy.

    PubMed

    Samant, Sanjiv S; Xia, Junyi; Muyan-Ozcelik, Pinar; Owens, John D

    2008-08-01

    Readily available temporal imaging, or time-series volumetric (4D) imaging, has become an indispensable component of treatment planning and adaptive radiotherapy (ART) at many radiotherapy centers. Deformable image registration (DIR) is also used in other areas of medical imaging, including motion-corrected image reconstruction. Due to long computation times, clinical applications of DIR in radiation therapy and elsewhere have been limited and consequently relegated to offline analysis. With the recent advances in hardware and software, graphics processing unit (GPU) based computing is an emerging technology for general purpose computation, including DIR, and is suitable for highly parallelized computing. However, traditional general purpose computation on the GPU is limited because of the constraints of the available programming platforms. In addition, compared to CPU programming, the GPU currently has reduced dedicated processor memory, which can limit the useful working data set for parallelized processing. We present an implementation of the demons algorithm using the NVIDIA 8800 GTX GPU and the new CUDA programming language. The GPU performance will be compared with single threading and multithreading CPU implementations on an Intel dual core 2.4 GHz CPU using the C programming language. CUDA provides a C-like language programming interface, and allows for direct access to the highly parallel compute units in the GPU. Comparisons for volumetric clinical lung images acquired using 4DCT were carried out. Computation time for 100 iterations in the range of 1.8-13.5 s was observed for the GPU with image size ranging from 2.0 × 10^6 to 14.2 × 10^6 pixels. The GPU registration was 55-61 times faster than the CPU for the single threading implementation, and 34-39 times faster for the multithreading implementation. For CPU based computing, the computational time generally has a linear dependence on image size for medical imaging data. Computational efficiency is…
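The demons algorithm the abstract benchmarks is simple enough to sketch on the CPU. Below is a toy NumPy/SciPy version of the iteration (displacement update along the fixed-image gradient, then Gaussian regularization); the test images, step size, and smoothing width are assumptions for illustration, not the paper's CUDA implementation.

```python
# Toy CPU sketch of a demons-style deformable registration loop.
# Images and parameters are illustrative assumptions.
import numpy as np
from scipy.ndimage import gaussian_filter, map_coordinates

def make_blob(shape, cx, cy, sigma=6.0):
    yy, xx = np.mgrid[0:shape[0], 0:shape[1]].astype(float)
    return np.exp(-((xx - cx) ** 2 + (yy - cy) ** 2) / (2 * sigma**2))

fixed = make_blob((64, 64), 32, 32)
moving = make_blob((64, 64), 34, 33)   # shifted copy to register back

gy, gx = np.gradient(fixed)            # fixed-image gradient (rows, cols)
uy = np.zeros_like(fixed)              # displacement field components
ux = np.zeros_like(fixed)
yy, xx = np.mgrid[0:64, 0:64].astype(float)

def warp(img, uy, ux):
    """Resample img at the displaced grid positions (bilinear)."""
    return map_coordinates(img, [yy + uy, xx + ux], order=1, mode="nearest")

ssd_before = np.sum((moving - fixed) ** 2)
for _ in range(50):
    diff = warp(moving, uy, ux) - fixed
    denom = gx**2 + gy**2 + diff**2 + 1e-9
    ux -= diff * gx / denom            # demons force along the image gradient
    uy -= diff * gy / denom
    ux = gaussian_filter(ux, 1.0)      # regularize the displacement field
    uy = gaussian_filter(uy, 1.0)

ssd_after = np.sum((warp(moving, uy, ux) - fixed) ** 2)
```

Each voxel's update is independent of the others, which is why the algorithm maps so well onto the GPU's parallel compute units.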

  12. Planning Computer Lessons. Tech Use Guide: Using Computer Technology.

    ERIC Educational Resources Information Center

    Council for Exceptional Children, Reston, VA. Center for Special Education Technology.

    This guide offers planning and organizing ideas for effectively using computers in classrooms that include students both with and without disabilities. The guide addresses: developing lesson plans, introducing the lesson in a way that builds motivation, providing guided and independent practice, extending the learning, and choosing software.…

  13. Audit and Evaluation of Computer Security. Computer Science and Technology.

    ERIC Educational Resources Information Center

    Ruthberg, Zella G.

    This is a collection of consensus reports, each produced at a session of an invitational workshop sponsored by the National Bureau of Standards. The purpose of the workshop was to explore the state-of-the-art and define appropriate subjects for future research in the audit and evaluation of computer security. Leading experts in the audit and…

  14. Impact of Computer Technology on Design and Craft Education

    ERIC Educational Resources Information Center

    Thorsteinsson, Gisli

    2014-01-01

    This research aims to answer the question, "How has the use of computer technology benefited the compulsory education system, focusing on Design and Technology?" In order to answer this question, it was necessary to focus on interactive whiteboards, e-portfolios and digital projectors as the main technology formats. An initial literature…

  15. Reviews of computing technology: Software overview

    SciTech Connect

    Hartshorn, W.R.; Johnson, A.L.

    1994-01-05

    The Savannah River Site Computing Architecture states that the site computing environment will be standards-based, data-driven, and workstation-oriented. Larger server systems deliver needed information to users in a client-server relationship. Goals of the Architecture include utilizing computing resources effectively, maintaining a high level of data integrity, developing a robust infrastructure, and storing data in such a way as to promote accessibility and usability. This document describes the current storage environment at Savannah River Site (SRS) and presents some of the problems that will be faced and strategies that are planned over the next few years.

  16. A Framework for the Development of Context-Adaptable User Interfaces for Ubiquitous Computing Systems

    PubMed Central

    Varela, Gervasio; Paz-Lopez, Alejandro; Becerra, Jose A.; Duro, Richard

    2016-01-01

    This paper addresses the problem of developing user interfaces for Ubiquitous Computing (UC) and Ambient Intelligence (AmI) systems. These kinds of systems are expected to provide a natural user experience, considering interaction modalities adapted to the user's abilities and preferences and using whatever interaction devices are present in the environment. These interaction devices are not necessarily known at design time. The task is quite complicated due to the variety of devices and technologies and the diversity of scenarios, and it usually burdens the developer with the need to create many different UIs in order to cover the foreseeable user-environment combinations. Here, we propose a UI abstraction framework for UC and AmI systems that effectively improves the portability of those systems between different environments and for different users. It allows developers to design and implement a single UI capable of being deployed with different devices and modalities regardless of the physical location. PMID:27399711

  17. Emerging Trends in Technology Education Computer Applications.

    ERIC Educational Resources Information Center

    Hazari, Sunil I.

    1993-01-01

    Graphical User Interface (GUI)--and its variant, pen computing--is rapidly replacing older types of operating environments. Despite its heavier demand for processing power, GUI has many advantages. (SK)

  18. Mistaking Computers for Technology: Technology Literacy and the Digital Divide

    ERIC Educational Resources Information Center

    Amiel, Tel

    2006-01-01

    No other information and communication technology has swept the globe with greater speed than the Internet, having the potential to promote vast social, economic, and political transformations. As new technologies become available the pattern of adoption and diffusion creates disparities in access and ownership. At the most basic this gap is…

  19. Guided Discovery Learning with Computer-Based Simulation Games: Effects of Adaptive and Non-Adaptive Instructional Support.

    ERIC Educational Resources Information Center

    Leutner, Detlev

    1993-01-01

    System-initiated adaptive advice and learner-requested nonadaptive background information were investigated in computer simulation game experiments with 64 seventh graders, 38 college students, and 80 seventh and eighth graders in Germany. Results are discussed in terms of theories of problem solving, intelligence, memory, and information…

  20. A case study of evolutionary computation of biochemical adaptation

    NASA Astrophysics Data System (ADS)

    François, Paul; Siggia, Eric D.

    2008-06-01

    Simulations of evolution have a long history, but their relation to biology is questioned because of the perceived contingency of evolution. Here we provide an example of a biological process, adaptation, where simulations are argued to approach closer to biology. Adaptation is a common feature of sensory systems, and a plausible component of other biochemical networks because it rescales upstream signals to facilitate downstream processing. We create random gene networks numerically, by linking genes with interactions that model transcription, phosphorylation and protein-protein association. We define a fitness function for adaptation in terms of two functional metrics, and show that any reasonable combination of them will yield the same adaptive networks after repeated rounds of mutation and selection. Convergence to these networks is driven by positive selection and thus fast. There is always a path in parameter space of continuously improving fitness that leads to perfect adaptation, implying that the actual mutation rates we use in the simulation do not bias the results. Our results imply a kinetic view of evolution, i.e., it favors gene networks that can be learned quickly from the random examples supplied by mutation. This formulation allows for deductive predictions of the networks realized in nature.
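The "repeated rounds of mutation and selection" driving fitness uphill can be sketched generically. This toy loop evolves a parameter vector toward a fixed optimum; the fitness function and all parameters are assumptions for illustration and stand in for the paper's biochemical-network fitness, which they do not reproduce.

```python
# Generic mutate-and-select loop in the spirit of the simulations described
# above. The quadratic fitness and the target vector are toy assumptions.
import numpy as np

rng = np.random.default_rng(42)
target = np.array([1.0, -2.0, 0.5])   # hypothetical optimum

def fitness(x):
    return -np.sum((x - target) ** 2)

pop = [rng.standard_normal(3) for _ in range(20)]
history = []
for generation in range(200):
    scored = sorted(pop, key=fitness, reverse=True)
    history.append(fitness(scored[0]))        # best fitness this generation
    survivors = scored[:5]                    # selection: keep the top quarter
    pop = [s + 0.1 * rng.standard_normal(3)   # mutation: small random changes
           for s in survivors for _ in range(4)]

best = max(pop, key=fitness)
```

Because there is a path of continuously improving fitness, positive selection drives convergence quickly, which mirrors the paper's observation that the adaptive networks are "learned" fast from random mutations.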

  1. Assessment of Service Protocols Adaptability Using a Novel Path Computation Technique

    NASA Astrophysics Data System (ADS)

    Zhou, Zhangbing; Bhiri, Sami; Haller, Armin; Zhuge, Hai; Hauswirth, Manfred

    In this paper we propose a new kind of adaptability assessment that determines whether the service protocols of a requestor and a provider are adaptable, computes their adaptation degree, and identifies conditions that determine when they can be adapted. We also propose a technique that implements this adaptability assessment: (1) we construct a complete adaptation graph that captures all service interactions adaptable between these two service protocols; the emptiness or non-emptiness of this graph indicates whether or not they are adaptable; (2) we propose a novel path computation technique to generate all instance sub-protocols, which reflect valid executions of a particular service protocol, and to derive all instance sub-protocol pairs captured by the complete adaptation graph. The adaptation degree is computed as the ratio between the number of a service protocol's instance sub-protocols captured by these instance sub-protocol pairs and the total number of instance sub-protocols of that service protocol; (3) and finally we identify a set of conditions based on these instance sub-protocol pairs. A condition is the conjunction of all conditions specified on the transitions of a given pair of instance sub-protocols. This assessment is a comprehensive means of selecting the suitable service protocol among functionally-equivalent candidates according to the requestor's business requirements.
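The adaptation-degree ratio described in step (2) reduces to a simple count once the instance sub-protocols and the adaptable pairs are known. The sketch below uses made-up protocol names to show just that arithmetic; it does not implement the path computation itself.

```python
# Illustrative computation of the adaptation degree: the fraction of a
# requestor protocol's instance sub-protocols that appear in some adaptable
# pair. Protocol names (p1..p4, q1, q3) are hypothetical toy data.
requestor_subprotocols = {"p1", "p2", "p3", "p4"}

# Pairs (requestor sub-protocol, provider sub-protocol) captured by the
# complete adaptation graph.
adaptable_pairs = {("p1", "q1"), ("p2", "q1"), ("p2", "q3")}

covered = {req for req, _ in adaptable_pairs}
degree = len(covered & requestor_subprotocols) / len(requestor_subprotocols)
# Here two of the four instance sub-protocols can be adapted, so degree is 0.5.
```

A degree of 1.0 would mean every valid execution of the requestor's protocol can be adapted to the provider; anything lower quantifies partial compatibility, which supports ranking functionally-equivalent candidates.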

  2. Contactless thin adaptive mirror technology: past, present, and future

    NASA Astrophysics Data System (ADS)

    Biasi, Roberto; Gallieni, Daniele; Salinari, Piero; Riccardi, Armando; Mantegazza, Paolo

    2010-07-01

    The contactless, voice coil motor adaptive mirror technology starts from an idea by Piero Salinari in 1993. This idea has progressively evolved into real systems thanks to a fruitful collaboration involving Italian research institutes (INAF - Osservatorio Astrofisico di Arcetri and the Aerospace Department of Politecnico di Milano) and small Italian enterprises (Microgate and ADS). Collaboration between research institutions and industry is still firmly in place, but nowadays the technology has left the initial R&D phase, reaching a stage in which whole projects are managed by the industrial entities. In this paper we present the baseline concept and its evolution, describing the main progress milestones. These are paced by the actual implementation of this idea into real systems, from MMT, to LBT, Magellan, VLT, GMT and E-ELT. The fundamental concept and layout has remained unchanged through this evolution, maintaining its intrinsic advantages: tolerance to actuator failures, mechanical de-coupling and relaxed tolerances between correcting mirror and reference structure, large stroke, and hysteresis-free behavior. Moreover, this concept has proved its expandability to very large systems with thousands of controlled d.o.f. Notwithstanding the solidity of the fundamentals, the implementation has strongly evolved from the beginning, in order to deal with the dimensional, power, maintainability and reliability constraints imposed by the increased size of the targeted systems.

  3. An adaptable Boolean net trainable to control a computing robot

    SciTech Connect

    Lauria, F. E.; Prevete, R.; Milo, M.; Visco, S.

    1999-03-22

    We discuss a method to implement a Hebbian rule in a Boolean neural network so as to obtain an adaptable universal control system. We start by presenting both the Boolean neural net and the Hebbian rule we have considered. Then we discuss, first, the problems arising when the latter is naively implemented in a Boolean neural net and, second, the method allowing us to overcome them and the ensuing adaptable Boolean neural net paradigm. Next, we present the adaptable Boolean neural net as an intelligent control system, actually controlling a writing robot, and discuss how to train it in the execution of the elementary arithmetic operations on operands represented by numerals with an arbitrary number of digits.

  4. Embedded Data Processor and Portable Computer Technology testbeds

    NASA Technical Reports Server (NTRS)

    Alena, Richard; Liu, Yuan-Kwei; Goforth, Andre; Fernquist, Alan R.

    1993-01-01

    Attention is given to current activities in the Embedded Data Processor and Portable Computer Technology testbed configurations that are part of the Advanced Data Systems Architectures Testbed at the Information Sciences Division at NASA Ames Research Center. The Embedded Data Processor Testbed evaluates advanced microprocessors for potential use in mission and payload applications within the Space Station Freedom Program. The Portable Computer Technology (PCT) Testbed integrates and demonstrates advanced portable computing devices and data system architectures. The PCT Testbed uses both commercial and custom-developed devices to demonstrate the feasibility of functional expansion and networking for portable computers in flight missions.

  5. Molecular determinants of enzyme cold adaptation: comparative structural and computational studies of cold- and warm-adapted enzymes.

    PubMed

    Papaleo, Elena; Tiberti, Matteo; Invernizzi, Gaetano; Pasi, Marco; Ranzani, Valeria

    2011-11-01

    The identification of molecular mechanisms underlying enzyme cold adaptation is a hot topic both for fundamental research and industrial applications. In the present contribution, we review the last decades of structural and computational investigations on cold-adapted enzymes in comparison to their warm-adapted counterparts. Comparative sequence and structural studies allow the definition of a multitude of adaptation strategies. Different enzymes have evolved diverse mechanisms to adapt to low temperatures, so that a general theory for enzyme cold adaptation cannot be formulated. However, some common features can be traced in the dynamic and flexibility properties of these enzymes, as well as in their intra- and inter-molecular interaction networks. Interestingly, the current data suggest that a family-centered point of view is necessary in the comparative analyses of cold- and warm-adapted enzymes. In fact, enzymes belonging to the same family or superfamily, thus sharing at least the three-dimensional fold and common features of the functional sites, have evolved similar structural and dynamic patterns to overcome the detrimental effects of low temperatures.

  6. Molecular determinants of enzyme cold adaptation: comparative structural and computational studies of cold- and warm-adapted enzymes.

    PubMed

    Papaleo, Elena; Tiberti, Matteo; Invernizzi, Gaetano; Pasi, Marco; Ranzani, Valeria

    2011-11-01

    The identification of molecular mechanisms underlying enzyme cold adaptation is a hot topic both for fundamental research and industrial applications. In the present contribution, we review the last decades of structural and computational investigations on cold-adapted enzymes in comparison to their warm-adapted counterparts. Comparative sequence and structural studies allow the definition of a multitude of adaptation strategies. Different enzymes have evolved diverse mechanisms to adapt to low temperatures, so that a general theory for enzyme cold adaptation cannot be formulated. However, some common features can be traced in the dynamic and flexibility properties of these enzymes, as well as in their intra- and inter-molecular interaction networks. Interestingly, the current data suggest that a family-centered point of view is necessary in the comparative analyses of cold- and warm-adapted enzymes. In fact, enzymes belonging to the same family or superfamily, thus sharing at least the three-dimensional fold and common features of the functional sites, have evolved similar structural and dynamic patterns to overcome the detrimental effects of low temperatures. PMID:21827423

  7. Identifying Reading Problems with Computer-Adaptive Assessments

    ERIC Educational Resources Information Center

    Merrell, C.; Tymms, P.

    2007-01-01

    This paper describes the development of an adaptive assessment called Interactive Computerised Assessment System (InCAS) that is aimed at children of a wide age and ability range to identify specific reading problems. Rasch measurement has been used to create the equal interval scales that form each part of the assessment. The rationale for the…
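
    Rasch measurement, used in the record above to build equal-interval scales, models the probability of a correct response as a logistic function of the gap between person ability and item difficulty. A minimal sketch of that one-parameter model (the numbers are illustrative, not taken from InCAS):

```python
import math

def rasch_probability(theta: float, b: float) -> float:
    """One-parameter Rasch model: P(correct) = 1 / (1 + exp(-(theta - b))),
    where theta is person ability and b is item difficulty on the same logit scale."""
    return 1.0 / (1.0 + math.exp(-(theta - b)))

# When ability equals item difficulty, a correct answer is a coin flip:
print(rasch_probability(1.2, 1.2))  # 0.5
```

    Because both parameters live on one logit scale, differences between abilities (or difficulties) have the same meaning everywhere on the scale, which is what makes the resulting measures "equal interval".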

  8. Adjusting Computer Adaptive Test Starting Points To Conserve Item Pool.

    ERIC Educational Resources Information Center

    Zhu, Daming; Fan, Meichu

    The convention for selecting starting points (that is, initial items) on a computerized adaptive test (CAT) is to choose as starting points items of medium difficulty for all examinees. Selecting a starting point based on prior information about an individual's ability was first suggested many years ago, but has been believed unimportant provided…
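
    The alternative the record describes can be sketched simply: rather than always starting at medium difficulty, pick the item whose difficulty is closest to a prior ability estimate. A hypothetical illustration (the item pool and prior values are invented):

```python
def pick_start_item(item_difficulties, prior_ability=0.0):
    """Return the index of the item closest in difficulty to the prior estimate.

    With no prior information (prior_ability = 0.0, the center of the scale),
    this reduces to the conventional medium-difficulty start.
    """
    return min(range(len(item_difficulties)),
               key=lambda i: abs(item_difficulties[i] - prior_ability))

pool = [-2.0, -1.0, 0.0, 1.0, 2.0]   # difficulties on a logit scale
print(pick_start_item(pool))         # 2  (medium-difficulty item)
print(pick_start_item(pool, 1.4))    # 3  (item with difficulty 1.0)
```

    Spreading starting points across the pool in this way is what conserves items: high-ability examinees no longer all consume the same medium-difficulty items first.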

  9. Technological Imperatives: Using Computers in Academic Debate.

    ERIC Educational Resources Information Center

    Ticku, Ravinder; Phelps, Greg

    Intended for forensic educators and debate teams, this document details how one university debate team, at the University of Iowa, makes use of computer resources on campus to facilitate storage and retrieval of information useful to debaters. The introduction notes the problem of storing and retrieving the amount of information required by debate…

  10. CACTUS: Calculator and Computer Technology User Service.

    ERIC Educational Resources Information Center

    Hyde, Hartley

    1998-01-01

    Presents an activity in which students use computer-based spreadsheets to find out how much grain should be added to a chess board when a grain of rice is put on the first square, the amount is doubled for the next square, and the chess board is covered. (ASK)
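
    The doubling pattern in this activity has a closed form: square n holds 2^(n-1) grains, so a standard 64-square board totals 2^64 - 1. A few lines of Python confirm the spreadsheet result (Python here is an assumption; the activity itself uses spreadsheets):

```python
# Grains on a chessboard: one grain on square 1, doubling on each square after.
SQUARES = 64

total = sum(2 ** (n - 1) for n in range(1, SQUARES + 1))

print(total)  # 18446744073709551615, i.e. 2**64 - 1
```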

  11. Cloud Computing Technologies Facilitate Earth Research

    NASA Technical Reports Server (NTRS)

    2015-01-01

    Under a Space Act Agreement, NASA partnered with Seattle-based Amazon Web Services to make the agency's climate and Earth science satellite data publicly available on the company's servers. Users can access the data for free, but they can also pay to use Amazon's computing services to analyze and visualize information using the same software available to NASA researchers.

  12. Locating Materials using Computer Database Technology

    ERIC Educational Resources Information Center

    Kissock, Craig; Lopez, A. A.

    1978-01-01

    Describes the development and contents of a computer database which helps social studies educators identify available curriculum materials. Materials can be located by using one or a combination of the following elements: Acquisition number, publisher, institution or sponsor, author, title, grade level, copyright date, cost, number of pages, and…

  13. Adaptive finite element methods for two-dimensional problems in computational fracture mechanics

    NASA Technical Reports Server (NTRS)

    Min, J. B.; Bass, J. M.; Spradley, L. W.

    1994-01-01

    Some recent results obtained using solution-adaptive finite element methods in two-dimensional problems in linear elastic fracture mechanics are presented. The focus is on the basic issue of adaptive finite element methods for validating the new methodology by computing demonstration problems and comparing the stress intensity factors to analytical results.

  14. An Investigation on Computer-Adaptive Multistage Testing Panels for Multidimensional Assessment

    ERIC Educational Resources Information Center

    Wang, Xinrui

    2013-01-01

    Computer-adaptive multistage testing (ca-MST) has been developed as an alternative to computerized adaptive testing (CAT), and has been increasingly adopted in large-scale assessments. Current research and practice only focus on ca-MST panels for credentialing purposes. The ca-MST test mode, therefore, is designed to gauge a single scale. The…

  15. Adaptive 3D single-block grids for the computation of viscous flows around wings

    SciTech Connect

    Hagmeijer, R.; Kok, J.C.

    1996-12-31

    A robust algorithm for the adaption of a 3D single-block structured grid suitable for the computation of viscous flows around a wing is presented and demonstrated by application to the ONERA M6 wing. The effects of grid adaption on the flow solution and accuracy improvements are analyzed. Reynolds number variations are studied.

  16. Theories of Learning and Computer-Mediated Instructional Technologies.

    ERIC Educational Resources Information Center

    Hung, David

    2001-01-01

    Describes four major models of learning: behaviorism, cognitivism, constructivism, and social constructivism. Discusses situated cognition; differences between learning theories and instructional approaches; and how computer-mediated technologies can be integrated with learning theories. (LRW)

  17. Using Computer Technology To Foster Learning for Understanding.

    ERIC Educational Resources Information Center

    Van Melle, Elaine; Tomalty, Lewis

    2000-01-01

    Describes how computer technology, specifically the use of a multimedia CD-ROM, was integrated into a microbiology curriculum as part of the transition from focusing on facts to fostering learning for understanding. (Contains 30 references.) (Author/YDS)

  18. Technology and society: ideological implications of information and computer technologies in the Soviet Union

    SciTech Connect

    Weigle, M.A.

    1988-01-01

    This study examines the impact of technology on the USSR's social system from the perspective of Soviet ideological development. The analysis of information and computer technologies within this framework de-emphasizes both modernization theories and those that assume unchallenged Communist Party control over technological development. Previous studies have examined the level of Soviet technological achievements and the gap between this level and those in the West, many referring to ideological boundaries of Soviet technological development without, however, systematically analyzing the resulting implications for the Soviet ideology of Marxism-Leninism. This study develops a framework for analyzing the impact of new technologies in the USSR in the fields of technology, ideology, and the scientific and technological revolution. On the basis of this framework, examination turns to the relevant Soviet theoretical and technical literature and debates among Soviet elites, concluding that the introduction of information and computer technologies and the organization of computer networks has exacerbated tensions in Soviet Marxism-Leninism.

  19. Restricted access processor - An application of computer security technology

    NASA Technical Reports Server (NTRS)

    Mcmahon, E. M.

    1985-01-01

    This paper describes a security guard device that is currently being developed by Computer Sciences Corporation (CSC). The methods used to provide assurance that the system meets its security requirements include the system architecture, a system security evaluation, and the application of formal and informal verification techniques. The combination of state-of-the-art technology and the incorporation of new verification procedures results in a demonstration of the feasibility of computer security technology for operational applications.

  20. Computer technology -- 1996: Applications and methodology. PVP-Volume 326

    SciTech Connect

    Hulbert, G.M.; Hsu, K.H.; Lee, T.W.; Nicholas, T.

    1996-12-01

    The primary objective of the Computer Technology Committee of the ASME Pressure Vessels and Piping Division is to promote interest and technical exchange in the field of computer technology, related to the design and analysis of pressure vessels and piping. The topics included in this volume are: analysis of bolted joints; nonlinear analysis, applications and methodology; finite element analysis and applications; and behavior of materials. Separate abstracts were prepared for 23 of the papers in this volume.

  1. Novel Approaches to Adaptive Angular Approximations in Computational Transport

    SciTech Connect

    Marvin L. Adams; Igor Carron; Paul Nelson

    2006-06-04

    The particle-transport equation is notoriously difficult to discretize accurately, largely because the solution can be discontinuous in every variable. At any given spatial position and energy E, for example, the transport solution  can be discontinuous at an arbitrary number of arbitrary locations in the direction domain. Even if the solution is continuous it is often devoid of smoothness. This makes the direction variable extremely difficult to discretize accurately. We have attacked this problem with adaptive discretizations in the angle variables, using two distinctly different approaches. The first approach used wavelet function expansions directly and exploited their ability to capture sharp local variations. The second used discrete ordinates with a spatially varying quadrature set that adapts to the local solution. The first approach is very different from that in today’s transport codes, while the second could conceivably be implemented in such codes. Both approaches succeed in reducing angular discretization error to any desired level. The work described and results presented in this report add significantly to the understanding of angular discretization in transport problems and demonstrate that it is possible to solve this important long-standing problem in deterministic transport. Our results show that our adaptive discrete-ordinates (ADO) approach successfully: 1) Reduces angular discretization error to user-selected “tolerance” levels in a variety of difficult test problems; 2) Achieves a given error with significantly fewer unknowns than non-adaptive discrete ordinates methods; 3) Can be implemented within standard discrete-ordinates solution techniques, and thus could generate a significant impact on the field in a relatively short time. Our results show that our adaptive wavelet approach: 1) Successfully reduces the angular discretization error to arbitrarily small levels in a variety of difficult test problems, even when using the
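
    The "refine until a user-selected tolerance is met" idea that drives both the wavelet and ADO approaches can be illustrated with a generic adaptive quadrature rule; this sketch illustrates adaptivity in one variable and is not the transport discretization itself:

```python
def adaptive_simpson(f, a, b, tol):
    """Generic adaptive quadrature: recursively split the interval until the
    local Simpson error estimate falls below tol, concentrating points where
    the integrand varies sharply (illustrative analogue of adaptive angular
    quadrature)."""
    def simpson(lo, hi):
        mid = (lo + hi) / 2
        return (hi - lo) / 6 * (f(lo) + 4 * f(mid) + f(hi))

    def recurse(lo, hi, whole):
        mid = (lo + hi) / 2
        left, right = simpson(lo, mid), simpson(mid, hi)
        if abs(left + right - whole) < 15 * tol:
            return left + right + (left + right - whole) / 15
        return recurse(lo, mid, left) + recurse(mid, hi, right)

    return recurse(a, b, simpson(a, b))

print(adaptive_simpson(lambda x: x * x, 0.0, 1.0, 1e-8))  # ~0.3333333333
```

    As with the ADO method described above, the user chooses the tolerance and the algorithm spends unknowns only where the error estimate demands them.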

  2. Computers for artificial intelligence a technology assessment and forecast

    SciTech Connect

    Miller, R.K.

    1986-01-01

    This study reviews the development and current state-of-the-art in computers for artificial intelligence, including LISP machines, AI workstations, professional and engineering workstations, minicomputers, mainframes, and supercomputers. Major computer systems for AI applications are reviewed. The use of personal computers for expert system development is discussed, and AI software for the IBM PC, Texas Instruments Professional Computer, and Apple Macintosh is presented. Current research aimed at developing a new computer for artificial intelligence is described, and future technological developments are discussed.


  3. Improving Adaptive Learning Technology through the Use of Response Times

    ERIC Educational Resources Information Center

    Mettler, Everett; Massey, Christine M.; Kellman, Philip J.

    2011-01-01

    Adaptive learning techniques have typically scheduled practice using learners' accuracy and item presentation history. We describe an adaptive learning system (Adaptive Response Time Based Sequencing--ARTS) that uses both accuracy and response time (RT) as direct inputs into sequencing. Response times are used to assess learning strength and…
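
    The core ARTS idea, using response time alongside accuracy as a direct input to sequencing, can be caricatured in a few lines. This scoring rule is a hypothetical sketch for illustration, not the published ARTS algorithm:

```python
def priority(accuracy: float, rt_seconds: float, target_rt: float = 3.0) -> float:
    """Toy sequencing priority: items answered inaccurately OR slowly get
    higher priority and are rescheduled sooner.

    accuracy: proportion correct in [0, 1]; rt_seconds: mean response time;
    target_rt: the response time treated as evidence of fluent retrieval.
    """
    slowness = max(0.0, rt_seconds - target_rt) / target_rt
    return (1.0 - accuracy) + slowness

# A slow-but-correct item still earns rescheduling priority,
# which accuracy-only scheduling would miss:
print(priority(accuracy=1.0, rt_seconds=6.0))  # 1.0
print(priority(accuracy=0.5, rt_seconds=2.0))  # 0.5
```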

  4. "Computer" and "Information and Communication Technology": Students' Culture Specific Interpretations

    ERIC Educational Resources Information Center

    Elen, Jan; Clarebout, Geraldine; Sarfo, Frederick Kwaku; Louw, Lambertus Philippus; Poysa-Tarhonen, Johanna; Stassens, Nick

    2010-01-01

    Given the use of information and communication technology (ICT) and computer as synonyms in ICT-integration research on the one hand, and the potential problems in doing so on the other, this contribution tries to gain insight in the understanding of the words computer and ICT in different settings. In five different countries (Belgium, Finland,…

  5. Computer-Integrated Manufacturing Technology. Tech Prep Competency Profile.

    ERIC Educational Resources Information Center

    Lakeland Tech Prep Consortium, Kirtland, OH.

    This tech prep competency profile for computer-integrated manufacturing technology begins with definitions for four occupations: manufacturing technician, quality technician, mechanical engineering technician, and computer-assisted design/drafting (CADD) technician. A chart lists competencies by unit and indicates whether entire or partial unit is…

  6. Computer-Mediated Technology and Transcultural Counselor Education.

    ERIC Educational Resources Information Center

    McFadden, John

    2000-01-01

    This manuscript traces the history of computer technologies, their applications in mental health settings, and suggests that transcultural counselor educators engage their students in the design of a case-based computer simulation. The avatar-focused simulation offers an unprecedented environment for experimentation in collaborative learning and…

  7. Institute for Computer Sciences and Technology. Annual Report FY 1986.

    ERIC Educational Resources Information Center

    National Bureau of Standards (DOC), Washington, DC. Inst. for Computer Sciences and Technology.

    Activities of the Institute for Computer Sciences and Technology (ICST) within the U.S. Department of Commerce during fiscal year 1986 are described in this annual report, which summarizes research and publications by ICST in the following areas: (1) standards and guidelines for computer security, including encryption and message authentication…

  8. Two Year Computer System Technology Curricula for the '80's.

    ERIC Educational Resources Information Center

    Palko, Donald N.; Hata, David M.

    1982-01-01

    The computer industry is viewed on a collision course with a human resources crisis. Changes expected during the next decade are outlined, with expectations noted that merging of hardware and software skills will be met in a technician's skill set. Essential curricula components of a computer system technology program are detailed. (MP)

  9. Selecting Software. Tech Use Guide: Using Computer Technology.

    ERIC Educational Resources Information Center

    Council for Exceptional Children, Reston, VA. Center for Special Education Technology.

    One of nine brief guides for special educators on using computer technology, this guide focuses on the selection of software for use in the special education classroom. Four types of software used for computer assisted instruction are briefly described: tutorials; drill and practice; educational games; and simulations. The increasing use of tool…

  10. The Computer Industry. High Technology Industries: Profiles and Outlooks.

    ERIC Educational Resources Information Center

    International Trade Administration (DOC), Washington, DC.

    A series of meetings was held to assess future problems in United States high technology, particularly in the fields of robotics, computers, semiconductors, and telecommunications. This report, which focuses on the computer industry, includes a profile of this industry and the papers presented by industry speakers during the meetings. The profile…

  11. Coached, Interactive Computer Simulations: A New Technology for Training.

    ERIC Educational Resources Information Center

    Hummel, Thomas J.

    This paper provides an overview of a prototype simulation-centered intelligent computer-based training (CBT) system--implemented using expert system technology--which provides: (1) an environment in which trainees can learn and practice complex skills; (2) a computer-based coach or mentor to critique performance, suggest improvements, and provide…

  12. WRF4G project: Adaptation of WRF Model to Distributed Computing Infrastructures

    NASA Astrophysics Data System (ADS)

    Cofino, Antonio S.; Fernández Quiruelas, Valvanuz; García Díez, Markel; Blanco Real, Jose C.; Fernández, Jesús

    2013-04-01

    Nowadays Grid computing is a powerful computational tool that is ready to be used by the scientific community in different areas (such as biomedicine, astrophysics, climate, etc.). However, the use of these distributed computing infrastructures (DCI) is not yet common practice in climate research, and only a few teams and applications in this area take advantage of them. Thus, the first objective of this project is to popularize the use of this technology in the atmospheric sciences area. In order to achieve this objective, one of the most widely used applications has been taken (WRF, a limited-area model and successor of the MM5 model), which has a user community of more than 8000 researchers worldwide. This community develops its research activity in different areas and could benefit from the advantages of Grid resources (case-study simulations, regional hindcasts/forecasts, sensitivity studies, etc.). The WRF model is also used as input by the energy and natural-hazards communities, so those communities will benefit as well. However, Grid infrastructures have some drawbacks for the execution of applications that make intensive use of CPU and memory over long periods of time. This makes it necessary to develop a specific framework (middleware) that encapsulates the application and provides appropriate services for the monitoring and management of the jobs and the data. Thus, the second objective of the project is the development of a generic adaptation of WRF for the Grid (WRF4G), to be distributed as open source and integrated in the official WRF development cycle. The use of this WRF adaptation should be transparent and useful for any of the previously described studies, and should avoid the problems of the Grid infrastructure. Moreover, it should simplify access to Grid infrastructures for research teams, and free them from the technical and computational aspects of using the Grid.
Finally, in order to

  13. Network Computer Technology. Phase I: Viability and Promise within NASA's Desktop Computing Environment

    NASA Technical Reports Server (NTRS)

    Paluzzi, Peter; Miller, Rosalind; Kurihara, West; Eskey, Megan

    1998-01-01

    Over the past several months, major industry vendors have made a business case for the network computer as a win-win solution toward lowering total cost of ownership. This report provides results from Phase I of the Ames Research Center network computer evaluation project. It identifies factors to be considered for determining cost of ownership; further, it examines where, when, and how network computer technology might fit in NASA's desktop computing architecture.

  14. Implementation and Evaluation of Multiple Adaptive Control Technologies for a Generic Transport Aircraft Simulation

    NASA Technical Reports Server (NTRS)

    Campbell, Stefan F.; Kaneshige, John T.; Nguyen, Nhan T.; Krishakumar, Kalmanje S.

    2010-01-01

    Presented here is the evaluation of multiple adaptive control technologies for a generic transport aircraft simulation. For this study, seven model reference adaptive control (MRAC) based technologies were considered. Each technology was integrated into an identical dynamic-inversion control architecture and tuned using a methodology based on metrics and specific design requirements. Simulation tests were then performed to evaluate each technology's sensitivity to time-delay, flight condition, model uncertainty, and artificially induced cross-coupling. The resulting robustness and performance characteristics were used to identify potential strengths, weaknesses, and integration challenges of the individual adaptive control technologies.

  15. Promoting Technology-Assisted Active Learning in Computer Science Education

    ERIC Educational Resources Information Center

    Gao, Jinzhu; Hargis, Jace

    2010-01-01

    This paper describes specific active learning strategies for teaching computer science, integrating both instructional technologies and non-technology-based strategies shown to be effective in the literature. The theoretical learning components addressed include an intentional method to help students build metacognitive abilities, as well as…

  16. Reaching Distance Students with Computer Network Technology (Part II).

    ERIC Educational Resources Information Center

    Distance Education Report, 1997

    1997-01-01

    This article is the second in a series on the typology of technology used in distance education courses at the Center for Distance Learning at the State University of New York's Empire State College. Discusses computer technology as the main delivery vehicle of the course guide, discussion, information resources, and assignments. Outlines learning…

  17. Complexity of Integrating Computer Technologies into Education in Turkey

    ERIC Educational Resources Information Center

    Akbaba-Altun, Sadegul

    2006-01-01

    Integrating Information and Communication Technologies (ICT) into a centralized education system such as Turkey's depends on its successful design and application, which is an expensive and complex process. The aim of this study was to identify the issues related to integrating computer technologies into a centralized education system. Data were…

  18. Computed Tomography Technology: Development and Applications for Defence

    SciTech Connect

    Baheti, G. L.; Saxena, Nisheet; Tripathi, D. K.; Songara, K. C.; Meghwal, L. R.; Meena, V. L.

    2008-09-26

    Computed Tomography (CT) has revolutionized the field of Non-Destructive Testing and Evaluation (NDT and E). Tomography for industrial applications warrants design and development of customized solutions catering to specific visualization requirements. The present paper highlights Tomography Technology Solutions implemented at Defence Laboratory, Jodhpur (DLJ). Details on the technological developments carried out and their utilization for various Defence applications have been covered.

  19. Factors Influencing Cloud-Computing Technology Adoption in Developing Countries

    ERIC Educational Resources Information Center

    Hailu, Alemayehu

    2012-01-01

    Adoption of new technology has complicating components both from the selection, as well as decision-making criteria and process. Although new technology such as cloud computing provides great benefits especially to the developing countries, it has challenges that may complicate the selection decision and subsequent adoption process. This study…

  20. Building Computer Technology Skills in TESOL Teacher Education

    ERIC Educational Resources Information Center

    DelliCarpini, Margo

    2012-01-01

    This paper reports on an action research study that investigated factors influencing TESOL (teaching English to speakers of other languages) teacher candidates' (TCs) selection and use of technology in the English as a second language (ESL) classroom and the influence of explicit training in context in the use of computer technology for second…

  1. Exploring Computer Technology. The Illinois Plan for Industrial Education.

    ERIC Educational Resources Information Center

    Illinois State Univ., Normal.

    This guide, which is one in the "Exploration" series of curriculum guides intended to assist junior high and middle school industrial educators in helping their students explore diverse industrial situations and technologies used in industry, deals with exploring computer technology. The following topics are covered in the individual lessons: the…

  2. Computers Put a Journalism School on Technology's Leading Edge.

    ERIC Educational Resources Information Center

    Blum, Debra E.

    1992-01-01

    Since 1985, the University of Missouri at Columbia's School of Journalism has been developing a high-technology environment for student work, including word processing, electronic imaging, networked personal computers, and telecommunications. Some faculty worry that the emphasis on technology may overshadow the concepts, principles, and substance…

  3. Computed Tomography Technology: Development and Applications for Defence

    NASA Astrophysics Data System (ADS)

    Baheti, G. L.; Saxena, Nisheet; Tripathi, D. K.; Songara, K. C.; Meghwal, L. R.; Meena, V. L.

    2008-09-01

    Computed Tomography (CT) has revolutionized the field of Non-Destructive Testing and Evaluation (NDT&E). Tomography for industrial applications warrants design and development of customized solutions catering to specific visualization requirements. The present paper highlights Tomography Technology Solutions implemented at Defence Laboratory, Jodhpur (DLJ). Details on the technological developments carried out and their utilization for various Defence applications have been covered.

  4. COMPUGIRLS: Stepping Stone to Future Computer-Based Technology Pathways

    ERIC Educational Resources Information Center

    Lee, Jieun; Husman, Jenefer; Scott, Kimberly A.; Eggum-Wilkens, Natalie D.

    2015-01-01

    The COMPUGIRLS: Culturally relevant technology program for adolescent girls was developed to promote underrepresented girls' future possible selves and career pathways in computer-related technology fields. We hypothesized that the COMPUGIRLS would promote academic possible selves and self-regulation to achieve these possible selves. We compared…

  5. Adaptive optics scanning laser ophthalmoscope imaging: technology update.

    PubMed

    Merino, David; Loza-Alvarez, Pablo

    2016-01-01

    Adaptive optics (AO) retinal imaging has become very popular in the past few years, especially within the ophthalmic research community. Several different retinal techniques, such as fundus imaging cameras or optical coherence tomography systems, have been coupled with AO in order to produce impressive images showing individual cell mosaics over different layers of the in vivo human retina. The combination of AO with scanning laser ophthalmoscopy has been extensively used to generate impressive images of the human retina with unprecedented resolution, showing individual photoreceptor cells, retinal pigment epithelium cells, as well as microscopic capillary vessels, or the nerve fiber layer. Over the past few years, the technique has evolved to develop several different applications not only in the clinic but also in different animal models, thanks to technological developments in the field. These developments have specific applications to different fields of investigation, which are not limited to the study of retinal diseases but also to the understanding of the retinal function and vision science. This review is an attempt to summarize these developments in an understandable and brief manner in order to guide the reader into the possibilities that AO scanning laser ophthalmoscopy offers, as well as its limitations, which should be taken into account when planning on using it.

  6. Adaptive optics scanning laser ophthalmoscope imaging: technology update

    PubMed Central

    Merino, David; Loza-Alvarez, Pablo

    2016-01-01

    Adaptive optics (AO) retinal imaging has become very popular in the past few years, especially within the ophthalmic research community. Several different retinal techniques, such as fundus imaging cameras or optical coherence tomography systems, have been coupled with AO in order to produce impressive images showing individual cell mosaics over different layers of the in vivo human retina. The combination of AO with scanning laser ophthalmoscopy has been extensively used to generate impressive images of the human retina with unprecedented resolution, showing individual photoreceptor cells, retinal pigment epithelium cells, as well as microscopic capillary vessels, or the nerve fiber layer. Over the past few years, the technique has evolved to develop several different applications not only in the clinic but also in different animal models, thanks to technological developments in the field. These developments have specific applications to different fields of investigation, which are not limited to the study of retinal diseases but also to the understanding of the retinal function and vision science. This review is an attempt to summarize these developments in an understandable and brief manner in order to guide the reader into the possibilities that AO scanning laser ophthalmoscopy offers, as well as its limitations, which should be taken into account when planning on using it. PMID:27175057

  7. Study of large adaptive arrays for space technology applications

    NASA Technical Reports Server (NTRS)

    Berkowitz, R. S.; Steinberg, B.; Powers, E.; Lim, T.

    1977-01-01

    The research in large adaptive antenna arrays for space technology applications is reported. Specifically two tasks were considered. The first was a system design study for accurate determination of the positions and the frequencies of sources radiating from the earth's surface that could be used for the rapid location of people or vehicles in distress. This system design study led to a nonrigid array about 8 km in size with means for locating the array element positions, receiving signals from the earth and determining the source locations and frequencies of the transmitting sources. It is concluded that this system design is feasible, and satisfies the desired objectives. The second task was an experiment to determine the largest earthbound array which could simulate a spaceborne experiment. It was determined that an 800 ft array would perform indistinguishably in both locations and it is estimated that one several times larger also would serve satisfactorily. In addition the power density spectrum of the phase difference fluctuations across a large array was measured. It was found that the spectrum falls off approximately as f^(-5/2).

  8. Adapting Computational Data Structures Technology to Reason about Infinity

    ERIC Educational Resources Information Center

    Goldberg, Robert; Hammerman, Natalie

    2004-01-01

    The NCTM curriculum states that students should be able to "compare and contrast the real number system and its various subsystems with regard to their structural characteristics." In evaluating overall conformity to the 1989 standard, the National Council of Teachers of Mathematics (NCTM) requires that "teachers must value and encourage the use…

  9. GPU-based computational adaptive optics for volumetric optical coherence microscopy

    NASA Astrophysics Data System (ADS)

    Tang, Han; Mulligan, Jeffrey A.; Untracht, Gavrielle R.; Zhang, Xihao; Adie, Steven G.

    2016-03-01

    Optical coherence tomography (OCT) is a non-invasive imaging technique that measures reflectance from within biological tissues. Current higher-NA optical coherence microscopy (OCM) technologies with near cellular resolution have limitations on volumetric imaging capabilities due to the trade-offs between resolution vs. depth-of-field and sensitivity to aberrations. Such trade-offs can be addressed using computational adaptive optics (CAO), which corrects aberration computationally for all depths based on the complex optical field measured by OCT. However, due to the large size of datasets plus the computational complexity of CAO and OCT algorithms, it is a challenge to achieve high-resolution 3D-OCM reconstructions at speeds suitable for clinical and research OCM imaging. In recent years, real-time OCT reconstruction incorporating both dispersion and defocus correction has been achieved through parallel computing on graphics processing units (GPUs). We add to these methods by implementing depth-dependent aberration correction for volumetric OCM using plane-by-plane phase deconvolution. Following both defocus and aberration correction, our reconstruction algorithm achieved depth-independent transverse resolution of 2.8 µm, equal to the diffraction-limited focal plane resolution. We have translated the CAO algorithm to a CUDA code implementation and tested the speed of the software in real-time using two GPUs - NVIDIA Quadro K600 and GeForce TITAN Z. For a data volume containing 4096×256×256 voxels, our system's processing speed can keep up with the 60 kHz acquisition rate of the line-scan camera, and takes 1.09 seconds to simultaneously update the CAO correction for 3 en face planes at user-selectable depths.

  10. Computational relativistic astrophysics with adaptive mesh refinement: Testbeds

    SciTech Connect

    Evans, Edwin; Iyer, Sai; Tao Jian; Wolfmeyer, Randy; Zhang Huimin; Schnetter, Erik; Suen, Wai-Mo

    2005-04-15

    We have carried out numerical simulations of strongly gravitating systems based on the Einstein equations coupled to the relativistic hydrodynamic equations using adaptive mesh refinement (AMR) techniques. We show AMR simulations of neutron star (NS) binary inspiral and coalescence carried out on a workstation having an accuracy equivalent to that of a 1025³ regular unigrid simulation, which is, to the best of our knowledge, larger than all previous simulations of similar NS systems on supercomputers. We believe this capability opens new possibilities in general relativistic simulations.
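    The core decision in any AMR scheme is where to refine. A hedged, illustrative sketch of a gradient-based refinement flag on a 1D grid follows; the threshold and test profile are assumptions for illustration, not the criterion used in the paper:

```python
# Hedged sketch: flag cells for refinement where the solution jump between
# neighboring cells exceeds a threshold, as an AMR driver might do.
import numpy as np

def flag_for_refinement(u, threshold):
    """Return a boolean mask marking cells adjacent to steep interfaces."""
    jumps = np.abs(np.diff(u))               # |u[i+1] - u[i]| per interface
    flags = np.zeros(u.shape, dtype=bool)
    flags[:-1] |= jumps > threshold          # flag both cells at each steep interface
    flags[1:] |= jumps > threshold
    return flags

x = np.linspace(0.0, 1.0, 101)
u = np.tanh((x - 0.5) / 0.02)                # steep front near x = 0.5
flags = flag_for_refinement(u, threshold=0.2)
```

    Only the cells near the front are flagged, so a refined patch would be placed there while the smooth regions stay on the coarse grid.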

  11. Comparing Computer-Adaptive and Curriculum-Based Measurement Methods of Assessment

    ERIC Educational Resources Information Center

    Shapiro, Edward S.; Gebhardt, Sarah N.

    2012-01-01

    This article reported the concurrent, predictive, and diagnostic accuracy of a computer-adaptive test (CAT) and curriculum-based measurements (CBM; both computation and concepts/application measures) for universal screening in mathematics among students in first through fourth grade. Correlational analyses indicated moderate to strong…

  12. Adapting the traveling salesman problem to an adiabatic quantum computer

    NASA Astrophysics Data System (ADS)

    Warren, Richard H.

    2013-04-01

    We show how to guide a quantum computer to select an optimal tour for the traveling salesman. This is significant because it opens a rapid solution method for the wide range of applications of the traveling salesman problem, which include vehicle routing, job sequencing and data clustering.
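    The approach presumes the traveling salesman problem can be cast in the quadratic binary form an adiabatic machine minimizes. A sketch of the standard QUBO energy for the usual city-per-step encoding follows; the penalty weight `A` and the 3-city distance matrix are assumptions for illustration, not values from the paper:

```python
# Hedged sketch: energy of the standard QUBO encoding of the TSP, where
# x[i, t] = 1 iff city i is visited at time step t.
import itertools
import numpy as np

def tsp_qubo_energy(x, dist, A):
    """Penalty terms (weight A) enforce one city per step and one step per
    city; the remaining term sums distances between consecutive steps."""
    n = len(dist)
    energy = 0.0
    for t in range(n):                       # exactly one city per time step
        energy += A * (x[:, t].sum() - 1) ** 2
    for i in range(n):                       # each city visited exactly once
        energy += A * (x[i, :].sum() - 1) ** 2
    for i, j in itertools.product(range(n), repeat=2):
        for t in range(n):                   # tour length, wrapping around
            energy += dist[i][j] * x[i, t] * x[j, (t + 1) % n]
    return energy

dist = np.array([[0, 1, 2], [1, 0, 3], [2, 3, 0]], dtype=float)
tour = np.eye(3)                             # visit city 0, then 1, then 2
energy = tsp_qubo_energy(tour, dist, A=10.0) # no penalty: tour length 1 + 3 + 2
```

    A valid permutation matrix incurs no penalty, so the minimum of this energy over binary `x` selects the shortest tour, which is the objective handed to the adiabatic solver.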

  13. Helping Students Adapt to Computer-Based Encrypted Examinations

    ERIC Educational Resources Information Center

    Baker-Eveleth, Lori; Eveleth, Daniel M.; O'Neill, Michele; Stone, Robert W.

    2006-01-01

    The College of Business and Economics at the University of Idaho conducted a pilot study that used commercially available encryption software called Securexam to deliver computer-based examinations. A multi-step implementation procedure was developed, implemented, and then evaluated on the basis of what students viewed as valuable. Two key aspects…

  14. Computer simulation program is adaptable to industrial processes

    NASA Technical Reports Server (NTRS)

    Schultz, F. E.

    1966-01-01

    The Reaction kinetics ablation program /REKAP/, developed to simulate ablation of various materials, provides mathematical formulations for computer programs which can simulate certain industrial processes. The programs are based on the use of nonsymmetrical difference equations that are employed to solve complex partial differential equation systems.

  15. Socio-Pedagogical Complex as a Pedagogical Support Technology of Students' Social Adaptation

    ERIC Educational Resources Information Center

    Sadovaya, Victoriya V.; Simonova, Galina I.

    2016-01-01

    The relevance of the problem stated in the article is determined by the need of developing technological approaches to pedagogical support of students' social adaptation. The purpose of this paper is to position the technological sequence of pedagogical support of students' social adaptation in the activities of the socio-pedagogical complex. The…

  16. A survey of adaptive control technology in robotics

    NASA Technical Reports Server (NTRS)

    Tosunoglu, S.; Tesar, D.

    1987-01-01

    Previous work on the adaptive control of robotic systems is reviewed. Although the field is relatively new and does not yet represent a mature discipline, considerable attention has been given to the design of sophisticated robot controllers. Here, adaptive control methods are divided into model reference adaptive systems and self-tuning regulators with further definition of various approaches given in each class. The similarity and distinct features of the designed controllers are delineated and tabulated to enhance comparative review.

  17. Adaptive critic design for computer intrusion detection system

    NASA Astrophysics Data System (ADS)

    Novokhodko, Alexander; Wunsch, Donald C., II; Dagli, Cihan H.

    2001-03-01

    This paper summarizes ongoing research. A neural network is used to detect a computer system intrusion based on data from the system audit trail generated by the Solaris Basic Security Module. The data have been provided by Lincoln Labs, MIT. The system alerts the human operator when it encounters suspicious activity logged in the audit trail. To reduce the false alarm rate and accommodate the temporal indefiniteness of the moment of attack, a reinforcement learning approach is chosen to train the network.

  18. How to adapt portable computers for field gaugers

    SciTech Connect

    Cain, S. )

    1990-12-01

    Problems to be solved in using portable computers for field gaugers include developing the proper software and selecting the proper hardware. This article discusses software development and the considerations surrounding the selection of hardware suitable for field gaugers. Six stages of development are discussed. Software development is largely a process of communicating information about the eventual program and translating this information from one form to another.

  19. From biological neural networks to thinking machines: Transitioning biological organizational principles to computer technology

    NASA Technical Reports Server (NTRS)

    Ross, Muriel D.

    1991-01-01

    The three-dimensional organization of the vestibular macula is under study by computer assisted reconstruction and simulation methods as a model for more complex neural systems. One goal of this research is to transition knowledge of biological neural network architecture and functioning to computer technology, to contribute to the development of thinking computers. Maculas are organized as weighted neural networks for parallel distributed processing of information. The network is characterized by non-linearity of its terminal/receptive fields. Wiring appears to develop through constrained randomness. A further property is the presence of two main circuits, highly channeled and distributed modifying, that are connected through feedforward-feedback collaterals and a biasing subcircuit. Computer simulations demonstrate that differences in the geometry of the feedback (afferent) collaterals affect the timing and the magnitude of voltage changes delivered to the spike initiation zone. Feedforward (efferent) collaterals act as voltage followers and likely inhibit neurons of the distributed modifying circuit. These results illustrate the importance of feedforward-feedback loops, of timing, and of inhibition in refining neural network output. They also suggest that it is the distributed modifying network that is most involved in adaptation, memory, and learning. Tests of macular adaptation, through hyper- and microgravitational studies, support this hypothesis since synapses in the distributed modifying circuit, but not the channeled circuit, are altered. Transitioning knowledge of biological systems to computer technology, however, remains problematical.

  20. Adapting Wireless Technology to Lighting Control and Environmental Sensing

    SciTech Connect

    Dana Teasdale; Francis Rubinstein; Dave Watson; Steve Purdy

    2005-10-01

    The high cost of retrofitting buildings with advanced lighting control systems is a barrier to adoption of this energy-saving technology. Wireless technology, however, offers a solution to mounting installation costs since it requires no additional wiring to implement. To demonstrate the feasibility of such a system, a prototype wirelessly-controlled advanced lighting system was designed and built. The system includes the following components: a wirelessly-controllable analog circuit module (ACM), a wirelessly-controllable electronic dimmable ballast, a T8 3-lamp fixture, an environmental multi-sensor, a current transducer, and control software. The ACM, dimmable ballast, multi-sensor, and current transducer were all integrated with SmartMesh™ wireless mesh networking nodes, called motes, enabling wireless communication, sensor monitoring, and actuator control. Each mote-enabled device has a reliable communication path to the SmartMesh Manager, a single board computer that controls network functions and connects the wireless network to a PC running lighting control software. The ACM is capable of locally driving one or more standard 0-10 Volt electronic dimmable ballasts through relay control and a 0-10 Volt controllable output. The mote-integrated electronic dimmable ballast is designed to drive a standard 3-lamp T8 light fixture. The environmental multi-sensor measures occupancy, light level and temperature. The current transducer is used to measure the power consumed by the fixture. Control software was developed to implement advanced lighting algorithms, including daylight ramping, occupancy control, and demand response. Engineering prototypes of each component were fabricated and tested in a bench-scale system. Based on standard industry practices, a cost analysis was conducted.
It is estimated that the installation cost of a wireless advanced lighting control system for a retrofit application is at least 30% lower than a comparable wired system for
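    As an illustration of the daylight-ramping idea described above, a minimal closed-loop dimming step that nudges a 0-10 V command so sensed light tracks a setpoint; the gain, setpoint, and lux values are assumptions, not figures from the report:

```python
# Hedged sketch: one proportional-control step of a closed-loop daylight
# dimming algorithm driving a 0-10 V dimmable ballast input.
def daylight_step(dim_volts, sensed_lux, setpoint_lux, gain=0.005):
    """Return the next 0-10 V dimming command, clamped to the valid range."""
    error = setpoint_lux - sensed_lux        # positive when the room is too dark
    next_volts = dim_volts + gain * error    # proportional correction
    return min(10.0, max(0.0, next_volts))   # clamp to the 0-10 V ballast input

# Example: the room is darker than the 500 lux setpoint, so the command rises.
v = daylight_step(dim_volts=5.0, sensed_lux=300.0, setpoint_lux=500.0)
```

    Run repeatedly against the multi-sensor's light reading, a step like this ramps the fixture down as daylight contribution grows and back up as it fades.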

  1. Impact of new computing systems on computational mechanics and flight-vehicle structures technology

    NASA Technical Reports Server (NTRS)

    Noor, A. K.; Storaasli, O. O.; Fulton, R. E.

    1984-01-01

    Advances in computer technology which may have an impact on computational mechanics and flight vehicle structures technology were reviewed. The characteristics of supersystems, highly parallel systems, and small systems are summarized. The interrelations of numerical algorithms and software with parallel architectures are discussed. A scenario for future hardware/software environment and engineering analysis systems is presented. Research areas with potential for improving the effectiveness of analysis methods in the new environment are identified.

  2. Design of a fault tolerant airborne digital computer. Volume 2: Computational requirements and technology

    NASA Technical Reports Server (NTRS)

    Ratner, R. S.; Shapiro, E. B.; Zeidler, H. M.; Wahlstrom, S. E.; Clark, C. B.; Goldberg, J.

    1973-01-01

    This final report summarizes the work on the design of a fault tolerant digital computer for aircraft. Volume 2 is composed of two parts. Part 1 is concerned with the computational requirements associated with an advanced commercial aircraft. Part 2 reviews the technology that will be available for the implementation of the computer in the 1975-1985 period. With regard to the computation task, 26 computations have been categorized according to computational load, memory requirements, criticality, permitted down-time, and the need to save data in order to effect a roll-back. The technology part stresses the impact of large scale integration (LSI) on the realization of logic and memory. Also considered were module interconnection possibilities so as to minimize fault propagation.

  3. Mechanical Design Technology--Modified. (Computer Assisted Drafting, Computer Aided Design). Curriculum Grant 84/85.

    ERIC Educational Resources Information Center

    Schoolcraft Coll., Livonia, MI.

    This document is a curriculum guide for a program in mechanical design technology (computer-assisted drafting and design) developed at Schoolcraft College, Livonia, Michigan. The program helps students to acquire the skills of drafters and to interact with electronic equipment, with the option of becoming efficient in the computer-aided…

  4. Engineering Technology Programs Courses Guide for Computer Aided Design and Computer Aided Manufacturing.

    ERIC Educational Resources Information Center

    Georgia Univ., Athens. Div. of Vocational Education.

    This guide describes the requirements for courses in computer-aided design and computer-aided manufacturing (CAD/CAM) that are part of engineering technology programs conducted in vocational-technical schools in Georgia. The guide is organized in five sections. The first section provides a rationale for occupations in design and in production,…

  5. First Year Preservice Teachers' Attitudes toward Computers from Computer Education and Instructional Technology Department

    ERIC Educational Resources Information Center

    Yakin, Ilker; Sumuer, Evren

    2007-01-01

    The purpose of the study is to explore the attitudes of first year university students towards computers. The study focuses on preservice teachers (N=46), 33 male and 12 female, from the Middle East Technical University Computer Education and Instructional Technology (CEIT) department. The study is delimited to first-year preservice teachers…

  6. Combining Brain–Computer Interfaces and Assistive Technologies: State-of-the-Art and Challenges

    PubMed Central

    Millán, J. d. R.; Rupp, R.; Müller-Putz, G. R.; Murray-Smith, R.; Giugliemma, C.; Tangermann, M.; Vidaurre, C.; Cincotti, F.; Kübler, A.; Leeb, R.; Neuper, C.; Müller, K.-R.; Mattia, D.

    2010-01-01

    In recent years, new research has brought the field of electroencephalogram (EEG)-based brain–computer interfacing (BCI) out of its infancy and into a phase of relative maturity through many demonstrated prototypes such as brain-controlled wheelchairs, keyboards, and computer games. With this proof-of-concept phase in the past, the time is now ripe to focus on the development of practical BCI technologies that can be brought out of the lab and into real-world applications. In particular, we focus on the prospect of improving the lives of countless disabled individuals through a combination of BCI technology with existing assistive technologies (AT). In pursuit of more practical BCIs for use outside of the lab, in this paper, we identify four application areas where disabled individuals could greatly benefit from advancements in BCI technology, namely, “Communication and Control”, “Motor Substitution”, “Entertainment”, and “Motor Recovery”. We review the current state of the art and possible future developments, while discussing the main research issues in these four areas. In particular, we expect the most progress in the development of technologies such as hybrid BCI architectures, user–machine adaptation algorithms, the exploitation of users’ mental states for BCI reliability and confidence measures, the incorporation of principles in human–computer interaction (HCI) to improve BCI usability, and the development of novel BCI technology including better EEG devices. PMID:20877434

  7. Beyond computer literacy: supporting youth's positive development through technology.

    PubMed

    Bers, Marina Umaschi

    2010-01-01

    In a digital era in which technology plays a role in most aspects of a child's life, having the competence and confidence to use computers might be a necessary step, but not a goal in itself. Developing character traits that will serve children to use technology in a safe way to communicate and connect with others, and providing opportunities for children to make a better world through the use of their computational skills, is just as important. The Positive Technological Development framework (PTD), a natural extension of the computer literacy and the technological fluency movements that have influenced the world of educational technology, adds psychosocial, civic, and ethical components to the cognitive ones. PTD examines the developmental tasks of a child growing up in our digital era and provides a model for developing and evaluating technology-rich youth programs. The explicit goal of PTD programs is to support children in the positive uses of technology to lead more fulfilling lives and make the world a better place. This article introduces the concept of PTD and presents examples of the Zora virtual world program for young people that the author developed following this framework.

  8. Portable Computer Technology (PCT) Research and Development Program Phase 2

    NASA Technical Reports Server (NTRS)

    Castillo, Michael; McGuire, Kenyon; Sorgi, Alan

    1995-01-01

    This project report focused on: (1) Design and development of two Advanced Portable Workstation 2 (APW 2) units. These units incorporate advanced technology features such as a low power Pentium processor, a high resolution color display, National Television Standards Committee (NTSC) video handling capabilities, a Personal Computer Memory Card International Association (PCMCIA) interface, and Small Computer System Interface (SCSI) and ethernet interfaces. (2) Use of these units to integrate and demonstrate advanced wireless network and portable video capabilities. (3) Qualification of the APW 2 systems for use in specific experiments aboard the Mir Space Station. A major objective of the PCT Phase 2 program was to help guide future choices in computing platforms and techniques for meeting National Aeronautics and Space Administration (NASA) mission objectives. The focus was on the development of optimal configurations of computing hardware, software applications, and network technologies for use on NASA missions.

  9. Technology-based intervention to help persons with minimally conscious state and pervasive motor disabilities perform environmentally relevant adaptive behavior.

    PubMed

    Lancioni, Giulio E; Singh, Nirbhay N; O'Reilly, Mark F; Sigafoos, Jeff; Olivetti Belardinelli, Marta

    2012-08-01

    Persons with a diagnosis of minimally conscious state and pervasive motor disabilities tend to be passive and isolated. A way to help them improve their adaptive behavior (relate to their environment) involves the use of intervention packages combining assistive technology with motivational strategies. The types of assistive technology included in those packages may consist of (a) microswitches allowing direct access to environmental stimuli, (b) combinations of microswitches and voice output communication devices (VOCAs) allowing stimulus access and calls for caregivers' attention, respectively, and (c) computer presentations of stimulus options and microswitches allowing choice among those options and access to them.

  10. Application of software technology to a future spacecraft computer design

    NASA Technical Reports Server (NTRS)

    Labaugh, R. J.

    1980-01-01

    A study was conducted to determine how major improvements in spacecraft computer systems can be obtained from recent advances in hardware and software technology. Investigations into integrated circuit technology indicated that the CMOS/SOS chip set being developed for the Air Force Avionics Laboratory at Wright Patterson had the best potential for improving the performance of spaceborne computer systems. An integral part of the chip set is the bit slice arithmetic and logic unit. The flexibility allowed by microprogramming, combined with the software investigations, led to the specification of a baseline architecture and instruction set.

  11. Computer assisted orthopaedic surgery. Image guided and robotic assistive technologies.

    PubMed

    DiGioia, A M; Jaramaz, B; Colgan, B D

    1998-09-01

    Technologies are emerging that will influence the way in which orthopaedic surgery is planned, simulated, and performed. Recent advances in the fields of medical imaging, computer vision, and robotics have provided the enabling technologies to permit computer aided surgery to become an established area which can address clinical needs. Although these technologies have been applied in industry for more than 20 years, the field of computer assisted orthopaedic surgery is still in its infancy. Image guided and surgical navigation systems, robotic assistive devices, and surgical simulators have begun to emerge from the laboratory and hold the potential to improve current surgical practice and patients' outcomes. The goals of these new clinically focused technologies are to develop interactive, patient specific preoperative planners to optimize the performance of surgery and the postoperative biologic response, and develop more precise and less invasive interactive smart tools and sensors to assist in the accurate and precise performance of surgery. The medical community is beginning to see the benefit of these enabling technologies which can be realized only through the collaboration and combined expertise of engineers, roboticists, computer scientists, and surgeons.

  12. The implementation of AI technologies in computer wargames

    NASA Astrophysics Data System (ADS)

    Tiller, John A.

    2004-08-01

    Computer wargames involve the most in-depth analysis of general game theory. The enumerated turns of a game like chess are dwarfed by the exponentially larger possibilities of even a simple computer wargame. Implementing challenging AI in computer wargames is an important goal in both the commercial and military environments. In the commercial marketplace, customers demand a challenging AI opponent when they play a computer wargame and are frustrated by a lack of competence on the part of the AI. In the military environment, challenging AI opponents are important for several reasons. A challenging AI opponent will force the military professional to avoid routine or set-piece approaches to situations and cause them to think much deeper about military situations before taking action. A good AI opponent would also include national characteristics of the opponent being simulated, thus providing the military professional with even more of a challenge in planning and approach. Implementing current AI technologies in computer wargames is a technological challenge. The goal is to join the needs of AI in computer wargames with the solutions of current AI technologies. This talk will address several of those issues, possible solutions, and currently unsolved problems.

  13. Adapting Wireless Technology to Lighting Control and Environmental Sensing

    SciTech Connect

    Dana Teasdale; Francis Rubinstein; David S. Watson; Steve Purdy

    2006-04-30

    Although advanced lighting control systems offer significant energy savings, the high cost of retrofitting buildings with advanced lighting control systems is a barrier to adoption of this energy-saving technology. Wireless technology, however, offers a solution to mounting installation costs since it requires no additional wiring to implement. To demonstrate the feasibility of such a system, a prototype wirelessly-controlled advanced lighting system was designed and built. The system includes the following components: a wirelessly-controllable analog circuit module (ACM), a wirelessly-controllable electronic dimmable ballast, a T8 3-lamp fixture, an environmental multi-sensor, a current transducer, and control software. The ACM, dimmable ballast, multi-sensor, and current transducer were all integrated with SmartMesh™ wireless mesh networking nodes, called motes, enabling wireless communication, sensor monitoring, and actuator control. Each mote-enabled device has a reliable communication path to the SmartMesh Manager, a single board computer that controls network functions and connects the wireless network to a PC running lighting control software. The ACM is capable of locally driving one or more standard 0-10 Volt electronic dimmable ballasts through relay control and a 0-10 Volt controllable output, in addition to 0-24 Volt and 0-10 Volt inputs. The mote-integrated electronic dimmable ballast is designed to drive a standard 3-lamp T8 light fixture. The environmental multi-sensor measures occupancy, light level and temperature. The current transducer is used to measure the power consumed by the fixture. Control software was developed to implement advanced lighting algorithms, including open and closed-loop daylight ramping, occupancy control, and demand response. Engineering prototypes of each component were fabricated and tested in a bench-scale system. Based on standard industry practices, a cost analysis was conducted. It is estimated that the

  14. A survey on adaptive engine technology for serious games

    NASA Astrophysics Data System (ADS)

    Rasim; Langi, Armein Z. R.; Munir; Rosmansyah, Yusep

    2016-02-01

    Serious games have become an invaluable tool in learning because they can make abstract concepts appear more realistic through simulation. The problem is that players differ in their ability to play the games, which causes players to become frustrated if a game is too difficult or bored if it is too easy. Serious games contain non-player characters (NPCs). The NPCs should adapt to the players so that the players feel comfortable playing the games. Serious games development must therefore involve an adaptive engine, that is, a learning machine that can adapt to different players. The development of adaptive engines can be viewed in terms of frameworks and algorithms. Frameworks include rules based, plan based, organization description based, proficiency of player based, and learning style and cognitive state based. Algorithms include agent based and non-agent based.
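    As a minimal illustration of the "proficiency of player based" class of frameworks the survey describes, a hedged sketch of a difficulty-update rule; the band edges, step size, and difficulty scale are assumptions for illustration:

```python
# Hedged sketch: proficiency-based difficulty adaptation. Difficulty rises
# when the player wins too often and falls when the player struggles.
def update_difficulty(difficulty, win_rate, target=0.5, step=0.1):
    """Return the new difficulty in [0, 1] given the player's recent win rate."""
    if win_rate > target + 0.1:              # player is comfortable: push harder
        difficulty += step
    elif win_rate < target - 0.1:            # player is struggling: ease off
        difficulty -= step
    return min(1.0, max(0.0, difficulty))    # clamp to the [0, 1] scale

# Example: two dominant sessions raise the difficulty, one poor session lowers it.
d = 0.5
for win_rate in (0.9, 0.9, 0.2):
    d = update_difficulty(d, win_rate)
```

    An adaptive engine would feed a rule like this from in-game telemetry and use the result to parameterize NPC behavior, keeping the player inside the comfortable band the abstract describes.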

  15. Adaption of space station technology for lunar operations

    NASA Technical Reports Server (NTRS)

    Garvey, J. M.

    1992-01-01

    Space Station Freedom technology will have the potential for numerous applications in an early lunar base program. The benefits of utilizing station technology in such a fashion include reduced development and facility costs for lunar base systems, shorter schedules, and verification of such technology through space station experience. This paper presents an assessment of opportunities for using station technology in a lunar base program, particularly in the lander/ascent vehicles and surface modules.

  16. Adaption of space station technology for lunar operations

    NASA Astrophysics Data System (ADS)

    Garvey, J. M.

    1992-09-01

    Space Station Freedom technology will have the potential for numerous applications in an early lunar base program. The benefits of utilizing station technology in such a fashion include reduced development and facility costs for lunar base systems, shorter schedules, and verification of such technology through space station experience. This paper presents an assessment of opportunities for using station technology in a lunar base program, particularly in the lander/ascent vehicles and surface modules.

  17. Adapting Wood Technology to Teach Design and Engineering

    ERIC Educational Resources Information Center

    Rummel, Robert A.

    2012-01-01

    Technology education has changed dramatically over the last few years. The transition of industrial arts to technology education and more recently the pursuit of design and engineering has resulted in technology education teachers often needing to change their curriculum and course activities to meet the demands of a rapidly changing profession.…

  18. Computer Technology and Student Preferences in a Nutrition Course

    ERIC Educational Resources Information Center

    Temple, Norman J.; Kemp, Wendy C.; Benson, Wendy A.

    2006-01-01

    This study assessed learner preferences for using computer-based technology in a distance education course. A questionnaire was posted to students who had taken an undergraduate nutrition course at Athabasca University, Canada. The response rate was 57.1% (176 returned out of 308). Subjects were predominately female (93.7%) and nursing students…

  19. Instructors' Integration of Computer Technology: Examining the Role of Interaction

    ERIC Educational Resources Information Center

    Kim, Hoe Kyeung; Rissel, Dorothy

    2008-01-01

    Computer technology has the potential to provide rich resources for language teaching and learning. However, it continues to be underutilized, even though its availability, familiarity, and sophistication are steadily increasing. This case study explored the way in which three language instructors' beliefs about language teaching and learning…

  20. Computer Technology Integration and Student Learning: Barriers and Promise

    ERIC Educational Resources Information Center

    Keengwe, Jared; Onchwari, Grace; Wachira, Patrick

    2008-01-01

    Political and institutional support has enabled many institutions of learning to spend millions of dollars to acquire educational computing tools (Ficklen and Muscara, "Am Educ" 25(3):22-29, 2001) that have not been effectively integrated into the curriculum. While access to educational technology tools has remarkably improved in most schools,…

  1. Introduction to CAD/Computers. High-Technology Training Module.

    ERIC Educational Resources Information Center

    Lockerby, Hugh

    This learning module for an eighth-grade introductory technology course is designed to help teachers introduce students to computer-assisted design (CAD) in a communications unit on graphics. The module contains a module objective and five specific objectives, a content outline, suggested instructor methodology, student activities, a list of six…

  2. Computer-Aided Drafting. Education for Technology Employment.

    ERIC Educational Resources Information Center

    Northern Illinois Univ., De Kalb. Dept. of Technology.

    This computer-aided drafting (CAD) curriculum was developed to provide drafting instructors in Illinois with a useful guide for relating an important new technological advance to the vocational classroom. The competency-based learning activity guides are written to be used with any CAD system being used at the secondary and postsecondary levels.…

  3. Computational Structures Technology for Airframes and Propulsion Systems

    NASA Technical Reports Server (NTRS)

    Noor, Ahmed K. (Compiler); Housner, Jerrold M. (Compiler); Starnes, James H., Jr. (Compiler); Hopkins, Dale A. (Compiler); Chamis, Christos C. (Compiler)

    1992-01-01

    This conference publication contains the presentations and discussions from the joint University of Virginia (UVA)/NASA Workshops. The presentations included NASA Headquarters perspectives on High Speed Civil Transport (HSCT), goals and objectives of the UVA Center for Computational Structures Technology (CST), NASA and Air Force CST activities, CST activities for airframes and propulsion systems in industry, and CST activities at Sandia National Laboratory.

  4. Troubling Discourse: Basic Writing and Computer-Mediated Technologies

    ERIC Educational Resources Information Center

    Jonaitis, Leigh A.

    2012-01-01

    Through an examination of literature in the fields of Basic Writing and developmental education, this essay provides some historical perspective and examines the prevalent discourses on the use of computer-mediated technologies in the basic writing classroom. The author uses Bertram Bruce's (1997) framework of various "stances" on…

  5. The Voice as Computer Interface: A Look at Tomorrow's Technologies.

    ERIC Educational Resources Information Center

    Lange, Holley R.

    1991-01-01

    Discussion of voice as the communications device for computer-human interaction focuses on voice recognition systems for use within a library environment. Voice technologies are described, including voice response and voice recognition; examples of voice systems in use in libraries are examined; and further possibilities, including use with…

  6. NASA CST aids U.S. industry. [computational structures technology

    NASA Technical Reports Server (NTRS)

    Housner, Jerry M.; Pinson, Larry D.

    1993-01-01

    The effect of NASA's Computational Structures Technology (CST) research on aerospace vehicle design and operation is discussed. The application of this research to a proposed version of a high-speed civil transport, to composite structures in aerospace, to the study of crack growth, and to resolving field problems is addressed.

  7. Computer-Assisted Foreign Language Teaching and Learning: Technological Advances

    ERIC Educational Resources Information Center

    Zou, Bin; Xing, Minjie; Wang, Yuping; Sun, Mingyu; Xiang, Catherine H.

    2013-01-01

    Computer-Assisted Foreign Language Teaching and Learning: Technological Advances highlights new research and an original framework that brings together foreign language teaching, experiments and testing practices that utilize the most recent and widely used e-learning resources. This comprehensive collection of research will offer linguistic…

  8. Computer Technologies: Attitudes and Self-Efficacy across Undergraduate Disciplines.

    ERIC Educational Resources Information Center

    Kinzie, Mable B.; And Others

    1994-01-01

    A study of 359 undergraduate students in business (n=125), education (n=111), and nursing (n=123) in 3 state university systems investigated the use of 2 affective measures concerning aspects of computer technology. Data on construct validity, relationship between results of the two measures, and implications for future research are reported.…

  9. Computer integrated manufacturing and technology transfer for improving aerospace productivity

    NASA Astrophysics Data System (ADS)

    Farrington, P. A.; Sica, J.

    1992-03-01

    This paper reviews a cooperative effort, between the Alabama Industrial Development Training Institute and the University of Alabama in Huntsville, to implement a prototype computer integrated manufacturing system. The primary use of this system will be to educate Alabama companies on the organizational and technological issues involved in the implementation of advanced manufacturing systems.

  10. Pervasive Computing and Communication Technologies for U-Learning

    ERIC Educational Resources Information Center

    Park, Young C.

    2014-01-01

    The development of digital information transfer, storage, and communication methods has a significant effect on education. The assimilation of pervasive computing and communication technologies marks another great step forward, with Ubiquitous Learning (U-learning) emerging for next generation learners. In the evolutionary view the 5G (or…

  11. Integration of Computer Technology and Interactive Learning in Geographic Education.

    ERIC Educational Resources Information Center

    Bishop, Michael P.; And Others

    1995-01-01

    Contends that the rapid proliferation of computer technology is dramatically improving geographic instruction. Describes how instructors can easily identify and access Internet resources using Mosaic software. Concludes that these diverse and informative materials can be used in a wide variety of pedagogical tasks. (CFR)

  12. Developing a University's Construction Technology and Mgt's Computer Learning Center.

    ERIC Educational Resources Information Center

    Ryan, Richard; Kramer, Scott

    1994-01-01

    Provides working blueprints for a university technology lab built specifically for construction science students and faculty. More than just housing for computer workstations, the facility is intentionally designed as a medium for better communication and instruction. A future in which distance learning is the norm is addressed. (KRN)

  13. Beyond Computer Literacy: Technology Integration and Curriculum Transformation

    ERIC Educational Resources Information Center

    Safar, Ammar H.; AlKhezzi, Fahad A.

    2013-01-01

    Personal computers, the Internet, smartphones, and other forms of information and communication technology (ICT) have changed our world, our job, our personal lives, as well as how we manage our knowledge and time effectively and efficiently. Research findings in the past decades have acknowledged and affirmed that the content the ICT medium…

  14. National Energy Research Scientific Computing Center (NERSC): Advancing the frontiers of computational science and technology

    SciTech Connect

    Hules, J.

    1996-11-01

    National Energy Research Scientific Computing Center (NERSC) provides researchers with high-performance computing tools to tackle science's biggest and most challenging problems. Founded in 1974 by DOE/ER, the Controlled Thermonuclear Research Computer Center was the first unclassified supercomputer center and was the model for those that followed. Over the years the center's name was changed to the National Magnetic Fusion Energy Computer Center and then to NERSC; it was relocated to LBNL. NERSC, one of the largest unclassified scientific computing resources in the world, is the principal provider of general-purpose computing services to DOE/ER programs: Magnetic Fusion Energy, High Energy and Nuclear Physics, Basic Energy Sciences, Health and Environmental Research, and the Office of Computational and Technology Research. NERSC users are a diverse community located throughout the US and in several foreign countries. This brochure describes: the NERSC advantage, its computational resources and services, future technologies, scientific resources, and computational science of scale (interdisciplinary research over a decade or longer; examples: combustion in engines, waste management chemistry, global climate change modeling).

  15. Providing Assistive Technology Applications as a Service Through Cloud Computing.

    PubMed

    Mulfari, Davide; Celesti, Antonio; Villari, Massimo; Puliafito, Antonio

    2015-01-01

    Users with disabilities interact with Personal Computers (PCs) using Assistive Technology (AT) software solutions. Such applications run on a PC that a person with a disability commonly uses. However, the configuration of AT applications is not trivial at all, especially whenever the user needs to work on a PC that does not allow him/her to rely on his/her AT tools (e.g., at work, at university, in an Internet point). In this paper, we discuss how cloud computing provides a valid technological solution to enhance such a scenario. With the emergence of cloud computing, many applications are executed on top of virtual machines (VMs). Virtualization allows us to achieve a software implementation of a real computer able to execute a standard operating system and any kind of application. In this paper we propose to build personalized VMs running AT programs and settings. By using remote desktop technology, our solution enables users to control their customized virtual desktop environment by means of an HTML5-based web interface running on any computer equipped with a browser, wherever they are.

  17. Implications of Computer Technology. Harvard University Program on Technology and Society.

    ERIC Educational Resources Information Center

    Taviss, Irene; Burbank, Judith

    Lengthy abstracts of a small number of selected books and articles on the implications of computer technology are presented, preceded by a brief state-of-the-art survey which traces the impact of computers on the structure of economic and political organizations and socio-cultural patterns. A summary statement introduces each of the three abstract…

  18. Advanced computer modeling techniques expand belt conveyor technology

    SciTech Connect

    Alspaugh, M.

    1998-07-01

    Increased mining production is continuing to challenge engineers and manufacturers to keep up. The pressure to produce larger and more versatile equipment is increasing. This paper will show some recent major projects in the belt conveyor industry that have pushed the limits of design and engineering technology. Also, it will discuss the systems engineering discipline and advanced computer modeling tools that have helped make these achievements possible. Several examples of technologically advanced designs will be reviewed. However, new technology can sometimes produce increased problems with equipment availability and reliability if not carefully developed. Computer modeling techniques that help one design larger equipment can also compound operational headaches if engineering processes and algorithms are not carefully analyzed every step of the way.

  19. The Adoption of Grid Computing Technology by Organizations: A Quantitative Study Using Technology Acceptance Model

    ERIC Educational Resources Information Center

    Udoh, Emmanuel E.

    2010-01-01

    Advances in grid technology have enabled some organizations to harness enormous computational power on demand. However, the prediction of widespread adoption of the grid technology has not materialized despite the obvious grid advantages. This situation has encouraged intense efforts to close the research gap in the grid adoption process. In this…

  20. Implementing a Computer/Technology Endorsement in a Classroom Technology Master's Program.

    ERIC Educational Resources Information Center

    Brownell, Gregg; O'Bannon, Blanche; Brownell, Nancy

    In the spring of 1998, the Master's program in Classroom Technology at Bowling Green State University (Ohio) was granted conditional approval to grant, as part of the program, the new State of Ohio Department of Education computer/technology endorsement. This paper briefly describes Ohio's change from certification to licensure, the removal of…

  1. Simple adaptive sparse representation based classification schemes for EEG based brain-computer interface applications.

    PubMed

    Shin, Younghak; Lee, Seungchan; Ahn, Minkyu; Cho, Hohyun; Jun, Sung Chan; Lee, Heung-No

    2015-11-01

    One of the main problems related to electroencephalogram (EEG) based brain-computer interface (BCI) systems is the non-stationarity of the underlying EEG signals. This results in the deterioration of the classification performance during experimental sessions. Therefore, adaptive classification techniques are required for EEG based BCI applications. In this paper, we propose simple adaptive sparse representation based classification (SRC) schemes. Supervised and unsupervised dictionary update techniques for new test data and a dictionary modification method by using the incoherence measure of the training data are investigated. The proposed methods are very simple and additional computation for the re-training of the classifier is not needed. The proposed adaptive SRC schemes are evaluated using two BCI experimental datasets. The proposed methods are assessed by comparing classification results with the conventional SRC and other adaptive classification methods. On the basis of the results, we find that the proposed adaptive schemes show relatively improved classification accuracy as compared to conventional methods without requiring additional computation.
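
    The adaptive decision rule sketched in the abstract, classifying by class-wise reconstruction residual and then cautiously growing the dictionary from confident test samples, can be illustrated as follows. This is a simplified sketch: it uses per-class least squares in place of the paper's l1-sparse coding, and a hypothetical margin-based update rather than the incoherence measure the authors investigate.

    ```python
    import numpy as np

    def src_classify(D, labels, x):
        """Classify x by the class whose training columns reconstruct it best.

        Simplified residual-based variant of sparse representation based
        classification (SRC): per-class least squares instead of a global
        l1-minimisation step.
        """
        residuals = {}
        for c in np.unique(labels):
            Dc = D[:, labels == c]                        # class-c dictionary atoms
            coef, *_ = np.linalg.lstsq(Dc, x, rcond=None)
            residuals[c] = np.linalg.norm(x - Dc @ coef)
        return min(residuals, key=residuals.get), residuals

    def update_dictionary(D, labels, x, label, residuals, margin=0.2):
        """Unsupervised-style update (illustrative, not the paper's method):
        append a test sample to the dictionary only when the winning class
        beats the runner-up by a clear relative margin."""
        r = sorted(residuals.values())
        if len(r) > 1 and (r[1] - r[0]) / r[1] > margin:
            D = np.hstack([D, x[:, None]])
            labels = np.append(labels, label)
        return D, labels
    ```

    No retraining is needed on update: the dictionary simply gains a column, which is the sense in which such schemes add essentially no extra computation.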

  2. Applications of automatic mesh generation and adaptive methods in computational medicine

    SciTech Connect

    Schmidt, J.A.; Macleod, R.S.; Johnson, C.R.; Eason, J.C.

    1995-12-31

    Important problems in Computational Medicine exist that can benefit from the implementation of adaptive mesh refinement techniques. Biological systems are so inherently complex that only efficient models running on state of the art hardware can begin to simulate reality. To tackle the complex geometries associated with medical applications we present a general purpose mesh generation scheme based upon the Delaunay tessellation algorithm and an iterative point generator. In addition, automatic, two- and three-dimensional adaptive mesh refinement methods are presented that are derived from local and global estimates of the finite element error. Mesh generation and adaptive refinement techniques are utilized to obtain accurate approximations of bioelectric fields within anatomically correct models of the heart and human thorax. Specifically, we explore the simulation of cardiac defibrillation and the general forward and inverse problems in electrocardiography (ECG). Comparisons between uniform and adaptive refinement techniques are made to highlight the computational efficiency and accuracy of adaptive methods in the solution of field problems in computational medicine.
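
    The refine-where-the-error-is-large loop underlying adaptive mesh refinement can be sketched in one dimension. The midpoint interpolation error below is a cheap stand-in for the finite element error estimates used in the paper:

    ```python
    import numpy as np

    def adapt_mesh(f, nodes, tol=1e-3, max_passes=10):
        """Refine a 1-D mesh wherever a local error indicator exceeds tol.

        Indicator: midpoint interpolation error |f(m) - (f(a)+f(b))/2| on
        each interval [a, b]; flagged intervals are bisected each pass.
        """
        nodes = np.asarray(nodes, dtype=float)
        for _ in range(max_passes):
            mids = 0.5 * (nodes[:-1] + nodes[1:])
            err = np.abs(f(mids) - 0.5 * (f(nodes[:-1]) + f(nodes[1:])))
            bad = err > tol
            if not bad.any():
                break                       # mesh resolves f everywhere
            nodes = np.sort(np.concatenate([nodes, mids[bad]]))
        return nodes
    ```

    Applied to a steep front such as tanh(20(x - 0.5)), the loop concentrates nodes near x = 0.5 and leaves the flat regions coarse, which is the efficiency argument the abstract makes for adaptive over uniform refinement.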

  3. Alienation and Adaptation: Integrating Technology and the Humanities.

    ERIC Educational Resources Information Center

    Smith, Elizabeth T.; Selfe, Cynthia L.

    Although most segments of the American workforce now recognize that computers can reduce the drudgery of repetitive tasks and lighten the burden of information exchange, storage, and retrieval, some people, especially many humanists, social scientists, and other non-technically oriented professionals, remain hesitant to step into the computer age.…

  4. SIMCA T 1.0: A SAS Computer Program for Simulating Computer Adaptive Testing

    ERIC Educational Resources Information Center

    Raiche, Gilles; Blais, Jean-Guy

    2006-01-01

    Monte Carlo methodologies are frequently applied to study the sampling distribution of the estimated proficiency level in adaptive testing. These methods eliminate real situational constraints. However, these Monte Carlo methodologies are not currently supported by the available software programs, and when these programs are available, their…

  5. Study on rule-based adaptive fuzzy excitation control technology

    NASA Astrophysics Data System (ADS)

    Zhao, Hui; Wang, Hong-jun; Liu, Lu-yuan; Yue, You-jun

    2008-10-01

    A power system is a typical nonlinear system, and it is hard to achieve excellent control performance with a conventional PID controller under different operating conditions. A fuzzy parameter-adaptive PID excitation controller is very efficient at overcoming the influence of small disturbances, but the performance of the control system worsens when the operating conditions change greatly or larger disturbances occur. To solve this problem, this article presents a rule-adaptive fuzzy control scheme for a synchronous generator excitation system. In this scheme, control rule adaptation is implemented by regulating the value of the parameter di under the given proportional divisors K1, K2, and K3 of the fuzzy sets Ai and Bi. The adaptive mechanism consists of two groups of original rules for the self-generation and self-correction of the control rules; using these two groups, the control rules activated by states 1 and 2 (figure 2) can be regulated automatically and simultaneously at time instant k. The results of both theoretical analysis and simulation show that the presented scheme is effective and feasible and possesses good performance.
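
    A minimal illustration of the kind of fuzzy inference involved: a nine-rule Mamdani-style table over (error, error rate) with triangular memberships and singleton outputs, defuzzified by a weighted average. The rule table, membership supports, and output values are invented for illustration and are not the paper's rule set.

    ```python
    def tri(x, a, b, c):
        """Triangular membership function with support [a, c] and peak b."""
        if x <= a or x >= c:
            return 0.0
        return (x - a) / (b - a) if x < b else (c - x) / (c - b)

    def fuzzy_gain(error, d_error):
        """Map (error, error rate) on [-1, 1] to a gain correction."""
        sets = {'N': (-1.5, -1.0, 0.0),      # negative / zero / positive
                'Z': (-1.0, 0.0, 1.0),
                'P': (0.0, 1.0, 1.5)}
        out = {'N': -0.5, 'Z': 0.0, 'P': 0.5}   # singleton outputs
        rules = {('N', 'N'): 'N', ('N', 'Z'): 'N', ('N', 'P'): 'Z',
                 ('Z', 'N'): 'N', ('Z', 'Z'): 'Z', ('Z', 'P'): 'P',
                 ('P', 'N'): 'Z', ('P', 'Z'): 'P', ('P', 'P'): 'P'}
        num = den = 0.0
        for (e_lbl, de_lbl), o_lbl in rules.items():
            w = min(tri(error, *sets[e_lbl]), tri(d_error, *sets[de_lbl]))
            num += w * out[o_lbl]
            den += w
        return num / den if den else 0.0
    ```

    A rule-adaptive scheme of the kind the paper describes would additionally modify this table (or its parameters) online; the fixed table above only shows the inference step being adapted.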

  6. Collaborative Learning with Multi-Touch Technology: Developing Adaptive Expertise

    ERIC Educational Resources Information Center

    Mercier, Emma M.; Higgins, Steven E.

    2013-01-01

    Developing fluency and flexibility in mathematics is a key goal of upper primary schooling, however, while fluency can be developed with practice, designing activities that support the development of flexibility is more difficult. Drawing on concepts of adaptive expertise, we developed a task for a multi-touch classroom, NumberNet, that aimed to…

  7. Lessons Learned in Designing and Implementing a Computer-Adaptive Test for English

    ERIC Educational Resources Information Center

    Burston, Jack; Neophytou, Maro

    2014-01-01

    This paper describes the lessons learned in designing and implementing a computer-adaptive test (CAT) for English. The early identification of students with weak L2 English proficiency is of critical importance in university settings that have compulsory English language course graduation requirements. The most efficient means of diagnosing the L2…

  8. Comparing Computer Adaptive and Curriculum-Based Measures of Math in Progress Monitoring

    ERIC Educational Resources Information Center

    Shapiro, Edward S.; Dennis, Minyi Shih; Fu, Qiong

    2015-01-01

    The purpose of the study was to compare the use of a Computer Adaptive Test and Curriculum-Based Measurement in the assessment of mathematics. This study also investigated the degree to which slope or rate of change predicted student outcomes on the annual state assessment of mathematics above and beyond scores of single point screening…

  9. Assessing the Reliability of Computer Adaptive Testing Branching Algorithms Using HyperCAT.

    ERIC Educational Resources Information Center

    Shermis, Mark D.; And Others

    The reliability of four branching algorithms commonly used in computer adaptive testing (CAT) was examined. These algorithms were: (1) maximum likelihood (MLE); (2) Bayesian; (3) modal Bayesian; and (4) crossover. Sixty-eight undergraduate college students were randomly assigned to one of the four conditions using the HyperCard-based CAT program,…

  10. An Adaptive Feedback and Review Paradigm for Computer-Based Drills.

    ERIC Educational Resources Information Center

    Siegel, Martin A.; Misselt, A. Lynn

    The Corrective Feedback Paradigm (CFP), which has been refined and expanded through use on the PLATO IV Computer-Based Education System, is based on instructional design strategies implied by stimulus-locus analyses, direct instruction, and instructional feedback methods. Features of the paradigm include adaptive feedback techniques with…

  11. A Practical Computer Adaptive Testing Model for Small-Scale Scenarios

    ERIC Educational Resources Information Center

    Tao, Yu-Hui; Wu, Yu-Lung; Chang, Hsin-Yi

    2008-01-01

    Computer adaptive testing (CAT) is theoretically sound and efficient, and is commonly seen in larger testing programs. It is, however, rarely seen in a smaller-scale scenario, such as in classrooms or business daily routines, because of the complexity of most adopted Item Response Theory (IRT) models. While the Sequential Probability Ratio Test…

  12. The Effect of Adaptive Confidence Strategies in Computer-Assisted Instruction on Learning and Learner Confidence

    ERIC Educational Resources Information Center

    Warren, Richard Daniel

    2012-01-01

    The purpose of this research was to investigate the effects of including adaptive confidence strategies in instructionally sound computer-assisted instruction (CAI) on learning and learner confidence. Seventy-one general educational development (GED) learners recruited from various GED learning centers at community colleges in the southeast United…

  13. Promoting Contextual Vocabulary Learning through an Adaptive Computer-Assisted EFL Reading System

    ERIC Educational Resources Information Center

    Wang, Y.-H.

    2016-01-01

    The study developed an adaptive computer-assisted reading system and investigated its effect on promoting English as a foreign language learner-readers' contextual vocabulary learning performance. Seventy Taiwanese college students were assigned to two reading groups. Participants in the customised reading group read online English texts, each of…

  14. Off-the-shelf real-time computers for next-generation adaptive optics

    NASA Astrophysics Data System (ADS)

    Hippler, Stefan; Looze, Douglas P.; Gaessler, Wolfgang

    2004-10-01

    The performance of adaptive optics systems for existing as well as future giant telescopes depends heavily on the number of active wavefront-compensating elements and on the spatial and temporal sampling of the distorted incoming wavefront. In a phase-A study for an extreme adaptive optics system for the VLT (CHEOPS), as well as for LINC-NIRVANA, a Fizeau interferometer aboard the LBT with a multi-conjugated adaptive optics system, we investigate how today's off-the-shelf computers compare in terms of floating-point computing power, memory bandwidth, input/output bandwidth, and real-time behavior. We address questions such as how the level-three cache can impact memory bandwidth, what matrix-vector multiplication performance is achievable, and what we can learn from standard benchmarks running on different architectures.
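
    The matrix-vector multiplication question is easy to probe on any off-the-shelf machine. The sketch below times the reconstructor multiply that dominates many adaptive optics real-time pipelines; the matrix sizes are illustrative, not those of CHEOPS or LINC-NIRVANA.

    ```python
    import time
    import numpy as np

    def mvm_rate(n_actuators=1000, n_slopes=2000, reps=50):
        """Time the AO reconstruction step: commands = R @ slopes.

        Returns achieved GFLOP/s (counting 2*m*n flops per multiply) and
        the per-iteration latency, which bounds the achievable loop rate.
        """
        R = np.random.rand(n_actuators, n_slopes).astype(np.float32)  # reconstructor
        s = np.random.rand(n_slopes).astype(np.float32)               # wavefront slopes
        R @ s                                                         # warm-up
        t0 = time.perf_counter()
        for _ in range(reps):
            R @ s
        dt = (time.perf_counter() - t0) / reps
        gflops = 2.0 * n_actuators * n_slopes / dt / 1e9
        return gflops, dt
    ```

    For a loop running at 1 kHz, the measured dt must stay well under 1 ms to leave room for wavefront sensing and I/O, which is why memory bandwidth and cache behavior matter as much as raw floating-point peak.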

  15. Impact of Load Balancing on Unstructured Adaptive Grid Computations for Distributed-Memory Multiprocessors

    NASA Technical Reports Server (NTRS)

    Biswas, Rupak; Simon, Horst D.; Sohn, Andrew

    1996-01-01

    The computational requirements for an adaptive solution of unsteady problems change as the simulation progresses. This causes workload imbalance among processors on a parallel machine which, in turn, requires significant data movement at runtime. We present a new dynamic load-balancing framework, called JOVE, that balances the workload across all processors with a global view. Whenever the computational mesh is adapted, JOVE is activated to eliminate the load imbalance. JOVE has been implemented on an IBM SP2 distributed-memory machine in MPI for portability. Experimental results for two model meshes demonstrate that mesh adaption with load balancing gives more than a sixfold improvement over one without load balancing. We also show that JOVE gives a 24-fold speedup on 64 processors compared to sequential execution.
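
    The global-view rebalancing idea can be illustrated with the classic longest-processing-time (LPT) greedy heuristic: after each mesh adaption, reassign element weights so every processor carries a near-equal load. This is a generic stand-in for the rebalancing step, not JOVE's actual algorithm.

    ```python
    import heapq

    def balance(weights, nprocs):
        """Greedy LPT assignment: give each element (heaviest first) to the
        currently least-loaded processor."""
        heap = [(0.0, p) for p in range(nprocs)]      # (load, proc)
        heapq.heapify(heap)
        assign = {}
        for i in sorted(range(len(weights)), key=lambda i: -weights[i]):
            load, p = heapq.heappop(heap)
            assign[i] = p                             # element i -> processor p
            heapq.heappush(heap, (load + weights[i], p))
        loads = [0.0] * nprocs
        for i, p in assign.items():
            loads[p] += weights[i]
        return assign, loads
    ```

    In a real framework the reassignment cost matters too: moving an element between processors requires data movement at runtime, so the benefit of a flatter load profile has to outweigh the migration traffic, which is the trade-off the sixfold improvement quoted above reflects.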

  16. L(sub 1) Adaptive Flight Control System: Flight Evaluation and Technology Transition

    NASA Technical Reports Server (NTRS)

    Xargay, Enric; Hovakimyan, Naira; Dobrokhodov, Vladimir; Kaminer, Isaac; Gregory, Irene M.; Cao, Chengyu

    2010-01-01

    Certification of adaptive control technologies for both manned and unmanned aircraft represents a major challenge for current Verification and Validation techniques. A (missing) key step towards flight certification of adaptive flight control systems is the definition and development of analysis tools and methods to support Verification and Validation for nonlinear systems, similar to the procedures currently used for linear systems. In this paper, we describe and demonstrate the advantages of L(sub 1) adaptive control architectures for closing some of the gaps in certification of adaptive flight control systems, which may facilitate the transition of adaptive control into military and commercial aerospace applications. As illustrative examples, we present the results of a piloted simulation evaluation on the NASA AirSTAR flight test vehicle, and results of an extensive flight test program conducted by the Naval Postgraduate School to demonstrate the advantages of L(sub 1) adaptive control as a verifiable robust adaptive flight control system.

  17. Fax technology for collecting outcomes data in a computer database.

    PubMed

    Khan, Z M; Rascati, K L; Koeller, J M; Smeeding, J

    1999-12-15

    In this era of cost containment, outcomes research is becoming more prevalent. Therefore, various technologies allowing for flexibility in study design and the capture of specific clinical information need to be examined and used. These technologies include fax data systems, pocket scanners, automated telephone equipment, and hand-held computer devices. Fax data systems convert a fax machine into an automated data-entry system. Data-filled forms are faxed to a computer, the fax is converted, and the data are entered into preset fields in a database. Applications for fax systems include acute care-based and ambulatory care-based drug-use evaluations, drug recall systems, and patient-completed surveys of health status. Pocket scanners are hand-held instruments for rapid data entry and transport. Applications for pocket scanning include patient interview responses, procedure and disease analysis, and procedure coding. Options for automated telephone equipment include surveys with interactive voice-mail responses or keypad data entry, pharmacist-monitored drug information and survey services, fax-back and mail-out services, and patient-generated disease intervention programs. Hand-held computer technology is a source of information on multiple protocols and care pathways. All these technologies improve data collection with respect to accuracy and speed, facilitate data analysis, and promote cost-efficient information sharing. The purpose of this study was to evaluate the use of fax technology in data collection for a prospective, multicenter study of the outcomes and cost-effectiveness of two drugs used in the treatment of cancer. Details for the pharmacoeconomic study can be found elsewhere. Fax technology was selected because of the ease with which those responsible for managing the data collection could be trained to use it, the affordability and efficiency of the technology, the ease with which data could be analyzed, and the accuracy of data collection.

  18. Computers in My Curriculum? 18 Lesson Plans for Teaching Computer Awareness without a Computer. Adaptable Grades 3-12.

    ERIC Educational Resources Information Center

    Bailey, Suzanne Powers; Jeffers, Marcia

    Eighteen interrelated, sequential lesson plans and supporting materials for teaching computer literacy at the elementary and secondary levels are presented. The activities, intended to be infused into the regular curriculum, do not require the use of a computer. The introduction presents background information on computer literacy, suggests a…

  19. Cases on Technological Adaptability and Transnational Learning: Issues and Challenges

    ERIC Educational Resources Information Center

    Mukerji, Siran, Ed.; Tripathi, Purnendu, Ed.

    2010-01-01

    Technology holds the key for bridging the gap between access to quality education and the need for enhanced learning experiences. This book contains case studies on divergent themes of personalized learning environments, inclusive learning for social change, innovative learning and assessment techniques, technology and international partnership…

  20. Adapting Technology for School Improvement: A Global Perspective

    ERIC Educational Resources Information Center

    Chapman, David W., Ed.; Mahlck, Lars O., Ed.

    2004-01-01

    This book presents a compilation of articles based on the premise that the move to advanced technology use in primary and secondary schools offers great hope for improving the access, quality, and efficiency of basic education. The aim of the book is to identify and examine how information technologies can be, and are being, used to strengthen the…

  1. IMRT planning on adaptive volume structures--a decisive reduction in computational complexity.

    PubMed

    Scherrer, Alexander; Küfer, Karl-Heinz; Bortfeld, Thomas; Monz, Michael; Alonso, Fernando

    2005-05-01

    The objective of radiotherapy planning is to find a compromise between the contradictory goals of delivering a sufficiently high dose to the target volume while widely sparing critical structures. The search for such a compromise requires the computation of several plans, which mathematically means solving several optimization problems. In the case of intensity modulated radiotherapy (IMRT) these problems are large-scale, hence the accumulated computational expense is very high. The adaptive clustering method presented in this paper overcomes this difficulty. The main idea is to use a preprocessed hierarchy of aggregated dose-volume information as a basis for individually adapted approximations of the original optimization problems. This leads to a decisively reduced computational expense: numerical experiments on several sets of real clinical data typically show computation times decreased by a factor of about 10. In contrast to earlier work in this field, this reduction in computational complexity does not lead to a loss in accuracy: the adaptive clustering method produces the optimum of the original optimization problem.
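
    The aggregation idea can be sketched as follows: cluster voxels (rows of the dose-influence matrix) and then optimize beamlet weights on the much smaller reduced problem. The clustering key and the solver below are toy stand-ins for the paper's hierarchical dose-volume aggregation, chosen only to show why the reduced problem is cheap.

    ```python
    import numpy as np

    def cluster_voxels(D, n_clusters):
        """Aggregate voxels (rows of dose-influence matrix D) into clusters
        by total dose sensitivity; return the reduced matrix, cluster sizes,
        and the voxel groups."""
        order = np.argsort(D.sum(axis=1))               # crude similarity key
        groups = np.array_split(order, n_clusters)
        Dr = np.vstack([D[g].mean(axis=0) for g in groups])
        sizes = np.array([len(g) for g in groups])
        return Dr, sizes, groups

    def optimize_weights(Dr, target, iters=2000):
        """Projected gradient descent for nonnegative beamlet weights on the
        reduced problem min ||Dr w - target||^2 subject to w >= 0."""
        L = np.linalg.norm(Dr, ord=2) ** 2              # safe step size scale
        w = np.zeros(Dr.shape[1])
        for _ in range(iters):
            w -= (1.0 / L) * Dr.T @ (Dr @ w - target)
            np.clip(w, 0.0, None, out=w)                # enforce w >= 0
        return w
    ```

    Each gradient step on the reduced problem costs O(n_clusters x n_beamlets) instead of O(n_voxels x n_beamlets), which is where the order-of-magnitude speedup in the abstract comes from; the paper's contribution is doing this adaptively without sacrificing the optimum of the original problem.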

  2. Comparing computer adaptive and curriculum-based measures of math in progress monitoring.

    PubMed

    Shapiro, Edward S; Dennis, Minyi Shih; Fu, Qiong

    2015-12-01

    The purpose of the study was to compare the use of a Computer Adaptive Test and Curriculum-Based Measurement in the assessment of mathematics. This study also investigated the degree to which slope or rate of change predicted student outcomes on the annual state assessment of mathematics above and beyond scores of single point screening assessments (i.e., the computer adaptive test or the CBM assessment just before the administration of the state assessment). Repeated measurement of mathematics once per month across a 7-month period using a Computer Adaptive Test (STAR-Math) and Curriculum-Based Measurement (CBM, AIMSweb Math Computation, AIMSweb Math Concepts/Applications) was collected for a maximum total of 250 third, fourth, and fifth grade students. Results showed STAR-Math in all 3 grades and AIMSweb Math Concepts/Applications in the third and fifth grades had primarily linear growth patterns in mathematics. AIMSweb Math Computation in all grades and AIMSweb Math Concepts/Applications in Grade 4 had decelerating positive trends. Predictive validity evidence showed the strongest relationships were between STAR-Math and outcomes for third and fourth grade students. The blockwise multiple regression by grade revealed that slopes accounted for only a very small proportion of additional variance above and beyond what was explained by the scores obtained on a single point of assessment just prior to the administration of the state assessment.
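
    The blockwise regression in the study asks how much variance the growth slope explains beyond the single-point score. A minimal sketch of that delta-R-squared computation with ordinary least squares (the variable names are illustrative, not the study's):

    ```python
    import numpy as np

    def r_squared(X, y):
        """R^2 of an OLS fit of y on X with an intercept."""
        X1 = np.column_stack([np.ones(len(y)), X])
        beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
        resid = y - X1 @ beta
        return 1.0 - resid @ resid / ((y - y.mean()) @ (y - y.mean()))

    def delta_r2(single_point, slope, outcome):
        """Variance in the outcome explained by growth slopes beyond the
        final single-point screening score (blockwise entry)."""
        r2_base = r_squared(np.column_stack([single_point]), outcome)
        r2_full = r_squared(np.column_stack([single_point, slope]), outcome)
        return r2_full - r2_base, r2_base, r2_full
    ```

    A small delta-R-squared with a large base R^2 mirrors the study's finding: the last single-point score already carries most of the predictive information, and slope adds little on top.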

  3. Computing, information, and communications: Technologies for the 21st Century

    SciTech Connect

    1998-11-01

    To meet the challenges of a radically new and technologically demanding century, the Federal Computing, Information, and Communications (CIC) programs are investing in long-term research and development (R and D) to advance computing, information, and communications in the United States. CIC R and D programs help Federal departments and agencies to fulfill their evolving missions, assure the long-term national security, better understand and manage the physical environment, improve health care, help improve the teaching of children, provide tools for lifelong training and distance learning to the workforce, and sustain critical US economic competitiveness. One of the nine committees of the National Science and Technology Council (NSTC), the Committee on Computing, Information, and Communications (CCIC)--through its CIC R and D Subcommittee--coordinates R and D programs conducted by twelve Federal departments and agencies in cooperation with US academia and industry. These R and D programs are organized into five Program Component Areas: (1) HECC--High End Computing and Computation; (2) LSN--Large Scale Networking, including the Next Generation Internet Initiative; (3) HCS--High Confidence Systems; (4) HuCS--Human Centered Systems; and (5) ETHR--Education, Training, and Human Resources. A brief synopsis of FY 1997 accomplishments and FY 1998 goals by PCA is presented. This report, which supplements the President's Fiscal Year 1998 Budget, describes the interagency CIC programs.

  4. Architecture-Adaptive Computing Environment: A Tool for Teaching Parallel Programming

    NASA Technical Reports Server (NTRS)

    Dorband, John E.; Aburdene, Maurice F.

    2002-01-01

    Recently, networked and cluster computation have become very popular. This paper is an introduction to aCe C, a new C-based parallel language for architecture-adaptive programming. The primary purpose of aCe (Architecture-adaptive Computing Environment) is to encourage programmers to implement applications on parallel architectures by providing them the assurance that future architectures will be able to run their applications with a minimum of modification. A secondary purpose is to encourage computer architects to develop new types of architectures by providing an easily implemented software development environment and a library of test applications. This new language should be an ideal tool for teaching parallel programming. In this paper, we focus on some fundamental features of aCe C.

  5. Unstructured adaptive mesh computations of rotorcraft high-speed impulsive noise

    NASA Technical Reports Server (NTRS)

    Strawn, Roger; Garceau, Michael; Biswas, Rupak

    1993-01-01

    A new method is developed for modeling helicopter high-speed impulsive (HSI) noise. The aerodynamics and acoustics near the rotor blade tip are computed by solving the Euler equations on an unstructured grid. A stationary Kirchhoff surface integral is then used to propagate these acoustic signals to the far field. The near-field Euler solver uses a solution-adaptive grid scheme to improve the resolution of the acoustic signal. Grid points are locally added and/or deleted from the mesh at each adaptive step. An important part of this procedure is the choice of an appropriate error indicator. The error indicator is computed from the flow field solution and determines the regions for mesh coarsening and refinement. Computed results for HSI noise compare favorably with experimental data for three different hovering rotor cases.
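
    The adaptive step described above — computing an error indicator from the flow solution and locally adding or deleting grid points — can be sketched in one dimension. This is a minimal illustration, not the paper's 3-D unstructured Euler scheme; the second-difference indicator and both tolerances are assumptions:

```python
import numpy as np

def adapt_points(x, u, refine_tol=0.1, coarsen_tol=0.01):
    """One adaptive step on a 1-D point set: insert midpoints where the
    error indicator is large and drop points where it is small.  The
    indicator is a second-difference proxy for solution curvature."""
    indicator = np.abs(np.diff(u, 2))            # one value per interior point
    keep, new = [x[0]], []
    for i in range(1, len(x) - 1):
        if indicator[i - 1] > refine_tol:
            new.append(0.5 * (x[i - 1] + x[i]))  # refine: midpoint of left interval
            keep.append(x[i])
        elif indicator[i - 1] < coarsen_tol:
            pass                                 # coarsen: delete this point
        else:
            keep.append(x[i])
    keep.append(x[-1])
    return np.sort(np.array(keep + new))

x = np.linspace(0.0, 1.0, 21)
u = np.tanh(20.0 * (x - 0.48))                   # sharp feature near x = 0.48
x_new = adapt_points(x, u)                       # points cluster near the feature
```

    After one step the point set is denser around the steep feature and sparser in the smooth regions, which is the qualitative behavior the error indicator is meant to produce.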

  6. Computational adaptive optics for broadband interferometric tomography of tissues and cells

    NASA Astrophysics Data System (ADS)

    Adie, Steven G.; Mulligan, Jeffrey A.

    2016-03-01

    Adaptive optics (AO) can shape aberrated optical wavefronts to physically restore the constructive interference needed for high-resolution imaging. With access to the complex optical field, however, many functions of optical hardware can be achieved computationally, including focusing and the compensation of optical aberrations to restore the constructive interference required for diffraction-limited imaging performance. Holography, which employs interferometric detection of the complex optical field, was developed based on this connection between hardware and computational image formation, although this link has only recently been exploited for 3D tomographic imaging in scattering biological tissues. This talk will present the underlying imaging science behind computational image formation with optical coherence tomography (OCT) -- a beam-scanned version of broadband digital holography. Analogous to hardware AO (HAO), we demonstrate computational adaptive optics (CAO) and optimization of the computed pupil correction in 'sensorless mode' (Zernike polynomial corrections with feedback from image metrics) or with the use of 'guide-stars' in the sample. We discuss the concept of an 'isotomic volume' as the volumetric extension of the 'isoplanatic patch' introduced in astronomical AO. Recent CAO results and ongoing work are highlighted to point to the potential biomedical impact of computed broadband interferometric tomography. We also discuss the advantages and disadvantages of HAO vs. CAO for the effective shaping of optical wavefronts, and highlight opportunities for hybrid approaches that synergistically combine the unique advantages of hardware and computational methods for rapid volumetric tomography with cellular resolution.
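
    The 'sensorless' optimization of a computed pupil correction with feedback from an image metric can be sketched for a single aberration mode. The defocus-like phase term, grid size, and peak-intensity metric below are illustrative assumptions, not the authors' implementation:

```python
import numpy as np

# Pupil-plane field with an unknown defocus-like aberration (phase ~ r^2).
n = 64
x = np.linspace(-1.0, 1.0, n)
xx, yy = np.meshgrid(x, x)
r2 = xx**2 + yy**2
pupil = (r2 <= 1.0).astype(float)
true_defocus = 1.5                                # radians; unknown to the optimizer
field = pupil * np.exp(1j * true_defocus * r2)    # 'measured' complex field

def image_metric(c):
    """Sharpness metric: peak intensity of the computed image after
    applying a candidate phase correction c * r^2 to the complex field."""
    img = np.abs(np.fft.fft2(field * np.exp(-1j * c * r2)))**2
    return img.max()

# Sensorless search: sweep the correction coefficient, keep the best metric.
candidates = np.linspace(0.0, 3.0, 61)
best = max(candidates, key=image_metric)          # recovers ~true_defocus
```

    A real CAO pipeline would optimize several Zernike coefficients over the measured OCT volume, but the feedback loop — propose a computed correction, score the resulting image, keep the best — is the same.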

  7. Report of the Panel on Computer and Information Technology

    NASA Technical Reports Server (NTRS)

    Lundstrom, Stephen F.; Larsen, Ronald L.

    1984-01-01

    Aircraft have become more and more dependent on computers (information processing) for improved performance and safety. It is clear that this activity will grow, since information processing technology has advanced by a factor of 10 every 5 years for the past 35 years and will continue to do so. Breakthroughs in device technology, from vacuum tubes through transistors to integrated circuits, contribute to this rapid pace. This progress is nearly matched by similar, though not as dramatic, advances in numerical software and algorithms. Progress has not been easy. Many technical and nontechnical challenges were surmounted. The outlook is for continued growth in capability but will require surmounting new challenges. The technology forecast presented in this report has been developed by extrapolating current trends and assessing the possibilities of several high-risk research topics. In the process, critical problem areas that require research and development emphasis have been identified. The outlook assumes a positive perspective; the projected capabilities are possible by the year 2000, and adequate resources will be made available to achieve them. Computer and information technology forecasts and the potential impacts of this technology on aeronautics are identified. Critical issues and technical challenges underlying the achievement of forecasted performance and benefits are addressed.

  8. Cloud computing and patient engagement: leveraging available technology.

    PubMed

    Noblin, Alice; Cortelyou-Ward, Kendall; Servan, Rosa M

    2014-01-01

    Cloud computing technology has the potential to transform medical practices and improve patient engagement and quality of care. However, issues such as privacy and security and "fit" can make incorporation of the cloud an intimidating decision for many physicians. This article summarizes the four most common types of clouds and discusses their ideal uses, how they engage patients, and how they improve the quality of care offered. This technology also can be used to meet Meaningful Use requirements 1 and 2; and, if speculation is correct, the cloud will provide the necessary support needed for Meaningful Use 3 as well. PMID:25807597

  10. Renewable energy technologies and its adaptation in an urban environment

    SciTech Connect

    Thampi, K. Ravindranathan Byrne, Owen Surolia, Praveen K.

    2014-01-28

    This general article is based on the inaugural talk delivered at the opening of the OMTAT 2013 conference. It notes that the integration of renewable energy sources into the living and transport sectors still presents a daunting task. Although the earth and its atmosphere continually receive 1.7 × 10^17 watts of radiation from the sun, solar power is credited with only a paltry 0.04% of the portfolio of sustainable and environmentally friendly energy options, a portfolio that covers about 16% of the world's energy consumption and is mostly met by biomass. First- and second-generation solar cells offer mature technologies for applications. The most important difficulty with regard to integration with structures is not only the additional cost, but also the lack of sufficient knowledge in managing the available energy smartly and efficiently. Incorporating PV as part of the building fabric greatly reduces overall costs compared with retrofitting. BIPV (building-integrated photovoltaics) is a critical technology for establishing aesthetically pleasing solar structures. Fusing PV with building elements is greatly simplified by some of the second-generation thin-film technologies now manufactured as flexible panels. The same holds true for third-generation technologies under development, such as dye- and quantum-dot-sensitized solar cells. Additionally, these technologies offer transparent or translucent solar cells for incorporation into windows and skylights. This review deals with the present state of solar cell technologies suitable for BIPV and the status of BIPV applications and their future prospects.

  11. Renewable energy technologies and its adaptation in an urban environment

    NASA Astrophysics Data System (ADS)

    Thampi, K. Ravindranathan; Byrne, Owen; Surolia, Praveen K.

    2014-01-01

    This general article is based on the inaugural talk delivered at the opening of the OMTAT 2013 conference. It notes that the integration of renewable energy sources into the living and transport sectors still presents a daunting task. Although the earth and its atmosphere continually receive 1.7 × 10^17 watts of radiation from the sun, solar power is credited with only a paltry 0.04% of the portfolio of sustainable and environmentally friendly energy options, a portfolio that covers about 16% of the world's energy consumption and is mostly met by biomass. First- and second-generation solar cells offer mature technologies for applications. The most important difficulty with regard to integration with structures is not only the additional cost, but also the lack of sufficient knowledge in managing the available energy smartly and efficiently. Incorporating PV as part of the building fabric greatly reduces overall costs compared with retrofitting. BIPV (building-integrated photovoltaics) is a critical technology for establishing aesthetically pleasing solar structures. Fusing PV with building elements is greatly simplified by some of the second-generation thin-film technologies now manufactured as flexible panels. The same holds true for third-generation technologies under development, such as dye- and quantum-dot-sensitized solar cells. Additionally, these technologies offer transparent or translucent solar cells for incorporation into windows and skylights. This review deals with the present state of solar cell technologies suitable for BIPV and the status of BIPV applications and their future prospects.

  12. A case for Sandia investment in complex adaptive systems science and technology.

    SciTech Connect

    Colbaugh, Richard; Tsao, Jeffrey Yeenien; Johnson, Curtis Martin; Backus, George A.; Brown, Theresa Jean; Jones, Katherine A.

    2012-05-01

    This white paper makes a case for Sandia National Laboratories investments in complex adaptive systems science and technology (S&T) -- investments that could enable higher-value-added and more-robustly-engineered solutions to challenges of importance to Sandia's national security mission and to the nation. Complex adaptive systems are ubiquitous in Sandia's national security mission areas. We often ignore the adaptive complexity of these systems by narrowing our 'aperture of concern' to systems or subsystems with a limited range of function exposed to a limited range of environments over limited periods of time. But by widening our aperture of concern we could increase our impact considerably. To do so, the science and technology of complex adaptive systems must mature considerably. Despite an explosion of interest outside of Sandia, however, that science and technology is still in its youth. What has been missing is contact with real (rather than model) systems and real domain-area detail. With its center-of-gravity as an engineering laboratory, Sandia has made considerable progress applying existing science and technology to real complex adaptive systems. It has focused much less, however, on advancing the science and technology itself. But its close contact with real systems and real domain-area detail represents a powerful strength with which to help complex adaptive systems science and technology mature. Sandia is thus both a prime beneficiary of, as well as potentially a prime contributor to, complex adaptive systems science and technology.
Building a productive program in complex adaptive systems science and technology at Sandia will not be trivial, but a credible path can be envisioned: in the short run, continue to apply existing science and technology to real domain-area complex adaptive systems; in the medium run, jump-start the creation of new science and technology capability through Sandia's Laboratory Directed Research and Development program; and

  13. Self-adaptive phosphor coating technology for wafer-level scale chip packaging

    NASA Astrophysics Data System (ADS)

    Linsong, Zhou; Haibo, Rao; Wei, Wang; Xianlong, Wan; Junyuan, Liao; Xuemei, Wang; Da, Zhou; Qiaolin, Lei

    2013-05-01

    A new self-adaptive phosphor coating technology has been successfully developed, which adopts a slurry method combined with a self-exposure process. A phosphor suspension in a water-soluble photoresist was applied, exposed to the LED's own blue light, and developed to form a conformal phosphor coating that adapts itself to the angular intensity distribution of the blue light and yields improved spatial color uniformity. The self-adaptive phosphor coating technology was then successfully applied at the wafer surface to realize a wafer-level scale conformal phosphor coating. First-stage experiments show satisfying results and give an adequate demonstration of the flexibility of the self-adaptive coating technology for WLSCP applications.

  14. 3D fast adaptive correlation imaging for large-scale gravity data based on GPU computation

    NASA Astrophysics Data System (ADS)

    Chen, Z.; Meng, X.; Guo, L.; Liu, G.

    2011-12-01

    In recent years, large-scale gravity data sets have been collected and employed to enhance the gravity problem-solving abilities of tectonics studies in China. Aiming at large-scale data and the requirement of rapid interpretation, previous authors have carried out a lot of work, including fast gradient-module inversion and Euler deconvolution depth inversion, 3-D physical property inversion using stochastic subspaces and equivalent storage, and fast inversion using wavelet transforms and a logarithmic barrier method. So it can be said that 3-D gravity inversion has been greatly improved in the last decade. Many authors added many different kinds of a priori information and constraints to deal with nonuniqueness, using models composed of a large number of contiguous cells of unknown property, and obtained good results. However, due to long computation time, instability and other shortcomings, 3-D physical property inversion has not been widely applied to large-scale data yet. In order to achieve 3-D interpretation with high efficiency and precision for geological and ore bodies and obtain their subsurface distribution, there is an urgent need to find a fast and efficient inversion method for large-scale gravity data. As an entirely new geophysical inversion method, 3D correlation imaging has developed rapidly thanks to the advantages of requiring no a priori information and demanding only a small amount of computer memory. This method was proposed to image the distribution of equivalent excess masses of anomalous geological bodies with high resolution both longitudinally and transversely. In order to transform the equivalent excess masses into real density contrasts, we adopt adaptive correlation imaging for gravity data. After each 3D correlation imaging step, we convert the equivalent masses into density contrasts according to the linear relationship, and then carry out forward gravity calculation for each rectangular cell. Next, we compare the forward gravity data with the real data, and
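
    The imaging-then-calibration loop sketched in the abstract can be illustrated with a toy linear forward model: correlate the observed data with each cell's sensitivity kernel to form the correlation image, then choose the amplitude whose forward response best matches the data. The matrix, cell count, and all names below are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(1)
n_data, n_cells = 40, 25
G = np.abs(rng.normal(size=(n_data, n_cells)))   # hypothetical sensitivity (forward) matrix
true_density = np.zeros(n_cells)
true_density[12] = 1.0                           # a single anomalous cell
d_obs = G @ true_density                         # synthetic observed gravity data

# Correlation imaging: correlate the observed data with each cell's kernel;
# the peak of the correlation image locates the equivalent excess mass.
corr = np.array([
    np.dot(d_obs, G[:, j]) / (np.linalg.norm(d_obs) * np.linalg.norm(G[:, j]))
    for j in range(n_cells)
])

# Turn the image into density contrasts: pick the linear scale factor whose
# forward response best matches the observed data (least-squares amplitude).
density0 = corr / corr.max()
d0 = G @ density0
misfit0 = np.linalg.norm(d_obs - d0) / np.linalg.norm(d_obs)
scale = np.dot(d_obs, d0) / np.dot(d0, d0)
density = scale * density0
misfit = np.linalg.norm(d_obs - G @ density) / np.linalg.norm(d_obs)
```

    The real method iterates this forward-compare-rescale cycle on 3-D cell grids (with GPU acceleration); the sketch keeps only the logic of locating the anomaly from the correlation image and calibrating its amplitude against the data.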

  15. Diversity in computing technologies and strategies for dynamic resource allocation

    SciTech Connect

    Garzoglio, G.; Gutsche, O.

    2015-01-01

    Here, High Energy Physics (HEP) is presented as a very data-intensive and trivially parallelizable science discipline. HEP is probing nature at increasingly finer detail, requiring ever-increasing computational resources to process and analyze experimental data. In this paper, we discuss how HEP has provisioned resources so far using Grid technologies, how HEP is starting to include new resource providers like commercial Clouds and HPC installations, and how HEP is transparently provisioning resources at these diverse providers.

  16. Vector Field Visual Data Analysis Technologies for Petascale Computational Science

    SciTech Connect

    Garth, Christoph; Deines, Eduard; Joy, Kenneth I.; Bethel, E. Wes; Childs, Hank; Weber, Gunther; Ahern, Sean; Pugmire, Dave; Sanderson, Allen; Johnson, Chris

    2009-11-13

    State-of-the-art computational science simulations generate large-scale vector field data sets. Visualization and analysis is a key aspect of obtaining insight into these data sets and represents an important challenge. This article discusses possibilities and challenges of modern vector field visualization and focuses on methods and techniques developed in the SciDAC Visualization and Analytics Center for Enabling Technologies (VACET) and deployed in the open-source visualization tool, VisIt.

  17. Application of Adaptive Decision Aiding Systems to Computer-Assisted Instruction. Final Report, January-December 1974.

    ERIC Educational Resources Information Center

    May, Donald M.; And Others

    The minicomputer-based Computerized Diagnostic and Decision Training (CDDT) system described combines the principles of artificial intelligence, decision theory, and adaptive computer assisted instruction for training in electronic troubleshooting. The system incorporates an adaptive computer program which learns the student's diagnostic and…

  18. 25 CFR 502.7 - Electronic, computer or other technologic aid.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 25 Indians 2 2010-04-01 2010-04-01 false Electronic, computer or other technologic aid. 502.7... DEFINITIONS OF THIS CHAPTER § 502.7 Electronic, computer or other technologic aid. (a) Electronic, computer or... applicable Federal communications law. (b) Electronic, computer or other technologic aids include, but...

  19. 25 CFR 502.7 - Electronic, computer or other technologic aid.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 25 Indians 2 2013-04-01 2013-04-01 false Electronic, computer or other technologic aid. 502.7... DEFINITIONS OF THIS CHAPTER § 502.7 Electronic, computer or other technologic aid. (a) Electronic, computer or... applicable Federal communications law. (b) Electronic, computer or other technologic aids include, but...

  20. 25 CFR 502.7 - Electronic, computer or other technologic aid.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 25 Indians 2 2012-04-01 2012-04-01 false Electronic, computer or other technologic aid. 502.7... DEFINITIONS OF THIS CHAPTER § 502.7 Electronic, computer or other technologic aid. (a) Electronic, computer or... applicable Federal communications law. (b) Electronic, computer or other technologic aids include, but...

  1. 25 CFR 502.7 - Electronic, computer or other technologic aid.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 25 Indians 2 2014-04-01 2014-04-01 false Electronic, computer or other technologic aid. 502.7... DEFINITIONS OF THIS CHAPTER § 502.7 Electronic, computer or other technologic aid. (a) Electronic, computer or... applicable Federal communications law. (b) Electronic, computer or other technologic aids include, but...

  2. 25 CFR 502.7 - Electronic, computer or other technologic aid.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 25 Indians 2 2011-04-01 2011-04-01 false Electronic, computer or other technologic aid. 502.7... DEFINITIONS OF THIS CHAPTER § 502.7 Electronic, computer or other technologic aid. (a) Electronic, computer or... applicable Federal communications law. (b) Electronic, computer or other technologic aids include, but...

  3. Cross-domain adaptation reveals that a common mechanism computes stereoscopic (cyclopean) and luminance plaid motion.

    PubMed

    Bowd, C; Donnelly, M; Shorter, S; Patterson, R

    2000-01-01

    Across three experiments, this study investigated the visual processing of moving stereoscopic plaid patterns (plaids created with cyclopean components defined by moving binocular disparity embedded in a dynamic random-dot stereogram). Results showed that adaptation to a moving stereoscopic plaid or its components affected the perceived coherence of a luminance test plaid, and vice versa. Cross-domain adaptation suggests that stereoscopic and luminance motion signals feed into a common pattern-motion mechanism, consistent with the idea that stereoscopic motion signals are computed early in the motion processing stream.

  4. Adaptation for a Changing Environment: Developing Learning and Teaching with Information and Communication Technologies

    ERIC Educational Resources Information Center

    Kirkwood, Adrian; Price, Linda

    2006-01-01

    This article examines the relationship between the use of information and communication technologies (ICT) and learning and teaching, particularly in distance education contexts. We argue that environmental changes (societal, educational, and technological) make it necessary to adapt systems and practices that are no longer appropriate. The need…

  5. Contingency support using adaptive telemetry extractor and expert system technologies

    NASA Astrophysics Data System (ADS)

    Bryant, Thomas; Cruse, Bryant; Wende, Charles

    The 'telemetry analysis logic for operations support' prototype system constitutes an expert system that is charged with contingency planning for the NASA Hubble Space Telescope (HST); this system has demonstrated the feasibility of using an adaptive telemetry extractor/reformatter that is integrated with an expert system. A test case generated by a simulator has demonstrated the reduction of the time required for analysis of a complex series of failures to a few minutes, from the hour usually required. The HST's telemetry extractor will be able to read real-time engineering telemetry streams and disk-based data. Telemetry format changes will be handled almost instantaneously.

  6. Adapting MCM-D technology to a piezoresistive accelerometer packaging

    NASA Astrophysics Data System (ADS)

    Collado, A.; Plaza, J. A.; Cabruja, E.; Esteve, J.

    2003-07-01

    A silicon-on-silicon multichip module for a piezoresistive accelerometer is presented in this paper. This packaging technology, a type of wafer level packaging, offers fully complementary metal-oxide semiconductor compatible silicon substrates, so a pre-amplification stage can be included at substrate level. The electrical contacts and a partial sealing of the sensor mobile structures are performed at the same step using flip-chip technology, so the cost is reduced. As accelerometers are stress-sensitive devices, great care must be taken in the fabrication process and materials. Thus, test structures have been included to study the packaging effects. In this paper we report on the compatibility of accelerometer and wafer level packaging technologies.

  7. A virtual computer lab for distance biomedical technology education

    PubMed Central

    Locatis, Craig; Vega, Anibal; Bhagwat, Medha; Liu, Wei-Li; Conde, Jose

    2008-01-01

    Background: The National Library of Medicine's National Center for Biotechnology Information offers mini-courses which entail applying concepts in biochemistry and genetics to search genomics databases and other information sources. They are highly interactive and involve use of 3D molecular visualization software that can be computationally taxing. Methods: Methods were devised to offer the courses at a distance so as to provide as much functionality of a computer lab as possible, the venue where they are normally taught. The methods, which can be employed with varied videoconferencing technology and desktop sharing software, were used to deliver mini-courses at a distance in pilot applications where students could see demonstrations by the instructor and the instructor could observe and interact with students working at their remote desktops. Results: Student ratings of the learning experience and comments to open ended questions were similar to those when the courses are offered face to face. The real time interaction and the instructor's ability to access student desktops from a distance in order to provide individual assistance and feedback were considered invaluable. Conclusion: The technologies and methods mimic much of the functionality of computer labs and may be usefully applied in any context where content changes frequently, training needs to be offered on complex computer applications at a distance in real time, and where it is necessary for the instructor to monitor students as they work. PMID:18366629

  8. Advances in parallel computer technology for desktop atmospheric dispersion models

    SciTech Connect

    Bian, X.; Ionescu-Niscov, S.; Fast, J.D.; Allwine, K.J.

    1996-12-31

    Desktop models are those models used by analysts with varied backgrounds, for performing, for example, air quality assessment and emergency response activities. These models must be robust, well documented, have minimal and well controlled user inputs, and have clear outputs. Existing coarse-grained parallel computers can provide significant increases in computation speed in desktop atmospheric dispersion modeling without considerable increases in hardware cost. This increased speed will allow for significant improvements to be made in the scientific foundations of these applied models, in the form of more advanced diffusion schemes and better representation of the wind and turbulence fields. This is especially attractive for emergency response applications where speed and accuracy are of utmost importance. This paper describes one particular application of coarse-grained parallel computer technology to a desktop complex terrain atmospheric dispersion modeling system. By comparing performance characteristics of the coarse-grained parallel version of the model with the single-processor version, we will demonstrate that applying coarse-grained parallel computer technology to desktop atmospheric dispersion modeling systems will allow us to address critical issues facing future requirements of this class of dispersion models.
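
    Coarse-grained parallelism of the kind described — splitting a receptor grid into chunks and evaluating the dispersion model on each chunk independently — can be sketched as follows. The Gaussian-plume formula and all constants are illustrative assumptions, not the modeling system discussed above:

```python
import numpy as np
from concurrent.futures import ThreadPoolExecutor

def plume_chunk(xs):
    """Ground-level centerline Gaussian plume concentration along a row of
    downwind receptors (illustrative constants; not a calibrated model)."""
    Q, u, H = 10.0, 5.0, 50.0               # emission rate, wind speed, stack height
    sigma_y = 0.08 * xs                     # crude dispersion coefficients
    sigma_z = 0.06 * xs
    return (Q / (np.pi * u * sigma_y * sigma_z)) * np.exp(-H**2 / (2 * sigma_z**2))

receptors = np.linspace(100.0, 5000.0, 4000)   # downwind distances (m)
chunks = np.array_split(receptors, 4)          # coarse-grained decomposition
with ThreadPoolExecutor(max_workers=4) as pool:
    conc = np.concatenate(list(pool.map(plume_chunk, chunks)))
serial = plume_chunk(receptors)                # reference serial result
```

    Because each chunk is independent, the parallel and serial results agree exactly; the same decomposition applies to the full 3-D receptor grids of a complex-terrain model, where the per-chunk work is large enough for coarse-grained speedup.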

  9. Reinforcement learning for adaptive threshold control of restorative brain-computer interfaces: a Bayesian simulation.

    PubMed

    Bauer, Robert; Gharabaghi, Alireza

    2015-01-01

    Restorative brain-computer interfaces (BCI) are increasingly used to provide feedback of neuronal states in a bid to normalize pathological brain activity and achieve behavioral gains. However, patients and healthy subjects alike often show a large variability, or even inability, of brain self-regulation for BCI control, known as BCI illiteracy. Although current co-adaptive algorithms are powerful for assistive BCIs, their inherent class switching clashes with the operant conditioning goal of restorative BCIs. Moreover, due to the treatment rationale, the classifier of restorative BCIs usually has a constrained feature space, thus limiting the possibility of classifier adaptation. In this context, we applied a Bayesian model of neurofeedback and reinforcement learning for different threshold selection strategies to study the impact of threshold adaptation of a linear classifier on optimizing restorative BCIs. For each feedback iteration, we first determined the thresholds that result in minimal action entropy and maximal instructional efficiency. We then used the resulting vector for the simulation of continuous threshold adaptation. We could thus show that threshold adaptation can improve reinforcement learning, particularly in cases of BCI illiteracy. Finally, on the basis of information theory, we provided an explanation for the achieved benefits of adaptive threshold setting. PMID:25729347
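
    Continuous threshold adaptation of the general kind studied above can be sketched with a much simpler rule than the authors' Bayesian model: nudge the threshold after every trial so that the long-run reward rate settles near a target. The Gaussian output model, target rate, and step sizes are assumptions for illustration only:

```python
import numpy as np

rng = np.random.default_rng(2)

def adapt_threshold(threshold, target_rate=0.7, lr=0.05, n_trials=400):
    """Nudge a feedback threshold after every trial so the long-run
    reward rate approaches target_rate (stochastic approximation)."""
    hits = []
    for _ in range(n_trials):
        output = rng.normal(0.5, 0.2)        # simulated classifier output
        rewarded = output > threshold
        hits.append(rewarded)
        # Up-step after reward, down-step after failure; the two step
        # sizes balance exactly when P(reward) == target_rate.
        threshold += lr * (1.0 - target_rate) if rewarded else -lr * target_rate
    return threshold, float(np.mean(hits[-100:]))

final_thr, rate = adapt_threshold(threshold=0.9)  # start with a too-hard threshold
```

    Starting from a threshold that almost never rewards the user, the rule drifts down until roughly 70% of trials succeed — the kind of behavior that makes adaptive thresholds helpful in cases of BCI illiteracy, where a fixed threshold would yield no reinforcement at all.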

  10. Using archaeogenomic and computational approaches to unravel the history of local adaptation in crops

    PubMed Central

    Allaby, Robin G.; Gutaker, Rafal; Clarke, Andrew C.; Pearson, Neil; Ware, Roselyn; Palmer, Sarah A.; Kitchen, James L.; Smith, Oliver

    2015-01-01

    Our understanding of the evolution of domestication has changed radically in the past 10 years, from a relatively simplistic rapid origin scenario to a protracted complex process in which plants adapted to the human environment. The adaptation of plants continued as the human environment changed with the expansion of agriculture from its centres of origin. Using archaeogenomics and computational models, we can observe genome evolution directly and understand how plants adapted to the human environment and the regional conditions to which agriculture expanded. We have applied various archaeogenomics approaches as exemplars to study local adaptation of barley to drought resistance at Qasr Ibrim, Egypt. We show the utility of DNA capture, ancient RNA, methylation patterns and DNA from charred remains of archaeobotanical samples from low latitudes where preservation conditions restrict ancient DNA research to within a Holocene timescale. The genomic level of analysis that is now possible, and the complexity of the evolutionary process of local adaptation, mean that plant studies are set to move to the genome level and to account for the interaction of genes under selection in systems-level approaches. In this way we can understand how rapidly plants adapted during the expansion of agriculture across many latitudes. PMID:25487329

  13. Large-Scale Assessment of a Fully Automatic Co-Adaptive Motor Imagery-Based Brain Computer Interface

    PubMed Central

    Acqualagna, Laura; Botrel, Loic; Vidaurre, Carmen; Kübler, Andrea; Blankertz, Benjamin

    2016-01-01

    In recent years, Brain Computer Interface (BCI) technology has benefited from the development of sophisticated machine learning methods that let the user operate the BCI after a few calibration trials. One remarkable example is the recent development of co-adaptive techniques, which have extended the use of BCIs to people unable to achieve successful control with the standard BCI procedure. These improvements are essential especially for BCIs based on modulation of the Sensorimotor Rhythm (SMR), since a non-negligible percentage of users is unable to operate SMR-BCIs efficiently. In this study we evaluated, for the first time on a large scale, a fully automatic co-adaptive BCI system. A pool of 168 participants naive to BCIs operated the co-adaptive SMR-BCI in one single session. Different psychological interventions were performed prior to the BCI session in order to investigate how motor coordination training and relaxation could influence BCI performance. A neurophysiological indicator based on the Power Spectral Density (PSD) was extracted from a few minutes of resting-state brain activity and tested as a predictor of BCI performance. Results show that the majority of participants reached high accuracies in operating the BCI before the end of the session. BCI performance could be significantly predicted by the neurophysiological indicator, consolidating the validity of the previously developed model. Nevertheless, about 22% of users still performed significantly below the threshold of efficient BCI control at the end of the session. Since inter-subject variability remains the major problem of BCI technology, we point out crucial issues for those who did not achieve sufficient control. Finally, we propose developments to move a step closer to the applicability of these promising co-adaptive methods. PMID:26891350
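
    A minimal sketch of a PSD-derived band-power indicator of the kind described: a crude periodogram with a hypothetical 8–15 Hz SMR band, applied to simulated resting-state data (not the study's actual predictor or parameters):

```python
import numpy as np

def band_power(signal, fs, band=(8.0, 15.0)):
    """Crude periodogram band power over a frequency band (Hz).
    The study's PSD indicator is more elaborate; band is an assumption."""
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    psd = np.abs(np.fft.rfft(signal)) ** 2 / (fs * len(signal))
    mask = (freqs >= band[0]) & (freqs <= band[1])
    return psd[mask].sum() * (freqs[1] - freqs[0])   # integrate over band

rng = np.random.default_rng(1)
fs = 250.0
t = np.arange(0, 4.0, 1.0 / fs)
# Simulated resting EEG: an 11 Hz rhythm buried in broadband noise
eeg = np.sin(2 * np.pi * 11 * t) + 0.5 * rng.standard_normal(t.size)
smr = band_power(eeg, fs)                     # alpha/SMR-range power
gamma = band_power(eeg, fs, band=(30.0, 40.0))  # reference band
```

Because the simulated rhythm sits at 11 Hz, the 8–15 Hz power dominates the reference band, which is the kind of contrast a resting-state predictor exploits.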

  15. Campus Computing, 1998. The Ninth National Survey of Desktop Computing and Information Technology in American Higher Education.

    ERIC Educational Resources Information Center

    Green, Kenneth C.

    This report presents findings of a June 1998 survey of computing officials at 1,623 two- and four-year U.S. colleges and universities concerning the use of computer technology. The survey found that computing and information technology (IT) are now core components of the campus environment and classroom experience. However, key aspects of IT…

  16. Reviews of computing technology: Fiber distributed data interface

    SciTech Connect

    Johnson, A.J.

    1991-12-01

    Fiber Distributed Data Interface, more commonly known as FDDI, is the name of the standard that describes a new local area network (LAN) technology for the '90s. This technology is based on fiber-optic communications and, at a data transmission rate of 100 million bits per second (Mbps), provides a full order of magnitude improvement over previous LAN standards such as Ethernet and Token Ring. FDDI as a standard has been accepted by all major computer manufacturers and is a national standard as defined by the American National Standards Institute (ANSI). FDDI will become part of the US Government Open Systems Interconnection Profile (GOSIP) under Version 3 GOSIP and will become an international standard promoted by the International Standards Organization (ISO). It is important to note that there are no competing standards for high-performance LANs, so that FDDI acceptance is nearly universal. This technology report describes FDDI as a technology, looks at the applications of this technology, examines the current economics of using it, and describes activities and plans by the Information Resource Management (IRM) department to implement this technology at the Savannah River Site.
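
    The quoted order-of-magnitude improvement is easy to check with idealized transfer-time arithmetic (protocol overhead, token rotation, and framing ignored; file size is illustrative):

```python
def transfer_seconds(size_mb, rate_mbps):
    """Idealized transfer time: size in megabytes, line rate in Mbit/s.
    Ignores all protocol overhead."""
    return size_mb * 8.0 / rate_mbps

ethernet = transfer_seconds(100, 10)    # classic 10 Mbps Ethernet -> 80 s
fddi = transfer_seconds(100, 100)       # 100 Mbps FDDI -> 8 s
```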

  18. Productivity and Job Security: Retraining to Adapt to Technological Change.

    ERIC Educational Resources Information Center

    National Center for Productivity and Quality of Working Life, Washington, DC.

    This report, the first of a series on productivity and job security, presents five case studies to illustrate retraining to achieve workers' adjustment to technology. The first of seven chapters addresses the following issues: the availability of job training/retraining data, the desirability of informing workers in advance of technological…

  19. Building climate adaptation capabilities through technology and community

    NASA Astrophysics Data System (ADS)

    Murray, D.; McWhirter, J.; Intsiful, J. D.; Cozzini, S.

    2011-12-01

    To effectively plan for adaptation to changes in climate, decision makers require infrastructure and tools that will provide them with timely access to current and future climate information. For example, climate scientists and operational forecasters need to access global and regional model projections and current climate information that they can use to prepare monitoring products and reports and then publish these for the decision makers. Through the UNDP African Adaptation Programme, an infrastructure is being built across Africa that will provide multi-tiered access to such information. Web-accessible servers running RAMADDA, an open source content management system for geoscience information, will provide access to the information at many levels: from the raw and processed climate model output to real-time climate conditions and predictions to documents and presentations for government officials. Output from regional climate models (e.g., RegCM4) and downscaled global climate models will be accessible through RAMADDA. The Integrated Data Viewer (IDV) is being used by scientists to create visualizations that aid the understanding of climate processes and projections, using the data on these as well as external servers. Since RAMADDA is more than a data server, it is also being used as a publishing platform for the generated material, which will be available to and searchable by the decision makers. Users can wade through the enormous volumes of information and extract subsets for their region or project of interest. Participants from 20 countries attended workshops at ICTP during 2011. They received training on setting up and installing the servers and necessary software and are now working on deploying the systems in their respective countries. This is the first time an integrated and comprehensive approach to climate change adaptation has been widely applied in Africa. It is expected that this infrastructure will enhance North-South collaboration and improve the…

  20. Adaptive Traffic Route Control in QoS Provisioning for Cognitive Radio Technology with Heterogeneous Wireless Systems

    NASA Astrophysics Data System (ADS)

    Yamamoto, Toshiaki; Ueda, Tetsuro; Obana, Sadao

    As one of the dynamic spectrum access technologies, “cognitive radio technology,” which aims to improve spectrum efficiency, has been studied. In cognitive radio networks, each node recognizes radio conditions and, according to them, optimizes its wireless communication routes. Cognitive radio systems integrate heterogeneous wireless systems not only by switching between them but also by aggregating and utilizing them simultaneously. The adaptive control of switchover use and concurrent use of various wireless systems will offer stable and flexible wireless communication. In this paper, we propose an adaptive traffic route control scheme that provides high quality of service (QoS) for cognitive radio technology, and examine the performance of the proposed scheme through field trials and computer simulations. The results of field trials show that adaptive route control according to radio conditions improves user IP throughput by more than 20% and reduces one-way delay to less than 1/6 with the concurrent use of IEEE802.16 and IEEE802.11 wireless media. Moreover, simulation results assuming hundreds of mobile terminals reveal that, with the proposed algorithm, the number of users receiving the required QoS for voice over IP (VoIP) service and the total network throughput of FTP users simultaneously more than double. The proposed adaptive traffic route control scheme can enhance the performance of cognitive radio technologies by providing appropriate communication routes for various applications to satisfy their required QoS.
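
    A toy sketch of QoS-driven route selection in the spirit of the scheme described. The greedy first-fit policy, link capacities, and flow requirements are all illustrative assumptions, and the paper's concurrent-use aggregation of media is not modeled:

```python
def pick_route(flows, links):
    """Assign each flow the first link meeting its QoS requirement
    (toy greedy policy; the paper's scheme also aggregates media)."""
    assignment = {}
    for name, need in flows.items():
        for link, cap in links.items():
            if cap["mbps"] >= need["mbps"] and cap["delay_ms"] <= need["delay_ms"]:
                assignment[name] = link
                break
        else:
            assignment[name] = None  # no single link satisfies the flow
    return assignment

# Hypothetical link conditions and per-flow QoS requirements
links = {"802.11": {"mbps": 3, "delay_ms": 30},
         "802.16": {"mbps": 10, "delay_ms": 80}}
flows = {"voip": {"mbps": 0.1, "delay_ms": 50},
         "ftp":  {"mbps": 5.0, "delay_ms": 500}}
routes = pick_route(flows, links)
```

Here the delay-sensitive VoIP flow lands on the low-latency 802.11 link while the bandwidth-hungry FTP flow goes to 802.16, mirroring the per-application routing the abstract describes.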

  1. Communication: Spin-free quantum computational simulations and symmetry adapted states.

    PubMed

    Whitfield, James Daniel

    2013-07-14

    The ideas of digital simulation of quantum systems using a quantum computer parallel the original ideas of numerical simulation using a classical computer. In order for quantum computational simulations to advance to a competitive point, many techniques from classical simulations must be imported into the quantum domain. In this article, we consider the applications of symmetry in the context of quantum simulation. Building upon well established machinery, we propose a form of first quantized simulation that only requires the spatial part of the wave function, thereby allowing spin-free quantum computational simulations. We go further and discuss the preparation of N-body states with specified symmetries based on projection techniques. We consider two simple examples, molecular hydrogen and cyclopropenyl cation, to illustrate the ideas. The methods here are the first to explicitly deal with preparing N-body symmetry-adapted states and open the door for future investigations into group theory, chemistry, and quantum simulation. PMID:23862919
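
    The projection techniques mentioned can be illustrated with the textbook two-particle (anti)symmetrizers P± = (I ± SWAP)/2; this is a generic example of symmetry projection, not the paper's full N-body construction:

```python
import numpy as np

# Two-qubit SWAP operator and the (anti)symmetrizing projectors
# P± = (I ± SWAP)/2 from standard symmetry-projection techniques.
SWAP = np.array([[1, 0, 0, 0],
                 [0, 0, 1, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1]], dtype=float)
I4 = np.eye(4)
P_sym = (I4 + SWAP) / 2
P_anti = (I4 - SWAP) / 2

psi = np.array([0.0, 1.0, 0.0, 0.0])   # |01>, no definite exchange symmetry
sym_part = P_sym @ psi                  # symmetric component
anti_part = P_anti @ psi                # antisymmetric component
```

The projectors are idempotent and orthogonal, so any state decomposes cleanly into its symmetric and antisymmetric parts, which is the essence of preparing symmetry-adapted states.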

  3. Computer technology to evaluate body composition, nutrition, and exercise.

    PubMed

    Katch, F I; Katch, V L

    1983-09-01

    The use of computer technology has made it possible to make accurate determinations of body composition, nutrition, and exercise. With the FITCOMP computer assessment system, detailed measurements of physique status have been made on a variety of world-class athletes, including professional football and baseball players, as well as on diverse groups of young and older men and women throughout the United States. The FITCOMP measurement system allows the user a choice of measurement techniques: fatfolds, girths, bone diameters, and hydrostatic weighing. Combined with body composition assessment is a nutrition and exercise plan. The nutrition plan is based on guidelines formulated by the American Dietetic Association. This application of computer technology is unique, because individuals can select the foods they will eat from a list of preferred choices from the basic food groups. Individual menu plans for breakfast, lunch, and dinner are generated to provide an optimal blend of nutrients aimed at achieving ideal body mass and fat percentage. This is coupled with an aerobic exercise program that is selected by the individual from nine different forms, including walking, jogging, running, swimming, cycling, and various sport activities. The caloric output is designed to reduce total body fat through reductions in body weight of 1.4 to 2.5 pounds per week, depending on the exercise selected and total weight loss necessary to achieve a weight goal (and ideal fat percentage). The aerobic exercise plan is based on the method of overload, where intensity and duration are periodically increased dependent on individual capabilities. The use of fitness-oriented computer technology makes it possible to prepare detailed reports about current status and progress as well as to systematize record keeping.
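
    The quoted 1.4 to 2.5 pounds-per-week target translates into an energy deficit via the rough 3500 kcal-per-pound convention; this is a back-of-the-envelope approximation, not FITCOMP's actual algorithm:

```python
def weekly_deficit_kcal(pounds_per_week, kcal_per_pound=3500.0):
    """Energy deficit implied by a weekly weight-loss target, using the
    rough 3500 kcal-per-pound convention (an approximation only)."""
    return pounds_per_week * kcal_per_pound

low = weekly_deficit_kcal(1.4)    # lower end of the quoted range
high = weekly_deficit_kcal(2.5)   # upper end of the quoted range
daily_low = low / 7.0             # same deficit spread over a week
```

At the low end this is about 700 kcal/day, which is the scale of combined diet and aerobic-exercise adjustments such a plan has to generate.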

  4. Adaptive Remodeling of Achilles Tendon: A Multi-scale Computational Model

    PubMed Central

    Rubenson, Jonas; Umberger, Brian

    2016-01-01

    While it is known that musculotendon units adapt to their load environments, there is only a limited understanding of tendon adaptation in vivo. Here we develop a computational model of tendon remodeling based on the premise that mechanical damage and tenocyte-mediated tendon damage and repair processes modify the distribution of its collagen fiber lengths. We explain how these processes enable the tendon to geometrically adapt to its load conditions. Based on known biological processes, mechanical and strain-dependent proteolytic fiber damage are incorporated into our tendon model. Using a stochastic model of fiber repair, it is assumed that mechanically damaged fibers are repaired longer, whereas proteolytically damaged fibers are repaired shorter, relative to their pre-damage length. To study adaptation of tendon properties to applied load, our model musculotendon unit is a simplified three-component Hill-type model of the human Achilles-soleus unit. Our model results demonstrate that the geometric equilibrium state of the Achilles tendon can coincide with minimization of the total metabolic cost of muscle activation. The proposed tendon model independently predicts rates of collagen fiber turnover that are in general agreement with in vivo experimental measurements. While the computational model here only represents a first step in a new approach to understanding the complex process of tendon remodeling in vivo, given these findings, it appears likely that the proposed framework may itself provide a useful theoretical foundation for developing valuable qualitative and quantitative insights into tendon physiology and pathology. PMID:27684554
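
    The damage-and-repair premise can be caricatured as a stochastic update of a fiber-length list: over-strained fibers are repaired longer, slack fibers shorter. All thresholds, step sizes, and probabilities below are hypothetical, not the paper's fitted parameters:

```python
import random

def remodel_step(lengths, tendon_len, hi=0.05, lo=0.01, delta=0.02,
                 p=0.5, rng=None):
    """One step of a toy stochastic fiber-remodeling model. A fiber's
    strain is (tendon_len - L)/L; over-strained fibers are repaired
    longer (mechanical damage), slack fibers shorter (proteolytic
    damage), echoing the paper's premise. Parameters are hypothetical."""
    rng = rng or random.Random(42)
    out = []
    for L in lengths:
        strain = (tendon_len - L) / L
        if strain > hi and rng.random() < p:
            L += delta   # mechanically damaged -> repaired longer
        elif strain < lo and rng.random() < p:
            L -= delta   # proteolytically damaged -> repaired shorter
        out.append(L)
    return out

lengths = [0.90, 0.95, 1.00, 1.05]          # normalized fiber lengths
updated = remodel_step(lengths, tendon_len=1.0)
```

Iterating such a step drives the fiber-length distribution toward an equilibrium set by the load, which is the geometric adaptation the model studies.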

  5. Quality Assurance Challenges for Motion-Adaptive Radiation Therapy: Gating, Breath Holding, and Four-Dimensional Computed Tomography

    SciTech Connect

    Jiang, Steve B.; Wolfgang, John; Mageras, Gig S.

    2008-05-01

    Compared with conventional three-dimensional (3D) conformal radiation therapy and intensity-modulated radiation therapy treatments, quality assurance (QA) for motion-adaptive radiation therapy involves various challenges because of the added temporal dimension. Here we discuss those challenges for three specific techniques related to motion-adaptive therapy: namely respiratory gating, breath holding, and four-dimensional computed tomography. Similar to the introduction of any other new technologies in clinical practice, typical QA measures should be taken for these techniques also, including initial testing of equipment and clinical procedures, as well as frequent QA examinations during the early stage of implementation. Here, rather than covering every QA aspect in depth, we focus on some major QA challenges. The biggest QA challenge for gating and breath holding is how to ensure treatment accuracy when internal target position is predicted using external surrogates. Recommended QA measures for each component of treatment, including simulation, planning, patient positioning, and treatment delivery and verification, are discussed. For four-dimensional computed tomography, some major QA challenges have also been discussed.

  6. Computer program for distance learning of pesticide application technology.

    PubMed

    Maia, Bruno; Cunha, Joao P A R

    2011-12-01

    Distance learning presents great potential for mitigating field problems in pesticide application technology. Thus, given the lack of teaching material about pesticide spraying technology in the Portuguese language and the increasing availability of distance learning, this study developed and evaluated a computer program for distance learning about the theory of pesticide spraying technology using the tools of information technology. The modules comprising the course, named Pulverizar, were: (1) Basic concepts, (2) Factors that affect application, (3) Equipment, (4) Spraying nozzles, (5) Sprayer calibration, (6) Aerial application, (7) Chemigation, (8) Physical-chemical properties, (9) Formulations, (10) Adjuvants, (11) Water quality, and (12) Adequate use of pesticides. The program was made available to the public on July 1st, 2008, hosted at the web site www.pulverizar.iciag.ufu.br, and proved simple, robust, and practical as a complement to traditional teaching in the education of professionals in Agricultural Sciences. Mastering pesticide spraying technology by people involved in agricultural production can be facilitated by the program Pulverizar, which was well accepted in its initial evaluation.

  7. Solution-Adaptive Program for Computing 2D/Axi Viscous Flow

    NASA Technical Reports Server (NTRS)

    Wood, William A.

    2003-01-01

    A computer program solves the Navier-Stokes equations governing the flow of a viscous, compressible fluid in an axisymmetric or two-dimensional (2D) setting. To obtain solutions more accurate than those generated by prior such programs that utilize regular and/or fixed computational meshes, this program utilizes unstructured (that is, irregular triangular) computational meshes that are automatically adapted to solutions. The adaptation can refine regions of high gradient change or can be driven by a novel residual-minimization technique. Starting from an initial mesh and a corresponding data structure, the adaptation of the mesh is controlled by use of a minimization functional. Other improvements over prior such programs include the following: (1) Boundary conditions are imposed weakly; that is, following initial specification of solution values at boundary nodes, these values are relaxed in time by means of the same formulations as those used for interior nodes. (2) Eigenvalues are limited in order to suppress expansion shocks. (3) An upwind fluctuation-splitting distribution scheme applied to inviscid flux requires fewer operations and produces less artificial dissipation than does a finite-volume scheme, leading to greater accuracy of solutions.
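
    Gradient-driven refinement of the kind described can be caricatured in one dimension: split any interval across which the solution jumps by more than a tolerance. This is a sketch of the general idea only, not the program's actual triangular-mesh scheme:

```python
import math

def refine(xs, f, tol=0.5):
    """Insert a midpoint node in every interval where the solution
    jump |f(x1) - f(x0)| exceeds tol (1-D caricature of
    gradient-driven mesh adaptation)."""
    out = [xs[0]]
    for x0, x1 in zip(xs, xs[1:]):
        if abs(f(x1) - f(x0)) > tol:
            out.append(0.5 * (x0 + x1))  # refine the steep interval
        out.append(x1)
    return out

# A steep front near x = 1 concentrates new nodes there
mesh = [0.0, 0.5, 1.0, 1.5, 2.0]
refined = refine(mesh, lambda x: math.tanh(8 * (x - 1.0)))
```

Only the two intervals straddling the front are split, so resolution follows the solution feature rather than being uniform.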

  8. Computer technology applications in industrial and organizational psychology.

    PubMed

    Crespin, Timothy R; Austin, James T

    2002-08-01

    This article reviews computer applications developed and utilized by industrial-organizational (I-O) psychologists, both in practice and in research. A primary emphasis is on applications developed for Internet usage, because this "network of networks" changes the way I-O psychologists work. The review focuses on traditional and emerging topics in I-O psychology. The first topic involves information technology applications in measurement, defined broadly across levels of analysis (persons, groups, organizations) and domains (abilities, personality, attitudes). Discussion then focuses on individual learning at work, both in formal training and in coping with continual automation of work. A section on job analysis follows, illustrating the role of computers and the Internet in studying jobs. Shifting focus to the group level of analysis, we briefly review how information technology is being used to understand and support cooperative work. Finally, special emphasis is given to the emerging "third discipline" in I-O psychology research-computational modeling of behavioral events in organizations. Throughout this review, themes of innovation and dissemination underlie a continuum between research and practice. The review concludes by setting a framework for I-O psychology in a computerized and networked world.

  9. Adaptive Radiotherapy Planning on Decreasing Gross Tumor Volumes as Seen on Megavoltage Computed Tomography Images

    SciTech Connect

    Woodford, Curtis; Yartsev, Slav; Dar, A. Rashid; Bauman, Glenn; Van Dyk, Jake

    2007-11-15

    Purpose: To evaluate gross tumor volume (GTV) changes for patients with non-small-cell lung cancer by using daily megavoltage (MV) computed tomography (CT) studies acquired before each treatment fraction on helical tomotherapy and to relate the potential benefit of adaptive image-guided radiotherapy to changes in GTV. Methods and Materials: Seventeen patients were prescribed 30 fractions of radiotherapy on helical tomotherapy for non-small-cell lung cancer at London Regional Cancer Program from Dec 2005 to March 2007. The GTV was contoured on the daily MVCT studies of each patient. Adapted plans were created using merged MVCT-kilovoltage CT image sets to investigate the advantages of replanning for patients with differing GTV regression characteristics. Results: Average GTV change observed over 30 fractions was -38%, ranging from -12 to -87%. No significant correlation was observed between GTV change and patient's physical or tumor features. Patterns of GTV changes in the 17 patients could be divided broadly into three groups with distinctive potential for benefit from adaptive planning. Conclusions: Changes in GTV are difficult to predict quantitatively based on patient or tumor characteristics. If changes occur, there are points in time during the treatment course when it may be appropriate to adapt the plan to improve sparing of normal tissues. If GTV decreases by greater than 30% at any point in the first 20 fractions of treatment, adaptive planning is appropriate to further improve the therapeutic ratio.
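
    The concluding replan criterion reduces to a one-line check. The function name and volumes below are hypothetical; only the 30% threshold and 20-fraction window come from the abstract:

```python
def should_replan(gtv_cc, baseline_cc, fraction, max_fraction=20,
                  threshold=0.30):
    """Flag adaptive replanning when the GTV has shrunk by more than
    30% within the first 20 fractions, per the study's conclusion."""
    shrinkage = (baseline_cc - gtv_cc) / baseline_cc
    return fraction <= max_fraction and shrinkage > threshold
```

For example, a GTV measured at 65 cc against a 100 cc baseline at fraction 12 would trigger replanning, while the same shrinkage first seen at fraction 25 would not.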

  10. Adaptive Nulling: A New Enabling Technology for Interferometric Exoplanet

    NASA Technical Reports Server (NTRS)

    Lay, Oliver P.; Jeganathan, Muthu; Peters, Robert

    2003-01-01

    Deep, stable nulling of starlight requires careful control of the amplitudes and phases of the beams that are being combined. The detection of earth-like planets using the interferometer architectures currently being considered for the Terrestrial Planet Finder mission requires that the E-field amplitudes be balanced at the level of approx. 0.1%, and that the phases be controlled at the level of 1 mrad (corresponding to approx. 1.5 nm at a wavelength of 10 microns). These conditions must be met simultaneously at all wavelengths across the science band, and for both polarization states, imposing unrealistic tolerances on the symmetry between the optical beamtrains. We introduce the concept of a compensator that is inserted into the beamtrain and can adaptively correct for the mismatches across the spectrum, enabling deep nulls with realistic, imperfect optics. The design presented uses a deformable mirror to adjust the amplitude and phase of each beam as an arbitrary function of wavelength and polarization. A proof-of-concept experiment will be conducted at visible/near-IR wavelengths, followed by a system operating in the mid-IR band.
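
    The quoted tolerances can be related to achievable null depth via the standard small-error leakage approximation for a two-beam nuller, N ≈ (ΔA/A)²/4 + (Δφ)²/4; the formula is a textbook result, not derived in this abstract:

```python
def null_depth(amp_mismatch, phase_err_rad):
    """Stellar leakage of a two-beam nuller for small errors:
    N ≈ (ΔA/A)²/4 + (Δφ)²/4 (standard small-error approximation)."""
    return amp_mismatch ** 2 / 4 + phase_err_rad ** 2 / 4

# The tolerances quoted above: 0.1% amplitude balance, 1 mrad phase
n = null_depth(1e-3, 1e-3)
```

Both error terms contribute 2.5e-7, giving a null depth of about 5e-7, consistent with the ~1e-6 starlight suppression goals usually cited for such missions.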

  11. Electrical hand tools and techniques: A compilation. [utilization of space technology for tools and adapters

    NASA Technical Reports Server (NTRS)

    1974-01-01

    Space technology utilization for developing tools, adapters, and fixtures and procedures for assembling, installing, and servicing electrical components and equipment are discussed. Some of the items considered are: (1) pivotal screwdriver, (2) termination locator tool for shielded cables, (3) solder application tools, (4) insulation and shield removing tool, and (5) torque wrench adapter for cable connector engaging ring. Diagrams of the various tools and devices are provided.

  12. Foreign and Domestic Accomplishments in Magnetic Bubble Device Technology. Computer Science & Technology.

    ERIC Educational Resources Information Center

    Warnar, Robert B. J.; Calomeris, Peter J.

    This document assesses the status of magnetic bubble technology as displayed by non-U.S. research and manufacturing facilities. Non-U.S. research and U.S. accomplishments are described while both technical and economic factors are addressed. Magnetic bubble devices are discussed whenever their application could impact future computer system…

  13. Computational fluid dynamics for propulsion technology: Geometric grid visualization in CFD-based propulsion technology research

    NASA Technical Reports Server (NTRS)

    Ziebarth, John P.; Meyer, Doug

    1992-01-01

    The coordination of necessary resources, facilities, and special personnel to provide technical integration activities in the area of computational fluid dynamics applied to propulsion technology is examined, including the coordination of CFD activities among government, industry, and universities. Current geometry modeling, grid generation, and graphical methods are established for use in the analysis of CFD design methodologies.

  14. An Adaptive Paradigm for Computer-Aided Detection of Colonic Polyps

    PubMed Central

    Wang, Huafeng; Liang, Zhengrong; Li, Lihong C.; Han, Hao; Song, Bowen; Pickhardt, Perry J.; Barish, Matthew A.; Lascarides, Chris E.

    2015-01-01

    Most previous efforts in developing computer-aided detection (CADe) of colonic polyps apply similar measures or parameters to detect polyps regardless of their locations, under an implicit assumption that all the polyps reside in a similar local environment, e.g., on a relatively flat colon wall. In reality, this implicit assumption is frequently invalid, because the haustral folds can have a very different local environment from that of the relatively flat colon wall. We conjecture that this assumption may be a major cause of missed detection of polyps, especially small polyps (<10 mm linear size) located on the haustral folds. In this paper, we take the concept of adaptiveness and present an adaptive paradigm for CADe of colonic polyps. Firstly, we decompose the complicated colon structure into two simplified sub-structures, each of which has similar properties: (1) the relatively flat colon wall and (2) the ridge-shaped haustral folds. Then we develop local environment descriptions that adaptively reflect each of these two simplified sub-structures. To show the impact of the adaptiveness of the local environment descriptions upon the polyp detection task, we focus on local geometrical measures of the volume data for both the detection of initial polyp candidates (IPCs) and the reduction of false positives (FPs) in the IPC pool. The experimental outcome using the local geometrical measures is very impressive: not only are the previously-missed small polyps on the folds detected, but the small polyps on the folds previously mis-removed during FP reduction are also retained. It is expected that this adaptive paradigm will have a great impact on detecting small polyps, measuring their volumes and volume changes over time, and optimizing their management plan. PMID:26348125
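
    Local geometric measures commonly used in polyp CADe include the Koenderink shape index computed from principal curvatures; the sketch below shows how cap-like polyps and ridge-like folds separate on that scale, as a generic illustration rather than the paper's exact descriptor:

```python
import math

def shape_index(k1, k2):
    """Koenderink shape index from principal curvatures (k1 >= k2):
    caps -> near +1, ridges -> +0.5, saddles -> 0. A generic measure
    often used in polyp CADe, not necessarily this paper's descriptor."""
    if k1 == k2:
        return 1.0 if k1 > 0 else -1.0 if k1 < 0 else 0.0
    return (2.0 / math.pi) * math.atan((k1 + k2) / (k1 - k2))

cap = shape_index(1.0, 0.9)     # polyp-like cap: both curvatures positive
ridge = shape_index(1.0, 0.0)   # fold-like ridge: one flat direction
```

A detector keyed only to cap-like values misses polyps sitting on ridges, which is exactly the local-environment problem the adaptive paradigm addresses.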

  15. The technological influence on health professionals' care: translation and adaptation of scales

    PubMed Central

    Almeida, Carlos Manuel Torres; Almeida, Filipe Nuno Alves dos Santos; Escola, Joaquim José Jacinto; Rodrigues, Vitor Manuel Costa Pereira

    2016-01-01

    Objectives: in this study, two research tools were validated to study the impact of technological influence on health professionals' care practice. Methods: the following methodological steps were taken: bibliographic review, selection of the scales, translation and cultural adaptation and analysis of psychometric properties. Results: the psychometric properties of the scale were assessed based on its application to a sample of 341 individuals (nurses, physicians, final-year nursing and medical students). The validity, reliability and internal consistency were tested. Two scales were found: Caring Attributes Questionnaire (adapted) with a Cronbach's Alpha coefficient of 0.647 and the Technological Influence Questionnaire (adapted) with an Alpha coefficient of 0.777. Conclusions: the scales are easy to apply and reveal reliable psychometric properties, an additional quality as they permit generalized studies on a theme as important as the impact of technological influence in health care. PMID:27143537

  16. Technologies for Large Data Management in Scientific Computing

    NASA Astrophysics Data System (ADS)

    Pace, Alberto

    2014-01-01

    In recent years, intensive use of computing has been the main strategy of investigation in several scientific research projects. Progress in computing technology has opened unprecedented opportunities for the systematic collection of experimental data and the associated analysis that were considered impossible only a few years ago. This paper focuses on the strategies in use: it reviews the various components that are necessary for an effective solution that ensures the storage, the long-term preservation, and the worldwide distribution of the large quantities of data that are necessary in a large scientific research project. The paper also mentions several examples of data management solutions used in High Energy Physics for the CERN Large Hadron Collider (LHC) experiments in Geneva, Switzerland, which generate more than 30,000 terabytes of data every year that need to be preserved, analyzed, and made available to a community of several tens of thousands of scientists worldwide.

  17. Application Specific Performance Technology for Productive Parallel Computing

    SciTech Connect

    Malony, Allen D.; Shende, Sameer

    2008-09-30

    Our accomplishments over the last three years of the DOE project Application-Specific Performance Technology for Productive Parallel Computing (DOE Agreement: DE-FG02-05ER25680) are described below. The project will have met all of its objectives by the time of its completion at the end of September 2008. Two extensive yearly progress reports were produced in March 2006 and 2007 and were previously submitted to the DOE Office of Advanced Scientific Computing Research (OASCR). Following an overview of the objectives of the project, we summarize for each of the project areas the achievements in the first two years, and then describe in more detail the project accomplishments of this past year. At the end, we discuss the relationship of the proposed renewal application to the work done on the current project.

  18. Global assessment of technological innovation for climate change adaptation and mitigation in developing world.

    PubMed

    Adenle, Ademola A; Azadi, Hossein; Arbiol, Joseph

    2015-09-15

    Concerns about mitigating and adapting to climate change have renewed the incentive for agricultural research investments and for developing further innovation priorities around the world, particularly in developing countries. In the near future, the development of new agricultural measures and the proper diffusion of technologies will greatly influence farmers' ability to adapt to and mitigate climate change. Using bibliometric approaches based on academic journal publications and patent data, we assess the impact of research and development (R&D) for new and existing technologies within the context of climate change mitigation and adaptation. We show that many developing countries invest limited resources in R&D for relevant technologies that have great potential for mitigation and adaptation in agricultural production. We also discuss constraints, including weak infrastructure, limited research capacity, and lack of credit facilities and technology transfer, that may hinder the application of innovation in tackling the challenges of climate change. A range of policy measures is also suggested to overcome the identified constraints and to ensure that the potential of innovation for climate change mitigation and adaptation is realized. PMID:26189184

  20. Reconfigurable high-speed optoelectronic interconnect technology for multiprocessor computers

    NASA Astrophysics Data System (ADS)

    Cheng, Julian

    1995-06-01

    We describe a compact optoelectronic switching technology for interconnecting multiple computer processors and shared memory modules through dynamically reconfigurable optical paths, providing simultaneous, high-speed communication among different nodes. Each switch provides an optical link to other nodes as well as electrical access to an individual processor, and it can perform optical and optoelectronic switching to convert digital data between various electrical and optical input/output formats. This multifunctional switching technology is based on the monolithic integration of arrays of vertical-cavity surface-emitting lasers with photodetectors and heterojunction bipolar transistors. The various digital switching and routing functions, as well as optically cascaded multistage operation, have been experimentally demonstrated.

  1. Emerging computer technologies and the news media of the future

    NASA Technical Reports Server (NTRS)

    Vrabel, Debra A.

    1993-01-01

    The media environment of the future may be dramatically different from what exists today. As new computing and communications technologies evolve and synthesize to form a global, integrated communications system of networks, public domain hardware and software, and consumer products, it will be possible for citizens to fulfill most information needs at any time and from any place, to obtain desired information easily and quickly, to obtain information in a variety of forms, and to experience and interact with information in a variety of ways. This system will transform almost every institution, every profession, and every aspect of human life, including the creation, packaging, and distribution of news and information by media organizations. This paper presents one vision of a 21st-century global information system and how it might be used by citizens. It surveys some of the technologies now on the market that are paving the way for the new media environment.

  2. Processing instrumentation technology: Process definition with a cognitive computer

    SciTech Connect

    Price, H.L.

    1996-11-01

    Much of the polymer composites industry is built around the thermochemical conversion of raw material into useful composites. The raw materials (molding compound, prepreg) often are made up of thermosetting resins and small fibers or particles. While this conversion can follow a large number of paths, only a few paths are efficient, economical and lead to desirable composite properties. Processing instrument (P/I) technology enables a computer to sense and interpret changes taking place during the cure of prepreg or molding compound. P/I technology has been used to make estimates of gel time and cure time, thermal diffusivity measurements and transition temperature measurements. Control and sensing software is comparatively straightforward. The interpretation of results with appropriate software is under development.

  3. Japanese technology assessment: Computer science, opto- and microelectronics mechatronics, biotechnology

    SciTech Connect

    Brandin, D.; Wieder, H.; Spicer, W.; Nevins, J.; Oxender, D.

    1986-01-01

    The series studies Japanese research and development in four high-technology areas - computer science, opto and microelectronics, mechatronics (a term created by the Japanese to describe the union of mechanical and electronic engineering to produce the next generation of machines, robots, and the like), and biotechnology. The evaluations were conducted by panels of U.S. scientists - chosen from academia, government, and industry - actively involved in research in areas of expertise. The studies were prepared for the purpose of aiding the U.S. response to Japan's technological challenge. The main focus of the assessments is on the current status and long-term direction and emphasis of Japanese research and development. Other aspects covered include evolution of the state of the art; identification of Japanese researchers, R and D organizations, and resources; and comparative U.S. efforts. The general time frame of the studies corresponds to future industrial applications and potential commercial impacts spanning approximately the next two decades.

  4. Adaptive intermittent control: A computational model explaining motor intermittency observed in human behavior.

    PubMed

    Sakaguchi, Yutaka; Tanaka, Masato; Inoue, Yasuyuki

    2015-07-01

    How our brain performs a given motor task in real time with a slow sensorimotor system is a fundamental question. Computational theory proposed the influential idea of feed-forward control, but it has mainly treated ballistic movements (such as reaching), because the motor commands must be calculated in advance of movement execution. As a possible mechanism for operating feed-forward control in continuous motor tasks (such as target tracking), we propose a control model called "adaptive intermittent control" or "segmented control," in which the brain adaptively divides the continuous time axis into discrete segments and executes feed-forward control in each segment. The idea of intermittent control has been proposed in control theory, biological modeling and nonlinear dynamical systems. Compared with these previous models, the key feature of the proposed model is that the system speculatively determines the segmentation based on future prediction and its uncertainty. Computer simulations showed that the proposed model realized faithful visuo-manual tracking with realistic sensorimotor delays and at lower computational cost (i.e., with fewer segments). Furthermore, it replicated "motor intermittency," that is, the intermittent discontinuities commonly observed in human movement trajectories. We argue that temporally segmented control is an inevitable strategy for a brain that must achieve a given task at small computational (or cognitive) cost, using a slow control system in an uncertain, variable environment, and that motor intermittency is a side effect of this strategy.
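
    The segmentation rule described above can be caricatured in a few lines. The sketch below is a toy illustration under strong assumptions: a 1-D target, linear extrapolation standing in for the model's prediction, and a fixed error threshold standing in for predictive uncertainty; it is not the authors' model.

```python
import numpy as np

def intermittent_track(target, horizon=20, err_thresh=0.5):
    """Toy intermittent (segmented) feed-forward tracker: a new open-loop
    segment is planned only when tracking error exceeds a threshold;
    within a segment the command runs without feedback."""
    pos, vel = 0.0, 0.0
    planned = False
    segments = 0
    prev_tgt = target[0]
    positions = []
    for tgt in target:
        err = abs(tgt - pos)
        if not planned or err > err_thresh:
            # re-plan: linearly extrapolate the target over the horizon
            segments += 1
            predicted = tgt + (tgt - prev_tgt) * horizon
            vel = (predicted - pos) / horizon
            planned = True
        pos += vel          # execute the feed-forward command
        positions.append(pos)
        prev_tgt = tgt
    return np.array(positions), segments
```

    On a smoothly drifting target this scheme re-plans only a handful of times, yet the trajectory shows the small step-like discontinuities reminiscent of motor intermittency.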

  5. Computation Directorate and Science & Technology Review Computational Science and Research Featured in 2002

    SciTech Connect

    Alchorn, A L

    2003-04-04

    Thank you for your interest in the activities of the Lawrence Livermore National Laboratory Computation Directorate. This collection of articles from the Laboratory's Science & Technology Review highlights the most significant computational projects, achievements, and contributions during 2002. In 2002, LLNL marked the 50th anniversary of its founding. Scientific advancement in support of our national security mission has always been the core of the Laboratory. So that researchers could better understand and predict complex physical phenomena, the Laboratory has pushed the limits of the largest, fastest, most powerful computers in the world. In the late 1950s, Edward Teller, one of the LLNL founders, proposed that the Laboratory commission a Livermore Advanced Research Computer (LARC) built to Livermore's specifications. He tells the story of being in Washington, DC, when John Von Neumann asked to talk about the LARC. Von Neumann thought Teller wanted too much memory in the machine. (The specifications called for 20-30,000 words.) Teller was too smart to argue with him. Later Teller invited Von Neumann to the Laboratory and showed him one of the design codes being prepared for the LARC. He asked Von Neumann for suggestions on fitting the code into 10,000 words of memory, and flattered him about ''Labbies'' not being smart enough to figure it out. Von Neumann dropped his objections, and the LARC arrived with 30,000 words of memory. Memory, and how close memory is to the processor, is still of interest to us today. Livermore's first supercomputer was the Remington-Rand Univac-1. It had 5600 vacuum tubes and was 2 meters wide by 4 meters long. This machine was commonly referred to as a 1 KFlop machine [E+3]. Skip ahead 50 years. The ASCI White machine at the Laboratory today, produced by IBM, is rated at a peak performance of 12.3 TFlops, or E+13. We've improved computer processing power by 10 orders of magnitude in 50 years, and I do not believe there's any reason to think we won't continue.

  6. Energy-saving technology of vector controlled induction motor based on the adaptive neuro-controller

    NASA Astrophysics Data System (ADS)

    Engel, E.; Kovalev, I. V.; Karandeev, D.

    2015-10-01

    The ongoing evolution of the power system towards a Smart Grid implies an important role for intelligent technologies, but poses strict requirements on their control schemes to preserve stability and controllability. This paper presents an adaptive neuro-controller for the vector control of an induction motor within the Smart Grid. The validity and effectiveness of the proposed energy-saving technology for a vector-controlled induction motor based on the adaptive neuro-controller are verified by simulation results at different operating conditions over a wide speed range of the induction motor.
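
    As a rough illustration of the general idea of an adaptive neuro-controller (not the authors' vector-control scheme), a single adaptive neuron whose weights are tuned online by a gradient-style rule can regulate a toy first-order plant; all gains and the plant model below are assumptions for the sketch.

```python
import numpy as np

def adaptive_neuron_control(ref=1.0, a=0.9, b=0.1, lr=0.02, steps=500):
    """Single-neuron adaptive controller for the toy plant
    y[k+1] = a*y[k] + b*u[k]. The neuron's inputs are (reference, error);
    its weights adapt online to drive the tracking error toward zero."""
    w = np.zeros(2)                 # adaptive weights (like self-tuned gains)
    y = 0.0
    errs = []
    for _ in range(steps):
        e = ref - y
        x = np.array([ref, e])      # neuron inputs
        u = w @ x                   # control command
        y = a * y + b * u           # plant response
        e_next = ref - y
        w += lr * e_next * x        # online weight adaptation (MIT-rule style)
        errs.append(abs(e_next))
    return y, errs
```

    The appeal for drives in a Smart Grid setting is that nothing plant-specific is hard-coded: the same update rule re-tunes the weights as operating conditions shift.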

  7. Primer on computers and information technology. Part two: an introduction to computer networking.

    PubMed

    Channin, D S; Chang, P J

    1997-01-01

    Computer networks are a way of connecting computers so that they can exchange information. For this exchange to succeed, system behavior must be planned and specified very clearly at a number of different levels. Although there are many choices to be made at each level, often simple decisions can rapidly reduce the number of options. Planning is most important at the highest (application) and lowest (wiring) levels, whereas the middle levels must be specified to ensure compatibility. Because of the widespread use of the Internet, solutions based on Internet technologies are often cost-effective and should be considered when designing a network. As in all technical fields, consultation with experts (i.e., computer networking specialists) may be worthwhile. PMID:9225395
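
    As a concrete instance of the layering the primer describes, the sketch below uses Python's standard socket API: the application level decides what to exchange (here a trivial echo), while TCP handles the middle-level details of reliable delivery. The echo protocol and loopback address are illustrative choices.

```python
import socket
import threading

def echo_server(host="127.0.0.1"):
    """Start a one-shot TCP echo server on an OS-assigned port; return the port."""
    srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    srv.bind((host, 0))              # port 0: let the OS pick a free port
    srv.listen(1)
    port = srv.getsockname()[1]

    def serve():
        conn, _ = srv.accept()       # application layer: echo whatever arrives
        with conn:
            conn.sendall(conn.recv(1024))
        srv.close()

    threading.Thread(target=serve, daemon=True).start()
    return port

def echo_client(port, message):
    """Connect over TCP (the transport layer handles reliability) and echo."""
    with socket.create_connection(("127.0.0.1", port)) as s:
        s.sendall(message)
        return s.recv(1024)
```

    Only the application behavior (echoing) needed designing here; everything below it came from standard Internet technologies, which is the primer's cost-effectiveness point.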

  9. Adaptive information interchange system of the fiber-optic measuring networks with the computer

    NASA Astrophysics Data System (ADS)

    Denisov, Igor V.; Drozdov, Roman S.; Sedov, Victor A.

    2005-06-01

    In the present paper, the characteristics and application possibilities of a system for parallel input-output of information from a fiber-optic measuring network into a computer are considered. The system consists of two parts: a mainframe and several expansion blocks. The first part is internal and plugs directly into a socket on the motherboard of the personal computer. It is designed for buffering system signals, generating the commands that control the input-output of signals into the personal computer, and generating signals for the expansion blocks. The second part is external and connects to the mainframe by means of cables. It is designed to transform information from the fiber-optic measuring network into mainframe signals and to adapt the instrument settings. An analysis of the speed at which the system processes analog and digital data is presented. Possible schemes for using the system to process quasistationary and dynamic fields are considered.

  10. How Computer Technology Expands Educational Options: A Rationale, Recommendations, and a Pamphlet for Administrators.

    ERIC Educational Resources Information Center

    Kelch, Panette Evers; Karr-Kidwell, PJ

    The purpose of this paper is to provide a historical rationale on how computer technology, particularly the Internet, expands educational options for administrators and teachers. A review of the literature includes a brief history of computer technology and its growing use, and a discussion of computer technology for distance learning, for…

  11. Advances in computer-aided design and computer-aided manufacture technology.

    PubMed

    Calamia, J R

    1996-01-01

    Although the development of computer-aided design (CAD) and computer-aided manufacture (CAM) technology and the benefits of increased productivity became obvious in the automobile and aerospace industries in the 1970s, investigations of this technology's application in the field of dentistry did not begin until the 1980s. Only now are we beginning to see the fruits of this work with the commercial availability of some systems; the potential for this technology seems boundless. This article reviews the recent literature with emphasis on the period from June 1992 to May 1993. This review should familiarize the reader with some of the latest developments in this technology, including a brief description of some systems currently available and the clinical and economic rationale for their acceptance into the dental mainstream. This article concentrates on a particular system, the Cerec (Siemens/Pelton and Crane, Charlotte, NC) system, for three reasons. First, this system has been available since 1985 and, as a result, has a track record of almost 7 years of data; most of the data have just recently been released, and consequently much of this year's literature on CAD-CAM is monopolized by studies using this system. Second, this system was developed as a mobile, affordable, direct chairside CAD-CAM restorative method; as such, it is of special interest to the patient, providing a one-visit restoration. Third, the author is currently engaged in research using this particular system and has a working knowledge of its capabilities.

  12. Technology--The Equalizer.

    ERIC Educational Resources Information Center

    Sloane, Eydie

    1989-01-01

    This article describes a number of computer-based learning tools for disabled students. Adaptive input devices, assisted technologies, software, and hardware and software resources are discussed. (IAH)

  13. Adaptive allocation of decisionmaking responsibility between human and computer in multitask situations

    NASA Technical Reports Server (NTRS)

    Chu, Y.-Y.; Rouse, W. B.

    1979-01-01

    As human and computer come to have overlapping decisionmaking abilities, a dynamic or adaptive allocation of responsibilities may be the best mode of human-computer interaction. It is suggested that the computer serve as a backup decisionmaker, accepting responsibility when human workload becomes excessive and relinquishing it when workload becomes acceptable. A queueing-theory formulation of multitask decisionmaking is used, and a threshold policy for turning the computer on and off is proposed. This policy minimizes event-waiting cost subject to human workload constraints. An experiment was conducted with a balanced design of several subject runs in a computer-aided multitask flight management situation at different task demand levels. It was found that computer aiding enhanced subsystem performance as well as subjective ratings. The queueing model appears to be an adequate representation of the multitask decisionmaking situation and capable of predicting system performance in terms of average waiting time and server occupancy. Server occupancy was further found to correlate highly with the subjective effort ratings.
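
    The threshold policy can be sketched as a small discrete-time queueing simulation: the computer backup switches on when queue length crosses an upper threshold and off at a lower one (hysteresis). All arrival and service rates and thresholds below are illustrative, not taken from the experiment.

```python
import random

def simulate_threshold_aiding(lam=0.8, mu_h=0.6, mu_c=0.6,
                              on_thresh=4, off_thresh=1,
                              steps=20000, seed=0):
    """Discrete-time sketch of the on/off threshold policy: tasks arrive
    with probability lam per step; the human completes a waiting task with
    probability mu_h; the computer backup, when on, serves in parallel with
    probability mu_c. Returns (mean queue length, computer occupancy)."""
    rng = random.Random(seed)
    queue, computer_on = 0, False
    busy_steps, total_q = 0, 0
    for _ in range(steps):
        if rng.random() < lam:
            queue += 1
        if queue and rng.random() < mu_h:      # human service
            queue -= 1
        if computer_on:
            busy_steps += 1
            if queue and rng.random() < mu_c:  # computer backup service
                queue -= 1
        # hysteresis: aid under heavy load, relinquish when load is light
        if queue > on_thresh:
            computer_on = True
        elif queue <= off_thresh:
            computer_on = False
        total_q += queue
    return total_q / steps, busy_steps / steps
```

    With these rates the human alone is overloaded (0.8 arrivals vs. 0.6 service), so the backup cycles on and off; mean queue length stands in for event-waiting cost and the fraction of "on" steps for server occupancy.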

  14. Counseling Student Computer Competency Skills: Effects of Technology Course in Training.

    ERIC Educational Resources Information Center

    Edwards, Yolanda V.; Portman, Tarrell Awe Agahe; Bethea, James

    2002-01-01

    The focus of this article is to assess counseling student computer competency level as an effect of a one-credit hour introductory course in computer technology. Results indicate student computer competencies increased after completing the computer technology course in the following areas: ethics, assisting clients with internet searches,…

  15. Accelerating Technology Development through Integrated Computation and Experimentation

    SciTech Connect

    Shekhawat, Dushyant; Srivastava, Rameshwar D.; Ciferno, Jared; Litynski, John; Morreale, Bryan D.

    2013-08-15

    This special section of Energy & Fuels comprises a selection of papers presented at the topical conference “Accelerating Technology Development through Integrated Computation and Experimentation”, sponsored and organized by the United States Department of Energy’s National Energy Technology Laboratory (NETL) as part of the 2012 American Institute of Chemical Engineers (AIChE) Annual Meeting held in Pittsburgh, PA, Oct 28-Nov 2, 2012. That topical conference focused on the latest research and development efforts in five main areas related to fossil energy, with each area focusing on the utilization of both experimental and computational approaches: (1) gas separations (membranes, sorbents, and solvents for CO{sub 2}, H{sub 2}, and O{sub 2} production), (2) CO{sub 2} utilization (enhanced oil recovery, chemical production, mineralization, etc.), (3) carbon sequestration (flow in natural systems), (4) advanced power cycles (oxy-combustion, chemical looping, gasification, etc.), and (5) fuel processing (H{sub 2} production for fuel cells).

  16. Condition monitoring through advanced sensor and computational technology : final report (January 2002 to May 2005).

    SciTech Connect

    Kim, Jung-Taek; Luk, Vincent K.

    2005-05-01

    The overall goal of this joint research project was to develop and demonstrate advanced sensors and computational technology for continuous monitoring of the condition of components, structures, and systems in advanced and next-generation nuclear power plants (NPPs). This project included investigating and adapting several advanced sensor technologies from Korean and US national laboratory research communities, some of which were developed and applied in non-nuclear industries. The project team investigated and developed sophisticated signal processing, noise reduction, and pattern recognition techniques and algorithms. The researchers installed sensors and conducted condition monitoring tests on two test loops, a check valve (an active component) and a piping elbow (a passive component), to demonstrate the feasibility of using advanced sensors and computational technology to achieve the project goal. Acoustic emission (AE) devices, optical fiber sensors, accelerometers, and ultrasonic transducers (UTs) were used to detect mechanical vibratory response of check valve and piping elbow in normal and degraded configurations. Chemical sensors were also installed to monitor the water chemistry in the piping elbow test loop. Analysis results of processed sensor data indicate that it is feasible to differentiate between the normal and degraded (with selected degradation mechanisms) configurations of these two components from the acquired sensor signals, but it is questionable that these methods can reliably identify the level and type of degradation. Additional research and development efforts are needed to refine the differentiation techniques and to reduce the level of uncertainties.
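
    As a minimal illustration of the kind of signal statistics such condition monitoring relies on (not the project's actual algorithms), RMS and kurtosis of a vibration record already separate a clean signal from one containing periodic impact events:

```python
import numpy as np

def vibration_features(signal):
    """Two simple features for normal-vs-degraded screening:
    RMS captures overall vibration energy; kurtosis captures the
    impulsiveness typical of impacts from a degraded component."""
    x = np.asarray(signal, dtype=float)
    rms = np.sqrt(np.mean(x ** 2))
    centered = x - x.mean()
    kurt = np.mean(centered ** 4) / (np.mean(centered ** 2) ** 2 + 1e-12)
    return rms, kurt

# synthetic example: a degraded component adds periodic impact spikes
rng = np.random.default_rng(0)
normal = rng.normal(0.0, 1.0, 10000)      # baseline broadband vibration
degraded = normal.copy()
degraded[::500] += 12.0                   # hypothetical impact events
```

    Consistent with the report's caveat, such features can flag that a component has degraded, but identifying the level and type of degradation needs richer processing.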

  17. Implementation of Parallel Computing Technology to Vortex Flow

    NASA Technical Reports Server (NTRS)

    Dacles-Mariani, Jennifer

    1999-01-01

    Mainframe supercomputers such as the Cray C90 were invaluable in obtaining large-scale computations using several million grid points to resolve the salient features of a tip vortex flow over a lifting wing. However, real flight configurations require tracking not only the flow over several lifting wings but also its growth and decay in the near- and intermediate-wake regions, not to mention the interaction of these vortices with each other. Resolving and tracking the evolution and interaction of vortices shed from complex bodies is computationally intensive, and parallel computing technology is an attractive option for solving these flows. In planetary science, vortical flows are also important in studying how planets and protoplanets form when cosmic dust and gases become gravitationally unstable. The current paradigm for the formation of planetary systems maintains that the planets accreted from the nebula of gas and dust left over from the formation of the Sun. Traditional theory also indicates that such a preplanetary nebula took the form of a flattened disk. The coagulation of dust led to the settling of aggregates toward the midplane of the disk, where they grew further into asteroid-like planetesimals. Issues still remaining in this process include the onset of gravitational instability, the role of turbulence in the damping of particles, and radial effects. In this study the focus is on the role of turbulence and on radial effects.

  18. Assistive and Adaptive Technology--Supporting Competence and Independence in Young Children with Disabilities.

    ERIC Educational Resources Information Center

    Brett, Arlene

    1997-01-01

    Argues that computers and related technology can be an important asset in the classrooms of young children with disabilities. Suggests that this technology can promote mobility, communication, and learning; increase independence; augment abilities; compensate for learning challenges; overcome learned helplessness; and foster competence and…

  19. Computations of two- and three-dimensional flows using an adaptive mesh

    NASA Astrophysics Data System (ADS)

    Nakahashi, K.

    1985-11-01

    Two- and three-dimensional, steady and unsteady viscous flow fields are numerically simulated by solving the Navier-Stokes equations. A solution-adaptive-grid method is used to redistribute the grid points so as to improve the resolution of shock waves and shear layers without increasing the number of grid points. Flow fields considered include two-dimensional transonic flows about airfoils, two- and three-dimensional supersonic flow past an aerodynamic afterbody with a propulsive jet, supersonic flow over a blunt fin mounted on a wall, and supersonic flow over a bump. The computed results demonstrate a significant improvement in accuracy and quality of the solutions owing to the solution-adaptive mesh.
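
    A 1-D analogue of solution-adaptive gridding is equidistribution of a monitor function: points are redistributed so that each cell carries an equal share of an arc-length-like weight, clustering resolution at shocks and shear layers without increasing the number of grid points. A sketch (the monitor function and parameters are illustrative):

```python
import numpy as np

def redistribute_grid(x, u, alpha=1.0):
    """Redistribute the 1-D grid x so points cluster where u varies
    rapidly, keeping the same number of points. Uses equidistribution
    of the monitor w = sqrt(1 + alpha*|du/dx|^2)."""
    w = np.sqrt(1.0 + alpha * np.gradient(u, x) ** 2)   # monitor function
    cell_w = 0.5 * (w[:-1] + w[1:]) * np.diff(x)        # weight per cell
    cum = np.concatenate([[0.0], np.cumsum(cell_w)])    # cumulative monitor
    # invert: the new points split the cumulative monitor into equal parts
    targets = np.linspace(0.0, cum[-1], len(x))
    return np.interp(targets, cum, x)
```

    Applied to a solution with a sharp internal layer, such as u = tanh(20x) on [-1, 1], the redistributed grid concentrates points near x = 0 while coarsening the smooth regions, which is the effect the abstract reports for shocks and shear layers.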

  20. COMET-AR User's Manual: COmputational MEchanics Testbed with Adaptive Refinement

    NASA Technical Reports Server (NTRS)

    Moas, E. (Editor)

    1997-01-01

    The COMET-AR User's Manual provides a reference manual for the Computational Structural Mechanics Testbed with Adaptive Refinement (COMET-AR), a software system developed jointly by Lockheed Palo Alto Research Laboratory and NASA Langley Research Center under contract NAS1-18444. The COMET-AR system is an extended version of an earlier finite element based structural analysis system called COMET, also developed by Lockheed and NASA. The primary extensions are the adaptive mesh refinement capabilities and a new "object-like" database interface that makes COMET-AR easier to extend further. This User's Manual provides a detailed description of the user interface to COMET-AR from the viewpoint of a structural analyst.

  1. Parallel Adaptive Computation of Blood Flow in a 3D ``Whole'' Body Model

    NASA Astrophysics Data System (ADS)

    Zhou, M.; Figueroa, C. A.; Taylor, C. A.; Sahni, O.; Jansen, K. E.

    2008-11-01

    Accurate numerical simulations of vascular trauma require the consideration of a larger portion of the vasculature than previously considered, due to the systemic nature of the human body's response. A patient-specific 3D model composed of 78 connected arterial branches extending from the neck to the lower legs is constructed to effectively represent the entire body. Recently developed outflow boundary conditions, which represent the downstream vascular beds not included in the 3D computational domain, are applied at the 78 outlets. In this work, pulsatile blood flow simulations are started on a fairly uniform, unstructured mesh that is subsequently adapted using a solution-based approach to efficiently resolve the flow features. The adapted mesh contains non-uniform, anisotropic elements, resulting in resolution that conforms to the physical length scales present in the problem. The effects of mesh resolution on the flow field are studied, specifically on the relevant quantities of pressure, velocity and wall shear stress.

  2. in vivo laser speckle imaging by adaptive contrast computation for microvasculature assessment

    NASA Astrophysics Data System (ADS)

    Basak, Kausik; Dey, Goutam; Mahadevappa, Manjunatha; Mandal, Mahitosh; Dutta, Pranab Kumar

    2014-11-01

    Interference of light backscattered from a diffuse surface leads to speckle formation in laser speckle imaging. These time-integrated speckle patterns can be statistically analyzed to study the flow profile of moving scatterers. Simple speckle contrast analysis techniques have limited ability to distinguish thin structures due to the presence of corrupting speckles. This paper presents a high-resolution imaging technique based on adaptive computation of contrast for laser speckle contrast analysis (adLASCA). Speckle images of the retinal microvasculature in a mouse model are acquired during normal and reduced blood flow conditions. Initially, the speckle images are registered to compensate for movements associated with heartbeat and respiration. Adaptive computation is performed using local image statistics estimated within a spatially moving window over successive time frames. Experimental evidence suggests that adLASCA outperforms other contrast analysis methods, demonstrating a significant improvement in contrast resolution. Fine vessels can be distinguished more efficiently, with reduced fluctuations in contrast level. The quantitative performance of adLASCA is evaluated by computing the standard deviation corresponding to fluctuations due to unwanted speckles; this standard deviation is significantly lower than for other methods. Therefore, adLASCA can be used to enhance microvasculature in high-resolution perfusion imaging with a reduced effect of corrupting speckles for effective assessment.
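
    The quantity underlying all laser speckle contrast analysis methods is the local contrast K = sigma/mu computed over a small window; adLASCA's contribution is to adapt that computation using spatio-temporal statistics across registered frames. A plain spatial version is sketched below (window size and edge padding are illustrative choices, not the paper's method):

```python
import numpy as np
from numpy.lib.stride_tricks import sliding_window_view

def speckle_contrast(img, win=7):
    """Spatial speckle contrast K = sigma/mu over a sliding win x win
    window (plain LASCA; adLASCA additionally adapts the statistics
    over successive registered time frames)."""
    img = np.asarray(img, dtype=float)
    pad = win // 2
    padded = np.pad(img, pad, mode="reflect")
    windows = sliding_window_view(padded, (win, win))
    mu = windows.mean(axis=(-1, -2))
    sigma = windows.std(axis=(-1, -2))
    return np.where(mu > 0, sigma / mu, 0.0)
```

    Regions of fast flow blur the speckle within the exposure time, lowering K, which is why contrast maps reveal perfused vessels.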

  3. Evaluating the Appropriateness of a New Computer-Administered Measure of Adaptive Function for Children and Youth with Autism Spectrum Disorders

    ERIC Educational Resources Information Center

    Coster, Wendy J.; Kramer, Jessica M.; Tian, Feng; Dooley, Meghan; Liljenquist, Kendra; Kao, Ying-Chia; Ni, Pengsheng

    2016-01-01

    The Pediatric Evaluation of Disability Inventory-Computer Adaptive Test is an alternative method for describing the adaptive function of children and youth with disabilities using a computer-administered assessment. This study evaluated the performance of the Pediatric Evaluation of Disability Inventory-Computer Adaptive Test with a national…

  4. Addressing Cultural Context in the Development of Performance-based Assessments and Computer-adaptive Testing: Preliminary Validity Considerations.

    ERIC Educational Resources Information Center

    Boodoo, Gwyneth M.

    1998-01-01

    Discusses the research and steps needed to develop performance-based and computer-adaptive assessments that are culturally responsive. Supports the development of a new conceptual framework and more explicit guidelines for designing culturally responsive assessments. (SLD)

  5. Advances in computer-aided design and computer-aided manufacture technology.

    PubMed

    Calamia, J R

    1994-01-01

    Although the development of computer-aided design (CAD) and computer-aided manufacture (CAM) technology and the benefits of increased productivity became obvious in the automobile and aerospace industries in the 1970s, investigations of this technology's application in the field of dentistry did not begin until the 1980s. Only now are we beginning to see the fruits of this work with the commercial availability of some systems; the potential for this technology seems boundless. This article reviews the recent literature with emphasis on the period from June 1992 to May 1993. This review should familiarize the reader with some of the latest developments in this technology, including a brief description of some systems currently available and the clinical and economic rationale for their acceptance into the dental mainstream. This article concentrates on a particular system, the Cerec (Siemens/Pelton and Crane, Charlotte, NC) system, for three reasons: first, this system has been available since 1985 and, as a result, has a track record of almost 7 years of data. Most of the data have just recently been released and consequently, much of this year's literature on CAD-CAM is monopolized by studies using this system. Second, this system was developed as a mobile, affordable, direct chairside CAD-CAM restorative method. As such, it is of special interest to the dentist who will offer this new technology directly to the patient, providing a one-visit restoration. Third, the author is currently engaged in research using this particular system and has a working knowledge of this system's capabilities.

  6. Turkish Adaptation of Technological Pedagogical Content Knowledge Survey for Elementary Teachers

    ERIC Educational Resources Information Center

    Kaya, Sibel; Dag, Funda

    2013-01-01

    The purpose of this study was to adapt the Technological Pedagogical Content Knowledge (TPACK) Survey developed by Schmidt and colleagues into Turkish and investigate its factor structure through exploratory and confirmatory factor analysis. The participants were 352 elementary pre-service teachers from three large universities in northwestern…

  7. Adaptation technology between IP layer and optical layer in optical Internet

    NASA Astrophysics Data System (ADS)

    Ji, Yuefeng; Li, Hua; Sun, Yongmei

    2001-10-01

    Wavelength division multiplexing (WDM) optical networks provide a platform with high bandwidth capacity and are expected to be the backbone infrastructure supporting next-generation high-speed multi-service networks (ATM, IP, etc.). In the foreseeable future IP will be the predominant data traffic, so to make full use of the bandwidth of the WDM optical network, much attention has been focused on IP over WDM, which has been proposed as the most promising technology for a new kind of network, the so-called Optical Internet. According to the OSI model, IP resides in the 3rd layer (network layer) and the optical network in the 1st layer (physical layer), so the key issue is what adaptation technology should be used in the 2nd layer (data link layer). In this paper we first analyze and compare the adaptation technologies currently used in backbone networks. Second, addressing the drawbacks of these technologies, we present a novel adaptation protocol (DONA) between the IP layer and the optical layer in the Optical Internet and describe it in detail. Third, the gigabit transmission adapter (GTA) we built based on the novel protocol is described. Finally, we set up an experimental platform to apply and verify DONA and the GTA; the results and conclusions of the experiment are given.

  8. Resituation or Resistance? Higher Education Teachers' Adaptations to Technological Change

    ERIC Educational Resources Information Center

    Westberry, Nicola; McNaughton, Susan; Billot, Jennie; Gaeta, Helen

    2015-01-01

    This paper presents the findings from a project that explored teachers' adaptations to technological change in four large classes in higher education. In these classes, lecturers changed from single- to multi-lecture settings mediated by videoconferencing, requiring them to transfer their beliefs and practices into a new pedagogical space.…

  9. Adaptive Technology for the Internet: Making Electronic Resources Accessible to All.

    ERIC Educational Resources Information Center

    Mates, Barbara T.

    This book seeks to guide information providers in establishing accessible World Wide Web sites and acquiring the hardware and software needed by people with disabilities, focusing on access to the Internet using large print, voice, and Braille. The book also covers how to acquire the funds for adaptive technology, what type of equipment to choose,…

  10. Three Authentic Curriculum-Integration Approaches to Bird Adaptations That Incorporate Technology and Thinking Skills

    ERIC Educational Resources Information Center

    Rule, Audrey C.; Barrera, Manuel T., III

    2008-01-01

    Integration of subject areas with technology and thinking skills is a way to help teachers cope with today's overloaded curriculum and to help students see the connectedness of different curriculum areas. This study compares three authentic approaches to teaching a science unit on bird adaptations for habitat that integrate thinking skills and…

  11. An adaptive grid method for computing the high speed 3D viscous flow about a re-entry vehicle

    NASA Technical Reports Server (NTRS)

    Bockelie, Michael J.; Smith, Robert E.

    1992-01-01

    An algebraic solution-adaptive grid generation method that allows adapting the grid in all three coordinate directions is presented. Techniques are described that maintain the integrity of the original vehicle definition for grid point movement on the vehicle surface and that avoid grid crossover in the boundary-layer portion of the grid lying next to the vehicle surface. The adaptive method is tested by computing the Mach 6 hypersonic three-dimensional viscous flow about a proposed Martian entry vehicle.

  12. Computer-Adaptive Testing for Students with Disabilities: A Review of the Literature. Research Report. ETS RR-11-32

    ERIC Educational Resources Information Center

    Stone, Elizabeth; Davey, Tim

    2011-01-01

    There has been an increased interest in developing computer-adaptive testing (CAT) and multistage assessments for K-12 accountability assessments. The move to adaptive testing has been met with some resistance by those in the field of special education who express concern about routing of students with divergent profiles (e.g., some students with…

  13. Multiresolution Wavelet Based Adaptive Numerical Dissipation Control for Shock-Turbulence Computations

    NASA Technical Reports Server (NTRS)

    Sjoegreen, B.; Yee, H. C.

    2001-01-01

    The recently developed essentially fourth-order or higher low-dissipative shock-capturing scheme of Yee, Sandham and Djomehri (1999) aimed at minimizing numerical dissipation for high speed compressible viscous flows containing shocks, shears and turbulence. To detect non-smooth behavior and control the amount of numerical dissipation to be added, Yee et al. employed an artificial compression method (ACM) of Harten (1978) but utilized it in an entirely different context than Harten originally intended. The ACM sensor consists of two tuning parameters and is highly dependent on the physical problem. To minimize parameter tuning and problem dependence, new sensors with improved detection properties are proposed. The new sensors are derived from appropriate non-orthogonal wavelet basis functions and can be used to completely switch off the extra numerical dissipation outside shock layers. The non-dissipative spatial base scheme of arbitrarily high order of accuracy can be maintained without compromising its stability in all parts of the domain where the solution is smooth. Two types of redundant non-orthogonal wavelet basis functions are considered. One is the B-spline wavelet (Mallat & Zhong 1992) used by Gerritsen and Olsson (1996) in an adaptive mesh refinement method to determine regions where refinement should be done. The other is a modification of the multiresolution method of Harten (1995), converting it to a new, redundant, non-orthogonal wavelet. The wavelet sensor is then obtained by computing the estimated Lipschitz exponent of a chosen physical quantity (or vector) on a chosen wavelet basis function. Both wavelet sensors can be viewed as dual-purpose adaptive methods providing dynamic numerical dissipation control and improved grid adaptation indicators. Consequently, they are useful not only for shock-turbulence computations but also for computational aeroacoustics and numerical combustion. In addition, these…
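
    The sensor idea can be illustrated with a Harten-style multiresolution detail: predict each sample from its coarser-level neighbors and use the prediction error as a smoothness indicator. This is a simplified stand-in for the paper's redundant non-orthogonal wavelet sensors, shown only to convey the mechanism:

```python
import numpy as np

def mr_sensor(u):
    """Multiresolution detail coefficients: the error of predicting each
    interior sample by linear interpolation from its two neighbors.
    Near zero where the solution is smooth, O(jump) at discontinuities,
    so it can gate where extra numerical dissipation is applied."""
    prediction = 0.5 * (u[:-2] + u[2:])  # coarse-level linear prediction
    return np.abs(u[1:-1] - prediction)  # large only at non-smooth points
```

    On a linear profile the detail vanishes identically, while a step produces an O(1) spike at the discontinuity, which is exactly the selectivity a dissipation switch needs.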

  14. The Utah Educational Technology Initiative Year Two Evaluation: Program Implementation, Computer Acquisition and Placement, and Computer Use.

    ERIC Educational Resources Information Center

    Mergendoller, John R.; And Others

    This evaluation report describes program implementation, computer acquisition and placement, and computer use during the second year (1991-92) of the Utah Educational Technology Initiative (ETI). In addition, it discusses the various ways computers are used in Utah schools and reports the opinions and experiences of ETI coordinators in the 12…

  15. Design of computer-generated beam-shaping holograms by iterative finite-element mesh adaption.

    PubMed

    Dresel, T; Beyerlein, M; Schwider, J

    1996-12-10

    Computer-generated phase-only holograms can be used for laser beam shaping, i.e., for focusing a given aperture, with given intensity and phase distributions, into a prescribed intensity pattern in the focal plane. A numerical approach based on iterative finite-element mesh adaption permits the design of appropriate phase functions for focusing into two-dimensional reconstruction patterns. Both the hologram aperture and the reconstruction pattern are covered by mesh mappings. An iterative procedure delivers meshes with intensities equally distributed over the constituting elements. This design algorithm adds new elementary focuser functions to what we call object-oriented hologram design. Some design examples are discussed.

  16. An adaptive grid method for computing time accurate solutions on structured grids

    NASA Technical Reports Server (NTRS)

    Bockelie, Michael J.; Smith, Robert E.; Eiseman, Peter R.

    1991-01-01

    The solution method consists of three parts: a grid movement scheme; an unsteady Euler equation solver; and a temporal coupling routine that links the dynamic grid to the Euler solver. The grid movement scheme is an algebraic method containing grid controls that generate a smooth grid that resolves the severe solution gradients and the sharp transitions in the solution gradients. The temporal coupling is performed with a grid prediction-correction procedure that is simple to implement and provides a grid that does not lag the solution in time. The adaptive solution method is tested by computing the unsteady inviscid solutions for a one-dimensional shock tube and a two-dimensional shock-vortex interaction.
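
    Algebraic adaptive-grid methods of this kind typically redistribute nodes by equidistributing a weight function built from solution gradients. A one-dimensional sketch follows; the weight w = 1 + alpha*|du/dx| and its parameter alpha are illustrative assumptions, not the paper's specific grid controls:

```python
import numpy as np

def equidistribute(x, u, alpha=1.0):
    """One algebraic adaptation step: move the nodes x so that the
    weight w = 1 + alpha*|du/dx| is equidistributed, clustering points
    where the solution gradient is steep.  (Illustrative sketch; the
    real method adds smoothness and boundary-layer controls.)"""
    w = 1.0 + alpha * np.abs(np.gradient(u, x))
    # cumulative weight defines a monotone map from x onto [0, 1]
    s = np.concatenate(([0.0], np.cumsum(0.5 * (w[1:] + w[:-1]) * np.diff(x))))
    s /= s[-1]
    xi = np.linspace(0.0, 1.0, len(x))  # uniform computational coordinate
    return np.interp(xi, s, x)          # new physical node locations
```

    Applied to a steep tanh front, the new grid concentrates most of its nodes inside the front while keeping the node ordering and the domain endpoints intact.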

  17. Computational technology for flight vehicles; Proceedings of the Symposium on Computational Technology on Flight Vehicles, Washington, DC, Nov. 5-7, 1990

    NASA Technical Reports Server (NTRS)

    Noor, Ahmed K. (Editor); Venneri, Samuel L. (Editor)

    1990-01-01

    The present conference on computational methods for aeronautics applications discusses topics in the fields of parallel computing, multidisciplinary computational methods, grid generation, visualization methods for CFD, probabilistic modeling, numerical simulations and methodologies for different flow regimes, computational strategies and adaptive methods in CFD, and computational strategies in dynamics and control. Attention is given to the MACH system-software kernel, a multidisciplinary approach to aeroelastic analysis, interactive grid generation with control points, interactive flow visualization using stream surfaces, numerical simulations of dynamic/aerodynamic interactions, implicit methods for the Navier-Stokes equations, and the automatic phase-space analysis of dynamical systems.

  18. What Can Computer Technology Offer Special Education? Research & Resources: Special Education Information for Policymakers.

    ERIC Educational Resources Information Center

    National Conference of State Legislatures, Washington, DC.

    Intended for policymakers, this brief addresses issues related to computer technology and its contributions to special education. Trends are noted and three types of applications are considered: computer assisted instruction, computer managed instruction, and computer support activities. Descriptions of several computer applications in local and…

  19. Dynamic Load Balancing for Adaptive Computations on Distributed-Memory Machines

    NASA Technical Reports Server (NTRS)

    1999-01-01

    Dynamic load balancing is central to adaptive mesh-based computations on large-scale parallel computers. The principal investigator has investigated various issues of the dynamic load balancing problem under NASA JOVE and JAG grants. The major accomplishments of the project are two graph partitioning algorithms and a load balancing framework. The S-HARP dynamic graph partitioner is the fastest dynamic graph partitioner reported to date. It can partition a graph of over 100,000 vertices in 0.25 seconds on a 64-processor Cray T3E distributed-memory multiprocessor while maintaining a scalability of over 16-fold speedup. Other known and widely used dynamic graph partitioners take a second or two while achieving only a few-fold speedup on 64 processors. These results have been published in journals and peer-reviewed flagship conferences.
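
    For intuition about the partitioning objective, here is a toy level-set (breadth-first) bisection of a graph into two balanced parts, assuming an adjacency-list input. It illustrates only the balance goal; S-HARP itself is far more sophisticated (and fast) than this sketch:

```python
from collections import deque

def bfs_bisect(adj):
    """Toy graph bisection: grow a breadth-first region from vertex 0
    until it contains half the vertices.  The BFS frontier approximates
    the edge cut separating the two parts.  (Illustrative only.)"""
    n = len(adj)
    target = n // 2
    part = [1] * n                     # everything starts in part 1
    visited = [False] * n
    visited[0] = True
    queue, count = deque([0]), 0
    while queue and count < target:
        v = queue.popleft()
        part[v] = 0                    # move vertex into part 0
        count += 1
        for u in adj[v]:
            if not visited[u]:
                visited[u] = True
                queue.append(u)
    return part
```

    On a path graph the method recovers the obvious balanced cut; dynamic partitioners must additionally minimize data movement when the mesh (and hence the graph) changes between adaptation steps.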

  20. Method and system for rendering and interacting with an adaptable computing environment

    DOEpatents

    Osbourn, Gordon Cecil; Bouchard, Ann Marie

    2012-06-12

    An adaptable computing environment is implemented with software entities termed "s-machines", which self-assemble into hierarchical data structures capable of rendering and interacting with the computing environment. A hierarchical data structure includes a first hierarchical s-machine bound to a second hierarchical s-machine. The first hierarchical s-machine is associated with a first layer of a rendering region on a display screen and the second hierarchical s-machine is associated with a second layer of the rendering region overlaying at least a portion of the first layer. A screen element s-machine is linked to the first hierarchical s-machine. The screen element s-machine manages data associated with a screen element rendered to the display screen within the rendering region at the first layer.

  1. A Co-Adaptive Brain-Computer Interface for End Users with Severe Motor Impairment

    PubMed Central

    Faller, Josef; Scherer, Reinhold; Costa, Ursula; Opisso, Eloy; Medina, Josep; Müller-Putz, Gernot R.

    2014-01-01

    Co-adaptive training paradigms for event-related desynchronization (ERD) based brain-computer interfaces (BCI) have proven effective for healthy users. As of yet, it is not clear whether co-adaptive training paradigms can also benefit users with severe motor impairment. The primary goal of our paper was to evaluate a novel cue-guided, co-adaptive BCI training paradigm with severely impaired volunteers. The co-adaptive BCI supports a non-control state, which is an important step toward intuitive, self-paced control. A secondary aim was to have the same participants operate a specifically designed self-paced BCI training paradigm based on the auto-calibrated classifier. The co-adaptive BCI analyzed the electroencephalogram from three bipolar derivations (C3, Cz, and C4) online, while the 22 end users alternately performed right hand movement imagery (MI), left hand MI and relax with eyes open (non-control state). After less than five minutes, the BCI auto-calibrated and proceeded to provide visual feedback for the MI task that could be classified better against the non-control state. The BCI continued to recalibrate regularly. In every calibration step, the system performed trial-based outlier rejection and trained a linear discriminant analysis classifier based on one auto-selected logarithmic band-power feature. In 24 minutes of training, the co-adaptive BCI worked significantly (p = 0.01) better than chance for 18 of 22 end users. The self-paced BCI training paradigm worked significantly (p = 0.01) better than chance in 11 of 20 end users. The presented co-adaptive BCI complements existing approaches in that it supports a non-control state, requires very little setup time, requires no BCI expert and works online based on only two electrodes. The preliminary results from the self-paced BCI paradigm compare favorably to previous studies, and the collected data will allow us to further improve self-paced BCI systems for disabled users. PMID:25014055
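
    The classifier described above reduces to two ingredients: a logarithmic band-power feature and a two-class linear discriminant. A hedged numpy sketch follows; the periodogram-based band power and the regularization constant are simplifications of the usual band-pass/square/average pipeline, not the paper's exact implementation:

```python
import numpy as np

def log_bandpower(x, fs, band):
    """Log band-power of one EEG epoch from the periodogram
    (simplified feature extraction)."""
    freqs = np.fft.rfftfreq(len(x), 1.0 / fs)
    psd = np.abs(np.fft.rfft(x)) ** 2 / len(x)
    mask = (freqs >= band[0]) & (freqs <= band[1])
    return np.log(psd[mask].mean() + 1e-30)

def lda_train(X, y):
    """Two-class Fisher LDA with shared covariance:
    w = S^-1 (mu1 - mu0), threshold midway between the class means."""
    X0, X1 = X[y == 0], X[y == 1]
    mu0, mu1 = X0.mean(axis=0), X1.mean(axis=0)
    S = np.cov(np.vstack([X0 - mu0, X1 - mu1]).T) + 1e-6 * np.eye(X.shape[1])
    w = np.linalg.solve(np.atleast_2d(S), mu1 - mu0)
    b = -w @ (mu0 + mu1) / 2.0
    return w, b
```

    In a co-adaptive loop, `lda_train` would simply be re-run on the growing (outlier-filtered) trial buffer at every recalibration step.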

  2. Applying Observations from Technological Transformations in Complex Adaptive Systems to Inform Health Policy on Technology Adoption

    PubMed Central

    Phillips, Andrew B.; Merrill, Jacqueline

    2012-01-01

    Many complex markets such as banking and manufacturing have benefited significantly from technology adoption. Each of these complex markets experienced increased efficiency, quality, security, and customer involvement as a result of technology transformation in their industry. Healthcare has not benefited to the same extent. We provide initial findings from a policy analysis of complex markets and the features of these transformations that can influence health technology adoption and acceptance. PMID:24199112

  4. Adaptive intermittent control: A computational model explaining motor intermittency observed in human behavior.

    PubMed

    Sakaguchi, Yutaka; Tanaka, Masato; Inoue, Yasuyuki

    2015-07-01

    It is a fundamental question how our brain performs a given motor task in real time with a slow sensorimotor system. Computational theory proposed the influential idea of feed-forward control, but it has mainly treated cases where the movement is ballistic (such as reaching), because the motor commands must be calculated in advance of movement execution. As a possible mechanism for operating feed-forward control in continuous motor tasks (such as target tracking), we propose a control model called "adaptive intermittent control" or "segmented control," in which the brain adaptively divides the continuous time axis into discrete segments and executes feed-forward control within each segment. The idea of intermittent control has been proposed in the fields of control theory, biological modeling and nonlinear dynamical systems. Compared with these previous models, the key feature of the proposed model is that the system speculatively determines the segmentation based on future prediction and its uncertainty. Computer simulation showed that the proposed model realized faithful visuo-manual tracking with realistic sensorimotor delays and at lower computational cost (i.e., with fewer segments). Furthermore, it replicated "motor intermittency," that is, the intermittent discontinuities commonly observed in human movement trajectories. We argue that temporally segmented control is an inevitable strategy for a brain that must achieve a given task at small computational (or cognitive) cost, using a slow control system in an uncertain, variable environment, and that motor intermittency is a side effect of this strategy. PMID:25897510
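
    A toy version of segmented control can be simulated in a few lines: hold a linear feed-forward plan within each segment and open a new segment only when the observed target departs from the prediction by more than a tolerance. Thresholding on prediction error is a stand-in for the paper's uncertainty-based segmentation, and the linear plan is an illustrative assumption:

```python
import numpy as np

def intermittent_track(target, tol=0.1):
    """Segmented (intermittent) tracking: run a linear feed-forward plan
    open-loop and start a new segment only when the observed target
    deviates from the prediction by more than `tol`.  Returns the
    produced trajectory and the number of segments used."""
    n = len(target)
    y = np.empty(n)
    y[0] = target[0]
    seg_start, val, rate, segments = 0, target[0], 0.0, 1
    for t in range(1, n):
        pred = val + rate * (t - seg_start)   # feed-forward prediction
        if abs(target[t] - pred) > tol:       # plan no longer valid: re-plan
            segments += 1
            seg_start, val = t, target[t]
            rate = target[t] - target[t - 1]  # extrapolate the local slope
            pred = val
        y[t] = pred
    return y, segments
```

    Tracking a sinusoid this way needs far fewer re-planning events than time steps while keeping the error within the tolerance, mirroring the intermittency the model predicts.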

  5. Important advances in technology and unique applications to cardiovascular computed tomography.

    PubMed

    Chaikriangkrai, Kongkiat; Choi, Su Yeon; Nabi, Faisal; Chang, Su Min

    2014-01-01

    For the past decade, multidetector cardiac computed tomography and its main application, coronary computed tomography angiography, have been established as a noninvasive technique for anatomical assessment of coronary arteries. This new era of coronary artery evaluation by coronary computed tomography angiography has arisen from the rapid advancement in computed tomography technology, which has led to massive diagnostic and prognostic clinical studies in various patient populations. This article gives a brief overview of current multidetector cardiac computed tomography systems, developing cardiac computed tomography technologies in both hardware and software fields, innovative radiation exposure reduction measures, multidetector cardiac computed tomography functional studies, and their newer clinical applications beyond coronary computed tomography angiography. PMID:25574342

  6. Elicitation of natural language representations of uncertainty using computer technology

    SciTech Connect

    Tonn, B.; Goeltz, R.; Travis, C.; Tennessee Univ., Knoxville, TN )

    1989-01-01

    Knowledge elicitation is an important aspect of risk analysis. Knowledge about risks must be accurately elicited from experts for use in risk assessments. Knowledge and perceptions of risks must also be accurately elicited from the public in order to intelligently perform policy analysis and develop and implement programs. Oak Ridge National Laboratory is developing computer technology to effectively and efficiently elicit knowledge from experts and the public. This paper discusses software developed to elicit natural language representations of uncertainty. The software is written in Common Lisp and resides on VAX computer systems and Symbolics Lisp machines. The software has three goals: to determine preferences for using natural-language terms to represent uncertainty; to obtain likelihood rankings of the terms; and to determine how likelihood estimates are combined to form new terms. The first two goals relate to providing useful results for those interested in risk communication. The third relates to providing cognitive data to further our understanding of people's decision making under uncertainty. The software is used to elicit natural language terms used to express the likelihood of various agents causing cancer in humans and cancer resulting in various maladies, and the likelihood of everyday events. 6 refs., 4 figs., 4 tabs.

  7. Adapt

    NASA Astrophysics Data System (ADS)

    Bargatze, L. F.

    2015-12-01

    Active Data Archive Product Tracking (ADAPT) is a collection of software routines that permits one to generate XML metadata files to describe and register data products in support of the NASA Heliophysics Virtual Observatory VxO effort. ADAPT is also a philosophy. The ADAPT concept is to use any and all available metadata associated with scientific data to produce XML metadata descriptions in a consistent, uniform, and organized fashion to provide blanket access to the full complement of data stored on a targeted data server. In this poster, we present an application of ADAPT to describe all of the data products that are stored by using the Common Data File (CDF) format served out by the CDAWEB and SPDF data servers hosted at the NASA Goddard Space Flight Center. These data servers are the primary repositories for NASA Heliophysics data. For this purpose, the ADAPT routines have been used to generate data resource descriptions by using an XML schema named Space Physics Archive, Search, and Extract (SPASE). SPASE is the designated standard for documenting Heliophysics data products, as adopted by the Heliophysics Data and Model Consortium. The set of SPASE XML resource descriptions produced by ADAPT includes high-level descriptions of numerical data products, display data products, or catalogs and also includes low-level "Granule" descriptions. A SPASE Granule is effectively a universal access metadata resource; a Granule associates an individual data file (e.g. a CDF file) with a "parent" high-level data resource description, assigns a resource identifier to the file, and lists the corresponding access URL(s). The CDAWEB and SPDF file systems were queried to provide the input required by the ADAPT software to create an initial set of SPASE metadata resource descriptions. Then, the CDAWEB and SPDF data repositories were queried subsequently on a nightly basis and the CDF file lists were checked for any changes such as the occurrence of new, modified, or deleted…
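
    The Granule-generation step can be pictured as mechanically wrapping each file's identifiers and access URL in XML. A minimal sketch using Python's standard library is shown below; the element names follow the SPASE flavor only loosely, and the real schema defines many more required fields:

```python
import xml.etree.ElementTree as ET

def make_granule(parent_id, granule_id, url):
    """Assemble a minimal SPASE-like Granule record that links one data
    file to its parent resource description.  Element names are an
    illustrative subset of the actual SPASE schema."""
    spase = ET.Element("Spase")
    granule = ET.SubElement(spase, "Granule")
    ET.SubElement(granule, "ResourceID").text = granule_id
    ET.SubElement(granule, "ParentID").text = parent_id
    source = ET.SubElement(granule, "Source")
    ET.SubElement(source, "URL").text = url
    return ET.tostring(spase, encoding="unicode")
```

    A nightly job like the one described would call such a routine once per new or modified CDF file discovered in the repository listing.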

  8. Advances in computer technology: impact on the practice of medicine.

    PubMed

    Groth-Vasselli, B; Singh, K; Farnsworth, P N

    1995-01-01

    Advances in computer technology provide a wide range of applications which are revolutionizing the practice of medicine. The development of new software for the office creates a web of communication among physicians, staff members, health care facilities and associated agencies. This provides the physician with the prospect of a paperless office. At the other end of the spectrum, the development of 3D work stations and software based on computational chemistry permits visualization of protein molecules involved in disease. Computer assisted molecular modeling has been used to construct working 3D models of lens alpha-crystallin. The 3D structure of alpha-crystallin is basic to our understanding of the molecular mechanisms involved in lens fiber cell maturation, stabilization of the inner nuclear region, the maintenance of lens transparency and cataractogenesis. The major component of the high molecular weight aggregates that occur during cataractogenesis is alpha-crystallin subunits. Subunits of alpha-crystallin occur in other tissues of the body. In the central nervous system accumulation of these subunits in the form of dense inclusion bodies occurs in pathological conditions such as Alzheimer's disease, Huntington's disease, multiple sclerosis and toxoplasmosis (Iwaki, Wisniewski et al., 1992), as well as neoplasms of astrocyte origin (Iwaki, Iwaki, et al., 1991). Also cardiac ischemia is associated with an increased alpha B synthesis (Chiesi, Longoni et al., 1990). On a more global level, the molecular structure of alpha-crystallin may provide information pertaining to the function of small heat shock proteins, hsp, in maintaining cell stability under the stress of disease.

  9. Arithmetic Fourier transform and adaptive delta modulation: A symbiosis for high speed computation

    SciTech Connect

    Tufts, D.W.; Sadasiv, G.

    1988-01-01

    This report presents preliminary results on the VLSI design and implementation of a novel and promising algorithm for accurate high-speed Fourier analysis and synthesis. The Arithmetic Fourier Transform is based on the number-theoretic method of Mobius inversion. Its computations proceed in parallel and the individual operations are very simple. Except for a small number of scalings in one stage of the computation, only multiplications by 0, +1, and -1 are required. If the input samples were not quantized and if ideal real-number operations were used internally, then the results would be exact. The accuracy of the computation is limited only by the input A/D conversion process, any constraints on the word lengths of internal accumulating registers, and the implementation of the few scaling operations. Motivated by the goal of efficient, effective, high-speed realization of the algorithm in an integrated circuit, we introduce further simplicities by the use of delta modulation to represent the input function in digital form. The result is that only binary (or preferably, ternary) sequences need to be processed in the parallel computations. And the required accumulations can be replaced by up/down counters. The dynamic range of the resulting transformation can be increased by the use of adaptive delta modulation.
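
    The number-theoretic core of the AFT is Mobius inversion: if g(m) is the sum of f(d) over the divisors d of m, then f(n) = sum over divisors d of n of mu(n/d)·g(d). A small sketch of the identity itself (not the full transform, which applies it to sampled averages of the signal):

```python
def mobius(n):
    """Mobius function mu(n) by trial factorization: 0 if n has a
    squared prime factor, otherwise (-1)^(number of prime factors)."""
    if n == 1:
        return 1
    mu, p = 1, 2
    while p * p <= n:
        if n % p == 0:
            n //= p
            if n % p == 0:   # squared prime factor
                return 0
            mu = -mu
        p += 1
    return -mu if n > 1 else mu  # account for a remaining prime factor

def mobius_invert(g, n):
    """Recover f(n) from g(m) = sum_{d|m} f(d) via
    f(n) = sum_{d|n} mu(n/d) * g(d)."""
    return sum(mobius(n // d) * g(d) for d in range(1, n + 1) if n % d == 0)
```

    In the AFT this inversion lets Fourier coefficients be recovered from simple averages of signal samples using only additions and sign flips, which is what makes the VLSI realization attractive.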

  10. Application of adaptive and neural network computational techniques to Traffic Volume and Classification Monitoring

    SciTech Connect

    Mead, W.C.; Fisher, H.N.; Jones, R.D.; Bisset, K.R.; Lee, L.A.

    1993-09-01

    We are developing a Traffic Volume and Classification Monitoring (TVCM) system based on adaptive and neural network computational techniques. The value of neural networks in this application lies in their ability to learn from data and to form a mapping of arbitrary topology. The piezoelectric strip and magnetic loop sensors typically used for TVCM provide signals that are complicated and variable, and that correspond in indirect ways with the desired FHWA 13-class classification system. Further, the wide variety of vehicle configurations adds to the complexity of the classification task. Our goal is to provide a TVCM system featuring high accuracy, adaptability to wide sensor and environmental variations, and continuous fault detection. We have instrumented an experimental TVCM site, developed PC-based on-line data acquisition software, collected a large database of vehicles' signals together with accurate ground-truth determination, and analyzed the data off-line with a neural net classification system that can distinguish between class 2 (automobiles) and class 3 (utility vehicles) with better than 90% accuracy. The neural network used, called the Connectionist Hyperprism Classification (CHC) network, features simple basis functions; rapid, linear training algorithms for basis function amplitudes and widths; and basis function elimination that enhances network speed and accuracy. Work is in progress to extend the system to other classes, to quantify the system's adaptability, and to develop automatic fault detection techniques.
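
    The "rapid, linear training" of basis-function amplitudes mentioned above amounts to a linear least-squares fit once the basis centers and widths are fixed. A hedged sketch with Gaussian bases follows; the CHC network's hyperprism bases and its width adaptation are more elaborate than this:

```python
import numpy as np

def gaussian_design(X, centers, width):
    """Design matrix of fixed Gaussian basis functions."""
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-d2 / (2.0 * width ** 2))

def rbf_train(X, y, centers, width):
    """Fit basis-function amplitudes by linear least squares -- the
    'rapid, linear training' idea (illustrative, not the CHC network)."""
    amplitudes, *_ = np.linalg.lstsq(
        gaussian_design(X, centers, width), y, rcond=None)
    return amplitudes

def rbf_predict(X, centers, width, amplitudes):
    return gaussian_design(X, centers, width) @ amplitudes
```

    Because the fit is linear in the amplitudes, retraining after a sensor or environmental drift is a single matrix solve rather than an iterative optimization, which is what makes such networks attractive for continuously adapting field systems.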

  11. THE PLUTO CODE FOR ADAPTIVE MESH COMPUTATIONS IN ASTROPHYSICAL FLUID DYNAMICS

    SciTech Connect

    Mignone, A.; Tzeferacos, P.; Zanni, C.; Bodo, G.; Van Straalen, B.; Colella, P.

    2012-01-01

    We present a description of the adaptive mesh refinement (AMR) implementation of the PLUTO code for solving the equations of classical and special relativistic magnetohydrodynamics (MHD and RMHD). The current release exploits, in addition to the static grid version of the code, the distributed infrastructure of the CHOMBO library for multidimensional parallel computations over block-structured, adaptively refined grids. We employ a conservative finite-volume approach where primary flow quantities are discretized at the cell center in a dimensionally unsplit fashion using the Corner Transport Upwind method. Time stepping relies on a characteristic tracing step where piecewise parabolic method, weighted essentially non-oscillatory, or slope-limited linear interpolation schemes can be handily adopted. A characteristic decomposition-free version of the scheme is also illustrated. The solenoidal condition of the magnetic field is enforced by augmenting the equations with a generalized Lagrange multiplier providing propagation and damping of divergence errors through a mixed hyperbolic/parabolic explicit cleaning step. Among the novel features, we describe an extension of the scheme to include non-ideal dissipative processes, such as viscosity, resistivity, and anisotropic thermal conduction without operator splitting. Finally, we illustrate an efficient treatment of point-local, potentially stiff source terms over hierarchical nested grids by taking advantage of the adaptivity in time. Several multidimensional benchmarks and applications to problems of astrophysical relevance demonstrate the potential of the AMR version of PLUTO in resolving flow features separated by large spatial and temporal disparities.

  12. An adaptive filter bank for motor imagery based Brain Computer Interface.

    PubMed

    Thomas, Kavitha P; Guan, Cuntai; Tong, Lau Chiew; Prasad, Vinod A

    2008-01-01

    Brain Computer Interface (BCI) provides an alternative communication and control method for people with severe motor disabilities. Motor imagery patterns are widely used in Electroencephalogram (EEG) based BCIs. These motor imagery activities are associated with variations in the alpha and beta band power of EEG signals called Event-Related Desynchronization/Synchronization (ERD/ERS). The dominant frequency bands are subject-specific, and therefore the performance of motor imagery based BCIs is sensitive to both temporal filtering and spatial filtering. As the optimum filter is strongly subject-dependent, we propose a method that selects the subject-specific discriminative frequency components using time-frequency plots of the Fisher ratio of two-class motor imagery patterns. We also propose a low-complexity adaptive Finite Impulse Response (FIR) filter bank system based on the coefficient decimation technique, which can realize the subject-specific bandpass filters adaptively depending on the information in the Fisher ratio map. Features are extracted only from the selected frequency components. The proposed adaptive filter bank based system offers an average classification accuracy of about 90%, which is slightly better than the existing fixed filter bank system. PMID:19162856
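    The band-selection criterion can be sketched numerically. The band-power values below are synthetic stand-ins for bandpass-filtered EEG features, and the 2 Hz band grid is an illustrative assumption, not the authors' exact configuration; only the Fisher-ratio formula itself is standard.

```python
import numpy as np

# Hedged sketch: score each candidate frequency band by the Fisher ratio of
# its (log) band-power feature between two motor-imagery classes, then keep
# the most discriminative band(s).
def fisher_ratio(feat_a, feat_b):
    """Discriminability of one feature between two classes."""
    m_a, m_b = feat_a.mean(), feat_b.mean()
    v_a, v_b = feat_a.var(ddof=1), feat_b.var(ddof=1)
    return (m_a - m_b) ** 2 / (v_a + v_b)

rng = np.random.default_rng(1)
bands = np.arange(4, 32, 2)          # candidate 2 Hz-wide bands (assumption)

# Simulate ERD: the two classes differ only in the 10 Hz band.
scores = []
for f0 in bands:
    sep = 1.5 if f0 == 10 else 0.0
    a = rng.normal(0.0, 1.0, 60)     # class 1 log band power (synthetic)
    b = rng.normal(sep, 1.0, 60)     # class 2 log band power (synthetic)
    scores.append(fisher_ratio(a, b))

best = bands[int(np.argmax(scores))]
print("most discriminative band:", best, "Hz")
```

In the paper this scoring is done on a full time-frequency map; the selected components then configure the coefficient-decimated FIR filter bank.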

  13. Enabling Technologies for Scalable Trapped Ion Quantum Computing

    NASA Astrophysics Data System (ADS)

    Crain, Stephen; Gaultney, Daniel; Mount, Emily; Knoernschild, Caleb; Baek, Soyoung; Maunz, Peter; Kim, Jungsang

    2013-05-01

    Scalability is one of the main challenges of trapped ion based quantum computation, mainly limited by the lack of enabling technologies needed to trap, manipulate and process the increasing number of qubits. Microelectromechanical systems (MEMS) technology allows one to design movable micromirrors to focus laser beams on individual ions in a chain and steer the focal point in two dimensions. Our current MEMS system is designed to steer 355 nm pulsed laser beams to carry out logic gates on a chain of Yb ions with a waist of 1.5 μm across a 20 μm range. In order to read the state of the qubit chain we developed a 32-channel PMT with a custom read-out circuit operating near the thermal noise limit of the readout amplifier which increases state detection fidelity. We also developed a set of digital to analog converters (DACs) used to supply analog DC voltages to the electrodes of an ion trap. We designed asynchronous DACs to avoid added noise injection at the update rate commonly found in synchronous DACs. Effective noise filtering is expected to reduce the heating rate of a surface trap, thus improving multi-qubit logic gate fidelities. Our DAC system features 96 channels and an integrated FPGA that allows the system to be controlled in real time. This work was supported by IARPA/ARO.

  14. Adaptive Flow Simulation of Turbulence in Subject-Specific Abdominal Aortic Aneurysm on Massively Parallel Computers

    NASA Astrophysics Data System (ADS)

    Sahni, Onkar; Jansen, Kenneth; Shephard, Mark; Taylor, Charles

    2007-11-01

    Flow within the healthy human vascular system is typically laminar, but diseased conditions can alter the geometry sufficiently to produce transitional/turbulent flows in the region of (and immediately downstream from) the diseased section. The mean unsteadiness (pulsatile or respiratory cycle) further complicates the situation, making traditional turbulence simulation techniques (e.g., Reynolds-averaged Navier-Stokes simulations (RANSS)) suspect. At the other extreme, direct numerical simulation (DNS), while fully appropriate, can lead to large computational expense, particularly when the simulations must be done quickly since they are intended to affect the outcome of a medical treatment (e.g., virtual surgical planning). To produce simulations in a clinically relevant time frame requires: 1) an adaptive meshing technique that closely matches the desired local mesh resolution in all three directions to the highly anisotropic physical length scales in the flow, 2) efficient solution algorithms, and 3) excellent scaling on massively parallel computers. In this presentation we will demonstrate results for a subject-specific simulation of an abdominal aortic aneurysm using a stabilized finite element method on anisotropically adapted meshes consisting of O(10^8) elements over O(10^4) processors.

  15. Isosurface Computation Made Simple: Hardware Acceleration, Adaptive Refinement and Tetrahedral Stripping

    SciTech Connect

    Pascucci, V

    2004-02-18

    This paper presents a simple approach for rendering isosurfaces of a scalar field. Using the vertex programming capability of commodity graphics cards, we transfer the cost of computing an isosurface from the Central Processing Unit (CPU), running the main application, to the Graphics Processing Unit (GPU), rendering the images. We consider a tetrahedral decomposition of the domain and draw one quadrangle (quad) primitive per tetrahedron. A vertex program transforms the quad into the piece of isosurface within the tetrahedron (see Figure 2). In this way, the main application is only devoted to streaming the vertices of the tetrahedra from main memory to the graphics card. For adaptively refined rectilinear grids, the optimization of this streaming process leads to the definition of a new 3D space-filling curve, which generalizes the 2D Sierpinski curve used for efficient rendering of triangulated terrains. We maintain the simplicity of the scheme when constructing view-dependent adaptive refinements of the domain mesh. In particular, we guarantee the absence of T-junctions by satisfying local bounds in our nested error basis. The expensive stage of fixing cracks in the mesh is completely avoided. We discuss practical tradeoffs in the distribution of the workload between the application and the graphics hardware. With current GPUs it is convenient to perform certain computations on the main CPU. Beyond the performance considerations, which will change with new generations of GPUs, this approach has the major advantage of completely avoiding the storage in memory of the isosurface vertices and triangles.
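    The per-tetrahedron computation that the paper moves into a GPU vertex program can be sketched on the CPU. This is a deliberately minimal Python illustration of the edge-interpolation step only; degenerate cases (field values exactly equal to the isovalue), the quad-to-surface mapping, the space-filling-curve streaming order, and the rendering itself are all omitted.

```python
# Hedged sketch: where does an isosurface cross the edges of one tetrahedron?
# Linear interpolation along each edge whose endpoint values straddle the
# isovalue yields the 3 or 4 polygon vertices inside that tetrahedron.

def lerp(p, q, fp, fq, iso):
    """Point on edge (p, q) where the linearly interpolated field equals iso."""
    t = (iso - fp) / (fq - fp)
    return tuple(a + t * (b - a) for a, b in zip(p, q))

def tetra_isosurface(verts, vals, iso):
    """Return the isosurface polygon (0, 3, or 4 points) for one tetrahedron."""
    pts = []
    edges = [(0, 1), (0, 2), (0, 3), (1, 2), (1, 3), (2, 3)]
    for i, j in edges:
        if (vals[i] - iso) * (vals[j] - iso) < 0:   # edge straddles the surface
            pts.append(lerp(verts[i], verts[j], vals[i], vals[j], iso))
    return pts                                      # triangle or quad

# Unit tetrahedron with field f(x, y, z) = x; the plane x = 0.5 cuts a triangle.
verts = [(0, 0, 0), (1, 0, 0), (0, 1, 0), (0, 0, 1)]
vals = [v[0] for v in verts]
print(tetra_isosurface(verts, vals, 0.5))
```

In the paper's scheme, this interpolation runs per quad vertex on the GPU, so the application never materializes the isosurface triangles in memory.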

  16. Reconstruction of sparse-view X-ray computed tomography using adaptive iterative algorithms.

    PubMed

    Liu, Li; Lin, Weikai; Jin, Mingwu

    2015-01-01

    In this paper, we propose two reconstruction algorithms for sparse-view X-ray computed tomography (CT). Treating the reconstruction problems as data-fidelity-constrained total variation (TV) minimization, both algorithms adopt the alternating two-stage strategy: projection onto convex sets (POCS) for the data fidelity and non-negativity constraints, and steepest descent for TV minimization. The novelty of this work is to determine the iterative parameters automatically from the data, thus avoiding tedious manual parameter tuning. In TV minimization, the step sizes of steepest descent are adaptively adjusted according to the difference from the POCS update in either the projection domain or the image domain, while the step size of the algebraic reconstruction technique (ART) in POCS is determined based on the data noise level. In addition, projection errors are compared with the error bound to decide whether to perform ART, so as to reduce computational costs. The performance of the proposed methods is studied and evaluated using both simulated and physical phantom data. Our methods with automatic parameter tuning achieve similar, if not better, reconstruction performance compared to a representative two-stage algorithm.
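    The alternating two-stage structure can be sketched numerically. This is NOT the authors' algorithm: the system matrix and phantom are toys rather than a real CT geometry, and the rule that scales the TV step by the size of the POCS update is only a stand-in for the paper's data-driven parameter selection.

```python
import numpy as np

# Hedged sketch: POCS pass (one ART sweep + non-negativity) for data fidelity,
# then steepest descent on smoothed isotropic TV, with the TV step size tied
# to how far the POCS pass moved the image.
rng = np.random.default_rng(2)
n = 16
truth = np.zeros((n, n))
truth[4:12, 5:11] = 1.0                      # piecewise-constant phantom
A = rng.random((80, n * n))                  # toy underdetermined "sparse-view" system
b = A @ truth.ravel()                        # noiseless toy measurements

def tv_grad(u, eps=1e-8):
    """Gradient of the (smoothed) isotropic total variation of image u."""
    p = np.pad(u, 1, mode="edge")
    def s(i, j):
        return np.sqrt((p[i, j] - p[i - 1, j]) ** 2
                       + (p[i, j] - p[i, j - 1]) ** 2 + eps)
    g = np.zeros_like(u)
    for r in range(u.shape[0]):
        for c in range(u.shape[1]):
            i, j = r + 1, c + 1
            g[r, c] = ((p[i, j] - p[i - 1, j]) + (p[i, j] - p[i, j - 1])) / s(i, j) \
                      - (p[i + 1, j] - p[i, j]) / s(i + 1, j) \
                      - (p[i, j + 1] - p[i, j]) / s(i, j + 1)
    return g

x = np.zeros(n * n)
for _ in range(30):
    x_prev = x.copy()
    for row, bi in zip(A, b):                # ART (Kaczmarz) sweep: data fidelity
        x += (bi - row @ x) / (row @ row) * row
    x = np.clip(x, 0.0, None)                # non-negativity constraint
    dp = np.linalg.norm(x - x_prev)          # size of the POCS update
    img = x.reshape(n, n)
    for _ in range(5):                       # TV descent, step scaled by dp
        g = tv_grad(img)
        gn = np.linalg.norm(g)
        if gn > 0:
            img = img - 0.05 * dp * g / gn
    x = img.ravel()

err = np.linalg.norm(x - truth.ravel()) / np.linalg.norm(truth)
print(f"relative reconstruction error: {err:.3f}")
```

The point of the sketch is the control flow, not reconstruction quality: both step sizes are derived from the data rather than hand-tuned.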

  17. Evolutionary adaptive eye tracking for low-cost human computer interaction applications

    NASA Astrophysics Data System (ADS)

    Shen, Yan; Shin, Hak Chul; Sung, Won Jun; Khim, Sarang; Kim, Honglak; Rhee, Phill Kyu

    2013-01-01

    We present an evolutionary adaptive eye-tracking framework aiming for low-cost human computer interaction. The main focus is to guarantee eye-tracking performance without using high-cost devices and strongly controlled situations. The performance optimization of eye tracking is formulated as the dynamic control problem of deciding on an eye-tracking algorithm structure and associated thresholds/parameters, where the dynamic control space is denoted by genotype and phenotype spaces. The evolutionary algorithm is responsible for exploring the genotype control space, and the reinforcement learning algorithm organizes the evolved genotype into a reactive phenotype. The evolutionary algorithm encodes an eye-tracking scheme as a genetic code based on image variation analysis. Then, the reinforcement learning algorithm defines internal states in a phenotype control space limited by the perceived genetic code and carries out interactive adaptations. The proposed method can achieve optimal performance by balancing the difficulty of running the evolutionary algorithm in real time against the drawback of the reinforcement learning algorithm's huge search space. Extensive experiments were carried out using webcam image sequences and yielded very encouraging results. The framework can be readily applied to other low-cost vision-based human computer interactions to address their intrinsic brittleness in unstable operational environments.

  18. A simple computational principle predicts vocal adaptation dynamics across age and error size

    PubMed Central

    Kelly, Conor W.; Sober, Samuel J.

    2014-01-01

    The brain uses sensory feedback to correct errors in behavior. Songbirds and humans acquire vocal behaviors by imitating the sounds produced by adults and rely on auditory feedback to correct vocal errors throughout their lifetimes. In both birds and humans, acoustic variability decreases steadily with age following the acquisition of vocal behavior. Prior studies in adults have shown that while sensory errors that fall within the limits of vocal variability evoke robust motor corrections, larger errors do not induce learning. Although such results suggest that younger animals, which have greater vocal variability, might correct large errors more readily than older individuals, it is unknown whether age-dependent changes in variability are accompanied by changes in the speed or magnitude of vocal error correction. We tested the hypothesis that auditory errors evoke greater vocal changes in younger animals and that a common computation determines how sensory information drives motor learning across different ages and error sizes. Consistent with our hypothesis, we found that in songbirds the speed and extent of error correction changes dramatically with age and that age-dependent differences in learning were predicted by a model in which the overlap between sensory errors and the distribution of prior sensory feedback determines the dynamics of adaptation. Our results suggest that the brain employs a simple and robust computational principle to calibrate the rate and magnitude of vocal adaptation across age-dependent changes in behavioral performance and in response to different sensory errors. PMID:25324740
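    The "overlap" principle can be made concrete with a deliberately simplified sketch. This is our toy reading of the idea, assuming a Gaussian prior feedback distribution whose width tracks vocal variability; the functional form, numbers, and function name are illustrative assumptions, not the authors' fitted model.

```python
import math

# Toy model: the fraction of a sensory error that drives correction is taken
# to be the likelihood of the shifted feedback under a Gaussian prior N(0,
# sigma), where sigma is the animal's vocal variability (larger when young).
# Errors far outside the prior's support then evoke little learning.

def correction_fraction(error, sigma):
    """Relative overlap between a sensory error and prior feedback N(0, sigma)."""
    return math.exp(-error ** 2 / (2 * sigma ** 2))

for label, sigma in [("young (high variability)", 1.0),
                     ("adult (low variability)", 0.5)]:
    for err in (0.5, 1.0, 2.0):
        frac = correction_fraction(err, sigma)
        print(f"{label}: error {err} -> corrected fraction {frac:.2f}")
```

Under this reading, the same computation predicts both the adult finding (large errors are discounted) and the age effect (younger, more variable singers correct a given error more strongly).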

  19. Scatter correction for cone-beam computed tomography using self-adaptive scatter kernel superposition

    NASA Astrophysics Data System (ADS)

    Xie, Shi-Peng; Luo, Li-Min

    2012-06-01

    The authors propose a combined scatter reduction and correction method to improve image quality in cone-beam computed tomography (CBCT). The scatter kernel superposition (SKS) method has been used occasionally in previous studies; this work differs in that a scatter detecting blocker (SDB) was placed between the X-ray source and the tested object to model a self-adaptive scatter kernel. The study first estimates the scatter kernel parameters using the SDB, and then isolates the scatter distribution based on the SKS. Image quality can be improved by removing the scatter distribution. The results show that the method effectively reduces scatter artifacts and increases image quality: our approach increases image contrast and reduces the magnitude of cupping. The accuracy of the SKS technique is significantly improved in our method by the use of a self-adaptive scatter kernel. This method is computationally efficient, easy to implement, and provides scatter correction using a single-scan acquisition.
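    The kernel-superposition step itself can be sketched as a convolution and subtraction. This illustrates only the generic SKS idea; the blocker-based, per-scan estimation of the kernel parameters (the paper's actual contribution) is not reproduced, and the amplitude and width below are arbitrary assumptions.

```python
import numpy as np

# Hedged sketch of scatter kernel superposition: model scatter as the measured
# projection convolved with a broad kernel, then subtract the estimate.
def gaussian_kernel(size, sigma):
    ax = np.arange(size) - size // 2
    xx, yy = np.meshgrid(ax, ax)
    k = np.exp(-(xx ** 2 + yy ** 2) / (2 * sigma ** 2))
    return k / k.sum()

def convolve2d(img, k):
    """Naive 'same'-size convolution (symmetric kernel), zero-padded borders."""
    r = k.shape[0] // 2
    pad = np.pad(img, r)
    out = np.empty_like(img, dtype=float)
    for i in range(img.shape[0]):
        for j in range(img.shape[1]):
            out[i, j] = (pad[i:i + k.shape[0], j:j + k.shape[1]] * k).sum()
    return out

measured = np.zeros((32, 32))
measured[8:24, 8:24] = 100.0          # toy projection of a uniform object
amplitude, sigma = 0.15, 6.0          # kernel parameters (assumptions; the paper
                                      # estimates these per scan with the blocker)
scatter = amplitude * convolve2d(measured, gaussian_kernel(13, sigma))
corrected = measured - scatter        # removes the low-frequency scatter offset
```

Because the scatter estimate is broad and smooth, subtracting it mainly flattens the cupping-like offset rather than sharpening edges.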

  20. Review of Enabling Technologies to Facilitate Secure Compute Customization

    SciTech Connect

    Aderholdt, Ferrol; Caldwell, Blake A; Hicks, Susan Elaine; Koch, Scott M; Naughton, III, Thomas J; Pelfrey, Daniel S; Pogge, James R; Scott, Stephen L; Shipman, Galen M; Sorrillo, Lawrence

    2014-12-01

    High performance computing environments are often used for a wide variety of workloads ranging from simulation, data transformation and analysis, and complex workflows to name just a few. These systems may process data for a variety of users, often requiring strong separation between job allocations. There are many challenges to establishing these secure enclaves within the shared infrastructure of high-performance computing (HPC) environments. The isolation mechanisms in the system software are the basic building blocks for enabling secure compute enclaves. There are a variety of approaches, and the focus of this report is to review the different virtualization technologies that facilitate the creation of secure compute enclaves. The report reviews current operating system (OS) protection mechanisms and modern virtualization technologies to better understand their performance/isolation properties. We also examine the feasibility of running "virtualized" computing resources as non-privileged users, and of providing controlled administrative permissions for standard users running within a virtualized context. Our examination includes technologies such as Linux containers (LXC [32], Docker [15]) and full virtualization (KVM [26], Xen [5]). We categorize these different approaches to virtualization into two broad groups: OS-level virtualization and system-level virtualization. OS-level virtualization uses containers to allow a single OS kernel to be partitioned to create Virtual Environments (VE), e.g., LXC; the resources within the host's kernel are only virtualized in the sense of separate namespaces. In contrast, system-level virtualization uses hypervisors to manage multiple OS kernels and virtualize the physical resources (hardware) to create Virtual Machines (VM), e.g., Xen, KVM. This terminology of VE and VM, detailed in Section 2, is used throughout the report to distinguish between the two approaches to providing virtualized execution environments.
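    The VE side of this distinction is directly observable from user space on Linux. The following sketch (Linux-only; an assumption about the host, not anything from the report) lists the kernel namespace identifiers of the current process; a container is, at bottom, a group of processes given a different set of these identifiers, whereas a VM boots a separate kernel and exposes nothing of itself this way.

```python
import os

# Hedged illustration of OS-level virtualization: every Linux process carries
# a set of namespace handles under /proc/self/ns. Container runtimes (LXC,
# Docker) give containerized processes their own pid, mnt, net, etc.
# namespaces; processes in the same container share identifiers.
def namespace_ids():
    """Map namespace type -> identifier string for the current process."""
    ns_dir = "/proc/self/ns"
    return {name: os.readlink(os.path.join(ns_dir, name))
            for name in sorted(os.listdir(ns_dir))}

if __name__ == "__main__":
    for name, ident in namespace_ids().items():
        print(f"{name:10s} {ident}")
```

Comparing this output inside and outside a container shows exactly which resources the VE has "virtualized in the sense of separate namespaces".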

  1. Computer architectures for computational physics work done by Computational Research and Technology Branch and Advanced Computational Concepts Group

    NASA Technical Reports Server (NTRS)

    1985-01-01

    Slides are reproduced that describe the importance of having high performance number crunching and graphics capability. They also indicate the types of research and development underway at Ames Research Center to ensure that, in the near term, Ames is a smart buyer and user, and in the long-term that Ames knows the best possible solutions for number crunching and graphics needs. The drivers for this research are real computational physics applications of interest to Ames and NASA. They are concerned with how to map the applications, and how to maximize the physics learned from the results of the calculations. The computer graphics activities are aimed at getting maximum information from the three-dimensional calculations by using the real time manipulation of three-dimensional data on the Silicon Graphics workstation. Work is underway on new algorithms that will permit the display of experimental results that are sparse and random, the same way that the dense and regular computed results are displayed.

  2. Fluidity: a fully-unstructured adaptive mesh computational framework for geodynamics

    NASA Astrophysics Data System (ADS)

    Kramer, S. C.; Davies, D.; Wilson, C. R.

    2010-12-01

    Fluidity is a finite element, finite volume fluid dynamics model developed by the Applied Modelling and Computation Group at Imperial College London. Several features of the model make it attractive for use in geodynamics. A core finite element library enables the rapid implementation and investigation of new numerical schemes. For example, the function spaces used for each variable can be changed allowing properties of the discretisation, such as stability, conservation and balance, to be easily varied and investigated. Furthermore, unstructured, simplex meshes allow the underlying resolution to vary rapidly across the computational domain. Combined with dynamic mesh adaptivity, where the mesh is periodically optimised to the current conditions, this allows significant savings in computational cost over traditional chessboard-like structured mesh simulations [1]. In this study we extend Fluidity (using the Portable, Extensible Toolkit for Scientific Computation [PETSc, 2]) to Stokes flow problems relevant to geodynamics. However, due to the assumptions inherent in all models, it is necessary to properly verify and validate the code before applying it to any large-scale problems. In recent years this has been made easier by the publication of a series of ‘community benchmarks’ for geodynamic modelling. We discuss the use of several of these to help validate Fluidity [e.g. 3, 4]. The experimental results of Vatteville et al. [5] are then used to validate Fluidity against laboratory measurements. This test case is also used to highlight the computational advantages of using adaptive, unstructured meshes - significantly reducing the number of nodes and total CPU time required to match a fixed mesh simulation. References: 1. C. C. Pain et al. Comput. Meth. Appl. M, 190:3771-3796, 2001. doi:10.1016/S0045-7825(00)00294-2. 2. B. Satish et al. http://www.mcs.anl.gov/petsc/petsc-2/, 2001. 3. Blankenbach et al. Geophys. J. Int., 98:23-28, 1989. 4. Busse et al. Geophys

  3. The Technology Refresh Program: Affording State-of-the Art Personal Computing.

    ERIC Educational Resources Information Center

    Spiwak, Rand

    2000-01-01

    Describes the Florida Community College Technology Refresh Program in which 28 Florida community colleges refresh their personal computer technology on a three-year cyclical basis through negotiation of a contract with Dell Computer Corporation. Discusses the contract highlights (such as a 22.5 percent discount on personal computers and on-site…

  4. A Detailed Analysis over Some Important Issues towards Using Computer Technology into the EFL Classrooms

    ERIC Educational Resources Information Center

    Gilakjani, Abbas Pourhosein

    2014-01-01

    Computer technology has changed the ways we work, learn, interact and spend our leisure time. Computer technology has changed every aspect of our daily life--how and where we get our news, how we order goods and services, and how we communicate. This study investigates some of the significant issues concerning the use of computer technology…

  5. Factors Contributing to Teachers' Use of Computer Technology in the Classroom

    ERIC Educational Resources Information Center

    Gilakjani, Abbas Pourhosein

    2013-01-01

    There are many factors for teachers to use computer technology in their classrooms. The goal of this study is to identify some of the important factors contributing the teachers' use of computer technology. The first goal of this paper is to discuss computer self-efficacy. The second goal is to explain teaching experience. The third goal is to…

  6. The Picatinny Technology Transfer Innovation Center: A business incubator concept adapted to federal laboratory technology transfer

    SciTech Connect

    Wittig, T.; Greenfield, J.

    1996-10-01

    In recent years, the US defense industrial base spawned the aerospace industry, among other successes, and served as the nation's technology seed bed. However, as the defense industrial base shrinks and public and private resources become scarcer, the merging of the commercial and defense communities becomes necessary to maintain national technological competencies. Cooperative efforts such as technology transfer provide an attractive, cost-effective, well-leveraged alternative to independently funded research and development (R and D). The sharing of knowledge, resources, and innovation among defense contractors and other public sector firms, academia, and other organizations has become exceedingly attractive. Recent legislation involving technology transfer provides for the sharing of federal laboratory resources with the private sector. The Army Research, Development and Engineering Center (ARDEC), Picatinny Arsenal, NJ, a designer of weapons systems, is one of the nation's major laboratories with this requirement. To achieve its important technology transfer mission, ARDEC reviewed its capabilities, resources, intellectual property, and products with commercial potential. The purpose of the review was to develop a viable plan for effecting a technology transfer cultural change within the ARDEC, Picatinny Arsenal and with the private sector. This report highlights the issues identified, discussed, and resolved prior to the transformation of a temporarily vacant federal building on the Picatinny installation into a business incubator. ARDEC's discussions and rationale for the decisions and actions that led to the implementation of the Picatinny Technology Transfer Innovation Center are discussed.

  7. Nanoinformatics: an emerging area of information technology at the intersection of bioinformatics, computational chemistry and nanobiotechnology.

    PubMed

    González-Nilo, Fernando; Pérez-Acle, Tomás; Guínez-Molinos, Sergio; Geraldo, Daniela A; Sandoval, Claudia; Yévenes, Alejandro; Santos, Leonardo S; Laurie, V Felipe; Mendoza, Hegaly; Cachau, Raúl E

    2011-01-01

    After the progress made during the genomics era, bioinformatics was tasked with supporting the flow of information generated by nanobiotechnology efforts. This challenge requires adapting classical bioinformatic and computational chemistry tools to store, standardize, analyze, and visualize nanobiotechnological information. Thus, old and new bioinformatic and computational chemistry tools have been merged into a new sub-discipline: nanoinformatics. This review takes a second look at the development of this new and exciting area as seen from the perspective of the evolution of nanobiotechnology applied to the life sciences. The knowledge obtained at the nano-scale level implies answers to new questions and the development of new concepts in different fields. The rapid convergence of technologies around nanobiotechnologies has spun off collaborative networks and web platforms created for sharing and discussing the knowledge generated in nanobiotechnology. The implementation of new database schemes suitable for storage, processing and integrating physical, chemical, and biological properties of nanoparticles will be a key element in achieving the promises in this convergent field. In this work, we will review some applications of nanobiotechnology to life sciences in generating new requirements for diverse scientific fields, such as bioinformatics and computational chemistry.

  8. Technological Metaphors and Moral Education: The Hacker Ethic and the Computational Experience

    ERIC Educational Resources Information Center

    Warnick, Bryan R.

    2004-01-01

    This essay is an attempt to understand how technological metaphors, particularly computer metaphors, are relevant to moral education. After discussing various types of technological metaphors, it is argued that technological metaphors enter moral thought through their "functional descriptions." The computer metaphor is then explored by turning to…

  9. When should irrigators invest in more water-efficient technologies as an adaptation to climate change?

    NASA Astrophysics Data System (ADS)

    Malek, K.; Adam, J. C.; Stockle, C.; Brady, M.; Yoder, J.

    2015-12-01

    The western US is expected to experience more frequent droughts with higher magnitudes and persistence due to climate change, with potentially large impacts on agricultural productivity and the economy. Irrigated farmers have many options for minimizing drought impacts, including changing crops, engaging in water markets, and switching irrigation technologies. Switching to more efficient irrigation technologies, which increase water availability in the crop root zone through reduction of irrigation losses, receives significant attention because of the promise of maintaining current production with less water. However, switching to a more efficient irrigation system is almost always a more capital-intensive adaptation strategy, particularly compared to changing crops or trading water. A farmer's decision to switch will depend on how much money they project to save from reducing drought damages. The objective of this study is to explore when (and under what climate change scenarios) it makes sense economically for farmers to invest in a new irrigation system. This study was performed over the Yakima River Basin (YRB) in Washington State, although the tools and information gained from this study are transferable to other watersheds in the western US. We used VIC-CropSyst, a large-scale grid-based modeling framework that simulates hydrological processes while mechanistically capturing crop water use, growth and development. The water flows simulated by VIC-CropSyst were used to run the RiverWare river system and water management model (YAK-RW), which simulates river processes and calculates regional water availability for agricultural use each day (i.e., the prorationing ratio). An automated computational platform has been developed and programmed to perform the economic analysis for each grid cell, crop type, and future climate projection separately, which allows us to explore whether or not implementing a new irrigation system is economically viable.
Results of this study indicate that
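    The per-grid-cell economic test can be sketched as a discounted cash-flow comparison. All numbers below are hypothetical placeholders; in the study, the benefit stream comes from VIC-CropSyst yields and YAK-RW water availability under each climate projection, not from fixed values.

```python
# Hedged sketch: invest in the more efficient irrigation system when the
# discounted value of avoided drought losses exceeds the capital cost.

def npv_of_switching(capital_cost, annual_benefit, years, discount_rate):
    """Net present value of upgrading; benefits are avoided drought damages."""
    pv = sum(annual_benefit / (1 + discount_rate) ** t
             for t in range(1, years + 1))
    return pv - capital_cost

# Hypothetical example: a $1200/acre retrofit, 20-year life, 5% discounting,
# with avoided losses that grow across three climate scenarios.
for benefit in (60, 90, 120):   # avoided losses per acre-year (hypothetical)
    npv = npv_of_switching(1200, benefit, 20, 0.05)
    verdict = "invest" if npv > 0 else "do not invest"
    print(f"benefit ${benefit}/yr -> NPV {npv:+,.0f} ({verdict})")
```

The interesting output of such a test is the threshold scenario at which the verdict flips, which is exactly the "when" in the study's question.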

  10. Usability of an Adaptive Computer Assistant that Improves Self-care and Health Literacy of Older Adults

    PubMed Central

    Blanson Henkemans, O. A.; Rogers, W. A.; Fisk, A. D.; Neerincx, M. A.; Lindenberg, J.; van der Mast, C. A. P. G.

    2014-01-01

    Objectives We developed an adaptive computer assistant for the supervision of diabetics’ self-care, to support limiting illness and need for acute treatment, and improve health literacy. This assistant monitors self-care activities logged in the patient’s electronic diary. Accordingly, it provides context-aware feedback. The objective was to evaluate whether older adults in general can make use of the computer assistant and to compare an adaptive computer assistant with a fixed one, concerning its usability and contribution to health literacy. Methods We conducted a laboratory experiment in the Georgia Tech Aware Home wherein 28 older adults participated in a usability evaluation of the computer assistant, while engaged in scenarios reflecting normal and health-critical situations. We evaluated the assistant on effectiveness, efficiency, satisfaction, and educational value. Finally, we studied the moderating effects of the subjects’ personal characteristics. Results Logging self-care tasks and receiving feedback from the computer assistant enhanced the subjects’ knowledge of diabetes. The adaptive assistant was more effective in dealing with normal and health-critical situations, and, generally, it led to more time efficiency. Subjects’ personal characteristics had substantial effects on the effectiveness and efficiency of the two computer assistants. Conclusions Older adults were able to use the adaptive computer assistant. In addition, it had a positive effect on the development of health literacy. The assistant has the potential to support older diabetics’ self-care while maintaining quality of life. PMID:18213433

  11. U.S. perspective on technology demonstration experiments for adaptive structures

    NASA Technical Reports Server (NTRS)

    Aswani, Mohan; Wada, Ben K.; Garba, John A.

    1991-01-01

    Evaluation of design concepts for adaptive structures is being performed in support of several focused research programs. These include programs such as Precision Segmented Reflector (PSR), Control Structure Interaction (CSI), and the Advanced Space Structures Technology Research Experiment (ASTREX). Although not specifically designed for adaptive structure technology validation, relevant experiments can be performed using the Passive and Active Control of Space Structures (PACOSS) testbed, the Space Integrated Controls Experiment (SPICE), the CSI Evolutionary Model (CEM), and the Dynamic Scale Model Test (DSMT) Hybrid Scale. In addition to the ground test experiments, several space flight experiments have been planned, including a reduced gravity experiment aboard the KC-135 aircraft, shuttle middeck experiments, and the Inexpensive Flight Experiment (INFLEX).

  12. Adaptation of hybrid human-computer interaction systems using EEG error-related potentials.

    PubMed

    Chavarriaga, Ricardo; Biasiucci, Andrea; Forster, Killian; Roggen, Daniel; Troster, Gerhard; Millan, Jose Del R

    2010-01-01

    Performance improvement in both humans and artificial systems strongly relies on the ability to recognize erroneous behavior or decisions. This paper, which builds upon previous studies on EEG error-related signals, presents a hybrid approach for human computer interaction that uses human gestures to send commands to a computer and exploits brain activity to provide implicit feedback about the recognition of such commands. Using a simple computer game as a case study, we show that EEG activity evoked by erroneous gesture recognition can be classified in single trials above random levels. Automatic artifact rejection techniques are used, taking into account that subjects are allowed to move during the experiment. Moreover, we present a simple adaptation mechanism that uses the EEG signal to label newly acquired samples and can be used to re-calibrate the gesture recognition system in a supervised manner. Offline analyses show that, although the achieved EEG decoding accuracy is far from perfect, these signals convey sufficient information to significantly improve the overall system performance.

  13. Computer vision challenges and technologies for agile manufacturing

    NASA Astrophysics Data System (ADS)

    Molley, Perry A.

    1996-02-01

    applicable to commercial production processes and applications. Computer vision will play a critical role in the new agile production environment for automation of processes such as inspection, assembly, welding, material dispensing, and other process control tasks. Although many academic and commercial solutions have been developed, none has seen widespread adoption, considering the huge potential number of applications that could benefit from this technology. The reason for this slow adoption is that the advantages of computer vision for automation can be a double-edged sword: the benefits are lost if the vision system requires an inordinate amount of time for reprogramming by a skilled operator to account for different parts, changes in lighting conditions, background clutter, changes in optics, etc. Commercially available solutions typically require an operator to manually program the vision system with the features used for recognition. In a recent survey, we asked a number of commercial manufacturers and machine vision companies, 'What prevents machine vision systems from being more useful in factories?' The number one (and unanimous) response was that vision systems require too much skill to set up and program to be cost effective.

  14. Adaptation of NASA technology for the optimum design of orthopedic knee implants.

    PubMed

    Saravanos, D A; Mraz, P J; Davy, D T; Hopkins, D A

    1991-03-01

    NASA technology originally developed for designing aircraft turbine-engine blades has been adapted and applied to orthopedic knee implants. This article describes a method for tailoring an implant for optimal interaction with the environment of the tibia. The implant components are designed to control stresses in the bone for minimizing bone degradation and preventing failures. Engineers expect the tailoring system to improve knee prosthesis design and allow customized implants for individual patients. PMID:10150099

  15. Dental Student Experience and Perceptions of Computer Technology.

    ERIC Educational Resources Information Center

    Feldman, Cecile A.

    1992-01-01

    A survey of 180 dental students in 3 classes assessed student knowledge of computer-related topics and perceptions of the usefulness of computers in different areas of practice management. Computer ownership and use, computer-related courses taken, software types used, and student characteristics (age, sex, academic achievement, undergraduate…

  16. Roles of Computer Technology in the Mathematics Education of the Gifted.

    ERIC Educational Resources Information Center

    Grandgenett, Neal

    1991-01-01

    This article reviews technological advances in educational computer use and discusses applications for computers as tools, tutors, and tutees in mathematics education of gifted students. Computer-assisted instruction, artificial intelligence, multimedia, numeric processing, computer-aided design, LOGO, robotics, and hypercard software packages are…

  17. Survey of subsurface geophysical exploration technologies adaptable to an airborne platform

    SciTech Connect

    Taylor, K.A.

    1992-12-01

    This report has been prepared by the US Department of Energy (DOE) as part of a Research, Development, Demonstration, Testing and Evaluation (RDDT&E) project by EG&G Energy Measurements' (EG&G/EM) Remote Sensing Laboratory. It examines geophysical detection techniques which may be used in Environmental Restoration/Waste Management (ER/WM) surveys to locate buried waste, waste containers, potential waste migratory paths, and aquifer depths. Because of the Remote Sensing Laboratory's unique survey capabilities, only those technologies which have been adapted or are capable of being adapted to an airborne platform were studied. This survey describes several of the available subsurface survey technologies and discusses the basic capabilities of each: the target detectability, required geologic conditions, and associated survey methods. Because the airborne capabilities of these survey techniques have not been fully developed, the chapters deal mostly with the ground-based capabilities of each of the technologies, with reference made to the airborne capabilities where applicable. The information about each survey technique came from various contractors whose companies employ these specific technologies. EG&G/EM cannot guarantee or verify the accuracy of the contractor information; however, the data given are an indication of the technologies that are available.

  19. Optimizing alcohol production from whey using computer technology. [Kluyveromyces fragilis

    SciTech Connect

    Zertuche, L.; Zall, R.R.

    1985-01-01

    This study was undertaken with the major goal of optimizing ethanol production from whey using computer technology. To reach this goal, a mathematical model was developed that describes the fermentation and can be used for the optimization. Kluyveromyces fragilis was the microorganism used to ferment the lactose in the whey into ethanol. Preliminary studies showed that K. fragilis produced about 90% of the theoretical ethanol yield when grown in whey-complemented media; however, when grown in nonsupplemented whey media, it produced no more than 32% of that yield. Comparative batch fermentations of lactose and whey-complemented media showed that whey possibly contains components that enhance yeast growth and ethanol production. To obtain the mathematical model, the one-to-one effect of the process variables (lactose and yeast extract concentrations, air flow rate, pH, and dilution rate) on ethanol production was first investigated. Experiments on the pH effect showed that a decrease in pH from 7 to 4 produced an increase in ethanol concentration from 16.5 to 26.5 g/L (50 g/L initial lactose). The results obtained from modeling the continuous fermentation using the previously listed variables showed that air flow rate, pH, and dilution rate were the process variables that most influenced ethanol production.
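
    A fermentation model of the general kind described here, balances for biomass, substrate, and product, can be sketched with generic Monod kinetics. All parameter values below are illustrative placeholders, not the fitted K. fragilis values from the study.

```python
def batch_fermentation(S0, X0, t_end, dt=0.01,
                       mu_max=0.4, Ks=1.0, Yxs=0.1, Yps=0.45):
    """Forward-Euler integration of generic Monod batch kinetics:
    biomass X grows at rate mu(S)*X, substrate (lactose) S is consumed
    in proportion to growth, product (ethanol) P accumulates likewise.
    Parameters are illustrative, not fitted to the study's data."""
    X, S, P = X0, S0, 0.0
    for _ in range(int(t_end / dt)):
        mu = mu_max * S / (Ks + S)        # Monod specific growth rate
        dX = mu * X                       # biomass growth this step
        X += dt * dX
        S = max(S - dt * dX / Yxs, 0.0)   # substrate consumed per unit biomass
        P += dt * Yps * dX / Yxs          # product yield per substrate consumed
    return X, S, P
```

    A continuous (chemostat) version would add dilution-rate terms to each balance, which is how the dilution rate enters the authors' optimization.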

  20. A comprehensive review on adaptability of network forensics frameworks for mobile cloud computing.

    PubMed

    Khan, Suleman; Shiraz, Muhammad; Wahab, Ainuddin Wahid Abdul; Gani, Abdullah; Han, Qi; Rahman, Zulkanain Bin Abdul

    2014-01-01

    Network forensics enables investigation and identification of network attacks through retrieved digital content. The proliferation of smartphones and cost-effective universal data access through the cloud have made Mobile Cloud Computing (MCC) a prime target for network attacks. However, constraints on carrying out forensics in MCC stem from the autonomous cloud hosting companies and their policies restricting access to the digital content in back-end cloud platforms. Consequently, existing Network Forensic Frameworks (NFFs) have limited impact in the MCC paradigm. To this end, we qualitatively analyze the adaptability of existing NFFs when applied to MCC. Specifically, the fundamental mechanisms of NFFs are highlighted and then analyzed using the most relevant parameters. A classification is proposed to help understand the anatomy of existing NFFs. Subsequently, a comparison explores the functional similarities and deviations among NFFs. The paper concludes by discussing research challenges for progressive network forensics in MCC.

  1. A Comprehensive Review on Adaptability of Network Forensics Frameworks for Mobile Cloud Computing

    PubMed Central

    Abdul Wahab, Ainuddin Wahid; Han, Qi; Bin Abdul Rahman, Zulkanain

    2014-01-01

    Network forensics enables investigation and identification of network attacks through retrieved digital content. The proliferation of smartphones and cost-effective universal data access through the cloud have made Mobile Cloud Computing (MCC) a prime target for network attacks. However, constraints on carrying out forensics in MCC stem from the autonomous cloud hosting companies and their policies restricting access to the digital content in back-end cloud platforms. Consequently, existing Network Forensic Frameworks (NFFs) have limited impact in the MCC paradigm. To this end, we qualitatively analyze the adaptability of existing NFFs when applied to MCC. Specifically, the fundamental mechanisms of NFFs are highlighted and then analyzed using the most relevant parameters. A classification is proposed to help understand the anatomy of existing NFFs. Subsequently, a comparison explores the functional similarities and deviations among NFFs. The paper concludes by discussing research challenges for progressive network forensics in MCC. PMID:25097880

  2. On the Computation of Integral Curves in Adaptive Mesh Refinement Vector Fields

    SciTech Connect

    Deines, Eduard; Weber, Gunther H.; Garth, Christoph; Van Straalen, Brian; Borovikov, Sergey; Martin, Daniel F.; Joy, Kenneth I.

    2011-06-27

    Integral curves, such as streamlines, streaklines, pathlines, and timelines, are an essential tool in the analysis of vector field structures, offering straightforward and intuitive interpretation of visualization results. While such curves have a long-standing tradition in vector field visualization, their application to Adaptive Mesh Refinement (AMR) simulation results poses unique problems. AMR is a highly effective discretization method for a variety of physical simulation problems and has recently been applied to the study of vector fields in flow and magnetohydrodynamic applications. The cell-centered nature of AMR data and discontinuities in the vector field representation arising from AMR level boundaries complicate the application of numerical integration methods to compute integral curves. In this paper, we propose a novel approach to alleviate these problems and show its application to streamline visualization in an AMR model of the magnetic field of the solar system as well as to a simulation of two incompressible viscous vortex rings merging.
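
    The integral-curve computation described above ultimately reduces to numerical integration of a velocity field. A minimal sketch in Python, assuming a `field(p)` callback that hides the AMR-specific work (locating the finest patch covering the query point and interpolating the cell-centered data):

```python
import numpy as np

def rk4_streamline(field, seed, h=0.01, n_steps=500):
    """Trace a streamline with the classic 4th-order Runge-Kutta scheme.
    `field(p)` returns the velocity at point p; for AMR data this lookup
    is where level boundaries and patch hierarchies must be handled."""
    p = np.asarray(seed, dtype=float)
    pts = [p.copy()]
    for _ in range(n_steps):
        k1 = field(p)
        k2 = field(p + 0.5 * h * k1)
        k3 = field(p + 0.5 * h * k2)
        k4 = field(p + h * k3)
        p = p + (h / 6.0) * (k1 + 2 * k2 + 2 * k3 + k4)
        pts.append(p.copy())
    return np.array(pts)
```

    The difficulties the paper addresses arise inside `field`: discontinuities at refinement-level boundaries can defeat the smoothness assumptions of such integrators unless the evaluation is made consistent across levels.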

  3. Adaptive-projection intrinsically transformed multivariate empirical mode decomposition in cooperative brain-computer interface applications.

    PubMed

    Hemakom, Apit; Goverdovsky, Valentin; Looney, David; Mandic, Danilo P

    2016-04-13

    An extension to multivariate empirical mode decomposition (MEMD), termed adaptive-projection intrinsically transformed MEMD (APIT-MEMD), is proposed to cater for power imbalances and inter-channel correlations in real-world multichannel data. It is shown that the APIT-MEMD exhibits similar or better performance than MEMD for a large number of projection vectors, whereas it outperforms MEMD for the critical case of a small number of projection vectors within the sifting algorithm. We also employ the noise-assisted APIT-MEMD within our proposed intrinsic multiscale analysis framework and illustrate the advantages of such an approach in notoriously noise-dominated cooperative brain-computer interface (BCI) based on the steady-state visual evoked potentials and the P300 responses. Finally, we show that for a joint cognitive BCI task, the proposed intrinsic multiscale analysis framework improves system performance in terms of the information transfer rate. PMID:26953174

  4. A three-dimensional adaptive grid method. [for computational fluid dynamics

    NASA Technical Reports Server (NTRS)

    Nakahashi, K.; Deiwert, G. S.

    1985-01-01

    A three-dimensional solution-adaptive-grid scheme is described which is suitable for complex fluid flows. This method, using tension and torsion spring analogies, was previously developed and successfully applied for two-dimensional flows. In the present work, a collection of three-dimensional flow fields are used to demonstrate the feasibility and versatility of this concept to include an added dimension. Flow fields considered include: (1) supersonic flow past an aerodynamic afterbody with a propulsive jet at incidence to the free stream, (2) supersonic flow past a blunt fin mounted on a solid wall, and (3) supersonic flow over a bump. In addition to generating three-dimensional solution-adapted grids, the method can also be used effectively as an initial grid generator. The utility of the method lies in: (1) optimum distribution of discrete grid points, (2) improvement of accuracy, (3) improved computational efficiency, (4) minimization of data base sizes, and (5) simplified three-dimensional grid generation.
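
    In one dimension, the tension-spring limit of such spring-analogy adaptation reduces to equidistributing a weight function over the grid. A minimal sketch, assuming a precomputed weight `w` (e.g. 1 + |du/dx|); the full method adds torsion terms and works in three dimensions:

```python
import numpy as np

def equidistribute(x, w, n_new=None):
    """Redistribute 1-D grid points so each interval carries equal weight:
    spacing ends up inversely proportional to w, clustering points where
    the solution varies rapidly."""
    n_new = n_new or len(x)
    # trapezoidal cumulative weight plays the role of the spring tension integral
    W = np.concatenate([[0.0], np.cumsum(0.5 * (w[1:] + w[:-1]) * np.diff(x))])
    targets = np.linspace(0.0, W[-1], n_new)
    return np.interp(targets, W, x)
```

    As the abstract notes, the same machinery can serve as an initial grid generator: run it on a weight built from geometry rather than from a solution.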

  5. Adaptive Stacked Generalization for Multiclass Motor Imagery-Based Brain Computer Interfaces.

    PubMed

    Nicolas-Alonso, Luis F; Corralejo, Rebeca; Gomez-Pilar, Javier; Álvarez, Daniel; Hornero, Roberto

    2015-07-01

    Practical motor imagery-based brain computer interface (MI-BCI) applications are limited by the difficulty of decoding brain signals reliably. In this paper, we propose a processing framework that addresses non-stationarity and handles the spectral, temporal, and spatial characteristics associated with the execution of motor tasks. Stacked generalization is used to exploit the power of classifier ensembles for combining information from multiple sources and reducing the uncertainty present in EEG signals. The outputs of several regularized linear discriminant analysis (RLDA) models are combined to account for temporal, spatial, and spectral information. The resulting algorithm is called stacked RLDA (SRLDA). Additionally, an adaptive processing stage is introduced before classification to reduce the harmful effect of intersession non-stationarity. The benefits of the proposed method are evaluated on BCI Competition IV dataset 2a. We demonstrate its effectiveness in binary and multiclass settings with four motor imagery tasks: left-hand, right-hand, both-feet, and tongue movements. The results show that adaptive SRLDA outperforms the winner of the competition and other approaches tested on this multiclass dataset.
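
    The stacking structure, block-wise regularized LDA models whose scores feed a linear meta-model, can be sketched in a few lines of NumPy. This is a sketch of the idea only (binary case, fixed shrinkage, least-squares meta-learner, no cross-validated stacking), not the paper's exact SRLDA algorithm:

```python
import numpy as np

def rlda_fit(X, y, lam=0.1):
    """Binary regularized LDA: shrink the pooled covariance toward a scaled
    identity; lam is an assumed fixed shrinkage weight."""
    X0, X1 = X[y == 0], X[y == 1]
    mu0, mu1 = X0.mean(axis=0), X1.mean(axis=0)
    S = np.cov(np.vstack([X0 - mu0, X1 - mu1]).T)
    d = S.shape[0]
    S = (1.0 - lam) * S + lam * (np.trace(S) / d) * np.eye(d)
    w = np.linalg.solve(S, mu1 - mu0)
    b = -0.5 * w @ (mu0 + mu1)
    return w, b

def rlda_score(model, X):
    w, b = model
    return X @ w + b

class StackedRLDA:
    """Stacked generalization: one RLDA per feature block (e.g. per frequency
    band); a least-squares meta-model combines the block scores."""
    def fit(self, X, y, blocks):
        self.blocks = blocks                     # list of column-index arrays
        self.base = [rlda_fit(X[:, c], y) for c in blocks]
        Z = np.column_stack([rlda_score(m, X[:, c])
                             for m, c in zip(self.base, blocks)])
        Z1 = np.column_stack([Z, np.ones(len(y))])
        self.meta, *_ = np.linalg.lstsq(Z1, 2.0 * y - 1.0, rcond=None)
        return self
    def predict(self, X):
        Z = np.column_stack([rlda_score(m, X[:, c])
                             for m, c in zip(self.base, self.blocks)])
        Z1 = np.column_stack([Z, np.ones(len(X))])
        return (Z1 @ self.meta > 0).astype(int)
```

    The adaptive stage the paper adds on top of this would update the base models between sessions to track intersession non-stationarity.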

  6. SLA-Driven Adaptive Resource Management for Web Applications on a Heterogeneous Compute Cloud

    NASA Astrophysics Data System (ADS)

    Iqbal, Waheed; Dailey, Matthew; Carrera, David

    Current service-level agreements (SLAs) offered by cloud providers make guarantees about quality attributes such as availability. However, although one of the most important quality attributes from the perspective of the users of a cloud-based Web application is its response time, current SLAs do not guarantee response time. Satisfying a maximum average response time guarantee for Web applications is difficult due to unpredictable traffic patterns, but in this paper we show how it can be accomplished through dynamic resource allocation in a virtual Web farm. We present the design and implementation of a working prototype built on a EUCALYPTUS-based heterogeneous compute cloud that actively monitors the response time of each virtual machine assigned to the farm and adaptively scales up the application to satisfy an SLA promising a specific average response time. We demonstrate the feasibility of the approach in an experimental evaluation with a testbed cloud and a synthetic workload. Adaptive resource management has the potential to increase the usability of Web applications while maximizing resource utilization.
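
    The adaptive policy described, monitor response time and grow or shrink the virtual farm to hold an average-response-time SLA, comes down to a small control rule evaluated on each monitoring tick. The margins and the one-VM-at-a-time step below are illustrative assumptions, not values from the paper:

```python
def scaling_decision(avg_rt_ms, sla_rt_ms, n_vms, max_vms,
                     scale_up_margin=0.9, scale_down_margin=0.5):
    """One control-loop tick: return the new VM count.
    Scale up before the SLA response-time bound is breached; scale down
    when the farm is comfortably under it. Margins are tuning knobs."""
    if avg_rt_ms > scale_up_margin * sla_rt_ms and n_vms < max_vms:
        return n_vms + 1
    if avg_rt_ms < scale_down_margin * sla_rt_ms and n_vms > 1:
        return n_vms - 1
    return n_vms
```

    Keeping the two margins apart creates a hysteresis band that prevents the controller from oscillating when the measured response time hovers near the SLA bound.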

  7. Adaptive hybrid brain-computer interaction: ask a trainer for assistance!

    PubMed

    Müller-Putz, Gernot R; Steyrl, David; Faller, Josef

    2014-01-01

    In applying mental imagery brain-computer interfaces (BCIs) to end users, training is a key part of novice users gaining control. In general learning situations, it is an established concept that a trainer assists a trainee in improving his or her aptitude in certain skills. In this work, we evaluate whether this concept can be applied in the context of event-related desynchronization (ERD) based, adaptive, hybrid BCIs. Hence, in a first session we merged the features of a high-aptitude BCI user (the trainer) and a novice user (the trainee) in a closed-loop BCI feedback task and automatically adapted the classifier over time. In a second session the trainees operated the system unassisted. Twelve healthy participants ran through this protocol. Along with the trainer, the trainees achieved a very high overall peak accuracy of 95.3%. In the second session, where users operated the BCI unassisted, they still achieved a high overall peak accuracy of 83.6%. Ten of twelve first-time BCI users achieved accuracy significantly better than chance. In conclusion, this trainer-trainee approach is very promising. Future research should investigate whether it is superior to conventional training approaches. The trainer-trainee concept could have potential for future application of BCIs to end users.

  8. Social Studies: Application Units. Course II, Teachers. Computer-Oriented Curriculum. REACT (Relevant Educational Applications of Computer Technology).

    ERIC Educational Resources Information Center

    Tecnica Education Corp., San Carlos, CA.

    This book is one of a series in Course II of the Relevant Educational Applications of Computer Technology (REACT) Project. It is designed to point out to teachers two of the major applications of computers in the social sciences: simulation and data analysis. The first section contains a variety of simulation units organized under the following…

  9. Influence of Gender and Computer Teaching Efficacy on Computer Acceptance among Malaysian Student Teachers: An Extended Technology Acceptance Model

    ERIC Educational Resources Information Center

    Wong, Kung-Teck; Teo, Timothy; Russo, Sharon

    2012-01-01

    The purpose of this study is to validate the technology acceptance model (TAM) in an educational context and explore the role of gender and computer teaching efficacy as external variables. From the literature, it appeared that only limited studies had developed models to explain statistically the chain of influence of computer teaching efficacy…

  10. Self-Concept, Computer Anxiety, Gender and Attitude towards Interactive Computer Technologies: A Predictive Study among Nigerian Teachers

    ERIC Educational Resources Information Center

    Agbatogun, Alaba Olaoluwakotansibe

    2010-01-01

    Interactive Computer Technologies (ICTs) have crept into education industry, thus dramatically causing transformation in instructional process. This study examined the relative and combined contributions of computer anxiety, self-concept and gender to teachers' attitude towards the use of ICT(s). 454 Nigerian teachers constituted the sample. Three…

  11. Adapting to Student Learning Styles: Engaging Students with Cell Phone Technology in Organic Chemistry Instruction

    ERIC Educational Resources Information Center

    Pursell, David P.

    2009-01-01

    Students of organic chemistry traditionally make 3 x 5 in. flash cards to assist learning nomenclature, structures, and reactions. Advances in educational technology have enabled flash cards to be viewed on computers, offering an endless array of drilling and feedback for students. The current generation of students is less inclined to use…

  12. Computers and Classrooms: The Status of Technology in U.S. Schools. Policy Information Report.

    ERIC Educational Resources Information Center

    Coley, Richard; Cradler, John; Engel, Penelope K.

    The purpose of this report is to provide a "snapshot" of the status of technology use in United States schools. The report focuses on the following: school access to technology; student use of computers; evaluating the impact of educational technology; connecting teachers and technology; assessing the content and quality of courseware; and the…

  13. Adaptive and Efficient Computing for Subsurface Simulation within ParFlow

    SciTech Connect

    Tiedeman, H; Woodward, C S

    2010-11-16

    This project is concerned with the PF.WRF model as a means to enable more accurate predictions of wind fluctuations and subsurface storage. As developed at LLNL, PF.WRF couples a groundwater (subsurface) and surface water flow model (ParFlow) to a mesoscale atmospheric model (WRF, Weather Research and Forecasting Model). It was developed as a unique tool to address coupled water balance and wind energy questions that occur across traditionally separated research regimes of the atmosphere, land surface, and subsurface. PF.WRF is capable of simulating fluid, mass, and energy transport processes in groundwater, vadose zone, root zone, and land surface systems, including overland flow, and allows for the WRF model to both directly drive and respond to surface and subsurface hydrologic processes and conditions. The current PF.WRF model is constrained to have uniform spatial gridding below the land surface and matching areal grids with the WRF model at the land surface. There are often cases where it is advantageous for land surface, overland flow and subsurface models to have finer gridding than their atmospheric counterparts. Finer vertical discretization is also advantageous near the land surface (to properly capture feedbacks) yet many applications have a large vertical extent. However, the surface flow is strongly dependent on topography leading to a need for greater lateral resolution in some regions and the subsurface flow is tightly coupled to the atmospheric model near the surface leading to a need for finer vertical resolution. In addition, the interactions (e.g. rain) will be highly variable in space and time across the problem domain so an adaptive scheme is preferred to a static strategy to efficiently use computing and memory resources. As a result, this project focussed on algorithmic research required for development of an adaptive simulation capability in the PF.WRF system and its subsequent use in an application problem in the Central Valley of

  14. Towards a 'siliconeural computer': technological successes and challenges.

    PubMed

    Hughes, Mark A; Shipston, Mike J; Murray, Alan F

    2015-07-28

    Electronic signals govern the function of both nervous systems and computers, albeit in different ways. As such, hybridizing both systems to create an iono-electric brain-computer interface is a realistic goal; and one that promises exciting advances in both heterotic computing and neuroprosthetics capable of circumventing devastating neuropathology. 'Neural networks' were, in the 1980s, viewed naively as a potential panacea for all computational problems that did not fit well with conventional computing. The field bifurcated during the 1990s into a highly successful and much more realistic machine learning community and an equally pragmatic, biologically oriented 'neuromorphic computing' community. Algorithms found in nature that use the non-synchronous, spiking nature of neuronal signals have been found to be (i) implementable efficiently in silicon and (ii) computationally useful. As a result, interest has grown in techniques that could create mixed 'siliconeural' computers. Here, we discuss potential approaches and focus on one particular platform using parylene-patterned silicon dioxide.

  15. Some aspects of adaptive grid technology related to boundary and interior layers

    NASA Astrophysics Data System (ADS)

    Carey, Graham F.; Anderson, M.; Carnes, B.; Kirk, B.

    2004-04-01

    We consider the use of adaptive mesh strategies for solution of problems exhibiting boundary and interior layer solutions. As the presence of these layer structures suggests, reliable and accurate solution of this class of problems using finite difference, finite volume or finite element schemes requires grading the mesh into the layers and due attention to the associated algorithms. When the nature and structure of the layer is known, mesh grading can be achieved during the grid generation by specifying an appropriate grading function. However, in many applications the location and nature of the layer behavior is not known in advance. Consequently, adaptive mesh techniques that employ feedback from intermediate grid solutions are an appealing approach. In this paper, we provide a brief overview of the main adaptive grid strategies in the context of problems with layers. Associated error indicators that guide the refinement feedback control/grid optimization process are also covered and there is a brief commentary on the supporting data structure requirements. Some current issues concerning the use of stabilization in conjunction with adaptive mesh refinement (AMR), the question of "pollution effects" in computation of local error indicators, the influence of nonlinearities and the design of meshes for targeted optimization of specific quantities are considered. The application of AMR for layer problems is illustrated by means of case studies from semiconductor device transport (drift diffusion), nonlinear reaction-diffusion, layers due to surface capillary effects, and shockwaves in compressible gas dynamics.
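
    The refinement feedback loop described here, compute an error indicator on the current grid, then refine where it exceeds a tolerance, can be illustrated with a one-dimensional h-refinement pass. The jump-based indicator below is a deliberately crude stand-in for the residual- and recovery-based indicators the paper surveys:

```python
import numpy as np

def refine_1d(x, u_fn, tol):
    """One pass of adaptive h-refinement: bisect any interval whose jump in
    the solution values (a crude gradient error indicator) exceeds tol.
    `u_fn` evaluates the current solution at the grid points."""
    u = u_fn(x)
    jump = np.abs(np.diff(u))
    mids = 0.5 * (x[:-1] + x[1:])[jump > tol]   # midpoints of flagged intervals
    return np.sort(np.concatenate([x, mids]))
```

    Applied repeatedly, such a pass grades the mesh into a boundary or interior layer automatically, which is exactly the feedback behavior wanted when the layer location is not known in advance.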

  16. Computer Technology in Rural Schools: The Case of Mendocino County.

    ERIC Educational Resources Information Center

    Hoachlander, E. Gareth

    The county education office of Mendocino County, California, serving nine school districts and 11,800 elementary and secondary students, began planning for computers in 1979-1980, purchased two central computers, and by 1983 had one computer or terminal for every 40 students in the county. The county was characterized by its very enthusiastic…

  17. The application of microprocessor technology to in-flight computation

    NASA Technical Reports Server (NTRS)

    Sawyer, P. L.; Somers, D. M.

    1979-01-01

    A modular design of a general purpose microprocessor-based computer to perform in-flight computations for cross-country soaring pilots is described. The basic requirements for the system are discussed. Several specialized applications of the computer are presented, including real-time pilot feedback and flight-test data acquisition and reduction.

  18. Fascinating Technology: Computer Games as an Issue for Religious Education

    ERIC Educational Resources Information Center

    Scholtz, Christopher P.

    2005-01-01

    Computer games as an important part of youth culture can, from a certain perspective, be highly relevant for religious education. I will review the role of computer games, and then give a brief overview, suggesting a specific phenomenological approach for research on computer games and religious education. After presenting one example of such…

  19. Cloud Computing: A Free Technology Option to Promote Collaborative Learning

    ERIC Educational Resources Information Center

    Siegle, Del

    2010-01-01

    In a time of budget cuts and limited funding, purchasing and installing the latest software on classroom computers can be prohibitive for schools. Many educators are unaware that a variety of free software options exist, and some of them do not actually require installing software on the user's computer. One such option is cloud computing. This…

  20. Educational Technology Classics: The Computer versus the Clock

    ERIC Educational Resources Information Center

    Slack, Charles W.

    2010-01-01

    It is no accident that the first use of computers in school systems was to arrange schedules for students and teachers. The proper use of the computer in the classroom is as a replacement for the clock and its strict temporal schedule. By conveying information through self-instructional content, the computer can schedule work for pupils in…