Sample records for sophisticated computer controlled

  1. Controls for Burning Solid Wastes

    ERIC Educational Resources Information Center

    Toro, Richard F.; Weinstein, Norman J.

    1975-01-01

    Modern thermal solid waste processing systems are becoming more complex, incorporating features that require instrumentation and control systems to a degree greater than that previously required just for proper combustion control. With the advent of complex, sophisticated, thermal processing systems, TV monitoring and computer control should…

  2. V/STOLAND digital avionics system for UH-1H

    NASA Technical Reports Server (NTRS)

    Liden, S.

    1978-01-01

    A hardware and software system for the Bell UH-1H helicopter was developed that provides sophisticated navigation, guidance, control, display, and data acquisition capabilities for performing terminal area navigation, guidance and control research. Two Sperry 1819B general purpose digital computers were used. One contains the development software that performs all the specified system flight computations. The second computer is available to NASA for experimental programs that run simultaneously with the other computer programs and which may, at the push of a button, replace selected computer computations. Other features that provide research flexibility include keyboard selectable gains and parameters and software generated alphanumeric and CRT displays.

  3. An approach to quality and performance control in a computer-assisted clinical chemistry laboratory.

    PubMed Central

    Undrill, P E; Frazer, S C

    1979-01-01

    A locally developed, computer-based clinical chemistry laboratory system has been in operation since 1970. This utilises a Digital Equipment Co Ltd PDP 12 and an interconnected PDP 8/F computer. Details are presented of the performance and quality control techniques incorporated into the system. Laboratory performance is assessed through analysis of results from fixed-level control sera as well as from cumulative sum methods. At a simple level the presentation may be considered purely indicative, while at a more sophisticated level statistical concepts have been introduced to aid the laboratory controller in decision-making processes. PMID:438340
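
    The cumulative sum technique mentioned above is easy to illustrate. The sketch below is a minimal, hypothetical Python example (not the authors' PDP software): it flags drift in control-serum results with a tabular CUSUM, using the usual reference value k and decision limit h.

    ```python
    # Minimal tabular CUSUM for control-serum results (hypothetical values).
    # target: assayed mean of the control serum; sigma: its known SD.
    def cusum_flags(results, target, sigma, k=0.5, h=4.0):
        """Return indices where the CUSUM crosses the decision limit h."""
        c_hi = c_lo = 0.0
        flags = []
        for i, x in enumerate(results):
            z = (x - target) / sigma
            c_hi = max(0.0, c_hi + z - k)   # accumulates upward drift
            c_lo = max(0.0, c_lo - z - k)   # accumulates downward drift
            if c_hi > h or c_lo > h:
                flags.append(i)
                c_hi = c_lo = 0.0           # reset after signalling
        return flags

    # Example: glucose control serum, target 5.0 mmol/L, SD 0.1,
    # with a small upward shift introduced halfway through.
    data = [5.02, 4.95, 5.08, 4.99, 5.01, 5.12, 5.15, 5.11, 5.18, 5.14]
    print(cusum_flags(data, target=5.0, sigma=0.1))
    ```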

  4. InteractiveROSETTA: a graphical user interface for the PyRosetta protein modeling suite.

    PubMed

    Schenkelberg, Christian D; Bystroff, Christopher

    2015-12-15

    Modern biotechnical research is becoming increasingly reliant on computational structural modeling programs to develop novel solutions to scientific questions. Rosetta is one such protein modeling suite that has already demonstrated wide applicability to a number of diverse research projects. Unfortunately, Rosetta is largely a command-line-driven software package which restricts its use among non-computational researchers. Some graphical interfaces for Rosetta exist, but typically are not as sophisticated as commercial software. Here, we present InteractiveROSETTA, a graphical interface for the PyRosetta framework that presents easy-to-use controls for several of the most widely used Rosetta protocols alongside a sophisticated selection system utilizing PyMOL as a visualizer. InteractiveROSETTA is also capable of interacting with remote Rosetta servers, facilitating sophisticated protocols that are not accessible in PyRosetta or which require greater computational resources. InteractiveROSETTA is freely available at https://github.com/schenc3/InteractiveROSETTA/releases and relies upon a separate download of PyRosetta which is available at http://www.pyrosetta.org after obtaining a license (free for academic use).

  5. MorphoHawk: Geometric-based Software for Manufacturing and More

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Keith Arterburn

    2001-04-01

    Hollywood movies portray facial recognition as a perfected technology, but the reality is that sophisticated computers and algorithmic calculations are far from perfect. In fact, the most sophisticated and successful computer for recognizing faces and other imagery already is the human brain, with more than 10 billion nerve cells. Beginning at birth, humans process data and connect optical and sensory experiences, creating an unparalleled accumulation of data that allows people to associate images with life experiences, emotions and knowledge. Computers are powerful, rapid and tireless, but still cannot compare to the highly sophisticated relational calculations and associations that the human computer can produce in connecting 'what we see with what we know.'

  6. Control systems for heating, ventilating, and air conditioning

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Haines, R.W.

    1977-01-01

    Hundreds of ideas for designing and controlling sophisticated heating, ventilating and air conditioning (HVAC) systems are presented. Information is included on enthalpy control, energy conservation in HVAC systems, on solar heating, cooling and refrigeration systems, and on a self-draining water collector and heater. Computerized control systems and the economics of supervisory systems are discussed. Information is presented on computer system components, software, relevant terminology, and computerized security and fire reporting systems. Benefits of computer systems are explained, along with optimization techniques, data management, maintenance schedules, and energy consumption. A bibliography, glossaries of HVAC terminology, abbreviations, symbols, and a subject index are provided. (LCL)

  7. Eye-gaze and intent: Application in 3D interface control

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schryver, J.C.; Goldberg, J.H.

    1993-06-01

    Computer interface control is typically accomplished with an input "device" such as a keyboard, mouse, or trackball. An input device translates a user's input actions, such as mouse clicks and key presses, into appropriate computer commands. To control the interface, the user must first convert intent into the syntax of the input device. A more natural means of computer control is possible when the computer can directly infer user intent, without need of intervening input devices. We describe an application of eye-gaze-contingent control of an interactive three-dimensional (3D) user interface. A salient feature of the user interface is natural input, with a heightened impression of controlling the computer directly by the mind. With this interface, input of rotation and translation are intuitive, whereas other abstract features, such as zoom, are more problematic to match with user intent. This paper describes successes with implementation to date, and ongoing efforts to develop a more sophisticated intent inferencing methodology.

  9. A description of the thruster attitude control simulation and its application to the HEAO-C study

    NASA Technical Reports Server (NTRS)

    Brandon, L. B.

    1971-01-01

    During the design and evaluation of a reaction control system (RCS), it is desirable to have a digital computer program simulating vehicle dynamics, disturbance torques, control torques, and RCS logic. The thruster attitude control simulation (TACS) is just such a computer program. The TACS is a relatively sophisticated digital computer program that includes all the major parameters involved in the attitude control of a vehicle using an RCS for control. It includes the effects of gravity gradient torques and HEAO-C aerodynamic torques so that realistic runs can be made in the areas of fuel consumption and engine actuation rates. Also, the program is general enough that any engine configuration and logic scheme can be implemented in a reasonable amount of time. The results of the application of the TACS in the HEAO-C study are included.
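
    As an illustration of what such a simulation computes, here is a deliberately reduced, single-axis sketch in Python (the real TACS models full three-axis dynamics, HEAO-C aerodynamic torques, and configurable engine logic; every constant below is invented):

    ```python
    import math

    # Reduced single-axis thruster attitude control simulation
    # (illustrative only; not the actual TACS).
    I = 500.0                        # moment of inertia, kg*m^2
    TORQUE = 1.0                     # thruster torque, N*m
    DEADBAND = math.radians(0.5)     # attitude deadband, rad
    RATE_LIMIT = 0.001               # rate threshold to stop firing, rad/s
    n = 3.0e-3                       # orbital rate, rad/s

    theta, omega = math.radians(2.0), 0.0    # initial error and rate
    dt, firing_steps = 0.1, 0
    for _ in range(60000):                   # 100 minutes of simulated time
        # simplified gravity-gradient disturbance torque
        t_dist = -3.0 * n**2 * I * math.sin(theta) * math.cos(theta)
        # bang-bang RCS logic: fire against the error outside the deadband
        if theta > DEADBAND and omega > -RATE_LIMIT:
            t_ctrl, firing_steps = -TORQUE, firing_steps + 1
        elif theta < -DEADBAND and omega < RATE_LIMIT:
            t_ctrl, firing_steps = TORQUE, firing_steps + 1
        else:
            t_ctrl = 0.0
        omega += (t_dist + t_ctrl) / I * dt
        theta += omega * dt

    print(f"final error {math.degrees(theta):+.3f} deg "
          f"after {firing_steps} firing steps")
    ```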

  10. Virtual reality systems

    NASA Technical Reports Server (NTRS)

    Johnson, David W.

    1992-01-01

    Virtual realities are a type of human-computer interface (HCI) and as such may be understood from a historical perspective. In the earliest era, the computer was a very simple, straightforward machine. Interaction was human manipulation of an inanimate object, little more than the provision of an explicit instruction set to be carried out without deviation. In short, control resided with the user. In the second era of HCI, some level of intelligence and control was imparted to the system to enable a dialogue with the user. Simple context sensitive help systems are early examples, while more sophisticated expert system designs typify this era. Control was shared more equally. In this, the third era of the HCI, the constructed system emulates a particular environment, constructed with rules and knowledge about 'reality'. Control is, in part, outside the realm of the human-computer dialogue. Virtual reality systems are discussed.

  11. Biomolecular computing systems: principles, progress and potential.

    PubMed

    Benenson, Yaakov

    2012-06-12

    The task of information processing, or computation, can be performed by natural and man-made 'devices'. Man-made computers are made from silicon chips, whereas natural 'computers', such as the brain, use cells and molecules. Computation also occurs on a much smaller scale in regulatory and signalling pathways in individual cells and even within single biomolecules. Indeed, much of what we recognize as life results from the remarkable capacity of biological building blocks to compute in highly sophisticated ways. Rational design and engineering of biological computing systems can greatly enhance our ability to study and to control biological systems. Potential applications include tissue engineering and regeneration and medical treatments. This Review introduces key concepts and discusses recent progress that has been made in biomolecular computing.

  12. Proposal for hierarchical description of software systems

    NASA Technical Reports Server (NTRS)

    Thauboth, H.

    1973-01-01

    The programming of digital computers has developed into a new dimension full of difficulties, because the hardware of computers has become so powerful that more complex applications are entrusted to computers. The costs of software development, verification, and maintenance are outpacing those of the hardware, and the trend is toward a further increase in the sophistication of computer applications and consequently of software. To obtain better visibility into software systems and to improve their structure for better testing, verification, and maintenance, a clear but rigorous description and documentation of software is needed. The purpose of the report is to extend present methods in order to obtain documentation that better reflects the interplay between the various components and functions of a software system at different levels of detail, without losing precision of expression. This is done by the use of block diagrams, sequence diagrams, and cross-reference charts. In the appendices, examples from an actual large software system, i.e. the Marshall System for Aerospace Systems Simulation (MARSYAS), are presented. The proposed documentation structure is compatible with automating the updating of significant portions of the documentation for better software change control.

  13. Neurofeedback Training for BCI Control

    NASA Astrophysics Data System (ADS)

    Neuper, Christa; Pfurtscheller, Gert

    Brain-computer interface (BCI) systems detect changes in brain signals that reflect human intention, then translate these signals to control monitors or external devices (for a comprehensive review, see [1]). BCIs typically measure electrical signals resulting from neural firing (i.e. neuronal action potentials, the electrocorticogram (ECoG), or the electroencephalogram (EEG)). Sophisticated pattern recognition and classification algorithms convert neural activity into the required control signals. BCI research has focused heavily on developing powerful signal processing and machine learning techniques to accurately classify neural activity [2-4].
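
    A minimal sketch of this classification step, using a synthetic signal and a simple band-power feature (real BCIs use far richer features and trained classifiers; the amplitudes and threshold here are invented):

    ```python
    import numpy as np

    # Toy classification step: band power of a synthetic "EEG" epoch is
    # turned into a binary control signal.
    FS = 250                         # sample rate, Hz (1 s epochs)
    rng = np.random.default_rng(0)

    def band_power(epoch, lo, hi):
        """Mean normalized spectral power between lo and hi Hz."""
        spec = np.abs(np.fft.rfft(epoch)) ** 2 / len(epoch)
        freqs = np.fft.rfftfreq(len(epoch), 1.0 / FS)
        return spec[(freqs >= lo) & (freqs < hi)].mean()

    def make_epoch(mu_amplitude):
        """One second of noise plus a 10 Hz (mu-band) rhythm."""
        t = np.arange(FS) / FS
        return mu_amplitude * np.sin(2 * np.pi * 10 * t) + rng.normal(0, 1, FS)

    # Motor imagery suppresses the mu rhythm (event-related
    # desynchronization), so low mu power is read as the "move" command.
    THRESHOLD = 5.0
    for label, amp in [("rest", 2.0), ("imagined movement", 0.2)]:
        p = band_power(make_epoch(amp), 8, 12)
        command = "move cursor" if p < THRESHOLD else "idle"
        print(f"{label:18s} mu power {p:7.1f} -> {command}")
    ```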

  14. Program Integrity, Controlled Growth Spell Success for Roots of Empathy

    ERIC Educational Resources Information Center

    Gordon, Mary; Letchford, Donna

    2009-01-01

    Childhood is a universal aspect of the human condition. Yet the landscape of childhood is changing rapidly. On playgrounds young children carry cell phones, and in classrooms children are more sophisticated in their use of computers and digital media than the adults in their lives. Most young adolescents are prolific communicators via text and…

  15. Modeling of the Human - Operator in a Complex System Functioning Under Extreme Conditions

    NASA Astrophysics Data System (ADS)

    Getzov, Peter; Hubenova, Zoia; Yordanov, Dimitar; Popov, Wiliam

    2013-12-01

    Problems related to the operation of sophisticated control systems for objects working under extreme conditions are examined, together with the impact of the operator's effectiveness on the system as a whole. The necessity of creating complex simulation models reflecting the operator's activity is discussed. The organizational and technical system of an unmanned aviation complex is described as a sophisticated ergatic system. A computer realization of the main subsystems of an algorithmic model of the human as a controlling system is implemented, and specialized software for data processing and analysis is developed. An original computer model of the human as a tracking system has been implemented. A model of an unmanned complex for operator training and for the formation of a mental model in emergency situations, implemented in the MATLAB/Simulink environment, has been synthesized. As a unit of the control loop, the pilot (operator) is viewed in simplified form as an automatic control system consisting of three main interconnected subsystems: sensory organs (perception sensors); the central nervous system; and executive organs (muscles of the arms, legs, and back). A theoretical data model for predicting the operator's information load in ergatic systems is proposed; it allows the assessment and prediction of the effectiveness of a real working operator. A simulation model of operator activity during takeoff, based on Petri nets, has been synthesized.

  16. On the Relevancy of Efficient, Integrated Computer and Network Monitoring in HEP Distributed Online Environment

    NASA Astrophysics Data System (ADS)

    Carvalho, D.; Gavillet, Ph.; Delgado, V.; Albert, J. N.; Bellas, N.; Javello, J.; Miere, Y.; Ruffinoni, D.; Smith, G.

    Large scientific equipment is controlled by computer systems whose complexity is growing, driven on the one hand by the volume and variety of the information, its distributed nature, and the sophistication of its treatment, and on the other hand by the fast evolution of the computer and network market. Some people call them generically Large-Scale Distributed Data Intensive Information Systems, or Distributed Computer Control Systems (DCCS) for those systems dealing more with real-time control. Taking advantage of (or forced by) the distributed architecture, the tasks are more and more often implemented as client-server applications. In this framework the monitoring of the computer nodes, the communications network, and the applications becomes of primary importance for ensuring the safe running and guaranteed performance of the system. With the future generation of HEP experiments, such as those at the LHC, in view, it is proposed to integrate the various functions of DCCS monitoring into one general-purpose multi-layer system.

  17. Historical data recording for process computers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hale, J.C.; Sellars, H.L.

    1981-11-01

    Computers have been used to monitor and control chemical and refining processes for more than 15 years. During this time, there has been a steady growth in the variety and sophistication of the functions performed by these process computers. Early systems were limited to maintaining only current operating measurements, available through crude operator's consoles or noisy teletypes. The value of retaining a process history, that is, a collection of measurements over time, became apparent, and early efforts produced shift and daily summary reports. The need for improved process historians which record, retrieve and display process information has grown as process computers assume larger responsibilities in plant operations. This paper describes newly developed process historian functions that Du Pont has used on several of its in-house process monitoring and control systems. 3 refs.
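
    One classic historian function is compression of the recorded history. The sketch below shows simple deadband compression, a generic technique of this family (the abstract does not describe Du Pont's actual algorithms; all values are invented):

    ```python
    # Deadband compression: store a sample only when it moves more than
    # `deadband` away from the last stored value.
    def compress(samples, deadband):
        """samples: list of (time, value) pairs; returns the retained subset."""
        if not samples:
            return []
        kept = [samples[0]]
        for t, v in samples[1:]:
            if abs(v - kept[-1][1]) > deadband:
                kept.append((t, v))
        if kept[-1] != samples[-1]:
            kept.append(samples[-1])       # always retain the newest point
        return kept

    # A slowly drifting temperature with one excursion at t=40:
    raw = [(t, 100.0 + 0.01 * t + (5.0 if t == 40 else 0.0))
           for t in range(0, 60, 10)]
    print(compress(raw, deadband=0.5))
    ```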

  18. Performance Analysis of the Mobile IP Protocol (RFC 3344 and Related RFCS)

    DTIC Science & Technology

    2006-12-01

    …Encapsulation; HMAC: Keyed-Hash Message Authentication Code; ICMP: Internet Control Message Protocol; IEEE: Institute of Electrical and Electronics Engineers; IETF: Internet Engineering Task Force; IOS: Internetwork Operating System; IP: Internet Protocol; ITU: International Telecommunication Union; LAN: Local Area… network computing. Most organizations today have sophisticated networks that are connected to the Internet. The major benefit reaped from such a…

  19. The Energy Conservation Idea Handbook. A Compendium of Imaginative and Innovative Examples of Ideas and Practices at Colleges and Universities Today.

    ERIC Educational Resources Information Center

    Tickton, Sidney G.; And Others

    Summarized in this compendium are approximately 500 ideas being used by colleges and universities in the United States to deal with the problem of energy conservation. These ideas range from suggestions that cost pennies to implement to sophisticated computer controls or the construction of new buildings which incorporate alternative energy…

  20. Teaching computer interfacing with virtual instruments in an object-oriented language.

    PubMed Central

    Gulotta, M

    1995-01-01

    LabVIEW is a graphic object-oriented computer language developed to facilitate hardware/software communication. LabVIEW is a complete computer language that can be used like Basic, FORTRAN, or C. In LabVIEW one creates virtual instruments that aesthetically look like real instruments but are controlled by sophisticated computer programs. There are several levels of data acquisition VIs that make it easy to control data flow, and many signal processing and analysis algorithms come with the software as premade VIs. In the classroom, the similarity between virtual and real instruments helps students understand how information is passed between the computer and attached instruments. The software may be used in the absence of hardware so that students can work at home as well as in the classroom. This article demonstrates how LabVIEW can be used to control data flow between computers and instruments, points out important features for signal processing and analysis, and shows how virtual instruments may be used in place of physical instrumentation. Applications of LabVIEW to the teaching laboratory are also discussed, and a plausible course outline is given. PMID:8580361

  1. Teaching computer interfacing with virtual instruments in an object-oriented language.

    PubMed

    Gulotta, M

    1995-11-01

    LabVIEW is a graphic object-oriented computer language developed to facilitate hardware/software communication. LabVIEW is a complete computer language that can be used like Basic, FORTRAN, or C. In LabVIEW one creates virtual instruments that aesthetically look like real instruments but are controlled by sophisticated computer programs. There are several levels of data acquisition VIs that make it easy to control data flow, and many signal processing and analysis algorithms come with the software as premade VIs. In the classroom, the similarity between virtual and real instruments helps students understand how information is passed between the computer and attached instruments. The software may be used in the absence of hardware so that students can work at home as well as in the classroom. This article demonstrates how LabVIEW can be used to control data flow between computers and instruments, points out important features for signal processing and analysis, and shows how virtual instruments may be used in place of physical instrumentation. Applications of LabVIEW to the teaching laboratory are also discussed, and a plausible course outline is given.
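
    LabVIEW programs are graphical, so they cannot be quoted here; the following Python analogue conveys the idea of a virtual instrument that lets the same acquisition and analysis code run with no hardware attached (names and values are illustrative, not a LabVIEW API):

    ```python
    import math, random

    # A simulated device stands in for the real one, so acquisition and
    # analysis code is identical whether or not hardware is present.
    class SimulatedVoltmeter:
        """Stands in for a DAQ channel when no hardware is attached."""
        def __init__(self, freq_hz=5.0, fs=100.0):
            self.freq, self.fs, self.n = freq_hz, fs, 0

        def read(self):
            v = math.sin(2 * math.pi * self.freq * self.n / self.fs)
            self.n += 1
            return v + random.gauss(0.0, 0.05)   # add measurement noise

    def acquire(device, n_samples):
        return [device.read() for _ in range(n_samples)]

    def rms(signal):
        return math.sqrt(sum(x * x for x in signal) / len(signal))

    samples = acquire(SimulatedVoltmeter(), 1000)
    print(f"RMS of acquired signal: {rms(samples):.3f} (sine RMS is ~0.707)")
    ```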

  2. Sydney Observatory and astronomy teaching in the 90s

    NASA Astrophysics Data System (ADS)

    Lomb, N.

    1996-05-01

    Computers and the Internet have created a revolution in the way astronomy can be communicated to the public. At Sydney Observatory we make full use of these recent developments. In our lecture room a variety of sophisticated computer programs can show, with the help of a projection TV system, the appearance and motion of the sky at any place, date or time. The latest HST images obtained from the Internet can be shown, as can images taken through our own Meade 16 inch telescope. This recently installed computer-controlled telescope with its accurate pointing is an ideal instrument for a light-polluted site such as ours.

  3. Program For A Pushbutton Display

    NASA Technical Reports Server (NTRS)

    Busquets, Anthony M.; Luck, William S., Jr.

    1989-01-01

    Programmable Display Pushbutton (PDP) is a pushbutton device, available from Micro Switch, with a programmable 16x35 matrix of light-emitting diodes on its pushbutton surface. Any desired legends can be displayed on PDPs, producing user-friendly applications that reduce the need for dedicated manual controls. The device interacts with the operator, calling for the correct response before transmitting the next message. It is both a simple manual control and a sophisticated programmable link between operator and host system. The Programmable Display Pushbutton Legend Editor (PDPE) computer program is used to create the light-emitting-diode (LED) displays for the pushbuttons. Written in FORTRAN.

  4. Reference clock parameters for digital communications systems applications

    NASA Technical Reports Server (NTRS)

    Kartaschoff, P.

    1981-01-01

    The basic parameters relevant to the design of network timing systems describe the random and systematic time departures of the system elements, i.e., master (or reference) clocks, transmission links, and other clocks controlled over the links. The quantitative relations between these parameters were established and illustrated by means of numerical examples based on available measured data. The examples were limited to a simple PLL control system but the analysis can eventually be applied to more sophisticated systems at the cost of increased computational effort.
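
    The simple PLL control system the paper analyzes can be sketched numerically. The toy simulation below steers a slave clock toward a master over a noisy link; all parameter values are arbitrary illustrations, and the steady offset a first-order loop leaves is exactly what motivates the more sophisticated (second-order) loops the abstract alludes to.

    ```python
    import random

    # Toy first-order PLL steering a slave clock to a master clock.
    # A first-order loop converges but leaves a steady offset of
    # freq_offset*dt/gain (20 ns here); second-order loops remove it.
    dt = 1.0             # control interval, s
    gain = 0.05          # proportional loop gain
    freq_offset = 1e-9   # slave oscillator fractional frequency error
    link_jitter = 50e-9  # RMS time-transfer noise on the link, s

    slave_err = 2e-6     # initial time offset from the master, s
    for step in range(201):
        if step % 50 == 0:
            print(f"t={step*dt:5.0f} s  offset={slave_err*1e9:8.1f} ns")
        measured = slave_err + random.gauss(0.0, link_jitter)
        slave_err += freq_offset * dt - gain * measured
    ```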

  5. Study on the application of NASA energy management techniques for control of a terrestrial solar water heating system

    NASA Technical Reports Server (NTRS)

    Swanson, T. D.; Ollendorf, S.

    1979-01-01

    This paper addresses the potential for enhanced solar system performance through sophisticated control of the collector loop flow rate. Computer simulations utilizing the TRNSYS solar energy program were performed to study the relative effect on system performance of eight specific control algorithms. Six of these control algorithms are of the proportional type: two are concave exponentials, two are simple linear functions, and two are convex exponentials. These six functions are typical of what might be expected from future, more advanced, controllers. The other two algorithms are of the on/off type and are thus typical of existing control devices. Results of extensive computer simulations utilizing actual weather data indicate that proportional control does not significantly improve system performance. However, it is shown that thermal stratification in the liquid storage tank may significantly improve performance.
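
    For illustration, the two families of control algorithms compared in the study can be caricatured in a few lines of Python (constants are hypothetical; the actual study used TRNSYS simulations with real weather data):

    ```python
    # On/off versus linear proportional control of the collector pump.
    def pump_on_off(dT, on_at=8.0, off_at=2.0, prev=0.0):
        """Differential thermostat with hysteresis; returns flow 0 or 1."""
        if dT >= on_at:
            return 1.0
        if dT <= off_at:
            return 0.0
        return prev            # inside the hysteresis band: keep last state

    def pump_proportional(dT, full_at=10.0):
        """Linear proportional control: flow rises with collector-tank dT."""
        return max(0.0, min(1.0, dT / full_at))

    flow = 0.0
    for dT in [1.0, 4.0, 9.0, 6.0, 3.0, 1.0]:   # collector minus tank, deg C
        flow = pump_on_off(dT, prev=flow)
        print(f"dT={dT:4.1f}  on/off={flow:.0f}  "
              f"proportional={pump_proportional(dT):.2f}")
    ```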

  6. Economic analysis of linking operating room scheduling and hospital material management information systems for just-in-time inventory control.

    PubMed

    Epstein, R H; Dexter, F

    2000-08-01

    Operating room (OR) scheduling information systems can decrease perioperative labor costs. Material management information systems can decrease perioperative inventory costs. We used computer simulation to investigate whether using the OR schedule to trigger purchasing of perioperative supplies is likely to further decrease perioperative inventory costs, as compared with using sophisticated, stand-alone material management inventory control. Although we designed the simulations to favor financially linking the information systems, we found that this strategy would be expected to decrease inventory costs substantively only for items of high price ($1000 each) and volume (>1000 used each year). Because expensive items typically have different models and sizes, each of which is used by a hospital less often than this, for almost all items there will be no benefit to making daily adjustments to the order volume based on booked cases. We conclude that, in a hospital with a sophisticated material management information system, OR managers will probably achieve greater cost reductions from focusing on negotiating less expensive purchase prices for items than on trying to link the OR information system with the hospital's material management information system to achieve just-in-time inventory control.
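
    A back-of-the-envelope sketch of the comparison (all prices, demands, and rates are hypothetical): even a perfect schedule-driven policy saves at most the holding cost of the par stock, which is why only expensive, high-volume items benefit.

    ```python
    import random

    # Holding cost for a fixed par-level policy versus ordering exactly to
    # the next day's schedule (a perfect-information upper bound on JIT
    # savings).
    random.seed(1)
    PRICE = 1000.0                   # $ per item (the "high price" case)
    DAILY_HOLD = 0.10 / 365          # 10%/yr holding cost, per day

    def yearly_holding_cost(par_level=None):
        cost = 0.0
        for _ in range(365):
            demand = random.randint(0, 4)          # cases using the item
            stock = demand if par_level is None else par_level
            cost += max(stock - demand, 0) * PRICE * DAILY_HOLD
        return cost

    print(f"fixed par of 4:          ${yearly_holding_cost(4):6.2f}/yr")
    print(f"perfect schedule-driven: ${yearly_holding_cost():6.2f}/yr")
    # Even the ideal case saves only about the par stock's holding cost.
    ```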

  7. Amoeba-based computing for traveling salesman problem: long-term correlations between spatially separated individual cells of Physarum polycephalum.

    PubMed

    Zhu, Liping; Aono, Masashi; Kim, Song-Ju; Hara, Masahiko

    2013-04-01

    A single-celled, multi-nucleated amoeboid organism, a plasmodium of the true slime mold Physarum polycephalum, can perform sophisticated computing by exhibiting complex spatiotemporal oscillatory dynamics while deforming its amorphous body. We previously devised an "amoeba-based computer (ABC)" to quantitatively evaluate the optimization capability of the amoeboid organism in searching for a solution to the traveling salesman problem (TSP) under optical feedback control. In ABC, the organism changes its shape to find a high quality solution (a relatively shorter TSP route) by alternately expanding and contracting its pseudopod-like branches, which exhibit local photoavoidance behavior. The quality of the solution serves as a measure of the optimality with which the organism maximizes its global body area (nutrient absorption) while minimizing the risk of being illuminated (exposure to aversive stimuli). ABC found a high quality solution for the 8-city TSP with a high probability. However, it remains unclear whether intracellular communication among the branches of the organism is essential for computing. In this study, we conducted a series of control experiments using two individual cells (two single-celled organisms) to perform parallel searches in the absence of intercellular communication. We found that ABC drastically lost its ability to find a solution when it used two independent individuals. However, interestingly, when two individuals were prepared by dividing one individual, they found a solution for a few tens of minutes. That is, the two divided individuals remained correlated even though they were spatially separated. These results suggest the presence of a long-term memory in the intrinsic dynamics of this organism and its significance in performing sophisticated computing.

  8. Flight test validation of a design procedure for digital autopilots

    NASA Technical Reports Server (NTRS)

    Bryant, W. H.

    1983-01-01

    Commercially available general aviation autopilots are currently in transition from an analogue circuit system to a computer implemented digital flight control system. Well known advantages of the digital autopilot include enhanced modes, self-test capacity, fault detection, and greater computational capacity. A digital autopilot's computational capacity can be used to full advantage by increasing the sophistication of the digital autopilot's chief function, stability and control. NASA's Langley Research Center has been pursuing the development of direct digital design tools for aircraft stabilization systems for several years. This effort has most recently been directed towards the development and realization of multi-mode digital autopilots for GA aircraft, conducted under a SPIFR-related program called the General Aviation Terminal Operations Research (GATOR) Program. This presentation focuses on the implementation and testing of a candidate multi-mode autopilot designed using these newly developed tools.

  9. Defense Small Business Innovation Research Program (SBIR), Volume 4, Defense Agencies Abstracts of Phase 1 Awards 1991

    DTIC Science & Technology

    1991-01-01

    EXPERIENCE IN DEVELOPING INTEGRATED OPTICAL DEVICES, NONLINEAR MAGNETIC-OPTIC MATERIALS, HIGH FREQUENCY MODULATORS, COMPUTER-AIDED MODELING AND SOPHISTICATED… HIGH-LEVEL PRESENTATION AND DISTRIBUTED CONTROL MODELS FOR INTEGRATING HETEROGENEOUS MECHANICAL ENGINEERING APPLICATIONS AND TOOLS. THE DESIGN IS FOCUSED… STATISTICALLY ACCURATE WORST CASE DEVICE MODELS FOR CIRCUIT SIMULATION. PRESENT METHODS OF WORST CASE DEVICE DESIGN ARE AD HOC AND DO NOT ALLOW THE…

  10. Fiber Composite Sandwich Thermostructural Behavior: Computational Simulation

    NASA Technical Reports Server (NTRS)

    Chamis, C. C.; Aiello, R. A.; Murthy, P. L. N.

    1986-01-01

    Several computational levels of progressive sophistication/simplification are described to computationally simulate composite sandwich hygral, thermal, and structural behavior. The computational levels of sophistication include: (1) three-dimensional detailed finite element modeling of the honeycomb, the adhesive and the composite faces; (2) three-dimensional finite element modeling of the honeycomb assumed to be an equivalent continuous, homogeneous medium, the adhesive and the composite faces; (3) laminate theory simulation where the honeycomb (metal or composite) is assumed to consist of plies with equivalent properties; and (4) derivations of approximate, simplified equations for thermal and mechanical properties by simulating the honeycomb as an equivalent homogeneous medium. The approximate equations are combined with composite hygrothermomechanical and laminate theories to provide a simple and effective computational procedure for simulating the thermomechanical/thermostructural behavior of fiber composite sandwich structures.

  11. V/STOLAND digital avionics system for XV-15 tilt rotor

    NASA Technical Reports Server (NTRS)

    Liden, S.

    1980-01-01

    A digital flight control system for the tilt rotor research aircraft provides sophisticated navigation, guidance, control, display and data acquisition capabilities for performing terminal area navigation, guidance and control research. All functions of the XV-15 V/STOLAND system were demonstrated on the NASA-ARC S-19 simulation facility under a comprehensive dynamic acceptance test. The most noteworthy accomplishments of the system are: (1) automatic configuration control of a tilt-rotor aircraft over the total operating range; (2) total hands-off landing to touchdown on various selectable straight-in glide slopes and on a flight path that includes a two-revolution helix; (3) automatic guidance along a programmed three-dimensional reference flight path; (4) navigation data for the automatic guidance computed on board, based on VOR/DME, TACAN, or MLS navaid data; and (5) integration of a large set of functions in a single computer, utilizing 16k words of storage for programs and data.

  12. Advancing botnet modeling techniques for military and security simulations

    NASA Astrophysics Data System (ADS)

    Banks, Sheila B.; Stytz, Martin R.

    2011-06-01

    Simulation environments serve many purposes, but they are only as good as their content. One of the most challenging and pressing areas that call for improved content is the simulation of bot armies (botnets) and their effects upon networks and computer systems. Botnets are a new type of malware, a type that is more powerful and potentially dangerous than any other type of malware. A botnet's power derives from several capabilities, including the following: 1) the botnet's capability to be controlled and directed throughout all phases of its activity, 2) a command and control structure that grows increasingly sophisticated, and 3) the ability of a bot's software to be updated at any time by the owner of the bot (a person commonly called a bot master or bot herder). Not only is a bot army powerful and agile in its technical capabilities, it can also be extremely large, comprising tens of thousands, if not millions, of compromised computers, or it can be as small as a few thousand targeted systems. In all botnets, their members can surreptitiously communicate with each other and their command and control centers. In sum, these capabilities allow a bot army to execute attacks that are technically sophisticated, difficult to trace, tactically agile, massive, and coordinated. To improve our understanding of their operation and potential, we believe that it is necessary to develop computer security simulations that accurately portray bot army activities, with the goal of including bot army simulations within military simulation environments. In this paper, we investigate issues that arise when simulating bot armies and propose a combination of the biologically inspired MSEIR infection spread model coupled with the jump-diffusion infection spread model to portray botnet propagation.
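
    A minimal discrete-time reading of the biologically inspired MSEIR component mentioned above, with the compartments reinterpreted for hosts on a network (rates and populations are invented, not fitted; the paper does not spell out its parameterization):

    ```python
    # Discrete-time MSEIR spread model reinterpreted for a botnet:
    # M = hosts temporarily protected (e.g. freshly patched), S = susceptible,
    # E = compromised but dormant, I = active bots, R = cleaned/hardened.
    N = 100_000
    M, S, E, I, R = 20_000.0, 79_990.0, 0.0, 10.0, 0.0
    beta, sigma, gamma, delta = 0.8, 0.5, 0.1, 0.01  # per-day rates (invented)

    for day in range(1, 61):
        new_infections = beta * S * I / N     # scanning bots find susceptibles
        dM = -delta * M                       # initial protection wears off
        dS = delta * M - new_infections
        dE = new_infections - sigma * E       # dormant bots await activation
        dI = sigma * E - gamma * I            # active bots eventually cleaned
        dR = gamma * I
        M, S, E, I, R = M + dM, S + dS, E + dE, I + dI, R + dR
        if day % 15 == 0:
            print(f"day {day:2d}: active bots ~ {I:8.0f}")
    ```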

  13. Hybrid soft computing systems for electromyographic signals analysis: a review.

    PubMed

    Xie, Hong-Bo; Guo, Tianruo; Bai, Siwei; Dokos, Socrates

    2014-02-03

    The electromyogram (EMG) is a bio-signal collected from human skeletal muscle. Analysis of EMG signals has been widely used to detect human movement intent, control various human-machine interfaces, diagnose neuromuscular diseases, and model the neuromusculoskeletal system. With the advances of artificial intelligence and soft computing, many sophisticated techniques have been proposed for these purposes. Hybrid soft computing system (HSCS), the integration of these different techniques, aims to further improve the effectiveness, efficiency, and accuracy of EMG analysis. This paper reviews and compares key combinations of neural network, support vector machine, fuzzy logic, evolutionary computing, and swarm intelligence for EMG analysis. Our suggestions on the possible future development of HSCS in EMG analysis are also given in terms of basic soft computing techniques, further combination of these techniques, and their other applications in EMG analysis.

  14. Hybrid soft computing systems for electromyographic signals analysis: a review

    PubMed Central

    2014-01-01

    The electromyogram (EMG) is a bio-signal collected from human skeletal muscle. Analysis of EMG signals has been widely used to detect human movement intent, control various human-machine interfaces, diagnose neuromuscular diseases, and model the neuromusculoskeletal system. With the advances of artificial intelligence and soft computing, many sophisticated techniques have been proposed for these purposes. Hybrid soft computing system (HSCS), the integration of these different techniques, aims to further improve the effectiveness, efficiency, and accuracy of EMG analysis. This paper reviews and compares key combinations of neural network, support vector machine, fuzzy logic, evolutionary computing, and swarm intelligence for EMG analysis. Our suggestions on the possible future development of HSCS in EMG analysis are also given in terms of basic soft computing techniques, further combination of these techniques, and their other applications in EMG analysis. PMID:24490979
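
    A toy version of the pipeline such systems automate, combining time-domain EMG features with fuzzy memberships (synthetic signal; real HSCS integrate trained classifiers, fuzzy logic, and evolutionary tuning, and all thresholds below are invented):

    ```python
    import numpy as np

    # Extract simple EMG features, then grade movement intent with
    # fuzzy memberships and a fuzzy AND (min).
    rng = np.random.default_rng(3)

    def features(emg):
        rms = np.sqrt(np.mean(emg ** 2))
        zero_crossings = int(np.sum(np.diff(np.sign(emg)) != 0))
        return rms, zero_crossings

    def mu_high(x, lo, hi):
        """Fuzzy 'high' membership: 0 below lo, 1 above hi, linear between."""
        return float(np.clip((x - lo) / (hi - lo), 0.0, 1.0))

    for label, amplitude in [("rest", 0.05), ("contraction", 0.8)]:
        emg = amplitude * rng.normal(size=1000)   # crude surface-EMG stand-in
        rms, zc = features(emg)
        intent = min(mu_high(rms, 0.1, 0.5), mu_high(zc, 100, 400))
        print(f"{label:12s} RMS={rms:.2f} ZC={zc:4d} intent grade={intent:.2f}")
    ```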

  15. The Impact of Robotics on Employment. Hearing before the Subcommittee on Economic Goals and Intergovernmental Policy of the Joint Economic Committee, Congress of the United States, Ninety-Eighth Congress. First Session.

    ERIC Educational Resources Information Center

    Joint Economic Committee, Washington, DC.

    This document is a transcript of a fact-finding hearing conducted to evaluate the prospective impact of robotics (the use of sophisticated programmable or computer-controlled robots to perform routine and repetitious tasks) on employment in this country. Testimony and prepared reports were given by John Andelin, assistant director of Science,…

  16. Teaching biomedical applications to secondary students.

    PubMed

    Openshaw, S; Fleisher, A; Ljunggren, C

    1999-01-01

    Certain aspects of biomedical engineering applications lend themselves well to experimentation that can be done by high school students. This paper describes two experiments done during a six-week summer internship program in which two high school students used electrodes, circuit boards, and computers to mimic a sophisticated heart monitor and also to control a robotic car. Our experience suggests that simple illustrations of complex instrumentation can be effective in introducing adolescents to the biomedical engineering field.

  17. Real-time fuzzy inference based robot path planning

    NASA Technical Reports Server (NTRS)

    Pacini, Peter J.; Teichrow, Jon S.

    1990-01-01

    This project addresses the problem of adaptive trajectory generation for a robot arm. Conventional trajectory generation involves computing a path in real time to minimize a performance measure such as expended energy. This method can be computationally intensive, and it may yield poor results if the trajectory is weakly constrained. Typically some implicit constraints are known, but cannot be encoded analytically. The alternative approach used here is to formulate domain-specific knowledge, including implicit and ill-defined constraints, in terms of fuzzy rules. These rules utilize linguistic terms to relate input variables to output variables. Since the fuzzy rulebase is determined off-line, only high-level, computationally light processing is required in real time. Potential applications for adaptive trajectory generation include missile guidance and various sophisticated robot control tasks, such as automotive assembly, high speed electrical parts insertion, stepper alignment, and motion control for high speed parcel transfer systems.
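
    A minimal Mamdani-style sketch of the kind of rule evaluation described, in Python (the rule shapes and centroids are invented for illustration, not taken from the project):

    ```python
    # Two linguistic rules mapping obstacle distance to a commanded speed.
    def tri(x, a, b, c):
        """Triangular membership function centred at b."""
        if x <= a or x >= c:
            return 0.0
        return (x - a) / (b - a) if x < b else (c - x) / (c - b)

    def speed_command(obstacle_dist):
        # Rule 1: IF distance is NEAR THEN speed is SLOW (centroid 0.2)
        # Rule 2: IF distance is FAR  THEN speed is FAST (centroid 0.9)
        w_near = tri(obstacle_dist, -0.5, 0.0, 1.0)
        w_far = tri(obstacle_dist, 0.5, 2.0, 3.5)
        # Weighted-centroid defuzzification:
        return (w_near * 0.2 + w_far * 0.9) / max(w_near + w_far, 1e-9)

    for d in [0.1, 0.8, 1.5, 2.5]:
        print(f"obstacle at {d:.1f} m -> commanded speed {speed_command(d):.2f}")
    ```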

  18. An XML-based method for astronomy software designing

    NASA Astrophysics Data System (ADS)

    Liao, Mingxue; Aili, Yusupu; Zhang, Jin

    An XML-based method for the standardization of software design is introduced, analyzed, and successfully applied to renovating the hardware and software of the digital clock at Urumqi Astronomical Station. The basic strategy for eliciting time information from the new digital clock, FT206, in the antenna control program is introduced. With FT206, the need to compute with sophisticated formulas how many centuries have passed since a given day is eliminated, and it is no longer necessary to set the correct UT time on the computer controlling the antenna, because the year, month, and day are all deduced from the Julian day held in FT206 rather than from the computer time. With an XML-based method and standard for software design, various existing design methods are unified, communications and collaborations between developers are facilitated, and an Internet-based mode of software development becomes possible. The trend of development of the XML-based design method is predicted.
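
    For reference, deriving the calendar date from a Julian day number is a standard computation (the Fliegel and Van Flandern algorithm); a Python sketch of the kind of conversion the control program would perform:

    ```python
    # Convert an integer Julian day number to a Gregorian calendar date
    # (Fliegel & Van Flandern, 1968; integer arithmetic throughout).
    def jdn_to_date(jdn):
        """Integer Julian day number -> (year, month, day)."""
        f = jdn + 68569
        n = 4 * f // 146097
        f = f - (146097 * n + 3) // 4
        i = 4000 * (f + 1) // 1461001
        f = f - 1461 * i // 4 + 31
        j = 80 * f // 2447
        day = f - 2447 * j // 80
        f = j // 11
        month = j + 2 - 12 * f
        year = 100 * (n - 49) + i + f
        return year, month, day

    print(jdn_to_date(2451545))   # -> (2000, 1, 1), the J2000.0 epoch
    ```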

  19. Reading wild minds: A computational assay of Theory of Mind sophistication across seven primate species

    PubMed Central

    Devaine, Marie; San-Galli, Aurore; Trapanese, Cinzia; Bardino, Giulia; Hano, Christelle; Saint Jalme, Michel; Bouret, Sebastien

    2017-01-01

    Theory of Mind (ToM), i.e. the ability to understand others' mental states, endows humans with highly adaptive social skills such as teaching or deceiving. Candidate evolutionary explanations have been proposed for the unique sophistication of human ToM among primates. For example, the Machiavellian intelligence hypothesis states that the increasing complexity of social networks may have induced a demand for sophisticated ToM. This type of scenario ignores neurocognitive constraints that may eventually be crucial limiting factors for ToM evolution. In contradistinction, the cognitive scaffolding hypothesis asserts that a species' opportunity to develop sophisticated ToM is mostly determined by its general cognitive capacity (on which ToM is scaffolded). However, the actual relationships between ToM sophistication and either brain volume (a proxy for general cognitive capacity) or social group size (a proxy for social network complexity) are unclear. Here, we let 39 individuals sampled from seven non-human primate species (lemurs, macaques, mangabeys, orangutans, gorillas and chimpanzees) engage in simple dyadic games against artificial ToM players (via a familiar human caregiver). Using computational analyses of primates' choice sequences, we found that the probability of exhibiting a ToM-compatible learning style is mainly driven by species' brain volume (rather than by social group size). Moreover, primates' social cognitive sophistication culminates in a precursor form of ToM, which still falls short of human fully-developed ToM abilities. PMID:29112973

  20. Reading wild minds: A computational assay of Theory of Mind sophistication across seven primate species.

    PubMed

    Devaine, Marie; San-Galli, Aurore; Trapanese, Cinzia; Bardino, Giulia; Hano, Christelle; Saint Jalme, Michel; Bouret, Sebastien; Masi, Shelly; Daunizeau, Jean

    2017-11-01

    Theory of Mind (ToM), i.e. the ability to understand others' mental states, endows humans with highly adaptive social skills such as teaching or deceiving. Candidate evolutionary explanations have been proposed for the unique sophistication of human ToM among primates. For example, the Machiavellian intelligence hypothesis states that the increasing complexity of social networks may have induced a demand for sophisticated ToM. This type of scenario ignores neurocognitive constraints that may eventually be crucial limiting factors for ToM evolution. In contradistinction, the cognitive scaffolding hypothesis asserts that a species' opportunity to develop sophisticated ToM is mostly determined by its general cognitive capacity (on which ToM is scaffolded). However, the actual relationships between ToM sophistication and either brain volume (a proxy for general cognitive capacity) or social group size (a proxy for social network complexity) are unclear. Here, we let 39 individuals sampled from seven non-human primate species (lemurs, macaques, mangabeys, orangutans, gorillas and chimpanzees) engage in simple dyadic games against artificial ToM players (via a familiar human caregiver). Using computational analyses of primates' choice sequences, we found that the probability of exhibiting a ToM-compatible learning style is mainly driven by species' brain volume (rather than by social group size). Moreover, primates' social cognitive sophistication culminates in a precursor form of ToM, which still falls short of human fully-developed ToM abilities.

  1. Large Hospital Buildings in the United States in 2007

    EIA Publications

    2012-01-01

    Hospitals consume large amounts of energy because of how they are run and the many people that use them. They are open 24 hours a day; thousands of employees, patients, and visitors occupy the buildings daily; and sophisticated heating, ventilation, and air conditioning (HVAC) systems control the temperatures and air flow. In addition, many energy intensive activities occur in these buildings: laundry, medical and lab equipment use, sterilization, computer and server use, food service, and refrigeration.

  2. Human-machine interface hardware: The next decade

    NASA Technical Reports Server (NTRS)

    Marcus, Elizabeth A.

    1991-01-01

    In order to understand where human-machine interface hardware is headed, it is important to understand where we are today, how we got there, and what our goals for the future are. As computers become faster and more capable and programs become more sophisticated, it becomes apparent that the interface hardware is the key to an exciting future in computing. How can a user effectively interact with and control a seemingly limitless array of parameters? Today, the answer is most often a limitless array of controls. The link between these controls and human sensory motor capabilities does not utilize existing human capabilities to their full extent. Interface hardware for teleoperation and virtual environments is now facing a crossroad in design. Therefore, we as developers need to explore how the combination of interface hardware, human capabilities, and user experience can be blended to get the best performance today and in the future.

  3. Emulating weak localization using a solid-state quantum circuit.

    PubMed

    Chen, Yu; Roushan, P; Sank, D; Neill, C; Lucero, Erik; Mariantoni, Matteo; Barends, R; Chiaro, B; Kelly, J; Megrant, A; Mutus, J Y; O'Malley, P J J; Vainsencher, A; Wenner, J; White, T C; Yin, Yi; Cleland, A N; Martinis, John M

    2014-10-14

    Quantum interference is one of the most fundamental physical effects found in nature. Recent advances in quantum computing now employ interference as a fundamental resource for computation and control. Quantum interference also lies at the heart of sophisticated condensed matter phenomena such as Anderson localization, phenomena that are difficult to reproduce in numerical simulations. Here, employing a multiple-element superconducting quantum circuit, with which we manipulate a single microwave photon, we demonstrate that we can emulate the basic effects of weak localization. By engineering the control sequence, we are able to reproduce the well-known negative magnetoresistance of weak localization as well as its temperature dependence. Furthermore, we can use our circuit to continuously tune the level of disorder, a parameter that is not readily accessible in mesoscopic systems. Demonstrating a high level of control, our experiment shows the potential for employing superconducting quantum circuits as emulators for complex quantum phenomena.

  4. Analysis, design, and testing of a low cost, direct force command linear proof mass actuator for structural control

    NASA Technical Reports Server (NTRS)

    Slater, G. L.; Shelley, Stuart; Jacobson, Mark

    1993-01-01

    In this paper, the design, analysis, and test of a low cost, linear proof mass actuator for vibration control is presented. The actuator is based on a linear induction coil from a large computer disk drive. Such disk drives are readily available and provide the linear actuator, current feedback amplifier, and power supply for a highly effective, yet inexpensive, experimental laboratory actuator. The device is implemented as a force command input system, and the performance is virtually the same as other, more sophisticated, linear proof mass systems.

  5. Emerging Uses of Computer Technology in Qualitative Research.

    ERIC Educational Resources Information Center

    Parker, D. Randall

    The application of computer technology in qualitative research and evaluation ranges from simple word processing to doing sophisticated data sorting and retrieval. How computer software can be used for qualitative research is discussed. Researchers should consider the use of computers in data analysis in light of their own familiarity and comfort…

  6. The Next Computer Revolution.

    ERIC Educational Resources Information Center

    Peled, Abraham

    1987-01-01

    Discusses some of the future trends in the use of the computer in our society, suggesting that computing is now entering a new phase in which it will grow exponentially more powerful, flexible, and sophisticated in the next decade. Describes some of the latest breakthroughs in computer hardware and software technology. (TW)

  7. 'Stranger' child-murder: issues relating to causes and controls.

    PubMed

    Wilson, P R

    1988-02-01

    Most industrialised countries are concerned with a perceived increase in the killing of children and adolescents by strangers. Though reliable statistics are lacking, the growth of serial murder suggests that more young persons may be at risk than ever before. Explanations, whether psychological or sociological, of child murder by strangers are inadequately developed. Despite the tendency to see such killers as psychiatrically ill, a number of studies suggest that the majority of offenders do not differ significantly, at least in psychological traits, from non-offenders. Subcultural and other sociological perspectives stressing "social disadvantage" have low levels of explanatory power and do not assist greatly in understanding child killings. Despite sketchy and contradictory evidence on the effects of the media on sexual and violent crime, case study material supports the view that pornography, including popular music, may increase the propensity of individuals to commit atrocities. Counter-measures to control stranger child killing lie in more sophisticated law enforcement (profiling and computer links between police forces), long periods of incarceration of the offender, and more sophisticated analyses of the crimes.

  8. Ocean Models and Proper Orthogonal Decomposition

    NASA Astrophysics Data System (ADS)

    Salas-de-Leon, D. A.

    2007-05-01

    Increasing computational development and a better understanding of mathematical and physical systems have resulted in an increasing number of ocean models. A long time ago, modelers were like a secret organization who recognized each other by using codes and languages that only a select group of people could understand. Access to computational systems was limited: on the one hand, equipment and computer time were expensive and restricted, and on the other hand, they required advanced programming languages that not everybody wanted to learn. Nowadays most college freshmen own a personal computer (PC or laptop) and/or have access to more sophisticated computational systems than those available for research in the early 80's. This availability of resources has resulted in much wider access to all kinds of models. Today computer speed and time, and the algorithms, do not seem to be a problem, even though some models take days to run on small computational systems. Almost every oceanographic institution has its own model; what is more, within the same institution, from one office to the next, there are different models for the same phenomena, developed by different research members. The results do not differ substantially, since the equations are the same and the solution algorithms are similar. The algorithms, and the grids constructed with them, can be found in textbooks and/or on the Internet. Every year more sophisticated models are constructed. Proper Orthogonal Decomposition is a technique that reduces the number of variables to be solved while keeping the model's properties, which makes it a very useful tool for shrinking the problems that have to be solved on "small" computational systems, making sophisticated models available to a greater community.
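
    The technique is easy to demonstrate with the singular value decomposition, which yields the POD modes of a snapshot matrix (the field below is synthetic, built for illustration):

    ```python
    import numpy as np

    # POD of a snapshot matrix via the SVD: a few modes capture most of
    # the variance, so a large model state can evolve in a small basis.
    rng = np.random.default_rng(0)
    x = np.linspace(0, 2 * np.pi, 200)
    t = np.linspace(0, 10, 80)
    # Two coherent structures plus noise, sampled as 80 snapshots:
    field = (np.outer(np.sin(x), np.cos(t)) +
             0.3 * np.outer(np.sin(2 * x), np.sin(3 * t)) +
             0.01 * rng.normal(size=(200, 80)))

    U, s, Vt = np.linalg.svd(field, full_matrices=False)
    energy = np.cumsum(s**2) / np.sum(s**2)
    k = int(np.searchsorted(energy, 0.99) + 1)
    print(f"{k} POD modes capture 99% of the energy (of {len(s)} possible)")

    reduced = U[:, :k] @ np.diag(s[:k]) @ Vt[:k]   # rank-k reconstruction
    err = np.linalg.norm(field - reduced) / np.linalg.norm(field)
    print(f"reconstruction error: {err:.3%}")
    ```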

  9. Goals and Objectives for Computing in the Associated Colleges of the St. Lawrence Valley.

    ERIC Educational Resources Information Center

    Grupe, Fritz H.

    A forecast of the computing requirements of the Associated Colleges of the St. Lawrence Valley, an analysis of their needs, and specifications for a joint computer system are presented. Problems encountered included the lack of resources and computer sophistication at the member schools and a dearth of experience with long-term computer consortium…

  10. Site-controlled quantum dots fabricated using an atomic-force microscope assisted technique

    PubMed Central

    Usuki, T; Ohshima, T; Sakuma, Y; Kawabe, M; Okada, Y; Takemoto, K; Miyazawa, T; Hirose, S; Nakata, Y; Takatsu, M; Yokoyama, N

    2006-01-01

    An atomic-force microscope assisted technique is developed to control the position and size of self-assembled semiconductor quantum dots (QDs). Presently, the site precision is as good as ± 1.5 nm and the size fluctuation is within ± 5% with the minimum controllable lateral diameter of 20 nm. With the ability of producing tightly packed and differently sized QDs, sophisticated QD arrays can be controllably fabricated for the application in quantum computing. The optical quality of such site-controlled QDs is found comparable to some conventionally self-assembled semiconductor QDs. The single dot photoluminescence of site-controlled InAs/InP QDs is studied in detail, presenting the prospect to utilize them in quantum communication as precisely controlled single photon emitters working at telecommunication bands.

  11. User participation in the development of the human/computer interface for control centers

    NASA Technical Reports Server (NTRS)

    Broome, Richard; Quick-Campbell, Marlene; Creegan, James; Dutilly, Robert

    1996-01-01

    Technological advances coupled with the requirements to reduce operations staffing costs led to the demand for efficient, technologically-sophisticated mission operations control centers. The control center under development for the earth observing system (EOS) is considered. The users are involved in the development of a control center in order to ensure that it is cost-efficient and flexible. A number of measures were implemented in the EOS program in order to encourage user involvement in the area of human-computer interface development. The following user participation exercises carried out in relation to the system analysis and design are described: the shadow participation of the programmers during a day of operations; the flight operations personnel interviews; and the analysis of the flight operations team tasks. The user participation in the interface prototype development, the prototype evaluation, and the system implementation are reported on. The involvement of the users early in the development process enables the requirements to be better understood and the cost to be reduced.

  12. Logic via Computer Programming.

    ERIC Educational Resources Information Center

    Wieschenberg, Agnes A.

    This paper poses the question "How do we teach logical thinking and sophisticated mathematics to unsophisticated college students?" One answer among many is through the writing of computer programs. The writing of computer algorithms is mathematical problem solving and logic in disguise, and it may attract students who would otherwise stop…
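
    A small example of the kind of exercise the paper has in mind: a truth-table program that makes the logic of a proposition fully explicit (the proposition chosen here is illustrative):

    ```python
    from itertools import product

    # Is (p -> q) equivalent to (not p or q)? Enumerate all cases.
    def implies(p, q):
        return (not p) or q

    print(" p     q     p->q   ~p or q")
    for p, q in product([True, False], repeat=2):
        print(f"{p!s:5} {q!s:5} {implies(p, q)!s:6} {((not p) or q)!s:5}")
    ```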

  13. Prior Consent: Not-So-Strange Bedfellows Plan Library/Computing Partnerships.

    ERIC Educational Resources Information Center

    McDonough, Kristin

    The increasing sophistication of information technologies and the nearly universal access to computing have blurred distinctions among information delivery units on college campuses, forcing institutions to rethink the separate organizational structures that evolved when computing in academe was more localized and less prevalent. Experiences in…

  14. Alpha Control - A new Concept in SPM Control

    NASA Astrophysics Data System (ADS)

    Spizig, P.; Sanchen, D.; Volswinkler, G.; Ibach, W.; Koenen, J.

    2006-03-01

    Controlling modern Scanning Probe Microscopes demands highly sophisticated electronics. While flexibility and raw computing power are of great importance in facilitating the variety of measurement modes, extremely low noise is also a necessity. Accordingly, modern SPM controller designs are based on digital electronics to overcome the drawbacks of analog designs. While today's SPM controllers are based on DSPs or microprocessors and often still incorporate analog parts, we are now introducing a completely new approach: using a Field Programmable Gate Array (FPGA) to implement the digital control tasks allows unrivalled data-processing speed by computing all tasks in parallel within a single chip. Time-consuming task switching between data acquisition, digital filtering, scanning, and the computing of feedback signals can be completely avoided. Together with a star topology that avoids any bus limitations in accessing the variety of ADCs and DACs, this design guarantees for the first time an entirely deterministic timing capability in the nanosecond regime for all tasks. This becomes especially useful for any external experiments that must be synchronized with the scan, or for high-speed scans that require not only closed-loop control of the scanner but also dynamic correction of the scan movement. Delicate samples additionally benefit from extremely high sample rates, allowing highly resolved signals and low noise levels.
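
    The abstract gives no implementation details, but the per-sample feedback arithmetic such a controller pipelines is easy to picture. A minimal sketch, assuming a fixed-point PI regulator (gains, bit widths, and names are illustrative choices of ours, not the Alpha Control design):

    ```python
    def pi_step(error, integrator, kp=3, ki=1, shift=10, dac_bits=16):
        """One deterministic fixed-point PI iteration, as FPGA logic might compute it."""
        integrator += error                            # accumulate the error term
        out = (kp * error + ki * integrator) >> shift  # rescale by 2**shift
        lo, hi = -(1 << (dac_bits - 1)), (1 << (dac_bits - 1)) - 1
        return max(lo, min(hi, out)), integrator       # saturate to the DAC range

    setpoint, integrator = 1000, 0
    for adc_reading in (900, 950, 990, 1005):          # simulated tip-signal samples
        dac_out, integrator = pi_step(setpoint - adc_reading, integrator)
        print(dac_out, integrator)
    ```

    On an FPGA each of these operations is a piece of parallel logic, so one update completes every clock cycle regardless of what the rest of the system is doing.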

  15. Reproducible research in vadose zone sciences

    USDA-ARS?s Scientific Manuscript database

    A significant portion of present-day soil and Earth science research is computational, involving complex data analysis pipelines, advanced mathematical and statistical models, and sophisticated computer codes. Opportunities for scientific progress are greatly diminished if reproducing and building o...

  16. Experience with a sophisticated computer based authoring system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gardner, P.R.

    1984-04-01

    In the November 1982 issue of ADCIS SIG CBT Newsletter the editor arrives at two conclusions regarding Computer Based Authoring Systems (CBAS): (1) CBAS drastically reduces programming time and the need for expert programmers, and (2) CBAS appears to have minimal impact on initial lesson design. Both of these comments have significant impact on any Cost-Benefit analysis for Computer-Based Training. The first tends to improve cost-effectiveness but only toward the limits imposed by the second. Westinghouse Hanford Company (WHC) recently purchased a sophisticated CBAS, the WISE/SMART system from Wicat (Orem, UT), for use in the Nuclear Power Industry. This report details our experience with this system relative to Items (1) and (2) above; lesson design time will be compared with lesson input time. Also provided will be the WHC experience in the use of subject matter experts (though computer neophytes) for the design and inputting of CBT materials.

  17. Using Interactive Computer to Communicate Scientific Information.

    ERIC Educational Resources Information Center

    Selnow, Gary W.

    1988-01-01

    Asks whether the computer is another channel of communication, if its interactive qualities make it an information source, or if it is an undefined hybrid. Concludes that computers are neither the medium nor the source but will in the future provide the possibility of a sophisticated interaction between human intelligence and artificial…

  18. How You Can Protect Public Access Computers "and" Their Users

    ERIC Educational Resources Information Center

    Huang, Phil

    2007-01-01

    By providing the public with online computing facilities, librarians make available a world of information resources beyond their traditional print materials. Internet-connected computers in libraries greatly enhance the opportunity for patrons to enjoy the benefits of the digital age. Unfortunately, as hackers become more sophisticated and…

  19. Remote control of nanoscale devices

    NASA Astrophysics Data System (ADS)

    Högberg, Björn

    2018-01-01

    Processes that occur at the nanometer scale have a tremendous impact on our daily lives. Sophisticated evolved nanomachines operate in each of our cells; we also, as a society, increasingly rely on synthetic nanodevices for communication and computation. Scientists are still only beginning to master this scale, but, recently, DNA nanotechnology (1)—in particular, DNA origami (2)—has emerged as a powerful tool to build structures precise enough to help us do so. On page 296 of this issue, Kopperger et al. (3) show that they are now also able to control the motion of a DNA origami device from the outside by applying electric fields.

  20. Implementing Computer Algebra Enabled Questions for the Assessment and Learning of Mathematics

    ERIC Educational Resources Information Center

    Sangwin, Christopher J.; Naismith, Laura

    2008-01-01

    We present principles for the design of an online system to support computer algebra enabled questions for use within the teaching and learning of mathematics in higher education. The introduction of a computer algebra system (CAS) into a computer aided assessment (CAA) system affords sophisticated response processing of student provided answers.…

  1. SpaceWire-Based Control System Architecture for the Lightweight Advanced Robotic Arm Demonstrator [LARAD]

    NASA Astrophysics Data System (ADS)

    Rucinski, Marek; Coates, Adam; Montano, Giuseppe; Allouis, Elie; Jameux, David

    2015-09-01

    The Lightweight Advanced Robotic Arm Demonstrator (LARAD) is a state-of-the-art, two-meter long robotic arm for planetary surface exploration currently being developed by a UK consortium led by Airbus Defence and Space Ltd under contract to the UK Space Agency (CREST-2 programme). LARAD has a modular design, which allows for experimentation with different electronics and control software. The control system architecture includes the on-board computer, control software and firmware, and the communication infrastructure (e.g. data links, switches) connecting on-board computer(s), sensors, actuators and the end-effector. The purpose of the control system is to operate the arm according to pre-defined performance requirements, monitoring its behaviour in real-time and performing safing/recovery actions in case of faults. This paper reports on the results of a recent study about the feasibility of the development and integration of a novel control system architecture for LARAD fully based on the SpaceWire protocol. The current control system architecture is based on the combination of two communication protocols, Ethernet and CAN. The new SpaceWire-based control system will allow for improved monitoring and telecommanding performance thanks to higher communication data rate, allowing for the adoption of advanced control schemes, potentially based on multiple vision sensors, and for the handling of sophisticated end-effectors that require fine control, such as science payloads or robotic hands.

  2. Cerebellar contributions to motor control and language comprehension: searching for common computational principles.

    PubMed

    Moberget, Torgeir; Ivry, Richard B

    2016-04-01

    The past 25 years have seen the functional domain of the cerebellum extend beyond the realm of motor control, with considerable discussion of how this subcortical structure contributes to cognitive domains including attention, memory, and language. Drawing on evidence from neuroanatomy, physiology, neuropsychology, and computational work, sophisticated models have been developed to describe cerebellar function in sensorimotor control and learning. In contrast, mechanistic accounts of how the cerebellum contributes to cognition have remained elusive. Inspired by the homogeneous cerebellar microanatomy and a desire for parsimony, many researchers have sought to extend mechanistic ideas from motor control to cognition. One influential hypothesis centers on the idea that the cerebellum implements internal models, representations of the context-specific dynamics of an agent's interactions with the environment, enabling predictive control. We briefly review cerebellar anatomy and physiology, to review the internal model hypothesis as applied in the motor domain, before turning to extensions of these ideas in the linguistic domain, focusing on speech perception and semantic processing. While recent findings are consistent with this computational generalization, they also raise challenging questions regarding the nature of cerebellar learning, and may thus inspire revisions of our views on the role of the cerebellum in sensorimotor control. © 2016 New York Academy of Sciences.
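
    One way to make the internal-model idea concrete: a forward model predicts the sensory consequence of a motor command and is adapted from the prediction error. The sketch below is our own minimal illustration (a scalar LMS learner), not a model taken from the review:

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    true_gain = 1.8   # unknown plant: sensation = true_gain * motor command
    w = 0.0           # the forward model's current estimate of that mapping
    eta = 0.5         # learning rate
    for trial in range(200):
        u = rng.uniform(-1, 1)          # motor command (efference copy)
        error = true_gain * u - w * u   # observed minus predicted sensation
        w += eta * error * u            # error-driven (LMS) update
    print(f"learned gain {w:.2f} vs. true gain 1.8")
    ```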

  3. Evolution of Computational Toxicology-from Primitive ...

    EPA Pesticide Factsheets

    Presentation at the Health Canada seminar in Ottawa, ON, Canada on Nov. 15, 2016 on the Evolution of Computational Toxicology-from Primitive Beginnings to Sophisticated Application

  4. Evaluating coastal and river valley communities evacuation network performance using macroscopic productivity.

    DOT National Transportation Integrated Search

    2017-06-30

    The ever-increasing processing speed and computational power of computers and simulation systems has led to correspondingly larger, more sophisticated representations of evacuation traffic processes. Today, micro-level analyses can be conducted for m...

  5. Computers for Interactive Learning.

    ERIC Educational Resources Information Center

    Grabowski, Barbara; Aggen, William

    1984-01-01

    Analyzes features of computer-based interactive video including sophisticated answer judging, diagnostic feedback, simulation, animation, audible tones, touch sensitive screen, function keys, and video enhancements, and matches these to the characteristics and pedagogical styles of learners. The learner characteristics discussed include internal…

  6. Big Data, Big Solutions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pike, Bill

    Data—lots of data—generated in seconds and piling up on the internet, streaming and stored in countless databases. Big data is important for commerce, society and our nation’s security. Yet the volume, velocity, variety and veracity of data are simply too great for any single analyst to make sense of alone. It requires advanced, data-intensive computing. Simply put, data-intensive computing is the use of sophisticated computers to sort through mounds of information and present analysts with solutions in the form of graphics, scenarios, formulas, new hypotheses and more. This scientific capability is foundational to PNNL’s energy, environment and security missions. Senior Scientist and Division Director Bill Pike and his team are developing analytic tools that are used to solve important national challenges, including cyber systems defense, power grid control systems, intelligence analysis, climate change and scientific exploration.

  7. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    This paper reports on an automated metering/proving system for custody transfer of crude oil at the Phillips 66 Co. tanker unloading terminal in Freeport, Texas. It is described as one of the most sophisticated systems developed. The menu-driven, one-button automation removes the proving sequence entirely from manual control. The system also is said to be cost-effective and versatile compared to a dedicated flow computer with API calculation capabilities. Developed by Puffer-Sweiven, systems integrators, the new technology additionally is thought to be the first custody transfer system to employ a programmable logic controller (PLC). The PLC provides the automation, gathers and stores all raw data, and prints alarms. Also, the system uses a personal computer operator interface (OI) that runs on the Intel iRMX real-time operating system. The OI is loaded with Puffer-Sweiven application software that performs API meter factor and volume correction calculations, presents color graphics, and generates reports.

  8. Motorized CPM/CAM physiotherapy device with sliding-mode Fuzzy Neural Network control loop.

    PubMed

    Ho, Hung-Jung; Chen, Tien-Chi

    2009-11-01

    Continuous passive motion (CPM) and controllable active motion (CAM) physiotherapy devices promote rehabilitation of damaged joints. This paper presents a computerized CPM/CAM system that obviates the need for mechanical resistance devices such as springs. The system is controlled by a computer which performs sliding-mode Fuzzy Neural Network (FNN) calculations online. CAM-type resistance force is generated by the active performance of an electric motor which is controlled so as to oppose the motion of the patient's leg. A force sensor under the patient's foot on the device pedal provides data for feedback in a sliding-mode FNN control loop built around the motor. Via an active impedance control feedback system, the controller drives the motor to behave similarly to a damped spring by generating and controlling the amplitude and direction of the pedal force in relation to the patient's leg. Experiments demonstrate the high sensitivity and speed of the device. The PC-based feedback nature of the control loop means that sophisticated auto-adaptable CPM/CAM custom-designed physiotherapy becomes possible. The computer base also allows extensive data recording, data analysis and network-connected remote patient monitoring.
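
    The damped-spring behavior described for the CAM mode corresponds to a simple impedance-control law. A minimal sketch with invented gains (the paper wraps this kind of law in a sliding-mode FNN loop, which is not reproduced here):

    ```python
    def impedance_force(position, velocity, rest_position=0.0, k=40.0, b=6.0):
        """Virtual spring-damper: oppose displacement and motion of the pedal."""
        return -k * (position - rest_position) - b * velocity

    # The further and faster the leg pushes, the harder the motor pushes back.
    for x, v in [(0.0, 0.0), (0.1, 0.5), (0.3, 0.8), (0.5, 0.2)]:
        print(f"x={x:.1f} m, v={v:.1f} m/s -> force {impedance_force(x, v):+.1f} N")
    ```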

  9. Novel apparatus and methods for performing remotely controlled particle-solid interaction experiments at CERN

    NASA Astrophysics Data System (ADS)

    Krause, H. F.; Deveney, E. F.; Jones, N. L.; Vane, C. R.; Datz, S.; Knudsen, H.; Grafström, P.; Schuch, R.

    1997-04-01

    Recent atomic physics studies involving ultrarelativistic Pb ions required solid target positioners, scintillators, and a sophisticated data acquisition and control system placed in a remote location at the CERN Super Proton Synchrotron near Geneva, Switzerland. The apparatus, installed in a high-radiation zone underground, had to (i) function for months, (ii) automatically respond to failures such as power outages and particle-induced computer upsets, and (iii) communicate with the outside world via a telephone line. The heart of the apparatus developed was an Apple Macintosh-based CAMAC system that answered the telephone and interpreted and executed remote control commands that (i) sensed and set targets, (ii) controlled voltages and discriminator levels for scintillators, (iii) modified data acquisition hardware logic, (iv) reported control information, and (v) automatically synchronized data acquisition to the CERN spill cycle via a modem signal and transmitted experimental data to a remote computer. No problems were experienced using intercontinental telephone connections at 1200 baud. Our successful "virtual laboratory" approach that uses off-the-shelf electronics is generally adaptable to more conventional bench-type experiments.

  10. AOPs & Biomarkers: Bridging High Throughput Screening and Regulatory Decision Making.

    EPA Science Inventory

    As high throughput screening (HTS) approaches play a larger role in toxicity testing, computational toxicology has emerged as a critical component in interpreting the large volume of data produced. Computational models for this purpose are becoming increasingly more sophisticated...

  11. A Computer Analysis of Library Postcards. (CALP)

    ERIC Educational Resources Information Center

    Stevens, Norman D.

    1974-01-01

    A description of a sophisticated application of computer techniques to the analysis of a collection of picture postcards of library buildings in an attempt to establish the minimum architectural requirements needed to distinguish one style of library building from another. (Author)

  12. Virtual Computing Laboratories: A Case Study with Comparisons to Physical Computing Laboratories

    ERIC Educational Resources Information Center

    Burd, Stephen D.; Seazzu, Alessandro F.; Conway, Christopher

    2009-01-01

    Current technology enables schools to provide remote or virtual computing labs that can be implemented in multiple ways ranging from remote access to banks of dedicated workstations to sophisticated access to large-scale servers hosting virtualized workstations. This paper reports on the implementation of a specific lab using remote access to…

  13. Factors Affecting Utilization of Information Output of Computer-Based Modeling Procedures in Local Government Organizations.

    ERIC Educational Resources Information Center

    Komsky, Susan

    Fiscal Impact Budgeting Systems (FIBS) are sophisticated computer based modeling procedures used in local government organizations, whose results, however, are often overlooked or ignored by decision makers. A study attempted to discover the reasons for this situation by focusing on four factors: potential usefulness, faith in computers,…

  14. Use of Computer-Assisted Technologies (CAT) to Enhance Social, Communicative, and Language Development in Children with Autism Spectrum Disorders

    ERIC Educational Resources Information Center

    Ploog, Bertram O.; Scharf, Alexa; Nelson, DeShawn; Brooks, Patricia J.

    2013-01-01

    Major advances in multimedia computer technology over the past decades have made sophisticated computer games readily available to the public. This, combined with the observation that most children, including those with autism spectrum disorders (ASD), show an affinity to computers, has led researchers to recognize the potential of computer…

  15. Home Diabetes Monitoring through Touch-Tone Computer Data Entry and Voice Synthesizer Response

    PubMed Central

    Arbogast, James G.; Dodrill, William H.

    1984-01-01

    Current studies suggest that the control of Diabetes mellitus can be improved with home monitoring of blood sugars. Voice synthesizers and recent technology, allowing decoding of Touch-Tone® pulses into their digital equivalents, make it possible for diabetics with no more sophisticated equipment than a Touch-Tone® telephone to enter their blood sugars directly into a medical office computer. A working prototype that can provide physicians with timely, logically oriented information about their diabetics is discussed along with plans to expand this concept into giving the patients uncomplicated therapeutic advice without the need for a direct patient/physician interaction. The potential impact on health care costs and the management of other chronic diseases is presented.

  16. Computational Control of Flexible Aerospace Systems

    NASA Technical Reports Server (NTRS)

    Sharpe, Lonnie, Jr.; Shen, Ji Yao

    1994-01-01

    The main objective of this project is to establish a distributed parameter modeling technique for structural analysis, parameter estimation, vibration suppression and control synthesis of large flexible aerospace structures. This report concentrates on the research outputs produced in the last two years of the project. The main accomplishments can be summarized as follows. A new version of the PDEMOD code has been completed. A theoretical investigation of the NASA MSFC two-dimensional ground-based manipulator facility using the distributed parameter modelling technique has been conducted. A new mathematical treatment for dynamic analysis and control of large flexible manipulator systems has been conceived, which may provide an embryonic form of a more sophisticated mathematical model for future modified versions of the PDEMOD code.

  17. High-Performance Computing Act of 1991. Report of the Senate Committee on Commerce, Science, and Transportation on S. 272. Senate, 102d Congress, 1st Session.

    ERIC Educational Resources Information Center

    Congress of the U.S., Washington, DC. Senate Committee on Commerce, Science, and Transportation.

    This report discusses Senate Bill no. 272, which provides for a coordinated federal research and development program to ensure continued U.S. leadership in high-performance computing. High performance computing is defined as representing the leading edge of technological advancement in computing, i.e., the most sophisticated computer chips, the…

  18. Progress in Computational Electron-Molecule Collisions

    NASA Astrophysics Data System (ADS)

    Rescigno, T. N.

    1997-10-01

    The past few years have witnessed tremendous progress in the development of sophisticated ab initio methods for treating collisions of slow electrons with isolated small molecules. Researchers in this area have benefited greatly from advances in computer technology; indeed, the advent of parallel computers has made it possible to carry out calculations at a level of sophistication inconceivable a decade ago. But bigger and faster computers are only part of the picture. Even with today's computers, the practical need to study electron collisions with the kinds of complex molecules and fragments encountered in real-world plasma processing environments is taxing present methods beyond their current capabilities. Since extrapolation of existing methods to handle increasingly larger targets will ultimately fail as it would require computational resources beyond any imagined, continued progress must also be linked to new theoretical developments. Some of the techniques recently introduced to address these problems will be discussed and illustrated with examples of electron-molecule collision calculations we have carried out on some fairly complex target gases encountered in processing plasmas. Electron-molecule scattering continues to pose many formidable theoretical and computational challenges. I will touch on some of the outstanding open questions.

  19. Choosing a Computer Language for Institutional Research. The AIR Professional File No. 6.

    ERIC Educational Resources Information Center

    Strenglein, Denise

    1980-01-01

    It is suggested that much thought should be given to choosing an appropriate computer language for an institutional research office, considering the sophistication of the staff, types of planned application, size and type of computer, and availability of central programming support in the institution. For offices that prepare straight reports and…

  20. New Ways of Using Computers in Language Teaching. New Ways in TESOL Series II. Innovative Classroom Techniques.

    ERIC Educational Resources Information Center

    Boswood, Tim, Ed.

    A collection of classroom approaches and activities using computers for language learning is presented. Some require sophisticated installations, but most do not, and most use software readily available on most workplace computer systems. The activities were chosen because they use sound language learning strategies. The book is divided into five…

  1. Design of a monitor and simulation terminal (master) for space station telerobotics and telescience

    NASA Technical Reports Server (NTRS)

    Lopez, L.; Konkel, C.; Harmon, P.; King, S.

    1989-01-01

    Based on Space Station and planetary spacecraft communication time delays and bandwidth limitations, it will be necessary to develop an intelligent, general purpose ground monitor terminal capable of sophisticated data display and control of on-orbit facilities and remote spacecraft. The basic elements that make up a Monitor and Simulation Terminal (MASTER) include computer overlay video, data compression, forward simulation, mission resource optimization and high level robotic control. Hardware and software elements of a MASTER are being assembled for testbed use. Applications of Neural Networks (NNs) to some key functions of a MASTER are also discussed. These functions are overlay graphics adjustment, object correlation and kinematic-dynamic characterization of the manipulator.

  2. Interactive mission planning for a Space Shuttle flight experiment - A case history

    NASA Technical Reports Server (NTRS)

    Harris, H. M.

    1986-01-01

    Scientific experiments which use the Space Shuttle as a platform require the development of new operations techniques for the command and control of the instrument. Principal among these is the ability to simulate the complex maneuvers of the orbiter's path realistically. Computer generated graphics provide a window into the actual and predicted performance of the instrument and allow sophisticated control of the instrument under varying conditions. In October of 1984 the Shuttle carried a synthetic aperture radar built by JPL for the purpose of recording images of the earth surface. The mission deviated from planned operation in almost every conceivable way and provided an exacting test bed for concepts of interactive mission planning.

  3. An Introduction to the Industrial Applications of Microcontrollers

    NASA Astrophysics Data System (ADS)

    Carelse, Xavier F.

    A microcontroller is sometimes described as a “computer on a chip” because it contains all the features of a full computer including central processor, in-built clock circuitry, ROM, RAM, input and output ports with special features such as serial communication, analogue-to-digital conversion and, more recently, signal processing. The smallest microcontroller has only eight pins but some having 68 pins are also being marketed. In the last five years, the prices of microcontrollers have dropped by 80%, and they are now among the most cost-effective components in industry. Being software-driven, microcontrollers greatly simplify the design of sophisticated instrumentation and control circuitry. They are able to effect the precise calculations sometimes needed for feedback in control systems and now form the basis of all intelligent embedded systems such as those required in television and VCR remote controls, microwave ovens, washing machines, etc. More than ten times as many microcontrollers as microprocessors are manufactured and sold in the world, in spite of the high profile that the latter enjoy because of the personal computer market. In Zimbabwe, extensive research is being carried out to use microcontrollers to aid the cost recovery of domestic and commercial solar installations as part of the rural electrification programme.

  4. Railroads and the Environment : Estimation of Fuel Consumption in Rail Transportation : Volume 3. Comparison of Computer Simulations with Field Measurements

    DOT National Transportation Integrated Search

    1978-09-01

    This report documents comparisons between extensive rail freight service measurements (previously presented in Volume II) and simulations of the same operations using a sophisticated train performance calculator computer program. The comparisons cove...

  5. Home, Hearth and Computing.

    ERIC Educational Resources Information Center

    Seelig, Anita

    1982-01-01

    Advantages of having children use microcomputers at school and home include learning about sophisticated concepts early in life without a great deal of prodding, playing games that expand knowledge, and becoming literate in computer knowledge needed later in life. Includes comments from parents on their experiences with microcomputers and…

  6. Elliptic Length Scales in Laminar, Two-Dimensional Supersonic Flows

    DTIC Science & Technology

    2015-06-01

    sophisticated computational fluid dynamics (CFD) methods. Additionally, for 3D interactions, the length scales would require determination in spanwise as well... Manna, M. “Experimental, Analytical, and Computational Methods Applied to Hypersonic Compression Ramp Flows,” AIAA Journal, Vol. 32, No. 2, Feb. 1994

  7. 32 CFR 806b.35 - Balancing protection.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ..., Computer Security, 5 for procedures on safeguarding personal information in automated records. 5 http://www... automated system with a log-on protocol. Others may require more sophisticated security protection based on the sensitivity of the information. Classified computer systems or those with established audit and...

  8. 32 CFR 806b.35 - Balancing protection.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ..., Computer Security, 5 for procedures on safeguarding personal information in automated records. 5 http://www... automated system with a log-on protocol. Others may require more sophisticated security protection based on the sensitivity of the information. Classified computer systems or those with established audit and...

  9. 32 CFR 806b.35 - Balancing protection.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ..., Computer Security, 5 for procedures on safeguarding personal information in automated records. 5 http://www... automated system with a log-on protocol. Others may require more sophisticated security protection based on the sensitivity of the information. Classified computer systems or those with established audit and...

  10. 32 CFR 806b.35 - Balancing protection.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ..., Computer Security, 5 for procedures on safeguarding personal information in automated records. 5 http://www... automated system with a log-on protocol. Others may require more sophisticated security protection based on the sensitivity of the information. Classified computer systems or those with established audit and...

  11. 32 CFR 806b.35 - Balancing protection.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ..., Computer Security, 5 for procedures on safeguarding personal information in automated records. 5 http://www... automated system with a log-on protocol. Others may require more sophisticated security protection based on the sensitivity of the information. Classified computer systems or those with established audit and...

  12. Synthesizing Biomolecule-based Boolean Logic Gates

    PubMed Central

    Miyamoto, Takafumi; Razavi, Shiva; DeRose, Robert; Inoue, Takanari

    2012-01-01

    One fascinating recent avenue of study in the field of synthetic biology is the creation of biomolecule-based computers. The main components of a computing device consist of an arithmetic logic unit, the control unit, memory, and the input and output devices. Boolean logic gates are at the core of the operational machinery of these parts, hence to make biocomputers a reality, biomolecular logic gates become a necessity. Indeed, with the advent of more sophisticated biological tools, both nucleic acid- and protein-based logic systems have been generated. These devices function in the context of either test tubes or living cells and yield highly specific outputs given a set of inputs. In this review, we discuss various types of biomolecular logic gates that have been synthesized, with particular emphasis on recent developments that promise increased complexity of logic gate circuitry, improved computational speed, and potential clinical applications. PMID:23526588

  13. Synthesizing biomolecule-based Boolean logic gates.

    PubMed

    Miyamoto, Takafumi; Razavi, Shiva; DeRose, Robert; Inoue, Takanari

    2013-02-15

    One fascinating recent avenue of study in the field of synthetic biology is the creation of biomolecule-based computers. The main components of a computing device consist of an arithmetic logic unit, the control unit, memory, and the input and output devices. Boolean logic gates are at the core of the operational machinery of these parts, and hence to make biocomputers a reality, biomolecular logic gates become a necessity. Indeed, with the advent of more sophisticated biological tools, both nucleic acid- and protein-based logic systems have been generated. These devices function in the context of either test tubes or living cells and yield highly specific outputs given a set of inputs. In this review, we discuss various types of biomolecular logic gates that have been synthesized, with particular emphasis on recent developments that promise increased complexity of logic gate circuitry, improved computational speed, and potential clinical applications.
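
    As a purely computational illustration of what such a device evaluates (the gate wiring below is invented, not a published biomolecular circuit), a small network maps a set of molecular inputs to one specific output:

    ```python
    AND = lambda a, b: a and b
    NOT = lambda a: not a

    def circuit(input_a, input_b, inhibitor):
        """High only when both inputs are present and the inhibitor is absent."""
        return AND(AND(input_a, input_b), NOT(inhibitor))

    # Enumerate the full truth table of the three-input circuit.
    for a in (False, True):
        for b in (False, True):
            for inh in (False, True):
                print(a, b, inh, "->", circuit(a, b, inh))
    ```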

  14. Raster Scan Computer Image Generation (CIG) System Based On Refresh Memory

    NASA Astrophysics Data System (ADS)

    Dichter, W.; Doris, K.; Conkling, C.

    1982-06-01

    A full color, Computer Image Generation (CIG) raster visual system has been developed which provides a high level of training sophistication by utilizing advanced semiconductor technology and innovative hardware and firmware techniques. Double buffered refresh memory and efficient algorithms eliminate the problem of conventional raster line ordering by allowing the generated image to be stored in a random fashion. Modular design techniques and simplified architecture provide significant advantages in reduced system cost, standardization of parts, and high reliability. The major system components are a general purpose computer to perform interfacing and data base functions; a geometric processor to define the instantaneous scene image; a display generator to convert the image to a video signal; an illumination control unit which provides final image processing; and a CRT monitor for display of the completed image. Additional optional enhancements include texture generators, increased edge and occultation capability, curved surface shading, and data base extensions.
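
    The double-buffered refresh memory decouples image generation from display refresh, so pixels can be stored as soon as they are computed, in any order. A minimal sketch of that idea (array sizes and names are illustrative only):

    ```python
    import numpy as np

    H, W = 4, 8
    front = np.zeros((H, W), dtype=np.uint8)   # buffer the display refreshes from
    back = np.zeros((H, W), dtype=np.uint8)    # buffer the generator writes into

    rng = np.random.default_rng(2)
    for idx in rng.permutation(H * W):         # pixels computed in arbitrary order
        back[divmod(int(idx), W)] = idx        # stored immediately, no line ordering
    front, back = back, front                  # swap buffers at vertical retrace
    print(front)
    ```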

  15. Space Spurred Computer Graphics

    NASA Technical Reports Server (NTRS)

    1983-01-01

    Dicomed Corporation was asked by NASA in the early 1970s to develop processing capabilities for recording images sent from Mars by Viking spacecraft. The company produced a film recorder which increased the intensity levels and the capability for color recording. This development led to a strong technology base resulting in sophisticated computer graphics equipment. Dicomed systems are used to record CAD (computer aided design) and CAM (computer aided manufacturing) equipment, to update maps and produce computer generated animation.

  16. A computer assisted tutorial for applications of computer spreadsheets in nursing financial management.

    PubMed

    Edwardson, S R; Pejsa, J

    1993-01-01

    A computer-based tutorial for teaching nursing financial management concepts was developed using the macro function of a commercially available spreadsheet program. The goals of the tutorial were to provide students with an experience with spreadsheets as a computer tool and to teach selected financial management concepts. Preliminary results show the tutorial was well received by students. Suggestions are made for overcoming the general lack of computer sophistication among students.

  17. The Application of a Massively Parallel Computer to the Simulation of Electrical Wave Propagation Phenomena in the Heart Muscle Using Simplified Models

    NASA Technical Reports Server (NTRS)

    Karpoukhin, Mikhii G.; Kogan, Boris Y.; Karplus, Walter J.

    1995-01-01

    The simulation of heart arrhythmia and fibrillation is a very important and challenging task. The solution of these problems using sophisticated mathematical models is beyond the capabilities of modern supercomputers. To overcome these difficulties it is proposed to break the whole simulation problem into two tightly coupled stages: generation of the action potential using sophisticated models, and propagation of the action potential using simplified models. The well known simplified models are compared and modified to bring the rate of depolarization and the action potential duration restitution closer to reality. The modified method of lines is used to parallelize the computational process. The conditions for the appearance of 2D spiral waves after the application of a premature beat, and the subsequent traveling of the spiral wave inside the simulated tissue, are studied.

  18. Physical and numerical sources of computational inefficiency in integration of chemical kinetic rate equations: Etiology, treatment and prognosis

    NASA Technical Reports Server (NTRS)

    Pratt, D. T.; Radhakrishnan, K.

    1986-01-01

    The design of a very fast, automatic black-box code for homogeneous, gas-phase chemical kinetics problems requires an understanding of the physical and numerical sources of computational inefficiency. Some major sources reviewed in this report are stiffness of the governing ordinary differential equations (ODE's) and its detection, choice of appropriate method (i.e., integration algorithm plus step-size control strategy), nonphysical initial conditions, and too frequent evaluation of thermochemical and kinetic properties. Specific techniques are recommended (and some advised against) for improving or overcoming the identified problem areas. It is argued that, because reactive species increase exponentially with time during induction, and all species exhibit asymptotic, exponential decay with time during equilibration, exponential-fitted integration algorithms are inherently more accurate for kinetics modeling than classical, polynomial-interpolant methods for the same computational work. But current codes using the exponential-fitted method lack the sophisticated stepsize-control logic of existing black-box ODE solver codes, such as EPISODE and LSODE. The ultimate chemical kinetics code does not exist yet, but the general characteristics of such a code are becoming apparent.
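
    The advantage claimed for exponential-fitted integration is easy to demonstrate on the linear test problem dy/dt = -lam*(y - y_eq), for which the fitted step uses the exact decay factor. A minimal sketch (our own illustration, not code from the report), at a step size where lam*h = 10 and explicit Euler is badly unstable:

    ```python
    import numpy as np

    lam, y_eq, y0, h = 1.0e4, 2.0, 10.0, 1.0e-3      # stiff: lam * h = 10

    def euler_step(y):
        return y + h * (-lam * (y - y_eq))           # explicit Euler: diverges here

    def exp_fitted_step(y):
        return y_eq + (y - y_eq) * np.exp(-lam * h)  # exact for this linear ODE

    ye, yx = y0, y0
    for _ in range(5):
        ye, yx = euler_step(ye), exp_fitted_step(yx)
    print(f"explicit Euler: {ye:.3e}   exponential-fitted: {yx:.6f}   equilibrium: {y_eq}")
    ```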

  19. Computerized Measurement of Negative Symptoms in Schizophrenia

    PubMed Central

    Cohen, Alex S.; Alpert, Murray; Nienow, Tasha M.; Dinzeo, Thomas J.; Docherty, Nancy M.

    2008-01-01

    Accurate measurement of negative symptoms is crucial for understanding and treating schizophrenia. However, current measurement strategies are reliant on subjective symptom rating scales which often have psychometric and practical limitations. Computerized analysis of patients’ speech offers a sophisticated and objective means of evaluating negative symptoms. The present study examined the feasibility and validity of using widely-available acoustic and lexical-analytic software to measure flat affect, alogia and anhedonia (via positive emotion). These measures were examined in their relationships to clinically-rated negative symptoms and social functioning. Natural speech samples were collected and analyzed for 14 patients with clinically-rated flat affect, 46 patients without flat affect and 19 healthy controls. The computer-based inflection and speech rate measures significantly discriminated patients with flat affect from controls, and the computer-based measure of alogia and negative emotion significantly discriminated the flat and non-flat patients. Both the computer and clinical measures of positive emotion/anhedonia corresponded to functioning impairments. The computerized method of assessing negative symptoms offered a number of advantages over the symptom scale-based approach. PMID:17920078

  20. Peering into the Future of Advertising.

    ERIC Educational Resources Information Center

    Hsia, H. J.

    All areas in mass communications (i.e., newspapers, magazines, television, radio, films, photos, and books) will be transformed because of the increasing sophistication of computer users, the decreasing costs for interactive computer systems, and the global adoption of integrated services digital networks (ISDN). ISDN refer to the digitization of…

  1. Improving Undergraduate Computer Instruction: Experiments and Strategies

    ERIC Educational Resources Information Center

    Kalman, Howard K.; Ellis, Maureen L.

    2007-01-01

    Today, undergraduate students enter college with increasingly more sophisticated computer skills compared to their counterparts of 20 years ago. However, many instructors are still using traditional instructional strategies to teach this new generation. This research study discusses a number of strategies that were employed to teach a…

  2. Producing picture-perfect posters.

    PubMed

    Bach, D B; Vellet, A D; Karlik, S J; Downey, D B; Levin, M F; Munk, P L

    1993-06-01

    Scientific posters form an integral part of many radiology meetings. They provide the opportunity for interested parties to read the material at an individualized pace, to study the images in detail, and to return to the exhibit numerous times. Although the content of the poster is undoubtedly its most important component, the visual presentation of the material can enhance or detract from the clarity of the message. With the wide availability of sophisticated computer programs for desktop publishing (DTP), one can now create the poster on a computer monitor with full control of the form as well as the content. This process will result in a professional-appearing poster, yet still allow the author the opportunity to make innumerable revisions, as the poster is visualized in detail on the computer monitor before printing. Furthermore, this process is less expensive than the traditional method of typesetting individual sections separately and mounting them on cardboard for display. The purpose of this article is to present our approach to poster production using commercially available DTP computer programs.

  3. Student Thinking Processes. The Influence of Immediate Computer Access on Students' Thinking. First- and Second-Year Findings. ACOT Report #3.

    ERIC Educational Resources Information Center

    Tierney, Robert J.

    This 2-year longitudinal study explored whether computers promote more sophisticated thinking, and examined how students' thinking changes as they become experienced computer users. The first-year study examined the thinking process of four ninth-grade Apple Classrooms of Tomorrow (ACOT) students. The second-year study continued following these…

  4. Objective and Item Banking Computer Software and Its Use in Comprehensive Achievement Monitoring.

    ERIC Educational Resources Information Center

    Schriber, Peter E.; Gorth, William P.

    The current emphasis on objectives and test item banks for constructing more effective tests is being augmented by increasingly sophisticated computer software. Items can be catalogued in numerous ways for retrieval. The items as well as instructional objectives can be stored and test forms can be selected and printed by the computer. It is also…

  5. An Innovative Learning Model for Computation in First Year Mathematics

    ERIC Educational Resources Information Center

    Tonkes, E. J.; Loch, B. I.; Stace, A. W.

    2005-01-01

    MATLAB is a sophisticated software tool for numerical analysis and visualization. The University of Queensland has adopted MATLAB as its official teaching package across large first year mathematics courses. In the past, the package met severe resistance from students who did not appreciate their computational experience. Several main…

  6. Electronic Networking as an Avenue of Enhanced Professional Interchange.

    ERIC Educational Resources Information Center

    Ratcliff, James L.

    Electronic networking is communication between two or more people that involves one or more telecommunications media. There is electronic networking software available for most computers, including IBM, Apple, and Radio Shack personal computers. Depending upon the sophistication of the hardware and software used, individuals and groups can…

  7. Teaching for CAD Expertise

    ERIC Educational Resources Information Center

    Chester, Ivan

    2007-01-01

    CAD (Computer Aided Design) has now become an integral part of Technology Education. The recent introduction of highly sophisticated, low-cost CAD software and CAM hardware capable of running on desktop computers has accelerated this trend. There is now quite widespread introduction of solid modeling CAD software into secondary schools but how…

  8. Computer Aided Tests.

    ERIC Educational Resources Information Center

    Steinke, Elisabeth

    An approach to using the computer to assemble German tests is described. The purposes of the system would be: (1) an expansion of the bilingual lexical memory bank to list and store idioms of all degrees of difficulty, with frequency data and with complete and sophisticated retrieval possibility for assembly; (2) the creation of an…

  9. Core Community Specifications for Electron Microprobe Operating Systems: Software, Quality Control, and Data Management Issues

    NASA Technical Reports Server (NTRS)

    Fournelle, John; Carpenter, Paul

    2006-01-01

    Modern electron microprobe systems have become increasingly sophisticated. These systems utilize either UNIX or PC computer systems for measurement, automation, and data reduction. These systems have undergone major improvements in processing, storage, display, and communications, due to increased capabilities of hardware and software. Instrument specifications are typically utilized at the time of purchase and concentrate on hardware performance. The microanalysis community includes analysts, researchers, software developers, and manufacturers, who could benefit from exchange of ideas and the ultimate development of core community specifications (CCS) for hardware and software components of microprobe instrumentation and operating systems.

  10. Fast, cheap and in control: spectral imaging with handheld devices

    NASA Astrophysics Data System (ADS)

    Gooding, Edward A.; Deutsch, Erik R.; Huehnerhoff, Joseph; Hajian, Arsen R.

    2017-05-01

    Remote sensing has moved out of the laboratory and into the real world. Instruments using reflection or Raman imaging modalities become faster, cheaper and more powerful annually. Enabling technologies include virtual slit spectrometer design, high power multimode diode lasers, fast open-loop scanning systems, low-noise IR-sensitive array detectors and low-cost computers with touchscreen interfaces. High-volume manufacturing assembles these components into inexpensive portable or handheld devices that make possible sophisticated decision-making based on robust data analytics. Examples include threat, hazmat and narcotics detection; remote gas sensing; biophotonic screening; environmental remediation and a host of other applications.

  11. Electronic prototyping

    NASA Technical Reports Server (NTRS)

    Hopcroft, J.

    1987-01-01

    The potential benefits of automation in space are significant. The science base needed to support this automation not only will help control costs and reduce lead-time in the earth-based design and construction of space stations, but also will advance the nation's capability for computer design, simulation, testing, and debugging of sophisticated objects electronically. Progress in automation will require the ability to electronically represent, reason about, and manipulate objects. Discussed here is the development of representations, languages, editors, and model-driven simulation systems to support electronic prototyping. In particular, it identifies areas where basic research is needed before further progress can be made.

  12. Computer literacy in nursing education. An overview.

    PubMed

    Newbern, V B

    1985-09-01

    Nursing educators are beginning to realize that computer literacy has become a survival skill for the profession. They understand that literacy must be at a level that assures the ability to manage and control the flood of available information and provides an openness and awareness of future technologic possibilities. The computer has been on college campuses for a number of years, used primarily for record storage and retrieval. However, early on a few nurse educators saw the potential for its use as a practice tool. Out of this foresight came both formal and nonformal educational offerings. The evolution of formal coursework in computer literacy has moved from learning about the computer to learning with the computer. Today the use of the computer is expanding geometrically as microcomputers become common. Graduate students and faculty use them for literature searches and data analysis. Undergraduates are routinely using computer-assisted instruction. Coursework in computer technology is fast becoming a given for nursing students and computer competency a requisite for faculty. However, inculcating computer competency in faculty and student repertoires is not an easy task. There are problems related to motivation, resources, and control. Territorial disputes between schools and colleges must be arbitrated. The interface with practice must be addressed. The paucity of adequate software is a real concern. But the potential is enormous, probably restricted only by human creativity. The possibilities for teaching and learning are profound, especially if geographical constraints can be effaced and scarce resources can be shared at minimal cost. Extremely sophisticated research designs and evaluation methodologies can be used routinely.(ABSTRACT TRUNCATED AT 250 WORDS)

  13. An intelligent multi-media human-computer dialogue system

    NASA Technical Reports Server (NTRS)

    Neal, J. G.; Bettinger, K. E.; Byoun, J. S.; Dobes, Z.; Thielman, C. Y.

    1988-01-01

    Sophisticated computer systems are being developed to assist in the human decision-making process for very complex tasks performed under stressful conditions. The human-computer interface is a critical factor in these systems. The human-computer interface should be simple and natural to use, require a minimal learning period, assist the user in accomplishing his task(s) with a minimum of distraction, present output in a form that best conveys information to the user, and reduce cognitive load for the user. In pursuit of this ideal, the Intelligent Multi-Media Interfaces project is devoted to the development of interface technology that integrates speech, natural language text, graphics, and pointing gestures for human-computer dialogues. The objective of the project is to develop interface technology that uses the media/modalities intelligently in a flexible, context-sensitive, and highly integrated manner modelled after the manner in which humans converse in simultaneous coordinated multiple modalities. As part of the project, a knowledge-based interface system, called CUBRICON (CUBRC Intelligent CONversationalist) is being developed as a research prototype. The application domain being used to drive the research is that of military tactical air control.

  14. Using hypermedia to develop an intelligent tutorial/diagnostic system for the Space Shuttle Main Engine Controller Lab

    NASA Technical Reports Server (NTRS)

    Oreilly, Daniel; Williams, Robert; Yarborough, Kevin

    1988-01-01

    This is a tutorial/diagnostic system for training personnel in the use of the Space Shuttle Main Engine Controller (SSMEC) Simulation Lab. It also provides a diagnostic capable of isolating lab failures at least to the major lab component. The system was implemented using Hypercard, which is a hypermedia program running on Apple Macintosh computers. Hypercard proved to be a viable platform for the development and use of sophisticated tutorial systems and moderately capable diagnostic systems. This tutorial/diagnostic system uses the basic Hypercard tools to provide the tutorial. The diagnostic part of the system uses a simple interpreter written in the Hypercard language (Hypertalk) to implement the backward-chaining, rule-based logic commonly found in diagnostic systems using Prolog. Some of the advantages of Hypercard in developing this type of system include sophisticated graphics, animation, sound and voice capabilities, its ability as a hypermedia tool, and its ability to include digitized pictures. The major disadvantage is the slow execution time for evaluation of rules (due to the interpretive processing of the language). Other disadvantages include the limitation on the size of the cards, the lack of support for color and grey-scale graphics, and the lack of selectable fonts for text fields.
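
    A backward chainer of the kind described fits in a few lines: a goal is proved if it is a known fact or if all antecedents of some rule for it can be proved in turn. The rules and fact names below are invented stand-ins, not the lab's actual knowledge base:

    ```python
    RULES = {
        "controller_fault": [["power_ok", "no_response"]],
        "power_ok": [["psu_led_on"]],
        "no_response": [["command_sent", "timeout"]],
    }
    FACTS = {"psu_led_on", "command_sent", "timeout"}

    def prove(goal):
        """True if goal is a fact or the antecedents of any rule for it all hold."""
        if goal in FACTS:
            return True
        return any(all(prove(g) for g in body) for body in RULES.get(goal, []))

    print("controller_fault diagnosed:", prove("controller_fault"))  # True
    ```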

  15. Brain-Computer Interface Based on Generation of Visual Images

    PubMed Central

    Bobrov, Pavel; Frolov, Alexander; Cantor, Charles; Fedulova, Irina; Bakhnyan, Mikhail; Zhavoronkov, Alexander

    2011-01-01

    This paper examines the task of recognizing EEG patterns that correspond to performing three mental tasks: relaxation and imagining two types of pictures, faces and houses. The experiments were performed using two EEG headsets: BrainProducts ActiCap and Emotiv EPOC. The Emotiv headset is becoming widely used in consumer BCI applications, allowing for large-scale EEG experiments in the future. Since classification accuracy significantly exceeded the level of random classification during the first three days of the experiment with the EPOC headset, a control experiment was performed on the fourth day using the ActiCap. The control experiment showed that utilization of high-quality research equipment can enhance classification accuracy (up to 68% in some subjects) and that the accuracy is independent of the presence of EEG artifacts related to blinking and eye movement. This study also shows that a computationally inexpensive Bayesian classifier based on covariance matrix analysis yields classification accuracy in this problem similar to that of a more sophisticated Multi-class Common Spatial Patterns (MCSP) classifier. PMID:21695206
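
    A classifier in the spirit of the one described (a minimal sketch of ours; the study's exact formulation may differ) models each mental state as a zero-mean Gaussian with its own channel covariance and assigns a new epoch to the state under which it is most likely:

    ```python
    import numpy as np

    def avg_log_likelihood(epoch, cov):
        """Mean per-sample log-likelihood of a (channels, samples) epoch under N(0, cov)."""
        _, logdet = np.linalg.slogdet(cov)
        quad = np.einsum("ct,cd,dt->", epoch, np.linalg.inv(cov), epoch) / epoch.shape[1]
        return -0.5 * (logdet + quad)

    def classify(epoch, class_covs):
        return max(class_covs, key=lambda c: avg_log_likelihood(epoch, class_covs[c]))

    # Toy two-channel data: the two "states" differ only in covariance structure.
    rng = np.random.default_rng(3)
    covs = {"faces": np.array([[1.0, 0.8], [0.8, 1.0]]),
            "houses": np.array([[1.0, -0.8], [-0.8, 1.0]])}
    epoch = rng.multivariate_normal([0, 0], covs["faces"], size=200).T  # (2, 200)
    print(classify(epoch, covs))  # expected: faces
    ```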

  16. Spline Trajectory Algorithm Development: Bezier Curve Control Point Generation for UAVs

    NASA Technical Reports Server (NTRS)

    Howell, Lauren R.; Allen, B. Danette

    2016-01-01

    A greater need for sophisticated autonomous piloting systems has risen in direct correlation with the ubiquity of Unmanned Aerial Vehicle (UAV) technology. Whether surveying unknown or unexplored areas of the world, collecting scientific data from regions in which humans are typically incapable of entering, locating lost or wanted persons, or delivering emergency supplies, an unmanned vehicle moving in close proximity to people and other vehicles, should fly smoothly and predictably. The mathematical application of spline interpolation can play an important role in autopilots' on-board trajectory planning. Spline interpolation allows for the connection of Three-Dimensional Euclidean Space coordinates through a continuous set of smooth curves. This paper explores the motivation, application, and methodology used to compute the spline control points, which shape the curves in such a way that the autopilot trajectory is able to meet vehicle-dynamics limitations. The spline algorithms developed used to generate these curves supply autopilots with the information necessary to compute vehicle paths through a set of coordinate waypoints.
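
    The basic operation underneath such trajectory generation is evaluating a Bezier segment from its control points, for example with de Casteljau's algorithm. A minimal sketch (the waypoints and helper name are illustrative; the paper's control-point generation logic is more involved):

    ```python
    def de_casteljau(points, t):
        """Point at parameter t on the Bezier curve defined by the control points."""
        pts = [list(p) for p in points]
        while len(pts) > 1:                        # repeated linear interpolation
            pts = [[(1 - t) * a + t * b for a, b in zip(p, q)]
                   for p, q in zip(pts, pts[1:])]
        return tuple(pts[0])

    # End points are waypoints; the middle two control points shape the approach.
    ctrl = [(0, 0, 10), (5, 0, 12), (10, 5, 14), (10, 10, 15)]  # x, y, altitude
    for t in (0.0, 0.25, 0.5, 0.75, 1.0):
        print(t, de_casteljau(ctrl, t))
    ```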

  17. Force Feedback Joystick

    NASA Technical Reports Server (NTRS)

    1997-01-01

    I-FORCE, a computer peripheral from Immersion Corporation, was derived from virtual environment and human factors research at the Advanced Displays and Spatial Perception Laboratory at Ames Research Center in collaboration with the Stanford University Center for Design Research. Entrepreneur Louis Rosenberg, a former Stanford researcher, now president of Immersion, collaborated with Dr. Bernard Adelstein at Ames on studies of perception in virtual reality. The result was an inexpensive way to incorporate motors and a sophisticated microprocessor into joysticks and other game controllers. These devices can emulate the feel of a car on a skid, a crashing plane, the bounce of a ball, compressed springs, or other physical phenomena. The first products incorporating I-FORCE technology include CH Products' line of FlightStick and CombatStick controllers.

  18. Using Web Speech Technology with Language Learning Applications

    ERIC Educational Resources Information Center

    Daniels, Paul

    2015-01-01

    In this article, the author presents the history of human-to-computer interaction based upon the design of sophisticated computerized speech recognition algorithms. Advancements such as the arrival of cloud-based computing and software like Google's Web Speech API allow anyone with an Internet connection and the Chrome browser to take advantage of…

  19. Evaluation of Imagine Learning English, a Computer-Assisted Instruction of Language and Literacy for Kindergarten Students

    ERIC Educational Resources Information Center

    Longberg, Pauline Oliphant

    2012-01-01

    As computer assisted instruction (CAI) becomes increasingly sophisticated, its appeal as a viable method of literacy intervention with young children continues despite limited evidence of effectiveness. The present study sought to assess the impact of one such CAI program, "Imagine Learning English" (ILE), on both the receptive…

  20. Microcomputer Based Computer-Assisted Learning System: CASTLE.

    ERIC Educational Resources Information Center

    Garraway, R. W. T.

    The purpose of this study was to investigate the extent to which a sophisticated computer assisted instruction (CAI) system could be implemented on the type of microcomputer system currently found in the schools. A method was devised for comparing CAI languages and was used to rank five common CAI languages. The highest ranked language, NATAL,…

  1. Application of the advanced engineering environment for optimization energy consumption in designed vehicles

    NASA Astrophysics Data System (ADS)

    Monica, Z.; Sękala, A.; Gwiazda, A.; Banaś, W.

    2016-08-01

    Nowadays a key issue is reducing the energy consumption of road vehicles, and particular solutions follow different optimization strategies. The most popular, though least sophisticated, is so-called eco-driving, which emphasizes particular driver behaviors. In a more sophisticated variant, the driver is supported by a control system that measures driving parameters and suggests proper operation. Another strategy applies various engineering solutions that aid optimization of energy consumption; such systems take into consideration different parameters measured in real time and then take proper action according to procedures loaded into the vehicle's control computer. The third strategy is based on optimization of the designed vehicle itself, taking into account especially the main sub-systems of the technical means. In this approach the optimal level of energy consumption is obtained as the synergetic result of individually optimizing particular constructional sub-systems of the vehicle. Three main sub-systems can be distinguished: the structural, the drive, and the control sub-system. In the structural sub-system, optimizing energy consumption means optimizing the weight parameter and the aerodynamic parameter; the result is an optimized vehicle body. For the drive sub-system, optimizing energy consumption means optimizing fuel or power consumption using previously elaborated physical models. Finally, optimization of the control sub-system consists in determining optimal control parameters.

  2. Undecidability and Irreducibility Conditions for Open-Ended Evolution and Emergence.

    PubMed

    Hernández-Orozco, Santiago; Hernández-Quiroz, Francisco; Zenil, Hector

    2018-01-01

    Is undecidability a requirement for open-ended evolution (OEE)? Using methods derived from algorithmic complexity theory, we propose robust computational definitions of open-ended evolution and the adaptability of computable dynamical systems. Within this framework, we show that decidability imposes absolute limits on the stable growth of complexity in computable dynamical systems. Conversely, systems that exhibit (strong) open-ended evolution must be undecidable, establishing undecidability as a requirement for such systems. Complexity is assessed in terms of three measures: sophistication, coarse sophistication, and busy beaver logical depth. These three complexity measures assign low complexity values to random (incompressible) objects. As time grows, the stated complexity measures allow for the existence of complex states during the evolution of a computable dynamical system. We show, however, that finding these states involves undecidable computations. We conjecture that for similar complexity measures that assign low complexity values, decidability imposes comparable limits on the stable growth of complexity, and that such behavior is necessary for nontrivial evolutionary systems. We show that the undecidability of adapted states imposes novel and unpredictable behavior on the individuals or populations being modeled. Such behavior is irreducible. Finally, we offer an example of a system, first proposed by Chaitin, that exhibits strong OEE.

  3. A Computer-Controlled Laser Bore Scanner

    NASA Astrophysics Data System (ADS)

    Cheng, Charles C.

    1980-08-01

    This paper describes the design and engineering of a laser scanning system for production applications. The laser scanning techniques, the timing control, the logic design of the pattern recognition subsystem, the digital computer servo control for the loading and unloading of parts, and the laser probe rotation and its synchronization are discussed. The laser inspection machine is designed to automatically inspect the surface of precision-bored holes, such as those in automobile master cylinders, without contacting the machined surface. Although the controls are relatively sophisticated, operation of the laser inspection machine is simple. A laser light beam from a commercially available gas laser, directed through a probe, scans the entire surface of the bore. Reflected light, picked up through optics by photoelectric sensors, generates signals that are fed to a minicomputer for processing. A pattern recognition program in the computer determines acceptance or rejection of the part being inspected. The system's acceptance specifications are adjustable and are set to the user's established tolerances; the computer-controlled laser system is capable of resolving surface finishes from 10 to 75 rms and voids or flaws from 0.0005 to 0.020 inch. Following the successful demonstration of an engineering prototype, the described laser machine has proved its capability to consistently ensure high-quality master brake cylinders. It thus provides a safety improvement for the automotive braking system: flawless, smooth cylinder bores eliminate premature wearing of the rubber seals, resulting in a longer-lasting master brake cylinder and a safer, more reliable automobile. The results obtained from this system, which has been in operation for about a year replacing a tedious manual operation on one of the high-volume lines at the Bendix Hydraulics Division, have been very satisfactory.

  4. Unified, Cross-Platform, Open-Source Library Package for High-Performance Computing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kozacik, Stephen

    Compute power is continually increasing, but this increased performance is largely found in sophisticated computing devices and supercomputer resources that are difficult to use, resulting in under-utilization. We developed a unified set of programming tools that will allow users to take full advantage of the new technology by allowing them to work at a level abstracted away from the platform specifics, encouraging the use of modern computing systems, including government-funded supercomputer facilities.

  5. Cloud computing can simplify HIT infrastructure management.

    PubMed

    Glaser, John

    2011-08-01

    Software as a Service (SaaS), built on cloud computing technology, is emerging as the forerunner in IT infrastructure because it helps healthcare providers reduce capital investments. Cloud computing leads to predictable, monthly, fixed operating expenses for hospital IT staff. Outsourced cloud computing facilities are state-of-the-art data centers boasting some of the most sophisticated networking equipment on the market. The SaaS model helps hospitals safeguard against technology obsolescence, minimizes maintenance requirements, and simplifies management.

  6. Neuroprosthetic Decoder Training as Imitation Learning.

    PubMed

    Merel, Josh; Carlson, David; Paninski, Liam; Cunningham, John P

    2016-05-01

    Neuroprosthetic brain-computer interfaces function via an algorithm which decodes neural activity of the user into movements of an end effector, such as a cursor or robotic arm. In practice, the decoder is often learned by updating its parameters while the user performs a task. When the user's intention is not directly observable, recent methods have demonstrated value in training the decoder against a surrogate for the user's intended movement. Here we show that training a decoder in this way is a novel variant of an imitation learning problem, where an oracle or expert is employed for supervised training in lieu of direct observations, which are not available. Specifically, we describe how a generic imitation learning meta-algorithm, dataset aggregation (DAgger), can be adapted to train a generic brain-computer interface. By deriving existing learning algorithms for brain-computer interfaces in this framework, we provide a novel analysis of regret (an important metric of learning efficacy) for brain-computer interfaces. This analysis allows us to characterize the space of algorithmic variants and bounds on their regret rates. Existing approaches for decoder learning have been performed in the cursor control setting, but the available design principles for these decoders are such that it has been impossible to scale them to naturalistic settings. Leveraging our findings, we then offer an algorithm that combines imitation learning with optimal control, which should allow for training of arbitrary effectors for which optimal control can generate goal-oriented control. We demonstrate this novel and general BCI algorithm with simulated neuroprosthetic control of a 26 degree-of-freedom model of an arm, a sophisticated and realistic end effector.
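
    The structure of the adapted meta-algorithm can be sketched in a few lines (Python; run_trial, oracle, and the decoder's fit interface are hypothetical placeholders, not the paper's API). The essential DAgger idea is that training data are gathered under the current decoder's own control, then labeled with the oracle's surrogate intentions and aggregated across iterations:

        import numpy as np

        def dagger_train(decoder, oracle, run_trial, n_iters=10):
            states, actions = [], []
            for _ in range(n_iters):
                # Roll out the CURRENT decoder; record visited neural states
                neural = run_trial(decoder)
                # Label those states with the oracle's surrogate intentions
                intended = oracle(neural)
                states.append(neural)
                actions.append(intended)
                # Refit on the aggregated dataset from all iterations so far
                decoder.fit(np.vstack(states), np.vstack(actions))
            return decoder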

  7. Tools for Atmospheric Radiative Transfer: Streamer and FluxNet. Revised

    NASA Technical Reports Server (NTRS)

    Key, Jeffrey R.; Schweiger, Axel J.

    1998-01-01

    Two tools for the solution of radiative transfer problems are presented. Streamer is a highly flexible medium-spectral-resolution radiative transfer model based on the plane-parallel theory of radiative transfer. Capable of computing either fluxes or radiances, it is suitable for studying radiative processes at the surface or within the atmosphere and for the development of remote-sensing algorithms. FluxNet is a fast neural network-based implementation of Streamer for computing surface fluxes. It allows for a sophisticated treatment of radiative processes in the analysis of large data sets and potential integration into geophysical models where computational efficiency is an issue. Documentation and tools for the development of alternative versions of FluxNet are available. Collectively, Streamer and FluxNet solve a wide variety of problems related to radiative transfer: Streamer provides the detail and sophistication needed to perform basic research on most aspects of complex radiative processes, while the efficiency and simplicity of FluxNet make it ideal for operational use.

  8. Cognitive engineering models: A prerequisite to the design of human-computer interaction in complex dynamic systems

    NASA Technical Reports Server (NTRS)

    Mitchell, Christine M.

    1993-01-01

    This chapter examines a class of human-computer interaction applications, specifically the design of human-computer interaction for the operators of complex systems. Such systems include space systems (e.g., manned systems such as the Shuttle or space station, and unmanned systems such as NASA scientific satellites), aviation systems (e.g., the flight deck of 'glass cockpit' airplanes or air traffic control) and industrial systems (e.g., power plants, telephone networks, and sophisticated, e.g., 'lights out,' manufacturing facilities). The main body of human-computer interaction (HCI) research complements but does not directly address the primary issues involved in human-computer interaction design for operators of complex systems. Interfaces to complex systems are somewhat special. The 'user' in such systems - i.e., the human operator responsible for safe and effective system operation - is highly skilled, someone who in human-machine systems engineering is sometimes characterized as 'well trained, well motivated'. The 'job' or task context is paramount and, thus, human-computer interaction is subordinate to human job interaction. The design of human interaction with complex systems, i.e., the design of human job interaction, is sometimes called cognitive engineering.

  9. The Detroit Young Adult Asthma Project: Pilot of a Technology-Based Medication Adherence Intervention for African-American Emerging Adults.

    PubMed

    Kolmodin MacDonell, Karen; Naar, Sylvie; Gibson-Scipio, Wanda; Lam, Phebe; Secord, Elizabeth

    2016-10-01

    To conduct a randomized controlled pilot of a multicomponent, technology-based intervention promoting adherence to controller medication in African-American emerging adults with asthma. The intervention consisted of two computer-delivered sessions based on motivational interviewing combined with text messaged reminders between sessions. Participants (N = 49) were 18-29 years old, African-American, with persistent asthma requiring controller medication. Participants had to report poor medication adherence and asthma control. Youth were randomized to receive the intervention or an attention control. Data were collected through computer-delivered self-report questionnaires at baseline, 1, and 3 months. Ecological Momentary Assessment via two-way text messaging was also used to collect "real-time" data on medication use and asthma control. The intervention was feasible and acceptable to the target population, as evidenced by high retention rates and satisfaction scores. Changes in study outcomes from pre- to postintervention favored the intervention, particularly for decrease in asthma symptoms, t (42) = 2.22, p < .05 (Cohen's d = .071). Results suggest that the intervention is feasible and effective. However, findings are preliminary and should be replicated with a larger sample and more sophisticated data analyses. Copyright © 2016 Society for Adolescent Health and Medicine. Published by Elsevier Inc. All rights reserved.

  10. SuML: A Survey Markup Language for Generalized Survey Encoding

    PubMed Central

    Barclay, MW; Lober, WB; Karras, BT

    2002-01-01

    There is a need in clinical and research settings for a sophisticated, generalized, web-based survey tool that supports complex logic, separation of content and presentation, and computable guidelines. There are many commercial and open source survey packages available that provide simple logic; few provide sophistication beyond “goto” statements; none support the use of guidelines. These tools are driven by databases, static web pages, and structured documents using markup languages such as eXtensible Markup Language (XML). We propose a generalized, guideline-aware language and an implementation architecture using open source standards.

  11. Viscous-Inviscid Methods in Unsteady Aerodynamic Analysis of Bio-Inspired Morphing Wings

    NASA Astrophysics Data System (ADS)

    Dhruv, Akash V.

    Flight has been one of the greatest realizations of human imagination, revolutionizing communication and transportation over the years. This has greatly influenced the growth of technology itself, enabling researchers to communicate and share their ideas more effectively, extending the human potential to create more sophisticated systems. While the end product of a sophisticated technology makes our lives easier, its development process presents an array of challenges in itself. In the last decade, scientists and engineers have turned toward bio-inspiration to design more efficient and robust aerodynamic systems that enhance the ability of Unmanned Aerial Vehicles (UAVs) to operate in cluttered environments, where tight maneuverability and controllability are necessary. Effective use of UAVs in domestic airspace will mark the beginning of a new age in communication and transportation. The design of such complex systems necessitates faster and more effective tools for preliminary design investigations, thereby streamlining the design process. This thesis explores the implementation of numerical panel methods for aerodynamic analysis of bio-inspired morphing wings. Numerical panel methods are among the earliest computational methods developed for aerodynamic analysis. Although early editions of the method performed only inviscid analysis, the algorithm has matured over the years through the contributions of prominent aerodynamicists. The method discussed in this thesis is influenced by recent advancements in panel methods and incorporates both viscous and inviscid analysis of multi-flap wings. The surface calculation of aerodynamic coefficients makes this method less computationally expensive than traditional Computational Fluid Dynamics (CFD) solvers, and it is thus effective when both speed and accuracy are desired. The morphing wing design, which consists of sequential feather-like flaps installed over the upper and lower surfaces of a standard airfoil, proves to be an effective alternative to standard control surfaces, increasing the flight capability of bird-scale UAVs. The results obtained for this wing design under various flight and flap configurations provide insight into its aerodynamic behavior, which enhances maneuverability and controllability. The overall method acts as an important tool for creating an aerodynamic database from which to develop a distributed control system for autonomous operation of the multi-flap morphing wing, supporting the use of viscous-inviscid methods as a tool for rapid aerodynamic analysis.
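
    To make the flavor of such panel methods concrete, the sketch below implements the simplest inviscid case, a lumped-vortex method for a flat plate (Python; this textbook example illustrates the panel-method family, not the thesis's multi-flap viscous-inviscid solver). One point vortex is placed at each panel's quarter-chord, flow tangency is enforced at its three-quarter-chord, and the solution recovers the thin-airfoil result Cl = 2*pi*alpha:

        import numpy as np

        def flat_plate_cl(alpha, n_panels=50, chord=1.0, u_inf=1.0):
            dx = chord / n_panels
            xv = (np.arange(n_panels) + 0.25) * dx   # vortex positions
            xc = (np.arange(n_panels) + 0.75) * dx   # collocation points
            # Normal velocity induced at collocation i by a unit vortex at j
            A = 1.0 / (2.0 * np.pi * (xc[:, None] - xv[None, :]))
            # Flow tangency: induced normal velocity must cancel the
            # freestream component U*sin(alpha) through the plate
            gamma = np.linalg.solve(A, u_inf * np.sin(alpha) * np.ones(n_panels))
            return 2.0 * gamma.sum() / (u_inf * chord)   # lift coefficient

        print(flat_plate_cl(np.radians(5.0)))   # ~0.548, i.e. 2*pi*sin(5 deg)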

  12. 32 CFR Appendix E to Part 323 - OMB Guidelines for Matching Programs

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... concern expressed by the Congress in the Privacy Act of 1974 that “the increasing use of computers and sophisticated information technology, while essential to the efficient operation of the Government, has greatly... which a computer is used to compare two or more automated systems of records or a system of records with...

  13. 32 CFR Appendix E to Part 323 - OMB Guidelines for Matching Programs

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... concern expressed by the Congress in the Privacy Act of 1974 that “the increasing use of computers and sophisticated information technology, while essential to the efficient operation of the Government, has greatly... which a computer is used to compare two or more automated systems of records or a system of records with...

  14. 32 CFR Appendix E to Part 323 - OMB Guidelines for Matching Programs

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... concern expressed by the Congress in the Privacy Act of 1974 that “the increasing use of computers and sophisticated information technology, while essential to the efficient operation of the Government, has greatly... which a computer is used to compare two or more automated systems of records or a system of records with...

  15. Light weight portable operator control unit using an Android-enabled mobile phone

    NASA Astrophysics Data System (ADS)

    Fung, Nicholas

    2011-05-01

    There have been large gains in the field of robotics, both in hardware sophistication and technical capabilities. However, as more capable robots have been developed and introduced to battlefield environments, the problem of interfacing with human controllers has proven to be challenging. Particularly in the field of military applications, controller requirements can be stringent and can range from size and power consumption to durability and cost. Traditional operator control units (OCUs) tend to resemble laptop personal computers (PCs), as these devices are mobile and have ample computing power. However, laptop PCs are bulky and have greater power requirements. To approach this problem, a lightweight, inexpensive controller was created based on a mobile phone running the Android operating system. It was designed to control an iRobot Packbot through the Army Research Laboratory (ARL) in-house Agile Computing Infrastructure (ACI). The hardware capabilities of the mobile phone, such as Wi-Fi communications and a touch screen interface, together with the flexibility of the Android operating system, made it a compelling platform. The Android-based OCU offers a more portable package and can be easily carried by a soldier along with normal gear requirements. In addition, one-handed operation of the Android OCU allows the soldier to keep a hand free for greater flexibility. To validate the Android OCU as a capable controller, experimental data were collected evaluating use of the controller and a traditional, tablet PC based OCU. Initial analysis suggests that the Android OCU performed positively in qualitative data collected from participants.

  16. Using a cloud to replenish parched groundwater modeling efforts.

    PubMed

    Hunt, Randall J; Luchette, Joseph; Schreuder, Willem A; Rumbaugh, James O; Doherty, John; Tonkin, Matthew J; Rumbaugh, Douglas B

    2010-01-01

    Groundwater models can be improved by introduction of additional parameter flexibility and simultaneous use of soft-knowledge. However, these sophisticated approaches have high computational requirements. Cloud computing provides unprecedented access to computing power via the Internet to facilitate the use of these techniques. A modeler can create, launch, and terminate "virtual" computers as needed, paying by the hour, and save machine images for future use. Such cost-effective and flexible computing power empowers groundwater modelers to routinely perform model calibration and uncertainty analysis in ways not previously possible.
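
    As a hedged illustration of this workflow with a present-day API (boto3 for Amazon EC2; the AMI id, instance type, and counts are placeholders, and this is not necessarily the toolchain the authors used in 2010):

        import boto3

        ec2 = boto3.client("ec2", region_name="us-east-1")

        # Launch a fleet of hourly-billed "virtual computers" for a
        # calibration or uncertainty-analysis run
        run = ec2.run_instances(ImageId="ami-12345678",
                                InstanceType="c5.xlarge",
                                MinCount=1, MaxCount=8)
        ids = [inst["InstanceId"] for inst in run["Instances"]]

        # ... farm out model runs to the instances here ...

        # Terminate the machines when the analysis finishes so billing stops;
        # a saved machine image lets the same environment be relaunched later
        ec2.terminate_instances(InstanceIds=ids)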

  17. Using a cloud to replenish parched groundwater modeling efforts

    USGS Publications Warehouse

    Hunt, Randall J.; Luchette, Joseph; Schreuder, Willem A.; Rumbaugh, James O.; Doherty, John; Tonkin, Matthew J.; Rumbaugh, Douglas B.

    2010-01-01

    Groundwater models can be improved by introduction of additional parameter flexibility and simultaneous use of soft-knowledge. However, these sophisticated approaches have high computational requirements. Cloud computing provides unprecedented access to computing power via the Internet to facilitate the use of these techniques. A modeler can create, launch, and terminate “virtual” computers as needed, paying by the hour, and save machine images for future use. Such cost-effective and flexible computing power empowers groundwater modelers to routinely perform model calibration and uncertainty analysis in ways not previously possible.

  18. Control technology for future aircraft propulsion systems

    NASA Technical Reports Server (NTRS)

    Zeller, J. R.; Szuch, J. R.; Merrill, W. C.; Lehtinen, B.; Soeder, J. F.

    1984-01-01

    The need for a more sophisticated engine control system is discussed. Improvements in thrust-to-weight ratio demand the manipulation of more control inputs. New technological solutions to the engine control problem are being put into practice. The digital electronic engine control (DEEC) system is a step in the evolution to digital electronic engine control. Technology issues are addressed to ensure growing confidence in sophisticated electronic controls for aircraft turbine engines. The need for a control system architecture that permits propulsion controls to be functionally integrated with other aircraft systems is established. Areas of technology studied include: (1) control design methodology; (2) improved modeling and simulation methods; and (3) implementation technologies. Objectives, results, and future thrusts are summarized.

  19. Test, Control and Monitor System (TCMS) operations plan

    NASA Technical Reports Server (NTRS)

    Macfarlane, C. K.; Conroy, M. P.

    1993-01-01

    The purpose is to provide a clear understanding of the Test, Control and Monitor System (TCMS) operating environment and to describe the method of operations for TCMS. TCMS is a complex and sophisticated checkout system focused on support of the Space Station Freedom Program (SSFP) and related activities. An understanding of the TCMS operating environment is provided and operational responsibilities are defined. NASA and the Payload Ground Operations Contractor (PGOC) will use it as a guide to manage the operation of the TCMS computer systems and associated networks and workstations. All TCMS operational functions are examined. Other plans and detailed operating procedures relating to an individual operational function are referenced within this plan. This plan augments existing Technical Support Management Directives (TSMD's), Standard Practices, and other management documentation which will be followed where applicable.

  20. Netwar

    NASA Astrophysics Data System (ADS)

    Keen, Arthur A.

    2006-04-01

    This paper describes technology being developed at 21st Century Technologies to automate Computer Network Operations (CNO). CNO refers to DoD activities related to Attacking and Defending Computer Networks (CNA & CND). Next generation cyber threats are emerging in the form of powerful Internet services and tools that automate intelligence gathering, planning, testing, and surveillance. We will focus on "Search-Engine Hacks", queries that can retrieve lists of router/switch/server passwords, control panels, accessible cameras, software keys, VPN connection files, and vulnerable web applications. Examples include "Titan Rain" attacks against DoD facilities and the Santy worm, which identifies vulnerable sites by searching Google for URLs containing application-specific strings. This trend will result in increasingly sophisticated and automated intelligence-driven cyber attacks coordinated across multiple domains that are difficult to defeat or even understand with current technology. One traditional method of CNO relies on surveillance detection as an attack predictor. Unfortunately, surveillance detection is difficult because attackers can perform search engine-driven surveillance such as with Google Hacks, and avoid touching the target site. Therefore, attack observables represent only about 5% of the attacker's total attack time, and are inadequate to provide warning. In order to predict attacks and defend against them, CNO must also employ more sophisticated techniques and work to understand the attacker's Motives, Means and Opportunities (MMO). CNO must use automated reconnaissance tools, such as Google, to identify information vulnerabilities, and then utilize Internet tools to observe the intelligence gathering, planning, testing, and collaboration activities that represent 95% of the attacker's effort.

  1. That Elusive, Eclectic Thing Called Thermal Environment: What a Board Should Know About It

    ERIC Educational Resources Information Center

    Schutte, Frederick

    1970-01-01

    Discussion of the proper thermal environment for protecting sophisticated educational equipment such as computers and data-processing machines, magnetic tapes, closed-circuit television, and video tape communications systems.

  2. School Architecture: New Activities Dictate New Designs.

    ERIC Educational Resources Information Center

    Hill, Robert

    1984-01-01

    Changing educational requirements have led to many school building design developments in recent years, including technologically sophisticated music and computer rooms, large school kitchens, and Title IX-mandated equal facilities available for both sexes. (MLF)

  3. Continuous Three-Dimensional Control of a Virtual Helicopter Using a Motor Imagery Based Brain-Computer Interface

    PubMed Central

    Doud, Alexander J.; Lucas, John P.; Pisansky, Marc T.; He, Bin

    2011-01-01

    Brain-computer interfaces (BCIs) allow a user to interact with a computer system using thought. However, only recently have devices capable of providing sophisticated multi-dimensional control been achieved non-invasively. A major goal for non-invasive BCI systems has been to provide continuous, intuitive, and accurate control, while retaining a high level of user autonomy. By employing electroencephalography (EEG) to record and decode sensorimotor rhythms (SMRs) induced from motor imaginations, a consistent, user-specific control signal may be characterized. Utilizing a novel method of interactive and continuous control, we trained three normal subjects to modulate their SMRs to achieve three-dimensional movement of a virtual helicopter that is fast, accurate, and continuous. In this system, the virtual helicopter's forward-backward translation and elevation controls were actuated through the modulation of sensorimotor rhythms that were converted to forces applied to the virtual helicopter at every simulation time step, and the helicopter's angle of left or right rotation was linearly mapped, with higher resolution, from sensorimotor rhythms associated with other motor imaginations. These different resolutions of control allow for interplay between general intent actuation and fine control as is seen in the gross and fine movements of the arm and hand. Subjects controlled the helicopter with the goal of flying through rings (targets) randomly positioned and oriented in a three-dimensional space. The subjects flew through rings continuously, acquiring as many as 11 consecutive rings within a five-minute period. In total, the study group successfully acquired over 85% of presented targets. These results affirm the effective, three-dimensional control of our motor imagery based BCI system, and suggest its potential applications in biological navigation, neuroprosthetics, and other applications. PMID:22046274
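
    In outline, the described control mapping might look like the following sketch (Python; the band-power channels, baselines, and gains are hypothetical stand-ins for the paper's user-specific calibration). Two decoded sensorimotor-rhythm signals act as forces integrated by the simulation at each time step, while a third is mapped linearly, at higher resolution, to the rotation command:

        import numpy as np

        def smr_to_commands(power, baseline, gains):
            # power, baseline: per-channel sensorimotor-rhythm band powers
            delta = np.asarray(power) - np.asarray(baseline)
            f_forward = gains["forward"] * delta[0]   # force, integrated by physics
            f_up = gains["up"] * delta[1]             # force, integrated by physics
            yaw = gains["yaw"] * delta[2]             # direct linear map to rotation
            return f_forward, f_up, yaw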

  4. Synthetic analog computation in living cells.

    PubMed

    Daniel, Ramiz; Rubens, Jacob R; Sarpeshkar, Rahul; Lu, Timothy K

    2013-05-30

    A central goal of synthetic biology is to achieve multi-signal integration and processing in living cells for diagnostic, therapeutic and biotechnology applications. Digital logic has been used to build small-scale circuits, but other frameworks may be needed for efficient computation in the resource-limited environments of cells. Here we demonstrate that synthetic analog gene circuits can be engineered to execute sophisticated computational functions in living cells using just three transcription factors. Such synthetic analog gene circuits exploit feedback to implement logarithmically linear sensing, addition, ratiometric and power-law computations. The circuits exhibit Weber's law behaviour as in natural biological systems, operate over a wide dynamic range of up to four orders of magnitude and can be designed to have tunable transfer functions. Our circuits can be composed to implement higher-order functions that are well described by both intricate biochemical models and simple mathematical functions. By exploiting analog building-block functions that are already naturally present in cells, this approach efficiently implements arithmetic operations and complex functions in the logarithmic domain. Such circuits may lead to new applications for synthetic biology and biotechnology that require complex computations with limited parts, need wide-dynamic-range biosensing or would benefit from the fine control of gene expression.
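
    The arithmetic advantage of the log domain is easy to state: multiplication, division, and power laws become addition, subtraction, and scaling. A toy numerical analogue follows (Python; the exponents stand in for tunable "circuit gains", and the code illustrates the mathematics, not the gene circuits themselves):

        import numpy as np

        def ratiometric(x, y, a=1.0, b=1.0):
            # Log-linear sensing followed by subtraction implements x**a / y**b
            log_out = a * np.log(x) - b * np.log(y)
            return np.exp(log_out)

        print(ratiometric(100.0, 4.0))   # 25.0, valid over a wide dynamic range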

  5. The Operation of a Specialized Scientific Information and Data Analysis Center With Computer Base and Associated Communications Network.

    ERIC Educational Resources Information Center

    Cottrell, William B.; And Others

    The Nuclear Safety Information Center (NSIC) is a highly sophisticated scientific information center operated at Oak Ridge National Laboratory (ORNL) for the U.S. Atomic Energy Commission. Its information file, which consists of both data and bibliographic information, is computer stored and numerous programs have been developed to facilitate the…

  6. Integrating the iPod Touch in K-12 Education: Visions and Vices

    ERIC Educational Resources Information Center

    Banister, Savilla

    2010-01-01

    Advocates of ubiquitous computing have long been documenting classroom benefits of one-to-one ratios of students to handheld or laptop computers. The recent sophisticated capabilities of the iPod Touch, iPhone, and iPad have encouraged further speculation on exactly how K-12 teaching and learning might be energized by such devices. This paper…

  7. Performance of hybrid programming models for multiscale cardiac simulations: preparing for petascale computation.

    PubMed

    Pope, Bernard J; Fitch, Blake G; Pitman, Michael C; Rice, John J; Reumann, Matthias

    2011-10-01

    Future multiscale and multiphysics models that support research into human disease, translational medical science, and treatment can utilize the power of high-performance computing (HPC) systems. We anticipate that computationally efficient multiscale models will require the use of sophisticated hybrid programming models, mixing distributed message-passing processes [e.g., the message-passing interface (MPI)] with multithreading (e.g., OpenMP, Pthreads). The objective of this study is to compare the performance of such hybrid programming models when applied to the simulation of a realistic physiological multiscale model of the heart. Our results show that the hybrid models perform favorably when compared to an implementation using only the MPI and, furthermore, that OpenMP in combination with the MPI provides a satisfactory compromise between performance and code complexity. Having the ability to use threads within MPI processes enables the sophisticated use of all processor cores for both computation and communication phases. Considering that HPC systems in 2012 will have two orders of magnitude more cores than what was used in this study, we believe that faster than real-time multiscale cardiac simulations can be achieved on these systems.
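
    A minimal Python analogue of the hybrid pattern is sketched below (mpi4py for the message-passing layer, a thread pool standing in for OpenMP; the "cell model" is a toy relaxation and all sizes are arbitrary):

        import numpy as np
        from concurrent.futures import ThreadPoolExecutor
        from mpi4py import MPI

        comm = MPI.COMM_WORLD
        rank, size = comm.Get_rank(), comm.Get_size()

        # Each MPI rank owns a block of cells (distributed memory) ...
        n_local = 10000
        v = np.full(n_local, -80.0)               # toy membrane voltages (mV)

        def update(idx):
            # ... while threads update cells within the rank (shared memory)
            v[idx] += 0.1 * (0.0 - v[idx])        # toy relaxation step

        chunks = np.array_split(np.arange(n_local), 8)
        with ThreadPoolExecutor(max_workers=8) as pool:
            list(pool.map(update, chunks))

        # Communication phase: a global reduction stands in for halo exchange
        v_mean = comm.allreduce(v.mean(), op=MPI.SUM) / size
        if rank == 0:
            print("global mean voltage:", v_mean)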

  8. Software systems for modeling articulated figures

    NASA Technical Reports Server (NTRS)

    Phillips, Cary B.

    1989-01-01

    Research in computer animation and simulation of human task performance requires sophisticated geometric modeling and user interface tools. The software for a research environment should present the programmer with a powerful but flexible substrate of facilities for displaying and manipulating geometric objects, yet insure that future tools have a consistent and friendly user interface. Jack is a system which provides a flexible and extensible programmer and user interface for displaying and manipulating complex geometric figures, particularly human figures in a 3D working environment. It is a basic software framework for high-performance Silicon Graphics IRIS workstations for modeling and manipulating geometric objects in a general but powerful way. It provides a consistent and user-friendly interface across various applications in computer animation and simulation of human task performance. Currently, Jack provides input and control for applications including lighting specification and image rendering, anthropometric modeling, figure positioning, inverse kinematics, dynamic simulation, and keyframe animation.

  9. Distributive, Non-destructive Real-time System and Method for Snowpack Monitoring

    NASA Technical Reports Server (NTRS)

    Frolik, Jeff (Inventor); Skalka, Christian (Inventor)

    2013-01-01

    A ground-based system that provides quasi real-time measurement and collection of snow-water equivalent (SWE) data in remote settings is provided. The disclosed invention is significantly less expensive and easier to deploy than current methods and less susceptible to terrain and snow bridging effects. Embodiments of the invention include remote data recovery solutions. Compared to current infrastructure using existing SWE technology, the disclosed invention allows more SWE sites to be installed for similar cost and effort, in a greater variety of terrain; thus, enabling data collection at improved spatial resolutions. The invention integrates a novel computational architecture with new sensor technologies. The invention's computational architecture is based on wireless sensor networks, comprised of programmable, low-cost, low-powered nodes capable of sophisticated sensor control and remote data communication. The invention also includes measuring attenuation of electromagnetic radiation, an approach that is immune to snow bridging and significantly reduces sensor footprints.
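
    The measurement principle admits a one-line inversion under an assumed exponential (Beer-Lambert) attenuation law; the sketch below is illustrative only, with a hypothetical calibration coefficient rather than the patent's actual sensor model:

        import math

        def swe_from_attenuation(received, reference, k=0.05):
            # I = I0 * exp(-k * SWE)  =>  SWE = -ln(I / I0) / k
            # k: hypothetical calibration constant, 1/cm of snow-water equivalent
            return -math.log(received / reference) / k

        print(swe_from_attenuation(received=0.37, reference=1.0))   # ~19.9 cm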

  10. The process group approach to reliable distributed computing

    NASA Technical Reports Server (NTRS)

    Birman, Kenneth P.

    1992-01-01

    The difficulty of developing reliable distributed software is an impediment to applying distributed computing technology in many settings. Experience with the ISIS system suggests that a structured approach based on virtually synchronous process groups yields systems that are substantially easier to develop, exploit sophisticated forms of cooperative computation, and achieve high reliability. Six years of research on ISIS are reviewed, describing the model, its implementation challenges, and the types of applications to which ISIS has been applied.

  11. Computation of repetitions and regularities of biologically weighted sequences.

    PubMed

    Christodoulakis, M; Iliopoulos, C; Mouchard, L; Perdikuri, K; Tsakalidis, A; Tsichlas, K

    2006-01-01

    Biological weighted sequences are used extensively in molecular biology as profiles for protein families, in the representation of binding sites and often for the representation of sequences produced by a shotgun sequencing strategy. In this paper, we address three fundamental problems in the area of biologically weighted sequences: (i) computation of repetitions, (ii) pattern matching, and (iii) computation of regularities. Our algorithms can be used as basic building blocks for more sophisticated algorithms applied on weighted sequences.
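
    Under the usual definition, a pattern occurs at a position of a weighted sequence if the product of the per-position character probabilities meets a threshold (often 1/k). A naive matcher makes the definition concrete (Python; this brute-force sketch illustrates problem (ii) only, not the paper's efficient algorithms):

        def occurrences(weighted_seq, pattern, threshold=1.0 / 8):
            # weighted_seq: list of dicts, e.g. [{"A": 0.5, "C": 0.5}, ...],
            # giving each character's probability at each position
            hits = []
            for i in range(len(weighted_seq) - len(pattern) + 1):
                p = 1.0
                for j, ch in enumerate(pattern):
                    p *= weighted_seq[i + j].get(ch, 0.0)
                    if p < threshold:
                        break
                if p >= threshold:
                    hits.append((i, p))
            return hits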

  12. Extensible Computational Chemistry Environment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2012-08-09

    ECCE provides a sophisticated graphical user interface, scientific visualization tools, and the underlying data management framework enabling scientists to efficiently set up calculations and store, retrieve, and analyze the rapidly growing volumes of data produced by computational chemistry studies. ECCE was conceived as part of the Environmental Molecular Sciences Laboratory construction to solve the problem of researchers being able to effectively utilize complex computational chemistry codes and massively parallel high performance compute resources. Bringing the power of these codes and resources to the desktops of researchers, and thus enabling world-class research without users needing a detailed understanding of the inner workings of either the theoretical codes or the supercomputers needed to run them, was a grand challenge problem in the original version of the EMSL. ECCE allows collaboration among researchers using a web-based data repository where the inputs and results for all calculations done within ECCE are organized. ECCE is a first-of-its-kind end-to-end problem solving environment for all phases of computational chemistry research: setting up calculations with sophisticated GUI and direct manipulation visualization tools, submitting and monitoring calculations on remote high performance supercomputers without having to be familiar with the details of using these compute resources, and performing results visualization and analysis including creating publication quality images. ECCE is a suite of tightly integrated applications that are employed as the user moves through the modeling process.

  13. Verification for measurement-only blind quantum computing

    NASA Astrophysics Data System (ADS)

    Morimae, Tomoyuki

    2014-06-01

    Blind quantum computing is a new secure quantum computing protocol where a client who does not have any sophisticated quantum technology can delegate her quantum computing to a server without leaking any privacy. It is known that a client who has only a measurement device can perform blind quantum computing [T. Morimae and K. Fujii, Phys. Rev. A 87, 050301(R) (2013), 10.1103/PhysRevA.87.050301]. It has been an open problem whether the protocol can enjoy the verification, i.e., the ability of the client to check the correctness of the computing. In this paper, we propose a protocol of verification for the measurement-only blind quantum computing.

  14. Neuroprosthetic Decoder Training as Imitation Learning

    PubMed Central

    Merel, Josh; Paninski, Liam; Cunningham, John P.

    2016-01-01

    Neuroprosthetic brain-computer interfaces function via an algorithm which decodes neural activity of the user into movements of an end effector, such as a cursor or robotic arm. In practice, the decoder is often learned by updating its parameters while the user performs a task. When the user’s intention is not directly observable, recent methods have demonstrated value in training the decoder against a surrogate for the user’s intended movement. Here we show that training a decoder in this way is a novel variant of an imitation learning problem, where an oracle or expert is employed for supervised training in lieu of direct observations, which are not available. Specifically, we describe how a generic imitation learning meta-algorithm, dataset aggregation (DAgger), can be adapted to train a generic brain-computer interface. By deriving existing learning algorithms for brain-computer interfaces in this framework, we provide a novel analysis of regret (an important metric of learning efficacy) for brain-computer interfaces. This analysis allows us to characterize the space of algorithmic variants and bounds on their regret rates. Existing approaches for decoder learning have been performed in the cursor control setting, but the available design principles for these decoders are such that it has been impossible to scale them to naturalistic settings. Leveraging our findings, we then offer an algorithm that combines imitation learning with optimal control, which should allow for training of arbitrary effectors for which optimal control can generate goal-oriented control. We demonstrate this novel and general BCI algorithm with simulated neuroprosthetic control of a 26 degree-of-freedom model of an arm, a sophisticated and realistic end effector. PMID:27191387

  15. Transmission loss optimization in acoustic sandwich panels

    NASA Astrophysics Data System (ADS)

    Makris, S. E.; Dym, C. L.; MacGregor Smith, J.

    1986-06-01

    Considering the sound transmission loss (TL) of a sandwich panel as the single objective, different optimization techniques are examined and a sophisticated computer program is used to find the optimum TL. Also, for one of the possible case studies, core optimization, closed-form expressions relating TL to the core-design variables are given for different sets of skins. The significance of these functional relationships lies in the fact that the panel designer can bypass a sophisticated software package and still assess explicitly the dependence of TL on core thickness and density.

  16. Does a better model yield a better argument? An info-gap analysis

    NASA Astrophysics Data System (ADS)

    Ben-Haim, Yakov

    2017-04-01

    Theories, models and computations underlie reasoned argumentation in many areas. The possibility of error in these arguments, though of low probability, may be highly significant when the argument is used in predicting the probability of rare high-consequence events. This implies that the choice of a theory, model or computational method for predicting rare high-consequence events must account for the probability of error in these components. However, error may result from lack of knowledge or surprises of various sorts, and predicting the probability of error is highly uncertain. We show that the putatively best, most innovative and sophisticated argument may not actually have the lowest probability of error. Innovative arguments may entail greater uncertainty than more standard but less sophisticated methods, creating an innovation dilemma in formulating the argument. We employ info-gap decision theory to characterize and support the resolution of this problem and present several examples.
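
    In info-gap terms, the relevant quantity is the robustness of a decision q (here, a choice of theory, model, or computational method): the greatest horizon of uncertainty up to which the performance requirement is still met. A minimal statement in the theory's usual notation (R the performance function, r_c the critical threshold, U(alpha) the nested family of uncertainty sets; the notation is assumed from the info-gap literature, not taken from this abstract):

        \hat{\alpha}(q, r_{\mathrm{c}}) = \max\left\{ \alpha \ge 0 : \min_{u \in \mathcal{U}(\alpha)} R(q, u) \ge r_{\mathrm{c}} \right\}

    An innovation dilemma arises when the putatively better argument has higher nominal performance R but lower robustness than the standard one.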

  17. Computer programs: Information retrieval and data analysis, a compilation

    NASA Technical Reports Server (NTRS)

    1972-01-01

    The items presented in this compilation are divided into two sections. Section one treats of computer usage devoted to the retrieval of information that affords the user rapid entry into voluminous collections of data on a selective basis. Section two is a more generalized collection of computer options for the user who needs to take such data and reduce it to an analytical study within a specific discipline. These programs, routines, and subroutines should prove useful to users who do not have access to more sophisticated and expensive computer software.

  18. Interactive Forecasting with the National Weather Service River Forecast System

    NASA Technical Reports Server (NTRS)

    Smith, George F.; Page, Donna

    1993-01-01

    The National Weather Service River Forecast System (NWSRFS) consists of several major hydrometeorologic subcomponents to model the physics of the flow of water through the hydrologic cycle. The entire NWSRFS currently runs in both mainframe and minicomputer environments, using command oriented text input to control the system computations. As computationally powerful and graphically sophisticated scientific workstations became available, the National Weather Service (NWS) recognized that a graphically based, interactive environment would enhance the accuracy and timeliness of NWS river and flood forecasts. Consequently, the operational forecasting portion of the NWSRFS has been ported to run under a UNIX operating system, with X windows as the display environment on a system of networked scientific workstations. In addition, the NWSRFS Interactive Forecast Program was developed to provide a graphical user interface to allow the forecaster to control NWSRFS program flow and to make adjustments to forecasts as necessary. The potential market for water resources forecasting is immense and largely untapped. Any private company able to market the river forecasting technologies currently developed by the NWS Office of Hydrology could provide benefits to many information users and profit from providing these services.

  19. Graphical Requirements for Force Level Planning. Volume 2

    DTIC Science & Technology

    1991-09-01

    …technology review includes graphics algorithms, computer hardware, computer software, and design methodologies. The technology can either exist today or… level graphics language. 7.4 User Interface Design Tools: As user interfaces have become more sophisticated, they have become harder to develop. X11… Stephen M. Pizer, editors, Proceedings 1986 Workshop on Interactive 3D Graphics, October 1986. J. S. Dumas, Designing User Interface Software, Prentice…

  20. Sensorless control for a sophisticated artificial myocardial contraction by using shape memory alloy fibre.

    PubMed

    Shiraishi, Y; Yambe, T; Saijo, Y; Sato, F; Tanaka, A; Yoshizawa, M; Sugai, T K; Sakata, R; Luo, Y; Park, Y; Uematsu, M; Umezu, M; Fujimoto, T; Masumoto, N; Liu, H; Baba, A; Konno, S; Nitta, S; Imachi, K; Tabayashi, K; Sasada, H; Homma, D

    2008-01-01

    The authors have been developing an artificial myocardium capable of supporting natural contractile function from the outside of the ventricle. The system was originally designed using sophisticated covalent shape memory alloy fibres and, because it acts on the outside of the ventricle, its surface does not raise blood-compatibility concerns. The purpose of our study on the development of the artificial myocardium was to achieve assistance of myocardial functional reproduction by integrative small mechanical elements without sensors, so that effective circulatory support could be accomplished. In this study, the authors fabricated a prototype artificial myocardial assist unit composed of a sophisticated shape memory alloy fibre (Biometal) 100 microns in diameter and examined the mechanical response of each unit using a pulse width modulation (PWM) control method. Prior to the evaluation of dynamic characteristics, the relationship between strain and electric resistance, as well as the initial response of each unit, was obtained. The component for PWM control was designed to regulate myocardial contractile function; it consisted of an originally designed RISC microcomputer with displacement as its input, and its output signal was controlled by the pulse width modulation method. As a result, the optimal PWM parameters were confirmed and the fibrous displacement was successfully regulated under different heat transfer conditions simulating internal body temperature as well as under bias tensile loading. It was therefore indicated that this control approach might be applied to more sophisticated ventricular passive or active restraint by the artificial myocardium on physiological demand.
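
    A displacement-feedback PWM loop of the kind described can be sketched in a few lines (Python; the proportional gain, strain targets, and duty-cycle limits are hypothetical, whereas the actual unit estimated strain from the fibre's electric resistance on the RISC microcomputer):

        def pwm_duty(target_disp, measured_disp, k_p=5.0, d_min=0.0, d_max=0.9):
            # Larger contraction error -> larger "on" fraction of each PWM
            # period -> more Joule heating -> more shape-memory contraction
            error = target_disp - measured_disp
            return min(d_max, max(d_min, k_p * error))

        duty = pwm_duty(target_disp=0.04, measured_disp=0.01)   # strain, 4% vs 1%
        print(duty)   # 0.15: heat the fibre for 15% of each PWM period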

  1. Use of Microcomputers and Personal Computers in Pacing

    PubMed Central

    Sasmor, L.; Tarjan, P.; Mumford, V.; Smith, E.

    1983-01-01

    This paper describes the evolution from the early discrete-circuit pacemakers of the past to the sophisticated microprocessor-based pacemakers of today. The necessary computerized supporting instrumentation is also described. Technological and economic reasons for this evolution are discussed.

  2. Adaptive Language Games with Robots

    NASA Astrophysics Data System (ADS)

    Steels, Luc

    2010-11-01

    This paper surveys recent research into language evolution using computer simulations and robotic experiments. This field has made tremendous progress in the past decade, going from simple simulations of lexicon formation with animal-like cybernetic robots to sophisticated grammatical experiments with humanoid robots.

  3. Toward a New Voice

    ERIC Educational Resources Information Center

    Murphy, Patti

    2007-01-01

    Frequently linked to sophisticated speech communication devices resembling laptop computers, augmentative and alternative communication (AAC) encompasses a spectrum of tools and strategies ranging from pointing, writing, gestures, and facial expressions to sign language, manual alphabet boards, picture symbols, and photographs used to convey…

  4. The journey from forensic to predictive materials science using density functional theory

    DOE PAGES

    Schultz, Peter A.

    2017-09-12

    Approximate methods for electronic structure, implemented in sophisticated computer codes and married to ever-more powerful computing platforms, have become invaluable in chemistry and materials science. The maturing and consolidation of quantum chemistry codes since the 1980s, based upon explicitly correlated electronic wave functions, has made them a staple of modern molecular chemistry. Here, the impact of first principles electronic structure in physics and materials science had lagged owing to the extra formal and computational demands of bulk calculations.

  5. The journey from forensic to predictive materials science using density functional theory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schultz, Peter A.

    Approximate methods for electronic structure, implemented in sophisticated computer codes and married to ever-more powerful computing platforms, have become invaluable in chemistry and materials science. The maturing and consolidation of quantum chemistry codes since the 1980s, based upon explicitly correlated electronic wave functions, has made them a staple of modern molecular chemistry. Here, the impact of first principles electronic structure in physics and materials science had lagged owing to the extra formal and computational demands of bulk calculations.

  6. Physical Impairment

    NASA Astrophysics Data System (ADS)

    Trewin, Shari

    Many health conditions can lead to physical impairments that impact computer and Web access. Musculoskeletal conditions such as arthritis and cumulative trauma disorders can make movement stiff and painful. Movement disorders such as tremor, Parkinsonism and dystonia affect the ability to control movement, or to prevent unwanted movements. Often, the same underlying health condition also has sensory or cognitive effects. People with dexterity impairments may use a standard keyboard and mouse, or any of a wide range of alternative input mechanisms. Examples are given of the diverse ways that specific dexterity impairments and input mechanisms affect the fundamental actions of Web browsing. As the Web becomes increasingly sophisticated, and physically demanding, new access features at the Web browser and page level will be necessary.

  7. Advanced construction management for lunar base construction - Surface operations planner

    NASA Technical Reports Server (NTRS)

    Kehoe, Robert P.

    1992-01-01

    The study proposes a conceptual solution and lays the framework for developing a new, sophisticated and intelligent tool for a lunar base construction crew to use. This concept integrates expert systems for critical decision making, virtual reality for training, logistics and laydown optimization, automated productivity measurements, and an advanced scheduling tool to form a unique new planning tool. The concept features extensive use of computers and expert systems software to support the actual work, while allowing the crew to control the project from the lunar surface. Consideration is given to a logistics data base, laydown area management, flexible critical progress scheduler, video simulation of assembly tasks, and assembly information and tracking documentation.

  8. Catalytic molecular logic devices by DNAzyme displacement.

    PubMed

    Brown, Carl W; Lakin, Matthew R; Stefanovic, Darko; Graves, Steven W

    2014-05-05

    Chemical reactions catalyzed by DNAzymes offer a route to programmable modification of biomolecules for therapeutic purposes. To this end, we have developed a new type of catalytic DNA-based logic gates in which DNAzyme catalysis is controlled via toehold-mediated strand displacement reactions. We refer to these as DNAzyme displacement gates. The use of toeholds to guide input binding provides a favorable pathway for input recognition, and the innate catalytic activity of DNAzymes allows amplification of nanomolar input concentrations. We demonstrate detection of arbitrary input sequences by rational introduction of mismatched bases into inhibitor strands. Furthermore, we illustrate the applicability of DNAzyme displacement to compute logic functions involving multiple logic gates. This work will enable sophisticated logical control of a range of biochemical modifications, with applications in pathogen detection and autonomous theranostics. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
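
    An abstract truth-table model conveys the gate logic (Python; the species names and Boolean abstraction are illustrative, not the paper's sequences or kinetics). In an AND gate of this kind, the DNAzyme is blocked by two inhibitor strands, each removable only by its matching input through a toehold-mediated displacement, so catalysis proceeds only when both inputs are present:

        def and_gate(input_a, input_b):
            inhibitor_a_bound = not input_a    # input A displaces inhibitor A
            inhibitor_b_bound = not input_b    # input B displaces inhibitor B
            # The DNAzyme cleaves its substrate (amplifying the signal)
            # only when neither inhibitor remains bound
            return not (inhibitor_a_bound or inhibitor_b_bound)

        for a in (False, True):
            for b in (False, True):
                print(a, b, "->", and_gate(a, b))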

  9. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhao, X.; Ramakrishnan, S.; Lawson, J.; Neumeyer, C.; Marsala, R.; Schneider, H.; Engineering Operations

    NSTX at Princeton Plasma Physics Laboratory (PPPL) requires a sophisticated plasma positioning control system for stable plasma operation. TF magnetic coils and PF magnetic coils provide the electromagnetic fields to position and shape the plasma vertically and horizontally, respectively. NSTX utilizes twenty-six coil power supplies to establish and initiate electromagnetic fields through the coil system for plasma control. A power protection and interlock system is utilized to detect power system faults and protect the TF and PF coils against excessive electromechanical forces, overheating, and overcurrent. Upon detecting any fault condition the power system is restricted: it is either prevented from initializing or suppressed to de-energize coil power during pulsing. Power fault status is immediately reported to the computer system. This paper describes the design and operation of NSTX's protection and interlocking system and possible future expansion.

  10. Using genetic information while protecting the privacy of the soul.

    PubMed

    Moor, J H

    1999-01-01

    Computing plays an important role in genetics (and vice versa). Theoretically, computing provides a conceptual model for the function and malfunction of our genetic machinery. Practically, contemporary computers and robots equipped with advanced algorithms make the revelation of the complete human genome imminent--computers are about to reveal our genetic souls for the first time. Ethically, computers help protect privacy by restricting access in sophisticated ways to genetic information. But the inexorable fact that computers will increasingly collect, analyze, and disseminate abundant amounts of genetic information made available through the genetic revolution, not to mention that inexpensive computing devices will make genetic information gathering easier, underscores the need for strong and immediate privacy legislation.

  11. Preliminary supersonic flight test evaluation of performance seeking control

    NASA Technical Reports Server (NTRS)

    Orme, John S.; Gilyard, Glenn B.

    1993-01-01

    Digital flight and engine control, powerful onboard computers, and sophisticated control techniques may improve aircraft performance by maximizing fuel efficiency, maximizing thrust, and extending engine life. An adaptive performance seeking control system for optimizing the quasi-steady-state performance of an F-15 aircraft was developed and flight tested. This system has three optimization modes: minimum fuel, maximum thrust, and minimum fan turbine inlet temperature. Tests of the minimum fuel and fan turbine inlet temperature modes were performed at a constant thrust. Supersonic single-engine flight tests of the three modes were conducted using varied afterburning power settings. At supersonic conditions, the performance seeking control law optimizes the integrated airframe, inlet, and engine. At subsonic conditions, only the engine is optimized. Supersonic flight tests showed thrust improvements of 9 percent, fuel savings of 8 percent, and reductions of up to 85 deg R in turbine temperatures across the three modes. The supersonic performance seeking control structure is described and preliminary results of supersonic performance seeking control tests are given. These findings have implications for improving the performance of civilian and military aircraft.

  12. Cutter Resource Effectiveness Evaluation (CREE) Program : A Guide for Users and Analysts

    DOT National Transportation Integrated Search

    1978-03-01

    The Cutter Resource Effectiveness Evaluation (CREE) project has developed a sophisticated, user-oriented computer model which can evaluate the effectiveness of any existing Coast Guard craft, or the effectiveness of any of a number of proposed altern...

  13. At Home in the Cell.

    ERIC Educational Resources Information Center

    Flannery, Maura C.

    1999-01-01

    Argues that biologists' understanding of the cell has become richer over the past 30 years. Describes how genetic engineering and sophisticated computer technology have provided an increased knowledge of genes, gene products, components of cells, and the structure and function of proteins. (CCM)

  14. The High-Tech Surge. Focus on Careers.

    ERIC Educational Resources Information Center

    Vo, Chuong-Dai Hong

    1996-01-01

    The computer industry is growing at a phenomenal rate as technology advances and prices fall, stimulating unprecedented demand from business, government, and individuals. Higher levels of education will be the key to securing employment as organizations increasingly rely on sophisticated technology. (Author)

  15. Chandelier: Picturing Potential

    ERIC Educational Resources Information Center

    Tebbs, Trevor J.

    2014-01-01

    The author--artist, scientist, educator, and visual-spatial thinker--describes the genesis of, and provides insight into, an innovative, strength-based, visually dynamic computer-aided communication system called Chandelier©. This system is the powerful combination of a sophisticated, user-friendly software program and an organizational…

  16. Ranking network of a captive rhesus macaque society: a sophisticated corporative kingdom.

    PubMed

    Fushing, Hsieh; McAssey, Michael P; Beisner, Brianne; McCowan, Brenda

    2011-03-15

    We develop a three-step computing approach to explore a hierarchical ranking network for a society of captive rhesus macaques. The computed network is sufficiently informative to address the question: Is the ranking network for a rhesus macaque society more like a kingdom or a corporation? The three steps are devised to deal with the tremendous challenges stemming from the transitivity of dominance as a necessary constraint on the ranking relations among all individual macaques, and from the very high sampling heterogeneity in the behavioral conflict data. The first step simultaneously infers the ranking potentials among all network members, which requires accommodation of the heterogeneous measurement error inherent in behavioral data. The second step estimates the social rank of all individuals by minimizing the network-wide errors in the ranking potentials. The third step provides a way to compute confidence bounds for selected empirical features in the social ranking. We apply this approach to two sets of conflict data pertaining to two captive societies of adult rhesus macaques. The resultant ranking network for each society is found to be a sophisticated mixture of both a kingdom and a corporation. Also, for validation purposes, we reanalyze conflict data from twenty longhorn sheep and demonstrate that our three-step approach is capable of correctly computing a ranking network by eliminating all ranking error.
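
    The paper's three-step procedure is statistical and tied to its data; as a loose stand-in, the sketch below computes a classical David's-score ranking from a hypothetical win/loss matrix, conveying only the flavor of turning dyadic conflict counts into a dominance order:

    ```python
    import numpy as np

    # Illustration only: David's score from an invented 3-individual win/loss
    # matrix, a simple substitute for the paper's three-step inference.
    wins = np.array([[0, 5, 8],
                     [1, 0, 6],
                     [0, 2, 0]], dtype=float)  # wins[i, j]: conflicts i won over j

    totals = wins + wins.T                     # conflicts observed per dyad
    with np.errstate(invalid="ignore"):
        P = np.where(totals > 0, wins / totals, 0.0)  # dyadic win proportions

    w  = P.sum(axis=1)         # summed win proportions
    l  = P.sum(axis=0)         # summed loss proportions
    w2 = P @ w                 # wins weighted by opponents' success
    l2 = P.T @ l               # losses weighted by opponents' failure
    david = w + w2 - l - l2    # higher score = higher rank

    print(np.argsort(-david))  # individuals from highest to lowest rank
    ```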

  17. When Machines Think: Radiology's Next Frontier.

    PubMed

    Dreyer, Keith J; Geis, J Raymond

    2017-12-01

    Artificial intelligence (AI), machine learning, and deep learning are terms now seen frequently, all of which refer to computer algorithms that change as they are exposed to more data. Many of these algorithms are surprisingly good at recognizing objects in images. Large amounts of machine-consumable digital data, increased and cheaper computing power, and increasingly sophisticated statistical models combine to enable machines to find patterns in data in ways that are not only cost-effective but also potentially beyond humans' abilities. Building an AI algorithm can be surprisingly easy. Understanding the associated data structures and statistics, on the other hand, is often difficult and obscure. Converting the algorithm into a sophisticated product that works consistently in broad, general clinical use is complex and incompletely understood. Showing that these AI products reduce costs and improve outcomes will require clinical translation and industrial-grade integration into routine workflow. Radiology has the chance to leverage AI to become a center of intelligently aggregated, quantitative, diagnostic information. Centaur radiologists, formed as a synergy of human plus computer, will provide interpretations using data extracted from images by humans and image-analysis computer algorithms, as well as the electronic health record, genomics, and other disparate sources. These interpretations will form the foundation of precision health care, or care customized to an individual patient. © RSNA, 2017.

  18. Volcano Monitoring: A Case Study in Pervasive Computing

    NASA Astrophysics Data System (ADS)

    Peterson, Nina; Anusuya-Rangappa, Lohith; Shirazi, Behrooz A.; Song, Wenzhan; Huang, Renjie; Tran, Daniel; Chien, Steve; Lahusen, Rick

    Recent advances in wireless sensor network technology have provided robust and reliable solutions for sophisticated pervasive computing applications such as environmental monitoring of inhospitable terrain. We present a case study in developing a real-time pervasive computing system, OASIS (Optimized Autonomous Space In situ Sensorweb), which combines ground assets (a sensor network) and space assets (NASA's Earth Observing-1 (EO-1) satellite) to monitor volcanic activity at Mount St. Helens. OASIS's primary goals are: to integrate complementary space and in situ ground sensors into an interactive and autonomous sensorweb, to optimize power and communication resource management of the sensorweb, and to provide mechanisms for seamless and scalable fusion of future space and in situ components. The OASIS in situ ground sensor network development addresses issues related to power management, bandwidth management, quality-of-service management, topology and routing management, and test-bed design. The space segment development consists of EO-1 architectural enhancements, feedback of EO-1 data into the in situ component, command and control integration, data ingestion and dissemination, and field demonstrations.

  19. Microgravity

    NASA Image and Video Library

    1999-05-26

    Looking for a faster computer? How about an optical computer that processes data streams simultaneously and works with the speed of light? In space, NASA researchers have formed optical thin-films. By turning these thin-films into very fast optical computer components, scientists could improve computer tasks, such as pattern recognition. Dr. Hossin Abdeldayem, physicist at NASA/Marshall Space Flight Center (MSFC) in Huntsville, AL, is working with lasers as part of an optical system for pattern recognition. These systems can be used for automated fingerprinting, photographic scanning, and the development of sophisticated artificial intelligence systems that can learn and evolve. Photo credit: NASA/Marshall Space Flight Center (MSFC)

  20. Growth and yield models for central hardwoods

    Treesearch

    Martin E. Dale; Donald E. Hilt

    1989-01-01

    Over the last 20 years computers have become an efficient tool to estimate growth and yield. Computerized yield estimates vary from simple approximation or interpolation of traditional normal yield tables to highly sophisticated programs that simulate the growth and yield of each individual tree.

  1. Development of a Searchable Metabolite Database and Simulator of Xenobiotic Metabolism

    EPA Science Inventory

    A computational tool (MetaPath) has been developed for storage and analysis of metabolic pathways and associated metadata. The system is capable of sophisticated text and chemical structure/substructure searching as well as rapid comparison of metabolites formed across chemicals,...

  2. High-Tech Conservation: Information-Age Tools Have Revolutionized the Work of Ecologists.

    ERIC Educational Resources Information Center

    Chiles, James R.

    1992-01-01

    Describes a new direction for conservation efforts influenced by the advance of the information age and the introduction of many technologically sophisticated information-collecting devices. Devices include microscopic computer chips, miniature electronic components, and Earth-observation satellites. (MCO)

  3. Using Visual Basic to Teach Programming for Geographers.

    ERIC Educational Resources Information Center

    Slocum, Terry A.; Yoder, Stephen C.

    1996-01-01

    Outlines reasons why computer programming should be taught to geographers. These include experience using macro (scripting) languages and sophisticated visualization software, and developing a deeper understanding of general hardware and software capabilities. Discusses the distinct advantages and few disadvantages of the programming language…

  4. Resistance Is Futile

    ERIC Educational Resources Information Center

    O'Hanlon, Charlene

    2009-01-01

    How odd it should seem that even today, with interactive whiteboards, content management systems, wireless broadband, handhelds, and every sort of sophisticated computing device penetrating and improving the classroom experience for students, David Roh, general manager for Follett Digital Resources can still say, "There are hundreds of…

  5. Effective Materials Property Information Management for the 21st Century

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ren, Weiju; Cebon, David; Arnold, Steve

    2010-01-01

    This paper discusses key principles for the development of materials property information management software systems. There are growing needs for automated materials information management in industry, research organizations, and government agencies. In part, these are fuelled by the demands for higher efficiency in material testing, product design and development, and engineering analysis. But equally important, organizations are being driven to employ sophisticated methods and software tools for managing their mission-critical materials information by the needs for consistency, quality and traceability of data, as well as control of access to proprietary or sensitive information. Furthermore, the use of increasingly sophisticated nonlinear, anisotropic and multi-scale engineering analysis approaches, particularly for composite materials, requires both processing of much larger volumes of test data for development of constitutive models and much more complex materials data input requirements for Computer-Aided Engineering (CAE) software. And finally, the globalization of engineering processes and outsourcing of design and development activities generates much greater needs for sharing a single gold source of materials information between members of global engineering teams in extended supply-chains. Fortunately, material property management systems have kept pace with the growing user demands. They have evolved from hard copy archives, through simple electronic databases, to versatile data management systems that can be customized to specific user needs. The more sophisticated of these provide facilities for: (i) data management functions such as access control, version control, and quality control; (ii) a wide range of data import, export and analysis capabilities; (iii) mechanisms for ensuring that all data is traceable to its pedigree sources: details of testing programs, published sources, etc.; (iv) tools for searching, reporting and viewing the data; and (v) access to the information via a wide range of interfaces, including web browsers, rich clients, programmatic access and clients embedded in third-party applications, such as CAE systems. This paper discusses the important requirements for advanced material data management systems as well as future challenges and opportunities such as automated error checking, automated data quality assessment and characterization, identification of gaps in data, and functionalities and business models to keep users returning to the source: generating the user demand that fuels database growth and maintenance.

  6. Passive motion paradigm: an alternative to optimal control.

    PubMed

    Mohan, Vishwanathan; Morasso, Pietro

    2011-01-01

    In recent years, optimal control theory (OCT) has emerged as the leading approach for investigating neural control of movement and motor cognition for two complementary research lines: behavioral neuroscience and humanoid robotics. In both cases, there are general problems that need to be addressed, such as the "degrees of freedom (DoFs) problem," the common core of production, observation, reasoning, and learning of "actions." OCT, directly derived from engineering design techniques of control systems, quantifies task goals as "cost functions" and uses the sophisticated formal tools of optimal control to obtain desired behavior (and predictions). We propose an alternative "softer" approach, the passive motion paradigm (PMP), that we believe is closer to the biomechanics and cybernetics of action. The basic idea is that actions (overt as well as covert) are the consequences of an internal simulation process that "animates" the body schema with the attractor dynamics of force fields induced by the goal and task-specific constraints. This internal simulation offers the brain a way to dynamically link motor redundancy with task-oriented constraints "at runtime," hence solving the "DoFs problem" without explicit kinematic inversion and cost function computation. We argue that the function of such computational machinery is not only restricted to shaping motor output during action execution but also to provide the self with information on the feasibility, consequence, understanding and meaning of "potential actions." In this sense, taking into account recent developments in neuroscience (motor imagery, simulation theory of covert actions, mirror neuron system) and in embodied robotics, PMP offers a novel framework for understanding motor cognition that goes beyond the engineering control paradigm provided by OCT. Therefore, the paper is at the same time a review of the PMP rationale, as a computational theory, and a perspective presentation of how to develop it for designing better cognitive architectures.
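
    A minimal numerical sketch of the PMP idea for a 2-link planar arm (link lengths, gains, and goal are assumed; this is a toy force-field relaxation, not the authors' full architecture): the goal induces an attractive field on the end-effector, and joint motion follows the transposed Jacobian, so no explicit kinematic inversion is ever performed.

    ```python
    import numpy as np

    L1, L2 = 1.0, 1.0             # assumed link lengths
    K, A, dt = 2.0, 1.0, 0.01     # field stiffness, joint admittance, time step

    def fk(q):                    # forward kinematics of the end-effector
        return np.array([L1*np.cos(q[0]) + L2*np.cos(q[0]+q[1]),
                         L1*np.sin(q[0]) + L2*np.sin(q[0]+q[1])])

    def jacobian(q):
        s1, c1   = np.sin(q[0]), np.cos(q[0])
        s12, c12 = np.sin(q[0]+q[1]), np.cos(q[0]+q[1])
        return np.array([[-L1*s1 - L2*s12, -L2*s12],
                         [ L1*c1 + L2*c12,  L2*c12]])

    q, goal = np.array([0.3, 0.5]), np.array([0.5, 1.2])
    for _ in range(2000):
        F = K * (goal - fk(q))                # attractive force field at the hand
        q = q + dt * A * (jacobian(q).T @ F)  # "passive" joint motion, no inversion

    print(fk(q))   # the end-effector has relaxed onto the goal
    ```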

  7. Galaxy CloudMan: delivering cloud compute clusters.

    PubMed

    Afgan, Enis; Baker, Dannon; Coraor, Nate; Chapman, Brad; Nekrutenko, Anton; Taylor, James

    2010-12-21

    Widespread adoption of high-throughput sequencing has greatly increased the scale and sophistication of computational infrastructure needed to perform genomic research. An alternative to building and maintaining local infrastructure is "cloud computing", which, in principle, offers on demand access to flexible computational infrastructure. However, cloud computing resources are not yet suitable for immediate "as is" use by experimental biologists. We present a cloud resource management system that makes it possible for individual researchers to compose and control an arbitrarily sized compute cluster on Amazon's EC2 cloud infrastructure without any informatics requirements. Within this system, an entire suite of biological tools packaged by the NERC Bio-Linux team (http://nebc.nerc.ac.uk/tools/bio-linux) is available for immediate consumption. The provided solution makes it possible, using only a web browser, to create a completely configured compute cluster ready to perform analysis in less than five minutes. Moreover, we provide an automated method for building custom deployments of cloud resources. This approach promotes reproducibility of results and, if desired, allows individuals and labs to add or customize an otherwise available cloud system to better meet their needs. The expected knowledge and associated effort with deploying a compute cluster in the Amazon EC2 cloud is not trivial. The solution presented in this paper eliminates these barriers, making it possible for researchers to deploy exactly the amount of computing power they need, combined with a wealth of existing analysis software, to handle the ongoing data deluge.

  8. Evolution of robotic arms.

    PubMed

    Moran, Michael E

    2007-01-01

    The foundation of surgical robotics is in the development of the robotic arm. This is a thorough review of the literature on the nature and development of this device, with emphasis on surgical applications. We have reviewed the published literature and classified robotic arms by their application: show, industrial application, medical application, etc. There is a definite trend in the manufacture of robotic arms toward more dexterous devices, more degrees-of-freedom, and capabilities beyond the human arm. da Vinci designed the first sophisticated robotic arm in 1495, with four degrees-of-freedom and an analog on-board controller supplying power and programmability. The left arm of von Kempelen's chess-playing automaton was quite sophisticated. Unimate introduced the first industrial robotic arm in 1961; it has subsequently evolved into the PUMA arm. In 1963 the Rancho arm was designed; Minsky's Tentacle arm appeared in 1968, Scheinman's Stanford arm in 1969, and MIT's Silver arm in 1974. Aird became the first cyborg human with a robotic arm in 1993. In 2000 Miguel Nicolelis redefined possible man-machine capacity in his work on cerebral implantation in owl monkeys directly interfacing with robotic arms both locally and at a distance. The robotic arm is the end-effector of robotic systems and currently is the hallmark feature of the da Vinci Surgical System making its entrance into surgical application. But, despite the potential advantages of this computer-controlled master-slave system, robotic arms have definite limitations. Ongoing work in robotics has many potential solutions to the drawbacks of current robotic surgical systems.

  9. A qualitative analysis of bus simulator training on transit incidents : a case study in Florida. [Summary].

    DOT National Transportation Integrated Search

    2013-01-01

    The simulator was once a very expensive, large-scale mechanical device for training military pilots or astronauts. Modern computers, linking sophisticated software and large-screen displays, have yielded simulators for the desktop or configured as sm...

  10. Sharing Digital Data

    ERIC Educational Resources Information Center

    Benedis-Grab, Gregory

    2011-01-01

    Computers have changed the landscape of scientific research in profound ways. Technology has always played an important role in scientific experimentation--through the development of increasingly sophisticated tools, the measurement of elusive quantities, and the processing of large amounts of data. However, the advent of social networking and the…

  11. FRAMEWORK FOR EVALUATION OF PHYSIOLOGICALLY-BASED PHARMACOKINETIC MODELS FOR USE IN SAFETY OR RISK ASSESSMENT

    EPA Science Inventory

    Proposed applications of increasingly sophisticated biologically-based computational models, such as physiologically-based pharmacokinetic (PBPK) models, raise the issue of how to evaluate whether the models are adequate for proposed uses including safety or risk ...

  12. Suicide Prevention in a Treatment Setting.

    ERIC Educational Resources Information Center

    Litman, Robert E.

    1995-01-01

    The author anticipates that sophisticated interactive computer programs will be effective in improving screening and case finding of the suicidal and that they will become invaluable in improving training for primary care providers and outpatient mental health workers. Additionally, improved communication networks will help maintain continuity of…

  13. Analysis of a Distributed Pulse Power System Using a Circuit Analysis Code

    DTIC Science & Technology

    1979-06-01

    A sophisticated computer code (SCEPTRE), used to analyze electronic circuits, was used to evaluate the performance of a large flash X-ray machine. The computed dose rate was integrated to give a number that could be compared with measurements made using thermoluminescent dosimeters (TLDs).

  14. Self-tuning bistable parametric feedback oscillator: Near-optimal amplitude maximization without model information

    NASA Astrophysics Data System (ADS)

    Braun, David J.; Sutas, Andrius; Vijayakumar, Sethu

    2017-01-01

    Theory predicts that parametrically excited oscillators, tuned to operate under resonant conditions, are capable of the large-amplitude oscillation useful in diverse applications, such as signal amplification, communication, and analog computation. However, due to amplitude saturation caused by nonlinearity, lack of robustness to model uncertainty, and limited sensitivity to parameter modulation, these oscillators require fine-tuning and strong modulation to generate robust large-amplitude oscillation. Here we present a principle of self-tuning parametric feedback excitation that alleviates the above-mentioned limitations. This is achieved using a minimalistic control implementation that performs (i) self-tuning (slow parameter adaptation) and (ii) feedback pumping (fast parameter modulation), without sophisticated signal processing of past observations. The proposed approach provides near-optimal amplitude maximization without requiring model-based control computation, previously perceived as inevitable for implementing optimal control principles in practical applications. Experimental implementation of the theory shows that the oscillator tunes itself near the onset of dynamic bifurcation to achieve extreme sensitivity to small resonant parametric perturbations. As a result, it achieves large-amplitude oscillations by capitalizing on the effect of nonlinearity, despite substantial model uncertainties and strong unforeseen external perturbations. We envision the present finding to provide an effective and robust approach to parametric excitation when it comes to real-world applications.
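
    A toy simulation of the feedback-pumping half of the principle (all parameters are assumed for illustration, not the authors' experimental values): modulating the stiffness with the sign of x·v makes the pump self-synchronize to the motion, so no external frequency tuning or model information is needed.

    ```python
    import numpy as np

    w0, eps, gamma, beta, dt = 2*np.pi, 0.2, 0.05, 1.0, 1e-3
    x, v = 1e-3, 0.0                           # tiny seed displacement
    for _ in range(int(20/dt)):                # integrate 20 seconds
        k = w0**2 * (1.0 - eps*np.sign(x*v))   # feedback parametric pumping
        a = -k*x - gamma*v - beta*x**3         # damping plus cubic stiffening
        v += a*dt                              # semi-implicit Euler step
        x += v*dt
    print(np.hypot(x, v/w0))  # amplitude, grown by orders of magnitude from the seed
    ```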

  15. pH-programmable DNA logic arrays powered by modular DNAzyme libraries.

    PubMed

    Elbaz, Johann; Wang, Fuan; Remacle, Francoise; Willner, Itamar

    2012-12-12

    Nature performs complex information-processing circuits, such as the programmed transformations of versatile stem cells into targeted functional cells. Man-made molecular circuits are, however, unable to mimic such sophisticated biomachineries. To reach these goals, it is essential to construct programmable modular components that can be triggered by environmental stimuli to perform different logic circuits. We report on the unprecedented design of artificial pH-programmable DNA logic arrays, constructed from modular libraries of Mg(2+)- and UO(2)(2+)-dependent DNAzyme subunits and their substrates. By the appropriate modular design of the DNA computation units, pH-programmable logic arrays of various complexities are realized, and the arrays can be erased, reused, and/or reprogrammed. Such systems may be implemented in the near future for nanomedical applications by pH-controlled regulation of cellular functions or may be used to control biotransformations stimulated by bacteria.

  16. New method for identifying features of an image on a digital video display

    NASA Astrophysics Data System (ADS)

    Doyle, Michael D.

    1991-04-01

    The MetaMap process extends the concept of direct manipulation human-computer interfaces to new limits. Its specific capabilities include the correlation of discrete image elements to relevant text information and the correlation of these image features to other images as well as to program control mechanisms. The correlation is accomplished through reprogramming of both the color map and the image so that discrete image elements comprise unique sets of color indices. This process allows the correlation to be accomplished with very efficient data storage and program execution times. Image databases adapted to this process become object-oriented as a result. Very sophisticated interrelationships can be set up between images, text, and program control mechanisms using this process. An application of this interfacing process to the design of an interactive atlas of medical histology, as well as other possible applications, is described. The MetaMap process is protected by U.S. patent #4
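
    A sketch of the palette-index lookup idea (the image, indices, and metadata below are hypothetical; MetaMap itself is the patented process): because every object is drawn with its own color-map indices, a pixel value identifies the object directly, with no geometric hit-testing.

    ```python
    import numpy as np

    image = np.zeros((4, 6), dtype=np.uint8)   # indexed-color image
    image[1:3, 1:3] = 17                       # one object drawn with index 17
    image[0:2, 4:6] = 42                       # another drawn with index 42

    metadata = {                               # color index -> linked information
        17: "nucleus: see histology notes",
        42: "mitochondrion: links to micrograph set B",
    }

    def on_click(row, col):
        return metadata.get(int(image[row, col]), "background")

    print(on_click(1, 2))   # -> the nucleus entry
    print(on_click(0, 5))   # -> the mitochondrion entry
    ```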

  17. Duct wall impedance control as an advanced concept for acoustic suppression enhancement. [engine noise reduction

    NASA Technical Reports Server (NTRS)

    Dean, P. D.

    1978-01-01

    A systems concept procedure is described for the optimization of acoustic duct liner design for both uniform and multisegment types. The concept was implemented by the use of a double reverberant chamber flow duct facility coupled with sophisticated computer control and acoustic analysis systems. The optimization procedure for liner insertion loss was based on the concept of variable liner impedance produced by bias air flow through a multilayer, resonant cavity liner. A multiple microphone technique for in situ wall impedance measurements was used and successfully adapted to produce automated measurements for all liner configurations tested. The complete validation of the systems concept was prevented by the inability to optimize the insertion loss using bias-flow-induced wall impedance changes. This inability appeared to be a direct function of the presence of higher-order energy-carrying modes that were not influenced significantly by the wall impedance changes.

  18. Galaxy CloudMan: delivering cloud compute clusters

    PubMed Central

    2010-01-01

    Background Widespread adoption of high-throughput sequencing has greatly increased the scale and sophistication of computational infrastructure needed to perform genomic research. An alternative to building and maintaining local infrastructure is “cloud computing”, which, in principle, offers on demand access to flexible computational infrastructure. However, cloud computing resources are not yet suitable for immediate “as is” use by experimental biologists. Results We present a cloud resource management system that makes it possible for individual researchers to compose and control an arbitrarily sized compute cluster on Amazon’s EC2 cloud infrastructure without any informatics requirements. Within this system, an entire suite of biological tools packaged by the NERC Bio-Linux team (http://nebc.nerc.ac.uk/tools/bio-linux) is available for immediate consumption. The provided solution makes it possible, using only a web browser, to create a completely configured compute cluster ready to perform analysis in less than five minutes. Moreover, we provide an automated method for building custom deployments of cloud resources. This approach promotes reproducibility of results and, if desired, allows individuals and labs to add or customize an otherwise available cloud system to better meet their needs. Conclusions The expected knowledge and associated effort with deploying a compute cluster in the Amazon EC2 cloud is not trivial. The solution presented in this paper eliminates these barriers, making it possible for researchers to deploy exactly the amount of computing power they need, combined with a wealth of existing analysis software, to handle the ongoing data deluge. PMID:21210983

  19. Explosive Transient Camera (ETC) Program

    NASA Technical Reports Server (NTRS)

    Ricker, George

    1991-01-01

    Since the inception of the ETC program, a wide range of new technologies was developed to support this astronomical instrument. The prototype unit was installed at ETC Site 1. The first partially automated observations were made, and some major renovations were later made to the ETC hardware. The ETC was outfitted with new thermoelectrically-cooled CCD cameras and a sophisticated vacuum manifold which, together, made the ETC a much more reliable unit than the prototype. The ETC instrumentation and building were placed under full computer control, allowing the ETC to operate as an automated, autonomous instrument with virtually no human intervention necessary. The first fully-automated operation of the ETC was performed, during which the ETC monitored the error region of the repeating soft gamma-ray burster SGR 1806-21.

  20. Multi Agent Systems with Symbiotic Learning and Evolution using GNP

    NASA Astrophysics Data System (ADS)

    Eguchi, Toru; Hirasawa, Kotaro; Hu, Jinglu; Murata, Junichi

    Recently, various approaches to Multi Agent Systems (MAS), one of the most promising system concepts based on Distributed Artificial Intelligence, have been studied to control large and complicated systems efficiently. Within this trend, Multi Agent Systems with Symbiotic Learning and Evolution, named Masbiole, has been proposed. In Masbiole, symbiotic phenomena among creatures are considered in the process of learning and evolution of MAS, so more flexible and sophisticated solutions than those of conventional MAS can be expected. In this paper, we apply Masbiole to Iterated Prisoner's Dilemma games (IPD games) using Genetic Network Programming (GNP), a newly developed evolutionary computation method for constituting agents. Some characteristics of Masbiole using GNP in IPD games are clarified.
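
    A minimal IPD engine in the spirit of the benchmark (standard payoff values; the GNP-evolved Masbiole agents of the paper are far more elaborate than these fixed strategies):

    ```python
    # Payoffs (row player, column player) for cooperate "C" / defect "D".
    PAYOFF = {("C", "C"): (3, 3), ("C", "D"): (0, 5),
              ("D", "C"): (5, 0), ("D", "D"): (1, 1)}

    def tit_for_tat(my_hist, opp_hist):
        return "C" if not opp_hist else opp_hist[-1]

    def always_defect(my_hist, opp_hist):
        return "D"

    def play(agent1, agent2, rounds=100):
        h1, h2, s1, s2 = [], [], 0, 0
        for _ in range(rounds):
            m1, m2 = agent1(h1, h2), agent2(h2, h1)
            p1, p2 = PAYOFF[(m1, m2)]
            h1.append(m1); h2.append(m2)
            s1 += p1; s2 += p2
        return s1, s2

    print(play(tit_for_tat, always_defect))   # (99, 104): exploited only once
    ```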

  1. Sender–receiver systems and applying information theory for quantitative synthetic biology

    PubMed Central

    Barcena Menendez, Diego; Senthivel, Vivek Raj; Isalan, Mark

    2015-01-01

    Sender–receiver (S–R) systems abound in biology, with communication systems sending information in various forms. Information theory provides a quantitative basis for analysing these processes and is being applied to study natural genetic, enzymatic and neural networks. Recent advances in synthetic biology are providing us with a wealth of artificial S–R systems, giving us quantitative control over networks with a finite number of well-characterised components. Combining the two approaches can help to predict how to maximise signalling robustness, and will allow us to make increasingly complex biological computers. Ultimately, pushing the boundaries of synthetic biology will require moving beyond engineering the flow of information and towards building more sophisticated circuits that interpret biological meaning. PMID:25282688
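
    As a brief illustration of that quantitative basis (the joint distribution below is invented, not taken from the paper), the mutual information between a sender S and a receiver R follows directly from their joint distribution:

    ```python
    import numpy as np

    p_joint = np.array([[0.40, 0.10],   # p(sender = s, receiver = r)
                        [0.05, 0.45]])

    p_s = p_joint.sum(axis=1, keepdims=True)   # sender marginal
    p_r = p_joint.sum(axis=0, keepdims=True)   # receiver marginal

    mask = p_joint > 0
    mi = np.sum(p_joint[mask] * np.log2(p_joint[mask] / (p_s @ p_r)[mask]))
    print(f"I(S;R) = {mi:.3f} bits")   # fidelity of the S-R channel
    ```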

  2. The Cerebellum: Adaptive Prediction for Movement and Cognition

    PubMed Central

    Sokolov, Arseny A.; Miall, R. Chris; Ivry, Richard B.

    2017-01-01

    Over the past 30 years, cumulative evidence has indicated that cerebellar function extends beyond sensorimotor control. This view has emerged from studies of neuroanatomy, neuroimaging, neuropsychology and brain stimulation, with the results implicating the cerebellum in domains as diverse as attention, language, executive function and social cognition. Although the literature provides sophisticated models of how the cerebellum helps refine movements, it remains unclear how the core mechanisms of these models can be applied when considering a broader conceptualization of cerebellar function. In light of recent multidisciplinary findings, we consider two key concepts that have been suggested as general computational principles of cerebellar function, prediction and error-based learning, examining how these might be relevant in the operation of cognitive cerebro-cerebellar loops. PMID:28385461

  3. Characteristics and requirements of robotic manipulators for space operations

    NASA Technical Reports Server (NTRS)

    Andary, James F.; Hewitt, Dennis R.; Spidaliere, Peter D.; Lambeck, Robert W.

    1992-01-01

    A robotic manipulator, DTF-1, developed as part of the Flight Telerobotic Servicer (FTS) project at Goddard Space Flight Center is discussed focusing on the technical, operational, and safety requirements. The DTF-1 system design, which is based on the manipulator, gripper, cameras, computer, and an operator control station incorporates the fundamental building blocks of the original FTS, the end product of which was to have been a light-weight, dexterous telerobotic device. For the first time in the history of NASA, space technology and robotics were combined to find new and unique solutions to the demanding requirements of flying a sophisticated robotic manipulator in space. DTF-1 is considered to be the prototype for all future development in space robotics.

  4. The effectiveness of virtual reality distraction for pain reduction: a systematic review.

    PubMed

    Malloy, Kevin M; Milling, Leonard S

    2010-12-01

    Virtual reality technology enables people to become immersed in a computer-simulated, three-dimensional environment. This article provides a comprehensive review of controlled research on the effectiveness of virtual reality (VR) distraction for reducing pain. To be included in the review, studies were required to use a between-subjects or mixed model design in which VR distraction was compared with a control condition or an alternative intervention in relieving pain. An exhaustive search identified 11 studies satisfying these criteria. VR distraction was shown to be effective for reducing experimental pain, as well as the discomfort associated with burn injury care. Studies of needle-related pain provided less consistent findings. Use of more sophisticated virtual reality technology capable of fully immersing the individual in a virtual environment was associated with greater relief. Overall, controlled research suggests that VR distraction may be a useful tool for clinicians who work with a variety of pain problems. Copyright © 2010 Elsevier Ltd. All rights reserved.

  5. Inverse kinematic-based robot control

    NASA Technical Reports Server (NTRS)

    Wolovich, W. A.; Flueckiger, K. F.

    1987-01-01

    A fundamental problem which must be resolved in virtually all non-trivial robotic operations is the well-known inverse kinematic question. More specifically, most of the tasks which robots are called upon to perform are specified in Cartesian (x,y,z) space, such as simple tracking along one or more straight-line paths or following a specified surface with compliant force sensors and/or visual feedback. In all cases, control is actually implemented through coordinated motion of the various links which comprise the manipulator; i.e., in link space. As a consequence, the control computer of every sophisticated anthropomorphic robot must contain provisions for solving the inverse kinematic problem which, in the case of simple, non-redundant position control, involves the determination of the first three link angles, θ1, θ2, and θ3, which produce a desired wrist origin position (Pxw, Pyw, Pzw) at the end of link 3 relative to some fixed base frame. Researchers outline a new inverse kinematic solution and demonstrate its potential via some recent computer simulations. They also compare it to current inverse kinematic methods and outline some of the remaining problems which will be addressed in order to render it fully operational. Also discussed are a number of practical consequences of this technique beyond its obvious use in solving the inverse kinematic question.
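
    For the simplest concrete case, a closed-form solution exists; the sketch below solves a planar 2-link reduction of the wrist-positioning problem (link lengths assumed, elbow-down branch chosen) and is not the new method proposed in the paper:

    ```python
    import numpy as np

    def ik_2link(px, py, l1=1.0, l2=1.0):
        """One (theta1, theta2) solution placing the wrist point at (px, py)."""
        c2 = (px*px + py*py - l1*l1 - l2*l2) / (2*l1*l2)
        if abs(c2) > 1.0:
            raise ValueError("target outside the workspace")
        theta2 = np.arccos(c2)                 # elbow-down branch
        theta1 = np.arctan2(py, px) - np.arctan2(l2*np.sin(theta2),
                                                 l1 + l2*np.cos(theta2))
        return theta1, theta2

    t1, t2 = ik_2link(1.2, 0.8)
    # Forward-kinematics check: should reproduce (1.2, 0.8).
    print(np.cos(t1) + np.cos(t1 + t2), np.sin(t1) + np.sin(t1 + t2))
    ```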

  6. A HyperCard Program for Business German.

    ERIC Educational Resources Information Center

    Paulsell, Patricia R.

    Although the use of computer-assisted language instruction software has been mainly limited to grammatical/syntactical drills, the increasing number of language professionals with programming skills is leading to the development of more sophisticated language education programs. This report describes the generation of such a program using the…

  7. Inventors in the Making

    ERIC Educational Resources Information Center

    Murray, Jenny; Bartelmay, Kathy

    2005-01-01

    Can second-grade students construct an understanding of sophisticated science processes and explore physics concepts while creating their own inventions? Yes! Students accomplished this and much more through a month-long project in which they used Legos and Robolab, the Lego computer programming software, to create their own inventions. One…

  8. Education in the Information Age.

    ERIC Educational Resources Information Center

    Hay, Lee

    1983-01-01

    This essay considers how education might be revolutionized in a projected future of cheap and sophisticated technology. Predictions include a redefinition of literacy and basic skills and a restructuring of educational delivery employing computers to dispense information in order to free teachers to work directly with students on cognitive development.…

  9. Research and Development in Natural Language Understanding as Part of the Strategic Computing Program.

    DTIC Science & Technology

    1987-04-01

    BBN is developing a series of increasingly sophisticated natural language understanding systems which will serve as an integrated interface...

  10. 12 Math Rules That Expire in the Middle Grades

    ERIC Educational Resources Information Center

    Karp, Karen S.; Bush, Sarah B.; Dougherty, Barbara J.

    2015-01-01

    Many rules taught in mathematics classrooms "expire" when students develop knowledge that is more sophisticated, such as using new number systems. For example, in elementary grades, students are sometimes taught that "addition makes bigger" or "subtraction makes smaller" when learning to compute with whole numbers,…

  11. Symbionic Technology and Education. Report 83-02.

    ERIC Educational Resources Information Center

    Cartwright, Glenn F.

    Research findings indicate that major breakthroughs in education will have to occur through direct cortical intervention, using either chemical or electronic means. It will eventually be possible to build sophisticated intelligence amplifiers that will be internal extensions of our brains, significantly more powerful than present day computers,…

  12. Assessing Higher Order Thinking in Video Games

    ERIC Educational Resources Information Center

    Rice, John

    2007-01-01

    Computer video games have become highly interesting to educators and researchers since their sophistication has improved considerably over the last decade. Studies indicate simple video games touting educational benefits are common in classrooms. However, a need for identifying truly useful games for educational purposes exists. This article…

  13. STAF: A Powerful and Sophisticated CAI System.

    ERIC Educational Resources Information Center

    Loach, Ken

    1982-01-01

    Describes the STAF (Science Teacher's Authoring Facility) computer-assisted instruction system developed at Leeds University (England), focusing on STAF language and major program features. Although programs for the system emphasize physical chemistry and organic spectroscopy, the system and language are general purpose and can be used in any…

  14. Interactive Video-Based Industrial Training in Basic Electronics.

    ERIC Educational Resources Information Center

    Mirkin, Barry

    The Wisconsin Foundation for Vocational, Technical, and Adult Education is currently involved in the development, implementation, and distribution of a sophisticated interactive computer and video learning system. Designed to offer trainees an open entry and open exit opportunity to pace themselves through a comprehensive competency-based,…

  15. Microprocessor control and networking for the amps breadboard

    NASA Technical Reports Server (NTRS)

    Floyd, Stephen A.

    1987-01-01

    Future space missions will require more sophisticated power systems, implying higher costs and more extensive crew and ground support involvement. To decrease this human involvement, as well as to protect and most efficiently utilize this important resource, NASA has undertaken major efforts to promote progress in the design and development of autonomously managed power systems. Two areas being actively pursued are autonomous power system (APS) breadboards and knowledge-based expert system (KBES) applications. The former are viewed as a requirement for the timely development of the latter. Not only will they serve as final testbeds for the various KBES applications, but will play a major role in the knowledge engineering phase of their development. The current power system breadboard designs are of a distributed microprocessor nature. The distributed nature, plus the need to connect various external computer capabilities (i.e., conventional host computers and symbolic processors), places major emphasis on effective networking. The communications and networking technologies for the first power system breadboard/test facility are described.

  16. 4D Origami by Smart Embroidery.

    PubMed

    Stoychev, Georgi; Razavi, Mir Jalil; Wang, Xianqiao; Ionov, Leonid

    2017-09-01

    There exist many methods for processing materials: extrusion, injection molding, fiber spinning, 3D printing, to name a few. In most cases, materials with a static, fixed shape are produced. However, numerous advanced applications require customized elements with reconfigurable shape. The few available techniques capable of overcoming this problem are expensive and/or time-consuming. Here, the use of one of the most ancient structuring technologies, embroidery, is proposed to generate sophisticated patterns of active materials and, in this way, to achieve complex actuation. By combining experiments and computational modeling, the fundamental rules that can predict the folding behavior of sheets with a variety of stitch patterns are elucidated. It is demonstrated that theoretical mechanics analysis is only suitable to predict the behavior of the simplest experimental setups, whereas computer modeling gives better predictions for more complex cases. Finally, the applicability of the rules is shown by designing basic origami structures and wrinkling substrates with controlled thermal insulation properties. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  17. Computational Modeling and Treatment Identification in the Myelodysplastic Syndromes.

    PubMed

    Drusbosky, Leylah M; Cogle, Christopher R

    2017-10-01

    This review discusses the need for computational modeling in myelodysplastic syndromes (MDS) and early test results. As our evolving understanding of MDS reveals a molecularly complicated disease, sophisticated computer analytics are required to keep track of the number of molecular abnormalities and their complex interplay. Computational modeling and digital drug simulations using whole exome sequencing data input have produced early results showing high accuracy in predicting treatment response to standard-of-care drugs. Furthermore, the computational MDS models serve as clinically relevant MDS cell lines for pre-clinical assays of investigational agents. MDS is an ideal disease for computational modeling and digital drug simulations. Current research is focused on establishing the prediction value of computational modeling. Future research will test the clinical advantage of computer-informed therapy in MDS.

  18. Intelligent model-based diagnostics for vehicle health management

    NASA Astrophysics Data System (ADS)

    Luo, Jianhui; Tu, Fang; Azam, Mohammad S.; Pattipati, Krishna R.; Willett, Peter K.; Qiao, Liu; Kawamoto, Masayuki

    2003-08-01

    The recent advances in sensor technology, remote communication and computational capabilities, and standardized hardware/software interfaces are creating a dramatic shift in the way the health of vehicles is monitored and managed. These advances facilitate remote monitoring, diagnosis and condition-based maintenance of automotive systems. With the increased sophistication of electronic control systems in vehicles, there is a concomitant increased difficulty in the identification of the malfunction phenomena. Consequently, the current rule-based diagnostic systems are difficult to develop, validate and maintain. New intelligent model-based diagnostic methodologies that exploit the advances in sensor, telecommunications, computing and software technologies are needed. In this paper, we will investigate hybrid model-based techniques that seamlessly employ quantitative (analytical) models and graph-based dependency models for intelligent diagnosis. Automotive engineers have found quantitative simulation (e.g. MATLAB/SIMULINK) to be a vital tool in the development of advanced control systems. The hybrid method exploits this capability to improve the diagnostic system's accuracy and consistency, utilizes existing validated knowledge on rule-based methods, enables remote diagnosis, and responds to the challenges of increased system complexity. The solution is generic and has the potential for application in a wide range of systems.
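
    As a sketch of the graph-based dependency-model half of such a hybrid (the D-matrix and test outcomes below are hypothetical), single-fault isolation reduces to matching the observed pass/fail signature against each fault's test pattern:

    ```python
    import numpy as np

    D = np.array([[1, 1, 0],    # fault 0 affects tests 0 and 1
                  [0, 1, 1],    # fault 1 affects tests 1 and 2
                  [1, 0, 1]])   # fault 2 affects tests 0 and 2

    test_results = np.array([1, 1, 0])   # tests 0 and 1 failed, test 2 passed

    # A single-fault candidate must explain every failed test and no passed one
    # (perfect-detection assumption).
    candidates = [f for f in range(D.shape[0])
                  if np.array_equal(D[f], test_results)]
    print(candidates)   # -> [0]
    ```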

  19. The Impact of an Ego Depletion Manipulation on Performance-Based and Self-Report Assessment Measures.

    PubMed

    Charek, Daniel B; Meyer, Gregory J; Mihura, Joni L

    2016-10-01

    We investigated the impact of ego depletion on selected Rorschach cognitive processing variables and self-reported affect states. Research indicates acts of effortful self-regulation transiently deplete a finite pool of cognitive resources, impairing performance on subsequent tasks requiring self-regulation. We predicted that relative to controls, ego-depleted participants' Rorschach protocols would have more spontaneous reactivity to color, less cognitive sophistication, and more frequent logical lapses in visualization, whereas self-reports would reflect greater fatigue and less attentiveness. The hypotheses were partially supported; despite a surprising absence of self-reported differences, ego-depleted participants had Rorschach protocols with lower scores on two variables indicative of sophisticated combinatory thinking, as well as higher levels of color receptivity; they also had lower scores on a composite variable computed across all hypothesized markers of complexity. In addition, self-reported achievement striving moderated the effect of the experimental manipulation on color receptivity, and in the Depletion condition it was associated with greater attentiveness to the tasks, more color reactivity, and less global synthetic processing. Results are discussed with an emphasis on the response process, methodological limitations and strengths, implications for calculating refined Rorschach scores, and the value of using multiple methods in research and experimental paradigms to validate assessment measures. © The Author(s) 2015.

  20. Climate Models

    NASA Technical Reports Server (NTRS)

    Druyan, Leonard M.

    2012-01-01

    Climate models is a very broad topic, so a single volume can only offer a small sampling of relevant research activities. This volume of 14 chapters includes descriptions of a variety of modeling studies for a variety of geographic regions by an international roster of authors. The climate research community generally uses the rubric climate models to refer to organized sets of computer instructions that produce simulations of climate evolution. The code is based on physical relationships that describe the shared variability of meteorological parameters such as temperature, humidity, precipitation rate, circulation, radiation fluxes, etc. Three-dimensional climate models are integrated over time in order to compute the temporal and spatial variations of these parameters. Model domains can be global or regional and the horizontal and vertical resolutions of the computational grid vary from model to model. Considering the entire climate system requires accounting for interactions between solar insolation, atmospheric, oceanic and continental processes, the latter including land hydrology and vegetation. Model simulations may concentrate on one or more of these components, but the most sophisticated models will estimate the mutual interactions of all of these environments. Advances in computer technology have prompted investments in more complex model configurations that consider more phenomena interactions than were possible with yesterday's computers. However, not every attempt to add to the computational layers is rewarded by better model performance. Extensive research is required to test and document any advantages gained by greater sophistication in model formulation. One purpose for publishing climate model research results is to present purported advances for evaluation by the scientific community.
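
    At the opposite extreme from the sophisticated coupled models discussed above sits the zero-dimensional energy-balance model; the toy integration below (standard textbook constants, not values from this volume) balances absorbed solar flux against emitted long-wave flux, stepped forward in time the way real models are:

    ```python
    S0    = 1361.0     # solar constant, W m^-2
    alpha = 0.30       # planetary albedo
    eps   = 0.61       # effective emissivity (crude greenhouse factor)
    sigma = 5.67e-8    # Stefan-Boltzmann constant, W m^-2 K^-4
    C     = 4e8        # heat capacity per unit area, J m^-2 K^-1
    dt    = 86400.0    # one-day time step, s

    T = 255.0          # initial temperature, K
    for _ in range(200 * 365):                     # integrate ~200 years
        net = S0*(1 - alpha)/4 - eps*sigma*T**4    # absorbed minus emitted flux
        T += dt * net / C
    print(f"equilibrium temperature ~ {T:.1f} K")  # ~288 K with these numbers
    ```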

  1. An Ecological Framework for Cancer Communication: Implications for Research

    PubMed Central

    Intille, Stephen S; Zabinski, Marion F

    2005-01-01

    The field of cancer communication has undergone a major revolution as a result of the Internet. As recently as the early 1990s, face-to-face, print, and the telephone were the dominant methods of communication between health professionals and individuals in support of the prevention and treatment of cancer. Computer-supported interactive media existed, but this usually required sophisticated computer and video platforms that limited availability. The introduction of point-and-click interfaces for the Internet dramatically improved the ability of non-expert computer users to obtain and publish information electronically on the Web. Demand for Web access has driven computer sales for the home setting and improved the availability, capability, and affordability of desktop computers. New advances in information and computing technologies will lead to similarly dramatic changes in the affordability and accessibility of computers. Computers will move from the desktop into the environment and onto the body. Computers are becoming smaller, faster, more sophisticated, more responsive, less expensive, and—essentially—ubiquitous. Computers are evolving into much more than desktop communication devices. New computers include sensing, monitoring, geospatial tracking, just-in-time knowledge presentation, and a host of other information processes. The challenge for cancer communication researchers is to acknowledge the expanded capability of the Web and to move beyond the approaches to health promotion, behavior change, and communication that emerged during an era when language- and image-based interpersonal and mass communication strategies predominated. Ecological theory has been advanced since the early 1900s to explain the highly complex relationships among individuals, society, organizations, the built and natural environments, and personal and population health and well-being. This paper provides background on ecological theory, advances an Ecological Model of Internet-Based Cancer Communication intended to broaden the vision of potential uses of the Internet for cancer communication, and provides some examples of how such a model might inform future research and development in cancer communication. PMID:15998614

  2. Submillisecond Optogenetic Control of Neuronal Firing with Two-Photon Holographic Photoactivation of Chronos

    PubMed Central

    Ronzitti, Emiliano; Conti, Rossella; Zampini, Valeria; Tanese, Dimitrii; Klapoetke, Nathan; Boyden, Edward S.; Papagiakoumou, Eirini

    2017-01-01

    Optogenetic neuronal network manipulation promises to unravel a long-standing mystery in neuroscience: how does microcircuit activity relate causally to behavioral and pathological states? The challenge to evoke spikes with high spatial and temporal complexity necessitates further joint development of light-delivery approaches and custom opsins. Two-photon (2P) light-targeting strategies demonstrated in-depth generation of action potentials in photosensitive neurons both in vitro and in vivo, but thus far lack the temporal precision necessary to induce precisely timed spiking events. Here, we show that efficient current integration enabled by 2P holographic amplified laser illumination of Chronos, a highly light-sensitive and fast opsin, can evoke spikes with submillisecond precision and repeated firing up to 100 Hz in brain slices from Swiss male mice. These results pave the way for optogenetic manipulation with the spatial and temporal sophistication necessary to mimic natural microcircuit activity. SIGNIFICANCE STATEMENT To reveal causal links between neuronal activity and behavior, it is necessary to develop experimental strategies to induce spatially and temporally sophisticated perturbation of network microcircuits. Two-photon computer generated holography (2P-CGH) recently demonstrated 3D optogenetic control of selected pools of neurons with single-cell accuracy in depth in the brain. Here, we show that exciting the fast opsin Chronos with amplified laser 2P-CGH enables cellular-resolution targeting with unprecedented temporal control, driving spiking up to 100 Hz with submillisecond onset precision using low laser power densities. This system achieves a unique combination of spatial flexibility and temporal precision needed to pattern optogenetically inputs that mimic natural neuronal network activity patterns. PMID:28972125

  3. TAIR: A transonic airfoil analysis computer code

    NASA Technical Reports Server (NTRS)

    Dougherty, F. C.; Holst, T. L.; Grundy, K. L.; Thomas, S. D.

    1981-01-01

    The operation of the TAIR (Transonic AIRfoil) computer code, which uses a fast, fully implicit algorithm to solve the conservative full-potential equation for transonic flow fields about arbitrary airfoils, is described on two levels of sophistication: simplified operation and detailed operation. The program organization and theory are elaborated to simplify modification of TAIR for new applications. Examples with input and output are given for a wide range of cases, including incompressible, subcritical compressible, and transonic calculations.
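
    For reference, the steady two-dimensional conservative full-potential equation that TAIR solves can be written in the standard nondimensional form below (reconstructed from the transonic-flow literature rather than from this abstract; Phi is the velocity potential, velocities are normalized by the freestream speed, and the density follows from the isentropic relation):

    ```latex
    \frac{\partial}{\partial x}\!\left(\rho\,\Phi_x\right)
      + \frac{\partial}{\partial y}\!\left(\rho\,\Phi_y\right) = 0,
    \qquad
    \rho = \left[\,1 + \frac{\gamma-1}{2}\,M_\infty^{2}
           \left(1 - \Phi_x^{2} - \Phi_y^{2}\right)\right]^{1/(\gamma-1)}
    ```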

  4. Improving Access to Data While Protecting Confidentiality: Prospects for the Future.

    ERIC Educational Resources Information Center

    Duncan, George T.; Pearson, Robert W.

    Providing researchers, especially those in the social sciences, with access to publicly collected microdata furthers research while advancing public policy goals in a democratic society. However, while technological improvements have eased remote access to these databases and enabled computer-using researchers to perform sophisticated statistical…

  5. Available for the Apple II: FIRM: Florida InteRactive Modeler.

    ERIC Educational Resources Information Center

    Levy, C. Michael; And Others

    1983-01-01

    The Apple II microcomputer program described allows instructors with minimal programing experience to construct computer models of psychological phenomena for students to investigate. Use of these models eliminates need to maintain/house/breed animals or purchase sophisticated laboratory equipment. Several content models are also described,…

  6. MODELS-3 (CMAQ). NARSTO NEWS (VOL. 3, NO. 2, SUMMER/FALL 1999)

    EPA Science Inventory

    A revised version of the U.S. EPA's Models-3/CMAQ system was released on June 30, 1999. Models-3 consists of a sophisticated computational framework for environmental models allowing for much flexibility in the communications between component parts of the system, in updating or ...

  7. Analysis of an Anti-Phishing Lab Activity

    ERIC Educational Resources Information Center

    Werner, Laurie A.; Courte, Jill

    2010-01-01

    Despite advances in spam detection software, anti-spam laws, and increasingly sophisticated users, the number of successful phishing scams continues to grow. In addition to monetary losses attributable to phishing, there is also a loss of confidence that stifles use of online services. Using in-class activities in an introductory computer course…

  8. Artificial Intelligence Applications in Special Education: How Feasible? Final Report.

    ERIC Educational Resources Information Center

    Hofmeister, Alan M.; Ferrara, Joseph M.

    The research project investigated whether expert system tools have become sophisticated enough to be applied efficiently to problems in special education. (Expert systems are a development of artificial intelligence that combines the computer's capacity for storing specialized knowledge with a general set of rules intended to replicate the…

  9. Instructional Design Considerations in Converting Non-CBT Materials into CBT Courses.

    ERIC Educational Resources Information Center

    Ng, Raymond

    Instructional designers who are asked to convert existing training materials into computer-based training (CBT) must take special precautions to avoid making the product into a sophisticated page turner. Although conversion may save considerable time on subject research and analysis, courses to be delivered through microcomputers may require…

  10. CBES--An Efficient Implementation of the Coursewriter Language.

    ERIC Educational Resources Information Center

    Franks, Edward W.

    An extensive computer based education system (CBES) built around the IBM Coursewriter III program product at Ohio State University is described. In this system, numerous extensions have been added to the Coursewriter III language to provide capabilities needed to implement sophisticated instructional strategies. CBES design goals include lower CPU…

  11. Detecting Satisficing in Online Surveys

    ERIC Educational Resources Information Center

    Salifu, Shani

    2012-01-01

    The proliferation of computers and high-speed internet services is making online activities an integral part of people's lives as they connect with friends, shop, and exchange data. The increasing ability of the internet to handle sophisticated data exchanges is endearing it to researchers interested in gathering all kinds of data. This method has the…

  12. A Performance Support Tool for Cisco Training Program Managers

    ERIC Educational Resources Information Center

    Benson, Angela D.; Bothra, Jashoda; Sharma, Priya

    2004-01-01

    Performance support systems can play an important role in corporations by managing and allowing distribution of information more easily. These systems run the gamut from simple paper job aids to sophisticated computer- and web-based software applications that support the entire corporate supply chain. According to Gery (1991), a performance…

  13. Introduction to Autonomous Mobile Robotics Using "Lego Mindstorms" NXT

    ERIC Educational Resources Information Center

    Akin, H. Levent; Meriçli, Çetin; Meriçli, Tekin

    2013-01-01

    Teaching the fundamentals of robotics to computer science undergraduates requires designing a well-balanced curriculum that is complemented with hands-on applications on a platform that allows rapid construction of complex robots, and implementation of sophisticated algorithms. This paper describes such an elective introductory course where the…

  14. Robotics for Computer Scientists: What's the Big Idea?

    ERIC Educational Resources Information Center

    Touretzky, David S.

    2013-01-01

    Modern robots, like today's smartphones, are complex devices with intricate software systems. Introductory robot programming courses must evolve to reflect this reality, by teaching students to make use of the sophisticated tools their robots provide rather than reimplementing basic algorithms. This paper focuses on teaching with Tekkotsu, an open…

  15. Teaching Conversations with the XDS Sigma 7. Systems Description.

    ERIC Educational Resources Information Center

    Bork, Alfred M.; Mosmann, Charles

    Some computers permit conventional programing languages to be extended by the use of macro-instructions, a sophisticated programing tool which is especially useful in writing instructional dialogs. Macro-instructions (or "macro's") are complex commands defined in terms of the machine language or other macro-instructions. Like terms in…

  16. Data management in the mission data system

    NASA Technical Reports Server (NTRS)

    Wagner, David A.

    2005-01-01

    As spacecraft evolve from simple embedded devices to become more sophisticated computing platforms with complex behaviors, it is increasingly necessary to model and manage the flow of data, and to provide uniform models for managing data that promote adaptability, yet pay heed to the physical limitations of the embedded and space environments.

  17. Technology Acceptance in Social Work Education: Implications for the Field Practicum

    ERIC Educational Resources Information Center

    Colvin, Alex Don; Bullock, Angela N.

    2014-01-01

    The exponential growth and sophistication of new information and computer technology (ICT) have greatly influenced human interactions and provided new metaphors for understanding the world. The acceptance and integration of ICT into social work field education are examined here using the technological acceptance model. This article also explores…

  18. Models and Methodologies for Multimedia Courseware Production.

    ERIC Educational Resources Information Center

    Barker, Philip; Giller, Susan

    Many new technologies are now available for delivering and/or providing access to computer-based learning (CBL) materials. These technologies vary in sophistication in many important ways, depending upon the bandwidth that they provide, the interactivity that they offer and the types of end-user connectivity that they support. Invariably,…

  19. Forecasting the Occurrence of Severe Haze Events in Asia using Machine Learning Algorithms

    NASA Astrophysics Data System (ADS)

    Walton, A. L.

    2016-12-01

    Particulate pollution has become a serious environmental issue in many Asian countries in recent decades, threatening human health and frequently causing low-visibility or haze days that disrupt everything from work, outdoor, and school activities to air, road, and sea transportation. Ultimately preventing such severe haze requires many difficult tasks to be accomplished, dealing with trade and negotiation, emission control, energy consumption, transportation, and land and plantation management, among others, by all involved countries or parties. Before these difficult measures can take place, it would be more practical to reduce the economic loss by developing the skill to predict the occurrence of such events with reasonable accuracy, so that effective mitigation or adaptation measures can be implemented ahead of time. The "traditional" numerical models developed from fluid dynamics and explicit or parameterized representations of physiochemical processes can certainly be used for this task. However, the significant and sophisticated spatiotemporal variabilities associated with these events, the propagation of numerical or parameterization errors through model integration, and the computational demand all pose serious challenges to the practice of using these models for this interdisciplinary task. On the other hand, large quantities of meteorological, hydrological, atmospheric aerosol and composition, and surface visibility data from in-situ observation, reanalysis, or satellite retrievals have become available to the community. These data might still not be sufficient for evaluating and improving certain important aspects of the "traditional" models. Nevertheless, it is likely that these data can already support the effort to develop alternative "task-oriented" and computationally efficient forecasting skills using deep machine learning techniques, avoiding the need to deal directly with the sophisticated interplays across multiple process layers. I will present an experimental case of applying machine learning techniques to predict the occurrence of severe haze events in Asia.

  20. Forecasting the Occurrence of Severe Haze Events in Asia using Machine Learning Algorithms

    NASA Astrophysics Data System (ADS)

    Wang, C.

    2017-12-01

    Particulate pollution has become a serious environmental issue in many Asian countries in recent decades, threatening human health and frequently causing low-visibility or haze days that disrupt everything from work, outdoor, and school activities to air, road, and sea transportation. Ultimately preventing such severe haze requires many difficult tasks to be accomplished, dealing with trade and negotiation, emission control, energy consumption, transportation, and land and plantation management, among others, by all involved countries or parties. Before these difficult measures can take place, it would be more practical to reduce the economic loss by developing the skill to predict the occurrence of such events with reasonable accuracy, so that effective mitigation or adaptation measures can be implemented ahead of time. The "traditional" numerical models developed from fluid dynamics and explicit or parameterized representations of physiochemical processes can certainly be used for this task. However, the significant and sophisticated spatiotemporal variabilities associated with these events, the propagation of numerical or parameterization errors through model integration, and the computational demand all pose serious challenges to the practice of using these models for this interdisciplinary task. On the other hand, large quantities of meteorological, hydrological, atmospheric aerosol and composition, and surface visibility data from in-situ observation, reanalysis, or satellite retrievals have become available to the community. These data might still not be sufficient for evaluating and improving certain important aspects of the "traditional" models. Nevertheless, it is likely that these data can already support the effort to develop alternative "task-oriented" and computationally efficient forecasting skills using deep machine learning techniques, avoiding the need to deal directly with the sophisticated interplays across multiple process layers. I will present an experimental case of applying machine learning techniques to predict the occurrence of severe haze events in Asia.
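
    Neither abstract names a specific algorithm, so the "task-oriented" idea can only be illustrated generically. A minimal sketch, assuming a supervised classifier over synthetic daily meteorological predictors (all feature names, thresholds, and data are invented):

    ```python
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(0)
    n = 5000
    # Hypothetical daily predictors: wind speed, humidity, boundary-layer height, AOD
    X = np.column_stack([rng.gamma(2.0, 2.0, n),      # wind speed (m/s)
                         rng.uniform(20, 100, n),     # relative humidity (%)
                         rng.uniform(200, 2000, n),   # boundary-layer height (m)
                         rng.gamma(2.0, 0.3, n)])     # aerosol optical depth

    # Synthetic "severe haze" label: stagnant, humid, shallow, aerosol-laden days
    y = ((X[:, 0] < 2.5) & (X[:, 1] > 70) &
         (X[:, 2] < 600) & (X[:, 3] > 0.8)).astype(int)

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
    clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
    print("held-out accuracy:", clf.score(X_te, y_te))
    ```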

  1. Computational dynamics of soft machines

    NASA Astrophysics Data System (ADS)

    Hu, Haiyan; Tian, Qiang; Liu, Cheng

    2017-06-01

    Soft machine refers to a kind of mechanical system made of soft materials that completes sophisticated missions, such as handling a fragile object or crawling along a narrow tunnel corner, under low-cost control and actuation. Hence, soft machines pose great challenges to computational dynamics. In this review article, recent studies of the authors on the dynamic modeling, numerical simulation, and experimental validation of soft machines are summarized in the framework of multibody system dynamics. The dynamic modeling approaches are presented first for the geometric nonlinearities of coupled overall motions and large deformations of a soft component, the physical nonlinearities of a soft component made of hyperelastic or elastoplastic materials, and the frictional contacts/impacts of soft components, respectively. Then the computation approach is outlined for the dynamic simulation of soft machines governed by a set of differential-algebraic equations of very high dimensions, with an emphasis on the efficient computation of the nonlinear elastic force vector of finite elements. The validations of the proposed approaches are given via three case studies, including the locomotion of a soft quadrupedal robot, the spinning deployment of a solar sail of a spacecraft, and the deployment of a mesh reflector of a satellite antenna, as well as the corresponding experimental studies. Finally, some remarks are made for future studies.
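
    For context, the governing equations referred to here typically take the standard index-3 differential-algebraic form of constrained multibody dynamics (a textbook statement; the authors' notation may differ):

    ```latex
    % Equations of motion of a constrained flexible multibody system
    M(\mathbf{q})\,\ddot{\mathbf{q}}
      + \Phi_{\mathbf{q}}^{\mathsf{T}}(\mathbf{q},t)\,\boldsymbol{\lambda}
      = \mathbf{Q}(\mathbf{q},\dot{\mathbf{q}},t),
    \qquad
    \Phi(\mathbf{q},t) = \mathbf{0}
    ```

    where q collects the generalized coordinates, M is the mass matrix, \Phi the constraint equations, \Phi_q their Jacobian, \lambda the Lagrange multipliers, and Q the generalized forces.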

  2. Natural three-qubit interactions in one-way quantum computing

    NASA Astrophysics Data System (ADS)

    Tame, M. S.; Paternostro, M.; Kim, M. S.; Vedral, V.

    2006-02-01

    We address the effects of natural three-qubit interactions on the computational power of one-way quantum computation. A benefit of using more sophisticated entanglement structures is the ability to construct compact and economic simulations of quantum algorithms with limited resources. We show that the features of our study are embodied by suitably prepared optical lattices, where effective three-spin interactions have been theoretically demonstrated. We use this to provide a compact construction for the Toffoli gate. Information flow and two-qubit interactions are also outlined, together with a brief analysis of relevant sources of imperfection.

  3. A Low Cost Remote Sensing System Using PC and Stereo Equipment

    NASA Technical Reports Server (NTRS)

    Campbell, Joel F.; Flood, Michael A.; Prasad, Narasimha S.; Hodson, Wade D.

    2011-01-01

    A system using a personal computer, a speaker, and a microphone detects objects and makes crude measurements using a carrier modulated by a pseudorandom noise (PN) code. This system can be constructed using a personal computer and audio equipment commonly found in the laboratory or at home, or more sophisticated equipment that can be purchased at reasonable cost. We demonstrate its value as an instructional tool for teaching concepts of remote sensing and digital signal processing.
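
    The core signal-processing idea, correlating the received echo against the transmitted pseudorandom code and reading the delay off the correlation peak, can be sketched in a few lines (sample rate, chip rate, code length, and delay are all invented for the demo):

    ```python
    import numpy as np

    fs = 44100                      # audio sample rate (Hz)
    c = 343.0                       # speed of sound (m/s)
    chip_rate = 2205                # chips per second (hypothetical)
    rng = np.random.default_rng(0)

    pn = rng.choice([-1.0, 1.0], size=511)        # pseudorandom +/-1 code
    sps = fs // chip_rate                         # samples per chip
    tx = np.repeat(pn, sps)                       # transmitted baseband signal

    delay = 123                                   # unknown echo delay (samples)
    rx = np.zeros(len(tx) + 2000)
    rx[delay:delay + len(tx)] += 0.5 * tx         # attenuated echo
    rx += 0.2 * rng.standard_normal(len(rx))      # measurement noise

    corr = np.correlate(rx, tx, mode="valid")     # matched filter
    est = int(np.argmax(corr))                    # peak location = delay estimate
    print(f"estimated distance: {est / fs * c / 2:.2f} m (round trip halved)")
    ```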

  4. Robust Informatics Infrastructure Required For ICME: Combining Virtual and Experimental Data

    NASA Technical Reports Server (NTRS)

    Arnold, Steven M.; Holland, Frederic A. Jr.; Bednarcyk, Brett A.

    2014-01-01

    With the increased emphasis on reducing the cost and time to market of new materials, the need for robust automated materials information management systems enabling sophisticated data mining tools is increasing, as evidenced by the emphasis on Integrated Computational Materials Engineering (ICME) and the recent establishment of the Materials Genome Initiative (MGI). This need is also fueled by the demands for higher efficiency in material testing; consistency, quality, and traceability of data; product design; engineering analysis; as well as control of access to proprietary or sensitive information. Further, the use of increasingly sophisticated nonlinear, anisotropic, and/or multi-scale models requires both the processing of large volumes of test data and the complex materials data necessary to establish processing-microstructure-property-performance relationships. Fortunately, materials information management systems have kept pace with the growing user demands and evolved to enable: (i) the capture of both point-wise data and full spectra of raw data curves; (ii) data management functions such as access, version, and quality controls; (iii) a wide range of data import, export, and analysis capabilities; (iv) data pedigree traceability mechanisms; (v) data searching, reporting, and viewing tools; and (vi) access to the information via a wide range of interfaces. This paper discusses key principles for the development of a robust materials information management system that enables the connections at various length scales to be made between experimental data and corresponding multiscale modeling toolsets to enable ICME. In particular, NASA Glenn's efforts toward establishing such a database for capturing constitutive modeling behavior for both monolithic and composite materials are discussed.

  5. Effective Materials Property Information Management for the 21st Century

    NASA Technical Reports Server (NTRS)

    Ren, Weiju; Cebon, David; Arnold, Steve

    2009-01-01

    This paper discusses key principles for the development of materials property information management software systems. There are growing needs for automated materials information management in various organizations. In part these are fueled by the demands for higher efficiency in material testing, product design and engineering analysis. But equally important, organizations are being driven by the need for consistency, quality and traceability of data, as well as control of access to sensitive information such as proprietary data. Further, the use of increasingly sophisticated nonlinear, anisotropic and multi-scale engineering analyses requires both processing of large volumes of test data for development of constitutive models and complex materials data input for Computer-Aided Engineering (CAE) software. And finally, the globalization of the economy often generates a great need to share a single "gold source" of materials information between members of global engineering teams in extended supply chains. Fortunately, material property management systems have kept pace with the growing user demands and evolved into versatile data management systems that can be customized to specific user needs. The more sophisticated of these provide facilities for: (i) data management functions such as access, version, and quality controls; (ii) a wide range of data import, export and analysis capabilities; (iii) data "pedigree" traceability mechanisms; (iv) data searching, reporting and viewing tools; and (v) access to the information via a wide range of interfaces. In this paper, the important requirements for advanced material data management systems, future challenges and opportunities such as automated error checking, data quality characterization, identification of gaps in datasets, as well as functionalities and business models to fuel database growth and maintenance are discussed.

  6. The Matrix Element Method: Past, Present, and Future

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gainer, James S.; Lykken, Joseph; Matchev, Konstantin T.

    2013-07-12

    The increasing use of multivariate methods, and in particular the Matrix Element Method (MEM), represents a revolution in experimental particle physics. With continued exponential growth in computing capabilities, the use of sophisticated multivariate methods, already common, will soon become ubiquitous and ultimately almost compulsory. While the existence of sophisticated algorithms for disentangling signal and background might naively suggest a diminished role for theorists, the use of the MEM, with its inherent connection to the calculation of differential cross sections, will benefit from collaboration between theorists and experimentalists. In this white paper, we will briefly describe the MEM and some of its recent uses, note some current issues and potential resolutions, and speculate about exciting future opportunities.
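
    As a reminder of what the method computes, the per-event MEM likelihood can be written schematically (a standard textbook form, not necessarily the authors' notation) as:

    ```latex
    P(x \mid \alpha) \;=\; \frac{1}{\sigma_\alpha}
      \int \mathrm{d}\Phi(y)\; f_1(q_1)\, f_2(q_2)\;
      \bigl|\mathcal{M}_\alpha(y)\bigr|^{2}\; W(x \mid y)
    ```

    Here x is the reconstructed event, y the parton-level configuration, f_i the parton distribution functions, W(x|y) the detector transfer function, and \sigma_\alpha the normalizing cross section for hypothesis \alpha.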

  7. Novel opportunities for computational biology and sociology in drug discovery☆

    PubMed Central

    Yao, Lixia; Evans, James A.; Rzhetsky, Andrey

    2013-01-01

    Current drug discovery is impossible without sophisticated modeling and computation. In this review we outline previous advances in computational biology and, by tracing the steps involved in pharmaceutical development, explore a range of novel, high-value opportunities for computational innovation in modeling the biological process of disease and the social process of drug discovery. These opportunities include text mining for new drug leads, modeling molecular pathways and predicting the efficacy of drug cocktails, analyzing genetic overlap between diseases and predicting alternative drug use. Computation can also be used to model research teams and innovative regions and to estimate the value of academy–industry links for scientific and human benefit. Attention to these opportunities could promise punctuated advance and will complement the well-established computational work on which drug discovery currently relies. PMID:20349528

  8. A computer simulation of aircraft evacuation with fire

    NASA Technical Reports Server (NTRS)

    Middleton, V. E.

    1983-01-01

    A computer simulation was developed to assess passenger survival during the post-crash evacuation of a transport category aircraft when fire is a major threat. The computer code, FIREVAC, computes individual passenger exit paths and times to exit, taking into account delays and congestion caused by the interaction among the passengers and changing cabin conditions. Simple models for the physiological effects of the toxic cabin atmosphere are included with provision for including more sophisticated models as they become available. Both wide-body and standard-body aircraft may be simulated. Passenger characteristics are assigned stochastically from experimentally derived distributions. Results of simulations of evacuation trials and hypothetical evacuations under fire conditions are presented.
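
    FIREVAC itself is not reproduced here, but the flavor of such a simulation (stochastically assigned passenger attributes feeding a congested exit) can be sketched in a few lines; every parameter below is hypothetical:

    ```python
    import random

    random.seed(1)
    n_pax, exit_rate = 150, 1.2    # passengers; passengers per second through door
    dist = [random.uniform(2, 25) for _ in range(n_pax)]     # metres to the exit
    speed = [max(random.gauss(1.2, 0.3), 0.3) for _ in range(n_pax)]  # m/s, floored
    arrive = sorted(d / s for d, s in zip(dist, speed))      # arrival times at door

    t_free = 0.0                   # time at which the exit next becomes free
    exit_times = []
    for t in arrive:               # single-server queue at the exit
        t_free = max(t_free, t) + 1.0 / exit_rate
        exit_times.append(t_free)
    print(f"last passenger out at {exit_times[-1]:.1f} s")
    ```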

  9. MoCog1: A computer simulation of recognition-primed human decision making

    NASA Technical Reports Server (NTRS)

    Gevarter, William B.

    1991-01-01

    The results of the first stage of a research effort to develop a 'sophisticated' computer model of human cognitive behavior are described. Most human decision making is an experience-based, relatively straightforward, largely automatic response to internal goals and drives, utilizing cues and opportunities perceived from the current environment. The development of the architecture and computer program (MoCog1) associated with such 'recognition-primed' decision making is discussed. The resultant computer program was successfully utilized as a vehicle to simulate earlier findings that relate how an individual's implicit theories orient the individual toward particular goals, with resultant cognitions, affects, and behavior in response to their environment.

  10. High-Throughput Thermodynamic Modeling and Uncertainty Quantification for ICME

    NASA Astrophysics Data System (ADS)

    Otis, Richard A.; Liu, Zi-Kui

    2017-05-01

    One foundational component of integrated computational materials engineering (ICME) and the Materials Genome Initiative is computational thermodynamics, based on the calculation of phase diagrams (CALPHAD) method. The CALPHAD method, pioneered by Kaufman, has enabled the development of thermodynamic, atomic mobility, and molar volume databases of individual phases in the full space of temperature, composition, and sometimes pressure for technologically important multicomponent engineering materials, along with sophisticated computational tools for using the databases. In this article, our recent efforts will be presented in terms of developing new computational tools for high-throughput modeling and uncertainty quantification based on high-throughput, first-principles calculations and the CALPHAD method, along with their potential propagation to downstream ICME modeling and simulations.
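
    As background, the CALPHAD method parameterizes the molar Gibbs energy of each phase; for a binary substitutional solution phase, the standard Redlich-Kister form is:

    ```latex
    % Molar Gibbs energy of a binary solution phase (Redlich-Kister excess term)
    G_m = x_A\,{}^{\circ}G_A + x_B\,{}^{\circ}G_B
        + RT\left(x_A \ln x_A + x_B \ln x_B\right)
        + x_A x_B \sum_{v \ge 0} {}^{v}L_{AB}\,\left(x_A - x_B\right)^{v}
    ```

    where the {}^{v}L_{AB} interaction parameters are the fitted quantities, and in uncertainty quantification it is their distributions that get propagated downstream.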

  11. CANFAR + Skytree: Mining Massive Datasets as an Essential Part of the Future of Astronomy

    NASA Astrophysics Data System (ADS)

    Ball, Nicholas M.

    2013-01-01

    The future study of large astronomical datasets, consisting of hundreds of millions to billions of objects, will be dominated by large computing resources, and by analysis tools of the necessary scalability and sophistication to extract useful information. Significant effort will be required to fulfil their potential as a provider of the next generation of science results. To date, computing systems have allowed either sophisticated analysis of small datasets, e.g., most astronomy software, or simple analysis of large datasets, e.g., database queries. At the Canadian Astronomy Data Centre, we have combined our cloud computing system, the Canadian Advanced Network for Astronomical Research (CANFAR), with the world's most advanced machine learning software, Skytree, to create the world's first cloud computing system for data mining in astronomy. This allows the full sophistication of the huge fields of data mining and machine learning to be applied to the hundreds of millions of objects that make up current large datasets. CANFAR works by utilizing virtual machines, which appear to the user as equivalent to a desktop. Each machine is replicated as desired to perform large-scale parallel processing. Such an arrangement carries far more flexibility than other cloud systems, because it enables the user to immediately install and run the same code that they already utilize for science on their desktop. We demonstrate the utility of the CANFAR + Skytree system by showing science results obtained, including assigning photometric redshifts with full probability density functions (PDFs) to a catalog of approximately 133 million galaxies from the MegaPipe reductions of the Canada-France-Hawaii Telescope Legacy Wide and Deep surveys. Each PDF is produced nonparametrically from 100 instances of the photometric parameters for each galaxy, generated by perturbing within the errors on the measurements. Hence, we produce, store, and assign redshifts to a catalog of over 13 billion object instances. This catalog is comparable in size to those expected from next-generation surveys, such as the Large Synoptic Survey Telescope. The CANFAR + Skytree system is open for use by any interested member of the astronomical community.
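
    The PDF construction described above is simple to sketch: perturb each galaxy's photometry within its errors, re-run the photometric-redshift estimator on every instance, and histogram the results. A minimal sketch (the estimator below is a stand-in for the trained machine-learning model, and the magnitudes and errors are invented):

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    def photo_z(mags):
        """Stand-in photometric-redshift estimator; any mapping from
        photometry to redshift can be slotted in here."""
        return 0.1 * (mags.sum() - 100.0)

    mags = np.array([24.1, 23.5, 23.0, 22.8, 22.7])   # hypothetical magnitudes
    errs = np.array([0.10, 0.05, 0.04, 0.04, 0.06])   # their 1-sigma errors

    # 100 instances per galaxy: perturb within the errors, re-estimate, histogram
    zs = np.array([photo_z(mags + errs * rng.standard_normal(mags.size))
                   for _ in range(100)])
    pdf, edges = np.histogram(zs, bins=20, density=True)  # nonparametric PDF
    ```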

  12. Linking live animals and products: traceability.

    PubMed

    Britt, A G; Bell, C M; Evers, K; Paskin, R

    2013-08-01

    It is rarely possible to successfully contain an outbreak of an infectious animal disease, or to respond effectively to a chemical residue incident, without the use of a system for identifying and tracking animals. The linking of animals at the time they are slaughtered--through the use of identification devices or marks and accompanying movement documentation--with the meat produced from their carcasses adds further value from the perspective of consumer safety. Over the past decade, animal identification technology has become more sophisticated and affordable. The development of the Internet and mobile communication tools, complemented by the expanded capacity of computers and associated data management applications, has added a new dimension to the ability of Competent Authorities and industry to track animals and the food they produce for disease control, food safety and commercial purposes.

  13. The Cerebellum: Adaptive Prediction for Movement and Cognition.

    PubMed

    Sokolov, Arseny A; Miall, R Chris; Ivry, Richard B

    2017-05-01

    Over the past 30 years, cumulative evidence has indicated that cerebellar function extends beyond sensorimotor control. This view has emerged from studies of neuroanatomy, neuroimaging, neuropsychology, and brain stimulation, with the results implicating the cerebellum in domains as diverse as attention, language, executive function, and social cognition. Although the literature provides sophisticated models of how the cerebellum helps refine movements, it remains unclear how the core mechanisms of these models can be applied when considering a broader conceptualization of cerebellar function. In light of recent multidisciplinary findings, we examine how two key concepts that have been suggested as general computational principles of cerebellar function, prediction and error-based learning, might be relevant in the operation of cognitive cerebro-cerebellar loops. Copyright © 2017 Elsevier Ltd. All rights reserved.

  14. Moose: An Open-Source Framework to Enable Rapid Development of Collaborative, Multi-Scale, Multi-Physics Simulation Tools

    NASA Astrophysics Data System (ADS)

    Slaughter, A. E.; Permann, C.; Peterson, J. W.; Gaston, D.; Andrs, D.; Miller, J.

    2014-12-01

    The Idaho National Laboratory (INL)-developed Multiphysics Object Oriented Simulation Environment (MOOSE; www.mooseframework.org) is an open-source, parallel computational framework for enabling the solution of complex, fully implicit multiphysics systems. MOOSE provides a set of computational tools that scientists and engineers can use to create sophisticated multiphysics simulations. Applications built using MOOSE have computed solutions for chemical reaction and transport equations, computational fluid dynamics, solid mechanics, heat conduction, mesoscale materials modeling, geomechanics, and others. To facilitate the coupling of diverse and highly coupled physical systems, MOOSE employs the Jacobian-free Newton-Krylov (JFNK) method when solving the coupled nonlinear systems of equations arising in multiphysics applications. The MOOSE framework is written in C++, and leverages other high-quality, open-source scientific software packages such as LibMesh, Hypre, and PETSc. MOOSE uses a "hybrid parallel" model which combines both shared memory (thread-based) and distributed memory (MPI-based) parallelism to ensure efficient resource utilization on a wide range of computational hardware. MOOSE-based applications are inherently modular, which allows for simulation expansion (via coupling of additional physics modules) and the creation of multi-scale simulations. Any application developed with MOOSE supports running (in parallel) any other MOOSE-based application. Each application can be developed independently, yet easily communicate with other applications (e.g., conductivity in a slope-scale model could be a constant input, or a complete phase-field micro-structure simulation) without additional code being written. This method of development has proven effective at INL and expedites the development of sophisticated, sustainable, and collaborative simulation tools.
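
    The key trick in JFNK is that Newton's method never needs the Jacobian explicitly, only its action on a Krylov vector, which is approximated matrix-free by a finite difference of the nonlinear residual F:

    ```latex
    J(\mathbf{u})\,\mathbf{v} \;\approx\;
      \frac{F(\mathbf{u} + \epsilon\,\mathbf{v}) - F(\mathbf{u})}{\epsilon},
    \qquad \epsilon \ \text{small, typically scaled with } \sqrt{\varepsilon_{\mathrm{mach}}}
    ```

    Each Krylov iteration therefore costs one extra residual evaluation instead of a Jacobian assembly, which is what makes tightly coupled multiphysics systems tractable.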

  15. GiveMe Shelter: A People-Centred Design Process for Promoting Independent Inquiry-Led Learning in Engineering

    ERIC Educational Resources Information Center

    Dyer, Mark; Grey, Thomas; Kinnane, Oliver

    2017-01-01

    It has become increasingly common for tasks traditionally carried out by engineers to be undertaken by technicians and technologist with access to sophisticated computers and software that can often perform complex calculations that were previously the responsibility of engineers. Not surprisingly, this development raises serious questions about…

  16. C-SWAT: The Soil and Water Assessment Tool with consolidated input files in alleviating computational burden of recursive simulations

    USDA-ARS?s Scientific Manuscript database

    The temptation to include more model parameters and high-resolution input data, together with the availability of powerful optimization and uncertainty analysis algorithms, has significantly enhanced the complexity of hydrologic and water quality modeling. However, the ability to take advantage of sophist...

  17. A Developing Market for Continuing Higher Education: The Reserve Components.

    ERIC Educational Resources Information Center

    Watt, David M.

    Due to increasingly sophisticated military equipment, the Reserve Components of the armed forces need to raise the educational standards for recruits. A number of U.S. educational institutions have responded to their needs for continuing higher education in the areas of job skill enhancement (such as computer operation), regular courses directly…

  18. The RCM: A Resource Management and Program Budgeting Approach for State and Local Educational Agencies.

    ERIC Educational Resources Information Center

    Chambers, Jay G.; Parrish, Thomas B.

    The Resource Cost Model (RCM) is a resource management system that combines the technical advantages of sophisticated computer simulation software with the practical benefits of group decision making to provide detailed information about educational program costs. The first section of this document introduces the conceptual framework underlying…

  19. Deep FIFO Surge Buffer

    NASA Technical Reports Server (NTRS)

    Temple, Gerald; Siegel, Marc; Amitai, Zwie

    1991-01-01

    First-in/first-out (FIFO) temporarily stores short surges of data generated by data-acquisition system at excessively high rate and releases data at lower rate suitable for processing by computer. Size and complexity reduced while capacity enhanced by use of newly developed, sophisticated integrated circuits and by "byte-folding" scheme doubling effective depth and data rate.

  20. Let's Dance the "Robot Hokey-Pokey!": Children's Programming Approaches and Achievement throughout Early Cognitive Development

    ERIC Educational Resources Information Center

    Flannery, Louise P.; Bers, Marina Umaschi

    2013-01-01

    Young learners today generate, express, and interact with sophisticated ideas using a range of digital tools to explore interactive stories, animations, computer games, and robotics. In recent years, new developmentally appropriate robotics kits have been entering early childhood classrooms. This paper presents a retrospective analysis of one…

  1. Using Excel's Solver Function to Facilitate Reciprocal Service Department Cost Allocations

    ERIC Educational Resources Information Center

    Leese, Wallace R.

    2013-01-01

    The reciprocal method of service department cost allocation requires linear equations to be solved simultaneously. These computations are often so complex as to cause the abandonment of the reciprocal method in favor of the less sophisticated and theoretically incorrect direct or step-down methods. This article illustrates how Excel's Solver…

  2. ALFIL: A Crowd Simulation Serious Game for Massive Evacuation Training and Awareness

    ERIC Educational Resources Information Center

    García-García, César; Fernández-Robles, José Luis; Larios-Rosillo, Victor; Luga, Hervé

    2012-01-01

    This article presents the current development of a serious game for the simulation of massive evacuations. The purpose of this project is to promote self-protection through awareness of the procedures and different possible scenarios during the evacuation of a massive event. Sophisticated behaviors require massive computational power and it has…

  3. Children's Behavior toward and Understanding of Robotic and Living Dogs

    ERIC Educational Resources Information Center

    Melson, Gail F.; Kahn, Peter H., Jr.; Beck, Alan; Friedman, Batya; Roberts, Trace; Garrett, Erik; Gill, Brian T.

    2009-01-01

    This study investigated children's reasoning about and behavioral interactions with a computationally sophisticated robotic dog (Sony's AIBO) compared to a live dog (an Australian Shepherd). Seventy-two children from three age groups (7-9 years, 10-12 years, and 13-15 years) participated in this study. Results showed that more children…

  4. Using Novel Word Context Measures to Predict Human Ratings of Lexical Proficiency

    ERIC Educational Resources Information Center

    Berger, Cynthia M.; Crossley, Scott A.; Kyle, Kristopher

    2017-01-01

    This study introduces a model of lexical proficiency based on novel computational indices related to word context. The indices come from an updated version of the Tool for the Automatic Analysis of Lexical Sophistication (TAALES) and include associative, lexical, and semantic measures of word context. Human ratings of holistic lexical proficiency…

  5. Using Excel's Matrix Operations to Facilitate Reciprocal Cost Allocations

    ERIC Educational Resources Information Center

    Leese, Wallace R.; Kizirian, Tim

    2009-01-01

    The reciprocal method of service department cost allocation requires linear equations to be solved simultaneously. These computations are often so complex as to cause the abandonment of the reciprocal method in favor of the less sophisticated direct or step-down methods. Here is a short example demonstrating how Excel's sometimes unknown matrix…
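
    Outside a spreadsheet, the same simultaneous equations solve in a few lines of linear algebra. A minimal sketch with invented numbers, where x = d + Ax gives the reciprocated totals:

    ```python
    import numpy as np

    # Two service departments, S1 and S2, that serve each other.
    direct = np.array([50_000.0, 30_000.0])   # direct costs of S1, S2
    # A[i, j] = fraction of department j's total cost allocated to department i
    A = np.array([[0.0, 0.2],                 # S1 receives 20% of S2
                  [0.1, 0.0]])                # S2 receives 10% of S1

    # Total costs satisfy x = direct + A @ x, i.e. (I - A) @ x = direct
    x = np.linalg.solve(np.eye(2) - A, direct)
    print(x)   # reciprocated totals to allocate on to producing departments
    ```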

  6. Computational Models Used to Assess US Tobacco Control Policies.

    PubMed

    Feirman, Shari P; Glasser, Allison M; Rose, Shyanika; Niaura, Ray; Abrams, David B; Teplitskaya, Lyubov; Villanti, Andrea C

    2017-11-01

    Simulation models can be used to evaluate existing and potential tobacco control interventions, including policies. The purpose of this systematic review was to synthesize evidence from computational models used to project population-level effects of tobacco control interventions. We provide recommendations to strengthen simulation models that evaluate tobacco control interventions. Studies were eligible for review if they employed a computational model to predict the expected effects of a non-clinical US-based tobacco control intervention. We searched five electronic databases on July 1, 2013 with no date restrictions and synthesized studies qualitatively. Six primary non-clinical intervention types were examined across the 40 studies: taxation, youth prevention, smoke-free policies, mass media campaigns, marketing/advertising restrictions, and product regulation. Simulation models demonstrated the independent and combined effects of these interventions on decreasing projected future smoking prevalence. Taxation effects were the most robust, as studies examining other interventions exhibited substantial heterogeneity with regard to the outcomes and specific policies examined across models. Models should project the impact of interventions on overall tobacco use, including nicotine delivery product use, to estimate preventable health and cost-saving outcomes. Model validation, transparency, more sophisticated models, and modeling policy interactions are also needed to inform policymakers to make decisions that will minimize harm and maximize health. In this systematic review, evidence from multiple studies demonstrated the independent effect of taxation on decreasing future smoking prevalence, and models for other tobacco control interventions showed that these strategies are expected to decrease smoking, benefit population health, and are reasonable to implement from a cost perspective. Our recommendations aim to help policymakers and researchers minimize harm and maximize overall population-level health benefits by considering the real-world context in which tobacco control interventions are implemented. © The Author 2017. Published by Oxford University Press on behalf of the Society for Research on Nicotine and Tobacco. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  7. Current Status on the use of Parallel Computing in Turbulent Reacting Flow Computations Involving Sprays, Monte Carlo PDF and Unstructured Grids. Chapter 4

    NASA Technical Reports Server (NTRS)

    Raju, M. S.

    1998-01-01

    The state of the art in multidimensional combustor modeling, as evidenced by the level of sophistication employed in terms of modeling and numerical accuracy considerations, is also dictated by the available computer memory and turnaround times afforded by present-day computers. With the aim of advancing the current multi-dimensional computational tools used in the design of advanced technology combustors, a solution procedure is developed that combines the novelty of the coupled CFD/spray/scalar Monte Carlo PDF (Probability Density Function) computations on unstructured grids with the ability to run on parallel architectures. In this approach, the mean gas-phase velocity and turbulence fields are determined from a standard turbulence model, the joint composition of species and enthalpy from the solution of a modeled PDF transport equation, and a Lagrangian-based dilute spray model is used for the liquid-phase representation. Gas-turbine combustor flows are often characterized by a complex interaction between various physical processes associated with the interaction between the liquid and gas phases, droplet vaporization, turbulent mixing, heat release associated with chemical kinetics, and radiative heat transfer associated with highly absorbing and radiating species, among others. The rate-controlling processes often interact with each other at various disparate time and length scales. In particular, turbulence plays an important role in determining the rates of mass and heat transfer, chemical reactions, and liquid-phase evaporation in many practical combustion devices.

  8. The house of the future

    ScienceCinema

    None

    2017-12-09

    Learn what it will take to create tomorrow's net-zero energy home as scientists reveal the secrets of cool roofs, smart windows, and computer-driven energy control systems. The net-zero energy home: Scientists are working to make tomorrow's homes more than just energy efficient -- they want them to be zero energy. Iain Walker, a scientist in the Lab's Energy Performance of Buildings Group, will discuss what it takes to develop net-zero energy houses that generate as much energy as they use through highly aggressive energy efficiency and on-site renewable energy generation. Talking back to the grid: Imagine programming your house to use less energy if the electricity grid is full or prices are high. Mary Ann Piette, deputy director of Berkeley Lab's building technology department and director of the Lab's Demand Response Research Center, will discuss how new technologies are enabling buildings to listen to the grid and automatically change their thermostat settings or lighting loads, among other demands, in response to fluctuating electricity prices. The networked (and energy efficient) house: In the future, your home's lights, climate control devices, computers, windows, and appliances could be controlled via a sophisticated digital network. If it's plugged in, it'll be connected. Bruce Nordman, an energy scientist in Berkeley Lab's Energy End-Use Forecasting group, will discuss how he and other scientists are working to ensure these networks help homeowners save energy.

  9. The house of the future

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    Learn what it will take to create tomorrow's net-zero energy home as scientists reveal the secrets of cool roofs, smart windows, and computer-driven energy control systems. The net-zero energy home: Scientists are working to make tomorrow's homes more than just energy efficient -- they want them to be zero energy. Iain Walker, a scientist in the Lab's Energy Performance of Buildings Group, will discuss what it takes to develop net-zero energy houses that generate as much energy as they use through highly aggressive energy efficiency and on-site renewable energy generation. Talking back to the grid: Imagine programming your house to use less energy if the electricity grid is full or prices are high. Mary Ann Piette, deputy director of Berkeley Lab's building technology department and director of the Lab's Demand Response Research Center, will discuss how new technologies are enabling buildings to listen to the grid and automatically change their thermostat settings or lighting loads, among other demands, in response to fluctuating electricity prices. The networked (and energy efficient) house: In the future, your home's lights, climate control devices, computers, windows, and appliances could be controlled via a sophisticated digital network. If it's plugged in, it'll be connected. Bruce Nordman, an energy scientist in Berkeley Lab's Energy End-Use Forecasting group, will discuss how he and other scientists are working to ensure these networks help homeowners save energy.

  10. Perceptual organization in computer vision - A review and a proposal for a classificatory structure

    NASA Technical Reports Server (NTRS)

    Sarkar, Sudeep; Boyer, Kim L.

    1993-01-01

    The evolution of perceptual organization in biological vision, and its necessity in advanced computer vision systems, arises from the characteristic that perception, the extraction of meaning from sensory input, is an intelligent process. This is particularly so for high order organisms and, analogically, for more sophisticated computational models. The role of perceptual organization in computer vision systems is explored. This is done from four vantage points. First, a brief history of perceptual organization research in both humans and computer vision is offered. Next, a classificatory structure in which to cast perceptual organization research to clarify both the nomenclature and the relationships among the many contributions is proposed. Thirdly, the perceptual organization work in computer vision in the context of this classificatory structure is reviewed. Finally, the array of computational techniques applied to perceptual organization problems in computer vision is surveyed.

  11. Control of Flexible Structures (COFS) Flight Experiment Background and Description

    NASA Technical Reports Server (NTRS)

    Hanks, B. R.

    1985-01-01

    A fundamental problem in designing and delivering large space structures to orbit is to provide sufficient structural stiffness and static configuration precision to meet performance requirements. These requirements are directly related to control requirements and the degree of control system sophistication available to supplement the as-built structure. Background and rationale are presented for a research study in structures, structural dynamics, and controls using a relatively large, flexible beam as a focus. This experiment would address fundamental problems applicable to large, flexible space structures in general and would involve a combination of ground tests, flight behavior prediction, and instrumented orbital tests. Intended to be multidisciplinary but basic within each discipline, the experiment should provide improved understanding and confidence in making design trades between structural conservatism and control system sophistication for meeting static shape and dynamic response/stability requirements. Quantitative results should be obtained for use in improving the validity of ground tests for verifying flight performance analyses.

  12. New MHD feedback control schemes using the MARTe framework in RFX-mod

    NASA Astrophysics Data System (ADS)

    Piron, Chiara; Manduchi, Gabriele; Marrelli, Lionello; Piovesan, Paolo; Zanca, Paolo

    2013-10-01

    Real-time feedback control of MHD instabilities is a topic of major interest in magnetic thermonuclear fusion, since it allows the performance of a device to be optimized even beyond its stability bounds. The stability properties of different magnetic configurations are important test benches for real-time control systems. RFX-mod, a Reversed Field Pinch experiment that can also operate as a tokamak, is a well-suited device for investigating this topic. It is equipped with a sophisticated magnetic feedback system that controls MHD instabilities and error fields by means of 192 active coils and a corresponding grid of sensors. In addition, the RFX-mod control system has recently gained new potential thanks to the introduction of the MARTe framework and of a new CPU architecture. These capabilities allow new feedback algorithms relevant to both RFP and tokamak operation to be studied, and contribute to the debate on the optimal feedback strategy. This work focuses on the design of new feedback schemes. For this purpose, new magnetic sensors have been explored, together with new algorithms that refine the de-aliasing computation of the radial sideband harmonics. The comparison of the performance of different sensors and feedback strategies is described for both RFP and tokamak experiments.

  13. Planning Framework for Mesolevel Optimization of Urban Runoff Control Schemes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhou, Qianqian; Blohm, Andrew; Liu, Bo

    A planning framework is developed to optimize runoff control schemes at scales relevant for regional planning at an early stage. The framework employs less sophisticated modeling approaches to allow a practical application in developing regions with limited data sources and computing capability. The methodology contains three interrelated modules: (1) the geographic information system (GIS)-based hydrological module, which aims at assessing local hydrological constraints and potential for runoff control according to regional land-use descriptions; (2) the grading module, which is built upon the method of fuzzy comprehensive evaluation. It is used to establish a priority ranking system to assist the allocation of runoff control targets at the subdivision level; and (3) the genetic algorithm-based optimization module, which is included to derive Pareto-based optimal solutions for mesolevel allocation with multiple competing objectives. The optimization approach describes the trade-off between different allocation plans and simultaneously ensures that all allocation schemes satisfy the minimum requirement on runoff control. Our results highlight the importance of considering the mesolevel allocation strategy in addition to measures at macrolevels and microlevels in urban runoff management. (C) 2016 American Society of Civil Engineers.
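
    The grading module's fuzzy comprehensive evaluation reduces, in its simplest form, to weighting a criterion-by-grade membership matrix. A minimal sketch with invented criteria, weights, and memberships:

    ```python
    import numpy as np

    w = np.array([0.5, 0.3, 0.2])      # criterion weights (sum to 1)

    # R[i, j]: membership of criterion i in priority grade j (high/medium/low)
    # for one subdivision; rows might represent, e.g., soil permeability,
    # slope, and land-use intensity
    R = np.array([[0.7, 0.2, 0.1],
                  [0.2, 0.5, 0.3],
                  [0.1, 0.4, 0.5]])

    b = w @ R                          # composite membership in each grade
    print("grade memberships:", b, "-> assigned grade:", int(b.argmax()))
    ```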

  14. An introduction to real-time graphical techniques for analyzing multivariate data

    NASA Astrophysics Data System (ADS)

    Friedman, Jerome H.; McDonald, John Alan; Stuetzle, Werner

    1987-08-01

    Orion I is a graphics system used to study applications of computer graphics - especially interactive motion graphics - in statistics. Orion I is the newest of a family of "Prim" systems, whose most striking common feature is the use of real-time motion graphics to display three dimensional scatterplots. Orion I differs from earlier Prim systems through the use of modern and relatively inexpensive raster graphics and microprocessor technology. It also delivers more computing power to its user; Orion I can perform more sophisticated real-time computations than were possible on previous such systems. We demonstrate some of Orion I's capabilities in our film: "Exploring data with Orion I".
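
    The heart of such motion graphics is nothing more than re-projecting the point cloud through a slowly advancing rotation on every frame. A schematic of the idea (not Orion I's actual code):

    ```python
    import numpy as np

    pts = np.random.default_rng(2).standard_normal((500, 3))   # 3D data cloud

    def frame(theta):
        """Screen coordinates for one frame of the rotating scatterplot."""
        c, s = np.cos(theta), np.sin(theta)
        Ry = np.array([[  c, 0.0,   s],
                       [0.0, 1.0, 0.0],
                       [ -s, 0.0,   c]])     # rotation about the vertical axis
        return (pts @ Ry.T)[:, :2]           # orthographic projection to 2D

    for k in range(360):                     # one revolution, one frame per degree
        screen = frame(np.radians(k))        # hand these points to the display
    ```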

  15. Computer Assisted Navigation in Knee Arthroplasty

    PubMed Central

    Bae, Dae Kyung

    2011-01-01

    Computer assisted surgery (CAS) was used to improve the positioning of implants during total knee arthroplasty (TKA). Most studies have reported that computer assisted navigation reduced the outliers of alignment and component malpositioning. However, additional sophisticated studies are necessary to determine if the improvement of alignment will improve long-term clinical results and increase the survival rate of the implant. Knowledge of CAS-TKA technology and understanding the advantages and limitations of navigation are crucial to the successful application of the CAS technique in TKA. In this article, we review the components of navigation, classification of the system, surgical method, potential error, clinical results, advantages, and disadvantages. PMID:22162787

  16. Creation of Anatomically Accurate Computer-Aided Design (CAD) Solid Models from Medical Images

    NASA Technical Reports Server (NTRS)

    Stewart, John E.; Graham, R. Scott; Samareh, Jamshid A.; Oberlander, Eric J.; Broaddus, William C.

    1999-01-01

    Most surgical instrumentation and implants used in the world today are designed with sophisticated Computer-Aided Design (CAD)/Computer-Aided Manufacturing (CAM) software. This software automates the mechanical development of a product from its conceptual design through manufacturing. CAD software also provides a means of manipulating solid models prior to Finite Element Modeling (FEM). Few surgical products are designed in conjunction with accurate CAD models of human anatomy because of the difficulty with which these models are created. We have developed a novel technique that creates anatomically accurate, patient-specific CAD solids from medical images in a matter of minutes.
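
    The paper does not spell out its algorithm, but a common ingredient of image-to-solid pipelines is isosurface extraction from the voxel volume, e.g. via marching cubes; a minimal sketch on a synthetic "scan":

    ```python
    import numpy as np
    from skimage import measure

    z, y, x = np.mgrid[-16:16, -16:16, -16:16]
    volume = (x**2 + y**2 + z**2).astype(float)   # synthetic scan: a sphere

    # Triangulated surface at the chosen intensity threshold
    verts, faces, normals, values = measure.marching_cubes(volume, level=144.0)
    print(verts.shape, faces.shape)   # mesh to be repaired/exported as a CAD solid
    ```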

  17. Simulating motivated cognition

    NASA Technical Reports Server (NTRS)

    Gevarter, William B.

    1991-01-01

    A research effort to develop a sophisticated computer model of human behavior is described. A computer framework of motivated cognition was developed. Motivated cognition focuses on the motivations or affects that provide the context and drive in human cognition and decision making. A conceptual architecture of the human decision-making approach from the perspective of information processing in the human brain is developed in diagrammatic form. A preliminary version of such a diagram is presented. This architecture is then used as a vehicle for successfully constructing a computer program simulating Dweck and Leggett's findings that relate how an individual's implicit theories orient them toward particular goals, with resultant cognitions, affects, and behavior.

  18. Planetary investigation utilizing an imaging spectrometer system based upon charge injection technology

    NASA Technical Reports Server (NTRS)

    Wattson, R. B.; Harvey, P.; Swift, R.

    1975-01-01

    An intrinsic silicon charge injection device (CID) television sensor array has been used in conjunction with a CaMoO4 collinear tunable acousto-optic filter, a 61-inch reflector, a sophisticated computer system, and a digital color TV scan converter/computer to produce near-IR images of Saturn and Jupiter with 10 Å spectral resolution and approximately 3 arcsec spatial resolution. The CID camera has successfully obtained digitized 100 x 100 array images with 5 minutes of exposure time and slow-scanned readout to a computer. Details of the equipment setup, innovations, problems, experience, data, and final equipment performance limits are given.

  19. Passive Motion Paradigm: An Alternative to Optimal Control

    PubMed Central

    Mohan, Vishwanathan; Morasso, Pietro

    2011-01-01

    In recent years, optimal control theory (OCT) has emerged as the leading approach for investigating neural control of movement and motor cognition for two complementary research lines: behavioral neuroscience and humanoid robotics. In both cases, there are general problems that need to be addressed, such as the “degrees of freedom (DoFs) problem,” the common core of production, observation, reasoning, and learning of “actions.” OCT, directly derived from engineering design techniques of control systems, quantifies task goals as “cost functions” and uses the sophisticated formal tools of optimal control to obtain desired behavior (and predictions). We propose an alternative, “softer” approach, the passive motion paradigm (PMP), that we believe is closer to the biomechanics and cybernetics of action. The basic idea is that actions (overt as well as covert) are the consequences of an internal simulation process that “animates” the body schema with the attractor dynamics of force fields induced by the goal and task-specific constraints. This internal simulation offers the brain a way to dynamically link motor redundancy with task-oriented constraints “at runtime,” hence solving the “DoFs problem” without explicit kinematic inversion and cost function computation. We argue that the function of such computational machinery is not only restricted to shaping motor output during action execution but also to provide the self with information on the feasibility, consequence, understanding and meaning of “potential actions.” In this sense, taking into account recent developments in neuroscience (motor imagery, simulation theory of covert actions, mirror neuron system) and in embodied robotics, PMP offers a novel framework for understanding motor cognition that goes beyond the engineering control paradigm provided by OCT. Therefore, the paper is at the same time a review of the PMP rationale, as a computational theory, and a perspective presentation of how to develop it for designing better cognitive architectures. PMID:22207846
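
    A minimal sketch of the PMP relaxation for a planar two-link arm (link lengths, gains, and the goal are invented): the goal induces an attractive force field, and the joints comply with it through the Jacobian transpose, so neither kinematic inversion nor a cost function is ever computed. The terminal-attractor timing machinery of the full PMP is omitted.

    ```python
    import numpy as np

    L1, L2 = 0.3, 0.25                     # link lengths (m)

    def fkin(q):
        """End-effector position of a planar two-link arm."""
        return np.array([L1 * np.cos(q[0]) + L2 * np.cos(q[0] + q[1]),
                         L1 * np.sin(q[0]) + L2 * np.sin(q[0] + q[1])])

    def jacobian(q):
        """Geometric Jacobian of the same arm."""
        s1, c1 = np.sin(q[0]), np.cos(q[0])
        s12, c12 = np.sin(q[0] + q[1]), np.cos(q[0] + q[1])
        return np.array([[-L1 * s1 - L2 * s12, -L2 * s12],
                         [ L1 * c1 + L2 * c12,  L2 * c12]])

    K = 50.0                               # stiffness of the goal's force field
    dt = 0.01                              # integration step (s)
    q = np.array([0.3, 0.5])               # initial joint angles (rad)
    goal = np.array([0.2, 0.4])            # reachable goal in task space (m)

    for _ in range(5000):
        F = K * (goal - fkin(q))           # force field attracting the hand
        q = q + dt * (jacobian(q).T @ F)   # joints comply; no inversion needed

    print("final end-effector error:", np.linalg.norm(goal - fkin(q)))
    ```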

  20. A New Approach for Combining Time-of-Flight and RGB Cameras Based on Depth-Dependent Planar Projective Transformations

    PubMed Central

    Salinas, Carlota; Fernández, Roemi; Montes, Héctor; Armada, Manuel

    2015-01-01

    Image registration for sensor fusion is a valuable technique for acquiring 3D and colour information about a scene. Nevertheless, this process normally relies on feature-matching techniques, which is a drawback when combining sensors that are not able to deliver common features. The combination of ToF and RGB cameras is an instance of that problem. Typically, the fusion of these sensors is based on computing the extrinsic parameters of the coordinate transformation between the two cameras. This leads to a loss of colour information because of the low resolution of the ToF camera, and sophisticated algorithms are required to minimize this issue. This work proposes a method for registering sensors with non-common features that avoids the loss of colour information. The depth information is used as a virtual feature for estimating a depth-dependent homography lookup table (Hlut). The homographies are computed from sets of ground control points in 104 images. Since the distance from the control points to the ToF camera is known, the working distance of each element in the Hlut is estimated. Finally, two series of experimental tests have been carried out in order to validate the capabilities of the proposed method. PMID:26404315
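
    A minimal sketch of the depth-dependent homography lookup idea follows, in Python with OpenCV, assuming that ground-control-point correspondences (ToF pixel, RGB pixel, depth) have already been collected; the function and variable names are illustrative, not taken from the paper.

      # Sketch: build a depth-binned homography lookup table (Hlut) and use
      # measured depth to pick the homography closest in working distance.
      import numpy as np
      import cv2

      def build_hlut(tof_pts, rgb_pts, depths, n_bins=10):
          """Group correspondences by depth and fit one homography per bin."""
          edges = np.linspace(depths.min(), depths.max(), n_bins + 1)
          hlut = []  # list of (bin-center working distance, 3x3 homography)
          for k in range(n_bins):
              sel = (depths >= edges[k]) & (depths <= edges[k + 1])
              if sel.sum() < 4:            # a homography needs >= 4 points
                  continue
              H, _ = cv2.findHomography(tof_pts[sel], rgb_pts[sel], cv2.RANSAC)
              hlut.append((0.5 * (edges[k] + edges[k + 1]), H))
          return hlut

      def lookup(hlut, depth):
          """Select the homography whose working distance is nearest `depth`."""
          return min(hlut, key=lambda entry: abs(entry[0] - depth))[1]

      def map_tof_pixel(hlut, u, v, depth):
          """Project a ToF pixel (u, v) at a given depth into RGB coordinates."""
          p = lookup(hlut, depth) @ np.array([u, v, 1.0])
          return p[:2] / p[2]

    At run time, each ToF pixel's own depth measurement selects the entry of the table before projection, which is what makes the mapping depth-dependent rather than a single fixed homography.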

  1. Virtual Character Animation Based on Affordable Motion Capture and Reconfigurable Tangible Interfaces.

    PubMed

    Lamberti, Fabrizio; Paravati, Gianluca; Gatteschi, Valentina; Cannavo, Alberto; Montuschi, Paolo

    2018-05-01

    Software for computer animation is generally characterized by a steep learning curve, due to the entanglement of both the sophisticated techniques and the interaction methods required to control 3D geometries. This paper proposes a tool designed to support computer animation production processes by leveraging the affordances offered by articulated tangible user interfaces and motion capture retargeting solutions. To this aim, orientations of an instrumented prop are recorded together with the animator's motion in 3D space and used to quickly pose characters in the virtual environment. High-level functionalities of the animation software are made accessible via a speech interface, thus letting the user control the animation pipeline via voice commands while focusing on his or her hand and body motion. The proposed solution exploits both off-the-shelf hardware components (like the Lego Mindstorms EV3 bricks and the Microsoft Kinect, used for building the tangible device and tracking the animator's skeleton) and free open-source software (like the Blender animation tool), thus representing an interesting solution also for beginners approaching the world of digital animation for the first time. Experimental results in different usage scenarios show the benefits offered by the designed interaction strategy with respect to a mouse-and-keyboard interface, for both expert and non-expert users.

  2. Collaborative Approach in the Development of High‐Performance Brain–Computer Interfaces for a Neuroprosthetic Arm: Translation from Animal Models to Human Control

    PubMed Central

    Collinger, Jennifer L.; Kryger, Michael A.; Barbara, Richard; Betler, Timothy; Bowsher, Kristen; Brown, Elke H. P.; Clanton, Samuel T.; Degenhart, Alan D.; Foldes, Stephen T.; Gaunt, Robert A.; Gyulai, Ferenc E.; Harchick, Elizabeth A.; Harrington, Deborah; Helder, John B.; Hemmes, Timothy; Johannes, Matthew S.; Katyal, Kapil D.; Ling, Geoffrey S. F.; McMorland, Angus J. C.; Palko, Karina; Para, Matthew P.; Scheuermann, Janet; Schwartz, Andrew B.; Skidmore, Elizabeth R.; Solzbacher, Florian; Srikameswaran, Anita V.; Swanson, Dennis P.; Swetz, Scott; Tyler‐Kabara, Elizabeth C.; Velliste, Meel; Wang, Wei; Weber, Douglas J.; Wodlinger, Brian

    2013-01-01

    Abstract Our research group recently demonstrated that a person with tetraplegia could use a brain–computer interface (BCI) to control a sophisticated anthropomorphic robotic arm with skill and speed approaching that of an able‐bodied person. This multiyear study exemplifies important principles in translating research from foundational theory and animal experiments into a clinical study. We present a roadmap that may serve as an example for other areas of clinical device research as well as an update on study results. Prior to conducting a multiyear clinical trial, years of animal research preceded BCI testing in an epilepsy monitoring unit, and then in a short‐term (28 days) clinical investigation. Scientists and engineers developed the necessary robotic and surgical hardware, software environment, data analysis techniques, and training paradigms. Coordination among researchers, funding institutes, and regulatory bodies ensured that the study would provide valuable scientific information in a safe environment for the study participant. Finally, clinicians from neurosurgery, anesthesiology, physiatry, psychology, and occupational therapy all worked in a multidisciplinary team along with the other researchers to conduct a multiyear BCI clinical study. This teamwork and coordination can be used as a model for others attempting to translate basic science into real‐world clinical situations. PMID:24528900

  3. Symmetrically private information retrieval based on blind quantum computing

    NASA Astrophysics Data System (ADS)

    Sun, Zhiwei; Yu, Jianping; Wang, Ping; Xu, Lingling

    2015-05-01

    Universal blind quantum computation (UBQC) is a new secure quantum computing protocol that allows a user, Alice, who does not have any sophisticated quantum technology, to delegate her computing to a server, Bob, without leaking any privacy. Using the features of UBQC, we propose a protocol to achieve symmetrically private information retrieval, which allows a quantum-limited Alice to query an item from Bob, who has a fully fledged quantum computer; meanwhile, the privacy of both parties is preserved. The security of our protocol is based on the assumption that malicious Alice has no quantum computer, which avoids the impossibility proof of Lo. The honest Alice is almost classical and only requires minimal quantum resources to carry out the proposed protocol; she therefore does not need any expensive laboratory that can maintain the coherence of complicated quantum experimental setups.

  4. Novel opportunities for computational biology and sociology in drug discovery

    PubMed Central

    Yao, Lixia

    2009-01-01

    Drug discovery today is impossible without sophisticated modeling and computation. In this review we touch on previous advances in computational biology and by tracing the steps involved in pharmaceutical development, we explore a range of novel, high value opportunities for computational innovation in modeling the biological process of disease and the social process of drug discovery. These opportunities include text mining for new drug leads, modeling molecular pathways and predicting the efficacy of drug cocktails, analyzing genetic overlap between diseases and predicting alternative drug use. Computation can also be used to model research teams and innovative regions and to estimate the value of academy-industry ties for scientific and human benefit. Attention to these opportunities could promise punctuated advance, and will complement the well-established computational work on which drug discovery currently relies. PMID:19674801

  5. Design and performance analysis of solid-propellant rocket motors using a simplified computer program

    NASA Technical Reports Server (NTRS)

    Sforzini, R. H.

    1972-01-01

    An analysis and a computer program are presented which represent a compromise between the more sophisticated programs using precise burning geometric relations and the textbook type of solutions. The program requires approximately 900 computer cards, including a set of 20 input data cards required for a typical problem. The computer operating time for a single configuration is approximately 1 minute and 30 seconds on the IBM 360 computer. About 1 minute and 15 seconds of the time is compilation time, so that additional configurations input at the same time require approximately 15 seconds each. The program uses approximately 11,000 words on the IBM 360. The program is written in FORTRAN IV and is readily adaptable for use on a number of different computers: IBM 7044, IBM 7094, and Univac 1108.

  6. 3D Texture Features Mining for MRI Brain Tumor Identification

    NASA Astrophysics Data System (ADS)

    Rahim, Mohd Shafry Mohd; Saba, Tanzila; Nayer, Fatima; Syed, Afraz Zahra

    2014-03-01

    Medical image segmentation is a process that extracts regions of interest and divides an image into its individual meaningful, homogeneous components; these components have a strong relationship with the objects of interest in the image. Medical image segmentation is a mandatory initial step for computer-aided diagnosis and therapy, and it is a challenging task because of the complex nature of medical images. Indeed, successful medical image analysis depends heavily on segmentation accuracy. Texture is one of the major features used to identify regions of interest in an image or to classify an object, and 2D texture features yield poor classification results. Hence, this paper presents 3D feature extraction using texture analysis, with an SVM as the segmentation technique in the testing methodology.
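
    The abstract does not spell out which 3D texture features are used, so the following Python sketch substitutes simple first-order statistics over voxel neighborhoods and an off-the-shelf SVM (scikit-learn) purely to illustrate the feature-extraction-plus-SVM pipeline; it is not the authors' method, and the data are synthetic.

      # Illustrative sketch only: first-order 3D texture statistics per voxel
      # neighborhood, classified with an SVM (tumor vs. background).
      import numpy as np
      from sklearn.svm import SVC

      def voxel_features(volume, z, y, x, w=2):
          """First-order texture statistics of a (2w+1)^3 neighborhood."""
          patch = volume[z-w:z+w+1, y-w:y+w+1, x-w:x+w+1].astype(float)
          return [patch.mean(), patch.std(), patch.min(), patch.max(),
                  np.median(patch)]

      # Synthetic stand-in for an MRI volume: a brighter, noisier "tumor" block.
      rng = np.random.default_rng(0)
      vol = rng.normal(0.3, 0.05, (40, 40, 40))
      vol[15:25, 15:25, 15:25] += rng.normal(0.4, 0.15, (10, 10, 10))

      X, y = [], []
      for _ in range(500):
          z, yy, x = rng.integers(3, 37, 3)      # keep neighborhoods in bounds
          X.append(voxel_features(vol, z, yy, x))
          y.append(int(15 <= z < 25 and 15 <= yy < 25 and 15 <= x < 25))

      clf = SVC(kernel="rbf").fit(X[:400], y[:400])
      print("held-out accuracy:", clf.score(X[400:], y[400:]))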

  7. Recent advances in modeling languages for pathway maps and computable biological networks.

    PubMed

    Slater, Ted

    2014-02-01

    As our theories of systems biology grow more sophisticated, the models we use to represent them become larger and more complex. The languages used to represent these models must have the expressivity and flexibility required to support high-resolution annotation and to provide for simulation and analysis sophisticated enough to allow researchers to master their data in the proper context. These languages also need to facilitate model sharing and collaboration, which is currently best done by using uniform data structures (such as graphs) and language standards. In this brief review, we discuss three of the most recent systems biology modeling languages to appear: BEL, PySB and BCML, and examine how they meet these needs. Copyright © 2014 Elsevier Ltd. All rights reserved.

  8. The interactive digital video interface

    NASA Technical Reports Server (NTRS)

    Doyle, Michael D.

    1989-01-01

    A frequent complaint in the computer-oriented trade journals is that current hardware technology is progressing so quickly that software developers cannot keep up. An example of this phenomenon can be seen in the field of microcomputer graphics. To exploit the advantages of new mechanisms of information storage and retrieval, new approaches must be taken toward incorporating existing programs as well as developing entirely new applications. A particular area of need is the correlation of discrete image elements to textual information. The interactive digital video (IDV) interface embodies a new concept in software design which addresses these needs. The IDV interface is a patented, device- and language-independent process for identifying image features on a digital video display, which allows a number of different processes to be keyed to that identification. Its capabilities include the correlation of discrete image elements to relevant text information and the correlation of these image features to other images as well as to program control mechanisms. Sophisticated interrelationships can be set up between images, text, and program control mechanisms.

  9. A Human–Robot Interaction Perspective on Assistive and Rehabilitation Robotics

    PubMed Central

    Beckerle, Philipp; Salvietti, Gionata; Unal, Ramazan; Prattichizzo, Domenico; Rossi, Simone; Castellini, Claudio; Hirche, Sandra; Endo, Satoshi; Amor, Heni Ben; Ciocarlie, Matei; Mastrogiovanni, Fulvio; Argall, Brenna D.; Bianchi, Matteo

    2017-01-01

    Assistive and rehabilitation devices are a promising and challenging field of recent robotics research. Motivated by societal needs such as aging populations, such devices can support motor functionality and subject training. The design, control, sensing, and assessment of the devices become more sophisticated due to a human in the loop. This paper gives a human–robot interaction perspective on current issues and opportunities in the field. On the topic of control and machine learning, approaches that support but do not distract subjects are reviewed. Options to provide sensory user feedback that are currently missing from robotic devices are outlined. Parallels between device acceptance and affective computing are made. Furthermore, requirements for functional assessment protocols that relate to real-world tasks are discussed. In all topic areas, the design of human-oriented frameworks and methods is dominated by challenges related to the close interaction between the human and robotic device. This paper discusses the aforementioned aspects in order to open up new perspectives for future robotic solutions. PMID:28588473

  10. A Human-Robot Interaction Perspective on Assistive and Rehabilitation Robotics.

    PubMed

    Beckerle, Philipp; Salvietti, Gionata; Unal, Ramazan; Prattichizzo, Domenico; Rossi, Simone; Castellini, Claudio; Hirche, Sandra; Endo, Satoshi; Amor, Heni Ben; Ciocarlie, Matei; Mastrogiovanni, Fulvio; Argall, Brenna D; Bianchi, Matteo

    2017-01-01

    Assistive and rehabilitation devices are a promising and challenging field of recent robotics research. Motivated by societal needs such as aging populations, such devices can support motor functionality and subject training. The design, control, sensing, and assessment of the devices become more sophisticated due to a human in the loop. This paper gives a human-robot interaction perspective on current issues and opportunities in the field. On the topic of control and machine learning, approaches that support but do not distract subjects are reviewed. Options to provide sensory user feedback that are currently missing from robotic devices are outlined. Parallels between device acceptance and affective computing are made. Furthermore, requirements for functional assessment protocols that relate to real-world tasks are discussed. In all topic areas, the design of human-oriented frameworks and methods is dominated by challenges related to the close interaction between the human and robotic device. This paper discusses the aforementioned aspects in order to open up new perspectives for future robotic solutions.

  11. The centrality of RNA for engineering gene expression

    PubMed Central

    Chappell, James; Takahashi, Melissa K; Meyer, Sarai; Loughrey, David; Watters, Kyle E; Lucks, Julius

    2013-01-01

    Synthetic biology holds promise as both a framework for rationally engineering biological systems and a way to revolutionize how we fundamentally understand them. Essential to realizing this promise is the development of strategies and tools to reliably and predictably control and characterize sophisticated patterns of gene expression. Here we review the role that RNA can play towards this goal and make a case for why this versatile, designable, and increasingly characterizable molecule is one of the most powerful substrates for engineering gene expression at our disposal. We discuss current natural and synthetic RNA regulators of gene expression acting at key points of control – transcription, mRNA degradation, and translation. We also consider RNA structural probing and computational RNA structure prediction tools as a way to study RNA structure and ultimately function. Finally, we discuss how next-generation sequencing methods are being applied to the study of RNA and to the characterization of RNA's many properties throughout the cell. PMID:24124015

  12. DOE Office of Scientific and Technical Information (OSTI.GOV)

    None, None

    The Second SIAM Conference on Computational Science and Engineering was held in San Diego from February 10-12, 2003. Total conference attendance was 553. This is a 23% increase in attendance over the first conference. The focus of this conference was to draw attention to the tremendous range of major computational efforts on large problems in science and engineering, to promote the interdisciplinary culture required to meet these large-scale challenges, and to encourage the training of the next generation of computational scientists. Computational Science & Engineering (CS&E) is now widely accepted, along with theory and experiment, as a crucial third mode of scientific investigation and engineering design. Aerospace, automotive, biological, chemical, semiconductor, and other industrial sectors now rely on simulation for technical decision support. For federal agencies also, CS&E has become an essential support for decisions on resources, transportation, and defense. CS&E is, by nature, interdisciplinary. It grows out of physical applications and it depends on computer architecture, but at its heart are powerful numerical algorithms and sophisticated computer science techniques. From an applied mathematics perspective, much of CS&E has involved analysis, but the future surely includes optimization and design, especially in the presence of uncertainty. Another mathematical frontier is the assimilation of very large data sets through such techniques as adaptive multi-resolution, automated feature search, and low-dimensional parameterization. The themes of the 2003 conference included, but were not limited to: Advanced Discretization Methods; Computational Biology and Bioinformatics; Computational Chemistry and Chemical Engineering; Computational Earth and Atmospheric Sciences; Computational Electromagnetics; Computational Fluid Dynamics; Computational Medicine and Bioengineering; Computational Physics and Astrophysics; Computational Solid Mechanics and Materials; CS&E Education; Meshing and Adaptivity; Multiscale and Multiphysics Problems; Numerical Algorithms for CS&E; Discrete and Combinatorial Algorithms for CS&E; Inverse Problems; Optimal Design, Optimal Control, and Inverse Problems; Parallel and Distributed Computing; Problem-Solving Environments; Software and Middleware Systems; Uncertainty Estimation and Sensitivity Analysis; and Visualization and Computer Graphics.

  13. Data mining: sophisticated forms of managed care modeling through artificial intelligence.

    PubMed

    Borok, L S

    1997-01-01

    Data mining is a recent development in computer science that combines artificial intelligence algorithms and relational databases to discover patterns automatically, without the use of traditional statistical methods. Work with data mining tools in health care is in a developmental stage that holds great promise, given the combination of demographic and diagnostic information.

  14. Automation Problems of 1968; Papers Presented at the Meeting...October 4-5, 1968.

    ERIC Educational Resources Information Center

    Andrews, Theodora, Ed.

    Librarians and their concerned colleagues met to give, hear and discuss papers on library automation, primarily by computers. Noted at this second meeting on library automation were: (1) considerably more sophistication and casualness about the techniques involved, (2) considerably more assurance of what and where things can be applied and (3)…

  15. The Relationship of Lexical Richness to the Quality of ESL Learners' Oral Narratives

    ERIC Educational Resources Information Center

    Lu, Xiaofei

    2012-01-01

    This study was an examination of the relationship of lexical richness to the quality of English as a second language (ESL) learners' oral narratives. A computational system was designed to automate the measurement of 3 dimensions of lexical richness, that is, lexical density, sophistication, and variation, using 25 different metrics proposed in…

  16. Requirements for SPIRES II. An External Specification for the Stanford Public Information Retrieval System.

    ERIC Educational Resources Information Center

    Parker, Edwin B.

    SPIRES (Stanford Public Information Retrieval System) is a computerized information storage and retrieval system intended for use by students and faculty members who have little knowledge of computers but who need rapid and sophisticated retrieval and analysis. The functions and capabilities of the system from the user's point of view are…

  17. Science of Security Lablet - Scalability and Usability

    DTIC Science & Technology

    2014-12-16

    mobile computing [19]. However, the high-level infrastructure design and our own implementation (both described throughout this paper) can easily...critical and infrastructural systems demands high levels of sophistication in the technical aspects of cybersecurity, software and hardware design...Forget, S. Komanduri, Alessandro Acquisti, Nicolas Christin, Lorrie Cranor, Rahul Telang. "Security Behavior Observatory: Infrastructure for Long-term

  18. An introduction to UGRS: the ultimate grading and remanufacturing system

    Treesearch

    John Moody; Charles J. Gatchell; Elizabeth S. Walker; Powsiri Klinkhachorn

    1998-01-01

    The Ultimate Grading and Remanufacturing System (UGRS) is an advanced computer program for grading and remanufacturing lumber. It is an interactive program that will both grade lumber according to NHLA rules and remanufacture it for maximum value. UGRS is written to run under Microsoft Windows 3.0 or later updates and provides a sophisticated graphical user interface....

  19. A hybrid analog-digital phase-locked loop for frequency mode non-contact scanning probe microscopy.

    PubMed

    Mehta, M M; Chandrasekhar, V

    2014-01-01

    Non-contact scanning probe microscopy (SPM) has developed into a powerful technique to image many different properties of samples. The conventional method involves monitoring the amplitude, phase, or frequency of a cantilever oscillating at or near its resonant frequency as it is scanned across the surface of a sample. For high Q factor cantilevers, monitoring the resonant frequency is the preferred method in order to obtain reasonable scan times. This can be done by using a phase-locked-loop (PLL). PLLs can be obtained as commercial integrated circuits, but these do not have the frequency resolution required for SPM. To increase the resolution, all-digital PLLs requiring sophisticated digital signal processors or field programmable gate arrays have also been implemented. We describe here a hybrid analog/digital PLL where most of the components are implemented using discrete analog integrated circuits, but the frequency resolution is provided by a direct digital synthesis chip controlled by a simple peripheral interface controller (PIC) microcontroller. The PLL has excellent frequency resolution and noise, and can be controlled and read by a computer via a universal serial bus connection.
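
    The frequency-tracking principle can be illustrated with a small software simulation: a mixer phase detector, a one-pole low-pass loop filter, and a PI controller steering a DDS-like numerically controlled oscillator. This is a hedged sketch of the control loop only; the gains, rates, and frequencies are invented for the demonstration, and the authors' analog/digital hardware is not reproduced.

      # Software simulation of a PLL tracking an off-frequency input tone.
      import numpy as np

      fs = 1.0e6                        # sample rate, Hz
      f_in = 32768.0                    # cantilever-like input frequency to track
      f_center = 32500.0                # NCO free-running frequency (starts off-lock)
      kp, ki, alpha = 2000.0, 2.0e5, 0.02
      phase_nco, lf, integ = 0.0, 0.0, 0.0
      f_nco = f_center

      for n in range(200000):
          x = np.cos(2 * np.pi * f_in * n / fs)   # incoming signal sample
          pd = -x * np.sin(phase_nco)             # mixer: ~0.5*sin(phase error) + 2f term
          lf += alpha * (pd - lf)                 # one-pole low-pass loop filter
          integ += ki * lf / fs                   # integral term removes static error
          f_nco = f_center + kp * lf + integ      # steer the DDS-like tuning word
          phase_nco += 2 * np.pi * f_nco / fs     # advance NCO phase

      print(f"NCO settled near {f_nco:.1f} Hz (target {f_in:.1f} Hz)")

    The integral path is what gives a PI-controlled PLL zero steady-state frequency error; in the hardware described above, the corresponding role is played by the DDS tuning word that the microcontroller updates.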

  20. A hybrid analog-digital phase-locked loop for frequency mode non-contact scanning probe microscopy

    NASA Astrophysics Data System (ADS)

    Mehta, M. M.; Chandrasekhar, V.

    2014-01-01

    Non-contact scanning probe microscopy (SPM) has developed into a powerful technique to image many different properties of samples. The conventional method involves monitoring the amplitude, phase, or frequency of a cantilever oscillating at or near its resonant frequency as it is scanned across the surface of a sample. For high Q factor cantilevers, monitoring the resonant frequency is the preferred method in order to obtain reasonable scan times. This can be done by using a phase-locked-loop (PLL). PLLs can be obtained as commercial integrated circuits, but these do not have the frequency resolution required for SPM. To increase the resolution, all-digital PLLs requiring sophisticated digital signal processors or field programmable gate arrays have also been implemented. We describe here a hybrid analog/digital PLL where most of the components are implemented using discrete analog integrated circuits, but the frequency resolution is provided by a direct digital synthesis chip controlled by a simple peripheral interface controller (PIC) microcontroller. The PLL has excellent frequency resolution and noise, and can be controlled and read by a computer via a universal serial bus connection.

  1. Grids: The Top Ten Questions

    DOE PAGES

    Schopf, Jennifer M.; Nitzberg, Bill

    2002-01-01

    The design and implementation of a national computing system and data grid has become a reachable goal from both the computer science and computational science point of view. A distributed infrastructure capable of sophisticated computational functions can bring many benefits to scientific work, but poses many challenges, both technical and socio-political. Technical challenges include having basic software tools, higher-level services, functioning and pervasive security, and standards, while socio-political issues include building a user community, adding incentives for sites to be part of a user-centric environment, and educating funding sources about the needs of this community. This paper details the areas relating to Grid research that we feel still need to be addressed to fully leverage the advantages of the Grid.

  2. Software Surface Modeling and Grid Generation Steering Committee

    NASA Technical Reports Server (NTRS)

    Smith, Robert E. (Editor)

    1992-01-01

    It is a NASA objective to promote improvements in the capability and efficiency of computational fluid dynamics. Grid generation, the creation of a discrete representation of the solution domain, is an essential part of computational fluid dynamics. However, grid generation about complex boundaries requires sophisticated surface-model descriptions of the boundaries. The surface modeling and the associated computation of surface grids consume an extremely large percentage of the total time required for volume grid generation. Efficient and user-friendly software systems for surface modeling and grid generation are critical for computational fluid dynamics to reach its potential. The papers presented here represent the state of the art in software systems for surface modeling and grid generation. Several papers describe improved techniques for grid generation.

  3. Computational protein design with backbone plasticity

    PubMed Central

    MacDonald, James T.; Freemont, Paul S.

    2016-01-01

    The computational algorithms used in the design of artificial proteins have become increasingly sophisticated in recent years, producing a series of remarkable successes. The most dramatic of these is the de novo design of artificial enzymes. The majority of these designs have reused naturally occurring protein structures as ‘scaffolds’ onto which novel functionality can be grafted without having to redesign the backbone structure. The incorporation of backbone flexibility into protein design is a much more computationally challenging problem due to the greatly increased search space, but promises to remove the limitations of reusing natural protein scaffolds. In this review, we outline the principles of computational protein design methods and discuss recent efforts to consider backbone plasticity in the design process. PMID:27911735

  4. Simple geometric algorithms to aid in clearance management for robotic mechanisms

    NASA Technical Reports Server (NTRS)

    Copeland, E. L.; Ray, L. D.; Peticolas, J. D.

    1981-01-01

    Global geometric shapes (lines, planes, circles, spheres, and cylinders) and the associated computational algorithms, which provide relatively inexpensive estimates of minimum spatial clearance for safe operations, were selected. The Space Shuttle, the remote manipulator system, and the Power Extension Package are used as examples. Robotic mechanisms operate in quarters limited by external structures, so the problem of clearance is often of considerable interest. Safe clearance management is simple and suited to real-time calculation, whereas contact prediction requires more precision, sophistication, and computational overhead.
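
    As an illustration of the kind of inexpensive global-geometry check described, the sketch below computes the minimum distance between two 3D line segments (say, a manipulator link and a structural member) using the standard clamped closest-point construction; it is a generic textbook algorithm in Python, not code from the study.

      # Minimum distance between 3D segments [p1,q1] and [p2,q2].
      import numpy as np

      def segment_clearance(p1, q1, p2, q2, eps=1e-12):
          d1, d2, r = q1 - p1, q2 - p2, p1 - p2
          a, e, f = d1 @ d1, d2 @ d2, d2 @ r
          if a <= eps and e <= eps:                 # both segments are points
              return np.linalg.norm(r)
          if a <= eps:                              # first segment is a point
              s, t = 0.0, np.clip(f / e, 0.0, 1.0)
          else:
              c = d1 @ r
              if e <= eps:                          # second segment is a point
                  s, t = np.clip(-c / a, 0.0, 1.0), 0.0
              else:
                  b = d1 @ d2
                  den = a * e - b * b               # zero when segments parallel
                  s = np.clip((b * f - c * e) / den, 0.0, 1.0) if den > eps else 0.0
                  t = (b * s + f) / e
                  if t < 0.0:                       # clamp t, then re-clamp s
                      s, t = np.clip(-c / a, 0.0, 1.0), 0.0
                  elif t > 1.0:
                      s, t = np.clip((b - c) / a, 0.0, 1.0), 1.0
          return np.linalg.norm((p1 + s * d1) - (p2 + t * d2))

      # Example: clearance between two skew links (expected sqrt(2))
      print(segment_clearance(np.array([0., 0., 0.]), np.array([1., 0., 0.]),
                              np.array([0., 1., 1.]), np.array([1., 1., 1.])))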

  5. A survey of computational aerodynamics in the United States

    NASA Technical Reports Server (NTRS)

    Gessow, A.; Morris, D. J.

    1977-01-01

    Programs in theoretical and computational aerodynamics in the United States are described. Those aspects of programs that relate to aeronautics are detailed. The role of analysis at various levels of sophistication is discussed as well as the inverse solution techniques that are of primary importance in design methodology. The research is divided into the broad categories of application for boundary layer flow, Navier-Stokes turbulence modeling, internal flows, two-dimensional configurations, subsonic and supersonic aircraft, transonic aircraft, and the space shuttle. A survey of representative work in each area is presented.

  6. WELDSMART: A vision-based expert system for quality control

    NASA Technical Reports Server (NTRS)

    Andersen, Kristinn; Barnett, Robert Joel; Springfield, James F.; Cook, George E.

    1992-01-01

    This work was aimed at exploring means for utilizing computer technology in quality inspection and evaluation. Inspection of metallic welds was selected as the main application for this development, and primary emphasis was placed on visual inspection, as opposed to other inspection methods, such as radiographic techniques. Emphasis was placed on methodologies with the potential for use in real-time quality control systems. Because quality evaluation is somewhat subjective, despite various efforts to classify discontinuities and standardize inspection methods, the task of using a computer for both inspection and evaluation was not trivial. The work started with a review of the various inspection techniques that are used for quality control in welding. Among other observations from this review was the finding that most weld defects result in abnormalities that may be seen by visual inspection, which supports the approach of emphasizing visual inspection in this work. Quality control consists of two phases: (1) identification of weld discontinuities (some of which may be severe enough to be classified as defects), and (2) assessment or evaluation of the weld based on the observed discontinuities. Usually the latter phase results in a pass/fail judgement for the inspected piece. It is the conclusion of this work that the first of the above tasks, identification of discontinuities, is the most challenging one. It calls for sophisticated image processing and image analysis techniques, and frequently ad hoc methods have to be developed to identify specific features in the weld image. The difficulty of this task is generally not due to limited computing power; in most cases it was found that a modest personal computer or workstation could carry out most computations in a reasonably short time period. Rather, the algorithms and methods necessary for identifying weld discontinuities were in some cases limited. The fact that specific techniques were finally developed and successfully demonstrated to work illustrates that the general approach taken here appears to be promising for commercial development of computerized quality inspection systems. Inspection based on these techniques may be used to supplement or substitute for more elaborate inspection methods, such as x-ray inspections.
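
    As a toy illustration of the first phase (identification of discontinuities), the sketch below flags candidate weld discontinuities with a plain gradient-magnitude screen; WELDSMART's actual ad hoc feature detectors are more elaborate and are not reproduced here, and the image is synthetic.

      # Flag pixels whose gradient magnitude is unusually large, as candidate
      # discontinuities for a later evaluation phase.
      import numpy as np
      from scipy.ndimage import sobel

      def discontinuity_mask(weld_image, k=3.0):
          """Flag pixels with gradient magnitude k sigma above the mean."""
          img = weld_image.astype(float)
          g = np.hypot(sobel(img, axis=0), sobel(img, axis=1))
          return g > g.mean() + k * g.std()

      # Synthetic weld image: smooth bead with a small dark "porosity" spot.
      img = np.full((64, 64), 200.0)
      img[30:34, 40:44] = 60.0
      print("candidate discontinuity pixels:", int(discontinuity_mask(img).sum()))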

  7. Science Language Accommodation in Elementary School Read-Alouds

    NASA Astrophysics Data System (ADS)

    Glass, Rory; Oliveira, Alandeom W.

    2014-03-01

    This study examines the pedagogical functions of accommodation (i.e. provision of simplified science speech) in science read-aloud sessions facilitated by five elementary teachers. We conceive of read-alouds as communicative events wherein teachers, faced with the task of orally delivering a science text of relatively high linguistic complexity, open up an alternate channel of communication, namely oral discussion. By doing so, teachers grant students access to a simplified linguistic input, a strategy designed to promote student comprehension of the textual contents of children's science books. It was found that nearly half (46%) of the read-aloud time was allotted to discussions with an increased percentage of less sophisticated words and reduced use of more sophisticated vocabulary than found in the books through communicative strategies such as simplified rewording, simplified definition, and simplified questioning. Further, aloud reading of more linguistically complex books required longer periods of discussion and an increased degree of teacher oral input and accommodation. We also found evidence of reversed simplification (i.e. sophistication), leading to student uptake of scientific language. The main significance of this study is that it reveals that teacher talk serves two often competing pedagogical functions (accessible communication of scientific information to students and promotion of student acquisition of the specialized language of science). It also underscores the importance of giving analytical consideration to the simplification-sophistication dimension of science classroom discourse as well as the potential of computer-based analysis of classroom discourse to inform science teaching.

  8. Development and Application of a Numerical Framework for Improving Building Foundation Heat Transfer Calculations

    NASA Astrophysics Data System (ADS)

    Kruis, Nathanael J. F.

    Heat transfer from building foundations varies significantly in all three spatial dimensions and has important dynamic effects at all timescales, from one hour to several years. With the additional consideration of moisture transport, ground freezing, evapotranspiration, and other physical phenomena, the estimation of foundation heat transfer becomes increasingly sophisticated and computationally intensive, to the point where accuracy must be compromised for reasonable computation time. The tools currently available to calculate foundation heat transfer are often either too limited in their capabilities to draw meaningful conclusions or too sophisticated to use in common practice. This work presents Kiva, a new foundation heat transfer computational framework. Kiva provides a flexible environment for testing different numerical schemes, initialization methods, spatial and temporal discretizations, and geometric approximations. Comparisons within this framework provide insight into the balance of computation speed and accuracy relative to highly detailed reference solutions. The accuracy and computational performance of six finite difference numerical schemes are verified against established IEA BESTEST test cases for slab-on-grade heat conduction. Of the schemes tested, the Alternating Direction Implicit (ADI) scheme demonstrates the best balance between accuracy, performance, and numerical stability. Kiva features four approaches to initializing soil temperatures for an annual simulation. A new accelerated initialization approach is shown to significantly reduce the required years of presimulation. Methods of approximating three-dimensional heat transfer within a representative two-dimensional context further improve computational performance. A new approximation called the boundary layer adjustment method is shown to improve accuracy over other established methods with a negligible increase in computation time. This method accounts for the reduced heat transfer from concave foundation shapes, which has not been adequately addressed to date. Within the Kiva framework, three-dimensional heat transfer that can require several days to simulate is approximated in two dimensions in a matter of seconds while maintaining a mean absolute deviation within 3%.
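
    For readers unfamiliar with the scheme the study found best, the following Python sketch implements a generic Peaceman-Rachford ADI step for 2D conduction, u_t = alpha*(u_xx + u_yy), on a uniform grid with fixed zero-temperature boundaries; it is textbook ADI, not Kiva's implementation, and the grid and time step are illustrative.

      # One ADI time step: implicit in x / explicit in y, then the reverse,
      # each half-step requiring only O(n) tridiagonal solves.
      import numpy as np

      def thomas(sub, diag, sup, rhs):
          """Solve a tridiagonal system in O(n) (Thomas algorithm)."""
          n = len(rhs)
          cp, dp = np.empty(n), np.empty(n)
          cp[0], dp[0] = sup[0] / diag[0], rhs[0] / diag[0]
          for i in range(1, n):
              m = diag[i] - sub[i] * cp[i - 1]
              cp[i] = sup[i] / m
              dp[i] = (rhs[i] - sub[i] * dp[i - 1]) / m
          x = np.empty(n)
          x[-1] = dp[-1]
          for i in range(n - 2, -1, -1):
              x[i] = dp[i] - cp[i] * x[i + 1]
          return x

      def adi_step(u, r):
          """One ADI step; r = alpha*dt/h^2; u has zero Dirichlet borders."""
          n = u.shape[0] - 2
          sub = np.full(n, -r / 2); diag = np.full(n, 1 + r); sup = np.full(n, -r / 2)
          half = np.zeros_like(u)
          for j in range(1, n + 1):       # sweep 1: implicit in x, explicit in y
              rhs = u[j, 1:-1] + (r / 2) * (u[j + 1, 1:-1] - 2 * u[j, 1:-1] + u[j - 1, 1:-1])
              half[j, 1:-1] = thomas(sub, diag, sup, rhs)
          new = np.zeros_like(u)
          for i in range(1, n + 1):       # sweep 2: implicit in y, explicit in x
              rhs = half[1:-1, i] + (r / 2) * (half[1:-1, i + 1] - 2 * half[1:-1, i] + half[1:-1, i - 1])
              new[1:-1, i] = thomas(sub, diag, sup, rhs)
          return new

      u = np.zeros((34, 34))
      u[12:22, 12:22] = 1.0               # initial warm patch
      for _ in range(100):
          u = adi_step(u, r=0.5)
      print("peak temperature after 100 steps:", u.max())

    The appeal of ADI, and presumably why it balanced accuracy and performance here, is that it is unconditionally stable like a fully implicit method yet costs only a sequence of cheap one-dimensional tridiagonal solves per step.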

  9. Effective Materials Property Information Management for the 21st Century

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ren, Weiju; Cebon, David; Barabash, Oleg M

    2011-01-01

    This paper discusses key principles for the development of materials property information management software systems. There are growing needs for automated materials information management in various organizations. In part these are fuelled by the demands for higher efficiency in material testing, product design and engineering analysis. But equally important, organizations are being driven by the needs for consistency, quality and traceability of data, as well as control of access to proprietary or sensitive information. Further, the use of increasingly sophisticated nonlinear, anisotropic and multi-scale engineering analyses requires both processing of large volumes of test data for development of constitutive models and complex materials data input for Computer-Aided Engineering (CAE) software. And finally, the globalization of economy often generates great needs for sharing a single gold source of materials information between members of global engineering teams in extended supply-chains. Fortunately material property management systems have kept pace with the growing user demands and evolved to versatile data management systems that can be customized to specific user needs. The more sophisticated of these provide facilities for: (i) data management functions such as access, version, and quality controls; (ii) a wide range of data import, export and analysis capabilities; (iii) data pedigree traceability mechanisms; (iv) data searching, reporting and viewing tools; and (v) access to the information via a wide range of interfaces. In this paper the important requirements for advanced material data management systems, future challenges and opportunities such as automated error checking, data quality characterization, identification of gaps in datasets, as well as functionalities and business models to fuel database growth and maintenance are discussed.

  10. Side-effects and technical problems in cytapheresis with cell separators. Results of a retrospective multicenter study.

    PubMed

    Kretschmer, V

    1987-09-01

    On the basis of a survey, the acute side-effects and technical problems in a total of 77,525 cytaphereses (IFC 36,530, CFC 40,995) in donors at 39 hemapheresis centers were retrospectively analysed statistically. In general, relevant donor side-effects (0.78%-1.05%) were rarer than the primary donor-independent disturbances (1.65%-2.63%). The donor side-effects predominated only with the use of the cell separators Haemonetics M30/Belco (1.06% vs. 0.57%). These were mainly circulatory reactions (0.83%), which were generally much more frequent with IFC (0.54%) than with CFC (IBM/Cobe 0.11%, CS-3000 0.19%). Potentially fatal complications were not reported. The frequency of side-effects, disturbances and discontinuations correlated inversely with the separation rate of the individual centers per method. Centers in which two or three methods were applied simultaneously reported a higher frequency of side-effects and disturbances. Hemolysis was only observed with IFC (0.09%), but not with the use of the Haemonetics V50. The greater susceptibility to disturbances of technical/methodological/operational origin essentially results from the more elaborate, but not yet perfected, technology, including computer control and monitoring, as well as defects in the production of the much more complicated disposable sets. Thus the highest rate of discontinuations was calculated for the system which is so far the most sophisticated technically (CS-3000, 1.85%). Although the primary donor-independent problems sometimes correlate directly with the manifestation of donor side-effects, the greater technological sophistication of automatically controlled and monitored systems cannot be dispensed with, since only in this way can potentially fatal risks for the donors be largely ruled out. (ABSTRACT TRUNCATED AT 250 WORDS)

  11. Learning cardiopulmonary resuscitation skills: does the type of mannequin make a difference?

    PubMed

    Noordergraaf, G J; Van Gelder, J M; Van Kesteren, R G; Diets, R F; Savelkoul, T J

    1997-12-01

    Resuscitation (CPR) courses stress acquisition of psychomotor skills. The number of mannequins may limit the 'hands-on' time available for each trainee to practise CPR and impede acquisition of skill. This may occur because expensive, sophisticated mannequins are favoured over individual, simple mannequins. In a blind, prospective, controlled study we compared one-rescuer CPR skills of 165 trainees in two cohorts using their own individual light-weight torso mannequins (Actar 911 and Laerdal Little Anne) and a control cohort with four to five trainees sharing a sophisticated mannequin (Laerdal Recording Resusci Anne). No major significant differences (p = 0.18) were found when using the 'Berden scoring system'. Both the Actar 911 and the Little Anne were compatible with the Recording Resusci Anne. Trainees preferred the individual mannequins. We conclude that the results indicate that the use of individual mannequins in conjunction with a sophisticated mannequin neither results in trainees learning incorrect skills nor in significant improvement. Further analysis of the actual training in lay person CPR training courses and evaluation of course didactics to optimize training time appear indicated.

  12. A High-Speed, Real-Time Visualization and State Estimation Platform for Monitoring and Control of Electric Distribution Systems: Implementation and Field Results

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lundstrom, Blake; Gotseff, Peter; Giraldez, Julieta

    Continued deployment of renewable and distributed energy resources is fundamentally changing the way that electric distribution systems are controlled and operated; more sophisticated active system control and greater situational awareness are needed. Real-time measurements and distribution system state estimation (DSSE) techniques enable more sophisticated system control and, when combined with visualization applications, greater situational awareness. This paper presents a novel demonstration of a high-speed, real-time DSSE platform and related control and visualization functionalities, implemented using existing open-source software and distribution system monitoring hardware. Live scrolling strip charts of meter data and intuitive annotated map visualizations of the entire state (obtained via DSSE) of a real-world distribution circuit are shown. The DSSE implementation is validated to demonstrate provision of accurate voltage data. This platform allows for enhanced control and situational awareness using only a minimum quantity of distribution system measurement units and modest data and software infrastructure.
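
    State estimation of this kind is commonly posed as a weighted-least-squares problem; the Python sketch below shows that formulation for a deliberately simplified linear measurement model (the platform's actual measurement functions are nonlinear and are not reproduced here), with all numbers synthetic.

      # Generic WLS state estimation: x_hat = argmin (z - Hx)' W (z - Hx).
      import numpy as np

      def wls_estimate(H, z, sigma):
          """Weighted least squares with W = diag(1/sigma^2)."""
          W = np.diag(1.0 / sigma**2)
          G = H.T @ W @ H                    # gain matrix
          return np.linalg.solve(G, H.T @ W @ z)

      rng = np.random.default_rng(1)
      x_true = np.array([1.02, 0.98, 1.00])  # e.g. three node voltages (p.u.)
      H = rng.normal(size=(8, 3))            # 8 measurements of 3 states
      sigma = np.full(8, 0.01)               # meter standard deviations
      z = H @ x_true + rng.normal(0, sigma)  # noisy measurements
      print("estimate:", wls_estimate(H, z, sigma))

    Redundancy (more measurements than states) is what lets the estimator smooth meter noise, which is why a platform can deliver accurate voltages from a minimum quantity of measurement units.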

  13. Coupled rotor/airframe vibration analysis

    NASA Technical Reports Server (NTRS)

    Sopher, R.; Studwell, R. E.; Cassarino, S.; Kottapalli, S. B. R.

    1982-01-01

    A coupled rotor/airframe vibration analysis, developed as a design tool for predicting helicopter vibrations and as a research tool to quantify the effects of structural properties, aerodynamic interactions, and vibration reduction devices on vehicle vibration levels, is described. The analysis consists of a base program utilizing an impedance-matching technique to represent the coupled rotor/airframe dynamics of the system, supported by inputs from several external programs supplying sophisticated rotor and airframe aerodynamic and structural dynamic representations. The theoretical background, computer program capabilities, and limited correlation results are presented in this report. Correlations with scale-model wind-tunnel results show that the analysis can adequately predict trends of vibration variation with airspeed and the effects of higher harmonic control. Predictions of absolute vibration levels were found to be very sensitive to modal characteristics, and results were not representative of measured values.

  14. Study of aircraft centered navigation, guidance, and traffic situation system concept for terminal area operation

    NASA Technical Reports Server (NTRS)

    Anderson, W. W.; Will, R. W.; Grantham, C.

    1972-01-01

    A concept for automating the control of air traffic in the terminal area in which the primary man-machine interface is the cockpit is described. The ground and airborne inputs required for implementing this concept are discussed. Digital data link requirements of 10,000 bits per second are explained. A particular implementation of this concept including a sequencing and separation algorithm which generates flight paths and implements a natural order landing sequence is presented. Onboard computer/display avionics utilizing a traffic situation display is described. A preliminary simulation of this concept has been developed which includes a simple, efficient sequencing algorithm and a complete aircraft dynamics model. This simulated jet transport was flown through automated terminal-area traffic situations by pilots using relatively sophisticated displays, and pilot performance and observations are discussed.
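
    A toy version of a natural-order sequencing and separation rule of the kind described follows: aircraft keep their arrival order, and each landing time is pushed back just enough to honor a minimum separation. The separation value and traffic data are illustrative only, not taken from the study.

      # First-come, first-served landing sequencing with minimum separation.
      from dataclasses import dataclass

      @dataclass
      class Arrival:
          ident: str
          eta: float        # estimated time of arrival at threshold, seconds

      def sequence(arrivals, min_sep=90.0):
          """Assign landing slots in natural (ETA) order with min separation."""
          slots, last = [], float("-inf")
          for a in sorted(arrivals, key=lambda a: a.eta):
              t = max(a.eta, last + min_sep)
              slots.append((a.ident, t, t - a.eta))   # (aircraft, slot, delay)
              last = t
          return slots

      traffic = [Arrival("AC1", 0.0), Arrival("AC2", 45.0), Arrival("AC3", 200.0)]
      for ident, slot, delay in sequence(traffic):
          print(f"{ident}: lands at t={slot:5.1f}s (delay {delay:4.1f}s)")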

  15. Computer based guidance in the modern operating room: a historical perspective.

    PubMed

    Bova, Frank

    2010-01-01

    The past few decades have seen the introduction of many different and innovative approaches aimed at enhancing surgical technique. As microprocessors have decreased in size and increased in processing power, more sophisticated systems have been developed. Some systems have attempted to provide enhanced instrument control while others have attempted to provide tools for surgical guidance. These systems include robotics, image enhancements, and frame-based and frameless guidance procedures. In almost every case the system's design goals were achieved and surgical outcomes were enhanced, yet a vast majority of today's surgical procedures are conducted without the aid of these advances. As new tools are developed and existing tools refined, special attention to the systems interface and integration into the operating room environment will be required before increased utilization of these technologies can be realized.

  16. Biocatalysis for Biobased Chemicals

    PubMed Central

    de Regil, Rubén; Sandoval, Georgina

    2013-01-01

    The design and development of greener processes that are safe and friendly is an irreversible trend that is driven by sustainable and economic issues. The use of Biocatalysis as part of a manufacturing process fits well in this trend, as enzymes are themselves biodegradable, require mild conditions to work, and are highly specific and well suited to carry out complex reactions in a simple way. The growth of computational capabilities in recent decades has allowed Biocatalysis to develop sophisticated tools to better understand enzymatic phenomena and to gain the power to control not only process conditions but also the enzyme's own nature. Nowadays, Biocatalysis is behind some important products in the pharmaceutical, cosmetic, food and bulk chemicals industry. In this review we want to present some of the most representative examples of industrial chemicals produced in vitro through enzymatic catalysis. PMID:24970192

  17. Computational control of flexible aerospace systems

    NASA Technical Reports Server (NTRS)

    Sharpe, Lonnie, Jr.; Shen, Ji Yao

    1994-01-01

    The main objective of this project is to establish a distributed parameter modeling technique for structural analysis, parameter estimation, vibration suppression, and control synthesis of large flexible aerospace structures. This report concentrates on the research outputs produced in the last two years. The main accomplishments can be summarized as follows. A new version of the PDEMOD Code has been completed, based on several incomplete versions. The code has been verified by comparing its results with examples for which exact theoretical solutions can be obtained. The theoretical background of the package and the verification examples have been reported in a technical paper submitted to the Joint Applied Mechanics & Material Conference, ASME. A brief USER'S MANUAL has been compiled, which includes three parts: (1) input data preparation; (2) explanation of the subroutines; and (3) specification of control variables. Meanwhile, a theoretical investigation of the NASA MSFC two-dimensional ground-based manipulator facility using the distributed parameter modeling technique has been conducted. A new mathematical treatment for dynamic analysis and control of large flexible manipulator systems has been conceived, which may provide an embryonic form of a more sophisticated mathematical model for future modified versions of the PDEMOD Codes.

  18. Biology Inspired Approach for Communal Behavior in Sensor Networks

    NASA Technical Reports Server (NTRS)

    Jones, Kennie H.; Lodding, Kenneth N.; Olariu, Stephan; Wilson, Larry; Xin, Chunsheng

    2006-01-01

    Research in wireless sensor network technology has exploded in the last decade. Promises of complex and ubiquitous control of the physical environment by these networks open avenues for new kinds of science and business. Due to the small size and low cost of sensor devices, visionaries promise systems enabled by deployment of massive numbers of sensors working in concert. Although the reduction in size has been phenomenal it results in severe limitations on the computing, communicating, and power capabilities of these devices. Under these constraints, research efforts have concentrated on developing techniques for performing relatively simple tasks with minimal energy expense assuming some form of centralized control. Unfortunately, centralized control does not scale to massive size networks and execution of simple tasks in sparsely populated networks will not lead to the sophisticated applications predicted. These must be enabled by new techniques dependent on local and autonomous cooperation between sensors to effect global functions. As a step in that direction, in this work we detail a technique whereby a large population of sensors can attain a global goal using only local information and by making only local decisions without any form of centralized control.
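
    One concrete instance of attaining a global goal through purely local decisions is distributed average consensus, sketched below in Python for a ring topology: each sensor repeatedly nudges its value toward its immediate neighbors', and the whole population converges to the global mean with no node ever seeing the full network. This is a generic illustration of the principle, not the authors' specific technique.

      # Distributed average consensus on a ring: local exchanges only.
      import numpy as np

      rng = np.random.default_rng(0)
      n = 50
      x = rng.uniform(10.0, 30.0, n)         # e.g. local temperature readings
      target = x.mean()                      # the global quantity of interest

      eps = 0.3                              # step size < 1/max_degree (ring: 2)
      for _ in range(5000):
          left, right = np.roll(x, 1), np.roll(x, -1)   # ring neighbors only
          x = x + eps * ((left - x) + (right - x))

      print(f"global mean {target:.3f}; node values now within "
            f"{np.abs(x - target).max():.2e} of it")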

  19. RAVEN: a GUI and an Artificial Intelligence Engine in a Dynamic PRA Framework

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    C. Rabiti; D. Mandelli; A. Alfonsi

    Increases in computational power and pressure for more accurate simulations and estimations of accident scenario consequences are driving the need for Dynamic Probabilistic Risk Assessment (PRA) [1] of very complex models. While more sophisticated algorithms and computational power address the back end of this challenge, the front end is still handled by engineers who need to extract meaningful information from the large amount of data and build these complex models. Compounding this problem are the difficulty of knowledge transfer and retention and the increasing speed of software development. The above-described issues would have negatively impacted deployment of the new high-fidelity plant simulator RELAP-7 (Reactor Excursion and Leak Analysis Program) at Idaho National Laboratory. Therefore RAVEN, which was initially focused on being the plant controller for RELAP-7, will help mitigate future RELAP-7 software engineering risks. In order to accomplish this task, the Reactor Analysis and Virtual Control Environment (RAVEN) has been designed to provide an easy-to-use Graphical User Interface (GUI) for building plant models and to leverage artificial intelligence algorithms in order to reduce computational time, improve results, and help the user identify the behavioral patterns of Nuclear Power Plants (NPPs). In this paper we present the GUI implementation and its current capability status. We also introduce the support vector machine algorithms and show our evaluation of their potential for increasing the accuracy and reducing the computational costs of PRA analysis. In this evaluation we refer to preliminary studies performed under the Risk Informed Safety Margins Characterization (RISMC) project of the Light Water Reactor Sustainability (LWRS) campaign [3]. RISMC simulation needs and algorithm testing are currently used as guidance to prioritize RAVEN developments relevant to PRA.

  20. Why advanced computing? The key to space-based operations

    NASA Astrophysics Data System (ADS)

    Phister, Paul W., Jr.; Plonisch, Igor; Mineo, Jack

    2000-11-01

    The 'what is the requirement?' aspect of advanced computing and how it relates to and supports Air Force space-based operations is a key issue. In support of the Air Force Space Command's five major mission areas (space control, force enhancement, force applications, space support and mission support), two-fifths of the requirements have associated stringent computing/size implications. The Air Force Research Laboratory's 'migration to space' concept will eventually shift Science and Technology (S&T) dollars from predominantly airborne systems to airborne-and-space related S&T areas. One challenging 'space' area is in the development of sophisticated on-board computing processes for the next generation smaller, cheaper satellite systems. These new space systems (called microsats or nanosats) could be as small as a softball, yet perform functions that are currently being done by large, vulnerable ground-based assets. The Joint Battlespace Infosphere (JBI) concept will be used to manage the overall process of space applications coupled with advancements in computing. The JBI can be defined as a globally interoperable information 'space' which aggregates, integrates, fuses, and intelligently disseminates all relevant battlespace knowledge to support effective decision-making at all echelons of a Joint Task Force (JTF). This paper explores a single theme -- on-board processing is the best avenue to take advantage of advancements in high-performance computing, high-density memories, communications, and re-programmable architecture technologies. The goal is to break away from 'no changes after launch' design to a more flexible design environment that can take advantage of changing space requirements and needs while the space vehicle is 'on orbit.'

  1. Projected role of advanced computational aerodynamic methods at the Lockheed-Georgia company

    NASA Technical Reports Server (NTRS)

    Lores, M. E.

    1978-01-01

    Experience with advanced computational methods being used at the Lockheed-Georgia Company to aid in the evaluation and design of new and modified aircraft indicates that large and specialized computers will be needed to make advanced three-dimensional viscous aerodynamic computations practical. The Numerical Aerodynamic Simulation Facility should provide a tool for designing better aerospace vehicles while reducing development costs, both by performing computations with Navier-Stokes solution algorithms and by permitting less sophisticated but nevertheless complex calculations to be made efficiently. Configuration definition procedures and data output formats can probably best be defined in cooperation with industry; therefore, the computer should handle many remote terminals efficiently. The capability of transferring data to and from other computers needs to be provided. Because of the significant amount of input and output associated with 3-D viscous flow calculations, and because of the exceedingly fast computation speed envisioned for the computer, special attention should be paid to providing rapid, diversified, and efficient input and output.

  2. Lost in Second Life: Virtual Embodiment and Language Learning via Multimodal Communication

    ERIC Educational Resources Information Center

    Pasfield-Neofitou, Sarah; Huang, Hui; Grant, Scott

    2015-01-01

    Increased recognition of the role of the body and environment in cognition has taken place in recent decades in the form of new theories of embodied and extended cognition. The growing use of ever more sophisticated computer-generated 3D virtual worlds and avatars has added a new dimension to these theories of cognition. Both developments provide…

  3. Impact of Media Richness and Flow on E-Learning Technology Acceptance

    ERIC Educational Resources Information Center

    Liu, Su-Houn; Liao, Hsiu-Li; Pratt, Jean A.

    2009-01-01

    Advances in e-learning technologies parallel a general increase in sophistication by computer users. The use of just one theory or model, such as the technology acceptance model, is no longer sufficient to study the intended use of e-learning systems. Rather, a combination of theories must be integrated in order to fully capture the complexity of…

  4. Box compression analysis of world-wide data spanning 46 years

    Treesearch

    Thomas J. Urbanik; Benjamin Frank

    2006-01-01

    The state of the art among most industry citations of box compression estimation is the equation by McKee developed in 1963. Because of limitations in computing tools at the time the McKee equation was developed, the equation is a simplification, with many constraints, of a more general relationship. By applying the results of sophisticated finite element modeling, in...

  5. Pattern Discovery in Biomolecular Data – Tools, Techniques, and Applications | Center for Cancer Research

    Cancer.gov

    Finding patterns in biomolecular data, particularly in DNA and RNA, is at the center of modern biological research. These data are complex and growing rapidly, so the search for patterns requires increasingly sophisticated computer methods. This book provides a summary of principal techniques. Each chapter describes techniques that are drawn from many fields, including graph

  6. Interesting, Cool and Tantalising? Or Inappropriate, Complicated and Tedious? Pupil and Teacher Views on ICT in Science Teaching

    ERIC Educational Resources Information Center

    Willshire, Michael

    2013-01-01

    In a relatively short space of time, classrooms have become full of computers, gadgets and electronic devices. Technology will only continue to become more sophisticated, more efficient and more abundant in schools. But how desirable is this technological revolution and to what extent should it develop? To measure the effectiveness and popularity…

  7. Revisiting Mathematical Problem Solving and Posing in the Digital Era: Toward Pedagogically Sound Uses of Modern Technology

    ERIC Educational Resources Information Center

    Abramovich, S.

    2014-01-01

    The availability of sophisticated computer programs such as "Wolfram Alpha" has made many problems found in the secondary mathematics curriculum somewhat obsolete for they can be easily solved by the software. Against this background, an interplay between the power of a modern tool of technology and educational constraints it presents is…

  8. Interactive Software For Astrodynamical Calculations

    NASA Technical Reports Server (NTRS)

    Schlaifer, Ronald S.; Skinner, David L.; Roberts, Phillip H.

    1995-01-01

    The QUICK computer program provides the user with the facilities of a sophisticated desk calculator that performs scalar, vector, and matrix arithmetic; propagates conic-section orbits; determines planetary and satellite coordinates; and performs other related astrodynamic calculations within a FORTRAN-like software environment. QUICK is an interpreter, so no compiler or linker is needed to run QUICK code. Outputs can be plotted in a variety of formats on a variety of terminals. Written in RATFOR.
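
    For illustration, a minimal Python sketch of one calculation of the kind QUICK automates: propagating a conic-section (here elliptical) orbit by solving Kepler's equation. The orbital elements and gravitational parameter below are arbitrary example values, not from the record.

      import numpy as np

      def kepler_E(M, e, tol=1e-12):
          """Solve Kepler's equation M = E - e*sin(E) by Newton iteration."""
          E = M if e < 0.8 else np.pi
          for _ in range(50):
              dE = (E - e * np.sin(E) - M) / (1.0 - e * np.cos(E))
              E -= dE
              if abs(dE) < tol:
                  break
          return E

      def propagate_conic(a, e, M0, dt, mu=398600.4418):
          """Advance an elliptical orbit by dt seconds; return true anomaly (rad).
          a in km; mu defaults to Earth's gravitational parameter (km^3/s^2)."""
          n = np.sqrt(mu / a**3)                 # mean motion, rad/s
          M = (M0 + n * dt) % (2.0 * np.pi)      # mean anomaly after dt
          E = kepler_E(M, e)
          return 2.0 * np.arctan2(np.sqrt(1 + e) * np.sin(E / 2),
                                  np.sqrt(1 - e) * np.cos(E / 2))

      # Example: advance a mildly eccentric low orbit by 1500 s.
      print(propagate_conic(a=7000.0, e=0.05, M0=0.0, dt=1500.0))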

  9. A novel paradigm for telemedicine using the personal bio-monitor.

    PubMed

    Bhatikar, Sanjay R; Mahajan, Roop L; DeGroff, Curt

    2002-01-01

    The foray of solid-state technology in the medical field has yielded an arsenal of sophisticated healthcare tools. Personal, portable computing power coupled with the information superhighway opens up the possibility of sophisticated healthcare management that will impact the medical field just as much. The full synergistic potential of three interwoven technologies: (1) compact electronics, (2) World Wide Web, and (3) Artificial Intelligence is yet to be realized. The system presented in this paper integrates these technologies synergistically, providing a new paradigm for healthcare. Our idea is to deploy internet-enabled, intelligent, handheld personal computers for medical diagnosis. The salient features of the 'Personal Bio-Monitor' we envisage are: (1) Utilization of the peripheral signals of the body which may be acquired non-invasively and with ease, for diagnosis of medical conditions; (2) An Artificial Neural Network (ANN) based approach for diagnosis; (3) Configuration of the diagnostic device as a handheld for personal use; (4) Internet connectivity, following the emerging Bluetooth protocol, for prompt conveyance of information to a patient's health care provider via the World Wide Web. The proposal is substantiated with an intelligent handheld device developed by the investigators for pediatric cardiac auscultation. This device performed accurate diagnoses of cardiac abnormalities in pediatrics using an artificial neural network to process heart sounds acquired by a low-frequency microphone and transmitted its diagnosis to a desktop PC via infrared. The idea of the personal biomonitor presented here has the potential to streamline healthcare by optimizing two valuable resources: physicians' time and sophisticated equipment time. We show that the elements of such a system are in place, with our prototype. Our novel contribution is the synergistic integration of compact electronics technology, artificial neural network methodology and the wireless web resulting in a revolutionary new paradigm for healthcare management.
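
    As a hedged sketch of the diagnostic idea (not the authors' network), a small feed-forward ANN can be trained to flag abnormal heart sounds from extracted features; the synthetic features and labels below are placeholders for real spectral measurements.

      import numpy as np
      from sklearn.neural_network import MLPClassifier

      rng = np.random.default_rng(0)
      X = rng.normal(size=(200, 16))                 # 16 features per recording (synthetic)
      y = (X[:, :4].sum(axis=1) > 0).astype(int)     # stand-in labels: 1 = murmur suspected

      clf = MLPClassifier(hidden_layer_sizes=(8,), max_iter=2000, random_state=0)
      clf.fit(X[:150], y[:150])                      # train on 150 recordings
      print("held-out accuracy:", clf.score(X[150:], y[150:]))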

  10. DOE's Computer Incident Advisory Capability (CIAC)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schultz, E.

    1990-09-01

    Computer security is essential in maintaining quality in the computing environment. Computer security incidents, however, are becoming more sophisticated. The DOE Computer Incident Advisory Capability (CIAC) team was formed primarily to assist DOE sites in responding to computer security incidents. Among CIAC's other responsibilities are gathering and distributing information to DOE sites, providing training workshops, coordinating with other agencies, response teams, and vendors, creating guidelines for incident handling, and developing software tools. CIAC has already provided considerable assistance to DOE sites faced with virus infections and worm and hacker attacks, has issued over 40 information bulletins, and has developed and presented a workshop on incident handling. CIAC's experience in helping sites has produced several lessons learned, including the need to follow effective procedures to avoid virus infections in small systems and the need for sound password management and system administration in networked systems. CIAC's activity and scope will expand in the future. 4 refs.

  11. Contemporary issues in HIM. The application layer--III.

    PubMed

    Wear, L L; Pinkert, J R

    1993-07-01

    We have seen document preparation systems evolve from basic line editors through powerful, sophisticated desktop publishing programs. This component of the application layer is probably one of the most used, and most readily identifiable. Ask grade school children nowadays, and many will tell you that they have written a paper on a computer. Next month will be a "fun" tour through a number of other application programs we find useful. They will range from a simple notebook reminder to a sophisticated photograph processor.
    Application layer: Software targeted for the end user, focusing on a specific application area, and typically residing in the computer system as distinct components on top of the OS.
    Desktop publishing: A document preparation program that begins with the text features of a word processor, then adds the ability for a user to incorporate outputs from a variety of graphic programs, spreadsheets, and other applications.
    Line editor: A document preparation program that manipulates text in a file on the basis of numbered lines.
    Word processor: A document preparation program that can, among other things, reformat sections of documents, move and replace blocks of text, use multiple character fonts, automatically create a table of contents and index, create complex tables, and combine text and graphics.
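
    To make the 'numbered lines' idea concrete, here is a toy line editor in Python; the class and command names are invented for illustration.

      class LineEditor:
          """Minimal editor that addresses text by 1-based line number."""

          def __init__(self, lines=None):
              self.lines = list(lines or [])

          def show(self):
              for i, text in enumerate(self.lines, start=1):
                  print(f"{i:4d}  {text}")

          def insert(self, n, text):
              self.lines.insert(n - 1, text)   # text becomes line n

          def delete(self, n):
              del self.lines[n - 1]            # remove line n

      ed = LineEditor(["first line", "third line"])
      ed.insert(2, "second line")
      ed.show()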

  12. Editorial Comments, 1974-1986: The Case For and Against the Use of Computer-Assisted Decision Making

    PubMed Central

    Weaver, Robert R.

    1987-01-01

    Journal editorials are an important medium for communicating information about medical innovations. Evaluative statements contained in editorials pertain to the innovation's technical merits, as well as its probable economic, social and political, and ethical consequences. This information will either promote or impede the subsequent diffusion of innovations. This paper analyzes the evaluative information contained in thirty editorials that pertain to the topic of computer-assisted decision making (CDM). Most editorials agree that CDM technology is effective and economical in performing routine clinical tasks; controversy surrounds the use of more sophisticated CDM systems for complex problem solving. A few editorials argue that the innovation should play an integral role in transforming the established health care system. Most, however, maintain that it can or should be accommodated within the existing health care framework. Finally, while few editorials discuss the ethical ramifications of CDM technology, those that do suggest that it will contribute to more humane health care. The editorial analysis suggests that CDM technology aimed at routine clinical tasks will experience rapid diffusion. In contrast, the diffusion of more sophisticated CDM systems will, in the foreseeable future, likely be sporadic at best.

  13. Computational Challenges of Viscous Incompressible Flows

    NASA Technical Reports Server (NTRS)

    Kwak, Dochan; Kiris, Cetin; Kim, Chang Sung

    2004-01-01

    Over the past thirty years, numerical methods and simulation tools for incompressible flows have been advanced as a subset of the computational fluid dynamics (CFD) discipline. Although incompressible flows are encountered in many areas of engineering, simulation of compressible flow has been the major driver for developing computational algorithms and tools. This is probably due to the rather stringent requirements for predicting aerodynamic performance characteristics of flight vehicles, while flow devices involving low-speed or incompressible flow could be reasonably well designed without resorting to accurate numerical simulations. As flow devices are required to be more sophisticated and highly efficient, CFD tools have become increasingly important in fluid engineering for incompressible and low-speed flow. This paper reviews some of the successes made possible by advances in computational technologies during the same period, and discusses some of the current challenges faced in computing incompressible flows.

  14. Facial Animations: Future Research Directions & Challenges

    NASA Astrophysics Data System (ADS)

    Alkawaz, Mohammed Hazim; Mohamad, Dzulkifli; Rehman, Amjad; Basori, Ahmad Hoirul

    2014-06-01

    Computer facial animation is now used in a multitude of fields, bringing human and social dimensions to computer games, films and interactive multimedia. Authoring complex and subtle facial expressions remains challenging and fraught with problems; as a result, animation produced with general-purpose computer animation techniques is often limited in production quality and quantity. Even with growing computing power, facial understanding and software sophistication, many newly emerging face-centric methods are still immature. This paper therefore surveys facial animation experts in order to define and categorize the recent state of the field, its observed bottlenecks and its developing techniques. The paper further presents a real-time simulation model of human worry and howling, with detailed discussion of the perception of astonishment, sorrow, annoyance and panic.

  15. Conservative zonal schemes for patched grids in 2 and 3 dimensions

    NASA Technical Reports Server (NTRS)

    Hessenius, Kristin A.

    1987-01-01

    The computation of flow over complex geometries, such as realistic aircraft configurations, poses difficult grid generation problems for computational aerodynamicists. The creation of a traditional, single-module grid of acceptable quality about an entire configuration may be impossible even with the most sophisticated of grid generation techniques. A zonal approach, wherein the flow field is partitioned into several regions within which grids are independently generated, is a practical alternative for treating complicated geometries. This technique not only alleviates the problems of discretizing a complex region, but also facilitates a block processing approach to computation thereby circumventing computer memory limitations. The use of such a zonal scheme, however, requires the development of an interfacing procedure that ensures a stable, accurate, and conservative calculation for the transfer of information across the zonal borders.
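
    A one-dimensional toy of the conservative-interface idea (invented numbers, not from the paper): where two fine-grid faces abut one coarse-grid face, the coarse-face flux is constrained to the area-weighted sum of the fine-face fluxes, so nothing is created or destroyed at the zonal border.

      import numpy as np

      fine_fluxes = np.array([1.8, 2.2])   # fluxes through the two fine faces
      fine_areas  = np.array([0.5, 0.5])   # each fine face is half the coarse face
      coarse_area = 1.0

      # Conservation constraint across the patched interface:
      coarse_flux = np.sum(fine_fluxes * fine_areas) / coarse_area
      print("conservative coarse-face flux:", coarse_flux)   # -> 2.0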

  16. MoCog1: A computer simulation of recognition-primed human decision making

    NASA Technical Reports Server (NTRS)

    Gevarter, William B.

    1991-01-01

    This report describes the successful results of the first stage of a research effort to develop a 'sophisticated' computer model of human cognitive behavior. Most human decision-making is of the experience-based, relatively straight-forward, largely automatic, type of response to internal goals and drives, utilizing cues and opportunities perceived from the current environment. This report describes the development of the architecture and computer program associated with such 'recognition-primed' decision-making. The resultant computer program was successfully utilized as a vehicle to simulate findings that relate how an individual's implicit theories orient them toward particular goals, with resultant cognitions, affects, and behavior in response to their environment. The present work is an expanded version and is based on research reported while the author was an employee of NASA ARC.

  17. Gait and balance of transfemoral amputees using passive mechanical and microprocessor-controlled prosthetic knees.

    PubMed

    Kaufman, K R; Levine, J A; Brey, R H; Iverson, B K; McCrady, S K; Padgett, D J; Joyner, M J

    2007-10-01

    Microprocessor-controlled knee joints appeared on the market a decade ago. These joints are more sophisticated and more expensive than mechanical ones. The literature is contradictory regarding changes in gait and balance when using these sophisticated devices. This study employed a crossover design to assess the comparative performance of a passive mechanical knee prosthesis compared to a microprocessor-controlled knee joint in 15 subjects with an above-knee amputation. Objective measurements of gait and balance were obtained. Subjects demonstrated significantly improved gait characteristics after receiving the microprocessor-controlled prosthetic knee joint (p<0.01). Improvements in gait were a transition from a hyperextended knee to a flexed knee during loading response which resulted in a change from an internal knee flexor moment to a knee extensor moment. The participants' balance also improved (p<0.01). All conditions of the Sensory Organization Test (SOT) demonstrated improvements in equilibrium score. The composite score also increased. Transfemoral amputees using a microprocessor-controlled knee have significant improvements in gait and balance.

  18. CONTROL OF MICROBIAL CONTAMINANTS AND DISINFECTION BY-PRODUCTS (DBPS): COST AND PERFORMANCE

    EPA Science Inventory

    The USEPA is in the process of developing a sophisticated regulatory strategy in an attempt to balance the complex trade-offs in risks associated with controlling disinfectants and disinfection by-products (D/DBPs) in drinking water. EPA first attempted to control DBPs in 1974, w...

  19. Tools for Analyzing Computing Resource Management Strategies and Algorithms for SDR Clouds

    NASA Astrophysics Data System (ADS)

    Marojevic, Vuk; Gomez-Miguelez, Ismael; Gelonch, Antoni

    2012-09-01

    Software defined radio (SDR) clouds centralize the computing resources of base stations. The computing resource pool is shared between radio operators and dynamically loads and unloads digital signal processing chains for providing wireless communications services on demand. Each new user session request requires, in particular, the allocation of computing resources for executing the corresponding SDR transceivers. The huge amount of computing resources of SDR cloud data centers and the numerous session requests at certain hours of the day require efficient computing resource management. We propose a hierarchical approach, where the data center is divided into clusters that are managed in a distributed way. This paper presents a set of computing resource management tools for analyzing computing resource management strategies and algorithms for SDR clouds. We use the tools to evaluate different strategies and algorithms. The results show that more sophisticated algorithms can achieve higher resource occupations and that a tradeoff exists between cluster size and algorithm complexity.
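
    For a flavor of what such tools evaluate, here is a minimal sketch of one simple allocation strategy (first-fit, with invented loads and capacities); the paper's algorithms are more sophisticated.

      def first_fit(requests, clusters):
          """requests: computing loads of new sessions; clusters: spare capacities.
          Returns (request_index, cluster_index) pairs; None means rejected."""
          placement = []
          for i, load in enumerate(requests):
              for j, free in enumerate(clusters):
                  if load <= free:
                      clusters[j] -= load
                      placement.append((i, j))
                      break
              else:
                  placement.append((i, None))   # no cluster has capacity left
          return placement

      print(first_fit([40, 25, 60, 30], [100, 50]))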

  20. Password Cracking Using Sony Playstations

    NASA Astrophysics Data System (ADS)

    Kleinhans, Hugo; Butts, Jonathan; Shenoi, Sujeet

    Law enforcement agencies frequently encounter encrypted digital evidence for which the cryptographic keys are unknown or unavailable. Password cracking - whether it employs brute force or sophisticated cryptanalytic techniques - requires massive computational resources. This paper evaluates the benefits of using the Sony PlayStation 3 (PS3) to crack passwords. The PS3 offers massive computational power at relatively low cost. Moreover, multiple PS3 systems can be introduced easily to expand parallel processing when additional power is needed. This paper also describes a distributed framework designed to enable law enforcement agents to crack encrypted archives and applications in an efficient and cost-effective manner.
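
    A single-machine Python analogue of the brute-force search (the paper distributes the keyspace across PS3 consoles): exhaustively hash candidate strings until the target digest matches. The alphabet and length bound are illustrative.

      import hashlib
      from itertools import product

      ALPHABET = "abcdefghijklmnopqrstuvwxyz"

      def crack(target_hex, max_len=4):
          for length in range(1, max_len + 1):
              for combo in product(ALPHABET, repeat=length):
                  candidate = "".join(combo)
                  if hashlib.sha256(candidate.encode()).hexdigest() == target_hex:
                      return candidate
          return None

      target = hashlib.sha256(b"abc").hexdigest()
      print(crack(target))   # -> 'abc'

    Distributing the work is then a matter of partitioning the candidate space (for example, by first letter) across machines.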

  1. An integrated decision support system for TRAC: A proposal

    NASA Technical Reports Server (NTRS)

    Mukkamala, Ravi

    1991-01-01

    Optimal allocation and usage of resources is a key to effective management. Resources of concern to TRAC are: Manpower (PSY), Money (Travel, contracts), Computing, Data, Models, etc. Management activities of TRAC include: Planning, Programming, Tasking, Monitoring, Updating, and Coordinating. Existing systems are insufficient, not completely automated, and manpower intensive, and the potential for data inconsistency exists. A system is proposed which suggests a means to integrate all project management activities of TRAC through the development of sophisticated software and by utilizing the existing computing systems and network resources. The systems integration proposal is examined in detail.

  2. Designer drugs: the evolving science of drug discovery.

    PubMed

    Wanke, L A; DuBose, R F

    1998-07-01

    Drug discovery and design are fundamental to drug development. Until recently, most drugs were discovered through random screening or developed through molecular modification. New technologies are revolutionizing this phase of drug development. Rational drug design, using powerful computers and computational chemistry and employing X-ray crystallography, nuclear magnetic resonance spectroscopy, and three-dimensional quantitative structure activity relationship analysis, is creating highly specific, biologically active molecules by virtual reality modeling. Sophisticated screening technologies are eliminating all but the most active lead compounds. These new technologies promise more efficacious, safe, and cost-effective medications, while minimizing drug development time and maximizing profits.

  3. Extension, validation and application of the NASCAP code

    NASA Technical Reports Server (NTRS)

    Katz, I.; Cassidy, J. J., III; Mandell, M. J.; Schnuelle, G. W.; Steen, P. G.; Parks, D. E.; Rotenberg, M.; Alexander, J. H.

    1979-01-01

    Numerous extensions were made in the NASCAP code. They fall into three categories: a greater range of definable objects, a more sophisticated computational model, and simplified code structure and usage. An important validation of NASCAP was performed using a new two-dimensional computer code (TWOD). An interactive code (MATCHG) was written to compare material parameter inputs with charging results. The first major application of NASCAP was performed on the SCATHA satellite. Shadowing and charging calculations were completed. NASCAP was installed at the Air Force Geophysics Laboratory, where researchers plan to use it to interpret SCATHA data.

  4. Advances in borehole geophysics for hydrology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nelson, P.H.

    1982-01-01

    Borehole geophysical methods provide vital subsurface information on rock properties, fluid movement, and the condition of engineered borehole structures. Within the first category, salient advances include the continuing improvement of the borehole televiewer, refinement of the electrical conductivity dipmeter for fracture characterization, and the development of a gigahertz-frequency electromagnetic propagation tool for water saturation measurements. The exploration of the rock mass between boreholes remains a challenging problem with high potential; promising methods are now incorporating high-density spatial sampling and sophisticated data processing. Flow-rate measurement methods appear adequate for all but low-flow situations. At low rates the tagging method seems the most attractive. The current exploitation of neutron-activation techniques for tagging means that the wellbore fluid itself is tagged, thereby eliminating the mixing of an alien fluid into the wellbore. Another method uses the acoustic noise generated by flow through constrictions and in and behind casing to detect and locate flaws in the production system. With the advent of field-recorded digital data, the interpretation of logs from sedimentary sequences is now reaching a sophisticated level with the aid of computer processing and the application of statistical methods. Lagging behind are interpretive schemes for the low-porosity, fracture-controlled igneous and metamorphic rocks encountered in the geothermal reservoirs and in potential waste-storage sites. Progress is being made on the general problem of fracture detection by use of electrical and acoustical techniques, but the reliable definition of permeability continues to be an elusive goal.

  5. Charity gets children well-connected.

    PubMed

    2008-06-01

    A growing number of young patients at Sheffield Children's Hospital will soon be able to keep up with schoolwork, access TV and other entertainment services, and telephone friends and family, all at no cost, following the installation of a sophisticated bedside patient entertainment/computing system supplied by Wandsworth Group. In a believed UK first, the equipment is being entirely funded by the hospital's charity. Health Estate Journal reports.

  6. Viewing CAD Drawings on the Internet

    ERIC Educational Resources Information Center

    Schwendau, Mark

    2004-01-01

    Computer aided design (CAD) has been producing 3-D models for years. AutoCAD software is frequently used to create sophisticated 3-D models. These CAD files can be exported as 3DS files for import into Autodesk's 3-D Studio Viz. In this program, the user can render and modify the 3-D model before exporting it out as a WRL (world file hyperlinked)…

  7. Thermal Analysis

    NASA Technical Reports Server (NTRS)

    1986-01-01

    The University of Georgia used NASTRAN, a COSMIC program that predicts how a design will stand up under stress, to develop a model for monitoring the transient cooling of vegetables. The winter use of passive solar heating for poultry houses is also under investigation by the Agricultural Engineering Dept. Another study involved thermal analysis of black and green nursery containers. The use of NASTRAN has encouraged student appreciation of sophisticated computer analysis.

  8. Teaching Strategies for Using Projected Images to Develop Conceptual Understanding: Exploring Discussion Practices in Computer Simulation and Static Image-Based Lessons

    ERIC Educational Resources Information Center

    Price, Norman T.

    2013-01-01

    The availability and sophistication of visual display images, such as simulations, for use in science classrooms has increased exponentially; however, it can be difficult for teachers to use these images to encourage and engage active student thinking. There is a need to describe flexible discussion strategies that use visual media to engage active…

  9. The Impact of Online Teaching and Learning about Emotional Intelligence, Myers Briggs Personality Dimensions and Mindfulness on Personal and Social Awareness

    ERIC Educational Resources Information Center

    Cotler, Jami L.

    2016-01-01

    As computer-mediated communication continues to evolve and become more sophisticated and accessible, the applications for this technology continue to grow. One area that has garnered a considerable amount of attention is online teaching and learning. Research has shown increasing evidence that learning outcomes of face-to-face, in comparison to…

  10. The Galaxy platform for accessible, reproducible and collaborative biomedical analyses: 2016 update

    PubMed Central

    Afgan, Enis; Baker, Dannon; van den Beek, Marius; Blankenberg, Daniel; Bouvier, Dave; Čech, Martin; Chilton, John; Clements, Dave; Coraor, Nate; Eberhard, Carl; Grüning, Björn; Guerler, Aysam; Hillman-Jackson, Jennifer; Von Kuster, Greg; Rasche, Eric; Soranzo, Nicola; Turaga, Nitesh; Taylor, James; Nekrutenko, Anton; Goecks, Jeremy

    2016-01-01

    High-throughput data production technologies, particularly ‘next-generation’ DNA sequencing, have ushered in widespread and disruptive changes to biomedical research. Making sense of the large datasets produced by these technologies requires sophisticated statistical and computational methods, as well as substantial computational power. This has led to an acute crisis in life sciences, as researchers without informatics training attempt to perform computation-dependent analyses. Since 2005, the Galaxy project has worked to address this problem by providing a framework that makes advanced computational tools usable by non-experts. Galaxy seeks to make data-intensive research more accessible, transparent and reproducible by providing a Web-based environment in which users can perform computational analyses and have all of the details automatically tracked for later inspection, publication, or reuse. In this report we highlight recently added features enabling biomedical analyses on a large scale. PMID:27137889
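
    Programmatic access is one route to such reproducible analyses; a hedged sketch using the separate BioBlend client library (the URL and API key below are placeholders) lists a user's histories on a Galaxy server.

      from bioblend.galaxy import GalaxyInstance

      gi = GalaxyInstance(url="https://usegalaxy.org", key="YOUR_API_KEY")
      for h in gi.histories.get_histories():   # each history groups datasets and runs
          print(h["id"], h["name"])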

  11. Writing and Computing across the USM Chemistry Curriculum

    NASA Astrophysics Data System (ADS)

    Gordon, Nancy R.; Newton, Thomas A.; Rhodes, Gale; Ricci, John S.; Stebbins, Richard G.; Tracy, Henry J.

    2001-01-01

    The faculty of the University of Southern Maine believes the ability to communicate effectively is one of the most important skills required of successful chemists. To help students achieve that goal, the faculty has developed a Writing and Computer Program consisting of writing and computer assignments of gradually increasing sophistication for all our laboratory courses. The assignments build in complexity until, at the junior level, students are writing full journal-quality laboratory reports. Computer assignments also increase in difficulty as students attack more complicated subjects. We have found the program easy to initiate and our part-time faculty concurs as well. The Writing and Computing across the Curriculum Program also serves to unite the entire chemistry curriculum. We believe the program is helping to reverse what the USM chemistry faculty and other educators have found to be a steady deterioration in the writing skills of many of today's students.

  12. User-customized brain computer interfaces using Bayesian optimization

    NASA Astrophysics Data System (ADS)

    Bashashati, Hossein; Ward, Rabab K.; Bashashati, Ali

    2016-04-01

    Objective. The brain characteristics of different people are not the same. Brain computer interfaces (BCIs) should thus be customized for each individual person. In motor-imagery based synchronous BCIs, a number of parameters (referred to as hyper-parameters) including the EEG frequency bands, the channels and the time intervals from which the features are extracted should be pre-determined based on each subject’s brain characteristics. Approach. To determine the hyper-parameter values, previous work has relied on manual or semi-automatic methods that are not applicable to high-dimensional search spaces. In this paper, we propose a fully automatic, scalable and computationally inexpensive algorithm that uses Bayesian optimization to tune these hyper-parameters. We then build different classifiers trained on the sets of hyper-parameter values proposed by the Bayesian optimization. A final classifier aggregates the results of the different classifiers. Main Results. We have applied our method to 21 subjects from three BCI competition datasets. We have conducted rigorous statistical tests, and have shown the positive impact of hyper-parameter optimization in improving the accuracy of BCIs. Furthermore, we have compared our results to those reported in the literature. Significance. Unlike the best reported results in the literature, which are based on more sophisticated feature extraction and classification methods, and rely on prestudies to determine the hyper-parameter values, our method has the advantage of being fully automated, uses less sophisticated feature extraction and classification methods, and yields similar or superior results compared to the best performing designs in the literature.
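
    A minimal sketch of the tuning loop, using scikit-optimize as one possible Bayesian optimization tool (not the authors' code); the hyper-parameters and the dummy objective stand in for the real band/window choices and cross-validated classification error.

      from skopt import gp_minimize
      from skopt.space import Integer, Real

      space = [Real(4.0, 14.0, name="band_low_hz"),
               Real(16.0, 40.0, name="band_high_hz"),
               Integer(0, 3, name="window_start_s")]

      def objective(params):
          band_low, band_high, window_start = params
          # Real use: extract EEG features with these settings, train a
          # classifier, and return the cross-validated error. Dummy surrogate:
          return (band_low - 8.0) ** 2 + (band_high - 30.0) ** 2 + window_start

      result = gp_minimize(objective, space, n_calls=25, random_state=0)
      print("best hyper-parameters:", result.x, "objective:", result.fun)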

  13. Simple deterministic models and applications. Comment on "Coupled disease-behavior dynamics on complex networks: A review" by Z. Wang et al.

    NASA Astrophysics Data System (ADS)

    Yang, Hyun Mo

    2015-12-01

    Currently, discrete modelling is largely accepted due to access to computers with huge storage capacity and high-performance processors, and to the easy implementation of algorithms, allowing increasingly sophisticated models to be developed and simulated. Wang et al. [7] present a review of dynamics in complex networks, focusing on the interaction between disease dynamics and human behavioral and social dynamics. In an extensive review of human behavior in response to disease dynamics, the authors briefly describe the complex dynamics found in the literature: well-mixed population networks, where spatial structure can be neglected, and other networks considering heterogeneity in spatially distributed populations. As controlling mechanisms are implemented, such as social distancing due to 'social contagion', quarantine, non-pharmaceutical interventions and vaccination, adaptive behavior can occur in the human population, which can easily be taken into account in dynamics formulated on networked populations.

  14. Near-optimal protocols in complex nonequilibrium transformations

    DOE PAGES

    Gingrich, Todd R.; Rotskoff, Grant M.; Crooks, Gavin E.; ...

    2016-08-29

    The development of sophisticated experimental means to control nanoscale systems has motivated efforts to design driving protocols that minimize the energy dissipated to the environment. Computational models are a crucial tool in this practical challenge. In this paper, we describe a general method for sampling an ensemble of finite-time, nonequilibrium protocols biased toward a low average dissipation. In addition, we show that this scheme can be carried out very efficiently in several limiting cases. As an application, we sample the ensemble of low-dissipation protocols that invert the magnetization of a 2D Ising model and explore how the diversity of the protocols varies in response to constraints on the average dissipation. In this example, we find that there is a large set of protocols with average dissipation close to the optimal value, which we argue is a general phenomenon.
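
    A toy rendition of biased protocol sampling (with a stand-in dissipation model, not the paper's Ising estimate): Metropolis moves in the space of discretized protocols, accepted with weight exp(-W/alpha) so that low-dissipation protocols dominate the ensemble.

      import numpy as np

      rng = np.random.default_rng(1)
      steps, alpha = 20, 0.05
      lam = np.linspace(0.0, 1.0, steps)     # initial protocol: linear ramp

      def dissipation(lam):
          return np.sum(np.diff(lam) ** 2)   # toy model: penalize abrupt steps

      W = dissipation(lam)
      for _ in range(20000):
          trial = lam.copy()
          k = rng.integers(1, steps - 1)     # perturb an interior point; endpoints fixed
          trial[k] += rng.normal(scale=0.05)
          W_trial = dissipation(trial)
          if rng.random() < np.exp(-(W_trial - W) / alpha):   # Metropolis acceptance
              lam, W = trial, W_trial
      print("sampled protocol dissipation:", W)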

  15. Specificity of software cooperating with an optoelectronic sensor in the pulse oximeter system

    NASA Astrophysics Data System (ADS)

    Cysewska-Sobusiak, Anna; Wiczynski, Grzegorz; Jedwabny, Tomasz

    1995-06-01

    The specificity is described of a software package, composed of two parts, that controls an optoelectronic sensor in a computer-aided system for noninvasive measurement of arterial blood oxygen saturation and of selected parameters of the peripheral pulse waveform. The system applies the transmission variant of pulse oximetry, the only noninvasive method for this measurement. The software coordinates the cooperation of an IBM PC compatible microcomputer with the sensor and one specialized card. This novel card is a key part of the whole measuring system, whose fields of application are extended in comparison with commonly available pulse oximeters. The user-friendly MS Windows graphical environment, which makes the system multitasking and non-preemptive, has been used to design the specific part of the software presented here. With this environment, sophisticated tasks of the software package can be performed without excessive complication.
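
    For orientation, a sketch of the standard 'ratio of ratios' computation at the heart of transmission pulse oximetry; the linear calibration constants are a commonly quoted empirical approximation, not values from this system.

      import numpy as np

      def spo2_estimate(red, infrared):
          """red, infrared: photodetector samples at the two wavelengths."""
          def ac_over_dc(x):
              x = np.asarray(x, dtype=float)
              return (x.max() - x.min()) / x.mean()   # pulsatile over steady component
          R = ac_over_dc(red) / ac_over_dc(infrared)
          return 110.0 - 25.0 * R    # approximate empirical calibration

      t = np.linspace(0.0, 2.0 * np.pi, 100)
      red = 1.00 + 0.02 * np.sin(t)      # synthetic pulsatile signals
      ir  = 1.00 + 0.04 * np.sin(t)
      print(f"SpO2 ~ {spo2_estimate(red, ir):.1f}%")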

  16. Hot Corrosion Test Facility at the NASA Lewis Special Projects Laboratory

    NASA Technical Reports Server (NTRS)

    Robinson, Raymond C.; Cuy, Michael D.

    1994-01-01

    The Hot Corrosion Test Facility (HCTF) at the NASA Lewis Special Projects Laboratory (SPL) is a high-velocity, pressurized burner rig currently used to evaluate the environmental durability of advanced ceramic materials such as SiC and Si3N4. The HCTF uses laboratory service air which is preheated, mixed with jet fuel, and ignited to simulate the conditions of a gas turbine engine. Air, fuel, and water systems are computer-controlled to maintain test conditions which include maximum air flows of 250 kg/hr (550 lbm/hr), pressures of 100-600 kPa (1-6 atm), and gas temperatures exceeding 1500 C (2732 F). The HCTF provides a relatively inexpensive, yet sophisticated means for researchers to study the high-temperature oxidation of advanced materials, and the injection of a salt solution provides the added capability of conducting hot corrosion studies.

  17. Simple autonomous Mars walker

    NASA Technical Reports Server (NTRS)

    Larimer, Stanley J.; Lisec, Thomas R.; Spiessbach, Andrew J.

    1989-01-01

    Under a contract with NASA's Jet Propulsion Laboratory, Martin Marietta has developed several alternative rover concepts for unmanned exploration of the planet Mars. One of those concepts, the 'Walking Beam', is the subject of this paper. This concept was developed with the goal of achieving many of the capabilities of more sophisticated articulated-leg walkers with a much simpler, more robust, less computationally demanding and more power efficient design. It consists of two large-base tripods nested one within the other which alternately translate with respect to each other along a 5-meter beam to propel the vehicle. The semiautonomous navigation system relies on terrain geometry sensors and tactile feedback from each foot to autonomously select a path which avoids hazards along a route designated from Earth. Both mobility and navigation features of this concept are discussed including a top-level description of the vehicle's physical characteristics, deployment strategy, mobility elements, sensor suite, theory of operation, navigation and control processes, and estimated performance.

  18. Structural optimization of 3D-printed synthetic spider webs for high strength

    NASA Astrophysics Data System (ADS)

    Qin, Zhao; Compton, Brett G.; Lewis, Jennifer A.; Buehler, Markus J.

    2015-05-01

    Spiders spin intricate webs that serve as sophisticated prey-trapping architectures that simultaneously exhibit high strength, elasticity and graceful failure. To determine how web mechanics are controlled by their topological design and material distribution, here we create spider-web mimics composed of elastomeric filaments. Specifically, computational modelling and microscale 3D printing are combined to investigate the mechanical response of elastomeric webs under multiple loading conditions. We find the existence of an asymptotic prey size that leads to a saturated web strength. We identify pathways to design elastomeric material structures with maximum strength, low density and adaptability. We show that the loading type dictates the optimal material distribution, that is, a homogeneous distribution is better for localized loading, while stronger radial threads with weaker spiral threads is better for distributed loading. Our observations reveal that the material distribution within spider webs is dictated by the loading condition, shedding light on their observed architectural variations.

  19. Structural optimization of 3D-printed synthetic spider webs for high strength.

    PubMed

    Qin, Zhao; Compton, Brett G; Lewis, Jennifer A; Buehler, Markus J

    2015-05-15

    Spiders spin intricate webs that serve as sophisticated prey-trapping architectures that simultaneously exhibit high strength, elasticity and graceful failure. To determine how web mechanics are controlled by their topological design and material distribution, here we create spider-web mimics composed of elastomeric filaments. Specifically, computational modelling and microscale 3D printing are combined to investigate the mechanical response of elastomeric webs under multiple loading conditions. We find the existence of an asymptotic prey size that leads to a saturated web strength. We identify pathways to design elastomeric material structures with maximum strength, low density and adaptability. We show that the loading type dictates the optimal material distribution, that is, a homogeneous distribution is better for localized loading, while stronger radial threads with weaker spiral threads is better for distributed loading. Our observations reveal that the material distribution within spider webs is dictated by the loading condition, shedding light on their observed architectural variations.

  20. Functional Laser Trimming Of Thin Film Resistors On Silicon ICs

    NASA Astrophysics Data System (ADS)

    Mueller, Michael J.; Mickanin, Wes

    1986-07-01

    Modern Laser Wafer Trimming (LWT) technology achieves exceptional analog circuit performance and precision while maintaining the advantages of high production throughput and yield. Microprocessor-driven instrumentation has both emphasized the role of data conversion circuits and demanded sophisticated signal conditioning functions. Advanced analog semiconductor circuits with bandwidths over 1 GHz, and high precision, trimmable, thin-film resistors meet many of today's emerging circuit requirements. Critical to meeting these requirements are optimum choices of laser characteristics, proper materials, trimming process control, accurate modeling of trimmed resistor performance, and appropriate circuit design. Once limited exclusively to hand-crafted, custom integrated circuits, designs are now available in semi-custom circuit configurations. These are similar to those provided for digital designs and supported by computer-aided design (CAD) tools. Integrated with fully automated measurement and trimming systems, these quality circuits can now be produced in quantity to meet the requirements of communications, instrumentation, and signal processing markets.

  1. 77 FR 3559 - Energy Conservation Program for Consumer Products: Test Procedures for Refrigerators...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-01-25

    ..., which is typical of an approach enabled by more sophisticated electronic controls. Id. The interim final... and long- time automatic defrost or variable defrost control and adjust the default values of maximum... accurate measurement of the energy use of products with variable defrost control. DATES: The amendments are...

  2. Blend Shape Interpolation and FACS for Realistic Avatar

    NASA Astrophysics Data System (ADS)

    Alkawaz, Mohammed Hazim; Mohamad, Dzulkifli; Basori, Ahmad Hoirul; Saba, Tanzila

    2015-03-01

    The quest to develop realistic facial animation is ever-growing. The emergence of sophisticated algorithms, new graphical user interfaces, laser scans and advanced 3D tools has imparted further impetus towards the rapid advancement of complex virtual human facial models. With face-to-face communication being the most natural form of human interaction, facial animation systems have become attractive in the information technology era for sundry applications. The production of computer-animated movies using synthetic actors remains a challenging issue. A facial expression carries the signature of happiness, sadness, anger, cheerfulness, etc.; the mood of a particular person in the midst of a large group can be identified immediately via very subtle changes in facial expression. Facial expressions, a very complex and important nonverbal communication channel, are tricky to synthesize realistically using computer graphics. Computer synthesis of practical facial expressions must deal with the geometric representation of the human face and the control of the facial animation. We developed a new approach by integrating blend shape interpolation (BSI) and the facial action coding system (FACS) to create a realistic and expressive computer facial animation design. The BSI is used to generate the natural face, while the FACS is employed to reflect the exact facial muscle movements for four basic natural emotional expressions, angry, happy, sad and fear, with high fidelity. The results in perceiving realistic facial expression for virtual human emotions based on facial skin color and texture may contribute towards the development of virtual reality and game environments in computer-aided graphics animation systems.
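
    The BSI half of the approach reduces to a simple formula, V = V0 + sum_i w_i (V_i - V0), mixing displacement targets into a neutral mesh; the tiny mesh and weights below are invented for illustration.

      import numpy as np

      neutral = np.zeros((4, 3))                  # toy 4-vertex "face mesh"
      targets = {
          "happy": neutral + [0.0, 0.1, 0.0],     # each target displaces vertices
          "sad":   neutral + [0.0, -0.1, 0.0],
      }

      def blend(neutral, targets, weights):
          """Blend shape interpolation: V = V0 + sum_i w_i * (V_i - V0)."""
          out = neutral.copy()
          for name, w in weights.items():
              out += w * (targets[name] - neutral)
          return out

      print(blend(neutral, targets, {"happy": 0.7, "sad": 0.1}))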

  3. Adiabatic quantum computing with spin qubits hosted by molecules.

    PubMed

    Yamamoto, Satoru; Nakazawa, Shigeaki; Sugisaki, Kenji; Sato, Kazunobu; Toyota, Kazuo; Shiomi, Daisuke; Takui, Takeji

    2015-01-28

    A molecular spin quantum computer (MSQC) requires electron spin qubits, which pulse-based electron spin/magnetic resonance (ESR/MR) techniques can afford to manipulate for implementing quantum gate operations in open shell molecular entities. Importantly, nuclear spins, which are topologically connected, particularly in organic molecular spin systems, are client qubits, while electron spins play the role of bus qubits. Here, we introduce the implementation for an adiabatic quantum algorithm, suggesting the possible utilization of molecular spins with optimized spin structures for MSQCs. We exemplify the utilization of an adiabatic factorization problem of 21, compared with the corresponding nuclear magnetic resonance (NMR) case. Two molecular spins are selected: one is a molecular spin composed of three exchange-coupled electrons as electron-only qubits and the other an electron-bus qubit with two client nuclear spin qubits. Their electronic spin structures are well characterized in terms of the quantum mechanical behaviour in the spin Hamiltonian. The implementation of adiabatic quantum computing/computation (AQC) has, for the first time, been achieved by establishing ESR/MR pulse sequences for effective spin Hamiltonians in a fully controlled manner of spin manipulation. The established pulse sequences have been compared with the NMR experiments and show much faster CPU times corresponding to the interaction strength between the spins. Significant differences are shown in rotational operations and pulse intervals for ESR/MR operations. As a result, we suggest the advantages and possible utilization of the time-evolution based AQC approach for molecular spin quantum computers and molecular spin quantum simulators underlain by sophisticated ESR/MR pulsed spin technology.
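
    As a numerical illustration of the adiabatic scheme (a generic textbook construction, not the authors' ESR/MR implementation), one can interpolate H(s) = (1-s)H_B + s*H_P, where the problem Hamiltonian's diagonal cost (21 - pq)^2 vanishes exactly on the bit patterns encoding the factors of 21.

      import numpy as np

      X = np.array([[0.0, 1.0], [1.0, 0.0]])
      I = np.eye(2)

      def embed(single, site, n):
          """Place a single-qubit operator at `site` within an n-qubit register."""
          out = np.array([[1.0]])
          for k in range(n):
              out = np.kron(out, single if k == site else I)
          return out

      n = 4                                         # two bits each for odd p and q
      H_B = -sum(embed(X, k, n) for k in range(n))  # transverse-field driver

      diag = []
      for z in range(2 ** n):
          bits = [(z >> k) & 1 for k in range(n)]
          p = 2 * (2 * bits[3] + bits[2]) + 1       # odd candidates 1, 3, 5, 7
          q = 2 * (2 * bits[1] + bits[0]) + 1
          diag.append(float((21 - p * q) ** 2))
      H_P = np.diag(diag)                           # ground space: p*q = 21

      for s in np.linspace(0.0, 1.0, 5):
          evals = np.linalg.eigvalsh((1 - s) * H_B + s * H_P)
          # gap reaches 0 at s=1 only through the p <-> q degeneracy of the solution
          print(f"s={s:.2f}  spectral gap={evals[1] - evals[0]:.4f}")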

  4. Performance evaluation of GPU parallelization, space-time adaptive algorithms, and their combination for simulating cardiac electrophysiology.

    PubMed

    Sachetto Oliveira, Rafael; Martins Rocha, Bernardo; Burgarelli, Denise; Meira, Wagner; Constantinides, Christakis; Weber Dos Santos, Rodrigo

    2018-02-01

    The use of computer models as a tool for the study and understanding of the complex phenomena of cardiac electrophysiology has attained increased importance nowadays. At the same time, the increased complexity of the biophysical processes translates into complex computational and mathematical models. To speed up cardiac simulations and to allow more precise and realistic uses, 2 different techniques have been traditionally exploited: parallel computing and sophisticated numerical methods. In this work, we combine a modern parallel computing technique based on multicore and graphics processing units (GPUs) and a sophisticated numerical method based on a new space-time adaptive algorithm. We evaluate each technique alone and in different combinations: multicore and GPU, multicore and GPU and space adaptivity, multicore and GPU and space adaptivity and time adaptivity. All the techniques and combinations were evaluated under different scenarios: 3D simulations on slabs, 3D simulations on a ventricular mouse mesh, i.e., complex geometry, sinus-rhythm, and arrhythmic conditions. Our results suggest that multicore and GPU accelerate the simulations by an approximate factor of 33×, whereas the speedups attained by the space-time adaptive algorithms were approximately 48×. Nevertheless, by combining all the techniques, we obtained speedups that ranged between 165× and 498×. The tested methods were able to reduce the execution time of a simulation by more than 498× for a complex cellular model in a slab geometry and by 165× in a realistic heart geometry simulating spiral waves. The proposed methods will allow faster and more realistic simulations in a feasible time with no significant loss of accuracy. Copyright © 2017 John Wiley & Sons, Ltd.
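
    A toy of the underlying computation (an explicit reaction-diffusion step of FitzHugh-Nagumo type on a 2D slab; far simpler than the stiff cell models, GPU kernels and adaptivity discussed above):

      import numpy as np

      def step(v, w, dt=0.05, dx=1.0, D=0.1, a=0.1, eps=0.01, b=0.5):
          """One explicit step; periodic boundaries via np.roll for brevity."""
          lap = (np.roll(v, 1, 0) + np.roll(v, -1, 0) +
                 np.roll(v, 1, 1) + np.roll(v, -1, 1) - 4.0 * v) / dx**2
          dv = D * lap + v * (1.0 - v) * (v - a) - w
          dw = eps * (v - b * w)
          return v + dt * dv, w + dt * dw

      v = np.zeros((64, 64)); v[30:34, 30:34] = 1.0   # local stimulus
      w = np.zeros_like(v)
      for _ in range(200):
          v, w = step(v, w)
      print("max excitation after 200 steps:", float(v.max()))

    Speeding up exactly this kind of loop, thousands of times larger and stiffer, is where the reported GPU and space-time-adaptive gains multiply.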

  5. Adaptive bill morphology for enhanced tool manipulation in New Caledonian crows

    PubMed Central

    Matsui, Hiroshi; Hunt, Gavin R.; Oberhofer, Katja; Ogihara, Naomichi; McGowan, Kevin J.; Mithraratne, Kumar; Yamasaki, Takeshi; Gray, Russell D.; Izawa, Ei-Ichi

    2016-01-01

    Early increased sophistication of human tools is thought to be underpinned by adaptive morphology for efficient tool manipulation. Such adaptive specialisation is unknown in nonhuman primates but may have evolved in the New Caledonian crow, which has sophisticated tool manufacture. The straightness of its bill, for example, may be adaptive for enhanced visually-directed use of tools. Here, we examine in detail the shape and internal structure of the New Caledonian crow’s bill using Principal Components Analysis and Computed Tomography within a comparative framework. We found that the bill has a combination of interrelated shape and structural features unique within Corvus, and possibly birds generally. The upper mandible is relatively deep and short with a straight cutting edge, and the lower mandible is strengthened and upturned. These novel combined attributes would be functional for (i) counteracting the unique loading patterns acting on the bill when manipulating tools, (ii) a strong precision grip to hold tools securely, and (iii) enhanced visually-guided tool use. Our findings indicate that the New Caledonian crow’s innovative bill has been adapted for tool manipulation to at least some degree. Early increased sophistication of tools may require the co-evolution of morphology that provides improved manipulatory skills. PMID:26955788

  6. Using Virtual Pets to Promote Physical Activity in Children: An Application of the Youth Physical Activity Promotion Model.

    PubMed

    Ahn, Sun Joo Grace; Johnsen, Kyle; Robertson, Tom; Moore, James; Brown, Scott; Marable, Amanda; Basu, Aryabrata

    2015-01-01

    A virtual pet was developed based on the framework of the youth physical activity promotion model and tested as a vehicle for promoting physical activity in children. Children in the treatment group interacted with the virtual pet for three days, setting physical activity goals and teaching tricks to the virtual pet when their goals were met. The virtual pet became more fit and learned more sophisticated tricks as the children achieved activity goals. Children in the control group interacted with a computer system presenting equivalent features but without the virtual pet. Physical activity and goal attainment were evaluated using activity monitors. Results indicated that children in the treatment group engaged in 1.09 more hours of daily physical activity (156% more) than did those in the control group. Physical activity self-efficacy and beliefs served as mediators driving this increase in activity. Children that interacted with the virtual pet also expressed higher intentions than children in the control group to continue physical activity in the future. Theoretical and practical potentials of using a virtual pet to systematically promote physical activity in children are discussed.

  7. NAAMES Photo Essay

    NASA Image and Video Library

    2017-12-08

    The bridge of the R/V Atlantis houses a sophisticated collection of computer controlled navigation systems, station-keeping controls and communications capabilities. Even in heavy seas, the ship can hold its position within a margin of just a few meters. --- The North Atlantic Aerosols and Marine Ecosystems Study (NAAMES) is a five year investigation to resolve key processes controlling ocean system function, their influences on atmospheric aerosols and clouds and their implications for climate. Michael Starobin joined the NAAMES field campaign on behalf of Earth Expeditions and NASA Goddard Space Flight Center’s Office of Communications. He presented stories about the important, multi-disciplinary research being conducted by the NAAMES team, with an eye towards future missions on the NASA drawing board. This is a NAAMES photo essay put together by Starobin, a collection of 49 photographs and captions. Photo and Caption Credit: Michael Starobin

  8. Save medical personnel's time by improved user interfaces.

    PubMed

    Kindler, H

    1997-01-01

    Common objectives in the industrial countries are the improvement of quality of care, clinical effectiveness, and cost control. Cost control, in particular, has been addressed through the introduction of case mix systems for reimbursement by social-security institutions. More data are required to enable quality improvement and increased clinical effectiveness, and for juridical reasons. At first glance, this documentation effort contradicts cost reduction. However, integrated services for resource management based on better documentation should help to reduce costs. The clerical effort of documentation should be decreased by providing a co-operative working environment for healthcare professionals applying sophisticated human-computer interface technology. Additional services, e.g., automatic report generation, increase the efficiency of healthcare personnel. Modelling the medical work flow forms an essential prerequisite for integrated resource management services and for co-operative user interfaces. A user interface aware of the work flow provides intelligent assistance by offering the appropriate tools at the right moment. Nowadays there is a trend toward client/server systems with relational databases or object-oriented databases as the repository. The work flows used for controlling purposes and to steer the user interfaces must be represented in the repository.

  9. Aircraft integrated design and analysis: A classroom experience

    NASA Technical Reports Server (NTRS)

    1988-01-01

    AAE 451 is the capstone course required of all senior undergraduates in the School of Aeronautics and Astronautics at Purdue University. During the past year the first steps of a long evolutionary process were taken to change the content and expectations of this course. These changes are the result of the availability of advanced computational capabilities and sophisticated electronic media availability at Purdue. This presentation will describe both the long range objectives and this year's experience using the High Speed Commercial Transport (HSCT) design, the AIAA Long Duration Aircraft design and a Remotely Piloted Vehicle (RPV) design proposal as project objectives. The central goal of these efforts was to provide a user-friendly, computer-software-based, environment to supplement traditional design course methodology. The Purdue University Computer Center (PUCC), the Engineering Computer Network (ECN), and stand-alone PC's were used for this development. This year's accomplishments centered primarily on aerodynamics software obtained from the NASA Langley Research Center and its integration into the classroom. Word processor capability for oral and written work and computer graphics were also blended into the course. A total of 10 HSCT designs were generated, ranging from twin-fuselage and forward-swept wing aircraft, to the more traditional delta and double-delta wing aircraft. Four Long Duration Aircraft designs were submitted, together with one RPV design tailored for photographic surveillance. Supporting these activities were three video satellite lectures beamed from NASA/Langley to Purdue. These lectures covered diverse areas such as an overview of HSCT design, supersonic-aircraft stability and control, and optimization of aircraft performance. Plans for next year's effort will be reviewed, including dedicated computer workstation utilization, remote satellite lectures, and university/industrial cooperative efforts.

  10. A study on acceptance of mobileschool at secondary schools in Malaysia: Urban vs rural

    NASA Astrophysics Data System (ADS)

    Hashim, Ahmad Sobri; Ahmad, Wan Fatimah Wan; Sarlan, Aliza

    2017-10-01

    Developing countries face a dilemma in which sophisticated technologies advance faster than the mindsets of the people expected to use them. In education, many novel approaches and technologies have been introduced, yet minimal effort has been made to apply them in practice. MobileSchool is a mobile learning (m-learning) management system developed for administrative, teaching and learning processes at secondary schools in Malaysia. This paper compares the acceptance of MobileSchool between urban and rural secondary schools in Malaysia. The research framework was designed based on the Technology Acceptance Model (TAM). The constructs of the framework include computer anxiety, self-efficacy, facilitating condition, technological complexity, perceived behavioral control, perceived ease of use, perceived usefulness, attitude and behavioral intention. A questionnaire was applied as the research instrument, involving 373 students from four secondary schools (two urban and two rural) in Perak. Inferential analyses using hypothesis testing and t-tests, and descriptive analyses using means and percentages, were used to analyze the data. Results showed no large differences (<20%) in any acceptance construct between urban and rural secondary schools except computer anxiety.
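
    The urban-versus-rural comparison reduces to independent two-sample t-tests per construct; a sketch with synthetic placeholder scores (not the study's data):

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(7)
      urban = rng.normal(loc=3.8, scale=0.6, size=190)   # e.g. perceived usefulness scores
      rural = rng.normal(loc=3.6, scale=0.6, size=183)

      t, p = stats.ttest_ind(urban, rural)               # independent two-sample t-test
      print(f"t = {t:.2f}, p = {p:.4f}")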

  11. Alert Response to Motion Onset in the Retina

    PubMed Central

    Chen, Eric Y.; Marre, Olivier; Fisher, Clark; Schwartz, Greg; Levy, Joshua; da Silveira, Rava Azeredo

    2013-01-01

    Previous studies have shown that motion onset is very effective at capturing attention and is more salient than smooth motion. Here, we find that this salience ranking is already present in the firing rate of retinal ganglion cells. By stimulating the retina with a bar that appears, stays still, and then starts moving, we demonstrate that a subset of salamander retinal ganglion cells, fast OFF cells, responds significantly more strongly to motion onset than to smooth motion. We refer to this phenomenon as an alert response to motion onset. We develop a computational model that predicts the time-varying firing rate of ganglion cells responding to the appearance, onset, and smooth motion of a bar. This model, termed the adaptive cascade model, consists of a ganglion cell that receives input from a layer of bipolar cells, represented by individual rectified subunits. Additionally, both the bipolar and ganglion cells have separate contrast gain control mechanisms. This model captured the responses to our different motion stimuli over a wide range of contrasts, speeds, and locations. The alert response to motion onset, together with its computational model, introduces a new mechanism of sophisticated motion processing that occurs early in the visual system. PMID:23283327
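
    A minimal sketch of the rectified-subunit idea in the adaptive cascade model, assuming a purely spatial toy stimulus; the temporal filtering and the two contrast-gain-control stages of the actual model are omitted, and all numbers are illustrative.

```python
# Hedged sketch: bipolar-cell-like subunits filter and rectify the stimulus,
# and a ganglion-cell stage pools them (no gain control in this toy version).
import numpy as np

def cascade_response(stimulus, n_subunits=8, threshold=0.0):
    """stimulus: array of shape (time, space); returns pooled firing rate."""
    t, x = stimulus.shape
    centers = np.linspace(0, x - 1, n_subunits).astype(int)
    rate = np.zeros(t)
    for c in centers:
        lo, hi = max(0, c - 2), min(x, c + 3)          # crude spatial filter
        drive = stimulus[:, lo:hi].mean(axis=1)
        rate += np.maximum(drive - threshold, 0.0)     # rectification
    return rate / n_subunits

# Bar that appears, stays still, then starts moving (motion onset at t=60)
stim = np.zeros((100, 50))
stim[20:, 10:13] = 1.0
for i, ti in enumerate(range(60, 100)):
    stim[ti] = np.roll(stim[ti], i)

r = cascade_response(stim)
print("around appearance:", r[18:23].round(3), "around onset:", r[58:63].round(3))
```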

  12. An expert system for benzole recovery plant

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ishiguro, H.; Matsumura, S.; Kawashima, A.

    1993-01-01

    In the By-Product Plant of NKK's Keihin Works, systematization efforts were made in 1988, including integration of the control rooms, introduction of computers, and installation of automatic analyzers. This, however, increased the burden on operators with a huge volume of data, and a delay in coping with an operational abnormality could increase the risk and extent of damage. There is, on the other hand, a pressing need to accommodate the sophisticated operations that result from the pursuit of high-productivity operation. To avoid these problems, a real-time operation system was developed in an attempt to improve safety, operating techniques, and productivity in the benzole recovery plant. An offline system based on manual entry of operating data for diagnosis of operation and abnormalities was developed in 1990, and an online system incorporating real-time operating data followed in 1991; it is now operating smoothly in commercial service. This report presents an outline of the benzole recovery operation-diagnosis control expert system.

  13. Recent developments in blast furnace process control within British Steel

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Warren, P.W.

    1995-12-01

    British Steel generally operates seven blast furnaces on four integrated works. All furnaces have been equipped with comprehensive instrumentation and data logging computers over the past eight years. The four Scunthorpe furnaces practice coal injection up to 170 kg/tHM (340 lb/THM), the remainder injecting oil at up to 100 kg/tHM (200 lb/THM). Distribution control is effected by Paul Wurth Bell-Less Tops on six of the seven furnaces, and Movable Throat Armour with bells on the remaining one. All have at least one sub-burden probe. The blast furnace operator has a vast quantity of data and signals to consider and evaluate when attempting to achieve the objective of providing a consistent supply of hot metal. Techniques have been, and are being, developed to assist the operator to interpret large numbers of signals. A simple operator guidance system has been developed to provide advice, based on current operating procedures and interpreted data. Further development will involve the use of a sophisticated expert system software shell.

  14. Speeding Up Ecological and Evolutionary Computations in R; Essentials of High Performance Computing for Biologists

    PubMed Central

    Visser, Marco D.; McMahon, Sean M.; Merow, Cory; Dixon, Philip M.; Record, Sydne; Jongejans, Eelke

    2015-01-01

    Computation has become a critical component of research in biology. A risk has emerged that computational and programming challenges may limit research scope, depth, and quality. We review various solutions to common computational efficiency problems in ecological and evolutionary research. Our review pulls together material that is currently scattered across many sources and emphasizes those techniques that are especially effective for typical ecological and environmental problems. We demonstrate how straightforward it can be to write efficient code and implement techniques such as profiling or parallel computing. We supply a newly developed R package (aprof) that helps to identify computational bottlenecks in R code and determine whether optimization can be effective. Our review is complemented by a practical set of examples and detailed Supporting Information material (S1–S3 Texts) that demonstrate large improvements in computational speed (ranging from 10.5 times to 14,000 times faster). By improving computational efficiency, biologists can feasibly solve more complex tasks, ask more ambitious questions, and include more sophisticated analyses in their research. PMID:25811842
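
    The paper's tooling is R-based (the aprof package); as a language-neutral illustration of the same profile-before-optimizing workflow, here is a sketch using Python's standard-library profiler. The function names and the deliberate bottleneck are invented for the example.

```python
# Hedged sketch: profile first, then optimize only where the time actually goes.
import cProfile
import io
import pstats

def slow_mean(xs):
    total = 0.0
    for x in xs:               # deliberate bottleneck: interpreted loop
        total += x
    return total / len(xs)

def analysis():
    data = list(range(1_000_000))
    return [slow_mean(data) for _ in range(5)]

pr = cProfile.Profile()
pr.enable()
analysis()
pr.disable()

s = io.StringIO()
pstats.Stats(pr, stream=s).sort_stats("cumulative").print_stats(5)
print(s.getvalue())            # shows where the time goes before optimizing
```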

  15. Manufacturing Technology Research Needs of the Gear Industry.

    DTIC Science & Technology

    1987-12-31

    Fragments indexed from this report: management shortcomings within the U.S. precision gear industry; European gear and machine tool companies; as manufacturing becomes more sophisticated, workers are running numerically controlled computer equipment requiring an understanding of math; counter the inefficiencies of the job shop environment by managing the gear business as a backward integration of the assembly line; develop and maintain …

  16. Introduction to the concepts of TELEDEMO and TELEDIMS

    NASA Technical Reports Server (NTRS)

    Rice, R. F.; Schlutsmeyer, A. P.

    1982-01-01

    An introduction to the system concepts TELEDEMO and TELEDIMS is provided. TELEDEMO is derived primarily from computer graphics and, via incorporation of sophisticated image data compression, enables effective low-cost teleconferencing at data rates as low as 1K bit/second using dial-up phone lines. Combining TELEDEMO's powerful capabilities for the development of presentation material with microprocessor-based Information Management Systems (IMS) yields a truly all-electronic IMS called TELEDIMS.

  17. Decision-Tree Program

    NASA Technical Reports Server (NTRS)

    Buntine, Wray

    1994-01-01

    IND computer program introduces Bayesian and Markov/maximum-likelihood (MML) methods and more-sophisticated methods of searching in growing trees. Produces more-accurate class-probability estimates important in applications like diagnosis. Provides range of features and styles with convenience for casual user, fine-tuning for advanced user or for those interested in research. Consists of four basic kinds of routines: data-manipulation, tree-generation, tree-testing, and tree-display. Written in C language.
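
    IND itself is a C program; as an illustration of the class-probability estimates the abstract highlights for diagnosis, here is a sketch using scikit-learn's decision tree on a standard dataset. The smoothing choices are illustrative and are not IND's Bayesian averaging.

```python
# Hedged sketch: a depth-limited tree whose leaves give class-probability
# estimates, the quantity emphasized for diagnostic applications.
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = load_breast_cancer(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# min_samples_leaf keeps leaf counts large enough for stable probabilities
tree = DecisionTreeClassifier(max_depth=4, min_samples_leaf=20, random_state=0)
tree.fit(X_tr, y_tr)
print("class-probability estimates for 3 cases:")
print(tree.predict_proba(X_te)[:3].round(3))
```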

  18. Forensic Information Warfare Requirement Study

    DTIC Science & Technology

    2002-06-01

    Fragments indexed from this report: changes in technologies, now and in the near future, that will adversely impact current technologies and require additional sophistication …; WetStone Technologies, Inc. moderated a panel at the Economic Crime Investigation Institute's Ninth Annual Conference (Fraud Management in the Twenty-First …); …second, to ascertain the legal impact of these tools; their report was delivered to AFRL and provides an in-depth look into these areas.

  19. Computational Modeling of Cultural Dimensions in Adversary Organizations

    DTIC Science & Technology

    2010-01-01

    Fragments indexed from this report: a citation ("… Nodes", in the Proceedings of the 9th Conference on Uncertainty in Artificial Intelligence, 1993) and a reference to Pearl, J., Probabilistic Reasoning in …; a note that, in contrast to large artificial-life simulations, models with only a few agents typically employ quite sophisticated cognitive agents capable of …; decisions as to how to allocate scarce ISR assets (two Unmanned Air Systems, UAS) among the two Red activities while at the same …

  20. CD-ROM And Knowledge Integration

    NASA Astrophysics Data System (ADS)

    Rann, Leonard S.

    1988-06-01

    As the title of this paper suggests, it is about CD-ROM technology and the structuring of massive databases. Even more, it is about the impact CD-ROM has had on the publication of massive amounts of information, and the unique qualities of the medium that allow for the most sophisticated computer retrieval techniques that have ever been used. I am not drawing on experience as a pedagogue in the educational field, but rather as a software and database designer who has worked with CD-ROM since its inception. I will be giving examples from my company's current applications, as well as discussing some of the challenges that face information publishers in the future. In particular, I have a belief about what the most valuable outlet created using CD-ROM will be: the CD-ROM is particularly suited for the mass delivery of information systems and databases that either require or utilize a large amount of computational preprocessing to allow a real-time or interactive response to be achieved. Until the advent of CD-ROM technology this level of sophistication in publication was virtually impossible. I will explain this further later in this paper. First, I will discuss the salient features of CD-ROM that make it unique in the world of data storage for electronic publishing.

  1. BOOK REVIEW: Mathematica for Theoretical Physics: Electrodynamics, Quantum Mechanics, General Relativity and Fractals

    NASA Astrophysics Data System (ADS)

    Heusler, Stefan

    2006-12-01

    The main focus of the second, enlarged edition of the book Mathematica for Theoretical Physics is on computational examples using the computer program Mathematica in various areas of physics. It is a notebook rather than a textbook. Indeed, the book is just a printout of the Mathematica notebooks included on the CD. The second edition is divided into two volumes, the first covering classical mechanics and nonlinear dynamics, the second dealing with examples in electrodynamics, quantum mechanics, general relativity and fractal geometry. The second volume is not suited for newcomers because basic and simple physical ideas which lead to complex formulas are not explained in detail. Instead, the computer technology makes it possible to write down and manipulate formulas of practically any length. For researchers with experience in computing, the book contains a lot of interesting and non-trivial examples. Most of the examples discussed are standard textbook problems, but the power of Mathematica opens the path to more sophisticated solutions. For example, the exact solution for the perihelion shift of Mercury within general relativity is worked out in detail using elliptic functions. The virial equation of state for molecules interacting with Lennard-Jones-like potentials is discussed, including both classical and quantum corrections to the second virial coefficient. Interestingly, closed solutions become available using sophisticated computing methods within Mathematica. In my opinion, the textbook should not show formulas in full detail when they cover three or more pages; such technical data should just be contained on the CD. Instead, the textbook should focus on a more detailed explanation of the physical concepts behind the technicalities. The discussion of the virial equation would benefit greatly from replacing 15 pages of Mathematica output with 15 pages of further explanation and motivation. In this combination, the power of computing merged with physical intuition would be of benefit even for newcomers. In summary, this book shows in a convincing manner how classical problems in physics can be attacked with modern computing technology. The second volume is interesting for experienced users of Mathematica. For students, the textbook can be very useful in combination with a seminar.
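
    For reference, the leading-order general-relativistic result that such a Mathematica computation reproduces is the standard perihelion advance per orbit, quoted here as textbook background, not from the book itself:

```latex
% Leading-order perihelion advance per orbit (standard GR result):
\Delta\varphi \;=\; \frac{6\pi G M}{c^{2}\, a \,(1 - e^{2})}
% with a the semi-major axis and e the eccentricity; for Mercury this yields
% the famous 43 arcseconds per century.
```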

  2. Optimized Materials From First Principles Simulations: Are We There Yet?

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Galli, G; Gygi, F

    2005-07-26

    In the past thirty years, the use of scientific computing has become pervasive in all disciplines: collection and interpretation of most experimental data is carried out using computers, and physical models in computable form, with various degrees of complexity and sophistication, are utilized in all fields of science. However, full prediction of physical and chemical phenomena based on the basic laws of Nature, using computer simulations, is a revolution still in the making, and it involves some formidable theoretical and computational challenges. We illustrate the progress and successes obtained in recent years in predicting fundamental properties of materials in condensed phases and at the nanoscale, using ab initio quantum simulations. We also discuss open issues related to the validation of the approximate, first principles theories used in large scale simulations, and the resulting complex interplay between computation and experiment. Finally, we describe some applications, with a focus on nanostructures and liquids, both at ambient and under extreme conditions.

  3. Practical aspects of modeling aircraft dynamics from flight data

    NASA Technical Reports Server (NTRS)

    Iliff, K. W.; Maine, R. E.

    1984-01-01

    The purpose of parameter estimation, a subset of system identification, is to estimate the coefficients (such as stability and control derivatives) of the aircraft differential equations of motion from sampled measured dynamic responses. In the past, the primary reason for estimating stability and control derivatives from flight tests was to make comparisons with wind tunnel estimates. As aircraft became more complex, and as flight envelopes were expanded to include flight regimes that were not well understood, new requirements for the derivative estimates evolved. For many years, the flight determined derivatives were used in simulations to aid in flight planning and in pilot training. The simulations were particularly important in research flight test programs in which an envelope expansion into new flight regimes was required. Parameter estimation techniques for estimating stability and control derivatives from flight data became more sophisticated to support the flight test programs. As knowledge of these new flight regimes increased, more complex aircraft were flown. Much of this increased complexity was in sophisticated flight control systems. The design and refinement of the control system required higher fidelity simulations than were previously required.
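
    A minimal sketch of the simplest (equation-error) form of such parameter estimation, assuming a one-degree-of-freedom pitch model; real flight-data practice uses more elaborate output-error and maximum-likelihood methods, and all values here are synthetic.

```python
# Hedged sketch: recover stability and control derivatives by least squares
# from a toy pitch model  q_dot = M_alpha*alpha + M_q*q + M_de*delta_e.
import numpy as np

rng = np.random.default_rng(1)
n = 500
alpha = rng.normal(0, 0.05, n)     # angle of attack [rad]
q     = rng.normal(0, 0.10, n)     # pitch rate [rad/s]
de    = rng.normal(0, 0.08, n)     # elevator deflection [rad]

true_coeffs = np.array([-4.0, -1.2, -6.5])        # M_alpha, M_q, M_de
X = np.column_stack([alpha, q, de])
q_dot = X @ true_coeffs + rng.normal(0, 0.05, n)  # "measured" with noise

est, *_ = np.linalg.lstsq(X, q_dot, rcond=None)   # ordinary least squares
print("estimated M_alpha, M_q, M_de:", est.round(3))
```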

  4. A Survey of Collectives

    NASA Technical Reports Server (NTRS)

    Tumer, Kagan; Wolpert, David

    2004-01-01

    Due to the increasing sophistication and miniaturization of computational components, complex, distributed systems of interacting agents are becoming ubiquitous. Such systems, where each agent aims to optimize its own performance, but where there is a well-defined set of system-level performance criteria, are called collectives. The fundamental problem in analyzing/designing such systems is in determining how the combined actions of self-interested agents lead to 'coordinated' behavior on a large scale. Examples of artificial systems which exhibit such behavior include packet routing across a data network, control of an array of communication satellites, coordination of multiple deployables, and dynamic job scheduling across a distributed computer grid. Examples of natural systems include ecosystems, economies, and the organelles within a living cell. No current scientific discipline provides a thorough understanding of the relation between the structure of collectives and how well they meet their overall performance criteria. Although still very young, research on collectives has resulted in successes both in understanding and designing such systems. It is expected that as it matures and draws upon other disciplines related to collectives, this field will greatly expand the range of computationally addressable tasks. Moreover, in addition to drawing on them, such a fully developed field of collective intelligence may provide insight into already established scientific fields, such as mechanism design, economics, game theory, and population biology. This chapter provides a survey of the emerging science of collectives.

  5. A soft body as a reservoir: case studies in a dynamic model of octopus-inspired soft robotic arm.

    PubMed

    Nakajima, Kohei; Hauser, Helmut; Kang, Rongjie; Guglielmino, Emanuele; Caldwell, Darwin G; Pfeifer, Rolf

    2013-01-01

    The behaviors of animals or embodied agents are characterized by the dynamic coupling between the brain, the body, and the environment. This implies that control, which is conventionally thought to be handled by the brain or a controller, can partially be outsourced to the physical body and the interaction with the environment. This idea has been demonstrated in a number of recently constructed robots, in particular from the field of "soft robotics". Soft robots are made of soft materials that introduce high dimensionality, non-linearity, and elasticity, which often make the robots difficult to control. Biological systems such as the octopus master their complex bodies in highly sophisticated ways by capitalizing on their body dynamics. We demonstrate that the structure of the octopus arm can be exploited not only for generating behavior but also, in a sense, as a computational resource. Using a soft robotic arm inspired by the octopus, we show in a number of experiments how control is partially incorporated into the physical arm's dynamics and how the arm's dynamics can be exploited to approximate non-linear dynamical systems and embed non-linear limit cycles. Future application scenarios, as well as the implications of the results for octopus biology, are also discussed.
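
    A minimal echo-state-network sketch of the "body as reservoir" idea, assuming a random recurrent network stands in for the arm's physical dynamics and only a linear readout is trained; the sizes and the target map are illustrative, and the paper of course uses the physical arm itself rather than a simulated reservoir.

```python
# Hedged sketch: a fixed random "reservoir" is driven by an input, and only
# the linear readout is fitted, the defining trick of reservoir computing.
import numpy as np

rng = np.random.default_rng(2)
n_res, T = 100, 2000
W_in = rng.uniform(-0.5, 0.5, (n_res, 1))
W = rng.normal(0, 1, (n_res, n_res))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))   # spectral radius < 1

u = np.sin(np.arange(T) * 0.05)[:, None]          # input drive
target = u[:, 0] ** 3                             # nonlinear map to emulate

x = np.zeros(n_res)
states = np.zeros((T, n_res))
for t in range(T):
    x = np.tanh(W @ x + W_in @ u[t])              # reservoir (the "body")
    states[t] = x

# Discard a washout period, then train only the readout by least squares
W_out, *_ = np.linalg.lstsq(states[200:], target[200:], rcond=None)
print("readout MSE:", np.mean((states[200:] @ W_out - target[200:]) ** 2))
```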

  6. A soft body as a reservoir: case studies in a dynamic model of octopus-inspired soft robotic arm

    PubMed Central

    Nakajima, Kohei; Hauser, Helmut; Kang, Rongjie; Guglielmino, Emanuele; Caldwell, Darwin G.; Pfeifer, Rolf

    2013-01-01

    The behaviors of animals or embodied agents are characterized by the dynamic coupling between the brain, the body, and the environment. This implies that control, which is conventionally thought to be handled by the brain or a controller, can partially be outsourced to the physical body and the interaction with the environment. This idea has been demonstrated in a number of recently constructed robots, in particular from the field of “soft robotics”. Soft robots are made of soft materials that introduce high dimensionality, non-linearity, and elasticity, which often make the robots difficult to control. Biological systems such as the octopus master their complex bodies in highly sophisticated ways by capitalizing on their body dynamics. We demonstrate that the structure of the octopus arm can be exploited not only for generating behavior but also, in a sense, as a computational resource. Using a soft robotic arm inspired by the octopus, we show in a number of experiments how control is partially incorporated into the physical arm's dynamics and how the arm's dynamics can be exploited to approximate non-linear dynamical systems and embed non-linear limit cycles. Future application scenarios, as well as the implications of the results for octopus biology, are also discussed. PMID:23847526

  7. Technology for the product and process data base

    NASA Technical Reports Server (NTRS)

    Barnes, R. D.

    1984-01-01

    The computerized product and process data base is increasingly recognized to be the cornerstone component of an overall system aimed at the integrated automation of the industrial processes of a given company or enterprise. The technology needed to support these more effective computer integrated design and manufacturing methods, especially the concept of 3-D computer-sensible product definitions rather than engineering drawings, is not fully available and rationalized. Progress is being made, however, in bridging this technology gap with concentration on the modeling of sophisticated information and data structures, high-performance interactive user interfaces and comprehensive tools for managing the resulting computerized product definition and process data base.

  8. Lung Cancer: Posttreatment Imaging: Radiation Therapy and Imaging Findings.

    PubMed

    Benveniste, Marcelo F; Welsh, James; Viswanathan, Chitra; Shroff, Girish S; Betancourt Cuellar, Sonia L; Carter, Brett W; Marom, Edith M

    2018-05-01

    In this review, we discuss the different radiation delivery techniques available to treat non-small cell lung cancer, typical radiologic manifestations of conventional radiotherapy, and different patterns of lung injury and temporal evolution of the newer radiotherapy techniques. More sophisticated techniques include intensity-modulated radiotherapy, stereotactic body radiotherapy, proton therapy, and respiration-correlated computed tomography or 4-dimensional computed tomography for radiotherapy planning. Knowledge of the radiation treatment plan and technique, the completion date of radiotherapy, and the temporal evolution of radiation-induced lung injury is important to identify expected manifestations of radiation-induced lung injury and differentiate them from tumor recurrence or infection. Published by Elsevier Inc.

  9. A nonperturbative approximation for the moderate Reynolds number Navier–Stokes equations

    PubMed Central

    Roper, Marcus; Brenner, Michael P.

    2009-01-01

    The nonlinearity of the Navier–Stokes equations makes predicting the flow of fluid around rapidly moving small bodies highly resistant to all approaches save careful experiments or brute force computation. Here, we show how a linearization of the Navier–Stokes equations captures the drag-determining features of the flow and allows simplified or analytical computation of the drag on bodies up to Reynolds number of order 100. We illustrate the utility of this linearization in 2 practical problems that normally can only be tackled with sophisticated numerical methods: understanding flow separation in the flow around a bluff body and finding drag-minimizing shapes. PMID:19211800
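
    For a concrete, textbook instance of what such a linearization yields, the classical Oseen approximation and the resulting drag law for a sphere are shown below; this is quoted as background and is not the authors' more general calculation.

```latex
% Oseen linearization: advection linearized about the free stream U
\rho\,(\mathbf{U}\!\cdot\!\nabla)\,\mathbf{u} \;=\; -\nabla p + \mu \nabla^{2}\mathbf{u}
% Resulting drag on a sphere of radius a: Stokes drag plus the first
% inertial correction (Re based on the diameter 2a):
F_D \;=\; 6\pi \mu a U \left(1 + \tfrac{3}{16}\,\mathrm{Re}\right),
\qquad \mathrm{Re} \;=\; \frac{\rho\, U (2a)}{\mu}
```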

  10. A nonperturbative approximation for the moderate Reynolds number Navier-Stokes equations.

    PubMed

    Roper, Marcus; Brenner, Michael P

    2009-03-03

    The nonlinearity of the Navier-Stokes equations makes predicting the flow of fluid around rapidly moving small bodies highly resistant to all approaches save careful experiments or brute force computation. Here, we show how a linearization of the Navier-Stokes equations captures the drag-determining features of the flow and allows simplified or analytical computation of the drag on bodies up to Reynolds number of order 100. We illustrate the utility of this linearization in 2 practical problems that normally can only be tackled with sophisticated numerical methods: understanding flow separation in the flow around a bluff body and finding drag-minimizing shapes.

  11. An evaluation of computer assisted clinical classification algorithms.

    PubMed

    Chute, C G; Yang, Y; Buntrock, J

    1994-01-01

    The Mayo Clinic has a long tradition of indexing patient records in high resolution and volume. Several algorithms have been developed which promise to help human coders in the classification process. We evaluate variations on code browsers and free text indexing systems with respect to their speed and error rates in our production environment. The more sophisticated indexing systems save measurable time in the coding process, but suffer from incompleteness which requires a back-up system or human verification. Expert Network does the best job of rank ordering clinical text, potentially enabling the creation of thresholds for the pass through of computer coded data without human review.

  12. The Matter Simulation (R)evolution

    PubMed Central

    2018-01-01

    To date, the program for the development of methods and models for atomistic and continuum simulation directed toward chemicals and materials has reached an incredible degree of sophistication and maturity. Currently, one can witness an increasingly rapid emergence of advances in computing, artificial intelligence, and robotics. This drives us to consider the future of computer simulation of matter from the molecular to the human length and time scales in a radical way that deliberately dares to go beyond the foreseeable next steps in any given discipline. This perspective article presents a view on this future development that we believe is likely to become a reality during our lifetime. PMID:29532014

  13. Instructional computing in space physics moves ahead

    NASA Astrophysics Data System (ADS)

    Russell, C. T.; Omidi, N.

    As the number of spacecraft stationed in the Earth's magnetosphere grows exponentially and society becomes more technologically sophisticated and dependent on these space-based resources, both the importance of space physics and the need to train people in this field will increase. Space physics is a very difficult subject for students to master. Both mechanical and electromagnetic forces are important. The treatment of problems can be very mathematical, and the scale sizes of phenomena are usually such that laboratory studies become impossible, and experimentation, when possible at all, must be carried out in deep space. Fortunately, computers have evolved to the point that they are able to greatly facilitate instruction in space physics.

  14. Learning of embodied interaction dynamics with recurrent neural networks: some exploratory experiments.

    PubMed

    Oubbati, Mohamed; Kord, Bahram; Koprinkova-Hristova, Petia; Palm, Günther

    2014-04-01

    A recent trend in artificial intelligence suggests that intelligence must be seen as a result of the interaction between brains, bodies and environments. This view implies that designing sophisticated behaviour requires a primary focus on how agents are functionally coupled to their environments. From this perspective, we present early results from the application of reservoir computing as an efficient tool for understanding how behaviour emerges from interaction. Specifically, we present reservoir computing models, inspired by imitation learning designs, that extract the essential components of behaviour resulting from agent-environment interaction dynamics. Experimental results using a mobile robot are reported to validate the learning architectures.

  15. Learning of embodied interaction dynamics with recurrent neural networks: some exploratory experiments

    NASA Astrophysics Data System (ADS)

    Oubbati, Mohamed; Kord, Bahram; Koprinkova-Hristova, Petia; Palm, Günther

    2014-04-01

    A recent trend in artificial intelligence suggests that intelligence must be seen as a result of the interaction between brains, bodies and environments. This view implies that designing sophisticated behaviour requires a primary focus on how agents are functionally coupled to their environments. From this perspective, we present early results from the application of reservoir computing as an efficient tool for understanding how behaviour emerges from interaction. Specifically, we present reservoir computing models, inspired by imitation learning designs, that extract the essential components of behaviour resulting from agent-environment interaction dynamics. Experimental results using a mobile robot are reported to validate the learning architectures.

  16. Preliminary performance analysis of an interplanetary navigation system using asteroid based beacons

    NASA Technical Reports Server (NTRS)

    Jee, J. Rodney; Khatib, Ahmad R.; Muellerschoen, Ronald J.; Williams, Bobby G.; Vincent, Mark A.

    1988-01-01

    A futuristic interplanetary navigation system using transmitters placed on selected asteroids is introduced. This network of space beacons is seen as a needed alternative to the overly burdened Deep Space Network. Covariance analyses on the potential performance of these space beacons located on a candidate constellation of eight real asteroids are initiated. Simplified analytic calculations are performed to determine limiting accuracies attainable with the network for geometric positioning. More sophisticated computer simulations are also performed to determine potential accuracies using long arcs of range and Doppler data from the beacons. The results from these computations show promise for this navigation system.
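
    A minimal sketch of the kind of geometric covariance analysis described above, assuming range-only measurements to beacons along known line-of-sight directions; the beacon geometry and noise level are invented for illustration. The position covariance follows from the linearized least-squares relation P = (HᵀR⁻¹H)⁻¹.

```python
# Hedged sketch: position covariance from range measurements to four beacons.
import numpy as np

beacons = np.array([[ 1.0,  0.2,  0.1],
                    [-0.5,  0.9,  0.3],
                    [ 0.1, -0.8,  0.9],
                    [-0.7, -0.4, -0.6]])   # illustrative beacon directions
x0 = np.zeros(3)                            # linearization point (spacecraft)

# Jacobian of range w.r.t. position: unit line-of-sight vectors
H = (beacons - x0) / np.linalg.norm(beacons - x0, axis=1, keepdims=True)
sigma_range = 10.0                          # assumed 10 m range noise (1-sigma)
R_inv = np.eye(len(beacons)) / sigma_range**2

P = np.linalg.inv(H.T @ R_inv @ H)          # linearized position covariance
print("1-sigma position errors [m]:", np.sqrt(np.diag(P)).round(2))
```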

  17. Nonadditivity of van der Waals forces on liquid surfaces

    NASA Astrophysics Data System (ADS)

    Venkataram, Prashanth S.; Whitton, Jeremy D.; Rodriguez, Alejandro W.

    2016-09-01

    We present an approach for modeling nanoscale wetting and dewetting of textured solid surfaces that exploits recently developed, sophisticated techniques for computing exact long-range dispersive van der Waals (vdW) or (more generally) Casimir forces in arbitrary geometries. We apply these techniques to solve the variational formulation of the Young-Laplace equation and predict the equilibrium shapes of liquid-vacuum interfaces near solid gratings. We show that commonly employed methods of computing vdW interactions based on additive Hamaker or Derjaguin approximations, which neglect important electromagnetic boundary effects, can result in large discrepancies in the shapes and behaviors of liquid surfaces compared to exact methods.

  18. Computing the binding affinity of a ligand buried deep inside a protein with the hybrid steered molecular dynamics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Villarreal, Oscar D.; Yu, Lili

    Computing the ligand-protein binding affinity (or the Gibbs free energy) with chemical accuracy has long been a challenge for which many methods/approaches have been developed and refined with various successful applications. False positives and, even more harmful, false negatives have been and still are a common occurrence in practical applications. Inevitable in all approaches are the errors in the force field parameters we obtain from quantum mechanical computation and/or empirical fittings for the intra- and inter-molecular interactions. These errors propagate to the final results of the computed binding affinities even if we were able to perfectly implement the statistical mechanics of all the processes relevant to a given problem. And they are actually amplified to various degrees even in the mature, sophisticated computational approaches. In particular, the free energy perturbation (alchemical) approaches amplify the errors in the force field parameters because they rely on extracting the small differences between similarly large numbers. In this paper, we develop a hybrid steered molecular dynamics (hSMD) approach to the difficult binding problems of a ligand buried deep inside a protein. Sampling the transition along a physical (not alchemical) dissociation path of opening up the binding cavity, pulling out the ligand, and closing back the cavity, we can avoid the problem of error amplification by not relying on small differences between similar numbers. We tested this new form of hSMD on retinol inside cellular retinol-binding protein 1 and three cases of a ligand (a benzylacetate, a 2-nitrothiophene, and a benzene) inside a T4 lysozyme L99A/M102Q(H) double mutant. In all cases, we obtained binding free energies in close agreement with the experimentally measured values. This indicates that the force field parameters we employed are accurate and that hSMD (a brute force, unsophisticated approach) is free from the problem of error amplification suffered by many sophisticated approaches in the literature.

  19. Computer-aided light sheet flow visualization using photogrammetry

    NASA Technical Reports Server (NTRS)

    Stacy, Kathryn; Severance, Kurt; Childers, Brooks A.

    1994-01-01

    A computer-aided flow visualization process has been developed to analyze video images acquired from rotating and translating light sheet visualization systems. The computer process integrates a mathematical model for image reconstruction, advanced computer graphics concepts, and digital image processing to provide a quantitative and a visual analysis capability. The image reconstruction model, based on photogrammetry, uses knowledge of the camera and light sheet locations and orientations to project two-dimensional light sheet video images into three-dimensional space. A sophisticated computer visualization package, commonly used to analyze computational fluid dynamics (CFD) results, was chosen to interactively display the reconstructed light sheet images with the numerical surface geometry for the model or aircraft under study. The photogrammetric reconstruction technique and the image processing and computer graphics techniques and equipment are described. Results of the computer-aided process applied to both a wind tunnel translating light sheet experiment and an in-flight rotating light sheet experiment are presented. The capability to compare reconstructed experimental light sheet images with CFD solutions in the same graphics environment is also demonstrated.
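
    A minimal sketch of the core reconstruction step, assuming a pinhole camera model: each image pixel defines a ray from the camera, and intersecting that ray with the known light-sheet plane yields the 3-D point. All geometry values below are illustrative.

```python
# Hedged sketch: back-project a 2-D pixel through a pinhole camera and
# intersect the ray with the light-sheet plane to recover a 3-D point.
import numpy as np

def pixel_to_3d(pixel, cam_pos, cam_rot, focal, plane_point, plane_normal):
    """pixel: (u, v) in image coords; cam_rot: 3x3 camera-to-world rotation."""
    ray_cam = np.array([pixel[0], pixel[1], focal])       # ray in camera frame
    ray_world = cam_rot @ (ray_cam / np.linalg.norm(ray_cam))
    # Ray/plane intersection: cam_pos + t*ray lies on the light-sheet plane
    t = np.dot(plane_point - cam_pos, plane_normal) / np.dot(ray_world, plane_normal)
    return cam_pos + t * ray_world

cam_pos = np.array([0.0, -2.0, 0.5])
cam_rot = np.eye(3)                          # camera looking along +z (toy setup)
sheet_point = np.array([0.0, 0.0, 3.0])      # a point on the light-sheet plane
sheet_normal = np.array([0.0, 0.0, 1.0])

print(pixel_to_3d((0.01, -0.02), cam_pos, cam_rot, focal=1.0,
                  plane_point=sheet_point, plane_normal=sheet_normal))
```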

  20. Computer-Aided Light Sheet Flow Visualization

    NASA Technical Reports Server (NTRS)

    Stacy, Kathryn; Severance, Kurt; Childers, Brooks A.

    1993-01-01

    A computer-aided flow visualization process has been developed to analyze video images acquired from rotating and translating light sheet visualization systems. The computer process integrates a mathematical model for image reconstruction, advanced computer graphics concepts, and digital image processing to provide a quantitative and visual analysis capability. The image reconstruction model, based on photogrammetry, uses knowledge of the camera and light sheet locations and orientations to project two-dimensional light sheet video images into three-dimensional space. A sophisticated computer visualization package, commonly used to analyze computational fluid dynamics (CFD) data sets, was chosen to interactively display the reconstructed light sheet images, along with the numerical surface geometry for the model or aircraft under study. A description is provided of the photogrammetric reconstruction technique, and the image processing and computer graphics techniques and equipment. Results of the computer aided process applied to both a wind tunnel translating light sheet experiment and an in-flight rotating light sheet experiment are presented. The capability to compare reconstructed experimental light sheet images and CFD solutions in the same graphics environment is also demonstrated.

  1. A Computing Method for Sound Propagation Through a Nonuniform Jet Stream

    NASA Technical Reports Server (NTRS)

    Padula, S. L.; Liu, C. H.

    1974-01-01

    Understanding the principles of jet noise propagation is an essential ingredient of systematic noise reduction research. High speed computer methods offer a unique potential for dealing with complex real life physical systems whereas analytical solutions are restricted to sophisticated idealized models. The classical formulation of sound propagation through a jet flow was found to be inadequate for computer solutions and a more suitable approach was needed. Previous investigations selected the phase and amplitude of the acoustic pressure as dependent variables requiring the solution of a system of nonlinear algebraic equations. The nonlinearities complicated both the analysis and the computation. A reformulation of the convective wave equation in terms of a new set of dependent variables is developed with a special emphasis on its suitability for numerical solutions on fast computers. The technique is very attractive because the resulting equations are linear in nonwaving variables. The computer solution to such a linear system of algebraic equations may be obtained by well-defined and direct means which are conservative of computer time and storage space. Typical examples are illustrated and computational results are compared with available numerical and experimental data.

  2. Computer-aided light sheet flow visualization

    NASA Technical Reports Server (NTRS)

    Stacy, Kathryn; Severance, Kurt; Childers, Brooks A.

    1993-01-01

    A computer-aided flow visualization process has been developed to analyze video images acquired from rotating and translating light sheet visualization systems. The computer process integrates a mathematical model for image reconstruction, advanced computer graphics concepts, and digital image processing to provide a quantitative and visual analysis capability. The image reconstruction model, based on photogrammetry, uses knowledge of the camera and light sheet locations and orientations to project two-dimensional light sheet video images into three-dimensional space. A sophisticated computer visualization package, commonly used to analyze computational fluid dynamics (CFD) data sets, was chosen to interactively display the reconstructed light sheet images, along with the numerical surface geometry for the model or aircraft under study. A description is provided of the photogrammetric reconstruction technique, and the image processing and computer graphics techniques and equipment. Results of the computer aided process applied to both a wind tunnel translating light sheet experiment and an in-flight rotating light sheet experiment are presented. The capability to compare reconstructed experimental light sheet images and CFD solutions in the same graphics environment is also demonstrated.

  3. Advances in greenhouse automation and controlled environment agriculture: A transition to plant factories and urban farming

    USDA-ARS?s Scientific Manuscript database

    Greenhouse cultivation has evolved from simple covered rows of open-field crops to highly sophisticated controlled environment agriculture (CEA) facilities that project the image of plant factories for urban farming. The advances and improvements in CEA have promoted the scientific solutions for ...

  4. Microcomputers and Stimulus Control: From the Laboratory to the Classroom.

    ERIC Educational Resources Information Center

    LeBlanc, Judith M.; And Others

    1985-01-01

    The need for developing a technology of teaching that equals current sophistication of microcomputer technology is addressed. The importance of principles of learning and behavior analysis is emphasized. Potential roles of stimulus control and precise error analysis in educational program development and in prescription of specific learning…

  5. Seminar on Understanding Digital Control and Analysis in Vibration Test Systems, part 2

    NASA Technical Reports Server (NTRS)

    1975-01-01

    A number of techniques for dealing with important technical aspects of the random vibration control problem are described. These include the generation of pseudo-random and true random noise, the control spectrum estimation problem, the accuracy/speed tradeoff, and control correction strategies. System hardware, the operator-system interface, safety features, and operational capabilities of sophisticated digital random vibration control systems are also discussed.

  6. Assessment methodology for computer-based instructional simulations.

    PubMed

    Koenig, Alan; Iseli, Markus; Wainess, Richard; Lee, John J

    2013-10-01

    Computer-based instructional simulations are becoming more and more ubiquitous, particularly in military and medical domains. As the technology that drives these simulations grows ever more sophisticated, the underlying pedagogical models for how instruction, assessment, and feedback are implemented within these systems must evolve accordingly. In this article, we review some of the existing educational approaches to medical simulations, and present pedagogical methodologies that have been used in the design and development of games and simulations at the University of California, Los Angeles, Center for Research on Evaluation, Standards, and Student Testing. In particular, we present a methodology for how automated assessments of computer-based simulations can be implemented using ontologies and Bayesian networks, and discuss their advantages and design considerations for pedagogical use. Reprint & Copyright © 2013 Association of Military Surgeons of the U.S.
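
    A minimal sketch of the Bayesian-network idea behind such automated assessment: a hidden skill node is updated from observed task outcomes. The structure and probabilities below are invented for illustration and are not CRESST's model.

```python
# Hedged sketch: infer a hidden skill from simulation task outcomes via Bayes.
P_CORRECT_GIVEN_SKILL = 0.9      # performs the task correctly if skilled
P_CORRECT_GIVEN_NOSKILL = 0.3    # guesses or partially succeeds otherwise

def update(p_skill, correct):
    """Posterior P(skill | observed outcome) after one simulation task."""
    like_s = P_CORRECT_GIVEN_SKILL if correct else 1 - P_CORRECT_GIVEN_SKILL
    like_n = P_CORRECT_GIVEN_NOSKILL if correct else 1 - P_CORRECT_GIVEN_NOSKILL
    evidence = like_s * p_skill + like_n * (1 - p_skill)
    return like_s * p_skill / evidence

p = 0.5                          # prior: learner has mastered the skill
for outcome in [True, True, False, True]:   # observed task outcomes
    p = update(p, outcome)
    print(f"P(skill) = {p:.3f}")
```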

  7. Toward Scalable Trustworthy Computing Using the Human-Physiology-Immunity Metaphor

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hively, Lee M; Sheldon, Frederick T

    The cybersecurity landscape consists of an ad hoc patchwork of solutions. Optimal cybersecurity is difficult for various reasons: complexity, immense data and processing requirements, resource-agnostic cloud computing, practical time-space-energy constraints, inherent flaws in 'Maginot Line' defenses, and the growing number and sophistication of cyberattacks. This article defines the high-priority problems and examines the potential solution space. In that space, achieving scalable trustworthy computing and communications is possible through real-time knowledge-based decisions about cyber trust. This vision is based on the human-physiology-immunity metaphor and the human brain's ability to extract knowledge from data and information. The article outlines future steps toward scalable trustworthy systems requiring a long-term commitment to solve the well-known challenges.

  8. The Eukaryotic Pathogen Databases: a functional genomic resource integrating data from human and veterinary parasites.

    PubMed

    Harb, Omar S; Roos, David S

    2015-01-01

    Over the past 20 years, advances in high-throughput biological techniques and the availability of computational resources including fast Internet access have resulted in an explosion of large genome-scale data sets "big data." While such data are readily available for download and personal use and analysis from a variety of repositories, often such analysis requires access to seldom-available computational skills. As a result a number of databases have emerged to provide scientists with online tools enabling the interrogation of data without the need for sophisticated computational skills beyond basic knowledge of Internet browser utility. This chapter focuses on the Eukaryotic Pathogen Databases (EuPathDB: http://eupathdb.org) Bioinformatic Resource Center (BRC) and illustrates some of the available tools and methods.

  9. Generative Representations for Computer-Automated Evolutionary Design

    NASA Technical Reports Server (NTRS)

    Hornby, Gregory S.

    2006-01-01

    With the increasing computational power of computers, software design systems are progressing from being tools for architects and designers to express their ideas to tools capable of creating designs under human guidance. One of the main limitations for these computer-automated design systems is the representation with which they encode designs. If the representation cannot encode a certain design, then the design system cannot produce it. To be able to produce new types of designs, and not just optimize pre-defined parameterizations, evolutionary design systems must use generative representations. Generative representations are assembly procedures, or algorithms, for constructing a design thereby allowing for truly novel design solutions to be encoded. In addition, by enabling modularity, regularity and hierarchy, the level of sophistication that can be evolved is increased. We demonstrate the advantages of generative representations on two different design domains: the evolution of spacecraft antennas and the evolution of 3D objects.
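
    A minimal sketch of what distinguishes a generative representation: the genome is a rewriting program whose rule reuse yields modularity and regularity. This L-system-style example is illustrative and is not the encoding the author used for the antennas or 3D objects.

```python
# Hedged sketch: a genome as rewrite rules rather than a parameter list.
def expand(axiom, rules, depth):
    """Apply the rewrite rules to the axiom `depth` times."""
    s = axiom
    for _ in range(depth):
        s = "".join(rules.get(ch, ch) for ch in s)
    return s

# Genome: a reusable subroutine 'A' builds a branching cell; '[' ']' would
# push/pop drawing state in a turtle interpreter. Reusing 'A' and 'F' gives
# the modularity and regularity the abstract highlights.
genome = {"A": "F[+A][-A]FA", "F": "FF"}
design = expand("A", genome, depth=3)
print(design[:60], "... total length:", len(design))
```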

  10. Earth Science Informatics Comes of Age

    NASA Technical Reports Server (NTRS)

    Jodha, Siri; Khalsa, S.; Ramachandran, Rahul

    2014-01-01

    The volume and complexity of Earth science data have steadily increased, placing ever-greater demands on researchers, software developers and data managers tasked with handling such data. Additional demands arise from requirements being levied by funding agencies and governments to better manage, preserve and provide open access to data. Fortunately, over the past 10-15 years significant advances in information technology, such as increased processing power, advanced programming languages, more sophisticated and practical standards, and near-ubiquitous internet access have made the jobs of those acquiring, processing, distributing and archiving data easier. These advances have also led to an increasing number of individuals entering the field of informatics as it applies to Geoscience and Remote Sensing. Informatics is the science and technology of applying computers and computational methods to the systematic analysis, management, interchange, and representation of data, information, and knowledge. Informatics also encompasses the use of computers and computational methods to support decisionmaking and other applications for societal benefits.

  11. Tyramine Hydrochloride Based Label-Free System for Operating Various DNA Logic Gates and a DNA Caliper for Base Number Measurements.

    PubMed

    Fan, Daoqing; Zhu, Xiaoqing; Dong, Shaojun; Wang, Erkang

    2017-07-05

    DNA is believed to be a promising candidate for molecular logic computation, and the fluorogenic/colorimetric substrates of G-quadruplex DNAzyme (G4zyme) are broadly used as label-free output reporters of DNA logic circuits. Herein, for the first time, tyramine-HCl (a fluorogenic substrate of G4zyme) is applied to DNA logic computation and a series of label-free DNA-input logic gates, including elementary AND, OR, and INHIBIT logic gates, as well as a two to one encoder, are constructed. Furthermore, a DNA caliper that can measure the base number of target DNA as low as three bases is also fabricated. This DNA caliper can also perform concatenated AND-AND logic computation to fulfil the requirements of sophisticated logic computing. © 2017 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.
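
    A minimal sketch, in ordinary Boolean terms, of the gate behaviors the paper realizes chemically: an input of 1 means the corresponding DNA strand is present, and an output of 1 stands for G4zyme-catalyzed tyramine fluorescence. This mapping is illustrative, not the paper's protocol.

```python
# Hedged sketch: truth tables for the label-free gates described above.
def AND(a, b):     return int(a and b)      # output only if both inputs present
def OR(a, b):      return int(a or b)       # output if either input present
def INHIBIT(a, b): return int(a and not b)  # input B suppresses the output

print("A B | AND OR INHIBIT")
for a in (0, 1):
    for b in (0, 1):
        print(f"{a} {b} |  {AND(a, b)}   {OR(a, b)}    {INHIBIT(a, b)}")
```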

  12. A Gateway for Phylogenetic Analysis Powered by Grid Computing Featuring GARLI 2.0

    PubMed Central

    Bazinet, Adam L.; Zwickl, Derrick J.; Cummings, Michael P.

    2014-01-01

    We introduce molecularevolution.org, a publicly available gateway for high-throughput, maximum-likelihood phylogenetic analysis powered by grid computing. The gateway features a garli 2.0 web service that enables a user to quickly and easily submit thousands of maximum likelihood tree searches or bootstrap searches that are executed in parallel on distributed computing resources. The garli web service allows one to easily specify partitioned substitution models using a graphical interface, and it performs sophisticated post-processing of phylogenetic results. Although the garli web service has been used by the research community for over three years, here we formally announce the availability of the service, describe its capabilities, highlight new features and recent improvements, and provide details about how the grid system efficiently delivers high-quality phylogenetic results. [garli, gateway, grid computing, maximum likelihood, molecular evolution portal, phylogenetics, web service.] PMID:24789072
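
    Bootstrap tree searches are independent of one another, which is what lets the gateway fan them out across a grid. A minimal sketch of that embarrassingly parallel pattern follows, with a stand-in scoring function in place of a real GARLI run; all names here are invented for illustration.

```python
# Hedged sketch: each bootstrap replicate is an independent job, so they
# parallelize trivially, locally here, across a grid in the real service.
from concurrent.futures import ProcessPoolExecutor
import random

def bootstrap_replicate(seed):
    """Stand-in for one bootstrap search: resample sites, return a score."""
    rng = random.Random(seed)
    sites = [rng.random() for _ in range(1000)]
    resampled = [rng.choice(sites) for _ in sites]   # sample with replacement
    return sum(resampled) / len(resampled)           # placeholder "likelihood"

if __name__ == "__main__":
    with ProcessPoolExecutor() as pool:
        scores = list(pool.map(bootstrap_replicate, range(100)))
    print(f"{len(scores)} replicates, mean score {sum(scores)/len(scores):.4f}")
```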

  13. A tale of three bio-inspired computational approaches

    NASA Astrophysics Data System (ADS)

    Schaffer, J. David

    2014-05-01

    I will provide a high-level walk-through of three computational approaches derived from Nature. First, evolutionary computation implements what we may call the "mother of all adaptive processes." Some variants on the basic algorithms will be sketched, along with some lessons I have gleaned from three decades of working with EC. Second, neural networks are computational approaches that have long been studied as possible ways to make "thinking machines", an old dream of mankind, based upon the only known existing example of intelligence. Third, I give a brief overview of attempts to combine these two approaches, which some hope will allow us to evolve machines we could never hand-craft. Finally, I will touch on artificial immune systems, Nature's highly sophisticated defense mechanism, which has emerged in two major stages: the innate and the adaptive immune systems. This technology is finding applications in the cyber security world.
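
    A minimal sketch of the evolutionary-computation loop mentioned first: selection, crossover, and mutation over bit strings on a toy onemax problem. All parameters are illustrative.

```python
# Hedged sketch: the canonical generate-evaluate-select loop of a simple GA.
import random

def evolve(pop_size=50, length=40, generations=60, p_mut=0.02, seed=3):
    rng = random.Random(seed)
    fitness = lambda ind: sum(ind)                  # onemax: count of 1 bits
    pop = [[rng.randint(0, 1) for _ in range(length)] for _ in range(pop_size)]
    for _ in range(generations):
        def pick():                                 # binary tournament selection
            a, b = rng.sample(pop, 2)
            return a if fitness(a) >= fitness(b) else b
        nxt = []
        while len(nxt) < pop_size:
            p1, p2 = pick(), pick()
            cut = rng.randrange(1, length)          # one-point crossover
            child = p1[:cut] + p2[cut:]
            child = [1 - g if rng.random() < p_mut else g for g in child]
            nxt.append(child)
        pop = nxt
    return max(fitness(ind) for ind in pop)

print("best fitness:", evolve(), "of 40")
```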

  14. Application of CFD to a generic hypersonic flight research study

    NASA Technical Reports Server (NTRS)

    Green, Michael J.; Lawrence, Scott L.; Dilley, Arthur D.; Hawkins, Richard W.; Walker, Mary M.; Oberkampf, William L.

    1993-01-01

    Computational analyses have been performed for the initial assessment of flight research vehicle concepts that satisfy requirements for potential hypersonic experiments. Results were obtained from independent analyses at NASA Ames, NASA Langley, and Sandia National Labs, using sophisticated time-dependent Navier-Stokes and parabolized Navier-Stokes methods. Careful study of a common problem consisting of hypersonic flow past a slightly blunted conical forebody was undertaken to estimate the level of uncertainty in the computed results, and to assess the capabilities of current computational methods for predicting boundary-layer transition onset. Results of this study in terms of surface pressure and heat transfer comparisons, as well as comparisons of boundary-layer edge quantities and flow-field profiles are presented here. Sensitivities to grid and gas model are discussed. Finally, representative results are presented relating to the use of Computational Fluid Dynamics in the vehicle design and the integration/support of potential experiments.

  15. Efficient modeling of vector hysteresis using a novel Hopfield neural network implementation of Stoner–Wohlfarth-like operators

    PubMed Central

    Adly, Amr A.; Abd-El-Hafiz, Salwa K.

    2012-01-01

    Incorporation of hysteresis models in electromagnetic analysis approaches is indispensable to accurate field computation in complex magnetic media. Throughout those computations, vector nature and computational efficiency of such models become especially crucial when sophisticated geometries requiring massive sub-region discretization are involved. Recently, an efficient vector Preisach-type hysteresis model constructed from only two scalar models having orthogonally coupled elementary operators has been proposed. This paper presents a novel Hopfield neural network approach for the implementation of Stoner–Wohlfarth-like operators that could lead to a significant enhancement in the computational efficiency of the aforementioned model. Advantages of this approach stem from the non-rectangular nature of these operators that substantially minimizes the number of operators needed to achieve an accurate vector hysteresis model. Details of the proposed approach, its identification and experimental testing are presented in the paper. PMID:25685446
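
    For contrast with the smoother Stoner–Wohlfarth-like operators the paper implements via a Hopfield network, here is a minimal sketch of the classical rectangular Preisach relay, the elementary hysteresis operator; the thresholds and input sweep are illustrative, and this rectangular version is explicitly not the paper's operator.

```python
# Hedged sketch: a rectangular hysteron switches up at alpha, down at beta
# (beta < alpha), and otherwise remembers its previous state.
def relay(inputs, alpha, beta, state=-1):
    out = []
    for u in inputs:
        if u >= alpha:
            state = +1
        elif u <= beta:
            state = -1
        out.append(state)        # in between, the previous state persists
    return out

# A triangular input sweep traces the hysteresis loop of a single operator
sweep = [x / 10 for x in list(range(-10, 11)) + list(range(10, -11, -1))]
print(list(zip(sweep, relay(sweep, alpha=0.4, beta=-0.4)))[:8])
```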

  16. 15 CFR 752.11 - Internal Control Programs.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... order for several sophisticated lasers. (C) The product ordered is incompatible with the technical level... equipment would be of little use in a country without an electronics industry. (D) The customer has little...

  17. 15 CFR 752.11 - Internal Control Programs.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... order for several sophisticated lasers. (C) The product ordered is incompatible with the technical level... equipment would be of little use in a country without an electronics industry. (D) The customer has little...

  18. 15 CFR 752.11 - Internal Control Programs.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... order for several sophisticated lasers. (C) The product ordered is incompatible with the technical level... equipment would be of little use in a country without an electronics industry. (D) The customer has little...

  19. 15 CFR 752.11 - Internal Control Programs.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... order for several sophisticated lasers. (C) The product ordered is incompatible with the technical level... equipment would be of little use in a country without an electronics industry. (D) The customer has little...

  20. IPAD: Integrated Programs for Aerospace-vehicle Design

    NASA Technical Reports Server (NTRS)

    Miller, R. E., Jr.

    1985-01-01

    Early work was performed to apply data base technology in support of the management of engineering data in the design and manufacturing environments. The principal objective of the IPAD project is to develop a computer software system for use in the design of aerospace vehicles. Two prototype systems were created for this purpose. Relational Information Manager (RIM) is a successful commercial product. The IPAD Information Processor (IPIP), a much more sophisticated system, is still under development.

  1. Laser targets compensate for limitations in inertial confinement fusion drivers

    NASA Astrophysics Data System (ADS)

    Kilkenny, J. D.; Alexander, N. B.; Nikroo, A.; Steinman, D. A.; Nobile, A.; Bernat, T.; Cook, R.; Letts, S.; Takagi, M.; Harding, D.

    2005-10-01

    Success in inertial confinement fusion (ICF) requires sophisticated, characterized targets. The increasing fidelity of three-dimensional (3D), radiation-hydrodynamic computer codes has made it possible to design targets for ICF that can compensate for limitations in the existing single-shot laser and Z-pinch ICF drivers. Developments in ICF target fabrication technology allow more esoteric target designs to be fabricated. Present requirements call for new deterministic nano-material fabrication on the micro scale.

  2. Manufacturing Technology Research Needs of the Gear Industry

    DTIC Science & Technology

    1987-12-31

    Fragments indexed from this report: availability of skilled craftsmen; management shortcomings within the U.S. precision gear industry; as manufacturing becomes more sophisticated, workers are running numerically controlled computer equipment requiring an understanding of math; counter the inefficiencies of the job shop environment by managing the gear business as a backward integration of the assembly line; develop and maintain employee …

  3. Real-time seismic monitoring of instrumented hospital buildings

    USGS Publications Warehouse

    Kalkan, Erol; Fletcher, Jon Peter B.; Leith, William S.; McCarthy, William S.; Banga, Krishna

    2012-01-01

    In collaboration with the Department of Veterans Affairs (VA), the U.S. Geological Survey's National Strong Motion Project has recently installed sophisticated seismic monitoring systems to monitor the structural health of two hospital buildings at the Memphis VA Medical Center in Tennessee. The monitoring systems in the Bed Tower and Spinal Cord Injury buildings combine sensing technologies with an on-site computer to capture and analyze seismic performance of buildings in near-real time.

  4. Project Listen Compute Show (LCS) - Marine

    DTIC Science & Technology

    2004-02-01

    Fragments indexed from this report: Figure 15, a block diagram of a BB-5, noting the discrete components between the FPGA and the display connection, all of which are scheduled to be …; an architecture scheduled to form the core of the next-generation projection product, expected to scale to true HDTV resolution of 1920 by 1080 …; a flight schedule obtained from a SABRE database in order to offer on-time status; more sophisticated mechanisms developed for dealing with …

  5. Neo-Sophistic Rhetorical Theory: Sophistic Precedents for Contemporary Epistemic Rhetoric.

    ERIC Educational Resources Information Center

    McComiskey, Bruce

    Interest in the sophists has recently intensified among rhetorical theorists, culminating in the notion that rhetoric is epistemic. Epistemic rhetoric has its first and deepest roots in sophistic epistemological and rhetorical traditions, so that the view of rhetoric as epistemic is now being dubbed "neo-sophistic." In epistemic…

  6. Computers in medicine: liability issues for physicians.

    PubMed

    Hafner, A W; Filipowicz, A B; Whitely, W P

    1989-07-01

    Physicians routinely use computers to store, access, and retrieve medical information. As computer use becomes even more widespread in medicine, failure to utilize information systems may be seen as a violation of professional custom and lead to findings of professional liability. Even when a technology is not widespread, failure to incorporate it into medical practice may give rise to liability if the technology is accessible to the physician and reduces risk to the patient. Improvement in the availability of medical information sources imposes a greater burden on the physician to keep current and to obtain informed consent from patients. To routinely perform computer-assisted literature searches for informed consent and diagnosis is 'good medicine'. Clinical and diagnostic applications of computer technology now include computer-assisted decision making with the aid of sophisticated databases. Although such systems will expand the knowledge base and competence of physicians, malfunctioning software raises a major liability question. Also, complex computer-driven technology is used in direct patient care. Defective or improperly used hardware or software can lead to patient injury, thus raising additional complicated questions of professional liability and product liability.

  7. Application of advanced virtual reality and 3D computer assisted technologies in tele-3D-computer assisted surgery in rhinology.

    PubMed

    Klapan, Ivica; Vranjes, Zeljko; Prgomet, Drago; Lukinović, Juraj

    2008-03-01

    The real-time requirement means that the simulation should be able to follow the actions of a user who may be moving in the virtual environment. The computer system should also store in its memory a three-dimensional (3D) model of the virtual environment. In that case a real-time virtual reality system will update the 3D graphic visualization as the user moves, so that up-to-date visualization is always shown on the computer screen. Upon completion of the tele-operation, the surgeon compares the preoperative and postoperative images and models of the operative field, and studies video records of the procedure itself. Using intraoperative records, animated images of the real tele-procedure performed can be designed. Virtual surgery offers the possibility of preoperative planning in rhinology. The intraoperative use of a computer in real time requires development of appropriate hardware and software to connect the medical instrumentarium with the computer, and to operate the computer through the connected instrumentarium and sophisticated multimedia interfaces.

  8. Results from teleoperated free-flying spacecraft simulations in the Martin Marietta space operations simulator lab

    NASA Technical Reports Server (NTRS)

    Hartley, Craig S.

    1990-01-01

    To augment the capabilities of the Space Transportation System, NASA has funded studies and developed programs aimed at developing reusable, remotely piloted spacecraft and satellite servicing systems capable of delivering, retrieving, and servicing payloads at altitudes and inclinations beyond the reach of the present Shuttle Orbiters. Since the mid 1970's, researchers at the Martin Marietta Astronautics Group Space Operations Simulation (SOS) Laboratory have been engaged in investigations of remotely piloted and supervised autonomous spacecraft operations. These investigations were based on high fidelity, real-time simulations and have covered a wide range of human factors issues related to controllability. Among these are: (1) mission conditions, including thruster plume impingements and signal time delays; (2) vehicle performance variables, including control authority, control harmony, minimum impulse, and cross coupling of accelerations; (3) maneuvering task requirements such as target distance and dynamics; (4) control parameters including various control modes and rate/displacement deadbands; and (5) display parameters involving camera placement and function, visual aids, and presentation of operational feedback from the spacecraft. This presentation includes a brief description of the capabilities of the SOS Lab to simulate real-time free-flyer operations using live video, advanced technology ground and on-orbit workstations, and sophisticated computer models of on-orbit spacecraft behavior. Sample results from human factors studies in the five categories cited above are provided.

  9. The Crossroads between the Galactic Disk and Interstellar Space, Ablaze in 3/4 keV Light

    NASA Astrophysics Data System (ADS)

    Shelton, Robin L.

    2016-04-01

    The halo is the crossroads between the Galactic disk and intergalactic space. This region is inhabited by hot gas that has risen from the disk, gas heated in situ, and hot material that has fallen in from intergalactic space. Owing to high spectral resolution observations made by XMM-Newton, Suzaku, and Chandra of the hot plasma's 3/4 keV emission and absorption, increasingly sophisticated and CPU-intensive computer modeling, and an awareness that charge exchange can contaminate 3/4 keV observations, we are now better able to understand the hot halo gas than ever before. Spectral analyses indicate that the 3/4 keV emission comes from T ~ 2.2 million Kelvin gas. Although observations suggest that the gas may be convectively unstable and the spectral temperature is similar to that predicted by recent sophisticated models of the galactic fountain, the observed emission measure is significantly brighter than that predicted by fountain models. This brightness disparity presents us with another type of crossroads: should we continue down the road of adding physics to already sophisticated modeling, or should we seek out other sources? In this presentation, I will discuss the galactic fountain crossroads, note the latitudinal and longitudinal distribution of the hot halo gas, provide an update on charge exchange, and explain how shadowing observations have helped to fine-tune our understanding of the hot gas.

  10. DISCRETE EVENT SIMULATION OF OPTICAL SWITCH MATRIX PERFORMANCE IN COMPUTER NETWORKS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Imam, Neena; Poole, Stephen W

    2013-01-01

    In this paper, we present the application of a Discrete Event Simulator (DES) for performance modeling of optical switching devices in computer networks. Network simulators are valuable tools in situations where one cannot investigate the system directly. This situation may arise if the system under study does not exist yet or the cost of studying the system directly is prohibitive. Most available network simulators are based on the paradigm of discrete-event-based simulation. As computer networks become increasingly larger and more complex, sophisticated DES tool chains have become available for both commercial and academic research. Some well-known simulators are NS2, NS3, OPNET, and OMNEST. For this research, we have applied OMNEST for the purpose of simulating multi-wavelength performance of optical switch matrices in computer interconnection networks. Our results suggest that the application of DES to computer interconnection networks provides valuable insight into device performance and aids in topology and system optimization.
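
    As an illustration of the discrete-event paradigm the abstract describes, the sketch below simulates a single switch port with Poisson arrivals and a fixed switching delay using an event queue. It is a Python toy under invented parameters, not the OMNEST model used in the paper.

        # Minimal discrete-event simulation sketch of one optical switch port.
        # Illustrative only; arrival rate and switching delay are hypothetical.
        import heapq
        import random

        def simulate(n_packets=1000, arrival_rate=0.8, switch_delay=1.0, seed=1):
            random.seed(seed)
            events = []                            # priority queue of (time, kind, packet_id)
            t = 0.0
            for pid in range(n_packets):           # schedule Poisson arrivals
                t += random.expovariate(arrival_rate)
                heapq.heappush(events, (t, "arrival", pid))
            free_at = 0.0                          # when the switch next becomes idle
            waits = []
            while events:
                now, kind, pid = heapq.heappop(events)
                if kind == "arrival":
                    start = max(now, free_at)      # wait if the switch is busy
                    free_at = start + switch_delay
                    heapq.heappush(events, (free_at, "departure", pid))
                    waits.append(start - now)
            return sum(waits) / len(waits)

        print("mean queueing delay:", simulate())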

  11. Pyramidal neurovision architecture for vision machines

    NASA Astrophysics Data System (ADS)

    Gupta, Madan M.; Knopf, George K.

    1993-08-01

    The vision system employed by an intelligent robot must be active; active in the sense that it must be capable of selectively acquiring the minimal amount of relevant information for a given task. An efficient active vision system architecture that is based loosely upon the parallel-hierarchical (pyramidal) structure of the biological visual pathway is presented in this paper. Although the computational architecture of the proposed pyramidal neuro-vision system is far less sophisticated than the architecture of the biological visual pathway, it does retain some essential features such as the converging multilayered structure of its biological counterpart. In terms of visual information processing, the neuro-vision system is constructed from a hierarchy of several interactive computational levels, whereupon each level contains one or more nonlinear parallel processors. Computationally efficient vision machines can be developed by utilizing both the parallel and serial information processing techniques within the pyramidal computing architecture. A computer simulation of a pyramidal vision system for active scene surveillance is presented.
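
    To make the converging, multilayered idea concrete, here is a minimal sketch of a resolution pyramid in Python: each level applies a local averaging operator and downsamples, so higher levels respond to coarser, more global structure. This is a generic illustration, not the authors' neuro-vision architecture.

        # Toy image pyramid: 2x2 averaging followed by downsampling per level.
        import numpy as np

        def pyramid(image, levels=3):
            out = [image]
            for _ in range(levels):
                img = out[-1]
                h, w = (img.shape[0] // 2) * 2, (img.shape[1] // 2) * 2
                img = img[:h, :w]                          # crop to even size
                coarse = (img[0::2, 0::2] + img[1::2, 0::2] +
                          img[0::2, 1::2] + img[1::2, 1::2]) / 4.0
                out.append(coarse)                         # one converging level
            return out

        levels = pyramid(np.random.default_rng(4).random((64, 64)))
        print([lv.shape for lv in levels])                 # (64,64) down to (8,8)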

  12. Highly Sophisticated Virtual Laboratory Instruments in Education

    NASA Astrophysics Data System (ADS)

    Gaskins, T.

    2006-12-01

    Many areas of Science have advanced or stalled according to the ability to see what can not normally be seen. Visual understanding has been key to many of the world's greatest breakthroughs, such as discovery of DNAs double helix. Scientists use sophisticated instruments to see what the human eye can not. Light microscopes, scanning electron microscopes (SEM), spectrometers and atomic force microscopes are employed to examine and learn the details of the extremely minute. It's rare that students prior to university have access to such instruments, or are granted full ability to probe and magnify as desired. Virtual Lab, by providing highly authentic software instruments and comprehensive imagery of real specimens, provides them this opportunity. Virtual Lab's instruments let explorers operate virtual devices on a personal computer to examine real specimens. Exhaustive sets of images systematically and robotically photographed at thousands of positions and multiple magnifications and focal points allow students to zoom in and focus on the most minute detail of each specimen. Controls on each Virtual Lab device interactively and smoothly move the viewer through these images to display the specimen as the instrument saw it. Users control position, magnification, focal length, filters and other parameters. Energy dispersion spectrometry is combined with SEM imagery to enable exploration of chemical composition at minute scale and arbitrary location. Annotation capabilities allow scientists, teachers and students to indicate important features or areas. Virtual Lab is a joint project of NASA and the Beckman Institute at the University of Illinois at Urbana- Champaign. Four instruments currently compose the Virtual Lab suite: A scanning electron microscope and companion energy dispersion spectrometer, a high-power light microscope, and a scanning probe microscope that captures surface properties to the level of atoms. Descriptions of instrument operating principles and uses are also part of Virtual Lab. The Virtual Lab software and its increasingly rich collection of specimens are free to anyone. This presentation describes Virtual Lab and its uses in formal and informal education.

  13. Advanced Vehicle Control Systems Potential Tort Liability For Developers

    DOT National Transportation Integrated Search

    1993-12-01

    AUTOMOBILE ACCIDENTS AVOIDED BECAUSE THE AUTOMATIC COLLISION AVOIDANCE SYSTEM APPLIES THE BRAKES, HIGHWAYS WHICH ACCOMMODATE MORE VEHICLES WITH FEWER ACCIDENTS, AND EVEN CARS WHICH ARE PILOTED ENTIRELY BY SOPHISTICATED ELECTRONIC SYSTEMS -- ALL OF TH...

  14. Pharmacophore modeling, docking, and principal component analysis based clustering: combined computer-assisted approaches to identify new inhibitors of the human rhinovirus coat protein.

    PubMed

    Steindl, Theodora M; Crump, Carolyn E; Hayden, Frederick G; Langer, Thierry

    2005-10-06

    The development and application of a sophisticated virtual screening and selection protocol to identify potential, novel inhibitors of the human rhinovirus coat protein employing various computer-assisted strategies are described. A large commercially available database of compounds was screened using a highly selective, structure-based pharmacophore model generated with the program Catalyst. A docking study and a principal component analysis were carried out within the software package Cerius and served to validate and further refine the obtained results. These combined efforts led to the selection of six candidate structures, for which in vitro anti-rhinoviral activity could be shown in a biological assay.

  15. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gavignet, A.A.; Wick, C.J.

    In current practice, pressure drops in the mud circulating system and the settling velocity of cuttings are calculated with simple rheological models and simple equations. Wellsite computers now allow more sophistication in drilling computations. In this paper, experimental results on the settling velocity of spheres in drilling fluids are reported, along with rheograms measured over a wide range of shear rates. The flow curves are fitted to polynomials, and general methods are developed to predict friction losses and settling velocities as functions of the polynomial coefficients. These methods were incorporated in a software package that can handle any rig configuration system, including riser booster. Graphic displays show the effect of each parameter on the performance of the circulating system.
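
    The polynomial-fitting step described above can be sketched as follows; the rheogram data points here are invented for illustration, and the paper's friction-loss correlations are not reproduced.

        # Fit a measured flow curve (rheogram) to a polynomial and reuse the
        # coefficients downstream, e.g. for apparent viscosity. Data are made up.
        import numpy as np

        shear_rate = np.array([5.1, 10.2, 51.0, 102.0, 511.0, 1022.0])   # 1/s
        shear_stress = np.array([3.2, 4.1, 7.8, 10.5, 22.0, 33.5])       # Pa

        coeffs = np.polyfit(shear_rate, shear_stress, deg=3)   # cubic flow curve
        flow_curve = np.poly1d(coeffs)

        # Apparent viscosity at an arbitrary shear rate: tau / gamma_dot.
        gamma = 300.0
        print("apparent viscosity [Pa.s]:", flow_curve(gamma) / gamma)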

  16. GEOS. User Tutorials

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fu, Pengchen; Settgast, Randolph R.; Johnson, Scott M.

    2014-12-17

    GEOS is a massively parallel, multi-physics simulation application utilizing high performance computing (HPC) to address subsurface reservoir stimulation activities with the goal of optimizing current operations and evaluating innovative stimulation methods. GEOS enables coupling of different solvers associated with the various physical processes occurring during reservoir stimulation in unique and sophisticated ways, adapted to various geologic settings, materials and stimulation methods. Developed at the Lawrence Livermore National Laboratory (LLNL) as a part of a Laboratory-Directed Research and Development (LDRD) Strategic Initiative (SI) project, GEOS represents the culmination of a multi-year ongoing code development and improvement effort that has leveraged existing code capabilities and staff expertise to design new computational geosciences software.

  17. Efficient Unsteady Flow Visualization with High-Order Access Dependencies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhang, Jiang; Guo, Hanqi; Yuan, Xiaoru

    We present a novel model based on high-order access dependencies for efficient pathline computation in unsteady flow visualization. By taking longer access sequences into account to model more sophisticated data access patterns in particle tracing, our method greatly improves the accuracy and reliability of data access prediction. In our work, high-order access dependencies are calculated by tracing uniformly-seeded pathlines in both forward and backward directions in a preprocessing stage. The effectiveness of our proposed approach is demonstrated through a parallel particle tracing framework with high-order data prefetching. Results show that our method achieves higher data locality and hence improves the efficiency of pathline computation.
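
    A minimal sketch of the underlying idea, under the assumption that data accesses are block IDs recorded along traced pathlines: a length-k access history indexes a histogram of successor blocks, and the most frequent successor is the prefetch candidate. Names and the order k are illustrative.

        # Learn high-order access dependencies from traces, then predict the
        # next block to prefetch. Toy block IDs; not the authors' implementation.
        from collections import defaultdict, Counter

        def build_model(traces, k=2):
            """Map each length-k access history to a histogram of successors."""
            model = defaultdict(Counter)
            for trace in traces:
                for i in range(len(trace) - k):
                    history = tuple(trace[i:i + k])
                    model[history][trace[i + k]] += 1
            return model

        def predict(model, history):
            """Return the most frequent successor for this history, if any."""
            successors = model.get(tuple(history))
            return successors.most_common(1)[0][0] if successors else None

        traces = [[1, 2, 3, 4], [1, 2, 3, 5], [7, 2, 3, 4]]   # toy traces
        model = build_model(traces, k=2)
        print(predict(model, [2, 3]))   # -> 4, majority successor of (2, 3)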

  18. Depicting 3D shape using lines

    NASA Astrophysics Data System (ADS)

    DeCarlo, Doug

    2012-03-01

    Over the last few years, researchers in computer graphics have developed sophisticated mathematical descriptions of lines on 3D shapes that can be rendered convincingly as strokes in drawings. These innovations highlight fundamental questions about how human perception takes strokes in drawings as evidence of 3D structure. Answering these questions will lead to a greater scientific understanding of the flexibility and richness of human perception, as well as to practical techniques for synthesizing clearer and more compelling drawings. This paper reviews what is known about the mathematics and perception of computer-generated line drawings of shape and motivates an ongoing program of research to better characterize the shapes people see when they look at such drawings.

  19. A scheme to calculate higher-order homogenization as applied to micro-acoustic boundary value problems

    NASA Astrophysics Data System (ADS)

    Vagh, Hardik A.; Baghai-Wadji, Alireza

    2008-12-01

    Current technological challenges in materials science and the high-tech device industry require the solution of boundary value problems (BVPs) involving regions of various scales, e.g. multiple thin layers, fibre-reinforced composites, and nano/micro pores. In most cases straightforward application of standard variational techniques to BVPs of practical relevance necessarily leads to unsatisfactorily ill-conditioned analytical and/or numerical results. To remedy the computational challenges associated with sub-sectional heterogeneities, various sophisticated homogenization techniques need to be employed. Homogenization refers to the systematic process of smoothing out the sub-structural heterogeneities, leading to the determination of effective constitutive coefficients. Ordinarily, homogenization involves sophisticated averaging and asymptotic order analysis to obtain solutions. In the majority of cases only zero-order terms are constructed, owing to the complexity of the processes involved. In this paper we propose a constructive scheme for obtaining homogenized solutions involving higher-order terms, thus guaranteeing higher accuracy and greater robustness of the numerical results. We present
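
    The construction can be summarized with the standard two-scale ansatz (the notation below is assumed for illustration, not taken from the paper):

        u^{\varepsilon}(x) \;=\; u_0\!\left(x, \tfrac{x}{\varepsilon}\right)
          \;+\; \varepsilon\, u_1\!\left(x, \tfrac{x}{\varepsilon}\right)
          \;+\; \varepsilon^{2}\, u_2\!\left(x, \tfrac{x}{\varepsilon}\right) \;+\; \cdots

    Substituting this expansion into the BVP and collecting equal powers of \varepsilon yields a cascade of cell problems on the fast variable y = x/\varepsilon; truncating after u_0 recovers classical (zero-order) homogenization, while retaining u_1 and u_2 supplies the higher-order corrections targeted here.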

  20. HPC Annual Report: Emulytics.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Crussell, Jonathan; Boote, Jeffrey W.; Fritz, David Jakob

    Networked Information Technology systems play a key role supporting critical government, military, and private computer installations. Many of today's critical infrastructure systems have strong dependencies on secure information exchange among geographically dispersed facilities. As operations become increasingly dependent on this information exchange, they also become targets for exploitation. The need to protect data and defend these systems from external attack has become increasingly vital, while the nature of the threats has become sophisticated and pervasive, making the challenges daunting. Enter Emulytics.

  1. Development and application of numerical techniques for general-relativistic magnetohydrodynamics simulations of black hole accretion

    NASA Astrophysics Data System (ADS)

    White, Christopher Joseph

    We describe the implementation of sophisticated numerical techniques for general-relativistic magnetohydrodynamics simulations in the Athena++ code framework. Improvements over many existing codes include the use of advanced Riemann solvers and of staggered-mesh constrained transport. Combined with considerations for computational performance and parallel scalability, these allow us to investigate black hole accretion flows with unprecedented accuracy. The capability of the code is demonstrated by exploring magnetically arrested disks.

  2. A Computer Based Data Management System for Automating the Air Force Vehicle Master Plan

    DTIC Science & Technology

    1989-09-01

    and columns -- but a much more sophisticated approach defining the relations among the data. Dr. Edgar F. Codd, who first proposed the relational ... f. Vehicle Types Screen ... 7. NSN Selection Screen ... 8. NSN Retry Screen ... 9. Five-Year Outlook ... had a single source of information on the vehicle fleet, to be used in developing, funding, and prioritizing the vehicle program required to meet the

  3. Conceptual Memory: A Theory and Computer Program for Processing the Meaning Content of Natural Language Utterances

    DTIC Science & Technology

    1974-07-01

    This document was generated by the Stanford Artificial Intelligence Laboratory's document compiler, "PUB", and reproduced on a ... for more sophisticated artificial (programming) languages. The new issues became those of how to represent a grammar as precise syntactic structures ... challenge lies in discovering - either by synthesis of an artificial system, or by analysis of a natural one - the underlying logical (as opposed to

  4. Analysis of peristaltic waves and their role in migrating Physarum plasmodia

    NASA Astrophysics Data System (ADS)

    Lewis, Owen L.; Guy, Robert D.

    2017-07-01

    The true slime mold Physarum polycephalum exhibits a vast array of sophisticated manipulations of its intracellular cytoplasm. Growing microplasmodia of Physarum have been observed to adopt an elongated tadpole shape, then contract in a rhythmic, traveling wave pattern that resembles peristaltic pumping. This contraction drives a fast flow of non-gelated cytoplasm along the cell longitudinal axis. It has been hypothesized that this flow of cytoplasm is a driving factor in generating motility of the plasmodium. In this work, we use two different mathematical models to investigate how peristaltic pumping within Physarum may be used to drive cellular motility. We compare the relative phase of flow and deformation waves predicted by both models to similar phase data collected from in vivo experiments using Physarum plasmodia. The first is a PDE model based on a dimensional reduction of peristaltic pumping within a finite length chamber. The second is a more sophisticated computational model which accounts for more general shape changes, more complex cellular mechanics, and dynamically modulated adhesion to the underlying substrate. This model allows us to directly compute cell crawling speed. Both models suggest that a mechanical asymmetry in the cell is required to reproduce the experimental observations. Such a mechanical asymmetry is also shown to increase the potential for cellular migration, as measured by both stress generation and migration velocity.

  5. Supercomputing resources empowering superstack with interactive and integrated systems

    NASA Astrophysics Data System (ADS)

    Rückemann, Claus-Peter

    2012-09-01

    This paper presents results from the development and implementation of Superstack algorithms for dynamic use with integrated systems and supercomputing resources. Processing of geophysical data, known as geoprocessing, is an essential part of the analysis of geoscientific data. The theory of Superstack algorithms and their practical application on modern computing architectures were inspired by developments in seismic data processing that began on mainframes and have led in recent years to high-end scientific computing applications. Several stacking algorithms are known, but for seismic data with a low signal-to-noise ratio, iterative algorithms like the Superstack can support analysis and interpretation. The new Superstack algorithms are in use with wave theory and optical phenomena on high-performance computing resources, for huge data sets as well as for sophisticated application scenarios in geosciences and archaeology.
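
    A minimal sketch of one iterative stacking scheme of this kind: traces are re-weighted by their correlation with the running stack on each pass, so coherent energy is emphasized and incoherent noise suppressed. This generic Python illustration is not the authors' production Superstack.

        # Iterative (super)stack sketch: correlation-weighted restacking.
        import numpy as np

        def superstack(traces, n_iter=5):
            stack = traces.mean(axis=0)                 # plain stack as starting point
            for _ in range(n_iter):
                # Weight each trace by its correlation with the current stack.
                w = np.array([np.corrcoef(tr, stack)[0, 1] for tr in traces])
                w = np.clip(w, 0.0, None)               # ignore anti-correlated traces
                stack = (w[:, None] * traces).sum(axis=0) / (w.sum() + 1e-12)
            return stack

        rng = np.random.default_rng(0)
        signal = np.sin(np.linspace(0.0, 6.28, 200))
        traces = signal + rng.normal(0.0, 2.0, size=(50, 200))   # low-SNR ensemble
        print(np.corrcoef(superstack(traces), signal)[0, 1])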

  6. Forecasting techno-social systems: how physics and computing help to fight off global pandemics

    NASA Astrophysics Data System (ADS)

    Vespignani, Alessandro

    2010-03-01

    The crucial issue when planning adequate public health interventions to mitigate the spread and impact of epidemics is risk evaluation and forecasting. This amounts to anticipating where, when, and how strongly an epidemic will strike. In the last decade, advances in computer performance, data acquisition, statistical physics, and complex network theory have made it possible to generate sophisticated simulations on supercomputer infrastructures that anticipate the spreading pattern of a pandemic. For the first time we are in a position to generate real-time forecasts of epidemic spreading. I will review the history of the current H1N1 pandemic, the major roadblocks the community has faced in its containment and mitigation, and how physics and computing provide predictive tools that help us battle epidemics.
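
    At the core of such forecasts are compartmental dynamics; the minimal SIR sketch below illustrates the principle (real forecasting systems add mobility networks, demographics, and stochasticity). All rates are illustrative.

        # Forward-Euler SIR toy model: fractions susceptible, infectious, recovered.
        def sir(beta=0.3, gamma=0.1, s=0.999, i=0.001, r=0.0, days=200, dt=1.0):
            traj = []
            for _ in range(days):
                new_inf = beta * s * i * dt       # transmissions per step
                new_rec = gamma * i * dt          # recoveries per step
                s, i, r = s - new_inf, i + new_inf - new_rec, r + new_rec
                traj.append(i)
            return traj

        print("peak infectious fraction:", round(max(sir()), 3))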

  7. A Computational framework for telemedicine.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Foster, I.; von Laszewski, G.; Thiruvathukal, G. K.

    1998-07-01

    Emerging telemedicine applications require the ability to exploit diverse and geographically distributed resources. High-speed networks are used to integrate advanced visualization devices, sophisticated instruments, large databases, archival storage devices, PCs, workstations, and supercomputers. This form of telemedical environment is similar to networked virtual supercomputers, also known as metacomputers. Metacomputers are already being used in many scientific application areas. In this article, we analyze requirements necessary for a telemedical computing infrastructure and compare them with requirements found in a typical metacomputing environment. We will show that metacomputing environments can be used to enable a more powerful and unified computational infrastructure for telemedicine. The Globus metacomputing toolkit can provide the necessary low-level mechanisms to enable a large-scale telemedical infrastructure. The Globus toolkit components are designed in a modular fashion and can be extended to support the specific requirements for telemedicine.

  8. A survey of adaptive control technology in robotics

    NASA Technical Reports Server (NTRS)

    Tosunoglu, S.; Tesar, D.

    1987-01-01

    Previous work on the adaptive control of robotic systems is reviewed. Although the field is relatively new and does not yet represent a mature discipline, considerable attention has been given to the design of sophisticated robot controllers. Here, adaptive control methods are divided into model reference adaptive systems and self-tuning regulators with further definition of various approaches given in each class. The similarity and distinct features of the designed controllers are delineated and tabulated to enhance comparative review.

  9. Lesions of dorsal striatum eliminate lose-switch responding but not mixed-response strategies in rats.

    PubMed

    Skelin, Ivan; Hakstol, Rhys; VanOyen, Jenn; Mudiayi, Dominic; Molina, Leonardo A; Holec, Victoria; Hong, Nancy S; Euston, David R; McDonald, Robert J; Gruber, Aaron J

    2014-05-01

    We used focal brain lesions in rats to examine how dorsomedial (DMS) and dorsolateral (DLS) regions of the striatum differently contribute to response adaptation driven by the delivery or omission of rewards. Rats performed a binary choice task under two modes: one in which responses were rewarded on half of the trials regardless of choice; and another 'competitive' one in which only unpredictable choices were rewarded. In both modes, control animals were more likely to use a predictable lose-switch strategy than animals with lesions of either DMS or DLS. Animals with lesions of DMS presumably relied more on DLS for behavioural control, and generated repetitive responses in the first mode. These animals then shifted to a random response strategy in the competitive mode, thereby performing better than controls or animals with DLS lesions. Analysis using computational models of reinforcement learning indicated that animals with striatal lesions, particularly of the DLS, had blunted reward sensitivity and less stochasticity in the choice mechanism. These results provide further evidence that the rodent DLS is involved in rapid response adaptation that is more sophisticated than that embodied by the classic notion of habit formation driven by gradual stimulus-response learning. © 2014 Federation of European Neuroscience Societies and John Wiley & Sons Ltd.
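
    Models of the kind used in this analysis are typically delta-rule learners with a softmax choice rule, where a learning-rate parameter captures reward sensitivity and an inverse-temperature parameter sets choice stochasticity. The Python sketch below is a generic illustration with invented parameter values, not the authors' fitted model.

        # Two-armed delta-rule learner with softmax choice. alpha = learning rate
        # (reward sensitivity); beta = inverse temperature (choice determinism).
        import math
        import random

        def simulate_session(alpha=0.3, beta=2.0, n_trials=200, seed=0):
            random.seed(seed)
            q = [0.0, 0.0]                          # action values, left/right
            choices = []
            for _ in range(n_trials):
                p_left = 1.0 / (1.0 + math.exp(-beta * (q[0] - q[1])))
                a = 0 if random.random() < p_left else 1
                reward = 1.0 if random.random() < 0.5 else 0.0   # 50% schedule
                q[a] += alpha * (reward - q[a])     # delta-rule update
                choices.append(a)
            return choices

        print(simulate_session()[:20])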

  10. Control of a robot dinosaur

    PubMed Central

    Papantoniou, V.

    1999-01-01

    The Palaiomation Consortium, supported by the European Commission, is building a robot Iguanodon atherfieldensis for museum display that is much more sophisticated than existing animatronic exhibits. The current half-size (2.5 m) prototype is fully autonomous, carrying its own computer and batteries. It walks around the room, choosing its own path and avoiding obstacles. A bigger version with a larger repertoire of behaviours is planned. Many design problems have had to be overcome. A real dinosaur would have had hundreds of muscles, and we have had to devise means of achieving life-like movement with a much smaller number of motors; we have limited ourselves to 20, to keep the control problems manageable. Realistic stance requires a narrower trackway and a higher centre of mass than in previous (often spider-like) legged robots, making it more difficult to maintain stability. Other important differences from previous walking robots are that the forelegs have to be shorter than the hind, and the machinery has had to be designed to fit inside a realistically shaped body shell. Battery life is about one hour, but to achieve this we have had to design the robot to have very low power consumption. Currently, this limits it to unrealistically slow movement. The control system includes a high-level instructions processor, a gait generator, a motion-coordination generator, and a kinematic model.

  11. Computational models of the pulmonary circulation: Insights and the move towards clinically directed studies

    PubMed Central

    Tawhai, Merryn H.; Clark, Alys R.; Burrowes, Kelly S.

    2011-01-01

    Biophysically-based computational models provide a tool for integrating and explaining experimental data, observations, and hypotheses. Computational models of the pulmonary circulation have evolved from minimal and efficient constructs that have been used to study individual mechanisms that contribute to lung perfusion, to sophisticated multi-scale and -physics structure-based models that predict integrated structure-function relationships within a heterogeneous organ. This review considers the utility of computational models in providing new insights into the function of the pulmonary circulation, and their application in clinically motivated studies. We review mathematical and computational models of the pulmonary circulation based on their application; we begin with models that seek to answer questions in basic science and physiology and progress to models that aim to have clinical application. In looking forward, we discuss the relative merits and clinical relevance of computational models: what important features are still lacking; and how these models may ultimately be applied to further increasing our understanding of the mechanisms occurring in disease of the pulmonary circulation. PMID:22034608

  12. The Center for Computational Biology: resources, achievements, and challenges

    PubMed Central

    Dinov, Ivo D; Thompson, Paul M; Woods, Roger P; Van Horn, John D; Shattuck, David W; Parker, D Stott

    2011-01-01

    The Center for Computational Biology (CCB) is a multidisciplinary program where biomedical scientists, engineers, and clinicians work jointly to combine modern mathematical and computational techniques, to perform phenotypic and genotypic studies of biological structure, function, and physiology in health and disease. CCB has developed a computational framework built around the Manifold Atlas, an integrated biomedical computing environment that enables statistical inference on biological manifolds. These manifolds model biological structures, features, shapes, and flows, and support sophisticated morphometric and statistical analyses. The Manifold Atlas includes tools, workflows, and services for multimodal population-based modeling and analysis of biological manifolds. The broad spectrum of biomedical topics explored by CCB investigators include the study of normal and pathological brain development, maturation and aging, discovery of associations between neuroimaging and genetic biomarkers, and the modeling, analysis, and visualization of biological shape, form, and size. CCB supports a wide range of short-term and long-term collaborations with outside investigators, which drive the center's computational developments and focus the validation and dissemination of CCB resources to new areas and scientific domains. PMID:22081221

  13. The Center for Computational Biology: resources, achievements, and challenges.

    PubMed

    Toga, Arthur W; Dinov, Ivo D; Thompson, Paul M; Woods, Roger P; Van Horn, John D; Shattuck, David W; Parker, D Stott

    2012-01-01

    The Center for Computational Biology (CCB) is a multidisciplinary program where biomedical scientists, engineers, and clinicians work jointly to combine modern mathematical and computational techniques, to perform phenotypic and genotypic studies of biological structure, function, and physiology in health and disease. CCB has developed a computational framework built around the Manifold Atlas, an integrated biomedical computing environment that enables statistical inference on biological manifolds. These manifolds model biological structures, features, shapes, and flows, and support sophisticated morphometric and statistical analyses. The Manifold Atlas includes tools, workflows, and services for multimodal population-based modeling and analysis of biological manifolds. The broad spectrum of biomedical topics explored by CCB investigators include the study of normal and pathological brain development, maturation and aging, discovery of associations between neuroimaging and genetic biomarkers, and the modeling, analysis, and visualization of biological shape, form, and size. CCB supports a wide range of short-term and long-term collaborations with outside investigators, which drive the center's computational developments and focus the validation and dissemination of CCB resources to new areas and scientific domains.

  14. WE-G-16A-01: Evolution of Radiation Treatment Planning

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rothenberg, L; Mohan, R; Van Dyk, J

    Welcome and Introduction - Lawrence N. Rothenberg This symposium is one of a continuing series of presentations at AAPM Annual Meetings on the historical aspects of medical physics, radiology, and radiation oncology that have been organized by the AAPM History Committee. Information on previous presentations, including “Early Developments in Teletherapy” (Indianapolis 2013), “Historical Aspects of Cross-Sectional Imaging” (Charlotte 2012), “Historical Aspects of Brachytherapy” (Vancouver 2011), “50 Years of Women in Medical Physics” (Houston 2008), and “Roentgen's Early Investigations” (Minneapolis 2007), can be found in the Education Section of the AAPM Website. The Austin 2014 History Symposium will be on “Evolution of Radiation Treatment Planning.” Overview - Radhe Mohan Treatment planning is one of the most critical components in the chain of radiation therapy of cancers. Treatment plans of today contain a wide variety of sophisticated information conveying the potential clinical effectiveness of the designed treatment to practitioners. Examples of such information include dose distributions superimposed on three- or even four-dimensional anatomic images; dose-volume histograms; and dose, dose-volume, and dose-response indices for anatomic structures of interest. These data are used for evaluating treatment plans and for making treatment decisions. The current state of the art has evolved from the 1940s era, when the dose to the tumor and normal tissues was estimated approximately by manual means. However, the symposium will cover the history of the field from the late 1950s, when computers were first introduced for treatment planning, to the present state involving the use of high-performance computing and advanced multi-dimensional anatomic, functional, and biological imaging, focusing only on external beam treatment planning. The symposium will start with a general overview of the treatment planning process, including imaging, structure delineation, assignment of dose requirements, consideration of uncertainties, selection of beam configurations and shaping of beams, and calculation, optimization, and evaluation of dose distributions. This will be followed by three presentations covering the evolution of treatment planning, which parallels the evolution of computers, the availability of advanced volumetric imaging, and the development of novel technologies such as dynamic multi-leaf collimators and online image guidance. This evolution will be divided over three distinct periods: prior to the 1970s, the 2D era; from 1980 to the mid-1990s, the 3D era; and from the mid-1990s to today, the IMRT era. “When the World was Flat: The Two-Dimensional Radiation Therapy Era” - Jacob Van Dyk In the 2D era, anatomy was defined with the aid of solder wires, special contouring devices, and projection x-rays. Dose distributions were calculated manually from single-field, flat-surface isodoses on transparencies. Precalculated atlases of generic dose distributions were produced by the International Atomic Energy Agency. Massive time-shared mainframes and mini-computers were used to compute doses at individual points or dose distributions in a single plane. Beam shapes were generally rectangular, with wedges, missing-tissue compensators, and occasional blocks to shield critical structures. Dose calculations were measurement-based, or they used primary and scatter calculations based on scatter-air ratio methodologies.
Dose distributions were displayed on line printers as alphanumeric character maps or as isodose patterns made with pen plotters. “More than Pretty Pictures: 3D Treatment Planning and Conformal Therapy” - Benedick A. Fraass The introduction of computed tomography allowed the delineation of anatomy three-dimensionally and, supported partly by contracts from the National Cancer Institute, made possible the introduction and clinical use of 3D treatment planning, leading to the development and use of 3D conformal therapy in the 1980s. 3D computer graphics and 3D anatomical structure definitions made possible Beam's Eye View (BEV) displays, making conformal beam shaping and much more sophisticated beam arrangements possible. These conformal plans significantly improved target dose coverage as well as normal tissue sparing. The use of dose-volume histograms, gross/clinical/planning target volumes, MRI and PET imaging, multileaf collimators, and computer-controlled treatment delivery made sophisticated planning approaches practical. The significant improvements in dose distributions and analysis achievable with 3D conformal therapy made possible formal dose escalation and normal tissue tolerance clinical studies that set new expectations for improved local control and decreasing complications in many clinical sites. “From the Art to the State of the Art: Inverse Planning and IMRT” - Thomas R. Bortfeld While the potential of intensity modulation was recognized in the mid-1980s, intensity-modulated radiotherapy (IMRT) did not become a reality until the mid-1990s. Broad beams of photons could be sub-divided into narrow beamlets whose intensities could be determined using sophisticated optimization algorithms to appropriately balance tumor dose with normal tissue sparing. The development of dynamic multi-leaf collimators (on conventional linear accelerators as well as in helical delivery devices) enabled the efficient delivery of IMRT. The evolution of IMRT planning is continuing in the form of Volumetric Modulated Arc Therapy (VMAT) and through advanced optimization tools, such as multi-criteria optimization, automated IMRT planning, and robust optimization to protect dose distributions against uncertainties. IMRT also facilitates “dose painting,” in which different sub-volumes of the target are prescribed different doses. Clearly, these advancements are being made possible by the increasing power and lower cost of computers and by developments in other fields such as imaging and operations research. Summary - Radhe Mohan The history does not end here. The advancement of treatment planning is expected to continue, leading to further automation and improvements in the conformality and robustness of dose distributions, particularly in the area of particle therapy. Radiobiological modeling will gain emphasis as part of the planning process. Learning Objectives: (1) The scope of changes in technology and the capabilities of radiation treatment planning. (2) The impact of these changes on the quality of treatment plans and the optimality of dose distributions. (3) The impact of developments in other fields (imaging, computers, operations research, etc.) on the evolution of radiation treatment planning.
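
    The inverse-planning step described above can be caricatured as a constrained least-squares problem: choose non-negative beamlet intensities so the delivered dose approaches the prescription. The sketch below uses a fabricated dose-influence matrix; clinical systems use far larger models and richer objectives.

        # Toy inverse planning: non-negative least squares for beamlet weights.
        import numpy as np
        from scipy.optimize import nnls

        D = np.array([[0.9, 0.1, 0.0],       # rows: voxels, cols: beamlets
                      [0.4, 0.8, 0.2],
                      [0.0, 0.3, 0.7],
                      [0.1, 0.1, 0.1]])      # last row: a normal-tissue voxel
        d = np.array([60.0, 60.0, 60.0, 10.0])   # prescription (Gy): target vs sparing

        x, residual = nnls(D, d)             # beamlet intensities, x >= 0
        print("beamlet weights:", x, "residual:", residual)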

  15. Artificial Intelligence-Assisted Online Social Therapy for Youth Mental Health

    PubMed Central

    D'Alfonso, Simon; Santesteban-Echarri, Olga; Rice, Simon; Wadley, Greg; Lederman, Reeva; Miles, Christopher; Gleeson, John; Alvarez-Jimenez, Mario

    2017-01-01

    Introduction: Benefits from mental health early interventions may not be sustained over time, and longer-term intervention programs may be required to maintain early clinical gains. However, due to the high intensity of face-to-face early intervention treatments, this may not be feasible. Adjunctive internet-based interventions specifically designed for youth may provide a cost-effective and engaging alternative to prevent loss of intervention benefits. However, until now online interventions have relied on human moderators to deliver therapeutic content. More sophisticated models responsive to user data are critical to inform tailored online therapy. Thus, integration of user experience with a sophisticated and cutting-edge technology to deliver content is necessary to redefine online interventions in youth mental health. This paper discusses the development of the moderated online social therapy (MOST) web application, which provides an interactive social media-based platform for recovery in mental health. We provide an overview of the system's main features and discuss our current work regarding the incorporation of advanced computational and artificial intelligence methods to enhance user engagement and improve the discovery and delivery of therapy content. Methods: Our case study is the ongoing Horyzons site (5-year randomized controlled trial for youth recovering from early psychosis), which is powered by MOST. We outline the motivation underlying the project and the web application's foundational features and interface. We discuss system innovations, including the incorporation of pertinent usage patterns as well as identifying certain limitations of the system. This leads to our current motivations and focus on using computational and artificial intelligence methods to enhance user engagement, and to further improve the system with novel mechanisms for the delivery of therapy content to users. In particular, we cover our usage of natural language analysis and chatbot technologies as strategies to tailor interventions and scale up the system. Conclusions: To date, the innovative MOST system has demonstrated viability in a series of clinical research trials. Given the data-driven opportunities afforded by the software system, observed usage patterns, and the aim to deploy it on a greater scale, an important next step in its evolution is the incorporation of advanced and automated content delivery mechanisms. PMID:28626431

  16. Artificial Intelligence-Assisted Online Social Therapy for Youth Mental Health.

    PubMed

    D'Alfonso, Simon; Santesteban-Echarri, Olga; Rice, Simon; Wadley, Greg; Lederman, Reeva; Miles, Christopher; Gleeson, John; Alvarez-Jimenez, Mario

    2017-01-01

    Introduction: Benefits from mental health early interventions may not be sustained over time, and longer-term intervention programs may be required to maintain early clinical gains. However, due to the high intensity of face-to-face early intervention treatments, this may not be feasible. Adjunctive internet-based interventions specifically designed for youth may provide a cost-effective and engaging alternative to prevent loss of intervention benefits. However, until now online interventions have relied on human moderators to deliver therapeutic content. More sophisticated models responsive to user data are critical to inform tailored online therapy. Thus, integration of user experience with a sophisticated and cutting-edge technology to deliver content is necessary to redefine online interventions in youth mental health. This paper discusses the development of the moderated online social therapy (MOST) web application, which provides an interactive social media-based platform for recovery in mental health. We provide an overview of the system's main features and discuss our current work regarding the incorporation of advanced computational and artificial intelligence methods to enhance user engagement and improve the discovery and delivery of therapy content. Methods: Our case study is the ongoing Horyzons site (5-year randomized controlled trial for youth recovering from early psychosis), which is powered by MOST. We outline the motivation underlying the project and the web application's foundational features and interface. We discuss system innovations, including the incorporation of pertinent usage patterns as well as identifying certain limitations of the system. This leads to our current motivations and focus on using computational and artificial intelligence methods to enhance user engagement, and to further improve the system with novel mechanisms for the delivery of therapy content to users. In particular, we cover our usage of natural language analysis and chatbot technologies as strategies to tailor interventions and scale up the system. Conclusions: To date, the innovative MOST system has demonstrated viability in a series of clinical research trials. Given the data-driven opportunities afforded by the software system, observed usage patterns, and the aim to deploy it on a greater scale, an important next step in its evolution is the incorporation of advanced and automated content delivery mechanisms.

  17. Grid site availability evaluation and monitoring at CMS

    DOE PAGES

    Lyons, Gaston; Maciulaitis, Rokas; Bagliesi, Giuseppe; ...

    2017-10-01

    The Compact Muon Solenoid (CMS) experiment at the Large Hadron Collider (LHC) uses distributed grid computing to store, process, and analyse the vast quantity of scientific data recorded every year. The computing resources are grouped into sites and organized in a tiered structure. Each site provides computing and storage to the CMS computing grid. Over a hundred sites worldwide contribute resources ranging from hundreds to well over ten thousand computing cores and storage from tens of TBytes to tens of PBytes. In such a large computing setup, scheduled and unscheduled outages occur continually and are not allowed to significantly impact data handling, processing, and analysis. Unscheduled capacity and performance reductions need to be detected promptly and corrected. CMS developed a sophisticated site evaluation and monitoring system for Run 1 of the LHC based on tools of the Worldwide LHC Computing Grid. For Run 2 of the LHC the site evaluation and monitoring system is being overhauled to enable faster detection of and reaction to failures and a more dynamic handling of computing resources. Furthermore, enhancements to better distinguish site from central service issues and to make evaluations more transparent and informative to site support staff are planned.

  18. Grid site availability evaluation and monitoring at CMS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lyons, Gaston; Maciulaitis, Rokas; Bagliesi, Giuseppe

    The Compact Muon Solenoid (CMS) experiment at the Large Hadron Collider (LHC) uses distributed grid computing to store, process, and analyse the vast quantity of scientific data recorded every year. The computing resources are grouped into sites and organized in a tiered structure. Each site provides computing and storage to the CMS computing grid. Over a hundred sites worldwide contribute resources ranging from hundreds to well over ten thousand computing cores and storage from tens of TBytes to tens of PBytes. In such a large computing setup, scheduled and unscheduled outages occur continually and are not allowed to significantly impact data handling, processing, and analysis. Unscheduled capacity and performance reductions need to be detected promptly and corrected. CMS developed a sophisticated site evaluation and monitoring system for Run 1 of the LHC based on tools of the Worldwide LHC Computing Grid. For Run 2 of the LHC the site evaluation and monitoring system is being overhauled to enable faster detection of and reaction to failures and a more dynamic handling of computing resources. Furthermore, enhancements to better distinguish site from central service issues and to make evaluations more transparent and informative to site support staff are planned.

  19. Grid site availability evaluation and monitoring at CMS

    NASA Astrophysics Data System (ADS)

    Lyons, Gaston; Maciulaitis, Rokas; Bagliesi, Giuseppe; Lammel, Stephan; Sciabà, Andrea

    2017-10-01

    The Compact Muon Solenoid (CMS) experiment at the Large Hadron Collider (LHC) uses distributed grid computing to store, process, and analyse the vast quantity of scientific data recorded every year. The computing resources are grouped into sites and organized in a tiered structure. Each site provides computing and storage to the CMS computing grid. Over a hundred sites worldwide contribute resources ranging from hundreds to well over ten thousand computing cores and storage from tens of TBytes to tens of PBytes. In such a large computing setup, scheduled and unscheduled outages occur continually and are not allowed to significantly impact data handling, processing, and analysis. Unscheduled capacity and performance reductions need to be detected promptly and corrected. CMS developed a sophisticated site evaluation and monitoring system for Run 1 of the LHC based on tools of the Worldwide LHC Computing Grid. For Run 2 of the LHC the site evaluation and monitoring system is being overhauled to enable faster detection of and reaction to failures and a more dynamic handling of computing resources. Enhancements to better distinguish site from central service issues and to make evaluations more transparent and informative to site support staff are planned.

  20. Bio-inspired nano-sensor-enhanced CNN visual computer.

    PubMed

    Porod, Wolfgang; Werblin, Frank; Chua, Leon O; Roska, Tamas; Rodriguez-Vazquez, Angel; Roska, Botond; Fay, Patrick; Bernstein, Gary H; Huang, Yih-Fang; Csurgay, Arpad I

    2004-05-01

    Nanotechnology opens new ways to utilize recent discoveries in biological image processing by translating the underlying functional concepts into the design of CNN (cellular neural/nonlinear network)-based systems incorporating nanoelectronic devices. There is a natural intersection joining studies of retinal processing, spatio-temporal nonlinear dynamics embodied in CNN, and the possibility of miniaturizing the technology through nanotechnology. This intersection serves as the springboard for our multidisciplinary project. Biological feature and motion detectors map directly into the spatio-temporal dynamics of CNN for target recognition, image stabilization, and tracking. The neural interactions underlying color processing will drive the development of nanoscale multispectral sensor arrays for image fusion. Implementing such nanoscale sensors on a CNN platform will allow the implementation of device feedback control, a hallmark of biological sensory systems. These biologically inspired CNN subroutines are incorporated into the new world of analog-and-logic algorithms and software, containing also many other active-wave computing mechanisms, including nature-inspired (physics and chemistry) as well as PDE-based sophisticated spatio-temporal algorithms. Our goal is to design and develop several miniature prototype devices for target detection, navigation, tracking, and robotics. This paper presents an example illustrating the synergies emerging from the convergence of nanotechnology, biotechnology, and information and cognitive science.

  1. CombiROC: an interactive web tool for selecting accurate marker combinations of omics data.

    PubMed

    Mazzara, Saveria; Rossi, Riccardo L; Grifantini, Renata; Donizetti, Simone; Abrignani, Sergio; Bombaci, Mauro

    2017-03-30

    Diagnostic accuracy can be improved considerably by combining multiple markers, whose performance in identifying diseased subjects is usually assessed via receiver operating characteristic (ROC) curves. The selection of multimarker signatures is a complicated process that requires integration of data signatures with sophisticated statistical methods. We developed a user-friendly tool, called CombiROC, to help researchers accurately determine optimal marker combinations from diverse omics methods. With CombiROC, data from different domains, such as proteomics and transcriptomics, can be analyzed using sensitivity/specificity filters: the number of candidate marker panels arising from combinatorial analysis is easily optimized, bypassing limitations imposed by the nature of different experimental approaches. Leaving the user full control over initial selection stringency, CombiROC computes sensitivity and specificity for all marker combinations, the performance of the best combinations, and ROC curves for automatic comparisons, all visualized in a graphic interface. CombiROC was designed without hard-coded thresholds, allowing a custom fit to each specific dataset: this dramatically reduces the computational burden and lowers the false negative rates given by fixed thresholds. The application was validated with published data, confirming the marker combinations originally described or even finding new ones. CombiROC is a novel tool for the scientific community freely available at http://CombiROC.eu.
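
    The combinatorial core of such a tool can be sketched as follows: for every marker combination, a sample is called positive when enough markers exceed their thresholds, and sensitivity and specificity are computed per combination. The data, thresholds, and min-hits rule below are invented, not CombiROC's actual defaults.

        # Enumerate marker combinations and score each by sensitivity/specificity.
        from itertools import combinations
        import numpy as np

        rng = np.random.default_rng(1)
        X = rng.normal(size=(100, 4))            # 100 samples x 4 markers
        y = rng.integers(0, 2, size=100)         # 1 = diseased, 0 = healthy
        X[y == 1] += 1.0                         # diseased samples score higher
        thresholds = X.mean(axis=0)              # toy per-marker cutoffs

        def sens_spec(markers, min_hits=1):
            cols = list(markers)
            hits = (X[:, cols] > thresholds[cols]).sum(axis=1)
            pred = hits >= min_hits              # combination calls sample positive
            sens = (pred & (y == 1)).sum() / (y == 1).sum()
            spec = (~pred & (y == 0)).sum() / (y == 0).sum()
            return sens, spec

        for k in (1, 2, 3, 4):
            for combo in combinations(range(4), k):
                print(combo, sens_spec(combo))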

  2. Optimizing radiotherapy protocols using computer automata to model tumour cell death as a function of oxygen diffusion processes.

    PubMed

    Paul-Gilloteaux, Perrine; Potiron, Vincent; Delpon, Grégory; Supiot, Stéphane; Chiavassa, Sophie; Paris, François; Costes, Sylvain V

    2017-05-23

    The concept of hypofractionation is gaining momentum in radiation oncology centres, enabled by recent advances in radiotherapy apparatus. The gain of efficacy of this innovative treatment must be defined. We present a computer model based on translational murine data for in silico testing and optimization of various radiotherapy protocols with respect to tumour resistance and the microenvironment heterogeneity. This model combines automata approaches with image processing algorithms to simulate the cellular response of tumours exposed to ionizing radiation, modelling the alteration of oxygen permeabilization in blood vessels against repeated doses, and introducing mitotic catastrophe (as opposed to arbitrary delayed cell-death) as a means of modelling radiation-induced cell death. Published data describing cell death in vitro as well as tumour oxygenation in vivo are used to inform parameters. Our model is validated by comparing simulations to in vivo data obtained from the radiation treatment of mice transplanted with human prostate tumours. We then predict the efficacy of untested hypofractionation protocols, hypothesizing that tumour control can be optimized by adjusting daily radiation dosage as a function of the degree of hypoxia in the tumour environment. Further biological refinement of this tool will permit the rapid development of more sophisticated strategies for radiotherapy.
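
    A toy version of the modelling idea, with invented grid size, oxygenation field, and radiosensitivity: each cell's probability of being killed by a dose fraction scales with local oxygen, so hypoxic regions survive longer under a fractionated schedule.

        # Cellular-automaton caricature of oxygen-dependent radiation response.
        import numpy as np

        rng = np.random.default_rng(2)
        alive = np.ones((50, 50), dtype=bool)               # tumour cell grid
        oxygen = rng.uniform(0.2, 1.0, size=(50, 50))       # 1.0 = well oxygenated

        def fraction(dose_gy, alpha=0.3):
            """Apply one radiation fraction; hypoxic cells are more resistant."""
            global alive
            p_kill = 1.0 - np.exp(-alpha * dose_gy * oxygen)   # oxygen enhancement
            alive &= rng.random(alive.shape) >= p_kill

        for day in range(5):
            fraction(2.0)                                   # a 5 x 2 Gy schedule
            print("day", day + 1, "surviving cells:", alive.sum())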

  3. Data management in NOAA

    NASA Technical Reports Server (NTRS)

    Callicott, William M.

    1992-01-01

    NOAA has 11 terabytes of digital data stored on 240,000 computer tapes. There are an additional 100 terabytes (TB) of geostationary satellite data stored in digital form on specially configured SONY U-Matic video tapes at the University of Wisconsin. There are over 90,000,000 records in non-digital form (manuscript, film, printed, and chart) which are not easily accessible. The three NOAA Data Centers service 6,000 requests per year and publish 5,000 bulletins which are distributed to 40,000 subscribers. Seventeen CD-ROMs have been produced. Thirty thousand computer tapes containing polar satellite data are being copied to 12-inch WORM optical disks for research applications. The present annual data accumulation rate of 10 TB will grow to 30 TB in 1994 and to 100 TB by the year 2000. The present storage and distribution technologies, with their attendant support systems, will be overwhelmed by these increases if not improved. Increased user sophistication coupled with more precise measurement technologies will demand better quality control mechanisms, especially for those data maintained in an indefinite archive. There is optimism that the future will offer improved media technologies to accommodate the volumes of data. With the advanced technologies, storage and performance monitoring tools will be pivotal to the successful long-term management of data and information.

  4. Three-dimensional measurement of small inner surface profiles using feature-based 3-D panoramic registration

    PubMed Central

    Gong, Yuanzheng; Seibel, Eric J.

    2017-01-01

    Rapid development in the performance of sophisticated optical components, digital image sensors, and computer abilities along with decreasing costs has enabled three-dimensional (3-D) optical measurement to replace more traditional methods in manufacturing and quality control. The advantages of 3-D optical measurement, such as noncontact, high accuracy, rapid operation, and the ability for automation, are extremely valuable for inline manufacturing. However, most current optical approaches are suited to exterior rather than internal surfaces of machined parts. A 3-D optical measurement approach is proposed based on machine vision for the 3-D profile measurement of tiny complex internal surfaces, such as internally threaded holes. To capture the full topographic extent (peak to valley) of threads, a side-view commercial rigid scope is used to collect images at known camera positions and orientations. A 3-D point cloud is generated with multiview stereo vision using linear motion of the test piece, and the process is repeated after a rotation to form additional point clouds. Registration of these point clouds into a complete reconstruction uses a proposed automated feature-based 3-D registration algorithm. The resulting 3-D reconstruction is compared with x-ray computed tomography to validate the feasibility of our proposed method for future robotically driven industrial 3-D inspection. PMID:28286351
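
    The registration step ultimately reduces to estimating a rigid transform between point sets. Below is a minimal sketch of the closed-form rigid-alignment (Kabsch/SVD) step found inside many feature-based registration pipelines; it assumes correspondences are already matched and is not the authors' full algorithm:

    ```python
    # Rigid alignment of matched 3-D point sets via the Kabsch/SVD method.
    import numpy as np

    def rigid_align(src, dst):
        """Return R, t minimizing ||R @ src_i + t - dst_i|| for Nx3 arrays."""
        c_src, c_dst = src.mean(axis=0), dst.mean(axis=0)
        H = (src - c_src).T @ (dst - c_dst)        # 3x3 cross-covariance
        U, _, Vt = np.linalg.svd(H)
        d = np.sign(np.linalg.det(Vt.T @ U.T))     # guard against reflections
        R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
        t = c_dst - R @ c_src
        return R, t

    # quick self-check on synthetic data
    src = np.random.default_rng(2).normal(size=(50, 3))
    th = np.radians(30)
    R_true = np.array([[np.cos(th), -np.sin(th), 0.0],
                       [np.sin(th),  np.cos(th), 0.0],
                       [0.0, 0.0, 1.0]])
    dst = src @ R_true.T + np.array([0.1, -0.2, 0.3])
    R, t = rigid_align(src, dst)
    print(np.allclose(R, R_true), np.round(t, 3))
    ```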

  5. Three-dimensional measurement of small inner surface profiles using feature-based 3-D panoramic registration

    NASA Astrophysics Data System (ADS)

    Gong, Yuanzheng; Seibel, Eric J.

    2017-01-01

    Rapid development in the performance of sophisticated optical components, digital image sensors, and computer abilities along with decreasing costs has enabled three-dimensional (3-D) optical measurement to replace more traditional methods in manufacturing and quality control. The advantages of 3-D optical measurement, such as noncontact, high accuracy, rapid operation, and the ability for automation, are extremely valuable for inline manufacturing. However, most current optical approaches are suited to exterior rather than internal surfaces of machined parts. A 3-D optical measurement approach is proposed based on machine vision for the 3-D profile measurement of tiny complex internal surfaces, such as internally threaded holes. To capture the full topographic extent (peak to valley) of threads, a side-view commercial rigid scope is used to collect images at known camera positions and orientations. A 3-D point cloud is generated with multiview stereo vision using linear motion of the test piece, and the process is repeated after a rotation to form additional point clouds. Registration of these point clouds into a complete reconstruction uses a proposed automated feature-based 3-D registration algorithm. The resulting 3-D reconstruction is compared with x-ray computed tomography to validate the feasibility of our proposed method for future robotically driven industrial 3-D inspection.

  6. Using Ada: The deeper challenges

    NASA Technical Reports Server (NTRS)

    Feinberg, David A.

    1986-01-01

    The Ada programming language and the associated Ada Programming Support Environment (APSE) and Ada Run Time Environment (ARTE) provide the potential for significant life-cycle cost reductions in computer software development and maintenance activities. The Ada programming language itself is standardized, trademarked, and controlled via formal validation procedures. Though compilers are not yet as production-ready as most would desire, the technology for constructing them is sufficiently well known and understood that time and money should suffice to correct current deficiencies. The APSE and ARTE are, on the other hand, significantly newer issues within most software development and maintenance efforts. Currently, APSE and ARTE are highly dependent on differing implementer concepts, strategies, and market objectives. Complex and sophisticated mission-critical computing systems require the use of a complete Ada-based capability, not just the programming language itself; yet the range of APSE and ARTE features which must actually be utilized can vary significantly from one system to another. As a consequence, the need to understand, objectively evaluate, and select differing APSE and ARTE capabilities and features is critical to the effective use of Ada and the life-cycle efficiencies it is intended to promote. It is the selection, collection, and understanding of APSE and ARTE which provide the deeper challenges of using Ada for real-life mission-critical computing systems. Some of the current issues which must be clarified, often on a case-by-case basis, in order to successfully realize the full capabilities of Ada are discussed.

  7. Structural optimization of 3D-printed synthetic spider webs for high strength

    PubMed Central

    Qin, Zhao; Compton, Brett G.; Lewis, Jennifer A.; Buehler, Markus J.

    2015-01-01

    Spiders spin intricate webs that serve as sophisticated prey-trapping architectures that simultaneously exhibit high strength, elasticity and graceful failure. To determine how web mechanics are controlled by their topological design and material distribution, here we create spider-web mimics composed of elastomeric filaments. Specifically, computational modelling and microscale 3D printing are combined to investigate the mechanical response of elastomeric webs under multiple loading conditions. We find the existence of an asymptotic prey size that leads to a saturated web strength. We identify pathways to design elastomeric material structures with maximum strength, low density and adaptability. We show that the loading type dictates the optimal material distribution: a homogeneous distribution is better for localized loading, while stronger radial threads combined with weaker spiral threads are better for distributed loading. Our observations reveal that the material distribution within spider webs is dictated by the loading condition, shedding light on their observed architectural variations. PMID:25975372

  8. Simulation and animation of sensor-driven robots.

    PubMed

    Chen, C; Trivedi, M M; Bidlack, C R

    1994-10-01

    Most simulation and animation systems utilized in robotics are concerned with simulation of the robot and its environment without simulation of sensors. These systems have difficulty handling robots that utilize sensory feedback in their operation. In this paper, a new design of an environment for simulation, animation, and visualization of sensor-driven robots is presented. As sensor technology advances, increasing numbers of robots are equipped with various types of sophisticated sensors. The main goal of creating the visualization environment is to aid the automatic robot programming and off-line programming capabilities of sensor-driven robots. The software system helps users visualize the motion and reaction of the sensor-driven robot under their control program. Therefore, the efficiency of software development is increased, the reliability of the software and the operational safety of the robot are ensured, and the cost of new software development is reduced. Conventional computer-graphics-based robot simulation and animation software packages lack capabilities for robot sensing simulation; this paper describes a system designed to overcome that deficiency.
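
    A hedged sketch of the core idea, simulating the sensor alongside the robot and closing the loop through a user control program; the kinematics, sensor model, and gains below are toy stand-ins, not the system described in the paper:

    ```python
    # Closed-loop simulation of a robot driven by a simulated range sensor.
    def range_sensor(robot_x, obstacle_x):
        return max(obstacle_x - robot_x, 0.0)       # simulated reading (m)

    def control_program(distance, target=0.5, k=0.8):
        return k * (distance - target)              # proportional speed (m/s)

    robot_x, obstacle_x, dt = 0.0, 5.0, 0.1
    for step in range(200):
        d = range_sensor(robot_x, obstacle_x)       # sensing is simulated too
        v = control_program(d)                      # feedback drives the robot
        robot_x += v * dt
        if abs(d - 0.5) < 0.01:                     # settled near the target
            break
    print(f"stopped {d:.2f} m from the obstacle after {step} steps")
    ```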

  9. Designer cell signal processing circuits for biotechnology

    PubMed Central

    Bradley, Robert W.; Wang, Baojun

    2015-01-01

    Microorganisms are able to respond effectively to diverse signals from their environment and internal metabolism owing to their inherently sophisticated information-processing capacity. A central aim of synthetic biology is to control and reprogramme the signal processing pathways within living cells so as to realise repurposed, beneficial applications ranging from disease diagnosis and environmental sensing to chemical bioproduction. To date, most examples of synthetic biological signal processing have been built based on digital information flow, though analogue computing is being developed to cope with more complex operations and larger sets of variables. Great progress has been made in expanding the categories of characterised biological components that can be used for cellular signal manipulation, thereby allowing synthetic biologists to more rationally programme increasingly complex behaviours into living cells. Here we present a current overview of the components and strategies that exist for designer cell signal processing and decision making, discuss how these have been implemented in prototype systems for therapeutic, environmental, and industrial biotechnological applications, and examine emerging challenges in this promising field. PMID:25579192

  10. A reconfigurable visual-programming library for real-time closed-loop cellular electrophysiology

    PubMed Central

    Biró, István; Giugliano, Michele

    2015-01-01

    Most of the software platforms for cellular electrophysiology are limited in terms of flexibility, hardware support, ease of use, or re-configuration and adaptation for non-expert users. Moreover, advanced experimental protocols requiring real-time closed-loop operation to investigate excitability, plasticity, and dynamics are largely inaccessible to users without moderate to substantial computer proficiency. Here we present an approach based on MATLAB/Simulink, exploiting the benefits of LEGO-like visual programming and configuration combined with a small but easily extendible library of functional software components. We provide and validate several examples, implementing conventional and more sophisticated experimental protocols such as dynamic clamp or the combined use of intracellular and extracellular methods, involving closed-loop real-time control. The functionality of each of these examples is demonstrated with relevant experiments. These can be used as a starting point to create and support a larger variety of electrophysiological tools and methods, hopefully extending the range of default techniques and protocols currently employed in experimental labs across the world. PMID:26157385
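
    The essence of a dynamic-clamp protocol of the kind mentioned above is a per-cycle conductance injection. A minimal sketch follows, with the DAQ read/write calls replaced by stand-in functions; the paper implements such loops in Simulink on real-time hardware, and the conductance values here are assumptions:

    ```python
    # One dynamic-clamp cycle: read Vm, compute a virtual conductance
    # current, command it back. Units: nS * mV = pA.
    E_REV = 0.0     # mV, reversal potential of the virtual conductance
    G_SYN = 5.0     # nS, virtual synaptic conductance

    def dynamic_clamp_step(read_vm_mv, write_current_pa):
        vm = read_vm_mv()                  # sample membrane potential (mV)
        i_cmd = -G_SYN * (vm - E_REV)      # injected current (pA)
        write_current_pa(i_cmd)            # command within the same cycle
        return i_cmd

    for vm_sample in (-65.0, -60.0, -50.0):            # fake readings (mV)
        i = dynamic_clamp_step(lambda: vm_sample, lambda i_pa: None)
        print(f"Vm = {vm_sample} mV -> inject {i:.1f} pA")
    ```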

  11. Automated Blazar Light Curves Using Machine Learning

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Johnson, Spencer James

    Every night in a remote clearing called Fenton Hill, high in the Jemez Mountains of central New Mexico, a bank of robotically controlled telescopes tilts its lenses to the sky for another round of observation through digital imaging. Los Alamos National Laboratory’s Thinking Telescopes project is watching for celestial transients, including high-power cosmic flashes called gamma-ray bursts, and like all science, it can be messy work. To keep the project clicking along, Los Alamos scientists routinely install equipment upgrades, maintain the site, and refine the sophisticated machine-learning computer programs that process those images and extract useful data from them. Each week the system amasses 100,000 digital images of the heavens, some of which are compromised by clouds, wind gusts, focus problems, and so on. For a graduate student at the Lab taking a year’s break between master’s and Ph.D. studies, working with state-of-the-art autonomous telescopes that can make fundamental discoveries feels light years beyond the classroom.

  12. Evolution of cognitive-behavioral therapy for eating disorders.

    PubMed

    Agras, W Stewart; Fitzsimmons-Craft, Ellen E; Wilfley, Denise E

    2017-01-01

    The evolution of cognitive-behavioral therapy (CBT) for the treatment of bulimic disorders is described in this review. The impacts of successive attempts to enhance CBT, such as the addition of exposure and response prevention, the development of enhanced CBT, and the broadening of treatment from bulimia nervosa to binge eating disorder, are considered. In addition to developing advanced forms of CBT, shortening treatment to guided self-help was the first step in broadening access to treatment. The use of technology, such as computer-based therapy and, more recently, the Internet, promises further broadening of access to self-help and to therapist-guided treatment. Controlled studies in this area are reviewed, and the balance of risks and benefits that accompany the use of technology and lessened therapist input is considered. Looking into the future, more sophisticated forms of treatment delivered as mobile applications ("apps") may lead to more personalized and efficacious treatments for bulimic disorders, thus enhancing the delivery of treatments for eating disorders. Copyright © 2016 Elsevier Ltd. All rights reserved.

  13. Is Life Unique?

    PubMed Central

    Abel, David L.

    2011-01-01

    Is life physicochemically unique? No. Is life unique? Yes. Life manifests innumerable formalisms that cannot be generated or explained by physicodynamics alone. Life pursues thousands of biofunctional goals, not the least of which is staying alive. Neither physicodynamics, nor evolution, pursue goals. Life is largely directed by linear digital programming and by the Prescriptive Information (PI) instantiated particularly into physicodynamically indeterminate nucleotide sequencing. Epigenomic controls only compound the sophistication of these formalisms. Life employs representationalism through the use of symbol systems. Life manifests autonomy, homeostasis far from equilibrium in the harshest of environments, positive and negative feedback mechanisms, prevention and correction of its own errors, and organization of its components into Sustained Functional Systems (SFS). Chance and necessity—heat agitation and the cause-and-effect determinism of nature’s orderliness—cannot spawn formalisms such as mathematics, language, symbol systems, coding, decoding, logic, organization (not to be confused with mere self-ordering), integration of circuits, computational success, and the pursuit of functionality. All of these characteristics of life are formal, not physical. PMID:25382119

  14. Red, purple and pink: the colors of diffusion on pinterest.

    PubMed

    Bakhshi, Saeideh; Gilbert, Eric

    2015-01-01

    Many lab studies have shown that colors can evoke powerful emotions and impact human behavior. Might these phenomena drive how we act online? A key research challenge for image-sharing communities is uncovering the mechanisms by which content spreads through the community. In this paper, we investigate whether there is a link between color and diffusion. Drawing on a corpus of one million images crawled from Pinterest, we find that color significantly impacts the diffusion of images and adoption of content on image-sharing communities such as Pinterest, even after partially controlling for network structure and activity. Specifically, red, purple, and pink seem to promote diffusion, while green, blue, black, and yellow suppress it. To our knowledge, our study is the first to investigate how colors relate to online user behavior. In addition to contributing to the research conversation surrounding diffusion, these findings suggest future work using sophisticated computer vision techniques. We conclude with a discussion of the theoretical, practical, and design implications suggested by this work, e.g., the design of engaging image filters.

  15. Game Theory of Mind

    PubMed Central

    Yoshida, Wako; Dolan, Ray J.; Friston, Karl J.

    2008-01-01

    This paper introduces a model of ‘theory of mind’, namely, how we represent the intentions and goals of others to optimise our mutual interactions. We draw on ideas from optimum control and game theory to provide a ‘game theory of mind’. First, we consider the representations of goals in terms of value functions that are prescribed by utility or rewards. Critically, the joint value functions and ensuing behaviour are optimised recursively, under the assumption that I represent your value function, your representation of mine, your representation of my representation of yours, and so on ad infinitum. However, if we assume that the degree of recursion is bounded, then players need to estimate the opponent's degree of recursion (i.e., sophistication) to respond optimally. This induces a problem of inferring the opponent's sophistication, given behavioural exchanges. We show it is possible to deduce whether players make inferences about each other and quantify their sophistication on the basis of choices in sequential games. This rests on comparing generative models of choices with, and without, inference. Model comparison is demonstrated using simulated and real data from a ‘stag-hunt’. Finally, we note that exactly the same sophisticated behaviour can be achieved by optimising the utility function itself (through prosocial utility), producing unsophisticated but apparently altruistic agents. This may be relevant ethologically in hierarchal game theory and coevolution. PMID:19112488
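
    A toy sketch of bounded-recursion ("level-k") responding in a stag hunt, loosely mirroring the recursive opponent modelling described above; the payoffs, the deterministic best response, and the random level-0 policy are illustrative assumptions rather than the paper's generative model:

    ```python
    # Level-k reasoning in a 2x2 stag hunt (actions: 0 = stag, 1 = hare).
    import numpy as np

    PAYOFF = np.array([[4.0, 0.0],      # row player's payoff matrix
                       [3.0, 3.0]])

    def action_probs(level):
        if level == 0:
            return np.array([0.5, 0.5])          # unsophisticated: random
        opponent = action_probs(level - 1)       # model the other player
        expected = PAYOFF @ opponent             # expected payoff per action
        probs = np.zeros(2)
        probs[np.argmax(expected)] = 1.0         # deterministic best response
        return probs

    for k in range(4):
        print(f"level {k}: P(stag, hare) = {action_probs(k)}")
    ```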

  16. System analysis tools for an ELT at ESO

    NASA Astrophysics Data System (ADS)

    Mueller, Michael; Koch, Franz

    2006-06-01

    Engineering of complex, large-scale systems like the ELT designs currently investigated and developed in Europe and North America requires powerful and sophisticated tools within specific technical disciplines such as mechanics, optics and control engineering. However, even analyzing a certain component of the telescope like the telescope structure necessitates a system approach to evaluate the structural effects on the optical performance. This paper shows several software tools developed by the European Southern Observatory (ESO) which focus on the system approach in the analyses: Using modal results of a finite element analysis, the SMI-toolbox allows an easy generation of structural models with different sizes and levels of accuracy for the control design and closed-loop simulations. The optical modeling code BeamWarrior was developed by ESO and Astrium GmbH (Germany) especially for integrated modeling and for interfacing with a structural model. Within BeamWarrior, displacements and deformations can be applied in an arbitrary coordinate system, and hence also in the global coordinates of the FE model, avoiding error-prone transformations. In addition to this, a sparse state-space model object was developed for Matlab to gain computational efficiency and reduce memory requirements, exploiting the sparsity pattern of both the structural models and the control architecture. As a result, these tools allow building an integrated model in order to reliably simulate interactions, cross-coupling effects, and system responses, and to evaluate global performance. In order to evaluate disturbance effects on the optical performance in open loop more efficiently, an optical evaluation toolbox was built in the FE software ANSYS which performs Zernike decomposition and best-fit computation of the deformations directly in the FE analysis.
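
    The Zernike-decomposition step mentioned above amounts to a linear least-squares projection of the sampled deformation onto a Zernike basis. A small sketch with a hand-coded low-order subset of the basis follows (a production tool would use a full, properly ordered basis; the data are synthetic):

    ```python
    # Least-squares Zernike decomposition of a deformation sampled on the
    # unit pupil (toy data; five low-order Noll-normalized terms).
    import numpy as np

    def zernike_basis(rho, theta):
        return np.column_stack([
            np.ones_like(rho),                        # piston
            2 * rho * np.cos(theta),                  # tilt x
            2 * rho * np.sin(theta),                  # tilt y
            np.sqrt(3) * (2 * rho**2 - 1),            # defocus
            np.sqrt(6) * rho**2 * np.cos(2 * theta),  # astigmatism
        ])

    rng = np.random.default_rng(1)
    rho = np.sqrt(rng.uniform(0, 1, 2000))            # uniform over the disk
    theta = rng.uniform(0, 2 * np.pi, 2000)
    # synthetic "deformation": 0.3 waves defocus + 0.05 waves x-tilt
    w = 0.3 * np.sqrt(3) * (2 * rho**2 - 1) + 0.05 * 2 * rho * np.cos(theta)

    coeffs, *_ = np.linalg.lstsq(zernike_basis(rho, theta), w, rcond=None)
    print(np.round(coeffs, 3))   # recovers ~[0, 0.05, 0, 0.3, 0]
    ```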

  17. Touchdown: The Development of Propulsion Controlled Aircraft at NASA Dryden

    NASA Technical Reports Server (NTRS)

    Tucker, Tom

    1999-01-01

    This monograph relates the important history of the Propulsion Controlled Aircraft project at NASA's Dryden Flight Research Center. Spurred by a number of airplane crashes caused by the loss of hydraulic flight controls, a NASA-industry team led by Frank W. Burcham and C. Gordon Fullerton developed a way to land an aircraft safely using only engine thrust to control the airplane. In spite of initial skepticism, the team discovered that, by manually manipulating an airplane's thrust, there was adequate control for extended up-and-away flight. However, there was not adequate control precision for safe runway landings because of the small control forces, slow response, and difficulty in damping the airplane phugoid and Dutch roll oscillations. The team therefore conceived, developed, and tested the first computerized Propulsion Controlled Aircraft (PCA) system. The PCA system takes pilot commands, uses feedback from airplane measurements, and computes commands for the thrust of each engine, yielding much more precise control. Pitch rate and velocity feedback damp the phugoid oscillation, while yaw rate feedback damps the Dutch roll motion. The team tested the PCA system in simulators and conducted flight research in F-15 and MD-11 airplanes. Later, they developed less sophisticated variants of PCA called PCA Lite and PCA Ultralite to make the system cheaper and therefore more attractive to industry. This monograph tells the PCA story in a non-technical way with emphasis on the human aspects of the engineering and flight-research effort. It thereby supplements the extensive technical literature on PCA and makes the development of this technology accessible to a wide audience.

  18. Combined Numerical/Analytical Perturbation Solutions of the Navier-Stokes Equations for Aerodynamic Ejector/Mixer Nozzle Flows

    NASA Technical Reports Server (NTRS)

    DeChant, Lawrence Justin

    1998-01-01

    In spite of rapid advances in both scalar and parallel computational tools, the large number of variables involved in both design and inverse problems makes the use of sophisticated fluid flow models impractical. With this restriction, it is concluded that an important family of methods for mathematical/computational development are reduced or approximate fluid flow models. In this study a combined perturbation/numerical modeling methodology is developed which provides a rigorously derived family of solutions. The mathematical model is computationally more efficient than classical boundary layer approaches but provides important two-dimensional information not available using quasi-1-d approaches. An additional strength of the current methodology is its ability to locally predict static pressure fields in a manner analogous to more sophisticated parabolized Navier-Stokes (PNS) formulations. To resolve singular behavior, the model utilizes classical analytical solution techniques. Hence, analytical methods have been combined with efficient numerical methods to yield an efficient hybrid fluid flow model. In particular, the main objective of this research has been to develop a system of analytical and numerical ejector/mixer nozzle models which require minimal empirical input. A computer code, DREA (Differential Reduced Ejector/mixer Analysis), has been developed with the ability to run sufficiently fast that it may be used either as a subroutine or called by a design optimization routine. Models are of direct use to the High Speed Civil Transport Program (a joint government/industry project seeking to develop an economically viable U.S. commercial supersonic transport vehicle) and are currently being adopted by both NASA and industry. Experimental validation of these models is provided by comparison to results obtained from open literature and Limited Exclusive Right Distribution (LERD) sources, as well as dedicated experiments performed at Texas A&M. These experiments have been performed using a hydraulic/gas flow analog. Results of comparisons of DREA computations with experimental data, which include entrainment, thrust, and local profile information, are overall good. Computational time studies indicate that DREA provides considerably more information at a lower computational cost than contemporary ejector nozzle design models. Finally, physical limitations of the method, deviations from experimental data, potential improvements, and alternative formulations are described. This report represents closure to the NASA Graduate Researchers Program. Versions of the DREA code and a user's guide may be obtained from the NASA Lewis Research Center.

  19. Vision-Based Steering Control, Speed Assistance and Localization for Inner-City Vehicles

    PubMed Central

    Olivares-Mendez, Miguel Angel; Sanchez-Lopez, Jose Luis; Jimenez, Felipe; Campoy, Pascual; Sajadi-Alamdari, Seyed Amin; Voos, Holger

    2016-01-01

    Autonomous route following with road vehicles has gained popularity in the last few decades. In order to provide highly automated driver assistance systems, different types and combinations of sensors have been presented in the literature. However, most of these approaches apply quite sophisticated and expensive sensors, and hence, the development of a cost-efficient solution still remains a challenging problem. This work proposes the use of a single monocular camera sensor for an automatic steering control, speed assistance for the driver and localization of the vehicle on a road. Herein, we assume that the vehicle is mainly traveling along a predefined path, such as in public transport. A computer vision approach is presented to detect a line painted on the road, which defines the path to follow. Visual markers with a special design painted on the road provide information to localize the vehicle and to assist in its speed control. Furthermore, a vision-based control system, which keeps the vehicle on the predefined path under inner-city speed constraints, is also presented. Real driving tests with a commercial car on a closed circuit finally prove the applicability of the derived approach. In these tests, the car reached a maximum speed of 48 km/h and successfully traveled a distance of 7 km without the intervention of a human driver and any interruption. PMID:26978365
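
    A minimal sketch of a monocular line-detection front end of this general kind, using standard OpenCV calls (Canny edges followed by a probabilistic Hough transform); the file name, thresholds, and the midpoint-offset steering error are illustrative assumptions, not the authors' pipeline:

    ```python
    # Detect the dominant painted line in a frame and derive a steering error.
    import cv2
    import numpy as np

    frame = cv2.imread("road.png")                 # hypothetical camera frame
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    edges = cv2.Canny(gray, 50, 150)
    lines = cv2.HoughLinesP(edges, 1, np.pi / 180, threshold=60,
                            minLineLength=40, maxLineGap=10)
    if lines is not None:
        # keep the longest segment as the guide line
        x1, y1, x2, y2 = max(lines[:, 0],
                             key=lambda l: np.hypot(l[2] - l[0], l[3] - l[1]))
        # lateral offset of the segment midpoint from the image centre,
        # usable as the error input for a steering controller
        offset_px = (x1 + x2) / 2.0 - frame.shape[1] / 2.0
        print("steering error (px):", offset_px)
    ```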

  20. Vision-Based Steering Control, Speed Assistance and Localization for Inner-City Vehicles.

    PubMed

    Olivares-Mendez, Miguel Angel; Sanchez-Lopez, Jose Luis; Jimenez, Felipe; Campoy, Pascual; Sajadi-Alamdari, Seyed Amin; Voos, Holger

    2016-03-11

    Autonomous route following with road vehicles has gained popularity in the last few decades. In order to provide highly automated driver assistance systems, different types and combinations of sensors have been presented in the literature. However, most of these approaches apply quite sophisticated and expensive sensors, and hence, the development of a cost-efficient solution still remains a challenging problem. This work proposes the use of a single monocular camera sensor for an automatic steering control, speed assistance for the driver and localization of the vehicle on a road. Herein, we assume that the vehicle is mainly traveling along a predefined path, such as in public transport. A computer vision approach is presented to detect a line painted on the road, which defines the path to follow. Visual markers with a special design painted on the road provide information to localize the vehicle and to assist in its speed control. Furthermore, a vision-based control system, which keeps the vehicle on the predefined path under inner-city speed constraints, is also presented. Real driving tests with a commercial car on a closed circuit finally prove the applicability of the derived approach. In these tests, the car reached a maximum speed of 48 km/h and successfully traveled a distance of 7 km without the intervention of a human driver and any interruption.

  1. A simple, low-cost, versatile CCD spectrometer for plasma spectroscopy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Den Hartog, D. J.; Holly, D. J.

    1996-06-01

    The authors have constructed a simple, low-cost CCD spectrometer capable of both high resolution (Δλ ≤ 0.015 nm) and large bandpass (110 nm with Δλ ≈ 0.3 nm). These two modes of operation provide two broad areas of capability for plasma spectroscopy. The first major application is measurement of emission line broadening; the second is emission line surveys from the ultraviolet to the near infrared. Measurements have been made on a low-temperature plasma produced by a miniature electrostatic plasma source and the high-temperature plasma in the MST Reversed-Field Pinch. The spectrometer is a modified Jarrell-Ash 0.5 m Ebert-Fastie monochromator. Light is coupled into the entrance slit with a fused silica fiber optic bundle. The exposure time (2 ms minimum) is controlled by a fast electromechanical shutter. The exit plane detector is a compact and robust CCD detector developed for amateur astronomy by Santa Barbara Instrument Group. The CCD detector is controlled and read out by a Macintosh® computer. This spectrometer is sophisticated enough to serve well in a research laboratory, yet is simple and inexpensive enough to be affordable for instructional use.

  2. Introducing DeBRa: a detailed breast model for radiological studies

    NASA Astrophysics Data System (ADS)

    Ma, Andy K. W.; Gunn, Spencer; Darambara, Dimitra G.

    2009-07-01

    Currently, x-ray mammography is the method of choice in breast cancer screening programmes. As the mammography technology moves from 2D imaging modalities to 3D, conventional computational phantoms do not have sufficient detail to support the studies of these advanced imaging systems. Studies of these 3D imaging systems call for a realistic and sophisticated computational model of the breast. DeBRa (Detailed Breast model for Radiological studies) is the most advanced, detailed, 3D computational model of the breast developed recently for breast imaging studies. A DeBRa phantom can be constructed to model a compressed breast, as in film/screen, digital mammography and digital breast tomosynthesis studies, or a non-compressed breast as in positron emission mammography and breast CT studies. Both the cranial-caudal and mediolateral oblique views can be modelled. The anatomical details inside the phantom include the lactiferous duct system, the Cooper ligaments and the pectoral muscle. The fibroglandular tissues are also modelled realistically. In addition, abnormalities such as microcalcifications, irregular tumours and spiculated tumours are inserted into the phantom. Existing sophisticated breast models require specialized simulation codes. Unlike its predecessors, DeBRa has elemental compositions and densities incorporated into its voxels including those of the explicitly modelled anatomical structures and the noise-like fibroglandular tissues. The voxel dimensions are specified as needed by any study and the microcalcifications are embedded into the voxels so that the microcalcification sizes are not limited by the voxel dimensions. Therefore, DeBRa works with general-purpose Monte Carlo codes. Furthermore, general-purpose Monte Carlo codes allow different types of imaging modalities and detector characteristics to be simulated with ease. DeBRa is a versatile and multipurpose model specifically designed for both x-ray and γ-ray imaging studies.

  3. Low-Computation Strategies for Extracting CO2 Emission Trends from Surface-Level Mixing Ratio Observations

    NASA Astrophysics Data System (ADS)

    Shusterman, A.; Kim, J.; Lieschke, K.; Newman, C.; Cohen, R. C.

    2017-12-01

    Global momentum is building for drastic, regulated reductions in greenhouse gas emissions over the coming decade. With this increasing regulation comes a clear need for increasingly sophisticated monitoring, reporting, and verification (MRV) strategies capable of enforcing and optimizing emissions-related policy, particularly as it applies to urban areas. Remote sensing and/or activity-based emission inventories can offer MRV insights for entire sectors or regions, but are not yet sophisticated enough to resolve unexpected trends in specific emitters. Urban surface monitors can offer the desired proximity to individual greenhouse gas sources, but due to the densely-packed nature of typical urban landscapes, surface observations are rarely representative of a single source. Most previous efforts to decompose these complex signals into their contributing emission processes have involved inverse atmospheric modeling techniques, which are computationally intensive and believed to depend heavily on poorly understood a priori estimates of error covariance. Here we present a number of transparent, low-computation approaches for extracting source-specific emissions estimates from signals with a variety of near-field influences. Using observations from the first several years of the BErkeley Atmospheric CO2 Observation Network (BEACO2N), we demonstrate how to exploit strategic pairings of monitoring "nodes," anomalous wind conditions, and well-understood temporal variations to home in on specific CO2 sources of interest. When evaluated against conventional, activity-based bottom-up emission inventories, these strategies are seen to generate quantitatively rigorous emission estimates. With continued application as the BEACO2N data set grows in time and space, these approaches offer a promising avenue for optimizing greenhouse gas mitigation strategies into the future.

  4. QUALITY CONTROL OF PHARMACEUTICALS.

    PubMed

    LEVI, L; WALKER, G C; PUGSLEY, L I

    1964-10-10

    Quality control is an essential operation of the pharmaceutical industry. Drugs must be marketed as safe and therapeutically active formulations whose performance is consistent and predictable. New and better medicinal agents are being produced at an accelerated rate. At the same time more exacting and sophisticated analytical methods are being developed for their evaluation. Requirements governing the quality control of pharmaceuticals in accordance with the Canadian Food and Drugs Act are cited and discussed.

  5. Shared Savings Contracting for Reducing Energy Costs of Defense Facilities.

    DTIC Science & Technology

    1983-01-01

    also needs additional operation and maintenance help, especially if new equipment is involved. New equipment with sophisticated control technology ... process may take time to develop and may be difficult to integrate with existing financial control practices. For example, the regulations and ... will seek a change order on the contract. The potential for such changes, if not controlled, seriously undermines the value of these contracts. DoD

  6. In This Issue

    NASA Astrophysics Data System (ADS)

    1996-04-01

    Advanced Instrumentation in the Educational Laboratory The instruments that chemists use in their research have changed dramatically in the past decades. The explosion in new techniques and their instrumental counterparts has been made possible by two significant advances. The rapid propagation of computer chips and circuitry provides opportunities to collect more specialized and refined data than ever before. Computer-controlled instruments can detect events that a human researcher would never perceive and can record hundreds of data points in the time that a person could only observe one or two. Coupled with these advances in technology are advances in theory that allow more sophisticated interpretation of data. A hundred years ago a sophisticated instrument was a balance that could weigh minute quantities. Today a sophisticated instrument is one that can identify the composition of that minute sample, determine its molecular weight, or reveal a great deal about its energy states and bonding. This bonanza of new instruments is wonderful for the research chemist but a curricular headache for the chemistry teacher. It takes more time to learn how to run an NMR than to use a balance, and more sophistication on the part of the student is needed to interpret the data. And yet many courses that only a generation ago sported an expensive analytical balance as its prize instrument now require students to understand and operate a whole panoply of complex tools. Teachers faced with accommodating these curricular changes will find several articles in this issue helpful--either providing information on new techniques or descriptions of how to incorporate them into the classroom. Electrospray ionization mass spectrometry is one of the newest tools that has been added to the analytical arsenal. It is an extension of mass spectrometry that overcomes the old barrier that allowed only analysis of low molecular weight, volatile compounds. It is now possible to use MS for biomolecule and protein structure elucidation and it is becoming a routine biochemical tool. The explosive growth of this technique has stimulated the publishing of a three-part review in the Topics in Chemical Instrumentation feature. Part I, which covers instrumentation and spectral interpretation, is by Hofstadler, Bakhtiar, and Smith (page A82) and appears in this issue. They describe how the electrospray apparatus, which produces a fine aerosol of highly charged microdroplets, can be coupled with a mass spectrometer to analyze complex molecules. Next they show how the spectra achieved from this process can be interpreted using human hemoglobin as an example. This technique provides the ability to "weigh" large molecules in a manner unimaginable in the era of the two-pan balance and is thus a desirable addition to the curriculum. The appearance of so many new technologies in the curriculum poses a problem for many teachers. They want to teach not only the manipulations required to run the instruments but also provide an understanding of the ideas behind them. Paniagua and Moyano (page 310) tackle the problem of how to introduce the principles of pulsed NMR and answer the basic questions in ways that undergraduates will understand. They discuss these in terms of the classical model, the naive quantum model, and the density operator approach. Equally important to understanding the theory behind analytical instruments is understanding how they actually acquire and process data. 
    Duffy and vanLoon (page 318) have designed an exercise to teach interfacing techniques in the instrumental analysis laboratory. Students use a relatively inexpensive interface unit and a PC to investigate the relationship among the signal generator, the detector, the signal modifier, and the output transducer in five real instrumental situations: temperature, light intensity, and potentiometric measurement; spectrophotometry; and redox titration. The ubiquitous use of instruments and computer acquisition of data has made these techniques desirable even at the high school level. Students who are used to surfing the Internet on their home computer are ready to collect information via PC in their school labs as well. Bindel (page 356) takes advantage of the Personal Science Laboratory, an affordable package of probes and software for PC interfacing, to provide an experiment using the eye-catching lightstick as its object. Students use two methods to determine the activation energy of the reaction that produces the luminescence and explore concepts of kinetics as well as learn about computer-interfaced experimentation. Addendum. The engaging photograph of Linus Pauling on the cover of the January issue was taken by Joseph McNally and is copyright Joseph McNally Photography, 52 Villard Avenue, Hastings-on-Hudson, NY 10706.

  7. State of the art in adaptive control of robotic systems

    NASA Technical Reports Server (NTRS)

    Tosunoglu, Sabri; Tesar, Delbert

    1988-01-01

    An up-to-date assessment of adaptive control technology as applied to robotics is presented. Although the field is relatively new and does not yet represent a mature discipline, considerable attention has been devoted to the design of sophisticated robot controllers. In this presentation, adaptive control methods are divided into model reference adaptive systems and self-tuning regulators, with further definition of various approaches given in each class. The similarities and distinctive features of the designed controllers are delineated and tabulated to enhance comparative review.
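
    A minimal model-reference adaptive control sketch for a first-order plant, using the classic MIT-rule gain update to illustrate the MRAS class surveyed above; the plant, reference model, and gains are textbook-style assumptions, not drawn from the paper:

    ```python
    # MIT-rule adaptation of a feedforward gain toward the ideal value bm/b.
    a, b = 1.0, 0.5          # plant dy/dt = -a*y + b*u (b treated as unknown)
    bm = 2.0                 # reference model dym/dt = -a*ym + bm*r
    gamma, dt = 0.5, 0.001   # adaptation gain, integration step

    y = ym = theta = 0.0
    for k in range(40000):                       # 40 s of simulated time
        r = 1.0 if (k * dt) % 4 < 2 else -1.0    # square-wave reference
        u = theta * r                            # adjustable feedforward gain
        y += dt * (-a * y + b * u)
        ym += dt * (-a * ym + bm * r)
        e = y - ym
        theta -= dt * gamma * e * ym             # MIT rule: dtheta/dt = -g*e*ym

    print(f"theta -> {theta:.2f} (ideal bm/b = {bm / b:.1f})")
    ```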

  8. DNA Brick Crystals with Prescribed Depth

    PubMed Central

    Ke, Yonggang; Ong, Luvena L.; Sun, Wei; Song, Jie; Dong, Mingdong; Shih, William M.; Yin, Peng

    2014-01-01

    We describe a general framework for constructing two-dimensional crystals with prescribed depth and sophisticated three-dimensional features. These crystals may serve as scaffolds for the precise spatial arrangements of functional materials for diverse applications. The crystals are self-assembled from single-stranded DNA components called DNA bricks. We demonstrate the experimental construction of DNA brick crystals that can grow to micron-size in the lateral dimensions with precisely controlled depth up to 80 nanometers. They can be designed to display user-specified sophisticated three-dimensional nanoscale features, such as continuous or discontinuous cavities and channels, and to pack DNA helices at parallel and perpendicular angles relative to the plane of the crystals. PMID:25343605

  9. The use of isotope ratios (13C/12C) for vegetable oils authentication

    NASA Astrophysics Data System (ADS)

    Cristea, G.; Magdas, D. A.; Mirel, V.

    2012-02-01

    Stable isotopes are now increasingly used for the control of the geographical origin or authenticity of food products. Falsification may be more or less sophisticated, and its sophistication, as well as its cost, increases with the improvement of analytical methods. In this study, 22 vegetable oils (olive, sunflower, palm, maize) commercialized on the Romanian market were investigated by means of δ13C in bulk oil, and the obtained results were compared with those reported in the literature in order to check the labeling of these natural products. The obtained results were in the range of the mean values found in the literature for these types of oils, thus supporting their accurate labeling.

  10. Cancer's Big Data Problem

    DOE PAGES

    Breaux, Justin H. S.

    2017-03-15

    The US Department of Energy (DOE) has partnered with the National Cancer Institute (NCI) to use DOE supercomputers to aid in the fight against cancer by building sophisticated models based on data available at the population, patient, and molecular levels. Here, through a three-year pilot project called the Joint Design of Advanced Computing Solutions for Cancer (JDACSC), four participating national laboratories--Argonne, Lawrence Livermore, Los Alamos, and Oak Ridge--will focus on three problems singled out by the NCI as the biggest bottlenecks to advancing cancer research.

  11. Cancer's Big Data Problem

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Breaux, Justin H. S.

    The US Department of Energy (DOE) has partnered with the National Cancer Institute (NCI) to use DOE supercomputers to aid in the fight against cancer by building sophisticated models based on data available at the population, patient, and molecular levels. Here, through a three-year pilot project called the Joint Design of Advanced Computing Solutions for Cancer (JDACSC), four participating national laboratories--Argonne, Lawrence Livermore, Los Alamos, and Oak Ridge--will focus on three problems singled out by the NCI as the biggest bottlenecks to advancing cancer research.

  12. Mapping Van

    NASA Technical Reports Server (NTRS)

    1994-01-01

    A NASA Center for the Commercial Development of Space (CCDS)-developed system for satellite mapping has been commercialized for the first time. Global Visions, Inc. maps an area while driving along a road in a sophisticated mapping van equipped with satellite signal receivers, video cameras and computer systems for collecting and storing mapping data. Data is fed into a computerized geographic information system (GIS). The resulting maps can be used for tax assessment purposes, by emergency dispatch services and fleet delivery companies, and in other applications.

  13. Development and Applications of the FV3 GEOS-5 Adjoint Modeling System

    NASA Technical Reports Server (NTRS)

    Holdaway, Daniel; Kim, Jong G.; Lin, Shian-Jiann; Errico, Ron; Gelaro, Ron; Kent, James; Coy, Larry; Doyle, Jim; Goldstein, Alex

    2017-01-01

    GMAO has developed a highly sophisticated adjoint modeling system based on the most recent version of the finite-volume cubed-sphere (FV3) dynamical core. This provides a mechanism for investigating sensitivity to initial conditions and examining observation impacts. It also allows for the computation of singular vectors and for the implementation of hybrid 4DVAR. In this work we will present the scientific assessment of the new adjoint system and show results from a number of research applications of the adjoint system.

  14. Statistical Mechanics of Combinatorial Auctions

    NASA Astrophysics Data System (ADS)

    Galla, Tobias; Leone, Michele; Marsili, Matteo; Sellitto, Mauro; Weigt, Martin; Zecchina, Riccardo

    2006-09-01

    Combinatorial auctions are formulated as frustrated lattice gases on sparse random graphs, allowing the determination of the optimal revenue by methods of statistical physics. Transitions between computationally easy and hard regimes are found and interpreted in terms of the geometric structure of the space of solutions. We introduce an iterative algorithm to solve intermediate and large instances, and discuss competing states of optimal revenue and maximal number of satisfied bidders. The algorithm can be generalized to the hard phase and to more sophisticated auction protocols.

  15. Type-2 fuzzy logic control of a 2-DOF helicopter (TRMS system)

    NASA Astrophysics Data System (ADS)

    Zeghlache, Samir; Kara, Kamel; Saigaa, Djamel

    2014-09-01

    The helicopter dynamics include nonlinearities and parametric uncertainties and are subject to unknown external disturbances. Such complicated dynamics call for sophisticated control algorithms that can deal with these difficulties. In this paper, a type-2 fuzzy logic PID controller is proposed for the TRMS (twin rotor MIMO system) control problem. Using triangular membership functions and based on a human operator's experience, two controllers are designed to control the position of the yaw and pitch angles of the TRMS. Simulation results are given to illustrate the effectiveness of the proposed control scheme.
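
    For reference, the fixed-gain baseline that fuzzy schemes of this kind are measured against is an ordinary PID loop; a sketch on a toy second-order pitch model follows. The plant parameters and gains are illustrative assumptions, and the paper's interval type-2 fuzzy inference stage is not reproduced here:

    ```python
    # Fixed-gain PID tracking a pitch setpoint on a toy 2nd-order plant.
    class PID:
        def __init__(self, kp, ki, kd, dt):
            self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
            self.integral, self.prev_err = 0.0, 0.0

        def step(self, setpoint, measurement):
            err = setpoint - measurement
            self.integral += err * self.dt
            deriv = (err - self.prev_err) / self.dt
            self.prev_err = err
            return self.kp * err + self.ki * self.integral + self.kd * deriv

    # J*theta'' = u - c*theta' - k*theta   (toy pitch dynamics)
    dt, J, c, k = 0.01, 0.5, 0.8, 2.0
    pid = PID(6.0, 1.5, 2.5, dt)
    theta = dtheta = 0.0
    for _ in range(1000):                       # 10 s of simulated time
        u = pid.step(0.3, theta)                # track a 0.3 rad setpoint
        dtheta += dt * (u - c * dtheta - k * theta) / J
        theta += dt * dtheta
    print(f"pitch after 10 s: {theta:.3f} rad")
    ```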

  16. Computational oncology.

    PubMed

    Lefor, Alan T

    2011-08-01

    Oncology research has traditionally been conducted using techniques from the biological sciences. The new field of computational oncology has forged a new relationship between the physical sciences and oncology to further advance research. By applying physics and mathematics to oncologic problems, new insights will emerge into the pathogenesis and treatment of malignancies. One major area of investigation in computational oncology centers around the acquisition and analysis of data, using improved computing hardware and software. Large databases of cellular pathways are being analyzed to understand the interrelationship among complex biological processes. Computer-aided detection is being applied to the analysis of routine imaging data including mammography and chest imaging to improve the accuracy and detection rate for population screening. The second major area of investigation uses computers to construct sophisticated mathematical models of individual cancer cells as well as larger systems using partial differential equations. These models are further refined with clinically available information to more accurately reflect living systems. One of the major obstacles in the partnership between physical scientists and the oncology community is communications. Standard ways to convey information must be developed. Future progress in computational oncology will depend on close collaboration between clinicians and investigators to further the understanding of cancer using these new approaches.

  17. Towards constructing multi-bit binary adder based on Belousov-Zhabotinsky reaction

    NASA Astrophysics Data System (ADS)

    Zhang, Guo-Mao; Wong, Ieong; Chou, Meng-Ta; Zhao, Xin

    2012-04-01

    It has been proposed that spatial excitable media can perform a wide range of computational operations, from image processing, to path planning, to logical and arithmetic computations. Experimental realizations of chemical logical and arithmetic computations have so far mainly been concerned with single, simple logical functions. In this study, based on the Belousov-Zhabotinsky reaction, we performed simulations toward the realization of a more complex operation, the binary adder. Combining some of the existing functional structures that have been verified experimentally, we designed a planar geometrical binary adder chemical device. Through numerical simulations, we first demonstrated that the device can implement the function of a single-bit full binary adder. We then show that the binary adder units can be further extended in the plane and coupled together to realize a two-bit, or even multi-bit, binary adder. The realization of chemical adders can guide the construction of other sophisticated arithmetic functions, ultimately leading to the implementation of a chemical computer and other intelligent systems.
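
    For reference, the logical structure being reproduced chemically is the textbook ripple-carry composition of full adders; a conventional-logic sketch is given below (it illustrates only the gate-level design, not the reaction-diffusion dynamics):

    ```python
    # Single-bit full adder chained into a multi-bit ripple-carry adder.
    def full_adder(a, b, cin):
        s = a ^ b ^ cin
        cout = (a & b) | (cin & (a ^ b))
        return s, cout

    def ripple_add(x_bits, y_bits):
        """Add two little-endian bit lists of equal length."""
        carry, out = 0, []
        for a, b in zip(x_bits, y_bits):
            s, carry = full_adder(a, b, carry)
            out.append(s)
        return out + [carry]

    print(ripple_add([1, 1, 0], [1, 0, 1]))   # 3 + 5 -> [0, 0, 0, 1] == 8
    ```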

  18. RighTime: A real time clock correcting program for MS-DOS-based computer systems

    NASA Technical Reports Server (NTRS)

    Becker, G. Thomas

    1993-01-01

    A computer program is described which effectively eliminates the shortcomings of the DOS system clock in PC/AT-class computers. RighTime is a small, sophisticated memory-resident program that automatically corrects both the DOS system clock and the hardware 'CMOS' real time clock (RTC) in real time. RighTime learns what corrections are required without operator interaction beyond the occasional accurate time set. Both warm (power on) and cool (power off) errors are corrected, usually yielding better than one part per million accuracy in the typical desktop computer with no additional hardware, and RighTime increases the system clock resolution from approximately 0.0549 second to 0.01 second. Program tools are also available which allow visualization of RighTime's actions, verification of its performance, and display of its history log, and which provide data for graphing of the system clock behavior. The program has found application in a wide variety of industries, including astronomy, satellite tracking, communications, broadcasting, transportation, public utilities, manufacturing, medicine, and the military.
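
    A sketch of the drift-learning idea (not RighTime's actual algorithm): compare the system clock against occasional accurate time sets, estimate a fractional drift rate, and correct subsequent readings. The class and method names are hypothetical:

    ```python
    # Learn a clock's fractional drift from accurate time sets and correct it.
    import time

    class DriftCorrector:
        def __init__(self):
            self.ref = None      # (system_time, true_time) at last time set
            self.rate = 0.0      # estimated fractional drift (fast > 0)

        def time_set(self, true_time):
            now = time.time()
            if self.ref is not None:
                sys_elapsed = now - self.ref[0]
                true_elapsed = true_time - self.ref[1]
                if true_elapsed > 0:
                    self.rate = (sys_elapsed - true_elapsed) / true_elapsed
            self.ref = (now, true_time)

        def corrected(self):
            # valid only after at least one call to time_set
            sys_elapsed = time.time() - self.ref[0]
            return self.ref[1] + sys_elapsed / (1.0 + self.rate)
    ```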

  19. A comparison of transport algorithms for premixed, laminar steady state flames

    NASA Technical Reports Server (NTRS)

    Coffee, T. P.; Heimerl, J. M.

    1980-01-01

    The effects of different methods of approximating multispecies transport phenomena in models of premixed, laminar, steady state flames were studied. Five approximation methods that span a wide range of computational complexity were developed. Identical data for individual species properties were used for each method. Each approximation method is employed in the numerical solution of a set of five H2-02-N2 flames. For each flame the computed species and temperature profiles, as well as the computed flame speeds, are found to be very nearly independent of the approximation method used. This does not indicate that transport phenomena are unimportant, but rather that the selection of the input values for the individual species transport properties is more important than the selection of the method used to approximate the multispecies transport. Based on these results, a sixth approximation method was developed that is computationally efficient and provides results extremely close to the most sophisticated and precise method used.

  20. Increasing signal processing sophistication in the calculation of the respiratory modulation of the photoplethysmogram (DPOP).

    PubMed

    Addison, Paul S; Wang, Rui; Uribe, Alberto A; Bergese, Sergio D

    2015-06-01

    DPOP (∆POP or Delta-POP) is a non-invasive parameter which measures the strength of respiratory modulations present in the pulse oximetry photoplethysmogram (pleth) waveform. It has been proposed as a non-invasive surrogate for the pulse pressure variation (PPV) used in predicting the response to volume expansion in hypovolemic patients. Many groups have reported on the DPOP parameter and its correlation with PPV using various semi-automated algorithmic implementations. The study reported here demonstrates the performance gains made by adding increasingly sophisticated signal processing components to a fully automated DPOP algorithm. A DPOP algorithm was coded and its performance systematically enhanced through a series of code module alterations and additions. Each algorithm iteration was tested on data from 20 mechanically ventilated OR patients. Correlation coefficients and ROC curve statistics were computed at each stage. For the purposes of the analysis, we split the data into a manually selected 'stable' subset containing relatively noise-free segments and a 'global' set incorporating the whole data record. Performance gains were measured in terms of correlation against PPV measurements in OR patients undergoing controlled mechanical ventilation. Through increasingly advanced pre-processing and post-processing enhancements to the algorithm, the correlation coefficient between DPOP and PPV improved from a baseline value of R = 0.347 to R = 0.852 for the stable data set and, correspondingly, from R = 0.225 to R = 0.728 for the more challenging global data set. Marked gains in algorithm performance are achievable for manually selected stable regions of the signals using relatively simple algorithm enhancements. Significant additional algorithm enhancements, including a correction for low perfusion values, were required before similar gains were realised for the more challenging global data set.
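
    The underlying quantity is simple to state: DPOP is the swing in beat-to-beat pulse amplitude (POP) over a respiratory cycle, normalised by the mean of the extremes. A sketch of the core calculation, assuming beat amplitudes have already been extracted from the pleth (the paper's pre- and post-processing refinements, including the low-perfusion correction, are omitted):

    ```python
    # Core DPOP computation from beat-to-beat pulse amplitudes.
    import numpy as np

    def dpop_percent(beat_amplitudes):
        pop_max = np.max(beat_amplitudes)    # largest beat in the resp. cycle
        pop_min = np.min(beat_amplitudes)
        return 100.0 * (pop_max - pop_min) / ((pop_max + pop_min) / 2.0)

    print(dpop_percent([1.9, 1.7, 1.4, 1.2, 1.5, 1.8]))   # ~45.2 %
    ```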
