Science.gov

Sample records for autonomous vision system

  1. Compact Autonomous Hemispheric Vision System

    NASA Technical Reports Server (NTRS)

    Pingree, Paula J.; Cunningham, Thomas J.; Werne, Thomas A.; Eastwood, Michael L.; Walch, Marc J.; Staehle, Robert L.

    2012-01-01

Solar System Exploration camera implementations to date have involved either single cameras with wide field-of-view (FOV) and consequently coarser spatial resolution, cameras on a movable mast, or single cameras necessitating rotation of the host vehicle to afford visibility outside a relatively narrow FOV. These cameras require detailed commanding from the ground or separate onboard computers to operate properly, and are incapable of making decisions based on image content that control pointing and downlink strategy. For color, a filter wheel having selectable positions was often added, which added moving parts, size, mass, and power, and reduced reliability. A system was developed based on a general-purpose miniature visible-light camera using advanced CMOS (complementary metal oxide semiconductor) imager technology. The baseline camera has a 92° FOV, and six cameras are arranged in an angled-up carousel fashion, with FOV overlaps such that the system has a 360° FOV in azimuth. A seventh camera, also with a 92° FOV, is installed normal to the plane of the other six cameras, giving the system a > 90° FOV in elevation and completing the hemispheric vision system. A central unit houses the common electronics box (CEB) controlling the system (power conversion, data processing, memory, and control software). Stereo is achieved by adding a second system on a baseline, and color is achieved by stacking two more systems (for a total of three, each equipped with its own filter). Two connectors on the bottom of the CEB provide a connection to a carrier (rover, spacecraft, balloon, etc.) for telemetry, commands, and power. This system has no moving parts. The system's onboard software (SW) supports autonomous operations such as pattern recognition and tracking.
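The azimuthal coverage described above is simple arithmetic: six 92° cameras provide 552° of total FOV around a 360° circle, leaving 192° of overlap shared among the six seams. A minimal sketch of that check (even angular spacing is an assumption; the abstract does not state the exact mounting angles):

```python
def seam_overlap(n_cameras, fov_deg, coverage_deg=360.0):
    """Average per-seam FOV overlap for n cameras evenly spaced around a circle."""
    total = n_cameras * fov_deg
    if total < coverage_deg:
        raise ValueError("cameras cannot cover the full circle")
    return (total - coverage_deg) / n_cameras

# Six 92-degree cameras: 32 degrees of overlap at each of the six seams.
```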

  2. Intelligent vision system for autonomous vehicle operations

    NASA Technical Reports Server (NTRS)

    Scholl, Marija S.

    1991-01-01

A complex optical system is described, consisting of a 4f optical correlator with programmable filters under the control of a digital on-board computer that operates at video rates for filter generation, storage, and management.

  3. Computer vision system for an autonomous mobile robot

    NASA Astrophysics Data System (ADS)

    Liao, Xiaoqun; Cao, Jin; Cao, Ming; Samu, Tayib; Hall, Ernest L.

    2000-10-01

The purpose of this paper is to compare three methods for 3-D measurement of line position used for vision guidance in navigating an autonomous mobile robot. A model mapping 3-D ground points into image points is first developed using homogeneous coordinates. Then, using the ground plane constraint, the inverse transformation that maps image points into 3-D ground points is determined. The system identification problem is then solved using a calibration device: calibration data are used to determine the model parameters by minimizing the mean square error between model and calibration points. A novel simplification is then presented which provides surprisingly accurate results. This method, called the magic matrix approach, uses only the calibration data. A more standard variation of this approach is also considered. The significance of this work is that it shows that three methods based on 3-D measurements may be used for mobile robot navigation and that a simple method can achieve accuracy to a fraction of an inch, which is sufficient in some applications.
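Under the ground-plane constraint, the pixel-to-ground mapping reduces to a small matrix fitted from calibration pairs by minimizing mean square error. The sketch below is a simplified affine version of that idea, fitted with least squares; the abstract does not give the exact form of the paper's "magic matrix", so the 2×3 shape and function names here are assumptions:

```python
import numpy as np

def fit_ground_map(image_pts, ground_pts):
    """Fit M so that ground ~ M @ [u, v, 1], minimizing mean square error."""
    image_pts = np.asarray(image_pts, float)   # N x 2 pixel coordinates
    ground_pts = np.asarray(ground_pts, float) # N x 2 ground-plane coordinates
    A = np.hstack([image_pts, np.ones((len(image_pts), 1))])  # N x 3
    M, *_ = np.linalg.lstsq(A, ground_pts, rcond=None)        # 3 x 2
    return M.T                                                # 2 x 3 "magic matrix"

def image_to_ground(M, u, v):
    """Map one image point (u, v) to ground-plane coordinates."""
    return M @ np.array([u, v, 1.0])
```

With noisy calibration data the same call returns the least-squares fit rather than an exact solution, which is exactly the error-minimization step the abstract describes.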

  4. SUMO/FREND: vision system for autonomous satellite grapple

    NASA Astrophysics Data System (ADS)

    Obermark, Jerome; Creamer, Glenn; Kelm, Bernard E.; Wagner, William; Henshaw, C. Glen

    2007-04-01

    SUMO/FREND is a risk reduction program for an advanced servicing spacecraft sponsored by DARPA and executed by the Naval Center for Space Technology at the Naval Research Laboratory in Washington, DC. The overall program will demonstrate the integration of many techniques needed in order to autonomously rendezvous and capture customer satellites at geosynchronous orbits. A flight-qualifiable payload is currently under development to prove out challenging aspects of the mission. The grappling process presents computer vision challenges to properly identify and guide the final step in joining the pursuer craft to the customer. This paper will provide an overview of the current status of the project with an emphasis on the challenges, techniques, and directions of the machine vision processes to guide the grappling.

  5. New vision system and navigation algorithm for an autonomous ground vehicle

    NASA Astrophysics Data System (ADS)

    Tann, Hokchhay; Shakya, Bicky; Merchen, Alex C.; Williams, Benjamin C.; Khanal, Abhishek; Zhao, Jiajia; Ahlgren, David J.

    2013-12-01

Improvements were made to the intelligence algorithms of an autonomously operating ground vehicle, Q, which competed in the 2013 Intelligent Ground Vehicle Competition (IGVC). The IGVC required the vehicle to first navigate between two white lines on a grassy obstacle course, then pass through eight GPS waypoints, and finally pass through an obstacle field. Modifications to Q included a new vision system with a more effective image processing algorithm for white line extraction. The path-planning algorithm was adapted to the new vision system, producing smoother, more reliable navigation. With these improvements, Q successfully completed the basic autonomous navigation challenge, finishing tenth out of over 50 teams.

  6. Street Viewer: An Autonomous Vision Based Traffic Tracking System.

    PubMed

    Bottino, Andrea; Garbo, Alessandro; Loiacono, Carmelo; Quer, Stefano

    2016-06-03

    The development of intelligent transportation systems requires the availability of both accurate traffic information in real time and a cost-effective solution. In this paper, we describe Street Viewer, a system capable of analyzing the traffic behavior in different scenarios from images taken with an off-the-shelf optical camera. Street Viewer operates in real time on embedded hardware architectures with limited computational resources. The system features a pipelined architecture that, on one side, allows one to exploit multi-threading intensively and, on the other side, allows one to improve the overall accuracy and robustness of the system, since each layer is aimed at refining for the following layers the information it receives as input. Another relevant feature of our approach is that it is self-adaptive. During an initial setup, the application runs in learning mode to build a model of the flow patterns in the observed area. Once the model is stable, the system switches to the on-line mode where the flow model is used to count vehicles traveling on each lane and to produce a traffic information summary. If changes in the flow model are detected, the system switches back autonomously to the learning mode. The accuracy and the robustness of the system are analyzed in the paper through experimental results obtained on several different scenarios and running the system for long periods of time.
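The self-adaptive behavior described above is a small state machine: learn a flow model until it is stable, count vehicles against it on-line, and fall back to learning when the model drifts. A minimal sketch of that mode-switching logic (the running-mean model, thresholds, and all parameter names are assumptions, not the paper's actual design):

```python
from enum import Enum

class Mode(Enum):
    LEARNING = "learning"
    ONLINE = "online"

class FlowMonitor:
    def __init__(self, stable_after=50, drift=0.5, patience=5, alpha=0.1):
        self.mode = Mode.LEARNING
        self.mean = None      # running mean of per-frame vehicle counts
        self.frames = 0       # frames spent in the current learning phase
        self.bad = 0          # consecutive on-line frames that disagree
        self.stable_after, self.drift = stable_after, drift
        self.patience, self.alpha = patience, alpha

    def observe(self, count):
        """Feed one frame's vehicle count; returns the current mode."""
        if self.mean is None:
            self.mean = float(count)
        if self.mode is Mode.LEARNING:
            self.mean += self.alpha * (count - self.mean)
            self.frames += 1
            if self.frames >= self.stable_after:
                self.mode = Mode.ONLINE          # model considered stable
        else:
            rel = abs(count - self.mean) / max(self.mean, 1.0)
            self.bad = self.bad + 1 if rel > self.drift else 0
            if self.bad >= self.patience:        # sustained drift: relearn
                self.mode, self.frames, self.bad = Mode.LEARNING, 0, 0
        return self.mode
```

Requiring `patience` consecutive deviating frames before switching back avoids bouncing between modes on a single noisy observation.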

  7. Street Viewer: An Autonomous Vision Based Traffic Tracking System

    PubMed Central

    Bottino, Andrea; Garbo, Alessandro; Loiacono, Carmelo; Quer, Stefano

    2016-01-01

    The development of intelligent transportation systems requires the availability of both accurate traffic information in real time and a cost-effective solution. In this paper, we describe Street Viewer, a system capable of analyzing the traffic behavior in different scenarios from images taken with an off-the-shelf optical camera. Street Viewer operates in real time on embedded hardware architectures with limited computational resources. The system features a pipelined architecture that, on one side, allows one to exploit multi-threading intensively and, on the other side, allows one to improve the overall accuracy and robustness of the system, since each layer is aimed at refining for the following layers the information it receives as input. Another relevant feature of our approach is that it is self-adaptive. During an initial setup, the application runs in learning mode to build a model of the flow patterns in the observed area. Once the model is stable, the system switches to the on-line mode where the flow model is used to count vehicles traveling on each lane and to produce a traffic information summary. If changes in the flow model are detected, the system switches back autonomously to the learning mode. The accuracy and the robustness of the system are analyzed in the paper through experimental results obtained on several different scenarios and running the system for long periods of time. PMID:27271627

  8. A Vision in Jeopardy: Royal Navy Maritime Autonomous Systems (MAS)

    DTIC Science & Technology

    2017-03-31

Successive UK governments have recognized the enduring importance of maritime power for Britain as an island nation and have directed the Royal...transformations to analyze the importance of vision to the success of a program, the cultural and leadership frictions within the RN, and the...

  9. Autonomous Hovering and Landing of a Quad-rotor Micro Aerial Vehicle by Means of on Ground Stereo Vision System

    NASA Astrophysics Data System (ADS)

    Pebrianti, Dwi; Kendoul, Farid; Azrad, Syaril; Wang, Wei; Nonami, Kenzo

An on-ground stereo vision system is used for autonomous hovering and landing of a quad-rotor Micro Aerial Vehicle (MAV). Such a system has an advantage over an embedded on-board vision system for autonomous hovering and landing, since an embedded vision system occasionally gives inaccurate distance calculations due to vibration or unknown geometry of the landing target. Color-based object tracking using the Continuously Adaptive Mean Shift (CAMSHIFT) algorithm was examined. A nonlinear model of the quad-rotor MAV and a PID controller were used for autonomous hovering and landing. The results show that the CAMSHIFT-based object tracking algorithm performs well. Additionally, a comparison between stereo-vision-based and GPS-based autonomous hovering of the quad-rotor MAV shows that the stereo vision system has better performance: its accuracy is about 1 meter in the longitudinal and lateral directions when the quad-rotor flies at an altitude of 6 meters, whereas under the same experimental conditions the GPS-based system's accuracy is about 3 meters. The autonomous landing experiment also gives reliable results.
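The core of CAMSHIFT is a mean-shift iteration over a color-probability (back-projection) image: the search window is repeatedly moved to the centroid of the probability mass under it, and CAMSHIFT additionally resizes the window from the zeroth moment. A minimal mean-shift sketch in NumPy (the full algorithm, e.g. OpenCV's `cv2.CamShift`, also adapts window size and orientation; the fixed-size window here is a simplification):

```python
import numpy as np

def mean_shift(prob, window, n_iter=20):
    """Shift window=(x, y, w, h) to the centroid of the prob mass under it."""
    x, y, w, h = window
    for _ in range(n_iter):
        roi = prob[y:y + h, x:x + w]
        mass = roi.sum()
        if mass == 0:
            break                               # lost the target
        ys, xs = np.mgrid[0:roi.shape[0], 0:roi.shape[1]]
        cx = (xs * roi).sum() / mass            # centroid inside the window
        cy = (ys * roi).sum() / mass
        nx = max(int(round(x + cx - w / 2)), 0)
        ny = max(int(round(y + cy - h / 2)), 0)
        if (nx, ny) == (x, y):
            break                               # converged
        x, y = nx, ny
    return x, y, w, h
```

In practice `prob` would be the histogram back-projection of the tracked color, recomputed every frame, with the converged window seeding the next frame's search.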

  10. Pose Self-Calibration of Stereo Vision Systems for Autonomous Vehicle Applications

    PubMed Central

    Musleh, Basam; Martín, David; Armingol, José María; de la Escalera, Arturo

    2016-01-01

    Nowadays, intelligent systems applied to vehicles have grown very rapidly; their goal is not only the improvement of safety, but also making autonomous driving possible. Many of these intelligent systems are based on making use of computer vision in order to know the environment and act accordingly. It is of great importance to be able to estimate the pose of the vision system because the measurement matching between the perception system (pixels) and the vehicle environment (meters) depends on the relative position between the perception system and the environment. A new method of camera pose estimation for stereo systems is presented in this paper, whose main contribution regarding the state of the art on the subject is the estimation of the pitch angle without being affected by the roll angle. The validation of the self-calibration method is accomplished by comparing it with relevant methods of camera pose estimation, where a synthetic sequence is used in order to measure the continuous error with a ground truth. This validation is enriched by the experimental results of the method in real traffic environments. PMID:27649178
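The abstract does not give the method's equations, but the geometric relationship such pitch estimators exploit is standard pinhole-camera geometry: the image row where the ground plane's vanishing line (horizon) falls encodes the camera pitch. A generic illustration, not the authors' actual algorithm (the sign convention and parameter names are assumptions):

```python
import math

def pitch_from_horizon_row(v, cy, fy):
    """Pinhole camera: pitch angle (radians) from the horizon's image row.

    v  : image row (pixels) of the horizon / ground-plane vanishing line
    cy : principal-point row (pixels)
    fy : vertical focal length (pixels)
    """
    return math.atan2(v - cy, fy)
```

A level camera sees the horizon at the principal point (pitch 0); a horizon offset of 80 px with fy = 800 px corresponds to a pitch magnitude of atan(0.1), about 5.7°.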

  11. Pose Self-Calibration of Stereo Vision Systems for Autonomous Vehicle Applications.

    PubMed

    Musleh, Basam; Martín, David; Armingol, José María; de la Escalera, Arturo

    2016-09-14

    Nowadays, intelligent systems applied to vehicles have grown very rapidly; their goal is not only the improvement of safety, but also making autonomous driving possible. Many of these intelligent systems are based on making use of computer vision in order to know the environment and act accordingly. It is of great importance to be able to estimate the pose of the vision system because the measurement matching between the perception system (pixels) and the vehicle environment (meters) depends on the relative position between the perception system and the environment. A new method of camera pose estimation for stereo systems is presented in this paper, whose main contribution regarding the state of the art on the subject is the estimation of the pitch angle without being affected by the roll angle. The validation of the self-calibration method is accomplished by comparing it with relevant methods of camera pose estimation, where a synthetic sequence is used in order to measure the continuous error with a ground truth. This validation is enriched by the experimental results of the method in real traffic environments.

  12. Autonomous navigation vehicle system based on robot vision and multi-sensor fusion

    NASA Astrophysics Data System (ADS)

    Wu, Lihong; Chen, Yingsong; Cui, Zhouping

    2011-12-01

The architecture of an autonomous navigation vehicle based on robot vision and multi-sensor fusion technology is described in this paper. To achieve greater intelligence and robustness, accurate real-time collection and processing of information are realized using this technology. The method used to achieve robot vision and multi-sensor fusion is discussed in detail. Results simulated in several operating modes show that this intelligent vehicle performs well in barrier identification and avoidance and in path planning, providing higher reliability during vehicle operation.

  13. Self-localization for an autonomous mobile robot based on an omni-directional vision system

    NASA Astrophysics Data System (ADS)

    Chiang, Shu-Yin; Lin, Kuang-Yu; Chia, Tsorng-Lin

    2013-12-01

In this study, we designed an autonomous mobile robot based on the rules of the Federation of International Robot-soccer Association (FIRA) RoboSot category, integrating the techniques of computer vision, real-time image processing, dynamic target tracking, wireless communication, self-localization, motion control, path planning, and control strategy to achieve the contest goal. The self-localization scheme of the mobile robot is based on algorithms applied to the images from its omni-directional vision system. In previous works, we used the image colors of the field goals as reference points, combining either dual-circle or trilateration positioning of the reference points to achieve self-localization of the autonomous mobile robot. However, because the image of the game field is easily affected by ambient light, positioning systems based exclusively on color model algorithms cause errors. To reduce environmental effects and achieve self-localization of the robot, the proposed algorithm is applied to assess the corners of field lines by using an omni-directional vision system. Particularly in the mid-size league of the RoboCup soccer competition, self-localization algorithms based on extracting white lines from the soccer field have become increasingly popular. Moreover, white lines are less influenced by light than the color models of the goals. Therefore, we propose an algorithm that transforms the omni-directional image into an unwrapped transformed image, enhancing the extracted features. The process is described as follows: First, radial scan-lines were used to process the omni-directional images, reducing the computational load and improving system efficiency. The lines were radially arranged around the center of the omni-directional camera image, resulting in a shorter computational time compared with the traditional Cartesian coordinate system. 
However, the omni-directional image is a distorted image, which makes it difficult to recognize the
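The unwrapping step described above maps radial scan-lines of the omni-directional image onto columns of a panoramic image. A minimal nearest-neighbor sketch (the mirror center, radius range, and angular resolution are assumed parameters, and real systems would also correct the mirror's radial distortion):

```python
import numpy as np

def unwrap(omni, center, r_min, r_max, n_angles=360):
    """Unwrap an omni-directional image: each panorama column is one radial scan-line."""
    cy, cx = center
    radii = np.arange(r_min, r_max)                      # rows of the panorama
    pano = np.zeros((len(radii), n_angles), dtype=omni.dtype)
    angles = np.linspace(0, 2 * np.pi, n_angles, endpoint=False)
    for j, theta in enumerate(angles):
        ys = (cy + radii * np.sin(theta)).astype(int)    # nearest-neighbor sampling
        xs = (cx + radii * np.cos(theta)).astype(int)
        pano[:, j] = omni[ys, xs]
    return pano
```

After unwrapping, circular field lines become (roughly) straight horizontal or vertical features, which is what makes corner and line extraction cheaper than working on the distorted image.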

  14. MMW radar enhanced vision systems: the Helicopter Autonomous Landing System (HALS) and Radar-Enhanced Vision System (REVS) are rotary and fixed wing enhanced flight vision systems that enable safe flight operations in degraded visual environments

    NASA Astrophysics Data System (ADS)

    Cross, Jack; Schneider, John; Cariani, Pete

    2013-05-01

Sierra Nevada Corporation (SNC) has developed rotary- and fixed-wing millimeter wave radar enhanced vision systems. The Helicopter Autonomous Landing System (HALS) is a rotary-wing enhanced vision system that enables multi-ship landing, takeoff, and enroute flight in Degraded Visual Environments (DVE). HALS has been successfully flight tested in a variety of scenarios, from brown-out DVE landings, to enroute flight over mountainous terrain, to wire/cable detection during low-level flight. The Radar-Enhanced Vision System (REVS) is a fixed-wing Enhanced Flight Vision System (EFVS) undergoing prototype development testing. Both systems are based on a fast-scanning, three-dimensional 94 GHz radar that produces real-time terrain and obstacle imagery. The radar imagery is fused with synthetic imagery of the surrounding terrain to form a long-range, wide field-of-view display. A symbology overlay is added to provide aircraft state information and, for HALS, approach and landing command guidance cuing. The combination of see-through imagery and symbology provides the key information a pilot needs to perform safe flight operations in DVE conditions. This paper discusses the HALS and REVS systems and technology, presents imagery, and summarizes the recent flight test results.

  15. Towards an Autonomous Vision-Based Unmanned Aerial System against Wildlife Poachers.

    PubMed

    Olivares-Mendez, Miguel A; Fu, Changhong; Ludivig, Philippe; Bissyandé, Tegawendé F; Kannan, Somasundar; Zurad, Maciej; Annaiyan, Arun; Voos, Holger; Campoy, Pascual

    2015-12-12

Poaching is an illegal activity that remains out of control in many countries. Based on the 2014 report of the United Nations and Interpol, the illegal trade in global wildlife and natural resources amounts to nearly $213 billion every year, which is even helping to fund armed conflicts. Poaching activities around the world are further pushing many animal species to the brink of extinction. Unfortunately, traditional methods of fighting poachers are not enough, hence the demand for more efficient approaches. In this context, the use of new sensor and algorithm technologies, as well as aerial platforms, is crucial to confront the sharp increase in poaching activities in recent years. Our work is focused on the use of vision sensors on UAVs for the detection and tracking of animals and poachers, as well as the use of such sensors to control quadrotors during autonomous vehicle following and autonomous landing.

  16. Towards an Autonomous Vision-Based Unmanned Aerial System against Wildlife Poachers

    PubMed Central

    Olivares-Mendez, Miguel A.; Fu, Changhong; Ludivig, Philippe; Bissyandé, Tegawendé F.; Kannan, Somasundar; Zurad, Maciej; Annaiyan, Arun; Voos, Holger; Campoy, Pascual

    2015-01-01

Poaching is an illegal activity that remains out of control in many countries. Based on the 2014 report of the United Nations and Interpol, the illegal trade in global wildlife and natural resources amounts to nearly $213 billion every year, which is even helping to fund armed conflicts. Poaching activities around the world are further pushing many animal species to the brink of extinction. Unfortunately, traditional methods of fighting poachers are not enough, hence the demand for more efficient approaches. In this context, the use of new sensor and algorithm technologies, as well as aerial platforms, is crucial to confront the sharp increase in poaching activities in recent years. Our work is focused on the use of vision sensors on UAVs for the detection and tracking of animals and poachers, as well as the use of such sensors to control quadrotors during autonomous vehicle following and autonomous landing. PMID:26703597

  17. Research on an autonomous vision-guided helicopter

    NASA Technical Reports Server (NTRS)

    Amidi, Omead; Mesaki, Yuji; Kanade, Takeo

    1994-01-01

    Integration of computer vision with on-board sensors to autonomously fly helicopters was researched. The key components developed were custom designed vision processing hardware and an indoor testbed. The custom designed hardware provided flexible integration of on-board sensors with real-time image processing resulting in a significant improvement in vision-based state estimation. The indoor testbed provided convenient calibrated experimentation in constructing real autonomous systems.

  18. Design of a dynamic test platform for autonomous robot vision systems

    NASA Technical Reports Server (NTRS)

    Rich, G. C.

    1980-01-01

The concept and design of a dynamic test platform for the development and evaluation of a robot vision system are discussed. The platform is to serve as a diagnostic and developmental tool for future work with the RPI Mars Rover's multi-laser/multi-detector vision system. The platform allows testing of the vision system while its attitude is varied, statically or periodically. The vision system is mounted on the test platform, where it can be subjected to a wide variety of simulated Rover motions and thus examined in a controlled, quantitative fashion. Defining and modeling Rover motions and designing the platform to emulate these motions are also discussed. Individual aspects of the design process, such as the structure, driving linkages, and motors and transmissions, are treated separately.

  19. Enhanced and synthetic vision system for autonomous all weather approach and landing

    NASA Astrophysics Data System (ADS)

    Korn, Bernd R.

    2007-04-01

Within its research project ADVISE-PRO (Advanced visual system for situation awareness enhancement - prototype, 2003-2006), presented in this contribution, DLR has combined elements of Enhanced Vision and Synthetic Vision into one integrated system to allow all low-visibility operations independently of ground infrastructure. The core element of this system is the fusion of all information available on board, organized in a hierarchical manner. The most important subsystems are: a) sensor-based navigation, which determines the aircraft's position relative to the runway by automatically analyzing sensor data (MMW radar, IR, radar altimeter) without using either (D)GPS or precise knowledge of the airport geometry; b) integrity monitoring of navigation and terrain data, which verifies on-board navigation data ((D)GPS + INS) against sensor data (MMW radar, IR sensor, radar altimeter) and airport/terrain databases; c) an obstacle detection system; and finally d) a consistent description of the situation and a corresponding HMI for the pilot.

  20. Vision Based Autonomous Robotic Control for Advanced Inspection and Repair

    NASA Technical Reports Server (NTRS)

    Wehner, Walter S.

    2014-01-01

    The advanced inspection system is an autonomous control and analysis system that improves the inspection and remediation operations for ground and surface systems. It uses optical imaging technology with intelligent computer vision algorithms to analyze physical features of the real-world environment to make decisions and learn from experience. The advanced inspection system plans to control a robotic manipulator arm, an unmanned ground vehicle and cameras remotely, automatically and autonomously. There are many computer vision, image processing and machine learning techniques available as open source for using vision as a sensory feedback in decision-making and autonomous robotic movement. My responsibilities for the advanced inspection system are to create a software architecture that integrates and provides a framework for all the different subsystem components; identify open-source algorithms and techniques; and integrate robot hardware.

  1. A survey of autonomous vision-based See and Avoid for Unmanned Aircraft Systems

    NASA Astrophysics Data System (ADS)

    Mcfadyen, Aaron; Mejias, Luis

    2016-01-01

    This paper provides a comprehensive review of the vision-based See and Avoid problem for unmanned aircraft. The unique problem environment and associated constraints are detailed, followed by an in-depth analysis of visual sensing limitations. In light of such detection and estimation constraints, relevant human, aircraft and robot collision avoidance concepts are then compared from a decision and control perspective. Remarks on system evaluation and certification are also included to provide a holistic review approach. The intention of this work is to clarify common misconceptions, realistically bound feasible design expectations and offer new research directions. It is hoped that this paper will help us to unify design efforts across the aerospace and robotics communities.

  2. NAVIGATOR: Autonomous navigation system for planetary exploration landers based on stereo vision

    NASA Astrophysics Data System (ADS)

    Guizzo, G. P.; Angrilli, F.; Vukman, I.

    2001-11-01

This generation of lander navigation system was developed to cover a wide spectrum of soft-landing scenarios on bodies such as Mars, Mercury, or the Moon; it is also particularly useful when there is no a priori knowledge of the ground. The navigation system studied here is completely autonomous and able to land on various kinds of hazardous, uneven terrain such as mountain tops, the bottoms of illuminated craters, valleys, or small plateaus. In order to choose an adequate landing site, the navigation system uses stereo image pairs, obtained with a single camera at high altitudes and with two cameras at lower ones, to produce a digital elevation model (DEM) of the terrain. It uses dense disparity maps, computed from the sign-of-Laplacian-of-Gaussian (SLOG) of the images, as input for the vertical locus method. This technique takes advantage of fast and easy image processing, since the required algorithms can be implemented efficiently in an ASIC. A DSP is then used for the remaining software operations, i.e., piloting and guidance of the lander. Another advantage of this method is that, unlike most methods involving active sensors, it can be used even with the camera at great distances from the target (e.g., at the beginning of the approach phase), since it is limited only by the camera's field of view and resolution. Moreover, it does not rely on any special features of the terrain, such as craters, rocks, or other landmarks; it is sufficient to have an illuminated and slightly textured terrain. The navigation system was validated using a synthetic terrain generator, which allowed closed-loop simulation of the entire system.
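The SLOG representation is cheap to compute and match: Gaussian-blur the image, take the discrete Laplacian, and keep only the sign of each pixel. The sketch below builds the sign map in NumPy and estimates a single global disparity between two views by maximizing sign agreement; the paper's dense per-pixel disparity maps and vertical-locus method are more elaborate, and the kernel size and agreement score here are assumptions:

```python
import numpy as np

def slog(img, sigma=1.0):
    """Sign of the Laplacian of a Gaussian-blurred image: +1 / -1 per pixel."""
    r = int(3 * sigma)
    x = np.arange(-r, r + 1)
    g = np.exp(-x**2 / (2 * sigma**2))
    g /= g.sum()
    # Separable Gaussian blur, then a 4-neighbor discrete Laplacian.
    blur = np.apply_along_axis(lambda m: np.convolve(m, g, "same"), 0, img.astype(float))
    blur = np.apply_along_axis(lambda m: np.convolve(m, g, "same"), 1, blur)
    lap = (np.roll(blur, 1, 0) + np.roll(blur, -1, 0) +
           np.roll(blur, 1, 1) + np.roll(blur, -1, 1) - 4 * blur)
    return np.where(lap >= 0, 1, -1)

def estimate_disparity(sl, sr, max_d):
    """Global horizontal shift maximizing SLOG sign agreement between two views."""
    w = sl.shape[1]
    scores = [np.mean(sl[:, :w - d] == sr[:, d:]) for d in range(max_d + 1)]
    return int(np.argmax(scores))
```

Because each pixel carries only one bit, agreement scores reduce to bit comparisons, which is what makes an ASIC implementation of the matching step attractive.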

  3. Infrared sensors and systems for enhanced vision/autonomous landing applications

    NASA Technical Reports Server (NTRS)

    Kerr, J. Richard

    1993-01-01

    There exists a large body of data spanning more than two decades, regarding the ability of infrared imagers to 'see' through fog, i.e., in Category III weather conditions. Much of this data is anecdotal, highly specialized, and/or proprietary. In order to determine the efficacy and cost effectiveness of these sensors under a variety of climatic/weather conditions, there is a need for systematic data spanning a significant range of slant-path scenarios. These data should include simultaneous video recordings at visible, midwave (3-5 microns), and longwave (8-12 microns) wavelengths, with airborne weather pods that include the capability of determining the fog droplet size distributions. Existing data tend to show that infrared is more effective than would be expected from analysis and modeling. It is particularly more effective for inland (radiation) fog as compared to coastal (advection) fog, although both of these archetypes are oversimplifications. In addition, as would be expected from droplet size vs wavelength considerations, longwave outperforms midwave, in many cases by very substantial margins. Longwave also benefits from the higher level of available thermal energy at ambient temperatures. The principal attraction of midwave sensors is that staring focal plane technology is available at attractive cost-performance levels. However, longwave technology such as that developed at FLIR Systems, Inc. (FSI), has achieved high performance in small, economical, reliable imagers utilizing serial-parallel scanning techniques. In addition, FSI has developed dual-waveband systems particularly suited for enhanced vision flight testing. These systems include a substantial, embedded processing capability which can perform video-rate image enhancement and multisensor fusion. This is achieved with proprietary algorithms and includes such operations as real-time histograms, convolutions, and fast Fourier transforms.

  4. Insect-Based Vision for Autonomous Vehicles: A Feasibility Study

    NASA Technical Reports Server (NTRS)

    Srinivasan, Mandyam V.

    1999-01-01

    The aims of the project were to use a high-speed digital video camera to pursue two questions: (1) To explore the influence of temporal imaging constraints on the performance of vision systems for autonomous mobile robots; (2) To study the fine structure of insect flight trajectories in order to better understand the characteristics of flight control, orientation and navigation.

  5. Insect-Based Vision for Autonomous Vehicles: A Feasibility Study

    NASA Technical Reports Server (NTRS)

    Srinivasan, Mandyam V.

    1999-01-01

The aims of the project were to use a high-speed digital video camera to pursue two questions: (1) To explore the influence of temporal imaging constraints on the performance of vision systems for autonomous mobile robots; (2) To study the fine structure of insect flight trajectories in order to better understand the characteristics of flight control, orientation and navigation.

  6. Insect-Based Vision for Autonomous Vehicles: A Feasibility Study

    NASA Technical Reports Server (NTRS)

    Srinivasan, Mandyam V.

    1999-01-01

    The aims of the project were to use a high-speed digital video camera to pursue two questions: (1) To explore the influence of temporal imaging constraints on the performance of vision systems for autonomous mobile robots; (2) To study the fine structure of insect flight trajectories in order to better understand the characteristics of flight control, orientation and navigation.

  7. Real-time performance of a hands-free semi-autonomous wheelchair system using a combination of stereoscopic and spherical vision.

    PubMed

    Nguyen, Jordan S; Nguyen, Tuan Nghia; Tran, Yvonne; Su, Steven W; Craig, Ashley; Nguyen, Hung T

    2012-01-01

This paper is concerned with the operational performance of a semi-autonomous wheelchair system named TIM (Thought-controlled Intelligent Machine), which uses cameras in a system configuration modeled on the vision system of a horse. This new camera configuration utilizes stereoscopic vision for 3-Dimensional (3D) depth perception and mapping ahead of the wheelchair, combined with a spherical camera system for 360 degrees of monocular vision. This unique combination allows static components of an unknown environment to be mapped and any surrounding dynamic obstacles to be detected during real-time autonomous navigation, minimizing blind spots and preventing accidental collisions with people or obstacles. Combining this vision system with a shared control strategy provides intelligent assistive guidance during wheelchair navigation, and can accompany any hands-free wheelchair control technology for people with severe physical disability. Testing of this system in crowded dynamic environments has demonstrated the feasibility and real-time performance of the system when assisting hands-free control technologies, in this case a proof-of-concept brain-computer interface (BCI).

  8. Multisensorial Vision For Autonomous Vehicle Driving

    NASA Astrophysics Data System (ADS)

    Giusto, Daniele D.; Regazzoni, Carlo S.; Vernazza, Gianni L.

    1989-03-01

    A multisensorial vision system for autonomous vehicle driving is presented that operates in outdoor natural environments. The system, currently under development in our laboratories, will be able to integrate data provided by different sensors in order to achieve a more reliable description of a scene and to meet safety requirements. We chose to perform a high-level symbolic fusion of the data to better accomplish the recognition task. A knowledge-based approach is followed, which provides a more accurate solution; in particular, it will be possible to integrate both physical data, furnished by each channel, and different fusion strategies by using an appropriate control structure. The high complexity of data integration is reduced by acquiring, filtering, segmenting, and extracting features from each sensor channel. Production rules, divided into groups according to specific goals, drive the fusion process, linking to a symbolic frame all the segmented regions characterized by similar properties. As a first application, road and obstacle detection is performed. A particular fusion strategy is tested that integrates results obtained separately by applying the recognition module to each sensor according to the related model description. Preliminary results are very promising and confirm the validity of the proposed approach.

  9. INL Autonomous Navigation System

    SciTech Connect

    2005-03-30

    The INL Autonomous Navigation System provides instructions for autonomously navigating a robot. The system permits high-speed autonomous navigation, including obstacle avoidance, waypoint navigation, and path planning in both indoor and outdoor environments.

  10. Three-dimensional vision sensors for autonomous robots

    NASA Astrophysics Data System (ADS)

    Uchiyama, Takashi; Okabayashi, Keizyu; Wakitani, Jun

    1993-09-01

    A three-dimensional measurement system, which is important for developing autonomous robots, is described. Industrial robots used in today's plants are of the preprogrammed teaching-playback type. It is necessary to develop autonomous robots which can work based on sensor information for intelligent manufacturing systems. Moreover, practical use of robots which work in unstructured environments such as outdoors and in space is expected. To realize this, a function to measure objects and the environment three-dimensionally is a key technology. Additional important requirements for robotic sensors are real-time processing and compactness. We have developed smart 3-D vision sensors for the purpose of realizing autonomous robots. These are two kinds of sensors with different functions corresponding to the application. One is a slitted light range finder (SLRF) to measure stationary objects. The other is a real-time tracking vision (RTTV) system which can measure moving objects at high speed. The SLRF uses multiple slitted lights which are generated by a semiconductor laser through an interference filter and a cylindrical lens. Furthermore, we developed a liquid crystal shutter with multiple electrodes. We devised a technique to make coded slitted light by placing this shutter in front of the light source. As a result, using the principle of triangulation, objects can be measured in three dimensions. In addition, high-speed image input was enabled by projecting multiple slitted lights at the same time. We have confirmed the effectiveness of the SLRF applied to a hand-eye system using a robot.
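
Projecting several slits at once only works if each pixel can tell which slit illuminated it. A common way to realize the "coded slitted light" idea is to flash a sequence of Gray-code patterns through the shutter and decode the slit index per pixel; the abstract does not state which code the SLRF used, so Gray coding here is an assumption:

```python
# Sketch of decoding a Gray-coded slit index from the lit/dark states a
# pixel observes under successive shutter patterns (Gray coding is an
# assumed example; the SLRF's actual code is not given in the abstract).

def gray_to_index(bits):
    """Convert a tuple of Gray-code bits (MSB first) to the slit index."""
    index = 0
    acc = 0
    for b in bits:
        acc ^= b          # running XOR recovers the plain binary bit
        index = (index << 1) | acc
    return index

# A pixel seen as dark/lit/lit/dark under 4 patterns -> slit number 4.
print(gray_to_index((0, 1, 1, 0)))  # Gray 0110 -> binary 0100 -> 4
```

With the slit index known, the pixel column and the slit's projection angle feed the usual triangulation to give depth.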

  11. Coherent laser vision system

    SciTech Connect

    Sebastion, R.L.

    1995-10-01

    The Coherent Laser Vision System (CLVS) is being developed to provide precision real-time 3D world views to support site characterization and robotic operations during facilities decontamination and decommissioning. Autonomous or semi-autonomous robotic operations require an accurate, up-to-date 3D world view. Existing technologies for real-time 3D imaging, such as AM laser radar, have limited accuracy at significant ranges and have variability in range estimates caused by lighting or surface shading. Recent advances in fiber optic component technology and digital processing components have enabled the development of a new 3D vision system based upon a fiber optic FMCW coherent laser radar. The approach includes a compact scanner with no moving parts capable of randomly addressing all pixels. The system maintains the immunity to lighting and surface shading conditions that is characteristic of coherent laser radar. The random pixel addressability allows concentration of scanning and processing on the active areas of a scene, as is done by the human eye-brain system.
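
In an FMCW radar the range falls out of the beat frequency between the transmitted chirp and its echo. As a minimal sketch of that relation (the sweep parameters below are illustrative assumptions, not CLVS specifications):

```python
# FMCW laser radar range estimate from the measured beat frequency:
# R = c * f_beat * T_sweep / (2 * B). Numbers are illustrative only.

C = 299_792_458.0  # speed of light, m/s

def fmcw_range(f_beat_hz: float, sweep_time_s: float, sweep_bandwidth_hz: float) -> float:
    return C * f_beat_hz * sweep_time_s / (2.0 * sweep_bandwidth_hz)

# A 100 GHz optical frequency sweep over 1 ms producing a 2 MHz beat
# corresponds to a target at roughly 3 m:
r = fmcw_range(2e6, 1e-3, 100e9)
print(round(r, 4))  # -> 2.9979 metres
```

Because the range is encoded in frequency rather than echo amplitude, the estimate is insensitive to lighting and surface shading, which is the immunity the abstract refers to.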

  12. Autonomous robot vision software design using Matlab toolboxes

    NASA Astrophysics Data System (ADS)

    Tedder, Maurice; Chung, Chan-Jin

    2004-10-01

    The purpose of this paper is to introduce a cost-effective way to design robot vision and control software using Matlab for an autonomous robot designed to compete in the 2004 Intelligent Ground Vehicle Competition (IGVC). The goal of the autonomous challenge event is for the robot to autonomously navigate an outdoor obstacle course bounded by solid and dashed lines on the ground. Visual input data is provided by a DV camcorder at 160 x 120 pixel resolution. The design of this system involved writing an image-processing algorithm using hue, saturation, and brightness (HSB) color filtering and Matlab image processing functions to extract the centroid, area, and orientation of the connected regions from the scene. These feature vectors are then mapped to linguistic variables that describe the objects in the world environment model. The linguistic variables act as inputs to a fuzzy logic controller designed using the Matlab fuzzy logic toolbox, which provides the knowledge and intelligence component necessary to achieve the desired goal. Java provides the central interface to the robot motion control and image acquisition components. Field test results indicate that the Matlab-based solution allows for rapid software design, development and modification of our robot system.
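
The HSB filtering and centroid-extraction step described above can be sketched in a few lines. The paper's implementation was in Matlab; this pure-Python version is an illustrative re-implementation with arbitrarily chosen thresholds:

```python
# Minimal sketch of HSB (HSV) color filtering followed by centroid
# extraction. Thresholds and the toy image are illustrative assumptions.
import colorsys

def mask_and_centroid(pixels, hue_lo, hue_hi, sat_min, val_min):
    """pixels: dict {(x, y): (r, g, b)} with channels in 0..1.
    Returns the centroid (x, y) of pixels passing the filter, or None."""
    xs, ys = [], []
    for (x, y), (r, g, b) in pixels.items():
        h, s, v = colorsys.rgb_to_hsv(r, g, b)
        if hue_lo <= h <= hue_hi and s >= sat_min and v >= val_min:
            xs.append(x)
            ys.append(y)
    if not xs:
        return None
    return (sum(xs) / len(xs), sum(ys) / len(ys))

# Two orange-ish pixels and one white pixel; centroid of the orange region:
img = {(0, 0): (1.0, 0.4, 0.0), (2, 2): (1.0, 0.5, 0.1), (5, 5): (1.0, 1.0, 1.0)}
print(mask_and_centroid(img, 0.0, 0.2, 0.5, 0.5))  # -> (1.0, 1.0)
```

The resulting centroid, area, and orientation of each surviving region are what get mapped to linguistic variables for the fuzzy controller.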

  13. Computer vision sensor for autonomous helicopter hover stabilization

    NASA Astrophysics Data System (ADS)

    Oertel, Carl-Henrik

    1997-06-01

    Sensors for synthetic vision are needed to extend the mission profiles of helicopters. A special task for various applications is the autonomous position hold of a helicopter above a ground-fixed or moving target. A computer-vision-based system, which is able to observe the helicopter flight state during hover and low speed, based on the detection and tracking of significant but arbitrary features, has been developed by the Institute of Flight Mechanics of DLR Deutsche Forschungsanstalt fuer Luft- und Raumfahrt e.V. The approach is as follows: A CCD camera looks straight downward to the ground and produces an image of the ground view. The digitized video signal is fed into a high-performance on-board computer which looks for distinctive features in the image. Any motion of the helicopter results in movements of these patterns in the camera image. By tracking the distinctive features during the succession of incoming images and by the support of inertial sensor data, it is possible to calculate all necessary helicopter state variables, which are needed for a position hold control algorithm. This information is gained from a state variable observer. That means that no additional information about the appearance of the camera view has to be known in advance to achieve autonomous helicopter hover stabilization. The hardware architecture for this image evaluation system mainly consists of several PowerPC processors which communicate with the aid of transputers and an image distribution bus. Feature tracking is performed by a dedicated 2D-correlator subsystem. The paper presents the characteristics of the computer vision sensor and demonstrates its functionality.
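
The feature-tracking step can be sketched as block matching: find the shift that minimizes the sum of absolute differences (SAD) between a template patch and the next frame. This toy version on integer grids is an illustration of the principle, not DLR's dedicated 2D-correlator:

```python
# Toy block-matching tracker: exhaustive SAD search for the best patch
# shift between frames. Grids and search range are illustrative.

def best_shift(patch, frame, search=2):
    """patch: small 2-D list; frame: larger 2-D list.
    Returns the (dy, dx) in 0..search that best matches patch."""
    ph, pw = len(patch), len(patch[0])
    best = None
    for dy in range(search + 1):
        for dx in range(search + 1):
            sad = sum(abs(patch[y][x] - frame[y + dy][x + dx])
                      for y in range(ph) for x in range(pw))
            if best is None or sad < best[0]:
                best = (sad, (dy, dx))
    return best[1]

patch = [[9, 1], [1, 9]]
frame = [[0, 0, 0, 0],
         [0, 9, 1, 0],
         [0, 1, 9, 0],
         [0, 0, 0, 0]]
print(best_shift(patch, frame))  # the pattern moved by (1, 1)
```

The per-feature image shifts, fused with inertial data in the state observer, yield the translational and rotational state needed by the position-hold controller.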

  14. An autonomous vision-based mobile robot

    NASA Astrophysics Data System (ADS)

    Baumgartner, Eric Thomas

    This dissertation describes estimation and control methods for use in the development of an autonomous mobile robot for structured environments. The navigation of the mobile robot is based on precise estimates of the position and orientation of the robot within its environment. The extended Kalman filter algorithm is used to combine information from the robot's drive wheels with periodic observations of small, wall-mounted, visual cues to produce the precise position and orientation estimates. The visual cues are reliably detected by at least one video camera mounted on the mobile robot. Typical position estimates are accurate to within one inch. A path tracking algorithm is also developed to follow desired reference paths which are taught by a human operator. Because of the time-independence of the tracking algorithm, the speed at which the vehicle travels along the reference path can be specified independently of the tracking algorithm. The estimation and control methods have been applied successfully to two experimental vehicle systems. Finally, an analysis of the linearized closed-loop control system is performed to study the behavior and the stability of the system as a function of various control parameters.
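
The core idea, dead-reckoned odometry drifts and periodic observations of wall-mounted cues correct it, can be shown in one dimension. A full EKF tracks 2-D position plus orientation; this scalar Kalman filter (with all noise values assumed) only illustrates the predict/update cycle:

```python
# One-dimensional Kalman sketch of odometry prediction plus periodic
# visual-cue correction. All numbers are illustrative assumptions.

def kalman_step(x, p, odom_dx, q, z=None, r=None):
    """Predict with odometry; optionally update with cue observation z."""
    x, p = x + odom_dx, p + q          # predict: uncertainty grows
    if z is not None:
        k = p / (p + r)                # Kalman gain
        x, p = x + k * (z - x), (1 - k) * p
    return x, p

x, p = 0.0, 0.0
for _ in range(10):                    # ten biased odometry steps
    x, p = kalman_step(x, p, odom_dx=1.02, q=0.01)
x, p = kalman_step(x, p, 0.0, 0.0, z=10.0, r=0.05)  # cue known to be at 10.0
print(round(x, 2), round(p, 3))        # estimate pulled back toward 10.0
```

After the update the estimate moves most of the way from the drifted 10.2 toward the observed 10.0, and the variance shrinks, which is exactly why occasional cue sightings keep inch-level accuracy.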

  15. Autonomic Nervous System Disorders

    MedlinePlus

    Your autonomic nervous system is the part of your nervous system that controls involuntary actions, such as the beating of your heart and ... blood vessels. When something goes wrong in this system, it can cause serious problems, including Blood pressure ...

  16. Vision-Aided Autonomous Landing and Ingress of Micro Aerial Vehicles

    NASA Technical Reports Server (NTRS)

    Brockers, Roland; Ma, Jeremy C.; Matthies, Larry H.; Bouffard, Patrick

    2012-01-01

    Micro aerial vehicles have limited sensor suites and computational power. For reconnaissance tasks and to conserve energy, these systems need the ability to autonomously land at vantage points or enter buildings (ingress). But for autonomous navigation, information is needed to identify and guide the vehicle to the target. Vision algorithms can provide egomotion estimation and target detection using input from cameras that are easy to include in miniature systems.

  17. Biomimetic machine vision system.

    PubMed

    Harman, William M; Barrett, Steven F; Wright, Cameron H G; Wilcox, Michael

    2005-01-01

    Real-time application of digital imaging for use in machine vision systems has proven to be prohibitive when used within control systems that employ low-power single processors without compromising the scope of vision or resolution of captured images. Development of a real-time analog machine vision system is the focus of research taking place at the University of Wyoming. This new vision system is based upon the biological vision system of the common house fly. Development of a single sensor is accomplished, representing a single facet of the fly's eye. This new sensor is then incorporated into an array of sensors capable of detecting objects and tracking motion in 2-D space. This system "preprocesses" incoming image data, resulting in minimal data processing to determine the location of a target object. Due to the nature of the sensors in the array, hyperacuity is achieved, thereby eliminating resolution issues found in digital vision systems. In this paper, we will discuss the biological traits of the fly eye and the specific traits that led to the development of this machine vision system. We will also discuss the process of developing an analog sensor that mimics the characteristics of interest in the biological vision system. This paper will conclude with a discussion of how an array of these sensors can be applied toward solving real-world machine vision issues.

  18. FPGA implementation of vision algorithms for small autonomous robots

    NASA Astrophysics Data System (ADS)

    Anderson, J. D.; Lee, D. J.; Archibald, J. K.

    2005-10-01

    The use of on-board vision with small autonomous robots has been made possible by the advances in the field of Field Programmable Gate Array (FPGA) technology. By connecting a CMOS camera to an FPGA board, on-board vision has been used to reduce the computation time inherent in vision algorithms. The FPGA board allows the user to create custom hardware in a faster, safer, and more easily verifiable manner that decreases the computation time and allows the vision to be done in real-time. Real-time vision tasks for small autonomous robots include object tracking, obstacle detection and avoidance, and path planning. Competitions were created to demonstrate that our algorithms work with our small autonomous vehicles in dealing with these problems. These competitions include Mouse-Trapped-in-a-Box, where the robot has to detect the edges of a box that it is trapped in and move towards them without touching them; Obstacle Avoidance, where an obstacle is placed at any arbitrary point in front of the robot and the robot has to navigate itself around the obstacle; Canyon Following, where the robot has to move to the center of a canyon and follow the canyon walls trying to stay in the center; the Grand Challenge, where the robot had to navigate a hallway and return to its original position in a given amount of time; and Stereo Vision, where a separate robot had to catch tennis balls launched from an air powered cannon. Teams competed on each of these competitions that were designed for a graduate-level robotic vision class, and each team had to develop their own algorithm and hardware components. This paper discusses one team's approach to each of these problems.

  19. Autonomous Flight Safety System

    NASA Technical Reports Server (NTRS)

    Simpson, James

    2010-01-01

    The Autonomous Flight Safety System (AFSS) is an independent self-contained subsystem mounted onboard a launch vehicle. AFSS has been developed by and is owned by the US Government. Autonomously makes flight termination/destruct decisions using configurable software-based rules implemented on redundant flight processors using data from redundant GPS/IMU navigation sensors. AFSS implements rules determined by the appropriate Range Safety officials.

  20. Towards an Autonomic Cluster Management System (ACMS) with Reflex Autonomicity

    NASA Technical Reports Server (NTRS)

    Truszkowski, Walt; Hinchey, Mike; Sterritt, Roy

    2005-01-01

    Cluster computing, whereby a large number of simple processors or nodes are combined together to apparently function as a single powerful computer, has emerged as a research area in its own right. The approach offers a relatively inexpensive means of providing a fault-tolerant environment and achieving significant computational capabilities for high-performance computing applications. However, the task of manually managing and configuring a cluster quickly becomes daunting as the cluster grows in size. Autonomic computing, with its vision to provide self-management, can potentially solve many of the problems inherent in cluster management. We describe the development of a prototype Autonomic Cluster Management System (ACMS) that exploits autonomic properties in automating cluster management and its evolution to include reflex reactions via pulse monitoring.
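
The "reflex reactions via pulse monitoring" can be pictured as a heartbeat check: each node emits periodic pulses, and a missed deadline triggers a reflex response. The node names and timeout below are illustrative assumptions, not details of the ACMS prototype:

```python
# Hypothetical pulse-monitoring reflex: report which nodes' heartbeats
# are overdue so the autonomic manager can react (restart, reassign work).

def check_pulses(last_beat, now, timeout):
    """last_beat: {node_name: last_heartbeat_time}.
    Return the sorted list of nodes whose heartbeat is overdue."""
    return sorted(n for n, t in last_beat.items() if now - t > timeout)

beats = {"node-a": 9.5, "node-b": 4.0, "node-c": 9.9}
print(check_pulses(beats, now=10.0, timeout=2.0))  # -> ['node-b']
```

The reflex loop runs continuously, so the cluster self-heals without an administrator manually watching node health.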

  1. Architecture of autonomous systems

    NASA Technical Reports Server (NTRS)

    Dikshit, Piyush; Guimaraes, Katia; Ramamurthy, Maya; Agrawala, Ashok; Larsen, Ronald L.

    1989-01-01

    Automation of Space Station functions and activities, particularly those involving robotic capabilities with interactive or supervisory human control, is a complex, multi-disciplinary systems design problem. A wide variety of applications using autonomous control can be found in the literature, but none of them seem to address the problem in general. All of them are designed with a specific application in mind. In this report, an abstract model is described which unifies the key concepts underlying the design of automated systems such as those studied by the aerospace contractors. The model has been kept as general as possible. The attempt is to capture all the key components of autonomous systems. With a little effort, it should be possible to map the functions of any specific autonomous system application to the model presented here.

  2. Architecture of autonomous systems

    NASA Technical Reports Server (NTRS)

    Dikshit, Piyush; Guimaraes, Katia; Ramamurthy, Maya; Agrawala, Ashok; Larsen, Ronald L.

    1986-01-01

    Automation of Space Station functions and activities, particularly those involving robotic capabilities with interactive or supervisory human control, is a complex, multi-disciplinary systems design problem. A wide variety of applications using autonomous control can be found in the literature, but none of them seem to address the problem in general. All of them are designed with a specific application in mind. In this report, an abstract model is described which unifies the key concepts underlying the design of automated systems such as those studied by the aerospace contractors. The model has been kept as general as possible. The attempt is to capture all the key components of autonomous systems. With a little effort, it should be possible to map the functions of any specific autonomous system application to the model presented here.

  4. Autonomous Vision-Based Tethered-Assisted Rover Docking

    NASA Technical Reports Server (NTRS)

    Tsai, Dorian; Nesnas, Issa A.D.; Zarzhitsky, Dimitri

    2013-01-01

    Many intriguing science discoveries on planetary surfaces, such as the seasonal flows on crater walls and skylight entrances to lava tubes, are at sites that are currently inaccessible to state-of-the-art rovers. The in situ exploration of such sites is likely to require a tethered platform both for mechanical support and for providing power and communication. Mother/daughter architectures have been investigated where a mother deploys a tethered daughter into extreme terrains. Deploying and retracting a tethered daughter requires undocking and re-docking of the daughter to the mother, with the latter being the challenging part. In this paper, we describe a vision-based tether-assisted algorithm for the autonomous re-docking of a daughter to its mother following an extreme terrain excursion. The algorithm uses fiducials mounted on the mother to improve the reliability and accuracy of estimating the pose of the mother relative to the daughter. The tether that is anchored by the mother helps the docking process and increases the system's tolerance to pose uncertainties by mechanically aligning the mating parts in the final docking phase. A preliminary version of the algorithm was developed and field-tested on the Axel rover in the JPL Mars Yard. The algorithm achieved an 80% success rate in 40 experiments in both firm and loose soils, starting from up to 6 m away at up to 40 deg radial angle and 20 deg relative heading. The algorithm does not rely on an initial estimate of the relative pose. The preliminary results are promising and help retire the risk associated with the autonomous docking process, enabling its consideration in future Martian and lunar missions.

  6. Performance evaluation of 3D vision-based semi-autonomous control method for assistive robotic manipulator.

    PubMed

    Ka, Hyun W; Chung, Cheng-Shiu; Ding, Dan; James, Khara; Cooper, Rory

    2017-03-22

    We developed a 3D vision-based semi-autonomous control interface for assistive robotic manipulators. It was implemented based on one of the most popular commercially available assistive robotic manipulators combined with a low-cost depth-sensing camera mounted on the robot base. To perform a manipulation task with the 3D vision-based semi-autonomous control interface, a user starts operating with a manual control method available to him/her. When detecting objects within a set range, the control interface automatically stops the robot, and provides the user with possible manipulation options through audible text output, based on the detected object characteristics. Then, the system waits until the user states a voice command. Once the user command is given, the control interface drives the robot autonomously until the given command is completed. In the empirical evaluations conducted with human subjects from two different groups, it was shown that the semi-autonomous control can be used as an alternative control method to enable individuals with impaired motor control to more efficiently operate the robot arms by facilitating their fine motion control. The advantage of semi-autonomous control was less obvious for simple tasks, but for relatively complex real-life tasks the 3D vision-based semi-autonomous control showed significantly faster performance. Implications for Rehabilitation A 3D vision-based semi-autonomous control interface will improve clinical practice by providing an alternative control method that is less demanding physically as well as cognitively. A 3D vision-based semi-autonomous control provides the user with task-specific intelligent semi-autonomous manipulation assistance. A 3D vision-based semi-autonomous control gives the user the feeling that he or she is still in control at any moment. A 3D vision-based semi-autonomous control is compatible with different types of new and existing manual control methods for ARMs.

  7. Micro autonomous robotic system

    NASA Astrophysics Data System (ADS)

    Ishihara, Hidenori; Fukuda, Toshio

    1995-12-01

    This paper deals with the structural proposal of a micro autonomous robotic system and shows the design of a prototype. We aim at developing a micro robot that acts autonomously based on its own sensing, in order to propose a solution for constituting the micro autonomous robotic system. However, as the size is miniaturized, the number of sensors becomes restricted and the information available from them becomes limited. This lack of information makes it difficult to realize high-quality intelligence, so the micro robotic system needs a simple algorithm. In this paper, we propose simple logical algorithms to control the actuators and show the performance of the micro robot controlled by them. We design the Micro Line Trace Robot, about 1 cm cubed, which moves along a black line on white-colored ground, and a programmable micro autonomous robot, about 2 cm cubed, which performs according to an optionally loaded program.
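
A "simple logical" line-tracing algorithm of the kind the paper advocates can be sketched as a rule table over two reflectance sensors. The sensor arrangement and rules below are an illustrative assumption, not the paper's actual circuit:

```python
# Two-sensor line follower: each sensor reports dark (on the line) or
# light, and a fixed rule table chooses the motor command.

def line_trace_step(left_dark: bool, right_dark: bool) -> str:
    if left_dark and right_dark:
        return "forward"        # centered on the line
    if left_dark:
        return "turn_left"      # line drifting toward the left sensor
    if right_dark:
        return "turn_right"     # line drifting toward the right sensor
    return "search"             # line lost

print(line_trace_step(True, True))    # -> forward
print(line_trace_step(False, True))   # -> turn_right
```

Such purely reactive logic needs no memory or model, which is why it fits a robot with a 1 cm-cube footprint and almost no computing resources.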

  8. Synthetic Vision Systems

    NASA Technical Reports Server (NTRS)

    Prinzel, L.J.; Kramer, L.J.

    2009-01-01

    A synthetic vision system is an aircraft cockpit display technology that presents the visual environment external to the aircraft using computer-generated imagery in a manner analogous to how it would appear to the pilot if forward visibility were not restricted. The purpose of this chapter is to review the state of synthetic vision systems, and discuss selected human factors issues that should be considered when designing such displays.

  9. An Improved Vision-based Algorithm for Unmanned Aerial Vehicles Autonomous Landing

    NASA Astrophysics Data System (ADS)

    Zhao, Yunji; Pei, Hailong

    In the vision-based autonomous landing system of a UAV, the efficiency of target detection and tracking directly affects the control system. An improved version of SURF (Speeded-Up Robust Features) resolves the inefficiency of the standard SURF algorithm in the autonomous landing system. The improved algorithm is composed of three steps: first, detect the region of the target using Camshift; second, detect feature points in the acquired region using the SURF algorithm; third, match the template target against the target region in each frame. Experimental results and theoretical analysis testify to the efficiency of the algorithm.
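
The third step, matching template descriptors against those found in the frame, is commonly done with nearest-neighbour search plus Lowe's ratio test. This pure-Python sketch uses toy 4-D descriptors (real SURF descriptors are 64-D, and the paper's exact matching rule is not given, so everything here is illustrative):

```python
# Nearest-neighbour descriptor matching with a ratio test, sketched on
# toy 4-D descriptors. Descriptors and the 0.7 ratio are assumptions.

def euclid(a, b):
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

def match(template_desc, frame_desc, ratio=0.7):
    """Return (template_idx, frame_idx) pairs passing the ratio test."""
    pairs = []
    for i, d in enumerate(template_desc):
        dists = sorted((euclid(d, f), j) for j, f in enumerate(frame_desc))
        best, second = dists[0], dists[1]
        if best[0] < ratio * second[0]:   # unambiguous nearest neighbour
            pairs.append((i, best[1]))
    return pairs

tmpl = [(0.0, 0.0, 1.0, 1.0), (0.5, 0.5, 0.5, 0.5)]
frame = [(0.0, 0.1, 1.0, 1.0), (0.9, 0.9, 0.1, 0.1), (0.5, 0.5, 0.5, 0.4)]
print(match(tmpl, frame))  # -> [(0, 0), (1, 2)]
```

Restricting the SURF detection and matching to the Camshift region is what buys the speed-up: the expensive descriptor work runs on a small window instead of the whole frame.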

  10. Autonomous power expert system

    NASA Technical Reports Server (NTRS)

    Ringer, Mark J.; Quinn, Todd M.

    1990-01-01

    The goal of the Autonomous Power System (APS) program is to develop and apply intelligent problem solving and control technologies to the Space Station Freedom Electrical Power Systems (SSF/EPS). The objectives of the program are to establish artificial intelligence/expert system technology paths, to create knowledge-based tools with advanced human-operator interfaces, and to integrate and interface knowledge-based and conventional control schemes. This program is being developed at NASA Lewis Research Center. The APS Brassboard represents a subset of a 20 kHz Space Station Power Management And Distribution (PMAD) testbed. A distributed control scheme is used to manage multiple levels of computers and switchgear. The brassboard is comprised of a set of intelligent switchgear used to effectively switch power from the sources to the loads. The Autonomous Power Expert System (APEX) portion of the APS program integrates a knowledge-based fault diagnostic system, a power resource scheduler, and an interface to the APS Brassboard. The system includes knowledge bases for system diagnostics, fault detection and isolation, and recommended actions. The scheduler autonomously assigns start times to the attached loads based on temporal and power constraints. The scheduler is able to work in a near real-time environment for both scheduling and dynamic replanning.
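
The scheduler's job, assigning start times so that total draw never exceeds source capacity, can be sketched with a greedy earliest-fit strategy. The load data and the greedy rule are illustrative assumptions; APEX itself used a knowledge-based scheduler:

```python
# Toy power-constrained scheduler: place each load at the earliest start
# slot where its draw fits under capacity for its whole duration.

def schedule(loads, capacity, horizon):
    """loads: list of (name, watts, duration_slots).
    Returns {name: start_slot}; raises if a load cannot be placed."""
    usage = [0.0] * horizon
    starts = {}
    for name, watts, dur in loads:
        for s in range(horizon - dur + 1):
            if all(usage[t] + watts <= capacity for t in range(s, s + dur)):
                for t in range(s, s + dur):
                    usage[t] += watts
                starts[name] = s
                break
        else:
            raise RuntimeError(f"cannot place {name}")
    return starts

jobs = [("heater", 60, 2), ("pump", 50, 3), ("camera", 30, 1)]
print(schedule(jobs, capacity=100, horizon=6))
```

Here the pump is pushed to slot 2 because running it alongside the heater would exceed the 100 W capacity, while the small camera load fits concurrently in slot 0.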

  11. Autonomous Robotic Following Using Vision Based Techniques

    DTIC Science & Technology

    2005-02-03

    different methods for the soldier's control of the vehicle are being investigated. One such method is the Leader-Follower approach. In the Field So...what is the current state of the art for leader-follower applications? One of the leaders in this field is the RF ATD (Robotic Follower Advanced...these systems have in common? Both of these platforms are representative of the state-of-the-art of current leader-follower technology being tested by

  12. Nemesis Autonomous Test System

    NASA Technical Reports Server (NTRS)

    Barltrop, Kevin J.; Lee, Cin-Young; Horvath, Gregory A.; Clement, Bradley J.

    2012-01-01

    A generalized framework has been developed for systems validation that can be applied to both traditional and autonomous systems. The framework consists of an automated test case generation and execution system called Nemesis that rapidly and thoroughly identifies flaws or vulnerabilities within a system. By applying genetic optimization and goal-seeking algorithms on the test equipment side, a "war game" is conducted between a system and its complementary nemesis. The end result of the war games is a collection of scenarios that reveals any undesirable behaviors of the system under test. The software provides a reusable framework to evolve test scenarios with genetic algorithms, using an operational model of the system under test. It can automatically generate and execute test cases that reveal flaws in behaviorally complex systems. Genetic algorithms focus the exploration of tests on the set of test cases that most effectively reveals the flaws and vulnerabilities of the system under test. The framework leverages advances in state- and model-based engineering, which are essential in defining the behavior of autonomous systems, and uses goal networks to describe test scenarios.
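
The "war game" idea, evolving test cases toward inputs that most effectively expose flaws, can be sketched with a tiny genetic algorithm. The system under test and fitness function below are stand-ins, not Nemesis itself:

```python
# Toy genetic algorithm that evolves integer test vectors toward a
# flaw-revealing score. Population sizes and rates are arbitrary choices.
import random

def evolve(fitness, length=5, pop_size=20, generations=40, seed=1):
    rng = random.Random(seed)
    pop = [[rng.randint(0, 9) for _ in range(length)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        elite = pop[: pop_size // 2]           # keep the best half
        children = []
        for _ in range(pop_size - len(elite)):
            a, b = rng.sample(elite, 2)
            cut = rng.randrange(1, length)
            child = a[:cut] + b[cut:]          # one-point crossover
            if rng.random() < 0.3:             # occasional mutation
                child[rng.randrange(length)] = rng.randint(0, 9)
            children.append(child)
        pop = elite + children
    return max(pop, key=fitness)

# Stand-in flaw model: the system misbehaves worst when inputs are large,
# so fitness is simply the sum of the test vector.
best = evolve(lambda tc: sum(tc))
print(best)
```

In a real harness the fitness would be a measured score from executing the scenario against the system's operational model (e.g. constraint violations observed), rather than a closed-form function.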

  13. Autonomous Landing and Ingress of Micro-Air-Vehicles in Urban Environments Based on Monocular Vision

    NASA Technical Reports Server (NTRS)

    Brockers, Roland; Bouffard, Patrick; Ma, Jeremy; Matthies, Larry; Tomlin, Claire

    2011-01-01

    Unmanned micro air vehicles (MAVs) will play an important role in future reconnaissance and search and rescue applications. In order to conduct persistent surveillance and to conserve energy, MAVs need the ability to land, and they need the ability to enter (ingress) buildings and other structures to conduct reconnaissance. To be safe and practical under a wide range of environmental conditions, landing and ingress maneuvers must be autonomous, using real-time, onboard sensor feedback. To address these key behaviors, we present a novel method for vision-based autonomous MAV landing and ingress using a single camera for two urban scenarios: landing on an elevated surface, representative of a rooftop, and ingress through a rectangular opening, representative of a door or window. Real-world scenarios will not include special navigation markers, so we rely on tracking arbitrary scene features; however, we do currently exploit planarity of the scene. Our vision system uses a planar homography decomposition to detect navigation targets and to produce approach waypoints as inputs to the vehicle control algorithm. Scene perception, planning, and control run onboard in real-time; at present we obtain aircraft position knowledge from an external motion capture system, but we expect to replace this in the near future with a fully self-contained, onboard, vision-aided state estimation algorithm. We demonstrate autonomous vision-based landing and ingress target detection with two different quadrotor MAV platforms. To our knowledge, this is the first demonstration of onboard, vision-based autonomous landing and ingress algorithms that do not use special purpose scene markers to identify the destination.
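The homography decomposition itself involves recovering rotation, translation, and the plane normal; as a smaller self-contained illustration, here is how a 3x3 homography H maps image points with perspective division (the translation-only H is an assumed example, not taken from the paper):

```python
# Applying a 3x3 planar homography to image points. H here is a pure
# translation for illustration; real homographies encode the full
# projective mapping between two views of a plane.

def apply_homography(H, pt):
    """Map (x, y) through 3x3 H with perspective division."""
    x, y = pt
    w = H[2][0] * x + H[2][1] * y + H[2][2]
    return ((H[0][0] * x + H[0][1] * y + H[0][2]) / w,
            (H[1][0] * x + H[1][1] * y + H[1][2]) / w)

H = [[1.0, 0.0, 5.0],   # translate by (5, -2) in image coordinates
     [0.0, 1.0, -2.0],
     [0.0, 0.0, 1.0]]
print(apply_homography(H, (10.0, 10.0)))  # -> (15.0, 8.0)
```

In the landing/ingress pipeline, the homography fitted to tracked scene features is what separates points on the dominant plane (rooftop, wall) from points off it, yielding the navigation target and approach waypoints.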

  15. Autonomous Flight Safety System

    NASA Technical Reports Server (NTRS)

    Ferrell, Bob; Santuro, Steve; Simpson, James; Zoerner, Roger; Bull, Barton; Lanzi, Jim

    2004-01-01

    Autonomous Flight Safety System (AFSS) is an independent flight safety system designed for small to medium sized expendable launch vehicles launching from or needing range safety protection while overlying relatively remote locations. AFSS replaces the need for a man-in-the-loop to make decisions for flight termination. AFSS could also serve as the prototype for an autonomous manned flight crew escape advisory system. AFSS utilizes onboard sensors and processors to emulate the human decision-making process using rule-based software logic and can dramatically reduce safety response time during critical launch phases. The Range Safety flight path nominal trajectory, its deviation allowances, limit zones and other flight safety rules are stored in the onboard computers. Position, velocity and attitude data obtained from onboard global positioning system (GPS) and inertial navigation system (INS) sensors are compared with these rules to determine the appropriate action to ensure that people and property are not jeopardized. The final system will be fully redundant and independent with multiple processors, sensors, and dead man switches to prevent inadvertent flight termination. AFSS is currently in Phase III which includes updated algorithms, integrated GPS/INS sensors, large scale simulation testing and initial aircraft flight testing.
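The rule-based decision logic described above (a stored corridor around the nominal trajectory, limit zones, and GPS/INS state compared against them) might be caricatured as follows; the thresholds, the 5-second look-ahead, and the action names are invented for illustration:

```python
def safety_action(deviation_m, rate_mps, corridor_m, limit_m, lookahead_s=5.0):
    """Emulate the human range-safety decision with explicit rules.

    deviation_m: current cross-range deviation from the nominal trajectory
    rate_mps:    current deviation rate from the GPS/INS state
    corridor_m:  allowed deviation around the nominal trajectory
    limit_m:     hard limit-zone boundary
    """
    if abs(deviation_m) >= limit_m:
        return "TERMINATE"                       # limit zone violated
    if abs(deviation_m + rate_mps * lookahead_s) >= limit_m:
        return "ARM"                             # projected to violate the limit
    if abs(deviation_m) > corridor_m:
        return "WATCH"                           # outside corridor, inside limits
    return "NOMINAL"
```

A flight-qualified system would, of course, evaluate many such rules redundantly across processors, which is the point of the dead-man switches and voting mentioned above.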

  16. Visual navigation system for autonomous indoor blimps

    NASA Astrophysics Data System (ADS)

    Campos, Mario F.; de Souza Coelho, Lucio

    1999-07-01

    Autonomous dirigibles - aerial robots consisting of a blimp controlled by a computer based on information gathered by sensors - are a new and promising research field in robotics, offering several original work opportunities. One of them is the study of visual navigation of UAVs. In the work described in this paper, a computer vision and control system was developed to automatically perform very simple navigation tasks for a small indoor blimp. The vision system is able to track artificial visual beacons - objects with known geometrical properties - from which a geometrical method extracts information about the orientation of the blimp. The tracking of natural landmarks is also a possibility for the vision technique developed. The control system uses these data to keep the dirigible on a programmed orientation. Experimental results showing the correct and efficient functioning of the system are presented, and their implications and future possibilities are discussed.

  17. A Vision System Model

    DTIC Science & Technology

    1991-06-01

    ... and as basis functions in a recognition/reconstruction network, as well as methods for integrating color into a vision system. The third major ... appears to us internally. They are the color - redness, blueness, greenness - the appearance - fuzzy, crisp - etc. - by which we quantify objects we view

  18. Computer Vision Systems

    NASA Astrophysics Data System (ADS)

    Gunasekaran, Sundaram

    Food quality is of paramount consideration for all consumers, and its importance is perhaps second only to food safety. By some definitions, food safety is also incorporated into the broad categorization of food quality. Hence, the need for careful and accurate evaluation of food quality is at the forefront of research and development in both academia and industry. Among the many available methods for food quality evaluation, computer vision has proven to be the most powerful, especially for nondestructively extracting and quantifying many features that have direct relevance to food quality assessment and control. Furthermore, computer vision systems serve to rapidly evaluate the most readily observable food quality attributes - external characteristics such as color, shape, size, and surface texture. In addition, it is now possible, using advanced computer vision technologies, to “see” inside a food product and/or package to examine important quality attributes ordinarily unavailable to human evaluators. With rapid advances in electronic hardware and other associated imaging technologies, the cost-effectiveness and speed of computer vision systems have greatly improved, and many practical systems are already in place in the food industry.

  19. Merged Vision and GPS Control of a Semi-Autonomous, Small Helicopter

    NASA Technical Reports Server (NTRS)

    Rock, Stephen M.

    1999-01-01

    This final report documents the activities performed during the research period from April 1, 1996 to September 30, 1997. It contains three papers: Carrier Phase GPS and Computer Vision for Control of an Autonomous Helicopter; A Contestant in the 1997 International Aerospace Robotics Laboratory Stanford University; and Combined CDGPS and Vision-Based Control of a Small Autonomous Helicopter.

  20. Incremental inverse kinematics based vision servo for autonomous robotic capture of non-cooperative space debris

    NASA Astrophysics Data System (ADS)

    Dong, Gangqi; Zhu, Z. H.

    2016-04-01

    This paper proposes a new incremental inverse-kinematics-based vision servo approach for robotic manipulators to capture a non-cooperative target autonomously. The target's pose and motion are estimated by a vision system using integrated photogrammetry and an extended Kalman filter (EKF). Based on the estimated pose and motion of the target, the instantaneous desired position of the end-effector is predicted by inverse kinematics, and the robotic manipulator is moved incrementally from its current configuration subject to the joint speed limits. This approach effectively eliminates the multiple-solution ambiguity of inverse kinematics and increases the robustness of the control algorithm. The proposed approach is validated by a hardware-in-the-loop simulation, where the pose and motion of the non-cooperative target are estimated by a real vision system. The simulation results demonstrate the effectiveness and robustness of the proposed estimation approach for the target and the incremental control strategy for the robotic manipulator.
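The incremental scheme can be sketched on a hypothetical two-link planar arm: each cycle moves the joints a clamped step from the current configuration toward the predicted target, so no closed-form IK solution (and hence no choice among multiple solutions) is ever needed. A Jacobian-transpose update with an illustrative gain stands in for the paper's inverse-kinematics increment, and a fixed point stands in for the EKF output:

```python
import math

def fk(q, l1=1.0, l2=1.0):
    """Forward kinematics of a two-link planar arm."""
    x = l1 * math.cos(q[0]) + l2 * math.cos(q[0] + q[1])
    y = l1 * math.sin(q[0]) + l2 * math.sin(q[0] + q[1])
    return x, y

def incremental_step(q, target, gain=0.1, qdot_max=0.1, l1=1.0, l2=1.0):
    """One incremental step from the current configuration toward the
    predicted end-effector position, clamped to joint speed limits."""
    x, y = fk(q, l1, l2)
    ex, ey = target[0] - x, target[1] - y
    s1, s12 = math.sin(q[0]), math.sin(q[0] + q[1])
    c1, c12 = math.cos(q[0]), math.cos(q[0] + q[1])
    J = [[-l1 * s1 - l2 * s12, -l2 * s12],
         [ l1 * c1 + l2 * c12,  l2 * c12]]
    dq = [gain * (J[0][0] * ex + J[1][0] * ey),   # gain * J^T * error
          gain * (J[0][1] * ex + J[1][1] * ey)]
    return [qi + max(-qdot_max, min(qdot_max, d)) for qi, d in zip(q, dq)]

q = [0.3, 0.5]        # current joint angles (rad), invented
target = (0.8, 1.2)   # predicted grasp point, standing in for the vision estimate
for _ in range(1000):
    q = incremental_step(q, target)
```

Because each update starts from the current joints and is speed-limited, the arm follows a single continuous branch of the solution space instead of jumping between IK solutions.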

  1. Bird Vision System

    NASA Technical Reports Server (NTRS)

    2008-01-01

    The Bird Vision system is a multicamera photogrammetry software application that runs on a Microsoft Windows XP platform and was developed at Kennedy Space Center by ASRC Aerospace. This software system collects data about the locations of birds within a volume centered on the Space Shuttle and transmits the data in real time to the laptop computer of a test director in the Launch Control Center (LCC) Firing Room.

  2. Intelligent Mobile Autonomous System

    DTIC Science & Technology

    1987-01-01

    jerk application. (c) Negative jerk application. Group (a). Application of positive jerk. Force is increased from initial value to force of resistance...fundamentals of the new emerging area of autonomous robotics. The goal of this research is to develop a theory of design and functioning of Intelligent...scientific research. This report contributes to a new rapidly developing area of autonomous robotics. Actual experience of dealing with autonomous robots (or

  3. Space environment robot vision system

    NASA Technical Reports Server (NTRS)

    Wood, H. John; Eichhorn, William L.

    1990-01-01

    A prototype twin-camera stereo vision system for autonomous robots has been developed at Goddard Space Flight Center. Standard charge-coupled device (CCD) imagers are interfaced with commercial frame buffers and direct memory access to a computer. The overlapping portions of the images are analyzed using photogrammetric techniques to obtain information about the position and orientation of objects in the scene. The camera head consists of two 510 x 492 x 8-bit CCD cameras mounted on individually adjustable mounts. The 16-mm effective-focal-length lenses are designed for minimum geometric distortion. The cameras can be rotated in the pitch, roll, and yaw (pan angle) directions with respect to their optical axes. Calibration routines have been developed which automatically determine the lens focal lengths and the pan angle between the two cameras. The calibration utilizes observations of a calibration structure with known geometry. Test results show the attainable precision is plus or minus 0.8 mm in range at 2 m distance using a camera separation of 171 mm. To demonstrate a task needed on Space Station Freedom, a target structure with a movable I-beam was built. The camera head can autonomously direct actuators to dock the I-beam to another one so that the two can be bolted together.

  4. Cybersecurity for aerospace autonomous systems

    NASA Astrophysics Data System (ADS)

    Straub, Jeremy

    2015-05-01

    High-profile breaches have occurred across numerous information systems. One area where attacks are particularly problematic is autonomous control systems. This paper considers the aerospace information system, focusing on elements that interact with autonomous control systems (e.g., onboard UAVs). It discusses the trust placed in the autonomous systems and supporting systems (e.g., navigational aids) and how this trust can be validated. Approaches to remotely detecting UAV compromise, without relying on the onboard software (on a potentially compromised system) as part of the process, are discussed. How different levels of autonomy (task-based, goal-based, mission-based) impact this remote characterization is also considered.

  5. Autonomous power system brassboard

    NASA Technical Reports Server (NTRS)

    Merolla, Anthony

    1992-01-01

    The Autonomous Power System (APS) brassboard is a 20 kHz power distribution system which has been developed at NASA Lewis Research Center, Cleveland, Ohio. The brassboard exists to provide a realistic hardware platform capable of testing artificially intelligent (AI) software. The brassboard's power circuit topology is based upon a Power Distribution Control Unit (PDCU), which is a subset of an advanced development 20 kHz electrical power system (EPS) testbed, originally designed for Space Station Freedom (SSF). The APS program is designed to demonstrate the application of intelligent software as a fault detection, isolation, and recovery methodology for space power systems. This report discusses both the hardware and software elements used to construct the present configuration of the brassboard. The brassboard power components are described. These include the solid-state switches (herein referred to as switchgear), transformers, sources, and loads. Closely linked to this power portion of the brassboard is the first level of embedded control. Hardware used to implement this control and its associated software is discussed. An Ada software program, developed by Lewis Research Center's Space Station Freedom Directorate for their 20 kHz testbed, is used to control the brassboard's switchgear, as well as monitor key brassboard parameters through sensors located within these switches. The Ada code is downloaded from a PC/AT, and is resident within the 8086 microprocessor-based embedded controllers. The PC/AT is also used for smart terminal emulation, capable of controlling the switchgear as well as displaying data from them. Intelligent control is provided through use of a T1 Explorer and the Autonomous Power Expert (APEX) LISP software. Real-time load scheduling is implemented through use of a 'C' program-based scheduling engine. The methods of communication between these computers and the brassboard are explored. In order to evaluate the features of both the

  6. Autonomous power system brassboard

    NASA Astrophysics Data System (ADS)

    Merolla, Anthony

    1992-10-01

    The Autonomous Power System (APS) brassboard is a 20 kHz power distribution system which has been developed at NASA Lewis Research Center, Cleveland, Ohio. The brassboard exists to provide a realistic hardware platform capable of testing artificially intelligent (AI) software. The brassboard's power circuit topology is based upon a Power Distribution Control Unit (PDCU), which is a subset of an advanced development 20 kHz electrical power system (EPS) testbed, originally designed for Space Station Freedom (SSF). The APS program is designed to demonstrate the application of intelligent software as a fault detection, isolation, and recovery methodology for space power systems. This report discusses both the hardware and software elements used to construct the present configuration of the brassboard. The brassboard power components are described. These include the solid-state switches (herein referred to as switchgear), transformers, sources, and loads. Closely linked to this power portion of the brassboard is the first level of embedded control. Hardware used to implement this control and its associated software is discussed. An Ada software program, developed by Lewis Research Center's Space Station Freedom Directorate for their 20 kHz testbed, is used to control the brassboard's switchgear, as well as monitor key brassboard parameters through sensors located within these switches. The Ada code is downloaded from a PC/AT, and is resident within the 8086 microprocessor-based embedded controllers. The PC/AT is also used for smart terminal emulation, capable of controlling the switchgear as well as displaying data from them. Intelligent control is provided through use of a T1 Explorer and the Autonomous Power Expert (APEX) LISP software. Real-time load scheduling is implemented through use of a 'C' program-based scheduling engine. The methods of communication between these computers and the brassboard are explored. In order to evaluate the features of both the

  7. Near real-time stereo vision system

    NASA Technical Reports Server (NTRS)

    Anderson, Charles H. (Inventor); Matthies, Larry H. (Inventor)

    1993-01-01

    The apparatus for a near real-time stereo vision system for use with a robotic vehicle is described. The system is comprised of two cameras mounted on three-axis rotation platforms, image-processing boards, a CPU, and specialized stereo vision algorithms. Bandpass-filtered image pyramids are computed, stereo matching is performed by least-squares correlation, and confidence ranges are estimated by means of Bayes' theorem. In particular, Laplacian image pyramids are built and disparity maps are produced from the 60 x 64 level of the pyramids at rates of up to 2 seconds per image pair. The first autonomous cross-country robotic traverses (of up to 100 meters) have been achieved using the stereo vision system of the present invention with all computing done onboard the vehicle. The overall approach disclosed herein provides a unifying paradigm for practical domain-independent stereo ranging.
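The least-squares (SSD) correlation matching named above can be sketched on a single synthetic scanline (toy integer data; the actual system matches bandpass-filtered Laplacian-pyramid levels in 2-D):

```python
def disparity(left, right, window=2, max_d=8):
    """Scanline disparity by least-squares (SSD) correlation: for each pixel,
    pick the shift of the right image that minimizes the squared difference."""
    w, n = window, len(left)
    out = [0] * n
    for x in range(w + max_d, n - w):
        patch = left[x - w: x + w + 1]
        best_ssd, best_d = float("inf"), 0
        for d in range(max_d + 1):
            cand = right[x - d - w: x - d + w + 1]
            ssd = sum((a - b) ** 2 for a, b in zip(patch, cand))
            if ssd < best_ssd:
                best_ssd, best_d = ssd, d
        out[x] = best_d
    return out

# Synthetic scanline pair: the right view sees every feature shifted left by
# 3 px, i.e. a uniform disparity of 3.
left = [(37 * x) % 19 for x in range(40)]
right = left[3:] + [0, 0, 0]
dmap = disparity(left, right)
```

Range then follows from disparity via the camera baseline and focal length; the confidence-interval estimation via Bayes' theorem described above is omitted from this sketch.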

  8. Asteroid Exploration with Autonomic Systems

    NASA Technical Reports Server (NTRS)

    Truszkowski, Walt; Rash, James; Rouff, Christopher; Hinchey, Mike

    2004-01-01

    NASA is studying advanced technologies for a future robotic exploration mission to the asteroid belt. The prospective ANTS (Autonomous Nano Technology Swarm) mission comprises autonomous agents, including worker agents (small spacecraft) designed to cooperate in asteroid exploration under the overall authority of at least one ruler agent (a larger spacecraft) whose goal is to cause science data to be returned to Earth. The ANTS team (ruler plus workers and messenger agents), but not necessarily any individual on the team, will exhibit behaviors that qualify it as an autonomic system, where an autonomic system is defined as a system that self-reconfigures, self-optimizes, self-heals, and self-protects. Autonomic system concepts lead naturally to realistic, scalable architectures rich in capabilities and behaviors. In-depth consideration of a major mission like ANTS in terms of autonomic systems brings new insights into alternative definitions of autonomic behavior. This paper gives an overview of the ANTS mission and discusses the autonomic properties of the mission.

  10. Industrial robot's vision systems

    NASA Astrophysics Data System (ADS)

    Iureva, Radda A.; Raskin, Evgeni O.; Komarov, Igor I.; Maltseva, Nadezhda K.; Fedosovsky, Michael E.

    2016-03-01

    Due to the improved economic situation in the high-technology sectors, work on the creation of industrial robots and special mobile robotic systems has resumed. Despite this, robotic control systems have mostly remained unchanged, with all the advantages and disadvantages of those systems, largely for lack of means that could greatly facilitate the work of the operator and, in some cases, completely replace it. The paper is concerned with the machine vision complex of a robotic system for monitoring underground pipelines, which collects and analyzes up to 90% of the necessary information. Vision systems are used to identify obstacles to movement along a trajectory and to determine their origin, dimensions, and character. The object is illuminated with structured light, and a TV camera records the projected structure. Distortions of the structure uniquely determine the shape of the object in the camera's view. The reference illumination is synchronized with the camera. The main parameters of the system are the baseline distance between the light source and the camera and the parallax angle (the angle between the optical axes of the projection unit and the camera).
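The two parameters named above, the baseline and the parallax angle, enter through plain triangulation; a minimal numeric sketch with invented geometry:

```python
import math

def triangulate_depth(baseline, theta_p, theta_c):
    """Depth of the intersection of the projector ray and the camera ray.
    Both angles are measured from the direction normal to the baseline."""
    return baseline / (math.tan(theta_p) - math.tan(theta_c))

# Invented geometry: projector at (0, 0), camera at (baseline, 0),
# object point at x = 0.4 m, z = 2.0 m.
b, x, z = 0.5, 0.4, 2.0
theta_p = math.atan2(x, z)       # ray leaving the projector
theta_c = math.atan2(x - b, z)   # ray entering the camera
depth = triangulate_depth(b, theta_p, theta_c)
```

In the structured-light case theta_p is known from the projected pattern rather than measured, which is what lets a single camera recover shape from the pattern's distortions.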

  11. Computer Vision and Machine Learning for Autonomous Characterization of AM Powder Feedstocks

    NASA Astrophysics Data System (ADS)

    DeCost, Brian L.; Jain, Harshvardhan; Rollett, Anthony D.; Holm, Elizabeth A.

    2016-12-01

    By applying computer vision and machine learning methods, we develop a system to characterize powder feedstock materials for metal additive manufacturing (AM). Feature detection and description algorithms are applied to create a microstructural scale image representation that can be used to cluster, compare, and analyze powder micrographs. When applied to eight commercial feedstock powders, the system classifies powder images into the correct material systems with greater than 95% accuracy. The system also identifies both representative and atypical powder images. These results suggest the possibility of measuring variations in powders as a function of processing history, relating microstructural features of powders to properties relevant to their performance in AM processes, and defining objective material standards based on visual images. A significant advantage of the computer vision approach is that it is autonomous, objective, and repeatable.
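The cluster-and-compare use of image representations can be caricatured with nearest-centroid classification over feature histograms; the three-bin histograms and material names below are invented, and the real pipeline builds its representation from keypoint detection and description rather than hand-made histograms:

```python
def sq_dist(h1, h2):
    return sum((a - b) ** 2 for a, b in zip(h1, h2))

def classify(hist, class_means):
    """Assign a micrograph's feature histogram to the nearest class centroid."""
    return min(class_means, key=lambda c: sq_dist(hist, class_means[c]))

def atypicality(hist, class_means):
    """Distance to the assigned centroid: large values flag atypical
    powder images, small values flag representative ones."""
    return sq_dist(hist, class_means[classify(hist, class_means)])

# Hypothetical mean histograms for two powder material systems:
means = {"Ti-6Al-4V": [0.7, 0.2, 0.1], "IN625": [0.1, 0.3, 0.6]}
```

The same distance also supports the paper's other uses: tracking drift in a powder as a function of processing history is just watching this distance grow against a fixed reference centroid.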

  12. Computer Vision and Machine Learning for Autonomous Characterization of AM Powder Feedstocks

    NASA Astrophysics Data System (ADS)

    DeCost, Brian L.; Jain, Harshvardhan; Rollett, Anthony D.; Holm, Elizabeth A.

    2017-03-01

    By applying computer vision and machine learning methods, we develop a system to characterize powder feedstock materials for metal additive manufacturing (AM). Feature detection and description algorithms are applied to create a microstructural scale image representation that can be used to cluster, compare, and analyze powder micrographs. When applied to eight commercial feedstock powders, the system classifies powder images into the correct material systems with greater than 95% accuracy. The system also identifies both representative and atypical powder images. These results suggest the possibility of measuring variations in powders as a function of processing history, relating microstructural features of powders to properties relevant to their performance in AM processes, and defining objective material standards based on visual images. A significant advantage of the computer vision approach is that it is autonomous, objective, and repeatable.

  13. Monocular feature tracker for low-cost stereo vision control of an autonomous guided vehicle (AGV)

    NASA Astrophysics Data System (ADS)

    Pearson, Chris M.; Probert, Penelope J.

    1994-02-01

    We describe a monocular feature tracker (MFT), the first stage of a low-cost stereoscopic vision system for use on an autonomous guided vehicle (AGV) in an indoor environment. The system does not require artificial markings or other beacons, but relies upon accurate knowledge of the AGV's motion. Linear array cameras (LAC) are used to reduce the data and processing bandwidths. The limited information given by LAC requires modelling of the expected features. We model an obstacle as a vertical line segment touching the floor, and can distinguish between these obstacles and most other clutter in an image sequence. Detection of these obstacles provides sufficient information for local AGV navigation.

  14. A simple, inexpensive, and effective implementation of a vision-guided autonomous robot

    NASA Astrophysics Data System (ADS)

    Tippetts, Beau; Lillywhite, Kirt; Fowers, Spencer; Dennis, Aaron; Lee, Dah-Jye; Archibald, James

    2006-10-01

    This paper discusses a simple, inexpensive, and effective implementation of a vision-guided autonomous robot. This implementation is a second-year entry by Brigham Young University students in the Intelligent Ground Vehicle Competition. The objective of the robot was to navigate a course constructed of white boundary lines and orange obstacles for the autonomous competition. A used electric wheelchair, purchased from a local thrift store for $28, served as the robot base. The base was modified to include Kegresse tracks using a friction drum system. This modification allowed the robot to perform better on a variety of terrains, resolving issues with the previous year's design. To control the wheelchair while retaining its robust motor controls, the joystick was simply removed and replaced with a printed circuit board that emulated joystick operation and was capable of receiving commands through a serial port connection. Three different algorithms were implemented and compared: a purely reactive approach, a potential fields approach, and a machine learning approach. Each of the algorithms used color segmentation methods to interpret data from a digital camera in order to identify the features of the course. This paper will be useful to those interested in implementing an inexpensive vision-based autonomous robot.
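Color segmentation of the course features (white boundary lines, orange obstacles) can be sketched as per-pixel HSV thresholding; the threshold values here are invented for illustration, not the team's tuned numbers:

```python
import colorsys

def classify_pixel(r, g, b):
    """Label one RGB camera pixel as a course feature: white boundary line,
    orange obstacle, or ground. Thresholds are illustrative only."""
    h, s, v = colorsys.rgb_to_hsv(r / 255.0, g / 255.0, b / 255.0)
    if v > 0.85 and s < 0.2:
        return "line"        # bright and unsaturated -> white line
    if s > 0.5 and (h < 0.12 or h > 0.95):
        return "obstacle"    # strongly saturated red-orange hue
    return "ground"
```

Any of the three control strategies above can consume the resulting label image, whether as reactive steering inputs, potential-field sources, or learning features.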

  15. Awareness and Responsibility in Autonomous Weapons Systems

    NASA Astrophysics Data System (ADS)

    Bhuta, Nehal; Rotolo, Antonino; Sartor, Giovanni

    The following sections are included: * Introduction * Why Computational Awareness is Important in Autonomous Weapons * Flying Drones and Other Autonomous Weapons * The Impact of Autonomous Weapons Systems * From Autonomy to Awareness: A Perspective from Science Fiction * Summary and Conclusions

  16. A smart telerobotic system driven by monocular vision

    NASA Technical Reports Server (NTRS)

    Defigueiredo, R. J. P.; Maccato, A.; Wlczek, P.; Denney, B.; Scheerer, J.

    1994-01-01

    A robotic system that accepts autonomously generated motion and control commands is described. The system provides images from the monocular vision of a camera mounted on a robot's end effector, eliminating the need for traditional guidance targets that must be predetermined and specifically identified. The telerobotic vision system presents different views of the targeted object relative to the camera, based on a single camera image and knowledge of the target's solid geometry.

  17. 3D vision system assessment

    NASA Astrophysics Data System (ADS)

    Pezzaniti, J. Larry; Edmondson, Richard; Vaden, Justin; Hyatt, Bryan; Chenault, David B.; Kingston, David; Geulen, Vanilynmae; Newell, Scott; Pettijohn, Brad

    2009-02-01

    In this paper, we report on the development of a 3D vision system consisting of a flat panel stereoscopic display and auto-converging stereo camera and an assessment of the system's use for robotic driving, manipulation, and surveillance operations. The 3D vision system was integrated onto a Talon Robot and Operator Control Unit (OCU) such that direct comparisons of the performance of a number of test subjects using 2D and 3D vision systems were possible. A number of representative scenarios were developed to determine which tasks benefited most from the added depth perception and to understand when the 3D vision system hindered understanding of the scene. Two tests were conducted at Fort Leonard Wood, MO with noncommissioned officers ranked Staff Sergeant and Sergeant First Class. The scenarios; the test planning, approach and protocols; the data analysis; and the resulting performance assessment of the 3D vision system are reported.

  18. Gas House Autonomous System Monitoring

    NASA Technical Reports Server (NTRS)

    Miller, Luke; Edsall, Ashley

    2015-01-01

    Gas House Autonomous System Monitoring (GHASM) will employ Integrated System Health Monitoring (ISHM) of cryogenic fluids in the High Pressure Gas Facility at Stennis Space Center. The preliminary focus of development incorporates the passive monitoring and eventual commanding of the Nitrogen System. ISHM offers generic system awareness, adept at using concepts rather than specific error cases. As an enabler for autonomy, ISHM provides capabilities inclusive of anomaly detection, diagnosis, and abnormality prediction. Advancing ISHM and Autonomous Operation functional capabilities enhances quality of data, optimizes safety, improves cost effectiveness, and has direct benefits to a wide spectrum of aerospace applications.

  19. Neuromorphic vision sensors and preprocessors in system applications

    NASA Astrophysics Data System (ADS)

    Kramer, Joerg; Indiveri, Giacomo

    1998-09-01

    A partial review of neuromorphic vision sensors that are suitable for use in autonomous systems is presented. Interfaces are being developed to multiplex the high- dimensional output signals of arrays of such sensors and to communicate them in standard formats to off-chip devices for higher-level processing, actuation, storage and display. Alternatively, on-chip processing stages may be implemented to extract sparse image parameters, thereby obviating the need for multiplexing. Autonomous robots are used to test neuromorphic vision chips in real-world environments and to explore the possibilities of data fusion from different sensing modalities. Examples of autonomous mobile systems that use neuromorphic vision chips for line tracking and optical flow matching are described.

  20. Autonomous vision networking: miniature wireless sensor networks with imaging technology

    NASA Astrophysics Data System (ADS)

    Messinger, Gioia; Goldberg, Giora

    2006-09-01

    The recent emergence of integrated PicoRadio technology, the rise of low-power, low-cost, System-On-Chip (SOC) CMOS imagers, coupled with the fast evolution of networking protocols and digital signal processing (DSP), created a unique opportunity to achieve the goal of deploying large-scale, low-cost, intelligent, ultra-low-power distributed wireless sensor networks for the visualization of the environment. Of all sensors, vision is the most desired, but its applications in distributed sensor networks have been elusive so far. Not any more. The practicality and viability of ultra-low-power vision networking has been proven, and its applications are countless: from security and chemical analysis to industrial monitoring, asset tracking, and visual recognition, vision networking represents a truly disruptive technology applicable to many industries. The presentation discusses some of the critical components and technologies necessary to make these networks and products affordable and ubiquitous - specifically PicoRadios, CMOS imagers, imaging DSP, networking, and overall wireless sensor network (WSN) system concepts. The paradigm shift, from large, centralized, and expensive sensor platforms to small, low-cost, distributed sensor networks, is possible due to the emergence and convergence of a few innovative technologies. Avaak has developed a vision network that is aided by other sensors such as motion, acoustic, and magnetic, and plans to deploy it for use in military and commercial applications. In comparison to other sensors, imagers produce large data files that require pre-processing and a certain level of compression before they are transmitted to a network server, in order to minimize the load on the network. Some of the most innovative chemical detectors currently in development are based on sensors that change color or pattern in the presence of the desired analytes. These changes are easily recorded and analyzed by a CMOS imager and an on-board DSP processor.

  1. A robotic vision system to measure tree traits

    USDA-ARS's Scientific Manuscript database

    The autonomous measurement of tree traits, such as branching structure, branch diameters, branch lengths, and branch angles, is required for tasks such as robotic pruning of trees as well as structural phenotyping. We propose a robotic vision system called the Robotic System for Tree Shape Estimati...

  2. Contingency Software in Autonomous Systems

    NASA Technical Reports Server (NTRS)

    Lutz, Robyn; Patterson-Hine, Ann

    2006-01-01

    This viewgraph presentation reviews the development of contingency software for autonomous systems. Autonomous vehicles currently have a limited capacity to diagnose and mitigate failures. There is a need to be able to handle a broader range of contingencies. The goals of the project are: 1. Speed up diagnosis and mitigation of anomalous situations. 2. Automatically handle contingencies, not just failures. 3. Enable projects to select a degree of autonomy consistent with their needs and to incrementally introduce more autonomy. 4. Augment on-board fault protection with verified contingency scripts.

  4. A Robust Compositional Architecture for Autonomous Systems

    NASA Technical Reports Server (NTRS)

    Brat, Guillaume; Deney, Ewen; Farrell, Kimberley; Giannakopoulos, Dimitra; Jonsson, Ari; Frank, Jeremy; Bobby, Mark; Carpenter, Todd; Estlin, Tara

    2006-01-01

    Space exploration applications can benefit greatly from autonomous systems. Great distances, limited communications and high costs make direct operations impossible while mandating operations reliability and efficiency beyond what traditional commanding can provide. Autonomous systems can improve reliability and enhance spacecraft capability significantly. However, there is reluctance to utilize autonomous systems. In part this is due to general hesitation about new technologies, but a more tangible concern is the reliability and predictability of autonomous software. In this paper, we describe ongoing work aimed at increasing robustness and predictability of autonomous software, with the ultimate goal of building trust in such systems. The work combines state-of-the-art technologies and capabilities in autonomous systems with advanced validation and synthesis techniques. The focus of this paper is on the autonomous system architecture that has been defined, and on how it enables the application of validation techniques for resulting autonomous systems.

  5. Autonomously managed electrical power systems

    NASA Technical Reports Server (NTRS)

    Callis, Charles P.

    1986-01-01

    The electric power systems for future spacecraft such as the Space Station will necessarily be more sophisticated and will exhibit more nearly autonomous operation than earlier spacecraft. These new power systems will be more reliable and flexible than their predecessors, offering greater utility to the users. Automation approaches implemented on various power system breadboards are investigated. These breadboards include the Hubble Space Telescope power system test bed, the Common Module Power Management and Distribution system breadboard, the Autonomously Managed Power System (AMPS) breadboard, and the 20 kilohertz power system breadboard. Particular attention is given to the AMPS breadboard. Future plans for these breadboards, including the employment of artificial intelligence techniques, are addressed.

  6. Spaceborne autonomous multiprocessor systems

    NASA Technical Reports Server (NTRS)

    Fernquist, Alan

    1990-01-01

    The goal of this task is to provide technology for the specification and integration of advanced processors into the Space Station Freedom data management system environment through computer performance measurement tools, simulators, and an extended testbed facility. The approach focuses on five categories: (1) user requirements--determine the suitability of existing computer technologies and systems for real-time requirements of NASA missions; (2) system performance analysis--characterize the effects of languages, architectures, and commercially available hardware on real-time benchmarks; (3) system architecture--expand NASA's capability to solve problems with integrated numeric and symbolic requirements using advanced multiprocessor architectures; (4) parallel Ada technology--extend Ada software technology to utilize parallel architectures more efficiently; and (5) testbed--extend in-house testbed to support system performance and system analysis studies.

  7. Semi autonomous mine detection system

    NASA Astrophysics Data System (ADS)

    Few, Doug; Versteeg, Roelof; Herman, Herman

    2010-04-01

    CMMAD is a risk reduction effort for the AMDS program. As part of CMMAD, multiple instances of semi-autonomous robotic mine detection systems were created. Each instance consists of a robotic vehicle equipped with sensors required for navigation and marking, countermine sensors, and a number of integrated software packages which provide for real-time processing of the countermine sensor data as well as integrated control of the robotic vehicle, the sensor actuator, and the sensor. These systems were used to investigate critical interest functions (CIF) related to countermine robotic systems. To address the autonomy CIF, the INL-developed RIK was extended to allow for interaction with a mine sensor processing code (MSPC). In limited field testing this system performed well in detecting, marking, and avoiding both AT and AP mines. Based on the results of the CMMAD investigation we conclude that autonomous robotic mine detection is feasible. In addition, CMMAD contributed critical technical advances with regard to sensing, data processing, and sensor manipulation, which will advance the performance of future fieldable systems. As a result, no substantial technical barriers exist which preclude - from an autonomous robotic perspective - the rapid development and deployment of fieldable systems.

  8. Semi autonomous mine detection system

    SciTech Connect

    Douglas Few; Roelof Versteeg; Herman Herman

    2010-04-01

    CMMAD is a risk reduction effort for the AMDS program. As part of CMMAD, multiple instances of semi-autonomous robotic mine detection systems were created. Each instance consists of a robotic vehicle equipped with sensors required for navigation and marking, countermine sensors, and a number of integrated software packages which provide for real-time processing of the countermine sensor data as well as integrated control of the robotic vehicle, the sensor actuator, and the sensor. These systems were used to investigate critical interest functions (CIF) related to countermine robotic systems. To address the autonomy CIF, the INL-developed RIK was extended to allow for interaction with a mine sensor processing code (MSPC). In limited field testing this system performed well in detecting, marking, and avoiding both AT and AP mines. Based on the results of the CMMAD investigation we conclude that autonomous robotic mine detection is feasible. In addition, CMMAD contributed critical technical advances with regard to sensing, data processing, and sensor manipulation, which will advance the performance of future fieldable systems. As a result, no substantial technical barriers exist which preclude – from an autonomous robotic perspective – the rapid development and deployment of fieldable systems.

  9. Monocular-Vision-Based Autonomous Hovering for a Miniature Flying Ball

    PubMed Central

    Lin, Junqin; Han, Baoling; Luo, Qingsheng

    2015-01-01

    This paper presents a method for detecting and controlling the autonomous hovering of a miniature flying ball (MFB) based on monocular vision. A camera is employed to estimate the three-dimensional position of the vehicle relative to the ground without auxiliary sensors, such as inertial measurement units (IMUs). An image of the ground captured by the camera mounted directly under the miniature flying ball is set as a reference. The position variations between the subsequent frames and the reference image are calculated by comparing their correspondence points. The Kalman filter is used to predict the position of the miniature flying ball to handle situations such as a lost or erroneous frame. Finally, a PID controller is designed, and the performance of the entire system is tested experimentally. The results show that the proposed method can keep the aircraft in a stable hover. PMID:26057040
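
    The estimate-then-control loop the abstract describes (a Kalman filter that keeps predicting through lost or bad frames, feeding a PID controller) can be sketched as follows. This is an illustrative toy, not the authors' implementation: the constant-velocity model, the noise parameters, and the PID gains are all assumed values.

```python
# Illustrative sketch (not the paper's code) of hover control from vision:
# a constant-velocity Kalman filter per axis bridges lost frames, and a
# PID loop drives the estimated position back to the hover setpoint.

class Kalman1D:
    """Constant-velocity Kalman filter for one axis: state = [pos, vel]."""
    def __init__(self, dt=0.05, q=1e-3, r=1e-2):
        self.x = [0.0, 0.0]                      # position, velocity
        self.P = [[1.0, 0.0], [0.0, 1.0]]        # state covariance
        self.dt, self.q, self.r = dt, q, r       # assumed noise levels

    def predict(self):
        """Propagate the state one step; used alone when a frame is lost."""
        dt, p = self.dt, self.P
        self.x = [self.x[0] + dt * self.x[1], self.x[1]]
        self.P = [[p[0][0] + dt * (p[0][1] + p[1][0]) + dt * dt * p[1][1] + self.q,
                   p[0][1] + dt * p[1][1]],
                  [p[1][0] + dt * p[1][1],
                   p[1][1] + self.q]]

    def update(self, z):
        """Fuse a position measurement z from the vision system."""
        s = self.P[0][0] + self.r                # innovation covariance
        k0, k1 = self.P[0][0] / s, self.P[1][0] / s
        y = z - self.x[0]                        # innovation
        self.x = [self.x[0] + k0 * y, self.x[1] + k1 * y]
        p = self.P
        self.P = [[(1 - k0) * p[0][0], (1 - k0) * p[0][1]],
                  [p[1][0] - k1 * p[0][0], p[1][1] - k1 * p[0][1]]]

class PID:
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral, self.prev_error = 0.0, 0.0

    def step(self, error):
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

# Toy closed loop: the "vehicle" integrates the velocity command, and every
# seventh camera frame is treated as lost (prediction only, no update).
kf, pid = Kalman1D(), PID(kp=1.2, ki=0.0, kd=0.05, dt=0.05)
pos, setpoint = 1.0, 0.0                         # start 1 m off the setpoint
for k in range(200):
    kf.predict()
    if k % 7 != 0:                               # frame available
        kf.update(pos)                           # noise-free measurement here
    u = pid.step(setpoint - kf.x[0])             # velocity command
    pos += u * 0.05                              # vehicle moves toward setpoint
```

    In the toy simulation every seventh frame is dropped, so for those steps the controller runs on the filter's prediction alone, which is the role the abstract assigns to the Kalman filter.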

  10. Monocular-Vision-Based Autonomous Hovering for a Miniature Flying Ball.

    PubMed

    Lin, Junqin; Han, Baoling; Luo, Qingsheng

    2015-06-05

    This paper presents a method for detecting and controlling the autonomous hovering of a miniature flying ball (MFB) based on monocular vision. A camera is employed to estimate the three-dimensional position of the vehicle relative to the ground without auxiliary sensors, such as inertial measurement units (IMUs). An image of the ground captured by the camera mounted directly under the miniature flying ball is set as a reference. The position variations between the subsequent frames and the reference image are calculated by comparing their correspondence points. The Kalman filter is used to predict the position of the miniature flying ball to handle situations such as a lost or erroneous frame. Finally, a PID controller is designed, and the performance of the entire system is tested experimentally. The results show that the proposed method can keep the aircraft in a stable hover.

  11. Knowledge acquisition for autonomous systems

    NASA Technical Reports Server (NTRS)

    Lum, Henry; Heer, Ewald

    1988-01-01

    Knowledge-based capabilities for autonomous aerospace systems, such as the NASA Space Station, must encompass conflict-resolution functions comparable to those of human operators, with all elements of the system working toward system goals in a concurrent, asynchronous-but-coordinated fashion. Knowledge extracted from a design database will support robotic systems by furnishing geometric, structural, and causal descriptions required for repair, disassembly, and assembly. The factual knowledge for these databases will be obtained from a master database through a technical management information system, and it will in many cases have to be augmented by domain-specific heuristic knowledge acquired from domain experts.

  12. Autonomous Inspection System

    DTIC Science & Technology

    2010-04-26

  13. Intelligent Mobile Autonomous System (IMAS).

    DTIC Science & Technology

    1987-01-01

    The goal of this research is to develop a theory of design and functioning of Intelligent Mobile Autonomous Systems (IMAS) to be utilized for solving... two PILOT personalities leads to a number of advantages in the IMAS functioning... A fundamental result is obtained applicable for the lowest level... perception, and with no conceptual learning. One can expect that IMAS should be used only for limited-time and limited-function missions with...

  14. A vision system for an unmanned nonlethal weapon

    NASA Astrophysics Data System (ADS)

    Kogut, Greg; Drymon, Larry

    2004-10-01

    Unmanned weapons remove humans from deadly situations. However, some systems, such as unmanned guns, are difficult to control remotely. It is difficult for a soldier to perform the complex tasks of identifying and aiming at specific points on targets from a remote location. This paper describes a computer vision and control system for providing autonomous control of unmanned guns developed at Space and Naval Warfare Systems Center, San Diego (SSC San Diego). The test platform, consisting of a non-lethal gun mounted on a pan-tilt mechanism, can be used as an unattended device or mounted on a robot for mobility. The system operates with a degree of autonomy determined by a remote user that ranges from teleoperated to fully autonomous. The teleoperated mode consists of remote joystick control over all aspects of the weapon, including aiming, arming, and firing. Visual feedback is provided by near-real-time video feeds from bore-sight and wide-angle cameras. The semi-autonomous mode provides the user with tracking information overlaid on the real-time video. This provides the user with information on all detected targets being tracked by the vision system. The user selects a target with a mouse, and the system automatically aims the gun at the target. Arming and firing are still performed by teleoperation. In fully autonomous mode, all aspects of gun control are performed by the vision system.

  15. Wearable Improved Vision System for Color Vision Deficiency Correction

    PubMed Central

    Melillo, Paolo; Riccio, Daniel; Di Perna, Luigi; Sanniti Di Baja, Gabriella; De Nino, Maurizio; Rossi, Settimio; Testa, Francesco; Simonelli, Francesca; Frucci, Maria

    2017-01-01

    Color vision deficiency (CVD) is an extremely frequent vision impairment that compromises the ability to recognize colors. In order to improve color vision in a subject with CVD, we designed and developed a wearable improved vision system based on an augmented reality device. The system was validated in a clinical pilot study on 24 subjects with CVD (18 males and 6 females, aged 37.4 ± 14.2 years). The primary outcome was the improvement in the Ishihara Vision Test score with the correction proposed by our system. The Ishihara test score significantly improved (p = 0.03) from 5.8 ± 3.0 without correction to 14.8 ± 5.0 with correction. Almost all patients showed an improvement in color vision, as shown by the increased test scores. Moreover, with our system, 12 subjects (50%) passed the color vision test as normal vision subjects. The development and preliminary validation of the proposed platform confirm that a wearable augmented-reality device could be an effective aid to improve color vision in subjects with CVD. PMID:28507827

  16. Wearable Improved Vision System for Color Vision Deficiency Correction.

    PubMed

    Melillo, Paolo; Riccio, Daniel; Di Perna, Luigi; Sanniti Di Baja, Gabriella; De Nino, Maurizio; Rossi, Settimio; Testa, Francesco; Simonelli, Francesca; Frucci, Maria

    2017-01-01

    Color vision deficiency (CVD) is an extremely frequent vision impairment that compromises the ability to recognize colors. In order to improve color vision in a subject with CVD, we designed and developed a wearable improved vision system based on an augmented reality device. The system was validated in a clinical pilot study on 24 subjects with CVD (18 males and 6 females, aged 37.4 ± 14.2 years). The primary outcome was the improvement in the Ishihara Vision Test score with the correction proposed by our system. The Ishihara test score significantly improved (p = 0.03) from 5.8 ± 3.0 without correction to 14.8 ± 5.0 with correction. Almost all patients showed an improvement in color vision, as shown by the increased test scores. Moreover, with our system, 12 subjects (50%) passed the color vision test as normal vision subjects. The development and preliminary validation of the proposed platform confirm that a wearable augmented-reality device could be an effective aid to improve color vision in subjects with CVD.

  17. Multi-agent autonomous system

    NASA Technical Reports Server (NTRS)

    Fink, Wolfgang (Inventor); Dohm, James (Inventor); Tarbell, Mark A. (Inventor)

    2010-01-01

    A multi-agent autonomous system for exploration of hazardous or inaccessible locations. The multi-agent autonomous system includes simple surface-based agents or craft controlled by an airborne tracking and command system. The airborne tracking and command system includes an instrument suite used to image an operational area and any craft deployed within the operational area. The image data is used to identify the craft, targets for exploration, and obstacles in the operational area. The tracking and command system determines paths for the surface-based craft using the identified targets and obstacles and commands the craft using simple movement commands to move through the operational area to the targets while avoiding the obstacles. Each craft includes its own instrument suite to collect information about the operational area that is transmitted back to the tracking and command system. The tracking and command system may be further coupled to a satellite system to provide additional image information about the operational area and provide operational and location commands to the tracking and command system.
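
    The plan-and-command cycle this record describes (identify targets and obstacles from overhead imagery, then steer each simple craft along an obstacle-free path using movement commands) can be illustrated with a minimal grid path planner. This is a generic breadth-first-search sketch under assumed conventions (an occupancy grid of 0/1 cells), not the patented system's algorithm.

```python
from collections import deque

def plan_path(grid, start, goal):
    """Shortest 4-connected path on an occupancy grid (1 = obstacle).

    Returns a list of (row, col) cells from start to goal, or None if the
    goal is unreachable. Breadth-first search guarantees a shortest path.
    """
    rows, cols = len(grid), len(grid[0])
    prev = {start: None}                  # visited set + back-pointers
    queue = deque([start])
    while queue:
        cell = queue.popleft()
        if cell == goal:                  # walk the back-pointers to the start
            path = []
            while cell is not None:
                path.append(cell)
                cell = prev[cell]
            return path[::-1]
        r, c = cell
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols \
                    and not grid[nr][nc] and (nr, nc) not in prev:
                prev[(nr, nc)] = cell
                queue.append((nr, nc))
    return None

# Overhead imagery reduced to an occupancy grid: a wall blocks the direct route.
grid = [[0, 0, 0],
        [1, 1, 0],
        [0, 0, 0]]
path = plan_path(grid, start=(0, 0), goal=(2, 0))
```

    The returned cell sequence maps naturally onto the "simple movement commands" the tracking and command system would uplink to a surface craft.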

  18. Adaptive estimation and control with application to vision-based autonomous formation flight

    NASA Astrophysics Data System (ADS)

    Sattigeri, Ramachandra

    2007-05-01

    Modern Unmanned Aerial Vehicles (UAVs) are equipped with vision sensors because of their lightweight, low-cost characteristics and their ability to provide a rich variety of information about the environment in which the UAVs are navigating. The problem of vision-based autonomous flight is very difficult and challenging since it requires bringing together concepts from image processing and computer vision, target tracking and state estimation, and flight guidance and control. This thesis focuses on the adaptive state estimation, guidance and control problems involved in vision-based formation flight. Specifically, the thesis presents a composite adaptation approach to the partial state estimation of a class of nonlinear systems with unmodeled dynamics. In this approach, a linear time-varying Kalman filter is the nominal state estimator, which is augmented by the output of an adaptive neural network (NN) that is trained with two error signals. The benefit of the proposed approach is its faster and more accurate adaptation to the modeling errors over a conventional approach. The thesis also presents two approaches to the design of adaptive guidance and control (G&C) laws for line-of-sight formation flight. In the first approach, the guidance and autopilot systems are designed separately and then combined together by assuming time-scale separation. The second approach is based on integrating the guidance and autopilot design process. The developed G&C laws using both approaches are adaptive to unmodeled leader aircraft acceleration and to own aircraft aerodynamic uncertainties. The thesis also presents theoretical justification based on Lyapunov-like stability analysis for integrating the adaptive state estimation and adaptive G&C designs. All the developed designs are validated in nonlinear, 6DOF fixed-wing aircraft simulations. Finally, the thesis presents a decentralized coordination strategy for vision-based multiple-aircraft formation control. In this

  19. Laboratory Experimentation of Autonomous Spacecraft Docking Using Cooperative Vision Navigation

    DTIC Science & Technology

    2005-12-01

    developed by the University of Padova (Italy). It is an autonomous robot (although connected by an umbilical cord) that utilizes a large robotic arm... The reaction wheel deck (see Figure 26) contains a reaction wheel and a voltage clamp. The reaction wheel is used for attitude... The reaction wheel is connected, via the voltage clamp, to the analog I/O board, where it will send telemetry data and receive commands.

  20. Neural Networks for Computer Vision: A Framework for Specifications of a General Purpose Vision System

    NASA Astrophysics Data System (ADS)

    Skrzypek, Josef; Mesrobian, Edmond; Gungner, David J.

    1989-03-01

    The development of autonomous land vehicles (ALV) capable of operating in an unconstrained environment has proven to be a formidable research effort. The unpredictability of events in such an environment calls for the design of a robust perceptual system, an impossible task requiring the programming of a system based on the expectation of future, unconstrained events. Hence, the need for a "general purpose" machine vision system that is capable of perceiving and understanding images in an unconstrained environment in real-time. The research undertaken at the UCLA Machine Perception Laboratory addresses this need by focusing on two specific issues: 1) the long term goals for machine vision research as a joint effort between the neurosciences and computer science; and 2) a framework for evaluating progress in machine vision. In the past, vision research has been carried out independently within different fields including neurosciences, psychology, computer science, and electrical engineering. Our interdisciplinary approach to vision research is based on the rigorous combination of computational neuroscience, as derived from neurophysiology and neuropsychology, with computer science and electrical engineering. The primary motivation behind our approach is that the human visual system is the only existing example of a "general purpose" vision system and, using a neurally based computing substrate, it can complete all necessary visual tasks in real-time.

  1. Experiences in Benchmarking of Autonomic Systems

    NASA Astrophysics Data System (ADS)

    Etchevers, Xavier; Coupaye, Thierry; Vachet, Guy

    Autonomic computing promises improvements of systems quality of service in terms of availability, reliability, performance, security, etc. However, little research or experimental evidence has so far demonstrated this assertion or provided proof of the return on investment stemming from the effort that introducing autonomic features requires. Existing works in the area of benchmarking of autonomic systems can be characterized by their qualitative and fragmented approaches. There remains a crucial need for generic (i.e., independent from business, technology, architecture and implementation choices) autonomic computing benchmarking tools for evaluating and/or comparing autonomic systems from a technical and, ultimately, an economical point of view. This article introduces a methodology and a process for defining and evaluating factors, criteria and metrics in order to qualitatively and quantitatively assess autonomic features in computing systems. It also discusses associated experimental results on three different autonomic systems.

  2. Integrated System for Autonomous Science

    NASA Technical Reports Server (NTRS)

    Chien, Steve; Sherwood, Robert; Tran, Daniel; Cichy, Benjamin; Davies, Ashley; Castano, Rebecca; Rabideau, Gregg; Frye, Stuart; Trout, Bruce; Shulman, Seth

    2006-01-01

    The New Millennium Program Space Technology 6 Project Autonomous Sciencecraft software implements an integrated system for autonomous planning and execution of scientific, engineering, and spacecraft-coordination actions. A prior version of this software was reported in "The TechSat 21 Autonomous Sciencecraft Experiment" (NPO-30784), NASA Tech Briefs, Vol. 28, No. 3 (March 2004), page 33. This software is now in continuous use aboard the Earth Observing-1 (EO-1) spacecraft and is being adapted for use in the Mars Odyssey and Mars Exploration Rovers missions. This software enables EO-1 to detect and respond to such events of scientific interest as volcanic activity, flooding, and freezing and thawing of water. It uses classification algorithms to analyze imagery onboard to detect changes, including events of scientific interest. Detection of such events triggers acquisition of follow-up imagery. The mission-planning component of the software develops a response plan that accounts for visibility of targets and operational constraints. The plan is then executed under the control of a task-execution component of the software that is capable of responding to anomalies.
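
    The onboard detect-then-respond pattern described above can be reduced to a toy sketch: compare a new frame against a reference, and if enough pixels changed, emit a follow-up observation request for the planner. The thresholded frame differencing and all parameter values below are stand-in assumptions; the actual Autonomous Sciencecraft software uses trained classification algorithms and a full mission planner.

```python
# Toy "science event" trigger (illustrative; not the flight software):
# frame differencing against a reference image, with a change threshold
# that decides whether a follow-up observation is requested.

def changed_fraction(prev, curr, pixel_delta=10):
    """Fraction of pixels whose value changed by more than pixel_delta."""
    total = changed = 0
    for row_p, row_c in zip(prev, curr):
        for p, c in zip(row_p, row_c):
            total += 1
            if abs(p - c) > pixel_delta:
                changed += 1
    return changed / total

def plan_followup(prev, curr, target, trigger=0.05):
    """Return a follow-up observation request if enough of the scene changed."""
    frac = changed_fraction(prev, curr)
    if frac >= trigger:
        return {"target": target, "reason": "change-detected",
                "changed_fraction": frac}
    return None

# Simulated event: a 3x3 patch of an 8x8 reference image brightens
# (standing in for, say, flooding in the scene).
reference = [[0] * 8 for _ in range(8)]
new_frame = [row[:] for row in reference]
for r in range(3):
    for c in range(3):
        new_frame[r][c] = 50
request = plan_followup(reference, new_frame, target="flood-site")
```

    A real planner would then check target visibility and operational constraints before scheduling the follow-up acquisition, as the abstract notes.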

  3. The Autonomous Pathogen Detection System

    SciTech Connect

    Dzenitis, J M; Makarewicz, A J

    2009-01-13

    We developed, tested, and now operate a civilian biological defense capability that continuously monitors the air for biological threat agents. The Autonomous Pathogen Detection System (APDS) collects, prepares, reads, analyzes, and reports results of multiplexed immunoassays and multiplexed PCR assays using Luminex® xMAP technology and a flow cytometer. The mission we conduct is particularly demanding: continuous monitoring, multiple threat agents, high sensitivity, challenging environments, and ultimately extremely low false positive rates. Here, we introduce the mission requirements and metrics, show the system engineering and analysis framework, and describe the progress to date, including early development and current status.

  4. Remote-controlled vision-guided mobile robot system

    NASA Astrophysics Data System (ADS)

    Ande, Raymond; Samu, Tayib; Hall, Ernest L.

    1997-09-01

    Automated guided vehicles (AGVs) have many potential applications in manufacturing, medicine, space, and defense. The purpose of this paper is to describe exploratory research on the design of the remote-controlled emergency stop and vision systems for an autonomous mobile robot. The remote control provides human supervision and emergency stop capabilities for the autonomous vehicle. The vision guidance provides automatic operation. A mobile robot test-bed has been constructed using a golf cart base. The mobile robot (Bearcat) was built for the Association for Unmanned Vehicle Systems (AUVS) 1997 competition. The mobile robot has full speed control with guidance provided by a vision system and an obstacle avoidance system using ultrasonic sensors. Vision guidance is accomplished using two CCD cameras with zoom lenses. The vision data is processed by a high-speed tracking device, which communicates the X, Y coordinates of blobs along the lane markers to the computer. The system also has three emergency stop switches and a remote-controlled emergency stop switch that can disable the traction motor and set the brake. Testing of these systems has been done in the lab as well as on an outside test track, with positive results showing that at five mph the vehicle can follow a line and at the same time avoid obstacles.
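
    The vision-guidance step in this abstract (the tracker reports X, Y blob coordinates along the lane markers; the controller keeps the vehicle centered) might reduce to something like the proportional rule below. The centering geometry and the gain are assumptions for illustration, not the Bearcat's actual control law.

```python
# Hypothetical lane-centering rule from blob coordinates (illustrative):
# steer proportionally to the offset between the lane midpoint, computed
# from the left/right marker blobs, and the image center.

def steering_command(left_blobs, right_blobs, image_width, gain=0.01):
    """left_blobs/right_blobs: lists of (x, y) blob centroids from the tracker.

    Returns a signed steering value: positive steers right, negative left.
    """
    def mean_x(blobs):
        return sum(x for x, _ in blobs) / len(blobs)

    lane_center = (mean_x(left_blobs) + mean_x(right_blobs)) / 2.0
    offset = lane_center - image_width / 2.0   # +ve: lane lies right of center
    return gain * offset

# Lane markers slightly right of image center in a 400-pixel-wide frame.
cmd = steering_command([(118, 40), (122, 60)], [(318, 40), (322, 60)],
                       image_width=400)
```

    In a complete system this steering term would be arbitrated with the ultrasonic obstacle-avoidance behavior before commanding the drive motors.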

  5. Real-time vision systems

    SciTech Connect

    Johnson, R.; Hernandez, J.E.; Lu, Shin-yee

    1994-11-15

    Many industrial and defense applications require an ability to make instantaneous decisions based on sensor input of a time-varying process. Such systems are referred to as 'real-time systems' because they process and act on data as it occurs in time. When a vision sensor is used in a real-time system, the processing demands can be quite substantial, with typical data rates of 10-20 million samples per second. A real-time Machine Vision Laboratory (MVL) was established in FY94 to extend our years of experience in developing computer vision algorithms to include the development and implementation of real-time vision systems. The laboratory is equipped with a variety of hardware components, including Datacube image acquisition and processing boards, a Sun workstation, and several different types of CCD cameras, including monochrome and color area cameras and analog and digital line-scan cameras. The equipment is reconfigurable for prototyping different applications. This facility has been used to support several programs at LLNL, including O Division's Peacemaker and Deadeye Projects as well as the CRADA with the U.S. Textile Industry, CAFE (Computer Aided Fabric Inspection). To date, we have successfully demonstrated several real-time applications: bullet tracking, stereo tracking and ranging, and web inspection. This work has been documented in the ongoing development of a real-time software library.

  6. APDS: Autonomous Pathogen Detection System

    SciTech Connect

    Langlois, R G; Brown, S; Burris, L; Colston, B; Jones, L; Makarewicz, T; Mariella, R; Masquelier, D; McBride, M; Milanovich, F; Masarabadi, S; Venkateswaran, K; Marshall, G; Olson, D; Wolcott, D

    2002-02-14

    An early warning system to counter bioterrorism, the Autonomous Pathogen Detection System (APDS) continuously monitors the environment for the presence of biological pathogens (e.g., anthrax) and once detected, it sounds an alarm much like a smoke detector warns of a fire. Long before September 11, 2001, this system was being developed to protect domestic venues and events including performing arts centers, mass transit systems, major sporting and entertainment events, and other high profile situations in which the public is at risk of becoming a target of bioterrorist attacks. Customizing off-the-shelf components and developing new components, a multidisciplinary team developed APDS, a stand-alone system for rapid, continuous monitoring of multiple airborne biological threat agents in the environment. The completely automated APDS samples the air, prepares fluid samples in-line, and performs two orthogonal tests: immunoassay and nucleic acid detection. When compared to competing technologies, APDS is unprecedented in terms of flexibility and system performance.

  7. A Vision-Based Trajectory Controller for Autonomous Cleaning Robots

    NASA Astrophysics Data System (ADS)

    Gerstmayr, Lorenz; Röben, Frank; Krzykawski, Martin; Kreft, Sven; Venjakob, Daniel; Möller, Ralf

    Autonomous cleaning robots should completely cover the accessible area with minimal repeated coverage. We present a mostly vision-based navigation strategy for systematic exploration of an area with meandering lanes. The results of the robot experiments show that our approach can guide the robot along parallel lanes while achieving good coverage with only a small proportion of repeated coverage. The proposed method can be used as a building block for more elaborate navigation strategies which allow the robot to systematically clean rooms with a complex workspace shape.
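
    The meandering-lane coverage pattern described above can be sketched as a simple waypoint generator for a rectangular workspace. The boustrophedon layout below is a generic illustration, not the authors' visual-navigation method; in practice the lane spacing would match the robot's cleaning width.

```python
# Generic back-and-forth ("meandering") coverage waypoints for a rectangle.

def meander_waypoints(width, height, lane_spacing):
    """Return (x, y) waypoints tracing parallel lanes across the workspace."""
    waypoints, y, left_to_right = [], 0.0, True
    while y <= height:
        if left_to_right:
            waypoints += [(0.0, y), (width, y)]
        else:
            waypoints += [(width, y), (0.0, y)]
        left_to_right = not left_to_right   # reverse direction each lane
        y += lane_spacing
    return waypoints

lanes = meander_waypoints(width=4.0, height=2.0, lane_spacing=1.0)
```

    Each lane ends on the side where the next lane begins, so the only repeated coverage occurs in the short cross-track transitions between lanes, which matches the goal of minimal repeated coverage.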

  8. Autonomous pathogen detection system 2001

    SciTech Connect

    Langlois, R G; Wang, A; Colston, B; Masquelier, D; Jones, L; Venkateswaran, K S; Nasarabadi, S; Brown, S; Ramponi, A; Milanovich, F P

    2001-01-09

    The objective of this project is to design, fabricate, and field-demonstrate a fully Autonomous Pathogen Detector (identifier) System (APDS). This will be accomplished by integrating a proven flow cytometer and real-time polymerase chain reaction (PCR) detector with sample collection, sample preparation, and fluidics to provide a compact, autonomously operating instrument capable of simultaneously detecting multiple pathogens and/or toxins. The APDS will be designed to operate in fixed locations, where it continuously monitors air samples and automatically reports the presence of specific biological agents. The APDS will utilize both multiplex immuno and nucleic acid assays to provide "quasi-orthogonal", multiple-agent detection approaches to minimize false positives and increase the reliability of identification. Technical advancements across several fronts must first be made in order to realize the full extent of the APDS. Commercialization will be accomplished through three progressive generations of instruments. The APDS is targeted for domestic applications in which (1) the public is at high risk of exposure to covert releases of bioagents, such as in major subway systems and other transportation terminals, large office complexes, and convention centers; and (2) as part of a monitoring network of sensors integrated with command and control systems for wide-area monitoring of urban areas and major gatherings (e.g., inaugurations, Olympics, etc.). In this latter application there is potential that a fully developed APDS could add value to Defense Department monitoring architectures.
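
    The rationale for "quasi-orthogonal" dual assays is simple probability: if the immunoassay and the nucleic-acid assay err independently, a joint false alarm requires both to fire, so the per-assay rates multiply. A back-of-the-envelope check with invented per-assay rates (not APDS performance figures):

```python
# Why two independent ("orthogonal") assays slash the false-alarm rate.
# The per-assay false-positive rates below are invented for illustration.

def combined_false_positive(p_immuno, p_pcr):
    """P(both assays fire falsely), assuming independent errors."""
    return p_immuno * p_pcr

p_single = 1e-3                                  # one assay alone
p_both = combined_false_positive(1e-3, 1e-4)     # both must agree

samples_per_year = 24 * 365                      # hourly air samples
single_alarms = p_single * samples_per_year      # ~8.8 false alarms per year
dual_alarms = p_both * samples_per_year          # well under one per year
```

    The independence assumption is exactly what "orthogonal" detection chemistry is meant to approximate; correlated failure modes would erode the multiplicative benefit.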

  9. A Vision System For Robotic Inspection And Manipulation

    NASA Astrophysics Data System (ADS)

    Trivedi, Mohan M.; Chen, Chu X.; Marapane, Suresh

    1988-03-01

    A new generation of robotic systems will operate in the complex, unstructured environments of industrial plants utilizing sophisticated sensory mechanisms. In this paper we consider the development of autonomous robotic systems for various inspection and manipulation tasks associated with advanced nuclear power plants. Our approach in the development of the robotic system is to utilize an array of sensors capable of sensing the robot's environment in several sensory modalities. One of the most important sensor modalities utilized is vision. We describe the development of a model-based vision system for performing a number of inspection and manipulation tasks. The system is designed and tested using a laboratory-based test panel. A number of analog and digital meters and a variety of switches, valves and controls are mounted on the panel. The paper presents details of system design and development and a series of experiments performed to evaluate the capabilities of the vision system.

  10. Autonomous Biological System (ABS) experiments.

    PubMed

    MacCallum, T K; Anderson, G A; Poynter, J E; Stodieck, L S; Klaus, D M

    1998-12-01

    Three space flight experiments have been conducted to test and demonstrate the use of a passively controlled, materially closed, bioregenerative life support system in space. The Autonomous Biological System (ABS) provides an experimental environment for long term growth and breeding of aquatic plants and animals. The ABS is completely materially closed, isolated from human life support systems and cabin atmosphere contaminants, and requires little need for astronaut intervention. Testing of the ABS marked several firsts: the first aquatic angiosperms to be grown in space; the first higher organisms (aquatic invertebrate animals) to complete their life cycles in space; the first completely bioregenerative life support system in space; and, among the first gravitational ecology experiments. As an introduction this paper describes the ABS, its flight performance, advantages and disadvantages.

  11. Autonomic Computing for Spacecraft Ground Systems

    NASA Technical Reports Server (NTRS)

    Li, Zhenping; Savkli, Cetin; Jones, Lori

    2007-01-01

    Autonomic computing for spacecraft ground systems increases system reliability and reduces the cost of spacecraft operations and software maintenance. In this paper, we present an autonomic computing solution for spacecraft ground systems at NASA Goddard Space Flight Center (GSFC), which consists of an open standard for a message-oriented architecture referred to as the GMSEC (Goddard Mission Services Evolution Center) architecture, and an autonomic computing tool, the Criteria Action Table (CAT). This solution has been used in many upgraded ground systems for NASA missions, and provides a framework for developing solutions with higher autonomic maturity.

  12. Testbed for an autonomous system

    NASA Technical Reports Server (NTRS)

    Dikshit, Piyush; Guimaraes, Katia; Ramamurthy, Maya; Agrawala, Ashok K.; Larsen, Ronald L.

    1989-01-01

    In previous works we have defined a general architectural model for autonomous systems, which can easily be mapped to describe the functions of any automated system (SDAG-86-01), and we illustrated that model by applying it to the thermal management system of a space station (SDAG-87-01). In this note, we will further develop that application and design the detail of the implementation of such a model. First we present the environment of our application by describing the thermal management problem and an abstraction, which was called TESTBED, that includes a specific function for each module in the architecture, and the nature of the interfaces between each pair of blocks.

  13. An Autonomous Flight Safety System

    NASA Technical Reports Server (NTRS)

    Bull, James B.; Lanzi, Raymond J.

    2007-01-01

    The Autonomous Flight Safety System (AFSS) being developed by NASA's Goddard Space Flight Center's Wallops Flight Facility and Kennedy Space Center has completed two successful developmental flights and is preparing for a third. AFSS has been demonstrated to be a viable architecture for implementation of a completely vehicle-based system capable of protecting life and property in the event of an errant vehicle by terminating the flight or initiating other actions. It is capable of replacing current human-in-the-loop systems or acting in parallel with them. AFSS is configured prior to flight in accordance with a specific rule set agreed upon by the range safety authority and the user to protect the public and assure mission success. This paper discusses the motivation for the project, describes the method of development, and presents an overview of the evolving architecture and the current status.
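The pre-flight rule-set configuration described above can be illustrated generically. The rule names and telemetry fields below (`cross_track_km`, `altitude_km`) are invented examples for the sketch, not actual AFSS range-safety rules:

```python
def evaluate_rules(state, rules):
    """Return the name of the first violated rule, or None if all rules pass."""
    for name, predicate in rules:
        if not predicate(state):
            return name
    return None

# Hypothetical rule set: stay inside a lateral corridor and below a ceiling.
rules = [
    ("inside_corridor", lambda s: abs(s["cross_track_km"]) <= 5.0),
    ("below_max_altitude", lambda s: s["altitude_km"] <= 120.0),
]

nominal = {"cross_track_km": 1.2, "altitude_km": 80.0}
errant = {"cross_track_km": 9.7, "altitude_km": 80.0}
assert evaluate_rules(nominal, rules) is None          # nominal flight: no action
assert evaluate_rules(errant, rules) == "inside_corridor"  # would trigger action
```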

  15. Semi-autonomous unmanned ground vehicle control system

    NASA Astrophysics Data System (ADS)

    Anderson, Jonathan; Lee, Dah-Jye; Schoenberger, Robert; Wei, Zhaoyi; Archibald, James

    2006-05-01

    Unmanned Ground Vehicles (UGVs) have advantages over people in a number of different applications, including sentry duty, scouting hazardous areas, convoying goods and supplies over long distances, and exploring caves and tunnels. Despite recent advances in electronics, vision, artificial intelligence, and control technologies, fully autonomous UGVs are still far from being a reality. Currently, most UGVs are fielded using tele-operation with a human in the control loop. Using tele-operation, a user controls the UGV from the relative safety and comfort of a control station and sends commands to the UGV remotely. It is difficult for the user to issue higher-level commands such as "patrol this corridor" or "move to this position while avoiding obstacles". As computer vision algorithms are implemented in hardware, the UGV can easily become partially autonomous. As Field Programmable Gate Arrays (FPGAs) become larger and more powerful, vision algorithms can run at frame rate. With the rapid development of CMOS imagers for consumer electronics, the frame rate can reach as high as 200 frames per second with a small region of interest. This increase in the speed of vision algorithm processing allows UGVs to become more autonomous, as they are able to recognize and avoid obstacles in their path, track targets, or move to a recognized area. The user is able to focus on giving broad supervisory commands and goals to the UGVs, allowing the user to control multiple UGVs at once while still maintaining the convenience of working from a central base station. In this paper, we describe a novel control system for the control of semi-autonomous UGVs. This control system combines a user interface similar to a simple tele-operation station with a control package, including the FPGA and multiple cameras. The control package interfaces with the UGV and provides the necessary control to guide the UGV.
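The claim that a small region of interest (ROI) raises the achievable frame rate follows from a roughly fixed pixel readout bandwidth. A back-of-envelope check, with an assumed (not sourced) readout rate:

```python
# Assumed sensor readout bandwidth in pixels/s; illustrative, not from the paper.
PIXEL_RATE = 30e6

full = 640 * 480      # full-frame pixel count
roi = 160 * 120       # small region of interest

fps_full = PIXEL_RATE / full
fps_roi = PIXEL_RATE / roi

# A 16x smaller ROI yields a 16x higher frame rate at fixed readout bandwidth,
# which is how CMOS imagers reach rates like 200+ fps on a small ROI.
assert fps_roi / fps_full == full / roi
assert fps_roi > 200
```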

  16. Research in Computer Vision for Autonomous Systems

    DTIC Science & Technology

    1988-09-15

    segmentation using thresholding. Although it has fared better than the TBIR metric in assessing the difficulty of segmentation on the images we have tested ... a typographical error in Hu's original paper [Hu62]. Further study showed that this error had been reported in [Ma79]. The feature in error was Tlpq ... typographical errors in the equations that set the partial derivative to zero, so the derivation is repeated

  17. Information for Successful Interaction with Autonomous Systems

    NASA Technical Reports Server (NTRS)

    Malin, Jane T.; Johnson, Kathy A.

    2003-01-01

    Interaction in heterogeneous mission operations teams is not well matched to classical models of coordination with autonomous systems. We describe methods of loose coordination and information management in mission operations. We describe an information agent and information management tool suite for managing information from many sources, including autonomous agents. We present an integrated model of levels of complexity of agent and human behavior, which shows types of information processing and points of potential error in agent activities. We discuss the types of information needed for diagnosing problems and planning interactions with an autonomous system. We discuss types of coordination for which designs are needed for autonomous system functions.

  18. A Robot Vision System.

    DTIC Science & Technology

    1985-12-01

    Air Force Institute of Technology, Wright-Patterson AFB, OH, School of Engineering.

  19. Dynamical Systems and Motion Vision.

    DTIC Science & Technology

    1988-04-01

    Massachusetts Institute of Technology, Artificial Intelligence Laboratory, A.I. Memo No. 1037, April 1988. Dynamical Systems and Motion Vision, Joachim Heel.

  20. Autonomous power system intelligent diagnosis and control

    NASA Technical Reports Server (NTRS)

    Ringer, Mark J.; Quinn, Todd M.; Merolla, Anthony

    1991-01-01

    The Autonomous Power System (APS) project at NASA Lewis Research Center is designed to demonstrate the abilities of integrated intelligent diagnosis, control, and scheduling techniques to space power distribution hardware. Knowledge-based software provides a robust method of control for highly complex space-based power systems that conventional methods do not allow. The project consists of three elements: the Autonomous Power Expert System (APEX) for fault diagnosis and control, the Autonomous Intelligent Power Scheduler (AIPS) to determine system configuration, and power hardware (Brassboard) to simulate a space-based power system. The operation of the Autonomous Power System as a whole is described and the responsibilities of the three elements - APEX, AIPS, and Brassboard - are characterized. A discussion of the methodologies used in each element is provided. Future plans are discussed for the growth of the Autonomous Power System.

  1. Autonomic nervous system and immune system interactions.

    PubMed

    Kenney, M J; Ganta, C K

    2014-07-01

    The present review assesses the current state of literature defining integrative autonomic-immune physiological processing, focusing on studies that have employed electrophysiological, pharmacological, molecular biological, and central nervous system experimental approaches. Central autonomic neural networks are informed of peripheral immune status via numerous communicating pathways, including neural and non-neural. Cytokines and other immune factors affect the level of activity and responsivity of discharges in sympathetic and parasympathetic nerves innervating diverse targets. Multiple levels of the neuraxis contribute to cytokine-induced changes in efferent parasympathetic and sympathetic nerve outflows, leading to modulation of peripheral immune responses. The functionality of local sympathoimmune interactions depends on the microenvironment created by diverse signaling mechanisms involving integration between sympathetic nervous system neurotransmitters and neuromodulators; specific adrenergic receptors; and the presence or absence of immune cells, cytokines, and bacteria. Functional mechanisms contributing to the cholinergic anti-inflammatory pathway likely involve novel cholinergic-adrenergic interactions at peripheral sites, including autonomic ganglion and lymphoid targets. Immune cells express adrenergic and nicotinic receptors. Neurotransmitters released by sympathetic and parasympathetic nerve endings bind to their respective receptors located on the surface of immune cells and initiate immune-modulatory responses. Both sympathetic and parasympathetic arms of the autonomic nervous system are instrumental in orchestrating neuroimmune processes, although additional studies are required to understand dynamic and complex adrenergic-cholinergic interactions. Further understanding of regulatory mechanisms linking the sympathetic nervous, parasympathetic nervous, and immune systems is critical for understanding relationships between chronic disease

  2. Autonomic Nervous System and Immune System Interactions

    PubMed Central

    Kenney, MJ; Ganta, CK

    2015-01-01

    The present review assesses the current state of literature defining integrative autonomic-immune physiological processing, focusing on studies that have employed electrophysiological, pharmacological, molecular biological and central nervous system experimental approaches. Central autonomic neural networks are informed of peripheral immune status via numerous communicating pathways, including neural and non-neural. Cytokines and other immune factors affect the level of activity and responsivity of discharges in sympathetic and parasympathetic nerves innervating diverse targets. Multiple levels of the neuraxis contribute to cytokine-induced changes in efferent parasympathetic and sympathetic nerve outflows, leading to modulation of peripheral immune responses. The functionality of local sympathoimmune interactions depends on the microenvironment created by diverse signaling mechanisms involving integration between sympathetic nervous system neurotransmitters and neuromodulators; specific adrenergic receptors; and the presence or absence of immune cells, cytokines and bacteria. Functional mechanisms contributing to the cholinergic anti-inflammatory pathway likely involve novel cholinergic-adrenergic interactions at peripheral sites, including autonomic ganglion and lymphoid targets. Immune cells express adrenergic and nicotinic receptors. Neurotransmitters released by sympathetic and parasympathetic nerve endings bind to their respective receptors located on the surface of immune cells and initiate immune-modulatory responses. Both sympathetic and parasympathetic arms of the autonomic nervous system are instrumental in orchestrating neuroimmune processes, although additional studies are required to understand dynamic and complex adrenergic-cholinergic interactions. Further understanding of regulatory mechanisms linking the sympathetic nervous, parasympathetic nervous, and immune systems is critical for understanding relationships between chronic disease development.

  3. Autonomous navigation system and method

    DOEpatents

    Bruemmer, David J [Idaho Falls, ID; Few, Douglas A [Idaho Falls, ID

    2009-09-08

    A robot platform includes perceptors, locomotors, and a system controller, which executes instructions for autonomously navigating a robot. The instructions repeat, on each iteration through an event timing loop, the acts of defining an event horizon based on the robot's current velocity, detecting a range to obstacles around the robot, testing for an event horizon intrusion by determining if any range to the obstacles is within the event horizon, and adjusting rotational and translational velocity of the robot accordingly. If the event horizon intrusion occurs, rotational velocity is modified by a proportion of the current rotational velocity reduced by a proportion of the range to the nearest obstacle and translational velocity is modified by a proportion of the range to the nearest obstacle. If no event horizon intrusion occurs, translational velocity is set as a ratio of a speed factor relative to a maximum speed.
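One iteration of the event-timing loop described in the abstract can be sketched as follows. The gains and the horizon scaling are placeholder values of our own, not those specified in the patent:

```python
def navigation_step(v_trans, v_rot, ranges, v_max,
                    horizon_gain=1.0, speed_factor=0.5,
                    k_rot=0.8, k_range=0.1, k_trans=0.5):
    """One pass of the event-timing loop: adjust velocities from obstacle ranges.

    All k_* gains, horizon_gain, and speed_factor are illustrative placeholders.
    """
    event_horizon = horizon_gain * v_trans   # horizon grows with current speed
    nearest = min(ranges)                    # range to the nearest obstacle
    if nearest < event_horizon:
        # Intrusion: turn based on current rotation reduced by nearest range,
        # and slow translation in proportion to the nearest range.
        v_rot = k_rot * v_rot - k_range * nearest
        v_trans = k_trans * nearest
    else:
        # Clear path: translational velocity from speed factor vs. max speed.
        v_trans = speed_factor * v_max
    return v_trans, v_rot

v_t, _ = navigation_step(1.0, 0.2, [0.4, 2.0], v_max=2.0)
assert v_t == 0.5 * 0.4   # intrusion case: slowed in proportion to range
```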

  4. System Engineering of Autonomous Space Vehicles

    NASA Technical Reports Server (NTRS)

    Watson, Michael D.; Johnson, Stephen B.; Trevino, Luis

    2014-01-01

    Human exploration of the solar system requires fully autonomous systems when travelling more than 5 light minutes from Earth. This autonomy is necessary to manage a large, complex spacecraft with limited crew members and skills available. The communication latency requires the vehicle to deal with events with only limited crew interaction in most cases. The engineering of these systems requires an extensive knowledge of the spacecraft systems, information theory, and autonomous algorithm characteristics. The characteristics of the spacecraft systems must be matched with the autonomous algorithm characteristics to reliably monitor and control the system. This presents a large system engineering problem. Recent work on product-focused, elegant system engineering will be applied to this application, looking at the full autonomy stack, the matching of autonomous systems to spacecraft systems, and the integration of different types of algorithms. Each of these areas will be outlined and a general approach defined for system engineering to provide the optimal solution to the given application context.

  5. Basic design principles of colorimetric vision systems

    NASA Astrophysics Data System (ADS)

    Mumzhiu, Alex M.

    1998-10-01

    Color measurement is an important part of overall production quality control in the textile, coating, plastics, food, paper and other industries. The color measurement instruments, such as colorimeters and spectrophotometers, used for production quality control have many limitations. In many applications they cannot be used for a variety of reasons and have to be replaced with human operators. Machine vision has great potential for color measurement. The components for color machine vision systems, such as broadcast-quality 3-CCD cameras, fast and inexpensive PCI frame grabbers, and sophisticated image processing software packages, are available. However, the machine vision industry has only started to approach the color domain. The few color machine vision systems on the market, produced by the largest machine vision manufacturers, have very limited capabilities. A lack of understanding that a vision-based color measurement system could fail if it ignores the basic principles of colorimetry is the main reason for the slow progress of color vision systems. The purpose of this paper is to clarify how color measurement principles have to be applied to vision systems and how the electro-optical design features of colorimeters have to be modified in order to implement them for vision systems. The subject of this presentation far exceeds the limitations of a journal paper, so only the most important aspects will be discussed. An overview of the major areas of application for colorimetric vision systems will be given. Finally, the reasons why some customers are happy with their vision systems and some are not will be analyzed.
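One basic colorimetric principle the paper alludes to is that camera RGB must be mapped into a device-independent color space before measurement. A minimal sketch using the standard linear-sRGB-to-CIE-XYZ matrix (the function name is ours; a real system would first characterize and linearize the camera):

```python
def linear_srgb_to_xyz(r, g, b):
    """Convert linear sRGB (D65 white) to CIE XYZ via the standard matrix."""
    x = 0.4124 * r + 0.3576 * g + 0.1805 * b
    y = 0.2126 * r + 0.7152 * g + 0.0722 * b  # Y is luminance
    z = 0.0193 * r + 0.1192 * g + 0.9505 * b
    return x, y, z

# Reference white (1,1,1) should have luminance Y of ~1.0.
x, y, z = linear_srgb_to_xyz(1.0, 1.0, 1.0)
assert abs(y - 1.0) < 1e-3
```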

  6. Autonomous Operations System: Development and Application

    NASA Technical Reports Server (NTRS)

    Toro Medina, Jaime A.; Wilkins, Kim N.; Walker, Mark; Stahl, Gerald M.

    2016-01-01

    Autonomous control systems provide the ability of self-governance beyond the conventional control system. As the complexity of mechanical and electrical systems increases, there develops a natural drive for developing robust control systems to manage complicated operations. By closing the bridge between conventional automated systems and knowledge-based self-aware systems, nominal control of operations can evolve to rely on safety-critical mitigation processes to support any off-nominal behavior. Current research and development efforts led by the Autonomous Propellant Loading (APL) group at NASA Kennedy Space Center aim to improve cryogenic propellant transfer operations by developing an automated control and health monitoring system. As an integrated system, the center aims to produce an Autonomous Operations System (AOS) capable of integrating health management operations with automated control to produce a fully autonomous system.

  7. Autonomous docking system for space structures and satellites

    NASA Astrophysics Data System (ADS)

    Prasad, Guru; Tajudeen, Eddie; Spenser, James

    2005-05-01

    Aximetric proposes a Distributed Command and Control (C2) architecture for autonomous on-orbit assembly in space with our unique vision- and sensor-driven docking mechanism. Aximetric is currently working on IP-based distributed control strategies, docking/mating plate, alignment and latching mechanisms, umbilical structure/cord designs, and hardware/software in a closed-loop architecture for a smart autonomous demonstration utilizing proven developments in sensor and docking technology. These technologies can be effectively applied to many transferring/conveying and on-orbit servicing applications, including the capturing and coupling of space-bound vehicles and components. The autonomous system will be a "smart" system that will incorporate a vision system used for identifying, tracking, locating and mating the transferring device to the receiving device. A robustly designed coupler for the transfer of the fuel will be integrated. Advanced sealing technology will be utilized for isolation and purging of resulting cavities from the mating process and/or from the incorporation of other electrical and data acquisition devices used as part of the overall smart system.

  8. Vision inspection system and method

    NASA Technical Reports Server (NTRS)

    Huber, Edward D. (Inventor); Williams, Rick A. (Inventor)

    1997-01-01

    An optical vision inspection system (4) and method for multiplexed illuminating, viewing, analyzing and recording a range of characteristically different kinds of defects, depressions, and ridges in a selected material surface (7) with first and second alternating optical subsystems (20, 21) illuminating and sensing successive frames of the same material surface patch. To detect the different kinds of surface features including abrupt as well as gradual surface variations, correspondingly different kinds of lighting are applied in time-multiplexed fashion to the common surface area patches under observation.

  9. Autonomous intelligent cruise control system

    NASA Astrophysics Data System (ADS)

    Baret, Marc; Bomer, Thierry T.; Calesse, C.; Dudych, L.; L'Hoist, P.

    1995-01-01

    Autonomous intelligent cruise control (AICC) systems do more than control a vehicle's speed: by acting on the throttle and eventually on the brakes, they can automatically maintain the relative speed and distance between two vehicles in the same lane. More than just a comfort feature, it appears that these new systems should improve safety on highways. By applying a technique issued from the space research carried out by MATRA, a sensor based on a charge-coupled device (CCD) was designed to acquire the light of pulsed laser diode emissions reflected on standard-mounted car reflectors. The CCD works in a unique mode called flash during transfer (FDT), which allows identification of target patterns in severe optical environments. It provides high accuracy for the distance and angular position of targets. The absence of moving mechanical parts ensures high reliability for this sensor. The large field of view and the high measurement rate give a global situation assessment and a short reaction time. Tracking and filtering algorithms have then been developed in order to select the target on which the equipped vehicle determines its safety distance and speed, taking into account its own maneuvering and the behavior of other vehicles.
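The abstract does not give the control law, so as a generic illustration here is a constant-time-gap following policy, a common textbook form for adaptive cruise control; all gains and parameters are assumed, not MATRA's:

```python
def acc_command(own_speed, gap, lead_speed,
                time_gap=1.8, standstill=2.0, k_gap=0.2, k_rel=0.5):
    """Return a commanded acceleration (m/s^2) to hold a speed-dependent gap.

    desired gap = standstill distance + time_gap * own speed; the command is a
    weighted sum of gap error and relative speed (all gains illustrative).
    """
    desired_gap = standstill + time_gap * own_speed
    gap_error = gap - desired_gap
    rel_speed = lead_speed - own_speed
    return k_gap * gap_error + k_rel * rel_speed

# Too close and closing fast -> the command is negative (brake).
assert acc_command(own_speed=30.0, gap=40.0, lead_speed=25.0) < 0
```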

  10. Three-dimensional vision system

    NASA Astrophysics Data System (ADS)

    Tyrsa, Valentin E.; Burtseva, Larisa P.; Tyrsa, Vera; Kalaykov, Ivan; Ananiev, Anani

    2002-10-01

    A relief image focusing system is proposed here, contrary to normal focusing systems using a flat image. The relief maps the panorama's depth by a scale depending on the objective parameters. Opposite to normal understanding, the relief is not static: it is formed synchronously with the forward-backward linear motion of the objective. Each pixel of the vision sensor is illuminated and interference of light waves takes place. As the objective moves, the intensity of the interfered optical signal changes between the internal sensor noise level and a certain maximum as a function of time. The process of forming the signal in the relief image space is based on searching for extremes of the interfered optical signals at each pixel. A method of mapping the measurement scale of a monocular focusing distance-meter is presented in the paper. It is implemented by sampling the signals, digital measurement of the time intervals and the signals' amplitudes, and joint processing of the obtained numerical values, limiting the error to fractions of a micron. The method enables vision systems to perceive a panoramic view over wide distance ranges. It can work at a wide range of optical signals and is invariant to aberrations.

  11. Advanced integrated enhanced vision systems

    NASA Astrophysics Data System (ADS)

    Kerr, J. R.; Luk, Chiu H.; Hammerstrom, Dan; Pavel, Misha

    2003-09-01

    In anticipation of its ultimate role in transport, business and rotary-wing aircraft, we clarify the role of Enhanced Vision Systems (EVS): how the output data will be utilized, appropriate architecture for total avionics integration, pilot and control interfaces, and operational utilization. Ground-map (database) correlation is critical, and we suggest that "synthetic vision" is simply a subset of the monitor/guidance interface issue. The core of integrated EVS is its sensor processor. In order to approximate optimal, Bayesian multi-sensor fusion and ground correlation functionality in real time, we are developing a neural net approach utilizing human visual pathway and self-organizing, associative-engine processing. In addition to EVS/SVS imagery, outputs will include sensor-based navigation and attitude signals as well as hazard detection. A system architecture is described, encompassing an all-weather sensor suite; advanced processing technology; inertial, GPS and other avionics inputs; and pilot and machine interfaces. Issues of total-system accuracy and integrity are addressed, as well as flight operational aspects relating to both civil certification and military applications in IMC.
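The Bayesian multi-sensor fusion mentioned above reduces, in the simplest case of two independent Gaussian measurements, to inverse-variance weighting. A minimal sketch of that special case (not the paper's neural-net approach):

```python
def fuse(mu1, var1, mu2, var2):
    """Fuse two independent Gaussian estimates of the same quantity.

    The fused mean weights each sensor by its inverse variance, and the fused
    variance is always smaller than either input variance.
    """
    w1, w2 = 1.0 / var1, 1.0 / var2
    var = 1.0 / (w1 + w2)
    mu = var * (w1 * mu1 + w2 * mu2)
    return mu, var

mu, var = fuse(10.0, 4.0, 12.0, 4.0)
assert mu == 11.0 and var == 2.0      # equal trust: midpoint, tighter estimate
mu, _ = fuse(10.0, 1.0, 12.0, 100.0)
assert abs(mu - 10.0) < 0.1           # trust the low-variance sensor
```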

  12. VISION 21 SYSTEMS ANALYSIS METHODOLOGIES

    SciTech Connect

    G.S. Samuelsen; A. Rao; F. Robson; B. Washom

    2003-08-11

    Under the sponsorship of the U.S. Department of Energy/National Energy Technology Laboratory, a multi-disciplinary team led by the Advanced Power and Energy Program of the University of California at Irvine is defining the system engineering issues associated with the integration of key components and subsystems into power plant systems that meet performance and emission goals of the Vision 21 program. The study efforts have narrowed down the myriad of fuel processing, power generation, and emission control technologies to selected scenarios that identify those combinations having the potential to achieve the Vision 21 program goals of high efficiency and minimized environmental impact while using fossil fuels. The technology levels considered are based on projected technical and manufacturing advances being made in industry and on advances identified in current and future government supported research. Included in these advanced systems are solid oxide fuel cells and advanced cycle gas turbines. The results of this investigation will serve as a guide for the U. S. Department of Energy in identifying the research areas and technologies that warrant further support.

  13. Cooperative Autonomic Management in Dynamic Distributed Systems

    NASA Astrophysics Data System (ADS)

    Xu, Jing; Zhao, Ming; Fortes, José A. B.

    The centralized management of large distributed systems is often impractical, particularly when both the topology and the status of the system change dynamically. This paper proposes an approach to application-centric self-management in large distributed systems consisting of a collection of autonomic components that join and leave the system dynamically. Cooperative autonomic components self-organize into a dynamically created overlay network. Through local information sharing with neighbors, each component gains access to global information as needed for optimizing the performance of applications. The approach has been validated and evaluated by developing a decentralized autonomic system consisting of multiple autonomic application managers previously developed for the In-VIGO grid-computing system. Using analytical results from complex random networks and measurements done in a prototype system, we demonstrate the robustness, self-organization and adaptability of our approach, both theoretically and experimentally.

  14. Musca domestica inspired machine vision system with hyperacuity

    NASA Astrophysics Data System (ADS)

    Riley, Dylan T.; Harman, William M.; Tomberlin, Eric; Barrett, Steven F.; Wilcox, Michael; Wright, Cameron H. G.

    2005-05-01

    Musca domestica, the common house fly, has a simple yet powerful and accessible vision system. Cajal indicated in 1885 that the fly's vision system is the same as in the human retina. The house fly has some intriguing vision system features such as fast, analog, parallel operation. Furthermore, it has the ability to detect movement and objects at far better resolution than predicted by photoreceptor spacing, termed hyperacuity. We are investigating the mechanisms behind these features and incorporating them into next-generation vision systems. We have developed a prototype sensor that employs a fly-inspired arrangement of photodetectors sharing a common lens. The Gaussian-shaped acceptance profile of each sensor, coupled with overlapped sensor fields of view, provides the necessary configuration for obtaining hyperacuity data. The sensor is able to detect object movement with far greater resolution than that predicted by photoreceptor spacing. We have exhaustively tested and characterized the sensor to determine its practical resolution limit. Our tests, coupled with theory from Bucklew and Saleh (1985), indicate that the limit to the hyperacuity response may only be related to target contrast. We have also implemented an array of these prototype sensors which will allow for two-dimensional position location. These high-resolution, low-contrast-capable sensors are being developed for use as a vision system for an autonomous robot and the next generation of smart wheelchairs. However, they are easily adapted for biological endoscopy, downhole monitoring in oil wells, and other applications.
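How overlapping Gaussian acceptance profiles yield sub-spacing (hyperacuity) localization can be shown in a few lines: the log-ratio of two overlapped Gaussian responses is linear in target position. The profile width and detector spacing below are illustrative, not the prototype's measured values:

```python
import math

SIGMA = 1.0   # acceptance-profile width (illustrative)
S = 0.5       # two detectors centered at -S and +S (spacing 2*S = 1.0)

def response(x, center):
    """Gaussian acceptance profile of one photodetector."""
    return math.exp(-((x - center) ** 2) / (2 * SIGMA ** 2))

def estimate_position(r_left, r_right):
    """Recover target position from the two responses.

    ln(r_left / r_right) = -2*S*x / SIGMA**2, so x follows directly.
    """
    return -SIGMA ** 2 * math.log(r_left / r_right) / (2 * S)

true_x = 0.123  # an offset far smaller than the detector spacing
est = estimate_position(response(true_x, -S), response(true_x, +S))
assert abs(est - true_x) < 1e-9  # localized well below the spacing
```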

  15. The Secure, Transportable, Autonomous Reactor System

    SciTech Connect

    Brown, N.W.; Hassberger, J.A.; Smith, C.; Carelli, M.; Greenspan, E.; Peddicord, K.L.; Stroh, K.; Wade, D.C.; Hill, R.N.

    1999-05-27

    The Secure, Transportable, Autonomous Reactor (STAR) system is a development architecture for implementing a small nuclear power system, specifically aimed at meeting the growing energy needs of much of the developing world. It simultaneously provides very high standards for safety, proliferation resistance, ease and economy of installation, operation, and ultimate disposition. The STAR system accomplishes these objectives through a combination of modular design, factory manufacture, long lifetime without refueling, autonomous control, and high reliability.

  16. Computer vision for driver assistance systems

    NASA Astrophysics Data System (ADS)

    Handmann, Uwe; Kalinke, Thomas; Tzomakas, Christos; Werner, Martin; von Seelen, Werner

    1998-07-01

    Systems for automated image analysis are useful for a variety of tasks, and their importance is still increasing due to technological advances and growing social acceptance. Especially in the field of driver assistance systems, scientific progress has reached a level of high performance. Fully or partly autonomously guided vehicles, particularly for road-based traffic, pose high demands on the development of reliable algorithms due to the conditions imposed by natural environments. At the Institut für Neuroinformatik, methods for analyzing driving-relevant scenes by computer vision are developed in cooperation with several partners from the automobile industry. We introduce a system which extracts the important information from an image taken by a CCD camera installed at the rear-view mirror in a car. The approach combines sequential and parallel sensor and information processing. Three main tasks, namely initial segmentation (object detection), object tracking, and object classification, are realized by integration in the sequential branch and by fusion in the parallel branch. The main gain of this approach is the integrative coupling of different algorithms providing partly redundant information.

  17. Autonomic nervous system and cardiovascular disease.

    PubMed

    Deschamps, Alain; Denault, André

    2009-06-01

    Because anesthesia affects the integrity of the autonomic nervous system, anesthesiologists use vital signs to maintain respiratory and circulatory homeostasis. However, patients with genetic predispositions or with autonomic dysfunctions are at risk of severe complications from anesthesia. For these patients, the monitoring of vital signs may not give sufficient warning to avoid complications. The development of methods to measure autonomic tone could be of interest to anesthesiologists because they could warn of changes in autonomic tone before vital signs are affected. New noninvasive methods are being developed to obtain measurements of parasympathetic and sympathetic output, allowing for the monitoring of perioperative autonomic tone. These measurements are based on analysis of heart rate and blood pressure variability. In this report, the principles of the analysis of heart rate and blood pressure variability will be explained and the usefulness of these methods to anesthesiologists will be discussed.
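
    The abstract does not specify which variability indices are used; as a hedged illustration, two standard time-domain heart-rate-variability measures (SDNN and RMSSD, the latter commonly read as a parasympathetic marker) can be computed from a series of RR intervals:

```python
import math

def sdnn(rr_ms):
    """Standard deviation of RR intervals (overall variability), in ms."""
    mean = sum(rr_ms) / len(rr_ms)
    return math.sqrt(sum((r - mean) ** 2 for r in rr_ms) / (len(rr_ms) - 1))

def rmssd(rr_ms):
    """Root mean square of successive differences, in ms
    (commonly interpreted as a marker of parasympathetic activity)."""
    diffs = [b - a for a, b in zip(rr_ms, rr_ms[1:])]
    return math.sqrt(sum(d * d for d in diffs) / len(diffs))

rr = [812, 790, 805, 831, 776, 798]  # example RR intervals in milliseconds
print(sdnn(rr), rmssd(rr))
```

    Frequency-domain analysis (e.g. low-frequency vs. high-frequency spectral power) is the other common family of methods, but requires resampling and spectral estimation beyond this sketch.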

  18. Development of an autonomous target tracking system

    NASA Astrophysics Data System (ADS)

    Gidda, Venkata Ramaiah

    In recent years, surveillance and border patrol have become key areas of UAV research. Increases in the computational capability of computers and embedded electronics, coupled with the compatibility of various commercial vision algorithms with commercial off-the-shelf (COTS) embedded electronics, have further fuelled this research. The basic task in these applications is perception of the environment through the available visual sensors, such as cameras. Visual tracking, as the name implies, is tracking of objects using a camera. The process of autonomous target tracking starts with the selection of the target in a sequence of video frames transmitted from the on-board camera. We use an improved fast dynamic template matching algorithm coupled with a Kalman filter to track the selected target in consecutive video frames. The selected target is saved as a reference template. On the ground station computer, the reference template is overlaid on the live streaming video from the on-board system, starting from the upper left corner of the video frame. The template is slid pixel by pixel over the entire source image, and a comparison of the pixels is performed between the template and the source image. A confidence value R of the match is calculated at each pixel. Depending on the method used to perform the template matching, the best-match pixel location is found according to the highest or lowest confidence value R. The best-match pixel location is communicated to the on-board gimbal controller over the wireless XBee network. The software on the controller actuates the pan-tilt servos to continuously hold the selected target at the center of the video frame. The complete system is a portable control system assembled from commercial off-the-shelf parts. The tracking system is tested on a target having several motion patterns.
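
    A minimal sketch of the matching step (using plain normalized cross-correlation rather than the paper's improved fast dynamic template matching, and omitting the Kalman filter):

```python
import numpy as np

def match_template(source, template):
    """Slide the template over the source and return the confidence map R
    (normalized cross-correlation) plus the best-match pixel location."""
    sh, sw = source.shape
    th, tw = template.shape
    t = template - template.mean()
    R = np.zeros((sh - th + 1, sw - tw + 1))
    for y in range(R.shape[0]):
        for x in range(R.shape[1]):
            patch = source[y:y + th, x:x + tw]
            p = patch - patch.mean()
            denom = np.sqrt((p * p).sum() * (t * t).sum())
            R[y, x] = (p * t).sum() / denom if denom else 0.0
    best = np.unravel_index(np.argmax(R), R.shape)  # highest R = best match
    return R, best

rng = np.random.default_rng(0)
source = rng.random((40, 40))          # stand-in for a video frame
template = source[12:20, 25:33].copy() # the "selected target" reference
R, (by, bx) = match_template(source, template)
```

    For NCC, the best match is the highest R; for a sum-of-squared-differences criterion, it would be the lowest, which is the "highest or lowest" distinction the abstract draws.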

  19. An introduction to autonomous control systems

    NASA Technical Reports Server (NTRS)

    Antsaklis, Panos J.; Passino, Kevin M.; Wang, S. J.

    1991-01-01

    The functions, characteristics, and benefits of autonomous control are outlined. An autonomous control functional architecture for future space vehicles that incorporates the concepts and characteristics described is presented. The controller is hierarchical, with an execution level (the lowest level), coordination level (middle level), and management and organization level (highest level). The general characteristics of the overall architecture, including those of the three levels, are explained, and an example to illustrate their functions is given. Mathematical models for autonomous systems, including 'logical' discrete event system models, are discussed. An approach to the quantitative, systematic modeling, analysis, and design of autonomous controllers is also discussed. It is a hybrid approach since it uses conventional analysis techniques based on difference and differential equations and new techniques for the analysis of the systems described with a symbolic formalism such as finite automata. Some recent results from the areas of planning and expert systems, machine learning, artificial neural networks, and the area restructurable controls are briefly outlined.

  1. Precise calibration of binocular vision system used for vision measurement.

    PubMed

    Cui, Yi; Zhou, Fuqiang; Wang, Yexin; Liu, Liu; Gao, He

    2014-04-21

    Binocular vision calibration is of great importance in 3D machine vision measurement. In binocular vision calibration, nonlinear optimization is a crucial step for improving accuracy. Existing optimization methods mostly aim at minimizing the sum of reprojection errors for the two cameras based on their respective 2D pixel coordinates. However, the subsequent measurement process is conducted in a 3D coordinate system, which is not consistent with the optimization coordinate system. Moreover, the error criteria for optimization and measurement differ: an equal pixel-distance error in the 2D image plane leads to different 3D metric distance errors at different positions in front of the camera. To address these issues, we propose a precise calibration method for binocular vision systems which is devoted to minimizing the metric distance error between the point reconstructed through optimal triangulation and the ground truth in the 3D measurement coordinate system. In addition, the inherent epipolar constraint and a constant distance constraint are combined to enhance the optimization process. To evaluate the performance of the proposed method, both simulated and real experiments were carried out, and the results show that the proposed method is reliable and efficient in improving measurement accuracy compared with the conventional method.
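
    As a sketch of the measurement side being optimized, the following triangulates a point from two camera observations with a basic linear (DLT) method and evaluates the 3D metric distance error. The toy projection matrices are assumptions, and the paper's optimal triangulation and constraint handling are not reproduced here:

```python
import numpy as np

def triangulate(P1, P2, x1, x2):
    """Linear (DLT) triangulation of one 3D point from two pixel observations."""
    A = np.vstack([
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]                     # null vector = homogeneous 3D point
    return X[:3] / X[3]

def metric_error(X_est, X_true):
    """3D metric distance error -- the quantity this method optimizes,
    unlike the conventional 2D reprojection error."""
    return float(np.linalg.norm(X_est - X_true))

# Toy rig: identical intrinsics, second camera offset by a 0.2 m baseline
K = np.array([[800.0, 0, 320], [0, 800.0, 240], [0, 0, 1]])
P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = K @ np.hstack([np.eye(3), np.array([[-0.2], [0], [0]])])
X_true = np.array([0.1, -0.05, 2.0])

def project(P, X):
    x = P @ np.append(X, 1.0)
    return x[:2] / x[2]

X_est = triangulate(P1, P2, project(P1, X_true), project(P2, X_true))
```

    With noisy observations, the metric error grows with depth even when the pixel error stays constant, which is the inconsistency the proposed calibration targets.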

  2. Intelligent control system of autonomous objects

    NASA Astrophysics Data System (ADS)

    Engel, E. A.; Kovalev, I. V.; Engel, N. E.; Brezitskaya, V. V.; Prohorovich, G. A.

    2017-02-01

    This paper presents an intelligent control system for autonomous objects as a framework. The intelligent control framework includes two different layers: a reflexive layer and a reactive layer. The proposed multiagent adaptive fuzzy neuronet combines low-level reaction with high-level reasoning in an intelligent control framework. Formed as a multiagent adaptive fuzzy neuronet, the intelligent control system creates an effective control signal from the autonomous object's state under random perturbations.

  3. A Vision-Based Relative Navigation Approach for Autonomous Multirotor Aircraft

    NASA Astrophysics Data System (ADS)

    Leishman, Robert C.

    Autonomous flight in unstructured, confined, and unknown GPS-denied environments is a challenging problem. Solutions could be tremendously beneficial for scenarios that require information about areas that are difficult to access and that present a great amount of risk. The goal of this research is to develop a new framework that enables improved solutions to this problem and to validate the approach with experiments using a hardware prototype. In Chapter 2 we examine the consequences and practical aspects of using an improved dynamic model for multirotor state estimation, using only IMU measurements. The improved model correctly explains the measurements available from the accelerometers on a multirotor. We provide hardware results demonstrating the improved attitude, velocity and even position estimates that can be achieved through the use of this model. We propose a new architecture to simplify some of the challenges that constrain GPS-denied aerial flight in Chapter 3. At the core, the approach combines visual graph-SLAM with a multiplicative extended Kalman filter (MEKF). More importantly, we depart from the common practice of estimating global states and instead keep the position and yaw states of the MEKF relative to the current node in the map. This relative navigation approach provides a tremendous benefit compared to maintaining estimates with respect to a single global coordinate frame. We discuss the architecture of this new system and provide important details for each component. We verify the approach with goal-directed autonomous flight-test results. The MEKF is the basis of the new relative navigation approach and is detailed in Chapter 4. We derive the relative filter and show how the states must be augmented and marginalized each time a new node is declared. The relative estimation approach is verified using hardware flight test results accompanied by comparisons to motion capture truth. 
Additionally, flight results with estimates in the control

  4. Autonomous control systems - Architecture and fundamental issues

    NASA Technical Reports Server (NTRS)

    Antsaklis, P. J.; Passino, K. M.; Wang, S. J.

    1988-01-01

    A hierarchical functional autonomous controller architecture is introduced. In particular, the architecture for the control of future space vehicles is described in detail; it is designed to ensure the autonomous operation of the control system, and it allows interaction with the pilot and crew/ground station, and with the systems on board the autonomous vehicle. The fundamental issues in autonomous control system modeling and analysis are discussed. It is proposed to utilize a hybrid approach to the modeling and analysis of autonomous systems. This will incorporate conventional control methods based on differential equations and techniques for the analysis of systems described with a symbolic formalism. In this way, the theory of conventional control can be fully utilized. It is stressed that autonomy is the design requirement and that intelligent control methods appear, at present, to offer some of the necessary tools to achieve autonomy. A conventional approach may evolve and replace some or all of the 'intelligent' functions. It is shown that in addition to conventional controllers, the autonomous control system incorporates planning, learning, and FDI (fault detection and identification).

  6. Comparative anatomy of the autonomic nervous system.

    PubMed

    Nilsson, Stefan

    2011-11-16

    This short review aims to point out the general anatomical features of the autonomic nervous systems of non-mammalian vertebrates. In addition, it attempts to outline the similarities, and also the increased complexity, of the autonomic nervous patterns from fish to tetrapods. With the possible exception of the cyclostomes, perhaps the most striking feature of the vertebrate autonomic nervous system is the similarity between the vertebrate classes. An evolution of the complexity of the system can be seen: the segmental ganglia of elasmobranchs are incompletely connected longitudinally, while well-developed paired sympathetic chains are present in teleosts and the tetrapods. In some groups the sympathetic chains may be reduced (dipnoans and caecilians), and they have yet to be properly described in snakes. Cranial autonomic pathways are present in the oculomotor (III) and vagus (X) nerves of gnathostome fish and the tetrapods and, with the evolution of salivary and lachrymal glands in the tetrapods, also in the facial (VII) and glossopharyngeal (IX) nerves.

  7. Vehicle autonomous localization in local area of coal mine tunnel based on vision sensors and ultrasonic sensors.

    PubMed

    Xu, Zirui; Yang, Wei; You, Kaiming; Li, Wei; Kim, Young-Il

    2017-01-01

    This paper presents a vehicle autonomous localization method for the local area of a coal mine tunnel based on vision sensors and ultrasonic sensors. Barcode tags are deployed in pairs on both sides of the tunnel walls at certain intervals as artificial landmarks. The barcode coding is designed based on the UPC-A code. The global coordinates of the upper-left inner corner point of the feature frame of each barcode tag deployed in the tunnel are uniquely represented by the barcode. Two on-board vision sensors are used to recognize each pair of barcode tags on both sides of the tunnel walls. The distance between the upper-left inner corner point of the feature frame of each barcode tag and the vehicle center point can be determined by using a visual distance projection model. The on-board ultrasonic sensors are used to measure the distance from the vehicle center point to the left side of the tunnel walls. Once the spatial geometric relationship between the barcode tags and the vehicle center point is established, the 3D coordinates of the vehicle center point in the tunnel's global coordinate system can be calculated. Experiments in a straight corridor and an underground tunnel have shown that the proposed method is not only able to quickly recognize the barcode tags affixed to the tunnel walls, but also has relatively small average localization errors in the vehicle center point's planar and vertical coordinates, meeting autonomous unmanned vehicle positioning requirements in the local area of a coal mine tunnel.
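
    Since the tag coding is based on UPC-A, the standard UPC-A check-digit rule applies to any encoded payload; the coordinate packing in `encode_tag` below is purely hypothetical, as the paper's exact coding scheme is not given here:

```python
def upca_check_digit(digits11):
    """Check digit for an 11-digit UPC-A payload: digits in odd positions
    (1-indexed) are weighted 3, digits in even positions weighted 1."""
    odd = sum(digits11[0::2])
    even = sum(digits11[1::2])
    return (10 - (3 * odd + even) % 10) % 10

def encode_tag(x_dm, y_dm, z_dm):
    """Hypothetical packing of a tag corner point's global coordinates
    (here in decimeters: 4+4+3 digits) into an 11-digit payload, followed
    by the UPC-A check digit. Illustrative only -- not the paper's scheme."""
    payload = [int(c) for c in f"{x_dm:04d}{y_dm:04d}{z_dm:03d}"]
    return payload + [upca_check_digit(payload)]

tag = encode_tag(123, 45, 67)  # 12-digit UPC-A-style code for tag (12.3, 4.5, 6.7) m
```

    The check digit lets the on-board vision system reject misread tags before using the decoded coordinates for localization.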

  8. Vehicle autonomous localization in local area of coal mine tunnel based on vision sensors and ultrasonic sensors

    PubMed Central

    Yang, Wei; You, Kaiming; Li, Wei; Kim, Young-il

    2017-01-01

    This paper presents a vehicle autonomous localization method for the local area of a coal mine tunnel based on vision sensors and ultrasonic sensors. Barcode tags are deployed in pairs on both sides of the tunnel walls at certain intervals as artificial landmarks. The barcode coding is designed based on the UPC-A code. The global coordinates of the upper-left inner corner point of the feature frame of each barcode tag deployed in the tunnel are uniquely represented by the barcode. Two on-board vision sensors are used to recognize each pair of barcode tags on both sides of the tunnel walls. The distance between the upper-left inner corner point of the feature frame of each barcode tag and the vehicle center point can be determined by using a visual distance projection model. The on-board ultrasonic sensors are used to measure the distance from the vehicle center point to the left side of the tunnel walls. Once the spatial geometric relationship between the barcode tags and the vehicle center point is established, the 3D coordinates of the vehicle center point in the tunnel's global coordinate system can be calculated. Experiments in a straight corridor and an underground tunnel have shown that the proposed method is not only able to quickly recognize the barcode tags affixed to the tunnel walls, but also has relatively small average localization errors in the vehicle center point's planar and vertical coordinates, meeting autonomous unmanned vehicle positioning requirements in the local area of a coal mine tunnel. PMID:28141829

  9. Autonomous underwater pipeline monitoring navigation system

    NASA Astrophysics Data System (ADS)

    Mitchell, Byrel; Mahmoudian, Nina; Meadows, Guy

    2014-06-01

    This paper details the development of an autonomous motion-control and navigation algorithm for an autonomous underwater vehicle, the Ocean Server IVER3, to track long linear features such as underwater pipelines. As part of this work, the Nonlinear and Autonomous Systems Laboratory (NAS Lab) developed an algorithm that utilizes inputs from the vehicle's state-of-the-art sensor package, which includes digital imaging, digital 3D sidescan sonar, and acoustic Doppler current profilers. The resulting algorithms should tolerate real-world waterways with episodic strong currents, low visibility, high sediment content, and a variety of small and large vessel traffic.

  10. Stereo-vision-based terrain mapping for off-road autonomous navigation

    NASA Astrophysics Data System (ADS)

    Rankin, Arturo L.; Huertas, Andres; Matthies, Larry H.

    2009-05-01

    Successful off-road autonomous navigation by an unmanned ground vehicle (UGV) requires reliable perception and representation of natural terrain. While perception algorithms are used to detect driving hazards, terrain mapping algorithms are used to represent the detected hazards in a world model a UGV can use to plan safe paths. There are two primary ways to detect driving hazards with perception sensors mounted to a UGV: binary obstacle detection and traversability cost analysis. Binary obstacle detectors label terrain as either traversable or non-traversable, whereas traversability cost analysis assigns a cost to driving over a discrete patch of terrain. In uncluttered environments where the non-obstacle terrain is equally traversable, binary obstacle detection is sufficient. However, in cluttered environments, some form of traversability cost analysis is necessary. The Jet Propulsion Laboratory (JPL) has explored both approaches using stereo vision systems. A set of binary detectors has been implemented that detect positive obstacles, negative obstacles, tree trunks, tree lines, excessive slope, low overhangs, and water bodies. A compact terrain map is built from each frame of stereo images. The mapping algorithm labels cells that contain obstacles as no-go regions, and encodes terrain elevation, terrain classification, terrain roughness, traversability cost, and a confidence value. The single-frame maps are merged into a world map where temporal filtering is applied. In previous papers, we have described our perception algorithms that perform binary obstacle detection. In this paper, we summarize the terrain mapping capabilities that JPL has implemented during several UGV programs over the last decade and discuss some challenges to building terrain maps with stereo range data.
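
    A hedged sketch of one possible map-cell update consistent with the description above (confidence-weighted temporal filtering, with obstacle labels marking cells no-go); the actual JPL representation is richer than this:

```python
from dataclasses import dataclass

@dataclass
class MapCell:
    elevation: float = 0.0
    cost: float = 0.0        # traversability cost for this terrain patch
    confidence: float = 0.0  # 0..1, grows as observations accumulate
    no_go: bool = False      # binary obstacle label

def update_cell(cell, elev, cost, conf, obstacle):
    """Merge one single-frame observation into the world-map cell:
    confidence-weighted blend of elevation/cost; an obstacle label sticks."""
    w = cell.confidence + conf
    if w > 0:
        cell.elevation = (cell.confidence * cell.elevation + conf * elev) / w
        cell.cost = (cell.confidence * cell.cost + conf * cost) / w
    cell.confidence = min(1.0, w)
    cell.no_go = cell.no_go or obstacle
    return cell
```

    A path planner would then treat no-go cells as hard constraints and use the fused cost field for everything else.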

  11. Stereo Vision Based Terrain Mapping for Off-Road Autonomous Navigation

    NASA Technical Reports Server (NTRS)

    Rankin, Arturo L.; Huertas, Andres; Matthies, Larry H.

    2009-01-01

    Successful off-road autonomous navigation by an unmanned ground vehicle (UGV) requires reliable perception and representation of natural terrain. While perception algorithms are used to detect driving hazards, terrain mapping algorithms are used to represent the detected hazards in a world model a UGV can use to plan safe paths. There are two primary ways to detect driving hazards with perception sensors mounted to a UGV: binary obstacle detection and traversability cost analysis. Binary obstacle detectors label terrain as either traversable or non-traversable, whereas traversability cost analysis assigns a cost to driving over a discrete patch of terrain. In uncluttered environments where the non-obstacle terrain is equally traversable, binary obstacle detection is sufficient. However, in cluttered environments, some form of traversability cost analysis is necessary. The Jet Propulsion Laboratory (JPL) has explored both approaches using stereo vision systems. A set of binary detectors has been implemented that detect positive obstacles, negative obstacles, tree trunks, tree lines, excessive slope, low overhangs, and water bodies. A compact terrain map is built from each frame of stereo images. The mapping algorithm labels cells that contain obstacles as no-go regions, and encodes terrain elevation, terrain classification, terrain roughness, traversability cost, and a confidence value. The single-frame maps are merged into a world map where temporal filtering is applied. In previous papers, we have described our perception algorithms that perform binary obstacle detection. In this paper, we summarize the terrain mapping capabilities that JPL has implemented during several UGV programs over the last decade and discuss some challenges to building terrain maps with stereo range data.

  13. Concurrent algorithms for a mobile robot vision system

    SciTech Connect

    Jones, J.P.; Mann, R.C.

    1988-01-01

    The application of computer vision to mobile robots has generally been hampered by insufficient on-board computing power. The advent of VLSI-based general-purpose concurrent multiprocessor systems promises to give mobile robots an increasing amount of on-board computing capability, and to allow computation-intensive data analysis to be performed without high-bandwidth communication with a remote system. This paper describes the integration of robot vision algorithms on a 3-dimensional hypercube system on board a mobile robot developed at Oak Ridge National Laboratory. The vision system is interfaced to navigation and robot control software, enabling the robot to maneuver in a laboratory environment, to find a known object of interest, and to recognize the object's status based on visual sensing. We first present the robot system architecture and the principles followed in the vision system implementation. We then provide some benchmark timings for low-level image processing routines, describe a concurrent algorithm with load balancing for the Hough transform, a new algorithm for binary component labeling, and an algorithm for the concurrent extraction of region features from labeled images. This system analyzes a scene in less than 5 seconds and has proven to be a valuable experimental tool for research in mobile autonomous robots. 9 refs., 1 fig., 3 tabs.
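
    The concurrent Hough transform can be sketched as follows: edge points are dealt evenly across workers (static load balancing), each worker votes into its own accumulator, and the partial accumulators are merged by summation. This is an illustrative reconstruction, not the hypercube implementation from the paper:

```python
import numpy as np
from concurrent.futures import ThreadPoolExecutor

N_THETA, N_RHO = 180, 200
RHO_MAX = 100.0
THETAS = np.deg2rad(np.arange(N_THETA))

def partial_hough(points):
    """Each worker votes an independent accumulator for its share of the
    edge points; merging partial accumulators is then a simple sum."""
    acc = np.zeros((N_THETA, N_RHO), dtype=np.int32)
    for x, y in points:
        rho = x * np.cos(THETAS) + y * np.sin(THETAS)
        idx = np.round((rho + RHO_MAX) * (N_RHO - 1) / (2 * RHO_MAX)).astype(int)
        acc[np.arange(N_THETA), idx] += 1
    return acc

def hough(points, workers=4):
    # Static load balancing: deal points round-robin so chunks stay even.
    chunks = [points[i::workers] for i in range(workers)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return sum(pool.map(partial_hough, chunks))

# Collinear points on the vertical line x = 10 (theta = 0, rho = 10)
pts = [(10, y) for y in range(20)]
acc = hough(pts)
theta_i, rho_i = np.unravel_index(np.argmax(acc), acc.shape)
```

    On a message-passing machine like a hypercube, the per-worker accumulators would be combined with a reduction over the links rather than a shared-memory sum, but the decomposition is the same.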

  14. 77 FR 2342 - Seventeenth Meeting: RTCA Special Committee 213, Enhanced Flight Vision/Synthetic Vision Systems...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-01-17

    ... Federal Aviation Administration Seventeenth Meeting: RTCA Special Committee 213, Enhanced Flight Vision... Transportation (DOT). ACTION: Notice of RTCA Special Committee 213, Enhanced Flight Vision/ Synthetic Vision... meeting of RTCA Special Committee 213, Enhanced Flight Vision/Synthetic Vision Systems (EFVS/SVS). DATES...

  15. Towards autonomic computing in machine vision applications: techniques and strategies for in-line 3D reconstruction in harsh industrial environments

    NASA Astrophysics Data System (ADS)

    Molleda, Julio; Usamentiaga, Rubén; García, Daniel F.; Bulnes, Francisco G.

    2011-03-01

    Machine vision applications nowadays require skilled users to configure, tune, and maintain them. Because such users are scarce, the robustness and reliability of applications are usually significantly affected. Autonomic computing offers a set of principles such as self-monitoring, self-regulation, and self-repair which can be used to partially overcome those problems. Systems which include self-monitoring observe their internal states and extract features about them. Systems with self-regulation are capable of regulating their internal parameters to provide the best quality of service depending on the operational conditions and environment. Finally, self-repairing systems are able to detect anomalous working behavior and to provide strategies to deal with such conditions. Machine vision applications are a perfect field in which to apply autonomic computing techniques. This type of application has strong constraints on reliability and robustness, especially when working in industrial environments, and must provide accurate results even under changing conditions such as luminance or noise. In order to exploit the autonomic approach in a machine vision application, we believe the architecture of the system must be designed as a set of orthogonal modules. In this paper, we describe how autonomic computing techniques can be applied to machine vision systems, using as an example a real application: 3D reconstruction in harsh industrial environments based on laser range finding. The application is based on modules with different responsibilities at three layers: image acquisition and processing (low level), monitoring (middle level), and supervision (high level). High-level modules supervise the execution of low-level modules. Based on the information gathered by mid-level modules, they regulate low-level modules in order to optimize the global quality of service, and tune the module parameters based on operational conditions and on the environment.
Regulation actions involve
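
    A minimal sketch of the self-monitoring/self-regulation loop described in this entry; the quality metric and regulation law below are illustrative assumptions, not taken from the paper:

```python
class SelfRegulatingAcquisition:
    """Low-level module with self-monitoring: it exposes a quality metric
    and accepts parameter regulation from a high-level supervisor."""
    def __init__(self, exposure=10.0):
        self.exposure = exposure

    def quality(self, mean_brightness):
        # Self-monitoring: quality as distance from ideal mid-range brightness
        # (assumed metric; 1.0 is best, 0.0 is worst).
        return 1.0 - abs(mean_brightness - 128) / 128

class Supervisor:
    """High-level module: regulates low-level parameters to keep quality up."""
    def regulate(self, module, mean_brightness, gain=0.05):
        # Self-regulation: nudge exposure toward mid-range brightness.
        module.exposure *= 1 + gain * (128 - mean_brightness) / 128
        return module.exposure
```

    In the layered design the paper describes, the monitoring layer would gather the quality metric and the supervision layer would apply the regulation, keeping the three responsibilities in orthogonal modules.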

  16. Autonomous Attitude Determination System (AADS). Volume 1: System description

    NASA Technical Reports Server (NTRS)

    Saralkar, K.; Frenkel, Y.; Klitsch, G.; Liu, K. S.; Lefferts, E.; Tasaki, K.; Snow, F.; Garrahan, J.

    1982-01-01

    Information necessary to understand the Autonomous Attitude Determination System (AADS) is presented. Topics include AADS requirements, program structure, algorithms, and system generation and execution.

  17. Open multiagent architecture extended to distributed autonomous robotic systems

    NASA Astrophysics Data System (ADS)

    Sellem, Philippe; Amram, Eric; Luzeaux, Dominique

    2000-07-01

    Our research deals with the design and experimentation of a control architecture for an autonomous outdoor mobile robot which uses mainly vision for perception. In this case of a single robot, we have designed a hybrid architecture with an attention mechanism that allows dynamic selection of perception processes. Building on this work, we have developed an open multi-agent architecture for standard multi-task operating systems, using the C++ programming language and POSIX threads. Our implementation features efficient and fully generic messages between agents, automatic acknowledgement receipts, and built-in synchronization capabilities. Knowledge is distributed among robots according to a collaborative scheme: every robot builds its own representation of the world and shares it with others. Pieces of information are exchanged when decisions have to be made. Experiments are to be conducted with two outdoor ActivMedia Pioneer AT mobile robots. Distributed perception, using mainly vision but also ultrasound, will serve as proof of concept.

  18. A production peripheral vision display system

    NASA Technical Reports Server (NTRS)

    Heinmiller, B.

    1984-01-01

    A small number of peripheral vision display systems in three significantly different configurations were evaluated in various aircraft and simulator situations. The use of these development systems enabled the gathering of much subjective and quantitative data regarding this concept of flight deck instrumentation. However, much was also learned about the limitations of this equipment, which need to be addressed prior to widespread use. A program at Garrett Manufacturing Limited in which the peripheral vision display system is redesigned and transformed into a viable production avionics system is discussed. Modular design, interchangeable units, optical attenuators, and system fault detection are considered with respect to peripheral vision display systems.

  19. Blending of brain-machine interface and vision-guided autonomous robotics improves neuroprosthetic arm performance during grasping.

    PubMed

    Downey, John E; Weiss, Jeffrey M; Muelling, Katharina; Venkatraman, Arun; Valois, Jean-Sebastien; Hebert, Martial; Bagnell, J Andrew; Schwartz, Andrew B; Collinger, Jennifer L

    2016-03-18

    Recent studies have shown that brain-machine interfaces (BMIs) offer great potential for restoring upper limb function. However, grasping objects is a complicated task and the signals extracted from the brain may not always be capable of driving these movements reliably. Vision-guided robotic assistance is one possible way to improve BMI performance. We describe a method of shared control where the user controls a prosthetic arm using a BMI and receives assistance with positioning the hand when it approaches an object. Two human subjects with tetraplegia used a robotic arm to complete object transport tasks with and without shared control. The shared control system was designed to provide a balance between BMI-derived intention and computer assistance. An autonomous robotic grasping system identified and tracked objects and defined stable grasp positions for these objects. The system identified when the user intended to interact with an object based on the BMI-controlled movements of the robotic arm. Using shared control, BMI-controlled movements and autonomous grasping commands were blended to ensure secure grasps. Both subjects were more successful on object transfer tasks when using shared control compared to BMI control alone. Movements made using shared control were more accurate, more efficient, and less difficult. One participant attempted a task with multiple objects and successfully lifted one of two closely spaced objects in 92% of trials, demonstrating the potential for users to accurately execute their intention while using shared control. Integration of BMI control with vision-guided robotic assistance led to improved performance on object transfer tasks. Providing assistance while maintaining generalizability will make BMI systems more attractive to potential users. NCT01364480 and NCT01894802.
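    The blending of BMI-derived and autonomous commands described above can be illustrated with a proximity-weighted mix of two velocity commands. The linear ramp and the `assist_radius` parameter are assumptions for this sketch, not the study's actual arbitration rule:

```python
import numpy as np

def blend_commands(bmi_vel, auto_vel, dist_to_object, assist_radius=0.2):
    """Blend a BMI-decoded velocity with an autonomous grasp-approach velocity.
    The assistance weight ramps from 0 (far from the object) to 1 (at the
    object); the linear ramp is an illustrative assumption."""
    w = np.clip(1.0 - dist_to_object / assist_radius, 0.0, 1.0)
    return (1.0 - w) * np.asarray(bmi_vel) + w * np.asarray(auto_vel)

# Far from the object: the command is purely BMI-driven.
print(blend_commands([1.0, 0.0, 0.0], [0.0, 0.0, -1.0], dist_to_object=0.5))
# At the object: the autonomous grasp command takes over to secure the grasp.
print(blend_commands([1.0, 0.0, 0.0], [0.0, 0.0, -1.0], dist_to_object=0.0))
```

    A smooth hand-off of this kind preserves the user's sense of agency away from objects while still guaranteeing stable grasp geometry at contact.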

  20. Sensorpedia: Information Sharing Across Autonomous Sensor Systems

    SciTech Connect

    Gorman, Bryan L; Resseguie, David R; Tomkins-Tinch, Christopher H

    2009-01-01

    The concept of adapting social media technologies is introduced as a means of achieving information sharing across autonomous sensor systems. Historical examples of interoperability as an underlying principle in loosely-coupled systems are compared and contrasted with corresponding tightly-coupled, integrated systems. Examples of ad hoc information sharing solutions based on Web 2.0 social networks, mashups, blogs, wikis, and data tags are presented and discussed. The underlying technologies of these solutions are isolated and defined, and Sensorpedia is presented as a formalized application for implementing sensor information sharing across large-scale enterprises with incompatible autonomous sensor systems.

  1. CYCLOPS: A mobile robotic platform for testing and validating image processing and autonomous navigation algorithms in support of artificial vision prostheses.

    PubMed

    Fink, Wolfgang; Tarbell, Mark A

    2009-12-01

    While artificial vision prostheses are quickly becoming a reality, actual testing time with visual prosthesis carriers is at a premium. Moreover, it is helpful to have a more realistic functional approximation of a blind subject. Instead of a normal subject with a healthy retina looking at a low-resolution (pixelated) image on a computer monitor or head-mounted display, a more realistic approximation is achieved by employing a subject-independent mobile robotic platform that uses a pixelated view as its sole visual input for navigation purposes. We introduce CYCLOPS: an AWD, remotely controllable mobile robotic platform that serves as a testbed for real-time image processing and autonomous navigation systems for the purpose of enhancing the visual experience afforded by visual prosthesis carriers. Complete with wireless Internet connectivity and a fully articulated digital camera with wireless video link, CYCLOPS supports both interactive tele-commanding via joystick and autonomous self-commanding. Due to its onboard computing capabilities and extended battery life, CYCLOPS can perform complex and numerically intensive calculations, such as image processing and autonomous navigation algorithms, in addition to interfacing to additional sensors. Its Internet connectivity renders CYCLOPS a worldwide accessible testbed for researchers in the field of artificial vision systems. CYCLOPS enables subject-independent evaluation and validation of image processing and autonomous navigation systems with respect to the utility and efficiency of supporting and enhancing visual prostheses, while potentially reducing to a necessary minimum the need for valuable testing time with actual visual prosthesis carriers.

  2. Computer graphics testbed to simulate and test vision systems for space applications

    NASA Technical Reports Server (NTRS)

    Cheatham, John B.

    1991-01-01

    Research activity has shifted from computer graphics and vision systems to the broader scope of applying concepts of artificial intelligence to robotics. Specifically, the research is directed toward developing Artificial Neural Networks, Expert Systems, and Laser Imaging Techniques for Autonomous Space Robots.

  4. Improving Car Navigation with a Vision-Based System

    NASA Astrophysics Data System (ADS)

    Kim, H.; Choi, K.; Lee, I.

    2015-08-01

    The real-time acquisition of accurate positions is essential for the proper operation of driver assistance systems and autonomous vehicles. Since current systems mostly depend on GPS and map-matching techniques, they perform poorly and unreliably where GPS signals are blocked or weak. In this study, we propose a vision-oriented car navigation method based on sensor fusion of GPS and in-vehicle sensors. We employ a single photo resection process to derive the position and attitude of the camera, and thus those of the car. These image georeferencing results are combined with the other sensory data in a sensor fusion framework for more accurate position estimation using an extended Kalman filter. The proposed system estimated positions with an accuracy of 15 m even though GPS signals were unavailable during the entire 15-minute test drive. The proposed vision-based system can be effectively utilized in the low-cost yet accurate and reliable navigation systems required for intelligent or autonomous vehicles.
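    The fusion step described above can be illustrated with a single scalar Kalman measurement update; the full system uses an extended Kalman filter over position and attitude, and all numbers below are synthetic:

```python
def kalman_update(x, P, z, R):
    """One scalar Kalman measurement update: prior state x with variance P is
    corrected by measurement z (e.g. a vision-derived position fix from single
    photo resection) with measurement variance R."""
    K = P / (P + R)                      # Kalman gain
    return x + K * (z - x), (1 - K) * P  # posterior state and variance

# Dead-reckoned position from in-vehicle sensors, fused with a vision fix.
x, P = 100.0, 25.0   # predicted along-track position (m) and its variance
z, R = 104.0, 25.0   # vision-derived position fix and its variance
x, P = kalman_update(x, P, z, R)
print(x, P)          # 102.0 12.5: estimate moves toward the fix, variance shrinks
```

    With equal variances the update splits the difference; when GPS drops out entirely, the vision fix is what keeps the variance from growing without bound.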

  6. COHERENT LASER VISION SYSTEM (CLVS) OPTION PHASE

    SciTech Connect

    Robert Clark

    1999-11-18

    The purpose of this research project was to develop a prototype fiber-optic based Coherent Laser Vision System (CLVS) suitable for DOE's EM Robotic program. The system provides three-dimensional (3D) vision for monitoring situations in which it is necessary to update the dimensional spatial data on the order of once per second. The system has total immunity to ambient lighting conditions.

  7. Autonomous Robotic Refueling System (ARRS) for rapid aircraft turnaround

    NASA Astrophysics Data System (ADS)

    Williams, O. R.; Jackson, E.; Rueb, K.; Thompson, B.; Powell, K.

    An autonomous robotic refuelling system is being developed to achieve rapid aircraft turnaround, notably during combat operations. The proposed system includes a gantry positioner with sufficient reach to position a robotic arm that performs the refuelling tasks; a six degree of freedom manipulator equipped with a remote center of compliance, torque sensor, and a gripper that can handle standard tools; a computer vision system to locate and guide the refuelling nozzle, inspect the nozzle, and avoid collisions; and an operator interface with video and graphics display. The control system software will include components designed for trajectory planning and generation, collision detection, sensor interfacing, sensory processing, and human interfacing. The robotic system will be designed so that upgrading to perform additional tasks will be relatively straightforward.

  8. Robot soccer anywhere: achieving persistent autonomous navigation, mapping, and object vision tracking in dynamic environments

    NASA Astrophysics Data System (ADS)

    Dragone, Mauro; O'Donoghue, Ruadhan; Leonard, John J.; O'Hare, Gregory; Duffy, Brian; Patrikalakis, Andrew; Leederkerken, Jacques

    2005-06-01

    The paper describes an ongoing effort to enable autonomous mobile robots to play soccer in unstructured, everyday environments. Unlike conventional robot soccer competitions that are usually held on purpose-built robot soccer "fields", in our work we seek to develop the capability for robots to demonstrate aspects of soccer-playing in more diverse environments, such as schools, hospitals, or shopping malls, with static obstacles (furniture) and dynamic natural obstacles (people). This problem of "Soccer Anywhere" presents numerous research challenges including: (1) Simultaneous Localization and Mapping (SLAM) in dynamic, unstructured environments, (2) software control architectures for decentralized, distributed control of mobile agents, (3) integration of vision-based object tracking with dynamic control, and (4) social interaction with human participants. In addition to the intrinsic research merit of these topics, we believe that this capability would prove useful for outreach activities, in demonstrating robotics technology to primary and secondary school students, to motivate them to pursue careers in science and engineering.

  9. Environmental Recognition and Guidance Control for Autonomous Vehicles using Dual Vision Sensor and Applications

    NASA Astrophysics Data System (ADS)

    Moriwaki, Katsumi; Koike, Issei; Sano, Tsuyoshi; Fukunaga, Tetsuya; Tanaka, Katsuyuki

    We propose a new method of environmental recognition around an autonomous vehicle using a dual vision sensor, together with navigation control based on binocular images. As an application of these techniques, we aim to develop a guide robot that can play the role of a guide dog as an aid to people such as the visually impaired or the aged. This paper presents a recognition algorithm which finds the line of a series of Braille blocks, and the boundary line between a sidewalk and a roadway where a difference in level exists, from binocular images obtained from a pair of parallel-arrayed CCD cameras. It also presents a tracking algorithm with which the guide robot traces along a series of Braille blocks and avoids obstacles and unsafe areas in the path of a person accompanied by the guide robot.

  10. Shared vision and autonomous motivation vs. financial incentives driving success in corporate acquisitions

    PubMed Central

    Clayton, Byron C.

    2015-01-01

    Successful corporate acquisitions require their managers to achieve substantial performance improvements in order to sufficiently cover acquisition premiums, the expected return of debt and equity investors, and the additional resources needed to capture synergies and accelerate growth. Acquirers understand that achieving the performance improvements necessary to cover these costs and create value for investors will most likely require a significant effort from mergers and acquisitions (M&A) management teams. This understanding drives the common and longstanding practice of offering hefty performance incentive packages to key managers, assuming that financial incentives will induce in-role and extra-role behaviors that drive organizational change and growth. The present study debunks the assumptions of this common M&A practice, providing quantitative evidence that shared vision and autonomous motivation are far more effective drivers of managerial performance than financial incentives. PMID:25610406

  11. Shared vision and autonomous motivation vs. financial incentives driving success in corporate acquisitions.

    PubMed

    Clayton, Byron C

    2014-01-01

    Successful corporate acquisitions require their managers to achieve substantial performance improvements in order to sufficiently cover acquisition premiums, the expected return of debt and equity investors, and the additional resources needed to capture synergies and accelerate growth. Acquirers understand that achieving the performance improvements necessary to cover these costs and create value for investors will most likely require a significant effort from mergers and acquisitions (M&A) management teams. This understanding drives the common and longstanding practice of offering hefty performance incentive packages to key managers, assuming that financial incentives will induce in-role and extra-role behaviors that drive organizational change and growth. The present study debunks the assumptions of this common M&A practice, providing quantitative evidence that shared vision and autonomous motivation are far more effective drivers of managerial performance than financial incentives.

  12. Autonomous Organization-Based Adaptive Information Systems

    DTIC Science & Technology

    2005-01-01

    intentional Multi-agent System (MAS) approach [10]. While these approaches are functional AIS systems, they lack the ability to reorganize and adapt...extended a multi-agent system with a self-reorganizing architecture to create an autonomous, adaptive information system. Design Our organization-based...goals. An advantage of a multi-agent system using the organization theoretic model is its extensibility. The practical, numerical limits to the

  13. Stereo vision and laser odometry for autonomous helicopters in GPS-denied indoor environments

    NASA Astrophysics Data System (ADS)

    Achtelik, Markus; Bachrach, Abraham; He, Ruijie; Prentice, Samuel; Roy, Nicholas

    2009-05-01

    This paper presents our solution for enabling a quadrotor helicopter to autonomously navigate unstructured and unknown indoor environments. We compare two sensor suites, specifically a laser rangefinder and a stereo camera. Laser and camera sensors are both well-suited for recovering the helicopter's relative motion and velocity. Because they use different cues from the environment, each sensor has its own set of advantages and limitations that are complementary to the other sensor. Our eventual goal is to integrate both sensors on-board a single helicopter platform, leading to the development of an autonomous helicopter system that is robust to generic indoor environmental conditions. In this paper, we present results in this direction, describing the key components for autonomous navigation using either of the two sensors separately.

  14. Far and proximity maneuvers of a constellation of service satellites and autonomous pose estimation of customer satellite using machine vision

    NASA Astrophysics Data System (ADS)

    Arantes, Gilberto, Jr.; Marconi Rocco, Evandro; da Fonseca, Ijar M.; Theil, Stephan

    2010-05-01

    Space robotics has a substantial interest in achieving on-orbit satellite servicing operations autonomously, e.g. rendezvous and docking/berthing (RVD) with customer and malfunctioning satellites. An on-orbit servicing vehicle requires the ability to estimate position and attitude in situations where the target is uncooperative. Such a situation arises when the target is damaged. In this context, this work presents a robust autonomous pose estimation system applied to RVD missions. Our approach is based on computer vision, using a single camera and some prior knowledge of the target, i.e. the customer spacecraft. A rendezvous mission analysis tool for an autonomous service satellite has been developed and is presented, covering far maneuvers, e.g. distances above 1 km from the target, and close maneuvers. The far operations consist of orbit transfer using the Lambert formulation. The close operations include the inspection phase (during which the pose estimation is computed) and the final approach phase. Our approach is based on the Lambert problem for far maneuvers, and the Hill equations are used to simulate and analyze the approach and final trajectory between target and chaser during the last phase of the rendezvous operation. A method for optimally estimating the relative orientation and position between the camera system and the target is presented in detail. The target is modelled as an assembly of points. The pose of the target is represented by a dual quaternion in order to develop a simple quadratic error function, such that the pose estimation task becomes a least squares minimization problem. The pose problem is solved, and several non-linear least squares optimization methods (Newton, Gauss-Newton, and Levenberg-Marquardt) are compared and discussed in terms of accuracy and computational cost.
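    The least-squares pose problem described above can be sketched on synthetic points. The paper parameterizes the pose with dual quaternions and solves it iteratively; this sketch instead uses the closed-form SVD (Kabsch) solution to the same rigid-alignment objective, with hypothetical point data:

```python
import numpy as np

def estimate_pose(model_pts, observed_pts):
    """Least-squares rigid pose (R, t) aligning model points to observations,
    via the closed-form Kabsch/SVD solution of the quadratic error function."""
    mu_m, mu_o = model_pts.mean(axis=0), observed_pts.mean(axis=0)
    H = (model_pts - mu_m).T @ (observed_pts - mu_o)   # cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))             # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = mu_o - R @ mu_m
    return R, t

# Synthetic target: apply a known rotation about z plus a translation, then recover it.
model = np.array([[1.0, 0, 0], [0, 1, 0], [0, 0, 1], [1, 1, 1]])
theta = np.pi / 6
R_true = np.array([[np.cos(theta), -np.sin(theta), 0],
                   [np.sin(theta),  np.cos(theta), 0],
                   [0.0, 0.0, 1.0]])
t_true = np.array([0.5, -0.2, 1.0])
R, t = estimate_pose(model, model @ R_true.T + t_true)
print(np.allclose(R, R_true), np.allclose(t, t_true))   # True True
```

    With noisy image measurements the closed form is replaced by the iterative solvers the paper compares, but the objective being minimized is the same.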

  15. Autonomous Systems, Robotics, and Computing Systems Capability Roadmap: NRC Dialogue

    NASA Technical Reports Server (NTRS)

    Zornetzer, Steve; Gage, Douglas

    2005-01-01

    Contents include the following: Introduction. Process, Mission Drivers, Deliverables, and Interfaces. Autonomy. Crew-Centered and Remote Operations. Integrated Systems Health Management. Autonomous Vehicle Control. Autonomous Process Control. Robotics. Robotics for Solar System Exploration. Robotics for Lunar and Planetary Habitation. Robotics for In-Space Operations. Computing Systems. Conclusion.

  16. Vision based control of unmanned aerial vehicles with applications to an autonomous four-rotor helicopter, quadrotor

    NASA Astrophysics Data System (ADS)

    Altug, Erdinc

    Our work proposes a vision-based stabilization and output tracking control method for a model helicopter. This is part of our effort to produce a rotorcraft-based autonomous Unmanned Aerial Vehicle (UAV). Due to the desired maneuvering ability, a four-rotor helicopter has been chosen as the testbed. In previous research on flying vehicles, vision has usually been used as a secondary sensor. Unlike that work, our goal is to use visual feedback as the main sensor, responsible not only for detecting where ground objects are but also for helicopter localization. A novel two-camera method has been introduced for estimating the full six degrees of freedom (DOF) pose of the helicopter. This two-camera system consists of a pan-tilt ground camera and an onboard camera. The pose estimation algorithm is compared through simulation to other methods, such as the four-point and stereo methods, and is shown to be less sensitive to feature detection errors. Helicopters are highly unstable flying vehicles; although this is good for agility, it makes control harder. To build an autonomous helicopter, two methods of control are studied: one using a series of mode-based, feedback linearizing controllers and the other using a back-stepping control law. Various simulations with 2D and 3D models demonstrate the implementation of these controllers. We also show global convergence of the 3D quadrotor controller even with large calibration errors or the presence of large errors on the image plane. Finally, we present initial flight experiments where the proposed pose estimation algorithm and non-linear control techniques have been implemented on a remote-controlled helicopter. The helicopter was restricted by a tether to vertical and yaw motions and limited x and y translations.

  17. Artificial vision support system (AVS(2)) for improved prosthetic vision.

    PubMed

    Fink, Wolfgang; Tarbell, Mark A

    2014-11-01

    State-of-the-art and upcoming camera-driven, implanted artificial vision systems provide only tens to hundreds of electrodes, affording only limited visual perception for blind subjects. Therefore, real time image processing is crucial to enhance and optimize this limited perception. Since tens or hundreds of pixels/electrodes allow only for a very crude approximation of the typically megapixel optical resolution of the external camera image feed, the preservation and enhancement of contrast differences and transitions, such as edges, are especially important compared to picture details such as object texture. An Artificial Vision Support System (AVS(2)) is devised that displays the captured video stream in a pixelation conforming to the dimension of the epi-retinal implant electrode array. AVS(2), using efficient image processing modules, modifies the captured video stream in real time, enhancing 'present but hidden' objects to overcome inadequacies or extremes in the camera imagery. As a result, visual prosthesis carriers may now be able to discern such objects in their 'field-of-view', thus enabling mobility in environments that would otherwise be too hazardous to navigate. The image processing modules can be engaged repeatedly in a user-defined order, which is a unique capability. AVS(2) is directly applicable to any artificial vision system that is based on an imaging modality (video, infrared, sound, ultrasound, microwave, radar, etc.) as the first step in the stimulation/processing cascade, such as: retinal implants (i.e. epi-retinal, sub-retinal, suprachoroidal), optic nerve implants, cortical implants, electric tongue stimulators, or tactile stimulators.
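    The pixelation step described above, mapping a megapixel camera frame down to the electrode-array dimension, can be sketched with simple block averaging. The grid size and the plain mean are illustrative assumptions; AVS(2) chains configurable, user-ordered processing modules such as edge enhancement before this step:

```python
import numpy as np

def pixelate(frame, grid=(10, 6)):
    """Block-average a grayscale camera frame down to an electrode-array
    resolution of grid = (rows, cols). Trailing rows/cols that do not fill
    a whole block are cropped."""
    h, w = frame.shape
    gh, gw = grid
    cropped = frame[:h - h % gh, :w - w % gw]
    return cropped.reshape(gh, h // gh, gw, w // gw).mean(axis=(1, 3))

# Synthetic 100x60 horizontal gradient stands in for a camera frame.
frame = np.tile(np.linspace(0.0, 1.0, 60), (100, 1))
out = pixelate(frame, grid=(10, 6))
print(out.shape)   # (10, 6): one value per electrode
```

    Because hundreds of electrodes cannot carry texture, the modules that run before this reduction focus on preserving edges and contrast transitions that survive the averaging.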

  18. An Expert System for Autonomous Spacecraft Control

    NASA Technical Reports Server (NTRS)

    Sherwood, Rob; Chien, Steve; Tran, Daniel; Cichy, Benjamin; Castano, Rebecca; Davies, Ashley; Rabideau, Gregg

    2005-01-01

    The Autonomous Sciencecraft Experiment (ASE), part of the New Millennium Space Technology 6 Project, is flying onboard the Earth Orbiter 1 (EO-1) mission. The ASE software enables EO-1 to autonomously detect and respond to science events such as volcanic activity, flooding, and water freeze/thaw. ASE uses classification algorithms to analyze imagery onboard to detect change and science events. Detection of these events is then used to trigger follow-up imagery. Onboard mission planning software then develops a response plan that accounts for target visibility and operations constraints. This plan is then executed using a task execution system that can deal with run-time anomalies. In this paper we describe the autonomy flight software and how it enables a new paradigm of autonomous science and mission operations. We also describe the current experiment status and future plans.

  19. Evolution of autonomous and semi-autonomous robotic surgical systems: a review of the literature.

    PubMed

    Moustris, G P; Hiridis, S C; Deliparaschos, K M; Konstantinidis, K M

    2011-12-01

    Autonomous control of surgical robotic platforms may offer enhancements such as higher precision, intelligent manoeuvres, tissue-damage avoidance, etc. Autonomous robotic systems in surgery are largely at the experimental level. However, they have also reached clinical application. A literature review pertaining to commercial medical systems which incorporate autonomous and semi-autonomous features, as well as experimental work involving automation of various surgical procedures, is presented. Results are drawn from major databases, excluding papers not experimentally implemented on real robots. Our search yielded several experimental and clinical applications, describing progress in autonomous surgical manoeuvres, ultrasound guidance, optical coherence tomography guidance, cochlear implantation, motion compensation, orthopaedic, neurological and radiosurgery robots. Autonomous and semi-autonomous systems are beginning to emerge in various interventions, automating important steps of the operation. These systems are expected to become standard modality and revolutionize the face of surgery. Copyright © 2011 John Wiley & Sons, Ltd.

  20. Autonomous proximity operations using machine vision for trajectory control and pose estimation

    NASA Technical Reports Server (NTRS)

    Cleghorn, Timothy F.; Sternberg, Stanley R.

    1991-01-01

    A machine vision algorithm was developed which permits guidance control to be maintained during autonomous proximity operations. At present this algorithm exists as a simulation, running on an 80386-based personal computer and using a ModelMATE CAD package to render the target vehicle. However, the algorithm is sufficiently simple that, following off-line training on a known target vehicle, it should run in real time with existing vision hardware. The basis of the algorithm is a sequence of single-camera images of the target vehicle, upon which radial transforms are performed. Selected points of the resulting radial signatures are fed through a decision tree to determine whether the signature matches the known reference signatures for a particular view of the target. Based upon recognized scenes, the position of the maneuvering vehicle with respect to the target vehicle can be calculated, and adjustments made in the former's trajectory. In addition, the pose and spin rates of the target satellite can be estimated using this method.
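    A radial signature of the kind described above can be computed from a binary silhouette as follows. The angular bin count and the max-distance-per-bin definition are illustrative choices for the sketch, not the original off-line-trained implementation:

```python
import numpy as np

def radial_signature(mask, n_angles=36):
    """Radial transform of a binary silhouette: for each angular bin around
    the centroid, the distance to the farthest object pixel. Signatures like
    this can feed a decision tree matching known views of a target vehicle."""
    ys, xs = np.nonzero(mask)
    cy, cx = ys.mean(), xs.mean()
    r = np.hypot(ys - cy, xs - cx)
    ang = np.arctan2(ys - cy, xs - cx)
    bins = ((ang + np.pi) / (2 * np.pi) * n_angles).astype(int) % n_angles
    sig = np.zeros(n_angles)
    np.maximum.at(sig, bins, r)   # per-bin maximum radius
    return sig

# A filled disc yields a near-constant signature; view matching then reduces
# to comparing distances between signatures.
yy, xx = np.mgrid[:64, :64]
disc = (yy - 32) ** 2 + (xx - 32) ** 2 <= 20 ** 2
sig = radial_signature(disc)
print(sig.shape, sig.std() < 1.0)
```

    The signature is rotation of the viewpoint around the camera axis only up to a circular shift, which is one reason the original scheme trains per-view references.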

  1. Autonomous microfluidic system for phosphate detection.

    PubMed

    McGraw, Christina M; Stitzel, Shannon E; Cleary, John; Slater, Conor; Diamond, Dermot

    2007-02-28

    Miniaturization of analytical devices through the advent of microfluidics and micro total analysis systems is an important step forward for applications such as medical diagnostics and environmental monitoring. The development of field-deployable instruments requires that the entire system, including all necessary peripheral components, be miniaturized and packaged in a portable device. A sensor for long-term monitoring of phosphate levels has been developed that incorporates sampling, reagent and waste storage, detection, and wireless communication into a complete, miniaturized system. The device employs a low-power detection and communication system, so the entire instrument can operate autonomously for 7 days on a single rechargeable 12 V battery. In addition, integration of a wireless communication device allows the instrument to be controlled and results to be downloaded remotely. This autonomous system has a limit of detection of 0.3 mg/L and a linear dynamic range between 0 and 20 mg/L.
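    Reading phosphate concentrations off a detector over the reported 0-20 mg/L linear range reduces to a linear calibration and its inverse; the standards and response values below are synthetic, not data from the paper:

```python
import numpy as np

def fit_calibration(conc, signal):
    """Least-squares straight-line calibration over the sensor's linear range.
    Returns (slope, intercept) of signal = slope * concentration + intercept."""
    slope, intercept = np.polyfit(conc, signal, 1)
    return slope, intercept

conc = np.array([0.0, 5.0, 10.0, 20.0])    # mg/L calibration standards
signal = 0.04 * conc + 0.02                # synthetic detector response
slope, intercept = fit_calibration(conc, signal)

# Invert the fit to read an unknown sample's concentration.
unknown = (0.42 - intercept) / slope
print(round(unknown, 2))   # 10.0 mg/L
```

    In an autonomous deployment the calibration would be re-run periodically against onboard standards, since reagent aging shifts the slope over a 7-day mission.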

  2. Exercise and the autonomic nervous system.

    PubMed

    Fu, Qi; Levine, Benjamin D

    2013-01-01

    The autonomic nervous system plays a crucial role in the cardiovascular response to acute (dynamic) exercise in animals and humans. During exercise, oxygen uptake is a function of the triple-product of heart rate and stroke volume (i.e., cardiac output) and arterial-mixed venous oxygen difference (the Fick principle). The degree to which each of these variables can increase determines maximal oxygen uptake (V̇O2max). Both "central command" and "the exercise pressor reflex" are important in determining the cardiovascular response and the resetting of the arterial baroreflex during exercise to precisely match systemic oxygen delivery with metabolic demand. In general, patients with autonomic disorders have low levels of V̇O2max, indicating reduced physical fitness and exercise capacity. Moreover, the vast majority of these patients have a blunted or abnormal cardiovascular response to exercise, especially during maximal exercise. There is now convincing evidence that some of the protective and therapeutic effects of chronic exercise training are related to its impact on the autonomic nervous system. Additionally, training-induced improvements in vascular function, blood volume expansion, cardiac remodeling, insulin resistance, and renal-adrenal function may also contribute to the prevention and treatment of cardiovascular, metabolic, and autonomic disorders. Exercise training also improves mental health, helps to prevent depression, and promotes or maintains positive self-esteem. Moderate-intensity exercise for at least 30 minutes per day, at least 5 days per week, is recommended for the vast majority of people. Supervised exercise training is preferable to maximize functional capacity, and may be particularly important for patients with autonomic disorders.
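    The Fick relation quoted above is simple arithmetic; the values below are illustrative textbook-style maximal-exercise numbers, not data from the paper:

```python
def fick_vo2(heart_rate, stroke_volume_l, avo2_diff_ml_per_l):
    """Fick principle: oxygen uptake = cardiac output x arterial-mixed venous
    O2 difference. Units: beats/min, L/beat, mL O2 per L of blood."""
    cardiac_output = heart_rate * stroke_volume_l      # L/min
    return cardiac_output * avo2_diff_ml_per_l         # mL O2/min

# Illustrative maximal-exercise values for a healthy adult:
vo2 = fick_vo2(heart_rate=190, stroke_volume_l=0.12, avo2_diff_ml_per_l=160)
print(round(vo2, 1))   # 3648.0 mL O2/min, i.e. about 3.6 L/min
```

    Each of the three factors has its own ceiling, which is why a blunted heart-rate or stroke-volume response in autonomic disorders caps V̇O2max even when oxygen extraction is normal.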

  3. Autonomous System Technologies for Resilient Airspace Operations

    NASA Technical Reports Server (NTRS)

    Houston, Vincent E.; Le Vie, Lisa R.

    2017-01-01

    Increasing autonomy within the aircraft cockpit begins with an effort to understand what autonomy is and to develop the technology that encompasses it. Autonomy allows an agent, human or machine, to act independently within a circumscribed set of goals, delegating responsibility to the agent(s) to achieve the overall system objective(s). Increasingly Autonomous Systems (IAS) are the highly sophisticated progression of current automated systems toward full autonomy. Working in concert with humans, these technologies are expected to improve the safety, reliability, cost, and operational efficiency of aviation. IAS implementation is imminent, which makes the development and proper performance of such technologies vital to cockpit operational efficiency and the management of air traffic and data communication information. A prototype IAS agent has been developed that attempts to optimize the identification and distribution of "relevant" air traffic data for use by human crews during complex airspace operations.

  4. Health System Vision of Iran in 2025

    PubMed Central

    Rostamigooran, N; Esmailzadeh, H; Rajabi, F; Majdzadeh, R; Larijani, B; Dastgerdi, M Vahid

    2013-01-01

    Background: Vast changes in disease features and risk factors, and the influence of demographic, economic, and social trends on the health system, make formulating a long-term evolutionary plan unavoidable. In this regard, determining the health system vision over a long-term horizon is a primary stage. Method: After a narrative and purposeful review of documents, the major themes of the vision statement were determined and its content was organized in a work group consisting of selected managers and experts of the health system. The final content of the statement was prepared after several sessions of group discussion and after receiving the ideas of policy makers and experts of the health system. Results: The vision statement in the evolutionary plan of the health system is considered to be: "a progressive community in the course of human prosperity which has attained a developed level of health standards in the light of the most efficient and equitable health system in the visionary region, with regard to health in all policies, accountability and innovation". An explanatory context was also compiled to create a complete image of the vision. Conclusion: Social values, leaders' strategic goals, and main orientations are generally mentioned in a vision statement. In this statement, prosperity and justice are considered the major values and ideals in the society of Iran; development and excellence in the region are the leaders' strategic goals; and efficiency and equity, health in all policies, and accountability and innovation are the main orientations of the health system. PMID:23865011

  5. Autonomous Flight Safety System - Phase III

    NASA Technical Reports Server (NTRS)

    2008-01-01

    The Autonomous Flight Safety System (AFSS) is a joint KSC and Wallops Flight Facility project that uses tracking and attitude data from onboard Global Positioning System (GPS) and inertial measurement unit (IMU) sensors and configurable rule-based algorithms to make flight termination decisions. AFSS objectives are to increase launch capabilities by permitting launches from locations without range safety infrastructure, reduce costs by eliminating some downrange tracking and communication assets, and reduce the reaction time for flight termination decisions.
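The configurable rule-based decision logic described above can be sketched very loosely as follows. This is a minimal illustrative sketch: the corridor limits, thresholds, rule wording, and field names are all assumptions for illustration, not the actual AFSS rule set.

```python
# Hypothetical rule-based flight-termination check in the spirit of AFSS.
# Corridor limits, thresholds, and field names are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class TrackingState:
    lat: float        # latitude, degrees
    lon: float        # longitude, degrees
    alt_m: float      # altitude, meters
    vdown_mps: float  # downward velocity, m/s (positive = descending)

def violates_rules(state, corridor):
    """Return the list of configured safety rules the current state violates."""
    violations = []
    if not (corridor["lat_min"] <= state.lat <= corridor["lat_max"] and
            corridor["lon_min"] <= state.lon <= corridor["lon_max"]):
        violations.append("outside approved corridor")
    if state.alt_m < corridor["floor_m"] and state.vdown_mps > 0:
        violations.append("descending below altitude floor")
    return violations

corridor = {"lat_min": 28.0, "lat_max": 29.0,
            "lon_min": -81.0, "lon_max": -80.0, "floor_m": 500.0}
nominal = TrackingState(28.5, -80.6, 12000.0, -50.0)
errant = TrackingState(29.4, -80.6, 300.0, 120.0)
print(violates_rules(nominal, corridor))  # []
print(violates_rules(errant, corridor))
```

In a real system the rule evaluation would run continuously against redundant GPS/IMU tracking data; here a nonempty violation list simply stands in for a termination decision.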

  6. Autonomous millimeter-wave radar guidance systems

    NASA Astrophysics Data System (ADS)

    Schweiker, Kevin S.

    1992-07-01

    Hercules Defense Electronics Systems, Incorporated has applied millimeter wave technologies to a variety of guidance and control problems. This presentation documents the development and integration of an autonomous millimeter wave seeker to the AGM-65(D) (Maverick) air-to-ground missile. The resulting system was successfully demonstrated to search a large area for potential targets, prioritize detections, and guide the missile to the target during recent free-flight tests.

  7. Simulation of Aircraft Sortie Generation Under an Autonomic Logistics System

    DTIC Science & Technology

    2016-12-01

    AFIT-ENS-MS-16-D-052: "Simulation of Aircraft Sortie Generation Under an Autonomic Logistics System," a master's thesis by Gunduz ...; distribution unlimited.

  8. Development of a Commercially Viable, Modular Autonomous Robotic Systems for Converting any Vehicle to Autonomous Control

    NASA Technical Reports Server (NTRS)

    Parish, David W.; Grabbe, Robert D.; Marzwell, Neville I.

    1994-01-01

    A Modular Autonomous Robotic System (MARS), consisting of a modular autonomous vehicle control system that can be retrofitted onto any vehicle to convert it to autonomous control and support a modular payload for multiple applications, is being developed. The MARS design is scalable, reconfigurable, and cost effective due to the use of modern open system architecture design methodologies, including serial control bus technology to simplify system wiring and enhance scalability. The design is augmented with modular, object-oriented (C++) software implementing a hierarchy of five levels of control: teleoperated, continuous guidepath following, periodic guidepath following, absolute position autonomous navigation, and relative position autonomous navigation. The present effort is focused on producing a system that is commercially viable for routine autonomous patrolling of known, semistructured environments, such as environmental monitoring of chemical and petroleum refineries, exterior physical security and surveillance, perimeter patrolling, and intrafacility transport applications.

  10. Information capacity of electronic vision systems

    NASA Astrophysics Data System (ADS)

    Taubkin, Igor I.; Trishenkov, Mikhail A.

    1996-10-01

    Various electronic-optical vision systems have been compared on the basis of their ultimate information capacity, C, limited by fluctuations of the flux of quanta. The information capacity of daylight, night, and thermal vision systems is determined first of all by the number of picture elements, M, in the optical system. Each element, under a sufficient level of irradiation, can transfer about one byte of information in the standard frame time, so C ≈ M bytes per frame. The proportionality factor of one byte per picture element refers to systems of daylight and thermal vision, in which the photocharge in a unit cell of the imager is limited by storage capacity; in general it varies within a small interval, from 0.5 byte per picture element for night vision systems to 2 bytes per picture element for ideal thermal imagers. The ultimate specific information capacity, C*, of electronic vision systems under low irradiation levels rises with increasing density of optical channels until the number of distinguishable irradiance gradations falls below two in each channel. In this case, the maximum value of C* turns out to be proportional to the flux of quanta coming from the object under observation. Under a high level of irradiation, C* is limited by diffraction effects and amounts to 1/λ² bytes/(cm²·frame).
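The two headline estimates, C ≈ M bytes per frame and a diffraction-limited specific capacity of roughly 1/λ² bytes per cm² per frame, can be illustrated numerically. The pixel count and wavelength below are assumed example values, not figures from the paper.

```python
# Numerical illustration of the capacity estimates quoted above; the pixel
# count and wavelength are assumed example values.
M = 1024 * 768                # number of picture elements in the optical system
C_frame = M                   # about one byte per element -> C ~ M bytes/frame
print(f"C  ~ {C_frame} bytes per frame")

lam_cm = 0.5e-4               # 0.5 micrometer wavelength, expressed in cm
C_specific = 1.0 / lam_cm**2  # diffraction-limited capacity, bytes/(cm^2*frame)
print(f"C* ~ {C_specific:.1e} bytes per cm^2 per frame")
```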

  11. Autonomous microexplosives subsurface tracing system final report.

    SciTech Connect

    Engler, Bruce Phillip; Nogan, John; Melof, Brian Matthew; Uhl, James Eugene; Dulleck, George R., Jr.; Ingram, Brian V.; Grubelich, Mark Charles; Rivas, Raul R.; Cooper, Paul W.; Warpinski, Norman Raymond; Kravitz, Stanley H.

    2004-04-01

    The objective of the autonomous micro-explosive subsurface tracing system is to image the location and geometry of hydraulically induced fractures in subsurface petroleum reservoirs. This system is based on the insertion of a swarm of autonomous micro-explosive packages during the fracturing process, with subsequent triggering of the energetic material to create an array of micro-seismic sources that can be detected and analyzed using existing seismic receiver arrays and analysis software. The project included investigations of energetic mixtures, triggering systems, package size and shape, and seismic output. Given the current absence of any technology capable of such high resolution mapping of subsurface structures, this technology has the potential for major impact on the petroleum industry, which spends approximately $1 billion per year on hydraulic fracturing operations in the United States alone.

  12. Flight Testing an Integrated Synthetic Vision System

    NASA Technical Reports Server (NTRS)

    Kramer, Lynda J.; Arthur, Jarvis J., III; Bailey, Randall E.; Prinzel, Lawrence J., III

    2005-01-01

    NASA's Synthetic Vision Systems (SVS) project is developing technologies with practical applications to eliminate low visibility conditions as a causal factor to civil aircraft accidents while replicating the operational benefits of clear day flight operations, regardless of the actual outside visibility condition. A major thrust of the SVS project involves the development/demonstration of affordable, certifiable display configurations that provide intuitive out-the-window terrain and obstacle information with advanced pathway guidance for transport aircraft. The SVS concept being developed at NASA encompasses the integration of tactical and strategic Synthetic Vision Display Concepts (SVDC) with Runway Incursion Prevention System (RIPS) alerting and display concepts, real-time terrain database integrity monitoring equipment (DIME), and Enhanced Vision Systems (EVS) and/or improved Weather Radar for real-time object detection and database integrity monitoring. A flight test evaluation was jointly conducted (in July and August 2004) by NASA Langley Research Center and an industry partner team under NASA's Aviation Safety and Security, Synthetic Vision System project. A Gulfstream GV aircraft was flown over a 3-week period in the Reno/Tahoe International Airport (NV) local area and an additional 3-week period in the Wallops Flight Facility (VA) local area to evaluate integrated Synthetic Vision System concepts. The enabling technologies (RIPS, EVS and DIME) were integrated into the larger SVS concept design. This paper presents experimental methods and the high level results of this flight test.

  13. Flight testing an integrated synthetic vision system

    NASA Astrophysics Data System (ADS)

    Kramer, Lynda J.; Arthur, Jarvis J., III; Bailey, Randall E.; Prinzel, Lawrence J., III

    2005-05-01

    NASA's Synthetic Vision Systems (SVS) project is developing technologies with practical applications to eliminate low visibility conditions as a causal factor to civil aircraft accidents while replicating the operational benefits of clear day flight operations, regardless of the actual outside visibility condition. A major thrust of the SVS project involves the development/demonstration of affordable, certifiable display configurations that provide intuitive out-the-window terrain and obstacle information with advanced pathway guidance for transport aircraft. The SVS concept being developed at NASA encompasses the integration of tactical and strategic Synthetic Vision Display Concepts (SVDC) with Runway Incursion Prevention System (RIPS) alerting and display concepts, real-time terrain database integrity monitoring equipment (DIME), and Enhanced Vision Systems (EVS) and/or improved Weather Radar for real-time object detection and database integrity monitoring. A flight test evaluation was jointly conducted (in July and August 2004) by NASA Langley Research Center and an industry partner team under NASA's Aviation Safety and Security, Synthetic Vision System project. A Gulfstream G-V aircraft was flown over a 3-week period in the Reno/Tahoe International Airport (NV) local area and an additional 3-week period in the Wallops Flight Facility (VA) local area to evaluate integrated Synthetic Vision System concepts. The enabling technologies (RIPS, EVS and DIME) were integrated into the larger SVS concept design. This paper presents experimental methods and the high level results of this flight test.

  14. Mission planning for autonomous systems

    NASA Technical Reports Server (NTRS)

    Pearson, G.

    1987-01-01

    Planning is a necessary task for intelligent, adaptive systems operating independently of human controllers. A mission planning system that performs task planning by decomposing a high-level mission objective into subtasks and synthesizing a plan for those tasks at varying levels of abstraction is discussed. Researchers use a blackboard architecture to partition the search space and direct the focus of attention of the planner. Using advanced planning techniques, they can control plan synthesis for the complex planning tasks involved in mission planning.

  15. Autonomous precision approach and landing system (APALS)

    NASA Astrophysics Data System (ADS)

    Dieffenbach, Otto W.

    1995-06-01

    The APALS(TM) system is a precision approach and landing system designed to enable low visibility landings at many more airports than is now possible. It is an autonomous navigation system which uses standard avionics equipment to determine the aircraft position and altitude with respect to unique features over which the aircraft flies. The primary measurement is made with the aircraft's weather radar and provides the range and range rate information necessary to update the precision navigation system. The system makes use of stored terrain map data as references for map matching with Synthetic Aperture Radar maps.
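The map-matching step can be pictured as a correlation search of a sensed patch against a stored reference map. The sketch below is a generic normalized-correlation matcher over synthetic data, an illustrative stand-in rather than the actual APALS terrain matcher.

```python
# Generic normalized-correlation map matching with synthetic data: slide a
# sensed patch over a stored reference map and pick the best-correlating
# offset. Illustrative stand-in, not the APALS algorithm.
import numpy as np

rng = np.random.default_rng(1)
reference = rng.random((64, 64))       # stored terrain reference map
true_row, true_col = 20, 33
patch = reference[true_row:true_row + 8, true_col:true_col + 8]  # sensed patch

best, best_score = None, -np.inf
for r in range(64 - 8):
    for c in range(64 - 8):
        window = reference[r:r + 8, c:c + 8]
        # Normalized cross-correlation between patch and candidate window.
        score = np.sum((patch - patch.mean()) * (window - window.mean()))
        score /= (patch.std() * window.std() * patch.size + 1e-12)
        if score > best_score:
            best, best_score = (r, c), score
print(best)  # (20, 33)
```

The recovered offset is the position fix that would be fed back to the navigation filter; a real matcher would work on radar-derived imagery rather than random data.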

  16. Autonomous omnidirectional spacecraft antenna system

    NASA Technical Reports Server (NTRS)

    Taylor, T. H.

    1983-01-01

    The development of a low gain Electronically Switchable Spherical Array Antenna is discussed. This antenna provides roughly 7 dBic gain for receive/transmit operation between user satellites and the Tracking and Data Relay Satellite System. When used as a pair, the antennas provide spherical coverage. The antenna was tested in its primary operating modes: directed beam, retrodirective, and omnidirectional.

  17. System for autonomous monitoring of bioagents

    SciTech Connect

    Langlois, Richard G.; Milanovich, Fred P.; Colston, Jr, Billy W.; Brown, Steve B.; Masquelier, Don A.; Mariella, Jr., Raymond P.; Venkateswaran, Kodomudi

    2015-06-09

    An autonomous monitoring system for monitoring for bioagents. A collector gathers the air, water, soil, or substance being monitored. A sample preparation means for preparing a sample is operatively connected to the collector. A detector for detecting the bioagents in the sample is operatively connected to the sample preparation means. One embodiment of the present invention includes confirmation means for confirming the bioagents in the sample.

  18. Lethality and Autonomous Systems: The Roboticist Demographic

    DTIC Science & Technology

    2008-01-01

    Survey report by Lilia V. Moshkina and Ronald C. Arkin, Mobile Robot Laboratory. Recoverable excerpts: surveyed robot types included humanoid (22%) and other (23%); on media influence, only 18% of respondents said the media had a strong or very strong influence on their attitude toward robots; respondents were also asked whether certain emotions would be appropriate in a military robot.

  19. A stereo vision-based obstacle detection system in vehicles

    NASA Astrophysics Data System (ADS)

    Huh, Kunsoo; Park, Jaehak; Hwang, Junyeon; Hong, Daegun

    2008-02-01

    Obstacle detection is a crucial issue for driver assistance systems as well as for autonomous vehicle guidance functions, and it has to be performed with high reliability to avoid any potential collision with the vehicle ahead. Vision-based obstacle detection systems are regarded as promising for this purpose because they require little infrastructure on a highway. However, the feasibility of these systems in passenger cars requires accurate and robust sensing performance. In this paper, an obstacle detection system using stereo vision sensors is developed. The system utilizes feature matching, the epipolar constraint, and feature aggregation in order to robustly detect the initial corresponding pairs. After the initial detection, the system executes a tracking algorithm for the obstacles. The proposed system can detect a front obstacle, a leading vehicle, and a vehicle cutting into the lane, and the position parameters of the obstacles and leading vehicles can then be obtained. The proposed obstacle detection system is implemented on a passenger car and its performance is verified experimentally.
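The range geometry behind any stereo obstacle detector is simple once features are matched across the two cameras: depth follows from the disparity. The sketch below shows the standard pinhole stereo relation Z = f·B/d; the parameter values are assumed for illustration, not taken from the paper.

```python
# Pinhole stereo range model: depth Z = f * B / d for a matched feature with
# disparity d (pixels), focal length f (pixels), and baseline B (meters).
# Parameter values are assumed for illustration.
def stereo_depth_m(disparity_px, focal_px, baseline_m):
    """Depth of a matched feature pair from its disparity."""
    if disparity_px <= 0:
        raise ValueError("a valid match must have positive disparity")
    return focal_px * baseline_m / disparity_px

# 800 px focal length, 0.5 m baseline, 20 px disparity -> 20 m range.
print(stereo_depth_m(20.0, 800.0, 0.5))  # 20.0
```

The epipolar constraint and feature aggregation described in the abstract serve to make the disparity measurement itself reliable; the depth conversion is the easy final step.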

  20. Autonomous grain combine control system

    DOEpatents

    Hoskinson, Reed L.; Kenney, Kevin L.; Lucas, James R.; Prickel, Marvin A.

    2013-06-25

    A system for controlling a grain combine having a rotor/cylinder, a sieve, a fan, a concave, a feeder, a header, an engine, and a control system. The feeder of the grain combine is engaged and the header is lowered. A separator loss target, engine load target, and a sieve loss target are selected. Grain is harvested with the lowered header passing the grain through the engaged feeder. Separator loss, sieve loss, engine load and ground speed of the grain combine are continuously monitored during the harvesting. If the monitored separator loss exceeds the selected separator loss target, the speed of the rotor/cylinder, the concave setting, the engine load target, or a combination thereof is adjusted. If the monitored sieve loss exceeds the selected sieve loss target, the speed of the fan, the size of the sieve openings, or the engine load target is adjusted.
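The threshold logic in the patent abstract can be sketched as a simple comparison of monitored losses against operator-selected targets. The setting names and target values below are illustrative assumptions, not the patented control law.

```python
# Threshold logic from the abstract: compare monitored losses to operator-
# selected targets and flag which settings to adjust. Setting names and
# target values are illustrative assumptions.
def combine_adjustments(separator_loss, sieve_loss, targets):
    """Return the adjustments called for by the monitored loss levels."""
    actions = []
    if separator_loss > targets["separator_loss"]:
        actions.append("adjust rotor/cylinder speed or concave setting")
    if sieve_loss > targets["sieve_loss"]:
        actions.append("adjust fan speed or sieve opening size")
    return actions

targets = {"separator_loss": 1.5, "sieve_loss": 1.0}  # percent, assumed
print(combine_adjustments(2.0, 0.5, targets))
print(combine_adjustments(1.0, 0.9, targets))  # []
```

In the actual system these checks would run continuously during harvesting, with engine load and ground speed feeding into the same decision loop.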

  1. A multilayer perceptron hazard detector for vision-based autonomous planetary landing

    NASA Astrophysics Data System (ADS)

    Lunghi, Paolo; Ciarambino, Marco; Lavagna, Michèle

    2016-07-01

    A hazard detection and target selection algorithm for autonomous spacecraft planetary landing, based on artificial neural networks, is presented. From a single image of the landing area, acquired by a VIS camera during the descent, the system computes a hazard map, which is exploited to select the best target in terms of safety, guidance constraints, and scientific interest. The generalization properties of ANNs allow the system to operate correctly even in conditions not explicitly considered during calibration. The network architecture design, training, verification, and results are critically presented. Performance is assessed in terms of recognition accuracy and selected target safety. Results for a lunar landing scenario are discussed to highlight the effectiveness of the system.
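The pipeline of scoring image patches with a neural network and picking the safest one can be sketched with a toy multilayer perceptron. The weights below are random stand-ins and the patch size, layer sizes, and image are assumptions for illustration, not the trained network from the paper.

```python
# Toy hazard-map pipeline: a tiny multilayer perceptron scores 5x5 image
# patches and the safest patch is selected. Weights are random stand-ins,
# not the trained network from the paper.
import math
import numpy as np

rng = np.random.default_rng(0)
W1, b1 = rng.normal(size=(16, 25)), np.zeros(16)  # hidden layer weights
W2, b2 = rng.normal(size=(1, 16)), np.zeros(1)    # output layer weights

def hazard_score(patch):
    """Score one 5x5 patch with a 25-16-1 MLP; sigmoid output in [0, 1]."""
    h = np.tanh(W1 @ patch.ravel() + b1)
    z = float((W2 @ h + b2)[0])
    return 1.0 / (1.0 + math.exp(-z))

image = rng.random((20, 20))                      # synthetic descent image
hazard_map = np.array([[hazard_score(image[i:i + 5, j:j + 5])
                        for j in range(16)] for i in range(16)])
i, j = np.unravel_index(np.argmin(hazard_map), hazard_map.shape)
print(f"safest patch at ({i}, {j}), hazard {hazard_map[i, j]:.3f}")
```

A real target selector would combine the hazard map with guidance-constraint and scientific-interest terms rather than taking a bare argmin.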

  2. Measures of Autonomic Nervous System

    DTIC Science & Technology

    2011-04-01

    Report excerpts: the optimal level of an individual's lung function is measured using three color-coded peak flow zones; monoamine oxidase inhibitors may interfere with accurate measurement of catecholamine metabolites; three tools for measuring catecholamines are described. Cited references include a monitoring system for patient transport (IEEE Trans Inf Technol Biomed. 2004;8(4):439) and Blank JM, Altman DG, "Statistical methods for assessing ...".

  3. Why Computer-Based Systems Should be Autonomic

    NASA Technical Reports Server (NTRS)

    Sterritt, Roy; Hinchey, Mike

    2005-01-01

    The objective of this paper is to discuss why computer-based systems should be autonomic, where autonomicity implies self-managing, often conceptualized in terms of being self-configuring, self-healing, self-optimizing, self-protecting and self-aware. We look at motivations for autonomicity, examine how more and more systems are exhibiting autonomic behavior, and finally look at future directions.

  4. Seizures and brain regulatory systems: Consciousness, sleep, and autonomic systems

    PubMed Central

    Sedigh-Sarvestani, Madineh; Blumenfeld, Hal; Loddenkemper, Tobias; Bateman, Lisa M

    2014-01-01

    Research into the physiological underpinnings of epilepsy has revealed reciprocal relationships between seizures and the activity of several regulatory systems in the brain, including those governing sleep, consciousness and autonomic functions. This review highlights recent progress in understanding and utilizing the relationships between seizures and the arousal or consciousness system, the sleep-wake and associated circadian system, and the central autonomic network. PMID:25233249

  5. Agent Technology, Complex Adaptive Systems, and Autonomic Systems: Their Relationships

    NASA Technical Reports Server (NTRS)

    Truszkowski, Walt; Rash, James; Rouff, Christopher; Hinchey, Mike

    2004-01-01

    To reduce the cost of future spaceflight missions and to perform new science, NASA has been investigating autonomous ground and space flight systems. These cost-reduction goals are further complicated by future science data-gathering nanosatellites, which will have large communications delays and at times be out of contact with ground control for extended periods of time. This paper describes two prototype agent-based systems, the Lights-out Ground Operations System (LOGOS) and the Agent Concept Testbed (ACT), developed at NASA Goddard Space Flight Center (GSFC) to demonstrate autonomous operations of future space flight missions. The paper discusses the architecture of the two agent-based systems, operational scenarios for both, and the two systems' autonomic properties.

  6. Laser Imaging Systems For Computer Vision

    NASA Astrophysics Data System (ADS)

    Vlad, Ionel V.; Ionescu-Pallas, Nicholas; Popa, Dragos; Apostol, Ileana; Vlad, Adriana; Capatina, V.

    1989-05-01

    Computer vision is becoming an essential feature of high-level artificial intelligence. Laser imaging systems act as a special kind of image preprocessor/converter, extending the reach of computer "intelligence" to inspection, analysis, and decision in new "worlds": nanometric, three-dimensional (3D), ultrafast, hostile to humans, etc. Considering that the heart of the problem is the matching of the optical methods and the computer software, some of the most promising interferometric, projection, and diffraction systems are reviewed, with discussion of our present results and of their potential in precise 3D computer vision.

  7. Autonomous landing guidance system validation

    NASA Astrophysics Data System (ADS)

    Bui, Long Q.; Franklin, Michael R.; Taylor, Christopher; Neilson, Graham

    1997-06-01

    ALG is a combination of raster imaging sensor, head-up displays, flight guidance, and procedures which allow pilots to perform hand-flown aircraft maneuvers in adverse weather, at night, or in low visibility conditions at facilities with minimal or no ground aids. Maneuvers in the context of ALG relate to takeoff, landing, rollout, taxi, and terminal parking. Commercial needs are driven by potential revenue savings, since today only 43 Type III and 80 Type II instrument landing system (ILS) runway ends in the United States are equipped for lower minimum flight operations. Additionally, most of these ILS facilities are clustered at major gateway airports, which further impacts dispatch authority and general ATC regional delays. Infrastructure costs to upgrade additional runways must account not only for the high integrity ground instrumentation, but also for the installation of lights and markers mandated for Cat III operations. The military services' ability to train under realistic battlefield conditions and to project power globally in support of national interests, while providing humanitarian aid, is significantly impaired by the inability to conduct precision approaches and landings in low visibility conditions, whether to instrumented runways or in a more tactical environment with operations into and out of unprepared landing strips, particularly when time does not permit deployment of ground aids and verification of their integrity. Recently, Lear Astronics, in cooperation with Consortium members of the ALG Program, concluded a flight test program which evaluated the utility of the ALG system in meeting both civil and military needs. Those results are the subject of this paper.

  8. Vision-Aided Autonomous Precision Weapon Terminal Guidance Using a Tightly-Coupled INS and Predictive Rendering Techniques

    DTIC Science & Technology

    2011-03-01

    AFIT/GE/ENG/11-42, a master's thesis by Jonathan W. Beich (BSEE, MS Space Studies, MS Flight Test Engineering), presented in partial fulfillment of the requirements for the degree of Master of Science in Electrical Engineering; a work of the U.S. Government, not subject to copyright protection in the United States. Surviving fragments of the notation list define additive gyro bias noise terms and the measurement vector z_k.

  9. A Monocular Vision Sensor-Based Obstacle Detection Algorithm for Autonomous Robots.

    PubMed

    Lee, Tae-Jae; Yi, Dong-Hoon; Cho, Dong-Il Dan

    2016-03-01

    This paper presents a monocular vision sensor-based obstacle detection algorithm for autonomous robots. Each individual image pixel at the bottom region of interest is labeled as belonging either to an obstacle or the floor. While conventional methods depend on point tracking for geometric cues for obstacle detection, the proposed algorithm uses the inverse perspective mapping (IPM) method. This method is much more advantageous when the camera is not high off the floor, which makes point tracking near the floor difficult. Markov random field-based obstacle segmentation is then performed using the IPM results and a floor appearance model. Next, the shortest distance between the robot and the obstacle is calculated. The algorithm is tested by applying it to 70 datasets, 20 of which include nonobstacle images where considerable changes in floor appearance occur. The obstacle segmentation accuracies and the distance estimation error are quantitatively analyzed. For obstacle datasets, the segmentation precision and the average distance estimation error of the proposed method are 81.4% and 1.6 cm, respectively, whereas those for a conventional method are 57.5% and 9.9 cm, respectively. For nonobstacle datasets, the proposed method gives 0.0% false positive rates, while the conventional method gives 17.6%.
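The inverse perspective mapping (IPM) step the algorithm relies on amounts to intersecting each pixel ray with the floor plane under a flat-floor assumption. The sketch below shows that projection for a pinhole camera; the camera parameters and tilt handling are illustrative assumptions, not the paper's calibration.

```python
# Flat-floor inverse perspective mapping (IPM): project a pixel ray from a
# pinhole camera onto the floor plane to get metric floor coordinates.
# Camera parameters here are illustrative assumptions.
import math

def ipm_floor_point(u, v, fx, fy, cx, cy, cam_height_m, tilt_rad=0.0):
    """Map pixel (u, v) to (X, Z) on the floor, camera cam_height_m above it."""
    x = (u - cx) / fx                      # normalized ray direction
    y = (v - cy) / fy
    # Rotate the ray by the camera tilt, then intersect with the floor plane.
    yr = y * math.cos(tilt_rad) + math.sin(tilt_rad)
    zr = -y * math.sin(tilt_rad) + math.cos(tilt_rad)
    if yr <= 0:
        raise ValueError("pixel ray does not intersect the floor")
    t = cam_height_m / yr                  # scale so the ray reaches the floor
    return x * t, zr * t                   # lateral offset X, forward range Z

# A pixel below the principal point maps to a floor point ahead of the robot.
X, Z = ipm_floor_point(u=320, v=300, fx=500, fy=500, cx=320, cy=240,
                       cam_height_m=0.3)
print(f"X = {X:.2f} m, Z = {Z:.2f} m")  # X = 0.00 m, Z = 2.50 m
```

Because the geometry, not point tracking, supplies the metric scale, this projection stays usable even when the camera sits close to the floor, which is the situation the paper targets.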

  10. A Monocular Vision Sensor-Based Obstacle Detection Algorithm for Autonomous Robots

    PubMed Central

    Lee, Tae-Jae; Yi, Dong-Hoon; Cho, Dong-Il “Dan”

    2016-01-01

    This paper presents a monocular vision sensor-based obstacle detection algorithm for autonomous robots. Each individual image pixel at the bottom region of interest is labeled as belonging either to an obstacle or the floor. While conventional methods depend on point tracking for geometric cues for obstacle detection, the proposed algorithm uses the inverse perspective mapping (IPM) method. This method is much more advantageous when the camera is not high off the floor, which makes point tracking near the floor difficult. Markov random field-based obstacle segmentation is then performed using the IPM results and a floor appearance model. Next, the shortest distance between the robot and the obstacle is calculated. The algorithm is tested by applying it to 70 datasets, 20 of which include nonobstacle images where considerable changes in floor appearance occur. The obstacle segmentation accuracies and the distance estimation error are quantitatively analyzed. For obstacle datasets, the segmentation precision and the average distance estimation error of the proposed method are 81.4% and 1.6 cm, respectively, whereas those for a conventional method are 57.5% and 9.9 cm, respectively. For nonobstacle datasets, the proposed method gives 0.0% false positive rates, while the conventional method gives 17.6%. PMID:26938540

  11. Information-Driven Autonomous Exploration for a Vision-Based Mav

    NASA Astrophysics Data System (ADS)

    Palazzolo, E.; Stachniss, C.

    2017-08-01

    Most micro aerial vehicles (MAVs) are flown manually by a pilot. When it comes to autonomous exploration for MAVs equipped with cameras, we need a good exploration strategy for covering an unknown 3D environment in order to build an accurate map of the scene. In particular, the robot must select appropriate viewpoints to acquire informative measurements. In this paper, we present an approach that computes in real time a smooth flight path for exploring a 3D environment with a vision-based MAV. We assume a known bounding box of the object or building to explore, and our approach iteratively computes the next best viewpoints using a utility function that considers the expected information gain of new measurements, the distance between viewpoints, and the smoothness of the flight trajectories. In addition, the algorithm takes into account the elapsed time of the exploration run so as to safely land the MAV at its starting point after a user-specified time. We implemented our algorithm, and our experiments suggest that it allows for a precise reconstruction of the 3D environment while guiding the robot smoothly through the scene.
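The next-best-view selection described above can be sketched as a weighted utility over candidate viewpoints. The weights, candidate values, and linear form below are illustrative assumptions, not the paper's actual utility function.

```python
# Next-best-view scoring: trade expected information gain against travel
# distance and path smoothness. Weights and candidate values are illustrative
# assumptions, not the paper's actual utility function.
def utility(info_gain, distance, smoothness,
            w_gain=1.0, w_dist=0.5, w_smooth=0.3):
    """Higher is better: reward new information, penalize travel and turns."""
    return w_gain * info_gain - w_dist * distance - w_smooth * (1.0 - smoothness)

# Candidates: (expected information gain, travel distance m, smoothness 0..1).
candidates = {
    "A": (5.0, 2.0, 0.9),
    "B": (8.0, 10.0, 0.5),   # informative but far away
    "C": (4.0, 1.0, 1.0),    # close but less informative
}
best = max(candidates, key=lambda k: utility(*candidates[k]))
print(best)  # A
```

In the full system this scoring would be repeated each planning cycle, with a time budget term added so the MAV can return and land before the user-specified deadline.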

  12. Autonomous Flight Safety System Road Test

    NASA Technical Reports Server (NTRS)

    Simpson, James C.; Zoemer, Roger D.; Forney, Chris S.

    2005-01-01

    On February 3, 2005, Kennedy Space Center (KSC) conducted the first Autonomous Flight Safety System (AFSS) test on a moving vehicle -- a van driven around the KSC industrial area. A subset of the Phase III design was used, consisting of a single computer, GPS receiver, and GPS antenna. The description and results of this road test are described in this report. AFSS is a joint KSC and Wallops Flight Facility project that is in its third phase of development. AFSS is an independent subsystem intended for use with Expendable Launch Vehicles that uses tracking data from redundant onboard sensors to autonomously make flight termination decisions using software-based rules implemented on redundant flight processors. The goals of this project are to increase capabilities by allowing launches from locations that do not have or cannot afford extensive ground-based range safety assets, to decrease range costs, and to decrease reaction time for special situations.

  13. Autonomic complications following central nervous system injury.

    PubMed

    Baguley, Ian J

    2008-11-01

    Severe sympathetic overactivity occurs in several conditions that are recognized as medical emergencies. Following central nervous system injury, a small proportion of individuals develop severe paroxysmal sympathetic and motor overactivity. These individuals have a high attendant risk of unnecessary secondary morbidity. Following acquired brain injury, the syndrome is known by a number of names including dysautonomia and sympathetic storm. Dysautonomia is currently a diagnosis of exclusion and often goes unrecognized. The evidence base for management is almost entirely anecdotal in nature; there has been little structured or prospective research. In contrast, the evidence base for autonomic dysreflexia following spinal cord injury is much stronger, with level 1 evidence for many treatment interventions. This review presents a current understanding of each condition and suggests simple management protocols. With the marked disparity in the literature for the two conditions, the main focus is on the literature for dysautonomia. The similarity between these two conditions and the other autonomic emergency conditions is discussed.

  14. The autonomic nervous system and perinatal metabolism.

    PubMed

    Milner, R D; De Gasparo, M

    1981-01-01

    The development of the autonomic nervous system in relation to perinatal metabolism is reviewed with particular attention given to the adipocyte, hepatocyte and the A and B cells of the islets of Langerhans. Adrenergic receptors develop in the B cell independently of normal innervation and by the time of birth, in most species studied, the pancreas, liver and adipose tissue respond appropriately to autonomic signals. Birth is associated with a huge surge in circulating catecholamines which is probably responsible for the early postnatal rise in free fatty acids and glucagon concentrations in plasma. beta-Blocking drugs such as propranolol have an adverse effect on fetal growth and neonatal metabolism, being responsible for hypoglycemia and for impairing the thermogenic response to cold exposure. beta-Mimetic drugs are commonly used to prevent premature labour and may help the fetus in other ways, for example, by improving the placental blood supply and the delivery of nutrients by increasing maternal fat and carbohydrate mobilization.

  15. Statins and the autonomic nervous system.

    PubMed

    Millar, Philip J; Floras, John S

    2014-03-01

    Statins (3-hydroxy-3-methylglutaryl-CoA reductase inhibitors) reduce plasma cholesterol and improve endothelium-dependent vasodilation, inflammation and oxidative stress. A 'pleiotropic' property of statins receiving less attention is their effect on the autonomic nervous system. Increased central sympathetic outflow and diminished cardiac vagal tone are disturbances characteristic of a range of cardiovascular conditions for which statins are now prescribed routinely to reduce cardiovascular events: following myocardial infarction, and in hypertension, chronic kidney disease, heart failure and diabetes. The purpose of the present review is to synthesize contemporary evidence that statins can improve autonomic circulatory regulation. In experimental preparations, high-dose lipophilic statins have been shown to reduce adrenergic outflow by attenuating oxidative stress in central brain regions involved in sympathetic and parasympathetic discharge induction and modulation. In patients with hypertension, chronic kidney disease and heart failure, lipophilic statins, such as simvastatin or atorvastatin, have been shown to reduce MSNA (muscle sympathetic nerve activity) by 12-30%. Reports concerning the effect of statin therapy on HRV (heart rate variability) are less consistent. Because of their implications for BP (blood pressure) control, insulin sensitivity, arrhythmogenesis and sudden cardiac death, these autonomic nervous system actions should be considered additional mechanisms by which statins lower cardiovascular risk.

  16. Development of an Autonomous Pathogen Detection System

    SciTech Connect

    Langlosi, S.; Brown, S.; Colston, B.; Jones, L.; Masquelier, D.; Meyer, P.; McBride, M.; Nasarabad, S.; Ramponi, A.J.; Venkatseswarm, K.; Milanovich, F.

    2000-10-12

    An Autonomous Pathogen Detection System (APDS) is being designed and evaluated for use in domestic counter-terrorism. The goal is a fully automated system that utilizes both flow cytometry and polymerase chain reaction (PCR) to continuously monitor the air for BW pathogens in major buildings or high profile events. A version 1 APDS system consisting of an aerosol collector, a sample preparation subsystem, and a flow cytometer for detecting the antibody-labeled target organisms has been completed and evaluated. Improved modules are under development for a version 2 APDS including a Lawrence Livermore National Laboratory-designed aerosol preconcentrator, a multiplex flow cytometer, and a flow-through PCR detector.

  17. Sustainable and Autonomic Space Exploration Missions

    NASA Technical Reports Server (NTRS)

    Hinchey, Michael G.; Sterritt, Roy; Rouff, Christopher; Rash, James L.; Truszkowski, Walter

    2006-01-01

    Visions for future space exploration have long term science missions in sight, resulting in the need for sustainable missions. Survivability is a critical property of sustainable systems and may be addressed through autonomicity, an emerging paradigm for self-management of future computer-based systems based on inspiration from the human autonomic nervous system. This paper examines some of the ongoing research efforts to realize these survivable systems visions, with specific emphasis on developments in Autonomic Policies.

  18. MARVEL: A system that recognizes world locations with stereo vision

    SciTech Connect

    Braunegg, D.J. (Artificial Intelligence Lab.)

    1993-06-01

    MARVEL is a system that supports autonomous navigation by building and maintaining its own models of world locations and using these models and stereo vision input to recognize its location in the world and its position and orientation within that location. The system emphasizes the use of simple, easily derivable features for recognition, whose aggregate identifies a location, instead of complex features that also require recognition. MARVEL is designed to be robust with respect to input errors and to respond to a gradually changing world by updating its world location models. In over 1,000 recognition tests using real-world data, MARVEL yielded a false negative rate under 10% with zero false positives.

  19. Image Control In Automatic Welding Vision System

    NASA Technical Reports Server (NTRS)

    Richardson, Richard W.

    1988-01-01

    Orientation and brightness varied to suit welding conditions. Commands from vision-system computer drive servomotors on iris and Dove prism, providing proper light level and image orientation. Optical-fiber bundle carries view of weld area as viewed along axis of welding electrode. Image processing described in companion article, "Processing Welding Images for Robot Control" (MFS-26036).

  20. Lumber Grading With A Computer Vision System

    Treesearch

    Richard W. Conners; Tai-Hoon Cho; Philip A. Araman

    1989-01-01

    Over the past few years significant progress has been made in developing a computer vision system for locating and identifying defects on surfaced hardwood lumber. Unfortunately, until September of 1988 little research had gone into developing methods for analyzing rough lumber. This task is arguably more complex than the analysis of surfaced lumber. The prime...

  1. A Design Methodology For Industrial Vision Systems

    NASA Astrophysics Data System (ADS)

    Batchelor, B. G.; Waltz, F. M.; Snyder, M. A.

    1988-11-01

    The cost of design, rather than that of target system hardware, represents the principal factor inhibiting the adoption of machine vision systems by manufacturing industry. To reduce design costs to a minimum, a number of software and hardware aids have been developed or are currently being built by the authors. These design aids are as follows: a. An expert system for giving advice about which image acquisition techniques (i.e. lighting/viewing techniques) might be appropriate in a given situation. b. A program to assist in the selection and setup of camera lenses. c. A rich repertoire of image processing procedures, integrated with the AI language Prolog. This combination (called ProVision) provides a facility for experimenting with intelligent image processing techniques and is intended to allow rapid prototyping of algorithms and/or heuristics. d. Fast image processing hardware, capable of implementing commands in the ProVision language. The speed of operation of this equipment is sufficiently high for it to be used, without modification, in many industrial applications. Where this is not possible, even higher execution speed may be achieved by adding extra modules to the processing hardware. In this way, it is possible to trade speed against the cost of the target system hardware. New and faster implementations of a given algorithm/heuristic can usually be achieved with the expenditure of only a small effort. Throughout this article, the emphasis is on designing an industrial vision system in a smooth and effortless manner. In order to illustrate our main thesis that the design of industrial vision systems can be made very much easier through the use of suitable utilities, the article concludes with a discussion of a case study: the dissection of tiny plants using a visually controlled robot.

  2. Multi-Spacecraft Autonomous Positioning System

    NASA Technical Reports Server (NTRS)

    Anzalone, Evan

    2015-01-01

    As the number of spacecraft in simultaneous operation continues to grow, there is an increased dependency on ground-based navigation support. The current baseline system for deep space navigation utilizes Earth-based radiometric tracking, requiring long-duration observations to perform orbit determination and generate a state update. The age, complexity, and high utilization of the ground assets pose a risk to spacecraft navigation performance. In order to perform complex operations at large distances from Earth, such as extraterrestrial landing and proximity operations, autonomous systems are required. With increasingly complex mission operations, the need for frequent and Earth-independent navigation capabilities is further reinforced. The Multi-spacecraft Autonomous Positioning System (MAPS) takes advantage of the growing inter-spacecraft communication network and infrastructure to allow for Earth-autonomous state measurements to enable network-based space navigation. A notional concept of operations is given in figure 1. This network is already being implemented and routinely used in Martian communications through the use of the Mars Reconnaissance Orbiter and Mars Odyssey spacecraft as relays for surface assets. The growth of this communications architecture is continued through MAVEN and future potential commercial Mars telecom orbiters. This growing network provides an initial Mars-local capability for inter-spacecraft communication and navigation. These navigation updates are enabled by cross-communication between assets in the network, coupled with onboard navigation estimation routines that integrate packet travel time to generate ranging measurements. Inter-spacecraft communication allows for frequent state broadcasts and time updates from trusted references. The architecture is a software-based solution, enabling its implementation on a wide variety of current assets, with the operational constraints and measurement accuracy determined by onboard systems.
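
    The ranging measurement described, packet travel time between time-synchronized assets, reduces to a light-time calculation. A minimal sketch with hypothetical timestamps (real clock-bias handling is far more involved; the function name is illustrative, not from MAPS):

```python
# Sketch: one-way range from a time-stamped packet exchange, as in the
# inter-spacecraft ranging concept above. All values are hypothetical.
C = 299_792_458.0  # speed of light, m/s

def one_way_range(t_transmit: float, t_receive: float, clock_bias: float = 0.0) -> float:
    """Range from one-way light time; clock_bias is the known offset of the
    receiver clock relative to the trusted reference (seconds)."""
    return C * (t_receive - t_transmit - clock_bias)

# A packet sent at t = 100.0 s and received 0.01 s later implies ~2998 km.
rng = one_way_range(100.0, 100.01)
```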

  3. Versatile 360-deg panoramic optical system for autonomous robots

    NASA Astrophysics Data System (ADS)

    Barton, George G.; Feldman, Sidney; Beckstead, Jeffrey A.; Nordhauser, Sidney R.

    1999-01-01

    Autonomous mobile robots require wide-angle vision for navigation and threat detection and analysis, best served with full panoramic vision. The panoramic optical element is a unique, inexpensive, first-surface reflective aspheric convex cone. This cone can be sized and configured for any vertical FOV desired. The cone acts as a negative optical element generating a panoramic virtual image. When this virtual image is viewed through a standard camera lens, it produces at the lens's focal plane a panoramic toroidal image with a translational linearity of > 99 percent. One of three image transducers can be used to convert the toroidal panoramic image to a video signal: raster-scanned CCDs, radially scanned vidicons, and linear CCD arrays on a mechanically rotated stage each have their own particular advantages. Field object distances can be determined in two ways. If the robot is moving, the range can be calculated from the size change of a field object versus the distance traversed in a specific time interval. Alternatively, by vertically displacing the panoramic camera by several inches, a quasi-binocular system is created and the range determined by simple math. Ranging thus produces the third dimension.
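
    The quasi-binocular ranging described is ordinary pinhole stereo with a vertical baseline. A sketch with hypothetical camera parameters (focal length in pixels, disparity measured between the two vertically displaced views):

```python
def stereo_range(baseline_m: float, focal_px: float, disparity_px: float) -> float:
    """Classic pinhole stereo: distance = focal * baseline / disparity."""
    if disparity_px <= 0:
        raise ValueError("zero disparity: target effectively at infinity")
    return focal_px * baseline_m / disparity_px

# 0.1 m ("several inches") vertical baseline, 800 px focal length, 4 px disparity:
z = stereo_range(0.1, 800.0, 4.0)  # -> 20 m
```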

  4. Vision-based map building and trajectory planning to enable autonomous flight through urban environments

    NASA Astrophysics Data System (ADS)

    Watkins, Adam S.

    The desire to use Unmanned Air Vehicles (UAVs) in a variety of complex missions has motivated the need to increase the autonomous capabilities of these vehicles. This research presents autonomous vision-based mapping and trajectory planning strategies for a UAV navigating in an unknown urban environment. It is assumed that the vehicle's inertial position is unknown because GPS is unavailable due to environmental occlusions or jamming by hostile military assets. Therefore, the environment map is constructed from noisy sensor measurements taken at uncertain vehicle locations. Under these restrictions, map construction becomes a state estimation task known as the Simultaneous Localization and Mapping (SLAM) problem. Solutions to the SLAM problem endeavor to estimate the state of a vehicle relative to concurrently estimated environmental landmark locations. The presented work focuses specifically on SLAM for aircraft, denoted as airborne SLAM, where the vehicle is capable of six-degree-of-freedom motion characterized by highly nonlinear equations of motion. The airborne SLAM problem is solved with a variety of filters based on the Rao-Blackwellized particle filter. Additionally, the environment is represented as a set of geometric primitives that are fit to the three-dimensional points reconstructed from gathered onboard imagery. The second half of this research builds on the mapping solution by addressing the problem of trajectory planning for optimal map construction. Optimality is defined in terms of maximizing environment coverage in minimum time. The planning process is decomposed into two phases of global navigation and local navigation. The global navigation strategy plans a coarse, collision-free path through the environment to a goal location that will take the vehicle to previously unexplored or incompletely viewed territory. The local navigation strategy plans detailed, collision-free paths within the currently sensed environment that maximize local coverage.
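
    The global-navigation phase, choosing a goal that takes the vehicle toward unexplored territory, is commonly cast as frontier selection on an occupancy grid. A minimal sketch (the grid encoding and nearest-frontier tie-breaking are assumptions for illustration, not the dissertation's exact method):

```python
# Occupancy-grid frontier selection: a frontier is a free cell adjacent to
# unknown space; the goal is the frontier nearest the vehicle.
FREE, OCC, UNKNOWN = 0, 1, -1

def frontier_cells(grid):
    """Free cells bordering unexplored space -- candidate navigation goals."""
    rows, cols = len(grid), len(grid[0])
    out = []
    for r in range(rows):
        for c in range(cols):
            if grid[r][c] != FREE:
                continue
            for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                nr, nc = r + dr, c + dc
                if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == UNKNOWN:
                    out.append((r, c))
                    break
    return out

def nearest_frontier(grid, pos):
    cands = frontier_cells(grid)
    if not cands:
        return None  # map fully explored
    return min(cands, key=lambda p: abs(p[0] - pos[0]) + abs(p[1] - pos[1]))

grid = [[0, 0, -1],
        [0, 1, -1],
        [0, 0,  0]]
goal = nearest_frontier(grid, (0, 0))
```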

  5. Visual-tracking-based robot vision system

    NASA Astrophysics Data System (ADS)

    Deng, Keqiang; Wilson, Joseph N.; Ritter, Gerhard X.

    1992-11-01

    There are two kinds of depth perception for robot vision systems: quantitative and qualitative. The first can be used to reconstruct visible surfaces numerically, while the second describes visible surfaces qualitatively. In this paper, we present a qualitative vision system suitable for intelligent robots. The goal of such a system is to perceive depth information qualitatively from monocular 2-D images. We first establish a set of propositions relating depth information, such as 3-D orientation and distance, to the changes in an image region caused by camera motion. We then introduce an approximation-based visual tracking system. Given an object, the tracking system tracks its image while moving the camera in a way dependent upon the particular depth property to be perceived. Checking the data generated by the tracking system against our propositions provides the depth information about the object. The visual tracking system can track image regions in real time even when implemented on a PC AT clone, and mobile robots can naturally provide the inputs to the tracker; we are therefore able to construct a real-time, cost-effective, monocular, qualitative, 3-dimensional robot vision system. To verify our idea, we present examples of perception of planar surface orientation, distance, size, dimensionality and convexity/concavity.
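
    One proposition relating image change to distance can be made concrete: if the camera advances a known distance toward an object and the object's image grows, similar triangles recover the initial range. A sketch under the pinhole model (not the paper's exact formulation):

```python
def distance_from_looming(size_before: float, size_after: float, advance: float) -> float:
    """Pinhole model: image size s = f*S/Z, so after moving `advance` toward
    the object the scale factor is k = s2/s1 = Z/(Z - d), giving
    Z = d * k / (k - 1)."""
    k = size_after / size_before
    if k <= 1.0:
        raise ValueError("object image must grow as the camera approaches")
    return advance * k / (k - 1.0)

# Image grows by 25% after advancing 2 m -> object was 10 m away.
z0 = distance_from_looming(1.0, 1.25, 2.0)
```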

  6. Verification of Autonomous Systems for Space Applications

    NASA Technical Reports Server (NTRS)

    Brat, G.; Denney, E.; Giannakopoulou, D.; Frank, J.; Jonsson, A.

    2006-01-01

    Autonomous software, especially if it is model-based, can play an important role in future space applications. For example, it can help streamline ground operations, assist in autonomous rendezvous and docking operations, or even help recover from problems (e.g., planners can be used to explore the space of recovery actions for a power subsystem and implement a solution without, or with minimal, human intervention). In general, the exploration capabilities of model-based systems give them great flexibility. Unfortunately, this also makes them unpredictable to our human eyes, both in terms of their execution and their verification. Traditional verification techniques are inadequate for these systems since they are mostly based on testing, which implies a very limited exploration of the behavioral space. In our work, we explore how advanced V&V techniques, such as static analysis, model checking, and compositional verification, can be used to gain trust in model-based systems. We also describe how synthesis can be used in the context of system reconfiguration and in the context of verification.

  7. Physiology of the Autonomic Nervous System

    PubMed Central

    2007-01-01

    This manuscript discusses the physiology of the autonomic nervous system (ANS). The following topics are presented: regulation of activity; efferent pathways; sympathetic and parasympathetic divisions; neurotransmitters, their receptors and the termination of their activity; functions of the ANS; and the adrenal medullae. In addition, the application of this material to the practice of pharmacy is of special interest. Two case studies regarding insecticide poisoning and pheochromocytoma are included. The ANS and the accompanying case studies are discussed over 5 lectures and 2 recitation sections during a 2-semester course in Human Physiology. The students are in the first-professional year of the doctor of pharmacy program. PMID:17786266

  8. Malicious Hubs: Detecting Abnormally Malicious Autonomous Systems

    SciTech Connect

    Kalafut, Andrew J.; Shue, Craig A; Gupta, Prof. Minaxi

    2010-01-01

    While many attacks are distributed across botnets, investigators and network operators have recently targeted malicious networks through high profile autonomous system (AS) de-peerings and network shut-downs. In this paper, we explore whether some ASes indeed are safe havens for malicious activity. We look for ISPs and ASes that exhibit disproportionately high malicious behavior using 12 popular blacklists. We find that some ASes have over 80% of their routable IP address space blacklisted and others account for large fractions of blacklisted IPs. Overall, we conclude that examining malicious activity at the AS granularity can unearth networks with lax security or those that harbor cybercrime.
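
    The per-AS blacklist concentration described above reduces to simple address-set arithmetic. A sketch using Python's standard `ipaddress` module with made-up prefixes and blacklist entries (the paper itself aggregated 12 real blacklists):

```python
import ipaddress

def blacklisted_fraction(announced_prefixes, blacklisted_ips):
    """Fraction of an AS's announced IPv4 space that appears on a blacklist.
    Inputs here are hypothetical illustrations."""
    nets = [ipaddress.ip_network(p) for p in announced_prefixes]
    total = sum(n.num_addresses for n in nets)
    hits = sum(1 for ip in blacklisted_ips
               if any(ipaddress.ip_address(ip) in n for n in nets))
    return hits / total if total else 0.0

# Two of the four addresses in 192.0.2.0/30 are blacklisted -> 50%.
frac = blacklisted_fraction(["192.0.2.0/30"],
                            ["192.0.2.1", "192.0.2.2", "198.51.100.9"])
```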

  9. Vision based flight procedure stereo display system

    NASA Astrophysics Data System (ADS)

    Shen, Xiaoyun; Wan, Di; Ma, Lan; He, Yuncheng

    2008-03-01

    A virtual reality flight procedure vision system is introduced in this paper. The digital flight map database is established based on the Geographic Information System (GIS) and high definitions satellite remote sensing photos. The flight approaching area database is established through computer 3D modeling system and GIS. The area texture is generated from the remote sensing photos and aerial photographs in various level of detail. According to the flight approaching procedure, the flight navigation information is linked to the database. The flight approaching area vision can be dynamic displayed according to the designed flight procedure. The flight approaching area images are rendered in 2 channels, one for left eye images and the others for right eye images. Through the polarized stereoscopic projection system, the pilots and aircrew can get the vivid 3D vision of the flight destination approaching area. Take the use of this system in pilots preflight preparation procedure, the aircrew can get more vivid information along the flight destination approaching area. This system can improve the aviator's self-confidence before he carries out the flight mission, accordingly, the flight safety is improved. This system is also useful in validate the visual flight procedure design, and it helps to the flight procedure design.

  10. Mobile robot on-board vision system

    SciTech Connect

    McClure, V.W.; Nai-Yung Chen.

    1993-06-15

    An automatic robot system is described comprising: an AGV transporting and transferring work piece, a control computer on board the AGV, a process machine for working on work pieces, a flexible robot arm with a gripper comprising two gripper fingers at one end of the arm, wherein the robot arm and gripper are controllable by the control computer for engaging a work piece, picking it up, and setting it down and releasing it at a commanded location, locating beacon means mounted on the process machine, wherein the locating beacon means are for locating on the process machine a place to pick up and set down work pieces, vision means, including a camera fixed in the coordinate system of the gripper means, attached to the robot arm near the gripper, such that the space between said gripper fingers lies within the vision field of said vision means, for detecting the locating beacon means, wherein the vision means provides the control computer visual information relating to the location of the locating beacon means, from which information the computer is able to calculate the pick up and set down place on the process machine, wherein said place for picking up and setting down work pieces on the process machine is a nest means and further serves the function of holding a work piece in place while it is worked on, the robot system further comprising nest beacon means located in the nest means detectable by the vision means for providing information to the control computer as to whether or not a work piece is present in the nest means.

  11. Zoom Vision System For Robotic Welding

    NASA Technical Reports Server (NTRS)

    Gilbert, Jeffrey L.; Hudyma, Russell M.

    1990-01-01

    Rugged zoom lens subsystem proposed for use in along-the-torch vision system of robotic welder. Enables system to adapt, via simple mechanical adjustments, to gas cups of different lengths, electrodes of different protrusions, and/or different distances between end of electrode and workpiece. Unnecessary to change optical components to accommodate changes in geometry. Easy to calibrate with respect to object in view. Provides variable focus and variable magnification.
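
    As a rough illustration of the optics involved, the thin-lens equation shows how a change in electrode-to-workpiece standoff shifts both focus and magnification, which is what the zoom subsystem's mechanical adjustments compensate for. The focal length and object distance below are hypothetical:

```python
def image_distance(focal_mm: float, object_mm: float):
    """Thin-lens equation 1/f = 1/do + 1/di, solved for di, plus the
    magnification m = di/do."""
    di = 1.0 / (1.0 / focal_mm - 1.0 / object_mm)
    return di, di / object_mm

# 50 mm lens, object 200 mm away -> image plane at ~66.7 mm, magnification ~1/3.
di, m = image_distance(50.0, 200.0)
```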

  12. Vision enhanced navigation for unmanned systems

    NASA Astrophysics Data System (ADS)

    Wampler, Brandon Loy

    A vision based simultaneous localization and mapping (SLAM) algorithm is evaluated for use on unmanned systems. SLAM is a technique used by a vehicle to build a map of an environment while concurrently keeping track of its location within the map, without a priori knowledge. The work in this thesis is focused on using SLAM as a navigation solution when global positioning system (GPS) service is degraded or temporarily unavailable. Previous work on unmanned systems that led to the determination that a better navigation solution than GPS alone is needed is presented first. This previous work includes control of unmanned systems, simulation, and unmanned vehicle hardware testing. The proposed SLAM algorithm follows the work originally developed by Davison et al., who dub their algorithm MonoSLAM [1--4]. A new approach using the Pyramidal Lucas-Kanade feature tracking algorithm from Intel's OpenCV (Open Source Computer Vision) library is presented as a means of keeping correct landmark correspondences as the vehicle moves through the scene. Though this landmark tracking method is unusable for long term SLAM due to its inability to recognize revisited landmarks, as opposed to the Scale Invariant Feature Transform (SIFT) and Speeded Up Robust Features (SURF), its computational efficiency makes it a good candidate for short term navigation between GPS position updates. Additional sensor information is then considered by fusing INS and GPS information into the SLAM filter. The SLAM system, in its vision only and vision/IMU form, is tested on a table top, in an open room, and finally in an outdoor environment. For the outdoor environment, a form of the SLAM algorithm that fuses vision, IMU, and GPS information is tested. The proposed SLAM algorithm, and its several forms, are implemented in C++ using an Extended Kalman Filter (EKF). Experiments utilizing a live video feed from a webcam are performed. The different forms of the filter are compared and conclusions are made on
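
    The EKF machinery underlying such filters can be illustrated with a minimal scalar predict/update cycle against a known landmark, a toy stand-in for the full multi-landmark MonoSLAM state. All values and the function name are hypothetical:

```python
def kf_step(x, P, u, z, q, r, landmark):
    """One predict/update cycle of a scalar filter: x, P are the position
    estimate and its variance; u is commanded motion; z is a measured range
    to a known landmark; q, r are process and measurement noise variances."""
    # Predict: dead-reckon the motion, inflating uncertainty.
    x_pred = x + u
    P_pred = P + q
    # Update: measurement model h(x) = landmark - x (vehicle left of the
    # landmark), so the Jacobian is H = -1.
    H = -1.0
    innovation = z - (landmark - x_pred)
    S = H * P_pred * H + r          # innovation variance
    K = P_pred * H / S              # Kalman gain
    x_new = x_pred + K * innovation
    P_new = (1.0 - K * H) * P_pred  # uncertainty shrinks after the update
    return x_new, P_new

# Start at 0 +/- 1, command 1 m of motion, then observe range 8.5 m to a
# landmark at 10 m: the estimate is pulled from 1.0 toward 1.5.
x, P = kf_step(x=0.0, P=1.0, u=1.0, z=8.5, q=0.1, r=0.5, landmark=10.0)
```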

  13. Missileborne artificial vision system (MAVIS)

    NASA Astrophysics Data System (ADS)

    Andes, David K.; Witham, James C.; Miles, Michael D.

    1994-03-01

    The Naval Air Warfare Center, China Lake has developed a real time, hardware and software system designed to implement and evaluate biologically inspired retinal and cortical models. The hardware is based on the Adaptive Solutions Inc. massively parallel CNAPS system COHO boards. Each COHO board is a standard size 6U VME card featuring 256 fixed point, RISC processors running at 20 MHz in a SIMD configuration. Each COHO board has a Companion board built to support a real time VSB interface to an imaging seeker, a NTSC camera and to other COHO boards. The system is designed to have multiple SIMD machines each performing different Corticomorphic functions. The system level software has been developed which allows a high level description of Corticomorphic structures to be translated into the native microcode of the CNAPS chips. Corticomorphic structures are those neural structures with a form similar to that of the retina, the lateral geniculate nucleus or the visual cortex. This real time hardware system is designed to be shrunk into a volume compatible with air launched tactical missiles. Initial versions of the software and hardware have been completed and are in the early stages of integration with a missile seeker.

  14. Missileborne Artificial Vision System (MAVIS)

    NASA Technical Reports Server (NTRS)

    Andes, David K.; Witham, James C.; Miles, Michael D.

    1994-01-01

    Several years ago when INTEL and China Lake designed the ETANN chip, analog VLSI appeared to be the only way to do high density neural computing. In the last five years, however, digital parallel processing chips capable of performing neural computation functions have evolved to the point of rough equality with analog chips in system level computational density. The Naval Air Warfare Center, China Lake, has developed a real time, hardware and software system designed to implement and evaluate biologically inspired retinal and cortical models. The hardware is based on the Adaptive Solutions Inc. massively parallel CNAPS system COHO boards. Each COHO board is a standard size 6U VME card featuring 256 fixed point, RISC processors running at 20 MHz in a SIMD configuration. Each COHO board has a companion board built to support a real time VSB interface to an imaging seeker, a NTSC camera, and to other COHO boards. The system is designed to have multiple SIMD machines each performing different corticomorphic functions. The system level software has been developed which allows a high level description of corticomorphic structures to be translated into the native microcode of the CNAPS chips. Corticomorphic structures are those neural structures with a form similar to that of the retina, the lateral geniculate nucleus, or the visual cortex. This real time hardware system is designed to be shrunk into a volume compatible with air launched tactical missiles. Initial versions of the software and hardware have been completed and are in the early stages of integration with a missile seeker.

  15. Autonomous Underwater Vehicle Magnetic Mapping System

    NASA Astrophysics Data System (ADS)

    Steigerwalt, R.; Johnson, R. M.; Trembanis, A. C.; Schmidt, V. E.; Tait, G.

    2012-12-01

    An Autonomous Underwater Vehicle (AUV) Magnetic Mapping (MM) System has been developed and tested for military munitions detection as well as pipeline locating, wreck searches, and geologic surveys in underwater environments. The system comprises a high-sensitivity Geometrics G-880AUV cesium vapor magnetometer integrated with a Teledyne Gavia AUV and associated Doppler-enabled inertial navigation, further utilizing traditional acoustic bathymetric and side-scan imaging. All onboard sensors and associated electronics are managed through customized software to operate autonomously through the vehicle's primary control module. Total-field magnetic measurements are recorded with asynchronous time-stamped data logs that include position, altitude, heading, pitch, roll, and electrical current usage. Pre-planned mission information can be uploaded to the system by operators to define data collection metrics, including speed, height above seafloor, and lane or transect spacing, specifically designed to meet data quality objectives for the survey. As a result of the AUV's modular design, autonomous navigation, and rapid deployment capabilities, the AUV MM System provides cost savings over current surface vessel surveys by reducing the mobilization/demobilization effort, requiring less manpower for operation, and reducing or eliminating the need for a surface support vessel altogether. When the system completes its mission, data can be remotely downloaded via W-LAN and exported for use in advanced signal processing platforms. Magnetic compensation software has been concurrently developed to accept electrical current measurements directly from the AUV to address distortions from permanent and induced magnetization effects on the magnetometer. Maneuver and electrical current compensation terms can be extracted from the magnetic survey missions to perform automated post-process corrections. Considerable suppression of system noise has been observed over traditional
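
    The electrical-current compensation step amounts to regressing the measured field against onboard current draw and subtracting the fitted term. A scalar least-squares sketch with made-up survey samples (real compensation also models permanent and induced magnetization versus attitude):

```python
def fit_current_term(currents, fields):
    """Least-squares fit of B_measured = B0 + k * I, so corrected readings
    are B_measured - k * I. Pure-Python normal equations, scalar case."""
    n = len(currents)
    mi = sum(currents) / n
    mb = sum(fields) / n
    k = (sum((i - mi) * (b - mb) for i, b in zip(currents, fields))
         / sum((i - mi) ** 2 for i in currents))
    b0 = mb - k * mi
    return b0, k

# Hypothetical samples: true field 48000 nT plus 2.5 nT per amp of current draw.
I = [1.0, 2.0, 3.0, 4.0]
B = [48002.5, 48005.0, 48007.5, 48010.0]
b0, k = fit_current_term(I, B)
```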

  16. Visual Turing test for computer vision systems

    PubMed Central

    Geman, Donald; Geman, Stuart; Hallonquist, Neil; Younes, Laurent

    2015-01-01

    Today, computer vision systems are tested by their accuracy in detecting and localizing instances of objects. As an alternative, and motivated by the ability of humans to provide far richer descriptions and even tell a story about an image, we construct a “visual Turing test”: an operator-assisted device that produces a stochastic sequence of binary questions from a given test image. The query engine proposes a question; the operator either provides the correct answer or rejects the question as ambiguous; the engine proposes the next question (“just-in-time truthing”). The test is then administered to the computer-vision system, one question at a time. After the system’s answer is recorded, the system is provided the correct answer and the next question. Parsing is trivial and deterministic; the system being tested requires no natural language processing. The query engine employs statistical constraints, learned from a training set, to produce questions with essentially unpredictable answers—the answer to a question, given the history of questions and their correct answers, is nearly equally likely to be positive or negative. In this sense, the test is only about vision. The system is designed to produce streams of questions that follow natural story lines, from the instantiation of a unique object, through an exploration of its properties, and on to its relationships with other uniquely instantiated objects. PMID:25755262
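
    The query engine's "essentially unpredictable answers" criterion can be sketched as choosing the candidate question whose predicted probability of a positive answer, conditioned on the history, lies closest to 0.5. The toy probability estimators below are hypothetical stand-ins for models learned from a training set:

```python
def next_question(candidates, history):
    """Pick the candidate whose predicted P(yes | history) is closest to
    0.5, i.e. the maximally unpredictable question.
    `candidates` maps question text -> estimator taking the history."""
    return min(candidates, key=lambda q: abs(candidates[q](history) - 0.5))

# Toy estimators: fixed probabilities standing in for learned models.
candidates = {
    "is there a person in region A?": lambda h: 0.48,
    "is the sky visible?":            lambda h: 0.97,
    "is object 1 red?":               lambda h: 0.30,
}
q = next_question(candidates, history=[])
```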

  17. The MAP Autonomous Mission Control System

    NASA Technical Reports Server (NTRS)

    Breed, Julie; Coyle, Steven; Blahut, Kevin; Dent, Carolyn; Shendock, Robert; Rowe, Roger

    2000-01-01

    The Microwave Anisotropy Probe (MAP) mission is the second mission in NASA's Office of Space Science low-cost, Medium-class Explorers (MIDEX) program. The Explorers Program is designed to accomplish frequent, low cost, high quality space science investigations utilizing innovative, streamlined, efficient management, design and operations approaches. The MAP spacecraft will produce an accurate full-sky map of the cosmic microwave background temperature fluctuations with high sensitivity and angular resolution. The MAP spacecraft is planned for launch in early 2001, and will be staffed by only single-shift operations. During the rest of the time the spacecraft must be operated autonomously, with personnel available only on an on-call basis. Four (4) innovations will work cooperatively to enable a significant reduction in operations costs for the MAP spacecraft. First, the use of a common ground system for Spacecraft Integration and Test (I&T) as well as Operations. Second, the use of Finite State Modeling for intelligent autonomy. Third, the integration of a graphical planning engine to drive the autonomous systems without an intermediate manual step. And fourth, the ability for distributed operations via Web and pager access.

  18. Design of optimal correlation filters for hybrid vision systems

    NASA Technical Reports Server (NTRS)

    Rajan, Periasamy K.

    1990-01-01

Research is underway at the NASA Johnson Space Center on the development of vision systems that recognize objects and estimate their position by processing their images. This is a crucial task in many space applications such as autonomous landing on Mars sites, satellite inspection and repair, and docking of the space shuttle and space station. Currently available algorithms and hardware are too slow to be suitable for these tasks. Electronic digital hardware exhibits superior performance in computing and control; however, it takes too much time to carry out important signal processing operations such as Fourier transformation of image data and calculation of the correlation between two images. Fortunately, because of their inherent parallelism, optical devices can carry out these operations very fast, although they are not well suited to computation and control type operations. Hence, investigations are currently being conducted on the development of hybrid vision systems that utilize both optical techniques and digital processing jointly to carry out object recognition tasks in real time. Algorithms for the design of optimal filters for use in hybrid vision systems were developed. Specifically, an algorithm was developed for the design of real-valued frequency-plane correlation filters. Furthermore, research was also conducted on designing correlation filters optimal in the sense of providing maximum signal-to-noise ratio when noise is present in the detectors in the correlation plane. Algorithms were developed for the design of different types of optimal filters: complex filters, real-valued filters, phase-only filters, ternary-valued filters, and coupled filters. This report presents some of these algorithms in detail along with their derivations.
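    As a digital illustration of one of the filter types named above, a phase-only frequency-plane correlation filter can be implemented with FFTs. This NumPy sketch stands in for the optical correlator and is not the report's algorithm.

```python
import numpy as np

def phase_only_correlation(scene, reference):
    """Correlate a scene with a reference via a phase-only filter.

    The filter keeps only the phase of the reference spectrum, which
    sharpens the correlation peak compared with a classical matched filter.
    """
    S = np.fft.fft2(scene)
    R = np.fft.fft2(reference, s=scene.shape)
    H = np.conj(R) / (np.abs(R) + 1e-12)   # phase-only filter
    return np.real(np.fft.ifft2(S * H))

# A reference patch embedded in a larger scene: the correlation peak
# should appear at the patch's top-left corner.
rng = np.random.default_rng(0)
ref = rng.random((8, 8))
scene = np.zeros((32, 32))
scene[10:18, 5:13] = ref
peak = np.unravel_index(np.argmax(phase_only_correlation(scene, ref)),
                        scene.shape)
```

    In a hybrid system, the two FFTs and the product would be performed optically; the digital side would design the filter and read out the correlation plane.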

  19. Autonomic nervous system dysregulation in pediatric hypertension.

    PubMed

    Feber, Janusz; Ruzicka, Marcel; Geier, Pavel; Litwin, Mieczyslaw

    2014-05-01

Historically, primary hypertension (HTN) has been prevalent typically in adults. Recent data, however, suggest an increasing number of children diagnosed with primary HTN, mainly in the setting of obesity. One of the factors considered in the etiology of HTN is the autonomic nervous system, namely its dysregulation. In the past, the sympathetic nervous system (SNS) was regarded as a system engaged mostly in buffering major acute changes in blood pressure (BP) in response to physical and emotional stressors. Recent evidence suggests that the SNS plays a much broader role in the regulation of BP, including the development and maintenance of sustained HTN through a chronically elevated central sympathetic tone in adults and children with central/visceral obesity. Consequently, attempts have been made to reduce SNS hyperactivity in order to intervene early in the course of the disease and prevent HTN-related complications later in life.

  20. Seizures and brain regulatory systems: consciousness, sleep, and autonomic systems.

    PubMed

    Sedigh-Sarvestani, Madineh; Blumenfeld, Hal; Loddenkemper, Tobias; Bateman, Lisa M

    2015-06-01

    Research into the physiologic underpinnings of epilepsy has revealed reciprocal relationships between seizures and the activity of several regulatory systems in the brain. This review highlights recent progress in understanding and using the relationships between seizures and the arousal or consciousness system, the sleep-wake and associated circadian system, and the central autonomic network.

  1. Applications of Augmented Vision Head-Mounted Systems in Vision Rehabilitation

    PubMed Central

    Peli, Eli; Luo, Gang; Bowers, Alex; Rensing, Noa

    2007-01-01

    Vision loss typically affects either the wide peripheral vision (important for mobility), or central vision (important for seeing details). Traditional optical visual aids usually recover the lost visual function, but at a high cost for the remaining visual function. We have developed a novel concept of vision-multiplexing using augmented vision head-mounted display systems to address vision loss. Two applications are discussed in this paper. In the first, minified edge images from a head-mounted video camera are presented on a see-through display providing visual field expansion for people with peripheral vision loss, while still enabling the full resolution of the residual central vision to be maintained. The concept has been applied in daytime and nighttime devices. A series of studies suggested that the system could help with visual search, obstacle avoidance, and nighttime mobility. Subjects were positive in their ratings of device cosmetics and ergonomics. The second application is for people with central vision loss. Using an on-axis aligned camera and display system, central visibility is enhanced with 1:1 scale edge images, while still enabling the wide field of the unimpaired peripheral vision to be maintained. The registration error of the system was found to be low in laboratory testing. PMID:18172511
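    The minified-edge-image idea (detect edges in the camera frame, shrink them, and overlay them on a see-through display) can be sketched as follows; the gradient threshold and pooling scheme are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def minified_edge_overlay(frame, scale=4, threshold=0.2):
    """Gradient-magnitude edges of a camera frame, minified by an integer
    factor, for overlay on a see-through display (vision multiplexing)."""
    gy, gx = np.gradient(frame.astype(float))
    edges = (np.hypot(gx, gy) > threshold).astype(float)
    h, w = edges.shape
    # Minify with block-max pooling so thin edges survive the shrinking.
    cropped = edges[:h - h % scale, :w - w % scale]
    return cropped.reshape(h // scale, scale, w // scale, scale).max(axis=(1, 3))

frame = np.zeros((64, 64))
frame[:, 32:] = 1.0                      # a single vertical contrast edge
out = minified_edge_overlay(frame)
```

    With a 4x minification, a 92-degree camera field collapses onto a small central display region, while the wearer's residual vision still sees the scene at full scale through the display.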

  2. The autonomic nervous system and hypertension.

    PubMed

    Mancia, Giuseppe; Grassi, Guido

    2014-05-23

    Physiological studies have long documented the key role played by the autonomic nervous system in modulating cardiovascular functions and in controlling blood pressure values, both at rest and in response to environmental stimuli. Experimental and clinical investigations have tested the hypothesis that the origin, progression, and outcome of human hypertension are related to dysfunctional autonomic cardiovascular control and especially to abnormal activation of the sympathetic division. Here, we review the recent literature on the adrenergic and vagal abnormalities that have been reported in essential hypertension, with emphasis on their role as promoters and as amplifiers of the high blood pressure state. We also discuss the possible mechanisms underlying these abnormalities and their importance in the development and progression of the structural and functional cardiovascular damage that characterizes hypertension. Finally, we examine the modifications of sympathetic and vagal cardiovascular influences induced by current nonpharmacological and pharmacological interventions aimed at correcting elevations in blood pressure and restoring the normotensive state. © 2014 American Heart Association, Inc.

  3. Intelligent data reduction for autonomous power systems

    NASA Technical Reports Server (NTRS)

    Floyd, Stephen A.

    1988-01-01

Since 1984, Marshall Space Flight Center has been actively engaged in research and development concerning autonomous power systems. Much of the work in this domain has dealt with the development and application of knowledge-based or expert systems to perform tasks previously accomplished only through intensive human involvement. One such task is the health status monitoring of electrical power systems. Such monitoring is a manpower-intensive task which is vital to mission success. The Hubble Space Telescope testbed and its associated Nickel Cadmium Battery Expert System (NICBES) were designated as the system on which the initial proof of concept for intelligent power system monitoring would be established. The key function performed by an engineer engaged in system monitoring is to analyze the raw telemetry data and identify from the whole only those elements which can be considered significant. This function requires engineering expertise on the functionality of the system, the mode of operation, and the efficient and effective reading of the telemetry data. Application of this expertise to extract the significant components of the data is referred to as data reduction. Such a function possesses characteristics which make it a prime candidate for the application of knowledge-based systems technologies. Such applications are investigated and recommendations are offered for the development of intelligent data reduction systems.
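    A rule-based stand-in for the data-reduction function described above might keep only samples that leave a nominal band or change significantly since the last reported value; the thresholds and the stream below are hypothetical.

```python
def reduce_telemetry(samples, nominal, band, min_delta):
    """Keep only 'significant' telemetry samples: those outside the
    nominal band, or that changed by more than min_delta since the last
    kept value (a crude stand-in for engineering judgment)."""
    kept, last = [], None
    for t, v in samples:
        out_of_band = abs(v - nominal) > band
        changed = last is None or abs(v - last) > min_delta
        if out_of_band or changed:
            kept.append((t, v))
            last = v
    return kept

# Hypothetical battery-voltage stream: steady, a spike, then recovery.
stream = [(0, 28.0), (1, 28.1), (2, 28.2), (3, 31.5), (4, 28.0)]
kept = reduce_telemetry(stream, nominal=28.0, band=1.0, min_delta=0.5)
print(kept)   # → [(0, 28.0), (3, 31.5), (4, 28.0)]
```

    A knowledge-based system would replace the fixed thresholds with rules conditioned on operating mode and component state.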

  4. Progress in building a cognitive vision system

    NASA Astrophysics Data System (ADS)

    Benjamin, D. Paul; Lyons, Damian; Yue, Hong

    2016-05-01

    We are building a cognitive vision system for mobile robots that works in a manner similar to the human vision system, using saccadic, vergence and pursuit movements to extract information from visual input. At each fixation, the system builds a 3D model of a small region, combining information about distance, shape, texture and motion to create a local dynamic spatial model. These local 3D models are composed to create an overall 3D model of the robot and its environment. This approach turns the computer vision problem into a search problem whose goal is the acquisition of sufficient spatial understanding for the robot to succeed at its tasks. The research hypothesis of this work is that the movements of the robot's cameras are only those that are necessary to build a sufficiently accurate world model for the robot's current goals. For example, if the goal is to navigate through a room, the model needs to contain any obstacles that would be encountered, giving their approximate positions and sizes. Other information does not need to be rendered into the virtual world, so this approach trades model accuracy for speed.

  5. A Proposal of Autonomous Robotic Systems Educative Environment

    NASA Astrophysics Data System (ADS)

    Ierache, Jorge; Garcia-Martinez, Ramón; de Giusti, Armando

    This work presents our experiences in the implementation of a laboratory of autonomous robotic systems applied to the training of beginner and advanced students doing a degree course in Computer Engineering., taking into account the specific technologies, robots, autonomous toys, and programming languages. They provide a strategic opportunity for human resources formation by involving different aspects which range from the specification elaboration, modeling, software development and implementation and testing of an autonomous robotic system.

  6. Computation and design of autonomous intelligent systems

    NASA Astrophysics Data System (ADS)

    Fry, Robert L.

    2008-04-01

    This paper describes a theory of intelligent systems and its reduction to engineering practice. The theory is based on a broader theory of computation wherein information and control are defined within the subjective frame of a system. At its most primitive level, the theory describes what it computationally means to both ask and answer questions which, like traditional logic, are also Boolean. The logic of questions describes the subjective rules of computation that are objective in the sense that all the described systems operate according to its principles. Therefore, all systems are autonomous by construct. These systems include thermodynamic, communication, and intelligent systems. Although interesting, the important practical consequence is that the engineering framework for intelligent systems can borrow efficient constructs and methodologies from both thermodynamics and information theory. Thermodynamics provides the Carnot cycle which describes intelligence dynamics when operating in the refrigeration mode. It also provides the principle of maximum entropy. Information theory has recently provided the important concept of dual-matching useful for the design of efficient intelligent systems. The reverse engineered model of computation by pyramidal neurons agrees well with biology and offers a simple and powerful exemplar of basic engineering concepts.

  7. Autonomous Robot System for Sensor Characterization

    SciTech Connect

    David Bruemmer; Douglas Few; Frank Carney; Miles Walton; Heather Hunting; Ron Lujan

    2004-03-01

    This paper discusses an innovative application of new Markov localization techniques that combat the problem of odometry drift, allowing a novel control architecture developed at the Idaho National Engineering and Environmental Laboratory (INEEL) to be utilized within a sensor characterization facility developed at the Remote Sensing Laboratory (RSL) in Nevada. The new robotic capability provided by the INEEL will allow RSL to test and evaluate a wide variety of sensors including radiation detection systems, machine vision systems, and sensors that can detect and track heat sources (e.g. human bodies, machines, chemical plumes). By accurately moving a target at varying speeds along designated paths, the robotic solution allows the detection abilities of a wide variety of sensors to be recorded and analyzed.
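    Markov localization of the kind mentioned above maintains a probability distribution over positions and corrects odometry drift at every measurement. A minimal one-dimensional grid version, with made-up motion and sensor models:

```python
def markov_localize(belief, move_kernel, likelihood):
    """One predict/correct cycle of grid-based Markov localization.

    belief: prior probability per cell; move_kernel: odometry model
    (probability of actually moving k cells when commanded one);
    likelihood: P(observation | cell). Renormalizing after the
    correction step is what keeps odometry drift from accumulating.
    """
    n = len(belief)
    # Predict: convolve the belief with the motion model (circular world).
    predicted = [sum(belief[(i - k) % n] * p
                     for k, p in move_kernel.items()) for i in range(n)]
    # Correct: weight by the sensor likelihood, then normalize.
    posterior = [p * l for p, l in zip(predicted, likelihood)]
    total = sum(posterior)
    return [p / total for p in posterior]

# Robot commanded to move one cell; odometry may under- or over-shoot.
belief = [1.0, 0.0, 0.0, 0.0]          # certain it starts at cell 0
kernel = {0: 0.1, 1: 0.8, 2: 0.1}      # hypothetical odometry model
sensed = [0.1, 0.1, 0.7, 0.1]          # sensor mildly favors cell 2
new = markov_localize(belief, kernel, sensed)
```

    Even with the sensor favoring cell 2, the strong odometry prior keeps most of the probability mass on cell 1.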

  8. The nature of the autonomic dysfunction in multiple system atrophy

    NASA Technical Reports Server (NTRS)

    Parikh, Samir M.; Diedrich, Andre; Biaggioni, Italo; Robertson, David

    2002-01-01

    The concept that multiple system atrophy (MSA, Shy-Drager syndrome) is a disorder of the autonomic nervous system is several decades old. While there has been renewed interest in the movement disorder associated with MSA, two recent consensus statements confirm the centrality of the autonomic disorder to the diagnosis. Here, we reexamine the autonomic pathophysiology in MSA. Whereas MSA is often thought of as "autonomic failure", new evidence indicates substantial persistence of functioning sympathetic and parasympathetic nerves even in clinically advanced disease. These findings help explain some of the previously poorly understood features of MSA. Recognition that MSA entails persistent, constitutive autonomic tone requires a significant revision of our concepts of its diagnosis and therapy. We will review recent evidence bearing on autonomic tone in MSA and discuss their therapeutic implications, particularly in terms of the possible development of a bionic baroreflex for better control of blood pressure.

  10. Autonomous Formations of Multi-Agent Systems

    NASA Technical Reports Server (NTRS)

    Dhali, Sanjana; Joshi, Suresh M.

    2013-01-01

Autonomous formation control of multi-agent dynamic systems has a number of applications that include ground-based and aerial robots and satellite formations. For air vehicles, formation flight ("flocking") has the potential to significantly increase airspace utilization as well as fuel efficiency. This presentation addresses two main problems in multi-agent formations: optimal role assignment to minimize the total cost (e.g., combined distance traveled by all agents); and maintaining formation geometry during flock motion. The Kuhn-Munkres ("Hungarian") algorithm is used for optimal assignment, and a consensus-based leader-follower control architecture is used to maintain formation shape despite the leader's independent movements. The methods are demonstrated by animated simulations.
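    The optimal role-assignment problem is a linear assignment problem. For clarity, the sketch below brute-forces all permutations; the Kuhn-Munkres ("Hungarian") algorithm solves the same problem in O(n^3). The cost matrix is a hypothetical example.

```python
from itertools import permutations

def optimal_assignment(cost):
    """Minimum-total-cost assignment of agents to formation slots.

    Brute force over all permutations for clarity; the Kuhn-Munkres
    ("Hungarian") algorithm finds the same answer in polynomial time.
    """
    n = len(cost)
    best = min(permutations(range(n)),
               key=lambda p: sum(cost[i][p[i]] for i in range(n)))
    return list(best), sum(cost[i][best[i]] for i in range(n))

# Hypothetical distances from 3 agents to 3 formation slots.
cost = [[4, 1, 3],
        [2, 0, 5],
        [3, 2, 2]]
perm, total = optimal_assignment(cost)
print(perm, total)   # → [1, 0, 2] 5
```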

  12. A demonstration of autonomous navigation and machine vision using the HERMIES-IIB robot

    SciTech Connect

    Burks, B.L.; Barnett, D.L.; Jones, J.P.; Killough, S.M.

    1987-01-01

This paper describes advances to our mobile robot series (currently HERMIES-IIB): the addition of 8 on-board NCUBE processors (computationally equivalent to 8 VAX 11/780s) operating in parallel, and augmentation of the sensor suite with cameras to facilitate on-board vision analysis and goal finding. The essential capabilities of the expert system described in earlier papers have been ported to the on-board HERMIES-IIB computers, thereby eliminating off-board computation. A successful experiment is described in which a robot is placed in an initial arbitrary location without prior specification of the room contents, successfully discovers and navigates around stationary and moving obstacles, picks up and moves small obstacles, searches for a control panel, and reads the meters found on the panel. 19 refs., 5 figs.

  13. Mechanical deployment system on aries an autonomous mobile robot

    SciTech Connect

    Rocheleau, D.N.

    1995-12-01

ARIES (Autonomous Robotic Inspection Experimental System) is under development for the Department of Energy (DOE) to survey and inspect drums containing low-level radioactive waste stored in warehouses at DOE facilities. This paper focuses on the mechanical deployment system, referred to as the camera positioning system (CPS), used in the project. The CPS is used for positioning four identical but separate camera packages consisting of vision cameras and other required sensors such as bar-code readers and light stripe projectors. The CPS is attached to the top of a mobile robot and consists of two mechanisms. The first is a lift mechanism composed of 5 interlocking rail elements, which starts from a retracted position and extends upward to simultaneously position 3 separate camera packages to inspect the top three drums of a column of four drums. The second is a parallelogram four-bar mechanism (a special case of the Grashof linkage), which is used for positioning a camera package on drums on the floor. Both mechanisms are the subject of this paper, with the lift mechanism discussed in detail.

  14. Design and evaluation of an autonomous, obstacle avoiding, flight control system using visual sensors

    NASA Astrophysics Data System (ADS)

    Crawford, Bobby Grant

    In an effort to field smaller and cheaper Uninhabited Aerial Vehicles (UAVs), the Army has expressed an interest in an ability of the vehicle to autonomously detect and avoid obstacles. Current systems are not suitable for small aircraft. NASA Langley Research Center has developed a vision sensing system that uses small semiconductor cameras. The feasibility of using this sensor for the purpose of autonomous obstacle avoidance by a UAV is the focus of the research presented in this document. The vision sensor characteristics are modeled and incorporated into guidance and control algorithms designed to generate flight commands based on obstacle information received from the sensor. The system is evaluated by simulating the response to these flight commands using a six degree-of-freedom, non-linear simulation of a small, fixed wing UAV. The simulation is written using the MATLAB application and runs on a PC. Simulations were conducted to test the longitudinal and lateral capabilities of the flight control for a range of airspeeds, camera characteristics, and wind speeds. Results indicate that the control system is suitable for obstacle avoiding flight control using the simulated vision system. In addition, a method for designing and evaluating the performance of such a system has been developed that allows the user to easily change component characteristics and evaluate new systems through simulation.
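    A toy version of vision-based obstacle avoidance, steering away from the obstacle nearest the camera axis; the guidance law, field of view, and gains below are illustrative assumptions, not those of the thesis.

```python
def avoidance_command(bearings, fov=0.8, gain=1.5):
    """Lateral steering command from obstacle bearings (radians), as a
    forward-looking vision sensor might report them. Steer away from the
    obstacle nearest the camera axis, harder when it is more central."""
    threats = [b for b in bearings if abs(b) < fov]
    if not threats:
        return 0.0                      # nothing inside the field of view
    closest = min(threats, key=abs)
    direction = 1.0 if closest >= 0 else -1.0
    return -gain * (fov - abs(closest)) * direction

# An obstacle slightly right of center commands a left (negative) turn;
# one outside the field of view commands nothing.
cmd = avoidance_command([0.1, 1.2])
```

    In a simulation like the one described, this command would feed the lateral channel of the six-degree-of-freedom aircraft model each frame.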

  15. Networks for Autonomous Formation Flying Satellite Systems

    NASA Technical Reports Server (NTRS)

    Knoblock, Eric J.; Konangi, Vijay K.; Wallett, Thomas M.; Bhasin, Kul B.

    2001-01-01

The performance of three communications networks to support autonomous multi-spacecraft formation flying systems is presented. All systems are comprised of a ten-satellite formation arranged in a star topology, with one of the satellites designated as the central or "mother ship." All data is routed through the mother ship to the terrestrial network. The first system uses a TCP/IP over ATM protocol architecture within the formation; the second system uses the IEEE 802.11 protocol architecture within the formation; and the last system uses both of the previous architectures with a constellation of geosynchronous satellites serving as an intermediate point of contact between the formation and the terrestrial network. The simulations consist of file transfers using either the File Transfer Protocol (FTP) or the Simple Automatic File Exchange (SAFE) protocol. The results compare the IP queuing delay and IP processing delay at the mother ship, as well as the application-level round-trip time, for all systems. In all cases, using IEEE 802.11 within the formation yields less delay. Also, the throughput exhibited by SAFE is better than FTP.

  16. Autonomous remote monitoring system for landslides

    NASA Astrophysics Data System (ADS)

    Manetti, Luca; Terribilini, Andrea; Knecht, Alfredo

    2002-07-01

    There is a general tendency in systems for environmental monitoring towards ever more automatic and autonomous operation. Moreover, technologies and instruments are available to reliably interconnect distributed, disparate components. This allows the measurement, logging, data processing and interpretation activities to be carried out by separate units at different locations in near real-time. Building on the results of a previous research and development project at SUPSI, which focused on movement monitoring with GPS, the system has been generalized to accommodate a range of other sensors, thus rendering it even more interesting for geotechnical applications. In particular a laser distance meter and a robotized theodolite have been integrated. First results confirm an expected increase in robustness of the combined measurement network, which is particularly important in unfavorable stand-alone GPS reception conditions. Due to the modular architecture of the system, other sensor types, ranging from simple analog or digital sensors to complex measuring instruments may be supported with minimal effort. Measurements are transmitted via cellular or point-to-point radio links to a control station, which provides for post-processing and system management. The control station may be remotely accessed via an Internet connection. The system takes advantage of a standard and flexible database structure which has been tailored to measurement and monitoring projects using different sensors. The system represents an architecture for remote monitoring tasks requiring a high degree of autonomy, reliability and automation. The solution can be advantageously applied to remote, near real-time measurements of low dynamics movements.

  17. Synchronization in autonomous mercury beating heart systems.

    PubMed

    Verma, Dinesh Kumar; Singh, Harpartap; Contractor, A Q; Parmananda, P

    2014-07-03

The ability of the mercury beating heart (MBH) system to exhibit sustained mechanical and electrochemical activities simultaneously, without any external agent (fluctuating or constant), has attracted researchers for decades. The interplay of these activities could mimic biological phenomena such as a pulsating heart, which arises from coupled tissues exhibiting mechanical as well as electrical dynamics. In the present work, we have experimentally studied the dynamics of two and three electrically coupled autonomous MBH systems. A dynamical triangular (heart) shape, in the traditional watch glass geometry, was chosen for the experiments. It is found that the redox potentials (electrical behavior) of the quasi-identical (due to the inherent heterogeneities in the setup) MBH systems become synchronized at intermediate coupling strengths, whereas coherence in their mechanical activities occurs only at large coupling strengths. To the best of our knowledge, this synchronization phenomenon involving two distinct activities (electrical and mechanical) with different coupling thresholds has not been reported so far. Coherent mechanical activity means the simultaneous occurrence of compressions and expansions in the coupled Hg drops, which is shown using snapshots. In addition, redox time series are provided to demonstrate the synchronization in the electrical behavior of the MBH systems. Moreover, a mathematical framework considering only the electrical and mechanical components of the MBH systems is presented to validate the experimental finding that strong synchrony in the redox potentials of the MBH systems is a prerequisite for synchrony in their mechanical activities.
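    The reported behavior, locking only once coupling exceeds the mismatch between quasi-identical units, is generic for coupled oscillators. A minimal sketch with two diffusively coupled Kuramoto phase oscillators (not a model of the MBH electrochemistry):

```python
import math

def phase_locking(w1, w2, K, dt=0.001, steps=50000):
    """Integrate two phase oscillators with diffusive coupling K and
    return sin(phase difference) at the end. The pair phase-locks when
    the frequency mismatch |w1 - w2| is below 2K."""
    th1, th2 = 0.0, 1.0
    for _ in range(steps):
        d = th2 - th1
        th1 += (w1 + K * math.sin(d)) * dt
        th2 += (w2 - K * math.sin(d)) * dt
    return math.sin(th2 - th1)

# Below threshold (2K = 0.1 < 0.2) the phases drift; above it (2K = 1.0)
# they lock with sin(difference) = mismatch / 2K = 0.2.
weak = phase_locking(1.0, 1.2, K=0.05)
strong = phase_locking(1.0, 1.2, K=0.5)
```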

  18. Distributed Learning and Information Dynamics In Networked Autonomous Systems

    DTIC Science & Technology

    2015-11-20

Report AFRL-AFOSR-VA-TR-2015-0387 (MURI-09), Georgia Tech Research Corporation; principal investigator Eric Feron; period of performance 2009 to June 30, 2015. The project studied how teams of autonomous vehicles can learn and adapt to uncertain and hostile environments while making effective use of communications resources.

  19. From Automation to Autonomy-Trends Towards Autonomous Combat Systems

    DTIC Science & Technology

    2000-04-01

Defense Technical Information Center compilation part notice ADP010300. Title: "From Automation to Autonomy - Trends Towards Autonomous Combat Systems," part of the compilation "Advances in Vehicle Systems Concepts and Integration" [les Avancees en concepts systemes pour vehicules et en integration], comprising part numbers ADP010300 through ADP010339.

  20. Afghanistan as a Federal System with Autonomous Regions

    DTIC Science & Technology

    2009-05-21

This thesis argues that Afghanistan should be governed utilizing a federal system with strong autonomous areas. It begins with a discussion of the modern history of Afghanistan, focusing on its governmental history.

  1. Modular control systems for teleoperated and autonomous mobile robots

    NASA Astrophysics Data System (ADS)

    Kadonoff, Mark B.; Parish, David W.

    1995-01-01

    This paper will discuss components of a modular hardware and software architecture for mobile robots that supports both teleoperation and autonomous control. The Modular Autonomous Robot System architecture enables rapid development of control systems for unmanned vehicles for a wide variety of commercial and military applications.

  2. Multi-channel automotive night vision system

    NASA Astrophysics Data System (ADS)

    Lu, Gang; Wang, Li-jun; Zhang, Yi

    2013-09-01

A four-channel automotive night vision system is designed and developed. It consists of four active near-infrared cameras and a multi-channel image-processing display unit; the cameras are placed at the front, left, right, and rear of the automobile. The system uses a near-infrared laser light source with a collimated beam; the source contains a thermoelectric cooler (TEC), can be synchronized with camera focusing, and provides automatic light-intensity adjustment, thus ensuring image quality. The composition principle of the system is described in detail; on this basis, beam collimation, LD driving and LD temperature control of the near-infrared laser source, and the four-channel image-processing display are discussed. The system can be used for driver assistance, blind-spot information (BLIS), parking assistance, and alarm systems, day and night.

  3. Digital Autonomous Terminal Access Communication (DATAC) system

    NASA Technical Reports Server (NTRS)

    Novacki, Stanley M., III

    1987-01-01

    In order to accommodate the increasing number of computerized subsystems aboard today's more fuel efficient aircraft, the Boeing Co. has developed the DATAC (Digital Autonomous Terminal Access Control) bus to minimize the need for point-to-point wiring to interconnect these various systems, thereby reducing total aircraft weight and maintaining an economical flight configuration. The DATAC bus is essentially a local area network providing interconnections for any of the flight management and control systems aboard the aircraft. The task of developing a Bus Monitor Unit was broken down into four subtasks: (1) providing a hardware interface between the DATAC bus and the Z8000-based microcomputer system to be used as the bus monitor; (2) establishing a communication link between the Z8000 system and a CP/M-based computer system; (3) generation of data reduction and display software to output data to the console device; and (4) development of a DATAC Terminal Simulator to facilitate testing of the hardware and software which transfer data between the DATAC's bus and the operator's console in a near real time environment. These tasks are briefly discussed.

  4. Autonomous photovoltaic-diesel power system design

    NASA Astrophysics Data System (ADS)

    Calloway, T. M.

    A methodology for designing an autonomous photovoltaic power system in conjunction with a diesel-fueled electric generator and a battery has been developed. Any photovoltaic array energy not utilized immediately by the load is stored in the battery bank. The diesel generator set is operated periodically at 14-day intervals to ensure its availability and occasionally as needed during winter to supplement combined output from the array and battery. It is hypothesized that logistical support is infrequent, so the hybrid photovoltaic-diesel power system is designed to consume only 10% as much fuel as would a diesel-only system. This constraint is used to generate a set of possible combinations of array area and battery energy storage capacity. For each combination, a battery-life model predicts the time interval between battery replacements by deducting the fraction of total life consumed each day. An economic model then produces life-cycle system cost. Repeating this process for different combinations of array area and battery capacity identifies the minimum-cost system design.
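    The sizing loop described above (enumerate array/battery combinations, keep those meeting the 10% fuel constraint, pick the cheapest) can be sketched with a deliberately crude energy and cost model; every coefficient below is a hypothetical placeholder, not a value from the paper.

```python
def cheapest_design(areas, batteries, daily_load=50.0):
    """Enumerate (array area, battery capacity) combinations, keep those
    whose simplified daily fuel use is at most 10% of a diesel-only
    system, and return the life-cycle-cheapest as (cost, area, battery).
    All coefficients are hypothetical placeholders."""
    diesel_only_fuel = daily_load              # fuel proxy, diesel alone
    best = None
    for a in areas:
        for b in batteries:
            solar = min(a * 0.15 * 5.0, daily_load)   # kWh/day from array
            shortfall = max(daily_load - solar, 0.0)
            fuel = max(shortfall - b * 0.2, 0.0)      # battery covers part
            if fuel > 0.10 * diesel_only_fuel:
                continue                               # violates constraint
            cost = 300 * a + 150 * b                   # life-cycle proxy
            if best is None or cost < best[0]:
                best = (cost, a, b)
    return best

print(cheapest_design([40, 60, 80], [0, 50, 100]))   # → (18000, 60, 0)
```

    The paper's methodology additionally feeds each candidate through a battery-life model before costing it; here battery replacement is folded into the flat cost coefficient.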

  5. Autonomous Control of Space Reactor Systems

    SciTech Connect

    Belle R. Upadhyaya; K. Zhao; S.R.P. Perillo; Xiaojia Xu; M.G. Na

    2007-11-30

Autonomous and semi-autonomous control is a key element of space reactor design in order to meet the mission requirements of safety, reliability, survivability, and life expectancy. In terrestrial nuclear power plants, human operators are available to perform intelligent control functions that are necessary for both normal and abnormal operational conditions.

  6. Systems Architecture for Fully Autonomous Space Missions

    NASA Technical Reports Server (NTRS)

    Esper, Jamie; Schnurr, R.; VanSteenberg, M.; Brumfield, Mark (Technical Monitor)

    2002-01-01

    The NASA Goddard Space Flight Center is working to develop a revolutionary new system architecture concept in support of fully autonomous missions. As part of GSFC's contribution to the New Millennium Program (NMP) Space Technology 7 Autonomy and on-Board Processing (ST7-A) Concept Definition Study, the system incorporates the latest commercial Internet and software development ideas and extends them into NASA ground and space segment architectures. The unique challenges facing the exploration of remote and inaccessible locales, and the need to incorporate the corresponding autonomy technologies at reasonable cost, necessitate a re-thinking of traditional mission architectures. A measure of the resiliency of this architecture across a broad range of future autonomy missions will be its effectiveness in leveraging commercial tools developed for the personal computer and Internet markets. Specialized test stations and supporting software become a thing of the past as spacecraft take advantage of the extensive tools and research investments of billion-dollar commercial ventures. The projected improvements of the Internet and supporting infrastructure go hand-in-hand with market pressures that provide continuity in research. By taking advantage of consumer-oriented methods and processes, space-flight missions will continue to leverage investments tailored to provide better services at reduced cost. The application of ground and space segment architectures each based on Local Area Networks (LAN), the use of personal computer-based operating systems, and the execution of activities and operations through a Wide Area Network (Internet) enable a revolution in spacecraft mission formulation, implementation, and flight operations. Hardware and software design, development, integration, test, and flight operations are all tied closely to a common thread that enables smooth transitioning between program phases. The application of commercial software

  8. Ball stud inspection system using machine vision.

    PubMed

    Shin, Dongik; Han, Changsoo; Moon, Young Shik

    2002-01-01

    In this paper, a vision-based inspection system that measures the dimensions of a ball stud is designed and implemented. The system acquires silhouetted images by backlighting and extracts the outlines of the nearly dichotomized images with subpixel accuracy. The sets of boundary data are modeled with reasonable geometric primitives, and the parameters of the models are estimated in a manner that minimizes error. Jig-fixtures and servo systems for the inspection are also contrived. The system rotates an inspected object so as to recognize objects in three-dimensional space rather than on a plane, and moves the object vertically so that it may take several pictures of different parts of the object, improving measurement resolution. The performance of the system is evaluated by measuring the dimensions of a standard ball, a standard cylinder, and a ball stud.
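    The "geometric primitives fitted by error minimization" step can be illustrated with a least-squares circle fit (the Kasa algebraic method) to boundary points; the paper's actual primitives and estimator are not specified here, so this is only a representative sketch on synthetic data:

```python
import numpy as np

# Fit a circle (one plausible primitive for the ball) to extracted
# boundary points by linear least squares (Kasa method). The boundary
# data below are synthetic, standing in for subpixel edge extraction.

def fit_circle(xs, ys):
    """Return (cx, cy, r) minimizing the algebraic circle-fit error."""
    A = np.column_stack([2 * xs, 2 * ys, np.ones_like(xs)])
    b = xs**2 + ys**2
    (cx, cy, c), *_ = np.linalg.lstsq(A, b, rcond=None)
    r = np.sqrt(c + cx**2 + cy**2)
    return cx, cy, r

theta = np.linspace(0, 2 * np.pi, 50, endpoint=False)
xs = 3.0 + 5.0 * np.cos(theta)      # synthetic boundary of a ball of radius 5
ys = -1.0 + 5.0 * np.sin(theta)
cx, cy, r = fit_circle(xs, ys)
```

On noiseless points the fit recovers the center and radius exactly; with real edge data the same least-squares solve gives the error-minimizing parameters.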

  9. 360 degree vision system: opportunities in transportation

    NASA Astrophysics Data System (ADS)

    Thibault, Simon

    2007-09-01

    Panoramic technologies are experiencing new and exciting opportunities in the transportation industries. The advantages of panoramic imagers are numerous: increased area coverage with fewer cameras, imaging of multiple targets simultaneously, instantaneous full-horizon detection, easier integration of various applications on the same imager, and others. This paper reports our work on panomorph optics and their potential usage in transportation applications. The novel panomorph lens is a new type of high-resolution panoramic imager well suited to the transportation industries. The panomorph lens uses optimization techniques to improve the performance of a customized optical system for specific applications. By adding a custom angle-to-pixel relation at the optical design stage, the optical system provides an ideal image coverage designed to reduce and optimize the processing. The optics can be customized for the visible, near-infrared (NIR), or infrared (IR) wavebands. The panomorph lens is designed to optimize the cost per pixel, which is particularly important in the IR. We discuss the use of the 360-degree vision system, which can enhance onboard collision avoidance systems, intelligent cruise controls, and parking assistance. 360-degree panoramic vision systems might enable safer highways and a significant reduction in casualties.

  10. Unified formalism for higher order non-autonomous dynamical systems

    NASA Astrophysics Data System (ADS)

    Prieto-Martínez, Pedro Daniel; Román-Roy, Narciso

    2012-03-01

    This work is devoted to giving a geometric framework for describing higher order non-autonomous mechanical systems. The starting point is to extend the Lagrangian-Hamiltonian unified formalism of Skinner and Rusk for these kinds of systems, generalizing previous developments for higher order autonomous mechanical systems and first-order non-autonomous mechanical systems. Then, we use this unified formulation to derive the standard Lagrangian and Hamiltonian formalisms, including the Legendre-Ostrogradsky map and the Euler-Lagrange and the Hamilton equations, both for regular and singular systems. As applications of our model, two examples of regular and singular physical systems are studied.
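    For concreteness, in the simplest higher-order case (a second-order, autonomous Lagrangian L(q, q̇, q̈); this is the standard textbook specialization, not the paper's general k-th order, non-autonomous statement), the Legendre-Ostrogradsky map and the resulting equations read:

```latex
% Ostrogradsky momenta for L(q,\dot q,\ddot q)
p_{(1)} = \frac{\partial L}{\partial \dot q}
          - \frac{d}{dt}\frac{\partial L}{\partial \ddot q},
\qquad
p_{(2)} = \frac{\partial L}{\partial \ddot q},
% fourth-order Euler--Lagrange equation
\frac{\partial L}{\partial q}
 - \frac{d}{dt}\frac{\partial L}{\partial \dot q}
 + \frac{d^2}{dt^2}\frac{\partial L}{\partial \ddot q} = 0,
% Hamiltonian obtained through the Legendre--Ostrogradsky map
H = p_{(1)}\,\dot q + p_{(2)}\,\ddot q - L .
```

The Skinner-Rusk unified formalism recovers both the Lagrangian and Hamiltonian sides of these equations from a single first-order presentation.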

  11. [Neuropeptide Y and autonomic nervous system].

    PubMed

    Nozdrachev, A D; Masliukov, P M

    2011-01-01

    Neuropeptide Y (NPY), a peptide of 36 amino acid residues, is widely distributed in the central and peripheral nervous system. NPY and its receptors play an extremely diverse role in the nervous system, including the regulation of satiety, emotional state, vascular tone, and gastrointestinal secretion. In mammals, NPY has been revealed in the majority of sympathetic ganglion neurons, in a high number of neurons of parasympathetic cranial ganglia, as well as in intramural ganglia of the metasympathetic nervous system. At present, six types of NPY receptors (Y1-Y6) have been identified, all belonging to the family of G protein-coupled receptors. The action of NPY on peripheral target organs is predominantly realized through postsynaptic Y1 and Y3-Y5 receptors and presynaptic Y2 receptors. NPY is present in large electron-dense vesicles and is released at high-frequency stimulation. NPY affects not only vascular tone, heart rate and contractility, and gastrointestinal motility and secretion, but also has a trophic effect, producing proliferation of target-organ cells, specifically of vessels, myocardium, and adipose tissue. In early postnatal ontogenesis, the percentage of NPY-containing neurons in ganglia of the autonomic nervous system increases; in adult organisms, this parameter decreases. This seems to be connected with the trophic effect of NPY on target cells as well as with regulation of their functional state.

  12. Closed-loop autonomous docking system

    NASA Technical Reports Server (NTRS)

    Dabney, Richard W. (Inventor); Howard, Richard T. (Inventor)

    1992-01-01

    An autonomous docking system is provided which produces commands for the steering and propulsion system of a chase vehicle used in the docking of that chase vehicle with a target vehicle. The docking system comprises a passive optical target affixed to the target vehicle and comprising three reflective areas, including a central area mounted on a short post, and tracking sensor and process controller apparatus carried by the chase vehicle. The latter apparatus comprises a laser diode array for illuminating the target so as to cause light to be reflected from the reflective areas of the target; a sensor for detecting the light reflected from the target and for producing an electrical output signal in accordance with an image of the reflected light; a signal processor for processing the electrical output signal and for producing, based thereon, output signals relating to the relative range, roll, pitch, yaw, azimuth, and elevation of the chase and target vehicles; and a docking process controller, responsive to the output signals produced by the signal processor, for producing command signals for controlling the steering and propulsion system of the chase vehicle.

  13. Collaborative Autonomous Unmanned Aerial - Ground Vehicle Systems for Field Operations

    DTIC Science & Technology

    2007-08-31

    used on UGV for stereo vision); • Microstrain 3DM-G IMU; • 2 GB of 333 MHz RAM; • 4-channel frame grabber; • Superstar-2 GPS receiver; • Servo... The fuzzy EKF is used to fuse information acquired from the UGV odometer, stereo vision system, and laser range finder in order to estimate the... used to update the measurement covariance matrix of the EKF. Artificial landmarks are recognized by the stereo vision system and distances between the

  14. Vision-based obstacle recognition system for automated lawn mower robot development

    NASA Astrophysics Data System (ADS)

    Mohd Zin, Zalhan; Ibrahim, Ratnawati

    2011-06-01

    Digital image processing (DIP) techniques have recently been widely used in various types of applications. Classification and recognition of a specific object using a vision system involve challenging tasks in the fields of image processing and artificial intelligence. The ability and efficiency of a vision system to capture and process images are very important for any intelligent system such as an autonomous robot. This paper gives attention to the development of a vision system that could contribute to the development of an automated vision-based lawn mower robot. The work involves the implementation of DIP techniques to detect and recognize three different types of obstacles that usually exist on a football field. Focus was placed on studying different types and sizes of obstacles, developing the vision-based obstacle recognition system, and evaluating the system's performance. Image processing techniques such as image filtering, segmentation, enhancement, and edge detection have been applied in the system. The results have shown that the developed system is able to detect and recognize various types of obstacles on a football field with a recognition rate of more than 80%.
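    As an illustration of the pipeline named above (filtering, segmentation, edge detection), here is a minimal NumPy-only sketch; the kernels, threshold, and test image are assumptions, not the authors' parameters:

```python
import numpy as np

# Toy DIP pipeline: smoothing -> threshold segmentation -> Sobel edges.
# The 16x16 "bright square" stands in for an obstacle on the field.

def box_blur(img):
    """3x3 mean filter with edge padding."""
    p = np.pad(img, 1, mode='edge')
    return sum(p[i:i + img.shape[0], j:j + img.shape[1]]
               for i in range(3) for j in range(3)) / 9.0

def segment(img, thresh):
    """Binary obstacle mask: 1 = candidate obstacle pixel."""
    return (img > thresh).astype(np.uint8)

def sobel_edges(img):
    """Gradient magnitude from horizontal and vertical Sobel kernels."""
    kx = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], float)
    p = np.pad(img, 1, mode='edge')
    gx = sum(kx[i, j] * p[i:i + img.shape[0], j:j + img.shape[1]]
             for i in range(3) for j in range(3))
    gy = sum(kx.T[i, j] * p[i:i + img.shape[0], j:j + img.shape[1]]
             for i in range(3) for j in range(3))
    return np.hypot(gx, gy)

img = np.zeros((16, 16)); img[4:12, 4:12] = 1.0   # bright square "obstacle"
mask = segment(box_blur(img), 0.5)
edges = sobel_edges(img)
```

A recognition stage would then classify the segmented regions, e.g. by shape or size features.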

  15. Synthetic Vision Systems - Operational Considerations Simulation Experiment

    NASA Technical Reports Server (NTRS)

    Kramer, Lynda J.; Williams, Steven P.; Bailey, Randall E.; Glaab, Louis J.

    2007-01-01

    Synthetic vision is a computer-generated image of the external scene topography that is generated from aircraft attitude, high-precision navigation information, and data of the terrain, obstacles, cultural features, and other required flight information. A synthetic vision system (SVS) enhances this basic functionality with real-time integrity to ensure the validity of the databases, perform obstacle detection and independent navigation accuracy verification, and provide traffic surveillance. Over the last five years, NASA and its industry partners have developed and deployed SVS technologies for commercial, business, and general aviation aircraft which have been shown to provide significant improvements in terrain awareness and reductions in the potential for Controlled-Flight-Into-Terrain incidents/accidents compared to current generation cockpit technologies. It has been hypothesized that SVS displays can greatly improve the safety and operational flexibility of flight in Instrument Meteorological Conditions (IMC) to a level comparable to clear-day Visual Meteorological Conditions (VMC), regardless of actual weather conditions or time of day. An experiment was conducted to evaluate SVS and SVS-related technologies as well as the influence of where the information is provided to the pilot (e.g., on a Head-Up or Head-Down Display) for consideration in defining landing minima based upon aircraft and airport equipage. The "operational considerations" evaluated under this effort included reduced visibility, decision altitudes, and airport equipage requirements, such as approach lighting systems, for SVS-equipped aircraft. Subjective results from the present study suggest that synthetic vision imagery on both head-up and head-down displays may offer benefits in situation awareness; workload; and approach and landing performance in the visibility levels, approach lighting systems, and decision altitudes tested.

  16. Synthetic vision systems: operational considerations simulation experiment

    NASA Astrophysics Data System (ADS)

    Kramer, Lynda J.; Williams, Steven P.; Bailey, Randall E.; Glaab, Louis J.

    2007-04-01

    Synthetic vision is a computer-generated image of the external scene topography that is generated from aircraft attitude, high-precision navigation information, and data of the terrain, obstacles, cultural features, and other required flight information. A synthetic vision system (SVS) enhances this basic functionality with real-time integrity to ensure the validity of the databases, perform obstacle detection and independent navigation accuracy verification, and provide traffic surveillance. Over the last five years, NASA and its industry partners have developed and deployed SVS technologies for commercial, business, and general aviation aircraft which have been shown to provide significant improvements in terrain awareness and reductions in the potential for Controlled-Flight-Into-Terrain incidents / accidents compared to current generation cockpit technologies. It has been hypothesized that SVS displays can greatly improve the safety and operational flexibility of flight in Instrument Meteorological Conditions (IMC) to a level comparable to clear-day Visual Meteorological Conditions (VMC), regardless of actual weather conditions or time of day. An experiment was conducted to evaluate SVS and SVS-related technologies as well as the influence of where the information is provided to the pilot (e.g., on a Head-Up or Head-Down Display) for consideration in defining landing minima based upon aircraft and airport equipage. The "operational considerations" evaluated under this effort included reduced visibility, decision altitudes, and airport equipage requirements, such as approach lighting systems, for SVS-equipped aircraft. Subjective results from the present study suggest that synthetic vision imagery on both head-up and head-down displays may offer benefits in situation awareness; workload; and approach and landing performance in the visibility levels, approach lighting systems, and decision altitudes tested.

  17. A bio-inspired apposition compound eye machine vision sensor system.

    PubMed

    Davis, J D; Barrett, S F; Wright, C H G; Wilcox, M

    2009-12-01

    The Wyoming Information, Signal Processing, and Robotics Laboratory is developing a wide variety of bio-inspired vision sensors. We are interested in exploring the vision system of various insects and adapting some of their features toward the development of specialized vision sensors. We do not attempt to supplant traditional digital imaging techniques but rather develop sensor systems tailor-made for the application at hand. We envision that many applications may require a hybrid approach using conventional digital imaging techniques enhanced with bio-inspired analogue sensors. In this specific project, we investigated the apposition compound eye and its characteristics commonly found in diurnal insects and certain species of arthropods. We developed and characterized an array of apposition compound eye-type sensors and tested them on an autonomous robotic vehicle. The robot exhibits the ability to follow a pre-defined target and avoid specified obstacles using a simple control algorithm.
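    The abstract does not specify the "simple control algorithm"; one plausible toy version, with invented sensor channels and thresholds, steers toward the eye element with the strongest target response unless an obstacle channel crosses a threshold:

```python
# Hypothetical steering rule for an array of compound-eye elements.
# Each element i reports a target response and an obstacle response;
# the channel layout, threshold, and rule are assumptions, not the
# authors' design.

def steer(target, obstacle, thresh=0.8):
    """Turn command in [-1, 1]; negative steers left, positive right."""
    n = len(target)
    bearings = [2 * i / (n - 1) - 1 for i in range(n)]   # element look directions
    worst = max(range(n), key=lambda i: obstacle[i])
    if obstacle[worst] > thresh:                         # avoidance has priority
        return -bearings[worst] if bearings[worst] != 0 else 1.0
    best = max(range(n), key=lambda i: target[i])
    return bearings[best]
```

With five elements, a target seen by the rightmost element yields a full right turn, while a near obstacle on the right forces a left turn instead.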

  18. Real-time Enhanced Vision System

    NASA Technical Reports Server (NTRS)

    Hines, Glenn D.; Rahman, Zia-Ur; Jobson, Daniel J.; Woodell, Glenn A.; Harrah, Steven D.

    2005-01-01

    Flying in poor visibility conditions, such as rain, snow, fog or haze, is inherently dangerous. However these conditions can occur at nearly any location, so inevitably pilots must successfully navigate through them. At NASA Langley Research Center (LaRC), under support of the Aviation Safety and Security Program Office and the Systems Engineering Directorate, we are developing an Enhanced Vision System (EVS) that combines image enhancement and synthetic vision elements to assist pilots flying through adverse weather conditions. This system uses a combination of forward-looking infrared and visible sensors for data acquisition. A core function of the system is to enhance and fuse the sensor data in order to increase the information content and quality of the captured imagery. These operations must be performed in real-time for the pilot to use while flying. For image enhancement, we are using the LaRC patented Retinex algorithm since it performs exceptionally well for improving low-contrast range imagery typically seen during poor visibility conditions. In general, real-time operation of the Retinex requires specialized hardware. To date, we have successfully implemented a single-sensor real-time version of the Retinex on several different Digital Signal Processor (DSP) platforms. In this paper we give an overview of the EVS and its performance requirements for real-time enhancement and fusion and we discuss our current real-time Retinex implementations on DSPs.
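    For orientation, the core of a single-scale Retinex (the textbook form of the enhancement family named above, not NASA LaRC's patented algorithm or its DSP implementation) is the log-difference between an image and its blurred surround:

```python
import numpy as np

# Minimal single-scale Retinex sketch on a synthetic low-contrast image.
# Kernel size, sigma, and the test image are illustrative assumptions.

def gaussian_kernel(sigma, radius):
    x = np.arange(-radius, radius + 1)
    k = np.exp(-x**2 / (2 * sigma**2))
    return k / k.sum()

def blur(img, sigma=2.0, radius=6):
    """Separable Gaussian blur: convolve rows, then columns."""
    k = gaussian_kernel(sigma, radius)
    out = np.apply_along_axis(lambda r: np.convolve(r, k, mode='same'), 1, img)
    return np.apply_along_axis(lambda c: np.convolve(c, k, mode='same'), 0, out)

def single_scale_retinex(img, eps=1e-6):
    """Output = log(image) - log(blurred surround)."""
    return np.log(img + eps) - np.log(blur(img) + eps)

img = np.ones((32, 32)) * 0.2
img[8:24, 8:24] = 0.8                      # low-contrast bright patch
out = single_scale_retinex(img)
```

Multi-scale Retinex averages this output over several surround scales; the real-time versions discussed in the paper map such filtering onto DSP hardware.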

  19. Active State Model for Autonomous Systems

    NASA Technical Reports Server (NTRS)

    Park, Han; Chien, Steve; Zak, Michail; James, Mark; Mackey, Ryan; Fisher, Forest

    2003-01-01

    The active state model (ASM) is an architectural concept for the development of advanced integrated fault-detection-and-isolation (FDI) systems for robotic land vehicles, pilotless aircraft, exploratory spacecraft, or other complex engineering systems that will be capable of autonomous operation. An FDI system based on the ASM concept would not only provide traditional diagnostic capabilities, but would also integrate the FDI subsystems under a unified framework and provide a mechanism for sharing information between them to fully assess the overall health of the system. The ASM concept begins with definitions borrowed from psychology, wherein a system is regarded as active when it possesses self-image, self-awareness, and an ability to make its own decisions, such that it is able to perform purposeful motions and other transitions with some degree of autonomy from the environment. For an engineering system, self-image would manifest itself as the ability to determine nominal values of sensor data by use of a mathematical model of itself, and self-awareness would manifest itself as the ability to relate sensor data to their nominal values. The ASM for such a system may start with the closed-loop control dynamics that describe the evolution of state variables. As soon as this model is supplemented with nominal values of sensor data, it possesses self-image; the ability to process the current sensor data and compare them with the nominal values represents self-awareness. On the basis of self-image and self-awareness, the ASM provides the capability for self-identification, detection of abnormalities, and self-diagnosis.
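    The self-image/self-awareness pairing reduces, in schematic form, to model-predicted nominal values plus residual checking; the sensors, model, and tolerance below are hypothetical, not JPL's implementation:

```python
# Schematic illustration of the ASM idea: a model predicts nominal
# sensor values (self-image) and live readings are checked against
# them (self-awareness). All sensors and constants are made up.

NOMINAL_TOL = 0.1    # assumed 10% fractional tolerance

def self_image(state):
    """Self-image: model-predicted nominal sensor values for a commanded state."""
    return {'wheel_rpm': 10.0 * state['throttle'],
            'motor_temp_c': 20.0 + 15.0 * state['throttle']}

def self_awareness(state, readings, tol=NOMINAL_TOL):
    """Self-awareness: flag sensors deviating from their nominal values."""
    nominal = self_image(state)
    return [name for name, value in readings.items()
            if abs(value - nominal[name]) > tol * abs(nominal[name])]

faults = self_awareness({'throttle': 0.5},
                        {'wheel_rpm': 5.02, 'motor_temp_c': 41.0})
```

Here the wheel speed matches its nominal value but the motor temperature does not, so only the temperature sensor is flagged for diagnosis.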

  20. Knowledge-based machine vision systems for space station automation

    NASA Technical Reports Server (NTRS)

    Ranganath, Heggere S.; Chipman, Laure J.

    1989-01-01

    Computer vision techniques which have the potential for use on the space station and related applications are assessed. A knowledge-based vision system (expert vision system) and the development of a demonstration system for it are described. This system implements some of the capabilities that would be necessary in a machine vision system for the robot arm of the laboratory module in the space station. A Perceptics 9200e image processor, on a host VAXstation, was used to develop the demonstration system. In order to use realistic test images, photographs of actual space shuttle simulator panels were used. The system's capabilities of scene identification and scene matching are discussed.

  1. Evaluation of stereo vision obstacle detection algorithms for off-road autonomous navigation

    NASA Technical Reports Server (NTRS)

    Rankin, Arturo; Huertas, Andres; Matthies, Larry

    2005-01-01

    Reliable detection of non-traversable hazards is a key requirement for off-road autonomous navigation. A detailed description of each obstacle detection algorithm and its performance on the surveyed obstacle course is presented in this paper.

  2. APDS: The Autonomous Pathogen Detection System

    SciTech Connect

    Hindson, B; Makarewicz, A; Setlur, U; Henderer, B; McBride, M; Dzenitis, J

    2004-10-04

    We have developed and tested a fully autonomous pathogen detection system (APDS) capable of continuously monitoring the environment for airborne biological threat agents. The system was developed to provide early warning to civilians in the event of a bioterrorism incident and can be used at high profile events for short-term, intensive monitoring or in major public buildings or transportation nodes for long-term monitoring. The APDS is completely automated, offering continuous aerosol sampling, in-line sample preparation fluidics, multiplexed detection and identification immunoassays, and nucleic-acid based polymerase chain reaction (PCR) amplification and detection. Highly multiplexed antibody-based and duplex nucleic acid-based assays are combined to reduce false positives to a very low level, lower reagent costs, and significantly expand the detection capabilities of this biosensor. This article provides an overview of the current design and operation of the APDS. Certain sub-components of the APDS are described in detail, including the aerosol collector, the automated sample preparation module that performs multiplexed immunoassays with confirmatory PCR, and the data monitoring and communications system. Data obtained from an APDS that operated continuously for seven days in a major U.S. transportation hub are reported.

  3. Lightweight autonomous chemical identification system (LACIS)

    NASA Astrophysics Data System (ADS)

    Lozos, George; Lin, Hai; Burch, Timothy

    2012-06-01

    Smiths Detection and Intelligent Optical Systems have developed prototypes for the Lightweight Autonomous Chemical Identification System (LACIS) for the US Department of Homeland Security. LACIS is to be a handheld detection system for Chemical Warfare Agents (CWAs) and Toxic Industrial Chemicals (TICs). LACIS is designed to have a low limit of detection and rapid response time for use by emergency responders, and could allow determination of areas having dangerous concentration levels and whether protective garments will be required. Procedures for protecting responders in hazardous materials incidents require the use of protective equipment until such time as the hazard can be assessed; accurate analysis of this kind can accelerate operations and increase effectiveness. LACIS is to be an improved point detector employing novel CBRNE detection modalities that include a military-proven, ruggedized ion mobility spectrometer (IMS) with an array of electro-resistive sensors to extend the range of chemical threats detected in a single device. It uses a novel sensor data fusion and threat classification architecture to interpret the independent sensor responses and provide robust detection at low levels in complex backgrounds with minimal false alarms. The performance of LACIS prototypes has been characterized in independent third-party laboratory tests at the Battelle Memorial Institute (BMI, Columbus, OH) and in indoor and outdoor field tests at the Nevada National Security Site (NNSS). LACIS prototypes will be entering operational assessment by key government emergency response groups to determine their capabilities versus requirements.

  4. 75 FR 60478 - In the Matter of Certain Machine Vision Software, Machine Vision Systems, and Products Containing...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-09-30

    ... COMMISSION In the Matter of Certain Machine Vision Software, Machine Vision Systems, and Products Containing... importation of certain machine vision software, machine vision systems, or products containing same by reason... Soft'') of Japan; Fuji Machine Manufacturing Co., Ltd. of Japan and Fuji America Corporation of...

  5. DLP™-based dichoptic vision test system

    NASA Astrophysics Data System (ADS)

    Woods, Russell L.; Apfelbaum, Henry L.; Peli, Eli

    2010-01-01

    It can be useful to present a different image to each of the two eyes while they cooperatively view the world. Such dichoptic presentation can occur in investigations of stereoscopic and binocular vision (e.g., strabismus, amblyopia) and vision rehabilitation in clinical and research settings. Various techniques have been used to construct dichoptic displays. The most common and most flexible modern technique uses liquid-crystal (LC) shutters. When used in combination with cathode ray tube (CRT) displays, there is often leakage of light from the image intended for one eye into the view of the other eye. Such interocular crosstalk is 14% even in our state of the art CRT-based dichoptic system. While such crosstalk may have minimal impact on stereo movie or video game experiences, it can defeat clinical and research investigations. We use micromirror digital light processing (DLP™) technology to create a novel dichoptic visual display system with substantially lower interocular crosstalk (0.3%; remaining crosstalk comes from the LC shutters). The DLP system normally uses a color wheel to display color images. Our approach is to disable the color wheel, synchronize the display directly to the computer's sync signal, allocate each of the three (former) color presentations to one or both eyes, and open and close the LC shutters in synchrony with those color events.

  6. DLP™-based dichoptic vision test system

    PubMed Central

    Woods, Russell L.; Apfelbaum, Henry L.; Peli, Eli

    2010-01-01

    It can be useful to present a different image to each of the two eyes while they cooperatively view the world. Such dichoptic presentation can occur in investigations of stereoscopic and binocular vision (e.g., strabismus, amblyopia) and vision rehabilitation in clinical and research settings. Various techniques have been used to construct dichoptic displays. The most common and most flexible modern technique uses liquid-crystal (LC) shutters. When used in combination with cathode ray tube (CRT) displays, there is often leakage of light from the image intended for one eye into the view of the other eye. Such interocular crosstalk is 14% even in our state of the art CRT-based dichoptic system. While such crosstalk may have minimal impact on stereo movie or video game experiences, it can defeat clinical and research investigations. We use micromirror digital light processing (DLP™) technology to create a novel dichoptic visual display system with substantially lower interocular crosstalk (0.3%; remaining crosstalk comes from the LC shutters). The DLP system normally uses a color wheel to display color images. Our approach is to disable the color wheel, synchronize the display directly to the computer’s sync signal, allocate each of the three (former) color presentations to one or both eyes, and open and close the LC shutters in synchrony with those color events. PMID:20210457

  7. Forward Obstacle Detection System by Stereo Vision

    NASA Astrophysics Data System (ADS)

    Iwata, Hiroaki; Saneyoshi, Keiji

    Forward obstacle detection is needed to prevent car accidents. We have developed a forward obstacle detection system that achieves good detectability and distance accuracy using stereo vision alone. The system runs in real time by using a stereo processing system based on a Field-Programmable Gate Array (FPGA). Road surfaces are detected so that the search space can be limited to the region above the road. A smoothing filter is also used. Owing to these, distance accuracy is improved. In the experiments, the system could detect forward obstacles 100 m away, and its distance error up to 80 m was less than 1.5 m. It could also immediately detect cutting-in objects.
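    The quoted distance accuracy follows from standard stereo triangulation, Z = fB/d, whose first-order depth error grows as Z²; the focal length and baseline below are assumed values for illustration (not the system's actual parameters), chosen to show that roughly 0.1 px disparity accuracy is consistent with the reported error:

```python
# Stereo triangulation behind the distance figures: Z = f*B/d for
# focal length f (pixels), baseline B (meters), and disparity d (pixels).
# f and B here are assumptions, not the paper's camera parameters.

def stereo_depth(f_px, baseline_m, disparity_px):
    """Depth from disparity for a rectified stereo pair."""
    return f_px * baseline_m / disparity_px

def depth_error(f_px, baseline_m, disparity_px, disparity_err_px=0.1):
    """First-order depth error: dZ = Z**2 * d_err / (f * B)."""
    z = stereo_depth(f_px, baseline_m, disparity_px)
    return z * z * disparity_err_px / (f_px * baseline_m)

# With f = 1200 px and B = 0.5 m, an object at 80 m gives d = 7.5 px;
# 0.1 px of disparity noise then maps to about 1.1 m of depth error,
# in line with the reported < 1.5 m at 80 m.
d80 = stereo_depth(1200, 0.5, 7.5)
e80 = depth_error(1200, 0.5, 7.5)
```

Because the error scales with Z², the same system is far more accurate at close range, which matters for the cutting-in case.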

  8. Autonomous Segmentation of Outcrop Images Using Computer Vision and Machine Learning

    NASA Astrophysics Data System (ADS)

    Francis, R.; McIsaac, K.; Osinski, G. R.; Thompson, D. R.

    2013-12-01

    As planetary exploration missions become increasingly complex and capable, the motivation grows for improved autonomous science. New capabilities for onboard science data analysis may relieve radio-link data limits and provide greater throughput of scientific information. Adaptive data acquisition, storage and downlink may ultimately hold implications for mission design and operations. For surface missions, geology remains an essential focus, and the investigation of in-place, exposed geological materials provides the greatest scientific insight and context for the formation and history of planetary materials and processes. The goal of this research program is to develop techniques for autonomous segmentation of images of rock outcrops. Recognition of the relationships between different geological units is the first step in mapping and interpreting a geological setting. Applications of automatic segmentation include instrument placement and targeting and data triage for downlink. Here, we report on the development of a new technique in which a photograph of a rock outcrop is processed by several elementary image processing techniques, generating a feature space which can be interrogated and classified. A distance metric learning technique (Multiclass Discriminant Analysis, or MDA) is tested as a means of finding the best numerical representation of the feature space. MDA produces a linear transformation that maximizes the separation between data points from different geological units. This 'training step' is completed on one or more images from a given locality. Then we apply the same transformation to improve the segmentation of new scenes containing similar materials to those used for training. The technique was tested using imagery from Mars analogue settings at the Cima volcanic flows in the Mojave Desert, California; impact breccias from the Sudbury impact structure in Ontario, Canada; and an outcrop showing embedded mineral veins in Gale Crater on Mars.
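    MDA as described (maximize between-class scatter relative to within-class scatter, then project new data through the learned transform) can be written in a few lines of NumPy; the toy two-class features below stand in for the per-pixel feature space of the outcrop images:

```python
import numpy as np

# Compact sketch of Multiclass Discriminant Analysis: learn a linear
# transform W that maximizes between-class separation relative to
# within-class scatter. The 2-D toy data are not outcrop imagery.

def mda(X, y, n_dims=1):
    classes = np.unique(y)
    mean = X.mean(axis=0)
    Sw = np.zeros((X.shape[1], X.shape[1]))   # within-class scatter
    Sb = np.zeros_like(Sw)                    # between-class scatter
    for c in classes:
        Xc = X[y == c]
        mc = Xc.mean(axis=0)
        Sw += (Xc - mc).T @ (Xc - mc)
        Sb += len(Xc) * np.outer(mc - mean, mc - mean)
    # Top eigenvectors of Sw^-1 Sb span the discriminant projection.
    vals, vecs = np.linalg.eig(np.linalg.pinv(Sw) @ Sb)
    order = np.argsort(vals.real)[::-1]
    return vecs.real[:, order[:n_dims]]

rng = np.random.default_rng(0)
X = np.vstack([rng.normal([0, 0], 0.1, (20, 2)),    # "unit A" pixels
               rng.normal([3, 0], 0.1, (20, 2))])   # "unit B" pixels
y = np.array([0] * 20 + [1] * 20)
W = mda(X, y)          # the "training step" on one labeled scene
proj = X @ W           # apply the same transform to (here, the same) data
```

A new scene of similar materials would be projected with the same W before segmentation, which is the transfer step the abstract describes.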

  9. Autonomous and Autonomic Systems: A Paradigm for Future Space Exploration Missions

    NASA Technical Reports Server (NTRS)

    Truszkowski, Walter F.; Hinchey, Michael G.; Rash, James L.; Rouff, Christopher A.

    2004-01-01

    NASA increasingly will rely on autonomous systems concepts, not only in the mission control centers on the ground, but also on spacecraft and on rovers and other assets on extraterrestrial bodies. Autonomy enables not only reduced operations costs, but also adaptable goal-driven functionality of mission systems. Space missions lacking autonomy will be unable to achieve the full range of advanced mission objectives, given that human control under dynamic environmental conditions will not be feasible due, in part, to the unavoidably high signal propagation latency and constrained data rates of mission communications links. While autonomy cost-effectively supports accomplishment of mission goals, autonomicity supports survivability of remote mission assets, especially when human tending is not feasible. Autonomic system properties (which ensure self-configuring, self-optimizing, self-healing, and self-protecting behavior) conceptually may enable space missions of a higher order than any previously flown. Analysis of two NASA agent-based systems previously prototyped, and of a proposed future mission involving numerous cooperating spacecraft, illustrates how autonomous and autonomic system concepts may be brought to bear on future space missions.
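    The self-healing property mentioned above can be illustrated, in a deliberately minimal form, as a supervisor loop that monitors components and restarts any that report a fault. The Component and AutonomicSupervisor classes below are hypothetical and not drawn from the NASA prototypes:

```python
class Component:
    """A mission subsystem with a simple health flag."""
    def __init__(self, name):
        self.name = name
        self.healthy = True
        self.restarts = 0

    def restart(self):
        self.restarts += 1
        self.healthy = True

class AutonomicSupervisor:
    """Self-healing loop: monitor components, restart any that fail."""
    def __init__(self, components):
        self.components = components

    def tick(self):
        healed = []
        for c in self.components:
            if not c.healthy:
                c.restart()
                healed.append(c.name)
        return healed

imager = Component("imager")
radio = Component("radio")
sup = AutonomicSupervisor([imager, radio])

radio.healthy = False          # simulated fault
print(sup.tick())              # -> ['radio']
print(radio.healthy, radio.restarts)
```

    Real autonomic systems add fault diagnosis, graceful degradation, and escalation policies on top of this skeleton; the point here is only the closed monitor-repair loop running without human tending.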

  10. Automatic Welding System Using Speed Controllable Autonomous Mobile Robot

    NASA Astrophysics Data System (ADS)

    Kim, Taewon; Suto, Takeshi; Kobayashi, Junya; Kim, Jongcheol; Suga, Yasuo

    A prototype of an autonomous mobile robot with two vision sensors for automatic welding of steel plates was constructed. The robot can move straight, steer and turn around the robot center by controlling the driving speed of the two wheels respectively. At the tip of the movable arm, two CCD cameras are fixed. A local camera observes the welding line near the welding torch, and a wide camera observes a relatively wide area in front of the welding part. The robot controls the traveling speed in accordance with the shape of the welding line. In the case of a straight welding line, the speed of the robot is accelerated and the welding efficiency is improved. However, if the robot finds a corner of the welding line, the speed is decelerated in order to realize precise seam tracking and stable welding. Therefore, the robot can realize precise and high-speed seam tracking by controlling the travel speed. The effectiveness of the control system is confirmed by welding experiments.
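    The speed-control rule the abstract describes - full speed on straight seams, deceleration at corners - can be sketched as a function of the turn angle measured from three tracked seam points. The speed limits and angle threshold below are invented values, not those of the actual robot:

```python
import math

def seam_angle(p_prev, p_curr, p_next):
    """Turn angle (radians) of the seam at p_curr, from three tracked points."""
    v1 = (p_curr[0] - p_prev[0], p_curr[1] - p_prev[1])
    v2 = (p_next[0] - p_curr[0], p_next[1] - p_curr[1])
    dot = v1[0] * v2[0] + v1[1] * v2[1]
    n1 = math.hypot(*v1)
    n2 = math.hypot(*v2)
    return math.acos(max(-1.0, min(1.0, dot / (n1 * n2))))

def travel_speed(angle, v_max=12.0, v_min=3.0, angle_limit=math.pi / 2):
    """Interpolate speed (mm/s): full speed on a straight line, minimum at a sharp corner."""
    frac = min(angle / angle_limit, 1.0)
    return v_max - frac * (v_max - v_min)

straight = travel_speed(seam_angle((0, 0), (1, 0), (2, 0)))
corner = travel_speed(seam_angle((0, 0), (1, 0), (1, 1)))
print(straight, corner)  # straight seam: 12.0 mm/s; right-angle corner: 3.0 mm/s
```

    The wide camera would supply the look-ahead points p_curr and p_next, so deceleration begins before the torch actually reaches the corner.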

  11. Cardiac autonomic nervous system activity in obesity.

    PubMed

    Liatis, Stavros; Tentolouris, Nikolaos; Katsilambros, Nikolaos

    2004-08-01

    The development of obesity is caused by a disturbance of energy balance, with energy intake exceeding energy expenditure. As the autonomic nervous system (ANS) has a role in the regulation of both these variables, it has become a major focus of investigation in the field of obesity pathogenesis. The enhanced cardiac sympathetic drive shown in most of the studies in obese persons might be due to an increase in their levels of circulating insulin. The role of leptin needs further investigation with studies in humans. There is a blunted response of the cardiac sympathetic nervous system (SNS) activity in obese subjects after consumption of a carbohydrate-rich meal as well as after insulin administration. This might be due to insulin resistance. It is speculated that increased SNS activity in obesity may contribute to the development of hypertension in genetically susceptible individuals. It is also speculated that the increase in cardiac SNS activity under fasting conditions in obesity may be associated with high cardiovascular morbidity and mortality.

  12. Implementation of Deconfliction in Multivehicle Autonomous Systems

    DTIC Science & Technology

    2010-01-01

    two fin-actuated vehicles was replaced with a remote control toy shark controlled by a human operator. The human operator drove the toy shark directly... Fig. 4 Vehicle Swarm Technology Laboratory (VSTL) developed by the Boeing Research and Technology group. 3.2 University of Washington Fin-Actuated Autonomous Underwater Vehicles The UW testbed is composed of a set of three fin-actuated autonomous underwater vehicles (Fig. 6) operating in a

  13. Robot vision system programmed in Prolog

    NASA Astrophysics Data System (ADS)

    Batchelor, Bruce G.; Hack, Ralf

    1995-10-01

    This is the latest in a series of publications which develop the theme of programming a machine vision system using the artificial intelligence language Prolog. The article states the long-term objective of the research program of which this work forms part. Many but not yet all of the goals laid out in this plan have already been achieved in an integrated system, which uses a multi-layer control hierarchy. The purpose of the present paper is to demonstrate that a system based upon a Prolog controller is capable of making complex decisions and operating a standard robot. The authors chose, as a vehicle for this exercise, the task of playing dominoes against a human opponent. This game was selected for this demonstration since it models a range of industrial assembly tasks, where parts are to be mated together. (For example, a 'daisy chain' of electronic equipment and the interconnecting cables/adapters may be likened to a chain of dominoes.)

  14. Differential responses of components of the autonomic nervous system.

    PubMed

    Goldstein, David S

    2013-01-01

    This chapter conveys several concepts and points of view about the scientific and medical significance of differential alterations in activities of components of the autonomic nervous system in stress and disease. The use of terms such as "the autonomic nervous system," "autonomic failure," "dysautonomia," and "autonomic dysfunction" imply the existence of a single entity; however, the autonomic nervous system has functionally and neurochemically distinctive components, which are reflected in differential responses to stressors and differential involvement in pathophysiologic states. One can conceptualize the autonomic nervous system as having at least five components: the sympathetic noradrenergic system, the sympathetic cholinergic system, the parasympathetic cholinergic system, the sympathetic adrenergic system, and the enteric nervous system. Evidence has accumulated for differential noradrenergic vs. adrenergic responses in various situations. The largest sympathetic adrenergic system responses are seen when the organism encounters stressors that pose a global or metabolic threat. Sympathetic noradrenergic system activation dominates the responses to orthostasis, moderate exercise, and exposure to cold, whereas sympathetic adrenergic system activation dominates those to glucoprivation and emotional distress. There seems to be at least as good a justification for the concept of coordinated adrenocortical-adrenomedullary responses as for coordinated adrenomedullary-sympathoneural responses in stress. Fainting reactions involve differential adrenomedullary hormonal vs. sympathetic noradrenergic activation. Parkinson disease entails relatively selective dysfunction of the sympathetic noradrenergic system, with prominent loss of noradrenergic nerves in the heart, yet normal adrenomedullary function. Allostatic load links stress with degenerative diseases, and Parkinson disease may be a disease of the elderly because of allostatic load.

  15. The Autonomous Pathogen Detection System (APDS)

    SciTech Connect

    Morris, J; Dzenitis, J

    2004-09-22

    Shaped like a mailbox on wheels, it's been called a bioterrorism "smoke detector." It can be found in transportation hubs such as airports and subways, and it may be coming to a location near you. Formally known as the Autonomous Pathogen Detection System, or APDS, this latest tool in the war on bioterrorism was developed at Lawrence Livermore National Laboratory to continuously sniff the air for airborne pathogens and toxins such as anthrax or plague. The APDS is the modern-day equivalent of the canaries miners took underground with them to test for deadly carbon monoxide gas. But this canary can test for numerous bacteria, viruses, and toxins simultaneously, report results every hour, and confirm positive samples and guard against false positive results by using two different tests. The fully automated system collects and prepares air samples around the clock, does the analysis, and interprets the results. It requires no servicing or human intervention for an entire week. Unlike its feathered counterpart, when an APDS unit encounters something deadly in the air, that's when it begins singing, quietly. The APDS unit transmits a silent alert and sends detailed data to public health authorities, who can order evacuation and begin treatment of anyone exposed to toxic or biological agents. It is the latest in a series of biodefense detectors developed at DOE/NNSA national laboratories. The manual predecessor to APDS, called BASIS (for Biological Aerosol Sentry and Information System), was developed jointly by Los Alamos and Lawrence Livermore national laboratories. That system was modified to become BioWatch, the Department of Homeland Security's biological urban monitoring program. A related laboratory instrument, the Handheld Advanced Nucleic Acid Analyzer (HANAA), was first tested successfully at LLNL in September 1997. Successful partnering with private industry has been a key factor in the rapid advancement and deployment of biodefense instruments such as these.
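    The two-test confirmation strategy is essentially a guard against false positives, and its value can be illustrated with Bayes' rule. All of the rates below are assumed numbers for illustration, not APDS specifications:

```python
def posterior(prior, sensitivity, false_positive_rate, n_positive_tests):
    """Posterior probability of a real release after n independent positive tests."""
    p_agent, p_clean = prior, 1.0 - prior
    like_agent = sensitivity ** n_positive_tests
    like_clean = false_positive_rate ** n_positive_tests
    return (p_agent * like_agent) / (p_agent * like_agent + p_clean * like_clean)

prior = 1e-4               # assumed prior probability of a release in any hour
one = posterior(prior, 0.95, 0.01, 1)   # one test positive
two = posterior(prior, 0.95, 0.01, 2)   # confirmed by a second, independent test
print(round(one, 4), round(two, 4))
```

    With these assumed rates, a single positive still leaves the chance of a real release below one percent, while an independent confirming test raises the posterior by roughly a factor of fifty - which is the logic behind running two different assays before raising an alarm.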

  16. Cardiac autonomic profile in rheumatoid arthritis and systemic lupus erythematosus.

    PubMed

    Aydemir, M; Yazisiz, V; Basarici, I; Avci, A B; Erbasan, F; Belgi, A; Terzioglu, E

    2010-03-01

    Neurological involvement is a well-documented issue in patients with systemic lupus erythematosus (SLE) and rheumatoid arthritis (RA). However, little is known about the involvement of the autonomic nervous system. This study was conducted to investigate autonomic nervous system dysfunction in patients with RA and SLE. Twenty-six RA patients, 38 SLE patients and 40 healthy controls were recruited from our in- and out-patient departments. Heart rate variability (HRV) parameters (the power of the high- [HF] and low-frequency [LF] band of haemodynamic time series, the ratio between low- and high-frequency components [LF/HF ratio], the power spectral density), baroreflex sensitivity (BRS) and beat-to-beat blood pressures were assessed by a novel non-invasive haemodynamic monitoring tool (Task Force Monitor [TFM], CNSystems Medizintechnik GmbH, Graz, Austria). Autonomic nervous system dysfunction was determined according to the classical Ewing autonomic test battery. Furthermore, we implemented a secondary autonomic test score by modifying the Ewing test battery with additional criteria. Both the classical and modified Ewing test batteries revealed that the frequencies of autonomic neuropathy were significantly higher in patient groups compared with controls (p < 0.001). Evaluation by TFM revealed that deterioration of sophisticated autonomic parameters (such as HRV and BRS) was more pronounced in the patient groups compared with controls. There was a significant association between BRS and Ewing test scores, and abnormal BRS results were more frequent in patients with autonomic dysfunction according to the Ewing test batteries. No relation was found between autonomic neuropathy and disease duration, disease activity and autoantibody positivity.
Consequently, we believe that further large-scale studies investigating cardiovascular autonomic neuropathy in rheumatic diseases should be carried out to verify our findings and manifest clinical consequences beyond these results.

  17. IPS - a vision aided navigation system

    NASA Astrophysics Data System (ADS)

    Börner, Anko; Baumbach, Dirk; Buder, Maximilian; Choinowski, Andre; Ernst, Ines; Funk, Eugen; Grießbach, Denis; Schischmanow, Adrian; Wohlfeil, Jürgen; Zuev, Sergey

    2017-04-01

    Ego localization is an important prerequisite for several scientific, commercial, and statutory tasks. Only by knowing one's own position can guidance be provided, inspections be executed, and autonomous vehicles be operated. Localization becomes challenging if satellite-based navigation systems are not available, or data quality is not sufficient. To overcome this problem, a team of the German Aerospace Center (DLR) developed a multi-sensor system based on the human head and its navigation sensors - the eyes and the vestibular system. This system is called the integrated positioning system (IPS) and contains a stereo camera and an inertial measurement unit for determining an ego pose in six degrees of freedom in a local coordinate system. IPS is able to operate in real time and can be applied to indoor and outdoor scenarios without any external reference or prior knowledge. In this paper, the system and its key hardware and software components are introduced. The main issues during the development of such complex multi-sensor measurement systems are identified and discussed, and the performance of this technology is demonstrated. The developer team started from scratch and is now transferring this technology into a commercial product. The paper finishes with an outlook.
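    A core operation inside any such dead-reckoning system is chaining relative pose estimates (from the stereo camera and IMU) into a global 6-DoF pose. A minimal sketch, using rotation matrices and ignoring sensor fusion and error modeling entirely:

```python
import numpy as np

def rot_z(theta):
    """Rotation matrix about the z axis."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s, 0], [s, c, 0], [0, 0, 1]])

def compose(pose, rel):
    """Chain a relative motion (R_rel, t_rel), expressed in the current
    body frame, onto the global pose (R, t)."""
    R, t = pose
    R_rel, t_rel = rel
    return (R @ R_rel, t + R @ t_rel)

# Start at the origin; drive a square: four 1 m legs, turning 90 deg after each
pose = (np.eye(3), np.zeros(3))
step = (rot_z(np.pi / 2), np.array([1.0, 0.0, 0.0]))
for _ in range(4):
    pose = compose(pose, step)

print(np.allclose(pose[1], 0, atol=1e-9), np.allclose(pose[0], np.eye(3), atol=1e-9))
```

    Driving a closed square returns the integrated pose to the origin - a common sanity check for odometry code; a real system like IPS additionally fuses IMU rates between camera frames and bounds the drift that pure composition accumulates.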

  18. Evaluating the autonomic nervous system in patients with laryngopharyngeal reflux.

    PubMed

    Huang, Wan-Ju; Shu, Chih-Hung; Chou, Kun-Ta; Wang, Yi-Fen; Hsu, Yen-Bin; Ho, Ching-Yin; Lan, Ming-Ying

    2013-06-01

    The pathogenesis of laryngopharyngeal reflux (LPR) remains unclear. It is linked to but distinct from gastroesophageal reflux disease (GERD), which has been shown to be related to disturbed autonomic regulation. The aim of this study is to investigate whether autonomic dysfunction also plays a role in the pathogenesis of LPR. Case-control study. Tertiary care center. Seventeen patients with LPR and 19 healthy controls, aged between 19 and 50 years, were enrolled in the study. The patients were diagnosed with LPR if they had a reflux symptom index (RSI) ≥ 13 and a reflux finding score (RFS) ≥ 7. Spectral analysis of heart rate variability (HRV) was used to assess autonomic function. Anxiety and depression levels were also measured, using the Beck Anxiety Inventory (BAI) and Beck Depression Inventory II (BDI-II). In HRV analysis, high frequency (HF) represents the parasympathetic activity of the autonomic nervous system, whereas low frequency (LF) represents the total autonomic activity. There were no significant differences in the LF power and HF power between the 2 groups. However, significantly lower HF% (P = .003) and a higher LF/HF ratio (P = .012) were found in patients with LPR, who demonstrated poor autonomic modulation and higher sympathetic activity. Anxiety was also frequently observed in the patient group. The study suggests that autonomic dysfunction seems to be involved in the pathogenesis of LPR. The potential beneficial effect of autonomic nervous system modulation as a therapeutic modality for LPR merits further investigation.
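    The LF and HF measures used in this and the other HRV studies above are band powers of the (evenly resampled) R-R interval series. A minimal sketch with a plain periodogram on synthetic data - the band edges follow common HRV convention, but the signal itself is fabricated:

```python
import numpy as np

def band_power(signal, fs, f_lo, f_hi):
    """Power of `signal` in the band [f_lo, f_hi) Hz, via a plain periodogram."""
    n = len(signal)
    freqs = np.fft.rfftfreq(n, d=1.0 / fs)
    psd = np.abs(np.fft.rfft(signal - signal.mean())) ** 2 / n
    mask = (freqs >= f_lo) & (freqs < f_hi)
    return psd[mask].sum()

# Synthetic, evenly resampled RR-interval series (4 Hz) with a strong
# 0.25 Hz (respiratory / HF) oscillation and a weak 0.08 Hz (LF) one
fs = 4.0
t = np.arange(0, 300, 1.0 / fs)
rr = 0.8 + 0.05 * np.sin(2 * np.pi * 0.25 * t) + 0.01 * np.sin(2 * np.pi * 0.08 * t)

lf = band_power(rr, fs, 0.04, 0.15)   # LF band: 0.04-0.15 Hz
hf = band_power(rr, fs, 0.15, 0.40)   # HF band: 0.15-0.40 Hz
print(lf / hf < 1)  # parasympathetically dominated profile: LF/HF < 1
```

    An elevated LF/HF ratio, as reported in the LPR patients, would correspond to relatively more power in the low-frequency band of this spectrum.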

  19. Application of aircraft navigation sensors to enhanced vision systems

    NASA Technical Reports Server (NTRS)

    Sweet, Barbara T.

    1993-01-01

    In this presentation, the applicability of various aircraft navigation sensors to enhanced vision system design is discussed. First, the accuracy requirements of the FAA for precision landing systems are presented, followed by the current navigation systems and their characteristics. These systems include Instrument Landing System (ILS), Microwave Landing System (MLS), Inertial Navigation, Altimetry, and Global Positioning System (GPS). Finally, the use of navigation system data to improve enhanced vision systems is discussed. These applications include radar image rectification, motion compensation, and image registration.

  20. A Machine Vision System for Automatically Grading Hardwood Lumber - (Proceedings)

    Treesearch

    Richard W. Conners; Tai-Hoon Cho; Chong T. Ng; Thomas H. Drayer; Joe G. Tront; Philip A. Araman; Robert L. Brisbon

    1990-01-01

    Any automatic system for grading hardwood lumber can conceptually be divided into two components. One of these is a machine vision system for locating and identifying grading defects. The other is an automatic grading program that accepts as input the output of the machine vision system and, based on these data, determines the grade of a board. The progress that has...

  1. Monitoring wildfires using an autonomous aerial system (AAS)

    NASA Astrophysics Data System (ADS)

    Levine, Joel S.; Ambrosia, Vincent; Brass, James A.; Davis, Richard E.; Dull, Charles W.; Greenfield, Paul H.; Harrison, F. W.; Killough, Brian D.; Kist, Edward H.; Pinto, Joseph P.; Stover, Gregory; Tappan, Nina D.; Wegener, Steve S.

    2004-12-01

    The environmental and health effects of wildfires are discussed. The monitoring of wildfires from aircraft using remote sensing techniques is reviewed. A future autonomous aerial observing system for fire monitoring is described.

  2. Vision-based augmented reality system

    NASA Astrophysics Data System (ADS)

    Chen, Jing; Wang, Yongtian; Shi, Qi; Yan, Dayuan

    2003-04-01

    The most promising aspect of augmented reality lies in its ability to integrate the virtual world of the computer with the real world of the user. Namely, users can interact with real-world subjects and objects directly. This paper presents an experimental augmented reality system with a video see-through head-mounted device to display visual objects, as if they were lying on the table together with real objects. In order to overlay virtual objects on the real world at the right position and orientation, accurate calibration and registration are most important. A vision-based method is used to estimate CCD external parameters by tracking 4 known points with different colors. It achieves sufficient accuracy for non-critical applications such as gaming, annotation and so on.
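    With the 4 tracked points assumed coplanar, the mapping from their known positions to their image locations is a homography, which can be estimated by the direct linear transform (DLT); decomposing it into CCD external parameters is a further step not shown here. The point coordinates below are invented for illustration:

```python
import numpy as np

def homography_dlt(src, dst):
    """Estimate the 3x3 homography mapping src -> dst from >= 4 point pairs (DLT)."""
    rows = []
    for (x, y), (u, v) in zip(src, dst):
        rows.append([-x, -y, -1, 0, 0, 0, u * x, u * y, u])
        rows.append([0, 0, 0, -x, -y, -1, v * x, v * y, v])
    _, _, vt = np.linalg.svd(np.array(rows, dtype=float))
    H = vt[-1].reshape(3, 3)        # null-space vector of the DLT system
    return H / H[2, 2]

def apply_h(H, pt):
    p = H @ np.array([pt[0], pt[1], 1.0])
    return p[:2] / p[2]

# Four known marker corners (world plane) and where the camera sees them (pixels)
src = [(0, 0), (1, 0), (1, 1), (0, 1)]
dst = [(10, 10), (30, 12), (28, 32), (8, 30)]
H = homography_dlt(src, dst)
recovered = [apply_h(H, p) for p in src]
print(all(np.allclose(r, d, atol=1e-6) for r, d in zip(recovered, dst)))
```

    In the paper's setting, the four differently colored points play the role of src/dst correspondences; the recovered homography can then be factored, given the camera intrinsics, into the rotation and translation used for registration.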

  3. Physics Simulation Software for Autonomous Propellant Loading and Gas House Autonomous System Monitoring

    NASA Technical Reports Server (NTRS)

    Regalado Reyes, Bjorn Constant

    2015-01-01

    1. Kennedy Space Center (KSC) is developing a mobile launching system with autonomous propellant loading capabilities for liquid-fueled rockets. An autonomous system will be responsible for monitoring and controlling the storage, loading and transferring of cryogenic propellants. The Physics Simulation Software will reproduce the sensor data seen during the delivery of cryogenic fluids including valve positions, pressures, temperatures and flow rates. The simulator will provide insight into the functionality of the propellant systems and demonstrate the effects of potential faults. This will provide verification of the communications protocols and the autonomous system control. 2. The High Pressure Gas Facility (HPGF) stores and distributes hydrogen, nitrogen, helium and high pressure air. The hydrogen and nitrogen are stored in cryogenic liquid state. The cryogenic fluids pose several hazards to operators and the storage and transfer equipment. Constant monitoring of pressures, temperatures and flow rates are required in order to maintain the safety of personnel and equipment during the handling and storage of these commodities. The Gas House Autonomous System Monitoring software will be responsible for constantly observing and recording sensor data, identifying and predicting faults and relaying hazard and operational information to the operators.
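    The simulator's core job - reproducing plausible sensor traces and demonstrating the effect of injected faults - can be caricatured in a few lines. The tank dynamics and fault model below are invented, not KSC's:

```python
import random

class ValveSensorSim:
    """Toy simulator: pressure rises while the fill valve is open,
    decays when closed; an injected 'stuck' fault freezes the reading."""
    def __init__(self, seed=0):
        self.rng = random.Random(seed)
        self.pressure = 100.0   # kPa, hypothetical tank pressure
        self.valve_open = False
        self.stuck = False

    def inject_fault(self):
        self.stuck = True

    def step(self):
        if not self.stuck:
            delta = 5.0 if self.valve_open else -2.0
            self.pressure = max(0.0, self.pressure + delta + self.rng.uniform(-0.1, 0.1))
        return self.pressure

sim = ValveSensorSim()
sim.valve_open = True
readings = [sim.step() for _ in range(10)]   # pressure climbs during fill
sim.inject_fault()
frozen = [sim.step() for _ in range(3)]      # stuck sensor repeats its last value
print(readings[-1] > readings[0], len(set(frozen)) == 1)
```

    Feeding such traces to the autonomous monitoring software lets it be verified against both nominal behavior (rising pressure during fill) and a classic fault signature (a flatlined sensor) before it ever touches real cryogenic hardware.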

  4. Autonomic system modification in Zen practitioners.

    PubMed

    Fiorentini, Alessandra; Ora, Josuel; Tubani, Luigi

    2013-01-01

    Meditation in its various forms is a traditional exercise with a potential benefit on well-being and health. On a psychosomatic level these exercises seem to improve the salutogenetic potential in man. Especially the cardiorespiratory interaction seems to play an important role, since most meditation techniques make use of special low-frequency breathing patterns, regardless of whether they result from a deliberate guidance of breathing or other mechanisms, for example, the recitation of specific verse. During the different exercises of Zen meditation the depth and the duration of each respiratory cycle is determined only by the process of breathing. Respiratory manoeuvres during Zazen meditation may produce HR variability changes similar to those produced during biofeedback. Recognition that the respiratory sinus arrhythmia (RSA) was mediated by efferent vagal activity acting on the sinus node led investigators to attempt to quantify the fluctuations in R-R intervals that were related to breathing. Nine Zen practitioners with five years of experience took part in the study. Autonomic nervous system function was evaluated by heart rate variability (HRV) analysis during 24-hour ECG recording during Zen meditation and at rest. The data of this small observational study confirm that Zazen breathing falls within the range of low-frequency HR spectral bands. Our data suggest that the modification of HR spectral power persisted also on normal days when the subjects were breathing normally. We suggest that the changes in the breathing rate might modify the chemoreflex, and that continuous practice of slow breathing can reduce chemoreflex sensitivity. This change in the autonomic control of respiration can be permanent, with a resetting of endogenous circulatory rhythms.

  5. Autonomous Mobility Applique System Joint Capability Technology Demonstration

    DTIC Science & Technology

    2013-04-22

    UNCLASSIFIED Page-1 Autonomous Mobility Appliqué System Joint Capability Technology Demonstration Participants • COCOM Sponsor: CENTCOM... COCOM Co-Sponsor: TRANSCOM • Lead Service: US Army • Supporting Service: USMC • Oversight Executive: OUSD(AT&L) DDRE/RFD/CS/MK Tribbie • Technical...

  6. Hi-Vision telecine system using pickup tube

    NASA Astrophysics Data System (ADS)

    Iijima, Goro

    1992-08-01

    Hi-Vision broadcasting, offering far more lifelike pictures than those produced by existing television broadcasting systems, has enormous potential in both industrial and commercial fields. The dissemination of the Hi-Vision system will enable vivid, movie theater quality pictures to be readily enjoyed in homes in the near future. To convert motion film pictures into Hi-Vision signals, a telecine system is needed. The Hi-Vision telecine systems currently under development are the "laser telecine," "flying-spot telecine," and "Saticon telecine" systems. This paper provides an overview of the pickup tube type Hi-Vision telecine system (referred to herein as the Saticon telecine system) developed and marketed by Ikegami Tsushinki Co., Ltd.

  7. Nuclear bimodal new vision solar system missions

    SciTech Connect

    Mondt, J.F.; Zubrin, R.M.

    1996-03-01

    This paper presents an analysis of the potential mission capability using space reactor bimodal systems for planetary missions. Missions of interest include the Main belt asteroids, Jupiter, Saturn, Neptune, and Pluto. The space reactor bimodal system, defined by an Air Force study for Earth orbital missions, provides 10 kWe power, 1000 N thrust, 850 s Isp, with a 1500 kg system mass. Trajectories to the planetary destinations were examined and optimal direct and gravity-assisted trajectories were selected. A conceptual design for a spacecraft using the space reactor bimodal system for propulsion and power, that is capable of performing the missions of interest, is defined. End-to-end mission conceptual designs for bimodal orbiter missions to Jupiter and Saturn are described. All missions considered use the Delta 3 class or Atlas 2AS launch vehicles. The space reactor bimodal power and propulsion system offers both new vision "constellation"-type missions, in which the space reactor bimodal spacecraft acts as a carrier and communication spacecraft for a fleet of microspacecraft deployed at different scientific targets, and conventional missions with only a space reactor bimodal spacecraft and its science payload. © 1996 American Institute of Physics.

  8. An Expert Vision System for Autonomous Land Vehicle Road Following.

    DTIC Science & Technology

    1988-01-01

    TR-138, Center for Automation Research, University of Maryland, July 1985. [Minsky] Minsky, Marvin, "A Framework for Representing Knowledge", in... relationships, frames have been chosen to model objects [Minsky]. A frame is a data structure containing a set of slots (or attributes) which encapsulate

  9. Flight Test Comparison Between Enhanced Vision (FLIR) and Synthetic Vision Systems

    NASA Technical Reports Server (NTRS)

    Arthur, Jarvis J., III; Kramer, Lynda J.; Bailey, Randall E.

    2005-01-01

    Limited visibility and reduced situational awareness have been cited as predominant causal factors for both Controlled Flight Into Terrain (CFIT) and runway incursion accidents. NASA's Synthetic Vision Systems (SVS) project is developing practical application technologies with the goal of eliminating low visibility conditions as a causal factor to civil aircraft accidents while replicating the operational benefits of clear day flight operations, regardless of the actual outside visibility condition. A major thrust of the SVS project involves the development/demonstration of affordable, certifiable display configurations that provide intuitive out-the-window terrain and obstacle information with advanced pathway guidance. A flight test evaluation was conducted in the summer of 2004 by NASA Langley Research Center under NASA's Aviation Safety and Security, Synthetic Vision System - Commercial and Business program. A Gulfstream G-V aircraft, modified and operated under NASA contract by the Gulfstream Aerospace Corporation, was flown over a 3-week period at the Reno/Tahoe International Airport and an additional 3-week period at the NASA Wallops Flight Facility to evaluate integrated Synthetic Vision System concepts. Flight testing was conducted to evaluate the performance, usability, and acceptance of an integrated synthetic vision concept which included advanced Synthetic Vision display concepts for a transport aircraft flight deck, a Runway Incursion Prevention System, an Enhanced Vision System (EVS), and real-time Database Integrity Monitoring Equipment. This paper focuses on comparing qualitative and subjective results between EVS and SVS display concepts.

  10. Technological process supervising using vision systems cooperating with the LabVIEW vision builder

    NASA Astrophysics Data System (ADS)

    Hryniewicz, P.; Banaś, W.; Gwiazda, A.; Foit, K.; Sękala, A.; Kost, G.

    2015-11-01

    One of the most important tasks in the production process is to supervise its proper functioning. Lack of the required supervision over the production process can lead to incorrect manufacturing of the final element, to production line downtime, and hence to financial losses; the worst outcome is damage to the equipment involved in the manufacturing process. Engineers supervising the correctness of the production flow use a wide range of sensors to monitor the manufactured element. Vision systems are one such family of sensors. In recent years, thanks to the accelerated development of electronics as well as easier access to electronic products at attractive prices, they have become a cheap and universal type of sensor. These sensors detect practically all objects, regardless of their shape or even state of matter; the only difficulties arise with transparent or mirrored objects, or objects viewed from the wrong angle. By integrating the vision system with LabVIEW Vision and the LabVIEW Vision Builder, it is possible to determine not only the position of a given element but also its orientation relative to any point in the analyzed space. The paper presents an example of automated inspection of the manufacturing process in a production workcell using the vision supervising system. The aim of the work is to elaborate a vision system that could integrate the different applications and devices used in various production systems to control the manufacturing process.
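    A typical way such a system reports the position and orientation of an element is from image moments of the segmented part: the centroid gives position, the second moments give the principal-axis angle. LabVIEW Vision Builder offers this as a built-in particle-analysis step; the NumPy version below is only an illustrative equivalent on a synthetic binary mask:

```python
import numpy as np

def blob_pose(mask):
    """Centroid and principal-axis angle (radians) of a binary part mask,
    from first and second image moments."""
    ys, xs = np.nonzero(mask)
    cx, cy = xs.mean(), ys.mean()
    mu20 = ((xs - cx) ** 2).mean()
    mu02 = ((ys - cy) ** 2).mean()
    mu11 = ((xs - cx) * (ys - cy)).mean()
    angle = 0.5 * np.arctan2(2 * mu11, mu20 - mu02)
    return (cx, cy), angle

# Synthetic part: a horizontal 20x4 rectangle placed at (row 10, col 5)
img = np.zeros((40, 40), dtype=bool)
img[10:14, 5:25] = True
(cx, cy), angle = blob_pose(img)
print(round(cx, 1), round(cy, 1), round(angle, 3))  # centroid and 0 rad (horizontal)
```

    Comparing the measured angle against the nominal orientation for the workcell step is then enough to flag a misoriented part before it reaches the next station.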

  11. Flight test comparison between enhanced vision (FLIR) and synthetic vision systems

    NASA Astrophysics Data System (ADS)

    Arthur, Jarvis J., III; Kramer, Lynda J.; Bailey, Randall E.

    2005-05-01

    Limited visibility and reduced situational awareness have been cited as predominant causal factors for both Controlled Flight Into Terrain (CFIT) and runway incursion accidents. NASA's Synthetic Vision Systems (SVS) project is developing practical application technologies with the goal of eliminating low visibility conditions as a causal factor to civil aircraft accidents while replicating the operational benefits of clear day flight operations, regardless of the actual outside visibility condition. A major thrust of the SVS project involves the development/demonstration of affordable, certifiable display configurations that provide intuitive out-the-window terrain and obstacle information with advanced pathway guidance. A flight test evaluation was conducted in the summer of 2004 by NASA Langley Research Center under NASA's Aviation Safety and Security, Synthetic Vision System - Commercial and Business program. A Gulfstream G-V aircraft, modified and operated under NASA contract by the Gulfstream Aerospace Corporation, was flown over a 3-week period at the Reno/Tahoe International Airport and an additional 3-week period at the NASA Wallops Flight Facility to evaluate integrated Synthetic Vision System concepts. Flight testing was conducted to evaluate the performance, usability, and acceptance of an integrated synthetic vision concept which included advanced Synthetic Vision display concepts for a transport aircraft flight deck, a Runway Incursion Prevention System, an Enhanced Vision System (EVS), and real-time Database Integrity Monitoring Equipment. This paper focuses on comparing qualitative and subjective results between EVS and SVS display concepts.

  12. Vision Systems with the Human in the Loop

    NASA Astrophysics Data System (ADS)

    Bauckhage, Christian; Hanheide, Marc; Wrede, Sebastian; Käster, Thomas; Pfeiffer, Michael; Sagerer, Gerhard

    2005-12-01

    The emerging cognitive vision paradigm deals with vision systems that apply machine learning and automatic reasoning in order to learn from what they perceive. Cognitive vision systems can rate the relevance and consistency of newly acquired knowledge, they can adapt to their environment and thus will exhibit high robustness. This contribution presents vision systems that aim at flexibility and robustness. One is tailored for content-based image retrieval; the others are cognitive vision systems that constitute prototypes of visual active memories which evaluate, gather, and integrate contextual knowledge for visual analysis. All three systems are designed to interact with human users. After discussing adaptive content-based image retrieval and object and action recognition in an office environment, the issue of assessing cognitive systems is raised. Experiences from psychologically evaluated human-machine interactions are reported, and the promising potential of psychologically based usability experiments is stressed.
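    Content-based image retrieval of the kind the first system performs can be reduced, in its simplest form, to comparing global image descriptors; the histogram-intersection scheme below is a textbook baseline, not the system described in the paper:

```python
import numpy as np

def gray_histogram(img, bins=16):
    """Normalized intensity histogram used as a global image descriptor."""
    h, _ = np.histogram(img, bins=bins, range=(0, 256))
    return h / h.sum()

def retrieve(query, database):
    """Rank database images by histogram intersection with the query."""
    q = gray_histogram(query)
    scores = [np.minimum(q, gray_histogram(img)).sum() for img in database]
    return int(np.argmax(scores))

# Three synthetic "database" images with distinct brightness profiles
rng = np.random.default_rng(1)
dark = rng.integers(0, 80, size=(32, 32))
bright = rng.integers(180, 256, size=(32, 32))
mid = rng.integers(80, 180, size=(32, 32))

query = rng.integers(0, 80, size=(32, 32))   # another dark image
best = retrieve(query, [bright, mid, dark])
print(best)  # -> 2, the index of the dark image
```

    An adaptive retrieval system of the kind presented in the paper would go further, learning which descriptors and distance functions match the user's intent from interaction feedback.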

  13. Feeling good: autonomic nervous system responding in five positive emotions.

    PubMed

    Shiota, Michelle N; Neufeld, Samantha L; Yeung, Wan H; Moser, Stephanie E; Perea, Elaine F

    2011-12-01

    Although dozens of studies have examined the autonomic nervous system (ANS) aspects of negative emotions, less is known about ANS responding in positive emotion. An evolutionary framework was used to define five positive emotions in terms of fitness-enhancing function, and to guide hypotheses regarding autonomic responding. In a repeated measures design, participants viewed sets of visual images eliciting these positive emotions (anticipatory enthusiasm, attachment love, nurturant love, amusement, and awe) plus an emotionally neutral state. Peripheral measures of sympathetic and vagal parasympathetic activation were assessed. Results indicated that the emotion conditions were characterized by qualitatively distinct profiles of autonomic activation, suggesting the existence of multiple, physiologically distinct positive emotions.

  14. Machine Vision Systems for Processing Hardwood Lumber and Logs

    Treesearch

    Philip A. Araman; Daniel L. Schmoldt; Tai-Hoon Cho; Dongping Zhu; Richard W. Conners; D. Earl Kline

    1992-01-01

    Machine vision and automated processing systems are under development at Virginia Tech University with support and cooperation from the USDA Forest Service. Our goals are to help U.S. hardwood producers automate, reduce costs, increase product volume and value recovery, and market higher value, more accurately graded and described products. Any vision system is...

  15. 3-D Signal Processing in a Computer Vision System

    Treesearch

    Dongping Zhu; Richard W. Conners; Philip A. Araman

    1991-01-01

    This paper discusses the problem of 3-dimensional image filtering in a computer vision system that would locate and identify internal structural failure. In particular, a 2-dimensional adaptive filter proposed by Unser has been extended to 3 dimensions. In conjunction with segmentation and labeling, the new filter has been used in the computer vision system to...

  16. Intelligent Computer Vision System for Automated Classification

    NASA Astrophysics Data System (ADS)

    Jordanov, Ivan; Georgieva, Antoniya

    2010-05-01

    In this paper we investigate an Intelligent Computer Vision System applied for recognition and classification of commercially available cork tiles. The system is capable of acquiring and processing gray images using several feature generation and analysis techniques. Its functionality includes image acquisition, feature extraction and preprocessing, and feature classification with neural networks (NN). We also discuss system test and validation results from the recognition and classification tasks. The system investigation also includes statistical feature processing (features number and dimensionality reduction techniques) and classifier design (NN architecture, target coding, learning complexity and performance, and training with our own metaheuristic optimization method). The NNs trained with our genetic low-discrepancy search method (GLPτS) for global optimisation demonstrated very good generalisation abilities. In our view, the reported testing success rate of up to 95% is due to several factors: combination of feature generation techniques; application of Analysis of Variance (ANOVA) and Principal Component Analysis (PCA), which appeared to be very efficient for preprocessing the data; and use of suitable NN design and learning method.

  17. Intelligent Computer Vision System for Automated Classification

    SciTech Connect

    Jordanov, Ivan; Georgieva, Antoniya

    2010-05-21

    In this paper we investigate an Intelligent Computer Vision System applied for recognition and classification of commercially available cork tiles. The system is capable of acquiring and processing gray images using several feature generation and analysis techniques. Its functionality includes image acquisition, feature extraction and preprocessing, and feature classification with neural networks (NN). We also discuss system test and validation results from the recognition and classification tasks. The system investigation also includes statistical feature processing (features number and dimensionality reduction techniques) and classifier design (NN architecture, target coding, learning complexity and performance, and training with our own metaheuristic optimization method). The NNs trained with our genetic low-discrepancy search method (GLPτS) for global optimisation demonstrated very good generalisation abilities. In our view, the reported testing success rate of up to 95% is due to several factors: combination of feature generation techniques; application of Analysis of Variance (ANOVA) and Principal Component Analysis (PCA), which appeared to be very efficient for preprocessing the data; and use of suitable NN design and learning method.

  18. VIBANASS (VIsion BAsed NAvigation Sensor System) System Test Results

    NASA Astrophysics Data System (ADS)

    Hausmann, G.; Muhlbauer, Q.; Rank, P.; Kaiser, C.

    2013-08-01

    Future Active Debris Removal missions will require vision sensors both to support guidance, navigation and control and to examine the targeted debris object prior to capture. With this scenario in mind, Kayser-Threde has developed the VIsion BAsed NAvigation Sensor System (VIBANASS). A demonstrator model representative of the flight hardware was built for execution of a space qualification program and subjected to an extensive test campaign at the European Proximity Operations Simulator (EPOS). It was shown that VIBANASS is able to perform its tasks reliably in vision-based Rendezvous and Docking maneuvers under a wide variety of illumination conditions. These tests included image processing algorithms for target distance evaluation and a closed-loop rendezvous experiment.

  19. Range gated active night vision system for automobiles.

    PubMed

    David, Ofer; Kopeika, Norman S; Weizer, Boaz

    2006-10-01

    Night vision is an emerging automotive safety feature. We develop what we believe is an innovative night vision system based on gated-imaging principles. The concept of gated imaging is described, along with its basic advantages, including the backscatter-reduction mechanism that improves vision through fog, rain, and snow. Performance is evaluated by analyzing bar-pattern modulation and comparing the results with Johnson-chart predictions.
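
    The timing behind range gating follows directly from the round-trip travel of light: the shutter opens 2R/c after the laser pulse, and the shutter-open duration sets the depth of the imaged slice. A minimal sketch of this arithmetic (our own illustration, not the authors' implementation):

```python
# Illustrative range-gating timing: the camera shutter opens only after
# the laser pulse has made the round trip to the range slice of interest,
# so near-field backscatter from fog, rain, or snow never reaches the sensor.
C = 299_792_458.0  # speed of light, m/s

def gate_delay_s(range_m: float) -> float:
    """Delay between laser pulse and shutter opening for a slice at range_m."""
    return 2.0 * range_m / C

def gate_depth_m(gate_width_s: float) -> float:
    """Depth of the imaged slice for a given shutter-open duration."""
    return gate_width_s * C / 2.0

delay = gate_delay_s(150.0)   # viewing a slice 150 m ahead: about a microsecond
depth = gate_depth_m(20e-9)   # a 20 ns gate images a slice about 3 m deep
```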

  20. Enhanced Flight Vision Systems and Synthetic Vision Systems for NextGen Approach and Landing Operations

    NASA Technical Reports Server (NTRS)

    Kramer, Lynda J.; Bailey, Randall E.; Ellis, Kyle K. E.; Williams, Steven P.; Arthur, Jarvis J., III; Prinzel, Lawrence J., III; Shelton, Kevin J.

    2013-01-01

    Synthetic Vision Systems and Enhanced Flight Vision System (SVS/EFVS) technologies have the potential to provide additional margins of safety for aircrew performance and enable operational improvements for low visibility operations in the terminal area environment with efficiency equivalent to visual operations. To meet this potential, research is needed for effective technology development and implementation of regulatory standards and design guidance to support introduction and use of SVS/EFVS advanced cockpit vision technologies in Next Generation Air Transportation System (NextGen) operations. A fixed-base pilot-in-the-loop simulation test was conducted at NASA Langley Research Center that evaluated the use of SVS/EFVS in NextGen low visibility approach and landing operations. Twelve crews flew approach and landing operations in a simulated NextGen Chicago O'Hare environment. Various scenarios tested the potential for using EFVS to conduct approach, landing, and roll-out operations in visibility as low as 1000 feet runway visual range (RVR). Also, SVS was tested to evaluate the potential for lowering decision heights (DH) on certain instrument approach procedures below what can be flown today. Expanding the portion of the visual segment in which EFVS can be used in lieu of natural vision from 100 feet above the touchdown zone elevation to touchdown and rollout in visibilities as low as 1000 feet RVR appears to be viable, as touchdown performance was acceptable without any apparent workload penalties. A lower DH of 150 feet and/or possibly reduced visibility minima using SVS appears to be viable when implemented on a Head-Up Display, but the landing data suggest that head-down implementations warrant further study.

  1. Formal Methods for Autonomic and Swarm-based Systems

    NASA Technical Reports Server (NTRS)

    Rouff, Christopher; Vanderbilt, Amy; Hinchey, Mike; Truszkowski, Walt; Rash, James

    2004-01-01

    Swarms of intelligent rovers and spacecraft are being considered for a number of future NASA missions. These missions will provide NASA scientists and explorers greater flexibility and the chance to gather more science than traditional single-spacecraft missions. These swarms of spacecraft are intended to operate for long periods of time without contact with the Earth. To do this, they must be highly autonomous, have autonomic properties, and utilize sophisticated artificial intelligence. The Autonomous Nano Technology Swarm (ANTS) mission is an example of the swarm-type missions NASA is considering. This mission will explore the asteroid belt using an insect-colony analogy, cataloging the mass, density, morphology, and chemical composition of the asteroids, including any anomalous concentrations of specific minerals. Verifying such a system would be a huge task. This paper discusses ongoing work to develop a formal method for verifying swarm and autonomic systems.

  2. Experimental design for assessing the effectiveness of autonomous countermine systems

    NASA Astrophysics Data System (ADS)

    Chappell, Isaac; May, Michael; Moses, Franklin L.

    2010-04-01

    The countermine mission (CM) is a compelling example of what autonomous systems must address to reduce risks that Soldiers take routinely. The list of requirements is formidable and includes autonomous navigation, autonomous sensor scanning, platform mobility and stability, mobile manipulation, automatic target recognition (ATR), and systematic integration and control of components. This paper compares and contrasts how the CM is done today against the challenges of achieving comparable performance using autonomous systems. The Soldier sets a high standard with, for example, over 90% probability of detection (Pd) of metallic and low-metal mines and a false alarm rate (FAR) as low as 0.05/m². In this paper, we suggest a simplification of the semi-autonomous CM by breaking it into three components: sensor head maneuver, robot navigation, and kill-chain prosecution. We also discuss the measurements required to map the system's physical and state attributes to performance specifications and note that current Army countermine metrics are insufficient to guide the design of a semi-autonomous countermine system.
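
    As a rough illustration of the two metrics quoted above: Pd is the fraction of emplaced mines matched by a detection, and FAR is the number of unmatched detections per unit of swept area. The sketch below is our own; the data layout and the 0.5 m match tolerance are assumptions, not details from the paper.

```python
import math

def detection_metrics(true_mines, detections, area_m2, tol_m=0.5):
    """Pd and FAR for one sweep. A detection within tol_m metres of a
    not-yet-matched mine counts as a hit; anything else is a false alarm."""
    hits = set()
    false_alarms = 0
    for dx, dy in detections:
        matched = None
        for i, (mx, my) in enumerate(true_mines):
            if i not in hits and math.hypot(dx - mx, dy - my) <= tol_m:
                matched = i
                break
        if matched is None:
            false_alarms += 1
        else:
            hits.add(matched)
    pd = len(hits) / len(true_mines) if true_mines else 0.0
    return pd, false_alarms / area_m2
```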

  3. Networked vision system using a Prolog controller

    NASA Astrophysics Data System (ADS)

    Batchelor, B. G.; Caton, S. J.; Chatburn, L. T.; Crowther, R. A.; Miller, J. W. V.

    2005-11-01

    Prolog offers a very different style of programming compared to conventional languages; it can define object properties and abstract relationships in a way that Java, C, C++, etc. find awkward. In an accompanying paper, the authors describe how a distributed web-based vision system can be built from elements that may even be located on different continents. One particular system of this general type is described here. The top-level controller is a Prolog program, which operates one or more image processing engines. This type of function is natural to Prolog, since it is able to reason logically using symbolic (non-numeric) data. Although Prolog is not suitable for programming image processing functions directly, it is ideal for analysing the results derived by an image processor. This article describes the implementation of two systems, in which a Prolog program controls several image processing engines, a simple robot (a pneumatic pick-and-place arm), LED illumination modules, and various mains-powered devices.

  4. Equipment Proposal for the Autonomous Vehicle Systems Laboratory at UIW

    DTIC Science & Technology

    2015-04-29

    Conference, 17-MAY-15. Michael T. Frye, Robert S. Provence. Direct Inverse Control using an Artificial Neural Network for the Autonomous Hover of... As a first step to demonstrating this objective, the PI has been investigating a Machine Learning technique using Direct Inverse Control for the... control of a formation of multi-agent autonomous systems in uncertain dynamic environments. The educational mission of this laboratory is to introduce new...

  5. Autonomous Systems: Issues for Defence Policymakers

    DTIC Science & Technology

    2015-09-30

    ...this provision, but bullets tipped with poison or specifically designed to cause untreatable wounds would be. There is no reason why autonomous weapon... During the Cold War, defence planners faced a similar problem of ‘fragile stability’, whereby vulnerable nuclear arsenals incentivized an enemy to... in ways that are conducive to the cause of peace. An accelerated tempo of operations may lead to combat that is more chaotic, but not more...

  6. Development of an Automatic Identification System Autonomous Positioning System

    PubMed Central

    Hu, Qing; Jiang, Yi; Zhang, Jingbo; Sun, Xiaowen; Zhang, Shufang

    2015-01-01

    In order to overcome the vulnerability of the global navigation satellite system (GNSS) and provide robust position, navigation and time (PNT) information in marine navigation, an autonomous positioning system based on the ranging-mode Automatic Identification System (AIS) is presented in this paper. The principle of the AIS autonomous positioning system (AAPS) is investigated, including the position algorithm, the signal measurement technique, the geometric dilution of precision, the time synchronization technique and the additional secondary factor correction technique. To validate the proposed AAPS, a verification system was established in the Xinghai sea region of Dalian (China), where static and dynamic positioning experiments were performed. The original function of the AIS in the AAPS is not affected. The experimental results show that the positioning precision of the AAPS is better than 10 m in areas with good geometric dilution of precision (GDOP) when the additional secondary factor correction technique is applied. This is the most economical solution for a land-based positioning system to complement the GNSS for the navigation safety of vessels sailing along coasts. PMID:26569258

  7. Development of an Automatic Identification System Autonomous Positioning System.

    PubMed

    Hu, Qing; Jiang, Yi; Zhang, Jingbo; Sun, Xiaowen; Zhang, Shufang

    2015-11-11

    In order to overcome the vulnerability of the global navigation satellite system (GNSS) and provide robust position, navigation and time (PNT) information in marine navigation, an autonomous positioning system based on the ranging-mode Automatic Identification System (AIS) is presented in this paper. The principle of the AIS autonomous positioning system (AAPS) is investigated, including the position algorithm, the signal measurement technique, the geometric dilution of precision, the time synchronization technique and the additional secondary factor correction technique. To validate the proposed AAPS, a verification system was established in the Xinghai sea region of Dalian (China), where static and dynamic positioning experiments were performed. The original function of the AIS in the AAPS is not affected. The experimental results show that the positioning precision of the AAPS is better than 10 m in areas with good geometric dilution of precision (GDOP) when the additional secondary factor correction technique is applied. This is the most economical solution for a land-based positioning system to complement the GNSS for the navigation safety of vessels sailing along coasts.
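
    Geometric dilution of precision captures how station geometry amplifies ranging error into position error. For the simplest 2-D fix without a clock term (a deliberate simplification of the AAPS problem, not the authors' algorithm), the dilution can be computed from the unit vectors toward the stations:

```python
import math

def hdop(receiver, stations):
    """Horizontal dilution of precision for a 2-D position fix.
    Rows of H are unit vectors from the receiver to each station;
    HDOP = sqrt(trace((H^T H)^-1)) for the 2x2 normal matrix."""
    rows = []
    for sx, sy in stations:
        dx, dy = sx - receiver[0], sy - receiver[1]
        r = math.hypot(dx, dy)
        rows.append((dx / r, dy / r))
    a = sum(u * u for u, _ in rows)   # H^T H, element (0, 0)
    b = sum(u * v for u, v in rows)   # element (0, 1) = (1, 0)
    d = sum(v * v for _, v in rows)   # element (1, 1)
    det = a * d - b * b
    return math.sqrt((a + d) / det)   # trace of the inverse is (a + d) / det
```

    Stations spread around the receiver give a value near 1, while nearly collinear stations inflate it sharply, which is why the reported 10 m precision is conditioned on good GDOP.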

  8. ROVER: A prototype active vision system

    NASA Astrophysics Data System (ADS)

    Coombs, David J.; Marsh, Brian D.

    1987-08-01

    The Roving Eyes project is an experiment in active vision. We present the design and implementation of a prototype that tracks colored balls in images from an on-line charge coupled device (CCD) camera. Rover is designed to keep up with its rapidly changing environment by handling best and average case conditions and ignoring the worst case. This allows Rover's techniques to be less sophisticated and consequently faster. Each of Rover's major functional units is relatively isolated from the others, and an executive which knows all the functional units directs the computation by deciding which jobs would be most effective to run. This organization is realized with a priority queue of jobs and their arguments. Rover's structure not only allows it to adapt its strategy to the environment, but also makes the system extensible. A capability can be added to the system by adding a functional module with a well defined interface and by modifying the executive to make use of the new module. The current implementation is discussed in the appendices.
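
    Rover's organization can be pictured as a priority queue of jobs: modules post work, the executive repeatedly runs whichever job currently looks most effective, and jobs are free to enqueue follow-up work. A toy sketch of that control flow (module names and priorities are invented for illustration):

```python
import heapq

class Executive:
    """Toy executive: pops the most effective job from a priority queue,
    runs it, and lets jobs submit follow-up work."""
    def __init__(self):
        self._queue = []
        self._counter = 0  # tie-breaker so heapq never compares callables

    def submit(self, priority, job, *args):
        # Lower number = more urgent.
        heapq.heappush(self._queue, (priority, self._counter, job, args))
        self._counter += 1

    def run(self):
        trace = []
        while self._queue:
            _, _, job, args = heapq.heappop(self._queue)
            trace.append(job(self, *args))
        return trace

def acquire_frame(ex):
    ex.submit(1, track_ball, "frame-0")   # tracking is urgent
    ex.submit(5, log_stats, "frame-0")    # bookkeeping can wait
    return "acquired frame-0"

def track_ball(ex, frame):
    return f"tracked ball in {frame}"

def log_stats(ex, frame):
    return f"logged {frame}"

ex = Executive()
ex.submit(0, acquire_frame)
trace = ex.run()  # acquisition runs first, then tracking, then logging
```

    Adding a capability amounts to defining a new job function and teaching the executive when to enqueue it, which mirrors the extensibility claim in the abstract.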

  9. Targeting the autonomic nervous system: measuring autonomic function and novel devices for heart failure management.

    PubMed

    Patel, Hitesh C; Rosen, Stuart D; Lindsay, Alistair; Hayward, Carl; Lyon, Alexander R; di Mario, Carlo

    2013-12-10

    Neurohumoral activation, in which enhanced activity of the autonomic nervous system (ANS) is a key component, plays a pivotal role in heart failure. The neurohumoral system affects several organs, and currently our knowledge of the molecular and systemic pathways involved in neurohumoral activation is incomplete. All the methods of assessing the degree of activation of the autonomic system have limitations, and they are not interchangeable. The methods considered include noradrenaline spillover, microneurography, radiotracer imaging and analysis of heart rate and blood pressure (heart rate variability, baroreceptor sensitivity, heart rate turbulence). Despite these difficulties, medications that affect the ANS have been shown to improve survival in heart failure, and the mechanism is related to attenuation of the sympathetic nervous system (SNS) and stimulation of the parasympathetic nervous system. However, limitations of compliance with medication, side effects and inadequate SNS attenuation are issues of concern with the pharmacological approach. The newer device-based therapies for sympathetic modulation are showing encouraging results. As they directly influence the autonomic nervous system, more mechanistic information can be gleaned if appropriate investigations are performed at the time of the outcome trials. However, clinicians should be reminded that the ANS is an evolutionary survival mechanism, and there is a need to proceed with caution when trying to completely attenuate its effects. Our enthusiasm for the application of these devices in heart failure should therefore be tempered, especially as none of the devices have trial data powered to assess effects on mortality or cardiovascular events.

  10. Autonomous Car Parking System through a Cooperative Vehicular Positioning Network

    PubMed Central

    Correa, Alejandro; Boquet, Guillem; Morell, Antoni; Lopez Vicario, Jose

    2017-01-01

    The increasing development of the automotive industry towards a fully autonomous car has motivated the design of new value-added services in Vehicular Sensor Networks (VSNs). Within the context of VSNs, the autonomous car, with an increasing number of on-board sensors, is a mobile node that exchanges sensed and state information within the VSN. Among all the value added services for VSNs, the design of new intelligent parking management architectures where the autonomous car will coexist with traditional cars is mandatory in order to profit from all the opportunities associated with the increasing intelligence of the new generation of cars. In this work, we design a new smart parking system on top of a VSN that takes into account the heterogeneity of cars and provides guidance to the best parking place for the autonomous car based on a collaborative approach that searches for the common good of all of them measured by the accessibility rate, which is the ratio of the free parking places accessible for an autonomous car. Then, we simulate a real parking lot and the results show that the performance of our system is close to the optimum considering different communication ranges and penetration rates for the autonomous car. PMID:28406426

  11. Autonomous Car Parking System through a Cooperative Vehicular Positioning Network.

    PubMed

    Correa, Alejandro; Boquet, Guillem; Morell, Antoni; Lopez Vicario, Jose

    2017-04-13

    The increasing development of the automotive industry towards a fully autonomous car has motivated the design of new value-added services in Vehicular Sensor Networks (VSNs). Within the context of VSNs, the autonomous car, with an increasing number of on-board sensors, is a mobile node that exchanges sensed and state information within the VSN. Among all the value added services for VSNs, the design of new intelligent parking management architectures where the autonomous car will coexist with traditional cars is mandatory in order to profit from all the opportunities associated with the increasing intelligence of the new generation of cars. In this work, we design a new smart parking system on top of a VSN that takes into account the heterogeneity of cars and provides guidance to the best parking place for the autonomous car based on a collaborative approach that searches for the common good of all of them measured by the accessibility rate, which is the ratio of the free parking places accessible for an autonomous car. Then, we simulate a real parking lot and the results show that the performance of our system is close to the optimum considering different communication ranges and penetration rates for the autonomous car.
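
    The accessibility-rate metric described in this record is straightforward to compute once each parking place is tagged with its state. A minimal sketch, assuming a simple (is_free, is_accessible) encoding of our own devising rather than the authors' data model:

```python
def accessibility_rate(places):
    """Fraction of the free parking places that an autonomous car can reach.
    `places` is a list of (is_free, is_accessible) flags; the metric name
    comes from the abstract, the data layout is ours."""
    free = [p for p in places if p[0]]
    if not free:
        return 0.0
    return sum(1 for p in free if p[1]) / len(free)
```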

  12. Role of the autonomic nervous system in modulating cardiac arrhythmias.

    PubMed

    Shen, Mark J; Zipes, Douglas P

    2014-03-14

    The autonomic nervous system plays an important role in the modulation of cardiac electrophysiology and arrhythmogenesis. Decades of research have contributed to a better understanding of the anatomy and physiology of the cardiac autonomic nervous system and provided evidence supporting the relationship of autonomic tone to clinically significant arrhythmias. The mechanisms by which autonomic activation is arrhythmogenic or antiarrhythmic are complex and differ for specific arrhythmias. In atrial fibrillation, simultaneous sympathetic and parasympathetic activations are the most common trigger. In contrast, in ventricular fibrillation in the setting of cardiac ischemia, sympathetic activation is proarrhythmic, whereas parasympathetic activation is antiarrhythmic. In inherited arrhythmia syndromes, sympathetic stimulation precipitates ventricular tachyarrhythmias and sudden cardiac death, except in Brugada and J-wave syndromes, where it can prevent them. The identification of specific autonomic triggers in different arrhythmias has motivated the idea of modulating autonomic activity for both preventing and treating these arrhythmias. This has been achieved by either neural ablation or stimulation. Neural modulation as a treatment for arrhythmias has been well established in certain diseases, such as long QT syndrome. However, in most other arrhythmia diseases, it is still an emerging modality and under investigation. Recent preliminary trials have yielded encouraging results. Further larger-scale clinical studies are necessary before widespread application can be recommended.

  13. The function of the autonomic nervous system during spaceflight.

    PubMed

    Mandsager, Kyle Timothy; Robertson, David; Diedrich, André

    2015-06-01

    Despite decades of study, a clear understanding of autonomic nervous system activity in space remains elusive. Differential interpretation of fundamental data has driven divergent theories of sympathetic activation and vasorelaxation. This paper will review the available in-flight autonomic and hemodynamic data in an effort to resolve these discrepancies. The NASA NEUROLAB mission, the most comprehensive assessment of autonomic function in microgravity to date, will be highlighted. The mechanisms responsible for altered autonomic activity during spaceflight, which include the effects of hypovolemia, cardiovascular deconditioning, and altered central processing, will be presented. The NEUROLAB experiments demonstrated increased sympathetic activity and impairment of vagal baroreflex function during short-duration spaceflight. Subsequent non-invasive studies of autonomic function during spaceflight have largely reinforced these findings, and provide strong evidence that sympathetic activity is increased in space relative to the supine position on Earth. Others have suggested that microgravity induces a state of relative vasorelaxation and increased vagal activity when compared to upright posture on Earth. These ostensibly disparate theories are not mutually exclusive, but rather directly reflect different pre-flight postural controls. When these results are taken together, they demonstrate that the effectual autonomic challenge of spaceflight is small, and represents an orthostatic stress less than that of upright posture on Earth. In-flight countermeasures, including aerobic and resistance exercise, as well as short-arm centrifugation, have been successfully deployed to counteract these mechanisms. Despite subtle changes in autonomic activity during spaceflight, underlying neurohumoral mechanisms of the autonomic nervous system remain intact and cardiovascular function remains stable during long-duration flight.

  14. The Function of the Autonomic Nervous System during Spaceflight

    PubMed Central

    Mandsager, Kyle Timothy; Robertson, David; Diedrich, André

    2015-01-01

    Introduction: Despite decades of study, a clear understanding of autonomic nervous system activity in space remains elusive. Differential interpretation of fundamental data has driven divergent theories of sympathetic activation and vasorelaxation. Methods: This paper will review the available in-flight autonomic and hemodynamic data in an effort to resolve these discrepancies. The NASA NEUROLAB mission, the most comprehensive assessment of autonomic function in microgravity to date, will be highlighted. The mechanisms responsible for altered autonomic activity during spaceflight, which include the effects of hypovolemia, cardiovascular deconditioning, and altered central processing, will be presented. Results: The NEUROLAB experiments demonstrated increased sympathetic activity and impairment of vagal baroreflex function during short-duration spaceflight. Subsequent non-invasive studies of autonomic function during spaceflight have largely reinforced these findings, and provide strong evidence that sympathetic activity is increased in space relative to the supine position on Earth. Others have suggested that microgravity induces a state of relative vasorelaxation and increased vagal activity when compared to upright posture on Earth. These ostensibly disparate theories are not mutually exclusive, but rather directly reflect different pre-flight postural controls. Conclusion: When these results are taken together, they demonstrate that the effectual autonomic challenge of spaceflight is small, and represents an orthostatic stress less than that of upright posture on Earth. In-flight countermeasures, including aerobic and resistance exercise, as well as short-arm centrifugation, have been successfully deployed to counteract these mechanisms. Despite subtle changes in autonomic activity during spaceflight, underlying neurohumoral mechanisms of the autonomic nervous system remain intact and cardiovascular function remains stable during long-duration flight. PMID:25820827

  15. Neuronal degeneration in autonomic nervous system of Dystonia musculorum mice

    PubMed Central

    2011-01-01

    Background: Dystonia musculorum (dt) is an autosomal recessive hereditary neuropathy characterized by uncoordinated movement and caused by a defect in the bullous pemphigoid antigen 1 (BPAG1) gene. The neural isoform of BPAG1 is expressed in various neurons, including those in the central and peripheral nervous systems of mice. However, most previous studies of neuronal degeneration in BPAG1-deficient mice have focused on peripheral sensory neurons, and only limited investigation of the autonomic system has been conducted. Methods: In this study, patterns of nerve innervation in cutaneous and iridial tissues were examined with the general neuronal marker protein gene product 9.5 via immunohistochemistry. For quantitative analysis, neurons within the lumbar sympathetic and parasympathetic ciliary ganglia were counted. In addition, autonomic neurons were cultured from embryonic dt/dt mutants to elucidate degenerative patterns in vitro. Distribution patterns of neuronal intermediate filaments in cultured autonomic neurons were studied using immunocytochemistry and conventional electron microscopy. Results: Our immunohistochemistry results indicate that degeneration was predominant in peripheral sensory nerves and in the autonomic innervation of sweat glands and irises of dt/dt mice. Quantitative results confirmed that the number of neurons was significantly decreased in the lumbar sympathetic ganglia as well as in the parasympathetic ciliary ganglia of dt/dt mice compared with those of wild-type mice. We also observed that neuronal intermediate filaments aggregated abnormally in cultured autonomic neurons from dt/dt embryos. Conclusions: These results suggest that a deficiency in the cytoskeletal linker BPAG1 is responsible for predominant sensory nerve degeneration and severe autonomic degeneration in dt/dt mice. Additionally, abnormally aggregated neuronal intermediate filaments may participate in neuronal death of cultured

  16. Vision system for dial gage torque wrench calibration

    NASA Astrophysics Data System (ADS)

    Aggarwal, Neelam; Doiron, Theodore D.; Sanghera, Paramjeet S.

    1993-11-01

    In this paper, we present the development of a fast and robust vision system which, in conjunction with the Dial Gage Calibration system developed by AKO Inc., will be used by the U.S. Army to calibrate dial gage torque wrenches. The vision system detects the change in the angular position of the dial pointer in a dial gage; the angular change is proportional to the applied torque. The input to the system is a sequence of images of the torque wrench dial gage taken at different dial pointer positions, and the system reports the angular difference between the positions. The primary components of this vision system include modules for image acquisition, linear feature extraction, and angle measurement. For each of these modules, several techniques were evaluated and the most applicable one was selected. The system has numerous other applications, such as reading and calibrating other analog instruments.
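
    The angle-measurement step reduces to trigonometry once the dial centre and pointer tip have been located by the feature-extraction module. A hedged sketch of that step (the function names and the wrap-around convention are ours, not the paper's):

```python
import math

def pointer_angle_deg(cx, cy, tip_x, tip_y):
    """Angle of the dial pointer, from the dial centre to the detected tip."""
    return math.degrees(math.atan2(tip_y - cy, tip_x - cx))

def angular_change_deg(before, after):
    """Signed change between two pointer angles, wrapped to (-180, 180]."""
    d = (after - before) % 360.0
    return d - 360.0 if d > 180.0 else d
```

    Since the applied torque is proportional to the angular change, the calibration reading is just this difference scaled by the wrench's torque-per-degree constant.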

  17. Achieving safe autonomous landings on Mars using vision-based approaches

    NASA Astrophysics Data System (ADS)

    Pien, Homer

    1992-03-01

    Autonomous landing capabilities will be critical to the success of planetary exploration missions, and in particular to the exploration of Mars. Past studies have indicated that the probability of failure associated with open-loop landings is unacceptably high. Two approaches to achieving autonomous landings with higher probabilities of success are currently under analysis. If a landing site has been certified as hazard free, then navigational aids can be used to facilitate a precision landing. When only limited surface knowledge is available and landing areas cannot be certified as hazard free, then a hazard detection and avoidance approach can be used, in which the vehicle selects hazard free landing sites in real-time during its descent. Issues pertinent to both approaches, including sensors and algorithms, are presented. Preliminary results indicate that one promising approach to achieving high accuracy precision landing is to correlate optical images of the terrain acquired during the terminal descent phase with a reference image. For hazard detection scenarios, a sensor suite comprised of a passive intensity sensor and a laser ranging sensor appears promising as a means of achieving robust landings.
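
    A standard way to score the match between a descent image and a reference image, in the spirit of the correlation approach described above, is normalized cross-correlation; the sketch below is a generic illustration, not the mission algorithm:

```python
import math

def ncc(patch_a, patch_b):
    """Normalized cross-correlation of two equal-size image patches
    (flat lists of pixel intensities). Returns a score in [-1, 1],
    with 1 meaning a perfect match up to brightness and contrast."""
    n = len(patch_a)
    ma = sum(patch_a) / n
    mb = sum(patch_b) / n
    num = sum((a - ma) * (b - mb) for a, b in zip(patch_a, patch_b))
    da = math.sqrt(sum((a - ma) ** 2 for a in patch_a))
    db = math.sqrt(sum((b - mb) ** 2 for b in patch_b))
    return num / (da * db)
```

    Sliding such a score over the reference image and taking the peak gives the terrain-relative position estimate that a precision-landing navigator could fuse with inertial data.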

  18. Improving Human/Autonomous System Teaming Through Linguistic Analysis

    NASA Technical Reports Server (NTRS)

    Meszaros, Erica L.

    2016-01-01

    An area of increasing interest for the next generation of aircraft is autonomy and the integration of increasingly autonomous systems into the national airspace. Such integration requires humans to work closely with autonomous systems, forming human and autonomous agent teams. The intention behind such teaming is that a team composed of both humans and autonomous agents will operate better than homogeneous teams. Procedures exist for licensing pilots to operate in the national airspace system, and current work is being done to define methods for validating the function of autonomous systems; however, there is no method in place for assessing the interaction of these two disparate systems. Moreover, these systems are currently operated primarily by subject matter experts, limiting their use and the benefits of such teams. Providing additional information about the ongoing mission to the operator can lead to increased usability and allow for operation by non-experts. Linguistic analysis of the context of verbal communication provides insight into the intended meaning of commonly heard phrases such as "What's it doing now?" Analyzing the semantic sphere surrounding these common phrases enables the prediction of the operator's intent and allows the interface to supply the operator's desired information.

  19. Vision-based augmented reality computer assisted surgery navigation system

    NASA Astrophysics Data System (ADS)

    Sun, Lei; Chen, Xin; Xu, Kebin; Li, Xin; Xu, Wei

    2007-12-01

    A vision-based Augmented Reality computer-assisted surgery navigation system is presented in this paper. It applies the Augmented Reality technique to a surgery navigation system, so that the surgeon's vision of the real world is enhanced. In the system, camera calibration is used to calculate the camera's projection matrix, and virtual-real registration is then performed using the resulting transformation. The merging of synthetic 3D information into the user's vision is realized by texture-mapping techniques. The experimental results demonstrate the feasibility of the system we have designed.
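
    The calibration-then-registration step rests on the pinhole projection model: calibration yields a 3x4 projection matrix P = K[R|t] that maps a virtual 3D point into pixel coordinates for overlay. A minimal sketch with invented calibration numbers (not the paper's actual calibration):

```python
def project(P, X):
    """Project a 3D point X = (x, y, z) through a 3x4 projection matrix P
    into pixel coordinates (u, v) via homogeneous coordinates."""
    Xh = (X[0], X[1], X[2], 1.0)
    r0, r1, r2 = (sum(P[r][c] * Xh[c] for c in range(4)) for r in range(3))
    return r0 / r2, r1 / r2

# Toy calibration: focal length 800 px, principal point (320, 240),
# identity rotation, translation t = (0, 0, 2) m (last column is K @ t).
P = [[800, 0, 320, 640],
     [0, 800, 240, 480],
     [0,   0,   1,   2]]

# A virtual point 0.1 m to the right of the origin, overlaid on the image:
u, v = project(P, (0.1, 0.0, 0.0))
```

    Registration then amounts to expressing virtual objects in the same world frame the calibration used, so `project` places them consistently over the live camera view.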

  20. Night vision imaging system lighting evaluation methodology

    NASA Astrophysics Data System (ADS)

    Task, H. Lee; Pinkus, Alan R.; Barbato, Maryann H.; Hausmann, Martha A.

    2005-05-01

    In order for night vision goggles (NVGs) to be effective in aircraft operations, it is necessary for the cockpit lighting and displays to be NVG compatible. It has been assumed that the cockpit lighting is compatible with NVGs if the radiance values are compliant with the limits listed in Mil-L-85762A and Mil-Std-3009. However, these documents also describe a NVG-lighting compatibility field test procedure that is based on visual acuity. The objective of the study described in this paper was to determine how reliable and precise the visual acuity-based (VAB) field evaluation method is and compare it to a VAB method that employs less expensive equipment. In addition, an alternative, objective method of evaluating compatibility of the cockpit lighting was investigated. An inexpensive cockpit lighting simulator was devised to investigate two different interference conditions and six different radiance levels per condition. This paper describes the results, which indicate the objective method, based on light output of the NVGs, is more precise and reliable than the visual acuity-based method. Precision and reliability were assessed based on a probability of rejection (of the lighting system) function approach that was developed specifically for this study.

  1. Lighting And Optics Expert System For Machine Vision

    NASA Astrophysics Data System (ADS)

    Novini, Amir

    1989-03-01

    Machine Vision and the field of Artificial Intelligence are both new technologies which have evolved mainly within the past decade with the growth of computers and microchips. And, although research continues, both have emerged from the experimental state to industrial reality. Today's machine vision systems are solving thousands of manufacturing problems in various industries, and the impact of Artificial Intelligence, and more specifically, the use of "Expert Systems" in industry is also being realized. This paper will examine how the two technologies can cross paths, and how an Expert System can become an important part of an overall machine vision solution. An actual example of a development of an Expert System that helps solve machine vision lighting and optics problems will be discussed. The lighting and optics Expert System was developed to assist the end user to configure the "Front End" of a vision system to help solve the overall machine vision problem more effectively, since lack of attention to lighting and optics has caused many failures of this technology. Other areas of machine vision technology where Expert Systems could apply will also be discussed.

  2. Lighting And Optics Expert System For Machine Vision

    NASA Astrophysics Data System (ADS)

    Novini, Amir

    1988-12-01

    Machine Vision and the field of Artificial Intelligence are both new technologies which have evolved mainly within the past decade with the growth of computers and microchips. And, although research continues, both have emerged from the experimental state to industrial reality. Today's machine vision systems are solving thousands of manufacturing problems in various industries, and the impact of Artificial Intelligence, and more specifically, the use of "Expert Systems" in industry is also being realized. This paper will examine how the two technologies can cross paths, and how an Expert System can become an important part of an overall machine vision solution. An actual example of a development of an Expert System that helps solve machine vision lighting and optics problems will be discussed. The lighting and optics Expert System was developed to assist the end user to configure the "Front End" of a vision system to help solve the overall machine vision problem more effectively, since lack of attention to lighting and optics has caused many failures of this technology. Other areas of machine vision technology where Expert Systems could apply will also be discussed.

  3. Advances in Autonomous Systems for Missions of Space Exploration

    NASA Astrophysics Data System (ADS)

    Gross, A. R.; Smith, B. D.; Briggs, G. A.; Hieronymus, J.; Clancy, D. J.

    New missions of space exploration will require unprecedented levels of autonomy to successfully accomplish their objectives. Both inherent complexity and communication distances will preclude levels of human involvement common to current and previous space flight missions. With exponentially increasing capabilities of computer hardware and software, including networks and communication systems, a new balance of work is being developed between humans and machines. This new balance holds the promise of meeting the greatly increased space exploration requirements, along with dramatically reduced design, development, test, and operating costs. New information technologies, which take advantage of knowledge-based software, model-based reasoning, and high-performance computer systems, will enable the development of a new generation of design and development tools, schedulers, and vehicle and system health monitoring and maintenance capabilities. Such tools will provide a degree of machine intelligence and associated autonomy that has previously been unavailable. These capabilities are critical to the future of space exploration, since the science and operational requirements of such missions, together with budgetary constraints, limit the ability to monitor and control these missions with a standing army of ground-based controllers. System autonomy capabilities have made great strides in recent years, for both ground and space flight applications. Autonomous systems have flown on advanced spacecraft, providing new levels of spacecraft capability and mission safety. Such systems operate by utilizing model-based reasoning that provides the capability to work from high-level mission goals, while deriving the detailed system commands internally, rather than having such commands transmitted from Earth. This enables missions of such complexity and communications distance as are not otherwise possible, as well as many more efficient and low-cost operations.

  4. Systems, methods and apparatus for quiescence of autonomic systems with self action

    NASA Technical Reports Server (NTRS)

    Hinchey, Michael G. (Inventor); Sterritt, Roy (Inventor)

    2011-01-01

    Systems, methods and apparatus are provided in which an autonomic unit or element is quiesced. A quiesce component of an autonomic unit can cause the autonomic unit to self-destruct if a stay-alive reprieve signal is not received after a predetermined time.
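
    The stay-alive mechanism can be sketched with a watchdog timer (hypothetical class and method names, not the patented implementation):

```python
import threading
import time

class QuiesceComponent:
    """Sketch of the stay-alive idea from the abstract: the autonomic unit
    self-destructs unless a reprieve signal arrives before the deadline."""

    def __init__(self, timeout_s, on_self_destruct):
        # Arm the self-destruct; it fires unless cancelled in time.
        self._timer = threading.Timer(timeout_s, on_self_destruct)
        self._timer.start()

    def stay_alive(self):
        """Reprieve received in time: cancel the pending self-destruct."""
        self._timer.cancel()

events = []
a = QuiesceComponent(0.1, lambda: events.append("quiesced"))
a.stay_alive()                    # reprieve arrives: timer cancelled

b = QuiesceComponent(0.1, lambda: events.append("quiesced"))
time.sleep(0.3)                   # no reprieve: self-destruct fires once
```

    Only the unit that misses its reprieve quiesces, so a healthy supervisor keeps its elements alive simply by signalling on schedule.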

  5. A design strategy for autonomous systems

    NASA Technical Reports Server (NTRS)

    Forster, Pete

    1989-01-01

    Some solutions to crucial issues regarding the competent performance of an autonomously operating robot are identified; namely, that of handling multiple and variable data sources containing overlapping information and maintaining coherent operation while responding adequately to changes in the environment. Support for the ideas developed for the construction of such behavior are extracted from speculations in the study of cognitive psychology, an understanding of the behavior of controlled mechanisms, and the development of behavior-based robots in a few robot research laboratories. The validity of these ideas is supported by some simple simulation experiments in the field of mobile robot navigation and guidance.

  6. Fuzzy Decision-Making Fuser (FDMF) for Integrating Human-Machine Autonomous (HMA) Systems with Adaptive Evidence Sources.

    PubMed

    Liu, Yu-Ting; Pal, Nikhil R; Marathe, Amar R; Wang, Yu-Kai; Lin, Chin-Teng

    2017-01-01

    A brain-computer interface (BCI) creates a direct communication pathway between the human brain and an external device or system. In contrast to patient-oriented BCIs, which are intended to restore inoperative or malfunctioning aspects of the nervous system, a growing number of BCI studies focus on designing auxiliary systems that are intended for everyday use. The goal of building these BCIs is to provide capabilities that augment existing intact physical and mental capabilities. However, a key challenge to BCI research is human variability; factors such as fatigue, inattention, and stress vary both across different individuals and for the same individual over time. If these issues are addressed, autonomous systems may provide additional benefits that enhance system performance and prevent problems introduced by individual human variability. This study proposes a human-machine autonomous (HMA) system that simultaneously aggregates human and machine knowledge to recognize targets in a rapid serial visual presentation (RSVP) task. The HMA focuses on integrating an RSVP BCI with computer vision techniques in an image-labeling domain. A fuzzy decision-making fuser (FDMF) is then applied in the HMA system to provide a natural adaptive framework for evidence-based inference by incorporating an integrated summary of the available evidence (i.e., human and machine decisions) and associated uncertainty. Consequently, the HMA system dynamically aggregates decisions involving uncertainties from both human and autonomous agents. The collaborative decisions made by an HMA system can achieve and maintain superior performance more efficiently than either the human or autonomous agents can achieve independently. The experimental results shown in this study suggest that the proposed HMA system with the FDMF can effectively fuse decisions from human brain activities and the computer vision techniques to improve overall performance on the RSVP recognition task. This conclusion

  7. Autonomous rendezvous and feature detection system using TV imagery

    NASA Technical Reports Server (NTRS)

    Rice, R. B., Jr.

    1977-01-01

    Algorithms and equations are used for the conversion of standard television imaging system information into directly usable spatial and dimensional information. The system allows utilization of a spacecraft imaging system as a sensor for operations such as deriving spacecraft steering signals, tracking, autonomous rendezvous and docking, and ranging.

  8. Lighting and optics expert system for machine vision

    NASA Astrophysics Data System (ADS)

    Novini, Amir R.

    1991-03-01

    Machine Vision and the field of Artificial Intelligence are both new technologies which have evolved mainly within the past decade with the growth of computers and microchips. And, although research continues, both have emerged from the experimental state to industrial reality. Today's machine vision systems are solving thousands of manufacturing problems in various industries, and the impact of Artificial Intelligence, and more specifically, the use of "Expert Systems" in industry is also being realized. This paper will examine how the two technologies can cross paths, and how an Expert System can become an important part of an overall machine vision solution. An actual example of a development of an Expert System that helps solve machine vision lighting and optics problems will be discussed. The lighting and optics Expert System was developed to assist the end user to configure the "Front End" of a vision system to help solve the overall machine vision problem more effectively, since lack of attention to lighting and optics has caused many failures of this technology. Other areas of machine vision technology where Expert Systems could apply will also be discussed.

  9. Autonomic Cluster Management System (ACMS): A Demonstration of Autonomic Principles at Work

    NASA Technical Reports Server (NTRS)

    Baldassari, James D.; Kopec, Christopher L.; Leshay, Eric S.; Truszkowski, Walt; Finkel, David

    2005-01-01

    Cluster computing, whereby a large number of simple processors or nodes are combined together to apparently function as a single powerful computer, has emerged as a research area in its own right. The approach offers a relatively inexpensive means of achieving significant computational capabilities for high-performance computing applications, while simultaneously affording the ability to increase that capability simply by adding more (inexpensive) processors. However, the task of manually managing and configuring a cluster quickly becomes impossible as the cluster grows in size. Autonomic computing is a relatively new approach to managing complex systems that can potentially solve many of the problems inherent in cluster management. We describe the development of a prototype Autonomic Cluster Management System (ACMS) that exploits autonomic properties in automating cluster management.

  10. Guidance, navigation and control system for autonomous proximity operations and docking of spacecraft

    NASA Astrophysics Data System (ADS)

    Lee, Daero

    This study develops an integrated guidance, navigation and control system for use in autonomous proximity operations and docking of spacecraft. A new approach strategy is proposed based on a modified system developed for use with the International Space Station. It is composed of three "V-bar hops" in the closing transfer phase, two periods of stationkeeping and a "straight line V-bar" approach to the docking port. Guidance, navigation and control functions are independently designed and are then integrated in the form of linear Gaussian-type control. The translational maneuvers are determined through the integration of the state-dependent Riccati equation control formulated using the nonlinear relative motion dynamics with the weight matrices adjusted at the steady state condition. The reference state is provided by a guidance function, and the relative navigation is performed using a rendezvous laser vision system and a vision sensor system, where a sensor mode change is made along the approach in order to provide effective navigation. The rotational maneuvers are determined through a linear quadratic Gaussian-type control using star trackers and gyros, and a vision sensor. The attitude estimation mode change is made from absolute estimation to relative attitude estimation during the stationkeeping phase inside the approach corridor. The rotational controller provides the precise attitude control using weight matrices adjusted at the steady state condition, including the uncertainty of the moment of inertia and external disturbance torques. A six degree-of-freedom simulation demonstrates that the newly developed GNC system successfully autonomously performs proximity operations and meets the conditions for entering the final docking phase.

  11. Autonomous navigation system based on GPS and magnetometer data

    NASA Technical Reports Server (NTRS)

    Thienel, Julie K. (Inventor); Harman, Richard R. (Inventor); Bar-Itzhack, Itzhack Y. (Inventor)

    2004-01-01

    This invention is drawn to an autonomous navigation system using the Global Positioning System (GPS) and magnetometers for low Earth orbit satellites. Because a magnetometer is reliable and always provides information on spacecraft attitude, rate, and orbit, the magnetometer-GPS configuration solves the GPS initialization problem, decreasing the convergence time for the navigation estimate and improving overall accuracy. The magnetometer-GPS configuration also enables the system to avoid costly and inherently less reliable gyros for rate estimation. Being autonomous, this invention provides black-box spacecraft navigation, producing attitude, orbit, and rate estimates without any ground input, with high accuracy and reliability.
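
    The abstract does not give the estimator details. As an illustrative sketch only (not the patented algorithm), the classic TRIAD method shows how a measured magnetic-field vector, paired with its model value at the GPS-derived position plus one additional reference direction, determines attitude deterministically:

```python
import math

def norm(v):
    m = math.sqrt(sum(x * x for x in v))
    return [x / m for x in v]

def cross(a, b):
    return [a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0]]

def triad(b1, b2, r1, r2):
    """Classic TRIAD: build the attitude matrix A (with A @ r = b) from two
    vector pairs measured in the body frame (b1, b2) and known in the
    reference frame (r1, r2) -- e.g. a magnetometer reading versus the
    geomagnetic model value at the GPS position, plus a second reference."""
    def axes(v1, v2):
        t1 = norm(v1)
        t2 = norm(cross(v1, v2))
        return [t1, t2, cross(t1, t2)]
    B, R = axes(b1, b2), axes(r1, r2)
    # A = B^T R, where the rows of B and R are the triad axes.
    return [[sum(B[k][i] * R[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

# Example: the body frame is the reference frame rotated 90 degrees about z.
A = triad([0, -1, 0], [0, 0, 1], [1, 0, 0], [0, 0, 1])
```

    In a real filter this deterministic fix would seed or update a recursive estimator rather than stand alone.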

  12. Integrated vision-based GNC for autonomous rendezvous and capture around Mars

    NASA Astrophysics Data System (ADS)

    Strippoli, L.; Novelli, G.; Gil Fernandez, J.; Colmenarejo, P.; Le Peuvedic, C.; Lanza, P.; Ankersen, F.

    2015-06-01

    Integrated GNC (iGNC) is an activity aimed at designing, developing and validating the GNC for autonomously performing the rendezvous and capture phase of the Mars sample return mission as defined during the Mars Sample Return Orbiter (MSRO) ESA study. The validation cycle includes testing in an end-to-end simulator, in a real-time avionics-representative test bench and, finally, in a dynamic hardware-in-the-loop test bench for assessing the feasibility, performance and figures of merit of the baseline approach defined during the MSRO study, for both nominal and contingency scenarios. The on-board software (OBSW) is tailored to work with the sensors, actuators and orbits baseline proposed in MSRO. The whole rendezvous is based on optical navigation, aided by RF Doppler during the search and first orbit determination of the orbiting sample. The simulated rendezvous phase also includes non-linear orbit synchronization, based on a dedicated non-linear guidance algorithm robust to Mars ascent vehicle (MAV) injection accuracy or MAV failures resulting in elliptic target orbits. The search phase is very demanding for the image processing (IP) due to the very high visual magnitude of the target with respect to the stellar background, and the attitude GNC requires very high pointing stability to fulfil IP constraints. A trade-off of innovative, autonomous navigation filters indicates the unscented Kalman filter (UKF) as the approach that provides the best results in terms of robustness, response to non-linearities and performance compatible with the computational load. At short range, an optimized IP based on a convex hull algorithm has been developed in order to guarantee LoS and range measurements from hundreds of metres down to capture.
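
    The short-range convex-hull idea can be illustrated with a minimal sketch: hull the detected target pixels, then infer range from the hull's angular extent. All numbers here (target diameter, focal length, pixel set) are invented for illustration; the real IP operates on camera images of the orbiting sample.

```python
import math

def convex_hull(points):
    """Andrew's monotone chain; returns hull vertices of a 2-D point set."""
    pts = sorted(set(points))
    if len(pts) <= 2:
        return pts
    def half(seq):
        h = []
        for p in seq:
            # Pop while the last turn is not counter-clockwise.
            while len(h) >= 2 and ((h[-1][0] - h[-2][0]) * (p[1] - h[-2][1]) -
                                   (h[-1][1] - h[-2][1]) * (p[0] - h[-2][0])) <= 0:
                h.pop()
            h.append(p)
        return h[:-1]
    return half(pts) + half(reversed(pts))

def range_from_hull(hull, diameter_m, focal_px):
    """Range from apparent size: the hull's pixel extent subtends the known
    target diameter at the unknown range (small-angle pinhole model)."""
    extent_px = max(math.dist(p, q) for p in hull for q in hull)
    return focal_px * diameter_m / extent_px

# Target pixels including interior clutter; the hull isolates the silhouette.
pix = [(0, 0), (10, 0), (10, 10), (0, 10), (5, 5), (3, 7)]
hull = convex_hull(pix)
rng = range_from_hull(hull, 0.2, 1000.0)   # 0.2 m canister, f = 1000 px
```

    The hull also yields a centroid for the line-of-sight direction, which is what the guidance loop consumes at these ranges.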

  13. Vision aided inertial navigation system augmented with a coded aperture

    NASA Astrophysics Data System (ADS)

    Morrison, Jamie R.

    Navigation through a three-dimensional indoor environment is a formidable challenge for an autonomous micro air vehicle. A main obstacle to indoor navigation is maintaining a robust navigation solution (i.e. air vehicle position and attitude estimates) given the inadequate access to satellite positioning information. A MEMS (micro-electro-mechanical system) based inertial navigation system provides a small, power efficient means of maintaining a vehicle navigation solution; however, unmitigated error propagation from relatively noisy MEMS sensors results in the loss of a usable navigation solution over a short period of time. Several navigation systems use camera imagery to diminish error propagation by measuring the direction to features in the environment. Changes in feature direction provide information regarding direction for vehicle movement, but not the scale of movement. Movement scale information is contained in the depth to the features. Depth-from-defocus is a classic technique proposed to derive depth from a single image that involves analysis of the blur inherent in a scene with a narrow depth of field. A challenge to this method is distinguishing blurriness caused by the focal blur from blurriness inherent to the observed scene. In 2007, MIT's Computer Science and Artificial Intelligence Laboratory demonstrated replacing the traditional rounded aperture with a coded aperture to produce a complex blur pattern that is more easily distinguished from the scene. A key to measuring depth using a coded aperture then is to correctly match the blur pattern in a region of the scene with a previously determined set of blur patterns for known depths. As the depth increases from the focal plane of the camera, the observable change in the blur pattern for small changes in depth is generally reduced. Consequently, as the depth of a feature to be measured using a depth-from-defocus technique increases, the measurement performance decreases. However, a Fresnel zone
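
    The key matching step described above, picking the calibrated blur pattern that best explains an observed patch, can be sketched as a nearest-pattern search (toy 1-D "patterns" here; real coded-aperture blur patterns are 2-D images, and SSD is only one of several plausible match scores):

```python
def ssd(a, b):
    """Sum of squared differences between two equal-length patches."""
    return sum((x - y) ** 2 for x, y in zip(a, b))

def match_depth(observed_patch, pattern_bank):
    """Pick the calibration depth whose stored coded-aperture blur pattern
    best explains the observed patch (minimum SSD)."""
    return min(pattern_bank,
               key=lambda depth: ssd(observed_patch, pattern_bank[depth]))

# Toy bank: blur spreads as depth moves away from the focal plane.
bank = {1.0: [0, 8, 0],      # near focus: sharp
        2.0: [2, 4, 2],      # mild spread
        3.0: [3, 2, 3]}      # strong spread
depth = match_depth([2, 5, 2], bank)
```

    As the abstract notes, the discriminability of adjacent patterns in the bank shrinks with distance from the focal plane, which is exactly where this matching degrades.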

  14. 78 FR 5557 - Twenty-First Meeting: RTCA Special Committee 213, Enhanced Flight Vision Systems/Synthetic Vision...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-01-25

    ... Federal Aviation Administration Twenty-First Meeting: RTCA Special Committee 213, Enhanced Flight Vision... of Transportation (DOT). ACTION: Meeting Notice of RTCA Special Committee 213, Enhanced Flight Vision... of the twenty-first meeting of the RTCA Special Committee 213, Enhanced Flight Vision Systems...

  15. 75 FR 71146 - In the Matter of Certain Machine Vision Software, Machine Vision Systems, and Products Containing...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-11-22

    ... COMMISSION In the Matter of Certain Machine Vision Software, Machine Vision Systems, and Products Containing..., and the sale within the United States after importation of certain machine vision software, machine..., California; Techno Soft Systemnics, Inc. (``Techno Soft'') of Japan; Fuji Machine Manufacturing Co., Ltd....

  16. Autonomous Dispersed Control System for Independent Micro Grid

    NASA Astrophysics Data System (ADS)

    Kawasaki, Kensuke; Matsumura, Shigenori; Iwabu, Koichi; Fujimura, Naoto; Iima, Takahito

    In this paper, we describe an autonomous dispersed control system for an independent micro grid, whose performance has been substantiated in China by Shikoku Electric Power Co. and its subsidiary companies under a trust from NEDO (New Energy and Industrial Technology Development Organization). For the control of grid-interconnected generators, an exclusive information line is very important for saving fuel cost and maintaining high frequency quality in the electric power supply, but it is relatively expensive for such a small micro grid. We devised an autonomous dispersed control system that requires no exclusive information line for dispatching control and supply adjustment control. We have confirmed through the substantiation project in China that this autonomous dispersed control system for an independent micro grid performs satisfactorily in terms of fuel consumption and electric power quality.
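
    The abstract does not disclose the control law itself. A classic way to coordinate generators from the shared grid frequency alone, with no information line, is frequency droop; the sketch below is an illustration of that general idea, not the paper's scheme:

```python
def droop_power(f_measured, f_nominal, p_rated, droop_pct):
    """Frequency-droop governor: each generator adjusts its output from the
    locally measured grid frequency alone, so no exclusive information line
    is needed. A droop of 5% means output swings by 100% of rating for a
    5% frequency deviation."""
    df = (f_measured - f_nominal) / f_nominal          # per-unit deviation
    return p_rated * (-df / (droop_pct / 100.0))       # power change (W)

# Grid frequency sags to 49.75 Hz on a 50 Hz system: every 5%-droop,
# 100 kW unit autonomously picks up 10 kW, sharing the load increase.
dp = droop_power(49.75, 50.0, 100e3, 5.0)
```

    Because every unit reacts to the same frequency signal, load is shared in proportion to ratings without any dispatcher in the loop.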

  17. Demonstration of a Semi-Autonomous Hybrid Brain-Machine Interface using Human Intracranial EEG, Eye Tracking, and Computer Vision to Control a Robotic Upper Limb Prosthetic

    PubMed Central

    McMullen, David P.; Hotson, Guy; Katyal, Kapil D.; Wester, Brock A.; Fifer, Matthew S.; McGee, Timothy G.; Harris, Andrew; Johannes, Matthew S.; Vogelstein, R. Jacob; Ravitz, Alan D.; Anderson, William S.; Thakor, Nitish V.; Crone, Nathan E.

    2014-01-01

    To increase the ability of brain-machine interfaces (BMIs) to control advanced prostheses such as the modular prosthetic limb (MPL), we are developing a novel system: the Hybrid Augmented Reality Multimodal Operation Neural Integration Environment (HARMONIE). This system utilizes hybrid input, supervisory control, and intelligent robotics to allow users to identify an object (via eye tracking and computer vision) and initiate (via brain-control) a semi-autonomous reach-grasp-and-drop of the object by the MPL. Sequential iterations of HARMONIE were tested in two pilot subjects implanted with electrocorticographic (ECoG) and depth electrodes within motor areas. The subjects performed the complex task in 71.4% (20/28) and 67.7% (21/31) of trials after minimal training. Balanced accuracy for detecting movements was 91.1% and 92.9%, significantly greater than chance accuracies (p < 0.05). After BMI-based initiation, the MPL completed the entire task 100% (one object) and 70% (three objects) of the time. The MPL took approximately 12.2 seconds for task completion after system improvements implemented for the second subject. Our hybrid-BMI design prevented all but one baseline false positive from initiating the system. The novel approach demonstrated in this proof-of-principle study, using hybrid input, supervisory control, and intelligent robotics, addresses limitations of current BMIs. PMID:24760914

  18. Demonstration of a semi-autonomous hybrid brain-machine interface using human intracranial EEG, eye tracking, and computer vision to control a robotic upper limb prosthetic.

    PubMed

    McMullen, David P; Hotson, Guy; Katyal, Kapil D; Wester, Brock A; Fifer, Matthew S; McGee, Timothy G; Harris, Andrew; Johannes, Matthew S; Vogelstein, R Jacob; Ravitz, Alan D; Anderson, William S; Thakor, Nitish V; Crone, Nathan E

    2014-07-01

    To increase the ability of brain-machine interfaces (BMIs) to control advanced prostheses such as the modular prosthetic limb (MPL), we are developing a novel system: the Hybrid Augmented Reality Multimodal Operation Neural Integration Environment (HARMONIE). This system utilizes hybrid input, supervisory control, and intelligent robotics to allow users to identify an object (via eye tracking and computer vision) and initiate (via brain-control) a semi-autonomous reach-grasp-and-drop of the object by the MPL. Sequential iterations of HARMONIE were tested in two pilot subjects implanted with electrocorticographic (ECoG) and depth electrodes within motor areas. The subjects performed the complex task in 71.4% (20/28) and 67.7% (21/31) of trials after minimal training. Balanced accuracy for detecting movements was 91.1% and 92.9%, significantly greater than chance accuracies (p < 0.05). After BMI-based initiation, the MPL completed the entire task 100% (one object) and 70% (three objects) of the time. The MPL took approximately 12.2 s for task completion after system improvements implemented for the second subject. Our hybrid-BMI design prevented all but one baseline false positive from initiating the system. The novel approach demonstrated in this proof-of-principle study, using hybrid input, supervisory control, and intelligent robotics, addresses limitations of current BMIs.
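
    Balanced accuracy, the detection metric reported above, is the mean of sensitivity and specificity, which keeps a rare-target task like RSVP from rewarding a classifier that simply says "no target" every time. A minimal sketch with invented counts (the paper reports only the resulting metric, not the confusion matrix):

```python
def balanced_accuracy(tp, fn, tn, fp):
    """Mean of sensitivity (true positive rate) and specificity
    (true negative rate), robust to heavy class imbalance."""
    tpr = tp / (tp + fn)
    tnr = tn / (tn + fp)
    return (tpr + tnr) / 2.0

# Illustrative confusion-matrix counts for a target-detection run:
ba = balanced_accuracy(tp=9, fn=1, tn=95, fp=5)
```

    With 9/10 targets caught and 95/100 non-targets rejected, the balanced accuracy is 0.925, even though plain accuracy would be dominated by the plentiful non-target trials.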

  19. Autonomous-Control Concept For Instrument Pointing System

    NASA Technical Reports Server (NTRS)

    Mettler, Edward; Milman, Mark H.; Bayard, David S.

    1990-01-01

    The integrated payload articulation and identification system (IPAIDS) is a conceptual system to control the aiming of instruments aboard spacecraft of the proposed Earth Observation System (EOS). Principal features of the concept include advanced control strategies intended to assure robustness of performance over a wide range of uncertainties in the characteristics of the spacecraft and instrument system. Although originally intended for spacecraft applications, the system has potential utility on Earth for automatic control of autonomous (robotic) vehicles or of remote sensing systems.

  20. Constructing an autonomous system with infinitely many chaotic attractors

    NASA Astrophysics Data System (ADS)

    Zhang, Xu; Chen, Guanrong

    2017-07-01

    Some classical chaotic systems such as the Lorenz system and Chua system have finite numbers of chaotic attractors. This letter develops a simple, effective method for constructing lower-dimensional autonomous systems with infinitely many chaotic attractors. As an application, a Lorenz-type system and a Rössler-type system with infinitely many chaotic attractors are constructed with bifurcation analysis, and with an extension to the fractional-order setting.

  1. A 3D terrain reconstruction method of stereo vision based quadruped robot navigation system

    NASA Astrophysics Data System (ADS)

    Ge, Zhuo; Zhu, Ying; Liang, Guanhao

    2017-01-01

    To provide 3D environment information for the quadruped robot autonomous navigation system while walking over rough terrain, a novel 3D terrain reconstruction method based on stereo vision is presented. To address the problems that images collected by stereo sensors contain large regions of similar grayscale and that image matching has poor real-time performance, the watershed algorithm and the fuzzy c-means clustering algorithm are combined for contour extraction. To reduce mismatches, a dual constraint combining region matching and pixel matching is established for matching optimization. From the matched stereo edge pixel pairs, 3D coordinates are estimated according to the binocular stereo vision imaging model. Experimental results show that the proposed method yields a high stereo matching ratio and reconstructs 3D scenes quickly and efficiently.
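
    The final step, estimating 3D coordinates of matched edge pixel pairs from the binocular imaging model, reduces to triangulation from disparity for a rectified rig. A sketch with invented rig parameters (focal length, baseline, principal point):

```python
def triangulate(u_left, u_right, v, focal_px, baseline_m, cx, cy):
    """Depth from disparity for a rectified binocular rig:
    Z = f * B / d, then back-project to camera-frame (X, Y, Z)."""
    d = u_left - u_right                 # disparity in pixels
    z = focal_px * baseline_m / d
    x = (u_left - cx) * z / focal_px
    y = (v - cy) * z / focal_px
    return x, y, z

# Matched edge pixel pair with 20 px disparity, f = 500 px, 0.1 m baseline:
X, Y, Z = triangulate(340, 320, 240, 500.0, 0.1, 320, 240)
```

    Because depth varies as 1/disparity, the mismatch suppression above matters most for distant terrain, where a single-pixel disparity error produces a large depth error.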

  2. Space station automation study: Autonomous systems and assembly, volume 2

    NASA Technical Reports Server (NTRS)

    Bradford, K. Z.

    1984-01-01

    This final report, prepared by Martin Marietta Denver Aerospace, provides the technical results of their input to the Space Station Automation Study, the purpose of which is to develop informed technical guidance in the use of autonomous systems to implement space station functions, many of which can be programmed in advance and are well suited for automated systems.

  3. Autonomous Control and Diagnostics of Space Reactor Systems

    SciTech Connect

    Upadhyaya, B.R.; Xu, X.; Perillo, S.R.P.; Na, M.G.

    2006-07-01

    This paper describes three key features of the development of an autonomous control strategy for space reactor systems: a reactor simulation model for transient analysis, model-predictive control as part of the autonomous control strategy, and a fault detection and isolation module. The latter is interfaced with the control supervisor as part of a hierarchical control system. The approach has been applied to the nodal model of the SP-100 reactor with a thermo-electric generator. The results of the application demonstrate the effectiveness of the control approach and its ability to reconfigure the control mode under fault conditions. (authors)
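
    The fault detection and isolation module is not detailed in the abstract. A common residual-based scheme, comparing model predictions against sensor measurements channel by channel, can be sketched as follows (illustrative values and threshold only):

```python
def detect_faults(measured, predicted, threshold):
    """Residual-based fault detection and isolation: flag each channel
    whose |measurement - model prediction| exceeds the threshold, so the
    supervisor can both detect a fault and isolate the faulty channel."""
    return [abs(m - p) > threshold for m, p in zip(measured, predicted)]

# Model predicts three coolant outlet temperatures; channel 2 has drifted:
flags = detect_faults([600.1, 601.0, 640.0], [600.0, 600.8, 601.2], 5.0)
```

    In the hierarchical scheme described, such flags would drive the control supervisor's decision to reconfigure the control mode.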

  4. REACT - A Third Generation Language For Autonomous Robot Systems

    NASA Astrophysics Data System (ADS)

    Longley, Maxwell J.; Owens, John; Allen, Charles R.; Ratcliff, Karl

    1990-03-01

    REACT is a language under development at Newcastle for programming autonomous robot systems; it uses AI constructs and sensor information to respond to failures in assumptions about the real world by replanning a task. This paper describes the important features of a REACT-programmed robotic system and the results of some initial studies on defining an executive language using a concept called visibility sets. Constructs from the language are then applied to specific examples, e.g., a white-line follower and a railway network controller. The applicability of visibility sets to autonomous robots is evaluated.

  5. Turning a remotely controllable observatory into a fully autonomous system

    NASA Astrophysics Data System (ADS)

    Swindell, Scott; Johnson, Chris; Gabor, Paul; Zareba, Grzegorz; Kubánek, Petr; Prouza, Michael

    2014-08-01

    We describe the complex process needed to turn an existing, old, operational observatory, the Steward Observatory's 61" Kuiper Telescope, into a fully autonomous system that observes without an observer. For this purpose, we employed RTS2, an open-source, Linux-based observatory control system, together with other open-source programs and tools (GNU compilers, the Python language for scripting, JQuery UI for the Web user interface). This presentation provides a guide, with time estimates, for newcomers to the field facing the challenging task of fully autonomous observatory operations.

  6. An autonomous control system for boiler-turbine units

    SciTech Connect

    Ben-Abdennour, A.; Lee, K.Y.

    1996-06-01

    Achieving a more autonomous power plant operation is an important part of power plant control. To be autonomous, a control system needs to provide adequate control actions in the presence of significant uncertainties and/or disturbances, such as actuator or component failures, with minimum or no human assistance. However, a reasonable degree of autonomy is difficult to obtain without incorporating intelligence in the control system. This paper presents a coordinated intelligent control scheme with a high degree of autonomy. In this scheme, a fuzzy-logic-based supervisor monitors the overall plant operation and carries out the tasks of coordination, fault diagnosis, fault isolation, and fault accommodation.

  7. Autonomous satellite navigation with the Global Positioning System

    NASA Technical Reports Server (NTRS)

    Fuchs, A. J.; Wooden, W. H., II; Long, A. C.

    1977-01-01

    This paper discusses the potential of using the Global Positioning System (GPS) to provide autonomous navigation capability to NASA satellites in the 1980 era. Some of the driving forces motivating autonomous navigation are presented. These include such factors as advances in attitude control systems, onboard science annotation, and onboard gridding of imaging data. Simulation results which demonstrate baseline orbit determination accuracies using GPS data on Seasat, Landsat-D, and the Solar Maximum Mission are presented. Emphasis is placed on identifying error sources such as GPS time, GPS ephemeris, user timing biases, and user orbit dynamics, and in a parametric sense on evaluating their contribution to the orbit determination accuracies.

  8. The Tactile Vision Substitution System: Applications in Education and Employment

    ERIC Educational Resources Information Center

    Scadden, Lawrence A.

    1974-01-01

    The Tactile Vision Substitution System converts the visual image from a narrow-angle television camera to a tactual image on a 5-inch square, 100-point display of vibrators placed against the abdomen of the blind person. (Author)

  10. Parallel Architectures and Parallel Algorithms for Integrated Vision Systems. Ph.D. Thesis

    NASA Technical Reports Server (NTRS)

    Choudhary, Alok Nidhi

    1989-01-01

    Computer vision is regarded as one of the most complex and computationally intensive problems. An integrated vision system (IVS) is a system that uses vision algorithms from all levels of processing to perform a high-level application (e.g., object recognition). An IVS normally involves algorithms from low-level, intermediate-level, and high-level vision. Designing parallel architectures for vision systems is of tremendous interest to researchers. Several issues in parallel architectures and parallel algorithms for integrated vision systems are addressed.

  11. Trust Management in Swarm-Based Autonomic Computing Systems

    SciTech Connect

    Maiden, Wendy M.; Haack, Jereme N.; Fink, Glenn A.; McKinnon, Archibald D.; Fulp, Errin W.

    2009-07-07

    Reputation-based trust management techniques can address issues such as insider threat as well as quality of service issues that may be malicious in nature. However, trust management techniques must be adapted to the unique needs of the architectures and problem domains to which they are applied. Certain characteristics of swarms such as their lightweight ephemeral nature and indirect communication make this adaptation especially challenging. In this paper we look at the trust issues and opportunities in mobile agent swarm-based autonomic systems and find that by monitoring the trustworthiness of the autonomic managers rather than the swarming sensors, the trust management problem becomes much more scalable and still serves to protect the swarms. We also analyze the applicability of trust management research as it has been applied to architectures with similar characteristics. Finally, we specify required characteristics for trust management mechanisms to be used to monitor the trustworthiness of the entities in a swarm-based autonomic computing system.

  12. Music and Autonomic Nervous System (Dys)function

    PubMed Central

    Ellis, Robert J.; Thayer, Julian F.

    2010-01-01

    Despite a wealth of evidence for the involvement of the autonomic nervous system (ANS) in health and disease and the ability of music to affect ANS activity, few studies have systematically explored the therapeutic effects of music on ANS dysfunction. Furthermore, when ANS activity is quantified and analyzed, it is usually from a point of convenience rather than from an understanding of its physiological basis. After a review of the experimental and therapeutic literatures exploring music and the ANS, a “Neurovisceral Integration” perspective on the interplay between the central and autonomic nervous systems is introduced, and the associated implications for physiological, emotional, and cognitive health are explored. The construct of heart rate variability is discussed both as an example of this complex interplay and as a useful metric for exploring the sometimes subtle effect of music on autonomic response. Suggestions for future investigations using musical interventions are offered based on this integrative account. PMID:21197136
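
    Heart rate variability, the metric highlighted above, is commonly summarized with simple time-domain statistics over successive beat-to-beat (RR) intervals. A sketch with made-up interval data:

```python
from statistics import mean, pstdev

def sdnn(rr_ms):
    """Standard deviation of RR (NN) intervals: a global HRV index."""
    return pstdev(rr_ms)

def rmssd(rr_ms):
    """Root mean square of successive differences: short-term HRV, the
    component most often linked to parasympathetic (vagal) influence."""
    diffs = [b - a for a, b in zip(rr_ms, rr_ms[1:])]
    return mean(d * d for d in diffs) ** 0.5

rr = [812, 790, 805, 830, 795, 810]   # illustrative RR intervals in ms
print(round(sdnn(rr), 1), round(rmssd(rr), 1))  # → 12.9 23.6
```

    Frequency-domain measures (e.g., high-frequency power) build on the same interval series, which is why the choice of metric should follow from the physiological question rather than convenience.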

  13. Building Artificial Vision Systems with Machine Learning

    SciTech Connect

    LeCun, Yann

    2011-02-23

    Three questions pose the next challenge for Artificial Intelligence (AI), robotics, and neuroscience. How do we learn perception (e.g. vision)? How do we learn representations of the perceptual world? How do we learn visual categories from just a few examples?

  14. Planning In A Hierarchical Nested Autonomous Control System

    NASA Astrophysics Data System (ADS)

    Meystel, A.

    1987-02-01

    In this paper, theoretical foundations of planning processes are outlined in a form applicable to the design and control of autonomous mobile robots. Planning/control is shown to be a unified recursive operation of decision making applied to a nested hierarchy of knowledge representation. The core of the theory is based upon methods developed in the areas of Post production systems, theory of coding, and the team theory of decentralized stochastic control. A class of autonomous control systems for robots is defined, and a problem of information representation is addressed for this class. The phenomenon of nesting is analyzed, and the minimum ε-entropy rule is determined for arranging efficient design and control procedures for systems of intelligent control. A concept of a nested hierarchical knowledge-based controller is employed, which enables minimum-time control using nested dynamic programming. An application of this concept is unfolded for a system of knowledge-based control of an autonomous mobile robot. Key words: Autonomous Control Systems, Decision Making, Production Systems, Decentralized Stochastic Control, Dynamic Programming, Hierarchical Control, Knowledge-Based Controllers, ε-entropy, Planning, Navigation, Guidance, Prediction, Contingencies, Mobile Robots.

  15. Vision/INS Integrated Navigation System for Poor Vision Navigation Environments

    PubMed Central

    Kim, Youngsun; Hwang, Dong-Hwan

    2016-01-01

    In order to improve the performance of an inertial navigation system, many aiding sensors can be used. Among these aiding sensors, a vision sensor is of particular note due to its benefits in terms of weight, cost, and power consumption. This paper proposes an inertial and vision integrated navigation method for poor vision navigation environments. The proposed method uses focal plane measurements of landmarks in order to provide position, velocity and attitude outputs even when the number of landmarks on the focal plane is not enough for navigation. In order to verify the proposed method, computer simulations and van tests are carried out. The results show that the proposed method gives accurate and reliable position, velocity and attitude outputs when the number of landmarks is insufficient. PMID:27754350
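
    The focal-plane measurements this method feeds to the filter are perspective projections of known landmarks; a minimal pinhole sketch (the frames, attitude, and coordinates below are illustrative, not the paper's setup):

```python
import numpy as np

def focal_plane_measurement(p_landmark, p_vehicle, C_nb, f=1.0):
    """Normalised focal-plane coordinates (u, v) of a known landmark.
    C_nb rotates navigation-frame vectors into camera/body axes."""
    los_nav = np.asarray(p_landmark, float) - np.asarray(p_vehicle, float)
    x, y, z = C_nb @ los_nav              # line of sight in camera axes
    if z <= 0:
        raise ValueError("landmark behind the camera")
    return f * x / z, f * y / z

# Landmark 20 m ahead of the camera, offset 10 m and 5 m (identity attitude):
u, v = focal_plane_measurement([10.0, 5.0, 20.0], [0.0, 0.0, 0.0], np.eye(3))
print(u, v)  # → 0.5 0.25
```

    Each such (u, v) pair enters the filter as a nonlinear measurement constraining both position and attitude error, which is why the approach degrades gracefully when only a few landmarks are visible.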

  16. Vision/INS Integrated Navigation System for Poor Vision Navigation Environments.

    PubMed

    Kim, Youngsun; Hwang, Dong-Hwan

    2016-10-12

    In order to improve the performance of an inertial navigation system, many aiding sensors can be used. Among these aiding sensors, a vision sensor is of particular note due to its benefits in terms of weight, cost, and power consumption. This paper proposes an inertial and vision integrated navigation method for poor vision navigation environments. The proposed method uses focal plane measurements of landmarks in order to provide position, velocity and attitude outputs even when the number of landmarks on the focal plane is not enough for navigation. In order to verify the proposed method, computer simulations and van tests are carried out. The results show that the proposed method gives accurate and reliable position, velocity and attitude outputs when the number of landmarks is insufficient.

  17. 77 FR 16890 - Eighteenth Meeting: RTCA Special Committee 213, Enhanced Flight Visions Systems/Synthetic Vision...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-03-22

    ... Review of SVS/CVS (WG1) and Vision Systems (WG2) objectives; Work Group 2 (VS) discussion, April 18, 2012; Work Group 1 (SVS/CVS) breakout discussion if needed, April 19, 2012 ... From the Federal Register Online via the Government Publishing Office, Department of Transportation ...

  18. Human Factors And Safety Considerations Of Night Vision Systems Flight

    NASA Astrophysics Data System (ADS)

    Verona, Robert W.; Rash, Clarence E.

    1989-03-01

    Military aviation night vision systems greatly enhance the capability to operate during periods of low illumination. After flying with night vision devices, most aviators are apprehensive about returning to unaided night flight. Current night vision imaging devices allow aviators to fly during ambient light conditions which would be extremely dangerous, if not impossible, with unaided vision. However, the visual input afforded by these devices does not approach that experienced using the unencumbered, unaided eye during periods of daylight illumination. Many visual parameters, e.g., acuity, field-of-view, depth perception, etc., are compromised when night vision devices are used. The inherent characteristics of image-intensification-based sensors introduce new problems associated with interpreting visual information whose spatial and spectral content differs from that of unaided vision. In addition, the mounting of these devices onto the helmet raises concerns of fatigue resulting from increased head-supported weight and a shift in center-of-gravity. All of these concerns have produced numerous human factors and safety issues relating to the use of night vision systems. These issues are identified and discussed in terms of their possible effects on user performance and safety.

  19. Using Multimodal Input for Autonomous Decision Making for Unmanned Systems

    NASA Technical Reports Server (NTRS)

    Neilan, James H.; Cross, Charles; Rothhaar, Paul; Tran, Loc; Motter, Mark; Qualls, Garry; Trujillo, Anna; Allen, B. Danette

    2016-01-01

    Autonomous decision making in the presence of uncertainty is a deeply studied problem space, particularly in the area of autonomous systems operations for land, air, sea, and space vehicles. Various techniques, ranging from single-algorithm solutions to complex ensemble classifier systems, have been utilized in a research context to solve mission-critical flight decisions. Realizing such systems on actual autonomous hardware, however, is a difficult systems-integration problem that constitutes a majority of applied robotics development timelines. The ability to reliably and repeatedly classify objects during a vehicle's mission execution is vital if the vehicle is to mitigate both static and dynamic environmental concerns so that the mission may be completed successfully and the vehicle can operate and return safely. In this paper, the Autonomy Incubator proposes and discusses an ensemble learning and recognition system planned for our autonomous framework, AEON, in selected domains, which fuses decision criteria using prior experience at both the individual-classifier layer and the ensemble layer to mitigate environmental uncertainty during operation.
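
    One simple way an ensemble layer can fuse decision criteria using prior experience is accuracy-weighted voting; the classifier names and weights below are invented stand-ins, not AEON's actual components:

```python
from collections import defaultdict

def weighted_vote(predictions, weights):
    """Fuse per-classifier labels, weighting each vote by a number that
    stands in for that classifier's past accuracy (prior experience)."""
    scores = defaultdict(float)
    for clf, label in predictions.items():
        scores[label] += weights.get(clf, 1.0)
    return max(scores, key=scores.get)

preds = {"color_hist": "rock", "hog_svm": "tree", "cnn": "tree"}
w = {"color_hist": 0.4, "hog_svm": 0.8, "cnn": 0.9}
print(weighted_vote(preds, w))  # → tree
```

    Updating the weights online as classifications are confirmed or refuted is one way prior experience feeds back into the fusion rule.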

  20. A functional system architecture for fully autonomous robot

    NASA Astrophysics Data System (ADS)

    Kalaycioglu, S.

    The Mobile Servicing System (MSS) Autonomous Robotics Program intends to define and plan the development of technologies required to provide a supervised autonomous operation capability for the Special Purpose Dexterous Manipulator (SPDM) on the MSS. The operational functions required for the SPDM to perform its tasks, in both fully autonomous and supervised modes, are identified. Functional decomposition is performed using a graphics-oriented methodology called the Structural Analysis Design Technique. This process defines the functional architecture of the system, the types of data required to support its functionality, and the control processes that need to be put in place. On the basis of the functional decomposition, a technology breakdown structure is also developed, and a preliminary estimate of the status and maturity of each relevant technology is made. The developed functional hierarchy is found to be very effective for a robotic system with any level of autonomy. Moreover, this hierarchy can easily be applied to an existing very-low-level autonomous system and can provide a smooth transition toward a higher degree of autonomy. The developed functional hierarchy will also play a very significant role both in the system design and in the development of the control hierarchy.

  1. Autonomous Frequency-Domain System-Identification Program

    NASA Technical Reports Server (NTRS)

    Yam, Yeung; Mettler, Edward; Bayard, David S.; Hadaegh, Fred Y.; Milman, Mark H.; Scheid, Robert E.

    1993-01-01

    Autonomous Frequency Domain Identification (AU-FREDI) computer program implements system of methods, algorithms, and software developed for identification of parameters of mathematical models of dynamics of flexible structures and characterization, by use of system transfer functions, of such models, dynamics, and structures regarded as systems. Software considered collection of routines modified and reassembled to suit system-identification and control experiments on large flexible structures.

  3. Expert system issues in automated, autonomous space vehicle rendezvous

    NASA Technical Reports Server (NTRS)

    Goodwin, Mary Ann; Bochsler, Daniel C.

    1987-01-01

    The problems involved in automated autonomous rendezvous are briefly reviewed, and the Rendezvous Expert (RENEX) expert system is discussed with reference to its goals, approach used, and knowledge structure and contents. RENEX has been developed to support streamlining operations for the Space Shuttle and Space Station programs and to aid definition of mission requirements for the autonomous portions of rendezvous for the Mars Surface Sample Return and Comet Nucleus Sample Return unmanned missions. The experience with RENEX to date and recommendations for further development are presented.

  4. Mamdani Fuzzy System for Indoor Autonomous Mobile Robot

    NASA Astrophysics Data System (ADS)

    Khan, M. K. A. Ahamed; Rashid, Razif; Elamvazuthi, I.

    2011-06-01

    Several control algorithms for autonomous mobile robot navigation have been proposed in the literature. Recently, the employment of non-analytical methods of computing, such as fuzzy logic, evolutionary computation, and neural networks, has demonstrated the utility and potential of these paradigms for intelligent control of mobile robot navigation. In this paper, a Mamdani fuzzy system for an autonomous mobile robot is developed. The paper begins with a discussion of the conventional controller, followed by a detailed description of the fuzzy logic controller.
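
    A Mamdani controller of the kind described fuzzifies an input, clips each rule's output set by its firing strength (min implication), aggregates by max, and defuzzifies by centroid. A two-rule sketch; the memberships and rules are illustrative, not the paper's rule base:

```python
import numpy as np

def mamdani_steer(d):
    """IF obstacle NEAR THEN steer HARD; IF obstacle FAR THEN steer STRAIGHT.
    Min implication, max aggregation, centroid defuzzification."""
    near = max(0.0, min(1.0, (2.0 - d) / 2.0))   # shoulder sets on distance [m]
    far = max(0.0, min(1.0, (d - 1.0) / 2.0))
    u = np.linspace(0.0, 1.0, 101)               # steering-magnitude universe
    hard = np.minimum(near, u)                   # HARD ramps up with u
    straight = np.minimum(far, 1.0 - u)          # STRAIGHT ramps down
    agg = np.maximum(hard, straight)             # aggregated fuzzy output
    return float((u * agg).sum() / agg.sum()) if agg.sum() else 0.0

print(round(mamdani_steer(0.5), 2), round(mamdani_steer(3.0), 2))  # → 0.65 0.33
```

    Note the smooth interpolation between rules for intermediate distances, the property that makes fuzzy controllers attractive for reactive navigation.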

  5. TOWARD HIGHLY SECURE AND AUTONOMIC COMPUTING SYSTEMS: A HIERARCHICAL APPROACH

    SciTech Connect

    Lee, Hsien-Hsin S

    2010-05-11

    The overall objective of this research project is to develop novel architectural techniques as well as system software to achieve a highly secure and intrusion-tolerant computing system. Such a system will be autonomous, self-adapting, and introspective, with self-healing capability under the circumstances of improper operations, abnormal workloads, and malicious attacks. The scope of this research includes: (1) system-wide, unified introspection techniques for autonomic systems, (2) secure information-flow microarchitecture, (3) memory-centric security architecture, (4) authentication control and its implication for security, (5) digital rights management, and (6) microarchitectural denial-of-service attacks on shared resources. During the period of the project, we developed several architectural techniques and system software for achieving a robust, secure, and reliable computing system toward our goal.

  6. Area scanning vision inspection system by using mirror control

    NASA Astrophysics Data System (ADS)

    Jeong, Sang Y.; Min, Sungwook; Yang, Wonyoung

    2001-02-01

    As the pressure increases to deliver vision products with faster speed and higher inspection resolution at lower cost, an area-scanning vision inspection system can be one of the good solutions. To inspect a large area at high resolution, a conventional vision system requires moving either the camera or the target; the system therefore suffers low speed and high cost due to the requirements of a mechanical moving system or a higher-resolution camera. Because only tiny mirror-angle movements are required to change the field of view, the XY-mirror-controlled area-scanning vision system is able to capture random area images at high speed. Elimination of an external precise moving mechanism is another benefit of mirror control. The image distortion due to the lens and the mirror system is automatically compensated right after each image is captured, so that absolute coordinates can be calculated in real time. A motorized focusing system, synchronized to the mirror scanning system, maintains proper focus over the variable working distance between lens and targets during large-area inspection. By using the XY-mirror-controlled area-scanning vision inspection system, a fast and economical system can be integrated while inducing no vibration and requiring less space. This paper describes the principle of the area-scanning method, optical effects of the scanning, the position calibration method, inspection flows, and some implementation results.

  7. Panoramic stereo sphere vision

    NASA Astrophysics Data System (ADS)

    Feng, Weijia; Zhang, Baofeng; Röning, Juha; Zong, Xiaoning; Yi, Tian

    2013-01-01

    Conventional stereo vision systems have a small field of view (FOV), which limits their usefulness for certain applications. While panoramic vision is able to "see" in all directions of the observation space, scene depth information is lost because of the mapping from 3D reference coordinates to the 2D panoramic image. In this paper, we present an innovative vision system built from a special combined fish-eye lens module that is capable of producing 3D coordinate information for the whole global observation space while simultaneously acquiring a 360°×360° panoramic image with no blind area, using a single vision device and one static shot. It is called Panoramic Stereo Sphere Vision (PSSV). We propose the geometric model, mathematical model, and parameter calibration method in this paper. Video surveillance, robotic autonomous navigation, virtual reality, driving assistance, multiple-maneuvering-target tracking, automatic mapping of environments, and attitude estimation are some of the applications that will benefit from PSSV.

  8. Is There Anything "Autonomous" in the Nervous System?

    ERIC Educational Resources Information Center

    Rasia-Filho, Alberto A.

    2006-01-01

    The terms "autonomous" or "vegetative" are currently used to identify one part of the nervous system composed of sympathetic, parasympathetic, and gastrointestinal divisions. However, the concepts that are under the literal meaning of these words can lead to misconceptions about the actual nervous organization. Some clear-cut examples indicate…

  9. Modeling and Control Strategies for Autonomous Robotic Systems

    DTIC Science & Technology

    1991-12-23

    Report documentation fragment: final report, "Modeling and Control Strategies for Autonomous Robotic Systems"; personal author: Roger W. Brockett; Research Triangle Park, NC 27709-2211.

  11. Random attractor of non-autonomous stochastic Boussinesq lattice system

    SciTech Connect

    Zhao, Min; Zhou, Shengfan

    2015-09-15

    In this paper, we first consider the existence of a tempered random attractor for a second-order non-autonomous stochastic lattice dynamical system of nonlinear Boussinesq equations affected by time-dependent coupled coefficients, deterministic forces, and multiplicative white noise. Then, we establish the upper semicontinuity of random attractors as the intensity of the noise approaches zero.

  12. Latency in Visionic Systems: Test Methods and Requirements

    NASA Technical Reports Server (NTRS)

    Bailey, Randall E.; Arthur, J. J., III; Williams, Steven P.; Kramer, Lynda J.

    2005-01-01

    A visionics device creates a pictorial representation of the external scene for the pilot. The ultimate objective of these systems may be to electronically generate a form of Visual Meteorological Conditions (VMC) to eliminate weather or time-of-day as an operational constraint and to provide enhancement over actual visual conditions where eye-limiting resolution may be a limiting factor. Empirical evidence has shown that total system delays, or latencies, including those of the imaging sensors and display systems, can critically degrade their utility, usability, and acceptability. Definitions and measurement techniques are offered herein as common test and evaluation methods for latency testing in visionics device applications. Based upon available data, very different latency requirements are indicated depending upon the piloting task, the role the visionics device plays in this task, and the characteristics of the visionics cockpit display device, including its resolution, field-of-regard, and field-of-view. The least stringent latency requirements will involve Head-Up Display (HUD) applications, where the visionics imagery provides situational information as a supplement to symbology guidance and command information. Conversely, the visionics system latency requirement for a large field-of-view Head-Worn Display application, providing a Virtual-VMC capability from which the pilot will derive visual guidance, will be the most stringent, having a value as low as 20 msec.

  13. The research on projective visual system of night vision goggles

    NASA Astrophysics Data System (ADS)

    Zhao, Shun-long

    2009-07-01

    Driven by the need for lightweight night vision goggles with good performance, we apply a projective lens in night vision goggles to act as the visual system. A projection lens with a 40-deg FOV is provided. The useful diameter of the image intensifier is 16 mm, and the resolutions at center and edge are both 60 lp/mm. The projection lens has a 28 mm diameter and 20 g weight. The maximum distortion of the system is less than 0.15%, and the MTF remains above 0.6 at 60 lp/mm across the FOV, so the lens meets the requirements of the visual system. Two types of projective visual system for night vision goggles are presented: the direct-view projective visual system and the see-through projective visual system. The see-through projective visual system enables the user to observe objects directly with the eyes, without any other action, when the environment suddenly becomes bright. We conclude that the projective system has advantages over a traditional eyepiece in night vision goggles: it helps reduce the volume, lighten the load on the neck, and improve imaging quality. It provides a new idea and concept for visual system design in night vision goggles.

  14. Central- and autonomic nervous system coupling in schizophrenia

    PubMed Central

    Schulz, Steffen; Bolz, Mathias; Bär, Karl-Jürgen

    2016-01-01

    Autonomic nervous system (ANS) dysfunction has been well described in schizophrenia (SZ), a severe mental disorder. Nevertheless, the coupling between the ANS and central brain activity has not been addressed until now in SZ. The interactions between the central nervous system (CNS) and the ANS need to be considered as a feedback–feed-forward system that supports flexible and adaptive responses to specific demands. For the first time, to the best of our knowledge, this study investigates central–autonomic couplings (CAC), studying heart rate, blood pressure, and electroencephalogram in paranoid schizophrenic patients and comparing them with age- and gender-matched healthy subjects (CO). The emphasis is on determining how these couplings are composed of the different regulatory aspects of the CNS–ANS. We found that CAC were bidirectional, and that the causal influence of central activity on systolic blood pressure was more strongly pronounced than that on heart rate in paranoid schizophrenic patients when compared with CO. In paranoid schizophrenic patients, the central activity was a much stronger variable, being more random and having fewer rhythmic oscillatory components. This study provides a more in-depth understanding of the interplay of neuronal and autonomic regulatory processes in SZ and most likely greater insights into the complex relationship between psychotic stages and autonomic activity. PMID:27044986

  15. A Vision for Systems Engineering Applied to Wind Energy (Presentation)

    SciTech Connect

    Felker, F.; Dykes, K.

    2015-01-01

    This presentation was given at the Third Wind Energy Systems Engineering Workshop on January 14, 2015. Topics covered include the importance of systems engineering, a vision for systems engineering as applied to wind energy, and application of systems engineering approaches to wind energy research and development.

  16. Design of a Multi-Sensor Cooperation Travel Environment Perception System for Autonomous Vehicle

    PubMed Central

    Chen, Long; Li, Qingquan; Li, Ming; Zhang, Liang; Mao, Qingzhou

    2012-01-01

    This paper describes the environment perception system designed for the intelligent vehicle SmartV-II, which won the 2010 Future Challenge. The system utilizes the cooperation of multiple lasers and cameras to realize several functions necessary for autonomous navigation: road curb detection, lane detection, and traffic sign recognition. Multiple single-scan lasers are integrated to detect the road curb based on a Z-variance method. Vision-based lane detection is realized by a two-scan method combined with an image model. A Haar-like-feature-based method is applied for traffic sign detection, and a SURF matching method is used for sign classification. Experimental results validate the effectiveness of the proposed algorithms and the whole system.
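
    Haar-like features of the kind used for the sign detector are rectangle-sum differences evaluated in constant time from an integral image; a sketch in which the window geometry and pixel values are illustrative:

```python
import numpy as np

def integral_image(img):
    """Summed-area table with a zero first row/column for easy indexing."""
    ii = np.zeros((img.shape[0] + 1, img.shape[1] + 1), dtype=np.int64)
    ii[1:, 1:] = np.cumsum(np.cumsum(img, axis=0), axis=1)
    return ii

def rect_sum(ii, r, c, h, w):
    """Sum of an h-by-w rectangle with top-left corner (r, c): four lookups."""
    return int(ii[r + h, c + w] - ii[r, c + w] - ii[r + h, c] + ii[r, c])

def haar_two_rect_vertical(ii, r, c, h, w):
    """Two-rectangle Haar-like feature: top half minus bottom half.
    Large magnitude marks a horizontal intensity edge of the kind a
    detection cascade uses to localise sign boundaries."""
    half = h // 2
    return rect_sum(ii, r, c, half, w) - rect_sum(ii, r + half, c, half, w)

img = np.vstack([np.full((4, 6), 200), np.full((4, 6), 50)])  # bright over dark
ii = integral_image(img)
print(haar_two_rect_vertical(ii, 0, 0, 8, 6))  # → 3600
```

    Because every rectangle sum costs four lookups regardless of its size, thousands of such features can be evaluated per window in real time.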

  17. Depth perception using crossed-looking stereo vision system

    NASA Astrophysics Data System (ADS)

    Yeh, Chih-Ping

    1994-10-01

    It is well known that the human vision system is a multi-resolution and spatially shift-variant system. The sizes of the edge filters on each retina are small and roughly constant within the foveal area, and increase linearly with eccentricity outside the fovea. This mechanism allows the human visual system to perceive a detailed description of the target surface within the foveal vision area, and to obtain a global description of the scene in the peripheral vision area. This paper describes a stereo vision system which simulates this mechanism of human vision. The system uses a pair of crossed-looking cameras as sensors. The fixation point is used as the geometric center of the 3D space. With this perception geometry, the vision system perceives depth variation of the target surface, rather than the absolute distance from the target surface to the sensors. This property allows stereo correspondence to be achieved through a fusion process similar to the optical fusion of two diffraction patterns. The fusion process obtains disparity information for the entire image in one convolution operation; the volume of calculation, and therefore the processing time needed for depth perception, is largely reduced. The mechanism of spatially shift-variant processing is implemented by applying a logarithmic conformal mapping to the images. As a result, the sensitivity of depth perception decreases exponentially from the center to the peripheral area of the image. This allows the vision system to obtain depth information about the scene within a broad field of view.
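
    The logarithmic conformal mapping used for shift-variant processing is the log-polar transform w = log(z - z0): equal steps in the mapped image cover exponentially larger retinal areas with eccentricity. A minimal sketch:

```python
import math

def log_polar(x, y, x0=0.0, y0=0.0):
    """Map a pixel to (log radius, angle) about the fixation centre (x0, y0).
    Equal steps in the first coordinate correspond to exponentially growing
    radii, giving the retina-like fall-off in resolution with eccentricity."""
    dx, dy = x - x0, y - y0
    r = math.hypot(dx, dy)
    return math.log(r), math.atan2(dy, dx)

# Points twice as far from the fixation centre differ by a constant log step:
u1, _ = log_polar(4.0, 0.0)
u2, _ = log_polar(8.0, 0.0)
print(round(u2 - u1, 4))  # → 0.6931 (= ln 2), independent of eccentricity
```

    In the mapped domain, a radial scaling of the scene becomes a simple shift, which is one reason the representation suits foveated stereo fusion.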

  18. An autonomous rendezvous and docking system using cruise missile technologies

    NASA Technical Reports Server (NTRS)

    Jones, Ruel Edwin

    1991-01-01

    In November 1990, the Autonomous Rendezvous & Docking (AR&D) system was first demonstrated for members of NASA's Strategic Avionics Technology Working Group. This simulation utilized prototype hardware from the Cruise Missile and Advanced Centaur Avionics systems. The objective was to show that all the accuracy, reliability, and operational requirements established for a spacecraft to dock with Space Station Freedom could be met by the proposed system. The rapid prototyping capabilities of the Advanced Avionics Systems Development Laboratory were used to evaluate the proposed system in a real-time, hardware-in-the-loop simulation of the rendezvous and docking reference mission. The simulation permits manual, supervised automatic, and fully autonomous operations to be evaluated. It is also being upgraded to test an Autonomous Approach and Landing (AA&L) system. The AA&L and AR&D systems are very similar: both use inertial guidance and control systems supplemented by GPS, and both use an Image Processing System (IPS) for target recognition and tracking. The IPS includes a general-purpose multiprocessor computer and a selected suite of sensors that provide the required relative position and orientation data. Graphic displays can also be generated by the computer, providing the astronaut/operator with real-time guidance and navigation data with enhanced video or sensor imagery.

  19. Blackboard architectures and their relationship to autonomous space systems

    NASA Technical Reports Server (NTRS)

    Thornbrugh, Allison

    1988-01-01

    The blackboard architecture provides a powerful paradigm for the autonomy expected in future spaceborne systems, especially SDI and Space Station. Autonomous systems will require skill in both the classic task of information analysis and the newer tasks of decision making, planning and system control. Successful blackboard systems have been built to deal with each of these tasks separately. The blackboard paradigm achieves success in difficult domains through its ability to integrate several uncertain sources of knowledge. In addition to flexible behavior during autonomous operation, the system must also be capable of incrementally growing from semiautonomy to full autonomy. The blackboard structure allows this development. The blackboard's ability to handle error, its flexible execution, and variants of this paradigm are discussed as they apply to specific problems of the space environment.
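The integration of independent knowledge sources through shared state can be sketched minimally as follows (a generic blackboard toy, not any of the systems described here):

```python
# Minimal blackboard sketch: independent knowledge sources read the shared
# blackboard and post partial conclusions; a simple control loop keeps firing
# whichever source can contribute until no source makes progress.

class Blackboard:
    def __init__(self):
        self.data = {}

def ks_sensor(bb):
    if "raw" in bb.data and "feature" not in bb.data:
        bb.data["feature"] = bb.data["raw"] * 2   # toy analysis step
        return True
    return False

def ks_planner(bb):
    if "feature" in bb.data and "plan" not in bb.data:
        bb.data["plan"] = f"act-on-{bb.data['feature']}"
        return True
    return False

def run(bb, sources):
    progress = True
    while progress:          # fire any source that can contribute
        progress = any(ks(bb) for ks in sources)

bb = Blackboard()
bb.data["raw"] = 21
run(bb, [ks_planner, ks_sensor])   # registration order does not matter
print(bb.data["plan"])             # -> act-on-42
```

The opportunistic control loop is what gives the paradigm its flexibility: sources can be added incrementally, which mirrors the growth from semiautonomy to full autonomy discussed above.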

  1. Global Positioning System Synchronized Active Light Autonomous Docking System

    NASA Technical Reports Server (NTRS)

    Howard, Richard T. (Inventor); Book, Michael L. (Inventor); Bryan, Thomas C. (Inventor); Bell, Joseph L. (Inventor)

    1996-01-01

    A Global Positioning System Synchronized Active Light Autonomous Docking System (GPSSALADS) for automatically docking a chase vehicle with a target vehicle comprising at least one active light emitting target which is operatively attached to the target vehicle. The target includes a three-dimensional array of concomitantly flashing lights which flash at a controlled common frequency. The GPSSALADS further comprises a visual tracking sensor operatively attached to the chase vehicle for detecting and tracking the target vehicle. Its performance is synchronized with the flash frequency of the lights by a synchronization means which is comprised of first and second internal clocks operatively connected to the active light target and visual tracking sensor, respectively, for providing timing control signals thereto, respectively. The synchronization means further includes first and second Global Positioning System receivers operatively connected to the first and second internal clocks, respectively, for repeatedly providing simultaneous synchronization pulses to the internal clocks, respectively. In addition, the GPSSALADS includes a docking process controller means which is operatively attached to the chase vehicle and is responsive to the visual tracking sensor for producing commands for the guidance and propulsion system of the chase vehicle.

  2. Global Positioning System Synchronized Active Light Autonomous Docking System

    NASA Technical Reports Server (NTRS)

    Howard, Richard (Inventor)

    1994-01-01

    A Global Positioning System Synchronized Active Light Autonomous Docking System (GPSSALADS) for automatically docking a chase vehicle with a target vehicle comprises at least one active light emitting target which is operatively attached to the target vehicle. The target includes a three-dimensional array of concomitantly flashing lights which flash at a controlled common frequency. The GPSSALADS further comprises a visual tracking sensor operatively attached to the chase vehicle for detecting and tracking the target vehicle. Its performance is synchronized with the flash frequency of the lights by a synchronization means which is comprised of first and second internal clocks operatively connected to the active light target and visual tracking sensor, respectively, for providing timing control signals thereto, respectively. The synchronization means further includes first and second Global Positioning System receivers operatively connected to the first and second internal clocks, respectively, for repeatedly providing simultaneous synchronization pulses to the internal clocks, respectively. In addition, the GPSSALADS includes a docking process controller means which is operatively attached to the chase vehicle and is responsive to the visual tracking sensor for producing commands for the guidance and propulsion system of the chase vehicle.

  3. Miniature Autonomous Rocket Recovery System (MARRS)

    DTIC Science & Technology

    2011-05-01

    ChIMU = Cheap Inertial Measurement Unit GNC = Guidance, Navigation and Control GPS = Global Positioning System INS = Inertial Navigation System...factors during deployment and necessity to employ integrated GPS/INS navigation system rather than just GPS-based GNC system. This paper focuses on...delivering payloads even closer to the target.3,4 Specifically, this paper explores a capability to utilize an advanced GNC system developed for the

  4. TVS: An Environment For Building Knowledge-Based Vision Systems

    NASA Astrophysics Data System (ADS)

    Weymouth, Terry E.; Amini, Amir A.; Tehrani, Saeid

    1989-03-01

    Advances in the field of knowledge-guided computer vision require the development of large scale projects and experimentation with them. One factor which impedes such development is the lack of software environments which combine standard image processing and graphics abilities with the ability to perform symbolic processing. In this paper, we describe a software environment that assists in the development of knowledge-based computer vision projects. We have built, upon Common LISP and C, a software development environment which combines standard image processing tools and a standard blackboard-based system, with the flexibility of the LISP programming environment. This environment has been used to develop research projects in knowledge-based computer vision and dynamic vision for robot navigation.

  5. Skin biopsies in the assessment of the autonomic nervous system.

    PubMed

    Wang, Ningshan; Gibbons, Christopher H

    2013-01-01

    Cutaneous punch biopsies are widely used to evaluate nociceptive C fibers in patients with suspected small-fiber neuropathy. Recent advances in immunohistochemical techniques and interest in cutaneous autonomic innervation have expanded the role of skin biopsy in the evaluation of the peripheral nervous system. The dermal layers of the skin provide a unique window into the structural evaluation of the autonomic nervous system. Peripheral adrenergic and cholinergic fibers innervate a number of cutaneous structures, such as sweat glands and arrector pili muscles, and can easily be seen with punch skin biopsies. Skin biopsies allow for both regional sampling, in diseases with patchy distribution, and the opportunity for repeated sampling in progressive disorders. The structural evaluation of cutaneous autonomic innervation is still in its scientific infancy, with a number of different methodologies and techniques that will require standardization and widespread acceptance before becoming a standard of care. Future studies of autonomic innervation in acquired, hereditary, neurodegenerative, or autoimmune disorders will be necessary to determine the clinical utility of skin biopsy in these disease states.

  6. Scheduling lessons learned from the Autonomous Power System

    NASA Technical Reports Server (NTRS)

    Ringer, Mark J.

    1992-01-01

    The Autonomous Power System (APS) project at NASA LeRC is designed to demonstrate the application of integrated intelligent diagnosis, control, and scheduling techniques to space power distribution systems. The project consists of three elements: the Autonomous Power Expert System (APEX) for Fault Diagnosis, Isolation, and Recovery (FDIR); the Autonomous Intelligent Power Scheduler (AIPS), which efficiently assigns start times and resources to activities; and power hardware (Brassboard) that emulates a space-based power system. The AIPS scheduler was tested within the APS system. It efficiently assigns available power to the requesting activities and shares this information with other software agents within the APS system in order to implement the generated schedule. The AIPS scheduler can also cooperatively recover from fault situations by rescheduling the affected loads on the Brassboard in conjunction with the APEX FDIR system. AIPS served as a learning tool and an initial scheduling testbed for the integration of FDIR and automated scheduling systems. Many lessons learned from the AIPS scheduler are now being integrated into a new scheduler called SCRAP (Scheduler for Continuous Resource Allocation and Planning). This paper serves three purposes: it gives an overview of the AIPS implementation, presents lessons learned from the AIPS scheduler, and briefly describes how these lessons are being applied to the new SCRAP scheduler.
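The core scheduling task, assigning start times so that activity power demands fit under the available power profile, can be sketched as a greedy earliest-fit search. The activities and numbers here are hypothetical; this is not the AIPS code:

```python
# Greedy earliest-fit power scheduler sketch: for each activity, pick the
# first start slot where its power draw fits under the remaining profile.

def schedule(activities, power_profile):
    """activities: list of (name, duration, watts); power_profile: available W per slot."""
    free = list(power_profile)
    plan = {}
    for name, dur, watts in activities:
        for start in range(len(free) - dur + 1):
            if all(free[t] >= watts for t in range(start, start + dur)):
                for t in range(start, start + dur):
                    free[t] -= watts          # reserve the power
                plan[name] = start
                break
    return plan

plan = schedule([("heater", 2, 60), ("camera", 1, 50)], [100] * 4)
print(plan)  # heater fits at slot 0; camera (50 W) must wait for slot 2
```

A real scheduler would also handle priorities, deadlines, and rescheduling after faults, which is where the cooperation with the FDIR system enters.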

  7. Multiple-channel Streaming Delivery for Omnidirectional Vision System

    NASA Astrophysics Data System (ADS)

    Iwai, Yoshio; Nagahara, Hajime; Yachida, Masahiko

    An omnidirectional vision system is an imaging system that captures the entire surrounding scene using a hyperbolic mirror and a conventional CCD camera. This paper proposes a streaming server that can efficiently transfer movies captured by an omnidirectional vision system over the Internet. The proposed system uses multiple channels to deliver multiple movies synchronously, which enables clients to view different directions of the omnidirectional movies and to change the viewing area during playback. Our evaluation experiments show that the proposed streaming server can effectively deliver multiple movies via multiple channels.

  8. Plugin-docking system for autonomous charging using particle filter

    NASA Astrophysics Data System (ADS)

    Koyasu, Hiroshi; Wada, Masayoshi

    2017-03-01

    Autonomous charging of the robot battery is one of the key functions for expanding the working areas of robots. To realize it, most existing systems use custom docking stations or artificial markers; in other words, they can only charge at a few specific outlets. If this limit can be removed, the working areas of robots expand significantly. In this paper, we describe a plugin-docking system for autonomous charging that does not require any custom docking stations or artificial markers. A single camera is used for recognizing the 3D position of an outlet socket. A particle filter-based image tracking algorithm that is robust to illumination change is applied. The algorithm is implemented on a robot with an omnidirectional moving system. The experimental results show the effectiveness of our system.
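The particle filter tracking step can be sketched in one dimension (a toy illustration, not the paper's tracker): predict each particle with noise, weight it by how well it explains the observation, and resample:

```python
import math
import random

# Toy 1D particle filter: track a position from noisy observations.

def pf_step(particles, observation, motion=0.0, noise=0.5, obs_sigma=1.0):
    # predict: apply the motion model plus process noise to every particle
    particles = [p + motion + random.gauss(0, noise) for p in particles]
    # update: weight each particle by the Gaussian likelihood of the observation
    weights = [math.exp(-((p - observation) ** 2) / (2 * obs_sigma ** 2))
               for p in particles]
    total = sum(weights) or 1.0
    weights = [w / total for w in weights]
    # resample: draw a new particle set proportionally to the weights
    return random.choices(particles, weights=weights, k=len(particles))

random.seed(0)
particles = [random.uniform(-10.0, 10.0) for _ in range(500)]
for obs in [3.0, 3.1, 2.9, 3.0]:       # observations of a target near x = 3
    particles = pf_step(particles, obs)
estimate = sum(particles) / len(particles)
```

The paper's tracker applies the same cycle to the socket's image position, with a likelihood chosen to be robust to illumination change.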

  9. Multisensor robotic system for autonomous space maintenance and repair

    NASA Technical Reports Server (NTRS)

    Abidi, M. A.; Green, W. L.; Chandra, T.; Spears, J.

    1988-01-01

    The feasibility of realistic autonomous space manipulation tasks using multisensory information is demonstrated. The system is capable of acquiring, integrating, and interpreting multisensory data to locate, mate, and demate a Fluid Interchange System (FIS) and a Module Interchange System (MIS). In both cases, autonomous location of a guiding light target, mating, and demating of the system are performed. The implemented vision-driven techniques determine the arbitrary two-dimensional position and orientation of the mating elements as well as the arbitrary three-dimensional position and orientation of the light targets. A force/torque sensor continuously monitors the six components of force and torque exerted on the end-effector. Both FIS and MIS experiments were successfully accomplished on mock-ups built for this purpose. The method is immune to variations in ambient light, which is particularly important given the 90-minute day-night cycle in space.

  11. Preliminary Design of an Autonomous Amphibious System

    DTIC Science & Technology

    2016-09-01

    report was performed by the Unmanned Systems Advanced Development Branch (Code 71720) of the Advanced Systems & Applied Sciences Division, Space and...amphibious system and associated software architecture being developed under the Space and Naval Warfare Systems Center Pacific (SSC Pacific) Naval...master cylinders and a special component called a shuttle valve are required. One master cylinder provides braking from the human driver and the other

  12. The organization of an autonomous learning system

    NASA Technical Reports Server (NTRS)

    Kanerva, Pentti

    1988-01-01

    The organization of systems that learn from experience is examined, human beings and animals being prime examples of such systems. How is their information processing organized? They build an internal model of the world and base their actions on the model. The model is dynamic and predictive, and it includes the systems' own actions and their effects. In modeling such systems, a large pattern of features represents a moment of the system's experience. Some of the features are provided by the system's senses, some control the system's motors, and the rest have no immediate external significance. A sequence of such patterns then represents the system's experience over time. By storing such sequences appropriately in memory, the system builds a world model based on experience. In addition to the essential function of memory, fundamental roles are played by a sensory system that makes raw information about the world suitable for memory storage and by a motor system that affects the world. The relation of sensory and motor systems to the memory is discussed, together with how favorable actions can be learned and unfavorable actions avoided. Results in classical learning theory are explained in terms of the model, more advanced forms of learning are discussed, and the relevance of the model to the frame problem of robotics is examined.
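The idea of storing experience as sequences of patterns and using them predictively can be sketched minimally as follows (the representation and names are illustrative only, not the model described above):

```python
# Toy sequence memory: remember which pattern followed which, so that
# replaying a cue predicts what the world (and the system's own actions)
# will do next.

class SequenceMemory:
    def __init__(self):
        self.next_of = {}

    def store(self, sequence):
        # record each transition in the experienced sequence
        for a, b in zip(sequence, sequence[1:]):
            self.next_of[a] = b

    def predict(self, cue, steps=1):
        out = []
        for _ in range(steps):
            cue = self.next_of.get(cue)
            if cue is None:
                break
            out.append(cue)
        return out

mem = SequenceMemory()
mem.store(["see-door", "push-door", "door-opens"])
print(mem.predict("see-door", steps=2))  # -> ['push-door', 'door-opens']
```

Because stored patterns include motor features, predicting the continuation of a sequence doubles as recalling the action that previously led to a favorable outcome.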

  13. Autonomous Systems in Human Behavior and Development

    ERIC Educational Resources Information Center

    Wolff, P.

    1974-01-01

    Reviews research which demonstrates that responses from different behavior systems to a given stimulus situation may be far from perfectly correlated with each other. Discusses the phylogenetic and ontogenetic development of these systems and the roles of both the species and the individual in bringing the systems into mutual correspondence.…

  14. Large autonomous spacecraft electrical power system (LASEPS)

    NASA Technical Reports Server (NTRS)

    Dugal-Whitehead, Norma R.; Johnson, Yvette B.

    1992-01-01

    NASA Marshall Space Flight Center is creating a large high-voltage electrical power system testbed called LASEPS. This testbed is being developed to simulate an end-to-end power system, from power generation and sources to loads. When the system is completed, it will have several power configurations, including several battery configurations: two 120 V batteries, one or two 150 V batteries, and one 250 to 270 V battery. This breadboard encompasses varying levels of autonomy, from remote power converters to conventional software control to expert system control of the power system elements. In this paper, the construction and provisions of this breadboard are discussed.

  16. Machine vision system for online inspection of freshly slaughtered chickens

    USDA-ARS?s Scientific Manuscript database

    A machine vision system was developed and evaluated for the automation of online inspection to differentiate freshly slaughtered wholesome chickens from systemically diseased chickens. The system consisted of an electron-multiplying charge-coupled-device camera used with an imaging spectrograph and ...

  17. Representing Autonomous Systems Self-Confidence through Competency Boundaries

    DTIC Science & Technology

    2015-01-01

    operators in understanding the state of unmanned vehicles under control. A sensing-optimization/verification-action (SOVA) model, similar to the perception...how areas of uncertainty affect system performance. LIDAR and GPS were examined for scenarios where sensed surroundings could be inaccurate, while...Figure 1 illustrates the human-machine trust loop in which an operator assigns a task to an autonomous system, in this case an Unmanned Vehicle (UV

  18. Challenges of the Viking Mars Lander system. [autonomous design

    NASA Technical Reports Server (NTRS)

    Goodlette, J. D.

    1975-01-01

    A number of natural constraints have led to a highly autonomous Lander system design. Almost all communications to and from the Lander are via the Orbiter. A functional description of the Lander mission is given, taking into account deorbit and descent, entry, terminal descent, landing, and landed operations. The challenges of the system design are considered along with the mechanical configuration and aspects of thermal control. Attention is given to science data return, aspects of reliability and redundancy, and details regarding the software.

  19. Toward autonomous driving: The CMU Navlab. II - Architecture and systems

    NASA Technical Reports Server (NTRS)

    Thorpe, Charles; Hebert, Martial; Kanade, Takeo; Shafer, Steven

    1991-01-01

    A description is given of EDDIE, the architecture for the Navlab mobile robot which provides a toolkit for building specific systems quickly and easily. Included in the discussion are the annotated maps used by EDDIE and the Navlab's road-following system, called the Autonomous Mail Vehicle, which was built using EDDIE and its annotated maps as a basis. The contributions of the Navlab project and the lessons learned from it are examined.

  1. Machine vision system for inspecting characteristics of hybrid rice seed

    NASA Astrophysics Data System (ADS)

    Cheng, Fang; Ying, Yibin

    2004-03-01

    Obtaining clear images, which helps improve classification accuracy, involves many factors; light source, lens extender, and background are discussed in this paper. Analysis of rice seed reflectance curves showed that the light source wavelength for discriminating diseased seeds from normal rice seeds in the monochromatic image recognition mode was about 815 nm for jinyou402 and shanyou10. To determine optimal conditions for acquiring digital images of rice seed with a computer vision system, an adjustable color machine vision system was developed. The machine vision system with a 20 mm to 25 mm lens extender produces close-up images, which makes it easy to recognize the characteristics of hybrid rice seeds. A white background proved better than a black background for inspecting rice seeds infected by disease and for the shape-based algorithms. Experimental results indicated good classification for most of the characteristics with the machine vision system. The same algorithm yielded better results under the optimized conditions for quality inspection of rice seed; specifically, the image processing can resolve details such as fine fissures.
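The kind of shape-based processing mentioned above can be sketched as simple descriptors over a thresholded image (a generic illustration with assumed values, not the paper's algorithm):

```python
# Segment a grayscale image by threshold, then compute simple shape
# descriptors (area and bounding-box extent) that a classifier could use
# to flag irregular seeds.

def shape_features(img, threshold=128):
    ys = [y for y, row in enumerate(img) for v in row if v > threshold]
    xs = [x for row in img for x, v in enumerate(row) if v > threshold]
    if not xs:
        return {"area": 0, "extent": 0.0}
    area = len(xs)
    bbox = (max(xs) - min(xs) + 1) * (max(ys) - min(ys) + 1)
    # low extent (area / bounding box) hints at an irregular silhouette
    return {"area": area, "extent": area / bbox}

# toy 6x5 image with a solid bright 3x2 "seed"
img = [[200 if 1 <= x <= 3 and 1 <= y <= 2 else 0 for x in range(6)]
       for y in range(5)]
print(shape_features(img))  # -> {'area': 6, 'extent': 1.0}
```

Real inspection would add more discriminative features (moments, texture, reflectance at the selected wavelength) on carefully lit close-up images.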

  2. A modular real-time vision system for humanoid robots

    NASA Astrophysics Data System (ADS)

    Trifan, Alina L.; Neves, António J. R.; Lau, Nuno; Cunha, Bernardo

    2012-01-01

    Robotic vision is nowadays one of the most challenging branches of robotics. In the case of a humanoid robot, a robust vision system has to provide an accurate representation of the surrounding world and to cope with all the constraints imposed by the hardware architecture and the locomotion of the robot. Usually humanoid robots have low computational capabilities that limit the complexity of the developed algorithms. Moreover, their vision system should perform in real time, therefore a compromise between complexity and processing times has to be found. This paper presents a reliable implementation of a modular vision system for a humanoid robot to be used in color-coded environments. From image acquisition, to camera calibration and object detection, the system that we propose integrates all the functionalities needed for a humanoid robot to accurately perform given tasks in color-coded environments. The main contributions of this paper are the implementation details that allow the use of the vision system in real time, even with low processing capabilities, the innovative self-calibration algorithm for the most important parameters of the camera, and its modularity, which allows its use with different robotic platforms. Experimental results have been obtained with a NAO robot produced by Aldebaran, which is currently the robotic platform used in the RoboCup Standard Platform League, as well as with a humanoid built using the Bioloid Expert Kit from Robotis. As practical examples, our vision system can be efficiently used in real time for the detection of the objects of interest for a soccer-playing robot (ball, field lines, and goals) as well as for navigating through a maze with the help of color-coded clues. In the worst-case scenario, all the objects of interest in a soccer game, using a NAO robot with a single-core 500 MHz processor, are detected in less than 30 ms. Our vision system also includes an algorithm for self-calibration of the camera parameters as well
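Color-coded object detection of the kind described can be sketched as a per-pixel color lookup followed by a support threshold. The color ranges and labels below are assumptions for illustration, not the NAO system's calibration:

```python
# Classify each pixel against a color lookup table, then report which labels
# have enough supporting pixels to count as a detection.

COLOR_TABLE = {          # assumed RGB predicates for a color-coded field
    "ball":  lambda r, g, b: r > 180 and g < 100 and b < 100,   # orange-red
    "field": lambda r, g, b: g > 150 and r < 100,               # green
}

def detect(pixels, min_pixels=3):
    counts = {name: 0 for name in COLOR_TABLE}
    for r, g, b in pixels:
        for name, match in COLOR_TABLE.items():
            if match(r, g, b):
                counts[name] += 1
    return {name for name, c in counts.items() if c >= min_pixels}

frame = [(200, 50, 40)] * 4 + [(30, 200, 30)] * 10 + [(128, 128, 128)] * 5
print(sorted(detect(frame)))  # -> ['ball', 'field']
```

On a real robot the lookup is usually a precomputed table indexed by quantized color, which keeps the per-pixel cost low enough for real time on weak processors.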

  3. Neural associative memories for the integration of language, vision and action in an autonomous agent.

    PubMed

    Markert, H; Kaufmann, U; Kara Kayikci, Z; Palm, G

    2009-03-01

    Language understanding is a long-standing problem in computer science. However, the human brain is capable of processing complex languages with seemingly no difficulty. This paper shows a model for language understanding using biologically plausible neural networks composed of associative memories. The model is able to deal with ambiguities at the single-word and grammatical levels. The language system is embedded in a robot in order to demonstrate correct semantic understanding of the input sentences by letting the robot perform corresponding actions. For that purpose, a simple neural action planning system has been combined with neural networks for visual object recognition and visual attention control mechanisms.
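A minimal binary associative memory of the general kind such architectures build on can be sketched as follows (a textbook Willshaw-style construction for illustration, not the authors' network):

```python
# Binary associative memory: store (cue, response) pairs of 0/1 patterns in a
# binary weight matrix; recall thresholds the weighted sum at the cue's
# activity, which tolerates partial cues.

class AssociativeMemory:
    def __init__(self, n_in, n_out):
        self.w = [[0] * n_in for _ in range(n_out)]

    def store(self, x, y):           # x, y are 0/1 lists
        for j, yj in enumerate(y):
            if yj:
                for i, xi in enumerate(x):
                    if xi:
                        self.w[j][i] = 1

    def recall(self, x):
        k = sum(x)                   # threshold at the cue's activity
        return [1 if sum(wij * xi for wij, xi in zip(row, x)) >= k else 0
                for row in self.w]

mem = AssociativeMemory(4, 3)
mem.store([1, 0, 1, 0], [0, 1, 0])   # "word" pattern -> "meaning" pattern
print(mem.recall([1, 0, 1, 0]))      # -> [0, 1, 0]
```

Chaining several such memories, word to syntax to action, is the kind of composition the paper's language-to-action pipeline performs.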

  4. Autonomic nervous system activities during motor imagery in elite athletes.

    PubMed

    Oishi, Kazuo; Maeshima, Takashi

    2004-01-01

    Motor imagery (MI), a mental simulation of voluntary motor actions, has been used as a training method for athletes for many years. It is possible that MI techniques might similarly be useful as part of rehabilitative strategies to help people regain skills lost as a consequence of diseases or stroke. Mental activity and stress induce several different autonomic responses as part of the behavioral response to movement (e.g., motor anticipation) and as part of the central planning and preprogramming of movement. However, the interrelationships between MI, the autonomic responses, and the motor system have not yet been worked out. The authors compare a number of autonomic responses (respiration, heart rate, electro skin resistance) and motoneuron excitability (soleus H-reflex) in elite and nonelite speed skaters during MI. In contrast to the nonelite athletes, MI of elite speed skaters is characterized by larger changes in heart rate and respiration, a greater reliance on an internal perspective for MI, a more vivid MI, a more accurate correspondence between the MI and actual race times, and decreased motoneuron excitability. Two observations suggest that the changes in the autonomic responses and motoneuron excitability for the elite speed skaters are related to the effects of central motor programming: (1) there was no correlation between the autonomic responses for MI and those recorded during mental arithmetic; and (2) mental arithmetic did not significantly alter motoneuron activity. It is suggested that in elite speed skaters, the descending neural mechanisms that reduce motoneuron excitability are activated even when full, vivid MI is performed internally. These inhibitory responses of the motor system may enhance actual motor performance under conditions of remarkably high mental stress, such as that which occurs in the Olympic games.

  5. Differential GNSS and Vision-Based Tracking to Improve Navigation Performance in Cooperative Multi-UAV Systems.

    PubMed

    Vetrella, Amedeo Rodi; Fasano, Giancarmine; Accardo, Domenico; Moccia, Antonio

    2016-12-17

    Autonomous navigation of micro-UAVs is typically based on the integration of low-cost Global Navigation Satellite System (GNSS) receivers and Micro-Electro-Mechanical Systems (MEMS)-based inertial and magnetic sensors to stabilize and control the flight. The resulting navigation performance in terms of position and attitude accuracy may not suffice for other mission needs, such as those relevant to fine sensor pointing. In this framework, this paper presents a cooperative UAV navigation algorithm that allows a chief vehicle, equipped with inertial and magnetic sensors, a Global Positioning System (GPS) receiver, and a vision system, to improve its navigation performance (in real time or in the post-processing phase) by exploiting formation-flying deputy vehicles equipped with GPS receivers. The focus is on outdoor environments, and the key concept is to exploit differential GPS among vehicles and vision-based tracking (DGPS/Vision) to build a virtual additional navigation sensor whose information is then integrated in a sensor fusion algorithm based on an Extended Kalman Filter. The developed concept and processing architecture are described, with a focus on the DGPS/Vision attitude determination algorithm. Performance assessment is carried out on the basis of both numerical simulations and flight tests. In the latter, navigation estimates derived from the DGPS/Vision approach are compared with those provided by the onboard autopilot system of a customized quadrotor. The analysis shows the potential of the developed approach, which derives mainly from the possibility of exploiting accurate attitude information independent of magnetic and inertial sensors.
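The predict/update cycle at the heart of the fusion can be sketched in scalar form. The paper uses a full Extended Kalman Filter over attitude states; this linear, one-dimensional skeleton only illustrates how a prediction is blended with a DGPS/Vision-style measurement:

```python
# Scalar Kalman predict/update: fuse a prior estimate with a noisy
# measurement, weighting by their relative uncertainties.

def kf_step(x, p, z, q=0.01, r=0.25):
    # predict: state assumed constant; process noise q inflates uncertainty
    p = p + q
    # update: Kalman gain blends prediction x and measurement z
    k = p / (p + r)
    x = x + k * (z - x)
    p = (1 - k) * p
    return x, p

x, p = 0.0, 1.0                       # initial heading estimate (rad), variance
for z in [0.30, 0.28, 0.31, 0.29]:    # noisy measurements around 0.3 rad
    x, p = kf_step(x, p, z)
```

In the EKF proper, `x` becomes a state vector, `p` a covariance matrix, and the measurement model is linearized at each step around the current estimate.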

  6. Database Integrity Monitoring for Synthetic Vision Systems Using Machine Vision and SHADE

    NASA Technical Reports Server (NTRS)

    Cooper, Eric G.; Young, Steven D.

    2005-01-01

    In an effort to increase situational awareness, the aviation industry is investigating technologies that allow pilots to visualize what is outside of the aircraft during periods of low-visibility. One of these technologies, referred to as Synthetic Vision Systems (SVS), provides the pilot with real-time computer-generated images of obstacles, terrain features, runways, and other aircraft regardless of weather conditions. To help ensure the integrity of such systems, methods of verifying the accuracy of synthetically-derived display elements using onboard remote sensing technologies are under investigation. One such method is based on a shadow detection and extraction (SHADE) algorithm that transforms computer-generated digital elevation data into a reference domain that enables direct comparison with radar measurements. This paper describes machine vision techniques for making this comparison and discusses preliminary results from application to actual flight data.

  7. Workshop on Assurance for Autonomous Systems for Aviation

    NASA Technical Reports Server (NTRS)

    Brat, Guillaume; Davies, Misty; Giannakopoulou, Dimitra; Neogi, Natasha

    2016-01-01

    This report describes the workshop on Assurance for Autonomous Systems for Aviation, held in January 2016 in conjunction with the SciTech 2016 conference in San Diego, CA. The workshop explored issues related to assurance for autonomous systems and the idea of trust in such systems. Specifically, we focused on discussing current practices for the assurance of autonomy, identifying assurance barriers specific to autonomy, and examining operational scenarios that demonstrate the need to address those barriers. Furthermore, attention was given to identifying verification techniques that may be applicable to autonomy and to new research directions needed to address the barriers, potentially involving shifts in current practices.

  8. Multi-agent autonomous system and method

    NASA Technical Reports Server (NTRS)

    Fink, Wolfgang (Inventor); Dohm, James (Inventor); Tarbell, Mark A. (Inventor)

    2010-01-01

    A method of controlling a plurality of crafts in an operational area includes providing a command system, a first craft in the operational area coupled to the command system, and a second craft in the operational area coupled to the command system. The method further includes determining a first desired destination and a first trajectory to the first desired destination, sending a first command from the command system to the first craft to move a first distance along the first trajectory, and moving the first craft according to the first command. A second desired destination and a second trajectory to the second desired destination are determined and a second command is sent from the command system to the second craft to move a second distance along the second trajectory.
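    The commanding pattern described (a central command system issuing incremental move commands along computed trajectories) can be sketched as follows; all class and method names are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class Craft:
    """A craft that moves a commanded distance along a unit direction."""
    x: float = 0.0
    y: float = 0.0

    def move(self, ux: float, uy: float, dist: float) -> None:
        self.x += ux * dist
        self.y += uy * dist

class CommandSystem:
    """Central commander: computes the trajectory to a craft's desired
    destination and sends an incremental move command (hypothetical names)."""

    def command_step(self, craft: Craft, dest: tuple, step: float) -> None:
        dx, dy = dest[0] - craft.x, dest[1] - craft.y
        dist = (dx * dx + dy * dy) ** 0.5
        if dist == 0.0:
            return                      # already at the destination
        move = min(step, dist)          # never overshoot the destination
        craft.move(dx / dist, dy / dist, move)

cs = CommandSystem()
first, second = Craft(), Craft()
cs.command_step(first, (3.0, 4.0), 2.5)    # move 2.5 units along (0.6, 0.8)
cs.command_step(second, (0.0, 10.0), 4.0)  # move 4.0 units along (0.0, 1.0)
```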

  9. Intelligent systems for the autonomous exploration of Titan and Enceladus

    NASA Astrophysics Data System (ADS)

    Furfaro, Roberto; Lunine, Jonathan I.; Kargel, Jeffrey S.; Fink, Wolfgang

    2008-04-01

    Future planetary exploration of the outer satellites of the Solar System will require higher levels of onboard automation, including autonomous determination of sites where the probability of significant scientific findings is highest. Generally, the level of automation needed is heavily influenced by the distance between Earth and the robotic explorer(s) (e.g., spacecraft, rovers, and balloons). Therefore, planning missions to the outer satellites mandates the analysis, design, and integration within the mission architecture of semi- and/or completely autonomous intelligence systems. Such systems should (1) include software packages that enable fully automated and comprehensive identification, characterization, and quantification of feature information within an operational region, with subsequent target prioritization and selection for close-up reexamination; and (2) integrate existing information with acquired, "in transit" spatial and temporal sensor data to automatically perform intelligent planetary reconnaissance, which includes identification of sites with the highest potential to yield significant geological and astrobiological information. In this paper we review and compare some of the available Artificial Intelligence (AI) schemes and their adaptation to the problem of designing expert systems for onboard, autonomous science to be performed in the course of outer-satellite exploration. More specifically, the proposed fuzzy-logic framework is analyzed in some detail to show the effectiveness of such a scheme when applied to the problem of designing expert systems capable of identifying and further exploring regions on Titan and/or Enceladus that have the highest potential to yield evidence for past or present life. Based on available information (e.g., Cassini data), the current knowledge and understanding of Titan and Enceladus environments is evaluated to define a path for the design of a fuzzy-based system capable of reasoning over

  10. A Test-Bed Configuration: Toward an Autonomous System

    NASA Astrophysics Data System (ADS)

    Ocaña, F.; Castillo, M.; Uranga, E.; Ponz, J. D.; TBT Consortium

    2015-09-01

    In the context of the Space Situational Awareness (SSA) program of ESA, it is foreseen to deploy several large robotic telescopes in remote locations to provide surveillance and tracking services for man-made as well as natural near-Earth objects (NEOs). The present project, termed Telescope Test Bed (TBT), is being developed under ESA's General Studies and Technology Programme and shall implement a test-bed for the validation of an autonomous optical observing system in a realistic scenario, consisting of two telescopes located in Spain and Australia, to collect representative test data for precursor NEO services. In order to fulfill all the security requirements for the TBT project, an autonomous emergency system (AES) is foreseen to monitor the control system. The AES will remotely monitor the health of the observing system and the internal and external environment. It will incorporate both autonomous and interactive actuators to force the protection of the system (i.e., emergency dome closeout).

  11. Autonomous Aerial Payload Delivery System Blizzard

    DTIC Science & Technology

    2011-05-01

    known systems. Another technique to achieve a high touchdown accuracy is networking, enabling communication between multiple descending ADSs, UAV...Global System for Mobile (Communications) MCCC = mission C2 center PATCAD = Precision Airdrop Technology Conference and Demonstration SA = situational...high-performance gimbal (seen in Fig. 1 and shown in more detail in Fig. 2) featuring a full 360° unobstructed field of view, direct drive

  12. Mathematical biomarkers for the autonomic regulation of cardiovascular system.

    PubMed

    Campos, Luciana A; Pereira, Valter L; Muralikrishna, Amita; Albarwani, Sulayma; Brás, Susana; Gouveia, Sónia

    2013-10-07

    Heart rate and blood pressure are the most important vital signs in diagnosing disease. Both are characterized by a high degree of variability: short-term from moment to moment, medium-term over the normal day and night, and very long-term over months to years. The study of new mathematical algorithms to evaluate the variability of these cardiovascular parameters has high potential for developing new methods of early detection of cardiovascular disease and for establishing differential diagnoses with possible therapeutic consequences. The autonomic nervous system is a major player in the general adaptive reaction to stress and disease. Quantitative prediction of the autonomic interactions in the multiple control-loop pathways of the cardiovascular system is directly applicable to clinical situations. Exploration of new multimodal analytical techniques for the variability of the cardiovascular system may uncover new approaches to deterministic parameter identification. A multimodal analysis of cardiovascular signals can be carried out by evaluating their amplitudes, phases, time-domain patterns, and sensitivity to imposed stimuli, i.e., drugs blocking the autonomic system. The causal effects, gains, and dynamic relationships may be studied through dynamical fuzzy-logic models, such as discrete-time and discrete-event models. We expect an increase in modeling accuracy and better estimation of the heart rate and blood pressure time series, which could benefit intelligent patient monitoring. We foresee that identifying quantitative mathematical biomarkers for the autonomic nervous system will allow individual therapy adjustments aimed at the most favorable sympathetic-parasympathetic balance.
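    Two standard time-domain variability measures, SDNN and RMSSD, illustrate the kind of quantitative markers computed from beat-to-beat intervals; they are common examples, not the specific multimodal models discussed in the paper.

```python
import math

def sdnn(rr_ms):
    """Standard deviation of RR (beat-to-beat) intervals, in ms."""
    mean = sum(rr_ms) / len(rr_ms)
    return math.sqrt(sum((r - mean) ** 2 for r in rr_ms) / len(rr_ms))

def rmssd(rr_ms):
    """Root mean square of successive RR-interval differences, in ms."""
    diffs = [b - a for a, b in zip(rr_ms, rr_ms[1:])]
    return math.sqrt(sum(d * d for d in diffs) / len(diffs))

rr = [800, 810, 790, 805, 795]   # illustrative RR intervals in milliseconds
```

    RMSSD is dominated by beat-to-beat (short-term) fluctuation, while SDNN reflects overall variability, which is why the two are often reported together.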

  13. Mathematical biomarkers for the autonomic regulation of cardiovascular system

    PubMed Central

    Campos, Luciana A.; Pereira, Valter L.; Muralikrishna, Amita; Albarwani, Sulayma; Brás, Susana; Gouveia, Sónia

    2013-01-01

    Heart rate and blood pressure are the most important vital signs in diagnosing disease. Both are characterized by a high degree of variability: short-term from moment to moment, medium-term over the normal day and night, and very long-term over months to years. The study of new mathematical algorithms to evaluate the variability of these cardiovascular parameters has high potential for developing new methods of early detection of cardiovascular disease and for establishing differential diagnoses with possible therapeutic consequences. The autonomic nervous system is a major player in the general adaptive reaction to stress and disease. Quantitative prediction of the autonomic interactions in the multiple control-loop pathways of the cardiovascular system is directly applicable to clinical situations. Exploration of new multimodal analytical techniques for the variability of the cardiovascular system may uncover new approaches to deterministic parameter identification. A multimodal analysis of cardiovascular signals can be carried out by evaluating their amplitudes, phases, time-domain patterns, and sensitivity to imposed stimuli, i.e., drugs blocking the autonomic system. The causal effects, gains, and dynamic relationships may be studied through dynamical fuzzy-logic models, such as discrete-time and discrete-event models. We expect an increase in modeling accuracy and better estimation of the heart rate and blood pressure time series, which could benefit intelligent patient monitoring. We foresee that identifying quantitative mathematical biomarkers for the autonomic nervous system will allow individual therapy adjustments aimed at the most favorable sympathetic-parasympathetic balance. PMID:24109456

  14. Analysis of the development and the prospects about vehicular infrared night vision system

    NASA Astrophysics Data System (ADS)

    Li, Jing; Fan, Hua-ping; Xie, Zu-yun; Zhou, Xiao-hong; Yu, Hong-qiang; Huang, Hui

    2013-08-01

    By classifying vehicular infrared night vision systems and comparing the mainstream products, we summarize their functions, which include night vision, defogging, strong-light resistance, and biological recognition. The markets for these systems in luxury cars and the fire-protection industry are also analyzed. Finally, we conclude that vehicular infrared night vision systems will become essential active-safety equipment, promoting both the night vision photoelectric industry and the automobile industry.

  15. Component based open middleware architecture for autonomous navigation system

    NASA Astrophysics Data System (ADS)

    Ahn, Myung Kil; Park, Yong Woon; Jee, Tae Young

    2007-04-01

    This paper introduces a component-based open middleware architecture implemented by ADD (Agency for Defense Development) to accommodate the technology evolution of unmanned autonomous systems. The proposed open system architecture can be considered a standard interface that defines the messages and operations between software components at the application layer; its purpose is to ensure the portability of future technology across multiple platforms as well as interoperability across domains. In this architecture, a domain is defined as the space in which several different robots operate, and each robot is a subsystem within the domain. Each subsystem (robot) is composed of several nodes, and each node is composed of various components, including a node manager and a communicator. The implemented middleware uses the reference architecture of JAUS (Joint Architecture for Unmanned Systems) as guidance. Among the key achievements of this research is a general node manager that makes it easy to accommodate a new interface or new core technology on the application layer by providing a platform-independent communication interface between each subsystem and its components. The paper also describes the reference architecture and middleware applied in the XAV (eXperimental Autonomous Vehicle) developed at ADD, and briefly discusses autonomous navigation performance and system design characteristics.

  16. Autonomous electrochemical biosensors: A new vision to direct methanol fuel cells.

    PubMed

    Sales, M Goreti F; Brandão, Lúcia

    2017-12-15

    A new approach to biosensing devices is demonstrated, aiming at easier and simpler application in routine health care. Our methodology considers a new concept for the biosensor transducing event that simultaneously yields an equipment-free, user-friendly, inexpensive electrical biosensor. The use of the anode triple-phase boundary (TPB) layer of a passive direct methanol fuel cell (DMFC) as the biosensor transducer is proposed herein. For that, the ionomer present in the anode catalytic layer of the DMFC is partially replaced by an ionomer with molecular recognition capability, which acts as the biorecognition element of the biosensor. In this approach, fuel cell anode catalysts are modified with a molecularly imprinted polymer (plastic antibody) capable of protein recognition (ferritin is used as a model protein), inserted into a suitable membrane electrode assembly (MEA), and tested, as an initial proof of concept, in a non-passive fuel cell operating environment. The anchoring of the ionomer-based plastic antibody on the catalyst surface follows a simple one-step grafting-from approach through radical polymerization. This modification increases fuel cell performance due to the proton conductivity and macroporosity of the polymer on the TPB. Finally, the response and selectivity of the bioreceptor inside the fuel cell showed a clear and selective signal from the biosensor. Moreover, this pioneering transducing approach amplified the electrochemical response and increased biosensor sensitivity by two orders of magnitude compared with a three-electrode configuration. Copyright © 2017 The Authors. Published by Elsevier B.V. All rights reserved.

  17. Test of Lander Vision System for Mars 2020

    NASA Image and Video Library

    2016-10-04

    A prototype of the Lander Vision System for NASA Mars 2020 mission was tested in this Dec. 9, 2014, flight of a Masten Space Systems Xombie vehicle at Mojave Air and Space Port in California. http://photojournal.jpl.nasa.gov/catalog/PIA20848

  18. Future Automated Rough Mills Hinge on Vision Systems

    Treesearch

    Philip A. Araman

    1996-01-01

    The backbone behind major changes to present and future rough mills in dimension, furniture, cabinet or millwork facilities will be computer vision systems. Because of the wide variety of products and the quality of parts produced, the scanning systems and rough mills will vary greatly. The scanners will vary in type. For many complicated applications, multiple scanner...

  19. Cloud Absorption Radiometer Autonomous Navigation System - CANS

    NASA Technical Reports Server (NTRS)

    Kahle, Duncan; Gatebe, Charles; McCune, Bill; Hellwig, Dustan

    2013-01-01

    CAR (cloud absorption radiometer) acquires spatial reference data from host aircraft navigation systems. This poses various problems during CAR data reduction, including navigation data format, accuracy of position data, accuracy of airframe inertial data, and navigation data rate. Incorporating its own navigation system, which included GPS (Global Positioning System), roll axis inertia and rates, and three axis acceleration, CANS expedites data reduction and increases the accuracy of the CAR end data product. CANS provides a self-contained navigation system for the CAR, using inertial reference and GPS positional information. The intent of the software application was to correct the sensor with respect to aircraft roll in real time based upon inputs from a precision navigation sensor. In addition, the navigation information (including GPS position), attitude data, and sensor position details are all streamed to a remote system for recording and later analysis. CANS comprises a commercially available inertial navigation system with integral GPS capability (Attitude Heading Reference System AHRS) integrated into the CAR support structure and data system. The unit is attached to the bottom of the tripod support structure. The related GPS antenna is located on the P-3 radome immediately above the CAR. The AHRS unit provides a RS-232 data stream containing global position and inertial attitude and velocity data to the CAR, which is recorded concurrently with the CAR data. This independence from aircraft navigation input provides for position and inertial state data that accounts for very small changes in aircraft attitude and position, sensed at the CAR location as opposed to aircraft state sensors typically installed close to the aircraft center of gravity. More accurate positional data enables quicker CAR data reduction with better resolution. The CANS software operates in two modes: initialization/calibration and operational. In the initialization/calibration mode

  20. Analysis of battery current microcycles in autonomous renewable energy systems

    NASA Astrophysics Data System (ADS)

    Ruddell, A. J.; Dutton, A. G.; Wenzl, H.; Ropeter, C.; Sauer, D. U.; Merten, J.; Orfanogiannis, C.; Twidell, J. W.; Vezin, P.

    Battery currents in autonomous renewable energy systems (RES) are generally predicted or measured in terms of mean values over intervals of 1 min or longer. As a result, battery charge-discharge cycles with periods shorter than the averaging period are ignored, and the actual battery ampere-hour (A h) throughput and resulting battery wear may be seriously underestimated, leading to optimistic predictions of battery lifetime. This paper considers short charge-discharge cycles, or microcycles, arising from the characteristics of autonomous renewable energy systems, including generators, regulators, loads, and the load inverter. Simulation results are used to show that inverters operating directly from the battery can cause microcycles, resulting in significantly increased battery throughput. Initial experimental results on the effects of microcycles on battery capacity and charging characteristics, and the contributing processes, are discussed.
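    The underestimation caused by averaging can be seen in a small sketch: a 1 Hz current series with inverter-driven microcycles has substantial ampere-hour throughput, while its 1-minute mean reports none. The waveform and numbers are illustrative, not measured data from the paper.

```python
def ah_throughput(currents_a, dt_s):
    """Total ampere-hour throughput of a current series sampled every dt_s
    seconds: sum of |I| * dt, converted from ampere-seconds to ampere-hours."""
    return sum(abs(i) for i in currents_a) * dt_s / 3600.0

# 60 s of 1 Hz battery current with inverter-driven microcycles:
# the battery alternates between +10 A charge and -10 A discharge.
fast = [10.0 if k % 2 == 0 else -10.0 for k in range(60)]

# The same minute seen as a single 1-minute mean value.
mean_1min = [sum(fast) / len(fast)]

fast_ah = ah_throughput(fast, 1.0)       # throughput from 1 Hz data
avg_ah = ah_throughput(mean_1min, 60.0)  # throughput from averaged data
```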

  1. Simulation Of Dual Behavior Of An Autonomous System

    NASA Astrophysics Data System (ADS)

    Bhatt, R.; Gaw, D.; Meystel, Alexander M.

    1990-02-01

    This paper describes a system of guidance for an intelligent mobile autonomous system (autonomous robot) based upon an algorithm of "pilot decision making" which incorporates different strategies of operation. Depending on the set of circumstances, including the level of "informedness", initial data, the concrete environment, and so on, the "personality" of PILOT is selected between two alternatives: 1) a diligent strategist which tends to explore all available trajectories off-line and be prepared to follow one of them precisely, and 2) a hasty decision maker inclined to choose a solution in a rather reckless manner, based upon short-term alternatives without regard to long-term consequences. Simulation shows that these two personalities support each other in a beneficial way.

  2. Control Problems in Autonomous Life Support Systems

    NASA Technical Reports Server (NTRS)

    Colombano, S.

    1982-01-01

    The problem of constructing life support systems which require little or no input of matter (food and gases) for long, or even indefinite, periods of time is addressed. Natural control in ecosystems, a control theory for ecosystems, and an approach to the design of an ALSS are addressed.

  3. An experiment in vision based autonomous grasping within a reduced gravity environment

    NASA Technical Reports Server (NTRS)

    Grimm, K. A.; Erickson, J. D.; Anderson, G.; Chien, C. H.; Hewgill, L.; Littlefield, M.; Norsworthy, R.

    1992-01-01

    The National Aeronautics and Space Administration's Reduced Gravity Program (RGP) offers opportunities for experimentation in gravities of less than one-g. The Extravehicular Activity Helper/Retriever (EVAHR) robot project of the Automation and Robotics Division at the Lyndon B. Johnson Space Center in Houston, Texas, is undertaking a task that will culminate in a series of tests in simulated zero-g using this facility. A subset of the final robot hardware consisting of a three-dimensional laser mapper, a Robotics Research 807 arm, a Jameson JH-5 hand, and the appropriate interconnect hardware/software will be used. This equipment will be flown on the RGP's KC-135 aircraft, which will fly a series of parabolas creating the effect of zero-g. During the periods of zero-g, a number of objects will be released in front of the fixed-base robot hardware in both static and dynamic configurations. The system will then inspect the object, determine the object's pose, plan a grasp strategy, and execute the grasp. This must all be accomplished in the approximately 27 seconds of zero-g.

  4. Robotic reactions: delay-induced patterns in autonomous vehicle systems.

    PubMed

    Orosz, Gábor; Moehlis, Jeff; Bullo, Francesco

    2010-02-01

    Fundamental design principles are presented for vehicle systems governed by autonomous cruise control devices. By analyzing the corresponding delay differential equations, it is shown that for any car-following model short-wavelength oscillations can appear due to robotic reaction times, and that there are tradeoffs between the time delay and the control gains. The analytical findings are demonstrated on an optimal velocity model using numerical continuation and numerical simulation.
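    A minimal numerical-simulation sketch of such an analysis is an Euler integration of a delayed optimal velocity model on a ring road. The optimal velocity function shape, the parameter values, and the delay-buffer handling below are illustrative assumptions, not the paper's exact model or stability analysis.

```python
def v_opt(h, v_max=30.0, h_stop=5.0, h_go=35.0):
    """Optimal velocity function: zero below headway h_stop, v_max above
    h_go, and a smooth cubic ramp in between (an illustrative shape)."""
    if h <= h_stop:
        return 0.0
    if h >= h_go:
        return v_max
    s = (h - h_stop) / (h_go - h_stop)
    return v_max * (3.0 * s ** 2 - 2.0 * s ** 3)

def simulate(n=5, ring=150.0, alpha=1.0, tau=0.5, dt=0.05, steps=2000):
    """Euler integration of a delayed optimal velocity model on a ring road:
    dv_i/dt = alpha * (V(h_i(t - tau)) - v_i(t - tau))."""
    delay = int(round(tau / dt))
    x = [i * ring / n for i in range(n)]     # start in uniform flow
    v = [v_opt(ring / n)] * n
    hist = [(x[:], v[:])] * (delay + 1)      # buffer of delayed states
    for _ in range(steps):
        xd, vd = hist[0]                     # state tau seconds ago
        h = [(xd[(i + 1) % n] - xd[i]) % ring for i in range(n)]
        a = [alpha * (v_opt(h[i]) - vd[i]) for i in range(n)]
        x = [(x[i] + v[i] * dt) % ring for i in range(n)]
        v = [v[i] + a[i] * dt for i in range(n)]
        hist = hist[1:] + [(x[:], v[:])]
    return x, v

xs, vs = simulate()   # uniform flow is an equilibrium: speeds stay constant
```

    Starting exactly at the uniform-flow equilibrium, the delayed accelerations remain zero; perturbing the initial state (or raising the delay relative to the gain) is what exposes the short-wavelength oscillations the paper analyzes.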

  5. Robotic reactions: Delay-induced patterns in autonomous vehicle systems

    NASA Astrophysics Data System (ADS)

    Orosz, Gábor; Moehlis, Jeff; Bullo, Francesco

    2010-02-01

    Fundamental design principles are presented for vehicle systems governed by autonomous cruise control devices. By analyzing the corresponding delay differential equations, it is shown that for any car-following model short-wavelength oscillations can appear due to robotic reaction times, and that there are tradeoffs between the time delay and the control gains. The analytical findings are demonstrated on an optimal velocity model using numerical continuation and numerical simulation.

  6. Software Testbed for Developing and Evaluating Integrated Autonomous Systems

    DTIC Science & Technology

    2015-03-01

    978-1-4799-5380-6/15/$31.00 ©2015 IEEE 1 Software Testbed for Developing and Evaluating Integrated Autonomous Systems James Ong, Emilio...Remolina, Axel Prompt, Stottler Henke Associates, Inc. 1670 S. Amphlett Blvd., suite 310 San Mateo, CA 94402 650-931-2700 ong, remolina, aprompt...www.stottlerhenke.com/datamontage/ [13] Ong, J., E. Remolina, D. E. Smith, M. S. Boddy (2013) A Visual Integrated Development Environment for Automated Planning

  7. Measures of Autonomic Nervous System Regulation

    DTIC Science & Technology

    2011-04-01

    flowing out of the lungs. The optimal level of the individual’s lung function is measured by using three color-coded peak flow zones. The individual...amphetamines, alcohol and monoamine oxidase inhibitors, which may interfere with accurate measurements of catecholamine metabolites. Three tools for...wireless PDA-based physiological monitoring system for patient transport. IEEE Trans Inf Technol Biomed. 2004;8(4):439. 25. Blank JM, Altman DG

  8. Autonomous Systems and Robotics: 2000-2004

    NASA Technical Reports Server (NTRS)

    2004-01-01

    This custom bibliography from the NASA Scientific and Technical Information Program lists a sampling of records found in the NASA Aeronautics and Space Database. The scope of this topic includes technologies to monitor, maintain, and where possible, repair complex space systems. This area of focus is one of the enabling technologies as defined by NASA's Report of the President's Commission on Implementation of United States Space Exploration Policy, published in June 2004.

  9. Ceramic substrate's detection system based on machine vision

    NASA Astrophysics Data System (ADS)

    Yang, Li-na; Zhou, Zhen-feng; Zhu, Li-jun

    2009-05-01

    Machine vision detection is a modern integrated inspection technology combining optoelectronics, computer imaging, information processing, and computer vision. It uses the image as the means and carrier of information transfer, extracting useful information and acquiring the necessary parameters by processing images. As part of a key project of the Zhejiang Province Office of Education on high-accuracy, large-size machine vision automatic detection and separation technology, this paper describes the primary factors influencing system precision and develops an automatic detection system for ceramic substrates. The system captures images of the ceramic substrate with a CMOS (Complementary Metal-Oxide Semiconductor) sensor. Image quality is improved by the optical imaging and lighting system, and edge-detection precision is improved by image preprocessing and sub-pixel methods. The image enhancement stage applies filtering and geometric distortion correction. Edges are obtained through a sub-pixel edge detection method: an improved Sobel operator determines the probable edge position, and a third-order spline interpolation function then interpolates the gray-level edge image. A mathematical model of the dimensional and geometric error of the vision inspection system is developed, and the length and width of the ceramic substrate are acquired. Experimental results show that the presented method increases the precision of the vision detection system and yields satisfactory measurements.
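    The sub-pixel refinement step can be illustrated in one dimension: locate the gradient peak with a Sobel-like central difference, then refine it with a parabolic fit through the three samples around the peak. The parabolic fit is a simple stand-in for the third-order spline interpolation the paper describes.

```python
def subpixel_edge(row):
    """Locate an intensity edge with sub-pixel precision: central-difference
    gradient (1D Sobel-like), integer peak search, then a parabolic fit
    through the three samples around the peak to refine the position."""
    grad = [row[i + 1] - row[i - 1] for i in range(1, len(row) - 1)]
    k = max(range(len(grad)), key=lambda i: abs(grad[i]))
    k = min(max(k, 1), len(grad) - 2)        # keep the 3-point stencil inside
    g0, g1, g2 = abs(grad[k - 1]), abs(grad[k]), abs(grad[k + 1])
    denom = g0 - 2.0 * g1 + g2
    offset = 0.0 if denom == 0 else 0.5 * (g0 - g2) / denom
    return (k + 1) + offset                  # +1 undoes the gradient crop

# Illustrative blurred step edge between dark (0) and bright (100) pixels.
row = [0, 0, 0, 10, 50, 90, 100, 100, 100]
edge_pos = subpixel_edge(row)
```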

  10. Mobile Autonomous Humanoid Assistant

    NASA Technical Reports Server (NTRS)

    Diftler, M. A.; Ambrose, R. O.; Tyree, K. S.; Goza, S. M.; Huber, E. L.

    2004-01-01

    A mobile autonomous humanoid robot is assisting human co-workers at the Johnson Space Center with tool handling tasks. This robot combines the upper body of the National Aeronautics and Space Administration (NASA)/Defense Advanced Research Projects Agency (DARPA) Robonaut system with a Segway(TradeMark) Robotic Mobility Platform yielding a dexterous, maneuverable humanoid perfect for aiding human co-workers in a range of environments. This system uses stereo vision to locate human teammates and tools and a navigation system that uses laser range and vision data to follow humans while avoiding obstacles. Tactile sensors provide information to grasping algorithms for efficient tool exchanges. The autonomous architecture utilizes these pre-programmed skills to form human assistant behaviors. The initial behavior demonstrates a robust capability to assist a human by acquiring a tool from a remotely located individual and then following the human in a cluttered environment with the tool for future use.

  11. Mobile Autonomous Humanoid Assistant

    NASA Technical Reports Server (NTRS)

    Diftler, M. A.; Ambrose, R. O.; Tyree, K. S.; Goza, S. M.; Huber, E. L.

    2004-01-01

    A mobile autonomous humanoid robot is assisting human co-workers at the Johnson Space Center with tool handling tasks. This robot combines the upper body of the National Aeronautics and Space Administration (NASA)/Defense Advanced Research Projects Agency (DARPA) Robonaut system with a Segway(TradeMark) Robotic Mobility Platform yielding a dexterous, maneuverable humanoid perfect for aiding human co-workers in a range of environments. This system uses stereo vision to locate human teammates and tools and a navigation system that uses laser range and vision data to follow humans while avoiding obstacles. Tactile sensors provide information to grasping algorithms for efficient tool exchanges. The autonomous architecture utilizes these pre-programmed skills to form human assistant behaviors. The initial behavior demonstrates a robust capability to assist a human by acquiring a tool from a remotely located individual and then following the human in a cluttered environment with the tool for future use.

  12. Towards Co-Engineering Communicating Autonomous Cyber-Physical Systems

    NASA Technical Reports Server (NTRS)

    Bujorianu, Marius C.; Bujorianu, Manuela L.

    2009-01-01

    In this paper, we sketch a framework for interdisciplinary modeling of space systems, by proposing a holistic view. We consider different system dimensions and their interaction. Specifically, we study the interactions between computation, physics, communication, uncertainty and autonomy. The most comprehensive computational paradigm that supports a holistic perspective on autonomous space systems is given by cyber-physical systems. For these, the state of art consists of collaborating multi-engineering efforts that prompt for an adequate formal foundation. To achieve this, we propose a leveraging of the traditional content of formal modeling by a co-engineering process.

  13. A laser-based vision system for weld quality inspection.

    PubMed

    Huang, Wei; Kovacevic, Radovan

    2011-01-01

    Welding is a very complex process in which the final weld quality can be affected by many process parameters. In order to inspect the weld quality and detect the presence of various weld defects, different methods and systems are studied and developed. In this paper, a laser-based vision system is developed for non-destructive weld quality inspection. The vision sensor is designed based on the principle of laser triangulation. By processing the images acquired from the vision sensor, the geometrical features of the weld can be obtained. Through visual analysis of the acquired 3D profiles of the weld, the presence as well as the positions and sizes of weld defects can be accurately identified, and non-destructive weld quality inspection can thereby be achieved.
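    The laser triangulation principle on which such a sensor is based can be sketched numerically: the lateral shift of the imaged laser line maps to surface height through the camera's viewing angle. The thin-lens model and all numbers below are illustrative assumptions, not the paper's sensor design.

```python
import math

def height_from_shift(pixel_shift, pixel_size_mm, magnification, cam_angle_deg):
    """Laser triangulation: convert the lateral shift of the imaged laser
    line into surface height. With the laser sheet normal to the surface and
    the camera viewing at cam_angle_deg from the laser axis, a height step dh
    displaces the line by dh * tan(angle) in the scene (thin-lens sketch)."""
    scene_shift_mm = pixel_shift * pixel_size_mm / magnification
    return scene_shift_mm / math.tan(math.radians(cam_angle_deg))

# Illustrative numbers: 100-pixel shift, 10 um pixels, 0.5x optics, 45 deg.
h_mm = height_from_shift(pixel_shift=100, pixel_size_mm=0.01,
                         magnification=0.5, cam_angle_deg=45.0)
```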

  14. A Laser-Based Vision System for Weld Quality Inspection

    PubMed Central

    Huang, Wei; Kovacevic, Radovan

    2011-01-01

    Welding is a very complex process in which the final weld quality can be affected by many process parameters. In order to inspect the weld quality and detect the presence of various weld defects, different methods and systems are studied and developed. In this paper, a laser-based vision system is developed for non-destructive weld quality inspection. The vision sensor is designed based on the principle of laser triangulation. By processing the images acquired from the vision sensor, the geometrical features of the weld can be obtained. Through visual analysis of the acquired 3D profiles of the weld, the presence as well as the positions and sizes of weld defects can be accurately identified, and non-destructive weld quality inspection can thereby be achieved. PMID:22344308

  15. On non-autonomous dynamical systems

    NASA Astrophysics Data System (ADS)

    Anzaldo-Meneses, A.

    2015-04-01

    In most realistic classical dynamical systems, the Hamiltonian depends explicitly on time. In this work, a class of classical systems with time-dependent nonlinear Hamiltonians is analyzed. This type of problem allows invariants to be found by a family of Veronese maps. The motivation to develop this method results from the observation that the Poisson-Lie algebra of monomials in the coordinates and momenta is clearly defined in terms of its brackets and leads naturally to an infinite linear set of differential equations, under certain circumstances. To perform explicit analytic and numerical calculations, two examples are presented to estimate the trajectories, the first given by a nonlinear problem and the second by a quadratic Hamiltonian with three time-dependent parameters. In the nonlinear problem, the Veronese approach using jets is shown to be equivalent to a direct procedure using elliptic function identities, and linear invariants are constructed. For the second example, linear and quadratic invariants as well as stability conditions are given. Explicit solutions are also obtained for stepwise constant forces. For the quadratic Hamiltonian, an appropriate set of coordinates relates the geometric setting to that of the three-dimensional manifold of central conic sections. It is shown further that the quantum mechanical problem of scattering in a superlattice leads to mathematically equivalent equations for the wave function, if the classical time is replaced by the space coordinate along the superlattice. The mathematical method used to compute the trajectories for stepwise constant parameters can be applied to both problems. It is the standard method in quantum scattering calculations, as known for locally periodic systems including a space-dependent effective mass.
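    For time-dependent quadratic Hamiltonians of the kind treated in the second example, a classical illustration of a quadratic invariant is the Lewis invariant for H = p²/2 + ω²(t)q²/2 (shown here only as a standard reference point, not necessarily the construction used in this paper):

```latex
I(t) = \frac{1}{2}\left[\frac{q^{2}}{\rho^{2}} + \left(\rho\,p - \dot{\rho}\,q\right)^{2}\right],
\qquad
\ddot{\rho} + \omega^{2}(t)\,\rho = \rho^{-3},
```

    where ρ(t) solves the auxiliary Ermakov equation on the right; along trajectories of q̈ + ω²(t)q = 0 one has dI/dt = 0, so I is a conserved quantity despite the explicit time dependence of H.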

  16. On non-autonomous dynamical systems

    SciTech Connect

    Anzaldo-Meneses, A.

    2015-04-15

    In usual realistic classical dynamical systems, the Hamiltonian depends explicitly on time. In this work, a class of classical systems with time-dependent nonlinear Hamiltonians is analyzed. This class of problems allows invariants to be found via a family of Veronese maps. The motivation to develop this method results from the observation that the Poisson-Lie algebra of monomials in the coordinates and momenta is clearly defined in terms of its brackets and leads naturally, under certain circumstances, to an infinite linear set of differential equations. To perform explicit analytic and numerical calculations, two examples are presented to estimate the trajectories: the first is a nonlinear problem and the second a quadratic Hamiltonian with three time-dependent parameters. In the nonlinear problem, the Veronese approach using jets is shown to be equivalent to a direct procedure using elliptic function identities, and linear invariants are constructed. For the second example, linear and quadratic invariants as well as stability conditions are given. Explicit solutions are also obtained for stepwise constant forces. For the quadratic Hamiltonian, an appropriate set of coordinates relates the geometric setting to that of the three-dimensional manifold of central conic sections. It is shown further that the quantum mechanical problem of scattering in a superlattice leads to mathematically equivalent equations for the wave function, if the classical time is replaced by the space coordinate along the superlattice. The mathematical method used to compute the trajectories for stepwise constant parameters can be applied to both problems; it is the standard method in quantum scattering calculations, as known for locally periodic systems including a space-dependent effective mass.

  17. Autonomic nervous system correlates in movement observation and motor imagery

    PubMed Central

    Collet, C.; Di Rienzo, F.; El Hoyek, N.; Guillot, A.

    2013-01-01

    The purpose of the current article is to provide a comprehensive overview of the literature, offering a better understanding of the autonomic nervous system (ANS) correlates of motor imagery (MI) and movement observation. These are two higher brain functions involving sensorimotor coupling, mediated by memory systems. How observing or mentally rehearsing a movement affects ANS activity has not been extensively investigated, and the links between cognitive functions and ANS responses are not obvious. We will first describe the organization of the ANS, whose main purposes are controlling vital functions by maintaining the homeostasis of the organism and providing adaptive responses when changes occur in either the external or internal milieu. We will then review how scientific knowledge has evolved, integrating recent findings related to ANS functioning, and show how these are linked to mental functions. In turn, we will describe how movement observation or MI may elicit physiological responses at the peripheral level of the autonomic effectors, thus yielding autonomic correlates of cognitive activity. A key feature of this paper is its step-by-step progression from the understanding of ANS physiology to its relationships with high mental processes such as movement observation or MI. We will further provide evidence that mental processes are co-programmed both at the somatic and autonomic levels of the central nervous system (CNS). We will thus detail how peripheral physiological responses may be analyzed to provide objective evidence that MI is actually performed. The main perspective is thus to consider that, during movement observation and MI, ANS activity is an objective witness of mental processes. PMID:23908623

  18. Flight Control System Development for the BURRO Autonomous UAV

    NASA Technical Reports Server (NTRS)

    Colbourne, Jason D.; Frost, Chad R.; Tischler, Mark B.; Ciolani, Luigi; Sahai, Ranjana; Tomoshofski, Chris; LaMontagne, Troy; Rutkowski, Michael (Technical Monitor)

    2000-01-01

    Developing autonomous flying vehicles has been a growing field in aeronautical research within the last decade and will continue into the next century. With concerns about the safety, size, and cost of manned aircraft, several autonomous vehicle projects are currently being developed; uninhabited rotorcraft offer solutions to requirements for hover, vertical take-off and landing, and slung-load transportation capabilities. The newness of the technology requires flight control engineers to question which design approaches, control law architectures, and performance criteria apply to control law development and handling-quality evaluation. To help answer these questions, this paper documents the control law design process for the Kaman Aerospace BURRO project. This paper describes the approach taken to design control laws and develop math models that will be used to convert the manned K-MAX into the BURRO autonomous rotorcraft. Because the K-MAX can lift its own weight (6,000 lb), the load significantly affects the dynamics of the system; the paper addresses the additional design requirements for slung-load autonomous flight. The approach taken in this design was to: 1) generate accurate math models of the K-MAX helicopter with and without slung loads, 2) select design specifications that would deliver good performance as well as satisfy mission criteria, and 3) develop and tune the control system architecture to meet the design specifications and mission criteria. An accurate math model was desired for control system development. The Comprehensive Identification from Frequency Responses (CIFER(R)) software package was used to identify a linear math model for unloaded and loaded flight at hover, 50 kts, and 100 kts. The results of an eight degree-of-freedom CIFER(R)-identified linear model for the unloaded hover flight condition are presented herein, and the identification of the two-body slung-load configuration is in progress.

  19. Lyapunov stability of n-D strongly autonomous systems

    NASA Astrophysics Data System (ADS)

    Pal, Debasattam; Pillai, Harish K.

    2011-11-01

    In this article we look into stability properties of strongly autonomous n-D systems, i.e. systems having finite-dimensional behaviour. These systems are known to have a first-order representation akin to 1-D state-space representation; we consider our systems to be already in this form throughout. We first define restriction of an n-D system to a 1-D subspace. Using this we define stability with respect to a given half-line, and then stability with respect to collections of such half-lines: proper cones. Then we show how stability with respect to a half-line, for the strongly autonomous case, reduces to a linear combination of the state representation matrices being Hurwitz. We first relate the eigenvalues of this linear combination with those of the individual matrices. With this we give an equivalent geometric criterion in terms of the real part of the characteristic variety of the system for half-line stability. Then we extend this geometric criterion to the case of stability with respect to a proper cone. Finally, we look into a Lyapunov theory of stability with respect to a proper cone for strongly autonomous systems. Each non-zero vector in the given proper cone gives rise to a linear combination of the system matrices. Each of these linear combinations gives a corresponding Lyapunov inequality. We show that the system is stable with respect to the proper cone if and only if there exists a common solution to all of these Lyapunov inequalities.
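    The half-line criterion described above reduces to an eigenvalue check on a linear combination of the state matrices; a minimal numerical sketch (with invented 2x2 matrices and an invented cone, not taken from the article) might look like this:

```python
import numpy as np

def is_hurwitz(M, tol=1e-12):
    """True if all eigenvalues of M lie in the open left half-plane."""
    return bool(np.all(np.linalg.eigvals(M).real < -tol))

# Illustrative state matrices of a strongly autonomous 2-D system.
A1 = np.array([[-2.0, 1.0],
               [ 0.0, -1.0]])
A2 = np.array([[-1.0, 0.0],
               [ 0.5, -3.0]])

# Sample directions v = (v1, v2) in the positive-orthant cone; each one
# yields the combination v1*A1 + v2*A2, which must be Hurwitz for the
# system to be stable with respect to that half-line.
directions = [(1.0, 0.0), (0.0, 1.0), (1.0, 1.0), (2.0, 1.0)]
stable = all(is_hurwitz(v1 * A1 + v2 * A2) for v1, v2 in directions)
```

    In the article's Lyapunov formulation, each such combination would instead be tested through its own Lyapunov inequality, with cone stability equivalent to a common solution for all of them.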

  20. Intelligent systems in space : the EO-1 Autonomous Sciencecraft

    NASA Technical Reports Server (NTRS)

    Sherwood, Robert L.; Chien, Steve; Tran, Daniel; Cichy, Benjamin; Castano, Rebecca; Davies, Ashley; Rabideau, Gregg

    2005-01-01

    The Autonomous Sciencecraft Software (ASE) is currently flying onboard the Earth Observing One (EO-1) Spacecraft. This software enables the spacecraft to autonomously detect and respond to science events occurring on the Earth. The package includes software systems that perform science data analysis, deliberative planning, and runtime robust execution. Because of the deployment to the EO-1 spacecraft, the ASE software has stringent constraints of autonomy and limited computing resources. We describe these constraints and how they are reflected in our operations approach. A summary of the final results of the experiment is also included. This software has demonstrated the potential for space missions to use onboard decision-making to detect, analyze, and respond to science events, and to downlink only the highest value science data. As a result, ground-based mission planning and analysis functions have been greatly simplified, thus reducing operations cost.

  1. Autonomous system for pathogen detection and identification

    SciTech Connect

    Belgrader, P.; Benett, W.; Bergman, W.; Langlois, R.; Mariella, R.; Milanovich, F.; Miles, R.; Venkateswaran, K.; Long, G.; Nelson, W.

    1998-09-24

    The purpose of this project is to build a prototype instrument that will, running unattended, detect, identify, and quantify BW agents. In order to accomplish this, we have chosen to start with the world's leading, proven assays for pathogens: surface-molecular recognition assays, such as antibody-based assays, implemented on a high-performance, identification (ID)-capable flow cytometer, and the polymerase chain reaction (PCR) for nucleic-acid based assays. With these assays, we must integrate the capability to: collect samples from aerosols, water, or surfaces; perform sample preparation prior to the assays; incubate the prepared samples, if necessary, for a period of time; transport the prepared, incubated samples to the assays; perform the assays; and interpret and report the results of the assays. Issues such as reliability, sensitivity and accuracy, quantity of consumables, and maintenance schedule must be addressed satisfactorily for the end user. The highest possible sensitivity and specificity of the assay must be combined with no false alarms. Today, we have assays that can, in under 30 minutes, detect and identify simulants for BW agents at concentrations of a few hundred colony-forming units per ml of solution. If the bio-aerosol sampler of this system collects 1000 l/min and concentrates the respirable particles into 1 ml of solution with 70% processing efficiency over a period of 5 minutes, then this translates to a detection/ID capability of under 0.1 agent-containing particle per liter of air.
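    The sampling arithmetic quoted at the end of the abstract can be checked directly; the 300 CFU/ml figure below is an assumed stand-in for "a few hundred":

```python
# Back-of-the-envelope check of the detection figure quoted above: a
# few-hundred CFU/ml assay sensitivity, and a sampler drawing 1000 l/min
# of air for 5 minutes into 1 ml of solution at 70% efficiency.
assay_limit_cfu_per_ml = 300.0   # "a few hundred" - an assumed value
flow_l_per_min = 1000.0
minutes = 5.0
efficiency = 0.70

# Effective litres of air concentrated into the 1 ml assay volume.
air_sampled_l = flow_l_per_min * minutes * efficiency  # 3500 l
particles_per_litre = assay_limit_cfu_per_ml / air_sampled_l
# About 0.086 agent-containing particles per litre of air -- under the
# 0.1/litre capability stated in the abstract.
```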

  2. The impact of changing night vision goggle spectral response on night vision imaging system lighting compatibility

    NASA Astrophysics Data System (ADS)

    Task, Harry L.; Marasco, Peter L.

    2004-09-01

    The defining document outlining night-vision imaging system (NVIS) compatible lighting, MIL-L-85762A, was written in the mid-1980s, based on what was then the state of the art in night vision and image intensification. Since that time there have been changes in the photocathode sensitivity and the minus-blue coatings applied to the objective lenses. Specifically, many aviation night-vision goggles (NVGs) in the Air Force are equipped with so-called "leaky green" or Class C objective lens coatings that provide a small amount of transmission around 545 nanometers so that displays using a P-43 phosphor can be seen through the NVGs. However, current NVIS compatibility requirements documents have not been updated to include these changes. Documents that followed and replaced MIL-L-85762A (ASC/ENFC-96-01 and MIL-STD-3009) addressed aspects of then-current NVIS technology, but did little to change the actual content or NVIS radiance requirements set forth in the original MIL-L-85762A. This paper examines the impact of spectral response changes, introduced by changes in image tube parameters and objective lens minus-blue filters, on NVIS compatibility and NVIS radiance calculations. Possible impact on NVIS lighting requirements is also discussed. In addition, arguments are presented for revisiting NVIS radiometric unit conventions.

  3. Autonomous Control Capabilities for Space Reactor Power Systems

    SciTech Connect

    Wood, Richard T.; Neal, John S.; Brittain, C. Ray; Mullens, James A.

    2004-02-04

    The National Aeronautics and Space Administration's (NASA's) Project Prometheus, the Nuclear Systems Program, is investigating a possible Jupiter Icy Moons Orbiter (JIMO) mission, which would conduct in-depth studies of three of the moons of Jupiter by using a space reactor power system (SRPS) to provide energy for propulsion and spacecraft power for more than a decade. Terrestrial nuclear power plants rely upon varying degrees of direct human control and interaction for operations and maintenance over a forty to sixty year lifetime. In contrast, an SRPS is intended to provide continuous, remote, unattended operation for up to fifteen years with no maintenance. Uncertainties, rare events, degradation, and communications delays with Earth are challenges that SRPS control must accommodate. Autonomous control is needed to address these challenges and optimize the reactor control design. In this paper, we describe an autonomous control concept for generic SRPS designs. The formulation of an autonomous control concept, which includes identification of high-level functional requirements and generation of a research and development plan for enabling technologies, is among the technical activities that are being conducted under the U.S. Department of Energy's Space Reactor Technology Program in support of the NASA's Project Prometheus. The findings from this program are intended to contribute to the successful realization of the JIMO mission.

  4. Autonomous Control Capabilities for Space Reactor Power Systems

    NASA Astrophysics Data System (ADS)

    Wood, Richard T.; Neal, John S.; Brittain, C. Ray; Mullens, James A.

    2004-02-01

    The National Aeronautics and Space Administration's (NASA's) Project Prometheus, the Nuclear Systems Program, is investigating a possible Jupiter Icy Moons Orbiter (JIMO) mission, which would conduct in-depth studies of three of the moons of Jupiter by using a space reactor power system (SRPS) to provide energy for propulsion and spacecraft power for more than a decade. Terrestrial nuclear power plants rely upon varying degrees of direct human control and interaction for operations and maintenance over a forty to sixty year lifetime. In contrast, an SRPS is intended to provide continuous, remote, unattended operation for up to fifteen years with no maintenance. Uncertainties, rare events, degradation, and communications delays with Earth are challenges that SRPS control must accommodate. Autonomous control is needed to address these challenges and optimize the reactor control design. In this paper, we describe an autonomous control concept for generic SRPS designs. The formulation of an autonomous control concept, which includes identification of high-level functional requirements and generation of a research and development plan for enabling technologies, is among the technical activities that are being conducted under the U.S. Department of Energy's Space Reactor Technology Program in support of the NASA's Project Prometheus. The findings from this program are intended to contribute to the successful realization of the JIMO mission.

  5. The 3D laser radar vision processor system

    NASA Technical Reports Server (NTRS)

    Sebok, T. M.

    1990-01-01

    Loral Defense Systems (LDS) developed a 3D Laser Radar Vision Processor system capable of detecting, classifying, and identifying small mobile targets as well as larger fixed targets using three dimensional laser radar imagery for use with a robotic type system. This processor system is designed to interface with the NASA Johnson Space Center in-house Extra Vehicular Activity (EVA) Retriever robot program and provide to it needed information so it can fetch and grasp targets in a space-type scenario.

  6. Enabling autonomous control for space reactor power systems

    SciTech Connect

    Wood, R. T.

    2006-07-01

    The application of nuclear reactors for space power and/or propulsion presents some unique challenges regarding the operations and control of the power system. Terrestrial nuclear reactors employ varying degrees of human control and decision-making for operations and benefit from periodic human interaction for maintenance. In contrast, the control system of a space reactor power system (SRPS) employed for deep space missions must be able to accommodate unattended operations due to communications delays and periods of planetary occlusion while adapting to evolving or degraded conditions with no opportunity for repair or refurbishment. Thus, an SRPS control system must provide for operational autonomy. Oak Ridge National Laboratory (ORNL) has conducted an investigation of the state of the technology for autonomous control to determine the experience base in the nuclear power application domain, both for space and terrestrial use. It was found that control systems with varying levels of autonomy have been employed in robotic, transportation, spacecraft, and manufacturing applications. However, autonomous control has not been implemented for an operating terrestrial nuclear power plant, nor has there been any experience beyond automating simple control loops for space reactors. Current automated control technologies for nuclear power plants are reasonably mature, and basic control for an SRPS is clearly feasible under optimum circumstances. However, autonomous control is primarily intended to account for the non-optimum circumstances when degradation, failure, and other off-normal events challenge the performance of the reactor and near-term human intervention is not possible. Thus, the development and demonstration of autonomous control capabilities for the specific domain of space nuclear power operations is needed. This paper will discuss the findings of the ORNL study and provide a description of the concept of autonomy, its key characteristics, and a prospective

  7. System safety analysis of an autonomous mobile robot

    SciTech Connect

    Bartos, R.J.

    1994-08-01

    Analysis of the safety of operating and maintaining the Stored Waste Autonomous Mobile Inspector (SWAMI) II in a hazardous environment at the Fernald Environmental Management Project (FEMP) was completed. The SWAMI II is a version of a commercial robot, the HelpMate(TM) robot produced by the Transitions Research Corporation, which is being updated to incorporate the systems required for inspecting mixed toxic chemical and radioactive waste drums at the FEMP. It also has modified obstacle detection and collision avoidance subsystems. The robot will autonomously travel down the aisles in storage warehouses to record images of containers and collect other data which are transmitted to an inspector at a remote computer terminal. A previous study showed the SWAMI II has economic feasibility. The SWAMI II will more accurately locate radioactive contamination than human inspectors. This thesis includes a System Safety Hazard Analysis and a quantitative Fault Tree Analysis (FTA). The objectives of the analyses are to prevent potentially serious events and to derive a comprehensive set of safety requirements from which the safety of the SWAMI II and other autonomous mobile robots can be evaluated. The Computer-Aided Fault Tree Analysis (CAFTA(C)) software is utilized for the FTA. The FTA shows that more than 99% of the safety risk occurs during maintenance, and that when the derived safety requirements are implemented the rate of serious events is reduced to below one event per million operating hours. Training and procedures in SWAMI II operation and maintenance provide an added safety margin. This study will promote the safe use of the SWAMI II and other autonomous mobile robots in the emerging technology of mobile robotic inspection.

  8. Active-Vision Control Systems for Complex Adversarial 3-D Environments

    DTIC Science & Technology

    2009-03-01

    environment. The new capabilities of autonomous sensing and control enable UAV/munition operations: in a clandestine/covert manner; in close proximity...nature, and without relying upon highly accurate 3-D models of the environment. ...blur). While these problems are classical in computer vision and image analysis, all algorithms published so far required knowledge of the calibration

  9. Experimental study on a smart wheelchair system using a combination of stereoscopic and spherical vision.

    PubMed

    Nguyen, Jordan S; Su, Steven W; Nguyen, Hung T

    2013-01-01

    This paper is concerned with an experimental performance study of a smart wheelchair system named TIM (Thought-controlled Intelligent Machine), which uses a unique camera configuration for vision. Included in this configuration are stereoscopic cameras for 3-Dimensional (3D) depth perception and mapping ahead of the wheelchair, and a spherical camera system for 360 degrees of monocular vision. The camera combination provides obstacle detection and mapping in unknown environments during real-time autonomous navigation of the wheelchair. With the integration of hands-free wheelchair control technology, designed as control methods for people with severe physical disability, the smart wheelchair system can assist the user with automated guidance during navigation. An experimental study of this system was conducted with a total of 10 participants, consisting of 8 able-bodied subjects and 2 tetraplegic (C-6 to C-7) subjects. The hands-free control technologies utilized for this testing were a head-movement controller (HMC) and a brain-computer interface (BCI). The results showed that the assistance of TIM's automated guidance system significantly reduced completion times on the obstacle course used in the study (p = 0.000533), compared with test runs conducted without TIM's assistance.

  10. Computer Vision Systems for Hardwood Logs and Lumber

    Treesearch

    Philip A. Araman; Tai-Hoon Cho; D. Zhu; R. Conners

    1991-01-01

    Computer vision systems being developed at Virginia Tech University, with support and cooperation from the U.S. Forest Service, are presented. Researchers at Michigan State University, West Virginia University, and Mississippi State University are also members of the research team working on various parts of this research. Our goals are to help U.S. hardwood...

  11. Global vision systems regulatory and standard setting activities

    NASA Astrophysics Data System (ADS)

    Tiana, Carlo; Münsterer, Thomas

    2016-05-01

    A number of committees globally, and the regulatory agencies they support, are actively delivering and updating performance standards for vision systems: Enhanced, Synthetic, and Combined, as they apply to both fixed-wing and, more recently, rotorcraft operations in low visibility. We provide an overview of each committee's present and past work, as well as an update on recent activities and future goals.

  12. Multiple-Agent Air/Ground Autonomous Exploration Systems

    NASA Technical Reports Server (NTRS)

    Fink, Wolfgang; Chao, Tien-Hsin; Tarbell, Mark; Dohm, James M.

    2007-01-01

    Autonomous systems of multiple-agent air/ground robotic units for exploration of the surfaces of remote planets are undergoing development. Modified versions of these systems could be used on Earth to perform tasks in environments dangerous or inaccessible to humans: examples of tasks could include scientific exploration of remote regions of Antarctica, removal of land mines, cleanup of hazardous chemicals, and military reconnaissance. A basic system according to this concept (see figure) would include a unit, suspended by a balloon or a blimp, that would be in radio communication with multiple robotic ground vehicles (rovers) equipped with video cameras and possibly other sensors for scientific exploration. The airborne unit would be free-floating, controlled by thrusters, or tethered either to one of the rovers or to a stationary object in or on the ground. Each rover would contain a semi-autonomous control system for maneuvering and would function under the supervision of a control system in the airborne unit. The rover maneuvering control system would utilize imagery from the onboard camera to navigate around obstacles. Avoidance of obstacles would also be aided by readout from an onboard (e.g., ultrasonic) sensor. Together, the rover and airborne control systems would constitute an overarching closed-loop control system to coordinate scientific exploration by the rovers.
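    A minimal sketch of the layered rover/supervisor control idea described above; all names, thresholds, and the one-dimensional world are illustrative assumptions, not details from the article:

```python
# Each rover runs a local avoidance loop using its onboard range sensor,
# while the airborne supervisor assigns waypoints from a survey plan.

STOP_RANGE_M = 0.5   # sonar reading below this triggers an evasive turn
GOAL_TOL_M = 0.2     # waypoint-reached tolerance

def rover_step(position, goal, sonar_range_m):
    """Local controller: head toward the supervisor's waypoint unless the
    onboard sensor reports an obstacle too close."""
    if sonar_range_m < STOP_RANGE_M:
        return "turn"      # obstacle avoidance overrides the supervisor
    if abs(goal - position) < GOAL_TOL_M:
        return "hold"      # waypoint reached; await a new assignment
    return "advance" if goal > position else "retreat"

def supervisor_assign(plan, rover_id):
    """Airborne side: hand each rover its next waypoint."""
    return plan[rover_id]

plan = {0: 4.0, 1: -2.5}
action = rover_step(position=0.0,
                    goal=supervisor_assign(plan, 0),
                    sonar_range_m=3.0)
# With a clear sonar reading and the waypoint ahead, the rover advances.
```

    The key design point mirrored here is that obstacle avoidance stays onboard each rover, so the closed loop through the airborne unit only needs to operate at the slower pace of waypoint assignment.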

  13. Digital vision system for three-dimensional model acquisition

    NASA Astrophysics Data System (ADS)

    Yuan, Ta; Lin, Huei-Yung; Qin, Xiangdong; Subbarao, Murali

    2000-10-01

    A digital vision system and the computational algorithms used by the system for three-dimensional (3D) model acquisition are described. The system is named Stonybrook VIsion System (SVIS). The system can acquire the 3D model (which includes the 3D shape and the corresponding image texture) of a simple object within a 300 mm X 300 mm X 300 mm volume placed about 600 mm from the system. SVIS integrates Image Focus Analysis (IFA) and Stereo Image Analysis (SIA) techniques for 3D shape and image texture recovery. First, 4 to 8 partial 3D models of the object are obtained from 4 to 8 views of the object. The partial models are then integrated to obtain a complete model of the object. The complete model is displayed using a 3D graphics rendering software (Apple's QuickDraw). Experimental results on several objects are presented.
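    The abstract does not specify the focus measure used by SVIS; a common choice for Image Focus Analysis, shown here purely as an illustration, is the variance of a discrete Laplacian evaluated over a focus stack:

```python
import numpy as np

def focus_measure(img):
    """Sharpness score: variance of a 5-point discrete Laplacian response.
    This specific measure is an assumed, common choice; the paper may use
    a different operator."""
    lap = (-4.0 * img[1:-1, 1:-1]
           + img[:-2, 1:-1] + img[2:, 1:-1]
           + img[1:-1, :-2] + img[1:-1, 2:])
    return float(lap.var())

def blur(img):
    """Crude box blur standing in for optical defocus."""
    out = img.copy()
    out[1:-1, 1:-1] = (img[:-2, 1:-1] + img[2:, 1:-1] +
                       img[1:-1, :-2] + img[1:-1, 2:] +
                       img[1:-1, 1:-1]) / 5.0
    return out

# Toy focus stack: the same texture progressively blurred. The frame with
# the highest score is the one in best focus, and its stack index would
# map to a depth estimate in a depth-from-focus scheme.
rng = np.random.default_rng(0)
sharp = rng.random((32, 32))
stack = [sharp, blur(sharp), blur(blur(sharp))]
best = max(range(len(stack)), key=lambda i: focus_measure(stack[i]))
```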

  14. Brain, mind, body and society: autonomous system in robotics.

    PubMed

    Shimoda, Motomu

    2013-12-01

    In this paper I examine issues related to the robot with a mind. Creating a robot with a mind aims to recreate neural function through engineering. The robot with a mind is expected not only to process external information through its built-in program and behave accordingly, but also to attain conscious activity that responds to multiple conditions, and flexible, interactive communication skills for coping with unknown situations. That prospect is based on the development of artificial intelligence, in which self-organizing and self-emergent functions have become available in recent years. To date, controllable aspects in robotics have been restricted to data making and the programming of cognitive abilities, while consciousness activities and communication skills have been regarded as uncontrollable due to their contingency and uncertainty. However, some researchers in robotics claim that every activity of the mind can be recreated by engineering and is therefore controllable. Based on the development of the cognitive abilities of children and the findings of neuroscience, researchers have attempted to produce the latest artificial intelligence with autonomous learning systems. I conclude that controllability is inconsistent with autonomy in the genuine sense, and that autonomous robots recreated by engineering cannot be autonomous partners of humans.

  15. The role of the autonomic nervous system in Tourette Syndrome

    PubMed Central

    Hawksley, Jack; Cavanna, Andrea E.; Nagai, Yoko

    2015-01-01

    Tourette Syndrome (TS) is a neurodevelopmental disorder, consisting of multiple involuntary movements (motor tics) and one or more vocal (phonic) tics. It affects up to one percent of children worldwide, of whom about one third continue to experience symptoms into adulthood. The central neural mechanisms of tic generation are not clearly understood, however recent neuroimaging investigations suggest impaired cortico-striato-thalamo-cortical activity during motor control. In the current manuscript, we will tackle the relatively under-investigated role of the peripheral autonomic nervous system, and its central influences, on tic activity. There is emerging evidence that both sympathetic and parasympathetic nervous activity influences tic expression. Pharmacological treatments which act on sympathetic tone are often helpful: for example, Clonidine (an alpha-2 adrenoreceptor agonist) is often used as first-choice medication for treating TS in children due to its good tolerability profile and potential usefulness for co-morbid attention-deficit and hyperactivity disorder. Clonidine suppresses sympathetic activity, reducing the triggering of motor tics. A general elevation of sympathetic tone is reported in patients with TS compared to healthy people, however this observation may reflect transient responses coupled to tic activity. Thus, the presence of autonomic impairments in patients with TS remains unclear. The effect of autonomic afferent input to the cortico-striato-thalamo-cortical circuit will be discussed schematically. We additionally review how TS is affected by modulation of central autonomic control through biofeedback and Vagus Nerve Stimulation (VNS). Biofeedback training can enable a patient to gain voluntary control over covert physiological responses by making these responses explicit. Electrodermal biofeedback training to elicit a reduction in sympathetic tone has a demonstrated association with reduced tic frequency. VNS, achieved through an implanted device

  16. Central autonomic nervous system response to autonomic challenges is altered in patients with a previous episode of Takotsubo cardiomyopathy.

    PubMed

    Pereira, Vitor H; Marques, Paulo; Magalhães, Ricardo; Português, João; Calvo, Lucy; Cerqueira, João J; Sousa, Nuno

    2016-04-01

    Takotsubo cardiomyopathy is an intriguing disease characterized by acute transient left ventricular dysfunction usually triggered by an episode of severe stress. The excessive levels of catecholamines and the overactivation of the sympathetic system are believed to be the main pathophysiologic mechanisms of Takotsubo cardiomyopathy, but it is unclear whether there is a structural or functional signature of the disease. In this sense, our aim was to characterize the central autonomic system response to autonomic challenges in patients with a previous episode of Takotsubo cardiomyopathy when compared with a control group of healthy volunteers. Functional magnetic resonance imaging (fMRI) was performed in four patients with a previous episode of Takotsubo cardiomyopathy (average age of 67 ± 12 years) and in eight healthy volunteers (average age of 66 ± 5 years) while being submitted to different autonomic challenges (cold exposure and Valsalva manoeuvre). The fMRI analysis revealed a significant variation of the blood oxygen level dependent signal triggered by the Valsalva manoeuvre in specific areas of the brain involved in the cortical control of the autonomic system and significant differences in the pattern of activation of the insular cortex, amygdala and the right hippocampus between patients with Takotsubo cardiomyopathy and controls, even though these regions did not present significant volumetric changes. The central autonomic response to autonomic challenges is altered in patients with Takotsubo cardiomyopathy, thus suggesting a dysregulation of the central autonomic nervous system network. Subsequent studies are needed to unveil whether these alterations are causal or predisposing factors to Takotsubo cardiomyopathy. © The European Society of Cardiology 2015.

  17. System control of an autonomous planetary mobile spacecraft

    NASA Technical Reports Server (NTRS)

    Dias, William C.; Zimmerman, Barbara A.

    1990-01-01

    The goal is to suggest the scheduling and control functions necessary for accomplishing the mission objectives of a fairly autonomous interplanetary mobile spacecraft while maximizing reliability. The aims are to provide an extensible, reliable system that is conservative in its use of on-board resources, gets full value from subsystem autonomy, and avoids the lure of ground micromanagement. A functional layout consisting of four basic elements is proposed: GROUND and SYSTEM EXECUTIVE system functions and RESOURCE CONTROL and ACTIVITY MANAGER subsystem functions. The system executive includes six subfunctions: SYSTEM MANAGER, SYSTEM FAULT PROTECTION, PLANNER, SCHEDULE ADAPTER, EVENT MONITOR and RESOURCE MONITOR. The full configuration is needed for autonomous operation on the Moon or Mars, whereas a reduced version without the planning, schedule adaptation and event monitoring functions could be appropriate for lower-autonomy use on the Moon. An implementation concept is suggested that is conservative in its use of system resources and consists of modules combined with a network communications fabric. A language concept termed a scheduling calculus, for rapidly performing essential on-board schedule adaptation functions, is introduced.
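
    The layered executive described above can be sketched in code. The following is a hypothetical illustration only: the subfunction names come from the abstract, but the dispatch logic, method names and return values are invented for the sketch.

```python
class SystemExecutive:
    """Routes spacecraft events to executive subfunctions.

    Illustrative stub of the SYSTEM EXECUTIVE layer; the real design
    also includes PLANNER and SCHEDULE ADAPTER subfunctions.
    """

    def __init__(self):
        self.log = []
        # Map event kinds to handlers; unknown kinds fall back to the
        # SYSTEM MANAGER.
        self.handlers = {
            "fault": self.system_fault_protection,
            "resource": self.resource_monitor,
            "event": self.event_monitor,
        }

    def dispatch(self, kind, detail):
        handler = self.handlers.get(kind, self.system_manager)
        return handler(detail)

    def system_manager(self, detail):
        self.log.append(("manager", detail))
        return "logged"

    def system_fault_protection(self, detail):
        self.log.append(("fault", detail))
        return "safing"

    def resource_monitor(self, detail):
        self.log.append(("resource", detail))
        return "rebalanced"

    def event_monitor(self, detail):
        self.log.append(("event", detail))
        return "scheduled"
```

    The fallback-to-SYSTEM-MANAGER choice mirrors the paper's conservatism: any event the executive does not recognize is at least recorded rather than dropped.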

  18. Building a 3D scanner system based on monocular vision.

    PubMed

    Zhang, Zhiyi; Yuan, Lin

    2012-04-10

    This paper proposes a three-dimensional scanner system built using an ingenious geometric construction method based on monocular vision. The system is simple, low cost, and easy to use, and its measurements are precise. To build it, only a web camera, a handheld linear laser, and a background calibration board are required. The experimental results show that the system is robust and effective, and that its scanning precision satisfies the needs of typical users.
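
    The core of a camera-plus-line-laser scanner of this kind is laser triangulation: the surface point lies at the intersection of the known laser plane and the camera ray through the observed pixel. The sketch below derives depth under a simplified pinhole geometry (camera at the origin, laser emitter offset by baseline b and tilted by angle theta toward the optical axis); it is a textbook formulation, not the paper's specific construction.

```python
import math

def depth_from_laser(x_img, f, b, theta):
    """Depth of a surface point lit by the laser line.

    Geometry: a point on the laser plane satisfies x = b - z*tan(theta);
    the camera ray through image coordinate x_img satisfies x = z*x_img/f.
    Equating and solving for z gives the depth below.

    x_img: image-plane x coordinate (same units as f)
    f:     focal length
    b:     camera-to-laser baseline
    theta: laser plane tilt toward the optical axis (radians)
    """
    return b / (x_img / f + math.tan(theta))
```

    Calibration with the background board amounts to estimating f, b and theta; once they are known, every laser pixel in every frame yields one 3D point.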

  19. Component-Oriented Behavior Extraction for Autonomic System Design

    NASA Technical Reports Server (NTRS)

    Bakera, Marco; Wagner, Christian; Margaria,Tiziana; Hinchey, Mike; Vassev, Emil; Steffen, Bernhard

    2009-01-01

    Rich and multifaceted domain specific specification languages like the Autonomic System Specification Language (ASSL) help to design reliable systems with self-healing capabilities. The GEAR game-based Model Checker has been used successfully to investigate properties of the ESA Exo- Mars Rover in depth. We show here how to enable GEAR s game-based verification techniques for ASSL via systematic model extraction from a behavioral subset of the language, and illustrate it on a description of the Voyager II space mission.

  20. Using Robotic Operating System (ROS) to control autonomous observatories

    NASA Astrophysics Data System (ADS)

    Vilardell, Francesc; Artigues, Gabriel; Sanz, Josep; García-Piquer, Álvaro; Colomé, Josep; Ribas, Ignasi

    2016-07-01

    Astronomical observatories are complex systems requiring the integration of numerous devices into a common platform. We present here the first steps toward integrating the popular Robotic Operating System (ROS) into the control of a fully autonomous observatory. The observatory is also equipped with a decision-making procedure that can automatically react to a changing environment (such as weather events). The results obtained so far show that the automation of a small observatory can be greatly simplified using ROS, and made more robust by the implementation of our decision-making algorithms.
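
    In a ROS-based observatory, decision-making logic of this sort would typically live in a node subscribed to a weather topic. The plain-Python sketch below shows only the decision step; the data class, function names and all threshold values are hypothetical, not taken from the paper.

```python
from dataclasses import dataclass

@dataclass
class WeatherSample:
    wind_kmh: float
    humidity_pct: float
    cloud_cover_pct: float

def dome_should_close(w, wind_limit=50.0, humidity_limit=85.0,
                      cloud_limit=80.0):
    """Return True if any reading exceeds its safety threshold.

    Thresholds are illustrative defaults; a real observatory would
    tune them and likely add hysteresis to avoid dome chatter.
    """
    return (w.wind_kmh > wind_limit
            or w.humidity_pct > humidity_limit
            or w.cloud_cover_pct > cloud_limit)
```

    Wrapping this predicate in a ROS subscriber callback is what lets the observatory react to weather events without human intervention.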

  1. Hepatic Control of Energy Metabolism via the Autonomic Nervous System

    PubMed Central

    2017-01-01

    Although the human liver comprises approximately 2.8% of the body weight, it plays a central role in the control of energy metabolism. While the biochemistry of energy substrates such as glucose, fatty acids, and ketone bodies in the liver is well understood, many aspects of the overall control system for hepatic metabolism remain largely unknown. These include mechanisms underlying the ascertainment of its energy metabolism status by the liver, and the way in which this information is used to communicate and function together with adipose tissues and other organs involved in energy metabolism. This review article summarizes hepatic control of energy metabolism via the autonomic nervous system. PMID:27592630

  2. TOPEX/Poseidon electrical power system -- Autonomous operation

    SciTech Connect

    Chetty, P.R.K.; Richardson, R.; Sherwood, R.; Deligiannis, F.

    1996-12-31

    The main objective of the TOPEX/Poseidon Satellite is to monitor the world's oceans for scientific study of weather and climate prediction, coastal storm warning and maritime safety. The operational conditions of this satellite imposed challenging requirements for the on-board Electrical Power System (EPS). The power system is designed to maintain a certain level of autonomy. This paper presents the autonomous operations planned, their on-orbit performance and how some of the operations were modified as certain unpredictable circumstances were discovered.

  3. Hepatic Control of Energy Metabolism via the Autonomic Nervous System.

    PubMed

    Yahagi, Naoya

    2017-01-01

    Although the human liver comprises approximately 2.8% of the body weight, it plays a central role in the control of energy metabolism. While the biochemistry of energy substrates such as glucose, fatty acids, and ketone bodies in the liver is well understood, many aspects of the overall control system for hepatic metabolism remain largely unknown. These include mechanisms underlying the ascertainment of its energy metabolism status by the liver, and the way in which this information is used to communicate and function together with adipose tissues and other organs involved in energy metabolism. This review article summarizes hepatic control of energy metabolism via the autonomic nervous system.

  4. Autonomous and Autonomic Swarms

    NASA Technical Reports Server (NTRS)

    Hinchey, Michael G.; Rash, James L.; Truszkowski, Walter F.; Rouff, Christopher A.; Sterritt, Roy

    2005-01-01

    A watershed in systems engineering is represented by the advent of swarm-based systems that accomplish missions through cooperative action by a (large) group of autonomous individuals, each having simple capabilities and no global knowledge of the group's objective. Such systems, with individuals capable of surviving in hostile environments, pose unprecedented challenges to system developers. Design, testing, and verification at much higher levels will be required, together with the corresponding tools, to bring such systems to fruition. Concepts for possible future NASA space exploration missions include autonomous, autonomic swarms. Engineering swarm-based missions begins with understanding autonomy and autonomicity and how to design, test, and verify systems that have those properties and, simultaneously, the capability to accomplish prescribed mission goals. Formal methods-based technologies, both projected and in development, are described in terms of their potential utility to swarm-based system developers.

  5. Application of edge detection algorithm for vision guided robotics assembly system

    NASA Astrophysics Data System (ADS)

    Balabantaray, Bunil Kumar; Jha, Panchanand; Biswal, Bibhuti Bhusan

    2013-12-01

    A machine vision system has a major role in making a robotic assembly system autonomous. Part detection and identification of the correct part are important tasks that must be performed carefully by the vision system to initiate the process. The process consists of many sub-processes wherein image capturing, digitizing, enhancement, etc. account for reconstructing the part for subsequent operations. Edge detection of the grabbed image therefore plays an important role in the entire image processing activity, and one needs to choose the correct tool for the process with respect to the given environment. This paper presents a comparative study of edge detection algorithms for grasping objects in a robotic assembly system. The work is performed in Matlab R2010a Simulink and compares four algorithms: the Canny, Roberts, Prewitt and Sobel edge detectors. An attempt has been made to find the best algorithm for the problem; the Canny edge detector is found to give better results with minimum error for the intended task.
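
    For illustration, the Sobel operator (one of the detectors compared) can be implemented directly in NumPy. The paper itself works in Matlab Simulink; the sketch below is only a restatement of the standard operator, not the paper's code.

```python
import numpy as np

# Standard 3x3 Sobel kernels for horizontal and vertical gradients.
SOBEL_X = np.array([[-1, 0, 1],
                    [-2, 0, 2],
                    [-1, 0, 1]], dtype=float)
SOBEL_Y = SOBEL_X.T

def convolve2d(img, k):
    """Valid-mode 2-D convolution (kernel flipped, true convolution)."""
    k = np.flipud(np.fliplr(k))
    h, w = k.shape
    out = np.zeros((img.shape[0] - h + 1, img.shape[1] - w + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(img[i:i + h, j:j + w] * k)
    return out

def sobel_magnitude(img):
    """Gradient magnitude: strong responses mark edges."""
    gx = convolve2d(img, SOBEL_X)
    gy = convolve2d(img, SOBEL_Y)
    return np.hypot(gx, gy)
```

    Roberts and Prewitt differ only in their kernels; Canny adds Gaussian smoothing, non-maximum suppression and hysteresis thresholding on top of such a gradient, which is why it typically yields cleaner edges for part localization.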

  6. 3D vision upgrade kit for the TALON robot system

    NASA Astrophysics Data System (ADS)

    Bodenhamer, Andrew; Pettijohn, Bradley; Pezzaniti, J. Larry; Edmondson, Richard; Vaden, Justin; Hyatt, Brian; Morris, James; Chenault, David; Tchon, Joe; Barnidge, Tracy; Kaufman, Seth; Kingston, David; Newell, Scott

    2010-02-01

    In September 2009 the Fort Leonard Wood Field Element of the US Army Research Laboratory - Human Research and Engineering Directorate, in conjunction with Polaris Sensor Technologies and Concurrent Technologies Corporation, evaluated the objective performance benefits of Polaris' 3D vision upgrade kit for the TALON small unmanned ground vehicle (SUGV). This upgrade kit is a field-upgradable set of two stereo-cameras and a flat panel display, using only standard hardware, data and electrical connections existing on the TALON robot. Using both the 3D vision system and a standard 2D camera and display, ten active-duty Army Soldiers completed seven scenarios designed to be representative of missions performed by military SUGV operators. Mission time savings (6.5% to 32%) were found for six of the seven scenarios when using the 3D vision system. Operators were not only able to complete tasks quicker but, for six of seven scenarios, made fewer mistakes in their task execution. Subjective Soldier feedback was overwhelmingly in support of pursuing 3D vision systems, such as the one evaluated, for fielding to combat units.

  7. A vision guided hybrid robotic prototype system for stereotactic surgery.

    PubMed

    Wei, Jun; Wang, Tianmiao; Liu, Da

    2011-12-01

    Robot-assisted surgery (RAS) systems help surgeons performing accurate operations, but a number of drawbacks render them not yet suitable for clinical theaters and procedures. In this paper, a novel vision guided robotic system is proposed to facilitate navigation procedures. A vision guided hybrid robotic system is designed, consisting of a passive serial arm and an active parallel frame. Navigation is accomplished in three steps: approaching, aiming and insertion. First, the target is safely approached with the passive arm. Second, the trajectory is automatically aligned using the parallel frame. And then the target is reached by manual insertion. A stereo camera is used to position fiducials, the robot and the surgical tool. It also provides working area images for professional surgeons at a remote site. The prototype system accomplished phantom and animal trials with satisfactory accuracy. The robot can easily be adjusted to avoid obstacles and quickly set up on an optimal 'approaching' place. The surgical tool is automatically aligned with the trajectory. The system can withdraw from the working area and restore the aiming posture freely. With the help of the working area images, some important navigation steps can be handled remotely. A novel vision guided robotic system is proposed and validated. It enables surgeons to fit the system to the clinical theater. System safety and feasibility are enhanced by multi-step navigation procedures and remote image monitoring. The system can be operated easily by general clinical staff. Copyright © 2011 John Wiley & Sons, Ltd.

  8. Human Centered Autonomous and Assistant Systems Testbed for Exploration Operations

    NASA Technical Reports Server (NTRS)

    Malin, Jane T.; Mount, Frances; Carreon, Patricia; Torney, Susan E.

    2001-01-01

    The Engineering and Mission Operations Directorates at NASA Johnson Space Center are combining laboratories and expertise to establish the Human Centered Autonomous and Assistant Systems Testbed for Exploration Operations. This is a testbed for human centered design, development and evaluation of intelligent autonomous and assistant systems that will be needed for human exploration and development of space. This project will improve human-centered analysis, design and evaluation methods for developing intelligent software. This software will support human-machine cognitive and collaborative activities in future interplanetary work environments where distributed computer and human agents cooperate. We are developing and evaluating prototype intelligent systems for distributed multi-agent mixed-initiative operations. The primary target domain is control of life support systems in a planetary base. Technical approaches will be evaluated for use during extended manned tests in the target domain, the Bioregenerative Advanced Life Support Systems Test Complex (BIO-Plex). A spinoff target domain is the International Space Station (ISS) Mission Control Center (MCC). Products of this project include human-centered intelligent software technology, innovative human interface designs, and human-centered software development processes, methods and products. The testbed uses adjustable autonomy software and life support systems simulation models from the Adjustable Autonomy Testbed, to represent operations on the remote planet. Ground operations prototypes and concepts will be evaluated in the Exploration Planning and Operations Center (ExPOC) and Jupiter Facility.

  9. Enhanced vision systems: results of simulation and operational tests

    NASA Astrophysics Data System (ADS)

    Hecker, Peter; Doehler, Hans-Ullrich

    1998-07-01

    Today's aircrews have to handle more and more complex situations. The most critical tasks in civil aviation are landing approaches and taxiing; especially under bad weather conditions, the crew has to handle a tremendous workload. Therefore DLR's Institute of Flight Guidance has developed a concept for an enhanced vision system (EVS) which increases the performance and safety of the aircrew and provides comprehensive situational awareness. Some elements of this concept have been presented in previous contributions, i.e. the 'Simulation of Imaging Radar for Obstacle Detection and Enhanced Vision' by Doehler and Bollmeyer, 1996. This paper gives an overview of DLR's enhanced vision concept and research approach, which consists of two main components: simulation and experimental evaluation. In a first step, the simulation environment for enhanced vision research with a pilot-in-the-loop is introduced. An existing fixed-base flight simulator is supplemented by real-time simulations of imaging sensors, i.e. imaging radar and infrared. By applying methods of data fusion, an enhanced vision display is generated combining different levels of information, such as terrain model data, processed images acquired by sensors, aircraft state vectors and data transmitted via datalink. The second part of this contribution presents some experimental results. In cooperation with Daimler Benz Aerospace Sensorsystems Ulm, a test van and a test aircraft were equipped with a prototype of an imaging millimeter wave radar. This sophisticated HiVision radar is currently one of the most promising sensors for all-weather operations. Images acquired by this sensor are shown, as well as results of data fusion processes based on digital terrain models. The contribution is concluded by a short video presentation.
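
    A common baseline for fusing a sensor image with a rendered terrain-model image is a per-pixel confidence-weighted blend. The sketch below illustrates that general idea only; it is not DLR's actual fusion method, and the function and parameter names are invented.

```python
import numpy as np

def fuse(sensor_img, terrain_img, sensor_conf):
    """Blend a sensor image with a terrain-model rendering.

    sensor_conf may be a scalar or a per-pixel confidence map in
    [0, 1]: where the sensor is trusted, its pixels dominate; where
    it is noisy (e.g. radar clutter), the database rendering fills in.
    """
    w = np.clip(sensor_conf, 0.0, 1.0)
    return w * sensor_img + (1.0 - w) * terrain_img
```

    In an EVS display, further layers (aircraft state vectors, datalink traffic) would then be drawn as symbology on top of the fused background.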

  10. Systems, methods and apparatus for quiescence of autonomic safety devices with self action

    NASA Technical Reports Server (NTRS)

    Hinchey, Michael G. (Inventor); Sterritt, Roy (Inventor)

    2011-01-01

    Systems, methods and apparatus are provided through which, in some embodiments, an autonomic environmental safety device may be quiesced. In at least one embodiment, a method for managing an autonomic safety device, such as a smoke detector, based on the functioning state and operating status of the device includes processing received signals from the device to obtain an analysis of its condition, generating one or more stay-awake signals based on the functioning state and operating status, transmitting the stay-awake signal, transmitting self health/urgency data, and transmitting environment health/urgency data. A quiesce component of an autonomic safety device can render the device inactive for a specific amount of time or until a challenging situation has passed.
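
    The two quiesce modes described above (inactive for a fixed interval, or inactive until a condition clears) can be sketched as follows. The class, its method names and its structure are illustrative only, not taken from the patent.

```python
import time

class AutonomicSafetyDevice:
    """Minimal sketch of the quiesce behaviour described above."""

    def __init__(self):
        self.quiesced_until = 0.0     # monotonic deadline for timed quiesce
        self.condition_active = False  # open-ended quiesce flag

    def quiesce(self, seconds=None):
        """Render the device inactive for `seconds`, or until
        clear_condition() is called if no duration is given."""
        if seconds is not None:
            self.quiesced_until = time.monotonic() + seconds
        else:
            self.condition_active = True

    def clear_condition(self):
        self.condition_active = False

    def is_active(self):
        return (not self.condition_active
                and time.monotonic() >= self.quiesced_until)
```

    A real device would couple this state machine to the stay-awake signals, so that a quiesced detector still reports its own health while suppressing alarms.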

  11. Levels of Autonomy and Autonomous System Performance Assessment for Intelligent Unmanned Systems

    DTIC Science & Technology

    2014-04-01

    GSL SR-14-1. Preface: This report was compiled for Kelly Swinson, ASTERS Study Director, Unmanned Ground Vehicle (UGV) Test Officer, US Army Aberdeen Test Center, Aberdeen Proving Ground, MD, as part of the Autonomy System Testing and Evaluation Study (ASTERS). Acronyms used include AGV (Autonomous Ground Vehicle), AL (Autonomy Level) and ALFUS (Autonomy Levels for Unmanned Systems).

  12. Knowledge-based and integrated monitoring and diagnosis in autonomous power systems

    NASA Technical Reports Server (NTRS)

    Momoh, J. A.; Zhang, Z. Z.

    1990-01-01

    A new technique of knowledge-based and integrated monitoring and diagnosis (KBIMD) to deal with abnormalities and incipient or potential failures in autonomous power systems is presented. The KBIMD conception is discussed as a new function of autonomous power system automation. Available diagnostic modelling, system structure, principles and strategies are suggested. In order to verify the feasibility of the KBIMD, a preliminary prototype expert system is designed to simulate the KBIMD function in a main electric network of the autonomous power system.

  13. Distributed autonomous systems: resource management, planning, and control algorithms

    NASA Astrophysics Data System (ADS)

    Smith, James F., III; Nguyen, ThanhVu H.

    2005-05-01

    Distributed autonomous systems, i.e., systems that have separated distributed components, each of which exhibits some degree of autonomy, are increasingly providing solutions to naval and other DoD problems. Recently developed control, planning and resource allocation algorithms for two types of distributed autonomous systems will be discussed. The first distributed autonomous system (DAS) to be discussed consists of a collection of unmanned aerial vehicles (UAVs) that are under fuzzy logic control. The UAVs fly and conduct meteorological sampling in a coordinated fashion determined by their fuzzy logic controllers to determine the atmospheric index of refraction. Once in flight, no human intervention is required. A fuzzy planning algorithm determines the optimal trajectory, sampling rate and pattern for the UAVs and an interferometer platform while taking into account risk, reliability, priority for sampling in certain regions, fuel limitations, mission cost, and related uncertainties. The real-time fuzzy control algorithm running on each UAV gives the UAV limited autonomy, allowing it to change course immediately without consulting any commander, request other UAVs to help it, alter its sampling pattern and rate when observing interesting phenomena, or terminate the mission and return to base. The algorithms developed will be compared to a resource manager (RM) developed for another DAS problem related to electronic attack (EA). This RM is based on fuzzy logic and optimized by evolutionary algorithms. It allows a group of dissimilar platforms to use EA resources distributed throughout the group. For both DAS types, significant theoretical and simulation results will be presented.
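
    The essence of such a fuzzy controller is mapping crisp inputs through membership functions and aggregating rule activations. The two-rule sketch below shows the mechanism only; the rules, membership shapes and every threshold are invented for illustration and are not the paper's controller.

```python
def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b,
    falling to zero at c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def return_urgency(fuel_frac, threat_level):
    """Tiny two-rule fuzzy controller (illustrative parameters only):

      IF fuel is LOW    THEN return-to-base urgency is HIGH
      IF threat is HIGH THEN return-to-base urgency is HIGH

    Rules are aggregated with max; the activation itself serves as the
    (already normalized) defuzzified output.
    """
    fuel_low = tri(fuel_frac, -0.1, 0.0, 0.4)
    threat_high = tri(threat_level, 0.5, 1.0, 1.5)
    return max(fuel_low, threat_high)
```

    A full controller of the kind described would carry many more rules (sampling priority, reliability, mission cost) and a proper defuzzification step such as centroid-of-area.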

  14. Measuring cardiac autonomic nervous system (ANS) activity in children.

    PubMed

    van Dijk, Aimée E; van Lien, René; van Eijsden, Manon; Gemke, Reinoud J B J; Vrijkotte, Tanja G M; de Geus, Eco J

    2013-04-29

    The autonomic nervous system (ANS) mainly controls automatic bodily functions that are engaged in homeostasis, like heart rate, digestion, respiratory rate, salivation, perspiration and renal function. The ANS has two main branches: the sympathetic nervous system, preparing the human body for action in times of danger and stress, and the parasympathetic nervous system, which regulates the resting state of the body. ANS activity can be measured invasively, for instance by radiotracer techniques or microelectrode recording from superficial nerves, or it can be measured non-invasively by using changes in an organ's response as a proxy for changes in ANS activity, for instance of the sweat glands or the heart. Invasive measurements have the highest validity but are poorly feasible in large-scale samples, where non-invasive measures are the preferred approach. Autonomic effects on the heart can be reliably quantified by recording the electrocardiogram (ECG) in combination with the impedance cardiogram (ICG), which reflects the changes in thorax impedance in response to respiration and the ejection of blood from the ventricle into the aorta. From the respiration and ECG signals, respiratory sinus arrhythmia can be extracted as a measure of cardiac parasympathetic control. From the ECG and the left ventricular ejection signals, the preejection period can be extracted as a measure of cardiac sympathetic control. ECG and ICG recording is mostly done in laboratory settings. However, having the subjects report to a laboratory greatly reduces ecological validity, is not always feasible in large-scale epidemiological studies, and can be intimidating for young children. An ambulatory device for simultaneous ECG and ICG recording resolves these three problems. Here, we present a study design for a minimally invasive and rapid assessment of cardiac autonomic control in children, using a validated ambulatory device (1-5), the VU University Ambulatory Monitoring System (VU
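
    One common way to quantify respiratory sinus arrhythmia from the signals described above is the peak-valley method: per breath, the longest interbeat interval during expiration minus the shortest during inspiration. The sketch below shows that arithmetic for a single pre-segmented breath; the breath segmentation and all names are illustrative, not the validated device's implementation.

```python
def rsa_peak_valley(ibi_ms, phase):
    """Peak-valley RSA estimate for one breath, in milliseconds.

    ibi_ms: interbeat intervals (ms) for the beats in this breath
    phase:  'i' (inspiration) or 'e' (expiration) for each beat,
            e.g. from a thorax-impedance respiration trace
    """
    insp = [x for x, p in zip(ibi_ms, phase) if p == "i"]
    exp_ = [x for x, p in zip(ibi_ms, phase) if p == "e"]
    if not insp or not exp_:
        return 0.0  # breath lacked beats in one phase; score as no RSA
    return max(exp_) - min(insp)
```

    Averaging this difference over all breaths in a recording yields the parasympathetic index; sympathetic control (the preejection period) instead requires locating the ECG Q-onset and the ICG B-point, which is beyond this sketch.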

  15. 75 FR 71183 - Twelfth Meeting: Joint RTCA Special Committee 213: EUROCAE WG-79: Enhanced Flight Vision Systems...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-11-22

    ...: Enhanced Flight Vision Systems/Synthetic Vision Systems (EFVS/SVS) AGENCY: Federal Aviation Administration... Flight Vision Systems/Synthetic Vision Systems (EFVS/SVS). SUMMARY: The FAA is issuing this notice to advise the public of a meeting of Joint RTCA Special Committee 213: EUROCAE WG-79: Enhanced Flight Vision...

  16. Crew and Display Concepts Evaluation for Synthetic / Enhanced Vision Systems

    NASA Technical Reports Server (NTRS)

    Bailey, Randall E.; Kramer, Lynda J.; Prinzel, Lawrence J., III

    2006-01-01

    NASA's Synthetic Vision Systems (SVS) project is developing technologies with practical applications that strive to eliminate low-visibility conditions as a causal factor in civil aircraft accidents and to replicate the operational benefits of clear-day flight operations, regardless of the actual outside visibility condition. Enhanced Vision System (EVS) technologies are analogous and complementary in many respects to SVS, the principal difference being that EVS is an imaging sensor presentation, as opposed to a database-derived image. The use of EVS in civil aircraft is projected to increase rapidly, as the Federal Aviation Administration recently changed the aircraft operating rules under Part 91, revising the flight visibility requirements for conducting operations to civil airports. Operators conducting straight-in instrument approach procedures may now operate below the published approach minimums when using an approved EVS that shows the required visual references on the pilot's Head-Up Display. An experiment was conducted to evaluate the complementary use of SVS and EVS technologies, specifically focusing on new techniques for integration and/or fusion of synthetic and enhanced vision technologies and crew resource management while operating under the newly adopted FAA rules, which provide operating credit for EVS. Overall, the experimental data showed that the integration and/or fusion of synthetic and enhanced vision technologies could provide significant improvements in situation awareness, without concomitant increases in workload and display clutter, for both the pilot-flying and the pilot-not-flying.

  17. Supervised autonomous rendezvous and docking system technology evaluation

    NASA Technical Reports Server (NTRS)

    Marzwell, Neville I.

    1991-01-01

    Technology for manned space flight is mature and has an extensive history of the use of man-in-the-loop rendezvous and docking, but there is no history of automated rendezvous and docking. Sensors exist that can operate in the space environment. The Shuttle radar can be used for ranges down to 30 meters, Japan and France are developing laser rangers, and considerable work is going on in the U.S. However, there is a need to validate a flight-qualified sensor for the range of 30 meters to contact. The number of targets and illumination patterns should be minimized to reduce operational constraints, with one or more sensors integrated into a robust system for autonomous operation. To achieve system redundancy, it is worthwhile to follow a parallel development of qualifying and extending the range of the 0-12 meter MSFC sensor and to simultaneously qualify the 0-30(+) meter JPL laser ranging system as an additional sensor with overlapping capabilities. Such an approach offers a redundant sensor suite for autonomous rendezvous and docking. The development should include the optimization of integrated sensory systems, packaging, mission envelopes, and computer image processing to mimic brain perception and real-time response. The benefits of the Global Positioning System in providing real-time positioning data of high accuracy must be incorporated into the design. The use of GPS-derived attitude data should be investigated further and validated.

  18. The organizing vision of integrated health information systems.

    PubMed

    Ellingsen, Gunnar; Monteiro, Eric

    2008-09-01

    The notion of 'integration' in the context of health information systems is ill-defined yet in widespread use. We identify a variety of meanings, ranging from the purely technical integration of information systems to the integration of services. This ambiguity (or interpretive flexibility), we argue, is inherent rather than accidental: it is a necessary prerequisite for mobilizing political and ideological support among stakeholders for integrated health information systems. Building on this, our aim is to trace out the career dynamics of the vision of 'integration/integrated'. These career dynamics comprise the transformation of both the imaginary and the material (technological) realizations of the vision of integrated care as its implementation unfolds. Empirically, we draw on a large, ongoing project at the University Hospital of North Norway (UNN) to establish an integrated health information system.

  19. Flexible Vision Control System For Precision Robotic Arc Welding

    NASA Astrophysics Data System (ADS)

    Richardson, Richard W.

    1989-02-01

    A system is described which is based on a unique weld image sensor design that integrates the optical system into the weld end effector to produce the so-called "coaxial view" of the weld zone. The resulting weld image is processed by a flexible, table-driven vision processing system which can be adapted to detect a variety of features and feature relationships. Provision is made for interactive "teaching" of image features for generation of table parameters from test welds. A table-driven control program allows various vision control strategies to be invoked. The main result of the system is a level of emulation of the capability of the expert welder or welding operator, essential to successful precision welding robotization.

  20. Fiber optic coherent laser radar 3D vision system

    SciTech Connect

    Clark, R.B.; Gallman, P.G.; Slotwinski, A.R.; Wagner, K.; Weaver, S.; Xu, Jieping

    1996-12-31

    This CLVS will provide a substantial advance in high-speed computer vision performance to support robotic Environmental Management (EM) operations. The 3D system employs a compact fiber-optic-based scanner and operates at a 128 x 128 pixel frame size at one frame per second, with a range resolution of 1 mm over its 1.5 meter working range. Using acousto-optic deflectors, the scanner is completely randomly addressable. This can provide live 3D monitoring for situations where it is necessary to update once per second, such as decontamination and decommissioning operations in which robotic systems are altering the scene: waste removal, surface scarifying, or equipment disassembly and removal. The fiber-optic coherent laser radar based system is immune to variations in lighting, color, or surface shading, which have plagued the reliability of existing 3D vision systems, while providing substantially superior range resolution.