Sample records for flight vision system

  1. 77 FR 16890 - Eighteenth Meeting: RTCA Special Committee 213, Enhanced Flight Visions Systems/Synthetic Vision...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-03-22

    ... Committee 213, Enhanced Flight Visions Systems/Synthetic Vision Systems (EFVS/SVS) AGENCY: Federal Aviation... 213, Enhanced Flight Visions Systems/Synthetic Vision Systems (EFVS/SVS). SUMMARY: The FAA is issuing... Flight Visions Systems/Synthetic Vision Systems (EFVS/SVS). DATES: The meeting will be held April 17-19...

  2. 78 FR 5557 - Twenty-First Meeting: RTCA Special Committee 213, Enhanced Flight Vision Systems/Synthetic Vision...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-01-25

    ... Committee 213, Enhanced Flight Vision Systems/Synthetic Vision Systems (EFVS/SVS) AGENCY: Federal Aviation... 213, Enhanced Flight Vision Systems/Synthetic Vision Systems (EFVS/SVS). SUMMARY: The FAA is issuing..., Enhanced Flight Vision Systems/Synthetic Vision Systems (EFVS/SVS). DATES: The meeting will be held...

  3. 77 FR 56254 - Twentieth Meeting: RTCA Special Committee 213, Enhanced Flight Vision Systems/Synthetic Vision...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-09-12

    ... Committee 213, Enhanced Flight Vision Systems/Synthetic Vision Systems (EFVS/SVS) AGENCY: Federal Aviation... 213, Enhanced Flight Vision Systems/Synthetic Vision Systems (EFVS/SVS). SUMMARY: The FAA is issuing... Flight Vision Systems/Synthetic Vision Systems (EFVS/SVS). DATES: The meeting will be held October 2-4...

  4. 78 FR 16756 - Twenty-Second Meeting: RTCA Special Committee 213, Enhanced Flight Vision Systems/Synthetic...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-03-18

    ... Committee 213, Enhanced Flight Vision Systems/Synthetic Vision Systems (EFVS/SVS) AGENCY: Federal Aviation... 213, Enhanced Flight Vision Systems/Synthetic Vision Systems (EFVS/SVS). SUMMARY: The FAA is issuing..., Enhanced Flight Vision Systems/Synthetic Vision Systems (EFVS/SVS). DATES: The meeting will be held April...

  5. 78 FR 55774 - Twenty Fourth Meeting: RTCA Special Committee 213, Enhanced Flight Vision Systems/Synthetic...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-09-11

    ... Committee 213, Enhanced Flight Vision Systems/Synthetic Vision Systems (EFVS/SVS) AGENCY: Federal Aviation... 213, Enhanced Flight Vision Systems/Synthetic Vision Systems (EFVS/SVS). SUMMARY: The FAA is issuing..., Enhanced Flight Vision Systems/Synthetic Vision Systems (EFVS/SVS). DATES: The meeting will be held October...

  6. 75 FR 17202 - Eighth Meeting: Joint RTCA Special Committee 213: EUROCAE WG-79: Enhanced Flight Vision Systems...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-04-05

    ... Committee 213: EUROCAE WG-79: Enhanced Flight Vision Systems/Synthetic Vision Systems (EFVS/SVS) AGENCY...-79: Enhanced Flight Vision Systems/Synthetic Vision Systems (EFVS/SVS). SUMMARY: The FAA is issuing...: Enhanced Flight Vision Systems/Synthetic Vision Systems (EFVS/SVS). DATES: The meeting will be held April...

  7. 75 FR 44306 - Eleventh Meeting: Joint RTCA Special Committee 213: EUROCAE WG-79: Enhanced Flight Vision Systems...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-07-28

    ... Committee 213: EUROCAE WG-79: Enhanced Flight Vision Systems/Synthetic Vision Systems (EFVS/SVS) AGENCY... Special Committee 213: EUROCAE WG-79: Enhanced Flight Vision Systems/Synthetic Vision Systems (EFVS/SVS... 213: EUROCAE WG-79: Enhanced Flight Vision Systems/Synthetic Vision Systems (EFVS/SVS). DATES: The...

  8. 75 FR 71183 - Twelfth Meeting: Joint RTCA Special Committee 213: EUROCAE WG-79: Enhanced Flight Vision Systems...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-11-22

    ... Committee 213: EUROCAE WG-79: Enhanced Flight Vision Systems/Synthetic Vision Systems (EFVS/SVS) AGENCY... Committee 213: EUROCAE WG-79: Enhanced Flight Vision Systems/Synthetic Vision Systems (EFVS/SVS). SUMMARY...: EUROCAE WG-79: Enhanced Flight Vision Systems/Synthetic Vision Systems (EFVS/SVS). DATES: The meeting will...

  9. 76 FR 11847 - Thirteenth Meeting: Joint RTCA Special Committee 213: EUROCAE WG-79: Enhanced Flight Vision...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-03-03

    ... Special Committee 213: EUROCAE WG-79: Enhanced Flight Vision Systems/Synthetic Vision Systems (EFVS/SVS... Joint RTCA Special Committee 213: EUROCAE WG-79: Enhanced Flight Vision Systems/Synthetic Vision Systems... Special Committee 213: EUROCAE WG-79: Enhanced Flight Vision Systems/Synthetic Vision Systems (EFVS/SVS...

  10. 76 FR 20437 - Fourteenth Meeting: Joint RTCA Special Committee 213: EUROCAE WG-79: Enhanced Flight Vision...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-04-12

    ... Special Committee 213: EUROCAE WG-79: Enhanced Flight Vision Systems/Synthetic Vision Systems (EFVS/SVS... Joint RTCA Special Committee 213: EUROCAE WG-79: Enhanced Flight Vision Systems/Synthetic Vision Systems... Special Committee 213: EUROCAE WG-79: Enhanced Flight Vision Systems/Synthetic Vision Systems (EFVS/SVS...

  11. 75 FR 38863 - Tenth Meeting: Joint RTCA Special Committee 213: EUROCAE WG-79: Enhanced Flight Vision Systems...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-07-06

    ... Committee 213: EUROCAE WG-79: Enhanced Flight Vision Systems/Synthetic Vision Systems (EFVS/SVS) AGENCY...-79: Enhanced Flight Vision Systems/Synthetic Vision Systems (EFVS/SVS). SUMMARY: The FAA is issuing...: Enhanced Flight...

  12. 77 FR 2342 - Seventeenth Meeting: RTCA Special Committee 213, Enhanced Flight Vision/Synthetic Vision Systems...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-01-17

    ... Committee 213, Enhanced Flight Vision/Synthetic Vision Systems (EFVS/SVS) AGENCY: Federal Aviation..., Enhanced Flight Vision/Synthetic Vision Systems (EFVS/SVS). SUMMARY: The FAA is issuing this notice to advise the public of the seventeenth meeting of RTCA Special Committee 213, Enhanced Flight Vision...

  13. MMW radar enhanced vision systems: the Helicopter Autonomous Landing System (HALS) and Radar-Enhanced Vision System (REVS) are rotary and fixed wing enhanced flight vision systems that enable safe flight operations in degraded visual environments

    NASA Astrophysics Data System (ADS)

    Cross, Jack; Schneider, John; Cariani, Pete

    2013-05-01

    Sierra Nevada Corporation (SNC) has developed rotary and fixed wing millimeter wave radar enhanced vision systems. The Helicopter Autonomous Landing System (HALS) is a rotary-wing enhanced vision system that enables multi-ship landing, takeoff, and enroute flight in Degraded Visual Environments (DVE). HALS has been successfully flight tested in a variety of scenarios, from brown-out DVE landings, to enroute flight over mountainous terrain, to wire/cable detection during low-level flight. The Radar Enhanced Vision System (REVS) is a fixed-wing Enhanced Flight Vision System (EFVS) undergoing prototype development testing. Both systems are based on a fast-scanning, three-dimensional 94 GHz radar that produces real-time terrain and obstacle imagery. The radar imagery is fused with synthetic imagery of the surrounding terrain to form a long-range, wide field-of-view display. A symbology overlay is added to provide aircraft state information and, for HALS, approach and landing command guidance cuing. The combination of see-through imagery and symbology provides the key information a pilot needs to perform safe flight operations in DVE conditions. This paper discusses the HALS and REVS systems and technology, presents imagery, and summarizes the recent flight test results.
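
    The radar/synthetic image fusion and symbology overlay described in this record can be illustrated with a minimal sketch. The blend weight, assumed vertical field of view, frame sizes, and function names below are illustrative assumptions, not values or interfaces from the paper.

    ```python
    import numpy as np

    def fuse_efvs_frame(radar_img, synthetic_img, radar_weight=0.6):
        """Blend a radar intensity frame with a co-registered synthetic-terrain frame.

        Inputs are assumed to be 2-D arrays scaled to [0, 1]; the fixed weighting
        is an illustrative choice standing in for the paper's fusion scheme.
        """
        radar = np.clip(radar_img, 0.0, 1.0)
        synthetic = np.clip(synthetic_img, 0.0, 1.0)
        return radar_weight * radar + (1.0 - radar_weight) * synthetic

    def overlay_symbology(frame, pitch_deg, roll_deg, vertical_fov_deg=60.0):
        """Stamp a crude horizon line into the fused frame as stand-in symbology."""
        out = frame.copy()
        rows, cols = out.shape
        center_row = rows / 2.0 - pitch_deg * rows / vertical_fov_deg
        for c in range(cols):
            r = int(center_row + np.tan(np.radians(roll_deg)) * (c - cols / 2.0))
            if 0 <= r < rows:
                out[r, c] = 1.0
        return out

    # Example: fuse two synthetic test frames and add a horizon line for 2 deg pitch, 5 deg roll.
    radar_frame = np.random.rand(240, 320)
    terrain_frame = np.random.rand(240, 320)
    display = overlay_symbology(fuse_efvs_frame(radar_frame, terrain_frame), 2.0, 5.0)
    ```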

  14. Flight Test Evaluation of Situation Awareness Benefits of Integrated Synthetic Vision System Technology for Commercial Aircraft

    NASA Technical Reports Server (NTRS)

    Prinzel, Lawrence J., III; Kramer, Lynda J.; Arthur, Jarvis J., III

    2005-01-01

    Research was conducted onboard a Gulfstream G-V aircraft to evaluate integrated Synthetic Vision System concepts during flight tests over a 6-week period at the Wallops Flight Facility and Reno/Tahoe International Airport. The NASA Synthetic Vision System incorporates database integrity monitoring, runway incursion prevention alerting, surface maps, enhanced vision sensors, and advanced pathway guidance and synthetic terrain presentation. The paper details the goals and objectives of the flight test with a focus on the situation awareness benefits of integrating synthetic vision system enabling technologies for commercial aircraft.

  15. Vision based flight procedure stereo display system

    NASA Astrophysics Data System (ADS)

    Shen, Xiaoyun; Wan, Di; Ma, Lan; He, Yuncheng

    2008-03-01

    A virtual reality flight procedure vision system is introduced in this paper. The digital flight map database is built from a Geographic Information System (GIS) and high-definition satellite remote sensing photos, and the flight approach area database is built with a computer 3D modeling system and GIS. The area texture is generated from the remote sensing photos and aerial photographs at various levels of detail. Flight navigation information is linked to the database according to the flight approach procedure, so the approach area scene can be displayed dynamically as the designed procedure is flown. The approach area images are rendered in two channels, one for the left eye and one for the right eye; through a polarized stereoscopic projection system, pilots and aircrew obtain a vivid 3D view of the destination approach area. Used during preflight preparation, the system gives aircrew richer information about the destination approach area, improving the aviator's confidence before the mission and, in turn, flight safety. The system is also useful for validating visual flight procedure designs and aids flight procedure design.

  16. Flight Test Comparison Between Enhanced Vision (FLIR) and Synthetic Vision Systems

    NASA Technical Reports Server (NTRS)

    Arthur, Jarvis J., III; Kramer, Lynda J.; Bailey, Randall E.

    2005-01-01

    Limited visibility and reduced situational awareness have been cited as predominant causal factors for both Controlled Flight Into Terrain (CFIT) and runway incursion accidents. NASA's Synthetic Vision Systems (SVS) project is developing practical application technologies with the goal of eliminating low visibility conditions as a causal factor to civil aircraft accidents while replicating the operational benefits of clear day flight operations, regardless of the actual outside visibility condition. A major thrust of the SVS project involves the development/demonstration of affordable, certifiable display configurations that provide intuitive out-the-window terrain and obstacle information with advanced pathway guidance. A flight test evaluation was conducted in the summer of 2004 by NASA Langley Research Center under NASA's Aviation Safety and Security, Synthetic Vision System - Commercial and Business program. A Gulfstream G-V aircraft, modified and operated under NASA contract by the Gulfstream Aerospace Corporation, was flown over a 3-week period at the Reno/Tahoe International Airport and an additional 3-week period at the NASA Wallops Flight Facility to evaluate integrated Synthetic Vision System concepts. Flight testing was conducted to evaluate the performance, usability, and acceptance of an integrated synthetic vision concept which included advanced Synthetic Vision display concepts for a transport aircraft flight deck, a Runway Incursion Prevention System, an Enhanced Vision System (EVS), and real-time Database Integrity Monitoring Equipment. This paper focuses on comparing qualitative and subjective results between EVS and SVS display concepts.

  17. Going Below Minimums: The Efficacy of Display Enhanced/Synthetic Vision Fusion for Go-Around Decisions during Non-Normal Operations

    NASA Technical Reports Server (NTRS)

    Prinzel, Lawrence J., III; Kramer, Lynda J.; Bailey, Randall E.

    2007-01-01

    The use of enhanced vision systems in civil aircraft is projected to increase rapidly as the Federal Aviation Administration recently changed the aircraft operating rules under Part 91, revising the flight visibility requirements for conducting approach and landing operations. Operators conducting straight-in instrument approach procedures may now operate below the published approach minimums when using an approved enhanced flight vision system that shows the required visual references on the pilot's Head-Up Display. An experiment was conducted to evaluate the complementary use of synthetic vision systems and enhanced vision system technologies, focusing on new techniques for integration and/or fusion of synthetic and enhanced vision technologies and crew resource management while operating under these newly adopted rules. Experimental results specific to flight crew response to non-normal events using the fused synthetic/enhanced vision system are presented.

  18. Visual Advantage of Enhanced Flight Vision System During NextGen Flight Test Evaluation

    NASA Technical Reports Server (NTRS)

    Kramer, Lynda J.; Harrison, Stephanie J.; Bailey, Randall E.; Shelton, Kevin J.; Ellis, Kyle K.

    2014-01-01

    Synthetic Vision Systems and Enhanced Flight Vision System (SVS/EFVS) technologies have the potential to provide additional margins of safety for aircrew performance and enable operational improvements for low visibility operations in the terminal area environment. Simulation and flight tests were jointly sponsored by NASA's Aviation Safety Program, Vehicle Systems Safety Technology project and the Federal Aviation Administration (FAA) to evaluate potential safety and operational benefits of SVS/EFVS technologies in low visibility Next Generation Air Transportation System (NextGen) operations. The flight tests were conducted by a team of Honeywell, Gulfstream Aerospace Corporation and NASA personnel with the goal of obtaining pilot-in-the-loop test data for flight validation, verification, and demonstration of selected SVS/EFVS operational and system-level performance capabilities. Nine test flights were flown in Gulfstream's G450 flight test aircraft outfitted with the SVS/EFVS technologies under low visibility instrument meteorological conditions. Evaluation pilots flew 108 approaches in low visibility weather conditions (600 feet to 3600 feet reported visibility) under different obscurants (mist, fog, drizzle fog, frozen fog) and sky cover (broken, overcast). Flight test videos were evaluated at three different altitudes (decision altitude, 100 feet radar altitude, and touchdown) to determine the visual advantage afforded to the pilot using the EFVS/Forward-Looking InfraRed (FLIR) imagery compared to natural vision. Results indicate the EFVS provided a visual advantage of two to three times over that of the out-the-window (OTW) view. The EFVS allowed pilots to view the runway environment, specifically runway lights, before they would be able to see them OTW with natural vision.
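
    The "visual advantage" figure quoted in this record is a ratio of detection ranges. A worked illustration with hypothetical numbers (not data from the flight test):

    ```python
    # Hypothetical ranges (feet) at which runway lights first became discernible.
    efvs_detection_range_ft = 5200.0   # via EFVS/FLIR imagery on the HUD (illustrative)
    otw_detection_range_ft = 1900.0    # via natural out-the-window vision (illustrative)

    visual_advantage = efvs_detection_range_ft / otw_detection_range_ft
    print(f"Visual advantage: {visual_advantage:.1f}x")   # ~2.7x, inside the 2-3x band reported
    ```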

  19. 77 FR 36331 - Nineteenth Meeting: RTCA Special Committee 213, Enhanced Flight Vision Systems/Synthetic Vision...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-06-18

    ... Document--Draft DO-XXX, Minimum Aviation Performance Standards (MASPS) for an Enhanced Flight Vision System... Discussion (9:00 a.m.-5:00 p.m.) Provide Comment Resolution of Document--Draft DO-XXX, Minimum Aviation.../Approve FRAC Draft for PMC Consideration--Draft DO-XXX, Minimum Aviation Performance Standards (MASPS...

  20. Flight Deck-Based Delegated Separation: Evaluation of an On-Board Interval Management System with Synthetic and Enhanced Vision Technology

    NASA Technical Reports Server (NTRS)

    Prinzel, Lawrence J., III; Shelton, Kevin J.; Kramer, Lynda J.; Arthur, Jarvis J.; Bailey, Randall E.; Norman, Robert M.; Ellis, Kyle K. E.; Barmore, Bryan E.

    2011-01-01

    An emerging Next Generation Air Transportation System concept - Equivalent Visual Operations (EVO) - can be achieved using an electronic means to provide sufficient visibility of the external world and other required flight references on flight deck displays that enable the safety, operational tempos, and visual flight rules (VFR)-like procedures for all weather conditions. Synthetic and enhanced flight vision system technologies are critical enabling technologies to EVO. Current research evaluated concepts for flight deck-based interval management (FIM) operations, integrated with Synthetic Vision and Enhanced Vision flight-deck displays and technologies. One concept involves delegated flight deck-based separation, in which the flight crews were paired with another aircraft and responsible for spacing and maintaining separation from the paired aircraft, termed "equivalent visual separation." The operation required the flight crews to acquire and maintain an "equivalent visual contact" as well as to conduct manual landings in low-visibility conditions. The paper describes results that evaluated the concept of EVO delegated separation, including an off-nominal scenario in which the lead aircraft was not able to conform to the assigned spacing, resulting in a loss of separation.

  1. Flight Testing an Integrated Synthetic Vision System

    NASA Technical Reports Server (NTRS)

    Kramer, Lynda J.; Arthur, Jarvis J., III; Bailey, Randall E.; Prinzel, Lawrence J., III

    2005-01-01

    NASA's Synthetic Vision Systems (SVS) project is developing technologies with practical applications to eliminate low visibility conditions as a causal factor to civil aircraft accidents while replicating the operational benefits of clear day flight operations, regardless of the actual outside visibility condition. A major thrust of the SVS project involves the development/demonstration of affordable, certifiable display configurations that provide intuitive out-the-window terrain and obstacle information with advanced pathway guidance for transport aircraft. The SVS concept being developed at NASA encompasses the integration of tactical and strategic Synthetic Vision Display Concepts (SVDC) with Runway Incursion Prevention System (RIPS) alerting and display concepts, real-time terrain database integrity monitoring equipment (DIME), and Enhanced Vision Systems (EVS) and/or improved Weather Radar for real-time object detection and database integrity monitoring. A flight test evaluation was jointly conducted (in July and August 2004) by NASA Langley Research Center and an industry partner team under NASA's Aviation Safety and Security, Synthetic Vision System project. A Gulfstream GV aircraft was flown over a 3-week period in the Reno/Tahoe International Airport (NV) local area and an additional 3-week period in the Wallops Flight Facility (VA) local area to evaluate integrated Synthetic Vision System concepts. The enabling technologies (RIPS, EVS and DIME) were integrated into the larger SVS concept design. This paper presents experimental methods and the high level results of this flight test.

  2. 77 FR 21861 - Special Conditions: Boeing, Model 777F; Enhanced Flight Vision System

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-04-12

    ... System AGENCY: Federal Aviation Administration (FAA), DOT. ACTION: Final special conditions; request for... with an advanced, enhanced flight vision system (EFVS). The EFVS consists of a head-up display (HUD) system modified to display forward-looking infrared (FLIR) imagery. The applicable airworthiness...

  3. Integrated navigation, flight guidance, and synthetic vision system for low-level flight

    NASA Astrophysics Data System (ADS)

    Mehler, Felix E.

    2000-06-01

    Future military transport aircraft will require a new approach with respect to the avionics suite to fulfill an ever-changing variety of missions. The most demanding phases of these missions are typically the low level flight segments, including tactical terrain following/avoidance, payload drop and/or on-board autonomous landing at forward operating strips without ground-based infrastructure. As a consequence, individual components and systems must become more integrated to offer a higher degree of reliability, integrity, flexibility and autonomy over existing systems while reducing crew workload. The integration of digital terrain data not only introduces synthetic vision into the cockpit, but also enhances navigation and guidance capabilities. At DaimlerChrysler Aerospace AG Military Aircraft Division (Dasa-M), an integrated navigation, flight guidance and synthetic vision system, based on digital terrain data, has been developed to fulfill the requirements of the Future Transport Aircraft (FTA). The fusion of three independent navigation sensors provides a more reliable and precise solution to both the 4D-flight guidance and the display components, which comprise a Head-up and a Head-down Display with synthetic vision. This paper will present the system, its integration into the DLR's VFW 614 Advanced Technology Testing Aircraft System (ATTAS) and the results of the flight-test campaign.

  4. Assessing Dual Sensor Enhanced Flight Vision Systems to Enable Equivalent Visual Operations

    NASA Technical Reports Server (NTRS)

    Kramer, Lynda J.; Etherington, Timothy J.; Severance, Kurt; Bailey, Randall E.; Williams, Steven P.; Harrison, Stephanie J.

    2016-01-01

    Flight deck-based vision system technologies, such as Synthetic Vision (SV) and Enhanced Flight Vision Systems (EFVS), may serve as revolutionary crew/vehicle interface enabling technologies to meet the challenges of the Next Generation Air Transportation System Equivalent Visual Operations (EVO) concept - that is, the ability to achieve the safety of current-day Visual Flight Rules (VFR) operations and maintain the operational tempos of VFR irrespective of the weather and visibility conditions. One significant challenge lies in the definition of required equipage on the aircraft and on the airport to enable the EVO concept objective. A motion-base simulator experiment was conducted to evaluate the operational feasibility, pilot workload and pilot acceptability of conducting straight-in instrument approaches with published vertical guidance to landing, touchdown, and rollout to a safe taxi speed in visibility as low as 300 ft runway visual range by use of onboard vision system technologies on a Head-Up Display (HUD) without need or reliance on natural vision. Twelve crews evaluated two methods of combining dual sensor (millimeter wave radar and forward looking infrared) EFVS imagery on pilot-flying and pilot-monitoring HUDs as they made approaches to runways with and without touchdown zone and centerline lights. In addition, the impact of adding SV to the dual sensor EFVS imagery on crew flight performance, workload, and situation awareness during extremely low visibility approach and landing operations was assessed. Results indicate that all EFVS concepts flown resulted in excellent approach path tracking and touchdown performance without any workload penalty. Adding SV imagery to EFVS concepts provided situation awareness improvements but no discernible improvements in flight path maintenance.

  5. Flexible Wing Base Micro Aerial Vehicles: Vision-Guided Flight Stability and Autonomy for Micro Air Vehicles

    NASA Technical Reports Server (NTRS)

    Ettinger, Scott M.; Nechyba, Michael C.; Ifju, Peter G.; Wazak, Martin

    2002-01-01

    Substantial progress has been made recently towards designing, building, and test-flying remotely piloted Micro Air Vehicles (MAVs). We seek to complement this progress in overcoming the aerodynamic obstacles to flight at very small scales with a vision-based stability and autonomy system. The developed system is based on a robust horizon detection algorithm, which we discuss in greater detail in a companion paper. In this paper, we first motivate the use of computer vision for MAV autonomy, arguing that given current sensor technology, vision may be the only practical approach to the problem. We then briefly review our statistical vision-based horizon detection algorithm, which has been demonstrated at 30 Hz with over 99.9% correct horizon identification. Next, we develop robust schemes for the detection of extreme MAV attitudes, where no horizon is visible, and for the detection of horizon estimation errors, due to external factors such as video transmission noise. Finally, we discuss our feedback controller for self-stabilized flight, and report results on vision-based autonomous flights of duration exceeding ten minutes.

  6. Present and future of vision systems technologies in commercial flight operations

    NASA Astrophysics Data System (ADS)

    Ward, Jim

    2016-05-01

    The development of systems to enable pilots of all types of aircraft to see through fog, clouds, and sandstorms and land in low visibility has been widely discussed and researched across aviation. For military applications, the goal has been to operate in a Degraded Visual Environment (DVE), using sensors to enable flight crews to see and operate without concern for weather that limits human visibility. These military DVE goals are mainly oriented to the off-field landing environment. For commercial aviation, the Federal Aviation Administration (FAA) implemented operational regulations in 2004 that allow the flight crew to see the runway environment using an Enhanced Flight Vision System (EFVS) and continue the approach below the normal landing decision height. The FAA is expanding the current use and economic benefit of EFVS technology and will soon permit landing without any natural vision using real-time weather-penetrating sensors. The operational goals of both of these efforts, DVE and EFVS, have been the stimulus for development of new sensors and vision displays to create the modern flight deck.

  7. Using Vision System Technologies for Offset Approaches in Low Visibility Operations

    NASA Technical Reports Server (NTRS)

    Kramer, Lynda J.; Bailey, Randall E.; Ellis, Kyle K.

    2015-01-01

    Flight deck-based vision systems, such as Synthetic Vision Systems (SVS) and Enhanced Flight Vision Systems (EFVS), have the potential to provide additional margins of safety for aircrew performance and enable the implementation of operational improvements for low visibility surface, arrival, and departure operations in the terminal environment with equivalent efficiency to visual operations. Twelve air transport-rated crews participated in a motion-base simulation experiment to evaluate the use of SVS/EFVS in Next Generation Air Transportation System low visibility approach and landing operations at Chicago O'Hare airport. Three monochromatic, collimated head-up display (HUD) concepts (conventional HUD, SVS HUD, and EFVS HUD) and three instrument approach types (straight-in, 3-degree offset, 15-degree offset) were experimentally varied to test the efficacy of the SVS/EFVS HUD concepts for offset approach operations. The findings suggest that making offset approaches in low visibility conditions with an EFVS HUD or SVS HUD appears feasible. Regardless of offset approach angle or HUD concept being flown, all approaches had comparable ILS tracking during the instrument segment and were within the lateral confines of the runway with acceptable sink rates during the visual segment of the approach. Keywords: Enhanced Flight Vision Systems; Synthetic Vision Systems; Head-up Display; NextGen

  8. Assessing Impact of Dual Sensor Enhanced Flight Vision Systems on Departure Performance

    NASA Technical Reports Server (NTRS)

    Kramer, Lynda J.; Etherington, Timothy J.; Severance, Kurt; Bailey, Randall E.

    2016-01-01

    Synthetic Vision (SV) and Enhanced Flight Vision Systems (EFVS) may serve as game-changing technologies to meet the challenges of the Next Generation Air Transportation System and the envisioned Equivalent Visual Operations (EVO) concept - that is, the ability to achieve the safety and operational tempos of current-day Visual Flight Rules operations irrespective of the weather and visibility conditions. One significant obstacle lies in the definition of required equipage on the aircraft and on the airport to enable the EVO concept objective. A motion-base simulator experiment was conducted to evaluate the operational feasibility and pilot workload of conducting departures and approaches on runways without centerline lighting in visibility as low as 300 feet runway visual range (RVR) by use of onboard vision system technologies on a Head-Up Display (HUD) without need or reliance on natural vision. Twelve crews evaluated two methods of combining dual sensor (millimeter wave radar and forward looking infrared) EFVS imagery on pilot-flying and pilot-monitoring HUDs. In addition, the impact of adding SV to the dual sensor EFVS imagery on crew flight performance and workload was assessed. Using EFVS concepts during 300 RVR terminal operations on runways without centerline lighting appears feasible, as all EFVS concepts provided departure and landing rollout performance equivalent to (or better than) that of operations flown with a conventional HUD to runways having centerline lighting, without any workload penalty. Adding SV imagery to EFVS concepts provided situation awareness improvements but no discernible improvements in flight path maintenance.

  9. Square tracking sensor for autonomous helicopter hover stabilization

    NASA Astrophysics Data System (ADS)

    Oertel, Carl-Henrik

    1995-06-01

    Sensors for synthetic vision are needed to extend the mission profiles of helicopters. A special task for various applications is the autonomous position hold of a helicopter above a ground fixed or moving target. As a proof of concept for a general synthetic vision solution, a restricted machine vision system, which is capable of locating and tracking a special target, was developed by the Institute of Flight Mechanics of Deutsche Forschungsanstalt für Luft- und Raumfahrt e.V. (i.e., German Aerospace Research Establishment). This sensor, which is specialized to detect and track a square, was integrated in the fly-by-wire helicopter ATTHeS (i.e., Advanced Technology Testing Helicopter System). An existing model-following controller for the forward flight condition was adapted for the hover and low speed requirements of the flight vehicle. The special target, a black square with a length of one meter, was mounted on top of a car. Flight tests demonstrated the automatic stabilization of the helicopter above the moving car by synthetic vision.
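
    The square-tracking idea in this record can be sketched with a crude centroid tracker; a real detector would verify the square shape and estimate range, which this fragment does not. The threshold and sign conventions are illustrative assumptions.

    ```python
    import numpy as np

    def track_dark_target(gray_frame, intensity_threshold=0.25):
        """Locate a dark target (e.g., a black square on a vehicle roof) in a
        normalized grayscale frame and return its pixel offset from image center."""
        rows, cols = gray_frame.shape
        mask = gray_frame < intensity_threshold
        if not mask.any():
            return None                      # target not in view
        ys, xs = np.nonzero(mask)
        offset_x = xs.mean() - cols / 2.0    # positive: target right of center
        offset_y = ys.mean() - rows / 2.0    # positive: target below center
        return offset_x, offset_y

    # The offsets would feed the hover-hold controller as position-error inputs.
    ```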

  10. Synthetic and Enhanced Vision Systems for NextGen (SEVS) Simulation and Flight Test Performance Evaluation

    NASA Technical Reports Server (NTRS)

    Shelton, Kevin J.; Kramer, Lynda J.; Ellis, Kyle K.; Rehfeld, Sherri A.

    2012-01-01

    The Synthetic and Enhanced Vision Systems for NextGen (SEVS) simulation and flight tests are jointly sponsored by NASA's Aviation Safety Program, Vehicle Systems Safety Technology project and the Federal Aviation Administration (FAA). The flight tests were conducted by a team of Honeywell, Gulfstream Aerospace Corporation and NASA personnel with the goal of obtaining pilot-in-the-loop test data for flight validation, verification, and demonstration of selected SEVS operational and system-level performance capabilities. Nine test flights (38 flight hours) were conducted over the summer and fall of 2011. The evaluations were flown in Gulfstream's G450 flight test aircraft outfitted with the SEVS technology under very low visibility instrument meteorological conditions. Evaluation pilots flew 108 approaches in low visibility weather conditions (600 ft to 2400 ft visibility) into various airports from Louisiana to Maine. In-situ flight performance and subjective workload and acceptability data were collected in collaboration with ground simulation studies at LaRC's Research Flight Deck simulator.

  11. Effect of microgravity on several visual functions during STS shuttle missions

    NASA Technical Reports Server (NTRS)

    Oneal, Melvin R.; Task, H. Lee; Genco, Louis V.

    1992-01-01

    Changes in the acuity of astronaut vision during flight are discussed. Parameters such as critical flicker vision, stereopsis to 10 seconds of arc, visual acuity in small steps to 20/7.7, cyclophoria, lateral and vertical phoria and retinal rivalry were tested using a visual function tester. Twenty-three Space Transportation System (STS) astronauts participated in the experiments. Their vision was assessed twice before launch and after landing, and three to four times while on-orbit and landing. No significant differences during space flight were observed for any of the visual parameters tested. In some cases, slight changes in acuity and stereopsis were observed with a subsequent return to normal vision after flight.

  12. Drone-Augmented Human Vision: Exocentric Control for Drones Exploring Hidden Areas.

    PubMed

    Erat, Okan; Isop, Werner Alexander; Kalkofen, Denis; Schmalstieg, Dieter

    2018-04-01

    Drones allow exploring dangerous or impassable areas safely from a distant point of view. However, flight control from an egocentric view in narrow or constrained environments can be challenging. Arguably, an exocentric view would afford a better overview and, thus, more intuitive flight control of the drone. Unfortunately, such an exocentric view is unavailable when exploring indoor environments. This paper investigates the potential of drone-augmented human vision, i.e., of exploring the environment and controlling the drone indirectly from an exocentric viewpoint. If used with a see-through display, this approach can simulate X-ray vision to provide a natural view into an otherwise occluded environment. The user's view is synthesized from a three-dimensional reconstruction of the indoor environment using image-based rendering. This user interface is designed to reduce the cognitive load of the drone's flight control. The user can concentrate on the exploration of the inaccessible space, while flight control is largely delegated to the drone's autopilot system. We assess our system with a first experiment showing how drone-augmented human vision supports spatial understanding and improves natural interaction with the drone.

  13. Flexible Wing Base Micro Aerial Vehicles: Towards Flight Autonomy: Vision-Based Horizon Detection for Micro Air Vehicles

    NASA Technical Reports Server (NTRS)

    Nechyba, Michael C.; Ettinger, Scott M.; Ifju, Peter G.; Wazak, Martin

    2002-01-01

    Recently, substantial progress has been made towards designing, building, and test-flying remotely piloted Micro Air Vehicles (MAVs). This progress in overcoming the aerodynamic obstacles to flight at very small scales has, unfortunately, not been matched by similar progress in autonomous MAV flight. Thus, we propose a robust, vision-based horizon detection algorithm as the first step towards autonomous MAVs. In this paper, we first motivate the use of computer vision for the horizon detection task by examining the flight of birds (biological MAVs) and considering other practical factors. We then describe our vision-based horizon detection algorithm, which has been demonstrated at 30 Hz with over 99.9% correct horizon identification, over terrain that includes roads, buildings large and small, meadows, wooded areas, and a lake. We conclude with some sample horizon detection results and preview a companion paper, where the work discussed here forms the core of a complete autonomous flight stability system.
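
    A heavily simplified, level-flight version of the horizon search described in this record is sketched below: score each candidate row by how well it separates the mean color above it from the mean color below it. The real algorithm also searches over bank angle and uses a statistical separation criterion; the row-only search here is an assumption made for brevity.

    ```python
    import numpy as np

    def detect_horizon_row(rgb_frame):
        """Return the row that best separates 'sky' from 'ground' in an (H, W, 3) frame."""
        rows = rgb_frame.shape[0]
        best_row, best_score = None, -1.0
        for r in range(10, rows - 10):                       # skip extreme rows
            sky = rgb_frame[:r].reshape(-1, 3).mean(axis=0)
            ground = rgb_frame[r:].reshape(-1, 3).mean(axis=0)
            score = float(np.linalg.norm(sky - ground))
            if score > best_score:
                best_row, best_score = r, score
        return best_row

    # Example: synthetic 100x80 frame, blue "sky" above row 42, brown "ground" below.
    frame = np.zeros((100, 80, 3))
    frame[:42] = [0.4, 0.6, 1.0]
    frame[42:] = [0.4, 0.3, 0.1]
    print(detect_horizon_row(frame))                         # reports a row at or near 42
    ```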

  14. Technical Challenges in the Development of a NASA Synthetic Vision System Concept

    NASA Technical Reports Server (NTRS)

    Bailey, Randall E.; Parrish, Russell V.; Kramer, Lynda J.; Harrah, Steve; Arthur, J. J., III

    2002-01-01

    Within NASA's Aviation Safety Program, the Synthetic Vision Systems Project is developing display system concepts to improve pilot terrain/situation awareness by providing a perspective synthetic view of the outside world through an on-board database driven by precise aircraft positioning information updated via Global Positioning System-based data. This work is aimed at eliminating visibility-induced errors and low visibility conditions as a causal factor to civil aircraft accidents, as well as replicating the operational benefits of clear day flight operations regardless of the actual outside visibility condition. Synthetic vision research and development activities at NASA Langley Research Center are focused around a series of ground simulation and flight test experiments designed to evaluate, investigate, and assess the technology which can lead to operational and certified synthetic vision systems. The technical challenges that have been encountered and that are anticipated in this research and development activity are summarized.

  15. Augmentation of Cognition and Perception Through Advanced Synthetic Vision Technology

    NASA Technical Reports Server (NTRS)

    Prinzel, Lawrence J., III; Kramer, Lynda J.; Bailey, Randall E.; Arthur, Jarvis J.; Williams, Steve P.; McNabb, Jennifer

    2005-01-01

    Synthetic Vision System technology augments reality and creates a virtual visual meteorological condition that extends a pilot's cognitive and perceptual capabilities during flight operations when outside visibility is restricted. The paper describes the NASA Synthetic Vision System for commercial aviation with an emphasis on how the technology achieves Augmented Cognition objectives.

  16. Design and evaluation of an autonomous, obstacle avoiding, flight control system using visual sensors

    NASA Astrophysics Data System (ADS)

    Crawford, Bobby Grant

    In an effort to field smaller and cheaper Uninhabited Aerial Vehicles (UAVs), the Army has expressed an interest in the ability of the vehicle to autonomously detect and avoid obstacles. Current systems are not suitable for small aircraft. NASA Langley Research Center has developed a vision sensing system that uses small semiconductor cameras. The feasibility of using this sensor for the purpose of autonomous obstacle avoidance by a UAV is the focus of the research presented in this document. The vision sensor characteristics are modeled and incorporated into guidance and control algorithms designed to generate flight commands based on obstacle information received from the sensor. The system is evaluated by simulating the response to these flight commands using a six degree-of-freedom, non-linear simulation of a small, fixed wing UAV. The simulation is written using the MATLAB application and runs on a PC. Simulations were conducted to test the longitudinal and lateral capabilities of the flight control for a range of airspeeds, camera characteristics, and wind speeds. Results indicate that the control system is suitable for obstacle avoiding flight control using the simulated vision system. In addition, a method for designing and evaluating the performance of such a system has been developed that allows the user to easily change component characteristics and evaluate new systems through simulation.
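
    The guidance-and-control idea in this record (turning sensor obstacle reports into flight commands) can be illustrated with a toy command law. The study itself used a MATLAB six degree-of-freedom simulation; the Python fragment, gains, and limits below are illustrative assumptions only.

    ```python
    def lateral_avoidance_command(obstacle_bearing_deg, obstacle_range_m,
                                  clear_range_m=500.0, max_bank_deg=30.0, gain=2.0):
        """Map a vision-sensor obstacle report to a bank-angle command.

        Steers away from the obstacle, harder as it gets closer; all numbers
        are illustrative, not values from the study.
        """
        if obstacle_range_m >= clear_range_m:
            return 0.0                                   # obstacle far away: wings level
        urgency = 1.0 - obstacle_range_m / clear_range_m
        direction = 1.0 if obstacle_bearing_deg <= 0.0 else -1.0   # turn away from it
        command = direction * gain * urgency * max_bank_deg
        return max(-max_bank_deg, min(max_bank_deg, command))

    # Example: obstacle 10 degrees left at 200 m -> saturated right-bank command (30 deg).
    print(lateral_avoidance_command(-10.0, 200.0))
    ```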

  17. Improving the Flight Path Marker Symbol on Rotorcraft Synthetic Vision Displays

    NASA Technical Reports Server (NTRS)

    Szoboszlay, Zoltan P.; Hardy, Gordon H.; Welsh, Terence M.

    2004-01-01

    Two potential improvements to the flight path marker symbol were evaluated on a panel-mounted, synthetic vision, primary flight display in a rotorcraft simulation. One concept took advantage of the fact that synthetic vision systems have terrain height information available ahead of the aircraft. For this first concept, predicted altitude and ground track information was added to the flight path marker. In the second concept, multiple copies of the flight path marker were displayed at 3, 4, and 5 second prediction times as compared to a single prediction time of 3 seconds. Objective and subjective data were collected for eight rotorcraft pilots. The first concept produced significant improvements in pilot attitude control, ground track control, workload ratings, and preference ratings. The second concept did not produce significant differences in the objective or subjective measures.

  18. Test of Lander Vision System for Mars 2020

    NASA Image and Video Library

    2016-10-04

    A prototype of the Lander Vision System for NASA Mars 2020 mission was tested in this Dec. 9, 2014, flight of a Masten Space Systems Xombie vehicle at Mojave Air and Space Port in California. http://photojournal.jpl.nasa.gov/catalog/PIA20848

  19. Helicopter flights with night-vision goggles: Human factors aspects

    NASA Technical Reports Server (NTRS)

    Brickner, Michael S.

    1989-01-01

    Night-vision goggles (NVGs) and, in particular, the advanced, helmet-mounted Aviator's Night-Vision-Imaging System (ANVIS) allow helicopter pilots to perform low-level flight at night. It consists of light intensifier tubes which amplify low-intensity ambient illumination (star and moon light) and an optical system which together produce a bright image of the scene. However, these NVGs do not turn night into day, and, while they may often provide significant advantages over unaided night flight, they may also result in visual fatigue, high workload, and safety hazards. These problems reflect both system limitations and human-factors issues. A brief description of the technical characteristics of NVGs and of human night-vision capabilities is followed by a description and analysis of specific perceptual problems which occur with the use of NVGs in flight. Some of the issues addressed include: limitations imposed by a restricted field of view; problems related to binocular rivalry; the consequences of inappropriate focusing of the eye; the effects of ambient illumination levels and of various types of terrain on image quality; difficulties in distance and slope estimation; effects of dazzling; and visual fatigue and superimposed symbology. These issues are described and analyzed in terms of their possible consequences on helicopter pilot performance. The additional influence of individual differences among pilots is emphasized. Thermal imaging systems (forward looking infrared (FLIR)) are described briefly and compared to light intensifier systems (NVGs). Many of the phenomena which are described are not readily understood. More research is required to better understand the human-factors problems created by the use of NVGs and other night-vision aids, to enhance system design, and to improve training methods and simulation techniques.

  20. Using Vision System Technologies to Enable Operational Improvements for Low Visibility Approach and Landing Operations

    NASA Technical Reports Server (NTRS)

    Kramer, Lynda J.; Ellis, Kyle K. E.; Bailey, Randall E.; Williams, Steven P.; Severance, Kurt; Le Vie, Lisa R.; Comstock, James R.

    2014-01-01

    Flight deck-based vision systems, such as Synthetic and Enhanced Vision System (SEVS) technologies, have the potential to provide additional margins of safety for aircrew performance and enable the implementation of operational improvements for low visibility surface, arrival, and departure operations in the terminal environment with equivalent efficiency to visual operations. To achieve this potential, research is required for effective technology development and implementation based upon human factors design and regulatory guidance. This research supports the introduction and use of Synthetic Vision Systems and Enhanced Flight Vision Systems (SVS/EFVS) as advanced cockpit vision technologies in Next Generation Air Transportation System (NextGen) operations. Twelve air transport-rated crews participated in a motion-base simulation experiment to evaluate the use of SVS/EFVS in NextGen low visibility approach and landing operations. Three monochromatic, collimated head-up display (HUD) concepts (conventional HUD, SVS HUD, and EFVS HUD) and two color head-down primary flight display (PFD) concepts (conventional PFD, SVS PFD) were evaluated in a simulated NextGen Chicago O'Hare terminal environment. Additionally, the instrument approach type (no offset, 3 degree offset, 15 degree offset) was experimentally varied to test the efficacy of the HUD concepts for offset approach operations. The data showed that touchdown landing performance was excellent regardless of SEVS concept or type of offset instrument approach being flown. Subjective assessments of mental workload and situation awareness indicated that making offset approaches in low visibility conditions with an EFVS HUD or SVS HUD may be feasible.

  1. Vision Aspects of Space Flight

    NASA Technical Reports Server (NTRS)

    Manuel, Keith; Billica, Roger (Technical Monitor)

    2000-01-01

    Vision, being one of our most important senses, is critically important in the unique working environment of space flight. Critical evaluation of the astronauts' visual system begins with pre-selection examinations resulting in an average of 65% of all medical disqualifications caused by ocular findings. With an average age of 42, approximately 60% of the astronaut corps requires vision correction. Further demands of the unique training and working environment of microgravity, variable lighting from very poor to extreme brightness of sunlight and exposure to extremes of electromagnetic energy result in unique eyewear and contact lens applications. This presentation will describe some of those unique eyewear and contact lens applications used in space flight and training environments. Additionally, ocular findings from 26 shuttle and 5 MIR mission post-flight examinations will be presented.

  2. Insect-Based Vision for Autonomous Vehicles: A Feasibility Study

    NASA Technical Reports Server (NTRS)

    Srinivasan, Mandyam V.

    1999-01-01

    The aims of the project were to use a high-speed digital video camera to pursue two questions: (1) To explore the influence of temporal imaging constraints on the performance of vision systems for autonomous mobile robots; (2) To study the fine structure of insect flight trajectories in order to better understand the characteristics of flight control, orientation and navigation.

  3. Multi-Dimensionality of Synthetic Vision Cockpit Displays: Prevention of Controlled-Flight-Into-Terrain

    NASA Technical Reports Server (NTRS)

    Prinzel, Lawrence J., III; Kramer, Lynda J.; Arthur, Jarvis J.; Bailey, Randall E.

    2006-01-01

    NASA's Synthetic Vision Systems (SVS) project is developing technologies with practical applications that will help to eliminate low visibility conditions as a causal factor to civil aircraft accidents while replicating the operational benefits of clear day flight operations, regardless of the actual outside visibility condition. The paper describes experimental evaluation of a multi-mode 3-D exocentric synthetic vision navigation display concept for commercial aircraft. Experimental results showed the situation awareness benefits of 2-D and 3-D exocentric synthetic vision displays over traditional 2-D co-planar navigation and vertical situation displays. Conclusions and future research directions are discussed.

  4. Smart Camera System for Aircraft and Spacecraft

    NASA Technical Reports Server (NTRS)

    Delgado, Frank; White, Janis; Abernathy, Michael F.

    2003-01-01

    This paper describes a new approach to situation awareness that combines video sensor technology and synthetic vision technology in a unique fashion to create a hybrid vision system. Our implementation of the technology, called "SmartCam3D" (SC3D), has been flight tested by both NASA and the Department of Defense with excellent results. This paper details its development and flight test results. Windshields and windows add considerable weight and risk to vehicle design, and because of this, many future vehicles will employ a windowless cockpit design. This windowless cockpit design philosophy prompted us to look at what would be required to develop a system that provides crewmembers and operations personnel an appropriate level of situation awareness. The system created to date provides a real-time 3D perspective display that can be used during all-weather and visibility conditions. While the advantages of a synthetic vision only system are considerable, the major disadvantage of such a system is that it displays the synthetic scene created using "static" data acquired by an aircraft or satellite at some point in the past. The SC3D system we are presenting in this paper is a hybrid synthetic vision system that fuses live video stream information with a computer generated synthetic scene. This hybrid system can display a dynamic, real-time scene of a region of interest, enriched by information from a synthetic environment system, see figure 1. The SC3D system has been flight tested on several X-38 flight tests performed over the last several years and on an Army Unmanned Aerial Vehicle (UAV) ground control station earlier this year. Additional testing using an assortment of UAV ground control stations and UAV simulators from the Army and Air Force will be conducted later this year.

  5. Effects of a Velocity-Vector Based Command Augmentation System and Synthetic Vision System Terrain Portrayal and Guidance Symbology Concepts on Single-Pilot Performance

    NASA Technical Reports Server (NTRS)

    Liu, Dahai; Goodrich, Kenneth H.; Peak, Bob

    2010-01-01

    This study investigated the effects of synthetic vision system (SVS) concepts and advanced flight controls on the performance of pilots flying a light, single-engine general aviation airplane. We evaluated the effects and interactions of two levels of terrain portrayal, guidance symbology, and flight control response type on pilot performance during the conduct of a relatively complex instrument approach procedure. The terrain and guidance presentations were evaluated as elements of an integrated primary flight display system. The approach procedure used in the study included a steeply descending, curved segment as might be encountered in emerging, required navigation performance (RNP) based procedures. Pilot performance measures consisted of flight technical performance, perceived workload, perceived situational awareness and subjective preference. The results revealed that an elevation based generic terrain portrayal significantly improved perceived situation awareness without adversely affecting flight technical performance or workload. Other factors (pilot instrument rating, control response type, and guidance symbology) were not found to significantly affect the performance measures.

  6. Flight Testing of Night Vision Systems in Rotorcraft (Test en vol de systemes de vision nocturne a bord des aeronefs a voilure tournante)

    DTIC Science & Technology

    2007-07-01

    ... position is shadowing. Moonlight creates shadows during night-time just as sunlight does during the day. Understanding what cannot be seen in night-time...

  7. Piloted Simulation of Various Synthetic Vision Systems Terrain Portrayal and Guidance Symbology Concepts for Low Altitude En-Route Scenario

    NASA Technical Reports Server (NTRS)

    Takallu, M. A.; Glaab, L. J.; Hughes, M. F.; Wong, D. T.; Bartolone, A. P.

    2008-01-01

    In support of the NASA Aviation Safety Program's Synthetic Vision Systems Project, a series of piloted simulations were conducted to explore and quantify the relationship between candidate Terrain Portrayal Concepts and Guidance Symbology Concepts, specific to General Aviation. The experiment scenario was based on a low altitude en route flight in Instrument Meteorological Conditions in the central mountains of Alaska. A total of 18 general aviation pilots, with three levels of pilot experience, evaluated a test matrix of four terrain portrayal concepts and six guidance symbology concepts. Quantitative measures included various pilot/aircraft performance data, flight technical errors and flight control inputs. The qualitative measures included pilot comments and pilot responses to the structured questionnaires such as perceived workload, subjective situation awareness, pilot preferences, and rare event recognition. There were statistically significant effects found from guidance symbology concepts and terrain portrayal concepts but no significant interactions between them. Lower flight technical errors and increased situation awareness were achieved using Synthetic Vision Systems displays, as compared to the baseline Pitch/Roll Flight Director and Blue Sky Brown Ground combination. Overall, those guidance symbology concepts that have both a path-based guidance cue and a tunnel display performed better than the other guidance concepts.

  8. Multi-spectrum-based enhanced synthetic vision system for aircraft DVE operations

    NASA Astrophysics Data System (ADS)

    Kashyap, Sudesh K.; Naidu, V. P. S.; Shanthakumar, N.

    2016-04-01

    This paper focuses on R&D being carried out at CSIR-NAL on an Enhanced Synthetic Vision System (ESVS) for the Indian regional transport aircraft to enhance all-weather operational capabilities with safety and pilot Situation Awareness (SA) improvements. A flight simulator has been developed to study ESVS related technologies, to develop ESVS operational concepts for all-weather approach and landing, and to provide quantitative and qualitative information that could be used to develop criteria for all-weather approach and landing at regional airports in India. An Enhanced Vision System (EVS) hardware prototype with a long wave infrared sensor and a low light CMOS camera was used to carry out a few field trials on a ground vehicle on an airport runway under different visibility conditions. A data acquisition and playback system has been developed to capture EVS sensor data (images) in time synch with test vehicle inertial navigation data during EVS field experiments and to play back the experimental data on the ESVS flight simulator for ESVS research and concept studies. Efforts are underway to conduct EVS flight experiments on CSIR-NAL's research aircraft HANSA in a Degraded Visual Environment (DVE).
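
    The time-synchronized capture and playback described in this record comes down to pairing each EVS image frame with the inertial-navigation record nearest in time. A minimal sketch, assuming both streams are stamped against a common clock (the record does not describe the actual implementation):

    ```python
    import bisect

    def nearest_ins_record(image_time_s, ins_times_s, ins_records):
        """Pair an image frame with the INS record closest in time (ins_times_s sorted)."""
        i = bisect.bisect_left(ins_times_s, image_time_s)
        candidates = [j for j in (i - 1, i) if 0 <= j < len(ins_times_s)]
        best = min(candidates, key=lambda j: abs(ins_times_s[j] - image_time_s))
        return ins_records[best]

    # Example: a frame stamped at t = 12.34 s picks the INS record logged at 12.30 s.
    print(nearest_ins_record(12.34, [12.10, 12.30, 12.50], ["r0", "r1", "r2"]))
    ```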

  9. A lightweight, inexpensive robotic system for insect vision.

    PubMed

    Sabo, Chelsea; Chisholm, Robert; Petterson, Adam; Cope, Alex

    2017-09-01

    Designing hardware for miniaturized robotics which mimics the capabilities of flying insects is of interest, because they share similar constraints (i.e. small size, low weight, and low energy consumption). Research in this area aims to enable robots with similarly efficient flight and cognitive abilities. Visual processing is important to flying insects' impressive flight capabilities, but currently, embodiment of insect-like visual systems is limited by the hardware systems available. Suitable hardware is either prohibitively expensive, difficult to reproduce, unable to accurately simulate insect vision characteristics, and/or too heavy for small robotic platforms. These limitations hamper the development of platforms for embodiment, which in turn hampers progress on understanding how biological systems fundamentally work. To address this gap, this paper proposes an inexpensive, lightweight robotic system for modelling insect vision. The system is mounted and tested on a robotic platform for mobile applications, and then the camera and insect vision models are evaluated. We analyse the potential of the system for use in embodiment of higher-level visual processes (i.e. motion detection) and also for development of navigation based on vision for robotics in general. Optic flow from sample camera data is calculated and compared to a perfect, simulated bee world, showing an excellent resemblance. Copyright © 2017 The Authors. Published by Elsevier Ltd. All rights reserved.
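
    The optic-flow computation mentioned in this record can be sketched with a dense flow call; the paper does not name a library, so the use of OpenCV's Farneback method and the synthetic test frames below are assumptions for illustration.

    ```python
    import numpy as np
    import cv2  # opencv-python assumed

    def dense_optic_flow(prev_gray, next_gray):
        """Dense optic flow between two 8-bit grayscale frames (Farneback method)."""
        return cv2.calcOpticalFlowFarneback(prev_gray, next_gray, None,
                                            0.5, 3, 15, 3, 5, 1.2, 0)

    # Example with synthetic frames: shift a smooth texture one pixel to the right.
    xx, yy = np.meshgrid(np.arange(160), np.arange(120))
    frame_a = (127 + 60 * np.sin(xx / 6.0) + 40 * np.sin(yy / 9.0)).astype(np.uint8)
    frame_b = np.roll(frame_a, 1, axis=1)
    flow = dense_optic_flow(frame_a, frame_b)     # shape (120, 160, 2): per-pixel (dx, dy)
    print(float(flow[..., 0].mean()))             # mean horizontal flow, roughly +1 pixel
    ```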

  11. Latency Requirements for Head-Worn Display S/EVS Applications

    NASA Technical Reports Server (NTRS)

    Bailey, Randall E.; Arthur, Jarvis J., III; Williams, Steven P.

    2004-01-01

    NASA's Aviation Safety Program, Synthetic Vision Systems Project is conducting research in advanced flight deck concepts, such as Synthetic/Enhanced Vision Systems (S/EVS), for commercial and business aircraft. An emerging thrust in this activity is the development of spatially-integrated, large field-of-regard information display systems. Head-worn or helmet-mounted display systems are being proposed as one method in which to meet this objective. System delays or latencies inherent to spatially-integrated, head-worn displays critically influence the display utility, usability, and acceptability. Research results from three different, yet similar technical areas - flight control, flight simulation, and virtual reality - are collectively assembled in this paper to create a global perspective of delay or latency effects in head-worn or helmet-mounted display systems. Consistent definitions and measurement techniques are proposed herein for universal application, and latency requirements for Head-Worn Display S/EVS applications are drafted. Future research areas are defined.
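
    A convenient first-order way to reason about the latency requirements drafted in this paper is that, for a conformal head-worn display symbol, the apparent registration error is roughly the angular head rate multiplied by the total system latency. The sketch below applies that simple model; the head-rate and latency values are illustrative assumptions, not figures from the paper.

```python
def registration_error_deg(head_rate_dps, latency_s):
    """Approximate conformal-symbol misregistration (degrees) caused by
    system latency: error ~= angular head rate * total latency."""
    return head_rate_dps * latency_s

# illustrative case: a brisk 100 deg/s head motion with 20 ms and 80 ms delays
for latency_ms in (20, 80):
    err = registration_error_deg(100.0, latency_ms / 1000.0)
    print(f"{latency_ms:3d} ms latency -> ~{err:.1f} deg apparent error")
```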

  12. Latency requirements for head-worn display S/EVS applications

    NASA Astrophysics Data System (ADS)

    Bailey, Randall E.; Arthur, Jarvis J., III; Williams, Steven P.

    2004-08-01

    NASA's Aviation Safety Program, Synthetic Vision Systems Project is conducting research in advanced flight deck concepts, such as Synthetic/Enhanced Vision Systems (S/EVS), for commercial and business aircraft. An emerging thrust in this activity is the development of spatially-integrated, large field-of-regard information display systems. Head-worn or helmet-mounted display systems are being proposed as one method in which to meet this objective. System delays or latencies inherent to spatially-integrated, head-worn displays critically influence the display utility, usability, and acceptability. Research results from three different, yet similar technical areas - flight control, flight simulation, and virtual reality - are collectively assembled in this paper to create a global perspective of delay or latency effects in head-worn or helmet-mounted display systems. Consistent definitions and measurement techniques are proposed herein for universal application and latency requirements for Head-Worn Display S/EVS applications are drafted. Future research areas are defined.

  13. External Vision Systems (XVS) Proof-of-Concept Flight Test Evaluation

    NASA Technical Reports Server (NTRS)

    Shelton, Kevin J.; Williams, Steven P.; Kramer, Lynda J.; Arthur, Jarvis J.; Prinzel, Lawrence, III; Bailey, Randall E.

    2014-01-01

    NASA's Fundamental Aeronautics Program, High Speed Project is performing research, development, test and evaluation of flight deck and related technologies to support future low-boom, supersonic configurations (without forward-facing windows) by use of an eXternal Vision System (XVS). The challenge of XVS is to determine a combination of sensor and display technologies which can provide an equivalent level of safety and performance to that provided by forward-facing windows in today's aircraft. This flight test was conducted with the goal of obtaining performance data on see-and-avoid and see-to-follow traffic using a proof-of-concept XVS design in actual flight conditions. Six data collection flights were flown in four traffic scenarios against two different-sized participating traffic aircraft. This test utilized a 3x1 array of High Definition (HD) cameras, with a fixed forward field-of-view, mounted on NASA Langley's UC-12 test aircraft. Test scenarios, with participating NASA aircraft serving as traffic, were presented to two evaluation pilots per flight - one using the proof-of-concept (POC) XVS and the other looking out the forward windows. The camera images were presented on the XVS display in the aft cabin with Head-Up Display (HUD)-like flight symbology overlaying the real-time imagery. The test generated XVS performance data, including comparisons to natural vision; post-run subjective acceptability data were also collected. This paper discusses the flight test activities and their operational challenges, and summarizes the findings to date.
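
    Presenting HUD-like symbology over real-time camera imagery, as in the XVS display described above, reduces to compositing stroke graphics onto each video frame. A minimal, hypothetical sketch using OpenCV drawing calls is shown below; it is not the flight-test software, only an illustration of the overlay step.

```python
import numpy as np
import cv2  # opencv-python

def overlay_symbology(frame_bgr, pitch_deg, heading_deg, px_per_deg=8.0):
    """Draw a simple horizon line and heading readout over a video frame.
    A real HUD overlay would use conformal projection and full symbology;
    this only shows the compositing step."""
    out = frame_bgr.copy()
    h, w = out.shape[:2]
    # horizon displaced vertically by pitch (nose up -> horizon moves down)
    y = int(h / 2 + pitch_deg * px_per_deg)
    cv2.line(out, (0, y), (w, y), (0, 255, 0), 2)
    cv2.putText(out, f"HDG {heading_deg:05.1f}", (10, 30),
                cv2.FONT_HERSHEY_SIMPLEX, 0.8, (0, 255, 0), 2)
    return out

# hypothetical 720p frame (solid grey stand-in for a live camera image)
frame = np.full((720, 1280, 3), 90, dtype=np.uint8)
annotated = overlay_symbology(frame, pitch_deg=2.5, heading_deg=87.0)
print("annotated frame shape:", annotated.shape)
```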

  14. Advanced helmet vision system (AHVS) integrated night vision helmet mounted display (HMD)

    NASA Astrophysics Data System (ADS)

    Ashcraft, Todd W.; Atac, Robert

    2012-06-01

    Gentex Corporation, under contract to Naval Air Systems Command (AIR 4.0T), designed the Advanced Helmet Vision System to provide aircrew with 24-hour, visor-projected binocular night vision and HMD capability. AHVS integrates numerous key technologies, including high brightness Light Emitting Diode (LED)-based digital light engines, advanced lightweight optical materials and manufacturing processes, and innovations in graphics processing software. This paper reviews the current status of miniaturization and integration with the latest two-part Gentex modular helmet, highlights the lessons learned from previous AHVS phases, and discusses plans for qualification and flight testing.

  15. 2D/3D Synthetic Vision Navigation Display

    NASA Technical Reports Server (NTRS)

    Prinzel, Lawrence J., III; Kramer, Lynda J.; Arthur, J. J., III; Bailey, Randall E.; Sweeters, Jason L.

    2008-01-01

    Flight-deck display software was designed and developed at NASA Langley Research Center to provide two-dimensional (2D) and three-dimensional (3D) terrain, obstacle, and flight-path perspectives on a single navigation display. The objective was to optimize the presentation of synthetic vision (SV) system technology that permits pilots to view multiple perspectives of flight-deck display symbology and 3D terrain information. Research was conducted to evaluate the efficacy of the concept. The concept has numerous unique implementation features that would permit enhanced operational concepts and efficiencies in both current and future aircraft.

  16. A Hybrid Synthetic Vision System for the Tele-operation of Unmanned Vehicles

    NASA Technical Reports Server (NTRS)

    Delgado, Frank; Abernathy, Mike

    2004-01-01

    A system called SmartCam3D (SC3D) has been developed to provide enhanced situational awareness for operators of a remotely piloted vehicle. SC3D is a Hybrid Synthetic Vision System (HSVS) that combines live sensor data with information from a Synthetic Vision System (SVS). By combining the dual information sources, the operators are afforded the advantages of each approach. The live sensor system provides real-time information for the region of interest. The SVS provides information-rich visuals that will function under all weather and visibility conditions. Additionally, the combination of technologies allows the system to circumvent some of the limitations of each approach. Video sensor systems are not very useful when visibility conditions are hampered by rain, snow, sand, fog, and smoke, while an SVS can suffer from data freshness problems. Typically, an aircraft or satellite flying overhead collects the data used to create the SVS visuals. The SVS data could have been collected weeks, months, or even years ago. To that extent, the information from an SVS visual could be outdated and possibly inaccurate. SC3D was used in the remote cockpit during flight tests of the X-38 132 and 131R vehicles at the NASA Dryden Flight Research Center. SC3D was also used during the operation of military Unmanned Aerial Vehicles. This presentation will provide an overview of the system, the evolution of the system, the results of flight tests, and future plans. Furthermore, the safety benefits of SC3D over traditional and pure synthetic vision systems will be discussed.
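
    The hybrid idea described above, lean on the live sensor when its imagery is usable and fall back to the synthetic scene when it is not, can be illustrated with a simple per-frame blend. The quality metric and weighting below are hypothetical stand-ins, not the SmartCam3D algorithm.

```python
import numpy as np

def blend_hybrid(live_gray, synthetic_gray, min_contrast=10.0):
    """Blend a live sensor frame with a synthetic-vision frame.

    The live image's standard deviation serves as a crude visibility score:
    featureless frames (fog, smoke) get low weight, so the synthetic scene
    dominates. Both inputs are uint8 arrays of the same shape."""
    live = live_gray.astype(np.float32)
    synth = synthetic_gray.astype(np.float32)
    w_live = float(np.clip(live.std() / (2.0 * min_contrast), 0.0, 1.0))
    fused = w_live * live + (1.0 - w_live) * synth
    return fused.astype(np.uint8), w_live

# hypothetical frames: a washed-out live image and a detailed synthetic one
rng = np.random.default_rng(1)
live = np.full((240, 320), 128, dtype=np.uint8)              # fog: no contrast
synthetic = (rng.random((240, 320)) * 255).astype(np.uint8)  # synthetic terrain
fused, weight = blend_hybrid(live, synthetic)
print(f"live-image weight = {weight:.2f} (0 = all synthetic, 1 = all live)")
```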

  17. 76 FR 8278 - Special Conditions: Gulfstream Model GVI Airplane; Enhanced Flight Vision System

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-02-14

    ... detected by infrared sensors can be much different from that detected by natural pilot vision. On a dark... by many imaging infrared systems. On the other hand, contrasting colors in visual wavelengths may be... of the EFVS image and the level of EFVS infrared sensor performance could depend significantly on...

  18. Ultraviolet sensor as integrity monitor for enhanced flight vision system (EFVS) approaches to Cat II RVR conditions

    NASA Astrophysics Data System (ADS)

    McKinley, John B.; Pierson, Roger; Ertem, M. C.; Krone, Norris J., Jr.; Cramer, James A.

    2008-04-01

    Flight tests were conducted at Greenbrier Valley Airport (KLWB) and Easton Municipal Airport / Newnam Field (KESN) in a Cessna 402B aircraft using a head-up display (HUD) and a Norris Electro Optical Systems Corporation (NEOC) developmental ultraviolet (UV) sensor. These flights were sponsored by NEOC under a Federal Aviation Administration program, and the ultraviolet concepts, technology, system mechanization, and hardware for landing during low visibility conditions have been patented by NEOC. Imagery from the UV sensor, HUD guidance cues, and out-the-window videos were separately recorded at the engineering workstation for each approach. Inertial flight path data were also recorded. Various configurations of portable UV emitters were positioned along the runway edge and threshold. The UV imagery of the runway outline was displayed on the HUD along with guidance generated by the mission computer. Enhanced Flight Vision System (EFVS) approaches with the UV sensor were conducted from the initial approach fix to the ILS decision height in both VMC and IMC. Although the availability of low visibility conditions during the flight test period was limited, results from previous fog range testing concluded that UV EFVS has the performance capability to penetrate CAT II runway visual range obscuration. Furthermore, independent analysis has shown that existing runway lights emit sufficient UV radiation without the need for augmentation other than lens replacement with UV-transmissive quartz lenses. Consequently, UV sensors should qualify as conforming to FAA requirements for EFVS approaches. Combined with a Synthetic Vision System (SVS), UV EFVS would function both as a precision landing aid and as an integrity monitor for the GPS and SVS database.

  19. Synthetic Vision Displays for Planetary and Lunar Lander Vehicles

    NASA Technical Reports Server (NTRS)

    Arthur, Jarvis J., III; Prinzel, Lawrence J., III; Williams, Steven P.; Shelton, Kevin J.; Kramer, Lynda J.; Bailey, Randall E.; Norman, Robert M.

    2008-01-01

    Aviation research has demonstrated that Synthetic Vision (SV) technology can substantially enhance situation awareness, reduce pilot workload, improve aviation safety, and promote flight path control precision. SV, and related flight deck technologies are currently being extended for application in planetary exploration vehicles. SV, in particular, holds significant potential for many planetary missions since the SV presentation provides a computer-generated view for the flight crew of the terrain and other significant environmental characteristics independent of the outside visibility conditions, window locations, or vehicle attributes. SV allows unconstrained control of the computer-generated scene lighting, terrain coloring, and virtual camera angles which may provide invaluable visual cues to pilots/astronauts, not available from other vision technologies. In addition, important vehicle state information may be conformally displayed on the view such as forward and down velocities, altitude, and fuel remaining to enhance trajectory control and vehicle system status. The paper accompanies a conference demonstration that introduced a prototype NASA Synthetic Vision system for lunar lander spacecraft. The paper will describe technical challenges and potential solutions to SV applications for the lunar landing mission, including the requirements for high-resolution lunar terrain maps, accurate positioning and orientation, and lunar cockpit display concepts to support projected mission challenges.

  20. Transformational Spaceport and Range Concept of Operations: A Vision to Transform Ground and Launch Operations

    NASA Technical Reports Server (NTRS)

    2005-01-01

    The Transformational Concept of Operations (CONOPS) provides a long-term, sustainable vision for future U.S. space transportation infrastructure and operations. This vision presents an interagency concept, developed cooperatively by the Department of Defense (DoD), the Federal Aviation Administration (FAA), and the National Aeronautics and Space Administration (NASA) for the upgrade, integration, and improved operation of major infrastructure elements of the nation's space access systems. The interagency vision described in the Transformational CONOPS would transform today's space launch infrastructure into a shared system that supports worldwide operations for a variety of users. The system concept is sufficiently flexible and adaptable to support new types of missions for exploration, commercial enterprise, and national security, as well as to endure further into the future when space transportation technology may be sufficiently advanced to enable routine public space travel as part of the global transportation system. The vision for future space transportation operations is based on a system-of-systems architecture that integrates the major elements of the future space transportation system - transportation nodes (spaceports), flight vehicles and payloads, tracking and communications assets, and flight traffic coordination centers - into a transportation network that concurrently accommodates multiple types of mission operators, payloads, and vehicle fleets. This system concept also establishes a common framework for defining a detailed CONOPS for the major elements of the future space transportation system. The resulting set of four CONOPS (see Figure 1 below) describes the common vision for a shared future space transportation system (FSTS) infrastructure from a variety of perspectives.

  1. Commercial Flight Crew Decision-Making during Low-Visibility Approach Operations Using Fused Synthetic/Enhanced Vision Systems

    NASA Technical Reports Server (NTRS)

    Kramer, Lynda J.; Bailey, Randall E.; Prinzel, Lawrence J., III

    2007-01-01

    NASA is investigating revolutionary crew-vehicle interface technologies that strive to proactively overcome aircraft safety barriers that would otherwise constrain the full realization of the next-generation air transportation system. A fixed-base piloted simulation experiment was conducted to evaluate the complementary use of Synthetic and Enhanced Vision technologies. Specific focus was placed on new techniques for integration and/or fusion of Enhanced and Synthetic Vision and its impact within a two-crew flight deck on the crew's decision-making process during low-visibility approach and landing operations. Overall, the experimental data showed that significant improvements in situation awareness, without concomitant increases in workload and display clutter, could be provided by the integration and/or fusion of synthetic and enhanced vision technologies for the pilot-flying and the pilot-not-flying. During non-normal operations, the ability of the crew to handle substantial navigational errors and runway incursions was neither improved nor adversely impacted by the display concepts. The addition of Enhanced Vision may not, in itself, provide an improvement in runway incursion detection without being specifically tailored for this application. Existing enhanced vision system procedures were effectively used in the crew decision-making process during approach and missed approach operations, but having to forcibly transition from an excellent FLIR image to natural vision by 100 ft above field level was awkward for the pilot-flying.

  2. Differential GNSS and Vision-Based Tracking to Improve Navigation Performance in Cooperative Multi-UAV Systems.

    PubMed

    Vetrella, Amedeo Rodi; Fasano, Giancarmine; Accardo, Domenico; Moccia, Antonio

    2016-12-17

    Autonomous navigation of micro-UAVs is typically based on the integration of low cost Global Navigation Satellite System (GNSS) receivers and Micro-Electro-Mechanical Systems (MEMS)-based inertial and magnetic sensors to stabilize and control the flight. The resulting navigation performance in terms of position and attitude accuracy may not suffice for other mission needs, such as the ones relevant to fine sensor pointing. In this framework, this paper presents a cooperative UAV navigation algorithm that allows a chief vehicle, equipped with inertial and magnetic sensors, a Global Positioning System (GPS) receiver, and a vision system, to improve its navigation performance (in real time or in the post processing phase) exploiting formation flying deputy vehicles equipped with GPS receivers. The focus is set on outdoor environments and the key concept is to exploit differential GPS among vehicles and vision-based tracking (DGPS/Vision) to build a virtual additional navigation sensor whose information is then integrated in a sensor fusion algorithm based on an Extended Kalman Filter. The developed concept and processing architecture are described, with a focus on DGPS/Vision attitude determination algorithm. Performance assessment is carried out on the basis of both numerical simulations and flight tests. In the latter ones, navigation estimates derived from the DGPS/Vision approach are compared with those provided by the onboard autopilot system of a customized quadrotor. The analysis shows the potential of the developed approach, mainly deriving from the possibility to exploit magnetic- and inertial-independent accurate attitude information.
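
    The core of the DGPS/Vision idea is that the chief-to-deputy baseline is known in the navigation frame (from differential GPS) and observed in the chief's body frame (from the camera tracking the deputy), so comparing the two constrains the chief's attitude. The stripped-down sketch below shows only the heading part of that comparison for level flight, with made-up vectors; the actual system treats full attitude and fuses the result in an Extended Kalman Filter.

```python
import numpy as np

def heading_from_baseline(baseline_ned, bearing_body):
    """Chief-vehicle heading (deg) from the chief-to-deputy baseline expressed
    in the navigation (NED) frame and the same baseline observed as a unit
    bearing in the chief's body frame. Assumes level flight (yaw only)."""
    az_nav = np.degrees(np.arctan2(baseline_ned[1], baseline_ned[0]))
    az_body = np.degrees(np.arctan2(bearing_body[1], bearing_body[0]))
    return (az_nav - az_body) % 360.0

# hypothetical geometry: deputy 50 m to the north-east of the chief (DGPS),
# and seen 15 deg to the right of the chief's nose by the onboard camera
baseline = np.array([35.4, 35.4, 0.0])                # NED, metres
bearing = np.array([np.cos(np.radians(15.0)),
                    np.sin(np.radians(15.0)), 0.0])   # body-frame unit vector
print(f"estimated heading ~ {heading_from_baseline(baseline, bearing):.1f} deg")
```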

  3. Evolving EO-1 Sensor Web Testbed Capabilities in Pursuit of GEOSS

    NASA Technical Reports Server (NTRS)

    Mandl, Dan; Ly, Vuong; Frye, Stuart; Younis, Mohamed

    2006-01-01

    A viewgraph presentation on evolving sensor web capabilities in pursuit of the Global Earth Observing System of Systems (GEOSS) is shown. The topics include: 1) Vision to Enable Sensor Webs with "Hot Spots"; 2) Vision Extended for Communication/Control Architecture for Missions to Mars; 3) Key Capabilities Implemented to Enable EO-1 Sensor Webs; 4) One of Three Experiments Conducted by UMBC Undergraduate Class 12-14-05 (1 - 3); 5) Closer Look at our Mini-Rovers and Simulated Mars Landscape at GSFC; 6) Beginning to Implement Experiments with Standards-Vision for Integrated Sensor Web Environment; 7) Goddard Mission Services Evolution Center (GMSEC); 8) GMSEC Component Catalog; 9) Core Flight System (CFS) and Extension for GMSEC for Flight SW; 10) Sensor Modeling Language; 11) Seamless Ground to Space Integrated Message Bus Demonstration (completed December 2005); 12) Other Experiments in Queue; 13) Acknowledgements; and 14) References.

  4. Aspects of Synthetic Vision Display Systems and the Best Practices of the NASA's SVS Project

    NASA Technical Reports Server (NTRS)

    Bailey, Randall E.; Kramer, Lynda J.; Jones, Denise R.; Young, Steven D.; Arthur, Jarvis J.; Prinzel, Lawrence J.; Glaab, Louis J.; Harrah, Steven D.; Parrish, Russell V.

    2008-01-01

    NASA's Synthetic Vision Systems (SVS) Project conducted research aimed at eliminating visibility-induced errors and low visibility conditions as causal factors in civil aircraft accidents while enabling the operational benefits of clear day flight operations regardless of actual outside visibility. SVS takes advantage of many enabling technologies to achieve this capability including, for example, the Global Positioning System (GPS), data links, radar, imaging sensors, geospatial databases, advanced display media and three dimensional video graphics processors. Integration of these technologies to achieve the SVS concept provides pilots with high-integrity information that improves situational awareness with respect to terrain, obstacles, traffic, and flight path. This paper attempts to emphasize the system aspects of SVS - true systems, rather than just terrain on a flight display - and to document from an historical viewpoint many of the best practices that evolved during the SVS Project from the perspective of some of the NASA researchers most heavily involved in its execution. The Integrated SVS Concepts are envisagements of what production-grade Synthetic Vision systems might, or perhaps should, be in order to provide the desired functional capabilities that eliminate low visibility as a causal factor to accidents and enable clear-day operational benefits regardless of visibility conditions.

  5. Evaluation of Candidate Millimeter Wave Sensors for Synthetic Vision

    NASA Technical Reports Server (NTRS)

    Alexander, Neal T.; Hudson, Brian H.; Echard, Jim D.

    1994-01-01

    The goal of the Synthetic Vision Technology Demonstration Program was to demonstrate and document the capabilities of current technologies to achieve safe aircraft landing, take off, and ground operation in very low visibility conditions. Two of the major thrusts of the program were (1) sensor evaluation in measured weather conditions on a tower overlooking an unused airfield and (2) flight testing of sensor and pilot performance via a prototype system. The presentation first briefly addresses the overall technology thrusts and goals of the program and provides a summary of MMW sensor tower-test and flight-test data collection efforts. Data analysis and calibration procedures for both the tower tests and flight tests are presented. The remainder of the presentation addresses the MMW sensor flight-test evaluation results, including the processing approach for determination of various performance metrics (e.g., contrast, sharpness, and variability). The variation of the very important contrast metric in adverse weather conditions is described. Design trade-off considerations for Synthetic Vision MMW sensors are presented.
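
    Of the performance metrics listed above, contrast is the simplest to illustrate: for a runway-like target against background clutter it is commonly expressed in Michelson or Weber form. The sketch below computes both for hypothetical target and background patches; it illustrates the metric only and is not the program's evaluation code.

```python
import numpy as np

def contrast_metrics(target_region, background_region):
    """Michelson and Weber contrast between a target and its background,
    computed from the mean intensities of the two regions."""
    t = float(np.mean(target_region))
    b = float(np.mean(background_region))
    michelson = abs(t - b) / (t + b)
    weber = (t - b) / b
    return michelson, weber

# hypothetical sensor-image patches (arbitrary intensity units)
rng = np.random.default_rng(2)
runway = rng.normal(loc=180.0, scale=5.0, size=(40, 200))
terrain = rng.normal(loc=120.0, scale=20.0, size=(40, 200))
m, w = contrast_metrics(runway, terrain)
print(f"Michelson contrast = {m:.2f}, Weber contrast = {w:.2f}")
```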

  6. Influence of control parameters on the joint tracking performance of a coaxial weld vision system

    NASA Technical Reports Server (NTRS)

    Gangl, K. J.; Weeks, J. L.

    1985-01-01

    The first phase of a series of evaluations of a vision-based welding control sensor for the Space Shuttle Main Engine Robotic Welding System is described. The robotic welding system is presently under development at the Marshall Space Flight Center. This evaluation determines the standard control response parameters necessary for proper trajectory of the welding torch along the joint.

  7. Study of Synthetic Vision Systems (SVS) and Velocity-vector Based Command Augmentation System (V-CAS) on Pilot Performance

    NASA Technical Reports Server (NTRS)

    Liu, Dahai; Goodrich, Ken; Peak, Bob

    2006-01-01

    This study investigated the effects of synthetic vision system (SVS) concepts and advanced flight controls on single pilot performance (SPP). Specifically, we evaluated the benefits and interactions of two levels of terrain portrayal, guidance symbology, and control-system response type on SPP in the context of lower-landing minima (LLM) approaches. Performance measures consisted of flight technical error (FTE) and pilot perceived workload. In this study, pilot rating, control type, and guidance symbology were not found to significantly affect FTE or workload. It is likely that transfer from prior experience, limited scope of the evaluation task, specific implementation limitations, and limited sample size were major factors in obtaining these results.

  8. Synthetic Vision System Commercial Aircraft Flight Deck Display Technologies for Unusual Attitude Recovery

    NASA Technical Reports Server (NTRS)

    Prinzel, Lawrence J., III; Ellis, Kyle E.; Arthur, Jarvis J.; Nicholas, Stephanie N.; Kiggins, Daniel

    2017-01-01

    A Commercial Aviation Safety Team (CAST) study of 18 worldwide loss-of-control accidents and incidents determined that the lack of external visual references was associated with a flight crew's loss of attitude awareness or energy state awareness in 17 of these events. Therefore, CAST recommended development and implementation of virtual day-Visual Meteorological Condition (VMC) display systems, such as synthetic vision systems, which can promote flight crew attitude awareness similar to a day-VMC environment. This paper describes the results of a high-fidelity, large transport aircraft simulation experiment that evaluated virtual day-VMC displays and a "background attitude indicator" concept as an aid to pilots in recovery from unusual attitudes. Twelve commercial airline pilots performed multiple unusual attitude recoveries and both quantitative and qualitative dependent measures were collected. Experimental results and future research directions under this CAST initiative and the NASA "Technologies for Airplane State Awareness" research project are described.

  9. Enhanced Flight Vision Systems Operational Feasibility Study Using Radar and Infrared Sensors

    NASA Technical Reports Server (NTRS)

    Etherington, Timothy J.; Kramer, Lynda J.; Severance, Kurt; Bailey, Randall E.; Williams, Steven P.; Harrison, Stephanie J.

    2015-01-01

    Approach and landing operations during periods of reduced visibility have plagued aircraft pilots since the beginning of aviation. Although techniques are currently available to mitigate some of the visibility conditions, these operations are still ultimately limited by the pilot's ability to "see" required visual landing references (e.g., markings and/or lights of threshold and touchdown zone) and require significant and costly ground infrastructure. Certified Enhanced Flight Vision Systems (EFVS) have shown promise to lift the obscuration veil. They allow the pilot to operate with enhanced vision, in lieu of natural vision, in the visual segment to enable equivalent visual operations (EVO). An aviation standards document was developed with industry and government consensus for using an EFVS for approach, landing, and rollout to a safe taxi speed in visibilities as low as 300 feet runway visual range (RVR). These new standards establish performance, integrity, availability, and safety requirements to operate in this regime without reliance on a pilot's or flight crew's natural vision by use of a fail-operational EFVS. A pilot-in-the-loop high-fidelity motion simulation study was conducted at NASA Langley Research Center to evaluate the operational feasibility, pilot workload, and pilot acceptability of conducting straight-in instrument approaches with published vertical guidance to landing, touchdown, and rollout to a safe taxi speed in visibility as low as 300 feet RVR by use of vision system technologies on a head-up display (HUD) without need or reliance on natural vision. Twelve crews flew various landing and departure scenarios in 1800, 1000, 700, and 300 RVR. This paper details the non-normal results of the study including objective and subjective measures of performance and acceptability. The study validated the operational feasibility of approach and departure operations and success was independent of visibility conditions. Failures were handled within the lateral confines of the runway for all conditions tested. The fail-operational concept with pilot in the loop needs further study.

  10. Differential GNSS and Vision-Based Tracking to Improve Navigation Performance in Cooperative Multi-UAV Systems

    PubMed Central

    Vetrella, Amedeo Rodi; Fasano, Giancarmine; Accardo, Domenico; Moccia, Antonio

    2016-01-01

    Autonomous navigation of micro-UAVs is typically based on the integration of low cost Global Navigation Satellite System (GNSS) receivers and Micro-Electro-Mechanical Systems (MEMS)-based inertial and magnetic sensors to stabilize and control the flight. The resulting navigation performance in terms of position and attitude accuracy may not suffice for other mission needs, such as the ones relevant to fine sensor pointing. In this framework, this paper presents a cooperative UAV navigation algorithm that allows a chief vehicle, equipped with inertial and magnetic sensors, a Global Positioning System (GPS) receiver, and a vision system, to improve its navigation performance (in real time or in the post processing phase) exploiting formation flying deputy vehicles equipped with GPS receivers. The focus is set on outdoor environments and the key concept is to exploit differential GPS among vehicles and vision-based tracking (DGPS/Vision) to build a virtual additional navigation sensor whose information is then integrated in a sensor fusion algorithm based on an Extended Kalman Filter. The developed concept and processing architecture are described, with a focus on DGPS/Vision attitude determination algorithm. Performance assessment is carried out on the basis of both numerical simulations and flight tests. In the latter ones, navigation estimates derived from the DGPS/Vision approach are compared with those provided by the onboard autopilot system of a customized quadrotor. The analysis shows the potential of the developed approach, mainly deriving from the possibility to exploit magnetic- and inertial-independent accurate attitude information. PMID:27999318

  11. 2013-2363

    NASA Image and Video Library

    2013-05-15

    (left to right) NASA Langley aerospace engineer Bruce Jackson briefs astronauts Rex Walheim and Gregory Johnson about the Synthetic Vision (SV) and Enhanced Vision (EV) systems in a flight simulator at the center's Cockpit Motion Facility. The astronauts were training to land the Dream Chaser spacecraft on May 15, 2013. Credit: NASA/David C. Bowman

  12. An Analysis of Helicopter Pilot Scan Techniques While Flying at Low Altitudes and High Speed

    DTIC Science & Technology

    2012-09-01

    Manager SV Synthetic Vision TFH Total Flight Hours TOFT Tactical Operational Flight Trainer VFR Visual Flight Rules VMC Visual Meteorological...Crognale, 2008). Recently, the use of synthetic vision (SV) and a head-up display (HUD) has been a topic of discussion in the aviation community... Synthetic vision uses external cameras to provide the pilot with an enhanced view of the outside world, usually with the assistance of night vision

  13. GSFC Information Systems Technology Developments Supporting the Vision for Space Exploration

    NASA Technical Reports Server (NTRS)

    Hughes, Peter; Dennehy, Cornelius; Mosier, Gary; Smith, Dan; Rykowski, Lisa

    2004-01-01

    The Vision for Space Exploration will guide NASA's future human and robotic space activities. The broad range of human and robotic missions now being planned will require the development of new system-level capabilities enabled by emerging new technologies. Goddard Space Flight Center is actively supporting the Vision for Space Exploration in a number of program management, engineering and technology areas. This paper provides a brief background on the Vision for Space Exploration and a general overview of potential key Goddard contributions. In particular, this paper focuses on describing relevant GSFC information systems capabilities in architecture development; interoperable command, control and communications; and other applied information systems technology/research activities that are applicable to support the Vision for Space Exploration goals. Current GSFC development efforts and task activities are presented together with future plans.

  14. Theory underlying the peripheral vision horizon device

    NASA Technical Reports Server (NTRS)

    Money, K. E.

    1984-01-01

    Peripheral Vision Horizon Device (PVHD) theory states that the likelihood of pilot disorientation in flight is reduced by providing an artificial horizon that provides orientation information to peripheral vision. In considering the validity of the theory, three areas are explored: the use of an artificial horizon device over some other flight instrument; the use of peripheral vision over foveal vision; and the evidence that peripheral vision is well suited to the processing of orientation information.

  15. Interaction of vestibular, echolocation, and visual modalities guiding flight by the big brown bat, Eptesicus fuscus.

    PubMed

    Horowitz, Seth S; Cheney, Cheryl A; Simmons, James A

    2004-01-01

    The big brown bat (Eptesicus fuscus) is an aerial-feeding insectivorous species that relies on echolocation to avoid obstacles and to detect flying insects. Spatial perception in the dark using echolocation challenges the vestibular system to function without substantial visual input for orientation. IR thermal video recordings show the complexity of bat flights in the field and suggest a highly dynamic role for the vestibular system in orientation and flight control. To examine this role, we carried out laboratory studies of flight behavior under illuminated and dark conditions in both static and rotating obstacle tests while administering heavy water (D2O) to impair vestibular inputs. Eptesicus carried out complex maneuvers through both fixed arrays of wires and a rotating obstacle array using both vision and echolocation, or when guided by echolocation alone. When treated with D2O in combination with lack of visual cues, bats showed considerable decrements in performance. These data indicate that big brown bats use both vision and echolocation to provide spatial registration for head position information generated by the vestibular system.

  16. Vision-Based UAV Flight Control and Obstacle Avoidance

    DTIC Science & Technology

    2006-01-01

    denoted it by Vb = (Vb1, Vb2, Vb3). Fig. 2 shows the block diagram of the proposed vision-based motion analysis and obstacle avoidance system. We denote...structure analysis often involve computation-intensive computer vision tasks, such as feature extraction and geometric modeling. Computation-intensive...First, we extract a set of features from each block. 2) Second, we compute the distance between these two sets of features. In conventional motion

  17. Computer vision techniques for rotorcraft low-altitude flight

    NASA Technical Reports Server (NTRS)

    Sridhar, Banavar; Cheng, Victor H. L.

    1988-01-01

    A description is given of research that applies techniques from computer vision to the automation of rotorcraft navigation. The effort emphasizes the development of a methodology for detecting the ranges to obstacles in the region of interest based on the maximum utilization of passive sensors. The range map derived from the obstacle detection approach can be used as obstacle data for obstacle avoidance in an automatic guidance system and as an advisory display to the pilot. The lack of suitable flight imagery data, however, presents a problem in the verification of concepts for obstacle detection. This problem is being addressed by the development of an adequate flight database and by preprocessing of currently available flight imagery. Some comments are made on future work and how research in this area relates to the guidance of other autonomous vehicles.
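
    For a translating camera with the rotational flow removed, the passive-ranging idea sketched above reduces, for a feature near the image centre and motion parallel to the image plane, to range approximately equal to focal length times translational speed divided by image-plane feature velocity. The sketch below applies that relation with illustrative numbers; it is not the methodology developed in this effort.

```python
def range_from_flow(flow_px_per_s, lateral_speed_mps, focal_px):
    """Range (m) to a feature near the image centre, assuming pure lateral
    camera translation and no rotation: Z = f * V / image velocity.
    A real system must first estimate and remove the rotational flow."""
    return focal_px * lateral_speed_mps / flow_px_per_s

# illustrative numbers: 800 px focal length, 5 m/s lateral motion,
# and a feature drifting across the image at 40 px/s
print(f"estimated range ~ {range_from_flow(40.0, 5.0, 800.0):.0f} m")
```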

  18. 78 FR 32078 - Special Conditions: Gulfstream Model G280 Airplane, Enhanced Flight Vision System (EFVS) With...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-05-29

    ... document refers to a system comprised of a head-up display, imaging sensor(s), and avionics interfaces that display the sensor imagery on the HUD, and which overlay that imagery with alpha-numeric and symbolic... the sensor imagery, with or without other flight information, on a head-down display. For clarity, the...

  19. Bringing UAVs to the fight: recent army autonomy research and a vision for the future

    NASA Astrophysics Data System (ADS)

    Moorthy, Jay; Higgins, Raymond; Arthur, Keith

    2008-04-01

    The Unmanned Autonomous Collaborative Operations (UACO) program was initiated in recognition of the high operational burden associated with utilizing unmanned systems by both mounted and dismounted, ground and airborne warfighters. The program was previously introduced at the 62nd Annual Forum of the American Helicopter Society in May of 2006 [1]. This paper presents the three technical approaches taken and results obtained in UACO. All three approaches were validated extensively in contractor simulations, two were validated in government simulation, one was flight tested outside the UACO program, and one was flight tested in Part 2 of UACO. Results and recommendations are discussed regarding diverse areas such as user training and human-machine interface, workload distribution, UAV flight safety, data link bandwidth, user interface constructs, adaptive algorithms, air vehicle system integration, and target recognition. Finally, a vision for UAV As A Wingman is presented.

  20. Analysis of Wallops Flight Test Data Through an Automated COTS System

    NASA Technical Reports Server (NTRS)

    Blackstock, Dexter Lee; Theobalds, Andre B.

    2005-01-01

    During the summer of 2004, NASA Langley Research Center flight tested a Synthetic Vision System (SVS) at the Reno/Tahoe International Airport (RNO) and the Wallops Flight Facility (WAL). The SVS included a Runway Incursion Prevention System (RIPS) to improve pilot situational awareness while operating near and on the airport surface. The flight tests consisted of air and ground operations to evaluate and validate the performance of the system. This paper describes the flight test and emphasizes how positioning data were collected, post-processed, and analyzed through the use of a COTS-derived software system. The system that was developed to analyze the data was constructed within the MATLAB(TM) environment. The software was modified to read the data, perform several if-then scenarios, and produce the relevant graphs, figures, and tables.

  1. An embedded vision system for an unmanned four-rotor helicopter

    NASA Astrophysics Data System (ADS)

    Lillywhite, Kirt; Lee, Dah-Jye; Tippetts, Beau; Fowers, Spencer; Dennis, Aaron; Nelson, Brent; Archibald, James

    2006-10-01

    In this paper an embedded vision system and control module is introduced that is capable of controlling an unmanned four-rotor helicopter and processing live video for various law enforcement, security, military, and civilian applications. The vision system is implemented on a newly designed compact FPGA board (Helios). The Helios board contains a Xilinx Virtex-4 FPGA chip and memory, making it capable of implementing real-time vision algorithms. A Smooth Automated Intelligent Leveling daughter board (SAIL), attached to the Helios board, collects attitude and heading information to be processed in order to control the unmanned helicopter. The SAIL board uses an electrolytic tilt sensor, compass, voltage level converters, and analog-to-digital converters to perform its operations. While level flight can be maintained, problems stemming from the characteristics of the tilt sensor limit the maneuverability of the helicopter. The embedded vision system has proven to give very good results in its performance of a number of real-time robotic vision algorithms.
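
    The attitude-and-heading function of the SAIL board, roll and pitch from the tilt sensor and heading from the compass, generally requires tilt-compensating the magnetometer reading before the yaw angle is extracted. A generic sketch of that step is shown below; it is not the Helios/SAIL firmware, and the sensor values are made up.

```python
import numpy as np

def tilt_compensated_heading(mag_body, roll_rad, pitch_rad):
    """Heading (deg, 0-360) from a 3-axis magnetometer reading in the body
    frame, using roll and pitch (e.g. from a tilt sensor) to rotate the
    measurement into the horizontal plane before taking the yaw angle."""
    mx, my, mz = mag_body
    cr, sr = np.cos(roll_rad), np.sin(roll_rad)
    cp, sp = np.cos(pitch_rad), np.sin(pitch_rad)
    # project the measured field onto the local horizontal plane
    xh = mx * cp + my * sr * sp + mz * cr * sp
    yh = my * cr - mz * sr
    return float(np.degrees(np.arctan2(-yh, xh)) % 360.0)

# hypothetical reading with the vehicle rolled 10 deg and pitched 5 deg
heading = tilt_compensated_heading([0.32, -0.10, 0.45],
                                   np.radians(10.0), np.radians(5.0))
print(f"tilt-compensated heading ~ {heading:.1f} deg")
```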

  2. Image processing for flight crew enhanced situation awareness

    NASA Technical Reports Server (NTRS)

    Roberts, Barry

    1993-01-01

    This presentation describes the image processing work that is being performed for the Enhanced Situational Awareness System (ESAS) application. Specifically, the presented work supports the Enhanced Vision System (EVS) component of ESAS.

  3. Enhanced Flight Vision Systems and Synthetic Vision Systems for NextGen Approach and Landing Operations

    NASA Technical Reports Server (NTRS)

    Kramer, Lynda J.; Bailey, Randall E.; Ellis, Kyle K. E.; Williams, Steven P.; Arthur, Jarvis J., III; Prinzel, Lawrence J., III; Shelton, Kevin J.

    2013-01-01

    Synthetic Vision Systems and Enhanced Flight Vision System (SVS/EFVS) technologies have the potential to provide additional margins of safety for aircrew performance and enable operational improvements for low visibility operations in the terminal area environment, with efficiency equivalent to that of visual operations. To meet this potential, research is needed for effective technology development and implementation of regulatory standards and design guidance to support introduction and use of SVS/EFVS advanced cockpit vision technologies in Next Generation Air Transportation System (NextGen) operations. A fixed-base pilot-in-the-loop simulation test was conducted at NASA Langley Research Center that evaluated the use of SVS/EFVS in NextGen low visibility approach and landing operations. Twelve crews flew approach and landing operations in a simulated NextGen Chicago O'Hare environment. Various scenarios tested the potential for using EFVS to conduct approach, landing, and roll-out operations in visibility as low as 1000 feet runway visual range (RVR). Also, SVS was tested to evaluate the potential for lowering decision heights (DH) on certain instrument approach procedures below what can be flown today. Expanding the portion of the visual segment in which EFVS can be used in lieu of natural vision, from 100 feet above the touchdown zone elevation to touchdown and rollout, in visibilities as low as 1000 feet RVR appears to be viable, as touchdown performance was acceptable without any apparent workload penalties. A lower DH of 150 feet and/or possibly reduced visibility minima using SVS appears to be viable when implemented on a Head-Up Display, but the landing data suggest further study is needed for head-down implementations.

  4. Synthetic and Enhanced Vision System for Altair Lunar Lander

    NASA Technical Reports Server (NTRS)

    Prinzel, Lawrence J., III; Kramer, Lynda J.; Norman, Robert M.; Arthur, Jarvis J., III; Williams, Steven P.; Shelton, Kevin J.; Bailey, Randall E.

    2009-01-01

    Past research has demonstrated the substantial potential of synthetic and enhanced vision (SV, EV) for aviation (e.g., Prinzel & Wickens, 2009). These augmented visual-based technologies have been shown to significantly enhance situation awareness, reduce workload, enhance aviation safety (e.g., reduced propensity for controlled-flight-into-terrain accidents/incidents), and promote flight path control precision. The issues that drove the design and development of synthetic and enhanced vision have commonalities with other application domains; most notably, during entry, descent, and landing on the moon and other planetary surfaces. NASA has extended SV/EV technology for use in planetary exploration vehicles, such as the Altair Lunar Lander. This paper describes an Altair Lunar Lander SV/EV concept and associated research demonstrating the safety benefits of these technologies.

  5. Implementing the President's Vision: JPL and NASA's Exploration Systems Mission Directorate

    NASA Technical Reports Server (NTRS)

    Sander, Michael J.

    2006-01-01

    As part of the NASA team, the Jet Propulsion Laboratory is involved in the Exploration Systems Mission Directorate (ESMD) work to implement the President's Vision for Space Exploration. In this slide presentation, the roles assigned to the various NASA centers to implement the vision are reviewed. The plan for JPL is to use the Constellation program to advance the combination of science and Constellation program objectives. JPL's current participation is to contribute systems engineering support; Command, Control, Computing and Information (C3I) architecture; Crew Exploration Vehicle (CEV) Thermal Protection System (TPS) project support and CEV landing assist support; ground support systems support at JSC and KSC; the Exploration Communication and Navigation System (ECANS); and flight prototypes for cabin atmosphere instruments.

  6. General Aviation Flight Test of Advanced Operations Enabled by Synthetic Vision

    NASA Technical Reports Server (NTRS)

    Glaab, Louis J.; Hughes, Monica F.; Parrish, Russell V.; Takallu, Mohammad A.

    2014-01-01

    A flight test was performed to compare the use of three advanced primary flight and navigation display concepts to a baseline, round-dial concept to assess the potential for advanced operations. The displays were evaluated during visual and instrument approach procedures, including an advanced instrument approach resembling a visual airport traffic pattern. Nineteen pilots from three pilot groups, reflecting the diverse piloting skills of the General Aviation pilot population, served as evaluation subjects. The experiment had two thrusts: 1) an examination of the capabilities of low-time (i.e., <400 hours), non-instrument-rated pilots to perform nominal instrument approaches, and 2) an exploration of potential advanced Visual Meteorological Conditions (VMC)-like approaches in Instrument Meteorological Conditions (IMC). Within this context, advanced display concepts are considered to include integrated navigation and primary flight displays with either aircraft attitude flight directors or Highway In The Sky (HITS) guidance, with and without a synthetic depiction of the external visuals (i.e., synthetic vision). Relative to the first thrust, the results indicate that, using an advanced display concept as tested herein, low-time, non-instrument-rated pilots can exhibit flight-technical performance, subjective workload, and situation awareness ratings as good as or better than those of high-time Instrument Flight Rules (IFR)-rated pilots using Baseline Round Dials for a nominal IMC approach. For the second thrust, the results indicate that advanced VMC-like approaches are feasible in IMC for all pilot groups tested, but only with the Synthetic Vision System (SVS) advanced display concept.

  7. Synthetic Vision Systems - Operational Considerations Simulation Experiment

    NASA Technical Reports Server (NTRS)

    Kramer, Lynda J.; Williams, Steven P.; Bailey, Randall E.; Glaab, Louis J.

    2007-01-01

    Synthetic vision is a computer-generated image of the external scene topography that is generated from aircraft attitude, high-precision navigation information, and data of the terrain, obstacles, cultural features, and other required flight information. A synthetic vision system (SVS) enhances this basic functionality with real-time integrity to ensure the validity of the databases, perform obstacle detection and independent navigation accuracy verification, and provide traffic surveillance. Over the last five years, NASA and its industry partners have developed and deployed SVS technologies for commercial, business, and general aviation aircraft which have been shown to provide significant improvements in terrain awareness and reductions in the potential for Controlled-Flight-Into-Terrain incidents/accidents compared to current generation cockpit technologies. It has been hypothesized that SVS displays can greatly improve the safety and operational flexibility of flight in Instrument Meteorological Conditions (IMC) to a level comparable to clear-day Visual Meteorological Conditions (VMC), regardless of actual weather conditions or time of day. An experiment was conducted to evaluate SVS and SVS-related technologies as well as the influence of where the information is provided to the pilot (e.g., on a Head-Up or Head-Down Display) for consideration in defining landing minima based upon aircraft and airport equipage. The "operational considerations" evaluated under this effort included reduced visibility, decision altitudes, and airport equipage requirements, such as approach lighting systems, for SVS-equipped aircraft. Subjective results from the present study suggest that synthetic vision imagery on both head-up and head-down displays may offer benefits in situation awareness; workload; and approach and landing performance in the visibility levels, approach lighting systems, and decision altitudes tested.

  8. Synthetic vision systems: operational considerations simulation experiment

    NASA Astrophysics Data System (ADS)

    Kramer, Lynda J.; Williams, Steven P.; Bailey, Randall E.; Glaab, Louis J.

    2007-04-01

    Synthetic vision is a computer-generated image of the external scene topography that is generated from aircraft attitude, high-precision navigation information, and data of the terrain, obstacles, cultural features, and other required flight information. A synthetic vision system (SVS) enhances this basic functionality with real-time integrity to ensure the validity of the databases, perform obstacle detection and independent navigation accuracy verification, and provide traffic surveillance. Over the last five years, NASA and its industry partners have developed and deployed SVS technologies for commercial, business, and general aviation aircraft which have been shown to provide significant improvements in terrain awareness and reductions in the potential for Controlled-Flight-Into-Terrain incidents / accidents compared to current generation cockpit technologies. It has been hypothesized that SVS displays can greatly improve the safety and operational flexibility of flight in Instrument Meteorological Conditions (IMC) to a level comparable to clear-day Visual Meteorological Conditions (VMC), regardless of actual weather conditions or time of day. An experiment was conducted to evaluate SVS and SVS-related technologies as well as the influence of where the information is provided to the pilot (e.g., on a Head-Up or Head-Down Display) for consideration in defining landing minima based upon aircraft and airport equipage. The "operational considerations" evaluated under this effort included reduced visibility, decision altitudes, and airport equipage requirements, such as approach lighting systems, for SVS-equipped aircraft. Subjective results from the present study suggest that synthetic vision imagery on both head-up and head-down displays may offer benefits in situation awareness; workload; and approach and landing performance in the visibility levels, approach lighting systems, and decision altitudes tested.

  9. Performance Evaluation of Speech Recognition Systems as a Next-Generation Pilot-Vehicle Interface Technology

    NASA Technical Reports Server (NTRS)

    Arthur, Jarvis J., III; Shelton, Kevin J.; Prinzel, Lawrence J., III; Bailey, Randall E.

    2016-01-01

    During the flight trials known as the Gulfstream-V Synthetic Vision Systems Integrated Technology Evaluation (GV-SITE), a Speech Recognition System (SRS) was used by the evaluation pilots. The SRS was intended to be an intuitive interface for display control (rather than knobs, buttons, etc.). This paper describes the performance of the current "state of the art" speech recognition system. The commercially available technology was evaluated as an application for possible inclusion in commercial aircraft flight decks as a crew-to-vehicle interface. Specifically, the technology is to be used as an interface from aircrew to the onboard displays, controls, and flight management tasks. A flight test of an SRS, as well as a laboratory test, was conducted.

  10. An Integrated Vision-Based System for Spacecraft Attitude and Topology Determination for Formation Flight Missions

    NASA Technical Reports Server (NTRS)

    Rogers, Aaron; Anderson, Kalle; Mracek, Anna; Zenick, Ray

    2004-01-01

    With the space industry's increasing focus upon multi-spacecraft formation flight missions, the ability to precisely determine system topology and the orientation of member spacecraft relative to both inertial space and each other is becoming a critical design requirement. Topology determination in satellite systems has traditionally made use of GPS or ground uplink position data for low Earth orbits, or, alternatively, inter-satellite ranging between all formation pairs. While these techniques work, they are not ideal for extension to interplanetary missions or to large fleets of decentralized, mixed-function spacecraft. The Vision-Based Attitude and Formation Determination System (VBAFDS) represents a novel solution to both the navigation and topology determination problems with an integrated approach that combines a miniature star tracker with a suite of robust processing algorithms. By combining a single range measurement with vision data to resolve complete system topology, the VBAFDS design represents a simple, resource-efficient solution that is not constrained to certain Earth orbits or formation geometries. In this paper, analysis and design of the VBAFDS integrated guidance, navigation and control (GN&C) technology will be discussed, including hardware requirements, algorithm development, and simulation results in the context of potential mission applications.
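
    The geometric core of combining a single range measurement with vision, as described above, is that the unit line-of-sight vector scaled by the measured range gives the tracked spacecraft's relative position in the observer's frame. The sketch below shows just that step with made-up measurements; the full VBAFDS additionally resolves attitude and fleet topology.

```python
import numpy as np

def relative_position(bearing_unit_body, range_m, body_to_inertial):
    """Relative position (inertial frame) of a tracked spacecraft from a unit
    line-of-sight vector in the observer's body frame, a scalar range, and the
    observer's attitude as a 3x3 body-to-inertial rotation matrix."""
    los_body = np.asarray(bearing_unit_body, dtype=float)
    los_body /= np.linalg.norm(los_body)        # guard against non-unit input
    return range_m * (np.asarray(body_to_inertial) @ los_body)

# hypothetical measurement: target 250 m away, 30 deg off the +x body axis,
# with the observer's body frame aligned to the inertial frame
los = [np.cos(np.radians(30.0)), np.sin(np.radians(30.0)), 0.0]
print("relative position [m]:", np.round(relative_position(los, 250.0, np.eye(3)), 1))
```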

  11. Vision-based aircraft guidance

    NASA Technical Reports Server (NTRS)

    Menon, P. K.

    1993-01-01

    Early research on the development of machine vision algorithms to serve as pilot aids in aircraft flight operations is discussed. The research is useful for synthesizing new cockpit instrumentation that can enhance flight safety and efficiency. With the present work as the basis, future research will produce a low-cost instrument by integrating a conventional TV camera with off-the-shelf digitizing hardware for flight test verification. The initial focus of the research will be on developing pilot aids for clear-night operations. The latter part of the research will examine synthetic vision issues for poor-visibility flight operations. Both research efforts will contribute towards the high-speed civil transport aircraft program. It is anticipated that the research reported here will also produce pilot aids for conducting helicopter flight operations during emergency search and rescue. The primary emphasis of the present research effort is on near-term, flight-demonstrable technologies. This report discusses pilot aids for night landing and takeoff, and synthetic vision as an aid to low-visibility landing.

  12. Evaluating the Effects of Dimensionality in Advanced Avionic Display Concepts for Synthetic Vision Systems

    NASA Technical Reports Server (NTRS)

    Alexander, Amy L.; Prinzel, Lawrence J., III; Wickens, Christopher D.; Kramer, Lynda J.; Arthur, Jarvis J.; Bailey, Randall E.

    2007-01-01

    Synthetic vision systems provide an in-cockpit view of terrain and other hazards via a computer-generated display representation. Two experiments examined several display concepts for synthetic vision and evaluated how such displays modulate pilot performance. Experiment 1 (24 general aviation pilots) compared three navigational display (ND) concepts: 2D coplanar, 3D, and split-screen. Experiment 2 (12 commercial airline pilots) evaluated baseline 'blue sky/brown ground' or synthetic vision-enabled primary flight displays (PFDs) and three ND concepts: 2D coplanar with and without synthetic vision and a dynamic multi-mode rotatable exocentric format. In general, the results pointed to an overall advantage for a split-screen format, whether it be stand-alone (Experiment 1) or available via rotatable viewpoints (Experiment 2). Furthermore, Experiment 2 revealed benefits associated with utilizing synthetic vision in both the PFD and ND representations and the value of combined ego- and exocentric presentations.

  13. Flight Simulator Evaluation of Synthetic Vision Display Concepts to Prevent Controlled Flight Into Terrain (CFIT)

    NASA Technical Reports Server (NTRS)

    Arthur, Jarvis J., III; Prinzel, Lawrence J., III; Kramer, Lynda J.; Parrish, Russell V.; Bailey, Randall E.

    2004-01-01

    In commercial aviation, over 30 percent of all fatal accidents worldwide are categorized as Controlled Flight Into Terrain (CFIT) accidents, where a fully functioning airplane is inadvertently flown into the ground. The major hypothesis for a simulation experiment conducted at NASA Langley Research Center was that a Primary Flight Display (PFD) with synthetic terrain will improve pilots' ability to detect and avoid potential CFITs compared to conventional instrumentation. All display conditions, including the baseline, contained a Terrain Awareness and Warning System (TAWS) and a Vertical Situation Display (VSD)-enhanced Navigation Display (ND). Each pilot flew twenty-two approach and departure maneuvers in Instrument Meteorological Conditions (IMC) to the terrain-challenged Eagle County Regional Airport (EGE) in Colorado. For the final run, flight guidance cues were altered such that the departure path went into terrain. All pilots with a synthetic vision system (SVS) PFD (twelve of sixteen pilots) noticed and avoided the potential CFIT situation. The four pilots who flew the anomaly with the conventional baseline PFD configuration (which included a TAWS and VSD-enhanced ND) had a CFIT event. Additionally, all the SVS display concepts enhanced the pilot's situational awareness, decreased workload, and improved flight technical error (FTE) compared to the baseline configuration.

  14. Synthetic vision systems: the effects of guidance symbology, display size, and field of view.

    PubMed

    Alexander, Amy L; Wickens, Christopher D; Hardy, Thomas J

    2005-01-01

    Two experiments conducted in a high-fidelity flight simulator examined the effects of guidance symbology, display size, and geometric field of view (GFOV) within a synthetic vision system (SVS). In Experiment 1, 18 pilots flew highlighted and low-lighted tunnel-in-the-sky displays, as well as a less cluttered follow-me aircraft (FMA), through a series of curved approaches over rugged terrain. The results revealed that both tunnels supported better flight path tracking and lower workload levels than did the FMA because of the availability of more preview information. Increasing tunnel intensity did not improve tracking and, in fact, degraded traffic awareness because of clutter and attentional tunneling. In Experiment 2, 24 pilots flew a low-lighted tunnel configured according to different display sizes (small or large) and GFOVs (30 degrees or 60 degrees). Measures of flight path tracking and terrain awareness generally favored the 60-degree GFOV; however, there were no effects of display size. Actual or potential applications of this research include understanding the impact of SVS properties on flight path tracking, traffic and terrain awareness, workload, and the allocation of attention.

  15. High contrast sensitivity for visually guided flight control in bumblebees.

    PubMed

    Chakravarthi, Aravin; Kelber, Almut; Baird, Emily; Dacke, Marie

    2017-12-01

    Many insects rely on vision to find food, to return to their nest and to carefully control their flight between these two locations. The amount of information available to support these tasks is, in part, dictated by the spatial resolution and contrast sensitivity of their visual systems. Here, we investigate the absolute limits of these visual properties for visually guided position and speed control in Bombus terrestris. Our results indicate that the limit of spatial vision in the translational motion detection system of B. terrestris lies at 0.21 cycles per degree, with a peak contrast sensitivity of at least 33. In light of earlier findings, these results indicate that bumblebees have higher contrast sensitivity in the motion detection system underlying position control than in their object discrimination system. This suggests that bumblebees, and most likely also other insects, have different visual thresholds depending on the behavioral context.
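
    For readers unfamiliar with the units, contrast sensitivity is the reciprocal of the lowest Michelson contrast the visual system responds to, and spatial frequency is expressed in grating cycles per degree of visual angle. The short snippet below is only a worked illustration of those definitions using the values reported above; it is not code or data from the study.

        # Worked illustration of the reported quantities (definitions only, not study code).
        def michelson_contrast(l_max, l_min):
            """Michelson contrast of a grating with maximum/minimum luminance l_max, l_min."""
            return (l_max - l_min) / (l_max + l_min)

        contrast_sensitivity = 33.0
        threshold_contrast = 1.0 / contrast_sensitivity   # ~0.03, i.e. about 3% contrast

        # A spatial resolution limit of 0.21 cycles per degree means one grating cycle
        # spans roughly 1/0.21 ~ 4.8 degrees of visual angle.
        cycle_width_deg = 1.0 / 0.21
        print(round(threshold_contrast, 3), round(cycle_width_deg, 1))   # 0.03 4.8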

  16. Research on detection method of UAV obstruction based on binocular vision

    NASA Astrophysics Data System (ADS)

    Zhu, Xiongwei; Lei, Xusheng; Sui, Zhehao

    2018-04-01

    For autonomous obstacle positioning and ranging during UAV (unmanned aerial vehicle) flight, a system based on binocular vision is constructed. A three-stage image preprocessing method is proposed to address noise and brightness differences in the actual captured images. The distance to the nearest obstacle is calculated using the disparity map generated by binocular vision. The contour of the obstacle is then extracted by post-processing the disparity map, and a color-based adaptive parameter adjustment algorithm is designed to extract obstacle contours automatically. Finally, safety distance measurement and obstacle positioning during UAV flight are achieved. Based on a series of tests, the distance measurement error remains within 2.24% over a measuring range of 5 m to 20 m.
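
    The distance computation in a binocular setup of this kind typically follows the standard stereo relation Z = f*B/d, where f is the focal length in pixels, B the camera baseline, and d the disparity. The following is a minimal sketch of that step using OpenCV block matching; it is not the authors' implementation, and the focal length and baseline values are placeholders that would come from stereo calibration in practice.

        # Minimal stereo-ranging sketch (illustrative only, not the paper's algorithm).
        import cv2
        import numpy as np

        FOCAL_PX = 700.0     # placeholder focal length [pixels]
        BASELINE_M = 0.12    # placeholder camera baseline [m]

        def nearest_obstacle_distance(left_gray, right_gray):
            """Distance to the nearest object from a rectified grey-scale stereo pair."""
            matcher = cv2.StereoBM_create(numDisparities=64, blockSize=15)
            disp = matcher.compute(left_gray, right_gray).astype(np.float32) / 16.0  # fixed point -> pixels
            valid = disp > 1.0                              # discard invalid/near-zero disparities
            if not np.any(valid):
                return None
            depth_m = FOCAL_PX * BASELINE_M / disp[valid]   # Z = f * B / d
            return float(depth_m.min())                     # nearest obstacle [m]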

  17. Insect vision: a few tricks to regulate flight altitude.

    PubMed

    Floreano, Dario; Zufferey, Jean-Christophe

    2010-10-12

    A recent study sheds new light on the visual cues used by Drosophila to regulate flight altitude. The striking similarity with previously identified steering mechanisms provides a coherent basis for novel models of vision-based flight control in insects and robots. Copyright © 2010 Elsevier Ltd. All rights reserved.

  18. Evidence Based Medicine in Space Flight: Evaluation of Inflight Vision Data for Operational Decision-Making

    NASA Technical Reports Server (NTRS)

    Van Baalen, Mary; Mason, Sara; Foy, Millennia; Wear, Mary; Taiym, Wafa; Moynihan, Shannan; Alexander, David; Hart, Steve; Tarver, William

    2015-01-01

    Due to recently identified vision changes associated with space flight, JSC Space and Clinical Operations (SCO) implemented broad mission-related vision testing starting in 2009. Optical Coherence Tomography (OCT), 3 Tesla Brain and Orbit MRIs, and Optical Biometry were implemented terrestrially for clinical monitoring. While no in-flight vision testing was initially in place, already-available on-orbit technology was leveraged to facilitate in-flight clinical monitoring, including visual acuity, Amsler grid, tonometry, and ultrasonography. In 2013, on-orbit testing capabilities were expanded to include contrast sensitivity testing and OCT. As these additional testing capabilities have been added, resource prioritization, particularly of crew time, is under evaluation.

  19. Crew and Display Concepts Evaluation for Synthetic / Enhanced Vision Systems

    NASA Technical Reports Server (NTRS)

    Bailey, Randall E.; Kramer, Lynda J.; Prinzel, Lawrence J., III

    2006-01-01

    NASA's Synthetic Vision Systems (SVS) project is developing technologies with practical applications that strive to eliminate low-visibility conditions as a causal factor in civil aircraft accidents and replicate the operational benefits of clear day flight operations, regardless of the actual outside visibility condition. Enhanced Vision System (EVS) technologies are analogous and complementary in many respects to SVS, with the principal difference being that EVS is an imaging sensor presentation, as opposed to a database-derived image. The use of EVS in civil aircraft is projected to increase rapidly as the Federal Aviation Administration recently changed the aircraft operating rules under Part 91, revising the flight visibility requirements for conducting operations to civil airports. Operators conducting straight-in instrument approach procedures may now operate below the published approach minimums when using an approved EVS that shows the required visual references on the pilot's Head-Up Display. An experiment was conducted to evaluate the complementary use of SVS and EVS technologies, specifically focusing on new techniques for integration and/or fusion of synthetic and enhanced vision technologies and crew resource management while operating under the newly adopted FAA rules which provide operating credit for EVS. Overall, the experimental data showed that significant improvements in situation awareness (SA) without concomitant increases in workload and display clutter could be provided by the integration and/or fusion of synthetic and enhanced vision technologies for the pilot-flying and the pilot-not-flying.

  20. Database Integrity Monitoring for Synthetic Vision Systems Using Machine Vision and SHADE

    NASA Technical Reports Server (NTRS)

    Cooper, Eric G.; Young, Steven D.

    2005-01-01

    In an effort to increase situational awareness, the aviation industry is investigating technologies that allow pilots to visualize what is outside of the aircraft during periods of low visibility. One of these technologies, referred to as Synthetic Vision Systems (SVS), provides the pilot with real-time computer-generated images of obstacles, terrain features, runways, and other aircraft regardless of weather conditions. To help ensure the integrity of such systems, methods of verifying the accuracy of synthetically derived display elements using onboard remote sensing technologies are under investigation. One such method is based on a shadow detection and extraction (SHADE) algorithm that transforms computer-generated digital elevation data into a reference domain that enables direct comparison with radar measurements. This paper describes machine vision techniques for making this comparison and discusses preliminary results from application to actual flight data.
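
    One common way to derive an occlusion (shadow) map from digital elevation data is a running line-of-sight test along each range line from the sensor: a terrain cell is shadowed if its elevation angle, as seen from the sensor, falls below the maximum angle of any nearer cell. The sketch below illustrates that generic computation for a single terrain profile; it is not the SHADE algorithm described in the paper, and the sensor geometry is a simplifying assumption.

        # Generic terrain-shadowing test along one range profile (not the SHADE algorithm).
        import numpy as np

        def shadow_mask(ranges_m, elev_m, sensor_alt_m):
            """True where terrain cells are hidden from a sensor located above range zero."""
            angles = np.arctan2(np.asarray(elev_m) - sensor_alt_m, np.asarray(ranges_m))
            shadowed = np.zeros(len(angles), dtype=bool)
            max_angle = -np.inf
            for i, a in enumerate(angles):      # walk outward in range
                if a <= max_angle:              # hidden behind nearer, higher terrain
                    shadowed[i] = True
                else:
                    max_angle = a
            return shadowed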

  1. A Height Estimation Approach for Terrain Following Flights from Monocular Vision.

    PubMed

    Campos, Igor S G; Nascimento, Erickson R; Freitas, Gustavo M; Chaimowicz, Luiz

    2016-12-06

    In this paper, we present a monocular vision-based height estimation algorithm for terrain following flights. The impressive growth of Unmanned Aerial Vehicle (UAV) usage, notably in mapping applications, will soon require the creation of new technologies to enable these systems to better perceive their surroundings. Specifically, we chose to tackle the terrain following problem, as it is still unresolved for consumer-available systems. Virtually every mapping aircraft carries a camera; therefore, we chose to exploit this in order to use presently available hardware to extract the height information toward performing terrain following flights. The proposed methodology consists of using optical flow to track features from videos obtained by the UAV, as well as its motion information to estimate the flying height. To determine if the height estimation is reliable, we trained a decision tree that takes the optical flow information as input and classifies whether the output is trustworthy or not. The classifier achieved accuracies of 80% for positives and 90% for negatives, while the height estimation algorithm presented good accuracy.
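
    As a rough illustration of the underlying geometry, for a downward-looking camera translating over approximately level terrain, the translational optical flow scales with ground speed divided by height, so height can be recovered as h ~ f*v/flow. The sketch below tracks sparse features with Lucas-Kanade optical flow in OpenCV and inverts that relation; it is not the authors' code, the focal length is a placeholder, the nadir-camera and flat-ground assumptions are mine, and the paper's decision-tree reliability check is omitted.

        # Illustrative monocular height estimate from optical flow (ground-plane assumption).
        import cv2
        import numpy as np

        FOCAL_PX = 800.0   # placeholder focal length [pixels]

        def estimate_height_m(prev_gray, gray, ground_speed_mps, dt_s):
            """Rough height for a nadir-pointing camera moving at a known ground speed."""
            pts = cv2.goodFeaturesToTrack(prev_gray, maxCorners=200, qualityLevel=0.01, minDistance=10)
            if pts is None:
                return None
            nxt, status, _ = cv2.calcOpticalFlowPyrLK(prev_gray, gray, pts, None)
            good = status.ravel() == 1
            if not np.any(good):
                return None
            flow_px = np.linalg.norm((nxt[good] - pts[good]).reshape(-1, 2), axis=1)
            flow_rate = np.median(flow_px) / dt_s            # pixels per second
            if flow_rate <= 0:
                return None
            return FOCAL_PX * ground_speed_mps / flow_rate   # flow ~ f*v/h  ->  h ~ f*v/flow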

  2. Toward Head-Up and Head-Worn Displays for Equivalent Visual Operations

    NASA Technical Reports Server (NTRS)

    Prinzel, Lawrence J., III; Arthur, Jarvis J.; Bailey, Randall E.; Shelton, Kevin J.; Kramer, Lynda J.; Jones, Denise R.; Williams, Steven P.; Harrison, Stephanie J.; Ellis, Kyle K.

    2015-01-01

    A key capability envisioned for the future air transportation system is the concept of equivalent visual operations (EVO). EVO is the capability to achieve the safety of current-day Visual Flight Rules (VFR) operations and maintain the operational tempos of VFR irrespective of the weather and visibility conditions. Enhanced Flight Vision Systems (EFVS) offer a path to achieve EVO. NASA has successfully flight-tested EFVS for commercial flight operations, which has helped establish the technical merits of EFVS, without reliance on natural vision, for operations to runways without Category II/III ground-based navigation and lighting infrastructure. The research has tested EFVS for operations with both Head-Up Displays (HUDs) and "HUD equivalent" Head-Worn Displays (HWDs). The paper describes the EVO concept and representative NASA EFVS research that demonstrates the potential of these technologies to safely conduct operations in visibilities as low as 1000 feet Runway Visual Range (RVR). Future directions are described, including efforts to enable low-visibility approach, landing, and roll-out using EFVS under conditions as low as 300 feet RVR.

  3. Symbology Development for General Aviation Synthetic Vision Primary Flight Displays for the Approach and Missed-Approach Modes of Flight

    NASA Technical Reports Server (NTRS)

    Bartolone, Anthony P.; Hughes, Monica F.; Wong, Douglas T.; Takallu, Mohammad A.

    2004-01-01

    Spatial disorientation induced by inadvertent flight into instrument meteorological conditions (IMC) continues to be a leading cause of fatal accidents in general aviation. The Synthetic Vision Systems General Aviation (SVS-GA) research element, an integral part of NASA's Aviation Safety and Security Program (AvSSP), is investigating a revolutionary display technology designed to mitigate low-visibility events such as controlled flight into terrain (CFIT) and low-visibility loss of control (LVLoC). The integrated SVS Primary Flight Display (SVS-PFD) utilizes computer-generated 3-dimensional imagery of the surrounding terrain augmented with flight path guidance symbology. This unique combination will provide GA pilots with an accurate representation of their environment and projection of their flight path, regardless of time of day or out-the-window (OTW) visibility. The initial Symbology Development for Head-Down Displays (SD-HDD) simulation experiment examined 16 display configurations on a centrally located high-resolution PFD installed in NASA's General Aviation Work Station (GAWS) flight simulator. The results of the experiment indicate that situation awareness (SA) can be enhanced without having a negative impact on flight technical error (FTE), by providing a general aviation pilot with an integrated SVS display to use when OTW visibility is obscured.

  4. Development and Evaluation of 2-D and 3-D Exocentric Synthetic Vision Navigation Display Concepts for Commercial Aircraft

    NASA Technical Reports Server (NTRS)

    Prinzel, Lawrence J., III; Kramer, Lynda J.; Arthur, J. J., III; Bailey, Randall E.; Sweeters, Jason L.

    2005-01-01

    NASA's Synthetic Vision Systems (SVS) project is developing technologies with practical applications that will help to eliminate low-visibility conditions as a causal factor in civil aircraft accidents while replicating the operational benefits of clear day flight operations, regardless of the actual outside visibility condition. The paper describes an experimental evaluation of a multi-mode 3-D exocentric synthetic vision navigation display concept for commercial aircraft. Experimental results evinced the situation awareness benefits of 2-D and 3-D exocentric synthetic vision displays over traditional 2-D co-planar navigation and vertical situation displays. Conclusions and future research directions are discussed.

  5. [Personnel with poor vision at fighter pilot school].

    PubMed

    Corbé, C; Menu, J P

    1997-10-01

    The piloting of fighter aircraft, the navigation of the space shuttle, and the piloting of a helicopter in tactical flight at an altitude of 50 metres require the use of all of the sensory systems: ocular, vestibular, proprioceptive, and others. The selection and follow-up of the pilots of these aerial vehicles therefore require a very complete study of medical parameters, in particular of the sensory and notably the visual system. The physicians and expert researchers in aeronautical and space medicine of the Army Health Department, who are in charge of the medical supervision of flight crews, must study, create, and improve tests of visual sensory function developed from fundamental and applied research. These tests, validated with military pilots, have been applied in ophthalmology to the assessment of normal and deficient vision. A proposal, following on from this work, to change the World Health Organisation norms applied to the vision of persons with low vision is also presented.

  6. Helicopter human factors

    NASA Technical Reports Server (NTRS)

    Hart, Sandra G.

    1988-01-01

    The state-of-the-art helicopter and its pilot are examined using the tools of human-factors analysis. The significant role of human error in helicopter accidents is discussed; the history of human-factors research on helicopters is briefly traced; the typical flight tasks are described; and the noise, vibration, and temperature conditions typical of modern military helicopters are characterized. Also considered are helicopter controls, cockpit instruments and displays, and the impact of cockpit design on pilot workload. Particular attention is given to possible advanced-technology improvements, such as control stabilization and augmentation, FBW and fly-by-light systems, multifunction displays, night-vision goggles, pilot night-vision systems, night-vision displays with superimposed symbols, target acquisition and designation systems, and aural displays. Diagrams, drawings, and photographs are provided.

  7. 75 FR 47176 - Special Conditions: Dassault Aviation Model Falcon 7X; Enhanced Flight Visibility System (EFVS)

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-08-05

    ...), imaging sensor(s), and avionics interfaces that display the sensor imagery on the HUD and overlay it with... that display the sensor imagery, with or without other flight information, on a head-down display. To... infrared sensors can be much different from that detected by natural pilot vision. On a dark night, thermal...

  8. CFIT Prevention Using Synthetic Vision

    NASA Technical Reports Server (NTRS)

    Arthur, Jarvis J., III; Prinzel, Lawrence J., III; Kramer, Lynda J.; Bailey, Randall E.; Parrish, Russell V.

    2003-01-01

    In commercial aviation, over 30-percent of all fatal accidents worldwide are categorized as Controlled Flight Into Terrain (CFIT) accidents where a fully functioning airplane is inadvertently flown into the ground, water, or an obstacle. An experiment was conducted at NASA Langley Research Center investigating the presentation of a synthetic terrain database scene to the pilot on a Primary Flight Display (PFD). The major hypothesis for the experiment was that a synthetic vision system (SVS) will improve the pilot's ability to detect and avoid a potential CFIT compared to conventional flight instrumentation. All display conditions, including the baseline, contained a Terrain Awareness and Warning System (TAWS) and Vertical Situation Display (VSD) enhanced Navigation Display (ND). Sixteen pilots each flew 22 approach-departure maneuvers in Instrument Meteorological Conditions (IMC) to the terrain-challenged Eagle County Regional Airport (EGE) in Colorado. For the final run, the flight guidance cues were altered such that the departure path went into the terrain. All pilots with an SVS-enhanced PFD (12 of 16 pilots) noticed and avoided the potential CFIT situation. All of the pilots who flew the anomaly with the baseline display configuration (which included a TAWS and VSD enhanced ND) had a CFIT event.

  9. Vision-based flight control in the hawkmoth Hyles lineata

    PubMed Central

    Windsor, Shane P.; Bomphrey, Richard J.; Taylor, Graham K.

    2014-01-01

    Vision is a key sensory modality for flying insects, playing an important role in guidance, navigation and control. Here, we use a virtual-reality flight simulator to measure the optomotor responses of the hawkmoth Hyles lineata, and use a published linear-time invariant model of the flight dynamics to interpret the function of the measured responses in flight stabilization and control. We recorded the forces and moments produced during oscillation of the visual field in roll, pitch and yaw, varying the temporal frequency, amplitude or spatial frequency of the stimulus. The moths’ responses were strongly dependent upon contrast frequency, as expected if the optomotor system uses correlation-type motion detectors to sense self-motion. The flight dynamics model predicts that roll angle feedback is needed to stabilize the lateral dynamics, and that a combination of pitch angle and pitch rate feedback is most effective in stabilizing the longitudinal dynamics. The moths’ responses to roll and pitch stimuli coincided qualitatively with these functional predictions. The moths produced coupled roll and yaw moments in response to yaw stimuli, which could help to reduce the energetic cost of correcting heading. Our results emphasize the close relationship between physics and physiology in the stabilization of insect flight. PMID:24335557

  10. Vision-based flight control in the hawkmoth Hyles lineata.

    PubMed

    Windsor, Shane P; Bomphrey, Richard J; Taylor, Graham K

    2014-02-06

    Vision is a key sensory modality for flying insects, playing an important role in guidance, navigation and control. Here, we use a virtual-reality flight simulator to measure the optomotor responses of the hawkmoth Hyles lineata, and use a published linear-time invariant model of the flight dynamics to interpret the function of the measured responses in flight stabilization and control. We recorded the forces and moments produced during oscillation of the visual field in roll, pitch and yaw, varying the temporal frequency, amplitude or spatial frequency of the stimulus. The moths' responses were strongly dependent upon contrast frequency, as expected if the optomotor system uses correlation-type motion detectors to sense self-motion. The flight dynamics model predicts that roll angle feedback is needed to stabilize the lateral dynamics, and that a combination of pitch angle and pitch rate feedback is most effective in stabilizing the longitudinal dynamics. The moths' responses to roll and pitch stimuli coincided qualitatively with these functional predictions. The moths produced coupled roll and yaw moments in response to yaw stimuli, which could help to reduce the energetic cost of correcting heading. Our results emphasize the close relationship between physics and physiology in the stabilization of insect flight.
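
    The "correlation-type motion detectors" referred to in these two records are commonly modelled as Hassenstein-Reichardt correlators: the signal from one photoreceptor is delayed and multiplied with the undelayed signal from a neighbouring photoreceptor, and the two mirror-symmetric products are subtracted. The toy sketch below is only a generic illustration of that model (the delay, receptor spacing, and stimulus are arbitrary choices, not the authors' parameters); its time-averaged output varies with contrast frequency and with contrast squared rather than with image speed alone, which is the property the abstract alludes to.

        # Toy Hassenstein-Reichardt correlator driven by a drifting sinusoidal grating.
        import numpy as np

        def hr_mean_response(contrast_freq_hz, contrast=1.0, delay_s=0.03,
                             phase_sep=np.pi / 4, duration_s=2.0, dt=1e-3):
            """Mean output of an elementary motion detector for a drifting sine grating."""
            t = np.arange(0.0, duration_s, dt)
            s1 = contrast * np.sin(2 * np.pi * contrast_freq_hz * t)              # receptor 1
            s2 = contrast * np.sin(2 * np.pi * contrast_freq_hz * t - phase_sep)  # receptor 2
            k = int(round(delay_s / dt))                     # delay expressed in samples
            d1, d2 = np.roll(s1, k), np.roll(s2, k)          # delayed copies
            out = d1[k:] * s2[k:] - d2[k:] * s1[k:]          # opponent delay-and-correlate
            return out.mean()   # ~ contrast^2 * sin(phase_sep) * sin(2*pi*f*delay)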

  11. Comparative system identification of flower tracking performance in three hawkmoth species reveals adaptations for dim light vision.

    PubMed

    Stöckl, Anna L; Kihlström, Klara; Chandler, Steven; Sponberg, Simon

    2017-04-05

    Flight control in insects is heavily dependent on vision. Thus, in dim light, the decreased reliability of visual signal detection also prompts consequences for insect flight. We have an emerging understanding of the neural mechanisms that different species employ to adapt the visual system to low light. However, much less explored are comparative analyses of how low light affects the flight behaviour of insect species, and the corresponding links between physiological adaptations and behaviour. We investigated whether the flower tracking behaviour of three hawkmoth species with different diel activity patterns revealed luminance-dependent adaptations, using a system identification approach. We found clear luminance-dependent differences in flower tracking in all three species, which were explained by a simple luminance-dependent delay model, which generalized across species. We discuss physiological and anatomical explanations for the variance in tracking responses, which could not be explained by such simple models. Differences between species could not be explained by the simple delay model. However, in several cases, they could be explained through the addition of a second model parameter, a simple scaling term, that captures the responsiveness of each species to flower movements. Thus, we demonstrate here that much of the variance in the luminance-dependent flower tracking responses of hawkmoths with different diel activity patterns can be captured by simple models of neural processing. This article is part of the themed issue 'Vision in dim light'. © 2017 The Author(s).

  12. Integration of a 3D perspective view in the navigation display: featuring pilot's mental model

    NASA Astrophysics Data System (ADS)

    Ebrecht, L.; Schmerwitz, S.

    2015-05-01

    Synthetic vision systems (SVS) are a spreading technology in the avionics domain. Several studies have demonstrated enhanced situational awareness when using synthetic vision. Since the introduction of synthetic vision, the primary flight display (PFD) and the navigation display (ND) have undergone steady change and evolution. The main improvements of the ND comprise the representation of colored enhanced ground proximity warning system (EGPWS) terrain, weather radar, and TCAS information. Synthetic vision seems to offer high potential to further enhance cockpit display systems. In particular, given the current trend of providing a 3D perspective view in the SVS-PFD while leaving the navigational content and methods of interaction unchanged, the question arises whether and how the gap between the two displays might evolve into a serious problem. This issue becomes important in relation to the transition between, and combination of, strategic and tactical flight guidance. Hence, the pros and cons of 2D and 3D views in general, as well as the gap between the egocentric perspective 3D view of the PFD and the exocentric 2D top and side views of the ND, are discussed. Furthermore, a concept for the integration of a 3D perspective view, i.e., a bird's-eye view, into a synthetic vision ND is presented. The combination of 2D and 3D views in the ND enables better correlation between the ND and the PFD. Additionally, this supports the building of the pilot's mental model. The authors believe it will improve situational and spatial awareness. It might prove to further raise the safety margin when operating in mountainous areas.

  13. Data acquisition and analysis of range-finding systems for spacing construction

    NASA Technical Reports Server (NTRS)

    Shen, C. N.

    1981-01-01

    For future space missions, completely autonomous robotic machines will be required to free astronauts from routine chores of equipment maintenance, servicing of faulty systems, etc., and to extend human capabilities in hazardous environments full of cosmic and other harmful radiation. In locations with high radiation and uncontrollable ambient illumination, TV-camera-based vision systems cannot work effectively. However, a vision system utilizing directly measured range information from a time-of-flight laser rangefinder can successfully operate in these environments. Such a system will be independent of illumination conditions, and the interfering effects of intense radiation of all kinds will be eliminated by the tuned input of the laser instrument. By processing the range data according to certain decision, stochastic estimation, and heuristic schemes, the laser-based vision system will recognize known objects and thus provide sufficient information to the robot's control system, which can develop strategies for various objectives.
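
    Time-of-flight ranging as mentioned here reduces to range = c * round-trip time / 2, the round-trip travel time of the laser pulse scaled by the speed of light. The lines below are a minimal illustration of that relation only and are not tied to any particular instrument described in the report.

        # Time-of-flight ranging: range = c * round-trip time / 2 (illustration only).
        C_M_PER_S = 299_792_458.0

        def tof_range_m(round_trip_time_s):
            return C_M_PER_S * round_trip_time_s / 2.0

        # e.g. a 1 ns timing resolution corresponds to roughly 0.15 m of range resolution:
        # tof_range_m(1e-9) ~ 0.15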

  14. One Hundred Years of Powered Flight

    NASA Technical Reports Server (NTRS)

    2003-01-01

    This year, Centennial of Flight celebrations across the United States are marking the tremendous achievement of the Wright brothers' successful, powered, heavier-than-air flight on December 17, 1903. The vision and persistence of these two men pioneered the way for explorers, inventors, and innovators to take aeronautics from the beaches of Kitty Hawk, North Carolina, to the outer reaches of the solar system. Along this 100-year journey, NASA has played a significant role in developing and supporting the technologies that have shaped the aviation industry.

  15. Helicopter synthetic vision based DVE processing for all phases of flight

    NASA Astrophysics Data System (ADS)

    O'Brien, Patrick; Baughman, David C.; Wallace, H. Bruce

    2013-05-01

    Helicopters experience nearly 10 times the accident rate of fixed-wing platforms, due largely to the nature of their mission, frequently requiring operations in close proximity to terrain and obstacles. Degraded visual environments (DVE), including brownout or whiteout conditions generated by rotor downwash, result in loss of situational awareness during the most critical phase of flight, and contribute significantly to this accident rate. Considerable research into sensor and system solutions to address DVE has been conducted in recent years; however, the promise of a Synthetic Vision Avionics Backbone (SVAB) extends far beyond DVE, enabling improved situational awareness and mission effectiveness during all phases of flight and in all visibility conditions. The SVAB fuses sensor information with high-resolution terrain databases and renders it in synthetic vision format for display to the crew. Honeywell was awarded the DARPA MFRF Technical Area 2 contract in 2011 to develop an SVAB. This work includes the creation of a common sensor interface, development of SVAB hardware and software, and flight demonstration on a Black Hawk helicopter. A "sensor-agnostic" SVAB allows platform and mission diversity with an efficient upgrade path, even while research continues into new and improved sensors for use in DVE conditions. Through careful integration of multiple sources of information such as sensors, terrain and obstacle databases, mission planning information, and aircraft state information, operations in all conditions and phases of flight can be enhanced. This paper describes the SVAB and its functionality resulting from the DARPA contract as well as Honeywell R&D investment.

  16. Two-Phase Flow Technology Developed and Demonstrated for the Vision for Exploration

    NASA Technical Reports Server (NTRS)

    Sankovic, John M.; McQuillen, John B.; Lekan, Jack F.

    2005-01-01

    NASA's vision for exploration will once again expand the bounds of human presence in the universe with planned missions to the Moon and Mars. To attain the numerous goals of this vision, NASA will need to develop technologies in several areas, including advanced power-generation and thermal-control systems for spacecraft and life support. These systems will have to be demonstrated prior to implementation to ensure safe and reliable operation in reduced-gravity environments. The Two-Phase Flow Facility (TΦFFy) Project will provide the path to these enabling technologies for critical multiphase fluid products. The safety and reliability of future systems will be enhanced by addressing focused microgravity fluid physics issues associated with flow boiling, condensation, phase separation, and system stability, all of which are essential to exploration technology. The project--a multiyear effort initiated in 2004--will include concept development, normal-gravity testing (laboratories), reduced-gravity aircraft flight campaigns (NASA's KC-135 and C-9 aircraft), space-flight experimentation (International Space Station), and model development. This project will be implemented by a team from the NASA Glenn Research Center, QSS Group, Inc., ZIN Technologies, Inc., and the Extramural Strategic Research Team composed of experts from academia.

  17. 77 FR 55895 - Meeting: RTCA Program Management Committee (PMC)

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-09-11

    ... Enhanced Flight Vision System to Enable All-Weather Approach, Landing and Roll-Out to a Safe Taxi Speed... but limited to space availability. With the approval of the chairman, members of the public may...

  18. DTO 700-11, Kavandi conducts OSVS OPS

    NASA Image and Video Library

    2016-08-24

    STS091-349-005 (2-12 June 1998) --- Astronaut Janet L. Kavandi, mission specialist, performs a check of the Orbiter Space Vision Systems (OSVS) on the flight deck of the Earth-orbiting Space Shuttle Discovery.

  19. Visual cues in low-level flight - Implications for pilotage, training, simulation, and enhanced/synthetic vision systems

    NASA Technical Reports Server (NTRS)

    Foyle, David C.; Kaiser, Mary K.; Johnson, Walter W.

    1992-01-01

    This paper reviews some of the sources of visual information that are available in the out-the-window scene and describes how these visual cues are important for routine pilotage and training, as well as the development of simulator visual systems and enhanced or synthetic vision systems for aircraft cockpits. It is shown how these visual cues may change or disappear under environmental or sensor conditions, and how the visual scene can be augmented by advanced displays to capitalize on the pilot's excellent ability to extract visual information from the visual scene.

  20. Effectively Transforming IMC Flight into VMC Flight: An SVS Case Study

    NASA Technical Reports Server (NTRS)

    Glaab, Louis J.; Hughes, Monica F.; Parrish, Russell V.; Takallu, Mohammad A.

    2006-01-01

    A flight-test experiment was conducted using the NASA LaRC Cessna 206 aircraft. Four primary flight and navigation display concepts, including baseline and Synthetic Vision System (SVS) concepts, were evaluated in the local area of the Roanoke, Virginia, airport, flying visual and instrument approach procedures. A total of 19 pilots, from 3 pilot groups reflecting the diverse piloting skills of the GA population, served as evaluation pilots. Multi-variable Discriminant Analysis was applied to three carefully selected and markedly different operating conditions with conventional instrumentation to provide an extension of traditional analysis methods as well as an assessment of the effectiveness of SVS displays in transforming IMC flight into VMC flight.
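
    Multi-variable discriminant analysis of the kind named here can be reproduced in outline with a linear discriminant classifier trained on per-run flight-performance measures labelled by operating condition. The sketch below is a generic scikit-learn illustration only; the feature set (lateral/vertical flight technical error and a workload rating), the condition labels, and the numbers are hypothetical and are not the study's data or method.

        # Generic discriminant-analysis sketch (hypothetical features and data).
        import numpy as np
        from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

        # Hypothetical per-run features: [lateral FTE RMS, vertical FTE RMS, workload rating]
        X = np.array([[25.0, 18.0, 3.0],    # VMC, conventional displays
                      [80.0, 55.0, 6.5],    # IMC, conventional displays
                      [30.0, 20.0, 3.5],    # IMC, SVS display
                      [75.0, 60.0, 7.0],
                      [28.0, 22.0, 4.0],
                      [27.0, 19.0, 3.2]])
        y = np.array(["VMC", "IMC-baseline", "IMC-SVS", "IMC-baseline", "IMC-SVS", "VMC"])

        lda = LinearDiscriminantAnalysis().fit(X, y)
        # Runs whose discriminant scores fall within the VMC group suggest the display
        # "transformed" IMC flight into VMC-like performance.
        print(lda.predict([[29.0, 21.0, 3.4]]))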

  1. Towards a Decision Support System for Space Flight Operations

    NASA Technical Reports Server (NTRS)

    Meshkat, Leila; Hogle, Charles; Ruszkowski, James

    2013-01-01

    The Mission Operations Directorate (MOD) at the Johnson Space Center (JSC) has put in place a Model Based Systems Engineering (MBSE) technological framework for the development and execution of the Flight Production Process (FPP). This framework has provided much added value and return on investment to date. This paper describes a vision for a model-based Decision Support System (DSS) for the development and execution of the FPP and its design and development process. The envisioned system extends the existing MBSE methodology and technological framework that is currently in use. The MBSE technological framework currently in place enables the systematic collection and integration of data required for building an FPP model for a diverse set of missions. This framework includes the technology, people and processes required for rapid development of architectural artifacts. It is used to build a feasible FPP model for the first flight of a spacecraft and for recurrent flights throughout the life of the program. This model greatly enhances our ability to effectively engage with a new customer. It provides a preliminary work breakdown structure, data flow information and a master schedule based on its existing knowledge base. These artifacts are then refined and iterated upon with the customer for the development of a robust end-to-end, high-level integrated master schedule and its associated dependencies. The vision is to enhance this framework to enable its application for uncertainty management, decision support and optimization of the design and execution of the FPP by the program. Furthermore, this enhanced framework will enable the agile response and redesign of the FPP based on observed system behavior. A discrepancy between the anticipated and the observed system behavior may be due to the internal processing of tasks, or to external factors such as changes in program requirements or conditions associated with other organizations that are outside of MOD. The paper provides a roadmap for the three increments of this vision. These increments include (1) hardware and software system components and interfaces with the NASA ground system, (2) uncertainty management, and (3) re-planning and automated execution. Each of these increments provides value independently, and some may also enable the building of a subsequent increment.

  2. A Height Estimation Approach for Terrain Following Flights from Monocular Vision

    PubMed Central

    Campos, Igor S. G.; Nascimento, Erickson R.; Freitas, Gustavo M.; Chaimowicz, Luiz

    2016-01-01

    In this paper, we present a monocular vision-based height estimation algorithm for terrain following flights. The impressive growth of Unmanned Aerial Vehicle (UAV) usage, notably in mapping applications, will soon require the creation of new technologies to enable these systems to better perceive their surroundings. Specifically, we chose to tackle the terrain following problem, as it is still unresolved for consumer available systems. Virtually every mapping aircraft carries a camera; therefore, we chose to exploit this in order to use presently available hardware to extract the height information toward performing terrain following flights. The proposed methodology consists of using optical flow to track features from videos obtained by the UAV, as well as its motion information to estimate the flying height. To determine if the height estimation is reliable, we trained a decision tree that takes the optical flow information as input and classifies whether the output is trustworthy or not. The classifier achieved accuracies of 80% for positives and 90% for negatives, while the height estimation algorithm presented good accuracy. PMID:27929424

  3. Systems and Techniques for Identifying and Avoiding Ice

    NASA Technical Reports Server (NTRS)

    Hansman, R. John

    1995-01-01

    In-flight icing is one of the most difficult aviation weather hazards facing general aviation. Because most aircraft in the general aviation category are not certified for flight into known icing conditions, techniques for identifying and avoiding in-flight ice are important to maintain safety while increasing the utility and dispatch capability which is part of the AGATE vision. This report summarizes a brief study effort which: (1) Reviewed current ice identification, forecasting, and avoidance techniques; (2) Assessed feasibility of improved forecasting and ice avoidance procedures; and (3) Identified key issues for the development of improved capability with regard to in-flight icing.

  4. Flight test of a passive millimeter-wave imaging system

    NASA Astrophysics Data System (ADS)

    Martin, Christopher A.; Manning, Will; Kolinko, Vladimir G.; Hall, Max

    2005-05-01

    A real-time passive millimeter-wave imaging system with a wide-field of view and 3K temperature sensitivity is described. The system was flown on a UH-1H helicopter in a flight test conducted by the U.S. Army RDECOM CERDEC Night Vision and Electronic Sensors Directorate (NVESD). We collected approximately eight hours of data over the course of the two-week flight test. Flight data was collected in horizontal and vertical polarizations at look down angles from 0 to 40 degrees. Speeds varied from 0 to 90 knots and altitudes varied from 0' to 1000'. Targets imaged include roads, freeways, railroads, houses, industrial buildings, power plants, people, streams, rivers, bridges, cars, trucks, trains, boats, planes, runways, treelines, shorelines, and the horizon. The imaging system withstood vibration and temperature variations, but experienced some RF interference. The flight test demonstrated the system's capabilities as an airborne navigation and surveillance aid. It also performed in a personnel recovery scenario.

  5. Flight Deck Display Technologies for 4DT and Surface Equivalent Visual Operations

    NASA Technical Reports Server (NTRS)

    Prinzel, Lawrence J., III; Jones, Denis R.; Shelton, Kevin J.; Arthur, Jarvis J., III; Bailey, Randall E.; Allamandola, Angela S.; Foyle, David C.; Hooey, Becky L.

    2009-01-01

    NASA research is focused on flight deck display technologies that may significantly enhance situation awareness, enable new operating concepts, and reduce the potential for incidents/accidents for terminal area and surface operations. The display technologies include surface map, head-up, and head-worn displays; 4DT guidance algorithms; synthetic and enhanced vision technologies; and terminal maneuvering area traffic conflict detection and alerting systems. This work is critical to ensure that the flight deck interface technologies and the role of the human participants can support the full realization of the Next Generation Air Transportation System (NextGen) and its novel operating concepts.

  6. Real-time Enhancement, Registration, and Fusion for a Multi-Sensor Enhanced Vision System

    NASA Technical Reports Server (NTRS)

    Hines, Glenn D.; Rahman, Zia-ur; Jobson, Daniel J.; Woodell, Glenn A.

    2006-01-01

    Over the last few years, NASA Langley Research Center (LaRC) has been developing an Enhanced Vision System (EVS) to aid pilots while flying in poor visibility conditions. The EVS captures imagery using two infrared video cameras. The cameras are placed in an enclosure that is mounted and flown forward-looking underneath the NASA LaRC ARIES 757 aircraft. The data streams from the cameras are processed in real time and displayed on monitors on board the aircraft. With proper processing, the camera system can provide better-than-human-observed imagery, particularly during poor visibility conditions. However, achieving this goal requires several different stages of processing, including enhancement, registration, and fusion, as well as specialized processing hardware for real-time performance. We are using a real-time implementation of the Retinex algorithm for image enhancement, affine transformations for registration, and weighted sums to perform fusion. All of the algorithms are executed on a single TI DM642 digital signal processor (DSP) clocked at 720 MHz. The image processing components were added to the EVS system, tested, and demonstrated during flight tests in August and September of 2005. In this paper we briefly discuss the EVS image processing hardware and algorithms. We then discuss implementation issues and show examples of the results obtained during flight tests. Keywords: enhanced vision system, image enhancement, retinex, digital signal processing, sensor fusion
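
    As a rough sketch of two of the processing stages named above, the snippet below applies a single-scale Retinex (the log of the image minus the log of a Gaussian-blurred surround) for enhancement and a simple weighted sum for fusion of two already-registered sensor images. It only illustrates the general techniques the abstract mentions; it is not the LaRC DSP implementation, and the surround scale and fusion weight are arbitrary.

        # Illustrative single-scale Retinex enhancement and weighted-sum fusion
        # of two pre-registered images (not the LaRC DSP implementation).
        import cv2
        import numpy as np

        def single_scale_retinex(img, sigma=80.0):
            """R(x,y) = log I(x,y) - log [G_sigma * I](x,y), rescaled to 8 bits."""
            img = img.astype(np.float32) + 1.0                 # avoid log(0)
            surround = cv2.GaussianBlur(img, (0, 0), sigma)
            r = np.log(img) - np.log(surround)
            return cv2.normalize(r, None, 0, 255, cv2.NORM_MINMAX).astype(np.uint8)

        def fuse(img_a, img_b, w=0.5):
            """Weighted-sum fusion of two registered, same-size images."""
            return cv2.addWeighted(img_a, w, img_b, 1.0 - w, 0.0)

        # fused = fuse(single_scale_retinex(lwir_frame), single_scale_retinex(swir_frame))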

  7. A rotorcraft flight database for validation of vision-based ranging algorithms

    NASA Technical Reports Server (NTRS)

    Smith, Phillip N.

    1992-01-01

    A helicopter flight test experiment was conducted at the NASA Ames Research Center to obtain a database consisting of video imagery and accurate measurements of camera motion, camera calibration parameters, and true range information. The database was developed to allow verification of monocular passive range estimation algorithms for use in the autonomous navigation of rotorcraft during low altitude flight. The helicopter flight experiment is briefly described. Four data sets representative of the different helicopter maneuvers and the visual scenery encountered during the flight test are presented. These data sets will be made available to researchers in the computer vision community.

  8. Vision Issues and Space Flight: Evaluation of One-Carbon Metabolism Polymorphisms

    NASA Technical Reports Server (NTRS)

    Smith, Scott M.; Gregory, Jesse F.; Zeisel, Steven; Ueland, Per; Gibson, C. R.; Mader, Thomas; Kinchen, Jason; Ploutz-Snyder, Robert; Zwart, Sara R.

    2015-01-01

    Intermediates of the one-carbon metabolic pathway are altered in astronauts who experience vision-related issues during and after space flight. Serum concentrations of homocysteine, cystathionine, 2-methylcitric acid, and methylmalonic acid were higher in astronauts with ophthalmic changes than in those without (Zwart et al., J Nutr, 2012). These differences existed before, during, and after flight. Potential confounding factors did not explain the differences. Genetic polymorphisms could contribute to these differences, and could help explain why crewmembers on the same mission do not all have ophthalmic issues, despite the same environmental factors (e.g., microgravity, exercise, diet). A follow-up study was conducted to evaluate 5 polymorphisms of enzymes in the one-carbon pathway, and to evaluate how these relate to vision and other ophthalmic changes after flight. Preliminary evaluations of the genetic data indicate that all of the crewmembers with the MTRR GG genotype had vision issues to one degree or another. However, not everyone who had vision issues had this genetic polymorphism, so the situation is more complex than the involvement of this single polymorphism. Metabolomic and further data analyses are underway to clarify these findings, but the preliminary assessments are promising.

  9. A comparison of effects of peripheral vision cues on pilot performance during instrument flight in dissimilar aircraft simulators.

    DOT National Transportation Integrated Search

    1968-09-01

    Pilot response to peripheral vision cues relating to aircraft bank angle was studied during instrument flight in two simulators representing (1) a conventional, medium weight, piston engine airliner, and (2) a heavy, jet engine, sweptwing transport. ...

  10. Vision-Aided RAIM: A New Method for GPS Integrity Monitoring in Approach and Landing Phase

    PubMed Central

    Fu, Li; Zhang, Jun; Li, Rui; Cao, Xianbin; Wang, Jinling

    2015-01-01

    In the 1980s, Global Positioning System (GPS) receiver autonomous integrity monitoring (RAIM) was proposed to monitor the integrity of a navigation system by checking the consistency of GPS measurements. However, during the approach and landing phase of a flight path, where low GPS visibility conditions often occur, the performance of the existing RAIM method may not meet the stringent aviation requirements for availability and integrity due to insufficient observations. To solve this problem, a new RAIM method, named vision-aided RAIM (VA-RAIM), is proposed for GPS integrity monitoring in the approach and landing phase. By introducing landmarks as pseudo-satellites, the VA-RAIM enriches the navigation observations to improve the performance of RAIM. In the method, a computer vision system photographs and matches these landmarks to obtain additional measurements for navigation. Nevertheless, the challenging issue is that such additional measurements may suffer from vision errors. To ensure the reliability of the vision measurements, a GPS-based calibration algorithm is presented to reduce the time-invariant part of the vision errors. Then, the calibrated vision measurements are integrated with the GPS observations for integrity monitoring. Simulation results show that the VA-RAIM outperforms the conventional RAIM with higher availability and a higher fault detection rate. PMID:26378533
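
    The consistency check that conventional RAIM performs is often implemented as a least-squares residual test: with more pseudorange measurements than unknowns, the sum of squared post-fit residuals is approximately chi-square distributed under fault-free conditions and is compared against a threshold. The sketch below illustrates that generic test with numpy/scipy; it is not the VA-RAIM algorithm itself, the noise sigma and false-alarm probability are placeholders, and landmark-derived measurements would simply add rows to the geometry matrix and residual vector.

        # Generic least-squares residual RAIM test (illustrative; not the VA-RAIM algorithm).
        import numpy as np
        from scipy.stats import chi2

        def raim_residual_test(G, prefit_resid_m, sigma_m=5.0, p_false_alarm=1e-5):
            """G: n x 4 linearized geometry matrix; prefit_resid_m: n pseudorange residuals [m]."""
            n, m = G.shape                          # m = 4 unknowns (3 position + receiver clock)
            dx, *_ = np.linalg.lstsq(G, prefit_resid_m, rcond=None)
            post_fit = prefit_resid_m - G @ dx      # post-fit residuals
            test_stat = float(post_fit @ post_fit) / sigma_m**2   # ~ chi2(n - m) if fault-free
            threshold = chi2.ppf(1.0 - p_false_alarm, df=n - m)
            return test_stat, threshold, test_stat > threshold    # True -> declare a fault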

  11. Vision-Aided RAIM: A New Method for GPS Integrity Monitoring in Approach and Landing Phase.

    PubMed

    Fu, Li; Zhang, Jun; Li, Rui; Cao, Xianbin; Wang, Jinling

    2015-09-10

    In the 1980s, Global Positioning System (GPS) receiver autonomous integrity monitoring (RAIM) was proposed to monitor the integrity of a navigation system by checking the consistency of GPS measurements. However, during the approach and landing phase of a flight path, where low GPS visibility conditions often occur, the performance of the existing RAIM method may not meet the stringent aviation requirements for availability and integrity due to insufficient observations. To solve this problem, a new RAIM method, named vision-aided RAIM (VA-RAIM), is proposed for GPS integrity monitoring in the approach and landing phase. By introducing landmarks as pseudo-satellites, the VA-RAIM enriches the navigation observations to improve the performance of RAIM. In the method, a computer vision system photographs and matches these landmarks to obtain additional measurements for navigation. Nevertheless, the challenging issue is that such additional measurements may suffer from vision errors. To ensure the reliability of the vision measurements, a GPS-based calibration algorithm is presented to reduce the time-invariant part of the vision errors. Then, the calibrated vision measurements are integrated with the GPS observations for integrity monitoring. Simulation results show that the VA-RAIM outperforms the conventional RAIM with higher availability and a higher fault detection rate.

  12. Synthetic Vision Enhanced Surface Operations and Flight Procedures Rehearsal Tool

    NASA Technical Reports Server (NTRS)

    Arthur, Jarvis J., III; Prinzel, Lawrence J., III; Williams, Steven P.; Kramer, Lynda J.

    2006-01-01

    Limited visibility has been cited as a predominant causal factor for both Controlled-Flight-Into-Terrain (CFIT) and runway incursion accidents. NASA is conducting research and development of Synthetic Vision Systems (SVS) technologies that may potentially mitigate low-visibility conditions as a causal factor in these accidents while replicating the operational benefits of clear day flight operations, regardless of the actual outside visibility condition. Two experimental evaluation studies were performed to determine the efficacy of two concepts: 1) a head-worn display application of SVS technology to enhance transport aircraft surface operations, and 2) a three-dimensional SVS electronic flight bag display concept for flight plan preview, mission rehearsal, and a controller-pilot data link communications interface for flight procedures. In the surface operation study, pilots evaluated two display devices and four display modes during taxi under unlimited and CAT II visibility conditions. In the mission rehearsal study, pilots flew approaches and departures in an operationally challenged airport environment, including CFIT scenarios. Performance using the SVS concepts was compared to traditional baseline displays with paper charts only or EFB information. In general, the studies evince the significant situation awareness and enhanced operational capabilities afforded by these advanced SVS display concepts. The experimental results and conclusions from these studies are discussed along with future directions.

  13. Synthetic Vision CFIT Experiments for GA and Commercial Aircraft: "A Picture Is Worth A Thousand Lives"

    NASA Technical Reports Server (NTRS)

    Prinzel, Lawrence J., III; Hughes, Monica F.; Arthur, Jarvis J., III; Kramer, Lynda J.; Glaab, Louis J.; Bailey, Randy E.; Parrish, Russell V.; Uenking, Michael D.

    2003-01-01

    Because restricted visibility has been implicated in the majority of commercial and general aviation accidents, solutions will need to focus on how to enhance safety during instrument meteorological conditions (IMC). The NASA Synthetic Vision Systems (SVS) project is developing technologies to help achieve these goals through the synthetic presentation of how the outside world would look to the pilot if vision were not reduced. The potential safety outcome would be a significant reduction in several accident categories, such as controlled-flight-into-terrain (CFIT), that have restricted visibility as a causal factor. The paper describes two experiments that demonstrated the efficacy of synthetic vision technology to prevent CFIT accidents for both general aviation and commercial aircraft.

  14. Enhanced/synthetic vision and head-worn display technologies for terminal maneuvering area NextGen operations

    NASA Astrophysics Data System (ADS)

    Arthur, Jarvis J., III; Prinzel, Lawrence J., III; Williams, Steven P.; Bailey, Randall E.; Shelton, Kevin J.; Norman, R. Mike

    2011-06-01

    NASA is researching innovative technologies for the Next Generation Air Transportation System (NextGen) to provide a "Better-Than-Visual" (BTV) capability as an adjunct to "Equivalent Visual Operations" (EVO); that is, airport throughputs equivalent to those normally achieved during Visual Flight Rules (VFR) operations, with equivalent or better safety, in all weather and visibility conditions, including Instrument Meteorological Conditions (IMC). These new technologies build on proven flight deck systems and leverage synthetic and enhanced vision systems. Two piloted simulation studies were conducted to assess the use of a Head-Worn Display (HWD) with head tracking for synthetic and enhanced vision systems concepts. The first experiment evaluated the use of an HWD for equivalent visual operations to San Francisco International Airport (airport identifier: KSFO) compared to a visual concept and a head-down display concept. A second experiment evaluated symbology variations under different visibility conditions using an HWD during taxi operations at Chicago O'Hare airport (airport identifier: KORD). Two experiments were conducted, one in a simulated San Francisco airport (KSFO) approach operation and the other in simulated Chicago O'Hare surface operations, evaluating enhanced/synthetic vision and head-worn display technologies for NextGen operations. While flying a closely-spaced parallel approach to KSFO, pilots rated the HWD, under low-visibility conditions, equivalent to the out-the-window condition, under unlimited visibility, in terms of situational awareness (SA) and mental workload compared to a head-down enhanced vision system. There were no differences among the three display concepts in terms of traffic spacing and distance or the pilots' decision-making to land or go around. For the KORD experiment, the visibility condition was not a factor in pilots' ratings of clutter effects from symbology. Several concepts for enhanced implementations of an unlimited field-of-regard BTV concept for low-visibility surface operations were determined to be equivalent in pilot ratings of efficacy and usability.

  15. Enhanced/Synthetic Vision and Head-Worn Display Technologies for Terminal Maneuvering Area NextGen Operations

    NASA Technical Reports Server (NTRS)

    Arthur, Jarvis J., III; Prinzel, Lawrence J., III; Williams, Steven P.; Bailey, Randall E.; Shelton, Kevin J.; Norman, R. Mike

    2011-01-01

    NASA is researching innovative technologies for the Next Generation Air Transportation System (NextGen) to provide a "Better-Than-Visual" (BTV) capability as an adjunct to "Equivalent Visual Operations" (EVO); that is, airport throughputs equivalent to those normally achieved during Visual Flight Rules (VFR) operations, with equivalent or better safety, in all weather and visibility conditions, including Instrument Meteorological Conditions (IMC). These new technologies build on proven flight deck systems and leverage synthetic and enhanced vision systems. Two piloted simulation studies were conducted to assess the use of a Head-Worn Display (HWD) with head tracking for synthetic and enhanced vision systems concepts. The first experiment evaluated the use of an HWD for equivalent visual operations to San Francisco International Airport (airport identifier: KSFO) compared to a visual concept and a head-down display concept. A second experiment evaluated symbology variations under different visibility conditions using an HWD during taxi operations at Chicago O'Hare airport (airport identifier: KORD). Two experiments were conducted, one in a simulated San Francisco airport (KSFO) approach operation and the other in simulated Chicago O'Hare surface operations, evaluating enhanced/synthetic vision and head-worn display technologies for NextGen operations. While flying a closely-spaced parallel approach to KSFO, pilots rated the HWD, under low-visibility conditions, equivalent to the out-the-window condition, under unlimited visibility, in terms of situational awareness (SA) and mental workload compared to a head-down enhanced vision system. There were no differences among the three display concepts in terms of traffic spacing and distance or the pilots' decision-making to land or go around. For the KORD experiment, the visibility condition was not a factor in pilots' ratings of clutter effects from symbology. Several concepts for enhanced implementations of an unlimited field-of-regard BTV concept for low-visibility surface operations were determined to be equivalent in pilot ratings of efficacy and usability.

  16. Awareness and Detection of Traffic and Obstacles Using Synthetic and Enhanced Vision Systems

    NASA Technical Reports Server (NTRS)

    Bailey, Randall E.

    2012-01-01

    The research literature is reviewed and summarized to evaluate the awareness and detection of traffic and obstacles when using Synthetic Vision Systems (SVS) and Enhanced Vision Systems (EVS). The study identifies the critical issues influencing the time required, accuracy, and pilot workload associated with recognizing and reacting to potential collisions or conflicts with other aircraft, vehicles and obstructions during approach, landing, and surface operations. This work considers the effect of head-down display and head-up display implementations of SVS and EVS as well as the influence of single- and dual-pilot operations. The influences and strategies of adding traffic information and cockpit alerting with SVS and EVS were also included. Based on this review, a knowledge gap assessment was made with recommendations for ground and flight testing to fill these gaps and, hence, promote the safe and effective implementation of SVS/EVS technologies for the Next Generation Air Transportation System.

  17. Utilization of the Space Vision System as an Augmented Reality System For Mission Operations

    NASA Technical Reports Server (NTRS)

    Maida, James C.; Bowen, Charles

    2003-01-01

    Augmented reality is a technique whereby computer generated images are superimposed on live images for visual enhancement. Augmented reality can also be characterized as dynamic overlays when computer generated images are registered with moving objects in a live image. This technique has been successfully implemented, with low to medium levels of registration precision, in an NRA funded project entitled, "Improving Human Task Performance with Luminance Images and Dynamic Overlays". Future research is already being planned to also utilize a laboratory-based system where more extensive subject testing can be performed. However successful this might be, the problem will still be whether such a technology can be used with flight hardware. To answer this question, the Canadian Space Vision System (SVS) will be tested as an augmented reality system capable of improving human performance where the operation requires indirect viewing. This system has already been certified for flight and is currently flown on each shuttle mission for station assembly. Successful development and utilization of this system in a ground-based experiment will expand its utilization for on-orbit mission operations. Current research and development regarding the use of augmented reality technology is being simulated using ground-based equipment. This is an appropriate approach for development of symbology (graphics and annotation) optimal for human performance and for development of optimal image registration techniques. It is anticipated that this technology will become more pervasive as it matures. Because we know what and where almost everything is on ISS, this reduces the registration problem and improves the computer model of that reality, making augmented reality an attractive tool, provided we know how to use it. This is the basis for current research in this area. However, there is a missing element to this process. It is the link from this research to the current ISS video system and to flight hardware capable of utilizing this technology. This is the basis for this proposed Space Human Factors Engineering project, the determination of the display symbology within the performance limits of the Space Vision System that will objectively improve human performance. This utilization of existing flight hardware will greatly reduce the costs of implementation for flight. Besides being used onboard shuttle and space station and as a ground-based system for mission operational support, it also has great potential for science and medical training and diagnostics, remote learning, team learning, video/media conferencing, and educational outreach.
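
    At its core, registering a computer-generated overlay with a live image comes down to projecting known 3-D model points through the current camera pose and intrinsics and drawing the symbology at the resulting pixel locations. The sketch below shows that generic projection-and-draw step with OpenCV; the intrinsics, pose, and model points are placeholders, and it is not the Space Vision System's implementation.

        # Generic overlay registration by pinhole projection (placeholders; not the SVS implementation).
        import cv2
        import numpy as np

        # Placeholder camera intrinsics and zero lens distortion.
        K = np.array([[900.0, 0.0, 640.0],
                      [0.0, 900.0, 360.0],
                      [0.0, 0.0, 1.0]])
        DIST = np.zeros(5)

        def project_overlay_points(model_pts_3d, rvec, tvec):
            """Project 3-D model points (object frame, metres) into pixel coordinates."""
            img_pts, _ = cv2.projectPoints(np.asarray(model_pts_3d, dtype=np.float64),
                                           rvec, tvec, K, DIST)
            return img_pts.reshape(-1, 2)

        def draw_overlay(frame, px_pts):
            """Draw simple cross-hair symbology at each projected point on the live frame."""
            for x, y in px_pts:
                cv2.drawMarker(frame, (int(x), int(y)), color=(0, 255, 0),
                               markerType=cv2.MARKER_CROSS, markerSize=12, thickness=1)
            return frame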

  18. Design of an Eye Limiting Resolution Visual System Using Commercial-Off-the-Shelf Equipment

    NASA Technical Reports Server (NTRS)

    Sweet, Barbara T.; Giovannetti, Dean P.

    2008-01-01

    A feasibility study was conducted to determine if a flight simulator with an eye-limiting resolution out-the-window (OTW) visual system could be built using commercial off-the-shelf (COTS) technology and used to evaluate the visual performance of Air Force pilots in an operations context. Results of this study demonstrate that an eye limiting OTW visual system can be built using COTS technology. Further, a series of operationally-based tasks linked to clinical vision tests can be used within the synthetic environment to demonstrate a correlation and quantify the level of correlation between vision and operational aviation performance.

  19. COBALT Flight Demonstrations Fuse Technologies

    NASA Image and Video Library

    2017-06-07

    This 5-minute, 50-second video shows how the CoOperative Blending of Autonomous Landing Technologies (COBALT) system pairs new landing sensor technologies that promise to yield the highest precision navigation solution ever tested for NASA space landing applications. The technologies included a navigation doppler lidar (NDL), which provides ultra-precise velocity and line-of-sight range measurements, and the Lander Vision System (LVS), which provides terrain-relative navigation. Through flight campaigns conducted in March and April 2017 aboard Masten Space Systems' Xodiac, a rocket-powered vertical takeoff, vertical landing (VTVL) platform, the COBALT system was flight tested to collect sensor performance data for NDL and LVS and to check the integration and communication between COBALT and the rocket. The flight tests provided excellent performance data for both sensors, as well as valuable information on the integrated performance with the rocket that will be used for subsequent COBALT modifications prior to follow-on flight tests. Based at NASA’s Armstrong Flight Research Center in Edwards, CA, the Flight Opportunities program funds technology development flight tests on commercial suborbital space providers of which Masten is a vendor. The program has previously tested the LVS on the Masten rocket and validated the technology for the Mars 2020 rover.

  20. Color Helmet Mounted Display System with Real Time Computer Generated and Video Imagery for In-Flight Simulation

    NASA Technical Reports Server (NTRS)

    Sawyer, Kevin; Jacobsen, Robert; Aiken, Edwin W. (Technical Monitor)

    1995-01-01

    NASA Ames Research Center and the US Army are developing the Rotorcraft Aircrew Systems Concepts Airborne Laboratory (RASCAL), using a Sikorsky UH-60 helicopter, for the purpose of flight systems research. A primary use of the RASCAL is in-flight simulation, for which the visual scene will use computer-generated imagery and synthetic vision. This research is made possible in part by a full-color, wide field-of-view Helmet Mounted Display (HMD) system that provides high-performance color imagery suitable for daytime operations in a flight-rated package. This paper describes the design and performance characteristics of the HMD system. Emphasis is placed on the design specifications, testing, and aircraft integration of Kaiser Electronics' RASCAL HMD system, which was designed and built under contract for NASA. The optical performance and design of the helmet-mounted display unit will be discussed, as well as the unique capabilities provided by the system's Programmable Display Generator (PDG).

  1. Vision based techniques for rotorcraft low altitude flight

    NASA Technical Reports Server (NTRS)

    Sridhar, Banavar; Suorsa, Ray; Smith, Philip

    1991-01-01

    An overview of research in obstacle detection at NASA Ames Research Center is presented. The research applies techniques from computer vision to the automation of rotorcraft navigation. The development of a methodology for detecting the range to obstacles based on the maximum utilization of passive sensors is emphasized. The development of a flight and image database for verification of vision-based algorithms, and a passive ranging methodology tailored to the needs of helicopter flight, are discussed. Preliminary results indicate that it is possible to obtain adequate range estimates except in regions close to the focus of expansion (FOE). Closer to the FOE, the error in range increases because the magnitude of the disparity gets smaller, resulting in a low signal-to-noise ratio (SNR).
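
    The sensitivity near the focus of expansion can be made concrete with the standard motion-parallax relation used in passive ranging: under pure translation, the range to a feature is roughly proportional to its image distance from the FOE divided by its optical-flow magnitude, so both the geometric leverage and the measured flow shrink toward the FOE and noise dominates. The sketch below is illustrative only; it is not taken from the paper, and all numbers and names are made up.

```python
import numpy as np

def range_from_flow(px, py, foe, flow_mag_px, speed_mps, dt_s):
    """Estimate range to a feature from its image motion under pure translation.

    For a camera translating at speed_mps toward the focus of expansion (FOE),
    a feature at pixel (px, py) with flow magnitude flow_mag_px (pixels/frame)
    lies at range roughly  Z ~ speed * dt * r / flow,  where r is the pixel
    distance from the FOE.  As r -> 0 the flow also -> 0, so a fixed pixel of
    measurement noise produces a much larger range error near the FOE.
    """
    r = np.hypot(px - foe[0], py - foe[1])            # pixel distance from the FOE
    return speed_mps * dt_s * r / max(flow_mag_px, 1e-6)

# Illustrative numbers: the same 1-pixel flow error barely matters far from the
# FOE but badly corrupts the estimate close to it.
for r_px, flow in [(200.0, 8.0), (20.0, 0.8)]:
    z_clean = range_from_flow(r_px, 0.0, (0.0, 0.0), flow, 30.0, 0.1)
    z_noisy = range_from_flow(r_px, 0.0, (0.0, 0.0), flow + 1.0, 30.0, 0.1)
    print(f"r = {r_px:5.1f} px   Z = {z_clean:5.1f} m   with 1 px flow error: {z_noisy:5.1f} m")
```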

  2. Flight instruments and helmet-mounted SWIR imaging systems

    NASA Astrophysics Data System (ADS)

    Robinson, Tim; Green, John; Jacobson, Mickey; Grabski, Greg

    2011-06-01

    Night vision technology has experienced significant advances in the last two decades. Night vision goggles (NVGs) based on gallium arsenide (GaAs) continue to raise the bar for alternative technologies. Resolution, gain, and sensitivity have all improved; the image quality through these devices is nothing less than incredible. Panoramic NVGs and enhanced NVGs are examples of recent advances that increase warfighter capabilities. Even with these advances, alternative night vision devices such as solid-state indium gallium arsenide (InGaAs) focal plane arrays are under development for helmet-mounted imaging systems. The InGaAs imaging system offers advantages over the existing NVGs. Two key advantages are: (1) the new system produces digital image data, and (2) the new system is sensitive to energy in the shortwave infrared (SWIR) spectrum. While it is tempting to contrast the performance of these digital systems with the existing NVGs, the advantage of different spectral detection bands leads to the conclusion that the technologies are less competitive and more synergistic. It is likely that, by the end of the decade, pilots will use multi-band devices in the cockpit. As such, flight decks will need to be compatible with both NVGs and SWIR imaging systems. Insertion of NVGs in aircraft during the late 1970s and early 1980s resulted in many "lessons learned" concerning instrument compatibility with NVGs. These "lessons learned" ultimately resulted in specifications such as MIL-L-85762A and MIL-STD-3009. These specifications are now used throughout industry to produce NVG-compatible illuminated instruments and displays for both military and civilian applications. Inserting a SWIR imaging device in a cockpit will require similar consideration. A project evaluating flight deck instrument compatibility with SWIR devices is currently ongoing; aspects of this evaluation are described in this paper. This project is sponsored by the Air Force Research Laboratory (AFRL).

  3. NASA Crew Launch Vehicle Flight Test Options

    NASA Technical Reports Server (NTRS)

    Cockrell, Charles E., Jr.; Davis, Stephan R.; Robonson, Kimberly; Tuma, Margaret L.; Sullivan, Greg

    2006-01-01

    Options for development flight testing (DFT) of the Ares I Crew Launch Vehicle (CLV) are discussed. The Ares-I Crew Launch Vehicle (CLV) is being developed by the U.S. National Aeronautics and Space Administration (NASA) to launch the Crew Exploration Vehicle (CEV) into low Earth Orbit (LEO). The Ares-I implements one of the components of the Vision for Space Exploration (VSE), providing crew and cargo access to the International Space Station (ISS) after retirement of the Space Shuttle and, eventually, forming part of the launch capability needed for lunar exploration. The role of development flight testing is to demonstrate key sub-systems, address key technical risks, and provide flight data to validate engineering models in representative flight environments. This is distinguished from certification flight testing, which is designed to formally validate system functionality and achieve flight readiness. Lessons learned from Saturn V, Space Shuttle, and other flight programs are examined along with key Ares-I technical risks in order to provide insight into possible development flight test strategies. A strategy for the first test flight of the Ares I, known as Ares I-1, is presented.

  4. Design and implementation of a vision-based hovering and feature tracking algorithm for a quadrotor

    NASA Astrophysics Data System (ADS)

    Lee, Y. H.; Chahl, J. S.

    2016-10-01

    This paper demonstrates an approach to vision-based control of unmanned quadrotors for hover and object tracking. The algorithm used the Speeded Up Robust Features (SURF) detector to find objects. The pose of the object in the image was then calculated in order to pass the pose information to the flight controller. Finally, the flight controller steered the quadrotor to approach the object based on the calculated pose data. These processes were run using the standard onboard resources of the 3DR Solo quadrotor in an embedded computing environment. The results showed that the algorithm behaved well during its missions, tracking and hovering, although there were significant latencies due to the low CPU performance of the onboard image processing system.
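
    A minimal sketch of the detect-then-steer loop described in the abstract is given below, assuming OpenCV and NumPy are available onboard. ORB is used here in place of SURF (SURF lives in OpenCV's non-free contrib module), and the steering gain, sign convention, and function names are illustrative assumptions rather than details from the paper.

```python
import cv2
import numpy as np

def object_offset(reference_bgr, frame_bgr, min_matches=10):
    """Locate the reference object in the current frame and return its
    normalized offset from the image centre, or None if it is not found.

    ORB stands in for SURF here; the detect -> match -> pose structure is the same.
    """
    orb = cv2.ORB_create(nfeatures=1000)
    k_ref, d_ref = orb.detectAndCompute(reference_bgr, None)
    k_frm, d_frm = orb.detectAndCompute(frame_bgr, None)
    if d_ref is None or d_frm is None:
        return None
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = sorted(matcher.match(d_ref, d_frm), key=lambda m: m.distance)
    if len(matches) < min_matches:
        return None
    pts = np.float32([k_frm[m.trainIdx].pt for m in matches[:min_matches]])
    cx, cy = pts.mean(axis=0)                     # object centroid in the frame
    h, w = frame_bgr.shape[:2]
    return (cx - w / 2) / (w / 2), (cy - h / 2) / (h / 2)

def steer_command(offset, gain=0.5):
    """Turn a normalized image offset into illustrative yaw- and pitch-rate
    commands that re-centre the object (hypothetical sign convention)."""
    ex, ey = offset
    return -gain * ex, -gain * ey
```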

  5. Assessment of Intraocular and Systemic Vasculature Pressure Parameters in Simulated Microgravity with Thigh Cuff Countermeasure

    NASA Technical Reports Server (NTRS)

    Huang, Alex S.; Balasubramanian, Siva; Tepelus, Tudor; Sadda, Jaya; Sadda, Srinivas; Stenger, Michael B.; Lee, Stuart M. C.; Laurie, Steve S.; Liu, John; Macias, Brandon R.

    2017-01-01

    Changes in vision have been well documented among astronauts during and after long-duration space flight. One hypothesis is that the space flight induced headward fluid alters posterior ocular pressure and volume and may contribute to visual acuity decrements. Therefore, we evaluated venoconstrictive thigh cuffs as a potential countermeasure to the headward fluid shift-induced effects on intraocular pressure (IOP) and cephalic vascular pressure and volumes.

  6. TAMU: A New Space Mission Operations Paradigm

    NASA Technical Reports Server (NTRS)

    Meshkat, Leila; Ruszkowski, James; Haensly, Jean; Pennington, Granvil A.; Hogle, Charles

    2011-01-01

    The Transferable, Adaptable, Modular and Upgradeable (TAMU) Flight Production Process (FPP) is a model-centric System of Systems (SoS) framework that cuts across multiple organizations and their associated facilities, which are, in the most general case, geographically dispersed, to develop the architecture and associated workflow processes for a broad range of mission operations. Further, TAMU FPP envisions the simulation, automatic execution, and re-planning of orchestrated workflow processes as they become operational. This paper provides the vision for the TAMU FPP paradigm. This includes a complete, coherent technique, process, and tool set that results in an infrastructure that can be used for full lifecycle design and decision making during any flight production process. A flight production process is the process of developing all products that are necessary for flight.

  7. Advanced Pathway Guidance Evaluations on a Synthetic Vision Head-Up Display

    NASA Technical Reports Server (NTRS)

    Kramer, Lynda J.; Prinzel, Lawrence J., III; Arthur, Jarvis J., III; Bailey, Randall E.

    2005-01-01

    NASA's Synthetic Vision Systems (SVS) project is developing technologies with practical applications to potentially eliminate low visibility conditions as a causal factor to civil aircraft accidents while replicating the operational benefits of clear day flight operations, regardless of the actual outside visibility condition. A major thrust of the SVS project involves the development/demonstration of affordable, certifiable display configurations that provide intuitive out-the-window terrain and obstacle information with advanced guidance for commercial and business aircraft. This experiment evaluated the influence of different pathway and guidance display concepts upon pilot situation awareness (SA), mental workload, and flight path tracking performance for Synthetic Vision display concepts using a Head-Up Display (HUD). Two pathway formats (dynamic and minimal tunnel presentations) were evaluated against a baseline condition (no tunnel) during simulated instrument meteorological conditions approaches to Reno-Tahoe International airport. Two guidance cues (tadpole, follow-me aircraft) were also evaluated to assess their influence. Results indicated that the presence of a tunnel on an SVS HUD had no effect on flight path performance but that it did have significant effects on pilot SA and mental workload. The dynamic tunnel concept with the follow-me aircraft guidance symbol produced the lowest workload and provided the highest SA among the tunnel concepts evaluated.

  8. Pathway Design Effects on Synthetic Vision Head-Up Displays

    NASA Technical Reports Server (NTRS)

    Kramer, Lynda J.; Prinzel, Lawrence J., III; Arthur, Jarvis J., III; Bailey, Randall E.

    2004-01-01

    NASA's Synthetic Vision Systems (SVS) project is developing technologies with practical applications that will eliminate low visibility conditions as a causal factor to civil aircraft accidents while replicating the operational benefits of clear day flight operations, regardless of the actual outside visibility condition. A major thrust of the SVS project involves the development/demonstration of affordable, certifiable display configurations that provide intuitive out-the-window terrain and obstacle information with advanced pathway guidance for transport aircraft. This experiment evaluated the influence of different tunnel and guidance concepts upon pilot situation awareness (SA), mental workload, and flight path tracking performance for Synthetic Vision display concepts using a Head-Up Display (HUD). Two tunnel formats (dynamic, minimal) were evaluated against a baseline condition (no tunnel) during simulated IMC approaches to Reno-Tahoe International airport. Two guidance cues (tadpole, follow-me aircraft) were also evaluated to assess their influence on the tunnel formats. Results indicated that the presence of a tunnel on an SVS HUD had no effect on flight path performance but that it did have significant effects on pilot SA and mental workload. The dynamic tunnel concept with the follow-me aircraft guidance symbol produced the lowest workload and provided the highest SA among the tunnel concepts evaluated.

  9. 75 FR 38391 - Special Conditions: Boeing 757-200 With Enhanced Flight Vision System

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-07-02

    .... SUMMARY: These special conditions are issued for the Boeing Model 757- 200 series airplanes. These... system (EFVS). The EFVS is a novel or unusual design feature which consists of a head-up display (HUD... regulations do not contain adequate or appropriate safety standards for this design feature. These special...

  10. X-37 Flight Demonstrator Project: Capabilities for Future Space Transportation System Development

    NASA Technical Reports Server (NTRS)

    Dumbacher, Daniel L.

    2004-01-01

    The X-37 Approach and Landing Test Vehicle (ALTV) is an automated (unmanned) spacecraft designed to reduce technical risk in the descent and landing phases of flight. ALTV mission requirements and Orbital Vehicle (OV) technology research and development (R&D) goals are formulated to validate and mature high-payoff ground and flight technologies such as Thermal Protection Systems (TPS). It has been more than three decades since the Space Shuttle was designed and built. Real-world hardware experience gained through the multitude of X-37 Project activities has expanded both Government and industry knowledge of the challenges involved in developing new generations of spacecraft that can fulfill the Vision for Space Exploration.

  11. Clipping polygon faces through a polyhedron of vision

    NASA Technical Reports Server (NTRS)

    Florence, Judit K. (Inventor); Rohner, Michel A. (Inventor)

    1980-01-01

    A flight simulator combines flight data and polygon face terrain data to provide a CRT display at each window of the simulated aircraft. The data base specifies the relative position of each vertex of each polygon face therein. Only those terrain faces currently appearing within the pyramid of vision defined by the pilot's eye and the edges of the pilot's window need be displayed at any given time. As the orientation of the pyramid of vision changes in response to flight data, the displayed faces are correspondingly displaced, eventually moving out of the pyramid of vision. Faces which are currently not visible (outside the pyramid of vision) are clipped from the data flow. In addition, faces which are only partially outside of the pyramid of vision are reconstructed to eliminate the outside portion. Window coordinates are generated defining the distance between each vertex and each of the boundary planes forming the pyramid of vision. The sign bit of each window coordinate indicates whether the vertex is on the pyramid-of-vision side of the associated boundary plane (positive), or on the other side thereof (negative). The set of sign bits accompanying each vertex constitutes the outcode of that vertex. The outcodes (O.C.) are systematically processed and examined to determine which faces are completely inside the pyramid of vision (Case A--all signs positive), which faces are completely outside (Case C--all signs negative), and which faces must be reconstructed (Case B--both positive and negative signs).
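
    The outcode classification described above can be sketched in a few lines. This is a simplified illustration of the Case A/B/C logic using the common bitwise-AND form of the trivial-reject test, not the patented hardware implementation; the plane representation and names are illustrative.

```python
import numpy as np

def outcode(vertex, planes):
    """One sign bit per boundary plane of the pyramid of vision.

    Each plane is (normal, d) with the normal pointing into the visible volume,
    so a non-negative signed distance means 'inside' for that plane (bit clear);
    a negative distance sets the bit (outside).
    """
    bits = 0
    for i, (n, d) in enumerate(planes):
        if np.dot(n, vertex) + d < 0.0:
            bits |= 1 << i
    return bits

def classify_face(vertices, planes):
    codes = [outcode(v, planes) for v in vertices]
    if all(c == 0 for c in codes):
        return "A"        # Case A: all signs positive, face lies entirely inside
    shared = codes[0]
    for c in codes[1:]:
        shared &= c       # bit still set => every vertex is outside that same plane
    if shared:
        return "C"        # Case C: trivially rejected, clip the whole face
    return "B"            # Case B: mixed signs, reconstruct to remove the outside portion

# A 90-degree pyramid of vision looking down +z with its apex at the eye (origin).
planes = [((1, 0, 1), 0.0), ((-1, 0, 1), 0.0), ((0, 1, 1), 0.0), ((0, -1, 1), 0.0)]
inside_face = [(0, 0, 10), (1, 1, 10), (-1, 1, 10)]
straddling  = [(0, 0, 10), (30, 0, 10), (0, 5, 10)]
print(classify_face(inside_face, planes), classify_face(straddling, planes))  # A B
```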

  12. A Vision for Ice Giant Exploration

    NASA Technical Reports Server (NTRS)

    Hofstadter, M.; Simon, A.; Atreya, S.; Banfield, D.; Fortney, J.; Hayes, A.; Hedman, M.; Hospodarsky, G.; Mandt, K.; Masters, A.

    2017-01-01

    From Voyager to a Vision for 2050: NASA and ESA have just completed a study of candidate missions to Uranus and Neptune, the so-called ice giant planets. It is a Pre-Decadal Survey Study, meant to inform the next Planetary Science Decadal Survey about opportunities for missions launching in the 2020's and early 2030's. There have been no space flight missions to the ice giants since the Voyager 2 flybys of Uranus in 1986 and Neptune in 1989. This paper presents some conclusions of that study (hereafter referred to as The Study), and how the results feed into a vision for where planetary science can be in 2050. Reaching that vision will require investments in technology and ground-based science in the 2020's, flight during the 2030's along with continued technological development of both ground- and space-based capabilities, and data analysis and additional flights in the 2040's. We first discuss why exploring the ice giants is important. We then summarize the science objectives identified by The Study, and our vision of the science goals for 2050. We then review some of the technologies needed to make this vision a reality.

  13. Initial SVS Integrated Technology Evaluation Flight Test Requirements and Hardware Architecture

    NASA Technical Reports Server (NTRS)

    Harrison, Stella V.; Kramer, Lynda J.; Bailey, Randall E.; Jones, Denise R.; Young, Steven D.; Harrah, Steven D.; Arthur, Jarvis J.; Parrish, Russell V.

    2003-01-01

    This document presents the flight test requirements for the Initial Synthetic Vision Systems Integrated Technology Evaluation Flight Test to be flown aboard NASA Langley's ARIES aircraft, and the final hardware architecture implemented to meet these requirements. Part I of this document contains the hardware, software, simulator, and flight operations requirements for this flight test as they were defined in August 2002. The contents of this section are the actual requirements document that was signed for this flight test. Part II of this document contains information pertaining to the hardware architecture that was realized to meet these requirements, as presented to and approved by a Critical Design Review Panel prior to installation on the B-757 Airborne Research Integrated Experiments Systems (ARIES) airplane. This information includes a description of the equipment, block diagrams of the architecture, layouts of the workstations, and pictures of the actual installations.

  14. 75 FR 28852 - Ninth Meeting: Joint RTCA Special Committee 213: EUROCAE WG-79: Enhanced Flight Vision Systems...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-05-24

    ... approach and landing. FOR FURTHER INFORMATION CONTACT: (1) RTCA Secretariat, 1828 L Street, NW., Suite 805, Washington, DC 20036; telephone (202) 833-9339; fax (202) 833-9434; Web site http://www.rtca.org...

  15. Simulation Evaluation of Synthetic Vision as an Enabling Technology for Equivalent Visual Operations

    NASA Technical Reports Server (NTRS)

    Kramer, Lynda J.; Williams, Steven P.; Bailey, Randall E.

    2008-01-01

    Enhanced Vision (EV) and Synthetic Vision (SV) systems may serve as enabling technologies to meet the challenges of the Next Generation Air Transportation System (NextGen) Equivalent Visual Operations (EVO) concept: that is, the ability to achieve or even improve on the safety of Visual Flight Rules (VFR) operations, maintain the operational tempos of VFR, and even, perhaps, retain VFR procedures independent of actual weather and visibility conditions. One significant challenge lies in the definition of required equipage on the aircraft and on the airport to enable the EVO concept objective. A piloted simulation experiment was conducted to evaluate the effects of the presence or absence of Synthetic Vision, the location of this information during an instrument approach (i.e., on a Head-Up or Head-Down Primary Flight Display), and the type of airport lighting information on landing minima. The quantitative data from this experiment were analyzed to begin the definition of performance-based criteria for all-weather approach and landing operations. Objective results from the present study showed that better approach performance was attainable with the head-up display (HUD) compared to the head-down display (HDD). A slight improvement in HDD performance was shown when SV was added, as the pilots descended below 200 ft to a 100 ft decision altitude, but this improvement was not tested for statistical significance (nor was it expected to be statistically significant). The touchdown data showed that, regardless of the display concept flown (SV HUD, Baseline HUD, SV HDD, Baseline HDD), a majority of the runs were within the defined performance-based approach and landing criteria in all the visibility levels, approach lighting systems, and decision altitudes tested. For this visual flight maneuver, runway visual range (RVR) appeared to be the most significant influence on touchdown performance. The approach lighting system clearly impacted the pilot's ability to descend to 100 ft height above touchdown based on existing Federal Aviation Regulation (FAR) 91.175 using a 200 ft decision height, but did not appear to influence touchdown performance or approach path maintenance.

  16. Vision Research for Flight Simulation. Final Report.

    ERIC Educational Resources Information Center

    Richards, Whitman, Ed.; Dismukes, Key, Ed.

    Based on a workshop on vision research issues in flight-training simulators held in June 1980, this report focuses on approaches for the conduct of research on what visual information is needed for simulation and how it can best be presented. An introduction gives an overview of the workshop and describes the contents of the report. Section 1…

  17. Guidance, Navigation and Control Innovations at the NASA Goddard Space Flight Center

    NASA Technical Reports Server (NTRS)

    Ericsson, Aprille Joy

    2002-01-01

    A viewgraph presentation on guidance navigation and control innovations at the NASA Goddard Space Flight Center is presented. The topics include: 1) NASA's vision; 2) NASA's Mission; 3) Earth Science Enterprise (ESE); 4) Guidance, Navigation and Control Division (GN&C); 5) Landsat-7 Earth Observer-1 Co-observing Program; and 6) NASA ESE Vision.

  18. Synthetic Vision Systems in GA Cockpit-Evaluation of Basic Maneuvers Performed by Low Time GA Pilots During Transition from VMC to IMC

    NASA Technical Reports Server (NTRS)

    Takallu, M. A.; Wong, D. T.; Uenking, M. D.

    2002-01-01

    An experimental investigation was conducted to study the effectiveness of modern flight displays in general aviation cockpits for mitigating Low Visibility Loss of Control and the Controlled Flight Into Terrain accidents. A total of 18 General Aviation (GA) pilots with private pilot, single engine land rating, with no additional instrument training beyond private pilot license requirements, were recruited to evaluate three different display concepts in a fixed-based flight simulator at the NASA Langley Research Center's General Aviation Work Station. Evaluation pilots were asked to continue flight from Visual Meteorological Conditions (VMC) into Instrument Meteorological Conditions (IMC) while performing a series of 4 basic precision maneuvers. During the experiment, relevant pilot/vehicle performance variables, pilot control inputs and physiological data were recorded. Human factors questionnaires and interviews were administered after each scenario. Qualitative and quantitative data have been analyzed and the results are presented here. Pilot performance deviations from the established target values (errors) were computed and compared with the FAA Practical Test Standards. Results of the quantitative data indicate that evaluation pilots committed substantially fewer errors when using the Synthetic Vision Systems (SVS) displays than when they were using conventional instruments. Results of the qualitative data indicate that evaluation pilots perceived themselves to have a much higher level of situation awareness while using the SVS display concept.

  19. Flight Simulator Evaluation of Display Media Devices for Synthetic Vision Concepts

    NASA Technical Reports Server (NTRS)

    Arthur, J. J., III; Williams, Steven P.; Prinzel, Lawrence J., III; Kramer, Lynda J.; Bailey, Randall E.

    2004-01-01

    The Synthetic Vision Systems (SVS) Project of the National Aeronautics and Space Administration's (NASA) Aviation Safety Program (AvSP) is striving to eliminate poor visibility as a causal factor in aircraft accidents as well as enhance operational capabilities of all aircraft. To accomplish these safety and capacity improvements, the SVS concept is designed to provide a clear view of the world around the aircraft through the display of computer-generated imagery derived from an onboard database of terrain, obstacle, and airport information. Display media devices with which to implement SVS technology that have been evaluated so far within the Project include fixed field of view head up displays and head down Primary Flight Displays with pilot-selectable field of view. A simulation experiment was conducted comparing these display devices to a fixed field of view, unlimited field of regard, full color Helmet-Mounted Display system. Subject pilots flew a visual circling maneuver in IMC at a terrain-challenged airport. The data collected for this experiment is compared to past SVS research studies.

  20. Flight Test Evaluation of Synthetic Vision Concepts at a Terrain Challenged Airport

    NASA Technical Reports Server (NTRS)

    Kramer, Lynda J.; Prince, Lawrence J., III; Bailey, Randell E.; Arthur, Jarvis J., III; Parrish, Russell V.

    2004-01-01

    NASA's Synthetic Vision Systems (SVS) Project is striving to eliminate poor visibility as a causal factor in aircraft accidents as well as enhance operational capabilities of all aircraft through the display of computer generated imagery derived from an onboard database of terrain, obstacle, and airport information. To achieve these objectives, NASA 757 flight test research was conducted at the Eagle-Vail, Colorado airport to evaluate three SVS display types (Head-up Display, Head-Down Size A, Head-Down Size X) and two terrain texture methods (photo-realistic, generic) in comparison to the simulated Baseline Boeing-757 Electronic Attitude Direction Indicator and Navigation/Terrain Awareness and Warning System displays. The results of the experiment showed significantly improved situation awareness, performance, and workload for SVS concepts compared to the Baseline displays and confirmed the retrofit capability of the Head-Up Display and Size A SVS concepts. The research also demonstrated that the tunnel guidance display concept used within the SVS concepts achieved required navigation performance (RNP) criteria.

  1. Free flight odor tracking in Drosophila: Effect of wing chemosensors, sex and pheromonal gene regulation

    PubMed Central

    Houot, Benjamin; Gigot, Vincent; Robichon, Alain; Ferveur, Jean-François

    2017-01-01

    The evolution of powered flight in insects had major consequences for global biodiversity and involved the acquisition of adaptive processes allowing individuals to disperse to new ecological niches. Flies use both vision and olfactory input from their antennae to guide their flight; chemosensors on fly wings have been described, but their function remains mysterious. We studied Drosophila flight in a wind tunnel. By genetically manipulating wing chemosensors, we show that these structures play an essential role in flight performance with a sex-specific effect. Pheromonal systems are also involved in Drosophila flight guidance: transgenic expression of the pheromone production and detection gene, desat1, produced low, rapid flight that was absent in control flies. Our study suggests that the sex-specific modulation of free-flight odor tracking depends on gene expression in various fly tissues including wings and pheromonal-related tissues. PMID:28067325

  2. Posture, locomotion, spatial orientation, and motion sickness as a function of space flight

    NASA Technical Reports Server (NTRS)

    Reschke, M. F.; Bloomberg, J. J.; Harm, D. L.; Paloski, W. H.; Layne, C.; McDonald, V.

    1998-01-01

    This article summarizes a variety of newly published findings obtained by the Neuroscience Laboratory, Johnson Space Center, and attempts to place this work within a historical framework of previous results on posture, locomotion, motion sickness, and perceptual responses that have been observed in conjunction with space flight. In this context, we have taken the view that correct transduction and integration of signals from all sensory systems is essential to maintaining stable vision, postural and locomotor control, and eye-hand coordination as components of spatial orientation. The plasticity of the human central nervous system allows individuals to adapt to altered stimulus conditions encountered in a microgravity environment. However, until some level of adaptation is achieved, astronauts and cosmonauts often experience space motion sickness, disturbances in motion control and eye-hand coordination, unstable vision, and illusory motion of the self, the visual scene, or both. Many of the same types of disturbances encountered in space flight reappear immediately after crew members return to Earth. The magnitude of these neurosensory, sensory-motor, and perceptual disturbances, and the time needed to recover from them, tend to vary as a function of mission duration and the space traveler's prior experience with the stimulus rearrangement of space flight. To adequately chart the development of neurosensory changes associated with space flight, we recommend development of enhanced eye movement and body position measurement systems. We also advocate the use of a human small radius centrifuge as both a research tool and a means of providing on-orbit countermeasures that will lessen the impact of living for long periods of time without exposure to altering gravito-inertial forces. Copyright 1998 Elsevier Science B.V.

  3. Synthetic Vision for Lunar and Planetary Landing Vehicles

    NASA Technical Reports Server (NTRS)

    Williams, Steven P.; Arthur, Jarvis (Trey) J., III; Shelton, Kevin J.; Prinzel, Lawrence J., III; Norman, R. Michael

    2008-01-01

    The Crew Vehicle Interface (CVI) group of the Integrated Intelligent Flight Deck Technologies (IIFDT) project has done extensive research in the area of Synthetic Vision (SV), and has shown that SV technology can substantially enhance flight crew situation awareness, reduce pilot workload, promote flight path control precision, and improve aviation safety. SV technology is being extended to evaluate its utility for lunar and planetary exploration vehicles. SV may hold significant potential for many lunar and planetary missions, since the SV presentation provides a computer-generated view of the terrain and other significant environment characteristics independent of outside visibility conditions, window locations, or vehicle attributes. SV allows unconstrained control of the computer-generated scene lighting, terrain coloring, and virtual camera angles, which may provide invaluable visual cues to pilots and astronauts. In addition, important vehicle state information, such as forward and down velocities, altitude, and fuel remaining, may be conformally displayed on the view to enhance trajectory control and awareness of vehicle system status. This paper discusses preliminary SV concepts for tactical and strategic displays for a lunar landing vehicle. The technical challenges and potential solutions to SV applications for the lunar landing mission are explored, including the requirements for high-resolution lunar terrain maps and for accurate knowledge of vehicle position and orientation, both of which are essential to providing lunar Synthetic Vision System (SVS) cockpit displays. The paper also discusses the technical challenge of creating an accurate synthetic terrain portrayal using an ellipsoidal lunar digital elevation model, which eliminates projection errors and can be efficiently rendered in real time.

  4. A Program in Air Transportation Technology (Joint University Program)

    NASA Technical Reports Server (NTRS)

    Stengel, Robert F.

    1996-01-01

    The Joint University Program on Air Transportation Technology was conducted at Princeton University from 1971 to 1995. Our vision was to further understanding of the design and operation of transport aircraft, of the effects of the atmospheric environment on aircraft flight, and of the development and utilization of the National Airspace System. As an adjunct, the program emphasized the independent research of both graduate and undergraduate students. Recent principal goals were to develop and verify new methods for design and analysis of intelligent flight control systems, aircraft guidance logic for recovery from wake vortex encounter, and robust flight control systems. Our research scope subsumed problems associated with multidisciplinary aircraft design synthesis and analysis based on flight physics, providing a theoretical basis for developing innovative control concepts that enhance aircraft performance and safety. Our research focus was of direct interest not only to NASA but to manufacturers of aircraft and their associated systems. Our approach, metrics, and future directions are described in the remainder of the report.

  5. ALHAT COBALT: CoOperative Blending of Autonomous Landing Technology

    NASA Technical Reports Server (NTRS)

    Carson, John M.

    2015-01-01

    The COBALT project is a flight demonstration of two NASA ALHAT (Autonomous precision Landing and Hazard Avoidance Technology) capabilities that are key for future robotic or human landing GN&C (Guidance, Navigation and Control) systems. The COBALT payload integrates the Navigation Doppler Lidar (NDL) for ultraprecise velocity and range measurements with the Lander Vision System (LVS) for Terrain Relative Navigation (TRN) position estimates. Terrestrial flight tests of the COBALT payload in an open-loop and closed-loop GN&C configuration will be conducted onboard a commercial, rocket-propulsive Vertical Test Bed (VTB) at a test range in Mojave, CA.

  6. Antennal Regulation of Migratory Flight in the Neotropical Moth, Urania fulgens

    USDA-ARS?s Scientific Manuscript database

    Migrating insects use their sensory system to acquire local and global cues about their surroundings. Previous research on tethered insects has suggested that in addition to vision and bending of cephalic bristles, insects use antennal mechanosensory feedback to maintain their airspeeds. Due to larg...

  7. Advanced integrated enhanced vision systems

    NASA Astrophysics Data System (ADS)

    Kerr, J. R.; Luk, Chiu H.; Hammerstrom, Dan; Pavel, Misha

    2003-09-01

    In anticipation of its ultimate role in transport, business, and rotary-wing aircraft, we clarify the role of Enhanced Vision Systems (EVS): how the output data will be utilized, appropriate architecture for total avionics integration, pilot and control interfaces, and operational utilization. Ground-map (database) correlation is critical, and we suggest that "synthetic vision" is simply a subset of the monitor/guidance interface issue. The core of integrated EVS is its sensor processor. In order to approximate optimal, Bayesian multi-sensor fusion and ground correlation functionality in real time, we are developing a neural net approach utilizing human visual pathway and self-organizing, associative-engine processing. In addition to EVS/SVS imagery, outputs will include sensor-based navigation and attitude signals as well as hazard detection. A system architecture is described, encompassing an all-weather sensor suite; advanced processing technology; inertial, GPS and other avionics inputs; and pilot and machine interfaces. Issues of total-system accuracy and integrity are addressed, as well as flight operational aspects relating to both civil certification and military applications in IMC.
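
    As a point of reference for the "optimal, Bayesian multi-sensor fusion" that the neural approach is intended to approximate, the simplest Gaussian case reduces to inverse-variance weighting of the individual sensor estimates. The sketch below is a generic textbook illustration with made-up numbers, not the EVS sensor processor itself.

```python
import numpy as np

def fuse_gaussian(estimates, variances):
    """Bayesian fusion of independent Gaussian estimates of the same quantity.

    The posterior mean is the inverse-variance-weighted average of the sensor
    estimates, and the posterior variance is the harmonic combination of the
    individual sensor variances.
    """
    estimates = np.asarray(estimates, dtype=float)
    weights = 1.0 / np.asarray(variances, dtype=float)
    fused_var = 1.0 / weights.sum()
    fused_mean = fused_var * (weights * estimates).sum()
    return fused_mean, fused_var

# Illustrative only: two noisy range estimates of the same feature from
# hypothetical radar and infrared channels.
mean, var = fuse_gaussian([152.0, 149.0], [4.0, 1.0])
print(f"fused range {mean:.1f} m, sigma {var ** 0.5:.2f} m")
```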

  8. Real-time synthetic vision cockpit display for general aviation

    NASA Astrophysics Data System (ADS)

    Hansen, Andrew J.; Smith, W. Garth; Rybacki, Richard M.

    1999-07-01

    Low-cost, high-performance graphics solutions based on PC hardware platforms are now capable of rendering a synthetic vision of a pilot's out-the-window view during all phases of flight. When coupled to a GPS navigation payload, the virtual image can be fully correlated to the physical world. In particular, differential GPS services such as the Wide Area Augmentation System (WAAS) will provide all aviation users with highly accurate 3D navigation. As well, short-baseline GPS attitude systems are becoming a viable and inexpensive solution. A glass cockpit display rendering terrain draped with geographically specific imagery in real time can be coupled with high-accuracy (7 m 95% positioning, sub-degree pointing), high-integrity (99.99999% position error bound) differential GPS navigation/attitude solutions to provide both situational awareness and 3D guidance to (auto)pilots throughout the en route, terminal area, and precision approach phases of flight. This paper describes the technical issues addressed when coupling GPS and glass cockpit displays, including the navigation/display interface, real-time 60 Hz rendering of terrain with multiple levels of detail under demand paging, and construction of verified terrain databases draped with geographically specific satellite imagery. Further, on-board recordings of the navigation solution and the cockpit display provide a replay facility for post-flight simulation based on live landings, as well as synchronized multiple display channels with different views from the same flight. PC-based solutions which integrate GPS navigation and attitude determination with 3D visualization provide the aviation community, and general aviation in particular, with low-cost, high-performance guidance and situational awareness in all phases of flight.
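
    Correlating the rendered scene with the physical world starts by mapping the geodetic GPS fix into the Cartesian frame of the terrain database. A standard WGS-84 geodetic-to-ECEF conversion is sketched below; the constants are the published WGS-84 values, while the example coordinates are illustrative and not taken from the paper.

```python
import math

# Published WGS-84 ellipsoid constants.
WGS84_A = 6378137.0                     # semi-major axis, metres
WGS84_F = 1.0 / 298.257223563           # flattening
WGS84_E2 = WGS84_F * (2.0 - WGS84_F)    # first eccentricity squared

def geodetic_to_ecef(lat_deg, lon_deg, h_m):
    """Convert a GPS fix (degrees, metres) to Earth-Centred Earth-Fixed x, y, z.

    The ECEF position can then be transformed into the local frame of the
    terrain database to place the synthetic-vision eyepoint.
    """
    lat = math.radians(lat_deg)
    lon = math.radians(lon_deg)
    n = WGS84_A / math.sqrt(1.0 - WGS84_E2 * math.sin(lat) ** 2)  # prime vertical radius
    x = (n + h_m) * math.cos(lat) * math.cos(lon)
    y = (n + h_m) * math.cos(lat) * math.sin(lon)
    z = (n * (1.0 - WGS84_E2) + h_m) * math.sin(lat)
    return x, y, z

# Illustrative fix: roughly 39.5 N, 119.77 W at 1345 m elevation.
print(geodetic_to_ecef(39.5, -119.77, 1345.0))
```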

  9. Flight Research and Validation Formerly Experimental Capabilities Supersonic Project

    NASA Technical Reports Server (NTRS)

    Banks, Daniel

    2009-01-01

    This slide presentation reviews the work of the Experimental Capabilities Supersonic project, which is being reorganized into Flight Research and Validation. The work of the Experimental Capabilities project in FY '09 is reviewed, and the specific centers assigned to do the work are given. The portfolio of the newly formed Flight Research and Validation (FRV) group is also reviewed. The various FRV projects for FY '10 are detailed. These projects include: Eagle Probe, Channeled Centerbody Inlet Experiment (CCIE), Supersonic Boundary Layer Transition test (SBLT), Aero-elastic Test Wing-2 (ATW-2), G-V External Vision Systems (G5 XVS), Air-to-Air Schlieren (A2A), In-Flight Background Oriented Schlieren (BOS), Dynamic Inertia Measurement Technique (DIM), and Advanced In-Flight IR Thermography (AIR-T).

  10. Runway Incursion Prevention System Testing at the Wallops Flight Facility

    NASA Technical Reports Server (NTRS)

    Jones, Denise R.

    2005-01-01

    A Runway Incursion Prevention System (RIPS) integrated with a Synthetic Vision System concept (SVS) was tested at the Reno/Tahoe International Airport (RNO) and Wallops Flight Facility (WAL) in the summer of 2004. RIPS provides enhanced surface situational awareness and alerts of runway conflicts in order to prevent runway incidents while also improving operational capability. A series of test runs was conducted using a Gulfstream-V (G-V) aircraft as the test platform and a NASA test aircraft and a NASA test van as incurring traffic. The purpose of the study, from the RIPS perspective, was to evaluate the RIPS airborne incursion detection algorithms and associated alerting and airport surface display concepts, focusing on crossing runway incursion scenarios. This paper gives an overview of the RIPS, WAL flight test activities, and WAL test results.

  11. PICASSO VISION instrument design, engineering model test results, and flight model development status

    NASA Astrophysics Data System (ADS)

    Näsilä, Antti; Holmlund, Christer; Mannila, Rami; Näkki, Ismo; Ojanen, Harri J.; Akujärvi, Altti; Saari, Heikki; Fussen, Didier; Pieroux, Didier; Demoulin, Philippe

    2016-10-01

    PICASSO - A PICo-satellite for Atmospheric and Space Science Observations is an ESA project led by the Belgian Institute for Space Aeronomy, in collaboration with VTT Technical Research Centre of Finland Ltd, Clyde Space Ltd. (UK) and Centre Spatial de Liège (BE). The test campaign for the engineering model of the PICASSO VISION instrument, a miniaturized nanosatellite spectral imager, has been successfully completed. The test results look very promising. The proto-flight model of VISION has also been successfully integrated and it is waiting for the final integration to the satellite platform.

  12. Test and Verification Approach for the NASA Constellation Program

    NASA Technical Reports Server (NTRS)

    Strong, Edward

    2008-01-01

    This viewgraph presentation is a test and verification approach for the NASA Constellation Program. The contents include: 1) The Vision for Space Exploration: Foundations for Exploration; 2) Constellation Program Fleet of Vehicles; 3) Exploration Roadmap; 4) Constellation Vehicle Approximate Size Comparison; 5) Ares I Elements; 6) Orion Elements; 7) Ares V Elements; 8) Lunar Lander; 9) Map of Constellation content across NASA; 10) CxP T&V Implementation; 11) Challenges in CxP T&V Program; 12) T&V Strategic Emphasis and Key Tenets; 13) CxP T&V Mission & Vision; 14) Constellation Program Organization; 15) Test and Evaluation Organization; 16) CxP Requirements Flowdown; 17) CxP Model Based Systems Engineering Approach; 18) CxP Verification Planning Documents; 19) Environmental Testing; 20) Scope of CxP Verification; 21) CxP Verification - General Process Flow; 22) Avionics and Software Integrated Testing Approach; 23) A-3 Test Stand; 24) Space Power Facility; 25) MEIT and FEIT; 26) Flight Element Integrated Test (FEIT); 27) Multi-Element Integrated Testing (MEIT); 28) Flight Test Driving Principles; and 29) Constellation s Integrated Flight Test Strategy Low Earth Orbit Servicing Capability.

  13. Beyond the cockpit: The visual world as a flight instrument

    NASA Technical Reports Server (NTRS)

    Johnson, W. W.; Kaiser, M. K.; Foyle, D. C.

    1992-01-01

    The use of cockpit instruments to guide flight control is not always an option (e.g., low level rotorcraft flight). Under such circumstances the pilot must use out-the-window information for control and navigation. Thus it is important to determine the basis of visually guided flight for several reasons: (1) to guide the design and construction of the visual displays used in training simulators; (2) to allow modeling of visibility restrictions brought about by weather, cockpit constraints, or distortions introduced by sensor systems; and (3) to aid in the development of displays that augment the cockpit window scene and are compatible with the pilot's visual extraction of information from the visual scene. The authors are actively pursuing these questions. We have on-going studies using both low-cost, lower fidelity flight simulators, and state-of-the-art helicopter simulation research facilities. Research results will be presented on: (1) the important visual scene information used in altitude and speed control; (2) the utility of monocular, stereo, and hyperstereo cues for the control of flight; (3) perceptual effects due to the differences between normal unaided daylight vision, and that made available by various night vision devices (e.g., light intensifying goggles and infra-red sensor displays); and (4) the utility of advanced contact displays in which instrument information is made part of the visual scene, as on a 'scene linked' head-up display (e.g., displaying altimeter information on a virtual billboard located on the ground).

  14. The NASA Bed Rest Project

    NASA Technical Reports Server (NTRS)

    Rhodes, Bradley; Meck, Janice

    2005-01-01

    NASA's National Vision for Space Exploration includes human travel beyond low earth orbit and the ultimate safe return of the crews. Crucial to fulfilling the vision is the successful and timely development of countermeasures for the adverse physiological effects on human systems caused by long term exposure to the microgravity environment. Limited access to in-flight resources for the foreseeable future increases NASA's reliance on ground-based analogs to simulate these effects of microgravity. The primary analog for human based research will be head-down bed rest. By this approach NASA will be able to evaluate countermeasures in large sample sizes, perform preliminary evaluations of proposed in-flight protocols and assess the utility of individual or combined strategies before flight resources are requested. In response to this critical need, NASA has created the Bed Rest Project at the Johnson Space Center. The Project establishes the infrastructure and processes to provide a long term capability for standardized domestic bed rest studies and countermeasure development. The Bed Rest Project design takes a comprehensive, interdisciplinary, integrated approach that reduces the resource overhead of one investigator for one campaign. In addition to integrating studies operationally relevant for exploration, the Project addresses other new Vision objectives, namely: 1) interagency cooperation with the NIH allows for Clinical Research Center (CRC) facility sharing to the benefit of both agencies, 2) collaboration with our International Partners expands countermeasure development opportunities for foreign and domestic investigators as well as promotes consistency in approach and results, 3) to the greatest degree possible, the Project also advances research by clinicians and academia alike to encourage return to earth benefits. This paper will describe the Project's top level goals, organization and relationship to other Exploration Vision Projects, implementation strategy, address Project deliverables, schedules and provide a status of bed rest campaigns presently underway.

  15. Flight Testing of Terrain-Relative Navigation and Large-Divert Guidance on a VTVL Rocket

    NASA Technical Reports Server (NTRS)

    Trawny, Nikolas; Benito, Joel; Tweddle, Brent; Bergh, Charles F.; Khanoyan, Garen; Vaughan, Geoffrey M.; Zheng, Jason X.; Villalpando, Carlos Y.; Cheng, Yang; Scharf, Daniel P.

    2015-01-01

    Since 2011, the Autonomous Descent and Ascent Powered-Flight Testbed (ADAPT) has been used to demonstrate advanced descent and landing technologies onboard the Masten Space Systems (MSS) Xombie vertical-takeoff, vertical-landing suborbital rocket. The current instantiation of ADAPT is a stand-alone payload comprising sensing and avionics for terrain-relative navigation and fuel-optimal onboard planning of large divert trajectories, thus providing complete pin-point landing capabilities needed for planetary landers. To this end, ADAPT combines two technologies developed at JPL, the Lander Vision System (LVS), and the Guidance for Fuel Optimal Large Diverts (G-FOLD) software. This paper describes the integration and testing of LVS and G-FOLD in the ADAPT payload, culminating in two successful free flight demonstrations on the Xombie vehicle conducted in December 2014.

  16. Fusion of Synthetic and Enhanced Vision for All-Weather Commercial Aviation Operations

    NASA Technical Reports Server (NTRS)

    Bailey, Randall E.; Kramer, Lynda J.; Prinzel, Lawrence, III

    2007-01-01

    NASA is developing revolutionary crew-vehicle interface technologies that strive to proactively overcome aircraft safety barriers that would otherwise constrain the full realization of the next-generation air transportation system. A piloted simulation experiment was conducted to evaluate the complementary use of Synthetic and Enhanced Vision technologies. Specific focus was placed on new techniques for integration and/or fusion of Enhanced and Synthetic Vision and its impact within a two-crew flight deck during low visibility approach and landing operations. Overall, the experimental data showed that significant improvements in situation awareness, without concomitant increases in workload and display clutter, could be provided by the integration and/or fusion of synthetic and enhanced vision technologies for the pilot-flying and the pilot-not-flying. During non-normal operations, the ability of the crew to handle substantial navigational errors and runway incursions were not adversely impacted by the display concepts although the addition of Enhanced Vision did not, unto itself, provide an improvement in runway incursion detection.

  17. Robot Guidance Using Machine Vision Techniques in Industrial Environments: A Comparative Review.

    PubMed

    Pérez, Luis; Rodríguez, Íñigo; Rodríguez, Nuria; Usamentiaga, Rubén; García, Daniel F

    2016-03-05

    In the factory of the future, most of the operations will be done by autonomous robots that need visual feedback to move around the working space avoiding obstacles, to work collaboratively with humans, to identify and locate the working parts, to complete the information provided by other sensors to improve their positioning accuracy, etc. Different vision techniques, such as photogrammetry, stereo vision, structured light, time of flight and laser triangulation, among others, are widely used for inspection and quality control processes in the industry and now for robot guidance. Choosing which type of vision system to use is highly dependent on the parts that need to be located or measured. Thus, in this paper a comparative review of different machine vision techniques for robot guidance is presented. This work analyzes accuracy, range and weight of the sensors, safety, processing time and environmental influences. Researchers and developers can take it as background information for their future work.

  18. Robot Guidance Using Machine Vision Techniques in Industrial Environments: A Comparative Review

    PubMed Central

    Pérez, Luis; Rodríguez, Íñigo; Rodríguez, Nuria; Usamentiaga, Rubén; García, Daniel F.

    2016-01-01

    In the factory of the future, most of the operations will be done by autonomous robots that need visual feedback to move around the working space avoiding obstacles, to work collaboratively with humans, to identify and locate the working parts, to complete the information provided by other sensors to improve their positioning accuracy, etc. Different vision techniques, such as photogrammetry, stereo vision, structured light, time of flight and laser triangulation, among others, are widely used for inspection and quality control processes in the industry and now for robot guidance. Choosing which type of vision system to use is highly dependent on the parts that need to be located or measured. Thus, in this paper a comparative review of different machine vision techniques for robot guidance is presented. This work analyzes accuracy, range and weight of the sensors, safety, processing time and environmental influences. Researchers and developers can take it as background information for their future work. PMID:26959030

  19. PRoViScout: a planetary scouting rover demonstrator

    NASA Astrophysics Data System (ADS)

    Paar, Gerhard; Woods, Mark; Gimkiewicz, Christiane; Labrosse, Frédéric; Medina, Alberto; Tyler, Laurence; Barnes, David P.; Fritz, Gerald; Kapellos, Konstantinos

    2012-01-01

    Mobile systems exploring planetary surfaces will require more autonomy in the future than they have today. The EU FP7-SPACE project ProViScout (2010-2012) establishes the building blocks of such autonomous exploration systems in terms of robotic vision by a decision-based combination of navigation and scientific target selection, and integrates them into a framework ready for, and exposed to, field demonstration. The PRoViScout on-board system consists of mission management components such as an Executive, a Mars Mission On-Board Planner and Scheduler, a Science Assessment Module, and Navigation & Vision Processing modules. The platform hardware consists of the rover with the sensors and pointing devices. We report on the major building blocks and their functions & interfaces, emphasizing the computer vision parts such as image acquisition (using a novel zoomed 3D-Time-of-Flight & RGB camera), mapping from 3D-TOF data, panoramic image & stereo reconstruction, hazard and slope maps, visual odometry, and the recognition of potentially scientifically interesting targets.

  20. Direct Evidence for Vision-based Control of Flight Speed in Budgerigars.

    PubMed

    Schiffner, Ingo; Srinivasan, Mandyam V

    2015-06-05

    We have investigated whether, and, if so, how birds use vision to regulate the speed of their flight. Budgerigars, Melopsittacus undulatus, were filmed in 3-D using high-speed video cameras as they flew along a 25 m tunnel in which stationary or moving vertically oriented black and white stripes were projected on the side walls. We found that the birds increased their flight speed when the stripes were moved in the birds' flight direction, but decreased it only marginally when the stripes were moved in the opposite direction. The results provide the first direct evidence that Budgerigars use cues based on optic flow, to regulate their flight speed. However, unlike the situation in flying insects, it appears that the control of flight speed in Budgerigars is direction-specific. It does not rely solely on cues derived from optic flow, but may also be determined by energy constraints.
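
    As a toy illustration of the constant-optic-flow hypothesis that the moving-stripe experiment probes (and which the results only partly support), a speed regulator that holds the perceived translational flow at a set-point predicts the observed speed-up when the pattern moves with the bird. All numbers and names below are illustrative assumptions, not data from the paper.

```python
def constant_flow_speed(pattern_speed_mps, wall_distance_m, flow_setpoint_rad_s):
    """Flight speed that holds the perceived translational optic flow constant.

    For a bird flying at ground speed v past a wall pattern a lateral distance d
    away, with the pattern itself moving at pattern_speed in the flight
    direction, the flow is roughly (v - pattern_speed) / d rad/s.  Holding the
    flow at a set-point therefore predicts v = setpoint * d + pattern_speed,
    i.e. the bird should speed up when the stripes move with it.  (The paper
    finds the real behaviour is direction-specific and only partly explained
    by such a rule.)
    """
    return flow_setpoint_rad_s * wall_distance_m + pattern_speed_mps

# Illustrative numbers only: 3 rad/s set-point, walls 0.6 m to each side.
for vp in (-1.0, 0.0, 1.0):
    v = constant_flow_speed(vp, 0.6, 3.0)
    print(f"stripe speed {vp:+.1f} m/s -> predicted flight speed {v:.1f} m/s")
```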

  1. Direct Evidence for Vision-based Control of Flight Speed in Budgerigars

    PubMed Central

    Schiffner, Ingo; Srinivasan, Mandyam V.

    2015-01-01

    We have investigated whether, and, if so, how birds use vision to regulate the speed of their flight. Budgerigars, Melopsittacus undulatus, were filmed in 3-D using high-speed video cameras as they flew along a 25 m tunnel in which stationary or moving vertically oriented black and white stripes were projected on the side walls. We found that the birds increased their flight speed when the stripes were moved in the birds’ flight direction, but decreased it only marginally when the stripes were moved in the opposite direction. The results provide the first direct evidence that Budgerigars use cues based on optic flow, to regulate their flight speed. However, unlike the situation in flying insects, it appears that the control of flight speed in Budgerigars is direction-specific. It does not rely solely on cues derived from optic flow, but may also be determined by energy constraints. PMID:26046799

  2. Computer vision techniques for rotorcraft low altitude flight

    NASA Technical Reports Server (NTRS)

    Sridhar, Banavar

    1990-01-01

    Rotorcraft operating in high-threat environments fly close to the earth's surface to utilize surrounding terrain, vegetation, or manmade objects to minimize the risk of being detected by an enemy. Increasing levels of concealment are achieved by adopting different tactics during low-altitude flight. Rotorcraft employ three tactics during low-altitude flight: low-level, contour, and nap-of-the-earth (NOE). The key feature distinguishing the NOE mode from the other two modes is that the whole rotorcraft, including the main rotor, is below tree-top level whenever possible. This leads to the use of lateral maneuvers for avoiding obstacles, which in fact constitutes the means for concealment. The piloting of the rotorcraft is at best a very demanding task, and the pilot will need help from onboard automation tools in order to devote more time to mission-related activities. The development of an automation tool which has the potential to detect obstacles in the rotorcraft flight path, warn the crew, and interact with the guidance system to avoid detected obstacles presents challenging problems. Research is described which applies techniques from computer vision to the automation of rotorcraft navigation. The effort emphasizes the development of a methodology for detecting the ranges to obstacles in the region of interest based on the maximum utilization of passive sensors. The range map derived from the obstacle-detection approach can be used as obstacle data for obstacle avoidance in an automatic guidance system and as an advisory display to the pilot. The lack of suitable flight imagery data presents a problem in the verification of concepts for obstacle detection. This problem is being addressed by the development of an adequate flight database and by preprocessing of currently available flight imagery. The presentation concludes with some comments on future work and how research in this area relates to the guidance of other autonomous vehicles.
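
    As a generic illustration of passive range estimation (not the specific approach developed in this research), the sketch below recovers a dense range map from a rectified stereo pair using block matching; the OpenCV matcher parameters, focal length, and baseline are assumptions.

```python
import cv2
import numpy as np

def passive_range_map(left_gray, right_gray, focal_px, baseline_m):
    """Estimate a dense range map from a rectified stereo pair.

    left_gray, right_gray -- rectified 8-bit grayscale images
    focal_px              -- focal length in pixels (assumed known from calibration)
    baseline_m            -- stereo baseline in metres (assumed value)
    """
    # Block-matching disparity; the parameters here are illustrative only.
    matcher = cv2.StereoBM_create(numDisparities=64, blockSize=15)
    disparity = matcher.compute(left_gray, right_gray).astype(np.float32) / 16.0
    # Depth = f * B / d; invalid (non-positive) disparities map to infinity.
    with np.errstate(divide="ignore"):
        depth_m = np.where(disparity > 0, focal_px * baseline_m / disparity, np.inf)
    return depth_m
```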

  3. Airborne Windshear Detection and Warning Systems. Fifth and Final Combined Manufacturers' and Technologists' Conference, part 1

    NASA Technical Reports Server (NTRS)

    Delnore, Victor E. (Compiler)

    1994-01-01

    The Fifth (and Final) Combined Manufacturers' and Technologists' Airborne Windshear Review Meeting was hosted jointly by the NASA Langley Research Center (LaRC) and the Federal Aviation Administration (FAA) in Hampton, Virginia, on September 28-30, 1993. The purpose of the meeting was to report on the highly successful windshear experiments conducted by government, academic institutions, and industry; to transfer the results to regulators, manufacturers, and users; and to set initiatives for future aeronautics technology research. The formal sessions covered recent developments in windshear flight testing; windshear modeling, flight management, and ground-based systems; airborne windshear detection systems; certification and regulatory issues; development and applications of sensors for wake vortex detection; and synthetic and enhanced vision systems.

  4. 14 CFR 61.31 - Type rating requirements, additional training, and authorization requirements.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... holder is already qualified. (k) Additional training required for night vision goggle operations. (1... aircraft using night vision goggles only if that person receives and logs ground training from an...: (i) Applicable portions of this chapter that relate to night vision goggle limitations and flight...

  5. 14 CFR 61.31 - Type rating requirements, additional training, and authorization requirements.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... holder is already qualified. (k) Additional training required for night vision goggle operations. (1... aircraft using night vision goggles only if that person receives and logs ground training from an...: (i) Applicable portions of this chapter that relate to night vision goggle limitations and flight...

  6. 78 FR 54790 - Revisions to Operational Requirements for the Use of Enhanced Flight Vision Systems (EFVS) and to...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-09-06

    ....gov . SUPPLEMENTARY INFORMATION: See the ``Additional Information'' section for information on how to comment on this proposal and how the FAA will handle comments received. The ``Additional Information..., environmental, energy, or federalism impacts that might result from adopting the proposals in this document. The...

  7. A Vision of Quantitative Imaging Technology for Validation of Advanced Flight Technologies

    NASA Technical Reports Server (NTRS)

    Horvath, Thomas J.; Kerns, Robert V.; Jones, Kenneth M.; Grinstead, Jay H.; Schwartz, Richard J.; Gibson, David M.; Taylor, Jeff C.; Tack, Steve; Dantowitz, Ronald F.

    2011-01-01

    Flight-testing is traditionally an expensive but critical element in the development and ultimate validation and certification of technologies destined for future operational capabilities. Measurements obtained in relevant flight environments also provide unique opportunities to observe flow phenomena that are often beyond the capabilities of ground testing facilities and computational tools to simulate or duplicate. However, the challenges of minimizing vehicle weight and internal complexity as well as instrumentation bandwidth limitations often restrict the ability to make high-density, in-situ measurements with discrete sensors. Remote imaging offers a potential opportunity to noninvasively obtain such flight data in a complementary fashion. The NASA Hypersonic Thermodynamic Infrared Measurements Project has demonstrated such a capability to obtain calibrated thermal imagery on a hypersonic vehicle in flight. Through the application of existing and accessible technologies, the acreage surface temperature of the Shuttle lower surface was measured during reentry. Future hypersonic cruise vehicles, launcher configurations and reentry vehicles will, however, challenge current remote imaging capability. As NASA embarks on the design and deployment of a new Space Launch System architecture for access beyond earth orbit (and the commercial sector focused on low earth orbit), an opportunity exists to implement an imagery system and its supporting infrastructure that provides sufficient flexibility to incorporate changing technology to address the future needs of the flight test community. A long-term vision is offered that supports the application of advanced multi-waveband sensing technology to aid in the development of future aerospace systems and critical technologies to enable highly responsive vehicle operations across the aerospace continuum, spanning launch, reusable space access and global reach. Motivations are explored for development of an Agency-level imagery-based measurement capability to support cross-cutting applications that span the Agency mission directorates as well as meet potential needs of the commercial sector and national interests of the Intelligence, Surveillance and Reconnaissance community. A recommendation is made for an assessment study to baseline current imaging technology, including the identification of future mission requirements. Development of requirements fostered by the applications suggested in this paper would be used to identify technology gaps and direct roadmapping for implementation of an affordable and sustainable next-generation sensor/platform system.

  8. Evaluation of Alternate Concepts for Synthetic Vision Flight Displays With Weather-Penetrating Sensor Image Inserts During Simulated Landing Approaches

    NASA Technical Reports Server (NTRS)

    Parrish, Russell V.; Busquets, Anthony M.; Williams, Steven P.; Nold, Dean E.

    2003-01-01

    A simulation study was conducted in 1994 at Langley Research Center that used 12 commercial airline pilots repeatedly flying complex Microwave Landing System (MLS)-type approaches to parallel runways under Category IIIc weather conditions. Two sensor insert concepts of 'Synthetic Vision Systems' (SVS) were used in the simulated flights, with a more conventional electro-optical display (similar to a Head-Up Display with raster capability for sensor imagery), flown under less restrictive visibility conditions, used as a control condition. The SVS concepts combined the sensor imagery with a computer-generated image (CGI) of an out-the-window scene based on an onboard airport database. Various scenarios involving runway traffic incursions (taxiing aircraft and parked fuel trucks) and navigational system position errors (both static and dynamic) were used to assess the pilots' ability to manage the approach task with the display concepts. The two SVS sensor insert concepts contrasted the simple overlay of sensor imagery on the CGI scene without additional image processing (the SV display) to the complex integration (the AV display) of the CGI scene with pilot-decision aiding using both object and edge detection techniques for detection of obstacle conflicts and runway alignment errors.

  9. The JPL/KSC telerobotic inspection demonstration

    NASA Technical Reports Server (NTRS)

    Mittman, David; Bon, Bruce; Collins, Carol; Fleischer, Gerry; Litwin, Todd; Morrison, Jack; Omeara, Jacquie; Peters, Stephen; Brogdon, John; Humeniuk, Bob

    1990-01-01

    An ASEA IRB90 robotic manipulator with attached inspection cameras was moved through a Space Shuttle Payload Assist Module (PAM) Cradle under computer control. The Operator and Operator Control Station, including graphics simulation, gross-motion spatial planning, and machine vision processing, were located at JPL. The Safety and Support personnel, PAM Cradle, IRB90, and image acquisition system were stationed at the Kennedy Space Center (KSC). Images captured at KSC were used both for processing by a machine vision system at JPL and for inspection by the JPL Operator. The system found collision-free paths through the PAM Cradle, demonstrated accurate knowledge of the location of both objects of interest and obstacles, and operated with a communication delay of two seconds. Safe operation of the IRB90 near Shuttle flight hardware was obtained both through the use of a gross-motion spatial planner developed at JPL using artificial intelligence techniques, and through infrared beams and pressure-sensitive strips mounted to the critical surfaces of the flight hardware at KSC. The demonstration showed that telerobotics is effective for real tasks, safe for personnel and hardware, and highly productive and reliable for Shuttle payload operations and Space Station external operations.

  10. Development of ADOCS controllers and control laws. Volume 2: Literature review and preliminary analysis

    NASA Technical Reports Server (NTRS)

    Landis, Kenneth H.; Glusman, Steven I.

    1985-01-01

    The Advanced Cockpit Controls/Advanced Flight Control System (ACC/AFCS) study was conducted by the Boeing Vertol Company as part of the Army's Advanced Digital/Optical Control System (ADOCS) program. Specifically, the ACC/AFCS investigation was aimed at developing the flight control laws for the ADOCS demonstrator aircraft which will provide satisfactory handling qualities for an attack helicopter mission. The three major elements of design considered are as follows: Pilot's integrated Side-Stick Controller (SSC) -- Number of axes controlled; force/displacement characteristics; ergonomic design. Stability and Control Augmentation System (SCAS)--Digital flight control laws for the various mission phases; SCAS mode switching logic. Pilot's Displays--For night/adverse weather conditions, the dynamics of the superimposed symbology presented to the pilot in a format similar to the Advanced Attack Helicopter (AAH) Pilot Night Vision System (PNVS) for each mission phase as a function of SCAS characteristics; display mode switching logic. Findings from the literature review and the analysis and synthesis of desired control laws are reported in Volume 2. Conclusions drawn from pilot rating data and commentary were used to formulate recommendations for the ADOCS demonstrator flight control system design. The ACC/AFCS simulation data also provide an extensive data base to aid the development of advanced flight control system design for future V/STOL aircraft.

  11. Development of ADOCS controllers and control laws. Volume 1: Executive summary

    NASA Technical Reports Server (NTRS)

    Landis, Kenneth H.; Glusman, Steven I.

    1985-01-01

    The Advanced Cockpit Controls/Advanced Flight Control System (ACC/AFCS) study was conducted by the Boeing Vertol Company as part of the Army's Advanced Digital/Optical Control System (ADOCS) program. Specifically, the ACC/AFCS investigation was aimed at developing the flight control laws for the ADOCS demonstrator aircraft that will provide satisfactory handling qualities for an attack helicopter mission. The three major elements of design considered during the study are as follows: Pilot's integrated Side-Stick Controller (SSC) -- Number of axes controlled; force/displacement characteristics; ergonomic design. Stability and Control Augmentation System (SCAS)--Digital flight control laws for the various mission phases; SCAS mode switching logic. Pilot's Displays--For night/adverse weather conditions, the dynamics of the superimposed symbology presented to the pilot in a format similar to the Advanced Attack Helicopter (AAH) Pilot Night Vision System (PNVS) for each mission phase as a function of SCAS characteristics; display mode switching logic. Volume 1 is an Executive Summary of the study. Conclusions drawn from analysis of pilot rating data and commentary were used to formulate recommendations for the ADOCS demonstrator flight control system design. The ACC/AFCS simulation data also provide an extensive data base to aid the development of advanced flight control system design for future V/STOL aircraft.

  12. Automation and robotics for Space Station in the twenty-first century

    NASA Technical Reports Server (NTRS)

    Willshire, K. F.; Pivirotto, D. L.

    1986-01-01

    Space Station telerobotics will evolve beyond the initial capability into a smarter and more capable system as we enter the twenty-first century. Current technology programs including several proposed ground and flight experiments to enable development of this system are described. Advancements in the areas of machine vision, smart sensors, advanced control architecture, manipulator joint design, end effector design, and artificial intelligence will provide increasingly more autonomous telerobotic systems.

  13. Flight data acquisition methodology for validation of passive ranging algorithms for obstacle avoidance

    NASA Technical Reports Server (NTRS)

    Smith, Phillip N.

    1990-01-01

    The automation of low-altitude rotorcraft flight depends on the ability to detect, locate, and navigate around obstacles lying in the rotorcraft's intended flightpath. Computer vision techniques provide a passive method of obstacle detection and range estimation for obstacle avoidance. Several algorithms based on computer vision methods have been developed for this purpose using laboratory data; however, further development and validation of candidate algorithms require data collected from rotorcraft flight. A database containing low-altitude imagery augmented with the rotorcraft and sensor parameters required for passive range estimation is not readily available. Here, the emphasis is on the methodology used to develop such a database from flight-test data consisting of imagery, rotorcraft and sensor parameters, and ground-truth range measurements. As part of the data preparation, a technique for obtaining the sensor calibration parameters is described. The database will enable the further development of algorithms for computer vision-based obstacle detection and passive range estimation, as well as provide a benchmark for verification of range estimates against ground-truth measurements.

  14. Runway Safety Monitor Algorithm for Single and Crossing Runway Incursion Detection and Alerting

    NASA Technical Reports Server (NTRS)

    Green, David F., Jr.

    2006-01-01

    The Runway Safety Monitor (RSM) is an aircraft based algorithm for runway incursion detection and alerting that was developed in support of NASA's Runway Incursion Prevention System (RIPS) research conducted under the NASA Aviation Safety and Security Program's Synthetic Vision System project. The RSM algorithm provides warnings of runway incursions in sufficient time for pilots to take evasive action and avoid accidents during landings, takeoffs or when taxiing on the runway. The report documents the RSM software and describes in detail how RSM performs runway incursion detection and alerting functions for NASA RIPS. The report also describes the RIPS flight tests conducted at the Reno/Tahoe International Airport (RNO) and the Wallops Flight Facility (WAL) during July and August of 2004, and the RSM performance results and lessons learned from those flight tests.

  15. NASA Synthetic Vision EGE Flight Test

    NASA Technical Reports Server (NTRS)

    Prinzel, Lawrence J.; Kramer, Lynda J.; Comstock, J. Raymond; Bailey, Randall E.; Hughes, Monica F.; Parrish, Russell V.

    2002-01-01

    NASA Langley Research Center conducted flight tests at the Eagle County, Colorado airport to evaluate synthetic vision concepts. Three display concepts (size 'A' head-down, size 'X' head-down, and head-up displays) and two texture concepts (photo, generic) were assessed for situation awareness and flight technical error / performance while making approaches to Runway 25 and Runway 07 and simulated engine-out Cottonwood 2 and KREMM departures. The results of the study confirm the retrofit capability of the HUD and Size 'A' SVS concepts to significantly improve situation awareness and performance over current EFIS glass and non-glass instruments for difficult approaches in terrain-challenged environments.

  16. Computing Optic Flow with ArduEye Vision Sensor

    DTIC Science & Technology

    2013-01-01

    ... processing algorithm that can be applied to the flight control of other robotic platforms. Subject terms: optical flow, ArduEye, vision based ... (Figure captions: ArduEye vision chip on Stonyman breakout board connected to Arduino Mega; the Stonyman vision chips.) There is a significant need for small, light, less power-hungry sensors and sensory data processing algorithms in order to control the ...

  17. Real-time Enhancement, Registration, and Fusion for an Enhanced Vision System

    NASA Technical Reports Server (NTRS)

    Hines, Glenn D.; Rahman, Zia-ur; Jobson, Daniel J.; Woodell, Glenn A.

    2006-01-01

    Over the last few years NASA Langley Research Center (LaRC) has been developing an Enhanced Vision System (EVS) to aid pilots while flying in poor visibility conditions. The EVS captures imagery using two infrared video cameras. The cameras are placed in an enclosure that is mounted and flown forward-looking underneath the NASA LaRC ARIES 757 aircraft. The data streams from the cameras are processed in real-time and displayed on monitors on board the aircraft. With proper processing the camera system can provide better-than-human-observed imagery, particularly during poor visibility conditions. Achieving this goal, however, requires several different stages of processing, including enhancement, registration, and fusion, as well as specialized processing hardware for real-time performance. We are using a real-time implementation of the Retinex algorithm for image enhancement, affine transformations for registration, and weighted sums to perform fusion. All of the algorithms are executed on a single TI DM642 digital signal processor (DSP) clocked at 720 MHz. The image processing components were added to the EVS system, tested, and demonstrated during flight tests in August and September of 2005. In this paper we briefly discuss the EVS image processing hardware and algorithms. We then discuss implementation issues and show examples of the results obtained during flight tests.
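
    The registration and fusion stages described above can be illustrated with a short sketch. This is not the flight DSP implementation; the function name, the use of OpenCV on a host CPU, and the weight values are assumptions for a minimal example that warps one sensor image into the other's frame with a known affine transform and blends the two with a weighted sum.

```python
import cv2
import numpy as np

def register_and_fuse(base_img, other_img, affine_2x3, w_base=0.6, w_other=0.4):
    """Warp one sensor image into the frame of the other with a known affine
    transform, then blend the two with a weighted sum.

    base_img, other_img -- single-channel float32 images of the same scene
    affine_2x3          -- 2x3 float affine matrix mapping other_img into
                           base_img's frame (assumed to come from an offline
                           boresight calibration)
    w_base, w_other     -- fusion weights (illustrative values)
    """
    h, w = base_img.shape
    registered = cv2.warpAffine(other_img, affine_2x3, (w, h),
                                flags=cv2.INTER_LINEAR)
    fused = cv2.addWeighted(base_img, w_base, registered, w_other, 0.0)
    return fused
```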

  18. Canadian medical experiments on Shuttle Flight 41-G

    NASA Technical Reports Server (NTRS)

    Watt, D. G. D.; Money, K. E.; Bondar, R. L.; Thirsk, R. B.; Garneau, M.

    1985-01-01

    During the 41-G mission, two payload specialist astronauts took part in six Canadian medical experiments designed to measure how the human nervous system adapts to weightlessness, and how this might contribute to space motion sickness. Similar tests conducted pre-flight provided base-line data, and post-flight experiments examined re-adaptation to the ground. No changes were detected in the vestibulo-ocular reflex during this 8-day mission. Pronounced proprioceptive illusions were experienced, especially immediately post-flight. Tactile acuity was normal in the fingers and toes, but the ability to judge limb position was degraded. Estimates of the locations of familiar targets were grossly distorted in the absence of vision. There were no differences in taste thresholds or olfaction. Despite pre-flight tests showing unusual susceptibility to motion sickness, the Canadian payload specialist turned out to be less susceptible than normal on-orbit. Re-adaptation to the normal gravity environment occurred within the first day after landing.

  19. Flight Test Comparison of Synthetic Vision Display Concepts at Dallas/Fort Worth International Airport

    NASA Technical Reports Server (NTRS)

    Glaab, Louis J.; Kramer, Lynda J.; Arthur, Trey; Parrish, Russell V.; Barry, John S.

    2003-01-01

    Limited visibility is the single most critical factor affecting the safety and capacity of worldwide aviation operations. Synthetic Vision Systems (SVS) technology offers a solution to this visibility problem. These displays employ computer-generated terrain imagery to present 3D, perspective out-the-window scenes with sufficient information and realism to enable operations equivalent to those of a bright, clear day, regardless of weather conditions. To introduce SVS display technology into as many existing aircraft as possible, a retrofit approach was defined that employs existing HDD display capabilities for glass cockpits and HUD capabilities for other aircraft. This retrofit approach was evaluated for typical nighttime airline operations at a major international airport. Overall, 6 evaluation pilots performed 75 research approaches, accumulating 18 hours of flight time evaluating SVS display concepts, using NASA LaRC's Boeing B-757-200 aircraft at Dallas/Fort Worth International Airport. Results from this flight test establish the SVS retrofit concept, regardless of display size, as viable for the tested conditions. Future assessments need to extend evaluation of the approach to operations in an appropriate, terrain-challenged environment with daytime test conditions.

  20. Airborne Windshear Detection and Warning Systems. Fifth and Final Combined Manufacturers' and Technologists' Conference, part 2

    NASA Technical Reports Server (NTRS)

    Delnore, Victor E. (Compiler)

    1994-01-01

    The Fifth Combined Manufacturers' and Technologists' Airborne Windshear Review Meeting was hosted by the NASA Langley Research Center and the Federal Aviation Administration in Hampton, Virginia, on September 28-30, 1993. The purpose was to report on the highly successful windshear experiments conducted by government, academic institutions, and industry; to transfer the results to regulators, manufacturers, and users; and to set initiatives for future aeronautics technology research. The formal sessions covered recent developments in windshear flight testing, windshear modeling, flight management, and ground-based systems, airborne windshear detection systems, certification and regulatory issues, and development and applications of sensors for wake vortices and for synthetic and enhanced vision systems. This report was compiled to record and make available the technology updates and materials from the conference.

  1. Biomorphic Explorers

    NASA Technical Reports Server (NTRS)

    Thakoor, Sarita

    1999-01-01

    This paper presents, in viewgraph form, the first NASA/JPL workshop on Biomorphic Explorers for future missions. The topics include: 1) Biomorphic Explorers: Classification (Based on Mobility and Ambient Environment); 2) Biomorphic Flight Systems: Vision; 3) Biomorphic Explorer: Conceptual Design; 4) Biomorphic Gliders; 5) Summary and Roadmap; 6) Coordinated/Cooperative Exploration Scenario; and 7) Applications. This paper also presents illustrations of the various biomorphic explorers.

  2. 78 FR 34935 - Revisions to Operational Requirements for the Use of Enhanced Flight Vision Systems (EFVS) and to...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-06-11

    ... Transportation Safety Board OEM Original equipment manufacturer OpSpec Operation specification PAR Precision....1) B. Consolidate EFVS requirements in part 91 in a new section (Sec. 91.176) C. Establish equipment... single new section for organizational and regulatory clarity. Section 91.189 would be amended to permit...

  3. Utilizing Commercial Hardware and Open Source Computer Vision Software to Perform Motion Capture for Reduced Gravity Flight

    NASA Technical Reports Server (NTRS)

    Humphreys, Brad; Bellisario, Brian; Gallo, Christopher; Thompson, William K.; Lewandowski, Beth

    2016-01-01

    Long duration space travel to Mars or to an asteroid will expose astronauts to extended periods of reduced gravity. Since gravity is not present to aid loading, astronauts will use resistive and aerobic exercise regimes for the duration of the space flight to minimize the loss of bone density, muscle mass and aerobic capacity that occurs during exposure to a reduced gravity environment. Unlike the International Space Station (ISS), the area available for an exercise device in the next generation of spacecraft is limited. Therefore, compact resistance exercise device prototypes are being developed. The NASA Digital Astronaut Project (DAP) is supporting the Advanced Exercise Concepts (AEC) Project, the Exercise Physiology and Countermeasures (ExPC) project, and National Space Biomedical Research Institute (NSBRI)-funded researchers by developing computational models of exercising with these new advanced exercise device concepts. To perform validation of these models and to support the Advanced Exercise Concepts Project, several candidate devices have been flown onboard NASA's Reduced Gravity Aircraft. In terrestrial laboratories, researchers typically have motion capture systems available for the measurement of subject kinematics. Onboard the parabolic flight aircraft it is not practical to utilize traditional motion capture systems due to the large working volume they require and their relatively high replacement cost if damaged. To support measuring kinematics on board parabolic aircraft, a motion capture system is being developed utilizing open source computer vision code with commercial off the shelf (COTS) video camera hardware. While the system's accuracy is lower than that of laboratory setups, it provides a means to produce quantitative comparative motion capture kinematic data. Additionally, data such as the exercise volume required for small spaces such as the Orion capsule can be determined. METHODS: OpenCV is an open source computer vision library that provides the ability to perform multi-camera 3-dimensional reconstruction. Utilizing OpenCV, via the Python programming language, a set of tools has been developed to perform motion capture in confined spaces using commercial cameras. Four Sony video cameras were intrinsically calibrated prior to flight. Intrinsic calibration provides a set of camera-specific parameters to remove geometric distortion of the lens and sensor (specific to each individual camera). A set of high-contrast markers was placed on the exercising subject (safety also necessitated that they be soft in case they became detached during parabolic flight); small yarn balls were used. Extrinsic calibration, the determination of camera location and orientation parameters, is performed using fixed landmark markers shared by the camera scenes. Additionally, a wand calibration, in which the camera scenes are swept simultaneously, was also performed. Techniques have been developed to perform intrinsic calibration, extrinsic calibration, isolation of the markers in the scene, calculation of marker 2D centroids, and 3D reconstruction from multiple cameras. These methods have been tested in the laboratory in a side-by-side comparison with a traditional motion capture system and also on a parabolic flight.
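
    The 3D reconstruction step can be sketched compactly with OpenCV, the library named above. The snippet below is a minimal, hypothetical example rather than the project's actual tool chain; the function name, single-marker scope, and calibration inputs are assumptions. It undistorts a matched pair of marker centroids from two calibrated cameras and triangulates the marker's 3-D position.

```python
import cv2
import numpy as np

def reconstruct_marker_3d(K1, dist1, R1, t1, K2, dist2, R2, t2, uv1, uv2):
    """Triangulate a single marker's 3-D position from two calibrated cameras.

    K*, dist* -- 3x3 intrinsic matrices and distortion coefficients (intrinsic calibration)
    R*, t*    -- camera rotations and translations in a shared world frame
                 (extrinsic calibration); t* are length-3 vectors
    uv1, uv2  -- (u, v) pixel centroids of the same marker in each camera
    """
    # Undistort the centroids; output is in normalised image coordinates.
    n1 = cv2.undistortPoints(np.float32([[uv1]]), K1, dist1).reshape(2, 1)
    n2 = cv2.undistortPoints(np.float32([[uv2]]), K2, dist2).reshape(2, 1)
    # In normalised coordinates the projection matrices reduce to [R | t].
    P1 = np.hstack([R1, np.reshape(t1, (3, 1))]).astype(np.float64)
    P2 = np.hstack([R2, np.reshape(t2, (3, 1))]).astype(np.float64)
    X_h = cv2.triangulatePoints(P1, P2, n1.astype(np.float64), n2.astype(np.float64))
    return (X_h[:3] / X_h[3]).ravel()  # homogeneous -> Euclidean coordinates
```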

  4. The effects of moon illumination, moon angle, cloud cover, and sky glow on night vision goggle flight performance

    NASA Astrophysics Data System (ADS)

    Loro, Stephen Lee

    This study was designed to examine moon illumination, moon angle, cloud cover, sky glow, and Night Vision Goggle (NVG) flight performance to determine possible effects. The research was a causal-comparative design. The sample consisted of 194 Fort Rucker Initial Entry Rotary Wing NVG flight students being observed by 69 NVG Instructor Pilots. The students participated in NVG flight training from September 1992 through January 1993. Data were collected using a questionnaire. Observations were analyzed using a Kruskal-Wallis one-way analysis of variance and a Wilcoxon matched-pairs signed-ranks test for difference. Correlations were analyzed using Pearson's r. The analyses indicated that performance at high moon illumination levels is superior to zero moon illumination and, in most task maneuvers, superior to >0%--50% moon illumination. No differences were found in performance at moon illumination levels above 50%. Moon angle had no effect on night vision goggle flight performance. Cloud cover and sky glow have selective effects on different maneuvers. For most task maneuvers, cloud cover does not affect performance. Overcast cloud cover had a significant effect on seven of the 14 task maneuvers. Sky glow did not affect eight of the 14 task maneuvers at any level.
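
    For readers less familiar with the tests named above, the sketch below shows how the same three analyses are commonly run in Python with SciPy. The grouped scores and illumination values are invented placeholders for illustration only and are not the study's data.

```python
from scipy import stats

# Placeholder performance scores grouped by moon-illumination level
# (invented data for illustration; the study's observations are not reproduced here).
zero_illum = [2.1, 2.4, 2.0, 2.6, 2.3]
low_illum  = [2.5, 2.8, 2.6, 3.0, 2.7]   # >0%-50% illumination
high_illum = [3.1, 3.3, 3.0, 3.4, 3.2]   # >50% illumination

# Kruskal-Wallis one-way analysis of variance across illumination groups.
h_stat, p_kw = stats.kruskal(zero_illum, low_illum, high_illum)

# Wilcoxon matched-pairs signed-ranks test for a paired comparison
# (e.g. the same maneuver flown under two conditions).
w_stat, p_w = stats.wilcoxon(zero_illum, high_illum)

# Pearson's r between moon illumination (%) and a performance score.
illumination = [0, 10, 25, 50, 75, 100]
score        = [2.1, 2.4, 2.6, 3.0, 3.1, 3.2]
r, p_r = stats.pearsonr(illumination, score)

print(f"Kruskal-Wallis p={p_kw:.3f}, Wilcoxon p={p_w:.3f}, Pearson r={r:.2f}")
```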

  5. Preliminary Effect of Synthetic Vision Systems Displays to Reduce Low-Visibility Loss of Control and Controlled Flight Into Terrain Accidents

    NASA Technical Reports Server (NTRS)

    Glaab, Louis J.; Takallu, Mohammad A.

    2002-01-01

    An experimental investigation was conducted to study the effectiveness of Synthetic Vision Systems (SVS) flight displays as a means of eliminating Low Visibility Loss of Control (LVLOC) and Controlled Flight Into Terrain (CFIT) accidents by low-time general aviation (GA) pilots. A series of basic maneuvers were performed by 18 subject pilots during transition from Visual Meteorological Conditions (VMC) to Instrument Meteorological Conditions (IMC), with continued flight into IMC, employing a fixed-base flight simulator. A total of three display concepts were employed for this evaluation. One display concept, referred to as the Attitude Indicator (AI), replicated instrumentation common in today's General Aviation (GA) aircraft. The second display concept, referred to as the Electronic Attitude Indicator (EAI), featured an enlarged attitude indicator that was more representative of a glass display and also included advanced flight symbology, such as a velocity vector. The third concept, referred to as the SVS display, was identical to the EAI except that computer-generated terrain imagery replaced the conventional blue-sky/brown-ground of the EAI. Pilot performance parameters, pilot control inputs and physiological data were recorded for post-test analysis. Situation awareness (SA) and qualitative pilot comments were obtained through questionnaires and free-form interviews administered immediately after the experimental session. Initial pilot performance data were obtained from instructor pilot observations. Physiological data (skin temperature, heart rate, and muscle flexure) were also recorded. Preliminary results indicate that far fewer errors were committed when using the EAI and SVS displays than when using conventional instruments. The specific data example examined in this report illustrates the benefit of SVS displays in avoiding conditions involving massive loss of SA. All pilots acknowledged the enhanced situation awareness provided by the SVS display concept. Levels of pilot stress appear to be correlated with skin temperature measurements.

  6. Challenges in Evaluating Relationships Between Quantitative Data (Carbon Dioxide) and Qualitative Data (Self-Reported Visual Changes)

    NASA Technical Reports Server (NTRS)

    Mendez, C. M.; Foy, M.; Mason, S.; Wear, M. L.; Meyers, V.; Law, J.; Alexander, D.; Van Baalen, M.

    2014-01-01

    Understanding the nuances in clinical data is critical in developing a successful data analysis plan. Carbon dioxide (CO2) data are collected on board the International Space Station (ISS) in a continuous stream. Clinical data on ISS are primarily collected via conversations between individual crewmembers and NASA Flight Surgeons during weekly Private Medical Conferences (PMC). Law et al. (2014) [1] demonstrated a statistically significant association between weekly average CO2 levels on ISS and self-reported headaches over the reporting period from March 14, 2001 to May 31, 2012. The purpose of this analysis is to describe the evaluation of a possible association between visual changes and CO2 levels on ISS and to discuss challenges in developing an appropriate analysis plan. METHODS & PRELIMINARY RESULTS: A first analysis was conducted following the same study design as the published work on CO2 and self-reported headaches [1], substituting self-reported changes in visual acuity in place of self-reported headaches. The analysis demonstrated no statistically significant association between visual impairment, characterized by vision symptoms self-reported during PMCs, and ISS average CO2 levels over ISS missions. Closer review of the PMC records showed that vision outcomes are not well documented in terms of clinical severity, timing of onset, or timing of resolution, perhaps due to the incipient nature of vision changes. Vision has been monitored in ISS crewmembers, pre- and post-flight, using standard optometry evaluations. In-flight visual assessments were limited early in the ISS program, primarily consisting of self-perceived changes reported by crewmembers. Recently, on-orbit capabilities have greatly improved. Vision data range from self-reported post-flight changes in visual acuity, to pre- to post-flight changes identified during fundoscopic examination, to in-flight progression measured by advanced on-orbit clinical imaging capabilities at predetermined testing intervals. In contrast, CO2 data are recorded in a continuous stream over time; however, for the initial analysis these data were categorized into weekly averages.

  7. A Biomimetic Algorithm for Flight Stabilization in Airborne Vehicles, Based on Dragonfly Ocellar Vision

    DTIC Science & Technology

    2006-07-27

    ... Technical horizon sensors: over the past few years, a remarkable proliferation of designs for micro-aerial vehicles (MAVs) has occurred ... (Figure captions: sky scans with a GaP UV photodiode along three vertical paths; angle of view 30 degrees, 50% cloud cover, sun at ...) A biomimetic algorithm for flight stabilization in airborne vehicles, based on dragonfly ocellar vision.

  8. Validation of vision-based obstacle detection algorithms for low-altitude helicopter flight

    NASA Technical Reports Server (NTRS)

    Suorsa, Raymond; Sridhar, Banavar

    1991-01-01

    A validation facility being used at the NASA Ames Research Center is described which is aimed at testing vision-based obstacle detection and range estimation algorithms suitable for low-level helicopter flight. The facility is capable of processing hundreds of frames of calibrated multicamera 6 degree-of-freedom motion image sequences, generating calibrated multicamera laboratory images using convenient window-based software, and viewing range estimation results from different algorithms along with truth data using powerful window-based visualization software.

  9. A review of flight simulation techniques

    NASA Astrophysics Data System (ADS)

    Baarspul, Max

    After a brief historical review of the evolution of flight simulation techniques, this paper first deals with the main areas of flight simulator applications. Next, it describes the main components of a piloted flight simulator. Because of the presence of the pilot-in-the-loop, the digital computer driving the simulator must solve the aircraft equations of motion in ‘real-time’. Solutions to meet the high computing power required by today’s modern flight simulators are elaborated. The physical similarity between aircraft and simulator in cockpit layout, flight instruments, flying controls, etc. is discussed, based on the equipment and environmental cue fidelity required for training and research simulators. Visual systems play an increasingly important role in piloted flight simulation. The visual systems now available and most widely used are described, with image generators and display devices distinguished. The characteristics of out-of-the-window visual simulation systems pertaining to the perceptual capabilities of human vision are discussed. Faithful reproduction of aircraft motion requires large travel, velocity and acceleration capabilities of the motion system. Different types and applications of motion systems in, e.g., airline training and research are described. The principles of motion cue generation, based on the characteristics of the non-visual human motion sensors, are described. The complete motion system, consisting of the hardware and the motion drive software, is discussed. The principles of mathematical modelling of the aerodynamic, flight control, propulsion, landing gear and environmental characteristics of the aircraft are reviewed. An example of the identification of an aircraft mathematical model, based on flight and taxi tests, is presented. Finally, the paper deals with the hardware and software integration of the flight simulator components and the testing and acceptance of the complete flight simulator. Examples of the so-called ‘Computer Generated Checkout’ and ‘Proof of Match’ are presented. The concluding remarks briefly summarize the status of flight simulator technology and consider possibilities for future research.
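
    The real-time constraint mentioned above can be illustrated with a toy example: the simulation must advance the equations of motion at a fixed step while staying locked to wall-clock time. The sketch below uses a deliberately simplified 1-D point-mass stand-in for the aircraft with assumed parameter values; it is not a flight-simulator implementation.

```python
import time

def run_realtime_loop(duration_s=2.0, dt=0.02):
    """Integrate a toy 1-D point-mass 'aircraft' at a fixed 50 Hz step while
    keeping pace with wall-clock time, as a pilot-in-the-loop simulator must.
    (Illustrative only -- a real simulator integrates full 6-DOF equations.)
    """
    v, x = 70.0, 0.0                      # airspeed (m/s) and position (m), assumed values
    thrust_accel, drag_coeff = 0.5, 0.004  # assumed toy-model constants
    next_deadline = time.perf_counter()
    for _ in range(int(duration_s / dt)):
        # Simple explicit-Euler update of the (toy) equations of motion.
        accel = thrust_accel - drag_coeff * v * v
        v += accel * dt
        x += v * dt
        # Sleep until the next frame so simulated time tracks real time.
        next_deadline += dt
        delay = next_deadline - time.perf_counter()
        if delay > 0:
            time.sleep(delay)
    return x, v

if __name__ == "__main__":
    print(run_realtime_loop())
```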

  10. Stroboscopic Vision as a Treatment for Space Motion Sickness

    NASA Technical Reports Server (NTRS)

    Reschke, Millard F.; Somers, Jeffrey T.; Ford, George; Krnavek, Jody M.

    2007-01-01

    Results obtained from space flight indicate that most space crews will experience some symptoms of motion sickness, causing significant impact on the operational objectives that must be accomplished to assure mission success. Based on the initial work of Melvill Jones, we have evaluated stroboscopic vision as a method of preventing motion sickness. Given that the data presented by Professor Melvill Jones were primarily post hoc results following a study not designed to investigate motion sickness, it is unclear how motion sickness results were actually determined. Building on these original results, we undertook a three-part study that was designed to investigate the effect of stroboscopic vision (either with a strobe light or LCD shutter glasses) on motion sickness using: (1) visual field reversal, (2) reading while riding in a car (with or without external vision present), and (3) making large pitch head movements during parabolic flight.

  11. High-speed potato grading and quality inspection based on a color vision system

    NASA Astrophysics Data System (ADS)

    Noordam, Jacco C.; Otten, Gerwoud W.; Timmermans, Toine J. M.; van Zwol, Bauke H.

    2000-03-01

    A high-speed machine vision system for the quality inspection and grading of potatoes has been developed. The vision system grades potatoes on size, shape and external defects such as greening, mechanical damages, rhizoctonia, silver scab, common scab, cracks and growth cracks. A 3-CCD line-scan camera inspects the potatoes in flight as they pass under the camera. The use of mirrors to obtain a 360-degree view of the potato and the lack of product holders guarantee a full view of the potato. To achieve the required capacity of 12 tons/hour, 11 SHARC Digital Signal Processors perform the image processing and classification tasks. The total capacity of the system is about 50 potatoes/sec. The color segmentation procedure uses Linear Discriminant Analysis (LDA) in combination with a Mahalanobis distance classifier to classify the pixels. The procedure for the detection of misshapen potatoes uses a Fourier based shape classification technique. Features such as area, eccentricity and central moments are used to discriminate between similar colored defects. Experiments with red and yellow skin-colored potatoes have shown that the system is robust and consistent in its classification.
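
    The pixel-classification step (LDA-derived statistics with a Mahalanobis distance classifier) can be illustrated with a short sketch. The class names, training arrays, and use of raw RGB features below are assumptions for a minimal example; the production system described above runs on line-scan imagery and SHARC DSP hardware.

```python
import numpy as np

class MahalanobisPixelClassifier:
    """Assign each RGB pixel to the skin/defect class with the smallest
    Mahalanobis distance, in the spirit of the colour segmentation described
    above (class names and training samples are placeholders)."""

    def fit(self, samples_per_class):
        # samples_per_class: {"skin": Nx3 array, "greening": Mx3 array, ...}
        # Each class needs enough samples for a non-singular 3x3 covariance.
        self.stats = {}
        for name, samples in samples_per_class.items():
            mean = samples.mean(axis=0)
            cov_inv = np.linalg.inv(np.cov(samples, rowvar=False))
            self.stats[name] = (mean, cov_inv)
        return self

    def classify(self, pixels):
        # pixels: Nx3 array of RGB values; returns one class label per pixel.
        names = list(self.stats)
        dists = []
        for name in names:
            mean, cov_inv = self.stats[name]
            d = pixels - mean
            # Squared Mahalanobis distance d^T * C^-1 * d for every pixel row.
            dists.append(np.einsum("ij,jk,ik->i", d, cov_inv, d))
        return [names[i] for i in np.argmin(np.stack(dists), axis=0)]
```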

  12. The Efficacy of Using Synthetic Vision Terrain-Textured Images to Improve Pilot Situation Awareness

    NASA Technical Reports Server (NTRS)

    Uenking, Michael D.; Hughes, Monica F.

    2002-01-01

    The General Aviation Element of the Aviation Safety Program's Synthetic Vision Systems (SVS) Project is developing technology to eliminate low-visibility-induced General Aviation (GA) accidents. SVS displays present computer-generated 3-dimensional imagery of the surrounding terrain on the Primary Flight Display (PFD) to greatly enhance the pilot's situation awareness (SA), reducing or eliminating Controlled Flight Into Terrain as well as Low-Visibility Loss of Control accidents. SVS-conducted research is facilitating development of display concepts that provide the pilot with an unobstructed view of the outside terrain, regardless of weather conditions and time of day. A critical component of SVS displays is the appropriate presentation of terrain to the pilot. An experimental study is being conducted at NASA Langley Research Center (LaRC) to explore and quantify the relationship between the realism of the terrain presentation and resulting enhancements of pilot SA and performance. Composed of complementary simulation and flight test efforts, Terrain Portrayal for Head-Down Displays (TP-HDD) experiments will help researchers evaluate critical terrain portrayal concepts. The experimental effort is to provide data to enable design trades that optimize SVS applications, as well as develop requirements and recommendations to facilitate the certification process. In this part of the experiment a fixed-base flight simulator was equipped with various types of head-down flight displays, ranging from conventional round dials (typical of most GA aircraft) to glass-cockpit-style PFDs. The variations of the PFD included an assortment of texturing and Digital Elevation Model (DEM) resolution combinations. A test matrix of 10 terrain display configurations (in addition to the baseline displays) was evaluated by 27 pilots of various backgrounds and experience levels. Qualitative (questionnaires) and quantitative (pilot performance and physiological) data were collected during the experimental runs. This paper focuses on the experimental set-up and final physiological results of the TP-HDD simulation experiment. The physiological measures of skin temperature, heart rate, and muscle response show decreased engagement of the sympathetic and somatic nervous system responses while using the synthetic vision displays as compared to the baseline conventional display, which, in turn, indicates a reduced level of mental workload. This decreased level of workload is expected to enable improvement in the pilot's situation and terrain awareness.

  13. Visual Impairment/Increased Intracranial Pressure (VIIP): Layman's Summary

    NASA Technical Reports Server (NTRS)

    Fogarty, Jennifer

    2011-01-01

    To date NASA has documented that seven long duration astronauts have experienced in-flight and post-flight changes in vision and eye anatomy including degraded distant vision, swelling of the back of the eye, and changes in the shape of the globe. We have also documented in a few of these astronauts post-flight, increases in the pressure of the fluid that surrounds the brain and spinal cord. This is referred to as increased intracranial pressure (ICP). The functional and anatomical changes have varied in severity and duration. In the post-flight time period, some individuals have experienced a return to a pre-flight level of visual function while others have experienced changes that remain significantly altered compared to pre-flight. In addition, the increased ICP also persists in the post-flight time period. Currently, the underlying cause or causes of these changes is/are unknown but the spaceflight community at NASA suspects that the shift of blood toward the head and the changes in physiology that accompany it, such as increased intracranial pressure, play a significant role.

  14. Flight Test Results of a Synthetic Vision Elevation Database Integrity Monitor

    NASA Technical Reports Server (NTRS)

    deHaag, Maarten Uijt; Sayre, Jonathon; Campbell, Jacob; Young, Steve; Gray, Robert

    2001-01-01

    This paper discusses the flight test results of a real-time Digital Elevation Model (DEM) integrity monitor for civil aviation applications. Providing pilots with Synthetic Vision (SV) displays containing terrain information has the potential to improve flight safety by improving situational awareness and thereby reducing the likelihood of Controlled Flight Into Terrain (CFIT). Utilization of DEMs, such as the digital terrain elevation data (DTED), requires a DEM integrity check and timely integrity alerts to the pilots when used for flight-critical terrain displays; otherwise the DEM may provide hazardous, misleading terrain information. The discussed integrity monitor checks the consistency between a terrain elevation profile synthesized from sensor information and the profile given in the DEM. The synthesized profile is derived from DGPS and radar altimeter measurements. DEMs of various spatial resolutions are used to illustrate the dependency of the integrity monitor's performance on the DEM's spatial resolution. The paper gives a description of the proposed integrity algorithms, the flight test setup, and the results of a flight test performed at the Ohio University airport and in the vicinity of Asheville, NC.
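
    The core consistency check lends itself to a compact sketch. The example below is a hypothetical simplification, not the monitor flown in these tests; the function name, the mean-absolute-disparity metric, and the threshold value are assumptions. It synthesizes a terrain profile as DGPS altitude minus radar-altimeter height and compares it with DEM elevations along the flight path.

```python
import numpy as np

def dem_integrity_alert(dgps_alt_m, radar_agl_m, dem_elev_m, threshold_m=30.0):
    """Compare a terrain profile synthesized from sensors with the DEM profile
    along the flight path and raise an alert if they disagree too much.

    dgps_alt_m  -- DGPS-derived aircraft altitude above the DEM datum, per epoch
    radar_agl_m -- radar-altimeter height above ground, per epoch
    dem_elev_m  -- DEM elevation looked up at each epoch's horizontal position
    threshold_m -- alert threshold on the mean absolute disparity (assumed value;
                   a fielded monitor would use a statistically derived bound)
    """
    synthesized = np.asarray(dgps_alt_m) - np.asarray(radar_agl_m)
    disparity = synthesized - np.asarray(dem_elev_m)
    metric = float(np.mean(np.abs(disparity)))
    return metric > threshold_m, metric
```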

  15. Spaceborne GPS: Current Status and Future Visions

    NASA Technical Reports Server (NTRS)

    Bauer, Frank H.; Hartman, Kate; Lightsey, E. Glenn

    1998-01-01

    The Global Positioning System (GPS), developed by the Department of Defense is quickly revolutionizing the architecture of future spacecraft and spacecraft systems. Significant savings in spacecraft life cycle cost, in power, and in mass can be realized by exploiting GPS technology in spaceborne vehicles. These savings are realized because GPS is a systems sensor--it combines the ability to sense space vehicle trajectory, attitude, time, and relative ranging between vehicles into one package. As a result, a reduced spacecraft sensor complement can be employed and significant reductions in space vehicle operations cost can be realized through enhanced on-board autonomy. This paper provides an overview of the current status of spaceborne GPS, a description of spaceborne GPS receivers available now and in the near future, a description of the 1997-2000 GPS flight experiments, and the spaceborne GPS team's vision for the future.

  16. Computer-aided system for detecting runway incursions

    NASA Astrophysics Data System (ADS)

    Sridhar, Banavar; Chatterji, Gano B.

    1994-07-01

    A synthetic vision system for enhancing the pilot's ability to navigate and control the aircraft on the ground is described. The system uses the onboard airport database and images acquired by external sensors. Additional navigation information needed by the system is provided by the Inertial Navigation System and the Global Positioning System. The various functions of the system, such as image enhancement, map generation, obstacle detection, collision avoidance, guidance, etc., are identified. The available technologies, some of which were developed at NASA, that are applicable to the aircraft ground navigation problem are noted. Example images of a truck crossing the runway while the aircraft flies close to the runway centerline are described. These images are from a sequence of images acquired during one of the several flight experiments conducted by NASA to acquire data to be used for the development and verification of the synthetic vision concepts. These experiments provide a realistic database including video and infrared images, motion states from the Inertial Navigation System and the Global Positioning System, and camera parameters.

  17. Airbreathing Hypersonic Systems Focus at NASA Langley Research Center

    NASA Technical Reports Server (NTRS)

    Hunt, James L.; Rausch, Vincent L.

    1998-01-01

    This paper presents the status of the airbreathing hypersonic airplane and space-access vehicle design matrix, reflects on the synergies and issues, and indicates the thrust of the effort to resolve the design matrix and to focus/advance systems technology maturation. Priority is given to the design of the vision operational vehicles followed by flow-down requirements to flight demonstrator vehicles and their design for eventual consideration in the Future-X Program.

  18. A Future Vision for Remotely Piloted Aircraft: Leveraging Interoperability and Networked Operations

    DTIC Science & Technology

    2013-06-21

    Excerpt (briefing bullets): ... over the next 25 years; balances the effects envisioned in the USAF UAS Flight Plan with the reality of constrained resources and ambitious ...; theater-level unmanned systems must detect, avoid, or counter threats, operating from permissive to highly contested access in all weather ...; Rapid Reaction Group II/III SUAS Unit; light footprint, low-cost ISR option; networked autonomous C2 system; air-launched SUAS; common ...

  19. Evaluation of Fused Synthetic and Enhanced Vision Display Concepts for Low-Visibility Approach and Landing

    NASA Technical Reports Server (NTRS)

    Bailey, Randall E.; Kramer, Lynda J.; Prinzel, Lawrence J., III; Wilz, Susan J.

    2009-01-01

    NASA is developing revolutionary crew-vehicle interface technologies that strive to proactively overcome aircraft safety barriers that would otherwise constrain the full realization of the next-generation air transportation system. A piloted simulation experiment was conducted to evaluate the complementary use of Synthetic and Enhanced Vision technologies. Specific focus was placed on new techniques for integration and/or fusion of Enhanced and Synthetic Vision and its impact within a two-crew flight deck during low-visibility approach and landing operations. Overall, the experimental data showed that significant improvements in situation awareness, without concomitant increases in workload and display clutter, could be provided by the integration and/or fusion of synthetic and enhanced vision technologies for the pilot-flying and the pilot-not-flying. Improvements in lateral path control performance were realized when the Head-Up Display concepts included a tunnel, independent of the imagery (enhanced vision or fusion of enhanced and synthetic vision) presented with it. During non-normal operations, the ability of the crew to handle substantial navigational errors and runway incursions was neither improved nor adversely impacted by the display concepts. The addition of Enhanced Vision may not, of itself, provide an improvement in runway incursion detection without being specifically tailored for this application.

  20. Genitourinary issues during spaceflight: a review.

    PubMed

    Jones, J A; Jennings, R; Pietryzk, R; Ciftcioglu, N; Stepaniak, P

    2005-12-01

    The genitourinary (GU) system has not uncommonly been affected during previous spaceflights. GU issues that have been observed during spaceflight include urinary calculi, infections, retention, waste management, and reproductive issues. In-flight countermeasures for each of these issues are being developed to reduce the likelihood of adverse sequelae due to GU issues during exploration-class spaceflight, set to begin in 2018 with flights back to the Moon and on to Mars according to the February 2004 President's Vision for US Space Exploration. With implementation of a robust countermeasures program, GU issues should not pose a significant threat of mission impact during future spaceflights.

  1. Cameras Reveal Elements in the Short Wave Infrared

    NASA Technical Reports Server (NTRS)

    2010-01-01

    Goodrich ISR Systems Inc. (formerly Sensors Unlimited Inc.), based out of Princeton, New Jersey, received Small Business Innovation Research (SBIR) contracts from the Jet Propulsion Laboratory, Marshall Space Flight Center, Kennedy Space Center, Goddard Space Flight Center, Ames Research Center, Stennis Space Center, and Langley Research Center to assist in advancing and refining indium gallium arsenide imaging technology. Used on the Lunar Crater Observation and Sensing Satellite (LCROSS) mission in 2009 for imaging the short wave infrared wavelengths, the technology has dozens of applications in military, security and surveillance, machine vision, medical, spectroscopy, semiconductor inspection, instrumentation, thermography, and telecommunications.

  2. Advanced flight computers for planetary exploration

    NASA Technical Reports Server (NTRS)

    Stephenson, R. Rhoads

    1988-01-01

    Research concerning flight computers for use on interplanetary probes is reviewed. The history of these computers from the Viking mission to the present is outlined. The differences between ground commercial computers and computers for planetary exploration are listed. The development of a computer for the Mariner Mark II comet rendezvous asteroid flyby mission is described. Various aspects of recently developed computer systems are examined, including the Max real-time embedded computer, a hypercube distributed supercomputer, a SAR data processor, a processor for the High Resolution IR Imaging Spectrometer, and a robotic vision multiresolution pyramid machine for processing images obtained by a Mars Rover.

  3. 14 CFR 25.1321 - Arrangement and visibility.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... and line of vision when he is looking forward along the flight path. (b) The flight instruments... center position. (c) Required powerplant instruments must be closely grouped on the instrument panel. In...

  4. 14 CFR 25.1321 - Arrangement and visibility.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... and line of vision when he is looking forward along the flight path. (b) The flight instruments... center position. (c) Required powerplant instruments must be closely grouped on the instrument panel. In...

  5. Development of ADOCS controllers and control laws. Volume 3: Simulation results and recommendations

    NASA Technical Reports Server (NTRS)

    Landis, Kenneth H.; Glusman, Steven I.

    1985-01-01

    The Advanced Cockpit Controls/Advanced Flight Control System (ACC/AFCS) study was conducted by the Boeing Vertol Company as part of the Army's Advanced Digital/Optical Control System (ADOCS) program. Specifically, the ACC/AFCS investigation was aimed at developing the flight control laws for the ADOCS demonstrator aircraft which will provide satisfactory handling qualities for an attack helicopter mission. The three major elements of design considered are as follows: Pilot's integrated Side-Stick Controller (SSC) -- Number of axes controlled; force/displacement characteristics; ergonomic design. Stability and Control Augmentation System (SCAS)--Digital flight control laws for the various mission phases; SCAS mode switching logic. Pilot's Displays--For night/adverse weather conditions, the dynamics of the superimposed symbology presented to the pilot in a format similar to the Advanced Attack Helicopter (AAH) Pilot Night Vision System (PNVS) for each mission phase as a function of SCAS characteristics; display mode switching logic. Results of the five piloted simulations conducted at the Boeing Vertol and NASA-Ames simulation facilities are presented in Volume 3. Conclusions drawn from analysis of pilot rating data and commentary were used to formulate recommendations for the ADOCS demonstrator flight control system design. The ACC/AFCS simulation data also provide an extensive data base to aid the development of advanced flight control system design for future V/STOL aircraft.

  6. Spaceborne GPS Current Status and Future Visions

    NASA Technical Reports Server (NTRS)

    Bauer, Frank H.; Hartman, Kate; Lightsey, E. Glenn

    1998-01-01

    The Global Positioning System (GPS), developed by the Department of Defense, is quickly revolutionizing the architecture of future spacecraft and spacecraft systems. Significant savings in spacecraft life cycle cost, in power, and in mass can be realized by exploiting GPS technology in spaceborne vehicles. These savings are realized because GPS is a systems sensor: it combines the ability to sense space vehicle trajectory, attitude, time, and relative ranging between vehicles into one package. As a result, a reduced spacecraft sensor complement can be employed on spacecraft, and significant reductions in space vehicle operations cost can be realized through enhanced on-board autonomy. This paper provides an overview of the current status of spaceborne GPS, a description of spaceborne GPS receivers available now and in the near future, a description of the 1997-1999 GPS flight experiments, and the spaceborne GPS team's vision for the future.

  7. OH Vision Test

    NASA Image and Video Library

    2014-06-03

    ISS040-E-006739 (3 June 2014) --- European Space Agency astronaut Alexander Gerst, Expedition 40 flight engineer, uses the Optical Coherence Tomography (OCT) camera during an Ocular Health (OH) vision test in the Harmony node of the International Space Station. The OH experiment observes and seeks to understand vision changes during long-term space missions. NASA astronaut Steve Swanson (left), Expedition 40 commander, assists Gerst.

  8. "Space flight is utter bilge"

    NASA Astrophysics Data System (ADS)

    Yeomans, Donald

    2004-01-01

    Despite skepticism and ridicule from scientists and the public alike, a small handful of dreamers kept faith in their vision of space flight and planned for the day when humanity would break loose from Earth.

  9. Concept of Operations for Commercial and Business Aircraft Synthetic Vision Systems. 1.0

    NASA Technical Reports Server (NTRS)

    Williams Daniel M.; Waller, Marvin C.; Koelling, John H.; Burdette, Daniel W.; Capron, William R.; Barry, John S.; Gifford, Richard B.; Doyle, Thomas M.

    2001-01-01

    A concept of operations (CONOPS) for the Commercial and Business (CaB) aircraft synthetic vision systems (SVS) is described. The CaB SVS is expected to provide increased safety and operational benefits in normal and low visibility conditions. Providing operational benefits will promote SVS implementation in the Net, improve aviation safety, and assist in meeting the national aviation safety goal. SVS will enhance safety and enable consistent gate-to-gate aircraft operations in normal and low visibility conditions. The goal for developing SVS is to support operational minima as low as Category 3b in a variety of environments. For departure and ground operations, the SVS goal is to enable operations with a runway visual range of 300 feet. The system is an integrated display concept that provides a virtual visual environment. The SVS virtual visual environment is composed of three components: an enhanced intuitive view of the flight environment, hazard and obstacle detection and display, and precision navigation guidance. The virtual visual environment will support enhanced operations procedures during all phases of flight - ground operations, departure, en route, and arrival. The applications selected for emphasis in this document include low visibility departures and arrivals including parallel runway operations, and low visibility airport surface operations. These particular applications were selected because of significant potential benefits afforded by SVS.

  10. Using X-band Weather Radar Measurements to Monitor the Integrity of Digital Elevation Models for Synthetic Vision Systems

    NASA Technical Reports Server (NTRS)

    Young, Steve; UijtdeHaag, Maarten; Sayre, Jonathon

    2003-01-01

    Synthetic Vision Systems (SVS) provide pilots with displays of stored geo-spatial data representing terrain, obstacles, and cultural features. As comprehensive validation is impractical, these databases typically have no quantifiable level of integrity. Further, updates to the databases may not be provided as changes occur. These issues limit the certification level and constrain the operational context of SVS for civil aviation. Previous work demonstrated the feasibility of using a real-time monitor to bound the integrity of Digital Elevation Models (DEMs) by using radar altimeter measurements during flight. This paper describes an extension of this concept to include X-band Weather Radar (WxR) measurements. This enables the monitor to detect additional classes of DEM errors and to reduce the exposure time associated with integrity threats. Feature extraction techniques are used along with a statistical assessment of similarity measures between the sensed and stored features that are detected. Recent flight-testing in the area around the Juneau, Alaska, airport (JNU) has resulted in a comprehensive set of sensor data that is being used to assess the feasibility of the proposed monitor technology. Initial results of this assessment are presented.
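
    As an illustration of the kind of check such a monitor performs, the sketch below (not the authors' algorithm; the window length, threshold, and units are assumptions) compares terrain elevations synthesized from navigation altitude and radar altimeter measurements against the stored DEM and raises a flag when the recent mean absolute disparity grows too large.

```python
# Illustrative sketch only: a downward-looking DEM integrity check.
import numpy as np

def dem_integrity_flag(nav_alt_m, radalt_agl_m, dem_elev_m,
                       window=50, thresh_m=30.0):
    """Return True if the DEM is suspect over the most recent window of samples."""
    # Terrain elevation "sensed" along the flight path: aircraft altitude minus AGL.
    sensed_terrain = np.asarray(nav_alt_m) - np.asarray(radalt_agl_m)
    disparity = sensed_terrain - np.asarray(dem_elev_m)
    recent = disparity[-window:]
    return float(np.mean(np.abs(recent))) > thresh_m

# Example: a 40 m bias in the stored elevations trips the monitor.
nav = np.full(200, 1500.0)           # navigation altitude [m]
agl = np.full(200, 900.0)            # radar altimeter height above ground [m]
dem = np.full(200, 600.0)
dem[150:] -= 40.0                    # simulated DEM error over the last samples
print(dem_integrity_flag(nav, agl, dem))   # True
```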

  11. Nutritional Status Assessment (SMO 016E)

    NASA Technical Reports Server (NTRS)

    Smith, S. M.; Heer, M. A.; Zwart, S. R.

    2014-01-01

    The Nutritional Status Assessment Supplemental Medical Objective was initiated to expand nominal clinical nutrition testing of ISS astronauts, and to gain a better understanding of the time course of changes in nutritional status during flight. The primary activity of this effort was collecting blood and urine samples during flight for analysis after return to Earth. Samples were subjected to a battery of tests. The resulting data provide a comprehensive survey of how nutritional status and related systems are affected by 4-6 months of space flight. Analysis of these data has yielded many findings to date, including: Vision. Documented evidence that biochemical markers involved in one-carbon metabolism were altered in crewmembers who experienced vision-related issues during and after flight (1). Iron, Oxidative Stress, and Bone. In-flight data document a clear association of increased iron stores, markers of oxidative damage to DNA, and bone loss (2). Exercise. Documented that well-nourished crewmembers performing heavy resistance exercise returned from ISS with bone mineral densities unchanged from preflight (3). Furthermore, the response of bone to space flight and exercise countermeasures was the same in men and women (4). Body Mass. Crewmembers lose 2-5% of their body mass in the first month of flight, and maintain the lower body mass during flight (5). Additionally, the two devices to measure body mass on orbit, the SLAMMD and BMMD, provide similar results (5). Cytokines. Findings indicated that a pattern of persistent physiological adaptations occurs during space flight that includes shifts in immune and hormonal regulation (6). Fish/Bone. Documented a relationship between fish intake and bone loss in astronauts (that is, those who ate more fish lost less bone) (7). Vitamin K. Documented that in generally well-fed and otherwise healthy individuals, vitamin K status and bone vitamin K-dependent proteins are unaffected by space flight (and bed rest) (8). Testosterone. Documented that blood concentrations of testosterone were unchanged during flight, but a transient decline occurred after landing (9). Calcium. Nutrition SMO data contributed to the ISS Program by helping understand how and why the Urine Processor Assembly clogged with calcium sulfate precipitate (10). Sample Processing. Ground-based analytical testing results have also been published (11).

  12. [Micron]ADS-B Detect and Avoid Flight Tests on Phantom 4 Unmanned Aircraft System

    NASA Technical Reports Server (NTRS)

    Arteaga, Ricardo; Dandachy, Mike; Truong, Hong; Aruljothi, Arun; Vedantam, Mihir; Epperson, Kraettli; McCartney, Reed

    2018-01-01

    Researchers at the National Aeronautics and Space Administration Armstrong Flight Research Center in Edwards, California and Vigilant Aerospace Systems collaborated for the flight-test demonstration of an Automatic Dependent Surveillance-Broadcast based collision avoidance technology on a small unmanned aircraft system equipped with the uAvionix Automatic Dependent Surveillance-Broadcast transponder. The purpose of the testing was to demonstrate that National Aeronautics and Space Administration / Vigilant software and algorithms, commercialized as the FlightHorizon UAS(TM), are compatible with uAvionix hardware systems and the DJI Phantom 4 small unmanned aircraft system. The testing and demonstrations were necessary for both parties to further develop and certify the technology in three key areas: flights beyond visual line of sight, collision avoidance, and autonomous operations. The National Aeronautics and Space Administration and Vigilant Aerospace Systems have developed and successfully flight-tested an Automatic Dependent Surveillance-Broadcast Detect and Avoid system on the Phantom 4 small unmanned aircraft system. The Automatic Dependent Surveillance-Broadcast Detect and Avoid system architecture is especially suited for small unmanned aircraft systems because it integrates: 1) miniaturized Automatic Dependent Surveillance-Broadcast hardware; 2) radio data-link communications; 3) software algorithms for real-time Automatic Dependent Surveillance-Broadcast data integration, conflict detection, and alerting; and 4) a synthetic vision display using a fully-integrated National Aeronautics and Space Administration geobrowser for three-dimensional graphical representations for ownship and air traffic situational awareness. The flight-test objectives were to evaluate the performance of Automatic Dependent Surveillance-Broadcast Detect and Avoid collision avoidance technology as installed on two small unmanned aircraft systems. In December 2016, four flight tests were conducted at Edwards Air Force Base. Researchers in the ground control station looking at displays were able to verify the Automatic Dependent Surveillance-Broadcast target detection and collision avoidance resolutions.
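
    For readers unfamiliar with the conflict-detection step, the following minimal sketch shows a generic straight-line closest-point-of-approach test between ownship and one ADS-B target; it is not the FlightHorizon algorithm, and the thresholds and state layout are illustrative assumptions.

```python
# Illustrative sketch only: straight-line closest-point-of-approach (CPA) check.
import numpy as np

def cpa_conflict(own_pos, own_vel, tgt_pos, tgt_vel,
                 miss_thresh_m=150.0, time_thresh_s=30.0):
    """Return (conflict, t_cpa, miss_distance) for constant-velocity tracks.

    Positions are 3-vectors in a local ENU frame [m]; velocities in [m/s].
    """
    rel_p = np.asarray(tgt_pos, float) - np.asarray(own_pos, float)
    rel_v = np.asarray(tgt_vel, float) - np.asarray(own_vel, float)
    v2 = float(rel_v @ rel_v)
    # Time of closest approach, clamped to "now" if the target is diverging.
    t_cpa = 0.0 if v2 < 1e-9 else max(0.0, -float(rel_p @ rel_v) / v2)
    miss = float(np.linalg.norm(rel_p + rel_v * t_cpa))
    conflict = (t_cpa <= time_thresh_s) and (miss <= miss_thresh_m)
    return conflict, t_cpa, miss

# Example: target 1 km ahead, offset 50 m, closing at 20 m/s.
print(cpa_conflict([0, 0, 0], [30, 0, 0], [1000, 50, 0], [10, 0, 0]))
```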

  13. Cognitive mapping based on synthetic vision?

    NASA Astrophysics Data System (ADS)

    Helmetag, Arnd; Halbig, Christian; Kubbat, Wolfgang; Schmidt, Rainer

    1999-07-01

    The analysis of accidents focused our work on the avoidance of 'Controlled Flight Into Terrain' caused by insufficient situation awareness. Analysis of safety concepts led us to the design of the proposed synthetic vision system that will be described. Since most information on these 3D displays is shown graphically, it can be understood intuitively by the pilot. What new possibilities does an SVS offer for enhancing situation awareness? First, a ground collision hazard can be detected by monitoring a perspective Primary Flight Display. From a psychological point of view, this is based on the perception of expanding objects in the visual flow field. Supported by a Navigation Display, a local conflict resolution can be worked out mentally very quickly. Second, it is possible to follow a 3D flight path visualized as a 'tunnel in the sky.' This can be further improved by using a flight path prediction. These are the prerequisites for safe and adequate movement in any kind of spatial environment. However, situation awareness also requires the ability to navigate and to solve spatial problems. Both abilities are based on higher cognitive functions, in a real as well as a synthetic environment. In this paper the current training concept will be analyzed. Advantages of integrating an SVS into pilot training will be discussed, and necessary requirements for terrain depiction will be pinpointed. Finally, a modified Computer Based Training course for familiarization with Salzburg Airport for an SVS-equipped aircraft will be presented. It is developed by Darmstadt University of Technology in co-operation with Lufthansa Flight Training.

  14. Robust object tracking techniques for vision-based 3D motion analysis applications

    NASA Astrophysics Data System (ADS)

    Knyaz, Vladimir A.; Zheltov, Sergey Y.; Vishnyakov, Boris V.

    2016-04-01

    Automated and accurate spatial motion capture of an object is necessary for a wide variety of applications, including industry and science, virtual reality and film, medicine, and sports. For most applications, the reliability and accuracy of the acquired data, along with convenience for the user, are the main characteristics defining the quality of a motion capture system. Among existing systems for 3D data acquisition, which are based on different physical principles (accelerometry, magnetometry, time-of-flight, vision-based), optical motion capture systems offer advantages such as high acquisition speed and the potential for high accuracy and automation based on advanced image processing algorithms. For vision-based motion capture, accurate and robust detection and tracking of object features through the video sequence are the key elements, along with the level of automation of the capture process. To provide high accuracy of the obtained spatial data, the developed vision-based motion capture system "Mosca" is based on photogrammetric principles of 3D measurement and supports high-speed image acquisition in a synchronized mode. It includes two to four machine vision cameras for capturing video sequences of object motion. Original camera calibration and exterior orientation procedures provide the basis for high accuracy of the 3D measurements. A set of algorithms, both for detecting, identifying, and tracking similar targets and for marker-less object motion capture, has been developed and tested. Evaluation results show high robustness and reliability for various motion analysis tasks in technical and biomechanical applications.
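
    The photogrammetric core of such a system is the triangulation of marker positions from two or more calibrated views. The sketch below is a minimal linear (DLT) triangulation example with invented camera parameters; it is not the calibration or tracking pipeline of the "Mosca" system itself.

```python
# Minimal two-view linear (DLT) triangulation with made-up camera parameters.
import numpy as np

def triangulate(P1, P2, x1, x2):
    """Triangulate a 3D point from pixel coords x1, x2 and 3x4 projections P1, P2."""
    A = np.vstack([
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    _, _, vt = np.linalg.svd(A)        # null vector of A is the homogeneous point
    X = vt[-1]
    return X[:3] / X[3]

# Two toy cameras 1 m apart looking down +Z, focal length 1000 px.
K = np.array([[1000, 0, 640], [0, 1000, 480], [0, 0, 1]], float)
P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = K @ np.hstack([np.eye(3), np.array([[-1.0], [0.0], [0.0]])])
X_true = np.array([0.2, 0.1, 5.0])
x1h = P1 @ np.append(X_true, 1); x1 = x1h[:2] / x1h[2]
x2h = P2 @ np.append(X_true, 1); x2 = x2h[:2] / x2h[2]
print(triangulate(P1, P2, x1, x2))   # ~[0.2, 0.1, 5.0]
```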

  15. Practical color vision tests for air traffic control applicants: en route center and terminal facilities.

    PubMed

    Mertens, H W; Milburn, N J; Collins, W E

    2000-12-01

    Two practical color vision tests were developed and validated for use in screening Air Traffic Control Specialist (ATCS) applicants for work at en route center or terminal facilities. The development of the tests involved careful reproduction/simulation of color-coded materials from the most demanding, safety-critical color task performed in each type of facility. The tests were evaluated using 106 subjects with normal color vision and 85 with color vision deficiency. The en route center test, named the Flight Progress Strips Test (FPST), required the identification of critical red/black coding in computer printing and handwriting on flight progress strips. The terminal option test, named the Aviation Lights Test (ALT), simulated red/green/white aircraft lights that must be identified in night ATC tower operations. Color-coding is a non-redundant source of safety-critical information in both tasks. The FPST was validated by direct comparison of responses to strip reproductions with responses to the original flight progress strips and a set of strips selected independently. Validity was high; Kappa = 0.91 with original strips as the validation criterion and 0.86 with different strips. The light point stimuli of the ALT were validated physically with a spectroradiometer. The reliabilities of the FPST and ALT were estimated with Cronbach's alpha as 0.93 and 0.98, respectively. The high job-relevance, validity, and reliability of these tests increase the effectiveness and fairness of ATCS color vision testing.
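
    As a hedged illustration of the agreement statistic reported above, the snippet below computes Cohen's kappa for two lists of categorical responses; the response data are invented and the snippet is not the authors' analysis code.

```python
# Cohen's kappa for two equal-length lists of categorical responses.
from collections import Counter

def cohens_kappa(a, b):
    assert len(a) == len(b) and len(a) > 0
    n = len(a)
    p_obs = sum(x == y for x, y in zip(a, b)) / n          # observed agreement
    ca, cb = Counter(a), Counter(b)
    p_exp = sum(ca[k] * cb.get(k, 0) for k in ca) / (n * n)  # chance agreement
    return (p_obs - p_exp) / (1.0 - p_exp)

# Invented example: responses to original strips vs. reproduced strips.
orig  = ["red", "black", "red", "black", "red", "black", "red", "red"]
repro = ["red", "black", "red", "black", "red", "black", "black", "red"]
print(round(cohens_kappa(orig, repro), 3))   # 0.75 for this toy data
```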

  16. Navigation integrity monitoring and obstacle detection for enhanced-vision systems

    NASA Astrophysics Data System (ADS)

    Korn, Bernd; Doehler, Hans-Ullrich; Hecker, Peter

    2001-08-01

    Typically, Enhanced Vision (EV) systems consist of two main parts: sensor vision and synthetic vision. Synthetic vision usually generates a virtual out-the-window view using databases and accurate navigation data, e.g., provided by differential GPS (DGPS). The reliability of the synthetic vision depends highly on both the accuracy of the database used and the integrity of the navigation data. But especially in GPS-based systems, the integrity of the navigation cannot be guaranteed. Furthermore, only objects that are stored in the database can be displayed to the pilot. Consequently, unexpected obstacles are invisible, and this might cause severe problems. Therefore, additional information has to be extracted from sensor data to overcome these problems. In particular, the sensor data analysis has to identify obstacles and has to monitor the integrity of databases and navigation. Furthermore, if a lack of integrity arises, navigation data, e.g. the relative position of runway and aircraft, has to be extracted directly from the sensor data. The main contribution of this paper concerns the realization of these three sensor data analysis tasks within our EV system, which uses the HiVision 35 GHz MMW radar of EADS, Ulm, as the primary EV sensor. For the integrity monitoring, objects extracted from radar images are registered with both database objects and objects (e.g., other aircraft) transmitted via data link. This results in a classification into known and unknown radar image objects and, consequently, in a validation of the integrity of the database and navigation. Furthermore, special runway structures are searched for in the radar image where they should appear. The outcome of this runway check contributes to the integrity analysis, too. Concurrently with this investigation, a radar-image-based navigation is performed, without using either precision navigation or detailed database information, to determine the aircraft's position relative to the runway. The performance of our approach is demonstrated with real data acquired during extensive flight tests to several airports in Northern Germany.

  17. Ultra Lightweight Ballutes for Return to Earth from the Moon

    NASA Technical Reports Server (NTRS)

    Masciarelli, James P.; Lin, John K. H.; Ware, Joanne S.; Rohrschneider, Reuben R.; Braun, Robert D.; Bartels, Robert E.; Moses, Robert W.; Hall, Jeffery L.

    2006-01-01

    Ultra lightweight ballutes offer revolutionary mass and cost benefits along with flexibility in flight system design compared to traditional entry system technologies. Under funding provided by NASA's Exploration Systems Research & Technology program, our team was able to make progress in developing this technology through systems analysis and design, evaluation of materials and construction methods, and development of critical analysis tools. Results show that once this technology is mature, significant launch mass savings, operational simplicity, and mission robustness will be available to help carry out NASA's Vision for Space Exploration.

  18. Welding technology transfer task/laser based weld joint tracking system for compressor girth welds

    NASA Technical Reports Server (NTRS)

    Looney, Alan

    1991-01-01

    Sensors to control and monitor welding operations are currently being developed at Marshall Space Flight Center. The laser-based weld bead profiler/torch rotation sensor was modified to provide a weld joint tracking system for compressor girth welds. The tracking system features a precision laser-based vision sensor, automated two-axis machine motion, and an industrial PC controller. The system's benefits are the elimination of weld repairs caused by joint tracking errors (which reduces manufacturing costs and increases production output), simplification of tooling, and the freeing of costly manufacturing floor space.

  19. The Pixhawk Open-Source Computer Vision Framework for Mavs

    NASA Astrophysics Data System (ADS)

    Meier, L.; Tanskanen, P.; Fraundorfer, F.; Pollefeys, M.

    2011-09-01

    Unmanned aerial vehicles (UAVs) and micro air vehicles (MAVs) are already used intensively in geodetic applications. State-of-the-art autonomous systems, however, are geared towards applications at safe, obstacle-free altitudes greater than 30 meters. Applications at lower altitudes still require a human pilot. A new application field will be the reconstruction of structures and buildings, including facades and roofs, with semi-autonomous MAVs. Ongoing research in the MAV robotics field is focusing on enabling this system class to operate at lower altitudes in proximity to nearby obstacles and humans. PIXHAWK is an open-source and open-hardware toolkit for this purpose. The quadrotor design is optimized for onboard computer vision and can connect up to four cameras to its onboard computer. The validity of the system design is shown with a fully autonomous capture flight along a building.

  20. A Comparison of the AVS-9 and the Panoramic Night Vision Goggle During Rotorcraft Hover and Landing

    NASA Technical Reports Server (NTRS)

    Szoboszlay, Zoltan; Haworth, Loran; Simpson, Carol; Rutkowski, Michael (Technical Monitor)

    2001-01-01

    The purpose of this flight test was to measure any differences in pilot-vehicle performance and pilot opinion between the use of the current generation AVS-9 Night Vision Goggle and one variant of the prototype Panoramic Night Vision Goggle (the PNVG II). The PNVG II has more than double the horizontal field-of-view of the AVS-9, but reduced image quality. The flight path of the AH-1S helicopter was used as a measure of pilot-vehicle performance. Also recorded were subjective measures of flying qualities, physical reserves of the pilot, situational awareness, and display usability. Pilot comments and data indicate that the benefits of the additional field-of-view of the PNVG II are to some extent negated by its reduced image quality.

  1. Damage Detection and Verification System (DDVS) for In-Situ Health Monitoring

    NASA Technical Reports Server (NTRS)

    Williams, Martha K.; Lewis, Mark; Szafran, J.; Shelton, C.; Ludwig, L.; Gibson, T.; Lane, J.; Trautwein, T.

    2015-01-01

    Project presentation for the Game Changing Program Smart Book release. The Damage Detection and Verification System (DDVS) expands the Flat Surface Damage Detection System (FSDDS) sensory panels' damage detection capabilities and includes an autonomous inspection capability utilizing cameras and dynamic computer vision algorithms to verify system health. The objectives of this formulation task are to establish the concept of operations, formulate the system requirements for a potential ISS flight experiment, and develop a preliminary design of an autonomous inspection capability system that will be demonstrated as a proof-of-concept, ground-based damage detection and inspection system.

  2. Risk of Visual Impairment and Intracranial Hypertension After Space Flight: Evaluation of the Role of Polymorphism of Enzymes Involved in One-Carbon Metabolism

    NASA Technical Reports Server (NTRS)

    Smith, S. M.; Gregory, J. F.; Zeisel, G. H.; Gibson, C. R.; Mader, T. H.; Kinchen, J.; Ueland, P.; Ploutz-Snyder, R.; Heer, M.; Zwart, S. R.

    2016-01-01

    Data from the Nutritional Status Assessment protocol provided biochemical evidence that the one-carbon metabolic pathway may be altered in individuals experiencing vision-related issues during and after space flight (1, 2). Briefly, serum concentrations of homocysteine, cystathionine, 2-methylcitric acid, and methylmalonic acid were significantly (P<0.001) higher (25-45%) in astronauts with ophthalmic changes than in those without such changes (1). These differences existed before, during, and after flight. Serum folate was lower (P<0.01) during flight in individuals with ophthalmic changes. Preflight serum concentrations of cystathionine and 2-methylcitric acid, and mean in-flight serum folate, were significantly (P<0.05) correlated with postflight changes in refraction (1). A follow-up study was conducted to evaluate a small number of known polymorphisms of enzymes in the one-carbon pathway, and to evaluate how these relate to vision and other medical aspects of the eye. Specifically, we investigated 5 polymorphisms in MTRR, MTHFR, SHMT, and CBS genes and their association with ophthalmic changes after flight in 49 astronauts. The number of G alleles of MTRR 66 and C alleles of SHMT1 1420 both contributed to the odds of visual disturbances (3). Block regression showed that B-vitamin status at landing and genetics were significant predictors for many of the ophthalmic outcomes studied (3). In conclusion, we document an association between MTRR 66 and SHMT1 1420 polymorphisms and space flight-induced vision changes. These data document that individuals with an altered 1-carbon metabolic pathway may be predisposed to anatomic and/or physiologic changes that render them susceptible to ophthalmic damage during space flight.

  3. Open-Loop Flight Testing of COBALT GN&C Technologies for Precise Soft Landing

    NASA Technical Reports Server (NTRS)

    Carson, John M., III; Amzajerdian, Farzin; Seubert, Carl R.; Restrepo, Carolina I.

    2017-01-01

    A terrestrial, open-loop (OL) flight test campaign of the NASA COBALT (CoOperative Blending of Autonomous Landing Technologies) platform was conducted onboard the Masten Xodiac suborbital rocket testbed, with support through the NASA Advanced Exploration Systems (AES), Game Changing Development (GCD), and Flight Opportunities (FO) Programs. The COBALT platform integrates NASA Guidance, Navigation and Control (GN&C) sensing technologies for autonomous, precise soft landing, including the Navigation Doppler Lidar (NDL) velocity and range sensor and the Lander Vision System (LVS) Terrain Relative Navigation (TRN) system. A specialized navigation filter running onboard COBALT fuses the NDL and LVS data in real time to produce a precise navigation solution that is independent of the Global Positioning System (GPS) and suitable for future, autonomous planetary landing systems. The OL campaign tested COBALT as a passive payload, with COBALT data collection and filter execution, but with the Xodiac vehicle Guidance and Control (G&C) loops closed on a Masten GPS-based navigation solution. The OL test was performed as a risk reduction activity in preparation for an upcoming 2017 closed-loop (CL) flight campaign in which Xodiac G&C will act on the COBALT navigation solution and the GPS-based navigation will serve only as a backup monitor.
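
    The blending of two independent estimates of the same quantity, which underlies such a navigation filter, can be illustrated in heavily simplified scalar form (this is not the COBALT filter itself) by inverse-variance weighting, which is what a Kalman measurement update reduces to for a static scalar state:

```python
# Hedged, minimal example: inverse-variance fusion of two scalar estimates.
def fuse(x1, var1, x2, var2):
    """Fuse two independent estimates of the same quantity with known variances."""
    w1, w2 = 1.0 / var1, 1.0 / var2
    x = (w1 * x1 + w2 * x2) / (w1 + w2)     # weighted mean
    var = 1.0 / (w1 + w2)                   # fused variance is always smaller
    return x, var

# Invented numbers: a terrain-relative position fix of 1012.0 m (sigma 5 m) and a
# lidar-derived estimate of 1004.0 m (sigma 2 m); the result is pulled toward the
# more precise measurement.
print(fuse(1012.0, 5.0**2, 1004.0, 2.0**2))
```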

  4. Initial Development of a Metric to Describe the Level of Safety Associated with Piloting an Aircraft with Synthetic Vision Systems (SVS) Displays

    NASA Technical Reports Server (NTRS)

    Bartolone, Anthony P.; Glabb, Louis J.; Hughes, Monica F.; Parrish, Russell V.

    2005-01-01

    Synthetic Vision Systems (SVS) displays provide pilots with a continuous view of terrain combined with integrated guidance symbology in an effort to increase situation awareness (SA) and decrease workload during operations in Instrument Meteorological Conditions (IMC). It is hypothesized that SVS displays can replicate the safety and operational flexibility of flight in Visual Meteorological Conditions (VMC), regardless of actual out-the-window (OTW) visibility or time of day. Significant progress has been made towards evolving SVS displays as well as demonstrating their ability to increase SA compared to conventional avionics in a variety of conditions. While a substantial amount of data has been accumulated demonstrating the capabilities of SVS displays, the ability of SVS to replicate the safety and operational flexibility of VMC flight performance in all visibility conditions is unknown to any specific degree. In order to more fully quantify the relationship of flight operations in IMC with SVS displays to conventional operations conducted in VMC, a fundamental comparison to current day general aviation (GA) flight instruments was warranted. Such a comparison could begin to establish the extent to which SVS display concepts are capable of maintaining an "equivalent level of safety" with the round dials they could one day replace, for both current and future operations. A combination of subjective and objective data measures were used to quantify the relationship between selected components of safety that are associated with flying an approach. Four information display methods ranging from a "round dials" baseline through a fully integrated SVS package that includes terrain, pathway based guidance, and a strategic navigation display, were investigated in this high fidelity simulation experiment. In addition, a broad spectrum of pilots, representative of the GA population, were employed for testing in an attempt to enable greater application of the results and determine if "equivalent levels of safety" are achievable through the incorporation of SVS technology regardless of a pilot's flight experience.

  5. The flight telerobotic servicer and technology transfer

    NASA Technical Reports Server (NTRS)

    Andary, James F.; Bradford, Kayland Z.

    1991-01-01

    The Flight Telerobotic Servicer (FTS) project at the Goddard Space Flight Center is developing an advanced telerobotic system to assist in and reduce crew extravehicular activity (EVA) for Space Station Freedom (SSF). The FTS will provide a telerobotic capability in the early phases of the SSF program and will be employed for assembly, maintenance, and inspection applications. The current state of space technology and the general nature of the FTS tasks dictate that the FTS be designed with sophisticated teleoperational capabilities for its internal primary operating mode. However, technologies such as advanced computer vision and autonomous planning techniques would greatly enhance the FTS capabilities to perform autonomously in less structured work environments. Another objective of the FTS program is to accelerate technology transfer from research to U.S. industry.

  6. Pilot response to peripheral vision cues during instrument flying tasks.

    DOT National Transportation Integrated Search

    1968-02-01

    In an attempt to more closely associate the visual aspects of instrument flying with that of contact flight, a study was made of human response to peripheral vision cues relating to aircraft roll attitude. Pilots, ranging from 52 to 12,000 flying hou...

  7. Centaur Test Bed (CTB) for Cryogenic Fluid Management

    NASA Technical Reports Server (NTRS)

    Sakla, Steven; Kutter, Bernard; Wall, John

    2006-01-01

    Future missions such as NASA's space exploration vision and DOD satellite servicing will require significant increases in the understanding and knowledge of space based cryogenic fluid management (CFM), including the transfer and storage of cryogenic fluids. Existing CFM capabilities are based on flight of upper stage cryogenic vehicles, scientific dewars, a few dedicated flight demonstrations and ground testing. This current capability is inadequate to support development of the CEV cryogenic propulsion system, other aspects of robust space exploration or the refueling of satellite cryo propulsion systems with reasonable risk. In addition, these technologies can provide significant performance increases for missions beyond low-Earth orbit to enable manned missions to the Moon and beyond. The Centaur upper-stage vehicle can provide a low cost test platform for performing numerous flight demonstrations of the full breadth of required CFM technologies to support CEV development. These flight demonstrations can be performed as secondary mission objectives using excess LH2 and/or LO2 from the main vehicle propellant tanks following primary spacecraft separation at minimal cost and risk.

  8. Physics-based simulations of aerial attacks by peregrine falcons reveal that stooping at high speed maximizes catch success against agile prey.

    PubMed

    Mills, Robin; Hildenbrandt, Hanno; Taylor, Graham K; Hemelrijk, Charlotte K

    2018-04-01

    The peregrine falcon Falco peregrinus is renowned for attacking its prey from high altitude in a fast controlled dive called a stoop. Many other raptors employ a similar mode of attack, but the functional benefits of stooping remain obscure. Here we investigate whether, when, and why stooping promotes catch success, using a three-dimensional, agent-based modeling approach to simulate attacks of falcons on aerial prey. We simulate avian flapping and gliding flight using an analytical quasi-steady model of the aerodynamic forces and moments, parametrized by empirical measurements of flight morphology. The model-birds' flight control inputs are commanded by their guidance system, comprising a phenomenological model of its vision, guidance, and control. To intercept its prey, model-falcons use the same guidance law as missiles (pure proportional navigation); this assumption is corroborated by empirical data on peregrine falcons hunting lures. We parametrically vary the falcon's starting position relative to its prey, together with the feedback gain of its guidance loop, under differing assumptions regarding its errors and delay in vision and control, and for three different patterns of prey motion. We find that, when the prey maneuvers erratically, high-altitude stoops increase catch success compared to low-altitude attacks, but only if the falcon's guidance law is appropriately tuned, and only given a high degree of precision in vision and control. Remarkably, the optimal tuning of the guidance law in our simulations coincides closely with what has been observed empirically in peregrines. High-altitude stoops are shown to be beneficial because their high airspeed enables production of higher aerodynamic forces for maneuvering, and facilitates higher roll agility as the wings are tucked, each of which is essential to catching maneuvering prey at realistic response delays.
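
    For reference, pure proportional navigation commands a turn rate proportional to the line-of-sight rotation rate. A minimal 2D sketch follows; the gain and states are illustrative, and this is not the paper's flight-dynamics simulation.

```python
# Illustrative 2D pure proportional navigation (PN) guidance command.
import numpy as np

def pn_accel_command(p_pursuer, v_pursuer, p_target, v_target, N=3.0):
    """Commanded acceleration, applied normal to the pursuer's velocity vector."""
    r = np.asarray(p_target, float) - np.asarray(p_pursuer, float)
    v_rel = np.asarray(v_target, float) - np.asarray(v_pursuer, float)
    # Line-of-sight rotation rate (2D scalar cross product divided by |r|^2).
    lambda_dot = (r[0] * v_rel[1] - r[1] * v_rel[0]) / float(r @ r)
    v_p = np.asarray(v_pursuer, float)
    speed = np.linalg.norm(v_p)
    # Turn the velocity vector at rate N * lambda_dot; acceleration magnitude is
    # speed * N * lambda_dot, directed 90 degrees to the left of the velocity.
    left_normal = np.array([-v_p[1], v_p[0]]) / speed
    return N * lambda_dot * speed * left_normal

# Invented example: pursuer diving at 100 m/s toward a target crossing ahead.
print(pn_accel_command([0, 0], [80, -60], [500, -200], [20, 0]))
```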

  9. Lockheed Martin Response to the OSP Challenge

    NASA Technical Reports Server (NTRS)

    Sullivan, Robert T.; Munkres, Randy; Megna, Thomas D.; Beckham, Joanne

    2003-01-01

    The Lockheed Martin Orbital Space Plane System provides crew transfer and rescue for the International Space Station more safely and affordably than current human space transportation systems. Through planned upgrades and spiral development, it is also capable of satisfying the Nation's evolving space transportation requirements and enabling the national vision for human space flight. The OSP System, formulated through rigorous requirements definition and decomposition, consists of spacecraft and launch vehicle flight elements, ground processing facilities and existing transportation, launch complex, range, mission control, weather, navigation, communication and tracking infrastructure. The concept of operations, including procurement, mission planning, launch preparation, launch and mission operations and vehicle maintenance, repair and turnaround, is structured to maximize flexibility and mission availability and minimize program life cycle cost. The approach to human rating and crew safety utilizes simplicity, performance margin, redundancy, abort modes and escape modes to mitigate credible hazards that cannot be designed out of the system.

  10. The Role of Vision and Mechanosensation in Insect Flight Control

    DTIC Science & Technology

    2012-01-01

    intensity. We used bumblebees (Bombus terrestris), honeybees (Apis mellifera), the common wasp (Vespa vulgaris), hornets (Vespa crabro), flies (Mousca...bees (Apis mellifera L.). J. Exp. Biol. 209, 978-984. Beyeler, A., Zufferey, J.-C. and Floreano, D. (2009). Vision-based control of near-obstacle

  11. System level mechanical testing of the Clementine spacecraft

    NASA Technical Reports Server (NTRS)

    Haughton, James; Hauser, Joseph; Raynor, William; Lynn, Peter

    1994-01-01

    This paper discusses the system level structural testing that was performed to qualify the Clementine Spacecraft for flight. These tests included spin balance, combined acoustic and axial random vibration, lateral random vibration, quasi-static loads, pyrotechnic shock, modal survey and on-orbit jitter simulation. Some innovative aspects of this effort were: the simultaneously combined acoustic and random vibration test; the mass-loaded interface modal survey test; and the techniques used to assess how operating on-board mechanisms and thrusters affects sensor vision.

  12. Bird Flight as a Model for a Course in Unsteady Aerodynamics

    NASA Astrophysics Data System (ADS)

    Jacob, Jamey; Mitchell, Jonathan; Puopolo, Michael

    2014-11-01

    Traditional unsteady aerodynamics courses at the graduate level focus on theoretical formulations of oscillating airfoil behavior. Aerodynamics students with a vision for understanding bird-flight and small unmanned aircraft dynamics desire to move beyond traditional flow models towards new and creative ways of appreciating the motion of agile flight systems. High-speed videos are used to record kinematics of bird flight, particularly barred owls and red-shouldered hawks during perching maneuvers, and compared with model aircraft performing similar maneuvers. Development of a perching glider and associated control laws to model the dynamics are used as a class project. Observations are used to determine what different species and sizes of birds share in their methods to approach a perch under similar conditions. Using fundamental flight dynamics, simplified models capable of predicting position, attitude, and velocity of the flier are developed and compared with the observations. By comparing the measured data from the videos and predicted and measured motions from the glider models, it is hoped that the students gain a better understanding of the complexity of unsteady aerodynamics and aeronautics and an appreciation for the beauty of avian flight.

  13. Stroboscopic Vision as a Treatment for Retinal Slip Induced Motion Sickness

    NASA Technical Reports Server (NTRS)

    Reschke, M. F.; Somers, J. T.; Ford, G.; Krnavek, J. M.; Hwang, E. J.; Leigh, R. J.; Estrada, A.

    2007-01-01

    Motion sickness in the general population is a significant problem driven by the increasingly sophisticated modes of transportation, visual displays, and virtual reality environments. It is important to investigate non-pharmacological alternatives for the prevention of motion sickness for individuals who cannot tolerate the available anti-motion sickness drugs, or who are precluded from medication because of different operational environments. Based on the initial work of Melvill Jones, in which post hoc results indicated that motion sickness symptoms were prevented during visual reversal testing when stroboscopic vision was used to prevent retinal slip, we have evaluated stroboscopic vision as a method of preventing motion sickness in a number of different environments. Specifically, we have undertaken a five-part study that was designed to investigate the effect of stroboscopic vision (either with a strobe light or LCD shutter glasses) on motion sickness while: (1) using visual field reversal, (2) reading while riding in a car (with or without external vision present), (3) making large pitch head movements during parabolic flight, (4) during exposure to rough seas in a small boat, and (5) seated and reading in the cabin area of a UH-60 Black Hawk helicopter during 20 min of provocative flight patterns.

  14. Neural Summation in the Hawkmoth Visual System Extends the Limits of Vision in Dim Light.

    PubMed

    Stöckl, Anna Lisa; O'Carroll, David Charles; Warrant, Eric James

    2016-03-21

    Most of the world's animals are active in dim light and depend on good vision for the tasks of daily life. Many have evolved visual adaptations that permit a performance superior to that of manmade imaging devices [1]. In insects, a major model visual system, nocturnal species show impressive visual abilities ranging from flight control [2, 3], to color discrimination [4, 5], to navigation using visual landmarks [6-8] or dim celestial compass cues [9, 10]. In addition to optical adaptations that improve their sensitivity in dim light [11], neural summation of light in space and time, which enhances the coarser and slower features of the scene at the expense of the noisier finer and faster features, has been suggested to improve sensitivity in theoretical [12-14], anatomical [15-17], and behavioral [18-20] studies. How these summation strategies function neurally is, however, presently unknown. Here, we quantified spatial and temporal summation in the motion vision pathway of a nocturnal hawkmoth. We show that spatial and temporal summation combine supralinearly to substantially increase contrast sensitivity and visual information rate over four decades of light intensity, enabling hawkmoths to see at light levels 100 times dimmer than without summation. Our results reveal how visual motion is calculated neurally in dim light and how spatial and temporal summation improve sensitivity while simultaneously maximizing spatial and temporal resolution, thus extending models of insect motion vision derived predominantly from diurnal flies. Moreover, the summation strategies we have revealed may benefit manmade vision systems optimized for variable light levels [21]. Copyright © 2016 Elsevier Ltd. All rights reserved.
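
    The square-root character of the sensitivity gain from pooling can be illustrated with a toy photon-noise simulation (this is not the authors' neural model; the pool sizes and light level are invented):

```python
# Toy illustration: spatiotemporal summation of photon-limited samples improves
# signal-to-noise roughly as sqrt(number of samples pooled), at the cost of
# spatial and temporal resolution.
import numpy as np

rng = np.random.default_rng(0)
mean_photons = 2.0                                   # dim-light catch per receptor per frame
frames = rng.poisson(mean_photons, size=(200, 64, 64)).astype(float)   # (t, y, x)

def snr(x):
    return x.mean() / x.std()

print("raw SNR            :", round(snr(frames), 2))

# Spatial summation over 4x4 receptor pools and temporal summation over 8 frames.
pooled = frames.reshape(25, 8, 16, 4, 16, 4).sum(axis=(1, 3, 5))
print("pooled SNR (4x4x8) :", round(snr(pooled), 2))
print("expected gain ~ sqrt(4*4*8) =", round(np.sqrt(128), 1))
```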

  15. The Malcolm horizon: History and future

    NASA Technical Reports Server (NTRS)

    Malcolm, R.

    1984-01-01

    The development of the Malcolm Horizon, a peripheral vision horizon display used in flight simulation, is discussed. A history of the horizon display is presented, as well as a brief overview of vision physiology and the role balance plays in spatial orientation. Avenues of continued research in subconscious cockpit instrumentation are examined.

  16. Performance of color-dependent tasks of air traffic control specialists as a function of type and degree of color vision deficiency.

    DOT National Transportation Integrated Search

    1992-08-01

    This experiment was conducted to expand initial efforts to validate the requirement for normal color vision in Air Traffic Control Specialist (ATCS) personnel who work at en route center, terminal, and flight service station facilities. An enlarged d...

  17. NASA Runway Incursion Prevention System (RIPS) Dallas-Fort Worth Demonstration Performance Analysis

    NASA Technical Reports Server (NTRS)

    Cassell, Rick; Evers, Carl; Esche, Jeff; Sleep, Benjamin; Jones, Denise R. (Technical Monitor)

    2002-01-01

    NASA's Aviation Safety Program Synthetic Vision System project conducted a Runway Incursion Prevention System (RIPS) flight test at the Dallas-Fort Worth International Airport in October 2000. The RIPS research system includes advanced displays, airport surveillance system, data links, positioning system, and alerting algorithms to provide pilots with enhanced situational awareness, supplemental guidance cues, a real-time display of traffic information, and warnings of runway incursions. This report describes the aircraft and ground based runway incursion alerting systems and traffic positioning systems (Automatic Dependent Surveillance - Broadcast (ADS-B) and Traffic Information Service - Broadcast (TIS-B)). A performance analysis of these systems is also presented.

  18. Real-time image processing of TOF range images using a reconfigurable processor system

    NASA Astrophysics Data System (ADS)

    Hussmann, S.; Knoll, F.; Edeler, T.

    2011-07-01

    During the last years, Time-of-Flight sensors have had a significant impact on research fields in machine vision. In comparison to stereo vision systems and laser range scanners, they combine the advantages of active sensors, providing accurate distance measurements, and of camera-based systems, recording a 2D matrix at a high frame rate. Moreover, low-cost 3D imaging has the potential to open a wide field of additional applications and solutions in markets like consumer electronics, multimedia, digital photography, robotics and medical technologies. This paper focuses on the 4-phase-shift algorithm currently implemented in this type of sensor. The most time-critical operation of the phase-shift algorithm is the arctangent function. In this paper a novel hardware implementation of the arctangent function using a reconfigurable processor system is presented and benchmarked against the state-of-the-art CORDIC arctangent algorithm. Experimental results show that the proposed algorithm is well suited for real-time processing of the range images of TOF cameras.
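
    For context, the standard continuous-wave 4-phase-shift calculation recovers the phase, and hence the distance, from four correlation samples taken 90 degrees apart. The sketch below assumes an example modulation frequency and idealized samples; it is not the paper's FPGA implementation.

```python
# Sketch of the standard 4-phase-shift distance calculation for CW TOF cameras.
import numpy as np

C = 299_792_458.0          # speed of light [m/s]
F_MOD = 20e6               # assumed modulation frequency [Hz]

def tof_distance(a0, a1, a2, a3, f_mod=F_MOD):
    """Per-pixel distance from four correlation samples spaced 90 degrees apart."""
    phase = np.arctan2(np.asarray(a3, float) - np.asarray(a1, float),
                       np.asarray(a0, float) - np.asarray(a2, float))
    phase = np.mod(phase, 2 * np.pi)            # wrap to [0, 2*pi)
    return C * phase / (4 * np.pi * f_mod)      # unambiguous range = C / (2*f_mod)

# Synthetic pixel at 3.0 m: generate idealized samples and recover the distance.
true_phase = 4 * np.pi * F_MOD * 3.0 / C
samples = [np.cos(true_phase + k * np.pi / 2) for k in range(4)]
print(round(float(tof_distance(*samples)), 3))   # ~3.0
```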

  19. What Drives Bird Vision? Bill Control and Predator Detection Overshadow Flight.

    PubMed

    Martin, Graham R

    2017-01-01

    Although flight is regarded as a key behavior of birds this review argues that the perceptual demands for its control are met within constraints set by the perceptual demands of two other key tasks: the control of bill (or feet) position, and the detection of food items/predators. Control of bill position, or of the feet when used in foraging, and timing of their arrival at a target, are based upon information derived from the optic flow-field in the binocular region that encompasses the bill. Flow-fields use information extracted from close to the bird using vision of relatively low spatial resolution. The detection of food items and predators is based upon information detected at a greater distance and depends upon regions in the retina with relatively high spatial resolution. The tasks of detecting predators and of placing the bill (or feet) accurately, make contradictory demands upon vision and these have resulted in trade-offs in the form of visual fields and in the topography of retinal regions in which spatial resolution is enhanced, indicated by foveas, areas, and high ganglion cell densities. The informational function of binocular vision in birds does not lie in binocularity per se (i.e., two eyes receiving slightly different information simultaneously about the same objects) but in the contralateral projection of the visual field of each eye. This ensures that each eye receives information from a symmetrically expanding optic flow-field centered close to the direction of the bill, and from this the crucial information of direction of travel and time-to-contact can be extracted, almost instantaneously. Interspecific comparisons of visual fields between closely related species have shown that small differences in foraging techniques can give rise to different perceptual challenges and these have resulted in differences in visual fields even within the same genus. This suggests that vision is subject to continuing and relatively rapid natural selection based upon individual differences in the structure of the optical system, retinal topography, and eye position in the skull. From a sensory ecology perspective a bird is best characterized as "a bill guided by an eye" and that control of flight is achieved within constraints on visual capacity dictated primarily by the demands of foraging and bill control.

  20. What Drives Bird Vision? Bill Control and Predator Detection Overshadow Flight

    PubMed Central

    Martin, Graham R.

    2017-01-01

    Although flight is regarded as a key behavior of birds this review argues that the perceptual demands for its control are met within constraints set by the perceptual demands of two other key tasks: the control of bill (or feet) position, and the detection of food items/predators. Control of bill position, or of the feet when used in foraging, and timing of their arrival at a target, are based upon information derived from the optic flow-field in the binocular region that encompasses the bill. Flow-fields use information extracted from close to the bird using vision of relatively low spatial resolution. The detection of food items and predators is based upon information detected at a greater distance and depends upon regions in the retina with relatively high spatial resolution. The tasks of detecting predators and of placing the bill (or feet) accurately, make contradictory demands upon vision and these have resulted in trade-offs in the form of visual fields and in the topography of retinal regions in which spatial resolution is enhanced, indicated by foveas, areas, and high ganglion cell densities. The informational function of binocular vision in birds does not lie in binocularity per se (i.e., two eyes receiving slightly different information simultaneously about the same objects) but in the contralateral projection of the visual field of each eye. This ensures that each eye receives information from a symmetrically expanding optic flow-field centered close to the direction of the bill, and from this the crucial information of direction of travel and time-to-contact can be extracted, almost instantaneously. Interspecific comparisons of visual fields between closely related species have shown that small differences in foraging techniques can give rise to different perceptual challenges and these have resulted in differences in visual fields even within the same genus. This suggests that vision is subject to continuing and relatively rapid natural selection based upon individual differences in the structure of the optical system, retinal topography, and eye position in the skull. From a sensory ecology perspective a bird is best characterized as “a bill guided by an eye” and that control of flight is achieved within constraints on visual capacity dictated primarily by the demands of foraging and bill control. PMID:29163020

  1. High-speed civil transport - Advanced flight deck challenges

    NASA Technical Reports Server (NTRS)

    Swink, Jay R.; Goins, Richard T.

    1992-01-01

    This paper presents the results of a nine-month study of the HSCT flight deck challenges and an assessment of its benefits. Operational requirements are discussed and the most significant findings for specified advanced concepts are highlighted. These concepts are a no nose-droop configuration, a far forward cockpit location, and advanced crew monitoring and control of complex systems. Results indicate that the no nose-droop configuration is critically dependent on the design and development of a safe, reliable and certifiable synthetic vision system (SVS). This configuration would cause significant weight, performance and cost penalties. A far forward cockpit configuration with a tandem seating arrangement allows either an increase in additional payload or potential downsizing of the vehicle, leading to increased performance efficiency and reductions in emissions. The technologies enabling such capabilities, which provide for Category III all-weather operations on every flight, represent a benefit multiplier in a 2005 ATM network in terms of enhanced economic viability and environmental acceptability.

  2. Kinect2 - respiratory movement detection study.

    PubMed

    Rihana, Sandy; Younes, Elie; Visvikis, Dimitris; Fayad, Hadi

    2016-08-01

    Radiotherapy is one of the main cancer treatments. It consists of irradiating tumor cells to destroy them while sparing healthy tissue. The treatment is planned based on Computed Tomography (CT) and is delivered in fractions over several days. One of the main challenges is placing the patient in the same position every day to irradiate the tumor volume while sparing healthy tissues. Many patient positioning techniques are available, but they are invasive and not accurate, being performed using tattooed markers on the patient's skin aligned with a laser system calibrated in the treatment room, or by irradiating with X-rays. Currently, systems such as Vision RT use two Time-of-Flight cameras. Time-of-Flight cameras have the advantage of a very fast acquisition rate, which allows real-time monitoring of patient movement and patient repositioning. The purpose of this work is to test the Microsoft Kinect2 camera for potential use in patient positioning and respiration triggering. This type of Time-of-Flight camera is non-invasive and inexpensive, which facilitates its transfer to clinical practice.

  3. The role of passive avian head stabilization in flapping flight

    PubMed Central

    Pete, Ashley E.; Kress, Daniel; Dimitrov, Marina A.; Lentink, David

    2015-01-01

    Birds improve vision by stabilizing head position relative to their surroundings, while their body is forced up and down during flapping flight. Stabilization is facilitated by compensatory motion of the sophisticated avian head–neck system. While relative head motion has been studied in stationary and walking birds, little is known about how birds accomplish head stabilization during flapping flight. To unravel this, we approximate the avian neck with a linear mass–spring–damper system for vertical displacements, analogous to proven head stabilization models for walking humans. We corroborate the model's dimensionless natural frequency and damping ratios from high-speed video recordings of whooper swans (Cygnus cygnus) flying over a lake. The data show that flap-induced body oscillations can be passively attenuated through the neck. We find that the passive model robustly attenuates large body oscillations, even in response to head mass and gust perturbations. Our proof of principle shows that bird-inspired drones with flapping wings could record better images with a swan-inspired passive camera suspension. PMID:26311316
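
    A minimal sketch of the base-excitation behavior of such a linear spring-damper model is given below; the natural frequency, damping ratios, and flapping frequency are assumed illustrative values, not the parameters fitted to the swan data.

```python
# Base-excitation transmissibility of a linear mass-spring-damper "neck":
# the ratio of head to body oscillation amplitude at a given driving frequency.
import numpy as np

def transmissibility(freq_hz, f_n_hz, zeta):
    """|head amplitude / body amplitude| for sinusoidal base excitation."""
    r = freq_hz / f_n_hz                              # frequency ratio
    num = 1 + (2 * zeta * r) ** 2
    den = (1 - r ** 2) ** 2 + (2 * zeta * r) ** 2
    return np.sqrt(num / den)

# Illustrative numbers only: flapping at 4 Hz, assumed neck natural frequency 1.5 Hz.
for zeta in (0.2, 0.5, 1.0):
    print(f"zeta={zeta}: head/body amplitude ratio =",
          round(float(transmissibility(4.0, 1.5, zeta)), 2))
# All ratios are well below 1, i.e. the flapping oscillation is passively attenuated.
```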

  4. Validation of vision-based range estimation algorithms using helicopter flight data

    NASA Technical Reports Server (NTRS)

    Smith, Phillip N.

    1993-01-01

    The objective of this research was to demonstrate the effectiveness of an optic flow method for passive range estimation using a Kalman-filter implementation with helicopter flight data. This paper is divided into the following areas: (1) ranging algorithm; (2) flight experiment; (3) analysis methodology; (4) results; and (5) concluding remarks. The discussion is presented in viewgraph format.
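
    The motion-parallax relation underlying optic-flow ranging can be sketched as follows (a simplified 2D illustration with invented numbers, not the Kalman-filter implementation evaluated in the paper):

```python
# Simplified 2D motion-parallax ranging: for a feature at bearing theta from the
# translation direction, a camera moving at speed V with known body rotation rate
# sees an angular flow rate theta_dot, and range follows from
#     R = V * sin(theta) / (theta_dot - body_rate).
import numpy as np

def range_from_flow(speed_mps, bearing_rad, flow_rate_rps, body_rate_rps=0.0):
    """Range to a static feature from rotation-compensated optic flow (2D case)."""
    translational_flow = flow_rate_rps - body_rate_rps   # remove rotational component
    return speed_mps * np.sin(bearing_rad) / translational_flow

# Invented example: helicopter at 20 m/s, feature 30 degrees off the flight path,
# measured flow 0.05 rad/s with no body rotation.
print(round(range_from_flow(20.0, np.radians(30.0), 0.05), 1), "m")   # 200.0 m
```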

  5. Rotary-wing flight test methods used for the evaluation of night vision devices

    NASA Astrophysics Data System (ADS)

    Haworth, Loran A.; Blanken, Christopher J.; Szoboszlay, Zoltan P.

    2001-08-01

    The U.S. Army Aviation mission includes flying helicopters at low altitude, at night, and in adverse weather. Night Vision Devices (NVDs) are used to supplement the pilot's visual cues for night flying. As the military requirement to conduct night helicopter operations has increased, the impact of helicopter flight operations with NVD technology in the Degraded Visual Environment (DVE) became increasingly important to quantify. Aeronautical Design Standard-33 (ADS-33) was introduced to update rotorcraft handling qualities requirements and to quantify the impact of the NVDs in the DVE. As reported in this paper, flight test methodology in ADS-33 has been used by the handling qualities community to measure the impact of NVDs on task performance in the DVE. This paper provides the background and rationale behind the development of ADS-33 flight test methodology for handling qualities in the DVE, as well as the test methodology developed for human factor assessment of NVDs in the DVE. Lessons learned, shortcomings and recommendations for NVD flight test methodology are provided in this paper.

  6. Neck muscle activity in fighter pilots wearing night-vision equipment during simulated flight.

    PubMed

    Ang, Björn O; Kristoffersson, Mats

    2013-02-01

    Night-vision goggles (NVG) in jet fighter aircraft appear to increase the risk of neck strain due to increased neck loading. The present aim was, therefore, to evaluate the effect on neck-muscle activity and subjective ratings of head-worn night-vision (NV) equipment in controlled simulated flights. Five experienced fighter pilots twice flew a standardized 2.5-h program in a dynamic flight simulator; one session with NVG and one with standard helmet mockup (control session). Each session commenced with a 1-h simulation at 1 Gz followed by a 1.5-h dynamic flight with repeated Gz profiles varying between 3 and 7 Gz and including aerial combat maneuvers (ACM) at 3-5 Gz. Large head-and-neck movements under high G conditions were avoided. Surface electromyographic (EMG) data was simultaneously measured bilaterally from anterior neck, upper and lower posterior neck, and upper shoulder muscles. EMG activity was normalized as the percentage of pretest maximal voluntary contraction (%MVC). Head-worn equipment (helmet comfort, balance, neck mobility, and discomfort) was rated subjectively immediately after flight. A trend emerged toward greater overall neck muscle activity in NV flight during sustained ACM episodes (10% vs. 8% MVC for the control session), but with no such effects for temporary 3-7 Gz profiles. Postflight ratings for NV sessions emerged as "unsatisfactory" for helmet comfort/neck discomfort. However, this was not significant compared to the control session. Helmet mounted NV equipment caused greater neck muscle activity during sustained combat maneuvers, indicating increased muscle strain due to increased neck loading. In addition, postflight ratings indicated neck discomfort after NV sessions, although not clearly increased compared to flying with standard helmet mockup.

  7. Simulation Evaluation of Equivalent Vision Technologies for Aerospace Operations

    NASA Technical Reports Server (NTRS)

    Kramer, Lynda J.; Williams, Steven P.; Wilz, Susan J.; Arthur, Jarvis J.

    2009-01-01

    A fixed-base simulation experiment was conducted in NASA Langley Research Center's Integration Flight Deck simulator to investigate enabling technologies for equivalent visual operations (EVO) in the emerging Next Generation Air Transportation System operating environment. EVO implies the capability to achieve or even improve on the safety of current-day Visual Flight Rules (VFR) operations, maintain the operational tempos of VFR, and perhaps even retain VFR procedures - all independent of the actual weather and visibility conditions. Twenty-four air transport-rated pilots evaluated the use of Synthetic/Enhanced Vision Systems (S/EVS) and eXternal Vision Systems (XVS) technologies as enabling technologies for future all-weather operations. The experimental objectives were to determine the feasibility of XVS/SVS/EVS to provide for all weather (visibility) landing capability without the need (or ability) for a visual approach segment and to determine the interaction of XVS/EVS and peripheral vision cues for terminal area and surface operations. Another key element of the testing investigated the pilot's awareness and reaction to non-normal events (i.e., failure conditions) that were unexpectedly introduced into the experiment. These non-normal runs served as critical determinants in the underlying safety of all-weather operations. Experimental data from this test are cast into performance-based approach and landing standards which might establish a basis for future all-weather landing operations. Glideslope tracking performance appears to have improved with the elimination of the approach visual segment. This improvement can most likely be attributed to the fact that the pilots didn't have to simultaneously perform glideslope corrections and find required visual landing references in order to continue a landing. Lateral tracking performance was excellent regardless of the display concept being evaluated or whether or not there were peripheral cues in the side window. Although workload ratings were significantly less when peripheral cues were present compared to when there were none, these differences appear to be operationally inconsequential. Larger display concepts tested in this experiment showed significant situation awareness (SA) improvements and workload reductions compared to smaller display concepts. With a fixed display size, a color display was more influential in SA and workload ratings than a collimated display.

  8. Flight Deck Technologies to Enable NextGen Low Visibility Surface Operations

    NASA Technical Reports Server (NTRS)

    Prinzel, Lawrence (Lance) J., III; Arthur, Jarvis (Trey) J.; Kramer, Lynda J.; Norman, Robert M.; Bailey, Randall E.; Jones, Denise R.; Karwac, Jerry R., Jr.; Shelton, Kevin J.; Ellis, Kyle K. E.

    2013-01-01

    Many key capabilities are being identified to enable the Next Generation Air Transportation System (NextGen), including the concept of Equivalent Visual Operations (EVO): replicating the capacity and safety of today's visual flight rules (VFR) in all-weather conditions. NASA is striving to develop the technologies and knowledge to enable EVO and to extend EVO towards a Better-Than-Visual operational concept. This operational concept envisions an "equivalent visual" paradigm in which an electronic means provides sufficient visual references of the external world, and other required flight references, on flight deck displays, enabling VFR-like operational tempos while maintaining and improving the safety of VFR and using VFR-like procedures in all-weather conditions. The Langley Research Center (LaRC) has recently completed preliminary research on flight deck technologies for low visibility surface operations. The work assessed the potential of enhanced vision and airport moving map displays to achieve levels of safety and performance equivalent to existing low visibility operational requirements. The work has the potential to better enable NextGen, perhaps by providing an operational credit for conducting safe low visibility surface operations through use of these flight deck technologies.

  9. 3D vision system for intelligent milking robot automation

    NASA Astrophysics Data System (ADS)

    Akhloufi, M. A.

    2013-12-01

    In a milking robot, the correct localization and positioning of milking teat cups is of very high importance. Milking-robot technology has not changed in a decade and relies primarily on laser profiles to estimate approximate teat positions. This technology has reached its limit and does not allow optimal positioning of the milking cups. Also, in the presence of occlusions, the milking robot fails to milk the cow. These problems have economic consequences for producers and for animal health (e.g., the development of mastitis). To overcome the limitations of current robots, we have developed a new system based on 3D vision, capable of efficiently positioning the milking cups. A prototype of an intelligent robot system based on 3D vision for real-time positioning of a milking robot has been built and tested under various conditions on a synthetic udder model (in static and moving scenarios). Experimental tests were performed using 3D Time-Of-Flight (TOF) and RGBD cameras. The proposed algorithms permit the online segmentation of teats by combining 2D and 3D visual information. The obtained results permit the computation of 3D teat positions, which are then sent to the milking robot for teat cup positioning. The vision system has real-time performance and maintains optimal positioning of the cups even in the presence of motion. The results obtained with both TOF and RGBD cameras show the good performance of the proposed system, with the best performance obtained with RGBD cameras. This latter technology will be used in future real-life experimental tests.
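
    A rough sketch, not the authors' algorithm: one way 2D color and 3D depth cues from an RGB-D camera might be combined to segment candidate teat regions and return their 3D positions; the color rule, depth threshold, and clustering gap are purely illustrative.

    ```python
    import numpy as np

    def segment_teats(points, colors, max_depth=0.6, min_cluster=20):
        """Simple 2D+3D fusion: keep near, skin-colored points and cluster them along x.

        points : (N, 3) array of x, y, z camera coordinates in metres
        colors : (N, 3) array of RGB values in [0, 1] aligned with `points`
        """
        near = points[:, 2] < max_depth                      # 3D cue: close to the camera
        pink = (colors[:, 0] > 0.5) & (colors[:, 1] < 0.5)   # crude 2D cue: skin-like color
        cand = points[near & pink]
        if len(cand) < min_cluster:
            return []
        # Group candidates into clusters along x with a crude 2 cm gap rule
        cand = cand[np.argsort(cand[:, 0])]
        gaps = np.where(np.diff(cand[:, 0]) > 0.02)[0] + 1
        clusters = np.split(cand, gaps)
        return [c.mean(axis=0) for c in clusters if len(c) >= min_cluster]

    # Usage: centres = segment_teats(points, colors); each centre is a candidate (x, y, z)
    # target that the robot's cup-positioning controller could be driven towards.
    ```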

  10. Technologies Render Views of Earth for Virtual Navigation

    NASA Technical Reports Server (NTRS)

    2012-01-01

    On a December night in 1995, 159 passengers and crewmembers died when American Airlines Flight 965 flew into the side of a mountain while en route to Cali, Colombia. A key factor in the tragedy: the pilots had lost situational awareness in the dark, unfamiliar terrain. They had no idea the plane was approaching a mountain until the ground proximity warning system sounded an alarm only seconds before impact. "The accident was of the kind most common at the time, CFIT, or controlled flight into terrain," says Trey Arthur, research aerospace engineer in the Crew Systems and Aviation Operations Branch at NASA's Langley Research Center. In situations such as bad weather, fog, or nighttime flights, pilots would rely on airspeed, altitude, and other readings to get an accurate sense of location. Miscalculations and rapidly changing conditions could contribute to a fully functioning, in-control airplane flying into the ground. To improve aviation safety by enhancing pilots' situational awareness even in poor visibility, NASA began exploring the possibilities of synthetic vision: creating a graphical display of the outside terrain on a screen inside the cockpit. "How do you display a mountain in the cockpit? You have to have a graphics-powered computer, a terrain database you can render, and an accurate navigation solution," says Arthur. In the mid-1990s, developing GPS technology offered a means for determining an aircraft's position in space with high accuracy, Arthur explains. As the necessary technologies to enable synthetic vision emerged, NASA turned to an industry partner to develop the terrain graphical engine and database for creating the virtual rendering of the outside environment.

  11. Flight performance using a hyperstereo helmet-mounted display: aircraft handling

    NASA Astrophysics Data System (ADS)

    Jennings, Sion A.; Craig, Gregory L.; Stuart, Geoffrey W.; Kalich, Melvyn E.; Rash, Clarence E.; Harding, Thomas H.

    2009-05-01

    A flight study was conducted to assess the impact of hyperstereopsis on helicopter handling proficiency, workload and pilot acceptance. Three pilots with varying levels of night vision goggle and hyperstereo helmet-mounted display experience participated in the test. The pilots carried out a series of flights consisting of low-level maneuvers over a period of two weeks. Four of the test maneuvers (the turn around the tail, the hard-surface landing, the hover height estimation, and the tree-line following) were analysed in detail. At the end of the testing period, no significant difference was observed in the performance data between maneuvers performed with the TopOwl helmet and maneuvers performed with the standard night vision goggle. This study addressed only the image intensification display aspects of the TopOwl helmet system. The tests did not assess the added benefits of overlaid symbology or head-slaved infrared camera imagery. These capabilities need to be taken into account when assessing the overall usefulness of the TopOwl system. Even so, this test showed that pilots can utilize the image intensification imagery displayed on the TopOwl to perform benign night flying tasks to an equivalent level as pilots using ANVIS. The study should be extended to investigate more dynamic and aggressive low-level flying, slope landings and ship deck landings. While there may be concerns regarding the effect of hyperstereopsis on piloting, this initial study suggests that pilots can either adapt to or compensate for hyperstereo effects with sufficient exposure and training. Further analysis and testing is required to determine the extent of training required.

  12. Electromagnetic Interference to Flight Navigation and Communication Systems: New Strategies in the Age of Wireless

    NASA Technical Reports Server (NTRS)

    Ely, Jay J.

    2005-01-01

    Electromagnetic interference (EMI) promises to be an ever-evolving concern for flight electronic systems. This paper introduces EMI and identifies its impact upon civil aviation radio systems. New wireless services, like mobile phones, text messaging, email, web browsing, radio frequency identification (RFID), and mobile audio/video services are now being introduced into passenger airplanes. FCC and FAA rules governing the use of mobile phones and other portable electronic devices (PEDs) on board airplanes are presented along with a perspective of how these rules are now being rewritten to better facilitate in-flight wireless services. This paper provides a comprehensive overview of NASA cooperative research with the FAA, RTCA, airlines and universities to obtain laboratory radiated emission data for numerous PED types, aircraft radio frequency (RF) coupling measurements, estimated aircraft radio interference thresholds, and direct-effects EMI testing. These elements are combined together to provide high-confidence answers regarding the EMI potential of new wireless products being used on passenger airplanes. This paper presents a vision for harmonizing new wireless services with aeronautical radio services by detecting, assessing, controlling and mitigating the effects of EMI.

  13. Fire protection for launch facilities using machine vision fire detection

    NASA Astrophysics Data System (ADS)

    Schwartz, Douglas B.

    1993-02-01

    Fire protection of critical space assets, including launch and fueling facilities and manned flight hardware, demands automatic sensors for continuous monitoring, and in certain high-threat areas, fast-reacting automatic suppression systems. Perhaps the most essential characteristic for these fire detection and suppression systems is high reliability; in other words, fire detectors should alarm only on actual fires and not be falsely activated by extraneous sources. Existing types of fire detectors have been greatly improved in the past decade; however, fundamental limitations of their method of operation leave open a significant possibility of false alarms and restrict their usefulness. At the Civil Engineering Laboratory at Tyndall Air Force Base in Florida, a new type of fire detector is under development which 'sees' a fire visually, like a human being, and makes a reliable decision based on known visual characteristics of flames. Hardware prototypes of the Machine Vision (MV) Fire Detection System have undergone live fire tests and demonstrated extremely high accuracy in discriminating actual fires from false alarm sources. In fact, this technology promises to virtually eliminate false activations. This detector could be used to monitor fueling facilities, launch towers, clean rooms, and other high-value and high-risk areas. Applications can extend to space station and in-flight shuttle operations as well; fiber optics and remote camera heads enable the system to see around obstructed areas and crew compartments. The capability of the technology to distinguish fires means that fire detection can be provided even during maintenance operations, such as welding.
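
    This is not the Machine Vision Fire Detection System itself, only a toy illustration of deciding "fire" from known visual characteristics of flames, here persistent brightness combined with temporal flicker to reject steady light sources; the thresholds and the simulated frames are invented.

    ```python
    import numpy as np

    def flame_score(frames, brightness_thresh=200, flicker_thresh=15.0):
        """Score a stack of grayscale frames (T, H, W) for flame-like behaviour.

        A pixel looks like fire if it is bright most of the time AND its intensity
        fluctuates frame to frame (flicker), which helps reject steady lamps and
        reflections. Returns the fraction of flame-like pixels.
        """
        frames = np.asarray(frames, dtype=float)
        bright = (frames > brightness_thresh).mean(axis=0)   # fraction of time each pixel is bright
        flicker = frames.std(axis=0)                         # temporal fluctuation per pixel
        mask = (bright > 0.5) & (flicker > flicker_thresh)
        return mask.mean()

    # Example: 30 frames of dim noise with a bright, flickering 10x10 patch
    rng = np.random.default_rng(0)
    frames = rng.normal(50, 5, (30, 64, 64))
    frames[:, 20:30, 20:30] = rng.normal(230, 40, (30, 10, 10))   # flame-like region
    print(flame_score(frames))   # roughly 100 flame-like pixels out of 4096
    ```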

  14. Fire protection for launch facilities using machine vision fire detection

    NASA Technical Reports Server (NTRS)

    Schwartz, Douglas B.

    1993-01-01

    Fire protection of critical space assets, including launch and fueling facilities and manned flight hardware, demands automatic sensors for continuous monitoring, and in certain high-threat areas, fast-reacting automatic suppression systems. Perhaps the most essential characteristic for these fire detection and suppression systems is high reliability; in other words, fire detectors should alarm only on actual fires and not be falsely activated by extraneous sources. Existing types of fire detectors have been greatly improved in the past decade; however, fundamental limitations of their method of operation leave open a significant possibility of false alarms and restrict their usefulness. At the Civil Engineering Laboratory at Tyndall Air Force Base in Florida, a new type of fire detector is under development which 'sees' a fire visually, like a human being, and makes a reliable decision based on known visual characteristics of flames. Hardware prototypes of the Machine Vision (MV) Fire Detection System have undergone live fire tests and demonstrated extremely high accuracy in discriminating actual fires from false alarm sources. In fact, this technology promises to virtually eliminate false activations. This detector could be used to monitor fueling facilities, launch towers, clean rooms, and other high-value and high-risk areas. Applications can extend to space station and in-flight shuttle operations as well; fiber optics and remote camera heads enable the system to see around obstructed areas and crew compartments. The capability of the technology to distinguish fires means that fire detection can be provided even during maintenance operations, such as welding.

  15. NASA Precision Landing Technologies Completes Initial Flight Tests on Vertical Testbed Rocket

    NASA Image and Video Library

    2017-04-19

    This 2-minute, 40-second video shows how, over the past 5 weeks, NASA and Masten Space Systems teams have prepared for and conducted sub-orbital rocket flight tests of next-generation lander navigation technology through the CoOperative Blending of Autonomous Landing Technologies (COBALT) project. The COBALT payload was integrated onto Masten's rocket, Xodiac. The Xodiac vehicle used the Global Positioning System (GPS) for navigation during this first campaign, an intentional choice made to verify and refine COBALT system performance. The joint teams conducted numerous ground verification tests, made modifications in the process, practiced and refined operations procedures, conducted three tether tests, and have now flown two successful free flights. This successful, collaborative campaign has provided the COBALT and Xodiac teams with the valuable performance data needed to refine the systems and prepare them for the second flight test campaign this summer, when the COBALT system will navigate the Xodiac rocket to a precision landing. The technologies within COBALT provide a spacecraft with knowledge during entry, descent, and landing that enables it to precisely navigate and softly land close to surface locations that have previously been too risky to target with current capabilities. The technologies will enable future exploration destinations on Mars, the moon, Europa, and other planets and moons. The two primary navigation components within COBALT are the Langley Research Center's Navigation Doppler Lidar, which provides ultra-precise velocity and line-of-sight range measurements, and the Jet Propulsion Laboratory's Lander Vision System (LVS), which provides navigation estimates relative to an existing surface map. The integrated system is being flight tested onboard a Masten suborbital rocket vehicle called Xodiac. The COBALT project is led by the Johnson Space Center, with funding provided through the Game Changing Development, Flight Opportunities, and Advanced Exploration Systems programs. Based at NASA's Armstrong Flight Research Center in Edwards, CA, the Flight Opportunities program funds technology development flight tests with commercial suborbital space providers, of which Masten is one. The program has previously tested the LVS on the Masten rocket and validated the technology for the Mars 2020 rover.

  16. A Sensory-Motor Control Model of Animal Flight Explains Why Bats Fly Differently in Light Versus Dark

    PubMed Central

    Bar, Nadav S.; Skogestad, Sigurd; Marçal, Jose M.; Ulanovsky, Nachum; Yovel, Yossi

    2015-01-01

    Animal flight requires fine motor control. However, it is unknown how flying animals rapidly transform noisy sensory information into adequate motor commands. Here we developed a sensorimotor control model that explains vertebrate flight guidance with high fidelity. This simple model accurately reconstructed complex trajectories of bats flying in the dark. The model implies that in order to apply appropriate motor commands, bats have to estimate not only the angle-to-target, as was previously assumed, but also the angular velocity (“proportional-derivative” controller). Next, we conducted experiments in which bats flew in light conditions. When using vision, bats altered their movements, reducing the flight curvature. This change was explained by the model via reduction in sensory noise under vision versus pure echolocation. These results imply a surprising link between sensory noise and movement dynamics. We propose that this sensory-motor link is fundamental to motion control in rapidly moving animals under different sensory conditions, on land, sea, or air. PMID:25629809
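
    Not the paper's fitted model: a minimal sketch of the proportional-derivative steering idea the abstract describes, in which the commanded turn depends on both the angle to the target and its rate of change, and noisier sensing (echolocation in the dark) produces curvier paths; the gains, speed, and noise levels are illustrative.

    ```python
    import math, random

    def pd_steering(angle_err, angle_rate, kp=3.0, kd=0.8):
        """Commanded turn rate from the bearing error and its rate (a 'PD' controller)."""
        return kp * angle_err + kd * angle_rate

    def wrap(a):
        return (a + math.pi) % (2 * math.pi) - math.pi

    def simulate(noise_std, steps=200, dt=0.05, speed=5.0):
        """Fly toward a target at the origin; return closest approach and total turning."""
        x, y, heading = -30.0, 8.0, 0.0
        prev = None
        closest, turning = float("inf"), 0.0
        for _ in range(steps):
            err = wrap(math.atan2(-y, -x) - heading) + random.gauss(0.0, noise_std)
            rate = 0.0 if prev is None else wrap(err - prev) / dt   # sensed angular velocity
            prev = err
            turn = pd_steering(err, rate) * dt
            turning += abs(turn)                                    # crude curvature proxy
            heading = wrap(heading + turn)
            x += speed * math.cos(heading) * dt
            y += speed * math.sin(heading) * dt
            closest = min(closest, math.hypot(x, y))
        return closest, turning

    random.seed(1)
    print("dark  (noisy sensing):", simulate(0.15))   # curvier flight
    print("light (clean sensing):", simulate(0.02))
    ```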

  17. The Effect of a Monocular Helmet-Mounted Display on Aircrew Health: A Longitudinal Cohort Study of Apache AH Mk 1 Pilots -(Vision and Handedness)

    DTIC Science & Technology

    2015-05-19

    reported by U.S. Army aviators using NVG for night flights (Glick and Moser, 1974). It was initially, and incorrectly, called "brown eye syndrome" ... [The record excerpt also contains a symptom-frequency questionnaire (Never / Rarely / Occasionally / Often) covering eye irritation, eye pain, blurred vision, dry eye, and light sensitivity, and an item asking about symptoms experienced since the last contact lens review.]

  18. Bioinspired optical sensors for unmanned aerial systems

    NASA Astrophysics Data System (ADS)

    Chahl, Javaan; Rosser, Kent; Mizutani, Akiko

    2011-04-01

    Insects are dependent on the spatial, spectral and temporal distributions of light in the environment for flight control and navigation. This paper reports on flight trials of implementations of insect-inspired behaviors on unmanned aerial vehicles. Optical flow methods for maintaining a constant height above ground and a constant course have been demonstrated to provide navigation capabilities that are impossible using conventional avionics sensors. Precision control of height above ground and ground course was achieved over long distances. Other vision-based techniques demonstrated include a biomimetic stabilization sensor that uses the ultraviolet and green bands of the spectrum, and a sky polarization compass. Both of these sensors were tested over long trajectories in different directions, in each case showing performance similar to low-cost inertial heading and attitude systems. The behaviors demonstrate some of the core functionality found in the lower levels of the sensorimotor system of flying insects and show promise for more integrated solutions in the future.
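
    A minimal sketch, not the flight code from these trials: the insect-inspired idea of holding height by regulating ventral optic flow, which in level flight is roughly ground speed divided by height, so holding the flow constant ties height to speed; the gain, setpoint, and simple kinematics are invented.

    ```python
    def hold_height_by_flow(ground_speed, flow_setpoint=0.5, h0=5.0, steps=400, dt=0.05, k=10.0):
        """Regulate height by holding ventral optic flow (v / h) at a setpoint."""
        h = h0
        for _ in range(steps):
            flow = ground_speed / h                      # perceived ventral flow, rad/s
            climb_rate = k * (flow - flow_setpoint)      # too much flow -> too low -> climb
            h = max(0.5, h + climb_rate * dt)
        return h

    # With v = 10 m/s and a 0.5 rad/s setpoint the vehicle settles near h = v / setpoint = 20 m;
    # halving the ground speed makes it settle near 10 m, the classic optic-flow behaviour.
    print(hold_height_by_flow(10.0), hold_height_by_flow(5.0))
    ```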

  19. Quantifying Pilot Visual Attention in Low Visibility Terminal Operations

    NASA Technical Reports Server (NTRS)

    Ellis, Kyle K.; Arthur, J. J.; Latorella, Kara A.; Kramer, Lynda J.; Shelton, Kevin J.; Norman, Robert M.; Prinzel, Lawrence J.

    2012-01-01

    Quantifying pilot visual behavior allows researchers to determine not only where a pilot is looking and when, but also enables specific behavioral tracking when these data are coupled with flight technical performance. Remote eye tracking systems have been integrated into simulators at NASA Langley with effectively no impact on the pilot environment. This paper discusses the installation and use of a remote eye tracking system. The data collection techniques from a complex human-in-the-loop (HITL) research experiment are discussed, in particular the data reduction algorithms and logic used to transform raw eye tracking data into quantified visual behavior metrics, and the analysis methods used to interpret visual behavior. The findings suggest superior performance for Head-Up Display (HUD) and improved attentional behavior for Head-Down Display (HDD) implementations of Synthetic Vision System (SVS) technologies for low visibility terminal area operations. Keywords: eye tracking, flight deck, NextGen, human machine interface, aviation
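
    Not the paper's data-reduction software: a small sketch of the kind of logic used to turn raw gaze samples into visual-behavior metrics, here mapping valid samples to named areas of interest (AOIs) and computing dwell-time percentages; the AOI boundaries and sample format are assumptions for illustration.

    ```python
    from collections import defaultdict

    # Illustrative AOIs on a normalized (0-1, 0-1) flight-deck image plane.
    AOIS = {
        "HUD":           (0.35, 0.55, 0.65, 0.95),   # (x_min, y_min, x_max, y_max)
        "PFD_head_down": (0.10, 0.05, 0.40, 0.35),
        "Nav_display":   (0.60, 0.05, 0.90, 0.35),
    }

    def classify(x, y):
        """Map a gaze point to the first AOI that contains it, else 'other'."""
        for name, (x0, y0, x1, y1) in AOIS.items():
            if x0 <= x <= x1 and y0 <= y <= y1:
                return name
        return "other"

    def dwell_percentages(samples):
        """samples: iterable of (x, y, valid) gaze points at a fixed sample rate."""
        counts, total = defaultdict(int), 0
        for x, y, valid in samples:
            if not valid:                 # drop blinks and tracker dropouts
                continue
            counts[classify(x, y)] += 1
            total += 1
        return {k: 100.0 * v / total for k, v in counts.items()} if total else {}

    # e.g. dwell_percentages(raw_gaze) -> {'HUD': 62.0, 'Nav_display': 21.5, 'other': 16.5}
    ```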

  20. Helicopter pilot estimation of self-altitude in a degraded visual environment

    NASA Astrophysics Data System (ADS)

    Crowley, John S.; Haworth, Loran A.; Szoboszlay, Zoltan P.; Lee, Alan G.

    2000-06-01

    The effect of night vision devices and degraded visual imagery on self-altitude perception is unknown. Thirteen Army aviators with normal vision flew five flights under various visual conditions in a modified AH-1 (Cobra) helicopter. Subjects estimated their altitude or flew to specified altitudes while flying a series of maneuvers. The results showed that subjects were better at detecting and controlling changes in altitude than they were at flying to or naming a specific altitude. In cruise flight and descent, the subjects tended to fly above the desired altitude, an error in the safe direction. While hovering, the direction of error was less predictable. In the low-level cruise flight scenario tested in this study, altitude perception was affected more by changes in image resolution than by changes in FOV or ocularity.

  1. Building a Shared Definitional Model of Long Duration Human Spaceflight

    NASA Technical Reports Server (NTRS)

    Arias, Diana; Orr, Martin; Whitmire, Alexandra; Leveton, Lauren; Sandoval, Luis

    2012-01-01

    Objective: To establish the need for a shared definitional model of long duration human spaceflight that would provide a framework and vision to facilitate communication, research, and practice. In 1956, on the eve of human space travel, Hubertus Strughold first proposed a "simple classification of the present and future stages of manned flight" that identified key factors, risks, and developmental stages for the evolutionary journey ahead. As we look to new destinations, we need a current shared working definitional model of long duration human space flight to help guide our path. Here we describe our preliminary findings and outline potential approaches for the future development of a definition and broader classification system.

  2. Auditory opportunity and visual constraint enabled the evolution of echolocation in bats.

    PubMed

    Thiagavel, Jeneni; Cechetto, Clément; Santana, Sharlene E; Jakobsen, Lasse; Warrant, Eric J; Ratcliffe, John M

    2018-01-08

    Substantial evidence now supports the hypothesis that the common ancestor of bats was nocturnal and capable of both powered flight and laryngeal echolocation. This scenario entails a parallel sensory and biomechanical transition from a nonvolant, vision-reliant mammal to one capable of sonar and flight. Here we consider anatomical constraints and opportunities that led to a sonar rather than vision-based solution. We show that bats' common ancestor had eyes too small to allow for successful aerial hawking of flying insects at night, but an auditory brain design sufficient to afford echolocation. Further, we find that among extant predatory bats (all of which use laryngeal echolocation), those with putatively less sophisticated biosonar have relatively larger eyes than do more sophisticated echolocators. We contend that signs of ancient trade-offs between vision and echolocation persist today, and that non-echolocating, phytophagous pteropodid bats may retain some of the necessary foundations for biosonar.

  3. Comparison of Orion Vision Navigation Sensor Performance from STS-134 and the Space Operations Simulation Center

    NASA Technical Reports Server (NTRS)

    Christian, John A.; Patangan, Mogi; Hinkel, Heather; Chevray, Keiko; Brazzel, Jack

    2012-01-01

    The Orion Multi-Purpose Crew Vehicle is a new spacecraft being designed by NASA and Lockheed Martin for future crewed exploration missions. The Vision Navigation Sensor is a Flash LIDAR that will be the primary relative navigation sensor for this vehicle. To obtain a better understanding of this sensor's performance, the Orion relative navigation team has performed both flight tests and ground tests. This paper summarizes and compares the performance results from the STS-134 flight test, called the Sensor Test for Orion RelNav Risk Mitigation (STORRM) Development Test Objective, and the ground tests at the Space Operations Simulation Center.

  4. From wheels to wings with evolutionary spiking circuits.

    PubMed

    Floreano, Dario; Zufferey, Jean-Christophe; Nicoud, Jean-Daniel

    2005-01-01

    We give an overview of the EPFL indoor flying project, whose goal is to evolve neural controllers for autonomous, adaptive, indoor micro-flyers. Indoor flight is still a challenge because it requires miniaturization, energy efficiency, and control of nonlinear flight dynamics. This ongoing project consists of developing a flying, vision-based micro-robot, a bio-inspired controller composed of adaptive spiking neurons directly mapped into digital microcontrollers, and a method to evolve such a neural controller without human intervention. This article describes the motivation and methodology used to reach our goal as well as the results of a number of preliminary experiments on vision-based wheeled and flying robots.
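
    Not the EPFL controller: a minimal leaky integrate-and-fire network of the kind that can be mapped onto a small digital microcontroller, turning coarse vision inputs into a steering command; in the project described above the weights would be set by evolutionary search, whereas here they are random placeholders.

    ```python
    import numpy as np

    class SpikingController:
        """Tiny leaky integrate-and-fire network: vision inputs -> spikes -> steering."""

        def __init__(self, n_in=16, n_hidden=8, leak=0.9, threshold=1.0, seed=0):
            rng = np.random.default_rng(seed)
            self.w_in = rng.normal(0, 0.5, (n_hidden, n_in))   # placeholder weights
            self.w_out = rng.normal(0, 0.5, n_hidden)          # (evolution would set these)
            self.v = np.zeros(n_hidden)                        # membrane potentials
            self.leak, self.threshold = leak, threshold

        def step(self, pixels):
            """pixels: 1D array of coarse image intensities in [0, 1]."""
            self.v = self.leak * self.v + self.w_in @ pixels   # leaky integration of input
            spikes = (self.v >= self.threshold).astype(float)  # fire where threshold crossed
            self.v[spikes > 0] = 0.0                           # reset neurons that fired
            return float(np.tanh(self.w_out @ spikes))         # steering command in [-1, 1]

    ctrl = SpikingController()
    frame = np.random.default_rng(1).random(16)
    print([round(ctrl.step(frame), 3) for _ in range(5)])
    ```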

  5. Perception-based synthetic cueing for night vision device rotorcraft hover operations

    NASA Astrophysics Data System (ADS)

    Bachelder, Edward N.; McRuer, Duane

    2002-08-01

    Helicopter flight using night-vision devices (NVDs) is difficult to perform, as evidenced by the high accident rate associated with NVD flight compared to day operation. The approach proposed in this paper is to augment the NVD image with synthetic cueing, whereby the cues would emulate position and motion and appear to be actually occurring in physical space on which they are overlaid. Synthetic cues allow for selective enhancement of perceptual state gains to match the task requirements. A hover cue set was developed based on an analogue of a physical target used in a flight handling qualities tracking task, a perceptual task analysis for hover, and fundamentals of human spatial perception. The display was implemented on a simulation environment, constructed using a virtual reality device, an ultrasound head-tracker, and a fixed-base helicopter simulator. Seven highly trained helicopter pilots were used as experimental subjects and tasked to maintain hover in the presence of aircraft positional disturbances while viewing a synthesized NVD environment and the experimental hover cues. Significant performance improvements were observed when using synthetic cue augmentation. This paper demonstrates that artificial magnification of perceptual states through synthetic cueing can be an effective method of improving night-vision helicopter hover operations.

  6. Modeling of pilot's visual behavior for low-level flight

    NASA Astrophysics Data System (ADS)

    Schulte, Axel; Onken, Reiner

    1995-06-01

    Developers of synthetic vision systems for low-level flight simulators face the problem of deciding which features to incorporate in order to achieve the most realistic training conditions. This paper supports an approach to this problem based on modeling the pilot's visual behavior. The approach is founded on the basic requirement that the pilot's mechanisms of visual perception should be identical in simulated and real low-level flight. Flight simulator experiments with pilots were conducted for knowledge acquisition. During the experiments, video material of a real low-level flight mission containing different situations was displayed to the pilot, who was acting under a realistic mission assignment in a laboratory environment. The pilot's eye movements were measured during the replay. The visual mechanisms were divided into rule-based strategies for visual navigation, based on the preflight planning process, as opposed to skill-based processes. The paper presents a model of the pilot's planning strategy for a visual fixing routine as part of the navigation task. The model is a knowledge-based system built on the fuzzy evaluation of terrain features in order to determine the landmarks used by pilots. A computer implementation of the model can be shown to select the same features that trained pilots preferred.
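
    Not the authors' knowledge-based system: a toy sketch of fuzzy evaluation of terrain features as candidate fixation landmarks, combining triangular membership functions for conspicuity, proximity to the planned track, and uniqueness; the features, membership shapes, and the min (fuzzy AND) aggregation are invented for illustration.

    ```python
    def tri(x, a, b, c):
        """Triangular fuzzy membership on [a, c], peaking at b."""
        if x <= a or x >= c:
            return 0.0
        return (x - a) / (b - a) if x < b else (c - x) / (c - b)

    def landmark_score(feature):
        """Fuzzy suitability of a terrain feature as a visual fixing landmark."""
        conspicuous = tri(feature["contrast"], 0.2, 1.0, 1.5)      # stands out visually
        near_track  = tri(feature["offset_km"], -2.0, 0.0, 2.0)    # close to the planned route
        unique      = tri(feature["similar_nearby"], -1, 0, 3)     # few confusable features nearby
        # min acts as a fuzzy AND; the pilot model fixates the best-scoring feature
        return min(conspicuous, near_track, unique)

    features = [
        {"name": "church tower", "contrast": 0.9, "offset_km": 0.3, "similar_nearby": 0},
        {"name": "forest edge",  "contrast": 0.4, "offset_km": 1.5, "similar_nearby": 2},
    ]
    print(max(features, key=landmark_score)["name"])   # -> 'church tower'
    ```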

  7. Moving Towards a Common Ground and Flight Data Systems Architecture for NASA's Exploration Missions

    NASA Technical Reports Server (NTRS)

    Rader, Steve; Kearney, Mike; McVittie, Thom; Smith, Dan

    2006-01-01

    The National Aeronautics and Space Administration has embarked on an ambitious effort to return man to the moon and then on to Mars. The Exploration Vision requires development of major new space and ground assets and poses challenges well beyond those faced by many of NASA's recent programs. New crewed vehicles must be developed. Compatible supply vehicles, surface mobility modules and robotic exploration capabilities will supplement the manned exploration vehicle. New launch systems will be developed as well as a new ground communications and control infrastructure. The development must take place in a cost-constrained environment and must advance along an aggressive schedule. Common solutions and system interoperability will be critical to the successful development of the Exploration data systems for this wide variety of flight and ground elements. To this end, NASA has assembled a team of engineers from across the agency to identify the key challenges for Exploration data systems and to establish the most beneficial strategic approach to be followed. Key challenges and the planned NASA approach for flight and ground systems will be discussed in the paper. The described approaches will capitalize on new technologies, and will result in cross-program interoperability between spacecraft and ground systems, from multiple suppliers and agencies.

  8. Enhanced and Synthetic Vision for Terminal Maneuvering Area NextGen Operations

    NASA Technical Reports Server (NTRS)

    Kramer, Lynda J.; Bailey, Randall E.; Ellis, Kyle K. E.; Norman, R. Michael; Williams, Steven P.; Arthur, Jarvis J., III; Shelton, Kevin J.; Prinzel, Lawrence J., III

    2011-01-01

    Synthetic Vision Systems and Enhanced Flight Vision System (SVS/EFVS) technologies have the potential to provide additional margins of safety for aircrew performance and enable operational improvements for low visibility operations in the terminal area environment, with efficiency equivalent to visual operations. To meet this potential, research is needed for effective technology development and implementation of regulatory and design guidance to support introduction and use of SVS/EFVS advanced cockpit vision technologies in Next Generation Air Transportation System (NextGen) operations. A fixed-base pilot-in-the-loop simulation test was conducted at NASA Langley Research Center that evaluated the use of SVS/EFVS in NextGen low visibility ground (taxi) operations and approach/landing operations. Twelve crews flew approach and landing operations in a simulated NextGen Chicago O'Hare environment. Various scenarios tested the potential of EFVS for operations in visibility as low as 1000 ft runway visual range (RVR), and of SVS to enable lower decision heights (DH) than can be flown today. Expanding the EFVS visual segment from DH to the runway in visibilities as low as 1000 RVR appears to be viable, as touchdown performance was excellent without any workload penalties noted for the EFVS concept tested. A DH as low as 150 ft and/or possibly reduced visibility minima by virtue of SVS equipage appears to be viable when implemented on a Head-Up Display, but the landing data suggests further study for head-down implementations.

  9. Achieving the Air Force’s Energy Vision

    DTIC Science & Technology

    2011-01-01

    sor of logistics and supply chain management at AFIT. The US Air Force is the largest consumer of energy in the federal government, spending $9... technology, motors, advanced batteries, and ultracapacitors. AFIT is playing a critical role in meeting Air Force and industry demand for more... Flight Design displayed a parallel hybrid-electric propulsion system with an ICE and electric motor (fig. 2) for a general aviation aircraft. A

  10. Research Institute for Autonomous Precision Guided Systems

    DTIC Science & Technology

    2007-03-08

    research on agile autonomous munitions, in direct support of the Air Force Research Laboratory Munitions Directorate (AFRL/MN). The grant was awarded with a... Flight had (5) research task areas: 1. Aeroforms and Actuation for Small and Micro Agile Air Vehicles; 2. Sensing for Autonomous Control and... critical barriers in AAM, but are not covered in the scope of the AVCAAF (Vision-Based Control of Agile, Autonomous Micro Air Vehicles and Small UAVs

  11. Helmet-Mounted Display Design Guide

    DTIC Science & Technology

    1997-11-03

    flight conditions (Plaga, 1991). The system need not function during ejection; the criteria should be avoiding further injury. In the past, some... Optical Engineering: 29, 1990, 883-892; E. Peli, "Real Vision and Virtual Reality," Optics and Photonics News, July 1995, pp. 28-34; J. A. Plaga, I... landings. For ejection-seat-equipped aircraft, the cg requirements are based on the I-NIGHTS studies (Plaga, 1991, and Stickly and Wiley, 1992). The

  12. Space Industry. Industry Study, Spring 2009

    DTIC Science & Technology

    2009-01-01

    Space Flight Center, Cocoa Beach, FL; Cape Canaveral Air Force Station, Cocoa Beach, FL; Naval Ordnance Test Unit, Cocoa Beach, FL; 50th Space Wing... America." In 2009, as we celebrate the 40th anniversary of the fulfillment of that vision, it is appropriate to pause and reflect on how far we... value system, providing high-value services to both government and commercial consumers. The estimate of international and U.S. government consumption

  13. Variably Transmittive, Electronically-Controlled Eyewear

    NASA Technical Reports Server (NTRS)

    Chapman, John J. (Inventor); Glaab, Louis J. (Inventor); Schott, Timothy D. (Inventor); Howell, Charles T. (Inventor); Fleck, Vincent J. (Inventor)

    2013-01-01

    A system and method for flight training and evaluation of pilots comprises electronically activated vision restriction glasses that detect the pilot's head position and automatically darken and restrict the pilot's ability to see through the front and side windscreens when the pilot-in-training attempts to see out the windscreen. Thus, the pilot-in-training sees only within the aircraft cockpit, forcing him or her to fly by instruments in the most restricted operational mode.

  14. The hazard of spatial disorientation during helicopter flight using night vision devices.

    PubMed

    Braithwaite, M G; Douglass, P K; Durnford, S J; Lucas, G

    1998-11-01

    Night Vision Devices (NVDs) provide an enormous advantage to the operational effectiveness of military helicopter flying by permitting flight throughout the night. However, compared with daytime flight, many of the depth perception and orientational cues are severely degraded. These degraded cues predispose aviators to spatial disorientation (SD), which is a serious drawback of these devices. As part of an overall analysis of Army helicopter accidents to assess the impact of SD on military flying, we scrutinized the class A-C mishap reports involving night-aided flight from 1987 to 1995. The accidents were classified according to the role of SD by three independent assessors, with the SD group further analyzed to determine associated factors and possible countermeasures. Almost 43% of all SD-related accidents in this series occurred during flight using NVDs, whereas only 13% of non-SD accidents involved NVDs. An examination of the SD accident rates per 100,000 flying hours revealed a significant difference between the rate for day flying and the rate for flight using NVDs (mean rate for daytime flight = 1.66, mean rate for NVD flight = 9.00, p < 0.001). The most important factors associated with these accidents were related to equipment limitations, distraction from the task, and training or procedural inadequacies. SD remains an important source of attrition of Army aircraft. The more than fivefold increase in risk associated with NVD flight is of serious concern. The associated factors and suggested countermeasures should be urgently addressed.

  15. Artificial Gravity as a Multi-System Countermeasure for Exploration Class Space Flight Missions

    NASA Technical Reports Server (NTRS)

    Paloski, William H.; Dawson, David L. (Technical Monitor)

    2000-01-01

    NASA's vision for space exploration includes missions of unprecedented distance and duration. However, during 30 years of human space flight experience, including numerous long-duration missions, research has not produced any single countermeasure or combination of countermeasures that is completely effective. Current countermeasures do not fully protect crews in low-Earth orbit, and certainly will not be appropriate for crews journeying to Mars and back over a three-year period. The urgency for exploration-class countermeasures is compounded by continued technical and scientific successes that make exploration class missions increasingly attractive. The critical and possibly fatal problems of bone loss, cardiovascular deconditioning, muscle weakening, neurovestibular disturbance, space anemia, and immune compromise may be alleviated by the appropriate application of artificial gravity (AG). However, despite a manifest need for new countermeasure approaches, concepts for applying AG as a countermeasure have not developed apace. To explore the utility of AG as a multi-system countermeasure during long-duration, exploration-class space flight, eighty-three members of the international space life science and space flight community met earlier this year. They concluded unanimously that the potential of AG as a multi-system countermeasure is indeed worth pursuing, and that the requisite AG research needs to be supported more systematically by NASA. This presentation will review the issues discussed and recommendations made.

  16. Application of Fiber Optic Instrumentation

    NASA Technical Reports Server (NTRS)

    Richards, William Lance; Parker, Allen R., Jr.; Ko, William L.; Piazza, Anthony; Chan, Patrick

    2012-01-01

    Fiber optic sensing technology has emerged in recent years offering tremendous advantages over conventional aircraft instrumentation systems. The advantages of fiber optic sensors over their conventional counterparts are well established; they are lighter, smaller, and can provide enormous numbers of measurements at a fraction of the total sensor weight. After a brief overview of conventional and fiber-optic sensing technology, this paper presents an overview of the research that has been conducted at NASA Dryden Flight Research Center in recent years to advance this promising new technology. Research and development areas include system and algorithm development, sensor characterization and attachment, and real-time experimentally-derived parameter monitoring for ground- and flight-based applications. The vision of fiber optic smart structure technology is presented, along with its potential benefits to aerospace vehicles throughout the lifecycle, from preliminary design to final retirement.

  17. Experimental results in autonomous landing approaches by dynamic machine vision

    NASA Astrophysics Data System (ADS)

    Dickmanns, Ernst D.; Werner, Stefan; Kraus, S.; Schell, R.

    1994-07-01

    The 4-D approach to dynamic machine vision, exploiting full spatio-temporal models of the process to be controlled, has been applied to onboard autonomous landing approaches of aircraft. Aside from image sequence processing, for which it was developed initially, it is also used for data fusion from a range of sensors. By prediction-error feedback, an internal representation of the aircraft state relative to the runway in 3-D space and time is servo-maintained in the interpretation process, from which the required control actions are derived. The validity and efficiency of the approach have been proven both in hardware-in-the-loop simulations and in flight experiments with a twin-turboprop Do128 aircraft under perturbations from cross winds and wind gusts. The software package has been ported to 'C' and onto a new transputer image processing platform; the system has been expanded for bifocal vision with two cameras of different focal length mounted fixed relative to each other on a two-axis platform for viewing direction control.
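
    Not Dickmanns' 4-D software: a compact scalar illustration of the prediction-error feedback idea, in which an internal state (lateral offset from the runway centreline and its rate) is servo-maintained by predicting the next image measurement and feeding the residual back into the state; the alpha-beta gains and measurement model are stand-ins for the full spatio-temporal filter.

    ```python
    def track_lateral_offset(measurements, dt=0.04, alpha=0.3, beta=0.1):
        """Servo-maintain an internal state (offset, offset rate) by prediction-error feedback.

        measurements: image-derived lateral offsets from the runway centreline (metres),
        one per video frame. Each cycle predicts the next measurement from the internal
        model, then corrects the model with the residual (an alpha-beta observer).
        """
        y, v = measurements[0], 0.0          # internal representation: offset and its rate
        history = []
        for z in measurements[1:]:
            y_pred = y + v * dt              # prediction from the internal model
            resid = z - y_pred               # prediction error from image processing
            y = y_pred + alpha * resid       # feed the error back into the state
            v = v + (beta / dt) * resid
            history.append((y, v))
        return history

    # e.g. track_lateral_offset([3.0, 2.8, 2.7, 2.4, 2.2]) -> successive (offset, rate)
    # estimates that a lateral control law for the landing approach could act on.
    ```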

  18. Selectable Hyperspectral Airborne Remote-sensing Kit (SHARK) on the Vision II turbine rotorcraft UAV over the Florida Keys

    NASA Astrophysics Data System (ADS)

    Holasek, R. E.; Nakanishi, K.; Swartz, B.; Zacaroli, R.; Hill, B.; Naungayan, J.; Herwitz, S.; Kavros, P.; English, D. C.

    2013-12-01

    As part of the NASA ROSES program, the NovaSol Selectable Hyperspectral Airborne Remote-sensing Kit (SHARK) was flown as the payload on the unmanned Vision II helicopter. The goal of the May 2013 data collection was to obtain high resolution visible and near-infrared (visNIR) hyperspectral data of seagrasses and coral reefs in the Florida Keys. The specifications of the SHARK hyperspectral system and the Vision II turbine rotorcraft will be described along with the process of integrating the payload to the vehicle platform. The minimal size, weight, and power (SWaP) requirements of the SHARK system make it an ideal match for the Vision II helicopter and its flight parameters. One advantage of the helicopter over fixed-wing platforms is its inherent ability to take off and land in a limited area without a runway, enabling the UAV to be located in close proximity to the experiment areas and the science team. Decisions regarding integration times, waypoint selection, mission duration, and mission frequency can be based on the local environmental conditions and can be modified just prior to takeoff. The operational procedures and coordination between the UAV pilot, payload operator, and scientist will be described. The SHARK system includes an inertial navigation system and digital elevation model (DEM), which allow image coordinates to be calculated onboard the aircraft in real time. Examples of the geo-registered images from the data collection will be shown. [Figure captions: SHARK mounted below the VTUAV; SHARK deployed on the VTUAV over water.]

  19. Modifying and Testing ATC Controller Interface (CI) for Data Link Clearances

    NASA Technical Reports Server (NTRS)

    2001-01-01

    The Controller-Pilot Data Link Communications (CPDLC) and Air Traffic Control workstation research was conducted as part of the 1997 NASA Low Visibility Landing and Surface Operations (LVLASO) demonstration program at Atlanta Hartsfield airport. Research activity under this grant increased the sophistication of the Controllers' Communication and Situational Awareness Terminal (C-CAST) and developed a VHF Data Link -Mode 2 communications platform. The research culminated with participation in the 2000 NASA Aviation Safety Program's Synthetic Vision System (SVS) / Runway Incursion Prevention System (RIPS) flight demonstration at Dallas-Fort Worth Airport.

  20. Human Factors and Safety Considerations of Night Vision Systems Flight Using Thermal Imaging Systems

    DTIC Science & Technology

    1990-04-01

    tank with engine, drive wheels that are ... very dark in the least bright ... and exhaust at temperatures between ... be presented to the pilot on a miniature (1-inch diameter) cathode ray tube (CRT) in the ... sources, when viewed through the combiner, degrade the ... performance. Also, flames can affect the display's performance as well. The see-through characteristics of the display permit ... Unlike systems using the

  1. Time of flight imaging through scattering environments (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Le, Toan H.; Breitbach, Eric C.; Jackson, Jonathan A.; Velten, Andreas

    2017-02-01

    Light scattering is a primary obstacle to imaging in many environments. On small scales, in biomedical microscopy and diffuse tomography scenarios, scattering is caused by tissue. On larger scales, scattering from dust and fog challenges vision systems for self-driving cars and naval remote imaging systems. We are developing scale models of scattering environments and investigating methods for improved imaging, particularly using time-of-flight transient information. With the emergence of Single Photon Avalanche Diode detectors and fast semiconductor lasers, illumination and capture on picosecond timescales are becoming possible in inexpensive, compact, and robust devices. This opens up opportunities for new computational imaging techniques that make use of photon time of flight. Time-of-flight or range information is used in remote imaging scenarios in gated viewing, and in biomedical imaging in time-resolved diffuse tomography. In addition, spatial filtering is popular in biomedical scenarios with structured illumination and confocal microscopy. We present a combination of analytical, computational, and experimental models that allows us to develop and test imaging methods across scattering scenarios and scales. This framework will be used for proof-of-concept experiments to evaluate new computational imaging methods.
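
    Not the authors' experimental code: a toy illustration of time-of-flight gated viewing, where photon arrivals are kept only if their time of flight matches the round trip to the target range, suppressing earlier arrivals scattered by fog; the arrival-time distributions are simulated and the gate width is arbitrary.

    ```python
    import numpy as np

    C = 3e8  # speed of light, m/s

    def range_gate(arrival_times_ns, target_range_m, gate_width_ns=2.0):
        """Keep only photon arrivals whose time of flight matches the target range."""
        t_expected = 2.0 * target_range_m / C * 1e9          # round-trip time in ns
        t = np.asarray(arrival_times_ns)
        return t[np.abs(t - t_expected) <= gate_width_ns / 2]

    # Simulated histogram: fog scatters many photons back early, while returns from a
    # target at 30 m arrive near the expected round-trip time of ~200 ns.
    rng = np.random.default_rng(0)
    fog = rng.exponential(5.0, 5000)                          # early, scattered arrivals (ns)
    target = rng.normal(2.0 * 30.0 / C * 1e9, 0.2, 200)       # target returns (ns)
    kept = range_gate(np.concatenate([fog, target]), target_range_m=30.0)
    print(len(kept), "photons pass the gate out of", len(fog) + len(target))
    ```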

  2. COBALT: A GN&C Payload for Testing ALHAT Capabilities in Closed-Loop Terrestrial Rocket Flights

    NASA Technical Reports Server (NTRS)

    Carson, John M., III; Amzajerdian, Farzin; Hines, Glenn D.; O'Neal, Travis V.; Robertson, Edward A.; Seubert, Carl; Trawny, Nikolas

    2016-01-01

    The COBALT (CoOperative Blending of Autonomous Landing Technology) payload is being developed within NASA as a risk reduction activity to mature, integrate and test ALHAT (Autonomous precision Landing and Hazard Avoidance Technology) systems targeted for infusion into near-term robotic and future human space flight missions. The initial COBALT payload instantiation is integrating the third-generation ALHAT Navigation Doppler Lidar (NDL) sensor, for ultra high-precision velocity plus range measurements, with the passive-optical Lander Vision System (LVS) that provides Terrain Relative Navigation (TRN) global-position estimates. The COBALT payload will be integrated onboard a rocket-propulsive terrestrial testbed and will provide precise navigation estimates and guidance planning during two flight test campaigns in 2017 (one open-loop and one closed-loop). The NDL is targeting performance capabilities desired for future Mars and Moon Entry, Descent and Landing (EDL). The LVS is already baselined for TRN on the Mars 2020 robotic lander mission. The COBALT platform will provide NASA with a new risk-reduction capability to test integrated EDL Guidance, Navigation and Control (GN&C) components in closed-loop flight demonstrations prior to the actual mission EDL.
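
    Not the COBALT navigation filter: a deliberately simplified one-dimensional illustration of blending a high-rate velocity sensor (NDL-like) with lower-rate map-relative position fixes (LVS-like) in a predict/update loop; the state layout, rates, and noise values are invented, and a position fix at the first step is assumed for initialization.

    ```python
    import numpy as np

    def fuse_vel_and_pos(vel_meas, pos_fixes, dt=0.1, r_vel=0.01, r_pos=4.0):
        """1-D example: propagate with velocity measurements, correct with position fixes.

        vel_meas  : per-step velocity measurements (m/s), high rate
        pos_fixes : dict mapping step index -> map-relative position fix (m), lower rate
        """
        x = np.array([pos_fixes[0], vel_meas[0]])    # state: [position, velocity]
        P = np.diag([4.0, 0.1])
        F = np.array([[1.0, dt], [0.0, 1.0]])        # constant-velocity propagation
        Q = np.diag([0.01, 0.01])
        for k, v in enumerate(vel_meas):
            x = F @ x; P = F @ P @ F.T + Q           # predict
            updates = [(np.array([0.0, 1.0]), v, r_vel)]              # velocity update every step
            if k in pos_fixes:
                updates.append((np.array([1.0, 0.0]), pos_fixes[k], r_pos))  # occasional fix
            for H, z, r in updates:
                S = H @ P @ H + r
                K = P @ H / S
                x = x + K * (z - H @ x)
                P = P - np.outer(K, H @ P)
        return x

    # Example: a descent at -2 m/s starting at 100 m, with position fixes every 10 steps.
    vel = [-2.0] * 50
    pos = {k: 100.0 - 2.0 * 0.1 * k for k in range(0, 50, 10)}
    print(fuse_vel_and_pos(vel, pos))   # roughly [90, -2] after 5 s of descent
    ```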

  3. Humans and machines in space: The vision, the challenge, the payoff; Proceedings of the 29th Goddard Memorial Symposium, Washington, Mar. 14, 15, 1991

    NASA Astrophysics Data System (ADS)

    Johnson, Bradley; May, Gayle L.; Korn, Paula

    The present conference discusses the currently envisioned goals of human-machine systems in spacecraft environments, prospects for human exploration of the solar system, and plausible methods for meeting human needs in space. Also discussed are the problems of human-machine interaction in long-duration space flights, remote medical systems for space exploration, the use of virtual reality for planetary exploration, the alliance between U.S. Antarctic and space programs, and the economic and educational impacts of the U.S. space program.

  4. Enhanced vision flight deck technology for commercial aircraft low-visibility surface operations

    NASA Astrophysics Data System (ADS)

    Arthur, Jarvis J.; Norman, R. M.; Kramer, Lynda J.; Prinzel, Lawerence J.; Ellis, Kyle K.; Harrison, Stephanie J.; Comstock, J. R.

    2013-05-01

    NASA Langley Research Center and the FAA collaborated in an effort to evaluate the effect of Enhanced Vision (EV) technology display in a commercial flight deck during low visibility surface operations. Surface operations were simulated at the Memphis, TN (FAA identifier: KMEM) airfield during nighttime with 500 Runway Visual Range (RVR) in a high-fidelity, full-motion simulator. Ten commercial airline flight crews evaluated the efficacy of various EV display locations and parallax and minification effects. The research paper discusses qualitative and quantitative results of the simulation experiment, including the effect of EV display placement on visual attention, as measured by the use of non-obtrusive oculometry and pilot mental workload. The results demonstrated the potential of EV technology to enhance situation awareness which is dependent on the ease of access and location of the displays. Implications and future directions are discussed.

  5. Enhanced Vision Flight Deck Technology for Commercial Aircraft Low-Visibility Surface Operations

    NASA Technical Reports Server (NTRS)

    Arthur, Jarvis J., III; Norman, R. Michael; Kramer, Lynda J.; Prinzel, Lawrence J., III; Ellis, Kyle K. E.; Harrison, Stephanie J.; Comstock, J. Ray

    2013-01-01

    NASA Langley Research Center and the FAA collaborated in an effort to evaluate the effect of Enhanced Vision (EV) technology display in a commercial flight deck during low visibility surface operations. Surface operations were simulated at the Memphis, TN (FAA identifier: KMEM) airfield during nighttime with 500 Runway Visual Range (RVR) in a high-fidelity, full-motion simulator. Ten commercial airline flight crews evaluated the efficacy of various EV display locations and parallax and minification effects. The research paper discusses qualitative and quantitative results of the simulation experiment, including the effect of EV display placement on visual attention, as measured by the use of non-obtrusive oculometry and pilot mental workload. The results demonstrated the potential of EV technology to enhance situation awareness which is dependent on the ease of access and location of the displays. Implications and future directions are discussed.

  6. Vision Changes after Space Flight Are Related to Alterations in Folate-Dependent One-Carbon Metabolism

    NASA Technical Reports Server (NTRS)

    Smith, Scott M.; Gibson, C. Robert; Mader, Thomas H.; Ericson, Karen; Ploutz-Snyder, Robert; Heer, Martina; Zwart, Sara R.

    2011-01-01

    About 20% of astronauts on International Space Station missions have developed measurable ophthalmic changes after flight. This study was conducted to determine whether the folate-dependent 1-carbon pathway is altered in these individuals. Data were modeled to evaluate differences between individuals with ophthalmic changes (n=5) and those without them (n=15). We also correlated mean preflight serum concentrations of the 1-carbon metabolites with changes in measured refraction after flight. Serum homocysteine (HCy), cystathionine, 2-methylcitric acid, and methylmalonic acid concentrations were 25%-45% higher (P<0.001) in astronauts with ophthalmic changes than in those without them. These differences existed before, during, and after flight. Preflight serum HCy and cystathionine, and in-flight serum folate, were significantly (P<0.05) correlated with postflight change in refraction, and preflight serum concentrations of 2-methylcitric acid tended to be associated (P=0.06) with ophthalmic changes. The biochemical differences observed in those with vision issues strongly suggest impairment of the folate-dependent 1-carbon transfer pathway. Impairment of this pathway, by polymorphisms, diet or other means, may interact with components of the microgravity environment to influence these pathophysiologic changes. This study was funded by the NASA Human Research Program.

  7. Automated vision occlusion-timing instrument for perception-action research.

    PubMed

    Brenton, John; Müller, Sean; Rhodes, Robbie; Finch, Brad

    2018-02-01

    Vision occlusion spectacles are a highly valuable instrument for visual-perception-action research in a variety of disciplines. In sports, occlusion spectacles have enabled invaluable knowledge to be obtained about the superior capability of experts to use visual information to guide actions within in-situ settings. Triggering the spectacles to occlude a performer's vision at a precise time in an opponent's action or object flight has been problematic, due to experimenter error in using a manual button-press approach. This article describes a new laser curtain wireless trigger for vision occlusion spectacles that is portable and fast in terms of its transmission time. The laser curtain can be positioned in a variety of orientations to accept a motion trigger, such as a cricket bowler's arm that distorts the lasers, which then activates a wireless signal for the occlusion spectacles to change from transparent to opaque, which occurs in only 8 ms. Results are reported from calculations done in an electronics laboratory, as well as from tests in a performance laboratory with a cricket bowler and a baseball pitcher, which verified this short time delay before vision occlusion. In addition, our results show that occlusion consistently occurred when it was intended, that is, near ball release and during mid-ball flight. Only 8% of the collected data trials were unusable. The laser curtain improves upon the limitations of existing vision occlusion spectacle triggers, indicating that it is a valuable instrument for perception-action research in a variety of disciplines.

  8. Runway Safety Monitor Algorithm for Runway Incursion Detection and Alerting

    NASA Technical Reports Server (NTRS)

    Green, David F., Jr.; Jones, Denise R. (Technical Monitor)

    2002-01-01

    The Runway Safety Monitor (RSM) is an algorithm for runway incursion detection and alerting that was developed in support of NASA's Runway Incursion Prevention System (RIPS) research conducted under the NASA Aviation Safety Program's Synthetic Vision System element. The RSM algorithm provides pilots with enhanced situational awareness and warnings of runway incursions in sufficient time to take evasive action and avoid accidents during landings, takeoffs, or taxiing on the runway. The RSM currently runs as a component of the NASA Integrated Display System, an experimental avionics software system for terminal area and surface operations. However, the RSM algorithm can be implemented as a separate program to run on any aircraft with traffic data link capability. The report documents the RSM software and describes in detail how RSM performs runway incursion detection and alerting functions for NASA RIPS. The report also describes the RIPS flight tests conducted at the Dallas-Ft Worth International Airport (DFW) during September and October of 2000, and the RSM performance results and lessons learned from those flight tests.
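
    Not the RSM algorithm itself: a toy sketch of the basic test a runway safety monitor must perform, checking whether other traffic occupies the protected zone of the runway ownship is using and whether the closure rate makes a conflict imminent; the zone dimensions, alert time, and traffic states are invented.

    ```python
    from dataclasses import dataclass

    @dataclass
    class Target:
        x: float       # metres along the runway centreline
        y: float       # metres off the centreline
        vx: float      # speed along the runway, m/s

    RUNWAY_LENGTH = 3000.0      # illustrative runway dimensions
    ZONE_HALF_WIDTH = 75.0      # illustrative protected-zone half width, metres

    def in_protected_zone(t: Target) -> bool:
        return -100.0 <= t.x <= RUNWAY_LENGTH + 100.0 and abs(t.y) <= ZONE_HALF_WIDTH

    def incursion_alert(ownship: Target, traffic: list, alert_time_s: float = 15.0):
        """Alert if other traffic shares ownship's runway zone with hazardous closure."""
        if not in_protected_zone(ownship):
            return None
        for t in traffic:
            if not in_protected_zone(t):
                continue
            gap = abs(t.x - ownship.x)
            closure = (ownship.vx - t.vx) if ownship.x < t.x else (t.vx - ownship.vx)
            if closure > 0 and gap / closure < alert_time_s:
                return f"RUNWAY INCURSION: traffic {gap:.0f} m away, {gap / closure:.1f} s to conflict"
        return None

    own = Target(x=200.0, y=0.0, vx=70.0)        # ownship on its takeoff roll
    intruder = Target(x=900.0, y=10.0, vx=5.0)   # aircraft taxiing on the same runway
    print(incursion_alert(own, [intruder]))
    ```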

  9. Open-Loop Performance of COBALT Precision Landing Payload on a Commercial Sub-Orbital Rocket

    NASA Technical Reports Server (NTRS)

    Restrepo, Carolina I.; Carson, John M., III; Amzajerdian, Farzin; Seubert, Carl R.; Lovelace, Ronney S.; McCarthy, Megan M.; Tse, Teming; Stelling, Richard; Collins, Steven M.

    2018-01-01

    An open-loop flight test campaign of the NASA COBALT (CoOperative Blending of Autonomous Landing Technologies) platform was conducted onboard the Masten Xodiac suborbital rocket testbed. The COBALT platform integrates NASA Guidance, Navigation and Control (GN&C) sensing technologies for autonomous, precise soft landing, including the Navigation Doppler Lidar (NDL) velocity and range sensor and the Lander Vision System (LVS) Terrain Relative Navigation (TRN) system. A specialized navigation filter running onboard COBALT fuses the NDL and LVS data in real time to produce a navigation solution that is independent of GPS and suitable for future autonomous planetary landing systems. COBALT was a passive payload during the open-loop tests. COBALT's sensors were actively taking data and processing it in real time, but the Xodiac rocket flew with its own GPS navigation system as a risk reduction activity in the maturation of the technologies towards space flight. A future closed-loop test campaign is planned where the COBALT navigation solution will be used to fly its host vehicle.

  10. Vision requirements for Space Station applications

    NASA Technical Reports Server (NTRS)

    Crouse, K. R.

    1985-01-01

    Problems which will be encountered by computer vision systems in Space Station operations are discussed, along with solutions being examined at the Johnson Space Center. Lighting cannot be controlled in space, nor can the random presence of reflective surfaces. Task-oriented capabilities are to include docking to moving objects, identification of unexpected objects during autonomous flights to different orbits, and diagnoses of damage and repair requirements for autonomous Space Station inspection robots. The approaches being examined to provide these and other capabilities include television and IR sensors, advanced pattern recognition programs feeding on data from laser probes, laser radar for robot eyesight, and arrays of SMART sensors for automated location and tracking of target objects. Attention is also being given to liquid crystal light valves for optical processing of images for comparisons with on-board electronic libraries of images.

  11. Distracting people from sources of discomfort in a simulated aircraft environment.

    PubMed

    Lewis, Laura; Patel, Harshada; Cobb, Sue; D'Cruz, Mirabelle; Bues, Matthias; Stefani, Oliver; Grobler, Tredeaux

    2016-07-19

    Comfort is an important factor in the acceptance of transport systems. In 2010 and 2011, the European Commission (EC) put forward its vision for air travel in the year 2050, which envisaged the use of in-flight virtual reality. This paper addresses the EC vision by investigating the effect of virtual environments on comfort. Research has shown that virtual environments can provide entertaining experiences and can be effective distracters from painful experiences. The objective was to determine the extent to which a virtual environment could distract people from sources of discomfort. Experiments involved inducing discomfort commonly experienced in flight (e.g., limited space, noise) in order to determine the extent to which viewing a virtual environment could distract people from discomfort. Virtual environments can fully or partially distract people from sources of discomfort, becoming more effective when they are interesting. They are also more effective at distracting people from discomfort caused by restricted space than from noise disturbances. Virtual environments have the potential to enhance passenger comfort by providing positive distractions from sources of discomfort. Further research is required to understand more fully why the effect was stronger for one source of discomfort than the other.

  12. A strategic vision for telemedicine and medical informatics in space flight

    NASA Technical Reports Server (NTRS)

    Williams, D. R.; Bashshur, R. L.; Pool, S. L.; Doarn, C. R.; Merrell, R. C.; Logan, J. S.

    2000-01-01

    This Workshop was designed to assist in the ongoing development and application of telemedicine and medical informatics to support extended space flight. Participants included specialists in telemedicine and medical/health informatics (terrestrial and space) medicine from NASA, federal agencies, academic centers, and research and development institutions located in the United States and several other countries. The participants in the working groups developed vision statements, requirements, approaches, and recommendations for developing and implementing a strategy for telemedicine and medical informatics. Although some of the conclusions and recommendations reflect ongoing work at NASA, others provided new insight and direction that may require a reprioritization of current NASA efforts in telemedicine and medical informatics. This, however, was the goal of the Workshop. NASA is seeking other perspectives and views from leading practitioners in the fields of telemedicine and medical informatics to invigorate an essential and high-priority component of the International Space Station and future extended exploration missions. Subsequent workshops will further define and refine the general findings and recommendations achieved here. NASA's ultimate aim is to build a sound telemedicine and medical informatics operational system to provide the best medical care available for astronauts going to Mars and beyond.

  13. A strategic vision for telemedicine and medical informatics in space flight.

    PubMed

    Williams, D R; Bashshur, R L; Pool, S L; Doarn, C R; Merrell, R C; Logan, J S

    2000-01-01

    This Workshop was designed to assist in the ongoing development and application of telemedicine and medical informatics to support extended space flight. Participants included specialists in telemedicine and medical/health informatics (terrestrial and space) medicine from NASA, federal agencies, academic centers, and research and development institutions located in the United States and several other countries. The participants in the working groups developed vision statements, requirements, approaches, and recommendations for developing and implementing a strategy for telemedicine and medical informatics. Although some of the conclusions and recommendations reflect ongoing work at NASA, others provided new insight and direction that may require a reprioritization of current NASA efforts in telemedicine and medical informatics. This, however, was the goal of the Workshop. NASA is seeking other perspectives and views from leading practitioners in the fields of telemedicine and medical informatics to invigorate an essential and high-priority component of the International Space Station and future extended exploration missions. Subsequent workshops will further define and refine the general findings and recommendations achieved here. NASA's ultimate aim is to build a sound telemedicine and medical informatics operational system to provide the best medical care available for astronauts going to Mars and beyond.

  14. Landmark-aided localization for air vehicles using learned object detectors

    NASA Astrophysics Data System (ADS)

    DeAngelo, Mark Patrick

    This research presents two methods to localize an aircraft without GPS using fixed landmarks observed from an optical sensor. Onboard absolute localization is useful for vehicle navigation free from an external network. The objective is to achieve practical navigation performance using available autopilot hardware and a downward pointing camera. The first method uses computer vision cascade object detectors, which are trained to detect predetermined, distinct landmarks prior to a flight. The first method also concurrently explores aircraft localization using roads between landmark updates. During a flight, the aircraft navigates with attitude, heading, airspeed, and altitude measurements and obtains measurement updates when landmarks are detected. The sensor measurements and landmark coordinates extracted from the aircraft's camera images are combined into an unscented Kalman filter to obtain an estimate of the aircraft's position and wind velocities. The second method uses computer vision object detectors to detect abundant generic landmarks referred to as buildings, fields, trees, and road intersections from aerial perspectives. Various landmark attributes and spatial relationships to other landmarks are used to help associate observed landmarks with reference landmarks. The computer vision algorithms automatically extract reference landmarks from maps, which are processed offline before a flight. During a flight, the aircraft navigates with attitude, heading, airspeed, and altitude measurements and obtains measurement corrections by processing aerial photos with similar generic landmark detection techniques. The method also combines sensor measurements and landmark coordinates into an unscented Kalman filter to obtain an estimate of the aircraft's position and wind velocities.
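
    As a rough illustration of the filter structure described above (dead-reckoning with airspeed, heading, and an unknown wind, corrected by landmark-derived position fixes), here is a much-simplified linear Kalman filter over [x, y, wind_x, wind_y]. The research uses an unscented Kalman filter with richer models; all matrices and noise values below are illustrative assumptions.

    ```python
    import numpy as np

    # Simplified, hypothetical sketch: dead-reckon position with airspeed/heading
    # plus an unknown wind, then correct with a position fix derived from a
    # detected landmark. A linear filter is used here only to show the
    # predict/update structure; all values are illustrative assumptions.

    dt = 1.0
    F = np.eye(4)
    F[0, 2] = dt                              # wind advects position
    F[1, 3] = dt
    Q = np.diag([1.0, 1.0, 0.01, 0.01])       # process noise
    H = np.array([[1.0, 0, 0, 0],
                  [0, 1.0, 0, 0]])            # a landmark fix observes x, y
    R = np.diag([25.0, 25.0])                 # fix noise, m^2

    x = np.zeros(4)                           # position (m) and wind (m/s) estimate
    P = np.diag([10.0, 10.0, 4.0, 4.0])

    def predict(x, P, airspeed, heading):
        u = dt * airspeed * np.array([np.cos(heading), np.sin(heading), 0.0, 0.0])
        return F @ x + u, F @ P @ F.T + Q

    def update(x, P, landmark_fix):
        y = landmark_fix - H @ x              # innovation
        S = H @ P @ H.T + R
        K = P @ H.T @ np.linalg.inv(S)        # Kalman gain
        return x + K @ y, (np.eye(4) - K @ H) @ P

    # One cycle: fly north-east at 30 m/s, then a detector fires on a known landmark.
    x, P = predict(x, P, airspeed=30.0, heading=np.deg2rad(45.0))
    x, P = update(x, P, landmark_fix=np.array([22.0, 20.5]))
    print(np.round(x, 2))   # position pulled toward the fix; wind states adjust
    ```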

  15. Human Requirements of Flight. Aviation and Spaceflight. Aerospace Education III.

    ERIC Educational Resources Information Center

    Coard, E. A.

    This book, one in the series on Aerospace Education III, deals with the general nature of human physiology during space flights. Chapter 1 begins with a brief discussion of the nature of the atmosphere. Other topics examined in this chapter include respiration and circulation, principles and problems of vision, noise and vibration, and…

  16. Antimicrobial Medication Stability During Space Flight

    NASA Technical Reports Server (NTRS)

    Putcha, Lakshmi; Berens, Kurt; Du, Jianping

    2004-01-01

    The current vision for manned space flight involves lunar and Martian exploration within the next two decades. In order for NASA to achieve these goals, a significant amount of preparation is necessary to assure crew health and safety. A mission critical component of this vision centers around the stability of pharmaceutical preparations contained in the space medicine kits. Evidence suggests that even brief periods of space flight have significant detrimental effects for some pharmaceutical formulations. The effects observed include decreases in physical stability of drug formulations of sufficient magnitude to affect bioavailability. Other formulations exhibit decreases in chemical stability resulting in a loss of potency. Physical or chemical instability of pharmaceutical formulations in space medicine kits could render the products ineffective. Of additional concern is the potential for formation of toxic degradation products as a result of the observed product instability. This proposal addresses Question number 11 of Clinical Capabilities in the Critical Path Roadmap. In addition, this proposal will reduce the risks and/or enhance the capabilities of humans exposed to the environments of space flight or an extraterrestrial destination by identifying drugs that may be unstable during spaceflight.

  17. Computational imaging of light in flight

    NASA Astrophysics Data System (ADS)

    Hullin, Matthias B.

    2014-10-01

    Many computer vision tasks are hindered by image formation itself, a process that is governed by the so-called plenoptic integral. By averaging light falling into the lens over space, angle, wavelength and time, a great deal of information is irreversibly lost. The emerging idea of transient imaging operates on a time resolution fast enough to resolve non-stationary light distributions in real-world scenes. It enables the discrimination of light contributions by the optical path length from light source to receiver, a dimension unavailable in mainstream imaging to date. Until recently, such measurements used to require high-end optical equipment and could only be acquired under extremely restricted lab conditions. To address this challenge, we introduced a family of computational imaging techniques operating on standard time-of-flight image sensors, for the first time allowing the user to "film" light in flight in an affordable, practical and portable way. Just as impulse responses have proven a valuable tool in almost every branch of science and engineering, we expect light-in-flight analysis to impact a wide variety of applications in computer vision and beyond.
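
    One way to state the core idea (notation mine, not the paper's): a conventional image is the time integral of a transient image that resolves light contributions by optical path length,

    \[
    I(x, y) = \int_{0}^{\infty} I_\tau(x, y, \tau)\, \mathrm{d}\tau, \qquad \ell = c\,\tau,
    \]

    where I_τ(x, y, τ) is the transient image at time of flight τ, ℓ = cτ the corresponding source-to-sensor path length, and c the speed of light. Transient imaging recovers I_τ rather than only its integral.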

  18. 77 FR 5990 - Special Conditions: Learjet Inc., Model LJ-200-1A10 Airplane, Pilot-Compartment View Through...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-02-07

    ... airflow to maintain a clear-vision area. The heavy rain and high speed conditions specified in the current... to maintain a sufficiently clear area of the windshield in low-speed flight or during surface... airflow disturbance or separation on the windshield could cause failure to maintain a clear-vision area on...

  19. The Effects of Synthetic and Enhanced Vision Technologies for Lunar Landings

    NASA Technical Reports Server (NTRS)

    Kramer, Lynda J.; Norman, Robert M.; Prinzel, Lawrence J., III; Bailey, Randall E.; Arthur, Jarvis J., III; Shelton, Kevin J.; Williams, Steven P.

    2009-01-01

    Eight pilots participated as test subjects in a fixed-based simulation experiment to evaluate advanced vision display technologies such as Enhanced Vision (EV) and Synthetic Vision (SV) for providing terrain imagery on flight displays in a Lunar Lander Vehicle. Subjects were asked to fly 20 approaches to the Apollo 15 lunar landing site with four different display concepts - Baseline (symbology only with no terrain imagery), EV only (terrain imagery from Forward Looking Infra Red, or FLIR, and LIght Detection and Ranging, or LIDAR, sensors), SV only (terrain imagery from onboard database), and Fused EV and SV concepts. As expected, manual landing performance was excellent (within a meter of landing site center) and not affected by the inclusion of EV or SV terrain imagery on the Lunar Lander flight displays. Subjective ratings revealed significant situation awareness improvements with the concepts employing EV and/or SV terrain imagery compared to the Baseline condition that had no terrain imagery. In addition, display concepts employing EV imagery (compared to the SV and Baseline concepts which had none) were significantly better for pilot detection of intentional but unannounced navigation failures since this imagery provided an intuitive and obvious visual methodology to monitor the validity of the navigation solution.

  20. Design Considerations for Attitude State Awareness and Prevention of Entry into Unusual Attitudes

    NASA Technical Reports Server (NTRS)

    Ellis, Kyle K. E.; Prinzel, Lawrence J., III; Arthur, Jarvis J.; Nicholas, Stephanie N.; Kiggins, Daniel; Verstynen, Harry; Hubbs, Clay; Wilkerson, James

    2017-01-01

    Loss of control in flight (LOC-I) has historically represented the largest category of commercial aviation fatal accidents. A review of worldwide transport airplane accidents (2001-2010) revealed that loss of attitude or energy state awareness was responsible for a large majority of the LOC-I events. A Commercial Aviation Safety Team (CAST) study of 18 worldwide loss-of-control accidents and incidents determined that flight crew loss of attitude awareness or energy state awareness due to lack of external visual reference cues was a significant causal factor in 17 of the 18 reviewed flights. CAST recommended that "Virtual Day-Visual Meteorological Condition" (Virtual Day-VMC) displays be developed to provide the visual cues necessary to prevent loss of control resulting from flight crew spatial disorientation and loss of energy state awareness. Synthetic vision or equivalent systems (SVS) were identified as a design "safety enhancement" (SE-200). Part of this SE involves the conduct of research for developing minimum aviation system performance standards (MASPS) for these flight deck display technologies to aid flight crew attitude and energy state awareness, similar to that of a virtual day-VMC-like environment. This paper describes a novel experimental approach to evaluating a flight crew's ability to maintain attitude awareness and to prevent entry into unusual attitudes across several SVS optical flow design considerations. Flight crews were subjected to compound-event scenarios designed to elicit channelized attention and startle/surprise within the crew. These high-fidelity scenarios, designed from real-world events, enable evaluation of the efficacy of SVS at improving flight crew attitude awareness to reduce the occurrence of LOC-I incidents in commercial flight operations.

  1. Open-Loop Flight Testing of COBALT Navigation and Sensor Technologies for Precise Soft Landing

    NASA Technical Reports Server (NTRS)

    Carson, John M., III; Restrepo, Caroline I.; Seubert, Carl R.; Amzajerdian, Farzin; Pierrottet, Diego F.; Collins, Steven M.; O'Neal, Travis V.; Stelling, Richard

    2017-01-01

    An open-loop flight test campaign of the NASA COBALT (CoOperative Blending of Autonomous Landing Technologies) payload was conducted onboard the Masten Xodiac suborbital rocket testbed. The payload integrates two complementary sensor technologies that together provide a spacecraft with knowledge during planetary descent and landing to precisely navigate and softly touch down in close proximity to targeted surface locations. The two technologies are the Navigation Doppler Lidar (NDL), for high-precision velocity and range measurements, and the Lander Vision System (LVS), for map-relative state estimates. A specialized navigation filter running onboard COBALT fuses the NDL and LVS data in real time to produce a very precise Terrain Relative Navigation (TRN) solution that is suitable for future autonomous planetary landing systems that require precise and soft landing capabilities. During the open-loop flight campaign, the COBALT payload acquired measurements and generated a precise navigation solution, but the Xodiac vehicle planned and executed its maneuvers based on an independent, GPS-based navigation solution. This minimized the risk to the vehicle during the integration and testing of the new navigation sensing technologies within the COBALT payload.
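
    As a hedged, much-simplified sketch of the blending idea (high-rate NDL velocity dead-reckoning nudged toward lower-rate LVS map-relative fixes), consider the 1-D altitude example below. COBALT's actual navigation filter is more sophisticated; the rates, gain, and simulated truth are illustrative assumptions only.

    ```python
    # Much-simplified 1-D illustration: integrate high-rate NDL velocity and
    # nudge the estimate toward lower-rate LVS map-relative fixes. Rates, gain,
    # and the simulated descent are illustrative assumptions.

    dt = 0.05                  # NDL velocity updates at 20 Hz (assumed)
    lvs_every = 20             # one LVS position fix per second (assumed)
    K = 0.2                    # blending gain for LVS corrections (assumed)

    altitude_est = 500.0       # initial altitude estimate above the site, m
    for k in range(40):        # two seconds of descent at 10 m/s
        true_altitude = 500.0 - 10.0 * k * dt
        altitude_est += -10.0 * dt                          # dead-reckon with NDL velocity
        if k % lvs_every == 0:                              # LVS fix available this cycle
            altitude_est += K * (true_altitude - altitude_est)
    print(round(altitude_est, 1))                           # ~480 m, tracking the descent
    ```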

  2. Physics-based simulations of aerial attacks by peregrine falcons reveal that stooping at high speed maximizes catch success against agile prey

    PubMed Central

    Hildenbrandt, Hanno

    2018-01-01

    The peregrine falcon Falco peregrinus is renowned for attacking its prey from high altitude in a fast controlled dive called a stoop. Many other raptors employ a similar mode of attack, but the functional benefits of stooping remain obscure. Here we investigate whether, when, and why stooping promotes catch success, using a three-dimensional, agent-based modeling approach to simulate attacks of falcons on aerial prey. We simulate avian flapping and gliding flight using an analytical quasi-steady model of the aerodynamic forces and moments, parametrized by empirical measurements of flight morphology. The model-birds’ flight control inputs are commanded by their guidance system, comprising a phenomenological model of vision, guidance, and control. To intercept its prey, model-falcons use the same guidance law as missiles (pure proportional navigation); this assumption is corroborated by empirical data on peregrine falcons hunting lures. We parametrically vary the falcon’s starting position relative to its prey, together with the feedback gain of its guidance loop, under differing assumptions regarding its errors and delay in vision and control, and for three different patterns of prey motion. We find that, when the prey maneuvers erratically, high-altitude stoops increase catch success compared to low-altitude attacks, but only if the falcon’s guidance law is appropriately tuned, and only given a high degree of precision in vision and control. Remarkably, the optimal tuning of the guidance law in our simulations coincides closely with what has been observed empirically in peregrines. High-altitude stoops are shown to be beneficial because their high airspeed enables production of higher aerodynamic forces for maneuvering, and facilitates higher roll agility as the wings are tucked, each of which is essential to catching maneuvering prey at realistic response delays. PMID:29649207
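
    For reference, pure proportional navigation commands an acceleration proportional to the rotation rate of the line of sight. In one common vector form (notation mine, not necessarily the paper's exact parameterization),

    \[
    \mathbf{a}_{\mathrm{cmd}} = N\,\boldsymbol{\Omega} \times \mathbf{v}_{F},
    \qquad
    \boldsymbol{\Omega} = \frac{\mathbf{r} \times \dot{\mathbf{r}}}{\mathbf{r}\cdot\mathbf{r}},
    \]

    where r is the line-of-sight vector from falcon to prey, ṙ the relative velocity, v_F the falcon's own velocity, Ω the line-of-sight rotation rate, and N the navigation gain (the guidance-loop feedback gain varied in the simulations).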

  3. Development of the Space Operations Incident Reporting Tool (SOIRT)

    NASA Technical Reports Server (NTRS)

    Minton, Jacquie

    1997-01-01

    The space operations incident reporting tool (SOIRT) is an instrument used to record information about an anomaly occurring during flight which may have been due to insufficient and/or inappropriate application of human factors knowledge. We originally developed the SOIRT form after researching other incident reporting systems of this type. We modified the form after performing several in-house reviews and a pilot test to assess usability. Finally, crew members from Space Shuttle flights participated in a usability test of the tool after their missions. Since the National Aeronautics and Space Administration (NASA) currently has no system for continuous collection of this type of information, the SOIRT was developed to report issues such as reach envelope constraints, control operation difficulties, and vision impairments. However, if the SOIRT were to become a formal NASA process, information from crew members could be collected in a database and made available to individuals responsible for improving in-flight safety and productivity. Potential benefits include documentation to justify the redesign or development of new equipment/systems, a method for mission planners to identify past incidents, justification for the development of timelines and mission scenarios, and a basis for requiring more appropriate work/rest cycles.

  4. How Lovebirds Maneuver Rapidly Using Super-Fast Head Saccades and Image Feature Stabilization

    PubMed Central

    Kress, Daniel; van Bokhorst, Evelien; Lentink, David

    2015-01-01

    Diurnal flying animals such as birds depend primarily on vision to coordinate their flight path during goal-directed flight tasks. To extract the spatial structure of the surrounding environment, birds are thought to use retinal image motion (optical flow) that is primarily induced by motion of their head. It is unclear what gaze behaviors birds perform to support visuomotor control during rapid maneuvering flight in which they continuously switch between flight modes. To analyze this, we measured the gaze behavior of rapidly turning lovebirds in a goal-directed task: take-off and fly away from a perch, turn on a dime, and fly back and land on the same perch. High-speed flight recordings revealed that rapidly turning lovebirds perform a remarkable stereotypical gaze behavior with peak saccadic head turns up to 2700 degrees per second, as fast as insects, enabled by fast neck muscles. In between saccades, gaze orientation is held constant. By comparing saccade and wingbeat phase, we find that these super-fast saccades are coordinated with the downstroke when the lateral visual field is occluded by the wings. Lovebirds thus maximize visual perception by overlying behaviors that impair vision, which helps coordinate maneuvers. Before the turn, lovebirds keep a high contrast edge in their visual midline. Similarly, before landing, the lovebirds stabilize the center of the perch in their visual midline. The perch on which the birds land swings, like a branch in the wind, and we find that retinal size of the perch is the most parsimonious visual cue to initiate landing. Our observations show that rapidly maneuvering birds use precisely timed stereotypic gaze behaviors consisting of rapid head turns and frontal feature stabilization, which facilitates optical flow based flight control. Similar gaze behaviors have been reported for visually navigating humans. This finding can inspire more effective vision-based autopilots for drones. PMID:26107413

  5. Flying Drosophila stabilize their vision-based velocity controller by sensing wind with their antennae

    PubMed Central

    Fuller, Sawyer Buckminster; Straw, Andrew D.; Peek, Martin Y.; Murray, Richard M.; Dickinson, Michael H.

    2014-01-01

    Flies and other insects use vision to regulate their groundspeed in flight, enabling them to fly in varying wind conditions. Compared with mechanosensory modalities, however, vision requires a long processing delay (~100 ms) that might introduce instability if operated at high gain. Flies also sense air motion with their antennae, but how this is used in flight control is unknown. We manipulated the antennal function of fruit flies by ablating their aristae, forcing them to rely on vision alone to regulate groundspeed. Arista-ablated flies in flight exhibited significantly greater groundspeed variability than intact flies. We then subjected them to a series of controlled impulsive wind gusts delivered by an air piston and experimentally manipulated antennae and visual feedback. The results show that an antenna-mediated response alters wing motion to cause flies to accelerate in the same direction as the gust. This response opposes flying into a headwind, but flies regularly fly upwind. To resolve this discrepancy, we obtained a dynamic model of the fly’s velocity regulator by fitting parameters of candidate models to our experimental data. The model suggests that the groundspeed variability of arista-ablated flies is the result of unstable feedback oscillations caused by the delay and high gain of visual feedback. The antenna response drives active damping with a shorter delay (~20 ms) to stabilize this regulator, in exchange for increasing the effect of rapid wind disturbances. This provides insight into flies’ multimodal sensory feedback architecture and constitutes a previously unknown role for the antennae. PMID:24639532
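
    The role of the two delays can be seen with simple arithmetic: a pure time delay T contributes a phase lag of 360° · f · T at an oscillation frequency f. At an illustrative f = 3 Hz (not a value taken from the paper),

    \[
    \varphi_{\mathrm{vision}} \approx 360^{\circ} \times 3 \times 0.10 = 108^{\circ},
    \qquad
    \varphi_{\mathrm{antenna}} \approx 360^{\circ} \times 3 \times 0.02 \approx 22^{\circ},
    \]

    so a high-gain loop closed through the ~100 ms visual pathway accumulates far more destabilizing lag than the ~20 ms antennal pathway, which is why the faster channel can supply damping.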

  6. System identification and sensorimotor determinants of flight maneuvers in an insect

    NASA Astrophysics Data System (ADS)

    Sponberg, Simon; Hall, Robert; Roth, Eatai

    Locomotor maneuvers are inherently closed-loop processes. They are generally characterized by the integration of multiple sensory inputs and adaptation or learning over time. To probe sensorimotor processing we take a system identification approach, treating the underlying physiological systems as dynamic processes and altering the feedback topology in experiment and analysis. As a model system, we use agile hawk moths (Manduca sexta), which feed from real and robotic flowers while hovering in midair. Moths rely on vision and mechanosensation to track floral targets and can do so at exceptionally low luminance levels despite hovering being a mechanically unstable behavior that requires neural feedback to stabilize. By altering the sensory environment and placing mechanical and visual signals in conflict, we show that a surprisingly simple linear summation of visual and mechanosensory responses produces a generative prediction of behavior to novel stimuli. Tracking performance is also limited more by the mechanics of flight than by the magnitude of the sensory cue. A feedback systems approach to locomotor control results in new insights into how behavior emerges from the interaction of nonlinear physiological systems.
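
    The "linear summation" finding can be written compactly. In frequency-domain form (notation mine, not the authors'), the predicted multisensory tracking response is

    \[
    \hat{Y}(\omega) = H_{v}(\omega)\, V(\omega) + H_{m}(\omega)\, M(\omega),
    \]

    where H_v and H_m are transfer functions identified from visual-only and mechanosensory-only trials, and V and M are the corresponding flower-motion stimuli.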

  7. Measures for simulator evaluation of a helicopter obstacle avoidance system

    NASA Technical Reports Server (NTRS)

    Demaio, Joe; Sharkey, Thomas J.; Kennedy, David; Hughes, Micheal; Meade, Perry

    1993-01-01

    The U.S. Army Aeroflightdynamics Directorate (AFDD) has developed a high-fidelity, full-mission simulation facility for the demonstration and evaluation of advanced helicopter mission equipment. The Crew Station Research and Development Facility (CSRDF) provides the capability to conduct one- or two-crew full-mission simulations in a state-of-the-art helicopter simulator. The CSRDF provides a realistic, full field-of-regard visual environment with simulation of state-of-the-art weapons, sensors, and flight control systems. We are using the CSRDF to evaluate the ability of an obstacle avoidance system (OASYS) to support low altitude flight in cluttered terrain using night vision goggles (NVG). The OASYS uses a laser radar to locate obstacles to safe flight in the aircraft's flight path. A major concern is the detection of wires, which can be difficult to see with NVG, but other obstacles, such as trees, poles, or the ground, are also a concern. The OASYS symbology is presented to the pilot on a head-up display mounted on the NVG (NVG-HUD). The NVG-HUD presents head-stabilized symbology to the pilot while allowing him to view the image intensified, out-the-window scene through the HUD. Since interference with viewing through the display is a major concern, OASYS symbology must be designed to present usable obstacle clearance information with a minimum of clutter.

  8. The MSFC Systems Engineering Guide: An Overview and Plan

    NASA Technical Reports Server (NTRS)

    Shelby, Jerry; Thomas, L. Dale

    2007-01-01

    This paper describes the guiding vision, progress to date, and the plan forward for development of the Marshall Space Flight Center (MSFC) Systems Engineering Guide (SEG), a virtual systems engineering handbook and archive that describes the systems engineering processes used by MSFC in the development of ongoing complex space systems, such as the Ares launch vehicle, as well as forthcoming ones. The website is intended to be a "One Stop Shop" for MSFC systems engineers, providing tutorial information, an overview of processes and procedures, links to guidance and references, and an archive of relevant systems engineering artifacts produced by the many NASA projects developed and managed by MSFC over the years.

  9. TODD MAY ADDRESSES ALL HANDS

    NASA Image and Video Library

    2016-06-22

    NASA MARSHALL SPACE FLIGHT CENTER DIRECTOR TODD MAY TALKS ABOUT HIS VISION FOR THE CENTER DURING AN ALL-HANDS MEETING JUNE 22 IN MORRIS AUDITORIUM, AND BROADCAST CENTERWIDE. ALSO SPEAKING TO THE MARSHALL TEAM AND TAKING QUESTIONS DURING THE EVENT ARE, FROM LEFT, MARSHALL DEPUTY DIRECTOR JODY SINGER, ASSOCIATE DIRECTOR ROBIN HENDERSON AND ASSOCIATE DIRECTOR, TECHNICAL, PAUL MCCONNAUGHEY. "WE'RE IN THE BUSINESS OF MAKING THE IMPOSSIBLE POSSIBLE," SAID MAY, CITING PROGRESS ON THE SPACE LAUNCH SYSTEM AND THE JOURNEY TO MARS AND RECOUNTING HIGHLIGHTS OF MARSHALL'S 56-YEAR HISTORY.

  10. Proceedings of the Second Joint Technology Workshop on Neural Networks and Fuzzy Logic, volume 1

    NASA Technical Reports Server (NTRS)

    Lea, Robert N. (Editor); Villarreal, James (Editor)

    1991-01-01

    Documented here are papers presented at the Neural Networks and Fuzzy Logic Workshop sponsored by NASA and the University of Houston, Clear Lake. The workshop was held April 11 to 13 at the Johnson Space Center. Technical topics addressed included adaptive systems, learning algorithms, network architectures, vision, robotics, neurobiological connections, speech recognition and synthesis, fuzzy set theory and application, control and dynamics processing, space applications, fuzzy logic and neural network computers, approximate reasoning, and multiobject decision making.

  11. Airbreathing Hypersonic Technology Vision Vehicles and Development Dreams

    NASA Technical Reports Server (NTRS)

    McClinton, C. R.; Hunt, J. L.; Ricketts, R. H.; Reukauf, P.; Peddie, C. L.

    1999-01-01

    Significant advancements in hypersonic airbreathing vehicle technology have been made in the country's research centers and industry over the past 40 years. Some of that technology is being validated with the X-43 flight tests. This paper presents an overview of hypersonic airbreathing technology status within the US, and a hypersonic technology development plan. This plan builds on the nation's large investment in hypersonics. This affordable, incremental plan focuses technology development on hypersonic systems, which could be operating by the 2020's.

  12. Propulsion and Power Technologies for the NASA Exploration Vision: A Research Perspective

    NASA Technical Reports Server (NTRS)

    Litchford, Ron J.

    2004-01-01

    Future propulsion and power technologies for deep space missions are profiled in this viewgraph presentation. The presentation includes diagrams illustrating possible future travel times to other planets in the solar system. The propulsion technologies researched at Marshall Space Flight Center (MSFC) include: 1) Chemical Propulsion; 2) Nuclear Propulsion; 3) Electric and Plasma Propulsion; 4) Energetics. The presentation contains additional information about these technologies, as well as space reactors, reactor simulation, and the Propulsion Research Laboratory (PRL) at MSFC.

  13. The Role of X-Rays in Future Space Navigation and Communication

    NASA Technical Reports Server (NTRS)

    Winternitz, Luke M. B.; Gendreau, Keith C.; Hasouneh, Monther A.; Mitchell, Jason W.; Fong, Wai H.; Lee, Wing-Tsz; Gavriil, Fotis; Arzoumanian, Zaven

    2013-01-01

    In the near future, applications using X-rays will enable autonomous navigation and time distribution throughout the solar system, high capacity and low-power space data links, highly accurate attitude sensing, and extremely high-precision formation flying capabilities. Each of these applications alone has the potential to revolutionize mission capabilities, particularly beyond Earth orbit. This paper will outline the NASA Goddard Space Flight Center vision and efforts toward realizing the full potential of X-ray navigation and communications.

  14. Biomorphic Explorers Leading Towards a Robotic Ecology

    NASA Technical Reports Server (NTRS)

    Thakoor, Sarita; Miralles, Carlos; Chao, Tien-Hsin

    1999-01-01

    This paper presents viewgraphs of biomorphic explorers as they provide extended survival and useful life of robots in ecology. The topics include: 1) Biomorphic Explorers; 2) Advanced Mobility for Biomorphic Explorers; 3) Biomorphic Explorers: Size Based Classification; 4) Biomorphic Explorers: Classification (Based on Mobility and Ambient Environment); 5) Biomorphic Flight Systems: Vision; 6) Biomorphic Glider Deployment Concept: Larger Glider Deploy/Local Relay; 7) Biomorphic Glider Deployment Concept: Balloon Deploy/Dual Relay; 8) Biomorphic Explorer: Conceptual Design; 9) Biomorphic Gliders; and 10) Applications.

  15. Software Interface Assessment of the Centralized Aviation Flight Records System (CAFRS) 4.0

    DTIC Science & Technology

    2015-05-01

    administrator based on the role of the user needing the permission. Within CAFRS, some permissions are grouped into common roles based on job...simplify. Ex: “Compl RL3 ref tng – Designated RL2 D/N, RL3 Night Vision Goggles (NVG)”. • Once entry is made on 7122 initial and signed and remark... designated by other authorized documents. Citation of manufacturer’s or trade names does not constitute an official endorsement or approval of the use

  16. Night Vision Manual for the Flight Surgeon

    DTIC Science & Technology

    1992-08-01

    may cause night blindness are glaucoma, progressive cone/rod dystrophies (e.g., retinitis pigmentosa, Stargardt's disease), drug toxicity (e.g...Alabama, July 1989. 38. Berson EL, Rabin AR, Mehaffey L. Advances in night vision technology: A pocketscope for patients with retinitis pigmentosa ... retinal sensitivity to dim light. Regeneration of the photopigments occurs during dark adaptation. The fully dark-adapted eye, in which photopigment

  17. Effects of the Abnormal Acceleratory Environment of Flight

    DTIC Science & Technology

    1974-12-01

    vision Return of arteriolar pulsation and temporary venous distension Visual failure is a continuum from loss of peripheral vision (grey-out) to...distance); intrathoracic pressure is increased by strong muscular expiratory efforts against a partially closed glottis; and the contraction of...vigorous skeletal muscular tensing (Valsalva maneuver) can reduce +GZ tolerance and lead to an episode of unconsciousness at extremely low G levels

  18. Integrated Flight and Propulsion Controls for Advanced Aircraft Configurations

    NASA Technical Reports Server (NTRS)

    Merrill, Walter; Garg, Sanjay

    1995-01-01

    The research vision of the NASA Lewis Research Center in the area of integrated flight and propulsion controls technologies is described. In particular the Integrated Method for Propulsion and Airframe Controls developed at the Lewis Research Center is described including its application to an advanced aircraft configuration. Additionally, future research directions in integrated controls are described.

  19. Integrated Flight and Propulsion Controls for Advanced Aircraft Configurations

    NASA Technical Reports Server (NTRS)

    Merrill, Walter; Garg, Sanjay

    1996-01-01

    The research vision of the NASA Lewis Research Center in the area of integrated flight and propulsion controls technologies is described. In particular, the integrated method for propulsion and airframe controls developed at the Lewis Research Center is described including its application to an advanced aircraft configuration. Additionally, future research directions in integrated controls are described.

  20. Improving Real World Performance of Vision Aided Navigation in a Flight Environment

    DTIC Science & Technology

    2016-09-15

    [Table-of-contents excerpt] Introduction; 4.2 Wide Area Search Extent; 4.3 Large-Scale Image Navigation Histogram Filter; 4.3.1 Location Model; 4.3.2 Measurement Model; 4.3.3 Histogram Filter; Iteration of Histogram Filter; 4.4 Implementation and Flight Test Campaign; 4.4.1 Software Implementation
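
    The lines above are only a table-of-contents excerpt; as a purely hypothetical sketch of what a histogram (discrete Bayes) filter of the kind it names does, the loop below alternates a location-model prediction with a measurement-model update over a discretized position grid. The grid, motion kernel, and likelihood values are invented for illustration and are not taken from the thesis.

    ```python
    import numpy as np

    # Hypothetical 1-D histogram (discrete Bayes) filter: location-model
    # prediction followed by a measurement-model update over position cells.
    # All values are illustrative assumptions.

    cells = 100
    belief = np.full(cells, 1.0 / cells)          # uniform prior over position cells

    motion_kernel = np.array([0.1, 0.8, 0.1])     # location model: ~1 cell/step + noise

    def predict(belief):
        shifted = np.roll(belief, 1)              # nominal motion of one cell
        spread = np.convolve(shifted, motion_kernel, mode="same")
        return spread / spread.sum()              # renormalize after edge effects

    def update(belief, likelihood):
        posterior = belief * likelihood           # measurement model p(z | cell)
        return posterior / posterior.sum()

    # One filter iteration with a likelihood peaked where the aerial image best
    # matches the reference imagery (around cell 42, assumed).
    likelihood = np.full(cells, 0.01)
    likelihood[40:45] = [0.2, 0.5, 1.0, 0.5, 0.2]
    belief = update(predict(belief), likelihood)
    print(int(np.argmax(belief)))                 # most likely cell after one cycle
    ```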

  1. Mission Assurance and Flight Safety of Manned Space Flight: Implications for Future Exploration of the Moon and Mars

    NASA Technical Reports Server (NTRS)

    Kezirian, M. T.

    2007-01-01

    As NASA implements the nation's Vision for Space Exploration to return to the moon and travel to Mars, new considerations will be given to the processes governing design and operations of manned spaceflight. New objectives bring new technical challenges; safety will drive many of these decisions.

  2. The role of nutritional research in the success of human space flight.

    PubMed

    Lane, Helen W; Bourland, Charles; Barrett, Ann; Heer, Martina; Smith, Scott M

    2013-09-01

    The United States has had human space flight programs for >50 y and has had a continued presence in space since 2000. Providing nutritious and safe food is imperative for astronauts because space travelers are totally dependent on launched food. Space flight research topics have included energy, protein, nutritional aspects of bone and muscle health, and vision issues related to 1-carbon metabolism. Research has shown that energy needs during flight are similar to energy needs on Earth. Low energy intakes affect protein turnover. The type of dietary protein is also important for bone health, plant-based protein being more efficacious than animal protein. Bone loss is greatly ameliorated with adequate intakes of energy and vitamin D, along with routine resistive exercise. Astronauts with lower plasma folate concentrations may be more susceptible to vision changes. Foods for space flight were developed initially by the U.S. Air Force School of Aerospace Medicine in conjunction with the U.S. Army Natick Laboratories and NASA. Hazard Analysis Critical Control Point safety standards were specifically developed for space feeding. Prepackaged foods for the International Space Station were originally high in sodium (5300 mg/d), but NASA has recently reformulated >90 foods to reduce sodium intake to 3000 mg/d. Food development has improved nutritional quality as well as safety and acceptability.

  3. Understanding Crew Decision-Making in the Presence of Complexity: A Flight Simulation Experiment

    NASA Technical Reports Server (NTRS)

    Young, Steven D.; Daniels, Taumi S.; Evans, Emory; deHaag, Maarten Uijt; Duan, Pengfei

    2013-01-01

    Crew decision making and response have long been leading causal and contributing factors associated with aircraft accidents. Further, it is anticipated that future aircraft and operational environments will increase exposure to risks related to these factors if proactive steps are not taken to account for ever-increasing complexity. A flight simulation study was designed to collect data to help in understanding how complexity can, or may, be manifest. More specifically, an experimental apparatus was constructed that allowed for manipulation of information complexity and uncertainty, while also manipulating operational complexity and uncertainty. Through these manipulations, and the aid of experienced airline pilots, several issues have been discovered, related most prominently to the influence of information content, quality, and management. Flight crews were immersed in an environment that included new operational complexities suggested for the future air transportation system as well as new technological complexities (e.g. electronic flight bags, expanded data link services, synthetic and enhanced vision systems, and interval management automation). In addition, a set of off-nominal situations was emulated. These included, for example, adverse weather conditions, traffic deviations, equipment failures, poor data quality, communication errors, and unexpected clearances or changes to flight plans. Each situation was based on one or more reference events from past accidents or incidents, or on a similar case that had been used in previous developmental tests or studies. Over the course of the study, 10 two-pilot airline crews participated, completing over 230 flights. Each flight consisted of an approach beginning at 10,000 ft. Based on the recorded data and pilot and research observations, preliminary results are presented regarding decision-making issues in the presence of the operational and technological complexities encountered during the flights.

  4. Status of the Space-Rated Lithium-Ion Battery Advanced Development Project in Support of the Exploration Vision

    NASA Technical Reports Server (NTRS)

    Miller, Thomas

    2007-01-01

    The NASA Glenn Research Center (GRC), along with the Goddard Space Flight Center (GSFC), Jet Propulsion Laboratory (JPL), Johnson Space Center (JSC), Marshall Space Flight Center (MSFC), and industry partners, is leading a space-rated lithium-ion advanced development battery effort to support the vision for Exploration. This effort addresses the lithium-ion battery portion of the Energy Storage Project under the Exploration Technology Development Program. Key discussions focus on the lithium-ion cell component development activities, a common lithium-ion battery module, test and demonstration of charge/discharge cycle life performance and safety characterization. A review of the space-rated lithium-ion battery project will be presented highlighting the technical accomplishments during the past year.

  5. Cockpit weather graphics using mobile satellite communications

    NASA Astrophysics Data System (ADS)

    Seth, Shashi

    Many new companies are pushing state-of-the-art technology to bring a revolution in the cockpits of General Aviation (GA) aircraft. The vision, according to Dr. Bruce Holmes - the Assistant Director for Aeronautics at National Aeronautics and Space Administration's (NASA) Langley Research Center, is to provide such an advanced flight control system that the motor and cognitive skills you use to drive a car would be very similar to the ones you would use to fly an airplane. We at ViGYAN, Inc., are currently developing a system called the Pilot Weather Advisor (PWxA), which would be a part of such an advanced technology flight management system. The PWxA provides graphical depictions of weather information in the cockpit of aircraft in near real-time, through the use of broadcast satellite communications. The purpose of this system is to improve the safety and utility of GA aircraft operations. Considerable effort is being extended for research in the design of graphical weather systems, notably the works of Scanlon and Dash. The concept of providing pilots with graphical depictions of weather conditions, overlaid on geographical and navigational maps, is extremely powerful.

  6. Cockpit weather graphics using mobile satellite communications

    NASA Technical Reports Server (NTRS)

    Seth, Shashi

    1993-01-01

    Many new companies are pushing state-of-the-art technology to bring a revolution in the cockpits of General Aviation (GA) aircraft. The vision, according to Dr. Bruce Holmes - the Assistant Director for Aeronautics at National Aeronautics and Space Administration's (NASA) Langley Research Center, is to provide such an advanced flight control system that the motor and cognitive skills you use to drive a car would be very similar to the ones you would use to fly an airplane. We at ViGYAN, Inc., are currently developing a system called the Pilot Weather Advisor (PWxA), which would be a part of such an advanced technology flight management system. The PWxA provides graphical depictions of weather information in the cockpit of aircraft in near real-time, through the use of broadcast satellite communications. The purpose of this system is to improve the safety and utility of GA aircraft operations. Considerable effort is being extended for research in the design of graphical weather systems, notably the works of Scanlon and Dash. The concept of providing pilots with graphical depictions of weather conditions, overlaid on geographical and navigational maps, is extremely powerful.

  7. Evaluation of Equivalent Vision Technologies for Supersonic Aircraft Operations

    NASA Technical Reports Server (NTRS)

    Kramer, Lynda J.; Williams, Steven P.; Wilz, Susan P.; Arthur, Jarvis J., III; Bailey, Randall E.

    2009-01-01

    Twenty-four air transport-rated pilots participated as subjects in a fixed-based simulation experiment to evaluate the use of Synthetic/Enhanced Vision (S/EV) and eXternal Vision System (XVS) technologies as enabling technologies for future all-weather operations. Three head-up flight display concepts were evaluated: a monochromatic, collimated Head-up Display (HUD) and a color, non-collimated XVS display with a field-of-view (FOV) equal to, and also one significantly larger than, that of the collimated HUD. Approach, landing, departure, and surface operations were conducted. Additionally, the apparent angle-of-attack (AOA) was varied (high/low) to investigate the vertical field-of-view display requirements, and peripheral side-window visibility was experimentally varied. The data showed that lateral approach tracking performance and lateral landing position were excellent regardless of the display type and AOA condition being evaluated or whether or not there were peripheral cues in the side windows. Longitudinal touchdown and glideslope tracking were affected by the display concepts. Larger FOV display concepts showed improved longitudinal touchdown control, superior glideslope tracking, significant situation awareness improvements, and workload reductions compared to smaller FOV display concepts.

  8. Development of collision avoidance system for useful UAV applications using image sensors with laser transmitter

    NASA Astrophysics Data System (ADS)

    Cheong, M. K.; Bahiki, M. R.; Azrad, S.

    2016-10-01

    The main goal of this study is to demonstrate an approach to achieving collision avoidance on a Quadrotor Unmanned Aerial Vehicle (QUAV) using image sensors with a colour-based tracking method. A pair of high definition (HD) stereo cameras were chosen as the stereo vision sensor to obtain depth data from flat object surfaces. A laser transmitter was utilized to project a high-contrast tracking spot for depth calculation using common triangulation. A stereo vision algorithm was developed to acquire the distance from the tracked point to the QUAV, and the control algorithm was designed to manipulate the QUAV's response based on the calculated depth. Attitude and position controllers were designed using the non-linear model with the help of an Optitrack motion tracking system. A number of collision avoidance flight tests were carried out to validate the performance of the stereo vision and control algorithm based on image sensors. In the results, the UAV was able to hover with fairly good accuracy in both static and dynamic short-range collision avoidance. Collision avoidance performance of the UAV was better with obstacles having dull surfaces than with shiny surfaces. The minimum collision avoidance distance achievable was 0.4 m. The approach is suitable to be applied in short-range collision avoidance.
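
    The "common triangulation" step reduces to the standard stereo relation (the camera parameters here are generic, not the study's):

    \[
    Z = \frac{f\, B}{d}, \qquad d = x_{L} - x_{R},
    \]

    where Z is the range to the tracked laser spot, f the focal length in pixels, B the stereo baseline, and d the disparity between the spot's horizontal image coordinates in the left and right views; a larger disparity indicates a closer surface.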

  9. RTO Technical Publications: A Quarterly Listing

    NASA Technical Reports Server (NTRS)

    2005-01-01

    This is a listing of recent unclassified RTO technical publications processed by the NASA Center for AeroSpace Information covering the period from July 1, 2005 to September 30, 2005; and available in the NASA Aeronautics and Space Database. Contents include: Aeroelastic Deformation: Adaptation of Wind Tunnel Measurement Concepts to Full-Scale Vehicle Flight Testing; Actively Controlling Buffet-Induced Excitations; Modelling and Simulation to Address NATO's New and Existing Military Requirements; Latency in Visionic Systems: Test Methods and Requirements; Personal Hearing Protection including Active Noise Reduction; Virtual Laboratory Enabling Collaborative Research in Applied Vehicle Technologies; A Method to Analyze Tail Buffet Loads of Aircraft; Particle Image Velocimetry Measurements to Evaluate the Effectiveness of Deck-Edge Columnar Vortex Generators on Aircraft Carriers; Introduction to Flight Test Engineering, Volume 14; Pathological Aspects and Associated Biodynamics in Aircraft Accident Investigation;

  10. STS-95 Day 07 Highlights

    NASA Technical Reports Server (NTRS)

    1998-01-01

    On this seventh day of the STS-95 mission, the flight crew, Cmdr. Curtis L. Brown, Pilot Steven W. Lindsey, Mission Specialists Scott E. Parazynski, Stephen K. Robinson, and Pedro Duque, and Payload Specialists Chiaki Mukai and John H. Glenn, again test the Orbiter Space Vision System. OSVS uses special markings on Spartan and the shuttle cargo bay to provide an alignment aid for the arm's operator using shuttle television images. It will be used extensively on the next Space Shuttle flight in December as an aid in using the arm to join together the first two modules of the International Space Station. Specialist John Glenn will complete a daily back-pain questionnaire as part of a study of how the muscles, intervertebral discs, and bone marrow change after exposure to microgravity.

  11. STS-114: Discovery Post Landing Press Briefing from JSC

    NASA Technical Reports Server (NTRS)

    2005-01-01

    LeRoy Cain, STS-114 Ascent/Entry Flight Director, takes a solo stand with the press in this briefing. He noted that the successful flight and return of Discovery is another important milestone, a fresh start, and a new beginning as part of NASA's commitment to the President's vision of man's return to the Moon, Mars, and beyond. Following this successful test flight, NASA will have a lot of learning and hard work to do in preparation for the next flight. Weather factors, safe landing and touchdown, communications, re-entry, and the Columbia were some of the topics covered with the news media.

  12. Small Aircraft Transportation System Concept and Technologies

    NASA Technical Reports Server (NTRS)

    Holmes, Bruce J.; Durham, Michael H.; Tarry, Scott E.

    2005-01-01

    This paper summarizes both the vision and the early public-private collaborative research for the Small Aircraft Transportation System (SATS). The paper outlines an operational definition of SATS, describes how SATS conceptually differs from current air transportation capabilities, introduces four SATS operating capabilities, and explains the relation between the SATS operating capabilities and the potential for expanded air mobility. The SATS technology roadmap encompasses on-demand, widely distributed, point-to-point air mobility, through hired-pilot modes in the nearer-term, and through self-operated user modes in the farther-term. The nearer-term concept is based on aircraft and airspace technologies being developed to make the use of smaller, more widely distributed community reliever and general aviation airports and their runways more useful in more weather conditions, in commercial hired-pilot service modes. The farther-term vision is based on technical concepts that could be developed to simplify or automate many of the operational functions in the aircraft and the airspace for meeting future public transportation needs, in personally operated modes. NASA technology strategies form a roadmap between the nearer-term concept and the farther-term vision. This paper outlines a roadmap for scalable, on-demand, distributed air mobility technologies for vehicle and airspace systems. The audiences for the paper include General Aviation manufacturers, small aircraft transportation service providers, the flight training industry, airport and transportation authorities at the Federal, state and local levels, and organizations involved in planning for future National Airspace System advancements.

  13. A Preliminary Evaluation of Supersonic Transport Category Vehicle Operations in the National Airspace System

    NASA Technical Reports Server (NTRS)

    Underwood, Matthew C.; Guminsky, Michael D.

    2015-01-01

    Several public sector businesses and government agencies, including the National Aeronautics and Space Administration, are currently working on solving key technological barriers that must be overcome in order to realize the vision of low-boom supersonic flights conducted over land. However, once these challenges are met, the manner in which this class of aircraft is integrated into the National Airspace System may become a potential constraint due to the significant environmental, efficiency, and economic repercussions that their integration may cause. Background research was performed on historic supersonic operations in the National Airspace System, including both flight deck procedures and air traffic controller procedures. Using this information, an experiment was created to test some of these historic procedures in a current-day, emerging Next Generation Air Transportation System (NextGen) environment and observe the interactions between commercial supersonic transport aircraft and modern-day air traffic. Data were gathered through batch simulations of supersonic commercial transport category aircraft operating in present-day traffic scenarios as a baselining study to identify the magnitude of the integration problems and begin the exploration of new air traffic management technologies and architectures which will be needed to seamlessly integrate subsonic and supersonic transport aircraft operations. The data gathered include information about encounters between subsonic and supersonic aircraft that may occur when supersonic commercial transport aircraft are integrated into the National Airspace System, as well as flight time data. This initial investigation is being used to inform the creation and refinement of a preliminary Concept of Operations and for the subsequent development of technologies that will enable overland supersonic flight.

  14. Medical and Urologic Issues in Space Flight and Lunar/Mars Exploration

    NASA Technical Reports Server (NTRS)

    Jones, Jeffrey A.

    2004-01-01

    Dr. Jeffrey Jones will be talking about medical issues in space flight secondary to microgravity: fluid shifts, orthostatic changes, muscle and endurance losses, bone mineral losses, radiation exposure, etc. He will discuss the International Space Station (ISS) benefits to medicine. He will show the ISS crew video and share the President's new vision as per the speaker's bureau direction.

  15. NASA Dryden's UAS Service Capabilities

    NASA Technical Reports Server (NTRS)

    Bauer, Jeff

    2007-01-01

    The vision of NASA's Dryden Flight Research Center is to "fly what others only imagine." Its mission is to advance technology and science through flight. Objectives supporting the mission include performing flight research and technology integration to revolutionize aviation and pioneer aerospace technology, validating space exploration concepts, conducting airborne remote sensing and science missions, and supporting operations of the Space Shuttle and the International Space Station. A significant focus of effort in recent years has been on Unmanned Aircraft Systems (UAS), both in support of the Airborne Science Program and as research vehicles to advance the state of the art in UAS. Additionally, the Center has used its piloted aircraft in support of UAS technology development. In order to facilitate greater access to the UAS expertise that exists at the Center, that expertise has been organized around three major capabilities. The first is access to high-altitude, long-endurance UAS. The second is the establishment of a test range for small UAS. The third is safety case assessment support.

  16. Development of Inflatable Entry Systems Technologies

    NASA Technical Reports Server (NTRS)

    Player, Charles J.; Cheatwood, F. McNeil; Corliss, James

    2005-01-01

    Achieving the objectives of NASA's Vision for Space Exploration will require the development of new technologies, which will in turn require higher fidelity modeling and analysis techniques, and innovative testing capabilities. Development of entry systems technologies can be especially difficult due to the lack of facilities and resources available to test these new technologies in mission relevant environments. This paper discusses the technology development process to bring inflatable aeroshell technology from Technology Readiness Level 2 (TRL-2) to TRL-7. This paper focuses mainly on two projects: Inflatable Reentry Vehicle Experiment (IRVE), and Inflatable Aeroshell and Thermal Protection System Development (IATD). The objectives of IRVE are to conduct an inflatable aeroshell flight test that demonstrates exoatmospheric deployment and inflation, reentry survivability and stability, and predictable drag performance. IATD will continue the development of the technology by conducting exploration specific trade studies and feeding forward those results into three more flight tests. Through an examination of these projects, and other potential projects, this paper discusses some of the risks, issues, and unexpected benefits associated with the development of inflatable entry systems technology.

  17. Benign Episodic Unilateral Mydriasis in a Flight Nurse.

    PubMed

    Schiemer, Anthony

    2017-05-01

    Benign episodic unilateral mydriasis is one cause of anisocoria. This phenomenon is thought to be related to an imbalance between the sympathetic and parasympathetic nervous systems. There is a documented association with migraines, but asymptomatic cases have also been reported. A challenge with all cases is the level of investigation required to exclude more sinister causes of nervous system dysfunction. In a dynamic flight environment, additional considerations need to be made, such as varying light levels and use of night vision devices. A 27-yr-old woman on deployment to Afghanistan as a flight nurse presented to the role one clinic with right-sided mydriasis. The patient denied headache or any history of migraines. A dilated right pupil that was reactive to light was found on exam. Symptoms and exam findings resolved shortly after initial presentation. We consulted an ophthalmologist who requested patient transfer for review. He made a diagnosis of benign episodic unilateral mydriasis. There are a variety of causes for anisocoria. A thorough history and examination are required to avoid unnecessary investigations that may not be locally available in the more austere deployed military settings. From an operational perspective, the decision needs to be made regarding the maintenance of flight status. Consideration needs to be given to patient care capability when treating a flight nurse. In cases of rapid resolution such as this, removal from operational status may not be warranted, provided the clinician is confident of the diagnosis. Schiemer A. Benign episodic unilateral mydriasis in a flight nurse. Aerosp Med Hum Perform. 2017; 88(5):500-502.

  18. Nocturnal insects use optic flow for flight control

    PubMed Central

    Baird, Emily; Kreiss, Eva; Wcislo, William; Warrant, Eric; Dacke, Marie

    2011-01-01

    To avoid collisions when navigating through cluttered environments, flying insects must control their flight so that their sensory systems have time to detect obstacles and avoid them. To do this, day-active insects rely primarily on the pattern of apparent motion generated on the retina during flight (optic flow). However, many flying insects are active at night, when obtaining reliable visual information for flight control presents much more of a challenge. To assess whether nocturnal flying insects also rely on optic flow cues to control flight in dim light, we recorded flights of the nocturnal neotropical sweat bee, Megalopta genalis, flying along an experimental tunnel when: (i) the visual texture on each wall generated strong horizontal (front-to-back) optic flow cues, (ii) the texture on only one wall generated these cues, and (iii) horizontal optic flow cues were removed from both walls. We find that Megalopta increase their groundspeed when horizontal motion cues in the tunnel are reduced (conditions (ii) and (iii)). However, differences in the amount of horizontal optic flow on each wall of the tunnel (condition (ii)) do not affect the centred position of the bee within the flight tunnel. To better understand the behavioural response of Megalopta, we repeated the experiments on day-active bumble-bees (Bombus terrestris). Overall, our findings demonstrate that despite the limitations imposed by dim light, Megalopta—like their day-active relatives—rely heavily on vision to control flight, but that they use visual cues in a different manner from diurnal insects. PMID:21307047
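
    As an aside for readers unfamiliar with the principle, the sketch below is a minimal, illustrative simulation (not the authors' analysis code) of optic-flow-based tunnel flight: lateral flow from each wall scales as speed over distance, a centring term balances the two walls, and a speed term holds the summed flow near a setpoint, so removing texture makes the simulated flier speed up, as observed for Megalopta. The tunnel width, gains, and setpoint are all assumed values.

```python
# Illustrative simulation (assumed values): optic-flow-based tunnel flight control.
# Lateral offset y (0 = centre, positive toward the right wall), tunnel half-width W,
# forward speed v. Translational flow from each wall ~ v / distance to that wall.
W = 0.15                 # tunnel half-width [m] (assumed)
y, v = 0.04, 0.5         # initial lateral offset [m] and ground speed [m/s] (assumed)
flow_setpoint = 6.0      # desired summed flow [rad/s] (assumed)
k_centre, k_speed, dt = 0.02, 0.01, 0.02

for step in range(500):
    flow_left = v / (W + y)       # apparent motion of the left wall
    flow_right = v / (W - y)      # apparent motion of the right wall
    # Centring: steer away from the wall that produces the larger flow.
    y -= k_centre * (flow_right - flow_left) * dt
    # Speed regulation: hold the summed flow near a setpoint, so removing wall
    # texture (less measured flow) makes the controller speed up.
    v += k_speed * (flow_setpoint - (flow_left + flow_right)) * dt

print(f"final offset {y:+.3f} m, final speed {v:.2f} m/s")
```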

  19. Multiphase Flow Technology Impacts on Thermal Control Systems for Exploration

    NASA Technical Reports Server (NTRS)

    McQuillen, John; Sankovic, John; Lekan, Jack

    2006-01-01

    The Two-Phase Flow Facility (TPHIFFy) Project focused on bridging the critical knowledge gap by developing and demonstrating critical multiphase fluid products for advanced life support, thermal management and power conversion systems that are required to enable the Vision for Space Exploration. Safety and reliability of future systems will be enhanced by addressing critical microgravity fluid physics issues associated with flow boiling, condensation, phase separation, and system stability. The project included concept development, normal gravity testing, and reduced gravity aircraft flight campaigns, in preparation for the development of a space flight experiment. Data will be utilized to develop predictive models that could be used for system design and operation. A single fluid, two-phase closed thermodynamic loop test bed was designed, assembled and tested. The major components in this test bed include: a boiler, a condenser, a phase separator and a circulating pump. The test loop was instrumented with flow meters, thermocouples, pressure transducers and both high speed and normal speed video cameras. A low boiling point surrogate fluid, FC-72, was selected based on scaling analyses using preliminary designs for operational systems. Preliminary results are presented which include flow regime transitions and some observations regarding system stability.

  20. Bioelectric Control of a 757 Class High Fidelity Aircraft Simulation

    NASA Technical Reports Server (NTRS)

    Jorgensen, Charles; Wheeler, Kevin; Stepniewski, Slawomir; Norvig, Peter (Technical Monitor)

    2000-01-01

    This paper presents results of a recent experiment in fine grain Electromyographic (EMG) signal recognition. We demonstrate bioelectric flight control of a 757 class simulation aircraft landing at San Francisco International Airport. The physical instrumentality of a pilot control stick is not used. A pilot closes a fist in empty air and performs control movements which are captured by a dry electrode array on the arm, analyzed, and routed through a flight director permitting full pilot outer loop control of the simulation. A Vision Dome immersive display is used to create a VR world presenting the aircraft body mechanics and the flight responses to pilot movements. Inner loop surfaces and differential aircraft thrust are controlled using a hybrid neural network architecture that combines a damage adaptive controller (Jorgensen 1998, Totah 1998) with a propulsion only based control system (Bull & Kaneshige 1997). Thus the 757 aircraft is not only flown bioelectrically at the pilot level but also demonstrates damage adaptive neural network control permitting adaptation to severe changes in the physical flight characteristics of the aircraft at the inner loop level. To compensate for accident scenarios, the aircraft uses remaining control surface authority and differential thrust from the engines. To the best of our knowledge this is the first time real time bioelectric fine-grained control, differential thrust based control, and neural network damage adaptive control have been integrated into a single flight demonstration. The paper describes the EMG pattern recognition system and the bioelectric pattern recognition methodology.
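
    The sketch below is an illustrative, greatly simplified example of windowed EMG feature extraction and gesture classification on synthetic signals; it is not the paper's hybrid neural network architecture or flight-director interface, and the channel count, window length, and gesture labels are assumptions.

```python
# Illustrative sketch (not the paper's system): windowed EMG features and a
# nearest-centroid "gesture" classifier on synthetic multi-channel signals.
import numpy as np

rng = np.random.default_rng(0)
n_channels, window = 8, 200                         # channels and window length (assumed)

def make_gesture(scale_per_channel):
    """Synthetic EMG burst: zero-mean noise whose per-channel amplitude encodes the gesture."""
    scale = np.asarray(scale_per_channel)[:, None]  # broadcast one std dev per channel
    return rng.normal(0.0, scale, size=(n_channels, window))

def features(x):
    """Root-mean-square amplitude per channel, a common EMG feature."""
    return np.sqrt(np.mean(x**2, axis=1))

# Two hypothetical gestures, e.g. "closed fist" vs "relaxed hand" (assumed profiles).
scales = {"fist": np.linspace(0.5, 2.0, n_channels),
          "relax": np.linspace(0.1, 0.3, n_channels)}

# Train: average the feature vectors of a few labelled windows per gesture.
centroids = {g: np.mean([features(make_gesture(s)) for _ in range(20)], axis=0)
             for g, s in scales.items()}

# Classify a new window by nearest centroid (Euclidean distance).
f = features(make_gesture(scales["fist"]))
label = min(centroids, key=lambda g: np.linalg.norm(f - centroids[g]))
print("predicted gesture:", label)
```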

  1. Exploration Architecture Options - ECLSS, EVA, TCS Implications

    NASA Technical Reports Server (NTRS)

    Chambliss, Joe; Henninger, Don; Lawrence, Carl

    2010-01-01

    Many options for exploration of space have been identified and evaluated since the Vision for Space Exploration (VSE) was announced in 2004. Lunar architectures have been identified and addressed by the Lunar Surface Systems team to establish options for how to get to, and then inhabit and explore, the moon. The Augustine Commission evaluated human space flight for the Obama administration and identified many options for how to conduct human spaceflight in the future. This paper evaluates the space exploration architecture options for their implications on the Environmental Control and Life Support System (ECLSS), ExtraVehicular Activity (EVA) systems, and Thermal Control System (TCS). The advantages and disadvantages of each architecture and option are presented.

  2. A 3-Dimensional Cockpit Display with Traffic and Terrain Information for the Small Aircraft Transportation System

    NASA Technical Reports Server (NTRS)

    UijtdeHaag, Maarten; Thomas, Robert; Rankin, James R.

    2004-01-01

    The report discusses the architecture and the flight test results of a 3-Dimensional Cockpit Display of Traffic and Terrain Information (3D-CDTI). The presented 3D-CDTI is a perspective display format that combines existing Synthetic Vision System (SVS) research and Automatic Dependent Surveillance-Broadcast (ADS-B) technology to improve the pilot's situational awareness. The goal of the 3D-CDTI is to contribute to the development of new display concepts for NASA's Small Aircraft Transportation System research program. Papers were presented at the PLANS 2002 meeting and the ION-GPS 2002 meeting. The contents of this report are derived from the results discussed in those papers.
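
    For readers wanting a concrete picture of what such a display must compute, the sketch below shows one assumed pipeline: convert an ADS-B traffic report (latitude, longitude, altitude) into flat-earth offsets from ownship and project it with a simple pinhole model into display pixel coordinates. The projection constants and the flat-earth approximation are illustrative assumptions, not the report's implementation.

```python
# Illustrative sketch (not the report's code): place an ADS-B traffic target on a
# perspective display using flat-earth local coordinates and a pinhole projection.
import math

def lla_to_local(lat, lon, alt, lat0, lon0, alt0):
    """Approximate north/east/up offsets [m] of a target from ownship (flat earth, short range)."""
    r_earth = 6371000.0
    north = math.radians(lat - lat0) * r_earth
    east = math.radians(lon - lon0) * r_earth * math.cos(math.radians(lat0))
    up = alt - alt0
    return north, east, up

def project(north, east, up, heading_deg, f_px=800.0, cx=512.0, cy=384.0):
    """Rotate into a camera looking along ownship heading, then apply a pinhole projection."""
    h = math.radians(heading_deg)
    forward = north * math.cos(h) + east * math.sin(h)     # along-track distance
    right = -north * math.sin(h) + east * math.cos(h)      # cross-track distance
    if forward <= 0:
        return None                                        # behind the viewpoint
    return cx + f_px * right / forward, cy - f_px * up / forward

# Ownship at (37.0 N, -76.0 E, 1000 m) heading 090; traffic ahead, slightly left, and higher.
ned = lla_to_local(37.001, -75.95, 1300.0, 37.0, -76.0, 1000.0)
print("pixel coordinates:", project(*ned, heading_deg=90.0))
```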

  3. Night Vision Manual for the Flight Surgeon.

    DTIC Science & Technology

    1985-08-01

    ...by optic nerve and pathways to Brodmann's occipital areas 17 and 18). Perception occurs - vision... Sensitive material (retinal pigment) must be... clearly may be defined as glare. Glare becomes a problem in patients with opacities of the ocular media or with retinal diseases... reduction of pupillary area caused by the drug. 3. Retinal causes of abnormal dark adaptation. a. Congenital stationary night blindness. b. Retinitis...

  4. Evaluation of novel technologies for the miniaturization of flash imaging lidar

    NASA Astrophysics Data System (ADS)

    Mitev, V.; Pollini, A.; Haesler, J.; Perenzoni, D.; Stoppa, D.; Kolleck, Christian; Chapuy, M.; Kervendal, E.; Pereira do Carmo, João.

    2017-11-01

    Planetary exploration constitutes one of the main components of European space activities. Missions to Mars, the Moon and asteroids are foreseen, where it is assumed that human missions shall be preceded by robotic exploration flights. 3D vision is recognised as a key enabling technology for the relative proximity navigation of spacecraft, and imaging LiDAR is one of the best candidates for such a 3D vision sensor.
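
    As an illustration of the basic measurement a flash imaging lidar provides, the sketch below converts a simulated per-pixel time-of-flight frame into ranges (range = c*t/2) and a rough 3D point cloud. The detector format, field of view, and timing values are assumed, not taken from the evaluated devices.

```python
# Illustrative sketch (assumed sensor values): turning a flash-lidar time-of-flight
# frame into per-pixel ranges and a rough 3D point cloud.
import numpy as np

C = 299_792_458.0                     # speed of light [m/s]
n_rows, n_cols, fov_deg = 32, 32, 20  # detector format and field of view (assumed)

rng = np.random.default_rng(1)
tof = rng.uniform(0.6e-6, 0.8e-6, size=(n_rows, n_cols))   # simulated round-trip times [s]

ranges = C * tof / 2.0                # one-way range per pixel [m]

# Per-pixel viewing directions on a simple angular grid across the field of view.
az = np.deg2rad(np.linspace(-fov_deg / 2, fov_deg / 2, n_cols))
el = np.deg2rad(np.linspace(-fov_deg / 2, fov_deg / 2, n_rows))
el_grid, az_grid = np.meshgrid(el, az, indexing="ij")

x = ranges * np.cos(el_grid) * np.cos(az_grid)   # along the boresight axis
y = ranges * np.cos(el_grid) * np.sin(az_grid)
z = ranges * np.sin(el_grid)
points = np.stack([x, y, z], axis=-1)            # (32, 32, 3) point cloud

print(f"range span: {ranges.min():.1f}-{ranges.max():.1f} m, cloud shape {points.shape}")
```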

  5. Vision Algorithm for the Solar Aspect System of the HEROES Mission

    NASA Technical Reports Server (NTRS)

    Cramer, Alexander

    2014-01-01

    This work covers the design and test of a machine vision algorithm for generating high-accuracy pitch and yaw pointing solutions relative to the sun for the High Energy Replicated Optics to Explore the Sun (HEROES) mission. It describes how images were constructed by focusing an image of the sun onto a plate printed with a pattern of small fiducial markers. Images of this plate were processed in real time to determine relative position of the balloon payload to the sun. The algorithm is broken into four problems: circle detection, fiducial detection, fiducial identification, and image registration. Circle detection is handled by an "Average Intersection" method, fiducial detection by a matched filter approach, identification with an ad-hoc method based on the spacing between fiducials, and image registration with a simple least squares fit. Performance is verified on a combination of artificially generated images, test data recorded on the ground, and images from the 2013 flight.

  6. Vision Algorithm for the Solar Aspect System of the HEROES Mission

    NASA Technical Reports Server (NTRS)

    Cramer, Alexander; Christe, Steven; Shih, Albert

    2014-01-01

    This work covers the design and test of a machine vision algorithm for generating high-accuracy pitch and yaw pointing solutions relative to the sun for the High Energy Replicated Optics to Explore the Sun (HEROES) mission. It describes how images were constructed by focusing an image of the sun onto a plate printed with a pattern of small fiducial markers. Images of this plate were processed in real time to determine relative position of the balloon payload to the sun. The algorithm is broken into four problems: circle detection, fiducial detection, fiducial identification, and image registration. Circle detection is handled by an Average Intersection method, fiducial detection by a matched filter approach, identification with an ad-hoc method based on the spacing between fiducials, and image registration with a simple least squares fit. Performance is verified on a combination of artificially generated images, test data recorded on the ground, and images from the 2013 flight.
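
    The abstract states that image registration is handled with a simple least squares fit. The sketch below shows one standard way such a fit can be done, a least-squares 2-D similarity transform (Procrustes/Umeyama) between a known fiducial layout and detected image positions; it is an illustration under assumed coordinates, not the HEROES flight code.

```python
# Illustrative sketch: least-squares 2-D similarity transform (rotation, scale,
# translation) between matched fiducial coordinates, via SVD (Procrustes/Umeyama).
import numpy as np

def fit_similarity(src, dst):
    """Return s, R, t minimising ||s*R@src_i + t - dst_i||^2 over matched 2-D points."""
    src_c = src - src.mean(axis=0)
    dst_c = dst - dst.mean(axis=0)
    U, S, Vt = np.linalg.svd(src_c.T @ dst_c)
    d = np.sign(np.linalg.det(U @ Vt))            # guard against reflections
    D = np.diag([1.0, d])
    R = (U @ D @ Vt).T
    s = np.trace(np.diag(S) @ D) / np.sum(src_c**2)
    t = dst.mean(axis=0) - s * R @ src.mean(axis=0)
    return s, R, t

# Hypothetical fiducial layout (plate coordinates) and its detected image positions.
plate = np.array([[0., 0.], [10., 0.], [10., 10.], [0., 10.], [5., 5.]])
theta, scale_true, shift = np.deg2rad(12.0), 1.8, np.array([30.0, -7.0])
Rt = np.array([[np.cos(theta), -np.sin(theta)], [np.sin(theta), np.cos(theta)]])
image = scale_true * plate @ Rt.T + shift + np.random.default_rng(2).normal(0, 0.05, plate.shape)

s, R, t = fit_similarity(plate, image)
print("recovered scale", round(s, 3),
      "rotation deg", round(np.degrees(np.arctan2(R[1, 0], R[0, 0])), 2))
```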

  7. Binary pressure-sensitive paint measurements using miniaturised, colour, machine vision cameras

    NASA Astrophysics Data System (ADS)

    Quinn, Mark Kenneth

    2018-05-01

    Recent advances in machine vision technology and capability have led to machine vision cameras becoming applicable for scientific imaging. This study aims to demonstrate the applicability of machine vision colour cameras for the measurement of dual-component pressure-sensitive paint (PSP). The presence of a second luminophore component in the PSP mixture significantly reduces its inherent temperature sensitivity, increasing its applicability at low speeds. All of the devices tested are smaller than the cooled CCD cameras traditionally used and most are of significantly lower cost, thereby increasing the accessibility of such technology and techniques. Comparisons between three machine vision cameras, a three CCD camera, and a commercially available specialist PSP camera are made on a range of parameters, and a detailed PSP calibration is conducted in a static calibration chamber. The findings demonstrate that colour machine vision cameras can be used for quantitative, dual-component, pressure measurements. These results give rise to the possibility of performing on-board dual-component PSP measurements in wind tunnels or on real flight/road vehicles.
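
    The sketch below illustrates, with synthetic numbers rather than the paper's data, the kind of static calibration such a study performs: fit chamber pressure against the reference-ratioed intensity of the pressure-sensitive channel and invert the fit for a measured pixel. The Stern-Volmer-style coefficients and pressures are assumptions.

```python
# Illustrative sketch (synthetic data, not the paper's calibration): fit the
# pressure-sensitive channel's intensity ratio against known chamber pressures.
import numpy as np

rng = np.random.default_rng(3)

# Known calibration pressures [kPa] and a synthetic Stern-Volmer-like response
# I_ref / I = A + B * (P / P_ref), plus a little camera noise (all values assumed).
p_ref, A, B = 100.0, 0.22, 0.78
pressures = np.linspace(20, 160, 15)
intensity_ratio = A + B * (pressures / p_ref) + rng.normal(0, 0.004, pressures.size)

# Calibrate: fit P/P_ref as a low-order polynomial in I_ref/I (2nd order is common).
coeffs = np.polyfit(intensity_ratio, pressures / p_ref, deg=2)

# Apply the calibration to a "measured" ratio from a wind-tunnel image pixel.
measured_ratio = 0.95
recovered_pressure = np.polyval(coeffs, measured_ratio) * p_ref
print(f"recovered pressure: {recovered_pressure:.1f} kPa")
```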

  8. Initial development of a metric to describe the level of safety associated with piloting an aircraft with synthetic vision systems (SVS) displays

    NASA Astrophysics Data System (ADS)

    Bartolone, Anthony P.; Glaab, Louis J.; Hughes, Monica F.; Parrish, Russell V.

    2005-05-01

    Synthetic Vision Systems (SVS) displays provide pilots with a continuous view of terrain combined with integrated guidance symbology in an effort to increase situation awareness (SA) and decrease workload during operations in Instrument Meteorological Conditions (IMC). It is hypothesized that SVS displays can replicate the safety and operational flexibility of flight in Visual Meteorological Conditions (VMC), regardless of actual out-the-window (OTW) visibility or time of day. Throughout the course of recent SVS research, significant progress has been made towards evolving SVS displays as well as demonstrating their ability to increase SA compared to conventional avionics in a variety of conditions. While a substantial amount of data has been accumulated demonstrating the capabilities of SVS displays, the ability of SVS to replicate the safety and operational flexibility of VMC flight performance in all visibility conditions is unknown to any specific degree. The previous piloted simulations and flight tests have shown that better SA and path precision are achievable with SVS displays without causing an increase in workload; however, none of the previous SVS research attempted to fully capture the significance of SVS displays in terms of their contribution to safety or operational benefits. In order to more fully quantify the relationship of flight operations in IMC with SVS displays to conventional operations conducted in VMC, a fundamental comparison to current day general aviation (GA) flight instruments was warranted. Such a comparison could begin to establish the extent to which SVS display concepts are capable of maintaining an "equivalent level of safety" with the round dials they could one day replace, for both current and future operations. Such a comparison was the focus of the SVS-ES experiment conducted under the Aviation Safety and Security Program's (AvSSP) GA Element of the SVS Project at NASA Langley Research Center in Hampton, Virginia. A combination of subjective and objective data measures was used in this preliminary research to quantify the relationship between selected components of safety that are associated with flying an approach. Four information display methods, ranging from a "round dials" baseline through a fully integrated SVS package that includes terrain, pathway based guidance, and a strategic navigation display, were investigated in this high fidelity simulation experiment. In addition, a broad spectrum of pilots, representative of the GA population, was employed for testing in an attempt to enable greater application of the results and determine if "equivalent levels of safety" are achievable through the incorporation of SVS technology regardless of a pilot's flight experience.

  9. The Transition from Spacecraft Development to Flight Operation: Human Factor Considerations

    NASA Technical Reports Server (NTRS)

    Basilio, Ralph R.

    2000-01-01

    In the field of aeronautics and astronautics, a paradigm shift has been witnessed by those in academia, research and development, and private industry. Long development life cycles and the budgets to support such programs and projects have given way to aggressive task schedules and leaner resources to draw from, all the while challenging assigned individuals to create and produce improved products or processes. However, this "faster, better, cheaper" concept cannot merely be applied to the design, development, and test of complex systems such as earth-orbiting or interplanetary robotic spacecraft. Full advantage is not possible without due consideration and application to mission operations planning and flight operations. Equally as important as the flight system, the mission operations system, consisting of qualified personnel, ground hardware and software tools, and verified and validated operational processes, should also be regarded as a complex system requiring personnel to draw upon formal education, training, related experiences, and heuristic reasoning in engineering an effective and efficient system. Unquestionably, qualified personnel are the most important elements of a mission operations system. This paper examines the experiences of the Deep Space 1 Project, the first in a series of new technology in-flight validation missions sponsored by the United States National Aeronautics and Space Administration (NASA), specifically in developing a subsystems analysis and technology validation team comprised of former spacecraft development personnel. Human factor considerations are investigated from initial concept/vision formulation, through operational process development and personnel test and training, to initial uplink product development and test support. Emphasis has been placed on challenges and applied or recommended solutions, so as to provide opportunities for future programs and projects to address and disposition potential issues and concerns as early as possible to reap the benefits associated with learning from others' past experiences.

  10. Rationale for windshield glass system specification requirements for shuttle orbiter

    NASA Technical Reports Server (NTRS)

    Hayashida, K.; King, G. L.; Tesinsiky, J.; Wittenburg, D. R.

    1972-01-01

    A preliminary procurement specification for the space shuttle orbiter windshield pane, and some of the design considerations and rationale leading to its development are presented. The windshield designer is given the necessary methods and procedures for assuring glass pane structural integrity by proof test. These methods and procedures are fully developed for annealed and thermally tempered aluminosilicate, borosilicate, and soda lime glass and for annealed fused silica. Application of the method to chemically tempered glass is considered. Other considerations are vision requirements, protection against bird impact, hail, frost, rain, and meteoroids. The functional requirements of the windshield system during landing, ferrying, boost, space flight, and entry are included.

  11. Concept and realization of unmanned aerial system with different modes of operation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Czyba, Roman; Szafrański, Grzegorz; Janusz, Wojciech

    2014-12-10

    In this paper we describe the development process of an unmanned aerial system, its mechanical components, electronics and software solutions. During the design stage, we formulated the necessary requirements for the multirotor vehicle and ground control station in order to build an optimal system which can be used for reconnaissance missions. The platform is controlled by use of the ground control station (GCS) and can accomplish video-based observation tasks. In order to fulfill this requirement the on-board payload consists of a mechanically stabilized camera augmented with machine vision algorithms to enable object tracking tasks. A novelty of the system is its four modes of flight, which give full functionality to the developed UAV system. The designed ground control station consists not only of the application itself, but also of built-in dedicated components located inside the chassis, which together create an advanced UAV system supporting the control and management of the flight. The mechanical part of the quadrotor is designed to ensure robustness while minimizing the weight of the platform. Finally, the designed electronics allow for implementation of control and estimation algorithms without the need for excessive computational optimization.

  12. A Concept of Operations for an Integrated Vehicle Health Assurance System

    NASA Technical Reports Server (NTRS)

    Hunter, Gary W.; Ross, Richard W.; Berger, David E.; Lekki, John D.; Mah, Robert W.; Perey, Danie F.; Schuet, Stefan R.; Simon, Donald L.; Smith, Stephen W.

    2013-01-01

    This document describes a Concept of Operations (ConOps) for an Integrated Vehicle Health Assurance System (IVHAS). This ConOps is associated with the Maintain Vehicle Safety (MVS) between Major Inspections Technical Challenge in the Vehicle Systems Safety Technologies (VSST) Project within NASA's Aviation Safety Program. In particular, this document seeks to describe an integrated system concept for vehicle health assurance that integrates ground-based inspection and repair information with in-flight measurement data for airframe, propulsion, and avionics subsystems. The MVS Technical Challenge intends to maintain vehicle safety between major inspections by developing and demonstrating new integrated health management and failure prevention technologies to assure the integrity of vehicle systems between major inspection intervals and maintain vehicle state awareness during flight. The approach provided by this ConOps is intended to help optimize technology selection and development, as well as allow the initial integration and demonstration of these subsystem technologies over the 5 year span of the VSST program, and serve as a guideline for developing IVHAS technologies under the Aviation Safety Program within the next 5 to 15 years. A long-term vision of IVHAS is provided to describe a basic roadmap for more intelligent and autonomous vehicle systems.

  13. Landing performance by low-time private pilots after the sudden loss of binocular vision - Cyclops II

    NASA Technical Reports Server (NTRS)

    Lewis, C. E., Jr.; Swaroop, R.; Mcmurty, T. C.; Blakeley, W. R.; Masters, R. L.

    1973-01-01

    Study of low-time general aviation pilots, who, in a series of spot landings, were suddenly deprived of binocular vision by patching either eye on the downwind leg of a standard, closed traffic pattern. Data collected during these landings were compared with control data from landings flown with normal vision during the same flight. The sequence of patching and the mix of control and monocular landings were randomized to minimize the effect of learning. No decrease in performance was observed during landings with vision restricted to one eye; in fact, performance improved. This observation is reported at a high level of confidence (p less than 0.001). These findings confirm the previous work of Lewis and Krier and have important implications with regard to aeromedical certification standards.
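
    The study reports its result as a paired comparison at p < 0.001. The sketch below shows, with synthetic scores rather than the study's data, how such a paired comparison can be run; the sample size and score distributions are assumptions.

```python
# Illustrative sketch (synthetic scores, not the study's data): paired comparison of
# landing performance flown with normal vision versus with one eye patched.
import numpy as np
from scipy import stats

rng = np.random.default_rng(8)
n_pilots = 20                                              # assumed sample size

# Hypothetical landing-error scores [m from the spot]; lower is better.
binocular = rng.normal(12.0, 3.0, n_pilots)
monocular = binocular - rng.normal(1.0, 1.5, n_pilots)     # slight improvement, as the study reports

t_stat, p_value = stats.ttest_rel(binocular, monocular)    # paired t-test
w_stat, p_wilcoxon = stats.wilcoxon(binocular, monocular)  # nonparametric alternative

print(f"paired t-test: t = {t_stat:.2f}, p = {p_value:.4f}")
print(f"Wilcoxon signed-rank: W = {w_stat:.1f}, p = {p_wilcoxon:.4f}")
```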

  14. TAMU: Blueprint for A New Space Mission Operations System Paradigm

    NASA Technical Reports Server (NTRS)

    Ruszkowski, James T.; Meshkat, Leila; Haensly, Jean; Pennington, Al; Hogle, Charles

    2011-01-01

    The Transferable, Adaptable, Modular and Upgradeable (TAMU) Flight Production Process (FPP) is a System of Systems (SoS) framework which cuts across multiple organizations and their associated facilities, which are, in the most general case, in geographically dispersed locations, to develop the architecture and associated workflow processes of products for a broad range of flight projects. Further, TAMU FPP provides for the automatic execution and re-planning of the workflow processes as they become operational. This paper provides the blueprint for the TAMU FPP paradigm. This blueprint presents a complete, coherent technique, process and tool set that results in an infrastructure that can be used for full lifecycle design and decision making during the flight production process. Based on many years of experience with the Space Shuttle Program (SSP) and the International Space Station (ISS), and using the now-cancelled Constellation Program, which aimed at returning humans to the moon, as a starting point, a modern model-based Systems Engineering infrastructure has been built to re-engineer the FPP. This infrastructure uses a structured modeling and architecture development approach to optimize the system design, thereby reducing the sustaining costs and increasing system efficiency, reliability, robustness and maintainability metrics. With the advent of the new vision for human space exploration, it is now necessary to further generalize this framework to take into consideration a broad range of missions and the participation of multiple organizations outside of the MOD; hence the Transferable, Adaptable, Modular and Upgradeable (TAMU) concept.

  15. COBALT: Development of a Platform to Flight Test Lander GN&C Technologies on Suborbital Rockets

    NASA Technical Reports Server (NTRS)

    Carson, John M., III; Seubert, Carl R.; Amzajerdian, Farzin; Bergh, Chuck; Kourchians, Ara; Restrepo, Carolina I.; Villapando, Carlos Y.; O'Neal, Travis V.; Robertson, Edward A.; Pierrottet, Diego

    2017-01-01

    The NASA COBALT Project (CoOperative Blending of Autonomous Landing Technologies) is developing and integrating new precision-landing Guidance, Navigation and Control (GN&C) technologies, along with developing a terrestrial flight-test platform for Technology Readiness Level (TRL) maturation. The current technologies include a third-generation Navigation Doppler Lidar (NDL) sensor for ultra-precise velocity and line-of-sight (LOS) range measurements, and the Lander Vision System (LVS) that provides passive-optical Terrain Relative Navigation (TRN) estimates of map-relative position. The COBALT platform is self-contained and includes the NDL and LVS sensors, blending filter, a custom compute element, power unit, and communication system. The platform incorporates a structural frame that has been designed to integrate with the payload frame onboard the new Masten Xodiac vertical take-off, vertical landing (VTVL) terrestrial rocket vehicle. Ground integration and testing is underway, and terrestrial flight testing onboard Xodiac is planned for 2017 with two flight campaigns: one open-loop and one closed-loop.
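
    The abstract mentions a blending filter that fuses LVS map-relative position with NDL velocity. The sketch below is a minimal one-axis Kalman-filter illustration of that kind of fusion; the state model, rates, and noise values are assumptions, not the COBALT filter design.

```python
# Illustrative sketch (assumed values, not the COBALT filter): 1-axis Kalman filter
# blending a map-relative position fix (LVS-like) with a velocity measurement (NDL-like).
import numpy as np

dt = 0.02                              # filter step [s] (assumed)
F = np.array([[1.0, dt], [0.0, 1.0]])  # constant-velocity transition, state = [pos, vel]
Q = np.diag([1e-4, 1e-3])              # process noise (assumed)
H_pos = np.array([[1.0, 0.0]])         # position measurement model
H_vel = np.array([[0.0, 1.0]])         # velocity measurement model
R_pos, R_vel = np.array([[4.0]]), np.array([[0.01]])   # measurement variances (assumed)

x = np.array([[0.0], [0.0]])           # state estimate
P = np.diag([100.0, 10.0])             # state covariance

def update(x, P, z, H, R):
    y = z - H @ x                      # innovation
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)     # Kalman gain
    return x + K @ y, (np.eye(2) - K @ H) @ P

rng = np.random.default_rng(4)
for k in range(500):
    true_pos, true_vel = -2.0 * (k * dt), -2.0        # synthetic truth: descending at 2 m/s
    x, P = F @ x, F @ P @ F.T + Q                     # predict
    x, P = update(x, P, np.array([[true_vel + rng.normal(0, 0.1)]]), H_vel, R_vel)
    if k % 25 == 0:                                   # position fixes arrive less often
        x, P = update(x, P, np.array([[true_pos + rng.normal(0, 2.0)]]), H_pos, R_pos)

print("estimated [pos, vel]:", x.ravel().round(2), " truth:", [round(true_pos, 2), true_vel])
```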

  16. Night vision imaging systems design, integration, and verification in military fighter aircraft

    NASA Astrophysics Data System (ADS)

    Sabatini, Roberto; Richardson, Mark A.; Cantiello, Maurizio; Toscano, Mario; Fiorini, Pietro; Jia, Huamin; Zammit-Mangion, David

    2012-04-01

    This paper describes the developmental and testing activities conducted by the Italian Air Force Official Test Centre (RSV) in collaboration with Alenia Aerospace, Litton Precision Products and Cranfield University, in order to confer the Night Vision Imaging Systems (NVIS) capability to the Italian TORNADO IDS (Interdiction and Strike) and ECR (Electronic Combat and Reconnaissance) aircraft. The work consisted of various Design, Development, Test and Evaluation (DDT&E) activities, including Night Vision Goggles (NVG) integration, cockpit instruments and external lighting modifications, as well as various ground test sessions and a total of eighteen flight test sorties. RSV and Litton Precision Products were responsible for coordinating and conducting the installation activities of the internal and external lights. Particularly, an iterative process was established, allowing rapid on-site correction of the major deficiencies encountered during the ground and flight test sessions. Both single-ship (day/night) and formation (night) flights were performed, shared between the Test Crews involved in the activities, allowing for a redundant examination of the various test items by all participants. An innovative test matrix was developed and implemented by RSV for assessing the operational suitability and effectiveness of the various modifications implemented. Also important was the definition of test criteria for Pilot and Weapon Systems Officer (WSO) workload assessment during the accomplishment of various operational tasks during NVG missions. Furthermore, the specific technical and operational elements required for evaluating the modified helmets were identified, allowing an exhaustive comparative evaluation of the two proposed solutions (i.e., HGU-55P and HGU-55G modified helmets). The results of the activities were very satisfactory. The initial compatibility problems encountered were progressively mitigated by incorporating modifications both in the front and rear cockpits at the various stages of the test campaign. This process allowed a considerable enhancement of the TORNADO NVIS configuration, giving a good medium-high level NVG operational capability to the aircraft. Further developments also include the design, integration and test of internal/external lighting for the Italian TORNADO "Mid Life Update" (MLU) and other programs, such as the AM-X aircraft internal/external lights modification/testing and the activities addressing low-altitude NVG operations with fast jets (e.g., TORNADO, AM-X, MB-339CD), a major issue being the safe ejection of aircrew with NVG and NVG modified helmets. Two options have been identified for solving this problem: namely the modification of the current Gentex HGU-55 helmets and the design of a new helmet incorporating a reliable NVG connection/disconnection device (i.e., a mechanical system fully integrated in the helmet frame), with embedded automatic disconnection capability in case of ejection.

  17. 77 FR 40023 - 36(b)(1) Arms Sales Notification

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-07-06

    ... vision goggle compatible and sun light readable. The pilots and aircrew have common programmable keysets... pilots and aircrew have common programmable keysets, a mass memory unit, mission and flight management...

  18. Computational Models of the Eye and their Applications in Long Duration Space Flight

    NASA Technical Reports Server (NTRS)

    Chen, Richard; Best, Lauren; Mason, Kyle; Mulugeta, Lealem

    2011-01-01

    Astronauts are exposed to cephalad fluid shift, increased carbon dioxide levels and other environmental factors during space flight. As a result of these conditions, it is believed that they are at risk of developing increased intracranial pressure (ICP) and intraocular pressure (IOP), which in turn may cause papilledema and other disorders of the eye that can lead to temporary or permanent changes in vision. However, the mechanisms behind this risk are not fully understood. Ground analog and flight studies pose challenges because there are limited non-invasive methods that can be used to study the eye and intracranial space. Therefore it is proposed that computational models can be applied to help address this gap by providing a low cost method for studying the effects of IOP, ICP and various properties of the eye on these diseases. The information presented by the authors provides a summary of several models found in literature that could potentially be augmented and applied to inform research. Specifically, finite element models of the optic nerve head, sclera and other structures of the eye can be readily adapted as potential building blocks. These models may also be integrated with a brain/cerebrospinal fluid (CSF) model which will take into account the interaction between the CSF fluid and its pressure on the optic nerve. This integration can enable the study of the effects of microgravity on the interaction between the vasculature system and CSF system and can determine the effects of these changes on the optic nerve, and in turn the eye. Ultimately, it can help pinpoint the influences of long-term exposure to microgravity on vision and inform the future research into countermeasure development. In addition to spaceflight, these models can provide deeper understanding of the mechanisms of glaucoma, papilledema and other eye disorders observed in terrestrial conditions.

  19. Vision-Based Precision Landings of a Tailsitter UAV

    DTIC Science & Technology

    2010-04-01

    Figure 2.2: Schematic of the controller used in simulation. The block diagram shown in Figure 2.2 shows the simulation structure used to simulate the vision... the structure of the flight facility walls; any vibration applied to the structure would potentially change the pose of the cameras. Each camera's pose... relative to the target in Chapter 4, a flat earth assumption was made. In several situations the approximation that the ground over which the UAV is...

  20. Vision-based sensing for autonomous in-flight refueling

    NASA Astrophysics Data System (ADS)

    Scott, D.; Toal, M.; Dale, J.

    2007-04-01

    A significant capability of unmanned airborne vehicles (UAVs) is that they can operate tirelessly and at maximum efficiency in comparison to their human pilot counterparts. However, a major limiting factor preventing ultra-long endurance missions is that they require landing to refuel. Development effort has been directed to allow UAVs to automatically refuel in the air using current refueling systems and procedures. The 'hose & drogue' refueling system was targeted as it is considered the more difficult case. Recent flight trials resulted in the first-ever fully autonomous airborne refueling operation. Development has gone into precision GPS-based navigation sensors to maneuver the aircraft into the station-keeping position and onwards to dock with the refueling drogue. However, in the terminal phases of docking, the accuracy of the GPS is operating at its performance limit, and disturbance factors on the flexible hose and basket are not predictable using an open-loop model. Hence there is significant uncertainty in the position of the refueling drogue relative to the aircraft, and a GPS-only estimate is insufficient in practical operation to achieve a successful and safe docking. A solution is to augment the GPS-based system with a vision-based sensor component through the terminal phase to visually acquire and track the drogue in 3D space. The higher bandwidth and resolution of camera sensors give significantly better estimates of the state of the drogue position. Disturbances in the actual drogue position caused by subtle aircraft maneuvers and wind gusting can be visually tracked and compensated for, providing an accurate estimate. This paper discusses the issues involved in visually detecting a refueling drogue, selecting an optimum camera viewpoint, and acquiring and tracking the drogue throughout a widely varying operating range and conditions.
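
    As a concrete illustration of the visual-acquisition step described, the sketch below detects a roughly circular drogue in a synthetic camera frame with OpenCV's Hough circle transform. It is not the paper's detector, and the image geometry and thresholds are assumptions.

```python
# Illustrative sketch (not the paper's detector): acquire a roughly circular drogue
# in a camera frame with OpenCV's Hough circle transform, on a synthetic image.
import cv2
import numpy as np

# Synthetic camera frame: dark sky with a bright drogue ring (assumed geometry).
frame = np.zeros((480, 640), dtype=np.uint8)
cv2.circle(frame, center=(350, 260), radius=60, color=220, thickness=8)
frame = cv2.GaussianBlur(frame, (7, 7), 0)

circles = cv2.HoughCircles(frame, cv2.HOUGH_GRADIENT, dp=1.2, minDist=200,
                           param1=100, param2=30, minRadius=30, maxRadius=100)

if circles is not None:
    x, y, r = circles[0][0]
    print(f"drogue candidate at ({x:.0f}, {y:.0f}) px, radius {r:.0f} px")
    # The pixel offset from the image centre would feed the terminal-phase guidance
    # alongside the GPS-based relative navigation solution.
else:
    print("no drogue candidate found")
```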

  1. The space station freedom flight telerobotic servicer. The design and evolution of a dexterous space robot

    NASA Astrophysics Data System (ADS)

    McCain, Harry G.; Andary, James F.; Hewitt, Dennis R.; Haley, Dennis C.

    The Flight Telerobotic Servicer (FTS) Project at the Goddard Space Flight Center is developing an advanced telerobotic system to assist in and reduce crew extravehicular activity (EVA) for Space Station Freedom (SSF). The FTS will provide a telerobotic capability to the Freedom Station in the early assembly phases of the program and will be employed for assembly, maintenance, and inspection applications throughout the lifetime of the space station. Appropriately configured elements of the FTS will also be employed for robotic manipulation in remote satellite servicing applications and possibly the Lunar/Mars Program. In mid-1989, the FTS entered the flight system design and implementation phase (Phase C/D) of development with the signing of the FTS prime contract with Martin Marietta Astronautics Group in Denver, Colorado. The basic FTS design is now established and can be reported on in some detail. This paper will describe the FTS flight system design and the rationale for the specific design approaches and component selections. The current state of space technology and the general nature of the FTS task dictate that the FTS be designed with sophisticated teleoperation capabilities for its initial primary operating mode. However, there are technologies, such as advanced computer vision and autonomous planning techniques currently in research and advanced development phases which would greatly enhance the FTS capabilities to perform autonomously in less structured work environments. Therefore, a specific requirement on the initial FTS design is that it has the capability to evolve as new technology becomes available. This paper will describe the FTS design approach for evolution to more autonomous capabilities. Some specific task applications of the FTS and partial automation approaches of these tasks will also be discussed in this paper.

  2. The Space Station Freedom Flight Telerobotic Servicer: the design and evolution of a dexterous space robot.

    PubMed

    McCain, H G; Andary, J F; Hewitt, D R; Haley, D C

    1991-01-01

    The Flight Telerobotic Servicer (FTS) Project at the Goddard Space Flight Center is developing an advanced telerobotic system to assist in and reduce crew extravehicular activity (EVA) for Space Station Freedom (SSF). The FTS will provide a telerobotic capability to the Freedom Station in the early assembly phases of the program and will be employed for assembly, maintenance, and inspection applications throughout the lifetime of the space station. Appropriately configured elements of the FTS will also be employed for robotic manipulation in remote satellite servicing applications and possibly the Lunar/Mars Program. In mid-1989, the FTS entered the flight system design and implementation phase (Phase C/D) of development with the signing of the FTS prime contract with Martin Marietta Astronautics Group in Denver, Colorado. The basic FTS design is now established and can be reported on in some detail. This paper will describe the FTS flight system design and the rationale for the specific design approaches and component selections. The current state of space technology and the nature of the FTS task dictate that the FTS be designed with sophisticated teleoperation capabilities for its initial primary operating mode. However, there are technologies, such as advanced computer vision and autonomous planning techniques currently in research and advanced development phases which would greatly enhance the FTS capabilities to perform autonomously in less structured work environments. Therefore, a specific requirement on the initial FTS design is that it has the capability to evolve as new technology becomes available. This paper will describe the FTS design approach for evolution to more autonomous capabilities. Some specific task applications of the FTS and partial automation approaches of these tasks will also be discussed in this paper.

  3. The Space Station Freedom Flight Telerobotic Servicer: the design and evolution of a dexterous space robot

    NASA Technical Reports Server (NTRS)

    McCain, H. G.; Andary, J. F.; Hewitt, D. R.; Haley, D. C.

    1991-01-01

    The Flight Telerobotic Servicer (FTS) Project at the Goddard Space Flight Center is developing an advanced telerobotic system to assist in and reduce crew extravehicular activity (EVA) for Space Station Freedom (SSF). The FTS will provide a telerobotic capability to the Freedom Station in the early assembly phases of the program and will be employed for assembly, maintenance, and inspection applications throughout the lifetime of the space station. Appropriately configured elements of the FTS will also be employed for robotic manipulation in remote satellite servicing applications and possibly the Lunar/Mars Program. In mid-1989, the FTS entered the flight system design and implementation phase (Phase C/D) of development with the signing of the FTS prime contract with Martin Marietta Astronautics Group in Denver, Colorado. The basic FTS design is now established and can be reported on in some detail. This paper will describe the FTS flight system design and the rationale for the specific design approaches and component selections. The current state of space technology and the nature of the FTS task dictate that the FTS be designed with sophisticated teleoperation capabilities for its initial primary operating mode. However, there are technologies, such as advanced computer vision and autonomous planning techniques currently in research and advanced development phases which would greatly enhance the FTS capabilities to perform autonomously in less structured work environments. Therefore, a specific requirement on the initial FTS design is that it has the capability to evolve as new technology becomes available. This paper will describe the FTS design approach for evolution to more autonomous capabilities. Some specific task applications of the FTS and partial automation approaches of these tasks will also be discussed in this paper.

  4. Hypoxia and Coriolis Illusion in Pilots During Simulated Flight.

    PubMed

    Kowalczuk, Krzysztof P; Gazdzinski, Stefan P; Janewicz, Michał; Gąsik, Marek; Lewkowicz, Rafał; Wyleżoł, Mariusz

    2016-02-01

    Pilots' vision and flight performance may be impeded by spatial disorientation and high-altitude hypoxia. The Coriolis illusion affects both orientation and vision. However, the combined effect of simultaneous Coriolis illusion and hypoxia on saccadic eye movement has not been evaluated. A simulated flight was performed by 14 experienced pilots under 3 conditions: once under normal oxygen partial pressure and twice under reduced oxygen partial pressures, reflecting conditions at 5000 m and 6000 m (16,404 and 19,685 ft), respectively. Eye movements were evaluated with a saccadometer. At normal oxygen pressure, Coriolis illusion resulted in 55% and 31% increases in mean saccade amplitude and duration, respectively, but a 32% increase in mean saccade frequency was only noted for saccades smaller than the angular distance between cockpit instruments, suggesting an increase in the number of correction saccades. At lower oxygen pressures a pronounced increase in the standard deviation of all measures was noticed; however, the pattern of changes remained the same. Simple measures of saccadic movement are not affected by short-term hypoxia, most likely due to compensatory mechanisms.
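
    For readers unfamiliar with how such measures are derived, the sketch below detects saccades in a synthetic eye-position trace with a velocity threshold and reports amplitude, duration, and rate. It is an illustration only; the study's saccadometer processing, sampling rate, and thresholds may differ.

```python
# Illustrative sketch (synthetic trace, not the study's saccadometer pipeline):
# detect saccades with a velocity threshold and report amplitude, duration, rate.
import numpy as np

fs = 500.0                                    # sample rate [Hz] (assumed)
t = np.arange(0, 10, 1 / fs)                  # 10 s recording
eye = np.zeros_like(t)                        # horizontal eye position [deg]

# Build a step-like trace: an 8-deg saccade every 2 s, each lasting about 40 ms.
for onset in np.arange(1.0, 9.0, 2.0):
    idx = (t >= onset) & (t < onset + 0.04)
    eye[t >= onset + 0.04] += 8.0
    eye[idx] += 8.0 * (t[idx] - onset) / 0.04
eye += np.random.default_rng(5).normal(0, 0.02, eye.size)   # measurement noise

velocity = np.gradient(eye, 1 / fs)           # deg/s
moving = np.abs(velocity) > 50.0              # velocity threshold (assumed 50 deg/s)

# Group consecutive supra-threshold samples into saccade events.
edges = np.flatnonzero(np.diff(moving.astype(int)))
starts, ends = edges[::2] + 1, edges[1::2] + 1
amplitudes = [abs(eye[e] - eye[s]) for s, e in zip(starts, ends)]
durations = [(e - s) / fs for s, e in zip(starts, ends)]

print(f"{len(starts)} saccades, mean amplitude {np.mean(amplitudes):.1f} deg, "
      f"mean duration {1000 * np.mean(durations):.0f} ms, rate {len(starts) / t[-1]:.2f} /s")
```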

  5. Spaceflight-Induced Intracranial Hypertension: An Overview

    NASA Technical Reports Server (NTRS)

    Traver, William J.

    2011-01-01

    This slide presentation is an overview of some of the known results of spaceflight-induced intracranial hypertension. Historical information from Gemini 5, Apollo, and the Space Shuttle programs indicated that some vision impairment was reported, and a comparison between these historical missions and present missions is included. Optic disc edema, globe flattening, choroidal folds, hyperopic shifts, and raised intracranial pressure have occurred in astronauts during and after long duration space flight. Views illustrate the occurrence of optic disc edema, globe flattening, and choroidal folds. There are views of the arachnoid granulations and venous return, and the question of spinal or venous compliance issues is discussed. The question of increased blood flow and its relation to increased cerebrospinal fluid (CSF) is raised. Most observed on-orbit papilledema does not progress, and this might be a function of plateau homeostasis at the higher level of intracranial pressure. Seven cases of astronauts experiencing in-flight and post-flight symptoms are summarized, and follow-up is reviewed along with a comparison of the treatment options. The question of whether there is other involvement besides vision, as well as other clinical implications, is also raised.

  6. iPAS: AES Flight System Technology Maturation for Human Spaceflight

    NASA Technical Reports Server (NTRS)

    Othon, William L.

    2014-01-01

    In order to realize the vision of expanding human presence in space, NASA will develop new technologies that can enable future crewed spacecraft to go far beyond Earth orbit. These technologies must be matured to the point that future project managers can accept the risk of incorporating them safely and effectively within integrated spacecraft systems, to satisfy very challenging mission requirements. The technologies must also be applied and managed within an operational context that includes both on-board crew and mission support on Earth. The Advanced Exploration Systems (AES) Program is one part of the NASA strategy to identify and develop key capabilities for human spaceflight, and mature them for future use. To support this initiative, the Integrated Power Avionics and Software (iPAS) environment has been developed that allows engineers, crew, and flight operators to mature promising technologies into applicable capabilities, and to assess the value of these capabilities within a space mission context. This paper describes the development of the integration environment to support technology maturation and risk reduction, and offers examples of technology and mission demonstrations executed to date.

  7. Weather Requirements and Procedures for Step 1: High Altitude Long Endurance (HALE) Unmanned Aircraft System (UAS) Flight Operations in the National Air Space (NAS)

    NASA Technical Reports Server (NTRS)

    2007-01-01

    This cover sheet is for version 2 of the weather requirements document along with Appendix A. The purpose of the requirements document was to identify and to list the weather functional requirements needed to achieve the Access 5 vision of "operating High Altitude, Long Endurance (HALE) Unmanned Aircraft Systems (UAS) routinely, safely, and reliably in the National Airspace System (NAS) for Step 1." A discussion of the Federal Aviation Administration (FAA) references and related policies, procedures, and standards is provided as basis for the recommendations supported within this document. Additional procedures and reference documentation related to weather functional requirements is also provided for background. The functional requirements and related information are to be proposed to the FAA and various standards organizations for consideration and approval. The appendix was designed to show that sources of flight weather information are readily available to UAS pilots conducting missions in the NAS. All weather information for this presentation was obtained from the public internet.

  8. Revitalizing the Space Shuttle's Thermal Protection System with Reverse Engineering and 3D Vision Technology

    NASA Technical Reports Server (NTRS)

    Wilson, Brad; Galatzer, Yishai

    2008-01-01

    The Space Shuttle is protected by a Thermal Protection System (TPS) made of tens of thousands of individually shaped heat protection tiles. With every flight, tiles are damaged on take-off and return to earth. After each mission, the heat tiles must be fixed or replaced depending on the level of damage. As part of the return to flight mission, the TPS requirements are more stringent, leading to a significant increase in heat tile replacements. The replacement operation requires scanning tile cavities, and in some cases the actual tiles. The 3D scan data is used to reverse engineer each tile into a precise CAD model, which, in turn, is exported to a CAM system for the manufacture of the heat protection tile. Scanning is performed while other activities are going on in the shuttle processing facility. Many technicians work simultaneously on the space shuttle structure, which results in structural movements and vibrations. This paper will cover a portable, ultra-fast data acquisition approach used to scan surfaces in this unstable environment.

  9. Flying by Ear: Blind Flight with a Music-Based Artificial Horizon

    NASA Technical Reports Server (NTRS)

    Simpson, Brian D.; Brungart, Douglas S.; Dallman, Ronald C.; Yasky, Richard J., Jr.; Romigh, Griffin

    2008-01-01

    Two experiments were conducted in actual flight operations to evaluate an audio artificial horizon display that imposed aircraft attitude information on pilot-selected music. The first experiment examined a pilot's ability to identify, with vision obscured, a change in aircraft roll or pitch, with and without the audio artificial horizon display. The results suggest that the audio horizon display improves the accuracy of attitude identification overall, but differentially affects response time across conditions. In the second experiment, subject pilots performed recoveries from displaced aircraft attitudes using either standard visual instruments, or, with vision obscured, the audio artificial horizon display. The results suggest that subjects were able to maneuver the aircraft to within its safety envelope. Overall, pilots were able to benefit from the display, suggesting that such a display could help to improve overall safety in general aviation.
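
    The sketch below shows one simple way attitude can be sonified, mapping bank angle to stereo balance and pitch attitude to tone frequency and writing a short stereo WAV file. This mapping is an assumption for illustration; it is not the music-based display evaluated in the paper.

```python
# Illustrative sketch (not the paper's display): sonify aircraft attitude by mapping
# bank angle to stereo balance and pitch attitude to tone frequency.
import math
import struct
import wave

def attitude_tone(roll_deg, pitch_deg, seconds=1.0, fs=22050):
    """Return interleaved stereo int16 samples encoding one attitude (assumed mapping)."""
    freq = 440.0 * 2.0 ** (pitch_deg / 30.0)        # nose up -> higher tone
    pan = max(-1.0, min(1.0, roll_deg / 60.0))      # roll right -> louder right channel
    left_gain, right_gain = (1.0 - pan) / 2.0, (1.0 + pan) / 2.0
    frames = bytearray()
    for n in range(int(seconds * fs)):
        s = math.sin(2.0 * math.pi * freq * n / fs)
        frames += struct.pack("<hh", int(32767 * left_gain * s), int(32767 * right_gain * s))
    return bytes(frames)

with wave.open("attitude_demo.wav", "wb") as wav:
    wav.setnchannels(2)
    wav.setsampwidth(2)
    wav.setframerate(22050)
    # Wings level and nose level, then a 30-degree right bank with the nose 10 degrees up.
    wav.writeframes(attitude_tone(0.0, 0.0) + attitude_tone(30.0, 10.0))
```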

  10. Influence of vision and dental occlusion on body posture in pilots.

    PubMed

    Baldini, Alberto; Nota, Alessandro; Cravino, Gaia; Cioffi, Clementina; Rinaldi, Antonio; Cozza, Paola

    2013-08-01

    Air force pilots have great postural control, movement coordination, motor learning, and motor transformation. They undergo abnormal stresses during flight that affect their organs and systems, with consequences such as barodontalgia, bruxism, TMJ dysfunctions, and cervical pain. The aim of this study was to evaluate the influence of dental occlusion and vision on their body posture. In collaboration with the "A. Mosso" Legal Medical Institute (Aeronautica Militare), two groups, consisting of 20 air force and 20 civilian pilots, were selected for the study using a protocol approved by the Italian Air Force. An oral examination and a force platform test were performed in order to evaluate the subjects' postural system efficiency. A MANOVA (multivariate analysis of variance) was performed using Wilks' criterion, in order to statistically evaluate the influence of each factor. Both the sway area and velocity parameters are very strongly influenced by vision: the sway area increases by approximately 32% and the sway velocity increases by approximately 50% when the pilot closes his eyes. Only the sway area parameter was significantly influenced by the mandibular position: the mandibular position with eyes open changed the sway area by about 51% and with eyes closed by about 40%. No statistically significant differences were found between air force and civilian pilots. The results of this analysis show that occlusion and visual function could influence posture in air force and civilian pilots.
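
    The two force-platform measures reported, sway area and sway velocity, can be computed from a centre-of-pressure trace as sketched below with synthetic data; the 95% confidence-ellipse definition of sway area and the sampling values are assumptions, not the study's software.

```python
# Illustrative sketch (synthetic data, not the study's software): compute sway area
# (95% confidence ellipse of the centre-of-pressure path) and mean sway velocity.
import numpy as np

fs, seconds = 100.0, 30.0                 # sample rate and trial length (assumed)
rng = np.random.default_rng(6)

# Synthetic centre-of-pressure trace [mm]: slow drift plus noise in AP and ML axes.
n = int(fs * seconds)
cop = np.cumsum(rng.normal(0, 0.05, size=(n, 2)), axis=0) + rng.normal(0, 0.2, size=(n, 2))

# Sway area: area of the 95% confidence ellipse from the covariance of the path.
cov = np.cov(cop, rowvar=False)
eigvals = np.linalg.eigvalsh(cov)
chi2_95 = 5.991                            # chi-square quantile, 2 dof, 95%
sway_area = np.pi * chi2_95 * np.sqrt(eigvals[0] * eigvals[1])   # mm^2

# Mean sway velocity: total path length divided by trial duration.
path_length = np.sum(np.linalg.norm(np.diff(cop, axis=0), axis=1))
mean_velocity = path_length / seconds      # mm/s

print(f"sway area {sway_area:.1f} mm^2, mean velocity {mean_velocity:.1f} mm/s")
```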

  11. Drogue pose estimation for unmanned aerial vehicle autonomous aerial refueling system based on infrared vision sensor

    NASA Astrophysics Data System (ADS)

    Chen, Shanjun; Duan, Haibin; Deng, Yimin; Li, Cong; Zhao, Guozhi; Xu, Yan

    2017-12-01

    Autonomous aerial refueling is a key technology that can significantly extend the endurance of unmanned aerial vehicles. A reliable method that can accurately estimate the position and attitude of the probe relative to the drogue is the key to such a capability. A drogue pose estimation method based on an infrared vision sensor is introduced with the general goal of yielding an accurate and reliable drogue state estimate. First, by employing direct least squares ellipse fitting and convex hull in OpenCV, a feature point matching and interference point elimination method is proposed. In addition, considering conditions in which some infrared LEDs are damaged or occluded, a missing point estimation method based on perspective transformation and affine transformation is designed. Finally, an accurate and robust pose estimation algorithm improved by the runner-root algorithm is proposed. The feasibility of the designed visual measurement system is demonstrated by flight test, and the results indicate that our proposed method enables precise and reliable pose estimation of the probe relative to the drogue, even in some poor conditions.
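
    The abstract names direct least-squares ellipse fitting and convex-hull-based interference rejection in OpenCV. The sketch below combines those two calls on synthetic infrared marker detections; the marker layout and noise are assumptions, and this is not the authors' implementation.

```python
# Illustrative sketch (synthetic points, not the authors' code): reject interior
# interference points with a convex hull, then fit an ellipse to the drogue markers.
import cv2
import numpy as np

rng = np.random.default_rng(7)

# Synthetic infrared LED detections on an elliptical drogue rim (image coordinates).
angles = np.linspace(0, 2 * np.pi, 12, endpoint=False)
rim = np.stack([320 + 90 * np.cos(angles), 240 + 60 * np.sin(angles)], axis=1)
rim += rng.normal(0, 1.0, rim.shape)                       # detection noise
interference = np.array([[330.0, 245.0], [300.0, 230.0]])  # reflections inside the rim
points = np.vstack([rim, interference]).astype(np.float32)

# Convex hull keeps the outer rim markers and drops the interior interference points.
hull = cv2.convexHull(points)                              # shape (k, 1, 2)

# Direct least-squares ellipse fit to the remaining markers (needs >= 5 points).
(cx, cy), (ax1, ax2), angle_deg = cv2.fitEllipse(hull)
print(f"ellipse centre ({cx:.1f}, {cy:.1f}) px, axes ({ax1:.1f}, {ax2:.1f}) px, angle {angle_deg:.1f} deg")
```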

  12. Point-to-Point! Validation of the Small Aircraft Transportation System Higher Volume Operations Concept

    NASA Technical Reports Server (NTRS)

    Williams, Daniel M.

    2006-01-01

    Described is the research process that NASA researchers used to validate the Small Aircraft Transportation System (SATS) Higher Volume Operations (HVO) concept. The four phase building-block validation and verification process included multiple elements ranging from formal analysis of HVO procedures to flight test, to full-system architecture prototype that was successfully shown to the public at the June 2005 SATS Technical Demonstration in Danville, VA. Presented are significant results of each of the four research phases that extend early results presented at ICAS 2004. HVO study results have been incorporated into the development of the Next Generation Air Transportation System (NGATS) vision and offer a validated concept to provide a significant portion of the 3X capacity improvement sought after in the United States National Airspace System (NAS).

  13. Testing the Efficacy of Synthetic Vision during Non-Normal Operations as an Enabling Technology for Equivalent Visual Operations

    NASA Technical Reports Server (NTRS)

    Kramer, Lynda J.; Williams, Steven P.

    2008-01-01

    Synthetic Vision (SV) may serve as a revolutionary crew/vehicle interface enabling technology to meet the challenges of the Next Generation Air Transportation System Equivalent Visual Operations (EVO) concept, that is, the ability to achieve or even improve on the safety of Visual Flight Rules (VFR) operations, maintain the operational tempos of VFR, and potentially retain VFR procedures independent of actual weather and visibility conditions. One significant challenge lies in the definition of required equipage on the aircraft and on the airport to enable the EVO concept objective. An experiment was conducted to evaluate the effects of the presence or absence of SV, the location (head-up or head-down) of this information during an instrument approach, and the type of airport lighting information on landing minima. Another key element of the testing entailed investigating the pilot's awareness of, and reaction to, non-normal events (i.e., failure conditions) that were unexpectedly introduced into the experiment. These non-normal events are critical determinants of the underlying safety of all-weather operations. This paper presents the experimental results specific to pilot response to non-normal events using head-up and head-down synthetic vision displays.

  14. Fluid Shifts Before, During, and After Prolonged Space Flight and their Association with Intracranial Pressure and Visual Impairment

    NASA Technical Reports Server (NTRS)

    Stenger, M.; Lee, S.; Platts, S.; Macias, B.; Lui, J.; Ebert, D.; Sargsyan, A.; Dulchavsky, S.; Alferova, I.; Yarmanova, E.

    2013-01-01

    With the conclusion of the Space Shuttle program, NASA is focusing on long-duration missions on the International Space Station (ISS) and future exploration-class missions beyond low Earth orbit. Visual acuity changes observed in Space Shuttle crewmembers after their short-duration missions were largely transient, but more than 30% of ISS astronauts experience more profound changes in vision, some with objective structural and functional findings such as papilledema and choroidal folds on ophthalmologic examination. Globe flattening, optic nerve sheath dilatation, optic nerve tortuosity, and other findings have been noted in imaging studies. This pattern is referred to as visual impairment and intracranial pressure (VIIP) syndrome. The VIIP signs and symptoms, as well as postflight lumbar puncture data, suggest that elevated intracranial pressure (ICP) is associated with the space flight-induced cephalad fluid shifts, but this hypothesis has not been systematically tested. The purpose of this study is to objectively characterize the fluid distribution and compartmentalization associated with long-duration space flight, and to correlate the findings with vision changes and other elements of the VIIP syndrome. We also seek to determine whether the magnitude of fluid shifts during space flight, as well as the VIIP-related effects of those shifts, can be predicted by crewmember baseline data and responses to acute hemodynamic manipulations (such as head-down tilt tests) obtained before flight. Lastly, we will evaluate the patterns of fluid distribution in astronaut subjects on the ISS during the use of lower body negative pressure (LBNP) and respiratory maneuvers to characterize and explain general and individual responses during space flight.

  15. Dynamic Tunnel Usability Study: Format Recommendations for Synthetic Vision System Primary Flight Displays

    NASA Technical Reports Server (NTRS)

    Arthur, Jarvis J., III; Prinzel, Lawrence J., III; Kramer, Lynda J.; Bailey, Randall E.

    2006-01-01

    A usability study evaluating dynamic tunnel concepts has been completed under the Aviation Safety and Security Program, Synthetic Vision Systems Project. The usability study was conducted in the Visual Imaging Simulator for Transport Aircraft Systems (VISTAS) III simulator in the form of questionnaires and pilot-in-the-loop simulation sessions. Twelve commercial pilots participated in the study to determine their preferences via paired comparisons and subjective rankings regarding the color, line thickness and sensitivity of the dynamic tunnel. The results of the study showed that color was not significant in pilot preference paired comparisons or in pilot rankings. Line thickness was significant for both pilot preference paired comparisons and in pilot rankings. The preferred line/halo thickness combination was a line width of 3 pixels and a halo of 4 pixels. Finally, pilots were asked their preference for the current dynamic tunnel compared to a less sensitive dynamic tunnel. The current dynamic tunnel constantly gives feedback to the pilot with regard to path error while the less sensitive tunnel only changes as the path error approaches the edges of the tunnel. The tunnel sensitivity comparison results were not statistically significant.

  16. MIT-NASA Workshop: Transformational Technologies

    NASA Technical Reports Server (NTRS)

    Mankins, J. C. (Editor); Christensen, C. B.; Gresham, E. C.; Simmons, A.; Mullins, C. A.

    2005-01-01

    As a spacefaring nation, we are at a critical juncture in the evolution of space exploration. NASA has announced its Vision for Space Exploration, a vision of returning humans to the Moon, sending robots and eventually humans to Mars, and exploring the outer solar system via automated spacecraft. However, mission concepts have become increasingly complex, with the potential to yield a wealth of scientific knowledge. Meanwhile, there are significant resource challenges to be met: launch costs remain a barrier to routine space flight; the ever-changing fiscal and political environments can wreak havoc on mission planning; and because technologies are constantly improving, systems that were state of the art when a program began can become outmoded before a mission is even launched. This Conference Publication describes the workshop and featured presentations by world-class experts presenting leading-edge technologies and applications in the areas of power and propulsion; communications; automation, robotics, computing, and intelligent systems; and transformational techniques for space activities. Workshops such as this one provide an excellent medium for capturing the broadest possible array of insights and expertise, learning from researchers in universities, national laboratories, NASA field Centers, and industry to help better our future in space.

  17. Nocturnal insects use optic flow for flight control.

    PubMed

    Baird, Emily; Kreiss, Eva; Wcislo, William; Warrant, Eric; Dacke, Marie

    2011-08-23

    To avoid collisions when navigating through cluttered environments, flying insects must control their flight so that their sensory systems have time to detect obstacles and avoid them. To do this, day-active insects rely primarily on the pattern of apparent motion generated on the retina during flight (optic flow). However, many flying insects are active at night, when obtaining reliable visual information for flight control presents much more of a challenge. To assess whether nocturnal flying insects also rely on optic flow cues to control flight in dim light, we recorded flights of the nocturnal neotropical sweat bee, Megalopta genalis, flying along an experimental tunnel when: (i) the visual texture on each wall generated strong horizontal (front-to-back) optic flow cues, (ii) the texture on only one wall generated these cues, and (iii) horizontal optic flow cues were removed from both walls. We find that Megalopta increase their groundspeed when horizontal motion cues in the tunnel are reduced (conditions (ii) and (iii)). However, differences in the amount of horizontal optic flow on each wall of the tunnel (condition (ii)) do not affect the centred position of the bee within the flight tunnel. To better understand the behavioural response of Megalopta, we repeated the experiments on day-active bumble-bees (Bombus terrestris). Overall, our findings demonstrate that despite the limitations imposed by dim light, Megalopta, like their day-active relatives, rely heavily on vision to control flight, but that they use visual cues in a different manner from diurnal insects.
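
    For readers unfamiliar with the cue being manipulated here, the horizontal image motion generated by a wall pattern can be quantified from video with a dense optical flow estimate. The sketch below is not the authors' analysis pipeline; the video file name and the Farneback parameters are assumptions chosen only for illustration.

      # Estimate mean front-to-back (horizontal) image motion between consecutive frames.
      import cv2
      import numpy as np

      cap = cv2.VideoCapture("tunnel_wall.mp4")          # assumed recording of the tunnel wall
      ok, prev = cap.read()
      prev_gray = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)

      while True:
          ok, frame = cap.read()
          if not ok:
              break
          gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
          # Farneback dense flow: pyr_scale=0.5, levels=3, winsize=15, iterations=3, poly_n=5, poly_sigma=1.2
          flow = cv2.calcOpticalFlowFarneback(prev_gray, gray, None, 0.5, 3, 15, 3, 5, 1.2, 0)
          horizontal_flow = flow[..., 0].mean()          # mean x-component ~ front-to-back motion cue
          print(f"mean horizontal optic flow: {horizontal_flow:.2f} px/frame")
          prev_gray = gray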

  18. Unmanned aircraft systems image collection and computer vision image processing for surveying and mapping that meets professional needs

    NASA Astrophysics Data System (ADS)

    Peterson, James Preston, II

    Unmanned Aerial Systems (UAS) are rapidly blurring the lines between traditional and close-range photogrammetry, and between surveying and photogrammetry. UAS provide an economic platform for performing aerial surveying on small projects. The focus of this research was to describe traditional photogrammetric imagery and Light Detection and Ranging (LiDAR) geospatial products, describe close-range photogrammetry (CRP), introduce UAS and computer vision (CV), and investigate whether industry mapping standards for accuracy can be met using UAS collection and CV processing. A 120-acre site was selected and 97 aerial targets were surveyed for evaluation purposes. Four UAS flights at varying heights above ground level (AGL) were executed, and three different target patterns of varying distances between targets were analyzed for compliance with American Society for Photogrammetry and Remote Sensing (ASPRS) and National Standard for Spatial Data Accuracy (NSSDA) mapping standards. This analysis resulted in twelve datasets. Error patterns were evaluated and reasons for these errors were determined. The relationship between the AGL, ground sample distance, target spacing, and the root mean square error of the targets is exploited by this research to develop guidelines that use the ASPRS and NSSDA map standards as the template. These guidelines allow the user to select the desired mapping accuracy and determine what target spacing and AGL are required to produce that accuracy. The guidelines also address how UAS/CV phenomena affect map accuracy. General guidelines and recommendations are presented that give the user helpful information for planning a UAS flight using CV technology.
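
    The accuracy bookkeeping behind an NSSDA assessment reduces to computing root mean square error at surveyed checkpoints and scaling it to a 95%-confidence accuracy value. The sketch below uses the standard NSSDA multipliers (1.7308 for horizontal accuracy, assuming roughly equal RMSE in x and y, and 1.9600 for vertical); the checkpoint coordinates it would be fed are hypothetical, not the data from this study.

      import numpy as np

      def nssda_accuracy(surveyed_xyz, mapped_xyz):
          """surveyed_xyz, mapped_xyz: (N, 3) arrays of checkpoint coordinates (same units)."""
          d = np.asarray(mapped_xyz) - np.asarray(surveyed_xyz)
          rmse_x, rmse_y, rmse_z = np.sqrt((d ** 2).mean(axis=0))
          rmse_r = np.sqrt(rmse_x ** 2 + rmse_y ** 2)
          return {
              "horizontal_95": 1.7308 * rmse_r,   # NSSDA horizontal accuracy at 95% confidence
              "vertical_95": 1.9600 * rmse_z,     # NSSDA vertical accuracy at 95% confidence
          }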

  19. What Makes Earth and Space Science Sexy? A Model for Developing Systemic Change in Earth and Space Systems Science Curriculum and Instruction

    NASA Astrophysics Data System (ADS)

    Slutskin, R. L.

    2001-12-01

    Earth and Space Science may be the neglected child in the family of high school sciences. In this session, we examine the strategies that Anne Arundel County Public Schools and NASA Goddard Space Flight Center used to develop a dynamic and highly engaging program which follows the vision of the National Science Education Standards, is grounded in key concepts of NASA's Earth Science Directorate, and allows students to examine and apply the current research of NASA scientists. Find out why Earth/Space Systems Science seems to have usurped biology and has made students, principals, and teachers clamor for similar instructional practices in what is traditionally thought of as the "glamorous" course.

  20. The International Space Station: Operations and Assembly - Learning From Experiences - Past, Present, and Future

    NASA Technical Reports Server (NTRS)

    Fuller, Sean; Dillon, William F.

    2006-01-01

    As the Space Shuttle continues flight, construction and assembly of the International Space Station (ISS) carries on as the United States and our International Partners resume building, and continue the daily operations of, this impressive and historic Earth-orbiting research facility. In his January 14, 2004, speech announcing a new vision for America's space program, President Bush ratified the United States' commitment to completing construction of the ISS by 2010. Since the launch and joining of the first two elements in 1998, the ISS and the partnership have experienced and overcome many challenges to assembly and operations, along with accomplishing many impressive achievements and historic firsts. These experiences and achievements have shaped our strategy, planning, and expectations. The continual operation and assembly of the ISS leads to new knowledge about the design, development, and operation of systems and hardware that will be utilized in the development of new deep-space vehicles needed to fulfill the Vision for Exploration, and generates the data and information that will enable our programs to return to the Moon and continue on to Mars. This paper will provide an overview of the complexity of the ISS Program, including a historical review of the major assembly events and operational milestones of the program, along with the upcoming assembly plans and the scheduled missions of the Space Shuttle flights and the ISS assembly sequence.

  1. Terrain Portrayal for Head-Down Displays Flight Test

    NASA Technical Reports Server (NTRS)

    Hughes, Monica F.; Glaab, Louis J.

    2003-01-01

    The Synthetic Vision Systems General Aviation (SVS-GA) element of NASA's Aviation Safety Program is developing technology to eliminate low-visibility-induced General Aviation (GA) accidents through the application of synthetic vision techniques. SVS displays present computer-generated 3-dimensional imagery of the surrounding terrain to greatly enhance pilots' situation awareness (SA), reducing or eliminating Controlled Flight into Terrain (CFIT) as well as Low-Visibility Loss of Control (LVLOC) accidents. In addition to substantial safety benefits, SVS displays have many potential operational benefits that can lead to flight in instrument meteorological conditions (IMC) resembling that conducted in visual meteorological conditions (VMC). Potential benefits could include lower landing minimums, more approach options, reduced training time, etc. SVS research will develop display concepts providing the pilot with an unobstructed view of the outside terrain, regardless of weather conditions and time of day. A critical component of SVS displays is the appropriate presentation of terrain to the pilot. The relationship between the realism of the terrain presentation and the resulting enhancements of pilot SA and pilot performance has been largely undefined. Comprising coordinated simulation and flight test efforts, the terrain portrayal for head-down displays (TP-HDD) test series examined the effects of two primary elements of terrain portrayal: variations of digital elevation model (DEM) resolution and terrain texturing. Variations in DEM resolution ranged from sparsely spaced (30 arc-sec/2,953 ft) to very closely spaced data (1 arc-sec/98 ft). Variations in texture involved three primary methods: constant color, elevation-based generic, and photo-realistic, along with a secondary depth-cue enhancer in the form of a fishnet grid overlay. The TP-HDD test series was designed to provide comprehensive data to enable design trades to optimize all SVS applications, as well as to develop requirements and recommendations to facilitate the implementation and certification of SVS displays. The TP-HDD flight experiment utilized the NASA LaRC Cessna 206 Stationair and evaluated eight terrain portrayal concepts in an effort to confirm and extend results from the previously conducted TP-HDD simulation experiment. A total of 15 evaluation pilots, of various qualifications, accumulated over 75 hours of dedicated research flight time at Newport News (PHF) and Roanoke (ROA), VA, airports from August through October 2002. This report presents results from the portion of testing conducted at Roanoke, VA.

  2. Astronaut training for STS 41-D mission

    NASA Technical Reports Server (NTRS)

    1984-01-01

    Astronauts David C. Leestma and Kathryn D. Sullivan, two of three 41-D mission specialists, rehearse some of the duties they will be performing on their flight. Dr. Sullivan holds the Krimsky rule against her cheekbones as part of an ongoing Shuttle study on near vision acuity. Astronaut Leestma reviews a flight data file flipbook. They are seated on the floor of the Space Shuttle Simulator, in front of the forward middeck lockers.

  3. Exploration Architecture Options - ECLSS, TCS, EVA Implications

    NASA Technical Reports Server (NTRS)

    Chambliss, Joe; Henninger, Don

    2011-01-01

    Many options for the exploration of space have been identified and evaluated since the Vision for Space Exploration (VSE) was announced in 2004. The Augustine Commission evaluated human space flight for the Obama administration; the Human Exploration Framework Teams (HEFT and HEFT2) then evaluated potential exploration missions and the infrastructure and technology needs for those missions. Lunar architectures have been identified and addressed by the Lunar Surface Systems team to establish options for how to get to, and then inhabit and explore, the Moon. This paper evaluates the options for exploration of space in terms of the implications of the architectures for the Environmental Control and Life Support System (ECLSS), Thermal Control System (TCS), and Extravehicular Activity (EVA) systems.

  4. Real-time Terrain Relative Navigation Test Results from a Relevant Environment for Mars Landing

    NASA Technical Reports Server (NTRS)

    Johnson, Andrew E.; Cheng, Yang; Montgomery, James; Trawny, Nikolas; Tweddle, Brent; Zheng, Jason

    2015-01-01

    Terrain Relative Navigation (TRN) is an on-board GN&C function that generates a position estimate of a spacecraft relative to a map of a planetary surface. When coupled with a divert, the position estimate enables access to more challenging landing sites through pin-point landing or large hazard avoidance. The Lander Vision System (LVS) is a smart sensor system that performs terrain relative navigation by matching descent camera imagery to a map of the landing site and then fusing this with inertial measurements to obtain high-rate map-relative position, velocity, and attitude estimates. A prototype of the LVS was recently tested in a helicopter field test over Mars-analog terrain at altitudes representative of Mars Entry, Descent, and Landing conditions. TRN ran in real time on the LVS during the flights without human intervention or tuning. The system was able to compute estimates accurate to 40 m (3 sigma) in 10 seconds on a flight-like processing system. This paper describes the Mars operational test space definition, how the field test was designed to cover that operational envelope, the resulting TRN performance across the envelope, and an assessment of test space coverage.
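
    In its simplest conceptual form, the map-matching step described above amounts to locating a descent-camera view within an orthorectified map to produce a map-relative position fix, which a navigation filter then fuses with inertial data. The sketch below is only that simplest form, normalized cross-correlation with no handling of scale, rotation, or illumination, and is certainly not the LVS algorithm; the image files and map resolution are assumptions.

      import cv2
      import numpy as np

      map_img = cv2.imread("landing_site_map.png", cv2.IMREAD_GRAYSCALE)   # assumed orthomap
      descent = cv2.imread("descent_frame.png", cv2.IMREAD_GRAYSCALE)      # assumed camera frame

      result = cv2.matchTemplate(map_img, descent, cv2.TM_CCOEFF_NORMED)
      _, score, _, top_left = cv2.minMaxLoc(result)                        # best correlation peak

      METERS_PER_PIXEL = 1.0                                               # map resolution (assumed)
      center_px = (top_left[0] + descent.shape[1] / 2, top_left[1] + descent.shape[0] / 2)
      position_m = (center_px[0] * METERS_PER_PIXEL, center_px[1] * METERS_PER_PIXEL)
      print(f"map-relative fix: {position_m} (correlation score {score:.2f})")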

  5. Development of an In Flight Vision Self-Assessment Questionnaire for Long Duration Space Missions

    NASA Technical Reports Server (NTRS)

    Byrne, Vicky E.; Gibson, Charles R.; Pierpoline, Katherine M.

    2010-01-01

    OVERVIEW: A NASA Flight Medicine optometrist teamed with a human factors specialist to develop an electronic questionnaire for crewmembers to record their visual acuity test scores and perceived vision assessment. It will be implemented on the International Space Station (ISS) and administered as part of a suite of tools for early detection of potential vision changes. The goal of this effort was to rapidly develop a set of questions to help in early detection of visual (e.g., blurred vision) and/or non-visual (e.g., headaches) symptoms by allowing the ISS crewmembers to think about their own current vision during their spaceflight missions. PROCESS: An iterative process began with a one-page Space Shuttle paper questionnaire generated by the optometrist and updated by applying human factors design principles. It was used as a baseline to establish an electronic questionnaire for ISS missions. Additional questions needed for the ISS missions were included, and the information was organized to take advantage of the computer-based file format available. Human factors heuristics were applied to the prototype, which was then reviewed by the optometrist and procedures specialists, with rapid-turnaround updates that led to the final questionnaire. CONCLUSIONS: With only about a month of lead time, a usable tool to collect crewmember assessments was developed through this cross-discipline collaboration. For a small expenditure of effort, the potential payoff is great. ISS crewmembers will complete the questionnaire at 30 days into the mission, 100 days into the mission, and 30 days prior to return to Earth. The systematic layout may also facilitate the physicians' later data extraction for quick interpretation of the data. The data collected, along with other measures (e.g., retinal and ultrasound imaging) at regular intervals, could potentially lead to earlier detection and treatment of related vision problems than using the other measures alone.

  6. Evaluation of Synthetic Vision Display Concepts for Improved Awareness in Unusual Attitude Recovery Scenarios

    NASA Technical Reports Server (NTRS)

    Nicholas, Stephanie

    2016-01-01

    A recent study conducted by the Commercial Aviation Safety Team (CAST) determined that 40 percent of all fixed-wing fatal accidents between 2001 and 2011 were caused by Loss of Control (LOC) in flight (National Transportation Safety Board, 2015). Based on these findings, CAST recommended that manufacturers develop and implement virtual day-visual meteorological conditions (VMC) display systems, such as synthetic vision or equivalent systems (CAST, 2016). In a 2015 simulation study conducted at NASA Langley Research Center (LaRC), researchers tested and evaluated virtual day-VMC displays under realistic flight operation scenarios capable of inducing reduced attention states in pilots. Each display concept was evaluated to determine its efficacy in improving attitude awareness. During the experiment, Evaluation Pilots (EPs) were shown the following three display concepts on the Primary Flight Display (PFD): Baseline, Synthetic Vision (SV) with color gradient, and SV with texture. The baseline configuration was a standard, conventional 'blue over brown' display. Experiment scenarios were simulated over water to evaluate Unusual Attitude (UA) recovery over 'featureless terrain' environments. Thus, the SV with color gradient configuration presented a 'blue over blue' display with a linear blue color progression to differentiate attitude changes between sky and ocean. The SV with texture configuration presented a 'blue over blue' display with a black checkerboard texture atop a synthetic ocean. These displays were paired with a Background Attitude Indicator (BAI) concept. The BAI was presented across all four Head-Down Displays (HDDs), displaying a wide field-of-view blue-over-blue attitude indicator. The BAI aligned with the PFD and showed through the background of the navigation displays with partial transparency. Each EP participated in a two-part experiment series with a total of seventy-five trial runs: Part I included a set of twenty-five Unusual Attitude Recovery (UAR) scenarios; Part II included a set of fifty Attitude Memory Recall Tasks (AMRT). At the conclusion of each trial, EPs were asked to complete a set of post-run questionnaires. Quantitative results showed no statistically significant effects on UA recovery times when utilizing SV with or without the presence of a BAI. Qualitative results show that the SV displays (color, texture) with the BAI on are most preferred for both UA recognition and recovery when compared with the baseline display. When comparing only the SV display concepts, EPs performed better when using SV with texture, BAI on, than with any other display configuration. This is an interesting finding, considering most EPs noted a preference for SV with color gradient when the BAI was on.

  7. Performance Characterization of Obstacle Detection Algorithms for Aircraft Navigation

    NASA Technical Reports Server (NTRS)

    Kasturi, Rangachar; Camps, Octavia; Coraor, Lee; Gandhi, Tarak; Hartman, Kerry; Yang, Mau-Tsuen

    2000-01-01

    The research reported here is a part of NASA's Synthetic Vision System (SVS) project for the development of a High Speed Civil Transport Aircraft (HSCT). One of the components of the SVS is a module for detection of potential obstacles in the aircraft's flight path by analyzing the images captured by an on-board camera in real-time. Design of such a module includes the selection and characterization of robust, reliable, and fast techniques and their implementation for execution in real-time. This report describes the results of our research in realizing such a design.

  8. Head-Mounted and Head-Up Display Glossary

    NASA Technical Reports Server (NTRS)

    Newman, Richard L.; Allen, J. Edwin W. (Technical Monitor)

    1997-01-01

    One of the problems in the head-up display (HUD) and helmet-mounted display (HMD) literature has been a lack of standardization of words and abbreviations. Several different words have been used for the same concept; for example, flight path angle, flight path marker, velocity vector, and total velocity vector all refer to the same thing. In other cases, the same term has been used with two different meanings, such as binocular field-of-view, which to some means the field-of-view visible to both the left and right eyes, and to others the field-of-view visible to either the left or right eye or both. Many of the terms used in HMD studies have not been well defined. We need a common language to ensure that system descriptions are communicated. As an example, the term 'stabilized' has been widely used with two meanings. 'Roll-stabilized' has been used to mean a symbol which rotates to indicate the roll or bank of the aircraft. 'World-stabilized' and 'head-stabilized' have both been used to indicate symbols which move to remain fixed with respect to external objects. HMDs present unique symbology problems not found in HUDs. Foremost among these is the issue of maintaining spatial orientation of the symbols. All previous flight displays, round-dial instruments, HDDs, and HUDs have been fixed in the cockpit. With the HMD, the flight display can move through a large angle. The coordinates used in transforming from the real world to the aircraft to the HMD have not been consistently defined. This glossary contains terms relating to optics and vision, displays, flight information, and weapons and aircraft systems. Some definitions, such as Navigation Display, have been added to clarify the definitions for Primary Flight Display and Primary Flight Reference. A list of HUD/HMD-related abbreviations is also included.
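
    To make the frame-definition issue concrete, the sketch below chains two coordinate transforms, world to aircraft body and body to head, so that the draw direction of a world-stabilized symbol can be recomputed as the head moves. The Z-Y-X (yaw/pitch/roll) angle convention and the example angles are assumptions chosen for illustration, not definitions taken from the glossary.

      import numpy as np

      def rot_zyx(yaw, pitch, roll):
          """Coordinate-transform matrix from a parent frame into a child frame (radians, Z-Y-X order)."""
          cy, sy = np.cos(yaw), np.sin(yaw)
          cp, sp = np.cos(pitch), np.sin(pitch)
          cr, sr = np.cos(roll), np.sin(roll)
          Rz = np.array([[cy, sy, 0], [-sy, cy, 0], [0, 0, 1]])
          Ry = np.array([[cp, 0, -sp], [0, 1, 0], [sp, 0, cp]])
          Rx = np.array([[1, 0, 0], [0, cr, sr], [0, -sr, cr]])
          return Rx @ Ry @ Rz

      world_dir = np.array([1.0, 0.0, 0.0])                 # e.g., direction to a waypoint, world frame
      R_world_to_body = rot_zyx(np.radians(30), 0.0, 0.0)   # aircraft attitude (example values)
      R_body_to_head = rot_zyx(np.radians(-10), 0.0, 0.0)   # head-tracker output (example values)
      hmd_dir = R_body_to_head @ R_world_to_body @ world_dir  # where to draw the symbol in head axes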

  9. Stroboscopic Vision as a Treatment for Motion Sickness

    NASA Technical Reports Server (NTRS)

    Reschke, Millard F.; Somers, J. T.; Ford, G.; Krnavek, J. M.; Hwang, E. y.; Kornilova, L. N.; Leigh, R. J.

    2006-01-01

    Results obtained from space flight indicate that most space crews will experience some symptoms of motion sickness, with significant impact on the operational objectives that must be accomplished to assure mission success. Based on the initial work of Melvill Jones, we have evaluated stroboscopic vision as a method of preventing motion sickness. Methods: Nineteen subjects read text while making +/-20 deg head movements in the horizontal plane at 0.2 Hz while wearing left-right reversing prisms during exposure to 4 Hz stroboscopic or normal room illumination. Testing was repeated using LCD shutter glasses as the stroboscopic source with an additional 19 subjects. Results: With stroboscopic illumination, motion sickness was significantly lower than with normal room illumination. Results with the LCD shutter glasses were analogous to those observed with the environmental strobe. Conclusions: Stroboscopic illumination appears to be effective where retinal slip is a factor in eliciting motion sickness. Additional research is evaluating the glasses' efficacy for carsickness, sickness in parabolic flight, and seasickness. There is evidence from pilot studies showing that the glasses reduce saccade velocity to visually presented targets by approximately half of the normal values. It is interesting to note that adaptation to space flight may also slow saccade velocity.

  10. The International Space Station: Stepping-stone to Exploration

    NASA Technical Reports Server (NTRS)

    Gerstenmaier, William H.; Kelly, Brian K.

    2005-01-01

    As the Space Shuttle returns to flight this year, major reconfiguration and assembly of the International Space Station continues as the United States and our five International Partners resume building and carry on operating this impressive Earth-orbiting research facility. In his January 14, 2004, speech announcing a new vision for America's space program, President Bush ratified the United States' commitment to completing construction of the ISS by 2010. The ongoing research aboard the Station on the long-term effects of space travel on human physiology will greatly benefit human crews venturing through the vast voids of space for months at a time. The continual operation of the ISS leads to new knowledge about the design, development, and operation of systems and hardware that will be utilized in the development of new deep-space vehicles needed to fulfill the Vision for Exploration. This paper will provide an overview of the ISS Program, including a review of the events of the past year, as well as plans for next year and the future.

  11. Rules to fly by: pigeons navigating horizontal obstacles limit steering by selecting gaps most aligned to their flight direction.

    PubMed

    Ros, Ivo G; Bhagavatula, Partha S; Lin, Huai-Ti; Biewener, Andrew A

    2017-02-06

    Flying animals must successfully contend with obstacles in their natural environments. Inspired by the robust manoeuvring abilities of flying animals, unmanned aerial systems are being developed and tested to improve flight control through cluttered environments. We previously examined steering strategies that pigeons adopt to fly through an array of vertical obstacles (VOs). Modelling VO flight guidance revealed that pigeons steer towards larger visual gaps when making fast steering decisions. In the present experiments, we recorded three-dimensional flight kinematics of pigeons as they flew through randomized arrays of horizontal obstacles (HOs). We found that pigeons still decelerated upon approach but flew faster through a denser array of HOs compared with the VO array previously tested. Pigeons exhibited limited steering and chose gaps between obstacles most aligned to their immediate flight direction, in contrast to VO navigation that favoured widest gap steering. In addition, pigeons navigated past the HOs with more variable and decreased wing stroke span and adjusted their wing stroke plane to reduce contact with the obstacles. Variability in wing extension, stroke plane and wing stroke path was greater during HO flight. Pigeons also exhibited pronounced head movements when negotiating HOs, which potentially serve a visual function. These head-bobbing-like movements were most pronounced in the horizontal (flight direction) and vertical directions, consistent with engaging motion vision mechanisms for obstacle detection. These results show that pigeons exhibit a keen kinesthetic sense of their body and wings in relation to obstacles. Together with aerodynamic flapping flight mechanics that favours vertical manoeuvring, pigeons are able to navigate HOs using simple rules, with remarkable success.

  12. Rules to fly by: pigeons navigating horizontal obstacles limit steering by selecting gaps most aligned to their flight direction

    PubMed Central

    Ros, Ivo G.; Bhagavatula, Partha S.; Lin, Huai-Ti

    2017-01-01

    Flying animals must successfully contend with obstacles in their natural environments. Inspired by the robust manoeuvring abilities of flying animals, unmanned aerial systems are being developed and tested to improve flight control through cluttered environments. We previously examined steering strategies that pigeons adopt to fly through an array of vertical obstacles (VOs). Modelling VO flight guidance revealed that pigeons steer towards larger visual gaps when making fast steering decisions. In the present experiments, we recorded three-dimensional flight kinematics of pigeons as they flew through randomized arrays of horizontal obstacles (HOs). We found that pigeons still decelerated upon approach but flew faster through a denser array of HOs compared with the VO array previously tested. Pigeons exhibited limited steering and chose gaps between obstacles most aligned to their immediate flight direction, in contrast to VO navigation that favoured widest gap steering. In addition, pigeons navigated past the HOs with more variable and decreased wing stroke span and adjusted their wing stroke plane to reduce contact with the obstacles. Variability in wing extension, stroke plane and wing stroke path was greater during HO flight. Pigeons also exhibited pronounced head movements when negotiating HOs, which potentially serve a visual function. These head-bobbing-like movements were most pronounced in the horizontal (flight direction) and vertical directions, consistent with engaging motion vision mechanisms for obstacle detection. These results show that pigeons exhibit a keen kinesthetic sense of their body and wings in relation to obstacles. Together with aerodynamic flapping flight mechanics that favours vertical manoeuvring, pigeons are able to navigate HOs using simple rules, with remarkable success. PMID:28163883

  13. A Comparison of the AVS-9 and the Panoramic Night Vision Goggles During Rotorcraft Hover and Landing

    NASA Technical Reports Server (NTRS)

    Szoboszlay, Zoltan; Haworth, Loran; Simpson, Carol

    2000-01-01

    A flight test was conducted to assess any differences in pilot-vehicle performance and pilot opinion between the use of a current generation night vision goggle (the AVS-9) and one variant of the prototype panoramic night vision goggle (the PNVGII). The panoramic goggle has more than double the horizontal field-of-view of the AVS-9, but reduced image quality. Overall the panoramic goggles compared well to the AVS-9 goggles. However, pilot comment and data are consistent with the assertion that some of the benefits of additional field-of-view with the panoramic goggles were negated by the reduced image quality of the particular variant of the panoramic goggles tested.

  14. Design and implementation of a control system for a quadrotor MAV

    NASA Astrophysics Data System (ADS)

    Bawek, Dean

    The quadrotor is a 200 g MAV with rapid-prototyped rotors that are driven by four brushless electric motors, capable of a collective thrust of around 400 g using an 11 V battery. The vehicle is compact, with its largest dimension at 188 mm. Without any feedback control, the quadrotor is unstable. For flight stability, the vehicle incorporates a linear quadratic regulator to augment its dynamics for hover. The quadrotor's nonlinear dynamics are linearized about hover for use in controller formulation. Feedback comes both directly from sensors and from a Luenberger observer that estimates the rotor velocities. A Simulink simulation uses hardware and software properties to serve as an environment for controller gain tuning prior to flight testing. The results from the simulation generate stabilizing control gains for the on-board attitude controller and for an off-board PC autopilot that uses the Vicon computer vision system for position feedback. Through the combined effort of the on-board and off-board controllers, the quadrotor successfully demonstrates stable hover in both nominal and disturbed conditions.
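
    The LQR design step mentioned above can be sketched for a single linearized attitude axis (angle and angular rate) about hover by solving the continuous-time algebraic Riccati equation. The model, inertia value, and weighting matrices below are placeholders for illustration, not the parameters used in this work.

      import numpy as np
      from scipy.linalg import solve_continuous_are

      J = 2.5e-3                       # single-axis inertia, kg*m^2 (assumed)
      A = np.array([[0.0, 1.0],        # d(angle)/dt = rate
                    [0.0, 0.0]])       # d(rate)/dt  = torque / J
      B = np.array([[0.0],
                    [1.0 / J]])
      Q = np.diag([10.0, 1.0])         # penalize angle error more than rate
      R = np.array([[1.0]])            # penalize control effort

      P = solve_continuous_are(A, B, Q, R)
      K = np.linalg.inv(R) @ B.T @ P   # state-feedback law u = -K @ x stabilizes the axis about hover
      print("LQR gain:", K)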

  15. Mission-oriented requirements for updating MIL-H-8501. Volume 1: STI proposed structure. [military rotorcraft

    NASA Technical Reports Server (NTRS)

    Clement, W. F.; Hoh, R. H.; Ferguson, S. W., III; Mitchell, D. G.; Ashkenas, I. L.; Mcruer, D. T.

    1985-01-01

    The structure of a new flying and ground handling qualities specification for military rotorcraft is presented. This preliminary specification structure is intended to evolve into a replacement for specification MIL-H-8501A. The new structure is designed to accommodate a variety of rotorcraft types, mission flight phases, flight envelopes, and flight environmental characteristics and to provide criteria for three levels of flying qualities, a systematic treatment of failures and reliability, both conventional and multiaxis controllers, and external vision aids which may also incorporate synthetic display content. Existing and new criteria were incorporated into the new structure wherever they could be substantiated.

  16. Pilot performance and heart rate during in-flight use of a compact instrument display.

    DOT National Transportation Integrated Search

    1975-11-01

    Instrument panels in many general aviation aircraft are becoming increasingly crowded, presenting the pilot with an instrument scanning problem. Because most aircraft instruments require use of central (foveal) vision, the pilot must look directly at...

  17. Information Systems for NASA's Aeronautics and Space Enterprises

    NASA Technical Reports Server (NTRS)

    Kutler, Paul

    1998-01-01

    The aerospace industry is being challenged to reduce costs and development time as well as utilize new technologies to improve product performance. Information technology (IT) is the key to providing revolutionary solutions to the challenges posed by the increasing complexity of NASA's aeronautics and space missions and the sophisticated nature of the systems that enable them. The NASA Ames vision is to develop technologies enabling the information age, expanding the frontiers of knowledge for aeronautics and space, improving America's competitive position, and inspiring future generations. Ames' missions to accomplish that vision include: 1) performing research to support the American aviation community through the unique integration of computation, experimentation, simulation, and flight testing; 2) studying the health of our planet, understanding living systems in space and the origins of the universe, and developing technologies for space flight; and 3) researching, developing, and delivering information technologies and applications. Information technology may be defined as the use of advanced computing systems to generate data, analyze data, transform data into knowledge, and aid the decision-making process. The knowledge from transformed data can be displayed in visual, virtual, and multimedia environments. The decision-making process can be fully autonomous or aided by cognitive processes, i.e., computational aids designed to leverage human capacities. IT systems can learn as they go, developing the capability to make decisions or aid the decision-making process on the basis of experience gained using limited data inputs. In the future, information systems will be used to aid space mission synthesis and virtual aerospace system design, aid damaged aircraft during landing, perform robotic surgery, and monitor the health and status of spacecraft and planetary probes. NASA Ames, through the Center of Excellence for Information Technology Office, is leading the effort in pursuit of revolutionary, IT-based approaches to satisfying NASA's aeronautics and space requirements. The objective of the effort is to incorporate information technologies within each of the Agency's four Enterprises, i.e., Aeronautics and Space Transportation Technology, Earth Science, Human Exploration and Development of Space, and Space Science. The end results of these efforts for Enterprise programs and projects should be reduced cost, enhanced mission capability, and expedited mission completion.

  18. LauncherOne: Virgin Orbit's Dedicated Launch Vehicle for Small Satellites & Impact to the Space Enterprise Vision

    NASA Astrophysics Data System (ADS)

    Vaughn, M.; Kwong, J.; Pomerantz, W.

    Virgin Orbit is developing a space transportation service to provide an affordable, reliable, and responsive dedicated ride to orbit for smaller payloads. No longer will small satellite users be forced to choose between accepting the limitations of flight as a secondary payload, paying dramatically more for a dedicated launch vehicle, or dealing with the added complexity associated with export control requirements and international travel to distant launch sites. Virgin Orbit has made significant progress toward first flight of a new vehicle that will give satellite developers and operators a better option for carrying their small satellites into orbit. This new service is called LauncherOne. LauncherOne is a two-stage, air-launched liquid-propulsion (LOX/RP) rocket. Air-launched from a specially modified 747-400 carrier aircraft (named “Cosmic Girl”), this system is designed to conduct operations from a variety of locations, allowing customers to select various launch azimuths and increasing available orbital launch windows. This provides small satellite customers an affordable, flexible, and dedicated option for access to space. In addition to developing the LauncherOne vehicle, Virgin Orbit has worked with US government customers and across the new, emerging commercial sector to refine concepts for resiliency, constellation replenishment, and responsive launch elements that can be key enablers for the Space Enterprise Vision (SEV). This element of customer interaction is being led by its new subsidiary company, VOX Space. This paper summarizes technical progress made on LauncherOne in the past year and extends the thinking of how commercial space, small satellites, and this new emerging market can be brought to bear to enable true space system resiliency.

  19. Insect-Inspired Optical-Flow Navigation Sensors

    NASA Technical Reports Server (NTRS)

    Thakoor, Sarita; Morookian, John M.; Chahl, Javan; Soccol, Dean; Hines, Butler; Zornetzer, Steven

    2005-01-01

    Integrated circuits that exploit optical flow to sense motions of computer mice on or near surfaces ("optical mouse chips") are used as navigation sensors in a class of small flying robots now undergoing development for potential use in such applications as exploration, search, and surveillance. The basic principles of these robots were described briefly in "Insect-Inspired Flight Control for Small Flying Robots" (NPO-30545), NASA Tech Briefs, Vol. 29, No. 1 (January 2005), page 61. To recapitulate from the cited prior article: the concept of optical flow can be defined, loosely, as the use of texture in images as a source of motion cues. The flight-control and navigation systems of these robots are inspired largely by the designs and functions of the vision systems and brains of insects, which have been demonstrated to utilize optical flow (as detected by their eyes and brains) resulting from their own motions in the environment. Optical flow has been shown to be very effective as a means of avoiding obstacles and controlling speed and altitude in robotic navigation. Prior systems used in experiments on navigating by means of optical flow have involved the use of panoramic optics, high-resolution image sensors, and programmable image-data-processing computers.
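
    As a toy illustration of the kind of control rule optic flow enables (not flight code from this work), the function below steers toward the side with weaker image motion and adjusts speed to hold the total flow near a set point. The gains, the set point, and the flow inputs are assumptions.

      def flow_based_commands(flow_left, flow_right, flow_setpoint=2.0, k_steer=0.5, k_speed=0.2):
          """flow_left/flow_right: lateral optic flow magnitudes (e.g., px/frame) from each side."""
          steer_cmd = k_steer * (flow_right - flow_left)                    # turn away from the faster-moving side
          speed_cmd = k_speed * (flow_setpoint - (flow_left + flow_right) / 2.0)  # slow down when flow is high
          return steer_cmd, speed_cmd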

  20. The remarkable visual capacities of nocturnal insects: vision at the limits with small eyes and tiny brains

    PubMed Central

    2017-01-01

    Nocturnal insects have evolved remarkable visual capacities, despite small eyes and tiny brains. They can see colour, control flight and land, react to faint movements in their environment, navigate using dim celestial cues and find their way home after a long and tortuous foraging trip using learned visual landmarks. These impressive visual abilities occur at light levels when only a trickle of photons are being absorbed by each photoreceptor, begging the question of how the visual system nonetheless generates the reliable signals needed to steer behaviour. In this review, I attempt to provide an answer to this question. Part of the answer lies in their compound eyes, which maximize light capture. Part lies in the slow responses and high gains of their photoreceptors, which improve the reliability of visual signals. And a very large part lies in the spatial and temporal summation of these signals in the optic lobe, a strategy that substantially enhances contrast sensitivity in dim light and allows nocturnal insects to see a brighter world, albeit a slower and coarser one. What is abundantly clear, however, is that during their evolution insects have overcome several serious potential visual limitations, endowing them with truly extraordinary night vision. This article is part of the themed issue ‘Vision in dim light’. PMID:28193808
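
    The benefit of summation in dim light can be illustrated numerically: for photon (Poisson) noise, the signal-to-noise ratio grows roughly as the square root of the number of photons pooled, whether pooled across neighbouring receptors (space) or successive samples (time). The photon counts below are arbitrary example values, not measurements from any species.

      import numpy as np

      rng = np.random.default_rng(0)
      mean_photons_per_receptor = 4          # a dim scene: only a few photons per integration time
      for n_summed in (1, 9, 100):           # single receptor vs. increasing spatial/temporal pooling
          samples = rng.poisson(mean_photons_per_receptor, size=(100000, n_summed)).sum(axis=1)
          snr = samples.mean() / samples.std()
          predicted = np.sqrt(n_summed * mean_photons_per_receptor)
          print(f"{n_summed:>3} samples pooled -> SNR ~ {snr:.1f} (sqrt prediction {predicted:.1f})")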

  1. Parallel Architectures and Parallel Algorithms for Integrated Vision Systems. Ph.D. Thesis

    NASA Technical Reports Server (NTRS)

    Choudhary, Alok Nidhi

    1989-01-01

    Computer vision is regarded as one of the most complex and computationally intensive problems. An integrated vision system (IVS) is a system that uses vision algorithms from all levels of processing to perform a high-level application (e.g., object recognition). An IVS normally involves algorithms from low-level, intermediate-level, and high-level vision. Designing parallel architectures for vision systems is of tremendous interest to researchers. Several issues in parallel architectures and parallel algorithms for integrated vision systems are addressed.

  2. Gimbals Drive and Control Electronics Design, Development and Testing of the LRO High Gain Antenna and Solar Array Systems

    NASA Technical Reports Server (NTRS)

    Chernyakov, Boris; Thakore, Kamal

    2010-01-01

    Launched June 18, 2009 on an Atlas V rocket, NASA's Lunar Reconnaissance Orbiter (LRO) is the first step in NASA's Vision for Space Exploration program and for a human return to the Moon. The spacecraft (SC) carries a wide variety of scientific instruments and provides an extraordinary opportunity to study the lunar landscape at resolutions and over time scales never achieved before. The spacecraft systems are designed to enable achievement of LRO's mission requirements. To that end, LRO's mechanical system employed two two-axis gimbal assemblies used to drive the deployment and articulation of the Solar Array System (SAS) and the High Gain Antenna System (HGAS). This paper describes the design, development, integration, and testing of Gimbal Control Electronics (GCE) and Actuators for both the HGAS and SAS systems, as well as flight testing during the on-orbit commissioning phase and lessons learned.

  3. Conceptual Study on Hypersonic Turbojet Experimental Vehicle (HYTEX)

    NASA Astrophysics Data System (ADS)

    Taguchi, Hideyuki; Murakami, Akira; Sato, Tetsuya; Tsuchiya, Takeshi

    Pre-cooled turbojet engines have been investigated with the aim of realizing reusable space transportation systems and hypersonic airplanes. Methods for evaluating the performance of these engines have been established based on ground tests. There are several plans for the demonstration of hypersonic propulsion systems. JAXA has focused on hypersonic propulsion systems as a key technology for a hypersonic transport airplane. Demonstration of Mach 5 class hypersonic technologies is stated as a development target for 2025 in the long-term vision. In this study, a systems analysis of a hypersonic turbojet experimental vehicle (HYTEX) with Mach 5 flight capability is performed. Aerodynamic coefficients are obtained by CFD analyses and wind tunnel tests. A small pre-cooled turbojet is fabricated and tested using liquid hydrogen as fuel. As a result, the characteristics of the baseline vehicle shape are clarified, and the effects of pre-cooling are confirmed in the firing test.

  4. Nocturnal Visual Orientation in Flying Insects: A Benchmark for the Design of Vision-based Sensors in Micro-Aerial Vehicles

    DTIC Science & Technology

    2011-03-12

    determine (1) the ability of Megalopta to approach and land on its nest entrance at night (using high-speed filming in bright infrared light), (2) ... lined with black-and-white patterns and filming their flight trajectories from below in infrared light, and (3) whether Megalopta uses ... at the end of a flight tunnel so that we could record both the approach and the final landing phase (Figure 1). The approach was filmed over a ...

  5. The First Year in Review: NASA's Ares I Crew Launch Vehicle and Ares V Cargo Launch Vehicle

    NASA Technical Reports Server (NTRS)

    Dumbacher, Daniel L.; Reuter, James L.

    2007-01-01

    The U.S. Vision for Space Exploration guides NASA's challenging missions of scientific discovery. Developing safe, reliable, and affordable space transportation systems for the human and robotic exploration of space is a key component of fulfilling the strategic goals outlined in the Vision, as well as in the U.S. Space Policy. In October 2005, the Exploration Systems Mission Directorate and its Constellation Program chartered the Exploration Launch Projects Office, located at the Marshall Space Flight Center, to design, develop, test, and field a new generation of launch vehicles that would fulfill customer and stakeholder requirements for trips to the Moon, Mars, and beyond. The Ares I crew launch vehicle is slated to loft the Orion crew exploration vehicle to orbit by 2014, while the heavy-lift Ares V cargo launch vehicle will deliver the lunar lander to orbit by 2020 (Fig. 1). These systems are being designed to empower America's return to the Moon to prepare for the first astronaut on Mars. The new launch vehicle designs now under study reflect almost 50 years of hard-won experience gained from the Saturn missions to the Moon in the late 1960s and early 1970s, and from the venerable Space Shuttle, which is due to be retired by 2010.

  6. Single-Photon Detectors for Time-of-Flight Range Imaging

    NASA Astrophysics Data System (ADS)

    Stoppa, David; Simoni, Andrea

    We live in a three-dimensional (3D) world, and thanks to the stereoscopic vision provided by our two eyes, in combination with the powerful neural network of the brain, we are able to perceive the distance of objects. Nevertheless, despite the huge market volume of digital cameras, solid-state image sensors can capture only a two-dimensional (2D) projection of the scene under observation, losing a variable of paramount importance, i.e., the scene depth. On the contrary, 3D vision tools could offer amazing possibilities of improvement in many areas thanks to the increased accuracy and reliability of the models representing the environment. Among the great variety of distance-measuring techniques and detection systems available, this chapter treats only the emerging niche of solid-state, scannerless systems based on the TOF principle and using a detector with SPAD-based pixels. The chapter is organized into three main parts. First, TOF systems and measuring techniques are described. In the second part, the most meaningful sensor architectures for scannerless TOF distance measurements are analyzed, focusing on the circuit building blocks required by time-resolved image sensors. Finally, a performance summary is provided and a perspective view of near-future developments of SPAD-TOF sensors is given.
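
    A quick worked example of the direct time-of-flight relation underlying these sensors: distance equals the speed of light times the round-trip time, divided by two. The timing values below are illustrative only and are not specifications of any particular SPAD device.

      C = 299_792_458.0                      # speed of light, m/s

      def tof_distance(round_trip_time_s):
          """Range from a measured round-trip photon time of flight."""
          return C * round_trip_time_s / 2.0

      print(tof_distance(6.67e-9))           # ~1 m for a ~6.67 ns round trip
      print(tof_distance(66.7e-12))          # a ~67 ps timing error corresponds to ~1 cm of range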

  7. Looking Back and Looking Forward: Reprising the Promise and Predicting the Future of Formation Flying and Spaceborne GPS Navigation Systems

    NASA Technical Reports Server (NTRS)

    Bauer, Frank H.; Dennehy, Neil

    2015-01-01

    A retrospective consideration of two 15-year-old Guidance, Navigation and Control (GN&C) technology 'vision' predictions will be the focus of this paper. A look-back analysis and critique will be performed of these late-1990s technology roadmaps outlining the future vision for two then-nascent but rapidly emerging GN&C technologies. Specifically, these two GN&C technologies were: 1) multi-spacecraft formation flying and 2) the spaceborne use and exploitation of global positioning system (GPS) signals to enable formation flying. This paper reprises the promise of formation flying and spaceborne GPS as depicted in the cited 1999 and 1998 papers. It discusses what happened to cause that promise to be mostly unfulfilled and the reasons why the envisioned formation flying dream has yet to become a reality. The recent technology trends over the past few years are then identified, and a renewed government interest in spacecraft formation flying/cluster flight is highlighted. The authors conclude with a reality-tempered perspective, 15 years after the initial technology roadmaps were published, predicting a promising future of spacecraft formation flying technology development over the next decade.

  8. Multi-Image Registration for an Enhanced Vision System

    NASA Technical Reports Server (NTRS)

    Hines, Glenn; Rahman, Zia-Ur; Jobson, Daniel; Woodell, Glenn

    2002-01-01

    An Enhanced Vision System (EVS) utilizing multi-sensor image fusion is currently under development at the NASA Langley Research Center. The EVS will provide enhanced images of the flight environment to assist pilots in poor visibility conditions. Multi-spectral images obtained from a short wave infrared (SWIR), a long wave infrared (LWIR), and a color visible band CCD camera, are enhanced and fused using the Retinex algorithm. The images from the different sensors do not have a uniform data structure: the three sensors not only operate at different wavelengths, but they also have different spatial resolutions, optical fields of view (FOV), and bore-sighting inaccuracies. Thus, in order to perform image fusion, the images must first be co-registered. Image registration is the task of aligning images taken at different times, from different sensors, or from different viewpoints, so that all corresponding points in the images match. In this paper, we present two methods for registering multiple multi-spectral images. The first method performs registration using sensor specifications to match the FOVs and resolutions directly through image resampling. In the second method, registration is obtained through geometric correction based on a spatial transformation defined by user selected control points and regression analysis.
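
    The second registration method described here, a spatial transformation estimated from user-selected control points by regression, can be sketched as an affine least-squares fit followed by resampling. The control-point coordinates and the image file below are placeholders, and a full EVS implementation would use many more points and a more careful transformation model.

      import cv2
      import numpy as np

      # control points: (x, y) in the source image (e.g., LWIR) and in the reference (visible) image
      src_pts = np.array([[34, 50], [210, 48], [198, 160], [40, 155]], dtype=np.float64)
      ref_pts = np.array([[30, 42], [205, 45], [192, 158], [37, 150]], dtype=np.float64)

      # solve ref = A @ [x, y, 1]^T for the 2x3 affine matrix by linear regression
      design = np.hstack([src_pts, np.ones((len(src_pts), 1))])        # N x 3
      A, *_ = np.linalg.lstsq(design, ref_pts, rcond=None)             # 3 x 2 solution
      affine = A.T                                                     # 2 x 3, as OpenCV expects

      src_img = cv2.imread("lwir_frame.png", cv2.IMREAD_GRAYSCALE)     # assumed sensor image
      registered = cv2.warpAffine(src_img, affine, (src_img.shape[1], src_img.shape[0]))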

  9. Synthetic Vision Enhances Situation Awareness and RNP Capabilities for Terrain-Challenged Approaches

    NASA Technical Reports Server (NTRS)

    Kramer, Lynda J.; Prinzel, Lawrence J., III; Bailey, Randall E.; Arthur, Jarvis J., III

    2003-01-01

    The Synthetic Vision Systems (SVS) Project of NASA's Aviation Safety Program is striving to eliminate poor visibility as a causal factor in aircraft accidents, as well as to enhance the operational capabilities of all aircraft, through the display of computer-generated imagery derived from an onboard database of terrain, obstacle, and airport information. To achieve these objectives, NASA 757 flight test research was conducted at the Eagle-Vail, Colorado, airport to evaluate three SVS display types (Head-Up Display, Head-Down Size A, Head-Down Size X) and two terrain texture methods (photo-realistic, generic) in comparison to the simulated baseline Boeing 757 Electronic Attitude Direction Indicator and Navigation / Terrain Awareness and Warning System displays. These independent variables were evaluated for situation awareness, path error, and workload while making approaches to Runways 25 and 07 and during simulated engine-out Cottonwood 2 and KREMM departures. The results of the experiment showed significantly improved situation awareness, performance, and workload for the SVS concepts compared to the baseline displays and confirmed the retrofit capability of the Head-Up Display and Size A SVS concepts. The research also demonstrated that the pathway and pursuit guidance used within the SVS concepts achieved required navigation performance (RNP) criteria.

  10. COBALT CoOperative Blending of Autonomous Landing Technology

    NASA Technical Reports Server (NTRS)

    Carson, John M. III; Restrepo, Carolina I.; Robertson, Edward A.; Seubert, Carl R.; Amzajerdian, Farzin

    2016-01-01

    COBALT is a terrestrial test platform for the development and maturation of GN&C (Guidance, Navigation and Control) technologies for PL&HA (Precision Landing and Hazard Avoidance). The project is developing a third-generation Langley Navigation Doppler Lidar (NDL) for ultra-precise velocity and range measurements, which will be integrated and tested with the JPL Lander Vision System (LVS) for Terrain Relative Navigation (TRN) position estimates. Together, these technologies provide navigation that enables controlled precision landing. The COBALT hardware will be integrated in 2017 into the GN&C subsystem of the Xodiac rocket-propulsive Vertical Test Bed (VTB) developed by Masten Space Systems (MSS), and two terrestrial flight campaigns will be conducted: one open-loop (i.e., passive) and one closed-loop (i.e., active).

  11. Development of a GPS/INS/MAG navigation system and waypoint navigator for a VTOL UAV

    NASA Astrophysics Data System (ADS)

    Meister, Oliver; Mönikes, Ralf; Wendel, Jan; Frietsch, Natalie; Schlaile, Christian; Trommer, Gert F.

    2007-04-01

    Unmanned aerial vehicles (UAVs) can be used for versatile surveillance and reconnaissance missions. If a UAV is capable of flying automatically along a predefined path, the range of possible applications widens significantly. This paper addresses the development of an integrated GPS/INS/MAG navigation system and a waypoint navigator for a small vertical take-off and landing (VTOL) unmanned four-rotor helicopter with a take-off weight below 1 kg. The core of the navigation system consists of low-cost inertial sensors that are continuously aided with GPS, a magnetometer compass, and barometric height information. Because the yaw angle becomes unobservable during hovering flight, integration with a magnetic compass is mandatory. This integration must be robust with respect to errors caused by terrestrial magnetic field deviation and interference from surrounding electronic devices as well as ferrous metals. The described Kalman filter integration concept overcomes the problem that erroneous magnetic measurements lead to an attitude error in the roll and pitch axes. The algorithm provides long-term stable navigation information even during GPS outages, which is mandatory for flight control of the UAV. The second part of the paper discusses the guidance algorithms in detail. These algorithms allow the UAV to operate in a semi-autonomous position-hold mode as well as a fully autonomous waypoint mode. In the position-hold mode the helicopter maintains its position regardless of wind disturbances, which eases the pilot's job during hold-and-stare missions. The autonomous waypoint navigator enables flight beyond visual range and beyond the range of the radio link. Flight test results for the implemented modes of operation are shown.
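    The central idea, that magnetometer errors should correct only yaw and never leak into roll and pitch, can be pictured with a deliberately simplified sketch (not the paper's filter): a one-state yaw filter is propagated with the z-axis gyro and corrected by a tilt-compensated magnetic heading, with an innovation gate rejecting measurements distorted by nearby electronics or ferrous material. All noise values and thresholds below are made up for illustration.

    ```python
    # Illustrative 1-state yaw filter: magnetometer aiding cannot disturb
    # roll/pitch because they are simply not in the state. Values are made up.
    import math

    def wrap(a):
        """Wrap an angle to [-pi, pi)."""
        return (a + math.pi) % (2.0 * math.pi) - math.pi

    class YawFilter:
        def __init__(self, yaw0=0.0, var0=1.0):
            self.yaw = yaw0          # yaw estimate [rad]
            self.P = var0            # yaw error variance
            self.q = 1e-4            # gyro random-walk noise per second
            self.r = 0.05 ** 2       # magnetometer heading noise variance

        def predict(self, gyro_z, dt):
            """Propagate yaw with the z-axis gyro (strapdown simplification)."""
            self.yaw = wrap(self.yaw + gyro_z * dt)
            self.P += self.q * dt

        def update_mag(self, mag_heading):
            """Fuse a tilt-compensated magnetic heading measurement."""
            innov = wrap(mag_heading - self.yaw)
            # Gate: reject headings distorted by deviation or interference.
            if innov ** 2 > 9.0 * (self.P + self.r):
                return False
            K = self.P / (self.P + self.r)       # Kalman gain
            self.yaw = wrap(self.yaw + K * innov)
            self.P *= (1.0 - K)
            return True
    ```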

  12. Antenna Technology and other Radio Frequency (RF) Communications Activities at the Glenn Research Center in Support of NASA's Exploration Vision

    NASA Technical Reports Server (NTRS)

    Miranda, Felix A.

    2007-01-01

    NASA's Vision for Space Exploration outlines a very ambitious program for the next several decades of the Space Agency's endeavors. Ahead are the completion of the International Space Station (ISS); safely flying the Space Shuttle (STS) until 2010; developing and flying the Crew Exploration Vehicle (Orion) no later than 2014; returning to the moon no later than 2020; extending human presence across the solar system and beyond; implementing a sustainable and affordable human and robotic program; developing supporting innovative technologies, knowledge, and infrastructure; and promoting international and commercial participation in exploration. To achieve these goals, a series of enabling technologies must be developed or matured in a timely manner. Some of these technologies are: spacecraft RF technology (e.g., high-power sources and large antennas which, using surface receive arrays, can provide up to 1 Gbps from Mars); uplink arraying (reduced reliance on large ground-based antennas with high operating costs and single points of failure; greater data rates or greater effective distance; scalable, evolvable, flexible scheduling); software-defined radio (i.e., reconfigurable, flexible interoperability that allows for in-flight updates and an open architecture; reduced mass, power, and volume); and optical communications (high-capacity communications with low mass/power requirements; significantly increased data rates for deep space). This presentation will discuss some of the work being performed at the NASA Glenn Research Center, Cleveland, Ohio, in antenna technology as well as other ongoing RF communications efforts.

  13. High speed research system study. Advanced flight deck configuration effects

    NASA Technical Reports Server (NTRS)

    Swink, Jay R.; Goins, Richard T.

    1992-01-01

    In mid-1991, NASA contracted with industry to study high-speed civil transport (HSCT) flight deck challenges and assess the benefits prior to initiating its High Speed Research Program (HSRP) Phase 2 efforts, then scheduled for FY-93. The results of this nine-month effort are presented, and a number of the most significant findings for the specified advanced concepts are highlighted: (1) a no nose-droop configuration; (2) a far forward cockpit location; and (3) advanced crew monitoring and control of complex systems. The results indicate that the no nose-droop configuration is critically dependent upon the design and development of a safe, reliable, and certifiable Synthetic Vision System (SVS). The droop-nose configuration would cause significant weight, performance, and cost penalties. The far forward cockpit location with conventional side-by-side seating provides little economic advantage; however, a configuration with a tandem seating arrangement provides a substantial increase in either additional payload (i.e., passengers) or potential downsizing of the vehicle, with resulting increases in performance efficiencies and associated reductions in emissions. Without a droop nose, forward external visibility is negated and takeoff/landing guidance and control must rely on the SVS. The technologies enabling such capabilities, which de facto provide for Category III all-weather operations on every flight independent of weather, represent a dramatic benefits multiplier in a 2005 global ATM network, both in terms of enhanced economic viability and environmental acceptability.

  14. A Vision-Based Relative Navigation Approach for Autonomous Multirotor Aircraft

    NASA Astrophysics Data System (ADS)

    Leishman, Robert C.

    Autonomous flight in unstructured, confined, and unknown GPS-denied environments is a challenging problem. Solutions could be tremendously beneficial for scenarios that require information about areas that are difficult to access and that present a great amount of risk. The goal of this research is to develop a new framework that enables improved solutions to this problem and to validate the approach with experiments using a hardware prototype. In Chapter 2 we examine the consequences and practical aspects of using an improved dynamic model for multirotor state estimation, using only IMU measurements. The improved model correctly explains the measurements available from the accelerometers on a multirotor. We provide hardware results demonstrating the improved attitude, velocity and even position estimates that can be achieved through the use of this model. We propose a new architecture to simplify some of the challenges that constrain GPS-denied aerial flight in Chapter 3. At the core, the approach combines visual graph-SLAM with a multiplicative extended Kalman filter (MEKF). More importantly, we depart from the common practice of estimating global states and instead keep the position and yaw states of the MEKF relative to the current node in the map. This relative navigation approach provides a tremendous benefit compared to maintaining estimates with respect to a single global coordinate frame. We discuss the architecture of this new system and provide important details for each component. We verify the approach with goal-directed autonomous flight-test results. The MEKF is the basis of the new relative navigation approach and is detailed in Chapter 4. We derive the relative filter and show how the states must be augmented and marginalized each time a new node is declared. The relative estimation approach is verified using hardware flight test results accompanied by comparisons to motion capture truth. Additionally, flight results with estimates in the control loop are provided. We believe that the relative, vision-based framework described in this work is an important step in furthering the capabilities of indoor aerial navigation in confined, unknown environments. Current approaches incur challenging problems by requiring globally referenced states. Utilizing a relative approach allows more flexibility as the critical, real-time processes of localization and control do not depend on computationally-demanding optimization and loop-closure processes.
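    The augmentation-and-marginalization step described above can be pictured with a small schematic (not the implementation from this dissertation): when a new node is declared, the current relative-pose block of the state is cloned, the clone and its covariance are handed to the graph back end as an edge, and the live pose is reset to zero with respect to the new node. The state ordering, reset policy, and use of NumPy are assumptions for illustration.

    ```python
    # Schematic "new node" step for a relative-navigation filter: augment,
    # hand the cloned pose to graph-SLAM as an edge, then marginalize and
    # reset. Indices and sizes are illustrative only.
    import numpy as np

    POSE = slice(0, 3)          # [x, y, yaw] expressed relative to the current node

    def declare_new_node(x, P):
        """Return (edge, edge_cov, x_new, P_new) when a new map node is declared."""
        n = x.size
        # 1) Augment: duplicate the relative-pose block at the end of the state.
        J = np.vstack([np.eye(n), np.eye(3, n)])
        x_aug = J @ x
        P_aug = J @ P @ J.T

        # 2) The duplicated block is the edge handed to the graph back end.
        edge = x_aug[-3:].copy()
        edge_cov = P_aug[-3:, -3:].copy()

        # 3) Marginalize the clone; the live pose is now w.r.t. the new node,
        #    so it is zero with (nominally) zero uncertainty at declaration.
        x_new = x_aug[:n].copy()
        P_new = P_aug[:n, :n].copy()
        x_new[POSE] = 0.0
        P_new[POSE, :] = 0.0
        P_new[:, POSE] = 0.0
        return edge, edge_cov, x_new, P_new
    ```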

  15. Space flight and changes in spatial orientation

    NASA Technical Reports Server (NTRS)

    Reschke, Millard F.; Bloomberg, Jacob J.; Harm, Deborah L.; Paloski, William H.

    1992-01-01

    From a sensory point of view, space flight represents a form of stimulus rearrangement requiring modification of established terrestrial response patterns through central reinterpretation. Evidence of sensory reinterpretation is manifested as postflight modifications of eye/head coordination, locomotor patterns, postural control strategies, and illusory perceptions of self or surround motion in conjunction with head movements. Under normal preflight conditions, the head is stabilized during locomotion, but immediately postflight reduced head stability, coupled with inappropriate eye/head coordination, results in modifications of gait. Postflight postural control exhibits increased dependence on vision, which compensates for inappropriate interpretation of otolith and proprioceptive inputs. Eye movements compensatory for perceived self-motion, rather than for actual head movements, have been observed postflight. Overall, the in-flight adaptive modification of head stabilization strategies, changes in head/eye coordination, illusory motion, and postural control are maladaptive for a return to the terrestrial environment. Appropriate countermeasures for long-duration flights will rely on preflight adaptation and in-flight training.

  16. STS-70 Post Flight Presentation

    NASA Technical Reports Server (NTRS)

    Peterson, Glen (Editor)

    1995-01-01

    In this post-flight overview, the flight crew of the STS-70 mission, Tom Henricks (Cmdr.), Kevin Kregel (Pilot), Major Nancy Currie (MS), Dr. Mary Ellen Weber (MS), and Dr. Don Thomas (MS), discuss their mission and accompanying experiments. Pre-flight, launch, and orbital footage is followed by the in-orbit deployment of the Tracking and Data Relay Satellite (TDRS) and a discussion of the following spaceborne experiments: a microgravity bioreactor experiment to grow 3D body-like tissue; pregnant rat muscular changes in microgravity; embryonic development in microgravity; Shuttle Amateur Radio Experiment (SAREX); terrain surface imagery using the HERCULES camera; and a range of other physiological tests, including an eye and vision test. Views of Earth include: tropical storm Chantal; the Nile River and Red Sea; lightning over Brazil. A three planet view (Earth, Mars, and Venus) was taken right before sunrise. The end footage shows shuttle pre-landing checkout, entry, and landing, along with a slide presentation of the flight.

  17. Visual control of robots using range images.

    PubMed

    Pomares, Jorge; Gil, Pablo; Torres, Fernando

    2010-01-01

    In recent years, 3D vision systems based on the time-of-flight (ToF) principle have gained importance as a means of obtaining 3D information from the workspace. In this paper, an analysis of the use of 3D ToF cameras to guide a robot arm is performed. To do so, an adaptive method for simultaneous visual servo control and camera calibration is presented. Using this method, a robot arm is guided by range information obtained from a ToF camera. Furthermore, the self-calibration method determines the adequate integration time for the range camera so that depth information can be determined precisely.
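    A minimal sketch of how range-image depth can enter a servo loop is shown below (this is not the adaptive method of the paper): the pixel error of a tracked target is back-projected at the measured ToF depth and converted into a proportional camera-frame velocity command. The camera model, gains, frame conventions, and stand-off distance are assumptions.

    ```python
    # Toy depth-aided visual servo law; all parameters are illustrative.
    import numpy as np

    def servo_velocity(target_px, principal_point, focal_px, depth_m,
                       depth_ref=0.30, k_img=0.5, k_z=0.4):
        """Return a camera-frame velocity command (vx, vy, vz) in m/s."""
        u, v = target_px
        cx, cy = principal_point
        # Back-project the pixel error to metres at the measured ToF depth.
        ex = (u - cx) / focal_px * depth_m
        ey = (v - cy) / focal_px * depth_m
        ez = depth_m - depth_ref            # approach until the reference stand-off
        return np.array([-k_img * ex, -k_img * ey, -k_z * ez])
    ```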

  18. STS-52 Mission Insignia

    NASA Technical Reports Server (NTRS)

    1992-01-01

    The STS-52 insignia, designed by the mission's crew members, features a large gold star to symbolize the crew's mission on the frontiers of space. A gold star is often used to symbolize the frontier period of the American West. The red star in the shape of the Greek letter lambda represents both the laser measurements taken from the Laser Geodynamic Satellite (LAGEOS II) and the Lambda Point Experiment, which was part of the United States Microgravity Payload (USMP-1). The remote manipulator and maple leaf are emblematic of the Canadian payload specialist who conducted a series of Canadian flight experiments (CANEX-2), including the Space Vision System test.

  19. Space Shuttle Projects

    NASA Image and Video Library

    1992-10-20

    The STS-52 insignia, designed by the mission's crew members, features a large gold star to symbolize the crew's mission on the frontiers of space. A gold star is often used to symbolize the frontier period of the American West. The red star in the shape of the Greek letter lambda represents both the laser measurements taken from the Laser Geodynamic Satellite (LAGEOS II) and the Lambda Point Experiment, which was part of the United States Microgravity Payload (USMP-1). The remote manipulator and maple leaf are emblematic of the Canadian payload specialist who conducted a series of Canadian flight experiments (CANEX-2), including the Space Vision System test.

  20. Rocket-Based Combined Cycle Engine Concept Development

    NASA Technical Reports Server (NTRS)

    Ratekin, G.; Goldman, Allen; Ortwerth, P.; Weisberg, S.; McArthur, J. Craig (Technical Monitor)

    2001-01-01

    The development of rocket-based combined cycle (RBCC) propulsion systems is part of a 12-year effort under both company funding and contract work. The concept is a fixed-geometry, integrated rocket/ramjet/scramjet that is hydrogen fueled and uses hydrogen regenerative cooling. The baseline engine structural configuration uses an integral structure that eliminates panel seals, seal purge gas, and closeout side attachments. Engine A5 is the current configuration for NASA Marshall Space Flight Center (MSFC) for the ART program. Engine A5 models the complete flight engine flowpath of inlet, isolator, airbreathing combustor, and nozzle. High-performance rocket thrusters are integrated into the engine, enabling both low-speed air-augmented rocket (AAR) and high-speed pure rocket operation. Engine A5 was tested in GASL's new Flight Acceleration Simulation Test (FAST) facility in all four operating modes: AAR, RAM, SCRAM, and Rocket. Additionally, transition from AAR to RAM and from RAM to SCRAM was demonstrated. Measured performance demonstrated vision vehicle performance levels for Mach 3 AAR operation and ramjet operation from Mach 3 to 4. SCRAM and rocket mode performance was above predictions. For the first time, testing also demonstrated transition between operating modes.

  1. Space Flight-Induced Intracranial Hypertension: An Ophthalmic Review

    NASA Technical Reports Server (NTRS)

    Gibson, Charles Robert; Mader, Thomas H.

    2010-01-01

    Background: Although physiologic and pathologic changes associated with microgravity exposure have been studied extensively, the effect of this environment on the eye is largely unknown. Over the last several years, NASA's Space Medicine Division has documented astronauts presenting with varying degrees of disc edema, globe flattening, choroidal folds, cotton wool spots, and hyperopic shifts after long-duration space flight. Methods: Before and after long-duration space flight, six astronauts underwent complete eye examinations to include cycloplegic and/or manifest refraction and fundus photography. Five of these astronauts had Optical Coherence Tomography (OCT) and Magnetic Resonance Imaging (MRI) performed following their missions. Results: Following exposure to space flight of approximately 6-months duration, six astronauts had neuro-ophthalmic findings. These consisted of disc edema in four astronauts, globe flattening in four astronauts, choroidal folds in four astronauts, cotton wool spots in three astronauts, nerve fiber layer thickening by OCT in five astronauts, and decreased near vision in five astronauts. Four of the astronauts with near vision complaints had a hyperopic shift equal to or greater than +0.50D between pre- and post-mission spherical equivalent refraction in one or both eyes (range +0.50D to +1.50D). These same four had globe flattening by MRI. Conclusions: The findings we describe may have resulted from a rise in intracranial pressure caused by microgravity fluid shifts, and could represent parts of a spectrum of ocular and cerebral responses to extended microgravity.

  2. Binocular fusion time in sleep-deprived subjects.

    DOT National Transportation Integrated Search

    1969-01-01

    The attainment of binocular single vision when the distance of gaze is changed is a component of total reaction time and may be critical in flight when the gaze is changed from the instrument panel to the outside or from the outside to the instrument...

  3. Visual Impairment/Intracranial Pressure Risk Clinical Care Data Tools

    NASA Technical Reports Server (NTRS)

    Van Baalen, Mary; Mason, Sara S.; Taiym, Wafa; Wear, Mary L.; Moynihan, Shannan; Alexander, David; Hart, Steve; Tarver, William

    2014-01-01

    Prior to 2010, several ISS crewmembers returned from spaceflight with changes to their vision, ranging from a mild hyperopic shift to frank disc edema. As a result, NASA expanded clinical vision testing to include more comprehensive medical imaging, including Optical Coherence Tomography and 3 Tesla Brain and Orbit MRIs. The Space and Clinical Operations (SCO) Division developed a clinical practice guideline that classified individuals based on their symptoms and diagnoses to facilitate clinical care. For the purposes of clinical surveillance, this classification was applied retrospectively to all crewmembers who had sufficient testing for classification. This classification has also been leveraged by researchers to identify potential risk factors. In March 2014, driven in part by a more comprehensive understanding of the imaging data and increased imaging capability on orbit, the SCO Division revised its clinical care guidance to outline in-flight care and increase post-flight follow-up. The new clinical guidance does not include a classification scheme.

  4. Human-rating Automated and Robotic Systems - (How HAL Can Work Safely with Astronauts)

    NASA Technical Reports Server (NTRS)

    Baroff, Lynn; Dischinger, Charlie; Fitts, David

    2009-01-01

    Long duration human space missions, as planned in the Vision for Space Exploration, will not be possible without applying unprecedented levels of automation to support the human endeavors. The automated and robotic systems must carry the load of routine housekeeping for the new generation of explorers, as well as assist their exploration science and engineering work with new precision. Fortunately, the state of automated and robotic systems is sophisticated and sturdy enough to do this work - but the systems themselves have never been human-rated as all other NASA physical systems used in human space flight have. Our intent in this paper is to provide perspective on requirements and architecture for the interfaces and interactions between human beings and the astonishing array of automated systems; and the approach we believe necessary to create human-rated systems and implement them in the space program. We will explain our proposed standard structure for automation and robotic systems, and the process by which we will develop and implement that standard as an addition to NASA's Human Rating requirements. Our work here is based on real experience with both human system and robotic system designs; for surface operations as well as for in-flight monitoring and control; and on the necessities we have discovered for human-systems integration in NASA's Constellation program. We hope this will be an invitation to dialog and to consideration of a new issue facing new generations of explorers and their outfitters.

  5. Vision and Control for UAVs: A Survey of General Methods and of Inexpensive Platforms for Infrastructure Inspection

    PubMed Central

    Máthé, Koppány; Buşoniu, Lucian

    2015-01-01

    Unmanned aerial vehicles (UAVs) have gained significant attention in recent years. Low-cost platforms using inexpensive sensor payloads have been shown to provide satisfactory flight and navigation capabilities. In this report, we survey vision and control methods that can be applied to low-cost UAVs, and we list some popular inexpensive platforms and application fields where they are useful. We also highlight the sensor suites used where this information is available. We overview, among others, feature detection and tracking, optical flow and visual servoing, low-level stabilization and high-level planning methods. We then list popular low-cost UAVs, selecting mainly quadrotors. We discuss applications, restricting our focus to the field of infrastructure inspection. Finally, as an example, we formulate two use-cases for railway inspection, a less explored application field, and illustrate the usage of the vision and control techniques reviewed by selecting appropriate ones to tackle these use-cases. To select vision methods, we run a thorough set of experimental evaluations. PMID:26121608
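    As a concrete taste of one of the surveyed building blocks, the snippet below computes dense optical flow between two frames with OpenCV's Farneback method and reduces it to a mean image motion. The file names and the use of mean flow as a simple drift cue are placeholders, not taken from the survey.

    ```python
    # Dense optical flow between two consecutive frames (OpenCV Farneback).
    # "frame_000.png" and "frame_001.png" are placeholder file names.
    import cv2

    prev = cv2.imread("frame_000.png", cv2.IMREAD_GRAYSCALE)
    curr = cv2.imread("frame_001.png", cv2.IMREAD_GRAYSCALE)

    flow = cv2.calcOpticalFlowFarneback(prev, curr, None,
                                        pyr_scale=0.5, levels=3, winsize=15,
                                        iterations=3, poly_n=5, poly_sigma=1.2,
                                        flags=0)
    # Average horizontal/vertical image motion, e.g. as a crude drift estimate.
    mean_motion = flow.reshape(-1, 2).mean(axis=0)
    print("mean flow (px/frame):", mean_motion)
    ```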

  6. Vision in flying insects.

    PubMed

    Egelhaaf, Martin; Kern, Roland

    2002-12-01

    Vision guides flight behaviour in numerous insects. Despite their small brain, insects easily outperform current man-made autonomous vehicles in many respects. Examples are the virtuosic chasing manoeuvres male flies perform as part of their mating behaviour and the ability of bees to assess, on the basis of visual motion cues, the distance travelled in a novel environment. Analyses at both the behavioural and neuronal levels are beginning to unveil reasons for such extraordinary capabilities of insects. One recipe for their success is the adaptation of visual information processing to the specific requirements of the behavioural tasks and to the specific spatiotemporal properties of the natural input.

  7. Hi-Vision telecine system using pickup tube

    NASA Astrophysics Data System (ADS)

    Iijima, Goro

    1992-08-01

    Hi-Vision broadcasting, offering far more lifelike pictures than those produced by existing television broadcasting systems, has enormous potential in both industrial and commercial fields. The dissemination of the Hi-Vision system will enable vivid, movie theater quality pictures to be readily enjoyed in homes in the near future. To convert motion film pictures into Hi-Vision signals, a telecine system is needed. The Hi-Vision telecine systems currently under development are the "laser telecine," "flying-spot telecine," and "Saticon telecine" systems. This paper provides an overview of the pickup tube type Hi-Vision telecine system (referred to herein as the Saticon telecine system) developed and marketed by Ikegami Tsushinki Co., Ltd.

  8. Multirate and event-driven Kalman filters for helicopter flight

    NASA Technical Reports Server (NTRS)

    Sridhar, Banavar; Smith, Phillip; Suorsa, Raymond E.; Hussien, Bassam

    1993-01-01

    A vision-based obstacle detection system that provides information about objects as a function of azimuth and elevation is discussed. The range map is computed using a sequence of images from a passive sensor, and an extended Kalman filter is used to estimate range to obstacles. The magnitude of the optical flow that provides measurements for each Kalman filter varies significantly over the image depending on the helicopter motion and object location. In a standard Kalman filter, the measurement update takes place at fixed intervals. It may be necessary to use a different measurement update rate in different parts of the image in order to maintain the same signal to noise ratio in the optical flow calculations. A range estimation scheme that accepts the measurement only under certain conditions is presented. The estimation results from the standard Kalman filter are compared with results from a multirate Kalman filter and an event-driven Kalman filter for a sequence of helicopter flight images.
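    The event-driven idea can be sketched as follows (an illustrative toy, not the paper's filter): a one-dimensional range filter accepts an optical-flow-derived measurement only when the flow magnitude clears a signal-to-noise threshold, so the effective update rate adapts to image conditions rather than running at a fixed interval. The range-from-flow model, noise mapping, and thresholds are assumptions.

    ```python
    # Event-driven measurement update: fuse the optical-flow-based range only
    # when its expected signal-to-noise ratio is high enough.
    class RangeEKF1D:
        def __init__(self, r0=50.0, var0=100.0):
            self.x = r0               # range estimate to the obstacle [m]
            self.P = var0             # range error variance

        def predict(self, closing_speed, dt, q=0.5):
            """Propagate range given the (known) closing speed."""
            self.x -= closing_speed * dt
            self.P += q * dt

        def maybe_update(self, flow_mag, flow_noise, v_lateral, focal_px,
                         min_snr=3.0):
            """Fuse a range measurement derived from optical flow if SNR allows."""
            if flow_mag < min_snr * flow_noise:
                return False                      # event condition not met: skip
            z = focal_px * v_lateral / flow_mag   # simple range-from-flow model
            R = (z / flow_mag) ** 2 * flow_noise ** 2   # first-order noise mapping
            K = self.P / (self.P + R)
            self.x += K * (z - self.x)
            self.P *= (1.0 - K)
            return True
    ```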

  9. Overview of the Nasa/science Mission Directorate University Student Instrument Project (usip)

    NASA Astrophysics Data System (ADS)

    Pierce, D. L.

    2016-12-01

    These are incredible times of space and Earth science discovery related to the Earth system, our Sun, the planets, and the universe. The National Aeronautics and Space Administration (NASA) Science Mission Directorate (SMD) provides authentic, student-led, hands-on flight research projects as a component of NASA's science program. The goal of the Undergraduate Student Instrument Project (USIP) is to enable student-led scientific and technology investigations, while also providing crucial hands-on training opportunities for the Nation's future researchers. SMD, working with NASA's Office of Education (OE), the Space Technology Mission Directorate (STMD) and its Centers (GSFC/WFF and AFRC), is actively advancing the vision for student flight research using NASA's suborbital and small spacecraft platforms. Recently proposed and selected USIP projects will open up opportunities for undergraduate researchers to conduct science and develop space technologies. The paper will present an overview of USIP, results of USIP-I, and the status of current USIP-II projects that NASA is sponsoring and expects to fly in the near future.

  10. Pushbroom Hyperspectral Imaging from an Unmanned Aircraft System (UAS) - Geometric Processing Workflow and Accuracy Assessment

    NASA Astrophysics Data System (ADS)

    Turner, D.; Lucieer, A.; McCabe, M.; Parkes, S.; Clarke, I.

    2017-08-01

    In this study, we assess two push broom hyperspectral sensors carried by small (10-15 kg) multi-rotor Unmanned Aircraft Systems (UAS). We used a Headwall Photonics micro-Hyperspec push broom sensor with 324 spectral bands (4-5 nm FWHM) and a Headwall Photonics nano-Hyperspec sensor with 270 spectral bands (6 nm FWHM), both in the VNIR spectral range (400-1000 nm). A gimbal was used to stabilise the sensors against the aircraft flight dynamics, and for the micro-Hyperspec a tightly coupled dual-frequency Global Navigation Satellite System (GNSS) receiver, an Inertial Measurement Unit (IMU), and a Machine Vision Camera (MVC) were used for attitude and position determination. For the nano-Hyperspec, a navigation-grade GNSS system and IMU provided position and attitude data. This study presents the geometric results of one flight over a grass oval on which a dense Ground Control Point (GCP) network was deployed, the aim being to ascertain the geometric accuracy achievable with the system. Using the PARGE software package (ReSe - Remote Sensing Applications) we ortho-rectify the push broom hyperspectral image strips and then quantify the accuracy of the ortho-rectification by using the GCPs as check points. The orientation (roll, pitch, and yaw) of the sensor is measured by the IMU. Alternatively, imagery from an MVC running at 15 Hz, together with accurate camera position data, can be processed with Structure from Motion (SfM) software to obtain an estimated camera orientation. In this study, we look at which of these data sources yields a flight strip with the highest geometric accuracy.
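    The check-point assessment amounts to comparing ortho-rectified GCP positions against their surveyed coordinates and reporting root-mean-square errors; a minimal sketch is given below. The function names and the example coordinate values are illustrative and are not taken from the study.

    ```python
    # Check-point accuracy sketch: residuals between ortho-rectified and
    # surveyed GCP coordinates, reduced to per-axis and total planimetric RMSE.
    import numpy as np

    def checkpoint_rmse(predicted_xy, surveyed_xy):
        """Return (total RMSE, per-axis RMSE) of planimetric residuals, in map units."""
        d = np.asarray(predicted_xy, float) - np.asarray(surveyed_xy, float)
        rmse_axis = np.sqrt((d ** 2).mean(axis=0))        # per-axis RMSE (E, N)
        rmse_total = np.sqrt((d ** 2).sum(axis=1).mean()) # total planimetric RMSE
        return rmse_total, rmse_axis

    # Example with made-up residuals of a few centimetres:
    pred = [[500100.02, 5250200.05], [500150.11, 5250240.02]]
    surv = [[500100.00, 5250200.00], [500150.05, 5250240.00]]
    print(checkpoint_rmse(pred, surv))
    ```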

  11. A Practical Solution Using A New Approach To Robot Vision

    NASA Astrophysics Data System (ADS)

    Hudson, David L.

    1984-01-01

    Up to now, robot vision systems have been designed to serve both application development and operational needs in inspection, assembly and material handling. This universal approach to robot vision is too costly for many practical applications. A new industrial vision system separates the function of application program development from on-line operation. A Vision Development System (VDS) is equipped with facilities designed to simplify and accelerate the application program development process. A complementary but lower-cost Target Application System (TASK) runs the application program developed with the VDS. This concept is presented in the context of an actual robot vision application that improves inspection and assembly for a manufacturer of electronic terminal keyboards. Applications developed with a VDS experience lower development cost when compared with conventional vision systems. Since the TASK processor is not burdened with development tools, it can be installed at a lower cost than comparable "universal" vision systems that are intended to be used for both development and on-line operation. The VDS/TASK approach opens more industrial applications to robot vision that previously were not practical because of the high cost of vision systems. Although robot vision is a new technology, it has been applied successfully to a variety of industrial needs in inspection, manufacturing, and material handling. New developments in robot vision technology are creating practical, cost-effective solutions for a variety of industrial needs. A year or two ago, researchers and robot manufacturers interested in implementing a robot vision application could take one of two approaches. The first approach was to purchase all the necessary vision components from various sources. That meant buying an image processor from one company, a camera from another, and lenses and light sources from yet others. The user then had to assemble the pieces, and in most instances he had to write all of his own software to test, analyze and process the vision application. The second and most common approach was to contract with the vision equipment vendor for the development and installation of a turnkey inspection or manufacturing system. The robot user and his company paid a premium for their vision system in an effort to assure the success of the system. Since 1981, emphasis on robotics has skyrocketed. New groups have been formed in many manufacturing companies with the charter to learn about, test and initially apply new robot and automation technologies. Machine vision is one of the new technologies being tested and applied. This focused interest has created a need for a robot vision system that makes it easy for manufacturing engineers to learn about, test, and implement a robot vision application. A newly developed vision system addresses those needs. The Vision Development System (VDS) is a complete hardware and software product for the development and testing of robot vision applications. A complementary, low-cost Target Application System (TASK) runs the application program developed with the VDS. An actual robot vision application that demonstrates inspection and pre-assembly for keyboard manufacturing is used to illustrate the VDS/TASK approach.

  12. Hot Structure Control Surface Progress for X-37 Technology Development Program

    NASA Technical Reports Server (NTRS)

    Valentine, P. G.; Meyer, David L. (Editor); Snow, Holly (Editor)

    2004-01-01

    The NASA Marshall Space Flight Center (MSFC) has been leading the development of technologies that will enable the development, fabrication, and flight of the automated X-37 Orbital Vehicle (OV). With the Administration's recent announcement of the Vision for Space Exploration, NASA placed the X-37 OV design on hold while developing detailed requirements for a Crew Exploration Vehicle, but has continued funding the development of high-risk, critical technologies for potential future space exploration vehicle applications. Hot Structure Control Surfaces (HSCS) technology development is one of the high-priority areas being funded at this time. The goal of HSCS research is to mitigate risk by qualifying the lightest possible components that meet the stringent X-37 OV weight and performance requirements, including Shuttle-type reentry environments with peak temperatures of 2800 °F. The small size of the X-37 OV (25.7 feet long with a 14.9-foot wingspan) drives the need for advanced HSCS because the vehicle's two primary aerodynamic surfaces, the flaperons and ruddervators, have thicknesses ranging from approximately 5 in. down to 1 in. Traditional metallic or polymer-matrix composites covered with tile or blanket thermal protection system (TPS) materials cannot be used as there is insufficient volume to fabricate such multi-component structures. Therefore, carbon-carbon (C-C) and carbon/silicon-carbide (C-SiC) composite HSCS structures are being developed in parallel by two teams supporting the X-37 prime contractor (The Boeing Company). The Science Applications International Corp. (SAIC) and Carbon-Carbon Advanced Technologies, Inc. (C-CAT) team is developing the C-C HSCS, while the General Electric Energy Power Systems Composites (GE-PSC) and Materials Research and Design (MRD) team is developing the C-SiC HSCS. These two teams were selected to reduce the high level of risk associated with developing advanced control surface components. They have continued HSCS development work as part of the X-37 critical technology development contract. The SAIC/C-CAT team is using Advanced Carbon-Carbon (ACC) because its fabrication is very similar to the process used for Space Shuttle Reinforced Carbon-Carbon (RCC) fabrication, including the SiC-based pack cementation conversion coating system used with both materials. ACC was selected over RCC because it has much higher tension and compression strengths, and because T-300 fiber is readily available, whereas RCC rayon fiber is no longer manufactured. The GE-PSC/MRD team is using a T-300 fiber-reinforced SiC matrix composite material densified by chemical vapor infiltration. The C-SiC material has a SiC-based environmental barrier coating. Major accomplishments have been made over the past year by both HSCS teams. C-C and C-SiC flaperon subcomponents, which are truncated full-scale versions of flight hardware, have been fabricated and are undergoing testing at the NASA Dryden Flight Research Center, NASA Langley Research Center, and U.S. Air Force Research Laboratory. By the end of 2004, ruddervator subcomponents also will be delivered and tested. As NASA moves forward in realizing the Vision for Space Exploration, it will continue to invest in advanced research and development aimed at making new generations of spacecraft safer, more reliable, and more affordable. The X-37 HSCS effort ultimately will benefit the Agency's vision and mission.

  13. Hyperstereopsis in night vision devices: basic mechanisms and impact for training requirements

    NASA Astrophysics Data System (ADS)

    Priot, Anne-Emmanuelle; Hourlier, Sylvain; Giraudet, Guillaume; Leger, Alain; Roumes, Corinne

    2006-05-01

    Including night vision capabilities in Helmet Mounted Displays has been a serious challenge for many years. The use of "see-through" head-mounted image intensifier systems is particularly challenging as it introduces peculiar visual characteristics usually referred to as "hyperstereopsis". Flight testing of such systems started in the early nineties, both in the US and in Europe. While the trials conducted in the US yielded quite controversial results, convergent positive ones were obtained from European testing, mainly in the UK, Germany and France. Subsequently, work on integrating optically coupled I2 tubes on HMDs was discontinued in the US, while European manufacturers developed such HMDs for various rotary-wing platforms such as the TIGER. Coping with hyperstereopsis raises physiological and cognitive human factors issues. Starting in the sixties, the effects of increased interocular separation and adaptation to such unusual vision conditions have been extensively studied by a number of authors such as Wallach, Schor, Judge and Miles, Fisher and Ciuffreda. A synthetic review of the literature on this subject will be presented. According to users' reports, three successive phases will be described for habituation to such devices: initial exposure, a building-compensation phase, and a behavioral-adjustments phase. A habituation model will be suggested to account for HMSD users' reports and literature data bearing on hyperstereopsis, cue weighting for depth perception, adaptation and learning processes, and task cognitive control. Finally, some preliminary results on spatial and temporal adaptation to hyperstereopsis, coming from the survey of training of TIGER pilots currently conducted at the French-German Army Aviation Training Center, will be unveiled.

  14. Aviator's night vision system (ANVIS) in Operation Enduring Freedom (OEF): user acceptability survey

    NASA Astrophysics Data System (ADS)

    Hiatt, Keith L.; Trollman, Christopher J.; Rash, Clarence E.

    2010-04-01

    In 1973, the U.S. Army adopted night vision devices for use in the aviation environment. These devices are based on the principle of image intensification (I2) and have become the mainstay for the aviator's capability to operate during periods of low illumination, i.e., at night. In the nearly four decades that have followed, a number of engineering advancements have significantly improved the performance of these devices. The current version, using 3rd generation I2 technology is known as the Aviator's Night Vision Imaging System (ANVIS). While considerable experience with performance has been gained during training and peacetime operations, no previous studies have looked at user acceptability and performance issues in a combat environment. This study was designed to compare Army Aircrew experiences in a combat environment to currently available information in the published literature (all peacetime laboratory and field training studies) and to determine if the latter is valid. The purpose of this study was to identify and assess aircrew satisfaction with the ANVIS and any visual performance issues or problems relating to its use in Operation Enduring Freedom (OEF). The study consisted of an anonymous survey (based on previous validated surveys used in the laboratory and training environments) of 86 Aircrew members (64% Rated and 36% Non-rated) of an Aviation Task Force approximately 6 months into their OEF deployment. This group represents an aggregate of >94,000 flight hours of which ~22,000 are ANVIS and ~16,000 during this deployment. Overall user acceptability of ANVIS in a combat environment will be discussed.

  15. Effect of light intensity on flight control and temporal properties of photoreceptors in bumblebees.

    PubMed

    Reber, Therese; Vähäkainu, Antti; Baird, Emily; Weckström, Matti; Warrant, Eric; Dacke, Marie

    2015-05-01

    To control flight, insects rely on the pattern of visual motion generated on the retina as they move through the environment. When light levels fall, vision becomes less reliable and flight control thus becomes more challenging. Here, we investigated the effect of light intensity on flight control by filming the trajectories of free-flying bumblebees (Bombus terrestris, Linnaeus 1758) in an experimental tunnel at different light levels. As light levels fell, flight speed decreased and the flight trajectories became more tortuous but the bees were still remarkably good at centring their flight about the tunnel's midline. To investigate whether this robust flight performance can be explained by visual adaptations in the bumblebee retina, we also examined the response speed of the green-sensitive photoreceptors at the same light intensities. We found that the response speed of the photoreceptors significantly decreased as light levels fell. This indicates that bumblebees have both behavioural (reduction in flight speed) and retinal (reduction in response speed of the photoreceptors) adaptations to allow them to fly in dim light. However, the more tortuous flight paths recorded in dim light suggest that these adaptations do not support flight with the same precision during the twilight hours of the day. © 2015. Published by The Company of Biologists Ltd.

  16. Basic design principles of colorimetric vision systems

    NASA Astrophysics Data System (ADS)

    Mumzhiu, Alex M.

    1998-10-01

    Color measurement is an important part of overall production quality control in the textile, coating, plastics, food, paper and other industries. Color measurement instruments such as colorimeters and spectrophotometers, used for production quality control, have many limitations. In many applications they cannot be used for a variety of reasons and have to be replaced with human operators. Machine vision has great potential for color measurement. The components for color machine vision systems, such as broadcast-quality 3-CCD cameras, fast and inexpensive PCI frame grabbers, and sophisticated image processing software packages, are available. However, the machine vision industry has only started to approach the color domain. The few color machine vision systems on the market, produced by the largest machine vision manufacturers, have very limited capabilities. A lack of understanding that a vision-based color measurement system can fail if it ignores the basic principles of colorimetry is the main reason for the slow progress of color vision systems. The purpose of this paper is to clarify how color measurement principles have to be applied to vision systems and how the electro-optical design features of colorimeters have to be modified in order to implement them in vision systems. The subject far exceeds the limitations of a journal paper, so only the most important aspects will be discussed, along with an overview of the major areas of application for colorimetric vision systems. Finally, the reasons why some customers are happy with their vision systems and some are not will be analyzed.

  17. Computational approaches to vision

    NASA Technical Reports Server (NTRS)

    Barrow, H. G.; Tenenbaum, J. M.

    1986-01-01

    Vision is examined in terms of a computational process, and the competence, structure, and control of computer vision systems are analyzed. Theoretical and experimental data on the formation of a computer vision system are discussed. Consideration is given to early vision, the recovery of intrinsic surface characteristics, higher levels of interpretation, and system integration and control. A computational visual processing model is proposed and its architecture and operation are described. Examples of state-of-the-art vision systems, which include some of the levels of representation and processing mechanisms, are presented.

  18. The Application of Lidar to Synthetic Vision System Integrity

    NASA Technical Reports Server (NTRS)

    Campbell, Jacob L.; UijtdeHaag, Maarten; Vadlamani, Ananth; Young, Steve

    2003-01-01

    One goal in the development of a Synthetic Vision System (SVS) is to create a system that can be certified by the Federal Aviation Administration (FAA) for use at various flight criticality levels. As part of NASA's Aviation Safety Program, Ohio University and NASA Langley have been involved in the research and development of real-time terrain database integrity monitors for SVS. Integrity monitors based on a consistency check with onboard sensors may be required if the inherent terrain database integrity is not sufficient for a particular operation. Sensors such as the radar altimeter and weather radar, which are available on most commercial aircraft, are currently being investigated for use in a real-time terrain database integrity monitor. This paper introduces the concept of using a Light Detection And Ranging (LiDAR) sensor as part of a real-time terrain database integrity monitor. A LiDAR system consists of a scanning laser ranger, an inertial measurement unit (IMU), and a Global Positioning System (GPS) receiver. Information from these three sensors can be combined to generate synthesized terrain models (profiles), which can then be compared to the stored SVS terrain model. This paper discusses an initial performance evaluation of the LiDAR-based terrain database integrity monitor using LiDAR data collected over Reno, Nevada. The paper will address the consistency checking mechanism and test statistic, sensitivity to position errors, and a comparison of the LiDAR-based integrity monitor to a radar altimeter-based integrity monitor.
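    The consistency-check idea can be sketched in a few lines (a simplified stand-in, not the monitor evaluated in the paper): aligned elevation profiles from the LiDAR-synthesized terrain and the stored SVS database are differenced and reduced to a normalized test statistic that raises an alarm when it exceeds an assumed error budget. The threshold and the mean-absolute-disparity statistic are assumptions.

    ```python
    # Toy terrain-database consistency check between a LiDAR-derived profile
    # and the stored SVS terrain profile along the flight path.
    import numpy as np

    def integrity_statistic(lidar_profile, database_profile, sigma=3.0):
        """Return (statistic, alarm) for two aligned elevation profiles [m]."""
        disparity = np.asarray(lidar_profile, float) - np.asarray(database_profile, float)
        # Mean absolute disparity normalized by the assumed error level sigma [m].
        T = np.abs(disparity).mean() / sigma
        return T, T > 1.0    # alarm if disparity exceeds the assumed error budget
    ```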

  19. Exploration Architecture Options - ECLSS, EVA, TCS Implications

    NASA Technical Reports Server (NTRS)

    Chambliss, Joe; Henninger, Don; Lawrence, Carl

    2009-01-01

    Many options for exploration of the Moon and Mars have been identified and evaluated since the Vision for Space Exploration (VSE) was announced in 2004. Lunar architectures have been identified and addressed by the Lunar Surface Systems team to establish options for how to get to, and then inhabit and explore, the Moon. The Augustine Commission evaluated human space flight for the Obama administration and identified many options for how to conduct human spaceflight in the future. This paper evaluates the options for exploration of the Moon and Mars, and those of the Augustine human spaceflight commission, for the implications of each architecture on the Environmental Control and Life Support, ExtraVehicular Activity, and Thermal Control systems. The advantages and disadvantages of each architecture and option are presented.

  20. Launch Vehicles

    NASA Image and Video Library

    2007-09-09

    Under the goals of the Vision for Space Exploration, Ares I is a chief component of the cost-effective space transportation infrastructure being developed by NASA's Constellation Program. This transportation system will safely and reliably carry human explorers back to the moon, and then onward to Mars and other destinations in the solar system. The Ares I effort includes multiple project element teams at NASA centers and contract organizations around the nation, and is managed by the Exploration Launch Projects Office at NASA's Marshall Space Flight Center (MSFC). ATK Launch Systems near Brigham City, Utah, is the prime contractor for the first stage booster. ATK's subcontractor, United Space Alliance of Houston, is designing, developing and testing the parachutes at its facilities at NASA's Kennedy Space Center in Florida. NASA's Johnson Space Center in Houston hosts the Constellation Program and Orion Crew Capsule Project Office and provides test instrumentation and support personnel. Together, these teams are developing vehicle hardware, evolving proven technologies, and testing components and systems. Their work builds on powerful, reliable space shuttle propulsion elements and nearly a half-century of NASA space flight experience and technological advances. Ares I is an inline, two-stage rocket configuration topped by the Crew Exploration Vehicle, its service module, and a launch abort system. In this HD video image, the first stage reentry 1/2%-scale model is undergoing pressure measurements inside the wind tunnel testing facility at MSFC. (Highest resolution available)

  1. U.S. Air Force Aircrew Flight Protective Eyewear Program

    DTIC Science & Technology

    2013-02-01

    ...(MIL-DTL-32000), fire-resistant hydraulic fluid (MIL-PRF-46170), petroleum-based hydraulic fluid (MIL-PRF-6083), gasoline (87% octane), motor oil... UPLC installed... Abrasion: FPE spectacles and goggles shall maximize resistance to scratching/abrasion to minimize interference with vision... Visual Area... Ballistic and Impact Resistance...

  2. The 1995 Aircrew Operational Vision Survey: Results, Analysis, and Recommendations.

    DTIC Science & Technology

    1999-05-04

    ...the high 55.4% overall and 60% Active Duty rate of return. Pre-production and Advertising Campaign: There were two main goals in the advertising... strategy. First, aircrew members needed to know that a survey was forthcoming. Also, flight surgeons needed to be prepared to brief entire squadrons...

  3. 14 CFR 23.775 - Windshields and windows.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... loadings and flight loads, or compliance with the fail-safe requirements of paragraph (d) of this section... loads combined with critical aerodynamic pressure and temperature effects, after failure of any load... in front of the pilots must be arranged so that, assuming the loss of vision through any one panel...

  4. 14 CFR 23.775 - Windshields and windows.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... loadings and flight loads, or compliance with the fail-safe requirements of paragraph (d) of this section... loads combined with critical aerodynamic pressure and temperature effects, after failure of any load... in front of the pilots must be arranged so that, assuming the loss of vision through any one panel...

  5. 14 CFR 23.775 - Windshields and windows.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... loadings and flight loads, or compliance with the fail-safe requirements of paragraph (d) of this section... loads combined with critical aerodynamic pressure and temperature effects, after failure of any load... in front of the pilots must be arranged so that, assuming the loss of vision through any one panel...

  6. 14 CFR 23.775 - Windshields and windows.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... loadings and flight loads, or compliance with the fail-safe requirements of paragraph (d) of this section... loads combined with critical aerodynamic pressure and temperature effects, after failure of any load... in front of the pilots must be arranged so that, assuming the loss of vision through any one panel...

  7. 14 CFR 23.775 - Windshields and windows.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... loadings and flight loads, or compliance with the fail-safe requirements of paragraph (d) of this section... loads combined with critical aerodynamic pressure and temperature effects, after failure of any load... in front of the pilots must be arranged so that, assuming the loss of vision through any one panel...

  8. The remarkable visual capacities of nocturnal insects: vision at the limits with small eyes and tiny brains.

    PubMed

    Warrant, Eric J

    2017-04-05

    Nocturnal insects have evolved remarkable visual capacities, despite small eyes and tiny brains. They can see colour, control flight and land, react to faint movements in their environment, navigate using dim celestial cues and find their way home after a long and tortuous foraging trip using learned visual landmarks. These impressive visual abilities occur at light levels when only a trickle of photons are being absorbed by each photoreceptor, begging the question of how the visual system nonetheless generates the reliable signals needed to steer behaviour. In this review, I attempt to provide an answer to this question. Part of the answer lies in their compound eyes, which maximize light capture. Part lies in the slow responses and high gains of their photoreceptors, which improve the reliability of visual signals. And a very large part lies in the spatial and temporal summation of these signals in the optic lobe, a strategy that substantially enhances contrast sensitivity in dim light and allows nocturnal insects to see a brighter world, albeit a slower and coarser one. What is abundantly clear, however, is that during their evolution insects have overcome several serious potential visual limitations, endowing them with truly extraordinary night vision. This article is part of the themed issue 'Vision in dim light'. © 2017 The Author(s).

  9. Vision and Voyages: Lessons Learned from the Planetary Decadal Survey

    NASA Astrophysics Data System (ADS)

    Squyres, S. W.

    2015-12-01

    The most recent planetary decadal survey, entitled Vision and Voyages for Planetary Science in the Decade 2013-2022, provided a detailed set of priorities for solar system exploration. Those priorities drew on broad input from the U.S. and international planetary science community. Using white papers, town hall meetings, and open meetings of the decadal committees, community views were solicited and a consensus began to emerge. The final report summarized that consensus. Like many past decadal reports, the centerpiece of Vision and Voyages was a set of priorities for future space flight projects. Two things distinguished this report from some previous decadals. First, conservative and independent cost estimates were obtained for all of the projects that were considered. These independent cost estimates, rather than estimates generated by project advocates, were used to judge each project's expected science return per dollar. Second, rather than simply accepting NASA's ten-year projection of expected funding for planetary exploration, decision rules were provided to guide program adjustments if actual funding did not follow projections. To date, NASA has closely followed decadal recommendations. In particular, the two highest priority "flagship" missions, a Mars rover to collect samples for return to Earth and a mission to investigate a possible ocean on Europa, are both underway. The talk will describe the planetary decadal process in detail, and provide a more comprehensive assessment of NASA's response to it.

  10. Biomimetic machine vision system.

    PubMed

    Harman, William M; Barrett, Steven F; Wright, Cameron H G; Wilcox, Michael

    2005-01-01

    Real-time application of digital imaging for use in machine vision systems has proven to be prohibitive when used within control systems that employ low-power single processors without compromising the scope of vision or resolution of captured images. Development of a real-time analog machine vision system is the focus of research taking place at the University of Wyoming. This new vision system is based upon the biological vision system of the common house fly. Development of a single sensor is accomplished, representing a single facet of the fly's eye. This new sensor is then incorporated into an array of sensors capable of detecting objects and tracking motion in 2-D space. This system "preprocesses" incoming image data, resulting in minimal data processing to determine the location of a target object. Due to the nature of the sensors in the array, hyperacuity is achieved, thereby eliminating resolution issues found in digital vision systems. In this paper, we will discuss the biological traits of the fly eye and the specific traits that led to the development of this machine vision system. We will also discuss the process of developing an analog-based sensor that mimics the characteristics of interest in the biological vision system. This paper will conclude with a discussion of how an array of these sensors can be applied toward solving real-world machine vision issues.

  11. Color image processing and vision system for an automated laser paint-stripping system

    NASA Astrophysics Data System (ADS)

    Hickey, John M., III; Hise, Lawson

    1994-10-01

    Color image processing in machine vision systems has not gained general acceptance. Most machine vision systems use images that are shades of gray. The Laser Automated Decoating System (LADS) required a vision system which could discriminate between substrates of various colors and textures and paints ranging from semi-gloss grays to high-gloss red, white and blue (Air Force Thunderbirds). The changing lighting levels produced by the pulsed CO2 laser mandated a vision system that did not require constant color-temperature lighting for reliable image analysis.

  12. Habitats and Surface Construction Technology and Development Roadmap

    NASA Technical Reports Server (NTRS)

    Cohen, Marc; Kennedy, Kriss J.

    1997-01-01

    The vision of the technology and development teams at NASA Ames and Johnson Research Centers is to provide the capability for automated delivery and emplacement of habitats and surface facilities. The benefits of the program are as follows: Composites and Inflatables: 30-50% (goal) lighter than Al Hard Structures; Capability for Increased Habitable Volume, Launch Efficiency; Long Term Growth Potential; and Supports initiation of commercial and industrial expansion. Key Habitats and Surface Construction (H&SC) technology issues are: Habitat Shell Structural Materials; Seals and Mechanisms; Construction and Assembly: Automated Pro-Deploy Construction Systems; ISRU Soil/Construction Equipment: Lightweight and Lower Power Needs; Radiation Protection (Health and Human Performance Tech.); Life Support System (Regenerative Life Support System Tech.); Human Physiology of Long Duration Space Flight (Health and Human Performance Tech.); and Human Psychology of Long Duration Space Flight (Health and Human Performance Tech.) What is being done regarding these issues?: Use of composite materials for X-38 CRV, RLV, etc.; TransHAB inflatable habitat design/development; Japanese corporations working on ISRU-derived construction processes. What needs to be done for the 2004 Go Decision?: Characterize Mars Environmental Conditions: Civil Engineering, Material Durability, etc.; Determine Credibility of Inflatable Structures for Human Habitation; and Determine Seal Technology for Mechanisms and Hatches, Life Cycle, and Durability. An overview encompassing all of the issues above is presented.

  13. Deployable reconnaissance from a VTOL UAS in urban environments

    NASA Astrophysics Data System (ADS)

    Barnett, Shane; Bird, John; Culhane, Andrew; Sharkasi, Adam; Reinholtz, Charles

    2007-04-01

    Reconnaissance collection in unknown or hostile environments can be a dangerous and life-threatening task. To reduce this risk, the Unmanned Systems Group at Virginia Tech has produced a fully autonomous reconnaissance system able to provide live video reconnaissance from outside and inside unknown structures. This system consists of an autonomous helicopter, which launches a small reconnaissance pod into a building, and an operator control unit (OCU) at a ground station. The helicopter is a modified Bergen Industrial Twin using a Rotomotion flight controller and can fly missions of up to half an hour. The mission planning OCU can control the helicopter remotely through teleoperation or fully autonomously by GPS waypoints. A forward-facing camera and template matching aid in navigation by identifying the target building. Once the target structure is identified, vision algorithms will center the UAS adjacent to open windows or doorways. Tunable parameters in the vision algorithm account for varying launch distances and opening sizes. Launch of the reconnaissance pod may be initiated remotely through a human in the loop or autonomously. Compressed air propels the half-pound stationary pod or the larger mobile pod into the open portals. Once inside the building, the reconnaissance pod transmits live video back to the helicopter. The helicopter acts as a repeater node for increased video range and simplification of communication back to the ground station.
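
    The abstract names template matching as the mechanism for identifying the target building but gives no implementation details. The sketch below shows one conventional way to realize that step with OpenCV; the file names and the 0.7 acceptance threshold are hypothetical, the latter standing in for the tunable parameters mentioned above.

    ```python
    import cv2

    # Hypothetical file names; in the described system the frame would come from the
    # forward-facing camera and the template from a stored image of the target building.
    frame = cv2.imread("camera_frame.png", cv2.IMREAD_GRAYSCALE)
    template = cv2.imread("building_template.png", cv2.IMREAD_GRAYSCALE)

    # Normalized cross-correlation is fairly robust to uniform brightness changes.
    scores = cv2.matchTemplate(frame, template, cv2.TM_CCOEFF_NORMED)
    _, best_score, _, best_loc = cv2.minMaxLoc(scores)

    if best_score > 0.7:  # assumed acceptance threshold (a tunable parameter)
        h, w = template.shape
        center = (best_loc[0] + w // 2, best_loc[1] + h // 2)
        print("target building found near", center, "score", round(best_score, 2))
    ```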

  14. Bioinspired engineering of exploration systems for NASA and DoD

    NASA Technical Reports Server (NTRS)

    Thakoor, Sarita; Chahl, Javaan; Srinivasan, M. V.; Young, L.; Werblin, Frank; Hine, Butler; Zornetzer, Steven

    2002-01-01

    A new approach called bioinspired engineering of exploration systems (BEES) and its value for solving pressing NASA and DoD needs are described. Insects (for example honeybees and dragonflies) cope remarkably well with their world, despite possessing a brain containing less than 0.01% as many neurons as the human brain. Although most insects have immobile eyes with fixed focus optics and lack stereo vision, they use a number of ingenious, computationally simple strategies for perceiving their world in three dimensions and navigating successfully within it. We are distilling selected insect-inspired strategies to obtain novel solutions for navigation, hazard avoidance, altitude hold, stable flight, terrain following, and gentle deployment of payload. Such functionality provides potential solutions for future autonomous robotic space and planetary explorers. A BEES approach to developing lightweight low-power autonomous flight systems should be useful for flight control of such biomorphic flyers for both NASA and DoD needs. Recent biological studies of mammalian retinas confirm that representations of multiple features of the visual world are systematically parsed and processed in parallel. Features are mapped to a stack of cellular strata within the retina. Each of these representations can be efficiently modeled in semiconductor cellular nonlinear network (CNN) chips. We describe recent breakthroughs in exploring the feasibility of the unique blending of insect strategies of navigation with mammalian visual search, pattern recognition, and image understanding into hybrid biomorphic flyers for future planetary and terrestrial applications. We describe a few future mission scenarios for Mars exploration, uniquely enabled by these newly developed biomorphic flyers.
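
    As a concrete illustration of one of the insect strategies alluded to above, honeybee-style corridor centering balances the optic-flow magnitudes seen on either side of the flight path. The sketch below is a generic rendering of that idea, not the BEES implementation; the Farneback flow call and the gain value are illustrative choices.

    ```python
    import cv2
    import numpy as np

    def centering_command(prev_gray, curr_gray, gain=0.5):
        """Honeybee-style corridor centering: steer away from the side with the
        larger optic-flow magnitude. Illustrative only; gain is an assumed value."""
        flow = cv2.calcOpticalFlowFarneback(prev_gray, curr_gray, None,
                                            0.5, 3, 15, 3, 5, 1.2, 0)
        magnitude = np.linalg.norm(flow, axis=2)
        half = magnitude.shape[1] // 2
        left, right = magnitude[:, :half].mean(), magnitude[:, half:].mean()
        # Positive command steers right (away from the nearer left wall), and vice versa.
        return gain * (left - right) / (left + right + 1e-6)
    ```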

  15. Wearable Improved Vision System for Color Vision Deficiency Correction

    PubMed Central

    Riccio, Daniel; Di Perna, Luigi; Sanniti Di Baja, Gabriella; De Nino, Maurizio; Rossi, Settimio; Testa, Francesco; Simonelli, Francesca; Frucci, Maria

    2017-01-01

    Color vision deficiency (CVD) is an extremely frequent vision impairment that compromises the ability to recognize colors. In order to improve color vision in a subject with CVD, we designed and developed a wearable improved vision system based on an augmented reality device. The system was validated in a clinical pilot study on 24 subjects with CVD (18 males and 6 females, aged 37.4 ± 14.2 years). The primary outcome was the improvement in the Ishihara Vision Test score with the correction proposed by our system. The Ishihara test score improved significantly (p = 0.03), from 5.8 ± 3.0 without correction to 14.8 ± 5.0 with correction. Almost all patients showed an improvement in color vision, as shown by the increased test scores. Moreover, with our system, 12 subjects (50%) passed the color vision test as normal-vision subjects do. The development and preliminary validation of the proposed platform confirm that a wearable augmented-reality device could be an effective aid to improve color vision in subjects with CVD. PMID:28507827
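
    The paper's actual color-remapping algorithm is not described in the abstract. As a rough illustration of the general class of correction involved, the sketch below applies a daltonization-style transform: the difference between an image and a simulated red-green-deficient view of it is redistributed into channels the viewer can still distinguish. Both matrices are assumed, generic values, not the authors' method.

    ```python
    import numpy as np

    # Rough linear simulation of a red-green deficiency in RGB space (assumed matrix).
    SIM_DEUTAN = np.array([[0.625, 0.375, 0.0],
                           [0.700, 0.300, 0.0],
                           [0.000, 0.300, 0.7]])

    # Redistribute the lost red-green contrast into green and blue (assumed weights).
    REDISTRIBUTE = np.array([[0.0, 0.0, 0.0],
                             [0.7, 1.0, 0.0],
                             [0.7, 0.0, 1.0]])

    def correct(rgb):
        """rgb: float array in [0, 1] with shape (..., 3); returns a corrected image."""
        simulated = rgb @ SIM_DEUTAN.T           # what a deficient viewer roughly sees
        error = rgb - simulated                  # information lost to the deficiency
        return np.clip(rgb + error @ REDISTRIBUTE.T, 0.0, 1.0)

    pixel = np.array([[0.8, 0.2, 0.2]])          # a reddish pixel prone to confusion
    print(correct(pixel))
    ```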

  16. The Flight Telerobotic Servicer (FTS) - A focus for automation and robotics on the Space Station

    NASA Technical Reports Server (NTRS)

    Hinkal, Sanford W.; Andary, James F.; Watzin, James G.; Provost, David E.

    1987-01-01

    The concept, fundamental design principles, and capabilities of the FTS, a multipurpose telerobotic system for use on the Space Station and Space Shuttle, are discussed. The FTS is intended to assist the crew in the performance of extravehicular tasks; the telerobot will also be used on the Orbital Maneuvering Vehicle to service free-flyer spacecraft. The FTS will be capable of both teleoperation and autonomous operation; eventually it may also utilize ground control. By careful selection of the functional architecture and a modular approach to the hardware and software design, the FTS can accept developments in artificial intelligence and newer, more advanced sensors, such as machine vision and collision avoidance.

  17. A survey of selected aviators' perceptions regarding Army crew endurance guidelines.

    PubMed

    Caldwell, J A; Caldwell, J L; Hartnett, T C

    1995-01-01

    A 59-item questionnaire was administered to Army helicopter pilots from a variety of Army units to assess crew endurance issues. Analysis of 653 completed questionnaires indicated that respondents felt that the maintenance of aviator proficiency was more important than the fulfillment of only currency requirements in improving flight endurance. Approximately three-quarters of the respondents said that physical training was important to them personally, and 63% said that improved physical fitness reduces flight-related fatigue. With regard to the current crew endurance guide, only 1% of the respondents thought that the guide was exceptional and 65% said that they thought it should be rewritten. Adjustments were suggested for some of the recommended flight time limitations, to include liberalizing the factor associated with night-vision device flight. A majority of respondents indicated that data from either in-flight endurance evaluations or questionnaires administered to personnel in the field should be used to develop a new guide. Most respondents did not feel comfortable delegating responsibility for total crew endurance planning to unit commanders.

  18. Infusing Stretch Goal Requirements into the Constellation Program

    NASA Technical Reports Server (NTRS)

    Lee, Young H.; Galpin, Roger A.; Ingoldsby, Kevin

    2008-01-01

    In 2004, the Vision for Space Exploration (VSE) was announced by the United States President's Administration in an effort to explore space and to extend a human presence across our solar system. Subsequently, the National Aeronautics and Space Administration (NASA) established the Exploration Systems Mission Directorate (ESMD) to develop a constellation of new capabilities, supporting technologies, and foundational research that allows for the sustained and affordable exploration of space. Then, ESMD specified the primary mission for the Constellation Program to carry out a series of human expeditions, ranging from Low Earth Orbit (LEO) to the surface of the Moon, Mars, and beyond for the purposes of conducting human exploration of space. Thus, the Constellation Program was established at the Lyndon B. Johnson Space Center (JSC) to manage the development of the flight and ground infrastructure and systems that can enable continued and extended human access to space. Constellation Program's "Design Objectives" call for early attention to the management of the program's life cycle costs through the Program's Need, Goals, and Objectives (NGO) document, which provides the vision, scope, and key areas of focus for the Program. One general policy of the Constellation Program, found in the Constellation Architecture Requirements Document (CARD), states: "A sustainable program hinges on how effectively total life cycle costs are managed. Developmental costs are a key consideration, but total life cycle costs related to the production, processing, and operation of the entire architecture must be accounted for in design decisions sufficiently to ensure future resources are available for ever more ambitious missions into the solar system....It is the intent of the Constellation Program to aggressively manage this aspect of the program using the design policies and simplicity." To respond to the Program's strong desire to manage the program life cycle costs, special efforts were established to identify operability requirements to influence flight vehicle and ground infrastructure design in order to impact the life cycle operations costs, and stretch goal requirements were introduced to the Program. This paper will describe how these stretch goal requirements were identified, developed, refined, matured, approved, and infused into the CARD. The paper will also document several challenges encountered when infusing the stretch goal requirements into the Constellation Program.

  19. Visual Odometry for Autonomous Deep-Space Navigation Project

    NASA Technical Reports Server (NTRS)

    Robinson, Shane; Pedrotty, Sam

    2016-01-01

    Autonomous rendezvous and docking (AR&D) is a critical need for manned spaceflight, especially in deep space where communication delays essentially leave crews on their own for critical operations like docking. Previously developed AR&D sensors have been large, heavy, power-hungry, and may still require further development (e.g., Flash LiDAR). Other approaches to vision-based navigation are not computationally efficient enough to operate quickly on slower, flight-like computers. The key technical challenge for visual odometry is to adapt it from the current terrestrial applications it was designed for to function in the harsh lighting conditions of space. This effort leveraged Draper Laboratory’s considerable prior development and expertise, benefitting both parties. The algorithm Draper has created is unique from other pose estimation efforts as it has a comparatively small computational footprint (suitable for use onboard a spacecraft, unlike alternatives) and potentially offers accuracy and precision needed for docking. This presents a solution to the AR&D problem that only requires a camera, which is much smaller, lighter, and requires far less power than competing AR&D sensors. We have demonstrated the algorithm’s performance and ability to process ‘flight-like’ imagery formats with a ‘flight-like’ trajectory, positioning ourselves to easily process flight data from the upcoming ‘ISS Selfie’ activity and then compare the algorithm’s quantified performance to the simulated imagery. This will bring visual odometry beyond TRL 5, proving its readiness to be demonstrated as part of an integrated system. Once beyond TRL 5, visual odometry will be poised to be demonstrated as part of a system in an in-space demo where relative pose is critical, like Orion AR&D, ISS robotic operations, asteroid proximity operations, and more.
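
    Draper's algorithm itself is not published in this abstract, but the basic two-frame visual-odometry step it builds on can be sketched generically: match features between consecutive images, estimate the essential matrix with RANSAC, and recover the relative rotation and (unit-scale) translation. The OpenCV-based sketch below assumes a calibrated camera matrix K and is illustrative only.

    ```python
    import cv2
    import numpy as np

    def relative_pose(img_a, img_b, K):
        """Generic two-frame visual-odometry step: ORB features, essential matrix, pose."""
        orb = cv2.ORB_create(2000)
        kp_a, des_a = orb.detectAndCompute(img_a, None)
        kp_b, des_b = orb.detectAndCompute(img_b, None)

        matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
        matches = matcher.match(des_a, des_b)

        pts_a = np.float32([kp_a[m.queryIdx].pt for m in matches])
        pts_b = np.float32([kp_b[m.trainIdx].pt for m in matches])

        # RANSAC rejects outlier matches caused by, e.g., harsh specular lighting.
        E, inliers = cv2.findEssentialMat(pts_a, pts_b, K, method=cv2.RANSAC, threshold=1.0)
        _, R, t, _ = cv2.recoverPose(E, pts_a, pts_b, K, mask=inliers)
        return R, t  # rotation and unit-scale translation between the two camera frames
    ```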

  1. Vision Screening

    NASA Technical Reports Server (NTRS)

    1993-01-01

    The Visi Screen OSS-C, marketed by Vision Research Corporation, incorporates image processing technology originally developed by Marshall Space Flight Center. Its advantage in eye screening is speed. Because it requires no response from a subject, it can be used to detect eye problems in very young children. An electronic flash from a 35 millimeter camera sends light into a child's eyes, which is reflected back to the camera lens. The photorefractor then analyzes the retinal reflexes generated and produces an image of the child's eyes, which enables a trained observer to identify any defects. The device is used by pediatricians, day care centers and civic organizations that concentrate on children with special needs.

  2. Knowledge-based machine vision systems for space station automation

    NASA Technical Reports Server (NTRS)

    Ranganath, Heggere S.; Chipman, Laure J.

    1989-01-01

    Computer vision techniques which have the potential for use on the space station and related applications are assessed. A knowledge-based vision system (expert vision system) and the development of a demonstration system for it are described. This system implements some of the capabilities that would be necessary in a machine vision system for the robot arm of the laboratory module in the space station. A Perceptics 9200e image processor, on a host VAXstation, was used to develop the demonstration system. In order to use realistic test images, photographs of actual space shuttle simulator panels were used. The system's capabilities of scene identification and scene matching are discussed.

  3. A Successful Infusion Process for Enabling Lunar Exploration Technologies

    NASA Technical Reports Server (NTRS)

    Over, Ann P.; Klem, Mark K.; Motil, Susan M.

    2008-01-01

    The NASA Vision for Space Exploration begins with a more reliable flight capability to the International Space Station and ends with sending humans to Mars. An important stepping stone on the path to Mars encompasses human missions to the Moon. There is little doubt throughout the stakeholder community that new technologies will be required to enable this Vision. However, there are many factors that influence the ability to successfully infuse any technology, including the technical risk, requirement and development schedule maturity, and the funds available. This paper focuses on effective infusion processes that have been used recently for the technologies in development for the lunar exploration flight program, Constellation. Recent successes with Constellation customers are highlighted for the Exploration Technology Development Program (ETDP) Projects managed by NASA Glenn Research Center (GRC). Following an overview of the technical context of both the flight program and the technology capability mapping, the process is described for how to effectively build an integrated technology infusion plan. The process starts with a sound risk development plan and is completed with an integrated project plan, including content, schedule and cost. In reality, the available resources for this development are going to change over time, necessitating some level of iteration in the planning. However, the driving process is based on the initial risk assessment, which changes only when the overall architecture changes, enabling some level of stability in the process.

  4. 14 CFR 25.775 - Windshields and windows.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... differential loads combined with critical aerodynamic pressure and temperature effects after any single failure... front of the pilots must be arranged so that, assuming the loss of vision through any one panel, one or... flight and landing. [Doc. No. 5066, 29 FR 18291, Dec. 24, 1964, as amended by Amdt. 25-23, 35 FR 5676...

  5. 14 CFR 25.775 - Windshields and windows.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... differential loads combined with critical aerodynamic pressure and temperature effects after any single failure... front of the pilots must be arranged so that, assuming the loss of vision through any one panel, one or... flight and landing. [Doc. No. 5066, 29 FR 18291, Dec. 24, 1964, as amended by Amdt. 25-23, 35 FR 5676...

  6. 14 CFR 25.775 - Windshields and windows.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... differential loads combined with critical aerodynamic pressure and temperature effects after any single failure... front of the pilots must be arranged so that, assuming the loss of vision through any one panel, one or... flight and landing. [Doc. No. 5066, 29 FR 18291, Dec. 24, 1964, as amended by Amdt. 25-23, 35 FR 5676...

  7. 14 CFR 25.775 - Windshields and windows.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... differential loads combined with critical aerodynamic pressure and temperature effects after any single failure... front of the pilots must be arranged so that, assuming the loss of vision through any one panel, one or... flight and landing. [Doc. No. 5066, 29 FR 18291, Dec. 24, 1964, as amended by Amdt. 25-23, 35 FR 5676...

  8. 14 CFR 25.775 - Windshields and windows.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... differential loads combined with critical aerodynamic pressure and temperature effects after any single failure... front of the pilots must be arranged so that, assuming the loss of vision through any one panel, one or... flight and landing. [Doc. No. 5066, 29 FR 18291, Dec. 24, 1964, as amended by Amdt. 25-23, 35 FR 5676...

  9. Information-Driven Autonomous Exploration for a Vision-Based Mav

    NASA Astrophysics Data System (ADS)

    Palazzolo, E.; Stachniss, C.

    2017-08-01

    Most micro aerial vehicles (MAVs) are flown manually by a pilot. When it comes to autonomous exploration for MAVs equipped with cameras, we need a good exploration strategy for covering an unknown 3D environment in order to build an accurate map of the scene. In particular, the robot must select appropriate viewpoints to acquire informative measurements. In this paper, we present an approach that computes, in real time, a smooth flight path for the exploration of a 3D environment with a vision-based MAV. We assume a known bounding box of the object or building to explore, and our approach iteratively computes the next best viewpoints using a utility function that considers the expected information gain of new measurements, the distance between viewpoints, and the smoothness of the flight trajectories. In addition, the algorithm takes into account the elapsed time of the exploration run to safely land the MAV at its starting point after a user-specified time. We implemented our algorithm, and our experiments suggest that it allows for a precise reconstruction of the 3D environment while guiding the robot smoothly through the scene.
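
    The exact utility function is not given in the abstract, but a plausible form combining the stated terms (expected information gain, distance between viewpoints, trajectory smoothness, and the elapsed-time constraint) can be sketched as a weighted score. The weights, the multiplicative time discount, and the candidate values below are assumptions made for illustration.

    ```python
    def viewpoint_utility(info_gain, travel_dist, smoothness, elapsed, budget,
                          w_gain=1.0, w_dist=0.5, w_smooth=0.3):
        """Score a candidate viewpoint; weights and the time term are illustrative."""
        # Favor informative views, penalize long detours and sharp trajectory changes.
        score = w_gain * info_gain - w_dist * travel_dist - w_smooth * smoothness
        # As the user-specified time budget runs out, discount far-away candidates
        # so the planner drifts back toward the landing/start point.
        time_left = max(0.0, 1.0 - elapsed / budget)
        return score * time_left

    candidates = [
        # (expected information gain, distance from current pose, turn cost)
        (12.0, 4.0, 0.2),
        (15.0, 9.0, 0.8),
        (8.0, 1.5, 0.1),
    ]
    elapsed, budget = 300.0, 600.0  # seconds
    best = max(candidates, key=lambda c: viewpoint_utility(*c, elapsed, budget))
    print("next best viewpoint:", best)
    ```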

  10. Biological Basis For Computer Vision: Some Perspectives

    NASA Astrophysics Data System (ADS)

    Gupta, Madan M.

    1990-03-01

    Using biology as a basis for the development of sensors, devices and computer vision systems is a challenge to systems and vision scientists. It is also a field of promising research for engineering applications. Biological sensory systems, such as vision, touch and hearing, sense different physical phenomena from our environment, yet they possess some common mathematical functions. These mathematical functions are cast into the neural layers that are distributed throughout our sensory regions, sensory information transmission channels, and the cortex, the centre of perception. In this paper, we are concerned with the study of the biological vision system and the emulation of some of its mathematical functions, both retinal and visual cortex, for the development of a robust computer vision system. This field of research is not only intriguing, but offers a great challenge to systems scientists in the development of functional algorithms. These functional algorithms can be generalized for further studies in such fields as signal processing, control systems and image processing. Our studies are heavily dependent on the use of fuzzy-neural layers and generalized receptive fields. Building blocks of such neural layers and receptive fields may lead to the design of better sensors and better computer vision systems. It is hoped that these studies will lead to the development of better artificial vision systems with various applications to vision prosthesis for the blind, robotic vision, medical imaging, medical sensors, industrial automation, remote sensing, space stations and ocean exploration.
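
    As one concrete example of the retina-inspired building blocks mentioned above, a center-surround receptive field can be modeled as a difference of Gaussians and slid across an image. The sketch below is a generic illustration; the kernel size and the two sigmas are arbitrary example values.

    ```python
    import numpy as np

    def dog_kernel(size=9, sigma_center=1.0, sigma_surround=2.5):
        """Center-surround (difference-of-Gaussians) receptive field; parameters are illustrative."""
        ax = np.arange(size) - size // 2
        xx, yy = np.meshgrid(ax, ax)
        r2 = xx ** 2 + yy ** 2
        center = np.exp(-r2 / (2 * sigma_center ** 2)) / (2 * np.pi * sigma_center ** 2)
        surround = np.exp(-r2 / (2 * sigma_surround ** 2)) / (2 * np.pi * sigma_surround ** 2)
        return center - surround

    def apply_receptive_field(image, kernel):
        """Naive sliding-window response: each output is one receptive field's activation."""
        k = kernel.shape[0]
        h, w = image.shape
        out = np.zeros((h - k + 1, w - k + 1))
        for i in range(out.shape[0]):
            for j in range(out.shape[1]):
                out[i, j] = np.sum(image[i:i + k, j:j + k] * kernel)
        return out

    image = np.zeros((32, 32))
    image[:, 16:] = 1.0                      # a vertical luminance edge
    response = apply_receptive_field(image, dog_kernel())
    print(response.shape, response.max())    # strongest response along the edge
    ```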

  11. Influence of Sensory Dependence on Postural Control

    NASA Technical Reports Server (NTRS)

    Santana, Patricia A.; Mulavara, Ajitkumar P.; Fiedler, Matthew J.

    2011-01-01

    The current project is part of an NSBRI funded project, "Development of Countermeasures to Aid Functional Egress from the Crew Exploration Vehicle Following Long-Duration Spaceflight." The development of this countermeasure is based on the use of imperceptible levels of electrical stimulation to the balance organs of the inner ear to assist and enhance the response of a person's sensorimotor function. These countermeasures could be used to increase an astronaut's re-adaptation rate to Earth's gravity following long-duration space flight. The focus of my project is to evaluate and examine the correlation of sensory preferences for vision and vestibular systems. Disruption of the sensorimotor functions following space flight affects posture, locomotion and spatial orientation tasks in astronauts. The Group Embedded Figures Test (GEFT), the Rod and Frame Test (RFT) and the Computerized Dynamic Posturography Test (CDP) are measurements used to examine subjects' visual and vestibular sensory preferences. The analysis of data from these tasks will assist in relating the visual dependence measures recognized in the GEFT and RFT with vestibular dependence measures recognized in the stability measures obtained during CDP. Studying the impact of sensory dependence on the performance in varied tasks will help in the development of targeted countermeasures to help astronauts readapt to gravitational changes after long duration space flight.

  12. Open Source and Design Thinking at NASA: A Vision for Future Software

    NASA Technical Reports Server (NTRS)

    Trimble, Jay

    2017-01-01

    NASA Mission Control Software for the Visualization of data has historically been closed, accessible only to small groups of flight controllers, often bound to a specific mission discipline such as flight dynamics, health and status or mission planning. Open Mission Control Technologies (MCT) provides new capability for NASA mission controllers and, by being fully open source, opens up NASA software for the visualization of mission data to broader communities inside and outside of NASA. Open MCT is the product of a design thinking process within NASA, using participatory design and design sprints to build a product that serves users.

  13. Progress in computer vision.

    NASA Astrophysics Data System (ADS)

    Jain, A. K.; Dorai, C.

    Computer vision has emerged as a challenging and important area of research, both as an engineering and a scientific discipline. The growing importance of computer vision is evident from the fact that it was identified as one of the "Grand Challenges" and also from its prominent role in the National Information Infrastructure. While the design of a general-purpose vision system continues to be elusive, machine vision systems are being used successfully in specific application domains. Building a practical vision system requires a careful selection of appropriate sensors, extraction and integration of information from available cues in the sensed data, and evaluation of system robustness and performance. The authors discuss and demonstrate advantages of (1) multi-sensor fusion, (2) combination of features and classifiers, (3) integration of visual modules, and (4) admissibility and goal-directed evaluation of vision algorithms. The requirements of several prominent real-world applications such as biometry, document image analysis, image and video database retrieval, and automatic object model construction offer exciting problems and new opportunities to design and evaluate vision algorithms.
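
    As a minimal illustration of point (2), classifier combination is often realized by fusing per-classifier class posteriors with a simple rule such as a weighted sum. The sketch below shows only that rule; the three classifiers, their outputs, and the weights are hypothetical.

    ```python
    import numpy as np

    def combine_classifiers(posteriors, weights=None):
        """Fuse per-classifier class posteriors by a weighted sum (one simple rule
        among many; product and max rules are also common)."""
        posteriors = np.asarray(posteriors, dtype=float)   # shape: (n_classifiers, n_classes)
        if weights is None:
            weights = np.ones(len(posteriors)) / len(posteriors)
        fused = np.average(posteriors, axis=0, weights=weights)
        return fused / fused.sum()

    # Hypothetical outputs of three classifiers over three classes for one sample.
    p_texture = [0.6, 0.3, 0.1]
    p_shape   = [0.5, 0.2, 0.3]
    p_color   = [0.2, 0.7, 0.1]

    fused = combine_classifiers([p_texture, p_shape, p_color], weights=[0.5, 0.3, 0.2])
    print(fused, "-> predicted class", int(np.argmax(fused)))
    ```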

  14. Around Marshall

    NASA Image and Video Library

    2001-01-01

    This photograph shows the Starship 2040 leaving the Marshall Space Flight Center (MSFC) for the exhibit site. Developed by the Space Transportation Directorate at MSFC, the Starship 2040 exhibit is housed in a 48-ft (14.6-m) tractor and trailer rig, permitting it to travel around the Nation, demonstrating NASA's vision of what commercial spaceflight might be like 40 years from now. All the innovations suggested aboard the exhibit, automated vehicle health monitoring systems, high-energy propulsion drive, navigational aids and emergency and safety systems, are based on concepts and technologies now being studied at NASA Centers and partner institutions around the Nation. NASA is the nation's premier agency for development of the space transportation system, including future-generation reusable launch vehicles. Such systems, the keys to a "real" Starship 2040, require revolutionary advances in critical aerospace technologies, from thermal, magnetic, chemical, and propellantless propulsion systems to new energy sources such as space solar power or antimatter propulsion. These and other advances are now being studied, developed, and tested at NASA field centers and partner institutions all over the Nation.

  15. Review of ultraresolution (10-100 megapixel) visualization systems built by tiling commercial display components

    NASA Astrophysics Data System (ADS)

    Hopper, Darrel G.; Haralson, David G.; Simpson, Matthew A.; Longo, Sam J.

    2002-08-01

    Ultra-resolution visualization systems are achieved by the technique of tiling many direct-view or projection-view displays. During the past few years, several such systems have been built from commercial electronics components (displays, computers, image generators, networks, communication links, and software). Civil applications driving this development have independently determined that they require images at 10-100 megapixel (Mpx) resolution to enable state-of-the-art research, engineering, design, stock exchanges, flight simulators, business information and enterprise control centers, education, art and entertainment. Military applications also press the art of the possible to improve the productivity of warfighters and lower the cost of providing for the national defense. The environment in some 80% of defense applications can be addressed by ruggedization of commercial components. This paper reviews the status of ultra-resolution systems based on commercial components and describes a vision for their integration into advanced yet affordable military command centers, simulator/trainers, and, eventually, crew stations in air, land, sea and space systems.

  16. A Sustained Proximity Network for Multi-Mission Lunar Exploration

    NASA Technical Reports Server (NTRS)

    Soloff, Jason A.; Noreen, Gary; Deutsch, Leslie; Israel, David

    2005-01-01

    The Vision for Space Exploration calls for an aggressive sequence of robotic missions beginning in 2008 to prepare for a human return to the Moon by 2020, with the goal of establishing a sustained human presence beyond low Earth orbit. A key enabler of exploration is reliable, available communication and navigation capabilities to support both human and robotic missions. An adaptable, sustainable communication and navigation architecture has been developed by Goddard Space Flight Center and the Jet Propulsion Laboratory to support human and robotic lunar exploration through the next two decades. A key component of the architecture is scalable deployment, with the infrastructure evolving as needs emerge, allowing NASA and its partner agencies to deploy an interoperable communication and navigation system in an evolutionary way, enabling cost effective, highly adaptable systems throughout the lunar exploration program.

  17. Cobalt: Development and Maturation of GN&C Technologies for Precision Landing

    NASA Technical Reports Server (NTRS)

    Carson, John M.; Restrepo, Carolina; Seubert, Carl; Amzajerdian, Farzin

    2016-01-01

    The CoOperative Blending of Autonomous Landing Technologies (COBALT) instrument is a terrestrial test platform for development and maturation of guidance, navigation and control (GN&C) technologies for precision landing. The project is developing a third-generation Langley Research Center (LaRC) Navigation Doppler Lidar (NDL) for ultra-precise velocity and range measurements, which will be integrated and tested with the Jet Propulsion Laboratory (JPL) Lander Vision System (LVS) for terrain relative navigation (TRN) position estimates. These technologies together provide precise navigation knowledge that is critical for a controlled and precise touchdown. The COBALT hardware will be integrated in 2017 into the GN&C subsystem of the Xodiac rocket-propulsive vertical test bed (VTB) developed by Masten Space Systems, and two terrestrial flight campaigns will be conducted: one open-loop (i.e., passive) and one closed-loop (i.e., active).

  18. A novel visual-inertial monocular SLAM

    NASA Astrophysics Data System (ADS)

    Yue, Xiaofeng; Zhang, Wenjuan; Xu, Li; Liu, JiangGuo

    2018-02-01

    With the development of sensors and advances in the computer vision research community, cameras, which are accurate, compact, well understood and, most importantly, cheap and ubiquitous today, have gradually moved to the center of robot localization. Simultaneous localization and mapping (SLAM) using visual features is a technique that derives motion information from an image acquisition device and reconstructs the structure of an unknown environment. We provide an analysis of bioinspired flight in insects, employing a novel technique based on SLAM, and combine visual and inertial measurements to obtain high accuracy and robustness. We present a novel tightly-coupled visual-inertial simultaneous localization and mapping system that makes a new attempt to address two challenges: the initialization problem and the calibration problem. Experimental results and analysis show that the proposed approach yields a more accurate quantitative simulation of insect navigation and can reach centimeter-level positioning accuracy.
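
    A full tightly-coupled visual-inertial SLAM system is far beyond a short sketch, but the underlying idea of blending a drifting inertial prediction with an absolute visual measurement can be shown with a simple complementary filter. This loosely-coupled stand-in is illustrative only and is not the system described above; the blending factor is an assumed value.

    ```python
    def fuse_heading(vision_heading, gyro_rate, prev_estimate, dt, alpha=0.98):
        """Complementary filter: a far simpler, loosely-coupled stand-in for
        tightly-coupled visual-inertial fusion."""
        predicted = prev_estimate + gyro_rate * dt                # integrate the inertial rate
        return alpha * predicted + (1 - alpha) * vision_heading   # correct the drift with vision

    # Example: heading in radians, gyro rate in rad/s, 100 Hz update.
    print(fuse_heading(vision_heading=0.10, gyro_rate=0.5, prev_estimate=0.09, dt=0.01))
    ```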

  19. A parallel implementation of a multisensor feature-based range-estimation method

    NASA Technical Reports Server (NTRS)

    Suorsa, Raymond E.; Sridhar, Banavar

    1993-01-01

    There are many proposed vision-based methods to perform obstacle detection and avoidance for autonomous or semi-autonomous vehicles. All methods, however, will require very high processing rates to achieve real-time performance. A system capable of supporting autonomous helicopter navigation will need to extract obstacle information from imagery at rates varying from ten frames per second to thirty or more frames per second depending on the vehicle speed. Such a system will need to sustain billions of operations per second. To reach such high processing rates using current technology, a parallel implementation of the obstacle detection/ranging method is required. This paper describes an efficient and flexible parallel implementation of a multisensor feature-based range-estimation algorithm, targeted for helicopter flight, realized on both a distributed-memory and a shared-memory parallel computer.
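
    The paper's parallel implementation targets dedicated distributed- and shared-memory machines, but the data-parallel structure it exploits, independent range computations per tracked feature, can be sketched with ordinary worker processes. The focal length, lateral speed, pure-translation depth model, and feature list below are all assumptions made for illustration.

    ```python
    from multiprocessing import Pool

    FOCAL = 600.0        # pixels (assumed camera focal length)
    LATERAL_SPEED = 2.0  # m/s lateral translation between frames (assumed, from inertial data)

    def range_from_flow(feature):
        """Depth of one tracked feature from its image motion, assuming pure lateral
        translation: image velocity u = f * T / Z  =>  Z = f * T / u."""
        feature_id, flow_pixels_per_s = feature
        depth = FOCAL * LATERAL_SPEED / flow_pixels_per_s
        return feature_id, depth

    if __name__ == "__main__":
        # Hypothetical tracked features: (id, horizontal image velocity in px/s).
        features = [(0, 40.0), (1, 15.0), (2, 120.0), (3, 8.0)]
        with Pool(processes=4) as pool:              # data-parallel over features
            ranges = pool.map(range_from_flow, features)
        for fid, z in ranges:
            print(f"feature {fid}: ~{z:.1f} m")
    ```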

  20. Stabilization and control of quad-rotor helicopter using a smartphone device

    NASA Astrophysics Data System (ADS)

    Desai, Alok; Lee, Dah-Jye; Moore, Jason; Chang, Yung-Ping

    2013-01-01

    In recent years, autonomous, micro-unmanned aerial vehicles (micro-UAVs), or more specifically hovering micro-UAVs, have proven suitable for many promising applications such as unknown environment exploration and search and rescue operations. The early versions of UAVs had no on-board control capabilities and were difficult to control manually from a ground station. Many UAVs now are equipped with on-board control systems that reduce the amount of control required from the ground-station operator. However, the limitations on payload, power consumption and control without human interference remain the biggest challenges. This paper proposes to use a smartphone as the sole computational device to stabilize and control a quad-rotor. The goal is to use the readily available sensors in a smartphone such as the GPS, the accelerometer, the rate-gyros, and the camera to support vision-related tasks such as flight stabilization, estimation of the height above ground, target tracking, obstacle detection, and surveillance. We use a quad-rotor platform that has been built in the Robotic Vision Lab at Brigham Young University for our development and experiments. An Android smartphone is connected through the USB port to external hardware that has a microprocessor and circuitry to generate pulse-width modulation signals to control the brushless servomotors on the quad-rotor. The high-resolution camera on the smartphone is used to detect and track features to maintain a desired altitude level. The vision algorithms implemented include template matching, Harris feature detector, RANSAC similarity-constrained homography, and color segmentation. Other sensors are used to control yaw, pitch, and roll of the quad-rotor. This smartphone-based system is able to stabilize and control micro-UAVs and is ideal for micro-UAVs that have size, weight, and power limitations.
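
    Two of the vision steps listed above, Harris corner detection and RANSAC-constrained homography estimation, can be sketched generically with OpenCV as follows. This is not the authors' implementation; the detector parameters, the Lucas-Kanade tracking step, and the reprojection threshold are illustrative choices.

    ```python
    import cv2

    def frame_to_frame_homography(prev_gray, curr_gray):
        """Track Harris corners between frames and fit a RANSAC homography,
        two of the vision steps named in the abstract (parameters are illustrative)."""
        # goodFeaturesToTrack with useHarrisDetector=True selects Harris corners.
        corners = cv2.goodFeaturesToTrack(prev_gray, maxCorners=300, qualityLevel=0.01,
                                          minDistance=7, useHarrisDetector=True, k=0.04)
        # Pyramidal Lucas-Kanade optical flow tracks the corners into the new frame.
        tracked, status, _ = cv2.calcOpticalFlowPyrLK(prev_gray, curr_gray, corners, None)
        good_prev = corners[status.ravel() == 1]
        good_curr = tracked[status.ravel() == 1]
        # RANSAC rejects outlier tracks before estimating the image-to-image homography.
        H, inliers = cv2.findHomography(good_prev, good_curr, cv2.RANSAC, 3.0)
        return H  # can be decomposed to infer the quad-rotor's drift and rotation
    ```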
